Sample records for "large statistically significant"

  1. The large sample size fallacy.

    PubMed

    Lantz, Björn

    2013-06-01

    Significance in the statistical sense has little to do with significance in the common practical sense. Statistical significance is a necessary but not a sufficient condition for practical significance. Hence, results that are extremely statistically significant may be highly nonsignificant in practice. The degree of practical significance is generally determined by the size of the observed effect, not the p-value. The results of studies based on large samples are often characterized by extreme statistical significance despite small or even trivial effect sizes. Interpreting such results as significant in practice without further analysis is referred to as the large sample size fallacy in this article. The aim of this article is to explore the relevance of the large sample size fallacy in contemporary nursing research. Relatively few nursing articles display explicit measures of observed effect sizes or include a qualitative discussion of observed effect sizes. Statistical significance is often treated as an end in itself. Effect sizes should generally be calculated and presented along with p-values for statistically significant results, and observed effect sizes should be discussed qualitatively through direct and explicit comparisons with the effects in related literature. © 2012 Nordic College of Caring Science.
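
    To make the fallacy concrete, here is a minimal Python sketch (not from Lantz's article; the sample size and the 0.02 SD true effect are invented for illustration): with a large enough n, a practically trivial effect yields an extreme p-value while the effect size (Cohen's d) stays near zero.

```python
# Large-sample-size fallacy demo: huge n, trivial effect, tiny p-value.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 200_000                                  # very large sample per group
a = rng.normal(loc=0.00, scale=1.0, size=n)
b = rng.normal(loc=0.02, scale=1.0, size=n)  # trivial true effect: 0.02 SD

t, p = stats.ttest_ind(a, b)
# Cohen's d: standardized mean difference using the pooled SD
pooled_sd = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)
d = (b.mean() - a.mean()) / pooled_sd

print(f"p = {p:.2e}")   # extremely "significant"
print(f"d = {d:.3f}")   # practically negligible
```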

  2. Statistical significance test for transition matrices of atmospheric Markov chains

    NASA Technical Reports Server (NTRS)

    Vautard, Robert; Mo, Kingtse C.; Ghil, Michael

    1990-01-01

    Low-frequency variability of large-scale atmospheric dynamics can be represented schematically by a Markov chain of multiple flow regimes. This Markov chain contains useful information for the long-range forecaster, provided that the statistical significance of the associated transition matrix can be reliably tested. Monte Carlo simulation yields a very reliable significance test for the elements of this matrix. The results of this test agree with previously used empirical formulae when each cluster of maps identified as a distinct flow regime is sufficiently large and when they all contain a comparable number of maps. Monte Carlo simulation provides a more reliable way to test the statistical significance of transitions to and from small clusters. It can determine the most likely transitions, as well as the most unlikely ones, with a prescribed level of statistical significance.
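
    A hedged sketch of the general idea (the synthetic regime sequence and the shuffling null below are illustrative; the test statistics actually used by Vautard et al. differ in detail): shuffle the regime record to destroy temporal order while preserving regime occupancies, then compare observed transition counts to the shuffled null.

```python
# Monte Carlo significance test for Markov-chain transition counts.
import numpy as np

def transition_counts(seq, k):
    counts = np.zeros((k, k), dtype=int)
    for i, j in zip(seq[:-1], seq[1:]):
        counts[i, j] += 1
    return counts

rng = np.random.default_rng(1)
k = 3                               # number of flow regimes
seq = rng.integers(0, k, size=500)  # stand-in for a real regime record

obs = transition_counts(seq, k)
null = np.array([transition_counts(rng.permutation(seq), k)
                 for _ in range(2000)])

# One-sided p-value per element: how often a shuffled record produces
# at least as many i -> j transitions as observed.
p = (null >= obs).mean(axis=0)
print(np.round(p, 3))
```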

  3. Delayed Implants Outcome in Maxillary Molar Region.

    PubMed

    Crespi, Roberto; Capparè, Paolo; Crespi, Giovanni; Gastaldi, Giorgio; Gherlone, Enrico F

    2017-04-01

    The aim of the present study was to assess bone volume changes in maxillary molar regions after delayed implant placement. Patients presented large bone defects after tooth extractions. Reactive soft tissue was left in the defects; no grafts were used. Cone beam computed tomography (CBCT) scans were performed before tooth extraction, at implant placement (3 months after extraction), and 3 years after implant placement, and bone volume measurements were assessed at each time point. Bucco-lingual width showed a statistically significant decrease (p = .013) at implant placement, 3 months after extraction. Moreover, a statistically significant increase (p < .01) was measured 3 years after implant placement. No statistically significant differences (p > .05) were found between baseline values (before extraction) and values at 3 years after implant placement. Vertical dimension showed no statistically significant differences (p > .05) at implant placement, 3 months after extraction. Statistically significant differences (p < .0001) were found between baseline values (before extraction) and values at implant placement, as well as between implant placement values and values 3 years later. CBCT scans showed a successful outcome for delayed implants placed in large bone defects at the 3-year follow-up. © 2016 Wiley Periodicals, Inc.

  4. Statistical Analysis of Big Data on Pharmacogenomics

    PubMed Central

    Fan, Jianqing; Liu, Han

    2013-01-01

    This paper discusses statistical methods for estimating complex correlation structure from large pharmacogenomic datasets. We selectively review several prominent statistical methods: estimating large covariance matrices for understanding correlation structure, inverse covariance matrices for network modeling, large-scale simultaneous tests for selecting significantly differentially expressed genes, proteins, and genetic markers for complex diseases, and high-dimensional variable selection for identifying important molecules for understanding molecular mechanisms in pharmacogenomics. Their applications to gene network estimation and biomarker selection are used to illustrate the methodological power. Several new challenges of Big Data analysis, including complex data distribution, missing data, measurement error, spurious correlation, endogeneity, and the need for robust statistical methods, are also discussed. PMID:23602905

  5. Transport Coefficients from Large Deviation Functions

    NASA Astrophysics Data System (ADS)

    Gao, Chloe; Limmer, David

    2017-10-01

    We describe a method for computing transport coefficients from the direct evaluation of large deviation functions. This method is general, relying only on equilibrium fluctuations, and is statistically efficient, employing trajectory-based importance sampling. Equilibrium fluctuations of molecular currents are characterized by their large deviation functions, which are scaled cumulant generating functions analogous to the free energy. A diffusion Monte Carlo algorithm is used to evaluate the large deviation functions, from which arbitrary transport coefficients are derivable. We find significant statistical improvement over traditional Green-Kubo based calculations. The systematic and statistical errors of this method are analyzed in the context of specific transport coefficient calculations, including the shear viscosity, interfacial friction coefficient, and thermal conductivity.
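
    For contrast, a brief sketch of the traditional Green-Kubo route the authors benchmark against (an Ornstein-Uhlenbeck velocity trace stands in for real molecular dynamics output; this is not the paper's large-deviation method): a transport coefficient is obtained as the time integral of an equilibrium current autocorrelation function.

```python
# Green-Kubo estimate of a self-diffusion coefficient from a synthetic
# velocity trace (OU process: exact answer D = 1 for these parameters).
import numpy as np

rng = np.random.default_rng(2)
dt = 0.01
v = np.empty(100_000)
v[0] = 0.0
for t in range(1, v.size):
    v[t] = v[t - 1] * (1 - dt) + np.sqrt(2 * dt) * rng.normal()

def autocorr(x, max_lag):
    x = x - x.mean()
    return np.array([np.mean(x[:x.size - l] * x[l:]) for l in range(max_lag)])

c = autocorr(v, max_lag=1000)
D = float(np.sum(c) * dt)   # Green-Kubo: D = integral of <v(0) v(t)> dt
print(f"D ~ {D:.3f} (exact value for this OU process: 1.0)")
```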

  6. Got power? A systematic review of sample size adequacy in health professions education research.

    PubMed

    Cook, David A; Hatala, Rose

    2015-03-01

    Many education research studies employ small samples, which in turn lowers statistical power. We re-analyzed the results of a meta-analysis of simulation-based education to determine study power across a range of effect sizes, and the smallest effect that could be plausibly excluded. We systematically searched multiple databases through May 2011, and included all studies evaluating simulation-based education for health professionals in comparison with no intervention or another simulation intervention. Reviewers working in duplicate abstracted information to calculate standardized mean differences (SMDs). We included 897 original research studies. Among the 627 no-intervention-comparison studies, the median sample size was 25. Only two studies (0.3%) had ≥80% power to detect a small difference (SMD > 0.2 standard deviations) and 136 (22%) had power to detect a large difference (SMD > 0.8). Of the 110 no-intervention-comparison studies that failed to find a statistically significant difference, none excluded a small difference and only 47 (43%) excluded a large difference. Among the 297 studies comparing alternate simulation approaches, the median sample size was 30. Only one study (0.3%) had ≥80% power to detect a small difference and 79 (27%) had power to detect a large difference. Of the 128 studies that did not detect a statistically significant effect, 4 (3%) excluded a small difference and 91 (71%) excluded a large difference. In conclusion, most education research studies are powered only to detect effects of large magnitude. For most studies that do not reach statistical significance, the possibility of large and important differences still exists.
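
    This kind of power audit is easy to reproduce with statsmodels (a sketch; the median sample size of 25 comes from the abstract, and the library call is standard rather than the review's own code):

```python
# Two-sample t-test power at the review's median total n of 25
# (~12.5 per arm), for small/medium/large standardized effects.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
for smd in (0.2, 0.5, 0.8):
    pw = analysis.power(effect_size=smd, nobs1=12.5, alpha=0.05)
    print(f"SMD = {smd}: power ~ {pw:.2f}")

# Per-group n needed for 80% power to detect a small effect (SMD = 0.2)
n1 = analysis.solve_power(effect_size=0.2, power=0.8, alpha=0.05)
print(f"n per group ~ {n1:.0f}")
```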

  7. Wilderness adventure therapy effects on the mental health of youth participants.

    PubMed

    Bowen, Daniel J; Neill, James T; Crisp, Simon J R

    2016-10-01

    Adventure therapy offers a prevention, early intervention, and treatment modality for people with behavioural, psychological, and psychosocial issues. It can appeal to youth-at-risk who are often less responsive to traditional psychotherapeutic interventions. This study evaluated Wilderness Adventure Therapy (WAT) outcomes based on participants' pre-program, post-program, and follow-up responses to self-report questionnaires. The sample consisted of 36 adolescent out-patients with mixed mental health issues who completed a 10-week, manualised WAT intervention. The overall short-term standardised mean effect size was small, positive, and statistically significant (0.26), with moderate, statistically significant improvements in psychological resilience and social self-esteem. Total short-term effects were within age-based adventure therapy meta-analytic benchmark 90% confidence intervals, except for the change in suicidality which was lower than the comparable benchmark. The short-term changes were retained at the three-month follow-up, except for family functioning (significant reduction) and suicidality (significant improvement). For participants in clinical ranges pre-program, there was a large, statistically significant reduction in depressive symptomology, and large to very large, statistically significant improvements in behavioural and emotional functioning. These changes were retained at the three-month follow-up. These findings indicate that WAT is as effective as traditional psychotherapy techniques for clinically symptomatic people. Future research utilising a comparison or wait-list control group, multiple sources of data, and a larger sample, could help to qualify and extend these findings. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  8. Statistically significant performance results of a mine detector and fusion algorithm from an x-band high-resolution SAR

    NASA Astrophysics Data System (ADS)

    Williams, Arnold C.; Pachowicz, Peter W.

    2004-09-01

    Current mine detection research indicates that no single sensor or single look from a sensor will detect mines/minefields in a real-time manner at a performance level suitable for a forward maneuver unit. Hence, the integrated development of detectors and fusion algorithms is of primary importance. A problem in this development process has been the evaluation of these algorithms with relatively small data sets, leading to anecdotal and frequently overtrained results. These anecdotal results are often unreliable and conflicting among various sensors and algorithms. Consequently, the physical phenomena that ought to be exploited and the performance benefits of this exploitation are often ambiguous. The Army RDECOM CERDEC Night Vision and Electronic Sensors Directorate has collected large amounts of multisensor data such that statistically significant evaluations of detection and fusion algorithms can be obtained. Even with these large data sets, care must be taken in algorithm design and data processing to achieve statistically significant performance results for combined detectors and fusion algorithms. This paper discusses statistically significant detection and combined multilook fusion results for the Ellipse Detector (ED) and the Piecewise Level Fusion Algorithm (PLFA). These statistically significant performance results are characterized by ROC curves that have been obtained through processing this multilook data for the high-resolution SAR data of the Veridian X-band radar. We discuss the implications of these results for mine detection and the importance of statistical significance, sample size, ground truth, and algorithm design in performance evaluation.

  9. [Effect sizes, statistical power and sample sizes in "the Japanese Journal of Psychology"].

    PubMed

    Suzukawa, Yumi; Toyoda, Hideki

    2012-04-01

    This study analyzed the statistical power of research studies published in the "Japanese Journal of Psychology" in 2008 and 2009. Sample effect sizes and sample statistical powers were calculated for each statistical test and analyzed with respect to the analytical methods and the fields of the studies. The results show that in the fields like perception, cognition or learning, the effect sizes were relatively large, although the sample sizes were small. At the same time, because of the small sample sizes, some meaningful effects could not be detected. In the other fields, because of the large sample sizes, meaningless effects could be detected. This implies that researchers who could not get large enough effect sizes would use larger samples to obtain significant results.

  10. How large is the gluon polarization in the statistical parton distributions approach?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soffer, Jacques; Bourrely, Claude; Buccella, Franco

    2015-04-10

    We review the theoretical foundations of the quantum statistical approach to parton distributions and we show that by using some recent experimental results from Deep Inelastic Scattering, we are able to improve the description of the data by means of a new determination of the parton distributions. We will see that a large gluon polarization emerges, giving a significant contribution to the proton spin.

  11. Efficacy of double mirrored omega pattern for skin sparing mastectomy to reduce ischemic complications.

    PubMed

    Santanelli di Pompeo, Fabio; Sorotos, Michail; Laporta, Rosaria; Pagnoni, Marco; Longo, Benedetto

    2018-02-01

    Excellent cosmetic results from skin-sparing mastectomy (SSM) are often impaired by skin flap necrosis (SFN), with rates from 8% to 25%, or worse in smokers. This study prospectively investigated the efficacy of the Double-Mirrored Omega Pattern (DMOP-SSM) compared to the Wise Pattern SSM (WP-SSM) for immediate reconstruction in moderate/large-breasted smokers. From 2008-2010, DMOP-SSM was performed in 51 consecutive immediate breast reconstructions on 41 smokers (mean age = 49.8 years) with moderate/large and ptotic breasts. This active group (AG) was compared to a similar historical control group (CG) of 37 smokers (mean age = 51.1 years) who underwent WP-SSM and immediate breast reconstruction, with a mean follow-up of 37.6 months. Skin ischaemic complications, number of surgical revisions, time to wound healing, and patient satisfaction were analysed. Descriptive statistics were reported and comparison of performance endpoints was performed using Fisher's exact test and the Mann-Whitney U-test. A p-value <.05 was considered significant. Patients' mean age (p = .316) and BMI (p = .215) were not statistically different between groups. Ischaemic complications occurred in 11.7% of DMOP-SSMs and in 32.4% of WP-SSMs (p = .017), and revision rates were, respectively, 5.8% and 24.3% (p = .012), both statistically significant differences. Mean time to wound healing was, respectively, 16.8 days and 18.4 days (p = .205). Mean patient satisfaction scores were, respectively, 18.9 and 21.1, a statistically significant difference (p = .022). Although tobacco use in moderate/large-breasted patients can severely impair outcomes of breast reconstruction, the DMOP-SSM approach, compared to WP-SSM, allows smokers to benefit from SSM with statistically significantly fewer skin flap ischaemic complications and surgical revisions, and better cosmetic outcomes.

  12. Methodological Issues Related to the Use of P Less than 0.05 in Health Behavior Research

    ERIC Educational Resources Information Center

    Duryea, Elias; Graner, Stephen P.; Becker, Jeremy

    2009-01-01

    This paper reviews methodological issues related to the use of P less than 0.05 in health behavior research and suggests how application and presentation of statistical significance may be improved. Assessment of sample size and P less than 0.05, the file drawer problem, the Law of Large Numbers and the statistical significance arguments in…

  13. Joint resonant CMB power spectrum and bispectrum estimation

    NASA Astrophysics Data System (ADS)

    Meerburg, P. Daniel; Münchmeyer, Moritz; Wandelt, Benjamin

    2016-02-01

    We develop the tools necessary to assess the statistical significance of resonant features in the CMB correlation functions, combining power spectrum and bispectrum measurements. This significance is typically addressed by running a large number of simulations to derive the probability density function (PDF) of the feature-amplitude in the Gaussian case. Although these simulations are tractable for the power spectrum, for the bispectrum they require significant computational resources. We show that, by assuming that the PDF is given by a multivariate Gaussian where the covariance is determined by the Fisher matrix of the sine and cosine terms, we can efficiently produce spectra that are statistically close to those derived from full simulations. By drawing a large number of spectra from this PDF, both for the power spectrum and the bispectrum, we can quickly determine the statistical significance of candidate signatures in the CMB, considering both single frequency and multifrequency estimators. We show that for resonance models, cosmology and foreground parameters have little influence on the estimated amplitude, which allows us to simplify the analysis considerably. A more precise likelihood treatment can then be applied to candidate signatures only. We also discuss a modal expansion approach for the power spectrum, aimed at quickly scanning through large families of oscillating models.
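
    A minimal sketch of the shortcut described above (a toy 2x2 Fisher matrix for the sine and cosine amplitudes; the real analysis involves many modes and bispectrum terms): draw feature amplitudes from a multivariate Gaussian whose covariance is the inverse Fisher matrix, instead of running full simulations.

```python
# Fisher-matrix shortcut: sample null feature amplitudes cheaply.
import numpy as np

rng = np.random.default_rng(3)
F = np.array([[40.0, 5.0],     # toy Fisher matrix for (sin, cos) amplitudes
              [ 5.0, 30.0]])
cov = np.linalg.inv(F)

draws = rng.multivariate_normal(mean=np.zeros(2), cov=cov, size=100_000)
amp = np.hypot(draws[:, 0], draws[:, 1])   # total oscillation amplitude

# Significance of a candidate amplitude = tail probability under the null
candidate = 0.5
print(f"p ~ {(amp >= candidate).mean():.4f}")
```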

  14. The relative persuasiveness of gain-framed and loss-framed messages for encouraging disease prevention behaviors: a meta-analytic review.

    PubMed

    O'Keefe, Daniel J; Jensen, Jakob D

    2007-01-01

    A meta-analytic review of 93 studies (N = 21,656) finds that in disease prevention messages, gain-framed appeals, which emphasize the advantages of compliance with the communicator's recommendation, are statistically significantly more persuasive than loss-framed appeals, which emphasize the disadvantages of noncompliance. This difference is quite small (corresponding to r = .03), however, and appears attributable to a relatively large (and statistically significant) effect for messages advocating dental hygiene behaviors. Despite very good statistical power, the analysis finds no statistically significant differences in persuasiveness between gain- and loss-framed messages concerning other preventive actions such as safer-sex behaviors, skin cancer prevention behaviors, or diet and nutrition behaviors.

  15. Significance levels for studies with correlated test statistics.

    PubMed

    Shi, Jianxin; Levinson, Douglas F; Whittemore, Alice S

    2008-07-01

    When testing large numbers of null hypotheses, one needs to assess the evidence against the global null hypothesis that none of the hypotheses is false. Such evidence typically is based on the test statistic of the largest magnitude, whose statistical significance is evaluated by permuting the sample units to simulate its null distribution. Efron (2007) has noted that correlation among the test statistics can induce substantial interstudy variation in the shapes of their histograms, which may cause misleading tail counts. Here, we show that permutation-based estimates of the overall significance level also can be misleading when the test statistics are correlated. We propose that such estimates be conditioned on a simple measure of the spread of the observed histogram, and we provide a method for obtaining conditional significance levels. We justify this conditioning using the conditionality principle described by Cox and Hinkley (1974). Application of the method to gene expression data illustrates the circumstances when conditional significance levels are needed.
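
    For orientation, a sketch of the permutation-based global test that the paper refines (synthetic correlated statistics; the authors' conditioning on the spread of the observed histogram is not reproduced here): the global null is assessed from the largest test statistic across permuted labels.

```python
# Permutation estimate of global significance from the max |t| statistic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n, m = 50, 1000                      # 50 samples, 1000 correlated tests
labels = np.repeat([0, 1], n // 2)
shared = rng.normal(size=(n, 1))     # common factor -> correlated statistics
x = 0.6 * shared + rng.normal(size=(n, m))

def max_abs_t(x, labels):
    t, _ = stats.ttest_ind(x[labels == 0], x[labels == 1])
    return np.abs(t).max()

obs = max_abs_t(x, labels)
null = np.array([max_abs_t(x, rng.permutation(labels)) for _ in range(500)])
print(f"global p ~ {(null >= obs).mean():.3f}")
```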

  16. A large-scale perspective on stress-induced alterations in resting-state networks

    NASA Astrophysics Data System (ADS)

    Maron-Katz, Adi; Vaisvaser, Sharon; Lin, Tamar; Hendler, Talma; Shamir, Ron

    2016-02-01

    Stress is known to induce large-scale neural modulations. However, its neural effect once the stressor is removed, and how it relates to subjective experience, are not fully understood. Here we used a statistically sound data-driven approach to investigate alterations in large-scale resting-state functional connectivity (rsFC) induced by acute social stress. We compared rsfMRI profiles of 57 healthy male subjects before and after stress induction. Using a parcellation-based univariate statistical analysis, we identified a large-scale rsFC change, involving 490 parcel-pairs. Aiming to characterize this change, we employed statistical enrichment analysis, identifying anatomic structures that were significantly interconnected by these pairs. This analysis revealed strengthening of thalamo-cortical connectivity and weakening of cross-hemispheral parieto-temporal connectivity. These alterations were further found to be associated with changes in subjective stress reports. Integrating report-based information on stress sustainment 20 minutes post induction revealed a single significant rsFC change between the right amygdala and the precuneus, which inversely correlated with the level of subjective recovery. Our study demonstrates the value of enrichment analysis for exploring large-scale network reorganization patterns, and provides new insight on stress-induced neural modulations and their relation to subjective experience.

  17. THE RADIO/GAMMA-RAY CONNECTION IN ACTIVE GALACTIC NUCLEI IN THE ERA OF THE FERMI LARGE AREA TELESCOPE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ackermann, M.; Ajello, M.; Allafort, A.

    We present a detailed statistical analysis of the correlation between radio and gamma-ray emission of the active galactic nuclei (AGNs) detected by Fermi during its first year of operation, with the largest data sets ever used for this purpose. We use both archival interferometric 8.4 GHz data (from the Very Large Array and ATCA, for the full sample of 599 sources) and concurrent single-dish 15 GHz measurements from the Owens Valley Radio Observatory (OVRO, for a subsample of 199 objects). Our unprecedentedly large sample permits us to assess with high accuracy the statistical significance of the correlation, using a surrogate data method designed to simultaneously account for common-distance bias and the effect of a limited dynamical range in the observed quantities. We find that the statistical significance of a positive correlation between the centimeter radio and the broadband (E > 100 MeV) gamma-ray energy flux is very high for the whole AGN sample, with a probability of <10^-7 for the correlation appearing by chance. Using the OVRO data, we find that concurrent data improve the significance of the correlation from 1.6 × 10^-6 to 9.0 × 10^-8. Our large sample size allows us to study the dependence of correlation strength and significance on specific source types and gamma-ray energy band. We find that the correlation is very significant (chance probability < 10^-7) for both flat spectrum radio quasars and BL Lac objects separately; a dependence of the correlation strength on the considered gamma-ray energy band is also present, but additional data will be necessary to constrain its significance.

  18. The radio/gamma-ray connection in active galactic nuclei in the era of the Fermi Large Area Telescope

    DOE PAGES

    Ackermann, M.; Ajello, M.; Allafort, A.; ...

    2011-10-12

    We present a detailed statistical analysis of the correlation between radio and gamma-ray emission of the active galactic nuclei (AGNs) detected by Fermi during its first year of operation, with the largest data sets ever used for this purpose. We use both archival interferometric 8.4 GHz data (from the Very Large Array and ATCA, for the full sample of 599 sources) and concurrent single-dish 15 GHz measurements from the Owens Valley Radio Observatory (OVRO, for a subsample of 199 objects). Our unprecedentedly large sample permits us to assess with high accuracy the statistical significance of the correlation, using a surrogate data method designed to simultaneously account for common-distance bias and the effect of a limited dynamical range in the observed quantities. We find that the statistical significance of a positive correlation between the centimeter radio and the broadband (E > 100 MeV) gamma-ray energy flux is very high for the whole AGN sample, with a probability of <10^-7 for the correlation appearing by chance. Using the OVRO data, we find that concurrent data improve the significance of the correlation from 1.6 × 10^-6 to 9.0 × 10^-8. Our large sample size allows us to study the dependence of correlation strength and significance on specific source types and gamma-ray energy band. As a result, we find that the correlation is very significant (chance probability < 10^-7) for both flat spectrum radio quasars and BL Lac objects separately; a dependence of the correlation strength on the considered gamma-ray energy band is also present, but additional data will be necessary to constrain its significance.

  19. The Radio/Gamma-Ray Connection in Active Galactic Nuclei in the Era of the Fermi Large Area Telescope

    NASA Technical Reports Server (NTRS)

    Ackermann, M.; Ajello, M.; Allafort, A.; Angelakis, E.; Axelsson, M.; Baldini, L.; Ballet, J.; Barbiellini, G.; Bastieri, D.; Bellazzini, R.; ...

    2011-01-01

    We present a detailed statistical analysis of the correlation between radio and gamma-ray emission of the active galactic nuclei (AGNs) detected by Fermi during its first year of operation, with the largest data sets ever used for this purpose. We use both archival interferometric 8.4 GHz data (from the Very Large Array and ATCA, for the full sample of 599 sources) and concurrent single-dish 15 GHz measurements from the Owens Valley Radio Observatory (OVRO, for a subsample of 199 objects). Our unprecedentedly large sample permits us to assess with high accuracy the statistical significance of the correlation, using a surrogate data method designed to simultaneously account for common-distance bias and the effect of a limited dynamical range in the observed quantities. We find that the statistical significance of a positive correlation between the centimeter radio and the broadband (E > 100 MeV) gamma-ray energy flux is very high for the whole AGN sample, with a probability of <10^-7 for the correlation appearing by chance. Using the OVRO data, we find that concurrent data improve the significance of the correlation from 1.6 × 10^-6 to 9.0 × 10^-8. Our large sample size allows us to study the dependence of correlation strength and significance on specific source types and gamma-ray energy band. We find that the correlation is very significant (chance probability < 10^-7) for both flat spectrum radio quasars and BL Lac objects separately; a dependence of the correlation strength on the considered gamma-ray energy band is also present, but additional data will be necessary to constrain its significance.

  20. No association of dynamin binding protein (DNMBP) gene SNPs and Alzheimer's disease.

    PubMed

    Minster, Ryan L; DeKosky, Steven T; Kamboh, M Ilyas

    2008-10-01

    A recent scan of single nucleotide polymorphisms (SNPs) on chromosome 10q found significant association of six correlated SNPs with late-onset Alzheimer's disease (AD) among Japanese. We examined the SNP with the highest statistical significance (rs3740058) in a large Caucasian American case-control cohort and the remaining five SNPs in a smaller subset of cases and controls. We observed no association of statistical significance in either the total sample or the APOE*4 non-carriers for any of the SNPs.

  1. Significant lexical relationships

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pedersen, T.; Kayaalp, M.; Bruce, R.

    Statistical NLP inevitably deals with a large number of rare events. As a consequence, NLP data often violates the assumptions implicit in traditional statistical procedures such as significance testing. We describe a significance test, an exact conditional test, that is appropriate for NLP data and can be performed using freely available software. We apply this test to the study of lexical relationships and demonstrate that the results obtained using this test are both theoretically more reliable and different from the results obtained using previously applied tests.
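
    An illustration of an exact conditional test on a 2x2 bigram contingency table via scipy (the counts are invented; Fisher's exact test is one such test, though the freely available software the authors refer to may differ). Because it conditions on the table margins, it needs no large-sample approximation, exactly the assumption that rare NLP events violate in chi-squared tests.

```python
# Exact conditional test of association for a bigram.
from scipy.stats import fisher_exact

# Rows: first word is/isn't "strong"; columns: second word is/isn't "tea"
table = [[  3,   97],    # "strong tea", "strong <other>"
         [ 20, 9880]]    # "<other> tea", "<other> <other>"

odds_ratio, p = fisher_exact(table, alternative="greater")
print(f"odds ratio = {odds_ratio:.1f}, exact p = {p:.4g}")
```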

  2. Characteristics and contributory causes related to large truck crashes (phase II) : all crashes.

    DOT National Transportation Integrated Search

    2012-03-01

    Statistics clearly demonstrate that large-truck crashes contribute to a significant percentage of high-severity crashes. It is therefore important for the highway safety community to identify the characteristics and contributory causes of these typ...

  3. Statistical analysis of large wildfires

    Treesearch

    Thomas P. Holmes; Robert J. Jr. Huggett; Anthony L. Westerling

    2008-01-01

    Large, infrequent wildfires cause dramatic ecological and economic impacts. Consequently, they deserve special attention and analysis. The economic significance of large fires is indicated by the fact that approximately 94 percent of fire suppression costs on U.S. Forest Service land during the period 1980-2002 resulted from a mere 1.4 percent of the fires (Strategic...

  4. Cloud-based solution to identify statistically significant MS peaks differentiating sample categories.

    PubMed

    Ji, Jun; Ling, Jeffrey; Jiang, Helen; Wen, Qiaojun; Whitin, John C; Tian, Lu; Cohen, Harvey J; Ling, Xuefeng B

    2013-03-23

    Mass spectrometry (MS) has evolved to become the primary high-throughput tool for proteomics-based biomarker discovery. Multiple challenges in protein MS data analysis remain: large-scale and complex data set management; MS peak identification and indexing; and high-dimensional differential analysis of peaks with false discovery rate (FDR) control based on concurrent statistical tests. "Turnkey" solutions are needed for biomarker investigations to rapidly process MS data sets and identify statistically significant peaks for subsequent validation. Here we present an efficient and effective solution, which provides experimental biologists easy access to "cloud" computing capabilities to analyze MS data. The web portal can be accessed at http://transmed.stanford.edu/ssa/. The web application supports large-scale MS data uploading and online analysis through a simple user interface. This bioinformatic tool will facilitate the discovery of potential protein biomarkers using MS.
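
    A sketch of the FDR-control step such a pipeline performs, using the standard Benjamini-Hochberg procedure (synthetic p-values; the portal's internals are not described in the abstract):

```python
# Benjamini-Hochberg FDR control over simulated peak p-values.
import numpy as np
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(5)
p_null = rng.uniform(size=950)        # 950 null peaks
p_alt = rng.beta(0.1, 10, size=50)    # 50 truly differential peaks
pvals = np.concatenate([p_null, p_alt])

reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print(f"{reject.sum()} peaks pass Benjamini-Hochberg FDR at 5%")
```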

  5. The effect of clustering of galaxies on the statistics of gravitational lenses

    NASA Technical Reports Server (NTRS)

    Anderson, N.; Alcock, C.

    1986-01-01

    We examine whether clustering of galaxies can significantly alter the statistical properties of gravitational lenses. Only models of clustering that resemble the observed distribution of galaxies in the properties of the two-point correlation function are considered. Monte Carlo simulations of the imaging process are described. It is found that the effect of clustering is too small to be significant, unless the mass of the deflectors is so large that gravitational lenses become common occurrences. A special model is described which was concocted to optimize the effect of clustering on gravitational lensing but still resemble the observed distribution of galaxies; even this simulation did not satisfactorily produce large numbers of wide-angle lenses.

  6. Large-angle correlations in the cosmic microwave background

    NASA Astrophysics Data System (ADS)

    Efstathiou, George; Ma, Yin-Zhe; Hanson, Duncan

    2010-10-01

    It has been argued recently by Copi et al. 2009 that the lack of large angular correlations of the CMB temperature field provides strong evidence against the standard, statistically isotropic, inflationary Lambda cold dark matter (ΛCDM) cosmology. We compare various estimators of the temperature correlation function showing how they depend on assumptions of statistical isotropy and how they perform on the Wilkinson Microwave Anisotropy Probe (WMAP) 5-yr Internal Linear Combination (ILC) maps with and without a sky cut. We show that the low multipole harmonics that determine the large-scale features of the temperature correlation function can be reconstructed accurately from the data that lie outside the sky cuts. The reconstructions are only weakly dependent on the assumed statistical properties of the temperature field. The temperature correlation functions computed from these reconstructions are in good agreement with those computed from the ILC map over the whole sky. We conclude that the large-scale angular correlation function for our realization of the sky is well determined. A Bayesian analysis of the large-scale correlations is presented, which shows that the data cannot exclude the standard ΛCDM model. We discuss the differences between our results and those of Copi et al. Either there exists a violation of statistical isotropy as claimed by Copi et al., or these authors have overestimated the significance of the discrepancy because of a posteriori choices of estimator, statistic and sky cut.

  7. A simulative comparison of respondent driven sampling with incentivized snowball sampling--the "strudel effect".

    PubMed

    Gyarmathy, V Anna; Johnston, Lisa G; Caplinskiene, Irma; Caplinskas, Saulius; Latkin, Carl A

    2014-02-01

    Respondent driven sampling (RDS) and incentivized snowball sampling (ISS) are two sampling methods that are commonly used to reach people who inject drugs (PWID). We generated a set of simulated RDS samples on an actual sociometric ISS sample of PWID in Vilnius, Lithuania ("original sample") to assess if the simulated RDS estimates were statistically significantly different from the original ISS sample prevalences for HIV (9.8%), Hepatitis A (43.6%), Hepatitis B (Anti-HBc 43.9% and HBsAg 3.4%), Hepatitis C (87.5%), syphilis (6.8%) and Chlamydia (8.8%) infections and for selected behavioral risk characteristics. The original sample consisted of a large component of 249 people (83% of the sample) and 13 smaller components with 1-12 individuals. Generally, as long as all seeds were recruited from the large component of the original sample, the simulation samples simply recreated the large component. There were no significant differences between the large component and the entire original sample for the characteristics of interest. Altogether 99.2% of 360 simulation sample point estimates were within the confidence interval of the original prevalence values for the characteristics of interest. When population characteristics are reflected in large network components that dominate the population, RDS and ISS may produce samples that have statistically non-different prevalence values, even though some isolated network components may be under-sampled and/or statistically significantly different from the main groups. This so-called "strudel effect" is discussed in the paper. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  8. A simulative comparison of respondent driven sampling with incentivized snowball sampling – the “strudel effect”

    PubMed Central

    Gyarmathy, V. Anna; Johnston, Lisa G.; Caplinskiene, Irma; Caplinskas, Saulius; Latkin, Carl A.

    2014-01-01

    Background Respondent driven sampling (RDS) and Incentivized Snowball Sampling (ISS) are two sampling methods that are commonly used to reach people who inject drugs (PWID). Methods We generated a set of simulated RDS samples on an actual sociometric ISS sample of PWID in Vilnius, Lithuania (“original sample”) to assess if the simulated RDS estimates were statistically significantly different from the original ISS sample prevalences for HIV (9.8%), Hepatitis A (43.6%), Hepatitis B (Anti-HBc 43.9% and HBsAg 3.4%), Hepatitis C (87.5%), syphilis (6.8%) and Chlamydia (8.8%) infections and for selected behavioral risk characteristics. Results The original sample consisted of a large component of 249 people (83% of the sample) and 13 smaller components with 1 to 12 individuals. Generally, as long as all seeds were recruited from the large component of the original sample, the simulation samples simply recreated the large component. There were no significant differences between the large component and the entire original sample for the characteristics of interest. Altogether 99.2% of 360 simulation sample point estimates were within the confidence interval of the original prevalence values for the characteristics of interest. Conclusions When population characteristics are reflected in large network components that dominate the population, RDS and ISS may produce samples that have statistically non-different prevalence values, even though some isolated network components may be under-sampled and/or statistically significantly different from the main groups. This so-called “strudel effect” is discussed in the paper. PMID:24360650

  9. Comparisons of false negative rates from a trend test alone and from a trend test jointly with a control-high groups pairwise test in the determination of the carcinogenicity of new drugs.

    PubMed

    Lin, Karl K; Rahman, Mohammad A

    2018-05-21

    Interest has been expressed in using a joint test procedure in which the results of both a trend test and a pairwise comparison test between the control and high groups must be statistically significant simultaneously, at the levels of significance recommended for the separate tests in the FDA 2001 draft guidance for industry document, in order for the drug effect on the development of an individual tumor type to be considered statistically significant. Results of our simulation studies show that there is a serious consequence of large inflations of the false negative rate, through large decreases in the false positive rate, when this joint test procedure is used in the final interpretation of the carcinogenicity potential of a new drug with the levels of significance recommended for the separate tests. The inflation can be as high as 204.5% of the false negative rate obtained when the trend test alone is required to be statistically significant. To correct the problem, new sets of levels of significance have also been developed for those who want to use the joint test in reviews of carcinogenicity studies.

  10. Using the bootstrap to establish statistical significance for relative validity comparisons among patient-reported outcome measures

    PubMed Central

    2013-01-01

    Background Relative validity (RV), a ratio of ANOVA F-statistics, is often used to compare the validity of patient-reported outcome (PRO) measures. We used the bootstrap to establish the statistical significance of the RV and to identify key factors affecting its significance. Methods Based on responses from 453 chronic kidney disease (CKD) patients to 16 CKD-specific and generic PRO measures, RVs were computed to determine how well each measure discriminated across clinically-defined groups of patients compared to the most discriminating (reference) measure. Statistical significance of RV was quantified by the 95% bootstrap confidence interval. Simulations examined the effects of sample size, denominator F-statistic, correlation between comparator and reference measures, and number of bootstrap replicates. Results The statistical significance of the RV increased as the magnitude of denominator F-statistic increased or as the correlation between comparator and reference measures increased. A denominator F-statistic of 57 conveyed sufficient power (80%) to detect an RV of 0.6 for two measures correlated at r = 0.7. Larger denominator F-statistics or higher correlations provided greater power. Larger sample size with a fixed denominator F-statistic or more bootstrap replicates (beyond 500) had minimal impact. Conclusions The bootstrap is valuable for establishing the statistical significance of RV estimates. A reasonably large denominator F-statistic (F > 57) is required for adequate power when using the RV to compare the validity of measures with small or moderate correlations (r < 0.7). Substantially greater power can be achieved when comparing measures of a very high correlation (r > 0.9). PMID:23721463
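
    A minimal sketch of the bootstrap RV idea (synthetic scores and group labels; the study's clinical grouping and its 16 PRO measures are not reproduced): resample patients with their group labels, recompute the ratio of ANOVA F-statistics, and read the 95% interval off the bootstrap distribution.

```python
# Bootstrap confidence interval for a relative validity (RV) ratio.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
groups = np.repeat([0, 1, 2], 150)                  # three clinical groups
ref = groups * 1.0 + rng.normal(size=groups.size)   # reference measure
comp = 0.7 * ref + rng.normal(size=groups.size)     # comparator measure

def rv(comp, ref, groups):
    f_comp = stats.f_oneway(*(comp[groups == g] for g in np.unique(groups)))[0]
    f_ref = stats.f_oneway(*(ref[groups == g] for g in np.unique(groups)))[0]
    return f_comp / f_ref

idx = np.arange(groups.size)
boot = []
for _ in range(500):
    b = rng.choice(idx, size=idx.size, replace=True)   # resample patients
    boot.append(rv(comp[b], ref[b], groups[b]))

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"RV = {rv(comp, ref, groups):.2f}, 95% bootstrap CI = ({lo:.2f}, {hi:.2f})")
```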

  11. 75 FR 54059 - Extension of Filing Accommodation for Static Pool Information in Filings With Respect to Asset...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-03

    ... information could include a significant amount of statistical information that would be difficult to file... required static pool information. Given the large amount of statistical information involved, commentators....; and 18 U.S.C. 1350. * * * * * 2. Amend Sec. 232.312 paragraph (a) introductory text by removing...

  12. Detection and Evaluation of Spatio-Temporal Spike Patterns in Massively Parallel Spike Train Data with SPADE.

    PubMed

    Quaglio, Pietro; Yegenoglu, Alper; Torre, Emiliano; Endres, Dominik M; Grün, Sonja

    2017-01-01

    Repeated, precise sequences of spikes are largely considered a signature of activation of cell assemblies. These repeated sequences are commonly known under the name of spatio-temporal patterns (STPs). STPs are hypothesized to play a role in the communication of information in the computational process operated by the cerebral cortex. A variety of statistical methods for the detection of STPs have been developed and applied to electrophysiological recordings, but such methods scale poorly with the current size of available parallel spike train recordings (more than 100 neurons). In this work, we introduce a novel method capable of overcoming the computational and statistical limits of existing analysis techniques in detecting repeating STPs within massively parallel spike trains (MPST). We employ advanced data mining techniques to efficiently extract repeating sequences of spikes from the data. Then, we introduce and compare two alternative approaches to distinguish statistically significant patterns from chance sequences. The first approach uses a measure known as conceptual stability, of which we investigate a computationally cheap approximation for applications to such large data sets. The second approach is based on the evaluation of pattern statistical significance. In particular, we provide an extension to STPs of a method we recently introduced for the evaluation of statistical significance of synchronous spike patterns. The performance of the two approaches is evaluated in terms of computational load and statistical power on a variety of artificial data sets that replicate specific features of experimental data. Both methods provide an effective and robust procedure for detection of STPs in MPST data. The method based on significance evaluation shows the best overall performance, although at a higher computational cost. We name the novel procedure the spatio-temporal Spike PAttern Detection and Evaluation (SPADE) analysis.

  13. Detection and Evaluation of Spatio-Temporal Spike Patterns in Massively Parallel Spike Train Data with SPADE

    PubMed Central

    Quaglio, Pietro; Yegenoglu, Alper; Torre, Emiliano; Endres, Dominik M.; Grün, Sonja

    2017-01-01

    Repeated, precise sequences of spikes are largely considered a signature of activation of cell assemblies. These repeated sequences are commonly known under the name of spatio-temporal patterns (STPs). STPs are hypothesized to play a role in the communication of information in the computational process operated by the cerebral cortex. A variety of statistical methods for the detection of STPs have been developed and applied to electrophysiological recordings, but such methods scale poorly with the current size of available parallel spike train recordings (more than 100 neurons). In this work, we introduce a novel method capable of overcoming the computational and statistical limits of existing analysis techniques in detecting repeating STPs within massively parallel spike trains (MPST). We employ advanced data mining techniques to efficiently extract repeating sequences of spikes from the data. Then, we introduce and compare two alternative approaches to distinguish statistically significant patterns from chance sequences. The first approach uses a measure known as conceptual stability, of which we investigate a computationally cheap approximation for applications to such large data sets. The second approach is based on the evaluation of pattern statistical significance. In particular, we provide an extension to STPs of a method we recently introduced for the evaluation of statistical significance of synchronous spike patterns. The performance of the two approaches is evaluated in terms of computational load and statistical power on a variety of artificial data sets that replicate specific features of experimental data. Both methods provide an effective and robust procedure for detection of STPs in MPST data. The method based on significance evaluation shows the best overall performance, although at a higher computational cost. We name the novel procedure the spatio-temporal Spike PAttern Detection and Evaluation (SPADE) analysis. PMID:28596729

  14. Principles of Statistics: What the Sports Medicine Professional Needs to Know.

    PubMed

    Riemann, Bryan L; Lininger, Monica R

    2018-07-01

    Understanding the results and statistics reported in original research remains a large challenge for many sports medicine practitioners and, in turn, may be one of the biggest barriers to integrating research into sports medicine practice. The purpose of this article is to provide the minimal essentials a sports medicine practitioner needs to know about interpreting statistics and research results to facilitate the incorporation of the latest evidence into practice. Topics covered include the difference between statistical significance and clinical meaningfulness; effect sizes and confidence intervals; reliability statistics, including the minimal detectable difference and minimal important difference; and statistical power. Copyright © 2018 Elsevier Inc. All rights reserved.
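
    As one concrete example of the reliability statistics mentioned, a sketch of the standard minimal-detectable-change calculation (illustrative numbers; the article's exact notation may differ):

```python
# Minimal detectable change (MDC95) from test-retest reliability.
import math

sd_baseline = 8.0   # SD of scores in the sample (illustrative)
icc = 0.85          # test-retest reliability (ICC, illustrative)

sem = sd_baseline * math.sqrt(1 - icc)   # standard error of measurement
mdc95 = 1.96 * math.sqrt(2) * sem        # smallest real change at 95%
print(f"SEM = {sem:.2f}, MDC95 = {mdc95:.2f} points")
```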

  15. Is There Any Real Observational Contradiction To The LCDM Model?

    NASA Astrophysics Data System (ADS)

    Ma, Yin-Zhe

    2011-01-01

    In this talk, I am going to question two apparent observational contradictions to LCDM cosmology: the lack of large-angle correlations in the cosmic microwave background, and the very large bulk flow of galaxy peculiar velocities. On the super-horizon scale, Copi et al. (2009) have been arguing that the lack of large angular correlations of the CMB temperature field provides strong evidence against the standard, statistically isotropic, LCDM cosmology. I am going to argue that the "ad-hoc" discrepancy is due to the sub-optimal estimator of the low-l multipoles, and a posteriori statistics, which exaggerate the statistical significance. On galactic scales, Watkins et al. (2008) show that the very large bulk flow prefers a very large density fluctuation, which seems to contradict the LCDM model. I am going to show that these results are due to their underestimation of the small-scale velocity dispersion, and an arbitrary way of combining catalogues. With an appropriate way of combining catalogue data, as well as treating the small-scale velocity dispersion as a free parameter, the peculiar velocity field provides unconvincing evidence against LCDM cosmology.

  16. Ketamine as a novel treatment for major depressive disorder and bipolar depression: a systematic review and quantitative meta-analysis.

    PubMed

    Lee, Ellen E; Della Selva, Megan P; Liu, Anson; Himelhoch, Seth

    2015-01-01

    Given the significant disability, morbidity and mortality associated with depression, the promising recent trials of ketamine highlight a novel intervention. A meta-analysis was conducted to assess the efficacy of ketamine in comparison with placebo for the reduction of depressive symptoms in patients who meet criteria for a major depressive episode. Two electronic databases were searched in September 2013 for English-language studies that were randomized placebo-controlled trials of ketamine treatment for patients with major depressive disorder or bipolar depression and utilized a standardized rating scale. Studies including participants receiving electroconvulsive therapy and adolescent/child participants were excluded. Five studies were included in the quantitative meta-analysis. The quantitative meta-analysis showed that ketamine significantly reduced depressive symptoms. The overall effect size at day 1 was large and statistically significant with an overall standardized mean difference of 1.01 (95% confidence interval 0.69-1.34) (P<.001), with the effects sustained at 7 days postinfusion. The heterogeneity of the studies was low and not statistically significant, and the funnel plot showed no publication bias. The large and statistically significant effect of ketamine on depressive symptoms supports a promising, new and effective pharmacotherapy with rapid onset, high efficacy and good tolerability. Copyright © 2015. Published by Elsevier Inc.

  17. Statistical Misconceptions and Rushton's Writings on Race.

    ERIC Educational Resources Information Center

    Cernovsky, Zack Z.

    The term "statistical significance" is often misunderstood or abused to imply a large effect size. A recent example is in the work of J. P. Rushton (1988, 1990) on differences between Negroids and Caucasoids. Rushton used brain size and cranial size as indicators of intelligence, using Pearson "r"s ranging from 0.03 to 0.35.…

  18. Not a Copernican observer: biased peculiar velocity statistics in the local Universe

    NASA Astrophysics Data System (ADS)

    Hellwing, Wojciech A.; Nusser, Adi; Feix, Martin; Bilicki, Maciej

    2017-05-01

    We assess the effect of the local large-scale structure on the estimation of two-point statistics of the observed radial peculiar velocities of galaxies. A large N-body simulation is used to examine these statistics from the perspective of random observers as well as 'Local Group-like' observers conditioned to reside in an environment resembling the observed Universe within 20 Mpc. The local environment systematically distorts the shape and amplitude of velocity statistics with respect to ensemble-averaged measurements made by a Copernican (random) observer. The Virgo cluster has the most significant impact, introducing large systematic deviations in all the statistics. For a simple 'top-hat' selection function, an idealized survey extending to ˜160 h-1 Mpc or deeper is needed to completely mitigate the effects of the local environment. Using shallower catalogues leads to systematic deviations of the order of 50-200 per cent depending on the scale considered. For a flat redshift distribution similar to the one of the CosmicFlows-3 survey, the deviations are even more prominent in both the shape and amplitude at all separations considered (≲100 h-1 Mpc). Conclusions based on statistics calculated without taking into account the impact of the local environment should be revisited.

  19. A study of mortality patterns at a tyre factory 1951-1985: a reference statistic dilemma.

    PubMed

    Veys, C A

    2004-08-01

    The general and cancer mortalities of rubber workers at a large tyre factory were studied in an area of marked regional variation in death rates. Three quinquennial intakes of male rubber workers engaged between January 1946 and December 1960 formed a composite cohort of 6454 men to be followed up. Over 99% were successfully traced by December 1985. The cohort analysis used both national and local rates as reference statistics for several causes. Between 1951 and 1985, a national standardized mortality ratio (SMRN) of 101 for all causes (based on 2556 deaths) was noted, whereas the local standardized mortality ratio (SMRL) was only 79. For all cancers, the figures were 115 (SMRN) and 93 (SMRL), for stomach cancer they were 137 (SMRN) and 84 (SMRL), and for lung cancer they were 121 (SMRN) and 94 (SMRL). No outright excesses against the national norm were observed for other cancers except for larynx, brain and central nervous system and thyroid cancer and the leukaemias. Excesses were statistically significant for cancer of the gallbladder and the bile ducts, for silicotuberculosis (SMRN = 1000) and for the pneumoconioses (SMRN = 706). Deaths from cerebrovascular diseases, chronic bronchitis and emphysema showed statistically significant deficits using either norm. These results from a large factory cohort study of rubber workers, followed for over three decades, demonstrate the marked discrepancy that can result from using only one reference statistic in areas of significant variation in mortality patterns.

  20. Bayesian evaluation of effect size after replicating an original study

    PubMed Central

    van Aert, Robbie C. M.; van Assen, Marcel A. L. M.

    2017-01-01

    The vast majority of published results in the literature is statistically significant, which raises concerns about their reliability. The Reproducibility Project Psychology (RPP) and Experimental Economics Replication Project (EE-RP) both replicated a large number of published studies in psychology and economics. The original study and replication were both statistically significant in 36.1% of cases in RPP and 68.8% in EE-RP, suggesting many null effects among the replicated studies. However, evidence in favor of the null hypothesis cannot be examined with null hypothesis significance testing. We developed a Bayesian meta-analysis method called snapshot hybrid that is easy to use and understand and quantifies the amount of evidence in favor of a zero, small, medium and large effect. The method computes posterior model probabilities for a zero, small, medium, and large effect and adjusts for publication bias by taking into account that the original study is statistically significant. We first analytically approximate the method's performance, and demonstrate the necessity to control for the original study's significance to enable the accumulation of evidence for a true zero effect. Then we applied the method to the data of RPP and EE-RP, showing that the underlying effect sizes of the included studies in EE-RP are generally larger than in RPP, but that the sample sizes of especially the included studies in RPP are often too small to draw definite conclusions about the true effect size. We also illustrate how snapshot hybrid can be used to determine the required sample size of the replication akin to power analysis in null hypothesis significance testing and present an easy to use web application (https://rvanaert.shinyapps.io/snapshot/) and R code for applying the method. PMID:28388646
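
    A rough sketch of the posterior-model-probability idea (normal likelihood approximation and equal prior weights assumed; the published snapshot hybrid method additionally corrects for the original study's significance, which is omitted here for brevity):

```python
# Posterior probabilities of zero/small/medium/large effects given a
# replication estimate, under a normal approximation.
from scipy import stats

d_rep, se_rep = 0.12, 0.15   # replication effect and standard error (toy)
effects = {"zero": 0.0, "small": 0.2, "medium": 0.5, "large": 0.8}

lik = {k: stats.norm.pdf(d_rep, loc=mu, scale=se_rep)
       for k, mu in effects.items()}
total = sum(lik.values())
for k, v in lik.items():
    print(f"P({k} effect | data) ~ {v / total:.2f}")
```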

  1. Using the Bootstrap Method for a Statistical Significance Test of Differences between Summary Histograms

    NASA Technical Reports Server (NTRS)

    Xu, Kuan-Man

    2006-01-01

    A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
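
    A minimal sketch of the proposed test for one of the three distances, the Euclidean distance (synthetic footprint values stand in for the satellite cloud-object data): normalize the two summary histograms, compute their distance, and compare it to a null distribution built by resampling from the pooled data.

```python
# Bootstrap significance test for the distance between summary histograms.
import numpy as np

rng = np.random.default_rng(7)
a = rng.gamma(2.0, 1.0, size=4000)   # footprints, size category A
b = rng.gamma(2.2, 1.0, size=2500)   # footprints, size category B
edges = np.linspace(0, 12, 25)

def hist(x):
    h, _ = np.histogram(x, bins=edges)
    return h / h.sum()               # normalized summary histogram

def euclid(x, y):
    return np.sqrt(((hist(x) - hist(y)) ** 2).sum())

obs = euclid(a, b)
pooled = np.concatenate([a, b])
null = []
for _ in range(1000):                # resample under the null: one pool
    s = rng.choice(pooled, size=pooled.size, replace=True)
    null.append(euclid(s[:a.size], s[a.size:]))
print(f"p ~ {(np.array(null) >= obs).mean():.3f}")
```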

  2. Seasonal variations of volcanic eruption frequencies

    NASA Technical Reports Server (NTRS)

    Stothers, Richard B.

    1989-01-01

    Do volcanic eruptions have a tendency to occur more frequently in the months of May and June? Some past evidence suggests that they do. The present study, based on the new eruption catalog of Simkin et al. (1981), investigates the monthly statistics of the largest eruptions, grouped according to explosive magnitude, geographical latitude, and year. At the 2-delta level, no month-to-month variations in eruption frequency are found to be statistically significant. Examination of previously published month-to-month variations suggests that they, too, are not statistically significant. It is concluded that volcanism, at least averaged over large portions of the globe, is probably not periodic on a seasonal or annual time scale.
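
    The kind of month-to-month test described can be illustrated with a chi-squared goodness-of-fit check against a constant eruption rate (the counts below are hypothetical, not taken from the Simkin et al. catalog, and the paper's own 2-delta procedure may differ in detail):

        import numpy as np
        from scipy import stats

        # Hypothetical monthly counts of large eruptions (Jan..Dec).
        counts = np.array([14, 11, 13, 12, 18, 17, 12, 10, 13, 12, 11, 12])

        # Expected counts are proportional to month lengths under a
        # constant (non-seasonal) eruption rate.
        days = np.array([31, 28.25, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31])
        expected = counts.sum() * days / days.sum()

        chi2, p = stats.chisquare(counts, expected)
        print(f"chi2 = {chi2:.2f}, p = {p:.3f}")  # large p: no significant seasonality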

  3. Statistical analysis of effective singular values in matrix rank determination

    NASA Technical Reports Server (NTRS)

    Konstantinides, Konstantinos; Yao, Kung

    1988-01-01

    A major problem in using SVD (singular-value decomposition) as a tool in determining the effective rank of a perturbed matrix is that of distinguishing between significantly small and significantly large singular values. To this end, confidence regions are derived for the perturbed singular values of matrices with noisy observation data. The analysis is based on the theories of perturbations of singular values and statistical significance testing. Threshold bounds for perturbation due to finite-precision and i.i.d. random models are evaluated. In random models, the threshold bounds depend on the dimension of the matrix, the noise variance, and a predefined statistical level of significance. The results are applied to the problem of determining the effective order of a linear autoregressive system from the approximate rank of a sample autocorrelation matrix. Various numerical examples illustrating the usefulness of these bounds and comparisons to other previously known approaches are given.
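
    A minimal sketch of the thresholding idea for the i.i.d. random-noise model (the simple threshold form below is an assumption for illustration; the paper derives proper confidence-region bounds):

        import numpy as np

        def effective_rank(A, sigma_noise, significance=3.0):
            # Count singular values rising significantly above the level
            # expected from i.i.d. noise of standard deviation sigma_noise.
            # The sqrt(max(m, n)) scaling with matrix dimension is a rough
            # stand-in for the paper's derived bounds.
            m, n = A.shape
            s = np.linalg.svd(A, compute_uv=False)
            threshold = significance * sigma_noise * np.sqrt(max(m, n))
            return int(np.sum(s > threshold)), s, threshold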

  4. Assessment of the midflexion rotational laxity in posterior-stabilized total knee arthroplasty.

    PubMed

    Hino, Kazunori; Kutsuna, Tatsuhiko; Oonishi, Yoshio; Watamori, Kunihiko; Kiyomatsu, Hiroshi; Iseki, Yasutake; Watanabe, Seiji; Ishimaru, Yasumitsu; Miura, Hiromasa

    2017-11-01

    To evaluate changes in midflexion rotational laxity before and after posterior-stabilized (PS)-total knee arthroplasty (TKA). Twenty-nine knees that underwent PS-TKA were evaluated. Manual mild passive rotational stress was applied to the knees, and the internal-external rotational angle was measured automatically by a navigation system at 30°, 45°, 60°, and 90° of knee flexion. The post-operative internal rotational laxity was statistically significantly increased compared to the preoperative level at 30°, 45°, 60°, and 90° of flexion. The post-operative external rotational laxity was statistically significantly decreased compared to the preoperative level at 45° and 60° of flexion. The post-operative internal-external rotational laxity was statistically significantly increased compared to the preoperative level only at 30° of flexion. The preoperative and post-operative rotational laxity showed a significant correlation at 30°, 45°, 60°, and 90° of flexion. Internal-external rotational laxity increases at the initial flexion range due to resection of both the anterior and posterior cruciate ligaments and retention of the collateral ligaments in PS-TKA. Preoperative and post-operative rotational laxity indicated a significant correlation at the midflexion range. This study showed that a large preoperative rotational laxity increased the risk of a large post-operative laxity, especially at the initial flexion range in PS-TKA. Level of evidence: III.

  5. Functional annotation of regulatory pathways.

    PubMed

    Pandey, Jayesh; Koyutürk, Mehmet; Kim, Yohan; Szpankowski, Wojciech; Subramaniam, Shankar; Grama, Ananth

    2007-07-01

    Standardized annotations of biomolecules in interaction networks (e.g. Gene Ontology) provide comprehensive understanding of the function of individual molecules. Extending such annotations to pathways is a critical component of functional characterization of cellular signaling at the systems level. We propose a framework for projecting gene regulatory networks onto the space of functional attributes using multigraph models, with the objective of deriving statistically significant pathway annotations. We first demonstrate that annotations of pairwise interactions do not generalize to indirect relationships between processes. Motivated by this result, we formalize the problem of identifying statistically overrepresented pathways of functional attributes. We establish the hardness of this problem by demonstrating the non-monotonicity of common statistical significance measures. We propose a statistical model that emphasizes the modularity of a pathway, evaluating its significance based on the coupling of its building blocks. We complement the statistical model by an efficient algorithm and software, Narada, for computing significant pathways in large regulatory networks. Comprehensive results from our methods applied to the Escherichia coli transcription network demonstrate that our approach is effective in identifying known, as well as novel biological pathway annotations. Narada is implemented in Java and is available at http://www.cs.purdue.edu/homes/jpandey/narada/.
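
    For orientation only, a per-annotation overrepresentation baseline of the kind such pathway methods build on (a one-sided hypergeometric test; this is not Narada's modularity-based model, which scores the coupling of a pathway's building blocks):

        from scipy import stats

        def annotation_enrichment_p(k, K, n, N):
            # Probability of observing at least k annotated genes among the
            # n genes of a pathway when K of all N genes carry the annotation.
            return stats.hypergeom.sf(k - 1, N, K, n)

        # e.g. 8 of 20 pathway genes annotated, 150 of 4000 genes genome-wide
        print(annotation_enrichment_p(8, 150, 20, 4000))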

  6. Improvement in Generic Problem-Solving Abilities of Students by Use of Tutor-less Problem-Based Learning in a Large Classroom Setting

    PubMed Central

    Klegeris, Andis; Bahniwal, Manpreet; Hurren, Heather

    2013-01-01

    Problem-based learning (PBL) was originally introduced in medical education programs as a form of small-group learning, but its use has now spread to large undergraduate classrooms in various other disciplines. Introduction of new teaching techniques, including PBL-based methods, needs to be justified by demonstrating the benefits of such techniques over classical teaching styles. Previously, we demonstrated that introduction of tutor-less PBL in a large third-year biochemistry undergraduate class increased student satisfaction and attendance. The current study assessed the generic problem-solving abilities of students from the same class at the beginning and end of the term, and compared student scores with similar data obtained in three classes not using PBL. Two generic problem-solving tests of equal difficulty were administered such that students took different tests at the beginning and the end of the term. Blinded marking showed a statistically significant 13% increase in the test scores of the biochemistry students exposed to PBL, while no trend toward significant change in scores was observed in any of the control groups not using PBL. Our study is among the first to demonstrate that use of tutor-less PBL in a large classroom leads to statistically significant improvement in generic problem-solving skills of students. PMID:23463230

  7. An Independent Filter for Gene Set Testing Based on Spectral Enrichment.

    PubMed

    Frost, H Robert; Li, Zhigang; Asselbergs, Folkert W; Moore, Jason H

    2015-01-01

    Gene set testing has become an indispensable tool for the analysis of high-dimensional genomic data. An important motivation for testing gene sets, rather than individual genomic variables, is to improve statistical power by reducing the number of tested hypotheses. Given the dramatic growth in common gene set collections, however, testing is often performed with nearly as many gene sets as underlying genomic variables. To address the challenge to statistical power posed by large gene set collections, we have developed spectral gene set filtering (SGSF), a novel technique for independent filtering of gene set collections prior to gene set testing. The SGSF method uses as a filter statistic the p-value measuring the statistical significance of the association between each gene set and the sample principal components (PCs), taking into account the significance of the associated eigenvalues. Because this filter statistic is independent of standard gene set test statistics under the null hypothesis but dependent under the alternative, the proportion of enriched gene sets is increased without impacting the type I error rate. As shown using simulated and real gene expression data, the SGSF algorithm accurately filters gene sets unrelated to the experimental outcome resulting in significantly increased gene set testing power.
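
    A hedged sketch of the spectral-filtering idea (a simplification for illustration, not the published SGSF statistic): correlate a gene set's mean expression with the top sample principal components and weight the evidence by the variance each component explains.

        import numpy as np
        from scipy import stats

        def spectral_filter_stat(X, gene_set, n_pcs=5):
            # X: (samples, genes) expression matrix; gene_set: column indices.
            Xc = X - X.mean(axis=0)
            U, S, _ = np.linalg.svd(Xc, full_matrices=False)
            weights = S[:n_pcs] ** 2 / np.sum(S ** 2)   # explained variance
            set_mean = X[:, gene_set].mean(axis=1)
            score = 0.0
            for j in range(n_pcs):
                _, p = stats.pearsonr(set_mean, U[:, j])
                score += weights[j] * -np.log(max(p, 1e-300))
            return score / weights.sum()   # larger = stronger PC association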

  8. Statistical Analysis of a Large Sample Size Pyroshock Test Data Set Including Post Flight Data Assessment. Revision 1

    NASA Technical Reports Server (NTRS)

    Hughes, William O.; McNelis, Anne M.

    2010-01-01

    The Earth Observing System (EOS) Terra spacecraft was launched on an Atlas IIAS launch vehicle on its mission to observe planet Earth in late 1999. Prior to launch, the new design of the spacecraft's pyroshock separation system was characterized by a series of 13 separation ground tests. The analysis methods used to evaluate this unusually large amount of shock data will be discussed in this paper, with particular emphasis on population distributions and finding statistically significant families of data, leading to an overall shock separation interface level. The wealth of ground test data also allowed a derivation of a Mission Assurance level for the flight. All of the flight shock measurements were below the EOS Terra Mission Assurance level thus contributing to the overall success of the EOS Terra mission. The effectiveness of the statistical methodology for characterizing the shock interface level and for developing a flight Mission Assurance level from a large sample size of shock data is demonstrated in this paper.

  9. Crank inertial load has little effect on steady-state pedaling coordination.

    PubMed

    Fregly, B J; Zajac, F E; Dairaghi, C A

    1996-12-01

    Inertial load can affect the control of a dynamic system whenever parts of the system are accelerated or decelerated. During steady-state pedaling, because within-cycle variations in crank angular acceleration still exist, the amount of crank inertia present (which varies widely with road-riding gear ratio) may affect the within-cycle coordination of muscles. However, the effect of inertial load on steady-state pedaling coordination is almost always assumed to be negligible, since the net mechanical energy per cycle developed by muscles only depends on the constant cadence and workload. This study tests the hypothesis that under steady-state conditions, the net joint torques produced by muscles at the hip, knee, and ankle are unaffected by crank inertial load. To perform the investigation, we constructed a pedaling apparatus which could emulate the low inertial load of a standard ergometer or the high inertial load of a road bicycle in high gear. Crank angle and bilateral pedal force and angle data were collected from ten subjects instructed to pedal steadily (i.e., constant speed across cycles) and smoothly (i.e., constant speed within a cycle) against both inertias at a constant workload. Virtually no statistically significant changes were found in the net hip and knee muscle joint torques calculated from an inverse dynamics analysis. Though the net ankle muscle joint torque, as well as the one- and two-legged crank torque, showed statistically significant increases at the higher inertia, the changes were small. In contrast, large statistically significant reductions were found in crank kinematic variability both within a cycle and between cycles (i.e., cadence), primarily because a larger inertial load means a slower crank dynamic response. Nonetheless, the reduction in cadence variability was somewhat attenuated by a large statistically significant increase in one-legged crank torque variability. We suggest, therefore, that muscle coordination during steady-state pedaling is largely unaffected, though less well regulated, when crank inertial load is increased.

  10. Establishing Interventions via a Theory-Driven Single Case Design Research Cycle

    ERIC Educational Resources Information Center

    Kilgus, Stephen P.; Riley-Tillman, T. Chris; Kratochwill, Thomas R.

    2016-01-01

    Recent studies have suggested single case design (SCD) intervention research is subject to publication bias, wherein studies are more likely to be published if they possess large or statistically significant effects and use rigorous experimental methods. The nature of SCD and the purposes for which it might be used could suggest that large effects…

  11. Does Time Spent Online Have an Influence on Student Performance? Evidence for a Large Business Studies Class

    ERIC Educational Resources Information Center

    Korkofingas, Con; Macri, Joseph

    2013-01-01

    This paper examines, using regression modelling, whether a statistically significant relationship exists between the time spent by a student using the course website and the student's assessment performance for a large third year university business forecasting course. We utilise the online tracking system in Blackboard, a web-based software…

  12. Role of the fibula in the stability of diaphyseal tibial fractures fixed by intramedullary nailing.

    PubMed

    Galbraith, John G; Daly, Charles J; Harty, James A; Dailey, Hannah L

    2016-10-01

    For tibial fractures, the decision to fix a concomitant fibular fracture is undertaken on a case-by-case basis. To aid in this clinical decision-making process, we investigated whether loss of integrity of the fibula significantly destabilises midshaft tibial fractures, whether fixation of the fibula restores stability to the tibia, and whether removal of the fibula and interosseous membrane for expediency in biomechanical testing significantly influences tibial interfragmentary mechanics. Tibia/fibula pairs were harvested from six cadaveric donors with the interosseous membrane intact. A tibial osteotomy fracture was fixed by reamed intramedullary (IM) nailing. Axial, torsion, bending, and shear tests were completed for four models of fibular involvement: intact fibula, osteotomy fracture, fibular plating, and resected fibula and interosseous membrane. Overall construct stiffness decreased slightly with fibular osteotomy compared to intact bone, but this change was not statistically significant. Under low loads, the influence of the fibula on construct stability was only statistically significant in torsion (large effect size). Fibular plating stiffened the construct slightly, but this change was not statistically significant compared to the fibular osteotomy case. Complete resection of the fibula and interosseous membrane significantly decreased construct torsional stiffness only (large effect size). These results suggest that fixation of the fibula may not contribute significantly to the stability of diaphyseal tibial fractures and should not be undertaken unless otherwise clinically indicated. For testing purposes, load-sharing through the interosseous membrane contributes significantly to overall construct mechanics, especially in torsion, and we recommend preservation of these structures when possible. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Statistics and Discoveries at the LHC (1/4)

    ScienceCinema

    Cowan, Glen

    2018-02-09

    The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit setting procedures, treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect and treatment of systematic uncertainties.

  14. Statistics and Discoveries at the LHC (3/4)

    ScienceCinema

    Cowan, Glen

    2018-02-19

    The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit setting procedures, treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect and treatment of systematic uncertainties.

  15. Statistics and Discoveries at the LHC (4/4)

    ScienceCinema

    Cowan, Glen

    2018-05-22

    The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit setting procedures, treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect and treatment of systematic uncertainties.

  16. Statistics and Discoveries at the LHC (2/4)

    ScienceCinema

    Cowan, Glen

    2018-04-26

    The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit setting procedures, treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect and treatment of systematic uncertainties.

  17. Cluster-level statistical inference in fMRI datasets: The unexpected behavior of random fields in high dimensions.

    PubMed

    Bansal, Ravi; Peterson, Bradley S

    2018-06-01

    Identifying regional effects of interest in MRI datasets usually entails testing a priori hypotheses across many thousands of brain voxels, requiring control for false positive findings in these multiple hypotheses testing. Recent studies have suggested that parametric statistical methods may have incorrectly modeled functional MRI data, thereby leading to higher false positive rates than their nominal rates. Nonparametric methods for statistical inference when conducting multiple statistical tests, in contrast, are thought to produce false positives at the nominal rate, which has thus led to the suggestion that previously reported studies should reanalyze their fMRI data using nonparametric tools. To understand better why parametric methods may yield excessive false positives, we assessed their performance when applied both to simulated datasets of 1D, 2D, and 3D Gaussian Random Fields (GRFs) and to 710 real-world, resting-state fMRI datasets. We showed that both the simulated 2D and 3D GRFs and the real-world data contain a small percentage (<6%) of very large clusters (on average 60 times larger than the average cluster size), which were not present in 1D GRFs. These unexpectedly large clusters were deemed statistically significant using parametric methods, leading to empirical familywise error rates (FWERs) as high as 65%: the high empirical FWERs were not a consequence of parametric methods failing to model spatial smoothness accurately, but rather of these very large clusters that are inherently present in smooth, high-dimensional random fields. In fact, when discounting these very large clusters, the empirical FWER for parametric methods was 3.24%. Furthermore, even an empirical FWER of 65% would yield on average less than one of those very large clusters in each brain-wide analysis. Nonparametric methods, in contrast, estimated distributions from those large clusters, and therefore, by construct rejected the large clusters as false positives at the nominal FWERs. Those rejected clusters were outlying values in the distribution of cluster size but cannot be distinguished from true positive findings without further analyses, including assessing whether fMRI signal in those regions correlates with other clinical, behavioral, or cognitive measures. Rejecting the large clusters, however, significantly reduced the statistical power of nonparametric methods in detecting true findings compared with parametric methods, which would have detected most true findings that are essential for making valid biological inferences in MRI data. Parametric analyses, in contrast, detected most true findings while generating relatively few false positives: on average, less than one of those very large clusters would be deemed a true finding in each brain-wide analysis. We therefore recommend the continued use of parametric methods that model nonstationary smoothness for cluster-level, familywise control of false positives, particularly when using a Cluster Defining Threshold of 2.5 or higher, and subsequently assessing rigorously the biological plausibility of the findings, even for large clusters. Finally, because nonparametric methods yielded a large reduction in statistical power to detect true positive findings, we conclude that the modest reduction in false positive findings that nonparametric analyses afford does not warrant a re-analysis of previously published fMRI studies using nonparametric techniques. Copyright © 2018 Elsevier Inc. All rights reserved.
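
    The empirical-FWER assessment can be sketched as follows (an illustration under assumed parameters; the field size, smoothness, cluster-defining threshold, and cluster-extent rule below are hypothetical, not the study's settings):

        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(1)

        def empirical_fwer(shape=(64, 64, 32), fwhm=3.0, cdt=2.5,
                           k_min=50, n_sim=200):
            # Smooth white noise into a Gaussian random field, threshold at
            # the cluster-defining threshold (CDT), and call any cluster of
            # at least k_min voxels a detection. On pure noise, the fraction
            # of fields with a detection is the empirical familywise error.
            sigma = fwhm / 2.355          # FWHM -> Gaussian sigma
            hits = 0
            for _ in range(n_sim):
                field = ndimage.gaussian_filter(rng.standard_normal(shape), sigma)
                field /= field.std()      # re-standardize after smoothing
                labels, n = ndimage.label(field > cdt)
                if n:
                    sizes = ndimage.sum(field > cdt, labels, range(1, n + 1))
                    hits += sizes.max() >= k_min
            return hits / n_sim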

  18. Results of the NaCo Large Program: probing the occurrence of exoplanets and brown dwarfs at wide orbit

    NASA Astrophysics Data System (ADS)

    Vigan, A.; Chauvin, G.; Bonavita, M.; Desidera, S.; Bonnefoy, M.; Mesa, D.; Beuzit, J.-L.; Augereau, J.-C.; Biller, B.; Boccaletti, A.; Brugaletta, E.; Buenzli, E.; Carson, J.; Covino, E.; Delorme, P.; Eggenberger, A.; Feldt, M.; Hagelberg, J.; Henning, T.; Lagrange, A.-M.; Lanzafame, A.; Ménard, F.; Messina, S.; Meyer, M.; Montagnier, G.; Mordasini, C.; Mouillet, D.; Moutou, C.; Mugnier, L.; Quanz, S. P.; Reggiani, M.; Ségransan, D.; Thalmann, C.; Waters, R.; Zurlo, A.

    2014-01-01

    Over the past decade, a growing number of deep imaging surveys have started to provide meaningful constraints on the population of extrasolar giant planets at large orbital separation. Primary targets for these surveys have been carefully selected based on their age, distance and spectral type, and often on their membership to young nearby associations where all stars share common kinematics, photometric and spectroscopic properties. The next step is a wider statistical analysis of the frequency and properties of low mass companions as a function of stellar mass and orbital separation. In late 2009, we initiated a coordinated European Large Program using angular differential imaging in the H band (1.66 μm) with NaCo at the VLT. Our aim is to provide a comprehensive and statistically significant study of the occurrence of extrasolar giant planets and brown dwarfs at large (5-500 AU) orbital separation around ~150 young, nearby stars, a large fraction of which have never been observed at very deep contrast. The survey has now been completed and we present the data analysis and detection limits for the observed sample, for which we reach the planetary-mass domain at separations of >~50 AU on average. We also present the results of the statistical analysis that has been performed over the 75 targets newly observed at high-contrast. We discuss the details of the statistical analysis and the physical constraints that our survey provides for the frequency and formation scenario of planetary mass companions at large separation.

  19. Statistically significant relational data mining

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berry, Jonathan W.; Leung, Vitus Joseph; Phillips, Cynthia Ann

    This report summarizes the work performed under the project "Statistically significant relational data mining." The goal of the project was to add more statistical rigor to the fairly ad hoc area of data mining on graphs, by developing better algorithms and better ways to evaluate algorithm quality. We concentrated on algorithms for community detection, approximate pattern matching, and graph similarity measures. Approximate pattern matching involves finding an instance of a relatively small pattern, expressed with tolerance, in a large graph of data observed with uncertainty. This report gathers the abstracts and references for the eight refereed publications that have appeared as part of this work. We then archive three pieces of research that have not yet been published. The first is theoretical and experimental evidence that a popular statistical measure for comparing community assignments favors over-resolved communities over approximations to a ground truth. The second is a set of statistically motivated methods for measuring the quality of an approximate match of a small pattern in a large graph. The third is a new probabilistic random graph model; statisticians favor such models for graph analysis, and the new local structure graph model overcomes some of the issues with popular models such as exponential random graph models and latent variable models.

  20. Finding Statistically Significant Communities in Networks

    PubMed Central

    Lancichinetti, Andrea; Radicchi, Filippo; Ramasco, José J.; Fortunato, Santo

    2011-01-01

    Community structure is one of the main structural features of networks, revealing both their internal organization and the similarity of their elementary units. Despite the large variety of methods proposed to detect communities in graphs, there is a great need for multi-purpose techniques able to handle different types of datasets and the subtleties of community structure. In this paper we present OSLOM (Order Statistics Local Optimization Method), the first method capable of detecting clusters in networks while accounting for edge directions, edge weights, overlapping communities, hierarchies and community dynamics. It is based on the local optimization of a fitness function expressing the statistical significance of clusters with respect to random fluctuations, which is estimated with tools of Extreme and Order Statistics. OSLOM can be used alone or as a refinement procedure of partitions/covers delivered by other techniques. We have also implemented sequential algorithms combining OSLOM with other fast techniques, so that the community structure of very large networks can be uncovered. Our method has performance comparable to that of the best existing algorithms on artificial benchmark graphs. Several applications on real networks are shown as well. OSLOM is implemented in freely available software (http://www.oslom.org), and we believe it will be a valuable tool in the analysis of networks. PMID:21559480

  1. Alignments of parity even/odd-only multipoles in CMB

    NASA Astrophysics Data System (ADS)

    Aluri, Pavan K.; Ralston, John P.; Weltman, Amanda

    2017-12-01

    We compare the statistics of parity even and odd multipoles of the cosmic microwave background (CMB) sky from Planck full mission temperature measurements. An excess power in odd multipoles compared to even multipoles has previously been found on large angular scales. Motivated by this apparent parity asymmetry, we evaluate directional statistics associated with even compared to odd multipoles, along with their significances. Primary tools are the Power tensor and Alignment tensor statistics. We limit our analysis to the first 60 multipoles i.e. l = [2, 61]. We find no evidence for statistically unusual alignments of even parity multipoles. More than one independent statistic finds evidence for alignments of anisotropy axes of odd multipoles, with a significance equivalent to ∼2σ or more. The robustness of alignment axes is tested by making Galactic cuts and varying the multipole range. Very interestingly, the region spanned by the (a)symmetry axes is found to broadly contain other parity (a)symmetry axes previously observed in the literature.

  2. Has the magnitude of floods across the USA changed with global CO2 levels?

    USGS Publications Warehouse

    Hirsch, Robert M.; Ryberg, Karen R.

    2012-01-01

    Statistical relationships between annual floods at 200 long-term (85–127 years of record) streamgauges in the coterminous United States and the global mean carbon dioxide concentration (GMCO2) record are explored. The streamgauge locations are limited to those with little or no regulation or urban development. The coterminous US is divided into four large regions and stationary bootstrapping is used to evaluate if the patterns of these statistical associations are significantly different from what would be expected under the null hypothesis that flood magnitudes are independent of GMCO2. In none of the four regions defined in this study is there strong statistical evidence for flood magnitudes increasing with increasing GMCO2. One region, the southwest, showed a statistically significant negative relationship between GMCO2 and flood magnitudes. The statistical methods applied compensate both for the inter-site correlation of flood magnitudes and the shorter-term (up to a few decades) serial correlation of floods.

  4. Analysis of Superintendent Longevity in Large School Districts: A Qualitative Study

    ERIC Educational Resources Information Center

    Mouton, Nikki Golar

    2013-01-01

    School district leadership matters, as evidenced by a meta-analysis of 27 reports and 1,210 districts conducted by Waters and Marzano (2006) which highlights a statistically significant correlation between district leadership and student achievement. Because this relationship is significant, it is important for school districts to have effective…

  5. Assessment and improvement of statistical tools for comparative proteomics analysis of sparse data sets with few experimental replicates.

    PubMed

    Schwämmle, Veit; León, Ileana Rodríguez; Jensen, Ole Nørregaard

    2013-09-06

    Large-scale quantitative analyses of biological systems are often performed with few replicate experiments, leading to multiple nonidentical data sets due to missing values. For example, mass spectrometry driven proteomics experiments are frequently performed with few biological or technical replicates due to sample-scarcity or due to duty-cycle or sensitivity constraints, or limited capacity of the available instrumentation, leading to incomplete results where detection of significant feature changes becomes a challenge. This problem is further exacerbated for the detection of significant changes on the peptide level, for example, in phospho-proteomics experiments. In order to assess the extent of this problem and the implications for large-scale proteome analysis, we investigated and optimized the performance of three statistical approaches by using simulated and experimental data sets with varying numbers of missing values. We applied three tools, including standard t test, moderated t test, also known as limma, and rank products for the detection of significantly changing features in simulated and experimental proteomics data sets with missing values. The rank product method was improved to work with data sets containing missing values. Extensive analysis of simulated and experimental data sets revealed that the performance of the statistical analysis tools depended on simple properties of the data sets. High-confidence results were obtained by using the limma and rank products methods for analyses of triplicate data sets that exhibited more than 1000 features and more than 50% missing values. The maximum number of differentially represented features was identified by using limma and rank products methods in a complementary manner. We therefore recommend combined usage of these methods as a novel and optimal way to detect significantly changing features in these data sets. This approach is suitable for large quantitative data sets from stable isotope labeling and mass spectrometry experiments and should be applicable to large data sets of any type. An R script that implements the improved rank products algorithm and the combined analysis is available.
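
    As a hedged baseline for the kind of analysis compared in the paper (a standard per-feature Welch t-test on data with missing values, followed by Benjamini-Hochberg control; limma's moderated t and the improved rank products are the better-performing methods the authors recommend):

        import numpy as np
        from scipy import stats

        def significant_features(A, B, alpha=0.05):
            # A, B: (features, replicates) arrays with NaN for missing values.
            pvals = []
            for a, b in zip(A, B):
                a, b = a[~np.isnan(a)], b[~np.isnan(b)]
                if len(a) < 2 or len(b) < 2:      # too few observations to test
                    pvals.append(1.0)
                    continue
                pvals.append(stats.ttest_ind(a, b, equal_var=False).pvalue)
            p = np.asarray(pvals)
            order = np.argsort(p)
            m = len(p)
            crit = alpha * np.arange(1, m + 1) / m   # Benjamini-Hochberg line
            passed = p[order] <= crit
            k = passed.nonzero()[0].max() + 1 if passed.any() else 0
            return order[:k]                          # indices of significant features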

  6. Planck 2015 results. XVI. Isotropy and statistics of the CMB

    NASA Astrophysics Data System (ADS)

    Planck Collaboration; Ade, P. A. R.; Aghanim, N.; Akrami, Y.; Aluri, P. K.; Arnaud, M.; Ashdown, M.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Bartolo, N.; Basak, S.; Battaner, E.; Benabed, K.; Benoît, A.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bock, J. J.; Bonaldi, A.; Bonavera, L.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Boulanger, F.; Bucher, M.; Burigana, C.; Butler, R. C.; Calabrese, E.; Cardoso, J.-F.; Casaponsa, B.; Catalano, A.; Challinor, A.; Chamballu, A.; Chiang, H. C.; Christensen, P. R.; Church, S.; Clements, D. L.; Colombi, S.; Colombo, L. P. L.; Combet, C.; Contreras, D.; Couchot, F.; Coulais, A.; Crill, B. P.; Cruz, M.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Désert, F.-X.; Diego, J. M.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Ducout, A.; Dupac, X.; Efstathiou, G.; Elsner, F.; Enßlin, T. A.; Eriksen, H. K.; Fantaye, Y.; Fergusson, J.; Fernandez-Cobos, R.; Finelli, F.; Forni, O.; Frailis, M.; Fraisse, A. A.; Franceschi, E.; Frejsel, A.; Frolov, A.; Galeotta, S.; Galli, S.; Ganga, K.; Gauthier, C.; Ghosh, T.; Giard, M.; Giraud-Héraud, Y.; Gjerløw, E.; González-Nuevo, J.; Górski, K. M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J. E.; Hansen, F. K.; Hanson, D.; Harrison, D. L.; Henrot-Versillé, S.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Hobson, M.; Holmes, W. A.; Hornstrup, A.; Hovest, W.; Huang, Z.; Huffenberger, K. M.; Hurier, G.; Jaffe, A. H.; Jaffe, T. R.; Jones, W. C.; Juvela, M.; Keihänen, E.; Keskitalo, R.; Kim, J.; Kisner, T. S.; Knoche, J.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lähteenmäki, A.; Lamarre, J.-M.; Lasenby, A.; Lattanzi, M.; Lawrence, C. R.; Leonardi, R.; Lesgourgues, J.; Levrier, F.; Liguori, M.; Lilje, P. B.; Linden-Vørnle, M.; Liu, H.; López-Caniego, M.; Lubin, P. M.; Macías-Pérez, J. F.; Maggio, G.; Maino, D.; Mandolesi, N.; Mangilli, A.; Marinucci, D.; Maris, M.; Martin, P. G.; Martínez-González, E.; Masi, S.; Matarrese, S.; McGehee, P.; Meinhold, P. R.; Melchiorri, A.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mikkelsen, K.; Mitra, S.; Miville-Deschênes, M.-A.; Molinari, D.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Murphy, J. A.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C. B.; Nørgaard-Nielsen, H. U.; Noviello, F.; Novikov, D.; Novikov, I.; Oxborrow, C. A.; Paci, F.; Pagano, L.; Pajot, F.; Pant, N.; Paoletti, D.; Pasian, F.; Patanchon, G.; Pearson, T. J.; Perdereau, O.; Perotto, L.; Perrotta, F.; Pettorino, V.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Popa, L.; Pratt, G. W.; Prézeau, G.; Prunet, S.; Puget, J.-L.; Rachen, J. P.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Renzi, A.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Rossetti, M.; Rotti, A.; Roudier, G.; Rubiño-Martín, J. A.; Rusholme, B.; Sandri, M.; Santos, D.; Savelainen, M.; Savini, G.; Scott, D.; Seiffert, M. D.; Shellard, E. P. S.; Souradeep, T.; Spencer, L. D.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sunyaev, R.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. A.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Trombetti, T.; Tucci, M.; Tuovinen, J.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Vielva, P.; Villa, F.; Wade, L. A.; Wandelt, B. D.; Wehus, I. K.; Yvon, D.; Zacchei, A.; Zibin, J. P.; Zonca, A.

    2016-09-01

    We test the statistical isotropy and Gaussianity of the cosmic microwave background (CMB) anisotropies using observations made by the Planck satellite. Our results are based mainly on the full Planck mission for temperature, but also include some polarization measurements. In particular, we consider the CMB anisotropy maps derived from the multi-frequency Planck data by several component-separation methods. For the temperature anisotropies, we find excellent agreement between results based on these sky maps over both a very large fraction of the sky and a broad range of angular scales, establishing that potential foreground residuals do not affect our studies. Tests of skewness, kurtosis, multi-normality, N-point functions, and Minkowski functionals indicate consistency with Gaussianity, while a power deficit at large angular scales is manifested in several ways, for example low map variance. The results of a peak statistics analysis are consistent with the expectations of a Gaussian random field. The "Cold Spot" is detected with several methods, including map kurtosis, peak statistics, and mean temperature profile. We thoroughly probe the large-scale dipolar power asymmetry, detecting it with several independent tests, and address the subject of a posteriori correction. Tests of directionality suggest the presence of angular clustering from large to small scales, but at a significance that is dependent on the details of the approach. We perform the first examination of polarization data, finding the morphology of stacked peaks to be consistent with the expectations of statistically isotropic simulations. Where they overlap, these results are consistent with the Planck 2013 analysis based on the nominal mission data and provide our most thorough view of the statistics of the CMB fluctuations to date.

  7. Planck 2015 results: XVI. Isotropy and statistics of the CMB

    DOE PAGES

    Ade, P. A. R.; Aghanim, N.; Akrami, Y.; ...

    2016-09-20

    In this paper, we test the statistical isotropy and Gaussianity of the cosmic microwave background (CMB) anisotropies using observations made by the Planck satellite. Our results are based mainly on the full Planck mission for temperature, but also include some polarization measurements. In particular, we consider the CMB anisotropy maps derived from the multi-frequency Planck data by several component-separation methods. For the temperature anisotropies, we find excellent agreement between results based on these sky maps over both a very large fraction of the sky and a broad range of angular scales, establishing that potential foreground residuals do not affect our studies. Tests of skewness, kurtosis, multi-normality, N-point functions, and Minkowski functionals indicate consistency with Gaussianity, while a power deficit at large angular scales is manifested in several ways, for example low map variance. The results of a peak statistics analysis are consistent with the expectations of a Gaussian random field. The “Cold Spot” is detected with several methods, including map kurtosis, peak statistics, and mean temperature profile. We thoroughly probe the large-scale dipolar power asymmetry, detecting it with several independent tests, and address the subject of a posteriori correction. Tests of directionality suggest the presence of angular clustering from large to small scales, but at a significance that is dependent on the details of the approach. We perform the first examination of polarization data, finding the morphology of stacked peaks to be consistent with the expectations of statistically isotropic simulations. Finally, where they overlap, these results are consistent with the Planck 2013 analysis based on the nominal mission data and provide our most thorough view of the statistics of the CMB fluctuations to date.

  8. Probability as certainty: dichotomous thinking and the misuse of p values.

    PubMed

    Hoekstra, Rink; Finch, Sue; Kiers, Henk A L; Johnson, Addie

    2006-12-01

    Significance testing is widely used and often criticized. The Task Force on Statistical Inference of the American Psychological Association (TFSI, APA; Wilkinson & TFSI, 1999) addressed the use of significance testing and made recommendations that were incorporated in the fifth edition of the APA Publication Manual (APA, 2001). They emphasized the interpretation of significance testing and the importance of reporting confidence intervals and effect sizes. We examined whether 286 Psychonomic Bulletin & Review articles submitted before and after the publication of the TFSI recommendations by APA complied with these recommendations. Interpretation errors when using significance testing were still made frequently, and the new prescriptions were not yet followed on a large scale. Changing the practice of reporting statistics seems doomed to be a slow process.

  9. System Study: Residual Heat Removal 1998-2014

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schroeder, John Alton

    2015-12-01

    This report presents an unreliability evaluation of the residual heat removal (RHR) system in two modes of operation (low-pressure injection in response to a large loss-of-coolant accident and post-trip shutdown-cooling) at 104 U.S. commercial nuclear power plants. Demand, run hours, and failure data from fiscal year 1998 through 2014 for selected components were obtained from the Institute of Nuclear Power Operations (INPO) Consolidated Events Database (ICES). The unreliability results are trended for the most recent 10-year period, while yearly estimates for system unreliability are provided for the entire active period. No statistically significant increasing trends were identified in the RHR results. A highly statistically significant decreasing trend was observed for the RHR injection mode start-only unreliability. Statistically significant decreasing trends were observed for RHR shutdown cooling mode start-only unreliability and RHR shutdown cooling mode 24-hour unreliability.

  10. Massive parallelization of serial inference algorithms for a complex generalized linear model

    PubMed Central

    Suchard, Marc A.; Simpson, Shawn E.; Zorych, Ivan; Ryan, Patrick; Madigan, David

    2014-01-01

    Following a series of high-profile drug safety disasters in recent years, many countries are redoubling their efforts to ensure the safety of licensed medical products. Large-scale observational databases such as claims databases or electronic health record systems are attracting particular attention in this regard, but present significant methodological and computational concerns. In this paper we show how high-performance statistical computation, including graphics processing units, relatively inexpensive highly parallel computing devices, can enable complex methods in large databases. We focus on optimization and massive parallelization of cyclic coordinate descent approaches to fit a conditioned generalized linear model involving tens of millions of observations and thousands of predictors in a Bayesian context. We find orders-of-magnitude improvement in overall run-time. Coordinate descent approaches are ubiquitous in high-dimensional statistics and the algorithms we propose open up exciting new methodological possibilities with the potential to significantly improve drug safety. PMID:25328363
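
    A minimal serial sketch of cyclic coordinate descent for an L2-penalized logistic regression (illustrative only; the paper's contribution is the massive GPU parallelization of such updates for a conditioned generalized linear model in a Bayesian setting):

        import numpy as np

        def ccd_logistic(X, y, lam=1.0, n_sweeps=100):
            # X: (n, d) design matrix; y: {0,1} outcomes; lam: ridge penalty.
            n, d = X.shape
            beta = np.zeros(d)
            eta = X @ beta                       # linear predictor, kept current
            for _ in range(n_sweeps):
                for j in range(d):               # one coordinate at a time
                    p = 1.0 / (1.0 + np.exp(-eta))
                    grad = X[:, j] @ (p - y) + lam * beta[j]
                    hess = (X[:, j] ** 2) @ (p * (1 - p)) + lam
                    step = grad / hess           # Newton step in coordinate j
                    beta[j] -= step
                    eta -= step * X[:, j]
            return beta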

  11. Computer assisted outcomes research in orthopedics: total joint replacement.

    PubMed

    Arslanian, C; Bond, M

    1999-06-01

    Long-term studies are needed to determine clinically relevant outcomes within the practice of orthopedic surgery. Historically, the patient's subjective feelings of quality of life have been largely ignored. However, there has been a strong movement toward measuring perceived quality of life through such instruments as the SF-36. Results from a large database compiled in an orthopedic practice are presented. First, computerized data entry using touch screen technology is not only cost effective but user friendly. Second, patients undergoing hip or knee arthroplasty surgeries make statistically significant improvements in seven of the eight domains of the SF-36 in the first 3 months after surgery. Additional statistically significant improvements over the next 6 to 12 months are also seen. The data are presented here in detail to demonstrate the benefits of a patient outcomes program, to enhance the understanding and use of outcomes data and to encourage further work in outcomes measurement in orthopedics.

  12. A Meta-Analysis of Hypnotherapeutic Techniques in the Treatment of PTSD Symptoms.

    PubMed

    O'Toole, Siobhan K; Solomon, Shelby L; Bergdahl, Stephen A

    2016-02-01

    The efficacy of hypnotherapeutic techniques as treatment for symptoms of posttraumatic stress disorder (PTSD) was explored through meta-analytic methods. Studies were selected through a search of 29 databases. Altogether, 81 studies discussing hypnotherapy and PTSD were reviewed for inclusion criteria. The outcomes of 6 studies representing 391 participants were analyzed using meta-analysis. Evaluation of effect sizes related to avoidance and intrusion, in addition to overall PTSD symptoms after hypnotherapy treatment, revealed that all studies showed that hypnotherapy had a positive effect on PTSD symptoms. The overall Cohen's d was large (-1.18) and statistically significant (p < .001). Effect sizes varied based on study quality; however, they were large and statistically significant. Using the classic fail-safe N to assess for publication bias, it was determined it would take 290 nonsignificant studies to nullify these findings. Copyright © 2016 International Society for Traumatic Stress Studies.
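
    The classic fail-safe N used here can be computed from the studies' z scores with Rosenthal's formula (the inputs below are placeholders, not the meta-analysis data):

        from math import ceil
        from scipy import stats

        def rosenthal_failsafe_n(z_scores, alpha=0.05):
            # Number of unpublished null studies needed to raise the combined
            # Stouffer p-value above alpha: solve sum(z)/sqrt(k + N) = z_alpha.
            z_a = stats.norm.ppf(1 - alpha)
            k, s = len(z_scores), sum(z_scores)
            return max(0, ceil(s ** 2 / z_a ** 2 - k))

        print(rosenthal_failsafe_n([2.4, 3.1, 2.8, 3.6, 2.2, 3.0]))  # placeholder z's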

  13. An operational definition of a statistically meaningful trend.

    PubMed

    Bryhn, Andreas C; Dimberg, Peter H

    2011-04-28

    Linear trend analysis of time series is standard procedure in many scientific disciplines. If the number of data points is large, a trend may be statistically significant even if data are scattered far from the trend line. This study introduces and tests a quality criterion for time trends referred to as statistical meaningfulness, which is a stricter quality criterion for trends than high statistical significance. The time series is divided into intervals and interval mean values are calculated. Thereafter, r² and p values are calculated from regressions concerning time and interval mean values. If r² ≥ 0.65 at p ≤ 0.05 in any of these regressions, then the trend is regarded as statistically meaningful. Out of ten investigated time series from different scientific disciplines, five displayed statistically meaningful trends. A Microsoft Excel application (add-in) was developed which can perform statistical meaningfulness tests and which may increase the operationality of the test. The presented method for distinguishing statistically meaningful trends should be reasonably uncomplicated for researchers with basic statistics skills and may thus be useful for determining which trends are worth analysing further, for instance with respect to causal factors. The method can also be used for determining which segments of a time trend may be particularly worthwhile to focus on.
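
    The test as described translates directly into code (a sketch; the choice of five intervals is an assumption, since the abstract does not fix the number):

        import numpy as np
        from scipy import stats

        def statistically_meaningful(t, x, n_intervals=5):
            # Divide the series into intervals, compute interval means, and
            # regress mean values on mean times; the trend is statistically
            # meaningful if r^2 >= 0.65 at p <= 0.05 for this regression.
            t_means = [c.mean() for c in np.array_split(np.asarray(t, float), n_intervals)]
            x_means = [c.mean() for c in np.array_split(np.asarray(x, float), n_intervals)]
            res = stats.linregress(t_means, x_means)
            return res.rvalue ** 2 >= 0.65 and res.pvalue <= 0.05, res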

  14. Effectiveness of thyroid gland shielding in dental CBCT using a paediatric anthropomorphic phantom

    PubMed Central

    Davies, J; Horner, K; Theodorakou, C

    2015-01-01

    Objectives: The purpose of the study is to evaluate the effectiveness of thyroid shielding in dental CBCT examinations using a paediatric anthropomorphic phantom. Methods: An ATOM® 706-C anthropomorphic phantom (Computerized Imaging Reference Systems Inc., Norfolk, VA) representing a 10-year-old child was loaded with six thermoluminescent dosemeters positioned at the level of the thyroid gland. Absorbed doses to the thyroid were measured for five commercially available thyroid shields using a large field of view (FOV). Results: A statistically significant thyroid gland dose reduction was found using thyroid shielding for paediatric CBCT examinations for a large FOV. In addition, a statistically significant difference in thyroid gland doses was found depending on the position of the thyroid gland. There was little difference in the effectiveness of thyroid shielding when using a lead vs a lead-equivalent thyroid shield. Similar dose reduction was found using 0.25- and 0.50-mm lead-equivalent thyroid shields. Conclusions: Thyroid shields are to be recommended when undertaking large FOV CBCT examinations on young patients. PMID:25411710

  15. Potentiation Effects of Half-Squats Performed in a Ballistic or Nonballistic Manner.

    PubMed

    Suchomel, Timothy J; Sato, Kimitake; DeWeese, Brad H; Ebben, William P; Stone, Michael H

    2016-06-01

    This study examined and compared the acute effects of ballistic and nonballistic concentric-only half-squats (COHSs) on squat jump performance. Fifteen resistance-trained men performed a squat jump 2 minutes after a control protocol or 2 COHSs at 90% of their 1 repetition maximum (1RM) COHS performed in a ballistic or nonballistic manner. Jump height (JH), peak power (PP), and allometrically scaled peak power (PPa) were compared using three 3 × 2 repeated-measures analyses of variance. Statistically significant condition × time interaction effects existed for JH (p = 0.037), PP (p = 0.041), and PPa (p = 0.031). Post hoc analysis revealed that the ballistic condition produced statistically greater JH (p = 0.017 and p = 0.036), PP (p = 0.031 and p = 0.026), and PPa (p = 0.024 and p = 0.023) than the control and nonballistic conditions, respectively. Small effect sizes for JH, PP, and PPa existed during the ballistic condition (d = 0.28-0.44), whereas trivial effect sizes existed during the control (d = 0.0-0.18) and nonballistic (d = 0.0-0.17) conditions. Large statistically significant relationships existed between the JH potentiation response and the subject's relative back squat 1RM (r = 0.520; p = 0.047) and relative COHS 1RM (r = 0.569; p = 0.027) during the ballistic condition. In addition, a large statistically significant relationship existed between the JH potentiation response and the subject's relative back squat strength (r = 0.633; p = 0.011), whereas the moderate relationship with the subject's relative COHS strength trended toward significance (r = 0.483; p = 0.068). Ballistic COHS produced superior potentiation effects compared with COHS performed in a nonballistic manner. Relative strength may contribute to the elicited potentiation response after ballistic and nonballistic COHS.

  16. [Lymphocytic infiltration in uveal melanoma].

    PubMed

    Sach, J; Kocur, J

    1993-11-01

    Following our observation of lymphocytic infiltration in uveal melanomas, we present a theoretical review of this interesting topic. Owing to the relatively low incidence of this feature, we have not yet assembled a collection of cases large enough to support statistically significant conclusions.

  17. What's in a Name?

    NASA Astrophysics Data System (ADS)

    Bonneau, Joseph; Just, Mike; Matthews, Greg

    We study the efficiency of statistical attacks on human authentication systems relying on personal knowledge questions. We adapt techniques from guessing theory to measure security against a trawling attacker attempting to compromise a large number of strangers' accounts. We then examine a diverse corpus of real-world statistical distributions for likely answer categories such as the names of people, pets, and places and find that personal knowledge questions are significantly less secure than graphical or textual passwords. We also demonstrate that statistics can be used to increase security by proactively shaping the answer distribution to lower the prevalence of common responses.

  18. Statistical physics inspired energy-efficient coded-modulation for optical communications.

    PubMed

    Djordjevic, Ivan B; Xu, Lei; Wang, Ting

    2012-04-15

    Because Shannon's entropy can be obtained by Stirling's approximation of thermodynamics entropy, the statistical physics energy minimization methods are directly applicable to the signal constellation design. We demonstrate that statistical physics inspired energy-efficient (EE) signal constellation designs, in combination with large-girth low-density parity-check (LDPC) codes, significantly outperform conventional LDPC-coded polarization-division multiplexed quadrature amplitude modulation schemes. We also describe an EE signal constellation design algorithm. Finally, we propose the discrete-time implementation of D-dimensional transceiver and corresponding EE polarization-division multiplexed system. © 2012 Optical Society of America
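
    The link invoked in the first sentence is the standard derivation (not reproduced from the paper itself): for a sequence of N symbols in which symbol i occurs n_i times, Boltzmann's entropy reduces to Shannon's under Stirling's approximation \ln N! \approx N \ln N - N,

        S = k_B \ln \frac{N!}{\prod_i n_i!} \approx -k_B N \sum_i p_i \ln p_i, \qquad p_i = \frac{n_i}{N}.

    Dividing by N k_B recovers Shannon's entropy H = -\sum_i p_i \ln p_i, which is why entropy-based energy-minimization arguments from statistical physics carry over to signal-constellation design.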

  19. Variations in intensity statistics for representational and abstract art, and for art from the Eastern and Western hemispheres.

    PubMed

    Graham, Daniel J; Field, David J

    2008-01-01

    Two recent studies suggest that natural scenes and paintings show similar statistical properties. But does the content or region of origin of an artwork affect its statistical properties? We addressed this question by having judges place paintings from a large, diverse collection of paintings into one of three subject-matter categories using a forced-choice paradigm. Basic statistics for images whose categorization was agreed on by all judges showed no significant differences between those judged to be 'landscape' and 'portrait/still-life', but these two classes differed from paintings judged to be 'abstract'. All categories showed basic spatial statistical regularities similar to those typical of natural scenes. A test of the full painting collection (140 images) with respect to the works' place of origin (provenance) showed significant differences between Eastern works and Western ones, differences which we find are likely related to the materials and the choice of background color. Although artists deviate slightly from reproducing natural statistics in abstract art (compared to representational art), the great majority of human art likely shares basic statistical limitations. We argue that statistical regularities in art are rooted in the need to make art visible to the eye, not in the inherent aesthetic value of natural-scene statistics, and we suggest that variability in spatial statistics may be generally imposed by manufacture.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Xiangkun; Pan, Chuzhong; Fan, Zuhui

    With numerical simulations, we analyze in detail how bad-data removal, i.e., the mask effect, can influence the peak statistics of the weak-lensing convergence field reconstructed from the shear measurement of background galaxies. It is found that high peak fractions are systematically enhanced because of the presence of masks; the larger the masked area is, the higher the enhancement is. In the case where the total masked area is about 13% of the survey area, the fraction of peaks with signal-to-noise ratio ν ≥ 3 is ∼11% of the total number of peaks, compared with ∼7% of the mask-free case in our considered cosmological model. This can have significant effects on cosmological studies with weak-lensing convergence peak statistics, inducing a large bias in the parameter constraints if the effects are not taken into account properly. Even for a survey area of 9 deg², the bias in (Ω_m, σ_8) is already intolerably large and close to 3σ. It is noted that most of the affected peaks are close to the masked regions. Therefore, excluding peaks in those regions in the peak statistics can reduce the bias effect but at the expense of losing usable survey areas. Further investigations find that the enhancement of the number of high peaks around the masked regions can be largely attributed to the smaller number of galaxies usable in the weak-lensing convergence reconstruction, leading to higher noise than that of the areas away from the masks. We thus develop a model in which we exclude only those very large masks with radius larger than 3' but keep all the other masked regions in peak counting statistics. For the remaining part, we treat the areas close to and away from the masked regions separately with different noise levels. It is shown that this two-noise-level model can account for the mask effect on peak statistics very well, and the bias in cosmological parameters is significantly reduced if this model is applied in the parameter fitting.

  1. Statistical comparison of coherent structures in fully developed turbulent pipe flow with and without drag reduction

    NASA Astrophysics Data System (ADS)

    Sogaro, Francesca; Poole, Robert; Dennis, David

    2014-11-01

    High-speed stereoscopic particle image velocimetry has been performed in fully developed turbulent pipe flow at moderate Reynolds numbers with and without a drag-reducing additive (an aqueous solution of high molecular weight polyacrylamide). Three-dimensional large and very large-scale motions (LSM and VLSM) are extracted from the flow fields by a detection algorithm and the characteristics for each case are statistically compared. The results show that the three-dimensional extent of VLSMs in drag reduced (DR) flow appears to increase significantly compared to their Newtonian counterparts. A statistical increase in azimuthal extent of DR VLSM is observed by means of two-point spatial autocorrelation of the streamwise velocity fluctuation in the radial-azimuthal plane. Furthermore, a remarkable increase in length of these structures is observed by three-dimensional two-point spatial autocorrelation. These results are accompanied by an analysis of the swirling strength in the flow field that shows a significant reduction in strength and number of the vortices for the DR flow. The findings suggest that the damping of the small scales due to polymer addition results in the undisturbed development of longer flow structures.

  2. Erratum to "Large-scale mitochondrial COI gene sequence variability reflects the complex colonization history of the invasive soft-shell clam, Mya arenaria (L.) (Bivalvia)" [Estuar. Coast. Shelf Sci. 181 (2016) 256-265]

    NASA Astrophysics Data System (ADS)

    Lasota, Rafal; Pierscieniak, Karolina; Garcia, Pascale; Simon-Bouhet, Benoit; Wolowicz, Maciej

    2017-03-01

    The publisher regrets a printing error in the last paragraph in the Results section. The correct text should read as follows: Tajima's D, Fu and Li's D* and F*, and Fu's Fs were negative for all American populations, and statistically significant in most cases (Table 3). In most of the European populations the values of neutrality tests were positive, but not statistically significant. The highest positive values of neutrality tests were noted in the populations from Reykjavik (Iceland) and Dublin (Ireland) (Table 3).

  3. Tracking of large-scale structures in turbulent channel with direct numerical simulation of low Prandtl number passive scalar

    NASA Astrophysics Data System (ADS)

    Tiselj, Iztok

    2014-12-01

    Channel flow DNS (Direct Numerical Simulation) at friction Reynolds number 180 and with passive scalars of Prandtl numbers 1 and 0.01 was performed in various computational domains. The "normal"-size domain was ∼2300 wall units long and ∼750 wall units wide, with the size taken from the similar DNS of Moser et al. The "large" computational domain, intended to be sufficient to capture the largest structures of the turbulent flow, was 3 times longer and 3 times wider than the "normal" domain. The "very large" domain was 6 times longer and 6 times wider than the "normal" domain. All simulations were performed with the same spatial and temporal resolution. Comparison of the standard and large computational domains shows velocity field statistics (mean velocity, root-mean-square (RMS) fluctuations, and turbulent Reynolds stresses) that agree within 1%-2%. Similar agreement is observed for the Pr = 1 temperature fields and also for the mean temperature profiles at Pr = 0.01. These differences can be attributed to the statistical uncertainties of the DNS. However, second-order moments, i.e., RMS temperature fluctuations, of the standard and large computational domains at Pr = 0.01 show significant differences of up to 20%. Stronger temperature fluctuations in the "large" and "very large" domains confirm the existence of large-scale structures. Their influence is more or less invisible in the main velocity field statistics or in the statistics of the temperature fields at Prandtl numbers around 1. However, these structures play a visible role in the temperature fluctuations at low Prandtl number, where high temperature diffusivity effectively smears the small-scale structures in the thermal field and enhances the relative contribution of large scales. These large thermal structures are an echo of the large-scale velocity structures: the highest temperature-velocity correlations are observed not between instantaneous temperatures and instantaneous streamwise velocities, but between instantaneous temperatures and velocities averaged over a certain time interval.
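
    The closing observation suggests a simple diagnostic: correlate the instantaneous scalar with a running average of the velocity and scan the averaging window. A toy sketch with synthetic signals (the low-Pr scalar is mimicked by heavily smoothed velocity plus noise; all numbers are illustrative):

    ```python
    import numpy as np

    def corr_with_averaged_velocity(T, u, window):
        """Correlate instantaneous temperature with velocity averaged over `window` samples."""
        kernel = np.ones(window) / window
        u_avg = np.convolve(u, kernel, mode="same")
        return np.corrcoef(T, u_avg)[0, 1]

    # Toy signals: the scalar responds sluggishly (high diffusivity) to velocity
    rng = np.random.default_rng(2)
    u = rng.standard_normal(10_000)
    T = np.convolve(u, np.ones(200) / 200, mode="same") + 0.1 * rng.standard_normal(10_000)
    for w in (1, 50, 200, 800):
        print(w, round(corr_with_averaged_velocity(T, u, w), 3))
    # The correlation peaks at an intermediate averaging window, not at w = 1.
    ```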

  4. Outcomes of early carotid stenting and angioplasty in large-vessel anterior circulation strokes treated with mechanical thrombectomy and intravenous thrombolytics.

    PubMed

    Mehta, T; Desai, N; Mehta, K; Parikh, R; Male, S; Hussain, M; Ollenschleger, M; Spiegel, G; Grande, A; Ezzeddine, M; Jagadeesan, B; Tummala, R; McCullough, L

    2018-01-01

    Introduction Proximal cervical internal carotid artery stenosis greater than 50% merits revascularization to mitigate the risk of stroke recurrence among large-vessel anterior circulation strokes undergoing mechanical thrombectomy. Carotid artery stenting necessitates the use of antiplatelets, and there is a theoretical increased risk of hemorrhagic transformation given that such patients may already have received intravenous thrombolytics and have a significant infarct burden. We investigate the outcomes of large-vessel anterior circulation stroke patients treated with intravenous thrombolytics receiving same-day carotid stenting or selective angioplasty compared to no carotid intervention. Materials and methods The study cohort was obtained from the National (Nationwide) Inpatient Sample database between 2006 and 2014, using International Statistical Classification of Diseases, ninth revision discharge diagnosis and procedure codes. A total of 11,825 patients with large-vessel anterior circulation stroke treated with intravenous thrombolytics and mechanical thrombectomy on the same day were identified. The study population was subdivided into three subgroups: no carotid intervention, same-day carotid angioplasty without carotid stenting, and same-day carotid stenting. Outcomes were assessed with respect to mortality, significant disability at discharge, hemorrhagic transformation, and requirement of percutaneous endoscopic gastrostomy tube placement, prolonged mechanical ventilation, or craniotomy. Results This study found no statistically significant difference in patient outcomes in those treated with concurrent carotid stenting compared to no carotid intervention in terms of morbidity or mortality. Conclusions If indicated, it is reasonable to consider concurrent carotid stenting and/or angioplasty for large-vessel anterior circulation stroke patients treated with mechanical thrombectomy who also receive intravenous thrombolytics.

  5. Institute for Brain and Neural Systems

    DTIC Science & Technology

    2009-10-06

    to deal with computational complexity when analyzing large amounts of information in visual scenes. It seems natural that in addition to exploring...algorithms using methods from statistical pattern recognition and machine learning. Over the last fifteen years, significant advances had been made in...recognition, robustness to noise and ability to cope with significant variations in lighting conditions. Identifying an occluded target adds another layer of

  6. Can Meditation Influence Quality of Life, Depression, and Disease Outcome in Multiple Sclerosis? Findings from a Large International Web-Based Study

    PubMed Central

    Levin, Adam B.; Hadgkiss, Emily J.; Weiland, Tracey J.; Marck, Claudia H.; van der Meer, Dania M.; Pereira, Naresh G.; Jelinek, George A.

    2014-01-01

    Objectives. To explore the association between meditation and health related quality of life (HRQOL), depression, fatigue, disability level, relapse rates, and disease activity in a large international sample of people with multiple sclerosis (MS). Methods. Participants were invited to take part in an online survey and answer questions relating to HRQOL, depression, fatigue, disability, relapse rates, and their involvement in meditation practices. Results. Statistically and potentially clinically significant differences between those who meditated once a week or more and participants who never meditated were present for mean mental health composite (MHC) scores, cognitive function scale, and health perception scale. The MHC results remained statistically significant on multivariate regression modelling when covariates were accounted for. Physical health composite (PHC) scores were higher in those that meditated; however, the differences were probably not clinically significant. Among those who meditated, fewer screened positive for depression, but there was no relationship with fatigue or relapse rate. Those with worsened disability levels were more likely to meditate. Discussion. The study reveals a significant association between meditation, lower risk of depression, and improved HRQOL in people with MS. PMID:25477709

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ade, P. A. R.; Aghanim, N.; Akrami, Y.

    In this paper, we test the statistical isotropy and Gaussianity of the cosmic microwave background (CMB) anisotropies using observations made by the Planck satellite. Our results are based mainly on the full Planck mission for temperature, but also include some polarization measurements. In particular, we consider the CMB anisotropy maps derived from the multi-frequency Planck data by several component-separation methods. For the temperature anisotropies, we find excellent agreement between results based on these sky maps over both a very large fraction of the sky and a broad range of angular scales, establishing that potential foreground residuals do not affect our studies. Tests of skewness, kurtosis, multi-normality, N-point functions, and Minkowski functionals indicate consistency with Gaussianity, while a power deficit at large angular scales is manifested in several ways, for example low map variance. The results of a peak statistics analysis are consistent with the expectations of a Gaussian random field. The “Cold Spot” is detected with several methods, including map kurtosis, peak statistics, and mean temperature profile. We thoroughly probe the large-scale dipolar power asymmetry, detecting it with several independent tests, and address the subject of a posteriori correction. Tests of directionality suggest the presence of angular clustering from large to small scales, but at a significance that is dependent on the details of the approach. We perform the first examination of polarization data, finding the morphology of stacked peaks to be consistent with the expectations of statistically isotropic simulations. Finally, where they overlap, these results are consistent with the Planck 2013 analysis based on the nominal mission data and provide our most thorough view of the statistics of the CMB fluctuations to date.

  8. OSPAR standard method and software for statistical analysis of beach litter data.

    PubMed

    Schulz, Marcus; van Loon, Willem; Fleet, David M; Baggelaar, Paul; van der Meulen, Eit

    2017-09-15

    The aim of this study is to develop standard statistical methods and software for the analysis of beach litter data. The optimal ensemble of statistical methods comprises the Mann-Kendall trend test, the Theil-Sen slope estimation, the Wilcoxon step trend test and basic descriptive statistics. The application of Litter Analyst, a tailor-made software for analysing the results of beach litter surveys, to OSPAR beach litter data from seven beaches bordering on the south-eastern North Sea revealed 23 significant trends in the abundances of beach litter types for the period 2009-2014. Litter Analyst also showed a large variation in the abundance of litter types between beaches. To reduce the effects of spatial variation, trend analysis of beach litter data can most effectively be performed at the beach or national level. Spatial aggregation of beach litter data within a region is possible, but resulted in a considerable reduction in the number of significant trends. Copyright © 2017 Elsevier Ltd. All rights reserved.
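
    The core of the named ensemble is easy to reproduce. A minimal sketch of the Mann-Kendall test (equivalent to Kendall's τ against time) and the Theil-Sen slope using SciPy, on a toy series of per-survey litter counts; Litter Analyst's own implementation is not reproduced here.

    ```python
    import numpy as np
    from scipy.stats import kendalltau, theilslopes

    def trend_summary(counts):
        """Mann-Kendall trend test plus Theil-Sen slope for one litter-type series.

        counts: litter items per survey, in chronological order.
        """
        t = np.arange(len(counts))
        tau, p = kendalltau(t, counts)             # MK test == Kendall's tau vs. time
        slope, intercept, lo, hi = theilslopes(counts, t)
        return {"tau": tau, "p": p, "slope": slope, "slope_95ci": (lo, hi)}

    surveys = [120, 90, 100, 75, 80, 60, 55, 70, 50, 40, 45, 30]  # toy data
    print(trend_summary(surveys))
    ```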

  9. Large eddy simulation of orientation and rotation of ellipsoidal particles in isotropic turbulent flows

    NASA Astrophysics Data System (ADS)

    Chen, Jincai; Jin, Guodong; Zhang, Jian

    2016-03-01

    The rotational motion and orientational distribution of ellipsoidal particles in turbulent flows are of significance in environmental and engineering applications. Whereas the translational motion of an ellipsoidal particle is controlled by the turbulent motions at large scales, its rotational motion is determined by the fluid velocity gradient tensor at small scales, which raises a challenge when predicting the rotational dispersion of ellipsoidal particles using the large eddy simulation (LES) method, owing to the lack of subgrid scale (SGS) fluid motions. We report the effects of the SGS fluid motions on the orientational and rotational statistics, such as the alignment between the long axis of ellipsoidal particles and the vorticity and the mean rotational energy at various aspect ratios, against those obtained with direct numerical simulation (DNS) and filtered DNS. The performances of a stochastic differential equation (SDE) model for the SGS velocity gradient seen by the particles and the approximate deconvolution method (ADM) for LES are investigated. It is found that the missing SGS fluid motions in LES flow fields have significant effects on the rotational statistics of ellipsoidal particles. Alignment between the particles and the vorticity is weakened, and the rotational energy of the particles is reduced in LES. The SGS-SDE model leads to a large error in predicting the alignment between the particles and the vorticity and over-predicts the rotational energy of rod-like particles. The ADM significantly improves the rotational energy prediction of particles in LES.

  10. Does bearing size influence metal ion levels in large-head metal-on-metal total hip arthroplasty? A comparison of three total hip systems

    PubMed Central

    2014-01-01

    Background The purpose of the study was twofold: first, to determine whether there is a statistically significant difference in the metal ion levels among three different large-head metal-on-metal (MOM) total hip systems. The second objective was to assess whether position of the implanted prostheses, patient demographics or factors such as activity levels influence overall blood metal ion levels and whether there is a difference in the functional outcomes between the systems. Methods In a cross-sectional cohort study, three different metal-on-metal total hip systems were assessed: two monoblock heads, the Durom socket (Zimmer, Warsaw, IN, USA) and the Birmingham socket (Smith and Nephew, Memphis, TN, USA), and one modular metal-on-metal total hip system (Pinnacle, Depuy Orthopedics, Warsaw, IN, USA). Fifty-four patients were recruited, with a mean age of 59.7 years and a mean follow-up time of 41 months (12 to 60). Patients were evaluated clinically, radiologically and biochemically. Statistical analysis was performed on all collected data to assess any differences between the three groups in terms of overall blood metal ion levels and also to identify whether there was any other factor within the group demographics and outcomes that could influence the mean levels of Co and Cr. Results Although the functional outcome scores were similar in all three groups, the blood metal ion levels in the larger monoblock large heads (Durom, Birmingham sockets) were significantly raised compared with those of the Pinnacle group. In addition, the metal ion levels were not found to have a statistically significant relationship to the anteversion or abduction angles as measured on the radiographs. Conclusions When considering a MOM THR, the use of a monoblock large-head system leads to higher elevations in whole blood metal ions and offers no advantage over a smaller head modular system. PMID:24472283

  11. Statistics of galaxy orientations - Morphology and large-scale structure

    NASA Technical Reports Server (NTRS)

    Lambas, Diego G.; Groth, Edward J.; Peebles, P. J. E.

    1988-01-01

    Using the Uppsala General Catalog of bright galaxies and the northern and southern maps of the Lick counts of galaxies, statistical evidence of a morphology-orientation effect is found. Major axes of elliptical galaxies are preferentially oriented along the large-scale features of the Lick maps. However, the orientations of the major axes of spiral and lenticular galaxies show no clear signs of significant nonrandom behavior at a level of less than about one-fifth of the effect seen for ellipticals. The angular scale of the detected alignment effect for Uppsala ellipticals extends to at least theta of about 2 deg, which at a redshift of z of about 0.02 corresponds to a linear scale of about 2/h Mpc.

  12. Evidence for speckle effects on pulsed CO2 lidar signal returns from remote targets

    NASA Technical Reports Server (NTRS)

    Menzies, R. T.; Kavaya, M. J.; Flamant, P. H.

    1984-01-01

    A pulsed CO2 lidar was used to study statistical properties of signal returns from various rough surfaces at distances near 2 km. These included natural in situ topographic materials as well as man-made hard targets. Three lidar configurations were used: heterodyne detection with single temporal mode transmitter pulses, and direct detection with single and multiple temporal mode pulses. The significant differences in signal return statistics, due largely to speckle effects, are discussed.

  13. Simulated performance of an order statistic threshold strategy for detection of narrowband signals

    NASA Technical Reports Server (NTRS)

    Satorius, E.; Brady, R.; Deich, W.; Gulkis, S.; Olsen, E.

    1988-01-01

    The application of order statistics to signal detection is becoming an increasingly active area of research. This is due to the inherent robustness of rank estimators in the presence of large outliers that would significantly degrade more conventional mean-level-based detection systems. A detection strategy is presented in which the threshold estimate is obtained using order statistics. The performance of this algorithm in the presence of simulated interference and broadband noise is evaluated. In this way, the robustness of the proposed strategy in the presence of the interference can be fully assessed as a function of the interference, noise, and detector parameters.
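
    A minimal sketch of the idea, assuming power-spectrum bins with exponentially distributed broadband noise; the rank fraction and threshold margin are illustrative settings, not the paper's calibrated values.

    ```python
    import numpy as np

    def order_statistic_threshold(spectrum, k_frac=0.5, scale=15.0):
        """Set a detection threshold from an order statistic of the spectral bins.

        Using a mid-rank bin (the median when k_frac=0.5) instead of the mean
        keeps strong narrowband outliers from inflating the noise estimate.
        `scale` sets the margin above the noise floor; 15 is an illustrative
        choice for exponential noise, not a calibrated false-alarm setting.
        """
        k = int(k_frac * len(spectrum))
        noise_level = np.sort(spectrum)[k]        # k-th order statistic
        return scale * noise_level

    rng = np.random.default_rng(3)
    spectrum = rng.exponential(1.0, 4096)         # broadband noise power bins
    spectrum[[100, 2000]] += 50.0                 # two injected narrowband signals
    thr = order_statistic_threshold(spectrum)
    print("detections at bins:", np.flatnonzero(spectrum > thr))
    ```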

  14. Radon-222 concentrations in ground water and soil gas on Indian reservations in Wisconsin

    USGS Publications Warehouse

    DeWild, John F.; Krohelski, James T.

    1995-01-01

    For sites with wells finished in the sand and gravel aquifer, the coefficient of determination (R2) of the regression of concentration of radon-222 in ground water as a function of well depth is 0.003 and the significance level is 0.32, which indicates that there is not a statistically significant relation between radon-222 concentrations in ground water and well depth. The coefficient of determination of the regression of radon-222 in ground water and soil gas is 0.19 and the root mean square error of the regression line is 271 picocuries per liter. Even though the significance level (0.036) indicates a statistical relation, the root mean square error of the regression is so large that the regression equation would not give reliable predictions. Because of an inadequate number of samples, similar statistical analyses could not be performed for sites with wells finished in the crystalline and sedimentary bedrock aquifers.
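
    The reported quantities (R², significance level, root mean square error) come from an ordinary least-squares fit. A sketch of the same computation on synthetic stand-in data; the values below are invented, chosen only to mimic a near-flat relation with large scatter.

    ```python
    import numpy as np
    from scipy.stats import linregress

    # Toy stand-in for the radon-vs-well-depth regression (synthetic values)
    rng = np.random.default_rng(4)
    depth = rng.uniform(10, 100, 30)                     # well depth
    radon = 300 + 0.05 * depth + rng.normal(0, 270, 30)  # pCi/L, essentially flat

    fit = linregress(depth, radon)
    print(f"R^2 = {fit.rvalue**2:.3f}, p = {fit.pvalue:.2f}")
    # A tiny R^2 with a large p mirrors "no significant relation"; and even a
    # significant p is useless for prediction when the residual error is this large.
    ```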

  15. How Universal Is the Relationship Between Remotely Sensed Vegetation Indices (VI) and Crop Leaf Area Index (LAI)?

    NASA Technical Reports Server (NTRS)

    Kang, Yanghui; Ozdogan, Mutlu; Zipper, Samuel C.; Roman, Miguel

    2016-01-01

    Global LAI-VI relationships are statistically significant, crop-specific, and mostly non-linear. This research enables the operationalization of large-area crop modeling and, by extension, has relevance to both fundamental and applied agroecosystem research.

  16. Power of mental health nursing research: a statistical analysis of studies in the International Journal of Mental Health Nursing.

    PubMed

    Gaskin, Cadeyrn J; Happell, Brenda

    2013-02-01

    Having sufficient power to detect effect sizes of an expected magnitude is a core consideration when designing studies in which inferential statistics will be used. The main aim of this study was to investigate the statistical power in studies published in the International Journal of Mental Health Nursing. From volumes 19 (2010) and 20 (2011) of the journal, studies were analysed for their power to detect small, medium, and large effect sizes, according to Cohen's guidelines. The power of the 23 studies included in this review to detect small, medium, and large effects was 0.34, 0.79, and 0.94, respectively. In 90% of papers, no adjustments for experiment-wise error were reported. With a median of nine inferential tests per paper, the mean experiment-wise error rate was 0.51. A priori power analyses were only reported in 17% of studies. Although effect sizes for correlations and regressions were routinely reported, effect sizes for other tests (χ²-tests, t-tests, ANOVA/MANOVA) were largely absent from the papers. All types of effect sizes were infrequently interpreted. Researchers are strongly encouraged to conduct power analyses when designing studies, and to avoid scattergun approaches to data analysis (i.e. undertaking large numbers of tests in the hope of finding 'significant' results). Because reviewing effect sizes is essential for determining the clinical significance of study findings, researchers would better serve the field of mental health nursing if they reported and interpreted effect sizes. © 2012 The Authors. International Journal of Mental Health Nursing © 2012 Australian College of Mental Health Nurses Inc.
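
    The power figures quoted above follow directly from Cohen's benchmark effect sizes. A sketch of the forward and inverse calculations with statsmodels, using an illustrative group size of 50 rather than any reviewed study's actual n:

    ```python
    from statsmodels.stats.power import TTestIndPower

    # Power of a two-group t-test at Cohen's small/medium/large effects,
    # for an illustrative sample of 50 per group at alpha = 0.05
    analysis = TTestIndPower()
    for label, d in [("small", 0.2), ("medium", 0.5), ("large", 0.8)]:
        power = analysis.power(effect_size=d, nobs1=50, alpha=0.05)
        print(f"{label:6s} d={d}: power = {power:.2f}")

    # The inverse (a priori) problem: sample size for 80% power at d = 0.5
    n = analysis.solve_power(effect_size=0.5, power=0.8, alpha=0.05)
    print(f"n per group for 80% power at d=0.5: {n:.0f}")
    ```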

  17. Reproductive potential of Spodoptera eridania (Stoll) (Lepidoptera: Noctuidae) in the laboratory: effect of multiple couples and the size.

    PubMed

    Specht, A; Montezano, D G; Sosa-Gómez, D R; Paula-Moraes, S V; Roque-Specht, V F; Barros, N M

    2016-06-01

    This study aimed to evaluate the effect of keeping three couples in the same cage, and of the size of adults emerging from small, medium-sized and large pupae (278.67 mg, 333.20 mg and 381.58 mg, respectively), on the reproductive potential of S. eridania (Stoll, 1782) adults under controlled conditions (25 ± 1 °C, 70% RH and 14-hour photophase). We evaluated the survival, number of copulations, fecundity and fertility of the adult females. The survival of females from these different pupal sizes did not differ significantly, but the survival of males from large pupae was significantly shorter than that of males from small pupae. Fecundity differed significantly and correlated positively with size. The number of effective copulations (spermatophores) and fertility did not vary significantly with pupal size. Our results emphasize the importance of reporting the number of copulations and the size of the insects when reproductive parameters are compared.

  18. Balance exercise for persons with multiple sclerosis using Wii games: a randomised, controlled multi-centre study.

    PubMed

    Nilsagård, Ylva E; Forsberg, Anette S; von Koch, Lena

    2013-02-01

    The use of interactive video games is expanding within rehabilitation. The evidence base is, however, limited. Our aim was to evaluate the effects of a Nintendo Wii Fit® balance exercise programme on balance function and walking ability in people with multiple sclerosis (MS). A multi-centre, randomised, controlled single-blinded trial was conducted with random allocation to exercise or no exercise. The exercise group participated in a programme of 12 supervised 30-min sessions of balance exercises using Wii games, twice a week for 6-7 weeks. The primary outcome was the Timed Up and Go test (TUG). In total, 84 participants were enrolled; four were lost to follow-up. After the intervention, there were no statistically significant differences between groups, but effect sizes were moderate for the TUG, TUGcognitive, and the Dynamic Gait Index (DGI) and small for all other measures. Statistically significant improvements within the exercise group were present for all measures (large to moderate effect sizes) except walking speed and balance confidence. The non-exercise group showed statistically significant improvements for the Four Square Step Test and the DGI. In comparison with no intervention, a programme of supervised balance exercise using Nintendo Wii Fit® did not render statistically significant between-group differences, but presented moderate effect sizes for several measures of balance performance.

  19. Graphical augmentations to the funnel plot assess the impact of additional evidence on a meta-analysis.

    PubMed

    Langan, Dean; Higgins, Julian P T; Gregory, Walter; Sutton, Alexander J

    2012-05-01

    We aim to illustrate the potential impact of a new study on a meta-analysis, which gives an indication of the robustness of the meta-analysis. A number of augmentations are proposed to one of the most widely used of graphical displays, the funnel plot. Namely, 1) statistical significance contours, which define regions of the funnel plot in which a new study would have to be located to change the statistical significance of the meta-analysis; and 2) heterogeneity contours, which show how a new study would affect the extent of heterogeneity in a given meta-analysis. Several other features are also described, and the use of multiple features simultaneously is considered. The statistical significance contours suggest that one additional study, no matter how large, may have a very limited impact on the statistical significance of a meta-analysis. The heterogeneity contours illustrate that one outlying study can increase the level of heterogeneity dramatically. The additional features of the funnel plot have applications including 1) informing sample size calculations for the design of future studies eligible for inclusion in the meta-analysis; and 2) informing the updating prioritization of a portfolio of meta-analyses such as those prepared by the Cochrane Collaboration. Copyright © 2012 Elsevier Inc. All rights reserved.
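
    The significance contours can be prototyped with a fixed-effect pooled estimate: for each hypothetical new study at (effect, standard error), recompute the pooled z and draw the |z| = 1.96 level. A sketch with invented study data, assuming a fixed-effect model; the paper's machinery (e.g., heterogeneity contours) is more general.

    ```python
    import numpy as np
    import matplotlib.pyplot as plt

    # Existing meta-analysis: effect estimates and standard errors (toy values)
    y = np.array([0.10, 0.25, 0.35, 0.05, 0.30])
    se = np.array([0.15, 0.10, 0.20, 0.12, 0.25])
    w = 1 / se**2

    # For a hypothetical new study at (effect, se), the fixed-effect pooled z is
    # (sum(w_i y_i) + w y) / sqrt(sum(w_i) + w); contour where |z| = 1.96.
    eff_grid, se_grid = np.meshgrid(np.linspace(-1, 1, 300), np.linspace(0.01, 0.5, 300))
    w_new = 1 / se_grid**2
    z_new = (np.sum(w * y) + w_new * eff_grid) / np.sqrt(np.sum(w) + w_new)

    plt.contour(eff_grid, se_grid, np.abs(z_new), levels=[1.96], colors="k")
    plt.scatter(y, se)
    plt.gca().invert_yaxis()   # funnel plot convention: precise studies on top
    plt.xlabel("effect of hypothetical new study")
    plt.ylabel("standard error")
    plt.savefig("funnel_contour.png")
    ```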

  20. Water-quality trends for selected sites in the Boulder River and Tenmile Creek watersheds, Montana, based on data collected during water years 1997-2013

    USGS Publications Warehouse

    Sando, Steven K.; Clark, Melanie L.; Cleasby, Thomas E.; Barnhart, Elliott P.

    2015-01-01

    Trend results for sites in the Tenmile Creek watershed generally are more variable and difficult to interpret than for sites in the Boulder River watershed. Trend results for Tenmile Creek above City Diversion (site 11) and Minnehaha Creek near Rimini (site 12) for water years 2000–13 indicate decreasing trends in FACs of cadmium, copper, and zinc. The magnitudes of the decreasing trends in FACs of copper generally are moderate and statistically significant for sites 11 and 12. The magnitudes of the decreasing trends in FACs of cadmium and zinc for site 11 are minor to small and not statistically significant; however, the magnitudes for site 12 are moderate and statistically significant. In general, patterns in FACs for Tenmile Creek near Rimini (site 13) are not well represented by fitted trends within the short data collection period, which might indicate that the trend-analysis structure of the study is not appropriate for describing trends in FACs for site 13. The large decreasing trend in FACs of suspended sediment is the strongest indication of change in water quality during the short period of record for site 13; however, this trend is not statistically significant.

  1. Comparison of the Effects of Walking with and without Nordic Pole on Upper Extremity and Lower Extremity Muscle Activation.

    PubMed

    Shim, Je-Myung; Kwon, Hae-Yeon; Kim, Ha-Roo; Kim, Bo-In; Jung, Ju-Hyeon

    2013-12-01

    [Purpose] The aim of this study was to assess the effect of Nordic pole walking on the electromyographic activities of upper extremity and lower extremity muscles. [Subjects and Methods] The subjects were randomly divided into two groups as follows: a group walking without Nordic poles (n=13) and a group walking with Nordic poles (n=13). The EMG data were collected while the subjects walked on a treadmill for 30 minutes, measured from one heel strike to the next. [Results] Both the average and maximum values of upper extremity muscle activity were higher in the group that used Nordic poles than in the group that did not, and the differences were statistically significant. There was an increase in the average value of latissimus dorsi muscle activity that was not statistically significant, although the increase in its maximum value was statistically significant. The average and maximum values of lower extremity muscle activity did not show large differences between the groups, and the values did not show any statistically significant differences. [Conclusion] The use of Nordic poles increased muscle activity of the upper extremity compared with regular walking but did not affect the lower extremity.

  2. Comparison of the Effects of Walking with and without Nordic Pole on Upper Extremity and Lower Extremity Muscle Activation

    PubMed Central

    Shim, Je-myung; Kwon, Hae-yeon; Kim, Ha-roo; Kim, Bo-in; Jung, Ju-hyeon

    2014-01-01

    [Purpose] The aim of this study was to assess the effect of Nordic pole walking on the electromyographic activities of upper extremity and lower extremity muscles. [Subjects and Methods] The subjects were randomly divided into two groups as follows: a group walking without Nordic poles (n=13) and a group walking with Nordic poles (n=13). The EMG data were collected while the subjects walked on a treadmill for 30 minutes, measured from one heel strike to the next. [Results] Both the average and maximum values of upper extremity muscle activity were higher in the group that used Nordic poles than in the group that did not, and the differences were statistically significant. There was an increase in the average value of latissimus dorsi muscle activity that was not statistically significant, although the increase in its maximum value was statistically significant. The average and maximum values of lower extremity muscle activity did not show large differences between the groups, and the values did not show any statistically significant differences. [Conclusion] The use of Nordic poles increased muscle activity of the upper extremity compared with regular walking but did not affect the lower extremity. PMID:24409018

  3. Comparative Analysis of Serum (Anti)oxidative Status Parameters in Healthy Persons

    PubMed Central

    Jansen, Eugène HJM; Ruskovska, Tatjana

    2013-01-01

    Five antioxidant and two oxidative stress assays were applied to serum samples of 43 healthy males. The antioxidant tests showed different inter-assay correlations. A very good correlation of 0.807 was observed between the ferric reducing ability of plasma (FRAP) and total antioxidant status (TAS) assays, and a fair correlation of 0.501 between the biological antioxidant potential (BAP) and TAS assays. There was no statistically significant correlation between the BAP and FRAP assays. The antioxidant assays correlate strongly with uric acid, especially the TAS (0.922) and FRAP (0.869) assays. The BAP assay has a much lower, not statistically significant correlation with uric acid (0.302), which makes BAP more suitable for assessing antioxidant status. The total thiol assay showed no statistically significant correlation with uric acid (0.114). The total thiol assay, which is based on a completely different principle, showed a good and statistically significant correlation with the BAP assay (0.510) and also with the TAS assay, though to a lower, nonsignificant extent (0.279), and no correlation with the FRAP assay (−0.008). The oxy-adsorbent test (OXY) assay has no correlation with any of the other assays tested. The oxidative stress assays, reactive oxygen metabolites (ROM) and total oxidant status (TOS), which are based on different principles, do not show a statistically significant correlation with each other in the serum samples of this study. Both assays showed a negative, but not significant, correlation with the antioxidant assays. In conclusion, the ROM, TOS, BAP and TTP assays are based on different principles and will have additional value when a combination of these assays is applied in large-scale population studies. PMID:23507749

  4. Association between exposure to radiofrequency electromagnetic fields assessed by dosimetry and acute symptoms in children and adolescents: a population based cross-sectional study

    PubMed Central

    2010-01-01

    Background The increase in the number of mobile phone users has been accompanied by some concern that exposure to radiofrequency electromagnetic fields (RF EMF) might adversely affect acute health, especially in children and adolescents. The authors investigated this potential association using personal dosimeters. Methods A 24-hour exposure profile of 1484 children and 1508 adolescents was generated in a population-based cross-sectional study in Germany between 2006 and 2008 (participation 52%). Personal interview data on socio-demographic characteristics, self-reported exposure and potential confounders were collected. Acute symptoms were assessed twice during the study day using a symptom diary. Results Only a few of the large number of investigated associations were found to be statistically significant. At noon, adolescents with a measured exposure in the highest quartile during morning hours reported a significantly higher intensity of headache (odds ratio: 1.50; 95% confidence interval: 1.03, 2.19). At bedtime, adolescents with a measured exposure in the highest quartile during afternoon hours reported a significantly higher intensity of irritation in the evening (4th quartile 1.79; 1.23, 2.61), while children reported a significantly higher intensity of concentration problems (4th quartile 1.55; 1.02, 2.33). Conclusions We observed few statistically significant results, and they were not consistent over the two time points. Furthermore, when only the 10% of participants with the highest exposure were considered, the significant results of the main analysis could not be confirmed. Based on the pattern of these results, we assume that the few observed significant associations are not causal but rather occurred by chance. PMID:21108839

  5. Association between exposure to radiofrequency electromagnetic fields assessed by dosimetry and acute symptoms in children and adolescents: a population based cross-sectional study.

    PubMed

    Heinrich, Sabine; Thomas, Silke; Heumann, Christian; von Kries, Rüdiger; Radon, Katja

    2010-11-25

    The increase in the number of mobile phone users has been accompanied by some concern that exposure to radiofrequency electromagnetic fields (RF EMF) might adversely affect acute health, especially in children and adolescents. The authors investigated this potential association using personal dosimeters. A 24-hour exposure profile of 1484 children and 1508 adolescents was generated in a population-based cross-sectional study in Germany between 2006 and 2008 (participation 52%). Personal interview data on socio-demographic characteristics, self-reported exposure and potential confounders were collected. Acute symptoms were assessed twice during the study day using a symptom diary. Only a few of the large number of investigated associations were found to be statistically significant. At noon, adolescents with a measured exposure in the highest quartile during morning hours reported a significantly higher intensity of headache (odds ratio: 1.50; 95% confidence interval: 1.03, 2.19). At bedtime, adolescents with a measured exposure in the highest quartile during afternoon hours reported a significantly higher intensity of irritation in the evening (4th quartile 1.79; 1.23, 2.61), while children reported a significantly higher intensity of concentration problems (4th quartile 1.55; 1.02, 2.33). We observed few statistically significant results, and they were not consistent over the two time points. Furthermore, when only the 10% of participants with the highest exposure were considered, the significant results of the main analysis could not be confirmed. Based on the pattern of these results, we assume that the few observed significant associations are not causal but rather occurred by chance.

  6. Large wood mobility processes in low-order Chilean river channels

    NASA Astrophysics Data System (ADS)

    Iroumé, Andrés; Mao, Luca; Andreoli, Andrea; Ulloa, Héctor; Ardiles, María Paz

    2015-01-01

    Large wood (LW) mobility was studied over several time periods in channel segments of four low-order mountain streams in southern Chile. All wood pieces found within the bankfull channels, or on the streambanks extending into the channel, with a diameter greater than 10 cm and a length greater than 1 m were measured, and their positions were referenced. Thirty-six percent of the measured wood pieces were tagged to investigate log mobility. All segments were first surveyed in summer and then after consecutive rainy winter periods. Annual LW mobility ranged between 0 and 28%. Eighty-four percent of the moved LW had diameters ≤ 40 cm and 92% had lengths ≤ 7 m. Large wood mobility was higher in periods when the maximum water level (Hmax) exceeded the channel bankfull depth (HBk) than in periods with flows below HBk, but the difference was not statistically significant. Dimensions of moved LW showed no significant differences between periods with flows exceeding and flows below bankfull stage. Statistically significant relationships were found between annual LW mobility (%) and unit stream power (for Hmax) and Hmax/HBk. The mean diameter of transported wood pieces per period was significantly correlated with unit stream power for H15% and H50% (the levels above which the flow remains for 15% and 50% of the time, respectively). These results contribute to an understanding of the complexity of LW mobilization processes in mountain streams and can be used to assess and prevent potential damage caused by LW mobilization during floods.

  7. Testing a stepped care model for binge-eating disorder: a two-step randomized controlled trial.

    PubMed

    Tasca, Giorgio A; Koszycki, Diana; Brugnera, Agostino; Chyurlia, Livia; Hammond, Nicole; Francis, Kylie; Ritchie, Kerri; Ivanova, Iryna; Proulx, Genevieve; Wilson, Brian; Beaulac, Julie; Bissada, Hany; Beasley, Erin; Mcquaid, Nancy; Grenon, Renee; Fortin-Langelier, Benjamin; Compare, Angelo; Balfour, Louise

    2018-05-24

    A stepped care approach involves patients first receiving low-intensity treatment followed by higher-intensity treatment. This two-step randomized controlled trial investigated the efficacy of a sequential stepped care approach for the psychological treatment of binge-eating disorder (BED). In the first step, all participants with BED (n = 135) received unguided self-help (USH) based on a cognitive-behavioral therapy model. In the second step, participants who remained in the trial were randomized either to 16 weeks of group psychodynamic-interpersonal psychotherapy (GPIP) (n = 39) or to a no-treatment control condition (n = 46). Outcomes were assessed for USH in step 1, and then for step 2 up to 6 months post-treatment, using multilevel regression slope discontinuity models. In the first step, USH resulted in large and statistically significant reductions in the frequency of binge eating. Statistically significant moderate to large reductions in eating disorder cognitions were also noted. In the second step, there was no difference in the change in frequency of binge eating between GPIP and the control condition. Compared with controls, GPIP resulted in significant and large improvements in attachment avoidance and interpersonal problems. The findings indicated that the second step of a stepped care approach did not significantly reduce binge-eating symptoms beyond the effects of USH alone. The study provided some evidence that the second step may reduce factors known to maintain binge eating in the long run, such as attachment avoidance and interpersonal problems.

  8. A statistical parts-based appearance model of inter-subject variability.

    PubMed

    Toews, Matthew; Collins, D Louis; Arbel, Tal

    2006-01-01

    In this article, we present a general statistical parts-based model for representing the appearance of an image set, applied to the problem of inter-subject MR brain image matching. In contrast with global image representations such as active appearance models, the parts-based model consists of a collection of localized image parts whose appearance, geometry and occurrence frequency are quantified statistically. The parts-based approach explicitly addresses the case where one-to-one correspondence does not exist between subjects due to anatomical differences, as parts are not expected to occur in all subjects. The model can be learned automatically, discovering structures that appear with statistical regularity in a large set of subject images, and can be robustly fit to new images, all in the presence of significant inter-subject variability. As parts are derived from generic scale-invariant features, the framework can be applied in a wide variety of image contexts, in order to study the commonality of anatomical parts or to group subjects according to the parts they share. Experimentation shows that a parts-based model can be learned from a large set of MR brain images, and used to determine parts that are common within the group of subjects. Preliminary results indicate that the model can be used to automatically identify distinctive features for inter-subject image registration despite large changes in appearance.

  9. Validating the simulation of large-scale parallel applications using statistical characteristics

    DOE PAGES

    Zhang, Deli; Wilke, Jeremiah; Hendry, Gilbert; ...

    2016-03-01

    Simulation is a widely adopted method to analyze and predict the performance of large-scale parallel applications. Validating the hardware model is highly important for complex simulations with a large number of parameters. Common practice involves calculating the percent error between the projected and the real execution time of a benchmark program. However, in a high-dimensional parameter space, this coarse-grained approach often suffers from parameter insensitivity, which may not be known a priori. Moreover, the traditional approach cannot be applied to the validation of software models, such as application skeletons used in online simulations. In this work, we present a methodology and a toolset for validating both hardware and software models by quantitatively comparing fine-grained statistical characteristics obtained from execution traces. Although statistical information has been used in tasks like performance optimization, this is the first attempt to apply it to simulation validation. Lastly, our experimental results show that the proposed evaluation approach offers significant improvement in fidelity when compared to evaluation using total execution time, and the proposed metrics serve as reliable criteria that progress toward automating the simulation tuning process.
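
    One concrete form of comparing fine-grained statistical characteristics is a distributional test on per-event timings from the two traces. A sketch using a two-sample Kolmogorov-Smirnov test on synthetic latency samples; the toolset's actual metrics are not reproduced here.

    ```python
    import numpy as np
    from scipy.stats import ks_2samp

    def compare_traces(real_times, sim_times, alpha=0.05):
        """Compare event-time distributions from a real and a simulated trace.

        A two-sample KS test quantifies distributional agreement beyond a
        single percent-error on total runtime.
        """
        stat, p = ks_2samp(real_times, sim_times)
        return {"ks_stat": stat, "p": p, "consistent": p > alpha}

    rng = np.random.default_rng(5)
    real = rng.lognormal(mean=1.00, sigma=0.3, size=5000)  # e.g., message latencies
    sim = rng.lognormal(mean=1.02, sigma=0.3, size=5000)   # the simulator's version
    print(compare_traces(real, sim))
    ```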

  10. New Probe of Departures from General Relativity Using Minkowski Functionals.

    PubMed

    Fang, Wenjuan; Li, Baojiu; Zhao, Gong-Bo

    2017-05-05

    The morphological properties of the large scale structure of the Universe can be fully described by four Minkowski functionals (MFs), which provide important complementary information to other statistical observables such as the widely used 2-point statistics in configuration and Fourier spaces. In this work, for the first time, we present the differences in the morphology of the large scale structure caused by modifications to general relativity (to address the cosmic acceleration problem), by measuring the MFs from N-body simulations of modified gravity and general relativity. We find strong statistical power when using the MFs to constrain modified theories of gravity: with a galaxy survey that has survey volume ∼0.125(h^{-1}  Gpc)^{3} and galaxy number density ∼1/(h^{-1}  Mpc)^{3}, the two normal-branch Dvali-Gabadadze-Porrati models and the F5 f(R) model that we simulated can be discriminated from the ΛCDM model at a significance level ≳5σ with an individual MF measurement. Therefore, the MF of the large scale structure is potentially a powerful probe of gravity, and its application to real data deserves active exploration.
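
    In two dimensions the analogous functionals of an excursion set are area, boundary length, and Euler characteristic. A pixel-lattice sketch for a 2-D field, offered only as a crude stand-in for the 3-D large-scale-structure estimators actually used: area and boundary length are counted directly, and the Euler characteristic is components minus holes.

    ```python
    import numpy as np
    from scipy.ndimage import label

    def minkowski_2d(field, threshold):
        """Three 2-D Minkowski functionals of the excursion set {field >= threshold}.

        Area and boundary length are counted on the pixel lattice; the Euler
        characteristic uses 4-connectivity for the set and 8-connectivity for
        the background (the standard dual pairing).
        """
        exc = field >= threshold
        area = int(exc.sum())
        # Boundary length: filled/empty neighbor pairs along both axes
        perim = int(np.sum(exc[1:, :] != exc[:-1, :]) + np.sum(exc[:, 1:] != exc[:, :-1]))
        n_comp = label(exc)[1]
        # Holes: background components that do not touch the image border
        bg_lab, n_bg = label(~exc, structure=np.ones((3, 3)))
        border = np.unique(np.concatenate([bg_lab[0], bg_lab[-1], bg_lab[:, 0], bg_lab[:, -1]]))
        n_holes = n_bg - np.count_nonzero(border)
        return area, perim, n_comp - n_holes

    rng = np.random.default_rng(6)
    gauss = rng.standard_normal((256, 256))
    for nu in (-1.0, 0.0, 1.0):   # excursion thresholds in units of sigma
        print(nu, minkowski_2d(gauss, nu))
    ```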

  11. Comparing and combining process-based crop models and statistical models with some implications for climate change

    NASA Astrophysics Data System (ADS)

    Roberts, Michael J.; Braun, Noah O.; Sinclair, Thomas R.; Lobell, David B.; Schlenker, Wolfram

    2017-09-01

    We compare predictions of a simple process-based crop model (Soltani and Sinclair 2012), a simple statistical model (Schlenker and Roberts 2009), and a combination of both models to actual maize yields on a large, representative sample of farmer-managed fields in the Corn Belt region of the United States. After statistical post-model calibration, the process model (Simple Simulation Model, or SSM) predicts actual outcomes slightly better than the statistical model, but the combined model performs significantly better than either model. The SSM, statistical model and combined model all show similar relationships with precipitation, while the SSM better accounts for temporal patterns of precipitation, vapor pressure deficit and solar radiation. The statistical and combined models show a more negative impact associated with extreme heat for which the process model does not account. Due to the extreme heat effect, predicted impacts under uniform climate change scenarios are considerably more severe for the statistical and combined models than for the process-based model.
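
    The combination step can be pictured as a regression of observed yields on the process-model prediction plus the statistical model's extreme-heat covariate. A sketch on invented data; the variable names, thresholds, and coefficients are hypothetical, not taken from the paper.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Hypothetical columns: SSM-predicted yield, extreme-heat exposure, and
    # observed yield; the combined model regresses observations on both,
    # letting the statistical term absorb what the process model misses.
    rng = np.random.default_rng(8)
    n = 500
    ssm_pred = rng.normal(10.0, 1.5, n)        # process-model yield prediction
    edd = rng.gamma(2.0, 10.0, n)              # extreme-heat degree days (toy)
    observed = 0.9 * ssm_pred - 0.02 * edd + rng.normal(0, 0.5, n)

    combined = LinearRegression().fit(np.column_stack([ssm_pred, edd]), observed)
    print("weights (SSM, extreme heat):", combined.coef_, "intercept:", combined.intercept_)
    ```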

  12. Local multiplicity adjustment for the spatial scan statistic using the Gumbel distribution.

    PubMed

    Gangnon, Ronald E

    2012-03-01

    The spatial scan statistic is an important and widely used tool for cluster detection. It is based on the simultaneous evaluation of the statistical significance of the maximum likelihood ratio test statistic over a large collection of potential clusters. In most cluster detection problems, there is variation in the extent of local multiplicity across the study region. For example, using a fixed maximum geographic radius for clusters, urban areas typically have many overlapping potential clusters, whereas rural areas have relatively few. The spatial scan statistic does not account for local multiplicity variation. We describe a previously proposed local multiplicity adjustment based on a nested Bonferroni correction and propose a novel adjustment based on a Gumbel distribution approximation to the distribution of a local scan statistic. We compare the performance of all three statistics in terms of power and a novel unbiased cluster detection criterion. These methods are then applied to the well-known New York leukemia dataset and a Wisconsin breast cancer incidence dataset. © 2011, The International Biometric Society.
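
    The Gumbel adjustment can be sketched as follows: simulate the null distribution of each local scan statistic's maximum, fit a Gumbel, and read the p-value from its tail. All inputs below are synthetic stand-ins for the Monte Carlo replicates.

    ```python
    import numpy as np
    from scipy.stats import gumbel_r

    def gumbel_pvalue(observed_stat, null_maxima):
        """Approximate the p-value of a local scan statistic via a fitted Gumbel.

        `null_maxima` are maxima of the local likelihood-ratio statistic from
        Monte Carlo replicates under the null; extreme-value theory motivates
        the Gumbel form. This mirrors the idea, not the paper's exact estimator.
        """
        loc, scale = gumbel_r.fit(null_maxima)
        return gumbel_r.sf(observed_stat, loc=loc, scale=scale)

    rng = np.random.default_rng(7)
    null_maxima = rng.gumbel(loc=8.0, scale=1.5, size=999)  # stand-in simulations
    print(f"p = {gumbel_pvalue(12.5, null_maxima):.4f}")
    ```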

  13. Local multiplicity adjustment for the spatial scan statistic using the Gumbel distribution

    PubMed Central

    Gangnon, Ronald E.

    2011-01-01

    Summary The spatial scan statistic is an important and widely used tool for cluster detection. It is based on the simultaneous evaluation of the statistical significance of the maximum likelihood ratio test statistic over a large collection of potential clusters. In most cluster detection problems, there is variation in the extent of local multiplicity across the study region. For example, using a fixed maximum geographic radius for clusters, urban areas typically have many overlapping potential clusters, while rural areas have relatively few. The spatial scan statistic does not account for local multiplicity variation. We describe a previously proposed local multiplicity adjustment based on a nested Bonferroni correction and propose a novel adjustment based on a Gumbel distribution approximation to the distribution of a local scan statistic. We compare the performance of all three statistics in terms of power and a novel unbiased cluster detection criterion. These methods are then applied to the well-known New York leukemia dataset and a Wisconsin breast cancer incidence dataset. PMID:21762118

  14. [Studies on localized low-risk prostate cancer: Do we know enough?]

    PubMed

    Weißbach, L; Roloff, C

    2018-06-05

    Treatment of localized low-risk prostate cancer (PCa) is undergoing a paradigm shift: invasive treatments such as surgery and radiation therapy are being replaced by defensive strategies such as active surveillance (AS) and watchful waiting (WW). The aim of this work is to evaluate the significance of current studies regarding defensive strategies (AS and WW). The best-known AS studies are critically evaluated in terms of their input criteria, follow-up criteria, and statistical validity. The difficulties faced by randomized studies in answering the question of the best treatment for low-risk cancer, in two or even more study groups with known low tumor-specific mortality, are clearly shown. Some studies fail because of their objectives; others, like PIVOT, are underpowered. For ProtecT, a renowned randomized controlled trial (RCT), systematic and statistical shortcomings are listed in detail. The time and effort required for RCTs to answer the question of which therapy is best for locally limited low-risk cancer are very large, because the low specific mortality rate requires a large number of participants and a long study duration. In any case, RCTs create hand-picked cohorts for statistical evaluation that have little to do with care in daily clinical practice. The necessary randomization is also undermined by the decision-making of the informed patient. If further studies of low-risk PCa are needed, they will need real-world conditions that an RCT cannot provide. To obtain clinically relevant results, we need to rethink things: when planning a study, biometricians and clinicians must understand that the statistical methods used in RCTs are of limited use, and they must select a method (e.g., propensity scores) appropriate for health care research.

  15. An experiment on the impact of a neonicotinoid pesticide on honeybees: the value of a formal analysis of the data.

    PubMed

    Schick, Robert S; Greenwood, Jeremy J D; Buckland, Stephen T

    2017-01-01

    We assess the analysis of the data resulting from a field experiment conducted by Pilling et al. (PLoS ONE, doi: 10.1371/journal.pone.0077193) on the potential effects of thiamethoxam on honeybees. The experiment had low levels of replication, so Pilling et al. concluded that formal statistical analysis would be misleading. This would be true if such an analysis merely comprised tests of statistical significance and if the investigators concluded that lack of significance meant little or no effect. However, an analysis that includes estimation of the size of any effects, with confidence limits, allows one to reach conclusions that are not misleading and that produce useful insights. For the data of Pilling et al., we use straightforward statistical analysis to show that the confidence limits are generally so wide that any effects of thiamethoxam could have been large without being statistically significant. Instead of formal analysis, Pilling et al. simply inspected the data and concluded that they provided no evidence of detrimental effects, and from this that thiamethoxam poses a "low risk" to bees. Conclusions derived from inspection of the data were not just misleading in this case but are also unacceptable in principle, for if data are inadequate for a formal analysis (or only good enough to provide estimates with wide confidence intervals), then they are bound to be inadequate as a basis for reaching any sound conclusions. Given that the data in this case are largely uninformative with respect to the treatment effect, any conclusions reached from such informal approaches can do little more than reflect the prior beliefs of those involved.

  16. A Novel Genome-Information Content-Based Statistic for Genome-Wide Association Analysis Designed for Next-Generation Sequencing Data

    PubMed Central

    Luo, Li; Zhu, Yun

    2012-01-01

    The genome-wide association studies (GWAS) designed for next-generation sequencing data involve testing the association of genomic variants, including common, low-frequency, and rare variants. The current strategies for association studies are well developed for identifying associations of common variants with common diseases, but may be ill-suited when large amounts of allelic heterogeneity are present in sequence data. Recently, group tests, which analyze collective frequency differences between cases and controls, have shifted the current variant-by-variant analysis paradigm for GWAS of common variants to the collective testing of multiple variants in the association analysis of rare variants. However, group tests ignore differences in genetic effects among SNPs at different genomic locations. As an alternative to group tests, we developed a novel genome-information content-based statistic for testing the association of the entire allele frequency spectrum of genomic variation with disease. To evaluate the performance of the proposed statistic, we use large-scale simulations based on whole genome low coverage pilot data in the 1000 Genomes Project to calculate the type 1 error rates and power of seven alternative statistics: a genome-information content-based statistic, the generalized T², the collapsing method, the combined multivariate and collapsing (CMC) method, the individual χ² test, the weighted-sum statistic, and the variable threshold statistic. Finally, we apply the seven statistics to a published resequencing dataset from the ANGPTL3, ANGPTL4, ANGPTL5, and ANGPTL6 genes in the Dallas Heart Study. We report that the genome-information content-based statistic has significantly improved type 1 error rates and higher power than the other six statistics in both simulated and empirical datasets. PMID:22651812

  17. A novel genome-information content-based statistic for genome-wide association analysis designed for next-generation sequencing data.

    PubMed

    Luo, Li; Zhu, Yun; Xiong, Momiao

    2012-06-01

    The genome-wide association studies (GWAS) designed for next-generation sequencing data involve testing the association of genomic variants, including common, low-frequency, and rare variants. The current strategies for association studies are well developed for identifying associations of common variants with common diseases, but may be ill-suited when large amounts of allelic heterogeneity are present in sequence data. Recently, group tests, which analyze collective frequency differences between cases and controls, have shifted the current variant-by-variant analysis paradigm for GWAS of common variants to the collective testing of multiple variants in the association analysis of rare variants. However, group tests ignore differences in genetic effects among SNPs at different genomic locations. As an alternative to group tests, we developed a novel genome-information content-based statistic for testing the association of the entire allele frequency spectrum of genomic variation with disease. To evaluate the performance of the proposed statistic, we use large-scale simulations based on whole genome low coverage pilot data in the 1000 Genomes Project to calculate the type 1 error rates and power of seven alternative statistics: a genome-information content-based statistic, the generalized T², the collapsing method, the combined multivariate and collapsing (CMC) method, the individual χ² test, the weighted-sum statistic, and the variable threshold statistic. Finally, we apply the seven statistics to a published resequencing dataset from the ANGPTL3, ANGPTL4, ANGPTL5, and ANGPTL6 genes in the Dallas Heart Study. We report that the genome-information content-based statistic has significantly improved type 1 error rates and higher power than the other six statistics in both simulated and empirical datasets.
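
    Of the comparison statistics listed, the collapsing method is the simplest to illustrate. A minimal CAST-style sketch, assuming a matrix of minor-allele counts and a toy case/control split; this shows one of the baseline group tests, not the authors' genome-information content-based statistic.

    ```python
    import numpy as np
    from scipy.stats import fisher_exact

    def collapsing_test(genotypes, is_case, maf_threshold=0.01):
        """Simple collapsing (CAST-style) test for rare-variant association.

        genotypes: (n_subjects, n_variants) minor-allele counts in {0, 1, 2}.
        Each subject is collapsed to 'carries any rare variant or not', then
        carrier counts are compared between cases and controls.
        """
        maf = genotypes.mean(axis=0) / 2
        rare = genotypes[:, maf < maf_threshold]
        carrier = rare.sum(axis=1) > 0
        table = [[np.sum(carrier & is_case), np.sum(~carrier & is_case)],
                 [np.sum(carrier & ~is_case), np.sum(~carrier & ~is_case)]]
        return fisher_exact(table)

    rng = np.random.default_rng(9)
    geno = (rng.random((2000, 50)) < 0.005).astype(int)  # toy rare-variant matrix
    case = np.arange(2000) < 1000                        # toy case/control labels
    odds_ratio, p = collapsing_test(geno, case)
    print(f"OR = {odds_ratio:.2f}, p = {p:.3f}")
    ```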

  18. Environmentally safe areas and routes in the Baltic proper using Eulerian tracers.

    PubMed

    Höglund, A; Meier, H E M

    2012-07-01

    In recent years, the shipping of environmentally hazardous cargo has increased considerably in the Baltic proper. In this study, a large number of hypothetical oil spills with an idealized, passive tracer are simulated. From the tracer distributions, statistical measures are calculated to maximize the quantity of tracer from a spill that stays at sea as long as possible. Increased time may permit action to be taken against the spill before the oil reaches environmentally vulnerable coastal zones. The statistical measures are used to calculate maritime routes that maximize the probability that an oil spill will stay at sea as long as possible. Under these assumptions, ships should follow routes located south of Bornholm instead of the northern routes currently in use. Our results suggest that the location of the optimal maritime routes depends on the season, although interannual variability is too large to identify statistically significant changes. Copyright © 2012. Published by Elsevier Ltd.

  19. Unexpected individual clinical site variation in eradication rates of group a streptococci by penicillin in multisite clinical trials.

    PubMed

    Kaplan, Edward L; Oakes, J Michael; Johnson, Dwight R

    2007-12-01

    Previously, we reported an unexpectedly large percentage of failures by penicillin to eradicate group A streptococci (GAS) from the upper respiratory tract. Because penicillin has been the recommended therapy for the treatment of GAS pharyngitis, our report prompted controversy. Data from clinical trials in which our laboratory has participated demonstrated marked variation in GAS eradication rates among clinical sites. The reasons for such variation have never been adequately examined. We performed statistical analyses of site variation in eradication rates to assess the potential effect on reported reduced penicillin efficacy. Penicillin GAS eradication rates were compared using data from 4 large multisite pharyngitis treatment trials (75 clinical sites; 1158 subjects). Variation in eradication rates among clinical sites was statistically evaluated (χ2 tests and generalized estimating equation (GEE) regression models). There was significant site-to-site variation in GAS eradication rates in each of the trials (range, 17-100%; P < 0.005) as well as between separate trials (mean range, 58-69%; P < 0.033). GEE modeling indicated that GAS eradication rates were significantly higher for clinical sites participating in more than one clinical trial. The statistically significant site-to-site variation in penicillin eradication rates was related to factors (dependencies) at individual sites. Such factors may affect assessment of therapeutic efficacy and indicate a necessity for considering clinical site variation before reporting pooled efficacy data from multiple sites; combined data may result in misleading clinical implications. This is the first report documenting significant variation resulting from individual clinical site-related factors and offers a possible explanation for reduced penicillin eradication.

  20. Daily Spiritual Experiences and Prosocial Behavior

    ERIC Educational Resources Information Center

    Einolf, Christopher J.

    2013-01-01

    This paper examines how the Daily Spiritual Experiences Scale (DSES) relates to a range of prosocial behaviors, using a large, nationally representative U.S. data set. It finds that daily spiritual experiences are a statistically and substantively significant predictor of volunteering, charitable giving, and helping individuals one knows personally.…

  1. A Powerful Approach to Estimating Annotation-Stratified Genetic Covariance via GWAS Summary Statistics.

    PubMed

    Lu, Qiongshi; Li, Boyang; Ou, Derek; Erlendsdottir, Margret; Powles, Ryan L; Jiang, Tony; Hu, Yiming; Chang, David; Jin, Chentian; Dai, Wei; He, Qidu; Liu, Zefeng; Mukherjee, Shubhabrata; Crane, Paul K; Zhao, Hongyu

    2017-12-07

    Despite the success of large-scale genome-wide association studies (GWASs) on complex traits, our understanding of their genetic architecture is far from complete. Jointly modeling multiple traits' genetic profiles has provided insights into the shared genetic basis of many complex traits. However, large-scale inference sets a high bar for both statistical power and biological interpretability. Here we introduce a principled framework to estimate annotation-stratified genetic covariance between traits using GWAS summary statistics. Through theoretical and numerical analyses, we demonstrate that our method provides accurate covariance estimates, thereby enabling researchers to dissect both the shared and distinct genetic architecture across traits to better understand their etiologies. Among 50 complex traits with publicly accessible GWAS summary statistics (N total ≈ 4.5 million), we identified more than 170 pairs with statistically significant genetic covariance. In particular, we found strong genetic covariance between late-onset Alzheimer disease (LOAD) and amyotrophic lateral sclerosis (ALS), two major neurodegenerative diseases, in single-nucleotide polymorphisms (SNPs) with high minor allele frequencies and in SNPs located in the predicted functional genome. Joint analysis of LOAD, ALS, and other traits highlights LOAD's correlation with cognitive traits and hints at an autoimmune component for ALS. Copyright © 2017 American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.

  2. Small studies may overestimate the effect sizes in critical care meta-analyses: a meta-epidemiological study

    PubMed Central

    2013-01-01

    Introduction Small-study effects refer to the fact that trials with limited sample sizes are more likely to report larger beneficial effects than large trials. However, this has never been investigated in critical care medicine. Thus, the present study aimed to examine the presence and extent of small-study effects in critical care medicine. Methods Critical care meta-analyses involving randomized controlled trials and reporting mortality as an outcome measure were considered eligible for the study. Component trials were classified as large (≥100 patients per arm) or small (<100 patients per arm) according to their sample sizes. The ratio of odds ratios (ROR) was calculated for each meta-analysis, and the RORs were then combined using a meta-analytic approach. An ROR < 1 indicated a larger beneficial effect in small trials. Small and large trials were compared on methodological quality, including sequence generation, blinding, allocation concealment, intention-to-treat analysis, and sample size calculation. Results A total of 27 critical care meta-analyses involving 317 trials were included. Of these, five meta-analyses showed statistically significant RORs < 1; the others did not reach statistical significance. Overall, the pooled ROR was 0.60 (95% CI: 0.53 to 0.68); the heterogeneity was moderate, with an I2 of 50.3% (chi-squared = 52.30; P = 0.002). Large trials showed significantly better reporting quality than small trials in terms of sequence generation, allocation concealment, blinding, intention to treat, sample size calculation, and incomplete follow-up data. Conclusions Small trials are more likely to report larger beneficial effects than large trials in critical care medicine, which could be partly explained by the lower methodological quality of small trials. Caution should be exercised in the interpretation of meta-analyses involving small trials. PMID:23302257
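
    As a concrete illustration of the headline quantity, here is a hedged sketch of how an ROR might be computed for one meta-analysis: log odds ratios are pooled by inverse-variance weighting within the small-trial and large-trial strata, and the ratio of the pooled ORs is taken. All trial counts below are invented.

      # Pool log-ORs within each stratum, then ROR = OR_small / OR_large.
      import numpy as np

      def pooled_log_or(trials):
          """trials: list of (events_treat, n_treat, events_ctrl, n_ctrl)."""
          log_ors, weights = [], []
          for a, n1, c, n2 in trials:
              b, d = n1 - a, n2 - c
              log_ors.append(np.log((a * d) / (b * c)))  # assumes no zero cells
              weights.append(1.0 / (1/a + 1/b + 1/c + 1/d))
          log_ors, weights = np.array(log_ors), np.array(weights)
          pooled = np.sum(weights * log_ors) / np.sum(weights)
          return pooled, 1.0 / np.sum(weights)           # pooled log-OR, variance

      small = [(8, 40, 14, 40), (5, 30, 11, 30)]         # <100 patients per arm
      large = [(60, 400, 75, 400), (90, 600, 98, 600)]   # >=100 patients per arm
      ls, vs = pooled_log_or(small)
      ll, vl = pooled_log_or(large)
      se = np.sqrt(vs + vl)
      print(f"ROR = {np.exp(ls - ll):.2f}, 95% CI = "
            f"({np.exp(ls - ll - 1.96 * se):.2f}, {np.exp(ls - ll + 1.96 * se):.2f})")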

  3. The effect of a workplace violence training program for generalist nurses in the acute hospital setting: A quasi-experimental study.

    PubMed

    Lamont, Scott; Brunero, Scott

    2018-05-19

    Workplace violence prevalence has attracted significant attention within the international nursing literature. Little attention to non-mental health settings and a lack of evaluation rigor have been identified within review literature. To examine the effects of a workplace violence training program in relation to risk assessment and management practices, de-escalation skills, breakaway techniques, and confidence levels, within an acute hospital setting. A quasi-experimental study of nurses using pretest-posttest measurements of educational objectives and confidence levels, with two-week follow-up. A 440-bed metropolitan tertiary referral hospital in Sydney, Australia. Nurses working in specialties identified as 'high risk' for violence. A pre-post-test design was used with participants attending a one-day workshop. The workshop evaluation comprised the use of two validated questionnaires: the Continuing Professional Development Reaction questionnaire, and the Confidence in Coping with Patient Aggression Instrument. Descriptive and inferential statistics were calculated. The paired t-test was used to assess the statistical significance of changes in the clinical behaviour intention and confidence scores from pre- to post-intervention. Cohen's d effect sizes were calculated to determine the extent of the significant results. Seventy-eight participants completed both pre- and post-workshop evaluation questionnaires. Statistically significant increases in behaviour intention scores were found in fourteen of the fifteen constructs relating to the three broad workshop objectives, and in confidence ratings, with medium to large effect sizes observed in some constructs. A significant increase in overall confidence in coping with patient aggression was also found post-test, with a large effect size. Positive results were observed from the workplace violence training. Training needs to be complemented by a multi-faceted organisational approach which includes governance, quality and review processes. Copyright © 2018 Elsevier Ltd. All rights reserved.

  4. Primary prevention of cannabis use: a systematic review of randomized controlled trials.

    PubMed

    Norberg, Melissa M; Kezelman, Sarah; Lim-Howe, Nicholas

    2013-01-01

    A systematic review of primary prevention was conducted for cannabis use outcomes in youth and young adults. The aim of the review was to develop a comprehensive understanding of prevention programming by assessing universal, targeted, uni-modal, and multi-modal approaches as well as individual program characteristics. Twenty-eight articles, representing 25 unique studies, identified from eight electronic databases (EMBASE, MEDLINE, CINAHL, ERIC, PsycINFO, DRUG, EBM Reviews, and Project CORK), were eligible for inclusion. Results indicated that primary prevention programs can be effective in reducing cannabis use in youth populations, with statistically significant effect sizes ranging from trivial (0.07) to extremely large (5.26), with the majority of significant effect sizes being trivial to small. Given that the preponderance of significant effect sizes were trivial to small and that percentages of statistically significant and non-statistically significant findings were often equivalent across program type and individual components, the effectiveness of primary prevention for cannabis use should be interpreted with caution. Universal multi-modal programs appeared to outperform other program types (i.e., universal uni-modal, targeted multi-modal, targeted uni-modal). Specifically, universal multi-modal programs that targeted early adolescents (10-13 year olds), utilised non-teacher or multiple facilitators, were short in duration (10 sessions or less), and implemented booster sessions were associated with large median effect sizes. While there were studies in these areas that contradicted these results, the results highlight the importance of assessing the interdependent relationship of program components and program types. Finally, results indicated that the overall quality of included studies was poor, with an average quality rating of 4.64 out of 9. Thus, further quality research and reporting and the development of new innovative programs are required.

  5. Primary Prevention of Cannabis Use: A Systematic Review of Randomized Controlled Trials

    PubMed Central

    Norberg, Melissa M.; Kezelman, Sarah; Lim-Howe, Nicholas

    2013-01-01

    A systematic review of primary prevention was conducted for cannabis use outcomes in youth and young adults. The aim of the review was to develop a comprehensive understanding of prevention programming by assessing universal, targeted, uni-modal, and multi-modal approaches as well as individual program characteristics. Twenty-eight articles, representing 25 unique studies, identified from eight electronic databases (EMBASE, MEDLINE, CINAHL, ERIC, PsycINFO, DRUG, EBM Reviews, and Project CORK), were eligible for inclusion. Results indicated that primary prevention programs can be effective in reducing cannabis use in youth populations, with statistically significant effect sizes ranging from trivial (0.07) to extremely large (5.26), with the majority of significant effect sizes being trivial to small. Given that the preponderance of significant effect sizes were trivial to small and that percentages of statistically significant and non-statistically significant findings were often equivalent across program type and individual components, the effectiveness of primary prevention for cannabis use should be interpreted with caution. Universal multi-modal programs appeared to outperform other program types (i.e., universal uni-modal, targeted multi-modal, targeted uni-modal). Specifically, universal multi-modal programs that targeted early adolescents (10–13 year olds), utilised non-teacher or multiple facilitators, were short in duration (10 sessions or less), and implemented booster sessions were associated with large median effect sizes. While there were studies in these areas that contradicted these results, the results highlight the importance of assessing the interdependent relationship of program components and program types. Finally, results indicated that the overall quality of included studies was poor, with an average quality rating of 4.64 out of 9. Thus, further quality research and reporting and the development of new innovative programs are required. PMID:23326396

  6. Dose fractionation theorem in 3-D reconstruction (tomography)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Glaeser, R.M.

    It is commonly assumed that the large number of projections required for single-axis tomography precludes its application to most beam-labile specimens. However, Hegerl and Hoppe have pointed out that the total dose required to achieve statistical significance for each voxel of a computed 3-D reconstruction is the same as that required to obtain a single 2-D image of that isolated voxel, at the same level of statistical significance. Thus a statistically significant 3-D image can be computed from statistically insignificant projections, as long as the total dosage that is distributed among these projections is high enough that it would have resulted in a statistically significant projection, if applied to only one image. We have tested this critical theorem by simulating the tomographic reconstruction of a realistic 3-D model created from an electron micrograph. The simulations verify the basic conclusions of the theorem under conditions of high absorption, signal-dependent noise, varying specimen contrast and missing angular range. Furthermore, the simulations demonstrate that individual projections in the series of fractionated-dose images can be aligned by cross-correlation because they contain significant information derived from the summation of features from different depths in the structure. This latter information is generally not useful for structural interpretation prior to 3-D reconstruction, owing to the complexity of most specimens investigated by single-axis tomography. These results, in combination with dose estimates for imaging single voxels and measurements of radiation damage in the electron microscope, demonstrate that it is feasible to use single-axis tomography with soft X-ray microscopy of frozen-hydrated specimens.
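
    The theorem also lends itself to a quick numerical check. In the toy simulation below, with invented dose numbers, splitting a fixed total dose over many individually insignificant projections leaves the summed counts, and hence the per-voxel significance, essentially unchanged.

      # Compare one full-dose Poisson measurement of a voxel against the sum
      # of many fractionated-dose measurements carrying the same total dose.
      import numpy as np

      rng = np.random.default_rng(1)
      total_dose = 10_000            # expected counts for one full-dose image
      n_projections = 100
      contrast = 0.02                # fractional signal of the voxel of interest

      single = rng.poisson(total_dose * (1 + contrast))
      per_proj = total_dose / n_projections
      fractions = rng.poisson(per_proj * (1 + contrast), size=n_projections)

      for name, counts in [("single image", single),
                           ("summed projections", fractions.sum())]:
          snr = (counts - total_dose) / np.sqrt(total_dose)  # sigma above background
          print(f"{name}: counts = {counts}, significance ~ {snr:.1f} sigma")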

  7. Statistical physics of community ecology: a cavity solution to MacArthur’s consumer resource model

    NASA Astrophysics Data System (ADS)

    Advani, Madhu; Bunin, Guy; Mehta, Pankaj

    2018-03-01

    A central question in ecology is to understand the ecological processes that shape community structure. Niche-based theories have emphasized the important role played by competition for maintaining species diversity. Many of these insights have been derived using MacArthur's consumer resource model (MCRM) or its generalizations. Most theoretical work on the MCRM has focused on small ecosystems with a few species and resources. However, theoretical insights derived from small ecosystems may not scale up to large ecosystems with many resources and species, because large systems with many interacting components often display new emergent behaviors that cannot be understood or deduced from analyzing smaller systems. To address these shortcomings, we develop a statistical physics inspired cavity method to analyze the MCRM when both the number of species and the number of resources is large. Unlike previous work in this limit, our theory addresses resource dynamics and resource depletion and demonstrates that species generically and consistently perturb their environments and significantly modify available ecological niches. We show how our cavity approach naturally generalizes niche theory to large ecosystems by accounting for the effect of collective phenomena on species invasion and ecological stability. Our theory suggests that such phenomena are a generic feature of large, natural ecosystems and must be taken into account when analyzing and interpreting community structure. It also highlights the important role that statistical-physics inspired approaches can play in furthering our understanding of ecology.
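
    For readers who want to experiment with the model itself, below is a minimal forward-Euler integration of a standard MCRM formulation (logistic resource growth depleted by consumption). All parameter choices are arbitrary illustrations; this is the dynamical system being analyzed, not the cavity calculation described in the abstract.

      # dN_i/dt = N_i (sum_a c_ia R_a - m_i)
      # dR_a/dt = R_a (K_a - R_a) - R_a sum_i N_i c_ia
      import numpy as np

      rng = np.random.default_rng(2)
      S, M = 40, 60                          # species, resources
      c = rng.uniform(0, 1, size=(S, M))     # consumption preferences
      m = c.sum(axis=1) * 0.4                # maintenance costs
      K = rng.uniform(5, 10, size=M)         # resource carrying capacities

      N, R = np.full(S, 0.1), K.copy()
      dt = 2e-4
      for _ in range(150_000):
          dN = N * (c @ R - m)
          dR = R * (K - R) - R * (c.T @ N)
          N = np.maximum(N + dt * dN, 0.0)
          R = np.maximum(R + dt * dR, 1e-10) # floor keeps resources from freezing at zero

      print("surviving species:", int((N > 1e-6).sum()), "of", S)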

  8. Publication bias in situ.

    PubMed

    Phillips, Carl V

    2004-08-05

    Publication bias, as typically defined, refers to the decreased likelihood of studies' results being published when they are near the null, not statistically significant, or otherwise "less interesting." But choices about how to analyze the data and which results to report create a publication bias within the published results, a bias I label "publication bias in situ" (PBIS). PBIS may create much greater bias in the literature than traditionally defined publication bias (the failure to publish any result from a study). The causes of PBIS are well known, consisting of various decisions about reporting that are influenced by the data. But its impact is not generally appreciated, and very little attention is devoted to it. What attention there is consists largely of rules for statistical analysis that are impractical and do not actually reduce the bias in reported estimates. PBIS cannot be reduced by statistical tools because it is not fundamentally a problem of statistics, but rather of non-statistical choices and plain language interpretations. PBIS should be recognized as a phenomenon worthy of study - it is extremely common and probably has a huge impact on results reported in the literature - and there should be greater systematic efforts to identify and reduce it. The paper presents examples, including results of a recent HIV vaccine trial, that show how easily PBIS can have a large impact on reported results, as well as how there can be no simple answer to it. PBIS is a major problem, worthy of substantially more attention than it receives. There are ways to reduce the bias, but they are very seldom employed because they are largely unrecognized.

  9. Asymptotic Linear Spectral Statistics for Spiked Hermitian Random Matrices

    NASA Astrophysics Data System (ADS)

    Passemier, Damien; McKay, Matthew R.; Chen, Yang

    2015-07-01

    Using the Coulomb Fluid method, this paper derives central limit theorems (CLTs) for linear spectral statistics of three "spiked" Hermitian random matrix ensembles. These include Johnstone's spiked model (i.e., central Wishart with spiked correlation), non-central Wishart with rank-one non-centrality, and a related class of non-central matrices. For a generic linear statistic, we derive simple and explicit CLT expressions as the matrix dimensions grow large. For all three ensembles under consideration, we find that the primary effect of the spike is to introduce a correction term to the asymptotic mean of the linear spectral statistic, which we characterize with simple formulas. The utility of our proposed framework is demonstrated through application to three different linear statistics problems: the classical likelihood ratio test for a population covariance, the capacity analysis of multi-antenna wireless communication systems with a line-of-sight transmission path, and a classical multiple sample significance testing problem.

  10. Social support and nocturnal blood pressure dipping: a systematic review.

    PubMed

    Fortmann, Addie L; Gallo, Linda C

    2013-03-01

    Attenuated nocturnal blood pressure (BP) dipping is a better predictor of cardiovascular disease (CVD) morbidity and mortality than resting BP measurements. Studies have reported associations between social support, variously defined, and BP dipping. A systematic review of the literature was conducted to investigate associations of functional and structural social support with nocturnal BP dipping assessed over a minimum of 24 hours. A total of 297 articles were identified. Of these, 11 met criteria for inclusion; all studies were cross-sectional in design and included adult participants only (mean ages = 19 to 72 years). Evidence was most consistent for an association between functional support and BP dipping, such that 5 of 7 studies reported statistically (or marginally) significant positive associations with BP dipping. Statistically significant functional support-BP dipping associations were moderate (standardized effect size (d) = 0.41) to large (d = 2.01) in magnitude. Studies examining structural support were fewer and relatively less consistent; however, preliminary evidence was observed for associations of marital status and social contact frequency with BP dipping. Statistically significant structural support findings were medium (d = 0.53) to large (d = 1.13) in magnitude. Overall, findings suggest a link between higher levels of functional support and greater nocturnal BP dipping; preliminary evidence was also observed for the protective effects of marriage and social contact frequency. Nonetheless, the relatively small number of studies conducted to date and the heterogeneity of findings across meaningful subgroups suggest that additional research is needed to substantiate these conclusions.

  11. Effects of vitamin D supplementation in pregnancy.

    PubMed

    Marya, R K; Rathee, S; Lata, V; Mudgil, S

    1981-01-01

    Serum calcium, inorganic phosphate and heat-labile alkaline phosphatase (HLAP) have been estimated in maternal and cord sera of 120 pregnant women at labour. 75 women who did not take any vitamin D supplements during pregnancy showed statistically significant hypocalcaemia, hypophosphataemia and elevation of HLAP. Hypocalcaemia and hypophosphataemia were present in cord blood, too. 25 women who had received 1,200 U vitamin D/day throughout the 3rd trimester, showed significantly lower HLAP levels and increased fetal birth weight but there was no other improvement in maternal or cord blood chemistry. Administration of vitamin D in two large doses of 600,000 U each in the 7th and 8th months of pregnancy in 20 women proved more efficacious. Statistically significant improvement was observed in all the three biochemical parameters in maternal as well as cord sera. Fetal birth weight was also significantly greater with this mode of therapy.

  12. Large-Angle Anomalies in the CMB

    DOE PAGES

    Copi, Craig J.; Huterer, Dragan; Schwarz, Dominik J.; ...

    2010-01-01

    We review the recently found large-scale anomalies in the maps of temperature anisotropies in the cosmic microwave background. These include alignments of the largest modes of CMB anisotropy with each other and with the geometry and direction of motion of the solar system, and the unusually low power at these largest scales. We discuss these findings in relation to expectations from standard inflationary cosmology, their statistical significance, the tools to study them, and the various attempts to explain them.

  13. Correspondence between large-scale ictal and interictal epileptic networks revealed by single photon emission computed tomography (SPECT) and electroencephalography (EEG)-functional magnetic resonance imaging (fMRI).

    PubMed

    Tousseyn, Simon; Dupont, Patrick; Goffin, Karolien; Sunaert, Stefan; Van Paesschen, Wim

    2015-03-01

    Epilepsy is increasingly recognized as a network disorder, but the spatial relationship between ictal and interictal networks is still largely unexplored. In this work, we compared hemodynamic changes related to seizures and interictal spikes on a whole brain scale. Twenty-eight patients with refractory focal epilepsy (14 temporal and 14 extratemporal lobe) underwent both subtraction ictal single photon emission computed tomography (SPECT) coregistered to magnetic resonance imaging (MRI) (SISCOM) and spike-related electroencephalography-functional MRI (EEG-fMRI). SISCOM visualized relative perfusion changes during seizures, whereas EEG-fMRI mapped blood oxygen level-dependent (BOLD) changes related to spikes. Similarity between statistical maps of both modalities was analyzed per patient using the following two measures: (1) correlation between unthresholded statistical maps (Pearson's correlation coefficient) and (2) overlap between thresholded images (Dice coefficient). Overlap was evaluated at a regional level, for hyperperfusions and activations and for hypoperfusions and deactivations separately, using different thresholds. Nonparametric permutation tests were applied to assess statistical significance (p ≤ 0.05). We found significant and positive correlations between hemodynamic changes related to seizures and spikes in 27 (96%) of 28 cases (median correlation coefficient 0.29 [range -0.12 to 0.62]). In 20 (71%) of 28 cases, spatial overlap between hyperperfusion on SISCOM and activation on EEG-fMRI was significantly larger than expected by chance. Congruent changes were not restricted to the territory of the presumed epileptogenic zone, but could be seen at distant sites (e.g., cerebellum and basal ganglia). Overlap between ictal hypoperfusion and interictal deactivation was statistically significant in 22 (79%) of 28 patients. Despite the high rate of congruence, discrepancies were observed for both modalities. We conclude that hemodynamic changes related to seizures and spikes varied spatially with the same sign and within a common network. Overlap was present in regions nearby and distant from discharge origin. Wiley Periodicals, Inc. © 2015 International League Against Epilepsy.
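
    Both per-patient similarity measures are straightforward to reproduce. A minimal sketch, assuming the SISCOM and EEG-fMRI results are co-registered 3-D statistic maps of equal shape (random arrays stand in for real maps):

      import numpy as np

      def pearson_similarity(map_a, map_b):
          """Correlation between unthresholded statistical maps."""
          return np.corrcoef(map_a.ravel(), map_b.ravel())[0, 1]

      def dice_overlap(map_a, map_b, thr_a, thr_b):
          """Dice coefficient between thresholded (binarized) maps."""
          a, b = map_a > thr_a, map_b > thr_b
          denom = a.sum() + b.sum()
          return 2.0 * np.logical_and(a, b).sum() / denom if denom else 0.0

      rng = np.random.default_rng(3)
      siscom = rng.normal(size=(64, 64, 40))
      eeg_fmri = 0.3 * siscom + rng.normal(size=(64, 64, 40))  # partly shared signal
      print("Pearson r:", round(pearson_similarity(siscom, eeg_fmri), 3))
      print("Dice:", round(dice_overlap(siscom, eeg_fmri, 1.5, 1.5), 3))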

  14. Measuring post-secondary stem majors' engagement in sustainability: The creation, assessment, and validation of an instrument for sustainability curricula evaluation

    NASA Astrophysics Data System (ADS)

    Little, David L., II

    Ongoing changes in values, pedagogy, and curriculum concerning sustainability education necessitate that strong curricular elements are identified in sustainability education. However, quantitative research in sustainability education is largely undeveloped or relies on outdated instruments. In part, this is because no widespread quantitative instrument for measuring related educational outcomes has been developed for the field, though its development is pivotal for future efforts in sustainability education related to STEM majors. This research study details the creation, evaluation, and validation of an instrument -- the STEM Sustainability Engagement Instrument (STEMSEI) -- designed to measure sustainability engagement in post-secondary STEM majors. The study was conducted in three phases, using qualitative methods in phase 1, a concurrent mixed methods design in phase 2, and a sequential mixed methods design in phase 3. The STEMSEI successfully detected statistically significant differences in the sample (n = 1017) that prior research in environmental education predicted. The STEMSEI also revealed statistically significant differences between STEM majors' sustainability engagement, with a large effect size (.203 ≤ η2 ≤ .211). As hypothesized, statistically significant differences were found on the environmental scales across gender and present religion. With respect to gender, self-perceived measures of emotional engagement with environmental sustainability were higher for females, while males had higher measures of cognitive engagement with respect to knowing information related to environmental sustainability. With respect to present religion, self-perceived measures of general engagement and emotional engagement in environmental sustainability were higher for non-Christians as compared to Christians. On the economic scales, statistically significant differences were found across gender. Specifically, measures of males' self-perceived cognitive engagement in knowing information related to economic sustainability were greater than those of females. Future research should establish the generalizability of these results and further test the validity of the STEMSEI.

  15. Statistical learning and selective inference.

    PubMed

    Taylor, Jonathan; Tibshirani, Robert J

    2015-06-23

    We describe the problem of "selective inference." This addresses the following challenge: Having mined a set of data to find potential associations, how do we properly assess the strength of these associations? The fact that we have "cherry-picked"--searched for the strongest associations--means that we must set a higher bar for declaring significant the associations that we see. This challenge becomes more important in the era of big data and complex statistical modeling. The cherry tree (dataset) can be very large and the tools for cherry picking (statistical learning methods) are now very sophisticated. We describe some recent new developments in selective inference and illustrate their use in forward stepwise regression, the lasso, and principal components analysis.

  16. PLASMA PHYSICS AND STATISTICAL MECHANICS IN BRUSSELS, BELGIUM,

    DTIC Science & Technology

    significant research in the theory and experiment of the Tonks-Dattner resonances in a cylindrical plasma column. The second visit was to Professors I. Prigogine and R. Balescu, of the Faculte des Sciences, Universite Libre de Bruxelles, who together direct a large group of scientists working on all

  17. Climate change impacts on extreme temperature mortality in select metropolitan areas of the United States

    EPA Science Inventory

    Projected mortality from climate change-driven impacts on extremely hot and cold days increases significantly over the 21st century in a large group of United States Metropolitan Statistical Areas. Increases in projected mortality from more hot days are greater than decreases in ...

  18. Dice and DNA

    ERIC Educational Resources Information Center

    Wernersson, Rasmus

    2007-01-01

    An important part of teaching students how to use the BLAST tool for searching large sequence databases, is to train the students to think critically about the quality of the sequence hits found--both in terms of the statistical significance and how informative the individual hits are. This paper describes how generating truly random sequences by…

  19. Robustness of the Sequential Lineup Advantage

    ERIC Educational Resources Information Center

    Gronlund, Scott D.; Carlson, Curt A.; Dailey, Sarah B.; Goodsell, Charles A.

    2009-01-01

    A growing movement in the United States and around the world involves promoting the advantages of conducting an eyewitness lineup in a sequential manner. We conducted a large study (N = 2,529) that included 24 comparisons of sequential versus simultaneous lineups. A liberal statistical criterion revealed only 2 significant sequential lineup…

  20. Mental Representation of Circuit Diagrams: Individual Differences in Procedural Knowledge.

    DTIC Science & Technology

    1983-12-01

    operation. One may know, for example, that a transformer serves to change the voltage of an AC supply, that a particular combination of transistors acts as a...and error measures with respect to overall performance. Even if a large sample could provide statistically significant differences between skill

  1. 4P: fast computing of population genetics statistics from large DNA polymorphism panels

    PubMed Central

    Benazzo, Andrea; Panziera, Alex; Bertorelle, Giorgio

    2015-01-01

    Massive DNA sequencing has significantly increased the amount of data available for population genetics and molecular ecology studies. However, the parallel computation of simple statistics within and between populations from large panels of polymorphic sites is not yet available, making the exploratory analyses of a set or subset of data a very laborious task. Here, we present 4P (parallel processing of polymorphism panels), a stand-alone software program for the rapid computation of genetic variation statistics (including the joint frequency spectrum) from millions of DNA variants in multiple individuals and multiple populations. It handles a standard input file format commonly used to store DNA variation from empirical or simulation experiments. The computational performance of 4P was evaluated using large SNP (single nucleotide polymorphism) datasets from human genomes or obtained by simulations. 4P was faster or much faster than other comparable programs, and the impact of parallel computing using multicore computers or servers was evident. 4P is a useful tool for biologists who need a simple and rapid computer program to run exploratory population genetics analyses in large panels of genomic data. It is also particularly suitable to analyze multiple data sets produced in simulation studies. Unix, Windows, and MacOs versions are provided, as well as the source code for easier pipeline implementations. PMID:25628874
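
    As an example of the kind of exploratory statistic such a tool parallelizes, this sketch computes per-site allele counts and a joint site frequency spectrum from two genotype matrices coded as copies of the derived allele (0/1/2). It illustrates the computation only and implies nothing about 4P's actual interface.

      import numpy as np

      def joint_sfs(geno_pop1, geno_pop2):
          """Entry [i, j] counts sites with derived-allele count i in
          population 1 and count j in population 2."""
          c1 = geno_pop1.sum(axis=0)                 # derived alleles per site
          c2 = geno_pop2.sum(axis=0)
          sfs = np.zeros((2 * geno_pop1.shape[0] + 1,
                          2 * geno_pop2.shape[0] + 1), dtype=int)
          np.add.at(sfs, (c1, c2), 1)
          return sfs

      rng = np.random.default_rng(4)
      pop1 = rng.binomial(2, 0.05, size=(20, 10_000))  # 20 diploids, 10k sites
      pop2 = rng.binomial(2, 0.08, size=(25, 10_000))
      sfs = joint_sfs(pop1, pop2)
      print("sites fixed ancestral in both populations:", sfs[0, 0])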

  2. Identification of Intensity Ratio Break Points from Photon Arrival Trajectories in Ratiometric Single Molecule Spectroscopy

    PubMed Central

    Bingemann, Dieter; Allen, Rachel M.

    2012-01-01

    We describe a model-free statistical method to analyze dual-channel photon arrival trajectories from single molecule spectroscopy and identify break points in the intensity ratio. Photons are binned with a short bin size to calculate the logarithm of the intensity ratio for each bin. Stochastic photon counting noise leads to a near-normal distribution of this logarithm, and the standard Student t-test is used to find statistically significant changes in this quantity. In stochastic simulations we determine the significance threshold for the t-test's p-value at a given level of confidence. We test the method's sensitivity and accuracy, indicating that the analysis reliably locates break points with significant changes in the intensity ratio with little or no error in realistic trajectories with large numbers of small change points, while still identifying a large fraction of the frequent break points with small intensity changes. Based on these results we present an approach to estimate confidence intervals for the identified break point locations and recommend a bin size to choose for the analysis. The method proves powerful and reliable in the analysis of simulated and actual data of single molecule reorientation in a glassy matrix. PMID:22837704
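
    A minimal sketch of the described procedure, with simulated photon streams and an invented significance threshold (a real analysis would calibrate the threshold by the stochastic simulations the abstract mentions):

      # Bin two channels, take the log intensity ratio per bin, and scan
      # candidate break points with a two-sample t-test.
      import numpy as np
      from scipy.stats import ttest_ind

      def find_break_point(ratio_log, p_threshold=1e-4):
          best_idx, best_p = None, 1.0
          for k in range(5, len(ratio_log) - 5):     # keep a few bins per side
              _, p = ttest_ind(ratio_log[:k], ratio_log[k:], equal_var=False)
              if p < best_p:
                  best_idx, best_p = k, p
          return (best_idx, best_p) if best_p < p_threshold else None

      rng = np.random.default_rng(5)
      # Two channels whose ratio changes halfway through 200 bins:
      ch1 = np.concatenate([rng.poisson(50, 100), rng.poisson(80, 100)])
      ch2 = rng.poisson(50, 200)
      ratio_log = np.log((ch1 + 0.5) / (ch2 + 0.5))  # +0.5 avoids log of zero
      print(find_break_point(ratio_log))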

  3. Graphical tests for Hardy-Weinberg equilibrium based on the ternary plot.

    PubMed

    Graffelman, Jan; Camarena, Jair Morales

    2008-01-01

    We design a graphical test for Hardy-Weinberg equilibrium. This can circumvent the calculation of p values, and the statistical (non)significance of a large number of bi-allelic markers can be inferred from their position in a graph. By rewriting expressions for the χ2 statistic (with and without continuity correction) in terms of the heterozygote frequency, an acceptance region for Hardy-Weinberg equilibrium is obtained that can be depicted in a ternary plot. We obtain equations for curves in the ternary plot that separate markers that are out of Hardy-Weinberg equilibrium from those that are in equilibrium. The curves depend on the chosen significance level, the sample size and on a continuity correction parameter. Some examples of graphical tests using a set of 106 SNPs on the long arm of human chromosome 22 are described. Significant markers and poor markers with a lot of missing values are easily identified in the proposed plots. R software for making the diagrams is provided. The proposed graphs can be used as control charts for spotting problematic markers in large scale genotyping studies, and constitute an excellent tool for the graphical exploration of bi-allelic marker data. (c) 2007 S. Karger AG, Basel.
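
    The underlying test is compact in code. A minimal sketch of the χ2 goodness-of-fit test for Hardy-Weinberg equilibrium, with the continuity correction optional as in the paper (the genotype counts are illustrative):

      from scipy.stats import chi2

      def hwe_chi2_test(n_aa, n_ab, n_bb, continuity=True):
          """Chi-square test for HWE from the three genotype counts
          of a bi-allelic marker."""
          n = n_aa + n_ab + n_bb
          p = (2 * n_aa + n_ab) / (2 * n)            # allele A frequency
          expected = [n * p * p, 2 * n * p * (1 - p), n * (1 - p) ** 2]
          cc = 0.5 if continuity else 0.0
          stat = sum((abs(obs - exp) - cc) ** 2 / exp
                     for obs, exp in zip([n_aa, n_ab, n_bb], expected))
          return stat, chi2.sf(stat, df=1)

      print(hwe_chi2_test(298, 489, 213))            # (statistic, p-value)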

  4. Crash risk factors for interstate large trucks in North Carolina.

    PubMed

    Teoh, Eric R; Carter, Daniel L; Smith, Sarah; McCartt, Anne T

    2017-09-01

    Provide an updated examination of risk factors for large truck involvements in crashes resulting in injury or death. A matched case-control study was conducted in North Carolina of large trucks operated by interstate carriers. Cases were defined as trucks involved in crashes resulting in fatal or non-fatal injury, and one control truck was matched on the basis of location, weekday, time of day, and truck type. The matched-pair odds ratio provided an estimate of the effect of various driver, vehicle, or carrier factors. Out-of-service (OOS) brake violations tripled the risk of crashing; any OOS vehicle defect increased crash risk by 362%. Higher historical crash rates (fatal, injury, or all crashes) of the carrier were associated with increased risk of crashing. Operating on a short-haul exemption increased crash risk by 383%. Antilock braking systems reduced crash risk by 65%. All of these results were statistically significant at the 95% confidence level. Other safety technologies also showed estimated benefits, although not statistically significant. With the exception of the finding that short-haul exemption is associated with increased crash risk, results largely bolster what is currently known about large truck crash risk and reinforce current enforcement practices. Results also suggest vehicle safety technologies can be important in lowering crash risk. This means that as safety technology continues to penetrate the fleet, whether from voluntary usage or government mandates, reductions in large truck crashes may be achieved. Practical application: Results imply that increased enforcement and use of crash avoidance technologies can improve the large truck crash problem. Copyright © 2017 National Safety Council and Elsevier Ltd. All rights reserved.

  5. Cosmic shear measurements with Dark Energy Survey Science Verification data

    DOE PAGES

    Becker, M. R.

    2016-07-06

    Here, we present measurements of weak gravitational lensing cosmic shear two-point statistics using Dark Energy Survey Science Verification data. We demonstrate that our results are robust to the choice of shear measurement pipeline, either ngmix or im3shape, and robust to the choice of two-point statistic, including both real and Fourier-space statistics. Our results pass a suite of null tests including tests for B-mode contamination and direct tests for any dependence of the two-point functions on a set of 16 observing conditions and galaxy properties, such as seeing, airmass, galaxy color, galaxy magnitude, etc. We use a large suite of simulations to compute the covariance matrix of the cosmic shear measurements and assign statistical significance to our null tests. We find that our covariance matrix is consistent with the halo model prediction, indicating that it has the appropriate level of halo sample variance. We also compare the same jackknife procedure applied to the data and the simulations in order to search for additional sources of noise not captured by the simulations. We find no statistically significant extra sources of noise in the data. The overall detection significance with tomography for our highest source density catalog is 9.7σ. Cosmological constraints from the measurements in this work are presented in a companion paper.

  6. Acceleration techniques for dependability simulation. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Barnette, James David

    1995-01-01

    As computer systems increase in complexity, the need to project system performance from the earliest design and development stages increases. We have to employ simulation for detailed dependability studies of large systems. However, as the complexity of the simulation model increases, the time required to obtain statistically significant results also increases. This paper discusses an approach that is application independent and can be readily applied to any process-based simulation model. Topics include background on classical discrete event simulation and techniques for random variate generation and statistics gathering to support simulation.

  7. Novel Kalman Filter Algorithm for Statistical Monitoring of Extensive Landscapes with Synoptic Sensor Data

    PubMed Central

    Czaplewski, Raymond L.

    2015-01-01

    Wall-to-wall remotely sensed data are increasingly available to monitor landscape dynamics over large geographic areas. However, statistical monitoring programs that use post-stratification cannot fully utilize those sensor data. The Kalman filter (KF) is an alternative statistical estimator. I develop a new KF algorithm that is numerically robust with large numbers of study variables and auxiliary sensor variables. A National Forest Inventory (NFI) illustrates application within an official statistics program. Practical recommendations regarding remote sensing and statistical issues are offered. This algorithm has the potential to increase the value of synoptic sensor data for statistical monitoring of large geographic areas. PMID:26393588
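
    The KF machinery itself is compact. Below is a hedged scalar sketch of one predict/update cycle for a random-walk state observed by a noisy sensor each year; it illustrates the general filter, not the numerically robust multi-variable algorithm the abstract develops, and all values are invented.

      import numpy as np

      def kalman_step(x, P, z, q=0.5, r=4.0):
          """x, P: prior state estimate and variance; z: sensor observation;
          q, r: process and measurement noise variances."""
          x_pred, P_pred = x, P + q                  # predict (random walk)
          K = P_pred / (P_pred + r)                  # Kalman gain
          return x_pred + K * (z - x_pred), (1 - K) * P_pred

      rng = np.random.default_rng(6)
      truth, x, P = 100.0, 90.0, 25.0                # e.g. a forest-cover index
      for year in range(10):
          truth += rng.normal(0, 0.7)                # landscape slowly changes
          z = truth + rng.normal(0, 2.0)             # synoptic sensor reading
          x, P = kalman_step(x, P, z)
          print(f"year {year}: estimate {x:6.2f} +/- {np.sqrt(P):.2f}")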

  8. Association between large strongyle genera in larval cultures--using rare-event poisson regression.

    PubMed

    Cao, X; Vidyashankar, A N; Nielsen, M K

    2013-09-01

    Decades of intensive anthelmintic treatment have caused equine large strongyles to become quite rare, while the cyathostomins have developed resistance to several drug classes. The larval culture has been associated with low to moderate negative predictive values for detecting Strongylus vulgaris infection. It is unknown whether detection of other large strongyle species can be statistically associated with presence of S. vulgaris. This remains a statistical challenge because of the rare occurrence of large strongyle species. This study used a modified Poisson regression to analyse a dataset for associations between S. vulgaris infection and simultaneous occurrence of Strongylus edentatus and Triodontophorus spp. In 663 horses on 42 Danish farms, the individual prevalences of S. vulgaris, S. edentatus and Triodontophorus spp. were 12%, 3% and 12%, respectively. Both S. edentatus and Triodontophorus spp. were significantly associated with S. vulgaris infection, with relative risks above 1. Further, S. edentatus was associated with use of selective therapy on the farms, as well as negatively associated with anthelmintic treatment carried out within 6 months prior to the study. The findings illustrate that occurrence of S. vulgaris in larval cultures can be interpreted as indicative of other large strongyles being likely to be present.
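
    A "modified Poisson" regression of this kind -- a Poisson GLM with robust sandwich standard errors, used to estimate relative risks for a rare binary outcome -- can be sketched with simulated data; the prevalences and the true relative risk below are illustrative only.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(7)
      n = 663
      s_edentatus = rng.binomial(1, 0.03, n)             # covariate of interest
      risk = 0.10 * np.where(s_edentatus == 1, 3.0, 1.0) # true relative risk = 3
      s_vulgaris = rng.binomial(1, np.clip(risk, 0, 1))  # rare binary outcome

      X = sm.add_constant(s_edentatus.astype(float))
      fit = sm.GLM(s_vulgaris, X, family=sm.families.Poisson()).fit(cov_type="HC0")
      print("estimated relative risk:", float(np.exp(fit.params[1])))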

  9. Significant Association of Urinary Toxic Metals and Autism-Related Symptoms—A Nonlinear Statistical Analysis with Cross Validation

    PubMed Central

    Adams, James; Kruger, Uwe; Geis, Elizabeth; Gehn, Eva; Fimbres, Valeria; Pollard, Elena; Mitchell, Jessica; Ingram, Julie; Hellmers, Robert; Quig, David; Hahn, Juergen

    2017-01-01

    Introduction A number of previous studies examined a possible association of toxic metals and autism, and over half of those studies suggest that toxic metal levels are different in individuals with Autism Spectrum Disorders (ASD). Additionally, several studies found that those levels correlate with the severity of ASD. Methods In order to further investigate these points, this paper performs the most detailed statistical analysis to date of a data set in this field. First morning urine samples were collected from 67 children and adults with ASD and 50 neurotypical controls of similar age and gender. The samples were analyzed to determine the levels of 10 urinary toxic metals (UTM). Autism-related symptoms were assessed with eleven behavioral measures. Statistical analysis was used to distinguish participants on the ASD spectrum and neurotypical participants based upon the UTM data alone. The analysis also included examining the association of autism severity with toxic metal excretion data using linear and nonlinear analysis. “Leave-one-out” cross-validation was used to ensure statistical independence of results. Results and Discussion Average excretion levels of several toxic metals (lead, tin, thallium, antimony) were significantly higher in the ASD group. However, ASD classification using univariate statistics proved difficult due to large variability, but nonlinear multivariate statistical analysis significantly improved ASD classification with Type I/II errors of 15% and 18%, respectively. These results clearly indicate that the urinary toxic metal excretion profiles of participants in the ASD group were significantly different from those of the neurotypical participants. Similarly, nonlinear methods determined a significantly stronger association between the behavioral measures and toxic metal excretion. The association was strongest for the Aberrant Behavior Checklist (including subscales on Irritability, Stereotypy, Hyperactivity, and Inappropriate Speech), but significant associations were found for UTM with all eleven autism-related assessments, with cross-validation R2 values ranging from 0.12 to 0.48. PMID:28068407

  10. Power, effects, confidence, and significance: an investigation of statistical practices in nursing research.

    PubMed

    Gaskin, Cadeyrn J; Happell, Brenda

    2014-05-01

    To (a) assess the statistical power of nursing research to detect small, medium, and large effect sizes; (b) estimate the experiment-wise Type I error rate in these studies; and (c) assess the extent to which (i) a priori power analyses, (ii) effect sizes (and interpretations thereof), and (iii) confidence intervals were reported. Statistical review. Papers published in the 2011 volumes of the 10 highest ranked nursing journals, based on their 5-year impact factors. Papers were assessed for statistical power, control of experiment-wise Type I error, reporting of a priori power analyses, reporting and interpretation of effect sizes, and reporting of confidence intervals. The analyses were based on 333 papers, from which 10,337 inferential statistics were identified. The median power to detect small, medium, and large effect sizes was .40 (interquartile range [IQR]=.24-.71), .98 (IQR=.85-1.00), and 1.00 (IQR=1.00-1.00), respectively. The median experiment-wise Type I error rate was .54 (IQR=.26-.80). A priori power analyses were reported in 28% of papers. Effect sizes were routinely reported for Spearman's rank correlations (100% of papers in which this test was used), Poisson regressions (100%), odds ratios (100%), Kendall's tau correlations (100%), Pearson's correlations (99%), logistic regressions (98%), structural equation modelling/confirmatory factor analyses/path analyses (97%), and linear regressions (83%), but were reported less often for two-proportion z tests (50%), analyses of variance/analyses of covariance/multivariate analyses of variance (18%), t tests (8%), Wilcoxon's tests (8%), Chi-squared tests (8%), and Fisher's exact tests (7%), and not reported for sign tests, Friedman's tests, McNemar's tests, multi-level models, and Kruskal-Wallis tests. Effect sizes were infrequently interpreted. Confidence intervals were reported in 28% of papers. The use, reporting, and interpretation of inferential statistics in nursing research need substantial improvement. Most importantly, researchers should abandon the misleading practice of interpreting the results from inferential tests based solely on whether they are statistically significant (or not) and, instead, focus on reporting and interpreting effect sizes, confidence intervals, and significance levels. Nursing researchers also need to conduct and report a priori power analyses, and to address the issue of Type I experiment-wise error inflation in their studies. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
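
    Power calculations of the kind audited here are routine to reproduce. A brief sketch using Cohen's conventional small/medium/large effect sizes for a two-sided independent-samples t-test at alpha = .05 (the group size of 64 is an arbitrary example):

      from statsmodels.stats.power import TTestIndPower

      analysis = TTestIndPower()
      for label, d in [("small", 0.2), ("medium", 0.5), ("large", 0.8)]:
          power = analysis.solve_power(effect_size=d, nobs1=64, alpha=0.05)
          n_needed = analysis.solve_power(effect_size=d, power=0.80, alpha=0.05)
          print(f"{label}: power at n = 64/group is {power:.2f}; "
                f"n/group for 80% power is {n_needed:.0f}")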

  11. New U.S. Geological Survey Method for the Assessment of Reserve Growth

    USGS Publications Warehouse

    Klett, Timothy R.; Attanasi, E.D.; Charpentier, Ronald R.; Cook, Troy A.; Freeman, P.A.; Gautier, Donald L.; Le, Phuong A.; Ryder, Robert T.; Schenk, Christopher J.; Tennyson, Marilyn E.; Verma, Mahendra K.

    2011-01-01

    Reserve growth is defined as the estimated increases in quantities of crude oil, natural gas, and natural gas liquids that have the potential to be added to remaining reserves in discovered accumulations through extension, revision, improved recovery efficiency, and additions of new pools or reservoirs. A new U.S. Geological Survey method was developed to assess the reserve-growth potential of technically recoverable crude oil and natural gas to be added to reserves under proven technology currently in practice within the trend or play, or which reasonably can be extrapolated from geologically similar trends or plays. This method currently is in use to assess potential additions to reserves in discovered fields of the United States. The new approach involves (1) individual analysis of selected large accumulations that contribute most to reserve growth, and (2) conventional statistical modeling of reserve growth in remaining accumulations. This report will focus on the individual accumulation analysis. In the past, the U.S. Geological Survey estimated reserve growth by statistical methods using historical recoverable-quantity data. Those statistical methods were based on growth rates averaged by the number of years since accumulation discovery. Accumulations in mature petroleum provinces with volumetrically significant reserve growth, however, bias statistical models of the data; therefore, accumulations with significant reserve growth are best analyzed separately from those with less significant reserve growth. Large (greater than 500 million barrels) and older (with respect to year of discovery) oil accumulations increase in size at greater rates late in their development history in contrast to more recently discovered accumulations that achieve most growth early in their development history. Such differences greatly affect the statistical methods commonly used to forecast reserve growth. The individual accumulation-analysis method involves estimating the in-place petroleum quantity and its uncertainty, as well as the estimated (forecasted) recoverability and its respective uncertainty. These variables are assigned probabilistic distributions and are combined statistically to provide probabilistic estimates of ultimate recoverable quantities. Cumulative production and remaining reserves are then subtracted from the estimated ultimate recoverable quantities to provide potential reserve growth. In practice, results of the two methods are aggregated to various scales, the highest of which includes an entire country or the world total. The aggregated results are reported along with the statistically appropriate uncertainties.
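
    The probabilistic combination step can be illustrated with a small Monte Carlo sketch: in-place volume and recovery factor are drawn from assumed distributions, multiplied into ultimate recoverable quantities, and cumulative production plus remaining reserves are subtracted. Every number below is a made-up placeholder, not a USGS input.

      import numpy as np

      rng = np.random.default_rng(8)
      n = 100_000
      in_place = rng.lognormal(mean=np.log(800), sigma=0.3, size=n)  # MMBO in place
      recovery = rng.triangular(0.25, 0.35, 0.50, size=n)            # recovery factor
      ultimate = in_place * recovery                                 # ultimate recoverable
      cum_production, reserves = 180.0, 60.0                         # MMBO already booked
      growth = ultimate - cum_production - reserves                  # potential reserve growth
      print("P5/P50/P95 reserve growth (MMBO):",
            np.percentile(growth, [5, 50, 95]).round(0))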

  12. Wind and wave extremes over the world oceans from very large ensembles

    NASA Astrophysics Data System (ADS)

    Breivik, Øyvind; Aarnes, Ole Johan; Abdalla, Saleh; Bidlot, Jean-Raymond; Janssen, Peter A. E. M.

    2014-07-01

    Global return values of marine wind speed and significant wave height are estimated from very large aggregates of archived ensemble forecasts at +240 h lead time. Long lead time ensures that the forecasts represent independent draws from the model climate. Compared with ERA-Interim, a reanalysis, the ensemble yields higher return estimates for both wind speed and significant wave height. Confidence intervals are much tighter due to the large size of the data set. The period (9 years) is short enough to be considered stationary even with climate change. Furthermore, the ensemble is large enough for nonparametric 100 year return estimates to be made from order statistics. These direct return estimates compare well with extreme value estimates outside areas with tropical cyclones. Like any method employing modeled fields, it is sensitive to tail biases in the numerical model, but we find that the biases are moderate outside areas with tropical cyclones.
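
    The order-statistics estimate itself is simple bookkeeping: nine years of forecasts with roughly 50 members (a typical global ensemble size, assumed here) supply several hundred ensemble-equivalent years per location, so a 100-year level can be read off an empirical quantile. The synthetic draws below only illustrate the arithmetic.

      import numpy as np

      rng = np.random.default_rng(9)
      years, members, per_year = 9, 50, 365
      draws = rng.weibull(2.0, size=years * members * per_year) * 6.0  # wind speed, m/s

      equiv_years = years * members                 # ensemble-equivalent years
      exceed_prob = 1.0 / (100 * per_year)          # daily exceedance for a 100-yr event
      rv_100 = np.quantile(draws, 1.0 - exceed_prob)
      print(f"~{equiv_years} equivalent years; 100-yr return value ~ {rv_100:.1f} m/s")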

  13. Partitioning heritability by functional annotation using genome-wide association summary statistics.

    PubMed

    Finucane, Hilary K; Bulik-Sullivan, Brendan; Gusev, Alexander; Trynka, Gosia; Reshef, Yakir; Loh, Po-Ru; Anttila, Verneri; Xu, Han; Zang, Chongzhi; Farh, Kyle; Ripke, Stephan; Day, Felix R; Purcell, Shaun; Stahl, Eli; Lindstrom, Sara; Perry, John R B; Okada, Yukinori; Raychaudhuri, Soumya; Daly, Mark J; Patterson, Nick; Neale, Benjamin M; Price, Alkes L

    2015-11-01

    Recent work has demonstrated that some functional categories of the genome contribute disproportionately to the heritability of complex diseases. Here we analyze a broad set of functional elements, including cell type-specific elements, to estimate their polygenic contributions to heritability in genome-wide association studies (GWAS) of 17 complex diseases and traits with an average sample size of 73,599. To enable this analysis, we introduce a new method, stratified LD score regression, for partitioning heritability from GWAS summary statistics while accounting for linked markers. This new method is computationally tractable at very large sample sizes and leverages genome-wide information. Our findings include a large enrichment of heritability in conserved regions across many traits, a very large immunological disease-specific enrichment of heritability in FANTOM5 enhancers and many cell type-specific enrichments, including significant enrichment of central nervous system cell types in the heritability of body mass index, age at menarche, educational attainment and smoking behavior.
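
    The core regression idea (though not the stratified estimator itself) can be shown in toy form: under a polygenic model, E[χ2_j] ≈ 1 + N·h2·l_j/M, so regressing per-SNP χ2 statistics on LD scores l_j recovers h2 from the slope. All quantities below are simulated stand-ins for real summary statistics.

      import numpy as np

      rng = np.random.default_rng(10)
      M, N, h2 = 50_000, 70_000, 0.4
      ld_scores = rng.gamma(shape=4.0, scale=25.0, size=M)     # synthetic l_j
      expected = 1.0 + N * h2 * ld_scores / M
      chi2_stats = expected * rng.chisquare(df=1, size=M)      # noisy chi-square stats

      slope, intercept = np.polyfit(ld_scores, chi2_stats, deg=1)
      print("estimated h2:", round(slope * M / N, 3),
            "intercept:", round(intercept, 2))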

  14. Lack of large-angle TT correlations persists in WMAP and Planck

    NASA Astrophysics Data System (ADS)

    Copi, Craig J.; Huterer, Dragan; Schwarz, Dominik J.; Starkman, Glenn D.

    2015-08-01

    The lack of large-angle correlations in the observed microwave background temperature fluctuations persists in the final-year maps from Wilkinson Microwave Anisotropy Probe (WMAP) and the first cosmological data release from Planck. We find a statistically robust and significant result: p-values for the missing correlations lying below 0.24 per cent (i.e. evidence at more than 3σ) for foreground cleaned maps, in complete agreement with previous analyses based upon earlier WMAP data. A cut-sky analysis of the Planck HFI 100 GHz frequency band, the `cleanest CMB channel' of this instrument, returns a p-value as small as 0.03 per cent, based on the conservative mask defined by WMAP. These findings are in stark contrast to expectations from the inflationary Lambda cold dark matter model and still lack a convincing explanation. If this lack of large-angle correlations is a true feature of our Universe, and not just a statistical fluke, then the cosmological dipole must be considerably smaller than that predicted in the best-fitting model.

  15. Influence of Idealized Heterogeneity on Wet and Dry Planetary Boundary Layers Coupled to the Land Surface. 1; Instantaneous Fields and Statistics

    NASA Technical Reports Server (NTRS)

    Houser, Paul (Technical Monitor); Patton, Edward G.; Sullivan, Peter P.; Moeng, Chin-Hoh

    2003-01-01

    This is the first in a two-part series of manuscripts describing numerical experiments on the influence of 2-30 km strip-like heterogeneity on wet and dry boundary layers coupled to the land surface. The strip-like heterogeneity is shown to dramatically alter the structure of the free-convective boundary layer by inducing significant organized circulations that modify turbulent statistics. The coupling with the land surface modifies the circulations compared to previous studies using fixed surface forcing. Total boundary layer turbulence kinetic energy increases significantly for surface heterogeneity at scales between Lambda/z_i = 4 and 9; however, entrainment rates for all cases are largely unaffected by the strip-like heterogeneity.

  16. A retrospective cohort mortality study of blending and packaging workers of Mobil Corporation.

    PubMed

    Collingwood, K W; Milcarek, B I; Raabe, G K

    1991-01-01

    This retrospective cohort mortality study examined 2,467 workers in lubrication products blending and packaging (B&P) operations at two refineries of Mobil Corporation between January 1, 1945 and December 31, 1978. Ninety-seven percent were male. Compared with U.S. males, there were significantly fewer deaths observed among males due to all causes, external causes, and diseases of the circulatory, respiratory, digestive, and genitourinary systems. Deaths observed from all cancer were fewer than expected, although not statistically significant. No statistically significant excess cause-specific mortality occurred at B&P facilities combined or separately. Nonsignificant increases in mortality were observed for cancers of the stomach, large intestine, prostate, the category of "other lymphatic tissue" cancer, and leukemia and aleukemia. Analyses demonstrated a statistically significant pattern of increasing SMR with employment duration for "other lymphatic tissue" cancer. Within the highest cumulative duration of employment category, the excess was confined to workers after 30 or more years since first employment. Although the interpretation of cancer mortality patterns is limited due to small numbers of deaths, the absence of associations with specific B&P departments is evidence against a causal interpretation.

  17. Seismology: tectonic strain in plate interiors?

    PubMed

    Calais, E; Mattioli, G; DeMets, C; Nocquet, J-M; Stein, S; Newman, A; Rydelek, P

    2005-12-15

    It is not fully understood how or why the inner areas of tectonic plates deform, leading to large, although infrequent, earthquakes. Smalley et al. offer a potential breakthrough by suggesting that surface deformation in the central United States accumulates at rates comparable to those across plate boundaries. However, we find no statistically significant deformation in three independent analyses of the data set used by Smalley et al., and conclude therefore that only the upper bounds of magnitude and repeat time for large earthquakes can be inferred at present.

  18. Comparison of Different Instructional Multimedia Designs for Improving Student Science-Process Skill Learning

    NASA Astrophysics Data System (ADS)

    Chien, Yu-Ta; Chang, Chun-Yen

    2012-02-01

    This study developed three forms of computer-based multimedia, including Static Graphics (SG), Simple Learner-Pacing Animation (SLPA), and Full Learner-Pacing Animation (FLPA), to assist students in learning topographic measuring. The interactive design of FLPA allowed students to physically manipulate the virtual measuring mechanism, rather than passively observe dynamic or static images. The students were randomly assigned to different multimedia groups. The results of a one-way ANOVA indicated that (1) there was a significant difference with a large effect size (f = .69) in mental effort ratings among the three groups, and the post-hoc test indicated that FLPA imposed less cognitive load on students than did SG (p = .007); (2) the differences in practical performance scores among groups reached statistical significance with a large effect size (f = .76), and the post-hoc test indicated that FLPA fostered better learning outcomes than both SLPA and SG (p = .004 and p = .05, respectively); (3) the difference in instructional efficiency, computed from the z-score combination of students' mental effort ratings and practical performance scores, also reached statistical significance with a large effect size (f = .79), and the post-hoc test indicated that FLPA brought students higher instructional efficiency than both SLPA and SG (p = .01 and .005, respectively); (4) no significant effect was found in instructional time-spans between groups (p = .637). Overall, FLPA was recommended as the best multimedia form to facilitate topographic measurement learning. The implications for instructional multimedia design were discussed from the perspective of cognitive load theory.
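    The analysis pattern reported above, a one-way ANOVA followed by an effect size f, can be reproduced with standard tools. The sketch below is a generic illustration with made-up scores, not the study's data; Cohen's f is computed from eta-squared as f = sqrt(eta^2 / (1 - eta^2)).

    ```python
    import numpy as np
    from scipy import stats

    def cohens_f(*groups):
        """Cohen's f effect size for a one-way ANOVA design."""
        all_obs = np.concatenate(groups)
        grand_mean = all_obs.mean()
        # Between-group and total sums of squares.
        ss_between = sum(g.size * (g.mean() - grand_mean) ** 2 for g in groups)
        ss_total = ((all_obs - grand_mean) ** 2).sum()
        eta_sq = ss_between / ss_total
        return np.sqrt(eta_sq / (1.0 - eta_sq))

    rng = np.random.default_rng(2)
    sg = rng.normal(60, 10, 30)    # hypothetical Static Graphics scores
    slpa = rng.normal(65, 10, 30)  # hypothetical SLPA scores
    flpa = rng.normal(72, 10, 30)  # hypothetical FLPA scores

    f_stat, p_value = stats.f_oneway(sg, slpa, flpa)
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}, f = {cohens_f(sg, slpa, flpa):.2f}")
    ```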

  19. A 24-WEEK, MULTICENTER, OPEN-LABEL, RANDOMIZED STUDY TO COMPARE CHANGES IN GLUCOSE METABOLISM IN PATIENTS WITH SCHIZOPHRENIA RECEIVING TREATMENT WITH OLANZAPINE, QUETIAPINE AND RISPERIDONE

    PubMed Central

    Newcomer, John W.; Ratner, Robert E.; Eriksson, Jan W.; Emsley, Robin; Meulien, Didier; Miller, Frank; Leonova-Edlund, Julia; Leong, Ronald W; Brecher, Martin

    2013-01-01

    Objective This randomized, 24-week, flexible-dose study compared changes in glucose metabolism in patients with schizophrenia receiving initial exposure to olanzapine, quetiapine, or risperidone. Methods The hypothesized primary endpoint was change (baseline to Week 24) in area under the curve 0-2h plasma glucose during oral glucose tolerance test (OGTT); primary analysis: olanzapine versus quetiapine. Secondary endpoints included change in AUC 0-2h plasma insulin, insulin sensitivity index (ISI), and fasting lipids. Results Mean weight change (kg) over 24 weeks was +3.7 (quetiapine), +4.6 (olanzapine), and +3.6 (risperidone). Based on data from 395 patients (quetiapine n=115 [mean 607.0 mg/day], olanzapine n=146 [15.2 mg/day], and risperidone n=134 [5.2 mg/day]), change in AUC 0-2h glucose (mg/dL×h) at Week 24 was significantly lower for quetiapine versus olanzapine (t=1.98; DF=377; p=0.048). Increases in AUC 0-2h glucose were statistically significant with olanzapine (+21.9 mg/dL, 95% CI 11.5, 32.4) and risperidone (+18.8, CI 8.1, 29.4), but not quetiapine (+9.1, CI −2.3, 20.5). AUC 0-2h insulin increased statistically significantly with olanzapine, but not quetiapine or risperidone. Reductions in ISI were statistically significant with olanzapine and risperidone, but not quetiapine. Total cholesterol and LDL increased statistically significantly with olanzapine and quetiapine, but not risperidone. Statistically significant increases in triglycerides, cholesterol/HDL, and triglyceride/HDL ratios were observed with olanzapine only. Conclusion The results indicate a significant difference in the change in glucose tolerance during 6 months’ treatment with olanzapine versus quetiapine, with significant reductions on olanzapine and risperidone, but not quetiapine; these differential changes were largely explained by changes in insulin sensitivity. PMID:19358783

  20. Testing the effect of a science-enhanced curriculum on the science achievement and agricultural competency of secondary agricultural education students

    NASA Astrophysics Data System (ADS)

    Haynes, James Christopher

    Scope and Method of Study. The purpose of this study was to determine if a science-enhanced curriculum produced by the Center for Agricultural and Environmental Research and Training (CAERT) taught in a secondary level animal science or horticulture course would improve students' understanding of selected scientific principles significantly, when compared to students who were instructed using a traditional curriculum. A secondary purpose was to determine the effect that the science-enhanced CAERT curriculum would have on students' agricultural knowledge when compared to students who were instructed using a traditional curriculum. The design of the study was ex post facto, causal comparative because no random assignment of the treatment group occurred. Findings and Conclusions. No statistically significant difference was found between the treatment and comparison groups regarding science achievement. However, the mean score of the treatment group was slightly larger than the comparison group, indicating a slightly higher achievement level; a "Small" effect size (d = .16) for this difference was calculated. It was determined that a statistically significant difference (p < .05) existed in agriculture competency scores in animal science (p = .001) and horticulture (p < .001) as a result of the treatment. Moreover, this was considered to be a "very large" effect (d = 1.18) in animal science and a "large" effect (d = .92) in horticulture. When considering student achievement in science, this study found that the use of the science-enhanced CAERT curriculum did not result in a statistically significant increase (p < .05) in student performance as determined by the TerraNova3 science proficiency examination. However, students who were instructed using the CAERT curriculum scored better overall than those who were instructed using a "traditional" curriculum.

  1. Statistical significance approximation in local trend analysis of high-throughput time-series data using the theory of Markov chains.

    PubMed

    Xia, Li C; Ai, Dongmei; Cram, Jacob A; Liang, Xiaoyi; Fuhrman, Jed A; Sun, Fengzhu

    2015-09-21

    Local trend (i.e. shape) analysis of time series data reveals co-changing patterns in dynamics of biological systems. However, slow permutation procedures to evaluate the statistical significance of local trend scores have limited its applications to high-throughput time series data analysis, e.g., data from studies based on next-generation sequencing technology. By extending the theories for the tail probability of the range of sum of Markovian random variables, we propose formulae for approximating the statistical significance of local trend scores. Using simulations and real data, we show that the approximate p-value is close to that obtained using a large number of permutations (starting at time points >20 with no delay and >30 with delay of at most three time steps) in that the non-zero decimals of the p-values obtained by the approximation and the permutations are mostly the same when the approximate p-value is less than 0.05. In addition, the approximate p-value is slightly larger than that based on permutations, making hypothesis testing based on the approximate p-value conservative. The approximation enables efficient calculation of p-values for pairwise local trend analysis, making large scale all-versus-all comparisons possible. We also propose a hybrid approach by integrating the approximation and permutations to obtain accurate p-values for significantly associated pairs. We further demonstrate its use with the analysis of the Plymouth Marine Laboratory (PML) microbial community time series from high-throughput sequencing data and found interesting organism co-occurrence dynamic patterns. The software tool is integrated into the eLSA software package that now provides accelerated local trend and similarity analysis pipelines for time series data. The package is freely available from the eLSA website: http://bitbucket.org/charade/elsa.
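    To make the contrast concrete, the sketch below shows the slow permutation baseline that the approximation formulae are designed to replace, with a deliberately simplified trend score (sign-of-change agreement, scanned for the best contiguous run). The real eLSA score normalizes the series and allows time delays, so treat this as a toy stand-in, not the package's algorithm.

    ```python
    import numpy as np

    def local_trend_score(x, y):
        """Highest-scoring contiguous run of co-trending steps.

        Trends are the signs of first differences: matching up/down
        steps score +1, opposite steps -1 (Kadane maximum-subarray scan).
        """
        agree = np.sign(np.diff(x)) * np.sign(np.diff(y))
        best = cur = 0.0
        for a in agree:
            cur = max(0.0, cur + a)
            best = max(best, cur)
        return best

    def permutation_pvalue(x, y, n_perm=2000, seed=0):
        """The slow permutation null the paper's formulae approximate."""
        rng = np.random.default_rng(seed)
        observed = local_trend_score(x, y)
        null = [local_trend_score(rng.permutation(x), y) for _ in range(n_perm)]
        return (1 + sum(s >= observed for s in null)) / (1 + n_perm)

    rng = np.random.default_rng(3)
    t = np.linspace(0, 4 * np.pi, 40)
    x = np.sin(t) + rng.normal(0, 0.3, t.size)  # synthetic co-varying series
    y = np.sin(t) + rng.normal(0, 0.3, t.size)
    print(permutation_pvalue(x, y))
    ```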

  2. Immunochip Analyses of Epistasis in Rheumatoid Arthritis Confirm Multiple Interactions within MHC and Suggest Novel Non-MHC Epistatic Signals.

    PubMed

    Wei, Wen-Hua; Loh, Chia-Yin; Worthington, Jane; Eyre, Stephen

    2016-05-01

    Studying statistical gene-gene interactions (epistasis) has been limited by the difficulties in performance, both statistically and computationally, in large enough sample numbers to gain sufficient power. Three large Immunochip datasets from cohort samples recruited in the United Kingdom, United States, and Sweden with European ancestry were used to examine epistasis in rheumatoid arthritis (RA). A full pairwise search was conducted in the UK cohort using a high-throughput tool and the resultant significant epistatic signals were tested for replication in the United States and Swedish cohorts. A forward selection approach was applied to remove redundant signals, while conditioning on the preidentified additive effects. We detected abundant genome-wide significant (p < 1.0e-13) epistatic signals, all within the MHC region. These signals were reduced substantially, but a proportion remained significant (p < 1.0e-03) in conditional tests. We identified 11 independent epistatic interactions across the entire MHC, each explaining on average 0.12% of the phenotypic variance, nearly all replicated in both replication cohorts. We also identified non-MHC epistatic interactions between RA susceptible loci LOC100506023 and IRF5 with Immunochip-wide significance (p < 1.1e-08) and between 2 neighboring single-nucleotide polymorphism near PTPN22 that were in low linkage disequilibrium with independent interaction (p < 1.0e-05). Both non-MHC epistatic interactions were statistically replicated with a similar interaction pattern in the US cohort only. There are multiple but relatively weak interactions independent of the additive effects in RA and a larger sample number is required to confidently assign additional non-MHC epistasis.

  3. Short-term Forecasting of the Prevalence of Trachoma: Expert Opinion, Statistical Regression, versus Transmission Models

    PubMed Central

    Liu, Fengchen; Porco, Travis C.; Amza, Abdou; Kadri, Boubacar; Nassirou, Baido; West, Sheila K.; Bailey, Robin L.; Keenan, Jeremy D.; Solomon, Anthony W.; Emerson, Paul M.; Gambhir, Manoj; Lietman, Thomas M.

    2015-01-01

    Background Trachoma programs rely on guidelines made in large part using expert opinion of what will happen with and without intervention. Large community-randomized trials offer an opportunity to actually compare forecasting methods in a masked fashion. Methods The Program for the Rapid Elimination of Trachoma trials estimated longitudinal prevalence of ocular chlamydial infection from 24 communities treated annually with mass azithromycin. Given antibiotic coverage and biannual assessments from baseline through 30 months, forecasts of the prevalence of infection in each of the 24 communities at 36 months were made by three methods: the sum of 15 experts’ opinion, statistical regression of the square-root-transformed prevalence, and a stochastic hidden Markov model of infection transmission (Susceptible-Infectious-Susceptible, or SIS model). All forecasters were masked to the 36-month results and to the other forecasts. Forecasts of the 24 communities were scored by the likelihood of the observed results and compared using Wilcoxon’s signed-rank statistic. Findings Regression and SIS hidden Markov models had significantly better likelihood than community expert opinion (p = 0.004 and p = 0.01, respectively). All forecasts scored better when perturbed to decrease Fisher’s information. Each individual expert’s forecast was poorer than the sum of experts. Interpretation Regression and SIS models performed significantly better than expert opinion, although all forecasts were overly confident. Further model refinements may score better, although they would need to be tested and compared in new masked studies. Construction of guidelines that rely on forecasting future prevalence could consider use of mathematical and statistical models. PMID:26302380

  4. Short-term Forecasting of the Prevalence of Trachoma: Expert Opinion, Statistical Regression, versus Transmission Models.

    PubMed

    Liu, Fengchen; Porco, Travis C; Amza, Abdou; Kadri, Boubacar; Nassirou, Baido; West, Sheila K; Bailey, Robin L; Keenan, Jeremy D; Solomon, Anthony W; Emerson, Paul M; Gambhir, Manoj; Lietman, Thomas M

    2015-08-01

    Trachoma programs rely on guidelines made in large part using expert opinion of what will happen with and without intervention. Large community-randomized trials offer an opportunity to actually compare forecasting methods in a masked fashion. The Program for the Rapid Elimination of Trachoma trials estimated longitudinal prevalence of ocular chlamydial infection from 24 communities treated annually with mass azithromycin. Given antibiotic coverage and biannual assessments from baseline through 30 months, forecasts of the prevalence of infection in each of the 24 communities at 36 months were made by three methods: the sum of 15 experts' opinion, statistical regression of the square-root-transformed prevalence, and a stochastic hidden Markov model of infection transmission (Susceptible-Infectious-Susceptible, or SIS model). All forecasters were masked to the 36-month results and to the other forecasts. Forecasts of the 24 communities were scored by the likelihood of the observed results and compared using Wilcoxon's signed-rank statistic. Regression and SIS hidden Markov models had significantly better likelihood than community expert opinion (p = 0.004 and p = 0.01, respectively). All forecasts scored better when perturbed to decrease Fisher's information. Each individual expert's forecast was poorer than the sum of experts. Regression and SIS models performed significantly better than expert opinion, although all forecasts were overly confident. Further model refinements may score better, although they would need to be tested and compared in new masked studies. Construction of guidelines that rely on forecasting future prevalence could consider use of mathematical and statistical models. Clinicaltrials.gov NCT00792922.
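    The scoring-and-comparison step used in both versions of this record is a standard paired nonparametric test. The sketch below compares hypothetical per-community log-likelihood scores for two forecasting methods across 24 communities with Wilcoxon's signed-rank test; all numbers are synthetic, not trial data.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical log-likelihoods of the observed 36-month prevalence
    # under two forecasting methods, one value per community (n = 24).
    rng = np.random.default_rng(4)
    ll_regression = rng.normal(-2.0, 0.5, 24)
    ll_experts = ll_regression - np.abs(rng.normal(0.3, 0.2, 24))  # worse on average

    # Paired comparison across the same 24 communities.
    stat, p = stats.wilcoxon(ll_regression, ll_experts)
    print(f"Wilcoxon W = {stat:.1f}, p = {p:.4f}")
    ```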

  5. A scan statistic for binary outcome based on hypergeometric probability model, with an application to detecting spatial clusters of Japanese encephalitis.

    PubMed

    Zhao, Xing; Zhou, Xiao-Hua; Feng, Zijian; Guo, Pengfei; He, Hongyan; Zhang, Tao; Duan, Lei; Li, Xiaosong

    2013-01-01

    As a useful tool for geographical cluster detection of events, the spatial scan statistic is widely applied in many fields and plays an increasingly important role. The classic version of the spatial scan statistic for a binary outcome was developed by Kulldorff, based on the Bernoulli or the Poisson probability model. In this paper, we apply the Hypergeometric probability model to construct the likelihood function under the null hypothesis. Compared with existing methods, the likelihood function under the null hypothesis is an alternative and indirect way to identify the potential cluster, and the test statistic is the extreme value of the likelihood function. As in Kulldorff's method, we adopt a Monte Carlo test of significance. Both methods were applied to detect spatial clusters of Japanese encephalitis in Sichuan province, China, in 2009, and the detected clusters were identical. A simulation using independent benchmark data indicates that the test statistic based on the Hypergeometric model outperforms Kulldorff's statistics for clusters of high population density or large size; otherwise, Kulldorff's statistics are superior.
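    A compact sketch of the idea follows: score each candidate window by how unlikely its case count is under a hypergeometric null (cases scattered at random over the population), take the most extreme window as the test statistic, and calibrate it by Monte Carlo, as in Kulldorff's approach. For simplicity the windows here are disjoint, whereas real scan statistics use many overlapping windows; all counts are hypothetical.

    ```python
    import numpy as np
    from scipy.stats import hypergeom

    def scan_statistic(cases_in_window, pop_in_window, total_cases, total_pop):
        """Most extreme window under the hypergeometric null."""
        scores = [
            -hypergeom.logpmf(c, total_pop, total_cases, n)
            for c, n in zip(cases_in_window, pop_in_window)
        ]
        return max(scores)

    def monte_carlo_pvalue(pop_in_window, total_cases, total_pop,
                           observed_stat, n_sim=999, seed=0):
        """Monte Carlo significance, as in Kulldorff's method."""
        rng = np.random.default_rng(seed)
        exceed = 0
        for _ in range(n_sim):
            # Scatter all cases at random over the population units.
            sim = rng.multivariate_hypergeometric(pop_in_window, total_cases)
            if scan_statistic(sim, pop_in_window, total_cases,
                              total_pop) >= observed_stat:
                exceed += 1
        return (exceed + 1) / (n_sim + 1)

    pop = np.array([5000, 8000, 12000, 3000])  # hypothetical window populations
    cases = np.array([12, 15, 60, 5])          # hypothetical observed cases
    T = scan_statistic(cases, pop, cases.sum(), pop.sum())
    print(T, monte_carlo_pvalue(pop, cases.sum(), pop.sum(), T))
    ```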

  6. Benefits communication: its impact on employee benefits satisfaction under flexible programs.

    PubMed

    Rabin, B R

    1994-01-01

    Using field data from a private organization in upstate New York, a large statistically significant portion of benefits satisfaction was explained by benefits communication, employee benefits choice and change patterns, and demographics. Users of the organization's benefits communication materials and employees reporting fewer unmet benefits needs are more satisfied.

  7. Genetic structure of American chestnut populations based on neutral DNA markers

    Treesearch

    Thomas L. Kubisiak; James H. Roberds

    2006-01-01

    Microsatellite and RAPD markers suggest that American chestnut exists as a highly variable species, even at the margins of its natural range, with a large proportion of its genetic variability (~95%) occurring within populations. A statistically significant proportion also exists among populations. Although genetic differentiation among populations has taken place, no...

  8. What Happens to Students Placed into Developmental Education? A Meta-Analysis of Regression Discontinuity Studies

    ERIC Educational Resources Information Center

    Valentine, Jeffrey C.; Konstantopoulos, Spyros; Goldrick-Rab, Sara

    2017-01-01

    This article reports a systematic review and meta-analysis of studies that use regression discontinuity to examine the effects of placement into developmental education. Results suggest that placement into developmental education is associated with effects that are negative, statistically significant, and substantively large for three outcomes:…

  9. A comparison of approaches for estimating bottom-sediment mass in large reservoirs

    USGS Publications Warehouse

    Juracek, Kyle E.

    2006-01-01

    Estimates of sediment and sediment-associated constituent loads and yields from drainage basins are necessary for the management of reservoir-basin systems to address important issues such as reservoir sedimentation and eutrophication. One method for the estimation of loads and yields requires a determination of the total mass of sediment deposited in a reservoir. This method involves a sediment volume-to-mass conversion using bulk-density information. A comparison of four computational approaches (partition, mean, midpoint, strategic) for using bulk-density information to estimate total bottom-sediment mass in four large reservoirs indicated that the differences among the approaches were not statistically significant. However, the lack of statistical significance may be a result of the small sample size. Compared to the partition approach, which was presumed to provide the most accurate estimates of bottom-sediment mass, the results achieved using the strategic, mean, and midpoint approaches differed by as much as ±4, ±20, and ±44 percent, respectively. It was concluded that the strategic approach may merit further investigation as a less time-consuming and less costly alternative to the partition approach.

  10. Publication bias in situ

    PubMed Central

    Phillips, Carl V

    2004-01-01

    Background Publication bias, as typically defined, refers to the decreased likelihood of studies' results being published when they are near the null, not statistically significant, or otherwise "less interesting." But choices about how to analyze the data and which results to report create a publication bias within the published results, a bias I label "publication bias in situ" (PBIS). Discussion PBIS may create much greater bias in the literature than traditionally defined publication bias (the failure to publish any result from a study). The causes of PBIS are well known, consisting of various decisions about reporting that are influenced by the data. But its impact is not generally appreciated, and very little attention is devoted to it. What attention there is consists largely of rules for statistical analysis that are impractical and do not actually reduce the bias in reported estimates. PBIS cannot be reduced by statistical tools because it is not fundamentally a problem of statistics, but rather of non-statistical choices and plain language interpretations. PBIS should be recognized as a phenomenon worthy of study – it is extremely common and probably has a huge impact on results reported in the literature – and there should be greater systematic efforts to identify and reduce it. The paper presents examples, including results of a recent HIV vaccine trial, that show how easily PBIS can have a large impact on reported results, as well as how there can be no simple answer to it. Summary PBIS is a major problem, worthy of substantially more attention than it receives. There are ways to reduce the bias, but they are very seldom employed because they are largely unrecognized. PMID:15296515

  11. Holmium laser enucleation versus laparoscopic simple prostatectomy for large adenomas.

    PubMed

    Juaneda, R; Thanigasalam, R; Rizk, J; Perrot, E; Theveniaud, P E; Baumert, H

    2016-01-01

    The aim of this study is to compare Holmium laser enucleation of the prostate with another minimally invasive technique, laparoscopic simple prostatectomy. We compared outcomes of a series of 40 patients who underwent laparoscopic simple prostatectomy (n=20) or laser enucleation of the prostate (n=20) for large adenomas (>100 grams) at our institution. Study variables included operative time, catheterization time, hospital stay, pre- and post-operative International Prostate Symptom Score and maximum urinary flow rate, complications, and economic evaluation. Statistical analyses were performed using the Student t test and Fisher test. There were no significant differences in patient age, preoperative prostatic size, operating time or specimen weight between the 2 groups. Duration of catheterization (P=.0008) and hospital stay (P<.0001) were significantly less in the laser group. Both groups showed a statistically significant improvement in functional variables at 3 months post-operatively. The cost-utility analysis showed a per-case cost of 2589 euros for Holmium enucleation versus 4706 euros for the laparoscopic procedure. In the laser arm, 4 patients (20%) experienced complications according to the modified Clavien classification system versus 5 (25%) in the laparoscopic group (P>.99). Holmium enucleation of the prostate has similar short-term functional results and complication rates compared to laparoscopic simple prostatectomy performed in large glands, with the advantages of less catheterization time, lower economic costs and a reduced hospital stay. Copyright © 2015 AEU. Published by Elsevier España, S.L.U. All rights reserved.

  12. Inverted ILM flap, free ILM flap and conventional ILM peeling for large macular holes.

    PubMed

    Velez-Montoya, Raul; Ramirez-Estudillo, J Abel; Sjoholm-Gomez de Liano, Carl; Bejar-Cornejo, Francisco; Sanchez-Ramos, Jorge; Guerrero-Naranjo, Jose Luis; Morales-Canton, Virgilio; Hernandez-Da Mota, Sergio E

    2018-01-01

    To assess the closure rate after a single surgery for large macular holes and short-term visual recovery with three different surgical techniques. Prospective multicenter randomized controlled trial. We included treatment-naïve patients with a diagnosis of large macular hole (minimum diameter > 400 µm). All patients underwent a comprehensive ophthalmological examination. Before surgery, the patients were randomized into three groups: group A: conventional internal limiting membrane peeling, group B: inverted-flap technique and group C: free-flap technique. All study measurements were repeated within the period of 1 and 3 months after surgery. Continuous variables were assessed with a Kruskal-Wallis test; change in visual acuity was assessed with analysis of variance for repeated measurements with a Bonferroni correction for statistical significance. Thirty-eight patients were enrolled (group A: 12, group B: 12, group C: 14). The closure rate was 91.6% in groups A and B (95% CI 61.52-99.79%) and 85.71% in group C (95% CI 57.19-98.22%). There were no differences in the macular hole closure rate between groups (p = 0.85). All groups improved ≈ 0.2 logMAR, but only group B reached statistical significance (p < 0.007). Although all techniques displayed a trend toward visual improvement, the inverted-flap technique seems to induce a faster and more significant recovery in the short term.

  13. Safety and efficacy of silodosin and tadalafil in ease of negotiation of large ureteroscope in the management of ureteral stone: A prospective randomized trial.

    PubMed

    Bhattar, Rohit; Jain, Vipin; Tomar, Vinay; Yadav, Sher Singh

    2017-12-01

    To evaluate the safety and efficacy of silodosin and tadalafil in easing negotiation of a large ureteroscope (8/9.8 Fr) in the management of ureteral stones. Between June 2015 and May 2016, 86 patients presenting with a ureteral stone of size 6-15 mm were, after consent, randomly assigned to 1 of 3 outpatient treatment arms: silodosin (Group A), tadalafil (Group B), and placebo (Group C). After two weeks of therapy, 67 patients underwent ureteroscopy, and ureteral orifice configuration, ureteroscopic negotiation, ureteral dilatation, operating time, procedural complications and drug-related side effects were noted in each group. Ureteral negotiation was significantly better in Groups A (73.9%) and B (69.6%) as compared to Group C (38.1%) (p<0.01). A statistically significant difference was noted in the requirement for dilatation in Group C (71.4%) as compared to Groups A (26.1%) and B (39.1%) (p<0.01). The ureteral orifice was found to be more dilated in Groups A (69.6%) and B (60.9%) as compared to Group C (28.6%). Mean operating time was statistically lower in Groups A (35.2 min) and B (34.91 min) as compared to Group C (41.14 min) (p<0.01). Both silodosin and tadalafil not only relax ureteral smooth muscle but also facilitate the forward passage of a large ureteroscope (8/9.8 Fr) without any significant risk of adverse events.

  14. Ketorolac therapy for the prevention of acute pseudophakic cystoid macular edema: a systematic review

    PubMed Central

    Yilmaz, T; Cordero-Coma, M; Gallagher, M J

    2012-01-01

    To assess the effectiveness of ketorolac vs control for prevention of acute pseudophakic cystoid macular edema (CME). The following databases were searched: Medline (1950–June 11, 2011), The Cochrane Library (Issue 2, 2011), and the TRIP Database (up to 11 June 2011), using no language or other limits. Randomized controlled clinical trials (RCTs) were included that consisted of patients with acute pseudophakic cystoid macular edema, those comparing ketorolac with control, and those having a minimum follow-up of at least 28 days. In the four RCTs evaluating ketorolac vs control, treatment with ketorolac significantly reduced the risk of CME development at the end of treatment (∼4 weeks) compared to control (P=0.008; 95% confidence interval (0.03–0.58)). Analyzed individually, all but one study reported statistically nonsignificant findings. When the pooled relative risk was calculated, the analysis reached overall statistical significance, attributable to the review's large combined sample size and not to the individual studies themselves. In this systematic review of four RCTs, two of which compared ketorolac with no treatment and two of which evaluated ketorolac vs placebo drops, treatment with ketorolac significantly reduced the risk of developing CME at the end of ∼4 weeks of treatment compared with controls. These results, however, should be interpreted with caution considering the paucity of large randomized clinical trials in the literature. PMID:22094296

  15. Hormone replacement therapy is associated with gastro-oesophageal reflux disease: a retrospective cohort study.

    PubMed

    Close, Helen; Mason, James M; Wilson, Douglas; Hungin, A Pali S

    2012-05-29

    Oestrogen and progestogen have the potential to influence gastro-intestinal motility; both are key components of hormone replacement therapy (HRT). Results of observational studies in women taking HRT rely on self-reporting of gastro-oesophageal symptoms, and the aetiology of gastro-oesophageal reflux disease (GORD) remains unclear. This study investigated the association between HRT and GORD in menopausal women using validated general practice records. 51,182 menopausal women were identified using the UK General Practice Research Database between 1995 and 2004. Of these, 8,831 were matched with and without hormone use. Odds ratios (ORs) were calculated for GORD and proton-pump inhibitor (PPI) use in hormone and non-hormone users, adjusting for age, co-morbidities, and co-pharmacy. In unadjusted analysis, all forms of hormone use (oestrogen-only, tibolone, combined HRT and progestogen) were statistically significantly associated with GORD. In adjusted models, this association remained statistically significant for oestrogen-only treatment (OR 1.49; 1.18-1.89). Unadjusted analysis showed a statistically significant association between PPI use and oestrogen-only and combined HRT treatment. When adjusted for covariates, oestrogen-only treatment remained significant (OR 1.34; 95% CI 1.03-1.74). Findings from the adjusted model demonstrated greater use of PPIs by progestogen users (OR 1.50; 1.01-2.22). This first large cohort study of the association between GORD and HRT found a statistically significant association between oestrogen-only hormone therapy and both GORD and PPI use. This should be further investigated using prospective follow-up to validate the strength of the association and describe its clinical significance.

  16. Effects of Large-Scale Solar Installations on Dust Mobilization and Air Quality

    NASA Astrophysics Data System (ADS)

    Pratt, J. T.; Singh, D.; Diffenbaugh, N. S.

    2012-12-01

    Large-scale solar projects are increasingly being developed worldwide, and many of these installations are located in arid, desert regions. To examine the effects of these projects on regional dust mobilization and air quality, we analyze aerosol product data from NASA's Multi-angle Imaging Spectroradiometer (MISR) at annual and seasonal time intervals near fifteen photovoltaic and solar thermal stations ranging from 5-200 MW (12-4,942 acres) in size. The stations are distributed over eight different countries and were chosen based on size, location and installation date; most of the installations are large-scale, took place in desert climates and were installed between 2006 and 2010. We also consider air quality measurements of particulate matter between 2.5 and 10 micrometers (PM10) from the Environmental Protection Agency (EPA) monitoring sites near and downwind from the project installations in the U.S. We use monthly wind data from NOAA's National Centers for Environmental Prediction (NCEP) Global Reanalysis to select the stations downwind from the installations, and then perform statistical analysis on the data to identify any significant changes in these quantities. We find that fourteen of the fifteen regions have lower aerosol product after the start of the installations, and all six PM10 monitoring stations show lower particulate matter measurements after construction commenced. Results fail to show any statistically significant differences in aerosol optical index or PM10 measurements before and after the large-scale solar installations. However, many of the large installations are very recent, and there is insufficient data to fully understand the long-term effects on air quality. More data and higher-resolution analysis are necessary to better understand the relationship between large-scale solar, dust and air quality.

  17. The chi-square test of independence.

    PubMed

    McHugh, Mary L

    2013-01-01

    The Chi-square statistic is a non-parametric (distribution-free) tool designed to analyze group differences when the dependent variable is measured at a nominal level. Like all non-parametric statistics, the Chi-square is robust with respect to the distribution of the data. Specifically, it does not require equality of variances among the study groups or homoscedasticity in the data. It permits evaluation of both dichotomous independent variables and of multiple-group studies. Unlike many other non-parametric and some parametric statistics, the calculations needed to compute the Chi-square provide considerable information about how each of the groups performed in the study. This richness of detail allows the researcher to understand the results and thus to derive more detailed information from this statistic than from many others. The Chi-square is a significance statistic and should be followed with a strength statistic. Cramer's V is the most common strength statistic used when a significant Chi-square result has been obtained. Advantages of the Chi-square include its robustness with respect to the distribution of the data, its ease of computation, the detailed information that can be derived from the test, its use in studies for which parametric assumptions cannot be met, and its flexibility in handling data from both two-group and multiple-group studies. Limitations include its sample size requirements, difficulty of interpretation when there are large numbers of categories (20 or more) in the independent or dependent variables, and the tendency of Cramer's V to produce relatively low correlation measures, even for highly significant results.
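    Following the recommendation above to pair the significance test with a strength statistic, the sketch below runs a Chi-square test of independence on a hypothetical contingency table and reports Cramer's V alongside the p-value, where V = sqrt(chi2 / (n * (min(rows, cols) - 1))).

    ```python
    import numpy as np
    from scipy.stats import chi2_contingency

    # Hypothetical 2x3 contingency table: rows = outcome, columns = group.
    table = np.array([[30, 45, 25],
                      [70, 55, 75]])

    chi2, p, dof, expected = chi2_contingency(table)

    # Cramer's V: strength of association to report with the p-value.
    n = table.sum()
    min_dim = min(table.shape) - 1
    cramers_v = np.sqrt(chi2 / (n * min_dim))
    print(f"chi2 = {chi2:.2f}, p = {p:.4f}, V = {cramers_v:.2f}")
    ```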

  18. Large truck crash facts 2005

    DOT National Transportation Integrated Search

    2007-02-01

    This annual edition of Large Truck Crash Facts contains descriptive statistics about fatal, injury, and property damage only crashes involving large trucks in 2005. Selected crash statistics on passenger vehicles are also presented for comparison pur...

  19. Relationships between sudden weather changes in summer and mortality in the Czech Republic, 1986-2005

    NASA Astrophysics Data System (ADS)

    Plavcová, Eva; Kyselý, Jan

    2010-09-01

    The study examines the relationship between sudden changes in weather conditions in summer, represented by (1) sudden air temperature changes, (2) sudden atmospheric pressure changes, and (3) passages of strong atmospheric fronts; and variations in daily mortality in the population of the Czech Republic. The events are selected from data covering 1986-2005 and compared with the database of daily excess all-cause mortality for the whole population and persons aged 70 years and above. Relative deviations of mortality, i.e., ratios of the excess mortality to the expected number of deaths, were averaged over the selected events for days D-2 (2 days before a change) up to D+7 (7 days after), and their statistical significance was tested by means of the Monte Carlo method. We find that the periods around weather changes are associated with pronounced patterns in mortality: a significant increase in mortality is found after large temperature increases and on days of large pressure drops; a decrease in mortality (partly due to a harvesting effect) occurs after large temperature drops, pressure increases, and passages of strong cold fronts. The relationship to variations in excess mortality is better expressed for sudden air temperature/pressure changes than for passages of atmospheric fronts. The mortality effects are usually more pronounced in the age group 70 years and above. The impacts associated with large negative changes of pressure are statistically independent of the effects of temperature; the corresponding dummy variable is found to be a significant predictor in the ARIMA model for relative deviations of mortality. This suggests that sudden weather changes should be tested also in time series models for predicting excess mortality as they may enhance their performance.

  20. On use of the multistage dose-response model for assessing laboratory animal carcinogenicity

    PubMed Central

    Nitcheva, Daniella; Piegorsch, Walter W.; West, R. Webster

    2007-01-01

    We explore how well a statistical multistage model describes dose-response patterns in laboratory animal carcinogenicity experiments from a large database of quantal response data. The data are collected from the U.S. EPA’s publicly available IRIS data warehouse and examined statistically to determine how often higher-order values in the multistage predictor yield significant improvements in explanatory power over lower-order values. Our results suggest that the addition of a second-order parameter to the model only improves the fit about 20% of the time, while adding even higher-order terms apparently does not contribute to the fit at all, at least with the study designs we captured in the IRIS database. Also included is an examination of statistical tests for assessing significance of higher-order terms in a multistage dose-response model. It is noted that bootstrap testing methodology appears to offer greater stability for performing the hypothesis tests than a more-common, but possibly unstable, “Wald” test. PMID:17490794
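    Whether a higher-order term improves the fit can be illustrated directly. The sketch below fits the quantal multistage model P(d) = 1 - exp(-(b0 + b1*d + ... + bk*d^k)), b >= 0, by maximum likelihood at orders 1 and 2 and compares them with a likelihood-ratio statistic. The bioassay numbers are invented, and the naive chi-square reference shown is exactly the kind of asymptotic test the paper argues a bootstrap should replace, because the added coefficient sits on a boundary.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import chi2

    def neg_log_lik(beta, dose, n, x):
        """Binomial negative log-likelihood of the multistage model
        P(d) = 1 - exp(-(b0 + b1*d + ... + bk*d^k)), all b >= 0."""
        powers = np.vander(dose, len(beta), increasing=True)
        p = 1.0 - np.exp(-(powers @ beta))
        p = np.clip(p, 1e-12, 1 - 1e-12)
        return -np.sum(x * np.log(p) + (n - x) * np.log(1 - p))

    def fit_multistage(order, dose, n, x):
        beta0 = np.full(order + 1, 0.1)
        return minimize(neg_log_lik, beta0, args=(dose, n, x),
                        bounds=[(0.0, None)] * (order + 1), method="L-BFGS-B")

    # Hypothetical quantal bioassay: dose, group size, responders.
    dose = np.array([0.0, 0.5, 1.0, 2.0])
    n = np.array([50, 50, 50, 50])
    x = np.array([2, 6, 11, 24])

    fit1 = fit_multistage(1, dose, n, x)  # first-order model
    fit2 = fit_multistage(2, dose, n, x)  # adds the second-order term
    lrt = 2.0 * (fit1.fun - fit2.fun)
    print(f"LRT = {lrt:.3f}, naive p ~ {chi2.sf(lrt, df=1):.3f}")
    ```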

  1. No significant association between prenatal exposure to poliovirus epidemics and psychosis.

    PubMed

    Cahill, Matthew; Chant, David; Welham, Joy; McGrath, John

    2002-06-01

    To examine the association between prenatal exposure to poliovirus infection and later development of schizophrenia or affective psychosis in a Southern Hemisphere psychiatric register. We calculated rates of poliomyelitis cases per 10 000 background population and rates for schizophrenia (n = 6078) and affective psychosis (n = 3707) per 10 000 births for the period 1930-1964. Empirically weighted regression was used to measure the association between a given psychosis birth-rate and a poliomyelitis epidemic during gestation. There was no statistically significant association between exposure to a poliomyelitis epidemic during gestation and subsequent development of schizophrenia or affective psychosis. The lack of a consistent statistically significant association between poliovirus epidemics and schizophrenia suggests that either poliovirus may have a small effect which is only detectable with large data-sets and/or the effect may be modified by location. Further investigation of such inconsistencies may help elucidate candidate risk-modifying factors for schizophrenia.

  2. Practices participating in a dental PBRN have substantial and advantageous diversity even though as a group they have much in common with dentists at large

    PubMed Central

    Makhija, Sonia K; Gilbert, Gregg H; Rindal, D Brad; Benjamin, Paul; Richman, Joshua S; Pihlstrom, Daniel J; Qvist, Vibeke

    2009-01-01

    Background Practice-based research networks offer important opportunities to move recent advances into routine clinical practice. If their findings are not only generalizable to dental practices at large, but can also elucidate how practice characteristics are related to treatment outcome, their importance is even further elevated. Our objective was to determine whether we met a key objective for The Dental Practice-Based Research Network (DPBRN): to recruit a diverse range of practitioner-investigators interested in doing DPBRN studies. Methods DPBRN participants completed an enrollment questionnaire about their practices and themselves. To date, more than 1100 practitioners from the five participating regions have completed the questionnaire. The regions consist of: Alabama/Mississippi, Florida/Georgia, Minnesota, Permanente Dental Associates, and Scandinavia (Denmark, Norway, and Sweden). We tested the hypothesis that there are statistically significant differences in key characteristics among DPBRN practices, based on responses from dentists who participated in DPBRN's first network-wide study (n = 546). Results There were statistically significant, substantive regional differences among DPBRN-participating dentists, their practices, and their patient populations. Conclusion Although as a group, participants have much in common with practices at large; their substantial diversity offers important advantages, such as being able to evaluate how practice differences may affect treatment outcomes, while simultaneously offering generalizability to dentists at large. This should help foster knowledge transfer in both the research-to-practice and practice-to-research directions. PMID:19832991

  3. Statistical assessment of crosstalk enrichment between gene groups in biological networks.

    PubMed

    McCormack, Theodore; Frings, Oliver; Alexeyenko, Andrey; Sonnhammer, Erik L L

    2013-01-01

    Analyzing groups of functionally coupled genes or proteins in the context of global interaction networks has become an important aspect of bioinformatic investigations. Assessing the statistical significance of crosstalk enrichment between or within groups of genes can be a valuable tool for functional annotation of experimental gene sets. Here we present CrossTalkZ, a statistical method and software to assess the significance of crosstalk enrichment between pairs of gene or protein groups in large biological networks. We demonstrate that the standard z-score is generally an appropriate and unbiased statistic. We further evaluate the ability of four different methods to reliably recover crosstalk within known biological pathways. We conclude that the methods preserving the second-order topological network properties perform best. Finally, we show how CrossTalkZ can be used to annotate experimental gene sets using known pathway annotations and that its performance at this task is superior to gene enrichment analysis (GEA). CrossTalkZ (available at http://sonnhammer.sbc.su.se/download/software/CrossTalkZ/) is implemented in C++, easy to use, fast, accepts various input file formats, and produces a number of statistics. These include z-score, p-value, false discovery rate, and a test of normality for the null distributions.
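    The z-score the method is built on can be sketched in a few lines: count the links between two gene groups, rebuild the count under a permutation null, and standardize. The toy below permutes node labels, which does not preserve the second-order topological properties (degrees) that the paper finds important, so it illustrates the statistic rather than CrossTalkZ's preferred null; the tiny network is hypothetical.

    ```python
    import numpy as np

    def crosstalk_zscore(edges, group_a, group_b, n_perm=1000, seed=0):
        """Z-score of the link count between two gene groups against
        a node-label-permutation null (degrees not preserved)."""
        rng = np.random.default_rng(seed)
        nodes = sorted({u for e in edges for u in e})
        a, b = set(group_a), set(group_b)

        def count_links(label_of):
            return sum((label_of[u] in a and label_of[v] in b) or
                       (label_of[u] in b and label_of[v] in a)
                       for u, v in edges)

        observed = count_links({n: n for n in nodes})
        null = np.empty(n_perm)
        for i in range(n_perm):
            relabel = dict(zip(nodes, rng.permutation(nodes)))
            null[i] = count_links(relabel)
        return (observed - null.mean()) / null.std(ddof=1)

    edges = [("g1", "g2"), ("g2", "g3"), ("g3", "g4"),
             ("g1", "g4"), ("g2", "g4")]
    print(crosstalk_zscore(edges, ["g1", "g2"], ["g3", "g4"]))
    ```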

  4. IGESS: a statistical approach to integrating individual-level genotype data and summary statistics in genome-wide association studies.

    PubMed

    Dai, Mingwei; Ming, Jingsi; Cai, Mingxuan; Liu, Jin; Yang, Can; Wan, Xiang; Xu, Zongben

    2017-09-15

    Results from genome-wide association studies (GWAS) suggest that a complex phenotype is often affected by many variants with small effects, known as 'polygenicity'. Tens of thousands of samples are often required to ensure statistical power of identifying these variants with small effects. However, it is often the case that a research group can only get approval for the access to individual-level genotype data with a limited sample size (e.g. a few hundreds or thousands). Meanwhile, summary statistics generated using single-variant-based analysis are becoming publicly available. The sample sizes associated with the summary statistics datasets are usually quite large. How to make the most efficient use of existing abundant data resources largely remains an open question. In this study, we propose a statistical approach, IGESS, to increasing statistical power of identifying risk variants and improving accuracy of risk prediction by integrating individual-level genotype data and summary statistics. An efficient algorithm based on variational inference is developed to handle the genome-wide analysis. Through comprehensive simulation studies, we demonstrated the advantages of IGESS over the methods which take either individual-level data or summary statistics data as input. We applied IGESS to perform integrative analysis of Crohn's disease from WTCCC and summary statistics from other studies. IGESS was able to significantly increase the statistical power of identifying risk variants and improve the risk prediction accuracy from 63.2% (±0.4%) to 69.4% (±0.1%) using about 240 000 variants. The IGESS software is available at https://github.com/daviddaigithub/IGESS . zbxu@xjtu.edu.cn or xwan@comp.hkbu.edu.hk or eeyang@hkbu.edu.hk. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  5. Mechanical and Statistical Evidence of Human-Caused Earthquakes - A Global Data Analysis

    NASA Astrophysics Data System (ADS)

    Klose, C. D.

    2012-12-01

    The causality of large-scale geoengineering activities and the occurrence of earthquakes with magnitudes of up to M=8 is discussed and mechanical and statistical evidence is provided. The earthquakes were caused by artificial water reservoir impoundments, underground and open-pit mining, coastal management, hydrocarbon production and fluid injections/extractions. The presented global earthquake catalog has been recently published in the Journal of Seismology and is available to the public at www.cdklose.com. The data show evidence that geomechanical relationships exist with statistical significance between a) seismic moment magnitudes of observed earthquakes, b) anthropogenic mass shifts on the Earth's crust, and c) lateral distances of the earthquake hypocenters to the locations of the mass shifts. Research findings depend on uncertainties, in particular, of source parameter estimations of seismic events before instrumental recording. First analyses, however, indicate that small- to medium-size earthquakes (M6) tend to be triggered. The rupture propagation of triggered events might be dominated by pre-existing tectonic stress conditions. Besides event-specific evidence, large earthquakes such as China's 2008 M7.9 Wenchuan earthquake fall into a global pattern and cannot be considered as outliers or simply seen as an act of god. Observations also indicate that every second seismic event tends to occur after a decade, while pore pressure diffusion seems to only play a role when injecting fluids deep underground. The chance of an earthquake nucleating after two or 20 years near an area with a significant mass shift is 25% or 75%, respectively. Moreover, causative effects of seismic activities highly depend on the tectonic stress regime in the Earth's crust in which geoengineering takes place.

  6. Removal of two large-scale cosmic microwave background anomalies after subtraction of the integrated Sachs-Wolfe effect

    NASA Astrophysics Data System (ADS)

    Rassat, A.; Starck, J.-L.; Dupé, F.-X.

    2013-09-01

    Context. Although there is currently a debate over the significance of the claimed large-scale anomalies in the cosmic microwave background (CMB), their existence is not totally dismissed. In parallel to the debate over their statistical significance, recent work has also focussed on masks and secondary anisotropies as potential sources of these anomalies. Aims: In this work we investigate simultaneously the impact of the method used to account for masked regions as well as the impact of the integrated Sachs-Wolfe (ISW) effect, which is the large-scale secondary anisotropy most likely to affect the CMB anomalies. In this sense, our work is an update of previous works. Our aim is to identify trends in CMB data from different years and with different mask treatments. Methods: We reconstruct the ISW signal due to 2 Micron All-Sky Survey (2MASS) and NRAO VLA Sky Survey (NVSS) galaxies, effectively reconstructing the low-redshift ISW signal out to z ~ 1. We account for regions of missing data using the sparse inpainting technique. We test sparse inpainting of the CMB, large scale structure and ISW and find that it constitutes a bias-free reconstruction method suitable to study large-scale statistical isotropy and the ISW effect. Results: We focus on three large-scale CMB anomalies: the low quadrupole, the quadrupole/octopole alignment, and the octopole planarity. After sparse inpainting, the low quadrupole becomes more anomalous, whilst the quadrupole/octopole alignment becomes less anomalous. The significance of the low quadrupole is unchanged after subtraction of the ISW effect, while the trend amongst the CMB maps is that both the low quadrupole and the quadrupole/octopole alignment have reduced significance, yet other hypotheses remain possible as well (e.g. exotic physics). Our results also suggest that both of these anomalies may be due to the quadrupole alone. While the octopole planarity significance is reduced after inpainting and after ISW subtraction, however, we do not find that it was very anomalous to start with. In the spirit of participating in reproducible research, we make all codes and resulting products which constitute main results of this paper public here: http://www.cosmostat.org/anomaliesCMB.html

  7. Statistics of Magnetic Reconnection X-Lines in Kinetic Turbulence

    NASA Astrophysics Data System (ADS)

    Haggerty, C. C.; Parashar, T.; Matthaeus, W. H.; Shay, M. A.; Wan, M.; Servidio, S.; Wu, P.

    2016-12-01

    In this work we examine the statistics of magnetic reconnection (x-lines) and their associated reconnection rates in intermittent current sheets generated in turbulent plasmas. Although such statistics have been studied previously for fluid simulations (e.g. [1]), they have not yet been generalized to fully kinetic particle-in-cell (PIC) simulations. A significant problem with PIC simulations, however, is electrostatic fluctuations generated by numerical particle counting statistics. We find that analyzing gradients of the magnetic vector potential from the raw PIC field data identifies numerous artificial (non-physical) x-points. Using small Orszag-Tang vortex PIC simulations, we analyze x-line identification and show that these artificial x-lines can be removed using sub-Debye-length filtering of the data. We examine how turbulent properties such as the magnetic spectrum and scale-dependent kurtosis are affected by particle noise and sub-Debye-length filtering. We subsequently apply these analysis methods to a large-scale kinetic PIC turbulence simulation. Consistent with previous fluid models, we find a range of normalized reconnection rates as large as ½, with the bulk of the rates below approximately 0.1. [1] Servidio, S., W. H. Matthaeus, M. A. Shay, P. A. Cassak, and P. Dmitruk (2009), Magnetic reconnection and two-dimensional magnetohydrodynamic turbulence, Phys. Rev. Lett., 102, 115003.

  8. Intensive inpatient treatment for bulimia nervosa: Statistical and clinical significance of symptom changes.

    PubMed

    Diedrich, Alice; Schlegl, Sandra; Greetfeld, Martin; Fumi, Markus; Voderholzer, Ulrich

    2018-03-01

    This study examines the statistical and clinical significance of symptom changes during an intensive inpatient treatment program with a strong psychotherapeutic focus for individuals with severe bulimia nervosa. 295 consecutively admitted bulimic patients were administered the Structured Interview for Anorexic and Bulimic Syndromes-Self-Rating (SIAB-S), the Eating Disorder Inventory-2 (EDI-2), the Brief Symptom Inventory (BSI), and the Beck Depression Inventory-II (BDI-II) at treatment intake and discharge. Results indicated statistically significant symptom reductions with large effect sizes regarding severity of binge eating and compensatory behavior (SIAB-S), overall eating disorder symptom severity (EDI-2), overall psychopathology (BSI), and depressive symptom severity (BDI-II) even when controlling for antidepressant medication. The majority of patients showed either reliable (EDI-2: 33.7%, BSI: 34.8%, BDI-II: 18.1%) or even clinically significant symptom changes (EDI-2: 43.2%, BSI: 33.9%, BDI-II: 56.9%). Patients with clinically significant improvement were less distressed at intake and less likely to suffer from a comorbid borderline personality disorder when compared with those who did not improve to a clinically significant extent. Findings indicate that intensive psychotherapeutic inpatient treatment may be effective in about 75% of severely affected bulimic patients. For the remaining non-responding patients, inpatient treatment might be improved through an even stronger focus on the reduction of comorbid borderline personality traits.
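    Reliable and clinically significant change of the kind reported above is conventionally computed with Jacobson-Truax criteria: a reliable change index (RCI) exceeding 1.96 in absolute value, plus crossing a clinical cutoff. The sketch below shows that computation on invented scores; the standard deviation, reliability, and cutoff are hypothetical placeholders, not values from this study.

    ```python
    import numpy as np

    def reliable_change_index(pre, post, sd_pre, reliability):
        """Jacobson-Truax RCI: change over the standard error of the
        difference; |RCI| > 1.96 counts as reliable change."""
        se_measure = sd_pre * np.sqrt(1.0 - reliability)
        se_diff = np.sqrt(2.0 * se_measure ** 2)
        return (post - pre) / se_diff

    # Hypothetical eating-disorder scores (lower = better) at intake/discharge.
    pre = np.array([90.0, 75.0, 88.0])
    post = np.array([60.0, 72.0, 55.0])
    rci = reliable_change_index(pre, post, sd_pre=15.0, reliability=0.90)

    cutoff = 70.0  # hypothetical "closer to the functional population" bound
    clinically_significant = (np.abs(rci) > 1.96) & (post < cutoff)
    print(rci.round(2), clinically_significant)
    ```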

  9. Large truck and bus crash facts, 2010.

    DOT National Transportation Integrated Search

    2012-09-01

    This annual edition of Large Truck and Bus Crash Facts contains descriptive statistics about fatal, injury, and : property damage only crashes involving large trucks and buses in 2010. Selected crash statistics on passenger : vehicles are also presen...

  10. Large truck and bus crash facts, 2007.

    DOT National Transportation Integrated Search

    2009-03-01

    This annual edition of Large Truck and Bus Crash Facts contains descriptive statistics about fatal, injury, and : property damage only crashes involving large trucks and buses in 2007. Selected crash statistics on passenger : vehicles are also presen...

  11. Large truck and bus crash facts, 2008. 

    DOT National Transportation Integrated Search

    2010-03-01

    This annual edition of Large Truck and Bus Crash Facts contains descriptive statistics about fatal, injury, and : property damage only crashes involving large trucks and buses in 2008. Selected crash statistics on passenger : vehicles are also presen...

  12. Large truck and bus crash facts, 2011.

    DOT National Transportation Integrated Search

    2013-10-01

    This annual edition of Large Truck and Bus Crash Facts contains descriptive statistics about fatal, injury, and : property damage only crashes involving large trucks and buses in 2011. Selected crash statistics on passenger : vehicles are also presen...

  13. Large truck and bus crash facts, 2013.

    DOT National Transportation Integrated Search

    2015-04-01

    This annual edition of Large Truck and Bus Crash Facts contains descriptive statistics about fatal, injury, and property damage only crashes involving large trucks and buses in 2013. Selected crash statistics on passenger vehicles are also presented ...

  14. Large truck and bus crash facts, 2009.

    DOT National Transportation Integrated Search

    2011-10-01

    This annual edition of Large Truck and Bus Crash Facts contains descriptive statistics about fatal, injury, and property damage only crashes involving large trucks and buses in 2009. Selected crash statistics on passenger vehicles are also presen...

  15. Large truck and bus crash facts, 2012.

    DOT National Transportation Integrated Search

    2014-06-01

    This annual edition of Large Truck and Bus Crash Facts contains descriptive statistics about fatal, injury, and property damage only crashes involving large trucks and buses in 2012. Selected crash statistics on passenger vehicles are also presented ...

  16. Toughness and strength of nanocrystalline graphene

    DOE PAGES

    Shekhawat, Ashivni; Ritchie, Robert O.

    2016-01-28

    Pristine monocrystalline graphene is claimed to be the strongest material known, with remarkable mechanical and electrical properties. However, graphene made with scalable fabrication techniques is polycrystalline and contains inherent nanoscale line and point defects—grain boundaries and grain-boundary triple junctions—that lead to significant statistical fluctuations in toughness and strength. These fluctuations become particularly pronounced for nanocrystalline graphene, where the density of defects is high. Here we use large-scale simulation and continuum modelling to show that the statistical variation in toughness and strength can be understood with ‘weakest-link’ statistics. We develop the first statistical theory of toughness in polycrystalline graphene, and elucidate the nanoscale origins of the grain-size dependence of its strength and toughness. Finally, our results should lead to more reliable graphene device design, and provide a framework to interpret experimental results in a broad class of two-dimensional materials.

  17. A statistical analogy between collapse of solids and death of living organisms: proposal for a 'law of life'.

    PubMed

    Pugno, Nicola M

    2007-01-01

    In this paper we present a statistical analogy between the collapse of solids and the death of living organisms; in particular, we deduce a statistical law governing their probability of death. We have derived such a law by coupling the widely used Weibull statistics, developed for describing the distribution of the strength of solids, with a general model for ontogenetic growth recently proposed in the literature. The main idea presented in this paper is that cracks can propagate in solids and cause their failure just as sick cells in living organisms can cause their death. Making a rough analogy, living organisms are found to behave as "growing" mechanical components under cyclic (i.e., fatigue) loading, composed of a dynamic, evolving material that, as an ineluctable fate, deteriorates. The implications for biological scaling laws are discussed. As an example, we apply such Dynamic Weibull Statistics to large data collections on human deaths due to cancer of various types recorded in Italy: a significant agreement is observed.
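
    The building block of the paper's approach is the Weibull survival function; a minimal sketch follows, with the scale (lam) and shape (k) parameters chosen purely for illustration rather than fitted to the mortality data discussed above:

      import numpy as np

      def weibull_survival(t, lam=80.0, k=6.0):
          """Probability of surviving past age t under two-parameter Weibull statistics."""
          return np.exp(-(np.asarray(t, dtype=float) / lam) ** k)

      ages = np.arange(0, 101, 20)
      print(dict(zip(ages.tolist(), np.round(weibull_survival(ages), 3).tolist())))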

  18. DEIVA: a web application for interactive visual analysis of differential gene expression profiles.

    PubMed

    Harshbarger, Jayson; Kratz, Anton; Carninci, Piero

    2017-01-07

    Differential gene expression (DGE) analysis is a technique to identify statistically significant differences in RNA abundance for genes or arbitrary features between different biological states. The result of a DGE test is typically further analyzed using statistical software, spreadsheets, or custom ad hoc algorithms. We identified a need for a web-based system for sharing DGE statistical test results and for locating and identifying genes in those results, with a very low barrier to entry. We have developed DEIVA, a free and open-source, browser-based single-page application (SPA) with a strong emphasis on user-friendliness, which enables locating and identifying single or multiple genes in an immediate, interactive, and intuitive manner. By design, DEIVA scales to very large numbers of users and datasets. Compared to existing software, DEIVA offers a unique combination of design decisions that enable inspection and analysis of DGE statistical test results with an emphasis on ease of use.

  19. AN EXPLORATORY STUDY INVESTIGATING THE EFFECTS OF A TREATMENT MANUAL FOR VIDEO GAME ADDICTION.

    PubMed

    Pallesen, Ståle; Lorvik, Ingjerd Meen; Bu, Eli Hellandsjø; Molde, Helge

    2015-10-01

    This study investigated the effects of a manualized therapy for video game addiction in 12 males, ages 14-18 yr. The manual was based on cognitive-behavioral therapy, short-term strategic family therapy, solution-focused therapy, and motivational interviewing. Treatment response was reported by the patients, their mothers, and the therapists. The patients reported moderate (but statistically non-significant) improvement from pre- to post-treatment. The mothers, however, reported large effects and statistically significant improvement from pre- to post-treatment. The therapists reported marked or moderate treatment response in six of the 12 patients. The mothers' ratings of change converged well with those of both the patients and the therapists, whereas convergence between the latter two sources was far lower.

  20. Therapies for acute myeloid leukemia: vosaroxin

    PubMed Central

    Sayar, Hamid; Bashardoust, Parvaneh

    2017-01-01

    Vosaroxin, a quinolone-derivative chemotherapeutic agent, was considered a promising drug for the treatment of acute myeloid leukemia (AML). Early-stage clinical trials with this agent led to a large randomized double-blind placebo-controlled study of vosaroxin in combination with intermediate-dose cytarabine for the treatment of relapsed or refractory AML. The study demonstrated better complete remission rates with vosaroxin, but there was no statistically significant overall survival benefit in the whole cohort. A subset analysis censoring patients who had undergone allogeneic stem cell transplantation, however, revealed a modest but statistically significant improvement in overall survival particularly among older patients. This article reviews the data available on vosaroxin including clinical trials in AML and offers an analysis of findings of these studies as well as the current status of vosaroxin. PMID:28860803

  2. Design of a sediment data-collection program in Kansas as affected by time trends

    USGS Publications Warehouse

    Jordan, P.R.

    1985-01-01

    Data collection programs need to be re-examined periodically in order to ensure their usefulness, efficiency, and applicability. The possibility of time trends in sediment concentration, in particular, makes examination with new statistical techniques desirable. After adjusting sediment concentrations for their relation to streamflow rates and by using a seasonal adaptation of Kendall's nonparametric statistical test, time trends of flow-adjusted concentrations were detected for 11 of the 38 sediment records tested that were not affected by large reservoirs. Ten of the 11 trends were toward smaller concentrations; only 1 was toward larger concentrations. Of the apparent trends that were not statistically significant (0.05 level) using data available, nearly all were toward smaller concentrations. Because the reason for the lack of statistical significance of an apparent trend may be inadequacy of data rather than absence of trend, and because of the prevalence of apparent trends in one direction, the assumption was made that a time trend may be present at any station. This assumption can significantly affect the design of a sediment data collection program. Sudden decreases (step trends) in flow-adjusted sediment concentrations were found at all stations that were short distances downstream from large reservoirs and that had adequate data for a seasonal adaptation of Wilcoxon's nonparametric statistical test. Examination of sediment records in the 1984 data collection program of the Kansas Water Office indicated 13 stations that can be discontinued temporarily because data are now adequate. Data collection could be resumed in 1992 when new data may be needed because of possible time trends. New data are needed at eight previously operated stations where existing data may be inadequate or misleading because of time trends. Operational changes may be needed at some stations, such as hiring contract observers or installing automatic pumping samplers. Implementing the changes in the program can provide a substantial increase in the quantity of useful information on stream sediment for the same funding as the 1984 level. (Author's abstract)
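
    The seasonal adaptation of Kendall's test combines one Mann-Kendall statistic per season so that seasonal cycles do not masquerade as trends. A minimal sketch, assuming one flow-adjusted concentration per (year, season) and no ties, and omitting the flow-adjustment step itself:

      import numpy as np
      from itertools import combinations

      def seasonal_kendall(values):
          """values: array of shape (n_years, n_seasons). Returns (S, Z)."""
          values = np.asarray(values, dtype=float)
          S, var_S = 0.0, 0.0
          for season in values.T:                    # one Kendall S per season, then sum
              x = season[~np.isnan(season)]
              n = len(x)
              S += sum(np.sign(b - a) for a, b in combinations(x, 2))
              var_S += n * (n - 1) * (2 * n + 5) / 18.0   # no-ties variance formula
          Z = (S - np.sign(S)) / np.sqrt(var_S) if var_S > 0 else 0.0
          return S, Z                                # |Z| > 1.96: trend at the 0.05 level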

  3. The impact of science notebook writing on ELL and low-SES students' science language development and conceptual understanding

    NASA Astrophysics Data System (ADS)

    Huerta, Margarita

    This quantitative study explored the impact of literacy integration in a science inquiry classroom involving the use of science notebooks on the academic language development and conceptual understanding of students from diverse (i.e., English Language Learners, or ELLs) and low socio-economic status (low-SES) backgrounds. The study derived from a randomized, longitudinal, field-based NSF-funded research project (NSF Award No. DRL-0822343) targeting ELL and non-ELL students from low-SES backgrounds in a large urban school district in Southeast Texas. The study used a scoring rubric (modified and tested for validity and reliability) to analyze fifth-grade students' science notebook entries. Scores for academic language quality (or, for brevity, language) were used to compare language growth over time across three time points (i.e., beginning, middle, and end of the school year) and to compare students across categories (ELL, former ELL, non-ELL, and gender) using descriptive statistics and mixed between-within subjects analysis of variance (ANOVA). Scores for conceptual understanding (or, for brevity, concept) were used to compare students across categories (ELL, former ELL, non-ELL, and gender) in three domains using descriptive statistics and ANOVA. A correlational analysis was conducted to explore the relationship, if any, between language scores and concept scores for each group. Students demonstrated statistically significant growth over time in their academic language as reflected by science notebook scores. While ELL students scored lower than former ELL and non-ELL students at the first two time points, they caught up to their peers by the third time point. Similarly, females outperformed males in language scores at the first two time points, but males caught up to females at the third time point. In the analysis of conceptual scores, ELLs had significantly lower scores than former-ELL and non-ELL students, and females outperformed males in the first two domains. These differences, however, were not statistically significant in the last domain. Last, correlations between language and concept scores were overall positive, large, and statistically significant across domains and groups. The study presents a rubric useful for quantifying diverse students' science notebook entries, and findings add to the sparse research on the impact of writing on diverse students' language development and conceptual understanding in science.

  4. Earning Differences by Major Field of Study: Evidence from Three Cohorts of Recent Canadian Graduates.

    ERIC Educational Resources Information Center

    Finnie, Ross; Frenette, Marc

    2003-01-01

    Analysis of earnings differences by major field of study of three cohorts of graduates (1982, 1986, 1990) with bachelor's degrees from Canadian postsecondary institutions. Finds that earnings differences are large and statistically significant. The patterns are relatively consistent for the three cohorts and for male and female graduates, 2 and 5…

  5. Power-law tail probabilities of drainage areas in river basins

    USGS Publications Warehouse

    Veitzer, S.A.; Troutman, B.M.; Gupta, V.K.

    2003-01-01

    The significance of power-law tail probabilities of drainage areas in river basins was discussed. The convergence to a power law was not observed for all underlying distributions, but for a large class of statistical distributions with specific limiting properties. The article also discussed the scaling properties of topologic and geometric network properties in river basins.

  6. Living Research: Oral History in the Black Community.

    ERIC Educational Resources Information Center

    Adesiyan, H. Rose

    Both blacks and whites arriving in Hammond, Indiana in the late 1800s and early 1900s played significant roles in its development. The role of the early black settlers has been largely untold outside the black community and is thus unappreciated. The goal of this project was to change this historical neglect. Statistical data from traditional…

  7. Guess Who's (Not) Coming to Class: Student Attitudes as Indicators of Attendance

    ERIC Educational Resources Information Center

    Gump, Steven E.

    2006-01-01

    A survey of 172 undergraduates, carried out during the fall 2002 and spring 2003 semesters at a large research university in the Midwestern United States, found, as expected, a statistically significant positive relationship (r = 0.174, p < 0.05) between the importance students attributed to attendance and the rates at which they…

  8. Automated Hypothesis Tests and Standard Errors for Nonstandard Problems with Description of Computer Package: A Draft.

    ERIC Educational Resources Information Center

    Lord, Frederic M.; Stocking, Martha

    A general computer program is described that will compute asymptotic standard errors and carry out significance tests for an endless variety of (standard and) nonstandard large-sample statistical problems, without requiring the statistician to derive asymptotic standard error formulas. The program assumes that the observations have a multinormal…

  9. Evaluating Educational Programs. ETS R&D Scientific and Policy Contributions Series. ETS SPC-11-01. ETS Research Report No. RR-11-15

    ERIC Educational Resources Information Center

    Ball, Samuel

    2011-01-01

    Since its founding in 1947, ETS has conducted a significant and wide-ranging research program that has focused on, among other things, psychometric and statistical methodology; educational evaluation; performance assessment and scoring; large-scale assessment and evaluation; cognitive, developmental, personality, and social psychology; and…

  10. ANALYSIS TO ACCOUNT FOR SMALL AGE RANGE CATEGORIES IN DISTRIBUTIONS OF WATER CONSUMPTION AND BODY WEIGHT IN THE U.S. USING CSFII DATA

    EPA Science Inventory

    Statistical population-based estimates of water ingestion play a vital role in many types of exposure and risk analysis. A significant large-scale analysis of water ingestion by the population of the United States was recently completed and is documented in the report titled ...

  11. Class Size and Student Evaluations in Sweden

    ERIC Educational Resources Information Center

    Westerlund, Joakim

    2008-01-01

    This paper examines the effect of class size on student evaluations of the quality of an introductory mathematics course at Lund University in Sweden. In contrast to many other studies, we find a large negative, and statistically significant, effect of class size on the quality of the course. This result appears to be quite robust, as almost all…

  12. Identification of Microorganisms by High Resolution Tandem Mass Spectrometry with Accurate Statistical Significance

    NASA Astrophysics Data System (ADS)

    Alves, Gelio; Wang, Guanghui; Ogurtsov, Aleksey Y.; Drake, Steven K.; Gucek, Marjan; Suffredini, Anthony F.; Sacks, David B.; Yu, Yi-Kuo

    2016-02-01

    Correct and rapid identification of microorganisms is the key to the success of many important applications in health and safety, including, but not limited to, infection treatment, food safety, and biodefense. With the advance of mass spectrometry (MS) technology, the speed of identification can be greatly improved. However, the increasing number of microbes sequenced is challenging correct microbial identification because of the large number of choices present. To properly disentangle candidate microbes, one needs to go beyond apparent morphology or simple 'fingerprinting'; to correctly prioritize the candidate microbes, one needs to have accurate statistical significance in microbial identification. We meet these challenges by using peptidome profiles of microbes to better separate them and by designing an analysis method that yields accurate statistical significance. Here, we present an analysis pipeline that uses tandem MS (MS/MS) spectra for microbial identification or classification. We have demonstrated, using MS/MS data of 81 samples, each composed of a single known microorganism, that the proposed pipeline can correctly identify microorganisms at least at the genus and species levels. We have also shown that the proposed pipeline computes accurate statistical significances, i.e., E-values for identified peptides and unified E-values for identified microorganisms. The proposed analysis pipeline has been implemented in MiCId, a freely available software for Microorganism Classification and Identification. MiCId is available for download at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads.html.
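
    The E-value reported for an identification is, in essence, the expected number of equally good matches arising by chance; a generic sketch of the conversion from a per-match p-value, with both numbers invented rather than taken from MiCId:

      def e_value(p_value, n_candidates):
          """Expected number of random candidates scoring this well or better."""
          return p_value * n_candidates

      print(e_value(1e-8, 5_000_000))   # 0.05: a borderline identification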

  13. Neurons from the adult human dentate nucleus: neural networks in the neuron classification.

    PubMed

    Grbatinić, Ivan; Marić, Dušica L; Milošević, Nebojša T

    2015-04-07

    This study performed topological (central vs. border neuron type) and morphological classification of adult human dentate nucleus neurons according to their quantified histomorphological properties, using neural networks on real and virtual neuron samples. In the real sample, 53.1% of central and 14.1% of border neurons were classified correctly, with a total of 32.8% of neurons misclassified. Most notably, 62.2% of neurons in the border group were misclassified, which exceeds the proportion correctly classified in that group (37.8%), showing an obvious failure of the network to classify these neurons correctly based on the computational parameters used in our study. On the virtual sample, 97.3% of neurons in the border group were misclassified, far exceeding the proportion correctly classified (2.7%), again confirming this failure. Statistical analysis shows no statistically significant difference between central and border neurons for any measured parameter (p>0.05). In total, 96.74% of neurons were morphologically classified correctly by the neural networks, each belonging to one of four histomorphological types: (a) neurons with small soma and short dendrites, (b) neurons with small soma and long dendrites, (c) neurons with large soma and short dendrites, and (d) neurons with large soma and long dendrites. Statistical analysis supports these results (p<0.05). Human dentate nucleus neurons can thus be classified into four types according to their quantitative histomorphological properties. These types comprise two sets, small and large with respect to their perikarya, with subtypes differing in dendrite length (short vs. long dendrites). Besides confirming the classification into small and large neurons already reported in the literature, we found two new subtypes: neurons with small soma and long dendrites, and neurons with large soma and short dendrites. These neurons are most probably equally distributed throughout the dentate nucleus, as no significant difference in their topological distribution was observed. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Regional and Temporal Variation in Methamphetamine-Related Incidents: Applications of Spatial and Temporal Scan Statistics

    PubMed Central

    Sudakin, Daniel L.

    2009-01-01

    Introduction: This investigation utilized spatial scan statistics, geographic information systems, and multiple data sources to assess spatial clustering of statewide methamphetamine-related incidents. Temporal and spatial associations with regulatory interventions to reduce access to precursor chemicals (pseudoephedrine) were also explored. Methods: Four statewide data sources were utilized, including regional poison control center statistics, fatality incidents, methamphetamine laboratory seizures, and hazardous substance releases involving methamphetamine laboratories. Spatial clustering of methamphetamine incidents was assessed using SaTScan™. SaTScan™ was also utilized to assess space-time clustering of methamphetamine laboratory incidents in relation to the enactment of regulations to reduce access to pseudoephedrine. Results: Five counties with a significantly higher relative risk of methamphetamine-related incidents were identified. The county identified as the most likely cluster had a significantly elevated relative risk of methamphetamine laboratories (RR=11.5), hazardous substance releases (RR=8.3), and fatalities relating to methamphetamine (RR=1.4). A significant increase in relative risk of methamphetamine laboratory incidents was apparent in this same geographic area (RR=20.7) during the time period when regulations were enacted in 2004 and 2005, restricting access to pseudoephedrine. Subsequent to the enactment of these regulations, a significantly lower rate of incidents (RR 0.111, p=0.0001) was observed over a large geographic area of the state, including regions that previously had significantly higher rates. Conclusions: Spatial and temporal scan statistics can be effectively applied to multiple data sources to assess regional variation in methamphetamine-related incidents, and to explore the impact of preventive regulatory interventions. PMID:19225949
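
    The zone statistic that spatial scan software such as SaTScan maximizes over candidate clusters is, for Poisson-distributed counts, a log-likelihood ratio comparing observed with expected cases inside the zone; a minimal sketch with invented counts (the significance of the maximum is then assessed by Monte Carlo replication, omitted here):

      import numpy as np

      def poisson_llr(c, C, E):
          """LLR for a zone with c observed cases and E expected, of C cases statewide."""
          if c <= E:
              return 0.0                 # only zones in excess count as candidate clusters
          return c * np.log(c / E) + (C - c) * np.log((C - c) / (C - E))

      # a zone with 40 laboratory seizures where population share predicts 12, of 300 total
      print(round(poisson_llr(40, 300, 12.0), 2))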

  15. Long-term occlusal changes assessed by the American Board of Orthodontics' model grading system.

    PubMed

    Aszkler, Robert M; Preston, Charles B; Saltaji, Humam; Tabbaa, Sawsan

    2014-02-01

    The purpose of this study was to assess the long-term posttreatment changes in all criteria of the American Board of Orthodontics' (ABO) model grading system. We used plaster models from patients' final and posttreatment records. Thirty patients treated by 1 orthodontist using 1 bracket prescription were selected. An initial discrepancy index was calculated for each subject to determine the complexity of each case. The final models were then graded using the ABO's model grading system at posttreatment and again at postretention. Statistical analysis was performed on the 8 criteria of the model grading system, including paired t tests and Pearson correlations. An alpha of 0.05 was considered statistically significant. The average length of time between the posttreatment and postretention records was 12.7 ± 4.4 years. It was shown that alignment and rotations worsened by postretention (P = 0.014), and a weak but statistically significant correlation between posttreatment and postretention was found (0.44; P = 0.016). Both marginal ridges and occlusal contacts scored worse at posttreatment. These criteria showed a significant decrease in scores between posttreatment and postretention (P <0.001), but the correlations were not statistically significant. The average total score showed a significant decrease between posttreatment and postretention (P <0.001), partly because of the large decrease in the previous 2 criteria. Higher scores for occlusal contacts and marginal ridges were found at the end of treatment; however, those scores and the overall scores for the 30 subjects improved in the postretention phase. Copyright © 2014. Published by Mosby, Inc.

  16. A Statistical Analysis of Brain Morphology Using Wild Bootstrapping

    PubMed Central

    Ibrahim, Joseph G.; Tang, Niansheng; Rowe, Daniel B.; Hao, Xuejun; Bansal, Ravi; Peterson, Bradley S.

    2008-01-01

    Methods for the analysis of brain morphology, including voxel-based morphometry and surface-based morphometry, have been used to detect associations between brain structure and covariates of interest, such as diagnosis, severity of disease, age, IQ, and genotype. The statistical analysis of morphometric measures usually involves two statistical procedures: 1) invoking a statistical model at each voxel (or point) on the surface of the brain or brain subregion, followed by mapping test statistics (e.g., t test) or their associated p values at each of those voxels; 2) correction for the multiple statistical tests conducted across all voxels on the surface of the brain region under investigation. We propose the use of new statistical methods for each of these procedures. We first use a heteroscedastic linear model to test the associations between the morphological measures at each voxel on the surface of the specified subregion (e.g., cortical or subcortical surfaces) and the covariates of interest. Moreover, we develop a robust test procedure that is based on a resampling method, called wild bootstrapping. This procedure assesses the statistical significance of the associations between a measure of given brain structure and the covariates of interest. The value of this robust test procedure lies in its computational simplicity and in its applicability to a wide range of imaging data, including data from both anatomical and functional magnetic resonance imaging (fMRI). Simulation studies demonstrate that this robust test procedure can accurately control the family-wise error rate. We demonstrate the application of this robust test procedure to the detection of statistically significant differences in the morphology of the hippocampus over time across gender groups in a large sample of healthy subjects. PMID:17649909
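
    A minimal wild-bootstrap sketch for a single voxel: the model is refitted to data regenerated from the null fit plus sign-flipped residuals (Rademacher weights), building a null distribution for the covariate t statistic; X and y are simulated stand-ins for morphometric measures, not the study's data:

      import numpy as np

      rng = np.random.default_rng(0)
      n = 60
      X = np.column_stack([np.ones(n), rng.standard_normal(n)])   # intercept + covariate
      y = X @ np.array([1.0, 0.3]) + rng.standard_normal(n)       # simulated voxel measure

      def t_stat(Xm, ym):
          b = np.linalg.lstsq(Xm, ym, rcond=None)[0]
          r = ym - Xm @ b
          se = np.sqrt(r @ r / (len(ym) - Xm.shape[1]) * np.linalg.inv(Xm.T @ Xm)[1, 1])
          return b[1] / se

      t_obs = t_stat(X, y)
      y0, r0 = np.full(n, y.mean()), y - y.mean()    # fit and residuals under H0: no effect
      t_null = [t_stat(X, y0 + r0 * rng.choice([-1.0, 1.0], size=n)) for _ in range(2000)]
      print("p =", np.mean(np.abs(t_null) >= abs(t_obs)))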

  17. Segment-Wise Genome-Wide Association Analysis Identifies a Candidate Region Associated with Schizophrenia in Three Independent Samples

    PubMed Central

    Rietschel, Marcella; Mattheisen, Manuel; Breuer, René; Schulze, Thomas G.; Nöthen, Markus M.; Levinson, Douglas; Shi, Jianxin; Gejman, Pablo V.; Cichon, Sven; Ophoff, Roel A.

    2012-01-01

    Recent studies suggest that variation in complex disorders (e.g., schizophrenia) is explained by a large number of genetic variants with small effect size (odds ratio ∼1.05–1.1). The statistical power to detect these genetic variants in Genome-Wide Association (GWA) studies with large numbers of cases and controls (∼15,000) is still low. As it will be difficult to further increase sample size, we decided to explore an alternative method for analyzing GWA data in a study of schizophrenia, dramatically reducing the number of statistical tests. The underlying hypothesis was that at least some of the genetic variants related to a common outcome are collocated in segments of chromosomes at a wider scale than single genes. Our approach was therefore to study the association between relatively large segments of DNA and disease status. An association test was performed for each SNP and the number of nominally significant tests in a segment was counted. We then performed a permutation-based binomial test to determine whether this region contained significantly more nominally significant SNPs than expected under the null hypothesis of no association, taking linkage into account. Genome-Wide Association data from three independent schizophrenia case/control cohorts of European ancestry (Dutch, German, and US) were analyzed using segments of DNA of variable length (2 to 32 Mbp). Using this approach we identified a region at chromosome 5q23.3-q31.3 (128–160 Mbp) that was significantly enriched with nominally associated SNPs in three independent case-control samples. We conclude that considering relatively wide segments of chromosomes may reveal reliable relationships between the genome and schizophrenia, suggesting novel methodological possibilities as well as raising theoretical questions. PMID:22723893
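
    The per-segment test reduces to counting nominally significant SNPs and asking whether that count exceeds the binomial expectation; a simplified sketch (the paper's permutation step, which corrects the binomial null for linkage between SNPs, is omitted):

      import numpy as np
      from scipy.stats import binomtest

      def segment_enrichment(p_values, alpha=0.05):
          """One-sided binomial p-value for an excess of nominally significant SNPs."""
          p_values = np.asarray(p_values)
          k = int(np.sum(p_values < alpha))
          return binomtest(k, n=len(p_values), p=alpha, alternative="greater").pvalue

      rng = np.random.default_rng(1)
      null_segment = rng.uniform(size=400)            # a segment with no associated SNPs
      print(round(segment_enrichment(null_segment), 3))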

  18. Differences in Obesity Prevalence by Demographic Characteristics and Urbanization Level Among Adults in the United States, 2013-2016.

    PubMed

    Hales, Craig M; Fryar, Cheryl D; Carroll, Margaret D; Freedman, David S; Aoki, Yutaka; Ogden, Cynthia L

    2018-06-19

    Differences in obesity by sex, age group, race and Hispanic origin among US adults have been reported, but differences by urbanization level have been less studied. To provide estimates of obesity by demographic characteristics and urbanization level and to examine trends in obesity prevalence by urbanization level. Serial cross-sectional analysis of measured height and weight among adults aged 20 years or older in the 2001-2016 National Health and Nutrition Examination Survey, a nationally representative survey of the civilian, noninstitutionalized US population. Sex, age group, race and Hispanic origin, education level, smoking status, and urbanization level as assessed by metropolitan statistical areas (MSAs; large: ≥1 million population). Prevalence of obesity (body mass index [BMI] ≥30) and severe obesity (BMI ≥40) by subgroups in 2013-2016 and trends by urbanization level between 2001-2004 and 2013-2016. Complete data on weight, height, and urbanization level were available for 10 792 adults (mean age, 48 years; 51% female [weighted]). During 2013-2016, 38.9% (95% CI, 37.0% to 40.7%) of US adults had obesity and 7.6% (95% CI, 6.8% to 8.6%) had severe obesity. Men living in medium or small MSAs had a higher age-adjusted prevalence of obesity compared with men living in large MSAs (42.4% vs 31.8%, respectively; adjusted difference, 9.8 percentage points [95% CI, 5.1 to 14.5 percentage points]); however, the age-adjusted prevalence among men living in non-MSAs was not significantly different compared with men living in large MSAs (38.9% vs 31.8%, respectively; adjusted difference, 4.8 percentage points [95% CI, -2.9 to 12.6 percentage points]). The age-adjusted prevalence of obesity was higher among women living in medium or small MSAs compared with women living in large MSAs (42.5% vs 38.1%, respectively; adjusted difference, 4.3 percentage points [95% CI, 0.2 to 8.5 percentage points]) and among women living in non-MSAs compared with women living in large MSAs (47.2% vs 38.1%, respectively; adjusted difference, 4.7 percentage points [95% CI, 0.2 to 9.3 percentage points]). Similar patterns were seen for severe obesity except that the difference between men living in large MSAs compared with non-MSAs was significant. The age-adjusted prevalence of obesity and severe obesity also varied significantly by age group, race and Hispanic origin, and education level, and these patterns of variation were often different by sex. Between 2001-2004 and 2013-2016, the age-adjusted prevalence of obesity and severe obesity significantly increased among all adults at all urbanization levels. In this nationally representative survey of adults in the United States, the age-adjusted prevalence of obesity and severe obesity in 2013-2016 varied by level of urbanization, with significantly greater prevalence of obesity and severe obesity among adults living in nonmetropolitan statistical areas compared with adults living in large metropolitan statistical areas.

  19. 75 FR 72611 - Assessments, Large Bank Pricing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-24

    ... the worst risk ranking and are included in the statistical analysis. Appendix 1 to the NPR describes the statistical analysis in detail. \\12\\ The percentage approximated by factors is based on the statistical model for that particular year. Actual weights assigned to each scorecard measure are largely based...

  20. Co-administration of furosemide with albumin for overcoming diuretic resistance in patients with hypoalbuminemia: a meta-analysis.

    PubMed

    Kitsios, Georgios D; Mascari, Paolo; Ettunsi, Riad; Gray, Anthony W

    2014-04-01

    To systematically review clinical studies of co-administration of albumin and loop diuretics in hypoalbuminemic patients as a strategy to overcome diuretic resistance. Systematic search of electronic databases up to October 2012. We included randomized clinical trials of adults with hypoalbuminemia, comparing co-administration of loop diuretics and albumin versus loop diuretics alone. Quantitative data were synthesized with meta-analytic techniques for clinical, surrogate (urinary volume and urinary sodium excretion) and intermediate (pharmacokinetic and hemodynamic parameters) outcomes. Ten studies were included, of which 8 trials with crossover design were synthesized with meta-analysis. A statistically significant increase in the amount of urine volume (increment of 231 mL [95% confidence interval 135.5-326.5]) and sodium excreted (15.9 mEq [4.9-26.8]) at 8 hours were found in favor of co-administration of albumin and furosemide. These differences were no longer statistically significant at 24 hours. Meta-analyses for intermediate outcomes (ie, furosemide excretion, distribution volume etc.) did not reveal statistically significant differences. Synthesis of a heterogeneous body of evidence shows transient effects of modest clinical significance for co-administration of albumin with furosemide in hypoalbuminemic patients. Pragmatic, large-scale randomized studies are needed to delineate the role of this strategy. Copyright © 2014 Elsevier Inc. All rights reserved.
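
    The quantitative synthesis above rests on inverse-variance pooling of per-trial mean differences; a minimal fixed-effect sketch under a normal approximation, with two invented trials standing in for the included studies:

      import numpy as np

      def fixed_effect(md, ci_low, ci_high):
          """Pool mean differences given their 95% confidence intervals."""
          md, ci_low, ci_high = map(np.asarray, (md, ci_low, ci_high))
          se = (ci_high - ci_low) / (2 * 1.96)        # back out standard errors from CIs
          w = 1.0 / se ** 2
          pooled = np.sum(w * md) / np.sum(w)
          pooled_se = 1.0 / np.sqrt(np.sum(w))
          return pooled, (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)

      print(fixed_effect([250, 210], [90, 60], [410, 360]))   # urine volume at 8 h, mL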

  1. No sex differences in use of dopaminergic medication in early Parkinson disease in the US and Canada - baseline findings of a multicenter trial.

    PubMed

    Umeh, Chizoba C; Pérez, Adriana; Augustine, Erika F; Dhall, Rohit; Dewey, Richard B; Mari, Zoltan; Simon, David K; Wills, Anne-Marie A; Christine, Chadwick W; Schneider, Jay S; Suchowersky, Oksana

    2014-01-01

    Sex differences in Parkinson disease clinical features have been reported, but few studies have examined sex influences on the use of dopaminergic medication in early Parkinson disease. The objective of this study was to test whether there are differences in the type of dopaminergic medication used and the levodopa equivalent daily dose between men and women with early Parkinson disease enrolled in a large multicenter study of creatine as a potential disease-modifying therapy: the National Institute of Neurological Disorders and Stroke Exploratory Trials in Parkinson Disease Long-Term Study-1. Baseline data of 1,741 participants from 45 participating sites were analyzed. Participants from the United States and Canada were enrolled within five years of Parkinson disease diagnosis. Two outcome variables were studied: type of dopaminergic medication used and levodopa equivalent daily dose at baseline in the Long-Term Study-1. Chi-square tests and linear regression models were used for statistical analysis. There were no statistically significant differences in the frequency of use of different types of dopaminergic medications at baseline between men and women with Parkinson disease. A small but statistically significant difference was observed in the median unadjusted levodopa equivalent daily dose at baseline between women (300 mg) and men (325 mg), but this was not observed after controlling for disease duration (years since Parkinson disease diagnosis), disease severity (Unified Parkinson's Disease Rating Scale Motor and Activities of Daily Living scores), and body weight. In this large multicenter study, we did not observe sex differences in the type and dose of dopaminergic medications used in early Parkinson disease. Further research is needed to evaluate the influence of male or female sex on the use of dopaminergic medication in mid- and late-stage Parkinson disease.

  2. Connecting optical and X-ray tracers of galaxy cluster relaxation

    NASA Astrophysics Data System (ADS)

    Roberts, Ian D.; Parker, Laura C.; Hlavacek-Larrondo, Julie

    2018-04-01

    Substantial effort has been devoted to determining the ideal proxy for quantifying the morphology of the hot intracluster medium in clusters of galaxies. These proxies, based on X-ray emission, typically require expensive, high-quality X-ray observations, making them difficult to apply to large surveys of groups and clusters. Here, we compare optical relaxation proxies with X-ray asymmetries and centroid shifts for a sample of Sloan Digital Sky Survey clusters with high-quality, archival X-ray data from Chandra and XMM-Newton. The three optical relaxation measures considered are the shape of the member-galaxy projected velocity distribution - measured by the Anderson-Darling (AD) statistic, the stellar mass gap between the most-massive and second-most-massive cluster galaxy, and the offset between the most-massive galaxy (MMG) position and the luminosity-weighted cluster centre. The AD statistic and stellar mass gap correlate significantly with X-ray relaxation proxies, with the AD statistic being the stronger correlator. Conversely, we find no evidence for a correlation between X-ray asymmetry or centroid shift and the MMG offset. High-mass clusters (Mhalo > 10^14.5 M⊙) in this sample have X-ray asymmetries, centroid shifts, and Anderson-Darling statistics which are systematically larger than for low-mass systems. Finally, considering the dichotomy of Gaussian and non-Gaussian clusters (measured by the AD test), we show that the probability of being a non-Gaussian cluster correlates significantly with X-ray asymmetry but only shows a marginal correlation with centroid shift. These results confirm the shape of the radial velocity distribution as a useful proxy for cluster relaxation, which can then be applied to large redshift surveys lacking extensive X-ray coverage.
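
    A sketch of the Gaussianity check on member-galaxy velocities with the Anderson-Darling test; the velocities below are simulated (a relaxed component plus an infalling group), whereas the study applies the test across its full cluster sample:

      import numpy as np
      from scipy.stats import anderson

      rng = np.random.default_rng(42)
      v = np.concatenate([rng.normal(0, 800, 80),      # relaxed members, km/s
                          rng.normal(2500, 300, 20)])  # infalling group
      res = anderson(v, dist="norm")
      print(res.statistic, res.critical_values)        # statistic above the 5% critical
                                                       # value suggests a non-Gaussian cluster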

  3. Unified risk analysis of fatigue failure in ductile alloy components during all three stages of fatigue crack evolution process.

    PubMed

    Patankar, Ravindra

    2003-10-01

    Statistical fatigue life of a ductile alloy specimen is traditionally divided into three stages, namely, crack nucleation, small crack growth, and large crack growth. Crack nucleation and small crack growth show wide variation and hence a large spread on the cycles-versus-crack-length graph; large crack growth shows comparatively less variation. Therefore, different models are fitted to the different stages of the fatigue evolution process, treating the stages as distinct phenomena. With such independent models, it is impossible to predict one phenomenon based on information available about another. Experimentally, it is easier to measure the length of large cracks than of nucleating or small cracks, so statistical data for large crack growth are far easier to collect than the painstaking data required for crack nucleation and small crack growth. This article presents a fracture-mechanics-based stochastic model of fatigue crack growth in ductile alloys that are commonly encountered in mechanical structures and machine components. The model has been validated by Ray (1998) against various statistical fatigue crack-propagation data. Based on the model, this article proposes a technique to predict statistical information on fatigue crack nucleation and small crack growth using the statistical properties of large crack growth under constant-amplitude stress excitation. The statistical properties of large crack growth under constant-amplitude stress excitation can be obtained via experiments.
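
    For orientation, a generic randomized Paris-law sketch of large crack growth under constant-amplitude loading; this is not the Ray (1998) model itself, and every parameter value is illustrative:

      import numpy as np

      def grow_crack(a0=1e-3, dsigma=100.0, C=1e-11, m=3.0, cycles=200_000, seed=0):
          """Crack length (m) after N cycles; dsigma in MPa, C in Paris-law units."""
          rng = np.random.default_rng(seed)
          a = a0
          for _ in range(cycles):
              dK = dsigma * np.sqrt(np.pi * a)             # stress-intensity range, MPa*m**0.5
              a += C * dK ** m * rng.lognormal(sigma=0.1)  # randomized growth increment
          return a

      print(f"final crack length: {grow_crack():.3e} m")   # grown from 1.0e-3 m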

  4. Understanding Short-Term Nonmigrating Tidal Variability in the Ionospheric Dynamo Region from SABER Using Information Theory and Bayesian Statistics

    NASA Astrophysics Data System (ADS)

    Kumari, K.; Oberheide, J.

    2017-12-01

    Nonmigrating tidal diagnostics of SABER temperature observations in the ionospheric dynamo region reveal a large amount of variability on time scales of a few days to weeks. In this paper, we discuss the physical reasons for the observed short-term tidal variability using a novel approach based on information theory and Bayesian statistics. We diagnose short-term tidal variability as a function of season, QBO, ENSO, solar cycle, and other drivers using time-dependent probability density functions, Shannon entropy, and Kullback-Leibler divergence. The statistical significance of the approach and its predictive capability are exemplified using SABER tidal diagnostics, with emphasis on the responses to the QBO and solar cycle. Implications for F-region plasma density will be discussed.
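
    The information-theoretic diagnostics reduce to entropies of binned probability density functions; a minimal sketch with synthetic amplitude samples standing in for the SABER tidal diagnostics:

      import numpy as np
      from scipy.stats import entropy

      rng = np.random.default_rng(3)
      bins = np.linspace(0, 20, 21)                         # tidal-amplitude bins, K
      p, _ = np.histogram(rng.gamma(4, 2, 5000), bins)      # e.g. one set of conditions
      q, _ = np.histogram(rng.gamma(5, 2, 5000), bins)      # e.g. a contrasting set
      p, q = p + 1e-12, q + 1e-12                           # guard against empty bins
      p, q = p / p.sum(), q / q.sum()

      print("Shannon entropy of p:", entropy(p, base=2))    # bits
      print("KL divergence D(p||q):", entropy(p, q))        # nats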

  5. Statistical Ensemble of Large Eddy Simulations

    NASA Technical Reports Server (NTRS)

    Carati, Daniele; Rogers, Michael M.; Wray, Alan A.; Mansour, Nagi N. (Technical Monitor)

    2001-01-01

    A statistical ensemble of large eddy simulations (LES) is run simultaneously for the same flow. The information provided by the different large-scale velocity fields is used to propose an ensemble-averaged version of the dynamic model. This produces local model parameters that depend only on the statistical properties of the flow. An important property of the ensemble-averaged dynamic procedure is that it does not require any spatial averaging and can thus be used in fully inhomogeneous flows. Also, the ensemble of LESs provides statistics of the large-scale velocity that can be used for building new models for the subgrid-scale stress tensor. The ensemble-averaged dynamic procedure has been implemented with various models for three flows: decaying isotropic turbulence, forced isotropic turbulence, and the time-developing plane wake. It is found that the results are almost independent of the number of LESs in the statistical ensemble provided that the ensemble contains at least 16 realizations.
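
    The key step is replacing spatial averaging in the dynamic-model coefficient with an average over the simultaneous LES realizations, yielding a local coefficient even in inhomogeneous flows. A sketch of that averaging step only, with placeholder arrays for the Germano-identity contractions (computing them from the resolved velocity fields is omitted):

      import numpy as np

      def ensemble_dynamic_coefficient(LM, MM, eps=1e-30):
          """LM, MM: pointwise contractions, shape (n_realizations, nx, ny, nz)."""
          return LM.mean(axis=0) / (MM.mean(axis=0) + eps)   # local C(x), no spatial average

      rng = np.random.default_rng(7)
      LM = rng.standard_normal((16, 32, 32, 32)) * 1e-4      # 16 realizations, as in the study
      MM = rng.random((16, 32, 32, 32)) * 1e-3
      C = ensemble_dynamic_coefficient(LM, MM)
      print(C.shape, float(C.mean()))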

  6. Sex-specific substance abuse treatment for female healthcare professionals: implications.

    PubMed

    Koos, Erin; Brand, Michael; Rojas, Julio; Li, Ji

    2014-01-01

    Gender plays a significant role in the development and treatment of substance abuse disorders. Sex-specific treatment for girls and women has recurrently proven more effective, with better outcomes than traditional treatment. Research on impaired healthcare professionals (HCPs) has largely focused on men, with little attention to women and sex differences. With the increasing numbers of female HCPs, it is imperative to identify potential sex differences that may have implications for treatment. Our study compared a convenience sample of male and female HCPs with substance abuse disorders treated in an outpatient program to identify such differences. Our sample consisted of 96 HCPs (54 men, 42 women) and 17 non-healthcare professional (N-HCP) women. All of the participants were evaluated using the program's clinical interview and the Personality Assessment Inventory (PAI). Chart review data contained categorical variables, qualitative variables, diagnoses, and psychological test scores. A second analysis was conducted through two separate comparisons: the PAI results of impaired female HCPs versus impaired male HCPs, and the PAI results of impaired female HCPs versus impaired female N-HCPs. Statistically significant differences indicated that more male participants received prior treatment, and more intensive treatment, than female participants. More female subjects reported being diagnosed as having a comorbid psychiatric condition and taking psychotropic medications. Several statistically significant differences in the PAI scores were found. Among female HCPs, elevations were found in anxiety, depression, paranoia, and borderline personality disorder. Substantive differences, although not statistically significant, were elevations in somatic complaints and anxiety disorders in female HCPs. In the comparison of female HCPs and N-HCPs, the only statistically significant difference was the significantly higher anxiety score of N-HCPs. The results indicate greater differences between female HCPs and male HCPs than between female HCPs and N-HCPs.

  7. Hormone replacement therapy is associated with gastro-oesophageal reflux disease: a retrospective cohort study

    PubMed Central

    2012-01-01

    Background: Oestrogen and progestogen have the potential to influence gastro-intestinal motility; both are key components of hormone replacement therapy (HRT). Results of observational studies in women taking HRT rely on self-reporting of gastro-oesophageal symptoms, and the aetiology of gastro-oesophageal reflux disease (GORD) remains unclear. This study investigated the association between HRT and GORD in menopausal women using validated general practice records. Methods: 51,182 menopausal women were identified using the UK General Practice Research Database between 1995–2004. Of these, 8,831 were matched with and without hormone use. Odds ratios (ORs) were calculated for GORD and proton-pump inhibitor (PPI) use in hormone and non-hormone users, adjusting for age, co-morbidities, and co-pharmacy. Results: In unadjusted analysis, all forms of hormone use (oestrogen-only, tibolone, combined HRT and progestogen) were statistically significantly associated with GORD. In adjusted models, this association remained statistically significant for oestrogen-only treatment (OR 1.49; 1.18–1.89). Unadjusted analysis showed a statistically significant association between PPI use and oestrogen-only and combined HRT treatment. When adjusted for covariates, oestrogen-only treatment remained significant (OR 1.34; 95% CI 1.03–1.74). Findings from the adjusted model demonstrated greater use of PPIs by progestogen users (OR 1.50; 1.01–2.22). Conclusions: This first large cohort study of the association between HRT and GORD found a statistically significant association between oestrogen-only hormone use and both GORD and PPI use. This should be further investigated using prospective follow-up to validate the strength of the association and describe its clinical significance. PMID:22642788

  8. Soil carbon inventories under a bioenergy crop (switchgrass): Measurement limitations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garten, C.T. Jr.; Wullschleger, S.D.

    Approximately 5 yr after planting, coarse root carbon (C) and soil organic C (SOC) inventories were compared under different types of plant cover at four switchgrass (Panicum virgatum L.) production field trials in the southeastern USA. There was significantly more coarse root C under switchgrass (Alamo variety) and forest cover than tall fescue (Festuca arundinacea Schreb.), corn (Zea mays L.), or native pastures of mixed grasses. Inventories of SOC under switchgrass were not significantly greater than SOC inventories under other plant covers. At some locations the statistical power associated with ANOVA of SOC inventories was low, which raised questions about whether differences in SOC could be detected statistically. A minimum detectable difference (MDD) for SOC inventories was calculated. The MDD is the smallest detectable difference between treatment means once the variation, significance level, statistical power, and sample size are specified. The analysis indicated that a difference of ~50 mg SOC/cm² or 5 Mg SOC/ha, which is ~10 to 15% of existing SOC, could be detected with reasonable sample sizes and good statistical power. The smallest difference in SOC inventories that can be detected, and only with exceedingly large sample sizes, is ~2 to 3%. These measurement limitations have implications for monitoring and verification of proposals to ameliorate increasing global atmospheric CO₂ concentrations by sequestering C in soils.

  9. Activation of Methane by FeO+: Determining Reaction Pathways through Temperature-Dependent Kinetics and Statistical Modeling (Postprint)

    DTIC Science & Technology

    2014-02-25

    benchmarks for the reaction surface. INTRODUCTION: There is significant interest in procuring and employing natural gas as a viable alternative to ... petroleum for both energy and chemical feedstocks [1,2]. One of the primary impediments to natural gas utilization is that methane (∼90% of natural gas) ... is significant, which typically limits its use to areas where large natural gas deposits are in very close proximity, neglecting the many smaller

  10. Physical activity and healthy weight maintenance from childhood to adulthood.

    PubMed

    Cleland, Verity J; Dwyer, Terence; Venn, Alison J

    2008-06-01

    The objective of this study was to determine whether change in physical activity was associated with maintaining a healthy weight from childhood to adulthood. This prospective cohort study followed 1,594 young Australian adults (48.9% female) aged 27-36 years who were first examined at age 9-15 years as part of a national health and fitness survey. BMI was calculated from measured height and weight, and physical activity was self-reported at both time points; pedometers were also used at follow-up. Change in physical activity was characterized by calculating the difference between baseline and follow-up z-scores. Change scores were categorized as decreasing (large, moderate), stable, or increasing (large, moderate). Healthy weight was defined in childhood as a BMI less than international overweight cutoff points, and in adulthood as BMI <25 kg/m². Healthy weight maintainers were healthy weight at both time points. Compared with those who demonstrated large relative decreases in physical activity, females in all other groups were 25-37% more likely to be healthy weight maintainers, although associations differed according to the physical activity measure used at follow-up and few reached statistical significance. Although younger males whose relative physical activity moderately or largely increased were 27-34% more likely to be healthy weight maintainers than those whose relative physical activity largely decreased, the differences were not statistically significant. In conclusion, relatively increasing and stable physical activity from childhood to adulthood was only weakly associated with healthy weight maintenance. Examining personal, social, and environmental factors associated with healthy weight maintenance will be an important next step in understanding why some groups avoid becoming overweight.

  11. Differences in results of analyses of concurrent and split stream-water samples collected and analyzed by the US Geological Survey and the Illinois Environmental Protection Agency, 1985-91

    USGS Publications Warehouse

    Melching, C.S.; Coupe, R.H.

    1995-01-01

    During water years 1985-91, the U.S. Geological Survey (USGS) and the Illinois Environmental Protection Agency (IEPA) cooperated in the collection and analysis of concurrent and split stream-water samples from selected sites in Illinois. Concurrent samples were collected independently by field personnel from each agency at the same time and sent to the IEPA laboratory, whereas the split samples were collected by USGS field personnel and divided into aliquots that were sent to each agency's laboratory for analysis. The water-quality data from these programs were examined by means of the Wilcoxon signed ranks test to identify statistically significant differences between results of the USGS and IEPA analyses. The data sets for constituents and properties identified by the Wilcoxon test as having significant differences were further examined by use of the paired t-test, mean relative percentage difference, and scattergrams to determine if the differences were important. Of the 63 constituents and properties in the concurrent-sample analysis, differences in only 2 (pH and ammonia) were statistically significant and large enough to concern water-quality engineers and planners. Of the 27 constituents and properties in the split-sample analysis, differences in 9 (turbidity, dissolved potassium, ammonia, total phosphorus, dissolved aluminum, dissolved barium, dissolved iron, dissolved manganese, and dissolved nickel) were statistically significant and large enough to concern water-quality engineers and planners. The differences in concentration between pairs of the concurrent samples were compared to the precision of the laboratory or field method used. The differences in concentration between pairs of split samples were compared to the precision of the laboratory method used and the interlaboratory precision of measuring a given concentration or property. Consideration of method precision indicated that differences between concurrent samples were insignificant for all concentrations and properties except pH, and that differences between split samples were significant for all concentrations and properties. Consideration of interlaboratory precision indicated that the differences between the split samples were not unusually large. The results for the split samples illustrate the difficulty in obtaining comparable and accurate water-quality data.
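
    A sketch of the paired comparison used for the concurrent samples, with invented pH values in place of the USGS/IEPA data:

      from scipy.stats import wilcoxon

      usgs = [7.2, 7.4, 7.1, 7.6, 7.3, 7.5, 7.2, 7.4]   # USGS field values (e.g. pH)
      iepa = [7.0, 7.1, 7.0, 7.3, 7.1, 7.2, 7.0, 7.2]   # matched IEPA values
      print(wilcoxon(usgs, iepa))                       # small p: systematic difference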

  12. Safety and efficacy of silodosin and tadalafil in ease of negotiation of large ureteroscope in the management of ureteral stone: A prospective randomized trial

    PubMed Central

    Bhattar, Rohit; Jain, Vipin; Tomar, Vinay; Yadav, Sher Singh

    2017-01-01

    Objective: To evaluate the safety and efficacy of silodosin and tadalafil in ease of negotiation of a large ureteroscope (8/9.8 Fr) in the management of ureteral stone. Material and methods: Between June 2015 and May 2016, 86 patients presenting with a ureteral stone of size 6–15 mm were, after consent, randomly assigned to 1 of 3 outpatient treatment arms: silodosin (Group A), tadalafil (Group B), and placebo (Group C). After two weeks of therapy, 67 patients underwent ureteroscopy, and ureteral orifice configuration, ureteroscopic negotiation, ureteral dilatation, operating time, procedural complications, and drug-related side effects were noted in each group. Results: Ureteral negotiation was significantly better in Groups A (73.9%) and B (69.6%) as compared to Group C (38.1%) (p<0.01). A statistically significant difference was noted in the requirement for dilatation in Group C (71.4%) as compared to Groups A (26.1%) and B (39.1%) (p<0.01). The ureteral orifice was found to be more dilated in Groups A (69.6%) and B (60.9%) as compared to Group C (28.6%). Mean operating time was statistically lower in Groups A (35.2 min) and B (34.91 min) as compared to Group C (41.14 min) (p<0.01). Conclusion: Both silodosin and tadalafil not only relax ureteral smooth muscle but also ease the forward passage of a large ureteroscope (8/9.8 Fr) without any significant risk of adverse events. PMID:29201512

  13. Construct validity of the Groningen Frailty Indicator established in a large sample of home-dwelling elderly persons: Evidence of stability across age and gender.

    PubMed

    Peters, L L; Boter, H; Burgerhof, J G M; Slaets, J P J; Buskens, E

    2015-09-01

    The primary objective of the present study was to evaluate the validity of the Groningen Frailty Indicator (GFI) in a sample of Dutch elderly persons participating in LifeLines, a large population-based cohort study. Additional aims were to assess differences between frail and non-frail elderly persons and to examine which individual characteristics were associated with frailty. By December 2012, 5712 elderly persons were enrolled in LifeLines and complied with the inclusion criteria of the present study. Mann-Whitney U or Kruskal-Wallis tests were used to assess the variability of GFI scores among elderly subgroups that differed in demographic characteristics, morbidity, obesity, and healthcare utilization. Within subgroups, Kruskal-Wallis tests were also used to examine differences in GFI scores across age groups. Multivariate logistic regression analyses were performed to assess associations between individual characteristics and frailty. The GFI discriminated between subgroups: statistically significantly higher median GFI scores (interquartile range) were found in, e.g., males (1 [0-2]), the oldest old (2 [1-3]), elderly who were single (1 [0-2]), those with lower socioeconomic status (1 [0-3]), those with increasing co-morbidity (2 [1-3]), those who were obese (2 [1-3]), and those who used more healthcare (2 [1-4]). Overall, age had an independent and statistically significant association with GFI scores. Compared with the non-frail, frail elderly persons experienced statistically significantly more chronic stress and more social/psychological problems. In the multivariate logistic regression model, psychological morbidity had the strongest association with frailty. The present study supports the construct validity of the GFI and provides insight into the characteristics of (non)frail community-dwelling elderly persons participating in LifeLines. Copyright © 2015 Elsevier Inc. All rights reserved.
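
    A minimal sketch of the kind of subgroup comparison reported above, using SciPy's Kruskal-Wallis test on hypothetical GFI scores for three age subgroups (scores, group labels, and sizes are illustrative only):

    import numpy as np
    from scipy import stats

    # Hypothetical GFI scores (0-15 scale) for three age subgroups.
    gfi_65_69 = np.array([0, 1, 1, 2, 0, 3, 1, 2, 1, 0])
    gfi_70_79 = np.array([1, 2, 2, 3, 1, 4, 2, 2, 3, 1])
    gfi_80plus = np.array([2, 3, 1, 4, 2, 5, 3, 2, 4, 3])

    # Kruskal-Wallis: do GFI score distributions differ across age groups?
    h_stat, p_value = stats.kruskal(gfi_65_69, gfi_70_79, gfi_80plus)
    print(f"H = {h_stat:.2f}, p = {p_value:.4f}")

    # Medians with interquartile ranges, as reported in the abstract.
    for name, g in [("65-69", gfi_65_69), ("70-79", gfi_70_79), ("80+", gfi_80plus)]:
        q1, med, q3 = np.percentile(g, [25, 50, 75])
        print(f"{name}: median {med:.0f} [{q1:.0f}-{q3:.0f}]")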

  14. On Interestingness Measures for Mining Statistically Significant and Novel Clinical Associations from EMRs

    PubMed Central

    Abar, Orhan; Charnigo, Richard J.; Rayapati, Abner

    2017-01-01

    Association rule mining has received significant attention from both the data mining and machine learning communities. While data mining researchers focus more on designing efficient algorithms to mine rules from large datasets, the learning community has explored applications of rule mining to classification. A major problem with rule mining algorithms is the explosion of rules, even for moderately sized datasets, making it very difficult for end users to identify both statistically significant and potentially novel rules that could lead to interesting new insights and hypotheses. Researchers have proposed many domain-independent interestingness measures with which one can rank the rules and potentially glean useful rules from the top-ranked ones. However, these measures have not been fully explored for rule mining in clinical datasets, owing to the relatively large sizes of the datasets often encountered in healthcare and to limited access to domain experts for review/analysis. In this paper, using an electronic medical record (EMR) dataset of diagnoses and medications from over three million patient visits to the University of Kentucky medical center and affiliated clinics, we conduct a thorough evaluation of dozens of interestingness measures proposed in the data mining literature, including some new composite measures. Using cumulative relevance metrics from information retrieval, we compare these interestingness measures against human judgments obtained from a practicing psychiatrist for association rules involving the depressive disorders class as the consequent. Our results not only surface new interesting associations for depressive disorders but also indicate classes of interestingness measures that weight rule novelty and statistical strength in contrasting ways, offering new insights for end users in identifying interesting rules. PMID:28736771
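
    For concreteness, the sketch below computes four common domain-independent interestingness measures (support, confidence, lift, leverage) for a single rule A -> B from hypothetical co-occurrence counts; the paper evaluates dozens of such measures, which are not reproduced here:

    # Hypothetical counts from N patient visits for a rule A -> B
    # (e.g., A = a medication, B = a diagnosis class).
    N = 1_000_000
    n_a = 50_000    # visits containing A
    n_b = 80_000    # visits containing B
    n_ab = 12_000   # visits containing both A and B

    support = n_ab / N
    confidence = n_ab / n_a
    lift = confidence / (n_b / N)            # >1: A and B co-occur more than chance
    leverage = support - (n_a / N) * (n_b / N)

    print(f"support={support:.4f} confidence={confidence:.3f} "
          f"lift={lift:.2f} leverage={leverage:.4f}")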

  15. Testing for voter rigging in small polling stations

    PubMed Central

    Jimenez, Raúl; Hidalgo, Manuel; Klimek, Peter

    2017-01-01

    Nowadays, a large number of countries combine formal democratic institutions with authoritarian practices. Although in these countries the ruling elites may receive considerable voter support, they often use several manipulation tools to control election outcomes. A common practice of these regimes is the coercion and mobilization of large numbers of voters. This electoral irregularity is known as voter rigging, distinguishing it from vote rigging, which involves ballot stuffing or stealing. We develop a statistical test to quantify the extent to which the results of a particular election display traces of voter rigging. Our key hypothesis is that small polling stations are more susceptible to voter rigging because it is easier to identify opposing individuals, there are fewer eyewitnesses, and interested parties might reasonably expect fewer visits from election observers. We devise a general statistical method for testing whether voting behavior in small polling stations is significantly different from the behavior in their neighbor stations in a way that is consistent with the widespread occurrence of voter rigging. On the basis of a comparative analysis, the method enables third parties to conclude that an explanation other than simple variability is needed to explain geographic heterogeneities in vote preferences. We analyze 21 elections in 10 countries and find significant statistical anomalies compatible with voter rigging in Russia from 2007 to 2011, in Venezuela from 2006 to 2013, and in Uganda in 2011. Particularly disturbing is the case of Venezuela, where the smallest polling stations were decisive to the outcome of the 2013 presidential elections. PMID:28695193
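
    A toy version of the core comparison, assuming it reduces to asking whether vote share for the leading party in small stations exceeds that of neighboring stations more often than chance allows; the sketch below uses a paired permutation test on fabricated shares, whereas the published method is more elaborate:

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical incumbent vote shares: each small polling station paired
    # with the mean share of its neighboring larger stations.
    small = rng.normal(0.58, 0.08, size=200)
    neighbors = rng.normal(0.52, 0.08, size=200)

    diffs = small - neighbors
    observed = diffs.mean()

    # Paired permutation test: under the null hypothesis of no systematic
    # difference, the sign of each paired difference is exchangeable.
    n_perm = 10_000
    count = 0
    for _ in range(n_perm):
        signs = rng.choice([-1, 1], size=diffs.size)
        if (signs * diffs).mean() >= observed:
            count += 1
    p_value = (count + 1) / (n_perm + 1)
    print(f"mean excess share in small stations: {observed:.3f}, p = {p_value:.4f}")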

  17. An entropy-based statistic for genomewide association studies.

    PubMed

    Zhao, Jinying; Boerwinkle, Eric; Xiong, Momiao

    2005-07-01

    Efficient genotyping methods and the availability of a large collection of single-nucleotide polymorphisms provide valuable tools for genetic studies of human disease. The standard χ² statistic for case-control studies, which uses a linear function of allele frequencies, has limited power when the number of marker loci is large. We introduce a novel test statistic for genetic association studies that uses Shannon entropy and a nonlinear function of allele frequencies to amplify the differences in allele and haplotype frequencies to maintain statistical power with large numbers of marker loci. We investigate the relationship between the entropy-based test statistic and the standard χ² statistic and show that, in most cases, the power of the entropy-based statistic is greater than that of the standard χ² statistic. The distribution of the entropy-based statistic and the type I error rates are validated using simulation studies. Finally, we apply the new entropy-based test statistic to two real data sets, one for the COMT gene and schizophrenia and one for the MMP-2 gene and esophageal carcinoma, to evaluate the performance of the new method for genetic association studies. The results show that the entropy-based statistic obtained smaller P values than did the standard χ² statistic.
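
    A rough sketch contrasting the two kinds of statistic on a single biallelic marker, with a simple entropy difference standing in for the paper's construction (the exact form of the published statistic is not reproduced here, and the counts are invented):

    import numpy as np
    from scipy import stats

    # Hypothetical allele counts at one SNP: rows = cases/controls, cols = alleles.
    table = np.array([[420, 180],
                      [350, 250]])

    # Standard chi-squared test on the contingency table.
    chi2, p_chi2, dof, _ = stats.chi2_contingency(table)

    def shannon(freqs):
        freqs = freqs[freqs > 0]
        return -np.sum(freqs * np.log(freqs))

    # A nonlinear, entropy-based contrast of allele frequencies; the published
    # statistic is constructed differently but shares this nonlinearity.
    f_case = table[0] / table[0].sum()
    f_ctrl = table[1] / table[1].sum()
    entropy_contrast = abs(shannon(f_case) - shannon(f_ctrl))

    print(f"chi-squared = {chi2:.2f} (p = {p_chi2:.4f}); "
          f"|entropy difference| = {entropy_contrast:.4f}")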

  18. "Hold the Phone!": Cell Phone Use and Partner Reaction among University Students

    ERIC Educational Resources Information Center

    Beaver, Tiffany; Knox, David; Zusman, Marty E.

    2010-01-01

    Analysis of survey data from 995 undergraduates at a large southeastern university revealed that 93% reported owning a cell phone, with statistically significant ownership differences between women and men (95% versus 91.2%) and between Whites (95.1%) and Blacks (87.7%). In addition, Blacks were twice as likely as Whites to be bothered by their partner's use…

  19. Intervention for First Graders with Limited Number Knowledge: Large-Scale Replication of a Randomized Controlled Trial

    ERIC Educational Resources Information Center

    Gersten, Russell; Rolfhus, Eric; Clarke, Ben; Decker, Lauren E.; Wilkins, Chuck; Dimino, Joseph

    2015-01-01

    Replication studies are extremely rare in education. This randomized controlled trial (RCT) is a scale-up replication of Fuchs et al., which, in a sample of 139, found a statistically significant positive impact of Number Rockets, a small-group intervention for at-risk first graders focused on building understanding of number operations. The…

  20. The Effectiveness of School-Type Classes Compared to the Traditional Lecture/Tutorial Method for Teaching Quantitative Methods to Business Students.

    ERIC Educational Resources Information Center

    Goldfinch, Judy

    1996-01-01

    A study compared the effectiveness of two methods (medium-size class instruction and large lectures with tutorial sessions) for teaching mathematics and statistics to first-year business students. Students and teachers overwhelmingly preferred the medium-size class method, which produced higher exam scores but had no significant effect on…

  1. A Comparative Study on the Effectiveness of English-Medium and Turkish-Medium Accounting Education: Gazi University Case

    ERIC Educational Resources Information Center

    Zaif, Figen; Karapinar, Aydin; Yangin Eksi, Gonca

    2017-01-01

    The authors explore the effect of medium of instruction on students' attainment in the Department of Business Administration at a large state university in Turkey. The findings indicate no statistically significant difference in the grades of 386 students with respect to medium of instruction. As for entry rankings, however, those in the…

  2. On the association between synoptic circulation and wildfires in the Eastern Mediterranean

    NASA Astrophysics Data System (ADS)

    Papadopoulos, A.; Paschalidou, A. K.; Kassomenos, P. A.; McGregor, G.

    2014-02-01

    In the present paper, cluster analysis of 2-month air-mass back-trajectories is conducted for three contrasting fire and non-fire events (high, low, and zero burnt area). The large fire event displays an air-mass history dissimilar to the other events, whereby a 39-day period of warm and dry, chiefly northerly, anticyclonic conditions is evident, followed by a week of warmer, predominantly southwesterly cyclonic activity immediately prior to ignition. The pressure level of these anticyclonic air masses is above 800 hPa for more than 75% of the trajectory length; this region lies above the principal moisture transport regime, which is confined below the 800 hPa altitude. Analysis of variance on the mean rate of change of potential temperature identified weak but statistically significant differences between two air-mass pairs for the large fire: anticyclonic and cyclonic air masses in both cases (p = 0.038 and p = 0.020). Such regularity of type and occurrence, approach pressure levels, and statistically significant differences are not evident for the small and non-fire event air masses. Such understanding is expected to permit appropriate steps to be undertaken, including superior prediction and improved suppression strategy.

  3. Drivers and seasonal predictability of extreme wind speeds in the ECMWF System 4 and a statistical model

    NASA Astrophysics Data System (ADS)

    Walz, M. A.; Donat, M.; Leckebusch, G. C.

    2017-12-01

    As extreme wind speeds are responsible for large socio-economic losses in Europe, a skillful prediction would be of great benefit for disaster prevention as well as for the actuarial community. Here we evaluate patterns of large-scale atmospheric variability and the seasonal predictability of extreme wind speeds (e.g. >95th percentile) in the European domain in the dynamical seasonal forecast system ECMWF System 4, and compare to the predictability based on a statistical prediction model. The dominant patterns of atmospheric variability show distinct differences between reanalysis and ECMWF System 4, with most patterns in System 4 extended downstream in comparison to ERA-Interim. The dissimilar manifestations of the patterns within the two models lead to substantially different drivers associated with the occurrence of extreme winds in the respective model. While the ECMWF System 4 is shown to provide some predictive power over Scandinavia and the eastern Atlantic, only very few grid cells in the European domain have significant correlations for extreme wind speeds in System 4 compared to ERA-Interim. In contrast, a statistical model predicts extreme wind speeds during boreal winter in better agreement with the observations. Our results suggest that System 4 does not seem to capture the potential predictability of extreme winds that exists in the real world, and therefore fails to provide reliable seasonal predictions for lead months 2-4. This is likely related to the unrealistic representation of large-scale patterns of atmospheric variability. Hence our study points to potential improvements of dynamical prediction skill by improving the simulation of large-scale atmospheric dynamics.

  4. Experimental Study of the Effect of the Initial Spectrum Width on the Statistics of Random Wave Groups

    NASA Astrophysics Data System (ADS)

    Shemer, L.; Sergeeva, A.

    2009-12-01

    The statistics of a random water wave field determine the probability of appearance of extremely high (freak) waves. This probability is strongly related to the spectral characteristics of the wave field. Laboratory investigation of the spatial variation of random wave-field statistics for various initial conditions is thus of substantial practical importance. Unidirectional nonlinear random wave groups were investigated experimentally in the 300 m long Large Wave Channel (GWK) in Hannover, Germany, the biggest facility of its kind in Europe. Numerous realizations of a wave field with the prescribed frequency power spectrum, yet randomly distributed initial phases of each harmonic, were generated by a computer-controlled piston-type wavemaker. Several initial spectral shapes with identical dominant wave length but different width were considered. For each spectral shape, the total duration of sampling in all realizations was long enough to yield a sufficient sample size for reliable statistics. Throughout the experiments, an effort was made to retain the characteristic wave height and thus the degree of nonlinearity of the wave field. Spatial evolution of numerous statistical wave-field parameters (skewness, kurtosis, and probability distributions) was studied using about 25 wave gauges distributed along the tank. It was found that, depending on the initial spectral shape, the frequency spectrum of the wave field may undergo significant modification in the course of its evolution along the tank; the values of all statistical wave parameters are strongly related to the local spectral width. A sample of the measured wave height probability functions (scaled by the variance of surface elevation) is plotted in Fig. 1 for the initially narrow rectangular spectrum. The results in Fig. 1 resemble findings obtained in [1] for the initial Gaussian spectral shape. The probability of large waves notably surpasses that predicted by the Rayleigh distribution and is highest at a distance of about 100 m. Acknowledgement This study was carried out in the framework of the EC-supported project "Transnational access to large-scale tests in the Large Wave Channel (GWK) of Forschungszentrum Küste" (Contract HYDRALAB III - No. 022441). [1] L. Shemer and A. Sergeeva, J. Geophys. Res. Oceans 114, C01015 (2009). Figure 1. Variation along the tank of the measured wave height distribution for the rectangular initial spectral shape; carrier wave period T0 = 1.5 s.

  5. Increasing large scale windstorm damage in Western, Central and Northern European forests, 1951-2010

    NASA Astrophysics Data System (ADS)

    Gregow, H.; Laaksonen, A.; Alper, M. E.

    2017-04-01

    Using reports of forest losses caused directly by large scale windstorms (or primary damage, PD) from the European forest institute database (comprising 276 PD reports from 1951-2010), total growing stock (TGS) statistics of European forests and the daily North Atlantic Oscillation (NAO) index, we identify a statistically significant change in storm intensity in Western, Central and Northern Europe (17 countries). Using the validated set of storms, we found that the year 1990 represents a change-point at which the average intensity of the most destructive storms indicated by PD/TGS > 0.08% increased by more than a factor of three. A likelihood ratio test provides strong evidence that the change-point represents a real shift in the statistical behaviour of the time series. All but one of the seven catastrophic storms (PD/TGS > 0.2%) occurred since 1990. Additionally, we detected a related decrease in September-November PD/TGS and an increase in December-February PD/TGS. Our analyses point to the possibility that the impact of climate change on the North Atlantic storms hitting Europe has started during the last two and half decades.
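
    A schematic version of the change-point analysis, assuming a likelihood ratio comparison of a one-segment Gaussian model against the best two-segment split of an annual intensity series (the series and the Gaussian likelihood are placeholders; the paper's exact test may differ):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)

    # Placeholder annual intensity series, 1951-2010, with a shift in 1990.
    years = np.arange(1951, 2011)
    series = np.where(years < 1990, rng.normal(1.0, 0.5, years.size),
                      rng.normal(3.0, 0.5, years.size))

    def gauss_loglik(x):
        # Log-likelihood of x under a Gaussian with its own MLE mean/std.
        return np.sum(stats.norm.logpdf(x, loc=x.mean(), scale=x.std() + 1e-12))

    # Null: one segment. Alternative: split the series at candidate year k.
    ll_null = gauss_loglik(series)
    best_lr, best_year = -np.inf, None
    for k in range(5, years.size - 5):
        lr = 2 * (gauss_loglik(series[:k]) + gauss_loglik(series[k:]) - ll_null)
        if lr > best_lr:
            best_lr, best_year = lr, years[k]

    print(f"best change-point: {best_year}, likelihood ratio statistic: {best_lr:.1f}")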

  7. Using Residential Solar PV Quote Data to Analyze the Relationship Between Installer Pricing and Firm Size

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Shaughnessy, Eric; Margolis, Robert

    2017-04-01

    The vast majority of U.S. residential solar PV installers are small local-scale companies; however, the industry is relatively concentrated in a few large national-scale installers. We develop a novel approach using solar PV quote data to study the price behavior of large solar PV installers in the United States. Through a paired differences approach, we find that large installer quotes are higher, on average, than non-large installer quotes made to the same customer. The difference is statistically significant and robust after controlling for factors such as system size, equipment quality, and time effects. The results suggest that low prices are not the primary value proposition of large installer systems. We explore several hypotheses for this finding, including that large installers are able to exercise some market power and/or earn returns from their reputations.

  8. Mechanical and statistical evidence of the causality of human-made mass shifts on the Earth's upper crust and the occurrence of earthquakes

    NASA Astrophysics Data System (ADS)

    Klose, Christian D.

    2013-01-01

    A global catalog of small- to large-sized earthquakes was systematically analyzed to identify causality and correlations between human-made mass shifts in the upper Earth's crust and the occurrence of earthquakes. The mass shifts, ranging between 1 kt and 1 Tt, result from large-scale geoengineering operations, including mining, water reservoirs, hydrocarbon production, fluid injection/extraction, deep geothermal energy production, and coastal management. This article shows evidence that geomechanical relationships exist, with statistical significance, between (a) seismic moment magnitudes M of observed earthquakes, (b) lateral distances of the earthquake hypocenters to the geoengineering "operation points" and (c) mass removals or accumulations on the Earth's crust. Statistical findings depend on uncertainties, in particular, of source parameter estimations of seismic events before instrumental recording. Statistical observations, however, indicate that every second seismic event tends to occur after about a decade. The chance of an earthquake nucleating after 2 or 20 years near an area with a significant mass shift is 25% or 75%, respectively. Moreover, causative effects of seismic activities depend highly on the tectonic stress regime in which the operations take place (i.e., extensive, transverse or compressive). Results are summarized as follows: First, seismic moment magnitudes increase the more mass is locally shifted on the Earth's crust. Second, seismic moment magnitudes increase the larger the area in the crust that is geomechanically polluted. Third, reverse faults tend to be more trigger-sensitive than normal faults due to a stronger alteration of the minimum vertical principal stress component. Pure strike-slip faults seem to rupture randomly and independently of the magnitude of the mass changes. Finally, mainly due to high estimation uncertainties of source parameters, in particular for shallow seismic events (<10 km), it remains very difficult to discriminate between induced and triggered earthquakes with respect to the data catalog of this study. However, first analyses indicate that small- to medium-sized earthquakes (M6) seem to be triggered.

  9. Simulating statistics of lightning-induced and man-made fires

    NASA Astrophysics Data System (ADS)

    Krenn, R.; Hergarten, S.

    2009-04-01

    The frequency-area distributions of forest fires show power-law behavior with scaling exponents α in a quite narrow range, relating wildfire research to the theoretical framework of self-organized criticality. Examples of self-organized critical behavior can be found in computer simulations of simple cellular automata. The established self-organized critical Drossel-Schwabl forest fire model (DS-FFM) is one of the most widespread models in this context. Despite its qualitative agreement with event-size statistics from nature, its applicability is still questioned: apart from general concerns that the DS-FFM apparently oversimplifies the complex nature of forest dynamics, it significantly overestimates the frequency of large fires. We present a straightforward modification of the model rules that increases the scaling exponent α by approximately 1/3 and brings the simulated event-size statistics close to those observed in nature. In addition, combined simulations of both the original and the modified model predict a dependence of the overall distribution on the ratio of lightning-induced to man-made fires, as well as a difference between their respective event-size statistics. The increase of the scaling exponent with decreasing lightning probability, as well as the splitting of the partial distributions, is confirmed by analysis of the Canadian Large Fire Database. As a consequence, lightning-induced and man-made forest fires cannot be treated separately in wildfire modeling, hazard assessment, and forest management.
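
    A compact sketch of the classic Drossel-Schwabl forest-fire cellular automaton that the abstract builds on (grid size, parameters, and step count are arbitrary; the modified growth rule proposed in the paper is not reproduced here):

    import numpy as np

    rng = np.random.default_rng(2)

    L = 128             # grid side length
    p_grow = 0.05       # tree growth probability per empty cell per step
    f_lightning = 1e-4  # lightning strike probability per tree per step
    grid = np.zeros((L, L), dtype=bool)  # True = tree
    fire_sizes = []

    def burn_cluster(grid, i, j):
        # Burn the connected tree cluster containing (i, j); return its size.
        stack, size = [(i, j)], 0
        while stack:
            x, y = stack.pop()
            if grid[x, y]:
                grid[x, y] = False
                size += 1
                for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nx, ny = (x + dx) % L, (y + dy) % L
                    if grid[nx, ny]:
                        stack.append((nx, ny))
        return size

    for step in range(5_000):
        # Growth: empty cells become trees with probability p_grow.
        grid |= rng.random((L, L)) < p_grow
        # Lightning: each tree is struck independently with probability f_lightning.
        strikes = np.argwhere(grid & (rng.random((L, L)) < f_lightning))
        for i, j in strikes:
            s = burn_cluster(grid, i, j)
            if s:
                fire_sizes.append(s)

    print(f"{len(fire_sizes)} fires; largest burned {max(fire_sizes)} cells")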

  10. Trends and fluctuations in the severity of interstate wars

    PubMed Central

    Clauset, Aaron

    2018-01-01

    Since 1945, there have been relatively few large interstate wars, especially compared to the preceding 30 years, which included both World Wars. This pattern, sometimes called the long peace, is highly controversial. Does it represent an enduring trend caused by a genuine change in the underlying conflict-generating processes? Or is it consistent with a highly variable but otherwise stable system of conflict? Using the empirical distributions of interstate war sizes and onset times from 1823 to 2003, we parameterize stationary models of conflict generation that can distinguish trends from statistical fluctuations in the statistics of war. These models indicate that both the long peace and the period of great violence that preceded it are not statistically uncommon patterns in realistic but stationary conflict time series. This fact does not detract from the importance of the long peace or the proposed mechanisms that explain it. However, the models indicate that the postwar pattern of peace would need to endure at least another 100 to 140 years to become a statistically significant trend. This fact places an implicit upper bound on the magnitude of any change in the true likelihood of a large war after the end of the Second World War. The historical patterns of war thus seem to imply that the long peace may be substantially more fragile than proponents believe, despite recent efforts to identify mechanisms that reduce the likelihood of interstate wars. PMID:29507877
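
    In the spirit of the stationary-model argument, the toy simulation below asks how often a 70-year stretch free of very large wars arises by chance when onsets follow a stationary Poisson process and war sizes follow a heavy-tailed Pareto law (all rates, exponents, and thresholds are invented for illustration and are not the paper's fitted values):

    import numpy as np

    rng = np.random.default_rng(3)

    # Toy stationary model: war onsets as a Poisson process, sizes as a Pareto.
    years = 180          # roughly 1823-2003
    rate = 0.5           # expected onsets per year (assumed)
    alpha = 0.5          # Pareto tail exponent for battle deaths (assumed)
    threshold = 1e6      # "very large" war: over a million battle deaths

    n_runs, quiet_runs = 10_000, 0
    for _ in range(n_runs):
        n_wars = rng.poisson(rate * years)
        onsets = rng.uniform(0, years, n_wars)
        sizes = 1e3 * (1 + rng.pareto(alpha, n_wars))  # minimum size 1000 deaths
        big = np.sort(onsets[sizes > threshold])
        # Longest stretch without a very large war (including the endpoints).
        gaps = np.diff(np.concatenate(([0.0], big, [float(years)])))
        if gaps.max() >= 70:
            quiet_runs += 1

    print(f"fraction of runs containing a 70-year 'long peace': {quiet_runs / n_runs:.2f}")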

  11. Long-term variability of the thunderstorm and hail potential in Europe

    NASA Astrophysics Data System (ADS)

    Mohr, Susanna; Kunz, Michael; Speidel, Johannes; Piper, David

    2016-04-01

    Severe thunderstorms and associated hazardous weather events such as hail frequently cause considerable damage to buildings, crops, and automobiles, resulting in large monetary costs in many parts of Europe and the world. To relate single extreme hail events to the historical context and to estimate their return periods and possible trends related to climate change, long-term statistics of hail events are required. Due to the local-scale nature of hail and a lack of suitable observation systems, however, hailstorms are not captured reliably and comprehensively over long periods of time. In view of this fact, different proxies (indirect climate data) obtained from sounding stations and regional climate models can be used to infer the probability and intensity of thunderstorms or hailstorms. In contrast to direct observational data, such proxies are available homogeneously over a long time period. The aim of the study is to investigate the potential for severe thunderstorms and its changes over past decades. Statistical analyses of sounding data show that the convective potential over the past 20-30 years has significantly increased over large parts of Central Europe, making severe thunderstorms more likely. A similar picture results from analyses of the weather types most likely associated with damaging hailstorms. These weather patterns have increased, slightly but statistically significantly, over the period from 1971 to 2000. To improve the diagnostics of hail events in regional climate models, a logistic hail model has been developed by means of a multivariate analysis method. The model is based on a combination of appropriate hail-relevant meteorological parameters. The output of the model is a new index that estimates the potential of the atmosphere for hailstorm development, referred to as the potential hail index (PHI). Applied to a high-resolution reanalysis run for Europe driven by NCEP/NCAR1, long-term changes of the PHI over 60 years (1951-2010) show large annual and multiannual variability. The trends are mostly positive in the western parts and negative to the east. However, due to the large temporal variability, the trends are not significant at most of the grid points. Furthermore, it becomes clear that the environmental conditions that favor the formation of hailstorms prevail over larger areas. This finding suggests that, despite the local-scale nature of convective storms, the ambient conditions favoring these events are mainly controlled by large-scale circulation patterns and mechanisms. This result is important for estimating the convective potential of the atmosphere in the case of single events.
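
    A minimal sketch of the logistic-model idea behind a potential hail index: combine hail-relevant predictors into a hail probability via logistic regression. The predictors, synthetic labels, and coefficients below are entirely illustrative and are not those of the published PHI:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(4)

    # Illustrative training set: rows = days, columns = hail-relevant
    # parameters (e.g., CAPE in J/kg, deep-layer shear in m/s, lifted index).
    X = np.column_stack([
        rng.gamma(2.0, 400.0, 2000),   # CAPE-like
        rng.normal(10.0, 5.0, 2000),   # shear-like
        rng.normal(0.0, 3.0, 2000),    # lifted-index-like
    ])
    # Synthetic hail-day labels loosely tied to the predictors.
    logit = 0.002 * X[:, 0] + 0.1 * X[:, 1] - 0.3 * X[:, 2] - 4.0
    y = rng.random(2000) < 1 / (1 + np.exp(-logit))

    model = LogisticRegression(max_iter=1000).fit(X, y)
    phi = model.predict_proba(X)[:, 1]   # "potential hail index" per day
    print(f"mean PHI {phi.mean():.3f}, hail-day frequency {y.mean():.3f}")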

  12. Statistical Models for Predicting Automobile Driving Postures for Men and Women Including Effects of Age.

    PubMed

    Park, Jangwoon; Ebert, Sheila M; Reed, Matthew P; Hallman, Jason J

    2016-03-01

    Previously published statistical models of driving posture have been effective for vehicle design but have not taken into account the effects of age. The present study developed new statistical models for predicting driving posture. Driving postures of 90 U.S. drivers with a wide range of age and body size were measured in a laboratory mockup in nine package conditions. Posture-prediction models for female and male drivers were developed separately by employing a stepwise regression technique using age, body dimensions, vehicle package conditions, and two-way interactions, among other variables. Driving posture was significantly associated with age, and the effects of other variables depended on age. A set of posture-prediction models is presented for women and men, and the results are compared with a previously developed model. The present study is the first study of driver posture to include a large cohort of older drivers and the first to report a significant effect of age. The posture-prediction models can be used to position computational human models or crash-test dummies for vehicle design and assessment. © 2015, Human Factors and Ergonomics Society.

  13. Publication Bias ( The "File-Drawer Problem") in Scientific Inference

    NASA Technical Reports Server (NTRS)

    Scargle, Jeffrey D.; DeVincenzi, Donald (Technical Monitor)

    1999-01-01

    Publication bias arises whenever the probability that a study is published depends on the statistical significance of its results. This bias, often called the file-drawer effect because the unpublished results are imagined to be tucked away in researchers' file cabinets, is potentially a severe impediment to combining the statistical results of studies collected from the literature. With almost any reasonable quantitative model for publication bias, only a small number of studies lost in the file drawer will produce a significant bias. This result contradicts the well-known Fail Safe File Drawer (FSFD) method for setting limits on the potential harm of publication bias, widely used in social, medical, and psychic research. This method incorrectly treats the file drawer as unbiased and almost always misestimates the seriousness of publication bias. A large body of not only psychic research but also medical and social science studies has mistakenly relied on this method to validate claimed discoveries. Statistical combination can be trusted only if it is known with certainty that all studies that have been carried out are included. Such certainty is virtually impossible to achieve in literature surveys.
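
    A small simulation of the central claim, that even a modest file drawer of suppressed null results biases the combined effect estimate; the effect size, study counts, and publish-if-significant rule below are illustrative assumptions:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)

    true_effect = 0.0   # there is no real effect
    n_studies, n_per_study = 200, 30

    published, file_drawer = [], 0
    for _ in range(n_studies):
        sample = rng.normal(true_effect, 1.0, n_per_study)
        t, p = stats.ttest_1samp(sample, 0.0)
        # Publication rule (assumed): significant positive results are always
        # published; everything else reaches print only 40% of the time.
        if (p < 0.05 and t > 0) or rng.random() < 0.4:
            published.append(sample.mean())
        else:
            file_drawer += 1

    print(f"published {len(published)} studies, {file_drawer} left in the file drawer")
    print(f"combined published effect: {np.mean(published):+.3f} (true effect is 0)")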

  14. Three-dimensional accuracy of different correction methods for cast implant bars

    PubMed Central

    Kwon, Ji-Yung; Kim, Chang-Whe; Lim, Young-Jun; Kwon, Ho-Beom

    2014-01-01

    PURPOSE The aim of the present study was to evaluate the accuracy of three techniques for correction of cast implant bars. MATERIALS AND METHODS Thirty cast implant bars were fabricated on a metal master model. All cast implant bars were sectioned at 5 mm from the left gold cylinder using a disk of 0.3 mm thickness, and each group of ten specimens was then corrected by gas-air torch soldering, laser welding, or an additional casting technique. Three-dimensional evaluation, including horizontal, vertical, and twisting measurements, was based on measurement and comparison of (1) gap distances at the right abutment replica-gold cylinder interface on the buccal, distal, and lingual sides, (2) changes in bar length, and (3) axis angle changes of the right gold cylinders, performed at the post-correction measurement step on the three groups with contact and non-contact coordinate measuring machines. One-way analysis of variance (ANOVA) and the paired t-test were performed at the 5% significance level. RESULTS Gap distances of the cast implant bars after the correction procedures showed no statistically significant difference among groups. Changes in bar length between the pre-casting and post-correction measurements were statistically significant among groups. Axis angle changes of the right gold cylinders were not statistically significant among groups. CONCLUSION There was no statistically significant difference among the three techniques in horizontal, vertical, and axial errors, but the gas-air torch soldering technique showed the most consistent and accurate trend in the correction of implant bar error. The laser welding technique showed a large mean and standard deviation in the vertical and twisting measurements and may be a technique-sensitive method. PMID:24605205

  15. Rapid Classification and Identification of Multiple Microorganisms with Accurate Statistical Significance via High-Resolution Tandem Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Alves, Gelio; Wang, Guanghui; Ogurtsov, Aleksey Y.; Drake, Steven K.; Gucek, Marjan; Sacks, David B.; Yu, Yi-Kuo

    2018-06-01

    Rapid and accurate identification and classification of microorganisms is of paramount importance to public health and safety. With the advance of mass spectrometry (MS) technology, the speed of identification can be greatly improved. However, the increasing number of microbes sequenced is complicating correct microbial identification even in a simple sample due to the large number of candidates present. To properly untwine candidate microbes in samples containing one or more microbes, one needs to go beyond apparent morphology or simple "fingerprinting"; to correctly prioritize the candidate microbes, one needs to have accurate statistical significance in microbial identification. We meet these challenges by using peptide-centric representations of microbes to better separate them and by augmenting our earlier analysis method that yields accurate statistical significance. Here, we present an updated analysis workflow that uses tandem MS (MS/MS) spectra for microbial identification or classification. We have demonstrated, using 226 MS/MS publicly available data files (each containing from 2500 to nearly 100,000 MS/MS spectra) and 4000 additional MS/MS data files, that the updated workflow can correctly identify multiple microbes at the genus and often the species level for samples containing more than one microbe. We have also shown that the proposed workflow computes accurate statistical significances, i.e., E values for identified peptides and unified E values for identified microbes. Our updated analysis workflow MiCId, a freely available software for Microorganism Classification and Identification, is available for download at https://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads.html.

  17. A retrospective analysis of eye conditions among children attending St. John Eye Hospital, Hebron, Palestine.

    PubMed

    Banayot, Riyad G

    2016-04-05

    Eye diseases are important causes of medical consultations, with the spectrum varying in different regions. This hospital-based descriptive study aimed to determine the profile of childhood eye conditions at St. John tertiary Eye Hospital in Hebron, Palestine. Files of all new patients less than 16 years old who presented to St. John Eye Hospital-Hebron, Palestine between January 2013 and December 2013 were retrospectively reviewed. Age at presentation, sex, and clinical diagnosis were extracted from medical records. Data were stored and analyzed using Wizard data analysis version 1.6.0 by Evan Miller. The chi-square test was used to compare variables, and a p value of less than 0.05 was considered statistically significant. We evaluated the records of 1102 patients, with a female:male ratio of 1:1.1. Patients aged 0-5 years were the largest group (40.2%). Refractive errors were the most common ocular disorders seen (31.6%), followed by conjunctival diseases (23.7%) and strabismus and amblyopia (13.8%). Refractive errors were recorded significantly more frequently (p < 0.001) in the 11-15-year age group. Within the conjunctival diseases category, conjunctivitis and dry eye were significantly more prominent (p < 0.001) in the 6-10-year age group. Within the strabismus and amblyopia category, convergent strabismus was significantly more common in the youngest age group (0-5 years). The most common causes of ocular morbidity are largely treatable or preventable. These results suggest the need for awareness campaigns and early intervention programs.

  18. Topology of Neutral Hydrogen within the Small Magellanic Cloud

    NASA Astrophysics Data System (ADS)

    Chepurnov, A.; Gordon, J.; Lazarian, A.; Stanimirovic, S.

    2008-12-01

    In this paper, genus statistics have been applied to an H I column density map of the Small Magellanic Cloud in order to study its topology. To learn how topology changes with the scale of the system, we provide topology studies for column density maps at varying resolutions. To evaluate the statistical error of the genus, we randomly reassign the phases of the Fourier modes while keeping the amplitudes. We find that at the smallest scales studied (40 pc <= λ <= 80 pc), the genus shift is negative in all regions, implying a clump topology. At the larger scales (110 pc <= λ <= 250 pc), the topology shift is detected to be negative (a "meatball" topology) in four cases and positive (a "swiss cheese" topology) in two cases. In four regions, there is no statistically significant topology shift at large scales.
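
    A compact illustration of a two-dimensional genus statistic as commonly defined for column density maps: at each threshold, count connected regions above the threshold minus connected regions below it. The random field below stands in for the H I map, and smoothing and edge corrections are ignored:

    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(6)

    # Stand-in for an H I column density map: a smoothed Gaussian random field.
    field = ndimage.gaussian_filter(rng.normal(size=(256, 256)), sigma=8)
    field = (field - field.mean()) / field.std()

    # Genus at threshold nu: isolated regions above nu minus isolated regions
    # below nu. A negative shift indicates clump ("meatball") topology, a
    # positive shift a hole ("swiss cheese") topology.
    for nu in (-1.0, 0.0, 1.0):
        _, n_clumps = ndimage.label(field > nu)
        _, n_holes = ndimage.label(field < nu)
        print(f"nu = {nu:+.1f}: genus = {n_clumps - n_holes:+d} "
              f"({n_clumps} clumps, {n_holes} holes)")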

  19. Modeling coverage gaps in haplotype frequencies via Bayesian inference to improve stem cell donor selection.

    PubMed

    Louzoun, Yoram; Alter, Idan; Gragert, Loren; Albrecht, Mark; Maiers, Martin

    2018-05-01

    Regardless of sampling depth, accurate genotype imputation is limited in regions of high polymorphism which often have a heavy-tailed haplotype frequency distribution. Many rare haplotypes are thus unobserved. Statistical methods to improve imputation by extending reference haplotype distributions using linkage disequilibrium patterns that relate allele and haplotype frequencies have not yet been explored. In the field of unrelated stem cell transplantation, imputation of highly polymorphic human leukocyte antigen (HLA) genes has an important application in identifying the best-matched stem cell donor when searching large registries totaling over 28,000,000 donors worldwide. Despite these large registry sizes, a significant proportion of searched patients present novel HLA haplotypes. Supporting this observation, HLA population genetic models have indicated that many extant HLA haplotypes remain unobserved. The absent haplotypes are a significant cause of error in haplotype matching. We have applied a Bayesian inference methodology for extending haplotype frequency distributions, using a model where new haplotypes are created by recombination of observed alleles. Applications of this joint probability model offer significant improvement in frequency distribution estimates over the best existing alternative methods, as we illustrate using five-locus HLA frequency data from the National Marrow Donor Program registry. Transplant matching algorithms and disease association studies involving phasing and imputation of rare variants may benefit from this statistical inference framework.
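
    A stylized sketch of the core idea: extend an observed haplotype frequency distribution by giving unobserved haplotypes, formed by recombining observed alleles, prior mass proportional to the product of their allele frequencies, then renormalizing. The loci, allele names, frequencies, and the epsilon mass are invented, and the published Bayesian model is far richer:

    from itertools import product

    # Observed two-locus haplotype frequencies (hypothetical HLA-style data).
    observed = {("A*01", "B*08"): 0.45,
                ("A*02", "B*07"): 0.35,
                ("A*03", "B*08"): 0.20}

    # Marginal allele frequencies at each locus.
    freq_a, freq_b = {}, {}
    for (a, b), f in observed.items():
        freq_a[a] = freq_a.get(a, 0.0) + f
        freq_b[b] = freq_b.get(b, 0.0) + f

    # Reserve a small probability mass (an assumption) for haplotypes never
    # observed, built by recombining observed alleles and weighted by the
    # product of their allele frequencies.
    epsilon = 0.10
    novel = {(a, b): freq_a[a] * freq_b[b]
             for a, b in product(freq_a, freq_b) if (a, b) not in observed}
    z = sum(novel.values())

    extended = {h: f * (1 - epsilon) for h, f in observed.items()}
    extended.update({h: epsilon * w / z for h, w in novel.items()})

    for h, f in sorted(extended.items(), key=lambda kv: -kv[1]):
        print(h, f"{f:.4f}")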

  20. Effects of the Nurse Athlete Program on the Healthy Lifestyle Behaviors, Physical Health, and Mental Well-being of New Graduate Nurses.

    PubMed

    Hrabe, David P; Melnyk, Bernadette Mazurek; Buck, Jacalyn; Sinnott, Loraine T

    Recognizing that the transition from nursing student to point-of-care nurse can be a stressful period in one's career, a pilot study at a large Midwestern medical center tested the preliminary effects of a health-oriented workshop, the Nurse Athlete, on new graduate nurses' healthy lifestyle beliefs, healthy lifestyle behaviors, depressive and anxiety symptoms, and health outcomes. The Nurse Athlete workshop, provided in partnership with Johnson & Johnson's Human Performance Institute (HPI), used materials from HPI's Corporate Athlete program. The 2-day workshop focuses on energy management through a comprehensive examination of goals and values in relation to one's spiritual, mental, emotional, and physical development and provides practical strategies to improve self-care. Eighty-eight new graduate nurses hired at the university's medical center were offered the opportunity to participate in the Nurse Athlete program and associated study. Sixty-nine percent of these new graduate nurses (n = 61) consented and participated in the program. There was a statistically significant decrease in the participants' weight and body mass index from baseline to the 6-month follow-up assessment, corresponding to small to medium positive effects of the Nurse Athlete program. There was also a significant decrease in body fat percentage across time, a large positive intervention effect. Statistically significant reductions in depressive symptoms were measured between baseline and 6 months.

  1. Endothelium dysfunction markers in patients with diabetic retinopathy.

    PubMed

    Siemianowicz, Krzysztof; Francuz, Tomasz; Gminski, Jan; Telega, Alicja; Syzdól, Marcin

    2005-03-01

    Diabetes mellitus leads to endothelium dysfunction and an accelerated progression of atherosclerosis. Vascular complications of diabetes mellitus can affect not only large and medium arteries, resulting in coronary heart disease and peripheral artery disease, but also small vessels, leading to retinopathy and nephropathy. Intercellular adhesion molecule-1 (ICAM-1), vascular adhesion molecule-1 (VCAM-1), E-selectin, and von Willebrand factor (vWF) are considered markers of endothelium dysfunction. The aim of our study was to evaluate plasma levels of ICAM-1, VCAM-1, E-selectin, and vWF in patients with type 2 diabetes mellitus receiving insulin therapy who had non-proliferative diabetic retinopathy, proliferative retinopathy, or no diabetic retinopathy. There were no statistically significant differences between the studied groups in any of the evaluated endothelium dysfunction markers. There was no statistically significant correlation between the measured parameters and the duration of diabetes, and none of the studied markers showed a significant correlation with the duration of insulin treatment.

  2. Human Genetic Variation and Yellow Fever Mortality during 19th Century U.S. Epidemics

    PubMed Central

    2014-01-01

    ABSTRACT We calculated the incidence, mortality, and case fatality rates for Caucasians and non-Caucasians during 19th century yellow fever (YF) epidemics in the United States and determined statistical significance for differences in the rates in different populations. We evaluated nongenetic host factors, including socioeconomic, environmental, cultural, demographic, and acquired immunity status that could have influenced these differences. While differences in incidence rates were not significant between Caucasians and non-Caucasians, differences in mortality and case fatality rates were statistically significant for all epidemics tested (P < 0.01). Caucasians diagnosed with YF were 6.8 times more likely to succumb than non-Caucasians with the disease. No other major causes of death during the 19th century demonstrated a similar mortality skew toward Caucasians. Nongenetic host factors were examined and could not explain these large differences. We propose that the remarkably lower case mortality rates for individuals of non-Caucasian ancestry is the result of human genetic variation in loci encoding innate immune mediators. PMID:24895309

  3. High-Citation Papers in Space Physics: Examination of Gender, Country, and Paper Characteristics

    NASA Astrophysics Data System (ADS)

    Moldwin, Mark B.; Liemohn, Michael W.

    2018-04-01

    The number of citations to a refereed journal article from other refereed journal articles is a measure of its impact. Papers, individuals, journals, departments, and institutions are increasingly judged by the impact they have in their disciplines, and citation counts are now a relatively easy (though not necessarily accurate or straightforward) way of attempting to quantify impact. This study examines papers published in the Journal of Geophysical Research—Space Physics in the year 2012 (n = 705) and analyzes the characteristics of high-citation papers compared to low-citation papers. We find that high-citation papers generally have a large number of authors (>5) and cite significantly more articles in the reference section than low-citation papers. We also examined the gender and country of institution of the first author and found that there is not a statistically significant gender bias, but there are some significant differences in citation statistics between articles based on the country of first-author institution.

  4. Effects of the Self-Regulation Empowerment Program (SREP) on middle school students' strategic skills, self-efficacy, and mathematics achievement.

    PubMed

    Cleary, Timothy J; Velardi, Brittany; Schnaidman, Bracha

    2017-10-01

    The current study examined the effectiveness of an applied self-regulated learning intervention, the Self-Regulation Empowerment Program (SREP), relative to an existing school-based remedial mathematics intervention for improving the motivation, strategic skills, and mathematics achievement of academically at-risk middle school students. Although significant group differences in student self-regulated learning (SRL) were not observed when using self-report questionnaires, medium to large and statistically significant group differences were observed across several contextualized, situation-specific measures of strategic and regulatory thinking. The SREP group also exhibited a statistically significant and more positive trend in achievement scores over two years in middle school relative to the comparison condition. Finally, SREP students and coaches reported SREP to be a socially valid intervention in terms of acceptability and importance. The importance of this study and critical areas for future research are highlighted and discussed. Copyright © 2017 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.

  5. Remote sensing estimation of the total phosphorus concentration in a large lake using band combinations and regional multivariate statistical modeling techniques.

    PubMed

    Gao, Yongnian; Gao, Junfeng; Yin, Hongbin; Liu, Chuansheng; Xia, Ting; Wang, Jing; Huang, Qi

    2015-03-15

    Remote sensing has been widely used for water quality monitoring, but most such studies have focused on only a few water quality variables, such as chlorophyll-a, turbidity, and total suspended solids, which are typically considered optically active. Estimating the phosphorus concentration in water from remote sensing is a challenge. The total phosphorus (TP) in lakes has been estimated from remotely sensed observations primarily using simple individual band ratios, or their natural logarithms, and statistical regression based on field TP data and spectral reflectance. In this study, we investigated the possibility of establishing a spatial modeling scheme to estimate the TP concentration of a large lake from multi-spectral satellite imagery using band combinations and regional multivariate statistical modeling techniques, and we tested the applicability of the scheme. The results showed that HJ-1A CCD multi-spectral satellite imagery can be used to estimate the TP concentration in a lake. Correlation and regression analysis showed a highly significant positive relationship between the TP concentration and certain remotely sensed combination variables. The proposed modeling scheme achieved higher accuracy for TP concentration estimation in the large lake than the traditional individual band ratio method and the whole-lake-scale regression-modeling scheme. The TP concentration values showed clear spatial variability: high in western Lake Chaohu and relatively low in eastern Lake Chaohu. The northernmost portion, the northeastern coastal zone, and the southeastern portion of western Lake Chaohu had the highest TP concentrations, and the other regions had the lowest values, except for the coastal zone of eastern Lake Chaohu. These results strongly suggest that the proposed modeling scheme, i.e., band combinations combined with regional multivariate statistical modeling, has advantages for estimating the TP concentration in a large lake and strong potential for universal application to TP concentration estimation in large lake waters worldwide. Copyright © 2014 Elsevier Ltd. All rights reserved.
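
    A simplified sketch of the band-combination regression idea: construct candidate predictors from band ratios and combinations, then fit a multivariate linear model against field-measured TP. The band values, chosen combinations, and coefficients are placeholders, not the paper's calibration:

    import numpy as np

    rng = np.random.default_rng(7)

    # Placeholder reflectances for HJ-1A CCD-like bands at n field stations.
    n = 120
    b1, b2, b3, b4 = (rng.uniform(0.02, 0.30, n) for _ in range(4))

    # Candidate band combinations (common practice: ratios and differences).
    X = np.column_stack([b3 / b2, b4 / b3, (b3 - b1) / (b3 + b1), np.ones(n)])

    # Synthetic "measured" TP loosely tied to the combinations (mg/L).
    tp = 0.08 * X[:, 0] + 0.05 * X[:, 1] - 0.03 * X[:, 2] + 0.06
    tp += rng.normal(0, 0.01, n)

    # Ordinary least squares for the multivariate statistical model.
    coef, *_ = np.linalg.lstsq(X, tp, rcond=None)
    pred = X @ coef
    r2 = 1 - np.sum((tp - pred) ** 2) / np.sum((tp - tp.mean()) ** 2)
    print("coefficients:", np.round(coef, 4), f"R^2 = {r2:.3f}")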

  6. Validation of the Carotid Intima-Media Thickness Variability: Can Manual Segmentations Be Trusted as Ground Truth?

    PubMed

    Meiburger, Kristen M; Molinari, Filippo; Wong, Justin; Aguilar, Luis; Gallo, Diego; Steinman, David A; Morbiducci, Umberto

    2016-07-01

    The common carotid artery intima-media thickness (IMT) is widely accepted and used as an indicator of atherosclerosis. Recent studies, however, have found that the irregularity of the IMT along the carotid artery wall has a stronger correlation with atherosclerosis than the IMT itself. We set out to validate IMT variability (IMTV), a parameter defined to assess IMT irregularities along the wall. In particular, we analyzed whether manual segmentations of the lumen-intima and media-adventitia interfaces can be considered reliable for calculation of the IMTV parameter. To do this, we used a total of 60 simulated ultrasound images with a priori IMT and IMTV values. The images, simulated using the Fast And Mechanistic Ultrasound Simulation software, presented five different morphologies, four nominal IMT values, and three levels of variability along the carotid artery wall (no variability, small variability, and large variability). Three experts traced the lumen-intima (LI) and media-adventitia (MA) profiles, and two automated algorithms were employed to obtain the LI and MA profiles. One expert also re-traced the LI and MA profiles to test intra-reader variability. The average IMTV measurements of the profiles used to simulate the longitudinal B-mode images were 0.002 ± 0.002, 0.149 ± 0.035 and 0.286 ± 0.068 mm for the cases of no variability, small variability and large variability, respectively. The IMTV measurements of one of the automated algorithms were statistically similar (p > 0.05, Wilcoxon signed rank) for the small- and large-variability cases, but statistically different for the no-variability case (p < 0.05, Wilcoxon signed rank). The second automated algorithm produced statistically similar values in the small-variability case. Two readers' manual tracings, however, produced IMTV measurements with statistically significant differences at all three variability levels, whereas the third reader's showed a statistically significant difference in both the no-variability and large-variability cases. Moreover, the error range between the reader and automatic IMTV values was approximately 0.15 mm, which is on the same order as small IMTV values, indicating that manual and automatic IMTV readings should not be used interchangeably in clinical practice. On the basis of our findings, we conclude that expert manual tracings should not be considered reliable in IMTV measurement and, therefore, should not be trusted as ground truth. Our automated algorithm, on the other hand, was found to be more reliable, indicating how automated techniques could foster analysis of carotid artery intima-media thickness irregularity. Copyright © 2016 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.

  7. Randomized trial of thoracic irradiation plus combination chemotherapy for unresectable adenocarcinoma and large cell carcinoma of the lung

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eagan, R.T.; Lee, R.E.; Frytak, S.

    1979-08-01

    Sixty-eight evaluable patients with unresectable adenocarcinoma and large cell carcinoma of the lung were treated on a prospective randomized trial comparing thoracic radiation therapy (TRT) plus combination chemotherapy with either cyclophosphamide, Adriamycin, and cis-platinum (CAP) or cyclophosphamide, Adriamycin (same dosages), and DTIC (CAD), 34 on each arm. Patients treated with TRT plus CAP had a better overall regression rate (59% vs 47%) and statistically significant superiority in time to disease progression (303 days vs 147 days) and survival (504 days vs 217 days).

  8. Applications of spatial statistical network models to stream data

    USGS Publications Warehouse

    Isaak, Daniel J.; Peterson, Erin E.; Ver Hoef, Jay M.; Wenger, Seth J.; Falke, Jeffrey A.; Torgersen, Christian E.; Sowder, Colin; Steel, E. Ashley; Fortin, Marie-Josée; Jordan, Chris E.; Ruesch, Aaron S.; Som, Nicholas; Monestiez, Pascal

    2014-01-01

    Streams and rivers host a significant portion of Earth's biodiversity and provide important ecosystem services for human populations. Accurate information regarding the status and trends of stream resources is vital for their effective conservation and management. Most statistical techniques applied to data measured on stream networks were developed for terrestrial applications and are not optimized for streams. A new class of spatial statistical model, based on valid covariance structures for stream networks, can be used with many common types of stream data (e.g., water quality attributes, habitat conditions, biological surveys) through application of appropriate distributions (e.g., Gaussian, binomial, Poisson). The spatial statistical network models account for spatial autocorrelation (i.e., nonindependence) among measurements, which allows their application to databases with clustered measurement locations. Large amounts of stream data exist in many areas where spatial statistical analyses could be used to develop novel insights, improve predictions at unsampled sites, and aid in the design of efficient monitoring strategies at relatively low cost. We review the topic of spatial autocorrelation and its effects on statistical inference, demonstrate the use of spatial statistics with stream datasets relevant to common research and management questions, and discuss additional applications and development potential for spatial statistics on stream networks. Free software for implementing the spatial statistical network models has been developed that enables custom applications with many stream databases.

  9. An empirical evaluation of genetic distance statistics using microsatellite data from bear (Ursidae) populations.

    PubMed

    Paetkau, D; Waits, L P; Clarkson, P L; Craighead, L; Strobeck, C

    1997-12-01

    A large microsatellite data set from three species of bear (Ursidae) was used to empirically test the performance of six genetic distance measures in resolving relationships at a variety of scales ranging from adjacent areas in a continuous distribution to species that diverged several million years ago. At the finest scale, while some distance measures performed extremely well, statistics developed specifically to accommodate the mutational processes of microsatellites performed relatively poorly, presumably because of the relatively higher variance of these statistics. At the other extreme, no statistic was able to resolve the close sister relationship of polar bears and brown bears from more distantly related pairs of species. This failure is most likely due to constraints on allele distributions at microsatellite loci. At intermediate scales, both within continuous distributions and in comparisons to insular populations of late Pleistocene origin, it was not possible to define the point where linearity was lost for each of the statistics, except that it is clearly lost after relatively short periods of independent evolution. All of the statistics were affected by the amount of genetic diversity within the populations being compared, significantly complicating the interpretation of genetic distance data.

  10. An Empirical Evaluation of Genetic Distance Statistics Using Microsatellite Data from Bear (Ursidae) Populations

    PubMed Central

    Paetkau, D.; Waits, L. P.; Clarkson, P. L.; Craighead, L.; Strobeck, C.

    1997-01-01

    A large microsatellite data set from three species of bear (Ursidae) was used to empirically test the performance of six genetic distance measures in resolving relationships at a variety of scales ranging from adjacent areas in a continuous distribution to species that diverged several million years ago. At the finest scale, while some distance measures performed extremely well, statistics developed specifically to accommodate the mutational processes of microsatellites performed relatively poorly, presumably because of the relatively higher variance of these statistics. At the other extreme, no statistic was able to resolve the close sister relationship of polar bears and brown bears from more distantly related pairs of species. This failure is most likely due to constraints on allele distributions at microsatellite loci. At intermediate scales, both within continuous distributions and in comparisons to insular populations of late Pleistocene origin, it was not possible to define the point where linearity was lost for each of the statistics, except that it is clearly lost after relatively short periods of independent evolution. All of the statistics were affected by the amount of genetic diversity within the populations being compared, significantly complicating the interpretation of genetic distance data. PMID:9409849

  11. Population sexual behavior and HIV prevalence in Sub-Saharan Africa: missing links?

    PubMed

    Omori, Ryosuke; Abu-Raddad, Laith J

    2016-03-01

    Patterns of sexual partnering should shape HIV transmission in human populations. The objective of this study was to assess empirical associations between population casual sex behavior and HIV prevalence, and between different measures of casual sex behavior. An ecological study design was applied to nationally representative data, those of the Demographic and Health Surveys, in 25 countries of Sub-Saharan Africa. Spearman rank correlation was used to assess different correlations for males and females and their statistical significance. Correlations between HIV prevalence and means and variances of the number of casual sex partners were positive, but small and statistically insignificant. The majority of correlations across means and variances of the number of casual sex partners were positive, large, and statistically significant. However, all correlations between the means (as well as the variances) and the variance for unmarried females were weak and statistically insignificant. Population sexual behavior was not predictive of HIV prevalence across these countries. Nevertheless, the strong correlations across means and variances of sexual behavior suggest that self-reported sexual data are self-consistent and convey valid information. Unmarried female behavior seemed puzzling, but could play an influential role in HIV transmission patterns. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
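
    For illustration, a sketch of the correlation analysis described above on invented country-level values; scipy's spearmanr computes the Spearman rank correlation and its p-value.

```python
# Hedged sketch of the ecological analysis: Spearman rank correlation
# between hypothetical country-level means of casual sex partners and
# HIV prevalence. All values are invented for illustration.
from scipy.stats import spearmanr

hiv_prevalence = [0.5, 1.2, 2.3, 4.8, 5.9, 11.3, 14.2, 17.8]    # percent
mean_partners = [0.30, 0.10, 0.35, 0.18, 0.40, 0.22, 0.12, 0.28]

rho, p = spearmanr(mean_partners, hiv_prevalence)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
# A small, non-significant rho would mirror the paper's finding that
# population casual-sex measures do not predict HIV prevalence.
```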

  12. Agriculture, population growth, and statistical analysis of the radiocarbon record.

    PubMed

    Zahid, H Jabran; Robinson, Erick; Kelly, Robert L

    2016-01-26

    The human population has grown significantly since the onset of the Holocene about 12,000 y ago. Despite decades of research, the factors determining prehistoric population growth remain uncertain. Here, we examine measurements of the rate of growth of the prehistoric human population based on statistical analysis of the radiocarbon record. We find that, during most of the Holocene, human populations worldwide grew at a long-term annual rate of 0.04%. Statistical analysis of the radiocarbon record shows that transitioning farming societies experienced the same rate of growth as contemporaneous foraging societies. The same rate of growth measured for populations dwelling in a range of environments and practicing a variety of subsistence strategies suggests that the global climate and/or endogenous biological factors, not adaptability to local environment or subsistence practices, regulated the long-term growth of the human population during most of the Holocene. Our results demonstrate that statistical analyses of large ensembles of radiocarbon dates are robust and valuable for quantitatively investigating the demography of prehistoric human populations worldwide.
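
    A quick arithmetic check of what a 0.04% annual rate implies: under exponential growth P(t) = P0 * exp(r*t), the doubling time is ln(2)/r.

```python
# Sanity check on the reported long-term growth rate: a constant annual
# rate r implies exponential growth, so the population doubles every
# ln(2)/r years.
import math

r = 0.0004  # 0.04% per year, the long-term Holocene rate reported above
print(f"Doubling time: {math.log(2) / r:,.0f} years")  # ~1,733 years
```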

  13. Computer-aided auditing of prescription drug claims.

    PubMed

    Iyengar, Vijay S; Hermiz, Keith B; Natarajan, Ramesh

    2014-09-01

    We describe a methodology for identifying and ranking candidate audit targets from a database of prescription drug claims. The relevant audit targets may include various entities such as prescribers, patients and pharmacies, who exhibit certain statistical behavior indicative of potential fraud and abuse over the prescription claims during a specified period of interest. Our overall approach is consistent with related work in statistical methods for detection of fraud and abuse, but has a relative emphasis on three specific aspects: first, based on the assessment of domain experts, certain focus areas are selected and data elements pertinent to the audit analysis in each focus area are identified; second, specialized statistical models are developed to characterize the normalized baseline behavior in each focus area; and third, statistical hypothesis testing is used to identify entities that diverge significantly from their expected behavior according to the relevant baseline model. The application of this overall methodology to a prescription claims database from a large health plan is considered in detail.
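
    A hedged sketch of the third step described above (hypothesis testing against a baseline), assuming a plain Poisson baseline per prescriber; the paper's specialized baseline models are not reproduced here, and all counts below are invented.

```python
# Minimal outlier-flagging sketch: test each entity's observed claim
# count against its expected baseline under a simple Poisson model.
from scipy.stats import poisson

def flag_outliers(counts, expected, alpha=0.001):
    """Return entities whose counts significantly exceed their baseline."""
    flagged = []
    for i, (obs, exp) in enumerate(zip(counts, expected)):
        p_value = poisson.sf(obs - 1, exp)  # P(X >= obs | rate = exp)
        if p_value < alpha:
            flagged.append((i, obs, exp, p_value))
    return flagged

# Hypothetical monthly claim counts vs. baseline expectations.
counts = [12, 8, 95, 15, 11]
expected = [10.0, 9.5, 11.0, 14.0, 12.5]
for i, obs, exp, p in flag_outliers(counts, expected):
    print(f"prescriber {i}: observed {obs}, expected {exp:.1f}, p = {p:.2e}")
```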

  14. Evaluating SPLASH-2 Applications Using MapReduce

    NASA Astrophysics Data System (ADS)

    Zhu, Shengkai; Xiao, Zhiwei; Chen, Haibo; Chen, Rong; Zhang, Weihua; Zang, Binyu

    MapReduce has become prevalent for running data-parallel applications. By hiding non-functional concerns such as parallelism, fault tolerance and load balancing from programmers, MapReduce significantly simplifies the programming of large clusters. Owing to these features, researchers have also explored the use of MapReduce in other application domains, such as machine learning, textual retrieval and statistical translation, among others.
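
    A toy word-count sketch of the MapReduce programming model in plain Python: the programmer writes only the map and reduce logic, while a real framework supplies the parallelism, fault tolerance and load balancing behind the same interface.

```python
# Single-process illustration of the map -> shuffle -> reduce pipeline.
from collections import defaultdict

def map_phase(documents):
    # Emit (word, 1) pairs, as in the canonical word-count example.
    for doc in documents:
        for word in doc.split():
            yield (word, 1)

def shuffle(pairs):
    # Group values by key; a framework does this across the cluster.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    return {key: sum(values) for key, values in groups.items()}

docs = ["statistical translation", "textual retrieval", "statistical learning"]
print(reduce_phase(shuffle(map_phase(docs))))
# {'statistical': 2, 'translation': 1, 'textual': 1, 'retrieval': 1, 'learning': 1}
```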

  15. A Theory-Based Model for Understanding Faculty Intention to Use Students Ratings to Improve Teaching in a Health Sciences Institution in Puerto Rico

    ERIC Educational Resources Information Center

    Collazo, Andrés A.

    2018-01-01

    A model derived from the theory of planned behavior was empirically assessed for understanding faculty intention to use student ratings for teaching improvement. A sample of 175 professors participated in the study. The model was statistically significant and had a very large explanatory power. Instrumental attitude, affective attitude, perceived…

  16. What is too much variation? The null hypothesis in small-area analysis.

    PubMed Central

    Diehr, P; Cain, K; Connell, F; Volinn, E

    1990-01-01

    A small-area analysis (SAA) in health services research often calculates surgery rates for several small areas, compares the largest rate to the smallest, notes that the difference is large, and attempts to explain this discrepancy as a function of service availability, physician practice styles, or other factors. SAAs are often difficult to interpret because there is little theoretical basis for determining how much variation would be expected under the null hypothesis that all of the small areas have similar underlying surgery rates and that the observed variation is due to chance. We developed a computer program to simulate the distribution of several commonly used descriptive statistics under the null hypothesis, and used it to examine the variability in rates among the counties of the state of Washington. The expected variability when the null hypothesis is true is surprisingly large, and becomes worse for procedures with low incidence, for smaller populations, when there is variability among the populations of the counties, and when readmissions are possible. The characteristics of four descriptive statistics were studied and compared. None was uniformly good, but the chi-square statistic had better performance than the others. When we reanalyzed five journal articles that presented sufficient data, the results were usually statistically significant. Since SAA research today is tending to deal with low-incidence events, smaller populations, and measures where readmissions are possible, more research is needed on the distribution of small-area statistics under the null hypothesis. New standards are proposed for the presentation of SAA results. PMID:2312306

  17. What is too much variation? The null hypothesis in small-area analysis.

    PubMed

    Diehr, P; Cain, K; Connell, F; Volinn, E

    1990-02-01

    A small-area analysis (SAA) in health services research often calculates surgery rates for several small areas, compares the largest rate to the smallest, notes that the difference is large, and attempts to explain this discrepancy as a function of service availability, physician practice styles, or other factors. SAAs are often difficult to interpret because there is little theoretical basis for determining how much variation would be expected under the null hypothesis that all of the small areas have similar underlying surgery rates and that the observed variation is due to chance. We developed a computer program to simulate the distribution of several commonly used descriptive statistics under the null hypothesis, and used it to examine the variability in rates among the counties of the state of Washington. The expected variability when the null hypothesis is true is surprisingly large, and becomes worse for procedures with low incidence, for smaller populations, when there is variability among the populations of the counties, and when readmissions are possible. The characteristics of four descriptive statistics were studied and compared. None was uniformly good, but the chi-square statistic had better performance than the others. When we reanalyzed five journal articles that presented sufficient data, the results were usually statistically significant. Since SAA research today is tending to deal with low-incidence events, smaller populations, and measures where readmissions are possible, more research is needed on the distribution of small-area statistics under the null hypothesis. New standards are proposed for the presentation of SAA results.
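
    A sketch of the null-hypothesis simulation the authors describe, with invented county populations and a common underlying rate; it shows how large the extremal rate ratio can be by chance alone.

```python
# Simulate surgery counts under the null (identical true rates) and see
# how extreme the largest/smallest observed rate ratio is by chance.
import numpy as np

rng = np.random.default_rng(42)
populations = np.array([5_000, 12_000, 40_000, 150_000, 600_000])
true_rate = 2.0 / 1000  # same underlying rate in every "county"

ratios = []
for _ in range(10_000):
    counts = rng.binomial(populations, true_rate)
    rates = counts / populations
    if rates.min() > 0:                     # skip rare zero-count draws
        ratios.append(rates.max() / rates.min())

print(f"Median max/min rate ratio under the null: {np.median(ratios):.1f}")
# Even with identical true rates, small counties routinely produce large
# extremal ratios, which is the paper's central warning.
```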

  18. Subjective evaluation of compressed image quality

    NASA Astrophysics Data System (ADS)

    Lee, Heesub; Rowberg, Alan H.; Frank, Mark S.; Choi, Hyung-Sik; Kim, Yongmin

    1992-05-01

    Lossy data compression generates distortion or error in the reconstructed image, and the distortion becomes visible as the compression ratio increases. Even at the same compression ratio, the distortion appears differently depending on the compression method used. Because of the nonlinearity of the human visual system and of lossy data compression methods, we subjectively evaluated the quality of medical images compressed with two different methods, an intraframe and an interframe coding algorithm. The raw evaluation data were analyzed statistically to measure interrater reliability and the reliability of an individual reader. Analysis of variance was also used to identify which compression method is statistically better, and the compression ratio at which the quality of a compressed image is judged poorer than that of the original. Nine x-ray CT head images from three patients were used as test cases. Six radiologists participated in reading the 99 images (some were duplicates) compressed at four levels: original, 5:1, 10:1, and 15:1. The six readers agreed more than by chance alone and their agreement was statistically significant, but there were large variations among readers as well as within a reader. The displacement-estimated interframe coding algorithm yields significantly better quality than the 2-D block DCT at significance level 0.05. Also, 10:1 compressed images produced with the interframe coding algorithm do not show any significant differences from the original at level 0.05.

  19. From the Law of Large Numbers to Large Deviation Theory in Statistical Physics: An Introduction

    NASA Astrophysics Data System (ADS)

    Cecconi, Fabio; Cencini, Massimo; Puglisi, Andrea; Vergni, Davide; Vulpiani, Angelo

    This contribution aims at introducing the topics of this book. We start with a brief historical excursion on the developments from the law of large numbers to the central limit theorem and large deviations theory. The same topics are then presented using the language of probability theory. Finally, some applications of large deviations theory in physics are briefly discussed through examples taken from statistical mechanics, dynamical and disordered systems.
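
    A numerical companion to the passage from the law of large numbers to large deviations, for fair coin flips: by Cramér's theorem, -ln P(mean >= a)/n approaches the rate function I(a) = a ln(2a) + (1-a) ln(2(1-a)) as n grows.

```python
# Verify Cramer's large-deviation rate numerically with exact binomial
# tail probabilities; convergence is slow because of sub-exponential
# prefactors, but the trend toward I(a) is visible.
import numpy as np
from scipy.stats import binom

def rate_function(a):
    # Cramer rate function for fair Bernoulli trials (p = 1/2)
    return a * np.log(2 * a) + (1 - a) * np.log(2 * (1 - a))

a = 0.6
for n in (100, 1_000, 10_000):
    tail = binom.sf(int(np.ceil(a * n)) - 1, n, 0.5)  # P(mean >= a)
    print(f"n={n:5d}  -ln P / n = {-np.log(tail)/n:.4f}  I(a) = {rate_function(a):.4f}")
```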

  20. Enabling Comprehension of Patient Subgroups and Characteristics in Large Bipartite Networks: Implications for Precision Medicine

    PubMed Central

    Bhavnani, Suresh K.; Chen, Tianlong; Ayyaswamy, Archana; Visweswaran, Shyam; Bellala, Gowtham; Divekar, Rohit; Bassler, Kevin E.

    2017-01-01

    A primary goal of precision medicine is to identify patient subgroups based on their characteristics (e.g., comorbidities or genes) with the goal of designing more targeted interventions. While network visualization methods such as Fruchterman-Reingold have been used to successfully identify such patient subgroups in small to medium sized data sets, they often fail to reveal comprehensible visual patterns in large and dense networks despite having significant clustering. We therefore developed an algorithm called ExplodeLayout, which exploits the existence of significant clusters in bipartite networks to automatically “explode” a traditional network layout with the goal of separating overlapping clusters, while at the same time preserving key network topological properties that are critical for the comprehension of patient subgroups. We demonstrate the utility of ExplodeLayout by visualizing a large dataset extracted from Medicare consisting of readmitted hip-fracture patients and their comorbidities, demonstrate its statistically significant improvement over a traditional layout algorithm, and discuss how the resulting network visualization enabled clinicians to infer mechanisms precipitating hospital readmission in specific patient subgroups. PMID:28815099
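
    For context, a sketch of the traditional baseline the paper improves on: a Fruchterman-Reingold (force-directed) layout of a toy bipartite patient-comorbidity network via networkx. ExplodeLayout's cluster separation itself is not reproduced here, and all nodes and edges are invented.

```python
# Build a small bipartite patient-comorbidity network and compute the
# Fruchterman-Reingold layout that papers like this one start from.
import networkx as nx

B = nx.Graph()
patients = ["p1", "p2", "p3", "p4"]
comorbidities = ["diabetes", "hypertension", "copd"]
B.add_nodes_from(patients, bipartite=0)
B.add_nodes_from(comorbidities, bipartite=1)
B.add_edges_from([("p1", "diabetes"), ("p1", "hypertension"),
                  ("p2", "hypertension"), ("p3", "copd"),
                  ("p4", "copd"), ("p4", "diabetes")])

pos = nx.spring_layout(B, seed=7)  # Fruchterman-Reingold force-directed layout
for node, (x, y) in pos.items():
    print(f"{node:12s} ({x:+.2f}, {y:+.2f})")
```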

  1. Novel Kalman filter algorithm for statistical monitoring of extensive landscapes with synoptic sensor data

    Treesearch

    Raymond L. Czaplewski

    2015-01-01

    Wall-to-wall remotely sensed data are increasingly available to monitor landscape dynamics over large geographic areas. However, statistical monitoring programs that use post-stratification cannot fully utilize those sensor data. The Kalman filter (KF) is an alternative statistical estimator. I develop a new KF algorithm that is numerically robust with large numbers of...
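
    A minimal scalar Kalman filter update, to illustrate the estimator the abstract names; this is the textbook form, not the paper's numerically robust algorithm, and the numbers below are invented.

```python
# One Kalman measurement update: blend a model prediction with a new
# observation, weighting each by its uncertainty.
def kalman_update(x_prior, p_prior, z, r):
    """State estimate x with variance p, observation z with variance r."""
    k = p_prior / (p_prior + r)        # Kalman gain
    x_post = x_prior + k * (z - x_prior)
    p_post = (1 - k) * p_prior
    return x_post, p_post

# Hypothetical prior forest-cover estimate of 62% (variance 9) updated
# with a satellite-derived estimate of 58% (variance 4).
x, p = kalman_update(62.0, 9.0, 58.0, 4.0)
print(f"posterior estimate = {x:.2f}, variance = {p:.2f}")
```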

  2. Common pitfalls in statistical analysis: Clinical versus statistical significance

    PubMed Central

    Ranganathan, Priya; Pramesh, C. S.; Buyse, Marc

    2015-01-01

    In clinical research, study results, which are statistically significant are often interpreted as being clinically important. While statistical significance indicates the reliability of the study results, clinical significance reflects its impact on clinical practice. The third article in this series exploring pitfalls in statistical analysis clarifies the importance of differentiating between statistical significance and clinical significance. PMID:26229754

  3. Large scale landslide susceptibility assessment using the statistical methods of logistic regression and BSA - study case: the sub-basin of the small Niraj (Transylvania Depression, Romania)

    NASA Astrophysics Data System (ADS)

    Roşca, S.; Bilaşco, Ş.; Petrea, D.; Fodorean, I.; Vescan, I.; Filip, S.; Măguţ, F.-L.

    2015-11-01

    The existence of a large number of GIS models for estimating landslide occurrence probability makes the selection of a specific one difficult. The present study focuses on the application of two quantitative models, the logistic regression and BSA models, and compares their results in order to identify the more suitable one. The territory of the Niraj Mic basin (87 km2) is characterised by a wide variety of landforms, with diverse morphometric, morphographical and geological characteristics, as well as by a high complexity of land use types where active landslides exist. For these reasons it serves as the test area for applying the two models and comparing their results. The complexity of the input is illustrated by 16 factors, represented as 72 dummy variables and analysed on the basis of their importance within the model structures. Testing the statistical significance of each variable reduced the number of dummy variables to 12 considered significant for the test area within the logistic model, whereas the BSA model employed all the variables. The predictive ability of the models was tested through the area under the ROC curve, which indicated good accuracy of the logistic model on the testing area (AUROC = 0.86) and weaker predictability on the validation area (AUROC = 0.63).
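
    A sketch of the logistic-model workflow on synthetic data (predictors and coefficients invented): fit landslide presence/absence with scikit-learn, then score predictability with the area under the ROC curve.

```python
# Fit a logistic susceptibility model on synthetic predictors and report
# AUROC on a held-out split, mirroring the study's validation metric.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 500
slope = rng.uniform(0, 45, n)            # degrees (invented)
landuse = rng.integers(0, 2, n)          # dummy variable, e.g. pasture
logit = -4 + 0.12 * slope + 0.8 * landuse
y = rng.random(n) < 1 / (1 + np.exp(-logit))   # landslide presence/absence

X = np.column_stack([slope, landuse])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)
auroc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"AUROC on held-out area: {auroc:.2f}")
```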

  4. The statistical significance of error probability as determined from decoding simulations for long codes

    NASA Technical Reports Server (NTRS)

    Massey, J. L.

    1976-01-01

    The very low error probability obtained with long error-correcting codes results in a very small number of observed errors in simulation studies of practical size and renders the usual confidence interval techniques inapplicable to the observed error probability. A natural extension of the notion of a 'confidence interval' is made and applied to such determinations of error probability by simulation. An example is included to show the surprisingly great significance of as few as two decoding errors in a very large number of decoding trials.
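
    The same problem in modern terms, as a sketch: with only k = 2 observed decoding errors in n trials, an exact (Clopper-Pearson) binomial interval for the error probability is still informative where the normal approximation fails. The trial count below is invented.

```python
# Exact Clopper-Pearson interval via the beta distribution; valid even
# when only a handful of errors are observed.
from scipy.stats import beta

def clopper_pearson(k, n, conf=0.95):
    alpha = 1 - conf
    lo = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lo, hi

k, n = 2, 10_000_000    # two decoding errors in ten million trials
lo, hi = clopper_pearson(k, n)
print(f"observed rate {k/n:.1e}, 95% CI [{lo:.1e}, {hi:.1e}]")
# Even two errors confine the error probability to a fairly narrow
# range, echoing the paper's point about their surprising significance.
```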

  5. Analysis of defect structure in silicon. Characterization of SEMIX material. Silicon sheet growth development for the large area silicon sheet task of the low-cost solar array project

    NASA Technical Reports Server (NTRS)

    Natesh, R.; Stringfellow, G. B.; Virkar, A. V.; Dunn, J.; Guyer, T.

    1983-01-01

    Statistically significant quantitative structural imperfection measurements were made on samples from ubiquitous crystalline process (UCP) Ingot 5848-13C. An important correlation was obtained between defect densities, cell efficiency, and diffusion length. Grain boundary substructure displayed a strong influence on the conversion efficiency of solar cells made from Semix material. Quantitative microscopy measurements gave statistically significant information compared with other microanalytical techniques. A surface preparation technique to obtain proper contrast of structural defects suitable for Quantimet quantitative image analyzer (QTM) analysis was perfected and is used routinely. The relationship between hole mobility and grain boundary density was determined: mobility was measured using the van der Pauw technique, grain boundary density was measured using quantitative microscopy, and mobility was found to decrease with increasing grain boundary density.

  6. How allele frequency and study design affect association test statistics with misrepresentation errors.

    PubMed

    Escott-Price, Valentina; Ghodsi, Mansoureh; Schmidt, Karl Michael

    2014-04-01

    We evaluate the effect of genotyping errors on the type-I error of a general association test based on genotypes, showing that, in the presence of errors in the case and control samples, the test statistic asymptotically follows a scaled non-central $\chi^2$ distribution. We give explicit formulae for the scaling factor and non-centrality parameter for the symmetric allele-based genotyping error model and for additive and recessive disease models. They show how genotyping errors can lead to a significantly higher false-positive rate, growing with sample size, compared with the nominal significance levels. The strength of this effect depends very strongly on the population distribution of the genotype, with a pronounced effect in the case of rare alleles and great robustness against error in the case of large minor allele frequency. We also show how these results can be used to correct $p$-values.
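
    A numerical companion to the result above, with invented values for the scaling factor and non-centrality parameter: if the statistic follows a scaled non-central chi-square under error, the actual false-positive rate at a nominal level follows directly.

```python
# If the statistic is a * ncx2(df, lam) under genotyping error, the true
# type-I error at a nominal alpha is P(a*X > crit) = ncx2.sf(crit/a, ...).
from scipy.stats import chi2, ncx2

df = 2               # genotype-based test with three genotype classes
a, lam = 1.05, 0.8   # hypothetical scaling factor and non-centrality

for alpha in (0.05, 1e-4):
    crit = chi2.ppf(1 - alpha, df)          # nominal critical value
    true_alpha = ncx2.sf(crit / a, df, lam)
    print(f"nominal {alpha:.0e} -> actual {true_alpha:.2e}")
# Since lam grows with sample size, the inflation grows with n, matching
# the paper's warning about the false-positive rate.
```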

  7. The slip resistance of common footwear materials measured with two slipmeters.

    PubMed

    Chang, W R; Matz, S

    2001-12-01

    The slip resistance of 16 commonly used footwear materials was measured with the Brungraber Mark II and the English XL on 3 floor surfaces under dry, wet, oily and oily-wet surface conditions. Three samples were used for each material combination and surface condition. The results of a one-way ANOVA indicated that the differences among samples were statistically significant for a large number of material combinations and surface conditions. The results also indicated that the ranking of materials based on their slip resistance values depends strongly on the slipmeter, floor surface and surface condition. For contaminated surfaces, including wet, oily and oily-wet surfaces, the slip resistance obtained with the English XL was usually higher than that measured with the Brungraber Mark II. The correlation coefficients between the slip resistance values obtained with the two slipmeters, calculated for the different surface conditions, indicated a strong and statistically significant correlation.

  8. Whole-body concentrations of elements in three fish species from offshore oil platforms and natural areas in the Southern California Bight, USA

    USGS Publications Warehouse

    Love, Milton S.; Saiki, Michael K.; May, Thomas W.; Yee, Julie L.

    2013-01-01

    Forty-two elements were excluded from statistical comparisons because they (1) consisted of major cations that were unlikely to accumulate to potentially toxic concentrations; (2) were not detected by the analytical procedures; or (3) were detected at concentrations too low to yield reliable quantitative measurements. The remaining 21 elements were aluminum, arsenic, barium, cadmium, chromium, cobalt, copper, gallium, iron, lead, lithium, manganese, mercury, nickel, rubidium, selenium, strontium, tin, titanium, vanadium, and zinc. Statistical comparisons of these elements indicated that none consistently exhibited higher concentrations at oil platforms than at natural areas. However, the concentrations of copper, selenium, titanium, and vanadium in Pacific sanddab were unusual: small individuals exhibited either no differences between oil platforms and natural areas or significantly lower concentrations at oil platforms, whereas large individuals exhibited significantly higher concentrations at oil platforms than at natural areas.

  9. The relationship between STEM educational attainment and utility patent conferrals: A state-level analysis

    NASA Astrophysics Data System (ADS)

    Liao, Yuqi

    The utility patent, as a legal record of invention, is widely believed to be a close proxy for innovation among firms, industries, and economies as a whole. One of the critical drivers of patenting -- and ultimately, innovation -- is education. The science, technology, engineering and math (STEM) fields in education are of special importance. There is, however, little empirical research to substantiate a connection between STEM education and innovation outcomes. Seeking to fill this gap, this paper finds that, in general, there is no evidence of a meaningful relationship between STEM educational attainment and utility patent conferrals. The relationship of interest, though generally not statistically significant, is stronger for temporary US visa holders than for US citizens or permanent US residents. However, I find a large and statistically significant association between STEM educational attainment and utility patent conferrals for states that have above-average college educational attainment or above-average advanced industries workforce concentration.

  10. Kinetics of bacterial fluorescence staining with 3,3'-diethylthiacyanine.

    PubMed

    Thomas, Marlon S; Nuñez, Vicente; Upadhyayula, Srigokul; Zielins, Elizabeth R; Bao, Duoduo; Vasquez, Jacob M; Bahmani, Baharak; Vullev, Valentine I

    2010-06-15

    For more than a century, colorimetric and fluorescence staining have been the foundation of a broad range of key bioanalytical techniques. The dynamics of such staining processes, however, remains largely unexplored. We investigated the kinetics of fluorescence staining of two gram-negative and two gram-positive species with 3,3'-diethylthiacyanine (THIA) iodide. An increase in the THIA fluorescence quantum yield, induced by the bacterial dye uptake, was the principal reason for the observed emission enhancement. The fluorescence quantum yield of THIA depended on the media viscosity and not on the media polarity, which suggested that the microenvironment of the dye molecules taken up by the cells was restrictive. The kinetics of fluorescence staining manifested no statistically significant dependence on either the dye concentration or the cell count. In the presence of surfactant additives, however, the fluorescence-enhancement kinetic patterns manifested species specificity with statistically significant discernibility.

  11. Enhanced Higgs boson to τ(+)τ(-) search with deep learning.

    PubMed

    Baldi, P; Sadowski, P; Whiteson, D

    2015-03-20

    The Higgs boson is thought to provide the interaction that imparts mass to the fundamental fermions, but while measurements at the Large Hadron Collider (LHC) are consistent with this hypothesis, current analysis techniques lack the statistical power to cross the traditional 5σ significance barrier without more data. Deep learning techniques have the potential to increase the statistical power of this analysis by automatically learning complex, high-level data representations. In this work, deep neural networks are used to detect the decay of the Higgs boson to a pair of tau leptons. A Bayesian optimization algorithm is used to tune the network architecture and training algorithm hyperparameters, resulting in a deep network of eight nonlinear processing layers that improves upon the performance of shallow classifiers even without the use of features specifically engineered by physicists for this application. The improvement in discovery significance is equivalent to an increase in the accumulated data set of 25%.

  12. Long Range Earthquake Interaction in Iceland

    NASA Astrophysics Data System (ADS)

    Goltz, C.

    2003-12-01

    It has been observed that earthquakes can be triggered by similarly sized events at large distances. The phenomenon has recently been shown to be statistically significant at a range up to several source dimensions in global earthquake data. The most appropriate explanation of the phenomenon seems to be criticality of the Earth's crust as e.g. changes in static and dynamic stresses would otherwise be too small to trigger remote events. I present results for a regional (as opposed to global) study of seismicity in Iceland which is based on a high quality reprocessed catalogue. Results include the time-dependent determination of the maximum range of interaction and the correlation length and also address the question whether small events can trigger larger ones. Pitfalls such as data accuracy and geometry as well as boundary effects are thoroughly discussed. A comparison with surrogate data helps to assess the statistical significance of the results.

  13. Interrelationships Between 3 Keratoconic Cone Parameters.

    PubMed

    Tu, Kyaw L; Tourkmani, Abdo K; Srinivas, Singaram

    2017-09-01

    To determine the interrelationships between 3 parameters of the keratoconic cone. A total of 101 keratoconic eyes of 58 patients were included in this retrospective case series study. A complete eye examination was performed. Kmax (K) and pachymetry at the thinnest point (T) were obtained from the Pentacam tomographer. The vertex-to-thinnest-point distance (D, for decentration) was calculated using trigonometry. Pearson correlation coefficients between T and D, between T and K, and between D and K were calculated. There is a statistically significant positive correlation between thinnest-point pachymetry and decentration (R = 0.366, P = 0.0002), and there are statistically significant negative correlations between thinnest-point pachymetry and Kmax (R = -0.719, P < 0.00001) and between decentration and Kmax (R = -0.281, P = 0.0044). The interrelationships between the 3 keratoconic cone parameters suggest that the thinner cones are largely central, that is, they decenter less but show greater steepening.

  14. Factors associated with sinus bradycardia during crizotinib treatment: a retrospective analysis of two large-scale multinational trials (PROFILE 1005 and 1007).

    PubMed

    Ou, Sai-Hong Ignatius; Tang, Yiyun; Polli, Anna; Wilner, Keith D; Schnell, Patrick

    2016-04-01

    Decreases in heart rate (HR) have been described in patients receiving crizotinib. We performed a large retrospective analysis of HR changes during crizotinib therapy. HRs from vital-sign data for patients with anaplastic lymphoma kinase (ALK)-positive non-small cell lung cancer enrolled in PROFILE 1005 and the crizotinib arm of PROFILE 1007 were analyzed. Sinus bradycardia (SB) was defined as HR <60 beats per minute (bpm). Magnitude and timing of HR changes were assessed. Potential risk factors for SB were investigated by logistic regression analysis. Progression-free survival (PFS) was evaluated according to HR decrease by <20 versus ≥20 bpm within the first 50 days of starting treatment. For the 1053 patients analyzed, the mean maximum postbaseline HR decrease was 25 bpm (standard deviation 15.8). Overall, 441 patients (41.9%) had at least one episode of postbaseline SB. The mean pre-crizotinib HR was significantly lower among patients with versus without postbaseline SB (82.2 bpm vs. 92.6 bpm). PFS was comparable among patients with or without an HR decrease of ≥20 bpm within the first 50 days of starting crizotinib. Decrease in HR is very common among patients on crizotinib, and the likelihood of experiencing SB was statistically significantly higher among patients with a pre-crizotinib HR <70 bpm. This is the first large-scale report investigating the association between treatment with a tyrosine kinase inhibitor and the development of bradycardia. HRs should be closely monitored during crizotinib treatment. © 2016 The Authors. Cancer Medicine published by John Wiley & Sons Ltd.

  15. US Intergroup Anal Carcinoma Trial: Tumor Diameter Predicts for Colostomy

    PubMed Central

    Ajani, Jaffer A.; Winter, Kathryn A.; Gunderson, Leonard L.; Pedersen, John; Benson, Al B.; Thomas, Charles R.; Mayer, Robert J.; Haddock, Michael G.; Rich, Tyvin A.; Willett, Christopher G.

    2009-01-01

    Purpose The US Gastrointestinal Intergroup Radiation Therapy Oncology Group 98-11 anal carcinoma trial showed that cisplatin-based concurrent chemoradiotherapy resulted in a significantly higher rate of colostomy compared with mitomycin-based therapy. Established prognostic variables for patients with anal carcinoma include tumor diameter, clinical nodal status, and sex, but pretreatment variables that would predict the likelihood of colostomy are unknown. Methods A secondary analysis was performed by combining patients in the two treatment arms to evaluate whether new predictive and prognostic variables would emerge. Univariate and multivariate analyses were carried out to correlate overall survival (OS), disease-free survival, and time to colostomy (TTC) with pretreatment and treatment variables. Results Of 682 patients enrolled, 644 patients were assessable and analyzed. In the multivariate analysis, tumor-related prognosticators for poorer OS included node-positive cancer (P ≤ .0001), large (> 5 cm) tumor diameter (P = .01), and male sex (P = .016). In the treatment-related categories, cisplatin-based therapy was statistically significantly associated with a higher rate of colostomy (P = .03) than was mitomycin-based therapy. In the pretreatment variables category, only large tumor diameter independently predicted for TTC (P = .008). Similarly, the cumulative 5-year colostomy rate was statistically significantly higher for large tumor diameter than for small tumor diameter (Gray's test; P = .0074). Clinical nodal status and sex were not predictive of TTC. Conclusion The combined analysis of the two arms of RTOG 98-11, representing the largest prospective database, reveals that tumor diameter (irrespective of the nodal status) is the only independent pretreatment variable that predicts TTC and 5-year colostomy rate in patients with anal carcinoma. PMID:19139424

  16. Guided self-help cognitive-behaviour Intervention for VoicEs (GiVE): Results from a pilot randomised controlled trial in a transdiagnostic sample.

    PubMed

    Hazell, Cassie M; Hayward, Mark; Cavanagh, Kate; Jones, Anna-Marie; Strauss, Clara

    2018-05-01

    Few patients have access to cognitive behaviour therapy for psychosis (CBTp), even though at least 16 sessions of CBTp are recommended in treatment guidelines. Briefer CBTp could improve access, as the same number of therapists could see more patients. In addition, focusing on single psychotic symptoms, such as auditory hallucinations ('voices'), rather than on psychosis more broadly, may yield greater benefits. This pilot RCT recruited 28 participants (with a range of diagnoses) from NHS mental health services who were distressed by hearing voices. The study compared an 8-session guided self-help CBT intervention for distressing voices with a wait-list control. Data were collected at baseline and at 12 weeks, with post-therapy assessments conducted blind to allocation. Voice-impact was the pre-determined primary outcome. Secondary outcomes were depression, anxiety, wellbeing and recovery. Mechanism measures were self-esteem, beliefs about self, beliefs about voices and voice-relating. Recruitment and retention were feasible, with low study (3.6%) and therapy (14.3%) dropout. There were large, statistically significant between-group effects on the primary outcome of voice-impact (d = 1.78; 95% CI: 0.86-2.70), which exceeded the minimum clinically important difference. Large, statistically significant effects were found on a number of secondary and mechanism measures. Large effects on the pre-determined primary outcome of voice-impact are encouraging, and criteria for progressing to a definitive trial are met. Significant between-group effects on measures of self-esteem, negative beliefs about self and beliefs about voice omnipotence are consistent with these being mechanisms of change, and this requires testing in a future trial. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
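
    A rough sketch of the effect-size calculation behind figures like d = 1.78: Cohen's d from group summaries with an approximate 95% CI. The group statistics below are invented, not the trial's data.

```python
# Cohen's d from two group summaries, with an approximate large-sample
# confidence interval based on the standard error of d.
import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    s_pooled = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / s_pooled
    se = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2 - 2)))
    return d, (d - 1.96 * se, d + 1.96 * se)

# Hypothetical post-therapy voice-impact scores (lower = better).
d, ci = cohens_d(m1=28.0, s1=6.0, n1=14, m2=18.5, s2=4.5, n2=14)
print(f"d = {d:.2f}, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")
```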

  17. Random fractional ultrapulsed CO2 resurfacing of photodamaged facial skin: long-term evaluation.

    PubMed

    Tretti Clementoni, Matteo; Galimberti, Michela; Tourlaki, Athanasia; Catenacci, Maximilian; Lavagno, Rosalia; Bencini, Pier Luca

    2013-02-01

    Although numerous papers have recently been published on ablative fractional resurfacing, the literature lacks very long-term results. The aim of this retrospective study is to evaluate the efficacy, adverse effects, and long-term results of random fractional ultrapulsed CO2 laser resurfacing in a large population with photodamaged facial skin. Three hundred twelve patients with photodamaged facial skin were enrolled and underwent a single full-face treatment. Six aspects of photodamaged skin were scored on a 5-point scale at 3, 6, and 24 months after the treatment. The results were compared with a non-parametric statistical test, Wilcoxon's exact test. Three hundred one patients completed the study. All analyzed features showed a statistically significant improvement 3 months after the procedure. Three months later all features except pigmentation once again showed a statistically significant improvement. Results after 24 months were similar to those assessed 18 months earlier. No long-term or other serious complications were observed. In this large patient series, the long-term results demonstrate not only that fractional ultrapulsed CO2 resurfacing can achieve good results on photodamaged facial skin but also that these results remain stable 2 years after the procedure.

  18. A supportive-educative telephone program: impact on knowledge and anxiety after coronary artery bypass graft surgery.

    PubMed

    Beckie, T

    1989-01-01

    The purpose of this study was to investigate the impact of a supportive-educative telephone program on the levels of knowledge and anxiety of patients undergoing coronary artery bypass graft surgery during the first 6 weeks after hospital discharge. With a posttest-only control group design, the first 74 patients scheduled, between September 1986 and February 1987, for coronary artery bypass graft surgery in a large, western Canadian teaching hospital were randomly assigned to either an experimental or a control group. The effect of the intervention, which was implemented by a cardiac rehabilitation nurse specialist, was assessed by a knowledge test and a state anxiety inventory. Data were collected without knowledge of the participants' group assignment. As hypothesized, data analysis with independent t tests revealed a statistically significant (p < 0.05) difference between the knowledge levels of the experimental and control groups in the areas of coronary artery disease, diet, medications, physical activity restrictions, exercise, and rest. A statistically significant difference between the state anxiety levels of the experimental and control groups was also evident, as was a statistically significant inverse relationship between participants' knowledge and anxiety levels. From these findings, several implications and recommendations for nursing practice and research have been generated.

  19. Incorporating Functional Annotations for Fine-Mapping Causal Variants in a Bayesian Framework Using Summary Statistics.

    PubMed

    Chen, Wenan; McDonnell, Shannon K; Thibodeau, Stephen N; Tillmans, Lori S; Schaid, Daniel J

    2016-11-01

    Functional annotations have been shown to improve both the discovery power and fine-mapping accuracy in genome-wide association studies. However, the optimal strategy to incorporate the large number of existing annotations is still not clear. In this study, we propose a Bayesian framework to incorporate functional annotations in a systematic manner. We compute the maximum a posteriori solution and use cross validation to find the optimal penalty parameters. By extending our previous fine-mapping method CAVIARBF into this framework, we require only summary statistics as input. We also derived an exact calculation of Bayes factors using summary statistics for quantitative traits, which is necessary when a large proportion of trait variance is explained by the variants of interest, such as in fine mapping expression quantitative trait loci (eQTL). We compared the proposed method with PAINTOR using different strategies to combine annotations. Simulation results show that the proposed method achieves the best accuracy in identifying causal variants among the different strategies and methods compared. We also find that for annotations with moderate effects from a large annotation pool, screening annotations individually and then combining the top annotations can produce overly optimistic results. We applied these methods on two real data sets: a meta-analysis result of lipid traits and a cis-eQTL study of normal prostate tissues. For the eQTL data, incorporating annotations significantly increased the number of potential causal variants with high probabilities. Copyright © 2016 by the Genetics Society of America.
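
    A sketch of a Bayes factor from summary statistics using Wakefield's well-known approximation, not CAVIARBF's exact calculation: it needs only the effect estimate, its standard error, and a prior variance W on the true effect. All numbers below are invented.

```python
# Wakefield-style approximate Bayes factor for a single variant, derived
# from beta_hat ~ N(beta, V) with prior beta ~ N(0, W) under H1.
import math

def wakefield_abf(beta, se, W=0.04):
    """Approximate Bayes factor for H1 (effect ~ N(0, W)) vs H0 (no effect)."""
    V = se**2
    z2 = (beta / se) ** 2
    return math.sqrt(V / (V + W)) * math.exp(z2 / 2 * W / (V + W))

# Hypothetical GWAS summary statistics for one variant.
print(f"ABF = {wakefield_abf(beta=0.12, se=0.03):.1f}")
```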

  20. Logical analysis of diffuse large B-cell lymphomas.

    PubMed

    Alexe, G; Alexe, S; Axelrod, D E; Hammer, P L; Weissmann, D

    2005-07-01

    The goal of this study is to re-examine the oligonucleotide microarray dataset of Shipp et al., which contains the intensity levels of 6817 genes of 58 patients with diffuse large B-cell lymphoma (DLBCL) and 19 with follicular lymphoma (FL), by means of the combinatorics, optimization, and logic-based methodology of logical analysis of data (LAD). The motivations for this new analysis included the previously demonstrated capabilities of LAD and its expected potential (1) to identify informative genes different from those discovered by conventional statistical methods, (2) to identify combinations of gene expression levels capable of characterizing different types of lymphoma, and (3) to assemble collections of such combinations that, considered jointly, are capable of accurately distinguishing different types of lymphoma. The central concept of LAD is a pattern or combinatorial biomarker, a concept that resembles a rule as used in decision tree methods. LAD is able to exhaustively generate the collection of all those patterns which satisfy certain quality constraints, through a systematic combinatorial process guided by clear optimization criteria. Then, based on a set covering approach, LAD aggregates the collection of patterns into classification models. In addition, LAD is able to use the information provided by large collections of patterns in order to extract subsets of variables which collectively are able to distinguish between different types of disease. For the differential diagnosis of DLBCL versus FL, a model based on eight significant genes is constructed and shown to have a sensitivity of 94.7% and a specificity of 100% on the test set. For the prognosis of good versus poor outcome among the DLBCL patients, a model is constructed from another set, also of eight significant genes, and is shown to have a sensitivity of 87.5% and a specificity of 90% on the test set. The genes selected by LAD also work well as a basis for other kinds of statistical analysis, indicating their robustness. These two models exhibit accuracies that compare favorably to those in the original study. The current study also provides a ranking by importance of the genes in the selected significant subsets, as well as a library of dozens of combinatorial biomarkers (i.e., pairs or triplets of genes) that can serve as a source of mathematically generated, statistically significant research hypotheses in need of biological explanation.

  1. Statistical Analyses of Satellite Cloud Object Data from CERES. Part II; Tropical Convective Cloud Objects During 1998 El Nino and Validation of the Fixed Anvil Temperature Hypothesis

    NASA Technical Reports Server (NTRS)

    Xu, Kuan-Man; Wong, Takmeng; Wielicki, Bruce a.; Parker, Lindsay; Lin, Bing; Eitzen, Zachary A.; Branson, Mark

    2006-01-01

    Characteristics of tropical deep convective cloud objects observed over the tropical Pacific during January-August 1998 are examined using the Tropical Rainfall Measuring Mission/Clouds and the Earth's Radiant Energy System single scanner footprint (SSF) data. These characteristics include the frequencies of occurrence and statistical distributions of cloud physical properties. Their variations with cloud-object size, sea surface temperature (SST), and satellite precessing cycle are analyzed in detail. A cloud object is defined as a contiguous patch of the Earth composed of satellite footprints within a single dominant cloud-system type. It is found that statistical distributions of cloud physical properties are significantly different among three size categories of cloud objects with equivalent diameters of 100-150 km (small), 150-300 km (medium), and >300 km (large), respectively, except for the distributions of ice particle size. The distributions for the larger-size categories of cloud objects are more skewed towards high SSTs, high cloud tops, low cloud-top temperatures, large ice water path, high cloud optical depth, low outgoing longwave (LW) radiation, and high albedo than those for the smaller-size categories. As SST varied from one satellite precessing cycle to another, the changes in macrophysical properties of cloud objects over the entire tropical Pacific were small for the large-size category of cloud objects, relative to those of the small- and medium-size categories. This result suggests that the fixed anvil temperature hypothesis of Hartmann and Larson may be valid for the large-size category. Combined with the result that a higher percentage of the large-size category of cloud objects occurs during higher-SST subperiods, this implies that macrophysical properties of cloud objects would be less sensitive to further warming of the climate. On the other hand, when cloud objects are classified according to SSTs, where large-scale dynamics plays an important role, statistical characteristics of cloud microphysical properties, optical depth and albedo are not sensitive to the SST, but those of cloud macrophysical properties are strongly dependent upon the SST. Frequency distributions of vertical velocity from the European Centre for Medium-Range Weather Forecasts model, matched to each cloud object, are used to interpret some of the findings of this study.

  2. Effect size and statistical power in the rodent fear conditioning literature - A systematic review.

    PubMed

    Carneiro, Clarissa F D; Moulin, Thiago C; Macleod, Malcolm R; Amaral, Olavo B

    2018-01-01

    Proposals to increase research reproducibility frequently call for focusing on effect sizes instead of p values, as well as for increasing the statistical power of experiments. However, it is unclear to what extent these two concepts are indeed taken into account in basic biomedical science. To study this in a real-case scenario, we performed a systematic review of effect sizes and statistical power in studies on learning of rodent fear conditioning, a widely used behavioral task to evaluate memory. Our search criteria yielded 410 experiments comparing control and treated groups in 122 articles. Interventions had a mean effect size of 29.5%, and amnesia caused by memory-impairing interventions was nearly always partial. Mean statistical power to detect the average effect size observed in well-powered experiments with significant differences (37.2%) was 65%, and was lower among studies with non-significant results. Only one article reported a sample size calculation, and our estimated sample size to achieve 80% power considering typical effect sizes and variances (15 animals per group) was reached in only 12.2% of experiments. Actual effect sizes correlated with effect size inferences made by readers on the basis of textual descriptions of results only when findings were non-significant, and neither effect size nor power correlated with study quality indicators, number of citations or impact factor of the publishing journal. In summary, effect sizes and statistical power have a wide distribution in the rodent fear conditioning literature, but do not seem to have a large influence on how results are described or cited. Failure to take these concepts into consideration might limit attempts to improve reproducibility in this field of science.
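
    A sketch of the sample-size calculation the review found almost never reported, using statsmodels; the effect size here is an invented example, not the field's typical value.

```python
# Solve for the per-group n needed for 80% power in a two-sample t-test
# at alpha = 0.05, given a standardized effect size (Cohen's d).
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=1.0, power=0.8, alpha=0.05)
print(f"n per group for d = 1.0 at 80% power: {n_per_group:.1f}")  # ~17
```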

  3. Effect size and statistical power in the rodent fear conditioning literature – A systematic review

    PubMed Central

    Macleod, Malcolm R.

    2018-01-01

    Proposals to increase research reproducibility frequently call for focusing on effect sizes instead of p values, as well as for increasing the statistical power of experiments. However, it is unclear to what extent these two concepts are indeed taken into account in basic biomedical science. To study this in a real-case scenario, we performed a systematic review of effect sizes and statistical power in studies on learning of rodent fear conditioning, a widely used behavioral task to evaluate memory. Our search criteria yielded 410 experiments comparing control and treated groups in 122 articles. Interventions had a mean effect size of 29.5%, and amnesia caused by memory-impairing interventions was nearly always partial. Mean statistical power to detect the average effect size observed in well-powered experiments with significant differences (37.2%) was 65%, and was lower among studies with non-significant results. Only one article reported a sample size calculation, and our estimated sample size to achieve 80% power considering typical effect sizes and variances (15 animals per group) was reached in only 12.2% of experiments. Actual effect sizes correlated with effect size inferences made by readers on the basis of textual descriptions of results only when findings were non-significant, and neither effect size nor power correlated with study quality indicators, number of citations or impact factor of the publishing journal. In summary, effect sizes and statistical power have a wide distribution in the rodent fear conditioning literature, but do not seem to have a large influence on how results are described or cited. Failure to take these concepts into consideration might limit attempts to improve reproducibility in this field of science. PMID:29698451

  4. Statistical Characterization of the Chandra Source Catalog

    NASA Astrophysics Data System (ADS)

    Primini, Francis A.; Houck, John C.; Davis, John E.; Nowak, Michael A.; Evans, Ian N.; Glotfelty, Kenny J.; Anderson, Craig S.; Bonaventura, Nina R.; Chen, Judy C.; Doe, Stephen M.; Evans, Janet D.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Gibbs, Danny G.; Grier, John D.; Hain, Roger M.; Hall, Diane M.; Harbo, Peter N.; He, Xiangqun Helen; Karovska, Margarita; Kashyap, Vinay L.; Lauer, Jennifer; McCollough, Michael L.; McDowell, Jonathan C.; Miller, Joseph B.; Mitschang, Arik W.; Morgan, Douglas L.; Mossman, Amy E.; Nichols, Joy S.; Plummer, David A.; Refsdal, Brian L.; Rots, Arnold H.; Siemiginowska, Aneta; Sundheim, Beth A.; Tibbetts, Michael S.; Van Stone, David W.; Winkelman, Sherry L.; Zografou, Panagoula

    2011-06-01

    The first release of the Chandra Source Catalog (CSC) contains ~95,000 X-ray sources in a total area of 0.75% of the entire sky, using data from ~3900 separate ACIS observations of a multitude of different types of X-ray sources. In order to maximize the scientific benefit of such a large, heterogeneous data set, careful characterization of the statistical properties of the catalog, i.e., completeness, sensitivity, false source rate, and accuracy of source properties, is required. Characterization efforts of other large Chandra catalogs, such as the ChaMP Point Source Catalog or the 2 Mega-second Deep Field Surveys, while informative, cannot serve this purpose, since the CSC analysis procedures are significantly different and the range of allowable data is much less restrictive. We describe here the characterization process for the CSC. This process includes both a comparison of real CSC results with those of other, deeper Chandra catalogs of the same targets and extensive simulations of blank-sky and point-source populations.

  5. Abundance gradients in cooling flow clusters: Ginga Large Area Counters and Einstein Solid State Spectrometer spectra of A496, A1795, A2142, and A2199

    NASA Technical Reports Server (NTRS)

    White, Raymond E., III; Day, C. S. R.; Hatsukade, Isamu; Hughes, John P.

    1994-01-01

    We analyze the Ginga Large Area Counters (LAC) and Einstein Solid State Spectrometer (SSS) spectra of four cooling flow clusters, A496, A1795, A2142, and A2199, each of which shows firm evidence of a relatively cool component. The inclusion of such cool spectral components in joint fits of SSS and LAC data leads to somewhat higher global temperatures than are derived from the high-energy LAC data alone. We find little evidence of cool emission outside the SSS field of view. Metal abundances appear to be centrally enhanced in all four clusters, with varying degrees of model dependence and statistical significance: the evidence is statistically strongest for A496 and A2142, somewhat weaker for A2199 and weakest for A1795. We also explore the model dependence in the amount of cold, X-ray-absorbing matter discovered in these clusters by White et al.

  6. Next-generation prognostic assessment for diffuse large B-cell lymphoma

    PubMed Central

    Staton, Ashley D; Koff, Jean L; Chen, Qiushi; Ayer, Turgay; Flowers, Christopher R

    2015-01-01

    Current standard of care therapy for diffuse large B-cell lymphoma (DLBCL) cures a majority of patients, with additional benefit from salvage therapy and autologous stem cell transplant for patients who relapse. The next generation of prognostic models for DLBCL aims to more accurately stratify patients for novel therapies and risk-adapted treatment strategies. This review discusses the significance of host genetic and tumor genomic alterations seen in DLBCL, clinical and epidemiologic factors, and how each can be integrated into risk stratification algorithms. In the future, treatment prediction and prognostic model development and subsequent validation will require data from a large number of DLBCL patients to establish sufficient statistical power to correctly predict outcome. Novel modeling approaches can augment these efforts. PMID:26289217

  7. Next-generation prognostic assessment for diffuse large B-cell lymphoma.

    PubMed

    Staton, Ashley D; Koff, Jean L; Chen, Qiushi; Ayer, Turgay; Flowers, Christopher R

    2015-01-01

    Current standard of care therapy for diffuse large B-cell lymphoma (DLBCL) cures a majority of patients, with additional benefit from salvage therapy and autologous stem cell transplant for patients who relapse. The next generation of prognostic models for DLBCL aims to more accurately stratify patients for novel therapies and risk-adapted treatment strategies. This review discusses the significance of host genetic and tumor genomic alterations seen in DLBCL, clinical and epidemiologic factors, and how each can be integrated into risk stratification algorithms. In the future, treatment prediction and prognostic model development and subsequent validation will require data from a large number of DLBCL patients to establish sufficient statistical power to correctly predict outcome. Novel modeling approaches can augment these efforts.

  8. Statistical Significance of Optical Map Alignments

    PubMed Central

    Sarkar, Deepayan; Goldstein, Steve; Schwartz, David C.

    2012-01-01

    Abstract The Optical Mapping System constructs ordered restriction maps spanning entire genomes through the assembly and analysis of large datasets comprising individually analyzed genomic DNA molecules. Such restriction maps uniquely reveal mammalian genome structure and variation, but also raise computational and statistical questions beyond those that have been solved in the analysis of smaller, microbial genomes. We address the problem of how to filter maps that align poorly to a reference genome. We obtain map-specific thresholds that control errors and improve iterative assembly. We also show how an optimal self-alignment score provides an accurate approximation to the probability of alignment, which is useful in applications seeking to identify structural genomic abnormalities. PMID:22506568

  9. X-ray light curves of active galactic nuclei are phase incoherent

    NASA Technical Reports Server (NTRS)

    Krolik, Julian; Done, Chris; Madejski, Grzegorz

    1993-01-01

    We compute the Fourier phase spectra for the light curves of five low-luminosity active galactic nuclei observed by EXOSAT. There is no statistically significant phase coherence in any of them. This statement is equivalent, subject to a technical caveat, to a demonstration that their fluctuation statistics are Gaussian. Models in which the X-ray output is controlled wholly by a unitary process undergoing a nonlinear limit cycle are therefore ruled out, while models with either a large number of randomly excited independent oscillation modes or nonlinearly interacting spatially dependent oscillations are favored. We also demonstrate how the degree of phase coherence in light curve fluctuations influences the application of causality bounds on internal length scales.

  10. GPU-computing in econophysics and statistical physics

    NASA Astrophysics Data System (ADS)

    Preis, T.

    2011-03-01

    A recent trend in computer science and related fields is general purpose computing on graphics processing units (GPUs), which can yield impressive performance. With multiple cores connected by high memory bandwidth, today's GPUs offer resources for non-graphics parallel processing. This article provides a brief introduction to the field of GPU computing and includes examples. In particular, computationally expensive analyses employed in a financial market context are coded on a graphics card architecture, which leads to a significant reduction of computing time. In order to demonstrate the wide range of possible applications, a standard model in statistical physics - the Ising model - is ported to a graphics card architecture as well, resulting in large speedup values.
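
    As a concrete picture of the statistical-physics example mentioned above, here is a minimal CPU Metropolis sweep for the 2D Ising model; the lattice size and inverse temperature are arbitrary choices, and a GPU port would typically parallelize the updates with a checkerboard decomposition rather than this serial loop.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    L, beta = 64, 0.44                      # lattice size, inverse temperature
    spins = rng.choice([-1, 1], size=(L, L))

    def metropolis_sweep(spins, beta):
        L = spins.shape[0]
        for _ in range(L * L):
            i, j = rng.integers(L, size=2)
            # Sum of the four nearest neighbours (periodic boundaries)
            nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
                  + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
            dE = 2 * spins[i, j] * nb       # energy cost of flipping spin (i, j)
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                spins[i, j] *= -1
        return spins

    for _ in range(100):
        metropolis_sweep(spins, beta)
    print("magnetization per spin:", spins.mean())
    ```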

  11. Color image encryption using random transforms, phase retrieval, chaotic maps, and diffusion

    NASA Astrophysics Data System (ADS)

    Annaby, M. H.; Rushdi, M. A.; Nehary, E. A.

    2018-04-01

    The recent tremendous proliferation of color imaging applications has been accompanied by growing research in data encryption to secure color images against adversary attacks. While recent color image encryption techniques perform reasonably well, they still exhibit vulnerabilities and deficiencies in terms of statistical security measures due to image data redundancy and inherent weaknesses. This paper proposes two encryption algorithms that largely treat these deficiencies and boost the security strength through novel integration of the random fractional Fourier transforms, phase retrieval algorithms, as well as chaotic scrambling and diffusion. We show through detailed experiments and statistical analysis that the proposed enhancements significantly improve security measures and immunity to attacks.

  12. VizieR Online Data Catalog: The ESO DIBs Large Exploration Survey (Cox+, 2017)

    NASA Astrophysics Data System (ADS)

    Cox, N. L. J.; Cami, J.; Farhang, A.; Smoker, J.; Monreal-Ibero, A.; Lallement, R.; Sarre, P. J.; Marshall, C. C. M.; Smith, K. T.; Evans, C. J.; Royer, P.; Linnartz, H.; Cordiner, M. A.; Joblin, C.; van Loon, J. T.; Foing, B. H.; Bhatt, N. H.; Bron, E.; Elyajouri, M.; de Koter, A.; Ehrenfreund, P.; Javadi, A.; Kaper, L.; Khosroshadi, H. G.; Laverick, M.; Le Petit, F.; Mulas, G.; Roueff, E.; Salama, F.; Spaans, M.

    2018-01-01

    We constructed a statistically representative survey sample that probes a wide range of interstellar environment parameters including reddening E(B-V), visual extinction AV, total-to-selective extinction ratio RV, and molecular hydrogen fraction fH2. EDIBLES provides the community with optical (~305-1042nm) spectra at high spectral resolution (R~70000 in the blue arm and 100000 in the red arm) and high signal-to-noise (S/N; median value ~500-1000), for a statistically significant sample of interstellar sightlines. Many of the >100 sightlines included in the survey already have auxiliary available ultraviolet, infrared and/or polarisation data on the dust and gas components. (2 data files).

  13. A Statistical Test of Correlations and Periodicities in the Geological Records

    NASA Astrophysics Data System (ADS)

    Yabushita, S.

    1997-09-01

    Matsumoto & Kubotani argued that there is a positive and statistically significant correlation between cratering and mass extinction. This argument is critically examined by adopting the method of Ertel used by Matsumoto & Kubotani, but applying it more directly to the extinction and cratering records. It is shown that, on the null hypothesis of a random distribution of crater ages, the observed correlation has a probability of occurrence of 13%. However, when large craters are excluded whose ages agree with the times of peaks of the extinction rate of marine fauna, one obtains a negative correlation. This result strongly indicates that mass extinctions are not due to an accumulation of impacts but to isolated gigantic impacts.

  14. Fruit and vegetable intake and risk of breast cancer by hormone receptor status.

    PubMed

    Jung, Seungyoun; Spiegelman, Donna; Baglietto, Laura; Bernstein, Leslie; Boggs, Deborah A; van den Brandt, Piet A; Buring, Julie E; Cerhan, James R; Gaudet, Mia M; Giles, Graham G; Goodman, Gary; Hakansson, Niclas; Hankinson, Susan E; Helzlsouer, Kathy; Horn-Ross, Pamela L; Inoue, Manami; Krogh, Vittorio; Lof, Marie; McCullough, Marjorie L; Miller, Anthony B; Neuhouser, Marian L; Palmer, Julie R; Park, Yikyung; Robien, Kim; Rohan, Thomas E; Scarmo, Stephanie; Schairer, Catherine; Schouten, Leo J; Shikany, James M; Sieri, Sabina; Tsugane, Schoichiro; Visvanathan, Kala; Weiderpass, Elisabete; Willett, Walter C; Wolk, Alicja; Zeleniuch-Jacquotte, Anne; Zhang, Shumin M; Zhang, Xuehong; Ziegler, Regina G; Smith-Warner, Stephanie A

    2013-02-06

    Estrogen receptor-negative (ER(-)) breast cancer has few known or modifiable risk factors. Because ER(-) tumors account for only 15% to 20% of breast cancers, large pooled analyses are necessary to evaluate precisely the suspected inverse association between fruit and vegetable intake and risk of ER(-) breast cancer. Among 993 466 women followed for 11 to 20 years in 20 cohort studies, we documented 19 869 estrogen receptor positive (ER(+)) and 4821 ER(-) breast cancers. We calculated study-specific multivariable relative risks (RRs) and 95% confidence intervals (CIs) using Cox proportional hazards regression analyses and then combined them using a random-effects model. All statistical tests were two-sided. Total fruit and vegetable intake was statistically significantly inversely associated with risk of ER(-) breast cancer but not with risk of breast cancer overall or of ER(+) tumors. The inverse association for ER(-) tumors was observed primarily for vegetable consumption. The pooled relative risks comparing the highest vs lowest quintile of total vegetable consumption were 0.82 (95% CI = 0.74 to 0.90) for ER(-) breast cancer and 1.04 (95% CI = 0.97 to 1.11) for ER(+) breast cancer (P for common effects by ER status < .001). Total fruit consumption was not statistically significantly associated with risk of ER(-) breast cancer (pooled multivariable RR comparing the highest vs lowest quintile = 0.94, 95% CI = 0.85 to 1.04). We observed no association between total fruit and vegetable intake and risk of overall breast cancer. However, vegetable consumption was inversely associated with risk of ER(-) breast cancer in our large pooled analyses.
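
    The "combined them using a random-effects model" step can be illustrated with DerSimonian-Laird pooling of study-specific log relative risks. The input numbers below are invented for illustration; they are not the pooled-cohort data.

    ```python
    import numpy as np

    log_rr = np.array([-0.25, -0.10, -0.30, -0.15])       # study log(RR), made up
    se = np.array([0.10, 0.12, 0.15, 0.08])               # their standard errors

    w = 1.0 / se**2                                       # fixed-effect weights
    ybar = np.sum(w * log_rr) / np.sum(w)
    Q = np.sum(w * (log_rr - ybar) ** 2)                  # heterogeneity statistic
    k = len(log_rr)
    tau2 = max(0.0, (Q - (k - 1)) / (w.sum() - (w**2).sum() / w.sum()))

    w_star = 1.0 / (se**2 + tau2)                         # random-effects weights
    pooled = np.sum(w_star * log_rr) / np.sum(w_star)
    se_pooled = np.sqrt(1.0 / np.sum(w_star))
    lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
    print(f"pooled RR = {np.exp(pooled):.2f} "
          f"(95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f})")
    ```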

  15. Simulating high spatial resolution high severity burned area in Sierra Nevada forests for California Spotted Owl habitat climate change risk assessment and management.

    NASA Astrophysics Data System (ADS)

    Keyser, A.; Westerling, A. L.; Jones, G.; Peery, M. Z.

    2017-12-01

    Sierra Nevada forests have experienced an increase in very large fires with significant areas of high burn severity, such as the Rim (2013) and King (2014) fires, that have impacted habitat of endangered species such as the California spotted owl. In order to support land manager forest management planning and risk assessment activities, we used historical wildfire histories from the Monitoring Trends in Burn Severity project and gridded hydroclimate and land surface characteristics data to develop statistical models to simulate the frequency, location and extent of high severity burned area in Sierra Nevada forest wildfires as functions of climate and land surface characteristics. We define high severity here as BA90 area: the area comprising patches with ninety percent or more basal area killed within a larger fire. We developed a system of statistical models to characterize the probability of large fire occurrence, the probability of significant BA90 area present given a large fire, and the total extent of BA90 area in a fire on a 1/16 degree lat/lon grid over the Sierra Nevada. Repeated draws from binomial and generalized Pareto distributions using these probabilities generated a library of simulated histories of high severity fire for a range of near (50 yr) future climate and fuels management scenarios. Fuels management scenarios were provided by USFS Region 5. Simulated BA90 area was then downscaled to 30 m resolution using a statistical model, developed with Random Forest techniques, to estimate the probability of adjacent 30 m pixels burning with ninety percent basal area kill as a function of fire size and vegetation and topographic features. The result is a library of simulated high-resolution maps of BA90 burned areas for a range of climate and fuels management scenarios, with which we estimated conditional probabilities of owl nesting sites being impacted by high severity wildfire.
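
    A toy version of the two-stage simulation described above (occurrence draws, then extent draws from a generalized Pareto distribution) might look as follows; all probabilities and GPD parameters are invented for illustration.

    ```python
    import numpy as np
    from scipy.stats import genpareto

    rng = np.random.default_rng(42)
    n_cells, n_years = 500, 50
    p_fire = 0.02               # assumed annual probability of a large fire in a cell
    p_ba90 = 0.6                # assumed probability a large fire has significant BA90
    shape, scale = 0.4, 100.0   # assumed GPD parameters (hectares)

    fires = rng.random((n_years, n_cells)) < p_fire
    has_ba90 = fires & (rng.random((n_years, n_cells)) < p_ba90)
    ba90_area = np.where(
        has_ba90,
        genpareto.rvs(shape, scale=scale, size=(n_years, n_cells), random_state=rng),
        0.0,
    )
    print("mean annual BA90 area:", ba90_area.sum(axis=1).mean())
    ```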

  16. Prostate segmentation in MRI using a convolutional neural network architecture and training strategy based on statistical shape models.

    PubMed

    Karimi, Davood; Samei, Golnoosh; Kesch, Claudia; Nir, Guy; Salcudean, Septimiu E

    2018-05-15

    Most of the existing convolutional neural network (CNN)-based medical image segmentation methods are based on methods originally developed for segmentation of natural images. Therefore, they largely ignore the differences between the two domains, such as the smaller degree of variability in the shape and appearance of the target volume and the smaller amounts of training data in medical applications. We propose a CNN-based method for prostate segmentation in MRI that employs statistical shape models to address these issues. Our CNN predicts the location of the prostate center and the parameters of the shape model, which determine the position of prostate surface keypoints. To train such a large model for segmentation of 3D images using small data, (1) we adopt a stage-wise training strategy by first training the network to predict the prostate center and subsequently adding modules for predicting the parameters of the shape model and prostate rotation, (2) we propose a data augmentation method whereby the training images and their prostate surface keypoints are deformed according to the displacements computed based on the shape model, and (3) we employ various regularization techniques. Our proposed method achieves a Dice score of 0.88, which is obtained by using both elastic-net and spectral dropout for regularization. Compared with a standard CNN-based method, our method shows significantly better segmentation performance on the prostate base and apex. Our experiments also show that data augmentation using the shape model significantly improves the segmentation results. Prior knowledge about the shape of the target organ can improve the performance of CNN-based segmentation methods, especially where image features are not sufficient for a precise segmentation. Statistical shape models can also be employed to synthesize additional training data that can ease the training of large CNNs.

  17. Tyrosinase, a new innate humoral immune parameter in large yellow croaker (Pseudosciaena crocea R)

    NASA Astrophysics Data System (ADS)

    Wang, Shuhong; Wang, Yilei; Zhang, Ziping; Xie, Fangjing; Lin, Peng; Tai, Zhengang

    2009-09-01

    We evaluated the immune response to infection with a pathogen in large yellow croaker (Pseudosciaena crocea Richardson). The fish were given an intraperitoneal (i.p.) injection of Vibrio parahaemolyticus or sterile sea water (control). We collected blood sera from the fish 0.17, 1, 2, 4, 8, 12, or 16 d after injection (dpi). We measured tyrosinase activity and the concentrations of lysozyme, NOS, and antibodies. Serum tyrosinase activity was significantly higher at 0.17 and 4 dpi than in the control group, and peaked at 8 dpi. Lysozyme activity was significantly higher at 2 and 12 dpi than in the control group, but lower at 16 dpi. There was no statistically significant difference in the levels of nitric oxide synthase (NOS) activity or antibodies between the control and injection groups. This is the first report of tyrosinase activity in the serum of large yellow croaker. Our results indicate that tyrosinase plays an important role in the immediate immune defense against V. parahaemolyticus in large yellow croaker. Tyrosinase is a candidate parameter for investigation of fish innate immune defense.

  18. Diurnal fluctuations in brain volume: Statistical analyses of MRI from large populations.

    PubMed

    Nakamura, Kunio; Brown, Robert A; Narayanan, Sridar; Collins, D Louis; Arnold, Douglas L

    2015-09-01

    We investigated fluctuations in brain volume throughout the day using statistical modeling of magnetic resonance imaging (MRI) from large populations. We applied fully automated image analysis software to measure the brain parenchymal fraction (BPF), defined as the ratio of the brain parenchymal volume and intracranial volume, thus accounting for variations in head size. The MRI data came from serial scans of multiple sclerosis (MS) patients in clinical trials (n=755, 3269 scans) and from subjects participating in the Alzheimer's Disease Neuroimaging Initiative (ADNI, n=834, 6114 scans). The percent change in BPF was modeled with a linear mixed effect (LME) model, and the model was applied separately to the MS and ADNI datasets. The LME model for the MS datasets included random subject effects (intercept and slope over time) and fixed effects for the time-of-day, time from the baseline scan, and trial, which accounted for trial-related effects (for example, different inclusion criteria and imaging protocol). The model for ADNI additionally included the demographics (baseline age, sex, subject type [normal, mild cognitive impairment, or Alzheimer's disease], and interaction between subject type and time from baseline). There was a statistically significant effect of time-of-day on the BPF change in the MS clinical trial datasets (-0.180 per day, that is, 0.180% of intracranial volume, p=0.019) as well as the ADNI dataset (-0.438 per day, that is, 0.438% of intracranial volume, p<0.0001), showing that the brain volume is greater in the morning. Linearly correcting the BPF values for the time-of-day reduced the sample size required to detect a 25% treatment effect (80% power and 0.05 significance level) on change in brain volume from 2 time-points over a period of 1 year by 2.6%. Our results have significant implications for future brain volumetric studies, suggesting that there is a potential acquisition time bias that should be randomized or statistically controlled to account for the day-to-day brain volume fluctuations. Copyright © 2015 Elsevier Inc. All rights reserved.
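
    A minimal sketch of the LME model described above, fitted with statsmodels on synthetic data; the column names, effect sizes, and noise levels are assumptions, not the study's data.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n_subj, n_scans = 50, 4
    df = pd.DataFrame({
        "subject_id": np.repeat(np.arange(n_subj), n_scans),
        "time_of_day": rng.uniform(0, 0.5, n_subj * n_scans),  # fraction of a day
        "time_from_baseline": np.tile(np.arange(n_scans), n_subj),
    })
    # Synthetic BPF change: small diurnal decline, atrophy, subject offsets, noise
    subj_off = rng.normal(0, 0.05, n_subj)
    df["bpf_pct_change"] = (-0.4 * df["time_of_day"]
                            - 0.1 * df["time_from_baseline"]
                            + subj_off[df["subject_id"]]
                            + rng.normal(0, 0.1, len(df)))

    # Random subject intercept and slope, fixed effects for the two time terms
    model = smf.mixedlm("bpf_pct_change ~ time_of_day + time_from_baseline",
                        data=df, groups=df["subject_id"],
                        re_formula="~time_from_baseline")
    result = model.fit()
    print(result.summary())   # the time_of_day coefficient is the diurnal effect
    ```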

  19. Bayesian statistics in radionuclide metrology: measurement of a decaying source

    NASA Astrophysics Data System (ADS)

    Bochud, François O.; Bailat, Claude J.; Laedermann, Jean-Pascal

    2007-08-01

    The most intuitive way of defining a probability is perhaps through the frequency at which it appears when a large number of trials are realized in identical conditions. The probability derived from the obtained histogram characterizes the so-called frequentist or conventional statistical approach. In this sense, probability is defined as a physical property of the observed system. By contrast, in Bayesian statistics, a probability is not a physical property or a directly observable quantity, but a degree of belief or an element of inference. The goal of this paper is to show how Bayesian statistics can be used in radionuclide metrology and what its advantages and disadvantages are compared with conventional statistics. This is performed through the example of an yttrium-90 source typically encountered in environmental surveillance measurement. Because of the very low activity of this kind of source and the small half-life of the radionuclide, this measurement takes several days, during which the source decays significantly. Several methods are proposed to compute simultaneously the number of unstable nuclei at a given reference time, the decay constant and the background. Asymptotically, all approaches give the same result. However, Bayesian statistics produces coherent estimates and confidence intervals in a much smaller number of measurements. Apart from the conceptual understanding of statistics, the main difficulty that could deter radionuclide metrologists from using Bayesian statistics is the complexity of the computation.
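
    One simple way to implement the Bayesian computation sketched above is a grid posterior over the source activity and background, with Poisson counting statistics and the Y-90 decay constant treated as known (half-life ~64.1 h). The priors, measurement schedule, and true rates below are all illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    lam = np.log(2) / 64.1                        # Y-90 decay constant (per hour)
    t = np.arange(0, 120, 4.0)                    # measurement times (h)
    dt = 4.0
    A0_true, b_true = 3.0, 1.0                    # assumed source and background rates
    counts = rng.poisson((A0_true * np.exp(-lam * t) + b_true) * dt)

    # Flat priors on a grid over (A0, b); posterior from the Poisson likelihood
    A0_grid = np.linspace(0.1, 10, 200)
    b_grid = np.linspace(0.1, 5, 200)
    A0g, bg = np.meshgrid(A0_grid, b_grid, indexing="ij")
    mu = (A0g[..., None] * np.exp(-lam * t) + bg[..., None]) * dt
    loglik = (counts * np.log(mu) - mu).sum(axis=-1)
    post = np.exp(loglik - loglik.max())
    post /= post.sum()

    A0_mean = (post.sum(axis=1) * A0_grid).sum()
    b_mean = (post.sum(axis=0) * b_grid).sum()
    print(f"posterior means: A0 ~ {A0_mean:.2f}, background ~ {b_mean:.2f}")
    ```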

  20. Singular spectrum analysis in nonlinear dynamics, with applications to paleoclimatic time series

    NASA Technical Reports Server (NTRS)

    Vautard, R.; Ghil, M.

    1989-01-01

    Two dimensions of a dynamical system given by experimental time series are distinguished. Statistical dimension gives a theoretical upper bound for the minimal number of degrees of freedom required to describe the attractor up to the accuracy of the data, taking into account sampling and noise problems. The dynamical dimension is the intrinsic dimension of the attractor and does not depend on the quality of the data. Singular Spectrum Analysis (SSA) provides estimates of the statistical dimension. SSA also describes the main physical phenomena reflected by the data. It gives adaptive spectral filters associated with the dominant oscillations of the system and clarifies the noise characteristics of the data. SSA is applied to four paleoclimatic records. The principal climatic oscillations and the regime changes in their amplitude are detected. About 10 degrees of freedom are statistically significant in the data. Large noise and insufficient sample length do not allow reliable estimates of the dynamical dimension.

  1. Tipping points in the arctic: eyeballing or statistical significance?

    PubMed

    Carstensen, Jacob; Weydmann, Agata

    2012-02-01

    Arctic ecosystems have experienced and are projected to experience continued large increases in temperature and declines in sea ice cover. It has been hypothesized that small changes in ecosystem drivers can fundamentally alter ecosystem functioning, and that this might be particularly pronounced for Arctic ecosystems. We present a suite of simple statistical analyses to identify changes in the statistical properties of data, emphasizing that changes in the standard error should be considered in addition to changes in mean properties. The methods are exemplified using sea ice extent, and suggest that the loss rate of sea ice accelerated by a factor of ~5 in 1996, as reported in other studies, but that increases in random fluctuations, an early warning signal, were already observable in 1990. We recommend employing the proposed methods more systematically for analyzing tipping points to document effects of climate change in the Arctic.
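
    In the spirit of the proposed analyses, the sketch below compares the residual variance around a fitted trend before and after a candidate change point with an F-test; the synthetic series and the 1990 split are illustrative assumptions, not the paper's data.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    years = np.arange(1979, 2012)
    # Synthetic "sea ice extent": linear decline with noise that grows after 1990
    noise_sd = np.where(years < 1990, 0.3, 0.6)
    extent = 8.0 - 0.05 * (years - years[0]) + rng.normal(0, noise_sd)

    # Detrend, then test whether late-period residual variance exceeds early-period
    resid = extent - np.polyval(np.polyfit(years, extent, 1), years)
    early, late = resid[years < 1990], resid[years >= 1990]

    F = late.var(ddof=1) / early.var(ddof=1)
    p = stats.f.sf(F, len(late) - 1, len(early) - 1)
    print(f"variance ratio = {F:.2f}, one-sided p = {p:.3f}")
    ```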

  2. Chemical Plume Detection with an Iterative Background Estimation Technique

    DTIC Science & Technology

    2016-05-17

    ...schemes because of contamination of background statistics by the plume. To mitigate the effects of plume contamination, a first pass of the detector... can be used to create a background mask. However, large diffuse plumes are typically not removed by a single pass. Instead, contamination can be... is estimated using plume pixels, the covariance matrix is contaminated and detection performance may be significantly reduced...

  3. Surveying Future Surveys

    NASA Astrophysics Data System (ADS)

    Carlstrom, John E.

    2016-06-01

    The now standard model of cosmology has been tested and refined by the analysis of increasingly sensitive, large astronomical surveys, especially with statistically significant millimeter-wave surveys of the cosmic microwave background and optical surveys of the distribution of galaxies. This talk will offer a glimpse of the future, which promises an acceleration of this trend with cosmological information coming from new surveys across the electromagnetic spectrum as well as particles and even gravitational waves.

  4. Semi-empirical seismic relations of A-F stars from COROT and Kepler legacy data

    NASA Astrophysics Data System (ADS)

    Moya, A.; Suárez, J. C.; García Hernández, A.; Mendoza, M. A.

    2017-10-01

    Asteroseismology is witnessing a revolution, thanks to high-precision asteroseismic space data (MOST, COROT, Kepler, BRITE) and their large ground-based follow-up programs. These instruments have provided an unprecedentedly large amount of information, which allows us to scrutinize its statistical properties in the quest for hidden relations among pulsational and/or physical observables. This approach might be particularly useful for stars whose pulsation content is difficult to interpret. This is the case of intermediate-mass classical pulsating stars (i.e., γ Dor, δ Scuti, and hybrids), for which current theories do not properly predict the observed oscillation spectra. Here, we establish a first step in finding such hidden relations from data mining techniques for these stars. We searched for those hidden relations in a sample of δ Scuti and hybrid stars observed by COROT and Kepler (74 and 153, respectively). No significant correlations between pairs of observables were found. However, two statistically significant correlations emerged from multivariable correlations in the observed seismic data, which describe the total number of observed frequencies and the largest one, respectively. Moreover, three different sets of stars were found to cluster according to their frequency density distribution. Such sets are in apparent agreement with the asteroseismic properties commonly accepted for A-F pulsating stars.

  5. A Large Scale (N=400) Investigation of Gray Matter Differences in Schizophrenia Using Optimized Voxel-based Morphometry

    PubMed Central

    Meda, Shashwath A.; Giuliani, Nicole R.; Calhoun, Vince D.; Jagannathan, Kanchana; Schretlen, David J.; Pulver, Anne; Cascella, Nicola; Keshavan, Matcheri; Kates, Wendy; Buchanan, Robert; Sharma, Tonmoy; Pearlson, Godfrey D.

    2008-01-01

    Background Many studies have employed voxel-based morphometry (VBM) of MRI images as an automated method of investigating cortical gray matter differences in schizophrenia. However, results from these studies vary widely, likely due to different methodological or statistical approaches. Objective To use VBM to investigate gray matter differences in schizophrenia in a sample significantly larger than any published to date, and to increase statistical power sufficiently to reveal differences missed in smaller analyses. Methods Magnetic resonance whole brain images were acquired from four geographic sites, all using the same model 1.5T scanner and software version, and combined to form a sample of 200 patients with both first episode and chronic schizophrenia and 200 healthy controls, matched for age, gender and scanner location. Gray matter concentration was assessed and compared using optimized VBM. Results Compared to the healthy controls, schizophrenia patients showed significantly less gray matter concentration in multiple cortical and subcortical regions, some previously unreported. Overall, we found lower concentrations of gray matter in regions identified in prior studies, most of which reported only subsets of the affected areas. Conclusions Gray matter differences in schizophrenia are most comprehensively elucidated using a large, diverse and representative sample. PMID:18378428

  6. Prevalence of premenstrual syndrome and its relationship to depressive symptoms in first-year university students

    PubMed Central

    Acikgoz, Ayla; Dayi, Ayfer; Binbay, Tolga

    2017-01-01

    Objectives: To determine the prevalence of and factors influencing premenstrual syndrome (PMS) in first-year students at a university health campus and to evaluate the relationship between depression and PMS. Methods: This cross-sectional study was conducted on a population of 618 university students from March to June 2016 at Dokuz Eylül University, Izmir, Turkey. Data were collected using the Premenstrual Syndrome Scale (PMSS), Beck Depression Inventory and Student Identification Form. The data were analyzed with the Statistical Package for the Social Sciences, Version 20.0. Descriptive statistics, Pearson’s chi-square test, the chi-square test for trend, the independent samples t test and logistic regression analysis were used. Results: The prevalence of PMS in the university students was 58.1%. Premenstrual syndrome was significantly higher in students who smoked, drank alcohol, or consumed a large amount of fatty and high-calorie foods, in students who had a bad to very bad perception of their economic situation, and in those who had any chronic disease or anemia (p<0.05). Premenstrual syndrome was significantly higher in students who had a risk of depression (p<0.01). A statistically significant relationship was determined between the risk of depression and the PMSS total score and all PMSS subscale scores except for appetite changes (p<0.01). Conclusion: Premenstrual syndrome was found in more than half of the students who participated in the study. Premenstrual syndrome was higher in students who had a chronic disease and/or an unhealthy lifestyle. There was a statistically significant relationship between PMS and risk of depression. Students who have PMS symptoms should be evaluated for the risk of depression. PMID:29114701

  7. Critical difference applied to exercise-induced salivary testosterone and cortisol using enzyme-linked immunosorbent assay (ELISA): distinguishing biological from statistical change.

    PubMed

    Hayes, Lawrence D; Sculthorpe, Nicholas; Young, John D; Baker, Julien S; Grace, Fergal M

    2014-12-01

    Due to its noninvasive, convenient, and practical nature, salivary testosterone (sal-T) and cortisol (sal-C) are frequently used in clinical and applied settings. However, few studies report biological and analytical error, and even fewer report the 'critical difference', which is the change required before a true biological difference can be claimed. It was hypothesized that (a) exercise would result in a statistically significant change in sal-C and sal-T and (b) the exercise-induced change would be within the critical difference for both salivary hormones. In study 1, we calculated the critical difference of sal-T and sal-C in 18 healthy adult males aged 23.2 ± 3.0 years, sampled every 60 min in a seated position over a 12-h period (08:00-20:00 hours). As proof of concept, sal-C and sal-T were also obtained before and at 5 and 60 min after a maximal exercise protocol in a separate group of 17 healthy males aged 20.1 ± 2.8 years (study 2). The critical difference of sal-T was calculated as 90%. For sal-C, the critical difference was 148% (study 1). Maximal exercise was associated with statistically significant (p < 0.05) changes in sal-T and sal-C. However, these changes were all within the critical difference range. Results from this investigation indicate that a large magnitude of change for sal-C and sal-T is required before a biologically significant mean change can be claimed. Studies utilizing sal-T and sal-C should appreciate the critical difference of these measures and assess the biological significance of any statistical changes.
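
    The critical difference (often called the reference change value) is conventionally computed as RCV = sqrt(2) * z * sqrt(CV_A^2 + CV_I^2), combining analytical (CV_A) and within-subject biological (CV_I) coefficients of variation. A minimal sketch with assumed CVs follows; the paper's measured inputs are not reproduced here.

    ```python
    import math

    def critical_difference(cv_analytical, cv_within_subject, z=1.96):
        """Reference change value in %, for two measurements at 95% probability."""
        return math.sqrt(2) * z * math.sqrt(cv_analytical**2 + cv_within_subject**2)

    # Illustrative CVs only: values of roughly this size reproduce a threshold
    # near the ~90% reported for salivary testosterone.
    print(f"critical difference: {critical_difference(10.0, 30.0):.0f}%")
    ```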

  8. Efficient Transition Probability Computation for Continuous-Time Branching Processes via Compressed Sensing.

    PubMed

    Xu, Jason; Minin, Vladimir N

    2015-07-01

    Branching processes are a class of continuous-time Markov chains (CTMCs) with ubiquitous applications. A general difficulty in statistical inference under partially observed CTMC models arises in computing transition probabilities when the discrete state space is large or uncountable. Classical methods such as matrix exponentiation are infeasible for large or countably infinite state spaces, and sampling-based alternatives are computationally intensive, requiring integration over all possible hidden events. Recent work has successfully applied generating function techniques to computing transition probabilities for linear multi-type branching processes. While these techniques often require significantly fewer computations than matrix exponentiation, they also become prohibitive in applications with large populations. We propose a compressed sensing framework that significantly accelerates the generating function method, decreasing computational cost up to a logarithmic factor by only assuming the probability mass of transitions is sparse. We demonstrate accurate and efficient transition probability computations in branching process models for blood cell formation and evolution of self-replicating transposable elements in bacterial genomes.

  9. Efficient Transition Probability Computation for Continuous-Time Branching Processes via Compressed Sensing

    PubMed Central

    Xu, Jason; Minin, Vladimir N.

    2016-01-01

    Branching processes are a class of continuous-time Markov chains (CTMCs) with ubiquitous applications. A general difficulty in statistical inference under partially observed CTMC models arises in computing transition probabilities when the discrete state space is large or uncountable. Classical methods such as matrix exponentiation are infeasible for large or countably infinite state spaces, and sampling-based alternatives are computationally intensive, requiring integration over all possible hidden events. Recent work has successfully applied generating function techniques to computing transition probabilities for linear multi-type branching processes. While these techniques often require significantly fewer computations than matrix exponentiation, they also become prohibitive in applications with large populations. We propose a compressed sensing framework that significantly accelerates the generating function method, decreasing computational cost up to a logarithmic factor by only assuming the probability mass of transitions is sparse. We demonstrate accurate and efficient transition probability computations in branching process models for blood cell formation and evolution of self-replicating transposable elements in bacterial genomes. PMID:26949377

  10. Discrete-element modeling of nacre-like materials: Effects of random microstructures on strain localization and mechanical performance

    NASA Astrophysics Data System (ADS)

    Abid, Najmul; Mirkhalaf, Mohammad; Barthelat, Francois

    2018-03-01

    Natural materials such as nacre, collagen, and spider silk are composed of staggered stiff and strong inclusions in a softer matrix. This type of hybrid microstructure results in remarkable combinations of stiffness, strength, and toughness and it now inspires novel classes of high-performance composites. However, the analytical and numerical approaches used to predict and optimize the mechanics of staggered composites often neglect statistical variations and inhomogeneities, which may have significant impacts on modulus, strength, and toughness. Here we present an analysis of localization using small representative volume elements (RVEs) and large scale statistical volume elements (SVEs) based on the discrete element method (DEM). DEM is an efficient numerical method which enabled the evaluation of more than 10,000 microstructures in this study, each including about 5,000 inclusions. The models explore the combined effects of statistics, inclusion arrangement, and interface properties. We find that statistical variations have a negative effect on all properties, in particular on the ductility and energy absorption because randomness precipitates the localization of deformations. However, the results also show that the negative effects of random microstructures can be offset by interfaces with large strain at failure accompanied by strain hardening. More specifically, this quantitative study reveals an optimal range of interface properties where the interfaces are the most effective at delaying localization. These findings show how carefully designed interfaces in bioinspired staggered composites can offset the negative effects of microstructural randomness, which is inherent to most current fabrication methods.

  11. Computation of large-scale statistics in decaying isotropic turbulence

    NASA Technical Reports Server (NTRS)

    Chasnov, Jeffrey R.

    1993-01-01

    We have performed large-eddy simulations of decaying isotropic turbulence to test the prediction of self-similar decay of the energy spectrum and to compute the decay exponents of the kinetic energy. In general, good agreement between the simulation results and the assumption of self-similarity was obtained. However, the statistics of the simulations were insufficient to compute the value of gamma which corrects the decay exponent when the spectrum follows a k(exp 4) wave number behavior near k = 0. To obtain good statistics, it was found necessary to average over a large ensemble of turbulent flows.

  12. Analyzing Large Gene Expression and Methylation Data Profiles Using StatBicRM: Statistical Biclustering-Based Rule Mining

    PubMed Central

    Maulik, Ujjwal; Mallik, Saurav; Mukhopadhyay, Anirban; Bandyopadhyay, Sanghamitra

    2015-01-01

    Microarray and beadchip are two of the most efficient techniques for measuring gene expression and methylation data in bioinformatics. Biclustering deals with the simultaneous clustering of genes and samples. In this article, we propose a computational rule mining framework, StatBicRM (i.e., statistical biclustering-based rule mining), to identify special types of rules and potential biomarkers using integrated statistical and binary inclusion-maximal biclustering techniques on biological datasets. At first, a novel statistical strategy is utilized to eliminate insignificant, low-significance, and redundant genes in such a way that the significance level satisfies the data distribution property (viz., either normal or non-normal distribution). The data are then discretized and post-discretized, consecutively. Thereafter, the biclustering technique is applied to identify maximal frequent closed homogeneous itemsets. The corresponding special types of rules are then extracted from the selected itemsets. Our proposed rule mining method performs better than other rule mining algorithms because it generates maximal frequent closed homogeneous itemsets instead of frequent itemsets. Thus, it saves elapsed time and can work on big datasets. Pathway and Gene Ontology analyses are conducted on the genes of the evolved rules using the DAVID database. Frequency analysis of the genes appearing in the evolved rules is performed to determine potential biomarkers. Furthermore, we also classify the data to determine how accurately the evolved rules are able to describe the remaining test (unknown) data. Subsequently, we also compare the average classification accuracy and other related factors with those of other rule-based classifiers. Statistical significance tests are also performed to verify the statistical relevance of the comparative results. Here, each of the other rule mining methods or rule-based classifiers also starts from the same post-discretized data matrix. Finally, we have also included an integrated analysis of gene expression and methylation to determine the epigenetic effect (viz., the effect of methylation) on gene expression level. PMID:25830807

  13. Analyzing large gene expression and methylation data profiles using StatBicRM: statistical biclustering-based rule mining.

    PubMed

    Maulik, Ujjwal; Mallik, Saurav; Mukhopadhyay, Anirban; Bandyopadhyay, Sanghamitra

    2015-01-01

    Microarray and beadchip are two of the most efficient techniques for measuring gene expression and methylation data in bioinformatics. Biclustering deals with the simultaneous clustering of genes and samples. In this article, we propose a computational rule mining framework, StatBicRM (i.e., statistical biclustering-based rule mining), to identify special types of rules and potential biomarkers using integrated statistical and binary inclusion-maximal biclustering techniques on biological datasets. At first, a novel statistical strategy is utilized to eliminate insignificant, low-significance, and redundant genes in such a way that the significance level satisfies the data distribution property (viz., either normal or non-normal distribution). The data are then discretized and post-discretized, consecutively. Thereafter, the biclustering technique is applied to identify maximal frequent closed homogeneous itemsets. The corresponding special types of rules are then extracted from the selected itemsets. Our proposed rule mining method performs better than other rule mining algorithms because it generates maximal frequent closed homogeneous itemsets instead of frequent itemsets. Thus, it saves elapsed time and can work on big datasets. Pathway and Gene Ontology analyses are conducted on the genes of the evolved rules using the DAVID database. Frequency analysis of the genes appearing in the evolved rules is performed to determine potential biomarkers. Furthermore, we also classify the data to determine how accurately the evolved rules are able to describe the remaining test (unknown) data. Subsequently, we also compare the average classification accuracy and other related factors with those of other rule-based classifiers. Statistical significance tests are also performed to verify the statistical relevance of the comparative results. Here, each of the other rule mining methods or rule-based classifiers also starts from the same post-discretized data matrix. Finally, we have also included an integrated analysis of gene expression and methylation to determine the epigenetic effect (viz., the effect of methylation) on gene expression level.

  14. Replicability of time-varying connectivity patterns in large resting state fMRI samples.

    PubMed

    Abrol, Anees; Damaraju, Eswar; Miller, Robyn L; Stephen, Julia M; Claus, Eric D; Mayer, Andrew R; Calhoun, Vince D

    2017-12-01

    The past few years have seen an emergence of approaches that leverage temporal changes in whole-brain patterns of functional connectivity (the chronnectome). In this chronnectome study, we investigate the replicability of the human brain's inter-regional coupling dynamics during rest by evaluating two different dynamic functional network connectivity (dFNC) analysis frameworks using 7 500 functional magnetic resonance imaging (fMRI) datasets. To quantify the extent to which the emergent functional connectivity (FC) patterns are reproducible, we characterize the temporal dynamics by deriving several summary measures across multiple large, independent age-matched samples. Reproducibility was demonstrated through the existence of basic connectivity patterns (FC states) amidst an ensemble of inter-regional connections. Furthermore, application of the methods to conservatively configured (statistically stationary, linear and Gaussian) surrogate datasets revealed that some of the studied state summary measures were indeed statistically significant and also suggested that this class of null model did not explain the fMRI data fully. This extensive testing of reproducibility of similarity statistics also suggests that the estimated FC states are robust against variation in data quality, analysis, grouping, and decomposition methods. We conclude that future investigations probing the functional and neurophysiological relevance of time-varying connectivity assume critical importance. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
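
    A common dFNC pipeline consistent with the description above is sliding-window correlation followed by k-means clustering of the windowed connectivity patterns into FC states. The sketch below uses synthetic component time courses; the window length, step, and number of states are illustrative choices, not the paper's settings.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    n_t, n_comp = 400, 20                 # timepoints, components (synthetic)
    ts = rng.normal(size=(n_t, n_comp))

    win, step = 44, 1
    iu = np.triu_indices(n_comp, k=1)
    windows = []
    for start in range(0, n_t - win + 1, step):
        c = np.corrcoef(ts[start:start + win].T)
        windows.append(c[iu])             # vectorized upper triangle of the FC matrix
    windows = np.array(windows)

    k = 5
    states = KMeans(n_clusters=k, n_init=10, random_state=0).fit(windows)
    # Example summary measure: fraction of time spent in each FC state
    occupancy = np.bincount(states.labels_, minlength=k) / len(states.labels_)
    print("state occupancy:", occupancy.round(2))
    ```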

  15. Replicability of time-varying connectivity patterns in large resting state fMRI samples

    PubMed Central

    Abrol, Anees; Damaraju, Eswar; Miller, Robyn L.; Stephen, Julia M.; Claus, Eric D.; Mayer, Andrew R.; Calhoun, Vince D.

    2018-01-01

    The past few years have seen an emergence of approaches that leverage temporal changes in whole-brain patterns of functional connectivity (the chronnectome). In this chronnectome study, we investigate the replicability of the human brain’s inter-regional coupling dynamics during rest by evaluating two different dynamic functional network connectivity (dFNC) analysis frameworks using 7 500 functional magnetic resonance imaging (fMRI) datasets. To quantify the extent to which the emergent functional connectivity (FC) patterns are reproducible, we characterize the temporal dynamics by deriving several summary measures across multiple large, independent age-matched samples. Reproducibility was demonstrated through the existence of basic connectivity patterns (FC states) amidst an ensemble of inter-regional connections. Furthermore, application of the methods to conservatively configured (statistically stationary, linear and Gaussian) surrogate datasets revealed that some of the studied state summary measures were indeed statistically significant and also suggested that this class of null model did not explain the fMRI data fully. This extensive testing of reproducibility of similarity statistics also suggests that the estimated FC states are robust against variation in data quality, analysis, grouping, and decomposition methods. We conclude that future investigations probing the functional and neurophysiological relevance of time-varying connectivity assume critical importance. PMID:28916181

  16. 77 FR 18304 - Agency Information Collection; Activity Under OMB Review; Report of Financial and Operating...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-27

    DEPARTMENT OF TRANSPORTATION, Research & Innovative Technology Administration [Docket ID Number...]. AGENCY: Research & Innovative Technology Administration. Title: Report of Financial and Operating Statistics for Large Certificated Air Carriers. Form No.: BTS Form 41. Type of Review: ...

  17. A simulation study of the strength of evidence in the recommendation of medications based on two trials with statistically significant results

    PubMed Central

    Ioannidis, John P. A.

    2017-01-01

    A typical rule that has been used for the endorsement of new medications by the Food and Drug Administration is to have two trials, each convincing on its own, demonstrating effectiveness. “Convincing” may be subjectively interpreted, but the use of p-values and the focus on statistical significance (in particular with p < .05 being coined significant) is pervasive in clinical research. Therefore, in this paper, we calculate with simulations what it means to have exactly two trials, each with p < .05, in terms of the actual strength of evidence quantified by Bayes factors. Our results show that different cases where two trials have a p-value below .05 have wildly differing Bayes factors. Bayes factors of at least 20 in favor of the alternative hypothesis are not necessarily achieved and they fail to be reached in a large proportion of cases, in particular when the true effect size is small (0.2 standard deviations) or zero. In a non-trivial number of cases, evidence actually points to the null hypothesis, in particular when the true effect size is zero, when the number of trials is large, and when the number of participants in both groups is low. We recommend use of Bayes factors as a routine tool to assess endorsement of new medications, because Bayes factors consistently quantify strength of evidence. Use of p-values may lead to paradoxical and spurious decision-making regarding the use of new medications. PMID:28273140
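
    The strength-of-evidence calculation can be illustrated with the two-sample JZS Bayes factor of Rouder et al. (2009), evaluated by numerical integration; whether this matches the paper's exact prior settings is an assumption, and the t and group sizes below are illustrative.

    ```python
    import numpy as np
    from scipy import integrate

    def jzs_bf10(t, n1, n2, r=np.sqrt(2) / 2):
        """BF10 for a two-sample t statistic under the JZS prior (scale r)."""
        nu = n1 + n2 - 2
        N = n1 * n2 / (n1 + n2)           # effective sample size

        def integrand(g):
            return ((1 + N * g * r**2) ** -0.5
                    * (1 + t**2 / ((1 + N * g * r**2) * nu)) ** (-(nu + 1) / 2)
                    * (2 * np.pi) ** -0.5 * g ** -1.5 * np.exp(-1 / (2 * g)))

        marginal_h1, _ = integrate.quad(integrand, 0, np.inf)
        marginal_h0 = (1 + t**2 / nu) ** (-(nu + 1) / 2)
        return marginal_h1 / marginal_h0

    # A trial that is "just significant" (t ~ 2.0, n = 50 per arm) yields only
    # weak evidence, illustrating the paper's point about p < .05 pairs.
    print(f"BF10 = {jzs_bf10(2.0, 50, 50):.2f}")
    ```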

  18. [The problem of small "n" and big "P" in neuropsycho-pharmacology, or how to keep the rate of false discoveries under control].

    PubMed

    Petschner, Péter; Bagdy, György; Tóthfalusi, Laszló

    2015-03-01

    One of the characteristics of many methods used in neuropsychopharmacology is that a large number of parameters (P) are measured in relatively few subjects (n). Functional magnetic resonance imaging, electroencephalography (EEG) and genomic studies are typical examples. For example, one microarray chip can contain thousands of probes. Therefore, in studies using microarray chips, P may be several thousand-fold larger than n. Statistical analysis of such studies is a challenging task, and they are referred to in the statistical literature as small "n", big "P" problems. The problem has many facets, including the controversies associated with multiple hypothesis testing. A typical scenario in this context is when two or more groups are compared by their individual attributes. If the increased classification error due to multiple testing is neglected, then several highly significant differences will be discovered. But in reality, some of these significant differences are coincidental, not reproducible findings. Several methods have been proposed to solve this problem. In this review we discuss two of the proposed solutions: algorithms to compare sets, and statistical hypothesis tests controlling the false discovery rate.
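
    The standard tool for the second proposed solution is the Benjamini-Hochberg step-up procedure; a minimal implementation, with made-up p-values for illustration:

    ```python
    import numpy as np

    def benjamini_hochberg(pvals, q=0.05):
        """Return a boolean mask of hypotheses rejected at FDR level q."""
        p = np.asarray(pvals)
        m = len(p)
        order = np.argsort(p)
        thresholds = q * np.arange(1, m + 1) / m
        passed = p[order] <= thresholds
        # Reject everything up to the largest rank whose p-value passes
        n_sig = passed.nonzero()[0].max() + 1 if passed.any() else 0
        rejected = np.zeros(m, dtype=bool)
        rejected[order[:n_sig]] = True
        return rejected

    pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.5]
    print(benjamini_hochberg(pvals))   # which hypotheses survive FDR control
    ```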

  19. Feature selection from a facial image for distinction of sasang constitution.

    PubMed

    Koo, Imhoi; Kim, Jong Yeol; Kim, Myoung Geun; Kim, Keun Ho

    2009-09-01

    Recently, oriental medicine has received attention for providing personalized medicine through consideration of the unique nature and constitution of individual patients. With the eventual goal of globalization, the current trend in oriental medicine research is the standardization by adopting western scientific methods, which could represent a scientific revolution. The purpose of this study is to establish methods for finding statistically significant features in a facial image with respect to distinguishing constitution and to show the meaning of those features. From facial photo images, facial elements are analyzed in terms of the distance, angle and the distance ratios, for which there are 1225, 61 250 and 749 700 features, respectively. Due to the very large number of facial features, it is quite difficult to determine truly meaningful features. We suggest a process for the efficient analysis of facial features including the removal of outliers, control for missing data to guarantee data confidence and calculation of statistical significance by applying ANOVA. We show the statistical properties of selected features according to different constitutions using the nine distances, 10 angles and 10 rates of distance features that are finally established. Additionally, the Sasang constitutional meaning of the selected features is shown here.
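
    The ANOVA screening step described above amounts to a per-feature one-way ANOVA across constitution groups followed by thresholding. A sketch on synthetic data follows; the group sizes, mean shift, and threshold are assumptions.

    ```python
    import numpy as np
    from scipy.stats import f_oneway

    rng = np.random.default_rng(5)
    n_per_group, n_features = 40, 200
    # Three synthetic "constitution" groups; the third is shifted on all features
    groups = [rng.normal(loc=mu, size=(n_per_group, n_features))
              for mu in (0.0, 0.0, 0.3)]

    pvals = np.array([f_oneway(*(g[:, j] for g in groups)).pvalue
                      for j in range(n_features)])
    selected = np.where(pvals < 0.001)[0]   # conservative threshold for many features
    print(f"{len(selected)} of {n_features} features selected")
    ```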

  20. Point and interval estimation of pollinator importance: a study using pollination data of Silene caroliniana.

    PubMed

    Reynolds, Richard J; Fenster, Charles B

    2008-05-01

    Pollinator importance, the product of visitation rate and pollinator effectiveness, is a descriptive parameter of the ecology and evolution of plant-pollinator interactions. Naturally, sources of its variation should be investigated, but the SE of pollinator importance has never been properly reported. Here, a Monte Carlo simulation study and a result from mathematical statistics on the variance of the product of two random variables are used to estimate the mean and confidence limits of pollinator importance for three visitor species of the wildflower, Silene caroliniana. Both methods provided similar estimates of mean pollinator importance and its interval if the sample size of the visitation and effectiveness datasets were comparatively large. These approaches allowed us to determine that bumblebee importance was significantly greater than clearwing hawkmoth, which was significantly greater than beefly. The methods could be used to statistically quantify temporal and spatial variation in pollinator importance of particular visitor species. The approaches may be extended for estimating the variance of more than two random variables. However, unless the distribution function of the resulting statistic is known, the simulation approach is preferable for calculating the parameter's confidence limits.
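
    For independent random variables the variance of the product has the exact form Var(XY) = mu_x^2 sigma_y^2 + mu_y^2 sigma_x^2 + sigma_x^2 sigma_y^2 (Goodman, 1960), which can be checked against Monte Carlo draws as below; the visitation and effectiveness moments are invented for illustration, not the Silene caroliniana data.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    mu_x, sd_x = 4.0, 1.0      # visits per hour (assumed)
    mu_y, sd_y = 0.3, 0.1      # effectiveness per visit (assumed)

    # Exact variance of the product of two independent random variables
    var_I = mu_x**2 * sd_y**2 + mu_y**2 * sd_x**2 + sd_x**2 * sd_y**2

    x = rng.normal(mu_x, sd_x, 100_000)
    y = rng.normal(mu_y, sd_y, 100_000)
    mc = x * y
    print(f"analytic SD {np.sqrt(var_I):.3f} vs Monte Carlo SD {mc.std():.3f}")
    print(f"approx 95% CI for importance: "
          f"{mu_x * mu_y - 1.96 * np.sqrt(var_I):.2f} to "
          f"{mu_x * mu_y + 1.96 * np.sqrt(var_I):.2f}")
    ```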

  1. Water quality analysis of the Rapur area, Andhra Pradesh, South India using multivariate techniques

    NASA Astrophysics Data System (ADS)

    Nagaraju, A.; Sreedhar, Y.; Thejaswi, A.; Sayadi, Mohammad Hossein

    2017-10-01

    The groundwater samples from the Rapur area were collected from different sites to evaluate the major ion chemistry. Such a large volume of data can lead to difficulties in the integration, interpretation, and representation of the results. Two multivariate statistical methods, hierarchical cluster analysis (HCA) and factor analysis (FA), were applied to evaluate their usefulness for classifying and identifying the geochemical processes controlling groundwater geochemistry. Four statistically significant clusters were obtained from 30 sampling stations. This resulted in two important clusters, viz., cluster 1 (pH, Si, CO3, Mg, SO4, Ca, K, HCO3, alkalinity, Na, Na + K, Cl, and hardness) and cluster 2 (EC and TDS), which are released to the study area from different sources. The application of different multivariate statistical techniques, such as principal component analysis (PCA), assists in the interpretation of complex data matrices for a better understanding of the water quality of a study area. From the PCA, it is clear that the first factor (factor 1), which accounted for 36.2% of the total variance, had high positive loadings on EC, Mg, Cl, TDS, and hardness. Based on the PCA scores, four significant cluster groups of sampling locations were detected on the basis of the similarity of their water quality.
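
    A generic sketch of the two multivariate steps named above (Ward hierarchical clustering and PCA), run on a synthetic stations-by-parameters matrix; note the paper clusters hydrochemical parameters as well as stations, while this sketch clusters stations only.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(4)
    X = rng.normal(size=(30, 15))                    # 30 stations x 15 parameters
    Xs = StandardScaler().fit_transform(X)           # standardize before HCA/PCA

    Z = linkage(Xs, method="ward")
    clusters = fcluster(Z, t=4, criterion="maxclust")  # four station clusters

    pca = PCA(n_components=4)
    scores = pca.fit_transform(Xs)
    print("variance explained:", pca.explained_variance_ratio_.round(2))
    print("cluster sizes:", np.bincount(clusters)[1:])
    ```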

  2. A study of correlations between crude oil spot and futures markets: A rolling sample test

    NASA Astrophysics Data System (ADS)

    Liu, Li; Wan, Jieqiu

    2011-10-01

    In this article, we investigate the asymmetries of exceedance correlations and cross-correlations between West Texas Intermediate (WTI) spot and futures markets. First, employing the test statistic proposed by Hong et al. [Asymmetries in stock returns: statistical tests and economic evaluation, Review of Financial Studies 20 (2007) 1547-1581], we find that the exceedance correlations were overall symmetric. However, the results from rolling windows show that some occasional events could induce significant asymmetries in the exceedance correlations. Second, employing the test statistic proposed by Podobnik et al. [Quantifying cross-correlations using local and global detrending approaches, European Physical Journal B 71 (2009) 243-250], we find that the cross-correlations were significant even for large lagged orders. Using the detrended cross-correlation analysis proposed by Podobnik and Stanley [Detrended cross-correlation analysis: a new method for analyzing two nonstationary time series, Physical Review Letters 100 (2008) 084102], we find that the cross-correlations were weakly persistent and were stronger between the spot and futures contracts with larger maturity. Our results from the rolling sample test also show the apparent effects of exogenous events. Additionally, we offer some relevant discussion of the obtained evidence.
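
    A compact implementation of detrended cross-correlation analysis in the form of the DCCA cross-correlation coefficient, rho(s) = F2_xy(s) / (F_x(s) * F_y(s)); the box size and the synthetic spot/futures series below are illustrative, not the WTI data.

    ```python
    import numpy as np

    def dcca_coefficient(x, y, s):
        """DCCA cross-correlation coefficient at box size s (linear detrending)."""
        X = np.cumsum(x - x.mean())          # integrated profiles
        Y = np.cumsum(y - y.mean())
        n_boxes = len(X) // s
        f2_xy = f2_x = f2_y = 0.0
        t = np.arange(s)
        for b in range(n_boxes):
            seg = slice(b * s, (b + 1) * s)
            # Local linear detrending within each non-overlapping box
            rx = X[seg] - np.polyval(np.polyfit(t, X[seg], 1), t)
            ry = Y[seg] - np.polyval(np.polyfit(t, Y[seg], 1), t)
            f2_xy += (rx * ry).mean()
            f2_x += (rx ** 2).mean()
            f2_y += (ry ** 2).mean()
        return (f2_xy / n_boxes) / np.sqrt((f2_x / n_boxes) * (f2_y / n_boxes))

    rng = np.random.default_rng(2)
    common = rng.normal(size=2000)           # shared driver of the two series
    spot = common + 0.5 * rng.normal(size=2000)
    futures = common + 0.5 * rng.normal(size=2000)
    print("rho_DCCA(s=50):", round(dcca_coefficient(spot, futures, 50), 3))
    ```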

  3. Feature Selection from a Facial Image for Distinction of Sasang Constitution

    PubMed Central

    Koo, Imhoi; Kim, Jong Yeol; Kim, Myoung Geun

    2009-01-01

    Recently, oriental medicine has received attention for providing personalized medicine through consideration of the unique nature and constitution of individual patients. With the eventual goal of globalization, the current trend in oriental medicine research is the standardization by adopting western scientific methods, which could represent a scientific revolution. The purpose of this study is to establish methods for finding statistically significant features in a facial image with respect to distinguishing constitution and to show the meaning of those features. From facial photo images, facial elements are analyzed in terms of the distance, angle and the distance ratios, for which there are 1225, 61 250 and 749 700 features, respectively. Due to the very large number of facial features, it is quite difficult to determine truly meaningful features. We suggest a process for the efficient analysis of facial features including the removal of outliers, control for missing data to guarantee data confidence and calculation of statistical significance by applying ANOVA. We show the statistical properties of selected features according to different constitutions using the nine distances, 10 angles and 10 rates of distance features that are finally established. Additionally, the Sasang constitutional meaning of the selected features is shown here. PMID:19745013

  4. Distinct polymer physics principles govern chromatin dynamics in mouse and Drosophila topological domains.

    PubMed

    Ea, Vuthy; Sexton, Tom; Gostan, Thierry; Herviou, Laurie; Baudement, Marie-Odile; Zhang, Yunzhe; Berlivet, Soizik; Le Lay-Taha, Marie-Noëlle; Cathala, Guy; Lesne, Annick; Victor, Jean-Marc; Fan, Yuhong; Cavalli, Giacomo; Forné, Thierry

    2015-08-15

    In higher eukaryotes, the genome is partitioned into large "Topologically Associating Domains" (TADs) in which the chromatin displays favoured long-range contacts. While a crumpled/fractal globule organization has received experimental support at higher-order levels, the organization principles that govern chromatin dynamics within these TADs remain unclear. Using simple polymer models, we previously showed that, in mouse liver cells, gene-rich domains tend to adopt a statistical helix shape when no significant locus-specific interaction takes place. Here, we use data from diverse 3C-derived methods to explore chromatin dynamics within mouse and Drosophila TADs. In mouse Embryonic Stem Cells (mESC), which possess large TADs (median size of 840 kb), we show that the statistical helix model, but not globule models, is relevant not only in gene-rich TADs, but also in gene-poor and gene-desert TADs. Interestingly, this statistical helix organization is considerably relaxed in mESC compared to liver cells, indicating that the impact of the constraints responsible for this organization is weaker in pluripotent cells. Finally, depletion of histone H1 in mESC alters local chromatin flexibility but not the statistical helix organization. In Drosophila, which possesses TADs of smaller sizes (median size of 70 kb), we show that, while chromatin compaction and flexibility are finely tuned according to the epigenetic landscape, chromatin dynamics within TADs are generally compatible with an unconstrained polymer configuration. Models issued from polymer physics can accurately describe the organization principles governing chromatin dynamics in both mouse and Drosophila TADs. However, the constraints applied to these dynamics within mammalian TADs have a peculiar impact, resulting in a statistical helix organization.

  5. Ensemble of Thermostatically Controlled Loads: Statistical Physics Approach.

    PubMed

    Chertkov, Michael; Chernyak, Vladimir

    2017-08-17

    Thermostatically controlled loads, e.g., air conditioners and heaters, are by far the most widespread consumers of electricity. Normally the devices are calibrated to provide the so-called bang-bang control - changing from on to off, and vice versa, depending on temperature. We considered aggregation of a large group of similar devices into a statistical ensemble, where the devices operate following the same dynamics, subject to stochastic perturbations and a randomized, Poisson on/off switching policy. Using theoretical and computational tools of statistical physics, we analyzed how the ensemble relaxes to a stationary distribution and established a relationship between the relaxation and the statistics of the probability flux associated with devices' cycling in the mixed (discrete, switch on/off, and continuous, temperature) phase space. This allowed us to derive the spectrum of the non-equilibrium (detailed balance broken) statistical system and uncover how the switching policy affects oscillatory trends and the speed of the relaxation. Relaxation of the ensemble is of practical interest because it describes how the ensemble recovers from significant perturbations, e.g., forced temporary switching off aimed at utilizing the flexibility of the ensemble to provide "demand response" services that temporarily change consumption to balance a larger power grid. We discuss how the statistical analysis can guide further development of the emerging demand response technology.

  6. Ensemble of Thermostatically Controlled Loads: Statistical Physics Approach

    DOE PAGES

    Chertkov, Michael; Chernyak, Vladimir

    2017-01-17

    Thermostatically Controlled Loads (TCL), e.g. air-conditioners and heaters, are by far the most wide-spread consumers of electricity. Normally the devices are calibrated to provide the so-called bang-bang control of temperature - changing from on to off, and vice versa, depending on temperature. Aggregation of a large group of similar devices into a statistical ensemble is considered, where the devices operate following the same dynamics subject to stochastic perturbations and randomized, Poisson on/off switching policy. We analyze, using theoretical and computational tools of statistical physics, how the ensemble relaxes to a stationary distribution and establish a relation between the relaxation and statistics of the probability flux associated with devices' cycling in the mixed (discrete, switch on/off, and continuous, temperature) phase space. This allowed us to derive and analyze the spectrum of the non-equilibrium (detailed balance broken) statistical system and uncover how switching policy affects oscillatory trends and the speed of the relaxation. Relaxation of the ensemble is of practical interest because it describes how the ensemble recovers from significant perturbations, e.g. forceful temporary switching off aimed at utilizing flexibility of the ensemble in providing "demand response" services relieving consumption temporarily to balance a larger power grid. We discuss how the statistical analysis can guide further development of the emerging demand response technology.

  7. Ensemble of Thermostatically Controlled Loads: Statistical Physics Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chertkov, Michael; Chernyak, Vladimir

    Thermostatically Controlled Loads (TCL), e.g. air-conditioners and heaters, are by far the most wide-spread consumers of electricity. Normally the devices are calibrated to provide the so-called bang-bang control of temperature - changing from on to off, and vice versa, depending on temperature. Aggregation of a large group of similar devices into a statistical ensemble is considered, where the devices operate following the same dynamics subject to stochastic perturbations and randomized, Poisson on/off switching policy. We analyze, using theoretical and computational tools of statistical physics, how the ensemble relaxes to a stationary distribution and establish a relation between the relaxation and statistics of the probability flux associated with devices' cycling in the mixed (discrete, switch on/off, and continuous, temperature) phase space. This allowed us to derive and analyze the spectrum of the non-equilibrium (detailed balance broken) statistical system and uncover how switching policy affects oscillatory trends and the speed of the relaxation. Relaxation of the ensemble is of practical interest because it describes how the ensemble recovers from significant perturbations, e.g. forceful temporary switching off aimed at utilizing flexibility of the ensemble in providing "demand response" services relieving consumption temporarily to balance a larger power grid. We discuss how the statistical analysis can guide further development of the emerging demand response technology.

  8. Quantum probability, choice in large worlds, and the statistical structure of reality.

    PubMed

    Ross, Don; Ladyman, James

    2013-06-01

    Classical probability models of incentive response are inadequate in "large worlds," where the dimensions of relative risk and the dimensions of similarity in outcome comparisons typically differ. Quantum probability models for choice in large worlds may be motivated pragmatically - there is no third theory - or metaphysically: statistical processing in the brain adapts to the true scale-relative structure of the universe.

  9. Assessment of Reliable Change Using 95% Credible Intervals for the Differences in Proportions: A Statistical Analysis for Case-Study Methodology.

    PubMed

    Unicomb, Rachael; Colyvas, Kim; Harrison, Elisabeth; Hewat, Sally

    2015-06-01

    Case-study methodology studying change is often used in the field of speech-language pathology, but it can be criticized for not being statistically robust. Yet with the heterogeneous nature of many communication disorders, case studies allow clinicians and researchers to closely observe and report on change. Such information is valuable and can further inform large-scale experimental designs. In this research note, a statistical analysis for case-study data is outlined that employs a modification to the Reliable Change Index (Jacobson & Truax, 1991). The relationship between reliable change and clinical significance is discussed. Example data are used to guide the reader through the use and application of this analysis. A method of analysis is detailed that is suitable for assessing change in measures with binary categorical outcomes. The analysis is illustrated using data from one individual, measured before and after treatment for stuttering. The application of this approach to assess change in categorical, binary data has potential application in speech-language pathology. It enables clinicians and researchers to analyze results from case studies for their statistical and clinical significance. This new method addresses a gap in the research design literature, that is, the lack of analysis methods for noncontinuous data (such as counts, rates, proportions of events) that may be used in case-study designs.
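
    One hedged illustration of a 95% credible interval for a difference in proportions, using uniform Beta priors and Monte Carlo sampling; the counts are invented, and the paper's modified Reliable Change Index may differ in detail:

      # Credible interval for the change in a binary-outcome proportion,
      # assuming Beta(1,1) priors; counts are made up for illustration.
      import numpy as np

      rng = np.random.default_rng(3)
      pre_success, pre_n = 42, 100     # e.g., stuttered syllables before treatment
      post_success, post_n = 12, 100   # ... and after treatment

      p_pre = rng.beta(1 + pre_success, 1 + pre_n - pre_success, size=100_000)
      p_post = rng.beta(1 + post_success, 1 + post_n - post_success, size=100_000)
      diff = p_pre - p_post

      lo, hi = np.percentile(diff, [2.5, 97.5])
      print(f"95% credible interval for the change: [{lo:.3f}, {hi:.3f}]")
      # interval excluding zero is evidence of reliable change
      print("reliable change" if lo > 0 or hi < 0 else "no reliable change")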

  10. Human genetic variation and yellow fever mortality during 19th century U.S. epidemics.

    PubMed

    Blake, Lauren E; Garcia-Blanco, Mariano A

    2014-06-03

    We calculated the incidence, mortality, and case fatality rates for Caucasians and non-Caucasians during 19th century yellow fever (YF) epidemics in the United States and determined statistical significance for differences in the rates in different populations. We evaluated nongenetic host factors, including socioeconomic, environmental, cultural, demographic, and acquired immunity status that could have influenced these differences. While differences in incidence rates were not significant between Caucasians and non-Caucasians, differences in mortality and case fatality rates were statistically significant for all epidemics tested (P < 0.01). Caucasians diagnosed with YF were 6.8 times more likely to succumb than non-Caucasians with the disease. No other major causes of death during the 19th century demonstrated a similar mortality skew toward Caucasians. Nongenetic host factors were examined and could not explain these large differences. We propose that the remarkably lower case fatality rates for individuals of non-Caucasian ancestry are the result of human genetic variation in loci encoding innate immune mediators. Different degrees of severity of yellow fever have been observed across diverse populations, but this study is the first to demonstrate a statistically significant association between ancestry and the outcome of yellow fever (YF). With the global burden of mosquito-borne flaviviral infections, such as YF and dengue, on the rise, identifying and characterizing host factors could prove pivotal in the prevention of epidemics and the development of effective treatments. Copyright © 2014 Blake and Garcia-Blanco.

  11. Redesigning a Large Introductory Course to Incorporate the GAISE Guidelines

    ERIC Educational Resources Information Center

    Woodard, Roger; McGowan, Herle

    2012-01-01

    In 2005, the "Guidelines for Assessment and Instruction in Statistics Education" (GAISE) college report described several recommendations for teaching introductory statistics. This paper discusses how a large multi-section introductory course was redesigned in order to implement these recommendations. The experience described discusses…

  12. A comparative analysis of the statistical properties of large mobile phone calling networks.

    PubMed

    Li, Ming-Xia; Jiang, Zhi-Qiang; Xie, Wen-Jie; Miccichè, Salvatore; Tumminello, Michele; Zhou, Wei-Xing; Mantegna, Rosario N

    2014-05-30

    Mobile phone calling is one of the most widely used communication methods in modern society. The records of calls among mobile phone users provide a valuable proxy for understanding the human communication patterns embedded in social networks. Mobile phone users call each other, forming a directed calling network. If only reciprocal calls are considered, we obtain an undirected mutual calling network. The preferential communication behavior between two connected users can be statistically tested, and it results in two Bonferroni networks with statistically validated edges. We perform a comparative analysis of the statistical properties of these four networks, which are constructed from the calling records of more than nine million individuals in Shanghai over a period of 110 days. We find that these networks share many common structural properties and also exhibit idiosyncratic features when compared with previously studied large mobile calling networks. The empirical findings provide an intriguing picture of a representative large social network that might shed new light on the modelling of large social networks.

  13. Stability of knotted vortices in wave chaos

    NASA Astrophysics Data System (ADS)

    Taylor, Alexander; Dennis, Mark

    Large scale tangles of disordered filaments occur in many diverse physical systems, from turbulent superfluids to optical volume speckle to liquid crystal phases. They can exhibit particular large scale random statistics despite very different local physics. We have previously used the topological statistics of knotting and linking to characterise the large scale tangling, using the vortices of three-dimensional wave chaos as a universal model system whose physical lengthscales are set only by the wavelength. Unlike geometrical quantities, the statistics of knotting depend strongly on the physical system and boundary conditions. Although knotting patterns characterise different systems, the topology of vortices is highly unstable to perturbation, under which they may reconnect with one another. In systems of constructed knots, these reconnections generally rapidly destroy the knot, but for vortex tangles the topological statistics must be stable. Using large scale simulations of chaotic eigenfunctions, we numerically investigate the prevalence and impact of reconnection events, and their effect on the topology of the tangle.

  14. In silico identification and comparative analysis of differentially expressed genes in human and mouse tissues

    PubMed Central

    Pao, Sheng-Ying; Lin, Win-Li; Hwang, Ming-Jing

    2006-01-01

    Background Screening for differentially expressed genes on the genomic scale and comparative analysis of the expression profiles of orthologous genes between species to study gene function and regulation are becoming increasingly feasible. Expressed sequence tags (ESTs) are an excellent source of data for such studies using bioinformatic approaches because of the rich libraries and tremendous amount of data now available in the public domain. However, any large-scale EST-based bioinformatics analysis must deal with the heterogeneous, and often ambiguous, tissue and organ terms used to describe EST libraries. Results To deal with the issue of tissue source, in this work, we carefully screened and organized more than 8 million human and mouse ESTs into 157 human and 108 mouse tissue/organ categories, to which we applied an established statistical test using different thresholds of the p value to identify genes differentially expressed in different tissues. Further analysis of the tissue distribution and level of expression of human and mouse orthologous genes showed that tissue-specific orthologs tended to have more similar expression patterns than those lacking significant tissue specificity. On the other hand, a number of orthologs were found to have significant disparity in their expression profiles, hinting at novel functions, divergent regulation, or new ortholog relationships. Conclusion Comprehensive statistics on the tissue-specific expression of human and mouse genes were obtained in this very large-scale, EST-based analysis. These statistical results have been organized into a database, freely accessible at our website, for easy searching of human and mouse tissue-specific genes and for investigating gene expression profiles in the context of comparative genomics. Comparative analysis showed that, although highly tissue-specific genes tend to exhibit similar expression profiles in human and mouse, there are significant exceptions, indicating that orthologous genes, while sharing basic genomic properties, could result in distinct phenotypes. PMID:16626500

  15. Robustness of the sequential lineup advantage.

    PubMed

    Gronlund, Scott D; Carlson, Curt A; Dailey, Sarah B; Goodsell, Charles A

    2009-06-01

    A growing movement in the United States and around the world involves promoting the advantages of conducting an eyewitness lineup in a sequential manner. We conducted a large study (N = 2,529) that included 24 comparisons of sequential versus simultaneous lineups. A liberal statistical criterion revealed only 2 significant sequential lineup advantages and 3 significant simultaneous advantages. Both sequential advantages occurred when the good photograph of the guilty suspect or either innocent suspect was in the fifth position in the sequential lineup; all 3 simultaneous advantages occurred when the poorer quality photograph of the guilty suspect or either innocent suspect was in the second position. Adjusting the statistical criterion to control for the multiple tests (.05/24) revealed no significant sequential advantages. Moreover, despite finding more conservative overall choosing for the sequential lineup, no support was found for the proposal that a sequential advantage was due to that conservative criterion shift. Unless lineups with particular characteristics predominate in the real world, there appears to be no strong preference for conducting lineups in either a sequential or a simultaneous manner. (PsycINFO Database Record (c) 2009 APA, all rights reserved).
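
    The multiple-testing adjustment quoted above (.05/24) is a plain Bonferroni correction; a tiny sketch with made-up p-values:

      # Bonferroni adjustment: per-test criterion of .05/m controls the
      # family-wise error rate over m comparisons. p-values are illustrative.
      p_values = [0.031, 0.0009, 0.046, 0.21]
      m = 24                          # number of comparisons in the study
      alpha = 0.05 / m
      for p in p_values:
          verdict = "significant" if p < alpha else "not significant"
          print(f"p = {p:.4f} -> {verdict} at .05/{m} = {alpha:.5f}")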

  16. 1979 Reserve Force Studies Surveys: Survey Design, Sample Design and Administrative Procedures,

    DTIC Science & Technology

    1981-08-01

    three factors: the need for a statistically significant number of usable questionnaires from different groups within the random sample and from...Because of the multipurpose nature of these surveys and the large number of questions needed to fully address some of the topics covered, we...varies. Collection of data at the unit level is needed to accurately estimate actual reserve compensation and benefits and their possible role in both

  17. Women Scientific Researchers in Morocco

    NASA Astrophysics Data System (ADS)

    Bettachy, Amina; Maaroufi, Fatiha; Nouira, Asmae; Baitoul, Mimouna

    2009-04-01

    Despite Moroccan progress in working toward gender equity, and the removal of many discriminatory practices and barriers for women, females are still significantly underrepresented in most fields, particularly science. Attitudes about the role of women in society, which continue to define careers as either male or female, are largely responsible for this imbalance. We present statistics about the current status of women and give recommendations to encourage girls and women to pursue and take leadership positions in science.

  18. Improving Domain-specific Machine Translation by Constraining the Language Model

    DTIC Science & Technology

    2012-07-01

    performance. To make up for the lack of parallel training data, one assumption is that more monolingual target language data should be used in building the...target language model. Prior work on domain-specific MT has focused on training target language models with monolingual domain-specific data...showed that using a large dictionary extracted from medical domain documents in a statistical MT system to generalize the training data significantly

  19. Scaling up to address data science challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendelberger, Joanne R.

    Statistics and Data Science provide a variety of perspectives and technical approaches for exploring and understanding Big Data. Partnerships between scientists from different fields such as statistics, machine learning, computer science, and applied mathematics can lead to innovative approaches for addressing problems involving increasingly large amounts of data in a rigorous and effective manner that takes advantage of advances in computing. Here, this article will explore various challenges in Data Science and will highlight statistical approaches that can facilitate analysis of large-scale data, including sampling and data reduction methods, techniques for effective analysis and visualization of large-scale simulations, and algorithms and procedures for efficient processing.

  20. Scaling up to address data science challenges

    DOE PAGES

    Wendelberger, Joanne R.

    2017-04-27

    Statistics and Data Science provide a variety of perspectives and technical approaches for exploring and understanding Big Data. Partnerships between scientists from different fields such as statistics, machine learning, computer science, and applied mathematics can lead to innovative approaches for addressing problems involving increasingly large amounts of data in a rigorous and effective manner that takes advantage of advances in computing. Here, this article will explore various challenges in Data Science and will highlight statistical approaches that can facilitate analysis of large-scale data, including sampling and data reduction methods, techniques for effective analysis and visualization of large-scale simulations, and algorithms and procedures for efficient processing.

  1. Isotropy analyses of the Planck convergence map

    NASA Astrophysics Data System (ADS)

    Marques, G. A.; Novaes, C. P.; Bernui, A.; Ferreira, I. S.

    2018-01-01

    The presence of matter in the path of relic photons causes distortions in the angular pattern of the cosmic microwave background (CMB) temperature fluctuations, modifying their properties in a slight but measurable way. Recently, the Planck Collaboration released the estimated convergence map, an integrated measure of the large-scale matter distribution that produced the weak gravitational lensing (WL) phenomenon observed in Planck CMB data. We perform exhaustive analyses of this convergence map, calculating the variance in small and large regions of the sky, but excluding the area masked due to Galactic contamination, and compare them with the features expected in the set of simulated convergence maps also released by the Planck Collaboration. Our goal is to search for sky directions or regions where the WL imprints anomalous signatures on the variance estimator, revealed through a χ2 analysis at a statistically significant level. In the local analysis of the Planck convergence map, we identified eight patches of the sky in disagreement, at more than 2σ, with what is observed in the average of the simulations. In contrast, in the large-regions analysis we found no statistically significant discrepancies, but, interestingly, the regions with the highest χ2 values surround the ecliptic poles. Thus, our results show a good agreement with the features expected by the Λ cold dark matter concordance model, as given by the simulations. Yet, the outlier regions found here could suggest that the data still contain residual contamination, like noise, due to over- or underestimation of systematic effects in the simulation data set.

  2. European consumer attitudes on the associated health benefits of neutraceutical-containing processed meats using Co-enzyme Q10 as a sample functional ingredient.

    PubMed

    Tobin, Brian D; O'Sullivan, Maurice G; Hamill, Ruth; Kerry, Joseph P

    2014-06-01

    This study gathered European consumer attitudes towards processed meats and their use as a functional food. A survey was set up using an online web-application to gather information on consumer perception of processed meats as well as of neutraceutical-containing processed meats. 548 responses were obtained, and statistical analysis was carried out using a statistical software package. Data were summarized as frequencies for each question, and statistical differences were analyzed using the Chi-Square test with a significance level of 5% (P<0.05). The majority of consumer attitudes towards processed meats indicate that they are seen as unhealthy products. Most respondents believe that processed meats contain large quantities of harmful chemicals, fat and salt. Consumers were found to be very favourable towards bioactive compounds in yogurt-style products but unsure of their feelings about meat-based products, likely due to a lack of familiarity with such products. Many of the respondents were willing to consume meat-based functional foods but were not willing to pay more for them. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Comparison of statistical models for writer verification

    NASA Astrophysics Data System (ADS)

    Srihari, Sargur; Ball, Gregory R.

    2009-01-01

    A novel statistical model for determining whether a pair of documents, a known and a questioned, were written by the same individual is proposed. The goal of this formulation is to learn the specific uniqueness of style in a particular author's writing, given the known document. Since there are often insufficient samples to extrapolate a generalized model of a writer's handwriting based solely on the document, we instead generalize over the differences between the author and a large population of known different writers. This is in contrast to an earlier proposed model in which probability distributions were set a priori without learning. We show the performance of the model along with a comparison to the non-learning, older model, over which it shows significant improvement.

  4. Statistical analysis of the 70 meter antenna surface distortions

    NASA Technical Reports Server (NTRS)

    Kiedron, K.; Chian, C. T.; Chuang, K. L.

    1987-01-01

    Statistical analysis of surface distortions of the 70 meter NASA/JPL antenna, located at Goldstone, was performed. The purpose of this analysis is to verify whether deviations due to gravity loading can be treated as quasi-random variables with normal distribution. Histograms of the RF pathlength error distribution for several antenna elevation positions were generated. The results indicate that the deviations from the ideal antenna surface are not normally distributed. The observed density distribution for all antenna elevation angles is taller and narrower than the normal density, which results in large positive values of kurtosis and a significant amount of skewness. The skewness of the distribution changes from positive to negative as the antenna elevation changes from zenith to horizon.
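
    A brief sketch of the kind of normality check reported above, computed on a synthetic taller-and-narrower-than-normal sample rather than the actual antenna pathlength errors:

      # Skewness/kurtosis check against normality; the sample is synthetic.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(4)
      # Mixture of a tight core and rare wide tails: peaked, heavy-tailed shape
      errors = np.concatenate([rng.normal(0, 0.3, 900), rng.normal(0, 2.0, 100)])

      print("skewness:", stats.skew(errors))
      print("excess kurtosis:", stats.kurtosis(errors))  # > 0: taller, narrower peak
      stat, p = stats.normaltest(errors)                 # D'Agostino-Pearson test
      print(f"normality test: stat={stat:.1f}, p={p:.2g}")  # small p rejects normality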

  5. Robust non-Gaussian statistics and long-range correlation of total ozone

    NASA Astrophysics Data System (ADS)

    Toumi, R.; Syroka, J.; Barnes, C.; Lewis, P.

    2001-01-01

    Three long-term total ozone time series at Camborne, Lerwick and Arosa are examined for their statistical properties. Non-Gaussian behaviour is seen at all locations. There are large interannual fluctuations in the higher moments of the probability distribution. However, only the mean at all stations and the summer standard deviation at Lerwick show significant trends. This suggests that there has been no long-term change in the stratospheric circulation, but there are decadal variations. The time series can also be characterised as scale invariant, with a Hurst exponent of about 0.8 for all three sites. The Arosa time series was found to be weakly intermittent, in agreement with the non-Gaussian characteristics of the data set.
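
    A rescaled-range (R/S) estimate is one standard way to obtain a Hurst exponent like the 0.8 quoted above; the sketch below runs on white noise (expected H near 0.5, modulo small-sample bias), not an ozone series:

      # Rescaled-range (R/S) Hurst exponent estimate on a simulated series.
      import numpy as np

      def rs(series, n):
          """Average rescaled range over non-overlapping windows of length n."""
          chunks = series[: len(series) // n * n].reshape(-1, n)
          out = []
          for c in chunks:
              z = np.cumsum(c - c.mean())        # cumulative deviations
              r = z.max() - z.min()              # range
              s = c.std(ddof=1)                  # scale
              if s > 0:
                  out.append(r / s)
          return np.mean(out)

      rng = np.random.default_rng(5)
      x = rng.standard_normal(4096)              # white noise: H should be ~0.5

      ns = np.array([8, 16, 32, 64, 128, 256])
      rs_vals = np.array([rs(x, n) for n in ns])
      H = np.polyfit(np.log(ns), np.log(rs_vals), 1)[0]  # slope of log R/S vs log n
      print("estimated Hurst exponent:", round(H, 2))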

  6. SLIDE - a web-based tool for interactive visualization of large-scale -omics data.

    PubMed

    Ghosh, Soumita; Datta, Abhik; Tan, Kaisen; Choi, Hyungwon

    2018-06-28

    Data visualization is often regarded as a post hoc step for verifying statistically significant results in the analysis of high-throughput data sets. This common practice leaves a large amount of raw data behind, from which more information can be extracted. However, existing solutions do not provide capabilities to explore large-scale raw datasets using biologically sensible queries, nor do they allow user interaction based real-time customization of graphics. To address these drawbacks, we have designed an open-source, web-based tool called Systems-Level Interactive Data Exploration, or SLIDE to visualize large-scale -omics data interactively. SLIDE's interface makes it easier for scientists to explore quantitative expression data in multiple resolutions in a single screen. SLIDE is publicly available under BSD license both as an online version as well as a stand-alone version at https://github.com/soumitag/SLIDE. Supplementary Information are available at Bioinformatics online.

  7. National transportation statistics 2010

    DOT National Transportation Integrated Search

    2010-01-01

    National Transportation Statistics presents statistics on the U.S. transportation system, including its physical components, safety record, economic performance, the human and natural environment, and national security. This is a large online documen...

  8. Georgia Public Library Statistics, 1975.

    ERIC Educational Resources Information Center

    Georgia State Dept of Education, Atlanta. Div. of Public Library Services.

    Statistical data on Georgia public libraries are provided in tables covering regional and large county library systems, audiovisual materials, audiovisual expenditures, analysis of federal funds received, and Title II construction. Data on the services of the state agency are given for technical services, reader services, large group loans, state…

  9. Statistical Analysis of Large-Scale Structure of Universe

    NASA Astrophysics Data System (ADS)

    Tugay, A. V.

    While galaxy cluster catalogs were compiled many decades ago, other structural elements of the cosmic web have been detected with confidence only in the most recent works. For example, extragalactic filaments have been described in recent years through velocity fields and the SDSS galaxy distribution. The large-scale structure of the Universe could also be mapped in the future using ATHENA observations in X-rays and SKA in the radio band. Until detailed observations become available for most of the volume of the Universe, integral statistical parameters can be used for its description. Methods such as the galaxy correlation function, the power spectrum, statistical moments and peak statistics are commonly used for this purpose. The parameters of the power spectrum and other statistics are important for constraining models of dark matter, dark energy, inflation and brane cosmology. In the present work we describe the growth of large-scale density fluctuations in the one- and three-dimensional cases with Fourier harmonics of hydrodynamical parameters. As a result we obtain a power-law relation for the matter power spectrum.

  10. An ecological study of cancer incidence in Port Hope, Ontario from 1992 to 2007.

    PubMed

    Chen, Jing; Moir, Deborah; Lane, Rachel; Thompson, Patsy

    2013-03-01

    A plant processing radium and uranium ores has been operating in the town of Port Hope since 1932. Given the nuclear industry located in the community and ongoing public health concerns, cancer incidence rates in Port Hope were studied for a recent 16 year period (1992-2007) for continued periodic cancer incidence surveillance of the community. The cancer incidence in the local community for all cancers combined was similar to the Ontario population, health regions with similar socio-economic characteristics in Ontario and in Canada, and the Canadian population. No statistically significant differences in childhood cancer, leukaemia or other radiosensitive cancer incidence were observed, with the exception of statistically significant elevated lung cancer incidence among women. However, the statistical significance was reduced or disappeared when the comparison was made to populations with similar socio-economic characteristics. These findings are consistent with previous ecological, case-control and cohort studies conducted in Port Hope, environmental assessments, and epidemiological studies conducted elsewhere on populations living around similar facilities or exposed to similar environmental contaminants. Although the current study covered an extended period of time, the power to detect risk at the sub-regional level of analysis was limited since the Port Hope population is small (16,500). The study nevertheless indicated that large differences in cancer incidence are not occurring in Port Hope compared to other similar communities and the general population.

  11. A statistical model including age to predict passenger postures in the rear seats of automobiles.

    PubMed

    Park, Jangwoon; Ebert, Sheila M; Reed, Matthew P; Hallman, Jason J

    2016-06-01

    Few statistical models of rear seat passenger posture have been published, and none has taken into account the effects of occupant age. This study developed new statistical models for predicting passenger postures in the rear seats of automobiles. Postures of 89 adults with a wide range of age and body size were measured in a laboratory mock-up in seven seat configurations. Posture-prediction models for female and male passengers were separately developed by stepwise regression using age, body dimensions, seat configurations and two-way interactions as potential predictors. Passenger posture was significantly associated with age and the effects of other two-way interaction variables depended on age. A set of posture-prediction models are presented for women and men, and the prediction results are compared with previously published models. This study is the first study of passenger posture to include a large cohort of older passengers and the first to report a significant effect of age for adults. The presented models can be used to position computational and physical human models for vehicle design and assessment. Practitioner Summary: The significant effects of age, body dimensions and seat configuration on rear seat passenger posture were identified. The models can be used to accurately position computational human models or crash test dummies for older passengers in known rear seat configurations.

  12. Multinomial logistic regression analysis for differentiating 3 treatment outcome trajectory groups for headache-associated disability.

    PubMed

    Lewis, Kristin Nicole; Heckman, Bernadette Davantes; Himawan, Lina

    2011-08-01

    Growth mixture modeling (GMM) identified latent groups based on treatment outcome trajectories of headache disability measures in patients in headache subspecialty treatment clinics. Using a longitudinal design, 219 patients in headache subspecialty clinics in 4 large cities throughout Ohio provided data on their headache disability at pretreatment and 3 follow-up assessments. GMM identified 3 treatment outcome trajectory groups: (1) patients who initiated treatment with elevated disability levels and who reported statistically significant reductions in headache disability (high-disability improvers; 11%); (2) patients who initiated treatment with elevated disability but who reported no reductions in disability (high-disability nonimprovers; 34%); and (3) patients who initiated treatment with moderate disability and who reported statistically significant reductions in headache disability (moderate-disability improvers; 55%). Based on the final multinomial logistic regression model, a dichotomized treatment appointment attendance variable was a statistically significant predictor for differentiating high-disability improvers from high-disability nonimprovers. Three-fourths of patients who initiated treatment with elevated disability levels did not report reductions in disability after 5 months of treatment with new preventive pharmacotherapies. Preventive headache agents may be most efficacious for patients with moderate levels of disability and for patients with high disability levels who attend all treatment appointments. Copyright © 2011 International Association for the Study of Pain. Published by Elsevier B.V. All rights reserved.
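
    A minimal sketch of the final modeling step, a multinomial logistic regression separating three trajectory groups; the predictors, sample, and group-generating rule below are invented for illustration:

      # Multinomial logistic regression over 3 outcome-trajectory groups.
      # Predictors and the group-generating rule are invented for illustration.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(6)
      n = 219                                      # matches the study's sample size
      attended_all = rng.integers(0, 2, n)         # dichotomized appointment attendance
      baseline_disability = rng.normal(50, 10, n)
      X = np.column_stack([attended_all, baseline_disability])

      # 0 = high-disability improver, 1 = high-disability nonimprover,
      # 2 = moderate-disability improver
      logits = np.column_stack([1.5 * attended_all, 1.0 - attended_all,
                                np.full(n, 0.5)])
      probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
      y = np.array([rng.choice(3, p=p) for p in probs])

      # the default lbfgs solver fits a softmax (multinomial) model over 3 classes
      model = LogisticRegression(max_iter=1000).fit(X, y)
      print("per-class coefficients (attendance, baseline):\n", model.coef_.round(2))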

  13. Investigation of the potential carcinogenicity of a range of chromium containing materials on rat lung.

    PubMed Central

    Levy, L S; Martin, P A; Bidstrup, P L

    1986-01-01

    Twenty one chromium containing materials were examined for carcinogenic activity in a two year study using an intrabronchial pellet implantation system whereby pellets loaded with test material were surgically implanted into the lower left bronchus of rats. The principal aim of the study was to extend our knowledge of the carcinogenic potential of chromium compounds and, in particular, chromates (Cr6+). A statistically significant incidence of treatment related lung tumours was found with some sparingly soluble chromate materials. All tumours were large keratinizing squamous carcinomas of the left lung, except for a single left lung adenocarcinoma and two left lung anaplastic carcinomas. No bronchial carcinomas (0/100) were seen in the negative control group (blank pellet loaded with cholesterol), whereas bronchial carcinomas (22/48 and 25/100) occurred in the two positive control groups which received pellets loaded with 20-methylcholanthrene and calcium chromate respectively. Among the 20 test materials, only three groups gave statistically significant numbers of bronchial carcinomas. Two of these were groups receiving different samples of strontium chromate which gave 43/99 and 62/99 tumours. The third group, zinc chromate (low solubility), gave 5/100 bronchial carcinomas. A further zinc chromate group (Norge composition) produced 3/100 bronchial carcinomas which was not statistically significant. A few lung tumours were observed in other test groups. PMID:3964573

  14. Average ambulatory measures of sound pressure level, fundamental frequency, and vocal dose do not differ between adult females with phonotraumatic lesions and matched control subjects

    PubMed Central

    Van Stan, Jarrad H.; Mehta, Daryush D.; Zeitels, Steven M.; Burns, James A.; Barbu, Anca M.; Hillman, Robert E.

    2015-01-01

    Objectives Clinical management of phonotraumatic vocal fold lesions (nodules, polyps) is based largely on assumptions that abnormalities in habitual levels of sound pressure level (SPL), fundamental frequency (f0), and/or amount of voice use play a major role in lesion development and chronic persistence. This study used ambulatory voice monitoring to evaluate if significant differences in voice use exist between patients with phonotraumatic lesions and normal matched controls. Methods Subjects were 70 adult females: 35 with vocal fold nodules or polyps and 35 age-, sex-, and occupation-matched normal individuals. Weeklong summary statistics of voice use were computed from anterior neck surface acceleration recorded using a smartphone-based ambulatory voice monitor. Results Paired t-tests and Kolmogorov-Smirnov tests resulted in no statistically significant differences between patients and matched controls regarding average measures of SPL, f0, vocal dose measures, and voicing/voice rest periods. Paired t-tests comparing f0 variability between the groups resulted in statistically significant differences with moderate effect sizes. Conclusions Individuals with phonotraumatic lesions did not exhibit differences in average ambulatory measures of vocal behavior when compared with matched controls. More refined characterizations of underlying phonatory mechanisms and other potentially contributing causes are warranted to better understand risk factors associated with phonotraumatic lesions. PMID:26024911

  15. Association Studies of 22 Candidate SNPs with Late-Onset Alzheimer's Disease

    PubMed Central

    Figgins, Jessica A.; Minster, Ryan L.; Demirci, F. Yesim; DeKosky, Steven T.; Kamboh, M. Ilyas

    2009-01-01

    Alzheimer's disease (AD) is a complex and multifactorial disease with the possible involvement of several genes. With the exception of the APOE gene as a susceptibility marker, no other genes have been shown consistently to be associated with late-onset AD (LOAD). A recent genome-wide association study of 17,343 gene-based putative functional single nucleotide polymorphisms (SNPs) found 19 significant variants, including 3 linked to APOE, showing association with LOAD (Hum Mol Genet 2007; 16:865–873). We have set out to replicate the 16 new significant associations in a large case-control cohort of American Whites. Additionally, we examined six variants present in positional and/or biological candidate genes for AD. We genotyped the 22 SNPs in up to 1,009 Caucasian Americans with LOAD and up to 1,010 age-matched healthy Caucasian Americans, using 5′ nuclease assays. We did not observe a statistically significant association between the SNPs and the risk of AD, either individually or stratified by APOE. Our data suggest that the association of the studied variants with LOAD risk, if it exists, is not statistically significant in our sample. PMID:18780302

  16. Seasonal and nonseasonal variability of satellite-derived chlorophyll and colored dissolved organic matter concentration in the California Current

    NASA Astrophysics Data System (ADS)

    Kahru, Mati; Mitchell, B. Greg

    2001-02-01

    Time series of surface chlorophyll a concentration (Chl) and colored dissolved organic matter (CDOM) derived from the Ocean Color and Temperature Sensor and Sea-Viewing Wide Field-of-View Sensor were evaluated for the California Current area using regional algorithms. Satellite data composited for 8-day periods provide the ability to describe large-scale changes in surface parameters. These changes are difficult to detect from in situ observations alone, which undersample the large temporal and spatial variability, especially in Chl. We detected no significant bias in satellite Chl estimates compared with ship-based measurements. The variability in CDOM concentration was significantly smaller than that in Chl, both spatially and temporally. While subject to large interannual and short-term variations, offshore waters (100-1000 km from the shore) have an annual cycle of Chl and CDOM with a maximum in winter-spring (December-March) and a minimum in late summer. For inshore waters the maximum is more likely in spring (April-May). We detected a significant increase in both Chl and CDOM off central and southern California during the La Niña year of 1999. The trend of increasing Chl and CDOM from October 1996 to June 2000 is statistically significant in many areas.

  17. Analysing the spatial patterns of erosion scars using point process theory at the coastal chalk cliff of Mesnil-Val, (Normandy, Northern France)

    NASA Astrophysics Data System (ADS)

    Rohmer, J.; Dewez, D.

    2014-09-01

    Over the last decade, many cliff erosion studies have focused on frequency-size statistics using inventories of sea cliff retreat sizes. By comparison, only a few paid attention to quantifying the spatial and temporal organisation of erosion scars over a cliff face. Yet, this spatial organisation carries essential information about the external processes and the environmental conditions that promote or initiate sea-cliff instabilities. In this article, we use summary statistics of spatial point process theory as a tool to examine the spatial and temporal pattern of a rockfall inventory recorded with repeated terrestrial laser scanning surveys at the chalk coastal cliff site of Mesnil-Val (Normandy, France). Results show that: (1) the spatial density of erosion scars is specifically conditioned alongshore by the distance to an engineered concrete groin, with an exponential-like decreasing trend, and vertically focused both at wave breaker height and on strong lithological contrasts; (2) small erosion scars (10-3 to 10-2 m3) aggregate in clusters within a radius of 5 to 10 m, which suggests some sort of attraction or focused causative process, and disperse above this critical distance; (3) on the contrary, larger erosion scars (10-2 to 101 m3) tend to disperse above a radius of 1 to 5 m, possibly due to the spreading of successive failures across the cliff face; (4) large scars significantly occur, albeit moderately, where previous large rockfalls have occurred during the preceding winter; (5) this temporal trend is not apparent for small events. In conclusion, this study shows, with a worked example, how spatial point process summary statistics are a tool to test and quantify the significance of geomorphological observation organisation.

  18. Analysing the spatial patterns of erosion scars using point process theory at the coastal chalk cliff of Mesnil-Val, Normandy, northern France

    NASA Astrophysics Data System (ADS)

    Rohmer, J.; Dewez, T.

    2015-02-01

    Over the last decade, many cliff erosion studies have focused on frequency-size statistics using inventories of sea cliff retreat sizes. By comparison, only a few paid attention to quantifying the spatial and temporal organisation of erosion scars over a cliff face. Yet, this spatial organisation carries essential information about the external processes and the environmental conditions that promote or initiate sea-cliff instabilities. In this article, we use summary statistics of spatial point process theory as a tool to examine the spatial and temporal pattern of a rockfall inventory recorded with repeated terrestrial laser scanning surveys at the chalk coastal cliff site of Mesnil-Val (Normandy, France). Results show that: (1) the spatial density of erosion scars is specifically conditioned alongshore by the distance to an engineered concrete groyne, with an exponential-like decreasing trend, and vertically focused both at wave breaker height and on strong lithological contrasts; (2) small erosion scars (10-3 to 10-2 m3) aggregate in clusters within a radius of 5 to 10 m, which suggests some sort of attraction or focused causative process, and disperse above this critical distance; (3) on the contrary, larger erosion scars (10-2 to 101 m3) tend to disperse above a radius of 1 to 5 m, possibly due to the spreading of successive failures across the cliff face; (4) large scars significantly occur albeit moderately, where previous large rockfalls have occurred during preceding winter; (5) this temporal trend is not apparent for small events. In conclusion, this study shows, with a worked example, how spatial point process summary statistics are a tool to test and quantify the significance of geomorphological observation organisation.
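
    A naive Ripley's-K-style summary statistic is one common point-process tool for distinguishing clustering from dispersion, as used above; the simulated pattern, the window, and the lack of edge correction below are simplifications:

      # Naive Ripley's K for a 2-D point pattern (no edge correction);
      # the simulated pattern and window are illustrative.
      import numpy as np

      def ripley_k(points, r, area):
          """K(r): area-scaled count of pairs closer than r, excluding self-pairs."""
          n = len(points)
          d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
          pairs = (d < r).sum() - n            # remove the n zero self-distances
          lam = n / area                       # intensity of the pattern
          return pairs / (lam * n)

      rng = np.random.default_rng(7)
      pts = rng.uniform(0, 100, size=(200, 2))   # complete spatial randomness (CSR)
      for r in (2, 5, 10):
          k = ripley_k(pts, r, area=100 * 100)
          print(f"r={r:>2}: K={k:8.1f}  CSR expectation={np.pi * r * r:8.1f}")
      # K well above pi*r^2 indicates clustering at scale r; well below, dispersion.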

  19. Prediction of rainfall anomalies during the dry to wet transition season over the Southern Amazonia using machine learning tools

    NASA Astrophysics Data System (ADS)

    Shan, X.; Zhang, K.; Zhuang, Y.; Fu, R.; Hong, Y.

    2017-12-01

    Seasonal prediction of rainfall during the dry-to-wet transition season in austral spring (September-November) over southern Amazonia is central to improving crop planting and fire mitigation in that region. Previous studies have identified the key large-scale atmospheric dynamic and thermodynamic pre-conditions during the dry season (June-August) that influence rainfall anomalies during the dry-to-wet transition season over Southern Amazonia. Based on these key dry-season pre-conditions, we have evaluated several statistical models and developed a Neural Network based statistical prediction system to predict rainfall during the dry-to-wet transition for Southern Amazonia (5-15°S, 50-70°W). Multivariate Empirical Orthogonal Function (EOF) analysis is applied to the following four fields during JJA from the ECMWF Reanalysis (ERA-Interim), spanning 1979 to 2015: geopotential height at 200 hPa, surface relative humidity, convective inhibition (CIN) index and convective available potential energy (CAPE), to filter out noise and highlight the most coherent spatial and temporal variations. The first 10 EOF modes are retained as inputs to the statistical models, accounting for at least 70% of the total variance in the predictor fields. We have tested several linear and non-linear statistical methods. While the regularized Ridge and Lasso regressions can generally capture the spatial pattern and magnitude of rainfall anomalies, we found that the Neural Network performs best, with an accuracy greater than 80%, as expected from the non-linear dependence of the rainfall on the large-scale atmospheric thermodynamic conditions and circulation. Further tests with various prediction skill metrics and hindcasts also suggest this Neural Network approach can significantly improve seasonal prediction skill relative to dynamical predictions and regression-based statistical predictions. Thus, this statistical prediction system shows potential to improve real-time seasonal rainfall predictions in the future.
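
    A hedged sketch of the EOF-plus-neural-network pipeline described above: PCA stands in for the EOF truncation, and a small MLP maps the retained modes to a rainfall anomaly; all arrays are random stand-ins for the reanalysis fields:

      # PCA (EOF truncation) feeding a small neural network; data are synthetic.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.neural_network import MLPRegressor
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(8)
      n_years, n_gridpoints = 37, 500               # 1979-2015 JJA predictor fields
      X = rng.standard_normal((n_years, n_gridpoints))
      # target depends (nonlinearly negligible here) on a few leading modes + noise
      y = X[:, :10] @ rng.standard_normal(10) + 0.1 * rng.standard_normal(n_years)

      model = make_pipeline(
          StandardScaler(),
          PCA(n_components=10),                     # retain ~10 leading EOF modes
          MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0),
      )
      model.fit(X[:-5], y[:-5])                     # hold out the last 5 "years"
      print("hindcast anomalies:", model.predict(X[-5:]).round(2))
      print("observed anomalies:", y[-5:].round(2))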

  20. Statistical Significance Testing from Three Perspectives and Interpreting Statistical Significance and Nonsignificance and the Role of Statistics in Research.

    ERIC Educational Resources Information Center

    Levin, Joel R.; And Others

    1993-01-01

    Journal editors respond to criticisms of reliance on statistical significance in research reporting. Joel R. Levin ("Journal of Educational Psychology") defends its use, whereas William D. Schafer ("Measurement and Evaluation in Counseling and Development") emphasizes the distinction between statistically significant and important. William Asher…

  1. Some limit theorems for ratios of order statistics from uniform random variables.

    PubMed

    Xu, Shou-Fang; Miao, Yu

    2017-01-01

    In this paper, we study the ratios of order statistics based on samples drawn from uniform distribution and establish some limit properties such as the almost sure central limit theorem, the large deviation principle, the Marcinkiewicz-Zygmund law of large numbers and complete convergence.
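
    A quick simulation of the objects studied above, ratios of consecutive uniform order statistics; note the heavy tail of U(2)/U(1), one reason nonstandard limit theorems for such ratios are of interest:

      # Simulate ratios of consecutive order statistics from Uniform(0, 1).
      import numpy as np

      rng = np.random.default_rng(9)
      n, reps = 100, 10_000
      u = np.sort(rng.uniform(size=(reps, n)), axis=1)   # order statistics per row
      ratios = u[:, 1:] / u[:, :-1]                      # U_(k+1) / U_(k)

      # The smallest-pair ratio U_(2)/U_(1) is heavy-tailed (its mean diverges),
      # so the sample mean is unstable while the median is well behaved.
      print("sample mean of U_(2)/U_(1):  ", ratios[:, 0].mean())
      print("sample median of U_(2)/U_(1):", np.median(ratios[:, 0]))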

  2. Statistical Downscaling in Multi-dimensional Wave Climate Forecast

    NASA Astrophysics Data System (ADS)

    Camus, P.; Méndez, F. J.; Medina, R.; Losada, I. J.; Cofiño, A. S.; Gutiérrez, J. M.

    2009-04-01

    Wave climate at a particular site is defined by the statistical distribution of sea state parameters, such as significant wave height, mean wave period, mean wave direction, wind velocity, wind direction and storm surge. Nowadays, long-term time series of these parameters are available from reanalysis databases obtained by numerical models. The Self-Organizing Map (SOM) technique is applied to characterize multi-dimensional wave climate, obtaining the relevant "wave types" spanning the historical variability. This technique summarizes the multiple dimensions of wave climate in terms of a set of clusters projected onto a low-dimensional lattice with a spatial organization, providing Probability Density Functions (PDFs) on the lattice. On the other hand, wind and storm surge depend on the instantaneous local large-scale sea level pressure (SLP) fields, while waves depend on the recent history of these fields (say, 1 to 5 days). Thus, these variables are associated with large-scale atmospheric circulation patterns. In this work, a nearest-neighbors analog method is used to predict monthly multi-dimensional wave climate. This method establishes relationships between the large-scale atmospheric circulation patterns from numerical models (SLP fields as predictors) and local wave databases of observations (monthly wave climate SOM PDFs as the predictand) to set up statistical models. A wave reanalysis database, developed by Puertos del Estado (Ministerio de Fomento), is taken as the historical time series of local variables. The simultaneous SLP fields calculated by the NCEP atmospheric reanalysis are used as predictors. Several configurations with different sizes of the sea level pressure grid and different temporal resolutions are compared to obtain the optimal statistical model that best represents the monthly wave climate at a particular site. In this work we examine the potential skill of this downscaling approach under perfect-model conditions, and we also analyze the suitability of this methodology for seasonal forecasting and for long-term climate change scenario projections of wave climate.
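
    A toy self-organizing map, sketching how multi-dimensional sea states can be summarized into a lattice of "wave types" with an empirical PDF; the data, lattice size, and training schedules below are placeholder choices:

      # Tiny SOM: cluster 5-parameter "sea states" onto a 6x6 lattice.
      # Training data are random placeholders for reanalysis sea-state parameters.
      import numpy as np

      rng = np.random.default_rng(10)
      X = rng.standard_normal((5000, 5))             # sea states: 5 parameters each

      side = 6                                       # 6 x 6 lattice of wave types
      W = rng.standard_normal((side * side, 5))      # codebook vectors
      grid = np.array([(i, j) for i in range(side) for j in range(side)], float)

      for epoch in range(10):
          lr = 0.5 * (1 - epoch / 10)                # decaying learning rate
          sigma = 3.0 * (1 - epoch / 10) + 0.5       # decaying neighbourhood radius
          for x in X:
              bmu = np.argmin(((W - x) ** 2).sum(axis=1))   # best-matching unit
              h = np.exp(-((grid - grid[bmu]) ** 2).sum(axis=1) / (2 * sigma ** 2))
              W += lr * h[:, None] * (x - W)         # pull neighbours toward sample

      # empirical PDF over the lattice: how often each wave type occurs
      bmus = np.argmin(((X[:, None, :] - W[None, :, :]) ** 2).sum(axis=2), axis=1)
      pdf = np.bincount(bmus, minlength=side * side) / len(X)
      print(pdf.reshape(side, side).round(3))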

  3. Statistical Model to Analyze Quantitative Proteomics Data Obtained by 18O/16O Labeling and Linear Ion Trap Mass Spectrometry

    PubMed Central

    Jorge, Inmaculada; Navarro, Pedro; Martínez-Acedo, Pablo; Núñez, Estefanía; Serrano, Horacio; Alfranca, Arántzazu; Redondo, Juan Miguel; Vázquez, Jesús

    2009-01-01

    Statistical models for the analysis of protein expression changes by stable isotope labeling are still poorly developed, particularly for data obtained by 16O/18O labeling. Besides, large-scale test experiments to validate the null hypothesis are lacking. Although the study of mechanisms underlying biological actions promoted by vascular endothelial growth factor (VEGF) on endothelial cells is of considerable interest, quantitative proteomics studies on this subject are scarce and have been performed after exposing cells to the factor for long periods of time. In this work we present the largest quantitative proteomics study to date on the short term effects of VEGF on human umbilical vein endothelial cells by 18O/16O labeling. Current statistical models based on normality and variance homogeneity were found unsuitable to describe the null hypothesis in a large scale test experiment performed on these cells, producing false expression changes. A random effects model was developed including four different sources of variance at the spectrum-fitting, scan, peptide, and protein levels. With the new model the number of outliers at scan and peptide levels was negligible in three large scale experiments, and only one false protein expression change was observed in the test experiment among more than 1000 proteins. The new model allowed the detection of significant protein expression changes upon VEGF stimulation for 4 and 8 h. The consistency of the changes observed at 4 h was confirmed by a replicate at a smaller scale and further validated by Western blot analysis of some proteins. Most of the observed changes have not been described previously and are consistent with a pattern of protein expression that dynamically changes over time following the evolution of the angiogenic response. With this statistical model the 18O labeling approach emerges as a very promising and robust alternative to perform quantitative proteomics studies at a depth of several thousand proteins. PMID:19181660

  4. Relation Between Flow and Dissolved Oxygen in the Roanoke River Between Roanoke Rapids and Jamesville, North Carolina, 1998-2005

    USGS Publications Warehouse

    Wehmeyer, Loren L.; Bales, Jerad D.

    2009-01-01

    Understanding the relation between dam release characteristics and downstream water quality in the lower Roanoke River, North Carolina, is important for natural-resource management and ecosystem protection. Data from four raingages, four water-quality monitoring sites, and one streamflow-measurement site were used to identify statistical relations and discernible quantitative or qualitative patterns linking Roanoke River instream dissolved-oxygen (DO) levels to releases at Roanoke Rapids Dam for the period 1998-2005. The time-series DO data, complicated by the occurrence of major hurricanes in the short period of hourly DO data collection at the dam, present a mixed picture of the effects of hydropower peaking (a technique used by hydropower dam operators to produce electricity when consumption is high by passing a large volume of water through the dam turbines, which dramatically increases the volume of flow below the dam) on downstream DO. Other than in 2003 when dissolved-oxygen concentrations in the Roanoke River were likely affected by runoff from Hurricane Isabel rains, there were not consistent, statistically significant differences detected in the annual medians of hourly and(or) daily DO values during peaking versus nonpeaking periods. Along the Roanoke River, downstream of Roanoke Rapids Dam at Oak City, North Carolina, using a 95-percent confidence interval, the median value of the May-November daily mean DO concentrations for each year was lower during peaking periods for 2 years, higher for 2 years, and not significantly different for 4 years. Downstream at Jamesville, North Carolina, also using a 95-percent confidence interval, the median value of the annual May-November daily mean DO concentrations during hydropower peaking was lower for 4 years, higher for 2 years, and not significantly different for 2 years. In summary, the effect of hydropower peaking on downstream DO was inconsistent. Conversely, large precipitation events downstream from the dam resulted in consistent, statistically significant decreases in DO in the mainstem of the Roanoke River at Oak City and Jamesville.

  5. Significant Linkage for Tourette Syndrome in a Large French Canadian Family

    PubMed Central

    Mérette, Chantal; Brassard, Andrée; Potvin, Anne; Bouvier, Hélène; Rousseau, François; Émond, Claudia; Bissonnette, Luc; Roy, Marc-André; Maziade, Michel; Ott, Jurg; Caron, Chantal

    2000-01-01

    Family and twin studies provide strong evidence that genetic factors are involved in the transmission of Gilles de la Tourette syndrome (TS) and related psychiatric disorders. To detect the underlying susceptibility gene(s) for TS, we performed linkage analysis in one large French Canadian family (127 members) from the Charlevoix region, in which 20 family members were definitely affected by TS and 20 others showed related tic disorders. Using model-based linkage analysis, we observed a LOD score of 3.24 on chromosome 11 (11q23). This result was obtained in a multipoint approach involving marker D11S1377, the marker for which significant linkage disequilibrium with TS has recently been detected in an Afrikaner population. Altogether, 25 markers were studied, and, for the level of significance, we derived a criterion that took into account the multiple testing arising from the use of three phenotype definitions and three modes of inheritance, a procedure that yielded a LOD score criterion of 3.18. Hence, even after adjustment for multiple testing, the present study shows statistically significant evidence for genetic linkage with TS. PMID:10986045

  6. Large roads reduce bat activity across multiple species.

    PubMed

    Kitzes, Justin; Merenlender, Adina

    2014-01-01

    Although the negative impacts of roads on many terrestrial vertebrate and bird populations are well documented, there have been few studies of the road ecology of bats. To examine the effects of large roads on bat populations, we used acoustic recorders to survey bat activity along ten 300 m transects bordering three large highways in northern California, applying a newly developed statistical classifier to identify recorded calls to the species level. Nightly counts of bat passes were analyzed with generalized linear mixed models to determine the relationship between bat activity and distance from a road. Total bat activity recorded at points adjacent to roads was found to be approximately one-half the level observed at 300 m. Statistically significant road effects were also found for the Brazilian free-tailed bat (Tadarida brasiliensis), big brown bat (Eptesicus fuscus), hoary bat (Lasiurus cinereus), and silver-haired bat (Lasionycteris noctivagans). The road effect was found to be temperature dependent, with hot days both increasing total activity at night and reducing the difference between activity levels near and far from roads. These results suggest that the environmental impacts of road construction may include degradation of bat habitat and that mitigation activities for this habitat loss may be necessary to protect bat populations.
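
    A minimal sketch, assuming statsmodels is available, of the analysis type named above: nightly bat-pass counts regressed on distance from the road with a Poisson (log-link) generalized linear model. Data are simulated, and the effect sizes are assumptions; the paper's mixed model additionally includes random effects (e.g., per transect), which are omitted here.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
distance_m = rng.uniform(0, 300, 200)
temp_c = rng.normal(18, 4, 200)
# Assumed effect: activity roughly doubles from 0 m to 300 m from the road.
lam = np.exp(1.0 + (np.log(2) / 300) * distance_m + 0.03 * (temp_c - 18))
passes = rng.poisson(lam)

X = sm.add_constant(np.column_stack([distance_m, temp_c]))
fit = sm.GLM(passes, X, family=sm.families.Poisson()).fit()
print(fit.summary())  # a positive, significant distance coefficient would
                      # mirror the reported road-avoidance effect
```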

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, Janine Camille; Thompson, David; Pebay, Philippe Pierre

    Statistical analysis is typically used to reduce the dimensionality of and infer meaning from data. A key challenge of any statistical analysis package aimed at large-scale, distributed data is to address the orthogonal issues of parallel scalability and numerical stability. Many statistical techniques, e.g., descriptive statistics or principal component analysis, are based on moments and co-moments and, using robust online update formulas, can be computed in an embarrassingly parallel manner, amenable to a map-reduce style implementation. In this paper we focus on contingency tables, through which numerous derived statistics such as joint and marginal probability, point-wise mutual information, information entropy, and χ² independence statistics can be directly obtained. However, contingency tables can become large as data size increases, requiring a correspondingly large amount of communication between processors. This potential increase in communication prevents optimal parallel speedup and is the main difference with moment-based statistics (which we discussed in [1]) where the amount of inter-processor communication is independent of data size. Here we present the design trade-offs which we made to implement the computation of contingency tables in parallel. We also study the parallel speedup and scalability properties of our open source implementation. In particular, we observe optimal speed-up and scalability when the contingency statistics are used in their appropriate context, namely, when the data input is not quasi-diffuse.
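
    An illustrative sketch (not the OSTI package) of the embarrassingly parallel pattern described above: each worker builds a partial contingency table from its data shard; tables are merged by summation; derived statistics such as the χ² independence statistic are computed once at the end. The shard data are fabricated.

```python
from collections import Counter
from itertools import product

def map_shard(rows):
    """Count (x, y) category pairs in one shard."""
    return Counter((x, y) for x, y in rows)

def reduce_tables(tables):
    total = Counter()
    for t in tables:
        total.update(t)   # communication cost grows with table size,
    return total          # unlike moment-based statistics

shards = [[("a", 0), ("a", 1), ("b", 1)], [("b", 0), ("a", 1), ("b", 1)]]
table = reduce_tables(map_shard(s) for s in shards)

xs = sorted({k[0] for k in table}); ys = sorted({k[1] for k in table})
n = sum(table.values())
chi2 = sum(
    (table[(x, y)] - row * col / n) ** 2 / (row * col / n)
    for x, y in product(xs, ys)
    for row in [sum(table[(x, v)] for v in ys)]
    for col in [sum(table[(u, y)] for u in xs)]
)
print(dict(table), f"chi2 = {chi2:.3f}")
```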

  8. Astrometric Detection of Extrasolar Planets: Results of a Feasibility Study with the Palomar 5 Meter Telescope

    NASA Technical Reports Server (NTRS)

    Pravdo, Steven H.; Shaklan, Stuart B.

    1996-01-01

    The detection of extrasolar planets around stars like the Sun remains an important goal of astronomy. We present results from Palomar 5 m observations of the open cluster NGC 2420 in which we measure some of the sources of noise that will be present in an astrometric search for extrasolar planets. This is the first time that such a large aperture has been used for high-precision astrometry. We find that the atmospheric noise is 150 micro-arcsec hr^(1/2) across a 90 arcsec field of view and that differential chromatic refraction (DCR) can be calibrated to 128 micro-arcsec for observations within 1 hr of the meridian and 45 deg of zenith. These results confirm that a model for astrometric measurements can be extrapolated to large apertures. We demonstrate, based upon these results, that a large telescope achieves the sensitivity required to perform a statistically significant search for extrasolar planets. We describe an astrometric technique to detect planets, the astrometric signals expected, the role of reference stars, and the sources of measurement noise: photometric noise, atmospheric motion between stars, sky background, instrumental noise, and DCR. For the latter, we discuss a method to reduce the noise further, to 66 micro-arcsec for observations within 1 hr of the meridian and 45 deg of zenith. We discuss optimal lists of target stars taken from the latest Gliese & Jahreiss catalog of nearby stars with the largest potential astrometric signals, declination limits for both telescope accessibility and reduced DCR, and galactic latitude limits for a sufficient number of reference stars. Two samples are described from which one can perform statistically significant searches for gas giant planets around nearby stars. One sample contains 100 "solar class" stars with an average stellar mass of 0.82 solar mass; the other maximizes the number of stars, 574, by searching mainly low-mass M stars. We perform Monte Carlo simulations of the statistical significance of the expected results by using measured and estimated noise quantities. We show the semimajor axis parameter spaces that are searched for each star and how an increase in the length of the observing program expands these spaces. The search over semimajor axis parameter space relates to the theory of gas giant planet formation.
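
    A toy Monte Carlo in the spirit of the simulations described above (all numbers are assumptions, not the paper's): simulate a star's periodic astrometric wobble over a multi-year campaign, add per-epoch noise, and estimate how often the signal is recovered against the no-planet null at a fixed false-alarm rate.

```python
import numpy as np

rng = np.random.default_rng(3)
signal_uas, noise_uas = 120.0, 150.0   # wobble semi-amplitude, epoch noise
period_yr, n_epochs, n_trials = 4.0, 40, 2000

t = np.linspace(0, 8, n_epochs)        # hypothetical 8-year campaign

def chi2_improvement(has_planet):
    """Chi-squared improvement of a sinusoid fit over a flat model."""
    y = noise_uas * rng.standard_normal(n_epochs)
    if has_planet:
        y += signal_uas * np.sin(2 * np.pi * t / period_yr)
    model = np.sin(2 * np.pi * t / period_yr)
    amp = y @ model / (model @ model)           # least-squares amplitude
    return np.sum(y**2) - np.sum((y - amp * model) ** 2)

null = np.array([chi2_improvement(False) for _ in range(n_trials)])
alt = np.array([chi2_improvement(True) for _ in range(n_trials)])
threshold = np.percentile(null, 99.9)          # 0.1% false-alarm rate
print(f"detection rate: {np.mean(alt > threshold):.1%}")
```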

  9. "Simulated molecular evolution" or computer-generated artifacts?

    PubMed

    Darius, F; Rojas, R

    1994-11-01

    1. The authors define a function with value 1 for the positive examples and 0 for the negative ones. They fit a continuous function but do not deal at all with the error margin of the fit, which is almost as large as the function values they compute. 2. The term "quality" for the value of the fitted function gives the impression that some biological significance is associated with values of the fitted function strictly between 0 and 1, but there is no justification for this kind of interpretation, and finding the point where the fit achieves its maximum does not make sense. 3. By neglecting the error margin, the authors try to optimize the fitted function using differences in the second, third, fourth, and even fifth decimal place, which have no statistical significance. 4. Even if such a fit could profit from more data points, the authors should first prove that the region of interest has some kind of smoothness, that is, that a continuous fit makes any sense at all. 5. "Simulated molecular evolution" is a misnomer. We are dealing here with random search. Since the margin of error is so large, the fitted function does not provide statistically significant information about the points in search space where strings with cleavage sites could be found. This implies that the method is a highly unreliable stochastic search in the space of strings, even if the neural network is capable of learning some simple correlations. 6. For these kinds of problems with so few data points, classical statistical methods are clearly superior to the neural networks used as a "black box" by the authors, which, in the way they are structured, provide a model with an error margin as large as the numbers being computed. 7. Finally, even if someone were to provide us with a function that perfectly separates strings with cleavage sites from strings without them, so-called simulated molecular evolution would be no better than random selection. Since a perfect fit would produce only exact ones or zeros, starting a search in a region of space where all strings in the neighborhood get the value zero would not provide any kind of directional information for new iterations. We would just skip from one point to another in typical random-walk fashion.

  10. SEM-PLS Analysis of Inhibiting Factors of Cost Performance for Large Construction Projects in Malaysia: Perspective of Clients and Consultants

    PubMed Central

    Memon, Aftab Hameed; Rahman, Ismail Abdul

    2014-01-01

    This study uncovered inhibiting factors to cost performance in large construction projects of Malaysia. A questionnaire survey was conducted among clients and consultants involved in large construction projects. In the questionnaire, a total of 35 inhibiting factors grouped in 7 categories were presented to the respondents for rating the significance level of each factor. A total of 300 questionnaire forms were distributed; only 144 completed sets were received and analysed using the structural equation modelling software SmartPLS v2. The analysis involved three iteration processes in which several of the factors were deleted in order to make the model acceptable. The analysis found that the R² value of the model is 0.422, which indicates that the developed model has a substantial impact on cost performance. Based on the final form of the model, the contractor's site management category is the most prominent in exhibiting an effect on cost performance of large construction projects. This finding was validated using advanced techniques of power analysis. This rigorous multivariate analysis has explicitly identified the significant category, which consists of several causative factors of poor cost performance in large construction projects. This will benefit all parties involved in construction projects in controlling cost overrun. PMID:24693227

  12. Intensive agriculture erodes β-diversity at large scales.

    PubMed

    Karp, Daniel S; Rominger, Andrew J; Zook, Jim; Ranganathan, Jai; Ehrlich, Paul R; Daily, Gretchen C

    2012-09-01

    Biodiversity is declining from unprecedented land conversions that replace diverse, low-intensity agriculture with vast expanses under homogeneous, intensive production. Despite documented losses of species richness, consequences for β-diversity (changes in community composition between sites) are largely unknown, especially in the tropics. Using a 10-year data set on Costa Rican birds, we find that low-intensity agriculture sustained β-diversity across large scales on a par with forest. In high-intensity agriculture, low local (α) diversity inflated β-diversity as a statistical artefact. Therefore, at small spatial scales, intensive agriculture appeared to retain β-diversity. Unlike in forest or low-intensity systems, however, high-intensity agriculture also homogenised vegetation structure over large distances, thereby decoupling the fundamental ecological pattern of bird communities changing with geographical distance. This ~40% decline in species turnover indicates a significant decline in β-diversity at large spatial scales. These findings point the way towards multi-functional agricultural systems that maintain agricultural productivity while simultaneously conserving biodiversity. © 2012 Blackwell Publishing Ltd/CNRS.
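
    A numpy sketch of the statistical artefact mentioned above, using Whittaker's multiplicative β-diversity (β = γ / mean α). The communities are simulated; the only point illustrated is that low local richness (α) can inflate β even when all sites draw from the same small species pool.

```python
import numpy as np

rng = np.random.default_rng(4)
pool = np.arange(60)                      # regional species pool (assumed)

def beta(n_sites, alpha_per_site, pool_size):
    """Whittaker's beta = regional richness / local richness."""
    sites = [rng.choice(pool[:pool_size], alpha_per_site, replace=False)
             for _ in range(n_sites)]
    gamma = len(set(np.concatenate(sites)))
    return gamma / alpha_per_site

# Forest-like: rich local communities. Intensive-agriculture-like: the
# species pool collapses, yet the tiny alpha still yields a high ratio.
print("high alpha, large pool:", beta(20, 30, 60))
print("low alpha, small pool: ", beta(20, 4, 12))
```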

  13. The 2014–2015 Ebola virus disease outbreak and primary healthcare delivery in Liberia: Time-series analyses for 2010–2016

    PubMed Central

    Beste, Jason; Toomay, Stephen J.; Dunbar, Nelson; Bawo, Luke; Wesseh, Chea Sanford

    2018-01-01

    Background The aim of this study is to estimate the immediate and lasting effects of the 2014–2015 Ebola virus disease (EVD) outbreak on public-sector primary healthcare delivery in Liberia using 7 years of comprehensive routine health information system data. Methods and findings We analyzed 10 key primary healthcare indicators before, during, and after the EVD outbreak using 31,836 facility-month service outputs from 1 January 2010 to 31 December 2016 across a census of 379 public-sector health facilities in Liberia (excluding Montserrado County). All indicators had statistically significant decreases during the first 4 months of the EVD outbreak, with all indicators having their lowest raw mean outputs in August 2014. Decreases in outputs comparing the end of the initial EVD period (September 2014) to May 2014 (pre-EVD) ranged in magnitude from a 67.3% decrease in measles vaccinations (95% CI: −77.9%, −56.8%, p < 0.001) and a 61.4% decrease in artemisinin-based combination therapy (ACT) treatments for malaria (95% CI: −69.0%, −53.8%, p < 0.001) to a 35.2% decrease in first antenatal care (ANC) visits (95% CI: −45.8%, −24.7%, p < 0.001) and a 38.5% decrease in medroxyprogesterone acetate doses (95% CI: −47.6%, −29.5%, p < 0.001). Following the nadir of system outputs in August 2014, all indicators showed statistically significant increases from October 2014 to December 2014. All indicators had significant positive trends during the post-EVD period, with every system output exceeding pre-Ebola forecasted trends for 3 consecutive months by November 2016. Health system outputs lost during and after the EVD outbreak were large and sustained for most indicators. Prior to exceeding pre-EVD forecasted trends for 3 months, we estimate statistically significant cumulative losses of −776,110 clinic visits (95% CI: −1,480,896, −101,357, p = 0.030); −24,449 bacille Calmette–Guérin vaccinations (95% CI: −45,947, −2,020, p = 0.032); −9,129 measles vaccinations (95% CI: −12,312, −5,659, p < 0.001); −17,191 postnatal care (PNC) visits within 6 weeks of birth (95% CI: −28,344, −5,775, p = 0.002); and −101,857 ACT malaria treatments (95% CI: −205,839, −2,139, p = 0.044) due to the EVD outbreak. Other outputs showed statistically significant cumulative losses only through December 2014, including losses of −12,941 first pentavalent vaccinations (95% CI: −20,309, −5,527, p = 0.002); −5,122 institutional births (95% CI: −8,767, −1,234, p = 0.003); and −45,024 acute respiratory infections treated (95% CI: −66,185, −24,019, p < 0.001). Compared to pre-EVD forecasted trends, medroxyprogesterone acetate doses and first ANC visits did not show statistically significant net losses. ACT treatment for malaria was the only indicator with an estimated net increase in system outputs through December 2016, showing an excess of +78,583 outputs (95% CI: −309,417, +450,661, p = 0.634) compared to pre-EVD forecasted trends, although this increase was not statistically significant. However, comparing December 2013 to December 2017, ACT malaria cases have increased 49.2% (95% CI: 33.9%, 64.5%, p < 0.001). Compared to pre-EVD forecasted trends, there remains a statistically significant loss of −15,144 PNC visits within 6 weeks (95% CI: −29,453, −787, p = 0.040) through December 2016. Conclusions The Liberian public-sector primary healthcare system has made strides towards recovery from the 2014–2015 EVD outbreak. All primary healthcare indicators tracked have recovered to pre-EVD levels as of November 2016. 
Yet, for most indicators, it took more than 1 year to recover to pre-EVD levels. During this time, large losses of essential primary healthcare services occurred compared to what would have been expected had the EVD outbreak not occurred. The disruption of malaria case management during the EVD outbreak may have resulted in increased malaria cases. Large and sustained investments in public-sector primary care health system strengthening are urgently needed for EVD-affected countries. PMID:29462138
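
    A minimal interrupted-time-series sketch, assuming statsmodels: fit the pre-EVD trend, forecast the counterfactual through the outbreak and recovery, and sum (observed minus expected) to estimate cumulative losses. The monthly series, onset month, and shock size are simulated stand-ins for the facility-month outputs, and the sketch omits the paper's confidence-interval machinery.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
months = np.arange(84)                 # Jan 2010 - Dec 2016
onset = 53                             # assumed outbreak month (mid-2014)
y = 1000 + 3 * months + rng.normal(0, 30, 84)
y[onset:onset + 6] -= np.linspace(400, 100, 6)   # shock, partial rebound

pre = months < onset
fit = sm.OLS(y[pre], sm.add_constant(months[pre])).fit()
expected = fit.predict(sm.add_constant(months))  # counterfactual trend

cum_loss = np.sum(y[onset:] - expected[onset:])
print(f"cumulative outputs vs. pre-EVD forecast: {cum_loss:+.0f}")
# The paper's CIs come from the forecast uncertainty of the pre-period fit.
```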

  14. Genetic interactions contribute less than additive effects to quantitative trait variation in yeast

    PubMed Central

    Bloom, Joshua S.; Kotenko, Iulia; Sadhu, Meru J.; Treusch, Sebastian; Albert, Frank W.; Kruglyak, Leonid

    2015-01-01

    Genetic mapping studies of quantitative traits typically focus on detecting loci that contribute additively to trait variation. Genetic interactions are often proposed as a contributing factor to trait variation, but the relative contribution of interactions to trait variation is a subject of debate. Here we use a very large cross between two yeast strains to accurately estimate the fraction of phenotypic variance due to pairwise QTL–QTL interactions for 20 quantitative traits. We find that this fraction is 9% on average, substantially less than the contribution of additive QTL (43%). Statistically significant QTL–QTL pairs typically have small individual effect sizes, but collectively explain 40% of the pairwise interaction variance. We show that pairwise interaction variance is largely explained by pairs of loci at least one of which has a significant additive effect. These results refine our understanding of the genetic architecture of quantitative traits and help guide future mapping studies. PMID:26537231
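
    A toy decomposition in the spirit of the study above (all parameters assumed): simulate biallelic loci with additive effects plus a few pairwise interactions, then compare the R² of an additive-only linear model with the R² gained by adding the interaction terms.

```python
import numpy as np

rng = np.random.default_rng(6)
n, p = 4000, 20
G = rng.integers(0, 2, (n, p)).astype(float)       # haploid 0/1 genotypes
beta = rng.normal(0, 0.3, p)                       # additive effects
pairs = [(0, 1), (2, 3), (4, 5)]                   # assumed QTL-QTL pairs
y = G @ beta + sum(0.25 * G[:, i] * G[:, j] for i, j in pairs)
y += rng.normal(0, 1.0, n)                         # environmental noise

def r2(X):
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return 1 - np.sum((y - X @ coef) ** 2) / np.sum((y - y.mean()) ** 2)

X_add = np.column_stack([np.ones(n), G])
X_int = np.column_stack([X_add] + [G[:, i] * G[:, j] for i, j in pairs])
print(f"additive R^2 = {r2(X_add):.3f}, +interactions R^2 = {r2(X_int):.3f}")
```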

  15. Maximal use of kinematic information for the extraction of the mass of the top quark in single-lepton ttbar events at D0

    NASA Astrophysics Data System (ADS)

    Estrada Vigil, Juan Cruz

    The mass of the top (t) quark has been measured in the lepton+jets channel of tt¯ final states studied by the DØ and CDF experiments at Fermilab using data from Run I of the Tevatron pp¯ collider. The result published by DØ is 173.3 +/- 5.6(stat) +/- 5.5(syst) GeV. We present a different method to perform this measurement using the existing data. The new technique uses all available kinematic information in an event, and provides a significantly smaller statistical uncertainty than achieved in previous analyses. The preliminary results presented in this thesis indicate a statistical uncertainty for the extracted mass of the top quark of 3.5 GeV, which represents a significant improvement over the previous value of 5.6 GeV. The method of analysis is very general, and may be particularly useful in situations where there is a small signal and a large background.

  16. Impact of distributions on the archetypes and prototypes in heterogeneous nanoparticle ensembles.

    PubMed

    Fernandez, Michael; Wilson, Hugh F; Barnard, Amanda S

    2017-01-05

    The magnitude and complexity of the structural and functional data available on nanomaterials require data analytics, statistical analysis and information technology to drive discovery. We demonstrate that multivariate statistical analysis can recognise the sets of truly significant nanostructures and their most relevant properties in heterogeneous ensembles with different probability distributions. The prototypical and archetypal nanostructures of five virtual ensembles of Si quantum dots (SiQDs) with Boltzmann, frequency, normal, Poisson and random distributions are identified using clustering and archetypal analysis, where we find that their diversity is defined by size and shape, regardless of the type of distribution. At the convex hull of the SiQD ensembles, simple configuration archetypes can efficiently describe a large number of SiQDs, whereas more complex shapes are needed to represent the average ordering of the ensembles. This approach provides a route towards the characterisation of computationally intractable virtual nanomaterial spaces, which can convert big data into smart data and significantly reduce the workload to simulate experimentally relevant virtual samples.

  17. Improving Project Performance through Implementation of Agile Methodologies in the Renewable Energy Construction Industry

    NASA Astrophysics Data System (ADS)

    Hernandez Mendez, Arturo

    Collaborative inquiry within undergraduate research experiences (UREs) is an effective curriculum tool to support student growth. This study seeks to understand how collaborative inquiry within undergraduate biology students' experiences is affected in faculty-mentored versus non-mentored experiences at a large private southeastern university. Undergraduate biology students engaged in UREs (faculty-mentored and non-mentored experiences) were examined for statistically significant differences in student self-efficacy. Self-efficacy was measured in three subcomponents (thinking and working like a scientist, scientific self-efficacy, and scientific identity) from student responses obtained in an online survey. Responses were analyzed using a nonparametric equivalent of a t test (the Mann-Whitney U test) to make comparisons between faculty-mentored and non-mentored student groups. The conclusions of this study highlight the statistically significant effect of faculty mentoring in all three subcomponents. Faculty and university policy makers can apply these findings to develop further support for effective faculty mentoring practices in UREs.
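
    A sketch of the test named above, assuming SciPy is available: compare self-efficacy scores between mentored and non-mentored groups without assuming normality. The scores are fabricated for illustration.

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(7)
mentored = rng.normal(4.1, 0.6, 45).clip(1, 5)      # 5-point scale, simulated
non_mentored = rng.normal(3.6, 0.7, 52).clip(1, 5)

u, p = mannwhitneyu(mentored, non_mentored, alternative="two-sided")
print(f"U = {u:.1f}, p = {p:.4f}")   # p < 0.05 would mirror the reported
                                     # significant mentoring effect
```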

  18. Scalable detection of statistically significant communities and hierarchies, using message passing for modularity

    PubMed Central

    Zhang, Pan; Moore, Cristopher

    2014-01-01

    Modularity is a popular measure of community structure. However, maximizing the modularity can lead to many competing partitions, with almost the same modularity, that are poorly correlated with each other. It can also produce illusory "communities" in random graphs where none exist. We address this problem by using the modularity as a Hamiltonian at finite temperature and using an efficient belief propagation algorithm to obtain the consensus of many partitions with high modularity, rather than looking for a single partition that maximizes it. We show analytically and numerically that the proposed algorithm works all the way down to the detectability transition in networks generated by the stochastic block model. It also performs well on real-world networks, revealing large communities in some networks where previous work has claimed no communities exist. Finally, we show that by applying our algorithm recursively, subdividing communities until no statistically significant subcommunities can be found, we can detect hierarchical structure in real-world networks more efficiently than previous methods. PMID:25489096
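
    A hedged sketch, assuming networkx: compute the modularity of a candidate partition and compare it against a degree-matched random graph, since (as noted above) random graphs can show deceptively high modularity. This uses greedy modularity maximization as a stand-in, not the paper's belief propagation.

```python
import networkx as nx
from networkx.algorithms import community

G = nx.karate_club_graph()
parts = community.greedy_modularity_communities(G)
print("real graph Q =", round(community.modularity(G, parts), 3))

# Degree-matched null model: similar Q values here would suggest the
# detected "communities" carry little statistical significance.
R = nx.configuration_model([d for _, d in G.degree()], seed=0)
R = nx.Graph(R)                          # collapse multi-edges for simplicity
R.remove_edges_from(nx.selfloop_edges(R))
rand_parts = community.greedy_modularity_communities(R)
print("random graph Q =", round(community.modularity(R, rand_parts), 3))
```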

  19. SOS1 gene polymorphisms are associated with gestational diabetes mellitus in a Chinese population: Results from a nested case-control study in Taiyuan, China.

    PubMed

    Chen, Qiong; Yang, Hailan; Feng, Yongliang; Zhang, Ping; Wu, Weiwei; Li, Shuzhen; Thompson, Brian; Wang, Xin; Peng, Tingting; Wang, Fang; Xie, Bingjie; Guo, Pengge; Li, Mei; Wang, Ying; Zhao, Nan; Wang, Suping; Zhang, Yawei

    2018-03-01

    Gestational diabetes mellitus is a growing public health concern due to its large disease burden; however, the underlying pathophysiology remains unclear. Therefore, we examined the relationship between 107 single-nucleotide polymorphisms in insulin signalling pathway genes and gestational diabetes mellitus risk using a nested case-control study. The SOS1 rs7598922 GA and AA genotypes were statistically significantly associated with reduced gestational diabetes mellitus risk (p-trend = 0.0006) compared with the GG genotype. At the gene level, SOS1 was statistically significantly associated with gestational diabetes mellitus risk after adjusting for multiple comparisons. Moreover, the AGGA and GGGG haplotypes in the SOS1 gene were associated with reduced risk of gestational diabetes mellitus. Our study provides evidence for an association between the SOS1 gene and risk of gestational diabetes mellitus; however, its role in the pathogenesis of gestational diabetes mellitus will need to be verified by further studies.

  20. Quality Assessments of Long-Term Quantitative Proteomic Analysis of Breast Cancer Xenograft Tissues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Jian-Ying; Chen, Lijun; Zhang, Bai

    The identification of protein biomarkers requires large-scale analysis of human specimens to achieve statistical significance. In this study, we evaluated the long-term reproducibility of an iTRAQ (isobaric tags for relative and absolute quantification) based quantitative proteomics strategy using one channel for universal normalization across all samples. A total of 307 liquid chromatography tandem mass spectrometric (LC-MS/MS) analyses were completed, generating 107 one-dimensional (1D) LC-MS/MS datasets and 8 offline two-dimensional (2D) LC-MS/MS datasets (25 fractions for each set) for human-in-mouse breast cancer xenograft tissues representative of basal and luminal subtypes. Such large-scale studies require the implementation of robust metrics to assess the contributions of technical and biological variability in the qualitative and quantitative data. Accordingly, we developed a quantification confidence score based on the quality of each peptide-spectrum match (PSM) to remove quantification outliers from each analysis. After combining confidence score filtering and statistical analysis, reproducible protein identification and quantitative results were achieved from LC-MS/MS datasets collected over a 16-month period.

  1. Understanding Statistics and Statistics Education: A Chinese Perspective

    ERIC Educational Resources Information Center

    Shi, Ning-Zhong; He, Xuming; Tao, Jian

    2009-01-01

    In recent years, statistics education in China has made great strides. However, there still exists a fairly large gap with the advanced levels of statistics education in more developed countries. In this paper, we identify some existing problems in statistics education in Chinese schools and make some proposals as to how they may be overcome. We…

  2. Different Manhattan project: automatic statistical model generation

    NASA Astrophysics Data System (ADS)

    Yap, Chee Keng; Biermann, Henning; Hertzmann, Aaron; Li, Chen; Meyer, Jon; Pao, Hsing-Kuo; Paxia, Salvatore

    2002-03-01

    We address the automatic generation of large geometric models. This is important in visualization for several reasons. First, many applications need access to large but interesting data models. Second, we often need such data sets with particular characteristics (e.g., urban models, park and recreation landscape). Thus we need the ability to generate models with different parameters. We propose a new approach for generating such models. It is based on a top-down propagation of statistical parameters. We illustrate the method in the generation of a statistical model of Manhattan. But the method is generally applicable in the generation of models of large geographical regions. Our work is related to the literature on generating complex natural scenes (smoke, forests, etc) based on procedural descriptions. The difference in our approach stems from three characteristics: modeling with statistical parameters, integration of ground truth (actual map data), and a library-based approach for texture mapping.

  3. Generic Difference Between Early and Late Stages of BATSE Gamma-Ray Bursts

    NASA Technical Reports Server (NTRS)

    Mitrofanov, Igor G.; Litvak, Maxim L.; Anfimov, Dimitrij S.; Sanin, Anton B.; Briggs, Michael S.; Paciesas, William S.; Pendleton, Geoffrey N.; Preece, Robert D.; Meegan, Charles A.

    2001-01-01

    The early and late stages of gamma-ray bursts are studied in a statistical analysis of the large sample of long BATSE events. The primary peak is used as the boundary between the early and late stages of emission. Significant differences are found between the stages: the early stage is shorter, it has harder emission, and it becomes a smaller fraction of the total burst duration for burst groups of decreasing intensity.

  5. Large-scale online semantic indexing of biomedical articles via an ensemble of multi-label classification models.

    PubMed

    Papanikolaou, Yannis; Tsoumakas, Grigorios; Laliotis, Manos; Markantonatos, Nikos; Vlahavas, Ioannis

    2017-09-22

    In this paper we present the approach that we employed to deal with large-scale multi-label semantic indexing of biomedical papers. This work was mainly implemented within the context of the BioASQ challenge (2013-2017), a challenge concerned with biomedical semantic indexing and question answering. Our main contribution is a MUlti-Label Ensemble method (MULE) that incorporates a McNemar statistical significance test in order to validate the combination of the constituent machine learning algorithms. Secondary contributions include a study on the temporal aspects of the BioASQ corpus (observations apply also to BioASQ's super-set, the PubMed articles collection) and the proper parametrization of the algorithms used to deal with this challenging classification task. The ensemble method that we developed is compared to other approaches in experimental scenarios with subsets of the BioASQ corpus, giving positive results. In our participation in the BioASQ challenge we obtained the first place in 2013 and the second place in the four following years, steadily outperforming MTI, the indexing system of the National Library of Medicine (NLM). The results of our experimental comparisons suggest that employing a statistical significance test to validate the ensemble method's choices is the optimal approach for ensembling multi-label classifiers, especially in contexts with many rare labels.
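
    A minimal sketch of the McNemar test mentioned above, assuming statsmodels: given the predictions of two classifiers on the same documents, the test asks whether their disagreements are asymmetric enough to prefer one over the other. The 2x2 counts are fabricated.

```python
import numpy as np
from statsmodels.stats.contingency_tables import mcnemar

# rows: classifier A correct / wrong; cols: classifier B correct / wrong
table = np.array([[510, 64],
                  [23, 403]])
result = mcnemar(table, exact=False, correction=True)
print(f"statistic = {result.statistic:.2f}, p = {result.pvalue:.4g}")
# In an ensemble like MULE, a significant difference on the off-diagonal
# disagreements could justify how constituent models are combined.
```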

  6. Risk of Idiopathic Dilated Cardiomyopathy in 29 000 Patients With Celiac Disease

    PubMed Central

    Emilsson, Louise; Andersson, Bert; Elfström, Peter; Green, Peter H.R.; Ludvigsson, Jonas F.

    2012-01-01

    Background Dilated cardiomyopathy (DCM) is a rare disease of largely unknown origin. Previous studies have suggested an increased prevalence of celiac disease (CD) in patients with DCM. These studies, however, were based on a maximum of 5 patients with both CD and DCM. In the present large Swedish population-based cohort study, we examined the risk of idiopathic DCM in patients with CD determined by small-intestinal histopathology. Methods and Results From 2006 to 2008, we collected duodenal/jejunal biopsy data on CD (equal to villous atrophy, Marsh stage 3, n=29 071 unique individuals) from (all) 28 pathology departments in Sweden. These individuals were compared with 144 429 reference individuals matched for age, sex, calendar year, and county. Data on DCM were obtained through the National Patient Register and confirmed by patient charts and echocardiography data. During follow-up, 17 patients with CD and 52 reference individuals developed idiopathic DCM. Thus, patients with CD were at an increased risk of idiopathic DCM (hazard ratio, 1.73; 95% confidence interval, 1.00 to 3.00), although the risk estimate failed to attain statistical significance (P=0.052). Conclusion This nationwide study found a moderately but not statistically significantly increased risk of idiopathic DCM in patients with biopsy-verified CD. (J Am Heart Assoc. 2012;1:e001594 doi: 10.1161/JAHA.112.001594.) PMID:23130142

  7. Particle precipitation prior to large earthquakes of both the Sumatra and Philippine Regions: A statistical analysis

    NASA Astrophysics Data System (ADS)

    Fidani, Cristiano

    2015-12-01

    A study of the statistical correlation between low L-shell electrons precipitating into the atmosphere and strong earthquakes is presented. More than 11 years of Medium Energy Protons Electrons Detector data from the NOAA-15 Sun-synchronous polar orbiting satellite were analysed. Electron fluxes were analysed using a set of adiabatic coordinates. From this, significant electron counting rate fluctuations were evidenced during geomagnetically quiet periods. Electron counting rates were compared to earthquakes by defining a seismic event L-shell, obtained by radially projecting the epicentre geographical positions to a given altitude towards the zenith. Counting rates were grouped in every satellite semi-orbit together with strong seismic events whose L-shell coordinates were close to each other. NOAA-15 electron data from July 1998 to December 2011 were compared for nearly 1800 earthquakes with magnitudes larger than or equal to 6 occurring worldwide. When considering 30-100 keV precipitating electrons detected by the vertical NOAA-15 telescope and earthquake epicentre projections at altitudes greater than 1300 km, a significant correlation appeared, with electron precipitation detected 2-3 h prior to large events in the Sumatra and Philippine Regions. This is in physical agreement with the different correlation times obtained from past studies that considered particles with greater energies. The discussion of satellite orbits and detectors below is useful for future satellite missions for earthquake mitigation.

  8. Similar range of motion and function after resurfacing large-head or standard total hip arthroplasty

    PubMed Central

    2013-01-01

    Background and purpose Large-size hip articulations may improve range of motion (ROM) and function compared to a 28-mm THA, and the low risk of dislocation allows the patients more activity postoperatively. On the other hand, the greater extent of surgery for resurfacing hip arthroplasty (RHA) could impair rehabilitation. We investigated the effect of head size and surgical procedure on postoperative rehabilitation in a randomized clinical trial (RCT). Methods We followed randomized groups of RHAs, large-head THAs and standard THAs at 2 months, 6 months, 1 and 2 years postoperatively, recording clinical rehabilitation parameters. Results Large articulations increased the mean total range of motion by 13° during the first 6 postoperative months. The increase was not statistically significant and was transient. The 2-year total ROM (SD) for RHA, standard THA, and large-head THA was 221° (35), 232° (36), and 225° (30) respectively, but the differences were not statistically significant. The 3 groups were similar regarding Harris hip score, UCLA activity score, step rate, and sick leave. Interpretation Head size had no influence on range of motion. The lack of restriction allowed for large articulations did not improve the clinical and patient-perceived outcomes. The more extensive surgical procedure of RHA did not impair the rehabilitation. This project is registered at ClinicalTrials.gov under # NCT01113762. PMID:23530872

  9. Bladder radiotherapy treatment: A retrospective comparison of 3-dimensional conformal radiotherapy, intensity-modulated radiation therapy, and volumetric-modulated arc therapy plans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pasciuti, Katia, E-mail: k.pasciuti@virgilio.it; Kuthpady, Shrinivas; Anderson, Anne

    To examine tumor and organ response when different radiotherapy planning techniques are used, ten patients with confirmed bladder tumors were first treated using 3-dimensional conformal radiotherapy (3DCRT), and subsequently the original plans were re-optimized using intensity-modulated radiation therapy (IMRT) and volumetric-modulated arc therapy (VMAT) techniques. Target coverage in terms of conformity and homogeneity indices, tumor control probability (TCP), and organ dose limits, including integral dose analysis, were evaluated. In addition, monitor units (MUs) and treatment delivery times were compared. Better minimum target coverage (1.3%) was observed in VMAT plans when compared to the 3DCRT and IMRT ones, confirmed by statistically significant conformity index (CI) results. Large differences were observed among techniques in the integral dose results for the femoral heads. Even if no statistically significant differences were reported for rectum and tissue, a large amount of energy deposition was observed in 3DCRT plans. In any case, VMAT plans provided better organ and tissue sparing, confirmed also by the normal tissue complication probability (NTCP) analysis as well as a better TCP result. Our analysis showed better overall results for planning using VMAT techniques. Furthermore, the total treatment time reduction observed among techniques, including gantry and collimator rotation, could encourage using the more recent technique, reducing target movements and patient discomfort.

  10. Publication of statistically significant research findings in prosthodontics & implant dentistry in the context of other dental specialties.

    PubMed

    Papageorgiou, Spyridon N; Kloukos, Dimitrios; Petridis, Haralampos; Pandis, Nikolaos

    2015-10-01

    To assess the hypothesis that there is excessive reporting of statistically significant studies published in prosthodontic and implantology journals, which could indicate selective publication, the last 30 issues of 9 journals in prosthodontics and implant dentistry were hand-searched for articles with statistical analyses. The percentages of significant and non-significant results were tabulated by parameter of interest. Univariable/multivariable logistic regression analyses were applied to identify possible predictors of reporting statistically significant findings. The results of this study were compared with similar studies in dentistry using random-effects meta-analyses. Of the 2323 included studies, 71% reported statistically significant results, with the proportion of significant results ranging from 47% to 86%. Multivariable modeling identified geographical area and involvement of a statistician as predictors of statistically significant results. Compared to interventional studies, the odds that in vitro and observational studies would report statistically significant results were increased by 1.20 times (OR: 2.20, 95% CI: 1.66-2.92) and 0.35 times (OR: 1.35, 95% CI: 1.05-1.73), respectively. The probability of statistically significant results from randomized controlled trials was significantly lower compared to other study designs (difference: 30%, 95% CI: 11-49%). Likewise, the probability of statistically significant results in prosthodontics and implant dentistry was lower compared to other dental specialties, but this result did not reach statistical significance (P>0.05). The majority of studies identified in the fields of prosthodontics and implant dentistry presented statistically significant results. The same trend existed in publications of other specialties in dentistry. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. LD-SPatt: large deviations statistics for patterns on Markov chains.

    PubMed

    Nuel, G

    2004-01-01

    Statistics on Markov chains are widely used for the study of patterns in biological sequences. Statistics on these models can be computed through several approaches, of which central limit theorem (CLT) based Gaussian approximations are among the most popular. Unfortunately, in order to find a pattern of interest, these methods have to deal with tail-distribution events for which the CLT approximation is especially poor. In this paper, we propose a new approach based on large deviations theory to assess pattern statistics. We first recall theoretical results for empirical mean (level 1) as well as empirical distribution (level 2) large deviations on Markov chains. Then, we present applications of these results, focusing on numerical issues. LD-SPatt is the name of GPL software implementing these algorithms. We compare this approach to several existing ones in terms of complexity and reliability and show that the large deviations are more reliable than the Gaussian approximations in absolute values as well as in terms of ranking, and are at least as reliable as compound Poisson approximations. We then finally discuss some further possible improvements and applications of this new method.

  12. A Method for Characterizing Phenotypic Changes in Highly Variable Cell Populations and its Application to High Content Screening of Arabidopsis thaliana Protoplasts

    PubMed Central

    Johnson, Gregory R.; Kangas, Joshua D.; Dovzhenko, Alexander; Trojok, Rüdiger; Voigt, Karsten; Majarian, Timothy D.; Palme, Klaus; Murphy, Robert F.

    2017-01-01

    Quantitative image analysis procedures are necessary for the automated discovery of effects of drug treatment in large collections of fluorescent micrographs. When compared to their mammalian counterparts, the effects of drug conditions on protein localization in plant species are poorly understood and underexplored. To investigate this relationship, we generated a large collection of images of single plant cells after various drug treatments. For this, protoplasts were isolated from six transgenic lines of A. thaliana expressing fluorescently tagged proteins. Nine drugs at three concentrations were applied to protoplast cultures followed by automated image acquisition. For image analysis, we developed a cell segmentation protocol for detecting drug effects using a Hough-transform based region of interest detector and a novel cross-channel texture feature descriptor. In order to determine treatment effects, we summarized differences between treated and untreated experiments with an L1 Cramér-von Mises statistic. The distribution of these statistics across all pairs of treated and untreated replicates was compared to the variation within control replicates to determine the statistical significance of observed effects. Using this pipeline, we report the dose dependent drug effects in the first high-content Arabidopsis thaliana drug screen of its kind. These results can function as a baseline for comparison to other protein organization modeling approaches in plant cells. PMID:28245335
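
    A sketch, assuming SciPy >= 1.7, of the summary idea described above: a two-sample Cramer-von Mises statistic comparing a treated feature distribution against an untreated one, with significance judged against control-vs-control variation. Note that the paper uses an L1 variant, whereas SciPy implements the standard statistic; feature values here are simulated.

```python
import numpy as np
from scipy.stats import cramervonmises_2samp

rng = np.random.default_rng(8)
untreated = rng.normal(0.0, 1.0, 300)          # texture feature, control
treated = rng.normal(0.4, 1.1, 300)            # hypothetical drug effect

res = cramervonmises_2samp(treated, untreated)
null = [cramervonmises_2samp(rng.normal(0, 1, 300),
                             rng.normal(0, 1, 300)).statistic
        for _ in range(200)]                   # control-vs-control spread
print(f"statistic = {res.statistic:.3f}, "
      f"control 95th pct = {np.quantile(null, 0.95):.3f}")
```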

  13. SparRec: An effective matrix completion framework of missing data imputation for GWAS

    NASA Astrophysics Data System (ADS)

    Jiang, Bo; Ma, Shiqian; Causey, Jason; Qiao, Linbo; Hardin, Matthew Price; Bitts, Ian; Johnson, Daniel; Zhang, Shuzhong; Huang, Xiuzhen

    2016-10-01

    Genome-wide association studies present computational challenges for missing data imputation, as advances in genotyping technologies are generating datasets of large sample sizes with sample sets genotyped on multiple SNP chips. We present a new framework, SparRec (Sparse Recovery), for imputation, with the following properties: (1) The optimization models of SparRec, based on low rank and a low number of matrix co-clusters, differ from current statistical methods. While our low-rank matrix completion (LRMC) model is similar to Mendel-Impute, our matrix co-clustering factorization (MCCF) model is completely new. (2) SparRec, like other matrix completion methods, can be flexibly applied to missing data imputation for large meta-analyses with different cohorts genotyped on different sets of SNPs, even when there is no reference panel. This kind of meta-analysis is very challenging for current statistics-based methods. (3) SparRec has consistent performance and achieves high recovery accuracy even when the missing data rate is as high as 90%. Compared with Mendel-Impute, our low-rank based method achieves similar accuracy and efficiency, while the co-clustering based method has advantages in running time. The testing results show that SparRec has significant advantages and competitive performance over other state-of-the-art statistical methods, including Beagle and fastPhase.
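
    A generic low-rank matrix-completion sketch (not SparRec itself): an iterative truncated-SVD ("hard-impute") scheme that repeatedly fills missing entries with the current low-rank reconstruction. Real genotype data would be coded 0/1/2; a random low-rank matrix stands in here, and the rank is assumed known.

```python
import numpy as np

rng = np.random.default_rng(9)
M = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 100))  # true rank 5
mask = rng.random(M.shape) < 0.6                           # 40% missing
filled = np.where(mask, M, 0.0)                            # naive start

for _ in range(50):
    U, s, Vt = np.linalg.svd(filled, full_matrices=False)
    low_rank = (U[:, :5] * s[:5]) @ Vt[:5]                 # rank-5 truncation
    filled = np.where(mask, M, low_rank)   # keep observed, impute the rest

err = np.sqrt(np.mean((low_rank[~mask] - M[~mask]) ** 2))
print(f"RMSE on missing entries: {err:.3f}")
```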

  14. International migration data: their problems and usefulness in Venezuela.

    PubMed

    Torrealba, R

    1987-01-01

    During the 1940s and 1950s, Venezuela was an important destination for migrants from Southern Europe, a flow that disappeared almost entirely during the 1960s, to be replaced by border movements and the largely illegal migration of Colombians. The oil boom of the 1970s saw an increase in the latter, which may have subsided during the 1980s due to the more difficult economic conditions that have also led to significant emigration of Venezuelans and former immigrants. Data collection systems that provide information on migrants include the National Population and Housing Census, the National Household Survey, migration surveys, arrival and departure statistics, registration systems operated by the Direccion General Sectorial de Identificacion y Control de Extranjeros, the 1980 regularization drive, statistics gathered by the Ministry of Labor, and vital and civil registration statistics. The lack of effective coordination among the different government agencies gathering information and the administrative nature of the data collected give rise to problems of comparability. Mechanisms to publish and disseminate the available data are not well developed, so researchers often have no access to potentially useful sources of information. Problems of timeliness in the publication of the most widely used information are also present, as is the large gap in data pertaining to emigration, be it of Venezuelan nationals or of immigrants leaving the country.

  15. Gravity Waves characteristics and their impact on turbulent transport above an Antarctic Ice Sheet

    NASA Astrophysics Data System (ADS)

    Cava, Daniela; Giostra, Umberto; Katul, Gabriel

    2016-04-01

    Turbulence within the stable boundary layer (SBL) remains a ubiquitous feature of many geophysical flows, especially over glaciers and ice sheets. Although numerous studies have investigated various aspects of boundary layer motion during stable atmospheric conditions, a unified picture of turbulent transport within the SBL remains elusive. In a strongly stratified SBL, turbulence generation is frequently associated with interactions with sub-meso scale motions that are often a combination of gravity waves (GWs) and horizontal modes. While some progress has been made in the inclusion of GW parameterisation within global models, the description and parameterisation of the turbulence-wave interaction remain an open question. The discrimination between waves and turbulence is a focal point needed to make progress, as these two motions have different properties with regard to heat, moisture and pollutant transport. In fact, the occurrence of GWs can cause significant differences and ambiguities in the interpretation of turbulence statistics and fluxes if they are not filtered from the analysis a priori. In this work, the characteristics of GWs and their impact on turbulent statistics were investigated using wind velocity components and scalars collected above an Antarctic ice sheet during an austral summer. Antarctica is an ideal location for exploring the characteristics of GWs because of persistent, strongly stable atmospheric conditions in the lower troposphere. Periods dominated by wavy motions were identified by analysing time series measured by fast-response instrumentation. The nature and features of the GWs were investigated using Fourier cross-spectral indicators. The detected waves were frequently characterised by variable amplitude and period; moreover, they often produced non-stationarity and large intermittency in turbulent fluctuations, which can significantly alter the estimation of turbulence statistics in general and fluxes in particular. A multiresolution decomposition based on the Haar wavelet was applied to separate gravity waves from turbulent fluctuations when a sufficiently well-defined spectral gap existed. Statistics computed after removing the wavy disturbances highlight the large impact of gravity waves on second-order turbulent quantities. One of the most affected parameters is the turbulent kinetic energy, in particular its longitudinal and lateral components. The effect of wave activity on momentum and scalar fluxes is more complex, because waves can produce large errors in the sign and magnitude of computed turbulent fluxes, or they can themselves contribute to intermittent turbulent mixing. The proposed filtering procedure based on the multiresolution decomposition restored the correct sign of the turbulent sensible heat flux values. These findings highlight the importance of correctly evaluating the impact of wave components when the goal is to determine the turbulent transport of mass and energy in the SBL.
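
    A minimal numpy sketch of the Haar multiresolution idea described above: the Haar approximation at a dyadic scale is a non-overlapping block average, so removing it splits the record into a slow (wavy) part and a fast (turbulent) residual. The 10 Hz sampling, 120 s wave period, noise level, and 60 s gap scale are all assumptions for illustration; in practice the gap scale is chosen from the multiresolution (co)spectra.

```python
import numpy as np

rng = np.random.default_rng(10)
fs, n = 10.0, 2**14                         # 10 Hz sonic data, ~27 min
t = np.arange(n) / fs
w = 0.4 * np.sin(2 * np.pi * t / 120) + 0.15 * rng.standard_normal(n)

def haar_lowpass(x, window):
    """Non-overlapping block means: the Haar approximation at this scale."""
    m = len(x) // window * window
    means = x[:m].reshape(-1, window).mean(axis=1)
    return np.repeat(means, window)

window = 2**9                               # 512 samples = 51.2 s < gap
w_wave = haar_lowpass(w, window)            # sub-meso / gravity-wave part
w_turb = w[:len(w_wave)] - w_wave           # turbulent fluctuations

print(f"variance raw = {w.var():.4f}, wave part = {w_wave.var():.4f}, "
      f"turbulent part = {w_turb.var():.4f}")
```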

  16. The effect of baryons in the cosmological lensing PDFs

    NASA Astrophysics Data System (ADS)

    Castro, Tiago; Quartin, Miguel; Giocoli, Carlo; Borgani, Stefano; Dolag, Klaus

    2018-07-01

    Observational cosmology is passing through a unique moment of grandeur with the amount of quality data growing fast. However, in order to better take advantage of this moment, data analysis tools have to keep up the pace. Understanding the effect of baryonic matter on the large-scale structure is one of the challenges to be faced in cosmology. In this work, we have thoroughly studied the effect of baryonic physics on different lensing statistics. Making use of the Magneticum Pathfinder suite of simulations, we show that the influence of luminous matter on the 1-point lensing statistics of point sources is significant, enhancing the probability of magnified objects with μ > 3 by a factor of 2 and the occurrence of multiple images by a factor of 5-500, depending on the source redshift and size. We also discuss the dependence of the lensing statistics on the angular resolution of sources. Our results and methodology were carefully tested to guarantee that our uncertainties are much smaller than the effects here presented.

  17. On the Statistical Dependency of Identity Theft on Demographics

    NASA Astrophysics Data System (ADS)

    di Crescenzo, Giovanni

    An improved understanding of the identity theft problem is widely agreed to be necessary to succeed in counter-theft efforts in legislative, financial and research institutions. In this paper we report on a statistical study about the existence of relationships between identity theft and area demographics in the US. The identity theft data chosen was the number of citizen complaints to the Federal Trade Commission in a large number of US municipalities. The list of demographics used for any such municipality included: estimated population, median resident age, estimated median household income, percentage of citizens with a high school or higher degree, percentage of unemployed residents, percentage of married residents, percentage of foreign born residents, percentage of residents living in poverty, density of law enforcement employees, crime index, and political orientation according to the 2004 presidential election. Our study findings, based on linear regression techniques, include statistically significant relationships between the number of identity theft complaints and a non-trivial subset of these demographics.

  18. A statistical investigation into the relationship between meteorological parameters and suicide

    NASA Astrophysics Data System (ADS)

    Dixon, Keith W.; Shulman, Mark D.

    1983-06-01

    Many previous studies of relationships between weather and suicides have been inconclusive and contradictory. This study investigated the relationship between suicide frequency and meteorological conditions in people who are psychologically predisposed to commit suicide. Linear regressions of diurnal temperature change, departure of temperature from the climatic norm, mean daytime sky cover, and the number of hours of precipitation for each day were performed on daily suicide totals using standard computer methods. Statistical analyses of suicide data for days with and without frontal passages were also performed. Days with five or more suicides (clusterdays) were isolated, and their weather parameters compared with those of nonclusterdays. Results show that neither suicide totals nor clusterday occurrence can be predicted using these meteorological parameters, since statistically significant relationships were not found. Although the data hinted that frontal passages and large daily temperature changes may occur on days with above average suicide totals, it was concluded that the influence of the weather parameters used, on the suicide rate, is a minor one, if indeed one exists.

  19. A Flexible Approach for the Statistical Visualization of Ensemble Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Potter, K.; Wilson, A.; Bremer, P.

    2009-09-29

    Scientists are increasingly moving towards ensemble data sets to explore relationships present in dynamic systems. Ensemble data sets combine spatio-temporal simulation results generated using multiple numerical models, sampled input conditions and perturbed parameters. While ensemble data sets are a powerful tool for mitigating uncertainty, they pose significant visualization and analysis challenges due to their complexity. We present a collection of overview and statistical displays linked through a high level of interactivity to provide a framework for gaining key scientific insight into the distribution of the simulation results as well as the uncertainty associated with the data. In contrast to methods that present large amounts of diverse information in a single display, we argue that combining multiple linked statistical displays yields a clearer presentation of the data and facilitates a greater level of visual data analysis. We demonstrate this approach using driving problems from climate modeling and meteorology and discuss generalizations to other fields.

  20. A statistical study of global ionospheric map total electron content changes prior to occurrences of M ≥ 6.0 earthquakes during 2000-2014

    NASA Astrophysics Data System (ADS)

    Thomas, J. N.; Huard, J.; Masci, F.

    2017-02-01

    There are many reports on the occurrence of anomalous changes in the ionosphere prior to large earthquakes. However, whether or not these changes are reliable precursors that could be useful for earthquake prediction is controversial within the scientific community. To test a possible statistical relationship between ionospheric disturbances and earthquakes, we compare changes in the total electron content (TEC) of the ionosphere with occurrences of M ≥ 6.0 earthquakes globally for 2000-2014. We use TEC data from the global ionosphere map (GIM) and an earthquake list declustered for aftershocks. For each earthquake, we look for anomalous changes in GIM-TEC within 2.5° latitude and 5.0° longitude of the earthquake location (the spatial resolution of GIM-TEC). Our analysis has not found any statistically significant changes in GIM-TEC prior to earthquakes. Thus, we have found no evidence that would suggest that monitoring changes in GIM-TEC might be useful for predicting earthquakes.
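
    A hedged sketch of the style of test described: compare the rate of TEC anomalies in pre-earthquake windows against the background rate. The anomaly definition (running median plus 1.5 IQR) and all data below are invented for illustration, not the authors' exact recipe:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      tec = rng.gamma(5.0, 4.0, size=5000)           # synthetic daily TEC series
      quake_days = rng.choice(4000, size=60, replace=False) + 500

      med = np.array([np.median(tec[max(0, t - 15):t + 1]) for t in range(len(tec))])
      iqr = np.subtract(*np.percentile(tec, [75, 25]))
      anomaly = tec > med + 1.5 * iqr                 # boolean anomaly flags

      pre = np.concatenate([anomaly[d - 5:d] for d in quake_days])  # 5-day windows
      res = stats.binomtest(int(pre.sum()), len(pre), p=anomaly.mean())
      print(res.pvalue)                               # large p => no precursor signal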

  1. The effect of baryons in the cosmological lensing PDFs

    NASA Astrophysics Data System (ADS)

    Castro, Tiago; Quartin, Miguel; Giocoli, Carlo; Borgani, Stefano; Dolag, Klaus

    2018-05-01

    Observational cosmology is passing through a unique moment of grandeur, with the amount of quality data growing fast. However, in order to take better advantage of this moment, data analysis tools have to keep pace. Understanding the effect of baryonic matter on the large-scale structure is one of the challenges to be faced in cosmology. In this work, we have thoroughly studied the effect of baryonic physics on different lensing statistics. Making use of the Magneticum Pathfinder suite of simulations, we show that the influence of luminous matter on the 1-point lensing statistics of point sources is significant, enhancing the probability of magnified objects with μ > 3 by a factor of 2 and the occurrence of multiple images by a factor of 5-500, depending on the source redshift and size. We also discuss the dependence of the lensing statistics on the angular resolution of sources. Our results and methodology were carefully tested in order to guarantee that our uncertainties are much smaller than the effects presented here.

  2. Spatial statistical analysis of tree deaths using airborne digital imagery

    NASA Astrophysics Data System (ADS)

    Chang, Ya-Mei; Baddeley, Adrian; Wallace, Jeremy; Canci, Michael

    2013-04-01

    High resolution digital airborne imagery offers unprecedented opportunities for observation and monitoring of vegetation, providing the potential to identify, locate and track individual vegetation objects over time. Analytical tools are required to quantify relevant information. In this paper, locations of trees over a large area of native woodland vegetation were identified using morphological image analysis techniques; tree deaths over the area had been detected in our previous work (Wallace et al., 2008). Methods of spatial point process statistics were then applied to estimate the spatially-varying tree death risk, and to show that it is significantly non-uniform. The study area is a major source of ground water for the city of Perth, and the work was motivated by the need to understand and quantify vegetation changes in the context of water extraction and a drying climate. The influence of hydrological variables on tree death risk was investigated using spatial statistics (graphical exploratory methods, spatial point pattern modelling and diagnostics).

  3. Is everything we eat associated with cancer? A systematic cookbook review.

    PubMed

    Schoenfeld, Jonathan D; Ioannidis, John P A

    2013-01-01

    Nutritional epidemiology is a highly prolific field. Debates on associations of nutrients with disease risk are common in the literature and attract attention in public media. We aimed to examine the conclusions, statistical significance, and reproducibility in the literature on associations between specific foods and cancer risk. We selected 50 common ingredients from random recipes in a cookbook. PubMed queries identified recent studies that evaluated the relation of each ingredient to cancer risk. Information regarding author conclusions and relevant effect estimates was extracted. When >10 articles were found, we focused on the 10 most recent articles. Forty ingredients (80%) had articles reporting on their cancer risk. Of 264 single-study assessments, 191 (72%) concluded that the tested food was associated with an increased (n = 103) or a decreased (n = 88) risk; 75% of the risk estimates had weak (0.05 > P ≥ 0.001) or no statistical (P > 0.05) significance. Statistically significant results were more likely than nonsignificant findings to be published in the study abstract rather than only in the full text (P < 0.0001). Meta-analyses (n = 36) presented more conservative results; only 13 (26%) reported an increased (n = 4) or a decreased (n = 9) risk (6 had more than weak statistical support). The median RRs (IQRs) for studies that concluded an increased or a decreased risk were 2.20 (1.60, 3.44) and 0.52 (0.39, 0.66), respectively. The RRs from the meta-analyses were on average null (median: 0.96; IQR: 0.85, 1.10). Associations with cancer risk or benefits have been claimed for most food ingredients. Many single studies highlight implausibly large effects, even though evidence is weak. Effect sizes shrink in meta-analyses.

  4. Extended local similarity analysis (eLSA) of microbial community and other time series data with replicates.

    PubMed

    Xia, Li C; Steele, Joshua A; Cram, Jacob A; Cardon, Zoe G; Simmons, Sheri L; Vallino, Joseph J; Fuhrman, Jed A; Sun, Fengzhu

    2011-01-01

    The increasing availability of time series microbial community data from metagenomics and other molecular biological studies has enabled the analysis of large-scale microbial co-occurrence and association networks. Among the many analytical techniques available, the Local Similarity Analysis (LSA) method is unique in that it captures local and potentially time-delayed co-occurrence and association patterns in time series data that cannot otherwise be identified by ordinary correlation analysis. However, LSA, as originally developed, does not consider time series data with replicates, which hinders the full exploitation of available information. With replicates, it is possible to understand the variability of the local similarity (LS) score and to obtain its confidence interval. We extended our LSA technique to time series data with replicates and termed it extended LSA, or eLSA. Simulations showed the capability of eLSA to capture subinterval and time-delayed associations. We implemented the eLSA technique in an easy-to-use analytic software package. The software pipeline integrates data normalization, statistical correlation calculation, statistical significance evaluation, and association network construction steps. We applied the eLSA technique to microbial community and gene expression datasets, where unique time-dependent associations were identified. The extended LSA analysis technique was demonstrated to reveal statistically significant local and potentially time-delayed association patterns in replicated time series data beyond those found by ordinary correlation analysis. These statistically significant associations can provide insights into the real dynamics of biological systems. The newly designed eLSA software efficiently streamlines the analysis and is freely available from the eLSA homepage, which can be accessed at http://meta.usc.edu/softs/lsa.
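
    As a rough illustration of the local similarity idea underlying LSA/eLSA, a dynamic-programming sketch in Python (positive associations only; the normalization and delay limit are simplifications of the published LSA idea, not the eLSA package's exact algorithm):

      import numpy as np
      from scipy import stats

      def local_similarity(x, y, max_delay=3):
          # Keep the best locally accumulating sum of products of the
          # z-scored series, allowing alignments offset by <= max_delay.
          x = stats.zscore(x); y = stats.zscore(y)
          n = len(x)
          best = 0.0
          P = np.zeros((n + 1, n + 1))       # positive-association DP table
          for i in range(1, n + 1):
              for j in range(max(1, i - max_delay), min(n, i + max_delay) + 1):
                  P[i, j] = max(0.0, P[i - 1, j - 1] + x[i - 1] * y[j - 1])
                  best = max(best, P[i, j])
          return best / n

      rng = np.random.default_rng(2)
      a = rng.normal(size=50)
      b = np.roll(a, 2) + 0.5 * rng.normal(size=50)  # b lags a by 2 steps
      print(local_similarity(a, b))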

  5. Extended local similarity analysis (eLSA) of microbial community and other time series data with replicates

    PubMed Central

    2011-01-01

    Background The increasing availability of time series microbial community data from metagenomics and other molecular biological studies has enabled the analysis of large-scale microbial co-occurrence and association networks. Among the many analytical techniques available, the Local Similarity Analysis (LSA) method is unique in that it captures local and potentially time-delayed co-occurrence and association patterns in time series data that cannot otherwise be identified by ordinary correlation analysis. However, LSA, as originally developed, does not consider time series data with replicates, which hinders the full exploitation of available information. With replicates, it is possible to understand the variability of the local similarity (LS) score and to obtain its confidence interval. Results We extended our LSA technique to time series data with replicates and termed it extended LSA, or eLSA. Simulations showed the capability of eLSA to capture subinterval and time-delayed associations. We implemented the eLSA technique in an easy-to-use analytic software package. The software pipeline integrates data normalization, statistical correlation calculation, statistical significance evaluation, and association network construction steps. We applied the eLSA technique to microbial community and gene expression datasets, where unique time-dependent associations were identified. Conclusions The extended LSA analysis technique was demonstrated to reveal statistically significant local and potentially time-delayed association patterns in replicated time series data beyond those found by ordinary correlation analysis. These statistically significant associations can provide insights into the real dynamics of biological systems. The newly designed eLSA software efficiently streamlines the analysis and is freely available from the eLSA homepage, which can be accessed at http://meta.usc.edu/softs/lsa. PMID:22784572

  6. Coming of Age in Spain: The Self-identification, Beliefs and Self-Esteem of the Second Generation

    PubMed Central

    Portes, Alejandro; Vickstrom, Erik; Aparicio, Rosa

    2013-01-01

    We review the literature on determinants of ethnic/national self-identities and self-esteem as a prelude to examining these outcomes among a large, statistically representative sample of second generation adolescents in Madrid and Barcelona. While these psycho-social outcomes are malleable, they still represent important dimensions of immigrant adaptation and can have significant consequences both for individual mobility and collective mobilizations. Current theories are largely based on data from the USA and other Anglophone countries. The availability of a new large Spanish survey allows us to test those theories in an entirely different socio-cultural context. The analysis concludes with a structural equations model that summarizes key determinants of national identities and self-esteem among children of immigrants in Spain. Theoretical and practical implications of these findings are discussed. PMID:21899520

  7. The topology of large-scale structure. VI - Slices of the universe

    NASA Astrophysics Data System (ADS)

    Park, Changbom; Gott, J. R., III; Melott, Adrian L.; Karachentsev, I. D.

    1992-03-01

    Results of an investigation of the topology of large-scale structure in two observed slices of the universe are presented. Both slices pass through the Coma cluster, and their depths are 100 and 230/h Mpc. The present topology study shows that the largest void in the CfA slice is divided into two smaller voids by a statistically significant line of galaxies. The topology of toy models like the white noise and bubble models is shown to be inconsistent with that of the observed slices. A large N-body simulation of the biased cold dark matter model was made, and the slices are simulated by matching them in selection functions and boundary conditions. The genus curves for these simulated slices are spongelike and have a small shift in the direction of a meatball topology, like those of the observed slices.

  8. The topology of large-scale structure. VI - Slices of the universe

    NASA Technical Reports Server (NTRS)

    Park, Changbom; Gott, J. R., III; Melott, Adrian L.; Karachentsev, I. D.

    1992-01-01

    Results of an investigation of the topology of large-scale structure in two observed slices of the universe are presented. Both slices pass through the Coma cluster, and their depths are 100 and 230/h Mpc. The present topology study shows that the largest void in the CfA slice is divided into two smaller voids by a statistically significant line of galaxies. The topology of toy models like the white noise and bubble models is shown to be inconsistent with that of the observed slices. A large N-body simulation of the biased cold dark matter model was made, and the slices are simulated by matching them in selection functions and boundary conditions. The genus curves for these simulated slices are spongelike and have a small shift in the direction of a meatball topology, like those of the observed slices.

  9. Morphological comparison of Astropecten cingulatus and a new species of Astropecten (Paxillosida, Astropectinidae) from the Gulf of Mexico.

    PubMed

    Lawrence, John M; Cobb, Janessa C; Herrera, Joan C; Durán-González, Alicia; Solís-Marín, Francisco Alonso

    2018-04-09

    Astropecten cingulatus is a conspicuous species, which displays a large superomarginal plate series on the abactinal surface. Herein we describe a new species from off the Texas coast that shows the superficial appearance of A. cingulatus, including these large superomarginal plates, but with armature differing from that of typological A. cingulatus. This species shows the actinal surface of the inferomarginal plates without the squamules present on A. cingulatus. In addition, the adambulacral plates possess only a single large central spine surrounded by a circle of spines, rather than spine rows. The abactinal paxillar region is also very narrow. Statistical analysis of these and other morphological characters showed that the specimens differed significantly from those of A. cingulatus. The regression of the slope of R:SM# vs. R was significant but the intercept was not; therefore, the two species are indistinguishable at small sizes based on R:SM. Compared to known Atlantic Astropecten spp., these observed characters warrant the description of a new species, Astropecten karankawai, for the specimens from off the coasts of Texas and Mexico.

  10. A statistical study on synergetic effects of atmospheric rivers and cut-off lows upon precipitation

    NASA Astrophysics Data System (ADS)

    Tsuji, H.; Takayabu, Y. N.

    2017-12-01

    The effects of atmospheric rivers (ARs) on precipitation in the western North Pacific (WNP) region have been less studied than those in the eastern Pacific. Recently, Hirota et al. (2016, MWR) analyzed the extreme rainfall event which caused a disastrous flood in Hiroshima, Japan, on 19 August 2014. They showed that the coincidence of a very moist troposphere associated with an AR and the instability and dynamical ascent associated with a cut-off low (COL) played a significant role in the rainfall event. As in this case, an AR in the WNP region seems to enhance rainfall when additional instability or ascending motion is brought by another system. However, it is not clear to what extent large-scale conditions such as ARs and COLs can determine the locations of severe rainfall. In this study, we statistically investigate the differences in precipitation between cases in which an AR exists near a COL (AR category) and those in which it does not (non-AR category). Precipitation data are obtained from hourly Global Satellite Mapping of Precipitation (GSMaP) data (0.1 degree grid). We define ARs and COLs with six-hourly JRA55 (1.25 degree grid) precipitable water (PW) and 350 K isentropic potential vorticity, respectively. The analyses are conducted in the WNP region (100E-160W, 0-60N), from March 2000 to February 2013. Composite results show that the precipitation amount around the front side of the COL's moving direction (north) in the AR category (139 cases) is larger than that in the non-AR category (63 cases). In particular, the difference in precipitation around the front-left side (north-west) is statistically significant. The relationship among the locations of the COL, the positive PW anomaly region corresponding to the AR, and the region where the difference in precipitation is statistically significant is similar to that among the locations of the COL, the AR, and the extreme precipitation area in the Hiroshima event. This indicates that a precipitation enhancement can occur through a synergistic effect of large-scale conditions such as an AR and a COL, and that the Hiroshima flood was one such case. Acknowledgments: This research was supported by the Environment Research and Technology Development Fund (2-1503) of the Environmental Restoration and Conservation Agency, and JSPS KAKENHI Grant Number 15H02132.

  11. Rate of complications due to neuromuscular scoliosis spine surgery in a 30-years consecutive series.

    PubMed

    Turturro, Francesco; Montanaro, Antonello; Calderaro, Cosma; Labianca, Luca; Di Sanzo, Vincenzo; Ferretti, Andrea

    2017-10-01

    The aim of this study was to evaluate the rate of intraoperative and postoperative complications in a large series of patients affected by neuromuscular scoliosis. It was a monocentric retrospective study. In this study, complications were defined as events that significantly affected the course of treatment, such as prolonging the hospital stay, requiring a subsequent surgical procedure, or compromising the final result of the treatment. Of the 358 patients affected by neuromuscular scoliosis treated from January 1985 to December 2010, the 185 who met the inclusion criteria were included in the study. Sixty-six complications were recorded in 55/185 patients. Of these 66 complications, 54 occurred in 46/120 patients with Luque's instrumentation, while only 12 occurred in 9/65 patients with hybrid instrumentation, and this difference was statistically significant (p < 0.05). Complications were suffered by 11/126 patients with pelvic fixation and 5/59 without pelvic fixation, as well as by 45/156 patients treated by a posterior approach alone and 10/29 patients who underwent a combined anterior-posterior approach, but neither difference was statistically significant (p > 0.05). The surgical treatment of neuromuscular scoliosis is burdened by a large number of complications. An accurate knowledge of possible complications is mandatory in order to prepare strategies to prevent adverse events. A difference in definitions can completely change the results for better or worse: in our own series, adverse events amounted to almost 30% of cases, but complications leading to complete failure amounted to 9.19% of patients.

  12. How LEND sees the water on the Moon

    NASA Astrophysics Data System (ADS)

    Sanin, Anton; Mitrofanov, Igor; Litvak, Maxim; Boynton, William; Bodnarik, Julia; Hamara, Dave; Harshman, Karl; Chin, Gordon; Evans, Larry; Livengood, Timothy; McClanahan, Timothy; Sagdeev, Roald; Starr, Richard

    2016-04-01

    The Lunar Exploration Neutron Detector (LEND) has been operating in orbit around the Moon on board the Lunar Reconnaissance Orbiter (LRO) spacecraft for more than six years. LEND was designed and manufactured to investigate the presence and determine the average amount of hydrogen in the upper (~1 m depth) subsurface layer of the lunar regolith, with a spatial resolution of ~10 km from a 50 km orbit, and to test the hypothesis that the permanently shadowed regions (PSRs) at circumpolar regions are the main reservoirs of a large deposition of water ice on the Moon. One of the most interesting and surprising LEND observations is that not all large PSRs contain a detectable amount of hydrogen; rather, there are neutron suppression regions (NSRs) with statistically significant suppression of the neutron flux. The NSRs partially overlap or include the PSRs in the craters Cabeus, Shoemaker and Haworth (in the south) and Rozhdestvensky U (in the north), but a significant part of their area extends into sunlit territory. This means that hydrogen may be preserved for a long time, or even accumulated, in a subsurface regolith layer of sunlit areas. The majority of PSRs do not show statistically significant suppressions of neutron flux in comparison with their sunlit vicinity. This suggests that permanent shadow is not the only necessary condition for hydrogen accumulation and preservation in the lunar subsurface. A method for estimating water equivalent hydrogen (WEH) in the top ~1 m of regolith using LEND data has been developed. Maps of the WEH distribution in the north and south polar regions will be presented and discussed. WEH estimates for the case of a hydrogen-bearing regolith layer covered by dry regolith will also be presented for the largest NSRs.

  13. The best motivator priorities parents choose via analytical hierarchy process

    NASA Astrophysics Data System (ADS)

    Farah, R. N.; Latha, P.

    2015-05-01

    Motivation is probably the most important factor that educators can target in order to improve learning. Numerous cross-disciplinary theories have been postulated to explain motivation. While each of these theories has some truth, no single theory seems to adequately explain all human motivation. The fact is that human beings in general, and pupils in particular, are complex creatures with complex needs and desires. In this paper, the Analytic Hierarchy Process (AHP) is proposed as a solution to the large, dynamic and complex real-world multi-criteria decision-making problem parents face in selecting the most suitable motivator when choosing a school for their children. Data were analyzed using SPSS 17.0 ("Statistical Package for Social Science") software. Both descriptive and inferential statistics were used: descriptive statistics to characterize the demographic factors of the responding pupils and parents, and inferential statistics to determine the pupils' and parents' highest motivator priorities. AHP was used to rank the criteria considered by parents, namely school principals, teachers, pupils and parents. The moderating factor is the set of schools selected on the basis of "Standard Kualiti Pendidikan Malaysia" (SKPM) in Ampang. Inferential statistics such as one-way ANOVA were used to assess significance, and the data were used to calculate the AHP weightings. School principals were found to be the best motivator for parents in choosing a school for their pupils, followed by teachers, parents and pupils.
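
    To illustrate the core AHP step the abstract relies on, a small Python sketch derives priority weights from a pairwise comparison matrix via the principal eigenvector, with Saaty's consistency ratio (the judgment values below are invented, not the study's):

      import numpy as np

      # criteria: principals, teachers, parents, pupils (hypothetical judgments)
      A = np.array([[1,   3,   5,   7],
                    [1/3, 1,   3,   5],
                    [1/5, 1/3, 1,   3],
                    [1/7, 1/5, 1/3, 1]])

      vals, vecs = np.linalg.eig(A)
      k = np.argmax(vals.real)
      w = np.abs(vecs[:, k].real)
      w /= w.sum()                          # priority weights, sum to 1

      n = A.shape[0]
      ci = (vals.real[k] - n) / (n - 1)     # consistency index
      cr = ci / 0.90                        # random index RI = 0.90 for n = 4
      print(np.round(w, 3), round(cr, 3))   # CR < 0.10 => acceptably consistent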

  14. Cosmological constraints with weak-lensing peak counts and second-order statistics in a large-field survey

    NASA Astrophysics Data System (ADS)

    Peel, Austin; Lin, Chieh-An; Lanusse, François; Leonard, Adrienne; Starck, Jean-Luc; Kilbinger, Martin

    2017-03-01

    Peak statistics in weak-lensing maps access the non-Gaussian information contained in the large-scale distribution of matter in the Universe. They are therefore a promising complementary probe to two-point and higher-order statistics to constrain our cosmological models. Next-generation galaxy surveys, with their advanced optics and large areas, will measure the cosmic weak-lensing signal with unprecedented precision. To prepare for these anticipated data sets, we assess the constraining power of peak counts in a simulated Euclid-like survey on the cosmological parameters Ωm, σ8, and w_0^de. In particular, we study how Camelus, a fast stochastic model for predicting peaks, can be applied to such large surveys. The algorithm avoids the need for time-costly N-body simulations, and its stochastic approach provides full PDF information of observables. Considering peaks with a signal-to-noise ratio ≥ 1, we measure the abundance histogram in a mock shear catalogue of approximately 5000 deg^2 using a multiscale mass-map filtering technique. We constrain the parameters of the mock survey using Camelus combined with approximate Bayesian computation, a robust likelihood-free inference algorithm. Peak statistics yield a tight but significantly biased constraint in the σ8-Ωm plane, as measured by the width ΔΣ8 of the 1σ contour. We find Σ8 = σ8(Ωm/0.27)^α = 0.77 (+0.06/-0.05) with α = 0.75 for a flat ΛCDM model. The strong bias indicates the need to better understand and control the model systematics before applying it to a real survey of this size or larger. We perform a calibration of the model and compare results to those from the two-point correlation functions ξ± measured on the same field. We calibrate the ξ± result as well, since its contours are also biased, although not as severely as for peaks. In this case, we find for peaks Σ8 = 0.76 (+0.02/-0.03) with α = 0.65, while for the combined ξ+ and ξ- statistics the values are Σ8 = 0.76 (+0.02/-0.01) and α = 0.70. We conclude that the constraining power can therefore be comparable between the two weak-lensing observables in large-field surveys. Furthermore, the tilt in the σ8-Ωm degeneracy direction for peaks with respect to that of ξ± suggests that a combined analysis would yield tighter constraints than either measure alone. As expected, w_0^de cannot be well constrained without a tomographic analysis, but its degeneracy directions with the other two varied parameters are still clear for both peaks and ξ±.

  15. Probabilistic Models For Earthquakes With Large Return Periods In Himalaya Region

    NASA Astrophysics Data System (ADS)

    Chaudhary, Chhavi; Sharma, Mukat Lal

    2017-12-01

    Determination of the frequency of large earthquakes is of paramount importance for seismic risk assessment, as large events contribute a significant fraction of the total deformation, and these long-return-period events with low probability of occurrence are not easily captured by classical distributions. Generally, with a small catalogue, these larger events follow a different distribution function from the smaller and intermediate events. It is thus of special importance to use statistical methods that analyse as closely as possible the range of extreme values, or the tail of the distribution, in addition to the main distribution. The generalised Pareto distribution family is widely used for modelling events which cross a specified threshold value; the Pareto, truncated Pareto, and tapered Pareto are special cases of this family. In this work, the probability of earthquake occurrence has been estimated using the Pareto, truncated Pareto, and tapered Pareto distributions. As a case study, we consider the Himalayas, one of the most active zones of the world, whose ongoing orogeny generates large earthquakes. The whole Himalayan region has been divided into five seismic source zones according to seismotectonics and the clustering of events. The estimated probabilities of earthquake occurrence have also been compared with the modified Gutenberg-Richter distribution and the characteristic recurrence distribution. The statistical analysis reveals that the tapered Pareto distribution describes the seismicity of the seismic source zones better than the other distributions considered in the present study.
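
    For concreteness, a sketch of the survival functions involved, contrasting the pure Pareto tail with the tapered Pareto's exponential taper (the threshold, index and corner values are illustrative assumptions, not the study's fitted parameters):

      import numpy as np

      def pareto_sf(x, xt, beta):
          # probability of exceeding x under a pure Pareto tail
          return (xt / x) ** beta

      def tapered_pareto_sf(x, xt, beta, xc):
          # Kagan-style tapered Pareto: power law times exponential taper
          return (xt / x) ** beta * np.exp((xt - x) / xc)

      xt, beta, xc = 1e17, 0.65, 1e20        # threshold, index, corner moment (N m)
      x = np.logspace(17, 21, 5)
      print(pareto_sf(x, xt, beta))
      print(tapered_pareto_sf(x, xt, beta, xc))  # taper suppresses the extreme tail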

  16. Universal statistics of vortex tangles in three-dimensional random waves

    NASA Astrophysics Data System (ADS)

    Taylor, Alexander J.

    2018-02-01

    The tangled nodal lines (wave vortices) in random, three-dimensional wavefields are studied as an exemplar of a fractal loop soup. Their statistics are a three-dimensional counterpart to the characteristic random behaviour of nodal domains in quantum chaos, but in three dimensions the filaments can wind around one another to give distinctly different large scale behaviours. By tracing numerically the structure of the vortices, their conformations are shown to follow recent analytical predictions for random vortex tangles with periodic boundaries, where the local disorder of the model ‘averages out’ to produce large scale power law scaling relations whose universality classes do not depend on the local physics. These results explain previous numerical measurements in terms of an explicit effect of the periodic boundaries, where the statistics of the vortices are strongly affected by the large scale connectedness of the system even at arbitrarily high energies. The statistics are investigated primarily for static (monochromatic) wavefields, but the analytical results are further shown to directly describe the reconnection statistics of vortices evolving in certain dynamic systems, or occurring during random perturbations of the static configuration.

  17. Stock market returns and clinical trial results of investigational compounds: an event study analysis of large biopharmaceutical companies.

    PubMed

    Hwang, Thomas J

    2013-01-01

    For biopharmaceutical companies, investments in research and development are risky, and the results from clinical trials are key inflection points in the process. Few studies have explored how and to what extent the public equity market values clinical trial results. Our study dataset matched announcements of clinical trial results for investigational compounds from January 2011 to May 2013 with daily stock market returns of large United States-listed pharmaceutical and biotechnology companies. Event study methodology was used to examine the relationship between clinical research events and changes in stock returns. We identified public announcements for clinical trials of 24 investigational compounds, including 16 (67%) positive and 8 (33%) negative events. The majority of announcements were for Phase 3 clinical trials (N = 13, 54%), and for oncologic (N = 7, 29%) and neurologic (N = 6, 24%) indications. The median cumulative abnormal returns on the day of the announcement were 0.8% (95% confidence interval [CI]: -2.3, 13.4%; P = 0.02) for positive events and -2.0% (95% CI: -9.1, 0.7%; P = 0.04) for negative events, with statistically significant differences from zero. On the day immediately following the announcement, firms with positive events were associated with stock price corrections, with median cumulative abnormal returns falling to 0.4% (95% CI: -3.8, 12.3%; P = 0.33). For firms with negative announcements, the median cumulative abnormal returns were -1.7% (95% CI: -9.5, 1.0%; P = 0.03), and remained significantly negative over the two-day event window. The magnitude of abnormal returns did not differ statistically by indication, by trial phase, or between biotechnology and pharmaceutical firms. The release of clinical trial results is an economically significant event and has meaningful effects on market value for large biopharmaceutical companies. Stock return underperformance due to negative events is greater in magnitude and persists longer than abnormal returns due to positive events, suggesting asymmetric market reactions.
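
    A minimal sketch of the event-study methodology described, assuming a standard market-model estimation window followed by cumulated abnormal returns (all returns are simulated and window lengths are placeholders, not the study's):

      import numpy as np

      rng = np.random.default_rng(3)
      mkt = rng.normal(0, 0.01, 260)                 # daily market returns
      stock = 0.0002 + 1.1 * mkt + rng.normal(0, 0.015, 260)
      stock[250] += 0.04                             # announcement-day jump

      est, event = slice(0, 250), slice(250, 252)    # estimation / event windows
      b, a = np.polyfit(mkt[est], stock[est], 1)     # market-model beta, alpha
      ar = stock[event] - (a + b * mkt[event])       # abnormal returns
      print(ar.cumsum()[-1])                         # cumulative abnormal return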

  18. Multimorbidity Patterns in the Elderly: A New Approach of Disease Clustering Identifies Complex Interrelations between Chronic Conditions

    PubMed Central

    Schäfer, Ingmar; von Leitner, Eike-Christin; Schön, Gerhard; Koller, Daniela; Hansen, Heike; Kolonko, Tina; Kaduszkiewicz, Hanna; Wegscheider, Karl; Glaeske, Gerd; van den Bussche, Hendrik

    2010-01-01

    Objective Multimorbidity is a common problem in the elderly that is significantly associated with higher mortality, increased disability and functional decline. Information about interactions of chronic diseases can help to facilitate diagnosis, amend prevention and enhance the patients' quality of life. The aim of this study was to increase the knowledge of specific processes of multimorbidity in an unselected elderly population by identifying patterns of statistically significantly associated comorbidity. Methods Multimorbidity patterns were identified by exploratory tetrachoric factor analysis based on claims data of 63,104 males and 86,176 females in the age group 65+. Analyses were based on 46 diagnosis groups incorporating all ICD-10 diagnoses of chronic diseases with a prevalence ≥ 1%. Both genders were analyzed separately. Persons were assigned to multimorbidity patterns if they had at least three diagnosis groups with a factor loading of 0.25 on the corresponding pattern. Results Three multimorbidity patterns were found: 1) cardiovascular/metabolic disorders [prevalence female: 30%; male: 39%], 2) anxiety/depression/somatoform disorders and pain [34%; 22%], and 3) neuropsychiatric disorders [6%; 0.8%]. The sampling adequacy was meritorious (Kaiser-Meyer-Olkin measure: 0.85 and 0.84, respectively) and the factors explained a large part of the variance (cumulative percent: 78% and 75%, respectively). The patterns were largely age-dependent and overlapped in a sizeable part of the population. Altogether 50% of female and 48% of male persons were assigned to at least one of the three multimorbidity patterns. Conclusion This study shows that statistically significant co-occurrence of chronic diseases can be subsumed in three prevalent multimorbidity patterns if accounting for the fact that different multimorbidity patterns share some diagnosis groups, influence each other and overlap in a large part of the population. In recognizing the full complexity of multimorbidity we might improve our ability to predict needs and achieve possible benefits for elderly patients who suffer from multimorbidity. PMID:21209965
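
    A small sketch of the assignment rule described above, reading the paper's loading criterion as a minimum loading of 0.25 (both matrices below are synthetic, not the claims data):

      import numpy as np

      rng = np.random.default_rng(4)
      loadings = rng.uniform(-0.2, 0.6, size=(46, 3))   # diagnosis groups x factors
      diag = rng.random((1000, 46)) < 0.08              # persons x diagnosis groups

      marker = loadings >= 0.25                         # groups defining each pattern
      counts = diag.astype(int) @ marker.astype(int)    # per-person marker counts
      assigned = counts >= 3                            # need >= 3 marker groups
      print(assigned.mean(axis=0))                      # prevalence of each pattern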

  19. Periodontal disease and carotid atherosclerosis: A meta-analysis of 17,330 participants.

    PubMed

    Zeng, Xian-Tao; Leng, Wei-Dong; Lam, Yat-Yin; Yan, Bryan P; Wei, Xue-Mei; Weng, Hong; Kwong, Joey S W

    2016-01-15

    The association between periodontal disease and carotid atherosclerosis has been evaluated primarily in single-center studies, and whether periodontal disease is an independent risk factor of carotid atherosclerosis remains uncertain. This meta-analysis aimed to evaluate the association between periodontal disease and carotid atherosclerosis. We searched PubMed and Embase for relevant observational studies up to February 20, 2015. Two authors independently extracted data from included studies, and odds ratios (ORs) with 95% confidence intervals (CIs) were calculated for overall and subgroup meta-analyses. Statistical heterogeneity was assessed by the chi-squared test (P<0.1 for statistical significance) and quantified by the I(2) statistic. Data analysis was conducted using the Comprehensive Meta-Analysis (CMA) software. Fifteen observational studies involving 17,330 participants were included in the meta-analysis. The overall pooled result showed that periodontal disease was associated with carotid atherosclerosis (OR: 1.27, 95% CI: 1.14-1.41; P<0.001) but statistical heterogeneity was substantial (I(2)=78.90%). Subgroup analysis of adjusted smoking and diabetes mellitus showed borderline significance (OR: 1.08; 95% CI: 1.00-1.18; P=0.05). Sensitivity and cumulative analyses both indicated that our results were robust. Findings of our meta-analysis indicated that the presence of periodontal disease was associated with carotid atherosclerosis; however, further large-scale, well-conducted clinical studies are needed to explore the precise risk of developing carotid atherosclerosis in patients with periodontal disease. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
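
    As an illustration of the pooled analysis described, a sketch of fixed-effect inverse-variance pooling of log odds ratios, with Cochran's Q and the I^2 heterogeneity statistic (the study ORs and confidence intervals below are invented):

      import numpy as np

      or_ = np.array([1.5, 1.2, 1.1, 1.4, 0.9])
      lo  = np.array([1.1, 0.9, 0.8, 1.0, 0.6])
      hi  = np.array([2.0, 1.6, 1.5, 2.0, 1.3])

      y = np.log(or_)
      se = (np.log(hi) - np.log(lo)) / (2 * 1.96)   # SE recovered from 95% CI width
      w = 1 / se**2
      pooled = np.exp((w * y).sum() / w.sum())      # fixed-effect pooled OR

      q = (w * (y - np.log(pooled))**2).sum()       # Cochran's Q
      i2 = max(0.0, (q - (len(y) - 1)) / q) * 100   # I^2 heterogeneity (%)
      print(round(pooled, 2), round(i2, 1))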

  20. On the diffuse fraction of daily and monthly global radiation for the island of Cyprus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacovides, C.P.; Hadjioannou, L.; Pashiardis, S.

    1996-06-01

    Six years of hourly global and diffuse irradiation measurements on a horizontal surface performed at Athalassa, Cyprus, are used to establish a relationship between the daily diffuse fraction and the daily clearness index. Two types of correlations - yearly and seasonal - have been developed. These correlations, of first and third order in the clearness index, are compared to the various correlations established by Collares-Pereira and Rabl (1979), Newland (1989), Erbs et al. (1982), Rao et al. (1984), Page (1961), Liu and Jordan (1960) and Lalas et al. (1987). The comparison has been performed in terms of the widely used statistical indicators, the mean bias error (MBE) and the root mean square error (RMSE); an additional statistical indicator, the t-statistic, which combines the earlier indicators, is introduced. The results indicate that the proposed yearly correlation matches the earlier correlations quite closely, and all correlations examined yield results that are statistically significant. For large values of Kt > 0.60, most of the earlier correlations exhibit a slight tendency to systematically overestimate the diffuse fraction. This marginal disagreement between the earlier correlations and the proposed model is probably significantly affected by the clear sky conditions that prevail over Cyprus for most of the time, as well as by the atmospheric humidity content. It is clear that the standard correlations examined in this analysis appear to be location-independent models for diffuse irradiation predictions, at least for the Cyprus case.
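
    A sketch of the three error indicators, using the t-statistic form commonly adopted in the solar-radiation literature, t = sqrt((n-1)*MBE^2 / (RMSE^2 - MBE^2)); whether this is exactly the authors' formulation is an assumption, and the data below are synthetic:

      import numpy as np

      rng = np.random.default_rng(5)
      obs = rng.uniform(0.1, 0.8, 200)               # observed diffuse fractions
      pred = obs + rng.normal(0.01, 0.05, 200)       # model predictions

      err = pred - obs
      n = len(err)
      mbe = err.mean()                               # mean bias error
      rmse = np.sqrt((err**2).mean())                # root mean square error
      t = np.sqrt((n - 1) * mbe**2 / (rmse**2 - mbe**2))
      print(round(mbe, 4), round(rmse, 4), round(t, 2))  # compare t to t_crit(n-1)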

  1. Electron microscopic quantification of collagen fibril diameters in the rabbit medial collateral ligament: a baseline for comparison.

    PubMed

    Frank, C; Bray, D; Rademaker, A; Chrusch, C; Sabiston, P; Bodie, D; Rangayyan, R

    1989-01-01

    To establish a normal baseline for comparison, thirty-one thousand collagen fibril diameters were measured in calibrated transmission electron microscope (TEM) photomicrographs of normal rabbit medial collateral ligaments (MCLs). A new automated method of quantitation was used to statistically compare fibril minimum-diameter distributions at one midsubstance location in both MCLs from six animals at 3 months of age (immature) and three animals at 10 months of age (mature). Pooled results demonstrate that rabbit MCLs have statistically different (p < 0.001) mean minimum diameters at these two ages. Interanimal differences in mean fibril minimum diameters were also significant (p < 0.001) and varied by 20% to 25% in both mature and immature animals. Finally, there were significant differences (p < 0.001) in mean diameters and distributions from side to side in all animals. These mean left-to-right differences were less than 10% in all mature animals but as much as 62% in some immature animals. Statistical analysis of these data demonstrates that animal-to-animal comparisons using these protocols require a large number of animals, with appropriate numbers of fibrils being measured, to detect small intergroup differences. With experiments which compare left to right ligaments, far fewer animals are required to detect similarly small differences. These results demonstrate the necessity for rigorous control of sampling, an extensive normal baseline and statistically confirmed experimental designs in any TEM comparisons of collagen fibril diameters.

  2. CALIPSO Observations of Near-Cloud Aerosol Properties as a Function of Cloud Fraction

    NASA Technical Reports Server (NTRS)

    Yang, Weidong; Marshak, Alexander; Varnai, Tamas; Wood, Robert

    2015-01-01

    This paper uses spaceborne lidar data to study how near-cloud aerosol statistics of attenuated backscatter depend on cloud fraction. The results for a large region around the Azores show that: (1) far-from-cloud aerosol statistics are dominated by samples from scenes with lower cloud fractions, while near-cloud aerosol statistics are dominated by samples from scenes with higher cloud fractions; (2) near-cloud enhancements of attenuated backscatter occur for any cloud fraction but are most pronounced for higher cloud fractions; (3) the difference in the enhancements for different cloud fractions is most significant within 5 km of clouds; (4) near-cloud enhancements can be well approximated by logarithmic functions of cloud fraction and distance to clouds. These findings demonstrate that if variability in cloud fraction across the scenes used to composite aerosol statistics is not considered, a sampling artifact will affect these statistics calculated as a function of distance to clouds. For the Azores-region dataset examined here, this artifact occurs mostly within 5 km of clouds, and exaggerates the near-cloud enhancements of lidar backscatter and color ratio by about 30%. This shows that for an accurate characterization of the changes in aerosol properties with distance to clouds, it is important to account for the impact of changes in cloud fraction.

  3. Pooling sexes when assessing ground reaction forces during walking: Statistical Parametric Mapping versus traditional approach.

    PubMed

    Castro, Marcelo P; Pataky, Todd C; Sole, Gisela; Vilas-Boas, Joao Paulo

    2015-07-16

    Ground reaction force (GRF) data from men and women are commonly pooled for analyses. However, it may not be justifiable to pool sexes on the basis of discrete parameters extracted from continuous GRF gait waveforms, because this can miss continuous effects. Forty healthy participants (20 men and 20 women) walked at a cadence of 100 steps per minute across two force plates, recording GRFs. Two statistical methods were used to test the null hypothesis of no mean GRF differences between sexes: (i) Statistical Parametric Mapping, using the entire three-component GRF waveform; and (ii) the traditional approach, using the first and second vertical GRF peaks. The Statistical Parametric Mapping results suggested large sex differences, which post-hoc analyses suggested were due predominantly to higher anterior-posterior and vertical GRFs in early stance in women compared to men. The traditional approach found statistically significant differences for the first GRF peak but similar values for the second GRF peak. These contrasting results emphasise that different parts of the waveform have different signal strengths, and thus that the traditional approach, by selecting particular metrics, can lead to arbitrary conclusions. We suggest that researchers and clinicians consider both the entire gait waveforms and sex-specificity when analysing GRF data. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Comparison of Artificial Neural Networks and ARIMA statistical models in simulations of target wind time series

    NASA Astrophysics Data System (ADS)

    Kolokythas, Kostantinos; Vasileios, Salamalikis; Athanassios, Argiriou; Kazantzidis, Andreas

    2015-04-01

    The wind is the result of complex interactions of numerous mechanisms taking place at small or large scales, so better knowledge of its behavior is essential in a variety of applications, especially in the field of power production from wind turbines. In the literature there is a considerable number of models, either physical or statistical, dealing with the problem of simulating and predicting wind speed. Among others, Artificial Neural Networks (ANNs) are widely used for the purpose of wind forecasting and, in the great majority of cases, outperform other conventional statistical models. In this study, a number of ANNs with different architectures, created and applied to a dataset of wind time series, are compared to Auto Regressive Integrated Moving Average (ARIMA) statistical models. The data consist of mean hourly wind speeds from a wind farm in a hilly Greek region and cover a period of one year (2013). The main goal is to evaluate the models' ability to successfully simulate the wind speed at a significant point (target). Goodness-of-fit statistics are computed for the comparison of the different methods. In general, the ANNs showed the best performance in the estimation of wind speed, prevailing over the ARIMA models.

  5. Effects of wind direction on coarse and fine particulate matter concentrations in southeast Kansas.

    PubMed

    Guerra, Sergio A; Lane, Dennis D; Marotz, Glen A; Carter, Ray E; Hohl, Carrie M; Baldauf, Richard W

    2006-11-01

    Field data for coarse particulate matter (PM10) and fine particulate matter (PM2.5) were collected at selected sites in southeast Kansas from March 1999 to October 2000, using portable MiniVol particulate samplers. The purpose was to assess the influence on air quality of four industrial facilities that burn hazardous waste in the area, located in the communities of Chanute, Independence, Fredonia, and Coffeyville. Both spatial and temporal variation were observed in the data. Variation due to sampling site was found to be statistically significant for PM10 but not for PM2.5. PM10 concentrations were typically slightly higher at sites located within the four study communities than at background sites. Sampling sites were located north and south of the four targeted sources to provide upwind and downwind monitoring pairs. No statistically significant differences were found between upwind and downwind samples for either PM10 or PM2.5, indicating that the targeted sources did not contribute significantly to PM concentrations. Wind direction can frequently contribute to temporal variation in air pollutant concentrations and was investigated in this study. Sampling days were divided into four classifications: predominantly south winds, predominantly north winds, calm/variable winds, and winds from other directions. The effect of wind direction was found to be statistically significant for both PM10 and PM2.5. For both size ranges, PM concentrations were typically highest on days with predominantly south winds; days with calm/variable winds generally produced higher concentrations than did those with predominantly north winds or those with winds from "other" directions. The significant effect of wind direction suggests that regional sources may exert a large influence on PM concentrations in the area.

  6. Intermittency in generalized NLS equation with focusing six-wave interactions

    NASA Astrophysics Data System (ADS)

    Agafontsev, D. S.; Zakharov, V. E.

    2015-10-01

    We study numerically the statistics of waves for a generalized one-dimensional Nonlinear Schrödinger (NLS) equation that takes into account focusing six-wave interactions, damping and pumping terms. We demonstrate the universal behavior of this system in the region of parameters where the six-wave interaction term significantly affects only the largest waves. In particular, in the statistically steady state of this system the probability density function (PDF) of wave amplitudes turns out to be strongly non-Rayleigh for large waves, with a characteristic "fat tail" decaying with amplitude |Ψ| close to ∝ exp(-γ|Ψ|), where γ > 0 is a constant. The corresponding non-Rayleigh addition to the PDF indicates strong intermittency; it vanishes in the absence of six-wave interactions and increases with the six-wave coupling coefficient.

  7. Assessing signal-to-noise in quantitative proteomics: multivariate statistical analysis in DIGE experiments.

    PubMed

    Friedman, David B

    2012-01-01

    All quantitative proteomics experiments measure variation between samples. When performing large-scale experiments that involve multiple conditions or treatments, the experimental design should include the appropriate number of individual biological replicates from each condition to enable the distinction between a relevant biological signal from technical noise. Multivariate statistical analyses, such as principal component analysis (PCA), provide a global perspective on experimental variation, thereby enabling the assessment of whether the variation describes the expected biological signal or the unanticipated technical/biological noise inherent in the system. Examples will be shown from high-resolution multivariable DIGE experiments where PCA was instrumental in demonstrating biologically significant variation as well as sample outliers, fouled samples, and overriding technical variation that would not be readily observed using standard univariate tests.
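
    A minimal PCA sketch of the kind of screening described, applied to a synthetic samples-by-spots abundance matrix (not DIGE data; the group shift is contrived so that PC1 separates the conditions):

      import numpy as np

      rng = np.random.default_rng(6)
      X = rng.normal(size=(12, 300))          # 12 samples, 300 spot abundances
      X[6:] += 0.8                            # two treatment groups, shifted means

      Xc = X - X.mean(axis=0)                 # center each spot
      U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
      scores = U * s                          # sample coordinates on the PCs
      explained = s**2 / (s**2).sum()
      print(np.round(explained[:3], 2))       # variance captured by first PCs
      print(np.round(scores[:, 0], 1))        # PC1 should separate the groups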

  8. X-ray studies of quasars with the Einstein Observatory. IV - X-ray dependence on radio emission

    NASA Technical Reports Server (NTRS)

    Worrall, D. M.; Tananbaum, H.; Giommi, P.; Zamorani, G.

    1987-01-01

    The X-ray properties of a sample of 114 radio-loud quasars observed with the Einstein Observatory are examined, and the results are compared with those obtained from a large sample of radio-quiet quasars. The results of statistical analysis of the dependence of X-ray luminosity on combined functions of optical and radio luminosity show that the dependence on both luminosities is important. However, statistically significant differences are found between subsamples of flat-radio-spectrum quasars and steep-radio-spectrum quasars with regard to the dependence of X-ray luminosity on radio luminosity alone. The data are consistent with radio-loud quasars having a physical component, not directly related to the optical luminosity, which produces the core radio luminosity plus 'extra' X-ray emission.

  9. Modified Distribution-Free Goodness-of-Fit Test Statistic.

    PubMed

    Chun, So Yeon; Browne, Michael W; Shapiro, Alexander

    2018-03-01

    Covariance structure analysis and its structural equation modeling extensions are among the most widely used methodologies in social sciences such as psychology, education, and economics. An important issue in such analysis is to assess the goodness of fit of a model under analysis. One of the most popular test statistics used in covariance structure analysis is the asymptotically distribution-free (ADF) test statistic introduced by Browne (Br J Math Stat Psychol 37:62-83, 1984). The ADF statistic can be used to test models without any specific distribution assumption (e.g., multivariate normal distribution) on the observed data. Despite its advantage, it has been shown in various empirical studies that unless sample sizes are extremely large, this ADF statistic can perform very poorly in practice. In this paper, we provide a theoretical explanation for this phenomenon and further propose a modified test statistic that improves the performance in samples of realistic size. The proposed statistic deals with the possible ill-conditioning of the involved large-scale covariance matrices.

  10. Regional-scale analysis of extreme precipitation from short and fragmented records

    NASA Astrophysics Data System (ADS)

    Libertino, Andrea; Allamano, Paola; Laio, Francesco; Claps, Pierluigi

    2018-02-01

    The rain gauge is the oldest and most accurate instrument for rainfall measurement, able to provide long series of reliable data. However, rain gauge records are often plagued by gaps, spatio-temporal discontinuities and inhomogeneities that can affect their suitability for a statistical assessment of the characteristics of extreme rainfall. Furthermore, the need to discard the shorter series to obtain robust estimates leads to ignoring a significant amount of information, which can be essential, especially when large return period estimates are sought. This work describes a robust statistical framework for dealing with uneven and fragmented rainfall records over a regional spatial domain. The proposed technique, named "patched kriging", allows one to exploit all the information available from the recorded series, independently of their length, to provide extreme rainfall estimates in ungauged areas. The methodology involves the sequential application of the ordinary kriging equations, producing a homogeneous dataset of synthetic series with uniform lengths. In this way, the errors inherent in any regional statistical estimation can be easily represented in the spatial domain and, possibly, corrected. Furthermore, the homogeneity of the obtained series provides robustness against local artefacts during the parameter-estimation phase. The application to a case study in north-western Italy demonstrates the potential of the methodology and provides a significant basis for discussing its advantages over previous techniques.

  11. Four hundred or more participants needed for stable contingency table estimates of clinical prediction rule performance.

    PubMed

    Kent, Peter; Boyle, Eleanor; Keating, Jennifer L; Albert, Hanne B; Hartvigsen, Jan

    2017-02-01

    To quantify variability in the results of statistical analyses based on contingency tables and discuss the implications for the choice of sample size for studies that derive clinical prediction rules. An analysis of three pre-existing sets of large cohort data (n = 4,062-8,674) was performed. In each data set, repeated random sampling of various sample sizes, from n = 100 up to n = 2,000, was performed 100 times at each sample size and the variability in estimates of sensitivity, specificity, positive and negative likelihood ratios, posttest probabilities, odds ratios, and risk/prevalence ratios for each sample size was calculated. There were very wide, and statistically significant, differences in estimates derived from contingency tables from the same data set when calculated in sample sizes below 400 people, and typically, this variability stabilized in samples of 400-600 people. Although estimates of prevalence also varied significantly in samples below 600 people, that relationship only explains a small component of the variability in these statistical parameters. To reduce sample-specific variability, contingency tables should consist of 400 participants or more when used to derive clinical prediction rules or test their performance. Copyright © 2016 Elsevier Inc. All rights reserved.
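
    A sketch of the resampling experiment described: repeatedly subsample a large simulated cohort at several sample sizes and watch the spread of, for example, sensitivity estimates shrink (the prevalence and test accuracies below are invented, not the cohorts' values):

      import numpy as np

      rng = np.random.default_rng(7)
      N = 8000
      disease = rng.random(N) < 0.3
      test_pos = np.where(disease, rng.random(N) < 0.8, rng.random(N) < 0.2)

      for n in (100, 200, 400, 800):
          sens = []
          for _ in range(100):                      # 100 draws per sample size
              idx = rng.choice(N, size=n, replace=False)
              d, t = disease[idx], test_pos[idx]
              sens.append((d & t).sum() / d.sum())  # sensitivity in this sample
          print(n, round(float(np.std(sens)), 3))   # spread stabilizes near n ~ 400+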

  12. Statistical tests to compare motif count exceptionalities

    PubMed Central

    Robin, Stéphane; Schbath, Sophie; Vandewalle, Vincent

    2007-01-01

    Background Finding over- or under-represented motifs in biological sequences is now a common task in genomics. Thanks to p-value calculation for motif counts, exceptional motifs are identified and represent candidate functional motifs. The present work addresses the related question of comparing the exceptionality of one motif in two different sequences. Just comparing the motif count p-values in each sequence is indeed not sufficient to decide if this motif is significantly more exceptional in one sequence compared to the other one. A statistical test is required. Results We develop and analyze two statistical tests, an exact binomial one and an asymptotic likelihood ratio test, to decide whether the exceptionality of a given motif is equivalent or significantly different in two sequences of interest. For that purpose, motif occurrences are modeled by Poisson processes, with a special care for overlapping motifs. Both tests can take the sequence compositions into account. As an illustration, we compare the octamer exceptionalities in the Escherichia coli K-12 backbone versus variable strain-specific loops. Conclusion The exact binomial test is particularly adapted for small counts. For large counts, we advise to use the likelihood ratio test which is asymptotic but strongly correlated with the exact binomial test and very simple to use. PMID:17346349
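
    A sketch of the exact binomial comparison: conditional on the total count, the count in one sequence is binomial with a success probability fixed by the expected counts. Here the expectation is taken as proportional to sequence length, ignoring the composition corrections the authors allow for; the counts and lengths are invented:

      from scipy import stats

      n1, n2 = 48, 22            # motif counts in the two sequences
      l1, l2 = 4.6e6, 1.2e6      # sequence lengths (bp)

      p0 = l1 / (l1 + l2)        # expected share of counts under equal exceptionality
      res = stats.binomtest(n1, n1 + n2, p0)
      print(res.pvalue)          # small p => the motif's exceptionality differs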

  13. Effect of crowd size on patient volume at a large, multipurpose, indoor stadium.

    PubMed

    De Lorenzo, R A; Gray, B C; Bennett, P C; Lamparella, V J

    1989-01-01

    A prediction of the patient volume expected at "mass gatherings" is desirable in order to provide optimal on-site emergency medical care. While several methods of predicting patient loads have been suggested, a reliable technique has not been established. This study examines the frequency of medical emergencies at the Syracuse University Carrier Dome, a 50,500-seat indoor stadium. Patient volume and level of care at collegiate basketball and football games, as well as rock concerts, over a 7-year period were examined and tabulated. This information was analyzed using simple regression and nonparametric statistical methods to determine the level of correlation between crowd size and patient volume. These analyses demonstrated no statistically significant increase in patient volume with increasing crowd size for basketball and football events. There was a small but statistically significant increase in patient volume with increasing crowd size for concerts. A comparison of similar crowd sizes for each of the three events showed that patient frequency is greatest for concerts and smallest for basketball. The study suggests that crowd size alone has only a minor influence on patient volume at any given event. Structuring medical services based solely on expected crowd size, without considering other influences such as event type and duration, may give poor results.

  14. Quality of water in the upper Ohio River basin and at Erie, Pennsylvania

    USGS Publications Warehouse

    Lewis, Samuel James

    1906-01-01

    This paper discusses the quality of water on the most important tributaries of Ohio River in Pennsylvania, New York, West Virginia, and Maryland, and the nature of the water supply at Erie, Pa. The amount and character of the pollution is described and the results of drinking contaminated water as shown by typhoid statistics are indicated. The conditions on the tributaries of Ohio River in Ohio are discussed in Water-Supply and Irrigation Paper No. 79, United States Geological Survey, pages 129-187. The water supplies and sewerage of small towns high up toward the head of a large drainage system do not in many cases receive the attention they should. Epidemics of a waterborne disease which affect large municipalities near the mouth of the river and therefore attract attention must necessarily have their origin in the pollution of the watershed above. It is evident, therefore, that adequate sanitation of the small towns and a water supply as carefully guarded as that of a large city would prevent disease at its very source and be far less expensive than the costly battles which are waged against epidemics in huge centers of population after disease has broken out. Typhoid fever statistics for small towns in this section are seldom available and are more or less unreliable at best. The few figures given show the existence of virulent typhoid fever in most towns of the drainage areas in certain years, and as these towns drain into the streams the liability of the water to infection is evident. The significance of typhoid fever death rates will be better understood from the statistics presented below, which have been collated from a number of cities having excellent water supplies.

  15. New Insights into Handling Missing Values in Environmental Epidemiological Studies

    PubMed Central

    Roda, Célina; Nicolis, Ioannis; Momas, Isabelle; Guihenneuc, Chantal

    2014-01-01

    Missing data are unavoidable in environmental epidemiologic surveys. The aim of this study was to compare methods for handling large amounts of missing values: omission of missing values, single and multiple imputations (through linear regression or partial least squares regression), and a fully Bayesian approach. These methods were applied to the PARIS birth cohort, where indoor domestic pollutant measurements were performed in a random sample of babies' dwellings. A simulation study was conducted to assess the performance of the different approaches with a high proportion of missing values (from 50% to 95%). Different simulation scenarios were carried out, controlling the true value of the association (odds ratio of 1.0, 1.2, and 1.4), and varying the health outcome prevalence. When a large amount of data is missing, omitting these missing data reduced statistical power and inflated standard errors, which affected the significance of the association. Single imputation underestimated the variability, and considerably increased the risk of type I error. All approaches were conservative, except the Bayesian joint model. In the case of a common health outcome, the fully Bayesian approach is the most efficient approach (low root mean square error, reasonable type I error, and high statistical power). Nevertheless, for a less prevalent event, the type I error is increased and the statistical power is reduced. The estimated posterior distribution of the OR is useful to refine the conclusion. Among the methods for handling missing values, no approach is absolutely the best, but when the usual approaches (e.g. single imputation) are not sufficient, a joint model of the missingness process and the health association is more efficient when large amounts of data are missing. PMID:25226278

  16. A standardised protocol for texture feature analysis of endoscopic images in gynaecological cancer.

    PubMed

    Neofytou, Marios S; Tanos, Vasilis; Pattichis, Marios S; Pattichis, Constantinos S; Kyriacou, Efthyvoulos C; Koutsouris, Dimitris D

    2007-11-29

    In the development of tissue classification methods, classifiers rely on significant differences between texture features extracted from normal and abnormal regions. Yet, significant differences can arise due to variations in the image acquisition method. For endoscopic imaging of the endometrium, we propose a standardized image acquisition protocol to eliminate significant statistical differences due to variations in: (i) the distance from the tissue (panoramic vs close up), (ii) difference in viewing angles and (iii) color correction. We investigate texture feature variability for a variety of targets encountered in clinical endoscopy. All images were captured at clinically optimum illumination and focus using 720 x 576 pixels and 24-bit color for: (i) a variety of testing targets from a color palette with a known color distribution, (ii) different viewing angles, (iii) two different distances from a calf endometrium and from a chicken cavity. Also, human images from the endometrium were captured and analysed. For texture feature analysis, three different sets were considered: (i) Statistical Features (SF), (ii) Spatial Gray Level Dependence Matrices (SGLDM), and (iii) Gray Level Difference Statistics (GLDS). All images were gamma corrected and the extracted texture feature values were compared against the texture feature values extracted from the uncorrected images. Statistical tests were applied to compare images from different viewing conditions so as to determine any significant differences. For the proposed acquisition procedure, results indicate that there is no significant difference in texture features between the panoramic and close up views and between angles. For a calibrated target image, gamma correction provided an acquired image that was a significantly better approximation to the original target image. In turn, this implies that the texture features extracted from the corrected images provided for better approximations to the original images. Within the proposed protocol, for human ROIs, we have found that there is a large number of texture features that showed significant differences between normal and abnormal endometrium. This study provides a standardized protocol for avoiding any significant texture feature differences that may arise due to variability in the acquisition procedure or the lack of color correction. After applying the protocol, we have found that significant differences in texture features will only be due to the fact that the features were extracted from different types of tissue (normal vs abnormal).
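
    One of the feature families named above, the co-occurrence (SGLDM) statistics, can be computed with scikit-image. The following sketch applies an illustrative gamma correction to a stand-in ROI and extracts a few co-occurrence properties; the ROI, gamma value and parameter choices are assumptions, not the paper's exact protocol.

    ```python
    # Minimal SGLDM (gray-level co-occurrence) sketch with scikit-image.
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    rng = np.random.default_rng(1)
    roi = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)  # stand-in ROI

    # Gamma-correct first, as the protocol requires, then requantize to 8 bit.
    gamma = 2.2
    corrected = ((roi / 255.0) ** (1.0 / gamma) * 255).astype(np.uint8)

    glcm = graycomatrix(corrected, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    for prop in ("contrast", "homogeneity", "energy", "correlation"):
        print(prop, graycoprops(glcm, prop).mean())
    ```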

  17. Statistical model for forecasting monthly large wildfire events in western United States

    Treesearch

    Haiganoush K. Preisler; Anthony L. Westerling

    2006-01-01

    The ability to forecast the number and location of large wildfire events (with specified confidence bounds) is important to fire managers attempting to allocate and distribute suppression efforts during severe fire seasons. This paper describes the development of a statistical model for assessing the forecasting skills of fire-danger predictors and producing 1-month-...

  18. Length and Rate of Individual Participation in Various Activities on Recreation Sites and Areas

    Treesearch

    Gary L. Tyre; George A. James

    1971-01-01

    While statistically reliable methods exist for estimating recreation use on large areas, they often prove prohibitively expensive. Inexpensive alternatives involving the length and rate of individual participation in specific activities are presented, together with data and statistics on the recreational use of three large areas on the National Forests. This...

  19. The Effects of Run-of-River Hydroelectric Power Schemes on Fish Community Composition in Temperate Streams and Rivers

    PubMed Central

    2016-01-01

    The potential environmental impacts of large-scale storage hydroelectric power (HEP) schemes have been well-documented in the literature. In Europe, awareness of these potential impacts and limited opportunities for politically-acceptable medium- to large-scale schemes, have caused attention to focus on smaller-scale HEP schemes, particularly run-of-river (ROR) schemes, to contribute to meeting renewable energy targets. Run-of-river HEP schemes are often presumed to be less environmentally damaging than large-scale storage HEP schemes. However, there is currently a lack of peer-reviewed studies on their physical and ecological impact. The aim of this article was to investigate the effects of ROR HEP schemes on communities of fish in temperate streams and rivers, using a Before-After, Control-Impact (BACI) study design. The study makes use of routine environmental surveillance data collected as part of long-term national and international monitoring programmes at 23 systematically-selected ROR HEP schemes and 23 systematically-selected paired control sites. Six area-normalised metrics of fish community composition were analysed using a linear mixed effects model (number of species, number of fish, number of Atlantic salmon—Salmo salar, number of >1 year old Atlantic salmon, number of brown trout—Salmo trutta, and number of >1 year old brown trout). The analyses showed that there was a statistically significant effect (p<0.05) of ROR HEP construction and operation on the number of species. However, no statistically significant effects were detected on the other five metrics of community composition. The implications of these findings are discussed in this article and recommendations are made for best-practice study design for future fish community impact studies. PMID:27191717
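
    The mixed-model BACI analysis described above can be sketched with the statsmodels formula API: fixed effects for period (before/after), impact (control vs. scheme site) and their interaction, whose coefficient is the BACI effect, plus a random intercept per site pair. The file and column names below are hypothetical placeholders, not the study's data.

    ```python
    # Minimal BACI mixed-model sketch; "fish_surveys.csv" and its columns
    # (n_species, period, impact, site_pair) are hypothetical placeholders.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("fish_surveys.csv")
    model = smf.mixedlm("n_species ~ period * impact", data=df,
                        groups=df["site_pair"])
    result = model.fit()
    print(result.summary())  # the period:impact coefficient is the BACI effect
    ```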

  20. Twitter-Based Analysis of the Dynamics of Collective Attention to Political Parties

    PubMed Central

    Eom, Young-Ho; Puliga, Michelangelo; Smailović, Jasmina; Mozetič, Igor; Caldarelli, Guido

    2015-01-01

    Large-scale data from social media have a significant potential to describe complex phenomena in the real world and to anticipate collective behaviors such as information spreading and social trends. One specific case of study is represented by the collective attention to the action of political parties. Not surprisingly, researchers and stakeholders tried to correlate parties' presence on social media with their performances in elections. Despite the many efforts, results are still inconclusive since this kind of data is often very noisy and significant signals could be covered by (largely unknown) statistical fluctuations. In this paper we consider the number of tweets (tweet volume) of a party as a proxy of collective attention to the party, identify the dynamics of the volume, and show that this quantity has some information on the election outcome. We find that the distribution of the tweet volume for each party follows a log-normal distribution with a positive autocorrelation of the volume over short terms, which indicates the volume has large fluctuations of the log-normal distribution yet with a short-term tendency. Furthermore, by measuring the ratio of two consecutive daily tweet volumes, we find that the evolution of the daily volume of a party can be described by means of a geometric Brownian motion (i.e., the logarithm of the volume moves randomly with a trend). Finally, we determine the optimal period of averaging tweet volume for reducing fluctuations and extracting short-term tendencies. We conclude that the tweet volume is a good indicator of parties' success in the elections when considered over an optimal time window. Our study identifies the statistical nature of collective attention to political issues and sheds light on how to model the dynamics of collective attention in social media. PMID:26161795
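
    The two distributional claims in this record, a log-normal volume distribution and geometric-Brownian-motion dynamics, are easy to check on any volume series. The sketch below uses simulated volumes (an assumption; no real Twitter data are involved): a log-normal fit is a normal fit on log-volumes, and under a GBM the log-ratios of consecutive days should look Gaussian.

    ```python
    # Log-normal fit and GBM check on a simulated daily-volume series.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    # Simulate 200 days of volumes as a GBM: log-volume is a random walk.
    log_v = np.cumsum(rng.normal(loc=0.01, scale=0.3, size=200)) + 8.0
    volume = np.exp(log_v)

    # 1) Log-normal fit (floc=0 fixes the location at zero).
    shape, loc, scale = stats.lognorm.fit(volume, floc=0)
    print("fitted sigma:", shape, "fitted median:", scale)

    # 2) GBM check: logs of consecutive-day ratios should look Gaussian.
    log_ratio = np.diff(np.log(volume))
    print("normality test on log-ratios, p =", stats.normaltest(log_ratio).pvalue)
    ```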

  1. Mid-term functional outcome after the internal fixation of distal radius fractures

    PubMed Central

    2012-01-01

    Background Distal radius fracture is a common injury with a variety of operative and non-operative management options. There remains debate as to the optimal treatment for a given patient and fracture. Despite the popularity of volar locking plate fixation, there are few large cohort or long term follow up studies to justify this modality. Our aim was to report the functional outcome of a large number of patients at a significant follow up time after fixation of their distal radius with a volar locking plate. Methods 180 patients with 183 fractures and a mean age of 62.4 years were followed up retrospectively at a mean of 30 months (Standard deviation = 10.4). Functional assessment was performed using the Disabilities of the Arm, Shoulder and Hand (DASH) and modified MAYO wrist scores. Statistical analysis was performed to identify possible variables affecting outcome and radiographs were assessed to determine time to fracture union. Results The median DASH score was 2.3 and median MAYO score was 90 for the whole group. Overall, 133 patients (74%) had a good or excellent DASH and MAYO score. Statistical analysis showed that no specific variable including gender, age, fracture type, post-operative immobilisation or surgeon grade significantly affected outcome. Complications occurred in 27 patients (15%) and in 11 patients were major (6%). Conclusion This single centre large population series demonstrates good to excellent results in the majority of patients after volar locking plate fixation of the distal radius, with complication rates comparable to other non-operative and operative treatment modalities. On this basis we recommend this mode of fixation for distal radius fractures requiting operative intervention. PMID:22280557

  2. Mid-term functional outcome after the internal fixation of distal radius fractures.

    PubMed

    Phadnis, Joideep; Trompeter, Alex; Gallagher, Kieran; Bradshaw, Lucy; Elliott, David S; Newman, Kevin J

    2012-01-26

    Distal radius fracture is a common injury with a variety of operative and non-operative management options. There remains debate as to the optimal treatment for a given patient and fracture. Despite the popularity of volar locking plate fixation, there are few large cohort or long term follow up studies to justify this modality. Our aim was to report the functional outcome of a large number of patients at a significant follow up time after fixation of their distal radius with a volar locking plate. 180 patients with 183 fractures and a mean age of 62.4 years were followed up retrospectively at a mean of 30 months (Standard deviation=10.4). Functional assessment was performed using the Disabilities of the Arm, Shoulder and Hand (DASH) and modified MAYO wrist scores. Statistical analysis was performed to identify possible variables affecting outcome and radiographs were assessed to determine time to fracture union. The median DASH score was 2.3 and median MAYO score was 90 for the whole group. Overall, 133 patients (74%) had a good or excellent DASH and MAYO score. Statistical analysis showed that no specific variable including gender, age, fracture type, post-operative immobilisation or surgeon grade significantly affected outcome. Complications occurred in 27 patients (15%) and in 11 patients were major (6%). This single centre large population series demonstrates good to excellent results in the majority of patients after volar locking plate fixation of the distal radius, with complication rates comparable to other non-operative and operative treatment modalities. On this basis we recommend this mode of fixation for distal radius fractures requiting operative intervention.

  3. The Effects of Run-of-River Hydroelectric Power Schemes on Fish Community Composition in Temperate Streams and Rivers.

    PubMed

    Bilotta, Gary S; Burnside, Niall G; Gray, Jeremy C; Orr, Harriet G

    2016-01-01

    The potential environmental impacts of large-scale storage hydroelectric power (HEP) schemes have been well-documented in the literature. In Europe, awareness of these potential impacts and limited opportunities for politically-acceptable medium- to large-scale schemes, have caused attention to focus on smaller-scale HEP schemes, particularly run-of-river (ROR) schemes, to contribute to meeting renewable energy targets. Run-of-river HEP schemes are often presumed to be less environmentally damaging than large-scale storage HEP schemes. However, there is currently a lack of peer-reviewed studies on their physical and ecological impact. The aim of this article was to investigate the effects of ROR HEP schemes on communities of fish in temperate streams and rivers, using a Before-After, Control-Impact (BACI) study design. The study makes use of routine environmental surveillance data collected as part of long-term national and international monitoring programmes at 23 systematically-selected ROR HEP schemes and 23 systematically-selected paired control sites. Six area-normalised metrics of fish community composition were analysed using a linear mixed effects model (number of species, number of fish, number of Atlantic salmon-Salmo salar, number of >1 year old Atlantic salmon, number of brown trout-Salmo trutta, and number of >1 year old brown trout). The analyses showed that there was a statistically significant effect (p<0.05) of ROR HEP construction and operation on the number of species. However, no statistically significant effects were detected on the other five metrics of community composition. The implications of these findings are discussed in this article and recommendations are made for best-practice study design for future fish community impact studies.

  4. Twitter-Based Analysis of the Dynamics of Collective Attention to Political Parties.

    PubMed

    Eom, Young-Ho; Puliga, Michelangelo; Smailović, Jasmina; Mozetič, Igor; Caldarelli, Guido

    2015-01-01

    Large-scale data from social media have a significant potential to describe complex phenomena in the real world and to anticipate collective behaviors such as information spreading and social trends. One specific case of study is represented by the collective attention to the action of political parties. Not surprisingly, researchers and stakeholders tried to correlate parties' presence on social media with their performances in elections. Despite the many efforts, results are still inconclusive since this kind of data is often very noisy and significant signals could be covered by (largely unknown) statistical fluctuations. In this paper we consider the number of tweets (tweet volume) of a party as a proxy of collective attention to the party, identify the dynamics of the volume, and show that this quantity has some information on the election outcome. We find that the distribution of the tweet volume for each party follows a log-normal distribution with a positive autocorrelation of the volume over short terms, which indicates the volume has large fluctuations of the log-normal distribution yet with a short-term tendency. Furthermore, by measuring the ratio of two consecutive daily tweet volumes, we find that the evolution of the daily volume of a party can be described by means of a geometric Brownian motion (i.e., the logarithm of the volume moves randomly with a trend). Finally, we determine the optimal period of averaging tweet volume for reducing fluctuations and extracting short-term tendencies. We conclude that the tweet volume is a good indicator of parties' success in the elections when considered over an optimal time window. Our study identifies the statistical nature of collective attention to political issues and sheds light on how to model the dynamics of collective attention in social media.

  5. Development and Two-Year Follow-Up Evaluation of a Training Workshop for the Large Preventive Positive Psychology Happy Family Kitchen Project in Hong Kong

    PubMed Central

    Lai, Agnes Y.; Mui, Moses W.; Wan, Alice; Stewart, Sunita M.; Yew, Carol; Lam, Tai-hing; Chan, Sophia S.

    2016-01-01

    Evidence-based practice and capacity-building approaches are essential for large-scale health promotion interventions. However, there are few models in the literature to guide and evaluate training of social service workers in community settings. This paper presents the development and evaluation of the “train-the-trainer” workshop (TTT) for the first large-scale, community-based family intervention projects, entitled “Happy Family Kitchen Project” (HFK) under the FAMILY project, a Hong Kong Jockey Club Initiative for a Harmonious Society. The workshop aimed to enhance social workers’ competence and performance in applying positive psychology constructs in their family interventions under HFK to improve the family well-being of the community they served. The two-day TTT was developed by a multidisciplinary team in partnership with community agencies and delivered to 50 social workers (64% women). It focused on the enhancement of knowledge, attitude, and practice of five specific positive psychology themes, which were the basis for the subsequent development of the 23 family interventions for 1419 participants. Acceptability and applicability were enhanced by completing a needs assessment prior to the training. The TTT was evaluated on trainees’ reactions to the training content and design, changes in learners (trainees) and benefits to the service organizations, through focus group interviews three months after training and questionnaire surveys at pre-training, immediately after training, and at six months, one year and two years after training. There were statistically significant increases, with large to moderate effect sizes, in perceived knowledge, self-efficacy and practice after training, which were sustained at the two-year follow-up. Furthermore, there were statistically significant improvements in family communication and well-being of the participants in the HFK interventions implemented after training. This paper offers a practical example of development, implementation and model-based evaluation of training programs, which may be helpful to others seeking to develop such programs in diverse communities. PMID:26808541

  6. Evaluation of a seismic quiescence pattern in southeastern sicily

    NASA Astrophysics Data System (ADS)

    Mulargia, F.; Broccio, F.; Achilli, V.; Baldi, P.

    1985-07-01

    Southeastern Sicily experienced very peculiar seismic activity in historic times, with a long series of ruinous earthquakes. The last large event, with magnitude probably in excess of 7.5, occurred on Jan. 11, 1693, totally destroying the city of Catania and killing 60,000 people. Only a few moderate events have been reported since then, and a seismic gap has been proposed on this basis. Close scrutiny of the available data further shows that all significant seismic activity ceased after 1850, suggesting one of the largest quiescence patterns ever encountered. This is examined together with the complex tectonic setting of the region, characterized by a wrenching mechanism with the most significant seismicity located in its northern graben structure. An attempt to ascertain the imminence and the size of a future earthquake through commonly accepted empirical relations based on the size and duration of the quiescence pattern did not provide any feasible result. A precision levelling survey which we recently completed yielded a relative subsidence of ~3 mm/yr, consistent with aseismic slip on the northern graben structure at a rate of ~15 mm/yr. Comparing these results with sedimentological and tidal data suggests that the area is undergoing an accelerated deformation process; this is further supported by Rikitake's ultimate strain statistics. Although the imminence of a damaging (M = 5.4) event is strongly favoured by Weibull statistics applied to the time series of occurrence of large events, the accumulated strain does not appear sufficient for a large earthquake (M ≳ 7.0). Within the limits of reliability of present semi-empirical approaches, we conclude that the available evidence is consistent with the occurrence of a moderate-to-large (M ≅ 6.0) event in the near future. Several questions regarding the application of simple models to real (and complex) tectonic settings nevertheless remain unanswered.

  7. Development and Two-Year Follow-Up Evaluation of a Training Workshop for the Large Preventive Positive Psychology Happy Family Kitchen Project in Hong Kong.

    PubMed

    Lai, Agnes Y; Mui, Moses W; Wan, Alice; Stewart, Sunita M; Yew, Carol; Lam, Tai-Hing; Chan, Sophia S

    2016-01-01

    Evidence-based practice and capacity-building approaches are essential for large-scale health promotion interventions. However, there are few models in the literature to guide and evaluate training of social service workers in community settings. This paper presents the development and evaluation of the "train-the-trainer" workshop (TTT) for the first large-scale, community-based family intervention projects, entitled "Happy Family Kitchen Project" (HFK) under the FAMILY project, a Hong Kong Jockey Club Initiative for a Harmonious Society. The workshop aimed to enhance social workers' competence and performance in applying positive psychology constructs in their family interventions under HFK to improve the family well-being of the community they served. The two-day TTT was developed by a multidisciplinary team in partnership with community agencies and delivered to 50 social workers (64% women). It focused on the enhancement of knowledge, attitude, and practice of five specific positive psychology themes, which were the basis for the subsequent development of the 23 family interventions for 1419 participants. Acceptability and applicability were enhanced by completing a needs assessment prior to the training. The TTT was evaluated on trainees' reactions to the training content and design, changes in learners (trainees) and benefits to the service organizations, through focus group interviews three months after training and questionnaire surveys at pre-training, immediately after training, and at six months, one year and two years after training. There were statistically significant increases, with large to moderate effect sizes, in perceived knowledge, self-efficacy and practice after training, which were sustained at the two-year follow-up. Furthermore, there were statistically significant improvements in family communication and well-being of the participants in the HFK interventions implemented after training. This paper offers a practical example of development, implementation and model-based evaluation of training programs, which may be helpful to others seeking to develop such programs in diverse communities.

  8. Dynamics of the wetland vegetation in large lakes of the Yangtze Plain in response to both fertilizer consumption and climatic changes

    NASA Astrophysics Data System (ADS)

    Hou, Xuejiao; Feng, Lian; Chen, Xiaoling; Zhang, Yunlin

    2018-07-01

    Using moderate-resolution imaging spectroradiometer (MODIS) data that cover the 15-year period from 2000 to 2014 and a phenology-based classification method, the long-term changes in the wetland vegetation of 25 large lakes on the Yangtze Plain were obtained. The classification method was developed based on the phenological information extracted from time series of MODIS observations, which demonstrated mean user's/producer's accuracies of 76.17% and 84.58%, respectively. The first comprehensive record of the spatial distribution and temporal dynamics of wetland vegetation in the large lakes on the Yangtze Plain was created. Of the 25 lakes examined, 17 showed a decreasing trend of vegetation area percentages (VAPs) during the study period, and 7 of these trends were statistically significant (p < 0.05). The same number of lakes was found to display decreasing trends in vegetation greenness over this 15-year period, and these decreasing trends were statistically significant (p < 0.05) for 11 of the lakes. Substantially fewer lakes showed increases in either their VAPs or their vegetation greenness values. Analysis using a multiple general linear model revealed that the amounts of chemical fertilizer used for farmlands surrounding the lakes, precipitation, daily sunshine hours, temperature and water turbidity played the most important roles in regulating the interannual changes in vegetation greenness in 40% (10/25), 12% (3/25), 4% (1/25), 20% (5/25) and 12% (3/25) of the lake wetlands, respectively. On average, the combined effects of these five driving factors explained 89.08 ± 7.89% of the variation in greenness over this 15-year period for the 25 lakes. This wetland vegetation environmental data record (EDR) for the large lakes of the Yangtze Plain will provide crucial baseline information for wetland environment conservation and restoration.

  9. Information Entropy Production of Maximum Entropy Markov Chains from Spike Trains

    NASA Astrophysics Data System (ADS)

    Cofré, Rodrigo; Maldonado, Cesar

    2018-01-01

    We consider the maximum entropy Markov chain inference approach to characterize the collective statistics of neuronal spike trains, focusing on the statistical properties of the inferred model. We review large deviations techniques useful in this context to describe properties of accuracy and convergence in terms of sampling size. We use these results to study the statistical fluctuation of correlations, distinguishability and irreversibility of maximum entropy Markov chains. We illustrate these applications using simple examples where the large deviation rate function is explicitly obtained for maximum entropy models of relevance in this field.
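
    The irreversibility of a stationary Markov chain mentioned above is usually quantified by the information entropy production rate, e_p = Σ_ij π_i P_ij log(π_i P_ij / (π_j P_ji)), which vanishes exactly when detailed balance holds. Below is a minimal sketch; the transition matrix is illustrative, and strictly positive entries are assumed so the logarithms are finite.

    ```python
    # Information entropy production rate of a stationary Markov chain.
    import numpy as np

    def entropy_production(P):
        """P: row-stochastic transition matrix with strictly positive entries."""
        # Stationary distribution: left eigenvector of P for eigenvalue 1.
        vals, vecs = np.linalg.eig(P.T)
        pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
        pi = pi / pi.sum()
        flux = pi[:, None] * P                  # flux[i, j] = pi_i * P_ij
        return np.sum(flux * np.log(flux / flux.T))

    P = np.array([[0.1, 0.6, 0.3],
                  [0.2, 0.3, 0.5],
                  [0.5, 0.3, 0.2]])
    print(entropy_production(P))   # > 0: this chain is irreversible
    ```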

  10. Trends in on-road vehicle emissions and ambient air quality in Atlanta, Georgia, USA, from the late 1990s through 2009.

    PubMed

    Vijayaraghavan, Krish; DenBleyker, Allison; Ma, Lan; Lindhjem, Chris; Yarwood, Greg

    2014-07-01

    On-road vehicle emissions of carbon monoxide (CO), nitrogen oxides (NO(x)), and volatile organic compounds (VOCs) during 1995-2009 in the Atlanta Metropolitan Statistical Area were estimated using the Motor Vehicle Emission Simulator (MOVES) model and data from the National Emissions Inventories and the State of Georgia. Statistically significant downward trends (computed using the nonparametric Theil-Sen method) in annual on-road CO, NO(x), and VOC emissions of 6.1%, 3.3%, and 6.0% per year, respectively, are noted during the 1995-2009 period despite an increase in total vehicle distance traveled. The CO and NO(x) emission trends are correlated with statistically significant downward trends in ambient air concentrations of CO and NO(x) in Atlanta ranging from 8.0% to 11.8% per year and from 5.8% to 8.7% per year, respectively, during similar time periods. Weather-adjusted summertime ozone concentrations in Atlanta exhibited a statistically significant declining trend of 2.3% per year during 2001-2009. Although this trend coexists with the declining trends in on-road NO(x), VOC, and CO emissions, identifying the cause of the downward trend in ozone is complicated by reductions in multiple precursors from different source sectors. Implications: Large reductions in on-road vehicle emissions of CO and NO(x) in Atlanta from the late 1990s to 2009, despite an increase in total vehicle distance traveled, contributed to a significant improvement in air quality through decreases in ambient air concentrations of CO and NO(x) during this time period. Emissions reductions in motor vehicles and other source sectors resulted in these improvements and the observed declining trend in ozone concentrations over the past decade. Although these historical trends cannot be extrapolated to the future because pollutant concentration contributions due to on-road vehicle emissions will likely become an increasingly smaller fraction of the atmospheric total, they provide an indication of the benefits of past control measures.
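
    The Theil-Sen estimator used for these trends is available in SciPy. The sketch below applies it to a simulated emissions series; the data are an assumption, standing in for the inventory values, which are not reproduced here.

    ```python
    # Theil-Sen slope with confidence interval for a simulated annual series.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    years = np.arange(1995, 2010)
    # Hypothetical emissions declining ~6%/yr with multiplicative noise.
    emissions = 100 * 0.94 ** (years - 1995) * rng.lognormal(0, 0.05, years.size)

    slope, intercept, lo, hi = stats.theilslopes(emissions, years, alpha=0.95)
    print(f"slope = {slope:.2f} per year, 95% CI = ({lo:.2f}, {hi:.2f})")
    ```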

  11. Prospectively measured triiodothyronine levels are positively associated with breast cancer risk in postmenopausal women

    PubMed Central

    2010-01-01

    Introduction The potential association between hypo- and hyperthyroid disorders and breast cancer has been investigated in a large number of studies during the last decades without conclusive results. This prospective cohort study investigated prediagnostic levels of thyrotropin (TSH) and triiodothyronine (T3) in relation to breast cancer incidence in pre- and postmenopausal women. Methods In the Malmö Preventive Project, 2,696 women had T3 and/or TSH levels measured at baseline. During a mean follow-up of 19.3 years, 173 incident breast cancer cases were retrieved using record linkage with The Swedish Cancer Registry. Quartile cut-points for T3 and TSH were based on the distribution among all women in the study cohort. A Cox's proportional hazards analysis was used to estimate relative risks (RR), with a confidence interval (CI) of 95%. Trends over quartiles of T3 and TSH were calculated considering a P-value < 0.05 as statistically significant. All analyses were repeated for pre- and peri/postmenopausal women separately. Results Overall there was a statistically significant association between T3 and breast cancer risk, the adjusted RR in the fourth quartile, as compared to the first, was 1.87 (1.12 to 3.14). In postmenopausal women the RRs for the second, third and fourth quartiles, as compared to the first, were 3.26 (0.96 to 11.1), 5.53 (1.65 to 18.6) and 6.87 (2.09 to 22.6), (P-trend: < 0.001). There were no such associations in pre-menopausal women, and no statistically significant interaction between T3 and menopausal status. Also, no statistically significant association was seen between serum TSH and breast cancer. Conclusions This is the first prospective study on T3 levels in relation to breast cancer risk. T3 levels in postmenopausal women were positively associated with the risk of breast cancer in a dose-response manner. PMID:20540734

  12. Observed and Projected Precipitation Changes over the Nine US Climate Regions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chylek, Petr; Dubey, Manvendra; Hengartner, Nicholas

    Here, we analyze the past (1900–2015) temperature and precipitation changes in nine separate US climate regions. We find that the temperature increased in a statistically significant (95% confidence level, equivalent to an alpha level of 0.05) manner in all of these regions. However, the variability in the observed precipitation was much more complex. In the eastern US (east of the Rocky Mountains), the precipitation increased in all five climate regions and the increase was statistically significant in three of them. In contrast, in the western US, the precipitation increased in two regions and decreased in two, with no statistical significance in any region. The CMIP5 climate models (an ensemble mean) were not able to capture properly either the large precipitation differences between the eastern and the western US, or the changes of precipitation between 1900 and 2015 in the eastern US. The statistical regression model explains the differences between the eastern and western US precipitation as the results of different significant predictors. The anthropogenic greenhouse gases and aerosol (GHGA) are the major forcing of the precipitation in the eastern part of the US, while the Pacific Decadal Oscillation (PDO) has the major influence on precipitation in the western part of the US. This analysis suggests that the precipitation over the eastern US increased at an approximate rate of 6.7%/K, in agreement with the Clausius-Clapeyron equation, while the precipitation of the western US was approximately constant, independent of the temperature. Future precipitation over the western part of the US will depend on the behavior of the PDO, and how it (the PDO) may be affected by future warming. Low hydrological sensitivity (percent increase of precipitation per one K of warming) projected by the CMIP5 models for the eastern US suggests either an underestimate of future precipitation or an overestimate of future warming.

  13. Analysis of spatial and temporal rainfall trends in Sicily during the 1921-2012 period

    NASA Astrophysics Data System (ADS)

    Liuzzo, Lorena; Bono, Enrico; Sammartano, Vincenzo; Freni, Gabriele

    2016-10-01

    Precipitation patterns worldwide are changing under the effects of global warming. The impacts of these changes could dramatically affect the hydrological cycle and, consequently, the availability of water resources. In order to improve the quality and reliability of forecasting models, it is important to analyse historical precipitation data to account for possible future changes. For these reasons, a large number of studies have recently been carried out with the aim of investigating the existence of statistically significant trends in precipitation at different spatial and temporal scales. In this paper, the existence of statistically significant trends in rainfall from observational datasets, which were measured by 245 rain gauges over Sicily (Italy) during the 1921-2012 period, was investigated. Annual, seasonal and monthly time series were examined using the Mann-Kendall non-parametric statistical test to detect statistically significant trends at local and regional scales, and their significance levels were assessed. Prior to the application of the Mann-Kendall test, the historical dataset was completed using a geostatistical spatial interpolation technique, the residual ordinary kriging, and then processed to remove the influence of serial correlation on the test results, applying the procedure of trend-free pre-whitening. Once the trends at each site were identified, the spatial patterns of the detected trends were examined using spatial interpolation techniques. Furthermore, focusing on the 30 years from 1981 to 2012, the trend analysis was repeated with the aim of detecting short-term trends or possible changes in the direction of the trends. Finally, the effect of climate change on the seasonal distribution of rainfall during the year was investigated by analysing the trend in the precipitation concentration index. The application of the Mann-Kendall test to the rainfall data provided evidence of a general decrease in precipitation in Sicily during the 1921-2012 period. Downward trends frequently occurred during the autumn and winter months. However, an increase in total annual precipitation was detected during the period from 1981 to 2012.
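
    A minimal sketch of the testing pipeline described above, the Mann-Kendall test after trend-free pre-whitening, is given below. The ties correction is omitted for brevity, and the input file is a hypothetical placeholder for a rain-gauge series.

    ```python
    # Mann-Kendall trend test with trend-free pre-whitening (Yue et al. style):
    # remove the Theil-Sen trend, whiten the lag-1 autocorrelation of the
    # residuals, restore the trend, then test. Ties correction omitted.
    import numpy as np
    from scipy import stats

    def mann_kendall(x):
        n = len(x)
        s = sum(np.sign(x[j] - x[i]) for i in range(n) for j in range(i + 1, n))
        var_s = n * (n - 1) * (2 * n + 5) / 18          # no-ties variance
        z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
        return z, 2 * stats.norm.sf(abs(z))

    def tfpw(y):
        t = np.arange(len(y))
        slope, *_ = stats.theilslopes(y, t)
        resid = y - slope * t
        r1 = np.corrcoef(resid[:-1], resid[1:])[0, 1]   # lag-1 autocorrelation
        whitened = resid[1:] - r1 * resid[:-1]
        return whitened + slope * t[1:]                 # restore the trend

    rainfall = np.loadtxt("annual_rainfall.txt")        # hypothetical series
    z, p = mann_kendall(tfpw(rainfall))
    print(f"MK z = {z:.2f}, p = {p:.4f}")
    ```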

  14. Observed and Projected Precipitation Changes over the Nine US Climate Regions

    DOE PAGES

    Chylek, Petr; Dubey, Manvendra; Hengartner, Nicholas; ...

    2017-10-25

    Here, we analyze the past (1900–2015) temperature and precipitation changes in nine separate US climate regions. We find that the temperature increased in a statistically significant (95% confidence level equivalent to alpha level of 0.05) manner in all of these regions. However, the variability in the observed precipitation was much more complex. In the eastern US (east of Rocky Mountains), the precipitation increased in all five climate regions and the increase was statistically significant in three of them. In contract, in the western US, the precipitation increased in two regions and decreased in two with no statistical significance in anymore » region. The CMIP5 climate models (an ensemble mean) were not able to capture properly either the large precipitation differences between the eastern and the western US, or the changes of precipitation between 1900 and 2015 in eastern US. The statistical regression model explains the differences between the eastern and western US precipitation as results of different significant predictors. The anthropogenic greenhouse gases and aerosol (GHGA) are the major forcing of the precipitation in the eastern part of US, while the Pacific Decadal Oscillation (PDO) has the major influence on precipitation in the western part of the US. This analysis suggests that the precipitation over the eastern US increased at an approximate rate of 6.7%/K, in agreement with the Clausius-Clapeyron equation, while the precipitation of the western US was approximately constant, independent of the temperature. Future precipitation over the western part of the US will depend on the behavior of the PDO, and how it (PDO) may be affected by future warming. Low hydrological sensitivity (percent increase of precipitation per one K of warming) projected by the CMIP5 models for the eastern US suggests either an underestimate of future precipitation or an overestimate of future warming.« less

  15. Statistical properties of filtered pseudorandom digital sequences formed from the sum of maximum-length sequences

    NASA Technical Reports Server (NTRS)

    Wallace, G. R.; Weathers, G. D.; Graf, E. R.

    1973-01-01

    The statistics of filtered pseudorandom digital sequences called hybrid-sum sequences, formed from the modulo-two sum of several maximum-length sequences, are analyzed. The results indicate that a relation exists between the statistics of the filtered sequence and the characteristic polynomials of the component maximum length sequences. An analysis procedure is developed for identifying a large group of sequences with good statistical properties for applications requiring the generation of analog pseudorandom noise. By use of the analysis approach, the filtering process is approximated by the convolution of the sequence with a sum of unit step functions. A parameter reflecting the overall statistical properties of filtered pseudorandom sequences is derived. This parameter is called the statistical quality factor. A computer algorithm to calculate the statistical quality factor for the filtered sequences is presented, and the results for two examples of sequence combinations are included. The analysis reveals that the statistics of the signals generated with the hybrid-sum generator are potentially superior to the statistics of signals generated with maximum-length generators. Furthermore, fewer calculations are required to evaluate the statistics of a large group of hybrid-sum generators than are required to evaluate the statistics of the same size group of approximately equivalent maximum-length sequences.
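
    The hybrid-sum construction is simple to reproduce: generate two maximum-length sequences with linear feedback shift registers, combine them modulo two, and low-pass filter the result. In the sketch below the tap sets correspond to standard primitive polynomials, but the single-pole filter is only an illustrative stand-in for the analog filtering analyzed in the paper.

    ```python
    # Hybrid-sum sequence sketch: XOR of two m-sequences, then filtering.
    import numpy as np
    from scipy.signal import lfilter

    def lfsr(taps, nbits, length):
        """+/-1 chips of a maximum-length sequence (Fibonacci LFSR)."""
        state = [1] * nbits
        out = []
        for _ in range(length):
            out.append(state[-1])
            fb = 0
            for t in taps:
                fb ^= state[t - 1]
            state = [fb] + state[:-1]
        return 2 * np.array(out) - 1

    n = 4095
    # In the +/-1 representation, modulo-two addition becomes a product.
    seq = lfsr((7, 6), 7, n) * lfsr((9, 5), 9, n)
    # Single-pole low-pass filter standing in for the analog filter.
    alpha = 0.1
    analog = lfilter([alpha], [1, -(1 - alpha)], seq.astype(float))
    print(f"mean {analog.mean():.4f}, std {analog.std():.4f}")
    ```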

  16. Peripheral vascular damage in systemic lupus erythematosus: data from LUMINA, a large multi-ethnic U.S. cohort (LXIX).

    PubMed

    Burgos, P I; Vilá, L M; Reveille, J D; Alarcón, G S

    2009-12-01

    To determine the factors associated with peripheral vascular damage in systemic lupus erythematosus patients and its impact on survival from Lupus in Minorities, Nature versus Nurture, a longitudinal US multi-ethnic cohort. Peripheral vascular damage was defined by the Systemic Lupus International Collaborating Clinics Damage Index (SDI). Factors associated with peripheral vascular damage were examined by univariable and multi-variable logistic regression models and its impact on survival by a Cox multi-variable regression. Thirty-four (5.3%) of 637 patients (90% women, mean [SD] age 36.5 [12.6] [16-87] years) developed peripheral vascular damage. Age and the SDI (without peripheral vascular damage) were statistically significant (odds ratio [OR] = 1.05, 95% confidence interval [CI] 1.01-1.08; P = 0.0107 and OR = 1.30, 95% CI 0.09-1.56; P = 0.0043, respectively) in multi-variable analyses. Azathioprine, warfarin and statins were also statistically significant, and glucocorticoid use was borderline statistically significant (OR = 1.03, 95% CI 0.10-1.06; P = 0.0975). In the survival analysis, peripheral vascular damage was independently associated with a diminished survival (hazard ratio = 2.36; 95% CI 1.07-5.19; P = 0.0334). In short, age was independently associated with peripheral vascular damage, but so was the presence of damage in other organs (ocular, neuropsychiatric, renal, cardiovascular, pulmonary, musculoskeletal and integument) and some medications (probably reflecting more severe disease). Peripheral vascular damage also negatively affected survival.

  17. Synchrotron radiation μCT and histology evaluation of bone-to-implant contact.

    PubMed

    Neldam, Camilla Albeck; Sporring, Jon; Rack, Alexander; Lauridsen, Torsten; Hauge, Ellen-Margrethe; Jørgensen, Henrik L; Jørgensen, Niklas Rye; Feidenhansl, Robert; Pinholt, Else Marie

    2017-09-01

    The purpose of this study was to evaluate bone-to-implant contact (BIC) in two-dimensional (2D) histology compared to high-resolution three-dimensional (3D) synchrotron radiation micro computed tomography (SR micro-CT). High spatial resolution, excellent signal-to-noise ratio, and contrast establish SR micro-CT as the leading imaging modality for hard X-ray microtomography. Using SR micro-CT at a voxel size of 5 μm in an experimental goat mandible model, no statistically significant difference was found between the different treatment modalities nor between recipient and reconstructed bone. The histological evaluation showed a statistically significant difference between BIC in reconstructed and recipient bone (p < 0.0001). Further, no statistically significant difference was found between the different treatment modalities, which we found was due to large variation and, consequently, low power. Comparing histology and SR micro-CT evaluation, a bias of 5.2% was found in the reconstructed area, and 15.3% in recipient bone. We conclude that SR micro-CT cannot be proven more precise than histology for evaluation of BIC; however, with this SR micro-CT method, one histologic bone section is comparable to the 3D evaluation. Further, the two methods complement each other with knowledge on BIC in 2D and 3D. Copyright © 2017 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  18. A Novel Candidate Molecule in Pathological Grading Of Gliomas: ELABELA.

    PubMed

    Artas, Gokhan; Ozturk, Sait; Kuloglu, Tuncay; Dagli, Adile Ferda; Gonen, Murat; Artas, Hakan; Aydin, Suleyman; Erol, Fatih Serhat

    2018-04-06

    This study aimed to investigate the possible role of ELABELA (ELA) in the histopathological grading of gliomas. We retrospectively assessed pathological specimens of patients who underwent surgery for intracranial space-occupying lesions. Only primary glioma specimens were included in this study. We enrolled 11 patients histologically diagnosed with low-grade glioma and 22 patients with high-grade glioma. The ELA antibody was applied to 4-6-µm-thick sections obtained from paraffin blocks. Histoscores were calculated using the distribution and intensity of staining immunoreactivity. An independent sample t-test was used for two-point inter-group assessments, whereas one-way analysis of variance was used for the other assessments. P < 0.05 was considered statistically significant. The histoscores of the control brain, low-grade glioma, and high-grade glioma tissues were found to be 0.08, 0.37, and 0.92, respectively. The difference in ELA immunoreactivity between the control brain tissue and glioma tissue was statistically significant (p < 0.05). In addition, a statistically significant increase was observed in ELA immunoreactivity in high-grade glioma tissues compared with that in low-grade glioma tissues (p < 0.05). ELA has an angiogenetic role in the progression of glial tumors. ELA, which is an endogenous ligand of the apelin receptor, activates the apelinergic system and causes the progression of glial tumors. Further studies with a large number of patients are necessary to investigate the angiogenetic role of ELA in glial tumors.

  19. Using non-specialist observers in 4AFC human observer studies

    NASA Astrophysics Data System (ADS)

    Elangovan, Premkumar; Mackenzie, Alistair; Dance, David R.; Young, Kenneth C.; Wells, Kevin

    2017-03-01

    Virtual clinical trials (VCTs) are an emergent approach for rapid evaluation and comparison of various breast imaging technologies and techniques using computer-based modeling tools. Increasingly, 4AFC (four-alternative forced choice) virtual clinical trials are used to compare detection performances of different breast imaging modalities. Most prior studies have used radiologists and physicists interchangeably. However, large-scale use of statistically significant 4AFC observer studies is challenged by the individual time commitment and cost of such observers, often drawn from a limited local pool of specialists. This work aims to investigate whether non-specialist observers can be used to supplement such studies. A team of five specialist observers (medical physicists) and five non-specialists participated in a 4AFC study containing simulated 2D-mammography and DBT (digital breast tomosynthesis) images, produced using the OPTIMAM toolbox for VCTs. The images contained 4mm irregular solid masses and 4mm spherical targets at a range of contrast levels embedded in a realistic breast phantom background. There was no statistically significant difference between the detection performance of medical physicists and non-specialists (p>0.05). However, non-specialists took longer to complete the study than their physicist counterparts, which was statistically significant (p<0.05). Overall, the results from both observer groups indicate that DBT has a lower detectable threshold contrast than 2D-mammography for both masses and spheres, and both groups found spheres easier to detect than irregular solid masses.

  20. Does high-flow nasal cannula oxygen improve outcome in acute hypoxemic respiratory failure? A systematic review and meta-analysis.

    PubMed

    Lin, Si-Ming; Liu, Kai-Xiong; Lin, Zhi-Hong; Lin, Pei-Hong

    2017-10-01

    To evaluate the efficacy of high-flow nasal cannula (HFNC) in the rate of intubation and mortality for patients with acute hypoxemic respiratory failure. We searched Pubmed, EMBASE, and the Cochrane Library for relevant studies. Two reviewers extracted data and reviewed the quality of the studies independently. The primary outcome was the rate of intubation; the secondary outcome was mortality in the hospital. Study-level data were pooled using a random-effects model when I2 was >50% or a fixed-effects model when I2 was <50%. Eight randomized controlled studies with a total of 1,818 patients were considered. Pooled analysis showed that no statistically significant difference was found between groups regarding the rate of intubation (odds ratio [OR] = 0.79; 95% confidence interval [CI]: 0.60-1.04; P = 0.09; I2 = 36%) and no statistically significant difference was found between groups regarding hospital mortality (OR = 0.89; 95% CI: 0.62-1.27; P = 0.51; I2 = 47%). The use of HFNC showed a trend toward reduction in the intubation rate, which did not meet statistical significance, in patients with acute respiratory failure compared with conventional oxygen therapy (COT) and noninvasive ventilation (NIV); moreover, there was no difference in mortality. Large, well-designed, randomized, multi-center trials are therefore needed to confirm the effects of HFNC in patients with acute hypoxemic respiratory failure. Copyright © 2017 Elsevier Ltd. All rights reserved.
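
    The pooling rule stated in this abstract (fixed-effects below I2 = 50%, random-effects above) can be sketched directly from the 2x2 tables. The trials below are hypothetical, and the random-effects variant shown is the common DerSimonian-Laird estimator, an assumption since the abstract does not name its estimator.

    ```python
    # Inverse-variance pooled log odds ratios with an I^2-based model choice.
    import numpy as np

    # Each row: events and total in the HFNC arm, events and total in control.
    trials = np.array([[30, 300, 40, 310],
                       [12, 100, 15, 105],
                       [22, 250, 20, 240]], dtype=float)

    a, n1, c, n2 = trials.T
    b, d = n1 - a, n2 - c
    log_or = np.log((a * d) / (b * c))
    var = 1 / a + 1 / b + 1 / c + 1 / d
    w = 1 / var

    # Heterogeneity: Cochran's Q and I^2.
    pooled_fixed = np.sum(w * log_or) / np.sum(w)
    q = np.sum(w * (log_or - pooled_fixed) ** 2)
    i2 = max(0.0, (q - (len(w) - 1)) / q) * 100 if q > 0 else 0.0
    if i2 < 50:                                   # fixed-effects model
        pooled, se = pooled_fixed, np.sqrt(1 / np.sum(w))
    else:                                         # DerSimonian-Laird
        tau2 = max(0.0, (q - (len(w) - 1))
                        / (np.sum(w) - np.sum(w**2) / np.sum(w)))
        w_r = 1 / (var + tau2)
        pooled, se = np.sum(w_r * log_or) / np.sum(w_r), np.sqrt(1 / np.sum(w_r))

    ci = np.exp(pooled + np.array([-1.96, 1.96]) * se)
    print(f"I^2 = {i2:.0f}%, pooled OR = {np.exp(pooled):.2f}, "
          f"95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
    ```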

  1. Muons and neutrinos

    NASA Technical Reports Server (NTRS)

    Stanev, T.

    1986-01-01

    The first generation of large and precise detectors, some initially dedicated to the search for nucleon decay, has accumulated significant statistics on neutrinos and high-energy muons. A second generation of even better and bigger detectors is already in operation or at an advanced stage of construction. The present set of experimental data on muon groups and neutrinos is qualitatively better than several years ago and the expectations for the following years are high. Composition studies with underground muon groups, neutrino detection, and expected extraterrestrial neutrino fluxes are discussed.

  2. No association of SORL1 SNPs with Alzheimer's disease.

    PubMed

    Minster, Ryan L; DeKosky, Steven T; Kamboh, M Ilyas

    2008-08-01

    SORL1 is an element of the amyloid precursor protein processing pathway and is therefore a good candidate for affecting Alzheimer's disease (AD) risk. Indeed, there have been reports of associations between variation in SORL1 and AD risk. We examined six statistically significant single-nucleotide polymorphisms from the initial observation in a large Caucasian American case-control cohort (1000 late-onset AD [LOAD] cases and 1000 older controls). Analysis of allele, genotype and haplotype frequencies revealed no association with LOAD risk in our cohort.

  3. Audiologic correlates of hearing handicap in the elderly.

    PubMed

    Weinstein, B E; Ventry, I M

    1983-03-01

    This investigation was conducted to determine the relationship between self-assessed hearing handicap and audiometric measures in a large sample of noninstitutionalized elderly individuals. Eighty subjects underwent a complete audiological evaluation and responded to the Hearing Measurement Scale (HMS). Each of the correlations between measures of sensitivity and the HMS score was statistically significant. The speech discrimination scores showed a somewhat lower correlation with the HMS score than did pure-tone measures. The implications of the above findings are discussed.

  4. Antibiotics for exacerbations of chronic obstructive pulmonary disease.

    PubMed

    Vollenweider, Daniela J; Jarrett, Harish; Steurer-Stey, Claudia A; Garcia-Aymerich, Judith; Puhan, Milo A

    2012-12-12

    Many patients with an exacerbation of chronic obstructive pulmonary disease (COPD) are treated with antibiotics. However, the value of antibiotics remains uncertain as systematic reviews and clinical trials have shown conflicting results. To assess the effects of antibiotics in the management of acute COPD exacerbations on treatment failure as observed between seven days and one month after treatment initiation (primary outcome) and on other patient-important outcomes (mortality, adverse events, length of hospital stay). We searched the Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE, EMBASE and other electronically available databases up to September 2012. Randomised controlled trials (RCTs) in people with acute COPD exacerbations comparing antibiotic therapy and placebo with a follow-up of at least seven days. Two review authors independently screened references and extracted data from trial reports. We kept the three groups of outpatients, inpatients and patients admitted to the intensive care unit (ICU) separate for benefit outcomes and mortality because we considered them to be clinically too different to be summarised in one group. We considered outpatients to have a mild to moderate exacerbation, inpatients to have a severe exacerbation and ICU patients to have a very severe exacerbation. Where outcomes or study details were not reported, we requested missing data from the authors of the primary studies. We calculated pooled risk ratios (RR) for treatment failure, Peto odds ratios (OR) for rare events (mortality and adverse events) and weighted mean differences (MD) for continuous outcomes using fixed-effect models. We used GRADE to assess the quality of the evidence. Sixteen trials with 2068 participants were included. In outpatients (mild to moderate exacerbations), there was low-quality evidence that antibiotics statistically significantly reduced the risk of treatment failure between seven days and one month after treatment initiation (RR 0.75; 95% CI 0.60 to 0.94; I(2) = 35%), but they did not significantly reduce the risk when the meta-analysis was restricted to currently available drugs (RR 0.80; 95% CI 0.63 to 1.01; I(2) = 33%). High-quality evidence showed that antibiotics statistically significantly reduced the risk of treatment failure in inpatients with severe exacerbations (ICU not included) (RR 0.77; 95% CI 0.65 to 0.91; I(2) = 47%), regardless of whether the analysis was restricted to current drugs. The only trial with 93 patients admitted to the ICU showed a large and statistically significant effect on treatment failure (RR 0.19; 95% CI 0.08 to 0.45; high-quality evidence). Low-quality evidence from four trials in inpatients showed no effect of antibiotics on mortality (Peto OR 1.02; 95% CI 0.37 to 2.79). High-quality evidence from one trial showed a statistically significant effect on mortality in ICU patients (Peto OR 0.21; 95% CI 0.06 to 0.72). Length of hospital stay (in days) was similar in the antibiotics and placebo groups except for the ICU study, where antibiotics statistically significantly reduced length of hospital stay (mean difference -9.60 days; 95% CI -12.84 to -6.36 days). One trial showed no effect of antibiotics on re-exacerbations between two and six weeks after treatment initiation.
Only one trial (N = 35) reported health-related quality of life but did not show a statistically significant difference between the treatment and control group. Evidence of moderate quality showed that the overall incidence of adverse events was higher in the antibiotics groups (Peto OR 1.53; 95% CI 1.03 to 2.27). Patients treated with antibiotics experienced statistically significantly more diarrhoea based on three trials (Peto OR 2.62; 95% CI 1.11 to 6.17; high-quality evidence). Antibiotics for COPD exacerbations showed large and consistent beneficial effects across outcomes of patients admitted to an ICU. However, for outpatients and inpatients the results were inconsistent. The risk for treatment failure was significantly reduced in both inpatients and outpatients when all trials (1957 to 2012) were included, but not when the analysis for outpatients was restricted to currently used antibiotics. Also, antibiotics had no statistically significant effect on mortality and length of hospital stay in inpatients, and almost no data on patient-reported outcomes exist. These inconsistent effects call for research into clinical signs and biomarkers that help identify patients who benefit from antibiotics and patients who experience no effect, and in whom downsides of antibiotics (side effects, costs and multi-resistance) could be avoided.
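
    The pooling described above can be sketched generically. The following Python fragment implements inverse-variance fixed-effect pooling of log risk ratios, with Cochran's Q and I² computed as in standard meta-analysis; the trial counts are made up for illustration, not the review's data:

      import numpy as np

      # Hypothetical per-trial 2x2 counts: (events_tx, n_tx, events_ctrl, n_ctrl)
      trials = [(12, 50, 18, 50), (8, 40, 14, 42), (20, 80, 26, 78)]

      log_rr, var = [], []
      for a, n1, c, n2 in trials:
          log_rr.append(np.log((a / n1) / (c / n2)))
          var.append(1/a - 1/n1 + 1/c - 1/n2)  # variance of the log risk ratio

      w = 1 / np.asarray(var)                    # inverse-variance weights
      pooled = np.sum(w * log_rr) / np.sum(w)    # fixed-effect pooled log RR
      se = np.sqrt(1 / np.sum(w))
      q = np.sum(w * (np.asarray(log_rr) - pooled) ** 2)  # Cochran's Q
      i2 = max(0.0, (q - (len(trials) - 1)) / q) * 100    # I² heterogeneity (%)

      lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
      print(f"RR = {np.exp(pooled):.2f} "
            f"(95% CI {np.exp(lo):.2f} to {np.exp(hi):.2f}), I² = {i2:.0f}%")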

  5. Combining censored and uncensored data in a U-statistic: design and sample size implications for cell therapy research.

    PubMed

    Moyé, Lemuel A; Lai, Dejian; Jing, Kaiyan; Baraniuk, Mary Sarah; Kwak, Minjung; Penn, Marc S; Wu, Colin O

    2011-01-01

    The assumptions that anchor large clinical trials are rooted in smaller, Phase II studies. In addition to specifying the target population, intervention delivery, and patient follow-up duration, physician-scientists who design these Phase II studies must select the appropriate response variables (endpoints). However, endpoint measures can be problematic. If the endpoint assesses the change in a continuous measure over time, then the occurrence of an intervening significant clinical event (SCE), such as death, can preclude the follow-up measurement. Finally, the ideal continuous endpoint measurement may be contraindicated in a fraction of the study patients, a change that requires a less precise substitution in this subset of participants. A score function that is based on the U-statistic can address these issues of 1) intercurrent SCEs and 2) response variable ascertainments that use different measurements of different precision. The scoring statistic is easy to apply, clinically relevant, and provides flexibility for the investigators' prospective design decisions. Sample size and power formulations for this statistic are provided as functions of clinical event rates and effect size estimates that are easy for investigators to identify and discuss. Examples are provided from current cardiovascular cell therapy research.
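
    The abstract does not give the exact scoring rule, so the sketch below assumes one common pairwise convention (compare on the significant clinical event first, then on the continuous change when both measurements exist), in the spirit of Finkelstein-Schoenfeld-type scores; all data and names are hypothetical:

      from itertools import product

      def pair_score(a, b):
          """Score +1 if subject a fares better than b, -1 if worse, 0 if tied.
          Each subject is (had_sce, change); change is None when the SCE
          precluded the follow-up measurement."""
          if a[0] != b[0]:                  # exactly one subject had the SCE
              return -1 if a[0] else 1
          if a[1] is None or b[1] is None:  # both had the SCE: no ordering
              return 0
          if a[1] == b[1]:
              return 0
          return 1 if a[1] > b[1] else -1

      # Hypothetical treated and control groups: (had_sce, endpoint change)
      treated = [(False, 5.0), (False, 2.5), (True, None)]
      control = [(False, 1.0), (True, None), (False, -0.5)]

      u = sum(pair_score(t, c) for t, c in product(treated, control))
      u /= len(treated) * len(control)
      print(f"U-statistic score: {u:.2f}")  # > 0 favours treatment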

  6. Statistics of Optical Coherence Tomography Data From Human Retina

    PubMed Central

    de Juan, Joaquín; Ferrone, Claudia; Giannini, Daniela; Huang, David; Koch, Giorgio; Russo, Valentina; Tan, Ou; Bruni, Carlo

    2010-01-01

    Optical coherence tomography (OCT) has recently become one of the primary methods for noninvasive probing of the human retina. The pseudoimage formed by OCT (the so-called B-scan) varies probabilistically across pixels due to complexities in the measurement technique. Hence, sensitive automatic procedures of diagnosis using OCT may exploit statistical analysis of the spatial distribution of reflectance. In this paper, we perform a statistical study of retinal OCT data. We find that the stretched exponential probability density function can model well the distribution of intensities in OCT pseudoimages. Moreover, we show a small but significant correlation between neighbor pixels when measuring OCT intensities with pixels of about 5 µm. We then develop a simple joint probability model for the OCT data consistent with known retinal features. This model fits well the stretched exponential distribution of intensities and their spatial correlation. In normal retinas, fit parameters of this model are relatively constant along retinal layers, but vary across layers. However, in retinas with diabetic retinopathy, large spikes of parameter modulation interrupt the constancy within layers, exactly where pathologies are visible. We argue that these results give hope for improvement in statistical pathology-detection methods even when the disease is in its early stages. PMID:20304733
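
    One common parameterization of the stretched exponential density on x >= 0 is f(x; lam, beta) = exp(-(x/lam)^beta) / (lam * Gamma(1 + 1/beta)). As a rough sketch of the kind of fit described, the following Python fragment estimates the two parameters by maximum likelihood on stand-in data (Weibull draws as a placeholder for real B-scan pixel intensities; the paper's exact fitting procedure may differ):

      import numpy as np
      from scipy.optimize import minimize
      from scipy.special import gammaln

      def neg_log_lik(params, x):
          # -log f(x) = (x/lam)^beta + log(lam) + log Gamma(1 + 1/beta), summed
          lam, beta = params
          if lam <= 0 or beta <= 0:
              return np.inf
          return np.sum((x / lam) ** beta) + len(x) * (np.log(lam) + gammaln(1 + 1 / beta))

      # Hypothetical OCT reflectance intensities (stand-in for real pixels).
      rng = np.random.default_rng(1)
      intensities = rng.weibull(0.7, size=5000) * 40.0

      fit = minimize(neg_log_lik, x0=[30.0, 1.0], args=(intensities,), method="Nelder-Mead")
      lam_hat, beta_hat = fit.x
      print(f"lambda = {lam_hat:.1f}, beta = {beta_hat:.2f}")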

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hagos, Samson M.; Feng, Zhe; Burleyson, Casey D.

    Regional cloud-permitting model simulations of cloud populations observed during the 2011 ARM Madden-Julian Oscillation Investigation Experiment/Dynamics of the Madden-Julian Oscillation (AMIE/DYNAMO) field campaign are evaluated against radar and ship-based measurements. The sensitivity of model-simulated surface rain rate statistics to parameters and parameterization of hydrometeor sizes in five commonly used WRF microphysics schemes is examined. It is shown that at 2 km grid spacing, the model generally overestimates rain rate from large and deep convective cores. Sensitivity runs involving variation of parameters that affect rain drop or ice particle size distribution (e.g., a more aggressive break-up process) generally reduce the bias in rain-rate and boundary layer temperature statistics as the smaller particles become more vulnerable to evaporation. Furthermore, significant improvement in the convective rain-rate statistics is observed when the horizontal grid spacing is reduced to 1 km and 0.5 km, while it is worsened when run at 4 km grid spacing, as increased turbulence enhances evaporation. The results suggest that modulation of evaporation processes, through parameterization of turbulent mixing and break-up of hydrometeors, may provide a potential avenue for correcting cloud statistics and associated boundary layer temperature biases in regional and global cloud-permitting model simulations.

  8. Midweek Intensification of Rain in the U.S.: Does Air Pollution Invigorate Storms?

    NASA Technical Reports Server (NTRS)

    Bell, T. L.; Rosenfeld, D.; Hahnenberger, M.

    2005-01-01

    The effect of pollution on rainfall has been observed to depend both on the type of pollution and the precipitating environment. The climatological consequences of pollution for rainfall are uncertain. In some urban areas, pollution varies with the day of the week because of weekly variations in human activity, in effect providing a repeated experiment on the effects of pollution. Weekly variations in temperature, pressure, cloud characteristics, hail and lightning are observed in many areas. Observing a weekly cycle in rainfall statistics has proven to be more difficult, although there is some evidence for it. Here we examine rainfall statistics from the Tropical Rainfall Measuring Mission (TRMM) satellite over the southern U.S. and adjacent waters, and find that there is a distinct, statistically significant weekly cycle in summertime rainfall over the southeast U.S., as well as weekly variations in rainfall over the nearby Atlantic and the Gulf of Mexico. Rainfall over land peaks in the middle of the week, suggesting that summer rainfall on large scales may increase as pollution levels rise. Both rain statistics over land and what appear to be compensating effects over adjacent seas support the suggestion that air pollution invigorates convection and outflow aloft.
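
    The paper's actual detection methodology is more involved, but the basic idea of testing for a day-of-week signal can be sketched as follows, grouping hypothetical daily rain totals by weekday and applying a Kruskal-Wallis test (all values are synthetic; a midweek peak is injected purely for demonstration):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)

      # Hypothetical daily summer rain totals (mm) with a mild midweek peak.
      days = np.arange(920)                  # roughly ten summers of days
      weekday = days % 7
      rain = rng.gamma(0.4, 6.0, size=days.size) * (1 + 0.15 * (weekday == 3))

      groups = [rain[weekday == d] for d in range(7)]
      h, p = stats.kruskal(*groups)          # nonparametric test across weekdays
      print(f"Kruskal-Wallis H = {h:.2f}, p = {p:.3f}")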

  9. Cortisol level and hemodynamic changes during tooth extraction at hypertensive and normotensive patients.

    PubMed

    Agani, Zana Bajrami; Benedetti, Alberto; Krasniqi, Vjosa Hamiti; Ahmedi, Jehona; Sejfija, Zana; Loxha, Mergime Prekazi; Murtezani, Arben; Rexhepi, Aida Namani; Ibraimi, Zana

    2015-04-01

    Patients who undergo oral-surgical interventions produce larger amounts of steroids than healthy patients not undergoing any dental intervention. The aim of this research was to determine the level of the stress hormone cortisol in serum, arterial blood pressure and arterial pulse, and to compare the effectiveness of lidocaine with adrenalin against lidocaine without adrenalin during tooth extraction. This clinical research included patients with an indication for tooth extraction, divided into hypertensive and normotensive patients. There was no statistically significant difference between groups in cortisol levels before, during and after tooth extraction regardless of the type of anesthetic used, while we registered higher systolic and diastolic values in hypertensive patients, regardless of the type of anesthetic. There was a significant rise in systolic and diastolic blood pressure in both hypertensive and normotensive patients who underwent tooth extraction, regardless of whether the anesthetic contained a vasoconstrictor. This applies especially to hypertensive patients, in whom these changes were more pronounced. For cortisol level and pulse rate, our results indicate no statistically significant difference between groups.

  10. Walking with a four wheeled walker (rollator) significantly reduces EMG lower-limb muscle activity in healthy subjects.

    PubMed

    Suica, Zorica; Romkes, Jacqueline; Tal, Amir; Maguire, Clare

    2016-01-01

    To investigate the immediate effect of four-wheeled-walker (rollator) walking on lower-limb muscle activity and trunk sway in healthy subjects. In this cross-sectional design, electromyographic (EMG) data was collected in six lower-limb muscle groups and trunk sway was measured as peak-to-peak angular displacement of the centre-of-mass (level L2/3) in the sagittal and frontal planes using the SwayStar balance system. Nineteen subjects walked at self-selected speed, first without a rollator and then, in randomised order, (1) with a rollator and (2) with a rollator with increased weight-bearing. Rollator walking caused statistically significant reductions in EMG activity in lower-limb muscle groups, and effect sizes were medium to large. Increased weight-bearing increased the effect. Trunk sway in the sagittal and frontal planes showed no statistically significant difference between conditions. Rollator walking reduces lower-limb muscle activity but trunk sway remains unchanged, as stability is likely gained through forces generated by the upper limbs. Short-term stability is gained but the long-term effect is unclear and requires investigation. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Cortisol Level and Hemodynamic Changes During Tooth Extraction at Hypertensive and Normotensive Patients

    PubMed Central

    Agani, Zana Bajrami; Benedetti, Alberto; Krasniqi, Vjosa Hamiti; Ahmedi, Jehona; Sejfija, Zana; Loxha, Mergime Prekazi; Murtezani, Arben; Rexhepi, Aida Namani; Ibraimi, Zana

    2015-01-01

    Background: Patients who undergo oral-surgical interventions produce larger amounts of steroids than healthy patients not undergoing any dental intervention. The aim of this research was to determine the level of the stress hormone cortisol in serum, arterial blood pressure and arterial pulse, and to compare the effectiveness of lidocaine with adrenalin against lidocaine without adrenalin during tooth extraction. Patients and methods: This clinical research included patients with an indication for tooth extraction, divided into hypertensive and normotensive patients. Results: There was no statistically significant difference between groups in cortisol levels before, during and after tooth extraction regardless of the type of anesthetic used, while we registered higher systolic and diastolic values in hypertensive patients, regardless of the type of anesthetic. Conclusion: There was a significant rise in systolic and diastolic blood pressure in both hypertensive and normotensive patients who underwent tooth extraction, regardless of whether the anesthetic contained a vasoconstrictor. This applies especially to hypertensive patients, in whom these changes were more pronounced. For cortisol level and pulse rate, our results indicate no statistically significant difference between groups. PMID:26005263

  12. Influence of skin peeling procedure in allergic contact dermatitis.

    PubMed

    Kim, Jung Eun; Park, Hyun Jeong; Cho, Baik Kee; Lee, Jun Young

    2008-03-01

    The prevalence of allergic contact dermatitis in patients who have previously undergone skin peeling has rarely been studied. We compared the frequency of positive patch test (PT) reactions in a patient group with a history of peeling to that of a control group with no history of peeling. The Korean standard series and cosmetic series were performed on a total of 262 patients: 62 patients had previously undergone peeling and 200 patients had not. The frequency of positive PT reactions on the Korean standard series was significantly higher in the peeling group than in the control group (P < 0.05, chi-square test). However, the most commonly identified allergens were mostly cosmetic-unrelated allergens. The frequency of positive PT reactions on the cosmetic series in the peeling group was higher than that of the control group, but lacked statistical significance. Likewise, the frequency (%) of positive PT reactions on the cosmetic series in the high-frequency peel group was higher than that of the low-frequency group, but lacked statistical significance. It appears peeling may not generally affect the development of contact sensitization. Further work is required, focusing on large-scale prospective studies that perform a PT before and after peeling.
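
    A minimal sketch of the chi-square comparison described above, on a hypothetical 2x2 table of positive/negative reactions (the counts are illustrative, not the study's):

      from scipy.stats import chi2_contingency

      # Hypothetical 2x2 table: positive / negative patch-test reactions
      # in the peeling group (n = 62) and the control group (n = 200).
      table = [[30, 32],    # peeling: positive, negative
               [60, 140]]   # control: positive, negative

      chi2, p, dof, expected = chi2_contingency(table)
      print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")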

  13. Circulating 25-Hydroxyvitamin D and Risk of Kidney Cancer

    PubMed Central

    Gallicchio, Lisa; Moore, Lee E.; Stevens, Victoria L.; Ahn, Jiyoung; Albanes, Demetrius; Hartmuller, Virginia; Setiawan, V. Wendy; Helzlsouer, Kathy J.; Yang, Gong; Xiang, Yong-Bing; Shu, Xiao-Ou; Snyder, Kirk; Weinstein, Stephanie J.; Yu, Kai; Zeleniuch-Jacquotte, Anne; Zheng, Wei; Cai, Qiuyin; Campbell, David S.; Chen, Yu; Chow, Wong-Ho; Horst, Ronald L.; Kolonel, Laurence N.; McCullough, Marjorie L.; Purdue, Mark P.; Koenig, Karen L.

    2010-01-01

    Although the kidney is a major organ for vitamin D metabolism, activity, and calcium-related homeostasis, little is known about whether this nutrient plays a role in the development or the inhibition of kidney cancer. To address this gap in knowledge, the authors examined the association between circulating 25-hydroxyvitamin D (25(OH)D) and kidney cancer within a large, nested case-control study developed as part of the Cohort Consortium Vitamin D Pooling Project of Rarer Cancers. Concentrations of 25(OH)D were measured from 775 kidney cancer cases and 775 age-, sex-, race-, and season-matched controls from 8 prospective cohort studies. Overall, neither low nor high concentrations of circulating 25(OH)D were significantly associated with kidney cancer risk. Although the data showed a statistically significant decreased risk for females (odds ratio = 0.31, 95% confidence interval: 0.12, 0.85) with 25(OH)D concentrations of ≥75 nmol/L, the linear trend was not statistically significant and the number of cases in this category was small (n = 14). The findings from this consortium-based study do not support the hypothesis that vitamin D is inversely associated with the risk of kidney cancer overall or with renal cell carcinoma specifically. PMID:20562187
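
    As a rough illustration of the odds-ratio estimate quoted above, the following sketch computes an unadjusted OR with a Woolf (logit) confidence interval from a hypothetical 2x2 table; the study itself used matched, adjusted analyses, so this is only the crude approximation:

      import numpy as np

      # Hypothetical counts for one 25(OH)D category vs. the reference category:
      # (cases_exposed, controls_exposed, cases_ref, controls_ref)
      a, b, c, d = 14, 40, 120, 110

      or_hat = (a * d) / (b * c)
      se = np.sqrt(1/a + 1/b + 1/c + 1/d)   # Woolf (logit) standard error
      lo, hi = np.exp(np.log(or_hat) + np.array([-1.96, 1.96]) * se)
      print(f"OR = {or_hat:.2f} (95% CI {lo:.2f}, {hi:.2f})")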

  14. The Relationship Between Posttraumatic Growth and Psychosocial Variables in Survivors of State Terrorism and Their Relatives.

    PubMed

    Cárdenas-Castro, Manuel; Faúndez-Abarca, Ximena; Arancibia-Martini, Héctor; Ceruti-Mahn, Cristián

    2017-08-01

    The present study explores reports of growth in survivors and family members of victims of state terrorism (N = 254) in Chile from 1973 to 1990. The results indicate the presence of reports of posttraumatic growth (M = 4.69) and a positive and statistically significant correlation with variables related to the life impact of the stressful events (r = .46), social sharing of emotions (r = .32), deliberate rumination (r = .37), positive reappraisal (r = .35), reconciliation (r = .39), spiritual practices (r = .33), and meaning in life (r = .51). The relationship between growth and forgiveness is not statistically significant. The variables that best predict posttraumatic growth are positive reappraisal (β = .28), life impact (β = .24), meaning in life (β = .23), and reconciliation (β = .20). The forward-method hierarchical model indicates that these variables are significant predictors of growth levels, R² = .53, F(8, 210) = 30.08, p < .001. The results indicate that a large proportion of the victims of state terrorism manage to grow after these experiences, and the redefinition of meaning in life and the positive reappraisal of the traumatic experiences are the elements that make it possible to create a new narrative about the past.
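
    A minimal sketch of the kind of multiple regression reported above, fitting OLS on synthetic standardized predictors; the predictor names and coefficients are borrowed from the abstract purely for illustration, and statsmodels is an assumed dependency:

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(3)
      n = 254

      # Hypothetical standardized predictors: positive reappraisal, life impact,
      # meaning in life, reconciliation (order mirrors the betas reported above).
      X = rng.normal(size=(n, 4))
      beta = np.array([0.28, 0.24, 0.23, 0.20])
      ptg = X @ beta + rng.normal(scale=0.7, size=n)   # posttraumatic growth score

      model = sm.OLS(ptg, sm.add_constant(X)).fit()
      print(f"R² = {model.rsquared:.2f}, F = {model.fvalue:.1f}, p = {model.f_pvalue:.3g}")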

  15. Effects of rearing temperature and density on growth, survival and development of sea cucumber larvae, Apostichopus japonicus (Selenka)

    NASA Astrophysics Data System (ADS)

    Liu, Guangbin; Yang, Hongsheng; Liu, Shilin

    2010-07-01

    Under laboratory conditions, the effects of rearing temperature and stocking density on the hatching of fertilized eggs and the growth of auricularia larvae of Apostichopus japonicus were examined. Data series such as larval length and density, metamorphic time, and survival rate of the larvae were recorded. Statistics showed that for A. japonicus, survival rate (from fertilized egg to late auricularia) decreased significantly with increasing rearing temperature (P<0.05). Differences in specific growth rate (SGR) among temperatures were also statistically significant from day 1 (P<0.05), and the maximal SGR was found on day 9 at 24°C (159.26±3.28). This study clearly indicated that at low temperature (<24°C), the metamorphic rate was remarkably higher than at higher temperature (>26°C). Hatching rate was significantly different between the 0.2-5 ind./ml groups and the 20-50 ind./ml groups. Larvae reared at higher density attained smaller maximal lengths and needed longer to complete metamorphosis. This study suggested that 21°C and 0.4 ind./ml can be used as the most suitable rearing temperature and stocking density for large-scale artificial breeding of A. japonicus larvae.
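
    The abstract does not define SGR; assuming the standard aquaculture formula SGR = 100 × (ln Lt − ln L0) / t (percent per day), a one-function sketch with hypothetical larval lengths:

      import math

      def specific_growth_rate(l0, lt, days):
          """Standard specific growth rate, % per day: 100 * (ln(Lt) - ln(L0)) / t."""
          return 100.0 * (math.log(lt) - math.log(l0)) / days

      # Hypothetical larval lengths (µm) between day 1 and day 9.
      print(f"SGR = {specific_growth_rate(350.0, 900.0, 8):.1f} %/day")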

  16. Weather and childbirth: a further search for relationships.

    PubMed

    Driscoll, D M

    1995-03-01

    Previous attempts to find relationships between weather and parturition (childbirth) and its onset (the beginning of labor pains) have revealed, firstly, limited but statistically significant relationships between weather conditions much colder than the day before, with high winds and low pressure, and increased onsets; and secondly, increased numbers of childbirths during periods of atmospheric pressure rise (highly statistically significant). To test these findings, this study examined weather data coincident with childbirth data from a hospital at Bryan-College Station, Texas (for a period of 30 cool months from 1987 to 1992). Tests for (1) days of cold fronts, (2) a day before and a day after the cold front, (3) days with large temperature increases, and (4) decreases from the day before revealed no relationship with mean daily rate of onset. Cold days with high winds and low pressure had significantly fewer onsets, a result that is the opposite of previous findings. The postulated relationship between periods of pressure rise and increased birth frequency was negative, i.e., significantly fewer births occurred at those times--again, the opposite of the apparent occurrence in an earlier study. The coincidence of diurnal variations in both atmospheric pressure and frequency of childbirths was shown to account for fairly strong negative associations between the two variables. (ABSTRACT TRUNCATED AT 250 WORDS)

  17. Systematic change in global patterns of streamflow following volcanic eruptions.

    PubMed

    Iles, Carley E; Hegerl, Gabriele C

    2015-11-01

    Following large explosive volcanic eruptions precipitation decreases over much of the globe1-6, particularly in climatologically wet regions4,5. Stratospheric volcanic aerosols reflect sunlight, which reduces evaporation, whilst surface cooling stabilises the atmosphere and reduces its water-holding capacity7. Circulation changes modulate this global precipitation reduction on regional scales1,8-10. Despite the importance of rivers to people, it has been unclear whether volcanism causes detectable changes in streamflow given large natural variability. Here we analyse observational records of streamflow volume for fifty large rivers from around the world which cover between two and six major volcanic eruptions in the 20th and late 19th centuries. We find statistically significant reductions in flow following eruptions for the Amazon, Congo, Nile, Orange, Ob, Yenisey and Kolyma amongst others. When data from neighbouring rivers are combined - based on the areas where climate models simulate either an increase or a decrease in precipitation following eruptions - a significant (p<0.1) decrease in streamflow following eruptions is detected in northern South American, central African and high-latitude Asian rivers, and on average across wet tropical and subtropical regions. We also detect a significant increase in southern South American and SW North American rivers. This suggests that future volcanic eruptions could substantially affect global water availability.

  18. Systematic change in global patterns of streamflow following volcanic eruptions

    PubMed Central

    Iles, Carley E.; Hegerl, Gabriele C.

    2016-01-01

    Following large explosive volcanic eruptions precipitation decreases over much of the globe1–6, particularly in climatologically wet regions4,5. Stratospheric volcanic aerosols reflect sunlight, which reduces evaporation, whilst surface cooling stabilises the atmosphere and reduces its water-holding capacity7. Circulation changes modulate this global precipitation reduction on regional scales1,8–10. Despite the importance of rivers to people, it has been unclear whether volcanism causes detectable changes in streamflow given large natural variability. Here we analyse observational records of streamflow volume for fifty large rivers from around the world which cover between two and six major volcanic eruptions in the 20th and late 19th centuries. We find statistically significant reductions in flow following eruptions for the Amazon, Congo, Nile, Orange, Ob, Yenisey and Kolyma amongst others. When data from neighbouring rivers are combined - based on the areas where climate models simulate either an increase or a decrease in precipitation following eruptions - a significant (p<0.1) decrease in streamflow following eruptions is detected in northern South American, central African and high-latitude Asian rivers, and on average across wet tropical and subtropical regions. We also detect a significant increase in southern South American and SW North American rivers. This suggests that future volcanic eruptions could substantially affect global water availability. PMID:27279897

  19. Methods for evaluating temporal groundwater quality data and results of decadal-scale changes in chloride, dissolved solids, and nitrate concentrations in groundwater in the United States, 1988-2010

    USGS Publications Warehouse

    Lindsey, Bruce D.; Rupert, Michael G.

    2012-01-01

    Decadal-scale changes in groundwater quality were evaluated by the U.S. Geological Survey National Water-Quality Assessment (NAWQA) Program. Samples of groundwater collected from wells during 1988-2000 - a first sampling event representing the decade ending the 20th century - were compared on a pair-wise basis to samples from the same wells collected during 2001-2010 - a second sampling event representing the decade beginning the 21st century. The data set consists of samples from 1,236 wells in 56 well networks, representing major aquifers and urban and agricultural land-use areas, with analytical results for chloride, dissolved solids, and nitrate. Statistical analysis was done on a network basis rather than by individual wells. Although spanning slightly more or less than a 10-year period, the two-sample comparison between the first and second sampling events is referred to as an analysis of decadal-scale change based on a step-trend analysis. The 22 principal aquifers represented by these 56 networks account for nearly 80 percent of the estimated withdrawals of groundwater used for drinking-water supply in the Nation. Well networks where decadal-scale changes in concentrations were statistically significant were identified using the Wilcoxon-Pratt signed-rank test. For the statistical analysis of chloride, dissolved solids, and nitrate concentrations at the network level, more than half revealed no statistically significant change over the decadal period. However, for networks that had statistically significant changes, increased concentrations outnumbered decreased concentrations by a large margin. Statistically significant increases of chloride concentrations were identified for 43 percent of 56 networks. Dissolved solids concentrations increased significantly in 41 percent of the 54 networks with dissolved solids data, and nitrate concentrations increased significantly in 23 percent of 56 networks. At least one of the three - chloride, dissolved solids, or nitrate - had a statistically significant increase in concentration in 66 percent of the networks. Statistically significant decreases in concentrations were identified in 4 percent of the networks for chloride, 2 percent of the networks for dissolved solids, and 9 percent of the networks for nitrate. A larger percentage of urban land-use networks had statistically significant increases in chloride, dissolved solids, and nitrate concentrations than agricultural land-use networks. In order to assess the magnitude of statistically significant changes, the median of the differences between constituent concentrations from the first full-network sampling event and those from the second full-network sampling event was calculated using the Turnbull method. The largest median decadal increases in chloride concentrations were in networks in the Upper Illinois River Basin (67 mg/L) and in the New England Coastal Basins (34 mg/L), whereas the largest median decadal decrease in chloride concentrations was in the Upper Snake River Basin (1 mg/L). The largest median decadal increases in dissolved solids concentrations were in networks in the Rio Grande Valley (260 mg/L) and the Upper Illinois River Basin (160 mg/L). The largest median decadal decrease in dissolved solids concentrations was in the Apalachicola-Chattahoochee-Flint River Basin (6.0 mg/L). 
The largest median decadal increases in nitrate as nitrogen (N) concentrations were in networks in the South Platte River Basin (2.0 mg/L as N) and the San Joaquin-Tulare Basins (1.0 mg/L as N). The largest median decadal decrease in nitrate concentrations was in the Santee River Basin and Coastal Drainages (0.63 mg/L). The magnitude of change in networks with statistically significant increases typically was much larger than the magnitude of change in networks with statistically significant decreases. The magnitude of change was greatest for chloride in the urban land-use networks and greatest for dissolved solids and nitrate in the agricultural land-use networks. Analysis of data from all networks combined indicated statistically significant increases for chloride, dissolved solids, and nitrate. Although chloride, dissolved solids, and nitrate concentrations were typically less than the drinking-water standards and guidelines, a statistical test was used to determine whether or not the proportion of samples exceeding the drinking-water standard or guideline changed significantly between the first and second full-network sampling events. The proportion of samples exceeding the U.S. Environmental Protection Agency (USEPA) Secondary Maximum Contaminant Level for dissolved solids (500 milligrams per liter) increased significantly between the first and second full-network sampling events when evaluating all networks combined at the national level. Also, for all networks combined, the proportion of samples exceeding the USEPA Maximum Contaminant Level (MCL) of 10 mg/L as N for nitrate increased significantly. One network in the Delmarva Peninsula had a significant increase in the proportion of samples exceeding the MCL for nitrate. A subset of 261 wells was sampled every other year (biennially) to evaluate decadal-scale changes using a time-series analysis. The analysis of the biennial data set showed that changes were generally similar to the findings from the analysis of decadal-scale change that was based on a step-trend analysis. Because of the small number of wells in a network with biennial data (typically 4-5 wells), the time-series analysis is more useful for understanding water-quality responses to changes in site-specific conditions rather than as an indicator of the change for the entire network.
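
    A minimal sketch of the network-level step-trend test described above, applying scipy's Wilcoxon signed-rank test with the Pratt treatment of zero differences (as in the Wilcoxon-Pratt test named in the abstract) to hypothetical paired decadal samples from one well network; the concentrations are synthetic, not NAWQA data:

      import numpy as np
      from scipy.stats import wilcoxon

      rng = np.random.default_rng(4)

      # Hypothetical chloride concentrations (mg/L) for one ~20-well network,
      # sampled once per decade from the same wells (paired by well).
      first = rng.lognormal(3.0, 0.5, size=20)
      second = first * rng.lognormal(0.15, 0.2, size=20)   # modest upward drift

      # zero_method="pratt" includes zero differences in the ranking.
      stat, p = wilcoxon(first, second, zero_method="pratt")
      print(f"W = {stat:.1f}, p = {p:.3f}")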

  20. Large-scale variation in subsurface stream biofilms: a cross-regional comparison of metabolic function and community similarity.

    PubMed

    Findlay, S; Sinsabaugh, R L

    2006-10-01

    We examined bacterial metabolic activity and community similarity in shallow subsurface stream sediments distributed across three regions of the eastern United States to assess whether there were parallel changes in functional and structural attributes at this large scale. Bacterial growth, oxygen consumption, and a suite of extracellular enzyme activities were assayed to describe functional variability. Community similarity was assessed using randomly amplified polymorphic DNA (RAPD) patterns. There were significant differences in streamwater chemistry, metabolic activity, and bacterial growth among regions with, for instance, twofold higher bacterial production in streams near Baltimore, MD, compared to Hubbard Brook, NH. Five of eight extracellular enzymes showed significant differences among regions. Cluster analyses of individual streams by metabolic variables showed clear groups with significant differences in representation of sites from different regions among groups. Clustering of sites based on randomly amplified polymorphic DNA banding resulted in groups with generally less internal similarity although there were still differences in distribution of regional sites. There was a marginally significant (p = 0.09) association between patterns based on functional and structural variables. There were statistically significant but weak (r² ≈ 30%) associations between landcover and measures of both structure and function. These patterns imply a large-scale organization of biofilm communities and this structure may be imposed by factor(s) such as landcover and covariates such as nutrient concentrations, which are known to also cause differences in macrobiota of stream ecosystems.
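
    A minimal sketch of the kind of cluster analysis described above, applying average-linkage hierarchical clustering to a hypothetical site-by-variable matrix of metabolic measurements (the dimensions and data are assumptions for illustration):

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.spatial.distance import pdist

      rng = np.random.default_rng(5)

      # Hypothetical site-by-variable matrix: 15 streams x 8 metabolic measures
      # (e.g., bacterial production, respiration, six enzyme activities), standardized.
      sites = rng.normal(size=(15, 8))

      z = linkage(pdist(sites, metric="euclidean"), method="average")
      groups = fcluster(z, t=3, criterion="maxclust")   # cut the tree into 3 groups
      print(groups)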
