Sample records for calculated statistical analyses

  1. Using R-Project for Free Statistical Analysis in Extension Research

    ERIC Educational Resources Information Center

    Mangiafico, Salvatore S.

    2013-01-01

    One option for Extension professionals wishing to use free statistical software is to use online calculators, which are useful for common, simple analyses. A second option is to use a free computing environment capable of performing statistical analyses, like R-project. R-project is free, cross-platform, powerful, and respected, but may be…

  2. Aircraft Maneuvers for the Evaluation of Flying Qualities and Agility. Volume 1. Maneuver Development Process and Initial Maneuver Set

    DTIC Science & Technology

    1993-08-01

    subtitled "Simulation Data," consists of detailed infonrnation on the design parmneter variations tested, subsequent statistical analyses conducted...used with confidence during the design process. The data quality can be examined in various forms such as statistical analyses of measure of merit data...merit, such as time to capture or nmaximurn pitch rate, can be calculated from the simulation time history data. Statistical techniques are then used

  3. Orphan therapies: making best use of postmarket data.

    PubMed

    Maro, Judith C; Brown, Jeffrey S; Dal Pan, Gerald J; Li, Lingling

    2014-08-01

    Postmarket surveillance of the comparative safety and efficacy of orphan therapeutics is challenging, particularly when multiple therapeutics are licensed for the same orphan indication. To make best use of product-specific registry data collected to fulfill regulatory requirements, we propose the creation of a distributed electronic health data network among registries. Such a network could support sequential statistical analyses designed to detect early warnings of excess risks. We use a simulated example to explore the circumstances under which a distributed network may prove advantageous. We perform sample size calculations for sequential and non-sequential statistical studies aimed at comparing the incidence of hepatotoxicity following initiation of two newly licensed therapies for homozygous familial hypercholesterolemia. We calculate the sample size savings ratio, or the proportion of sample size saved if one conducted a sequential study as compared to a non-sequential study. Then, using models to describe the adoption and utilization of these therapies, we simulate when these sample sizes are attainable in calendar years. We then calculate the analytic calendar time savings ratio, analogous to the sample size savings ratio. We repeat these analyses for numerous scenarios. Sequential analyses detect effect sizes earlier or at the same time as non-sequential analyses. The most substantial potential savings occur when the market share is more imbalanced (i.e., 90% for therapy A) and the effect size is closest to the null hypothesis. However, due to low exposure prevalence, these savings are difficult to realize within the 30-year time frame of this simulation for scenarios in which the outcome of interest occurs at or more frequently than one event/100 person-years. We illustrate a process to assess whether sequential statistical analyses of registry data performed via distributed networks may prove a worthwhile infrastructure investment for pharmacovigilance.

  4. Epidemiologic programs for computers and calculators. A microcomputer program for multiple logistic regression by unconditional and conditional maximum likelihood methods.

    PubMed

    Campos-Filho, N; Franco, E L

    1989-02-01

    A frequent procedure in matched case-control studies is to report results from the multivariate unmatched analyses if they do not differ substantially from the ones obtained after conditioning on the matching variables. Although conceptually simple, this rule requires that an extensive series of logistic regression models be evaluated by both the conditional and unconditional maximum likelihood methods. Most computer programs for logistic regression employ only one maximum likelihood method, which requires that the analyses be performed in separate steps. This paper describes a Pascal microcomputer (IBM PC) program that performs multiple logistic regression by both maximum likelihood estimation methods, which obviates the need for switching between programs to obtain relative risk estimates from both matched and unmatched analyses. The program calculates most standard statistics and allows factoring of categorical or continuous variables by two distinct methods of contrast. A built-in, descriptive statistics option allows the user to inspect the distribution of cases and controls across categories of any given variable.
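
    A minimal modern sketch of the comparison this record describes, fitting the same matched case-control data by unconditional and by conditional maximum likelihood, is given below. It uses Python's statsmodels rather than the Pascal program described above, and the matched-pair data, variable names, and effect sizes are all invented for illustration.

```python
# Minimal sketch (not the Pascal program described above): fit a 1:1 matched
# case-control dataset by unconditional and by conditional maximum likelihood
# and compare the resulting odds-ratio estimates. All data are simulated.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.discrete.conditional_models import ConditionalLogit

rng = np.random.default_rng(0)
n_pairs = 200
pair_id = np.repeat(np.arange(n_pairs), 2)             # matching stratum per subject
case = np.tile([1, 0], n_pairs)                         # one case and one control per pair
exposure = rng.binomial(1, np.where(case == 1, 0.45, 0.30))
df = pd.DataFrame({"case": case, "exposure": exposure, "pair_id": pair_id})

# Unconditional (unmatched) maximum likelihood
uncond = sm.Logit(df["case"], sm.add_constant(df[["exposure"]])).fit(disp=False)

# Conditional maximum likelihood, conditioning on the matching strata
cond = ConditionalLogit(df["case"], df[["exposure"]], groups=df["pair_id"]).fit()

print("Unconditional OR:", float(np.exp(uncond.params["exposure"])))
print("Conditional OR:  ", float(np.exp(np.asarray(cond.params))[0]))
```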

  5. [A Review on the Use of Effect Size in Nursing Research].

    PubMed

    Kang, Hyuncheol; Yeon, Kyupil; Han, Sang Tae

    2015-10-01

    The purpose of this study was to introduce the main concepts of statistical testing and effect size and to provide researchers in nursing science with guidance on how to calculate the effect size for the statistical analysis methods mainly used in nursing. For the t-test, analysis of variance, correlation analysis, and regression analysis, which are used frequently in nursing research, the generally accepted definitions of effect size were explained. Some formulae for calculating the effect size are described with several examples in nursing research. Furthermore, the authors present the required minimum sample size for each example utilizing G*Power 3 software, the most widely used program for calculating sample size. It is noted that statistical significance testing and effect size measurement serve different purposes, and reliance on only one of them may be misleading. Some practical guidelines are recommended for combining statistical significance testing and effect size measures in order to make more balanced decisions in quantitative analyses.
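
    As a concrete illustration of the kind of calculation the review covers, the sketch below computes Cohen's d for a two-group comparison and then the per-group sample size needed for 80% power at α = .05. It uses statsmodels' power routines rather than G*Power 3, and the example measurements are invented.

```python
# Hedged sketch: Cohen's d from two invented groups, then the per-group sample
# size for 80% power at alpha = .05 (statsmodels is used here instead of G*Power).
import numpy as np
from statsmodels.stats.power import TTestIndPower

group_a = np.array([72, 75, 70, 78, 74, 69, 73, 77], dtype=float)
group_b = np.array([71, 69, 74, 68, 73, 70, 72, 75], dtype=float)

# Cohen's d with a pooled standard deviation
n1, n2 = len(group_a), len(group_b)
pooled_var = ((n1 - 1) * group_a.var(ddof=1) + (n2 - 1) * group_b.var(ddof=1)) / (n1 + n2 - 2)
d = (group_a.mean() - group_b.mean()) / np.sqrt(pooled_var)

n_per_group = TTestIndPower().solve_power(effect_size=d, alpha=0.05, power=0.80,
                                          alternative="two-sided")
print(f"Cohen's d = {d:.2f}, required n per group = {int(np.ceil(n_per_group))}")
```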

  6. A General Framework for Power Analysis to Detect the Moderator Effects in Two- and Three-Level Cluster Randomized Trials

    ERIC Educational Resources Information Center

    Dong, Nianbo; Spybrook, Jessaca; Kelcey, Ben

    2016-01-01

    The purpose of this study is to propose a general framework for power analyses to detect the moderator effects in two- and three-level cluster randomized trials (CRTs). The study specifically aims to: (1) develop the statistical formulations for calculating statistical power, minimum detectable effect size (MDES) and its confidence interval to…

  7. Statistical analysis of QC data and estimation of fuel rod behaviour

    NASA Astrophysics Data System (ADS)

    Heins, L.; Groß, H.; Nissen, K.; Wunderlich, F.

    1991-02-01

    The behaviour of fuel rods while in the reactor is influenced by many parameters. As far as fabrication is concerned, fuel pellet diameter and density, and inner cladding diameter are important examples. Statistical analyses of quality control data show a scatter of these parameters within the specified tolerances. At present it is common practice to use a combination of superimposed unfavorable tolerance limits (worst case dataset) in fuel rod design calculations. Distributions are not considered. The results obtained in this way are very conservative but the degree of conservatism is difficult to quantify. Probabilistic calculations based on distributions allow the replacement of the worst case dataset by a dataset leading to results with known, defined conservatism. This is achieved by response surface methods and Monte Carlo calculations on the basis of statistical distributions of the important input parameters. The procedure is illustrated by means of two examples.
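
    The contrast between stacked worst-case tolerance limits and probabilistic propagation of fabrication-parameter distributions can be shown with a toy Monte Carlo run. The response function, nominal values, and tolerances below are invented for illustration and do not correspond to any real fuel rod design calculation.

```python
# Toy illustration only: propagate assumed fabrication-parameter distributions
# through a made-up response function and compare the Monte Carlo 95th percentile
# with the value obtained by stacking worst-case tolerance limits (taken as 3 sigma).
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

pellet_dia = rng.normal(8.19, 0.005, N)   # assumed pellet diameter [mm]
density    = rng.normal(95.0, 0.5, N)     # assumed pellet density [% TD]
clad_id    = rng.normal(8.36, 0.01, N)    # assumed inner cladding diameter [mm]

def response(dia, rho, cid):
    """Invented monotone 'figure of merit' standing in for a design output."""
    gap = cid - dia
    return 100.0 * rho / 95.0 - 500.0 * gap

mc_results = response(pellet_dia, density, clad_id)
worst_case = response(8.19 + 0.015, 95.0 + 1.5, 8.36 - 0.03)   # superimposed 3-sigma limits

print(f"Monte Carlo 95th percentile: {np.percentile(mc_results, 95):.2f}")
print(f"Worst-case (stacked limits): {worst_case:.2f}")
```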

  8. Small studies may overestimate the effect sizes in critical care meta-analyses: a meta-epidemiological study

    PubMed Central

    2013-01-01

    Introduction Small-study effects refer to the fact that trials with limited sample sizes are more likely to report larger beneficial effects than large trials. However, this has never been investigated in critical care medicine. Thus, the present study aimed to examine the presence and extent of small-study effects in critical care medicine. Methods Critical care meta-analyses involving randomized controlled trials and reported mortality as an outcome measure were considered eligible for the study. Component trials were classified as large (≥100 patients per arm) and small (<100 patients per arm) according to their sample sizes. Ratio of odds ratios (ROR) was calculated for each meta-analysis and then RORs were combined using a meta-analytic approach. ROR < 1 indicated a larger beneficial effect in small trials. Small and large trials were compared in methodological qualities including sequence generating, blinding, allocation concealment, intention to treat and sample size calculation. Results A total of 27 critical care meta-analyses involving 317 trials were included. Of them, five meta-analyses showed statistically significant RORs < 1, and the other meta-analyses did not reach statistical significance. Overall, the pooled ROR was 0.60 (95% CI: 0.53 to 0.68); the heterogeneity was moderate with an I² of 50.3% (chi-squared = 52.30; P = 0.002). Large trials showed significantly better reporting quality than small trials in terms of sequence generating, allocation concealment, blinding, intention to treat, sample size calculation and incomplete follow-up data. Conclusions Small trials are more likely to report larger beneficial effects than large trials in critical care medicine, which could be partly explained by the lower methodological quality in small trials. Caution should be practiced in the interpretation of meta-analyses involving small trials. PMID:23302257
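
    A minimal sketch of the ratio-of-odds-ratios calculation follows: for each meta-analysis, the log OR pooled from small trials minus the log OR pooled from large trials, combined here with simple fixed-effect inverse-variance weights. The three example meta-analyses are invented, and the pooling is deliberately simpler than the random-effects approach used in the study.

```python
# Hedged sketch of the ratio-of-odds-ratios (ROR) idea on invented numbers:
# log ROR = log OR_small - log OR_large, pooled with inverse-variance weights.
import numpy as np

# columns: log OR (small trials), its variance, log OR (large trials), its variance
meta = np.array([
    [-0.60, 0.04, -0.10, 0.01],
    [-0.45, 0.05, -0.20, 0.02],
    [-0.80, 0.09, -0.30, 0.03],
])

log_ror = meta[:, 0] - meta[:, 2]     # difference of log odds ratios
var_ror = meta[:, 1] + meta[:, 3]     # variances add on the log scale
w = 1.0 / var_ror                     # fixed-effect inverse-variance weights

pooled = np.sum(w * log_ror) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
lo, hi = np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se)
print(f"Pooled ROR = {np.exp(pooled):.2f} (95% CI {lo:.2f} to {hi:.2f})")
```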

  9. Implementation and validation of fully relativistic GW calculations: Spin–orbit coupling in molecules, nanocrystals, and solids

    DOE PAGES

    Scherpelz, Peter; Govoni, Marco; Hamada, Ikutaro; ...

    2016-06-22

    We present an implementation of G0W0 calculations including spin–orbit coupling (SOC) enabling investigations of large systems, with thousands of electrons, and we discuss results for molecules, solids, and nanocrystals. Using a newly developed set of molecules with heavy elements (called GW-SOC81), we find that, when based upon hybrid density functional calculations, fully relativistic (FR) and scalar-relativistic (SR) G0W0 calculations of vertical ionization potentials both yield excellent performance compared to experiment, with errors below 1.9%. We demonstrate that while SR calculations have higher random errors, FR calculations systematically underestimate the VIP by 0.1 to 0.2 eV. We further verify that SOC effects may be well approximated at the FR density functional level and then added to SR G0W0 results for a broad class of systems. We also address the use of different root-finding algorithms for the G0W0 quasiparticle equation and the significant influence of including d electrons in the valence partition of the pseudopotential for G0W0 calculations. Lastly, we present statistical analyses of our data, highlighting the importance of separating definitive improvements from those that may occur by chance due to a limited number of samples. We suggest the statistical analyses used here will be useful in the assessment of the accuracy of a large variety of electronic structure methods.

  10. Implementation and validation of fully relativistic GW calculations: Spin–orbit coupling in molecules, nanocrystals, and solids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scherpelz, Peter; Govoni, Marco; Hamada, Ikutaro

    We present an implementation of G0W0 calculations including spin–orbit coupling (SOC) enabling investigations of large systems, with thousands of electrons, and we discuss results for molecules, solids, and nanocrystals. Using a newly developed set of molecules with heavy elements (called GW-SOC81), we find that, when based upon hybrid density functional calculations, fully relativistic (FR) and scalar-relativistic (SR) G0W0 calculations of vertical ionization potentials both yield excellent performance compared to experiment, with errors below 1.9%. We demonstrate that while SR calculations have higher random errors, FR calculations systematically underestimate the VIP by 0.1 to 0.2 eV. We further verify that SOC effects may be well approximated at the FR density functional level and then added to SR G0W0 results for a broad class of systems. We also address the use of different root-finding algorithms for the G0W0 quasiparticle equation and the significant influence of including d electrons in the valence partition of the pseudopotential for G0W0 calculations. Lastly, we present statistical analyses of our data, highlighting the importance of separating definitive improvements from those that may occur by chance due to a limited number of samples. We suggest the statistical analyses used here will be useful in the assessment of the accuracy of a large variety of electronic structure methods.

  11. SAS Code for Calculating Intraclass Correlation Coefficients and Effect Size Benchmarks for Site-Randomized Education Experiments

    ERIC Educational Resources Information Center

    Brandon, Paul R.; Harrison, George M.; Lawton, Brian E.

    2013-01-01

    When evaluators plan site-randomized experiments, they must conduct the appropriate statistical power analyses. These analyses are most likely to be valid when they are based on data from the jurisdictions in which the studies are to be conducted. In this method note, we provide software code, in the form of a SAS macro, for producing statistical…

  12. Periodicity of microfilariae of human filariasis analysed by a trigonometric method (Aikat and Das).

    PubMed

    Tanaka, H

    1981-04-01

    The microfilarial periodicity of human filariae was characterized statistically by fitting the observed change of microfilaria (mf) counts to the formula of a simple harmonic wave using two parameters, the peak hour (K) and periodicity index (D) (Sasa & Tanaka, 1972, 1974). Later Aikat and Das (1976) proposed a simple calculation method using trigonometry (A-D method) to determine the peak hour (K) and periodicity index (P). All data of microfilarial periodicity analysed previously by the method of Sasa and Tanaka (S-T method) were calculated again by the A-D method in the present study to evaluate the latter method. The results of calculations showed that P was not proportional to D and the ratios of P/D were mostly smaller than expected, especially when P or D was small in less periodic forms. The peak hour calculated by the A-D method did not differ much from that calculated by the S-T method. Goodness of fit was improved slightly by the A-D method in two-thirds of the analysed data. The classification of human filariae in respect of the type of periodicity was, however, changed little by the results calculated by the A-D method.
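
    The underlying idea of both methods, fitting a simple 24-hour harmonic to hourly mf counts and reading off a peak hour and an amplitude-based periodicity index, can be sketched with an ordinary least-squares fit. The snippet below is a generic harmonic (cosinor-style) fit on invented counts; it reproduces neither the exact Sasa-Tanaka nor the Aikat-Das formulae.

```python
# Generic least-squares fit of a 24-hour harmonic to invented hourly microfilaria
# counts, illustrating the estimation of a peak hour and a periodicity index.
# This is not the exact S-T or A-D calculation described in the abstract.
import numpy as np

hours = np.arange(0, 24, 2, dtype=float)
counts = np.array([5, 3, 2, 4, 10, 25, 48, 60, 55, 35, 18, 8], dtype=float)

# Model: count ~ M + A*cos(2*pi*t/24) + B*sin(2*pi*t/24), fitted by linear least squares
X = np.column_stack([np.ones_like(hours),
                     np.cos(2 * np.pi * hours / 24),
                     np.sin(2 * np.pi * hours / 24)])
M, A, B = np.linalg.lstsq(X, counts, rcond=None)[0]

amplitude = np.hypot(A, B)
peak_hour = (np.arctan2(B, A) % (2 * np.pi)) / (2 * np.pi) * 24   # hour of the fitted peak
periodicity_index = 100 * amplitude / M                           # amplitude as % of the mean

print(f"Peak hour = {peak_hour:.1f} h, periodicity index = {periodicity_index:.0f}")
```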

  13. Recent evaluations of crack-opening-area in circumferentially cracked pipes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rahman, S.; Brust, F.; Ghadiali, N.

    1997-04-01

    Leak-before-break (LBB) analyses for circumferentially cracked pipes are currently being conducted in the nuclear industry to justify elimination of pipe whip restraints and jet shields which are present because of the expected dynamic effects from pipe rupture. The application of the LBB methodology frequently requires calculation of leak rates. The leak rates depend on the crack-opening area of the through-wall crack in the pipe. In addition to LBB analyses which assume a hypothetical flaw size, there is also interest in the integrity of actual leaking cracks corresponding to current leakage detection requirements in NRC Regulatory Guide 1.45, or for assessing temporary repair of Class 2 and 3 pipes that have leaks, as are being evaluated in ASME Section XI. The objectives of this study were to review, evaluate, and refine current predictive models for performing crack-opening-area analyses of circumferentially cracked pipes. The results from twenty-five full-scale pipe fracture experiments, conducted in the Degraded Piping Program, the International Piping Integrity Research Group Program, and the Short Cracks in Piping and Piping Welds Program, were used to verify the analytical models. Standard statistical analyses were performed to assess quantitatively the accuracy of the predictive models. The evaluation also involved finite element analyses for determining the crack-opening profile often needed to perform leak-rate calculations.

  14. Statistical parameters of random heterogeneity estimated by analysing coda waves based on finite difference method

    NASA Astrophysics Data System (ADS)

    Emoto, K.; Saito, T.; Shiomi, K.

    2017-12-01

    Short-period (<1 s) seismograms are strongly affected by small-scale (<10 km) heterogeneities in the lithosphere. In general, short-period seismograms are analysed based on the statistical method by considering the interaction between seismic waves and randomly distributed small-scale heterogeneities. Statistical properties of the random heterogeneities have been estimated by analysing short-period seismograms. However, generally, the small-scale random heterogeneity is not taken into account for the modelling of long-period (>2 s) seismograms. We found that the energy of the coda of long-period seismograms shows a spatially flat distribution. This phenomenon is well known in short-period seismograms and results from the scattering by small-scale heterogeneities. We estimate the statistical parameters that characterize the small-scale random heterogeneity by modelling the spatiotemporal energy distribution of long-period seismograms. We analyse three moderate-size earthquakes that occurred in southwest Japan. We calculate the spatial distribution of the energy density recorded by a dense seismograph network in Japan at the period bands of 8-16 s, 4-8 s and 2-4 s and model them by using 3-D finite difference (FD) simulations. Compared to conventional methods based on statistical theories, we can calculate more realistic synthetics by using the FD simulation. It is not necessary to assume a uniform background velocity, body or surface waves and scattering properties considered in general scattering theories. By taking the ratio of the energy of the coda area to that of the entire area, we can separately estimate the scattering and the intrinsic absorption effects. Our result reveals the spectrum of the random inhomogeneity in a wide wavenumber range including the intensity around the corner wavenumber as P(m) = 8πε²a³/(1 + a²m²)², where ε = 0.05 and a = 3.1 km, even though past studies analysing higher-frequency records could not detect the corner. Finally, we estimate the intrinsic attenuation by modelling the decay rate of the energy. The method proposed in this study is suitable for quantifying the statistical properties of long-wavelength subsurface random inhomogeneity, which leads the way to characterizing a wider wavenumber range of spectra, including the corner wavenumber.

  15. The Data from Aeromechanics Test and Analytics -- Management and Analysis Package (DATAMAP). Volume I. User’s Manual.

    DTIC Science & Technology

    1980-12-01

    to sound pressure level in decibels assuming a frequency of 1000 Hz. The perceived noisiness values are derived from a formula specified in...Perceived Noise Level Analysis...Acoustic Weighting Networks...band analyses (octave, third-octave, perceived noise level) alongside basic statistical analyses (mean, variance, standard deviation calculation)

  16. SOCR Analyses - an Instructional Java Web-based Statistical Analysis Toolkit.

    PubMed

    Chu, Annie; Cui, Jenny; Dinov, Ivo D

    2009-03-01

    The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses like linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as t-test in the parametric category; and Wilcoxon rank sum test, Kruskal-Wallis test, Friedman's test, in the non-parametric category. SOCR Analyses also include several hypothesis test models, such as Contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with API (Application Programming Interface) have been implemented in statistical summary, least square solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for most updated information and newly added models.

  17. An operational definition of a statistically meaningful trend.

    PubMed

    Bryhn, Andreas C; Dimberg, Peter H

    2011-04-28

    Linear trend analysis of time series is standard procedure in many scientific disciplines. If the number of data is large, a trend may be statistically significant even if data are scattered far from the trend line. This study introduces and tests a quality criterion for time trends referred to as statistical meaningfulness, which is a stricter quality criterion for trends than high statistical significance. The time series is divided into intervals and interval mean values are calculated. Thereafter, r² and p values are calculated from regressions concerning time and interval mean values. If r² ≥ 0.65 at p ≤ 0.05 in any of these regressions, then the trend is regarded as statistically meaningful. Out of ten investigated time series from different scientific disciplines, five displayed statistically meaningful trends. A Microsoft Excel application (add-in) was developed which can perform statistical meaningfulness tests and which may increase the operationality of the test. The presented method for distinguishing statistically meaningful trends should be reasonably uncomplicated for researchers with basic statistics skills and may thus be useful for determining which trends are worth analysing further, for instance with respect to causal factors. The method can also be used for determining which segments of a time trend may be particularly worthwhile to focus on.
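
    The criterion is concrete enough to sketch directly: split the series into intervals, regress the interval means on their mid-times, and call the trend statistically meaningful if any such regression reaches r² ≥ 0.65 with p ≤ 0.05. The snippet below is a rough Python rendering of that rule (the published tool is a Microsoft Excel add-in); the set of interval counts tried and the test data are invented.

```python
# Rough sketch of the "statistically meaningful trend" rule described above:
# regress interval means on interval mid-times and require r^2 >= 0.65 at p <= 0.05.
# The interval counts tried and the example series are invented.
import numpy as np
from scipy import stats

def statistically_meaningful(t, y, n_intervals=(3, 4, 5, 6)):
    for k in n_intervals:
        edges = np.linspace(t.min(), t.max(), k + 1)
        mids, means = [], []
        for lo, hi in zip(edges[:-1], edges[1:]):
            mask = (t >= lo) & (t <= hi)
            if mask.any():
                mids.append(t[mask].mean())
                means.append(y[mask].mean())
        res = stats.linregress(mids, means)
        if res.rvalue ** 2 >= 0.65 and res.pvalue <= 0.05:
            return True, k, res.rvalue ** 2, res.pvalue
    return False, None, None, None

rng = np.random.default_rng(1)
t = np.arange(1990, 2020, dtype=float)
y = 0.05 * (t - 1990) + rng.normal(0, 0.4, t.size)    # weak trend buried in noise
print(statistically_meaningful(t, y))
```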

  18. Weighted Statistical Binning: Enabling Statistically Consistent Genome-Scale Phylogenetic Analyses

    PubMed Central

    Bayzid, Md Shamsuzzoha; Mirarab, Siavash; Boussau, Bastien; Warnow, Tandy

    2015-01-01

    Because biological processes can result in different loci having different evolutionary histories, species tree estimation requires multiple loci from across multiple genomes. While many processes can result in discord between gene trees and species trees, incomplete lineage sorting (ILS), modeled by the multi-species coalescent, is considered to be a dominant cause for gene tree heterogeneity. Coalescent-based methods have been developed to estimate species trees, many of which operate by combining estimated gene trees, and so are called "summary methods". Because summary methods are generally fast (and much faster than more complicated coalescent-based methods that co-estimate gene trees and species trees), they have become very popular techniques for estimating species trees from multiple loci. However, recent studies have established that summary methods can have reduced accuracy in the presence of gene tree estimation error, and also that many biological datasets have substantial gene tree estimation error, so that summary methods may not be highly accurate in biologically realistic conditions. Mirarab et al. (Science 2014) presented the "statistical binning" technique to improve gene tree estimation in multi-locus analyses, and showed that it improved the accuracy of MP-EST, one of the most popular coalescent-based summary methods. Statistical binning, which uses a simple heuristic to evaluate "combinability" and then uses the larger sets of genes to re-calculate gene trees, has good empirical performance, but using statistical binning within a phylogenomic pipeline does not have the desirable property of being statistically consistent. We show that weighting the re-calculated gene trees by the bin sizes makes statistical binning statistically consistent under the multispecies coalescent, and maintains the good empirical performance. Thus, "weighted statistical binning" enables highly accurate genome-scale species tree estimation, and is also statistically consistent under the multi-species coalescent model. New data used in this study are available at DOI: http://dx.doi.org/10.6084/m9.figshare.1411146, and the software is available at https://github.com/smirarab/binning. PMID:26086579

  19. Methodological reporting of randomized trials in five leading Chinese nursing journals.

    PubMed

    Shi, Chunhu; Tian, Jinhui; Ren, Dan; Wei, Hongli; Zhang, Lihuan; Wang, Quan; Yang, Kehu

    2014-01-01

    Randomized controlled trials (RCTs) are not always well reported, especially in terms of their methodological descriptions. This study aimed to investigate the adherence of methodological reporting complying with CONSORT and explore associated trial level variables in the Chinese nursing care field. In June 2012, we identified RCTs published in five leading Chinese nursing journals and included trials with details of randomized methods. The quality of methodological reporting was measured through the methods section of the CONSORT checklist and the overall CONSORT methodological items score was calculated and expressed as a percentage. Meanwhile, we hypothesized that some general and methodological characteristics were associated with reporting quality and conducted a regression with these data to explore the correlation. The descriptive and regression statistics were calculated via SPSS 13.0. In total, 680 RCTs were included. The overall CONSORT methodological items score was 6.34 ± 0.97 (Mean ± SD). No RCT reported descriptions and changes in "trial design," changes in "outcomes" and "implementation," or descriptions of the similarity of interventions for "blinding." Poor reporting was found in detailing the "settings of participants" (13.1%), "type of randomization sequence generation" (1.8%), calculation methods of "sample size" (0.4%), explanation of any interim analyses and stopping guidelines for "sample size" (0.3%), "allocation concealment mechanism" (0.3%), additional analyses in "statistical methods" (2.1%), and targeted subjects and methods of "blinding" (5.9%). More than 50% of trials described randomization sequence generation, the eligibility criteria of "participants," "interventions," and definitions of the "outcomes" and "statistical methods." The regression analysis found that publication year and ITT analysis were weakly associated with CONSORT score. The completeness of methodological reporting of RCTs in the Chinese nursing care field is poor, especially with regard to the reporting of trial design, changes in outcomes, sample size calculation, allocation concealment, blinding, and statistical methods.

  20. Effects of Alzheimer’s Disease in the Prediagnosis Period on Financial Outcomes

    DTIC Science & Technology

    2017-10-01

    merged data; derived key dependent and independent variables and calculated descriptive statistics; and performed initial analyses of the effect of AD on...during the period before it is diagnosable on financial outcomes differ depending on whether the financial head of household is afflicted or the spouse

  1. An investigative comparison of purging and non-purging groundwater sampling methods in Karoo aquifer monitoring wells

    NASA Astrophysics Data System (ADS)

    Gomo, M.; Vermeulen, D.

    2015-03-01

    An investigation was conducted to statistically compare the influence of non-purging and purging groundwater sampling methods on analysed inorganic chemistry parameters and calculated saturation indices. Groundwater samples were collected from 15 monitoring wells drilled in Karoo aquifers before and after purging for the comparative study. For the non-purging method, samples were collected from groundwater flow zones located in the wells using electrical conductivity (EC) profiling. The two data sets of non-purged and purged groundwater samples were analysed for inorganic chemistry parameters at the Institute of Groundwater Studies (IGS) laboratory of the Free University in South Africa. Saturation indices for mineral phases that were found in the data base of the PHREEQC hydrogeochemical model were calculated for each data set. Four one-way ANOVA tests were conducted using Microsoft Excel 2007 to investigate if there is any statistically significant difference between: (1) all inorganic chemistry parameters measured in the non-purged and purged groundwater samples per each specific well, (2) all mineral saturation indices calculated for the non-purged and purged groundwater samples per each specific well, (3) individual inorganic chemistry parameters measured in the non-purged and purged groundwater samples across all wells and (4) individual mineral saturation indices calculated for non-purged and purged groundwater samples across all wells. For all the ANOVA tests conducted, the calculated alpha values (p) are greater than 0.05 (significance level) and the test statistic (F) is less than the critical value (Fcrit) (F < Fcrit). The results imply that there was no statistically significant difference between the two data sets. With a 95% confidence, it was therefore concluded that the variance between groups was rather due to random chance and not to the influence of the sampling methods (tested factor). It is therefore possible that in some hydrogeologic conditions, non-purged groundwater samples might be just as representative as the purged ones. The findings of this study can provide an important platform for future evidence-oriented research investigations to establish the necessity of purging prior to groundwater sampling in different aquifer systems.
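
    For a single parameter in a single well, the comparison described above reduces to a one-way ANOVA between the non-purged and purged results, checking p against 0.05 and F against Fcrit. The sketch below shows that calculation on invented concentration values; it does not use the study's data.

```python
# Illustrative one-way ANOVA comparing non-purged and purged results for one
# parameter in one well; the concentration values are invented.
import numpy as np
from scipy import stats

non_purged = np.array([152., 148., 160., 155., 149., 158.])
purged     = np.array([150., 151., 157., 154., 147., 159.])

F, p = stats.f_oneway(non_purged, purged)
df_between = 1
df_within = len(non_purged) + len(purged) - 2
F_crit = stats.f.ppf(0.95, df_between, df_within)

print(f"F = {F:.3f}, Fcrit = {F_crit:.3f}, p = {p:.3f}")
print("no significant difference" if p > 0.05 else "significant difference")
```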

  2. Nonindependence and sensitivity analyses in ecological and evolutionary meta-analyses.

    PubMed

    Noble, Daniel W A; Lagisz, Malgorzata; O'dea, Rose E; Nakagawa, Shinichi

    2017-05-01

    Meta-analysis is an important tool for synthesizing research on a variety of topics in ecology and evolution, including molecular ecology, but can be susceptible to nonindependence. Nonindependence can affect two major interrelated components of a meta-analysis: (i) the calculation of effect size statistics and (ii) the estimation of overall meta-analytic estimates and their uncertainty. While some solutions to nonindependence exist at the statistical analysis stages, there is little advice on what to do when complex analyses are not possible, or when studies with nonindependent experimental designs exist in the data. Here we argue that exploring the effects of procedural decisions in a meta-analysis (e.g. inclusion of different quality data, choice of effect size) and statistical assumptions (e.g. assuming no phylogenetic covariance) using sensitivity analyses are extremely important in assessing the impact of nonindependence. Sensitivity analyses can provide greater confidence in results and highlight important limitations of empirical work (e.g. impact of study design on overall effects). Despite their importance, sensitivity analyses are seldom applied to problems of nonindependence. To encourage better practice for dealing with nonindependence in meta-analytic studies, we present accessible examples demonstrating the impact that ignoring nonindependence can have on meta-analytic estimates. We also provide pragmatic solutions for dealing with nonindependent study designs, and for analysing dependent effect sizes. Additionally, we offer reporting guidelines that will facilitate disclosure of the sources of nonindependence in meta-analyses, leading to greater transparency and more robust conclusions. © 2017 John Wiley & Sons Ltd.

  3. Cocaine profiling for strategic intelligence, a cross-border project between France and Switzerland: part II. Validation of the statistical methodology for the profiling of cocaine.

    PubMed

    Lociciro, S; Esseiva, P; Hayoz, P; Dujourdy, L; Besacier, F; Margot, P

    2008-05-20

    Harmonisation and optimization of analytical and statistical methodologies were carried out between two forensic laboratories (Lausanne, Switzerland and Lyon, France) in order to provide drug intelligence for cross-border cocaine seizures. Part I dealt with the optimization of the analytical method and its robustness. This second part investigates statistical methodologies that will provide reliable comparison of cocaine seizures analysed on two different gas chromatographs interfaced with flame ionisation detectors (GC-FIDs) in two distinct laboratories. Sixty-six statistical combinations (ten data pre-treatments followed by six different distance measurements and correlation coefficients) were applied. One pre-treatment (N+S: the area of each peak is divided by its standard deviation calculated from the whole data set) followed by the Cosine or Pearson correlation coefficients was found to be the best statistical compromise for optimal discrimination of linked and non-linked samples. Centralisation of the analyses in a single laboratory is no longer a required condition for comparing samples seized in different countries. This allows collaboration, but also jurisdictional control over data.
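
    The retained combination, dividing each peak area by that peak's standard deviation over the whole dataset (the "N+S" pre-treatment) and then comparing profiles with the cosine or Pearson correlation coefficient, is easy to sketch. The peak-area table below is invented, and the link/non-link threshold that would be calibrated on known-linked seizures is omitted.

```python
# Hedged sketch of the retained pre-treatment and comparison metrics on invented
# peak areas: divide each peak by its standard deviation over the whole dataset,
# then compare profile pairs with cosine similarity and Pearson correlation.
import numpy as np

profiles = np.array([            # rows = seizures, columns = chromatographic peak areas
    [120., 30., 55., 10., 80.],
    [118., 33., 50., 12., 82.],
    [ 60., 90., 20., 45., 15.],
])

pretreated = profiles / profiles.std(axis=0, ddof=1)   # "N+S": each peak scaled by its SD

def cosine(a, b):
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

print("cosine(0, 1): ", round(cosine(pretreated[0], pretreated[1]), 3))
print("cosine(0, 2): ", round(cosine(pretreated[0], pretreated[2]), 3))
print("Pearson(0, 1):", round(np.corrcoef(pretreated[0], pretreated[1])[0, 1], 3))
```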

  4. A Simple Method to Control Positive Baseline Trend within Data Nonoverlap

    ERIC Educational Resources Information Center

    Parker, Richard I.; Vannest, Kimberly J.; Davis, John L.

    2014-01-01

    Nonoverlap is widely used as a statistical summary of data; however, these analyses rarely correct unwanted positive baseline trend. This article presents and validates the graph rotation for overlap and trend (GROT) technique, a hand calculation method for controlling positive baseline trend within an analysis of data nonoverlap. GROT is…

  5. Surface pressure maps from scatterometer data

    NASA Technical Reports Server (NTRS)

    Brown, R. A.; Levy, Gad

    1991-01-01

    The ability to determine surface pressure fields from satellite scatterometer data was shown by Brown and Levy (1986). The surface winds are used to calculate the gradient winds above the planetary boundary layer, and these are directly related to the pressure gradients. There are corrections for variable stratification, variable surface roughness, horizontal inhomogeneity, humidity and baroclinity. The Seasat-A Satellite Scatterometer (SASS) data have been used in a systematic study of 50 synoptic weather events (regions of approximately 1000 X 1000 km). The preliminary statistics of agreement with national weather service surface pressure maps are calculated. The resulting surface pressure maps can be used together with SASS winds and Scanning Multichannel Microwave Radiometer (SMMR) water vapor and liquid water analyses to provide good front and storm system analyses.

  6. Learning investment indicators through data extension

    NASA Astrophysics Data System (ADS)

    Dvořák, Marek

    2017-07-01

    Stock prices in the form of time series were analysed using univariate and multivariate statistical methods. After simple data preprocessing in the form of logarithmic differences, we augmented this univariate time series to a multivariate representation. This method makes use of sliding windows to calculate several dozen new variables using simple statistical tools such as first and second moments, as well as more complicated statistics such as auto-regression coefficients and residual analysis, followed by an optional quadratic transformation used for further data extension. These were used as explanatory variables in a regularized logistic LASSO regression that tried to estimate the Buy-Sell Index (BSI) from real stock market data.
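
    A rough sketch of the pipeline described, log-differencing, sliding-window feature extraction, and an L1-regularised ("LASSO") logistic regression, is given below. The simulated price series, the particular window statistics, and the next-step up/down label standing in for the BSI are all illustrative assumptions.

```python
# Illustrative sketch only: log-differences, sliding-window features, and an
# L1-regularised logistic regression on a simulated price series (not real BSI data).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
prices = 100 * np.exp(np.cumsum(rng.normal(0, 0.01, 600)))   # simulated price path
ret = np.diff(np.log(prices))                                 # logarithmic differences

window = 20
X, y = [], []
for i in range(window, len(ret)):
    w = ret[i - window:i]
    ac1 = np.corrcoef(w[:-1], w[1:])[0, 1]                    # lag-1 autocorrelation
    X.append([w.mean(), w.std(ddof=1), ac1, np.abs(w).mean()])
    y.append(int(ret[i] > 0))                                 # toy up/down target
X, y = np.array(X), np.array(y)

lasso_logit = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(X, y)
print("non-zero coefficients:", int(np.sum(lasso_logit.coef_ != 0)), "of", X.shape[1])
```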

  7. Plant selection for ethnobotanical uses on the Amalfi Coast (Southern Italy).

    PubMed

    Savo, V; Joy, R; Caneva, G; McClatchey, W C

    2015-07-15

    Many ethnobotanical studies have investigated selection criteria for medicinal and non-medicinal plants. In this paper we test several statistical methods using different ethnobotanical datasets in order to 1) define to what extent the nature of the datasets can affect the interpretation of results; 2) determine if the selection for different plant uses is based on phylogeny or on other selection criteria. We considered three different ethnobotanical datasets: two datasets of medicinal plants and a dataset of non-medicinal plants (handicraft production, domestic and agro-pastoral practices) and two floras of the Amalfi Coast. We performed residual analysis from linear regression, the binomial test and the Bayesian approach for calculating under-used and over-used plant families within ethnobotanical datasets. Percentages of agreement were calculated to compare the results of the analyses. We also analyzed the relationship between plant selection and phylogeny, chorology, life form and habitat using the chi-square test. Pearson's residuals for each of the significant chi-square analyses were examined to investigate alternative hypotheses of plant selection criteria. The three statistical analysis methods differed within the same dataset, and between different datasets and floras, but with some similarities. In the two medicinal datasets, only Lamiaceae was identified in both floras as an over-used family by all three statistical methods. All statistical methods in one flora agreed that Malvaceae was over-used and Poaceae under-used, but this was not found to be consistent with results of the second flora in which one statistical result was non-significant. All other families had some discrepancy in significance across methods or floras. Significant over- or under-use was observed in only a minority of cases. The chi-square analyses were significant for phylogeny, life form and habitat. Pearson's residuals indicated a non-random selection of woody species for non-medicinal uses and an under-use of plants of temperate forests for medicinal uses. Our study showed that selection criteria for plant uses (including medicinal) are not always based on phylogeny. The comparison of different statistical methods (regression, binomial and Bayesian) under different conditions led to the conclusion that the most conservative results are obtained using regression analysis.

  8. Accounting for Multiple Births in Neonatal and Perinatal Trials: Systematic Review and Case Study

    PubMed Central

    Hibbs, Anna Maria; Black, Dennis; Palermo, Lisa; Cnaan, Avital; Luan, Xianqun; Truog, William E; Walsh, Michele C; Ballard, Roberta A

    2010-01-01

    Objectives To determine the prevalence in the neonatal literature of statistical approaches accounting for the unique clustering patterns of multiple births. To explore the sensitivity of an actual trial to several analytic approaches to multiples. Methods A systematic review of recent perinatal trials assessed the prevalence of studies accounting for clustering of multiples. The NO CLD trial served as a case study of the sensitivity of the outcome to several statistical strategies. We calculated odds ratios using non-clustered (logistic regression) and clustered (generalized estimating equations, multiple outputation) analyses. Results In the systematic review, most studies did not describe the randomization of twins and did not account for clustering. Of those studies that did, exclusion of multiples and generalized estimating equations were the most common strategies. The NO CLD study included 84 infants with a sibling enrolled in the study. Multiples were more likely than singletons to be white and were born to older mothers (p<0.01). Analyses that accounted for clustering were statistically significant; analyses assuming independence were not. Conclusions The statistical approach to multiples can influence the odds ratio and width of confidence intervals, thereby affecting the interpretation of a study outcome. A minority of perinatal studies address this issue. PMID:19969305

  9. Accounting for multiple births in neonatal and perinatal trials: systematic review and case study.

    PubMed

    Hibbs, Anna Maria; Black, Dennis; Palermo, Lisa; Cnaan, Avital; Luan, Xianqun; Truog, William E; Walsh, Michele C; Ballard, Roberta A

    2010-02-01

    To determine the prevalence in the neonatal literature of statistical approaches accounting for the unique clustering patterns of multiple births and to explore the sensitivity of an actual trial to several analytic approaches to multiples. A systematic review of recent perinatal trials assessed the prevalence of studies accounting for clustering of multiples. The Nitric Oxide to Prevent Chronic Lung Disease (NO CLD) trial served as a case study of the sensitivity of the outcome to several statistical strategies. We calculated odds ratios using nonclustered (logistic regression) and clustered (generalized estimating equations, multiple outputation) analyses. In the systematic review, most studies did not describe the random assignment of twins and did not account for clustering. Of those studies that did, exclusion of multiples and generalized estimating equations were the most common strategies. The NO CLD study included 84 infants with a sibling enrolled in the study. Multiples were more likely than singletons to be white and were born to older mothers (P < .01). Analyses that accounted for clustering were statistically significant; analyses assuming independence were not. The statistical approach to multiples can influence the odds ratio and width of confidence intervals, thereby affecting the interpretation of a study outcome. A minority of perinatal studies address this issue. Copyright 2010 Mosby, Inc. All rights reserved.
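
    The clustered versus non-clustered contrast the authors describe can be sketched with statsmodels: an ordinary logistic regression that assumes independence next to a GEE with an exchangeable working correlation within sibling clusters. The simulated cohort below is invented and only illustrates the mechanics; multiple outputation is not shown.

```python
# Hedged sketch on simulated data: a naive logistic regression (independence assumed)
# versus a GEE with exchangeable correlation within family clusters.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n_families = 300
sibs = rng.choice([1, 2], size=n_families, p=[0.8, 0.2])        # some twin pairs
family = np.repeat(np.arange(n_families), sibs)
treat = np.repeat(rng.binomial(1, 0.5, n_families), sibs)        # randomised by family
shared = np.repeat(rng.normal(0, 1.0, n_families), sibs)         # shared family effect
p = 1 / (1 + np.exp(-(-0.5 - 0.4 * treat + shared)))
outcome = rng.binomial(1, p)
df = pd.DataFrame({"outcome": outcome, "treat": treat, "family": family})

naive = smf.logit("outcome ~ treat", data=df).fit(disp=False)
gee = smf.gee("outcome ~ treat", groups="family", data=df,
              family=sm.families.Binomial(),
              cov_struct=sm.cov_struct.Exchangeable()).fit()

print("naive OR:", float(np.exp(naive.params["treat"])))
print("GEE OR:  ", float(np.exp(gee.params["treat"])))
```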

  10. Statistical analysis plan for the Alveolar Recruitment for Acute Respiratory Distress Syndrome Trial (ART). A randomized controlled trial

    PubMed Central

    Damiani, Lucas Petri; Berwanger, Otavio; Paisani, Denise; Laranjeira, Ligia Nasi; Suzumura, Erica Aranha; Amato, Marcelo Britto Passos; Carvalho, Carlos Roberto Ribeiro; Cavalcanti, Alexandre Biasi

    2017-01-01

    Background The Alveolar Recruitment for Acute Respiratory Distress Syndrome Trial (ART) is an international multicenter randomized pragmatic controlled trial with allocation concealment involving 120 intensive care units in Brazil, Argentina, Colombia, Italy, Poland, Portugal, Malaysia, Spain, and Uruguay. The primary objective of ART is to determine whether maximum stepwise alveolar recruitment associated with PEEP titration, adjusted according to the static compliance of the respiratory system (ART strategy), is able to increase 28-day survival in patients with acute respiratory distress syndrome compared to conventional treatment (ARDSNet strategy). Objective To describe the data management process and statistical analysis plan. Methods The statistical analysis plan was designed by the trial executive committee and reviewed and approved by the trial steering committee. We provide an overview of the trial design with a special focus on describing the primary (28-day survival) and secondary outcomes. We describe our data management process, data monitoring committee, interim analyses, and sample size calculation. We describe our planned statistical analyses for primary and secondary outcomes as well as pre-specified subgroup analyses. We also provide details for presenting results, including mock tables for baseline characteristics, adherence to the protocol and effect on clinical outcomes. Conclusion According to best trial practice, we report our statistical analysis plan and data management plan prior to locking the database and beginning analyses. We anticipate that this document will prevent analysis bias and enhance the utility of the reported results. Trial registration ClinicalTrials.gov number, NCT01374022. PMID:28977255

  11. M-TraCE: a new tool for high-resolution computation and statistical elaboration of backward trajectories on the Italian domain

    NASA Astrophysics Data System (ADS)

    Vitali, Lina; Righini, Gaia; Piersanti, Antonio; Cremona, Giuseppe; Pace, Giandomenico; Ciancarella, Luisella

    2017-12-01

    Air backward trajectory calculations are commonly used in a variety of atmospheric analyses, in particular for source attribution evaluation. The accuracy of backward trajectory analysis is mainly determined by the quality and the spatial and temporal resolution of the underlying meteorological data set, especially in the cases of complex terrain. This work describes a new tool for the calculation and the statistical elaboration of backward trajectories. To take advantage of the high-resolution meteorological database of the Italian national air quality model MINNI, a dedicated set of procedures was implemented under the name of M-TraCE (MINNI module for Trajectories Calculation and statistical Elaboration) to calculate and process the backward trajectories of air masses reaching a site of interest. Some outcomes from the application of the developed methodology to the Italian Network of Special Purpose Monitoring Stations are shown to assess its strengths for the meteorological characterization of air quality monitoring stations. M-TraCE has demonstrated its capabilities to provide a detailed statistical assessment of transport patterns and region of influence of the site under investigation, which is fundamental for correctly interpreting pollutants measurements and ascertaining the official classification of the monitoring site based on meta-data information. Moreover, M-TraCE has shown its usefulness in supporting other assessments, i.e., spatial representativeness of a monitoring site, focussing specifically on the analysis of the effects due to meteorological variables.

  12. SOCR Analyses – an Instructional Java Web-based Statistical Analysis Toolkit

    PubMed Central

    Chu, Annie; Cui, Jenny; Dinov, Ivo D.

    2011-01-01

    The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses like linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as t-test in the parametric category; and Wilcoxon rank sum test, Kruskal-Wallis test, Friedman's test, in the non-parametric category. SOCR Analyses also include several hypothesis test models, such as Contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with API (Application Programming Interface) have been implemented in statistical summary, least square solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for most updated information and newly added models. PMID:21546994

  13. Sample Size Calculations for Precise Interval Estimation of the Eta-Squared Effect Size

    ERIC Educational Resources Information Center

    Shieh, Gwowen

    2015-01-01

    Analysis of variance is one of the most frequently used statistical analyses in the behavioral, educational, and social sciences, and special attention has been paid to the selection and use of an appropriate effect size measure of association in analysis of variance. This article presents the sample size procedures for precise interval estimation…

  14. Differential Neonatal and Postneonatal Infant Mortality Rates across US Counties: The Role of Socioeconomic Conditions and Rurality

    ERIC Educational Resources Information Center

    Sparks, P. Johnelle; McLaughlin, Diane K.; Stokes, C. Shannon

    2009-01-01

    Purpose: To examine differences in correlates of neonatal and postneonatal infant mortality rates, across counties, by degree of rurality. Methods: Neonatal and postneonatal mortality rates were calculated from the 1998 to 2002 Compressed Mortality Files from the National Center for Health Statistics. Bivariate analyses assessed the relationship…

  15. Methodological Reporting of Randomized Trials in Five Leading Chinese Nursing Journals

    PubMed Central

    Shi, Chunhu; Tian, Jinhui; Ren, Dan; Wei, Hongli; Zhang, Lihuan; Wang, Quan; Yang, Kehu

    2014-01-01

    Background Randomized controlled trials (RCTs) are not always well reported, especially in terms of their methodological descriptions. This study aimed to investigate the adherence of methodological reporting complying with CONSORT and explore associated trial level variables in the Chinese nursing care field. Methods In June 2012, we identified RCTs published in five leading Chinese nursing journals and included trials with details of randomized methods. The quality of methodological reporting was measured through the methods section of the CONSORT checklist and the overall CONSORT methodological items score was calculated and expressed as a percentage. Meanwhile, we hypothesized that some general and methodological characteristics were associated with reporting quality and conducted a regression with these data to explore the correlation. The descriptive and regression statistics were calculated via SPSS 13.0. Results In total, 680 RCTs were included. The overall CONSORT methodological items score was 6.34±0.97 (Mean ± SD). No RCT reported descriptions and changes in “trial design,” changes in “outcomes” and “implementation,” or descriptions of the similarity of interventions for “blinding.” Poor reporting was found in detailing the “settings of participants” (13.1%), “type of randomization sequence generation” (1.8%), calculation methods of “sample size” (0.4%), explanation of any interim analyses and stopping guidelines for “sample size” (0.3%), “allocation concealment mechanism” (0.3%), additional analyses in “statistical methods” (2.1%), and targeted subjects and methods of “blinding” (5.9%). More than 50% of trials described randomization sequence generation, the eligibility criteria of “participants,” “interventions,” and definitions of the “outcomes” and “statistical methods.” The regression analysis found that publication year and ITT analysis were weakly associated with CONSORT score. Conclusions The completeness of methodological reporting of RCTs in the Chinese nursing care field is poor, especially with regard to the reporting of trial design, changes in outcomes, sample size calculation, allocation concealment, blinding, and statistical methods. PMID:25415382

  16. Have the temperature time series a structural change after 1998?

    NASA Astrophysics Data System (ADS)

    Werner, Rolf; Valev, Dimitare; Danov, Dimitar

    2012-07-01

    The global and hemispheric GISS and HadCRUT3 temperature time series were analysed for structural changes. We postulate that the fitted temperature function is continuous in time across segment boundaries. The slopes are calculated for a sequence of segments delimited by time thresholds. We used a standard method, restricted linear regression with dummy variables. We performed the calculations and tests for different numbers of thresholds. The thresholds are searched continuously within specified time intervals. The F-statistic is used to obtain the time points of the structural changes.
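
    A minimal version of such a test is sketched below: a continuous piecewise-linear model in which only the slope changes at a candidate threshold (continuity imposed through a hinge term) is compared with a single straight line by an F-test, and the threshold is searched on a grid. The simulated anomaly series is invented, and the naive F p-value ignores the inflation caused by searching over thresholds, a simplification the sketch accepts.

```python
# Minimal sketch, not the authors' exact dummy-variable formulation: a continuous
# one-breakpoint (hinge) model versus a single line, compared by an F-test, with the
# threshold searched on a grid. The temperature anomalies are simulated, not GISS/HadCRUT3.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
year = np.arange(1950, 2012, dtype=float)
temp = (0.005 * (year - 1950) + 0.015 * np.clip(year - 1998, 0, None)
        + rng.normal(0, 0.08, year.size))

def rss(X, y):
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    return np.sum((y - X @ beta) ** 2)

X0 = np.column_stack([np.ones_like(year), year])
rss0 = rss(X0, temp)                                    # restricted model: one slope

best = None
for tau in np.arange(1960.0, 2005.0):                   # candidate thresholds
    X1 = np.column_stack([np.ones_like(year), year, np.clip(year - tau, 0, None)])
    r = rss(X1, temp)
    if best is None or r < best[1]:
        best = (tau, r)

tau, rss1 = best
F = ((rss0 - rss1) / 1) / (rss1 / (year.size - 3))      # 1 extra parameter, n - 3 residual df
p = stats.f.sf(F, 1, year.size - 3)
print(f"best threshold {tau:.0f}, F = {F:.1f}, p = {p:.4f}")
```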

  17. The determinants of bond angle variability in protein/peptide backbones: A comprehensive statistical/quantum mechanics analysis.

    PubMed

    Improta, Roberto; Vitagliano, Luigi; Esposito, Luciana

    2015-11-01

    The elucidation of the mutual influence between peptide bond geometry and local conformation has important implications for protein structure refinement, validation, and prediction. To gain insights into the structural determinants and the energetic contributions associated with protein/peptide backbone plasticity, we here report an extensive analysis of the variability of the peptide bond angles by combining statistical analyses of protein structures and quantum mechanics calculations on small model peptide systems. Our analyses demonstrate that all the backbone bond angles strongly depend on the peptide conformation and unveil the existence of regular trends as a function of ψ and/or φ. The excellent agreement of the quantum mechanics calculations with the statistical surveys of protein structures validates the computational scheme here employed and demonstrates that the valence geometry of the protein/peptide backbone is primarily dictated by local interactions. Notably, for the first time we show that the position of the H(α) hydrogen atom, which is an important parameter in NMR structural studies, is also dependent on the local conformation. Most of the trends observed may be satisfactorily explained by invoking steric repulsive interactions; in some specific cases the valence bond variability is also influenced by hydrogen-bond-like interactions. Moreover, we can provide a reliable estimate of the energies involved in the interplay between geometry and conformations. © 2015 Wiley Periodicals, Inc.

  18. SIRIUS. An automated method for the analysis of the preferred packing arrangements between protein groups.

    PubMed

    Singh, J; Thornton, J M

    1990-02-05

    Automated methods have been developed to determine the preferred packing arrangement between interacting protein groups. A suite of FORTRAN programs, SIRIUS, is described for calculating and analysing the geometries of interacting protein groups using crystallographically derived atomic co-ordinates. The programs involved in calculating the geometries search for interacting pairs of protein groups using a distance criterion, and then calculate the spatial disposition and orientation of the pair. The second set of programs is devoted to analysis. This involves calculating the observed and expected distributions of the angles and assessing the statistical significance of the difference between the two. A database of the geometries of the 400 combinations of side-chain to side-chain interaction has been created. The approach used in analysing the geometrical information is illustrated here with specific examples of interactions between side-chains, peptide groups and particular types of atom. At the side-chain level, an analysis of aromatic-amino interactions, and the interactions of peptide carbonyl groups with arginine residues is presented. At the atomic level the analyses include the spatial disposition of oxygen atoms around tyrosine residues, and the frequency and type of contact between carbon, nitrogen and oxygen atoms. This information is currently being applied to the modelling of protein interactions.

  19. Descriptive Statistics for Modern Test Score Distributions: Skewness, Kurtosis, Discreteness, and Ceiling Effects.

    PubMed

    Ho, Andrew D; Yu, Carol C

    2015-06-01

    Many statistical analyses benefit from the assumption that unconditional or conditional distributions are continuous and normal. More than 50 years ago in this journal, Lord and Cook chronicled departures from normality in educational tests, and Micceri similarly showed that the normality assumption is met rarely in educational and psychological practice. In this article, the authors extend these previous analyses to state-level educational test score distributions that are an increasingly common target of high-stakes analysis and interpretation. Among 504 scale-score and raw-score distributions from state testing programs from recent years, nonnormal distributions are common and are often associated with particular state programs. The authors explain how scaling procedures from item response theory lead to nonnormal distributions as well as unusual patterns of discreteness. The authors recommend that distributional descriptive statistics be calculated routinely to inform model selection for large-scale test score data, and they illustrate consequences of nonnormality using sensitivity studies that compare baseline results to those from normalized score scales.
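
    A hedged sketch of the routine distributional descriptives recommended above, computed with scipy for a simulated, discretised score distribution (the data and the ceiling value of 60 are invented for illustration):

    ```python
    # Descriptive statistics for a skewed, discrete, ceiling-limited score set.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    scores = np.round(rng.beta(5, 2, size=5000) * 60)        # skewed, ceiling at 60

    print("skewness        :", round(stats.skew(scores), 3))
    print("excess kurtosis :", round(stats.kurtosis(scores), 3))   # 0 for a normal
    print("share at ceiling:", round(float(np.mean(scores == 60)), 3))
    print("distinct values :", np.unique(scores).size)             # crude discreteness check
    ```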

  20. Meta-analysis of neutropenia or leukopenia as a prognostic factor in patients with malignant disease undergoing chemotherapy.

    PubMed

    Shitara, Kohei; Matsuo, Keitaro; Oze, Isao; Mizota, Ayako; Kondo, Chihiro; Nomura, Motoo; Yokota, Tomoya; Takahari, Daisuke; Ura, Takashi; Muro, Kei

    2011-08-01

    We performed a systematic review and meta-analysis to determine the impact of neutropenia or leukopenia experienced during chemotherapy on survival. Eligible studies included prospective or retrospective analyses that evaluated neutropenia or leukopenia as a prognostic factor for overall survival or disease-free survival. Statistical analyses were conducted to calculate a summary hazard ratio and 95% confidence interval (CI) using random-effects or fixed-effects models based on the heterogeneity of the included studies. Thirteen trials were selected for the meta-analysis, with a total of 9,528 patients. The hazard ratio of death was 0.69 (95% CI, 0.64-0.75) for patients with higher-grade neutropenia or leukopenia compared to patients with lower-grade or lack of cytopenia. Our analysis was also stratified by statistical method (any statistical method to decrease lead-time bias; time-varying analysis or landmark analysis), but no differences were observed. Our results indicate that neutropenia or leukopenia experienced during chemotherapy is associated with improved survival in patients with advanced cancer or hematological malignancies undergoing chemotherapy. Future prospective analyses designed to investigate the potential impact of chemotherapy dose adjustment coupled with monitoring of neutropenia or leukopenia on survival are warranted.
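
    An illustrative sketch of the pooling step described above, using invented per-study hazard ratios rather than the study data: log hazard ratios are combined by inverse variance, with a DerSimonian-Laird random-effects estimate when heterogeneity is present.

    ```python
    # Fixed- and random-effects pooling of log hazard ratios (hypothetical numbers).
    import numpy as np

    hr = np.array([0.65, 0.72, 0.80, 0.60, 0.75])        # per-study hazard ratios
    se = np.array([0.10, 0.12, 0.15, 0.09, 0.11])        # SE of log(HR)

    y, w = np.log(hr), 1.0 / se**2                        # fixed-effect weights
    mu_fe = np.sum(w * y) / np.sum(w)

    Q = np.sum(w * (y - mu_fe) ** 2)                      # Cochran's Q
    df = y.size - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (Q - df) / c)                         # between-study variance

    w_re = 1.0 / (se**2 + tau2)                           # random-effects weights
    mu_re = np.sum(w_re * y) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))

    print("pooled HR (random effects): %.2f" % np.exp(mu_re))
    print("95%% CI: %.2f-%.2f" % (np.exp(mu_re - 1.96 * se_re),
                                  np.exp(mu_re + 1.96 * se_re)))
    ```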

  1. Reanalysis of the start of the UK 1967 to 1968 foot-and-mouth disease epidemic to calculate airborne transmission probabilities.

    PubMed

    Sanson, R L; Gloster, J; Burgin, L

    2011-09-24

    The aims of this study were to statistically reassess the likelihood that windborne spread of foot-and-mouth disease (FMD) virus (FMDV) occurred at the start of the UK 1967 to 1968 FMD epidemic at Oswestry, Shropshire, and to derive dose-response probability of infection curves for farms exposed to airborne FMDV. To enable this, data on all farms present in 1967 in the parishes near Oswestry were assembled. Cases were infected premises whose date of appearance of first clinical signs was within 14 days of the depopulation of the index farm. Logistic regression was used to evaluate the association between infection status and distance and direction from the index farm. The UK Met Office's NAME atmospheric dispersion model (ADM) was used to generate plumes for each day that FMDV was excreted from the index farm based on actual historical weather records from October 1967. Daily airborne FMDV exposure rates for all farms in the study area were calculated using a geographical information system. Probit analyses were used to calculate dose-response probability of infection curves to FMDV, using relative exposure rates on case and control farms. Both the logistic regression and probit analyses gave strong statistical support to the hypothesis that airborne spread occurred. There was some evidence that incubation period was inversely proportional to the exposure rate.

  2. The heterogeneity statistic I(2) can be biased in small meta-analyses.

    PubMed

    von Hippel, Paul T

    2015-04-14

    Estimated effects vary across studies, partly because of random sampling error and partly because of heterogeneity. In meta-analysis, the fraction of variance that is due to heterogeneity is estimated by the statistic I(2). We calculate the bias of I(2), focusing on the situation where the number of studies in the meta-analysis is small. Small meta-analyses are common; in the Cochrane Library, the median number of studies per meta-analysis is 7 or fewer. We use Mathematica software to calculate the expectation and bias of I(2). I(2) has a substantial bias when the number of studies is small. The bias is positive when the true fraction of heterogeneity is small, but the bias is typically negative when the true fraction of heterogeneity is large. For example, with 7 studies and no true heterogeneity, I(2) will overestimate heterogeneity by an average of 12 percentage points, but with 7 studies and 80 percent true heterogeneity, I(2) can underestimate heterogeneity by an average of 28 percentage points. Biases of 12-28 percentage points are not trivial when one considers that, in the Cochrane Library, the median I(2) estimate is 21 percent. The point estimate I(2) should be interpreted cautiously when a meta-analysis has few studies. In small meta-analyses, confidence intervals should supplement or replace the biased point estimate I(2).
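
    For reference, a minimal sketch of the usual point estimate of I(2) from Cochran's Q for a meta-analysis of k studies; the article's point is precisely that this estimate is biased when k is small, so intervals should accompany it. The Q value below is hypothetical.

    ```python
    # Point estimate of I^2 from Cochran's Q.
    def i_squared(Q, k):
        """I^2 (in percent) = 100 * (Q - df) / Q, truncated at zero, with df = k - 1."""
        return max(0.0, (Q - (k - 1)) / Q) * 100.0

    print(i_squared(Q=9.3, k=7))   # hypothetical Q for a 7-study meta-analysis
    ```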

  3. Borrowing of strength and study weights in multivariate and network meta-analysis.

    PubMed

    Jackson, Dan; White, Ian R; Price, Malcolm; Copas, John; Riley, Richard D

    2017-12-01

    Multivariate and network meta-analysis have the potential for the estimated mean of one effect to borrow strength from the data on other effects of interest. The extent of this borrowing of strength is usually assessed informally. We present new mathematical definitions of 'borrowing of strength'. Our main proposal is based on a decomposition of the score statistic, which we show can be interpreted as comparing the precision of estimates from the multivariate and univariate models. Our definition of borrowing of strength therefore emulates the usual informal assessment. We also derive a method for calculating study weights, which we embed into the same framework as our borrowing of strength statistics, so that percentage study weights can accompany the results from multivariate and network meta-analyses as they do in conventional univariate meta-analyses. Our proposals are illustrated using three meta-analyses involving correlated effects for multiple outcomes, multiple risk factor associations and multiple treatments (network meta-analysis).

  4. Borrowing of strength and study weights in multivariate and network meta-analysis

    PubMed Central

    Jackson, Dan; White, Ian R; Price, Malcolm; Copas, John; Riley, Richard D

    2016-01-01

    Multivariate and network meta-analysis have the potential for the estimated mean of one effect to borrow strength from the data on other effects of interest. The extent of this borrowing of strength is usually assessed informally. We present new mathematical definitions of ‘borrowing of strength’. Our main proposal is based on a decomposition of the score statistic, which we show can be interpreted as comparing the precision of estimates from the multivariate and univariate models. Our definition of borrowing of strength therefore emulates the usual informal assessment. We also derive a method for calculating study weights, which we embed into the same framework as our borrowing of strength statistics, so that percentage study weights can accompany the results from multivariate and network meta-analyses as they do in conventional univariate meta-analyses. Our proposals are illustrated using three meta-analyses involving correlated effects for multiple outcomes, multiple risk factor associations and multiple treatments (network meta-analysis). PMID:26546254

  5. Effects of street tree shade on asphalt concrete pavement performance

    Treesearch

    E.G. McPherson; J. Muchnick

    2005-01-01

    Forty-eight street segments were paired into 24 high-and low-shade pairs in Modesto, California, U.S. Field data were collected to calculate a Pavement Condition Index (PCI) and Tree Shade Index (TSI) for each segment. Statistical analyses found that greater PCI was associated with greater TSI, indicating that tree shade was partially responsible for reduced pavement...

  6. Bayesian statistical inference enhances the interpretation of contemporary randomized controlled trials.

    PubMed

    Wijeysundera, Duminda N; Austin, Peter C; Hux, Janet E; Beattie, W Scott; Laupacis, Andreas

    2009-01-01

    Randomized trials generally use "frequentist" statistics based on P-values and 95% confidence intervals. Frequentist methods have limitations that might be overcome, in part, by Bayesian inference. To illustrate these advantages, we re-analyzed randomized trials published in four general medical journals during 2004. We used Medline to identify randomized superiority trials with two parallel arms, individual-level randomization and dichotomous or time-to-event primary outcomes. Studies with P<0.05 in favor of the intervention were deemed "positive"; otherwise, they were "negative." We used several prior distributions and exact conjugate analyses to calculate Bayesian posterior probabilities for clinically relevant effects. Of 88 included studies, 39 were positive using a frequentist analysis. Although the Bayesian posterior probabilities of any benefit (relative risk or hazard ratio<1) were high in positive studies, these probabilities were lower and variable for larger benefits. The positive studies had only moderate probabilities for exceeding the effects that were assumed for calculating the sample size. By comparison, there were moderate probabilities of any benefit in negative studies. Bayesian and frequentist analyses complement each other when interpreting the results of randomized trials. Future reports of randomized trials should include both.
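
    A hedged sketch of an exact-conjugate re-analysis for a dichotomous outcome, in the spirit of the approach described above: Beta posteriors for each arm and Monte Carlo estimation of the posterior probability of any benefit, Pr(RR < 1). The event counts and the 0.75 threshold are invented.

    ```python
    # Conjugate Beta-binomial posteriors and posterior probability of benefit.
    import numpy as np

    rng = np.random.default_rng(2)
    events_trt, n_trt = 30, 200          # hypothetical treatment arm
    events_ctl, n_ctl = 45, 200          # hypothetical control arm

    # Beta(1, 1) flat priors; conjugate update with binomial data.
    p_trt = rng.beta(1 + events_trt, 1 + n_trt - events_trt, 100_000)
    p_ctl = rng.beta(1 + events_ctl, 1 + n_ctl - events_ctl, 100_000)

    rr = p_trt / p_ctl
    print("Pr(any benefit, RR < 1)      :", np.mean(rr < 1.0))
    print("Pr(larger benefit, RR < 0.75):", np.mean(rr < 0.75))
    ```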

  7. Forensic genetic analyses in isolated populations with examples of central European Valachs and Roma.

    PubMed

    Ehler, Edvard; Vanek, Daniel

    2017-05-01

    Isolated populations present a constant threat to the correctness of forensic genetic casework. In this review article we present several examples of how analyzing samples from isolated populations can bias the results of the forensic statistics and analyses. We select our examples from isolated populations from central and southeastern Europe, namely the Valachs and the European Roma. We also provide the reader with general strategies and principles to improve the laboratory practice (best practice) and reporting of samples from supposedly isolated populations. These include reporting the precise population data used for computing the forensic statistics, using the appropriate θ correction factor for calculating allele frequencies, typing ancestry informative markers in samples of unknown or uncertain ethnicity and establishing ethnic-specific forensic databases. Copyright © 2017 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  8. A novel alkaloid isolated from Crotalaria paulina and identified by NMR and DFT calculations

    NASA Astrophysics Data System (ADS)

    Oliveira, Ramon Prata; Demuner, Antonio Jacinto; Alvarenga, Elson Santiago; Barbosa, Luiz Claudio Almeida; de Melo Silva, Thiago

    2018-01-01

    Pyrrolizidine alkaloids (PAs) are secondary metabolites found in Crotalaria genus and are known to have several biological activities. A novel macrocycle bislactone alkaloid, coined ethylcrotaline, was isolated and purified from the aerial parts of Crotalaria paulina. The novel macrocycle was identified with the aid of high resolution mass spectrometry and advanced nuclear magnetic resonance techniques. The relative stereochemistry of the alkaloid was defined by comparing the calculated quantum mechanical hydrogen and carbon chemical shifts of eight candidate structures with the experimental NMR data. The best fit between the eight candidate structures and the experimental NMR chemical shifts was defined by the DP4 statistical analyses and the Mean Absolute Error (MAE) calculations.

  9. Identifying and characterizing hepatitis C virus hotspots in Massachusetts: a spatial epidemiological approach.

    PubMed

    Stopka, Thomas J; Goulart, Michael A; Meyers, David J; Hutcheson, Marga; Barton, Kerri; Onofrey, Shauna; Church, Daniel; Donahue, Ashley; Chui, Kenneth K H

    2017-04-20

    Hepatitis C virus (HCV) infections have increased during the past decade but little is known about geographic clustering patterns. We used a unique analytical approach, combining geographic information systems (GIS), spatial epidemiology, and statistical modeling to identify and characterize HCV hotspots, statistically significant clusters of census tracts with elevated HCV counts and rates. We compiled sociodemographic and HCV surveillance data (n = 99,780 cases) for Massachusetts census tracts (n = 1464) from 2002 to 2013. We used a five-step spatial epidemiological approach, calculating incremental spatial autocorrelations and Getis-Ord Gi* statistics to identify clusters. We conducted logistic regression analyses to determine factors associated with the HCV hotspots. We identified nine HCV clusters, with the largest in Boston, New Bedford/Fall River, Worcester, and Springfield (p < 0.05). In multivariable analyses, we found that HCV hotspots were independently and positively associated with the percent of the population that was Hispanic (adjusted odds ratio [AOR]: 1.07; 95% confidence interval [CI]: 1.04, 1.09) and the percent of households receiving food stamps (AOR: 1.83; 95% CI: 1.22, 2.74). HCV hotspots were independently and negatively associated with the percent of the population that were high school graduates or higher (AOR: 0.91; 95% CI: 0.89, 0.93) and the percent of the population in the "other" race/ethnicity category (AOR: 0.88; 95% CI: 0.85, 0.91). We identified locations where HCV clusters were a concern, and where enhanced HCV prevention, treatment, and care can help combat the HCV epidemic in Massachusetts. GIS, spatial epidemiological and statistical analyses provided a rigorous approach to identify hotspot clusters of disease, which can inform public health policy and intervention targeting. Further studies that incorporate spatiotemporal cluster analyses, Bayesian spatial and geostatistical models, spatially weighted regression analyses, and assessment of associations between HCV clustering and the built environment are needed to expand upon our combined spatial epidemiological and statistical methods.

  10. Arkansas StreamStats: a U.S. Geological Survey web map application for basin characteristics and streamflow statistics

    USGS Publications Warehouse

    Pugh, Aaron L.

    2014-01-01

    Users of streamflow information often require streamflow statistics and basin characteristics at various locations along a stream. The USGS periodically calculates and publishes streamflow statistics and basin characteristics for streamflow-gaging stations and partial-record stations, but these data commonly are scattered among many reports that may or may not be readily available to the public. The USGS also provides and periodically updates regional analyses of streamflow statistics that include regression equations and other prediction methods for estimating statistics for ungaged and unregulated streams across the State. Use of these regional predictions for a stream can be complex and often requires the user to determine a number of basin characteristics that may require interpretation. Basin characteristics may include drainage area, classifiers for physical properties, climatic characteristics, and other inputs. Obtaining these input values for gaged and ungaged locations traditionally has been time consuming, subjective, and can lead to inconsistent results.

  11. Approach for Input Uncertainty Propagation and Robust Design in CFD Using Sensitivity Derivatives

    NASA Technical Reports Server (NTRS)

    Putko, Michele M.; Taylor, Arthur C., III; Newman, Perry A.; Green, Lawrence L.

    2002-01-01

    An implementation of the approximate statistical moment method for uncertainty propagation and robust optimization for quasi 3-D Euler CFD code is presented. Given uncertainties in statistically independent, random, normally distributed input variables, first- and second-order statistical moment procedures are performed to approximate the uncertainty in the CFD output. Efficient calculation of both first- and second-order sensitivity derivatives is required. In order to assess the validity of the approximations, these moments are compared with statistical moments generated through Monte Carlo simulations. The uncertainties in the CFD input variables are also incorporated into a robust optimization procedure. For this optimization, statistical moments involving first-order sensitivity derivatives appear in the objective function and system constraints. Second-order sensitivity derivatives are used in a gradient-based search to successfully execute a robust optimization. The approximate methods used throughout the analyses are found to be valid when considering robustness about input parameter mean values.
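
    A hedged illustration of the first-order statistical moment method described above: with independent normal inputs and sensitivity derivatives dF/dx_i, the output variance is approximated by the sum of (dF/dx_i)^2 * sigma_i^2. The function below is a stand-in for the CFD output, not the Euler code itself, and the Monte Carlo check mirrors the validation step mentioned in the abstract.

    ```python
    # First-order moment propagation via sensitivity derivatives, with a
    # Monte Carlo comparison (placeholder output function, invented inputs).
    import numpy as np

    def output(x):
        """Placeholder output quantity as a function of the input variables."""
        mach, alpha = x
        return mach**2 + 0.5 * np.sin(alpha)

    mean = np.array([0.8, 0.05])           # input means
    sigma = np.array([0.01, 0.002])        # input standard deviations

    eps = 1e-6                              # central-difference sensitivities
    grad = np.array([(output(mean + eps * np.eye(2)[i]) -
                      output(mean - eps * np.eye(2)[i])) / (2 * eps) for i in range(2)])

    mu_f = output(mean)                               # first-order mean
    std_f = np.sqrt(np.sum(grad**2 * sigma**2))       # first-order standard deviation
    print("moment method mean/std:", mu_f, std_f)

    rng = np.random.default_rng(3)                    # Monte Carlo check
    samples = rng.normal(mean, sigma, size=(100_000, 2))
    mc = output(samples.T)                            # vectorised evaluation
    print("Monte Carlo   mean/std:", mc.mean(), mc.std())
    ```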

  12. Sc2O@Cs(126339)-C92: Di-scandium oxide cluster encapsulated into a large fullerene cage

    NASA Astrophysics Data System (ADS)

    Gu, Yong-Xin; Li, Qiao-Zhi; Li, De-Huai; Zhao, Rui-Sheng; Zhao, Xiang

    2018-04-01

    The geometric structure, electronic structure and thermodynamic stability of Sc2O@C92 have been characterized using hybrid density functional theory calculations combined with statistical thermodynamic analyses. Results indicate that the isolated pentagon rule (IPR) isomers Sc2O@Cs(126339)-C92, Sc2O@C1(126367)-C92 and Sc2O@C1(126390)-C92 are favorable. Notably, this is the first report that the fullerene isomer Cs(126339)-C92 can be considered a suitable cage for encapsulating a metallic cluster. The electronic properties of these three isomers were investigated through frontier molecular orbital (HOMO and LUMO) analyses and bond order calculations. Finally, 13C NMR and UV-vis-NIR spectra were simulated to provide valuable information for future experiments.

  13. Atmospheric water vapour over oceans from SSM/I measurements

    NASA Technical Reports Server (NTRS)

    Schluessel, Peter; Emery, William J.

    1990-01-01

    A statistical retrieval technique is developed to derive the atmospheric water vapor column content from the Special Sensor Microwave/Imager (SSM/I) measurements. The radiometer signals are simulated by means of radiative-transfer calculations for a large set of atmospheric/oceanic situations. These simulated responses are subsequently summarized by multivariate analyses, giving water-vapor coefficients and error estimates. Radiative-transfer calculations show that the SSM/I microwave imager can detect atmospheric water vapor structures with an accuracy from 0.145 to 0.17 g/sq cm. The accuracy of the method is confirmed by globally distributed match-ups with radiosonde measurements.

  14. Data preparation techniques for a perinatal psychiatric study based on linked data.

    PubMed

    Xu, Fenglian; Hilder, Lisa; Austin, Marie-Paule; Sullivan, Elizabeth A

    2012-06-08

    In recent years there has been an increase in the use of population-based linked data. However, there is little literature that describes the method of linked data preparation. This paper describes the method for merging data, calculating the statistical variable (SV), recoding psychiatric diagnoses and summarizing hospital admissions for a perinatal psychiatric study. The data preparation techniques described in this paper are based on linked birth data from the New South Wales (NSW) Midwives Data Collection (MDC), the Register of Congenital Conditions (RCC), the Admitted Patient Data Collection (APDC) and the Pharmaceutical Drugs of Addiction System (PHDAS). The master dataset is the meaningfully linked dataset that includes all, or the major, study data collections. The master dataset can be used to improve data quality and to calculate the SV, and it can be tailored for different analyses. To identify hospital admissions in the periods before pregnancy, during pregnancy and after birth, a statistical variable of time interval (SVTI) needs to be calculated. The methods and SPSS syntax for building a master dataset, calculating the SVTI, recoding the principal diagnoses of mental illness and summarizing hospital admissions are described. Linked data preparation, including building the master dataset and calculating the SV, can improve data quality and enhance data function.

  15. Statistical analyses and characteristics of volcanic tremor on Stromboli Volcano (Italy)

    NASA Astrophysics Data System (ADS)

    Falsaperla, S.; Langer, H.; Spampinato, S.

    A study of volcanic tremor on Stromboli is carried out on the basis of data recorded daily between 1993 and 1995 by a permanent seismic station (STR) located 1.8km away from the active craters. We also consider the signal of a second station (TF1), which operated for a shorter time span. Changes in the spectral tremor characteristics can be related to modifications in volcanic activity, particularly to lava effusions and explosive sequences. Statistical analyses were carried out on a set of spectra calculated daily from seismic signals where explosion quakes were present or excluded. Principal component analysis and cluster analysis were applied to identify different classes of spectra. Three clusters of spectra are associated with two different states of volcanic activity. One cluster corresponds to a state of low to moderate activity, whereas the two other clusters are present during phases with a high magma column as inferred from the occurrence of lava fountains or effusions. We therefore conclude that variations in volcanic activity at Stromboli are usually linked to changes in the spectral characteristics of volcanic tremor. Site effects are evident when comparing the spectra calculated from signals synchronously recorded at STR and TF1. However, some major spectral peaks at both stations may reflect source properties. Statistical considerations and polarization analysis are in favor of a prevailing presence of P-waves in the tremor signal along with a position of the source northwest of the craters and at shallow depth.
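
    A sketch under stated assumptions, loosely following the workflow above: daily tremor spectra form the rows of a matrix, principal component analysis reduces them to a few scores, and k-means groups the days into spectral classes. The data are simulated and scikit-learn is used here for convenience; the original analysis may have used different software and clustering choices.

    ```python
    # PCA plus clustering of simulated daily spectra into spectral classes.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(4)
    freqs = np.linspace(0.1, 10.0, 256)
    # Two simulated activity states with different dominant spectral peaks.
    state = rng.integers(0, 2, size=300)
    peaks = np.where(state[:, None] == 0,
                     np.exp(-(freqs - 2.0) ** 2),
                     np.exp(-(freqs - 4.0) ** 2))
    spectra = peaks + 0.1 * rng.normal(size=(300, freqs.size))

    scores = PCA(n_components=3).fit_transform(spectra)     # reduce each day to 3 scores
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
    print("days per cluster:", np.bincount(labels))
    ```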

  16. Selected quality assurance data for water samples collected by the US Geological Survey, Idaho National Engineering Laboratory, Idaho, 1980 to 1988

    USGS Publications Warehouse

    Wegner, S.J.

    1989-01-01

    Multiple water samples from 115 wells and 3 surface water sites were collected between 1980 and 1988 for the ongoing quality assurance program at the Idaho National Engineering Laboratory. The reported results from the six laboratories involved were analyzed for agreement using descriptive statistics. The constituents and properties included: tritium, plutonium-238, plutonium-239, -240 (undivided), strontium-90, americium-241, cesium-137, total dissolved chromium, selected dissolved trace metals, sodium, chloride, nitrate, selected purgeable organic compounds, and specific conductance. Agreement could not be calculated for purgeable organic compounds, trace metals, some nitrates and blank sample analyses because analytical uncertainties were not consistently reported. However, differences between results for most of these data were calculated. The blank samples were not analyzed for differences. The laboratory results analyzed using descriptive statistics showed a median agreement between all useable data pairs of 95%. (USGS)

  17. Program for Weibull Analysis of Fatigue Data

    NASA Technical Reports Server (NTRS)

    Krantz, Timothy L.

    2005-01-01

    A Fortran computer program has been written for performing statistical analyses of fatigue-test data that are assumed to be adequately represented by a two-parameter Weibull distribution. This program calculates the following: (1) Maximum-likelihood estimates of the Weibull-distribution parameters; (2) Data for contour plots of relative likelihood for two parameters; (3) Data for contour plots of joint confidence regions; (4) Data for the profile likelihood of the Weibull-distribution parameters; (5) Data for the profile likelihood of any percentile of the distribution; and (6) Likelihood-based confidence intervals for parameters and/or percentiles of the distribution. The program can account for tests that are suspended without failure (the statistical term for such suspension of tests is "censoring"). The analytical approach followed in this program is valid for type-I censoring, which is the removal of unfailed units at pre-specified times. Confidence regions and intervals are calculated by use of the likelihood-ratio method.
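
    A hedged sketch of the core computation, using Python/scipy rather than the Fortran program: maximum-likelihood estimation of the two-parameter Weibull shape and scale from fatigue lives with type-I (time-terminated) censoring. The fatigue lives and censoring time are simulated.

    ```python
    # Weibull MLE with type-I censored fatigue data (simulated lives in cycles).
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import weibull_min

    rng = np.random.default_rng(5)
    true_shape, true_scale, t_censor = 2.0, 1.0e6, 1.5e6
    life = weibull_min.rvs(true_shape, scale=true_scale, size=40, random_state=rng)
    failed = life <= t_censor                      # suspended tests are censored
    obs = np.where(failed, life, t_censor)

    def neg_log_lik(params):
        shape, scale = np.exp(params)              # keep parameters positive
        ll = np.sum(weibull_min.logpdf(obs[failed], shape, scale=scale))   # failures
        ll += np.sum(weibull_min.logsf(obs[~failed], shape, scale=scale))  # suspensions
        return -ll

    fit = minimize(neg_log_lik, x0=np.log([1.5, 1.0e6]), method="Nelder-Mead")
    shape_hat, scale_hat = np.exp(fit.x)
    print("MLE shape, scale:", shape_hat, scale_hat)
    ```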

  18. Application of a planetary wave breaking parameterization to stratospheric circulation statistics

    NASA Technical Reports Server (NTRS)

    Randel, William J.; Garcia, Rolando R.

    1994-01-01

    The planetary wave parameterization scheme developed recently by Garcia is applied to stratospheric circulation statistics derived from 12 years of National Meteorological Center operational stratospheric analyses. From the data a planetary wave breaking criterion (based on the ratio of the eddy to zonal mean meridional potential vorticity (PV) gradients), a wave damping rate, and a meridional diffusion coefficient are calculated. The equatorward flank of the polar night jet during winter is identified as a wave breaking region from the observed PV gradients; the region moves poleward with season, covering all high latitudes in spring. Derived damping rates maximize in the subtropical upper stratosphere (the 'surf zone'), with damping time scales of 3-4 days. Maximum diffusion coefficients follow the spatial patterns of the wave breaking criterion, with magnitudes comparable to prior published estimates. Overall, the observed results agree well with the parameterized calculations of Garcia.

  19. Active control of aerothermoelastic effects for a conceptual hypersonic aircraft

    NASA Technical Reports Server (NTRS)

    Heeg, Jennifer; Gilbert, Michael G.; Pototzky, Anthony S.

    1990-01-01

    This paper describes the procedures for and results of aeroservothermoelastic studies. The objectives of these studies were to develop the necessary procedures for performing an aeroelastic analysis of an aerodynamically heated vehicle and to analyze a configuration in the classical 'cold' state and in a 'hot' state. Major tasks include the development of the structural and aerodynamic models, open loop analyses, design of active control laws for improving dynamic responses and analyses of the closed loop vehicles. The analyses performed focused on flutter speed calculations, short period eigenvalue trends and statistical analyses of the vehicle response to controls and turbulence. Improving the ride quality of the vehicle and raising the flutter boundary of the aerodynamically-heated vehicle up to that of the cold vehicle were the objectives of the control law design investigations.

  20. A graphical user interface (GUI) toolkit for the calculation of three-dimensional (3D) multi-phase biological effective dose (BED) distributions including statistical analyses.

    PubMed

    Kauweloa, Kevin I; Gutierrez, Alonso N; Stathakis, Sotirios; Papanikolaou, Niko; Mavroidis, Panayiotis

    2016-07-01

    A toolkit has been developed for calculating the 3-dimensional biological effective dose (BED) distributions in multi-phase, external beam radiotherapy treatments such as those applied in liver stereotactic body radiation therapy (SBRT) and in multi-prescription treatments. This toolkit also provides a wide range of statistical results related to dose and BED distributions. MATLAB 2010a, version 7.10 was used to create this GUI toolkit. The input data consist of the dose distribution matrices, organ contour coordinates, and treatment planning parameters from the treatment planning system (TPS). The toolkit has the capability of calculating the multi-phase BED distributions using different formulas (denoted as true and approximate). Following the calculations of the BED distributions, the dose and BED distributions can be viewed in different projections (e.g. coronal, sagittal and transverse). The different elements of this toolkit are presented and the important steps for the execution of its calculations are illustrated. The toolkit is applied on brain, head & neck and prostate cancer patients, who received primary and boost phases in order to demonstrate its capability in calculating BED distributions, as well as measuring the inaccuracy and imprecision of the approximate BED distributions. Finally, the clinical situations in which the use of the present toolkit would have a significant clinical impact are indicated. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
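
    A minimal sketch of the underlying dose calculation (not the GUI toolkit itself): a voxel-wise multi-phase biological effective dose computed with the standard per-phase linear-quadratic formula, BED = sum over phases of n_i * d_i * (1 + d_i / (alpha/beta)). The abstract distinguishes "true" and "approximate" multi-phase formulas, which may differ from this simple per-phase sum; the dose grid and fractionation below are invented.

    ```python
    # Voxel-wise multi-phase BED on a tiny dose grid (hypothetical plan).
    import numpy as np

    def bed(dose_per_phase, fractions_per_phase, alpha_beta):
        """dose_per_phase: list of per-fraction dose matrices (Gy), one per phase."""
        total = np.zeros_like(dose_per_phase[0], dtype=float)
        for d, n in zip(dose_per_phase, fractions_per_phase):
            total += n * d * (1.0 + d / alpha_beta)
        return total

    primary = np.array([[2.0, 1.8], [1.9, 2.0]])   # Gy per fraction, phase 1
    boost = np.array([[2.5, 2.2], [2.4, 2.5]])     # Gy per fraction, phase 2
    print(bed([primary, boost], fractions_per_phase=[25, 5], alpha_beta=3.0))
    ```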

  1. Can effective teaching and learning strategies help student nurses to retain drug calculation skills?

    PubMed

    Wright, Kerri

    2008-10-01

    Student nurses need to develop and retain drug calculation skills in order to accurately calculate drug dosages in clinical practice. If student nurses are to qualify and be fit to practise accurate drug calculation skills, then educational strategies need to not only show that the skills of student nurses have improved but that these skills have been retained over a period of time. A quasi-experimental approach was used to test the effectiveness of a range of strategies in improving retention of drug calculation skills. The results from an IV additive drug calculation test were used to compare the drug calculation skills of student nurses between two groups of students who had received different approaches to teaching drug calculation skills. The sample group received specific teaching and learning strategies in relation to drug calculation skills and the second group received only lectures on drug calculation skills. All test results for students were anonymous. The results from the test for both groups were statistically analysed using the Mann-Whitney test to ascertain whether the range of strategies improved the results for the IV additive test. The results were further analysed and compared to ascertain the types and numbers of errors made in each of the sample groups. The results showed that there is a highly significant difference between the two samples using a two-tailed test (U=39.5, p<0.001). The strategies implemented therefore did make a difference to the retention of drug calculation skills in the students in the intervention group. Further research is required into the retention of drug calculation skills by students and nurses, but there does appear to be evidence to suggest that sound teaching and learning strategies do result in better retention of drug calculation skills.
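
    A hedged sketch of the reported comparison: a two-sided Mann-Whitney U test on two groups of test scores. The scores below are invented; only the choice of test mirrors the study.

    ```python
    # Mann-Whitney U test comparing two independent groups of test scores.
    from scipy.stats import mannwhitneyu

    intervention = [18, 20, 19, 17, 20, 16, 19, 18, 20, 17]   # hypothetical scores
    lecture_only = [14, 12, 16, 13, 15, 11, 14, 13, 12, 15]

    u, p = mannwhitneyu(intervention, lecture_only, alternative="two-sided")
    print(f"U = {u}, p = {p:.4f}")
    ```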

  2. Pediatric patient safety events during hospitalization: approaches to accounting for institution-level effects.

    PubMed

    Slonim, Anthony D; Marcin, James P; Turenne, Wendy; Hall, Matt; Joseph, Jill G

    2007-12-01

    To determine the rates, patient, and institutional characteristics associated with the occurrence of patient safety indicators (PSIs) in hospitalized children and the degree of statistical difference derived from using three approaches of controlling for institution-level effects. Pediatric Health Information System Dataset consisting of all pediatric discharges (<21 years of age) from 34 academic, freestanding children's hospitals for calendar year 2003. The rates of PSIs were computed for all discharges. The patient and institutional characteristics associated with these PSIs were calculated. The analyses sequentially applied three increasingly conservative methods to control for the institution-level effects: robust standard error estimation, a fixed effects model, and a random effects model. The degree of difference from a "base state," which excluded institution-level variables, and between the models was calculated. The effects of these analyses on the interpretation of the PSIs are presented. PSIs are relatively infrequent events in hospitalized children ranging from 0 per 10,000 (postoperative hip fracture) to 87 per 10,000 (postoperative respiratory failure). Significant variables associated with PSIs included age (neonates), race (Caucasians), payor status (public insurance), severity of illness (extreme), and hospital size (>300 beds), which all had higher rates of PSIs than their reference groups in the bivariable logistic regression results. The three different approaches of adjusting for institution-level effects demonstrated that there were similarities in both the clinical and statistical significance across each of the models. Institution-level effects can be appropriately controlled for by using a variety of methods in the analyses of administrative data. Whenever possible, resource-conservative methods should be used in the analyses, especially if clinical implications are minimal.

  3. Association between sleep difficulties as well as duration and hypertension: is BMI a mediator?

    PubMed

    Carrillo-Larco, R M; Bernabe-Ortiz, A; Sacksteder, K A; Diez-Canseco, F; Cárdenas, M K; Gilman, R H; Miranda, J J

    2017-01-01

    Sleep difficulties and short sleep duration have been associated with hypertension. Though body mass index (BMI) may be a mediator variable, the mediation effect has not been defined. We aimed to assess the association between sleep duration and sleep difficulties with hypertension, to determine if BMI is a mediator variable, and to quantify the mediation effect. We conducted a mediation analysis and calculated prevalence ratios with 95% confidence intervals. The exposure variables were sleep duration and sleep difficulties, and the outcome was hypertension. Sleep difficulties were statistically significantly associated with a 43% higher prevalence of hypertension in multivariable analyses; results were not statistically significant for sleep duration. In these analyses, and in sex-specific subgroup analyses, we found no strong evidence that BMI mediated the association between sleep indices and risk of hypertension. Our findings suggest that BMI does not appear to mediate the association between sleep patterns and hypertension. These results highlight the need to further study the mechanisms underlying the relationship between sleep patterns and cardiovascular risk factors.

  4. Automated brain volumetrics in multiple sclerosis: a step closer to clinical application

    PubMed Central

    Beadnall, H N; Hatton, S N; Bader, G; Tomic, D; Silva, D G

    2016-01-01

    Background Whole brain volume (WBV) estimates in patients with multiple sclerosis (MS) correlate more robustly with clinical disability than traditional, lesion-based metrics. Numerous algorithms to measure WBV have been developed over the past two decades. We compare Structural Image Evaluation using Normalisation of Atrophy-Cross-sectional (SIENAX) to NeuroQuant and MSmetrix, for assessment of cross-sectional WBV in patients with MS. Methods MRIs from 61 patients with relapsing-remitting MS and 2 patients with clinically isolated syndrome were analysed. WBV measurements were calculated using SIENAX, NeuroQuant and MSmetrix. Statistical agreement between the methods was evaluated using linear regression and Bland-Altman plots. Precision and accuracy of WBV measurement was calculated for (1) NeuroQuant versus SIENAX and (2) MSmetrix versus SIENAX. Results Precision (Pearson's r) of WBV estimation for NeuroQuant and MSmetrix versus SIENAX was 0.983 and 0.992, respectively. Accuracy (Cb) was 0.871 and 0.994, respectively. NeuroQuant and MSmetrix showed a 5.5% and 1.0% volume difference compared with SIENAX, respectively, that was consistent across low and high values. Conclusions In the analysed population, NeuroQuant and MSmetrix both quantified cross-sectional WBV with comparable statistical agreement to SIENAX, a well-validated cross-sectional tool that has been used extensively in MS clinical studies. PMID:27071647

  5. Sample size considerations when groups are the appropriate unit of analyses

    PubMed Central

    Sadler, Georgia Robins; Ko, Celine Marie; Alisangco, Jennifer; Rosbrook, Bradley P.; Miller, Eric; Fullerton, Judith

    2007-01-01

    This paper discusses issues to be considered by nurse researchers when groups should be used as a unit of randomization. Advantages and disadvantages are presented, with statistical calculations needed to determine effective sample size. Examples of these concepts are presented using data from the Black Cosmetologists Promoting Health Program. Different hypothetical scenarios and their impact on sample size are presented. Given the complexity of calculating sample size when using groups as a unit of randomization, it’s advantageous for researchers to work closely with statisticians when designing and implementing studies that anticipate the use of groups as the unit of randomization. PMID:17693219
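
    A minimal sketch of the group-randomization adjustment the authors discuss: an individually randomized sample size is inflated by the design effect, DEFF = 1 + (m - 1) * ICC, where m is the average group size and ICC is the intraclass correlation. The numbers below are hypothetical.

    ```python
    # Design-effect inflation of sample size when groups are the unit of randomization.
    import math

    def clusters_needed(n_individual, group_size, icc):
        deff = 1.0 + (group_size - 1) * icc          # design effect
        n_total = n_individual * deff                # inflated number of participants
        return deff, math.ceil(n_total / group_size) # groups required

    deff, k = clusters_needed(n_individual=400, group_size=20, icc=0.05)
    print(f"design effect = {deff:.2f}, groups needed per arm = {k}")
    ```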

  6. An audit of the statistics and the comparison with the parameter in the population

    NASA Astrophysics Data System (ADS)

    Bujang, Mohamad Adam; Sa'at, Nadiah; Joys, A. Reena; Ali, Mariana Mohamad

    2015-10-01

    The sample size needed to closely estimate the statistics for particular parameters has long been an issue. Although a sample size may have been calculated with reference to the objective of a study, it is difficult to confirm whether the resulting statistics are close to the parameters of a particular population. Meanwhile, the guideline of a p-value less than 0.05 is widely used as inferential evidence. Therefore, this study audited results analysed from various subsamples and statistical analyses and compared them with the parameters in three different populations. Eight types of statistical analysis and eight subsamples for each analysis were examined. The statistics were found to be consistent and close to the parameters when the sample covered at least 15% to 35% of the population. Larger sample sizes are needed to estimate parameters involving categorical variables than those involving numerical variables. Sample sizes of 300 to 500 are sufficient to estimate the parameters of a medium-sized population.

  7. Critical analysis of adsorption data statistically

    NASA Astrophysics Data System (ADS)

    Kaushal, Achla; Singh, S. K.

    2017-10-01

    Experimental data can be presented, computed, and critically analysed in different ways using statistics. A variety of statistical tests are used to make decisions about the significance and validity of experimental data. In the present study, adsorption was carried out to remove zinc ions from a contaminated aqueous solution using mango leaf powder. The experimental data were analysed statistically by hypothesis testing, applying the t test, paired t test and Chi-square test to (a) test the optimum value of the process pH, (b) verify the success of the experiment and (c) study the effect of adsorbent dose on zinc ion removal from aqueous solutions. Comparison of calculated and tabulated values of t and χ2 showed the results to be in favour of the data collected from the experiment, and this has been shown on probability charts. The K value for the Langmuir isotherm was 0.8582 and the m value obtained for the Freundlich adsorption isotherm was 0.725; both are <1, indicating favourable isotherms. Karl Pearson's correlation coefficient values for the Langmuir and Freundlich adsorption isotherms were 0.99 and 0.95 respectively, showing a high degree of correlation between the variables. This validates the data obtained for adsorption of zinc ions from the contaminated aqueous solution with the help of mango leaf powder.

  8. Polygenic scores via penalized regression on summary statistics.

    PubMed

    Mak, Timothy Shin Heng; Porsch, Robert Milan; Choi, Shing Wan; Zhou, Xueya; Sham, Pak Chung

    2017-09-01

    Polygenic scores (PGS) summarize the genetic contribution of a person's genotype to a disease or phenotype. They can be used to group participants into different risk categories for diseases, and are also used as covariates in epidemiological analyses. A number of possible ways of calculating PGS have been proposed, and recently there is much interest in methods that incorporate information available in published summary statistics. As there is no inherent information on linkage disequilibrium (LD) in summary statistics, a pertinent question is how we can use LD information available elsewhere to supplement such analyses. To answer this question, we propose a method for constructing PGS using summary statistics and a reference panel in a penalized regression framework, which we call lassosum. We also propose a general method for choosing the value of the tuning parameter in the absence of validation data. In our simulations, we showed that pseudovalidation often resulted in prediction accuracy that is comparable to using a dataset with validation phenotype and was clearly superior to the conservative option of setting the tuning parameter of lassosum to its lowest value. We also showed that lassosum achieved better prediction accuracy than simple clumping and P-value thresholding in almost all scenarios. It was also substantially faster and more accurate than the recently proposed LDpred. © 2017 WILEY PERIODICALS, INC.

  9. Approach for Uncertainty Propagation and Robust Design in CFD Using Sensitivity Derivatives

    NASA Technical Reports Server (NTRS)

    Putko, Michele M.; Newman, Perry A.; Taylor, Arthur C., III; Green, Lawrence L.

    2001-01-01

    This paper presents an implementation of the approximate statistical moment method for uncertainty propagation and robust optimization for a quasi 1-D Euler CFD (computational fluid dynamics) code. Given uncertainties in statistically independent, random, normally distributed input variables, a first- and second-order statistical moment matching procedure is performed to approximate the uncertainty in the CFD output. Efficient calculation of both first- and second-order sensitivity derivatives is required. In order to assess the validity of the approximations, the moments are compared with statistical moments generated through Monte Carlo simulations. The uncertainties in the CFD input variables are also incorporated into a robust optimization procedure. For this optimization, statistical moments involving first-order sensitivity derivatives appear in the objective function and system constraints. Second-order sensitivity derivatives are used in a gradient-based search to successfully execute a robust optimization. The approximate methods used throughout the analyses are found to be valid when considering robustness about input parameter mean values.

  10. Performance of statistical process control methods for regional surgical site infection surveillance: a 10-year multicentre pilot study.

    PubMed

    Baker, Arthur W; Haridy, Salah; Salem, Joseph; Ilieş, Iulian; Ergai, Awatef O; Samareh, Aven; Andrianas, Nicholas; Benneyan, James C; Sexton, Daniel J; Anderson, Deverick J

    2017-11-24

    Traditional strategies for surveillance of surgical site infections (SSI) have multiple limitations, including delayed and incomplete outbreak detection. Statistical process control (SPC) methods address these deficiencies by combining longitudinal analysis with graphical presentation of data. We performed a pilot study within a large network of community hospitals to evaluate performance of SPC methods for detecting SSI outbreaks. We applied conventional Shewhart and exponentially weighted moving average (EWMA) SPC charts to 10 previously investigated SSI outbreaks that occurred from 2003 to 2013. We compared the results of SPC surveillance to the results of traditional SSI surveillance methods. Then, we analysed the performance of modified SPC charts constructed with different outbreak detection rules, EWMA smoothing factors and baseline SSI rate calculations. Conventional Shewhart and EWMA SPC charts both detected 8 of the 10 SSI outbreaks analysed, in each case prior to the date of traditional detection. Among detected outbreaks, conventional Shewhart chart detection occurred a median of 12 months prior to outbreak onset and 22 months prior to traditional detection. Conventional EWMA chart detection occurred a median of 7 months prior to outbreak onset and 14 months prior to traditional detection. Modified Shewhart and EWMA charts additionally detected several outbreaks earlier than conventional SPC charts. Shewhart and EWMA SPC charts had low false-positive rates when used to analyse separate control hospital SSI data. Our findings illustrate the potential usefulness and feasibility of real-time SPC surveillance of SSI to rapidly identify outbreaks and improve patient safety. Further study is needed to optimise SPC chart selection and calculation, statistical outbreak detection rules and the process for reacting to signals of potential outbreaks. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
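
    A hedged sketch of a generic EWMA chart for monthly SSI proportions on simulated data; the study's chart construction, detection rules and baseline calculations may differ.

    ```python
    # EWMA control chart flagging a simulated rise in monthly SSI rates.
    import numpy as np

    rng = np.random.default_rng(6)
    procedures = rng.integers(80, 120, size=36)                  # monthly volume
    baseline_rate = 0.02
    rates = rng.binomial(procedures, baseline_rate) / procedures
    rates[30:] = rng.binomial(procedures[30:], 0.06) / procedures[30:]   # simulated outbreak

    lam = 0.2                                                    # EWMA smoothing factor
    sigma = np.sqrt(baseline_rate * (1 - baseline_rate) / procedures)
    z = np.empty_like(rates)
    z[0] = baseline_rate
    for i in range(1, rates.size):
        z[i] = lam * rates[i] + (1 - lam) * z[i - 1]             # exponentially weighted mean

    ucl = baseline_rate + 3 * sigma * np.sqrt(lam / (2 - lam))   # asymptotic upper limit
    signals = np.flatnonzero(z > ucl)
    print("first signal at month:", signals[0] if signals.size else None)
    ```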

  11. Effect Size Analyses of Souvenaid in Patients with Alzheimer's Disease.

    PubMed

    Cummings, Jeffrey; Scheltens, Philip; McKeith, Ian; Blesa, Rafael; Harrison, John E; Bertolucci, Paulo H F; Rockwood, Kenneth; Wilkinson, David; Wijker, Wouter; Bennett, David A; Shah, Raj C

    2017-01-01

    Souvenaid® (uridine monophosphate, docosahexaenoic acid, eicosapentaenoic acid, choline, phospholipids, folic acid, vitamins B12, B6, C, and E, and selenium), was developed to support the formation and function of neuronal membranes. To determine effect sizes observed in clinical trials of Souvenaid and to calculate the number needed to treat to show benefit or harm. Data from all three reported randomized controlled trials of Souvenaid in Alzheimer's disease (AD) dementia (Souvenir I, Souvenir II, and S-Connect) and an open-label extension study were included in analyses of effect size for cognitive, functional, and behavioral outcomes. Effect size was determined by calculating Cohen's d statistic (or Cramér's V method for nominal data), number needed to treat and number needed to harm. Statistical calculations were performed for the intent-to-treat populations. In patients with mild AD, effect sizes were 0.21 (95% confidence intervals: -0.06, 0.49) for the primary outcome in Souvenir II (neuropsychological test battery memory z-score) and 0.20 (0.10, 0.34) for the co-primary outcome of Souvenir I (Wechsler memory scale delayed recall). No effect was shown on cognition in patients with mild-to-moderate AD (S-Connect). The number needed to treat (6 and 21 for Souvenir I and II, respectively) and high number needed to harm values indicate a favorable harm:benefit ratio for Souvenaid versus control in patients with mild AD. The favorable safety profile and impact on outcome measures converge to corroborate the putative mode of action and demonstrate that Souvenaid can achieve clinically detectable effects in patients with early AD.
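
    A minimal sketch of the effect-size metrics named above: Cohen's d from group means and standard deviations, and a number needed to treat from response proportions. All numbers are invented for illustration and are not the trial results.

    ```python
    # Cohen's d (pooled SD) and number needed to treat from risk difference.
    import math

    def cohens_d(m1, s1, n1, m2, s2, n2):
        pooled_sd = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
        return (m1 - m2) / pooled_sd

    def nnt(p_treated, p_control):
        return 1.0 / abs(p_treated - p_control)     # inverse of absolute risk difference

    print("d   =", round(cohens_d(0.25, 1.0, 150, 0.05, 1.0, 150), 2))
    print("NNT =", round(nnt(0.40, 0.25), 1))
    ```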

  12. Angular Baryon Acoustic Oscillation measure at z=2.225 from the SDSS quasar survey

    NASA Astrophysics Data System (ADS)

    de Carvalho, E.; Bernui, A.; Carvalho, G. C.; Novaes, C. P.; Xavier, H. S.

    2018-04-01

    Following a quasi model-independent approach we measure the transversal BAO mode at high redshift using the two-point angular correlation function (2PACF). The analyses done here are only possible now with the quasar catalogue from the twelfth data release (DR12Q) from the Sloan Digital Sky Survey, because it is spatially dense enough to allow the measurement of the angular BAO signature with moderate statistical significance and acceptable precision. Our analyses with quasars in the redshift interval z in [2.20,2.25] produce the angular BAO scale θBAO = 1.77° ± 0.31° with a statistical significance of 2.12 σ (i.e., 97% confidence level), calculated through a likelihood analysis performed using the theoretical covariance matrix sourced by the analytical power spectra expected in the ΛCDM concordance model. Additionally, we show that the BAO signal is robust—although with less statistical significance—under diverse bin-size choices and under small displacements of the quasars' angular coordinates. Finally, we also performed cosmological parameter analyses comparing the θBAO predictions for wCDM and w(a)CDM models with angular BAO data available in the literature, including the measurement obtained here, jointly with CMB data. The constraints on the parameters ΩM, w0 and wa are in excellent agreement with the ΛCDM concordance model.

  13. An evaluation system for electronic retrospective analyses in radiation oncology: implemented exemplarily for pancreatic cancer

    NASA Astrophysics Data System (ADS)

    Kessel, Kerstin A.; Jäger, Andreas; Bohn, Christian; Habermehl, Daniel; Zhang, Lanlan; Engelmann, Uwe; Bougatf, Nina; Bendl, Rolf; Debus, Jürgen; Combs, Stephanie E.

    2013-03-01

    To date, conducting retrospective clinical analyses is rather difficult and time consuming. Especially in radiation oncology, handling voluminous datasets from various information systems and different documentation styles efficiently is crucial for patient care and research. With the example of patients with pancreatic cancer treated with radio-chemotherapy, we performed a therapy evaluation by using analysis tools connected with a documentation system. A total of 783 patients have been documented in a professional, web-based documentation system. Information about radiation therapy, diagnostic images and dose distributions has been imported. For patients with disease progression after neoadjuvant chemoradiation, we designed and established an analysis workflow. After automatic registration of the radiation plans with the follow-up images, the recurrence volumes are segmented manually. Based on these volumes the DVH (dose-volume histogram) statistic is calculated, followed by the determination of the dose applied to the region of recurrence. All results are stored in the database and included in statistical calculations. The main goal of using an automatic evaluation system is to reduce time and effort conducting clinical analyses, especially with large patient groups. We have shown a first approach using some existing tools; however, manual interaction is still necessary. Further steps need to be taken to enhance automation. Already, it has become apparent that the benefits of digital data management and analysis lie in the central storage of data and reusability of the results. Therefore, we intend to adapt the evaluation system to other types of tumors in radiation oncology.

  14. Quantitative EEG analysis of the maturational changes associated with childhood absence epilepsy

    NASA Astrophysics Data System (ADS)

    Rosso, O. A.; Hyslop, W.; Gerlach, R.; Smith, R. L. L.; Rostas, J. A. P.; Hunter, M.

    2005-10-01

    This study aimed to examine the background electroencephalography (EEG) in children with childhood absence epilepsy, a condition whose presentation has strong developmental links. EEG hallmarks of absence seizure activity are widely accepted and there is recognition that the bulk of inter-ictal EEG in this group is normal to the naked eye. This multidisciplinary study aimed to use the normalized total wavelet entropy (NTWS) (Signal Processing 83 (2003) 1275) to examine the background EEG of those patients demonstrating absence seizure activity, and compare it with children without absence epilepsy. This calculation can be used to define the degree of order in a system, with higher levels of entropy indicating a more disordered (chaotic) system. Results were subjected to further statistical analyses of significance. Entropy values were calculated for patients versus controls. For all channels combined, patients with absence epilepsy showed (statistically significant) lower entropy values than controls. The size of the difference in entropy values was not uniform, with certain EEG electrodes consistently showing greater differences than others.
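
    A hedged sketch of a normalized total wavelet entropy computation for a single EEG epoch with PyWavelets. The wavelet, decomposition level and exact normalisation follow common practice and may differ from the cited method; the signal is a toy epoch, not patient data.

    ```python
    # Relative wavelet energies and normalized wavelet entropy of a toy EEG epoch.
    import numpy as np
    import pywt

    rng = np.random.default_rng(7)
    fs = 256
    t = np.arange(0, 4, 1 / fs)
    eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)   # toy 10 Hz epoch

    coeffs = pywt.wavedec(eeg, "db4", level=6)             # multiresolution bands
    energies = np.array([np.sum(c**2) for c in coeffs])
    p = energies / energies.sum()                          # relative wavelet energy
    entropy = -np.sum(p * np.log(p)) / np.log(p.size)      # normalized to [0, 1]
    print("normalized wavelet entropy:", round(float(entropy), 3))
    ```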

  15. Hood of the truck statistics for food animal practitioners.

    PubMed

    Slenning, Barrett D

    2006-03-01

    This article offers some tips on working with statistics and develops four relatively simple procedures to deal with most kinds of data with which veterinarians work. The criterion for a procedure to be a "Hood of the Truck Statistics" (HOT Stats) technique is that it must be simple enough to be done with pencil, paper, and a calculator. The goal of HOT Stats is to have the tools available to run quick analyses in only a few minutes so that decisions can be made in a timely fashion. The discipline allows us to move away from the all-too-common guess work about effects and differences we perceive following a change in treatment or management. The techniques allow us to move toward making more defensible, credible, and more quantifiably "risk-aware" real-time recommendations to our clients.

  16. Using Meta-analyses for Comparative Effectiveness Research

    PubMed Central

    Ruppar, Todd M.; Phillips, Lorraine J.; Chase, Jo-Ana D.

    2012-01-01

    Comparative effectiveness research seeks to identify the most effective interventions for particular patient populations. Meta-analysis is an especially valuable form of comparative effectiveness research because it emphasizes the magnitude of intervention effects rather than relying on tests of statistical significance among primary studies. Overall effects can be calculated for diverse clinical and patient-centered variables to determine the outcome patterns. Moderator analyses compare intervention characteristics among primary studies by determining if effect sizes vary among studies with different intervention characteristics. Intervention effectiveness can be linked to patient characteristics to provide evidence for patient-centered care. Moderator analyses often answer questions never posed by primary studies because neither multiple intervention characteristics nor populations are compared in single primary studies. Thus meta-analyses provide unique contributions to knowledge. Although meta-analysis is a powerful comparative effectiveness strategy, methodological challenges and limitations in primary research must be acknowledged to interpret findings. PMID:22789450

  17. Calculating solar photovoltaic potential on residential rooftops in Kailua Kona, Hawaii

    NASA Astrophysics Data System (ADS)

    Carl, Caroline

    As carbon-based fossil fuels become increasingly scarce, renewable energy sources are coming to the forefront of policy discussions around the globe. As a result, the State of Hawaii has implemented aggressive goals to achieve energy independence by 2030. Renewable electricity generation using solar photovoltaic technologies plays an important role in these efforts. This study utilizes geographic information systems (GIS) and Light Detection and Ranging (LiDAR) data with statistical analysis to identify how much solar photovoltaic potential exists for residential rooftops in the town of Kailua Kona on Hawaii Island. This study helps to quantify the magnitude of possible solar photovoltaic (PV) potential for Solar World SW260 monocrystalline panels on residential rooftops within the study area. Three main areas were addressed in the execution of this research: (1) modeling solar radiation, (2) estimating available rooftop area, and (3) calculating PV potential from incoming solar radiation. High resolution LiDAR data and Esri's solar modeling tools were utilized to calculate incoming solar radiation on a sample set of digitized rooftops. Photovoltaic potential for the sample set was then calculated with the equations developed by Suri et al. (2005). Sample set rooftops were analyzed using a statistical model to identify the correlation between rooftop area and lot size. Least squares multiple linear regression analysis was performed to identify the influence of slope, elevation, rooftop area, and lot size on the modeled PV potential values. The equations built from these statistical analyses of the sample set were applied to the entire study region to calculate total rooftop area and PV potential. The statistical analysis of the total study area estimates the photovoltaic electric energy generation potential of rooftops at approximately 190,000,000 kWh annually. This is approximately 17 percent of the total electricity the utility provided to the entire island in 2012. Based on these findings, full rooftop PV installations on the 4,460 study area homes could provide enough energy to power over 31,000 homes annually. The methods developed here suggest a means to calculate rooftop area and PV potential in a region with limited available data. The use of LiDAR point data offers a major opportunity for future research in both automating rooftop inventories and calculating incoming solar radiation and PV potential for homeowners.

  18. Environmental flow allocation and statistics calculator

    USGS Publications Warehouse

    Konrad, Christopher P.

    2011-01-01

    The Environmental Flow Allocation and Statistics Calculator (EFASC) is a computer program that calculates hydrologic statistics based on a time series of daily streamflow values. EFASC will calculate statistics for daily streamflow in an input file or will generate synthetic daily flow series from an input file based on rules for allocating and protecting streamflow and then calculate statistics for the synthetic time series. The program reads dates and daily streamflow values from input files. The program writes statistics out to a series of worksheets and text files. Multiple sites can be processed in series as one run. EFASC is written in Microsoft Visual Basic for Applications and implemented as a macro in Microsoft Office Excel 2007. EFASC is intended as a research tool for users familiar with computer programming. The code for EFASC is provided so that it can be modified for specific applications. All users should review how output statistics are calculated and recognize that the algorithms may not comply with conventions used to calculate streamflow statistics published by the U.S. Geological Survey.
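    As a hedged illustration of the kind of daily-flow statistics such a tool produces (not EFASC's own VBA algorithms), the following sketch reads a daily streamflow series and computes annual means and annual 7-day minimum flows; the file name and column names are assumptions.

```python
# Hedged illustration of daily streamflow statistics (not EFASC's algorithms).
# The file name and column names ("date", "flow_cfs") are assumptions.
import pandas as pd

flows = pd.read_csv("daily_flow.csv", parse_dates=["date"]).set_index("date")

annual_mean = flows["flow_cfs"].resample("YS").mean()      # mean annual flow
rolling_7day = flows["flow_cfs"].rolling(7).mean()         # 7-day moving average
annual_7day_min = rolling_7day.resample("YS").min()        # annual 7-day minimum flow

summary = pd.DataFrame({"mean_flow": annual_mean, "min_7day_flow": annual_7day_min})
summary.to_csv("flow_statistics.csv")
```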

  19. Ataxia Telangiectasia–Mutated Gene Polymorphisms and Acute Normal Tissue Injuries in Cancer Patients After Radiation Therapy: A Systematic Review and Meta-analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dong, Lihua; Cui, Jingkun; Tang, Fengjiao

    Purpose: Studies of the association between ataxia telangiectasia–mutated (ATM) gene polymorphisms and acute radiation injuries are often small in sample size, and the results are inconsistent. We conducted the first meta-analysis to provide a systematic review of published findings. Methods and Materials: Publications were identified by searching PubMed up to April 25, 2014. Primary meta-analysis was performed for all acute radiation injuries, and subgroup meta-analyses were based on clinical endpoint. The influence of sample size and radiation injury incidence on genetic effects was estimated in sensitivity analyses. Power calculations were also conducted. Results: The meta-analysis was conducted on the ATM polymorphism rs1801516, including 5 studies with 1588 participants. For all studies, the cut-off for differentiating cases from controls was grade 2 acute radiation injuries. The primary meta-analysis showed a significant association with overall acute radiation injuries (allelic model: odds ratio = 1.33, 95% confidence interval: 1.04-1.71). Subgroup analyses detected an association between the rs1801516 polymorphism and a significant increase in urinary and lower gastrointestinal injuries and an increase in skin injury that was not statistically significant. There was no between-study heterogeneity in any meta-analyses. In the sensitivity analyses, small studies did not show larger effects than large studies. In addition, studies with high incidence of acute radiation injuries showed larger effects than studies with low incidence. Power calculations revealed that the statistical power of the primary meta-analysis was borderline, whereas there was adequate power for the subgroup analysis of studies with high incidence of acute radiation injuries. Conclusions: Our meta-analysis showed a consistency of the results from the overall and subgroup analyses. We also showed that the genetic effect of the rs1801516 polymorphism on acute radiation injuries was dependent on the incidence of the injury. These support the evidence of an association between the rs1801516 polymorphism and acute radiation injuries, encouraging further research of this topic.
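    The allelic-model odds ratio and its confidence interval reported above can be illustrated with the standard log-OR normal approximation; the allele counts in the sketch below are invented and are not data from the meta-analysis.

```python
# Odds ratio and 95% CI under an allelic model, from a 2x2 table of allele counts.
# The counts below are invented for illustration, not data from the meta-analysis.
import math

a, b = 130, 470   # minor / major allele counts, cases
c, d = 180, 820   # minor / major allele counts, controls

or_ = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(or_) - 1.96 * se_log_or)
hi = math.exp(math.log(or_) + 1.96 * se_log_or)
print(f"OR = {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```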

  20. Statistical analysis of lightning electric field measured under Malaysian condition

    NASA Astrophysics Data System (ADS)

    Salimi, Behnam; Mehranzamir, Kamyar; Abdul-Malek, Zulkurnain

    2014-02-01

    Lightning is an electrical discharge during thunderstorms that can occur either within clouds (Inter-Cloud) or between clouds and ground (Cloud-Ground). Lightning characteristics and their statistics are the foundation for the design of lightning protection systems as well as for the calculation of lightning radiated fields. Various techniques now exist to detect lightning signals and to determine the parameters produced by a lightning flash, each with its own claimed performance. In this paper, the characteristics of captured broadband electric fields generated by cloud-to-ground lightning discharges in the south of Malaysia are analyzed. A total of 130 cloud-to-ground lightning flashes from 3 separate thunderstorm events (each lasting about 4-5 hours) were examined. Statistical analyses of the following signal parameters are presented: preliminary breakdown pulse train duration, time interval between preliminary breakdown and return stroke, stroke multiplicity, and the percentage of single-stroke flashes. The BIL model is also introduced to characterize lightning signature patterns. The statistical analyses show that about 79% of the lightning signals fit the BIL model well. The maximum and minimum preliminary breakdown durations of the observed lightning signals are 84 ms and 560 μs, respectively. The statistical results show that 7.6% of the flashes were single-stroke flashes, and the maximum number of strokes recorded was 14 per flash. A preliminary breakdown signature can be identified in more than 95% of the flashes.

  1. Strengthen forensic entomology in court--the need for data exploration and the validation of a generalised additive mixed model.

    PubMed

    Baqué, Michèle; Amendt, Jens

    2013-01-01

    Developmental data of juvenile blow flies (Diptera: Calliphoridae) are typically used to calculate the age of immature stages found on or around a corpse and thus to estimate a minimum post-mortem interval (PMI(min)). However, many of those data sets do not take into account that immature blow flies grow in a non-linear fashion. Linear models do not provide sufficiently reliable age estimates and may even lead to an erroneous determination of the PMI(min). According to the Daubert standard and the need for improvements in forensic science, new statistical tools like smoothing methods and mixed models allow the modelling of non-linear relationships and expand the field of statistical analyses. The present study introduces the background and application of these statistical techniques by analysing a model which describes the development of the forensically important blow fly Calliphora vicina at different temperatures. The comparison of three statistical methods (linear regression, generalised additive modelling and generalised additive mixed modelling) clearly demonstrates that only the latter provided regression parameters that reflect the data adequately. We focus explicitly on both the exploration of the data--to assure their quality and to show the importance of checking them carefully prior to conducting the statistical tests--and the validation of the resulting models. Hence, we present a common method for evaluating and testing forensic entomological data sets by using for the first time generalised additive mixed models.
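    The contrast between a straight-line fit and a smoother that can follow non-linear growth is easy to illustrate; in the sketch below a scipy smoothing spline stands in for the generalised additive (mixed) models used by the authors, and the development data are synthetic.

```python
# Generic illustration of linear vs. smoothed fits to non-linear growth data.
# A smoothing spline stands in for the GAM/GAMM machinery used by the authors;
# the "length vs. age" data below are synthetic.
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(1)
age_h = np.linspace(0, 120, 60)                      # hours since hatching
length = 2 + 15 / (1 + np.exp(-(age_h - 60) / 12))   # logistic growth (mm)
length += rng.normal(0, 0.4, age_h.size)             # measurement noise

slope, intercept = np.polyfit(age_h, length, 1)      # straight-line fit
linear_rss = np.sum((length - (slope * age_h + intercept)) ** 2)

spline = UnivariateSpline(age_h, length, s=len(age_h) * 0.16)  # smoothing spline
spline_rss = np.sum((length - spline(age_h)) ** 2)

print(f"linear RSS = {linear_rss:.1f}, spline RSS = {spline_rss:.1f}")
```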

  2. The theory precision analyse of RFM localization of satellite remote sensing imagery

    NASA Astrophysics Data System (ADS)

    Zhang, Jianqing; Xv, Biao

    2009-11-01

    The tradition method of detecting precision of Rational Function Model(RFM) is to make use of a great deal check points, and it calculates mean square error through comparing calculational coordinate with known coordinate. This method is from theory of probability, through a large number of samples to statistic estimate value of mean square error, we can think its estimate value approaches in its true when samples are well enough. This paper is from angle of survey adjustment, take law of propagation of error as the theory basis, and it calculates theory precision of RFM localization. Then take the SPOT5 three array imagery as experiment data, and the result of traditional method and narrated method in the paper are compared, while has confirmed tradition method feasible, and answered its theory precision question from the angle of survey adjustment.

  3. Teaching Statistics Online Using "Excel"

    ERIC Educational Resources Information Center

    Jerome, Lawrence

    2011-01-01

    As anyone who has taught or taken a statistics course knows, statistical calculations can be tedious and error-prone, with the details of a calculation sometimes distracting students from understanding the larger concepts. Traditional statistics courses typically use scientific calculators, which can relieve some of the tedium and errors but…

  4. AMAS: a fast tool for alignment manipulation and computing of summary statistics.

    PubMed

    Borowiec, Marek L

    2016-01-01

    The amount of data used in phylogenetics has grown explosively in recent years and many phylogenies are inferred with hundreds or even thousands of loci and many taxa. These modern phylogenomic studies often entail separate analyses of each of the loci in addition to multiple analyses of subsets of genes or concatenated sequences. Computationally efficient tools for handling and computing properties of thousands of single-locus or large concatenated alignments are needed. Here I present AMAS (Alignment Manipulation And Summary), a tool that can be used either as a stand-alone command-line utility or as a Python package. AMAS works on amino acid and nucleotide alignments and combines capabilities of sequence manipulation with a function that calculates basic statistics. The manipulation functions include conversions among popular formats, concatenation, extracting sites and splitting according to a pre-defined partitioning scheme, creation of replicate data sets, and removal of taxa. The statistics calculated include the number of taxa, alignment length, total count of matrix cells, overall number of undetermined characters, percent of missing data, AT and GC contents (for DNA alignments), count and proportion of variable sites, count and proportion of parsimony informative sites, and counts of all characters relevant for a nucleotide or amino acid alphabet. AMAS is particularly suitable for very large alignments with hundreds of taxa and thousands of loci. It is computationally efficient, utilizes parallel processing, and performs better at concatenation than other popular tools. AMAS is a Python 3 program that relies solely on Python's core modules and needs no additional dependencies. AMAS source code and manual can be downloaded from http://github.com/marekborowiec/AMAS/ under GNU General Public License.
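    A few of the summary statistics listed above (missing data, GC content, variable sites) can be illustrated on a toy alignment with plain Python; this sketch is not the AMAS API, only an illustration of how those quantities are defined.

```python
# Toy illustration of alignment summary statistics (not the AMAS API).
aln = {
    "taxon_A": "ATG-CGTAACGT",
    "taxon_B": "ATGNCGTTACGT",
    "taxon_C": "ATGACGTTACGA",
}

n_taxa = len(aln)
length = len(next(iter(aln.values())))
cells = n_taxa * length

missing = sum(seq.count("-") + seq.count("N") + seq.count("?") for seq in aln.values())
gc = sum(seq.count("G") + seq.count("C") for seq in aln.values())

columns = zip(*aln.values())
variable = sum(
    1 for col in columns
    if len({base for base in col if base not in "-N?"}) > 1
)

print(f"taxa={n_taxa}, length={length}, missing={missing/cells:.1%}, "
      f"GC={gc/(cells - missing):.1%}, variable sites={variable}/{length}")
```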

  5. Spatial analyses for nonoverlapping objects with size variations and their application to coral communities.

    PubMed

    Muko, Soyoka; Shimatani, Ichiro K; Nozawa, Yoko

    2014-07-01

    Spatial distributions of individuals are conventionally analysed by representing objects as dimensionless points, in which spatial statistics are based on centre-to-centre distances. However, if organisms expand without overlapping and show size variations, such as is the case for encrusting corals, interobject spacing is crucial for spatial associations where interactions occur. We introduced new pairwise statistics using minimum distances between objects and demonstrated their utility when examining encrusting coral community data. We also calculated the conventional point process statistics and the grid-based statistics to clarify the advantages and limitations of each spatial statistical method. For simplicity, coral colonies were approximated by disks in these demonstrations. Focusing on short-distance effects, the use of minimum distances revealed that almost all coral genera were aggregated at a scale of 1-25 cm. However, when fragmented colonies (ramets) were treated as a genet, a genet-level analysis indicated weak or no aggregation, suggesting that most corals were randomly distributed and that fragmentation was the primary cause of colony aggregations. In contrast, point process statistics showed larger aggregation scales, presumably because centre-to-centre distances included both intercolony spacing and colony sizes (radius). The grid-based statistics were able to quantify the patch (aggregation) scale of colonies, but the scale was strongly affected by the colony size. Our approach quantitatively showed repulsive effects between an aggressive genus and a competitively weak genus, while the grid-based statistics (covariance function) also showed repulsion although the spatial scale indicated from the statistics was not directly interpretable in terms of ecological meaning. The use of minimum distances together with previously proposed spatial statistics helped us to extend our understanding of the spatial patterns of nonoverlapping objects that vary in size and the associated specific scales. © 2013 The Authors. Journal of Animal Ecology © 2013 British Ecological Society.
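    For disk-approximated colonies, the quantity of interest is the gap between colony edges rather than between centres, i.e. the centre-to-centre distance minus the two radii, floored at zero for touching or overlapping colonies. A minimal sketch with invented coordinates and radii:

```python
# Minimum (edge-to-edge) distance between two disk-approximated colonies,
# versus the conventional centre-to-centre distance. Example values invented.
import math

def centre_distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def min_distance(p, r_p, q, r_q):
    """Gap between disk edges; 0 if the disks touch or overlap."""
    return max(0.0, centre_distance(p, q) - r_p - r_q)

colony_a = ((0.0, 0.0), 8.0)     # centre (cm), radius (cm)
colony_b = ((25.0, 0.0), 12.0)

print("centre-to-centre:", centre_distance(colony_a[0], colony_b[0]))  # 25 cm
print("edge-to-edge:    ", min_distance(colony_a[0], colony_a[1],
                                         colony_b[0], colony_b[1]))     # 5 cm
```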

  6. Additive scales in degenerative disease--calculation of effect sizes and clinical judgment.

    PubMed

    Riepe, Matthias W; Wilkinson, David; Förstl, Hans; Brieden, Andreas

    2011-12-16

    The therapeutic efficacy of an intervention is often assessed in clinical trials by scales measuring multiple diverse activities that are added to produce a cumulative global score. Medical communities and health care systems subsequently use these data to calculate pooled effect sizes to compare treatments. This is done because major doubt has been cast over the clinical relevance of statistically significant findings that rely on p values, which have the potential to report chance findings. Hence, in an aim to overcome this, pooling the results of clinical studies into a meta-analysis with a statistical calculus has been assumed to be a more definitive way of deciding efficacy. We simulate therapeutic effects as measured with additive scales in patient cohorts with different disease severity and assess the limitations of effect size calculations for additive scales, which are proven mathematically. We demonstrate that the major problem, which cannot be overcome by current numerical methods, is the complex nature and neurobiological foundation of clinical psychiatric endpoints in particular and additive scales in general. This is particularly relevant for endpoints used in dementia research. 'Cognition' is composed of functions such as memory, attention, orientation and many more. These individual functions decline in varied and non-linear ways. Here we demonstrate that with progressive diseases cumulative values from multidimensional scales are subject to distortion by the limitations of the additive scale. The non-linearity of the decline of function impedes the calculation of effect sizes based on cumulative values from these multidimensional scales. Statistical analysis needs to be guided by the boundaries of the biological condition. Alternatively, we suggest a different approach avoiding the error imposed by over-analysis of cumulative global scores from additive scales.

  7. The impact of registration accuracy on imaging validation study design: A novel statistical power calculation.

    PubMed

    Gibson, Eli; Fenster, Aaron; Ward, Aaron D

    2013-10-01

    Novel imaging modalities are pushing the boundaries of what is possible in medical imaging, but their signal properties are not always well understood. The evaluation of these novel imaging modalities is critical to achieving their research and clinical potential. Image registration of novel modalities to accepted reference standard modalities is an important part of characterizing the modalities and elucidating the effect of underlying focal disease on the imaging signal. The strengths of the conclusions drawn from these analyses are limited by statistical power. Based on the observation that in this context, statistical power depends in part on uncertainty arising from registration error, we derive a power calculation formula relating registration error, number of subjects, and the minimum detectable difference between normal and pathologic regions on imaging, for an imaging validation study design that accommodates signal correlations within image regions. Monte Carlo simulations were used to evaluate the derived models and test the strength of their assumptions, showing that the model yielded predictions of the power, the number of subjects, and the minimum detectable difference of simulated experiments accurate to within a maximum error of 1% when the assumptions of the derivation were met, and characterizing sensitivities of the model to violations of the assumptions. The use of these formulae is illustrated through a calculation of the number of subjects required for a case study, modeled closely after a prostate cancer imaging validation study currently taking place at our institution. The power calculation formulae address three central questions in the design of imaging validation studies: (1) What is the maximum acceptable registration error? (2) How many subjects are needed? (3) What is the minimum detectable difference between normal and pathologic image regions? Copyright © 2013 Elsevier B.V. All rights reserved.
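    A hedged sketch of the general idea (not the authors' derived formula): registration error adds variance to the region-wise imaging signal, which inflates the per-group sample size needed to detect a given normal-versus-pathologic difference. The sketch uses the standard normal-approximation two-sample formula, and all numerical values are invented.

```python
# Illustrative (not the paper's derived formula): a standard two-sample
# sample-size calculation in which registration error adds variance to the
# imaging signal, shrinking the standardized effect. All numbers are invented.
from scipy.stats import norm

def n_per_group(delta, sigma_signal, sigma_registration, alpha=0.05, power=0.8):
    sigma_total_sq = sigma_signal**2 + sigma_registration**2   # inflated variance
    z_a, z_b = norm.ppf(1 - alpha / 2), norm.ppf(power)
    return 2 * (z_a + z_b) ** 2 * sigma_total_sq / delta**2

print(n_per_group(delta=0.5, sigma_signal=1.0, sigma_registration=0.0))   # ~63
print(n_per_group(delta=0.5, sigma_signal=1.0, sigma_registration=0.7))   # ~94
```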

  8. Effect Size Analyses of Souvenaid in Patients with Alzheimer’s Disease

    PubMed Central

    Cummings, Jeffrey; Scheltens, Philip; McKeith, Ian; Blesa, Rafael; Harrison, John E.; Bertolucci, Paulo H.F.; Rockwood, Kenneth; Wilkinson, David; Wijker, Wouter; Bennett, David A.; Shah, Raj C.

    2016-01-01

    Background: Souvenaid® (uridine monophosphate, docosahexaenoic acid, eicosapentaenoic acid, choline, phospholipids, folic acid, vitamins B12, B6, C, and E, and selenium), was developed to support the formation and function of neuronal membranes. Objective: To determine effect sizes observed in clinical trials of Souvenaid and to calculate the number needed to treat to show benefit or harm. Methods: Data from all three reported randomized controlled trials of Souvenaid in Alzheimer’s disease (AD) dementia (Souvenir I, Souvenir II, and S-Connect) and an open-label extension study were included in analyses of effect size for cognitive, functional, and behavioral outcomes. Effect size was determined by calculating Cohen’s d statistic (or Cramér’s V method for nominal data), number needed to treat and number needed to harm. Statistical calculations were performed for the intent-to-treat populations. Results: In patients with mild AD, effect sizes were 0.21 (95% confidence intervals: –0.06, 0.49) for the primary outcome in Souvenir II (neuropsychological test battery memory z-score) and 0.20 (0.10, 0.34) for the co-primary outcome of Souvenir I (Wechsler memory scale delayed recall). No effect was shown on cognition in patients with mild-to-moderate AD (S-Connect). The number needed to treat (6 and 21 for Souvenir I and II, respectively) and high number needed to harm values indicate a favorable harm:benefit ratio for Souvenaid versus control in patients with mild AD. Conclusions: The favorable safety profile and impact on outcome measures converge to corroborate the putative mode of action and demonstrate that Souvenaid can achieve clinically detectable effects in patients with early AD. PMID:27767993
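    The two summary quantities used in the analysis can be illustrated with their generic formulas: Cohen's d from group means and a pooled standard deviation, and the number needed to treat from the absolute difference in response rates. The input values below are placeholders, not data from the Souvenaid trials.

```python
# Generic formulas for Cohen's d and number needed to treat (NNT).
# Input values are placeholders, not data from the Souvenaid trials.
import math

def cohens_d(mean_tx, mean_ctrl, sd_tx, sd_ctrl, n_tx, n_ctrl):
    pooled_sd = math.sqrt(((n_tx - 1) * sd_tx**2 + (n_ctrl - 1) * sd_ctrl**2)
                          / (n_tx + n_ctrl - 2))
    return (mean_tx - mean_ctrl) / pooled_sd

def nnt(response_rate_tx, response_rate_ctrl):
    return 1.0 / abs(response_rate_tx - response_rate_ctrl)

print(round(cohens_d(0.30, 0.10, 1.0, 1.0, 120, 120), 2))  # 0.2, a small effect
print(round(nnt(0.45, 0.28), 1))                            # ~5.9 patients
```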

  9. Incorporating GIS building data and census housing statistics for sub-block-level population estimation

    USGS Publications Warehouse

    Wu, S.-S.; Wang, L.; Qiu, X.

    2008-01-01

    This article presents a deterministic model for sub-block-level population estimation based on the total building volumes derived from geographic information system (GIS) building data and three census block-level housing statistics. To assess the model, we generated artificial blocks by aggregating census block areas and calculating the respective housing statistics. We then applied the model to estimate populations for sub-artificial-block areas and assessed the estimates with census populations of the areas. Our analyses indicate that the average percent error of population estimation for sub-artificial-block areas is comparable to those for sub-census-block areas of the same size relative to associated blocks. The smaller the sub-block-level areas, the higher the population estimation errors. For example, the average percent error for residential areas is approximately 0.11 percent for 100 percent block areas and 35 percent for 5 percent block areas.

  10. [Database supported electronic retrospective analyses in radiation oncology: establishing a workflow using the example of pancreatic cancer].

    PubMed

    Kessel, K A; Habermehl, D; Bohn, C; Jäger, A; Floca, R O; Zhang, L; Bougatf, N; Bendl, R; Debus, J; Combs, S E

    2012-12-01

    Especially in the field of radiation oncology, handling a large variety of voluminous datasets from various information systems in different documentation styles efficiently is crucial for patient care and research. To date, conducting retrospective clinical analyses is rather difficult and time-consuming. With the example of patients with pancreatic cancer treated with radio-chemotherapy, we performed a therapy evaluation by using an analysis system connected with a documentation system. A total of 783 patients have been documented in a professional, database-based documentation system. Information about radiation therapy, diagnostic images and dose distributions has been imported into the web-based system. For 36 patients with disease progression after neoadjuvant chemoradiation, we designed and established an analysis workflow. After an automatic registration of the radiation plans with the follow-up images, the recurrence volumes are segmented manually. Based on these volumes, the DVH (dose-volume histogram) statistic is calculated, followed by the determination of the dose applied to the region of recurrence. All results are saved in the database and included in statistical calculations. The main goal of using an automatic analysis tool is to reduce the time and effort of conducting clinical analyses, especially with large patient groups. We have shown a first approach using some existing tools; however, manual interaction is still necessary. Further steps need to be taken to enhance automation. Already, it has become apparent that the benefits of digital data management and analysis lie in the central storage of data and reusability of the results. Therefore, we intend to adapt the analysis system to other types of tumors in radiation oncology.
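    As a hedged illustration of the kind of dose-volume statistics described above (not the authors' documentation or analysis system), the following sketch computes the mean dose, D95 and V50Gy for a recurrence volume from a synthetic dose grid and a binary mask.

```python
# Hedged sketch: dose statistics for a recurrence volume from a 3-D dose grid
# and a binary mask. Arrays are synthetic stand-ins, not output of the system
# described in the abstract.
import numpy as np

rng = np.random.default_rng(0)
dose = rng.normal(45.0, 8.0, size=(50, 50, 30)).clip(min=0)   # Gy, synthetic grid
recurrence_mask = np.zeros(dose.shape, dtype=bool)
recurrence_mask[20:30, 20:30, 10:20] = True                   # synthetic recurrence

doses_in_volume = dose[recurrence_mask]
mean_dose = doses_in_volume.mean()
d95 = np.percentile(doses_in_volume, 5)     # dose covering 95% of the volume
v50 = (doses_in_volume >= 50.0).mean()      # volume fraction receiving >= 50 Gy

# A full cumulative DVH is the curve of such volume fractions over all dose levels.
print(f"mean {mean_dose:.1f} Gy, D95 {d95:.1f} Gy, V50Gy {v50:.1%}")
```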

  11. Methodological and Reporting Quality of Systematic Reviews and Meta-analyses in Endodontics.

    PubMed

    Nagendrababu, Venkateshbabu; Pulikkotil, Shaju Jacob; Sultan, Omer Sheriff; Jayaraman, Jayakumar; Peters, Ove A

    2018-06-01

    The aim of this systematic review (SR) was to evaluate the quality of SRs and meta-analyses (MAs) in endodontics. A comprehensive literature search was conducted to identify relevant articles in the electronic databases from January 2000 to June 2017. Two reviewers independently assessed the articles for eligibility and data extraction. SRs and MAs on interventional studies with a minimum of 2 therapeutic strategies in endodontics were included in this SR. Methodologic and reporting quality were assessed using A Measurement Tool to Assess Systematic Reviews (AMSTAR) and Preferred Reporting Items for Systematic Review and Meta-Analyses (PRISMA), respectively. The interobserver reliability was calculated using the Cohen kappa statistic. Statistical analysis with the level of significance at P < .05 was performed using Kruskal-Wallis tests and simple linear regression analysis. A total of 30 articles were selected for the current SR. Using AMSTAR, the item related to the scientific quality of studies used in conclusions was adhered to by less than 40% of studies. Using PRISMA, 3 items were reported by less than 40% of studies, which were on objectives, protocol registration, and funding. No association was evident between quality and either the number of authors or the country. Statistical significance was observed when quality was compared among journals, with studies published as Cochrane reviews superior to those published in other journals. AMSTAR and PRISMA scores were significantly related. SRs in endodontics showed variability in both methodologic and reporting quality. Copyright © 2018 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
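    Interobserver reliability via the Cohen kappa statistic can be illustrated with its textbook definition (observed versus chance agreement); the two reviewers' ratings below are invented, not data from the review.

```python
# Cohen's kappa from two reviewers' categorical ratings (textbook definition).
# The example ratings are invented, not the review's data.
from collections import Counter

rater1 = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes", "no", "yes"]
rater2 = ["yes", "no",  "no", "yes", "no", "yes", "yes", "yes", "no", "yes"]

n = len(rater1)
p_observed = sum(a == b for a, b in zip(rater1, rater2)) / n

counts1, counts2 = Counter(rater1), Counter(rater2)
p_chance = sum(counts1[c] * counts2[c] for c in counts1) / n**2

kappa = (p_observed - p_chance) / (1 - p_chance)
print(f"kappa = {kappa:.2f}")   # 0.58 for this toy example
```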

  12. A common base method for analysis of qPCR data and the application of simple blocking in qPCR experiments.

    PubMed

    Ganger, Michael T; Dietz, Geoffrey D; Ewing, Sarah J

    2017-12-01

    qPCR has established itself as the technique of choice for the quantification of gene expression. Procedures for conducting qPCR have received significant attention; however, more rigorous approaches to the statistical analysis of qPCR data are needed. Here we develop a mathematical model, termed the Common Base Method, for analysis of qPCR data based on threshold cycle values (Cq) and efficiencies of reactions (E). The Common Base Method keeps all calculations in the log scale as long as possible by working with log10(E)·Cq, which we call the efficiency-weighted Cq value; subsequent statistical analyses are then applied in the log scale. We show how efficiency-weighted Cq values may be analyzed using a simple paired or unpaired experimental design and develop blocking methods to help reduce unexplained variation. The Common Base Method has several advantages. It allows for the incorporation of well-specific efficiencies and multiple reference genes. The method does not necessitate the pairing of samples that must be performed using traditional analysis methods in order to calculate relative expression ratios. Our method is also simple enough to be implemented in any spreadsheet or statistical software without additional scripts or proprietary components.
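    The central transformation is to keep calculations in the log scale by working with the efficiency-weighted value w = log10(E)·Cq, since the template amount is proportional to E^(-Cq). The sketch below illustrates that idea only; it is not the published Common Base Method implementation, and the Cq values and efficiencies are invented.

```python
# Sketch of the efficiency-weighted Cq idea: keep everything in log10 scale
# (w = log10(E) * Cq) and only convert to a fold change at the end.
# Illustration of the concept, not the published Common Base Method code;
# Cq values and efficiencies below are invented.
import math

def w(efficiency, cq):
    return math.log10(efficiency) * cq    # efficiency-weighted Cq

def log10_expression(e_target, cq_target, e_ref, cq_ref):
    # amount is proportional to E**(-Cq), so log10(target/reference)
    # = w(reference) - w(target)
    return w(e_ref, cq_ref) - w(e_target, cq_target)

control = log10_expression(1.95, 24.0, 1.90, 20.0)
treated = log10_expression(1.95, 22.0, 1.90, 20.1)

fold_change = 10 ** (treated - control)
print(f"fold change (treated vs control) = {fold_change:.2f}")
```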

  13. Four hundred or more participants needed for stable contingency table estimates of clinical prediction rule performance.

    PubMed

    Kent, Peter; Boyle, Eleanor; Keating, Jennifer L; Albert, Hanne B; Hartvigsen, Jan

    2017-02-01

    To quantify variability in the results of statistical analyses based on contingency tables and discuss the implications for the choice of sample size for studies that derive clinical prediction rules. An analysis of three pre-existing sets of large cohort data (n = 4,062-8,674) was performed. In each data set, repeated random sampling of various sample sizes, from n = 100 up to n = 2,000, was performed 100 times at each sample size and the variability in estimates of sensitivity, specificity, positive and negative likelihood ratios, posttest probabilities, odds ratios, and risk/prevalence ratios for each sample size was calculated. There were very wide, and statistically significant, differences in estimates derived from contingency tables from the same data set when calculated in sample sizes below 400 people, and typically, this variability stabilized in samples of 400-600 people. Although estimates of prevalence also varied significantly in samples below 600 people, that relationship only explains a small component of the variability in these statistical parameters. To reduce sample-specific variability, contingency tables should consist of 400 participants or more when used to derive clinical prediction rules or test their performance. Copyright © 2016 Elsevier Inc. All rights reserved.
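    The stabilization of contingency-table estimates with increasing sample size can be reproduced in miniature by repeated random sampling from a synthetic cohort; the prevalence, sensitivity and specificity below are invented, and the sketch parallels rather than reproduces the study design.

```python
# Sketch of sample-size-dependent variability in sensitivity estimated from
# repeatedly sampled 2x2 tables. The synthetic cohort (prevalence, sensitivity,
# specificity) is invented; this parallels, not reproduces, the study.
import numpy as np

rng = np.random.default_rng(42)
N = 8000
outcome = rng.random(N) < 0.30                                  # true condition
test_pos = np.where(outcome, rng.random(N) < 0.80,              # sensitivity 0.80
                             rng.random(N) < 0.30)              # 1 - specificity

for n in (100, 400, 1000):
    sens = []
    for _ in range(100):
        idx = rng.choice(N, size=n, replace=False)
        o, t = outcome[idx], test_pos[idx]
        sens.append((o & t).sum() / o.sum())
    print(f"n={n:5d}  sensitivity spread (SD) = {np.std(sens):.3f}")
```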

  14. Spectral calculations for pressure-velocity and pressure-strain correlations in homogeneous shear turbulence

    NASA Astrophysics Data System (ADS)

    Dutta, Kishore

    2018-02-01

    Theoretical analyses of pressure-related turbulence statistics are vital for reliable and accurate modeling of turbulence. In the inertial subrange of turbulent shear flow, pressure-velocity and pressure-strain correlations are affected by anisotropy imposed at large scales. Recently, Tsuji and Kaneda (2012 J. Fluid Mech. 694 50) performed a set of experiments on homogeneous shear flow, and estimated various one-dimensional pressure-related spectra and the associated non-dimensional universal numbers. Here, starting from the governing Navier-Stokes dynamics for the fluctuating velocity field and assuming the anisotropy at inertial scales as a weak perturbation of an otherwise isotropic dynamics, we analytically derive the form of the pressure-velocity and pressure-strain correlations. The associated universal numbers are calculated using the well-known renormalization-group results, and are compared with the experimental estimates of Tsuji and Kaneda. Approximations involved in the perturbative calculations are discussed.

  15. Statistical analyses on sandstones: Systematic approach for predicting petrographical and petrophysical properties

    NASA Astrophysics Data System (ADS)

    Stück, H. L.; Siegesmund, S.

    2012-04-01

    Sandstones are a popular natural stone due to their wide occurrence and availability. The different applications for these stones have led to an increase in demand. From the viewpoint of conservation and the natural stone industry, an understanding of the material behaviour of this construction material is very important. Sandstones are a highly heterogeneous material. Based on statistical analyses with a sufficiently large dataset, a systematic approach to predicting the material behaviour should be possible. Since the literature already contains a large volume of data concerning the petrographical and petrophysical properties of sandstones, a large dataset could be compiled for the statistical analyses. The aim of this study is to develop constraints on the material behaviour and especially on the weathering behaviour of sandstones. Approximately 300 samples from historical and presently mined natural sandstones in Germany and ones described worldwide were included in the statistical approach. The mineralogical composition and fabric characteristics were determined from detailed thin section analyses and descriptions in the literature. Particular attention was paid to evaluating the compositional and textural maturity, grain contact (contact thickness), type of cement, degree of alteration and the intergranular volume. Statistical methods were used to test for normal distributions and to calculate linear regressions of the basic petrophysical properties of density, porosity, water uptake as well as the strength. The sandstones were classified into three different pore size distributions and evaluated with the other petrophysical properties. Weathering behaviour, such as hygric swelling and salt loading tests, was also included. To identify similarities between individual sandstones or to define groups of specific sandstone types, principal component analysis, cluster analysis and factor analysis were applied. Our results show that composition and porosity evolution during diagenesis is a very important control on the petrophysical properties of a building stone. The relationship between intergranular volume, cementation and grain contact can also provide valuable information to predict the strength properties. Since the samples investigated mainly originate from the Triassic German epicontinental basin, arkoses and feldspar-arenites are underrepresented. In general, the sandstones can be grouped as follows: i) quartzites, highly mature with a primary porosity of about 40%, ii) quartzites, highly mature, showing a primary porosity of 40% but with early clay infiltration, iii) sublitharenites-lithic arenites exhibiting a lower primary porosity, higher cementation with quartz and ferritic Fe-oxides, and iv) sublitharenites-lithic arenites with a higher content of pseudomatrix. However, in the last two groups the feldspar and lithoclasts can also show considerable alteration. All sandstone groups differ with respect to the pore space and strength data, as well as water uptake properties, which were obtained by linear regression analysis. Similar petrophysical properties are discernible for each type when using principal component analysis. Furthermore, strength as well as the porosity of sandstones shows distinct differences considering their stratigraphic ages and the compositions. The relationship between porosity, strength as well as salt resistance could also be verified.
Hygric swelling shows an interrelation to pore size type, porosity and strength but also to the degree of alteration (e.g. lithoclasts, pseudomatrix). To summarize, the different regression analyses and the calculated confidence regions provide a significant tool to classify the petrographical and petrophysical parameters of sandstones. Based on this, the durability and the weathering behavior of the sandstone groups can be constrained. Keywords: sandstones, petrographical & petrophysical properties, predictive approach, statistical investigation

  16. Impact of tamsulosin and nifedipine on contractility of pregnant rat ureters in vitro.

    PubMed

    Haddad, Lisette; Corriveau, Stéphanie; Rousseau, Eric; Blouin, Simon; Pasquier, Jean-Charles; Ponsot, Yves; Roy-Lacroix, Marie-Ève

    2018-01-01

    To evaluate the in vitro effect of tamsulosin and nifedipine on the contractility of pregnant rat ureters and to perform quantitative analysis of the pharmacological effects. Medical expulsive therapy (MET) is commonly used to treat urolithiasis. However, this treatment is seldom used in pregnant women since no studies support this practice. This was an in vitro study on animal tissue derived from pregnant Sprague-Dawley rats. A total of 124 ureteral segments were mounted in an organ bath system and contractile response to methacholine (MCh) was assessed. Tamsulosin or nifedipine was added at cumulative concentrations (0.001-1 μM). The area under the curve (AUC) from isometric tension measurements was calculated. The effects of the pharmacological agents and their respective controls were assessed by calculating the AUC for each 5-min interval. Statistical analyses were performed using the Mann-Whitney-Wilcoxon nonparametric test. Both drugs displayed statistically significant inhibitory activity, as measured by the AUC relative to DMSO controls, at concentrations of 0.1 and 1 μM for tamsulosin and 1 μM for nifedipine. Tamsulosin and nifedipine directly inhibit MCh-induced contractility of pregnant rat ureters. Further work is needed to determine the clinical efficacy of these medications for MET in pregnancy.
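    The AUC outcome can be illustrated generically by integrating an isometric tension trace over each 5-minute interval with the trapezoidal rule; the sampling rate and tension trace in the sketch are synthetic assumptions, not the laboratory's acquisition pipeline.

```python
# Generic sketch of area-under-the-curve (AUC) for isometric tension, computed
# per 5-minute interval with the trapezoidal rule. The trace is synthetic.
import numpy as np

fs = 2.0                                          # samples per second (assumed)
t = np.arange(0, 30 * 60, 1 / fs)                 # 30 minutes of recording
tension = 1.0 + 0.3 * np.sin(2 * np.pi * t / 40)  # mN, synthetic contractions
tension = np.clip(tension, 0, None)

interval = int(5 * 60 * fs)                       # samples per 5-min window
aucs = [np.trapz(tension[i:i + interval], dx=1 / fs)
        for i in range(0, len(tension), interval)]
print([round(a, 1) for a in aucs])                # mN*s per 5-min interval
```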

  17. Discovering human germ cell mutagens with whole genome sequencing: Insights from power calculations reveal the importance of controlling for between-family variability.

    PubMed

    Webster, R J; Williams, A; Marchetti, F; Yauk, C L

    2018-07-01

    Mutations in germ cells pose potential genetic risks to offspring. However, de novo mutations are rare events that are spread across the genome and are difficult to detect. Thus, studies in this area have generally been under-powered, and no human germ cell mutagen has been identified. Whole Genome Sequencing (WGS) of human pedigrees has been proposed as an approach to overcome these technical and statistical challenges. WGS enables analysis of a much wider breadth of the genome than traditional approaches. Here, we performed power analyses to determine the feasibility of using WGS in human families to identify germ cell mutagens. Different statistical models were compared in the power analyses (ANOVA and multiple regression for one-child families, and mixed effect model sampling between two to four siblings per family). Assumptions were made based on parameters from the existing literature, such as the mutation-by-paternal age effect. We explored two scenarios: a constant effect due to an exposure that occurred in the past, and an accumulating effect where the exposure is continuing. Our analysis revealed the importance of modeling inter-family variability of the mutation-by-paternal age effect. Statistical power was improved by models accounting for the family-to-family variability. Our power analyses suggest that sufficient statistical power can be attained with 4-28 four-sibling families per treatment group, when the increase in mutations ranges from 40 to 10% respectively. Modeling family variability using mixed effect models provided a reduction in sample size compared to a multiple regression approach. Much larger sample sizes were required to detect an interaction effect between environmental exposures and paternal age. These findings inform study design and statistical modeling approaches to improve power and reduce sequencing costs for future studies in this area. Crown Copyright © 2018. Published by Elsevier B.V. All rights reserved.

  18. Statistical issues on the analysis of change in follow-up studies in dental research.

    PubMed

    Blance, Andrew; Tu, Yu-Kang; Baelum, Vibeke; Gilthorpe, Mark S

    2007-12-01

    To provide an overview of the problems in study design and associated analyses of follow-up studies in dental research, particularly addressing three issues: treatment-baseline interactions; statistical power; and nonrandomization. Our previous work has shown that many studies purport an interaction between change (from baseline) and baseline values, which is often based on inappropriate statistical analyses. A priori power calculations are essential for randomized controlled trials (RCTs), but in the pre-test/post-test RCT design it is not well known to dental researchers that the choice of statistical method affects power, and that power is affected by treatment-baseline interactions. A common (good) practice in the analysis of RCT data is to adjust for baseline outcome values using ANCOVA, thereby increasing statistical power. However, an important requirement for ANCOVA is that there be no interaction between the groups and the baseline outcome (i.e. effective randomization); the patient-selection process should not cause differences in mean baseline values across groups. This assumption is often violated in nonrandomized (observational) studies and the use of ANCOVA is thus problematic, potentially giving biased estimates, invoking Lord's paradox and leading to difficulties in the interpretation of results. Baseline interaction issues can be overcome by statistical methods not widely practised in dental research: Oldham's method and multilevel modelling; the latter is preferred for its greater flexibility to deal with more than one follow-up occasion as well as additional covariates. To illustrate these three key issues, hypothetical examples are considered from the fields of periodontology, orthodontics, and oral implantology. Caution needs to be exercised when considering the design and analysis of follow-up studies. ANCOVA is generally inappropriate for nonrandomized studies, and causal inferences from observational data should be avoided.

  19. The influence of control group reproduction on the statistical ...

    EPA Pesticide Factsheets

    Because of various Congressional mandates to protect the environment from endocrine disrupting chemicals (EDCs), the United States Environmental Protection Agency (USEPA) initiated the Endocrine Disruptor Screening Program. In the context of this framework, the Office of Research and Development within the USEPA developed the Medaka Extended One Generation Reproduction Test (MEOGRT) to characterize the endocrine action of a suspected EDC. One important endpoint of the MEOGRT is fecundity of breeding pairs of medaka. Power analyses were conducted to determine the number of replicates needed in proposed test designs and to determine the effects that varying reproductive parameters (e.g. mean fecundity, variance, and days with no egg production) will have on the statistical power of the test. A software tool, the MEOGRT Reproduction Power Analysis Tool, was developed to expedite these power analyses by both calculating estimates of the needed reproductive parameters (e.g. population mean and variance) and performing the power analysis under user specified scenarios. The manuscript illustrates how the reproductive performance of the control medaka that are used in a MEOGRT influence statistical power, and therefore the successful implementation of the protocol. Example scenarios, based upon medaka reproduction data collected at MED, are discussed that bolster the recommendation that facilities planning to implement the MEOGRT should have a culture of medaka with hi

  20. "Are cognitive interventions effective in Alzheimer's disease? A controlled meta- analysis of the effects of bias": Correction to Oltra-Cucarella et al. (2016).

    PubMed

    2016-07-01

    Reports an error in "Are Cognitive Interventions Effective in Alzheimer's Disease? A Controlled Meta-Analysis of the Effects of Bias" by Javier Oltra-Cucarella, Rubén Pérez-Elvira, Raul Espert and Anita Sohn McCormick (Neuropsychology, Advanced Online Publication, Apr 7, 2016, np). In the article the first sentence of the third paragraph of the Source of bias subsection in the Statistical Analysis subsection of the Correlational Meta-Analysis section should read "For the control condition bias, three comparison groups were differentiated: (a) a structured cognitive intervention, (b) a placebo control condition, and (c) a pharma control condition without cognitive intervention or no treatment at all." (The following abstract of the original article appeared in record 2016-16656-001.) There is limited evidence about the efficacy of cognitive interventions for Alzheimer's disease (AD). However, aside from the methodological quality of the studies analyzed, the methodology used in previous meta-analyses is itself a risk of bias as different types of effect sizes (ESs) were calculated and combined. This study aimed at examining the results of nonpharmacological interventions for AD with an adequate control of statistical methods and to demonstrate a different approach to meta-analysis. ESs were calculated with the independent groups pre/post design. Average ESs for separate outcomes were calculated and moderator analyses were performed so as to offer an overview of the effects of bias. Eighty-seven outcomes from 19 studies (n = 812) were meta-analyzed. ESs were small on average for cognitive and functional outcomes after intervention. Moderator analyses showed no effect of control of bias, although ESs were different from zero only in some circumstances (e.g., memory outcomes in randomized studies). Cognitive interventions showed no more efficacy than placebo interventions, and functional ESs were consistently low across conditions. Cognitive interventions delivered may not be effective in AD, probably due to the fact that the assumptions behind the cognitive interventions might be inadequate. Future directions include a change in the type of intervention as well as the use of outcomes other than standardized tests. Additional studies with larger sample sizes and different designs are needed to increase the power of both primary studies and meta-analyses. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  1. Descriptive and inferential statistical methods used in burns research.

    PubMed

    Al-Benna, Sammy; Al-Ajam, Yazan; Way, Benjamin; Steinstraesser, Lars

    2010-05-01

    Burns research articles utilise a variety of descriptive and inferential methods to present and analyse data. The aim of this study was to determine the descriptive methods (e.g. mean, median, SD, range, etc.) and survey the use of inferential methods (statistical tests) used in articles in the journal Burns. This study defined its population as all original articles published in the journal Burns in 2007. Letters to the editor, brief reports, reviews, and case reports were excluded. Study characteristics, use of descriptive statistics and the number and types of statistical methods employed were evaluated. Of the 51 articles analysed, 11 (22%) were randomised controlled trials, 18 (35%) were cohort studies, 11 (22%) were case control studies and 11 (22%) were case series. The study design and objectives were defined in all articles. All articles made use of continuous and descriptive data. Inferential statistics were used in 49 (96%) articles. Data dispersion was calculated by standard deviation in 30 (59%). Standard error of the mean was quoted in 19 (37%). The statistical software product was named in 33 (65%). Of the 49 articles that used inferential statistics, the tests were named in 47 (96%). The 6 most common tests used (Student's t-test (53%), analysis of variance/co-variance (33%), chi-squared test (27%), Wilcoxon & Mann-Whitney tests (22%), Fisher's exact test (12%)) accounted for the majority (72%) of statistical methods employed. A specified significance level was named in 43 (88%) and the exact significance levels were reported in 28 (57%). Descriptive analysis and basic statistical techniques account for most of the statistical tests reported. This information should prove useful in deciding which tests should be emphasised in educating burn care professionals. These results highlight the need for burn care professionals to have a sound understanding of basic statistics, which is crucial in interpreting and reporting data. Advice should be sought from professionals in the fields of biostatistics and epidemiology when using more advanced statistical techniques. Copyright 2009 Elsevier Ltd and ISBI. All rights reserved.

  2. Bias, precision and statistical power of analysis of covariance in the analysis of randomized trials with baseline imbalance: a simulation study.

    PubMed

    Egbewale, Bolaji E; Lewis, Martyn; Sim, Julius

    2014-04-09

    Analysis of variance (ANOVA), change-score analysis (CSA) and analysis of covariance (ANCOVA) respond differently to baseline imbalance in randomized controlled trials. However, no empirical studies appear to have quantified the differential bias and precision of estimates derived from these methods of analysis, and their relative statistical power, in relation to combinations of levels of key trial characteristics. This simulation study therefore examined the relative bias, precision and statistical power of these three analyses using simulated trial data. 126 hypothetical trial scenarios were evaluated (126,000 datasets), each with continuous data simulated by using a combination of levels of: treatment effect; pretest-posttest correlation; direction and magnitude of baseline imbalance. The bias, precision and power of each method of analysis were calculated for each scenario. Compared to the unbiased estimates produced by ANCOVA, both ANOVA and CSA are subject to bias, in relation to pretest-posttest correlation and the direction of baseline imbalance. Additionally, ANOVA and CSA are less precise than ANCOVA, especially when pretest-posttest correlation ≥ 0.3. When groups are balanced at baseline, ANCOVA is at least as powerful as the other analyses. Apparently greater power of ANOVA and CSA at certain imbalances is achieved in respect of a biased treatment effect. Across a range of correlations between pre- and post-treatment scores and at varying levels and direction of baseline imbalance, ANCOVA remains the optimum statistical method for the analysis of continuous outcomes in RCTs, in terms of bias, precision and statistical power.
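    A miniature of the comparison for a single invented scenario (rather than the 126 studied) reproduces the pattern described above: with a baseline imbalance and a moderate pretest-posttest correlation, the ANOVA and change-score estimates are pulled away from the true effect while ANCOVA is not. All parameter values in the sketch are assumptions.

```python
# Miniature of the ANOVA / change-score / ANCOVA comparison for one invented
# scenario (true effect 5, pre-post correlation 0.6, baseline imbalance +3 in
# the treatment arm); not the paper's 126-scenario simulation.
import numpy as np

rng = np.random.default_rng(7)
n, true_effect, rho = 200, 5.0, 0.6

group = np.repeat([0, 1], n)                        # 0 = control, 1 = treatment
pre = rng.normal(50, 10, 2 * n) + 3.0 * group       # imbalanced baseline
post = (rho * (pre - 50) + rng.normal(0, 10 * np.sqrt(1 - rho**2), 2 * n)
        + 50 + true_effect * group)

anova = post[group == 1].mean() - post[group == 0].mean()
csa = (post - pre)[group == 1].mean() - (post - pre)[group == 0].mean()

# ANCOVA: regress post on group and baseline (ordinary least squares).
X = np.column_stack([np.ones(2 * n), group, pre])
beta = np.linalg.lstsq(X, post, rcond=None)[0]
ancova = beta[1]

print(f"ANOVA {anova:.2f}  CSA {csa:.2f}  ANCOVA {ancova:.2f}  (true {true_effect})")
```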

  3. Bias, precision and statistical power of analysis of covariance in the analysis of randomized trials with baseline imbalance: a simulation study

    PubMed Central

    2014-01-01

    Background Analysis of variance (ANOVA), change-score analysis (CSA) and analysis of covariance (ANCOVA) respond differently to baseline imbalance in randomized controlled trials. However, no empirical studies appear to have quantified the differential bias and precision of estimates derived from these methods of analysis, and their relative statistical power, in relation to combinations of levels of key trial characteristics. This simulation study therefore examined the relative bias, precision and statistical power of these three analyses using simulated trial data. Methods 126 hypothetical trial scenarios were evaluated (126 000 datasets), each with continuous data simulated by using a combination of levels of: treatment effect; pretest-posttest correlation; direction and magnitude of baseline imbalance. The bias, precision and power of each method of analysis were calculated for each scenario. Results Compared to the unbiased estimates produced by ANCOVA, both ANOVA and CSA are subject to bias, in relation to pretest-posttest correlation and the direction of baseline imbalance. Additionally, ANOVA and CSA are less precise than ANCOVA, especially when pretest-posttest correlation ≥ 0.3. When groups are balanced at baseline, ANCOVA is at least as powerful as the other analyses. Apparently greater power of ANOVA and CSA at certain imbalances is achieved in respect of a biased treatment effect. Conclusions Across a range of correlations between pre- and post-treatment scores and at varying levels and direction of baseline imbalance, ANCOVA remains the optimum statistical method for the analysis of continuous outcomes in RCTs, in terms of bias, precision and statistical power. PMID:24712304

  4. Analysis of temperature-dependent neutron transmission and self-indication measurements on tantalum at 2-keV neutron energy

    NASA Technical Reports Server (NTRS)

    Semler, T. T.

    1973-01-01

    The method of pseudo-resonance cross sections is used to analyze published temperature-dependent neutron transmission and self-indication measurements on tantalum in the unresolved region. In the energy region analyzed, 1825.0 to 2017.0 eV, a direct application of the pseudo-resonance approach using a customary average strength function will not provide effective cross sections which fit the measured cross section behavior. Rather, a local value of the strength function is required, and a set of resonances which model the measured behavior of the effective cross sections is derived. This derived set of resonance parameters adequately represents the observed resonance behavior in this local energy region. Similar analyses for the measurements in other unresolved energy regions are necessary to obtain local resonance parameters for improved reactor calculations. This study suggests that Doppler coefficients calculated by sampling from grand average statistical distributions over the entire unresolved resonance region can be in error, since significant local variations in the statistical distributions are not taken into consideration.

  5. Automated brain volumetrics in multiple sclerosis: a step closer to clinical application.

    PubMed

    Wang, C; Beadnall, H N; Hatton, S N; Bader, G; Tomic, D; Silva, D G; Barnett, M H

    2016-07-01

    Whole brain volume (WBV) estimates in patients with multiple sclerosis (MS) correlate more robustly with clinical disability than traditional, lesion-based metrics. Numerous algorithms to measure WBV have been developed over the past two decades. We compare Structural Image Evaluation using Normalisation of Atrophy-Cross-sectional (SIENAX) to NeuroQuant and MSmetrix, for assessment of cross-sectional WBV in patients with MS. MRIs from 61 patients with relapsing-remitting MS and 2 patients with clinically isolated syndrome were analysed. WBV measurements were calculated using SIENAX, NeuroQuant and MSmetrix. Statistical agreement between the methods was evaluated using linear regression and Bland-Altman plots. Precision and accuracy of WBV measurement was calculated for (1) NeuroQuant versus SIENAX and (2) MSmetrix versus SIENAX. Precision (Pearson's r) of WBV estimation for NeuroQuant and MSmetrix versus SIENAX was 0.983 and 0.992, respectively. Accuracy (Cb) was 0.871 and 0.994, respectively. NeuroQuant and MSmetrix showed a 5.5% and 1.0% volume difference compared with SIENAX, respectively, that was consistent across low and high values. In the analysed population, NeuroQuant and MSmetrix both quantified cross-sectional WBV with comparable statistical agreement to SIENAX, a well-validated cross-sectional tool that has been used extensively in MS clinical studies. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
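    The precision and accuracy figures quoted (Pearson's r and the bias-correction factor Cb) follow Lin's concordance correlation coefficient, where CCC = r × Cb. The sketch below computes that decomposition on invented paired volume measurements, not on the study's MRI data.

```python
# Lin's concordance correlation coefficient decomposed into precision (Pearson r)
# and accuracy (bias-correction factor Cb), as commonly reported for method
# agreement. The paired "volumes" below are invented.
import numpy as np

rng = np.random.default_rng(3)
truth = rng.normal(1500, 80, 60)                        # e.g. WBV in mL, synthetic
method_a = truth + rng.normal(0, 15, 60)
method_b = 0.98 * truth - 10 + rng.normal(0, 15, 60)    # small systematic bias

def precision_accuracy(x, y):
    mx, my, sx, sy = x.mean(), y.mean(), x.std(), y.std()
    r = np.corrcoef(x, y)[0, 1]                  # precision
    v, u = sx / sy, (mx - my) / np.sqrt(sx * sy)
    cb = 2.0 / (v + 1.0 / v + u**2)              # accuracy (bias correction)
    return r, cb, r * cb                         # CCC = r * Cb

r, cb, ccc = precision_accuracy(method_a, method_b)
print(f"precision r = {r:.3f}, accuracy Cb = {cb:.3f}, CCC = {ccc:.3f}")
```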

  6. [Clinical research XXIII. From clinical judgment to meta-analyses].

    PubMed

    Rivas-Ruiz, Rodolfo; Castelán-Martínez, Osvaldo D; Pérez-Rodríguez, Marcela; Palacios-Cruz, Lino; Noyola-Castillo, Maura E; Talavera, Juan O

    2014-01-01

    Systematic reviews (SR) are studies designed to answer clinical questions on the basis of original articles. Meta-analysis (MTA) is the mathematical analysis of an SR. These analyses are divided into two groups: those that evaluate quantitative variables (for example, the body mass index, BMI) and those that evaluate qualitative variables (for example, whether a patient is alive or dead, or cured or not). Quantitative variables are generally analyzed with the mean difference, while qualitative variables can be analyzed with several calculations: odds ratio (OR), relative risk (RR), absolute risk reduction (ARR) and hazard ratio (HR). These analyses are represented through forest plots, which allow the evaluation of each individual study as well as the heterogeneity between studies and the overall effect of the intervention. These analyses are mainly based on Student's t test and chi-squared. To make appropriate decisions based on the MTA, it is important to understand the characteristics of the statistical methods in order to avoid misinterpretations.

  7. Continuous Covariate Imbalance and Conditional Power for Clinical Trial Interim Analyses

    PubMed Central

    Ciolino, Jody D.; Martin, Renee' H.; Zhao, Wenle; Jauch, Edward C.; Hill, Michael D.; Palesch, Yuko Y.

    2014-01-01

    Oftentimes valid statistical analyses for clinical trials involve adjustment for known influential covariates, regardless of imbalance observed in these covariates at baseline across treatment groups. Thus, it must be the case that valid interim analyses also properly adjust for these covariates. There are situations, however, in which covariate adjustment is not possible, not planned, or simply carries less merit as it makes inferences less generalizable and less intuitive. In this case, covariate imbalance between treatment groups can have a substantial effect on both interim and final primary outcome analyses. This paper illustrates the effect of influential continuous baseline covariate imbalance on unadjusted conditional power (CP), and thus, on trial decisions based on futility stopping bounds. The robustness of the relationship is illustrated for normal, skewed, and bimodal continuous baseline covariates that are related to a normally distributed primary outcome. Results suggest that unadjusted CP calculations in the presence of influential covariate imbalance require careful interpretation and evaluation. PMID:24607294

  8. Terbinafine in the treatment of dermatophyte toenail onychomycosis: a meta-analysis of efficacy for continuous and intermittent regimens.

    PubMed

    Gupta, A K; Paquet, M; Simpson, F; Tavakkol, A

    2013-03-01

    To compare mycological and complete cures of terbinafine continuous and intermittent regimens in the treatment of toenail onychomycosis. The PubMed database was searched using the terms "terbinafine", "onychomycosis", "continuous" and "pulse(d)" or "intermittent". The inclusion criteria were head-to-head comparison of terbinafine pulse and continuous regimens for dermatophyte toenail infections. Risk ratios were calculated for intention-to-treat and evaluable patient analyses, when possible. Pooled estimates for total and subgroup analyses were calculated using a random effect model, Mantel-Haenszel method and their probabilities were calculated with z-statistics. Nine studies from eight publications were included. Two continuous regimens and four intermittent regimens were investigated. A pooled risk ratio of 0.87 was obtained for intention-to-treat (95% CI: 0.79-0.96, P = 0.004, n = 6) and evaluable patient (95% CI: 0.80-0.96, P = 0.003, n = 8) analyses of mycological cure, favouring continuous terbinafine. For complete cure, pooled risk ratios of 0.97 (95% CI: 0.77-1.23, P = 0.82, n = 7) for intention-to-treat and 0.93 (95% CI: 0.76-1.13, P = 0.44, n = 9) for evaluable patient analyses showed equality of the two regimens. The pulse regimen that demonstrated consistently comparable results to the continuous terbinafine regimen was two pulses of terbinafine 250 mg/day for 4 weeks on/4 weeks off. Meta-analysis of published studies of toenail onychomycosis showed that a continuous terbinafine regimen is generally significantly superior to a pulsed terbinafine regimen for mycological cure. In contrast, some pulse terbinafine regimens were as effective as continuous terbinafine regimens for complete cure. © 2012 The Authors. Journal of the European Academy of Dermatology and Venereology © 2012 European Academy of Dermatology and Venereology.

  9. Crimes against the elderly in Italy, 2007-2014.

    PubMed

    Terranova, Claudio; Bevilacqua, Greta; Zen, Margherita; Montisci, Massimo

    2017-08-01

    Crimes against the elderly have physical, psychological, and economic consequences. Approaches for mitigating them must be based on comprehensive knowledge of the phenomenon. This study analyses crimes against the elderly in Italy during the period 2007-2014 from an epidemiological viewpoint. Data on violent and non-violent crimes derived from the Italian Institute of Statistics were analysed in relation to trends, gender and age by linear regression, T-test, and calculation of the odds ratio with a 95% confidence interval. Results show that the elderly are at higher risk of being victimized in two types of crime, violent (residential robbery) and non-violent (pick-pocketing and purse-snatching) compared with other age groups during the period considered. A statistically significant increase in residential robbery and pick-pocketing was also observed. The rate of homicide against the elderly was stable during the study period, in contrast with reduced rates in other age groups. These results may be explained by risk factors increasing the profiles of elderly individuals as potential victims, such as frailty, cognitive impairment, and social isolation. Further studies analysing the characteristics of victims are required. Based on the results presented here, appropriate preventive strategies should be planned to reduce crimes against the elderly. Copyright © 2017 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  10. Cross-sectional and longitudinal evaluation of liver volume and total liver fat burden in adults with nonalcoholic steatohepatitis

    PubMed Central

    Tang, An; Chen, Joshua; Le, Thuy-Anh; Changchien, Christopher; Hamilton, Gavin; Middleton, Michael S.; Loomba, Rohit; Sirlin, Claude B.

    2014-01-01

    Purpose To explore the cross-sectional and longitudinal relationships between fractional liver fat content, liver volume, and total liver fat burden. Methods In 43 adults with non-alcoholic steatohepatitis participating in a clinical trial, liver volume was estimated by segmentation of magnitude-based low-flip-angle multiecho GRE images. The liver mean proton density fat fraction (PDFF) was calculated. The total liver fat index (TLFI) was estimated as the product of liver mean PDFF and liver volume. Linear regression analyses were performed. Results Cross-sectional analyses revealed statistically significant relationships between TLFI and liver mean PDFF (R² = 0.740 baseline/0.791 follow-up, P < 0.001 baseline/P < 0.001 follow-up), and between TLFI and liver volume (R² = 0.352/0.452, P < 0.001/< 0.001). Longitudinal analyses revealed statistically significant relationships between liver volume change and liver mean PDFF change (R² = 0.556, P < 0.001), between TLFI change and liver mean PDFF change (R² = 0.920, P < 0.001), and between TLFI change and liver volume change (R² = 0.735, P < 0.001). Conclusion Liver segmentation in combination with MRI-based PDFF estimation may be used to monitor liver volume, liver mean PDFF, and TLFI in a clinical trial. PMID:25015398

  11. Interim analyses in 2 x 2 crossover trials.

    PubMed

    Cook, R J

    1995-09-01

    A method is presented for performing interim analyses in long term 2 x 2 crossover trials with serial patient entry. The analyses are based on a linear statistic that combines data from individuals observed for one treatment period with data from individuals observed for both periods. The coefficients in this linear combination can be chosen quite arbitrarily, but we focus on variance-based weights to maximize power for tests regarding direct treatment effects. The type I error rate of this procedure is controlled by utilizing the joint distribution of the linear statistics over analysis stages. Methods for performing power and sample size calculations are indicated. A two-stage sequential design involving simultaneous patient entry and a single between-period interim analysis is considered in detail. The power and average number of measurements required for this design are compared to those of the usual crossover trial. The results indicate that, while there is minimal loss in power relative to the usual crossover design in the absence of differential carry-over effects, the proposed design can have substantially greater power when differential carry-over effects are present. The two-stage crossover design can also lead to more economical studies in terms of the expected number of measurements required, due to the potential for early stopping. Attention is directed toward normally distributed responses.

  12. Periodontal disease and carotid atherosclerosis: A meta-analysis of 17,330 participants.

    PubMed

    Zeng, Xian-Tao; Leng, Wei-Dong; Lam, Yat-Yin; Yan, Bryan P; Wei, Xue-Mei; Weng, Hong; Kwong, Joey S W

    2016-01-15

    The association between periodontal disease and carotid atherosclerosis has been evaluated primarily in single-center studies, and whether periodontal disease is an independent risk factor of carotid atherosclerosis remains uncertain. This meta-analysis aimed to evaluate the association between periodontal disease and carotid atherosclerosis. We searched PubMed and Embase for relevant observational studies up to February 20, 2015. Two authors independently extracted data from included studies, and odds ratios (ORs) with 95% confidence intervals (CIs) were calculated for overall and subgroup meta-analyses. Statistical heterogeneity was assessed by the chi-squared test (P<0.1 for statistical significance) and quantified by the I² statistic. Data analysis was conducted using the Comprehensive Meta-Analysis (CMA) software. Fifteen observational studies involving 17,330 participants were included in the meta-analysis. The overall pooled result showed that periodontal disease was associated with carotid atherosclerosis (OR: 1.27, 95% CI: 1.14-1.41; P<0.001) but statistical heterogeneity was substantial (I²=78.90%). Subgroup analysis of adjusted smoking and diabetes mellitus showed borderline significance (OR: 1.08; 95% CI: 1.00-1.18; P=0.05). Sensitivity and cumulative analyses both indicated that our results were robust. Findings of our meta-analysis indicated that the presence of periodontal disease was associated with carotid atherosclerosis; however, further large-scale, well-conducted clinical studies are needed to explore the precise risk of developing carotid atherosclerosis in patients with periodontal disease. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
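
    The sketch below illustrates the heterogeneity quantities mentioned above (Cochran Q and I²) together with DerSimonian-Laird random-effects pooling of log odds ratios; the effect sizes and variances are invented and the code is a generic illustration, not a reproduction of the CMA analysis.

```python
# Sketch: DerSimonian-Laird random-effects pooling of log odds ratios, with the
# Cochran Q heterogeneity statistic and I^2. Effect sizes and variances are invented.
import math

log_or = [0.25, 0.10, 0.40, 0.18, 0.32]        # study log odds ratios (illustrative)
var = [0.04, 0.02, 0.09, 0.03, 0.05]           # within-study variances (illustrative)

w = [1 / v for v in var]                        # fixed-effect weights
mu_fe = sum(wi * y for wi, y in zip(w, log_or)) / sum(w)
Q = sum(wi * (y - mu_fe) ** 2 for wi, y in zip(w, log_or))
k = len(log_or)
I2 = max(0.0, (Q - (k - 1)) / Q) * 100          # heterogeneity as a percentage

# DerSimonian-Laird estimate of the between-study variance tau^2
C = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (Q - (k - 1)) / C)

w_re = [1 / (v + tau2) for v in var]            # random-effects weights
mu_re = sum(wi * y for wi, y in zip(w_re, log_or)) / sum(w_re)
se_re = math.sqrt(1 / sum(w_re))
print(f"Q = {Q:.2f}, I^2 = {I2:.1f}%, tau^2 = {tau2:.3f}")
print(f"Pooled OR = {math.exp(mu_re):.2f} "
      f"(95% CI {math.exp(mu_re - 1.96*se_re):.2f}-{math.exp(mu_re + 1.96*se_re):.2f})")
```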

  13. Analysis by the Residual Method for Estimate Market Value of Land on the Areas with Mining Exploitation in Subsoil under Future New Building

    NASA Astrophysics Data System (ADS)

    Gwozdz-Lason, Monika

    2017-12-01

    This paper attempts to answer the following questions: what is the main selling advantage of a plot of land in areas with mining exploitation in the subsoil? Which attributes influence market value the most? And how can the influence of mining in the subsoil under a future new building be expressed in the market value of a plot with commercial use? This focus is not accidental, as the paper sets out to prove that the subsoil load-bearing capacity, as directly inferred from the local geotechnical properties affected by mining exploitation, considerably influences the market value of this type of real estate. The analyses and calculations presented here are part of ongoing development work aimed at proposing a new technology and procedures for estimating the value of land belonging to the third geotechnical category. The question was examined both theoretically and empirically. Results and final conclusions were defined on the basis of the analysed calculations in the residual method, supported by numerical, statistical and econometric analyses. A market analysis yielded a group of subsoil stabilization costs which depend on the interaction of mining operations, the subsoil parameters, the type of the contemplated structure, its foundations, the selected stabilization method, and its overall area and shape.

  14. Mars Exploration Rovers Landing Dispersion Analysis

    NASA Technical Reports Server (NTRS)

    Knocke, Philip C.; Wawrzyniak, Geoffrey G.; Kennedy, Brian M.; Desai, Prasun N.; Parker, Timothy J.; Golombek, Matthew P.; Duxbury, Thomas C.; Kass, David M.

    2004-01-01

    Landing dispersion estimates for the Mars Exploration Rover missions were key elements in the site targeting process and in the evaluation of landing risk. This paper addresses the process and results of the landing dispersion analyses performed for both Spirit and Opportunity. The several contributors to landing dispersions (navigation and atmospheric uncertainties, spacecraft modeling, winds, and margins) are discussed, as are the analysis tools used. JPL's MarsLS program, a MATLAB-based landing dispersion visualization and statistical analysis tool, was used to calculate the probability of landing within hazardous areas. By convolving this with the probability of landing within flight system limits (in-spec landing) for each hazard area, a single overall measure of landing risk was calculated for each landing ellipse. In-spec probability contours were also generated, allowing a more synoptic view of site risks, illustrating the sensitivity to changes in landing location, and quantifying the possible consequences of anomalies such as incomplete maneuvers. Data and products required to support these analyses are described, including the landing footprints calculated by NASA Langley's POST program and JPL's AEPL program, cartographically registered base maps and hazard maps, and flight system estimates of in-spec landing probabilities for each hazard terrain type. Various factors encountered during operations, including evolving navigation estimates and changing atmospheric models, are discussed and final landing points are compared with approach estimates.

  15. EvolQG - An R package for evolutionary quantitative genetics

    PubMed Central

    Melo, Diogo; Garcia, Guilherme; Hubbe, Alex; Assis, Ana Paula; Marroig, Gabriel

    2016-01-01

    We present an open source package for performing evolutionary quantitative genetics analyses in the R environment for statistical computing. Evolutionary theory shows that evolution depends critically on the available variation in a given population. When dealing with many quantitative traits this variation is expressed in the form of a covariance matrix, particularly the additive genetic covariance matrix or sometimes the phenotypic matrix, when the genetic matrix is unavailable and there is evidence the phenotypic matrix is sufficiently similar to the genetic matrix. Given this mathematical representation of available variation, the EvolQG package provides functions for calculation of relevant evolutionary statistics; estimation of sampling error; corrections for this error; matrix comparison via correlations, distances and matrix decomposition; analysis of modularity patterns; and functions for testing evolutionary hypotheses on taxa diversification. PMID:27785352

  16. Evaluation of neutron total and capture cross sections on 99Tc in the unresolved resonance region

    NASA Astrophysics Data System (ADS)

    Iwamoto, Nobuyuki; Katabuchi, Tatsuya

    2017-09-01

    Long-lived fission product Technetium-99 is one of the most important radioisotopes for nuclear transmutation. Reliable nuclear data over a wide energy range up to a few MeV are indispensable for developing environmental-load-reducing technology. The statistical analyses of resolved resonances were performed by using the truncated Porter-Thomas distribution, a coupled-channels optical model, a nuclear level density model and Bayes' theorem on conditional probability. The total and capture cross sections were calculated by the nuclear reaction model code CCONE. The resulting cross sections are statistically consistent between the resolved and unresolved resonance regions. The evaluated capture data reproduce those recently measured at ANNRI of J-PARC/MLF above the resolved resonance region up to 800 keV.

  17. Evaluation of the validity of the Bolton Index using cone-beam computed tomography (CBCT)

    PubMed Central

    Llamas, José M.; Cibrián, Rosa; Gandía, José L.; Paredes, Vanessa

    2012-01-01

    Aims: To evaluate the reliability and reproducibility of calculating the Bolton Index using cone-beam computed tomography (CBCT), and to compare this with measurements obtained using the 2D Digital Method. Material and Methods: Traditional study models were obtained from 50 patients, which were then digitized in order to be able to measure them using the Digital Method. Likewise, CBCTs of those same patients were undertaken using the Dental Picasso Master 3D® and the images obtained were then analysed using the InVivoDental programme. Results: By determining the regression lines for both measurement methods, as well as the difference between both of their values, the two methods are shown to be comparable, despite the fact that the measurements analysed presented statistically significant differences. Conclusions: The three-dimensional models obtained from the CBCT are as accurate and reproducible as the digital models obtained from the plaster study casts for calculating the Bolton Index. The differences existing between both methods were clinically acceptable. Key words: Tooth size, digital models, Bolton Index, CBCT. PMID:22549690

  18. Fruit and vegetable intake and risk of breast cancer by hormone receptor status.

    PubMed

    Jung, Seungyoun; Spiegelman, Donna; Baglietto, Laura; Bernstein, Leslie; Boggs, Deborah A; van den Brandt, Piet A; Buring, Julie E; Cerhan, James R; Gaudet, Mia M; Giles, Graham G; Goodman, Gary; Hakansson, Niclas; Hankinson, Susan E; Helzlsouer, Kathy; Horn-Ross, Pamela L; Inoue, Manami; Krogh, Vittorio; Lof, Marie; McCullough, Marjorie L; Miller, Anthony B; Neuhouser, Marian L; Palmer, Julie R; Park, Yikyung; Robien, Kim; Rohan, Thomas E; Scarmo, Stephanie; Schairer, Catherine; Schouten, Leo J; Shikany, James M; Sieri, Sabina; Tsugane, Schoichiro; Visvanathan, Kala; Weiderpass, Elisabete; Willett, Walter C; Wolk, Alicja; Zeleniuch-Jacquotte, Anne; Zhang, Shumin M; Zhang, Xuehong; Ziegler, Regina G; Smith-Warner, Stephanie A

    2013-02-06

    Estrogen receptor-negative (ER(-)) breast cancer has few known or modifiable risk factors. Because ER(-) tumors account for only 15% to 20% of breast cancers, large pooled analyses are necessary to evaluate precisely the suspected inverse association between fruit and vegetable intake and risk of ER(-) breast cancer. Among 993 466 women followed for 11 to 20 years in 20 cohort studies, we documented 19 869 estrogen receptor positive (ER(+)) and 4821 ER(-) breast cancers. We calculated study-specific multivariable relative risks (RRs) and 95% confidence intervals (CIs) using Cox proportional hazards regression analyses and then combined them using a random-effects model. All statistical tests were two-sided. Total fruit and vegetable intake was statistically significantly inversely associated with risk of ER(-) breast cancer but not with risk of breast cancer overall or of ER(+) tumors. The inverse association for ER(-) tumors was observed primarily for vegetable consumption. The pooled relative risks comparing the highest vs lowest quintile of total vegetable consumption were 0.82 (95% CI = 0.74 to 0.90) for ER(-) breast cancer and 1.04 (95% CI = 0.97 to 1.11) for ER(+) breast cancer (P (common-effects) by ER status < .001). Total fruit consumption was non-statistically significantly associated with risk of ER(-) breast cancer (pooled multivariable RR comparing the highest vs lowest quintile = 0.94, 95% CI = 0.85 to 1.04). We observed no association between total fruit and vegetable intake and risk of overall breast cancer. However, vegetable consumption was inversely associated with risk of ER(-) breast cancer in our large pooled analyses.

  19. Behavior, sensitivity, and power of activation likelihood estimation characterized by massive empirical simulation.

    PubMed

    Eickhoff, Simon B; Nichols, Thomas E; Laird, Angela R; Hoffstaedter, Felix; Amunts, Katrin; Fox, Peter T; Bzdok, Danilo; Eickhoff, Claudia R

    2016-08-15

    Given the increasing number of neuroimaging publications, the automated knowledge extraction on brain-behavior associations by quantitative meta-analyses has become a highly important and rapidly growing field of research. Among several methods to perform coordinate-based neuroimaging meta-analyses, Activation Likelihood Estimation (ALE) has been widely adopted. In this paper, we addressed two pressing questions related to ALE meta-analysis: i) Which thresholding method is most appropriate to perform statistical inference? ii) Which sample size, i.e., number of experiments, is needed to perform robust meta-analyses? We provided quantitative answers to these questions by simulating more than 120,000 meta-analysis datasets using empirical parameters (i.e., number of subjects, number of reported foci, distribution of activation foci) derived from the BrainMap database. This allowed us to characterize the behavior of ALE analyses, to derive first power estimates for neuroimaging meta-analyses, and thus to formulate recommendations for future ALE studies. We could show as a first consequence that cluster-level family-wise error (FWE) correction represents the most appropriate method for statistical inference, while voxel-level FWE correction is valid but more conservative. In contrast, uncorrected inference and false-discovery rate correction should be avoided. As a second consequence, researchers should aim to include at least 20 experiments into an ALE meta-analysis to achieve sufficient power for moderate effects. We would like to note, though, that these calculations and recommendations are specific to ALE and may not be extrapolated to other approaches for (neuroimaging) meta-analysis. Copyright © 2016 Elsevier Inc. All rights reserved.

  20. Behavior, Sensitivity, and power of activation likelihood estimation characterized by massive empirical simulation

    PubMed Central

    Eickhoff, Simon B.; Nichols, Thomas E.; Laird, Angela R.; Hoffstaedter, Felix; Amunts, Katrin; Fox, Peter T.

    2016-01-01

    Given the increasing number of neuroimaging publications, the automated knowledge extraction on brain-behavior associations by quantitative meta-analyses has become a highly important and rapidly growing field of research. Among several methods to perform coordinate-based neuroimaging meta-analyses, Activation Likelihood Estimation (ALE) has been widely adopted. In this paper, we addressed two pressing questions related to ALE meta-analysis: i) Which thresholding method is most appropriate to perform statistical inference? ii) Which sample size, i.e., number of experiments, is needed to perform robust meta-analyses? We provided quantitative answers to these questions by simulating more than 120,000 meta-analysis datasets using empirical parameters (i.e., number of subjects, number of reported foci, distribution of activation foci) derived from the BrainMap database. This allowed us to characterize the behavior of ALE analyses, to derive first power estimates for neuroimaging meta-analyses, and thus to formulate recommendations for future ALE studies. We could show as a first consequence that cluster-level family-wise error (FWE) correction represents the most appropriate method for statistical inference, while voxel-level FWE correction is valid but more conservative. In contrast, uncorrected inference and false-discovery rate correction should be avoided. As a second consequence, researchers should aim to include at least 20 experiments into an ALE meta-analysis to achieve sufficient power for moderate effects. We would like to note, though, that these calculations and recommendations are specific to ALE and may not be extrapolated to other approaches for (neuroimaging) meta-analysis. PMID:27179606

  1. Statistical aspects of genetic association testing in small samples, based on selective DNA pooling data in the arctic fox.

    PubMed

    Szyda, Joanna; Liu, Zengting; Zatoń-Dobrowolska, Magdalena; Wierzbicki, Heliodor; Rzasa, Anna

    2008-01-01

    We analysed data from a selective DNA pooling experiment with 130 individuals of the arctic fox (Alopex lagopus), which originated from 2 types differing in body size. The association between alleles of 6 selected unlinked molecular markers and body size was tested by using univariate and multinomial logistic regression models, applying the odds ratio and test statistics from the power divergence family. Due to the small sample size and the resulting sparseness of the data table, in hypothesis testing we could not rely on the asymptotic distributions of the tests. Instead, we tried to account for data sparseness by (i) modifying the confidence intervals of the odds ratio; (ii) using a normal approximation of the asymptotic distribution of the power divergence tests with different approaches for calculating moments of the statistics; and (iii) assessing P values empirically, based on bootstrap samples. As a result, a significant association was observed for 3 markers. Furthermore, we used simulations to assess the validity of the normal approximation of the asymptotic distribution of the test statistics under the conditions of small and sparse samples.
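
    A small sketch of the idea of empirical P values when asymptotic distributions are unreliable in small, sparse samples: here a parametric resampling scheme under a pooled null proportion stands in for the authors' bootstrap, and all counts are invented.

```python
# Sketch: empirical p-value from resampling under the null, as an alternative to
# relying on the asymptotic chi-squared distribution in small, sparse samples.
import numpy as np

rng = np.random.default_rng(1)

def chi2_stat(a, n1, c, n2):
    """Pearson chi-squared statistic for a 2x2 table given events and totals."""
    table = np.array([[a, n1 - a], [c, n2 - c]], dtype=float)
    if (table.sum(axis=0) == 0).any() or (table.sum(axis=1) == 0).any():
        return 0.0                               # degenerate table: no evidence against H0
    expected = np.outer(table.sum(axis=1), table.sum(axis=0)) / table.sum()
    return float(((table - expected) ** 2 / expected).sum())

a, n1, c, n2 = 8, 10, 3, 10                      # illustrative small-sample counts
t_obs = chi2_stat(a, n1, c, n2)

p_null = (a + c) / (n1 + n2)                     # pooled proportion under H0
n_rep = 20_000
t_sim = [chi2_stat(rng.binomial(n1, p_null), n1, rng.binomial(n2, p_null), n2)
         for _ in range(n_rep)]
p_emp = (np.sum(np.array(t_sim) >= t_obs) + 1) / (n_rep + 1)
print(f"observed chi2 = {t_obs:.2f}, empirical p = {p_emp:.4f}")
```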

  2. Human movement stochastic variability leads to diagnostic biomarkers In Autism Spectrum Disorders (ASD)

    NASA Astrophysics Data System (ADS)

    Wu, Di; Torres, Elizabeth B.; Jose, Jorge V.

    2015-03-01

    ASD is a spectrum of neurodevelopmental disorders. The high heterogeneity of the symptoms associated with the disorder impedes efficient diagnoses based on human observations. Recent advances with high-resolution MEM wearable sensors enable accurate movement measurements that may escape the naked eye. This calls for objective metrics to extract physiologically relevant information from the rapidly accumulating data. In this talk we discuss the statistical analysis of movement data continuously collected with high-resolution sensors at 240 Hz. We calculated statistical properties of speed fluctuations within the millisecond time range that closely correlate with the subjects' cognitive abilities. We computed the periodicity and synchronicity of the speed fluctuations from their power spectrum and ensemble-averaged two-point cross-correlation function. We built a two-parameter phase space from the temporal statistical analyses of the nearest-neighbor fluctuations that provided a quantitative biomarker for ASD and normal adult subjects and further classified ASD severity. We also found age-related developmental statistical signatures and potential ASD parental links in our movement dynamics studies. Our results may have direct clinical applications.

  3. Satellite disintegration dynamics

    NASA Technical Reports Server (NTRS)

    Dasenbrock, R. R.; Kaufman, B.; Heard, W. B.

    1975-01-01

    The subject of satellite disintegration is examined in detail. Elements of the orbits of individual fragments, determined by DOD space surveillance systems, are used to accurately predict the time and place of fragmentation. Dual time independent and time dependent analyses are performed for simulated and real breakups. Methods of statistical mechanics are used to study the evolution of the fragment clouds. The fragments are treated as an ensemble of non-interacting particles. A solution of Liouville's equation is obtained which enables the spatial density to be calculated as a function of position, time and initial velocity distribution.

  4. STR data for 15 autosomal STR markers from Paraná (Southern Brazil).

    PubMed

    Alves, Hemerson B; Leite, Fábio P N; Sotomaior, Vanessa S; Rueda, Fábio F; Silva, Rosane; Moura-Neto, Rodrigo S

    2014-03-01

    Allelic frequencies for 15 autosomal STR loci, typed with AmpFℓSTR® Identifiler™, and forensic and statistical parameters were calculated. All loci were in Hardy-Weinberg equilibrium. The combined power of discrimination and mean power of exclusion were 0.999999999999999999 and 0.9999993, respectively. The MDS plot and NJ tree analysis, generated from the FST matrix, corroborated the notion that the Paraná population is mainly of European origin. The combination of these 15 STR loci represents a powerful strategy for individual identification and parentage analyses in the Paraná population.
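
    A minimal sketch, under assumed genotype counts, of two of the quantities reported above: a Hardy-Weinberg equilibrium chi-squared check for one STR locus and the locus power of discrimination. The genotype data are invented, and the simple chi-squared test is used in place of the exact tests often preferred for STR data.

```python
# Sketch: Hardy-Weinberg equilibrium check for a single STR locus from genotype
# counts, plus the locus power of discrimination. Genotype counts are invented.
from collections import Counter
from itertools import combinations_with_replacement

from scipy.stats import chi2

# genotypes observed at one locus (illustrative), as (allele1, allele2) pairs
genotypes = ([(12, 12)] * 10 + [(12, 13)] * 25 + [(13, 13)] * 15
             + [(12, 14)] * 8 + [(13, 14)] * 10 + [(14, 14)] * 2)
n = len(genotypes)

# allele frequencies
alleles = Counter(a for g in genotypes for a in g)
freq = {a: c / (2 * n) for a, c in alleles.items()}

# expected genotype counts under HWE
geno_counts = Counter(tuple(sorted(g)) for g in genotypes)
chi2_stat, df = 0.0, 0
for a, b in combinations_with_replacement(sorted(freq), 2):
    p = freq[a] ** 2 if a == b else 2 * freq[a] * freq[b]
    expected = p * n
    observed = geno_counts.get((a, b), 0)
    chi2_stat += (observed - expected) ** 2 / expected
    df += 1
df -= len(freq)                 # degrees of freedom: genotype classes minus alleles
p_value = chi2.sf(chi2_stat, df)

# power of discrimination for the locus: 1 - sum of squared genotype frequencies
pd = 1 - sum((c / n) ** 2 for c in geno_counts.values())
print(f"HWE chi2 = {chi2_stat:.2f} (df = {df}), p = {p_value:.3f}, PD = {pd:.3f}")
```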

  5. Analysis of data from NASA B-57B gust gradient program

    NASA Technical Reports Server (NTRS)

    Frost, W.; Lin, M. C.; Chang, H. P.; Ringnes, E.

    1985-01-01

    Statistical analysis of the turbulence measured in flight 6 of the NASA B-57B over Denver, Colorado, from July 7 to July 23, 1982 included calculations of average turbulence parameters, integral length scales, probability density functions, single-point autocorrelation coefficients, two-point autocorrelation coefficients, normalized autospectra, normalized two-point autospectra, and two-point cross spectra for gust velocities. The single-point autocorrelation coefficients were compared with the theoretical model developed by von Karman. Theoretical analyses were developed which address the effects of spanwise gust distributions, using two-point spatial turbulence correlations.

  6. Coding completeness and quality of relative survival-related variables in the National Program of Cancer Registries Cancer Surveillance System, 1995-2008.

    PubMed

    Wilson, Reda J; O'Neil, M E; Ntekop, E; Zhang, Kevin; Ren, Y

    2014-01-01

    Calculating accurate estimates of cancer survival is important for various analyses of cancer patient care and prognosis. Current US survival rates are estimated based on data from the National Cancer Institute's (NCI's) Surveillance, Epidemiology, and End Results (SEER) program, covering approximately 28 percent of the US population. The National Program of Cancer Registries (NPCR) covers about 96 percent of the US population. Using a population-based database with greater US population coverage to calculate survival rates at the national, state, and regional levels can further enhance the effective monitoring of cancer patient care and prognosis in the United States. The first step is to establish the coding completeness and coding quality of the NPCR data needed for calculating survival rates and conducting related validation analyses. Using data from the NPCR-Cancer Surveillance System (CSS) from 1995 through 2008, we assessed coding completeness and quality on 26 data elements that are needed to calculate cancer relative survival estimates and conduct related analyses. Data elements evaluated consisted of demographic, follow-up, prognostic, and cancer identification variables. Analyses were performed showing trends of these variables by diagnostic year, state of residence at diagnosis, and cancer site. Mean overall percent coding completeness by each NPCR central cancer registry averaged across all data elements and diagnosis years ranged from 92.3 percent to 100 percent. Results showing the mean percent coding completeness for the relative survival-related variables in NPCR data are presented. All data elements but 1 have a mean coding completeness greater than 90 percent, as was the mean completeness by data item group type. Statistically significant differences in coding completeness were found in the ICD revision number, cause of death, vital status, and date of last contact variables when comparing diagnosis years. The majority of data items had a coding quality greater than 90 percent, with exceptions found in cause of death, follow-up source, and the SEER Summary Stage 1977 and SEER Summary Stage 2000 variables. Percent coding completeness and quality are very high for variables in the NPCR-CSS that are covariates for calculating relative survival. NPCR provides the opportunity to calculate relative survival estimates that may be more generalizable to the US population.

  7. Lattice QCD and nucleon resonances

    NASA Astrophysics Data System (ADS)

    Edwards, R. G.; Fiebig, H. R.; Fleming, G.; Richards, D. G.; LHP Collaboration

    2004-06-01

    Lattice calculations provide an ab initio means for the study of QCD. Recent progress at understanding the spectrum and structure of nucleons from lattice QCD studies is reviewed. Measurements of the masses of the lightest particles for the lowest spin values are described and related to predictions of the quark model. Measurements of the mass of the first radial excitation of the nucleon, the so-called Roper resonance, obtained using Bayesian statistical analyses, are detailed. The need to perform calculations at realistically light values of the pion mass is emphasised, and the exciting progress at attaining such masses is outlined. The talk concludes with future prospects, emphasising the importance of constructing a basis of interpolating operators that is sensitive to three-quark states, to multi-quark states, and to excited glue.

  8. Estimates of the seasonal mean vertical velocity fields of the extratropical Northern Hemisphere

    NASA Technical Reports Server (NTRS)

    White, G. H.

    1983-01-01

    Indirect methods are employed to estimate the wintertime and summertime mean vertical velocity fields of the extratropical Northern Hemisphere and intercomparisons are made, together with comparisons with mean seasonal patterns of cloudiness and precipitation. Twice-daily NMC operational analyses produced general circulation statistics for 11 winters and 12 summers, permitting calculation of the seasonal NMC averages for 6 hr forecasts, solution of the omega equation, integration of continuity equation downward from 100 mb, and solution of the thermodynamic energy equation in the absence of diabatic heating. The methods all yielded similar vertical velocity patterns; however, the magnitude of the vertical velocities could not be calculated with great accuracy. Orography was concluded to have less of an effect in summer than in winter, when winds are stronger.

  9. Ab initio Study on Ionization Energies of 3-Amino-1-propanol

    NASA Astrophysics Data System (ADS)

    Wang, Ke-dong; Jia, Ying-bin; Lai, Zhen-jiang; Liu, Yu-fang

    2011-06-01

    Fourteen conformers of 3-amino-1-propanol as the minima on the potential energy surface are examined at the MP2/6-311++G** level. Their relative energies calculated at the B3LYP, MP3 and MP4 levels of theory indicated that the two most stable conformers display intramolecular OH···N hydrogen bonds. The vertical ionization energies of these conformers calculated with ab initio electron propagator theory in the P3/aug-cc-pVTZ approximation are in agreement with experimental data from photoelectron spectroscopy. Natural bond orbital analyses were used to explain the differences in the ionization energies of the highest occupied molecular orbital of the conformers. Combined with statistical mechanics principles, conformational distributions at various temperatures are obtained and the temperature dependence of the photoelectron spectra is interpreted.

  10. Statistics for NAEG: past efforts, new results, and future plans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, R.O.; Simpson, J.C.; Kinnison, R.R.

    A brief review of Nevada Applied Ecology Group (NAEG) objectives is followed by a summary of past statistical analyses conducted by Pacific Northwest Laboratory for the NAEG. Estimates of spatial pattern of radionuclides and other statistical analyses at NS's 201, 219 and 221 are reviewed as background for new analyses presented in this paper. Suggested NAEG activities and statistical analyses needed for the projected termination date of NAEG studies in March 1986 are given.

  11. Evaluation of Evidence of Statistical Support and Corroboration of Subgroup Claims in Randomized Clinical Trials.

    PubMed

    Wallach, Joshua D; Sullivan, Patrick G; Trepanowski, John F; Sainani, Kristin L; Steyerberg, Ewout W; Ioannidis, John P A

    2017-04-01

    Many published randomized clinical trials (RCTs) make claims for subgroup differences. To evaluate how often subgroup claims reported in the abstracts of RCTs are actually supported by statistical evidence (P < .05 from an interaction test) and corroborated by subsequent RCTs and meta-analyses. This meta-epidemiological survey examines data sets of trials with at least 1 subgroup claim, including Subgroup Analysis of Trials Is Rarely Easy (SATIRE) articles and Discontinuation of Randomized Trials (DISCO) articles. We used Scopus (updated July 2016) to search for English-language articles citing each of the eligible index articles with at least 1 subgroup finding in the abstract. Articles with a subgroup claim in the abstract with or without evidence of statistical heterogeneity (P < .05 from an interaction test) in the text and articles attempting to corroborate the subgroup findings. Study characteristics of trials with at least 1 subgroup claim in the abstract were recorded. Two reviewers extracted the data necessary to calculate subgroup-level effect sizes, standard errors, and the P values for interaction. For individual RCTs and meta-analyses that attempted to corroborate the subgroup findings from the index articles, trial characteristics were extracted. Cochran Q test was used to reevaluate heterogeneity with the data from all available trials. The number of subgroup claims in the abstracts of RCTs, the number of subgroup claims in the abstracts of RCTs with statistical support (subgroup findings), and the number of subgroup findings corroborated by subsequent RCTs and meta-analyses. Sixty-four eligible RCTs made a total of 117 subgroup claims in their abstracts. Of these 117 claims, only 46 (39.3%) in 33 articles had evidence of statistically significant heterogeneity from a test for interaction. In addition, out of these 46 subgroup findings, only 16 (34.8%) ensured balance between randomization groups within the subgroups (eg, through stratified randomization), 13 (28.3%) entailed a prespecified subgroup analysis, and 1 (2.2%) was adjusted for multiple testing. Only 5 (10.9%) of the 46 subgroup findings had at least 1 subsequent pure corroboration attempt by a meta-analysis or an RCT. In all 5 cases, the corroboration attempts found no evidence of a statistically significant subgroup effect. In addition, all effect sizes from meta-analyses were attenuated toward the null. A minority of subgroup claims made in the abstracts of RCTs are supported by their own data (ie, a significant interaction effect). For those that have statistical support (P < .05 from an interaction test), most fail to meet other best practices for subgroup tests, including prespecification, stratified randomization, and adjustment for multiple testing. Attempts to corroborate statistically significant subgroup differences are rare; when done, the initially observed subgroup differences are not reproduced.
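
    The statistical support criterion used above (P < .05 from an interaction test) can be illustrated with a simple z-test comparing two subgroup effect estimates; the effect sizes and standard errors below are invented, and the test shown is the generic Wald-type comparison rather than the exact procedure of any reviewed trial.

```python
# Sketch: a simple z-test for interaction between two subgroup effect estimates,
# i.e. whether the subgroup-specific effects differ more than chance would allow.
# The effect sizes and standard errors below are invented.
import math
from scipy.stats import norm

# subgroup-level log hazard ratios and their standard errors (illustrative)
effect_a, se_a = -0.35, 0.12     # e.g. men
effect_b, se_b = -0.05, 0.15     # e.g. women

z = (effect_a - effect_b) / math.sqrt(se_a ** 2 + se_b ** 2)
p_interaction = 2 * norm.sf(abs(z))
print(f"z = {z:.2f}, P for interaction = {p_interaction:.3f}")
```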

  12. Bias and precision of selected analytes reported by the National Atmospheric Deposition Program and National Trends Network, 1984

    USGS Publications Warehouse

    Brooks, M.H.; Schroder, L.J.; Willoughby, T.C.

    1987-01-01

    The U.S. Geological Survey operated a blind audit sample program during 1984 to test the effects of the sample handling and shipping procedures used by the National Atmospheric Deposition Program and National Trends Network on the quality of wet deposition data produced by the combined networks. Blind audit samples, which were dilutions of standard reference water samples, were submitted by network site operators to the central analytical laboratory disguised as actual wet deposition samples. Results from the analyses of blind audit samples were used to calculate estimates of analyte bias associated with all network wet deposition samples analyzed in 1984 and to estimate analyte precision. Concentration differences between double-blind samples that were submitted to the central analytical laboratory and separate analyses of aliquots of those blind audit samples that had not undergone network sample handling and shipping were used to calculate analyte masses that apparently were added to each blind audit sample by routine network handling and shipping procedures. These calculated masses indicated statistically significant biases for magnesium, sodium, potassium, chloride, and sulfate. Median calculated masses were 41.4 micrograms (ug) for calcium, 14.9 ug for magnesium, 23.3 ug for sodium, 0.7 ug for potassium, 16.5 ug for chloride and 55.3 ug for sulfate. Analyte precision was estimated using two different sets of replicate measures performed by the central analytical laboratory. Estimated standard deviations were similar to those previously reported. (Author's abstract)

  13. Modeling radiation loads in the ILC main linac and a novel approach to treat dark current

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mokhov, Nilolai V.; Rakhno, Igor L.; Tropin, Igor S.

    Electromagnetic and hadron showers generated by electrons of dark current (DC) can represent a significant radiation threat to the ILC linac equipment and personnel. In this study, a commissioning scenario is analysed which is considered as the worst-case scenario for the main linac regarding the DC contribution to the radiation environment in the tunnel. A normal operation scenario is analysed as well. An emphasis is made on radiation load to sensitive electronic equipment—cryogenic thermometers inside the cryomodules. Prompt and residual dose rates in the ILC main linac tunnels were also calculated in these new high-statistics runs. A novel approach was developed—as a part of general purpose Monte Carlo code MARS15—to model generation, acceleration and transport of DC electrons in electromagnetic fields inside SRF cavities. Comparisons were made with a standard approach when a set of pre-calculated DC electron trajectories is used, with a proper normalization, as a source for Monte Carlo modelling. Results of MARS15 Monte Carlo calculations, performed for the current main linac tunnel design, reveal that the peak absorbed dose in the cryogenic thermometers in the main tunnel for 20 years of operation is about 0.8 MGy. The calculated contact residual dose on cryomodules and tunnel walls in the main tunnel for typical irradiation and cooling conditions is 0.1 and 0.01 mSv/hr, respectively.

  14. Size and shape measurement in contemporary cephalometrics.

    PubMed

    McIntyre, Grant T; Mossey, Peter A

    2003-06-01

    The traditional method of analysing cephalograms--conventional cephalometric analysis (CCA)--involves the calculation of linear distance measurements, angular measurements, area measurements, and ratios. Because shape information cannot be determined from these 'size-based' measurements, an increasing number of studies employ geometric morphometric tools in the cephalometric analysis of craniofacial morphology. Most of the discussions surrounding the appropriateness of CCA, Procrustes superimposition, Euclidean distance matrix analysis (EDMA), thin-plate spline analysis (TPS), finite element morphometry (FEM), elliptical Fourier functions (EFF), and medial axis analysis (MAA) have centred upon mathematical and statistical arguments. Surprisingly, little information is available to assist the orthodontist in the clinical relevance of each technique. This article evaluates the advantages and limitations of the above methods currently used to analyse the craniofacial morphology on cephalograms and investigates their clinical relevance and possible applications.

  15. GC-MS quantification of suspected volatile allergens in fragrances. 2. Data treatment strategies and method performances.

    PubMed

    Bassereau, Maud; Chaintreau, Alain; Duperrex, Stéphanie; Joulain, Daniel; Leijs, Hans; Loesing, Gerd; Owen, Neil; Sherlock, Alan; Schippa, Christine; Thorel, Pierre-Jean; Vey, Matthias

    2007-01-10

    The performances of the GC-MS determination of suspected allergens in fragrance concentrates have been investigated. The limit of quantification was experimentally determined (10 mg/L), and the variability was investigated for three different data treatment strategies: (1) two columns and three quantification ions; (2) two columns and one quantification ion; and (3) one column and three quantification ions. The first strategy best minimizes the risk of determination bias due to coelutions. This risk was evaluated by calculating the probability of coeluting a suspected allergen with perfume constituents exhibiting ions in common. For hydroxycitronellal, when using a two-column strategy, this may statistically occur more than once every 36 analyses for one ion or once every 144 analyses for three ions in common.

  16. Problems of allometric scaling analysis: examples from mammalian reproductive biology.

    PubMed

    Martin, Robert D; Genoud, Michel; Hemelrijk, Charlotte K

    2005-05-01

    Biological scaling analyses employing the widely used bivariate allometric model are beset by at least four interacting problems: (1) choice of an appropriate best-fit line with due attention to the influence of outliers; (2) objective recognition of divergent subsets in the data (allometric grades); (3) potential restrictions on statistical independence resulting from phylogenetic inertia; and (4) the need for extreme caution in inferring causation from correlation. A new non-parametric line-fitting technique has been developed that eliminates requirements for normality of distribution, greatly reduces the influence of outliers and permits objective recognition of grade shifts in substantial datasets. This technique is applied in scaling analyses of mammalian gestation periods and of neonatal body mass in primates. These analyses feed into a re-examination, conducted with partial correlation analysis, of the maternal energy hypothesis relating to mammalian brain evolution, which suggests links between body size and brain size in neonates and adults, gestation period and basal metabolic rate. Much has been made of the potential problem of phylogenetic inertia as a confounding factor in scaling analyses. However, this problem may be less severe than suspected earlier because nested analyses of variance conducted on residual variation (rather than on raw values) reveals that there is considerable variance at low taxonomic levels. In fact, limited divergence in body size between closely related species is one of the prime examples of phylogenetic inertia. One common approach to eliminating perceived problems of phylogenetic inertia in allometric analyses has been calculation of 'independent contrast values'. It is demonstrated that the reasoning behind this approach is flawed in several ways. Calculation of contrast values for closely related species of similar body size is, in fact, highly questionable, particularly when there are major deviations from the best-fit line for the scaling relationship under scrutiny.

  17. Modelling the Effects of Land-Use Changes on Climate: a Case Study on Yamula DAM

    NASA Astrophysics Data System (ADS)

    Köylü, Ü.; Geymen, A.

    2016-10-01

    Dams block the flow of rivers and create artificial water reservoirs which affect the climate and the land use characteristics of the river basin. In this research, the effect of the large water body created by the Yamula Dam in the Kızılırmak Basin on the surrounding area's land use and climate is analysed. The Mann-Kendall non-parametric statistical test, the Theil-Sen slope method, Inverse Distance Weighting (IDW), and the Soil Conservation Service-Curve Number (SCS-CN) method are integrated for spatial and temporal analysis of the research area. Humidity, temperature, wind speed and precipitation observations collected at 16 weather stations near the Kızılırmak Basin are analysed, and this statistical information is then combined with GIS data over the years. An application was developed for GIS analysis in the Python programming language and integrated with ArcGIS software. Statistical analyses were calculated in the R Project for Statistical Computing and integrated with the developed application. According to the statistical analysis of the extracted time series of meteorological parameters, statistically significant spatiotemporal trends are observed in climate and land use characteristics. In this study, we indicate the effect of large dams on the local climate of the semi-arid Yamula Dam area.
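
    A minimal sketch of the two trend tools named above, the Mann-Kendall test and the Theil-Sen slope, applied to a synthetic yearly temperature series; no correction for ties or serial correlation is included, so this is only a schematic of the approach.

```python
# Sketch: Mann-Kendall trend test and Theil-Sen slope for a yearly temperature
# series. The series below is synthetic; ties and serial correlation are ignored.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
years = np.arange(1961, 1991)
temps = 10.0 + 0.03 * (years - years[0]) + rng.normal(0, 0.4, size=years.size)

# Mann-Kendall S statistic and normal approximation
n = temps.size
s = sum(np.sign(temps[j] - temps[i]) for i in range(n - 1) for j in range(i + 1, n))
var_s = n * (n - 1) * (2 * n + 5) / 18.0
z = (s - np.sign(s)) / np.sqrt(var_s)       # continuity-corrected
p_value = 2 * norm.sf(abs(z))

# Theil-Sen slope: median of all pairwise slopes
slopes = [(temps[j] - temps[i]) / (years[j] - years[i])
          for i in range(n - 1) for j in range(i + 1, n)]
sen_slope = np.median(slopes)
print(f"MK z = {z:.2f}, p = {p_value:.4f}, Theil-Sen slope = {sen_slope:.4f} degC/yr")
```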

  18. Using conventional F-statistics to study unconventional sex-chromosome differentiation.

    PubMed

    Rodrigues, Nicolas; Dufresnes, Christophe

    2017-01-01

    Species with undifferentiated sex chromosomes emerge as key organisms to understand the astonishing diversity of sex-determination systems. Whereas new genomic methods are widening opportunities to study these systems, the difficulty of separately characterizing their X and Y homologous chromosomes poses limitations. Here we demonstrate that two simple F-statistics calculated from sex-linked genotypes, namely the genetic distance (Fst) between sexes and the inbreeding coefficient (Fis) in the heterogametic sex, can be used as reliable proxies to compare sex-chromosome differentiation between populations. We correlated these metrics using published microsatellite data from two frog species (Hyla arborea and Rana temporaria), and show that they intimately relate to the overall amount of X-Y differentiation in populations. However, the fits for individual loci appear highly variable, suggesting that a dense genetic coverage will be needed for inferring fine-scale patterns of differentiation along sex chromosomes. The application of these F-statistics, which imply minimal sampling requirements, significantly facilitates population analyses of sex chromosomes.
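
    The sketch below computes the two proxies described above, Fis in the heterogametic sex and Fst between sexes, for a single sex-linked locus using simple heterozygosity-based estimators; the genotypes are invented, and a real analysis would use many loci and more careful estimators.

```python
# Sketch: F-statistics used as proxies for X-Y differentiation, computed from
# genotypes at one sex-linked locus. Genotypes are coded as allele pairs (invented).
import numpy as np

def allele_freqs(genotypes):
    alleles = np.asarray(genotypes).ravel()
    vals, counts = np.unique(alleles, return_counts=True)
    return dict(zip(vals, counts / counts.sum()))

def expected_het(freqs):
    return 1.0 - sum(p ** 2 for p in freqs.values())

def observed_het(genotypes):
    g = np.asarray(genotypes)
    return np.mean(g[:, 0] != g[:, 1])

females = [(1, 1)] * 30 + [(1, 2)] * 15 + [(2, 2)] * 5     # homogametic sex (XX)
males   = [(1, 2)] * 28 + [(1, 1)] * 7 + [(2, 2)] * 5      # heterogametic sex (XY)

# Fis in the heterogametic sex: heterozygote excess signals X-Y divergence
p_m = allele_freqs(males)
fis_males = 1.0 - observed_het(males) / expected_het(p_m)

# Fst between sexes: Ht from pooled frequencies, Hs as the mean within-sex He
p_f = allele_freqs(females)
hs = (expected_het(p_f) + expected_het(p_m)) / 2.0
pooled = {a: (p_f.get(a, 0) + p_m.get(a, 0)) / 2.0 for a in set(p_f) | set(p_m)}
ht = expected_het(pooled)
fst_sexes = (ht - hs) / ht
print(f"Fis (males) = {fis_males:.3f}, Fst (between sexes) = {fst_sexes:.3f}")
```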

  19. 40 CFR 1065.602 - Statistics.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    Title 40 (Protection of Environment), Calculations and Data Requirements, § 1065.602 Statistics. (a) Overview. This section contains equations and example calculations for statistics that are specified in this part. In this section we use...

  20. Calculation of streamflow statistics for Ontario and the Great Lakes states

    USGS Publications Warehouse

    Piggott, Andrew R.; Neff, Brian P.

    2005-01-01

    Basic, flow-duration, and n-day frequency statistics were calculated for 779 current and historical streamflow gages in Ontario and 3,157 streamflow gages in the Great Lakes states with length-of-record daily mean streamflow data ending on December 31, 2000 and September 30, 2001, respectively. The statistics were determined using the U.S. Geological Survey’s SWSTAT and IOWDM, ANNIE, and LIBANNE software and Linux shell and PERL programming that enabled the mass processing of the data and calculation of the statistics. Verification exercises were performed to assess the accuracy of the processing and calculations. The statistics and descriptions, longitudes and latitudes, and drainage areas for each of the streamflow gages are summarized in ASCII text files and ESRI shapefiles.
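
    As a rough sketch of the statistic types named above (basic, flow-duration, and n-day statistics), the code below computes exceedance percentiles and a minimum 7-day mean flow from a synthetic daily streamflow series; it is not a reimplementation of SWSTAT.

```python
# Sketch: basic, flow-duration, and n-day low-flow statistics from a daily mean
# streamflow series. The data are synthetic; a real analysis would read the full
# gage record and handle missing days.
import numpy as np

rng = np.random.default_rng(7)
q = np.exp(rng.normal(loc=2.0, scale=0.8, size=3650))   # ~10 years of daily flows

# basic statistics
print(f"mean = {q.mean():.1f}, median = {np.median(q):.1f}, min = {q.min():.1f}")

# flow-duration statistics: flow exceeded p percent of the time
for p in (10, 50, 90, 95):
    print(f"Q{p} (exceeded {p}% of the time) = {np.percentile(q, 100 - p):.1f}")

# n-day statistic: minimum 7-day moving-average flow over the record
n = 7
moving_avg = np.convolve(q, np.ones(n) / n, mode="valid")
print(f"minimum {n}-day mean flow = {moving_avg.min():.1f}")
```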

  1. On the Calculation of Uncertainty Statistics with Error Bounds for CFD Calculations Containing Random Parameters and Fields

    NASA Technical Reports Server (NTRS)

    Barth, Timothy J.

    2016-01-01

    This chapter discusses the ongoing development of combined uncertainty and error bound estimates for computational fluid dynamics (CFD) calculations subject to imposed random parameters and random fields. An objective of this work is the construction of computable error bound formulas for output uncertainty statistics that guide CFD practitioners in systematically determining how accurately CFD realizations should be approximated and how accurately uncertainty statistics should be approximated for output quantities of interest. Formal error bounds formulas for moment statistics that properly account for the presence of numerical errors in CFD calculations and numerical quadrature errors in the calculation of moment statistics have been previously presented in [8]. In this past work, hierarchical node-nested dense and sparse tensor product quadratures are used to calculate moment statistics integrals. In the present work, a framework has been developed that exploits the hierarchical structure of these quadratures in order to simplify the calculation of an estimate of the quadrature error needed in error bound formulas. When signed estimates of realization error are available, this signed error may also be used to estimate output quantity of interest probability densities as a means to assess the impact of realization error on these density estimates. Numerical results are presented for CFD problems with uncertainty to demonstrate the capabilities of this framework.

  2. PSSMSearch: a server for modeling, visualization, proteome-wide discovery and annotation of protein motif specificity determinants.

    PubMed

    Krystkowiak, Izabella; Manguy, Jean; Davey, Norman E

    2018-06-05

    There is a pressing need for in silico tools that can aid in the identification of the complete repertoire of protein binding (SLiMs, MoRFs, miniMotifs) and modification (moiety attachment/removal, isomerization, cleavage) motifs. We have created PSSMSearch, an interactive web-based tool for rapid statistical modeling, visualization, discovery and annotation of protein motif specificity determinants to discover novel motifs in a proteome-wide manner. PSSMSearch analyses proteomes for regions with significant similarity to a motif specificity determinant model built from a set of aligned motif-containing peptides. Multiple scoring methods are available to build a position-specific scoring matrix (PSSM) describing the motif specificity determinant model. This model can then be modified by a user to add prior knowledge of specificity determinants through an interactive PSSM heatmap. PSSMSearch includes a statistical framework to calculate the significance of specificity determinant model matches against a proteome of interest. PSSMSearch also includes the SLiMSearch framework's annotation, motif functional analysis and filtering tools to highlight relevant discriminatory information. Additional tools to annotate statistically significant shared keywords and GO terms, or experimental evidence of interaction with a motif-recognizing protein have been added. Finally, PSSM-based conservation metrics have been created for taxonomic range analyses. The PSSMSearch web server is available at http://slim.ucd.ie/pssmsearch/.
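
    A minimal sketch of the core idea behind a PSSM-based search: build a log-odds position-specific scoring matrix from aligned motif-containing peptides and scan a sequence with it. The peptides, pseudocount, flat background frequencies and the scoring scheme are all assumptions for illustration and do not reproduce PSSMSearch's scoring methods.

```python
# Sketch: build a log-odds PSSM from aligned peptides and scan a protein sequence.
# Peptides, pseudocount and flat background frequencies are illustrative only.
import math

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
peptides = ["RRASLP", "KRASVP", "RRPSLP", "RKASIP"]   # aligned, equal-length (invented)
length = len(peptides[0])
background = {aa: 1.0 / len(AMINO_ACIDS) for aa in AMINO_ACIDS}   # flat background
pseudocount = 0.5

# score[pos][aa] = log2( observed frequency / background frequency )
pssm = []
for pos in range(length):
    column = [p[pos] for p in peptides]
    scores = {}
    for aa in AMINO_ACIDS:
        freq = (column.count(aa) + pseudocount) / (len(peptides) + pseudocount * len(AMINO_ACIDS))
        scores[aa] = math.log2(freq / background[aa])
    pssm.append(scores)

def scan(sequence):
    """Score every window of the sequence against the PSSM."""
    return [(i, sum(pssm[j][sequence[i + j]] for j in range(length)))
            for i in range(len(sequence) - length + 1)]

hits = scan("MSDEKRRASLPEGVKRASVPD")
best = max(hits, key=lambda h: h[1])
print(f"best window starts at {best[0]}, score = {best[1]:.2f}")
```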

  3. The classification of secondary colorectal liver cancer in human biopsy samples using angular dispersive x-ray diffraction and multivariate analysis

    NASA Astrophysics Data System (ADS)

    Theodorakou, Chrysoula; Farquharson, Michael J.

    2009-08-01

    The motivation behind this study is to assess whether angular dispersive x-ray diffraction (ADXRD) data, processed using multivariate analysis techniques, can be used for classifying secondary colorectal liver cancer tissue and normal surrounding liver tissue in human liver biopsy samples. The ADXRD profiles from a total of 60 samples of normal liver tissue and colorectal liver metastases were measured using a synchrotron radiation source. The data were analysed for 56 samples using nonlinear peak-fitting software. Four peaks were fitted to all of the ADXRD profiles, and the amplitude, area, amplitude and area ratios for three of the four peaks were calculated and used for the statistical and multivariate analysis. The statistical analysis showed that there are significant differences between all the peak-fitting parameters and ratios between the normal and the diseased tissue groups. The technique of soft independent modelling of class analogy (SIMCA) was used to classify normal liver tissue and colorectal liver metastases resulting in 67% of the normal tissue samples and 60% of the secondary colorectal liver tissue samples being classified correctly. This study has shown that the ADXRD data of normal and secondary colorectal liver cancer are statistically different and x-ray diffraction data analysed using multivariate analysis have the potential to be used as a method of tissue classification.

  4. Measurement of Low-Energy Nuclear-Recoil Quenching Factors in CsI[Na] and Statistical Analysis of the First Observation of Coherent, Elastic Neutrino-Nucleus Scattering

    NASA Astrophysics Data System (ADS)

    Rich, Grayson Currie

    The COHERENT Collaboration has produced the first-ever observation, with a significance of 6.7σ, of a process consistent with coherent, elastic neutrino-nucleus scattering (CEnuNS) as first predicted and described by D.Z. Freedman in 1974. Physics of the CEnuNS process are presented along with its relationship to future measurements in the arenas of nuclear physics, fundamental particle physics, and astroparticle physics, where the newly-observed interaction presents a viable tool for investigations into numerous outstanding questions about the nature of the universe. To enable the CEnuNS observation with a 14.6-kg CsI[Na] detector, new measurements of the response of CsI[Na] to low-energy nuclear recoils, which is the only mechanism by which CEnuNS is detectable, were carried out at Triangle Universities Nuclear Laboratory; these measurements are detailed and an effective nuclear-recoil quenching factor of 8.78 ± 1.66% is established for CsI[Na] in the recoil-energy range of 5-30 keV, based on new and literature data. Following separate analyses of the CEnuNS-search data by groups at the University of Chicago and the Moscow Engineering and Physics Institute, information from simulations, calculations, and ancillary measurements were used to inform statistical analyses of the collected data. Based on input from the Chicago analysis, the number of CEnuNS events expected from the Standard Model is 173 ± 48; interpretation as a simple counting experiment finds 136 ± 31 CEnuNS counts in the data, while a two-dimensional, profile likelihood fit yields 134 ± 22 CEnuNS counts. Details of the simulations, calculations, and supporting measurements are discussed, in addition to the statistical procedures. Finally, potential improvements to the CsI[Na]-based CEnuNS measurement are presented along with future possibilities for COHERENT Collaboration, including new CEnuNS detectors and measurement of the neutrino-induced neutron spallation process.

  5. Applied statistics in ecology: common pitfalls and simple solutions

    Treesearch

    E. Ashley Steel; Maureen C. Kennedy; Patrick G. Cunningham; John S. Stanovick

    2013-01-01

    The most common statistical pitfalls in ecological research are those associated with data exploration, the logic of sampling and design, and the interpretation of statistical results. Although one can find published errors in calculations, the majority of statistical pitfalls result from incorrect logic or interpretation despite correct numerical calculations. There...

  6. Prevalence of refractive errors in the Slovak population calculated using the Gullstrand schematic eye model.

    PubMed

    Popov, I; Valašková, J; Štefaničková, J; Krásnik, V

    2017-01-01

    A substantial part of the population suffers from some kind of refractive error. It is envisaged that their prevalence may change with the development of society. The aim of this study is to determine the prevalence of refractive errors using calculations based on the Gullstrand schematic eye model. We used the Gullstrand schematic eye model to calculate refraction retrospectively. Refraction was presented as the need for glasses correction at a vertex distance of 12 mm. The necessary data were obtained using the optical biometer Lenstar LS900. Data which could not be obtained due to the limitations of the device were substituted by theoretical data from the Gullstrand schematic eye model. Only analyses from the right eyes were presented. The data were interpreted using descriptive statistics, Pearson correlation and t-test. The statistical tests were conducted at a level of significance of 5%. Our sample included 1663 patients (665 male, 998 female) within the age range of 19 to 96 years. Average age was 70.8 ± 9.53 years. Average refraction of the eye was 2.73 ± 2.13D (males 2.49 ± 2.34, females 2.90 ± 2.76). The mean absolute error from emmetropia was 3.01 ± 1.58 (males 2.83 ± 2.95, females 3.25 ± 3.35). 89.06% of the sample was hyperopic, 6.61% was myopic and 4.33% emmetropic. We did not find any correlation between refraction and age. Females were more hyperopic than males. We did not find any statistically significant hypermetropic shift of refraction with age. According to our estimation, the calculations of refractive errors using the Gullstrand schematic eye model showed a significant hypermetropic shift of more than +2D. Our results could be used in future for comparing the prevalence of refractive errors using the same methods we used. Key words: refractive errors, refraction, Gullstrand schematic eye model, population, emmetropia.

  7. Analysing the teleconnection systems affecting the climate of the Carpathian Basin

    NASA Astrophysics Data System (ADS)

    Kristóf, Erzsébet; Bartholy, Judit; Pongrácz, Rita

    2017-04-01

    Nowadays, the increase of the global average near-surface air temperature is unequivocal. Atmospheric low-frequency variabilities have substantial impacts on climate variables such as air temperature and precipitation. Therefore, assessing their effects is essential to improve global and regional climate model simulations for the 21st century. The North Atlantic Oscillation (NAO) is one of the best-known atmospheric teleconnection patterns affecting the Carpathian Basin in Central Europe. Besides NAO, we aim to analyse other interannual-to-decadal teleconnection patterns, which might have significant impacts on the Carpathian Basin, namely, the East Atlantic/West Russia pattern, the Scandinavian pattern, the Mediterranean Oscillation, and the North-Sea Caspian Pattern. For this purpose, primarily the European Centre for Medium-Range Weather Forecasts' (ECMWF) ERA-20C atmospheric reanalysis dataset and multivariate statistical methods are used. The indices of each teleconnection pattern and their correlations with temperature and precipitation will be calculated for the period of 1961-1990. On the basis of these data, the long-range (i.e., seasonal and/or annual scale) forecast ability is first evaluated. Then, we aim to calculate the same indices of the relevant teleconnection patterns for the historical and future simulations of Coupled Model Intercomparison Project Phase 5 (CMIP5) models and compare them against each other using statistical methods. Our ultimate goal is to examine all available CMIP5 models and evaluate their abilities to reproduce the selected teleconnection systems. Thus, climate predictions for the 21st century for the Carpathian Basin may be improved using the best-performing models among all CMIP5 model simulations.

  8. Methods for calculating confidence and credible intervals for the residual between-study variance in random effects meta-regression models

    PubMed Central

    2014-01-01

    Background Meta-regression is becoming increasingly used to model study level covariate effects. However this type of statistical analysis presents many difficulties and challenges. Here two methods for calculating confidence intervals for the magnitude of the residual between-study variance in random effects meta-regression models are developed. A further suggestion for calculating credible intervals using informative prior distributions for the residual between-study variance is presented. Methods Two recently proposed and, under the assumptions of the random effects model, exact methods for constructing confidence intervals for the between-study variance in random effects meta-analyses are extended to the meta-regression setting. The use of Generalised Cochran heterogeneity statistics is extended to the meta-regression setting and a Newton-Raphson procedure is developed to implement the Q profile method for meta-analysis and meta-regression. WinBUGS is used to implement informative priors for the residual between-study variance in the context of Bayesian meta-regressions. Results Results are obtained for two contrasting examples, where the first example involves a binary covariate and the second involves a continuous covariate. Intervals for the residual between-study variance are wide for both examples. Conclusions Statistical methods, and R computer software, are available to compute exact confidence intervals for the residual between-study variance under the random effects model for meta-regression. These frequentist methods are almost as easily implemented as their established counterparts for meta-analysis. Bayesian meta-regressions are also easily performed by analysts who are comfortable using WinBUGS. Estimates of the residual between-study variance in random effects meta-regressions should be routinely reported and accompanied by some measure of their uncertainty. Confidence and/or credible intervals are well-suited to this purpose. PMID:25196829
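
    A Q-profile-type confidence interval for the residual between-study variance in a random-effects meta-regression can also be obtained with the metafor R package, as sketched below. The effect sizes (yi), sampling variances (vi), and binary covariate are hypothetical, and metafor is a different implementation from the authors' own R code, so results need not coincide exactly.

    ```r
    # Q-profile confidence interval for the residual between-study variance (tau^2)
    # in a random-effects meta-regression, via the metafor package.
    library(metafor)

    yi <- c(0.40, 0.15, 0.62, 0.05, 0.33, 0.48)   # hypothetical study effect sizes
    vi <- c(0.04, 0.06, 0.03, 0.09, 0.05, 0.07)   # hypothetical sampling variances
    x  <- c(1, 0, 1, 0, 1, 0)                     # binary study-level covariate

    res <- rma(yi, vi, mods = ~ x, method = "REML")  # random-effects meta-regression
    summary(res)
    confint(res)   # interval for the residual tau^2 (generalised Q-profile approach)
    ```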

  9. Statistical average estimates of high latitude field-aligned currents from the STARE and SABRE coherent VHF radar systems

    NASA Astrophysics Data System (ADS)

    Kosch, M. J.; Nielsen, E.

    Two bistatic VHF radar systems, STARE and SABRE, have been employed to estimate ionospheric electric fields in the geomagnetic latitude range 61.1 - 69.3° (geographic latitude range 63.8 - 72.6°) over northern Scandinavia. 173 days of good backscatter from all four radars have been analysed during the period 1982 to 1986, from which the average ionospheric divergence electric field versus latitude and time is calculated. The average magnetic field-aligned currents are computed using an AE-dependent empirical model of the ionospheric conductance. Statistical Birkeland current estimates are presented for high and low values of the Kp and AE indices as well as positive and negative orientations of the IMF B z component. The results compare very favourably to other ground-based and satellite measurements.

  10. Precalculus teachers' perspectives on using graphing calculators: an example from one curriculum

    NASA Astrophysics Data System (ADS)

    Karadeniz, Ilyas; Thompson, Denisse R.

    2018-01-01

    Graphing calculators are hand-held technological tools currently used in mathematics classrooms. Teachers' perspectives on using graphing calculators are important in terms of exploring what teachers think about using such technology in advanced mathematics courses, particularly precalculus courses. A descriptive intrinsic case study was conducted to analyse the perspectives of 11 teachers using graphing calculators with potential Computer Algebra System (CAS) capability while teaching Functions, Statistics, and Trigonometry, a precalculus course for 11th-grade students developed by the University of Chicago School Mathematics Project. Data were collected from multiple sources as part of a curriculum evaluation study conducted during the 2007-2008 school year. Although all teachers were using the same curriculum that integrated CAS into the instructional materials, teachers had mixed views about the technology. Graphing calculator features were used much more than CAS features, with many teachers concerned about the use of CAS because of pressures from external assessments. In addition, several teachers found it overwhelming to learn a new technology at the same time they were learning a new curriculum. The results have implications for curriculum developers and others working with teachers to update curriculum and the use of advanced technologies simultaneously.

  11. Incorrect likelihood methods were used to infer scaling laws of marine predator search behaviour.

    PubMed

    Edwards, Andrew M; Freeman, Mervyn P; Breed, Greg A; Jonsen, Ian D

    2012-01-01

    Ecologists are collecting extensive data concerning movements of animals in marine ecosystems. Such data need to be analysed with valid statistical methods to yield meaningful conclusions. We demonstrate methodological issues in two recent studies that reached similar conclusions concerning movements of marine animals (Nature 451:1098; Science 332:1551). The first study analysed vertical movement data to conclude that diverse marine predators (Atlantic cod, basking sharks, bigeye tuna, leatherback turtles and Magellanic penguins) exhibited "Lévy-walk-like behaviour", close to a hypothesised optimal foraging strategy. By reproducing the original results for the bigeye tuna data, we show that the likelihood of tested models was calculated from residuals of regression fits (an incorrect method), rather than from the likelihood equations of the actual probability distributions being tested. This resulted in erroneous Akaike Information Criteria, and the testing of models that do not correspond to valid probability distributions. We demonstrate how this led to overwhelming support for a model that has no biological justification and that is statistically spurious because its probability density function goes negative. Re-analysis of the bigeye tuna data, using standard likelihood methods, overturns the original result and conclusion for that data set. The second study observed Lévy walk movement patterns by mussels. We demonstrate several issues concerning the likelihood calculations (including the aforementioned residuals issue). Re-analysis of the data rejects the original Lévy walk conclusion. We consequently question the claimed existence of scaling laws of the search behaviour of marine predators and mussels, since such conclusions were reached using incorrect methods. We discourage the suggested potential use of "Lévy-like walks" when modelling consequences of fishing and climate change, and caution that any resulting advice to managers of marine ecosystems would be problematic. For reproducibility and future work we provide R source code for all calculations.
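
    The core correction described above is to compute log-likelihoods from the candidate probability density functions themselves rather than from regression residuals. The sketch below illustrates this for simulated step-length data, comparing a bounded power law with a truncated exponential by AIC; the bounds, exponent, and data are illustrative and the truncation details of the original re-analysis may differ.

    ```r
    # Likelihood-based comparison of a bounded power law vs a truncated exponential,
    # using the actual probability densities (not regression residuals).
    set.seed(1)
    xmin <- 1; xmax <- 1000
    u <- runif(500)
    x <- xmin * (1 - u * (1 - (xmax / xmin)^(1 - 2.2)))^(1 / (1 - 2.2))  # simulated power-law data

    nll_pl <- function(mu) {                      # power law truncated to [xmin, xmax]
      C <- (1 - mu) / (xmax^(1 - mu) - xmin^(1 - mu))
      -sum(log(C) - mu * log(x))
    }
    nll_exp <- function(lambda) {                 # exponential truncated to [xmin, xmax]
      C <- lambda / (exp(-lambda * xmin) - exp(-lambda * xmax))
      -sum(log(C) - lambda * x)
    }

    fit_pl  <- optimize(nll_pl,  interval = c(1.01, 5))
    fit_exp <- optimize(nll_exp, interval = c(1e-4, 1))
    c(powerlaw    = 2 * fit_pl$objective  + 2,    # AIC = 2*NLL + 2*k, k = 1 each
      exponential = 2 * fit_exp$objective + 2)    # smaller AIC = better-supported model
    ```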

  12. [Analysis of the Association between Regional Deprivation and Utilization: An Assessment of Need for Physicians in Germany].

    PubMed

    Kopetsch, T; Maier, W

    2018-01-01

    A new strategy for planning outpatient medical care needs to be developed. The social and morbidity structure of the population should be considered in the planning of needs-based provision of medical care. This paper aims to examine the extent to which the degree of regional deprivation can be incorporated in the calculation of the regional requirements for specialists in Germany. To measure regional deprivation status at district level, we used the "German Index of Multiple Deprivation" (GIMD) developed in the Helmholtz Zentrum München - German Research Center for Environmental Health. Scores were calculated for the deprivation status of each rural and urban district in Germany. The methods used to compute the deprivation-adjusted medical need are linear regression analyses. The analyses were based on regionalized data for the number of office-based physicians and their billing data. The analyses were carried out with the SPSS software package, version 20. The analyses showed a clear positive correlation between regional deprivation and the utilisation of medical services both for outpatients and in-patients, on the one hand, and mortality and morbidity, as measured by the risk adjustment factor (RSA), on the other. At the district level, the analyses also revealed varying associations between the degree of deprivation and the utilisation of the 12 groups of specialists included in the needs assessment. On this basis, an algorithm was developed by which deprivation at district level can be used to calculate an increase or a decrease in the relative number of specialists needed. Using the GIMD and various determinants of medical utilisation, the model showed that medical need increased with the level of regional deprivation. However, regarding SHI medical specialist groups, the associations found in this analysis were statistically (R²) insufficient to justify a needs assessment planning system based only on the factors analysed, which would restrict physicians' constitutional right of professional freedom. In particular cases, i.e. licenses to meet special needs, the developed instruments may be suitable for indicating a greater or lesser need for doctors at a regional level due to their relative ease of use and practicability. © Georg Thieme Verlag KG Stuttgart · New York.
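
    A minimal sketch of the core regression step, regressing district-level utilisation on the GIMD deprivation score with lm(), is given below. The variable names and simulated data are hypothetical (the study used SPSS and regionalized billing data), so this only illustrates the form of the analysis, not its results.

    ```r
    # District-level utilisation regressed on the regional deprivation score (GIMD).
    # Variable names and data are hypothetical placeholders.
    set.seed(42)
    districts <- data.frame(gimd_score = runif(400, 0, 100))
    districts$utilisation <- 50 + 0.3 * districts$gimd_score + rnorm(400, sd = 8)

    fit <- lm(utilisation ~ gimd_score, data = districts)
    coef(fit)                  # slope: change in utilisation per unit deprivation
    summary(fit)$r.squared     # a low R^2 would echo the paper's caution about
                               # basing needs planning on deprivation alone
    ```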

  13. Estimating total maximum daily loads with the Stochastic Empirical Loading and Dilution Model

    USGS Publications Warehouse

    Granato, Gregory; Jones, Susan Cheung

    2017-01-01

    The Massachusetts Department of Transportation (DOT) and the Rhode Island DOT are assessing and addressing roadway contributions to total maximum daily loads (TMDLs). Example analyses for total nitrogen, total phosphorus, suspended sediment, and total zinc in highway runoff were done by the U.S. Geological Survey in cooperation with FHWA to simulate long-term annual loads for TMDL analyses with the stochastic empirical loading and dilution model known as SELDM. Concentration statistics from 19 highway runoff monitoring sites in Massachusetts were used with precipitation statistics from 11 long-term monitoring sites to simulate long-term pavement yields (loads per unit area). Highway sites were stratified by traffic volume or surrounding land use to calculate concentration statistics for rural roads, low-volume highways, high-volume highways, and ultraurban highways. The median of the event mean concentration statistics in each traffic volume category was used to simulate annual yields from pavement for a 29- or 30-year period. Long-term average yields for total nitrogen, phosphorus, and zinc from rural roads are lower than yields from the other categories, but yields of sediment are higher than for the low-volume highways. The average yields of the selected water quality constituents from high-volume highways are 1.35 to 2.52 times the associated yields from low-volume highways. The average yields of the selected constituents from ultraurban highways are 1.52 to 3.46 times the associated yields from high-volume highways. Example simulations indicate that both concentration reduction and flow reduction by structural best management practices are crucial for reducing runoff yields.

  14. MEG/EEG Source Reconstruction, Statistical Evaluation, and Visualization with NUTMEG

    PubMed Central

    Dalal, Sarang S.; Zumer, Johanna M.; Guggisberg, Adrian G.; Trumpis, Michael; Wong, Daniel D. E.; Sekihara, Kensuke; Nagarajan, Srikantan S.

    2011-01-01

    NUTMEG is a source analysis toolbox geared towards cognitive neuroscience researchers using MEG and EEG, including intracranial recordings. Evoked and unaveraged data can be imported to the toolbox for source analysis in either the time or time-frequency domains. NUTMEG offers several variants of adaptive beamformers, probabilistic reconstruction algorithms, as well as minimum-norm techniques to generate functional maps of spatiotemporal neural source activity. Lead fields can be calculated from single and overlapping sphere head models or imported from other software. Group averages and statistics can be calculated as well. In addition to data analysis tools, NUTMEG provides a unique and intuitive graphical interface for visualization of results. Source analyses can be superimposed onto a structural MRI or headshape to provide a convenient visual correspondence to anatomy. These results can also be navigated interactively, with the spatial maps and source time series or spectrogram linked accordingly. Animations can be generated to view the evolution of neural activity over time. NUTMEG can also display brain renderings and perform spatial normalization of functional maps using SPM's engine. As a MATLAB package, the end user may easily link with other toolboxes or add customized functions. PMID:21437174

  15. MEG/EEG source reconstruction, statistical evaluation, and visualization with NUTMEG.

    PubMed

    Dalal, Sarang S; Zumer, Johanna M; Guggisberg, Adrian G; Trumpis, Michael; Wong, Daniel D E; Sekihara, Kensuke; Nagarajan, Srikantan S

    2011-01-01

    NUTMEG is a source analysis toolbox geared towards cognitive neuroscience researchers using MEG and EEG, including intracranial recordings. Evoked and unaveraged data can be imported to the toolbox for source analysis in either the time or time-frequency domains. NUTMEG offers several variants of adaptive beamformers, probabilistic reconstruction algorithms, as well as minimum-norm techniques to generate functional maps of spatiotemporal neural source activity. Lead fields can be calculated from single and overlapping sphere head models or imported from other software. Group averages and statistics can be calculated as well. In addition to data analysis tools, NUTMEG provides a unique and intuitive graphical interface for visualization of results. Source analyses can be superimposed onto a structural MRI or headshape to provide a convenient visual correspondence to anatomy. These results can also be navigated interactively, with the spatial maps and source time series or spectrogram linked accordingly. Animations can be generated to view the evolution of neural activity over time. NUTMEG can also display brain renderings and perform spatial normalization of functional maps using SPM's engine. As a MATLAB package, the end user may easily link with other toolboxes or add customized functions.

  16. Statistical technique for analysing functional connectivity of multiple spike trains.

    PubMed

    Masud, Mohammad Shahed; Borisyuk, Roman

    2011-03-15

    A new statistical technique, the Cox method, used for analysing functional connectivity of simultaneously recorded multiple spike trains is presented. This method is based on the theory of modulated renewal processes and it estimates a vector of influence strengths from multiple spike trains (called reference trains) to the selected (target) spike train. Selecting another target spike train and repeating the calculation of the influence strengths from the reference spike trains enables researchers to find all functional connections among multiple spike trains. In order to study functional connectivity an "influence function" is identified. This function recognises the specificity of neuronal interactions and reflects the dynamics of postsynaptic potential. In comparison to existing techniques, the Cox method has the following advantages: it does not use bins (binless method); it is applicable to cases where the sample size is small; it is sufficiently sensitive such that it estimates weak influences; it supports the simultaneous analysis of multiple influences; it is able to identify a correct connectivity scheme in difficult cases of "common source" or "indirect" connectivity. The Cox method has been thoroughly tested using multiple sets of data generated by the neural network model of the leaky integrate and fire neurons with a prescribed architecture of connections. The results suggest that this method is highly successful for analysing functional connectivity of simultaneously recorded multiple spike trains. Copyright © 2011 Elsevier B.V. All rights reserved.

  17. 40 CFR Appendix IV to Part 265 - Tests for Significance

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... introductory statistics texts. ... student's t-test involves calculation of the value of a t-statistic for each comparison of the mean... parameter with its initial background concentration or value. The calculated value of the t-statistic must...
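
    The excerpt above describes comparing the mean of each monitoring parameter with its initial background value via a t-statistic. As a hedged illustration only (the regulation's exact replicate handling and averaging rules differ), a one-sample Student's t comparison in R might look like this, with hypothetical concentrations:

    ```r
    # One-sample Student's t comparison of monitoring results against an initial
    # background concentration (hypothetical values, mg/L).
    background <- 0.12
    monitoring <- c(0.15, 0.18, 0.14, 0.20, 0.16, 0.17)

    t.test(monitoring, mu = background, alternative = "greater")
    ```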

  18. Statistical power analysis in wildlife research

    USGS Publications Warehouse

    Steidl, R.J.; Hayes, J.P.

    1997-01-01

    Statistical power analysis can be used to increase the efficiency of research efforts and to clarify research results. Power analysis is most valuable in the design or planning phases of research efforts. Such prospective (a priori) power analyses can be used to guide research design and to estimate the number of samples necessary to achieve a high probability of detecting biologically significant effects. Retrospective (a posteriori) power analysis has been advocated as a method to increase information about hypothesis tests that were not rejected. However, estimating power for tests of null hypotheses that were not rejected with the effect size observed in the study is incorrect; these power estimates will always be ≤0.50 when bias adjusted and have no relation to true power. Therefore, retrospective power estimates based on the observed effect size for hypothesis tests that were not rejected are misleading; retrospective power estimates are only meaningful when based on effect sizes other than the observed effect size, such as those effect sizes hypothesized to be biologically significant. Retrospective power analysis can be used effectively to estimate the number of samples or effect size that would have been necessary for a completed study to have rejected a specific null hypothesis. Simply presenting confidence intervals can provide additional information about null hypotheses that were not rejected, including information about the size of the true effect and whether or not there is adequate evidence to 'accept' a null hypothesis as true. We suggest that (1) statistical power analyses be routinely incorporated into research planning efforts to increase their efficiency, (2) confidence intervals be used in lieu of retrospective power analyses for null hypotheses that were not rejected to assess the likely size of the true effect, (3) minimum biologically significant effect sizes be used for all power analyses, and (4) if retrospective power estimates are to be reported, then the α-level, effect sizes, and sample sizes used in calculations must also be reported.
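
    A prospective (a priori) power analysis of the kind recommended above can be done in base R with power.t.test; the sketch below finds the per-group sample size needed to detect a minimum biologically significant difference, and conversely the power of a fixed design. The effect size and standard deviation are hypothetical.

    ```r
    # Prospective power analysis: sample size per group to detect a minimum
    # biologically significant difference with 80% power (two-sample t-test).
    # delta and sd are hypothetical study-specific values.
    power.t.test(delta = 5, sd = 8, sig.level = 0.05, power = 0.80,
                 type = "two.sample", alternative = "two.sided")

    # Conversely, the power achieved by a planned design with n = 20 per group:
    power.t.test(n = 20, delta = 5, sd = 8, sig.level = 0.05)
    ```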

  19. Spatial analyses of benthic habitats to define coral reef ecosystem regions and potential biogeographic boundaries along a latitudinal gradient.

    PubMed

    Walker, Brian K

    2012-01-01

    Marine organism diversity typically attenuates latitudinally from tropical to colder climate regimes. Since the distribution of many marine species relates to certain habitats and depth regimes, mapping data provide valuable information in the absence of detailed ecological data that can be used to identify and spatially quantify smaller scale (10s of km) coral reef ecosystem regions and potential physical biogeographic barriers. This study focused on the southeast Florida coast due to a recognized, but understudied, tropical to subtropical biogeographic gradient. GIS spatial analyses were conducted on recent, accurate, shallow-water (0-30 m) benthic habitat maps to identify and quantify specific regions along the coast that were statistically distinct in the number and amount of major benthic habitat types. Habitat type and width were measured for 209 evenly-spaced cross-shelf transects. Evaluation of groupings from a cluster analysis at 75% similarity yielded five distinct regions. The number of benthic habitats and their area, width, distance from shore, distance from each other, and LIDAR depths were calculated in GIS and examined to determine regional statistical differences. The number of benthic habitats decreased with increasing latitude from 9 in the south to 4 in the north and many of the habitat metrics statistically differed between regions. Three potential biogeographic barriers were found at the Boca, Hillsboro, and Biscayne boundaries, where specific shallow-water habitats were absent further north; Middle Reef, Inner Reef, and oceanic seagrass beds respectively. The Bahamas Fault Zone boundary was also noted where changes in coastal morphologies occurred that could relate to subtle ecological changes. The analyses defined regions on a smaller scale more appropriate to regional management decisions, hence strengthening marine conservation planning with an objective, scientific foundation for decision making. They provide a framework for similar regional analyses elsewhere.

  20. Spatial Analyses of Benthic Habitats to Define Coral Reef Ecosystem Regions and Potential Biogeographic Boundaries along a Latitudinal Gradient

    PubMed Central

    Walker, Brian K.

    2012-01-01

    Marine organism diversity typically attenuates latitudinally from tropical to colder climate regimes. Since the distribution of many marine species relates to certain habitats and depth regimes, mapping data provide valuable information in the absence of detailed ecological data that can be used to identify and spatially quantify smaller scale (10s of km) coral reef ecosystem regions and potential physical biogeographic barriers. This study focused on the southeast Florida coast due to a recognized, but understudied, tropical to subtropical biogeographic gradient. GIS spatial analyses were conducted on recent, accurate, shallow-water (0–30 m) benthic habitat maps to identify and quantify specific regions along the coast that were statistically distinct in the number and amount of major benthic habitat types. Habitat type and width were measured for 209 evenly-spaced cross-shelf transects. Evaluation of groupings from a cluster analysis at 75% similarity yielded five distinct regions. The number of benthic habitats and their area, width, distance from shore, distance from each other, and LIDAR depths were calculated in GIS and examined to determine regional statistical differences. The number of benthic habitats decreased with increasing latitude from 9 in the south to 4 in the north and many of the habitat metrics statistically differed between regions. Three potential biogeographic barriers were found at the Boca, Hillsboro, and Biscayne boundaries, where specific shallow-water habitats were absent further north; Middle Reef, Inner Reef, and oceanic seagrass beds respectively. The Bahamas Fault Zone boundary was also noted where changes in coastal morphologies occurred that could relate to subtle ecological changes. The analyses defined regions on a smaller scale more appropriate to regional management decisions, hence strengthening marine conservation planning with an objective, scientific foundation for decision making. They provide a framework for similar regional analyses elsewhere. PMID:22276204

  1. Racial disparities in diabetes mortality in the 50 most populous US cities.

    PubMed

    Rosenstock, Summer; Whitman, Steve; West, Joseph F; Balkin, Michael

    2014-10-01

    While studies have consistently shown that in the USA, non-Hispanic Blacks (Blacks) have higher diabetes prevalence, complication and death rates than non-Hispanic Whites (Whites), there are no studies that compare disparities in diabetes mortality across the largest US cities. This study presents and compares Black/White age-adjusted diabetes mortality rate ratios (RRs), calculated using national death files and census data, for the 50 most populous US cities. Relationships between city-level diabetes mortality RRs and 12 ecological variables were explored using bivariate correlation analyses. Multivariate analyses were conducted using negative binomial regression to examine how much of the disparity could be explained by these variables. Blacks had statistically significantly higher mortality rates compared to Whites in 39 of the 41 cities included in analyses, with statistically significant rate ratios ranging from 1.57 (95 % CI: 1.33-1.86) in Baltimore to 3.78 (95 % CI: 2.84-5.02) in Washington, DC. Analyses showed that economic inequality was strongly correlated with the diabetes mortality disparity, driven by differences in White poverty levels. This was followed by segregation. Multivariate analyses showed that adjusting for Black/White poverty alone explained 58.5 % of the disparity. Adjusting for Black/White poverty and segregation explained 72.6 % of the disparity. This study emphasizes the role that inequalities in social and economic determinants, rather than for example poverty on its own, play in Black/White diabetes mortality disparities. It also highlights how the magnitude of the disparity and the factors that influence it can vary greatly across cities, underscoring the importance of using local data to identify context specific barriers and develop effective interventions to eliminate health disparities.
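
    The rate ratios reported above are age-adjusted. As a simplified, hedged illustration of the basic calculation only, the sketch below computes a crude mortality rate ratio and a log-scale 95% CI from hypothetical death counts and person-years; direct age standardization, as used in the study, would additionally weight age-specific rates by a standard population.

    ```r
    # Crude mortality rate ratio with a log-scale 95% CI (hypothetical counts).
    # The study itself used age-adjusted rates; the standardization step is omitted.
    deaths_black <- 220; py_black <- 400000     # deaths and person-years at risk
    deaths_white <- 310; py_white <- 1100000

    rr     <- (deaths_black / py_black) / (deaths_white / py_white)
    se_log <- sqrt(1 / deaths_black + 1 / deaths_white)   # Poisson-based SE of log RR
    ci     <- exp(log(rr) + c(-1.96, 1.96) * se_log)
    c(RR = rr, lower = ci[1], upper = ci[2])
    ```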

  2. Bridging ImmunoGenomic Data Analysis Workflow Gaps (BIGDAWG): An integrated case-control analysis pipeline.

    PubMed

    Pappas, Derek J; Marin, Wesley; Hollenbach, Jill A; Mack, Steven J

    2016-03-01

    Bridging ImmunoGenomic Data-Analysis Workflow Gaps (BIGDAWG) is an integrated data-analysis pipeline designed for the standardized analysis of highly-polymorphic genetic data, specifically for the HLA and KIR genetic systems. Most modern genetic analysis programs are designed for the analysis of single nucleotide polymorphisms, but the highly polymorphic nature of HLA and KIR data require specialized methods of data analysis. BIGDAWG performs case-control data analyses of highly polymorphic genotype data characteristic of the HLA and KIR loci. BIGDAWG performs tests for Hardy-Weinberg equilibrium, calculates allele frequencies and bins low-frequency alleles for k×2 and 2×2 chi-squared tests, and calculates odds ratios, confidence intervals and p-values for each allele. When multi-locus genotype data are available, BIGDAWG estimates user-specified haplotypes and performs the same binning and statistical calculations for each haplotype. For the HLA loci, BIGDAWG performs the same analyses at the individual amino-acid level. Finally, BIGDAWG generates figures and tables for each of these comparisons. BIGDAWG obviates the error-prone reformatting needed to traffic data between multiple programs, and streamlines and standardizes the data-analysis process for case-control studies of highly polymorphic data. BIGDAWG has been implemented as the bigdawg R package and as a free web application at bigdawg.immunogenomics.org. Copyright © 2015 American Society for Histocompatibility and Immunogenetics. Published by Elsevier Inc. All rights reserved.

  3. Accounting for autocorrelation in multi-drug resistant tuberculosis predictors using a set of parsimonious orthogonal eigenvectors aggregated in geographic space.

    PubMed

    Jacob, Benjamin J; Krapp, Fiorella; Ponce, Mario; Gottuzzo, Eduardo; Griffith, Daniel A; Novak, Robert J

    2010-05-01

    Spatial autocorrelation is problematic for classical hierarchical cluster detection tests commonly used in multi-drug resistant tuberculosis (MDR-TB) analyses as considerable random error can occur. Therefore, when MDR-TB clusters are spatially autocorrelated the assumption that the clusters are independently random is invalid. In this research, a product moment correlation coefficient (i.e., the Moran's coefficient) was used to quantify local spatial variation in multiple clinical and environmental predictor variables sampled in San Juan de Lurigancho, Lima, Peru. Initially, QuickBird 0.61 m data, encompassing visible bands and the near infra-red bands, were selected to synthesize images of land cover attributes of the study site. Data of residential addresses of individual patients with smear-positive MDR-TB were geocoded, prevalence rates calculated and then digitally overlaid onto the satellite data within a 2 km buffer of 31 georeferenced health centers, using a 10 m² grid-based algorithm. Geographical information system (GIS)-gridded measurements of each health center were generated based on preliminary base maps of the georeferenced data aggregated to block groups and census tracts within each buffered area. A three-dimensional model of the study site was constructed based on a digital elevation model (DEM) to determine terrain covariates associated with the sampled MDR-TB covariates. Pearson's correlation was used to evaluate the linear relationship between the DEM and the sampled MDR-TB data. A SAS/GIS® module was then used to calculate univariate statistics and to perform linear and non-linear regression analyses using the sampled predictor variables. The estimates generated from a global autocorrelation analysis were then spatially decomposed into empirical orthogonal bases using a negative binomial regression with a non-homogeneous mean. Results of the DEM analyses indicated a statistically non-significant, linear relationship between georeferenced health centers and the sampled covariate elevation. The data exhibited positive spatial autocorrelation and the decomposition of Moran's coefficient into uncorrelated, orthogonal map pattern components revealed global spatial heterogeneities necessary to capture latent autocorrelation in the MDR-TB model. It was thus shown that Poisson regression analyses and spatial eigenvector mapping can elucidate the mechanics of MDR-TB transmission by prioritizing clinical and environmental-sampled predictor variables for identifying high risk populations.
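
    Moran's coefficient quantifies global spatial autocorrelation in a georeferenced variable. The sketch below computes it with ape::Moran.I for hypothetical prevalence values at hypothetical coordinates, using inverse-distance weights; the study's further step of decomposing the autocorrelation into orthogonal eigenvector map patterns is not shown.

    ```r
    # Moran's I for spatial autocorrelation of georeferenced prevalence values,
    # with inverse-distance weights (coordinates and rates are hypothetical).
    library(ape)

    set.seed(7)
    n    <- 31                                   # e.g., one value per health centre
    lon  <- runif(n, -77.05, -76.95)
    lat  <- runif(n, -12.05, -11.95)
    prev <- rpois(n, lambda = 5) / 1000          # hypothetical prevalence rates

    d <- as.matrix(dist(cbind(lon, lat)))        # pairwise distances
    w <- 1 / d                                   # inverse-distance weight matrix
    diag(w) <- 0

    Moran.I(prev, weight = w)                    # observed, expected, sd, p-value
    ```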

  4. SNPassoc: an R package to perform whole genome association studies.

    PubMed

    González, Juan R; Armengol, Lluís; Solé, Xavier; Guinó, Elisabet; Mercader, Josep M; Estivill, Xavier; Moreno, Víctor

    2007-03-01

    The popularization of large-scale genotyping projects has led to the widespread adoption of genetic association studies as the tool of choice in the search for single nucleotide polymorphisms (SNPs) underlying susceptibility to complex diseases. Although the analysis of individual SNPs is a relatively trivial task, when the number of SNPs is large and multiple genetic models need to be explored, a tool to automate the analyses becomes necessary. In order to address this issue, we developed SNPassoc, an R package to carry out the most common analyses in whole genome association studies. These analyses include descriptive statistics and exploratory analysis of missing values, calculation of Hardy-Weinberg equilibrium, analysis of association based on generalized linear models (either for quantitative or binary traits), and analysis of multiple SNPs (haplotype and epistasis analysis). Package SNPassoc is available at CRAN from http://cran.r-project.org. A tutorial is available on Bioinformatics online and at http://davinci.crg.es/estivill_lab/snpassoc.
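
    For orientation, the sketch below shows two of the per-SNP steps such a package automates, written generically in base R rather than with SNPassoc's own interface: a Hardy-Weinberg equilibrium chi-square test and a genotype-by-case/control association test. The genotype counts are hypothetical.

    ```r
    # Per-SNP steps that packages like SNPassoc automate, shown generically.
    # Hardy-Weinberg chi-square test from genotype counts (biallelic locus, df = 1):
    hwe_chisq <- function(n_AA, n_Aa, n_aa) {
      n <- n_AA + n_Aa + n_aa
      p <- (2 * n_AA + n_Aa) / (2 * n)                 # frequency of allele A
      expected <- n * c(p^2, 2 * p * (1 - p), (1 - p)^2)
      chisq <- sum((c(n_AA, n_Aa, n_aa) - expected)^2 / expected)
      pchisq(chisq, df = 1, lower.tail = FALSE)
    }
    hwe_chisq(n_AA = 240, n_Aa = 210, n_aa = 50)       # e.g., tested in controls

    # Genotype-by-status association (3 x 2 contingency table, hypothetical counts)
    geno <- matrix(c(120, 240,                         # AA: cases, controls
                     160, 210,                         # Aa
                      70,  50),                        # aa
                   ncol = 2, byrow = TRUE,
                   dimnames = list(c("AA", "Aa", "aa"), c("case", "control")))
    chisq.test(geno)
    ```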

  5. Quantifying and reducing statistical uncertainty in sample-based health program costing studies in low- and middle-income countries.

    PubMed

    Rivera-Rodriguez, Claudia L; Resch, Stephen; Haneuse, Sebastien

    2018-01-01

    In many low- and middle-income countries, the costs of delivering public health programs such as for HIV/AIDS, nutrition, and immunization are not routinely tracked. A number of recent studies have sought to estimate program costs on the basis of detailed information collected on a subsample of facilities. While unbiased estimates can be obtained via accurate measurement and appropriate analyses, they are subject to statistical uncertainty. Quantification of this uncertainty, for example, via standard errors and/or 95% confidence intervals, provides important contextual information for decision-makers and for the design of future costing studies. While other forms of uncertainty, such as that due to model misspecification, are considered and can be investigated through sensitivity analyses, statistical uncertainty is often not reported in studies estimating the total program costs. This may be due to a lack of awareness/understanding of (1) the technical details regarding uncertainty estimation and (2) the availability of software with which to calculate uncertainty for estimators resulting from complex surveys. We provide an overview of statistical uncertainty in the context of complex costing surveys, emphasizing the various potential specific sources that contribute to overall uncertainty. We describe how analysts can compute measures of uncertainty, either via appropriately derived formulae or through resampling techniques such as the bootstrap. We also provide an overview of calibration as a means of using additional auxiliary information that is readily available for the entire program, such as the total number of doses administered, to decrease uncertainty and thereby improve decision-making and the planning of future studies. A recent study of the national program for routine immunization in Honduras shows that uncertainty can be reduced by using information available prior to the study. This method can not only be used when estimating the total cost of delivering established health programs but also to decrease uncertainty when the interest lies in assessing the incremental effect of an intervention. Measures of statistical uncertainty associated with survey-based estimates of program costs, such as standard errors and 95% confidence intervals, provide important contextual information for health policy decision-making and key inputs for the design of future costing studies. Such measures are often not reported, possibly because of technical challenges associated with their calculation and a lack of awareness of appropriate software. Modern statistical analysis methods for survey data, such as calibration, provide a means to exploit additional information that is readily available but was not used in the design of the study to significantly improve the estimation of total cost through the reduction of statistical uncertainty.

  6. Quantifying and reducing statistical uncertainty in sample-based health program costing studies in low- and middle-income countries

    PubMed Central

    Resch, Stephen

    2018-01-01

    Objectives: In many low- and middle-income countries, the costs of delivering public health programs such as for HIV/AIDS, nutrition, and immunization are not routinely tracked. A number of recent studies have sought to estimate program costs on the basis of detailed information collected on a subsample of facilities. While unbiased estimates can be obtained via accurate measurement and appropriate analyses, they are subject to statistical uncertainty. Quantification of this uncertainty, for example, via standard errors and/or 95% confidence intervals, provides important contextual information for decision-makers and for the design of future costing studies. While other forms of uncertainty, such as that due to model misspecification, are considered and can be investigated through sensitivity analyses, statistical uncertainty is often not reported in studies estimating the total program costs. This may be due to a lack of awareness/understanding of (1) the technical details regarding uncertainty estimation and (2) the availability of software with which to calculate uncertainty for estimators resulting from complex surveys. We provide an overview of statistical uncertainty in the context of complex costing surveys, emphasizing the various potential specific sources that contribute to overall uncertainty. Methods: We describe how analysts can compute measures of uncertainty, either via appropriately derived formulae or through resampling techniques such as the bootstrap. We also provide an overview of calibration as a means of using additional auxiliary information that is readily available for the entire program, such as the total number of doses administered, to decrease uncertainty and thereby improve decision-making and the planning of future studies. Results: A recent study of the national program for routine immunization in Honduras shows that uncertainty can be reduced by using information available prior to the study. This method can not only be used when estimating the total cost of delivering established health programs but also to decrease uncertainty when the interest lies in assessing the incremental effect of an intervention. Conclusion: Measures of statistical uncertainty associated with survey-based estimates of program costs, such as standard errors and 95% confidence intervals, provide important contextual information for health policy decision-making and key inputs for the design of future costing studies. Such measures are often not reported, possibly because of technical challenges associated with their calculation and a lack of awareness of appropriate software. Modern statistical analysis methods for survey data, such as calibration, provide a means to exploit additional information that is readily available but was not used in the design of the study to significantly improve the estimation of total cost through the reduction of statistical uncertainty. PMID:29636964
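
    One of the resampling approaches mentioned above is the nonparametric bootstrap. The sketch below bootstraps the standard error and a percentile 95% CI of a total program cost scaled up from a facility subsample; the facility costs and totals are hypothetical, and real costing surveys involve complex designs (strata, weights, calibration) not reflected in this simple-random-sample illustration.

    ```r
    # Nonparametric bootstrap of an estimated total program cost expanded from a
    # simple random subsample of facilities (hypothetical data; real surveys use
    # stratified designs, survey weights, and calibration).
    set.seed(123)
    N <- 950                                              # facilities in the program
    sample_costs <- rgamma(60, shape = 2, scale = 1500)   # observed facility costs

    total_hat <- N * mean(sample_costs)                   # expansion estimator

    boot_totals <- replicate(2000, N * mean(sample(sample_costs, replace = TRUE)))
    c(estimate = total_hat,
      se       = sd(boot_totals),                         # bootstrap standard error
      quantile(boot_totals, c(0.025, 0.975)))             # percentile 95% CI
    ```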

  7. On the role of the transient eddies in maintaining the seasonal mean circulation

    NASA Technical Reports Server (NTRS)

    White, G. H.; Hoskins, B. J.

    1984-01-01

    The role of transient eddies in maintaining the observed local seasonal mean atmospheric circulation was investigated by examining the time-averaged momentum balances and omega equation, using seasonal statistics calculated from daily operational analyses by the European Centre for Medium Range Weather Forecasts. While both the Northern and Southern Hemispheres and several seasons were studied, emphasis was placed upon the Northern Hemisphere during December 1981-February 1982. The results showed that transient eddies played a secondary role in the seasonal mean zonal momentum budget and in the forcing of seasonal mean vertical and ageostrophic motion.

  8. Picante: R tools for integrating phylogenies and ecology.

    PubMed

    Kembel, Steven W; Cowan, Peter D; Helmus, Matthew R; Cornwell, William K; Morlon, Helene; Ackerly, David D; Blomberg, Simon P; Webb, Campbell O

    2010-06-01

    Picante is a software package that provides a comprehensive set of tools for analyzing the phylogenetic and trait diversity of ecological communities. The package calculates phylogenetic diversity metrics, performs trait comparative analyses, manipulates phenotypic and phylogenetic data, and performs tests for phylogenetic signal in trait distributions, community structure and species interactions. Picante is a package for the R statistical language and environment written in R and C, released under a GPL v2 open-source license, and freely available on the web (http://picante.r-forge.r-project.org) and from CRAN (http://cran.r-project.org).
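
    A minimal usage sketch follows, computing Faith's phylogenetic diversity for two communities and testing for phylogenetic signal in a trait with picante on a simulated tree; the community matrix and trait values are hypothetical, and argument details should be checked against the current package documentation.

    ```r
    # Faith's phylogenetic diversity (PD) per community and a phylogenetic-signal
    # test for a trait, using picante with a simulated tree (illustrative only).
    library(ape)
    library(picante)

    set.seed(1)
    tree <- rcoal(10)                                   # random 10-species tree
    comm <- matrix(rbinom(20, 1, 0.6), nrow = 2,        # presence/absence matrix
                   dimnames = list(c("siteA", "siteB"), tree$tip.label))

    pd(comm, tree)                                      # PD and species richness per site

    trait <- setNames(rnorm(10), tree$tip.label)        # hypothetical continuous trait
    phylosignal(trait[tree$tip.label], tree)            # Blomberg's K and p-value
    ```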

  9. Discriminatory power of water polo game-related statistics at the 2008 Olympic Games.

    PubMed

    Escalante, Yolanda; Saavedra, Jose M; Mansilla, Mirella; Tella, Victor

    2011-02-01

    The aims of this study were (1) to compare water polo game-related statistics by context (winning and losing teams) and sex (men and women), and (2) to identify characteristics discriminating the performances for each sex. The game-related statistics of the 64 matches (44 men's and 20 women's) played in the final phase of the Olympic Games held in Beijing in 2008 were analysed. Unpaired t-tests compared winners and losers and men and women, and confidence intervals and effect sizes of the differences were calculated. The results were subjected to a discriminant analysis to identify the differentiating game-related statistics of the winning and losing teams. The results showed the differences between winning and losing men's teams to be in both defence and offence, whereas in women's teams they were only in offence. In men's games, passing (assists), aggressive play (exclusions), centre position effectiveness (centre shots), and goalkeeper defence (goalkeeper-blocked 5-m shots) predominated, whereas in women's games the play was more dynamic (possessions). The variable that most discriminated performance in men was goalkeeper-blocked shots, and in women shooting effectiveness (shots). These results should help coaches when planning training and competition.
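
    As an illustration of the discriminant step only, the sketch below fits a linear discriminant function separating winning from losing teams on a few game-related statistics with MASS::lda; the variables and data are hypothetical and the published analysis used the full set of recorded match statistics.

    ```r
    # Discriminant analysis separating winners from losers on game-related
    # statistics (hypothetical data and variable names).
    library(MASS)

    set.seed(3)
    n <- 88                                              # team-match observations
    games <- data.frame(
      outcome       = rep(c("win", "loss"), each = n / 2),
      blocked_shots = c(rnorm(n / 2, 9, 2),  rnorm(n / 2, 7, 2)),
      assists       = c(rnorm(n / 2, 11, 3), rnorm(n / 2, 9, 3)),
      exclusions    = c(rnorm(n / 2, 6, 2),  rnorm(n / 2, 7, 2))
    )

    fit <- lda(outcome ~ blocked_shots + assists + exclusions, data = games)
    fit$scaling                                # coefficients of the discriminant function
    table(games$outcome, predict(fit)$class)   # reclassification table
    ```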

  10. Kriging analysis of mean annual precipitation, Powder River Basin, Montana and Wyoming

    USGS Publications Warehouse

    Karlinger, M.R.; Skrivan, James A.

    1981-01-01

    Kriging is a statistical estimation technique for regionalized variables which exhibit an autocorrelation structure. Such structure can be described by a semi-variogram of the observed data. The kriging estimate at any point is a weighted average of the data, where the weights are determined using the semi-variogram and an assumed drift, or lack of drift, in the data. Block, or areal, estimates can also be calculated. The kriging algorithm, based on unbiased and minimum-variance estimates, involves a linear system of equations to calculate the weights. Kriging variances can then be used to give confidence intervals of the resulting estimates. Mean annual precipitation in the Powder River basin, Montana and Wyoming, is an important variable when considering restoration of coal-strip-mining lands of the region. Two kriging analyses involving data at 60 stations were made--one assuming no drift in precipitation, and one a partial quadratic drift simulating orographic effects. Contour maps of estimates of mean annual precipitation were similar for both analyses, as were the corresponding contours of kriging variances. Block estimates of mean annual precipitation were made for two subbasins. Runoff estimates were 1-2 percent of the kriged block estimates. (USGS)
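
    One way to reproduce this workflow in R today is with the gstat package: fit a semivariogram model to station data and krige onto a grid, obtaining both predictions and kriging variances. The sketch below uses simulated stations and an ordinary-kriging (no-drift) model only; it is not the USGS analysis, and the quadratic-drift (universal kriging) variant is not shown.

    ```r
    # Ordinary kriging of mean annual precipitation from point stations with gstat:
    # empirical semivariogram, spherical model fit, gridded prediction + variance.
    library(sp)
    library(gstat)

    set.seed(10)
    stations <- data.frame(x = runif(60, 0, 100), y = runif(60, 0, 100))
    stations$precip <- 300 + 0.8 * stations$y + rnorm(60, sd = 20)   # mm/yr, simulated
    coordinates(stations) <- ~ x + y

    v  <- variogram(precip ~ 1, stations)                  # empirical semivariogram
    vm <- fit.variogram(v, vgm(psill = 400, model = "Sph", range = 50, nugget = 50))

    grid <- expand.grid(x = seq(0, 100, 5), y = seq(0, 100, 5))
    coordinates(grid) <- ~ x + y
    gridded(grid) <- TRUE

    kr <- krige(precip ~ 1, stations, grid, model = vm)    # var1.pred, var1.var
    summary(kr$var1.var)                                   # kriging variances
    ```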

  11. Meta-analyses of the 5-HTTLPR polymorphisms and post-traumatic stress disorder.

    PubMed

    Navarro-Mateu, Fernando; Escámez, Teresa; Koenen, Karestan C; Alonso, Jordi; Sánchez-Meca, Julio

    2013-01-01

    To conduct a meta-analysis of all published genetic association studies of 5-HTTLPR polymorphisms performed in PTSD cases. Potential studies were identified through PubMed/MEDLINE, EMBASE, Web of Science databases (Web of Knowledge, WoK), PsychINFO, PsychArticles and HuGeNet (Human Genome Epidemiology Network) up until December 2011. Published observational studies reporting genotype or allele frequencies of this genetic factor in PTSD cases and in non-PTSD controls were all considered eligible for inclusion in this systematic review. Two reviewers selected studies for possible inclusion and extracted data independently following a standardized protocol. A biallelic and a triallelic meta-analysis, including the total S and S' frequencies, the dominant (S+/LL and S'+/L'L') and the recessive model (SS/L+ and S'S'/L'+), was performed with a random-effect model to calculate the pooled OR and its corresponding 95% CI. Forest plots and Cochran's Q-Statistic and I(2) index were calculated to check for heterogeneity. Subgroup analyses and meta-regression were carried out to analyze potential moderators. Publication bias and quality of reporting were also analyzed. 13 studies met our inclusion criteria, providing a total sample of 1874 patients with PTSD and 7785 controls in the biallelic meta-analyses and 627 and 3524, respectively, in the triallelic. None of the meta-analyses showed evidence of an association between 5-HTTLPR and PTSD but several characteristics (exposure to the same principal stressor for PTSD cases and controls, adjustment for potential confounding variables, blind assessment, study design, type of PTSD, ethnic distribution and Total Quality Score) influenced the results in subgroup analyses and meta-regression. There was no evidence of potential publication bias. Current evidence does not support a direct effect of 5-HTTLPR polymorphisms on PTSD. Further analyses of gene-environment interactions, epigenetic modulation and new studies with large samples and/or meta-analyses are required.
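
    A hedged sketch of the random-effects pooling step with the metafor package follows: log odds ratios from 2x2 genotype-by-status counts, a DerSimonian-Laird random-effects model, and the Q and I² heterogeneity statistics. The counts are hypothetical and do not represent the 13 included studies.

    ```r
    # Random-effects pooled odds ratio with heterogeneity statistics (Q, I^2),
    # using metafor. The 2x2 counts below are hypothetical.
    library(metafor)

    dat <- data.frame(
      ai = c(55, 40, 62, 33, 48),            # cases carrying the risk allele
      bi = c(45, 60, 38, 47, 52),            # cases not carrying it
      ci = c(150, 120, 180, 90, 140),        # controls carrying the risk allele
      di = c(160, 140, 170, 110, 150)        # controls not carrying it
    )

    dat <- escalc(measure = "OR", ai = ai, bi = bi, ci = ci, di = di, data = dat)
    res <- rma(yi, vi, data = dat, method = "DL")   # random-effects model
    res                                             # pooled log OR, Q, I^2
    exp(c(res$b, res$ci.lb, res$ci.ub))             # pooled OR and 95% CI
    forest(res, atransf = exp)                      # forest plot on the OR scale
    ```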

  12. Power calculator for instrumental variable analysis in pharmacoepidemiology

    PubMed Central

    Walker, Venexia M; Davies, Neil M; Windmeijer, Frank; Burgess, Stephen; Martin, Richard M

    2017-01-01

    Background: Instrumental variable analysis, for example with physicians’ prescribing preferences as an instrument for medications issued in primary care, is an increasingly popular method in the field of pharmacoepidemiology. Existing power calculators for studies using instrumental variable analysis, such as Mendelian randomization power calculators, do not allow for the structure of research questions in this field. This is because the analysis in pharmacoepidemiology will typically have stronger instruments and detect larger causal effects than in other fields. Consequently, there is a need for dedicated power calculators for pharmacoepidemiological research. Methods and Results: The formula for calculating the power of a study using instrumental variable analysis in the context of pharmacoepidemiology is derived before being validated by a simulation study. The formula is applicable for studies using a single binary instrument to analyse the causal effect of a binary exposure on a continuous outcome. An online calculator, as well as packages in both R and Stata, are provided for the implementation of the formula by others. Conclusions: The statistical power of instrumental variable analysis in pharmacoepidemiological studies to detect a clinically meaningful treatment effect is an important consideration. Research questions in this field have distinct structures that must be accounted for when calculating power. The formula presented differs from existing instrumental variable power formulae due to its parametrization, which is designed specifically for ease of use by pharmacoepidemiologists. PMID:28575313
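
    The study setting described above (a single binary instrument, binary exposure, continuous outcome) can also be explored by simulation rather than the authors' closed-form formula; the sketch below estimates power by repeatedly simulating data and fitting two-stage least squares with AER::ivreg. All parameter values are illustrative assumptions, and a simulation is slower and noisier than the dedicated calculator.

    ```r
    # Simulation-based power check for an IV analysis with a binary instrument,
    # binary exposure, and continuous outcome (2SLS via AER::ivreg). Illustrative
    # alternative to the paper's closed-form formula; parameters are assumptions.
    library(AER)

    iv_power <- function(n, beta = 0.2, n_sim = 500, alpha = 0.05) {
      mean(replicate(n_sim, {
        z <- rbinom(n, 1, 0.5)                          # e.g., prescribing preference
        u <- rnorm(n)                                   # unmeasured confounding
        x <- rbinom(n, 1, plogis(-0.5 + 1.5 * z + u))   # treatment received
        y <- beta * x + u + rnorm(n)                    # continuous outcome
        fit <- ivreg(y ~ x | z)
        summary(fit)$coefficients["x", "Pr(>|t|)"] < alpha
      }))
    }

    iv_power(n = 5000)   # estimated power to detect a causal effect of 0.2
    ```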

  13. PopSc: Computing Toolkit for Basic Statistics of Molecular Population Genetics Simultaneously Implemented in Web-Based Calculator, Python and R

    PubMed Central

    Huang, Ying; Li, Cao; Liu, Linhai; Jia, Xianbo; Lai, Song-Jia

    2016-01-01

    Although various computer tools have been elaborately developed to calculate a series of statistics in molecular population genetics for both small- and large-scale DNA data, there is no efficient and easy-to-use toolkit available yet for exclusively focusing on the steps of mathematical calculation. Here, we present PopSc, a bioinformatic toolkit for calculating 45 basic statistics in molecular population genetics, which could be categorized into three classes, including (i) genetic diversity of DNA sequences, (ii) statistical tests for neutral evolution, and (iii) measures of genetic differentiation among populations. In contrast to the existing computer tools, PopSc was designed to directly accept the intermediate metadata, such as allele frequencies, rather than the raw DNA sequences or genotyping results. PopSc is first implemented as the web-based calculator with user-friendly interface, which greatly facilitates the teaching of population genetics in class and also promotes the convenient and straightforward calculation of statistics in research. Additionally, we also provide the Python library and R package of PopSc, which can be flexibly integrated into other advanced bioinformatic packages of population genetics analysis. PMID:27792763

  14. PopSc: Computing Toolkit for Basic Statistics of Molecular Population Genetics Simultaneously Implemented in Web-Based Calculator, Python and R.

    PubMed

    Chen, Shi-Yi; Deng, Feilong; Huang, Ying; Li, Cao; Liu, Linhai; Jia, Xianbo; Lai, Song-Jia

    2016-01-01

    Although various computer tools have been elaborately developed to calculate a series of statistics in molecular population genetics for both small- and large-scale DNA data, there is no efficient and easy-to-use toolkit available yet for exclusively focusing on the steps of mathematical calculation. Here, we present PopSc, a bioinformatic toolkit for calculating 45 basic statistics in molecular population genetics, which could be categorized into three classes, including (i) genetic diversity of DNA sequences, (ii) statistical tests for neutral evolution, and (iii) measures of genetic differentiation among populations. In contrast to the existing computer tools, PopSc was designed to directly accept the intermediate metadata, such as allele frequencies, rather than the raw DNA sequences or genotyping results. PopSc is first implemented as the web-based calculator with user-friendly interface, which greatly facilitates the teaching of population genetics in class and also promotes the convenient and straightforward calculation of statistics in research. Additionally, we also provide the Python library and R package of PopSc, which can be flexibly integrated into other advanced bioinformatic packages of population genetics analysis.
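
    For context, the sketch below computes two of the basic quantities such a toolkit derives directly from allele-frequency metadata: expected heterozygosity (gene diversity) at one locus and Wright's Fst for a biallelic locus across subpopulations. This is a generic base-R illustration, not PopSc's own interface, and it assumes equal subpopulation sizes.

    ```r
    # Basic molecular population-genetic statistics computed directly from allele
    # frequencies (the kind of intermediate metadata PopSc accepts). Generic sketch.

    # Expected heterozygosity (gene diversity) from allele frequencies at one locus
    expected_het <- function(p) 1 - sum(p^2)
    expected_het(c(0.6, 0.3, 0.1))            # three alleles

    # Wright's Fst for a biallelic locus from subpopulation allele frequencies,
    # Fst = (Ht - mean(Hs)) / Ht, assuming equal subpopulation sizes
    fst_biallelic <- function(p_sub) {
      hs    <- 2 * p_sub * (1 - p_sub)        # within-subpopulation heterozygosity
      p_bar <- mean(p_sub)
      ht    <- 2 * p_bar * (1 - p_bar)        # total expected heterozygosity
      (ht - mean(hs)) / ht
    }
    fst_biallelic(c(0.2, 0.5, 0.8))
    ```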

  15. Altered Brain Activity in Unipolar Depression Revisited: Meta-analyses of Neuroimaging Studies.

    PubMed

    Müller, Veronika I; Cieslik, Edna C; Serbanescu, Ilinca; Laird, Angela R; Fox, Peter T; Eickhoff, Simon B

    2017-01-01

    During the past 20 years, numerous neuroimaging experiments have investigated aberrant brain activation during cognitive and emotional processing in patients with unipolar depression (UD). The results of those investigations, however, vary considerably; moreover, previous meta-analyses also yielded inconsistent findings. To readdress aberrant brain activation in UD as evidenced by neuroimaging experiments on cognitive and/or emotional processing. Neuroimaging experiments published from January 1, 1997, to October 1, 2015, were identified by a literature search of PubMed, Web of Science, and Google Scholar using different combinations of the terms fMRI (functional magnetic resonance imaging), PET (positron emission tomography), neural, major depression, depression, major depressive disorder, unipolar depression, dysthymia, emotion, emotional, affective, cognitive, task, memory, working memory, inhibition, control, n-back, and Stroop. Neuroimaging experiments (using fMRI or PET) reporting whole-brain results of group comparisons between adults with UD and healthy control individuals as coordinates in a standard anatomic reference space and using an emotional or/and cognitive challenging task were selected. Coordinates reported to show significant activation differences between UD and healthy controls during emotional or cognitive processing were extracted. By using the revised activation likelihood estimation algorithm, different meta-analyses were calculated. Meta-analyses tested for brain regions consistently found to show aberrant brain activation in UD compared with controls. Analyses were calculated across all emotional processing experiments, all cognitive processing experiments, positive emotion processing, negative emotion processing, experiments using emotional face stimuli, experiments with a sex discrimination task, and memory processing. All meta-analyses were calculated across experiments independent of reporting an increase or decrease of activity in major depressive disorder. For meta-analyses with a minimum of 17 experiments available, separate analyses were performed for increases and decreases. In total, 57 studies with 99 individual neuroimaging experiments comprising in total 1058 patients were included; 34 of them tested cognitive and 65 emotional processing. Overall analyses across cognitive processing experiments (P > .29) and across emotional processing experiments (P > .47) revealed no significant results. Similarly, no convergence was found in analyses investigating positive (all P > .15), negative (all P > .76), or memory (all P > .48) processes. Analyses that restricted inclusion of confounds (eg, medication, comorbidity, age) did not change the results. Inconsistencies exist across individual experiments investigating aberrant brain activity in UD and replication problems across previous neuroimaging meta-analyses. For individual experiments, these inconsistencies may relate to use of uncorrected inference procedures, differences in experimental design and contrasts, or heterogeneous clinical populations; meta-analytically, differences may be attributable to varying inclusion and exclusion criteria or rather liberal statistical inference approaches.

  16. Altered Brain Activity in Unipolar Depression Revisited Meta-analyses of Neuroimaging Studies

    PubMed Central

    Müller, Veronika I.; Cieslik, Edna C.; Serbanescu, Ilinca; Laird, Angela R.; Fox, Peter T.; Eickhoff, Simon B.

    2017-01-01

    IMPORTANCE During the past 20 years, numerous neuroimaging experiments have investigated aberrant brain activation during cognitive and emotional processing in patients with unipolar depression (UD). The results of those investigations, however, vary considerably; moreover, previous meta-analyses also yielded inconsistent findings. OBJECTIVE To readdress aberrant brain activation in UD as evidenced by neuroimaging experiments on cognitive and/or emotional processing. DATA SOURCES Neuroimaging experiments published from January 1, 1997, to October 1, 2015, were identified by a literature search of PubMed, Web of Science, and Google Scholar using different combinations of the terms fMRI (functional magnetic resonance imaging), PET (positron emission tomography), neural, major depression, depression, major depressive disorder, unipolar depression, dysthymia, emotion, emotional, affective, cognitive, task, memory, working memory, inhibition, control, n-back, and Stroop. STUDY SELECTION Neuroimaging experiments (using fMRI or PET) reporting whole-brain results of group comparisons between adults with UD and healthy control individuals as coordinates in a standard anatomic reference space and using an emotional or/and cognitive challenging task were selected. DATA EXTRACTION AND SYNTHESIS Coordinates reported to show significant activation differences between UD and healthy controls during emotional or cognitive processing were extracted. By using the revised activation likelihood estimation algorithm, different meta-analyses were calculated. MAIN OUTCOMES AND MEASURES Meta-analyses tested for brain regions consistently found to show aberrant brain activation in UD compared with controls. Analyses were calculated across all emotional processing experiments, all cognitive processing experiments, positive emotion processing, negative emotion processing, experiments using emotional face stimuli, experiments with a sex discrimination task, and memory processing. All meta-analyses were calculated across experiments independent of reporting an increase or decrease of activity in major depressive disorder. For meta-analyses with a minimum of 17 experiments available, separate analyses were performed for increases and decreases. RESULTS In total, 57 studies with 99 individual neuroimaging experiments comprising in total 1058 patients were included; 34 of them tested cognitive and 65 emotional processing. Overall analyses across cognitive processing experiments (P > .29) and across emotional processing experiments (P > .47) revealed no significant results. Similarly, no convergence was found in analyses investigating positive (all P > .15), negative (all P > .76), or memory (all P > .48) processes. Analyses that restricted inclusion of confounds (eg, medication, comorbidity, age) did not change the results. CONCLUSIONS AND RELEVANCE Inconsistencies exist across individual experiments investigating aberrant brain activity in UD and replication problems across previous neuroimaging meta-analyses. For individual experiments, these inconsistencies may relate to use of uncorrected inference procedures, differences in experimental design and contrasts, or heterogeneous clinical populations; meta-analytically, differences may be attributable to varying inclusion and exclusion criteria or rather liberal statistical inference approaches. PMID:27829086

  17. Analysis of the sleep quality of elderly people using biomedical signals.

    PubMed

    Moreno-Alsasua, L; Garcia-Zapirain, B; Mendez-Zorrilla, A

    2015-01-01

    This paper presents a technical solution that analyses sleep signals captured by biomedical sensors to find possible disorders during rest. Specifically, the method evaluates electrooculogram (EOG) signals, skin conductance (GSR), air flow (AS), and body temperature. Next, a quantitative sleep quality analysis determines significant changes in the biological signals, and any similarities between them in a given time period. Filtering techniques such as the Fourier transform method and IIR filters process the signal and identify significant variations. Once these changes have been identified, all significant data are compared and a quantitative and statistical analysis is carried out to determine the level of a person's rest. To evaluate correlations and significant differences, a statistical analysis was performed, showing correlations between the EOG and AS signals (p=0.005), the EOG and GSR signals (p=0.037), and the EOG and body temperature (p=0.04). Doctors could use this information to monitor changes within a patient.

  18. High resolution 40Ar/39Ar chronostratigraphy of the Late Cretaceous El Gallo Formation, Baja California del Norte, Mexico

    NASA Astrophysics Data System (ADS)

    Renne, Paul R.; Fulford, Madeleine M.; Busby-Spera, Cathy

    1991-03-01

    Laser probe 40Ar/39Ar analyses of individual sanidine grains from four tuffs in the alluvial Late Cretaceous (Campanian) El Gallo Formation yield statistically distinct mean dates ranging from 74.87±0.05 Ma to 73.59±0.09 Ma. The exceptional precision of these dates permits calculation of statistically significant sediment accumulation rates that are much higher than passive sediment loading would cause, implying rapid tectonically induced subsidence. The dates bracket tightly the age of important dinosaur and mammalian faunas previously reported from the El Gallo Formation. The dates support an age less than 73 Ma for the Campanian/Maastrichtian stage boundary, younger than indicated by several currently used time scales. Further application of the single grain 40Ar/39Ar technique may be expected to greatly benefit stratigraphic studies of Mesozoic sedimentary basins and contribute to calibration of biostratigraphic and magnetostratigraphic time scales.

  19. Bispectral analysis of equatorial spread F density irregularities

    NASA Technical Reports Server (NTRS)

    Labelle, J.; Lund, E. J.

    1992-01-01

    Bispectral analysis has been applied to density irregularities at frequencies 5-30 Hz observed with a sounding rocket launched from Peru in March 1983. Unlike the power spectrum, the bispectrum contains statistical information about the phase relations between the Fourier components which make up the waveform. In the case of spread F data from 475 km, the 5-30 Hz portion of the spectrum displays overall enhanced bicoherence relative to that of the background instrumental noise and to that expected due to statistical considerations, implying that the observed f^(-2.5) power law spectrum has a significant non-Gaussian component. This is consistent with previous qualitative analyses. The bicoherence has also been calculated for simulated equatorial spread F density irregularities in approximately the same wavelength regime, and the resulting bispectrum has some features in common with that of the rocket data. The implications of this analysis for equatorial spread F are discussed, and some future investigations are suggested.

  20. An analysis of population and social change in London wards in the 1980s.

    PubMed

    Congdon, P

    1989-01-01

    "This paper discusses the estimation and projection of small area populations in London, [England] and considers trends in intercensal social and demographic indices which can be calculated using these estimates. Information available annually on vital statistics and electorates is combined with detailed data from the Census Small Area Statistics to derive demographic component based population estimates for London's electoral wards over five year periods. The availability of age disaggregated population estimates permits derivation of small area social indicators for intercensal years, for example, of unemployment and mortality. Trends in spatial inequality of such indicators during the 1980s are analysed and point to continuing wide differentials. A typology of population and social indicators gives an indication of the small area distribution of the recent population turnaround in inner London, and of its association with other social processes such as gentrification and ethnic concentration." excerpt

  1. Anisotropic analysis of trabecular architecture in human femur bone radiographs using quaternion wavelet transforms.

    PubMed

    Sangeetha, S; Sujatha, C M; Manamalli, D

    2014-01-01

    In this work, anisotropy of compressive and tensile strength regions of femur trabecular bone are analysed using quaternion wavelet transforms. The normal and abnormal femur trabecular bone radiographic images are considered for this study. The sub-anatomic regions, which include compressive and tensile regions, are delineated using pre-processing procedures. These delineated regions are subjected to quaternion wavelet transforms and statistical parameters are derived from the transformed images. These parameters are correlated with apparent porosity, which is derived from the strength regions. Further, anisotropy is also calculated from the transformed images and is analyzed. Results show that the anisotropy values derived from second and third phase components of quaternion wavelet transform are found to be distinct for normal and abnormal samples with high statistical significance for both compressive and tensile regions. These investigations demonstrate that architectural anisotropy derived from QWT analysis is able to differentiate normal and abnormal samples.

  2. Cluster analysis of European Y-chromosomal STR haplotypes using the discrete Laplace method.

    PubMed

    Andersen, Mikkel Meyer; Eriksen, Poul Svante; Morling, Niels

    2014-07-01

    The European Y-chromosomal short tandem repeat (STR) haplotype distribution has previously been analysed in various ways. Here, we introduce a new way of analysing population substructure using a new method based on clustering within the discrete Laplace exponential family that models the probability distribution of the Y-STR haplotypes. Creating a consistent statistical model of the haplotypes enables us to perform a wide range of analyses. Previously, haplotype frequency estimation using the discrete Laplace method has been validated. In this paper we investigate how the discrete Laplace method can be used for cluster analysis to further validate the discrete Laplace method. A very important practical fact is that the calculations can be performed on a normal computer. We identified two sub-clusters of the Eastern and Western European Y-STR haplotypes similar to results of previous studies. We also compared pairwise distances (between geographically separated samples) with those obtained using the AMOVA method and found good agreement. Further analyses that are impossible with AMOVA were made using the discrete Laplace method: analysis of the homogeneity in two different ways and calculating marginal STR distributions. We found that the Y-STR haplotypes from e.g. Finland were relatively homogeneous as opposed to the relatively heterogeneous Y-STR haplotypes from e.g. Lublin, Eastern Poland and Berlin, Germany. We demonstrated that the observed distributions of alleles at each locus were similar to the expected ones. We also compared pairwise distances between geographically separated samples from Africa with those obtained using the AMOVA method and found good agreement. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  3. How conservative is Fisher's exact test? A quantitative evaluation of the two-sample comparative binomial trial.

    PubMed

    Crans, Gerald G; Shuster, Jonathan J

    2008-08-15

    The debate as to which statistical methodology is most appropriate for the analysis of the two-sample comparative binomial trial has persisted for decades. Practitioners who favor the conditional methods of Fisher, Fisher's exact test (FET), claim that only experimental outcomes containing the same amount of information should be considered when performing analyses. Hence, the total number of successes should be fixed at its observed level in hypothetical repetitions of the experiment. Using conditional methods in clinical settings can pose interpretation difficulties, since results are derived using conditional sample spaces rather than the set of all possible outcomes. Perhaps more importantly from a clinical trial design perspective, this test can be too conservative, resulting in greater resource requirements and more subjects exposed to an experimental treatment. The actual significance level attained by FET (the size of the test) has not been reported in the statistical literature. Berger (J. R. Statist. Soc. D (The Statistician) 2001; 50:79-85) proposed assessing the conservativeness of conditional methods using p-value confidence intervals. In this paper we develop a numerical algorithm that calculates the size of FET for sample sizes, n, up to 125 per group at the two-sided significance level, alpha = 0.05. Additionally, this numerical method is used to define new significance levels alpha* = alpha + epsilon, where epsilon is a small positive number, for each n, such that the size of the test is as close as possible to the pre-specified alpha (0.05 for the current work) without exceeding it. Lastly, a sample size and power calculation example is presented, which demonstrates the statistical advantages of implementing the adjustment to FET (using alpha* instead of alpha) in the two-sample comparative binomial trial. 2008 John Wiley & Sons, Ltd.
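
    The abstract describes computing the attained size of FET numerically. As a rough illustration of the idea (a sketch, not the authors' algorithm), the following R snippet computes the size for equal group sizes n by building the two-sided FET rejection region over all possible outcomes and maximising the rejection probability over a grid of common null success probabilities; the grid and the n in the example are arbitrary choices.

    ```r
    ## Sketch: actual size of the two-sided Fisher's exact test for two groups of
    ## size n at nominal alpha, maximised over a grid of common success
    ## probabilities p under H0: p1 = p2 = p.
    fet_size <- function(n, alpha = 0.05, p_grid = seq(0.01, 0.99, by = 0.01)) {
      # p-value of the two-sided FET for every possible outcome (x1, x2)
      pvals <- outer(0:n, 0:n, Vectorize(function(x1, x2) {
        fisher.test(matrix(c(x1, n - x1, x2, n - x2), nrow = 2))$p.value
      }))
      reject <- pvals <= alpha                      # rejection region
      sizes <- sapply(p_grid, function(p) {
        probs <- outer(dbinom(0:n, n, p), dbinom(0:n, n, p))  # joint null probabilities
        sum(probs[reject])                          # rejection probability at this p
      })
      max(sizes)                                    # size = supremum over the null
    }

    fet_size(25)   # typically well below the nominal 0.05, illustrating the conservativeness
    ```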

  4. "What If" Analyses: Ways to Interpret Statistical Significance Test Results Using EXCEL or "R"

    ERIC Educational Resources Information Center

    Ozturk, Elif

    2012-01-01

    The present paper aims to review two motivations for conducting "what if" analyses using Excel and "R" in order to understand statistical significance tests in the context of sample size. "What if" analyses can be used to teach students what statistical significance tests really do and in applied research either prospectively to estimate what sample size…
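
    As a minimal, hypothetical illustration of such a "what if" exercise in R (one of the two environments the paper targets), the sketch below holds an observed standardized effect size fixed and recomputes the two-sample t-test p-value for several candidate sample sizes; the effect size and sample sizes are made up for illustration.

    ```r
    ## "What if" sketch: hold Cohen's d fixed and ask how the two-sample t-test
    ## p-value would change if the study had used a different sample size n per group.
    what_if <- function(d = 0.30, n = c(10, 25, 50, 100, 200)) {
      t_stat <- d * sqrt(n / 2)                      # t = d * sqrt(n/2) for equal groups
      p_val  <- 2 * pt(-abs(t_stat), df = 2 * n - 2) # two-sided p-value
      data.frame(n_per_group = n, t = round(t_stat, 2), p = signif(p_val, 3))
    }

    what_if()   # the same d = 0.30 is "non-significant" at n = 25 but "significant" at n = 100
    ```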

  5. Influence of CT contrast agent on dose calculation of intensity modulated radiation therapy plan for nasopharyngeal carcinoma.

    PubMed

    Lee, F K-H; Chan, C C-L; Law, C-K

    2009-02-01

    Contrast enhanced computed tomography (CECT) has been used for delineation of treatment target in radiotherapy. The different Hounsfield unit due to the injected contrast agent may affect radiation dose calculation. We investigated this effect on intensity modulated radiotherapy (IMRT) of nasopharyngeal carcinoma (NPC). Dose distributions of 15 IMRT plans were recalculated on CECT. Dose statistics for organs at risk (OAR) and treatment targets were recorded for the plain CT-calculated and CECT-calculated plans. Statistical significance of the differences was evaluated. Correlations were also tested, among magnitude of calculated dose difference, tumor size and level of enhancement contrast. Differences in nodal mean/median dose were statistically significant, but small (approximately 0.15 Gy for a 66 Gy prescription). In the vicinity of the carotid arteries, the difference in calculated dose was also statistically significant, but only with a mean of approximately 0.2 Gy. We did not observe any significant correlation between the difference in the calculated dose and the tumor size or level of enhancement. The results implied that the calculated dose difference was clinically insignificant and may be acceptable for IMRT planning.

  6. A multi-wave study of organizational justice at work and long-term sickness absence among employees with depressive symptoms.

    PubMed

    Hjarsbech, Pernille U; Christensen, Karl Bang; Bjorner, Jakob B; Madsen, Ida E H; Thorsen, Sannie V; Carneiro, Isabella G; Christensen, Ulla; Rugulies, Reiner

    2014-03-01

    Mental health problems are strong predictors of long-term sickness absence (LTSA). In this study, we investigated whether organizational justice at work - fairness in resolving conflicts and distributing work - prevents risk of LTSA among employees with depressive symptoms. In a longitudinal study with five waves of data collection, we examined a cohort of 1034 employees with depressive symptoms. Depressive symptoms and organizational justice were assessed by self-administered questionnaires and information on LTSA was derived from a national register. Using Poisson regression analyses, we calculated rate ratios (RR) for the prospective association of organizational justice and change in organizational justice with time to onset of LTSA. All analyses were sex stratified. Among men, intermediate levels of organizational justice were statistically significantly associated with a decreased risk of subsequent LTSA after adjustment for covariates [RR 0.49, 95% confidence interval (95% CI) 0.26-0.91]. There was also a decreased risk for men with high levels of organizational justice although these estimates did not reach statistical significance after adjustment (RR 0.47, 95% CI 0.20-1.10). We found no such results for women. In both sexes, neither favorable nor adverse changes in organizational justice were statistically significantly associated with the risk of LTSA. This study shows that organizational justice may have a protective effect on the risk of LTSA among men with depressive symptoms. A protective effect of favorable changes in organizational justice was not found.

  7. Chi-Square Statistics, Tests of Hypothesis and Technology.

    ERIC Educational Resources Information Center

    Rochowicz, John A.

    The use of technology such as computers and programmable calculators enables students to find p-values and conduct tests of hypotheses in many different ways. Comprehension and interpretation of a research problem become the focus for statistical analysis. This paper describes how to calculate chi-square statistics and p-values for statistical…
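
    A small R illustration of the kind of calculation the paper describes (the contingency table is hypothetical): the chi-square statistic is computed "by hand" from observed and expected counts, the p-value is read from the chi-square distribution, and the result is checked against the built-in test.

    ```r
    ## Hypothetical 2 x 3 contingency table; counts are for illustration only.
    obs <- matrix(c(30, 20, 25,
                    15, 35, 25), nrow = 2, byrow = TRUE)

    ## Chi-square statistic "by hand": sum((O - E)^2 / E), with E from the margins.
    expected <- outer(rowSums(obs), colSums(obs)) / sum(obs)
    chi_sq   <- sum((obs - expected)^2 / expected)
    df       <- (nrow(obs) - 1) * (ncol(obs) - 1)
    p_value  <- pchisq(chi_sq, df, lower.tail = FALSE)  # p-value from the chi-square distribution

    c(chi_sq = chi_sq, df = df, p = p_value)
    chisq.test(obs)   # built-in test returns the same statistic and p-value
    ```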

  8. Tips and Tricks for Successful Application of Statistical Methods to Biological Data.

    PubMed

    Schlenker, Evelyn

    2016-01-01

    This chapter discusses experimental design and use of statistics to describe characteristics of data (descriptive statistics) and inferential statistics that test the hypothesis posed by the investigator. Inferential statistics, based on probability distributions, depend upon the type and distribution of the data. For data that are continuous, randomly and independently selected, as well as normally distributed, more powerful parametric tests such as Student's t test and analysis of variance (ANOVA) can be used. For non-normally distributed or skewed data, transformation of the data (using logarithms) may normalize the data, allowing use of parametric tests. Alternatively, with skewed data nonparametric tests can be utilized, some of which rely on data that are ranked prior to statistical analysis. Experimental designs and analyses need to balance between committing type 1 errors (false positives) and type 2 errors (false negatives). For a variety of clinical studies that determine risk or benefit, relative risk ratios (randomized clinical trials and cohort studies) or odds ratios (case-control studies) are utilized. Although both use 2 × 2 tables, their premise and calculations differ. Finally, special statistical methods are applied to microarray and proteomics data, since the large number of genes or proteins evaluated increases the likelihood of false discoveries. Additional studies in separate samples are used to verify microarray and proteomic data. Examples in this chapter and references are available to help continued investigation of experimental designs and appropriate data analysis.
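
    As a brief illustration of the distinction drawn above between relative risk ratios and odds ratios, the following R snippet computes both from the same hypothetical 2 × 2 table; the counts are invented solely to show the arithmetic.

    ```r
    ## Hypothetical 2 x 2 table: exposed/unexposed by event/no event.
    exp_event <- 30; exp_noevent <- 70    # exposed group
    une_event <- 15; une_noevent <- 85    # unexposed group

    risk_exposed   <- exp_event / (exp_event + exp_noevent)
    risk_unexposed <- une_event / (une_event + une_noevent)
    relative_risk  <- risk_exposed / risk_unexposed                          # cohort / randomized-trial framing
    odds_ratio     <- (exp_event / exp_noevent) / (une_event / une_noevent)  # case-control framing

    c(RR = relative_risk, OR = odds_ratio)   # RR = 2.0, OR ~ 2.43: the OR exceeds the RR when the outcome is common
    ```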

  9. A Bayesian Method for Identifying Contaminated Detectors in Low-Level Alpha Spectrometers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maclellan, Jay A.; Strom, Daniel J.; Joyce, Kevin E.

    2011-11-02

    Analyses used for radiobioassay and other radiochemical tests are normally designed to meet specified quality objectives, such as relative bias, precision, and minimum detectable activity (MDA). In the case of radiobioassay analyses for alpha-emitting radionuclides, a major determiner of the process MDA is the instrument background. Alpha spectrometry detectors are often restricted to only a few counts over multi-day periods in order to meet required MDAs for nuclides such as plutonium-239 and americium-241. A detector background criterion is often set empirically based on experience, or frequentist or classical statistics are applied to the calculated background count necessary to meet a required MDA. An acceptance criterion for the detector background is set at the multiple of the estimated background standard deviation above the assumed mean that provides an acceptably small probability of observation if the mean and standard deviation estimate are correct. The major problem with this method is that the observed background counts used to estimate the mean, and thereby the standard deviation when a Poisson distribution is assumed, are often in the range of zero to three counts. At those expected count levels it is impossible to obtain a good estimate of the true mean from a single measurement. As an alternative, Bayesian statistical methods allow calculation of the expected detector background count distribution based on historical counts from new, uncontaminated detectors. This distribution can then be used to identify detectors showing an increased probability of contamination. The effect of varying the assumed range of background counts (i.e., the prior probability distribution) from new, uncontaminated detectors is discussed.
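
    One plausible way to sketch the general idea (an illustration, not the authors' exact procedure) is to fit a gamma distribution to historical background counts from clean detectors, so that the predictive distribution of a new count is negative binomial, and to flag a detector whose observed count has a small predictive tail probability. The moment-matched prior, the counts, and the threshold below are all assumptions.

    ```r
    ## Sketch: gamma distribution fitted to historical background counts from new,
    ## uncontaminated detectors; a detector whose observed count has a small
    ## predictive tail probability is flagged for contamination review.
    flag_detector <- function(observed, historical, threshold = 0.01) {
      # Crude moment-matched gamma for the mean background count per counting period.
      m <- mean(historical); v <- max(var(historical), 1e-6)
      shape <- m^2 / v; rate <- m / v
      # Predictive distribution of a new count is negative binomial (gamma-Poisson mixture).
      tail_prob <- pnbinom(observed - 1, size = shape, prob = rate / (rate + 1),
                           lower.tail = FALSE)      # P(count >= observed)
      list(tail_probability = tail_prob, flagged = tail_prob < threshold)
    }

    historical <- c(0, 1, 0, 2, 1, 0, 3, 1, 0, 2)   # counts from clean detectors (illustrative)
    flag_detector(observed = 8, historical = historical)
    ```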

  10. Effect of Different Ceramic Crown Preparations on Tooth Structure Loss: An In Vitro Study

    NASA Astrophysics Data System (ADS)

    Ebrahimpour, Ashkan

    Objective: To quantify and compare the amount of tooth-structure reduction following the full-coverage preparations for crown materials of porcelain-fused-to-metal, lithium disilicate glass-ceramic and yttria-stabilized tetragonal zirconia polycrystalline for three tooth morphologies. Methods: Groups of resin teeth of different morphologies were individually weighed to high precision, then prepared following the preparation guidelines. The teeth were re-weighed after preparation and the amount of structural reduction was calculated. Statistical analyses were performed to find out if there was a significant difference among the groups. Results: Amount of tooth reduction for zirconia crown preparations was the lowest and statistically different compared with the other two materials. No statistical significance was found between the amount of reduction for porcelain-fused-to-metal and lithium disilicate glass-ceramic crowns. Conclusion: Within the limitations of this study, more tooth structure can be saved when utilizing zirconia full-coverage restorations compared with lithium disilicate glass-ceramic and porcelain-fused-to-metal crowns in maxillary central incisors, first premolars and first molars.

  11. The effectiveness and cost-effectiveness of intraoperative imaging in high-grade glioma resection; a comparative review of intraoperative ALA, fluorescein, ultrasound and MRI.

    PubMed

    Eljamel, M Sam; Mahboob, Syed Osama

    2016-12-01

    Surgical resection of high-grade gliomas (HGG) is standard therapy because it imparts significant progression-free survival (PFS) and overall survival (OS) benefits. However, HGG-tumor margins are indistinguishable from normal brain during surgery. Hence, intraoperative technology such as fluorescence (ALA, fluorescein) and intraoperative ultrasound (IoUS) and MRI (IoMRI) has been deployed. This study compares the effectiveness and cost-effectiveness of these technologies. Critical literature review and meta-analyses, using the MEDLINE/PubMed service. The list of references in each article was double-checked for any missing references. We included all studies that reported the use of ALA, fluorescein (FLCN), IoUS or IoMRI to guide HGG-surgery. The meta-analyses were conducted according to statistical heterogeneity between studies. If there was no heterogeneity, a fixed-effects model was used; otherwise, a random-effects model was used. Statistical heterogeneity was explored by χ² and inconsistency (I²) statistics. To assess cost-effectiveness, we calculated the incremental cost per quality-adjusted life-year (QALY). Gross total resection (GTR) after ALA, FLCN, IoUS and IoMRI was 69.1%, 84.4%, 73.4% and 70%, respectively. The differences were not statistically significant. All four techniques led to significant prolongation of PFS and tended to prolong OS. However, none of these technologies led to significant prolongation of OS compared to controls. The cost/QALY was $16,218, $3181, $6049 and $32,954 for ALA, FLCN, IoUS and IoMRI, respectively. ALA, FLCN, IoUS and IoMRI significantly improve GTR and PFS of HGG. Their incremental cost was below the threshold for cost-effectiveness of HGG-therapy, denoting that each intraoperative technology was cost-effective on its own. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. Confidence intervals for the between-study variance in random-effects meta-analysis using generalised heterogeneity statistics: should we use unequal tails?

    PubMed

    Jackson, Dan; Bowden, Jack

    2016-09-07

    Confidence intervals for the between study variance are useful in random-effects meta-analyses because they quantify the uncertainty in the corresponding point estimates. Methods for calculating these confidence intervals have been developed that are based on inverting hypothesis tests using generalised heterogeneity statistics. Whilst, under the random effects model, these new methods furnish confidence intervals with the correct coverage, the resulting intervals are usually very wide, making them uninformative. We discuss a simple strategy for obtaining 95 % confidence intervals for the between-study variance with a markedly reduced width, whilst retaining the nominal coverage probability. Specifically, we consider the possibility of using methods based on generalised heterogeneity statistics with unequal tail probabilities, where the tail probability used to compute the upper bound is greater than 2.5 %. This idea is assessed using four real examples and a variety of simulation studies. Supporting analytical results are also obtained. Our results provide evidence that using unequal tail probabilities can result in shorter 95 % confidence intervals for the between-study variance. We also show some further results for a real example that illustrates how shorter confidence intervals for the between-study variance can be useful when performing sensitivity analyses for the average effect, which is usually the parameter of primary interest. We conclude that using unequal tail probabilities when computing 95 % confidence intervals for the between-study variance, when using methods based on generalised heterogeneity statistics, can result in shorter confidence intervals. We suggest that those who find the case for using unequal tail probabilities convincing should use the '1-4 % split', where greater tail probability is allocated to the upper confidence bound. The 'width-optimal' interval that we present deserves further investigation.

  13. Visual field progression in glaucoma: estimating the overall significance of deterioration with permutation analyses of pointwise linear regression (PoPLR).

    PubMed

    O'Leary, Neil; Chauhan, Balwantray C; Artes, Paul H

    2012-10-01

    To establish a method for estimating the overall statistical significance of visual field deterioration from an individual patient's data, and to compare its performance with that of pointwise linear regression. The Truncated Product Method was used to calculate a statistic S that combines evidence of deterioration from individual test locations in the visual field. The overall statistical significance (P value) of visual field deterioration was inferred by comparing S with its permutation distribution, derived from repeated reordering of the visual field series. Permutation of pointwise linear regression (PoPLR) and pointwise linear regression were evaluated in data from patients with glaucoma (944 eyes, median mean deviation -2.9 dB, interquartile range: -6.3, -1.2 dB) followed for more than 4 years (median 10 examinations over 8 years). False-positive rates were estimated from randomly reordered series of this dataset, and hit rates (proportion of eyes with significant deterioration) were estimated from the original series. The false-positive rates of PoPLR were indistinguishable from the corresponding nominal significance levels and were independent of baseline visual field damage and length of follow-up. At P < 0.05, the hit rates of PoPLR were 12, 29, and 42%, at the fifth, eighth, and final examinations, respectively, and at matching specificities they were consistently higher than those of pointwise linear regression. In contrast to population-based progression analyses, PoPLR provides a continuous estimate of statistical significance for visual field deterioration individualized to a particular patient's data. This allows close control over specificity, essential for monitoring patients in clinical practice and in clinical trials.
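
    A compact R sketch of the idea, not the authors' implementation: one-sided pointwise regression p-values are combined with the Truncated Product Method into a statistic S, and the overall P value is obtained by comparing S with its distribution under random reordering of the series. The simulated visual-field series, truncation threshold, and number of permutations are illustrative choices.

    ```r
    ## Sketch of the PoPLR idea: combine pointwise one-sided regression p-values
    ## with the Truncated Product Method and judge S against its permutation distribution.
    truncated_product_S <- function(sens, times, tau = 0.05) {
      p <- apply(sens, 2, function(y) {
        fit <- summary(lm(y ~ times))$coefficients
        pt(fit["times", "t value"], df = length(y) - 2)   # one-sided p-value for a negative slope
      })
      -sum(log(p[p <= tau]))                 # S: evidence combined over locations
    }

    poplr <- function(sens, times, n_perm = 500, tau = 0.05) {
      s_obs  <- truncated_product_S(sens, times, tau)
      s_perm <- replicate(n_perm,
                          truncated_product_S(sens[sample(nrow(sens)), ], times, tau))
      mean(s_perm >= s_obs)                  # overall P value; small values indicate deterioration
    }

    ## Illustrative series: 10 visits x 52 locations, one location deteriorating.
    set.seed(1)
    times <- 0:9
    sens  <- matrix(rnorm(10 * 52, mean = 28, sd = 1.5), nrow = 10)
    sens[, 1] <- sens[, 1] - 0.8 * times     # simulated local loss of about 0.8 dB per visit
    poplr(sens, times)
    ```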

  14. A Regression Framework for Effect Size Assessments in Longitudinal Modeling of Group Differences

    PubMed Central

    Feingold, Alan

    2013-01-01

    The use of growth modeling analysis (GMA)--particularly multilevel analysis and latent growth modeling--to test the significance of intervention effects has increased exponentially in prevention science, clinical psychology, and psychiatry over the past 15 years. Model-based effect sizes for differences in means between two independent groups in GMA can be expressed in the same metric (Cohen’s d) commonly used in classical analysis and meta-analysis. This article first reviews conceptual issues regarding calculation of d for findings from GMA and then introduces an integrative framework for effect size assessments that subsumes GMA. The new approach uses the structure of the linear regression model, from which effect sizes for findings from diverse cross-sectional and longitudinal analyses can be calculated with familiar statistics, such as the regression coefficient, the standard deviation of the dependent measure, and study duration. PMID:23956615
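
    A worked numerical example of the kind of calculation the framework describes, using the quantities named in the abstract (a growth-model group-by-time coefficient, the study duration, and the standard deviation of the dependent measure); the numbers are invented, and the simple product form is assumed for a linear-growth contrast.

    ```r
    ## Illustrative numbers only: a group-by-time interaction coefficient from a
    ## growth model, the study duration, and the raw SD of the outcome.
    b_interaction <- -0.45   # estimated difference in slopes (outcome units per month)
    duration      <- 12      # months of follow-up
    sd_raw        <- 6.8     # SD of the dependent measure

    d <- (b_interaction * duration) / sd_raw  # model-based Cohen's d for the end-of-study group difference
    d                                         # about -0.79 in this illustration
    ```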

  15. Assessing exclusionary power of a paternity test involving a pair of alleged grandparents.

    PubMed

    Scarpetta, Marco A; Staub, Rick W; Einum, David D

    2007-02-01

    The power of a genetic test battery to exclude a pair of individuals as grandparents is an important consideration for parentage testing laboratories. However, a reliable method to calculate such a statistic with short-tandem repeat (STR) genetic markers has not been presented. Two formulae describing the random grandparents not excluded (RGPNE) statistic at a single genetic locus were derived: RGPNE = a(4 - 6a + 4a² - a³) when the paternal obligate allele (POA) is defined, and RGPNE = 2[(a + b)(2 - a - b)][1 - (a + b)(2 - a - b)] + [(a + b)(2 - a - b)]² when the POA is ambiguous. A minimum number of genetic markers required to yield cumulative RGPNE values of not greater than 0.01 was calculated with weighted average allele frequencies of the CODIS STR loci. RGPNE data for actual grandparentage cases are also presented to empirically examine the exclusionary power of routine casework. A comparison of RGPNE and random man not excluded (RMNE) values demonstrates the increased difficulty involved in excluding two individuals as grandparents compared to excluding a single alleged parent. A minimum of 12 STR markers is necessary to achieve RGPNE values of not greater than 0.01 when the mother is tested; more than 25 markers are required without the mother. Cumulative RGPNE values for each of 22 nonexclusionary grandparentage cases were not more than 0.01 but were significantly weaker when calculated without data from the mother. Calculation of the RGPNE provides a simple means to help minimize the potential of false inclusions in grandparentage analyses. This study also underscores the importance of testing the mother when examining the parents of an unavailable alleged father (AF).
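
    Both single-locus formulae are complement probabilities: the first equals 1 - (1 - a)⁴ and the second equals 1 - (1 - x)² with x = (a + b)(2 - a - b). A direct R implementation is sketched below; the allele frequencies in the example are illustrative, not the CODIS weighted averages used in the study.

    ```r
    ## Direct implementation of the two single-locus RGPNE formulae quoted above.
    rgpne_defined <- function(a) {            # POA defined; a = frequency of the obligate allele
      a * (4 - 6 * a + 4 * a^2 - a^3)         # equivalently 1 - (1 - a)^4
    }
    rgpne_ambiguous <- function(a, b) {       # POA ambiguous between alleles with frequencies a and b
      x <- (a + b) * (2 - a - b)
      2 * x * (1 - x) + x^2                   # equivalently 1 - (1 - x)^2
    }

    ## Cumulative RGPNE over a battery of independent loci (frequencies are illustrative).
    freqs <- c(0.12, 0.08, 0.20, 0.15, 0.10, 0.18, 0.09, 0.11, 0.14, 0.22, 0.13, 0.16)
    prod(sapply(freqs, rgpne_defined))        # probability that a random couple is not excluded at any locus
    ```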

  16. Geochemistry of some rare earth elements in groundwater, Vierlingsbeek, The Netherlands.

    PubMed

    Janssen, René P T; Verweij, Wilko

    2003-03-01

    Groundwater samples were taken from seven bore holes at depths ranging from 2 to 41 m near the drinking water pumping station Vierlingsbeek, The Netherlands, and analysed for Y, La, Ce, Pr, Nd, Sm and Eu. Shale-normalized patterns were generally flat and showed that the observed rare earth elements (REE) were probably of natural origin. In the shallow groundwaters the REEs were light REE (LREE) enriched, probably caused by binding of LREEs to colloids. To improve understanding of the behaviour of the REE, two approaches were used: calculations of the speciation and a statistical approach. For the speciation calculations, complexation and precipitation reactions, including inorganic and dissolved organic carbon (DOC) compounds, were taken into account. The REE speciation showed REE³⁺, REE(SO₄)⁺, REE(CO₃)⁺ and REE(DOC) being the major species. Dissolution of pure REE precipitates and REE-enriched solid phases did not account for the observed REEs in groundwater. Regulation of REE concentrations by adsorption-desorption processes to Fe(III)(OH)₃ and Al(OH)₃ minerals, which were calculated to be present in nearly all groundwaters, is a probable explanation. The statistical approach (multiple linear regression) showed that pH is by far the most significant groundwater characteristic which contributes to the variation in REE concentrations. Also DOC, SO₄, Fe and Al contributed significantly, although to a much lesser extent, to the variation in REE concentrations. This is in line with the calculated REE species in solution and REE adsorption to iron and aluminium (hydr)oxides. Regression equations including only pH were derived to predict REE concentrations in groundwater. External validation showed that these regression equations were reasonably successful in predicting REE concentrations of groundwater at another drinking water pumping station in a quite different region of The Netherlands.

  17. Bone marrow edema pattern in advanced hip osteoarthritis: quantitative assessment with magnetic resonance imaging and correlation with clinical examination, radiographic findings, and histopathology.

    PubMed

    Taljanovic, Mihra S; Graham, Anna R; Benjamin, James B; Gmitro, Arthur F; Krupinski, Elizabeth A; Schwartz, Stephanie A; Hunter, Tim B; Resnick, Donald L

    2008-05-01

    To correlate the amount of bone marrow edema (BME) calculated by magnetic resonance imaging (MRI) with clinical findings, histopathology, and radiographic findings, in patients with advanced hip osteoarthritis (OA). The study was approved by The Institutional Human Subject Protection Committee. Coronal MRI of hips was acquired in 19 patients who underwent hip replacement. A spin echo (SE) sequence with four echoes and separate fast spin echo (FSE) proton density (PD)-weighted SE sequences of fat (F) and water (W) were acquired with water and fat suppression, respectively. T2 and water:fat ratio calculations were made for the outlined regions of interest. The calculated MRI values were correlated with the clinical, radiographic, and histopathologic findings. Analyses of variance were done on the MRI data for W/(W + F) and for T2 values (total and focal values) for the symptomatic and contralateral hips. The values were significantly higher in the study group. Statistically significant correlations were found between pain and total W/(W + F), pain and focal T2 values, and the number of microfractures and calculated BME for the focal W/(W + F) in the proximal femora. Statistically significant correlations were found between the radiographic findings and MRI values for total W/(W + F), focal W/(W + F) and focal T2 and among the radiographic findings, pain, and hip movement. On histopathology, only a small amount of BME was seen in eight proximal femora.

  18. Wildfire cluster detection using space-time scan statistics

    NASA Astrophysics Data System (ADS)

    Tonini, M.; Tuia, D.; Ratle, F.; Kanevski, M.

    2009-04-01

    The aim of the present study is to identify spatio-temporal clusters of fire sequences using space-time scan statistics. These statistical methods are specifically designed to detect clusters and assess their significance. Basically, scan statistics work by comparing a set of events occurring inside a scanning window (or a space-time cylinder for spatio-temporal data) with those that lie outside. Windows of increasing size scan the zone across space and time: the likelihood ratio is calculated for each window (comparing the ratio "observed cases over expected" inside and outside); the window with the maximum value is assumed to be the most probable cluster, and so on. Under the null hypothesis of spatial and temporal randomness, these events are distributed according to a known discrete-state random process (Poisson or Bernoulli), whose parameters can be estimated. Given this assumption, it is possible to test whether or not the null hypothesis holds in a specific area. In order to deal with fire data, the space-time permutation scan statistic has been applied since it does not require the explicit specification of the population at risk in each cylinder. The case study is represented by Florida daily fire detection using the Moderate Resolution Imaging Spectroradiometer (MODIS) active fire product during the period 2003-2006. As a result, statistically significant clusters have been identified. Performing the analyses over the entire time frame, three out of the five most likely clusters have been identified in the forest areas in the north of the country; the other two clusters cover a large zone in the south, corresponding to agricultural land and the prairies in the Everglades. Furthermore, the analyses have been performed separately for the four years to assess whether the wildfires recur each year during the same period. It emerges that clusters of forest fires are more frequent in hot seasons (spring and summer), while in the southern areas they are widely present throughout the whole year. Analysing the distribution of fires to evaluate whether they are statistically more frequent in certain areas and/or periods of the year can be useful to support fire management and to focus prevention measures.

  19. Physics-based statistical model and simulation method of RF propagation in urban environments

    DOEpatents

    Pao, Hsueh-Yuan; Dvorak, Steven L.

    2010-09-14

    A physics-based statistical model and simulation/modeling method and system of electromagnetic wave propagation (wireless communication) in urban environments. In particular, the model is a computationally efficient close-formed parametric model of RF propagation in an urban environment which is extracted from a physics-based statistical wireless channel simulation method and system. The simulation divides the complex urban environment into a network of interconnected urban canyon waveguides which can be analyzed individually; calculates spectral coefficients of modal fields in the waveguides excited by the propagation using a database of statistical impedance boundary conditions which incorporates the complexity of building walls in the propagation model; determines statistical parameters of the calculated modal fields; and determines a parametric propagation model based on the statistical parameters of the calculated modal fields from which predictions of communications capability may be made.

  20. Lindemann histograms as a new method to analyse nano-patterns and phases

    NASA Astrophysics Data System (ADS)

    Makey, Ghaith; Ilday, Serim; Tokel, Onur; Ibrahim, Muhamet; Yavuz, Ozgun; Pavlov, Ihor; Gulseren, Oguz; Ilday, Omer

    The detection, observation, and analysis of material phases and atomistic patterns are of great importance for understanding systems exhibiting both equilibrium and far-from-equilibrium dynamics. As such, there is intense research on phase transitions and pattern dynamics in soft matter, statistical and nonlinear physics, and polymer physics. In order to identify phases and nano-patterns, the pair correlation function is commonly used. However, this approach is limited in terms of recognizing competing patterns in dynamic systems, and lacks visualisation capabilities. To overcome these limitations, we introduce Lindemann histogram quantification as an alternative method to analyse solid, liquid, and gas phases, along with hexagonal, square, and amorphous nano-pattern symmetries. We show that the proposed approach, based on a Lindemann parameter calculated per particle, maps local number densities to material phases or particle patterns. We apply the Lindemann histogram method to dynamical colloidal self-assembly experimental data and identify competing patterns.

  1. POWER ANALYSIS FOR COMPLEX MEDIATIONAL DESIGNS USING MONTE CARLO METHODS

    PubMed Central

    Thoemmes, Felix; MacKinnon, David P.; Reiser, Mark R.

    2013-01-01

    Applied researchers often include mediation effects in applications of advanced methods such as latent variable models and linear growth curve models. Guidance on how to estimate statistical power to detect mediation for these models has not yet been addressed in the literature. We describe a general framework for power analyses for complex mediational models. The approach is based on the well known technique of generating a large number of samples in a Monte Carlo study, and estimating power as the percentage of cases in which an estimate of interest is significantly different from zero. Examples of power calculation for commonly used mediational models are provided. Power analyses for the single mediator, multiple mediators, three-path mediation, mediation with latent variables, moderated mediation, and mediation in longitudinal designs are described. Annotated sample syntax for Mplus is appended and tabled values of required sample sizes are shown for some models. PMID:23935262
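
    A minimal sketch of the Monte Carlo logic for the simplest case, the single-mediator model, using plain regressions and the joint-significance rule rather than the article's Mplus syntax; the path coefficients, sample size, and number of replications are illustrative assumptions.

    ```r
    ## Minimal Monte Carlo power sketch for a single-mediator model (X -> M -> Y),
    ## using ordinary regressions and the joint-significance test of paths a and b.
    power_mediation <- function(n, a = 0.30, b = 0.30, cp = 0.10,
                                n_rep = 2000, alpha = 0.05) {
      sig <- replicate(n_rep, {
        x <- rnorm(n)
        m <- a * x + rnorm(n)
        y <- b * m + cp * x + rnorm(n)
        p_a <- summary(lm(m ~ x))$coefficients["x", "Pr(>|t|)"]
        p_b <- summary(lm(y ~ m + x))$coefficients["m", "Pr(>|t|)"]
        (p_a < alpha) && (p_b < alpha)    # mediated effect "detected" if both paths are significant
      })
      mean(sig)                           # estimated power = proportion of significant replications
    }

    set.seed(42)
    power_mediation(n = 100)   # roughly 0.7-0.8 for these small-to-medium paths
    ```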

  2. Analysis of complex environment effect on near-field emission

    NASA Astrophysics Data System (ADS)

    Ravelo, B.; Lalléchère, S.; Bonnet, P.; Paladian, F.

    2014-10-01

    The article deals with uncertainty analyses of radiofrequency circuit electromagnetic compatibility emission, based on the near-field/near-field (NF/NF) transform combined with a stochastic approach. By using 2D data corresponding to the electromagnetic (EM) field (X = E or H) scanned in the observation plane placed at the position z0 above the circuit under test (CUT), the X field map was extracted. Then, uncertainty analyses were assessed via the statistical moments of the X component. In addition, a stochastic collocation approach was considered and calculations were applied to the planar EM NF radiated by CUTs such as a Wilkinson power divider and a microstrip line operating at GHz frequencies. After Matlab implementation, the mean and standard deviation were assessed. The present study illustrates how the variations of environmental parameters may impact EM fields. The NF uncertainty methodology can be applied to the effects of any physical parameter in a complex environment and is useful for printed circuit board (PCB) design guidelines.

  3. Number needed to treat (NNT) in clinical literature: an appraisal.

    PubMed

    Mendes, Diogo; Alves, Carlos; Batel-Marques, Francisco

    2017-06-01

    The number needed to treat (NNT) is an absolute effect measure that has been used to assess beneficial and harmful effects of medical interventions. Several methods can be used to calculate NNTs, and they should be applied depending on the different study characteristics, such as the design and type of variable used to measure outcomes. Whether or not the most recommended methods have been applied to calculate NNTs in studies published in the medical literature is yet to be determined. The aim of this study is to assess whether the methods used to calculate NNTs in studies published in medical journals are in line with basic methodological recommendations. The top 25 high-impact factor journals in the "General and/or Internal Medicine" category were screened to identify studies assessing pharmacological interventions and reporting NNTs. Studies were categorized according to their design and the type of variables. NNTs were assessed for completeness (baseline risk, time horizon, and confidence intervals [CIs]). The methods used for calculating NNTs in selected studies were compared to basic methodological recommendations published in the literature. Data were analyzed using descriptive statistics. The search returned 138 citations, of which 51 were selected. Most were meta-analyses (n = 23, 45.1%), followed by clinical trials (n = 17, 33.3%), cohort (n = 9, 17.6%), and case-control studies (n = 2, 3.9%). Binary variables were more common (n = 41, 80.4%) than time-to-event (n = 10, 19.6%) outcomes. Twenty-six studies (51.0%) reported only NNT to benefit (NNTB), 14 (27.5%) reported both NNTB and NNT to harm (NNTH), and 11 (21.6%) reported only NNTH. Baseline risk (n = 37, 72.5%), time horizon (n = 38, 74.5%), and CI (n = 32, 62.7%) for NNTs were not always reported. Basic methodological recommendations to calculate NNTs were not followed in 15 studies (29.4%). The proportion of studies applying non-recommended methods was particularly high for meta-analyses (n = 13, 56.5%). A considerable proportion of studies, particularly meta-analyses, applied methods that are not in line with basic methodological recommendations. Despite their usefulness in assisting clinical decisions, NNTs are uninterpretable if incompletely reported, and they may be misleading if calculating methods are inadequate to study designs and variables under evaluation. Further research is needed to confirm the present findings.
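
    For the simplest setting discussed above, binary outcomes with a risk difference, the basic calculation is NNT = 1/ARR. The R sketch below shows that calculation together with a naive Wald interval; the event counts are invented, and, as the appraisal stresses, time-to-event outcomes and other designs require different methods.

    ```r
    ## Basic NNT for binary outcomes: NNT = 1 / absolute risk reduction (ARR),
    ## reported with the baseline risk and a simple Wald confidence interval.
    nnt_binary <- function(events_ctrl, n_ctrl, events_trt, n_trt, conf = 0.95) {
      p_c <- events_ctrl / n_ctrl
      p_t <- events_trt / n_trt
      arr <- p_c - p_t                                     # absolute risk reduction
      se  <- sqrt(p_c * (1 - p_c) / n_ctrl + p_t * (1 - p_t) / n_trt)
      z   <- qnorm(1 - (1 - conf) / 2)
      ci_arr <- arr + c(-1, 1) * z * se
      list(baseline_risk = p_c, ARR = arr, NNT = 1 / arr,
           NNT_CI = sort(1 / ci_arr))   # inverted limits; undefined if the ARR interval crosses zero
    }

    nnt_binary(events_ctrl = 60, n_ctrl = 300, events_trt = 40, n_trt = 300)  # NNT = 15
    ```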

  4. Patient safety: numerical skills and drug calculation abilities of nursing students and registered nurses.

    PubMed

    McMullan, Miriam; Jones, Ray; Lea, Susan

    2010-04-01

    This paper is a report of a correlational study of the relations of age, status, experience and drug calculation ability to numerical ability of nursing students and Registered Nurses. Competent numerical and drug calculation skills are essential for nurses as mistakes can put patients' lives at risk. A cross-sectional study was carried out in 2006 in one United Kingdom university. Validated numerical and drug calculation tests were given to 229 second year nursing students and 44 Registered Nurses attending a non-medical prescribing programme. The numeracy test was failed by 55% of students and 45% of Registered Nurses, while 92% of students and 89% of nurses failed the drug calculation test. Independent of status or experience, older participants (> or = 35 years) were statistically significantly more able to perform numerical calculations. There was no statistically significant difference between nursing students and Registered Nurses in their overall drug calculation ability, but nurses were statistically significantly more able than students to perform basic numerical calculations and calculations for solids, oral liquids and injections. Both nursing students and Registered Nurses were statistically significantly more able to perform calculations for solids, liquid oral and injections than calculations for drug percentages, drip and infusion rates. To prevent deskilling, Registered Nurses should continue to practise and refresh all the different types of drug calculations as often as possible with regular (self)-testing of their ability. Time should be set aside in curricula for nursing students to learn how to perform basic numerical and drug calculations. This learning should be reinforced through regular practice and assessment.

  5. PHI in the Early Detection of Prostate Cancer.

    PubMed

    Fuchsova, Radka; Topolcan, Ondrej; Windrichova, Jindra; Hora, Milan; Dolejsova, Olga; Pecen, Ladislav; Kasik, Petr; Novak, Jaroslav; Casova, Miroslava; Smejkal, Jiri

    2015-09-01

    To evaluate changes in the serum levels of prostate specific antigen (PSA), %free PSA and -2proPSA biomarkers, and prostate health index (PHI) in the diagnostic algorithm of early prostate cancer. The Immunoanalytical Laboratory of the University Hospital in Pilsen examined sera from 263 patients being treated at the Hospital's Urology Department with suspected prostate cancer who had undergone biopsies and were divided into a benign and malignant group. The monitored biomarkers were measured using chemiluminescence. All statistical analyses were calculated using the SAS software. We found statistically significantly increased levels of -2proPSA, PHI and PSA and decreased levels of %freePSA in patients diagnosed with prostate cancer by prostate biopsy vs. patients with benign prostatic hypertrophy (median values: -2proPSA: 16 vs. 21 ng/l, PHI: 35 vs. 62, total PSA: 7.2 vs. 7.7 μg/l and %free PSA: 16.7 vs. 11.7%). Receiver operating characteristic curves showed the best performance for PHI compared to other markers. The assessment of -2proPSA and the calculation of PHI appear to be of great benefit for a more accurate differential diagnosis of benign hyperplasia and prostate cancer. Copyright© 2015 International Institute of Anticancer Research (Dr. John G. Delinassios), All rights reserved.
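
    The abstract does not restate how PHI is computed; the commonly published definition is PHI = ([-2]proPSA / free PSA) × √(total PSA), with [-2]proPSA in pg/ml and free/total PSA in ng/ml. Assuming that definition, the sketch below reproduces a PHI near 35 from one group's reported medians (-2proPSA 16 ng/l, %free PSA 16.7% of a total PSA of 7.2 µg/l), which is consistent with the values in the abstract.

    ```r
    ## Assumed definition (not restated in the abstract):
    ## PHI = ([-2]proPSA / free PSA) * sqrt(total PSA),
    ## with [-2]proPSA in pg/ml and free/total PSA in ng/ml.
    phi_index <- function(p2psa_pg_ml, fpsa_ng_ml, tpsa_ng_ml) {
      (p2psa_pg_ml / fpsa_ng_ml) * sqrt(tpsa_ng_ml)
    }

    ## One group's reported medians: -2proPSA 16 ng/l (= 16 pg/ml),
    ## total PSA 7.2 ug/l (= 7.2 ng/ml), %free PSA 16.7% -> free PSA about 1.2 ng/ml.
    phi_index(p2psa_pg_ml = 16, fpsa_ng_ml = 0.167 * 7.2, tpsa_ng_ml = 7.2)   # ~ 36, near the reported PHI of 35
    ```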

  6. BaCoCa--a heuristic software tool for the parallel assessment of sequence biases in hundreds of gene and taxon partitions.

    PubMed

    Kück, Patrick; Struck, Torsten H

    2014-01-01

    BaCoCa (BAse COmposition CAlculator) is a user-friendly software that combines multiple statistical approaches (like RCFV and C value calculations) to identify biases in aligned sequence data which potentially mislead phylogenetic reconstructions. As a result of its speed and flexibility, the program provides the possibility to analyze hundreds of pre-defined gene partitions and taxon subsets in one single process run. BaCoCa is command-line driven and can be easily integrated into automatic process pipelines of phylogenomic studies. Moreover, given the tab-delimited output style the results can be easily used for further analyses in programs like Excel or statistical packages like R. A built-in option of BaCoCa is the generation of heat maps with hierarchical clustering of certain results using R. As input files BaCoCa can handle FASTA and relaxed PHYLIP, which are commonly used in phylogenomic pipelines. BaCoCa is implemented in Perl and works on Windows PCs, Macs and Linux operating systems. The executable source code as well as example test files and a detailed documentation of BaCoCa are freely available at http://software.zfmk.de. Copyright © 2013 Elsevier Inc. All rights reserved.

  7. OPR-PPR, a Computer Program for Assessing Data Importance to Model Predictions Using Linear Statistics

    USGS Publications Warehouse

    Tonkin, Matthew J.; Tiedeman, Claire; Ely, D. Matthew; Hill, Mary C.

    2007-01-01

    The OPR-PPR program calculates the Observation-Prediction (OPR) and Parameter-Prediction (PPR) statistics that can be used to evaluate the relative importance of various kinds of data to simulated predictions. The data considered fall into three categories: (1) existing observations, (2) potential observations, and (3) potential information about parameters. The first two are addressed by the OPR statistic; the third is addressed by the PPR statistic. The statistics are based on linear theory and measure the leverage of the data, which depends on the location, the type, and possibly the time of the data being considered. For example, in a ground-water system the type of data might be a head measurement at a particular location and time. As a measure of leverage, the statistics do not take into account the value of the measurement. As linear measures, the OPR and PPR statistics require minimal computational effort once sensitivities have been calculated. Sensitivities need to be calculated for only one set of parameter values; commonly these are the values estimated through model calibration. OPR-PPR can calculate the OPR and PPR statistics for any mathematical model that produces the necessary OPR-PPR input files. In this report, OPR-PPR capabilities are presented in the context of using the ground-water model MODFLOW-2000 and the universal inverse program UCODE_2005. The method used to calculate the OPR and PPR statistics is based on the linear equation for prediction standard deviation. Using sensitivities and other information, OPR-PPR calculates (a) the percent increase in the prediction standard deviation that results when one or more existing observations are omitted from the calibration data set; (b) the percent decrease in the prediction standard deviation that results when one or more potential observations are added to the calibration data set; or (c) the percent decrease in the prediction standard deviation that results when potential information on one or more parameters is added.

  8. Quantifying the Diversity and Similarity of Surgical Procedures Among Hospitals and Anesthesia Providers.

    PubMed

    Dexter, Franklin; Ledolter, Johannes; Hindman, Bradley J

    2016-01-01

    In this Statistical Grand Rounds, we review methods for the analysis of the diversity of procedures among hospitals, the activities among anesthesia providers, etc. We apply multiple methods and consider their relative reliability and usefulness for perioperative applications, including calculations of SEs. We also review methods for comparing the similarity of procedures among hospitals, activities among anesthesia providers, etc. We again apply multiple methods and consider their relative reliability and usefulness for perioperative applications. The applications include strategic analyses (e.g., hospital marketing) and human resource analytics (e.g., comparisons among providers). Measures of diversity of procedures and activities (e.g., Herfindahl and Gini-Simpson index) are used for quantification of each facility (hospital) or anesthesia provider, one at a time. Diversity can be thought of as a summary measure. Thus, if the diversity of procedures for 48 hospitals is studied, the diversity (and its SE) is being calculated for each hospital. Likewise, the effective numbers of common procedures at each hospital can be calculated (e.g., by using the exponential of the Shannon index). Measures of similarity are pairwise assessments. Thus, if quantifying the similarity of procedures among cases with a break or handoff versus cases without a break or handoff, a similarity index represents a correlation coefficient. There are several different measures of similarity, and we compare their features and applicability for perioperative data. We rely extensively on sensitivity analyses to interpret observed values of the similarity index.
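
    A small R sketch of the per-entity diversity summaries named above (the Herfindahl index, the Gini-Simpson index, and the effective number of common procedures via the exponential of the Shannon index), computed from one hospital's procedure case counts; the counts are invented and no standard errors are shown.

    ```r
    ## Diversity of one hospital's (or provider's) case mix from procedure counts.
    case_counts <- c(120, 80, 60, 40, 25, 15, 10, 5, 3, 2)   # illustrative counts per procedure
    p <- case_counts / sum(case_counts)                      # procedure proportions

    herfindahl   <- sum(p^2)            # probability two randomly chosen cases share a procedure
    gini_simpson <- 1 - herfindahl      # probability they differ
    shannon      <- -sum(p * log(p))
    effective_n  <- exp(shannon)        # effective number of common procedures

    c(Herfindahl = herfindahl, GiniSimpson = gini_simpson, EffectiveN = effective_n)
    ```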

  9. A DISCUSSION ON DIFFERENT APPROACHES FOR ASSESSING LIFETIME RISKS OF RADON-INDUCED LUNG CANCER.

    PubMed

    Chen, Jing; Murith, Christophe; Palacios, Martha; Wang, Chunhong; Liu, Senlin

    2017-11-01

    Lifetime risks of radon induced lung cancer were assessed based on epidemiological approaches for Canadian, Swiss and Chinese populations, using the most recent vital statistic data and radon distribution characteristics available for each country. In the risk calculation, the North America residential radon risk model was used for the Canadian population, the European residential radon risk model for the Swiss population, the Chinese residential radon risk model for the Chinese population, and the EPA/BEIR-VI radon risk model for all three populations. The results were compared with the risk calculated from the International Commission on Radiological Protection (ICRP)'s exposure-to-risk conversion coefficients. In view of the fact that the ICRP coefficients were recommended for radiation protection of all populations, it was concluded that, generally speaking, lifetime absolute risks calculated with ICRP-recommended coefficients agree reasonably well with the range of radon induced lung cancer risk predicted by risk models derived from epidemiological pooling analyses. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  10. BlueSNP: R package for highly scalable genome-wide association studies using Hadoop clusters.

    PubMed

    Huang, Hailiang; Tata, Sandeep; Prill, Robert J

    2013-01-01

    Computational workloads for genome-wide association studies (GWAS) are growing in scale and complexity outpacing the capabilities of single-threaded software designed for personal computers. The BlueSNP R package implements GWAS statistical tests in the R programming language and executes the calculations across computer clusters configured with Apache Hadoop, a de facto standard framework for distributed data processing using the MapReduce formalism. BlueSNP makes computationally intensive analyses, such as estimating empirical p-values via data permutation, and searching for expression quantitative trait loci over thousands of genes, feasible for large genotype-phenotype datasets. http://github.com/ibm-bioinformatics/bluesnp

  11. Statistical Analyses for Probabilistic Assessments of the Reactor Pressure Vessel Structural Integrity: Building a Master Curve on an Extract of the 'Euro' Fracture Toughness Dataset, Controlling Statistical Uncertainty for Both Mono-Temperature and Multi-Temperature Tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Josse, Florent; Lefebvre, Yannick; Todeschini, Patrick

    2006-07-01

    Assessing the structural integrity of a nuclear Reactor Pressure Vessel (RPV) subjected to pressurized-thermal-shock (PTS) transients is extremely important to safety. In addition to conventional deterministic calculations to confirm RPV integrity, Electricite de France (EDF) carries out probabilistic analyses. Probabilistic analyses are interesting because some key variables, albeit conventionally taken at conservative values, can be modeled more accurately through statistical variability. One variable which significantly affects RPV structural integrity assessment is cleavage fracture initiation toughness. The reference fracture toughness method currently in use at EDF is the RCCM and ASME Code lower-bound K_IC based on the indexing parameter RT_NDT. However, in order to quantify the toughness scatter for probabilistic analyses, the master curve method is being analyzed at present. Furthermore, the master curve method is a direct means of evaluating fracture toughness based on K_JC data. In the framework of the master curve investigation undertaken by EDF, this article deals with the following two statistical items: building a master curve from an extract of a fracture toughness dataset (from the European project 'Unified Reference Fracture Toughness Design curves for RPV Steels') and controlling statistical uncertainty for both mono-temperature and multi-temperature tests. Concerning the first point, master curve temperature dependence is empirical in nature. To determine the 'original' master curve, Wallin postulated that a unified description of fracture toughness temperature dependence for ferritic steels is possible, and used a large number of data corresponding to nuclear-grade pressure vessel steels and welds. Our working hypothesis is that some ferritic steels may behave in slightly different ways. Therefore we focused exclusively on the basic French reactor vessel metal of types A508 Class 3 and A533 grade B Class 1, taking the sampling level and direction into account as well as the test specimen type. As for the second point, the emphasis is placed on the uncertainties in applying the master curve approach. For a toughness dataset based on different specimens of a single product, application of the master curve methodology requires the statistical estimation of one parameter: the reference temperature T_0. Because of the limited number of specimens, estimation of this temperature is uncertain. The ASTM standard provides a rough evaluation of this statistical uncertainty through an approximate confidence interval. In this paper, a thorough study is carried out to build more meaningful confidence intervals (for both mono-temperature and multi-temperature tests). These results ensure better control over uncertainty, and allow rigorous analysis of the impact of its influencing factors: the number of specimens and the temperatures at which they have been tested. (authors)

  12. Filter Tuning Using the Chi-Squared Statistic

    NASA Technical Reports Server (NTRS)

    Lilly-Salkowski, Tyler B.

    2017-01-01

    This paper examines the use of the Chi-squared statistic as a means of evaluating filter performance. The goal of the process is to characterize the filter performance in terms of covariance realism. The Chi-squared statistic is the value calculated to determine the realism of a covariance based on the prediction accuracy and the covariance values at a given point in time. Once calculated, it is the distribution of this statistic that provides insight into the accuracy of the covariance. The process of tuning an Extended Kalman Filter (EKF) for Aqua and Aura support is described, including examination of the measurement errors of available observation types and methods of dealing with potentially volatile atmospheric drag modeling. Predictive accuracy and the distribution of the Chi-squared statistic, calculated from EKF solutions, are assessed.
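
    The statistic described here is, in essence, a squared Mahalanobis distance of the prediction error under the propagated covariance, which should follow a chi-squared distribution with as many degrees of freedom as the error has components. A minimal sketch (3-dimensional position errors, with a Kolmogorov-Smirnov check chosen purely for illustration) follows.

```python
import numpy as np
from scipy import stats

def chi_squared_statistic(error, covariance):
    """epsilon = e^T P^-1 e, ~ chi-squared with dim(e) degrees of freedom
    if the covariance P realistically describes the error e."""
    return float(error @ np.linalg.solve(covariance, error))

def covariance_realism_check(errors, covariances):
    """Compare the empirical statistics with the chi-squared distribution
    using a one-sample Kolmogorov-Smirnov test (an illustrative choice)."""
    dof = errors.shape[1]
    values = [chi_squared_statistic(e, P) for e, P in zip(errors, covariances)]
    ks = stats.kstest(values, stats.chi2(dof).cdf)
    return np.mean(values), ks.pvalue

# Toy example: 500 3-D position errors drawn consistently with their covariance.
rng = np.random.default_rng(2)
P = np.diag([4.0, 1.0, 0.25])
errs = rng.multivariate_normal(np.zeros(3), P, size=500)
covs = np.repeat(P[None, :, :], 500, axis=0)
mean_stat, p_value = covariance_realism_check(errs, covs)
print(f"mean statistic {mean_stat:.2f} (expect ~3), KS p-value {p_value:.2f}")
```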

  13. MOLSIM: A modular molecular simulation software

    PubMed Central

    Reščič, Jurij

    2015-01-01

    The modular software MOLSIM for all‐atom molecular and coarse‐grained simulations is presented with a focus on the underlying concepts used. The software possesses four unique features: (1) it is integrated software for molecular dynamics, Monte Carlo, and Brownian dynamics simulations; (2) simulated objects are constructed in a hierarchical fashion representing atoms, rigid molecules and colloids, flexible chains, hierarchical polymers, and cross‐linked networks; (3) long‐range interactions involving charges, dipoles and/or anisotropic dipole polarizabilities are handled either with the standard Ewald sum, the smooth particle mesh Ewald sum, or the reaction‐field technique; (4) statistical uncertainties are provided for all calculated observables. In addition, MOLSIM supports various statistical ensembles, and several types of simulation cells and boundary conditions are available. Intermolecular interactions comprise tabulated pairwise potentials for speed and uniformity, and many‐body interactions involve anisotropic polarizabilities. Intramolecular interactions include bond, angle, and crosslink potentials. A very large set of analyses of static and dynamic properties is provided. The capability of MOLSIM can be extended by user‐provided routines controlling, for example, start conditions, intermolecular potentials, and analyses. An extensive set of case studies in the field of soft matter is presented, covering colloids, polymers, and crosslinked networks. © 2015 The Authors. Journal of Computational Chemistry Published by Wiley Periodicals, Inc. PMID:25994597

  14. Statistical Data Analyses of Trace Chemical, Biochemical, and Physical Analytical Signatures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Udey, Ruth Norma

    Analytical and bioanalytical chemistry measurement results are most meaningful when interpreted using rigorous statistical treatments of the data. The same data set may provide many dimensions of information depending on the questions asked through the applied statistical methods. Three principal projects illustrated the wealth of information gained through the application of statistical data analyses to diverse problems.

  15. Comparison of different methods of inter-eye asymmetry of rim area and disc area analysis

    PubMed Central

    Fansi, A A K; Boisjoly, H; Chagnon, M; Harasymowycz, P J

    2011-01-01

    Purpose To describe different methods of inter-eye asymmetry of rim area (RA) to disc area (DA) asymmetry ratio (RADAAR) analysis. Methods This was an observational, descriptive, and cross-sectional study. Both eyes of all participants underwent confocal scanning laser ophthalmoscopy (Heidelberg retina tomograph (HRT 3)), frequency-doubling technology perimetry (FDT), and complete ophthalmological examination. Based on ophthalmological clinical examination and FDT results of the worse eye, subjects were classified as normal, possible glaucoma, probable glaucoma, or definitive glaucoma. RADAAR values were calculated based on stereometric HRT 3 values using different mathematical formulae. RADAAR-1 was calculated as a relative difference of rim and disc areas between the eyes. RADAAR-2 was calculated by subtracting the rim-to-DA ratio of the smaller disc from the rim-to-DA ratio of the larger disc. RADAAR-3 was calculated by dividing the two rim-to-DA ratios. Statistical analyses included ANOVA as well as Student t-tests. Results Data of 334 participants were analysed, 78 of which were classified as definitive glaucoma. RADAAR-1 values were significantly different between the four diagnostic groups (F=5.82; P<0.001). The 1st and 99th percentile limits of normality for RADAAR-1, RADAAR-2, and RADAAR-3 in the normal group were, respectively, −10.64 and 8.4; −0.32 and 0.22; and 0.58 and 1.32. Conclusions RADAAR-1 seems to best distinguish between the diagnostic groups. Knowledge of RADAAR distribution in various diagnostic groups may aid in clinical diagnosis of asymmetric glaucomatous damage. PMID:21921945
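
    The three indices are described only in prose above; the sketch below implements them literally, with the exact normalisation of RADAAR-1 and the direction of the RADAAR-3 division taken as assumptions.

```python
def rim_to_disc(rim_area, disc_area):
    """Rim area / disc area ratio for one eye."""
    return rim_area / disc_area

def radaar_indices(rim_right, disc_right, rim_left, disc_left):
    """Inter-eye asymmetry indices as described in the abstract.
    RADAAR-1 is taken here as a percent relative difference of the rim/disc
    ratios (an assumed reading of 'relative difference'); RADAAR-2 and
    RADAAR-3 follow the text (larger disc vs smaller disc)."""
    rdr_right = rim_to_disc(rim_right, disc_right)
    rdr_left = rim_to_disc(rim_left, disc_left)

    # Identify the eyes by disc size.
    if disc_right >= disc_left:
        rdr_larger, rdr_smaller = rdr_right, rdr_left
    else:
        rdr_larger, rdr_smaller = rdr_left, rdr_right

    radaar1 = 100.0 * (rdr_right - rdr_left) / ((rdr_right + rdr_left) / 2.0)
    radaar2 = rdr_larger - rdr_smaller
    radaar3 = rdr_larger / rdr_smaller   # division direction is an assumption
    return radaar1, radaar2, radaar3

# Example with HRT-style stereometric values (mm^2, invented).
print(radaar_indices(rim_right=1.45, disc_right=1.90, rim_left=1.30, disc_left=1.75))
```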

  16. MRI T2 Mapping of the Knee Articular Cartilage Using Different Acquisition Sequences and Calculation Methods at 1.5 Tesla.

    PubMed

    Mars, Mokhtar; Bouaziz, Mouna; Tbini, Zeineb; Ladeb, Fethi; Gharbi, Souha

    2018-06-12

    This study aims to determine how Magnetic Resonance Imaging (MRI) acquisition techniques and calculation methods affect T2 values of knee cartilage at 1.5 Tesla and to identify sequences that can be used for high-resolution T2 mapping in short scanning times. This study was performed on a phantom and twenty-nine patients who underwent MRI of the knee joint at 1.5 Tesla. The protocol included T2 mapping sequences based on Single Echo Spin Echo (SESE), Multi-Echo Spin Echo (MESE), Fast Spin Echo (FSE) and Turbo Gradient Spin Echo (TGSE). The T2 relaxation times were quantified and evaluated using three calculation methods (MapIt, Syngo Offline and monoexponential fit). Signal to Noise Ratios (SNR) were measured in all sequences. All statistical analyses were performed using the t-test. The average T2 values in the phantom were 41.7 ± 13.8 ms for SESE, 43.2 ± 14.4 ms for MESE, 42.4 ± 14.1 ms for FSE and 44 ± 14.5 ms for TGSE. In the patient study, the mean differences were 6.5 ± 8.2 ms, 7.8 ± 7.6 ms and 8.4 ± 14.2 ms for MESE, FSE, and TGSE compared to SESE, respectively; these differences were not statistically significant (p > 0.05). The comparison between the three calculation methods showed no significant difference (p > 0.05). The t-test showed no significant difference between SNR values for all sequences. T2 values depend not only on the sequence type but also on the calculation method. None of the sequences revealed significant differences compared to the SESE reference sequence. TGSE with its short scanning time can be used for high-resolution T2 mapping. © 2018 The Author(s). Published by S. Karger AG, Basel.
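
    The monoexponential calculation method fits S(TE) = S0 exp(-TE/T2) to the multi-echo signal; a minimal least-squares sketch with made-up echo times and signals follows.

```python
import numpy as np
from scipy.optimize import curve_fit

def mono_exponential(te, s0, t2):
    """Signal decay model S(TE) = S0 * exp(-TE / T2)."""
    return s0 * np.exp(-te / t2)

def fit_t2(echo_times_ms, signals):
    """Fit T2 (ms) for one voxel or ROI by nonlinear least squares."""
    p0 = (signals[0], 40.0)  # initial guess: first-echo signal, ~40 ms T2
    (s0, t2), _ = curve_fit(mono_exponential, echo_times_ms, signals, p0=p0)
    return t2

# Synthetic multi-echo ROI signal with a true T2 of 42 ms plus noise.
rng = np.random.default_rng(3)
te = np.array([13.8, 27.6, 41.4, 55.2, 69.0, 82.8])
sig = 1000.0 * np.exp(-te / 42.0) + rng.normal(0, 5, size=te.size)
print(f"fitted T2 = {fit_t2(te, sig):.1f} ms")
```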

  17. Statistics and bioinformatics in nutritional sciences: analysis of complex data in the era of systems biology⋆

    PubMed Central

    Fu, Wenjiang J.; Stromberg, Arnold J.; Viele, Kert; Carroll, Raymond J.; Wu, Guoyao

    2009-01-01

    Over the past two decades, there have been revolutionary developments in life science technologies characterized by high throughput, high efficiency, and rapid computation. Nutritionists now have advanced methodologies for the analysis of DNA, RNA, proteins, and low-molecular-weight metabolites, as well as access to bioinformatics databases. Statistics, which can be defined as the process of making scientific inferences from data that contain variability, has historically played an integral role in advancing nutritional sciences. Currently, in the era of systems biology, statistics has become an increasingly important tool to quantitatively analyze information about biological macromolecules. This article describes general terms used in statistical analysis of large, complex experimental data. These terms include experimental design, power analysis, sample size calculation, and experimental errors (type I and II errors) for nutritional studies at population, tissue, cellular, and molecular levels. In addition, we highlight various sources of experimental variation in studies involving microarray gene expression, real-time polymerase chain reaction, proteomics, and other bioinformatics technologies. Moreover, we provide guidelines for nutritionists and other biomedical scientists to plan and conduct studies and to analyze the complex data. Appropriate statistical analyses are expected to make an important contribution to solving major nutrition-associated problems in humans and animals (including obesity, diabetes, cardiovascular disease, cancer, ageing, and intrauterine growth retardation). PMID:20233650

  18. Evaluating test-retest reliability in patient-reported outcome measures for older people: A systematic review.

    PubMed

    Park, Myung Sook; Kang, Kyung Ja; Jang, Sun Joo; Lee, Joo Yun; Chang, Sun Ju

    2018-03-01

    This study aimed to evaluate the components of test-retest reliability, including time interval, sample size, and statistical methods, used in patient-reported outcome measures in older people and to provide suggestions on the methodology for calculating test-retest reliability for patient-reported outcomes in older people. This was a systematic literature review. MEDLINE, Embase, CINAHL, and PsycINFO were searched from January 1, 2000 to August 10, 2017 by an information specialist. This systematic review was guided by both the Preferred Reporting Items for Systematic Reviews and Meta-Analyses checklist and the guideline for systematic review published by the National Evidence-based Healthcare Collaborating Agency in Korea. The methodological quality was assessed by the Consensus-based Standards for the selection of health Measurement Instruments checklist box B. Ninety-five out of 12,641 studies were selected for the analysis. The median time interval for test-retest reliability was 14 days, and the ratio of sample size for test-retest reliability to the number of items in each measure ranged from 1:1 to 1:4. The most frequently used statistical method for continuous scores was the intraclass correlation coefficient (ICC). Among the 63 studies that used ICCs, 21 studies presented models for ICC calculations and 30 studies reported 95% confidence intervals of the ICCs. Additional analyses using 17 studies that reported a strong ICC (>0.9) showed that the mean time interval was 12.88 days and the mean ratio of the number of items to sample size was 1:5.37. When researchers plan to assess the test-retest reliability of patient-reported outcome measures for older people, they need to consider an adequate time interval of approximately 13 days and a sample size of about 5 times the number of items. Particularly, statistical methods should not only be selected based on the types of scores of the patient-reported outcome measures, but should also be described clearly in the studies that report the results of test-retest reliability. Copyright © 2017 Elsevier Ltd. All rights reserved.
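
    For continuous scores, a common single-measure, two-way random-effects choice is ICC(2,1) of Shrout and Fleiss; the sketch below shows that calculation on made-up test-retest data and is not tied to any of the reviewed instruments.

```python
import numpy as np

def icc_2_1(scores):
    """ICC(2,1): two-way random effects, absolute agreement, single measure.
    `scores` has shape (n_subjects, k_occasions), e.g. test and retest columns."""
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape
    grand = scores.mean()
    row_means = scores.mean(axis=1)
    col_means = scores.mean(axis=0)

    ss_rows = k * np.sum((row_means - grand) ** 2)
    ss_cols = n * np.sum((col_means - grand) ** 2)
    ss_total = np.sum((scores - grand) ** 2)
    ss_error = ss_total - ss_rows - ss_cols

    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))

    return (ms_rows - ms_error) / (
        ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n
    )

# Example: 10 respondents scored twice, 14 days apart (invented values).
test = np.array([3, 4, 5, 2, 4, 3, 5, 4, 2, 3], dtype=float)
retest = np.array([3, 4, 4, 2, 5, 3, 5, 4, 3, 3], dtype=float)
print(f"ICC(2,1) = {icc_2_1(np.column_stack([test, retest])):.2f}")
```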

  19. Annular tautomerism: experimental observations and quantum mechanics calculations.

    PubMed

    Cruz-Cabeza, Aurora J; Schreyer, Adrian; Pitt, William R

    2010-06-01

    The use of MP2 level quantum mechanical (QM) calculations on isolated heteroaromatic ring systems for the prediction of the tautomeric propensities of whole molecules in a crystalline environment was examined. A Polarisable Continuum Model was used in the calculations to account for environment effects on the tautomeric relative stabilities. The calculated relative energies of tautomers were compared to relative abundances within the Cambridge Structural Database (CSD) and the Protein Data Bank (PDB). The work was focussed on 84 annular tautomeric forms of 34 common ring systems. Good agreement was found between the calculations and the experimental data even if the quantity of these data was limited in many cases. The QM results were compared to those produced by much faster semiempirical calculations. In a search for other sources of useful experimental data, the relative numbers of known compounds in which prototropic positions were often substituted by heavy atoms were also analysed. A scheme which groups all annular tautomeric transformations into 10 classes was developed. The scheme was designed to encompass a comprehensive set of known and theoretically possible tautomeric ring systems generated as part of a previous study. General trends across analogous ring systems were detected as a result. The calculations and statistics collected on crystallographic data as well as the general trends observed should be useful for the better modelling of annular tautomerism in applications such as computer-aided drug design, small molecule crystal structure prediction, the naming of compounds and the interpretation of protein-small molecule crystal structures.

  20. Annular tautomerism: experimental observations and quantum mechanics calculations

    NASA Astrophysics Data System (ADS)

    Cruz-Cabeza, Aurora J.; Schreyer, Adrian; Pitt, William R.

    2010-06-01

    The use of MP2 level quantum mechanical (QM) calculations on isolated heteroaromatic ring systems for the prediction of the tautomeric propensities of whole molecules in a crystalline environment was examined. A Polarisable Continuum Model was used in the calculations to account for environment effects on the tautomeric relative stabilities. The calculated relative energies of tautomers were compared to relative abundances within the Cambridge Structural Database (CSD) and the Protein Data Bank (PDB). The work was focussed on 84 annular tautomeric forms of 34 common ring systems. Good agreement was found between the calculations and the experimental data even if the quantity of these data was limited in many cases. The QM results were compared to those produced by much faster semiempirical calculations. In a search for other sources of useful experimental data, the relative numbers of known compounds in which prototropic positions were often substituted by heavy atoms were also analysed. A scheme which groups all annular tautomeric transformations into 10 classes was developed. The scheme was designed to encompass a comprehensive set of known and theoretically possible tautomeric ring systems generated as part of a previous study. General trends across analogous ring systems were detected as a result. The calculations and statistics collected on crystallographic data as well as the general trends observed should be useful for the better modelling of annular tautomerism in applications such as computer-aided drug design, small molecule crystal structure prediction, the naming of compounds and the interpretation of protein-small molecule crystal structures.

  1. Graphical augmentations to the funnel plot assess the impact of additional evidence on a meta-analysis.

    PubMed

    Langan, Dean; Higgins, Julian P T; Gregory, Walter; Sutton, Alexander J

    2012-05-01

    We aim to illustrate the potential impact of a new study on a meta-analysis, which gives an indication of the robustness of the meta-analysis. A number of augmentations are proposed to one of the most widely used graphical displays, the funnel plot. Namely, 1) statistical significance contours, which define regions of the funnel plot in which a new study would have to be located to change the statistical significance of the meta-analysis; and 2) heterogeneity contours, which show how a new study would affect the extent of heterogeneity in a given meta-analysis. Several other features are also described, and the use of multiple features simultaneously is considered. The statistical significance contours suggest that one additional study, no matter how large, may have a very limited impact on the statistical significance of a meta-analysis. The heterogeneity contours illustrate that one outlying study can increase the level of heterogeneity dramatically. The additional features of the funnel plot have applications including 1) informing sample size calculations for the design of future studies eligible for inclusion in the meta-analysis; and 2) informing the prioritization of updates to a portfolio of meta-analyses such as those prepared by the Cochrane Collaboration. Copyright © 2012 Elsevier Inc. All rights reserved.
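
    The effect the contours encode, how one additional study would shift a pooled result, can be seen by recomputing a fixed-effect inverse-variance meta-analysis with a hypothetical new study appended; the effect sizes below are invented for illustration.

```python
import numpy as np
from scipy import stats

def fixed_effect_meta(effects, variances):
    """Inverse-variance fixed-effect pooled estimate, SE and two-sided p-value."""
    w = 1.0 / np.asarray(variances, dtype=float)
    pooled = np.sum(w * effects) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    z = pooled / se
    p = 2.0 * stats.norm.sf(abs(z))
    return pooled, se, p

# Existing meta-analysis (log odds ratios and their variances, made up).
effects = np.array([-0.30, -0.10, -0.45, 0.05])
variances = np.array([0.04, 0.09, 0.06, 0.12])
print("before:", fixed_effect_meta(effects, variances))

# Append one hypothetical new study and see whether significance changes.
new_effect, new_variance = 0.20, 0.03
print("after: ", fixed_effect_meta(np.append(effects, new_effect),
                                   np.append(variances, new_variance)))
```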

  2. Notes on numerical reliability of several statistical analysis programs

    USGS Publications Warehouse

    Landwehr, J.M.; Tasker, Gary D.

    1999-01-01

    This report presents a benchmark analysis of several statistical analysis programs currently in use in the USGS. The benchmark consists of a comparison between the values provided by a statistical analysis program for variables in the reference data set ANASTY and their known or calculated theoretical values. The ANASTY data set is an amendment of the Wilkinson NASTY data set that has been used in the statistical literature to assess the reliability (computational correctness) of calculated analytical results.

  3. A principal component meta-analysis on multiple anthropometric traits identifies novel loci for body shape

    PubMed Central

    Ried, Janina S.; Jeff M., Janina; Chu, Audrey Y.; Bragg-Gresham, Jennifer L.; van Dongen, Jenny; Huffman, Jennifer E.; Ahluwalia, Tarunveer S.; Cadby, Gemma; Eklund, Niina; Eriksson, Joel; Esko, Tõnu; Feitosa, Mary F.; Goel, Anuj; Gorski, Mathias; Hayward, Caroline; Heard-Costa, Nancy L.; Jackson, Anne U.; Jokinen, Eero; Kanoni, Stavroula; Kristiansson, Kati; Kutalik, Zoltán; Lahti, Jari; Luan, Jian'an; Mägi, Reedik; Mahajan, Anubha; Mangino, Massimo; Medina-Gomez, Carolina; Monda, Keri L.; Nolte, Ilja M.; Pérusse, Louis; Prokopenko, Inga; Qi, Lu; Rose, Lynda M.; Salvi, Erika; Smith, Megan T.; Snieder, Harold; Stančáková, Alena; Ju Sung, Yun; Tachmazidou, Ioanna; Teumer, Alexander; Thorleifsson, Gudmar; van der Harst, Pim; Walker, Ryan W.; Wang, Sophie R.; Wild, Sarah H.; Willems, Sara M.; Wong, Andrew; Zhang, Weihua; Albrecht, Eva; Couto Alves, Alexessander; Bakker, Stephan J. L.; Barlassina, Cristina; Bartz, Traci M.; Beilby, John; Bellis, Claire; Bergman, Richard N.; Bergmann, Sven; Blangero, John; Blüher, Matthias; Boerwinkle, Eric; Bonnycastle, Lori L.; Bornstein, Stefan R.; Bruinenberg, Marcel; Campbell, Harry; Chen, Yii-Der Ida; Chiang, Charleston W. K.; Chines, Peter S.; Collins, Francis S; Cucca, Fracensco; Cupples, L Adrienne; D'Avila, Francesca; de Geus, Eco J .C.; Dedoussis, George; Dimitriou, Maria; Döring, Angela; Eriksson, Johan G.; Farmaki, Aliki-Eleni; Farrall, Martin; Ferreira, Teresa; Fischer, Krista; Forouhi, Nita G.; Friedrich, Nele; Gjesing, Anette Prior; Glorioso, Nicola; Graff, Mariaelisa; Grallert, Harald; Grarup, Niels; Gräßler, Jürgen; Grewal, Jagvir; Hamsten, Anders; Harder, Marie Neergaard; Hartman, Catharina A.; Hassinen, Maija; Hastie, Nicholas; Hattersley, Andrew Tym; Havulinna, Aki S.; Heliövaara, Markku; Hillege, Hans; Hofman, Albert; Holmen, Oddgeir; Homuth, Georg; Hottenga, Jouke-Jan; Hui, Jennie; Husemoen, Lise Lotte; Hysi, Pirro G.; Isaacs, Aaron; Ittermann, Till; Jalilzadeh, Shapour; James, Alan L.; Jørgensen, Torben; Jousilahti, Pekka; Jula, Antti; Marie Justesen, Johanne; Justice, Anne E.; Kähönen, Mika; Karaleftheri, Maria; Tee Khaw, Kay; Keinanen-Kiukaanniemi, Sirkka M.; Kinnunen, Leena; Knekt, Paul B.; Koistinen, Heikki A.; Kolcic, Ivana; Kooner, Ishminder K.; Koskinen, Seppo; Kovacs, Peter; Kyriakou, Theodosios; Laitinen, Tomi; Langenberg, Claudia; Lewin, Alexandra M.; Lichtner, Peter; Lindgren, Cecilia M.; Lindström, Jaana; Linneberg, Allan; Lorbeer, Roberto; Lorentzon, Mattias; Luben, Robert; Lyssenko, Valeriya; Männistö, Satu; Manunta, Paolo; Leach, Irene Mateo; McArdle, Wendy L.; Mcknight, Barbara; Mohlke, Karen L.; Mihailov, Evelin; Milani, Lili; Mills, Rebecca; Montasser, May E.; Morris, Andrew P.; Müller, Gabriele; Musk, Arthur W.; Narisu, Narisu; Ong, Ken K.; Oostra, Ben A.; Osmond, Clive; Palotie, Aarno; Pankow, James S.; Paternoster, Lavinia; Penninx, Brenda W.; Pichler, Irene; Pilia, Maria G.; Polašek, Ozren; Pramstaller, Peter P.; Raitakari, Olli T; Rankinen, Tuomo; Rao, D. 
C.; Rayner, Nigel W.; Ribel-Madsen, Rasmus; Rice, Treva K.; Richards, Marcus; Ridker, Paul M.; Rivadeneira, Fernando; Ryan, Kathy A.; Sanna, Serena; Sarzynski, Mark A.; Scholtens, Salome; Scott, Robert A.; Sebert, Sylvain; Southam, Lorraine; Sparsø, Thomas Hempel; Steinthorsdottir, Valgerdur; Stirrups, Kathleen; Stolk, Ronald P.; Strauch, Konstantin; Stringham, Heather M.; Swertz, Morris A.; Swift, Amy J.; Tönjes, Anke; Tsafantakis, Emmanouil; van der Most, Peter J.; Van Vliet-Ostaptchouk, Jana V.; Vandenput, Liesbeth; Vartiainen, Erkki; Venturini, Cristina; Verweij, Niek; Viikari, Jorma S.; Vitart, Veronique; Vohl, Marie-Claude; Vonk, Judith M.; Waeber, Gérard; Widén, Elisabeth; Willemsen, Gonneke; Wilsgaard, Tom; Winkler, Thomas W.; Wright, Alan F.; Yerges-Armstrong, Laura M.; Hua Zhao, Jing; Carola Zillikens, M.; Boomsma, Dorret I.; Bouchard, Claude; Chambers, John C.; Chasman, Daniel I.; Cusi, Daniele; Gansevoort, Ron T.; Gieger, Christian; Hansen, Torben; Hicks, Andrew A.; Hu, Frank; Hveem, Kristian; Jarvelin, Marjo-Riitta; Kajantie, Eero; Kooner, Jaspal S.; Kuh, Diana; Kuusisto, Johanna; Laakso, Markku; Lakka, Timo A.; Lehtimäki, Terho; Metspalu, Andres; Njølstad, Inger; Ohlsson, Claes; Oldehinkel, Albertine J.; Palmer, Lyle J.; Pedersen, Oluf; Perola, Markus; Peters, Annette; Psaty, Bruce M.; Puolijoki, Hannu; Rauramaa, Rainer; Rudan, Igor; Salomaa, Veikko; Schwarz, Peter E. H.; Shudiner, Alan R.; Smit, Jan H.; Sørensen, Thorkild I. A.; Spector, Timothy D.; Stefansson, Kari; Stumvoll, Michael; Tremblay, Angelo; Tuomilehto, Jaakko; Uitterlinden, André G.; Uusitupa, Matti; Völker, Uwe; Vollenweider, Peter; Wareham, Nicholas J.; Watkins, Hugh; Wilson, James F.; Zeggini, Eleftheria; Abecasis, Goncalo R.; Boehnke, Michael; Borecki, Ingrid B.; Deloukas, Panos; van Duijn, Cornelia M.; Fox, Caroline; Groop, Leif C.; Heid, Iris M.; Hunter, David J.; Kaplan, Robert C.; McCarthy, Mark I.; North, Kari E.; O'Connell, Jeffrey R.; Schlessinger, David; Thorsteinsdottir, Unnur; Strachan, David P.; Frayling, Timothy; Hirschhorn, Joel N.; Müller-Nurasyid, Martina; Loos, Ruth J. F.

    2016-01-01

    Large consortia have revealed hundreds of genetic loci associated with anthropometric traits, one trait at a time. We examined whether genetic variants affect body shape as a composite phenotype that is represented by a combination of anthropometric traits. We developed an approach that calculates averaged PCs (AvPCs) representing body shape derived from six anthropometric traits (body mass index, height, weight, waist and hip circumference, waist-to-hip ratio). The first four AvPCs explain >99% of the variability, are heritable, and associate with cardiometabolic outcomes. We performed genome-wide association analyses for each body shape composite phenotype across 65 studies and meta-analysed summary statistics. We identify six novel loci: LEMD2 and CD47 for AvPC1, RPS6KA5/C14orf159 and GANAB for AvPC3, and ARL15 and ANP32 for AvPC4. Our findings highlight the value of using multiple traits to define complex phenotypes for discovery, which are not captured by single-trait analyses, and may shed light onto new pathways. PMID:27876822

  4. Mapping of compositional properties of coal using isometric log-ratio transformation and sequential Gaussian simulation - A comparative study for spatial ultimate analyses data.

    PubMed

    Karacan, C Özgen; Olea, Ricardo A

    2018-03-01

    Chemical properties of coal largely determine coal handling, processing, beneficiation methods, and design of coal-fired power plants. Furthermore, these properties impact coal strength, coal blending during mining, as well as coal's gas content, which is important for mining safety. In order for these processes and quantitative predictions to be successful, safer, and economically feasible, it is important to determine and map chemical properties of coals accurately in order to infer these properties prior to mining. Ultimate analysis quantifies principal chemical elements in coal. These elements are C, H, N, S, O, and, depending on the basis, ash, and/or moisture. The basis for the data is determined by the condition of the sample at the time of analysis, with an "as-received" basis being the closest to sampling conditions and thus to the in-situ conditions of the coal. The parts determined or calculated as the result of ultimate analyses are compositions, reported in weight percent, and pose the challenges of statistical analyses of compositional data. The treatment of parts using proper compositional methods may be even more important in mapping them, as most mapping methods carry uncertainty due to partial sampling as well. In this work, we map the ultimate analyses parts of the Springfield coal from an Indiana section of the Illinois basin, USA, using sequential Gaussian simulation of isometric log-ratio transformed compositions. We compare the results with those of direct simulations of compositional parts. We also compare the implications of these approaches in calculating other properties using correlations to identify the differences and consequences. Although the study here is for coal, the methods described in the paper are applicable to any situation involving compositional data and its mapping.
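
    The isometric log-ratio (ilr) transformation maps a D-part composition to D-1 unconstrained coordinates before simulation and back-transformation. A small sketch of one standard ilr basis, applied to a made-up ultimate-analysis composition, is shown below; it is not the authors' geostatistical workflow.

```python
import numpy as np

def ilr(composition):
    """Isometric log-ratio coordinates of a D-part composition (one of several
    equivalent bases): z_j = sqrt(j/(j+1)) * ln(g(x_1..x_j) / x_{j+1})."""
    x = np.asarray(composition, dtype=float)
    x = x / x.sum()                      # close the composition to 1
    d = x.size
    z = np.empty(d - 1)
    for j in range(1, d):
        gm = np.exp(np.mean(np.log(x[:j])))   # geometric mean of the first j parts
        z[j - 1] = np.sqrt(j / (j + 1.0)) * np.log(gm / x[j])
    return z

def ilr_inverse(z):
    """Invert the same basis back to a composition summing to 1."""
    d = z.size + 1
    log_x = np.zeros(d)                  # first part fixed; closure removes the constant
    for j in range(1, d):
        log_x[j] = np.mean(log_x[:j]) - z[j - 1] / np.sqrt(j / (j + 1.0))
    x = np.exp(log_x)
    return x / x.sum()

# Illustrative (made-up) ultimate analysis in weight percent: C, H, N, S, O, ash.
parts = np.array([70.0, 5.0, 1.4, 2.1, 8.5, 13.0])
z = ilr(parts)
print(z)
print(ilr_inverse(z))   # recovers the closed composition
```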

  5. The development of a novel approach for assessment of the first flush in urban stormwater discharges.

    PubMed

    Bach, P M; McCarthy, D T; Deletic, A

    2010-01-01

    The management of stormwater pollution has placed particular emphasis on the first flush phenomenon. However, the definition and current methods of analysis of the phenomenon contain serious limitations, the most important being their inability to capture a possible impact of the event size (total event volume) on the first flush. This paper presents the development of a novel approach to defining and assessing the first flush that should overcome these problems. The phenomenon is present in a catchment if the decrease in pollution concentration with the absolute cumulative volume of runoff from the catchment is statistically significant. Using data from seven diverse catchments around Melbourne, Australia, changes in pollutant concentrations for Total Suspended Solids (TSS) and Total Nitrogen (TN) were calculated over the absolute cumulative runoff and aggregated from a collection of different storm events. Due to the discrete nature of the water quality data, each concentration was calculated as a flow-weighted average at 2 mm runoff volume increments. The aggregated concentrations recorded in each increment (termed a 'slice' of runoff) were statistically compared to each other across the absolute cumulative runoff volume. A first flush is then defined as the volume at which concentrations reach the 'background concentration' (i.e. the statistically significant minimum). Initial results clearly highlight first flush and background concentrations in all but one catchment, supporting the validity of this new approach. Future work will need to address factors which will help assess the first flush's magnitude and volume. Sensitivity testing and correlation with catchment characteristics should also be undertaken.
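
    The central computation is a flow-weighted average concentration within successive 2 mm increments of cumulative runoff depth; the sketch below illustrates that aggregation with invented flow and TSS series and a hypothetical catchment area.

```python
import numpy as np

def flow_weighted_slices(flow_m3s, conc_mgL, dt_s, area_m2, slice_mm=2.0):
    """Aggregate a discrete flow/concentration time series into flow-weighted
    average concentrations per `slice_mm` of cumulative runoff depth."""
    flow = np.asarray(flow_m3s, dtype=float)
    conc = np.asarray(conc_mgL, dtype=float)
    vol = flow * dt_s                               # volume per time step, m^3
    depth_mm = 1000.0 * np.cumsum(vol) / area_m2    # cumulative runoff depth, mm
    slice_idx = (depth_mm // slice_mm).astype(int)

    averages = {}
    for s in np.unique(slice_idx):
        mask = slice_idx == s
        averages[int(s)] = np.sum(conc[mask] * vol[mask]) / np.sum(vol[mask])
    return averages   # slice number -> flow-weighted mean concentration (mg/L)

# Toy storm event: 5-minute steps over a hypothetical 10 ha catchment.
flow = np.array([0.05, 0.20, 0.45, 0.40, 0.25, 0.15, 0.10, 0.05])
tss = np.array([220., 180., 150., 110., 90., 80., 75., 70.])
print(flow_weighted_slices(flow, tss, dt_s=300, area_m2=1.0e5))
```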

  6. New Predictive Parameters of Bell’s Palsy: Neutrophil to Lymphocyte Ratio and Platelet to Lymphocyte Ratio

    PubMed Central

    Atan, Doğan; İkincioğulları, Aykut; Köseoğlu, Sabri; Özcan, Kürşat Murat; Çetin, Mehmet Ali; Ensari, Serdar; Dere, Hüseyin

    2015-01-01

    Background: Bell’s palsy is the most frequent cause of unilateral facial paralysis. Inflammation is thought to play an important role in the pathogenesis of Bell’s palsy. Aims: Neutrophil to lymphocyte ratio (NLR) and platelet to lymphocyte ratio (PLR) are simple and inexpensive tests which are indicative of inflammation and can be calculated by all physicians. The aim of this study was to reveal correlations of Bell’s palsy and degree of paralysis with NLR and PLR. Study Design: Case-control study. Methods: The retrospective study was performed between January 2010 and December 2013. Ninety-nine patients diagnosed with Bell’s palsy were included in the Bell’s palsy group, and ninety-nine healthy individuals with the same demographic characteristics as the Bell’s palsy group were included in the control group. As a result of the analyses, NLR and PLR were calculated. Results: The mean NLR was 4.37 in the Bell’s palsy group and 1.89 in the control group, a statistically significant difference (p<0.001). The mean PLR was 137.5 in the Bell’s palsy group and 113.75 in the control group, a statistically significant difference (p=0.008). No statistically significant relation was detected between the degree of facial paralysis and NLR and PLR. Conclusion: The NLR and the PLR were significantly higher in patients with Bell’s palsy. This is the first study to reveal a relation between Bell’s palsy and PLR. NLR and PLR can be used as auxiliary parameters in the diagnosis of Bell’s palsy. PMID:26167340

  7. Within What Distance Does “Greenness” Best Predict Physical Health? A Systematic Review of Articles with GIS Buffer Analyses across the Lifespan

    PubMed Central

    2017-01-01

    Is the amount of “greenness” within a 250-m, 500-m, 1000-m or a 2000-m buffer surrounding a person’s home a good predictor of their physical health? The evidence is inconclusive. We reviewed Web of Science articles that used geographic information system buffer analyses to identify trends between physical health, greenness, and distance within which greenness is measured. Our inclusion criteria were: (1) use of buffers to estimate residential greenness; (2) statistical analyses that calculated significance of the greenness-physical health relationship; and (3) peer-reviewed articles published in English between 2007 and 2017. To capture multiple findings from a single article, we selected our unit of inquiry as the analysis, not the article. Our final sample included 260 analyses in 47 articles. All aspects of the review were in accordance with PRISMA guidelines. Analyses were independently judged as more, less, or least likely to be biased based on the inclusion of objective health measures and income/education controls. We found evidence that larger buffer sizes, up to 2000 m, better predicted physical health than smaller ones. We recommend that future analyses use nested rather than overlapping buffers to evaluate to what extent greenness not immediately around a person’s home (i.e., within 1000–2000 m) predicts physical health. PMID:28644420

  8. Early Warning Signs of Suicide in Service Members Who Engage in Unauthorized Acts of Violence

    DTIC Science & Technology

    2016-06-01

    observable to military law enforcement personnel. Statistical analyses tested for differences in warning signs between cases of suicide, violence, or...indicators, (2) Behavioral Change indicators, (3) Social indicators, and (4) Occupational indicators. Statistical analyses were conducted to test for...

  9. Earth Observation System Flight Dynamics System Covariance Realism

    NASA Technical Reports Server (NTRS)

    Zaidi, Waqar H.; Tracewell, David

    2016-01-01

    This presentation applies a covariance realism technique to the National Aeronautics and Space Administration (NASA) Earth Observation System (EOS) Aqua and Aura spacecraft based on inferential statistics. The technique consists of three parts: collection and calculation of definitive state estimates through orbit determination, calculation of covariance realism test statistics at each covariance propagation point, and proper assessment of those test statistics.

  10. [Statistical analysis using freely-available "EZR (Easy R)" software].

    PubMed

    Kanda, Yoshinobu

    2015-10-01

    Clinicians must often perform statistical analyses for purposes such as evaluating preexisting evidence and designing or executing clinical studies. R is a free software environment for statistical computing. R supports many statistical analysis functions, but does not incorporate a statistical graphical user interface (GUI). The R commander provides an easy-to-use basic-statistics GUI for R. However, the statistical functionality of the R commander is limited, especially in the field of biostatistics. Therefore, the author added several important statistical functions to the R commander and named it "EZR (Easy R)", which is now being distributed on the following website: http://www.jichi.ac.jp/saitama-sct/. EZR allows the application of statistical functions that are frequently used in clinical studies, such as survival analyses (including competing risk analyses and the use of time-dependent covariates), by point-and-click access. In addition, by saving the script automatically created by EZR, users can learn R script writing, maintain the traceability of the analysis, and assure that the statistical process is overseen by a supervisor.

  11. Statistical analysis of radiation dose derived from ingestion of foods

    NASA Astrophysics Data System (ADS)

    Dougherty, Ward L.

    2001-09-01

    This analysis undertook the task of designing and implementing a methodology to determine an individual's probabilistic radiation dose from ingestion of foods utilizing Crystal Ball. A dietary intake model was determined by comparing previously existing models. Two principal radionuclides were considered: Lead-210 (Pb-210) and Radium-226 (Ra-226). Samples from three different local grocery stores (Publix, Winn Dixie, and Albertsons) were counted on a gamma spectroscopy system with a GeLi detector. The same food samples were considered as those in the original FIPR database. A statistical analysis, utilizing the Crystal Ball program, was performed on the data to assess the most accurate distribution to use for these data. This allowed a determination of a radiation dose to an individual based on the information collected above. Based on the analyses performed, radiation dose for grocery store samples was lower for Radium-226 than FIPR debris analyses, 2.7 vs. 5.91 mrem/yr. Lead-210 had a higher dose in the grocery store sample than the FIPR debris analyses, 21.4 vs. 518 mrem/yr. The output radiation dose was higher for all evaluations when an accurate estimation of distributions for each value was considered. Radium-226 radiation dose for FIPR and grocery rose to 9.56 and 4.38 mrem/yr. Radiation dose from ingestion of Pb-210 rose to 34.7 and 854 mrem/yr for FIPR and grocery data, respectively. Lead-210 was higher than initial doses for several reasons: a different peak was examined, the lower edge of the detection limit was used, and the minimum detectable concentration was considered. FIPR did not utilize grocery samples as a control because they calculated radiation dose that appeared unreasonably high. Consideration of distributions with the initial values allowed reevaluation of radiation dose and showed a significant difference from the original deterministic values. This work shows the value and importance of considering distributions to ensure that a person's radiation dose is accurately calculated. Probabilistic dose methodology was proved to be a more accurate and realistic method of radiation dose determination. This type of methodology provides a visual presentation of dose distribution that can be a vital aid in risk methodology.

  12. Study of Left Ventricular Mass and Its Determinants on Echocardiography.

    PubMed

    Guleri, Namrata; Rana, Susheela; Chauhan, Randhir S; Negi, Prakash Chand; Diwan, Yogesh; Diwan, Deepa

    2017-09-01

    Increased Left Ventricular Mass (LVM) is an independent risk factor for cardiovascular morbidity and mortality. This study was done to find the prevalence and determinants of LVM in the Northern Indian population. A prospective cross-sectional observational study was carried out in a tertiary care centre in Himachal Pradesh, India, and the study population included all consecutive patients fulfilling the inclusion criteria who attended the cardiology outpatient department (OPD) over a period of one year, seeking medical attention with various symptoms or for dyslipidaemia or hypertension but not on medication. Focused history was taken; physical examination and investigations were done. Data collected were analysed using Epi-info software version 3.5.1. We calculated means of the LVM index for categorical variables (i.e., sex, tobacco consumption, alcohol consumption, and sedentary lifestyle) and also calculated p-values as tests of significance for mean differences across the exposure variable groups. The Pearson correlation coefficient was calculated, and two-tailed significance at p<0.05 was taken as statistically significant. The mean age of the study population was 42.30±9.8 years and 62.9% were males. The mean LVM index was significantly higher in men than in women, 77.7 ± 11.4 vs. 71.3 ± 15.7 (p-value <0.01). A strong positive correlation was observed between increased waist hip ratio and increased Left Ventricular Mass Index (LVMI). The Pearson correlation coefficient was 36.77 and it was statistically significant with p-value 0.04. We found a positive and independent correlation of increased LVMI with increased Waist Hip Ratio (WHR). A positive independent correlation was also observed with higher fasting blood sugar levels.
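
    Echocardiographic LVM is most often calculated with the linear-method (Devereux) cube formula and indexed to body surface area; the abstract does not state which formula was used, so the sketch below is purely illustrative.

```python
def devereux_lvm(ivsd_cm, lvidd_cm, pwd_cm):
    """Left ventricular mass (g) from the Devereux cube formula, a common
    convention; the study above does not state its exact formula."""
    return 0.8 * 1.04 * ((ivsd_cm + lvidd_cm + pwd_cm) ** 3 - lvidd_cm ** 3) + 0.6

def mosteller_bsa(height_cm, weight_kg):
    """Body surface area (m^2) by the Mosteller formula."""
    return ((height_cm * weight_kg) / 3600.0) ** 0.5

def lvm_index(ivsd_cm, lvidd_cm, pwd_cm, height_cm, weight_kg):
    """LVM indexed to body surface area (g/m^2)."""
    return devereux_lvm(ivsd_cm, lvidd_cm, pwd_cm) / mosteller_bsa(height_cm, weight_kg)

# Example measurements (cm) and anthropometry (invented values).
print(f"LVMI = {lvm_index(0.95, 4.8, 0.90, 170, 70):.1f} g/m^2")
```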

  13. Towards interoperable and reproducible QSAR analyses: Exchange of datasets.

    PubMed

    Spjuth, Ola; Willighagen, Egon L; Guha, Rajarshi; Eklund, Martin; Wikberg, Jarl Es

    2010-06-30

    QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises addition of chemical structures as well as selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constraining collaborations and re-use of data. We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML) which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusion regarding descriptors by defining them crisply. This makes it easy to join, extend, and combine datasets and hence work collectively, but also allows for analyzing the effect descriptors have on the statistical model's performance. The presented Bioclipse plugins equip scientists with graphical tools that make QSAR-ML easily accessible for the community.

  14. Towards interoperable and reproducible QSAR analyses: Exchange of datasets

    PubMed Central

    2010-01-01

    Background QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises addition of chemical structures as well as selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constraining collaborations and re-use of data. Results We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML) which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Conclusions Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusion regarding descriptors by defining them crisply. This makes it easy to join, extend, and combine datasets and hence work collectively, but also allows for analyzing the effect descriptors have on the statistical model's performance. The presented Bioclipse plugins equip scientists with graphical tools that make QSAR-ML easily accessible for the community. PMID:20591161

  15. Does the emergency surgery score accurately predict outcomes in emergent laparotomies?

    PubMed

    Peponis, Thomas; Bohnen, Jordan D; Sangji, Naveen F; Nandan, Anirudh R; Han, Kelsey; Lee, Jarone; Yeh, D Dante; de Moya, Marc A; Velmahos, George C; Chang, David C; Kaafarani, Haytham M A

    2017-08-01

    The emergency surgery score is a mortality-risk calculator for emergency general operation patients. We sought to examine whether the emergency surgery score predicts 30-day morbidity and mortality in a high-risk group of patients undergoing emergent laparotomy. Using the 2011-2012 American College of Surgeons National Surgical Quality Improvement Program database, we identified all patients who underwent emergent laparotomy using (1) the American College of Surgeons National Surgical Quality Improvement Program definition of "emergent," and (2) all Current Procedural Terminology codes denoting a laparotomy, excluding aortic aneurysm rupture. Multivariable logistic regression analyses were performed to measure the correlation (c-statistic) between the emergency surgery score and (1) 30-day mortality, and (2) 30-day morbidity after emergent laparotomy. As sensitivity analyses, the correlation between the emergency surgery score and 30-day mortality was also evaluated in prespecified subgroups based on Current Procedural Terminology codes. A total of 26,410 emergent laparotomy patients were included. Thirty-day mortality and morbidity were 10.2% and 43.8%, respectively. The emergency surgery score correlated well with mortality (c-statistic = 0.84); scores of 1, 11, and 22 correlated with mortalities of 0.4%, 39%, and 100%, respectively. Similarly, the emergency surgery score correlated well with morbidity (c-statistic = 0.74); scores of 0, 7, and 11 correlated with complication rates of 13%, 58%, and 79%, respectively. The morbidity rates plateaued for scores higher than 11. Sensitivity analyses demonstrated that the emergency surgery score effectively predicts mortality in patients undergoing emergent (1) splenic, (2) gastroduodenal, (3) intestinal, (4) hepatobiliary, or (5) incarcerated ventral hernia operation. The emergency surgery score accurately predicts outcomes in all types of emergent laparotomy patients and may prove valuable as a bedside decision-making tool for patient and family counseling, as well as for adequate risk-adjustment in emergent laparotomy quality benchmarking efforts. Copyright © 2017 Elsevier Inc. All rights reserved.
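
    The c-statistic reported here is the area under the ROC curve of the score against the binary outcome; the sketch below shows that calculation on simulated data (not NSQIP records and not the published calculator's coefficients).

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Simulated cohort: higher score values carry higher mortality risk
# (toy data only; not the emergency surgery score itself).
rng = np.random.default_rng(4)
score = rng.integers(0, 23, size=5000)
prob_death = 1.0 / (1.0 + np.exp(-(score - 11.0) / 2.0))
died = rng.random(5000) < prob_death

# Discrimination of the score for the binary 30-day mortality outcome.
c_statistic = roc_auc_score(died, score)
print(f"c-statistic = {c_statistic:.2f}")
```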

  16. Statistical approaches in published ophthalmic clinical science papers: a comparison to statistical practice two decades ago.

    PubMed

    Zhang, Harrison G; Ying, Gui-Shuang

    2018-02-09

    The aim of this study is to evaluate the current practice of statistical analysis of eye data in clinical science papers published in the British Journal of Ophthalmology (BJO) and to determine whether the practice of statistical analysis has improved in the past two decades. All clinical science papers (n=125) published in BJO in January-June 2017 were reviewed for their statistical analysis approaches for analysing the primary ocular measure. We compared our findings to the results from a previous paper that reviewed BJO papers in 1995. Of 112 papers eligible for analysis, half of the studies analysed the data at an individual level because of the nature of observation, 16 (14%) studies analysed data from one eye only, 36 (32%) studies analysed data from both eyes at the ocular level, one study (1%) analysed the overall summary of ocular findings per individual, and three (3%) studies used the paired comparison. Among studies with data available from both eyes, 50 (89%) of 56 papers in 2017 did not analyse data from both eyes or ignored the intereye correlation, as compared with 60 (90%) of 67 papers in 1995 (P=0.96). Among studies that analysed data from both eyes at an ocular level, 33 (92%) of 36 studies completely ignored the intereye correlation in 2017, as compared with 16 (89%) of 18 studies in 1995 (P=0.40). A majority of studies did not analyse the data properly when data from both eyes were available. The practice of statistical analysis did not improve in the past two decades. Collaborative efforts should be made in the vision research community to improve the practice of statistical analysis for ocular data. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  17. A two-component rain model for the prediction of attenuation statistics

    NASA Technical Reports Server (NTRS)

    Crane, R. K.

    1982-01-01

    A two-component rain model has been developed for calculating attenuation statistics. In contrast to most other attenuation prediction models, the two-component model calculates the occurrence probability for volume cells or debris attenuation events. The model performed significantly better than the International Radio Consultative Committee model when used for predictions on earth-satellite paths. It is expected that the model will have applications in modeling the joint statistics required for space diversity system design, the statistics of interference due to rain scatter at attenuating frequencies, and the duration statistics for attenuation events.

  18. Reporting and methodological quality of meta-analyses in urological literature.

    PubMed

    Xia, Leilei; Xu, Jing; Guzzo, Thomas J

    2017-01-01

    To assess the overall quality of published urological meta-analyses and identify predictive factors for high quality. We systematically searched PubMed to identify meta-analyses published from January 1st, 2011 to December 31st, 2015 in 10 predetermined major paper-based urology journals. The characteristics of the included meta-analyses were collected, and their reporting and methodological qualities were assessed by the PRISMA checklist (27 items) and AMSTAR tool (11 items), respectively. Descriptive statistics were used for individual items as a measure of overall compliance, and PRISMA and AMSTAR scores were calculated as the sum of adequately reported domains. Logistic regression was used to identify predictive factors for high quality. A total of 183 meta-analyses were included. The mean PRISMA and AMSTAR scores were 22.74 ± 2.04 and 7.57 ± 1.41, respectively. PRISMA item 5 (protocol and registration), items 15 and 22 (risk of bias across studies), and items 16 and 23 (additional analysis) had less than 50% adherence. AMSTAR item 1 ("a priori" design), item 5 (list of studies), and item 10 (publication bias) had less than 50% adherence. Logistic regression analyses showed that funding support and an "a priori" design were associated with superior reporting quality, while following the PRISMA guideline and an "a priori" design were associated with superior methodological quality. Reporting and methodological qualities of recently published meta-analyses in major paper-based urology journals are generally good. Further improvement could potentially be achieved by strictly adhering to the PRISMA guideline and having an "a priori" protocol.

  19. Transport calculations and sensitivity analyses for air-over-ground and air-over-seawater weapons environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pace, J.V. III; Bartine, D.E.; Mynatt, F.R.

    1976-01-01

    Two-dimensional neutron and secondary gamma-ray transport calculations and cross-section sensitivity analyses have been performed to determine the effects of varying source heights and cross sections on calculated doses. The air-over-ground calculations demonstrate the existence of an optimal height of burst for a specific ground range and indicate under what conditions they are conservative with respect to infinite air calculations. The air-over-seawater calculations showed the importance of hydrogen and chlorine in gamma production. Additional sensitivity analyses indicated the importance of water in the ground, the amount of reduction in ground thickness for calculational purposes, and the effect of the degree of Legendre angular expansion of the scattering cross sections (P{sub l}) on the calculated dose.

  20. Profiling agricultural land cover change in the North Central U.S. using ten years of the Cropland Data Layer

    NASA Astrophysics Data System (ADS)

    Sandborn, A.; Ebinger, L.

    2016-12-01

    The Cropland Data Layer (CDL), produced by the USDA/National Agricultural Statistics Service, provides annual, georeferenced crop specific land cover data over the contiguous United States. Several analyses were performed on ten years (2007-2016) of CDL data in order to visualize and quantify agricultural change over the North Central region (North Dakota, South Dakota, and Minnesota). Crop masks were derived from the CDL and layered to produce a ten-year time stack of corn, soybeans, and spring wheat at 30m spatial resolution. Through numerous image analyses, a temporal profile of each crop type was compiled and portrayed cartographically. For each crop, analyses included calculating the mean center of crop area over the ten year sequence, identifying the first and latest year the crop was grown on each pixel, and distinguishing crop rotation patterns and replacement statistics. Results show a clear north-western expansion trend for corn and soybeans, and a western migration trend for spring wheat. While some change may be due to commonly practiced crop rotation, this analysis shows that crop footprints have extended into areas that were previously other crops, idle cropland, and pasture/rangeland. Possible factors contributing to this crop migration pattern include profit advantages of row crops over small grains, improved crop genetics, climate change, and farm management program changes. Identifying and mapping these crop planting differences will better inform agricultural best practices, help to monitor the latest crop migration patterns, and present researchers with a way to quantitatively measure and forecast future agricultural trends.

  1. Does workplace social capital protect against long-term sickness absence? Linking workplace aggregated social capital to sickness absence registry data.

    PubMed

    Hansen, Anne-Sophie K; Madsen, Ida E H; Thorsen, Sannie Vester; Melkevik, Ole; Bjørner, Jakob Bue; Andersen, Ingelise; Rugulies, Reiner

    2018-05-01

    Most previous prospective studies have examined workplace social capital as a resource of the individual. However, literature suggests that social capital is a collective good. In the present study we examined whether a high level of workplace aggregated social capital (WASC) predicts a decreased risk of individual-level long-term sickness absence (LTSA) in Danish private sector employees. A sample of 2043 employees (aged 18-64 years, 38.5% women) from 260 Danish private-sector companies filled in a questionnaire on workplace social capital and covariates. WASC was calculated by assigning the company-averaged social capital score to all employees of each company. We derived LTSA, defined as sickness absence of more than three weeks, from a national register. We examined if WASC predicted employee LTSA using multilevel survival analyses, while excluding participants with LTSA in the three months preceding baseline. We found no statistically significant association in any of the analyses. The hazard ratio for LTSA in the fully adjusted model was 0.93 (95% CI 0.77-1.13) per one standard deviation increase in WASC. When using WASC as a categorical exposure we found a statistically non-significant tendency towards a decreased risk of LTSA in employees with medium WASC (fully adjusted model: HR 0.78 (95% CI 0.48-1.27)). Post hoc analyses with workplace social capital as a resource of the individual showed similar results. WASC did not predict LTSA in this sample of Danish private-sector employees.

  2. Type II diabetes mellitus and incident osteoarthritis of the hand: a population-based case-control analysis.

    PubMed

    Frey, N; Hügle, T; Jick, S S; Meier, C R; Spoendlin, J

    2016-09-01

    Emerging evidence suggests that diabetes may be a risk factor for osteoarthritis (OA). However, previous results on the association between diabetes and all OA were conflicting. We aimed to comprehensively analyse the association between type II diabetes mellitus (T2DM) and osteoarthritis of the hand (HOA) specifically. We conducted a matched (1:1) case-control study using the UK-based Clinical Practice Research Datalink (CPRD) of cases aged 30-90 years with an incident diagnosis of HOA from 1995 to 2013. In multivariable conditional logistic regression analyses, we calculated odds ratios (OR) for incident HOA in patients with T2DM, categorized by T2DM severity (HbA1C), duration, and pharmacological treatment. We further performed sensitivity analyses in patients with and without other metabolic diseases (hypertension (HT), hyperlipidaemia (HL), obesity). Among 13,500 cases and 13,500 controls, we observed no statistically significant association between T2DM and HOA (OR 0.95, 95% confidence interval (CI) 0.87-1.04), regardless of T2DM severity, duration, or pharmacological treatment. Having HT did not change the OR. Although we observed slightly increased ORs in overweight T2DM patients with co-occurring HL with or without coexisting HT, none of these ORs were statistically significant. Our results provide evidence that T2DM is not an independent risk factor for HOA. Concurrence of T2DM with HT, HL, and/or obesity did not change this association significantly. Copyright © 2016 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.

  3. Isotropy analyses of the Planck convergence map

    NASA Astrophysics Data System (ADS)

    Marques, G. A.; Novaes, C. P.; Bernui, A.; Ferreira, I. S.

    2018-01-01

    The presence of matter in the path of relic photons causes distortions in the angular pattern of the cosmic microwave background (CMB) temperature fluctuations, modifying their properties in a slight but measurable way. Recently, the Planck Collaboration released the estimated convergence map, an integrated measure of the large-scale matter distribution that produced the weak gravitational lensing (WL) phenomenon observed in Planck CMB data. We perform exhaustive analyses of this convergence map, calculating the variance in small and large regions of the sky, but excluding the area masked due to Galactic contamination, and compare them with the features expected in the set of simulated convergence maps, also released by the Planck Collaboration. Our goal is to search for sky directions or regions where the WL imprints anomalous signatures in the variance estimator, revealed through a χ2 analysis at a statistically significant level. In the local analysis of the Planck convergence map, we identified eight patches of the sky in disagreement, at more than 2σ, with what is observed in the average of the simulations. In contrast, in the large-regions analysis we found no statistically significant discrepancies, but, interestingly, the regions with the highest χ2 values are surrounding the ecliptic poles. Thus, our results show a good agreement with the features expected by the Λ cold dark matter concordance model, as given by the simulations. Yet, the outlier regions found here could suggest that the data still contain residual contamination, like noise, due to over- or underestimation of systematic effects in the simulation data set.

  4. Does workplace social capital protect against long-term sickness absence? Linking workplace aggregated social capital to sickness absence registry data

    PubMed Central

    Hansen, Anne-Sophie K.; Madsen, Ida E. H.; Thorsen, Sannie Vester; Melkevik, Ole; Bjørner, Jakob Bue; Andersen, Ingelise; Rugulies, Reiner

    2017-01-01

    Aims: Most previous prospective studies have examined workplace social capital as a resource of the individual. However, literature suggests that social capital is a collective good. In the present study we examined whether a high level of workplace aggregated social capital (WASC) predicts a decreased risk of individual-level long-term sickness absence (LTSA) in Danish private sector employees. Methods: A sample of 2043 employees (aged 18–64 years, 38.5% women) from 260 Danish private-sector companies filled in a questionnaire on workplace social capital and covariates. WASC was calculated by assigning the company-averaged social capital score to all employees of each company. We derived LTSA, defined as sickness absence of more than three weeks, from a national register. We examined if WASC predicted employee LTSA using multilevel survival analyses, while excluding participants with LTSA in the three months preceding baseline. Results: We found no statistically significant association in any of the analyses. The hazard ratio for LTSA in the fully adjusted model was 0.93 (95% CI 0.77–1.13) per one standard deviation increase in WASC. When using WASC as a categorical exposure we found a statistically non-significant tendency towards a decreased risk of LTSA in employees with medium WASC (fully adjusted model: HR 0.78 (95% CI 0.48–1.27)). Post hoc analyses with workplace social capital as a resource of the individual showed similar results. Conclusions: WASC did not predict LTSA in this sample of Danish private-sector employees. PMID:28784025
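
    A minimal pandas sketch of the aggregation step described above, in which each employee is assigned the company-averaged social capital score; the column names and values are assumptions for illustration only.

      import pandas as pd

      df = pd.DataFrame({
          "company_id": [1, 1, 2, 2, 2],
          "social_capital": [3.2, 3.8, 2.5, 2.9, 3.1],   # individual questionnaire scores
      })

      # Company-averaged score assigned back to every employee of that company (WASC)
      df["wasc"] = df.groupby("company_id")["social_capital"].transform("mean")

      # Standardised so that hazard ratios can be expressed per 1 SD increase in WASC
      df["wasc_z"] = (df["wasc"] - df["wasc"].mean()) / df["wasc"].std()
      print(df)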

  5. Statistical Properties of SEE Rate Calculation in the Limits of Large and Small Event Counts

    NASA Technical Reports Server (NTRS)

    Ladbury, Ray

    2007-01-01

    This viewgraph presentation reviews the statistical properties of single event effects (SEE) rate calculations. The goal of SEE rate calculation is to bound the SEE rate, though the question is by how much. The presentation covers: (1) understanding errors on SEE cross sections, (2) methodology: maximum likelihood and confidence contours, (3) tests with simulated data, and (4) applications.

  6. FluxPyt: a Python-based free and open-source software for 13C-metabolic flux analyses.

    PubMed

    Desai, Trunil S; Srivastava, Shireesh

    2018-01-01

    13C-Metabolic flux analysis (MFA) is a powerful approach to estimate intracellular reaction rates which could be used in strain analysis and design. Processing and analysis of labeling data for calculation of fluxes and associated statistics is an essential part of MFA. However, various software currently available for data analysis employ proprietary platforms and thus limit accessibility. We developed FluxPyt, a Python-based truly open-source software package for conducting stationary 13C-MFA data analysis. The software is based on the efficient elementary metabolite unit framework. The standard deviations in the calculated fluxes are estimated using the Monte-Carlo analysis. FluxPyt also automatically creates flux maps based on a template for visualization of the MFA results. The flux distributions calculated by FluxPyt for two separate models: a small tricarboxylic acid cycle model and a larger Corynebacterium glutamicum model, were found to be in good agreement with those calculated by a previously published software. FluxPyt was tested in Microsoft™ Windows 7 and 10, as well as in Linux Mint 18.2. The availability of a free and open 13C-MFA software that works in various operating systems will enable more researchers to perform 13C-MFA and to further modify and develop the package.

  7. FluxPyt: a Python-based free and open-source software for 13C-metabolic flux analyses

    PubMed Central

    Desai, Trunil S.

    2018-01-01

    13C-Metabolic flux analysis (MFA) is a powerful approach to estimate intracellular reaction rates which could be used in strain analysis and design. Processing and analysis of labeling data for calculation of fluxes and associated statistics is an essential part of MFA. However, various software currently available for data analysis employ proprietary platforms and thus limit accessibility. We developed FluxPyt, a Python-based truly open-source software package for conducting stationary 13C-MFA data analysis. The software is based on the efficient elementary metabolite unit framework. The standard deviations in the calculated fluxes are estimated using the Monte-Carlo analysis. FluxPyt also automatically creates flux maps based on a template for visualization of the MFA results. The flux distributions calculated by FluxPyt for two separate models: a small tricarboxylic acid cycle model and a larger Corynebacterium glutamicum model, were found to be in good agreement with those calculated by a previously published software. FluxPyt was tested in Microsoft™ Windows 7 and 10, as well as in Linux Mint 18.2. The availability of a free and open 13C-MFA software that works in various operating systems will enable more researchers to perform 13C-MFA and to further modify and develop the package. PMID:29736347

  8. Stationary statistical theory of two-surface multipactor regarding all impacts for efficient threshold analysis

    NASA Astrophysics Data System (ADS)

    Lin, Shu; Wang, Rui; Xia, Ning; Li, Yongdong; Liu, Chunliang

    2018-01-01

    Statistical multipactor theories are critical prediction approaches for multipactor breakdown determination. However, these approaches still require a trade-off between calculation efficiency and accuracy. This paper presents an improved stationary statistical theory for efficient threshold analysis of two-surface multipactor. A general integral equation over the distribution function of the electron emission phase with both the single-sided and double-sided impacts considered is formulated. The modeling results indicate that the improved stationary statistical theory not only achieves the same accuracy of multipactor threshold calculation as the nonstationary statistical theory but also achieves high calculation efficiency. By using this improved stationary statistical theory, the total time consumption in calculating full multipactor susceptibility zones of parallel plates can be decreased by as much as a factor of four relative to the nonstationary statistical theory. The results also show that accounting for single-sided impacts is indispensable for accurate multipactor prediction in coaxial lines and is even more significant for high-order multipactor. Finally, the influence of secondary emission yield (SEY) properties on the multipactor threshold is further investigated. It is observed that the first cross energy and the energy range between the first cross and the SEY maximum both play a significant role in determining the multipactor threshold, which agrees with the numerical simulation results in the literature.

  9. Derivation from first principles of the statistical distribution of the mass peak intensities of MS data.

    PubMed

    Ipsen, Andreas

    2015-02-03

    Despite the widespread use of mass spectrometry (MS) in a broad range of disciplines, the nature of MS data remains very poorly understood, and this places important constraints on the quality of MS data analysis as well as on the effectiveness of MS instrument design. In the following, a procedure for calculating the statistical distribution of the mass peak intensity for MS instruments that use analog-to-digital converters (ADCs) and electron multipliers is presented. It is demonstrated that the physical processes underlying the data-generation process, from the generation of the ions to the signal induced at the detector, and on to the digitization of the resulting voltage pulse, result in data that can be well-approximated by a Gaussian distribution whose mean and variance are determined by physically meaningful instrumental parameters. This allows for a very precise understanding of the signal-to-noise ratio of mass peak intensities and suggests novel ways of improving it. Moreover, it is a prerequisite for being able to address virtually all data analytical problems in downstream analyses in a statistically rigorous manner. The model is validated with experimental data.

  10. Multivariate model of female black bear habitat use for a Geographic Information System

    USGS Publications Warehouse

    Clark, Joseph D.; Dunn, James E.; Smith, Kimberly G.

    1993-01-01

    Simple univariate statistical techniques may not adequately assess the multidimensional nature of habitats used by wildlife. Thus, we developed a multivariate method to model habitat-use potential using a set of female black bear (Ursus americanus) radio locations and habitat data consisting of forest cover type, elevation, slope, aspect, distance to roads, distance to streams, and forest cover type diversity score in the Ozark Mountains of Arkansas. The model is based on the Mahalanobis distance statistic coupled with Geographic Information System (GIS) technology. That statistic is a measure of dissimilarity and represents a standardized squared distance between a set of sample variates and an ideal based on the mean of variates associated with animal observations. Calculations were made with the GIS to produce a map containing Mahalanobis distance values within each cell on a 60- × 60-m grid. The model identified areas of high habitat use potential that could not otherwise be identified by independent perusal of any single map layer. This technique avoids many pitfalls that commonly affect typical multivariate analyses of habitat use and is a useful tool for habitat manipulation or mitigation to favor terrestrial vertebrates that use habitats on a landscape scale.
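
    A minimal numpy sketch of the Mahalanobis-distance statistic described above: the mean vector and covariance are estimated from the habitat variates at animal locations, and the squared distance from that "ideal" is then evaluated for every grid cell. The variates and values are hypothetical.

      import numpy as np

      rng = np.random.default_rng(1)

      # Habitat variates (e.g. elevation, slope, distance to roads) at bear radio locations
      use = rng.normal([500.0, 10.0, 2.0], [50.0, 3.0, 0.5], size=(200, 3))

      mu = use.mean(axis=0)
      cov_inv = np.linalg.inv(np.cov(use, rowvar=False))

      def mahalanobis_sq(cells):
          """Squared Mahalanobis distance of each grid cell's variates from the use mean."""
          d = cells - mu
          return np.einsum("ij,jk,ik->i", d, cov_inv, d)

      # The same variates extracted for every 60 x 60 m grid cell of the study area
      grid_cells = rng.normal([520.0, 12.0, 1.5], [80.0, 5.0, 1.0], size=(10_000, 3))
      d2 = mahalanobis_sq(grid_cells)   # low values indicate high habitat-use potential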

  11. Compositional data analysis for physical activity, sedentary time and sleep research.

    PubMed

    Dumuid, Dorothea; Stanford, Tyman E; Martin-Fernández, Josep-Antoni; Pedišić, Željko; Maher, Carol A; Lewis, Lucy K; Hron, Karel; Katzmarzyk, Peter T; Chaput, Jean-Philippe; Fogelholm, Mikael; Hu, Gang; Lambert, Estelle V; Maia, José; Sarmiento, Olga L; Standage, Martyn; Barreira, Tiago V; Broyles, Stephanie T; Tudor-Locke, Catrine; Tremblay, Mark S; Olds, Timothy

    2017-01-01

    The health effects of daily activity behaviours (physical activity, sedentary time and sleep) are widely studied. While previous research has largely examined activity behaviours in isolation, recent studies have adjusted for multiple behaviours. However, the inclusion of all activity behaviours in traditional multivariate analyses has not been possible due to the perfect multicollinearity of 24-h time budget data. The ensuing lack of adjustment for known effects on the outcome undermines the validity of study findings. We describe a statistical approach that enables the inclusion of all daily activity behaviours, based on the principles of compositional data analysis. Using data from the International Study of Childhood Obesity, Lifestyle and the Environment, we demonstrate the application of compositional multiple linear regression to estimate adiposity from children's daily activity behaviours expressed as isometric log-ratio coordinates. We present a novel method for predicting change in a continuous outcome based on relative changes within a composition, and for calculating associated confidence intervals to allow for statistical inference. The compositional data analysis presented overcomes the lack of adjustment that has plagued traditional statistical methods in the field, and provides robust and reliable insights into the health effects of daily activity behaviours.
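
    A minimal sketch of expressing a three-part daily time-use composition (sleep, sedentary time, physical activity) as isometric log-ratio coordinates before ordinary regression; the particular balance used here is one common choice and not necessarily the one used in the study.

      import numpy as np

      def ilr(comp):
          """Isometric log-ratio coordinates of a 3-part composition (rows sum to 1)."""
          x1, x2, x3 = comp[:, 0], comp[:, 1], comp[:, 2]
          z1 = np.sqrt(2.0 / 3.0) * np.log(x1 / np.sqrt(x2 * x3))
          z2 = np.sqrt(1.0 / 2.0) * np.log(x2 / x3)
          return np.column_stack([z1, z2])

      minutes = np.array([[540.0, 600.0, 300.0],    # sleep, sedentary, activity (min/day)
                          [480.0, 720.0, 240.0]])
      comp = minutes / minutes.sum(axis=1, keepdims=True)
      Z = ilr(comp)   # two unconstrained coordinates usable as regression covariates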

  12. The Problem of Auto-Correlation in Parasitology

    PubMed Central

    Pollitt, Laura C.; Reece, Sarah E.; Mideo, Nicole; Nussey, Daniel H.; Colegrave, Nick

    2012-01-01

    Explaining the contribution of host and pathogen factors in driving infection dynamics is a major ambition in parasitology. There is increasing recognition that analyses based on single summary measures of an infection (e.g., peak parasitaemia) do not adequately capture infection dynamics and so, the appropriate use of statistical techniques to analyse dynamics is necessary to understand infections and, ultimately, control parasites. However, the complexities of within-host environments mean that tracking and analysing pathogen dynamics within infections and among hosts poses considerable statistical challenges. Simple statistical models make assumptions that will rarely be satisfied in data collected on host and parasite parameters. In particular, model residuals (unexplained variance in the data) should not be correlated in time or space. Here we demonstrate how failure to account for such correlations can result in incorrect biological inference from statistical analysis. We then show how mixed effects models can be used as a powerful tool to analyse such repeated measures data in the hope that this will encourage better statistical practices in parasitology. PMID:22511865
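
    A minimal statsmodels sketch of the repeated-measures approach advocated above: a random intercept per host accounts for the correlation among measurements taken from the same infection over time. The data and column names are simulated for illustration.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(2)
      hosts = np.repeat(np.arange(20), 10)              # 20 hosts, 10 time points each
      day = np.tile(np.arange(10), 20)
      host_effect = rng.normal(0.0, 1.0, 20)[hosts]     # host-level random intercepts
      y = 5.0 + 0.3 * day + host_effect + rng.normal(0.0, 0.5, hosts.size)

      df = pd.DataFrame({"host": hosts, "day": day, "y": y})

      # Random-intercept mixed model: residuals within a host are no longer assumed independent
      model = smf.mixedlm("y ~ day", data=df, groups=df["host"]).fit()
      print(model.summary())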

  13. Distribution of tunnelling times for quantum electron transport.

    PubMed

    Rudge, Samuel L; Kosov, Daniel S

    2016-03-28

    In electron transport, the tunnelling time is the time taken for an electron to tunnel out of a system after it has tunnelled in. We define the tunnelling time distribution for quantum processes in a dissipative environment and develop a practical approach for calculating it, where the environment is described by the general Markovian master equation. We illustrate the theory by using the rate equation to compute the tunnelling time distribution for electron transport through a molecular junction. The tunnelling time distribution is exponential, which indicates that Markovian quantum tunnelling is a Poissonian statistical process. The tunnelling time distribution is used not only to study the quantum statistics of tunnelling along the average electric current but also to analyse extreme quantum events where an electron jumps against the applied voltage bias. The average tunnelling time shows distinctly different temperature dependence for p- and n-type molecular junctions and therefore provides a sensitive tool to probe the alignment of molecular orbitals relative to the electrode Fermi energy.
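
    A tiny numpy sketch of the stated link between an exponential tunnelling-time distribution and Poissonian counting statistics; the rate and window length are arbitrary.

      import numpy as np

      rng = np.random.default_rng(3)
      gamma = 2.0                                     # tunnelling rate (arbitrary units)
      waits = rng.exponential(1.0 / gamma, 100_000)   # exponential tunnelling times

      # Count events in windows of length T; for a Poisson process mean and variance agree
      T = 5.0
      times = np.cumsum(waits)
      counts = np.histogram(times, bins=np.arange(0.0, times[-1], T))[0]
      print(counts.mean(), counts.var())              # both approximately gamma * T = 10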

  14. Generalising Ward's Method for Use with Manhattan Distances.

    PubMed

    Strauss, Trudie; von Maltitz, Michael Johan

    2017-01-01

    The claim that Ward's linkage algorithm in hierarchical clustering is limited to use with Euclidean distances is investigated. In this paper, Ward's clustering algorithm is generalised to use with l1 norm or Manhattan distances. We argue that the generalisation of Ward's linkage method to incorporate Manhattan distances is theoretically sound and provide an example of where this method outperforms the method using Euclidean distances. As an application, we perform statistical analyses on languages using methods normally applied to biology and genetic classification. We aim to quantify differences in character traits between languages and use a statistical language signature based on relative bi-gram (sequence of two letters) frequencies to calculate a distance matrix between 32 Indo-European languages. We then use Ward's method of hierarchical clustering to classify the languages, using the Euclidean distance and the Manhattan distance. Results obtained from using the different distance metrics are compared to show that the Ward's algorithm characteristic of minimising intra-cluster variation and maximising inter-cluster variation is not violated when using the Manhattan metric.
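
    A minimal SciPy sketch of the language-signature pipeline described above: relative bi-gram frequencies per language, a Manhattan (l1) distance matrix, and hierarchical clustering. Note that SciPy's built-in Ward update formally assumes Euclidean input, so applying it to a Manhattan distance matrix only approximates the generalisation proposed in the paper; the texts are toy stand-ins rather than the 32-language corpus.

      import numpy as np
      from itertools import product
      from scipy.spatial.distance import pdist
      from scipy.cluster.hierarchy import linkage

      texts = {"lang_a": "the cat sat", "lang_b": "le chat", "lang_c": "die katze"}
      alphabet = "abcdefghijklmnopqrstuvwxyz "
      bigrams = ["".join(p) for p in product(alphabet, repeat=2)]
      index = {b: i for i, b in enumerate(bigrams)}

      def signature(text):
          """Relative bi-gram frequency vector of a text (the language signature)."""
          v = np.zeros(len(bigrams))
          for a, b in zip(text, text[1:]):
              v[index[a + b]] += 1.0
          return v / v.sum()

      X = np.array([signature(t) for t in texts.values()])
      dist_l1 = pdist(X, metric="cityblock")     # condensed Manhattan distance matrix
      tree = linkage(dist_l1, method="ward")     # see the caveat above
      print(tree)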

  15. The shape of CMB temperature and polarization peaks on the sphere

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marcos-Caballero, A.; Fernández-Cobos, R.; Martínez-González, E.

    2016-04-01

    We present a theoretical study of CMB temperature peaks, including their effect on the polarization field, and allowing nonzero eccentricity. The formalism is developed in harmonic space and using the covariant derivative on the sphere, which guarantees that the expressions obtained are completely valid at large scales (i.e., no flat approximation). The expected patterns induced by the peak, either in temperature or polarization, are calculated, as well as their covariances. It is found that the eccentricity introduces a quadrupolar dependence in the peak shape, which is proportional to a complex bias parameter b_ε, characterizing the peak asymmetry and orientation. In addition, the one-point statistics of the variables defining the peak on the sphere is reviewed, finding some differences with respect to the flat case for large peaks. Finally, we present a mechanism to simulate constrained CMB maps with a particular peak on the field, which is an interesting tool for analysing the statistical properties of the peaks present in the data.

  16. Grain size analysis and depositional environment of shallow marine to basin floor, Kelantan River Delta

    NASA Astrophysics Data System (ADS)

    Afifah, M. R. Nurul; Aziz, A. Che; Roslan, M. Kamal

    2015-09-01

    Sediment samples, consisting of Quaternary bottom sediments, were collected from the shallow marine area off Kuala Besar, Kelantan, outwards to the basin floor of the South China Sea. Sixty-five samples were analysed for their grain size distribution and statistical relationships. Basic statistical parameters such as the mean, standard deviation, skewness and kurtosis were calculated and used to differentiate the depositional environment of the sediments and to assess whether deposition was uniform between beach and river environments. The sediments of all areas varied in their sorting, ranging from very well sorted to poorly sorted, from strongly negatively skewed to strongly positively skewed, and from extremely leptokurtic to very platykurtic in nature. Bivariate plots between the grain-size parameters were then interpreted, and the Coarsest-Median (CM) pattern showed a trend suggesting that the sediments were influenced by three ongoing hydrodynamic factors, namely turbidity currents, littoral drift and wave dynamics, which controlled the sediment distribution pattern in various ways.
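
    A minimal SciPy sketch of the basic grain-size statistics named above (mean, standard deviation as sorting, skewness, kurtosis), computed here as moment statistics on phi-scale sizes; the study may instead have used the graphical (Folk and Ward) formulae, so this is purely illustrative.

      import numpy as np
      from scipy import stats

      phi = np.array([1.2, 1.5, 1.8, 2.0, 2.1, 2.4, 2.6, 3.0, 3.3, 3.9])  # grain sizes (phi)

      summary = {
          "mean": phi.mean(),
          "sorting (std dev)": phi.std(ddof=1),
          "skewness": stats.skew(phi),
          "kurtosis": stats.kurtosis(phi, fisher=False),   # Pearson kurtosis (3 = mesokurtic)
      }
      print(summary)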

  17. International employment in clinical practice: influencing factors for the dental hygienist.

    PubMed

    Abbott, A; Barrow, S-Y; Lopresti, F; Hittelman, E

    2005-02-01

    To assess demographics, job characteristics, geographical regions, resources and commitment, which influence dental hygienists seeking international clinical practice employment opportunities. Questionnaires were mailed to a convenience sample of members of the Dental Hygienists' Association of the City of New York. Statistical analyses were conducted and frequency distributions and relationships between variables were calculated. Seventy-two percent of respondents reported that they are or may be interested in working overseas. Italy and Spain (67%) were the regions of most interest. Salary (65%) was cited as the most influential factor in selection, whereas non-compliance with the equivalency to Occupational Safety and Health Administration standards (74%) was the most frequently perceived barrier. Fluency in multiple languages was statistically significantly associated with interest in overseas employment (p = 0.003). Policy makers, employers and educators need to be aware of these findings should recruitment be a possibility to render urgently needed oral hygiene care in regions where there is a perceived shortage of dental hygienists.

  18. Combined Uncertainty and A-Posteriori Error Bound Estimates for General CFD Calculations: Theory and Software Implementation

    NASA Technical Reports Server (NTRS)

    Barth, Timothy J.

    2014-01-01

    This workshop presentation discusses the design and implementation of numerical methods for the quantification of statistical uncertainty, including a-posteriori error bounds, for output quantities computed using CFD methods. Hydrodynamic realizations often contain numerical error arising from finite-dimensional approximation (e.g. numerical methods using grids, basis functions, particles) and statistical uncertainty arising from incomplete information and/or statistical characterization of model parameters and random fields. The first task at hand is to derive formal error bounds for statistics given realizations containing finite-dimensional numerical error [1]. The error in computed output statistics contains contributions from both realization error and the error resulting from the calculation of statistics integrals using a numerical method. A second task is to devise computable a-posteriori error bounds by numerically approximating all terms arising in the error bound estimates. For the same reason that CFD calculations including error bounds but omitting uncertainty modeling are only of limited value, CFD calculations including uncertainty modeling but omitting error bounds are only of limited value. To gain maximum value from CFD calculations, a general software package for uncertainty quantification with quantified error bounds has been developed at NASA. The package provides implementations for a suite of numerical methods used in uncertainty quantification: Dense tensorization basis methods [3] and a subscale recovery variant [1] for non-smooth data, Sparse tensorization methods[2] utilizing node-nested hierarchies, Sampling methods[4] for high-dimensional random variable spaces.

  19. Evaluation of prompt gamma-ray data and nuclear structure of niobium-94 with statistical model calculations

    NASA Astrophysics Data System (ADS)

    Turkoglu, Danyal

    Precise knowledge of prompt gamma-ray intensities following neutron capture is critical for elemental and isotopic analyses, homeland security, modeling nuclear reactors, etc. A recently-developed database of prompt gamma-ray production cross sections and nuclear structure information in the form of a decay scheme, called the Evaluated Gamma-ray Activation File (EGAF), is under revision. Statistical model calculations are useful for checking the consistency of the decay scheme, providing insight on its completeness and accuracy. Furthermore, these statistical model calculations are necessary to estimate the contribution of continuum gamma-rays, which cannot be experimentally resolved due to the high density of excited states in medium- and heavy-mass nuclei. Decay-scheme improvements in EGAF lead to improvements to other databases (Evaluated Nuclear Structure Data File, Reference Input Parameter Library) that are ultimately used in nuclear-reaction models to generate the Evaluated Nuclear Data File (ENDF). Gamma-ray transitions following neutron capture in 93Nb have been studied at the cold-neutron beam facility at the Budapest Research Reactor. Measurements have been performed using a coaxial HPGe detector with Compton suppression. Partial gamma-ray production capture cross sections at a neutron velocity of 2200 m/s have been deduced relative to that of the 255.9-keV transition after cold-neutron capture by 93Nb. With the measurement of a niobium chloride target, this partial cross section was internally standardized to the cross section for the 1951-keV transition after cold-neutron capture by 35Cl. The resulting (0.1377 ± 0.0018) barn (b) partial cross section produced a calibration factor that was 23% lower than previously measured for the EGAF database. The thermal-neutron cross sections were deduced for the 93Nb(n,γ)94mNb and 93Nb(n,γ)94gNb reactions by summing the experimentally-measured partial gamma-ray production cross sections associated with the ground-state transitions below the 396-keV level and combining that summation with the contribution to the ground state from the quasi-continuum above 396 keV, determined with Monte Carlo statistical model calculations using the DICEBOX computer code. These values, σm and σ0, were (0.83 ± 0.05) b and (1.16 ± 0.11) b, respectively, and found to be in agreement with literature values. Comparison of the modeled population and experimental depopulation of individual levels confirmed tentative spin assignments and suggested changes where imbalances existed.

  20. SPA- STATISTICAL PACKAGE FOR TIME AND FREQUENCY DOMAIN ANALYSIS

    NASA Technical Reports Server (NTRS)

    Brownlow, J. D.

    1994-01-01

    The need for statistical analysis often arises when data is in the form of a time series. This type of data is usually a collection of numerical observations made at specified time intervals. Two kinds of analysis may be performed on the data. First, the time series may be treated as a set of independent observations using a time domain analysis to derive the usual statistical properties including the mean, variance, and distribution form. Secondly, the order and time intervals of the observations may be used in a frequency domain analysis to examine the time series for periodicities. In almost all practical applications, the collected data is actually a mixture of the desired signal and a noise signal which is collected over a finite time period with a finite precision. Therefore, any statistical calculations and analyses are actually estimates. The Spectrum Analysis (SPA) program was developed to perform a wide range of statistical estimation functions. SPA can provide the data analyst with a rigorous tool for performing time and frequency domain studies. In a time domain statistical analysis the SPA program will compute the mean, variance, standard deviation, mean square, and root mean square. It also lists the data maximum, data minimum, and the number of observations included in the sample. In addition, a histogram of the time domain data is generated, a normal curve is fit to the histogram, and a goodness-of-fit test is performed. These time domain calculations may be performed on both raw and filtered data. For a frequency domain statistical analysis the SPA program computes the power spectrum, cross spectrum, coherence, phase angle, amplitude ratio, and transfer function. The estimates of the frequency domain parameters may be smoothed with the use of Hann-Tukey, Hamming, Bartlett, or moving average windows. Various digital filters are available to isolate data frequency components. Frequency components with periods longer than the data collection interval are removed by least-squares detrending. As many as ten channels of data may be analyzed at one time. Both tabular and plotted output may be generated by the SPA program. This program is written in FORTRAN IV and has been implemented on a CDC 6000 series computer with a central memory requirement of approximately 142K (octal) of 60 bit words. This core requirement can be reduced by segmentation of the program. The SPA program was developed in 1978.
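
    A minimal numpy/scipy sketch of the two kinds of analysis the SPA program performs: time-domain summary statistics, and a frequency-domain power spectrum with detrending and a window applied. The signal is a hypothetical sine wave plus noise, not SPA output.

      import numpy as np
      from scipy import signal

      rng = np.random.default_rng(4)
      fs = 100.0                                           # sampling rate (Hz)
      t = np.arange(0.0, 30.0, 1.0 / fs)
      x = np.sin(2 * np.pi * 5.0 * t) + 0.5 * rng.normal(size=t.size)

      # Time-domain statistics: mean, variance, standard deviation, RMS, extrema
      print(x.mean(), x.var(), x.std(), np.sqrt(np.mean(x ** 2)), x.min(), x.max())

      # Frequency-domain estimate with linear detrending and a Hamming window
      freqs, pxx = signal.periodogram(x, fs=fs, window="hamming", detrend="linear")
      print(freqs[np.argmax(pxx)])                         # dominant periodicity, ~5 Hz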

  1. Analysis and interpretation of cost data in randomised controlled trials: review of published studies

    PubMed Central

    Barber, Julie A; Thompson, Simon G

    1998-01-01

    Objective To review critically the statistical methods used for health economic evaluations in randomised controlled trials where an estimate of cost is available for each patient in the study. Design Survey of published randomised trials including an economic evaluation with cost values suitable for statistical analysis; 45 such trials published in 1995 were identified from Medline. Main outcome measures The use of statistical methods for cost data was assessed in terms of the descriptive statistics reported, use of statistical inference, and whether the reported conclusions were justified. Results Although all 45 trials reviewed apparently had cost data for each patient, only 9 (20%) reported adequate measures of variability for these data and only 25 (56%) gave results of statistical tests or a measure of precision for the comparison of costs between the randomised groups. Only 16 (36%) of the articles gave conclusions which were justified on the basis of results presented in the paper. No paper reported sample size calculations for costs. Conclusions The analysis and interpretation of cost data from published trials reveal a lack of statistical awareness. Strong and potentially misleading conclusions about the relative costs of alternative therapies have often been reported in the absence of supporting statistical evidence. Improvements in the analysis and reporting of health economic assessments are urgently required. Health economic guidelines need to be revised to incorporate more detailed statistical advice. Key messages: Health economic evaluations required for important healthcare policy decisions are often carried out in randomised controlled trials. A review of such published economic evaluations assessed whether statistical methods for cost outcomes have been appropriately used and interpreted. Few publications presented adequate descriptive information for costs or performed appropriate statistical analyses. In at least two thirds of the papers, the main conclusions regarding costs were not justified. The analysis and reporting of health economic assessments within randomised controlled trials urgently need improving. PMID:9794854

  2. Statistics-based email communication security behavior recognition

    NASA Astrophysics Data System (ADS)

    Yi, Junkai; Su, Yueyang; Zhao, Xianghui

    2017-08-01

    With the development of information technology, e-mail has become a popular communication medium. Determining the relationship between the two sides of a communication is of great significance. First, this paper analysed and processed e-mail content and attachments using steganalysis and malware-analysis techniques. It then performed feature extraction and built a behaviour model based on naive Bayesian theory. A behaviour analysis method was then employed to calculate and evaluate communication security. Finally, experiments on the accuracy of identifying the behavioural relationship of a communication were carried out. The results show that the method is effective, with a correctness of eighty-four percent.

  3. A Documentary Analysis of Abstracts Presented in European Congresses on Adapted Physical Activity.

    PubMed

    Sklenarikova, Jana; Kudlacek, Martin; Baloun, Ladislav; Causgrove Dunn, Janice

    2016-07-01

    The purpose of the study was to identify trends in research abstracts published in the books of abstracts of the European Congress of Adapted Physical Activity from 2004 to 2012. A documentary analysis of the contents of 459 abstracts was completed. Data were coded based on subcategories used in a previous study by Zhang, deLisle, and Chen (2006) and by Porretta and Sherrill (2005): number of authors, data source, sample size, type of disability, data analyses, type of study, and focus of study. Descriptive statistics calculated for each subcategory revealed an overall picture of the state and trends of scientific inquiry in adapted physical activity research in Europe.

  4. Parton distributions in the LHC era

    NASA Astrophysics Data System (ADS)

    Del Debbio, Luigi

    2018-03-01

    Analyses of LHC (and other!) experiments require robust and statistically accurate determinations of the structure of the proton, encoded in the parton distribution functions (PDFs). The standard description of hadronic processes relies on factorization theorems, which allow a separation of process-dependent short-distance physics from the universal long-distance structure of the proton. Traditionally the PDFs are obtained from fits to experimental data. However, understanding the long-distance properties of hadrons is a nonperturbative problem, and lattice QCD can play a role in providing useful results from first principles. In this talk we compare the different approaches used to determine PDFs, and try to assess the impact of existing, and future, lattice calculations.

  5. Currie detection limits in gamma-ray spectroscopy.

    PubMed

    De Geer, Lars-Erik

    2004-01-01

    Currie hypothesis testing is applied to gamma-ray spectral data, where an optimum part of the peak is used and the background is considered well known from nearby channels. With this, the risk of making Type I errors is about 100 times lower than commonly assumed. A programme, PeakMaker, produces random peaks with given characteristics on the screen, and calculations are done to facilitate a full use of Poisson statistics in spectrum analyses. SHORT TECHNICAL NOTE SUMMARY: The Currie decision limit concept applied to spectral data is reinterpreted, which gives better consistency between the selected error risk and the observed error rates. A PeakMaker program is described and the few-count problem is analysed.
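
    A minimal sketch of the classical Currie expressions for a paired, well-known background at alpha = beta = 0.05, which are the textbook formulas this note re-examines for spectral peaks; the re-interpreted limits of the paper itself are not reproduced here.

      import numpy as np

      def currie_limits(background_counts):
          """Classical Currie critical level L_C and detection limit L_D (alpha = beta = 0.05)."""
          b = float(background_counts)
          l_c = 2.33 * np.sqrt(b)           # decision threshold for reporting "detected"
          l_d = 2.71 + 4.65 * np.sqrt(b)    # a priori minimum detectable net counts
          return l_c, l_d

      print(currie_limits(100.0))           # e.g. L_C = 23.3, L_D = 49.2 counts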

  6. 40 CFR 63.1260 - Reporting requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...) Data and rationale used to support an engineering assessment to calculate uncontrolled emissions from... profiles, performance tests, engineering analyses, design evaluations, or calculations used to demonstrate... required calculations and engineering analyses have been performed. For the initial Periodic report, each...

  7. Comparing the nine-item Shared Decision-Making Questionnaire to the OPTION Scale - an attempt to establish convergent validity.

    PubMed

    Scholl, Isabelle; Kriston, Levente; Dirmaier, Jörg; Härter, Martin

    2015-02-01

    While there has been a clear move towards shared decision-making (SDM) in the last few years, the measurement of SDM-related constructs remains challenging. There has been a call for further psychometric testing of known scales, especially regarding validity aspects. To test convergent validity of the nine-item Shared Decision-Making Questionnaire (SDM-Q-9) by comparing it to the OPTION Scale. Cross-sectional study. Data were collected in outpatient care practices. Patients suffering from chronic diseases and facing a medical decision were included in the study. Consultations were evaluated using the OPTION Scale. Patients completed the SDM-Q-9 after the consultation. First, the internal consistency of both scales and the inter-rater reliability of the OPTION Scale were calculated. To analyse the convergent validity of the SDM-Q-9, the correlation between patient ratings (SDM-Q-9) and expert ratings (OPTION Scale) was calculated. A total of 21 physicians provided analysable data of consultations with 63 patients. Analyses revealed good internal consistency of the SDM-Q-9 and limited internal consistency of the OPTION Scale. Inter-rater reliability of the latter was less than optimal. The association between the total scores of both instruments was weak, with a Spearman correlation of r = 0.19, and did not reach statistical significance. Convergent validity of the SDM-Q-9 could not be established using the OPTION Scale. Several possible explanations for this result are discussed. This study shows that the measurement of SDM remains challenging. © 2012 John Wiley & Sons Ltd.

  8. Comparison of somatotype values of football players in two professional league football teams according to the positions.

    PubMed

    Orhan, Ozlem; Sagir, Mehmet; Zorba, Erdal

    2013-06-01

    This study compared the somatotype values of football players according to their playing positions. The study aimed to determine the physical profiles of players and to analyze the relationships between somatotypes and playing positions. Study participants were members of two teams in the Turkey Professional Football League, Gençlerbirligi Sports Team (GB) (N = 24) and Gençlerbirligi Oftas Sports Team (GBO) (N = 24). Anthropometric measurements of the players were performed according to techniques suggested by the Anthropometric Standardization Reference Manual (ASRM) and International Biological Program (IBP). In somatotype calculations, triceps, subscapular, supraspinale and calf skinfold thickness, humerus bicondylar, femur bicondylar, biceps circumference, calf circumference and body weight and height were used. Statistical analysis of the data was performed using GraphPad Prism version 5.00 for Windows (GraphPad Software, San Diego, California, USA); somatotype calculations and analyses used the Somatotype 1.1 program, and graphical representations of the results were produced. Non-parametric analysis of the player data (Mann-Whitney U test for two independent samples) showed that there were no statistically significant differences between the two teams. The measurements indicated that, when all of the GB and GBO players were evaluated collectively, their average somatotypes were balanced mesomorph. The somatotypes of GBO goalkeepers were generally ectomorphic mesomorph; GB goalkeepers were balanced mesomorphic, although they were slightly endomorphic.

  9. Biomechanical Analysis of Military Boots. Phase 1. Materials Testing of Military and Commercial Footwear

    DTIC Science & Technology

    1992-10-01

    Summary statistics (N=8 and N=4) and results of statistical analyses for impact tests performed on the forefoot of unworn and worn footwear are tabulated in the appendices. Tests were used to assess heel and forefoot shock absorption, upper and sole durability, and flexibility (Cavanagh, 1978).

  10. Quantifying, displaying and accounting for heterogeneity in the meta-analysis of RCTs using standard and generalised Q statistics

    PubMed Central

    2011-01-01

    Background Clinical researchers have often preferred to use a fixed effects model for the primary interpretation of a meta-analysis. Heterogeneity is usually assessed via the well known Q and I2 statistics, along with the random effects estimate they imply. In recent years, alternative methods for quantifying heterogeneity have been proposed, that are based on a 'generalised' Q statistic. Methods We review 18 IPD meta-analyses of RCTs into treatments for cancer, in order to quantify the amount of heterogeneity present and also to discuss practical methods for explaining heterogeneity. Results Differing results were obtained when the standard Q and I2 statistics were used to test for the presence of heterogeneity. The two meta-analyses with the largest amount of heterogeneity were investigated further, and on inspection the straightforward application of a random effects model was not deemed appropriate. Compared to the standard Q statistic, the generalised Q statistic provided a more accurate platform for estimating the amount of heterogeneity in the 18 meta-analyses. Conclusions Explaining heterogeneity via the pre-specification of trial subgroups, graphical diagnostic tools and sensitivity analyses produced a more desirable outcome than an automatic application of the random effects model. Generalised Q statistic methods for quantifying and adjusting for heterogeneity should be incorporated as standard into statistical software. Software is provided to help achieve this aim. PMID:21473747
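
    A minimal numpy sketch of the standard Cochran Q and I2 statistics for a fixed-effect meta-analysis of trial estimates; the generalised Q discussed in the paper modifies the weights and is not shown. The estimates and standard errors are hypothetical.

      import numpy as np

      theta = np.array([0.10, 0.25, -0.05, 0.30])   # trial effect estimates (e.g. log hazard ratios)
      se = np.array([0.08, 0.12, 0.10, 0.15])       # their standard errors

      w = 1.0 / se ** 2                             # inverse-variance (fixed-effect) weights
      theta_fe = np.sum(w * theta) / np.sum(w)      # pooled fixed-effect estimate

      q = np.sum(w * (theta - theta_fe) ** 2)       # Cochran's Q
      df = theta.size - 1
      i2 = max(0.0, (q - df) / q) * 100.0           # I2: % of variation due to heterogeneity
      print(theta_fe, q, i2)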

  11. Corrigendum to ‘Evidence for shock heating and constraints on Martian surface temperatures revealed by 40Ar/39Ar thermochronometry of Martian meteorites’ [Geochim. Cosmochim. Acta (2010) 6900–6920]

    DOE PAGES

    Cassata, William S.; Shuster, David L.; Renne, Paul R.; ...

    2014-10-23

    Here, the authors regret they have discovered errors in Eq. (3) and in a spreadsheet used to calculate cosmogenic exposure ages shown in Table 1. Eq. (3) is missing a term. The spreadsheet errors concerned an incorrect cell reference and application of Eq. (3). Correction of these errors results in ~15–20% changes to the exposure ages of all samples, minor (generally <0.2%) changes to the radioisotopic ages of some samples (those that entailed a correction for chlorine-derived 38Ar calculated based on the exposure age; see Section 3.3), and statistically insignificant changes to the inferred trapped components identified through isochron analyses. These modifications have no impact on the modeling, discussions, or conclusions in the paper, nor do the changes to radioisotopic ages exceed the 1 sigma uncertainties.

  12. Characterizing Sub-Daily Flow Regimes: Implications of Hydrologic Resolution on Ecohydrology Studies

    DOE PAGES

    Bevelhimer, Mark S.; McManamay, Ryan A.; O'Connor, B.

    2014-05-26

    Natural variability in flow is a primary factor controlling geomorphic and ecological processes in riverine ecosystems. Within the hydropower industry, there is growing pressure from environmental groups and natural resource managers to change reservoir releases from daily peaking to run-of-river operations on the basis of the assumption that downstream biological communities will improve under a more natural flow regime. In this paper, we discuss the importance of assessing sub-daily flows for understanding the physical and ecological dynamics within river systems. We present a variety of metrics for characterizing sub-daily flow variation and use these metrics to evaluate general trends among streams affected by peaking hydroelectric projects, run-of-river projects and streams that are largely unaffected by flow altering activities. Univariate and multivariate techniques were used to assess similarity among different stream types on the basis of these sub-daily metrics. For comparison, similar analyses were performed using analogous metrics calculated with mean daily flow values. Our results confirm that sub-daily flow metrics reveal variation among and within streams that are not captured by daily flow statistics. Using sub-daily flow statistics, we were able to quantify the degree of difference between unaltered and peaking streams and the amount of similarity between unaltered and run-of-river streams. The sub-daily statistics were largely uncorrelated with daily statistics of similar scope. Furthermore, on short temporal scales, sub-daily statistics reveal the relatively constant nature of unaltered stream reaches and the highly variable nature of hydropower-affected streams, whereas daily statistics show just the opposite over longer temporal scales.

  13. A time to be born: Variation in the hour of birth in a rural population of Northern Argentina.

    PubMed

    Chaney, Carlye; Goetz, Laura G; Valeggia, Claudia

    2018-04-17

    The present study aimed at investigating the timing of birth across the day in a rural population of indigenous and nonindigenous women in the province of Formosa, Argentina, in order to explore the variation in patterns in a non-Western setting. This study utilized birth record data transcribed from delivery room records at a rural hospital in the province of Formosa, northern Argentina. The sample included data for Criollo, Wichí, and Toba/Qom women (n = 2421). Statistical analysis was conducted using directional statistics to identify a mean sample direction. Chi-square tests for homogeneity were also used to test for statistically significant differences between hours of the day. The mean sample direction was 81.04°, which equates to 5:24 AM when calculated as time on a 24-hr clock. Chi-squared analyses showed a statistically significant peak in births between 12:00 and 4:00 AM. Birth counts generally declined throughout the day until a statistically significant trough around 5:00 PM. This pattern may be associated with the circadian rhythms of hormone release, particularly melatonin, on a proximate level. At the ultimate level, giving birth in the early hours of the morning may have been selected to time births when the mother could benefit from the predator protection and support provided by her social group as well as increased mother-infant bonding from a more peaceful environment. © 2018 Wiley Periodicals, Inc.
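
    A minimal numpy sketch of the directional-statistics step described above: birth hours are mapped to angles on a 24-hour circle, the mean direction is computed, and it is converted back to clock time. The hours listed are illustrative, not the study data.

      import numpy as np

      hours = np.array([1.5, 2.0, 3.5, 4.0, 5.0, 23.5, 0.5])    # hour of birth, 24-h clock
      angles = 2 * np.pi * hours / 24.0                          # map to angles on the circle

      mean_angle = np.arctan2(np.sin(angles).mean(), np.cos(angles).mean())
      mean_hour = (mean_angle * 24.0 / (2 * np.pi)) % 24.0       # mean direction as clock time
      r = np.hypot(np.sin(angles).mean(), np.cos(angles).mean()) # concentration (0 = uniform)
      print(mean_hour, r)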

  14. Power, effects, confidence, and significance: an investigation of statistical practices in nursing research.

    PubMed

    Gaskin, Cadeyrn J; Happell, Brenda

    2014-05-01

    To (a) assess the statistical power of nursing research to detect small, medium, and large effect sizes; (b) estimate the experiment-wise Type I error rate in these studies; and (c) assess the extent to which (i) a priori power analyses, (ii) effect sizes (and interpretations thereof), and (iii) confidence intervals were reported. Statistical review. Papers published in the 2011 volumes of the 10 highest ranked nursing journals, based on their 5-year impact factors. Papers were assessed for statistical power, control of experiment-wise Type I error, reporting of a priori power analyses, reporting and interpretation of effect sizes, and reporting of confidence intervals. The analyses were based on 333 papers, from which 10,337 inferential statistics were identified. The median power to detect small, medium, and large effect sizes was .40 (interquartile range [IQR]=.24-.71), .98 (IQR=.85-1.00), and 1.00 (IQR=1.00-1.00), respectively. The median experiment-wise Type I error rate was .54 (IQR=.26-.80). A priori power analyses were reported in 28% of papers. Effect sizes were routinely reported for Spearman's rank correlations (100% of papers in which this test was used), Poisson regressions (100%), odds ratios (100%), Kendall's tau correlations (100%), Pearson's correlations (99%), logistic regressions (98%), structural equation modelling/confirmatory factor analyses/path analyses (97%), and linear regressions (83%), but were reported less often for two-proportion z tests (50%), analyses of variance/analyses of covariance/multivariate analyses of variance (18%), t tests (8%), Wilcoxon's tests (8%), Chi-squared tests (8%), and Fisher's exact tests (7%), and not reported for sign tests, Friedman's tests, McNemar's tests, multi-level models, and Kruskal-Wallis tests. Effect sizes were infrequently interpreted. Confidence intervals were reported in 28% of papers. The use, reporting, and interpretation of inferential statistics in nursing research need substantial improvement. Most importantly, researchers should abandon the misleading practice of interpreting the results from inferential tests based solely on whether they are statistically significant (or not) and, instead, focus on reporting and interpreting effect sizes, confidence intervals, and significance levels. Nursing researchers also need to conduct and report a priori power analyses, and to address the issue of Type I experiment-wise error inflation in their studies. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
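
    A minimal statsmodels sketch of the kind of two-sample t-test power calculation the review assessed: power to detect small, medium and large standardised effects at a fixed sample size, and the a priori sample size for 80% power. The numbers are illustrative.

      from statsmodels.stats.power import TTestIndPower

      analysis = TTestIndPower()
      for label, d in [("small", 0.2), ("medium", 0.5), ("large", 0.8)]:
          power = analysis.power(effect_size=d, nobs1=50, alpha=0.05)   # 50 per group
          print(label, round(power, 2))

      # A priori sample size per group for 80% power to detect a medium effect
      print(analysis.solve_power(effect_size=0.5, power=0.8, alpha=0.05))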

  15. The Time in Therapeutic Range and Bleeding Complications of Warfarin in Different Geographic Regions of Turkey: A Subgroup Analysis of WARFARIN-TR Study.

    PubMed

    Kılıç, Salih; Çelik, Ahmet; Çakmak, Hüseyin Altuğ; Afşin, Abdülmecit; Tekkeşin, Ahmet İlker; Açıksarı, Gönül; Memetoğlu, Mehmet Erdem; Özpamuk Karadeniz, Fatma; Şahan, Ekrem; Alıcı, Mehmet Hayri; Dereli, Yüksel; Sinan, Ümit Yaşar; Zoghi, Mehdi

    2017-08-04

    The time in therapeutic range values may vary between different geographical regions of Turkey in patients on vitamin K antagonist therapy. To evaluate the time in therapeutic range percentages, efficacy, safety and awareness of warfarin according to the different geographical regions in patients who participated in the WARFARIN-TR study (The Awareness, Efficacy, Safety and Time in Therapeutic Range of Warfarin in the Turkish population) in Turkey. Cross-sectional study. The WARFARIN-TR study includes 4987 patients using warfarin and involved regular international normalized ratio monitoring between January 1, 2014 and December 31, 2014. Patients attended follow-ups for 12 months. Sample sizes were calculated according to the density of the regional population and according to Turkish Statistical Institute data. The time in therapeutic range was calculated according to F.R. Roosendaal's algorithm. Awareness was evaluated based on the patients' knowledge of the effect of warfarin and food-drug interactions with simple questions developed based on a literature review. The Turkey-wide time in therapeutic range was reported as 49.5%±22.9 in the WARFARIN-TR study. There were statistically significant differences between regions in terms of time in therapeutic range (p<0.001). The highest rate was reported in the Marmara region (54.99%±20.91) and the lowest was in the South-eastern Anatolia region (41.95±24.15) (p<0.001). Bleeding events were most frequently seen in Eastern Anatolia (41.6%), with major bleeding in the Aegean region (5.11%) and South-eastern Anatolia (5.36%). There were statistically significant differences between the regions in terms of awareness (p<0.001). Statistically significant differences were observed in terms of the efficacy, safety and awareness of warfarin therapy according to different geographical regions in Turkey.
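
    A minimal sketch of the linear-interpolation method for time in therapeutic range cited above: INR is interpolated linearly between consecutive measurements and the fraction of days within the 2.0-3.0 range is computed. The dates and INR values are hypothetical.

      import numpy as np

      days = np.array([0, 14, 28, 56])          # days on which INR was measured
      inr = np.array([1.8, 2.4, 3.4, 2.6])      # measured INR values
      low, high = 2.0, 3.0

      in_range_days, total_days = 0.0, 0.0
      for d0, d1, y0, y1 in zip(days[:-1], days[1:], inr[:-1], inr[1:]):
          t = np.arange(d0, d1)                             # each day of the interval
          y = y0 + (y1 - y0) * (t - d0) / (d1 - d0)         # linearly interpolated INR
          in_range_days += np.sum((y >= low) & (y <= high))
          total_days += t.size

      ttr = 100.0 * in_range_days / total_days
      print(round(ttr, 1))                      # percentage of days in therapeutic range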

  16. GreekLex 2: A comprehensive lexical database with part-of-speech, syllabic, phonological, and stress information

    PubMed Central

    van Heuven, Walter J. B.; Pitchford, Nicola J.; Ledgeway, Timothy

    2017-01-01

    Databases containing lexical properties on any given orthography are crucial for psycholinguistic research. In the last ten years, a number of lexical databases have been developed for Greek. However, these lack important part-of-speech information. Furthermore, the need for alternative procedures for calculating syllabic measurements and stress information, as well as combination of several metrics to investigate linguistic properties of the Greek language are highlighted. To address these issues, we present a new extensive lexical database of Modern Greek (GreekLex 2) with part-of-speech information for each word and accurate syllabification and orthographic information predictive of stress, as well as several measurements of word similarity and phonetic information. The addition of detailed statistical information about Greek part-of-speech, syllabification, and stress neighbourhood allowed novel analyses of stress distribution within different grammatical categories and syllabic lengths to be carried out. Results showed that the statistical preponderance of stress position on the pre-final syllable that is reported for Greek language is dependent upon grammatical category. Additionally, analyses showed that a proportion higher than 90% of the tokens in the database would be stressed correctly solely by relying on stress neighbourhood information. The database and the scripts for orthographic and phonological syllabification as well as phonetic transcription are available at http://www.psychology.nottingham.ac.uk/greeklex/. PMID:28231303

  17. Statistical process control as a tool for controlling operating room performance: retrospective analysis and benchmarking.

    PubMed

    Chen, Tsung-Tai; Chang, Yun-Jau; Ku, Shei-Ling; Chung, Kuo-Piao

    2010-10-01

    There is much research using statistical process control (SPC) to monitor surgical performance, including comparisons among groups to detect small process shifts, but few of these studies have included a stabilization process. This study aimed to analyse the performance of surgeons in the operating room (OR) and to set a benchmark by SPC after the process had been stabilized. The OR profile of 499 patients who underwent laparoscopic cholecystectomy performed by 16 surgeons at a tertiary hospital in Taiwan during 2005 and 2006 was recorded. SPC was applied to analyse operative and non-operative times using the following five steps: first, the times were divided into two segments; second, they were normalized; third, they were evaluated as individual processes; fourth, the ARL(0) was calculated; and fifth, the different groups (surgeons) were compared. Outliers were excluded to ensure stability for each group and to facilitate inter-group comparison. The results showed that in the stabilized process, only one surgeon exhibited a significantly shorter total process time (including operative time and non-operative time). In this study, we use five steps to demonstrate how to control surgical and non-surgical time in phase I. There are some measures that can be taken to prevent skew and instability in the process. Also, using SPC, one surgeon can be shown to be a real benchmark. © 2010 Blackwell Publishing Ltd.
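
    A minimal numpy sketch of a phase I individuals control chart of the kind used above: normalised operative times are compared against control limits estimated from the moving range, and out-of-control cases are flagged for exclusion before re-estimating the limits. The times are simulated and the log transform is only an assumed normalisation.

      import numpy as np

      rng = np.random.default_rng(5)
      op_time = rng.lognormal(mean=4.0, sigma=0.25, size=60)   # operative times (minutes)

      x = np.log(op_time)                       # normalising transform (step two above)
      mr = np.abs(np.diff(x))                   # moving ranges between consecutive cases
      sigma_hat = mr.mean() / 1.128             # d2 constant for subgroups of size 2

      center = x.mean()
      ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat
      out_of_control = np.flatnonzero((x > ucl) | (x < lcl))
      print(center, lcl, ucl, out_of_control)   # exclude outliers, then re-estimate limits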

  18. GreekLex 2: A comprehensive lexical database with part-of-speech, syllabic, phonological, and stress information.

    PubMed

    Kyparissiadis, Antonios; van Heuven, Walter J B; Pitchford, Nicola J; Ledgeway, Timothy

    2017-01-01

    Databases containing lexical properties on any given orthography are crucial for psycholinguistic research. In the last ten years, a number of lexical databases have been developed for Greek. However, these lack important part-of-speech information. Furthermore, the need for alternative procedures for calculating syllabic measurements and stress information, as well as combination of several metrics to investigate linguistic properties of the Greek language are highlighted. To address these issues, we present a new extensive lexical database of Modern Greek (GreekLex 2) with part-of-speech information for each word and accurate syllabification and orthographic information predictive of stress, as well as several measurements of word similarity and phonetic information. The addition of detailed statistical information about Greek part-of-speech, syllabification, and stress neighbourhood allowed novel analyses of stress distribution within different grammatical categories and syllabic lengths to be carried out. Results showed that the statistical preponderance of stress position on the pre-final syllable that is reported for Greek language is dependent upon grammatical category. Additionally, analyses showed that a proportion higher than 90% of the tokens in the database would be stressed correctly solely by relying on stress neighbourhood information. The database and the scripts for orthographic and phonological syllabification as well as phonetic transcription are available at http://www.psychology.nottingham.ac.uk/greeklex/.

  19. Effect of an EBM course in combination with case method learning sessions: an RCT on professional performance, job satisfaction, and self-efficacy of occupational physicians.

    PubMed

    Hugenholtz, Nathalie I R; Schaafsma, Frederieke G; Nieuwenhuijsen, Karen; van Dijk, Frank J H

    2008-10-01

    An intervention existing of an evidence-based medicine (EBM) course in combination with case method learning sessions (CMLSs) was designed to enhance the professional performance, self-efficacy and job satisfaction of occupational physicians. A cluster randomized controlled trial was set up and data were collected through questionnaires at baseline (T0), directly after the intervention (T1) and 7 months after baseline (T2). The data of the intervention group [T0 (n = 49), T1 (n = 31), T2 (n = 29)] and control group [T0 (n = 49), T1 (n = 28), T2 (n = 28)] were analysed in mixed model analyses. Mean scores of the perceived value of the CMLS were calculated in the intervention group. The overall effect of the intervention over time comparing the intervention with the control group was statistically significant for professional performance (p < 0.001). Job satisfaction and self-efficacy changes were small and not statistically significant between the groups. The perceived value of the CMLS to gain new insights and to improve the quality of their performance increased with the number of sessions followed. An EBM course in combination with case method learning sessions is perceived as valuable and offers evidence to enhance the professional performance of occupational physicians. However, it does not seem to influence their self-efficacy and job satisfaction.

  20. Learn-as-you-go acceleration of cosmological parameter estimates

    NASA Astrophysics Data System (ADS)

    Aslanyan, Grigor; Easther, Richard; Price, Layne C.

    2015-09-01

    Cosmological analyses can be accelerated by approximating slow calculations using a training set, which is either precomputed or generated dynamically. However, this approach is only safe if the approximations are well understood and controlled. This paper surveys issues associated with the use of machine-learning based emulation strategies for accelerating cosmological parameter estimation. We describe a learn-as-you-go algorithm that is implemented in the Cosmo++ code and (1) trains the emulator while simultaneously estimating posterior probabilities; (2) identifies unreliable estimates, computing the exact numerical likelihoods if necessary; and (3) progressively learns and updates the error model as the calculation progresses. We explicitly describe and model the emulation error and show how this can be propagated into the posterior probabilities. We apply these techniques to the Planck likelihood and the calculation of ΛCDM posterior probabilities. The computation is significantly accelerated without a pre-defined training set and uncertainties in the posterior probabilities are subdominant to statistical fluctuations. We have obtained a speedup factor of 6.5 for Metropolis-Hastings and 3.5 for nested sampling. Finally, we discuss the general requirements for a credible error model and show how to update them on-the-fly.

  1. Learn-as-you-go acceleration of cosmological parameter estimates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aslanyan, Grigor; Easther, Richard; Price, Layne C., E-mail: g.aslanyan@auckland.ac.nz, E-mail: r.easther@auckland.ac.nz, E-mail: lpri691@aucklanduni.ac.nz

    2015-09-01

    Cosmological analyses can be accelerated by approximating slow calculations using a training set, which is either precomputed or generated dynamically. However, this approach is only safe if the approximations are well understood and controlled. This paper surveys issues associated with the use of machine-learning based emulation strategies for accelerating cosmological parameter estimation. We describe a learn-as-you-go algorithm that is implemented in the Cosmo++ code and (1) trains the emulator while simultaneously estimating posterior probabilities; (2) identifies unreliable estimates, computing the exact numerical likelihoods if necessary; and (3) progressively learns and updates the error model as the calculation progresses. We explicitly describe and model the emulation error and show how this can be propagated into the posterior probabilities. We apply these techniques to the Planck likelihood and the calculation of ΛCDM posterior probabilities. The computation is significantly accelerated without a pre-defined training set and uncertainties in the posterior probabilities are subdominant to statistical fluctuations. We have obtained a speedup factor of 6.5 for Metropolis-Hastings and 3.5 for nested sampling. Finally, we discuss the general requirements for a credible error model and show how to update them on-the-fly.

  2. The choice of statistical methods for comparisons of dosimetric data in radiotherapy.

    PubMed

    Chaikh, Abdulhamid; Giraud, Jean-Yves; Perrin, Emmanuel; Bresciani, Jean-Pierre; Balosso, Jacques

    2014-09-18

    Novel irradiation techniques are continuously introduced in radiotherapy to optimize the accuracy, the safety and the clinical outcome of treatments. These changes could raise the question of discontinuity in dosimetric presentation and the subsequent need for practice adjustments in case of significant modifications. This study proposes a comprehensive approach to compare different techniques and to test whether their respective dose calculation algorithms give rise to statistically significant differences in the treatment doses for the patient. Statistical investigation principles are presented in the framework of a clinical example based on 62 fields of radiotherapy for lung cancer. The delivered doses in monitor units were calculated using three different dose calculation methods: the reference method computes the dose without tissue density corrections using the Pencil Beam Convolution (PBC) algorithm, whereas the newer methods calculate the dose with tissue density corrections in 1D and 3D using the Modified Batho (MB) method and the Equivalent Tissue Air Ratio (ETAR) method, respectively. The normality of the data and the homogeneity of variance between groups were tested using the Shapiro-Wilk and Levene tests, respectively; non-parametric statistical tests were then performed. Specifically, the dose means estimated by the different calculation methods were compared using Friedman's test and the Wilcoxon signed-rank test. In addition, the correlation between the doses calculated by the three methods was assessed using Spearman's and Kendall's rank tests. Friedman's test showed a significant effect of the calculation method on the delivered dose for lung cancer patients (p < 0.001). The density correction methods yielded lower doses than PBC, on average -5 (± 4.4 SD) for MB and -4.7 (± 5 SD) for ETAR. Post-hoc Wilcoxon signed-rank tests of paired comparisons indicated that the delivered dose was significantly reduced using density-corrected methods as compared to the reference method. Spearman's and Kendall's rank tests indicated a positive correlation between the doses calculated with the different methods. This paper illustrates and justifies the use of statistical tests and graphical representations for dosimetric comparisons in radiotherapy. The statistical analysis shows the significance of dose differences resulting from two or more techniques in radiotherapy.
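
    As a hedged illustration (not the authors' actual scripts), the following R sketch runs the same family of tests on simulated per-field doses; the vectors pbc, mb and etar and every number in them are invented for the example.

      set.seed(1)
      pbc  <- rnorm(62, mean = 200, sd = 20)    # reference algorithm (simulated MU)
      mb   <- 0.95 * pbc + rnorm(62, sd = 5)    # density-corrected, 1D (simulated)
      etar <- 0.953 * pbc + rnorm(62, sd = 5)   # density-corrected, 3D (simulated)
      shapiro.test(pbc - mb)                    # normality of paired differences
      # Levene's test for homogeneity of variance is available as car::leveneTest().
      doses <- data.frame(dose   = c(pbc, mb, etar),
                          method = factor(rep(c("PBC", "MB", "ETAR"), each = 62)),
                          field  = factor(rep(1:62, times = 3)))
      friedman.test(dose ~ method | field, data = doses)  # overall effect of method
      wilcox.test(pbc, mb, paired = TRUE)                 # post-hoc paired comparison
      cor.test(pbc, mb, method = "spearman")              # rank correlations
      cor.test(pbc, mb, method = "kendall")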

  3. Tank 241-T-204, core 188 analytical results for the final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nuzum, J.L.

    TANK 241-T-204, CORE 188, ANALYTICAL RESULTS FOR THE FINAL REPORT. This document is the final laboratory report for Tank 241-T-204. Push mode core segments were removed from Riser 3 between March 27, 1997, and April 11, 1997. Segments were received and extruded at the 222-S Laboratory. Analyses were performed in accordance with the Tank 241-T-204 Push Mode Core Sampling and Analysis Plan (TSAP) (Winkleman, 1997), the Letter of Instruction for Core Sample Analysis of Tanks 241-T-201, 241-T-202, 241-T-203, and 241-T-204 (LOI) (Bell, 1997), and the Safety Screening Data Quality Objective (DQO) (Dukelow, et al., 1995). None of the subsamples submitted for total alpha activity (AT) or differential scanning calorimetry (DSC) analyses exceeded the notification limits stated in the DQO. The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems Technical Basis Group and are not considered in this report.

  4. Cognitive capitalism: the effect of cognitive ability on wealth, as mediated through scientific achievement and economic freedom.

    PubMed

    Rindermann, Heiner; Thompson, James

    2011-06-01

    Traditional economic theories stress the relevance of political, institutional, geographic, and historical factors for economic growth. In contrast, human-capital theories suggest that people's competences, mediated by technological progress, are the deciding factor in a nation's wealth. Using three large-scale assessments, we calculated cognitive-competence sums for the mean and for upper- and lower-level groups for 90 countries and compared the influence of each group's intellectual ability on gross domestic product. In our cross-national analyses, we applied different statistical methods (path analyses, bootstrapping) and measures developed by different research groups to various country samples and historical periods. Our results underscore the decisive relevance of cognitive ability--particularly of an intellectual class with high cognitive ability and accomplishments in science, technology, engineering, and math--for national wealth. Furthermore, this group's cognitive ability predicts the quality of economic and political institutions, which further determines the economic affluence of the nation. Cognitive resources enable the evolution of capitalism and the rise of wealth.

  5. An analysis of the AVE-SESAME I period using statistical structure and correlation functions. [Atmospheric Variability Experiment-Severe Environmental Storm and Mesoscale Experiment

    NASA Technical Reports Server (NTRS)

    Fuelberg, H. E.; Meyer, P. J.

    1984-01-01

    Structure and correlation functions are used to describe atmospheric variability during the 10-11 April day of AVE-SESAME 1979 that coincided with the Red River Valley tornado outbreak. The special mesoscale rawinsonde data are employed in calculations involving temperature, geopotential height, horizontal wind speed and mixing ratio. Functional analyses are performed in both the lower and upper troposphere for the composite 24 h experiment period and at individual 3 h observation times. Results show that mesoscale features are prominent during the composite period. Fields of mixing ratio and horizontal wind speed exhibit the greatest amounts of small-scale variance, whereas temperature and geopotential height contain the least. Results for the nine individual times show that small-scale variance is greatest during the convective outbreak. The functions also are used to estimate random errors in the rawinsonde data. Finally, sensitivity analyses are presented to quantify confidence limits of the structure functions.
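
    A minimal R sketch of the underlying calculation, assuming hypothetical station coordinates and temperatures (none of the values come from the AVE-SESAME data): the empirical structure function is the mean squared difference between observations as a function of separation distance.

      structure_function <- function(x, y, value, breaks) {
        d   <- as.matrix(dist(cbind(x, y)))      # pairwise station separations
        dv  <- outer(value, value, "-")^2        # squared differences of the field
        idx <- upper.tri(d)                      # use each station pair once
        cls <- cut(d[idx], breaks = breaks)
        tapply(dv[idx], cls, mean)               # mean squared difference per distance class
      }
      set.seed(2)
      x    <- runif(30, 0, 500)                  # hypothetical x coordinates (km)
      y    <- runif(30, 0, 500)                  # hypothetical y coordinates (km)
      temp <- 15 + 0.01 * x + rnorm(30)          # hypothetical temperature field
      structure_function(x, y, temp, breaks = seq(0, 700, by = 100))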

  6. Entropy of hydrological systems under small samples: Uncertainty and variability

    NASA Astrophysics Data System (ADS)

    Liu, Dengfeng; Wang, Dong; Wang, Yuankun; Wu, Jichun; Singh, Vijay P.; Zeng, Xiankui; Wang, Lachun; Chen, Yuanfang; Chen, Xi; Zhang, Liyuan; Gu, Shenghua

    2016-01-01

    Entropy theory has been increasingly applied in hydrology in both descriptive and inferential ways. However, little attention has been given to the small-sample condition widespread in hydrological practice, where either hydrological measurements are limited or are even nonexistent. Accordingly, entropy estimated under this condition may incur considerable bias. In this study, small-sample condition is considered and two innovative entropy estimators, the Chao-Shen (CS) estimator and the James-Stein-type shrinkage (JSS) estimator, are introduced. Simulation tests are conducted with common distributions in hydrology, that lead to the best-performing JSS estimator. Then, multi-scale moving entropy-based hydrological analyses (MM-EHA) are applied to indicate the changing patterns of uncertainty of streamflow data collected from the Yangtze River and the Yellow River, China. For further investigation into the intrinsic property of entropy applied in hydrological uncertainty analyses, correlations of entropy and other statistics at different time-scales are also calculated, which show connections between the concept of uncertainty and variability.
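
    A rough R sketch of the shrinkage idea, assuming a hypothetical small-sample histogram of flows; it follows the usual James-Stein-type shrinkage-toward-uniform form rather than the exact implementation used in the paper (published Chao-Shen and shrinkage estimators are also available in the CRAN entropy package).

      shrink_entropy <- function(counts) {
        n      <- sum(counts)
        p_ml   <- counts / n                 # maximum-likelihood cell frequencies
        target <- rep(1 / length(counts), length(counts))
        lambda <- (1 - sum(p_ml^2)) / ((n - 1) * sum((target - p_ml)^2))
        lambda <- min(max(lambda, 0), 1)     # clip shrinkage intensity to [0, 1]
        p_js   <- lambda * target + (1 - lambda) * p_ml
        -sum(p_js * log(p_js))               # entropy in nats
      }
      flow_bins <- c(18, 7, 3, 1, 0, 1)      # hypothetical small-sample histogram
      shrink_entropy(flow_bins)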

  7. A new principle for the standardization of long paragraphs for reading speed analysis.

    PubMed

    Radner, Wolfgang; Radner, Stephan; Diendorfer, Gabriela

    2016-01-01

    To investigate the reliability, validity, and statistical comparability of long paragraphs that were developed to be equivalent in construction and difficulty. Seven long paragraphs were developed that were equal in syntax, morphology, and number and position of words (111), with the same number of syllables (179) and number of characters (660). For validity analyses, the paragraphs were compared with the mean reading speed of a set of seven sentence optotypes of the RADNER Reading Charts (mean of 7 × 14 = 98 words read). Reliability analyses were performed by calculating the Cronbach's alpha value and the corrected total item correlation. Sixty participants (aged 20-77 years) read the paragraphs and the sentences (distance 40 cm; font: Times New Roman 12 pt). Test items were presented randomly; reading length was measured with a stopwatch. Reliability analysis yielded a Cronbach's alpha value of 0.988. When the long paragraphs were compared in pairwise fashion, significant differences were found in 13 of the 21 pairs (p < 0.05). In two sequences of three paragraphs each and in eight pairs of paragraphs, the paragraphs did not differ significantly, and these paragraph combinations are therefore suitable for comparative research studies. The mean reading speed was 173.34 ± 24.01 words per minute (wpm) for the long paragraphs and 198.26 ± 28.60 wpm for the sentence optotypes. The maximum difference in reading speed was 5.55 % for the long paragraphs and 2.95 % for the short sentence optotypes. The correlation between long paragraphs and sentence optotypes was high (r = 0.9243). Despite good reliability and equivalence in construction and degree of difficulty, a statistically significant difference in reading speed can occur between long paragraphs. Since statistical significance should be dependent only on the persons tested, either standardizing long paragraphs for statistical equality of reading speed measurements or increasing the number of presented paragraphs is recommended for comparative investigations.
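
    For the reliability step, Cronbach's alpha can be computed directly from a persons-by-items matrix; the sketch below uses simulated reading times, not the study data.

      cronbach_alpha <- function(items) {          # items: persons x items matrix
        k <- ncol(items)
        k / (k - 1) * (1 - sum(apply(items, 2, var)) / var(rowSums(items)))
      }
      set.seed(3)
      ability <- rnorm(60, mean = 175, sd = 24)    # person-level reading speed (wpm)
      times   <- sapply(1:7, function(i) ability + rnorm(60, sd = 8))
      cronbach_alpha(times)                        # should be high (> 0.9)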

  8. Statistical alignment: computational properties, homology testing and goodness-of-fit.

    PubMed

    Hein, J; Wiuf, C; Knudsen, B; Møller, M B; Wibling, G

    2000-09-08

    The model of insertions and deletions in biological sequences, first formulated by Thorne, Kishino, and Felsenstein in 1991 (the TKF91 model), provides a basis for performing alignment within a statistical framework. Here we investigate this model. Firstly, we show how to accelerate the statistical alignment algorithms several orders of magnitude. The main innovations are to confine likelihood calculations to a band close to the similarity based alignment, to get good initial guesses of the evolutionary parameters and to apply an efficient numerical optimisation algorithm for finding the maximum likelihood estimate. In addition, the recursions originally presented by Thorne, Kishino and Felsenstein can be simplified. Two proteins, about 1500 amino acids long, can be analysed with this method in less than five seconds on a fast desktop computer, which makes this method practical for actual data analysis. Secondly, we propose a new homology test based on this model, where homology means that an ancestor to a sequence pair can be found finitely far back in time. This test has statistical advantages relative to the traditional shuffle test for proteins. Finally, we describe a goodness-of-fit test, that allows testing the proposed insertion-deletion (indel) process inherent to this model and find that real sequences (here globins) probably experience indels longer than one, contrary to what is assumed by the model. Copyright 2000 Academic Press.

  9. Influence of neurophysiological hippotherapy on the transference of the centre of gravity among children with cerebral palsy.

    PubMed

    Maćków, Anna; Małachowska-Sobieska, Monika; Demczuk-Włodarczyk, Ewa; Sidorowska, Marta; Szklarska, Alicja; Lipowicz, Anna

    2014-01-01

    The aim of the study was to present the influence of neurophysiological hippotherapy on the transference of the centre of gravity (COG) among children with cerebral palsy (CP). The study involved 19 children aged 4-13 years suffering from CP who demonstrated an asymmetric (A/P) model of compensation. Body balance was studied with the Cosmogamma Balance Platform. An examination on this platform was performed before and after a session of neurophysiological hippotherapy. In order to compare the correlations and differences between the examinations, the results were analysed using Student's T-test for dependent samples at p ≤ 0.05 as the level of statistical significance and descriptive statistics were calculated. The mean value of the body's centre of gravity in the frontal plane (COG X) was 18.33 (mm) during the first examination, changing by 21.84 (mm) after neurophysiological hippotherapy towards deloading of the antigravity lower limb (p ≤ 0.0001). The other stabilographic parameters increased; however, only the change in average speed of antero - posterior COG oscillation was statistically significant (p = 0.0354). 1. One session of neurophysiological hippotherapy induced statistically significant changes in the position of the centre of gravity in the body in the frontal plane and the average speed of COG oscillation in the sagittal plane among CP children demonstrating an asymmetric model of compensation (A/P).

  10. WASP (Write a Scientific Paper) using Excel - 6: Standard error and confidence interval.

    PubMed

    Grech, Victor

    2018-03-01

    The calculation of descriptive statistics includes the calculation of standard error and confidence interval, an inevitable component of data analysis in inferential statistics. This paper provides pointers as to how to do this in Microsoft Excel™. Copyright © 2018 Elsevier B.V. All rights reserved.
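
    The same two quantities can be computed outside Excel; a minimal R sketch with an invented sample:

      x  <- c(12.1, 9.8, 11.4, 10.2, 13.0, 10.9, 11.7)   # hypothetical sample
      n  <- length(x)
      se <- sd(x) / sqrt(n)                              # standard error of the mean
      ci <- mean(x) + c(-1, 1) * qt(0.975, df = n - 1) * se   # t-based 95% CI
      c(mean = mean(x), se = se, lower = ci[1], upper = ci[2])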

  11. 40 CFR 91.512 - Request for public hearing.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... plans and statistical analyses have been properly applied (specifically, whether sampling procedures and statistical analyses specified in this subpart were followed and whether there exists a basis for... will be made available to the public during Agency business hours. ...

  12. A retrospective survey of research design and statistical analyses in selected Chinese medical journals in 1998 and 2008.

    PubMed

    Jin, Zhichao; Yu, Danghui; Zhang, Luoman; Meng, Hong; Lu, Jian; Gao, Qingbin; Cao, Yang; Ma, Xiuqiang; Wu, Cheng; He, Qian; Wang, Rui; He, Jia

    2010-05-25

    High quality clinical research not only requires advanced professional knowledge, but also needs sound study design and correct statistical analyses. The number of clinical research articles published in Chinese medical journals has increased immensely in the past decade, but study design quality and statistical analyses have remained suboptimal. The aim of this investigation was to gather evidence on the quality of study design and statistical analyses in clinical research conducted in China for the first decade of the new millennium. Ten (10) leading Chinese medical journals were selected and all original articles published in 1998 (N = 1,335) and 2008 (N = 1,578) were thoroughly categorized and reviewed. A well-defined and validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation. Main outcomes were the frequencies of different types of study design, error/defect proportion in design and statistical analyses, and implementation of CONSORT in randomized clinical trials. From 1998 to 2008: The error/defect proportion in statistical analyses decreased significantly (χ² = 12.03, p < 0.001), 59.8% (545/1,335) in 1998 compared to 52.2% (664/1,578) in 2008. The overall error/defect proportion of study design also decreased (χ² = 21.22, p < 0.001), 50.9% (680/1,335) compared to 42.40% (669/1,578). In 2008, the proportion of randomized clinical trials remained in the low single digits (3.8%, 60/1,578), with two-thirds showing poor results reporting (defects in 44 papers, 73.3%). Nearly half of the published studies were retrospective in nature, 49.3% (658/1,335) in 1998 compared to 48.2% (761/1,578) in 2008. Decreases in defect proportions were observed in both results presentation (χ² = 93.26, p < 0.001), 92.7% (945/1,019) compared to 78.2% (1023/1,309), and interpretation (χ² = 27.26, p < 0.001), 9.7% (99/1,019) compared to 4.3% (56/1,309), although some serious ones persisted. Chinese medical research seems to have made significant progress regarding statistical analyses, but there remains ample room for improvement regarding study designs. Retrospective clinical studies are the most often used design, whereas randomized clinical trials are rare and often show methodological weaknesses. Urgent implementation of the CONSORT statement is imperative.

  13. A Meta-Meta-Analysis: Empirical Review of Statistical Power, Type I Error Rates, Effect Sizes, and Model Selection of Meta-Analyses Published in Psychology

    ERIC Educational Resources Information Center

    Cafri, Guy; Kromrey, Jeffrey D.; Brannick, Michael T.

    2010-01-01

    This article uses meta-analyses published in "Psychological Bulletin" from 1995 to 2005 to describe meta-analyses in psychology, including examination of statistical power, Type I errors resulting from multiple comparisons, and model choice. Retrospective power estimates indicated that univariate categorical and continuous moderators, individual…

  14. Reliability of Computerized Neurocognitive Tests for Concussion Assessment: A Meta-Analysis.

    PubMed

    Farnsworth, James L; Dargo, Lucas; Ragan, Brian G; Kang, Minsoo

    2017-09-01

      Although widely used, computerized neurocognitive tests (CNTs) have been criticized because of low reliability and poor sensitivity. A systematic review was published summarizing the reliability of Immediate Post-Concussion Assessment and Cognitive Testing (ImPACT) scores; however, this was limited to a single CNT. Expansion of the previous review to include additional CNTs and a meta-analysis is needed. Therefore, our purpose was to analyze reliability data for CNTs using meta-analysis and examine moderating factors that may influence reliability.   A systematic literature search (key terms: reliability, computerized neurocognitive test, concussion) of electronic databases (MEDLINE, PubMed, Google Scholar, and SPORTDiscus) was conducted to identify relevant studies.   Studies were included if they met all of the following criteria: used a test-retest design, involved at least 1 CNT, provided sufficient statistical data to allow for effect-size calculation, and were published in English.   Two independent reviewers investigated each article to assess inclusion criteria. Eighteen studies involving 2674 participants were retained. Intraclass correlation coefficients were extracted to calculate effect sizes and determine overall reliability. The Fisher Z transformation adjusted for sampling error associated with averaging correlations. Moderator analyses were conducted to evaluate the effects of the length of the test-retest interval, intraclass correlation coefficient model selection, participant demographics, and study design on reliability. Heterogeneity was evaluated using the Cochran Q statistic.   The proportion of acceptable outcomes was greatest for the Axon Sports CogState Test (75%) and lowest for the ImPACT (25%). Moderator analyses indicated that the type of intraclass correlation coefficient model used significantly influenced effect-size estimates, accounting for 17% of the variation in reliability.   The Axon Sports CogState Test, which has a higher proportion of acceptable outcomes and shorter test duration relative to other CNTs, may be a reliable option; however, future studies are needed to compare the diagnostic accuracy of these instruments.
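
    The pooling step described here (averaging correlations on the Fisher z scale) can be sketched in a few lines of R; the reliability coefficients and sample sizes below are hypothetical, not values extracted in the review.

      r <- c(0.62, 0.71, 0.55, 0.80)      # hypothetical test-retest ICCs
      n <- c(80, 120, 45, 200)            # hypothetical sample sizes
      z     <- atanh(r)                   # Fisher z transform
      z_bar <- sum((n - 3) * z) / sum(n - 3)   # weighted average on the z scale
      tanh(z_bar)                         # back-transformed pooled reliability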

  15. Single awakening salivary measurements provide reliable estimates of morning cortisol levels in pregnant women.

    PubMed

    Vlenterie, Richelle; Roeleveld, Nel; van Gelder, Marleen M H J

    2016-12-01

    Mood disorders during pregnancy have been associated with adverse effects on maternal as well as fetal health. Since mood, anxiety, and stress disorders are related with elevated cortisol levels, salivary cortisol may be a useful biomarker. Although multiple samples are generally recommended, a single measurement of awakening salivary cortisol could be a simpler and more cost-effective method to determine whether women have elevated morning cortisol levels during a specific period of pregnancy. Therefore, the aim of this validation study among 177 women in the PRIDE Study was to examine whether one awakening salivary cortisol measurement will suffice to classify pregnant women as having normal or elevated cortisol levels compared to awakening salivary cortisol measurements on three consecutive working days. We calculated intraclass correlation coefficients (ICC) and Cohen's kappa statistics (κ) overall as well as in sub-analyses within strata based on maternal age, level of education, net household income, pre-pregnancy BMI, parity, complications during pregnancy, caffeine consumption, gestational week of sampling, and awakening time. The mean cortisol concentrations were 8.98 ng/ml (SD 5.32) for day one, 8.62 ng/ml (SD 4.55) for day two, and 8.39 ng/ml (SD 4.58) for day three. The overall ICC was 0.86 (95% CI 0.82-0.89) while the κ was 0.75 (95% CI 0.64-0.86). For the ICCs calculated within sub-analyses, a maximum difference of 0.11 was observed between the strata. For the κ statistics, most strata did not differ more than 0.12, except for pre-pregnancy BMI, severe nausea, and extreme fatigue with differences up to 0.22. In conclusion, one awakening salivary cortisol measurement is as reliable for the classification of pregnant women into normal and elevated morning cortisol levels as salivary cortisol measurements on three consecutive working days. Copyright © 2016 Elsevier Ltd. All rights reserved.
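
    As a hedged sketch of the two statistics, the R code below computes a one-way random-effects ICC from an analysis of variance and Cohen's kappa for a normal/elevated classification; the simulated cortisol values and the 12 ng/ml cut-off are invented for illustration and are not from the PRIDE Study.

      icc_oneway <- function(y) {               # y: subjects x repeated measures
        k   <- ncol(y)
        fit <- aov(value ~ subject, data = data.frame(
          value = as.vector(y), subject = factor(rep(seq_len(nrow(y)), times = k))))
        ms  <- summary(fit)[[1]][["Mean Sq"]]   # between- and within-subject mean squares
        (ms[1] - ms[2]) / (ms[1] + (k - 1) * ms[2])
      }
      cohen_kappa <- function(a, b) {           # agreement of two binary classifications
        tab <- table(a, b) / length(a)
        po  <- sum(diag(tab))                   # observed agreement
        pe  <- sum(rowSums(tab) * colSums(tab)) # chance agreement
        (po - pe) / (1 - pe)
      }
      set.seed(5)
      subj <- rnorm(177, 9, 4)                                  # subject-level means
      cort <- cbind(subj + rnorm(177, 0, 1.5), subj + rnorm(177, 0, 1.5),
                    subj + rnorm(177, 0, 1.5))                  # hypothetical ng/ml values
      icc_oneway(cort)
      cohen_kappa(cort[, 1] > 12, rowMeans(cort) > 12)          # single day vs three-day mean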

  16. MutAIT: an online genetic toxicology data portal and analysis tools.

    PubMed

    Avancini, Daniele; Menzies, Georgina E; Morgan, Claire; Wills, John; Johnson, George E; White, Paul A; Lewis, Paul D

    2016-05-01

    Assessment of genetic toxicity and/or carcinogenic activity is an essential element of chemical screening programs employed to protect human health. Dose-response and gene mutation data are frequently analysed by industry, academia and governmental agencies for regulatory evaluations and decision making. Over the years, a number of efforts at different institutions have led to the creation and curation of databases to house genetic toxicology data, largely, with the aim of providing public access to facilitate research and regulatory assessments. This article provides a brief introduction to a new genetic toxicology portal called Mutation Analysis Informatics Tools (MutAIT) (www.mutait.org) that provides easy access to two of the largest genetic toxicology databases, the Mammalian Gene Mutation Database (MGMD) and TransgenicDB. TransgenicDB is a comprehensive collection of transgenic rodent mutation data initially compiled and collated by Health Canada. The updated MGMD contains approximately 50 000 individual mutation spectral records from the published literature. The portal not only gives access to an enormous quantity of genetic toxicology data, but also provides statistical tools for dose-response analysis and calculation of benchmark dose. Two important R packages for dose-response analysis are provided as web-distributed applications with user-friendly graphical interfaces. The 'drsmooth' package performs dose-response shape analysis and determines various points of departure (PoD) metrics and the 'PROAST' package provides algorithms for dose-response modelling. The MutAIT statistical tools, which are currently being enhanced, provide users with an efficient and comprehensive platform to conduct quantitative dose-response analyses and determine PoD values that can then be used to calculate human exposure limits or margins of exposure. © The Author 2015. Published by Oxford University Press on behalf of the UK Environmental Mutagen Society. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  17. Presentation approaches for enhancing interpretability of patient-reported outcomes (PROs) in meta-analysis: a protocol for a systematic survey of Cochrane reviews.

    PubMed

    Devji, Tahira; Johnston, Bradley C; Patrick, Donald L; Bhandari, Mohit; Thabane, Lehana; Guyatt, Gordon H

    2017-09-27

    Meta-analyses of clinical trials often provide sufficient information for decision-makers to evaluate whether chance can explain apparent differences between interventions. Interpretation of the magnitude and importance of treatment effects beyond statistical significance can, however, be challenging, particularly for patient-reported outcomes (PROs) measured using questionnaires with which clinicians have limited familiarity. The objectives of our study are to systematically evaluate Cochrane systematic review authors' approaches to calculation, reporting and interpretation of pooled estimates of patient-reported outcome measures (PROMs) in meta-analyses. We will conduct a methodological survey of a random sample of Cochrane systematic reviews published from 1 January 2015 to 1 April 2017 that report at least one statistically significant pooled result for at least one PRO in the abstract. Author pairs will independently review all titles, abstracts and full texts identified by the literature search, and they will extract data using a standardised data extraction form. We will extract the following: year of publication, number of included trials, number of included participants, clinical area, type of intervention(s) and control(s), type of meta-analysis and use of the Grading of Recommendations, Assessment, Development and Evaluation approach to rate the quality of evidence, as well as information regarding the characteristics of PROMs, calculation and presentation of PROM effect estimates and interpretation of PROM effect estimates. We will document and summarise the methods used for the analysis, reporting and interpretation of each summary effect measure. We will summarise categorical variables with frequencies and percentages and continuous outcomes as means and/or medians and associated measures of dispersion. Ethics approval for this study is not required. We will disseminate the results of this review in peer-reviewed publications and conference presentations. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  18. [Metacarpophalangeal and carpal numeric indices to calculate bone age and predict adult size].

    PubMed

    Ebrí Torné, B; Ebrí Verde, I

    2012-04-01

    This work presents new numerical methods, based on metacarpophalangeal and carpal indices, for calculating bone age. In addition, these new methods enable adult height to be predicted using multiple regression equations. The longitudinal case series included 160 healthy children from Zaragoza, of both genders, aged between 6 months and 20 years, who were studied annually, including radiological examination. For the statistical analysis the statistical package "Statistix", as well as the Excel program, was used. The new indices are closely correlated with chronological age, leading to predictive equations for the calculation of bone age in children up to 20 years of age. In addition, specific equations are presented for children up to 4 years of age, in order to optimise the diagnosis at these early ages. The resulting bone ages can be applied to numerical standard deviation tables, as well as to an equivalence chart, which directly gives the ossification diagnosis. The predictive equations for adult height allow a reliable forecast of the future height of the studied child. These forecasts, analysed with the Student test, did not differ significantly from the adult heights that the children in the case series finally achieved. The results can be obtained with a pocket calculator or with free software available to the reader. For the first time, bone age standards and adult height predictive equations for the study of children are presented using methods developed at our centre rather than adopted from abroad. We invite practitioners to use these metacarpophalangeal and carpal methods in order to gain the experience needed to apply them to healthy populations and to children with different disorders. Copyright © 2011 Asociación Española de Pediatría. Published by Elsevier Espana. All rights reserved.

  19. Statistical analyses in Swedish randomised trials on mammography screening and in other randomised trials on cancer screening: a systematic review

    PubMed Central

    Boniol, Mathieu; Smans, Michel; Sullivan, Richard; Boyle, Peter

    2015-01-01

    Objectives We compared calculations of relative risks of cancer death in Swedish mammography trials and in other cancer screening trials. Participants Men and women from 30 to 74 years of age. Setting Randomised trials on cancer screening. Design For each trial, we identified the intervention period, when screening was offered to screening groups and not to control groups, and the post-intervention period, when screening (or absence of screening) was the same in screening and control groups. We then examined which cancer deaths had been used for the computation of relative risk of cancer death. Main outcome measures Relative risk of cancer death. Results In 17 non-breast screening trials, deaths due to cancers diagnosed during the intervention and post-intervention periods were used for relative risk calculations. In the five Swedish trials, relative risk calculations used deaths due to breast cancers found during intervention periods, but deaths due to breast cancer found at first screening of control groups were added to these groups. After reallocation of the added breast cancer deaths to post-intervention periods of control groups, relative risks of 0.86 (0.76; 0.97) were obtained for cancers found during intervention periods and 0.83 (0.71; 0.97) for cancers found during post-intervention periods, indicating constant reduction in the risk of breast cancer death during follow-up, irrespective of screening. Conclusions The use of unconventional statistical methods in Swedish trials has led to overestimation of risk reduction in breast cancer death attributable to mammography screening. The constant risk reduction observed in screening groups was probably due to the trial design that optimised awareness and medical management of women allocated to screening groups. PMID:26152677
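
    The relative-risk arithmetic itself is straightforward; a minimal R sketch with hypothetical counts of breast cancer deaths (not the trial data):

      rr_ci <- function(d1, n1, d0, n0) {                 # deaths and women per arm
        rr <- (d1 / n1) / (d0 / n0)
        se <- sqrt(1 / d1 - 1 / n1 + 1 / d0 - 1 / n0)     # SE of log(RR)
        c(RR = rr, lower = rr * exp(-1.96 * se), upper = rr * exp(1.96 * se))
      }
      rr_ci(d1 = 126, n1 = 21000, d0 = 150, n0 = 21000)   # hypothetical screening vs control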

  20. Algorithm for Identifying Erroneous Rain-Gauge Readings

    NASA Technical Reports Server (NTRS)

    Rickman, Doug

    2005-01-01

    An algorithm analyzes rain-gauge data to identify statistical outliers that could be deemed to be erroneous readings. Heretofore, analyses of this type have been performed in burdensome manual procedures that have involved subjective judgements. Sometimes, the analyses have included computational assistance for detecting values falling outside of arbitrary limits. The analyses have been performed without statistically valid knowledge of the spatial and temporal variations of precipitation within rain events. In contrast, the present algorithm makes it possible to automate such an analysis, makes the analysis objective, takes account of the spatial distribution of rain gauges in conjunction with the statistical nature of spatial variations in rainfall readings, and minimizes the use of arbitrary criteria. The algorithm implements an iterative process that involves nonparametric statistics.

  1. Tools for Basic Statistical Analysis

    NASA Technical Reports Server (NTRS)

    Luz, Paul L.

    2005-01-01

    Statistical Analysis Toolset is a collection of eight Microsoft Excel spreadsheet programs, each of which performs calculations pertaining to an aspect of statistical analysis. These programs present input and output data in user-friendly, menu-driven formats, with automatic execution. The following types of calculations are performed: Descriptive statistics are computed for a set of data x(i) (i = 1, 2, 3 . . . ) entered by the user. Normal Distribution Estimates will calculate the statistical value that corresponds to cumulative probability values, given a sample mean and standard deviation of the normal distribution. Normal Distribution from two Data Points will extend and generate a cumulative normal distribution for the user, given two data points and their associated probability values. Two programs perform two-way analysis of variance (ANOVA) with no replication or generalized ANOVA for two factors with four levels and three repetitions. Linear Regression-ANOVA will curvefit data to the linear equation y=f(x) and will do an ANOVA to check its significance.
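
    Two of these calculations translate directly to R (this is a sketch with invented numbers, not part of the toolset itself): descriptive statistics and a two-way ANOVA with no replication.

      x <- c(5.1, 4.8, 5.6, 5.0, 5.3)                     # invented sample
      c(mean = mean(x), sd = sd(x), median = median(x))   # descriptive statistics
      d   <- expand.grid(factorA = factor(1:4), factorB = factor(1:3))
      d$y <- rnorm(nrow(d), mean = 10)                    # one observation per cell
      summary(aov(y ~ factorA + factorB, data = d))       # no replication: no interaction term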

  2. Citation of previous meta-analyses on the same topic: a clue to perpetuation of incorrect methods?

    PubMed

    Li, Tianjing; Dickersin, Kay

    2013-06-01

    Systematic reviews and meta-analyses serve as a basis for decision-making and clinical practice guidelines and should be carried out using appropriate methodology to avoid incorrect inferences. We describe the characteristics, statistical methods used for meta-analyses, and citation patterns of all 21 glaucoma systematic reviews we identified pertaining to the effectiveness of prostaglandin analog eye drops in treating primary open-angle glaucoma, published between December 2000 and February 2012. We abstracted data, assessed whether appropriate statistical methods were applied in meta-analyses, and examined citation patterns of included reviews. We identified two forms of problematic statistical analyses in 9 of the 21 systematic reviews examined. Except in 1 case, none of the 9 reviews that used incorrect statistical methods cited a previously published review that used appropriate methods. Reviews that used incorrect methods were cited 2.6 times more often than reviews that used appropriate statistical methods. We speculate that by emulating the statistical methodology of previous systematic reviews, systematic review authors may have perpetuated incorrect approaches to meta-analysis. The use of incorrect statistical methods, perhaps through emulating methods described in previous research, calls conclusions of systematic reviews into question and may lead to inappropriate patient care. We urge systematic review authors and journal editors to seek the advice of experienced statisticians before undertaking or accepting for publication a systematic review and meta-analysis. The author(s) have no proprietary or commercial interest in any materials discussed in this article. Copyright © 2013 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.

  3. The effect of a senior jazz dance class on static balance in healthy women over 50 years of age: a pilot study.

    PubMed

    Wallmann, Harvey W; Gillis, Carrie B; Alpert, Patricia T; Miller, Sally K

    2009-01-01

    The purpose of this pilot study is to assess the impact of a senior jazz dance class on static balance for healthy women over 50 years of age using the NeuroCom Smart Balance Master System (Balance Master). A total of 12 healthy women aged 54-88 years completed a 15-week jazz dance class which they attended 1 time per week for 90 min per class. Balance data were collected using the Sensory Organization Test (SOT) at baseline (pre), at 7 weeks (mid), and after 15 weeks (post). An equilibrium score measuring postural sway was calculated for each of six different conditions. The composite equilibrium score (all six conditions integrated to 1 score) was used as an overall measure of balance. Repeated measures analyses of variance (ANOVAs) were used to compare the means of each participant's SOT composite equilibrium score in addition to the equilibrium score for each individual condition (1-6) across the 3 time points (pre, mid, post). There was a statistically significant difference among the means, p < .0005. Pairwise (Bonferroni) post hoc analyses revealed the following statistically significant findings for SOT composite equilibrium scores for the pre (67.33 ± 10.43), mid (75.25 ± 6.97), and post (79.00 ± 4.97) measurements: pre-mid (p = .008); pre-post (p < .0005); mid-post (p = .033). In addition, correlational statistics were used to determine any relationship between SOT scores and age. Results indicated that administration of a 15-week jazz dance class 1 time per week was beneficial in improving static balance as measured by the Balance Master SOT.
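
    A hedged R sketch of the analysis pattern (simulated scores, not the study data): a within-subject ANOVA across the three time points followed by Bonferroni-corrected paired comparisons.

      set.seed(6)
      subject <- factor(rep(1:12, times = 3))
      time    <- factor(rep(c("pre", "mid", "post"), each = 12),
                        levels = c("pre", "mid", "post"))
      sot     <- c(rnorm(12, 67, 10), rnorm(12, 75, 7), rnorm(12, 79, 5))  # simulated SOT scores
      summary(aov(sot ~ time + Error(subject/time)))        # repeated-measures ANOVA
      pairwise.t.test(sot, time, paired = TRUE, p.adjust.method = "bonferroni")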

  4. Association between periodontal disease and mortality in people with CKD: a meta-analysis of cohort studies.

    PubMed

    Zhang, Jian; Jiang, Hong; Sun, Min; Chen, Jianghua

    2017-08-16

    Periodontal disease is relatively prevalent in people with chronic kidney disease (CKD), but it remains indeterminate whether periodontal disease is an independent risk factor for premature death in this population. Interventions to reduce mortality in the CKD population have consistently yielded unsatisfactory results, and new targets are needed. This meta-analysis therefore aimed to evaluate the association between periodontal disease and mortality in the CKD population. PubMed, Embase, Web of Science, Scopus and abstracts from recent relevant meetings were searched by two authors independently. Relative risks (RRs) with 95% confidence intervals (CIs) were calculated for overall and subgroup meta-analyses. Statistical heterogeneity was explored by chi-square test and quantified by the I² statistic. Eight cohort studies comprising 5477 individuals with CKD were incorporated. The overall pooled data demonstrated that periodontal disease was associated with all-cause death in the CKD population (RR, 1.254; 95% CI 1.046-1.503; P = 0.005), with moderate heterogeneity, I² = 52.2%. However, no evident association was observed between periodontal disease and cardiovascular mortality (RR, 1.30; 95% CI, 0.82-2.06; P = 0.259), and statistical heterogeneity was substantial (I² = 72.5%; P = 0.012). Associations for mortality were similar between subgroups, such as the different stages of CKD and adjustment for confounding factors. Specific to all-cause death, sensitivity and cumulative analyses both suggested that our results were robust. As for cardiovascular mortality, the evidence for an association with periodontal disease needs to be further strengthened. We demonstrated that periodontal disease was associated with an increased risk of all-cause death in people with CKD, but there was no adequate evidence that periodontal disease was also associated with an elevated risk of cardiovascular death.
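
    The pooling itself can be sketched with DerSimonian-Laird random-effects weights; the eight log relative risks and standard errors below are invented placeholders, not the extracted study estimates (dedicated packages such as metafor do the same job with more care).

      dl_pool <- function(log_rr, se) {
        w    <- 1 / se^2                                           # fixed-effect weights
        q    <- sum(w * (log_rr - sum(w * log_rr) / sum(w))^2)     # Cochran's Q
        df   <- length(log_rr) - 1
        tau2 <- max(0, (q - df) / (sum(w) - sum(w^2) / sum(w)))    # between-study variance
        wr   <- 1 / (se^2 + tau2)                                  # random-effects weights
        est  <- sum(wr * log_rr) / sum(wr)
        se_est <- sqrt(1 / sum(wr))
        c(RR = exp(est), lower = exp(est - 1.96 * se_est),
          upper = exp(est + 1.96 * se_est), I2 = max(0, (q - df) / q))
      }
      dl_pool(log_rr = log(c(1.10, 1.45, 1.22, 1.05, 1.38, 1.18, 1.30, 1.12)),
              se     = c(0.15, 0.20, 0.10, 0.18, 0.25, 0.12, 0.22, 0.16))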

  5. Therapeutic whole-body hypothermia reduces mortality in severe traumatic brain injury if the cooling index is sufficiently high: meta-analyses of the effect of single cooling parameters and their integrated measure.

    PubMed

    Olah, Emoke; Poto, Laszlo; Hegyi, Peter; Szabo, Imre; Hartmann, Petra; Solymar, Margit; Petervari, Erika; Balasko, Marta; Habon, Tamas; Rumbus, Zoltan; Tenk, Judit; Rostas, Ildiko; Weinberg, Jordan; Romanovsky, Andrej A; Garami, Andras

    2018-04-21

    Therapeutic hypothermia was investigated repeatedly as a tool to improve the outcome of severe traumatic brain injury (TBI), but previous clinical trials and meta-analyses found contradictory results. We aimed to determine the effectiveness of therapeutic whole-body hypothermia on the mortality of adult patients with severe TBI by using a novel approach of meta-analysis. We searched the PubMed, EMBASE, and Cochrane Library databases from inception to February 2017. The identified human studies were evaluated regarding statistical, clinical, and methodological designs to ensure inter-study homogeneity. We extracted data on TBI severity, body temperature, mortality, and cooling parameters; then we calculated the cooling index, an integrated measure of therapeutic hypothermia. Forest plot of all identified studies showed no difference in the outcome of TBI between cooled and not cooled patients, but inter-study heterogeneity was high. On the contrary, by meta-analysis of RCTs which were homogenous with regards to statistical, clinical designs and precisely reported the cooling protocol, we showed decreased odds ratio for mortality in therapeutic hypothermia compared to no cooling. As independent factors, milder and longer cooling, and rewarming at < 0.25°C/h were associated with better outcome. Therapeutic hypothermia was beneficial only if the cooling index (measure of combination of cooling parameters) was sufficiently high. We conclude that high methodological and statistical inter-study heterogeneity could underlie the contradictory results obtained in previous studies. By analyzing methodologically homogenous studies, we show that cooling improves the outcome of severe TBI and this beneficial effect depends on certain cooling parameters and on their integrated measure, the cooling index.

  6. The influence of control group reproduction on the statistical power of the Environmental Protection Agency's Medaka Extended One Generation Reproduction Test (MEOGRT).

    PubMed

    Flynn, Kevin; Swintek, Joe; Johnson, Rodney

    2017-02-01

    Because of various Congressional mandates to protect the environment from endocrine disrupting chemicals (EDCs), the United States Environmental Protection Agency (USEPA) initiated the Endocrine Disruptor Screening Program. In the context of this framework, the Office of Research and Development within the USEPA developed the Medaka Extended One Generation Reproduction Test (MEOGRT) to characterize the endocrine action of a suspected EDC. One important endpoint of the MEOGRT is fecundity of medaka breeding pairs. Power analyses were conducted to determine the number of replicates needed in proposed test designs and to determine the effects that varying reproductive parameters (e.g. mean fecundity, variance, and days with no egg production) would have on the statistical power of the test. The MEOGRT Reproduction Power Analysis Tool (MRPAT) is a software tool developed to expedite these power analyses by both calculating estimates of the needed reproductive parameters (e.g. population mean and variance) and performing the power analysis under user-specified scenarios. Example scenarios are detailed that highlight the importance of the reproductive parameters for statistical power. When control fecundity is increased from 21 to 38 eggs per pair per day and the variance decreased from 49 to 20, the gain in power is equivalent to increasing replication by 2.5 times. On the other hand, if 10% of the breeding pairs, including controls, do not spawn, the power to detect a 40% decrease in fecundity drops to 0.54 from nearly 0.98 when all pairs have some level of egg production. Perhaps most importantly, MRPAT was used to inform the decision-making process that led to the final recommendation of the MEOGRT to have 24 control breeding pairs and 12 breeding pairs in each exposure group. Published by Elsevier Inc.
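
    This is not MRPAT itself, but the kind of simulation it automates can be sketched in R: fecundity per pair is drawn from a negative binomial, some pairs may fail to spawn, and power is the fraction of simulated experiments in which a 40% decrease is detected. All parameter values are assumptions for illustration.

      power_sim <- function(n_ctrl = 24, n_trt = 12, mu = 38, decrease = 0.4,
                            p_nonspawn = 0, size = 20, n_sim = 2000) {
        mean(replicate(n_sim, {
          ctrl <- rnbinom(n_ctrl, mu = mu, size = size) *
                  rbinom(n_ctrl, 1, 1 - p_nonspawn)          # some pairs produce no eggs
          trt  <- rnbinom(n_trt, mu = mu * (1 - decrease), size = size) *
                  rbinom(n_trt, 1, 1 - p_nonspawn)
          wilcox.test(ctrl, trt)$p.value < 0.05               # detection at alpha = 0.05
        }))
      }
      set.seed(7)
      power_sim(p_nonspawn = 0)     # all pairs spawn
      power_sim(p_nonspawn = 0.1)   # 10% of pairs produce no eggs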

  7. Substituting values for censored data from Texas, USA, reservoirs inflated and obscured trends in analyses commonly used for water quality target development.

    PubMed

    Grantz, Erin; Haggard, Brian; Scott, J Thad

    2018-06-12

    We calculated four median datasets (chlorophyll a, Chl a; total phosphorus, TP; and transparency) using multiple approaches to handling censored observations, including substituting fractions of the quantification limit (QL; dataset 1 = 1QL, dataset 2 = 0.5QL) and statistical methods for censored datasets (datasets 3-4) for approximately 100 Texas, USA reservoirs. Trend analyses of differences between dataset 1 and 3 medians indicated percent difference increased linearly above thresholds in percent censored data (%Cen). This relationship was extrapolated to estimate medians for site-parameter combinations with %Cen > 80%, which were combined with dataset 3 as dataset 4. Changepoint analysis of Chl a- and transparency-TP relationships indicated threshold differences up to 50% between datasets. Recursive analysis identified secondary thresholds in dataset 4. Threshold differences show that information introduced via substitution or missing due to limitations of statistical methods biased values, underestimated error, and inflated the strength of TP thresholds identified in datasets 1-3. Analysis of covariance identified differences in linear regression models relating transparency-TP between datasets 1, 2, and the more statistically robust datasets 3-4. Study findings identify high-risk scenarios for biased analytical outcomes when using substitution. These include high probability of median overestimation when %Cen > 50-60% for a single QL, or when %Cen is as low as 16% for multiple QLs. Changepoint analysis was uniquely vulnerable to substitution effects when using medians from sites with %Cen > 50%. Linear regression analysis was less sensitive to substitution and missing data effects, but differences in model parameters for transparency cannot be discounted and could be magnified by log-transformation of the variables.
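
    A toy R simulation (hypothetical lognormal concentrations and a single invented quantification limit) shows the mechanism: under heavy censoring, medians computed after substituting 1QL or 0.5QL move away from the true median.

      set.seed(8)
      ql    <- 0.05                                            # hypothetical single QL (mg/L)
      tp    <- rlnorm(200, meanlog = log(0.03), sdlog = 0.6)   # "true" TP concentrations
      obs1  <- ifelse(tp < ql, ql, tp)                         # substitute 1 * QL
      obs05 <- ifelse(tp < ql, 0.5 * ql, tp)                   # substitute 0.5 * QL
      c(true = median(tp), sub_QL = median(obs1), sub_halfQL = median(obs05),
        pct_censored = 100 * mean(tp < ql))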

  8. A statistical model of operational impacts on the framework of the bridge crane

    NASA Astrophysics Data System (ADS)

    Antsev, V. Yu; Tolokonnikov, A. S.; Gorynin, A. D.; Reutov, A. A.

    2017-02-01

    The technical regulations of the Customs Union demand implementation of a risk analysis of bridge crane operation at the design stage. A statistical model has been developed for performing randomized risk calculations, allowing possible operational influences on the bridge crane metal structure to be modelled in their various combinations. The statistical model has been implemented in practice in a software product for the automated calculation of the risk of bridge crane failures.

  9. 2012 Workplace and Gender Relations Survey of Reserve Component Members: Survey Note and Briefing

    DTIC Science & Technology

    2013-05-08

    ...understand that to be a statistically significant difference at the .05 level of significance. Overview: The ability to calculate annual prevalence rates is a... statistically significant differences for women or men in the overall rate between 2008 and 2012. Of the 2.8% of women who experienced UMAN...

  10. Statistical Signal Process in R Language in the Pharmacovigilance Programme of India.

    PubMed

    Kumar, Aman; Ahuja, Jitin; Shrivastava, Tarani Prakash; Kumar, Vipin; Kalaiselvan, Vivekanandan

    2018-05-01

    The Ministry of Health & Family Welfare, Government of India, initiated the Pharmacovigilance Programme of India (PvPI) in July 2010. The purpose of the PvPI is to collect data on adverse reactions due to medications, analyze them, and use these as a reference to recommend informed regulatory interventions, besides communicating the risk to health care professionals and the public. The goal of the present study was to apply statistical tools in R to find relationships between drugs and ADRs for signal detection. Four statistical parameters were proposed for quantitative signal detection: IC025, PRR and its lower bound PRRlb, chi-square, and N11; we calculated these 4 values using R programming. We analyzed 78,983 drug-ADR combinations, with a total count of 420,060. During the calculation of the statistical parameters, we used 3 variables: (1) N11 (number of counts), (2) N1. (drug margin), and (3) N.1 (ADR margin). The structure and calculation of these 4 statistical parameters in R are easily understandable. On the basis of the IC value (IC value > 0), out of the 78,983 drug-ADR combinations we found 8,667 combinations to be significantly associated. The calculation of statistical parameters in R is time saving and allows new signals in the Indian ICSR (Individual Case Safety Reports) database to be identified easily.
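
    The disproportionality arithmetic for a single drug-ADR pair can be sketched as below; the counts are invented, and the IC is shown only as the simple observed-to-expected point value, since IC025 proper requires the Bayesian (BCPNN) posterior.

      n11 <- 40; n1. <- 1200; n.1 <- 900; n.. <- 420060    # invented 2x2 margins
      n12 <- n1. - n11; n21 <- n.1 - n11; n22 <- n.. - n11 - n12 - n21
      prr    <- (n11 / n1.) / (n21 / (n21 + n22))          # proportional reporting ratio
      se_log <- sqrt(1 / n11 - 1 / n1. + 1 / n21 - 1 / (n21 + n22))
      prr_lb <- exp(log(prr) - 1.96 * se_log)              # lower bound of PRR
      chi2   <- chisq.test(matrix(c(n11, n12, n21, n22), nrow = 2))$statistic  # Yates-corrected
      ic     <- log2(n11 * n.. / (n1. * n.1))              # simple IC point estimate
      c(PRR = prr, PRR_lb = prr_lb, chisq = unname(chi2), IC = ic)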

  11. Reporting quality of statistical methods in surgical observational studies: protocol for systematic review.

    PubMed

    Wu, Robert; Glen, Peter; Ramsay, Tim; Martel, Guillaume

    2014-06-28

    Observational studies dominate the surgical literature. Statistical adjustment is an important strategy to account for confounders in observational studies. Research has shown that published articles are often poor in statistical quality, which may jeopardize their conclusions. The Statistical Analyses and Methods in the Published Literature (SAMPL) guidelines have been published to help establish standards for statistical reporting. This study will seek to determine whether the quality of statistical adjustment and the reporting of these methods are adequate in surgical observational studies. We hypothesize that incomplete reporting will be found in all surgical observational studies, and that the quality and reporting of these methods will be of lower quality in surgical journals when compared with medical journals. Finally, this work will seek to identify predictors of high-quality reporting. This work will examine the top five general surgical and medical journals, based on a 5-year impact factor (2007-2012). All observational studies investigating an intervention related to an essential component area of general surgery (defined by the American Board of Surgery), with an exposure, outcome, and comparator, will be included in this systematic review. Essential elements related to statistical reporting and quality were extracted from the SAMPL guidelines and include domains such as intent of analysis, primary analysis, multiple comparisons, numbers and descriptive statistics, association and correlation analyses, linear regression, logistic regression, Cox proportional hazard analysis, analysis of variance, survival analysis, propensity analysis, and independent and correlated analyses. Each article will be scored as a proportion based on fulfilling criteria in relevant analyses used in the study. A logistic regression model will be built to identify variables associated with high-quality reporting. A comparison will be made between the scores of surgical observational studies published in medical versus surgical journals. Secondary outcomes will pertain to individual domains of analysis. Sensitivity analyses will be conducted. This study will explore the reporting and quality of statistical analyses in surgical observational studies published in the most referenced surgical and medical journals in 2013 and examine whether variables (including the type of journal) can predict high-quality reporting.

  12. Fukushima Daiichi Radionuclide Inventories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cardoni, Jeffrey N.; Jankovsky, Zachary Kyle

    Radionuclide inventories are generated to permit detailed analyses of the Fukushima Daiichi meltdowns. This is necessary information for severe accident calculations, dose calculations, and source term and consequence analyses. Inventories are calculated using SCALE6 and compared to values predicted by international researchers supporting the OECD/NEA's Benchmark Study on the Accident at Fukushima Daiichi Nuclear Power Station (BSAF). Both sets of inventory information are acceptable for best-estimate analyses of the Fukushima reactors. Consistent nuclear information for severe accident codes, including radionuclide class masses and core decay powers, are also derived from the SCALE6 analyses. Key nuclide activity ratios are calculated as functions of burnup and nuclear data in order to explore the utility for nuclear forensics and support future decommissioning efforts.

  13. Sampling and sensitivity analyses tools (SaSAT) for computational modelling

    PubMed Central

    Hoare, Alexander; Regan, David G; Wilson, David P

    2008-01-01

    SaSAT (Sampling and Sensitivity Analysis Tools) is a user-friendly software package for applying uncertainty and sensitivity analyses to mathematical and computational models of arbitrary complexity and context. The toolbox is built in Matlab®, a numerical mathematical software package, and utilises algorithms contained in the Matlab® Statistics Toolbox. However, Matlab® is not required to use SaSAT as the software package is provided as an executable file with all the necessary supplementary files. The SaSAT package is also designed to work seamlessly with Microsoft Excel but no functionality is forfeited if that software is not available. A comprehensive suite of tools is provided to enable the following tasks to be easily performed: efficient and equitable sampling of parameter space by various methodologies; calculation of correlation coefficients; regression analysis; factor prioritisation; and graphical output of results, including response surfaces, tornado plots, and scatterplots. Use of SaSAT is exemplified by application to a simple epidemic model. To our knowledge, a number of the methods available in SaSAT for performing sensitivity analyses have not previously been used in epidemiological modelling and their usefulness in this context is demonstrated. PMID:18304361
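
    One of the listed steps, efficient and equitable sampling of parameter space, can be illustrated with a small hand-rolled Latin hypercube sampler in R (SaSAT itself is a Matlab toolbox; the parameter names, ranges, and toy model below are invented).

      lhs_sample <- function(n, ranges) {
        sapply(ranges, function(r) {
          u <- (sample(n) - runif(n)) / n     # one stratified uniform draw per interval
          r[1] + u * (r[2] - r[1])            # rescale to the parameter range
        })
      }
      set.seed(10)
      pars   <- lhs_sample(100, list(beta = c(0.1, 0.5), gamma = c(0.05, 0.2)))
      output <- pars[, "beta"] / pars[, "gamma"]       # toy model output (R0-like ratio)
      cor(pars, output, method = "spearman")           # rank correlations for sensitivity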

  14. Sybil--efficient constraint-based modelling in R.

    PubMed

    Gelius-Dietrich, Gabriel; Desouki, Abdelmoneim Amer; Fritzemeier, Claus Jonathan; Lercher, Martin J

    2013-11-13

    Constraint-based analyses of metabolic networks are widely used to simulate the properties of genome-scale metabolic networks. Publicly available implementations tend to be slow, impeding large scale analyses such as the genome-wide computation of pairwise gene knock-outs, or the automated search for model improvements. Furthermore, available implementations cannot easily be extended or adapted by users. Here, we present sybil, an open source software library for constraint-based analyses in R; R is a free, platform-independent environment for statistical computing and graphics that is widely used in bioinformatics. Among other functions, sybil currently provides efficient methods for flux-balance analysis (FBA), MOMA, and ROOM that are about ten times faster than previous implementations when calculating the effect of whole-genome single gene deletions in silico on a complete E. coli metabolic model. Due to the object-oriented architecture of sybil, users can easily build analysis pipelines in R or even implement their own constraint-based algorithms. Based on its highly efficient communication with different mathematical optimisation programs, sybil facilitates the exploration of high-dimensional optimisation problems on small time scales. Sybil and all its dependencies are open source. Sybil and its documentation are available for download from the comprehensive R archive network (CRAN).

  15. Statistical analyses of commercial vehicle accident factors. Volume 1 Part 1

    DOT National Transportation Integrated Search

    1978-02-01

    Procedures for conducting statistical analyses of commercial vehicle accidents have been established and initially applied. A file of some 3,000 California Highway Patrol accident reports from two areas of California during a period of about one year...

  16. 40 CFR 90.712 - Request for public hearing.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... sampling plans and statistical analyses have been properly applied (specifically, whether sampling procedures and statistical analyses specified in this subpart were followed and whether there exists a basis... Clerk and will be made available to the public during Agency business hours. ...

  17. Acetabular revisions using porous tantalum components: A retrospective study with 5-10 years follow-up

    PubMed Central

    Evola, Francesco Roberto; Costarella, Luciano; Evola, Giuseppe; Barchitta, Martina; Agodi, Antonella; Sessa, Giuseppe

    2017-01-01

    AIM To evaluate the clinical and X-ray results of acetabular components and tantalum augments in prosthetic hip revisions. METHODS Fifty-eight hip prostheses with primary failure of the acetabular component were revised with tantalum implants. The clinical records and X-rays of these cases were retrospectively reviewed. Bone defect evaluations were based on preoperative CT scans and classified according to the Paprosky criteria; radiolucent lines, periprosthetic gaps, implant mobilization and osteolysis were evaluated by X-ray. An ad hoc database was created and statistical analyses were performed with SPSS software (IBM SPSS Statistics for Windows, version 23.0). Statistical analyses were carried out using the Student’s t test for independent and paired samples. A P value of < 0.05 was considered statistically significant and cumulative survival was calculated by the Kaplan-Meier method. RESULTS The mean follow-up was 87.6 ± 25.6 mo (range 3-120 mo). Twenty-five cases (43.1%) were classified as minor defects, and 33 cases (56.9%) as major defects. The preoperative HHS rating improved significantly from a mean of 40.7 ± 6.1 (range: 29-53) before revision, to a mean of 85.8 ± 6.1 (range: 70-94) at the end of the follow-up (Student’s t test for paired samples: P < 0.001). Considering HHS only at the end of follow-up, no statistically significant difference was observed between patients with a major or minor defect (Student’s t test for independent samples: P > 0.05). Radiolucent lines were found in 4 implants (6.9%). Postoperative acetabular gaps were observed in 5 hips (8.6%). No signs of implant mobilization or areas of periprosthetic osteolysis were found in the X-rays at the final follow-up. Only 3 implants failed: 1 case of infection and 2 cases of instability. Cumulative survival at 10 years was 95% with failure for any reason as the end-point, and 100% for aseptic loosening of the acetabular component. CONCLUSION The medium-term use of prosthetic tantalum components in prosthetic hip revisions is safe and effective in a wide variety of acetabular bone defects. PMID:28808626
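
    The two statistical procedures named in this abstract, a paired t test on Harris Hip Scores and Kaplan-Meier survival, can be sketched as follows; all numbers are invented for illustration, and the hand-rolled Kaplan-Meier estimator below is a simplified stand-in for what SPSS computes.

```python
import numpy as np
from scipy import stats

# Hypothetical pre/post Harris Hip Scores for 10 revisions (illustrative values only).
hhs_pre  = np.array([38, 45, 41, 50, 36, 47, 43, 40, 52, 39], dtype=float)
hhs_post = np.array([82, 88, 85, 90, 80, 91, 84, 83, 92, 81], dtype=float)
t, p = stats.ttest_rel(hhs_post, hhs_pre)
print(f"paired t = {t:.2f}, p = {p:.2e}")

def kaplan_meier(time, event):
    """Simple Kaplan-Meier estimator; event=1 means failure, 0 means censored at that time."""
    order = np.argsort(time)
    time, event = np.asarray(time)[order], np.asarray(event)[order]
    at_risk, surv, out_t, out_s = len(time), 1.0, [], []
    for t_i, e_i in zip(time, event):
        if e_i:                      # failure: step the survival curve down
            surv *= 1.0 - 1.0 / at_risk
            out_t.append(t_i)
            out_s.append(surv)
        at_risk -= 1                 # failures and censorings both leave the risk set
    return np.array(out_t), np.array(out_s)

# Follow-up in months, 1 = implant failure, 0 = implant in place at last follow-up (made-up data).
months = [3, 24, 36, 60, 72, 84, 90, 96, 108, 120]
failed = [1, 0, 1, 0, 0, 0, 1, 0, 0, 0]
t_km, s_km = kaplan_meier(months, failed)
print("KM step times :", t_km)
print("KM survival   :", np.round(s_km, 3))
```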

  18. Acetabular revisions using porous tantalum components: A retrospective study with 5-10 years follow-up.

    PubMed

    Evola, Francesco Roberto; Costarella, Luciano; Evola, Giuseppe; Barchitta, Martina; Agodi, Antonella; Sessa, Giuseppe

    2017-07-18

    To evaluate the clinical and X-ray results of acetabular components and tantalum augments in prosthetic hip revisions. Fifty-eight hip prostheses with primary failure of the acetabular component were revised with tantalum implants. The clinical records and X-rays of these cases were retrospectively reviewed. Bone defect evaluations were based on preoperative CT scans and classified according to the Paprosky criteria; radiolucent lines, periprosthetic gaps, implant mobilization and osteolysis were evaluated by X-ray. An ad hoc database was created and statistical analyses were performed with SPSS software (IBM SPSS Statistics for Windows, version 23.0). Statistical analyses were carried out using the Student's t test for independent and paired samples. A P value of < 0.05 was considered statistically significant and cumulative survival was calculated by the Kaplan-Meier method. The mean follow-up was 87.6 ± 25.6 mo (range 3-120 mo). Twenty-five cases (43.1%) were classified as minor defects, and 33 cases (56.9%) as major defects. The preoperative HHS rating improved significantly from a mean of 40.7 ± 6.1 (range: 29-53) before revision, to a mean of 85.8 ± 6.1 (range: 70-94) at the end of the follow-up (Student's t test for paired samples: P < 0.001). Considering HHS only at the end of follow-up, no statistically significant difference was observed between patients with a major or minor defect (Student's t test for independent samples: P > 0.05). Radiolucent lines were found in 4 implants (6.9%). Postoperative acetabular gaps were observed in 5 hips (8.6%). No signs of implant mobilization or areas of periprosthetic osteolysis were found in the X-rays at the final follow-up. Only 3 implants failed: 1 case of infection and 2 cases of instability. Cumulative survival at 10 years was 95% with failure for any reason as the end-point, and 100% for aseptic loosening of the acetabular component. The medium-term use of prosthetic tantalum components in prosthetic hip revisions is safe and effective in a wide variety of acetabular bone defects.

  19. Reporting and methodological quality of meta-analyses in urological literature

    PubMed Central

    Xu, Jing

    2017-01-01

    Purpose To assess the overall quality of published urological meta-analyses and identify predictive factors for high quality. Materials and Methods We systematically searched PubMed to identify meta-analyses published from January 1st, 2011 to December 31st, 2015 in 10 predetermined major paper-based urology journals. The characteristics of the included meta-analyses were collected, and their reporting and methodological qualities were assessed by the PRISMA checklist (27 items) and AMSTAR tool (11 items), respectively. Descriptive statistics were used for individual items as a measure of overall compliance, and PRISMA and AMSTAR scores were calculated as the sum of adequately reported domains. Logistic regression was used to identify predictive factors for high qualities. Results A total of 183 meta-analyses were included. The mean PRISMA and AMSTAR scores were 22.74 ± 2.04 and 7.57 ± 1.41, respectively. PRISMA item 5, protocol and registration, items 15 and 22, risk of bias across studies, items 16 and 23, additional analysis had less than 50% adherence. AMSTAR item 1, “a priori” design, item 5, list of studies and item 10, publication bias had less than 50% adherence. Logistic regression analyses showed that funding support and “a priori” design were associated with superior reporting quality, following PRISMA guideline and “a priori” design were associated with superior methodological quality. Conclusions Reporting and methodological qualities of recently published meta-analyses in major paper-based urology journals are generally good. Further improvement could potentially be achieved by strictly adhering to PRISMA guideline and having “a priori” protocol. PMID:28439452

  20. Effective field theory of statistical anisotropies for primordial bispectrum and gravitational waves

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rostami, Tahereh; Karami, Asieh; Firouzjahi, Hassan, E-mail: t.rostami@ipm.ir, E-mail: karami@ipm.ir, E-mail: firouz@ipm.ir

    2017-06-01

    We present effective field theory studies of primordial statistical anisotropies in models of anisotropic inflation. The general action in unitary gauge is presented to calculate the leading interactions between the gauge field fluctuations, the curvature perturbations and the tensor perturbations. The anisotropies in the scalar power spectrum and bispectrum are calculated, and the dependence of these anisotropies on the EFT couplings is presented. In addition, we calculate the statistical anisotropy in the tensor power spectrum and the scalar-tensor cross correlation. Our EFT approach incorporates anisotropies generated in models with a non-trivial speed for the gauge field fluctuations and sound speed for scalar perturbations, such as in DBI inflation.
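
    For reference, the quadrupolar parametrisation commonly used to quantify statistical anisotropy in the scalar power spectrum (the Ackerman-Carroll-Wise form) is written below; the paper works within an EFT generalisation of this setup, so the expression is the standard convention rather than the paper's own result.

```latex
% Standard quadrupolar parametrization of a statistically anisotropic scalar power spectrum:
% \hat{n} is the preferred direction, g_* the anisotropy amplitude (assumed convention).
P_\zeta(\mathbf{k}) = P_0(k)\left[\,1 + g_*\,(\hat{k}\cdot\hat{n})^{2}\,\right]
```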

  1. Supply Chain Collaboration: Information Sharing in a Tactical Operating Environment

    DTIC Science & Technology

    2013-06-01

    architecture, there are four tiers: Client (Web Application Clients), Presentation (Web-Server), Processing (Application-Server), Data (Database...organization in each period. These data will be collected for analysis. i) Analyses and Validation: We will perform statistical tests on these data, Pareto ...notes, outstanding deliveries, and inventory. i) Analyses and Validation: We will perform statistical tests on these data, Pareto analyses and confirmation

  2. The Physics and Operation of Ultra-Submicron Length Semiconductor Devices.

    DTIC Science & Technology

    1994-05-01

    300 meV heterostructure diode at T=300K with Fermi statistics and flat band conditions. In all of the calculations with a heterostructure barrier, once... Figure 8. Self-consistent T=300K calculation with Fermi statistics showing the density and donor

  3. On the statistical significance of excess events: Remarks of caution and the need for a standard method of calculation

    NASA Technical Reports Server (NTRS)

    Staubert, R.

    1985-01-01

    Methods for calculating the statistical significance of excess events and the interpretation of the formally derived values are discussed. It is argued that a simple formula for a conservative estimate should generally be used in order to provide a common understanding of quoted values.
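
    The abstract does not reproduce the formula itself, but one commonly quoted simple estimate of the significance of an excess, based on propagation of errors in on- and off-source counts, is sketched below as an assumption about the class of formula being discussed.

```python
import math

def simple_significance(n_on, n_off, alpha):
    """Propagation-of-errors estimate of the significance of an excess:
    S = (N_on - alpha * N_off) / sqrt(N_on + alpha**2 * N_off),
    where alpha is the ratio of on-source to off-source exposure."""
    excess = n_on - alpha * n_off
    return excess / math.sqrt(n_on + alpha**2 * n_off)

# Example: 130 counts on source, 1000 counts in an off-source region 10x larger (alpha = 0.1).
print(f"S = {simple_significance(130, 1000, 0.1):.2f} sigma")
```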

  4. Reduction of Fasting Blood Glucose and Hemoglobin A1c Using Oral Aloe Vera: A Meta-Analysis.

    PubMed

    Dick, William R; Fletcher, Emily A; Shah, Sachin A

    2016-06-01

    Diabetes mellitus is a global epidemic and one of the leading causes of morbidity and mortality. Additional medications that are novel, affordable, and efficacious are needed to treat this rampant disease. This meta-analysis was performed to ascertain the effectiveness of oral aloe vera consumption on the reduction of fasting blood glucose (FBG) and hemoglobin A1c (HbA1c). PubMed, CINAHL, Natural Medicines Comprehensive Database, and Natural Standard databases were searched. Studies of aloe vera's effect on FBG, HbA1c, homeostasis model assessment-estimated insulin resistance (HOMA-IR), fasting serum insulin, fructosamine, and oral glucose tolerance test (OGTT) in prediabetic and diabetic populations were examined. After data extraction, the parameters of FBG and HbA1c had appropriate data for meta-analyses. Extracted data were verified and then analyzed by StatsDirect Statistical Software. Reductions of FBG and HbA1c were reported as the weighted mean differences from baseline, calculated by a random-effects model with 95% confidence intervals. Subgroup analyses to determine clinical and statistical heterogeneity were also performed. Publication bias was assessed by using the Egger bias statistic. Nine studies were included in the FBG parameter (n = 283); 5 of these studies included HbA1c data (n = 89). Aloe vera decreased FBG by 46.6 mg/dL (p < 0.0001) and HbA1c by 1.05% (p = 0.004). Significant reductions of both endpoints were maintained in all subgroup analyses. Additionally, the data suggest that patients with an FBG ≥200 mg/dL may see a greater benefit. A mean FBG reduction of 109.9 mg/dL was observed in this population (p ≤ 0.0001). The Egger statistic showed publication bias with FBG but not with HbA1c (p = 0.010 and p = 0.602, respectively). These results support the use of oral aloe vera for significantly reducing FBG (46.6 mg/dL) and HbA1c (1.05%). Further clinical studies that are more robust and better controlled are warranted to further explore these findings.
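
    The pooling step described here (weighted mean differences under a random-effects model) can be sketched with a DerSimonian-Laird estimator; the per-study effects and variances below are invented for illustration and are not the trial data from the meta-analysis.

```python
import numpy as np

def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects pooling of per-study mean differences."""
    effects, variances = np.asarray(effects, float), np.asarray(variances, float)
    w = 1.0 / variances                              # fixed-effect (inverse-variance) weights
    fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed) ** 2)           # Cochran's Q heterogeneity statistic
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                    # between-study variance estimate
    w_star = 1.0 / (variances + tau2)
    pooled = np.sum(w_star * effects) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se, tau2

# Hypothetical per-study mean FBG changes (mg/dL) and variances -- illustrative numbers only.
deltas    = [-55.0, -40.0, -62.0, -35.0, -48.0]
variances = [ 90.0, 120.0,  80.0, 150.0, 100.0]
est, lo, hi, tau2 = random_effects_pool(deltas, variances)
print(f"pooled change = {est:.1f} mg/dL (95% CI {lo:.1f} to {hi:.1f}), tau^2 = {tau2:.1f}")
```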

  5. Sensitivity of bud burst in key tree species in the UK to recent climate variability and change

    NASA Astrophysics Data System (ADS)

    Abernethy, Rachel; Cook, Sally; Hemming, Deborah; McCarthy, Mark

    2017-04-01

    Analysing the relationship between the changing climate of the UK and the spatial and temporal distribution of spring bud burst plays an important role in understanding ecosystem functionality and predicting future phenological trends. The location and timing of bud burst of eleven species of trees alongside climatic factors such as temperature, precipitation, and hours of sunshine (photoperiod) were used to investigate: i. which species' bud burst timing experiences the greatest impact from a changing climate, ii. which climatic factor has the greatest influence on the timing of bud burst, and iii. whether the location of bud burst is influenced by climate variability. Winter heatwave duration was also analysed as part of an investigation into the relationship between temperature trends of a specific winter period and the following spring events. Geographic Information Systems (GIS) and statistical analysis tools were used to visualise spatial patterns and to analyse the phenological and climate data through regression and analysis of variance (ANOVA) tests. Where there were areas that showed a strong positive or negative relationship between phenology and climate, satellite imagery was used to calculate a Normalised Difference Vegetation Index (NDVI) and a Leaf Area Index (LAI) to further investigate the relationships found. It was expected that in the north of the UK, where bud burst tends to occur later in the year than in the south, bud burst would begin to occur earlier due to increasing temperatures and increased hours of sunshine. However, initial results show that for some species, the bud burst timing tends to remain or become later in the year. Further analysis will investigate the statistical significance of the relationship between the changing location of bud burst and each climatic factor.
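
    The NDVI referred to above is a simple band ratio; a minimal sketch, assuming near-infrared and red reflectance arrays are available, is:

```python
import numpy as np

def ndvi(nir, red):
    """Normalised Difference Vegetation Index from near-infrared and red reflectance bands."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red)

# Tiny 2x2 "image": two vegetated pixels (high NIR) and two sparse/bare pixels.
nir = np.array([[0.60, 0.55], [0.25, 0.20]])
red = np.array([[0.10, 0.12], [0.18, 0.17]])
print(np.round(ndvi(nir, red), 2))   # values near +0.7 indicate dense green vegetation
```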

  6. Spatially explicit spectral analysis of point clouds and geospatial data

    USGS Publications Warehouse

    Buscombe, Daniel D.

    2015-01-01

    The increasing use of spatially explicit analyses of high-resolution spatially distributed data (imagery and point clouds) for the purposes of characterising spatial heterogeneity in geophysical phenomena necessitates the development of custom analytical and computational tools. In recent years, such analyses have become the basis of, for example, automated texture characterisation and segmentation, roughness and grain size calculation, and feature detection and classification, from a variety of data types. In this work, much use has been made of statistical descriptors of localised spatial variations in amplitude variance (roughness), however the horizontal scale (wavelength) and spacing of roughness elements is rarely considered. This is despite the fact that the ratio of characteristic vertical to horizontal scales is not constant and can yield important information about physical scaling relationships. Spectral analysis is a hitherto under-utilised but powerful means to acquire statistical information about relevant amplitude and wavelength scales, simultaneously and with computational efficiency. Further, quantifying spatially distributed data in the frequency domain lends itself to the development of stochastic models for probing the underlying mechanisms which govern the spatial distribution of geological and geophysical phenomena. The software package PySESA (Python program for Spatially Explicit Spectral Analysis) has been developed for generic analyses of spatially distributed data in both the spatial and frequency domains. Developed predominantly in Python, it accesses libraries written in Cython and C++ for efficiency. It is open source and modular, therefore readily incorporated into, and combined with, other data analysis tools and frameworks with particular utility for supporting research in the fields of geomorphology, geophysics, hydrography, photogrammetry and remote sensing. The analytical and computational structure of the toolbox is described, and its functionality illustrated with an example of high-resolution bathymetric point cloud data collected with a multibeam echosounder.
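
    The core of a spatially explicit spectral analysis of a gridded surface is a two-dimensional power spectrum; the sketch below computes a radially averaged spectrum with NumPy for a synthetic rough surface. It illustrates the idea only and does not use PySESA's actual API.

```python
import numpy as np

def radial_power_spectrum(z, dx=1.0):
    """Radially averaged power spectrum of a detrended, gridded surface z (square grid, spacing dx)."""
    z = z - z.mean()                                   # remove the mean before transforming
    F = np.fft.fftshift(np.fft.fft2(z))
    power = np.abs(F) ** 2 / z.size
    n = z.shape[0]
    freqs = np.fft.fftshift(np.fft.fftfreq(n, d=dx))
    kx, ky = np.meshgrid(freqs, freqs)
    k = np.hypot(kx, ky)
    bins = np.linspace(0.0, k.max(), n // 2)
    which = np.digitize(k.ravel(), bins)
    radial = []
    for i in range(1, len(bins)):
        sel = which == i
        radial.append(power.ravel()[sel].mean() if sel.any() else 0.0)
    return bins[1:], np.array(radial)

# Synthetic rough surface: a long-wavelength undulation plus fine-scale noise.
rng = np.random.default_rng(0)
x = np.arange(128)
surface = 0.5 * np.sin(2 * np.pi * x / 32)[None, :] + 0.05 * rng.standard_normal((128, 128))
k_bins, spec = radial_power_spectrum(surface, dx=1.0)
print("wavenumber of the strongest spectral peak:", round(float(k_bins[int(np.argmax(spec))]), 3))
# expected to be close to 1/32 ~= 0.031 cycles per sample (the imposed undulation)
```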

  7. Does physical activity moderate the relationship between depression symptomatology and low back pain? Cohort and co-twin control analyses nested in the longitudinal study of aging Danish twins (LSADT).

    PubMed

    Hübscher, Markus; Hartvigsen, Jan; Fernandez, Matthew; Christensen, Kaare; Ferreira, Paulo

    2016-04-01

    To investigate whether depression symptomatology is associated with low back pain (LBP) in twins aged 70+ and whether this effect depends on a person's physical activity (PA) status. This prospective cohort and nested case-control study used a nationally representative sample of twins. Data on depression symptomatology (modified Cambridge Mental Disorders Examination) and self-reported PA were obtained from the Longitudinal Study of Aging Danish Twins using twins without LBP at baseline. Associations between depression symptomatology (highest quartile) at baseline and LBP two years later were investigated using logistic regression analyses adjusted for sex. To examine the moderating effect of PA, we tested its interaction with depression. Associations were analysed using the complete sample of 2446 twins and a matched case-control analysis of 97 twin pairs discordant for LBP at follow-up. Odds ratios (OR) with 95% confidence intervals (CI) were calculated. Using the whole sample, high depression scores were associated with an increased probability of LBP (OR 1.56, 95% CI 1.22-1.99, P ≤ 0.01). There was no statistically significant interaction of light PA and depression symptomatology (OR 0.78, 95% CI 0.46-1.35, P = 0.39) and strenuous PA and depression symptomatology (0.84, 95% CI 0.50-1.41, P = 0.51). The case-control analysis showed similar ORs, although statistically insignificant. High depression symptomatology predicted incident LBP. This effect is supposedly not attributable to genetic or shared environmental factors. Physical activity did not moderate the effect of depression symptomatology on LBP.
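
    The moderation test described above amounts to a logistic regression with a depression-by-activity interaction term. A minimal sketch on synthetic data (variable names and effect sizes are invented) using statsmodels:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic cohort, purely illustrative: baseline depression, physical activity, LBP at follow-up.
rng = np.random.default_rng(42)
n = 2000
depression = rng.integers(0, 2, n)          # 1 = highest depression-score quartile
activity   = rng.integers(0, 2, n)          # 1 = regular physical activity
logit_p = -1.2 + 0.45 * depression - 0.10 * activity - 0.05 * depression * activity
lbp = rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))
df = pd.DataFrame({"lbp": lbp.astype(int), "depression": depression, "activity": activity})

# Main effects plus the depression x activity interaction (the moderation term tested in the study).
model = smf.logit("lbp ~ depression * activity", data=df).fit(disp=False)
params, conf = model.params, model.conf_int()
for term in ["depression", "activity", "depression:activity"]:
    or_, lo, hi = np.exp(params[term]), np.exp(conf.loc[term, 0]), np.exp(conf.loc[term, 1])
    print(f"{term:>22s}: OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```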

  8. 3-D microstructure of olivine in complex geological materials reconstructed by correlative X-ray μ-CT and EBSD analyses.

    PubMed

    Kahl, W-A; Dilissen, N; Hidas, K; Garrido, C J; López-Sánchez-Vizcaíno, V; Román-Alpiste, M J

    2017-11-01

    We reconstruct the 3-D microstructure of centimetre-sized olivine crystals in rocks from the Almirez ultramafic massif (SE Spain) using combined X-ray micro computed tomography (μ-CT) and electron backscatter diffraction (EBSD). The semidestructive sample treatment involves geographically oriented drill pressing of rocks and preparation of oriented thin sections for EBSD from the μ-CT scanned cores. The μ-CT results show that the mean intercept length (MIL) analyses provide reliable information on the shape preferred orientation (SPO) of texturally different olivine groups. We show that statistical interpretation of crystal preferred orientation (CPO) and SPO of olivine becomes feasible because the highest densities of the distribution of main olivine crystal axes from EBSD are aligned with the three axes of the 3-D ellipsoid calculated from the MIL analyses from μ-CT. From EBSD data we distinguish multiple CPO groups and by locating the thin sections within the μ-CT volume, we assign SPO to the corresponding olivine crystal aggregates, which confirm the results of statistical comparison. We demonstrate that the limitations of both methods (i.e. no crystal orientation data in μ-CT and no spatial information in EBSD) can be overcome, and the 3-D orientation of the crystallographic axes of olivines from different orientation groups can be successfully correlated with the crystal shapes of representative olivine grains. Through this approach one can establish the link among geological structures, macrostructure, fabric and 3-D SPO-CPO relationship at the hand specimen scale even in complex, coarse-grained geomaterials. © 2017 The Authors Journal of Microscopy © 2017 Royal Microscopical Society.

  9. A New Scoring System to Predict the Risk for High-risk Adenoma and Comparison of Existing Risk Calculators.

    PubMed

    Murchie, Brent; Tandon, Kanwarpreet; Hakim, Seifeldin; Shah, Kinchit; O'Rourke, Colin; Castro, Fernando J

    2017-04-01

    Colorectal cancer (CRC) screening guidelines likely over-generalize CRC risk, 35% of Americans are not up to date with screening, and there is growing incidence of CRC in younger patients. We developed a practical prediction model for high-risk colon adenomas in an average-risk population, including an expanded definition of high-risk polyps (≥3 nonadvanced adenomas), to identify patients at higher than average risk. We also compared results with previously created calculators. Patients aged 40 to 59 years, undergoing first-time average-risk screening or diagnostic colonoscopies were evaluated. Risk calculators for advanced adenomas and high-risk adenomas were created based on age, body mass index, sex, race, and smoking history. Previously established calculators with similar risk factors were selected for comparison of concordance statistic (c-statistic) and external validation. A total of 5063 patients were included. Advanced adenomas, and high-risk adenomas were seen in 5.7% and 7.4% of the patient population, respectively. The c-statistic for our calculator was 0.639 for the prediction of advanced adenomas, and 0.650 for high-risk adenomas. When applied to our population, all previous models had lower c-statistic results although one performed similarly. Our model compares favorably to previously established prediction models. Age and body mass index were used as continuous variables, likely improving the c-statistic. It also reports absolute predictive probabilities of advanced and high-risk polyps, allowing for more individualized risk assessment of CRC.
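
    The c-statistic used to compare the calculators is the area under the ROC curve of the predicted probabilities. A minimal sketch on a synthetic screening cohort (all predictor effects invented) with scikit-learn:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic screening cohort, illustrative only: age, BMI, sex, smoking history -> high-risk adenoma.
rng = np.random.default_rng(7)
n = 5000
age   = rng.uniform(40, 60, n)
bmi   = rng.normal(27, 4, n)
male  = rng.integers(0, 2, n)
smoke = rng.integers(0, 2, n)
risk  = -6.0 + 0.05 * age + 0.05 * bmi + 0.4 * male + 0.5 * smoke
y     = rng.random(n) < 1.0 / (1.0 + np.exp(-risk))
X     = np.column_stack([age, bmi, male, smoke])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

# The c-statistic is the area under the ROC curve of the predicted probabilities.
c_stat = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(f"c-statistic on held-out data: {c_stat:.3f}")
```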

  10. Research of Extension of the Life Cycle of Helicopter Rotor Blade in Hungary

    DTIC Science & Technology

    2003-02-01

    Radiography (DXR), and (iii) Vibration Diagnostics (VD) with Statistical Energy Analysis (SEA) were semi-simultaneously applied [1]. The used three...2.2. Vibration Diagnostics (VD) Parallel to the NDT measurements the Statistical Energy Analysis (SEA) as a vibration diagnostic tool was...noises were analysed with a dual-channel real time frequency analyser (BK2035). In addition to the Statistical Energy Analysis measurement a small

  11. A systematic review of the quality of statistical methods employed for analysing quality of life data in cancer randomised controlled trials.

    PubMed

    Hamel, Jean-Francois; Saulnier, Patrick; Pe, Madeline; Zikos, Efstathios; Musoro, Jammbe; Coens, Corneel; Bottomley, Andrew

    2017-09-01

    Over the last decades, Health-related Quality of Life (HRQoL) end-points have become an important outcome of the randomised controlled trials (RCTs). HRQoL methodology in RCTs has improved following international consensus recommendations. However, no international recommendations exist concerning the statistical analysis of such data. The aim of our study was to identify and characterise the quality of the statistical methods commonly used for analysing HRQoL data in cancer RCTs. Building on our recently published systematic review, we analysed a total of 33 published RCTs studying the HRQoL methods reported in RCTs since 1991. We focussed on the ability of the methods to deal with the three major problems commonly encountered when analysing HRQoL data: their multidimensional and longitudinal structure and the commonly high rate of missing data. All studies reported HRQoL being assessed repeatedly over time for a period ranging from 2 to 36 months. Missing data were common, with compliance rates ranging from 45% to 90%. From the 33 studies considered, 12 different statistical methods were identified. Twenty-nine studies analysed each of the questionnaire sub-dimensions without type I error adjustment. Thirteen studies repeated the HRQoL analysis at each assessment time again without type I error adjustment. Only 8 studies used methods suitable for repeated measurements. Our findings show a lack of consistency in statistical methods for analysing HRQoL data. Problems related to multiple comparisons were rarely considered leading to a high risk of false positive results. It is therefore critical that international recommendations for improving such statistical practices are developed. Copyright © 2017. Published by Elsevier Ltd.
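
    The type I error adjustment whose absence the review criticises can be applied in one line; the sketch below adjusts a set of hypothetical sub-dimension p-values with Bonferroni and Holm corrections using statsmodels.

```python
import numpy as np
from statsmodels.stats.multitest import multipletests

# Hypothetical raw p-values from testing 15 HRQoL sub-dimensions between two trial arms.
raw_p = np.array([0.003, 0.012, 0.018, 0.030, 0.041, 0.049, 0.060, 0.110,
                  0.150, 0.220, 0.340, 0.410, 0.550, 0.720, 0.900])

for method in ("bonferroni", "holm"):
    reject, p_adj, _, _ = multipletests(raw_p, alpha=0.05, method=method)
    print(f"{method:>10s}: {reject.sum()} of {len(raw_p)} sub-dimensions remain significant")
```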

  12. Sunspot activity and influenza pandemics: a statistical assessment of the purported association.

    PubMed

    Towers, S

    2017-10-01

    Since 1978, a series of papers in the literature have claimed to find a significant association between sunspot activity and the timing of influenza pandemics. This paper examines these analyses, and attempts to recreate the three most recent statistical analyses by Ertel (1994), Tapping et al. (2001), and Yeung (2006), which all have purported to find a significant relationship between sunspot numbers and pandemic influenza. As will be discussed, each analysis had errors in the data. In addition, in each analysis arbitrary selections or assumptions were also made, and the authors did not assess the robustness of their analyses to changes in those arbitrary assumptions. Varying the arbitrary assumptions to other, equally valid, assumptions negates the claims of significance. Indeed, an arbitrary selection made in one of the analyses appears to have resulted in almost maximal apparent significance; changing it only slightly yields a null result. This analysis applies statistically rigorous methodology to examine the purported sunspot/pandemic link, using more statistically powerful un-binned analysis methods, rather than relying on arbitrarily binned data. The analyses are repeated using both the Wolf and Group sunspot numbers. In all cases, no statistically significant evidence of any association was found. However, while the focus in this particular analysis was on the purported relationship of influenza pandemics to sunspot activity, the faults found in the past analyses are common pitfalls; inattention to analysis reproducibility and robustness assessment are common problems in the sciences, that are unfortunately not noted often enough in review.

  13. Time Series Analysis Based on Running Mann Whitney Z Statistics

    USDA-ARS?s Scientific Manuscript database

    A sensitive and objective time series analysis method based on the calculation of Mann Whitney U statistics is described. This method samples data rankings over moving time windows, converts those samples to Mann-Whitney U statistics, and then normalizes the U statistics to Z statistics using Monte-...
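
    A minimal sketch of the running Mann-Whitney idea, assuming the normal approximation for converting U to Z and ignoring tie corrections (details of the published method may differ):

```python
import numpy as np
from scipy.stats import mannwhitneyu

def running_mw_z(series, window):
    """Slide two adjacent windows along the series and convert each Mann-Whitney U to a Z score
    using the normal approximation; positive Z means the later window ranks higher."""
    series = np.asarray(series, float)
    z = []
    for i in range(len(series) - 2 * window + 1):
        a, b = series[i:i + window], series[i + window:i + 2 * window]
        u = mannwhitneyu(b, a, alternative="two-sided").statistic
        mu = window * window / 2.0
        sigma = np.sqrt(window * window * (2 * window + 1) / 12.0)
        z.append((u - mu) / sigma)
    return np.array(z)

# Synthetic series with an upward shift halfway through.
rng = np.random.default_rng(3)
data = np.concatenate([rng.normal(10, 1, 60), rng.normal(12, 1, 60)])
z = running_mw_z(data, window=20)
print("largest |Z| =", round(float(np.abs(z).max()), 2), "at index", int(np.abs(z).argmax()))
```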

  14. Saitohin Q7R polymorphism is associated with late-onset Alzheimer's disease susceptibility among caucasian populations: a meta-analysis.

    PubMed

    Huang, Rong; Tian, Sai; Cai, Rongrong; Sun, Jie; Xia, Wenqing; Dong, Xue; Shen, Yanjue; Wang, Shaohua

    2017-08-01

    Saitohin (STH) Q7R polymorphism has been reported to influence the individual's susceptibility to Alzheimer's disease (AD); however, conclusions remain controversial. Therefore, we performed this meta-analysis to explore the association between STH Q7R polymorphism and AD risk. Systematic literature searches were performed in the PubMed, Embase, Cochrane Library and Web of Science for studies published before 31 August 2016. Pooled odds ratios (ORs) and 95% confidence intervals (CIs) were calculated to assess the strength of the association using a fixed- or random-effects model. Subgroup analyses, Galbraith plot and sensitivity analyses were also performed. All statistical analyses were performed with STATA Version 12.0. A total of 19 case-control studies from 17 publications with 4387 cases and 3972 controls were included in our meta-analysis. The results showed that the Q7R polymorphism was significantly associated with an increased risk of AD in a recessive model (RR versus QQ+QR, OR = 1.27, 95% CI = 1.01-1.60, P = 0.040). After excluding the four studies not carried out in caucasians, the overall association was unchanged in all comparison models. Further subgroup analyses stratified by the time of AD onset, and the quality of included studies provided statistical evidence of significant increased risk of AD in RR versus QQ+QR model only in late-onset subjects (OR = 1.56, 95% CI = 1.07-2.26, P = 0.021) and in studies with high quality (OR = 1.37, 95% CI = 1.01-1.86, P = 0.043). This meta-analysis suggests that the RR genotype in saitohin Q7R polymorphism may be a human-specific risk factor for AD, especially among late-onset AD subjects and caucasian populations. © 2017 The Authors. Journal of Cellular and Molecular Medicine published by John Wiley & Sons Ltd and Foundation for Cellular and Molecular Medicine.

  15. Testing a Coupled Global-limited-area Data Assimilation System using Observations from the 2004 Pacific Typhoon Season

    NASA Astrophysics Data System (ADS)

    Holt, C. R.; Szunyogh, I.; Gyarmati, G.; Hoffman, R. N.; Leidner, M.

    2011-12-01

    Tropical cyclone (TC) track and intensity forecasts have improved in recent years due to increased model resolution, improved data assimilation, and the rapid increase in the number of routinely assimilated observations over oceans. The data assimilation approach that has received the most attention in recent years is Ensemble Kalman Filtering (EnKF). The most attractive feature of the EnKF is that it uses a fully flow-dependent estimate of the error statistics, which can have important benefits for the analysis of rapidly developing TCs. We implement the Local Ensemble Transform Kalman Filter algorithm, a variation of the EnKF, on a reduced-resolution version of the National Centers for Environmental Prediction (NCEP) Global Forecast System (GFS) model and the NCEP Regional Spectral Model (RSM) to build a coupled global-limited area analysis/forecast system. This is the first time, to our knowledge, that such a system is used for the analysis and forecast of tropical cyclones. We use data from summer 2004 to study eight tropical cyclones in the Northwest Pacific. The benchmark data sets that we use to assess the performance of our system are the NCEP Reanalysis and the NCEP Operational GFS analyses from 2004. These benchmark analyses were both obtained by the Spectral Statistical Interpolation, which was the operational data assimilation system of NCEP in 2004. The GFS Operational analysis assimilated a large number of satellite radiance observations in addition to the observations assimilated in our system. All analyses are verified against the Joint Typhoon Warning Center Best Track data set. The errors are calculated for the position and intensity of the TCs. The global component of the ensemble-based system shows improvement in position analysis over the NCEP Reanalysis, but shows no significant difference from the NCEP operational analysis for most of the storm tracks. The regional component of our system improves position analysis over all the global analyses. The intensity analyses, measured by the minimum sea level pressure, are of similar quality in all of the analyses. Regional deterministic forecasts started from our analyses are generally not significantly different from those started from the GFS operational analysis. On average, the regional experiments performed better for longer than 48 h sea level pressure forecasts, while the global forecast performed better in predicting the position for longer than 48 h.

  16. Process air quality data

    NASA Technical Reports Server (NTRS)

    Butler, C. M.; Hogge, J. E.

    1978-01-01

    Air quality sampling was conducted. Data for air quality parameters, recorded on written forms, punched cards or magnetic tape, are available for 1972 through 1975. Computer software was developed to (1) calculate several daily statistical measures of location, (2) plot time histories of data or the calculated daily statistics, (3) calculate simple correlation coefficients, and (4) plot scatter diagrams. Computer software was developed for processing air quality data to include time series analysis and goodness of fit tests. Computer software was developed to (1) calculate a larger number of daily statistical measures of location, and a number of daily monthly and yearly measures of location, dispersion, skewness and kurtosis, (2) decompose the extended time series model and (3) perform some goodness of fit tests. The computer program is described, documented and illustrated by examples. Recommendations are made for continuation of the development of research on processing air quality data.
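
    The daily statistical measures of location and the simple correlations described above map directly onto a pandas workflow; the sketch below uses synthetic hourly readings in place of the archived 1972-1975 data.

```python
import numpy as np
import pandas as pd

# Synthetic hourly readings for two pollutants over one month -- stand-ins for the archived data.
rng = np.random.default_rng(11)
idx = pd.date_range("1974-06-01", periods=30 * 24, freq="h")
ozone = 40 + 15 * np.sin(2 * np.pi * (idx.hour - 14) / 24) + rng.normal(0, 5, len(idx))
no2   = 25 +  8 * np.sin(2 * np.pi * (idx.hour - 8) / 24) + rng.normal(0, 4, len(idx))
df = pd.DataFrame({"ozone": ozone, "no2": no2}, index=idx)

# Daily statistical measures of location and dispersion, plus a simple correlation matrix.
daily = df.resample("D").agg(["mean", "median", "max", "std"])
print(daily.head(3).round(1))
print("\nhourly ozone/NO2 correlation:")
print(df.corr().round(2))
```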

  17. High-Density Signal Interface Electromagnetic Radiation Prediction for Electromagnetic Compatibility Evaluation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Halligan, Matthew

    Radiated power calculation approaches for practical scenarios of incomplete high-density interface characterization information and incomplete incident power information are presented. The suggested approaches build upon a method that characterizes power losses through the definition of power loss constant matrices. Potential radiated power estimates include using total power loss information, partial radiated power loss information, worst case analysis, and statistical bounding analysis. A method is also proposed to calculate radiated power when incident power information is not fully known for non-periodic signals at the interface. Incident data signals are modeled from a two-state Markov chain where bit state probabilities are derived. The total spectrum for windowed signals is postulated as the superposition of spectra from individual pulses in a data sequence. Statistical bounding methods are proposed as a basis for the radiated power calculation because of the statistical complexity of calculating a radiated power probability density function.
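
    The two-state Markov chain used to model the incident data bits has a closed-form stationary distribution; the sketch below computes it and checks it by simulation. The transition probabilities are invented for illustration and are not from the report.

```python
import numpy as np

# Two-state Markov chain for the data bits: p01 = P(next bit 1 | current 0), p10 = P(next bit 0 | current 1).
p01, p10 = 0.3, 0.2
P = np.array([[1 - p01, p01],
              [p10, 1 - p10]])

# Analytical stationary bit-state probability for a two-state chain.
pi1 = p01 / (p01 + p10)
print(f"stationary P(bit = 1) = {pi1:.3f}")

# Monte Carlo check: simulate a long bit stream and compare the empirical occupancy.
rng = np.random.default_rng(0)
bits = np.empty(200_000, dtype=int)
bits[0] = 0
for i in range(1, len(bits)):
    bits[i] = rng.random() < P[bits[i - 1], 1]
print(f"simulated  P(bit = 1) = {bits.mean():.3f}")
```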

  18. A test of safety, violence prevention, and civility climate domain-specific relationships with relevant workplace hazards.

    PubMed

    Gazica, Michele W; Spector, Paul E

    2016-01-01

    Safety climate, violence prevention climate, and civility climate were independently developed and linked to domain-specific workplace hazards, although all three were designed to promote the physical and psychological safety of workers. To test domain specificity between conceptually related workplace climates and relevant workplace hazards. Data were collected from 368 persons employed in various industries and descriptive statistics were calculated for all study variables. Correlational and relative weights analyses were used to test for domain specificity. The three climate domains were similarly predictive of most workplace hazards, regardless of domain specificity. This study suggests that the three climate domains share a common higher order construct that may predict relevant workplace hazards better than any of the scales alone.

  19. Validation and Sensitivity Analysis of a New Atmosphere-Soil-Vegetation Model.

    NASA Astrophysics Data System (ADS)

    Nagai, Haruyasu

    2002-02-01

    This paper describes details, validation, and sensitivity analysis of a new atmosphere-soil-vegetation model. The model consists of one-dimensional multilayer submodels for atmosphere, soil, and vegetation and radiation schemes for the transmission of solar and longwave radiations in canopy. The atmosphere submodel solves prognostic equations for horizontal wind components, potential temperature, specific humidity, fog water, and turbulence statistics by using a second-order closure model. The soil submodel calculates the transport of heat, liquid water, and water vapor. The vegetation submodel evaluates the heat and water budget on leaf surface and the downward liquid water flux. The model performance was tested by using measured data of the Cooperative Atmosphere-Surface Exchange Study (CASES). Calculated ground surface fluxes were mainly compared with observations at a winter wheat field, concerning the diurnal variation and change in 32 days of the first CASES field program in 1997, CASES-97. The measured surface fluxes did not satisfy the energy balance, so sensible and latent heat fluxes obtained by the eddy correlation method were corrected. By using options of the solar radiation scheme, which addresses the effect of the direct solar radiation component, calculated albedo agreed well with the observations. Some sensitivity analyses were also done for model settings. Model calculations of surface fluxes and surface temperature were in good agreement with measurements as a whole.

  20. Validation of Cross Sections for Monte Carlo Simulation of the Photoelectric Effect

    NASA Astrophysics Data System (ADS)

    Han, Min Cheol; Kim, Han Sung; Pia, Maria Grazia; Basaglia, Tullio; Batič, Matej; Hoff, Gabriela; Kim, Chan Hyeong; Saracco, Paolo

    2016-04-01

    Several total and partial photoionization cross section calculations, based on both theoretical and empirical approaches, are quantitatively evaluated with statistical analyses using a large collection of experimental data retrieved from the literature to identify the state of the art for modeling the photoelectric effect in Monte Carlo particle transport. Some of the examined cross section models are available in general purpose Monte Carlo systems, while others have been implemented and subjected to validation tests for the first time to estimate whether they could improve the accuracy of particle transport codes. The validation process identifies Scofield's 1973 non-relativistic calculations, tabulated in the Evaluated Photon Data Library (EPDL), as the one best reproducing experimental measurements of total cross sections. Specialized total cross section models, some of which derive from more recent calculations, do not provide significant improvements. Scofield's non-relativistic calculations are not surpassed regarding the compatibility with experiment of K and L shell photoionization cross sections either, although in a few test cases Ebel's parameterization produces more accurate results close to absorption edges. Modifications to Biggs and Lighthill's parameterization implemented in Geant4 significantly reduce the accuracy of total cross sections at low energies with respect to its original formulation. The scarcity of suitable experimental data hinders a similar extensive analysis for the simulation of the photoelectron angular distribution, which is limited to a qualitative appraisal.

  1. A modified method of 3D-SSP analysis for amyloid PET imaging using [¹¹C]BF-227.

    PubMed

    Kaneta, Tomohiro; Okamura, Nobuyuki; Minoshima, Satoshi; Furukawa, Katsutoshi; Tashiro, Manabu; Furumoto, Shozo; Iwata, Ren; Fukuda, Hiroshi; Takahashi, Shoki; Yanai, Kazuhiko; Kudo, Yukitsuka; Arai, Hiroyuki

    2011-12-01

    Three-dimensional stereotactic surface projection (3D-SSP) analyses have been widely used in dementia imaging studies. However, 3D-SSP sometimes shows paradoxical results on amyloid positron emission tomography (PET) analyses. This is thought to be caused by errors in anatomical standardization (AS) based on an (18)F-fluorodeoxyglucose (FDG) template. We developed a new method of 3D-SSP analysis for amyloid PET imaging, and used it to analyze (11)C-labeled 2-(2-[2-dimethylaminothiazol-5-yl]ethenyl)-6-(2-[fluoro]ethoxy)benzoxazole (BF-227) PET images of subjects with mild cognitive impairment (MCI) and Alzheimer's disease (AD). The subjects were 20 with MCI, 19 patients with AD, and 17 healthy controls. Twelve subjects with MCI were followed up for 3 years or more, and conversion to AD was seen in 6 cases. All subjects underwent PET with both FDG and BF-227. For AS and 3D-SSP analyses of PET data, Neurostat (University of Washington, WA, USA) was used. Method 1 involves AS for BF-227 images using an FDG template. In this study, we developed a new method (Method 2) for AS: First, an FDG image was subjected to AS using an FDG template. Then, the BF-227 image of the same patient was registered to the FDG image, and AS was performed using the transformation parameters calculated for AS of the corresponding FDG images. Regional values were normalized by the average value obtained at the cerebellum and values were calculated for the frontal, parietal, temporal, and occipital lobes. For statistical comparison of the 3 groups, we applied one-way analysis of variance followed by the Bonferroni post hoc test. For statistical comparison between converters and non-converters, the t test was applied. Statistical significance was defined as p < 0.05. Among the 56 cases we studied, Method 1 demonstrated slight distortions after AS of the image in 16 cases and heavy distortions in 4 cases in which the distortions were not observed with Method 2. Both methods demonstrated that the values in AD and MCI patients were significantly higher than those in the controls, in the parietal, temporal, and occipital lobes. However, only Method 2 showed significant differences in the frontal lobes. In addition, Method 2 could demonstrate a significantly higher value in MCI-to-AD converters in the parietal and frontal lobes. Method 2 corrects AS errors that often occur when using Method 1, and has made appropriate 3D-SSP analysis of amyloid PET imaging possible. This new method of 3D-SSP analysis for BF-227 PET could prove useful for detecting differences between normal groups and AD and MCI groups, and between converters and non-converters.

  2. Comparative multivariate analyses of transient otoacoustic emissions and distorsion products in normal and impaired hearing.

    PubMed

    Stamate, Mirela Cristina; Todor, Nicolae; Cosgarea, Marcel

    2015-01-01

    The clinical utility of otoacoustic emissions as a noninvasive objective test of cochlear function has been long studied. Both transient otoacoustic emissions and distorsion products can be used to identify hearing loss, but to what extent they can be used as predictors for hearing loss is still debated. Most studies agree that multivariate analyses have better test performances than univariate analyses. The aim of the study was to determine transient otoacoustic emissions and distorsion products performance in identifying normal and impaired hearing loss, using the pure tone audiogram as a gold standard procedure and different multivariate statistical approaches. The study included 105 adult subjects with normal hearing and hearing loss who underwent the same test battery: pure-tone audiometry, tympanometry, otoacoustic emission tests. We chose to use the logistic regression as a multivariate statistical technique. Three logistic regression models were developed to characterize the relations between different risk factors (age, sex, tinnitus, demographic features, cochlear status defined by otoacoustic emissions) and hearing status defined by pure-tone audiometry. The multivariate analyses allow the calculation of the logistic score, which is a combination of the inputs, weighted by coefficients, calculated within the analyses. The accuracy of each model was assessed using receiver operating characteristics curve analysis. We used the logistic score to generate receivers operating curves and to estimate the areas under the curves in order to compare different multivariate analyses. We compared the performance of each otoacoustic emission (transient, distorsion product) using three different multivariate analyses for each ear, when multi-frequency gold standards were used. We demonstrated that all multivariate analyses provided high values of the area under the curve proving the performance of the otoacoustic emissions. Each otoacoustic emission test presented high values of area under the curve, suggesting that implementing a multivariate approach to evaluate the performances of each otoacoustic emission test would serve to increase the accuracy in identifying the normal and impaired ears. We encountered the highest area under the curve value for the combined multivariate analysis suggesting that both otoacoustic emission tests should be used in assessing hearing status. Our multivariate analyses revealed that age is a constant predictor factor of the auditory status for both ears, but the presence of tinnitus was the most important predictor for the hearing level, only for the left ear. Age presented similar coefficients, but tinnitus coefficients, by their high value, produced the highest variations of the logistic scores, only for the left ear group, thus increasing the risk of hearing loss. We did not find gender differences between ears for any otoacoustic emission tests, but studies still debate this question as the results are contradictory. Neither gender, nor environment origin had any predictive value for the hearing status, according to the results of our study. Like any other audiological test, using otoacoustic emissions to identify hearing loss is not without error. Even when applying multivariate analysis, perfect test performance is never achieved. 
Although most studies have demonstrated the benefit of multivariate analysis, it has not been incorporated into clinical decision making, perhaps because of the idiosyncratic nature of multivariate solutions or because of the lack of validation studies.

  3. Comparative multivariate analyses of transient otoacoustic emissions and distorsion products in normal and impaired hearing

    PubMed Central

    STAMATE, MIRELA CRISTINA; TODOR, NICOLAE; COSGAREA, MARCEL

    2015-01-01

    Background and aim The clinical utility of otoacoustic emissions as a noninvasive objective test of cochlear function has been long studied. Both transient otoacoustic emissions and distorsion products can be used to identify hearing loss, but to what extent they can be used as predictors for hearing loss is still debated. Most studies agree that multivariate analyses have better test performances than univariate analyses. The aim of the study was to determine transient otoacoustic emissions and distorsion products performance in identifying normal and impaired hearing loss, using the pure tone audiogram as a gold standard procedure and different multivariate statistical approaches. Methods The study included 105 adult subjects with normal hearing and hearing loss who underwent the same test battery: pure-tone audiometry, tympanometry, otoacoustic emission tests. We chose to use the logistic regression as a multivariate statistical technique. Three logistic regression models were developed to characterize the relations between different risk factors (age, sex, tinnitus, demographic features, cochlear status defined by otoacoustic emissions) and hearing status defined by pure-tone audiometry. The multivariate analyses allow the calculation of the logistic score, which is a combination of the inputs, weighted by coefficients, calculated within the analyses. The accuracy of each model was assessed using receiver operating characteristics curve analysis. We used the logistic score to generate receivers operating curves and to estimate the areas under the curves in order to compare different multivariate analyses. Results We compared the performance of each otoacoustic emission (transient, distorsion product) using three different multivariate analyses for each ear, when multi-frequency gold standards were used. We demonstrated that all multivariate analyses provided high values of the area under the curve proving the performance of the otoacoustic emissions. Each otoacoustic emission test presented high values of area under the curve, suggesting that implementing a multivariate approach to evaluate the performances of each otoacoustic emission test would serve to increase the accuracy in identifying the normal and impaired ears. We encountered the highest area under the curve value for the combined multivariate analysis suggesting that both otoacoustic emission tests should be used in assessing hearing status. Our multivariate analyses revealed that age is a constant predictor factor of the auditory status for both ears, but the presence of tinnitus was the most important predictor for the hearing level, only for the left ear. Age presented similar coefficients, but tinnitus coefficients, by their high value, produced the highest variations of the logistic scores, only for the left ear group, thus increasing the risk of hearing loss. We did not find gender differences between ears for any otoacoustic emission tests, but studies still debate this question as the results are contradictory. Neither gender, nor environment origin had any predictive value for the hearing status, according to the results of our study. Conclusion Like any other audiological test, using otoacoustic emissions to identify hearing loss is not without error. Even when applying multivariate analysis, perfect test performance is never achieved. 
Although most studies have demonstrated the benefit of multivariate analysis, it has not been incorporated into clinical decision making, perhaps because of the idiosyncratic nature of multivariate solutions or because of the lack of validation studies. PMID:26733749

  4. Early rheumatoid arthritis 6 years after diagnosis is still associated with high direct costs and increasing loss of productivity: the Swedish TIRA project.

    PubMed

    Hallert, E; Husberg, M; Kalkan, A; Skogh, T; Bernfort, L

    2014-01-01

    To calculate total costs over 6 years after diagnosis of early rheumatoid arthritis (RA). In the longitudinal prospective multicentre TIRA study, 239 patients from seven units, diagnosed in 1996-98, reported regularly on health-care utilization and the number of days lost from work. Costs were obtained from official databases and calculated using unit costs (Swedish kronor, SEK) from 2001. Indirect costs were calculated using the human capital approach (HCA). Costs were inflation adjusted to Euro June 2012, using the Swedish Consumer Price Index and the exchange rate of June 2012. Statistical analyses were based on linear mixed models (LMMs) for changes over time. The mean total cost per patient was EUR 14,768 in year 1, increasing to EUR 18,438 in year 6. Outpatient visits and hospitalization decreased but costs for surgery increased from EUR 92/patient in year 1 to EUR 444/patient in year 6. Drug costs increased from EUR 429/patient to EUR 2214/patient, mainly because of the introduction of biologics. In year 1, drugs made up for 10% of direct costs, and increased to 49% in year 6. Sick leave decreased during the first years but disability pensions increased, resulting in unchanged indirect costs. Over the following years, disability pensions increased further and indirect costs increased from EUR 10,284 in year 1 to EUR 13,874 in year 6. LMM analyses showed that indirect costs were unchanged whereas direct costs, after an initial fall, increased over the following years, leading to increasing total costs. In the 6 years after diagnosis of early RA, drug costs were partially offset by decreasing outpatient visits but indirect costs remained unchanged and total costs increased.

  5. Group Analysis in MNE-Python of Evoked Responses from a Tactile Stimulation Paradigm: A Pipeline for Reproducibility at Every Step of Processing, Going from Individual Sensor Space Representations to an across-Group Source Space Representation

    PubMed Central

    Andersen, Lau M.

    2018-01-01

    An important aim of an analysis pipeline for magnetoencephalographic data is to allow the researcher to spend maximal effort on the statistical comparisons that will answer the research questions, while spending minimal effort on the intricacies and machinery of the pipeline. I here present a set of functions and scripts that allow for setting up a clear, reproducible structure for separating raw and processed data into folders and files such that minimal effort can be spent on: (1) double-checking that the right input goes into the right functions; (2) making sure that output and intermediate steps can be accessed meaningfully; (3) applying operations efficiently across groups of subjects; (4) re-processing data if changes to any intermediate step are desirable. Applying the scripts requires only general knowledge about the Python language. The data analyses are neural responses to tactile stimulations of the right index finger in a group of 20 healthy participants acquired from an Elekta Neuromag System. Two analyses are presented: going from individual sensor space representations to, respectively, an across-group sensor space representation and an across-group source space representation. The processing steps covered for the first analysis are filtering the raw data, finding events of interest in the data, epoching data, finding and removing independent components related to eye blinks and heart beats, calculating participants' individual evoked responses by averaging over epoched data and calculating a grand average sensor space representation over participants. The second analysis starts from the participants' individual evoked responses and covers: estimating noise covariance, creating a forward model, creating an inverse operator, estimating distributed source activity on the cortical surface using a minimum norm procedure, morphing those estimates onto a common cortical template and calculating the patterns of activity that are statistically different from baseline. To estimate source activity, processing of the anatomy of subjects based on magnetic resonance imaging is necessary. The necessary steps are covered here: importing magnetic resonance images, segmenting the brain, estimating boundaries between different tissue layers, making fine-resolution scalp surfaces for facilitating co-registration, creating source spaces and creating volume conductors for each subject. PMID:29403349
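
    A condensed sketch of the first (sensor-space) analysis using the MNE-Python API is shown below; the file name, trigger channel, event code, and ICA settings are assumptions, and the snippet presumes EOG and ECG channels were recorded, so it is a schematic of the pipeline rather than the published scripts.

```python
import mne

# Hypothetical file name and event code; the study used tactile stimulation of the right index finger.
raw = mne.io.read_raw_fif("sub01_tactile_raw.fif", preload=True)
raw.filter(l_freq=None, h_freq=40.0)                      # low-pass the continuous data

events = mne.find_events(raw, stim_channel="STI101")      # trigger channel name is an assumption
epochs = mne.Epochs(raw, events, event_id={"tactile": 1},
                    tmin=-0.2, tmax=0.5, baseline=(None, 0), preload=True)

# Remove eye-blink and heartbeat components with ICA before averaging.
ica = mne.preprocessing.ICA(n_components=0.95, random_state=97)
ica.fit(epochs)
eog_idx, _ = ica.find_bads_eog(epochs)
ecg_idx, _ = ica.find_bads_ecg(epochs)
ica.exclude = eog_idx + ecg_idx
ica.apply(epochs)

evoked = epochs.average()                                  # one subject's evoked response
evoked.save("sub01_tactile-ave.fif", overwrite=True)
# Across subjects, the grand average sensor-space representation is then simply:
# grand = mne.grand_average(list_of_evoked_objects)
```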

  6. Group Analysis in MNE-Python of Evoked Responses from a Tactile Stimulation Paradigm: A Pipeline for Reproducibility at Every Step of Processing, Going from Individual Sensor Space Representations to an across-Group Source Space Representation.

    PubMed

    Andersen, Lau M

    2018-01-01

    An important aim of an analysis pipeline for magnetoencephalographic data is to allow the researcher to spend maximal effort on the statistical comparisons that will answer the research questions, while spending minimal effort on the intricacies and machinery of the pipeline. I here present a set of functions and scripts that allow for setting up a clear, reproducible structure for separating raw and processed data into folders and files such that minimal effort can be spent on: (1) double-checking that the right input goes into the right functions; (2) making sure that output and intermediate steps can be accessed meaningfully; (3) applying operations efficiently across groups of subjects; (4) re-processing data if changes to any intermediate step are desirable. Applying the scripts requires only general knowledge about the Python language. The data analyses are neural responses to tactile stimulations of the right index finger in a group of 20 healthy participants acquired from an Elekta Neuromag System. Two analyses are presented: going from individual sensor space representations to, respectively, an across-group sensor space representation and an across-group source space representation. The processing steps covered for the first analysis are filtering the raw data, finding events of interest in the data, epoching data, finding and removing independent components related to eye blinks and heart beats, calculating participants' individual evoked responses by averaging over epoched data and calculating a grand average sensor space representation over participants. The second analysis starts from the participants' individual evoked responses and covers: estimating noise covariance, creating a forward model, creating an inverse operator, estimating distributed source activity on the cortical surface using a minimum norm procedure, morphing those estimates onto a common cortical template and calculating the patterns of activity that are statistically different from baseline. To estimate source activity, processing of the anatomy of subjects based on magnetic resonance imaging is necessary. The necessary steps are covered here: importing magnetic resonance images, segmenting the brain, estimating boundaries between different tissue layers, making fine-resolution scalp surfaces for facilitating co-registration, creating source spaces and creating volume conductors for each subject.

  7. Effect of Embolization Material in the Calculation of Dose Deposition in Arteriovenous Malformations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De la Cruz, O. O. Galvan; Moreno-Jimenez, S.; Larraga-Gutierrez, J. M.

    2010-12-07

    In this work, the impact of incorporating high-Z materials (embolization material) into the dose calculation for stereotactic radiosurgery treatment of arteriovenous malformations is studied. A statistical analysis is performed to establish the variables that may affect the dose calculation. To perform the comparison, pencil beam (PB) and Monte Carlo (MC) calculation algorithms were used. The comparison between both dose calculations shows that PB overestimates the deposited dose. The statistical analysis, for the number of patients in the study (20), shows that the variable that may affect the dose calculation is the volume of the high-Z material in the arteriovenous malformation. Further studies have to be done to establish the clinical impact on the radiosurgery result.

  8. Proper assessment of the JFK assassination bullet lead evidence from metallurgical and statistical perspectives.

    PubMed

    Randich, Erik; Grant, Patrick M

    2006-07-01

    The bullet evidence in the JFK assassination investigation was reexamined from metallurgical and statistical standpoints. The questioned specimens are comprised of soft lead, possibly from full-metal-jacketed Mannlicher-Carcano (MC), 6.5-mm ammunition. During lead refining, contaminant elements are removed to specified levels for a desired alloy or composition. Microsegregation of trace and minor elements during lead casting and processing can account for the experimental variabilities measured in various evidentiary and comparison samples by laboratory analysts. Thus, elevated concentrations of antimony and copper at crystallographic grain boundaries, the widely varying sizes of grains in MC bullet lead, and the 5-60 mg bullet samples analyzed for assassination intelligence effectively resulted in operational sampling error for the analyses. This deficiency was not considered in the original data interpretation and resulted in an invalid conclusion in favor of the single-bullet theory of the assassination. Alternate statistical calculations, based on the historic analytical data, incorporating weighted averaging and propagation of experimental uncertainties also considerably weaken support for the single-bullet theory. In effect, this assessment of the material composition of the lead specimens from the assassination concludes that the extant evidence is consistent with any number between two and five rounds fired in Dealey Plaza during the shooting.
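    The "weighted averaging and propagation of experimental uncertainties" mentioned above can be illustrated with a short inverse-variance sketch. The antimony concentrations and uncertainties below are hypothetical illustrations, not the historic assassination data.

```python
import numpy as np

# Hypothetical antimony concentrations (ppm) and 1-sigma uncertainties
x = np.array([833.0, 797.0, 602.0])
s = np.array([9.0, 7.0, 55.0])

w = 1.0 / s**2                          # inverse-variance weights
mean = np.sum(w * x) / np.sum(w)        # weighted mean
sigma = np.sqrt(1.0 / np.sum(w))        # propagated uncertainty of the mean
print(f"weighted mean = {mean:.1f} +/- {sigma:.1f} ppm")
```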

  9. Proper Assessment of the JFK Assassination Bullet Lead Evidence from Metallurgical and Statistical Perspectives

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Randich, E; Grant, P M

    2006-08-29

    The bullet evidence in the JFK assassination investigation was reexamined from metallurgical and statistical standpoints. The questioned specimens are comprised of soft lead, possibly from full-metal-jacketed Mannlicher-Carcano, 6.5-mm ammunition. During lead refining, contaminant elements are removed to specified levels for a desired alloy or composition. Microsegregation of trace and minor elements during lead casting and processing can account for the experimental variabilities measured in various evidentiary and comparison samples by laboratory analysts. Thus, elevated concentrations of antimony and copper at crystallographic grain boundaries, the widely varying sizes of grains in Mannlicher-Carcano bullet lead, and the 5-60 mg bullet samples analyzed for assassination intelligence effectively resulted in operational sampling error for the analyses. This deficiency was not considered in the original data interpretation and resulted in an invalid conclusion in favor of the single-bullet theory of the assassination. Alternate statistical calculations, based on the historic analytical data, incorporating weighted averaging and propagation of experimental uncertainties also considerably weaken support for the single-bullet theory. In effect, this assessment of the material composition of the lead specimens from the assassination concludes that the extant evidence is consistent with any number between two and five rounds fired in Dealey Plaza during the shooting.

  10. What can 35 years and over 700,000 measurements tell us about noise exposure in the mining industry?

    PubMed

    Roberts, Benjamin; Sun, Kan; Neitzel, Richard L

    2017-01-01

    To analyse over 700,000 cross-sectional measurements from the Mine Safety and Health Administration (MSHA) and develop statistical models to predict noise exposure for a worker. Descriptive statistics were used to summarise the data. Two linear regression models were used to predict noise exposure based on the MSHA-permissible exposure limit (PEL) and action level (AL), respectively. Twofold cross validation was used to compare the exposure estimates from the models to actual measurements. The mean difference and t-statistic were calculated for each job title to determine whether the model predictions were significantly different from the actual data. Measurements were acquired from MSHA through a Freedom of Information Act request. From 1979 to 2014, noise exposure has decreased. Measurements taken before the implementation of MSHA's revised noise regulation in 2000 were on average 4.5 dBA higher than after the law was implemented. Both models produced exposure predictions that differed from the holdout data by less than 1 dBA. Overall noise levels in mines have been decreasing. However, this decrease has not been uniform across all mining sectors. The exposure predictions from the model will be useful to help predict hearing loss in workers in the mining industry.
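    A minimal sketch of the twofold cross-validation scheme described above, fitting a linear regression and checking how far held-out predictions fall from the measured exposures. The design matrix, coefficients, and noise levels below are synthetic placeholders, not the authors' model specification.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold

# Hypothetical design matrix X (e.g., year, sector, job-title codes) and
# measured full-shift noise exposures y in dBA.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))
y = 85.0 + X @ np.array([-0.5, 1.2, 0.8]) + rng.normal(scale=3.0, size=1000)

kf = KFold(n_splits=2, shuffle=True, random_state=0)
for train, test in kf.split(X):
    model = LinearRegression().fit(X[train], y[train])
    pred = model.predict(X[test])
    print(f"mean difference (pred - actual): {np.mean(pred - y[test]):.2f} dBA")
```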

  11. 78 FR 24336 - Rules of Practice and Procedure; Adjusting Civil Money Penalties for Inflation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-25

    ... courts. \\4\\ The CPI is published by the Department of Labor, Bureau of Statistics, and is available at.... Mathematical Calculation In general, the adjustment calculation required by the Inflation Adjustment Act is... adjusted in 2009. According to the Bureau of Labor Statistics, the CPI for June 1996 and June 2009 was 156...

  12. 40 CFR 91.511 - Suspension and revocation of certificates of conformity.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... many engines as needed so that the CumSum statistic, as calculated in § 91.508(a), falls below the... family, if the manufacturer desires to continue introduction into commerce of a modified version of that... family so that the CumSum statistic, as calculated in § 91.508(a) using the newly assigned FEL if...

  13. Conservative Tests under Satisficing Models of Publication Bias.

    PubMed

    McCrary, Justin; Christensen, Garret; Fanelli, Daniele

    2016-01-01

    Publication bias leads consumers of research to observe a selected sample of statistical estimates calculated by producers of research. We calculate critical values for statistical significance that could help to adjust after the fact for the distortions created by this selection effect, assuming that the only source of publication bias is file drawer bias. These adjusted critical values are easy to calculate and differ from unadjusted critical values by approximately 50%: rather than rejecting a null hypothesis when the t-ratio exceeds 2, the analysis suggests rejecting a null hypothesis when the t-ratio exceeds 3. Samples of published social science research indicate that on average, across research fields, approximately 30% of published t-statistics fall between the standard and adjusted cutoffs.
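    One way to see why the adjusted cutoff lands near 3 is to treat published test statistics, under the null, as draws from a normal distribution truncated at the conventional critical value. The sketch below follows that reasoning under stated assumptions; it is an illustration, not a reproduction of the authors' derivation.

```python
from scipy.stats import norm

alpha = 0.05
z_pub = norm.ppf(1 - alpha / 2)            # ~1.96: assume only |z| > 1.96 gets published
# Require P(|Z| > c  given  |Z| > z_pub) = alpha under the null:
tail = alpha * (2 * norm.sf(z_pub))        # unconditional two-sided tail mass
c = norm.ppf(1 - tail / 2)
print(f"adjusted critical value ~ {c:.2f}")  # roughly 3
```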

  14. Conservative Tests under Satisficing Models of Publication Bias

    PubMed Central

    McCrary, Justin; Christensen, Garret; Fanelli, Daniele

    2016-01-01

    Publication bias leads consumers of research to observe a selected sample of statistical estimates calculated by producers of research. We calculate critical values for statistical significance that could help to adjust after the fact for the distortions created by this selection effect, assuming that the only source of publication bias is file drawer bias. These adjusted critical values are easy to calculate and differ from unadjusted critical values by approximately 50%—rather than rejecting a null hypothesis when the t-ratio exceeds 2, the analysis suggests rejecting a null hypothesis when the t-ratio exceeds 3. Samples of published social science research indicate that on average, across research fields, approximately 30% of published t-statistics fall between the standard and adjusted cutoffs. PMID:26901834

  15. Causality in Statistical Power: Isomorphic Properties of Measurement, Research Design, Effect Size, and Sample Size.

    PubMed

    Heidel, R Eric

    2016-01-01

    Statistical power is the ability to detect a significant effect, given that the effect actually exists in a population. Like most statistical concepts, statistical power tends to induce cognitive dissonance in hepatology researchers. However, planning for statistical power by an a priori sample size calculation is of paramount importance when designing a research study. There are five specific empirical components that make up an a priori sample size calculation: the scale of measurement of the outcome, the research design, the magnitude of the effect size, the variance of the effect size, and the sample size. A framework grounded in the phenomenon of isomorphism, or interdependencies amongst different constructs with similar forms, will be presented to understand the isomorphic effects of decisions made on each of the five aforementioned components of statistical power.
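    An a priori calculation tying together three of the components listed above (effect size, alpha, and power) can be sketched with statsmodels. The medium effect size and two-group t-test design below are illustrative planning values, not a recommendation from the article.

```python
from statsmodels.stats.power import TTestIndPower

# Hypothetical planning values: two-group comparison of a continuous outcome.
analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.5,   # Cohen's d
                                   alpha=0.05,
                                   power=0.80,
                                   alternative='two-sided')
print(f"required sample size per group: {n_per_group:.0f}")   # about 64
```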

  16. Analysis of the variability of extra-tropical cyclones at the regional scale for the coasts of Northern Germany and investigation of their coastal impacts

    NASA Astrophysics Data System (ADS)

    Schaaf, Benjamin; Feser, Frauke

    2015-04-01

    The evaluation of long-term changes in wind speeds is very important for coastal areas and their protection measures. Therefore, the wind variability at the regional scale for the coast of Northern Germany is analysed. In order to derive changes in storminess it is essential to analyse long, homogeneous meteorological time series. Wind measurements often suffer from inconsistencies which arise from changes in instrumentation, observation method, or station location. Reanalysis data take into account such inhomogeneities of observation data and convert these measurements into a consistent, gridded data set with the same grid spacing and time intervals. This leads to a smooth, homogeneous data set, but with relatively low resolution (about 210 km for the longest reanalysis data set, the NCEP reanalysis starting in 1948). Therefore a high-resolution regional atmospheric model will be used to bring these reanalyses to a higher resolution, using, in addition to a dynamical downscaling approach, the spectral nudging technique. This method 'nudges' the large spatial scales of the regional climate model towards the reanalysis, while the smaller spatial scales are left unchanged. It was applied successfully in a number of applications, leading to realistic atmospheric weather descriptions of the past. With the regional climate model COSMO-CLM a very high-resolution data set was calculated for the last 67 years, the period from 1948 until now. The model area is North Germany with the coastal area of the North Sea and parts of the Baltic Sea. This is one of the first model simulations on a climate timescale with a very high resolution of 2.8 km, so even small-scale effects can be detected. This hindcast simulation offers numerous evaluation options. One can create wind climatologies for regional areas such as the metropolitan region of Hamburg, or investigate individual storms in case studies. With a filtering and tracking program the course of individual storms can be tracked and compared with observations. Statistical studies can also be carried out, calculating percentiles, return periods, and other extreme value statistics. Later, with a further nesting simulation, the grid spacing can be reduced to 1 km for individual areas of interest to analyse small islands (such as Foehr or Amrum) and their effects on the atmospheric flow more closely.
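    The return periods mentioned at the end of the abstract could, for example, be estimated by fitting a generalized extreme value distribution to annual wind-speed maxima. The sketch below uses synthetic annual maxima, not the COSMO-CLM hindcast data.

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic annual maximum wind speeds (m/s) for 67 years.
rng = np.random.default_rng(1)
annual_max = 20.0 + 4.0 * rng.gumbel(size=67)

shape, loc, scale = genextreme.fit(annual_max)
for T in (10, 50, 100):                      # return periods in years
    level = genextreme.ppf(1 - 1.0 / T, shape, loc=loc, scale=scale)
    print(f"{T:>3}-year return level: {level:.1f} m/s")
```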

  17. A Retrospective Survey of Research Design and Statistical Analyses in Selected Chinese Medical Journals in 1998 and 2008

    PubMed Central

    Jin, Zhichao; Yu, Danghui; Zhang, Luoman; Meng, Hong; Lu, Jian; Gao, Qingbin; Cao, Yang; Ma, Xiuqiang; Wu, Cheng; He, Qian; Wang, Rui; He, Jia

    2010-01-01

    Background High quality clinical research not only requires advanced professional knowledge, but also needs sound study design and correct statistical analyses. The number of clinical research articles published in Chinese medical journals has increased immensely in the past decade, but study design quality and statistical analyses have remained suboptimal. The aim of this investigation was to gather evidence on the quality of study design and statistical analyses in clinical research conducted in China for the first decade of the new millennium. Methodology/Principal Findings Ten (10) leading Chinese medical journals were selected and all original articles published in 1998 (N = 1,335) and 2008 (N = 1,578) were thoroughly categorized and reviewed. A well-defined and validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation. Main outcomes were the frequencies of different types of study design, error/defect proportion in design and statistical analyses, and implementation of CONSORT in randomized clinical trials. From 1998 to 2008: The error/defect proportion in statistical analyses decreased significantly (χ² = 12.03, p<0.001), 59.8% (545/1,335) in 1998 compared to 52.2% (664/1,578) in 2008. The overall error/defect proportion of study design also decreased (χ² = 21.22, p<0.001), 50.9% (680/1,335) compared to 42.40% (669/1,578). In 2008, the proportion of randomized clinical trials remained in the low single digits (3.8%, 60/1,578), with two-thirds showing poor results reporting (defects in 44 papers, 73.3%). Nearly half of the published studies were retrospective in nature, 49.3% (658/1,335) in 1998 compared to 48.2% (761/1,578) in 2008. Decreases in defect proportions were observed in both results presentation (χ² = 93.26, p<0.001), 92.7% (945/1,019) compared to 78.2% (1023/1,309), and interpretation (χ² = 27.26, p<0.001), 9.7% (99/1,019) compared to 4.3% (56/1,309), although some serious defects persisted. Conclusions/Significance Chinese medical research seems to have made significant progress regarding statistical analyses, but there remains ample room for improvement regarding study designs. Retrospective clinical studies are the most often used design, whereas randomized clinical trials are rare and often show methodological weaknesses. Urgent implementation of the CONSORT statement is imperative. PMID:20520824
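    The year-to-year comparisons of error/defect proportions above are the kind of result a chi-square test of two proportions produces. The sketch below uses hypothetical counts chosen to roughly match the quoted percentages; the resulting statistic is illustrative and need not match the values reported in the record.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table: articles with / without statistical errors in two years.
#                 with errors   without errors
table = np.array([[800,          535],        # 1998 (hypothetical counts)
                  [824,          754]])       # 2008 (hypothetical counts)

chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.4f}")
```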

  18. Quantification and Statistical Analysis Methods for Vessel Wall Components from Stained Images with Masson's Trichrome

    PubMed Central

    Hernández-Morera, Pablo; Castaño-González, Irene; Travieso-González, Carlos M.; Mompeó-Corredera, Blanca; Ortega-Santana, Francisco

    2016-01-01

    Purpose To develop a digital image processing method to quantify structural components (smooth muscle fibers and extracellular matrix) in the vessel wall stained with Masson’s trichrome, and a statistical method suitable for small sample sizes to analyze the results previously obtained. Methods The quantification method comprises two stages. The pre-processing stage improves tissue image appearance and the vessel wall area is delimited. In the feature extraction stage, the vessel wall components are segmented by grouping pixels with a similar color. The area of each component is calculated by normalizing the number of pixels of each group by the vessel wall area. Statistical analyses are implemented by permutation tests, based on resampling without replacement from the set of the observed data to obtain a sampling distribution of an estimator. The implementation can be parallelized on a multicore machine to reduce execution time. Results The methods have been tested on 48 vessel wall samples of the internal saphenous vein stained with Masson’s trichrome. The results show that the segmented areas are consistent with the perception of a team of doctors and demonstrate good correlation between the expert judgments and the measured parameters for evaluating vessel wall changes. Conclusion The proposed methodology offers a powerful tool to quantify some components of the vessel wall. It is more objective, sensitive and accurate than the biochemical and qualitative methods traditionally used. The permutation tests are suitable statistical techniques to analyze the numerical measurements obtained when the underlying assumptions of the other statistical techniques are not met. PMID:26761643
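    A bare-bones version of the resampling-without-replacement scheme described above, here as a two-group permutation test on a difference in means. Group sizes and values are made-up placeholders, not the vessel wall measurements.

```python
import numpy as np

rng = np.random.default_rng(42)
group_a = np.array([12.1, 14.3, 11.8, 15.0, 13.2])   # e.g., % component area, group A (hypothetical)
group_b = np.array([10.4, 11.1,  9.8, 12.0, 10.9])   # group B (hypothetical)

observed = group_a.mean() - group_b.mean()
pooled = np.concatenate([group_a, group_b])
n_a = len(group_a)

n_perm = 10000
count = 0
for _ in range(n_perm):
    perm = rng.permutation(pooled)                   # resample labels without replacement
    stat = perm[:n_a].mean() - perm[n_a:].mean()
    if abs(stat) >= abs(observed):
        count += 1

p_value = (count + 1) / (n_perm + 1)                 # add-one correction
print(f"observed difference = {observed:.2f}, permutation p = {p_value:.4f}")
```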

  19. Comparison of a non-stationary voxelation-corrected cluster-size test with TFCE for group-level MRI inference.

    PubMed

    Li, Huanjie; Nickerson, Lisa D; Nichols, Thomas E; Gao, Jia-Hong

    2017-03-01

    Two powerful methods for statistical inference on MRI brain images have been proposed recently, a non-stationary voxelation-corrected cluster-size test (CST) based on random field theory and threshold-free cluster enhancement (TFCE) based on calculating the level of local support for a cluster, then using permutation testing for inference. Unlike other statistical approaches, these two methods do not rest on the assumptions of a uniform and high degree of spatial smoothness of the statistic image. Thus, they are strongly recommended for group-level fMRI analysis compared to other statistical methods. In this work, the non-stationary voxelation-corrected CST and TFCE methods for group-level analysis were evaluated for both stationary and non-stationary images under varying smoothness levels, degrees of freedom and signal to noise ratios. Our results suggest that both methods provide adequate control for the number of voxel-wise statistical tests being performed during inference on fMRI data and they are both superior to current CSTs implemented in popular MRI data analysis software packages. However, TFCE is more sensitive and stable for group-level analysis of VBM data. Thus, the voxelation-corrected CST approach may confer some advantages by being computationally less demanding for fMRI data analysis than TFCE with permutation testing and by also being applicable for single-subject fMRI analyses, while the TFCE approach is advantageous for VBM data. Hum Brain Mapp 38:1269-1280, 2017. © 2016 Wiley Periodicals, Inc.

  20. Global aesthetic surgery statistics: a closer look.

    PubMed

    Heidekrueger, Paul I; Juran, S; Ehrl, D; Aung, T; Tanna, N; Broer, P Niclas

    2017-08-01

    Obtaining quality global statistics about surgical procedures remains an important yet challenging task. The International Society of Aesthetic Plastic Surgery (ISAPS) reports the total number of surgical and non-surgical procedures performed worldwide on a yearly basis. While providing valuable insight, ISAPS' statistics leave two important factors unaccounted for: (1) the underlying base population, and (2) the number of surgeons performing the procedures. Statistics of the published ISAPS' 'International Survey on Aesthetic/Cosmetic Surgery' were analysed by country, taking into account the underlying national base population according to the official United Nations population estimates. Further, the number of surgeons per country was used to calculate the number of surgeries performed per surgeon. In 2014, based on ISAPS statistics, national surgical procedures ranked in the following order: 1st USA, 2nd Brazil, 3rd South Korea, 4th Mexico, 5th Japan, 6th Germany, 7th Colombia, and 8th France. When considering the size of the underlying national populations, the demand for surgical procedures per 100,000 people changes the overall ranking substantially. It was also found that the rate of surgical procedures per surgeon shows great variation between the responding countries. While the US and Brazil are often quoted as the countries with the highest demand for plastic surgery, according to the presented analysis, other countries surpass these countries in surgical procedures per capita. While data acquisition and quality should be improved in the future, valuable insight regarding the demand for surgical procedures can be gained by taking specific demographic and geographic factors into consideration.

  1. Use of Statistical Analyses in the Ophthalmic Literature

    PubMed Central

    Lisboa, Renato; Meira-Freitas, Daniel; Tatham, Andrew J.; Marvasti, Amir H.; Sharpsten, Lucie; Medeiros, Felipe A.

    2014-01-01

    Purpose To identify the most commonly used statistical analyses in the ophthalmic literature and to determine the likely gain in comprehension of the literature that readers could expect if they were to sequentially add knowledge of more advanced techniques to their statistical repertoire. Design Cross-sectional study Methods All articles published from January 2012 to December 2012 in Ophthalmology, American Journal of Ophthalmology and Archives of Ophthalmology were reviewed. A total of 780 peer-reviewed articles were included. Two reviewers examined each article and assigned categories to each one depending on the type of statistical analyses used. Discrepancies between reviewers were resolved by consensus. Main Outcome Measures Total number and percentage of articles containing each category of statistical analysis were obtained. Additionally we estimated the accumulated number and percentage of articles that a reader would be expected to be able to interpret depending on their statistical repertoire. Results Readers with little or no statistical knowledge would be expected to be able to interpret the statistical methods presented in only 20.8% of articles. In order to understand more than half (51.4%) of the articles published, readers were expected to be familiar with at least 15 different statistical methods. Knowledge of 21 categories of statistical methods was necessary to comprehend 70.9% of articles, while knowledge of more than 29 categories was necessary to comprehend more than 90% of articles. Articles in retina and glaucoma subspecialties showed a tendency for using more complex analysis when compared to cornea. Conclusions Readers of clinical journals in ophthalmology need to have substantial knowledge of statistical methodology to understand the results of published studies in the literature. The frequency of use of complex statistical analyses also indicates that those involved in the editorial peer-review process must have sound statistical knowledge in order to critically appraise articles submitted for publication. The results of this study could provide guidance to direct the statistical learning of clinical ophthalmologists, researchers and educators involved in the design of courses for residents and medical students. PMID:24612977

  2. Global atmospheric circulation statistics, 1000-1 mb

    NASA Technical Reports Server (NTRS)

    Randel, William J.

    1992-01-01

    The atlas presents atmospheric general circulation statistics derived from twelve years (1979-90) of daily National Meteorological Center (NMC) operational geopotential height analyses; it is an update of a prior atlas using data over 1979-1986. These global analyses are available on pressure levels covering 1000-1 mb (approximately 0-50 km). The geopotential grids are a combined product of the Climate Analysis Center (which produces analyses over 70-1 mb) and operational NMC analyses (over 1000-100 mb). Balance horizontal winds and hydrostatic temperatures are derived from the geopotential fields.

  3. Accident Source Terms for Pressurized Water Reactors with High-Burnup Cores Calculated using MELCOR 1.8.5.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gauntt, Randall O.; Goldmann, Andrew; Kalinich, Donald A.

    2016-12-01

    In this study, risk-significant pressurized-water reactor severe accident sequences are examined using MELCOR 1.8.5 to explore the range of fission product releases to the reactor containment building. Advances in the understanding of fission product release and transport behavior and severe accident progression are used to render best estimate analyses of selected accident sequences. Particular emphasis is placed on estimating the effects of high fuel burnup in contrast with low burnup on fission product releases to the containment. Supporting this emphasis, recent data available on fission product release from high-burnup (HBU) fuel from the French VERCOR project are used in this study. The results of these analyses are treated as samples from a population of accident sequences in order to employ approximate order statistics characterization of the results. These trends and tendencies are then compared to the NUREG-1465 alternative source term prescription used today for regulatory applications. In general, greater differences are observed between the state-of-the-art calculations for either HBU or low-burnup (LBU) fuel and the NUREG-1465 containment release fractions than exist between HBU and LBU release fractions. Current analyses suggest that retention of fission products within the vessel and the reactor coolant system (RCS) are greater than contemplated in the NUREG-1465 prescription, and that, overall, release fractions to the containment are therefore lower across the board in the present analyses than suggested in NUREG-1465. The decreased volatility of Cs2MoO4 compared to CsI or CsOH increases the predicted RCS retention of cesium, and as a result, cesium and iodine do not follow identical behaviors with respect to distribution among vessel, RCS, and containment. With respect to the regulatory alternative source term, greater differences are observed between the NUREG-1465 prescription and both HBU and LBU predictions than exist between HBU and LBU analyses. Additionally, current analyses suggest that the NUREG-1465 release fractions are conservative by about a factor of 2 in terms of release fractions and that release durations for in-vessel and late in-vessel release periods are in fact longer than the NUREG-1465 durations. It is currently planned that a subsequent report will further characterize these results using more refined statistical methods, permitting a more precise reformulation of the NUREG-1465 alternative source term for both LBU and HBU fuels, with the most important finding being that the NUREG-1465 formula appears to embody significant conservatism compared to current best-estimate analyses. ACKNOWLEDGEMENTS This work was supported by the United States Nuclear Regulatory Commission, Office of Nuclear Regulatory Research. The authors would like to thank Dr. Ian Gauld and Dr. Germina Ilas, of Oak Ridge National Laboratory, for their contributions to this work. In addition to development of core fission product inventory and decay heat information for use in MELCOR models, their insights related to fuel management practices and resulting effects on spatial distribution of fission products in the core was instrumental in completion of our work.

  4. Performance assessment in a flight simulator test—Validation of a space psychology methodology

    NASA Astrophysics Data System (ADS)

    Johannes, B.; Salnitski, Vyacheslav; Soll, Henning; Rauch, Melina; Goeters, Klaus-Martin; Maschke, Peter; Stelling, Dirk; Eißfeldt, Hinnerk

    2007-02-01

    The objective assessment of operator performance in hand-controlled docking of a spacecraft on a space station has 30 years of tradition and is well established. In recent years, the performance assessment was successfully combined with a psycho-physiological approach for the objective assessment of the levels of physiological arousal and psychological load. These methods are based on statistical reference data. To enhance the statistical power of the evaluation methods, both were recently implemented in a comparable terrestrial task: the flight simulator test of DLR in the selection procedure for ab initio pilot applicants for civil airlines. In the first evaluation study 134 male subjects were analysed. Subjects underwent a flight simulator test including three tasks, which were evaluated by instructors applying well-established and standardised rating scales. The principles of the performance algorithms of the docking training were adapted for the automated flight performance assessment. They are presented here. The increased human errors under instrument flight conditions without visual feedback required a manoeuvre recognition algorithm before calculating the deviation of the flown track from the given task elements. Each manoeuvre had to be evaluated independently of former failures. The expert-rated performance showed a highly significant correlation with the automatically calculated performance for each of the three tasks: r=.883, r=.874, r=.872, respectively. An automated algorithm successfully assessed the flight performance. This new method may enable a wide range of future applications in aviation and space psychology.

  5. A first principles calculation and statistical mechanics modeling of defects in Al-H system

    NASA Astrophysics Data System (ADS)

    Ji, Min; Wang, Cai-Zhuang; Ho, Kai-Ming

    2007-03-01

    The behavior of defects and hydrogen in Al was investigated by first principles calculations and statistical mechanics modeling. The formation energies of different defects in the Al+H system, such as the Al vacancy, interstitial H, and multiple H atoms in an Al vacancy, were calculated by first-principles methods. Defect concentrations in thermodynamic equilibrium were studied by total free energy calculations including configurational entropy and defect-defect interactions, from the low-concentration limit to the hydride limit. In our grand canonical ensemble model, the hydrogen chemical potential under different environments plays an important role in determining the defect concentrations and properties in the Al-H system.
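    In the dilute limit, the equilibrium concentration of a non-interacting defect follows directly from its formation energy and the configurational entropy of placing it on the lattice. The sketch below uses a hypothetical vacancy formation energy, not the paper's computed values.

```python
import numpy as np

k_B = 8.617e-5          # Boltzmann constant in eV/K
E_f = 0.67              # hypothetical vacancy formation energy in eV

def dilute_concentration(E_form, T):
    """Equilibrium site fraction of a non-interacting defect: c = exp(-E_f / kT)."""
    return np.exp(-E_form / (k_B * T))

for T in (300.0, 600.0, 900.0):
    print(f"T = {T:5.0f} K  ->  c_vac ~ {dilute_concentration(E_f, T):.2e}")
```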

  6. Development of the Statistical Reasoning in Biology Concept Inventory (SRBCI)

    PubMed Central

    Deane, Thomas; Nomme, Kathy; Jeffery, Erica; Pollock, Carol; Birol, Gülnur

    2016-01-01

    We followed established best practices in concept inventory design and developed a 12-item inventory to assess student ability in statistical reasoning in biology (Statistical Reasoning in Biology Concept Inventory [SRBCI]). It is important to assess student thinking in this conceptual area, because it is a fundamental requirement of being statistically literate and associated skills are needed in almost all walks of life. Despite this, previous work shows that non–expert-like thinking in statistical reasoning is common, even after instruction. As science educators, our goal should be to move students along a novice-to-expert spectrum, which could be achieved with growing experience in statistical reasoning. We used item response theory analyses (the one-parameter Rasch model and associated analyses) to assess responses gathered from biology students in two populations at a large research university in Canada in order to test SRBCI’s robustness and sensitivity in capturing useful data relating to the students’ conceptual ability in statistical reasoning. Our analyses indicated that SRBCI is a unidimensional construct, with items that vary widely in difficulty and provide useful information about such student ability. SRBCI should be useful as a diagnostic tool in a variety of biology settings and as a means of measuring the success of teaching interventions designed to improve statistical reasoning skills. PMID:26903497

  7. Secondary Analysis of National Longitudinal Transition Study 2 Data

    ERIC Educational Resources Information Center

    Hicks, Tyler A.; Knollman, Greg A.

    2015-01-01

    This review examines published secondary analyses of National Longitudinal Transition Study 2 (NLTS2) data, with a primary focus upon statistical objectives, paradigms, inferences, and methods. Its primary purpose was to determine which statistical techniques have been common in secondary analyses of NLTS2 data. The review begins with an…

  8. A Nonparametric Geostatistical Method For Estimating Species Importance

    Treesearch

    Andrew J. Lister; Rachel Riemann; Michael Hoppus

    2001-01-01

    Parametric statistical methods are not always appropriate for conducting spatial analyses of forest inventory data. Parametric geostatistical methods such as variography and kriging are essentially averaging procedures, and thus can be affected by extreme values. Furthermore, non-normal distributions violate the assumptions of analyses in which test statistics are...

  9. "Who Was 'Shadow'?" The Computer Knows: Applying Grammar-Program Statistics in Content Analyses to Solve Mysteries about Authorship.

    ERIC Educational Resources Information Center

    Ellis, Barbara G.; Dick, Steven J.

    1996-01-01

    Employs the statistics-documentation portion of a word-processing program's grammar-check feature together with qualitative analyses to determine that Henry Watterson, long-time editor of the "Louisville Courier-Journal," was probably the South's famed Civil War correspondent "Shadow." (TB)

  10. Comparison of corneal endothelial image analysis by Konan SP8000 noncontact and Bio-Optics Bambi systems.

    PubMed

    Benetz, B A; Diaconu, E; Bowlin, S J; Oak, S S; Laing, R A; Lass, J H

    1999-01-01

    To compare corneal endothelial image analysis by the Konan SP8000 and Bio-Optics Bambi image-analysis systems. Corneal endothelial images from 98 individuals (191 eyes), ranging in age from 4 to 87 years, with a normal slit-lamp examination and no history of ocular trauma, intraocular surgery, or intraocular inflammation were obtained by the Konan SP8000 noncontact specular microscope. One observer analyzed these images by using the Konan system and a second observer by using the Bio-Optics Bambi system. Three methods of analyses were used: a fixed-frame method to obtain cell density (for both Konan and Bio-Optics Bambi) and a "dot" (Konan) or "corners" (Bio-Optics Bambi) method to determine morphometric parameters. The cell density determined by the Konan fixed-frame method was significantly higher (157 cells/mm2) than the Bio-Optics Bambi fixed-frame method determination (p<0.0001). However, the difference in cell density, although still statistically significant, was smaller and reversed comparing the Konan fixed-frame method with both the Konan dot and Bio-Optics Bambi corners methods (-74 cells/mm2, p<0.0001; -55 cells/mm2, p<0.0001, respectively). Small but statistically significant morphometric analysis differences between Konan and Bio-Optics Bambi were seen: cell density, +19 cells/mm2 (p = 0.03); cell area, -3.0 microm2 (p = 0.008); and coefficient of variation, +1.0 (p = 0.003). There was no statistically significant difference between these two methods in the percentage of six-sided cells detected (p = 0.55). Cell densities measured by the Konan fixed-frame method were comparable with Konan and Bio-Optics Bambi's morphometric analysis, but not with the Bio-Optics Bambi fixed-frame method. The two morphometric analyses were comparable with minimal or no differences for the parameters that were studied. The Konan SP8000 endothelial image-analysis system may be useful for large-scale clinical trials determining cell loss; its noncontact system has many clinical benefits (including patient comfort, safety, ease of use, and short procedure time) and provides reliable cell-density calculations.

  11. An application of Social Values for Ecosystem Services (SolVES) to three national forests in Colorado and Wyoming

    USGS Publications Warehouse

    Sherrouse, Benson C.; Semmens, Darius J.; Clement, Jessica M.

    2014-01-01

    Despite widespread recognition that social-value information is needed to inform stakeholders and decision makers regarding trade-offs in environmental management, it too often remains absent from ecosystem service assessments. Although quantitative indicators of social values need to be explicitly accounted for in the decision-making process, they need not be monetary. Ongoing efforts to map such values demonstrate how they can also be made spatially explicit and relatable to underlying ecological information. We originally developed Social Values for Ecosystem Services (SolVES) as a tool to assess, map, and quantify nonmarket values perceived by various groups of ecosystem stakeholders. With SolVES 2.0 we have extended the functionality by integrating SolVES with Maxent maximum entropy modeling software to generate more complete social-value maps from available value and preference survey data and to produce more robust models describing the relationship between social values and ecosystems. The current study has two objectives: (1) evaluate how effectively the value index, a quantitative, nonmonetary social-value indicator calculated by SolVES, reproduces results from more common statistical methods of social-survey data analysis and (2) examine how the spatial results produced by SolVES provide additional information that could be used by managers and stakeholders to better understand more complex relationships among stakeholder values, attitudes, and preferences. To achieve these objectives, we applied SolVES to value and preference survey data collected for three national forests, the Pike and San Isabel in Colorado and the Bridger–Teton and the Shoshone in Wyoming. Value index results were generally consistent with results found through more common statistical analyses of the survey data such as frequency, discriminant function, and correlation analyses. In addition, spatial analysis of the social-value maps produced by SolVES provided information that was useful for explaining relationships between stakeholder values and forest uses. Our results suggest that SolVES can effectively reproduce information derived from traditional statistical analyses while adding spatially explicit, social-value information that can contribute to integrated resource assessment, planning, and management of forests and other ecosystems.

  12. Interhemispheric currents in the ring current region as seen by the Cluster spacecraft

    NASA Astrophysics Data System (ADS)

    Tenfjord, P.; Ostgaard, N.; Haaland, S.; Laundal, K.; Reistad, J. P.

    2013-12-01

    The existence of interhemispheric currents has been predicted by several authors, but their extent in the ring current has to our knowledge never been studied systematically by using in-situ measurements. These currents have been suggested to be associated with observed asymmetries of the aurora. We perform a statistical study of current density and direction during ring current crossings using the Cluster spacecraft. We analyse the extent of the interhemispheric field-aligned currents for a wide range of solar wind conditions. Direct estimations of equatorial current direction and density are achieved through the curlometer technique. The curlometer technique is based on Ampere's law and requires magnetic field measurements from all four spacecraft. The use of this method requires careful study of factors that limit the accuracy, such as tetrahedron shape and configuration. This significantly limits our dataset, but is a necessity for accurate current calculations. Our goal is to statistically investigate the occurrence of interhemispheric currents, and determine whether there are parameters or magnetospheric states on which the current magnitude and direction depend.
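    The curlometer idea (Ampere's law applied to four-point magnetic field measurements) can be sketched by fitting a linear field model across the tetrahedron and taking its curl. This simplified least-squares variant stands in for the reciprocal-vector formulation actually used with Cluster data, and all positions and field values below are made up.

```python
import numpy as np

MU_0 = 4e-7 * np.pi   # vacuum permeability (T m / A)

def current_density(positions, b_fields):
    """Estimate J = curl(B)/mu0 from 4 spacecraft positions (m) and B vectors (T)."""
    r0 = positions.mean(axis=0)
    b0 = b_fields.mean(axis=0)
    dr = positions - r0                       # (4, 3) offsets from the mesocentre
    db = b_fields - b0                        # (4, 3) field differences
    # Least-squares linear gradient G with G[i, j] = dB_i / dx_j
    G, *_ = np.linalg.lstsq(dr, db, rcond=None)
    G = G.T
    curl = np.array([G[2, 1] - G[1, 2],
                     G[0, 2] - G[2, 0],
                     G[1, 0] - G[0, 1]])
    return curl / MU_0                        # A / m^2

# Hypothetical tetrahedron (1000 km separations) and field values (nT), in SI units.
pos = np.array([[0, 0, 0], [1000e3, 0, 0], [0, 1000e3, 0], [0, 0, 1000e3]], dtype=float)
B = 1e-9 * np.array([[20.0, 5.0, -3.0],
                     [20.5, 5.2, -3.1],
                     [19.8, 5.6, -2.9],
                     [20.1, 4.9, -3.4]])
print("J (A/m^2):", current_density(pos, B))
```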

  13. A New Index for the MMPI-2 Test for Detecting Dissimulation in Forensic Evaluations: A Pilot Study.

    PubMed

    Martino, Vito; Grattagliano, Ignazio; Bosco, Andrea; Massaro, Ylenia; Lisi, Andrea; Campobasso, Filippo; Marchitelli, Maria Alessia; Catanesi, Roberto

    2016-01-01

    This pilot study is the starting point of a potentially broad research project aimed at identifying new strategies for assessing malingering during forensic evaluations. The forensic group was comprised of 67 males who were seeking some sort of certification (e.g., adoption, child custody, driver's license, issuance of gun permits, etc.); the nonforensic group was comprised of 62 healthy male volunteers. Each participant was administered the MMPI-2. Statistical analyses were conducted on obtained scores of 48 MMPI-2 scales. In the first step, parametric statistics were adopted to identify the best combination of MMPI-2 scales that differentiated the two groups of participants. In the second step, frequency-based, nonparametric methods were used for diagnostic purposes. A model that utilized the best three predictors ("7-Pt", "L," and "1-Hs") was developed and used to calculate the Forensic Evaluation Dissimulation Index (FEDI), which features satisfactory diagnostic accuracy (0.9), sensitivity (0.82), specificity (0.81), and likelihood ratio indices (LR+ = 4.32; LR- = 0.22). © 2015 American Academy of Forensic Sciences.

  14. Quantitative impact of pediatric sinus surgery on facial growth.

    PubMed

    Senior, B; Wirtschafter, A; Mai, C; Becker, C; Belenky, W

    2000-11-01

    To quantitatively evaluate the long-term impact of sinus surgery on paranasal sinus development in the pediatric patient. Longitudinal review of eight pediatric patients treated with unilateral sinus surgery for periorbital or orbital cellulitis with an average follow-up of 6.9 years. Control subjects consisted of two groups, 9 normal adult patients with no computed tomographic evidence of sinusitis and 10 adult patients with scans consistent with sinusitis and a history of sinus-related symptoms extending to childhood. Application of computed tomography (CT) volumetrics, a technique allowing for precise calculation of volumes using thinly cut CT images, to the study and control groups. Paired Student t test analyses of side-to-side volume comparisons in the normal patients, patients with sinusitis, and patients who had surgery revealed no statistically significant differences. Comparisons between the orbital volumes of patients who did and did not have surgery revealed a statistically significant increase in orbital volume in patients who had surgery. Only minimal changes in facial volume measurements have been found, confirming clinical impressions that sinus surgery in children is safe and without significant cosmetic sequelae.

  15. Image Processing Diagnostics: Emphysema

    NASA Astrophysics Data System (ADS)

    McKenzie, Alex

    2009-10-01

    Currently the computerized tomography (CT) scan can detect emphysema sooner than traditional x-rays, but other tests are required to measure more accurately the amount of affected lung. CT images clearly show whether a patient has emphysema, but visual inspection alone cannot quantify the degree of the disease, as it appears merely as subtle, barely distinct dark spots on the lung. Our goal is to create a software plug-in to interface with existing open source medical imaging software, to automate the process of accurately diagnosing and determining emphysema severity levels in patients. This will be accomplished by performing a number of statistical calculations using data taken from CT scan images of several patients representing a wide range of severity of the disease. These analyses include an examination of the deviation from a normal distribution curve to determine skewness, a commonly used statistical parameter. Our preliminary results show that this method of assessment appears to be more accurate and robust than currently utilized methods which involve looking at percentages of radiodensities in air passages of the lung.
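    One of the distributional statistics mentioned above, the skewness of the lung attenuation histogram, can be computed directly from the voxel values. The Hounsfield-unit arrays below are synthetic illustrations, not patient data.

```python
import numpy as np
from scipy.stats import skew

# Synthetic lung-voxel attenuation values in Hounsfield units (HU).
rng = np.random.default_rng(7)
healthy_lung = rng.normal(loc=-860, scale=40, size=50_000)
emphysema_lung = np.concatenate([rng.normal(-860, 40, 40_000),
                                 rng.normal(-975, 15, 10_000)])  # extra low-density voxels

for label, voxels in (("healthy", healthy_lung), ("emphysema", emphysema_lung)):
    print(f"{label:>10}: skewness = {skew(voxels):+.2f}")
```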

  16. Extended Statistical Short-Range Guidance for Peak Wind Speed Analyses at the Shuttle Landing Facility: Phase II Results

    NASA Technical Reports Server (NTRS)

    Lambert, Winifred C.

    2003-01-01

    This report describes the results from Phase II of the AMU's Short-Range Statistical Forecasting task for peak winds at the Shuttle Landing Facility (SLF). The peak wind speeds are an important forecast element for the Space Shuttle and Expendable Launch Vehicle programs. The 45th Weather Squadron and the Spaceflight Meteorology Group indicate that peak winds are challenging to forecast. The Applied Meteorology Unit was tasked to develop tools that aid in short-range forecasts of peak winds at tower sites of operational interest. A seven-year record of wind tower data was used in the analysis. Hourly and directional climatologies by tower and month were developed to determine the seasonal behavior of the average and peak winds. Probability density functions (PDFs) of peak wind speed were calculated to determine the distribution of peak speed with average speed. These provide forecasters with a means of determining the probability of meeting or exceeding a certain peak wind given an observed or forecast average speed. A PC-based Graphical User Interface (GUI) tool was created to display the data quickly.

  17. Diffusion-weighted MR imaging findings of kidneys in patients with early phase of obstruction.

    PubMed

    Bozgeyik, Zulkif; Kocakoc, Ercan; Sonmezgoz, Fitnet

    2009-04-01

    Diffusion-weighted (DW) magnetic resonance (MR) imaging is an MR technique used to show molecular diffusion. The apparent diffusion coefficient (ADC) is a quantitative parameter calculated from DW MR images. The purpose of this study is to evaluate the ability of DW MR imaging in the early phase of obstruction due to urolithiasis. Twenty-six patients with acute dilatation of the pelvicalyceal system detected by intravenous urography were included in this study. MR imaging was performed using a 1.5 T whole-body superconducting MR scanner. DW imaging was performed using single-shot spin-echo echo-planar imaging (EPI) sequences with the following diffusion gradient b values: 100, 600, and 1000 s/mm(2). A circular region of interest (ROI) was placed in the renal parenchyma for the measurement of ADC values in the normal and obstructed kidney. For statistical analyses, paired t tests were used. Although obstructed kidneys had lower ADC values than normal kidneys, these differences were not statistically significant. We did not observe significantly different ADC values in the early phase of obstructed kidneys compared to normal kidneys.

  18. Profile of Polyphenolic and Essential Oil Composition of Polish Propolis, Black Poplar and Aspens Buds.

    PubMed

    Okińczyc, Piotr; Szumny, Antoni; Szperlik, Jakub; Kulma, Anna; Franiczek, Roman; Żbikowska, Beata; Krzyżanowska, Barbara; Sroka, Zbigniew

    2018-05-25

    In this work, we studied similarities and differences between 70% ethanol in water extract (70EE) and essential oils (EOs) obtained from propolis, black poplars (Populus nigra L.) and aspens (P. tremula L.) to ascertain which of these is a better indicator of the plant species used by bees to collect propolis precursors. Composition of 70EE was analyzed by UPLC-PDA-MS, while GC-MS was used to research the EOs. Principal component analyses (PCA) and calculations of Spearman's rank correlation coefficient were used for statistical analysis. Statistical analysis exhibited correlation between chemical compositions of propolis and Populus buds' 70EE. In the case of EOs, results were less clear. Compositions of black poplar and aspen EOs and of propolis showed more variability than those of 70EE. Different factors such as higher instability of EOs compared to 70EE, different degradation pattern of benzyl esters to benzoic acid, differences in plant metabolism and bees' preferences may be responsible for these phenomena. Our research has therefore shown that 70EE of propolis reflected the composition of P. nigra or complex aspen-black poplar origin.

  19. Irrigation water use in Kansas, 2013

    USGS Publications Warehouse

    Lanning-Rush, Jennifer L.

    2016-03-22

    This report, prepared by the U.S. Geological Survey in cooperation with the Kansas Department of Agriculture, Division of Water Resources, presents derivative statistics of 2013 irrigation water use in Kansas. The published regional and county-level statistics from the previous 4 years (2009–12) are shown with the 2013 statistics and are used to calculate a 5-year average. An overall Kansas average and regional averages also are calculated and presented. Total reported irrigation water use in 2013 was 3.3 million acre-feet of water applied to 3.0 million irrigated acres.

  20. Calculating weighted estimates of peak streamflow statistics

    USGS Publications Warehouse

    Cohn, Timothy A.; Berenbrock, Charles; Kiang, Julie E.; Mason, Jr., Robert R.

    2012-01-01

    According to the Federal guidelines for flood-frequency estimation, the uncertainty of peak streamflow statistics, such as the 1-percent annual exceedance probability (AEP) flow at a streamgage, can be reduced by combining the at-site estimate with the regional regression estimate to obtain a weighted estimate of the flow statistic. The procedure assumes the estimates are independent, which is reasonable in most practical situations. The purpose of this publication is to describe and make available a method for calculating a weighted estimate from the uncertainty or variance of the two independent estimates.
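    Because the at-site and regional regression estimates are treated as independent, the weighted estimate is their variance-weighted combination. The numbers below are placeholders for illustration, not values from the report.

```python
# Hypothetical 1-percent AEP flow estimates (log10 of flow) and their variances.
x_site, var_site = 4.10, 0.012       # at-site (streamgage) estimate
x_reg,  var_reg  = 4.25, 0.030       # regional regression estimate

w_site, w_reg = 1.0 / var_site, 1.0 / var_reg
x_weighted = (w_site * x_site + w_reg * x_reg) / (w_site + w_reg)
var_weighted = 1.0 / (w_site + w_reg)

print(f"weighted estimate = {x_weighted:.3f}, variance = {var_weighted:.4f}")
```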

  1. Differing long term trends for two common amphibian species (Bufo bufo and Rana temporaria) in alpine landscapes of Salzburg, Austria

    PubMed Central

    Kyek, Martin; Lindner, Robert

    2017-01-01

    This study focuses on the population trends of two widespread European anuran species: the common toad (Bufo bufo) and the common frog (Rana temporaria). The basis of this study is data gathered over two decades of amphibian fencing alongside roads in the Austrian state of Salzburg. Different statistical approaches were used to analyse the data. Overall average increase or decrease of each species was estimated by calculating a simple average locality index. In addition the statistical software TRIM was used to verify these trends as well as to categorize the data based on the geographic location of each migration site. The results show differing overall trends for the two species: the common toad being stable and the common frog showing a substantial decline over the last two decades. Further analyses based on geographic categorization reveal the strongest decrease in the alpine range of the species. Drainage and agricultural intensification are still ongoing problems within alpine areas, not only in Salzburg. Particularly in respect to micro-climate and the availability of spawning places these changes appear to have a greater impact on the habitats of the common frog than the common toad. Therefore we consider habitat destruction to be the main potential reason behind this dramatic decline. We also conclude that the substantial loss of biomass of a widespread species such as the common frog must have a severe, and often overlooked, ecological impact. PMID:29121054

  2. [Regionalisation of Germany by data of agricultural structures].

    PubMed

    Merle, Roswitha; Busse, Marc; Rechter, Galina; Meer, Uwe

    2012-01-01

    In order to simplify the design of representative studies in animal populations, the structural differences of animal husbandry (cattle, pigs and laying hens) in Germany were characterised. Several regions were defined and thus districts identified which are typical for the respective region and can be regarded as representatives for the whole region. Data on animal husbandry as well as human population per district originated from the Federal Statistical Office and were linked to the geometric data of the Federal Agency for Cartography and Geodesy. From these, "livestock units per square kilometre" and "farms per square kilometre" were calculated per district and analysed using the spatial statistics Global Moran's Index, Anselin Local Moran's Index, and Getis-Ord Gi*. With the help of these analyses, six clusters could be identified, resulting in four large regions (Middle, Northwest, East, and South) and one smaller region (Northern Upper-Rhine), respecting the federal state borders. These regions differed significantly regarding animal and farm densities. The selection of typical districts was carried out with the help of the respective animal and farm data of the species pigs, dairy cattle and laying hens. The means of the selected districts (three to six per region) were within the 60% and 80% percentiles of at least two of the analysed variables. For the region Northern Upper-Rhine, no representative district was selected. The presented regionalisation, including representative districts, can be used for the design of scientific studies that are associated with animal husbandry in Germany.
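    Of the spatial statistics named above, Global Moran's I is the simplest to write out. The sketch below computes it for a hypothetical vector of district-level livestock densities with a hypothetical binary contiguity matrix; it is not the study's data or weighting scheme.

```python
import numpy as np

def morans_i(x, w):
    """Global Moran's I for values x and spatial weight matrix w (zero diagonal)."""
    n = len(x)
    z = x - x.mean()
    num = n * np.sum(w * np.outer(z, z))   # n * sum_ij w_ij * z_i * z_j
    den = w.sum() * np.sum(z**2)
    return num / den

# Hypothetical livestock densities for 5 districts and a symmetric contiguity matrix.
x = np.array([120.0, 95.0, 30.0, 25.0, 110.0])
w = np.array([[0, 1, 0, 0, 1],
              [1, 0, 1, 0, 1],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 0],
              [1, 1, 0, 0, 0]], dtype=float)

print(f"Moran's I = {morans_i(x, w):.3f}")
```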

  3. Dietary fiber intake and risk of breast cancer defined by estrogen and progesterone receptor status: the Japan Public Health Center-based Prospective Study.

    PubMed

    Narita, Saki; Inoue, Manami; Saito, Eiko; Abe, Sarah K; Sawada, Norie; Ishihara, Junko; Iwasaki, Motoki; Yamaji, Taiki; Shimazu, Taichi; Sasazuki, Shizuka; Shibuya, Kenji; Tsugane, Shoichiro

    2017-06-01

    Epidemiological studies have suggested a protective effect of dietary fiber intake on breast cancer risk while the results have been inconsistent. Our study aimed to investigate the association between dietary fiber intake and breast cancer risk and to explore whether this association is modified by reproductive factors and hormone receptor status of the tumor. A total of 44,444 women aged 45 to 74 years from the Japan Public Health Center-based Prospective Study were included in analyses. Dietary intake assessment was performed using a validated 138-item food frequency questionnaire (FFQ). Hazard ratios (HRs) and 95% confidence intervals (CIs) for breast cancer incidence were calculated by multivariate Cox proportional hazards regression models. During 624,423 person-years of follow-up period, 681 breast cancer cases were identified. After adjusting for major confounders for breast cancer risk, inverse trends were observed but statistically non-significant. Extremely high intake of fiber was associated with decreased risk of breast cancer but this should be interpreted with caution due to limited statistical power. In stratified analyses by menopausal and hormone receptor status, null associations were observed except for ER-PR- status. Our findings suggest that extreme high fiber intake may be associated with decreased risk of breast cancer but the level of dietary fiber intake among Japanese population might not be sufficient to examine the association between dietary fiber intake and breast cancer risk.

  4. A statistical method to calculate blood contamination in the measurement of salivary hormones in healthy women.

    PubMed

    Behr, Guilherme A; Patel, Jay P; Coote, Marg; Moreira, Jose C F; Gelain, Daniel P; Steiner, Meir; Frey, Benicio N

    2017-05-01

    Previous studies have reported that salivary concentrations of certain hormones correlate with their respective serum levels. However, most of these studies did not control for potential blood contamination in saliva. In the present study we developed a statistical method to test the amount of blood contamination that needs to be avoided in saliva samples for the following hormones: cortisol, estradiol, progesterone, testosterone and oxytocin. Saliva and serum samples were collected from 38 healthy, medication-free women (mean age=33.8±7.3yr.; range=19-45). Serum and salivary hormonal levels and the amount of transferrin in saliva samples were determined using enzyme immunoassays. Salivary transferrin levels did not correlate with salivary cortisol or estradiol (up to 3mg/dl), but they were positively correlated with salivary testosterone, progesterone and oxytocin (p<0.05). After controlling for blood contamination, only cortisol (r=0.65, P<0.001) and progesterone levels (r=0.57, P=0.002) displayed a positive correlation between saliva and serum. Our analyses suggest that transferrin levels higher than 0.80, 0.92 and 0.64mg/dl should be avoided for testosterone, progesterone and oxytocin salivary analyses, respectively. We recommend that salivary transferrin is measured in research involving salivary hormones in order to determine the level of blood contamination that might affect specific hormonal salivary concentrations. Copyright © 2016 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  5. Wave packet and statistical quantum calculations for the He + NeH⁺ → HeH⁺ + Ne reaction on the ground electronic state.

    PubMed

    Koner, Debasish; Barrios, Lizandra; González-Lezana, Tomás; Panda, Aditya N

    2014-09-21

    A real wave packet based time-dependent method and a statistical quantum method have been used to study the He + NeH(+) (v, j) reaction with the reactant in various ro-vibrational states, on a recently calculated ab initio ground state potential energy surface. Both the wave packet and statistical quantum calculations were carried out within the centrifugal sudden approximation as well as using the exact Hamiltonian. Quantum reaction probabilities exhibit a dense oscillatory pattern for smaller total angular momentum values, which is a signature of resonances in a complex-forming mechanism for the title reaction. Significant differences found between exact and approximate quantum reaction cross sections highlight the importance of including Coriolis coupling in the calculations. Statistical results are in fairly good agreement with the exact quantum results for the ground ro-vibrational state of the reactant. Vibrational excitation greatly enhances the reaction cross sections, whereas rotational excitation has a relatively small effect on the reaction. The nature of the reaction cross section curves depends on the initial vibrational state of the reactant and is typical of a late-barrier-type potential energy profile.

  6. Nomogram for sample size calculation on a straightforward basis for the kappa statistic.

    PubMed

    Hong, Hyunsook; Choi, Yunhee; Hahn, Seokyung; Park, Sue Kyung; Park, Byung-Joo

    2014-09-01

    Kappa is a widely used measure of agreement. However, it may not be straightforward in some situations, such as sample size calculation, due to the kappa paradox: high agreement but low kappa. Hence, in sample size calculation it seems reasonable to consider the level of agreement under a given marginal prevalence in terms of a simple proportion of agreement rather than a kappa value. Therefore, sample size formulae and nomograms using a simple proportion of agreement rather than a kappa under given marginal prevalences are proposed. A sample size formula was derived using the kappa statistic under the common correlation model and a goodness-of-fit statistic. The nomogram for the sample size formula was developed using SAS 9.3. Sample size formulae using a simple proportion of agreement instead of a kappa statistic, and nomograms that eliminate the inconvenience of using a mathematical formula, were produced. A nomogram for sample size calculation with a simple proportion of agreement should be useful in the planning stages when the focus of interest is on testing the hypothesis of interobserver agreement involving two raters and nominal outcome measures. Copyright © 2014 Elsevier Inc. All rights reserved.
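
    The "kappa paradox" motivating this work is easy to reproduce numerically. The sketch below converts an observed proportion of agreement and a marginal prevalence into Cohen's kappa for a binary rating with equal marginals; it only illustrates why the proportion of agreement can be easier to reason about, and it is not the paper's sample size formula or nomogram.

    ```python
    def kappa_from_agreement(p_obs: float, prevalence: float) -> float:
        """Cohen's kappa for two raters, binary outcome, equal marginals.

        p_obs      -- observed proportion of agreement
        prevalence -- marginal prevalence of the 'positive' category
        """
        p_exp = prevalence ** 2 + (1.0 - prevalence) ** 2  # chance agreement
        return (p_obs - p_exp) / (1.0 - p_exp)

    # The same observed agreement (0.90) yields very different kappas:
    print(kappa_from_agreement(0.90, 0.50))   # ~0.80
    print(kappa_from_agreement(0.90, 0.90))   # ~0.44 -> the kappa paradox
    ```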

  7. Exercise and Bone Mineral Density in Premenopausal Women: A Meta-Analysis of Randomized Controlled Trials

    PubMed Central

    Kelley, George A.; Kelley, Kristi S.; Kohrt, Wendy M.

    2013-01-01

    Objective. Examine the effects of exercise on femoral neck (FN) and lumbar spine (LS) bone mineral density (BMD) in premenopausal women. Methods. Meta-analysis of randomized controlled exercise trials ≥24 weeks in premenopausal women. Standardized effect sizes (g) were calculated for each result and pooled using random-effects models, Z score alpha values, 95% confidence intervals (CIs), and number needed to treat (NNT). Heterogeneity was examined using Q and I². Moderator and predictor analyses using mixed-effects ANOVA and simple metaregression were conducted. Statistical significance was set at P ≤ 0.05. Results. Statistically significant improvements were found for both FN (7 g's, 466 participants, g = 0.342, 95% CI = 0.132, 0.553, P = 0.001, Q = 10.8, P = 0.22, I² = 25.7%, NNT = 5) and LS (6 g's, 402 participants, g = 0.201, 95% CI = 0.009, 0.394, P = 0.04, Q = 3.3, P = 0.65, I² = 0%, NNT = 9) BMD. A trend for greater benefits in FN BMD was observed for studies published in countries other than the United States and for those who participated in home versus facility-based exercise. Statistically significant, or a trend for statistically significant, associations were observed for 7 different moderators and predictors, 6 for FN BMD and 1 for LS BMD. Conclusions. Exercise benefits FN and LS BMD in premenopausal women. The observed moderators and predictors deserve further investigation in well-designed randomized controlled trials. PMID:23401684
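
    The pooling step described in the Methods can be sketched with a standard DerSimonian-Laird random-effects model: each study contributes a standardized effect size g and its variance, the between-study variance tau² is estimated from Cochran's Q, and the pooled estimate uses inverse-variance weights. The study-level values below are placeholders, not the trial data from this meta-analysis.

    ```python
    import numpy as np
    from scipy.stats import norm

    # Hypothetical study-level effect sizes (g) and their variances.
    g = np.array([0.45, 0.20, 0.35, 0.10, 0.50, 0.30, 0.25])
    v = np.array([0.04, 0.03, 0.05, 0.02, 0.06, 0.03, 0.04])

    w_fixed = 1.0 / v
    g_fixed = np.sum(w_fixed * g) / np.sum(w_fixed)
    Q = np.sum(w_fixed * (g - g_fixed) ** 2)               # Cochran's Q
    df = len(g) - 1
    C = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
    tau2 = max(0.0, (Q - df) / C)                          # DerSimonian-Laird
    I2 = max(0.0, (Q - df) / Q) * 100                      # heterogeneity (%)

    w = 1.0 / (v + tau2)                                   # random-effects weights
    g_pooled = np.sum(w * g) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    z = g_pooled / se
    p = 2 * (1 - norm.cdf(abs(z)))
    print(f"g = {g_pooled:.3f}, 95% CI = ({g_pooled - 1.96*se:.3f}, "
          f"{g_pooled + 1.96*se:.3f}), p = {p:.3f}, Q = {Q:.1f}, I2 = {I2:.1f}%")
    ```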

  8. Exercise and bone mineral density in premenopausal women: a meta-analysis of randomized controlled trials.

    PubMed

    Kelley, George A; Kelley, Kristi S; Kohrt, Wendy M

    2013-01-01

    Objective. Examine the effects of exercise on femoral neck (FN) and lumbar spine (LS) bone mineral density (BMD) in premenopausal women. Methods. Meta-analysis of randomized controlled exercise trials ≥24 weeks in premenopausal women. Standardized effect sizes (g) were calculated for each result and pooled using random-effects models, Z score alpha values, 95% confidence intervals (CIs), and number needed to treat (NNT). Heterogeneity was examined using Q and I(2). Moderator and predictor analyses using mixed-effects ANOVA and simple metaregression were conducted. Statistical significance was set at P ≤ 0.05. Results. Statistically significant improvements were found for both FN (7g's, 466 participants, g = 0.342, 95%  CI = 0.132, 0.553, P = 0.001, Q = 10.8, P = 0.22, I(2) = 25.7%, NNT = 5) and LS (6g's, 402 participants, g = 0.201, 95%  CI = 0.009, 0.394, P = 0.04, Q = 3.3, P = 0.65, I(2) = 0%, NNT = 9) BMD. A trend for greater benefits in FN BMD was observed for studies published in countries other than the United States and for those who participated in home versus facility-based exercise. Statistically significant, or a trend for statistically significant, associations were observed for 7 different moderators and predictors, 6 for FN BMD and 1 for LS BMD. Conclusions. Exercise benefits FN and LS BMD in premenopausal women. The observed moderators and predictors deserve further investigation in well-designed randomized controlled trials.

  9. Statistical definition of relapse: case of family drug court.

    PubMed

    Alemi, Farrokh; Haack, Mary; Nemes, Susanna

    2004-06-01

    At any point in time, a patient's return to drug use can be seen either as a temporary event or as a return to persistent use. There is no formal standard for distinguishing persistent drug use from an occasional relapse. This lack of standardization persists even though the consequences of either interpretation can be life altering. In a drug court or regulatory setting, for example, misinterpreting a relapse as a return to persistent drug use could lead to incarceration, loss of child custody, or loss of employment. A clinician who mistakes a client's relapse for persistent drug use may fail to adjust treatment intensity to the client's needs. An empirical and standardized method for distinguishing relapse from persistent drug use is needed. This paper provides a tool for clinicians and judges to distinguish relapse from persistent use based on statistical analysis of patterns in the client's drug use. To accomplish this, a control chart is created for the time in between relapses. This paper shows how a statistical limit can be calculated by examining either the client's history or other clients in the same program. If the client's time in between relapses exceeds the statistical limit, then the client has returned to persistent use. Otherwise, the drug use is temporary. To illustrate the method, it is applied to data from three family drug courts. The approach allows the estimation of control limits based on the client's as well as the court's historical patterns. The approach also allows comparison of courts based on recovery rates.
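
    One generic way to put control limits on the time between relapses is an individuals (XmR) chart: the center line is the mean interval and the limits are derived from the average moving range. The sketch below uses hypothetical intervals and the standard 2.66 moving-range factor; it is offered as an illustration of the control-chart idea, not necessarily the exact limit calculation used in the paper.

    ```python
    import numpy as np

    # Hypothetical days between successive positive drug tests for one client.
    days_between = np.array([12, 9, 15, 20, 11, 18, 25, 14, 3])

    mr = np.abs(np.diff(days_between))            # moving ranges
    center = days_between.mean()
    ucl = center + 2.66 * mr.mean()               # upper control limit
    lcl = max(0.0, center - 2.66 * mr.mean())     # lower limit, floored at zero

    latest = days_between[-1]
    in_control = lcl <= latest <= ucl             # departure from usual pattern?
    print(f"CL={center:.1f}, LCL={lcl:.1f}, UCL={ucl:.1f}; "
          f"latest interval={latest} days, within limits: {in_control}")
    ```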

  10. Kidney function changes with aging in adults: comparison between cross-sectional and longitudinal data analyses in renal function assessment.

    PubMed

    Chung, Sang M; Lee, David J; Hand, Austin; Young, Philip; Vaidyanathan, Jayabharathi; Sahajwalla, Chandrahas

    2015-12-01

    The study evaluated whether the estimated rate of renal function decline per year with age in adults varies between two primary statistical analyses: cross-sectional (CS), using one observation per subject, and longitudinal (LT), using multiple observations per subject over time. A total of 16,628 records (3,946 subjects; age range 30-92 years) of creatinine clearance and relevant demographic data were used. On average, four samples per subject were collected over up to 2,364 days (mean: 793 days). A simple linear regression model and a random coefficient model were selected for the CS and LT analyses, respectively. The renal function decline rates were 1.33 and 0.95 ml/min/year for the CS and LT analyses, respectively, and were slower when the repeated individual measurements were considered. The study confirms that the rates differ depending on the statistical analysis, and that a statistically robust longitudinal model with a proper sampling design provides reliable individual as well as population estimates of the rate of renal function decline with age in adults. In conclusion, our findings indicate that one should be cautious in interpreting the rate of renal function decline with aging because its estimate is highly dependent on the statistical analysis. From our analyses, a population longitudinal analysis (e.g. a random coefficient model) is recommended if individualization is critical, such as a dose adjustment based on renal function during a chronic therapy. Copyright © 2015 John Wiley & Sons, Ltd.
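
    The two strategies contrasted above can be sketched with statsmodels: ordinary least squares on one record per subject for the cross-sectional estimate, and a mixed-effects (random coefficient) model with a random intercept and slope for the longitudinal estimate. The input file and column names (subject, age, crcl) are hypothetical placeholders, not the study's dataset.

    ```python
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical file: one row per creatinine-clearance measurement,
    # with columns subject, age, crcl.
    df = pd.read_csv("renal_function.csv")

    # Cross-sectional: one (here, the earliest) observation per subject.
    cs = df.sort_values("age").groupby("subject", as_index=False).first()
    cs_fit = smf.ols("crcl ~ age", data=cs).fit()

    # Longitudinal: random coefficient model (random intercept and slope).
    lt_fit = smf.mixedlm("crcl ~ age", data=df, groups=df["subject"],
                         re_formula="~age").fit()

    print("CS decline (ml/min per year):", -cs_fit.params["age"])
    print("LT decline (ml/min per year):", -lt_fit.params["age"])
    ```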

  11. New estimates of asymmetric decomposition of racemic mixtures by natural beta-radiation sources

    NASA Technical Reports Server (NTRS)

    Hegstrom, R. A.; Rich, A.; Van House, J.

    1985-01-01

    Some recent calculations that appeared to invalidate the Vester-Ulbricht hypothesis, which suggests that the chirality of biological molecules originates from the beta-radiolysis of prebiotic racemic mixtures, are reexamined. These calculations apparently showed that the radiolysis-induced chiral polarization can never exceed the chiral polarization produced by statistical fluctuations. It is here shown that several overly restrictive conditions were imposed on these calculations which, when relaxed, allow the radiolysis-induced polarization to exceed that produced by statistical fluctuations, in accordance with the Vester-Ulbricht hypothesis.

  12. IPAC-Inlet Performance Analysis Code

    NASA Technical Reports Server (NTRS)

    Barnhart, Paul J.

    1997-01-01

    A series of analyses have been developed which permit the calculation of the performance of common inlet designs. The methods presented are useful for determining the inlet weight flows, total pressure recovery, and aerodynamic drag coefficients for given inlet geometric designs. Limited geometric input data is required to use this inlet performance prediction methodology. The analyses presented here may also be used to perform inlet preliminary design studies. The calculated inlet performance parameters may be used in subsequent engine cycle analyses or installed engine performance calculations for existing uninstalled engine data.

  13. Computed statistics at streamgages, and methods for estimating low-flow frequency statistics and development of regional regression equations for estimating low-flow frequency statistics at ungaged locations in Missouri

    USGS Publications Warehouse

    Southard, Rodney E.

    2013-01-01

    The weather and precipitation patterns in Missouri vary considerably from year to year. In 2008, the statewide average rainfall was 57.34 inches and in 2012, the statewide average rainfall was 30.64 inches. This variability in precipitation and resulting streamflow in Missouri underlies the necessity for water managers and users to have reliable streamflow statistics and a means to compute select statistics at ungaged locations for a better understanding of water availability. Knowledge of surface-water availability is dependent on the streamflow data that have been collected and analyzed by the U.S. Geological Survey for more than 100 years at approximately 350 streamgages throughout Missouri. The U.S. Geological Survey, in cooperation with the Missouri Department of Natural Resources, computed streamflow statistics at streamgages through the 2010 water year, defined periods of drought and defined methods to estimate streamflow statistics at ungaged locations, and developed regional regression equations to compute selected streamflow statistics at ungaged locations. Streamflow statistics and flow durations were computed for 532 streamgages in Missouri and in neighboring States of Missouri. For streamgages with more than 10 years of record, Kendall’s tau was computed to evaluate for trends in streamflow data. If trends were detected, the variable length method was used to define the period of no trend. Water years were removed from the dataset from the beginning of the record for a streamgage until no trend was detected. Low-flow frequency statistics were then computed for the entire period of record and for the period of no trend if 10 or more years of record were available for each analysis. Three methods are presented for computing selected streamflow statistics at ungaged locations. The first method uses power curve equations developed for 28 selected streams in Missouri and neighboring States that have multiple streamgages on the same streams. Statistical estimates on one of these streams can be calculated at an ungaged location that has a drainage area that is between 40 percent of the drainage area of the farthest upstream streamgage and within 150 percent of the drainage area of the farthest downstream streamgage along the stream of interest. The second method may be used on any stream with a streamgage that has operated for 10 years or longer and for which anthropogenic effects have not changed the low-flow characteristics at the ungaged location since collection of the streamflow data. A ratio of drainage area of the stream at the ungaged location to the drainage area of the stream at the streamgage was computed to estimate the statistic at the ungaged location. The range of applicability is between 40- and 150-percent of the drainage area of the streamgage, and the ungaged location must be located on the same stream as the streamgage. The third method uses regional regression equations to estimate selected low-flow frequency statistics for unregulated streams in Missouri. This report presents regression equations to estimate frequency statistics for the 10-year recurrence interval and for the N-day durations of 1, 2, 3, 7, 10, 30, and 60 days. Basin and climatic characteristics were computed using geographic information system software and digital geospatial data. A total of 35 characteristics were computed for use in preliminary statewide and regional regression analyses based on existing digital geospatial data and previous studies. 
Spatial analyses for geographical bias in the predictive accuracy of the regional regression equations defined three low-flow regions within the State, representing the three major physiographic provinces in Missouri. Region 1 includes the Central Lowlands, Region 2 includes the Ozark Plateaus, and Region 3 includes the Mississippi Alluvial Plain. A total of 207 streamgages were used in the regression analyses for the regional equations. Of the 207 U.S. Geological Survey streamgages, 77 were located in Region 1, 120 were located in Region 2, and 10 were located in Region 3. Streamgages located outside of Missouri were selected to extend the range of data used for the independent variables in the regression analyses. Streamgages included in the regression analyses had 10 or more years of record and were considered to be affected minimally by anthropogenic activities or trends. Regional regression analyses identified three characteristics as statistically significant for the development of regional equations. For Region 1, drainage area, longest flow path, and streamflow-variability index were statistically significant. The range in the standard error of estimate for Region 1 is 79.6 to 94.2 percent. For Region 2, drainage area and streamflow-variability index were statistically significant, and the range in the standard error of estimate is 48.2 to 72.1 percent. For Region 3, drainage area and streamflow-variability index also were statistically significant, with a range in the standard error of estimate of 48.1 to 96.2 percent. Limitations on estimating low-flow frequency statistics at ungaged locations depend on the method used. The first method outlined for use in Missouri, power curve equations, was developed to estimate the selected statistics for ungaged locations on 28 selected streams with multiple streamgages located on the same stream. A second method uses a drainage-area ratio to compute statistics at an ungaged location using data from a single streamgage on the same stream with 10 or more years of record. Ungaged locations on these streams may use the ratio of the drainage area at the ungaged location to the drainage area at the streamgage location to scale the selected statistic value from the streamgage location to the ungaged location. This method can be used if the drainage area of the ungaged location is within 40 to 150 percent of the streamgage drainage area. The third method is the use of the regional regression equations. The limits for the use of these equations are based on the ranges of the characteristics used as independent variables and on the requirement that streams be affected minimally by anthropogenic activities.
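
    The second method described above, scaling a statistic from a gaged to an ungaged location by the drainage-area ratio, is simple arithmetic. The helper below is an illustrative sketch that also enforces the 40- to 150-percent applicability range stated in the report; the example numbers are hypothetical.

    ```python
    def drainage_area_ratio_estimate(stat_gage: float,
                                     area_gage: float,
                                     area_ungaged: float) -> float:
        """Scale a low-flow statistic from a streamgage to an ungaged site
        on the same stream using the drainage-area ratio method."""
        ratio = area_ungaged / area_gage
        if not 0.40 <= ratio <= 1.50:
            raise ValueError("ungaged drainage area must be within 40-150% "
                             "of the streamgage drainage area")
        return stat_gage * ratio

    # Example: a 7-day, 10-year low flow of 12 ft3/s at a gage draining 250 mi2,
    # estimated at an ungaged site draining 180 mi2 on the same stream.
    print(drainage_area_ratio_estimate(12.0, 250.0, 180.0))  # ~8.6 ft3/s
    ```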

  14. Mandibular trabecular bone as fracture indicator in 80-year-old men and women.

    PubMed

    Hassani-Nejad, Azar; Ahlqwist, Margareta; Hakeberg, Magnus; Jonasson, Grethe

    2013-12-01

    The objective of the present study was to compare assessments of the mandibular bone as fracture risk indicators for 277 men and women. The mandibular trabecular bone was evaluated in periapical radiographs, using a visual index, as dense, mixed dense and sparse, or sparse. Bone texture was analysed using a computer-based method in which the number of transitions from trabeculae to intertrabecular spaces was calculated. The sum of the sizes and intensities of the spaces between the trabeculae was calculated using Jaw-X software. Women had a statistically significantly greater number of fractures and a higher frequency of sparse mandibular bone. The OR for having suffered a fracture with visually sparse trabecular bone was higher for the male group (OR = 5.55) than for the female group (OR = 3.35). For bone texture as an indicator of previous fracture, the OR was significant for the female group (OR = 2.61) but not for the male group, whereas the Jaw-X calculations did not differentiate between the fractured and non-fractured groups. In conclusion, all bone-quality assessments showed that women had a higher incidence of sparse trabecular bone than did men. Only the visual assessment and trabecular texture methods were significantly correlated with previous bone fractures. © 2013 Eur J Oral Sci.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gu, M F; Holcomb, C; Jayakuma, J

    We present detailed atomic physics models for the motional Stark effect (MSE) diagnostic on magnetic fusion devices. Excitation and ionization cross sections of the hydrogen or deuterium beam traveling in a magnetic field in collisions with electrons, ions, and neutral gas are calculated in the first Born approximation. The density matrices and polarization states of individual Stark-Zeeman components of the Balmer α line are obtained for both beam into plasma and beam into gas models. A detailed comparison of the model calculations and the MSE polarimetry and spectral intensity measurements obtained at the DIII-D tokamak is carried out. Although our beam into gas models provide a qualitative explanation for the larger π/σ intensity ratios and represent significant improvements over the statistical population models, empirical adjustment factors ranging from 1.0-2.0 must still be applied to individual line intensities to bring the calculations into full agreement with the observations. Nevertheless, we demonstrate that beam into gas measurements can be used successfully as calibration procedures for measuring the magnetic pitch angle through π/σ intensity ratios. The analyses of the filter-scan polarization spectra from the DIII-D MSE polarimetry system indicate unknown channel- and time-dependent light contaminations in the beam into gas measurements. Such contaminations may be the main reason for the failure of beam into gas calibration on MSE polarimetry systems.

  16. Heads Up! a Calculation- & Jargon-Free Approach to Statistics

    ERIC Educational Resources Information Center

    Giese, Alan R.

    2012-01-01

    Evaluating the strength of evidence in noisy data is a critical step in scientific thinking that typically relies on statistics. Students without statistical training will benefit from heuristic models that highlight the logic of statistical analysis. The likelihood associated with various coin-tossing outcomes gives students such a model. There…

  17. Characterization of Hg1-xCdxTe heterostructures by thermoelectric measurements

    NASA Astrophysics Data System (ADS)

    Baars, J.; Brink, D.; Edwall, D. D.; Bubulac, L. O.

    1993-08-01

    P-on-n mercury cadmium telluride (MCT) heterostructures grown by MOCVD with As and In as n- and p-type dopants, respectively, are examined by measuring the Seebeck and Hall coefficients between 20 and 320K. The results are analyzed regarding doping and composition of the layers by least squares fitting the experimental profiles with the calculated temperature dependencies. The electron and hole densities of the layers are calculated taking into account Fermi-Dirac statistics, a nonparabolic conduction band, a parabolic valence band, a discrete acceptor level, and fully ionized donors. For the Seebeck coefficient, the relation we previously showed to be valid for p-type MCT1 is used. This relation relies on the thermoelectric effect in a temperature gradient resulting from the diffusion of nondegenerate carriers scattered by LO-phonons. It also fits the observed thermoelectric properties of n-type MCT in a wide temperature range. The doping and structural parameters determined from the thermoelectric measurements agreed very well with As and In profiles obtained from secondary ion mass spectroscopy measurements and the data obtained from analyses of infrared transmission measurements.

  18. Correcting power and p-value calculations for bias in diffusion tensor imaging.

    PubMed

    Lauzon, Carolyn B; Landman, Bennett A

    2013-07-01

    Diffusion tensor imaging (DTI) provides quantitative parametric maps sensitive to tissue microarchitecture (e.g., fractional anisotropy, FA). These maps are estimated through computational processes and are subject to random distortions, including variance and bias. Traditional statistical procedures commonly used for study planning (including power analyses and p-value/alpha-rate thresholds) specifically model variability but neglect the potential impacts of bias. Herein, we quantitatively investigate the impacts of bias in DTI on hypothesis test properties (power and alpha rate) using a two-sided hypothesis testing framework. We present a theoretical evaluation of bias on hypothesis test properties, evaluate the bias estimation technique SIMEX for DTI hypothesis testing using simulated data, and evaluate the impacts of bias on spatially varying power and alpha rates in an empirical study of 21 subjects. Bias is shown to inflate alpha rates, distort the power curve, and cause significant power loss even in empirical settings where the expected difference in bias between groups is zero. These adverse effects can be attenuated by properly accounting for bias in the calculation of power and p-values. Copyright © 2013 Elsevier Inc. All rights reserved.
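
    The central point, that an uncorrected bias shifts the test statistic and therefore distorts both the alpha rate and the power curve, can be illustrated with a simple two-sided z-test calculation. This is a generic textbook-style sketch, not the SIMEX procedure or the authors' code; the effect, bias, and standard error values are hypothetical.

    ```python
    from scipy.stats import norm

    def two_sided_power(effect, bias, se, alpha=0.05):
        """Rejection probability of a two-sided z-test of H0: effect = 0
        when the estimator is centred at effect + bias with standard error se."""
        z = norm.ppf(1.0 - alpha / 2.0)
        shift = (effect + bias) / se
        return norm.cdf(-z - shift) + 1.0 - norm.cdf(z - shift)

    se = 0.02
    print("alpha, no bias:   ", two_sided_power(0.00, 0.00, se))   # ~0.05
    print("alpha, with bias: ", two_sided_power(0.00, 0.03, se))   # inflated
    print("power, no bias:   ", two_sided_power(0.05, 0.00, se))
    print("power, with bias: ", two_sided_power(0.05, -0.03, se))  # power loss
    ```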

  19. Inferential Statistics in "Language Teaching Research": A Review and Ways Forward

    ERIC Educational Resources Information Center

    Lindstromberg, Seth

    2016-01-01

    This article reviews all (quasi)experimental studies appearing in the first 19 volumes (1997-2015) of "Language Teaching Research" (LTR). Specifically, it provides an overview of how statistical analyses were conducted in these studies and of how the analyses were reported. The overall conclusion is that there has been a tight adherence…

  20. The Web as an educational tool for/in learning/teaching bioinformatics statistics.

    PubMed

    Oliver, J; Pisano, M E; Alonso, T; Roca, P

    2005-12-01

    Statistics provides essential tools in bioinformatics for interpreting the results of a database search and for managing the enormous amounts of information produced by genomics, proteomics and metabolomics. The goal of this project was the development of a software tool that would be as simple as possible to demonstrate the use of statistics in bioinformatics. Computer Simulation Methods (CSMs) developed using Microsoft Excel were chosen for their broad range of applications, immediate and easy formula calculation, immediate testing and easy graphics representation, and their general use and acceptance by the scientific community. The result of these endeavours is a set of utilities which can be accessed from the following URL: http://gmein.uib.es/bioinformatica/statistics. When tested on students with previous coursework based on traditional statistical teaching methods, the overall consensus was that Web-based instruction had numerous advantages, but that traditional methods with manual calculations were also needed for theory and practice. Once the basic statistical formulas had been mastered, Excel spreadsheets and graphics proved very useful for trying many parameters rapidly without having to perform tedious calculations. CSMs will be of great importance for the training of students and professionals in the field of bioinformatics, and for upcoming applications in self-directed and continuing education.

  1. Statistical characteristics of trajectories of diamagnetic unicellular organisms in a magnetic field.

    PubMed

    Gorobets, Yu I; Gorobets, O Yu

    2015-01-01

    A statistical model is proposed in this paper for describing the orientation of trajectories of unicellular diamagnetic organisms in a magnetic field. A statistical parameter, the effective energy, is calculated on the basis of this model. The resulting effective energy is a statistical characteristic of the trajectories of diamagnetic microorganisms in a magnetic field connected with their metabolism. The statistical model is applicable when the energy of the thermal motion of the bacteria is negligible in comparison with their energy in a magnetic field and the bacteria manifest significant "active random movement", i.e. randomizing motion of a non-thermal nature, for example, movement by means of flagella. The energy of this randomizing active self-motion of the bacteria is characterized by a new statistical parameter for biological objects, which replaces the energy of randomizing thermal motion in the calculation of the statistical distribution. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. Calculation of Weibull strength parameters, Batdorf flaw density constants and related statistical quantities using PC-CARES

    NASA Technical Reports Server (NTRS)

    Szatmary, Steven A.; Gyekenyesi, John P.; Nemeth, Noel N.

    1990-01-01

    This manual describes the operation and theory of the PC-CARES (Personal Computer-Ceramic Analysis and Reliability Evaluation of Structures) computer program for the IBM PC and compatibles running the PC-DOS/MS-DOS or IBM/MS-OS/2 (version 1.1 or higher) operating systems. The primary purpose of this code is to estimate Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities. Included in the manual is a description of the calculation of the shape and scale parameters of the two-parameter Weibull distribution using the least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics with complete and censored samples. The methods for detecting outliers and for calculating the Kolmogorov-Smirnov and the Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull line, as well as the techniques for calculating the Batdorf flaw-density constants, are also described.
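
    For the maximum likelihood part of the parameter estimation described above, a quick modern equivalent is SciPy's Weibull fit with the location fixed at zero, which returns the shape (Weibull modulus) and scale (characteristic strength) of the two-parameter distribution. This is only a generic sketch of MLE fitting for a complete, uncensored sample with synthetic strengths; it is not a re-implementation of PC-CARES.

    ```python
    import numpy as np
    from scipy.stats import weibull_min

    # Hypothetical fracture strengths (MPa) from a complete sample.
    rng = np.random.default_rng(1)
    strengths = weibull_min.rvs(c=10.0, scale=400.0, size=30, random_state=rng)

    # Two-parameter Weibull MLE: fix the location parameter at zero.
    shape, loc, scale = weibull_min.fit(strengths, floc=0)
    print(f"Weibull modulus (shape) m = {shape:.2f}, "
          f"characteristic strength = {scale:.1f} MPa")
    ```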

  3. Comparison of Numerical Analyses with a Static Load Test of a Continuous Flight Auger Pile

    NASA Astrophysics Data System (ADS)

    Hoľko, Michal; Stacho, Jakub

    2014-12-01

    The article deals with numerical analyses of a Continuous Flight Auger (CFA) pile. The analyses include a comparison of calculated and measured load-settlement curves as well as a comparison of the load distribution over the pile's length. The numerical analyses were executed using two types of software, Ansys and Plaxis, both of which are based on FEM calculations. The two programs differ in the way they create numerical models, model the interface between the pile and soil, and use constitutive material models. The analyses have been prepared in the form of a parametric study in which the method of modelling the interface and the material models of the soil are compared and analysed. Our analyses show that both types of software permit the modelling of pile foundations. The Plaxis software uses advanced material models and can also model the effects of groundwater and overconsolidation. The load-settlement curve calculated using Plaxis matches the results of a static load test with more than 95% accuracy. In comparison, the load-settlement curve calculated using Ansys provides only an approximate estimate, but that software allows large structural systems to be modelled together with their foundation system.

  4. Methodologie de modelisation aerostructurelle d'une aile utilisant un logiciel de calcul aerodynamique et un logiciel de calcul par elements finis =

    NASA Astrophysics Data System (ADS)

    Communier, David

    During the structural analysis of an aircraft wing, it is difficult to model faithfully the aerodynamic forces experienced by the wing. To simplify the analysis, the theoretical maximum lift of the wing is commonly distributed over its main spar or over its ribs. This distribution implies that the whole wing will be stronger than necessary and therefore that the structure will not be fully optimized. To overcome this problem, an aerodynamic distribution of the lift should be applied over the complete surface of the wing, giving a much more reliable distribution of loads. To achieve this, the results of a program that calculates the aerodynamic loads on the wing must be coupled with the results of a program that performs its design and structural analysis. In this project, the software used to calculate the pressure coefficients on the wing is XFLR5, and the software used for design and structural analysis is CATIA V5. XFLR5 allows a rapid analysis of a wing based on the analysis of its airfoils. It calculates airfoil performance in the same way as XFOIL and offers a choice of three calculation methods for the performance of the wing: Lifting Line Theory (LLT), the Vortex Lattice Method (VLM), and 3D Panels. In our methodology, we use the 3D Panels method, whose validity was verified in a wind tunnel to confirm the XFLR5 calculations. For the design and finite-element analysis of the structure, CATIA V5 is widely used in the aerospace industry and allows the wing design steps to be automated. This thesis therefore describes a methodology for the aerostructural study of an aircraft wing.

  5. Accuracy of Currently Used Paper Burn Diagram vs a Three-Dimensional Computerized Model.

    PubMed

    Benjamin, Nicole C; Lee, Jong O; Norbury, William B; Branski, Ludwik K; Wurzer, Paul; Jimenez, Carlos J; Benjamin, Debra A; Herndon, David N

    Burn units have historically used paper diagrams to estimate percent burn; however, unintentional errors can occur. The use of a computer program that incorporates wound mapping from photographs onto a three-dimensional (3D) human diagram could decrease subjectivity in preparing burn diagrams and in the subsequent calculations of TBSA burned. Analyses were done on 19 burned patients who had an estimated TBSA burned of ≥20%. The patients were admitted to Shriners Hospitals for Children or the University of Texas Medical Branch in Galveston, Texas, from July 2012 to September 2013 for treatment. Digital photographs were collected before each patient's first surgery. Using BurnCase 3D (RISC Software GmbH, Hagenberg, Austria), a burn-mapping software package, the user traced partial- and full-thickness burns from the photographs. The program then superimposed the tracings onto a 3D model and calculated percent burned. The results were compared with the Lund and Browder diagrams completed after the first operation. A two-tailed t-test was used to test for statistical differences. For partial-thickness burns, burn sizes calculated using Lund and Browder diagrams were significantly larger than those calculated using BurnCase 3D (15% difference, P < .01). The opposite was found for full-thickness burns, with burn sizes being smaller when calculated using Lund and Browder diagrams (11% difference, P < .05). In conclusion, substantial differences exist between percent burn estimations derived from BurnCase 3D and from paper diagrams. In our studied cohort, paper diagrams were associated with overestimation of partial-thickness burn size and underestimation of full-thickness burn size. Additional studies comparing BurnCase 3D with other commonly used methods are warranted.
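
    The statistical comparison described above, paired estimates of percent TBSA from two methods on the same patients tested with a two-tailed t-test, can be sketched as follows. The values are hypothetical placeholders, not the study's Lund and Browder or BurnCase 3D measurements.

    ```python
    import numpy as np
    from scipy.stats import ttest_rel

    # Hypothetical percent TBSA (partial-thickness) for the same patients.
    paper_diagram = np.array([28.0, 35.5, 22.0, 40.0, 31.0, 26.5, 45.0])
    burncase_3d = np.array([24.0, 30.0, 20.5, 34.5, 27.0, 23.0, 39.0])

    t, p = ttest_rel(paper_diagram, burncase_3d)   # paired, two-tailed by default
    mean_diff = np.mean(paper_diagram - burncase_3d)
    print(f"mean difference = {mean_diff:.1f}% TBSA, t = {t:.2f}, p = {p:.4f}")
    ```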

  6. The Threshold Bias Model: A Mathematical Model for the Nomothetic Approach of Suicide

    PubMed Central

    Folly, Walter Sydney Dutra

    2011-01-01

    Background Comparative and predictive analyses of suicide data from different countries are difficult to perform due to varying approaches and the lack of comparative parameters. Methodology/Principal Findings A simple model (the Threshold Bias Model) was tested for comparative and predictive analyses of suicide rates by age. The model comprises a six-parameter distribution that was applied to the USA suicide rates by age for the years 2001 and 2002. Subsequently, linear extrapolations of the parameter values obtained for these years were performed in order to estimate the values corresponding to the year 2003. The calculated distributions agreed reasonably well with the aggregate data. The model was also used to determine the age above which suicide rates become statistically observable in the USA, Brazil and Sri Lanka. Conclusions/Significance The Threshold Bias Model has considerable potential applications in demographic studies of suicide. Moreover, since the model can be used to predict the evolution of suicide rates based on information extracted from past data, it will be of great interest to suicidologists and other researchers in the field of mental health. PMID:21909431

  7. GenAlEx 6.5: genetic analysis in Excel. Population genetic software for teaching and research--an update.

    PubMed

    Peakall, Rod; Smouse, Peter E

    2012-10-01

    GenAlEx: Genetic Analysis in Excel is a cross-platform package for population genetic analyses that runs within Microsoft Excel. GenAlEx offers analysis of diploid codominant, haploid and binary genetic loci and DNA sequences. Both frequency-based (F-statistics, heterozygosity, HWE, population assignment, relatedness) and distance-based (AMOVA, PCoA, Mantel tests, multivariate spatial autocorrelation) analyses are provided. New features include calculation of new estimators of population structure: G'(ST), G''(ST), Jost's D(est) and F'(ST) through AMOVA, Shannon Information analysis, linkage disequilibrium analysis for biallelic data and novel heterogeneity tests for spatial autocorrelation analysis. Export to more than 30 other data formats is provided. Teaching tutorials and expanded step-by-step output options are included. The comprehensive guide has been fully revised. GenAlEx is written in VBA and provided as a Microsoft Excel Add-in (compatible with Excel 2003, 2007, 2010 on PC; Excel 2004, 2011 on Macintosh). GenAlEx, and supporting documentation and tutorials are freely available at: http://biology.anu.edu.au/GenAlEx. rod.peakall@anu.edu.au.

  8. The threshold bias model: a mathematical model for the nomothetic approach of suicide.

    PubMed

    Folly, Walter Sydney Dutra

    2011-01-01

    Comparative and predictive analyses of suicide data from different countries are difficult to perform due to varying approaches and the lack of comparative parameters. A simple model (the Threshold Bias Model) was tested for comparative and predictive analyses of suicide rates by age. The model comprises a six-parameter distribution that was applied to the USA suicide rates by age for the years 2001 and 2002. Subsequently, linear extrapolations of the parameter values obtained for these years were performed in order to estimate the values corresponding to the year 2003. The calculated distributions agreed reasonably well with the aggregate data. The model was also used to determine the age above which suicide rates become statistically observable in the USA, Brazil and Sri Lanka. The Threshold Bias Model has considerable potential applications in demographic studies of suicide. Moreover, since the model can be used to predict the evolution of suicide rates based on information extracted from past data, it will be of great interest to suicidologists and other researchers in the field of mental health.

  9. Detecting the existence of gene flow between Spanish and North African goats through a coalescent approach.

    PubMed

    Martínez, Amparo; Manunza, Arianna; Delgado, Juan Vicente; Landi, Vincenzo; Adebambo, Ayotunde; Ismaila, Muritala; Capote, Juan; El Ouni, Mabrouk; Elbeltagy, Ahmed; Abushady, Asmaa M; Galal, Salah; Ferrando, Ainhoa; Gómez, Mariano; Pons, Agueda; Badaoui, Bouabid; Jordana, Jordi; Vidal, Oriol; Amills, Marcel

    2016-12-14

    Human-driven migrations are one of the main processes shaping the genetic diversity and population structure of domestic species. However, their magnitude and direction have been rarely analysed in a statistical framework. We aimed to estimate the impact of migration on the population structure of Spanish and African goats. To achieve this goal, we analysed a dataset of 1,472 individuals typed with 23 microsatellites. Population structure of African and Spanish goats was moderate (mean FST = 0.07), with the exception of the Canarian and South African breeds that displayed a significant differentiation when compared to goats from North Africa and Nigeria. Measurement of gene flow with Migrate-n and IMa coalescent genealogy samplers supported the existence of a bidirectional gene flow between African and Spanish goats. Moreover, IMa estimates of the effective number of migrants were remarkably lower than those calculated with Migrate-n and classical approaches. Such discrepancies suggest that recent divergence, rather than extensive gene flow, is the main cause of the weak population structure observed in caprine breeds.

  10. Detecting the existence of gene flow between Spanish and North African goats through a coalescent approach

    PubMed Central

    Martínez, Amparo; Manunza, Arianna; Delgado, Juan Vicente; Landi, Vincenzo; Adebambo, Ayotunde; Ismaila, Muritala; Capote, Juan; El Ouni, Mabrouk; Elbeltagy, Ahmed; Abushady, Asmaa M.; Galal, Salah; Ferrando, Ainhoa; Gómez, Mariano; Pons, Agueda; Badaoui, Bouabid; Jordana, Jordi; Vidal, Oriol; Amills, Marcel

    2016-01-01

    Human-driven migrations are one of the main processes shaping the genetic diversity and population structure of domestic species. However, their magnitude and direction have been rarely analysed in a statistical framework. We aimed to estimate the impact of migration on the population structure of Spanish and African goats. To achieve this goal, we analysed a dataset of 1,472 individuals typed with 23 microsatellites. Population structure of African and Spanish goats was moderate (mean FST = 0.07), with the exception of the Canarian and South African breeds that displayed a significant differentiation when compared to goats from North Africa and Nigeria. Measurement of gene flow with Migrate-n and IMa coalescent genealogy samplers supported the existence of a bidirectional gene flow between African and Spanish goats. Moreover, IMa estimates of the effective number of migrants were remarkably lower than those calculated with Migrate-n and classical approaches. Such discrepancies suggest that recent divergence, rather than extensive gene flow, is the main cause of the weak population structure observed in caprine breeds. PMID:27966592

  11. Fels-Rand: an Xlisp-Stat program for the comparative analysis of data under phylogenetic uncertainty.

    PubMed

    Blomberg, S

    2000-11-01

    Currently available programs for the comparative analysis of phylogenetic data do not perform optimally when the phylogeny is not completely specified (i.e. the phylogeny contains polytomies). Recent literature suggests that a better way to analyse the data would be to create random trees from the known phylogeny that are fully-resolved but consistent with the known tree. A computer program is presented, Fels-Rand, that performs such analyses. A randomisation procedure is used to generate trees that are fully resolved but whose structure is consistent with the original tree. Statistics are then calculated on a large number of these randomly-generated trees. Fels-Rand uses the object-oriented features of Xlisp-Stat to manipulate internal tree representations. Xlisp-Stat's dynamic graphing features are used to provide heuristic tools to aid in analysis, particularly outlier analysis. The usefulness of Xlisp-Stat as a system for phylogenetic computation is discussed. Available from the author or at http://www.uq.edu.au/~ansblomb/Fels-Rand.sit.hqx. Xlisp-Stat is available from http://stat.umn.edu/~luke/xls/xlsinfo/xlsinfo.html. s.blomberg@abdn.ac.uk

  12. Comparison of two surface temperature measurement using thermocouples and infrared camera

    NASA Astrophysics Data System (ADS)

    Michalski, Dariusz; Strąk, Kinga; Piasecka, Magdalena

    This paper compares two methods applied to measure surface temperatures at an experimental setup designed to analyse flow boiling heat transfer. The temperature measurements were performed in two parallel rectangular minichannels, both 1.7 mm deep, 16 mm wide and 180 mm long. The heating element for the fluid flowing in each minichannel was a thin foil made of Haynes-230. The two measurement methods employed to determine the surface temperature of the foil were: the contact method, in which thermocouples were mounted at several points in one minichannel, and the contactless method used to study the other minichannel, in which the results were provided by an infrared camera. Calculations were necessary to compare the temperature results. Two sets of measurement data obtained for different values of the heat flux were analysed using basic statistical methods, the method error, and the method accuracy, with the experimental error and the accuracy of each method taken into account. The comparative analysis showed that the values and distributions of the surface temperatures obtained with the two methods were similar, but that both methods had certain limitations.

  13. No Association of Coronary Artery Disease with X-Chromosomal Variants in Comprehensive International Meta-Analysis.

    PubMed

    Loley, Christina; Alver, Maris; Assimes, Themistocles L; Bjonnes, Andrew; Goel, Anuj; Gustafsson, Stefan; Hernesniemi, Jussi; Hopewell, Jemma C; Kanoni, Stavroula; Kleber, Marcus E; Lau, King Wai; Lu, Yingchang; Lyytikäinen, Leo-Pekka; Nelson, Christopher P; Nikpay, Majid; Qu, Liming; Salfati, Elias; Scholz, Markus; Tukiainen, Taru; Willenborg, Christina; Won, Hong-Hee; Zeng, Lingyao; Zhang, Weihua; Anand, Sonia S; Beutner, Frank; Bottinger, Erwin P; Clarke, Robert; Dedoussis, George; Do, Ron; Esko, Tõnu; Eskola, Markku; Farrall, Martin; Gauguier, Dominique; Giedraitis, Vilmantas; Granger, Christopher B; Hall, Alistair S; Hamsten, Anders; Hazen, Stanley L; Huang, Jie; Kähönen, Mika; Kyriakou, Theodosios; Laaksonen, Reijo; Lind, Lars; Lindgren, Cecilia; Magnusson, Patrik K E; Marouli, Eirini; Mihailov, Evelin; Morris, Andrew P; Nikus, Kjell; Pedersen, Nancy; Rallidis, Loukianos; Salomaa, Veikko; Shah, Svati H; Stewart, Alexandre F R; Thompson, John R; Zalloua, Pierre A; Chambers, John C; Collins, Rory; Ingelsson, Erik; Iribarren, Carlos; Karhunen, Pekka J; Kooner, Jaspal S; Lehtimäki, Terho; Loos, Ruth J F; März, Winfried; McPherson, Ruth; Metspalu, Andres; Reilly, Muredach P; Ripatti, Samuli; Sanghera, Dharambir K; Thiery, Joachim; Watkins, Hugh; Deloukas, Panos; Kathiresan, Sekar; Samani, Nilesh J; Schunkert, Heribert; Erdmann, Jeanette; König, Inke R

    2016-10-12

    In recent years, genome-wide association studies have identified 58 independent risk loci for coronary artery disease (CAD) on the autosome. However, due to the sex-specific data structure of the X chromosome, it has been excluded from most of these analyses. While females have 2 copies of chromosome X, males have only one. Also, one of the female X chromosomes may be inactivated. Therefore, special test statistics and quality control procedures are required. Thus, little is known about the role of X-chromosomal variants in CAD. To fill this gap, we conducted a comprehensive X-chromosome-wide meta-analysis including more than 43,000 CAD cases and 58,000 controls from 35 international study cohorts. For quality control, sex-specific filters were used to adequately take the special structure of X-chromosomal data into account. For single study analyses, several logistic regression models were calculated allowing for inactivation of one female X-chromosome, adjusting for sex and investigating interactions between sex and genetic variants. Then, meta-analyses including all 35 studies were conducted using random effects models. None of the investigated models revealed genome-wide significant associations for any variant. Although we analyzed the largest-to-date sample, currently available methods were not able to detect any associations of X-chromosomal variants with CAD.

  14. Ayurveda: Between Religion, Spirituality, and Medicine

    PubMed Central

    Kessler, C.; Wischnewsky, M.; Michalsen, A.; Eisenmann, C.; Melzer, J.

    2013-01-01

    Ayurveda is playing a growing part in Europe. Questions regarding the role of religion and spirituality within Ayurveda are discussed widely. Yet, there is little data on the influence of religious and spiritual aspects on its European diffusion. Methods. A survey was conducted with a new questionnaire. It was analysed by calculating frequency variables and testing differences in distributions with the χ²-test. Principal Component Analyses with Varimax Rotation were performed. Results. 140 questionnaires were analysed. Researchers found that individual religious and spiritual backgrounds influence attitudes and expectations towards Ayurveda. Statistical relationships were found between religious/spiritual backgrounds and decisions to offer/access Ayurveda. Accessing Ayurveda did not exclude the simultaneous use of modern medicine and CAM. From the majority's perspective, Ayurveda is simultaneously a science, a medicine, and a spiritual approach. Conclusion. Ayurveda seems to be able to satisfy the individual needs of therapists and patients, despite worldview differences. Ayurvedic concepts are based on anthropologic assumptions that include different levels of existence in healing approaches. Thereby, Ayurveda can be seen as being in accordance with the prerequisites for a Whole Medical System. As a result, intimate and individual therapist-patient relationships can emerge. Larger surveys with more participants and fully validated questionnaires are warranted to support these results. PMID:24368928

  15. Spatial analysis of dengue fever in Guangdong Province, China, 2001-2006.

    PubMed

    Liu, Chunxiao; Liu, Qiyong; Lin, Hualiang; Xin, Benqiang; Nie, Jun

    2014-01-01

    Guangdong Province is the area most seriously affected by dengue fever in China. In this study, we describe the spatial distribution of dengue fever in Guangdong Province from 2001 to 2006 with the objective of informing the selection of priority areas for public health planning and resource allocation. Annualized incidence at the county level was calculated and mapped to show crude incidence, excess hazard, and spatially smoothed incidence. Geographic information system-based spatial scan statistics were used to detect the spatial distribution pattern of dengue fever incidence at the county level. Spatial scan cluster analyses suggested that counties around Guangzhou City and the Chaoshan Region were at increased risk for dengue fever (P < .01). Several spatial clusters of dengue fever were found in Guangdong Province, allowing intervention measures to be targeted for maximum effect.

  16. A test of safety, violence prevention, and civility climate domain-specific relationships with relevant workplace hazards

    PubMed Central

    Spector, Paul E.

    2016-01-01

    Background Safety climate, violence prevention climate, and civility climate were independently developed and linked to domain-specific workplace hazards, although all three were designed to promote the physical and psychological safety of workers. Purpose To test domain specificity between conceptually related workplace climates and relevant workplace hazards. Methods Data were collected from 368 persons employed in various industries and descriptive statistics were calculated for all study variables. Correlational and relative weights analyses were used to test for domain specificity. Results The three climate domains were similarly predictive of most workplace hazards, regardless of domain specificity. Discussion This study suggests that the three climate domains share a common higher order construct that may predict relevant workplace hazards better than any of the scales alone. PMID:27110930

  17. GAMBIT: the global and modular beyond-the-standard-model inference tool

    NASA Astrophysics Data System (ADS)

    Athron, Peter; Balazs, Csaba; Bringmann, Torsten; Buckley, Andy; Chrząszcz, Marcin; Conrad, Jan; Cornell, Jonathan M.; Dal, Lars A.; Dickinson, Hugh; Edsjö, Joakim; Farmer, Ben; Gonzalo, Tomás E.; Jackson, Paul; Krislock, Abram; Kvellestad, Anders; Lundberg, Johan; McKay, James; Mahmoudi, Farvah; Martinez, Gregory D.; Putze, Antje; Raklev, Are; Ripken, Joachim; Rogan, Christopher; Saavedra, Aldo; Savage, Christopher; Scott, Pat; Seo, Seon-Hee; Serra, Nicola; Weniger, Christoph; White, Martin; Wild, Sebastian

    2017-11-01

    We describe the open-source global fitting package GAMBIT: the Global And Modular Beyond-the-Standard-Model Inference Tool. GAMBIT combines extensive calculations of observables and likelihoods in particle and astroparticle physics with a hierarchical model database, advanced tools for automatically building analyses of essentially any model, a flexible and powerful system for interfacing to external codes, a suite of different statistical methods and parameter scanning algorithms, and a host of other utilities designed to make scans faster, safer and more easily-extendible than in the past. Here we give a detailed description of the framework, its design and motivation, and the current models and other specific components presently implemented in GAMBIT. Accompanying papers deal with individual modules and present first GAMBIT results. GAMBIT can be downloaded from gambit.hepforge.org.

  18. Realization of a four-step molecular switch in scanning tunneling microscope manipulation of single chlorophyll-a molecules

    PubMed Central

    Iancu, Violeta; Hla, Saw-Wai

    2006-01-01

    Single chlorophyll-a molecules, a vital resource for the sustenance of life on Earth, have been investigated by using scanning tunneling microscope manipulation and spectroscopy on a gold substrate at 4.6 K. Chlorophyll-a binds on Au(111) via its porphyrin unit while the phytyl-chain is elevated from the surface by the support of four CH3 groups. By injecting tunneling electrons from the scanning tunneling microscope tip, we are able to bend the phytyl-chain, which enables the switching of four molecular conformations in a controlled manner. Statistical analyses and structural calculations reveal that all reversible switching mechanisms are initiated by a single tunneling-electron energy-transfer process, which induces bond rotation within the phytyl-chain. PMID:16954201

  19. Analysis of low-field isotropic vortex glass containing vortex groups in YBa2Cu3O7−x thin films visualized by scanning SQUID microscopy

    PubMed Central

    Wells, Frederick S.; Pan, Alexey V.; Wang, X. Renshaw; Fedoseev, Sergey A.; Hilgenkamp, Hans

    2015-01-01

    The glass-like vortex distribution in pulsed laser deposited YBa2Cu3O7 − x thin films is observed by scanning superconducting quantum interference device microscopy and analysed for ordering after cooling in magnetic fields significantly smaller than the Earth's field. Autocorrelation calculations on this distribution show a weak short-range positional order, while Delaunay triangulation shows a near-complete lack of orientational order. The distribution of these vortices is finally characterised as an isotropic vortex glass. Abnormally closely spaced groups of vortices, which are statistically unlikely to occur, are observed above a threshold magnetic field. The origin of these groups is discussed, but will require further investigation. PMID:25728772

  20. Random mechanics: Nonlinear vibrations, turbulences, seisms, swells, fatigue

    NASA Astrophysics Data System (ADS)

    Kree, P.; Soize, C.

    The random modeling of physical phenomena, together with probabilistic methods for the numerical calculation of random mechanical forces, are analytically explored. Attention is given to theoretical examinations such as probabilistic concepts, linear filtering techniques, and trajectory statistics. Applications of the methods to structures experiencing atmospheric turbulence, the quantification of turbulence, and the dynamic responses of the structures are considered. A probabilistic approach is taken to study the effects of earthquakes on structures and to the forces exerted by ocean waves on marine structures. Theoretical analyses by means of vector spaces and stochastic modeling are reviewed, as are Markovian formulations of Gaussian processes and the definition of stochastic differential equations. Finally, random vibrations with a variable number of links and linear oscillators undergoing the square of Gaussian processes are investigated.

  1. Plasma biochemical and PCV ranges for healthy, wild, immature hawksbill (Eretmochelys imbricata) sea turtles.

    PubMed

    Whiting, S D; Guinea, M L; Fomiatti, K; Flint, M; Limpus, C J

    2014-06-14

    In recent years, the use of blood chemistry as a diagnostic tool for sea turtles has been demonstrated, but much of its effectiveness relies on reference intervals. The first comprehensive blood chemistry values for healthy wild hawksbill (Eretmochelys imbricata) sea turtles are presented. Nineteen blood chemistry analytes and packed cell volume were analysed for 40 clinically healthy juvenile hawksbill sea turtles captured from a rocky reef habitat in northern Australia. We used four statistical approaches to calculate reference intervals and to investigate their use with non-normal distributions and small sample sizes, and to compare upper and lower limits between methods. Eleven analytes were correlated with curved carapace length indicating that body size should be considered when designing future studies and interpreting analyte values. British Veterinary Association.
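
    One of the standard ways to compute a reference interval for a small, possibly non-normal sample is the nonparametric percentile method with bootstrap confidence intervals on each limit. The sketch below, using synthetic analyte values, is a generic illustration of that single approach and is not the specific set of four statistical approaches compared in the paper.

    ```python
    import numpy as np

    def reference_interval(values, low=2.5, high=97.5, n_boot=2000, seed=0):
        """Nonparametric reference interval with bootstrap 90% CIs on the limits."""
        values = np.asarray(values, dtype=float)
        rng = np.random.default_rng(seed)
        limits = np.percentile(values, [low, high])
        boots = np.array([
            np.percentile(rng.choice(values, size=values.size, replace=True),
                          [low, high])
            for _ in range(n_boot)
        ])
        ci = np.percentile(boots, [5, 95], axis=0)   # 90% CI for each limit
        return limits, ci

    # Hypothetical plasma analyte values from 40 healthy turtles.
    rng = np.random.default_rng(42)
    analyte = rng.lognormal(mean=2.0, sigma=0.3, size=40)
    (lo, hi), ci = reference_interval(analyte)
    print(f"reference interval: {lo:.2f}-{hi:.2f}")
    print(f"90% CI, lower limit: {ci[0, 0]:.2f}-{ci[1, 0]:.2f}; "
          f"upper limit: {ci[0, 1]:.2f}-{ci[1, 1]:.2f}")
    ```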

  2. A d-statistic for single-case designs that is equivalent to the usual between-groups d-statistic.

    PubMed

    Shadish, William R; Hedges, Larry V; Pustejovsky, James E; Boyajian, Jonathan G; Sullivan, Kristynn J; Andrade, Alma; Barrientos, Jeannette L

    2014-01-01

    We describe a standardised mean difference statistic (d) for single-case designs that is equivalent to the usual d in between-groups experiments. We show how it can be used to summarise treatment effects over cases within a study, to do power analyses in planning new studies and grant proposals, and to meta-analyse effects across studies of the same question. We discuss limitations of this d-statistic, and possible remedies to them. Even so, this d-statistic is better founded statistically than other effect size measures for single-case design, and unlike many general linear model approaches such as multilevel modelling or generalised additive models, it produces a standardised effect size that can be integrated over studies with different outcome measures. SPSS macros for both effect size computation and power analysis are available.
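
    The single-case d described here is constructed to be comparable to the usual between-groups d. For orientation only, the sketch below computes that ordinary pooled-SD d (with Hedges' small-sample correction) on hypothetical group scores; it is not the single-case estimator itself, which additionally models autocorrelation and between-case variance.

    ```python
    import numpy as np

    def cohens_d(treatment, control):
        """Usual between-groups standardized mean difference with pooled SD."""
        t, c = np.asarray(treatment, float), np.asarray(control, float)
        nt, nc = len(t), len(c)
        pooled_var = ((nt - 1) * t.var(ddof=1) + (nc - 1) * c.var(ddof=1)) / (nt + nc - 2)
        d = (t.mean() - c.mean()) / np.sqrt(pooled_var)
        # Hedges' g: small-sample bias correction of d.
        g = d * (1 - 3 / (4 * (nt + nc) - 9))
        return d, g

    # Hypothetical outcome scores for a treated and an untreated group.
    d, g = cohens_d([12, 15, 14, 16, 13], [9, 10, 11, 8, 10])
    print(f"d = {d:.2f}, Hedges' g = {g:.2f}")
    ```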

  3. Development and validation of a national data registry for midwife-led births: the Midwives Alliance of North America Statistics Project 2.0 dataset.

    PubMed

    Cheyney, Melissa; Bovbjerg, Marit; Everson, Courtney; Gordon, Wendy; Hannibal, Darcy; Vedam, Saraswathi

    2014-01-01

    In 2004, the Midwives Alliance of North America's (MANA's) Division of Research developed a Web-based data collection system to gather information on the practices and outcomes associated with midwife-led births in the United States. This system, called the MANA Statistics Project (MANA Stats), grew out of a widely acknowledged need for more reliable data on outcomes by intended place of birth. This article describes the history and development of the MANA Stats birth registry and provides an analysis of the 2.0 dataset's content, strengths, and limitations. Data collection and review procedures for the MANA Stats 2.0 dataset are described, along with methods for the assessment of data accuracy. We calculated descriptive statistics for client demographics and contributing midwife credentials, and assessed the quality of data by calculating point estimates, 95% confidence intervals, and kappa statistics for key outcomes on pre- and postreview samples of records. The MANA Stats 2.0 dataset (2004-2009) contains 24,848 courses of care, 20,893 of which are for women who planned a home or birth center birth at the onset of labor. The majority of these records were planned home births (81%). Births were attended primarily by certified professional midwives (73%), and clients were largely white (92%), married (87%), and college-educated (49%). Data quality analyses of 9932 records revealed no differences between pre- and postreviewed samples for 7 key benchmarking variables (kappa, 0.98-1.00). The MANA Stats 2.0 data were accurately entered by participants; any errors in this dataset are likely random and not systematic. The primary limitation of the 2.0 dataset is that the sample was captured through voluntary participation; thus, it may not accurately reflect population-based outcomes. The dataset's primary strength is that it will allow for the examination of research questions on normal physiologic birth and midwife-led birth outcomes by intended place of birth. © 2014 by the American College of Nurse-Midwives.
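
    Agreement between pre- and post-review samples was summarised with kappa statistics. A minimal sketch of Cohen's kappa for one binary benchmarking variable is shown below; the record codings are hypothetical.

    ```python
    import numpy as np

    def cohens_kappa(rating_a, rating_b):
        """Cohen's kappa for two categorical codings of the same records."""
        a, b = np.asarray(rating_a), np.asarray(rating_b)
        categories = np.union1d(a, b)
        p_observed = np.mean(a == b)
        # Chance agreement from the marginal category frequencies of each coding.
        p_chance = sum(np.mean(a == c) * np.mean(b == c) for c in categories)
        return (p_observed - p_chance) / (1 - p_chance)

    # Hypothetical pre- vs post-review coding of a binary outcome for 12 records.
    pre  = [1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0]
    post = [1, 0, 0, 1, 1, 0, 1, 0, 1, 1, 1, 0]
    print(f"kappa = {cohens_kappa(pre, post):.2f}")
    ```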

  4. 2000 Iowa crash facts : a summary of motor vehicle crash statistics on Iowa roadways

    DOT National Transportation Integrated Search

    2000-01-01

    All statistics are gathered and calculated by the Iowa Department of Transportation's Office of Driver Services. National statistics are obtained from Traffic Safety Facts 2000 published by the U.S. Department of Transportation's National...

  5. Ensuring Positiveness of the Scaled Difference Chi-Square Test Statistic

    ERIC Educational Resources Information Center

    Satorra, Albert; Bentler, Peter M.

    2010-01-01

    A scaled difference test statistic T̃d that can be computed by hand from the standard output of structural equation modeling (SEM) software was proposed in Satorra and Bentler (Psychometrika 66:507-514, 2001). The statistic T̃d is asymptotically equivalent to the scaled difference test statistic T̄d…
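
    A hand calculation of the 2001 scaled difference statistic can be sketched as follows, under the usual convention that the restricted model is nested in the comparison model; the fit statistics below are hypothetical. The scaling correction in the denominator can come out negative in practice, which is the problem the present article addresses.

    ```python
    def scaled_chisq_difference(t0, df0, c0, t1, df1, c1):
        """
        Scaled chi-square difference test (Satorra & Bentler, 2001) computed by hand
        from two nested models fitted with a robust (mean-adjusted) estimator.
          t0, df0, c0 : robust chi-square, df and scaling correction of the restricted model
          t1, df1, c1 : same quantities for the less restricted (comparison) model
        """
        df_diff = df0 - df1
        cd = (df0 * c0 - df1 * c1) / df_diff   # scaling correction for the difference test
        td = (t0 - t1) / cd                    # scaled difference chi-square
        return td, df_diff, cd

    # Hypothetical fit statistics taken from SEM output.
    td, df_diff, cd = scaled_chisq_difference(t0=85.4, df0=24, c0=1.32,
                                              t1=61.2, df1=22, c1=1.28)
    print(f"scaled difference chi-square = {td:.2f} on {df_diff} df (cd = {cd:.3f})")
    ```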

  6. A statistical human resources costing and accounting model for analysing the economic effects of an intervention at a workplace.

    PubMed

    Landstad, Bodil J; Gelin, Gunnar; Malmquist, Claes; Vinberg, Stig

    2002-09-15

    The study had two primary aims. The first aim was to combine a human resources costing and accounting approach (HRCA) with a quantitative statistical approach in order to get an integrated model. The second aim was to apply this integrated model in a quasi-experimental study in order to investigate whether preventive intervention affected sickness absence costs at the company level. The intervention studied contained occupational organizational measures, competence development, physical and psychosocial working environmental measures and individual and rehabilitation measures on both an individual and a group basis. The study is a quasi-experimental design with a non-randomized control group. Both groups involved cleaning jobs at predominantly female workplaces. The study plan involved carrying out before and after studies on both groups. The study included only those who were at the same workplace during the whole of the study period. In the HRCA model used here, the cost of sickness absence is the net difference between the costs, in the form of the value of the loss of production and the administrative cost, and the benefits in the form of lower labour costs. According to the HRCA model, the intervention used counteracted a rise in sickness absence costs at the company level, giving an average net effect of 266.5 Euros per person (full-time working) during an 8-month period. Using an analogue statistical analysis on the whole of the material, the contribution of the intervention counteracted a rise in sickness absence costs at the company level giving an average net effect of 283.2 Euros. Using a statistical method it was possible to study the regression coefficients in sub-groups and calculate the p-values for these coefficients; in the younger group the intervention gave a calculated net contribution of 605.6 Euros with a p-value of 0.073, while the intervention net contribution in the older group had a very high p-value. Using the statistical model it was also possible to study contributions of other variables and interactions. This study established that the HRCA model and the integrated model produced approximately the same monetary outcomes. The integrated model, however, allowed a deeper understanding of the various possible relationships and quantified the results with confidence intervals.

  7. Bootstrap versus Statistical Effect Size Corrections: A Comparison with Data from the Finding Embedded Figures Test.

    ERIC Educational Resources Information Center

    Thompson, Bruce; Melancon, Janet G.

    Effect sizes have been increasingly emphasized in research as more researchers have recognized that: (1) all parametric analyses (t-tests, analyses of variance, etc.) are correlational; (2) effect sizes have played an important role in meta-analytic work; and (3) statistical significance testing is limited in its capacity to inform scientific…

  8. Comments on `A Cautionary Note on the Interpretation of EOFs'.

    NASA Astrophysics Data System (ADS)

    Behera, Swadhin K.; Rao, Suryachandra A.; Saji, Hameed N.; Yamagata, Toshio

    2003-04-01

    The misleading aspect of the statistical analyses used in Dommenget and Latif, which raises concerns on some of the reported climate modes, is demonstrated. Adopting simple statistical techniques, the physical existence of the Indian Ocean dipole mode is shown and then the limitations of varimax and regression analyses in capturing the climate mode are discussed.

  9. Experimental design, power and sample size for animal reproduction experiments.

    PubMed

    Chapman, Phillip L; Seidel, George E

    2008-01-01

    The present paper concerns statistical issues in the design of animal reproduction experiments, with emphasis on the problems of sample size determination and power calculations. We include examples and non-technical discussions aimed at helping researchers avoid serious errors that may invalidate or seriously impair the validity of conclusions from experiments. Screen shots from interactive power calculation programs and basic SAS power calculation programs are presented to aid in understanding statistical power and computing power in some common experimental situations. Practical issues that are common to most statistical design problems are briefly discussed. These include one-sided hypothesis tests, power level criteria, equality of within-group variances, transformations of response variables to achieve variance equality, optimal specification of treatment group sizes, 'post hoc' power analysis and arguments for the increased use of confidence intervals in place of hypothesis tests.
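
    As a minimal illustration of the kind of interactive power calculation the paper discusses (here with Python's statsmodels rather than the SAS programs shown in the paper), the sketch below sizes a two-group comparison for a hypothetical standardized effect.

    ```python
    from statsmodels.stats.power import TTestIndPower

    analysis = TTestIndPower()

    # Animals per group needed to detect a standardized effect of 0.8
    # with 80% power at a two-sided alpha of 0.05.
    n_per_group = analysis.solve_power(effect_size=0.8, power=0.80, alpha=0.05,
                                       ratio=1.0, alternative='two-sided')
    print(f"n per group needed: {n_per_group:.1f}")

    # Power actually achieved with 20 animals per group for the same effect size.
    power = analysis.solve_power(effect_size=0.8, nobs1=20, alpha=0.05,
                                 ratio=1.0, alternative='two-sided')
    print(f"power with n = 20 per group: {power:.2f}")
    ```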

  10. Trends in selected streamflow statistics at 19 long-term streamflow-gaging stations indicative of outflows from Texas to Arkansas, Louisiana, Galveston Bay, and the Gulf of Mexico, 1922-2009

    USGS Publications Warehouse

    Barbie, Dana L.; Wehmeyer, Loren L.

    2012-01-01

    Trends in selected streamflow statistics during 1922-2009 were evaluated at 19 long-term streamflow-gaging stations considered indicative of outflows from Texas to Arkansas, Louisiana, Galveston Bay, and the Gulf of Mexico. The U.S. Geological Survey, in cooperation with the Texas Water Development Board, evaluated streamflow data from streamflow-gaging stations with more than 50 years of record that were active as of 2009. The outflows into Arkansas and Louisiana were represented by 3 streamflow-gaging stations, and outflows into the Gulf of Mexico, including Galveston Bay, were represented by 16 streamflow-gaging stations. Monotonic trend analyses were done using the following three streamflow statistics generated from daily mean values of streamflow: (1) annual mean daily discharge, (2) annual maximum daily discharge, and (3) annual minimum daily discharge. The trend analyses were based on the nonparametric Kendall's Tau test, which is useful for the detection of monotonic upward or downward trends with time. A total of 69 trend analyses by Kendall's Tau were computed - 19 periods of streamflow multiplied by the 3 streamflow statistics plus 12 additional trend analyses because the periods of record for 2 streamflow-gaging stations were divided into periods representing pre- and post-reservoir impoundment. Unless otherwise described, each trend analysis used the entire period of record for each streamflow-gaging station. The monotonic trend analysis detected 11 statistically significant downward trends, 37 instances of no trend, and 21 statistically significant upward trends. One general region studied, which seemingly has relatively more upward trends for many of the streamflow statistics analyzed, includes the rivers and associated creeks and bayous to Galveston Bay in the Houston metropolitan area. Lastly, the most western river basins considered (the Nueces and Rio Grande) had statistically significant downward trends for many of the streamflow statistics analyzed.
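
    The Kendall's Tau trend test used in the report can be reproduced in outline as follows; the annual discharge series below is simulated, not USGS data.

    ```python
    import numpy as np
    from scipy import stats

    # Simulated annual mean daily discharge for one gaging station, 1960-2009.
    years = np.arange(1960, 2010)
    rng = np.random.default_rng(1)
    discharge = 1500 - 4.0 * (years - 1960) + rng.normal(0, 80, years.size)

    # Kendall's Tau tests for a monotonic upward or downward trend with time.
    tau, p_value = stats.kendalltau(years, discharge)
    direction = "downward" if tau < 0 else "upward"
    print(f"Kendall's tau = {tau:.2f}, p = {p_value:.4f} ({direction} trend)")
    ```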

  11. Assignment of the relative and absolute stereochemistry of two novel epoxides using NMR and DFT-GIAO calculations

    NASA Astrophysics Data System (ADS)

    Moraes, F. C.; Alvarenga, E. S.; Demuner, A. J.; Viana, V. M.

    2018-07-01

    Considering the potential biological application of isobenzofuranones, especially as agrochemical defensives, two novel epoxides, (1aR,2R,2aR,5S,5aS,6S,6aS)-5-(hydroxymethyl)hexahydro-2,6-methanooxireno[2,3-f]isobenzofuran-3(1aH)-one (9), and (1aS,2S,2aR,5S,5aS,6R,6aR)-5-(hydroxymethyl)hexahydro-2,6-methanooxireno[2,3-f]isobenzofuran-3(1aH)-one (10), were synthesized from the readily available D-mannitol in six steps. The multiplicities of the hydrogens located at the bridge of the bicycle are distinct for epoxides 9 and 10 due to W coupling, and this feature was employed to confirm the assignment of these nuclei. Besides analyses of the 2D NMR spectra, the assignments of the nuclei at the epoxide ring were also inferred from information obtained by theoretical calculations. The calculated 1H and 13C NMR chemical shifts for eight candidate structures were compared with the experimental chemical shifts of 9 and 10 by measuring the mean absolute errors (MAE) and by the DP4 statistical analysis. The structures and relative configurations of 9, and 10 were determined via NMR spectroscopy assisted with theoretical calculations. As consequence of the enantioselective syntheses starting from a natural polyol, the absolute configurations of the epoxides 9 and 10 were also defined.
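
    The mean absolute error comparison between calculated and experimental chemical shifts can be illustrated with a small sketch; the shift values and candidate structures below are hypothetical, and the DP4 probability itself is not reproduced.

    ```python
    import numpy as np

    # Hypothetical experimental 13C shifts (ppm) and GIAO-calculated shifts
    # for two candidate structures of the same compound.
    experimental = np.array([172.4, 82.1, 75.6, 58.3, 44.0, 38.2])
    candidate_a  = np.array([170.9, 83.0, 74.1, 59.8, 45.2, 37.0])
    candidate_b  = np.array([168.2, 86.4, 70.3, 62.9, 48.8, 34.1])

    def mae(calc, exp):
        """Mean absolute error between calculated and experimental shifts."""
        return np.mean(np.abs(calc - exp))

    for name, calc in [("A", candidate_a), ("B", candidate_b)]:
        print(f"candidate {name}: MAE = {mae(calc, experimental):.2f} ppm")
    # The candidate with the lowest MAE matches experiment best; DP4 weighs
    # the same errors probabilistically rather than averaging them.
    ```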

  12. Measuring Academic Performance for Healthcare Researchers with the H Index: Which Search Tool Should Be Used?

    PubMed Central

    Patel, Vanash M.; Ashrafian, Hutan; Almoudaris, Alex; Makanjuola, Jonathan; Bucciarelli-Ducci, Chiara; Darzi, Ara; Athanasiou, Thanos

    2013-01-01

    Objectives To compare H index scores for healthcare researchers returned by Google Scholar, Web of Science and Scopus databases, and to assess whether a researcher's age, country of institutional affiliation and physician status influence calculations. Subjects and Methods One hundred and ninety-five Nobel laureates in Physiology and Medicine from 1901 to 2009 were considered. Year of first and last publications, total publications and citation counts, and the H index for each laureate were calculated from each database. Cronbach's alpha statistic was used to measure the reliability of H index scores between the databases. The influence of laureate characteristics on the H index was analysed using linear regression. Results There was no concordance between the databases when considering the number of publications and citation counts per laureate. The H index was the most reliably calculated bibliometric across the three databases (Cronbach's alpha = 0.900). All databases returned significantly higher H index scores for younger laureates (p < 0.0001). Google Scholar and Web of Science returned significantly higher H index scores for physician laureates (p = 0.025 and p = 0.029, respectively). Country of institutional affiliation did not influence the H index in any database. Conclusion The H index appeared to be the most consistently calculated bibliometric between the databases for Nobel laureates in Physiology and Medicine. Researcher-specific characteristics constituted an important component of objective research assessment. The findings of this study call into question the choice of current and future academic performance databases. PMID:22964880
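
    The H index itself is straightforward to compute once a citation profile has been retrieved from a database; a minimal sketch with hypothetical citation counts:

    ```python
    def h_index(citations):
        """Largest h such that the researcher has h papers with at least h citations each."""
        counts = sorted(citations, reverse=True)
        h = 0
        for rank, c in enumerate(counts, start=1):
            if c >= rank:
                h = rank
            else:
                break
        return h

    # Hypothetical citation counts for one author's publications.
    print(h_index([120, 84, 40, 22, 15, 9, 7, 4, 3, 1]))  # prints 7
    ```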

  13. The estimation of absorbed dose rates for non-human biota : an extended inter-comparison.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Batlle, J. V. I.; Beaugelin-Seiller, K.; Beresford, N. A.

    An exercise to compare 10 approaches for the calculation of unweighted whole-body absorbed dose rates was conducted for 74 radionuclides and five of the ICRP's Reference Animals and Plants, or RAPs (duck, frog, flatfish egg, rat and elongated earthworm), selected for this exercise to cover a range of body sizes, dimensions and exposure scenarios. Results were analysed using a non-parametric method requiring no specific hypotheses about the statistical distribution of data. The obtained unweighted absorbed dose rates for internal exposure compare well between the different approaches, with 70% of the results falling within a range of variation of ±20%. The variation is greater for external exposure, although 90% of the estimates are within an order of magnitude of one another. There are some discernible patterns where specific models over- or under-predicted. These are explained based on the methodological differences including number of daughter products included in the calculation of dose rate for a parent nuclide; source-target geometry; databases for discrete energy and yield of radionuclides; rounding errors in integration algorithms; and intrinsic differences in calculation methods. For certain radionuclides, these factors combine to generate systematic variations between approaches. Overall, the technique chosen to interpret the data enabled methodological differences in dosimetry calculations to be quantified and compared, allowing the identification of common issues between different approaches and providing greater assurance on the fundamental dose conversion coefficient approaches used in available models for assessing radiological effects to biota.

  14. Quantum statistical mechanics of dense partially ionized hydrogen

    NASA Technical Reports Server (NTRS)

    Dewitt, H. E.; Rogers, F. J.

    1972-01-01

    The theory of dense hydrogen plasmas beginning with the two component quantum grand partition function is reviewed. It is shown that ionization equilibrium and molecular dissociation equilibrium can be treated in the same manner with proper consideration of all two-body states. A quantum perturbation expansion is used to give an accurate calculation of the equation of state of the gas for any degree of dissociation and ionization. The statistical mechanical calculation of the plasma equation of state is intended for stellar interiors. The general approach is extended to the calculation of the equation of state of the outer layers of large planets.

  15. Regional projection of climate impact indices over the Mediterranean region

    NASA Astrophysics Data System (ADS)

    Casanueva, Ana; Frías, M.; Dolores; Herrera, Sixto; Bedia, Joaquín; San Martín, Daniel; Gutiérrez, José Manuel; Zaninovic, Ksenija

    2014-05-01

    Climate Impact Indices (CIIs) are being increasingly used in different socioeconomic sectors to transfer information about climate change impacts and risks to stakeholders. CIIs are typically based on different weather variables such as temperature, wind speed, precipitation or humidity and comprise, in a single index, the relevant meteorological information for the particular impact sector (in this study wildfires and tourism). This dependence on several climate variables poses important limitations to the application of statistical downscaling techniques, since physical consistency among variables is required in most cases to obtain reliable local projections. The present study assesses the suitability of the "direct" downscaling approach, in which the downscaling method is directly applied to the CII. In particular, for illustrative purposes, we consider two popular indices used in the wildfire and tourism sectors, the Fire Weather Index (FWI) and the Physiological Equivalent Temperature (PET), respectively. As an example, two case studies are analysed over two representative Mediterranean regions of interest for the EU CLIM-RUN project: continental Spain for the FWI and Croatia for the PET. Results obtained with this "direct" downscaling approach are similar to those found from the application of the statistical downscaling to the individual meteorological drivers prior to the index calculation ("component" downscaling) thus, a wider range of statistical downscaling methods could be used. As an illustration, future changes in both indices are projected by applying two direct statistical downscaling methods, analogs and linear regression, to the ECHAM5 model. Larger differences were found between the two direct statistical downscaling approaches than between the direct and the component approaches with a single downscaling method. While these examples focus on particular indices and Mediterranean regions of interest for CLIM-RUN stakeholders, the same study could be extended to other indices and regions.

  16. Evaluation of salivary fluoride retention from a new high fluoride mouthrinse.

    PubMed

    Mason, Stephen C; Shirodaria, Soha; Sufi, Farzana; Rees, Gareth D; Birkhed, Dowen

    2010-11-01

    To evaluate salivary fluoride retention from a new high fluoride daily use mouthrinse over a 120 min period. Sixteen subjects completed a randomised single-blind, four-treatment cross-over trial. Sensodyne® Pronamel® mouthrinse (A) contained 450 ppm fluoride; reference products were Colgate® Fluorigard® (B), Listerine® Total Care (C) and Listerine Softmint Sensation (D) containing 225, 100 and 0 ppm fluoride respectively. Salivary fluoride retention was monitored ex vivo after a single supervised use of test product (10 mL, 60 s). Samples were collected at 0, 1, 3, 5, 15, 30, 60 and 120 min post-rinse, generating fluoride clearance curves from which the area under the curve (AUC) was calculated. Differences in salivary fluoride concentrations for each product were analysed using ANCOVA at each time point with a 5% significance level, as well as lnAUC for the periods 0-120, 0-1, 1-15, 15-60 and 60-120 min. Pairwise comparisons between all treatment groups were performed. Salivary fluoride levels for A-C peaked immediately following use. Fluoride levels were statistically significantly higher for A versus B-D (p ≤ 0.004), and linear dose responses were apparent. AUC(0-120) was statistically significantly greater for A than for B (p = 0.035), C (p < 0.0001) and D (p < 0.0001). Post-hoc comparisons of lnAUC for the remaining time domains showed fluoride retention from A was statistically significantly greater versus B-D (p < 0.0001). Single-use treatment with the new mouthrinse containing 450 ppm fluoride resulted in statistically significantly higher salivary fluoride levels throughout the 120 min test period. Total fluoride retention (AUC(0-120)) was also statistically significantly greater versus comparator rinse treatments. Copyright © 2010 Elsevier Ltd. All rights reserved.
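
    The AUC(0-120) analysed in the trial is a simple trapezoidal area under the fluoride clearance curve; a minimal sketch with hypothetical concentrations at the stated sampling times:

    ```python
    import numpy as np

    # Hypothetical salivary fluoride concentrations (ppm) at the sampling times (min).
    times    = np.array([0, 1, 3, 5, 15, 30, 60, 120])
    fluoride = np.array([38.0, 21.0, 9.5, 5.2, 1.8, 0.9, 0.4, 0.2])

    auc_0_120 = np.trapz(fluoride, times)   # trapezoidal AUC over 0-120 min
    ln_auc = np.log(auc_0_120)              # ln(AUC), as analysed in the trial
    print(f"AUC(0-120) = {auc_0_120:.1f} ppm*min, ln AUC = {ln_auc:.2f}")
    ```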

  17. The thresholds for statistical and clinical significance – a five-step procedure for evaluation of intervention effects in randomised clinical trials

    PubMed Central

    2014-01-01

    Background Thresholds for statistical significance are insufficiently demonstrated by 95% confidence intervals or P-values when assessing results from randomised clinical trials. First, a P-value only shows the probability of getting a result assuming that the null hypothesis is true and does not reflect the probability of getting a result assuming an alternative hypothesis to the null hypothesis is true. Second, a confidence interval or a P-value showing significance may be caused by multiplicity. Third, statistical significance does not necessarily result in clinical significance. Therefore, assessment of intervention effects in randomised clinical trials deserves more rigour in order to become more valid. Methods Several methodologies for assessing the statistical and clinical significance of intervention effects in randomised clinical trials were considered. Balancing simplicity and comprehensiveness, a simple five-step procedure was developed. Results For a more valid assessment of results from a randomised clinical trial we propose the following five-steps: (1) report the confidence intervals and the exact P-values; (2) report Bayes factor for the primary outcome, being the ratio of the probability that a given trial result is compatible with a ‘null’ effect (corresponding to the P-value) divided by the probability that the trial result is compatible with the intervention effect hypothesised in the sample size calculation; (3) adjust the confidence intervals and the statistical significance threshold if the trial is stopped early or if interim analyses have been conducted; (4) adjust the confidence intervals and the P-values for multiplicity due to number of outcome comparisons; and (5) assess clinical significance of the trial results. Conclusions If the proposed five-step procedure is followed, this may increase the validity of assessments of intervention effects in randomised clinical trials. PMID:24588900
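
    Step 2 can be illustrated with a normal approximation: the Bayes factor is the likelihood of the observed test statistic under a null effect divided by its likelihood under the effect assumed in the sample size calculation. The sketch below is one simple way to obtain such a ratio and is not necessarily the calculator the authors recommend; the z values are hypothetical.

    ```python
    from scipy import stats

    def bayes_factor_null_vs_design(z_observed, z_design):
        """
        Ratio of the likelihood of the observed z statistic under the null effect
        to its likelihood under the effect assumed in the sample-size calculation
        (normal approximation; values above 1 favour the null).
        """
        like_null = stats.norm.pdf(z_observed, loc=0.0, scale=1.0)
        like_design = stats.norm.pdf(z_observed, loc=z_design, scale=1.0)
        return like_null / like_design

    # Hypothetical trial: observed z = 2.1; the design assumed an effect giving z = 2.8.
    print(f"Bayes factor (null vs design effect) = {bayes_factor_null_vs_design(2.1, 2.8):.2f}")
    ```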

  18. Rural-urban disparity in oral health-related quality of life.

    PubMed

    Gaber, Amal; Galarneau, Chantal; Feine, Jocelyne S; Emami, Elham

    2018-04-01

    The objective of this population-based cross-sectional study was to estimate rural-urban disparity in the oral health-related quality of life (OHRQoL) of the Quebec adult population. A 2-stage sampling design was used to collect data from the 1788 parents/caregivers of schoolchildren living in the 8 regions of the province of Quebec in Canada. Andersen's behavioural model for health services utilization was used as a conceptual framework. Place of residency was defined according to the Statistics Canada Census Metropolitan Area and Census Agglomeration Influenced Zone classification. The outcome of interest was OHRQoL measured using the Oral Health Impact Profile (OHIP)-14 validated questionnaire. Data weighting was applied, and the prevalence, extent and severity of negative oral health impacts were calculated. Statistical analyses included descriptive statistics, bivariate analyses and binary logistic regression. The prevalence of poor oral health-related quality of life (OHRQoL) was statistically higher in rural areas than in urban zones (P = .02). Rural residents reported a significantly higher prevalence of negative daily-life impacts in pain, psychological discomfort and social disability OHIP domains (P < .05). Additionally, the rural population showed a greater number of negative oral health impacts (P = .03). There was no significant rural-urban difference in the severity of poor oral health. Logistic regression indicated that the prevalence of poor OHRQoL was significantly related to place of residency (OR = 1.6; 95% CI = 1.1-2.5; P = .022), perceived oral health (OR = 9.4; 95% CI = 5.7-15.5; P < .001), dental treatment needs factors (perceived need for dental treatment, pain, dental care seeking) (OR = 8.7; 95% CI = 4.8-15.6; P < .001) and education (OR = 2.7; 95% CI = 1.8-3.9; P < .001). The results of this study suggest a potential difference in OHRQoL of Quebec rural and urban populations, and a need to develop strategies to promote oral health outcomes, specifically for rural residents. Further studies are needed to confirm these results. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  19. Statistical properties of filtered pseudorandom digital sequences formed from the sum of maximum-length sequences

    NASA Technical Reports Server (NTRS)

    Wallace, G. R.; Weathers, G. D.; Graf, E. R.

    1973-01-01

    The statistics of filtered pseudorandom digital sequences called hybrid-sum sequences, formed from the modulo-two sum of several maximum-length sequences, are analyzed. The results indicate that a relation exists between the statistics of the filtered sequence and the characteristic polynomials of the component maximum length sequences. An analysis procedure is developed for identifying a large group of sequences with good statistical properties for applications requiring the generation of analog pseudorandom noise. By use of the analysis approach, the filtering process is approximated by the convolution of the sequence with a sum of unit step functions. A parameter reflecting the overall statistical properties of filtered pseudorandom sequences is derived. This parameter is called the statistical quality factor. A computer algorithm to calculate the statistical quality factor for the filtered sequences is presented, and the results for two examples of sequence combinations are included. The analysis reveals that the statistics of the signals generated with the hybrid-sum generator are potentially superior to the statistics of signals generated with maximum-length generators. Furthermore, fewer calculations are required to evaluate the statistics of a large group of hybrid-sum generators than are required to evaluate the statistics of the same size group of approximately equivalent maximum-length sequences.
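
    A hybrid-sum sequence of the kind analysed here is the modulo-two sum of maximum-length sequences, each generated by a linear feedback shift register with a primitive feedback polynomial. The sketch below generates two degree-5 m-sequences and their hybrid sum; the filtering step and the statistical quality factor from the paper are not reproduced.

    ```python
    def lfsr_sequence(taps, degree, seed=1):
        """One period (2**degree - 1 bits) of a Fibonacci LFSR output; the sequence is
        maximum-length when the feedback polynomial given by `taps` is primitive."""
        state = seed
        bits = []
        for _ in range(2 ** degree - 1):
            bits.append(state & 1)                        # output bit
            feedback = 0
            for t in taps:                                # e.g. taps [5, 2] <-> x^5 + x^2 + 1
                feedback ^= (state >> (degree - t)) & 1
            state = (state >> 1) | (feedback << (degree - 1))
        return bits

    # Two degree-5 maximum-length sequences from the primitive polynomials
    # x^5 + x^2 + 1 and x^5 + x^3 + 1 (period 31 each).
    seq_a = lfsr_sequence(taps=[5, 2], degree=5)
    seq_b = lfsr_sequence(taps=[5, 3], degree=5)

    # Hybrid-sum sequence: modulo-two sum of the component maximum-length sequences.
    hybrid = [a ^ b for a, b in zip(seq_a, seq_b)]
    print(hybrid[:16])
    ```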

  20. Dietary fat intake and risk of epithelial ovarian cancer: a meta-analysis of 6,689 subjects from 8 observational studies.

    PubMed

    Huncharek, M; Kupelnick, B

    2001-01-01

    The etiology of epithelial ovarian cancer is unknown. Prior work suggests that high dietary fat intake is associated with an increased risk of this tumor, although this association remains speculative. A meta-analysis was performed to evaluate this suspected relationship. Using previously described methods, a protocol was developed for a meta-analysis examining the association between high vs. low dietary fat intake and the risk of epithelial ovarian cancer. Literature search techniques, study inclusion criteria, and statistical procedures were prospectively defined. Data from observational studies were pooled using a general variance-based meta-analytic method employing confidence intervals (CI) previously described by Greenland. The outcome of interest was a summary relative risk (RRs) reflecting the risk of ovarian cancer associated with high vs. low dietary fat intake. Sensitivity analyses were performed when necessary to evaluate any observed statistical heterogeneity. The literature search yielded 8 observational studies enrolling 6,689 subjects. Data were stratified into three dietary fat intake categories: total fat, animal fat, and saturated fat. Initial tests for statistical homogeneity demonstrated that hospital-based studies accounted for observed heterogeneity possibly because of selection bias. Accounting for this, an RRs was calculated for high vs. low total fat intake, yielding a value of 1.24 (95% CI = 1.07-1.43), a statistically significant result. That is, high total fat intake is associated with a 24% increased risk of ovarian cancer development. The RRs for high saturated fat intake was 1.20 (95% CI = 1.04-1.39), suggesting a 20% increased risk of ovarian cancer among subjects with these dietary habits. High vs. low animal fat diet gave an RRs of 1.70 (95% CI = 1.43-2.03), consistent with a statistically significant 70% increased ovarian cancer risk. High dietary fat intake appears to represent a significant risk factor for the development of ovarian cancer. The magnitude of this risk associated with total fat and saturated fat is rather modest. Ovarian cancer risk associated with high animal fat intake appears significantly greater than that associated with the other types of fat intake studied, although this requires confirmation via larger analyses. Further work is needed to clarify factors that may modify the effects of dietary fat in vivo.
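
    The general variance-based pooling described here works on the log relative risk scale with inverse-variance weights recovered from each study's confidence interval. A minimal fixed-effect sketch with hypothetical study estimates:

    ```python
    import numpy as np

    # Hypothetical study-level relative risks with 95% confidence intervals.
    rr      = np.array([1.10, 1.35, 1.22, 1.05, 1.48])
    ci_low  = np.array([0.85, 1.02, 0.95, 0.80, 1.10])
    ci_high = np.array([1.42, 1.79, 1.57, 1.38, 1.99])

    log_rr = np.log(rr)
    # Standard error recovered from the width of the 95% CI on the log scale.
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
    w = 1 / se**2                                   # inverse-variance weights

    pooled_log = np.sum(w * log_rr) / np.sum(w)
    pooled_se = np.sqrt(1 / np.sum(w))
    summary_rr = np.exp(pooled_log)
    ci = np.exp(pooled_log + np.array([-1.96, 1.96]) * pooled_se)
    print(f"summary RR = {summary_rr:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
    ```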

  1. Nursing students' mathematic calculation skills.

    PubMed

    Rainboth, Lynde; DeMasi, Chris

    2006-12-01

    This mixed method study used a pre-test/post-test design to evaluate the efficacy of a teaching strategy in improving beginning nursing student learning outcomes. During a 4-week student teaching period, a convenience sample of 54 sophomore level nursing students were required to complete calculation assignments, taught one calculation method, and mandated to attend medication calculation classes. These students completed pre- and post-math tests and a major medication mathematic exam. Scores from the intervention student group were compared to those achieved by the previous sophomore class. Results demonstrated a statistically significant improvement from pre- to post-test and the students who received the intervention had statistically significantly higher scores on the major medication calculation exam than did the students in the control group. The evaluation completed by the intervention group showed that the students were satisfied with the method and outcome.

  2. Publication of statistically significant research findings in prosthodontics & implant dentistry in the context of other dental specialties.

    PubMed

    Papageorgiou, Spyridon N; Kloukos, Dimitrios; Petridis, Haralampos; Pandis, Nikolaos

    2015-10-01

    To assess the hypothesis that there is excessive reporting of statistically significant studies published in prosthodontic and implantology journals, which could indicate selective publication. The last 30 issues of 9 journals in prosthodontics and implant dentistry were hand-searched for articles with statistical analyses. The percentages of significant and non-significant results were tabulated by parameter of interest. Univariable/multivariable logistic regression analyses were applied to identify possible predictors of reporting statistically significant findings. The results of this study were compared with similar studies in dentistry with random-effects meta-analyses. Of the 2323 included studies, 71% reported statistically significant results, with the significant results ranging from 47% to 86%. Multivariable modeling identified that geographical area and involvement of a statistician were predictors of statistically significant results. Compared to interventional studies, the odds that in vitro and observational studies would report statistically significant results were increased by 1.20 times (OR: 2.20, 95% CI: 1.66-2.92) and 0.35 times (OR: 1.35, 95% CI: 1.05-1.73), respectively. The probability of statistically significant results from randomized controlled trials was significantly lower compared to various study designs (difference: 30%, 95% CI: 11-49%). Likewise, the probability of statistically significant results in prosthodontics and implant dentistry was lower compared to other dental specialties, but this result did not reach statistical significance (P>0.05). The majority of studies identified in the fields of prosthodontics and implant dentistry presented statistically significant results. The same trend existed in publications of other specialties in dentistry. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Imprints of magnetic power and helicity spectra on radio polarimetry statistics

    NASA Astrophysics Data System (ADS)

    Junklewitz, H.; Enßlin, T. A.

    2011-06-01

    The statistical properties of turbulent magnetic fields in radio-synchrotron sources should be imprinted on the statistics of polarimetric observables. In search of these imprints, i.e. characteristic modifications of the polarimetry statistics caused by magnetic field properties, we calculate correlation and cross-correlation functions from a set of observables that contain total intensity I, polarized intensity P, and Faraday depth φ. The correlation functions are evaluated for all combinations of observables up to fourth order in magnetic field B. We derive these analytically as far as possible and from first principles using only some basic assumptions, such as Gaussian statistics for the underlying magnetic field in the observed region and statistical homogeneity. We further assume some simplifications to reduce the complexity of the calculations, because for a start we were interested in a proof of concept. Using this statistical approach, we show that it is possible to gain information about the helical part of the magnetic power spectrum via the correlation functions ⟨P(k⊥) φ(k′⊥) φ(k″⊥)⟩_B and ⟨I(k⊥) φ(k′⊥) φ(k″⊥)⟩_B. Using this insight, we construct an easy-to-use test for helicity called LITMUS (Local Inference Test for Magnetic fields which Uncovers heliceS), which gives a spectrally integrated measure of helicity. For now, all calculations are given in a Faraday-free case, but set up so that Faraday rotational effects can be included later.

  4. Analyses of 1/15 scale Creare bypass transient experiments. [PWR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kmetyk, L.N.; Buxton, L.D.; Cole, R.K. Jr.

    1982-09-01

    RELAP4 analyses of several 1/15 scale Creare H-series bypass transient experiments have been done to investigate the effect of using different downcomer nodalizations, physical scales, slip models, and vapor fraction donoring methods. Most of the analyses were thermal equilibrium calculations performed with RELAP4/MOD5, but a few such calculations were done with RELAP4/MOD6 and RELAP4/MOD7, which contain improved slip models. In order to estimate the importance of nonequilibrium effects, additional analyses were performed with TRAC-PD2, RELAP5 and the nonequilibrium option of RELAP4/MOD7. The purpose of these studies was to determine whether results from Westinghouse's calculation of the Creare experiments, which were done with a UHI-modified version of SATAN, were sufficient to guarantee SATAN would be conservative with respect to ECC bypass in full-scale plant analyses.

  5. Implemented Lomb-Scargle periodogram: a valuable tool for improving cyclostratigraphic research on unevenly sampled deep-sea stratigraphic sequences

    NASA Astrophysics Data System (ADS)

    Pardo-Iguzquiza, Eulogio; Rodríguez-Tovar, Francisco J.

    2011-12-01

    One important handicap when working with stratigraphic sequences is the discontinuous character of the sedimentary record, especially relevant in cyclostratigraphic analysis. Uneven palaeoclimatic/palaeoceanographic time series are common, their cyclostratigraphic analysis being comparatively difficult because most spectral methodologies are appropriate only when working with even sampling. As a means to solve this problem, a program for calculating the smoothed Lomb-Scargle periodogram and cross-periodogram, which additionally evaluates the statistical confidence of the estimated power spectrum through a Monte Carlo procedure (the permutation test), has been developed. The spectral analysis of a short uneven time series calls for assessment of the statistical significance of the spectral peaks, since a periodogram can always be calculated but the main challenge resides in identifying true spectral features. To demonstrate the effectiveness of this program, two case studies are presented: the one deals with synthetic data and the other with paleoceanographic/palaeoclimatic proxies. On a simulated time series of 500 data, two uneven time series (with 100 and 25 data) were generated by selecting data at random. Comparative analysis between the power spectra from the simulated series and from the two uneven time series demonstrates the usefulness of the smoothed Lomb-Scargle periodogram for uneven sequences, making it possible to distinguish between statistically significant and spurious spectral peaks. Fragmentary time series of Cd/Ca ratios and δ18O from core AII107-131 of SPECMAP were analysed as a real case study. The efficiency of the direct and cross Lomb-Scargle periodogram in recognizing Milankovitch and sub-Milankovitch signals related to palaeoclimatic/palaeoceanographic changes is demonstrated. As implemented, the Lomb-Scargle periodogram may be applied to any palaeoclimatic/palaeoceanographic proxies, including those usually recovered from contourites, and it holds special interest in the context of centennial- to millennial-scale climatic changes affecting contouritic currents.
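
    The same idea, an unevenly sampled periodogram with a Monte Carlo permutation test for the significance of its highest peak, can be sketched with scipy rather than the authors' program; the series below is simulated.

    ```python
    import numpy as np
    from scipy.signal import lombscargle

    rng = np.random.default_rng(0)

    # Unevenly sampled series with a true period of 23 time units plus noise.
    t = np.sort(rng.uniform(0, 500, 100))
    y = np.sin(2 * np.pi * t / 23.0) + rng.normal(0, 0.5, t.size)
    y = y - y.mean()

    freqs = np.linspace(0.005, 0.5, 2000) * 2 * np.pi   # angular frequencies
    power = lombscargle(t, y, freqs)
    peak = power.max()

    # Permutation (Monte Carlo) test: shuffle the values, keep the sampling times.
    n_perm = 200
    exceed = sum(lombscargle(t, rng.permutation(y), freqs).max() >= peak
                 for _ in range(n_perm))
    print(f"peak period = {2 * np.pi / freqs[power.argmax()]:.1f}, "
          f"permutation p = {(exceed + 1) / (n_perm + 1):.3f}")
    ```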

  6. The ADHF/NT-proBNP risk score to predict 1-year mortality in hospitalized patients with advanced decompensated heart failure.

    PubMed

    Scrutinio, Domenico; Ammirati, Enrico; Guida, Pietro; Passantino, Andrea; Raimondo, Rosa; Guida, Valentina; Sarzi Braga, Simona; Canova, Paolo; Mastropasqua, Filippo; Frigerio, Maria; Lagioia, Rocco; Oliva, Fabrizio

    2014-04-01

    The acute decompensated heart failure/N-terminal pro-B-type natriuretic peptide (ADHF/NT-proBNP) score is a validated risk scoring system that predicts mortality in hospitalized heart failure patients with a wide range of left ventricular ejection fractions (LVEFs). We sought to assess discrimination and calibration of the score when applied to patients with advanced decompensated heart failure (AHF). We studied 445 patients hospitalized for AHF, defined by the presence of severe symptoms of worsening HF at admission, severely depressed LVEF, and the need for intravenous diuretic and/or inotropic drugs. The primary outcome was cumulative (in-hospital and post-discharge) mortality and post-discharge 1-year mortality. Separate analyses were performed for patients aged ≤ 70 years. A Seattle Heart Failure Score (SHFS) was calculated for each patient discharged alive. During follow-up, 144 patients (32.4%) died, and 69 (15.5%) underwent heart transplantation (HT) or ventricular assist device (VAD) implantation. After accounting for the competing events (VAD/HT), the ADHF/NT-proBNP score's C-statistic for cumulative mortality was 0.738 in the overall cohort and 0.771 in patients aged ≤ 70 years. The C-statistic for post-discharge mortality was 0.741 and 0.751, respectively. Adding prior (≤6 months) hospitalizations for HF to the score increased the C-statistic for post-discharge mortality to 0.759 in the overall cohort and to 0.774 in patients aged ≤ 70 years. Predicted and observed mortality rates by quartiles of score were highly correlated. The SHFS demonstrated adequate discrimination but underestimated the risk. The ADHF/NT-proBNP risk calculator is available at http://www.fsm.it/fsm/file/NTproBNPscore.zip. Our data suggest that the ADHF/NT-proBNP score may efficiently predict mortality in patients hospitalized with AHF. Copyright © 2014 International Society for Heart and Lung Transplantation. Published by Elsevier Inc. All rights reserved.
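
    As a bare-bones illustration of a C-statistic (equivalent to the area under the ROC curve) for a risk score against observed one-year mortality, ignoring the competing-risk handling used in the study, the sketch below uses hypothetical risks and outcomes.

    ```python
    from sklearn.metrics import roc_auc_score

    # Hypothetical predicted 1-year mortality risks and observed deaths (1 = died).
    predicted_risk = [0.12, 0.45, 0.08, 0.60, 0.25, 0.70, 0.15, 0.55, 0.33, 0.05]
    died           = [0,    1,    0,    1,    0,    1,    0,    0,    1,    0]

    c_statistic = roc_auc_score(died, predicted_risk)
    print(f"C-statistic = {c_statistic:.3f}")
    ```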

  7. Analysis of filament statistics in fast camera data on MAST

    NASA Astrophysics Data System (ADS)

    Farley, Tom; Militello, Fulvio; Walkden, Nick; Harrison, James; Silburn, Scott; Bradley, James

    2017-10-01

    Coherent filamentary structures have been shown to play a dominant role in turbulent cross-field particle transport [D'Ippolito 2011]. An improved understanding of filaments is vital in order to control scrape off layer (SOL) density profiles and thus control first wall erosion, impurity flushing and coupling of radio frequency heating in future devices. The Elzar code [T. Farley, 2017 in prep.] is applied to MAST data. The code uses information about the magnetic equilibrium to calculate the intensity of light emission along field lines as seen in the camera images, as a function of the field lines' radial and toroidal locations at the mid-plane. In this way a `pseudo-inversion' of the intensity profiles in the camera images is achieved from which filaments can be identified and measured. In this work, a statistical analysis of the intensity fluctuations along field lines in the camera field of view is performed using techniques similar to those typically applied in standard Langmuir probe analyses. These filament statistics are interpreted in terms of the theoretical ergodic framework presented by F. Militello & J.T. Omotani, 2016, in order to better understand how time averaged filament dynamics produce the more familiar SOL density profiles. This work has received funding from the RCUK Energy programme (Grant Number EP/P012450/1), from Euratom (Grant Agreement No. 633053) and from the EUROfusion consortium.

  8. Neighbourhood safety and area deprivation modify the associations between parkland and psychological distress in Sydney, Australia

    PubMed Central

    2013-01-01

    Background The aim of this study was to investigate how perceived neighbourhood safety and area deprivation influenced the relationship between parklands and mental health. Methods Information about psychological distress, perceptions of safety, demographic and socio-economic background at the individual level was extracted from New South Wales Population Health Survey. The proportion of a postcode that was parkland was used as a proxy measure for access to parklands and was calculated for each individual. Generalized Estimating Equations logistic regression analyses were performed to account for correlation between participants within postcodes, and with controls for socio-demographic characteristics and socio-economic status at the area level. Results In areas where the residents reported perceiving their neighbourhood to be “safe” and controlling for area levels of socio-economic deprivation, there were no statistically significant associations between the proportion of parkland and high or very high psychological distress. In the most disadvantaged neighbourhoods which were perceived as unsafe by residents, those with greater proportions of parkland, over 20%, there was greater psychological distress, this association was statistically significant (20-40% parkland: OR=2.27, 95% CI=1.45-3.55; >40% parkland: OR=2.53, 95% CI=1.53-4.19). Conclusion Our study indicates that perceptions of neighbourhood safety and area deprivation were statistically significant effect modifiers of the association between parkland and psychological distress. PMID:23635303

  9. Correlation between hospital-level antibiotic consumption and incident health care facility-onset Clostridium difficile infection.

    PubMed

    Crew, Page E; Rhodes, Nathaniel J; O'Donnell, J Nicholas; Miglis, Cristina; Gilbert, Elise M; Zembower, Teresa R; Qi, Chao; Silkaitis, Christina; Sutton, Sarah H; Scheetz, Marc H

    2018-03-01

    The purpose of this single-center, ecologic study is to characterize the relationship between facility-wide (FacWide) antibiotic consumption and incident health care facility-onset Clostridium difficile infection (HO-CDI). FacWide antibiotic consumption and incident HO-CDI were tallied on a monthly basis and standardized, from January 2013 through April 2015. Spearman rank-order correlation coefficients were calculated using matched-months analysis and a 1-month delay. Regression analyses were performed, with P < .05 considered statistically significant. FacWide analysis identified a matched-months correlation between ceftriaxone and HO-CDI (ρ = 0.44, P = .018). A unit of stem cell transplant recipients did not have significant correlation between carbapenems and HO-CDI in matched months (ρ = 0.37, P = .098), but a significant correlation was observed when a 1-month lag was applied (ρ = 0.54, P = .014). Three statistically significant lag associations were observed between FacWide/unit-level antibiotic consumption and HO-CDI, and 1 statistically significant nonlagged association was observed FacWide. Antibiotic consumption may convey extended ward-level risk for incident CDI. Consumption of antibiotic agents may have immediate and prolonged influence on incident CDI. Additional studies are needed to investigate the immediate and delayed associations between antibiotic consumption and C difficile colonization, infection, and transmission at the hospital level. Published by Elsevier Inc.
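
    The matched-months and one-month-lag Spearman correlations can be sketched as follows on hypothetical standardized monthly series.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical monthly series: standardized ceftriaxone use and HO-CDI incidence.
    ceftriaxone = np.array([55, 61, 48, 70, 66, 58, 73, 62, 49, 68, 75, 64, 59, 71])
    ho_cdi      = np.array([ 4,  6,  3,  7,  6,  5,  8,  6,  4,  7,  9,  6,  5,  8])

    # Matched-months correlation.
    rho0, p0 = stats.spearmanr(ceftriaxone, ho_cdi)

    # One-month lag: antibiotic use in month m versus HO-CDI in month m + 1.
    rho1, p1 = stats.spearmanr(ceftriaxone[:-1], ho_cdi[1:])

    print(f"matched months: rho = {rho0:.2f}, p = {p0:.3f}")
    print(f"1-month lag:    rho = {rho1:.2f}, p = {p1:.3f}")
    ```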

  10. Neighbourhood safety and area deprivation modify the associations between parkland and psychological distress in Sydney, Australia.

    PubMed

    Chong, Shanley; Lobb, Elizabeth; Khan, Rabia; Abu-Rayya, Hisham; Byun, Roy; Jalaludin, Bin

    2013-05-01

    The aim of this study was to investigate how perceived neighbourhood safety and area deprivation influenced the relationship between parklands and mental health. Information about psychological distress, perceptions of safety, demographic and socio-economic background at the individual level was extracted from New South Wales Population Health Survey. The proportion of a postcode that was parkland was used as a proxy measure for access to parklands and was calculated for each individual. Generalized Estimating Equations logistic regression analyses were performed to account for correlation between participants within postcodes, and with controls for socio-demographic characteristics and socio-economic status at the area level. In areas where the residents reported perceiving their neighbourhood to be "safe" and controlling for area levels of socio-economic deprivation, there were no statistically significant associations between the proportion of parkland and high or very high psychological distress. In the most disadvantaged neighbourhoods which were perceived as unsafe by residents, those with greater proportions of parkland, over 20%, there was greater psychological distress, this association was statistically significant (20-40% parkland: OR=2.27, 95% CI=1.45-3.55; >40% parkland: OR=2.53, 95% CI=1.53-4.19). Our study indicates that perceptions of neighbourhood safety and area deprivation were statistically significant effect modifiers of the association between parkland and psychological distress.

  11. Linear regression analysis of Hospital Episode Statistics predicts a large increase in demand for elective hand surgery in England.

    PubMed

    Bebbington, Emily; Furniss, Dominic

    2015-02-01

    We integrated two factors, demographic population shifts and changes in prevalence of disease, to predict future trends in demand for hand surgery in England, to facilitate workforce planning. We analysed Hospital Episode Statistics data for Dupuytren's disease, carpal tunnel syndrome, cubital tunnel syndrome, and trigger finger from 1998 to 2011. Using linear regression, we estimated trends in both diagnosis and surgery until 2030. We integrated this regression with age specific population data from the Office for National Statistics in order to estimate how this will contribute to a change in workload over time. There has been a significant increase in both absolute numbers of diagnoses and surgery for all four conditions. Combined with future population data, we calculate that the total operative burden for these four conditions will increase from 87,582 in 2011 to 170,166 (95% confidence interval 144,517-195,353) in 2030. The prevalence of these diseases in the ageing population, and increasing prevalence of predisposing factors such as obesity and diabetes, may account for the predicted increase in workload. The most cost effective treatments must be sought, which requires high quality clinical trials. Our methodology can be applied to other sub-specialties to help anticipate the need for future service provision. Copyright © 2014 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.
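
    The projection approach, an ordinary least-squares trend fitted to annual counts and extrapolated with a confidence interval, can be sketched as follows; the counts are hypothetical, not Hospital Episode Statistics.

    ```python
    import numpy as np
    import statsmodels.api as sm

    # Hypothetical annual operation counts for one condition, 1998-2011.
    years = np.arange(1998, 2012)
    ops = np.array([3100, 3250, 3380, 3500, 3700, 3850, 4020, 4150,
                    4300, 4480, 4650, 4800, 4990, 5150])

    X = sm.add_constant(years)
    fit = sm.OLS(ops, X).fit()

    # Extrapolate the linear trend to 2030 with a 95% confidence interval.
    X_future = sm.add_constant(np.arange(2012, 2031))
    pred = fit.get_prediction(X_future)
    point = pred.predicted_mean[-1]
    low, high = pred.conf_int()[-1]
    print(f"2030 projection: {point:.0f} (95% CI {low:.0f}-{high:.0f})")
    ```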

  12. DESIGNING ENVIRONMENTAL MONITORING DATABASES FOR STATISTIC ASSESSMENT

    EPA Science Inventory

    Databases designed for statistical analyses have characteristics that distinguish them from databases intended for general use. EMAP uses a probabilistic sampling design to collect data to produce statistical assessments of environmental conditions. In addition to supporting the ...

  13. Comparing Visual and Statistical Analysis of Multiple Baseline Design Graphs.

    PubMed

    Wolfe, Katie; Dickenson, Tammiee S; Miller, Bridget; McGrath, Kathleen V

    2018-04-01

    A growing number of statistical analyses are being developed for single-case research. One important factor in evaluating these methods is the extent to which each corresponds to visual analysis. Few studies have compared statistical and visual analysis, and information about more recently developed statistics is scarce. Therefore, our purpose was to evaluate the agreement between visual analysis and four statistical analyses: improvement rate difference (IRD); Tau-U; Hedges, Pustejovsky, Shadish (HPS) effect size; and between-case standardized mean difference (BC-SMD). Results indicate that IRD and BC-SMD had the strongest overall agreement with visual analysis. Although Tau-U had strong agreement with visual analysis on raw values, it had poorer agreement when those values were dichotomized to represent the presence or absence of a functional relation. Overall, visual analysis appeared to be more conservative than statistical analysis, but further research is needed to evaluate the nature of these disagreements.

  14. Errors in statistical decision making Chapter 2 in Applied Statistics in Agricultural, Biological, and Environmental Sciences

    USDA-ARS?s Scientific Manuscript database

    Agronomic and Environmental research experiments result in data that are analyzed using statistical methods. These data are unavoidably accompanied by uncertainty. Decisions about hypotheses, based on statistical analyses of these data are therefore subject to error. This error is of three types,...

  15. The Large-Scale Structure of Semantic Networks: Statistical Analyses and a Model of Semantic Growth

    ERIC Educational Resources Information Center

    Steyvers, Mark; Tenenbaum, Joshua B.

    2005-01-01

    We present statistical analyses of the large-scale structure of 3 types of semantic networks: word associations, WordNet, and Roget's Thesaurus. We show that they have a small-world structure, characterized by sparse connectivity, short average path lengths between words, and strong local clustering. In addition, the distributions of the number of…

  16. Using information theory to assess the communicative capacity of circulating microRNA.

    PubMed

    Finn, Nnenna A; Searles, Charles D

    2013-10-11

    The discovery of extracellular microRNAs (miRNAs) and their transport modalities (i.e., microparticles, exosomes, proteins and lipoproteins) has sparked theories regarding their role in intercellular communication. Here, we assessed the information transfer capacity of different miRNA transport modalities in human serum by utilizing basic principles of information theory. Zipf Statistics were calculated for each of the miRNA transport modalities identified in human serum. Our analyses revealed that miRNA-mediated information transfer is redundant, as evidenced by negative Zipf's Statistics with magnitudes greater than one. In healthy subjects, the potential communicative capacity of miRNA in complex with circulating proteins was significantly lower than that of miRNA encapsulated in circulating microparticles and exosomes. Moreover, the presence of coronary heart disease significantly lowered the communicative capacity of all circulating miRNA transport modalities. To assess the internal organization of circulating miRNA signals, Shannon's zero- and first-order entropies were calculated. Microparticles (MPs) exhibited the lowest Shannon entropic slope, indicating a relatively high capacity for information transfer. Furthermore, compared to the other miRNA transport modalities, MPs appeared to be the most efficient at transferring miRNA to cultured endothelial cells. Taken together, these findings suggest that although all transport modalities have the capacity for miRNA-based information transfer, MPs may be the simplest and most robust way to achieve miRNA-based signal transduction in sera. This study presents a novel method for analyzing the quantitative capacity of miRNA-mediated information transfer while providing insight into the communicative characteristics of distinct circulating miRNA transport modalities. Published by Elsevier Inc.
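
    The two quantities used here can be sketched directly from a frequency distribution of miRNA species. One common operationalisation (assumed here, not taken from the paper) is that Zipf's statistic is the slope of log frequency against log rank, while Shannon's zero- and first-order entropies summarise repertoire size and usage frequencies; the counts below are hypothetical.

    ```python
    import numpy as np

    # Hypothetical counts of distinct miRNA species in one transport modality.
    counts = np.array([240, 130, 90, 55, 30, 18, 10, 6, 3, 2], dtype=float)
    probs = counts / counts.sum()

    # Zipf's statistic: slope of log10(frequency) against log10(rank).
    ranks = np.arange(1, counts.size + 1)
    zipf_slope = np.polyfit(np.log10(ranks), np.log10(probs), 1)[0]

    # Shannon entropies: zero-order (repertoire size) and first-order (usage frequencies).
    h0 = np.log2(counts.size)
    h1 = -np.sum(probs * np.log2(probs))

    print(f"Zipf statistic = {zipf_slope:.2f}, H0 = {h0:.2f} bits, H1 = {h1:.2f} bits")
    ```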

  17. Variation and seasonal patterns of suicide mortality in Finland and Sweden since the 1750s.

    PubMed

    Holopainen, Jari; Helama, Samuli; Björkenstam, Charlotte; Partonen, Timo

    2013-11-01

    Suicide mortality varies in both the short and long term. Our study examines suicide mortality in Finland and Sweden from the 1750s until today. The aim of our study is to detect any seasonal peaks in suicide rates and examine their temporal evolution to suggest a mechanism that may explain such peaks. We acquired the study material from the Finnish and Swedish cause of death statistics (257,341 deaths by suicide) and the relevant population gender structure data. We then separately calculated the annual male and female suicide rates per 100,000 inhabitants. We analysed the suicide peaks, calculating factors of proportionality for the available data by dividing the suicide rates in the peak months (May and October) by the annual suicide rates. Suicide rates in Finland and Sweden peak twice a year. Both men and women in both countries most often commit suicide in May. There is another peak in October, with the exception of Finnish men. These suicide peaks coincide with a temperature increase in May and the biggest annual drop in temperature in October. We also observed a monotonic long-term change in the Swedish statistics, but not in the Finnish data. Our hypothesis is that seasonal variation in suicide rates may be caused by abrupt temperature changes twice a year that trigger the activity in brown adipose tissue and deepen depression. While the overall suicide mortality rates varied considerably, the monthly proportions in May did not. This finding suggests a routine factor underlying the spring peak in suicide mortality.

  18. A critical review and meta-analysis of the association between overt hyperthyroidism and mortality.

    PubMed

    Brandt, Frans; Green, Anders; Hegedüs, Laszlo; Brix, Thomas H

    2011-10-01

    Overt hyperthyroidism has been associated with cardiac arrhythmias, hypercoagulopathy, stroke, and pulmonary embolism, all of which may increase mortality. Some, but not all, studies show an increased mortality in patients with hyperthyroidism. This inconsistency may be due to differences in study design, characteristics of participants, or confounders. In order to test whether hyperthyroidism influences mortality, we performed a critical review and statistical meta-analysis. Based on an electronic PubMed search, using Medical Subject Headings such as hyperthyroidism, thyrotoxicosis, and mortality or survival, case-control and cohort studies were selected and reviewed. Using meta-analysis, an overall relative risk (RR) of mortality was calculated. Eight studies fulfilled the inclusion criteria, six of which showed an increased all-cause mortality; seven studies, including 31,138 patients and 400,000 person-years at risk, allowed calculation of mortality in a meta-analysis. Based on this, the RR of overall mortality was 1.21 (95% confidence interval: 1.05-1.38). Analyses including studies considering setting, treatment, and control for co-morbidity did not significantly alter this finding. As the measured heterogeneity (I²) ranges from 89.1 to 98.3%, which is much higher than the 50% generally viewed as a threshold, the statistical heterogeneity is very pronounced in the included studies. In patients diagnosed with hyperthyroidism, mortality is increased by ∼ 20%. Future studies need to address the cause of hyperthyroidism, impact of type of therapy, time dependency, as well as the potential influence of confounding or genetic susceptibility before the question of causality can be answered.
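
    For readers unfamiliar with how an overall RR and the I² statistic are obtained, the sketch below pools invented study-level relative risks with inverse-variance weights and a DerSimonian-Laird random-effects adjustment; the numbers are placeholders, not the included studies.

        import numpy as np

        # Hypothetical study-level relative risks with 95% CIs (placeholders, not the included studies)
        rr = np.array([1.10, 1.35, 1.15, 1.25, 1.05, 1.30, 1.20])
        ci_low = np.array([0.95, 1.10, 0.98, 1.05, 0.90, 1.02, 1.01])
        ci_high = np.array([1.27, 1.66, 1.35, 1.49, 1.23, 1.66, 1.43])

        log_rr = np.log(rr)
        se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)   # SE of log RR from the CI width
        w = 1 / se**2                                          # fixed-effect (inverse-variance) weights

        # Cochran's Q, DerSimonian-Laird between-study variance tau^2, and I^2
        fixed = np.sum(w * log_rr) / np.sum(w)
        q = np.sum(w * (log_rr - fixed) ** 2)
        df = len(rr) - 1
        c = np.sum(w) - np.sum(w**2) / np.sum(w)
        tau2 = max(0.0, (q - df) / c)
        i2 = max(0.0, (q - df) / q) * 100

        # Random-effects pooled RR and its confidence interval
        w_re = 1 / (se**2 + tau2)
        pooled = np.sum(w_re * log_rr) / np.sum(w_re)
        se_pooled = np.sqrt(1 / np.sum(w_re))
        lo, hi = np.exp(pooled - 1.96 * se_pooled), np.exp(pooled + 1.96 * se_pooled)
        print(f"Pooled RR = {np.exp(pooled):.2f} (95% CI {lo:.2f}-{hi:.2f}), I^2 = {i2:.0f}%")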

  19. Meta-analysis: Problems with Russian Publications.

    PubMed

    Verbitskaya, E V

    2015-01-01

    Meta-analysis is a powerful tool for identifying evidence-based medical technologies (interventions) for use in everyday practice. Meta-analysis uses statistical approaches to combine results from multiple studies in an effort to increase power (over individual studies), improve estimates of the size of the effect, and/or resolve uncertainty when reports disagree. It is a quantitative, formal study design used to systematically assess previous research studies and derive conclusions from that research, and it may provide a more precise estimate of the effect of a treatment or risk factor for a disease, or of other outcomes, than any individual study contributing to the pooled analysis. There is a substantial number of Russian medical publications, but not many meta-analyses are published in Russian, and Russian publications are not often cited in English-language papers. A total of 90% of the clinical studies included in published meta-analyses are English-language papers, and international studies or papers with Russian co-authors are published in English. The main question is: what is the problem with the inclusion of Russian medical publications in meta-analyses? The main reasons are the following. 1) It is difficult to find Russian papers and to work with them and with Russian journals: a) only a few Russian biomedical journals are translated into English and included in databases (PubMed, Scopus and others), despite the fact that all of them have English-language abstracts; b) most meta-analysis authors use citation management software such as Mendeley, Reference Manager, ProCite, EndNote and others, which lets scientists organize their own literature databases from internet searches and has add-ons for Office programs that make literature citation very convenient, and the websites of most international journals have built-in tools for saving citations to reference manager software, whereas the majority of articles in Russian journals cannot be captured by citation management systems because they lack coded article descriptors; c) some journals still post PDF files of the whole journal issue without dividing it into articles and do not provide any descriptors, making time-consuming manual input of information the only possibility; moreover, the article content is unavailable to search engines. 2) The quality of research. This problem has been discussed for more than twenty years, yet there are still too many publications with poor study design and statistical analysis. With the exception of pharmaceutical clinical trials designed and supervised by the international pharmaceutical industry, many interventional studies conducted in Russia have methodological flaws implying a high risk of bias: a) absence of adequate controls; b) no standard endpoints, duration of therapy or follow-up; c) absence of randomization and blinding; d) low statistical power, with sample sizes calculated (if calculated at all) so that the sample is as small as possible - very often statisticians have to justify the small number of subjects the sponsor can afford instead of calculating the sample size needed to reach adequate power; e) no standards of statistical analysis; f) Russian journals do not have standards for the description and presentation of study results, in particular the results of statistical analysis (a reader often cannot even tell whether the standard deviation (SD) or the standard error of the mean (SEM) is presented). We have long-standing experience in analysing the methodological and statistical quality of Russian biomedical publications and have found that up to 80% of publications contain statistical and methodological errors and carry a high risk of bias. In our own practice, we tried to perform two meta-analyses for two local pharmaceutical products for the prevention of stroke recurrence. For the first product, we did not find even two Russian-language studies suitable for the analysis (incomparable populations, different designs, endpoints, doses, etc.). For the second product, only four studies had comparable populations and standard internationally approved scales for effectiveness analysis; however, the combinations of scales, length of treatment and follow-up differed so widely that we could combine the results of only two or three studies for each endpoint. Russian researchers have to follow internationally recognised standards in study design, selection of endpoints, timelines and therapy regimens, data analysis and presentation of results. Russian journals need to develop consolidated rules for authors of clinical trials and epidemiological research on result reporting that are close to international standards; in this respect the international EQUATOR Network (Enhancing the QUAlity and Transparency Of health Research, http://www.equator-network.org/) is one to be taken into account. In addition, Russian journals have to improve their online information for better interaction with search engines and citation managers.

  20. Calculations vs. measurements of remnant dose rates for SNS spent structures

    NASA Astrophysics Data System (ADS)

    Popova, I. I.; Gallmeier, F. X.; Trotter, S.; Dayton, M.

    2018-06-01

    Residual dose rate measurements were conducted on target vessel #13 and proton beam window #5 after extraction from their service locations. These measurements were used to verify the calculation methods for radionuclide inventory assessment that are typically performed for nuclear waste characterization and transportation of these structures. Neutronics analyses for predicting residual dose rates were carried out using the transport code MCNPX and the transmutation code CINDER90. For the transport analyses, detailed geometry models of the structures and their surroundings were applied. The neutronics analyses were carried out using the Bertini and CEM high-energy physics models to simulate particle interactions. The preliminary calculated results were analysed and compared to the measured dose rates and overall show good agreement, within 40% on average.

  1. Calculations vs. measurements of remnant dose rates for SNS spent structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Popova, Irina I.; Gallmeier, Franz X.; Trotter, Steven M.

    Residual dose rate measurements were conducted on target vessel #13 and proton beam window #5 after extraction from their service locations. These measurements were used to verify the calculation methods for radionuclide inventory assessment that are typically performed for nuclear waste characterization and transportation of these structures. Neutronics analyses for predicting residual dose rates were carried out using the transport code MCNPX and the transmutation code CINDER90. For the transport analyses, detailed geometry models of the structures and their surroundings were applied. The neutronics analyses were carried out using the Bertini and CEM high-energy physics models to simulate particle interactions. The preliminary calculated results were analysed and compared to the measured dose rates and overall show good agreement, within 40% on average.

  2. Differences in Performance Among Test Statistics for Assessing Phylogenomic Model Adequacy.

    PubMed

    Duchêne, David A; Duchêne, Sebastian; Ho, Simon Y W

    2018-05-18

    Statistical phylogenetic analyses of genomic data depend on models of nucleotide or amino acid substitution. The adequacy of these substitution models can be assessed using a number of test statistics, allowing the model to be rejected when it is found to provide a poor description of the evolutionary process. A potentially valuable use of model-adequacy test statistics is to identify when data sets are likely to produce unreliable phylogenetic estimates, but their differences in performance are rarely explored. We performed a comprehensive simulation study to identify test statistics that are sensitive to some of the most commonly cited sources of phylogenetic estimation error. Our results show that, for many test statistics, traditional thresholds for assessing model adequacy can fail to reject the model when the phylogenetic inferences are inaccurate and imprecise. This is particularly problematic when analysing loci that have few variable informative sites. We propose new thresholds for assessing substitution model adequacy and demonstrate their effectiveness in analyses of three phylogenomic data sets. These thresholds lead to frequent rejection of the model for loci that yield topological inferences that are imprecise and are likely to be inaccurate. We also propose the use of a summary statistic that provides a practical assessment of overall model adequacy. Our approach offers a promising means of enhancing model choice in genome-scale data sets, potentially leading to improvements in the reliability of phylogenomic inference.

  3. Data processing with Pymicra, the Python tool for Micrometeorological Analyses

    NASA Astrophysics Data System (ADS)

    Chor, T. L.; Dias, N. L.

    2017-12-01

    With the ever-increasing capability of instrumentation to collect high-frequency turbulence data, micrometeorological experiments are now generating significant amounts of data. Clearly, data processing -- and not data collection anymore -- has become the limiting factor for those very large data sets. The ability to extract useful scientific information from those experiments, therefore, hinges on tools that (i) are able to process those data effectively and accurately, (ii) are flexible enough to be adapted to the specific requirements of each investigation, and (iii) are robust enough to make data analysis easily reproducible across different large data sets. We have developed a framework for micrometeorological data analysis called Pymicra which delivers such capabilities while maintaining the investigator's proximity to the data. It is fully written in an open-source, very high level language, Python, which has been gaining widespread acceptance as a scientific tool. It follows the philosophy of "not reinventing the wheel" and, as a result, relies on existing well-established open-source Python packages such as Numpy and Pandas. Thus, minimum effort is needed to program statistics, array processing, Fourier analysis, etc. Among the things that Pymicra does are reading and organizing data from virtually any format, applying common quality control procedures, extracting fluctuations in a number of ways, correcting for sensor drift, automatic calculation of fluid properties (such as air and dry air density), handling of units, calculation of cross-spectra, calculation of turbulent fluxes and scales, and all other features already provided by Pandas (interpolation, statistical tests, handling of missing data, etc.). Pymicra is freely available on Github, and its heavy use of high-level programming makes adding and modifying code considerably easier for any scientific programmer, making it straightforward for other scientists to contribute new functionality and point out room for improvement. Because of that, Pymicra is a candidate to become a community-developed code in the future and to centralize part of the data processing aimed at micrometeorology.
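
    Pymicra's own API is not reproduced here; the sketch below uses plain pandas to illustrate one step the package automates, extracting fluctuations by Reynolds (block) averaging and forming a covariance-based turbulent flux from synthetic data.

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(0)

        # Synthetic 20 Hz vertical wind (w, m/s) and temperature (T, K) for one 30-minute block
        n = 20 * 60 * 30
        data = pd.DataFrame({
            "w": rng.normal(0.0, 0.3, n),
            "T": 298.0 + rng.normal(0.0, 0.5, n),
        })
        data["T"] += 0.2 * data["w"]          # inject a correlated component so the flux is nonzero

        # Extract fluctuations as departures from the block mean (simple Reynolds averaging)
        fluct = data - data.mean()

        # Kinematic heat flux w'T' (K m/s); multiplying by rho*cp would give W/m^2
        wT = (fluct["w"] * fluct["T"]).mean()
        print(f"Kinematic heat flux w'T' = {wT:.4f} K m s^-1")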

  4. Statistical Distance as a Measure of Physiological Dysregulation Is Largely Robust to Variation in Its Biomarker Composition

    PubMed Central

    Cohen, Alan A.; Leroux, Maxime; Faucher, Samuel; Morissette-Thomas, Vincent; Legault, Véronique; Fried, Linda P.; Ferrucci, Luigi

    2015-01-01

    Physiological dysregulation may underlie aging and many chronic diseases, but is challenging to quantify because of the complexity of the underlying systems. Recently, we described a measure of physiological dysregulation, DM, that uses statistical distance to assess the degree to which an individual’s biomarker profile is normal versus aberrant. However, the sensitivity of DM to details of the calculation method has not yet been systematically assessed. In particular, the number and choice of biomarkers and the definition of the reference population (RP, the population used to define a “normal” profile) may be important. Here, we address this question by validating the method on 44 common clinical biomarkers from three longitudinal cohort studies and one cross-sectional survey. DMs calculated on different biomarker subsets show that while the signal of physiological dysregulation increases with the number of biomarkers included, the value of additional markers diminishes as more are added and inclusion of 10-15 is generally sufficient. As long as enough markers are included, individual markers have little effect on the final metric, and even DMs calculated from mutually exclusive groups of markers correlate with each other at r~0.4-0.5. We also used data subsets to generate thousands of combinations of study populations and RPs to address sensitivity to differences in age range, sex, race, data set, sample size, and their interactions. Results were largely consistent (but not identical) regardless of the choice of RP; however, the signal was generally clearer with a younger and healthier RP, and RPs too different from the study population performed poorly. Accordingly, biomarker and RP choice are not particularly important in most cases, but caution should be used across very different populations or for fine-scale analyses. Biologically, the lack of sensitivity to marker choice and better performance of younger, healthier RPs confirm an interpretation of DM as physiological dysregulation and as an emergent property of a complex system. PMID:25875923
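
    The sketch below illustrates a statistical distance of this kind using the Mahalanobis distance from a reference population's biomarker centroid; treating DM as a Mahalanobis-type distance, and the simulated biomarker values, are assumptions made for illustration.

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical reference population: 500 individuals x 10 biomarkers (standardised units)
        reference = rng.normal(0.0, 1.0, size=(500, 10))
        mu = reference.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(reference, rowvar=False))

        def statistical_distance(profile):
            """Mahalanobis distance of one biomarker profile from the reference centroid."""
            d = profile - mu
            return float(np.sqrt(d @ cov_inv @ d))

        normal_profile = rng.normal(0.0, 1.0, 10)
        aberrant_profile = normal_profile + 2.0        # shifted on every marker
        print(f"normal: {statistical_distance(normal_profile):.2f}, "
              f"aberrant: {statistical_distance(aberrant_profile):.2f}")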

  5. Development of QC Procedures for Ocean Data Obtained by National Research Projects of Korea

    NASA Astrophysics Data System (ADS)

    Kim, S. D.; Park, H. M.

    2017-12-01

    To establish a data management system for ocean data obtained by national research projects of the Ministry of Oceans and Fisheries of Korea, KIOST conducted standardization and development of QC procedures. After reviewing and analyzing the existing international and domestic ocean-data standards and QC procedures, draft versions of the standards and QC procedures were prepared. The proposed standards and QC procedures were reviewed and revised several times by experts in the field of oceanography and by academic societies. A technical report was prepared covering standards for 25 data items and 12 QC procedures for physical, chemical, biological and geological data. The QC procedure for temperature and salinity data was set up by reference to the manuals published by GTSPP, ARGO and IOOS QARTOD. It consists of 16 QC tests applicable to vertical profile data and time series data obtained in real-time mode and delayed mode. Three regional range tests to inspect annual, seasonal and monthly variations were included in the procedure. Three programs were developed to calculate and provide the upper and lower limits of temperature and salinity at depths from 0 to 1,550 m. Temperature and salinity data from the World Ocean Database, ARGO, GTSPP and in-house KIOST data were analysed statistically to calculate regional limits for the Northwest Pacific area. Based on this statistical analysis, the programs calculate regional ranges using the mean and standard deviation on three grid systems (3° grid, 1° grid and 0.5° grid) and provide a recommendation. The QC procedures for the 12 data items were set up during the 1st phase of the national program for data management (2012-2015) and are being applied to national research projects in practice during the 2nd phase (2016-2019). The QC procedures will be revised based on a review of their application when the 2nd phase of the data management program is completed.
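
    A minimal sketch of a regional range test of the kind described, flagging a value that falls outside the gridded climatological mean plus or minus k standard deviations; the grid resolution, multiplier k, flag codes and synthetic observations are illustrative assumptions rather than the KIOST procedure itself.

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(2)

        # Hypothetical historical surface temperatures with positions (degrees) and values (deg C)
        hist = pd.DataFrame({
            "lat": rng.uniform(30, 45, 20000),
            "lon": rng.uniform(120, 140, 20000),
            "temp": rng.normal(15, 5, 20000),
        })

        # Bin observations to a 1-degree grid and store the climatological mean and std per cell
        hist["glat"] = hist["lat"].round()
        hist["glon"] = hist["lon"].round()
        cell_stats = hist.groupby(["glat", "glon"])["temp"].agg(["mean", "std"])

        def regional_range_flag(lat, lon, value, k=3.0):
            """1 = pass, 4 = fail, 2 = not evaluated (no historical data for this cell)."""
            cell = (float(round(lat)), float(round(lon)))
            if cell not in cell_stats.index:
                return 2
            mean, std = cell_stats.loc[cell]
            return 1 if abs(value - mean) <= k * std else 4

        print(regional_range_flag(35.2, 129.8, 14.0))   # plausible value -> 1
        print(regional_range_flag(35.2, 129.8, 45.0))   # implausible value -> 4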

  6. Clinical calculators in hospital medicine: Availability, classification, and needs.

    PubMed

    Dziadzko, Mikhail A; Gajic, Ognjen; Pickering, Brian W; Herasevich, Vitaly

    2016-09-01

    Clinical calculators are widely used in modern clinical practice, but are not generally applied to electronic health record (EHR) systems. Important barriers to integrating these clinical calculators into existing EHR systems include the need for real-time calculation, human-calculator interaction, and data source requirements. The objective of this study was to identify, classify, and evaluate the use of available clinical calculators for clinicians in the hospital setting. Dedicated online resources with medical calculators and providers of aggregated medical information were queried for readily available clinical calculators. Calculators were mapped by clinical categories, mechanism of calculation, and the goal of calculation. Online statistics from selected Internet resources and clinician opinion were used to assess the use of clinical calculators. One hundred seventy-six readily available calculators in 4 categories, 6 primary specialties, and 40 subspecialties were identified. The goals of calculation included prediction, severity, risk estimation, diagnostic, and decision-making aid. A combination of summation logic with cutoffs or rules was the most frequent mechanism of computation. Combined results from online resource statistics and clinician opinion identified the 13 most utilized calculators. Although not an exhaustive list, a total of 176 validated calculators were identified, classified, and evaluated for usefulness. Most of these calculators are used for adult patients in the critical care or internal medicine settings. Thirteen of 176 clinical calculators were determined to be useful in our institution. All of these calculators have an interface for manual input. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  7. The Evaluation of Bivariate Mixed Models in Meta-analyses of Diagnostic Accuracy Studies with SAS, Stata and R.

    PubMed

    Vogelgesang, Felicitas; Schlattmann, Peter; Dewey, Marc

    2018-05-01

    Meta-analyses require a thoroughly planned procedure to obtain unbiased overall estimates. From a statistical point of view not only model selection but also model implementation in the software affects the results. The present simulation study investigates the accuracy of different implementations of general and generalized bivariate mixed models in SAS (using proc mixed, proc glimmix and proc nlmixed), Stata (using gllamm, xtmelogit and midas) and R (using reitsma from package mada and glmer from package lme4). Both models incorporate the relationship between sensitivity and specificity - the two outcomes of interest in meta-analyses of diagnostic accuracy studies - utilizing random effects. Model performance is compared in nine meta-analytic scenarios reflecting the combination of three sizes for meta-analyses (89, 30 and 10 studies) with three pairs of sensitivity/specificity values (97%/87%; 85%/75%; 90%/93%). The evaluation of accuracy in terms of bias, standard error and mean squared error reveals that all implementations of the generalized bivariate model calculate sensitivity and specificity estimates with deviations less than two percentage points. proc mixed, which together with reitsma implements the general bivariate mixed model proposed by Reitsma, showed convergence problems. The random effect parameters are in general underestimated. This study shows that flexibility and simplicity of model specification together with convergence robustness should influence implementation recommendations, as the accuracy in terms of bias was acceptable in all implementations using the generalized approach. Schattauer GmbH.

  8. Are head-to-head trials of biologics needed? The role of value of information methods in arthritis research.

    PubMed

    Welton, Nicky J; Madan, Jason; Ades, Anthony E

    2011-09-01

    Reimbursement decisions are typically based on cost-effectiveness analyses. While a cost-effectiveness analysis can identify the optimum strategy, there is usually some degree of uncertainty around this decision. Sources of uncertainty include statistical sampling error in treatment efficacy measures, underlying baseline risk, utility measures and costs, as well as uncertainty in the structure of the model. The optimal strategy is therefore only optimal on average, and a decision to adopt this strategy might still be the wrong decision if all uncertainty could be eliminated. This means that there is a quantifiable expected (average) loss attaching to decisions made under uncertainty, and hence a value in collecting information to reduce that uncertainty. Value of information (VOI) analyses can be used to provide guidance on whether more research would be cost-effective, which particular model inputs (parameters) have the most bearing on decision uncertainty, and can also help with the design and sample size of further research. Here, we introduce the key concepts in VOI analyses, and highlight the inputs required to calculate it. The adoption of the new biologic treatments for RA and PsA tends to be based on placebo-controlled trials. We discuss the possible role of VOI analyses in deciding whether head-to-head comparisons of the biologic therapies should be carried out, illustrating with examples from other fields. We emphasize the need for a model of the natural history of RA and PsA, which reflects a consensus view.
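
    A minimal Monte-Carlo sketch of the core VOI quantity, the expected value of perfect information (EVPI): the expected net benefit of choosing with perfect knowledge of the uncertain inputs minus the expected net benefit of the best choice made under current uncertainty. The net-benefit model, willingness-to-pay threshold and parameter distributions are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(7)
        n_sim = 100_000
        wtp = 20_000    # willingness to pay per QALY (assumed)

        # Uncertain inputs: incremental QALYs and incremental cost of the new therapy vs. comparator
        d_qaly = rng.normal(0.05, 0.04, n_sim)
        d_cost = rng.normal(800.0, 150.0, n_sim)

        # Net monetary benefit per simulation; the comparator is the reference with NB = 0
        nb = np.column_stack([np.zeros(n_sim), wtp * d_qaly - d_cost])

        nb_current_info = nb.mean(axis=0).max()    # commit now to the strategy that is best on average
        nb_perfect_info = nb.max(axis=1).mean()    # pick the best strategy in every simulated "world"
        evpi = nb_perfect_info - nb_current_info
        print(f"EVPI per patient: {evpi:.0f} monetary units")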

  9. Survival Regression Modeling Strategies in CVD Prediction.

    PubMed

    Barkhordari, Mahnaz; Padyab, Mojgan; Sardarinia, Mahsa; Hadaegh, Farzad; Azizi, Fereidoun; Bozorgmanesh, Mohammadreza

    2016-04-01

    A fundamental part of prevention is prediction. Potential predictors are the sine qua non of prediction models. However, whether incorporating novel predictors into prediction models could be directly translated to added predictive value remains an area of dispute. The difference between the predictive power of a predictive model with (enhanced model) and without (baseline model) a certain predictor is generally regarded as an indicator of the predictive value added by that predictor. Indices such as discrimination and calibration have long been used in this regard. Recently, the use of added predictive value has been suggested while comparing the predictive performances of the predictive models with and without novel biomarkers. User-friendly statistical software capable of implementing novel statistical procedures is conspicuously lacking. This shortcoming has restricted implementation of such novel model assessment methods. We aimed to construct Stata commands to help researchers obtain the aforementioned statistical indices. We have written Stata commands that are intended to help researchers obtain the following: (1) the Nam-D'Agostino χ² goodness-of-fit test; and (2) the cut point-free and cut point-based net reclassification improvement index (NRI), the relative and absolute integrated discrimination improvement index (IDI), and survival-based regression analyses. We applied the commands to real data on women participating in the Tehran lipid and glucose study (TLGS) to examine if information relating to a family history of premature cardiovascular disease (CVD), waist circumference, and fasting plasma glucose can improve predictive performance of Framingham's general CVD risk algorithm. The command is adpredsurv for survival models. Herein we have described the Stata package "adpredsurv" for calculation of the Nam-D'Agostino χ² goodness-of-fit test as well as cut point-free and cut point-based NRI, relative and absolute IDI, and survival-based regression analyses. We hope this work encourages the use of novel methods in examining predictive capacity of the emerging plethora of novel biomarkers.
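
    The adpredsurv command itself is not shown here; as a language-neutral illustration of one of the indices it reports, the sketch below computes a single-cut-point net reclassification improvement (NRI) for a binary outcome from invented baseline and enhanced risk predictions.

        import numpy as np

        rng = np.random.default_rng(11)
        n = 1000
        event = rng.random(n) < 0.2                                  # hypothetical observed outcomes
        risk_base = np.clip(0.20 + rng.normal(0, 0.10, n), 0, 1)     # baseline model predictions
        risk_new = np.clip(risk_base + np.where(event, 0.05, -0.03)  # enhanced model predictions
                           + rng.normal(0, 0.02, n), 0, 1)

        cut = 0.25                                                   # single illustrative cut point
        up = (risk_new >= cut) & (risk_base < cut)
        down = (risk_new < cut) & (risk_base >= cut)

        nri_events = up[event].mean() - down[event].mean()
        nri_nonevents = down[~event].mean() - up[~event].mean()
        print(f"NRI = {nri_events + nri_nonevents:.3f} "
              f"(events {nri_events:.3f}, non-events {nri_nonevents:.3f})")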

  10. Employing Deceptive Dynamic Network Topology Through Software-Defined Networking

    DTIC Science & Technology

    2014-03-01

    manage economies, banking, and businesses, to the way we gather intelligence and militaries wage war. With computer networks and the Internet, we have seen...space, along with other generated statistics, similar to that performed by the Ant Census project. As we have shown, there is an extensive and diverse...calculated RTT for each probe. In the ping statistics, we are presented the details of probes sent and responses received, and the calculated packet loss

  11. To P or Not to P: Backing Bayesian Statistics.

    PubMed

    Buchinsky, Farrel J; Chadha, Neil K

    2017-12-01

    In biomedical research, it is imperative to differentiate chance variation from truth before we generalize what we see in a sample of subjects to the wider population. For decades, we have relied on null hypothesis significance testing, where we calculate P values for our data to decide whether to reject a null hypothesis. This methodology is subject to substantial misinterpretation and errant conclusions. Instead of working backward by calculating the probability of our data if the null hypothesis were true, Bayesian statistics allow us instead to work forward, calculating the probability of our hypothesis given the available data. This methodology gives us a mathematical means of incorporating our "prior probabilities" from previous study data (if any) to produce new "posterior probabilities." Bayesian statistics tell us how confidently we should believe what we believe. It is time to embrace and encourage their use in our otolaryngology research.
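
    A minimal worked example of the forward reasoning described above, combining a prior probability with the likelihood of the observed data to obtain a posterior probability; all numbers are invented.

        # Prior probability that the hypothesis (e.g., a real treatment effect) is true
        prior = 0.30

        # Assumed likelihoods of observing data this extreme under each hypothesis
        p_data_given_h1 = 0.80    # if the effect is real
        p_data_given_h0 = 0.10    # if there is no effect

        # Bayes' theorem: posterior = P(data|H1) * P(H1) / P(data)
        evidence = p_data_given_h1 * prior + p_data_given_h0 * (1 - prior)
        posterior = p_data_given_h1 * prior / evidence
        print(f"Posterior probability of the hypothesis: {posterior:.2f}")    # about 0.77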

  12. Explorative spatial analysis of traffic accident statistics and road mortality among the provinces of Turkey.

    PubMed

    Erdogan, Saffet

    2009-10-01

    The aim of the study is to describe the inter-province differences in traffic accidents and mortality on roads of Turkey. Two different risk indicators were used to evaluate the road safety performance of the provinces in Turkey. These indicators are the ratios between the number of persons killed in road traffic accidents (1) or the number of accidents (2) (numerators) and the exposure to traffic risk (denominator). Population and the number of registered motor vehicles in the provinces were used as denominators individually. Spatial analyses were performed on the mean annual rate of deaths and on the number of fatal accidents calculated for the period 2001-2006. Empirical Bayes smoothing was used to remove background noise from the raw death and accident rates because of the sparsely populated provinces and their small numbers of accidents and deaths. Global and local spatial autocorrelation analyses were performed to show whether the provinces with high rates of deaths and accidents cluster or are merely located close to each other by chance. The spatial distribution of provinces with high rates of deaths and accidents was nonrandom and was detected as clustered with significance of P<0.05 by the spatial autocorrelation analyses. Regions with a high concentration of fatal accidents and deaths were located in the provinces containing the roads connecting the Istanbul, Ankara, and Antalya provinces. Accident and death rates were also modeled with independent variables such as the number of motor vehicles, length of roads, and so forth using geographically weighted regression analysis with forward step-wise elimination. The level of statistical significance was taken as P<0.05. Large differences were found between the rates of deaths and accidents according to the denominators used in the provinces. The geographically weighted regression analyses produced significantly better predictions for both accident rates and death rates than did ordinary least squares regressions, as indicated by adjusted R² values. Geographically weighted regression provided adjusted R² values of 0.89-0.99 for death and accident rates, compared with 0.88-0.95, respectively, for ordinary least squares regressions. Geographically weighted regression has the potential to reveal local patterns in the spatial distribution of rates, which would be ignored by the ordinary least squares regression approach. The application of spatial analysis and modeling of accident statistics and death rates at the provincial level in Turkey will help to identify provinces with outstandingly high accident and death rates. This could support more efficient road safety management in Turkey.
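
    A minimal sketch of empirical Bayes rate smoothing of the kind used to stabilise rates in sparsely populated provinces; the global method-of-moments (Marshall-type) estimator shown here is one common choice, and the province populations and death counts are simulated, since the study's exact estimator and data are not given.

        import numpy as np

        rng = np.random.default_rng(3)

        # Hypothetical province-level data: population and annual road deaths (81 provinces)
        pop = rng.integers(50_000, 5_000_000, 81).astype(float)
        deaths = rng.poisson(pop * 12e-5).astype(float)

        raw_rate = deaths / pop
        m = deaths.sum() / pop.sum()                           # global (pooled) rate
        k = len(pop)

        # Global empirical Bayes (method-of-moments) shrinkage toward the global rate
        s2 = np.sum(pop * (raw_rate - m) ** 2) / pop.sum() - m / (pop.sum() / k)
        s2 = max(s2, 0.0)
        weight = s2 / (s2 + m / pop)                           # small provinces get small weights
        smoothed_rate = m + weight * (raw_rate - m)

        # Rates per 100,000: sparsely populated provinces are pulled strongly toward the global rate
        print((raw_rate * 1e5)[:5].round(1), (smoothed_rate * 1e5)[:5].round(1))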

  13. The Digital Shoreline Analysis System (DSAS) Version 4.0 - An ArcGIS extension for calculating shoreline change

    USGS Publications Warehouse

    Thieler, E. Robert; Himmelstoss, Emily A.; Zichichi, Jessica L.; Ergul, Ayhan

    2009-01-01

    The Digital Shoreline Analysis System (DSAS) version 4.0 is a software extension to ESRI ArcGIS v.9.2 and above that enables a user to calculate shoreline rate-of-change statistics from multiple historic shoreline positions. A user-friendly interface of simple buttons and menus guides the user through the major steps of shoreline change analysis. Components of the extension and user guide include (1) instruction on the proper way to define a reference baseline for measurements, (2) automated and manual generation of measurement transects and metadata based on user-specified parameters, and (3) output of calculated rates of shoreline change and other statistical information. DSAS computes shoreline rates of change using four different methods: (1) endpoint rate, (2) simple linear regression, (3) weighted linear regression, and (4) least median of squares. The standard error, correlation coefficient, and confidence interval are also computed for the simple and weighted linear-regression methods. The results of all rate calculations are output to a table that can be linked to the transect file by a common attribute field. DSAS is intended to facilitate the shoreline change-calculation process and to provide rate-of-change information and the statistical data necessary to establish the reliability of the calculated results. The software is also suitable for any generic application that calculates positional change over time, such as assessing rates of change of glacier limits in sequential aerial photos, river edge boundaries, land-cover changes, and so on.
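
    The DSAS extension itself runs inside ArcGIS; the sketch below simply illustrates two of the rate-of-change methods it implements, the endpoint rate and the simple linear regression rate, for hypothetical shoreline positions along one transect.

        import numpy as np

        # Hypothetical shoreline positions (metres from the baseline) along a single transect
        years = np.array([1950.0, 1972.0, 1988.0, 2004.0, 2019.0])
        position = np.array([120.0, 111.5, 104.0, 97.5, 90.0])

        # Endpoint rate: change between the oldest and most recent shoreline over the elapsed time
        epr = (position[-1] - position[0]) / (years[-1] - years[0])

        # Simple linear regression rate: slope of position against time using all shorelines
        lrr, intercept = np.polyfit(years, position, 1)

        print(f"Endpoint rate: {epr:.2f} m/yr")            # negative values indicate erosion
        print(f"Linear regression rate: {lrr:.2f} m/yr")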

  14. Statistics or How to Know Your Onions.

    ERIC Educational Resources Information Center

    Hawkins, Anne S.

    1986-01-01

    Using calculators (and computers) to develop an understanding and appreciation of statistical ideas is advocated. Manual computation as a prerequisite for developing concepts is negated through several examples. (MNS)

  15. Calculation of precise firing statistics in a neural network model

    NASA Astrophysics Data System (ADS)

    Cho, Myoung Won

    2017-08-01

    A precise prediction of neural firing dynamics is requisite to understand the function of, and the learning process in, a biological neural network whose operation depends on exact spike timings. Basically, the prediction of firing statistics is a delicate many-body problem because the firing probability of a neuron at a given time is determined by a summation over all effects from past firing states. A neural network model with the Feynman path integral formulation was recently introduced. In this paper, we present several methods to calculate firing statistics in the model. We apply the methods to some cases and compare the theoretical predictions with simulation results.

  16. Statistical Analyses of Raw Material Data for MTM45-1/CF7442A-36% RW: CMH Cure Cycle

    NASA Technical Reports Server (NTRS)

    Coroneos, Rula; Pai, Shantaram, S.; Murthy, Pappu

    2013-01-01

    This report describes statistical characterization of physical properties of the composite material system MTM45-1/CF7442A, which has been tested and is currently being considered for use on spacecraft structures. This composite system is made of 6K plain weave graphite fibers in a highly toughened resin system. This report summarizes the distribution types and statistical details of the tests and the conditions for the experimental data generated. These distributions will be used in multivariate regression analyses to help determine material and design allowables for similar material systems and to establish a procedure for other material systems. Additionally, these distributions will be used in future probabilistic analyses of spacecraft structures. The specific properties that are characterized are the ultimate strength, modulus, and Poisson's ratio by using a commercially available statistical package. Results are displayed using graphical and semigraphical methods and are included in the accompanying appendixes.

  17. Statistics Using Just One Formula

    ERIC Educational Resources Information Center

    Rosenthal, Jeffrey S.

    2018-01-01

    This article advocates that introductory statistics be taught by basing all calculations on a single simple margin-of-error formula and deriving all of the standard introductory statistical concepts (confidence intervals, significance tests, comparisons of means and proportions, etc) from that one formula. It is argued that this approach will…

  18. Level set method with automatic selective local statistics for brain tumor segmentation in MR images.

    PubMed

    Thapaliya, Kiran; Pyun, Jae-Young; Park, Chun-Su; Kwon, Goo-Rak

    2013-01-01

    The level set approach is a powerful tool for segmenting images. This paper proposes a method for segmenting brain tumor images from MR images. A new signed pressure function (SPF) that can efficiently stop the contours at weak or blurred edges is introduced. The local statistics of the different objects present in the MR images were calculated. Using local statistics, the tumor objects were identified among different objects. In this level set method, the calculation of the parameters is a challenging task. The calculations of different parameters for different types of images were automatic. The basic thresholding value was updated and adjusted automatically for different MR images. This thresholding value was used to calculate the different parameters in the proposed algorithm. The proposed algorithm was tested on the magnetic resonance images of the brain for tumor segmentation and its performance was evaluated visually and quantitatively. Numerical experiments on some brain tumor images highlighted the efficiency and robustness of this method. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.

  19. LFSTAT - Low-Flow Analysis in R

    NASA Astrophysics Data System (ADS)

    Koffler, Daniel; Laaha, Gregor

    2013-04-01

    The calculation of characteristic stream flow during dry conditions is a basic requirement for many problems in hydrology, ecohydrology and water resources management. As opposed to floods, a number of different indices are used to characterise low flows and streamflow droughts. Although these indices and methods of calculation have been well documented in the WMO Manual on Low-flow Estimation and Prediction [1], comprehensive software enabling fast and standardized calculation of low-flow statistics was missing. We present the new software package lfstat to fill this obvious gap. Our software package is based on the statistical open-source software R, and extends it to analyse daily streamflow records with a focus on low flows. As command-line based programs are not everyone's preference, we also offer a plug-in for the R-Commander, an easy-to-use graphical user interface (GUI) for R which is based on tcl/tk. The functionality of lfstat includes estimation methods for low-flow indices, extreme value statistics, deficit characteristics, and additional graphical methods to control the computation of complex indices and to illustrate the data. Besides the basic low-flow indices, the baseflow index and recession constants can be computed. For extreme value statistics, state-of-the-art methods for L-moment based local and regional frequency analysis (RFA) are available. The tools for deficit characteristics include various pooling and threshold selection methods to support the calculation of drought duration and deficit indices. The most common graphics for low-flow analysis are available, and the plots can be modified according to the user's preferences. Graphics include hydrographs for different periods, flexible streamflow deficit plots, baseflow visualisation, recession diagnostics, flow duration curves as well as double mass curves, and many more. From a technical point of view, the package uses an S3 class called lfobj (low-flow object). These objects are ordinary R data frames including date, flow, hydrological year and, optionally, baseflow information. Once these objects are created, analyses can be performed by mouse click and a script can be saved to make the analysis easily reproducible. At the moment we offer implementations of all major methods proposed in the WMO Manual on Low-flow Estimation and Prediction [1]. Future plans include a dynamic low-flow report in odt file format using odf-weave, which allows automatic updates if the data or analysis change. We hope to offer a tool that eases and structures the analysis of streamflow data focusing on low flows and makes the analysis transparent and communicable. The package can also be used in teaching students the first steps in low-flow hydrology. The software package can be installed from CRAN (latest stable version) and R-Forge: http://r-forge.r-project.org (development version). References: [1] Gustard, Alan; Demuth, Siegfried, (eds.) Manual on Low-flow Estimation and Prediction. Geneva, Switzerland, World Meteorological Organization, (Operational Hydrology Report No. 50, WMO-No. 1029).
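
    lfstat is an R package and is not reproduced here; the sketch below computes two standard low-flow indices of the kind it covers, the Q95 flow-duration percentile and the mean annual 7-day minimum (MAM7), from a synthetic daily streamflow series using pandas.

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(42)

        # Synthetic daily streamflow record (m^3/s) spanning ten years
        dates = pd.date_range("2000-01-01", "2009-12-31", freq="D")
        flow = pd.Series(np.exp(rng.normal(1.0, 0.6, len(dates))), index=dates)

        # Q95: the flow exceeded 95% of the time (5th percentile of the flow-duration curve)
        q95 = flow.quantile(0.05)

        # MAM7: mean over years of the annual minimum of the 7-day moving average
        rolling7 = flow.rolling(7).mean()
        mam7 = rolling7.groupby(rolling7.index.year).min().mean()

        print(f"Q95 = {q95:.2f} m^3/s, MAM7 = {mam7:.2f} m^3/s")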

  20. Integrated analyses in plastics forming

    NASA Astrophysics Data System (ADS)

    Bo, Wang

    This thesis explains the progress made in the analysis, simulation and testing of plastics forming. This progress can be applied to injection and compression mould design. Three activities of plastics forming have been investigated, namely filling analysis, cooling analysis and ejection analysis. The filling section of plastics forming has been analysed and calculated using MOLDFLOW and FILLCALC V software. A comparison of high-speed compression moulding and injection moulding has been made. The cooling section of plastics forming has been analysed using MOLDFLOW software and a finite difference computer program. The latter program can be used as a sample program to calculate the feasibility of cooling different materials to required target temperatures under controlled cooling conditions. The application of thermal imaging has also been introduced to determine the actual process temperatures. Thermal imaging can be used as a powerful tool to analyse mould surface temperatures and to verify the mathematical model. A buckling problem for the ejection section has been modelled and calculated with PATRAN/ABAQUS finite element analysis software and tested. These calculations and analyses were applied to a specific case but can be used as an example for general analysis and calculation in the ejection section of plastics forming.

  1. Monte-Carlo Method Application for Precising Meteor Velocity from TV Observations

    NASA Astrophysics Data System (ADS)

    Kozak, P.

    2014-12-01

    The Monte-Carlo method (method of statistical trials) as applied to the processing of meteor observations was developed in the author's Ph.D. thesis in 2005 and first used in his work in 2008. The idea of the method is that if we generate random values of the input data - the equatorial coordinates of the meteor head in a sequence of TV frames - in accordance with their statistical distributions, we can plot probability density distributions for all of the meteor's kinematic parameters and obtain their mean values and dispersions. This also opens the theoretical possibility of refining the most important parameter - the geocentric velocity of the meteor - which has the greatest influence on the precision of the calculated heliocentric orbital elements. In the classical approach the velocity vector is calculated in two stages: first, the vector direction is obtained as the cross product of the pole vectors of the meteor-trajectory great circles determined from the two observation points. Then the absolute value of the velocity is calculated independently from each observation point, and one of them is selected, for some reason, as the final value. In the present method we propose to obtain the statistical distribution of the velocity's absolute value as the intersection of the two distributions corresponding to the velocity values obtained from the different points. We expect that such an approach will substantially increase the precision of meteor velocity calculation and remove subjective inaccuracies.
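
    A minimal sketch of the statistical-trials idea: perturb the inputs according to their error distributions, recompute the velocity for each trial from each station, and combine the two resulting distributions by multiplying their estimated densities. The per-station velocity models below are trivial stand-ins for the real astrometric reduction, and the noise levels are invented.

        import numpy as np

        rng = np.random.default_rng(0)
        n_trials = 50_000

        # Stand-ins for the full astrometric reduction at each station: true velocity 35 km/s
        v_a = 35.0 + rng.normal(0.0, 1.2, n_trials)    # station A, larger measurement noise
        v_b = 35.0 + rng.normal(0.0, 0.8, n_trials)    # station B, smaller measurement noise

        # Combine the two Monte-Carlo distributions by multiplying their estimated densities,
        # i.e. taking their "intersection" rather than choosing one station's value
        bins = np.linspace(30.0, 40.0, 201)
        centers = 0.5 * (bins[:-1] + bins[1:])
        pdf_a, _ = np.histogram(v_a, bins=bins, density=True)
        pdf_b, _ = np.histogram(v_b, bins=bins, density=True)
        joint = pdf_a * pdf_b
        joint /= joint.sum()

        v_mean = np.sum(centers * joint)
        v_std = np.sqrt(np.sum((centers - v_mean) ** 2 * joint))
        print(f"Combined geocentric velocity: {v_mean:.2f} +/- {v_std:.2f} km/s")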

  2. Post Hoc Analyses of ApoE Genotype-Defined Subgroups in Clinical Trials.

    PubMed

    Kennedy, Richard E; Cutter, Gary R; Wang, Guoqiao; Schneider, Lon S

    2016-01-01

    Many post hoc analyses of clinical trials in Alzheimer's disease (AD) and mild cognitive impairment (MCI) are based on small Phase 2 trials. Subject heterogeneity may lead to statistically significant post hoc results that cannot be replicated in larger follow-up studies. We investigated the extent of this problem using simulation studies mimicking current trial methods with post hoc analyses based on ApoE4 carrier status. We used a meta-database of 24 studies, including 3,574 subjects with mild AD and 1,171 subjects with MCI/prodromal AD, to simulate clinical trial scenarios. Post hoc analyses examined whether rates of progression on the Alzheimer's Disease Assessment Scale-cognitive (ADAS-cog) differed between ApoE4 carriers and non-carriers. Across studies, ApoE4 carriers were younger and had lower baseline scores, greater rates of progression, and greater variability on the ADAS-cog. Up to 18% of post hoc analyses for 18-month trials in AD showed greater rates of progression for ApoE4 non-carriers that were statistically significant but unlikely to be confirmed in follow-up studies. The frequency of erroneous conclusions dropped below 3% with trials of 100 subjects per arm. In MCI, rates of statistically significant differences with greater progression in ApoE4 non-carriers remained below 3% unless sample sizes were below 25 subjects per arm. Statistically significant differences for ApoE4 in post hoc analyses often reflect heterogeneity among small samples rather than a true differential effect among ApoE4 subtypes. Such analyses must be viewed cautiously. ApoE genotype should be incorporated into the design stage to minimize erroneous conclusions.

  3. Methodological Standards for Meta-Analyses and Qualitative Systematic Reviews of Cardiac Prevention and Treatment Studies: A Scientific Statement From the American Heart Association.

    PubMed

    Rao, Goutham; Lopez-Jimenez, Francisco; Boyd, Jack; D'Amico, Frank; Durant, Nefertiti H; Hlatky, Mark A; Howard, George; Kirley, Katherine; Masi, Christopher; Powell-Wiley, Tiffany M; Solomonides, Anthony E; West, Colin P; Wessel, Jennifer

    2017-09-05

    Meta-analyses are becoming increasingly popular, especially in the fields of cardiovascular disease prevention and treatment. They are often considered to be a reliable source of evidence for making healthcare decisions. Unfortunately, problems among meta-analyses such as the misapplication and misinterpretation of statistical methods and tests are long-standing and widespread. The purposes of this statement are to review key steps in the development of a meta-analysis and to provide recommendations that will be useful for carrying out meta-analyses and for readers and journal editors, who must interpret the findings and gauge methodological quality. To make the statement practical and accessible, detailed descriptions of statistical methods have been omitted. Based on a survey of cardiovascular meta-analyses, published literature on methodology, expert consultation, and consensus among the writing group, key recommendations are provided. Recommendations reinforce several current practices, including protocol registration; comprehensive search strategies; methods for data extraction and abstraction; methods for identifying, measuring, and dealing with heterogeneity; and statistical methods for pooling results. Other practices should be discontinued, including the use of levels of evidence and evidence hierarchies to gauge the value and impact of different study designs (including meta-analyses) and the use of structured tools to assess the quality of studies to be included in a meta-analysis. We also recommend choosing a pooling model for conventional meta-analyses (fixed effect or random effects) on the basis of clinical and methodological similarities among studies to be included, rather than the results of a test for statistical heterogeneity. © 2017 American Heart Association, Inc.

  4. Playing-related disabling musculoskeletal disorders in young and adult classical piano students.

    PubMed

    Bruno, S; Lorusso, A; L'Abbate, N

    2008-07-01

    To determine the prevalence of instrument-related musculoskeletal problems in classical piano students and investigate piano-specific risk factors. A specially developed four-part questionnaire was administered to classical piano students of two Apulian conservatories in southern Italy. A cross-sectional design was used. Prevalences of playing-related musculoskeletal disorders (MSDs) were calculated and cases were compared with non-cases. A total of 195 out of the 224 piano students responded (87%). Among the 195 responders, 75 (38.4%) were considered affected according to the pre-established criteria. Disabling MSDs showed similar prevalence rates for the neck (29.3%), thoracic spine (21.3%) and upper limbs (from 20.0 to 30.4%) in the affected group. Univariate analyses showed statistical differences concerning mean age, number of hours per week spent playing, more than 60 min of continuous playing without breaks, lack of sport practice and acceptance of the "No pain, no gain" criterion in students with music-related pain compared with pianists not affected. A statistical correlation was found only between upper-limb disorders in pianists and hand size. No correlation with the model of piano played was found in the affected group. The multivariate analyses performed by logistic regression confirmed the independent correlation of the risk factors age, lack of sport practice and acceptance of the "No pain, no gain" criterion. Our study showed MSDs to be a common problem among classical piano students. At variance with several reported studies, older students appeared to be more frequently affected by disabling MSDs, and no difference in the prevalence rate of the disorders was found for females.

  5. Relationship among environmental quality variables, housing variables, and residential needs: a secondary analysis of the relationship among indoor, outdoor, and personal air (RIOPA) concentrations database

    NASA Astrophysics Data System (ADS)

    Garcia, Fausto; Shendell, Derek G.; Madrigano, Jaime

    2017-03-01

    Retrospective descriptive secondary analyses of data from relationships of indoor, outdoor, and personal air (RIOPA) study homes (in Houston, Texas; Los Angeles County, California; and Elizabeth, New Jersey; May 1999-February 2001) were conducted. Data included air exchange rates, associations between indoor and outdoor temperature and humidity, and calculated apparent temperature and humidex. Analyses examined whether study homes provided optimum thermal comfort for residents during both heating and cooling seasons when compared to current American Society of Heating, Refrigerating and Air Conditioning Engineers (ASHRAE) Standards 62/62.1 and 55. Results suggested that outdoor temperature, humidex, and apparent temperature during the cooling season potentially served as indicators of indoor personal exposure to parameters of thermal comfort. Outdoor temperature, humidex, and apparent temperature during the cooling season were statistically significant predictors of indoor temperature. During the heating season, only humidex in Texas and in the combined data across study states was statistically significant, with weak to moderate predictive ability. The high degree of correlation between outdoor and indoor environmental variables provided support for the validity of epidemiologic studies of weather relying on temporal comparisons. Results indicated most RIOPA study residents experienced thermal comfort; however, many values indicated that some residents may have experienced discomfort depending on clothing and indoor activities. With climate change, increases in temperature are expected, with more days of extreme heat and humidity and potentially harsher, longer winters. Homes being built or modernized should be designed with the appropriate guidelines to provide comfort for residents daily and in extreme weather events.

  6. Prevalence, Trend and Determining Factors of Gestational Diabetes in Germany.

    PubMed

    Huy, C; Loerbroks, A; Hornemann, A; Röhrig, S; Schneider, S

    2012-04-01

    Purpose: The true prevalence of gestational diabetes in Germany is unknown. Thus, the study's purposes were to estimate the prevalence of gestational diabetes as well as to describe the temporal prevalence trend and to identify determinants. Material and Methods: We calculated prevalence estimates based on two datasets: the register-based German perinatal statistic (n = 650 232) and the maternal self-reports from the German children and youth health survey (KiGGS; n = 15 429). Differences between prevalence estimates were analysed using χ² and trend tests, and determinants were identified using logistic regression. Results: According to the perinatal statistic, gestational diabetes was present in 3.7 % of pregnant women in Germany in 2010. The prevalence across the years 2001 to 2006 was estimated at 1.9 % which differed significantly from the prevalence estimate derived from the KiGGS dataset for the same period of time (5.3 %; 95 % confidence interval: 4.6-6.1 %). Both datasets show an increasing trend of gestational diabetes (p < 0.001). The risk for gestational diabetes was mainly associated with age, BMI and social class of pregnant women as well as with multiple pregnancies. Conclusion: The lack of significant screening studies among representative samples hampers a sound estimation of the true prevalence of gestational diabetes in Germany. The increasing trend in gestational diabetes might continue due to the projected increase of important risk factors (e.g., maternal age, obesity). Our analyses support the current consensus recommendations regarding standardised gestational diabetes screening.

  7. Temporal trends in the acidity of precipitation and surface waters of New York

    USGS Publications Warehouse

    Peters, Norman E.; Schroeder, Roy A.; Troutman, David E.

    1982-01-01

    Statistical analyses of precipitation data from a nine-station monitoring network indicate little change in pH from 1965-78 within New York State as a whole but suggest that pH of bulk precipitation has decreased in the western part of the State by approximately 0.2 pH units since 1965 and increased in the eastern part by a similar amount. This trend is equivalent to an annual change in hydrogen-ion concentration of 0.2 microequivalents per liter. An average annual increase in precipitation quantity of 2 to 3 percent since 1965 has resulted in an increased acid load in the western and central parts of the State. During 1965-78, sulfate concentration in precipitation decreased an average of 1-4 percent annually. In general, no trend in nitrate was detected. Calculated trends in hydrogen-ion concentration do not correlate with measured trends of sulfate and nitrate, which suggests variable neutralization of hydrogen ion, possibly by particles from dry deposition. Neutralization has produced an increase of about 0.3 pH units in nonurban areas and 0.7 pH units in urban areas. Statistical analyses of chemical data from several streams throughout New York suggest that sulfate concentrations decreased an average of 1 to 4 percent per year. This decrease is comparable to the sulfate decrease in precipitation during the same period. In most areas of the State, chemical contributions from urbanization and farming, as well as the neutralizing effect of carbonate soils, conceal whatever effects acid precipitation may have on pH of streams.

  8. CORSEN, a new software dedicated to microscope-based 3D distance measurements: mRNA-mitochondria distance, from single-cell to population analyses.

    PubMed

    Jourdren, Laurent; Delaveau, Thierry; Marquenet, Emelie; Jacq, Claude; Garcia, Mathilde

    2010-07-01

    Recent improvements in microscopy technology allow detection of single molecules of RNA, but tools for large-scale automatic analyses of particle distributions are lacking. An increasing number of imaging studies emphasize the importance of mRNA localization in the definition of cell territory or the biogenesis of cell compartments. CORSEN is a new tool dedicated to three-dimensional (3D) distance measurements from imaging experiments especially developed to access the minimal distance between RNA molecules and cellular compartment markers. CORSEN includes a 3D segmentation algorithm allowing the extraction and the characterization of the cellular objects to be processed--surface determination, aggregate decomposition--for minimal distance calculations. CORSEN's main contribution lies in exploratory statistical analysis, cell population characterization, and high-throughput assays that are made possible by the implementation of a batch process analysis. We highlighted CORSEN's utility for the study of relative positions of mRNA molecules and mitochondria: CORSEN clearly discriminates mRNA localized to the vicinity of mitochondria from those that are translated on free cytoplasmic polysomes. Moreover, it quantifies the cell-to-cell variations of mRNA localization and emphasizes the necessity for statistical approaches. This method can be extended to assess the evolution of the distance between specific mRNAs and other cellular structures in different cellular contexts. CORSEN was designed for the biologist community with the concern to provide an easy-to-use and highly flexible tool that can be applied for diverse distance quantification issues.

  9. Effect of an EBM course in combination with case method learning sessions: an RCT on professional performance, job satisfaction, and self-efficacy of occupational physicians

    PubMed Central

    Schaafsma, Frederieke G.; Nieuwenhuijsen, Karen; van Dijk, Frank J. H.

    2008-01-01

    Objective An intervention consisting of an evidence-based medicine (EBM) course in combination with case method learning sessions (CMLSs) was designed to enhance the professional performance, self-efficacy and job satisfaction of occupational physicians. Methods A cluster randomized controlled trial was set up and data were collected through questionnaires at baseline (T0), directly after the intervention (T1) and 7 months after baseline (T2). The data of the intervention group [T0 (n = 49), T1 (n = 31), T2 (n = 29)] and control group [T0 (n = 49), T1 (n = 28), T2 (n = 28)] were analysed in mixed model analyses. Mean scores of the perceived value of the CMLS were calculated in the intervention group. Results The overall effect of the intervention over time comparing the intervention with the control group was statistically significant for professional performance (p < 0.001). Job satisfaction and self-efficacy changes were small and not statistically significant between the groups. The perceived value of the CMLS to gain new insights and to improve the quality of their performance increased with the number of sessions followed. Conclusion An EBM course in combination with case method learning sessions is perceived as valuable and offers evidence to enhance the professional performance of occupational physicians. However, it does not seem to influence their self-efficacy and job satisfaction. PMID:18386046
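
    A minimal sketch of the kind of mixed-model analysis the abstract describes, with hypothetical column and file names (the record does not give the exact model specification):

        # Repeated measures (T0, T1, T2) on physicians nested in randomized clusters:
        # fixed effects for arm, time, and their interaction; random intercept per cluster.
        import pandas as pd
        import statsmodels.formula.api as smf

        df = pd.read_csv("ebm_trial_long.csv")   # hypothetical long-format data file
        model = smf.mixedlm("performance ~ C(arm) * C(time)", data=df, groups=df["cluster"])
        result = model.fit()
        print(result.summary())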

  10. Cost-effectiveness of two vocational rehabilitation programs for persons with severe mental illness.

    PubMed

    Dixon, Lisa; Hoch, Jeffrey S; Clark, Robin; Bebout, Richard; Drake, Robert; McHugo, Greg; Becker, Deborah

    2002-09-01

    This study sought to determine differences in the cost-effectiveness of two vocational programs: individual placement and support (IPS), in which employment specialists within a mental health center help patients obtain competitive jobs and provide them with ongoing support, and enhanced vocational rehabilitation (EVR), in which stepwise services that involve prevocational experiences are delivered by rehabilitation agencies. A total of 150 unemployed inner-city patients with severe mental disorders who expressed an interest in competitive employment were randomly assigned to IPS or EVR programs and were followed for 18 months. Wages from all forms of employment and the number of weeks and hours of competitive employment were tracked monthly. Estimates were made of direct mental health costs and vocational costs. Incremental cost-effectiveness ratios (ICERs) were calculated for competitive employment outcomes and total wages. No statistically significant differences were found in the overall costs of IPS and EVR. Participation in the IPS program was associated with significantly more hours and weeks of competitive employment. However, the average combined earnings (earnings from competitive and noncompetitive employment) were virtually the same in both programs. The ICER estimates indicated that participants in the IPS program worked in competitive employment settings for an additional week over the 18-month period at a cost of $283 ($13 an hour). The analyses suggest that IPS participants engaged in competitive employment at a higher cost. When combined earnings were used as the outcome, data from the statistical analyses were insufficient to enable any firm conclusions to be drawn. The findings illustrate the importance of the choice of outcomes in evaluations of employment programs.
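
    The incremental cost-effectiveness ratio underlying these estimates is simply the cost difference divided by the effectiveness difference; a minimal sketch with hypothetical inputs (not the study's actual cost data):

        # ICER: extra cost per extra unit of effect for program A relative to program B.
        def icer(cost_a, cost_b, effect_a, effect_b):
            return (cost_a - cost_b) / (effect_a - effect_b)

        # If IPS cost $283 more per participant and yielded one additional week of
        # competitive employment, the ICER is $283 per extra week.
        print(icer(cost_a=10283.0, cost_b=10000.0, effect_a=5.0, effect_b=4.0))  # 283.0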

  11. Relationship among environmental quality variables, housing variables, and residential needs: a secondary analysis of the relationship among indoor, outdoor, and personal air (RIOPA) concentrations database.

    PubMed

    Garcia, Fausto; Shendell, Derek G; Madrigano, Jaime

    2017-03-01

    Retrospective descriptive secondary analyses of data from Relationships of Indoor, Outdoor, and Personal Air (RIOPA) study homes (in Houston, Texas; Los Angeles County, California; and Elizabeth, New Jersey; May 1999-February 2001) were conducted. Data included air exchange rates, associations between indoor and outdoor temperature and humidity, and calculated apparent temperature and humidex. Analyses examined whether study homes provided optimum thermal comfort for residents during both heating and cooling seasons when compared to current American Society of Heating, Refrigerating and Air Conditioning Engineers (ASHRAE) Standards 62/62.1 and 55. Results suggested outdoor temperature, humidex, and apparent temperature during the cooling season potentially served as indicators of indoor personal exposure to parameters of thermal comfort. Outdoor temperature, humidex, and apparent temperature during the cooling season were statistically significant predictors of indoor temperature. During the heating season, only humidex in Texas and combined data across study states were statistically significant, but with weak to moderate predictive ability. The high degree of correlation between outdoor and indoor environmental variables provided support for the validity of epidemiologic studies of weather relying on temporal comparisons. Results indicated most RIOPA study residents experienced thermal comfort; however, many values indicated that several residents may have experienced some discomfort depending on clothing and indoor activities. With climate change, increases in temperature are expected, with more days of extreme heat and humidity and potentially harsher, longer winters. Homes being built or modernized should follow appropriate guidelines to provide comfort for residents both daily and during extreme weather events.
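
    A minimal sketch of the humidex calculation used as one of the combined temperature/humidity indices (following the Environment Canada convention; the inputs here are hypothetical, not RIOPA measurements):

        import math

        def humidex(temp_c, dewpoint_c):
            """Humidex from air temperature and dew point, both in degrees Celsius."""
            dewpoint_k = dewpoint_c + 273.16
            e = 6.11 * math.exp(5417.7530 * (1.0 / 273.16 - 1.0 / dewpoint_k))  # vapour pressure, hPa
            return temp_c + 0.5555 * (e - 10.0)

        print(round(humidex(30.0, 22.0), 1))  # about 39 on a humid summer day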

  12. Reporting characteristics of meta-analyses in orthodontics: methodological assessment and statistical recommendations.

    PubMed

    Papageorgiou, Spyridon N; Papadopoulos, Moschos A; Athanasiou, Athanasios E

    2014-02-01

    Ideally, meta-analyses (MAs) should consolidate the characteristics of orthodontic research in order to produce an evidence-based answer. However, severe flaws are frequently observed in most of them. The aim of this study was to evaluate the statistical methods, the methodology, and the quality characteristics of orthodontic MAs and to assess their reporting quality in recent years. Electronic databases were searched for MAs (with or without a proper systematic review) in the field of orthodontics, indexed up to 2011. The AMSTAR tool was used for quality assessment of the included articles. Data were analyzed with Student's t-test, one-way ANOVA, and generalized linear modelling. Risk ratios with 95% confidence intervals were calculated to represent changes during the years in reporting of key items associated with quality. A total of 80 MAs with 1086 primary studies were included in this evaluation. Using the AMSTAR tool, 25 (31.3%) of the MAs were found to be of low quality, 37 (46.3%) of medium quality, and 18 (22.5%) of high quality. Specific characteristics like explicit protocol definition, extensive searches, and quality assessment of included trials were associated with a higher AMSTAR score. Model selection and dealing with heterogeneity or publication bias were often problematic in the identified reviews. The number of published orthodontic MAs is constantly increasing, while their overall quality is considered to range from low to medium. Although the number of medium- and high-quality MAs has lately seemed to rise, several other aspects need improvement to increase their overall quality.
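
    The risk ratios with 95% confidence intervals mentioned above follow the usual log-normal approximation; a minimal sketch with hypothetical 2x2 counts (not data from the review):

        import math

        def risk_ratio_ci(events1, n1, events2, n2, z=1.96):
            """Risk ratio and 95% CI for events1/n1 versus events2/n2."""
            rr = (events1 / n1) / (events2 / n2)
            se_log = math.sqrt(1/events1 - 1/n1 + 1/events2 - 1/n2)
            lo = math.exp(math.log(rr) - z * se_log)
            hi = math.exp(math.log(rr) + z * se_log)
            return rr, lo, hi

        # e.g., an item reported in 30/40 recent MAs versus 18/40 earlier ones
        print(risk_ratio_ci(30, 40, 18, 40))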

  13. Health Disparities Calculator (HD*Calc) - SEER Software

    Cancer.gov

    Statistical software that generates summary measures to evaluate and monitor health disparities. Users can import SEER data or other population-based health data to calculate 11 disparity measurements.

  14. 40 CFR Appendix IV to Part 264 - Cochran's Approximation to the Behrens-Fisher Students' t-test

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... summary measures to calculate a t-statistic (t*) and a comparison t-statistic (tc). The t* value is compared to the tc value and a conclusion reached as to whether there has been a statistically significant... made in collecting the background data. The t-statistic (tc), against which t* will be compared...
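
    A minimal sketch of Cochran's approximation as commonly described: a t-statistic t* computed from summary measures and compared against a weighted critical value tc (the numbers, the significance level, and the one-sided form are illustrative assumptions, not taken from the regulation text):

        import math
        from scipy import stats

        def cochran_t_test(mean_bg, var_bg, n_bg, mean_mon, var_mon, n_mon, alpha=0.05):
            """Compare a monitoring-well mean against a background mean (one-sided)."""
            w_bg, w_mon = var_bg / n_bg, var_mon / n_mon
            t_star = (mean_mon - mean_bg) / math.sqrt(w_bg + w_mon)
            t_bg = stats.t.ppf(1 - alpha, n_bg - 1)
            t_mon = stats.t.ppf(1 - alpha, n_mon - 1)
            t_c = (w_bg * t_bg + w_mon * t_mon) / (w_bg + w_mon)
            return t_star, t_c, t_star > t_c   # True suggests a statistically significant increase

        print(cochran_t_test(10.0, 4.0, 16, 13.0, 9.0, 4))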

  15. 76 FR 22122 - Section 8 Housing Choice Voucher Program-Demonstration Project of Small Area Fair Market Rents in...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-20

    ... rent and the core-based statistical area (CBSA) rent as applied to the 40th percentile FMR for that..., calculated on the basis of the core-based statistical area (CBSA) or the metropolitan Statistical Area (MSA... will be ranked according to each of the statistics specified above, and then a weighted average ranking...

  16. 40 CFR Appendix IV to Part 264 - Cochran's Approximation to the Behrens-Fisher Students' t-test

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... summary measures to calculate a t-statistic (t*) and a comparison t-statistic (tc). The t* value is compared to the tc value and a conclusion reached as to whether there has been a statistically significant... made in collecting the background data. The t-statistic (tc), against which t* will be compared...

  17. Altered white matter development in children born very preterm.

    PubMed

    Young, Julia M; Vandewouw, Marlee M; Morgan, Benjamin R; Smith, Mary Lou; Sled, John G; Taylor, Margot J

    2018-06-01

    Children born very preterm (VPT) at less than 32 weeks' gestational age (GA) are prone to disrupted white matter maturation and impaired cognitive development. The aims of the present study were to identify differences in white matter microstructure and connectivity of children born VPT compared to term-born children, as well as relations between white matter measures with cognitive outcomes and early brain injury. Diffusion images and T1-weighted anatomical MR images were acquired along with developmental assessments in 31 VPT children (mean GA: 28.76 weeks) and 28 term-born children at 4 years of age. FSL's tract-based spatial statistics was used to create a cohort-specific template and mean fractional anisotropy (FA) skeleton that was applied to each child's DTI data. Whole brain deterministic tractography was performed and graph theoretical measures of connectivity were calculated based on the number of streamlines between cortical and subcortical nodes derived from the Desikan-Killiany atlas. Between-group analyses included FSL Randomise for voxel-wise statistics and permutation testing for connectivity analyses. Within-group analyses between FA values and graph measures with IQ, language and visual-motor scores as well as history of white matter injury (WMI) and germinal matrix/intraventricular haemorrhage (GMH/IVH) were performed. In the children born VPT, FA values within major white matter tracts were reduced compared to term-born children. Reduced measures of local strength, clustering coefficient, local and global efficiency were present in the children born VPT within nodes in the lateral frontal, middle and superior temporal, cingulate, precuneus and lateral occipital regions. Within-group analyses revealed associations in term-born children between FA, Verbal IQ, Performance IQ and Full scale IQ within regions of the superior longitudinal fasciculus, inferior fronto-occipital fasciculus, forceps minor and forceps major. No associations with outcome were found in the VPT group. Global efficiency was reduced in the children born VPT with a history of WMI and GMH/IVH. These findings are evidence for under-developed and less connected white matter in children born VPT, contributing to our understanding of white matter development within this population.
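
    A minimal sketch of the kind of graph-theoretical measures reported, computed from a symmetric streamline-count matrix (a randomly generated 90-node matrix here, not the study's Desikan-Killiany data):

        import numpy as np
        import networkx as nx

        counts = np.random.default_rng(0).integers(0, 50, size=(90, 90))
        counts = np.triu(counts, 1) + np.triu(counts, 1).T    # symmetric, zero diagonal

        G = nx.from_numpy_array(counts)                        # edge weights = streamline counts
        strength = dict(G.degree(weight="weight"))             # nodal strength
        clustering = nx.clustering(G, weight="weight")         # weighted clustering coefficient
        global_eff = nx.global_efficiency(G)                   # binary (unweighted) global efficiency
        print(global_eff, sum(clustering.values()) / len(clustering))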

  18. Calculation of the Hadronic Vacuum Polarization Disconnected Contribution to the Muon Anomalous Magnetic Moment

    NASA Astrophysics Data System (ADS)

    Blum, T.; Boyle, P. A.; Izubuchi, T.; Jin, L.; Jüttner, A.; Lehner, C.; Maltman, K.; Marinkovic, M.; Portelli, A.; Spraggs, M.; Rbc; Ukqcd Collaborations

    2016-06-01

    We report the first lattice QCD calculation of the hadronic vacuum polarization (HVP) disconnected contribution to the muon anomalous magnetic moment at physical pion mass. The calculation uses a refined noise-reduction technique that enables the control of statistical uncertainties at the desired level with modest computational effort. Measurements were performed on the 48^{3}×96 physical-pion-mass lattice generated by the RBC and UKQCD Collaborations. We find the leading-order hadronic vacuum polarization a_{μ}^{HVP(LO)disc}=-9.6(3.3)(2.3)×10^{-10}, where the first error is statistical and the second systematic.

  19. Calculation of the Hadronic Vacuum Polarization Disconnected Contribution to the Muon Anomalous Magnetic Moment.

    PubMed

    Blum, T; Boyle, P A; Izubuchi, T; Jin, L; Jüttner, A; Lehner, C; Maltman, K; Marinkovic, M; Portelli, A; Spraggs, M

    2016-06-10

    We report the first lattice QCD calculation of the hadronic vacuum polarization (HVP) disconnected contribution to the muon anomalous magnetic moment at physical pion mass. The calculation uses a refined noise-reduction technique that enables the control of statistical uncertainties at the desired level with modest computational effort. Measurements were performed on the 48^{3}×96 physical-pion-mass lattice generated by the RBC and UKQCD Collaborations. We find the leading-order hadronic vacuum polarization a_{μ}^{HVP(LO)disc}=-9.6(3.3)(2.3)×10^{-10}, where the first error is statistical and the second systematic.
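
    If the statistical and systematic uncertainties were combined in quadrature (a common convention, not something the record states), the total uncertainty would be

    \[
    \sigma_{\mathrm{tot}} = \sqrt{3.3^{2} + 2.3^{2}} \times 10^{-10} \approx 4.0 \times 10^{-10}.
    \]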

  20. Enhancements to PCRSM.

    DTIC Science & Technology

    1991-03-01

    The record excerpt lists routines to calculate the A parameters; yhatf, to calculate the y-hat statistics; ssrf, to calculate the uncorrected SSR; sstof, to calculate the uncorrected SSTO; and matmulmm. The accompanying C fragment declares degrees-of-freedom counters (sstocdf, ssrcdf, ssecdf) and sum-of-squares variables (ssr, ssto, sse; ssrc, sstoc, ssec; insr, insto, inse), then calls yhatf(x, beta, stats, n, n) to compute the y-hat statistics, ssrf(beta, x, y, mn, n, ss) with ssr = ss[1][1] to obtain the uncorrected SSR, and proceeds to calculate the uncorrected SSTO.
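
    A minimal sketch of the uncorrected sums of squares the excerpt refers to, in modern form with synthetic data (the original PCRSM source is in C):

        import numpy as np

        rng = np.random.default_rng(1)
        X = np.column_stack([np.ones(20), rng.normal(size=(20, 2))])   # intercept + 2 predictors
        y = X @ np.array([2.0, 1.0, -0.5]) + rng.normal(scale=0.3, size=20)

        beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # least-squares estimates (the "A parameters")
        y_hat = X @ beta                               # the "y-hat statistics"
        ssr = float(y_hat @ y_hat)                     # uncorrected SSR = b'X'y
        ssto = float(y @ y)                            # uncorrected SSTO = y'y
        sse = ssto - ssr                               # SSE by subtraction
        print(ssr, ssto, sse)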
