Sample records for levels statistical analysis

  1. Descriptive data analysis.

    PubMed

    Thompson, Cheryl Bagley

    2009-01-01

This 13th article of the Basics of Research series is the first in a short series on statistical analysis. These articles will discuss creating your statistical analysis plan, levels of measurement, descriptive statistics, probability theory, inferential statistics, and general considerations for interpretation of the results of a statistical analysis.

  2. Meta-analysis of gene-level associations for rare variants based on single-variant statistics.

    PubMed

    Hu, Yi-Juan; Berndt, Sonja I; Gustafsson, Stefan; Ganna, Andrea; Hirschhorn, Joel; North, Kari E; Ingelsson, Erik; Lin, Dan-Yu

    2013-08-08

    Meta-analysis of genome-wide association studies (GWASs) has led to the discoveries of many common variants associated with complex human diseases. There is a growing recognition that identifying "causal" rare variants also requires large-scale meta-analysis. The fact that association tests with rare variants are performed at the gene level rather than at the variant level poses unprecedented challenges in the meta-analysis. First, different studies may adopt different gene-level tests, so the results are not compatible. Second, gene-level tests require multivariate statistics (i.e., components of the test statistic and their covariance matrix), which are difficult to obtain. To overcome these challenges, we propose to perform gene-level tests for rare variants by combining the results of single-variant analysis (i.e., p values of association tests and effect estimates) from participating studies. This simple strategy is possible because of an insight that multivariate statistics can be recovered from single-variant statistics, together with the correlation matrix of the single-variant test statistics, which can be estimated from one of the participating studies or from a publicly available database. We show both theoretically and numerically that the proposed meta-analysis approach provides accurate control of the type I error and is as powerful as joint analysis of individual participant data. This approach accommodates any disease phenotype and any study design and produces all commonly used gene-level tests. An application to the GWAS summary results of the Genetic Investigation of ANthropometric Traits (GIANT) consortium reveals rare and low-frequency variants associated with human height. The relevant software is freely available. Copyright © 2013 The American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.
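    The paper's key insight, that gene-level multivariate statistics can be recovered from single-variant results plus a correlation matrix, admits a compact numerical sketch. The z-scores, weights, and LD matrix below are invented for illustration, and the burden statistic shown is just one of the gene-level tests such a method can produce:

```python
import numpy as np

# Hypothetical single-variant meta-analysis results for one gene:
# signed z-scores recovered from p values and effect directions.
z = np.array([1.8, -0.4, 2.1, 0.9])   # per-variant z-statistics
w = np.ones_like(z)                   # burden weights (uniform here)

# Correlation (LD) matrix of the variant test statistics, estimated
# from one participating study or a public reference panel.
R = np.array([[1.0, 0.3, 0.1, 0.0],
              [0.3, 1.0, 0.2, 0.1],
              [0.1, 0.2, 1.0, 0.4],
              [0.0, 0.1, 0.4, 1.0]])

# Burden test: T = w'z / sqrt(w'Rw) is N(0,1) under the null, because
# cov(z) = R when each z_j is standard normal under the null.
T = w @ z / np.sqrt(w @ R @ w)
print(round(T, 3))
```

The same recovered vector and matrix also feed variance-component-style tests; only the quadratic form changes.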

  3. General Framework for Meta-analysis of Rare Variants in Sequencing Association Studies

    PubMed Central

    Lee, Seunggeun; Teslovich, Tanya M.; Boehnke, Michael; Lin, Xihong

    2013-01-01

We propose a general statistical framework for meta-analysis of gene- or region-based multimarker rare variant association tests in sequencing association studies. In genome-wide association studies, single-marker meta-analysis has been widely used to increase statistical power by combining results via regression coefficients and standard errors from different studies. In analysis of rare variants in sequencing studies, region-based multimarker tests are often used to increase power. We propose meta-analysis methods for commonly used gene- or region-based rare variant tests, such as burden tests and variance component tests. Because estimation of regression coefficients of individual rare variants is often unstable or not feasible, the proposed method avoids this difficulty by instead calculating score statistics, which only require fitting the null model for each study, and then aggregating these score statistics across studies. Our proposed meta-analysis rare variant association tests are conducted based on study-specific summary statistics, specifically score statistics for each variant and between-variant covariance-type (linkage disequilibrium) relationship statistics for each gene or region. The proposed methods are able to incorporate different levels of heterogeneity of genetic effects across studies and are applicable to meta-analysis of multiple ancestry groups. We show that the proposed methods are essentially as powerful as joint analysis by directly pooling individual-level genotype data. We conduct extensive simulations to evaluate the performance of our methods by varying levels of heterogeneity across studies, and we apply the proposed methods to meta-analysis of rare variant effects in a multicohort study of the genetics of blood lipid levels. PMID:23768515
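    The aggregation step described, summing per-study score statistics and their covariance matrices and forming a gene-level test, can be sketched as follows. All numbers are hypothetical, and the homogeneous-effects burden test shown is only one member of the family of tests the framework covers:

```python
import numpy as np

# Hypothetical per-study summaries for one gene with 3 rare variants:
# score vectors U_k and covariance (LD-type) matrices V_k, each obtained
# by fitting only the null model within the corresponding study.
U1 = np.array([2.0, -0.5, 1.2])
V1 = np.array([[4.0, 0.8, 0.2],
               [0.8, 3.0, 0.5],
               [0.2, 0.5, 2.5]])
U2 = np.array([1.1, 0.3, 0.7])
V2 = np.array([[3.5, 0.6, 0.1],
               [0.6, 2.8, 0.4],
               [0.1, 0.4, 2.2]])

# Meta-analysis under homogeneous effects: aggregate across studies.
U = U1 + U2
V = V1 + V2

# Burden-type meta test statistic: chi-square with 1 df under the null.
w = np.ones(3)
T_burden = (w @ U) ** 2 / (w @ V @ w)
print(round(T_burden, 3))
```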

  4. mapDIA: Preprocessing and statistical analysis of quantitative proteomics data from data independent acquisition mass spectrometry.

    PubMed

    Teo, Guoshou; Kim, Sinae; Tsou, Chih-Chiang; Collins, Ben; Gingras, Anne-Claude; Nesvizhskii, Alexey I; Choi, Hyungwon

    2015-11-03

Data independent acquisition (DIA) mass spectrometry is an emerging technique that offers more complete detection and quantification of peptides and proteins across multiple samples. DIA allows fragment-level quantification, which can be considered as repeated measurements of the abundance of the corresponding peptides and proteins in the downstream statistical analysis. However, few statistical approaches are available for aggregating these complex fragment-level data into peptide- or protein-level statistical summaries. In this work, we describe a software package, mapDIA, for statistical analysis of differential protein expression using DIA fragment-level intensities. The workflow consists of three major steps: intensity normalization, peptide/fragment selection, and statistical analysis. First, mapDIA offers normalization of fragment-level intensities by total intensity sums as well as a novel alternative normalization by local intensity sums in retention time space. Second, mapDIA removes outlier observations and selects peptides/fragments that preserve the major quantitative patterns across all samples for each protein. Last, using the selected fragments and peptides, mapDIA performs model-based statistical significance analysis of protein-level differential expression between specified groups of samples. Using a comprehensive set of simulation datasets, we show that mapDIA detects differentially expressed proteins with accurate control of the false discovery rates. We also describe the analysis procedure in detail using two recently published DIA datasets generated for the 14-3-3β dynamic interaction network and the prostate cancer glycoproteome. The software was written in C++, and the source code is available for free through the SourceForge website http://sourceforge.net/projects/mapdia/. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.
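    The first mapDIA step, normalization by total intensity sums, amounts to rescaling each sample so that all samples share a common total. A minimal sketch (not mapDIA's actual implementation; the intensity matrix is made up):

```python
import numpy as np

# Hypothetical fragment-level intensities: rows = fragments, cols = samples.
intensities = np.array([[120.0, 300.0, 210.0],
                        [ 80.0, 150.0, 140.0],
                        [200.0, 450.0, 350.0]])

# Total-intensity-sum normalization: scale each sample (column) so that
# every sample's total equals a common target, here the mean column sum.
col_sums = intensities.sum(axis=0)
target = col_sums.mean()
normalized = intensities * (target / col_sums)

# After normalization, every column sums to the same target value.
print(normalized.sum(axis=0))
```

The paper's alternative, local-intensity-sum normalization, applies the same rescaling within retention-time windows instead of globally.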

  5. Temporal scaling and spatial statistical analyses of groundwater level fluctuations

    NASA Astrophysics Data System (ADS)

    Sun, H.; Yuan, L., Sr.; Zhang, Y.

    2017-12-01

Natural dynamics such as groundwater level fluctuations can exhibit multifractionality and/or multifractality, likely due to multi-scale aquifer heterogeneity and controlling factors, whose statistics require efficient quantification methods. This study explores multifractionality and non-Gaussian properties in groundwater dynamics expressed by time series of daily level fluctuations at three wells located in the lower Mississippi valley, after removing the seasonal cycle in the temporal scaling and spatial statistical analysis. First, using time-scale multifractional analysis, a systematic statistical method is developed to analyze groundwater level fluctuations quantified by the time-scale local Hurst exponent (TS-LHE). Results show that the TS-LHE does not remain constant, implying that the fractal-scaling behavior changes with time and location. Hence, we can distinguish the potentially location-dependent scaling feature, which may characterize the hydrologic system. Second, spatial statistical analysis shows that the increment of groundwater level fluctuations exhibits a heavy-tailed, non-Gaussian distribution, which can be better quantified by a Lévy stable distribution. Monte Carlo simulations of the fluctuation process also show that the linear fractional stable motion model can depict the transient dynamics (i.e., fractal non-Gaussian property) of groundwater level well, while fractional Brownian motion is inadequate to describe natural processes with anomalous dynamics. Analysis of temporal scaling and spatial statistics therefore may provide useful information and quantification for further understanding the nature of complex dynamics in hydrology.
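    The scaling analysis rests on the idea that the standard deviation of increments grows like lag^H. A global (non-local) Hurst estimate along those lines can be sketched as follows; the series is a simulated Brownian motion (true H = 0.5), not groundwater data, and the TS-LHE of the paper is a time-resolved refinement of this kind of estimator:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.cumsum(rng.standard_normal(20000))   # Brownian motion: true H = 0.5

# Structure-function estimate of the Hurst exponent: the standard
# deviation of increments at lag s scales like s**H for fractional motions.
lags = np.array([1, 2, 4, 8, 16, 32, 64])
sds = np.array([np.std(x[lag:] - x[:-lag]) for lag in lags])

# Slope of log(sd) versus log(lag) gives the Hurst exponent estimate.
H, _ = np.polyfit(np.log(lags), np.log(sds), 1)
print(round(H, 2))   # should be close to 0.5 for Brownian motion
```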

  6. General specifications for the development of a USL NASA PC R and D statistical analysis support package

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Bassari, Jinous; Triantafyllopoulos, Spiros

    1984-01-01

    The University of Southwestern Louisiana (USL) NASA PC R and D statistical analysis support package is designed to be a three-level package to allow statistical analysis for a variety of applications within the USL Data Base Management System (DBMS) contract work. The design addresses usage of the statistical facilities as a library package, as an interactive statistical analysis system, and as a batch processing package.

  7. Technical Note: The Initial Stages of Statistical Data Analysis

    PubMed Central

    Tandy, Richard D.

    1998-01-01

    Objective: To provide an overview of several important data-related considerations in the design stage of a research project and to review the levels of measurement and their relationship to the statistical technique chosen for the data analysis. Background: When planning a study, the researcher must clearly define the research problem and narrow it down to specific, testable questions. The next steps are to identify the variables in the study, decide how to group and treat subjects, and determine how to measure, and the underlying level of measurement of, the dependent variables. Then the appropriate statistical technique can be selected for data analysis. Description: The four levels of measurement in increasing complexity are nominal, ordinal, interval, and ratio. Nominal data are categorical or “count” data, and the numbers are treated as labels. Ordinal data can be ranked in a meaningful order by magnitude. Interval data possess the characteristics of ordinal data and also have equal distances between levels. Ratio data have a natural zero point. Nominal and ordinal data are analyzed with nonparametric statistical techniques and interval and ratio data with parametric statistical techniques. Advantages: Understanding the four levels of measurement and when it is appropriate to use each is important in determining which statistical technique to use when analyzing data. PMID:16558489
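    The note's central decision rule, nominal and ordinal data to nonparametric techniques and interval and ratio data to parametric ones, can be captured in a small lookup helper. The function name and the specific example tests are illustrative, not drawn from the article:

```python
# Map a dependent variable's level of measurement to the broad family of
# statistical techniques recommended in the article (illustrative helper).
TEST_FAMILY = {
    "nominal":  "nonparametric (e.g., chi-square)",
    "ordinal":  "nonparametric (e.g., Mann-Whitney U)",
    "interval": "parametric (e.g., t-test, ANOVA)",
    "ratio":    "parametric (e.g., t-test, ANOVA)",
}

def technique_for(level: str) -> str:
    """Return the recommended family of techniques for a measurement level."""
    try:
        return TEST_FAMILY[level.lower()]
    except KeyError:
        raise ValueError(f"unknown level of measurement: {level!r}")

print(technique_for("ordinal"))   # nonparametric (e.g., Mann-Whitney U)
```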

  8. Primary, Secondary, and Meta-Analysis of Research

    ERIC Educational Resources Information Center

    Glass, Gene V.

    1976-01-01

Examines data analysis at three levels: primary analysis is the original analysis of data in a research study; secondary analysis is the re-analysis of data for the purpose of answering the original research question with better statistical techniques, or answering new questions with old data; and, meta-analysis refers to the statistical analysis of many analysis results from individual studies for…

  9. Comparison of a non-stationary voxelation-corrected cluster-size test with TFCE for group-Level MRI inference.

    PubMed

    Li, Huanjie; Nickerson, Lisa D; Nichols, Thomas E; Gao, Jia-Hong

    2017-03-01

Two powerful methods for statistical inference on MRI brain images have been proposed recently: a non-stationary voxelation-corrected cluster-size test (CST) based on random field theory, and threshold-free cluster enhancement (TFCE), which calculates the level of local support for a cluster and then uses permutation testing for inference. Unlike other statistical approaches, these two methods do not rest on the assumptions of a uniform and high degree of spatial smoothness of the statistic image, and so they are strongly recommended for group-level fMRI analysis over other statistical methods. In this work, the non-stationary voxelation-corrected CST and TFCE methods for group-level analysis were evaluated for both stationary and non-stationary images under varying smoothness levels, degrees of freedom and signal to noise ratios. Our results suggest that both methods provide adequate control for the number of voxel-wise statistical tests being performed during inference on fMRI data, and both are superior to the current CSTs implemented in popular MRI data analysis software packages. However, TFCE is more sensitive and stable for group-level analysis of VBM data. Thus, the voxelation-corrected CST approach may confer some advantages by being computationally less demanding than TFCE with permutation testing for fMRI data analysis and by also being applicable to single-subject fMRI analyses, while the TFCE approach is advantageous for VBM data. Hum Brain Mapp 38:1269-1280, 2017. © 2016 Wiley Periodicals, Inc.

  10. Assessing the Lexico-Grammatical Characteristics of a Corpus of College-Level Statistics Textbooks: Implications for Instruction and Practice

    ERIC Educational Resources Information Center

    Wagler, Amy E.; Lesser, Lawrence M.; González, Ariel I.; Leal, Luis

    2015-01-01

    A corpus of current editions of statistics textbooks was assessed to compare aspects and levels of readability for the topics of "measures of center," "line of fit," "regression analysis," and "regression inference." Analysis with lexical software of these text selections revealed that the large corpus can…

  11. The level crossing rates and associated statistical properties of a random frequency response function

    NASA Astrophysics Data System (ADS)

    Langley, Robin S.

    2018-03-01

    This work is concerned with the statistical properties of the frequency response function of the energy of a random system. Earlier studies have considered the statistical distribution of the function at a single frequency, or alternatively the statistics of a band-average of the function. In contrast the present analysis considers the statistical fluctuations over a frequency band, and results are obtained for the mean rate at which the function crosses a specified level (or equivalently, the average number of times the level is crossed within the band). Results are also obtained for the probability of crossing a specified level at least once, the mean rate of occurrence of peaks, and the mean trough-to-peak height. The analysis is based on the assumption that the natural frequencies and mode shapes of the system have statistical properties that are governed by the Gaussian Orthogonal Ensemble (GOE), and the validity of this assumption is demonstrated by comparison with numerical simulations for a random plate. The work has application to the assessment of the performance of dynamic systems that are sensitive to random imperfections.
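    The central quantity, the mean rate at which the response crosses a specified level within a frequency band, is easy to estimate numerically by counting up-crossings. The signal below is a synthetic stand-in for a random frequency response function, not the GOE-based model of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
freqs = np.linspace(0.0, 100.0, 10001)   # analysis band (arbitrary units)

# Synthetic stand-in for a random frequency response: a sum of random tones.
amps = rng.uniform(0.5, 1.5, 5)
f0s = rng.uniform(1.0, 10.0, 5)
phases = rng.uniform(0.0, 2.0 * np.pi, 5)
resp = np.abs(sum(a * np.sin(2 * np.pi * f0 * freqs / 100.0 + p)
                  for a, f0, p in zip(amps, f0s, phases)))

level = 1.0
# An up-crossing occurs where the response passes the level from below;
# the average number of crossings per unit frequency is the crossing rate.
up = int(np.sum((resp[:-1] < level) & (resp[1:] >= level)))
rate = up / (freqs[-1] - freqs[0])
print(up, rate)
```

The paper's contribution is an analytical formula for this rate under GOE statistics; the counting above is what one would compare such a formula against in simulation.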

  12. Analysis of Coastal Dunes: A Remote Sensing and Statistical Approach.

    ERIC Educational Resources Information Center

    Jones, J. Richard

    1985-01-01

    Remote sensing analysis and statistical methods were used to analyze the coastal dunes of Plum Island, Massachusetts. The research methodology used provides an example of a student project for remote sensing, geomorphology, or spatial analysis courses at the university level. (RM)

  13. The Heuristics of Statistical Argumentation: Scaffolding at the Postsecondary Level

    ERIC Educational Resources Information Center

    Pardue, Teneal Messer

    2017-01-01

    Language plays a key role in statistics and, by extension, in statistics education. Enculturating students into the practice of statistics requires preparing them to communicate results of data analysis. Statistical argumentation is one way of providing structure to facilitate discourse in the statistics classroom. In this study, a teaching…

  14. Improved score statistics for meta-analysis in single-variant and gene-level association studies.

    PubMed

    Yang, Jingjing; Chen, Sai; Abecasis, Gonçalo

    2018-06-01

Meta-analysis is now an essential tool for genetic association studies, allowing them to combine large studies and greatly accelerating the pace of genetic discovery. Although the standard meta-analysis methods perform equivalently to the more cumbersome joint analysis under ideal settings, they result in substantial power loss under unbalanced settings with various case-control ratios. Here, we investigate the power loss caused by the standard meta-analysis methods for unbalanced studies, and propose novel meta-analysis methods that perform equivalently to the joint analysis under both balanced and unbalanced settings. We derive improved meta-score-statistics that can accurately approximate the joint-score-statistics with combined individual-level data, for both linear and logistic regression models, with and without covariates. In addition, we propose a novel approach to adjust for population stratification by correcting for known population structures through minor allele frequencies. In the simulated gene-level association studies under unbalanced settings, our method recovered up to 85% of the power loss caused by the standard methods. We further showed the power gain of our methods in gene-level tests with 26 unbalanced studies of age-related macular degeneration. In addition, we took the meta-analysis of three unbalanced studies of type 2 diabetes as an example to discuss the challenges of meta-analyzing multi-ethnic samples. In summary, our improved meta-score-statistics with corrections for population stratification can be used to construct both single-variant and gene-level association studies, providing a useful framework for ensuring well-powered, convenient, cross-study analyses. © 2018 WILEY PERIODICALS, INC.

  15. Social inequalities in alcohol consumption in the Czech Republic: a multilevel analysis.

    PubMed

    Dzúrová, Dagmara; Spilková, Jana; Pikhart, Hynek

    2010-05-01

The Czech Republic traditionally ranks among the countries with the highest alcohol consumption. This paper examines both risk and protective factors for frequent alcohol consumption in the Czech population using multilevel analysis. Risk factors were measured at the individual level and at the area level. The individual-level data were obtained from a survey of a sample of 3526 respondents aged 18-64 years. The area-level data were obtained from the Czech Statistical Office. The group most inclined to risky alcohol consumption and binge drinking is mainly men who live as singles, with low education, and also the unemployed. Only the variable for divorce rate showed statistical significance at both levels, the individual and the aggregated. No cross-level interactions were found to be statistically significant. Copyright 2010 Elsevier Ltd. All rights reserved.

  16. Why campaigns for local transportation funding initiatives succeed or fail : an analysis of four communities and national data

    DOT National Transportation Integrated Search

    2000-06-01

    This report uses statistical analysis of community-level characteristics and qualitatively focused case studies to explore what determines the success of local transportation-related tax measures. The report contains both a statistical analysis of lo...

  17. metaCCA: summary statistics-based multivariate meta-analysis of genome-wide association studies using canonical correlation analysis.

    PubMed

    Cichonska, Anna; Rousu, Juho; Marttinen, Pekka; Kangas, Antti J; Soininen, Pasi; Lehtimäki, Terho; Raitakari, Olli T; Järvelin, Marjo-Riitta; Salomaa, Veikko; Ala-Korpela, Mika; Ripatti, Samuli; Pirinen, Matti

    2016-07-01

A dominant approach to genetic association studies is to perform univariate tests between genotype-phenotype pairs. However, analyzing related traits together increases statistical power, and certain complex associations become detectable only when several variants are tested jointly. Currently, modest sample sizes of individual cohorts, and restricted availability of individual-level genotype-phenotype data across the cohorts, limit conducting multivariate tests. We introduce metaCCA, a computational framework for summary statistics-based analysis of a single or multiple studies that allows multivariate representation of both genotype and phenotype. It extends the statistical technique of canonical correlation analysis to the setting where original individual-level records are not available, and employs a covariance shrinkage algorithm to achieve robustness. Multivariate meta-analysis of two Finnish studies of nuclear magnetic resonance metabolomics by metaCCA, using standard univariate output from the program SNPTEST, shows an excellent agreement with the pooled individual-level analysis of original data. Motivated by strong multivariate signals in the lipid genes tested, we envision that multivariate association testing using metaCCA has a great potential to provide novel insights from already published summary statistics from high-throughput phenotyping technologies. Code is available at https://github.com/aalto-ics-kepaco. Contact: anna.cichonska@helsinki.fi or matti.pirinen@helsinki.fi. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
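    The statistical core of metaCCA, canonical correlation between a genotype block and a phenotype block, can be sketched from covariance blocks. For brevity the sketch forms those blocks from simulated individual-level data; metaCCA's contribution is precisely that it reconstructs them from published summary statistics instead, with shrinkage for robustness:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500
# Simulated stand-in data: 3 genotype variables and 2 correlated traits.
G = rng.standard_normal((n, 3))
P = G @ rng.standard_normal((3, 2)) + rng.standard_normal((n, 2))

# Canonical correlation analysis from the joint covariance blocks.
C = np.cov(np.hstack([G, P]), rowvar=False)
Cxx, Cxy = C[:3, :3], C[:3, 3:]
Cyx, Cyy = C[3:, :3], C[3:, 3:]

# Eigenvalues of Cxx^-1 Cxy Cyy^-1 Cyx are the squared canonical correlations.
M = np.linalg.solve(Cxx, Cxy) @ np.linalg.solve(Cyy, Cyx)
canon_corr = np.sqrt(np.max(np.linalg.eigvals(M).real))
print(round(canon_corr, 2))
```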

  18. metaCCA: summary statistics-based multivariate meta-analysis of genome-wide association studies using canonical correlation analysis

    PubMed Central

    Cichonska, Anna; Rousu, Juho; Marttinen, Pekka; Kangas, Antti J.; Soininen, Pasi; Lehtimäki, Terho; Raitakari, Olli T.; Järvelin, Marjo-Riitta; Salomaa, Veikko; Ala-Korpela, Mika; Ripatti, Samuli; Pirinen, Matti

    2016-01-01

    Motivation: A dominant approach to genetic association studies is to perform univariate tests between genotype-phenotype pairs. However, analyzing related traits together increases statistical power, and certain complex associations become detectable only when several variants are tested jointly. Currently, modest sample sizes of individual cohorts, and restricted availability of individual-level genotype-phenotype data across the cohorts limit conducting multivariate tests. Results: We introduce metaCCA, a computational framework for summary statistics-based analysis of a single or multiple studies that allows multivariate representation of both genotype and phenotype. It extends the statistical technique of canonical correlation analysis to the setting where original individual-level records are not available, and employs a covariance shrinkage algorithm to achieve robustness. Multivariate meta-analysis of two Finnish studies of nuclear magnetic resonance metabolomics by metaCCA, using standard univariate output from the program SNPTEST, shows an excellent agreement with the pooled individual-level analysis of original data. Motivated by strong multivariate signals in the lipid genes tested, we envision that multivariate association testing using metaCCA has a great potential to provide novel insights from already published summary statistics from high-throughput phenotyping technologies. Availability and implementation: Code is available at https://github.com/aalto-ics-kepaco Contacts: anna.cichonska@helsinki.fi or matti.pirinen@helsinki.fi Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153689

  19. A quantitative analysis of factors influencing the professional longevity of high school science teachers in Florida

    NASA Astrophysics Data System (ADS)

    Ridgley, James Alexander, Jr.

This dissertation is an exploratory quantitative analysis of various independent variables to determine their effect on the professional longevity (years of service) of high school science teachers in the state of Florida for the academic years 2011-2012 to 2013-2014. Data are collected from the Florida Department of Education, National Center for Education Statistics, and the National Assessment of Educational Progress databases. The following research hypotheses are examined: H1 - There are statistically significant differences in Level 1 (teacher variables) that influence the professional longevity of a high school science teacher in Florida. H2 - There are statistically significant differences in Level 2 (school variables) that influence the professional longevity of a high school science teacher in Florida. H3 - There are statistically significant differences in Level 3 (district variables) that influence the professional longevity of a high school science teacher in Florida. H4 - When tested in a hierarchical multiple regression, there are statistically significant differences in Level 1, Level 2, or Level 3 that influence the professional longevity of a high school science teacher in Florida. The professional longevity of a Floridian high school science teacher is the dependent variable. The independent variables are: (Level 1) a teacher's sex, age, ethnicity, earned degree, salary, number of schools taught in, migration count, and various years of service in different areas of education; (Level 2) a school's geographic location, residential population density, average class size, charter status, and SES; and (Level 3) a school district's average SES and average spending per pupil. Statistical analyses of exploratory MLRs and an HMR are used to support the research hypotheses.
The final results of the HMR analysis show a teacher's age, salary, earned degree (unknown, associate, and doctorate), and ethnicity (Hispanic and Native Hawaiian/Pacific Islander); a school's charter status; and a school district's average SES are all significant predictors of a Florida high school science teacher's professional longevity. Although statistically significant in the initial exploratory MLR analyses, a teacher's ethnicity (Asian and Black), a school's geographic location (city and rural), and a school's SES are not statistically significant in the final HMR model.

  20. Why Campaigns for Local Transportation Funding Initiatives Succeed or Fail: An Analysis of Four Communities and National Data (PDF file)

    DOT National Transportation Integrated Search

    2000-06-01

    This report uses statistical analysis of community-level characteristics and qualitatively focused case studies to explore what determines the success of local transportation-related tax measures. The report contains both a statistical analysis of lo...

  1. Assessing the Kansas water-level monitoring program: An example of the application of classical statistics to a geological problem

    USGS Publications Warehouse

    Davis, J.C.

    2000-01-01

    Geologists may feel that geological data are not amenable to statistical analysis, or at best require specialized approaches such as nonparametric statistics and geostatistics. However, there are many circumstances, particularly in systematic studies conducted for environmental or regulatory purposes, where traditional parametric statistical procedures can be beneficial. An example is the application of analysis of variance to data collected in an annual program of measuring groundwater levels in Kansas. Influences such as well conditions, operator effects, and use of the water can be assessed and wells that yield less reliable measurements can be identified. Such statistical studies have resulted in yearly improvements in the quality and reliability of the collected hydrologic data. Similar benefits may be achieved in other geological studies by the appropriate use of classical statistical tools.

  2. A Meta-analysis of Gender Differences in Applied Statistics Achievement.

    ERIC Educational Resources Information Center

    Schram, Christine M.

    1996-01-01

    A meta-analysis of gender differences examined statistics achievement in postsecondary level psychology, education, and business courses. Analysis of 13 articles (18 samples) found that undergraduate males had an advantage, outscoring females when the outcome was a series of examinations. Females outscored males when the outcome was total course…

  3. Bias, precision and statistical power of analysis of covariance in the analysis of randomized trials with baseline imbalance: a simulation study.

    PubMed

    Egbewale, Bolaji E; Lewis, Martyn; Sim, Julius

    2014-04-09

    Analysis of variance (ANOVA), change-score analysis (CSA) and analysis of covariance (ANCOVA) respond differently to baseline imbalance in randomized controlled trials. However, no empirical studies appear to have quantified the differential bias and precision of estimates derived from these methods of analysis, and their relative statistical power, in relation to combinations of levels of key trial characteristics. This simulation study therefore examined the relative bias, precision and statistical power of these three analyses using simulated trial data. 126 hypothetical trial scenarios were evaluated (126,000 datasets), each with continuous data simulated by using a combination of levels of: treatment effect; pretest-posttest correlation; direction and magnitude of baseline imbalance. The bias, precision and power of each method of analysis were calculated for each scenario. Compared to the unbiased estimates produced by ANCOVA, both ANOVA and CSA are subject to bias, in relation to pretest-posttest correlation and the direction of baseline imbalance. Additionally, ANOVA and CSA are less precise than ANCOVA, especially when pretest-posttest correlation ≥ 0.3. When groups are balanced at baseline, ANCOVA is at least as powerful as the other analyses. Apparently greater power of ANOVA and CSA at certain imbalances is achieved in respect of a biased treatment effect. Across a range of correlations between pre- and post-treatment scores and at varying levels and direction of baseline imbalance, ANCOVA remains the optimum statistical method for the analysis of continuous outcomes in RCTs, in terms of bias, precision and statistical power.
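    The three estimators compared in this simulation study can be reproduced in miniature. In this hedged sketch all parameters are invented, and the groups are randomized and balanced, so all three estimates land near the true effect; the paper's point concerns how they diverge under baseline imbalance:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 2000                                   # large n so estimates are stable
group = rng.integers(0, 2, n)              # 0 = control, 1 = treatment
pre = rng.standard_normal(n)               # baseline (pretest) score
effect = 0.5                               # true treatment effect
# Posttest correlated with pretest, plus the treatment effect and noise.
post = 0.6 * pre + effect * group + 0.8 * rng.standard_normal(n)

# ANOVA on posttest only: difference in group means of post.
anova = post[group == 1].mean() - post[group == 0].mean()

# Change-score analysis: difference in group means of (post - pre).
change = post - pre
csa = change[group == 1].mean() - change[group == 0].mean()

# ANCOVA: regress post on intercept, group, and pretest; the group
# coefficient is the covariate-adjusted treatment effect.
X = np.column_stack([np.ones(n), group, pre])
beta, *_ = np.linalg.lstsq(X, post, rcond=None)
ancova = beta[1]

print(round(anova, 2), round(csa, 2), round(ancova, 2))
```

Repeating this with pretest means deliberately shifted between groups is how the paper's bias and precision comparisons are generated.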

  4. Bias, precision and statistical power of analysis of covariance in the analysis of randomized trials with baseline imbalance: a simulation study

    PubMed Central

    2014-01-01

    Background Analysis of variance (ANOVA), change-score analysis (CSA) and analysis of covariance (ANCOVA) respond differently to baseline imbalance in randomized controlled trials. However, no empirical studies appear to have quantified the differential bias and precision of estimates derived from these methods of analysis, and their relative statistical power, in relation to combinations of levels of key trial characteristics. This simulation study therefore examined the relative bias, precision and statistical power of these three analyses using simulated trial data. Methods 126 hypothetical trial scenarios were evaluated (126 000 datasets), each with continuous data simulated by using a combination of levels of: treatment effect; pretest-posttest correlation; direction and magnitude of baseline imbalance. The bias, precision and power of each method of analysis were calculated for each scenario. Results Compared to the unbiased estimates produced by ANCOVA, both ANOVA and CSA are subject to bias, in relation to pretest-posttest correlation and the direction of baseline imbalance. Additionally, ANOVA and CSA are less precise than ANCOVA, especially when pretest-posttest correlation ≥ 0.3. When groups are balanced at baseline, ANCOVA is at least as powerful as the other analyses. Apparently greater power of ANOVA and CSA at certain imbalances is achieved in respect of a biased treatment effect. Conclusions Across a range of correlations between pre- and post-treatment scores and at varying levels and direction of baseline imbalance, ANCOVA remains the optimum statistical method for the analysis of continuous outcomes in RCTs, in terms of bias, precision and statistical power. PMID:24712304

  5. Assessment of trace elements levels in patients with Type 2 diabetes using multivariate statistical analysis.

    PubMed

    Badran, M; Morsy, R; Soliman, H; Elnimr, T

    2016-01-01

    Trace element metabolism has been reported to play specific roles in the pathogenesis and progression of diabetes mellitus. Given the continuous increase in the population of patients with Type 2 diabetes (T2D), this study aims to assess the levels and inter-relationships of fasting blood glucose (FBG) and serum trace elements in Type 2 diabetic patients. This study was conducted on 40 Egyptian Type 2 diabetic patients and 36 healthy volunteers (Hospital of Tanta University, Tanta, Egypt). The blood serum was digested and then used to determine the levels of 24 trace elements using inductively coupled plasma mass spectrometry (ICP-MS). Multivariate statistical analyses based on correlation coefficients, cluster analysis (CA) and principal component analysis (PCA) were used to analyse the data. The results exhibited significant changes in FBG and in the serum levels of eight trace elements (Zn, Cu, Se, Fe, Mn, Cr, Mg and As) in Type 2 diabetic patients relative to healthy controls. The multivariate statistical techniques reduced the number of experimental variables and grouped the trace elements in patients into three clusters. The application of PCA revealed a distinct difference in the associations of trace elements and their clustering patterns between the control and patient groups, in particular for Mg, Fe, Cu and Zn, which appeared to be the factors most closely related to Type 2 diabetes. On the basis of this study, the contributions of trace elements in Type 2 diabetic patients can therefore be determined and specified through correlation and multivariate statistical analysis, confirming that the alteration of some essential trace metals may play a role in the development of diabetes mellitus. Copyright © 2015 Elsevier GmbH. All rights reserved.
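The PCA step this record describes can be sketched with a tiny synthetic serum-element matrix (rows are subjects, columns are elements). The numbers and the four-element subset are invented for illustration; this is not the study's data.

```python
import numpy as np

elements = ["Zn", "Cu", "Fe", "Mg"]
X = np.array([[80., 95., 110., 20.],
              [78., 97., 108., 21.],
              [60., 120., 90., 15.],
              [62., 118., 92., 14.],
              [81., 94., 111., 19.],
              [59., 121., 89., 16.]])

Xc = (X - X.mean(axis=0)) / X.std(axis=0)      # standardize each element
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)                # variance share per principal component
scores = Xc @ Vt.T                             # subject coordinates on the PCs
print(np.round(explained, 3))
```

Because the invented subjects split into two groups with opposite element profiles, the first component captures almost all of the variance; with real serum data several components would typically be retained and interpreted alongside the cluster analysis.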

  6. Significant Association of Urinary Toxic Metals and Autism-Related Symptoms—A Nonlinear Statistical Analysis with Cross Validation

    PubMed Central

    Adams, James; Kruger, Uwe; Geis, Elizabeth; Gehn, Eva; Fimbres, Valeria; Pollard, Elena; Mitchell, Jessica; Ingram, Julie; Hellmers, Robert; Quig, David; Hahn, Juergen

    2017-01-01

    Introduction A number of previous studies examined a possible association of toxic metals and autism, and over half of those studies suggest that toxic metal levels are different in individuals with Autism Spectrum Disorders (ASD). Additionally, several studies found that those levels correlate with the severity of ASD. Methods In order to further investigate these points, this paper performs the most detailed statistical analysis to date of a data set in this field. First morning urine samples were collected from 67 children and adults with ASD and 50 neurotypical controls of similar age and gender. The samples were analyzed to determine the levels of 10 urinary toxic metals (UTM). Autism-related symptoms were assessed with eleven behavioral measures. Statistical analysis was used to distinguish participants on the ASD spectrum and neurotypical participants based upon the UTM data alone. The analysis also included examining the association of autism severity with toxic metal excretion data using linear and nonlinear analysis. “Leave-one-out” cross-validation was used to ensure statistical independence of results. Results and Discussion Average excretion levels of several toxic metals (lead, tin, thallium, antimony) were significantly higher in the ASD group. However, ASD classification using univariate statistics proved difficult due to large variability, but nonlinear multivariate statistical analysis significantly improved ASD classification with Type I/II errors of 15% and 18%, respectively. These results clearly indicate that the urinary toxic metal excretion profiles of participants in the ASD group were significantly different from those of the neurotypical participants. Similarly, nonlinear methods determined a significantly stronger association between the behavioral measures and toxic metal excretion. 
The association was strongest for the Aberrant Behavior Checklist (including subscales on Irritability, Stereotypy, Hyperactivity, and Inappropriate Speech), but significant associations were found for UTM with all eleven autism-related assessments with cross-validation R2 values ranging from 0.12–0.48. PMID:28068407

  7. A Simple Test of Class-Level Genetic Association Can Reveal Novel Cardiometabolic Trait Loci.

    PubMed

    Qian, Jing; Nunez, Sara; Reed, Eric; Reilly, Muredach P; Foulkes, Andrea S

    2016-01-01

    Characterizing the genetic determinants of complex diseases can be further augmented by incorporating knowledge of underlying structure or classifications of the genome, such as newly developed mappings of protein-coding genes, epigenetic marks, enhancer elements and non-coding RNAs. We apply a simple class-level testing framework, termed Genetic Class Association Testing (GenCAT), to identify protein-coding gene association with 14 cardiometabolic (CMD) related traits across 6 publicly available genome wide association (GWA) meta-analysis data resources. GenCAT uses SNP-level meta-analysis test statistics across all SNPs within a class of elements, as well as the size of the class and its unique correlation structure, to determine if the class is statistically meaningful. The novelty of findings is evaluated through investigation of regional signals. A subset of findings are validated using recently updated, larger meta-analysis resources. A simulation study is presented to characterize overall performance with respect to power, control of family-wise error and computational efficiency. All analysis is performed using the GenCAT package, R version 3.2.1. We demonstrate that class-level testing complements the common first stage minP approach that involves individual SNP-level testing followed by post-hoc ascribing of statistically significant SNPs to genes and loci. GenCAT suggests 54 protein-coding genes at 41 distinct loci for the 13 CMD traits investigated in the discovery analysis, that are beyond the discoveries of minP alone. An additional application to biological pathways demonstrates flexibility in defining genetic classes. We conclude that it would be prudent to include class-level testing as standard practice in GWA analysis. 
GenCAT, for example, can be used as a simple, complementary and efficient strategy for class-level testing that leverages existing data resources, requires only summary level data in the form of test statistics, and adds significant value with respect to its potential for identifying multiple novel and clinically relevant trait associations.
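The generic ingredient described above, combining SNP-level test statistics within a class while accounting for their correlation structure, can be sketched as a quadratic-form statistic. This is an illustration only: GenCAT itself is an R package whose exact statistic may differ, and the z-scores and correlation matrix here are invented.

```python
import numpy as np

z = np.array([2.1, 1.8, -0.4])               # SNP-level test statistics (hypothetical)
R = np.array([[1.0, 0.6, 0.2],
              [0.6, 1.0, 0.3],
              [0.2, 0.3, 1.0]])              # LD-induced correlation among the statistics

stat = z @ np.linalg.solve(R, z)             # ~ chi-square with len(z) df under the null
print(round(float(stat), 2), len(z))
```

Comparing the statistic against a chi-square distribution with one degree of freedom per SNP yields a single class-level p-value, rather than one test per SNP followed by post-hoc ascribing.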

  8. The effect of project-based learning on students' statistical literacy levels for data representation

    NASA Astrophysics Data System (ADS)

    Koparan, Timur; Güven, Bülent

    2015-07-01

    The aim of this study is to determine the effect of a project-based learning approach on 8th-grade secondary-school students' statistical literacy levels for data representation. To achieve this goal, a test consisting of 12 open-ended questions was developed in accordance with the views of experts. Seventy 8th-grade secondary-school students, 35 in the experimental group and 35 in the control group, took this test twice, once before and once after the intervention. All raw scores were converted into linear measures using the Winsteps 3.72 modelling program, which performs Rasch analysis, and t-tests and an ANCOVA were carried out on the linear measures. Based on the findings, it was concluded that the project-based learning approach increases students' level of statistical literacy for data representation. Students' levels of statistical literacy before and after the intervention were shown through the obtained person-item maps.

  9. IGESS: a statistical approach to integrating individual-level genotype data and summary statistics in genome-wide association studies.

    PubMed

    Dai, Mingwei; Ming, Jingsi; Cai, Mingxuan; Liu, Jin; Yang, Can; Wan, Xiang; Xu, Zongben

    2017-09-15

    Results from genome-wide association studies (GWAS) suggest that a complex phenotype is often affected by many variants with small effects, known as 'polygenicity'. Tens of thousands of samples are often required to ensure the statistical power of identifying these variants with small effects. However, it is often the case that a research group can only get approval for access to individual-level genotype data with a limited sample size (e.g. a few hundred or thousand). Meanwhile, summary statistics generated using single-variant-based analysis are becoming publicly available, and the sample sizes associated with these summary statistics datasets are usually quite large. How to make the most efficient use of existing abundant data resources largely remains an open question. In this study, we propose a statistical approach, IGESS, to increasing the statistical power of identifying risk variants and improving the accuracy of risk prediction by integrating individual-level genotype data and summary statistics. An efficient algorithm based on variational inference is developed to handle the genome-wide analysis. Through comprehensive simulation studies, we demonstrated the advantages of IGESS over methods which take either individual-level data or summary statistics data as input. We applied IGESS to perform an integrative analysis of Crohn's disease from WTCCC and summary statistics from other studies. IGESS was able to significantly increase the statistical power of identifying risk variants and improve the risk prediction accuracy from 63.2% (±0.4%) to 69.4% (±0.1%) using about 240 000 variants. The IGESS software is available at https://github.com/daviddaigithub/IGESS. Contact: zbxu@xjtu.edu.cn or xwan@comp.hkbu.edu.hk or eeyang@hkbu.edu.hk. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
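IGESS itself fits a variational Bayes model, but the basic intuition of borrowing strength from a large summary-statistics dataset can be illustrated with a much simpler fixed-effect, inverse-variance combination of two estimates for a single variant. All numbers below are hypothetical.

```python
# Hypothetical per-variant effect estimates and standard errors
beta_ind, se_ind = 0.12, 0.05     # from the small individual-level sample
beta_sum, se_sum = 0.09, 0.02     # from the large published summary statistics

w_ind, w_sum = 1 / se_ind**2, 1 / se_sum**2            # inverse-variance weights
beta_comb = (w_ind * beta_ind + w_sum * beta_sum) / (w_ind + w_sum)
se_comb = (w_ind + w_sum) ** -0.5                      # combined standard error
print(round(beta_comb, 4), round(se_comb, 4))
```

The combined standard error is smaller than either input, which is the sense in which integrating the two data sources increases power; IGESS achieves this jointly across all variants rather than one variant at a time.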

  10. Statistics without Tears: Complex Statistics with Simple Arithmetic

    ERIC Educational Resources Information Center

    Smith, Brian

    2011-01-01

    One of the often overlooked aspects of modern statistics is the analysis of time series data. Modern introductory statistics courses tend to rush to probabilistic applications involving risk and confidence. Rarely does the first level course linger on such useful and fascinating topics as time series decomposition, with its practical applications…

  11. Statistical approaches in published ophthalmic clinical science papers: a comparison to statistical practice two decades ago.

    PubMed

    Zhang, Harrison G; Ying, Gui-Shuang

    2018-02-09

    The aim of this study is to evaluate the current practice of statistical analysis of eye data in clinical science papers published in the British Journal of Ophthalmology (BJO) and to determine whether the practice of statistical analysis has improved in the past two decades. All clinical science papers (n=125) published in BJO in January-June 2017 were reviewed for their statistical approaches to analysing the primary ocular measure. We compared our findings to the results from a previous paper that reviewed BJO papers in 1995. Of 112 papers eligible for analysis, half of the studies analysed the data at an individual level because of the nature of the observations, 16 (14%) studies analysed data from one eye only, 36 (32%) studies analysed data from both eyes at the ocular level, one study (1%) analysed the overall summary of ocular findings per individual and three (3%) studies used paired comparisons. Among studies with data available from both eyes, 50 (89%) of 56 papers in 2017 did not analyse data from both eyes or ignored the intereye correlation, compared with 60 (90%) of 67 papers in 1995 (P=0.96). Among studies that analysed data from both eyes at an ocular level, 33 (92%) of 36 studies completely ignored the intereye correlation in 2017, compared with 16 (89%) of 18 studies in 1995 (P=0.40). A majority of studies did not analyse the data properly when data from both eyes were available, and the practice of statistical analysis did not improve in the past two decades. Collaborative efforts should be made in the vision research community to improve the practice of statistical analysis for ocular data. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
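Why ignoring intereye correlation matters can be quantified with a standard design-effect calculation. The sketch below uses invented numbers, not data from the paper: with n patients each contributing both eyes and intereye correlation rho, the variance of a mean over all eyes carries a design effect of (1 + rho).

```python
import math

n_patients, sigma, rho = 100, 1.0, 0.6
n_eyes = 2 * n_patients

se_naive = sigma / math.sqrt(n_eyes)                    # treats 200 eyes as independent
se_correct = sigma * math.sqrt((1 + rho) / n_eyes)      # accounts for paired eyes
print(round(se_naive, 4), round(se_correct, 4))
```

The naive standard error is too small by a factor of sqrt(1 + rho), so analyses that treat two eyes as independent observations overstate their precision and understate p-values.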

  12. A Statistical Analysis Plan to Support the Joint Forward Area Air Defense Test.

    DTIC Science & Technology

    1984-08-02

    by establishing a specific significance level prior to performing the statistical test (traditionally α levels are set at .01 or .05). What is often... an undesirable increase in β. For constant α levels, the power (1 − β) of a statistical test can be increased by increasing the sample size of the test. [Ref... [flowchart fragment: perform a k-sample comparison test (ANOVA) on the MOP "A" factor to determine whether differences exist among the MOP "A" levels]
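The α/β trade-off this excerpt describes can be illustrated with a one-sided z-test power calculation. This is a sketch only: α is fixed at .05 and the effect size and sample sizes are invented.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def power_one_sided_z(effect, n):
    """Power (1 - beta) of a one-sided z-test at alpha = .05."""
    z_alpha = 1.6448536269514722          # upper 5% point of the standard normal
    return norm_cdf(effect * math.sqrt(n) - z_alpha)

print(round(power_one_sided_z(0.3, 50), 3), round(power_one_sided_z(0.3, 200), 3))
```

Holding α and the effect size fixed, quadrupling the sample size from 50 to 200 raises the power from about 0.68 to about 0.995, which is exactly the relationship the excerpt states.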

  13. Application of Ontology Technology in Health Statistic Data Analysis.

    PubMed

    Guo, Minjiang; Hu, Hongpu; Lei, Xingyun

    2017-01-01

    Research purpose: to establish a health management ontology for the analysis of health statistics data. Proposed methods: this paper established a health management ontology based on an analysis of the concepts in the China Health Statistics Yearbook, and used Protégé to define the syntactic and semantic structure of health statistical data. Six classes of top-level ontology concepts and their subclasses were extracted, and object properties and data properties were defined to establish the construction of these classes. By ontology instantiation, we can integrate multi-source heterogeneous data and enable administrators to have an overall understanding and analysis of the health statistics data. Ontology technology provides a comprehensive and unified information integration structure for the health management domain and lays a foundation for the efficient analysis of multi-source and heterogeneous health system management data and for enhancement of management efficiency.

  14. Killing Barney Fife: Law Enforcements Socially Constructed Perception of Violence and its Influence on Police Militarization

    DTIC Science & Technology

    2015-09-01

    then examines the correlation between violence and police militarization. A statistical analysis of crime data found an inverse relationship between levels of reported violence and... events. The research then focused on the correlation between violence and police militarization. The research began with a detailed statistical

  15. Cognition, comprehension and application of biostatistics in research by Indian postgraduate students in periodontics.

    PubMed

    Swetha, Jonnalagadda Laxmi; Arpita, Ramisetti; Srikanth, Chintalapani; Nutalapati, Rajasekhar

    2014-01-01

    Biostatistics is an integral part of research protocols. In any field of inquiry or investigation, the data obtained are subsequently classified, analyzed and tested for accuracy by statistical methods. Statistical analysis of collected data thus forms the basis for all evidence-based conclusions. The aim of this study is to evaluate the cognition, comprehension and application of biostatistics in research among postgraduate students in periodontics in India. A total of 391 postgraduate students registered for a master's course in periodontics at various dental colleges across India were included in the survey. Data regarding the level of knowledge, understanding and its application in the design and conduct of research protocols were collected using a dichotomous questionnaire. Descriptive statistics were used for data analysis. Nearly 79.2% of students were aware of the importance of biostatistics in research, 55-65% were familiar with MS Excel spreadsheets for graphical representation of data and with the statistical software available on the internet, 26.0% had biostatistics as a mandatory subject in their curriculum, 9.5% tried to perform statistical analysis on their own, while 3.0% were successful in performing the statistical analysis of their studies on their own. Biostatistics should play a central role in the planning, conduct, interim analysis, final analysis and reporting of periodontal research, especially by postgraduate students. Indian postgraduate students in periodontics are aware of the importance of biostatistics in research, but the level of understanding and application is still basic and needs to be addressed.

  16. The Statistical Power of Planned Comparisons.

    ERIC Educational Resources Information Center

    Benton, Roberta L.

    Basic principles underlying statistical power are examined; and issues pertaining to effect size, sample size, error variance, and significance level are highlighted via the use of specific hypothetical examples. Analysis of variance (ANOVA) and related methods remain popular, although other procedures sometimes have more statistical power against…

  17. Analysis of pediatric blood lead levels in New York City for 1970-1976.

    PubMed Central

    Billick, I H; Curran, A S; Shier, D R

    1979-01-01

    A study was completed of more than 170,000 records of pediatric venous blood lead levels and supporting demographic information collected in New York City during 1970-1976. The geometric mean (GM) blood lead level shows a consistent cyclical variation superimposed on an overall decreasing trend with time for all ages and ethnic groups studied. The GM blood lead levels for blacks are significantly greater than those for either Hispanics or whites. Regression analysis indicates a significant statistical association between GM blood lead level and ambient air lead level, after appropriate adjustments are made for age and ethnic group. These highly significant statistical relationships provide extremely strong incentives and directions for research into causal factors related to blood lead levels in children. PMID:499123
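The geometric mean used for the blood lead levels above is simply the exponential of the mean log value. The sketch below uses invented values (in hypothetical µg/dL), not the study's data.

```python
import math

levels = [12.0, 18.0, 25.0, 9.0, 30.0]
gm = math.exp(sum(math.log(x) for x in levels) / len(levels))   # geometric mean
arith = sum(levels) / len(levels)                                # arithmetic mean
print(round(gm, 2), round(arith, 2))
```

For right-skewed data such as blood lead concentrations, the geometric mean sits below the arithmetic mean and is less sensitive to a few very high readings, which is why it is the conventional summary here.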

  18. Power Analysis in Two-Level Unbalanced Designs

    ERIC Educational Resources Information Center

    Konstantopoulos, Spyros

    2010-01-01

    Previous work on statistical power has discussed mainly single-level designs or 2-level balanced designs with random effects. Although balanced experiments are common, in practice balance cannot always be achieved. Work on class size is one example of unbalanced designs. This study provides methods for power analysis in 2-level unbalanced designs…

  19. Research Update: Spatially resolved mapping of electronic structure on atomic level by multivariate statistical analysis

    NASA Astrophysics Data System (ADS)

    Belianinov, Alex; Ganesh, Panchapakesan; Lin, Wenzhi; Sales, Brian C.; Sefat, Athena S.; Jesse, Stephen; Pan, Minghu; Kalinin, Sergei V.

    2014-12-01

    Atomic level spatial variability of electronic structure in Fe-based superconductor FeTe0.55Se0.45 (Tc = 15 K) is explored using current-imaging tunneling-spectroscopy. Multivariate statistical analysis of the data differentiates regions of dissimilar electronic behavior that can be identified with the segregation of chalcogen atoms, as well as boundaries between terminations and near neighbor interactions. Subsequent clustering analysis allows identification of the spatial localization of these dissimilar regions. Similar statistical analysis of modeled calculated density of states of chemically inhomogeneous FeTe1-xSex structures further confirms that the two types of chalcogens, i.e., Te and Se, can be identified by their electronic signature and differentiated by their local chemical environment. This approach allows detailed chemical discrimination of the scanning tunneling microscopy data including separation of atomic identities, proximity, and local configuration effects and can be universally applicable to chemically and electronically inhomogeneous surfaces.

  20. Statistical Analysis of a Large Sample Size Pyroshock Test Data Set Including Post Flight Data Assessment. Revision 1

    NASA Technical Reports Server (NTRS)

    Hughes, William O.; McNelis, Anne M.

    2010-01-01

    The Earth Observing System (EOS) Terra spacecraft was launched on an Atlas IIAS launch vehicle on its mission to observe planet Earth in late 1999. Prior to launch, the new design of the spacecraft's pyroshock separation system was characterized by a series of 13 separation ground tests. The analysis methods used to evaluate this unusually large amount of shock data will be discussed in this paper, with particular emphasis on population distributions and finding statistically significant families of data, leading to an overall shock separation interface level. The wealth of ground test data also allowed a derivation of a Mission Assurance level for the flight. All of the flight shock measurements were below the EOS Terra Mission Assurance level thus contributing to the overall success of the EOS Terra mission. The effectiveness of the statistical methodology for characterizing the shock interface level and for developing a flight Mission Assurance level from a large sample size of shock data is demonstrated in this paper.

  1. Low-Level Stratus Prediction Using Binary Statistical Regression: A Progress Report Using Moffett Field Data.

    DTIC Science & Technology

    1983-12-01

    analysis; such work is not reported here. It seems possible that a robust principal component analysis may be informative (see Gnanadesikan (1977... Statistics in Atmospheric Sciences, American Meteorological Soc., Boston, Mass. (1979) pp. 46-48. ... Gnanadesikan, R., Methods for Statistical Data...

  2. K-nearest neighbors based methods for identification of different gear crack levels under different motor speeds and loads: Revisited

    NASA Astrophysics Data System (ADS)

    Wang, Dong

    2016-03-01

    Gears are the most commonly used components in mechanical transmission systems. Their failures may cause transmission system breakdown and result in economic loss. Identification of different gear crack levels is important to prevent any unexpected gear failure, because gear cracks lead to gear tooth breakage. Signal-processing-based methods mainly require expertise to interpret gear fault signatures, which is usually not easy for ordinary users. In order to automatically identify different gear crack levels, intelligent gear crack identification methods should be developed. Previous case studies experimentally proved that K-nearest neighbors based methods exhibit high prediction accuracies for identification of 3 different gear crack levels under different motor speeds and loads. In this short communication, to further enhance the prediction accuracies of existing K-nearest neighbors based methods and extend identification of 3 different gear crack levels to identification of 5 different gear crack levels, redundant statistical features are constructed by using the Daubechies 44 (db44) binary wavelet packet transform at different wavelet decomposition levels, prior to the use of a K-nearest neighbors method. The dimensionality of the redundant statistical features is 620, which provides richer gear fault signatures. Since many of these statistical features are redundant and highly correlated with each other, dimensionality reduction is conducted to obtain new significant statistical features. Finally, the K-nearest neighbors method is used to identify 5 different gear crack levels under different motor speeds and loads. A case study including 3 experiments is investigated to demonstrate that the developed method provides higher prediction accuracies than the existing K-nearest neighbors based methods for recognizing different gear crack levels under different motor speeds and loads.
Based on the new significant statistical features, some other popular statistical models including linear discriminant analysis, quadratic discriminant analysis, classification and regression tree and naive Bayes classifier, are compared with the developed method. The results show that the developed method has the highest prediction accuracies among these statistical models. Additionally, selection of the number of new significant features and parameter selection of K-nearest neighbors are thoroughly investigated.
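The K-nearest-neighbors classification step at the core of this record can be sketched in a few lines. The two-dimensional "statistical feature" vectors and the crack-level labels below are made up for illustration; the actual method uses 620 wavelet-derived features reduced to a smaller significant set.

```python
import numpy as np

train_X = np.array([[0.10, 0.20], [0.15, 0.25],   # low crack level
                    [0.90, 0.80], [0.85, 0.90],   # high crack level
                    [0.50, 0.55], [0.45, 0.50]])  # intermediate level
train_y = np.array([0, 0, 2, 2, 1, 1])            # 0/1/2 = increasing crack level

def knn_predict(x, X, y, k=3):
    d = np.linalg.norm(X - x, axis=1)             # Euclidean distances to all samples
    nearest = y[np.argsort(d)[:k]]                # labels of the k closest samples
    return np.bincount(nearest).argmax()          # majority vote

print(knn_predict(np.array([0.12, 0.22]), train_X, train_y),
      knn_predict(np.array([0.88, 0.85]), train_X, train_y))
```

A new feature vector is assigned the crack level held by the majority of its k nearest training samples; choosing k and the feature subset is exactly the selection problem the record says it investigates.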

  3. A Systematic Review and Meta-Regression Analysis of Lung Cancer Risk and Inorganic Arsenic in Drinking Water.

    PubMed

    Lamm, Steven H; Ferdosi, Hamid; Dissen, Elisabeth K; Li, Ji; Ahn, Jaeil

    2015-12-07

    High levels (> 200 µg/L) of inorganic arsenic in drinking water are known to be a cause of human lung cancer, but the evidence at lower levels is uncertain. We have sought the epidemiological studies that have examined the dose-response relationship between arsenic levels in drinking water and the risk of lung cancer over a range that includes both high and low levels of arsenic. Regression analysis, based on six studies identified from an electronic search, examined the relationship between the log of the relative risk and the log of the arsenic exposure over a range of 1-1000 µg/L. The best-fitting continuous meta-regression model was sought and found to be a no-constant linear-quadratic analysis where both the risk and the exposure had been logarithmically transformed. This yielded both a statistically significant positive coefficient for the quadratic term and a statistically significant negative coefficient for the linear term. Sub-analyses by study design yielded results that were similar for both ecological studies and non-ecological studies. Statistically significant X-intercepts consistently found no increased level of risk at approximately 100-150 µg/L arsenic.

  4. A Systematic Review and Meta-Regression Analysis of Lung Cancer Risk and Inorganic Arsenic in Drinking Water

    PubMed Central

    Lamm, Steven H.; Ferdosi, Hamid; Dissen, Elisabeth K.; Li, Ji; Ahn, Jaeil

    2015-01-01

    High levels (> 200 µg/L) of inorganic arsenic in drinking water are known to be a cause of human lung cancer, but the evidence at lower levels is uncertain. We have sought the epidemiological studies that have examined the dose-response relationship between arsenic levels in drinking water and the risk of lung cancer over a range that includes both high and low levels of arsenic. Regression analysis, based on six studies identified from an electronic search, examined the relationship between the log of the relative risk and the log of the arsenic exposure over a range of 1–1000 µg/L. The best-fitting continuous meta-regression model was sought and found to be a no-constant linear-quadratic analysis where both the risk and the exposure had been logarithmically transformed. This yielded both a statistically significant positive coefficient for the quadratic term and a statistically significant negative coefficient for the linear term. Sub-analyses by study design yielded results that were similar for both ecological studies and non-ecological studies. Statistically significant X-intercepts consistently found no increased level of risk at approximately 100–150 µg/L arsenic. PMID:26690190
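The no-constant linear-quadratic fit described above can be sketched on made-up (dose, relative-risk) pairs: log RR is regressed on log dose and its square with no intercept, and the nonzero X-intercept (the dose where the fitted log RR returns to zero) is recovered. The data below are invented and are not the six studies analyzed in the paper.

```python
import numpy as np

dose = np.array([1., 10., 50., 150., 500., 1000.])   # µg/L (hypothetical)
rr = np.array([0.95, 0.85, 0.8, 1.0, 2.2, 4.0])      # relative risks (hypothetical)

x = np.log(dose)
X = np.column_stack([x, x**2])                       # no constant term
b, *_ = np.linalg.lstsq(X, np.log(rr), rcond=None)
x_intercept = np.exp(-b[0] / b[1])                   # dose where fitted log RR = 0
print(np.round(b, 3), round(float(x_intercept), 1))
```

With these invented points the linear coefficient comes out negative and the quadratic coefficient positive, reproducing the qualitative shape the meta-regression reports: no excess risk until roughly the 100 µg/L range, then risk rising with dose.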

  5. Statistical Assessment of Variability of Terminal Restriction Fragment Length Polymorphism Analysis Applied to Complex Microbial Communities ▿ †

    PubMed Central

    Rossi, Pierre; Gillet, François; Rohrbach, Emmanuelle; Diaby, Nouhou; Holliger, Christof

    2009-01-01

    The variability of terminal restriction fragment polymorphism analysis applied to complex microbial communities was assessed statistically. Recent technological improvements were implemented in the successive steps of the procedure, resulting in a standardized procedure which provided a high level of reproducibility. PMID:19749066

  6. Single-Level and Multilevel Mediation Analysis

    ERIC Educational Resources Information Center

    Tofighi, Davood; Thoemmes, Felix

    2014-01-01

    Mediation analysis is a statistical approach used to examine how the effect of an independent variable on an outcome is transmitted through an intervening variable (mediator). In this article, we provide a gentle introduction to single-level and multilevel mediation analyses. Using single-level data, we demonstrate an application of structural…
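The single-level mediation decomposition this record introduces can be shown numerically: the indirect effect is the product of the X-to-M path (a) and the M-to-Y path controlling for X (b). The data below are simulated with invented path coefficients, not taken from the article.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
X = rng.normal(size=n)
M = 0.5 * X + rng.normal(size=n)              # true a = 0.5
Y = 0.4 * M + 0.3 * X + rng.normal(size=n)    # true b = 0.4, direct effect 0.3

# Path a: regress M on X; path b: regress Y on X and M, take the M coefficient
a = np.linalg.lstsq(np.column_stack([np.ones(n), X]), M, rcond=None)[0][1]
b = np.linalg.lstsq(np.column_stack([np.ones(n), X, M]), Y, rcond=None)[0][2]
indirect = a * b                               # true value 0.5 * 0.4 = 0.2
print(round(float(indirect), 2))
```

In the multilevel case discussed by the article, a and b vary across clusters and the same product-of-paths logic is applied within a mixed-model framework.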

  7. A statistical approach to deriving subsystem specifications. [for spacecraft shock and vibrational environment tests

    NASA Technical Reports Server (NTRS)

    Keegan, W. B.

    1974-01-01

    In order to produce cost-effective environmental test programs, test specifications must be realistic and, to be useful, available early in the life of a program. This paper describes a method for achieving such specifications for subsystems by utilizing the results of a statistical analysis of data acquired at subsystem mounting locations during system-level environmental tests. The paper describes the details of this statistical analysis. The resultant recommended levels are a function of the subsystem's mounting location in the spacecraft. Methods of determining this mounting 'zone' are described. Recommendations are then made as to which of the various problem areas encountered should be pursued further.

  8. The Data from Aeromechanics Test and Analytics -- Management and Analysis Package (DATAMAP). Volume I. User’s Manual.

    DTIC Science & Technology

    1980-12-01

    to sound pressure level in decibels assuming a frequency of 1000 Hz. The perceived noisiness values are derived from a formula specified in... [table-of-contents fragments: ...Analyses, p. 244; 6.1.16 Perceived Noise Level Analysis, p. 249; 6.1.17 Acoustic Weighting Networks, p. 250; 6.2 Derivations...] [analysis-menu fragment: octave, third-octave and perceived-noise-level analyses; mean, variance and standard deviation calculation]

  9. Analysis of the dependence of extreme rainfalls

    NASA Astrophysics Data System (ADS)

    Padoan, Simone; Ancey, Christophe; Parlange, Marc

    2010-05-01

    The aim of spatial analysis is to quantitatively describe the behavior of environmental phenomena such as precipitation levels, wind speed or daily temperatures. A number of generic approaches to spatial modeling have been developed [1], but these are not necessarily ideal for handling extremal aspects given their focus on mean process levels. The areal modelling of the extremes of a natural process observed at points in space is important in environmental statistics; for example, understanding extremal spatial rainfall is crucial in flood protection. In light of recent concerns over climate change, the use of robust mathematical and statistical methods for such analyses has grown in importance. Multivariate extreme value models and the class of max-stable processes [2] have a similar asymptotic motivation to the univariate Generalized Extreme Value (GEV) distribution, but provide a general approach to modeling extreme processes that incorporates temporal or spatial dependence. Statistical methods for max-stable processes and data analyses of practical problems are discussed by [3] and [4]. This work illustrates methods for the statistical modelling of spatial extremes and gives examples of their use by means of an extremal data analysis of real Swiss precipitation levels. [1] Cressie, N. A. C. (1993). Statistics for Spatial Data. Wiley, New York. [2] de Haan, L. and Ferreira, A. (2006). Extreme Value Theory: An Introduction. Springer, USA. [3] Padoan, S. A., Ribatet, M. and Sisson, S. A. (2009). Likelihood-Based Inference for Max-Stable Processes. Journal of the American Statistical Association, Theory & Methods. In press. [4] Davison, A. C. and Gholamrezaee, M. (2009). Geostatistics of extremes. Journal of the Royal Statistical Society, Series B. To appear.
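The univariate GEV distribution mentioned above has a closed-form CDF, which the following sketch implements, including its Gumbel limit at shape parameter xi = 0. The parameter values used are hypothetical, not fitted to the Swiss precipitation data.

```python
import math

def gev_cdf(z, mu=0.0, sigma=1.0, xi=0.0):
    """GEV cumulative distribution function with location mu, scale sigma, shape xi."""
    s = (z - mu) / sigma
    if abs(xi) < 1e-12:                      # Gumbel limit as xi -> 0
        return math.exp(-math.exp(-s))
    t = 1.0 + xi * s
    if t <= 0.0:                             # outside the distribution's support
        return 0.0 if xi > 0 else 1.0
    return math.exp(-t ** (-1.0 / xi))

print(round(gev_cdf(0.0), 4))                # F(mu) = exp(-1) for any shape
```

Max-stable processes extend exactly this family to whole spatial fields: each site has GEV margins, while the process additionally encodes the dependence of extremes between sites.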

  10. Phosphorylated neurofilament heavy: A potential blood biomarker to evaluate the severity of acute spinal cord injuries in adults

    PubMed Central

    Singh, Ajai; Kumar, Vineet; Ali, Sabir; Mahdi, Abbas Ali; Srivastava, Rajeshwer Nath

    2017-01-01

    Aims: The aim of this study was to analyze serial estimations of phosphorylated neurofilament heavy (pNF-H) in blood plasma as a potential biomarker for early prediction of the neurological severity of acute spinal cord injuries (SCI) in adults. Settings and Design: Pilot/observational study. Subjects and Methods: A total of 40 patients (28 cases and 12 controls) with spine injury were included in this study. In the enrolled cases, the plasma level of pNF-H was evaluated in blood samples, and neurological evaluation was performed with the American Spinal Injury Association Injury Scale at specified time points. Serial plasma pNF-H values were then correlated with the neurological status of these patients during follow-up visits and analyzed statistically. Statistical Analysis Used: Statistical analysis was performed using GraphPad InStat software (version 3.05 for Windows, San Diego, CA, USA). The correlation between clinical progression and pNF-H expression was assessed using Spearman's correlation. Results: The mean baseline level of pNF-H in cases was 6.40 ± 2.49 ng/ml, whereas in controls it was 0.54 ± 0.27 ng/ml. On analyzing the association between the two groups by the Mann-Whitney U-test, the difference in levels was found to be statistically significant. The association between neurological progression and pNF-H expression was determined using correlation analysis (Spearman's correlation). At the 95% confidence interval, the correlation coefficient was 0.64, and the correlation was statistically significant. Conclusions: Plasma pNF-H levels were elevated in accordance with the severity of SCI. Therefore, pNF-H may be considered a potential biomarker for early determination of the severity of SCI in adult patients. PMID:29291173
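
    The abstract above relies on Spearman's rank correlation to relate pNF-H levels to neurological status. The statistic itself is simple to compute; the sketch below is illustrative (not the study's code) and uses the classic no-ties formula.

```python
def spearman_rho(x, y):
    """Spearman's rank correlation for samples without ties:
    rho = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1)),
    where d_i is the difference between the ranks of x_i and y_i."""
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0] * len(values)
        for rank, idx in enumerate(order, start=1):
            r[idx] = rank
        return r
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(ranks(x), ranks(y)))
    return 1.0 - 6.0 * d2 / (n * (n * n - 1))

# A perfectly monotone toy sample gives rho = 1.
rho = spearman_rho([1, 2, 3, 4], [10, 20, 30, 40])
```

    Ties would require averaged ranks; the no-ties form suffices to show the idea.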

  11. 41 CFR 60-2.35 - Compliance status.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... workforce (i.e., the employment of minorities or women at a percentage rate below, or above, the goal level... obligations will be determined by analysis of statistical data and other non-statistical information which...

  12. Teaching statistics in biology: using inquiry-based learning to strengthen understanding of statistical analysis in biology laboratory courses.

    PubMed

    Metz, Anneke M

    2008-01-01

    There is an increasing need for students in the biological sciences to build a strong foundation in quantitative approaches to data analyses. Although most science, engineering, and math field majors are required to take at least one statistics course, statistical analysis is poorly integrated into undergraduate biology course work, particularly at the lower-division level. Elements of statistics were incorporated into an introductory biology course, including a review of statistics concepts and opportunity for students to perform statistical analysis in a biological context. Learning gains were measured with an 11-item statistics learning survey instrument developed for the course. Students showed a statistically significant 25% (p < 0.005) increase in statistics knowledge after completing introductory biology. Students improved their scores on the survey after completing introductory biology, even if they had previously completed an introductory statistics course (9% improvement, p < 0.005). Students retested 1 yr after completing introductory biology showed no loss of their statistics knowledge as measured by this instrument, suggesting that the use of statistics in biology course work may aid long-term retention of statistics knowledge. No statistically significant differences in learning were detected between male and female students in the study.

  13. Cognition, comprehension and application of biostatistics in research by Indian postgraduate students in periodontics

    PubMed Central

    Swetha, Jonnalagadda Laxmi; Arpita, Ramisetti; Srikanth, Chintalapani; Nutalapati, Rajasekhar

    2014-01-01

    Background: Biostatistics is an integral part of research protocols. In any field of inquiry or investigation, the data obtained are subsequently classified, analyzed and tested for accuracy by statistical methods. Statistical analysis of collected data thus forms the basis for all evidence-based conclusions. Aim: The aim of this study was to evaluate the cognition, comprehension and application of biostatistics in research among postgraduate students in periodontics in India. Materials and Methods: A total of 391 postgraduate students registered for a master's course in periodontics at various dental colleges across India were included in the survey. Data regarding the level of knowledge, understanding and application in the design and conduct of research protocols were collected using a dichotomous questionnaire. Descriptive statistics were used for data analysis. Results: Overall, 79.2% of students were aware of the importance of biostatistics in research, 55-65% were familiar with MS-Excel spreadsheets for graphical representation of data and with the statistical software packages available on the internet, 26.0% had biostatistics as a mandatory subject in their curriculum, 9.5% tried to perform statistical analysis on their own, while 3.0% were successful in performing the statistical analysis of their studies on their own. Conclusion: Biostatistics should play a central role in the planning, conduct, interim analysis, final analysis and reporting of periodontal research, especially by postgraduate students. Indian postgraduate students in periodontics are aware of the importance of biostatistics in research, but their level of understanding and application is still basic and needs to be addressed. PMID:24744547

  14. Statistical models for the analysis and design of digital polymerase chain (dPCR) experiments

    USGS Publications Warehouse

    Dorazio, Robert; Hunter, Margaret

    2015-01-01

    Statistical methods for the analysis and design of experiments using digital PCR (dPCR) have received only limited attention and have been misused in many instances. To address this issue and to provide a more general approach to the analysis of dPCR data, we describe a class of statistical models for the analysis and design of experiments that require quantification of nucleic acids. These models are mathematically equivalent to generalized linear models of binomial responses that include a complementary log-log link function and an offset that is dependent on the dPCR partition volume. These models are both versatile and easy to fit using conventional statistical software. Covariates can be used to specify different sources of variation in nucleic acid concentration, and a model’s parameters can be used to quantify the effects of these covariates. For purposes of illustration, we analyzed dPCR data from different types of experiments, including serial dilution, evaluation of copy number variation, and quantification of gene expression. We also showed how these models can be used to help design dPCR experiments, as in selection of sample sizes needed to achieve desired levels of precision in estimates of nucleic acid concentration or to detect differences in concentration among treatments with prescribed levels of statistical power.
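
    The binomial GLM with a complementary log-log link and a partition-volume offset reduces, in the intercept-only case, to the familiar closed-form dPCR concentration estimate lambda = -ln(1 - k/n) / v. A minimal sketch of that special case (the counts and partition volume below are illustrative, not taken from the paper):

```python
import math

def dpcr_concentration(k_positive, n_partitions, partition_volume):
    """Closed-form MLE of target concentration from dPCR counts:
    lambda = -ln(1 - k/n) / v. This is the intercept-only case of the
    binomial GLM with complementary log-log link and offset log(v)."""
    p_hat = k_positive / n_partitions  # fraction of positive partitions
    return -math.log(1.0 - p_hat) / partition_volume

# Illustrative run: 5,000 of 20,000 partitions positive, 0.85 nL each
# (volume in microlitres, so the result is in copies per microlitre).
conc = dpcr_concentration(5000, 20000, 0.85e-3)
```

    Covariates and multiple samples require fitting the full GLM in statistical software, as the abstract describes; the closed form only covers a single homogeneous sample.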

  15. Statistical Models for the Analysis and Design of Digital Polymerase Chain Reaction (dPCR) Experiments.

    PubMed

    Dorazio, Robert M; Hunter, Margaret E

    2015-11-03

    Statistical methods for the analysis and design of experiments using digital PCR (dPCR) have received only limited attention and have been misused in many instances. To address this issue and to provide a more general approach to the analysis of dPCR data, we describe a class of statistical models for the analysis and design of experiments that require quantification of nucleic acids. These models are mathematically equivalent to generalized linear models of binomial responses that include a complementary log-log link function and an offset that is dependent on the dPCR partition volume. These models are both versatile and easy to fit using conventional statistical software. Covariates can be used to specify different sources of variation in nucleic acid concentration, and a model's parameters can be used to quantify the effects of these covariates. For purposes of illustration, we analyzed dPCR data from different types of experiments, including serial dilution, evaluation of copy number variation, and quantification of gene expression. We also showed how these models can be used to help design dPCR experiments, as in selection of sample sizes needed to achieve desired levels of precision in estimates of nucleic acid concentration or to detect differences in concentration among treatments with prescribed levels of statistical power.

  16. Statistical Power Analysis with Microsoft Excel: Normal Tests for One or Two Means as a Prelude to Using Non-Central Distributions to Calculate Power

    ERIC Educational Resources Information Center

    Texeira, Antonio; Rosa, Alvaro; Calapez, Teresa

    2009-01-01

    This article presents statistical power analysis (SPA) based on the normal distribution using Excel, adopting textbook and SPA approaches. The objective is to present the latter in a comparative way within a framework that is familiar to textbook level readers, as a first step to understand SPA with other distributions. The analysis focuses on the…
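
    The normal-distribution power calculation that the article builds in Excel can equally be written in a few lines of code. The sketch below is a standard textbook approximation for a two-sided, two-sample z-test (not the authors' worksheet), dropping the negligible opposite-tail term:

```python
from statistics import NormalDist

def two_sample_z_power(delta, sigma, n_per_group, alpha=0.05):
    """Approximate power of a two-sided two-sample z (normal) test:
    power ~ Phi(|delta|/SE - z_{1-alpha/2}), with SE = sigma*sqrt(2/n).
    The negligible opposite-tail term is dropped."""
    nd = NormalDist()
    z_crit = nd.inv_cdf(1.0 - alpha / 2.0)
    se = sigma * (2.0 / n_per_group) ** 0.5
    return nd.cdf(abs(delta) / se - z_crit)

# Classic benchmark: d = 0.5 with 64 subjects per group gives about 80% power.
power = two_sample_z_power(delta=0.5, sigma=1.0, n_per_group=64)
```

    Power with non-central t or other distributions, the article's end goal, follows the same template with the normal CDF swapped out.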

  17. SOCR: Statistics Online Computational Resource

    PubMed Central

    Dinov, Ivo D.

    2011-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result, a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students’ intuition and enhance their learning. PMID:21451741

  18. Forest statistics for New Hampshire

    Treesearch

    Thomas S. Frieswyk; Anne M. Malley

    1985-01-01

    This is a statistical report on the fourth forest survey of New Hampshire conducted in 1982-83 by the Forest Inventory and Analysis Unit, Northeastern Forest Experiment Station. Statistics for forest area, numbers of trees, timber volume, tree biomass, and timber products output are displayed at the state, unit, and county levels. The current inventory indicates that...

  19. New Statistical Probe into the Decline of Daily Newspaper Household Penetration.

    ERIC Educational Resources Information Center

    Alperstein, Gerald

    From 1950 to 1970, daily newspaper household penetration (DNHP) levels dropped from 1.24 to 0.99 in the United States. This paper describes some of the variables involved in this decline and outlines a market-by-market statistical analysis of the relationship between the penetration levels of daily newspapers and other forms of mass media. From…

  20. A General Framework for Power Analysis to Detect the Moderator Effects in Two- and Three-Level Cluster Randomized Trials

    ERIC Educational Resources Information Center

    Dong, Nianbo; Spybrook, Jessaca; Kelcey, Ben

    2016-01-01

    The purpose of this study is to propose a general framework for power analyses to detect the moderator effects in two- and three-level cluster randomized trials (CRTs). The study specifically aims to: (1) develop the statistical formulations for calculating statistical power, minimum detectable effect size (MDES) and its confidence interval to…

  1. Gene Level Meta-Analysis of Quantitative Traits by Functional Linear Models.

    PubMed

    Fan, Ruzong; Wang, Yifan; Boehnke, Michael; Chen, Wei; Li, Yun; Ren, Haobo; Lobach, Iryna; Xiong, Momiao

    2015-08-01

    Meta-analysis of genetic data must account for differences among studies, including study designs, markers genotyped, and covariates. The effects of genetic variants may differ from population to population, i.e., there is heterogeneity. Thus, meta-analysis combining data from multiple studies is difficult, and novel statistical methods for meta-analysis are needed. In this article, functional linear models are developed for meta-analyses that connect genetic data to quantitative traits, adjusting for covariates. The models can be used to analyze rare variants, common variants, or a combination of the two. Both likelihood-ratio test (LRT) and F-distributed statistics are introduced to test association between quantitative traits and multiple variants in one genetic region. Extensive simulations are performed to evaluate empirical type I error rates and the power performance of the proposed tests. The proposed LRT and F-distributed statistics control the type I error very well and have higher power than the existing method, the meta-analysis sequence kernel association test (MetaSKAT). We analyze four blood lipid levels in data from a meta-analysis of eight European studies. The proposed methods detect more significant associations than MetaSKAT, and the P-values of the proposed LRT and F-distributed statistics are usually much smaller than those of MetaSKAT. The functional linear models and related test statistics can be useful in whole-genome and whole-exome association studies. Copyright © 2015 by the Genetics Society of America.

  2. Dealing with missing standard deviation and mean values in meta-analysis of continuous outcomes: a systematic review.

    PubMed

    Weir, Christopher J; Butcher, Isabella; Assi, Valentina; Lewis, Stephanie C; Murray, Gordon D; Langhorne, Peter; Brady, Marian C

    2018-03-07

    Rigorous, informative meta-analyses rely on availability of appropriate summary statistics or individual participant data. For continuous outcomes, especially those with naturally skewed distributions, summary information on the mean or variability often goes unreported. While full reporting of original trial data is the ideal, we sought to identify methods for handling unreported mean or variability summary statistics in meta-analysis. We undertook two systematic literature reviews to identify methodological approaches used to deal with missing mean or variability summary statistics. Five electronic databases were searched, in addition to the Cochrane Colloquium abstract books and the Cochrane Statistics Methods Group mailing list archive. We also conducted cited reference searching and emailed topic experts to identify recent methodological developments. Details recorded included the description of the method, the information required to implement the method, any underlying assumptions and whether the method could be readily applied in standard statistical software. We provided a summary description of the methods identified, illustrating selected methods in example meta-analysis scenarios. For missing standard deviations (SDs), following screening of 503 articles, fifteen methods were identified in addition to those reported in a previous review. These included Bayesian hierarchical modelling at the meta-analysis level; summary statistic level imputation based on observed SD values from other trials in the meta-analysis; a practical approximation based on the range; and algebraic estimation of the SD based on other summary statistics. Following screening of 1124 articles for methods estimating the mean, one approximate Bayesian computation approach and three papers based on alternative summary statistics were identified. Illustrative meta-analyses showed that when replacing a missing SD the approximation using the range minimised loss of precision and generally performed better than omitting trials. When estimating missing means, a formula using the median, lower quartile and upper quartile performed best in preserving the precision of the meta-analysis findings, although in some scenarios, omitting trials gave superior results. Methods based on summary statistics (minimum, maximum, lower quartile, upper quartile, median) reported in the literature facilitate more comprehensive inclusion of randomised controlled trials with missing mean or variability summary statistics within meta-analyses.
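
    Two of the simplest methods of the kind surveyed above can be stated in one line each. The specific constants used here (range/4 for the SD, and an equally weighted quartile formula for the mean) are common textbook choices; they stand in for, rather than reproduce, the exact formulas evaluated in the review.

```python
def sd_from_range(minimum, maximum):
    """Approximate a missing standard deviation as range/4,
    a common practical rule of thumb."""
    return (maximum - minimum) / 4.0

def mean_from_quartiles(q1, median, q3):
    """Approximate a missing mean by weighting the median and
    quartiles equally."""
    return (q1 + median + q3) / 3.0

sd_est = sd_from_range(0.0, 8.0)               # -> 2.0
mean_est = mean_from_quartiles(2.0, 4.0, 6.0)  # -> 4.0
```

    Both approximations degrade for small samples or heavily skewed outcomes, which is why the review compares several alternatives.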

  3. Measurement and statistical analysis of single-molecule current-voltage characteristics, transition voltage spectroscopy, and tunneling barrier height.

    PubMed

    Guo, Shaoyin; Hihath, Joshua; Díez-Pérez, Ismael; Tao, Nongjian

    2011-11-30

    We report on the measurement and statistical study of thousands of current-voltage characteristics and transition voltage spectra (TVS) of single-molecule junctions with different contact geometries, rapidly acquired using a new break junction method at room temperature. This capability allows one to obtain current-voltage, conductance-voltage, and transition-voltage histograms, thus adding a new dimension to the previous conductance-histogram analysis at a fixed low-bias voltage for single molecules. This method confirms the low-bias conductance values of alkanedithiols and biphenyldithiol reported in the literature. However, at high biases the current shows large nonlinearity and asymmetry, and TVS allows for the determination of a critically important parameter, the tunneling barrier height or energy level alignment between the molecule and the electrodes of single-molecule junctions. The energy level alignment is found to depend on the molecule and also on the contact geometry, revealing the role of contact geometry in both the contact resistance and energy level alignment of a molecular junction. Detailed statistical analysis further reveals that, despite the dependence of the energy level alignment on contact geometry, the variation in single-molecule conductance is primarily due to contact resistance rather than variations in the energy level alignment.

  4. Research Update: Spatially resolved mapping of electronic structure on atomic level by multivariate statistical analysis

    DOE PAGES

    Belianinov, Alex; Panchapakesan, G.; Lin, Wenzhi; ...

    2014-12-02

    Atomic level spatial variability of electronic structure in the Fe-based superconductor FeTe0.55Se0.45 (Tc = 15 K) is explored using current-imaging tunneling spectroscopy. Multivariate statistical analysis of the data differentiates regions of dissimilar electronic behavior that can be identified with the segregation of chalcogen atoms, as well as boundaries between terminations and near-neighbor interactions. Subsequent clustering analysis allows identification of the spatial localization of these dissimilar regions. Similar statistical analysis of the calculated density of states of chemically inhomogeneous FeTe1-xSex structures further confirms that the two types of chalcogens, i.e., Te and Se, can be identified by their electronic signature and differentiated by their local chemical environment. This approach allows detailed chemical discrimination of the scanning tunneling microscopy data, including separation of atomic identities, proximity, and local configuration effects, and can be universally applicable to chemically and electronically inhomogeneous surfaces.
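
    The clustering step described above groups locations by electronic signature. As a minimal, generic stand-in for whatever clustering algorithm the authors used, a plain k-means on a scalar per-pixel feature looks like this (the data and initial centers are illustrative):

```python
def kmeans_1d(values, centers, iters=20):
    """Plain k-means on scalar features. Initial centers are passed
    in explicitly so the run is deterministic."""
    for _ in range(iters):
        groups = [[] for _ in centers]
        for v in values:
            # assign each value to its nearest current center
            nearest = min(range(len(centers)), key=lambda j: abs(v - centers[j]))
            groups[nearest].append(v)
        # recompute each center as the mean of its assigned values
        centers = [sum(g) / len(g) if g else c for g, c in zip(groups, centers)]
    return centers

# Two well-separated groups of toy "per-pixel" values.
centers = kmeans_1d([0.1, 0.2, 0.15, 0.9, 1.0, 1.1], [0.0, 1.0])
```

    Real spectroscopic data are vector-valued (one spectrum per pixel); the same loop applies with a vector distance in place of abs().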

  5. Research Update: Spatially resolved mapping of electronic structure on atomic level by multivariate statistical analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belianinov, Alex, E-mail: belianinova@ornl.gov; Ganesh, Panchapakesan; Lin, Wenzhi

    2014-12-01

    Atomic level spatial variability of electronic structure in the Fe-based superconductor FeTe0.55Se0.45 (Tc = 15 K) is explored using current-imaging tunneling spectroscopy. Multivariate statistical analysis of the data differentiates regions of dissimilar electronic behavior that can be identified with the segregation of chalcogen atoms, as well as boundaries between terminations and near-neighbor interactions. Subsequent clustering analysis allows identification of the spatial localization of these dissimilar regions. Similar statistical analysis of the calculated density of states of chemically inhomogeneous FeTe1-xSex structures further confirms that the two types of chalcogens, i.e., Te and Se, can be identified by their electronic signature and differentiated by their local chemical environment. This approach allows detailed chemical discrimination of the scanning tunneling microscopy data, including separation of atomic identities, proximity, and local configuration effects, and can be universally applicable to chemically and electronically inhomogeneous surfaces.

  6. Teaching Statistics in Biology: Using Inquiry-based Learning to Strengthen Understanding of Statistical Analysis in Biology Laboratory Courses

    PubMed Central

    2008-01-01

    There is an increasing need for students in the biological sciences to build a strong foundation in quantitative approaches to data analyses. Although most science, engineering, and math field majors are required to take at least one statistics course, statistical analysis is poorly integrated into undergraduate biology course work, particularly at the lower-division level. Elements of statistics were incorporated into an introductory biology course, including a review of statistics concepts and opportunity for students to perform statistical analysis in a biological context. Learning gains were measured with an 11-item statistics learning survey instrument developed for the course. Students showed a statistically significant 25% (p < 0.005) increase in statistics knowledge after completing introductory biology. Students improved their scores on the survey after completing introductory biology, even if they had previously completed an introductory statistics course (9% improvement, p < 0.005). Students retested 1 yr after completing introductory biology showed no loss of their statistics knowledge as measured by this instrument, suggesting that the use of statistics in biology course work may aid long-term retention of statistics knowledge. No statistically significant differences in learning were detected between male and female students in the study. PMID:18765754

  7. Long-term sea level trends: Natural or anthropogenic?

    NASA Astrophysics Data System (ADS)

    Becker, M.; Karpytchev, M.; Lennartz-Sassinek, S.

    2014-08-01

    Detection and attribution of human influence on sea level rise are important topics that have not yet been explored in depth. We question whether the sea level changes (SLC) over the past century were natural in origin. SLC exhibit power-law long-term correlations. By estimating the Hurst exponent through Detrended Fluctuation Analysis and by applying the statistics of Lennartz and Bunde, we search for the lower bounds of statistically significant external sea level trends in the longest tidal records worldwide. We provide statistical evidence that the observed SLC, at global and regional scales, is beyond its natural internal variability. The minimum anthropogenic sea level trend (MASLT) contributes more than 50% of the observed sea level rise in New York, Baltimore, San Diego, Marseille, and Mumbai. The MASLT is about 1 mm/yr in global sea level reconstructions, which is more than half of the total observed sea level trend during the twentieth century.
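
    Detrended Fluctuation Analysis, used above to estimate the Hurst exponent of sea level records, can be sketched in a few dozen lines. This is a generic first-order DFA implementation under simplifying assumptions (non-overlapping windows, a fixed set of scales), not the authors' code:

```python
import math
import random

def dfa_exponent(series, scales=(8, 16, 32, 64, 128)):
    """First-order Detrended Fluctuation Analysis. Returns the scaling
    exponent alpha: ~0.5 for uncorrelated noise, >0.5 for long-term
    correlated series. Uses non-overlapping windows for simplicity."""
    mean = sum(series) / len(series)
    profile, running = [], 0.0
    for x in series:                      # integrated, mean-removed profile
        running += x - mean
        profile.append(running)

    log_s, log_f = [], []
    for s in scales:
        n_win = len(profile) // s
        t = list(range(s))
        t_mean = sum(t) / s
        t_var = sum((ti - t_mean) ** 2 for ti in t)
        sq_sum = 0.0
        for w in range(n_win):            # detrend each window linearly
            seg = profile[w * s:(w + 1) * s]
            z_mean = sum(seg) / s
            slope = sum((ti - t_mean) * (zi - z_mean)
                        for ti, zi in zip(t, seg)) / t_var
            icept = z_mean - slope * t_mean
            sq_sum += sum((zi - (icept + slope * ti)) ** 2
                          for ti, zi in zip(t, seg))
        log_s.append(math.log(s))
        log_f.append(math.log((sq_sum / (n_win * s)) ** 0.5))

    # alpha = least-squares slope of log F(s) versus log s
    m = len(log_s)
    xm, ym = sum(log_s) / m, sum(log_f) / m
    return (sum((a - xm) * (b - ym) for a, b in zip(log_s, log_f)) /
            sum((a - xm) ** 2 for a in log_s))

# White noise should give alpha near 0.5.
random.seed(1)
alpha = dfa_exponent([random.gauss(0, 1) for _ in range(2000)])
```

    Tide gauge series with alpha well above 0.5 exhibit the long-term correlations that the Lennartz and Bunde significance bounds account for.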

  8. Clustered Stomates in "Begonia": An Exercise in Data Collection & Statistical Analysis of Biological Space

    ERIC Educational Resources Information Center

    Lau, Joann M.; Korn, Robert W.

    2007-01-01

    In this article, the authors present a laboratory exercise in data collection and statistical analysis in biological space using clustered stomates on leaves of "Begonia" plants. The exercise can be done in middle school classes by students making their own slides and seeing imprints of cells, or at the high school level through collecting data of…

  9. Quantifying the impact of between-study heterogeneity in multivariate meta-analyses

    PubMed Central

    Jackson, Dan; White, Ian R; Riley, Richard D

    2012-01-01

    Measures that quantify the impact of heterogeneity in univariate meta-analysis, including the very popular I2 statistic, are now well established. Multivariate meta-analysis, where studies provide multiple outcomes that are pooled in a single analysis, is also becoming more commonly used. The question of how to quantify heterogeneity in the multivariate setting is therefore raised. It is the univariate R2 statistic, the ratio of the variance of the estimated treatment effect under the random and fixed effects models, that generalises most naturally, so this statistic provides our basis. This statistic is then used to derive a multivariate analogue of I2. We also provide a multivariate H2 statistic, the ratio of a generalisation of Cochran's heterogeneity statistic and its associated degrees of freedom, with an accompanying generalisation of the usual I2 statistic. Our proposed heterogeneity statistics can be used alongside all the usual estimates and inferential procedures used in multivariate meta-analysis. We apply our methods to some real datasets and show how our statistics are equally appropriate in the context of multivariate meta-regression, where study-level covariate effects are included in the model. Our heterogeneity statistics may be used when applying any procedure for fitting the multivariate random effects model. Copyright © 2012 John Wiley & Sons, Ltd. PMID:22763950
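
    The univariate I2 statistic that this paper generalises is easy to compute from study-level effects and variances via Cochran's Q. A minimal illustration with toy numbers:

```python
def i_squared(effects, variances):
    """Cochran's Q and the univariate I2 heterogeneity statistic
    (as a percentage), computed under a fixed-effect model."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * y for w, y in zip(weights, effects)) / sum(weights)
    q = sum(w * (y - pooled) ** 2 for w, y in zip(weights, effects))
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0
    return q, i2

# Toy example: three studies with equal variances.
q, i2 = i_squared([0.1, 0.3, 0.5], [0.01, 0.01, 0.01])
```

    The multivariate versions in the paper replace these scalar quantities with matrix analogues, but the interpretation of the resulting percentage is the same.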

  10. The Relationship between Zinc Levels and Autism: A Systematic Review and Meta-analysis.

    PubMed

    Babaknejad, Nasim; Sayehmiri, Fatemeh; Sayehmiri, Kourosh; Mohamadkhani, Ashraf; Bahrami, Somaye

    2016-01-01

    Autism is a complex behaviorally defined disorder. There is a relationship between zinc (Zn) levels in autistic patients and the development of pathogenesis, but the conclusion is not definitive. The present study was conducted to estimate this probability using meta-analysis. Using a fixed-effect model, twelve articles published from 1978 to 2012 were selected by searching Google Scholar, PubMed, ISI Web of Science, and Scopus, and the extracted information was analyzed. I2 statistics were calculated to examine heterogeneity. The information was analyzed using R and STATA Ver. 12.2. There was no statistically significant difference in hair, nail, and teeth Zn levels between controls and autistic patients: -0.471 [95% confidence interval (95% CI): -1.172 to 0.231]. There was a statistically significant difference in plasma Zn concentration between autistic patients and healthy controls: -0.253 (95% CI: -0.498 to -0.007). Using a random-effect model, the overall pooled estimate from the two groups was -0.414 (95% CI: -0.878 to -0.051). Based on sensitivity analysis, zinc supplements can be used as nutritional therapy for autistic patients.
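
    The fixed-effect pooling underlying estimates such as those above is inverse-variance weighting. A generic sketch (not the authors' R/STATA code; the inputs are illustrative):

```python
from statistics import NormalDist

def fixed_effect_pool(effects, std_errors, conf=0.95):
    """Inverse-variance (fixed-effect) pooled estimate with a
    normal-approximation confidence interval."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * y for w, y in zip(weights, effects)) / sum(weights)
    se_pooled = (1.0 / sum(weights)) ** 0.5
    z = NormalDist().inv_cdf(0.5 + conf / 2.0)
    return pooled, (pooled - z * se_pooled, pooled + z * se_pooled)

# Toy example: two equally precise studies.
pooled, (lo, hi) = fixed_effect_pool([1.0, 3.0], [1.0, 1.0])
```

    A random-effects model, also used in the abstract, adds a between-study variance component to each weight.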

  11. Anger and depression levels of mothers with premature infants in the neonatal intensive care unit.

    PubMed

    Kardaşözdemir, Funda; Akgün Şahin, Zümrüt

    2016-02-04

    The aim of this study was to examine the anger and depression levels of mothers who had a premature infant in the NICU, and the factors affecting them. This descriptive study was performed in the level I and II NICU units at three state hospitals in Turkey. Data were collected with a demographic questionnaire, the Beck Depression Inventory and the Anger Expression Scale. Descriptive statistics, parametric and nonparametric statistical tests and Pearson correlation were used in the data analysis. Mothers whose infants were under care in the NICU had moderate depression. Mothers' educational level, income level and the gender of the infant were also statistically significant factors (p < 0.05). A positive relationship between depression and trait anger scores was found to be statistically significant. A negative relationship between depression and anger-control scores was also statistically significant (p < 0.05). Based on these results, it is recommended that mothers at risk of depression and anger in the NICU be evaluated by nurses, and that nurses develop their counseling roles.

  12. Forest statistics for Vermont: 1973 and 1983

    Treesearch

    Thomas S. Frieswyk; Anne M. Malley

    1985-01-01

    A statistical report on the fourth forest survey of Vermont conducted in 1982-1983 by the Forest Inventory and Analysis Unit, Northeastern Forest Experiment Station. Statistics for forest area, numbers of trees, timber volume, tree biomass, and timber products output are displayed at the state, unit, and county levels. The current inventory indicates that the state has...

  13. Forest statistics for Delaware: 1986 and 1999

    Treesearch

    Douglas M. Griffith; Richard H. Widmann; Richard H. Widmann

    2001-01-01

    A statistical report on the fourth forest inventory of Delaware conducted in 1999 by the Forest Inventory and Analysis Unit of the Northeastern Research Station. Statistics for forest area, numbers of trees, tree biomass, timber volume, growth, and change are displayed at the state and, where appropriate, the county level. The current inventory indicates that there are...

  14. Forest statistics for West Virginia: 1989 and 2000

    Treesearch

    Douglas M. Griffith; Richard H. Widmann

    2003-01-01

    A statistical report on the fifth forest inventory of West Virginia conducted in 2000 by the Forest Inventory and Analysis unit of the Northeastern Research Station. Statistics for forest area, numbers of trees, tree biomass, timber volume, growth, and change are displayed at the state and, where appropriate, the county level. The current inventory indicates that there...

  15. The skeletal maturation status estimated by statistical shape analysis: axial images of Japanese cervical vertebra.

    PubMed

    Shin, S M; Kim, Y-I; Choi, Y-S; Yamaguchi, T; Maki, K; Cho, B-H; Park, S-B

    2015-01-01

    To evaluate axial cervical vertebral (ACV) shape quantitatively and to build a prediction model for skeletal maturation level using statistical shape analysis for Japanese individuals. The sample included 24 female and 19 male patients with hand-wrist radiographs and CBCT images. Through generalized Procrustes analysis and principal components (PCs) analysis, the meaningful PCs were extracted from each ACV shape and analysed for the estimation regression model. Each ACV shape had meaningful PCs, except for the second axial cervical vertebra. Based on these models, the smallest prediction intervals (PIs) were from the combination of the shape space PCs, age and gender. Overall, the PIs of the male group were smaller than those of the female group. There was no significant correlation between centroid size as a size factor and skeletal maturation level. Our findings suggest that the ACV maturation method, which was applied by statistical shape analysis, could confirm information about skeletal maturation in Japanese individuals as an available quantifier of skeletal maturation and could be as useful a quantitative method as the skeletal maturation index.
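
    Centroid size, the size factor tested above against skeletal maturation level, has a simple definition: the square root of the summed squared distances of the landmarks from their centroid. A sketch for 2-D landmarks (the unit-square example is illustrative, not vertebral data):

```python
import math

def centroid_size(landmarks):
    """Centroid size of a 2-D landmark configuration: the square root
    of the summed squared distances of the landmarks from their centroid."""
    cx = sum(p[0] for p in landmarks) / len(landmarks)
    cy = sum(p[1] for p in landmarks) / len(landmarks)
    return math.sqrt(sum((p[0] - cx) ** 2 + (p[1] - cy) ** 2
                         for p in landmarks))

# Unit square: each corner lies sqrt(0.5) from the centroid (0.5, 0.5).
cs = centroid_size([(0, 0), (1, 0), (1, 1), (0, 1)])
```

    Generalized Procrustes analysis divides each configuration by its centroid size, which is why size and shape can be tested separately, as in the abstract.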

  16. The skeletal maturation status estimated by statistical shape analysis: axial images of Japanese cervical vertebra

    PubMed Central

    Shin, S M; Choi, Y-S; Yamaguchi, T; Maki, K; Cho, B-H; Park, S-B

    2015-01-01

    Objectives: To evaluate axial cervical vertebral (ACV) shape quantitatively and to build a prediction model for skeletal maturation level using statistical shape analysis for Japanese individuals. Methods: The sample included 24 female and 19 male patients with hand–wrist radiographs and CBCT images. Through generalized Procrustes analysis and principal components (PCs) analysis, the meaningful PCs were extracted from each ACV shape and analysed for the estimation regression model. Results: Each ACV shape had meaningful PCs, except for the second axial cervical vertebra. Based on these models, the smallest prediction intervals (PIs) were from the combination of the shape space PCs, age and gender. Overall, the PIs of the male group were smaller than those of the female group. There was no significant correlation between centroid size as a size factor and skeletal maturation level. Conclusions: Our findings suggest that the ACV maturation method, which was applied by statistical shape analysis, could confirm information about skeletal maturation in Japanese individuals as an available quantifier of skeletal maturation and could be as useful a quantitative method as the skeletal maturation index. PMID:25411713

  17. Extreme Statistics of Storm Surges in the Baltic Sea

    NASA Astrophysics Data System (ADS)

    Kulikov, E. A.; Medvedev, I. P.

    2017-11-01

    Statistical analysis of extreme Baltic Sea levels has been performed on observation series spanning 15-125 years at 13 tide gauge stations. It is shown that the empirical relation between the value of an extreme sea level rise or ebb (caused by storm events) and its return period in the Baltic Sea is well approximated by the Gumbel probability distribution. The largest extreme floods/ebbs of 100-year recurrence were observed in the Gulf of Finland and the Gulf of Riga. The two longest data series, observed in Stockholm and Vyborg over 125 years, show a significant deviation from the Gumbel distribution for the rarest events. Statistical analysis of the hourly sea level data reveals some asymmetry in the variability of the Baltic Sea level: the probability of rises proved higher than that of ebbs, and the magnitude of the 100-year surge considerably exceeds the magnitude of ebbs almost everywhere. This asymmetry can be attributed to the influence of low atmospheric pressure during storms. A statistical study of extreme values has also been applied to sea level series for Narva over the period 1994-2000, simulated by the ROMS numerical model. Comparison of the simulated and observed extreme sea level distributions shows that the model reproduces extreme floods of moderate magnitude quite satisfactorily; however, it underestimates sea level changes for the most powerful storm surges.
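
    The Gumbel fit described in this record can be sketched in a few lines. Below is a minimal method-of-moments version (the paper does not state its exact fitting procedure, and the annual maxima here are invented for illustration, not tide-gauge records):

```python
import math
import statistics

EULER_GAMMA = 0.5772156649015329

def gumbel_fit(maxima):
    """Method-of-moments fit of the Gumbel distribution to annual maxima."""
    mean = statistics.fmean(maxima)
    sd = statistics.stdev(maxima)
    beta = sd * math.sqrt(6) / math.pi   # scale parameter
    mu = mean - EULER_GAMMA * beta       # location parameter
    return mu, beta

def return_level(mu, beta, T):
    """Level expected to be exceeded once every T years on average."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

# Hypothetical annual sea-level maxima (cm), for illustration only
maxima = [82, 95, 74, 110, 88, 101, 93, 79, 120, 85, 97, 90]
mu, beta = gumbel_fit(maxima)
print(round(return_level(mu, beta, 100), 1))  # 100-year return level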

  18. A Powerful Procedure for Pathway-Based Meta-analysis Using Summary Statistics Identifies 43 Pathways Associated with Type II Diabetes in European Populations.

    PubMed

    Zhang, Han; Wheeler, William; Hyland, Paula L; Yang, Yifan; Shi, Jianxin; Chatterjee, Nilanjan; Yu, Kai

    2016-06-01

    Meta-analysis of multiple genome-wide association studies (GWAS) has become an effective approach for detecting single nucleotide polymorphism (SNP) associations with complex traits. However, it is difficult to integrate the readily accessible SNP-level summary statistics from a meta-analysis into more powerful multi-marker testing procedures, which generally require individual-level genetic data. We developed a general procedure called Summary based Adaptive Rank Truncated Product (sARTP) for conducting gene and pathway meta-analysis that uses only SNP-level summary statistics in combination with genotype correlation estimated from a panel of individual-level genetic data. We demonstrated the validity and power advantage of sARTP through empirical and simulated data. We conducted a comprehensive pathway-based meta-analysis with sARTP on type 2 diabetes (T2D) by integrating SNP-level summary statistics from two large studies consisting of 19,809 T2D cases and 111,181 controls of European ancestry. Among 4,713 candidate pathways, from which genes in the neighborhoods of 170 GWAS-established T2D loci were excluded, we detected 43 globally significant T2D pathways (with Bonferroni-corrected p-values < 0.05), including the insulin signaling pathway and the T2D pathway defined by KEGG, as well as pathways defined according to specific gene expression patterns in pancreatic adenocarcinoma, hepatocellular carcinoma, and bladder carcinoma. Using summary data from 8 eastern Asian T2D GWAS with 6,952 cases and 11,865 controls, we showed that 7 of the 43 pathways identified in European populations remained significant in eastern Asians at a false discovery rate of 0.1. We created an R package and a web-based tool for sARTP with the capability to analyze pathways with thousands of genes and tens of thousands of SNPs.
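
    A bare-bones rank truncated product (RTP) test conveys the flavour of this family of methods. The sketch below assumes independent SNPs and a fixed truncation point, whereas sARTP additionally adapts over truncation points and draws its null from the estimated genotype correlation; the p-values are hypothetical:

```python
import math
import random

def rtp_stat(pvals, k):
    """Rank truncated product statistic: log-product of the k smallest
    p-values (more negative = stronger evidence)."""
    return sum(math.log(p) for p in sorted(pvals)[:k])

def rtp_pvalue(pvals, k, n_perm=5000, seed=7):
    """Monte Carlo p-value for the RTP statistic under independence.
    (sARTP instead adapts over k and respects LD between SNPs.)"""
    rng = random.Random(seed)
    obs = rtp_stat(pvals, k)
    m = len(pvals)
    hits = sum(
        rtp_stat([rng.random() for _ in range(m)], k) <= obs
        for _ in range(n_perm)
    )
    return (hits + 1) / (n_perm + 1)

# Hypothetical SNP-level p-values for one pathway
pvals = [0.001, 0.004, 0.03, 0.2, 0.5, 0.8, 0.6, 0.35]
print(rtp_pvalue(pvals, k=3))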

  19. A Powerful Procedure for Pathway-Based Meta-analysis Using Summary Statistics Identifies 43 Pathways Associated with Type II Diabetes in European Populations

    PubMed Central

    Zhang, Han; Wheeler, William; Hyland, Paula L.; Yang, Yifan; Shi, Jianxin; Chatterjee, Nilanjan; Yu, Kai

    2016-01-01

    Meta-analysis of multiple genome-wide association studies (GWAS) has become an effective approach for detecting single nucleotide polymorphism (SNP) associations with complex traits. However, it is difficult to integrate the readily accessible SNP-level summary statistics from a meta-analysis into more powerful multi-marker testing procedures, which generally require individual-level genetic data. We developed a general procedure called Summary based Adaptive Rank Truncated Product (sARTP) for conducting gene and pathway meta-analysis that uses only SNP-level summary statistics in combination with genotype correlation estimated from a panel of individual-level genetic data. We demonstrated the validity and power advantage of sARTP through empirical and simulated data. We conducted a comprehensive pathway-based meta-analysis with sARTP on type 2 diabetes (T2D) by integrating SNP-level summary statistics from two large studies consisting of 19,809 T2D cases and 111,181 controls with European ancestry. Among 4,713 candidate pathways from which genes in neighborhoods of 170 GWAS established T2D loci were excluded, we detected 43 T2D globally significant pathways (with Bonferroni corrected p-values < 0.05), which included the insulin signaling pathway and T2D pathway defined by KEGG, as well as the pathways defined according to specific gene expression patterns on pancreatic adenocarcinoma, hepatocellular carcinoma, and bladder carcinoma. Using summary data from 8 eastern Asian T2D GWAS with 6,952 cases and 11,865 controls, we showed 7 out of the 43 pathways identified in European populations remained to be significant in eastern Asians at the false discovery rate of 0.1. We created an R package and a web-based tool for sARTP with the capability to analyze pathways with thousands of genes and tens of thousands of SNPs. PMID:27362418

  20. A survey of work engagement and psychological capital levels.

    PubMed

    Bonner, Lynda

    2016-08-11

    To evaluate the relationship between work engagement and psychological capital (PsyCap) levels reported by registered nurses. PsyCap is a developable human resource. Research on PsyCap as an antecedent to work engagement in nurses is needed. A convenience sample of 137 registered nurses participated in this quantitative cross-sectional survey. Questionnaires measured self-reported levels of work engagement and psychological capital. Descriptive and inferential statistics were used for data analysis. There was a statistically significant correlation between work engagement and PsyCap scores (r=0.633, p<0.01). Nurses working at band 5 level reported statistically significantly lower PsyCap scores compared with nurses working at band 6 and 7 levels. Nurses reporting high levels of work engagement also reported high levels of PsyCap. Band 5 nurses might benefit most from interventions to increase their PsyCap. This study supports PsyCap as an antecedent to work engagement.
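
    The reported association is a plain Pearson correlation. With Python >= 3.10 it can be reproduced on hypothetical paired scores (the study's raw data are not available here; these values are invented):

```python
from statistics import correlation

# Hypothetical paired scores: work engagement vs. PsyCap for 8 nurses
engagement = [3.2, 4.1, 3.8, 4.5, 2.9, 3.6, 4.8, 3.1]
psycap     = [3.0, 4.3, 3.5, 4.6, 3.1, 3.4, 4.9, 2.8]

r = correlation(engagement, psycap)
print(round(r, 3))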

  1. Diagnostic Value of Serum YKL-40 Level for Coronary Artery Disease: A Meta-Analysis.

    PubMed

    Song, Chun-Li; Bin-Li; Diao, Hong-Ying; Wang, Jiang-Hua; Shi, Yong-fei; Lu, Yang; Wang, Guan; Guo, Zi-Yuan; Li, Yang-Xue; Liu, Jian-Gen; Wang, Jin-Peng; Zhang, Ji-Chang; Zhao, Zhuo; Liu, Yi-Hang; Li, Ying; Cai, Dan; Li, Qian

    2016-01-01

    This meta-analysis aimed to assess the value of serum YKL-40 level for the diagnosis of coronary artery disease (CAD). Related articles were identified, without language restrictions, by searching the following electronic databases: the Cochrane Library Database (Issue 12, 2013), Web of Science (1945-2013), PubMed (1966-2013), CINAHL (1982-2013), EMBASE (1980-2013), and the Chinese Biomedical Database (CBM; 1982-2013). STATA statistical software (Version 12.0, Stata Corporation, College Station, TX) was used for the statistical analysis. The standardized mean difference (SMD) and its corresponding 95% confidence interval (95% CI) were calculated. Eleven clinical case-control studies that recruited 1,175 CAD patients and 1,261 healthy controls were selected. The main finding of the meta-analysis was that serum YKL-40 level in CAD patients was significantly higher than in control subjects (SMD = 2.79, 95% CI = 1.73 to 3.85, P < 0.001). Ethnicity-stratified analysis indicated a higher serum YKL-40 level in CAD patients than in control subjects in the Chinese, Korean, and Danish populations (China: SMD = 2.97, 95% CI = 1.21 to 4.74, P = 0.001; Korea: SMD = 0.66, 95% CI = 0.17 to 1.15, P = 0.008; Denmark: SMD = 1.85, 95% CI = 1.42 to 2.29, P < 0.001), but not in the Turkish population (SMD = 4.52, 95% CI = -2.87 to 11.91, P = 0.231). The present meta-analysis suggests that an elevated serum YKL-40 level may serve as a promising diagnostic marker for early identification of CAD.
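
    The SMD and its 95% CI follow the standard Cohen's d formulas. A minimal sketch on hypothetical YKL-40 summary data (not values from the included studies):

```python
import math

def smd_with_ci(m1, sd1, n1, m2, sd2, n2):
    """Cohen's d standardized mean difference with a 95% Wald CI."""
    s_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2)
                         / (n1 + n2 - 2))
    d = (m1 - m2) / s_pooled
    se = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return d, d - 1.96 * se, d + 1.96 * se

# Hypothetical YKL-40 means (ng/ml): CAD cases vs. healthy controls
d, lo, hi = smd_with_ci(m1=120.0, sd1=40.0, n1=100,
                        m2=80.0, sd2=35.0, n2=110)
print(f"SMD = {d:.2f}, 95% CI {lo:.2f} to {hi:.2f}")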

  2. Quantitative investigation of inappropriate regression model construction and the importance of medical statistics experts in observational medical research: a cross-sectional study.

    PubMed

    Nojima, Masanori; Tokunaga, Mutsumi; Nagamura, Fumitaka

    2018-05-05

    To investigate under what circumstances inappropriate use of 'multivariate analysis' is likely to occur and to identify the population that needs more support with medical statistics, the frequency of inappropriate regression model construction in multivariate analysis and related factors were investigated in observational medical research publications. The inappropriate algorithm of selecting only variables that were significant in univariate analysis was estimated to occur in 6.4% of publications (95% CI 4.8% to 8.5%): in 1.1% of publications with a medical statistics expert (hereinafter 'expert') as first author, in 3.5% with an expert as coauthor, and in 12.2% without expert involvement. In publications where the number of cases was 50 or fewer and no experts were involved, the inappropriate algorithm was observed at a high proportion (20.2%). The OR of the involvement of experts for this outcome was 0.28 (95% CI 0.15 to 0.53). A further nation-level analysis showed that the involvement of experts and the use of inappropriate multivariate analysis are negatively associated (R = -0.652). Based on these results, the benefit of participation of medical statistics experts is obvious: experts should be involved for proper confounding adjustment and interpretation of statistical models. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
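
    The reported OR of 0.28 (95% CI 0.15 to 0.53) is a standard odds ratio with a Wald interval. A sketch on hypothetical counts (not the study's actual 2x2 table):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Wald 95% CI from a 2x2 table:
    a = exposed with outcome,   b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: expert involved vs. inappropriate model construction
or_, lo, hi = odds_ratio_ci(a=8, b=220, c=30, d=240)
print(f"OR = {or_:.2f}, 95% CI {lo:.2f} to {hi:.2f}")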

  3. [Statistical analysis of German radiologic periodicals: developmental trends in the last 10 years].

    PubMed

    Golder, W

    1999-09-01

    To identify which statistical tests are applied in German radiological publications, to what extent their use has changed during the last decade, and which factors might be responsible for this development. The major articles published in "ROFO" and "DER RADIOLOGE" during 1988, 1993 and 1998 were reviewed for statistical content. The contributions were classified by principal focus and radiological subspecialty. The methods used were assigned to descriptive, basal and advanced statistics. Sample size, significance level and power were established. The use of experts' assistance was monitored. Finally, we calculated the so-called cumulative accessibility of the publications. 525 contributions were found to be eligible. In 1988, 87% used descriptive statistics only, 12.5% basal, and 0.5% advanced statistics. The corresponding figures in 1993 and 1998 are 62 and 49%, 32 and 41%, and 6 and 10%, respectively. Statistical techniques were most likely to be used in research on musculoskeletal imaging and in articles dedicated to MRI. Six basic categories of statistical methods account for the complete statistical analysis appearing in 90% of the articles. ROC analysis is the single most common advanced technique. Authors increasingly make use of statistical experts' advice and statistical software. During the last decade, the use of statistical methods in German radiological journals has fundamentally improved, both quantitatively and qualitatively. Presently, advanced techniques account for 20% of the pertinent statistical tests. This development seems to be promoted by the increasing availability of statistical analysis software.

  4. Tools for Basic Statistical Analysis

    NASA Technical Reports Server (NTRS)

    Luz, Paul L.

    2005-01-01

    Statistical Analysis Toolset is a collection of eight Microsoft Excel spreadsheet programs, each of which performs calculations pertaining to an aspect of statistical analysis. These programs present input and output data in user-friendly, menu-driven formats, with automatic execution. The following types of calculations are performed: Descriptive statistics are computed for a set of data x(i) (i = 1, 2, 3 . . . ) entered by the user. Normal Distribution Estimates will calculate the statistical value that corresponds to cumulative probability values, given a sample mean and standard deviation of the normal distribution. Normal Distribution from two Data Points will extend and generate a cumulative normal distribution for the user, given two data points and their associated probability values. Two programs perform two-way analysis of variance (ANOVA) with no replication or generalized ANOVA for two factors with four levels and three repetitions. Linear Regression-ANOVA will curvefit data to the linear equation y=f(x) and will do an ANOVA to check its significance.
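
    The spreadsheet calculations described here map directly onto the Python statistics module. A sketch of the descriptive-statistics and "Normal Distribution Estimates" steps on made-up data (requires Python >= 3.8 for NormalDist):

```python
from statistics import NormalDist, fmean, stdev

data = [4.1, 5.0, 3.8, 4.6, 5.2, 4.9, 4.4, 4.7]

# Descriptive statistics for the entered data set
mu, sd = fmean(data), stdev(data)

# Statistical value corresponding to a cumulative probability, given the
# sample mean and standard deviation of the normal distribution
p95 = NormalDist(mu, sd).inv_cdf(0.95)
print(round(mu, 3), round(sd, 3), round(p95, 3))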

  5. The Practicality of Statistical Physics Handout Based on KKNI and the Constructivist Approach

    NASA Astrophysics Data System (ADS)

    Sari, S. Y.; Afrizon, R.

    2018-04-01

    Statistical physics lecture shows that: 1) the performance of lecturers, social climate, students’ competence and soft skills needed at work are in enough category, 2) students feel difficulties in following the lectures of statistical physics because it is abstract, 3) 40.72% of students needs more understanding in the form of repetition, practice questions and structured tasks, and 4) the depth of statistical physics material needs to be improved gradually and structured. This indicates that learning materials in accordance of The Indonesian National Qualification Framework or Kerangka Kualifikasi Nasional Indonesia (KKNI) with the appropriate learning approach are needed to help lecturers and students in lectures. The author has designed statistical physics handouts which have very valid criteria (90.89%) according to expert judgment. In addition, the practical level of handouts designed also needs to be considered in order to be easy to use, interesting and efficient in lectures. The purpose of this research is to know the practical level of statistical physics handout based on KKNI and a constructivist approach. This research is a part of research and development with 4-D model developed by Thiagarajan. This research activity has reached part of development test at Development stage. Data collection took place by using a questionnaire distributed to lecturers and students. Data analysis using descriptive data analysis techniques in the form of percentage. The analysis of the questionnaire shows that the handout of statistical physics has very practical criteria. The conclusion of this study is statistical physics handouts based on the KKNI and constructivist approach have been practically used in lectures.

  6. Investigation of trends in flooding in the Tug Fork basin of Kentucky, Virginia, and West Virginia

    USGS Publications Warehouse

    Hirsch, Robert M.; Scott, Arthur G.; Wyant, Timothy

    1982-01-01

    Statistical analysis indicates that the average size of annual flood peaks of the Tug Fork (Ky., Va., and W. Va.) has been increasing. However, additional statistical analysis does not indicate that the flood levels that were typically exceeded once or twice a year in the period 1947-79 are any more likely to be exceeded now than in 1947. Possible trends in stream-channel size are also investigated at three locations; no discernible trends in channel size are noted. Further statistical analysis of the trend in the size of annual flood peaks shows that much of the annual variation is related to local rainfall and to the 'natural' hydrologic response in a relatively undisturbed subbasin. However, some statistical indication of trend persists after accounting for these natural factors, though it is of borderline statistical significance. Further study in the basin may relate flood magnitudes both to rainfall and to land use.
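
    A common nonparametric way to test for such a trend in annual peaks is the Mann-Kendall test; the report does not specify this exact procedure, so the following is an illustrative sketch on hypothetical discharges:

```python
import math

def mann_kendall_z(series):
    """Mann-Kendall trend statistic: positive z suggests an upward trend."""
    n = len(series)
    # S counts concordant minus discordant pairs over all i < j
    s = sum(
        (series[j] > series[i]) - (series[j] < series[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18  # variance assuming no ties
    if s > 0:
        return (s - 1) / math.sqrt(var_s)
    if s < 0:
        return (s + 1) / math.sqrt(var_s)
    return 0.0

# Hypothetical annual flood-peak discharges (m^3/s)
peaks = [310, 295, 330, 340, 320, 360, 355, 380, 370, 400, 395, 420]
print(round(mann_kendall_z(peaks), 2))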

  7. Statistical sensor fusion analysis of near-IR polarimetric and thermal imagery for the detection of minelike targets

    NASA Astrophysics Data System (ADS)

    Weisenseel, Robert A.; Karl, William C.; Castanon, David A.; DiMarzio, Charles A.

    1999-02-01

    We present an analysis of statistical model based data-level fusion for near-IR polarimetric and thermal data, particularly for the detection of mines and mine-like targets. Typical detection-level data fusion methods, approaches that fuse detections from individual sensors rather than fusing at the level of the raw data, do not account rationally for the relative reliability of different sensors, nor the redundancy often inherent in multiple sensors. Representative examples of such detection-level techniques include logical AND/OR operations on detections from individual sensors and majority vote methods. In this work, we exploit a statistical data model for the detection of mines and mine-like targets to compare and fuse multiple sensor channels. Our purpose is to quantify the amount of knowledge that each polarimetric or thermal channel supplies to the detection process. With this information, we can make reasonable decisions about the usefulness of each channel. We can use this information to improve the detection process, or we can use it to reduce the number of required channels.

  8. An analysis of science versus pseudoscience

    NASA Astrophysics Data System (ADS)

    Hooten, James T.

    2011-12-01

    This quantitative study identified distinctive features in archival datasets commissioned by the National Science Foundation (NSF) for Science and Engineering Indicators reports. The dependent variables included education level and scores for science fact knowledge, science process knowledge, and pseudoscience beliefs. The dependent variables were aggregated into nine NSF-defined geographic regions and examined for the years 2004 and 2006; the variables were also examined over all years available in the dataset. Descriptive statistics were determined and tests for normality and homogeneity of variances were performed using the Statistical Package for the Social Sciences. Analysis of Variance was used to test for statistically significant differences between the nine geographic regions for each of the four dependent variables, at a significance level of 0.05. Tukey post-hoc analysis was used to assess the practical significance of differences between regions. Post-hoc power analysis using G*Power was used to calculate the probability of Type II errors. Tests for correlations across all years of the dependent variables were also performed, with Pearson's r indicating the strength of the relationship between the dependent variables. Small to medium differences in science literacy and education level were observed between many of the nine U.S. geographic regions. The most significant differences occurred when the West South Central region was compared to the New England and the Pacific regions. Belief in pseudoscience appeared to be distributed evenly across all U.S. geographic regions. Education level was a strong indicator of science literacy regardless of a respondent's region of residence. Recommendations for further study include more in-depth investigation to uncover the nature of the relationship between education level and belief in pseudoscience.

  9. Classification of the European Union member states according to the relative level of sustainable development.

    PubMed

    Anna, Bluszcz

    Methods for measuring and assessing the level of sustainable development at the international, national, and regional levels are a current research problem requiring multi-dimensional analysis. The aim of the study was a relative assessment of the sustainability level of the European Union member states and a comparative analysis of the position of Poland relative to the other countries. EU member states were treated as objects in a multi-dimensional space whose dimensions were specified by ten diagnostic variables describing the sustainability level of EU countries in three dimensions: social, economic, and environmental. Because the compiled statistical data were expressed in different units of measure, taxonomic methods were used to build an aggregated measure of the level of sustainable development, which, through normalisation of the variables, enabled comparative analysis between the countries. The methodology consisted of eight stages, including: defining the data matrices; calculating the variability coefficient for all variables (screening out variables with a coefficient below 10%); dividing the variables into stimulants and destimulants; selecting the method of variable normalisation; developing the matrices of normalised data; selecting the formula and calculating the aggregated indicator of the relative level of sustainable development of the EU countries; calculating partial development indicators for the three studied dimensions (social, economic, and environmental); and classifying the EU countries according to their relative level of sustainable development. Statistical data were collected from publications of the Polish Central Statistical Office.
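
    The normalisation-and-aggregation steps can be sketched with a simple zero-unitarization scheme; the article's exact taxonomic formula is not reproduced here, and the indicator values below are invented:

```python
def normalise(values, stimulant=True):
    """Zero-unitarization: map a variable onto [0, 1] so that higher
    always means 'better' (destimulants are inverted)."""
    lo, hi = min(values), max(values)
    if stimulant:
        return [(v - lo) / (hi - lo) for v in values]
    return [(hi - v) / (hi - lo) for v in values]

# Hypothetical indicators for three countries:
# GDP per capita (stimulant), CO2 emissions per capita (destimulant)
gdp = [30_000, 45_000, 20_000]
co2 = [8.0, 6.5, 9.5]

norm = [normalise(gdp, stimulant=True), normalise(co2, stimulant=False)]
# Aggregated measure: mean of the normalised variables per country
index = [sum(col) / len(col) for col in zip(*norm)]
print([round(i, 3) for i in index])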

  10. Gene- and pathway-based association tests for multiple traits with GWAS summary statistics.

    PubMed

    Kwak, Il-Youp; Pan, Wei

    2017-01-01

    To identify novel genetic variants associated with complex traits and to shed new insights on underlying biology, in addition to the most popular single SNP-single trait association analysis, it would be useful to explore multiple correlated (intermediate) traits at the gene- or pathway-level by mining existing single GWAS or meta-analyzed GWAS data. For this purpose, we present an adaptive gene-based test and a pathway-based test for association analysis of multiple traits with GWAS summary statistics. The proposed tests are adaptive at both the SNP- and trait-levels; that is, they account for possibly varying association patterns (e.g. signal sparsity levels) across SNPs and traits, thus maintaining high power across a wide range of situations. Furthermore, the proposed methods are general: they can be applied to mixed types of traits, and to Z-statistics or P-values as summary statistics obtained from either a single GWAS or a meta-analysis of multiple GWAS. Our numerical studies with simulated and real data demonstrated the promising performance of the proposed methods. The methods are implemented in the R package aSPU, freely and publicly available at https://cran.r-project.org/web/packages/aSPU/. Contact: weip@biostat.umn.edu. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
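
    The simplest sum-based combination of SNP-level summary Z-statistics is Stouffer's method, which adaptive tests of this kind generalise. Note that real gene-level tests must also adjust for LD between SNPs (via the SNP correlation matrix), which this independent-SNP sketch omits; the Z-statistics are hypothetical:

```python
import math
from statistics import NormalDist

def stouffer(z_scores):
    """Combine per-SNP Z-statistics into one gene-level Z (equal weights),
    assuming independent SNPs."""
    return sum(z_scores) / math.sqrt(len(z_scores))

# Hypothetical summary Z-statistics for the SNPs in one gene
z = [1.8, 2.1, 0.4, 1.5]
z_gene = stouffer(z)
p_gene = 2 * (1 - NormalDist().cdf(abs(z_gene)))  # two-sided p-value
print(round(z_gene, 3), round(p_gene, 4))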

  11. Is Quality/Effectiveness An Empirically Demonstrable School Attribute? Statistical Aids for Determining Appropriate Levels of Analysis.

    ERIC Educational Resources Information Center

    Griffith, James

    2002-01-01

    Describes and demonstrates analytical techniques used in organizational psychology and contemporary multilevel analysis. Using these analytic techniques, examines the relationship between educational outcomes and the school environment. Finds that at least some indicators might be represented as school-level phenomena. Results imply that the…

  12. QPROT: Statistical method for testing differential expression using protein-level intensity data in label-free quantitative proteomics.

    PubMed

    Choi, Hyungwon; Kim, Sinae; Fermin, Damian; Tsou, Chih-Chiang; Nesvizhskii, Alexey I

    2015-11-03

    We introduce QPROT, a statistical framework and computational tool for differential protein expression analysis using protein intensity data. QPROT is an extension of the QSPEC suite, originally developed for spectral count data, adapted for the analysis of continuously measured protein-level intensity data. QPROT offers a new intensity normalization procedure and model-based differential expression analysis, both of which account for missing data. Differential expression of each protein is determined from a standardized Z-statistic derived from the posterior distribution of the log fold change parameter, guided by the false discovery rate estimated by a well-known Empirical Bayes method. We evaluated the classification performance of QPROT using the quantification calibration data from the clinical proteomic technology assessment for cancer (CPTAC) study and a recently published Escherichia coli benchmark dataset, with evaluation of FDR accuracy in the latter. QPROT is a statistical framework with a computational software tool for comparative quantitative proteomics analysis. It features various extensions of the QSPEC method, originally built for spectral count data analysis, including probabilistic treatment of missing values in protein intensity data. With the increasing popularity of label-free quantitative proteomics, the proposed method and accompanying software suite will be immediately useful for many proteomics laboratories. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Objective research of auscultation signals in Traditional Chinese Medicine based on wavelet packet energy and support vector machine.

    PubMed

    Yan, Jianjun; Shen, Xiaojing; Wang, Yiqin; Li, Fufeng; Xia, Chunming; Guo, Rui; Chen, Chunfeng; Shen, Qingwei

    2010-01-01

    This study uses the Wavelet Packet Transform (WPT) and the Support Vector Machine (SVM) algorithm for objective, quantitative analysis of auscultation in Traditional Chinese Medicine (TCM) diagnosis. First, Wavelet Packet Decomposition (WPD) at level 6 was employed to split the auscultation signals into finer frequency bands. Then statistical analysis was performed on the Wavelet Packet Energy (WPE) features extracted from the WPD coefficients. Furthermore, pattern recognition with an SVM was used to classify the subjects' statistical feature values across the sample groups. Finally, the experimental results showed that the classification accuracies were at a high level.
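
    Wavelet packet energy features can be extracted with the standard library alone. The sketch below uses the Haar wavelet at level 3 (the paper uses level 6 and does not name its wavelet; the "auscultation" signal here is synthetic):

```python
import math

def haar_step(x):
    """One Haar analysis step: approximation and detail halves."""
    a = [(x[2*i] + x[2*i+1]) / math.sqrt(2) for i in range(len(x) // 2)]
    d = [(x[2*i] - x[2*i+1]) / math.sqrt(2) for i in range(len(x) // 2)]
    return a, d

def wavelet_packet_energies(x, level):
    """Relative energy of each terminal node of a Haar wavelet packet
    decomposition (signal length must be divisible by 2**level)."""
    nodes = [x]
    for _ in range(level):
        nodes = [half for node in nodes for half in haar_step(node)]
    energies = [sum(v * v for v in node) for node in nodes]
    total = sum(energies)
    return [e / total for e in energies]

# Toy signal: 128 samples of a low-frequency tone
signal = [math.sin(2 * math.pi * 4 * t / 128) for t in range(128)]
feats = wavelet_packet_energies(signal, level=3)  # 8 band-energy features
print([round(f, 3) for f in feats])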

  14. Using Multilevel Modeling in Language Assessment Research: A Conceptual Introduction

    ERIC Educational Resources Information Center

    Barkaoui, Khaled

    2013-01-01

    This article critiques traditional single-level statistical approaches (e.g., multiple regression analysis) to examining relationships between language test scores and variables in the assessment setting. It highlights the conceptual, methodological, and statistical problems associated with these techniques in dealing with multilevel or nested…

  15. Experimental analysis of computer system dependability

    NASA Technical Reports Server (NTRS)

    Iyer, Ravishankar K.; Tang, Dong

    1993-01-01

    This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: design phase, prototype phase, and operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.

  16. Why weight? Modelling sample and observational level variability improves power in RNA-seq analyses

    PubMed Central

    Liu, Ruijie; Holik, Aliaksei Z.; Su, Shian; Jansz, Natasha; Chen, Kelan; Leong, Huei San; Blewitt, Marnie E.; Asselin-Labat, Marie-Liesse; Smyth, Gordon K.; Ritchie, Matthew E.

    2015-01-01

    Variations in sample quality are frequently encountered in small RNA-sequencing experiments, and pose a major challenge in a differential expression analysis. Removal of high variation samples reduces noise, but at a cost of reducing power, thus limiting our ability to detect biologically meaningful changes. Similarly, retaining these samples in the analysis may not reveal any statistically significant changes due to the higher noise level. A compromise is to use all available data, but to down-weight the observations from more variable samples. We describe a statistical approach that facilitates this by modelling heterogeneity at both the sample and observational levels as part of the differential expression analysis. At the sample level this is achieved by fitting a log-linear variance model that includes common sample-specific or group-specific parameters that are shared between genes. The estimated sample variance factors are then converted to weights and combined with observational level weights obtained from the mean–variance relationship of the log-counts-per-million using ‘voom’. A comprehensive analysis involving both simulations and experimental RNA-sequencing data demonstrates that this strategy leads to a universally more powerful analysis and fewer false discoveries when compared to conventional approaches. This methodology has wide application and is implemented in the open-source ‘limma’ package. PMID:25925576

  17. EBprot: Statistical analysis of labeling-based quantitative proteomics data.

    PubMed

    Koh, Hiromi W L; Swa, Hannah L F; Fermin, Damian; Ler, Siok Ghee; Gunaratne, Jayantha; Choi, Hyungwon

    2015-08-01

    Labeling-based proteomics is a powerful method for detection of differentially expressed proteins (DEPs). Current data analysis platforms typically rely on protein-level ratios, which are obtained by summarizing peptide-level ratios for each protein. In shotgun proteomics, however, some proteins are quantified with more peptides than others, and this reproducibility information is not incorporated into the differential expression (DE) analysis. Here, we propose a novel probabilistic framework, EBprot, that directly models the peptide-protein hierarchy and rewards the proteins with reproducible evidence of DE over multiple peptides. To evaluate its performance with known DE states, we conducted a simulation study to show that the peptide-level analysis of EBprot provides better receiver operating characteristic curves and more accurate estimation of the false discovery rates than the methods based on protein-level ratios. We also demonstrate superior classification performance of peptide-level EBprot analysis in a spike-in dataset. To illustrate the wide applicability of EBprot in different experimental designs, we applied EBprot to a dataset for lung cancer subtype analysis with biological replicates and another dataset for time course phosphoproteome analysis of EGF-stimulated HeLa cells with multiplexed labeling. Through these examples, we show that the peptide-level analysis of EBprot is a robust alternative to the existing statistical methods for the DE analysis of labeling-based quantitative datasets. The software suite is freely available on the Sourceforge website http://ebprot.sourceforge.net/. All MS data have been deposited in the ProteomeXchange with identifier PXD001426 (http://proteomecentral.proteomexchange.org/dataset/PXD001426/). © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
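The point about reproducibility information can be seen in a toy comparison (hypothetical numbers, not EBprot's model): a protein-level summary ratio alone cannot distinguish a protein supported by many concordant peptide ratios from one supported by a single peptide.

```python
import statistics

# Two proteins with the same median log2-ratio but very different
# peptide-level support; a protein-level summary discards the
# difference in evidence.

protein_a = [1.1, 0.9, 1.0, 1.2, 0.95]   # five concordant peptides
protein_b = [1.0]                        # a single peptide

for name, ratios in [("A", protein_a), ("B", protein_b)]:
    n = len(ratios)
    med = statistics.median(ratios)
    spread = statistics.stdev(ratios) if n > 1 else float("nan")
    print(name, n, med, spread)
```

Both proteins summarize to a median ratio of 1.0, but only protein A carries reproducible multi-peptide evidence — the quantity a peptide-protein hierarchical model can reward.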

  18. Pathway analysis with next-generation sequencing data.

    PubMed

    Zhao, Jinying; Zhu, Yun; Boerwinkle, Eric; Xiong, Momiao

    2015-04-01

    Although pathway analysis methods have been developed and successfully applied to association studies of common variants, statistical methods for pathway-based association analysis of rare variants have not been well developed. Many investigators have observed highly inflated false-positive rates and low power in pathway-based tests of association of rare variants. The inflated false-positive rates and low true-positive rates of the current methods are mainly due to their inability to account for gametic phase disequilibrium. To overcome these serious limitations, we develop a novel statistic based on smoothed functional principal component analysis (SFPCA) for pathway association tests with next-generation sequencing data. The developed statistic has the ability to capture position-level variant information and account for gametic phase disequilibrium. By intensive simulations, we demonstrate that the SFPCA-based statistic for testing pathway association with rare, common, or both rare and common variants has the correct type 1 error rates. The power of the SFPCA-based statistic and of 22 additional existing statistics is also evaluated. We found that the SFPCA-based statistic has much higher power than the other existing statistics in all the scenarios considered. To further evaluate its performance, the SFPCA-based statistic is applied to pathway analysis of exome sequencing data in the early-onset myocardial infarction (EOMI) project. We identify three pathways significantly associated with EOMI after the Bonferroni correction. In addition, our preliminary results show that the SFPCA-based statistic yields much smaller P-values for identifying pathway associations than other existing methods.

  19. Learning and understanding the Kruskal-Wallis one-way analysis-of-variance-by-ranks test for differences among three or more independent groups.

    PubMed

    Chan, Y; Walmsley, R P

    1997-12-01

    When several treatment methods are available for the same problem, many clinicians are faced with the task of deciding which treatment to use. Many clinicians may have conducted informal "mini-experiments" on their own to determine which treatment is best suited for the problem. These results are usually not documented or reported in a formal manner because many clinicians feel that they are "statistically challenged." Another reason may be that clinicians do not feel they have controlled enough test conditions to warrant analysis. In this update, a statistic is described that does not involve complicated statistical assumptions, making it a simple and easy-to-use statistical method. This update examines the use of two statistics and does not deal with other issues that could affect clinical research such as issues affecting credibility. For readers who want a more in-depth examination of this topic, references have been provided. The Kruskal-Wallis one-way analysis-of-variance-by-ranks test (or H test) is used to determine whether three or more independent groups are the same or different on some variable of interest when an ordinal level of data or an interval or ratio level of data is available. A hypothetical example will be presented to explain when and how to use this statistic, how to interpret results using the statistic, the advantages and disadvantages of the statistic, and what to look for in a written report. This hypothetical example will involve the use of ratio data to demonstrate how to choose between using the nonparametric H test and the more powerful parametric F test.
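A minimal stdlib computation of the H statistic (assuming no tied observations) shows how little machinery the test needs; the three groups below are hypothetical scores, not data from the article.

```python
from itertools import chain

# Kruskal-Wallis H statistic for k independent groups (no ties):
# rank all N observations jointly, sum ranks within each group, then
# H = 12 / (N (N+1)) * sum(R_i^2 / n_i) - 3 (N+1).

def kruskal_h(*groups):
    data = sorted(chain.from_iterable(groups))
    rank = {v: i + 1 for i, v in enumerate(data)}  # ranks 1..N, assumes no ties
    n_total = len(data)
    h = 0.0
    for g in groups:
        r_sum = sum(rank[v] for v in g)
        h += r_sum ** 2 / len(g)
    return 12.0 / (n_total * (n_total + 1)) * h - 3 * (n_total + 1)

# hypothetical scores from three treatment groups
h = kruskal_h([1, 2, 5], [3, 4, 8], [6, 7, 9])
print(round(h, 4))
```

The resulting H is referred to a chi-square distribution with k − 1 = 2 degrees of freedom; here H ≈ 4.36 falls short of the 5% critical value of 5.99, so these three hypothetical groups would not be declared different.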

  20. Advanced microwave soil moisture studies. [Big Sioux River Basin, Iowa

    NASA Technical Reports Server (NTRS)

    Dalsted, K. J.; Harlan, J. C.

    1983-01-01

    Comparisons of low level L-band brightness temperature (TB) and thermal infrared (TIR) data as well as the following data sets: soil map and land cover data; direct soil moisture measurement; and a computer generated contour map were statistically evaluated using regression analysis and linear discriminant analysis. Regression analysis of footprint data shows that statistical groupings of ground variables (soil features and land cover) hold promise for qualitative assessment of soil moisture and for reducing variance within the sampling space. Dry conditions appear to be more conducive to producing meaningful statistics than wet conditions. Regression analysis using field averaged TB and TIR data did not approach the higher R-squared values obtained using within-field variations. The linear discriminant analysis indicates some capacity to distinguish categories with the results being somewhat better on a field basis than a footprint basis.

  1. Statistical analysis of vegetation and stormwater runoff in an urban watershed during summer and winter storms in Portland, Oregon, U.S

    Treesearch

    Geoffrey H. Donovan; David T. Butry; Megan Y. Mao

    2016-01-01

    Past research has examined the effect of urban trees, and other vegetation, on stormwater runoff using hydrological models or small-scale experiments. However, there has been no statistical analysis of the influence of vegetation on runoff in an intact urban watershed, and it is not clear how results from small-scale studies scale up to the city level. Researchers...

  2. World Population: Facts in Focus. World Population Data Sheet Workbook. Population Learning Series.

    ERIC Educational Resources Information Center

    Crews, Kimberly A.

    This workbook teaches population analysis using world population statistics. To complete the four student activity sheets, the students refer to the included "1988 World Population Data Sheet" which lists nations' statistical data that includes population totals, projected population, birth and death rates, fertility levels, and the…

  3. Some Statistics for Assessing Person-Fit Based on Continuous-Response Models

    ERIC Educational Resources Information Center

    Ferrando, Pere Joan

    2010-01-01

    This article proposes several statistics for assessing individual fit based on two unidimensional models for continuous responses: linear factor analysis and Samejima's continuous response model. Both models are approached using a common framework based on underlying response variables and are formulated at the individual level as fixed regression…

  4. Conducting Multilevel Analyses in Medical Education

    ERIC Educational Resources Information Center

    Zyphur, Michael J.; Kaplan, Seth A.; Islam, Gazi; Barsky, Adam P.; Franklin, Michael S.

    2008-01-01

    A significant body of education literature has begun using multilevel statistical models to examine data that reside at multiple levels of analysis. In order to provide a primer for medical education researchers, the current work gives a brief overview of some issues associated with multilevel statistical modeling. To provide an example of this…

  5. Analysis of molecular variance inferred from metric distances among DNA haplotypes: application to human mitochondrial DNA restriction data.

    PubMed

    Excoffier, L; Smouse, P E; Quattro, J M

    1992-06-01

    We present here a framework for the study of molecular variation within a single species. Information on DNA haplotype divergence is incorporated into an analysis of variance format, derived from a matrix of squared distances among all pairs of haplotypes. This analysis of molecular variance (AMOVA) produces estimates of variance components and F-statistic analogs, designated here as phi-statistics, reflecting the correlation of haplotypic diversity at different levels of hierarchical subdivision. The method is flexible enough to accommodate several alternative input matrices, corresponding to different types of molecular data, as well as different types of evolutionary assumptions, without modifying the basic structure of the analysis. The significance of the variance components and phi-statistics is tested using a permutational approach, eliminating the normality assumption that is conventional for analysis of variance but inappropriate for molecular data. Application of AMOVA to human mitochondrial DNA haplotype data shows that population subdivisions are better resolved when some measure of molecular differences among haplotypes is introduced into the analysis. At the intraspecific level, however, the additional information provided by knowing the exact phylogenetic relations among haplotypes or by a nonlinear translation of restriction-site change into nucleotide diversity does not significantly modify the inferred population genetic structure. Monte Carlo studies show that site sampling does not fundamentally affect the significance of the molecular variance components. The AMOVA treatment is easily extended in several different directions and it constitutes a coherent and flexible framework for the statistical analysis of molecular data.
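The permutational significance test can be sketched as follows. This toy example (made-up squared-distance matrix, deliberately simplified statistic) shows only the mechanics of relabelling; it is not the full AMOVA variance-component decomposition.

```python
import random

# Permutation test in the spirit of AMOVA: compare an observed
# grouping statistic against its null distribution under random
# relabelling of individuals, avoiding any normality assumption.

def mean_within(dist, labels, group):
    idx = [i for i, g in enumerate(labels) if g == group]
    pairs = [(i, j) for i in idx for j in idx if i < j]
    return sum(dist[i][j] for i, j in pairs) / len(pairs)

def stat(dist, labels):
    # average within-group squared distance (small => tight groups)
    groups = set(labels)
    return sum(mean_within(dist, labels, g) for g in groups) / len(groups)

# toy squared-distance matrix for 6 haplotypes from two populations
d = [[0, 1, 1, 9, 9, 9],
     [1, 0, 1, 9, 9, 9],
     [1, 1, 0, 9, 9, 9],
     [9, 9, 9, 0, 1, 1],
     [9, 9, 9, 1, 0, 1],
     [9, 9, 9, 1, 1, 0]]
labels = ["A", "A", "A", "B", "B", "B"]

obs = stat(d, labels)
random.seed(42)
perm_stats = []
for _ in range(999):
    shuffled = labels[:]
    random.shuffle(shuffled)
    perm_stats.append(stat(d, shuffled))
# one-sided p-value: fraction of relabellings at least as tight
p = (1 + sum(s <= obs for s in perm_stats)) / (999 + 1)
print(obs, p)
```

Because the null distribution is built from the data itself by shuffling labels, no distributional assumption is needed — exactly the motivation the abstract gives for the permutational approach.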

  6. Longitudinal Assessment of Self-Reported Recent Back Pain and Combat Deployment in the Millennium Cohort Study

    DTIC Science & Technology

    2016-11-15

    participants who were followed for the development of back pain for an average of 3.9 years. Methods: Descriptive statistics and longitudinal… health, military personnel, occupational health, outcome assessment, statistics, survey methodology. Level of Evidence: 3. Spine 2016;41:1754–1763 …based on the National Health and Nutrition Examination Survey. Statistical Analysis: Descriptive and univariate analyses compared characteristics

  7. Block observations of neighbourhood physical disorder are associated with neighbourhood crime, firearm injuries and deaths, and teen births.

    PubMed

    Wei, Evelyn; Hipwell, Alison; Pardini, Dustin; Beyers, Jennifer M; Loeber, Rolf

    2005-10-01

    To provide reliability information for a brief observational measure of physical disorder and determine its relation with neighbourhood-level crime and health variables after controlling for census-based measures of concentrated poverty and minority concentration. Psychometric analysis of block observation data comprising a brief measure of neighbourhood physical disorder, and cross-sectional analysis of neighbourhood physical disorder, neighbourhood crime and birth statistics, and neighbourhood-level poverty and minority concentration. Pittsburgh, Pennsylvania, US (2000 population=334 563). Pittsburgh neighbourhoods (n=82) and their residents (as reflected in neighbourhood-level statistics). The physical disorder index showed adequate reliability and validity and was associated significantly with rates of crime, firearm injuries and homicides, and teen births, while controlling for concentrated poverty and minority population. This brief measure of neighbourhood physical disorder may help increase our understanding of how community-level factors reflect health and crime outcomes.

  8. Temporal Variability of Upper-level Winds at the Eastern Range, Western Range and Wallops Flight Facility

    NASA Technical Reports Server (NTRS)

    Decker, Ryan K.; Barbre, Robert E., Jr.

    2014-01-01

    Space launch vehicles incorporate upper-level wind profiles to determine wind effects on the vehicle and for a commit-to-launch decision. These assessments incorporate wind profiles measured hours prior to launch and may not represent the actual wind the vehicle will fly through. Uncertainty in the upper-level winds over the time period between the assessment and launch can be mitigated by a statistical analysis of wind change over time periods of interest using historical data from the launch range. Five sets of temporal wind pairs at various time separations (0.75, 1.5, 2, 3 and 4 hours) at the Eastern Range, Western Range and Wallops Flight Facility were developed for use in upper-level wind assessments. Database development procedures as well as statistical analysis of temporal wind variability at each launch range will be presented.

  9. Descriptive statistics: the specification of statistical measures and their presentation in tables and graphs. Part 7 of a series on evaluation of scientific publications.

    PubMed

    Spriestersbach, Albert; Röhrig, Bernd; du Prel, Jean-Baptist; Gerhold-Ay, Aslihan; Blettner, Maria

    2009-09-01

    Descriptive statistics are an essential part of biometric analysis and a prerequisite for the understanding of further statistical evaluations, including the drawing of inferences. When data are well presented, it is usually obvious whether the author has collected and evaluated them correctly and in keeping with accepted practice in the field. Statistical variables in medicine may be of either the metric (continuous, quantitative) or categorical (nominal, ordinal) type. Easily understandable examples are given. Basic techniques for the statistical description of collected data are presented and illustrated with examples. The goal of a scientific study must always be clearly defined. The definition of the target value or clinical endpoint determines the level of measurement of the variables in question. Nearly all variables, whatever their level of measurement, can be usefully presented graphically and numerically. The level of measurement determines what types of diagrams and statistical values are appropriate. There are also different ways of presenting combinations of two independent variables graphically and numerically. The description of collected data is indispensable. If the data are of good quality, valid and important conclusions can already be drawn when they are properly described. Furthermore, data description provides a basis for inferential statistics.

  10. Recovering incomplete data using Statistical Multiple Imputations (SMI): a case study in environmental chemistry.

    PubMed

    Mercer, Theresa G; Frostick, Lynne E; Walmsley, Anthony D

    2011-10-15

    This paper presents a statistical technique that can be applied to environmental chemistry data where missing values and limit-of-detection levels prevent the application of standard statistical methods. A working example is taken from an environmental leaching study that was set up to determine if there were significant differences in levels of leached arsenic (As), chromium (Cr) and copper (Cu) between lysimeters containing preservative treated wood waste and those containing untreated wood. Fourteen lysimeters were set up and left in natural conditions for 21 weeks. The resultant leachate was analysed by ICP-OES to determine the As, Cr and Cu concentrations. However, due to the variation inherent in each lysimeter combined with the limits of detection offered by ICP-OES, the collected quantitative data was somewhat incomplete. Initial data analysis was hampered by the number of 'missing values' in the data. To recover the dataset, the statistical tool of Statistical Multiple Imputation (SMI) was applied, and the data was re-analysed successfully. It was demonstrated that using SMI did not affect the variance in the data, but facilitated analysis of the complete dataset. Copyright © 2011 Elsevier B.V. All rights reserved.
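The mechanics of multiple imputation can be sketched with Rubin's pooling rules. The data values, the limit of detection, and the uniform-below-LOD imputation model below are hypothetical simplifications for illustration, not the SMI procedure used in the study.

```python
import numpy as np

# Multiple imputation sketch: censored values below a limit of
# detection (LOD) are replaced by random draws below the LOD, m times;
# the m per-dataset estimates are then pooled with Rubin's rules.

rng = np.random.default_rng(7)
lod = 0.5
observed = np.array([0.8, 1.2, np.nan, 2.0, np.nan, 1.5])  # nan = below LOD

m = 20
means, variances = [], []
for _ in range(m):
    filled = observed.copy()
    n_miss = int(np.isnan(filled).sum())
    # draw censored values uniformly on (0, LOD) -- one simple choice
    filled[np.isnan(filled)] = rng.uniform(0.0, lod, size=n_miss)
    means.append(filled.mean())
    variances.append(filled.var(ddof=1) / len(filled))

q_bar = np.mean(means)                    # pooled point estimate
u_bar = np.mean(variances)                # within-imputation variance
b = np.var(means, ddof=1)                 # between-imputation variance
total_var = u_bar + (1 + 1 / m) * b       # Rubin's total variance
print(q_bar, total_var)
```

The between-imputation term inflates the pooled variance to reflect uncertainty about the censored values, so the imputation does not artificially shrink the variance of the dataset.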

  11. Evaluation of calcium ion, hydroxyl ion release and pH levels in various calcium hydroxide based intracanal medicaments: An in vitro study

    PubMed Central

    Fulzele, Punit; Baliga, Sudhindra; Thosar, Nilima; Pradhan, Debaprya

    2011-01-01

    Aims: Evaluation of calcium ion and hydroxyl ion release and pH levels in various calcium hydroxide based intracanal medicaments. Objective: The purpose of this study was to evaluate calcium and hydroxyl ion release and pH levels of calcium hydroxide based products, namely, RC Cal, Metapex, calcium hydroxide with distilled water, along with the new gutta-percha points with calcium hydroxide. Materials and Methods: The materials were inserted in polyethylene tubes and immersed in deionized water. The pH variation, Ca++ and OH- release were monitored periodically for 1 week. Statistical Analysis Used: Statistical analysis was carried out using one-way analysis of variance and Tukey's post hoc tests with PASW Statistics version 18 software to compare the statistical difference. Results: After 1 week, calcium hydroxide with distilled water and RC Cal raised the pH to 12.7 and 11.8, respectively, while a small change was observed for Metapex, calcium hydroxide gutta-percha points. The calcium released after 1 week was 15.36 mg/dL from RC Cal, followed by 13.04, 1.296, 3.064 mg/dL from calcium hydroxide with sterile water, Metapex and calcium hydroxide gutta-percha points, respectively. Conclusions: Calcium hydroxide with sterile water and RC Cal pastes liberate significantly more calcium and hydroxyl ions and raise the pH higher than Metapex and calcium hydroxide gutta-percha points. PMID:22346155

  12. An Overview of R in Health Decision Sciences.

    PubMed

    Jalal, Hawre; Pechlivanoglou, Petros; Krijkamp, Eline; Alarid-Escudero, Fernando; Enns, Eva; Hunink, M G Myriam

    2017-10-01

    As the complexity of health decision science applications increases, high-level programming languages are increasingly adopted for statistical analyses and numerical computations. These programming languages facilitate sophisticated modeling, model documentation, and analysis reproducibility. Among the high-level programming languages, the statistical programming framework R is gaining increased recognition. R is freely available, cross-platform compatible, and open source. A large community of users who have generated an extensive collection of well-documented packages and functions supports it. These functions facilitate applications of health decision science methodology as well as the visualization and communication of results. Although R's popularity is increasing among health decision scientists, methodological extensions of R in the field of decision analysis remain isolated. The purpose of this article is to provide an overview of existing R functionality that is applicable to the various stages of decision analysis, including model design, input parameter estimation, and analysis of model outputs.

  13. Event coincidence analysis for quantifying statistical interrelationships between event time series. On the role of flood events as triggers of epidemic outbreaks

    NASA Astrophysics Data System (ADS)

    Donges, J. F.; Schleussner, C.-F.; Siegmund, J. F.; Donner, R. V.

    2016-05-01

    Studying event time series is a powerful approach for analyzing the dynamics of complex dynamical systems in many fields of science. In this paper, we describe the method of event coincidence analysis to provide a framework for quantifying the strength, directionality and time lag of statistical interrelationships between event series. Event coincidence analysis allows one to formulate and test null hypotheses on the origin of the observed interrelationships, including tests based on Poisson processes or, more generally, stochastic point processes with a prescribed inter-event time distribution and other higher-order properties. Applying the framework to country-level observational data yields evidence that flood events have acted as triggers of epidemic outbreaks globally since the 1950s. Facing projected future changes in the statistics of climatic extreme events, statistical techniques such as event coincidence analysis will be relevant for investigating the impacts of anthropogenic climate change on human societies and ecosystems worldwide.
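The core counting step of event coincidence analysis is simple to state. The sketch below (hypothetical event times, fixed window) computes the fraction of events in one series that are followed within a lag window by an event in the other; it omits the null-model machinery described in the paper.

```python
# Event coincidence rate: for each event in series A, check whether an
# event in series B occurs within the window [a, a + delta].

def coincidence_rate(a_times, b_times, delta):
    hits = sum(any(0 <= b - a <= delta for b in b_times) for a in a_times)
    return hits / len(a_times)

floods = [2, 10, 15, 30]       # hypothetical flood-event years
outbreaks = [3, 11, 28, 40]    # hypothetical outbreak years
r = coincidence_rate(floods, outbreaks, delta=2)
print(r)  # -> 0.5: two of four floods are followed by an outbreak
```

Significance would then be assessed by comparing the observed rate against surrogates generated from a Poisson or other prescribed point-process null model, as the abstract describes.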

  14. The Economic Contribution of Canada's Colleges and Institutes. An Analysis of Investment Effectiveness and Economic Growth. Volume 2: Detailed Results by Gender and Entry Level of Education

    ERIC Educational Resources Information Center

    Robison, M. Henry; Christophersen, Kjell A.

    2008-01-01

    The purpose of this volume is to present the results of the economic impact analysis in detail by gender and entry level of education. On the data entry side, gender and entry level of education are important variables that help characterize the student body profile. This profile data links to national statistical databases which are already…

  15. Analysis and modeling of wafer-level process variability in 28 nm FD-SOI using split C-V measurements

    NASA Astrophysics Data System (ADS)

    Pradeep, Krishna; Poiroux, Thierry; Scheer, Patrick; Juge, André; Gouget, Gilles; Ghibaudo, Gérard

    2018-07-01

    This work details the analysis of wafer-level global process variability in 28 nm FD-SOI using split C-V measurements. The proposed approach initially evaluates the native on-wafer process variability using efficient extraction methods on split C-V measurements. The on-wafer threshold voltage (VT) variability is first studied and modeled using a simple analytical model. Then, a statistical model based on the Leti-UTSOI compact model is proposed to describe the total C-V variability in different bias conditions. This statistical model is finally used to study the contribution of each process parameter to the total C-V variability.

  16. Noise induced hearing loss of forest workers in Turkey.

    PubMed

    Tunay, M; Melemez, K

    2008-09-01

    In this study, a total of 114 workers in 3 different groups in terms of age and work underwent audiometric analysis. In order to determine whether there was a statistically significant difference between the hearing loss levels of the workers who were included in the study, variance analysis was applied with the help of the data obtained as a result of the evaluation. Correlation and regression analysis were applied in order to determine the relations between hearing loss and their age and their time of work. As a result of the variance analysis, statistically significant differences were found at 500, 2000 and 4000 Hz frequencies. The most specific difference was observed among chainsaw machine operators at 4000 Hz frequency, which was determined by the variance analysis. As a result of the correlation analysis, significant relations were found between time of work and hearing loss at the 0.01 significance level and between age and hearing loss at the 0.05 significance level. Forest workers using chainsaw machines should be informed, they should wear or use protective materials, less noisy chainsaws should be used if possible, and workers should undergo audiometric tests when they start work and once a year.

  17. A new framework for estimating return levels using regional frequency analysis

    NASA Astrophysics Data System (ADS)

    Winter, Hugo; Bernardara, Pietro; Clegg, Georgina

    2017-04-01

    We propose a new framework for incorporating more spatial and temporal information into the estimation of extreme return levels. Currently, most studies use extreme value models applied to data from a single site; an approach which is inefficient statistically and leads to return level estimates that are less physically realistic. We aim to highlight the benefits that could be obtained by using methodology based upon regional frequency analysis as opposed to classic single site extreme value analysis. This motivates a shift in thinking, which permits the evaluation of local and regional effects and makes use of the wide variety of data that are now available on high temporal and spatial resolutions. The recent winter storms over the UK during the winters of 2013-14 and 2015-16, which have caused wide-ranging disruption and damaged important infrastructure, provide the main motivation for the current work. One of the most impactful natural hazards is flooding, which is often initiated by extreme precipitation. In this presentation, we focus on extreme rainfall, but shall discuss other meteorological variables alongside potentially damaging hazard combinations. To understand the risks posed by extreme precipitation, we need reliable statistical models which can be used to estimate quantities such as the T-year return level, i.e. the level which is expected to be exceeded once every T-years. Extreme value theory provides the main collection of statistical models that can be used to estimate the risks posed by extreme precipitation events. Broadly, at a single site, a statistical model is fitted to exceedances of a high threshold and the model is used to extrapolate to levels beyond the range of the observed data. However, when we have data at many sites over a spatial domain, fitting a separate model for each separate site makes little sense and it would be better if we could incorporate all this information to improve the reliability of return level estimates. 
Here, we use the regional frequency analysis approach to define homogeneous regions which are affected by the same storms. Extreme value models are then fitted to the data pooled from across a region. We find that this approach leads to more spatially consistent return level estimates with reduced uncertainty bounds.
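For a single site, the peaks-over-threshold machinery reduces to a short formula for the T-year return level from a fitted generalized Pareto distribution. The parameter values below are hypothetical illustration values, not fitted results from the study:

```python
import math

# T-year return level from a peaks-over-threshold / generalized Pareto
# fit:  z_T = u + (sigma / xi) * ((lam * T)**xi - 1)  for xi != 0,
# where u is the threshold and lam the mean number of threshold
# exceedances per year.

def return_level(u, sigma, xi, lam, t_years):
    if abs(xi) < 1e-9:                       # Gumbel limit as xi -> 0
        return u + sigma * math.log(lam * t_years)
    return u + (sigma / xi) * ((lam * t_years) ** xi - 1)

# hypothetical daily-rainfall fit: threshold 30 mm, 5 exceedances/year
z100 = return_level(u=30.0, sigma=8.0, xi=0.1, lam=5.0, t_years=100.0)
print(round(z100, 1))  # 100-year return level in mm
```

Regional frequency analysis improves on this single-site calculation by pooling exceedances across a homogeneous region before fitting, which tightens the uncertainty on sigma and xi and hence on the extrapolated return level.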

  18. The platelet activating factor acetyl hydrolase, oxidized low-density lipoprotein, paraoxonase 1 and arylesterase levels in treated and untreated patients with polycystic ovary syndrome.

    PubMed

    Carlioglu, Ayse; Kaygusuz, Ikbal; Karakurt, Feridun; Gumus, Ilknur Inegol; Uysal, Aysel; Kasapoglu, Benan; Armutcu, Ferah; Uysal, Sema; Keskin, Esra Aktepe; Koca, Cemile

    2014-11-01

    To evaluate the platelet activating factor acetyl hydrolase (PAF-AH), oxidized low-density lipoprotein (ox-LDL), paraoxonase 1 (PON1) and arylesterase (ARE) levels and the effects of metformin and Diane-35 (ethinyl oestradiol + cyproterone acetate) therapies on these parameters, and to determine the PON1 polymorphisms among PCOS patients. Ninety patients with PCOS and 30 age- and body mass index-matched healthy controls were included in the study. Patients were divided into three groups: metformin treatment, Diane-35 treatment and no medication groups. The treatment with metformin or Diane-35 was continued for 6 months and all subjects were evaluated with clinical and biochemical parameters 6 months later. One-way ANOVA, t tests and non-parametric Mann-Whitney U tests were used for statistical analysis. PAF-AH and ox-LDL levels were statistically significantly higher in untreated PCOS patients than controls, and they were statistically significantly lower in patients treated with metformin or Diane-35 than untreated PCOS patients. In contrast, there were lower PON1 (not statistically significant) and ARE (statistically significant) levels in untreated PCOS patients than the control group and they significantly increased after metformin and Diane-35 treatments. In PCOS patients serum PON1 levels for QQ, QR and RR phenotypes were statistically significantly lower than the control group. In patients with PCOS, proatherogenic markers increase. The treatment of PCOS with metformin or Diane-35 had positive effects on lipid profile, increased the PON1 level, which is a protector from atherosclerosis, and decreased the proatherogenic PAF-AH and ox-LDL levels.

  19. Statistics Test Questions: Content and Trends

    ERIC Educational Resources Information Center

    Salcedo, Audy

    2014-01-01

    This study presents the results of the analysis of a group of teacher-made test questions for statistics courses at the university level. Teachers were asked to submit tests they had used in their previous two semesters. Ninety-seven tests containing 978 questions were gathered and classified according to the SOLO taxonomy (Biggs & Collis,…

  20. "Using Power Tables to Compute Statistical Power in Multilevel Experimental Designs"

    ERIC Educational Resources Information Center

    Konstantopoulos, Spyros

    2009-01-01

    Power computations for one-level experimental designs that assume simple random samples are greatly facilitated by power tables such as those presented in Cohen's book about statistical power analysis. However, in education and the social sciences experimental designs have naturally nested structures and multilevel models are needed to compute the…
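For the one-level, simple-random-sample case the article takes as its starting point, the tabled power values can be reproduced with a short normal-approximation formula. This is a sketch of that baseline case only; the multilevel designs the article develops require additional design-effect terms not shown here.

```python
import math

# Normal-approximation power for a two-group comparison:
#   power ~ Phi(d * sqrt(n / 2) - z_{1 - alpha/2})
# for standardized effect size d (Cohen's d) and n subjects per group.

def phi(x):
    # standard normal CDF via the error function
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def power_two_sample(d, n_per_group):
    z_crit = 1.959964  # z_{0.975}, two-sided alpha = 0.05
    return phi(d * math.sqrt(n_per_group / 2) - z_crit)

p = power_two_sample(d=0.5, n_per_group=64)
print(round(p, 3))
```

With Cohen's "medium" effect size d = 0.5 and 64 subjects per group, the approximation returns power of about 0.80, matching the familiar tabled value for this design.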

  1. The Power of 'Evidence': Reliable Science or a Set of Blunt Tools?

    ERIC Educational Resources Information Center

    Wrigley, Terry

    2018-01-01

    In response to the increasing emphasis on 'evidence-based teaching', this article examines the privileging of randomised controlled trials and their statistical synthesis (meta-analysis). It also pays particular attention to two third-level statistical syntheses: John Hattie's "Visible learning" project and the EEF's "Teaching and…

  2. Statistical Analysis and Time Series Modeling of Air Traffic Operations Data From Flight Service Stations and Terminal Radar Approach Control Facilities : Two Case Studies

    DOT National Transportation Integrated Search

    1981-10-01

    Two statistical procedures have been developed to estimate hourly or daily aircraft counts. These counts can then be transformed into estimates of instantaneous air counts. The first procedure estimates the stable (deterministic) mean level of hourly...

  3. Mean values of Arnett's soft tissue analysis in Maratha ethnic (Indian) population - A cephalometric study.

    PubMed

    Singh, Shikha; Deshmukh, Sonali; Merani, Varsha; Rejintal, Neeta

    2016-01-01

    The aim of this article is to evaluate the mean cephalometric values for Arnett's soft tissue analysis in the Maratha ethnic (Indian) population. Lateral cephalograms of 60 patients (30 males and 30 females) aged 18-26 years were obtained with the patients in the Natural Head Position (NHP), with teeth in maximum intercuspation and lips in the rest position. Hand tracings were also done. The statistical analysis was performed using the Statistical Package for the Social Sciences (SPSS) version 16, and Microsoft Word and Excel (Microsoft Office 2007) were used to generate the analytical data. Statistical significance was tested at the 1% and 5% levels of significance, using Student's unpaired t-test. Various cephalometric values for the Maratha ethnic (Indian) population differed from Caucasian cephalometric values, such as nasolabial inclination, incisor proclination, and exposure, which may affect the outcome of the orthodontic and orthognathic treatment. Marathas have more proclined maxillary incisors, less prominent chin, less facial length, acute nasolabial angle, and all soft tissue thicknesses are greater in Marathas except lower lip thickness (in Maratha males and females) and upper lip angle (in Maratha males) than those of the Caucasian population. It is a fact that all different ethnic races have different facial characters. The variability of the soft tissue integument in people with different ethnic origin makes it necessary to study the soft tissue standards of a particular community and consider those norms when planning an orthodontic and orthognathic treatment for particular racial and ethnic patients.

  4. Developing a Campaign Plan to Target Centers of Gravity Within Economic Systems

    DTIC Science & Technology

    1995-05-01

    Conclusion 67 CHAPTER 7: CURRENT AND FUTURE CONCERNS 69 Decision Making and Planning 69 Conclusion 72 CHAPTER 8: CONCLUSION 73 APPENDIX A: STATISTICS 80...Terminology and Statistical Tests 80 Country Analysis 84 APPENDIX B 154 BIBLIOGRAPHY 157 VITAE 162 IV LIST OF FIGURES Figure 1. Air Campaign...This project furthers the original statistical effort and adds to this a campaign planning approach (including both systems and operational level

  5. Criteria for a State-of-the-Art Vision Test System

    DTIC Science & Technology

    1985-05-01

    tests are enumerated for possible inclusion in a battery of candidate vision tests to be statistically examined for validity as predictors of aircrew...derived subset thereof) of vision tests may be given to a series of individuals, and statistical tests may be used to determine which visual functions...no target. Statistical analysis of the responses would set a threshold level, which would define the smallest size - (most distant target) or least

  6. Influence of bismuth oxide concentration on the pH level and biocompatibility of white Portland cement.

    PubMed

    Marciano, Marina Angélica; Garcia, Roberto Brandão; Cavenago, Bruno Cavalini; Minotti, Paloma Gagliardi; Midena, Raquel Zanin; Guimarães, Bruno Martini; Ordinola-Zapata, Ronald; Duarte, Marco Antonio Hungaro

    2014-01-01

    To investigate whether increasing the bismuth oxide concentration lowers pH levels and intensifies toxicity in Portland cement. White Portland cement (WPC) was mixed with 0, 15, 20, 30 and 50% bismuth oxide by weight. For the pH test, polyethylene tubes were filled with the cements and immersed in Milli-Q water for 15, 30 and 60 days; after each period, the pH level was assessed. For biocompatibility, two polyethylene tubes filled with the cements were implanted in ninety albino rats (n=6), and the intensity of the inflammatory infiltrate was analyzed after 15, 30 and 60 days. The statistical analysis was performed using the Kruskal-Wallis, Dunn and Friedman tests for the pH level and the Kruskal-Wallis and Dunn tests for the biological analysis (p<0.05). The results showed an increase in pH after 15 days, a slight further increase after 30 days and a decrease after 60 days, with no statistically significant differences among the groups (p>0.05). For the inflammatory infiltrates, no significant differences were found among the groups in each period (p>0.05). The 15% WPC showed a significant decrease of the inflammatory infiltrate from 15 to 30 and 60 days (p<0.05). The addition of bismuth oxide to Portland cement did not affect the pH level or the biological response; the 15% concentration produced a significantly lower inflammatory response than the other concentrations evaluated.
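
    The pH comparisons above use the Kruskal-Wallis test. A small sketch with SciPy; the concentrations and the n=6 group size follow the abstract, but the pH readings are hypothetical:

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical pH readings for three bismuth-oxide concentrations (n=6 each)
    rng = np.random.default_rng(1)
    ph_0 = rng.normal(11.2, 0.3, size=6)    # 0% bismuth oxide
    ph_15 = rng.normal(11.1, 0.3, size=6)   # 15%
    ph_50 = rng.normal(11.0, 0.3, size=6)   # 50%

    # Kruskal-Wallis H-test: non-parametric comparison of several groups
    h_stat, p_value = stats.kruskal(ph_0, ph_15, ph_50)
    print(f"H={h_stat:.3f}, p={p_value:.3f}")
    ```

    A post-hoc pairwise comparison (the Dunn test in the study) would follow only when this omnibus test is significant.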

  7. Impact of ecological factors on concern and awareness about disability: a statistical analysis.

    PubMed

    Walker, Gabriela

    2014-11-01

    The barriers that people with disabilities face around the world are not only inherent in the limitations resulting from the disability itself but, more importantly, rest with societal technologies of exclusion. A multiple regression analysis was conducted to examine the statistical relationship between the national level of development, the level of democratization, and the level of education of a country's population on one hand, and expressed concern for people with disabilities on the other. The results reveal that greater worry for the well-being of people with disabilities is correlated with a high level of country development, a decreased value of political stability and absence of violence, a decreased level of government effectiveness, and a greater level of law enforcement. There is a direct correlation between concern for people with disabilities and people's awareness about disabilities. Surprisingly, the level of education has no impact on compassion toward people with disabilities. A comparison case is discussed for in-depth illustration. Copyright © 2014 Elsevier Ltd. All rights reserved.
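
    A multiple regression of this kind can be sketched with ordinary least squares in NumPy. All predictor names and data below are hypothetical stand-ins for the country-level indicators described above:

    ```python
    import numpy as np

    # Hypothetical standardized country-level indicators and a concern score
    rng = np.random.default_rng(2)
    n = 100
    development = rng.normal(0.0, 1.0, n)   # e.g., a development index
    stability = rng.normal(0.0, 1.0, n)     # political stability / absence of violence
    education = rng.normal(0.0, 1.0, n)     # mean education level
    concern = 0.6 * development - 0.3 * stability + rng.normal(0.0, 0.5, n)

    # Multiple regression: design matrix with an intercept column, then OLS
    X = np.column_stack([np.ones(n), development, stability, education])
    beta, _, _, _ = np.linalg.lstsq(X, concern, rcond=None)
    print("coefficients (intercept, development, stability, education):", beta.round(2))
    ```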

  8. Forecasting the discomfort levels within the greater Athens area, Greece using artificial neural networks and multiple criteria analysis

    NASA Astrophysics Data System (ADS)

    Vouterakos, P. A.; Moustris, K. P.; Bartzokas, A.; Ziomas, I. C.; Nastos, P. T.; Paliatsos, A. G.

    2012-12-01

    In this work, artificial neural networks (ANNs) were developed and applied in order to forecast the discomfort levels due to the combination of high temperature and air humidity, during the hot season of the year, in eight different regions within the Greater Athens area (GAA), Greece. For the selection of the best type and architecture of ANN forecasting models, the multiple criteria analysis (MCA) technique was applied. Three different types of ANNs were developed and tested with the MCA method: the multilayer perceptron, the generalized feed forward networks (GFFN), and the time-lag recurrent networks. Results showed that the best performance was achieved by the GFFN model for the prediction of discomfort levels due to high temperature and air humidity within the GAA. For the evaluation of the constructed ANNs, appropriate statistical indices were used. The analysis showed that the forecasting ability of the developed ANN models is very satisfactory at a statistically significant level (p < 0.01).

  9. Use of statistical tools to evaluate the reductive dechlorination of high levels of TCE in microcosm studies.

    PubMed

    Harkness, Mark; Fisher, Angela; Lee, Michael D; Mack, E Erin; Payne, Jo Ann; Dworatzek, Sandra; Roberts, Jeff; Acheson, Carolyn; Herrmann, Ronald; Possolo, Antonio

    2012-04-01

    A large, multi-laboratory microcosm study was performed to select amendments for supporting reductive dechlorination of high levels of trichloroethylene (TCE) found at an industrial site in the United Kingdom (UK) containing dense non-aqueous phase liquid (DNAPL) TCE. The study was designed as a fractional factorial experiment involving 177 bottles distributed between four industrial laboratories and was used to assess the impact of six electron donors, bioaugmentation, addition of supplemental nutrients, and two TCE levels (0.57 and 1.90 mM or 75 and 250 mg/L in the aqueous phase) on TCE dechlorination. Performance was assessed based on the concentration changes of TCE and reductive dechlorination degradation products. The chemical data was evaluated using analysis of variance (ANOVA) and survival analysis techniques to determine both main effects and important interactions for all the experimental variables during the 203-day study. The statistically based design and analysis provided powerful tools that aided decision-making for field application of this technology. The analysis showed that emulsified vegetable oil (EVO), lactate, and methanol were the most effective electron donors, promoting rapid and complete dechlorination of TCE to ethene. Bioaugmentation and nutrient addition also had a statistically significant positive impact on TCE dechlorination. In addition, the microbial community was measured using phospholipid fatty acid analysis (PLFA) for quantification of total biomass and characterization of the community structure and quantitative polymerase chain reaction (qPCR) for enumeration of Dehalococcoides organisms (Dhc) and the vinyl chloride reductase (vcrA) gene. The highest increase in levels of total biomass and Dhc was observed in the EVO microcosms, which correlated well with the dechlorination results. Copyright © 2012 Elsevier B.V. All rights reserved.
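
    The main-effects screening in such a factorial study rests on ANOVA. A one-way sketch with SciPy comparing invented dechlorination performance across three electron donors (the first two donor names follow the abstract; the third donor and all values are hypothetical):

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical fraction of TCE fully dechlorinated to ethene per microcosm
    rng = np.random.default_rng(3)
    evo = rng.normal(0.90, 0.05, size=10)       # emulsified vegetable oil
    lactate = rng.normal(0.85, 0.05, size=10)
    donor_c = rng.normal(0.40, 0.05, size=10)   # a hypothetical weaker donor

    # One-way ANOVA: do mean dechlorination levels differ across donors?
    f_stat, p_value = stats.f_oneway(evo, lactate, donor_c)
    print(f"F={f_stat:.1f}, p={p_value:.2e}")
    ```

    The study itself used a fractional factorial design with several crossed factors, so its ANOVA also estimated interactions; this sketch shows only the single-factor case.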

  10. Nearfield Summary and Statistical Analysis of the Second AIAA Sonic Boom Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Park, Michael A.; Nemec, Marian

    2017-01-01

    A summary is provided for the Second AIAA Sonic Boom Workshop held 8-9 January 2017 in conjunction with AIAA SciTech 2017. The workshop used three required models of increasing complexity: an axisymmetric body, a wing body, and a complete configuration with a flow-through nacelle. An optional complete configuration with propulsion boundary conditions was also provided. These models are designed with similar nearfield signatures to isolate geometry and shock/expansion interaction effects. Eleven international participant groups submitted nearfield signatures with forces, pitching moment, and iterative convergence norms. Statistics and grid convergence of these nearfield signatures are presented. The submissions are propagated to the ground and noise levels are computed, allowing the grid convergence and the statistical distribution of the noise levels to be examined. While progress since the first workshop is documented, improvements to the analysis methods for a possible subsequent workshop are proposed. The complete configuration with flow-through nacelle showed the most dramatic improvement between the two workshops. The current workshop cases are more relevant to vehicles with lower loudness and have the potential for lower annoyance than the first workshop cases. These models, with quieter ground noise levels than those of the first workshop, exposed weaknesses in analysis, particularly in convective discretization.

  11. Statistical analysis of the calibration procedure for personnel radiation measurement instruments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bush, W.J.; Bengston, S.J.; Kalbeitzer, F.L.

    1980-11-01

    Thermoluminescent analyzer (TLA) calibration procedures were used to estimate personnel radiation exposure levels at the Idaho National Engineering Laboratory (INEL). A statistical analysis is presented herein based on data collected over a six month period in 1979 on four TLA's located in the Department of Energy (DOE) Radiological and Environmental Sciences Laboratory at the INEL. The data were collected according to the day-to-day procedure in effect at that time. Both gamma and beta radiation models are developed. Observed TLA readings of thermoluminescent dosimeters are correlated with known radiation levels. This correlation is then used to predict unknown radiation doses from future analyzer readings of personnel thermoluminescent dosimeters. The statistical techniques applied in this analysis include weighted linear regression, estimation of systematic and random error variances, prediction interval estimation using Scheffe's theory of calibration, the estimation of the ratio of the means of two normal bivariate distributed random variables and their corresponding confidence limits according to Kendall and Stuart, tests of normality, experimental design, a comparison between instruments, and quality control.
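
    The core of such a calibration is a weighted linear regression of readings on known doses, followed by inverse prediction of an unknown dose from a new reading. A minimal NumPy sketch; the dose-reading pairs and the assumption that variance grows with dose are hypothetical:

    ```python
    import numpy as np

    # Hypothetical calibration data: known delivered dose vs. TLA reading
    known_dose = np.array([50.0, 100.0, 200.0, 400.0, 800.0])
    tla_reading = np.array([52.0, 104.0, 197.0, 410.0, 805.0])

    # Weighted least-squares line: reading = a + b * dose.
    # np.polyfit expects w = 1/sigma; here we assume variance ~ dose,
    # so sigma ~ sqrt(dose) and w = 1/sqrt(dose).
    b, a = np.polyfit(known_dose, tla_reading, deg=1, w=1.0 / np.sqrt(known_dose))

    # Inverse prediction (classical calibration): dose implied by a new reading
    new_reading = 300.0
    predicted_dose = (new_reading - a) / b
    print(f"fit: reading = {a:.2f} + {b:.3f}*dose; predicted dose = {predicted_dose:.1f}")
    ```

    The study's prediction intervals (Scheffe's theory of calibration) add uncertainty bounds around this point estimate, which this sketch omits.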

  12. THE MEASUREMENT OF BONE QUALITY USING GRAY LEVEL CO-OCCURRENCE MATRIX TEXTURAL FEATURES.

    PubMed

    Shirvaikar, Mukul; Huang, Ning; Dong, Xuanliang Neil

    2016-10-01

    In this paper, statistical methods for the estimation of bone quality to predict the risk of fracture are reported. Bone mineral density and bone architecture properties are the main contributors of bone quality. Dual-energy X-ray Absorptiometry (DXA) is the traditional clinical measurement technique for bone mineral density, but does not include architectural information to enhance the prediction of bone fragility. Other modalities are not practical due to cost and access considerations. This study investigates statistical parameters based on the Gray Level Co-occurrence Matrix (GLCM) extracted from two-dimensional projection images and explores links with architectural properties and bone mechanics. Data analysis was conducted on Micro-CT images of 13 trabecular bones (with an in-plane spatial resolution of about 50μm). Ground truth data for bone volume fraction (BV/TV), bone strength and modulus were available based on complex 3D analysis and mechanical tests. Correlation between the statistical parameters and biomechanical test results was studied using regression analysis. The results showed Cluster-Shade was strongly correlated with the microarchitecture of the trabecular bone and related to mechanical properties. Once the principal thesis of utilizing second-order statistics is established, it can be extended to other modalities, providing cost and convenience advantages for patients and doctors.
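
    A GLCM and its Cluster-Shade statistic can be computed directly with NumPy. This is a simplified sketch (a single pixel offset and coarse quantization, on a synthetic image), not the study's Micro-CT pipeline:

    ```python
    import numpy as np

    def glcm_cluster_shade(img, levels=8, dx=1, dy=0):
        """Symmetric, normalized GLCM for one pixel offset, and its Cluster
        Shade: a third-order moment about the GLCM marginal means."""
        g = np.zeros((levels, levels))
        h, w = img.shape
        for y in range(h - dy):
            for x in range(w - dx):
                i, j = img[y, x], img[y + dy, x + dx]
                g[i, j] += 1
                g[j, i] += 1            # count the pair in both directions
        g /= g.sum()
        idx = np.arange(levels)
        pi, pj = idx[:, None], idx[None, :]
        mu_i, mu_j = (pi * g).sum(), (pj * g).sum()
        return ((pi + pj - mu_i - mu_j) ** 3 * g).sum()

    # Synthetic 8-level image standing in for a quantized projection image
    rng = np.random.default_rng(4)
    img = rng.integers(0, 8, size=(32, 32))
    cs = glcm_cluster_shade(img)
    print("cluster shade:", cs)
    ```

    In practice several offsets and angles are averaged, and the quantization level is chosen to match the image bit depth.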

  13. THE MEASUREMENT OF BONE QUALITY USING GRAY LEVEL CO-OCCURRENCE MATRIX TEXTURAL FEATURES

    PubMed Central

    Shirvaikar, Mukul; Huang, Ning; Dong, Xuanliang Neil

    2016-01-01

    In this paper, statistical methods for the estimation of bone quality to predict the risk of fracture are reported. Bone mineral density and bone architecture properties are the main contributors of bone quality. Dual-energy X-ray Absorptiometry (DXA) is the traditional clinical measurement technique for bone mineral density, but does not include architectural information to enhance the prediction of bone fragility. Other modalities are not practical due to cost and access considerations. This study investigates statistical parameters based on the Gray Level Co-occurrence Matrix (GLCM) extracted from two-dimensional projection images and explores links with architectural properties and bone mechanics. Data analysis was conducted on Micro-CT images of 13 trabecular bones (with an in-plane spatial resolution of about 50μm). Ground truth data for bone volume fraction (BV/TV), bone strength and modulus were available based on complex 3D analysis and mechanical tests. Correlation between the statistical parameters and biomechanical test results was studied using regression analysis. The results showed Cluster-Shade was strongly correlated with the microarchitecture of the trabecular bone and related to mechanical properties. Once the principal thesis of utilizing second-order statistics is established, it can be extended to other modalities, providing cost and convenience advantages for patients and doctors. PMID:28042512

  14. Routes to failure: analysis of 41 civil aviation accidents from the Republic of China using the human factors analysis and classification system.

    PubMed

    Li, Wen-Chin; Harris, Don; Yu, Chung-San

    2008-03-01

    The human factors analysis and classification system (HFACS) is based upon Reason's organizational model of human error. HFACS was developed as an analytical framework for the investigation of the role of human error in aviation accidents, however, there is little empirical work formally describing the relationship between the components in the model. This research analyses 41 civil aviation accidents occurring to aircraft registered in the Republic of China (ROC) between 1999 and 2006 using the HFACS framework. The results show statistically significant relationships between errors at the operational level and organizational inadequacies at both the immediately adjacent level (preconditions for unsafe acts) and higher levels in the organization (unsafe supervision and organizational influences). The pattern of the 'routes to failure' observed in the data from this analysis of civil aircraft accidents show great similarities to that observed in the analysis of military accidents. This research lends further support to Reason's model that suggests that active failures are promoted by latent conditions in the organization. Statistical relationships linking fallible decisions in upper management levels were found to directly affect supervisory practices, thereby creating the psychological preconditions for unsafe acts and hence indirectly impairing the performance of pilots, ultimately leading to accidents.
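
    Associations between adjacent HFACS levels are commonly tested with chi-square tests on presence/absence counts across accidents. A SciPy sketch on an invented 2x2 table (the counts are illustrative, not the study's data):

    ```python
    import numpy as np
    from scipy.stats import chi2_contingency

    # Invented 2x2 counts over 41 accidents: precondition present/absent vs.
    # unsafe act present/absent (illustrative numbers only)
    table = np.array([[18, 3],    # precondition present: act present / act absent
                      [5, 15]])   # precondition absent

    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2={chi2:.2f}, dof={dof}, p={p:.4f}")
    ```

    A significant result here would mirror the paper's finding that failures at one organizational level co-occur with failures at the level below it.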

  15. Why weight? Modelling sample and observational level variability improves power in RNA-seq analyses.

    PubMed

    Liu, Ruijie; Holik, Aliaksei Z; Su, Shian; Jansz, Natasha; Chen, Kelan; Leong, Huei San; Blewitt, Marnie E; Asselin-Labat, Marie-Liesse; Smyth, Gordon K; Ritchie, Matthew E

    2015-09-03

    Variations in sample quality are frequently encountered in small RNA-sequencing experiments, and pose a major challenge in a differential expression analysis. Removal of high variation samples reduces noise, but at a cost of reducing power, thus limiting our ability to detect biologically meaningful changes. Similarly, retaining these samples in the analysis may not reveal any statistically significant changes due to the higher noise level. A compromise is to use all available data, but to down-weight the observations from more variable samples. We describe a statistical approach that facilitates this by modelling heterogeneity at both the sample and observational levels as part of the differential expression analysis. At the sample level this is achieved by fitting a log-linear variance model that includes common sample-specific or group-specific parameters that are shared between genes. The estimated sample variance factors are then converted to weights and combined with observational level weights obtained from the mean-variance relationship of the log-counts-per-million using 'voom'. A comprehensive analysis involving both simulations and experimental RNA-sequencing data demonstrates that this strategy leads to a universally more powerful analysis and fewer false discoveries when compared to conventional approaches. This methodology has wide application and is implemented in the open-source 'limma' package. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
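
    The sample-level idea can be illustrated with simple precision weights: estimate a per-sample variance factor shared across genes, then down-weight noisy samples. This is a crude sketch of the concept, not the limma/voom implementation:

    ```python
    import numpy as np

    # Hypothetical log-CPM matrix: 50 genes x 6 samples, with sample index 5
    # made deliberately noisier (a "bad quality" sample)
    rng = np.random.default_rng(5)
    logcpm = rng.normal(5.0, 1.0, size=(50, 6))
    logcpm[:, 5] += rng.normal(0.0, 3.0, size=50)   # extra noise in one sample

    # Crude per-sample variance factors: residual variance around each gene's
    # mean, pooled across genes (a simplification of the paper's log-linear
    # variance model with shared sample-specific parameters)
    resid = logcpm - logcpm.mean(axis=1, keepdims=True)
    sample_var = resid.var(axis=0)

    # Convert variance factors to precision weights, normalized to mean 1:
    # observations from the noisy sample get down-weighted in the analysis
    weights = 1.0 / sample_var
    weights /= weights.mean()
    print("relative sample weights:", weights.round(2))
    ```

    In the paper these sample weights are further combined with observation-level weights from the mean-variance trend of the log-counts.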

  16. A novel complete-case analysis to determine statistical significance between treatments in an intention-to-treat population of randomized clinical trials involving missing data.

    PubMed

    Liu, Wei; Ding, Jinhui

    2018-04-01

    The application of the principle of the intention-to-treat (ITT) to the analysis of clinical trials is challenged in the presence of missing outcome data. The consequences of stopping an assigned treatment in a withdrawn subject are unknown. It is difficult to make a single assumption about missing mechanisms for all clinical trials because there are complicated reactions in the human body to drugs due to the presence of complex biological networks, leading to data missing randomly or non-randomly. Currently there is no statistical method that can tell whether a difference between two treatments in the ITT population of a randomized clinical trial with missing data is significant at a pre-specified level. Making no assumptions about the missing mechanisms, we propose a generalized complete-case (GCC) analysis based on the data of completers. An evaluation of the impact of missing data on the ITT analysis reveals that a statistically significant GCC result implies a significant treatment effect in the ITT population at a pre-specified significance level unless, relative to the comparator, the test drug is poisonous to the non-completers as documented in their medical records. Applications of the GCC analysis are illustrated using literature data, and its properties and limits are discussed.

  17. Multi-resolutional shape features via non-Euclidean wavelets: Applications to statistical analysis of cortical thickness

    PubMed Central

    Kim, Won Hwa; Singh, Vikas; Chung, Moo K.; Hinrichs, Chris; Pachauri, Deepti; Okonkwo, Ozioma C.; Johnson, Sterling C.

    2014-01-01

    Statistical analysis on arbitrary surface meshes such as the cortical surface is an important approach to understanding brain diseases such as Alzheimer’s disease (AD). Surface analysis may be able to identify specific cortical patterns that relate to certain disease characteristics or exhibit differences between groups. Our goal in this paper is to make group analysis of signals on surfaces more sensitive. To do this, we derive multi-scale shape descriptors that characterize the signal around each mesh vertex, i.e., its local context, at varying levels of resolution. In order to define such a shape descriptor, we make use of recent results from harmonic analysis that extend traditional continuous wavelet theory from the Euclidean to a non-Euclidean setting (i.e., a graph, mesh or network). Using this descriptor, we conduct experiments on two different datasets, the Alzheimer’s Disease NeuroImaging Initiative (ADNI) data and images acquired at the Wisconsin Alzheimer’s Disease Research Center (W-ADRC), focusing on individuals labeled as having Alzheimer’s disease (AD), mild cognitive impairment (MCI) and healthy controls. In particular, we contrast traditional univariate methods with our multi-resolution approach, which shows increased sensitivity and improved statistical power to detect group-level effects. We also provide an open source implementation. PMID:24614060

  18. Combine bivariate statistics analysis and multivariate statistics analysis to assess landslide susceptibility in Chen-Yu-Lan watershed, Nantou, Taiwan.

    NASA Astrophysics Data System (ADS)

    Ngan Nguyen, Thi To; Liu, Cheng-Chien

    2013-04-01

    Researchers have long asked how landslides occur and which factors trigger and accelerate them, and many investigations around the world have sought methods to predict and prevent landslide damage. The Chen-Yu-Lan River watershed is regarded as a hotspot of landslide research in Taiwan because of its complicated geological structure, with significant tectonic fault systems and steep mountainous terrain. Besides high annual precipitation and abrupt slopes, natural disasters such as typhoons (Sinlaku, 2008; Kalmaegi, 2008; Morakot, 2009) and the Chi-Chi earthquake (1999) have also triggered landslides with serious damage in this area. This research presents quantitative approaches to generating a landslide susceptibility map for the Chen-Yu-Lan watershed, a mountainous area in central Taiwan. Landslide inventories detected from Formosat-2 imagery over the eight years from 2004 to 2011 were used for susceptibility mapping. Bivariate and multivariate statistical analyses were applied to calculate landslide susceptibility indices, with factor weights computed from the 2004-2011 landslide data. To validate how strongly each factor contributes to landslide occurrence, several multivariate algorithms were built and their results compared with actual landslide occurrences. Historical landslide data were also used to assess and classify landslide susceptibility levels, and the relation between susceptibility level and landslide repetition was derived from the long-term record. The results demonstrated that potential factors such as slope gradient, drainage density, lithology and land use differ in how strongly they affect landslide phenomena, and showed a logical relationship between the weights and the characteristics of the factor classes. These results can help planners identify high-risk landslide areas, and safer areas for building and human activities.
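
    A common bivariate-statistics weighting of this kind is the frequency ratio. A NumPy sketch on invented slope-class counts (the classes and pixel counts are hypothetical):

    ```python
    import numpy as np

    # Invented raster counts for one conditioning factor (slope class)
    classes = ["0-15", "15-30", "30-45", ">45"]            # slope gradient, degrees
    class_pixels = np.array([50000, 30000, 15000, 5000])   # pixels per class
    slide_pixels = np.array([500, 1200, 1400, 400])        # landslide pixels per class

    # Frequency ratio: (share of landslide pixels in the class) divided by
    # (share of watershed area in the class); FR > 1 marks classes more
    # landslide-prone than average
    fr = (slide_pixels / slide_pixels.sum()) / (class_pixels / class_pixels.sum())
    for c, w in zip(classes, fr):
        print(f"slope {c}: FR = {w:.2f}")
    ```

    Summing such class weights over all conditioning factors for each pixel yields a simple susceptibility index map.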

  19. Low-Level Contrast Statistics of Natural Images Can Modulate the Frequency of Event-Related Potentials (ERP) in Humans.

    PubMed

    Ghodrati, Masoud; Ghodousi, Mahrad; Yoonessi, Ali

    2016-01-01

    Humans are fast and accurate in categorizing complex natural images. It is, however, unclear what features of visual information are exploited by the brain to perceive the images with such speed and accuracy. It has been shown that low-level contrast statistics of natural scenes can explain the variance of amplitude of event-related potentials (ERP) in response to rapidly presented images. In this study, we investigated the effect of these statistics on the frequency content of ERPs. We recorded ERPs from human subjects while they viewed natural images, each presented for 70 ms. Our results showed that Weibull contrast statistics, as a biologically plausible model, explained the variance of ERPs the best, compared to other image statistics that we assessed. Our time-frequency analysis revealed a significant correlation between these statistics and ERPs' power within the theta frequency band (~3-7 Hz). This is interesting, as the theta band is believed to be involved in context updating and semantic encoding. This correlation became significant at ~110 ms after stimulus onset, and peaked at 138 ms. Our results show that not only the amplitude but also the frequency of neural responses can be modulated by low-level contrast statistics of natural images, and highlight their potential role in scene perception.
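
    Weibull contrast statistics of the kind referred to above are typically obtained by fitting a Weibull distribution to an image's local contrast magnitudes. A SciPy sketch on a synthetic image (the smoothing step and the interpretation are simplified assumptions):

    ```python
    import numpy as np
    from scipy import stats

    # Synthetic stand-in for a natural image: a crudely smoothed random field
    rng = np.random.default_rng(6)
    img = rng.normal(size=(64, 64))
    img = (img + np.roll(img, 1, axis=0) + np.roll(img, 1, axis=1)) / 3.0

    # Local contrast: gradient magnitude at each pixel
    gy, gx = np.gradient(img)
    contrast = np.hypot(gx, gy).ravel()
    contrast = contrast[contrast > 0]

    # Fit a two-parameter Weibull (location fixed at 0); the scale parameter
    # tracks overall contrast strength and the shape parameter summarizes
    # how the distribution falls between power-law-like and Gaussian-like
    shape, loc, scale = stats.weibull_min.fit(contrast, floc=0)
    print(f"Weibull shape={shape:.2f}, scale={scale:.3f}")
    ```

    The two fitted parameters per image are then the regressors correlated with ERP amplitude or band power.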

  20. Low-Level Contrast Statistics of Natural Images Can Modulate the Frequency of Event-Related Potentials (ERP) in Humans

    PubMed Central

    Ghodrati, Masoud; Ghodousi, Mahrad; Yoonessi, Ali

    2016-01-01

    Humans are fast and accurate in categorizing complex natural images. It is, however, unclear what features of visual information are exploited by the brain to perceive the images with such speed and accuracy. It has been shown that low-level contrast statistics of natural scenes can explain the variance of amplitude of event-related potentials (ERP) in response to rapidly presented images. In this study, we investigated the effect of these statistics on the frequency content of ERPs. We recorded ERPs from human subjects while they viewed natural images, each presented for 70 ms. Our results showed that Weibull contrast statistics, as a biologically plausible model, explained the variance of ERPs the best, compared to other image statistics that we assessed. Our time-frequency analysis revealed a significant correlation between these statistics and ERPs' power within the theta frequency band (~3–7 Hz). This is interesting, as the theta band is believed to be involved in context updating and semantic encoding. This correlation became significant at ~110 ms after stimulus onset, and peaked at 138 ms. Our results show that not only the amplitude but also the frequency of neural responses can be modulated by low-level contrast statistics of natural images, and highlight their potential role in scene perception. PMID:28018197

  1. Study of photon correlation techniques for processing of laser velocimeter signals

    NASA Technical Reports Server (NTRS)

    Mayo, W. T., Jr.

    1977-01-01

    The objective was to provide the theory and a system design for a new type of photon-counting processor for low-level dual-scatter laser velocimeter (LV) signals, capable of both first-order measurements of mean flow and turbulence intensity and second-order time statistics: cross-correlation, autocorrelation, and related spectra. A general Poisson process model for low-level LV signals and noise, valid from the photon-resolved regime all the way to the limiting case of nonstationary Gaussian noise, was used. Computer simulation algorithms and higher-order statistical moment analysis of Poisson processes were derived and applied to the analysis of photon correlation techniques. A system design using a unique dual-correlate-and-subtract frequency discriminator technique is postulated and analyzed. Expectation analysis indicates that the objective measurements are feasible.

  2. Spatially Pooled Contrast Responses Predict Neural and Perceptual Similarity of Naturalistic Image Categories

    PubMed Central

    Groen, Iris I. A.; Ghebreab, Sennay; Lamme, Victor A. F.; Scholte, H. Steven

    2012-01-01

    The visual world is complex and continuously changing. Yet, our brain transforms patterns of light falling on our retina into a coherent percept within a few hundred milliseconds. Possibly, low-level neural responses already carry substantial information to facilitate rapid characterization of the visual input. Here, we computationally estimated low-level contrast responses to computer-generated naturalistic images, and tested whether spatial pooling of these responses could predict image similarity at the neural and behavioral level. Using EEG, we show that statistics derived from pooled responses explain a large amount of variance between single-image evoked potentials (ERPs) in individual subjects. Dissimilarity analysis on multi-electrode ERPs demonstrated that large differences between images in pooled response statistics are predictive of more dissimilar patterns of evoked activity, whereas images with little difference in statistics give rise to highly similar evoked activity patterns. In a separate behavioral experiment, images with large differences in statistics were judged as different categories, whereas images with little differences were confused. These findings suggest that statistics derived from low-level contrast responses can be extracted in early visual processing and can be relevant for rapid judgment of visual similarity. We compared our results with two other well-known contrast statistics: Fourier power spectra and higher-order properties of contrast distributions (skewness and kurtosis). Interestingly, whereas these statistics allow for accurate image categorization, they do not predict ERP response patterns or behavioral categorization confusions. These converging computational, neural and behavioral results suggest that statistics of pooled contrast responses contain information that corresponds with perceived visual similarity in a rapid, low-level categorization task. PMID:23093921

  3. Bilirubin and Stroke Risk Using a Mendelian Randomization Design.

    PubMed

    Lee, Sun Ju; Jee, Yon Ho; Jung, Keum Ji; Hong, Seri; Shin, Eun Soon; Jee, Sun Ha

    2017-05-01

    Circulating bilirubin, a natural antioxidant, is associated with decreased risk of stroke. However, the nature of the relationship between the two remains unknown. We used a Mendelian randomization analysis to assess the causal effect of serum bilirubin on stroke risk in Koreans. Fourteen single-nucleotide polymorphisms (SNPs) (P < 10^-7), including rs6742078 in the uridine diphosphoglucuronyl-transferase gene, were selected from a genome-wide association study of bilirubin level in the KCPS-II (Korean Cancer Prevention Study-II) Biobank subcohort consisting of 4793 healthy Koreans and 806 stroke cases. A weighted genetic risk score was calculated using the 14 top SNPs. Both rs6742078 (F statistic=138) and the weighted genetic risk score (F statistic=187) were strongly associated with bilirubin levels. Serum bilirubin level was associated with decreased risk of stroke in an ordinary least-squares analysis. However, in a two-stage least-squares Mendelian randomization analysis, no causal relationship between serum bilirubin and stroke risk was found. There is no evidence that bilirubin level is causally associated with risk of stroke in Koreans; therefore, bilirubin level is not a risk determinant of stroke. © 2017 American Heart Association, Inc.
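
    The two-stage least-squares logic can be illustrated on simulated data, where an instrument recovers a null causal effect that naive regression misses. All numbers below are invented:

    ```python
    import numpy as np

    # Simulated Mendelian-randomization scenario: a genetic instrument G
    # raises bilirubin B, a hidden confounder U drives both B and the
    # outcome Y, and B has NO causal effect on Y
    rng = np.random.default_rng(7)
    n = 20000
    G = rng.binomial(2, 0.3, n).astype(float)    # genotype dosage (instrument)
    U = rng.normal(0.0, 1.0, n)                  # unobserved confounder
    B = 0.5 * G + 0.8 * U + rng.normal(0.0, 1.0, n)   # bilirubin level
    Y = 0.0 * B + 0.8 * U + rng.normal(0.0, 1.0, n)   # outcome (zero causal effect)

    # Naive OLS slope of Y on B is confounded away from zero
    ols = np.polyfit(B, Y, 1)[0]

    # Instrumental-variable estimate (two-stage least squares reduces to the
    # Wald ratio here): cov(G, Y) / cov(G, B) recovers the causal effect
    iv = np.cov(G, Y)[0, 1] / np.cov(G, B)[0, 1]
    print(f"OLS estimate: {ols:.3f} (biased); IV estimate: {iv:.3f} (near zero)")
    ```

    The F statistics quoted in the abstract measure instrument strength in exactly this first-stage G-to-B relationship.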

  4. Efficiency Analysis of Public Universities in Thailand

    ERIC Educational Resources Information Center

    Kantabutra, Saranya; Tang, John C. S.

    2010-01-01

    This paper examines the performance of Thai public universities in terms of efficiency, using a non-parametric approach called data envelopment analysis. Two efficiency models, the teaching efficiency model and the research efficiency model, are developed and the analysis is conducted at the faculty level. Further statistical analyses are also…

  5. Emergent Irreversibility and Entanglement Spectrum Statistics

    NASA Astrophysics Data System (ADS)

    Chamon, Claudio; Hamma, Alioscia; Mucciolo, Eduardo R.

    2014-06-01

    We study the problem of irreversibility when the dynamical evolution of a many-body system is described by a stochastic quantum circuit. Such evolution is more general than a Hamiltonian one, and since energy levels are not well defined, the well-established connection between the statistical fluctuations of the energy spectrum and irreversibility cannot be made. We show that the entanglement spectrum provides a more general connection. Irreversibility is marked by a failure of a disentangling algorithm and is preceded by the appearance of Wigner-Dyson statistical fluctuations in the entanglement spectrum. This analysis can be done at the wave-function level and offers an alternative route to study quantum chaos and quantum integrability.

  6. In vivo evaluation of the effect of stimulus distribution on FIR statistical efficiency in event-related fMRI

    PubMed Central

    Jansma, J Martijn; de Zwart, Jacco A; van Gelderen, Peter; Duyn, Jeff H; Drevets, Wayne C; Furey, Maura L

    2013-01-01

    Technical developments in MRI have improved signal-to-noise ratios, allowing the use of analysis methods such as finite impulse response (FIR) modeling for rapid event-related functional MRI (er-fMRI). FIR is one of the most informative analysis methods, as it determines the onset and full shape of the hemodynamic response function (HRF) without any a priori assumptions. FIR is, however, vulnerable to multicollinearity, which is directly related to the distribution of stimuli over time. Efficiency can be optimized by simplifying a design and restricting the stimulus distribution to specific sequences, while more design flexibility necessarily reduces efficiency. However, the actual effect of efficiency on fMRI results has never been tested in vivo. Thus, it is currently difficult to make an informed choice between protocol flexibility and statistical efficiency. The main goal of this study was to assign concrete fMRI signal-to-noise values to the abstract scale of FIR statistical efficiency. Ten subjects repeated a perception task with five random and m-sequence-based protocols, with varying but, according to the literature, acceptable levels of multicollinearity. Results indicated substantial differences in signal standard deviation, with the level being a function of multicollinearity. Experiment protocols varied by up to 55.4% in standard deviation. Results confirm that the quality of fMRI in an FIR analysis can significantly and substantially vary with statistical efficiency. Our in vivo measurements can be used to aid in making an informed decision between freedom in protocol design and statistical efficiency. PMID:23473798
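
    The dependence of FIR efficiency on the stimulus distribution can be illustrated with a short sketch: build the FIR design matrix from stimulus onsets and score it with one common efficiency measure, 1/trace((X'X)^-1). The onsets, scan count, and lag count below are illustrative, not the study's actual protocols.

```python
# FIR design efficiency as a function of the stimulus distribution.
# Efficiency here is 1 / trace((X'X)^-1), a common summary of estimator
# variance; onsets, scan count and lag count are illustrative only.
import numpy as np

def fir_design(onsets, n_scans, n_lags):
    """One column per post-stimulus lag; X[t, k] = 1 if a stimulus began k scans before t."""
    X = np.zeros((n_scans, n_lags))
    for onset in onsets:
        for k in range(n_lags):
            if onset + k < n_scans:
                X[onset + k, k] = 1.0
    return X

def efficiency(X):
    """Higher when the lag columns are less collinear."""
    return 1.0 / np.trace(np.linalg.inv(X.T @ X))

# Widely spaced onsets: the lag columns do not overlap, so X'X is diagonal.
X = fir_design(onsets=[0, 7, 15, 26, 34], n_scans=40, n_lags=4)
print(round(efficiency(X), 3))  # 1.25
```

With widely spaced onsets the lag columns do not overlap and X'X is diagonal; closely spaced onsets typically introduce off-diagonal terms (multicollinearity) and lower the score.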

  7. Effects of the water level on the flow topology over the Bolund island

    NASA Astrophysics Data System (ADS)

    Cuerva-Tejero, A.; Yeow, T. S.; Gallego-Castillo, C.; Lopez-Garcia, O.

    2014-06-01

    We have analyzed the influence of the actual height of Bolund island above water level on different full-scale statistics of the velocity field over the peninsula. Our analysis is focused on the database of 10-minute statistics provided by Risø-DTU for the Bolund Blind Experiment. We have considered 10-minute periods with near-neutral atmospheric conditions, mean wind speed values in the interval [5,20] m/s, and westerly wind directions. As expected, statistics such as speed-up, normalized increase of turbulent kinetic energy and probability of recirculating flow show a large dependence on the emerged height of the island for the locations close to the escarpment. For the published ensemble mean values of speed-up and normalized increase of turbulent kinetic energy in these locations, we propose that some amount of uncertainty could be explained as a deterministic dependence of the flow field statistics upon the actual height of the Bolund island above the sea level.

  8. A Statistical Analysis of Induced Travel Effects in the U.S. Mid-Atlantic Region

    DOT National Transportation Integrated Search

    2000-04-01

    We investigate the hypothesis of induced travel demand. County level data from Maryland, Virginia, North Carolina, and Washington, DC are used to estimate "fixed-effects" cross-sectional time-series models that relate travel levels, measured as daily...

  9. Finding P-Values for F Tests of Hypothesis on a Spreadsheet.

    ERIC Educational Resources Information Center

    Rochowicz, John A., Jr.

    The calculation of the F statistic for a one-factor analysis of variance (ANOVA) and the construction of an ANOVA table are easily implemented on a spreadsheet. This paper describes how to compute the p-value (observed significance level) for a particular F statistic on a spreadsheet. Decision making on a spreadsheet and applications to the…

  10. An Automated Statistical Process Control Study of Inline Mixing Using Spectrophotometric Detection

    ERIC Educational Resources Information Center

    Dickey, Michael D.; Stewart, Michael D.; Willson, C. Grant

    2006-01-01

    An experiment is described, which is designed for a junior-level chemical engineering "fundamentals of measurements and data analysis" course, where students are introduced to the concept of statistical process control (SPC) through a simple inline mixing experiment. The students learn how to create and analyze control charts in an effort to…

  11. Multiple linear regression analysis

    NASA Technical Reports Server (NTRS)

    Edwards, T. R.

    1980-01-01

    Program rapidly selects best-suited set of coefficients. User supplies only vectors of independent and dependent data and specifies confidence level required. Program uses stepwise statistical procedure for relating minimal set of variables to set of observations; final regression contains only most statistically significant coefficients. Program is written in FORTRAN IV for batch execution and has been implemented on NOVA 1200.
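
    The stepwise selection the program performs can be sketched as forward selection: repeatedly add the predictor that most reduces the residual sum of squares, stopping when the relative improvement falls below a threshold that stands in for the user's confidence level. This is a Python sketch of the idea, not the FORTRAN IV program itself; the data and threshold are illustrative.

```python
# Forward stepwise selection: a sketch of the idea behind the program, not
# the FORTRAN IV code itself. At each step the predictor giving the largest
# drop in residual sum of squares (RSS) enters; selection stops when the
# relative improvement falls below `tol` (a stand-in for the confidence level).
import numpy as np

def forward_select(X, y, tol=0.05):
    n, p = X.shape
    chosen = []
    resid = float(np.sum((y - y.mean()) ** 2))  # RSS of intercept-only model
    while len(chosen) < p:
        best_j, best_rss = None, resid
        for j in range(p):
            if j in chosen:
                continue
            cols = np.column_stack([np.ones(n)] + [X[:, k] for k in chosen + [j]])
            beta = np.linalg.lstsq(cols, y, rcond=None)[0]
            rss = float(np.sum((y - cols @ beta) ** 2))
            if rss < best_rss:
                best_j, best_rss = j, rss
        if best_j is None or (resid - best_rss) / resid < tol:
            break  # no candidate improves the fit enough
        chosen.append(best_j)
        resid = best_rss
    return chosen

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))
y = 3 * X[:, 1] - 2 * X[:, 3] + rng.normal(scale=0.1, size=50)
print(forward_select(X, y))  # the informative columns 1 and 3 enter first
```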

  12. A novel bi-level meta-analysis approach: applied to biological pathway analysis.

    PubMed

    Nguyen, Tin; Tagett, Rebecca; Donato, Michele; Mitrea, Cristina; Draghici, Sorin

    2016-02-01

    The accumulation of high-throughput data in public repositories creates a pressing need for integrative analysis of multiple datasets from independent experiments. However, study heterogeneity, study bias, outliers and the lack of power of available methods present real challenges in integrating genomic data. One practical drawback of many P-value-based meta-analysis methods, including Fisher's, Stouffer's, minP and maxP, is that they are sensitive to outliers. Another drawback is that, because they perform just one statistical test for each individual experiment, they may not fully exploit the potentially large number of samples within each study. We propose a novel bi-level meta-analysis approach that employs the additive method and the Central Limit Theorem within each individual experiment and also across multiple experiments. We prove that the bi-level framework is robust against bias, less sensitive to outliers than other methods, and more sensitive to small changes in signal. For comparative analysis, we demonstrate that the intra-experiment analysis has more power than the equivalent statistical test performed on a single large experiment. For pathway analysis, we compare the proposed framework versus classical meta-analysis approaches (Fisher's, Stouffer's and the additive method) as well as against a dedicated pathway meta-analysis package (MetaPath), using 1252 samples from 21 datasets related to three human diseases, acute myeloid leukemia (9 datasets), type II diabetes (5 datasets) and Alzheimer's disease (7 datasets). Our framework outperforms its competitors to correctly identify pathways relevant to the phenotypes. The framework is sufficiently general to be applied to any type of statistical meta-analysis. The R scripts are available on demand from the authors. sorin@wayne.edu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved.
For Permissions, please e-mail: journals.permissions@oup.com.
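
    The P-value combination methods named above can be sketched in a few lines. A single tiny p-value inflates Fisher's statistic without bound (via -2·ln p) but shifts the additive method's sum only linearly, which makes the outlier-sensitivity contrast concrete. The example p-values are hypothetical.

```python
# Classical p-value combination methods, standard library only.
import math
from statistics import NormalDist

def fisher_statistic(pvals):
    """Fisher: -2*sum(ln p), chi-square with 2k degrees of freedom under the null."""
    return -2.0 * sum(math.log(p) for p in pvals)

def stouffer(pvals):
    """Stouffer: average the z-scores, then convert back to a p-value."""
    nd = NormalDist()
    z = sum(nd.inv_cdf(1 - p) for p in pvals) / math.sqrt(len(pvals))
    return 1 - nd.cdf(z)

def additive(pvals):
    """Additive: P(sum of k independent uniforms <= s) = s^k / k! for s <= 1.
    An outlier p-value moves s only linearly, unlike Fisher's log transform."""
    s, k = sum(pvals), len(pvals)
    return s ** k / math.factorial(k) if s <= 1 else None

pvals = [0.01, 0.04, 0.10]   # hypothetical per-study p-values
print(round(fisher_statistic(pvals), 2))
print(round(stouffer(pvals), 5))
print(round(additive(pvals), 7))
```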

  13. Lower education level is a major risk factor for peritonitis incidence in chronic peritoneal dialysis patients: a retrospective cohort study with 12-year follow-up.

    PubMed

    Chern, Yahn-Bor; Ho, Pei-Shan; Kuo, Li-Chueh; Chen, Jin-Bor

    2013-01-01

    Peritoneal dialysis (PD)-related peritonitis remains an important complication in PD patients, potentially causing technique failure and influencing patient outcome. To date, no comprehensive study in the Taiwanese PD population has used a time-dependent statistical method to analyze the factors associated with PD-related peritonitis. Our single-center retrospective cohort study, conducted in southern Taiwan between February 1999 and July 2010, used time-dependent statistical methods to analyze the factors associated with PD-related peritonitis. The study recruited 404 PD patients for analysis, 150 of whom experienced at least 1 episode of peritonitis during the follow-up period. The incidence rate of peritonitis was highest during the first 6 months after PD start. A comparison of patients in the two groups (peritonitis vs no peritonitis) by univariate analysis showed that the peritonitis group included fewer men (p = 0.048) and more patients of older age (≥65 years, p = 0.049). In addition, patients who had never received compulsory education showed a statistically higher incidence of PD-related peritonitis in the univariate analysis (p = 0.04). A proportional hazards model identified education level (less than elementary school vs any higher education level) as having an independent association with PD-related peritonitis (hazard ratio [HR]: 1.45; 95% confidence interval [CI]: 1.01 to 2.06; p = 0.045). Comorbidities measured using the Charlson comorbidity index (score >2 vs ≤2) showed borderline statistical significance (HR: 1.44; 95% CI: 1.00 to 2.13; p = 0.053). A lower education level is a major risk factor for PD-related peritonitis independent of age, sex, hypoalbuminemia, and comorbidities. Our study emphasizes that a comprehensive PD education program is crucial for PD patients with a lower education level.

  14. Method and system of Jones-matrix mapping of blood plasma films with "fuzzy" analysis in differentiation of breast pathology changes

    NASA Astrophysics Data System (ADS)

    Zabolotna, Natalia I.; Radchenko, Kostiantyn O.; Karas, Oleksandr V.

    2018-01-01

    A method for diagnosing breast fibroadenoma is proposed, based on statistical analysis (determination and analysis of statistical moments of the 1st-4th order) of polarization images of the imaginary Jones-matrix elements of optically thin (attenuation coefficient τ <= 0.1) blood plasma films, with further intelligent differentiation based on "fuzzy" logic and discriminant analysis. The accuracy of differentiating blood plasma samples into the "norm" and breast "fibroadenoma" classes was 82.7% with linear discriminant analysis and 95.3% with the "fuzzy" logic method. These results confirm the potentially high reliability of differentiation by "fuzzy" analysis.
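
    The 1st-4th order statistical moments used to characterize the polarization maps are the mean, standard deviation, skewness, and kurtosis of the pixel-value sample. A minimal sketch with illustrative values:

```python
# Statistical moments of the 1st-4th order (mean, standard deviation,
# skewness, kurtosis) for a sample of polarization-map values; population
# formulas, illustrative input.
from statistics import mean, pstdev

def moments(values):
    m = mean(values)
    s = pstdev(values)
    n = len(values)
    skew = sum((v - m) ** 3 for v in values) / (n * s ** 3)
    kurt = sum((v - m) ** 4 for v in values) / (n * s ** 4)
    return m, s, skew, kurt

m, s, skew, kurt = moments([1, 2, 3, 4, 5])
print(m, round(s, 4), skew, kurt)  # symmetric sample, so skewness is 0.0
```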

  15. Statistics of high-level scene context.

    PubMed

    Greene, Michelle R

    2013-01-01

    Context is critical for recognizing environments and for searching for objects within them: contextual associations have been shown to modulate reaction time and object recognition accuracy, as well as influence the distribution of eye movements and patterns of brain activations. However, we have not yet systematically quantified the relationships between objects and their scene environments. Here I seek to fill this gap by providing descriptive statistics of object-scene relationships. A total of 48,167 objects were hand-labeled in 3499 scenes using the LabelMe tool (Russell et al., 2008). From these data, I computed a variety of descriptive statistics at three different levels of analysis: the ensemble statistics that describe the density and spatial distribution of unnamed "things" in the scene; the bag of words level where scenes are described by the list of objects contained within them; and the structural level where the spatial distribution and relationships between the objects are measured. The utility of each level of description for scene categorization was assessed through the use of linear classifiers, and the plausibility of each level for modeling human scene categorization is discussed. Of the three levels, ensemble statistics were found to be the most informative (per feature), and also best explained human patterns of categorization errors. Although a bag of words classifier had similar performance to human observers, it had a markedly different pattern of errors. However, certain objects are more useful than others, and ceiling classification performance could be achieved using only the 64 most informative objects. As object location tends not to vary as a function of category, structural information provided little additional information.
Additionally, these data provide valuable information on natural scene redundancy that can be exploited for machine vision, and can help the visual cognition community to design experiments guided by statistics rather than intuition.

  16. [Statistical analysis of articles in "Chinese journal of applied physiology" from 1999 to 2008].

    PubMed

    Du, Fei; Fang, Tao; Ge, Xue-ming; Jin, Peng; Zhang, Xiao-hong; Sun, Jin-li

    2010-05-01

    To evaluate the academic level and influence of "Chinese Journal of Applied Physiology" through a statistical analysis of the fund-sponsored articles published in the recent ten years. The articles of "Chinese Journal of Applied Physiology" from 1999 to 2008 were investigated. The number and the percentage of the fund-sponsored articles, the fund organization and the author region were quantitatively analyzed by using the literature metrology method. The number of the fund-sponsored articles increased steadily. The proportion of funding from local governments increased significantly in the latter five years. Most of the articles were from institutes located at Beijing, Zhejiang and Tianjin. "Chinese Journal of Applied Physiology" has a good academic level and social influence.

  17. Statistical modeling of crystalline silica exposure by trade in the construction industry using a database compiled from the literature.

    PubMed

    Sauvé, Jean-François; Beaudry, Charles; Bégin, Denis; Dion, Chantal; Gérin, Michel; Lavoué, Jérôme

    2012-09-01

    A quantitative determinants-of-exposure analysis of respirable crystalline silica (RCS) levels in the construction industry was performed using a database compiled from an extensive literature review. Statistical models were developed to predict work-shift exposure levels by trade. Monte Carlo simulation was used to recreate exposures derived from summarized measurements which were combined with single measurements for analysis. Modeling was performed using Tobit models within a multimodel inference framework, with year, sampling duration, type of environment, project purpose, project type, sampling strategy and use of exposure controls as potential predictors. 1346 RCS measurements were included in the analysis, of which 318 were non-detects and 228 were simulated from summary statistics. The model containing all the variables explained 22% of total variability. Apart from trade, sampling duration, year and strategy were the most influential predictors of RCS levels. The use of exposure controls was associated with an average decrease of 19% in exposure levels compared to none, and increased concentrations were found for industrial, demolition and renovation projects. Predicted geometric means for year 1999 were the highest for drilling rig operators (0.238 mg m(-3)) and tunnel construction workers (0.224 mg m(-3)), while the estimated exceedance fraction of the ACGIH TLV by trade ranged from 47% to 91%. The predicted geometric means in this study indicated important overexposure compared to the TLV. However, the low proportion of variability explained by the models suggests that the construction trade is only a moderate predictor of work-shift exposure levels. The impact of the different tasks performed during a work shift should also be assessed to provide better management and control of RCS exposure levels on construction sites.
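
    Recreating individual exposures from summarized measurements, as done above via Monte Carlo simulation, amounts to drawing from a lognormal distribution matched to a reported geometric mean (GM) and geometric standard deviation (GSD). A minimal sketch with an illustrative GM/GSD pair:

```python
# Monte Carlo recreation of exposures from summary statistics: draw
# lognormal values matched to a reported geometric mean (GM) and geometric
# standard deviation (GSD). The GM/GSD pair is illustrative.
import math
import random

def simulate_lognormal(gm, gsd, n, seed=1):
    """log(X) ~ Normal(ln GM, ln GSD), so GM and GSD are recovered on average."""
    rng = random.Random(seed)
    mu, sigma = math.log(gm), math.log(gsd)
    return [rng.lognormvariate(mu, sigma) for _ in range(n)]

sample = simulate_lognormal(gm=0.238, gsd=2.5, n=1000)
gm_hat = math.exp(sum(math.log(x) for x in sample) / len(sample))
print(round(gm_hat, 3))  # typically close to the target GM of 0.238
```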

  18. Review and statistical analysis of the use of ultrasonic velocity for estimating the porosity fraction in polycrystalline materials

    NASA Technical Reports Server (NTRS)

    Roth, D. J.; Swickard, S. M.; Stang, D. B.; Deguire, M. R.

    1991-01-01

    A review and statistical analysis of the ultrasonic velocity method for estimating the porosity fraction in polycrystalline materials is presented. Initially, a semiempirical model is developed showing the origin of the linear relationship between ultrasonic velocity and porosity fraction. Then, from a compilation of data produced by many researchers, scatter plots of velocity versus percent porosity data are shown for Al2O3, MgO, porcelain-based ceramics, PZT, SiC, Si3N4, steel, tungsten, UO2,(U0.30Pu0.70)C, and YBa2Cu3O(7-x). Linear regression analysis produces predicted slope, intercept, correlation coefficient, level of significance, and confidence interval statistics for the data. Velocity values predicted from regression analysis of fully-dense materials are in good agreement with those calculated from elastic properties.
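
    The regression statistics reported here (slope, intercept, correlation coefficient) come from ordinary least squares on velocity versus porosity, with the intercept estimating the fully-dense velocity. A minimal sketch with illustrative, not compiled, data:

```python
# Ordinary least squares for the velocity-porosity line: slope, intercept
# (the predicted fully-dense velocity) and correlation coefficient.
# The data points are illustrative, not the compiled measurements.
def linregress(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    r = sxy / (sxx * syy) ** 0.5
    return slope, intercept, r

phi = [0.00, 0.05, 0.10, 0.15, 0.20]   # porosity fraction
v = [10.8, 10.2, 9.7, 9.1, 8.6]        # velocity, km/s (illustrative)
slope, v0, r = linregress(phi, v)
print(round(slope, 2), round(v0, 2), round(r, 4))  # -11.0 10.78 -0.9995
```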

  19. Review and statistical analysis of the ultrasonic velocity method for estimating the porosity fraction in polycrystalline materials

    NASA Technical Reports Server (NTRS)

    Roth, D. J.; Swickard, S. M.; Stang, D. B.; Deguire, M. R.

    1990-01-01

    A review and statistical analysis of the ultrasonic velocity method for estimating the porosity fraction in polycrystalline materials is presented. Initially, a semi-empirical model is developed showing the origin of the linear relationship between ultrasonic velocity and porosity fraction. Then, from a compilation of data produced by many researchers, scatter plots of velocity versus percent porosity data are shown for Al2O3, MgO, porcelain-based ceramics, PZT, SiC, Si3N4, steel, tungsten, UO2,(U0.30Pu0.70)C, and YBa2Cu3O(7-x). Linear regression analysis produced predicted slope, intercept, correlation coefficient, level of significance, and confidence interval statistics for the data. Velocity values predicted from regression analysis for fully-dense materials are in good agreement with those calculated from elastic properties.

  20. Controlling the joint local false discovery rate is more powerful than meta-analysis methods in joint analysis of summary statistics from multiple genome-wide association studies.

    PubMed

    Jiang, Wei; Yu, Weichuan

    2017-02-15

    In genome-wide association studies (GWASs) of common diseases/traits, we often analyze multiple GWASs with the same phenotype together to discover associated genetic variants with higher power. Since it is difficult to access data with detailed individual measurements, summary-statistics-based meta-analysis methods have become popular to jointly analyze datasets from multiple GWASs. In this paper, we propose a novel summary-statistics-based joint analysis method based on controlling the joint local false discovery rate (Jlfdr). We prove that our method is the most powerful summary-statistics-based joint analysis method when controlling the false discovery rate at a certain level. In particular, the Jlfdr-based method achieves higher power than commonly used meta-analysis methods when analyzing heterogeneous datasets from multiple GWASs. Simulation experiments demonstrate the superior power of our method over meta-analysis methods. Also, our method discovers more associations than meta-analysis methods from empirical datasets of four phenotypes. The R-package is available at: http://bioinformatics.ust.hk/Jlfdr.html . eeyu@ust.hk. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
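
    Jlfdr itself is defined in the paper; for contrast, the summary-statistics meta-analysis it is compared against is typically a fixed-effect inverse-variance combination, which can be sketched as follows (the effect estimates and standard errors are hypothetical):

```python
# Fixed-effect inverse-variance meta-analysis of summary statistics, the
# conventional baseline that Jlfdr is compared against. Effect estimates
# (betas) and standard errors below are hypothetical.
import math
from statistics import NormalDist

def fixed_effect_meta(betas, ses):
    """Weight each study by 1/SE^2; the combined z gives the meta-analysis p-value."""
    w = [1.0 / se ** 2 for se in ses]
    beta = sum(wi * b for wi, b in zip(w, betas)) / sum(w)
    se = 1.0 / math.sqrt(sum(w))
    z = beta / se
    p = 2 * (1 - NormalDist().cdf(abs(z)))
    return beta, se, p

beta, se, p = fixed_effect_meta([0.12, 0.08, 0.15], [0.05, 0.04, 0.06])
print(round(beta, 4), round(se, 4), round(p, 6))
```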

  1. Utilization of an Enhanced Canonical Correlation Analysis (ECCA) to Predict Daily Precipitation and Temperature in a Semi-Arid Environment

    NASA Astrophysics Data System (ADS)

    Lopez, S. R.; Hogue, T. S.

    2011-12-01

    Global climate models (GCMs) are primarily used to generate historical and future large-scale circulation patterns at a coarse resolution (typical order of 50,000 km2) and fail to capture climate variability at the ground level due to localized surface influences (i.e., topography, marine layer, land cover, etc.). Their inability to accurately resolve these processes has led to the development of numerous 'downscaling' techniques. The goal of this study is to enhance statistical downscaling of daily precipitation and temperature for regions with heterogeneous land cover and topography. Our analysis was divided into two periods, historical (1961-2000) and contemporary (1980-2000), and tested using sixteen predictand combinations from four GCMs (GFDL CM2.0, GFDL CM2.1, CNRM-CM3 and MRI-CGCM2.3.2a). The Southern California area was separated into five county regions: Santa Barbara, Ventura, Los Angeles, Orange and San Diego. Principal component analysis (PCA) was performed on ground-based observations in order to (1) reduce the number of redundant gauges and minimize dimensionality and (2) cluster gauges that behave statistically similarly for post-analysis. Post-PCA analysis included extensive testing of predictor-predictand relationships using an enhanced canonical correlation analysis (ECCA). The ECCA includes obtaining the optimal predictand sets for all models within each spatial domain (county) as governed by daily and monthly overall statistics. Results show all models maintain mean annual and monthly behavior within each county and daily statistics are improved. The level of improvement highly depends on the vegetation extent within each county and the land-to-ocean ratio within the GCM spatial grid. The utilization of the entire historical period also leads to better statistical representation of observed daily precipitation.
The validated ECCA technique is being applied to future climate scenarios distributed by the IPCC in order to provide forcing data for regional hydrologic models and assess future water resources in the Southern California region.

  2. Meta‐analysis using individual participant data: one‐stage and two‐stage approaches, and why they may differ

    PubMed Central

    Ensor, Joie; Riley, Richard D.

    2016-01-01

    Meta‐analysis using individual participant data (IPD) obtains and synthesises the raw, participant‐level data from a set of relevant studies. The IPD approach is becoming an increasingly popular tool as an alternative to traditional aggregate data meta‐analysis, especially as it avoids reliance on published results and provides an opportunity to investigate individual‐level interactions, such as treatment‐effect modifiers. There are two statistical approaches for conducting an IPD meta‐analysis: one‐stage and two‐stage. The one‐stage approach analyses the IPD from all studies simultaneously, for example, in a hierarchical regression model with random effects. The two‐stage approach derives aggregate data (such as effect estimates) in each study separately and then combines these in a traditional meta‐analysis model. There have been numerous comparisons of the one‐stage and two‐stage approaches via theoretical consideration, simulation and empirical examples, yet there remains confusion regarding when each approach should be adopted, and indeed why they may differ. In this tutorial paper, we outline the key statistical methods for one‐stage and two‐stage IPD meta‐analyses, and provide 10 key reasons why they may produce different summary results. We explain that most differences arise because of different modelling assumptions, rather than the choice of one‐stage or two‐stage itself. We illustrate the concepts with recently published IPD meta‐analyses, summarise key statistical software and provide recommendations for future IPD meta‐analyses. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. PMID:27747915

  3. The effect of orbital implantation on peripheral blood melatonin and sex hormone levels in child patients with congenital eyeball dysplasia.

    PubMed

    Ma, Junze; Liu, Tao; Qu, Jianqiang

    2017-09-01

    The aim of the study was to examine the effect of orbital implantation on peripheral blood melatonin and sex hormone levels in pediatric patients with congenital eyeball dysplasia. A total of 28 cases of pediatric patients with congenital eyeball dysplasia diagnosed in the Second Affiliated Hospital of Xi'an Jiaotong University from June 2014 to December 2014 were selected for the study. The patients included those that received orbital implantation, and the melatonin levels in the peripheral blood in patients before and after operation were observed. In addition, the sex hormone levels and T lymphocytes, plasma reactive oxygen species (ROS) and VEGF levels, and urine 8-OHdG and 8-isoPGF2α levels in patients before and after treatment were detected, followed by statistical analysis. After 3 months of orbital implantation, the sex hormone levels in peripheral blood in pediatric patients fluctuated, but the differences were not statistically significant (P>0.05). The peripheral blood T lymphocytes and ROS levels were significantly lower than those before treatment, and the differences were statistically significant (P<0.05). The correlation analysis revealed that the peripheral blood melatonin levels were negatively related to ROS levels (rs = -0.481, P<0.05). In conclusion, orbital implantation does not have a significant impact on sex hormone levels in pediatric patients with congenital eyeball dysplasia. The hydroxyapatite orbital implantation can achieve more satisfactory curative effects, and there are fewer postoperative complications. It does not affect the appearance of the eye, and therefore, it is suitable for patients with congenital eyeball dysplasia.
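
    The reported negative correlation (rs = -0.481) is a Spearman rank correlation. A minimal sketch (no tie correction), with hypothetical values chosen so the ranks are perfectly inverse:

```python
# Spearman rank correlation (no tie correction in this sketch). The
# melatonin and ROS values are hypothetical, chosen so the ranks are
# perfectly inverse and the coefficient is exactly -1.
def spearman(x, y):
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank + 1.0
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    m = (n + 1) / 2.0                      # mean of the ranks 1..n
    num = sum((a - m) * (b - m) for a, b in zip(rx, ry))
    den = (sum((a - m) ** 2 for a in rx) * sum((b - m) ** 2 for b in ry)) ** 0.5
    return num / den

melatonin = [12, 9, 15, 7, 11]    # hypothetical levels
ros = [3.1, 4.0, 2.2, 4.8, 3.5]   # hypothetical levels
print(spearman(melatonin, ros))   # -1.0
```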

  4. Is math anxiety in the secondary classroom limiting physics mastery? A study of math anxiety and physics performance

    NASA Astrophysics Data System (ADS)

    Mercer, Gary J.

    This quantitative study examined the relationship between secondary students with math anxiety and physics performance in an inquiry-based constructivist classroom. The Revised Math Anxiety Rating Scale was used to evaluate math anxiety levels. The results were then compared to the performance on a physics standardized final examination. A simple correlation was performed, followed by a multivariate regression analysis to examine effects based on gender and prior math background. The correlation showed statistical significance between math anxiety and physics performance. The regression analysis showed statistical significance for math anxiety, physics performance, and prior math background, but did not show statistical significance for math anxiety, physics performance, and gender.

  5. Inverse statistics and information content

    NASA Astrophysics Data System (ADS)

    Ebadi, H.; Bolgorian, Meysam; Jafari, G. R.

    2010-12-01

    Inverse statistics analysis studies the distribution of investment horizons needed to achieve a predefined level of return. This distribution exhibits a maximum, which determines the most likely horizon for gaining a specified return. There exists a significant difference between the inverse statistics of financial market data and those of a fractional Brownian motion (fBm) as an uncorrelated time series, which is a suitable criterion for measuring information content in financial data. In this paper we perform this analysis for the DJIA and S&P500 as two developed markets and the Tehran price index (TEPIX) as an emerging market. We also compare these probability distributions with the fBm probability, to detect when the behavior of the stocks is the same as that of an fBm.
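
    Inverse statistics invert the usual question: instead of the return after a fixed horizon, one records the first horizon at which a preset log-return level rho is reached. A minimal sketch on a hypothetical price series:

```python
# Inverse statistics: distribution of first-passage investment horizons.
# For each starting day t, record the smallest tau with
# log(p[t+tau] / p[t]) >= rho. The price series is hypothetical.
import math

def investment_horizons(prices, rho):
    logs = [math.log(p) for p in prices]
    horizons = []
    for t in range(len(logs) - 1):
        for tau in range(1, len(logs) - t):
            if logs[t + tau] - logs[t] >= rho:
                horizons.append(tau)   # level reached after tau steps
                break                  # starts that never reach rho are skipped
    return horizons

prices = [100, 99, 101, 103, 102, 104, 107, 106, 108]
print(investment_horizons(prices, rho=0.02))  # [3, 1, 3, 3, 2, 1]
```

A histogram of these waiting times, over a long series, is the distribution whose maximum marks the most likely horizon for that return level.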

  6. Texture analysis of apparent diffusion coefficient maps for treatment response assessment in prostate cancer bone metastases-A pilot study.

    PubMed

    Reischauer, Carolin; Patzwahl, René; Koh, Dow-Mu; Froehlich, Johannes M; Gutzeit, Andreas

    2018-04-01

    To evaluate whole-lesion volumetric texture analysis of apparent diffusion coefficient (ADC) maps for assessing treatment response in prostate cancer bone metastases. Texture analysis is performed in 12 treatment-naïve patients with 34 metastases before treatment and at one, two, and three months after the initiation of androgen deprivation therapy. Four first-order and 19 second-order statistical texture features are computed on the ADC maps in each lesion at every time point. Repeatability, inter-patient variability, and changes in the feature values under therapy are investigated. Spearman rank's correlation coefficients are calculated across time to demonstrate the relationship between the texture features and the serum prostate specific antigen (PSA) levels. With few exceptions, the texture features exhibited moderate to high precision. At the same time, Friedman's tests revealed that all first-order and second-order statistical texture features changed significantly in response to therapy. The majority of texture features showed significant changes in their values at all post-treatment time points relative to baseline. Bivariate analysis detected significant correlations between the great majority of texture features and the serum PSA levels. In particular, three first-order and six second-order statistical features showed strong correlations with the serum PSA levels across time. The findings in the present work indicate that whole-tumor volumetric texture analysis may be utilized for response assessment in prostate cancer bone metastases. The approach may be used as a complementary measure for treatment monitoring in conjunction with averaged ADC values. Copyright © 2018 Elsevier B.V. All rights reserved.

  7. Wastewater-Based Epidemiology of Stimulant Drugs: Functional Data Analysis Compared to Traditional Statistical Methods.

    PubMed

    Salvatore, Stefania; Bramness, Jørgen Gustav; Reid, Malcolm J; Thomas, Kevin Victor; Harman, Christopher; Røislien, Jo

    2015-01-01

    Wastewater-based epidemiology (WBE) is a new methodology for estimating the drug load in a population. Simple summary statistics and specification tests have typically been used to analyze WBE data, comparing differences between weekday and weekend loads. Such standard statistical methods may, however, overlook important nuanced information in the data. In this study, we apply functional data analysis (FDA) to WBE data and compare the results to those obtained from more traditional summary measures. We analysed temporal WBE data from 42 European cities, using sewage samples collected daily for one week in March 2013. For each city, the main temporal features of two selected drugs were extracted using functional principal component (FPC) analysis, along with simpler measures such as the area under the curve (AUC). The individual cities' scores on each of the temporal FPCs were then used as outcome variables in multiple linear regression analysis with various city and country characteristics as predictors. The results were compared to those of functional analysis of variance (FANOVA). The three first FPCs explained more than 99% of the temporal variation. The first component (FPC1) represented the level of the drug load, while the second and third temporal components represented the level and the timing of a weekend peak. AUC was highly correlated with FPC1, but other temporal characteristics were not captured by the simple summary measures. FANOVA was less flexible than the FPCA-based regression, but showed concordant results. Geographical location was the main predictor for the general level of the drug load. FDA of WBE data extracts more detailed information about drug load patterns during the week which is not identified by more traditional statistical methods. Results also suggest that regression based on FPC results is a valuable addition to FANOVA for estimating associations between temporal patterns and covariate information.
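
    Functional PCA on weekly profiles can be approximated by ordinary PCA on the 7-day vectors: center the city-by-day matrix, eigendecompose its covariance, and use the leading component scores as the regression outcomes. A minimal sketch with hypothetical city profiles:

```python
# Functional PCA approximated by ordinary PCA on the 7-day load vectors:
# center the city-by-day matrix, eigendecompose its covariance, and project
# onto the leading components. The city profiles are hypothetical.
import numpy as np

def fpca_scores(profiles, n_components=2):
    X = np.asarray(profiles, dtype=float)
    Xc = X - X.mean(axis=0)                    # center each day across cities
    cov = np.cov(Xc, rowvar=False)             # day-by-day covariance
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]          # sort by descending variance
    components = eigvecs[:, order[:n_components]]
    scores = Xc @ components                   # per-city FPC scores
    explained = eigvals[order] / eigvals.sum()
    return scores, explained

profiles = [                   # four hypothetical cities, Monday-Sunday loads
    [5, 5, 5, 5, 6, 9, 8],     # strong weekend peak
    [4, 4, 4, 5, 5, 8, 7],
    [6, 6, 6, 6, 7, 7, 7],     # flatter profile
    [5, 5, 6, 6, 6, 6, 6],
]
scores, explained = fpca_scores(profiles)
print(scores.shape, round(float(explained[:2].sum()), 3))
```

The per-city scores would then serve as outcome variables in a regression on city and country characteristics, as in the study.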

  8. Surgical Treatment for Discogenic Low-Back Pain: Lumbar Arthroplasty Results in Superior Pain Reduction and Disability Level Improvement Compared With Lumbar Fusion

    PubMed Central

    2007-01-01

    Background The US Food and Drug Administration approved the Charité artificial disc on October 26, 2004. This approval was based on an extensive analysis and review process; 20 years of disc usage worldwide; and the results of a prospective, randomized, controlled clinical trial that compared lumbar artificial disc replacement to fusion. The results of the investigational device exemption (IDE) study led to a conclusion that clinical outcomes following lumbar arthroplasty were at least as good as outcomes from fusion. Methods The author performed a new analysis of the Visual Analog Scale pain scores and the Oswestry Disability Index scores from the Charité artificial disc IDE study and used a nonparametric statistical test, because observed data distributions were not normal. The analysis included all of the enrolled subjects in both the nonrandomized and randomized phases of the study. Results Subjects from both the treatment and control groups improved from the baseline situation (P < .001) at all follow-up times (6 weeks to 24 months). Additionally, these pain and disability levels with artificial disc replacement were superior (P < .05) to the fusion treatment at all follow-up times including 2 years. Conclusions The a priori statistical plan for an IDE study may not adequately address the final distribution of the data. Therefore, statistical analyses more appropriate to the distribution may be necessary to develop meaningful statistical conclusions from the study. A nonparametric statistical analysis of the Charité artificial disc IDE outcomes scores demonstrates superiority for lumbar arthroplasty versus fusion at all follow-up time points to 24 months. PMID:25802574
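
    A sketch of the nonparametric comparison the author describes, on synthetic skewed outcome scores (group sizes, scales and the exponential model are made up for illustration; the study's actual data are not reproduced here):

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(1)

# Hypothetical 24-month disability-style scores: right-skewed, so a
# t-test's normality assumption is doubtful and a rank-based test is
# the safer choice, as the author argues.
arthroplasty = rng.exponential(scale=12, size=200)   # lower = better
fusion = rng.exponential(scale=22, size=100)

# One-sided Mann-Whitney U: are arthroplasty scores stochastically lower?
stat, p = mannwhitneyu(arthroplasty, fusion, alternative="less")
print(f"Mann-Whitney U = {stat:.0f}, one-sided p = {p:.4g}")
```

    The point of the nonparametric test is that it compares ranks, so the conclusion does not depend on the (non-normal) shape of the score distributions.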

  9. Response of SiC{sub f}/Si{sub 3}N{sub 4} composites under static and cyclic loading -- An experimental and statistical analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahfuz, H.; Maniruzzaman, M.; Vaidya, U.

    1997-04-01

    Monotonic tensile and fatigue response of continuous silicon carbide fiber reinforced silicon nitride (SiC{sub f}/Si{sub 3}N{sub 4}) composites has been investigated. The monotonic tensile tests have been performed at room and elevated temperatures. Fatigue tests have been conducted at room temperature (RT), at a stress ratio R = 0.1 and a frequency of 5 Hz. It is observed during the monotonic tests that the composites retain only 30% of their room-temperature strength at 1,600 C, suggesting substantial chemical degradation of the matrix at that temperature. The softening of the matrix at elevated temperature also causes a reduction in tensile modulus, with a total reduction of around 45%. Fatigue data have been generated at three load levels and the fatigue strength of the composite has been found to be considerably high, at about 75% of the ultimate room-temperature strength. Extensive statistical analysis has been performed to understand the degree of scatter in the fatigue as well as in the static test data. Weibull shape factors and characteristic values have been determined for each set of tests and their relationship with the response of the composites is discussed. A statistical fatigue life prediction method developed from the Weibull distribution is also presented. A maximum likelihood estimator with censoring techniques and data pooling schemes has been employed to determine the distribution parameters for the statistical analysis. These parameters have been used to generate the S-N diagram with a desired level of reliability. Details of the statistical analysis and a discussion of the static and fatigue behavior of the composites are presented in this paper.
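
    A minimal sketch of the Weibull life analysis the abstract describes, on synthetic uncensored fatigue lives (shape and scale below are invented; the paper additionally handles censoring and data pooling, which this sketch omits):

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(2)

# Hypothetical fatigue lives (cycles to failure) at one load level,
# drawn from a Weibull distribution with shape 2 and scale 1e5 cycles.
lives = weibull_min.rvs(2.0, scale=1e5, size=200, random_state=rng)

# Maximum-likelihood fit with the location fixed at zero, as is usual
# for life data; returns (shape, loc, scale).
shape, loc, scale = weibull_min.fit(lives, floc=0)

# Life at 90% reliability: the 10th percentile of the fitted model,
# one point on an S-N curve drawn at a desired reliability level.
b10 = weibull_min.ppf(0.10, shape, scale=scale)
print(f"shape={shape:.2f}  scale={scale:.0f}  B10 life={b10:.0f} cycles")
```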

  10. A Study of Women Engineering Students and Time to Completion of First-Year Required Courses at Texas A&M University

    NASA Astrophysics Data System (ADS)

    Kimball, Jorja; Cole, Bryan; Hobson, Margaret; Watson, Karan; Stanley, Christine

    This paper reports findings on gender that were part of a larger study reviewing time to completion of course work that includes the first two semesters of calculus, chemistry, and physics, which are often considered the stumbling points or "barrier courses" to an engineering baccalaureate degree. Texas A&M University terms these courses core body of knowledge (CBK), and statistical analysis was conducted on two cohorts of first-year enrolling engineering students at the institution. Findings indicate that gender is statistically significantly related to completion of CBK with female engineering students completing required courses faster than males at the .01 level (p = 0.008). Statistical significance for gender and ethnicity was found between white male and white female students at the .01 level (p = 0.008). Descriptive analysis indicated that of the five majors studied (chemical, civil, computer, electrical, and mechanical engineering), women completed CBK faster than men, and African American and Hispanic women completed CBK faster than males of the same ethnicity.

  11. A Flexible Approach for the Statistical Visualization of Ensemble Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Potter, K.; Wilson, A.; Bremer, P.

    2009-09-29

    Scientists are increasingly moving towards ensemble data sets to explore relationships present in dynamic systems. Ensemble data sets combine spatio-temporal simulation results generated using multiple numerical models, sampled input conditions and perturbed parameters. While ensemble data sets are a powerful tool for mitigating uncertainty, they pose significant visualization and analysis challenges due to their complexity. We present a collection of overview and statistical displays linked through a high level of interactivity to provide a framework for gaining key scientific insight into the distribution of the simulation results as well as the uncertainty associated with the data. In contrast to methods that present large amounts of diverse information in a single display, we argue that combining multiple linked statistical displays yields a clearer presentation of the data and facilitates a greater level of visual data analysis. We demonstrate this approach using driving problems from climate modeling and meteorology and discuss generalizations to other fields.

  12. Statistical analysis of fNIRS data: a comprehensive review.

    PubMed

    Tak, Sungho; Ye, Jong Chul

    2014-01-15

    Functional near-infrared spectroscopy (fNIRS) is a non-invasive method to measure brain activities using the changes of optical absorption in the brain through the intact skull. fNIRS has many advantages over other neuroimaging modalities such as positron emission tomography (PET), functional magnetic resonance imaging (fMRI), or magnetoencephalography (MEG), since it can directly measure blood oxygenation level changes related to neural activation with high temporal resolution. However, fNIRS signals are highly corrupted by measurement noises and physiology-based systemic interference. Careful statistical analyses are therefore required to extract neuronal activity-related signals from fNIRS data. In this paper, we provide an extensive review of historical developments of statistical analyses of fNIRS signal, which include motion artifact correction, short source-detector separation correction, principal component analysis (PCA)/independent component analysis (ICA), false discovery rate (FDR), serially-correlated errors, as well as inference techniques such as the standard t-test, F-test, analysis of variance (ANOVA), and statistical parameter mapping (SPM) framework. In addition, to provide a unified view of various existing inference techniques, we explain a linear mixed effect model with restricted maximum likelihood (ReML) variance estimation, and show that most of the existing inference methods for fNIRS analysis can be derived as special cases. Some of the open issues in statistical analysis are also described. Copyright © 2013 Elsevier Inc. All rights reserved.
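
    The review covers GLM-based inference for fNIRS; the simplest special case (ordinary least squares plus a t-test on the task regressor) can be sketched on a synthetic single-channel time series. The block design, effect size and noise model below are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(3)

# One fNIRS channel, 300 time points, 30-sample on/off block design.
n = 300
task = (np.arange(n) % 60) < 30
X = np.column_stack([np.ones(n), task.astype(float)])  # intercept + task
y = 0.5 * task + rng.normal(0, 1.0, n)                 # true effect = 0.5

# OLS fit and a t-statistic for the task coefficient.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
dof = n - X.shape[1]
sigma2 = resid @ resid / dof
cov = sigma2 * np.linalg.inv(X.T @ X)
t_task = beta[1] / np.sqrt(cov[1, 1])
print(f"task beta = {beta[1]:.3f}, t({dof}) = {t_task:.2f}")
```

    The review's point is that this OLS t-test, ANOVA, and SPM-style inference can all be derived as special cases of a linear mixed effect model with ReML variance estimation; real fNIRS analysis must additionally handle serially-correlated errors and systemic interference, which this sketch ignores.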

  13. The Ontology of Biological and Clinical Statistics (OBCS) for standardized and reproducible statistical analysis.

    PubMed

    Zheng, Jie; Harris, Marcelline R; Masci, Anna Maria; Lin, Yu; Hero, Alfred; Smith, Barry; He, Yongqun

    2016-09-14

    Statistics play a critical role in biological and clinical research. However, most reports of scientific results in the published literature make it difficult for the reader to reproduce the statistical analyses performed in achieving those results because they provide inadequate documentation of the statistical tests and algorithms applied. The Ontology of Biological and Clinical Statistics (OBCS) is put forward here as a step towards solving this problem. The terms in OBCS, including 'data collection', 'data transformation in statistics', 'data visualization', 'statistical data analysis', and 'drawing a conclusion based on data', cover the major types of statistical processes used in basic biological research and clinical outcome studies. OBCS is aligned with the Basic Formal Ontology (BFO) and extends the Ontology of Biomedical Investigations (OBI), an OBO (Open Biological and Biomedical Ontologies) Foundry ontology supported by over 20 research communities. OBCS currently comprises 878 terms, representing 20 BFO classes, 403 OBI classes, 229 OBCS-specific classes, and 122 classes imported from ten other OBO ontologies. We discuss two examples illustrating how the ontology is being applied. In the first (biological) use case, we describe how OBCS was applied to represent the high throughput microarray data analysis of immunological transcriptional profiles in human subjects vaccinated with an influenza vaccine. In the second (clinical outcomes) use case, we applied OBCS to represent the processing of electronic health care data to determine the associations between hospital staffing levels and patient mortality. Our case studies were designed to show how OBCS can be used for the consistent representation of statistical analysis pipelines under two different research paradigms. Other ongoing projects using OBCS for statistical data processing are also discussed. The OBCS source code and documentation are available at: https://github.com/obcs/obcs .
The Ontology of Biological and Clinical Statistics (OBCS) is a community-based open source ontology in the domain of biological and clinical statistics. OBCS is a timely ontology that represents statistics-related terms and their relations in a rigorous fashion, facilitates standard data analysis and integration, and supports reproducible biological and clinical research.

  14. A Novel Bit-level Image Encryption Method Based on Chaotic Map and Dynamic Grouping

    NASA Astrophysics Data System (ADS)

    Zhang, Guo-Ji; Shen, Yan

    2012-10-01

    In this paper, a novel bit-level image encryption method based on dynamic grouping is proposed. In the proposed method, the plain-image is divided into several groups randomly, then permutation-diffusion process on bit level is carried out. The keystream generated by logistic map is related to the plain-image, which confuses the relationship between the plain-image and the cipher-image. The computer simulation results of statistical analysis, information entropy analysis and sensitivity analysis show that the proposed encryption method is secure and reliable enough to be used for communication application.
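
    A toy sketch of the scheme's two ingredients, a logistic-map keystream made dependent on the plain-image, XORed with the image bytes. All parameters here (x0, r, the mixing rule) are chosen for illustration and are not the paper's actual algorithm, which also performs random grouping and bit-level permutation:

```python
# Logistic-map keystream: x_{n+1} = r * x_n * (1 - x_n), r near 4 is chaotic.
def logistic_keystream(x0, r, nbytes):
    x = x0
    out = bytearray()
    for _ in range(nbytes):
        x = r * x * (1 - x)
        out.append(int(x * 256) % 256)
    return bytes(out)

def encrypt(img_bytes, x0=0.3141, r=3.99):
    # Tie the seed to the plain-image so identical keys give different
    # ciphertexts for different images (the paper's plain-image-dependent
    # keystream, in spirit).
    x0 = (x0 + sum(img_bytes) / (256.0 * len(img_bytes) + 1)) % 1.0
    x0 = min(max(x0, 0.01), 0.99)
    ks = logistic_keystream(x0, r, len(img_bytes))
    return bytes(a ^ b for a, b in zip(img_bytes, ks)), x0

plain = bytes(range(64))                  # stand-in for image data
cipher, x0_used = encrypt(plain)

# Decryption is the same XOR with the same keystream.
ks = logistic_keystream(x0_used, 3.99, len(plain))
recovered = bytes(a ^ b for a, b in zip(cipher, ks))
```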

  15. An empirical analysis of the distribution of the duration of overshoots in a stationary gaussian stochastic process

    NASA Technical Reports Server (NTRS)

    Parrish, R. S.; Carter, M. C.

    1974-01-01

    This analysis utilizes computer simulation and statistical estimation. Realizations of stationary Gaussian stochastic processes with selected autocorrelation functions were computer simulated. Analysis of the simulated data revealed that the mean and the variance of a process were functionally dependent upon the autocorrelation parameter and the crossing level. Using predicted values for the mean and standard deviation, the distribution parameters were estimated by the method of moments. Thus, given the autocorrelation parameter, crossing level, mean, and standard deviation of a process, the probability of exceeding the crossing level for a particular length of time was calculated.
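
    A sketch of the simulation step: a stationary Gaussian AR(1) process (one simple choice of autocorrelation function, not necessarily the paper's) and the empirical durations of overshoots above a crossing level:

```python
import numpy as np

rng = np.random.default_rng(4)

# Stationary Gaussian AR(1): autocorrelation phi per step, unit variance.
phi, n, level = 0.9, 200_000, 1.0
eps = rng.normal(0, np.sqrt(1 - phi**2), n)   # keeps variance at 1
x = np.empty(n)
x[0] = 0.0                                    # start below the level
for t in range(1, n):
    x[t] = phi * x[t - 1] + eps[t]

# Overshoot durations = run lengths of consecutive samples above level.
above = x > level
edges = np.diff(above.astype(int))
starts = np.where(edges == 1)[0] + 1          # upcrossings
ends = np.where(edges == -1)[0] + 1           # downcrossings
durations = ends - starts[:len(ends)]
print(f"{len(durations)} overshoots, mean duration "
      f"{durations.mean():.2f} steps")
```

    Repeating this over a grid of autocorrelation parameters and crossing levels gives the empirical duration distributions whose parameters the paper then models by the method of moments.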

  16. Selenium Speciation in the Fountain Creek Watershed (Colorado, USA) Correlates with Water Hardness, Ca and Mg Levels.

    PubMed

    Carsella, James S; Sánchez-Lombardo, Irma; Bonetti, Sandra J; Crans, Debbie C

    2017-04-30

    The environmental levels of selenium (Se) are regulated and strictly enforced by the Environmental Protection Agency (EPA) because of the toxicity that Se can exert at high levels. However, speciation plays an important role in the overall toxicity of Se, and only when speciation analysis has been conducted will a detailed understanding of the system be possible. In the following, we carried out the speciation analysis of the creek waters in three of the main tributaries-Upper Fountain Creek, Monument Creek and Lower Fountain Creek-located in the Fountain Creek Watershed (Colorado, USA). There are statistically significant differences between the Se, Ca and Mg levels in each of the tributaries, and seasonal swings in Se, Ca and Mg levels have been observed. There are also statistically significant differences between the Se levels when grouped by Pierre Shale type. These factors are considered when determining the forms of Se present and analyzing their chemistry using the reported thermodynamic relationships considering Ca²⁺, Mg²⁺, SeO₄²⁻, SeO₃²⁻ and carbonates. This analysis demonstrated that the correlation between Se and water hardness can be explained in terms of formation of soluble CaSeO₄. The speciation analysis demonstrated that for the Fountain Creek waters, the Ca²⁺ ion may be mainly responsible for the observed correlation with the Se level. Considering that the Mg²⁺ level also correlates linearly with the Se levels, it is important to recognize that without Mg²⁺ the Ca²⁺ level would be significantly reduced. The major role of Mg²⁺ is thus to raise the Ca²⁺ levels despite the equilibria with carbonate and other anions that would otherwise decrease Ca²⁺ levels.

  17. An elevated level of physical activity is associated with normal lipoprotein(a) levels in individuals from Maracaibo, Venezuela.

    PubMed

    Bermúdez, Valmore; Aparicio, Daniel; Rojas, Edward; Peñaranda, Lianny; Finol, Freddy; Acosta, Luis; Mengual, Edgardo; Rojas, Joselyn; Arráiz, Nailet; Toledo, Alexandra; Colmenares, Carlos; Urribarí, Jesica; Sanchez, Wireynis; Pineda, Carlos; Rodriguez, Dalia; Faria, Judith; Añez, Roberto; Cano, Raquel; Cano, Clímaco; Sorell, Luis; Velasco, Manuel

    2010-01-01

    Coronary artery disease is the main cause of death worldwide. Lipoprotein(a) [Lp(a)] is an independent risk factor for coronary artery disease whose concentrations are genetically regulated. Contradictory results have been published about the influence of physical activity on Lp(a) concentration. This research aimed to determine associations between different physical activity levels and Lp(a) concentration. A descriptive and cross-sectional study was made in 1340 randomly selected subjects (males = 598; females = 712), for whom a complete clinical history, the International Physical Activity Questionnaire, and an Lp(a) level determination were obtained. Statistical analysis was carried out to assess relationships between qualitative variables by χ² test and differences between means by one-way analysis of variance, considering a P value <0.05 as statistically significant. Results are shown as absolute frequencies, percentages, and mean ± standard deviation, as appropriate. Physical activity levels were ordinally classified as follows: low activity with 24.3% (n = 318), moderate activity with 35.0% (n = 458), and high physical activity with 40.8% (n = 534). Lp(a) concentration in the studied sample was 26.28 ± 12.64 (CI: 25.59-26.96) mg/dL. Lp(a) concentrations for the low, moderate, and high physical activity levels were 29.22 ± 13.74, 26.27 ± 12.91, and 24.53 ± 11.35 mg/dL, respectively, with statistically significant differences between the low and moderate levels (P = 0.004) and the low and high levels (P < 0.001). A strong association (χ² = 9.771; P = 0.002) was observed between a high physical activity level and a normal concentration of Lp(a) (less than 30 mg/dL). A lifestyle characterized by high physical activity is associated with normal Lp(a) levels.
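
    The two tests named in the abstract can be sketched on synthetic data that only mimics the reported group sizes, means and SDs (the raw study data are not available, so the groups below are normal draws, an assumption made purely for illustration):

```python
import numpy as np
from scipy.stats import chi2_contingency, f_oneway

rng = np.random.default_rng(5)

# Synthetic Lp(a) values matching the reported summary statistics.
low = rng.normal(29.2, 13.7, 318)
mod = rng.normal(26.3, 12.9, 458)
high = rng.normal(24.5, 11.4, 534)

# One-way ANOVA of Lp(a) concentration across activity groups.
f_stat, p_anova = f_oneway(low, mod, high)

# Chi-squared test of association: activity level vs normal (<30 mg/dL)
# versus elevated Lp(a).
table = np.array([[np.sum(g < 30), np.sum(g >= 30)] for g in (low, mod, high)])
chi2, p_chi2, dof, _ = chi2_contingency(table)
print(f"ANOVA p = {p_anova:.2g}, chi-squared({dof}) p = {p_chi2:.2g}")
```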

  18. North Atlantic Coast Comprehensive Study Phase I: Statistical Analysis of Historical Extreme Water Levels with Sea Level Change

    DTIC Science & Technology

    2014-09-01

    Abstract: The U.S. North Atlantic coast is subject to coastal flooding as a result of both severe extratropical storms (e.g., Nor’easters... Products and Services, excluding any kind of high-resolution hydrodynamic modeling. Tropical and extratropical storms were treated as a single... joint probability analysis and high-fidelity modeling of tropical and extratropical storms

  19. OSPAR standard method and software for statistical analysis of beach litter data.

    PubMed

    Schulz, Marcus; van Loon, Willem; Fleet, David M; Baggelaar, Paul; van der Meulen, Eit

    2017-09-15

    The aim of this study is to develop standard statistical methods and software for the analysis of beach litter data. The optimal ensemble of statistical methods comprises the Mann-Kendall trend test, the Theil-Sen slope estimation, the Wilcoxon step trend test and basic descriptive statistics. The application of Litter Analyst, a tailor-made software for analysing the results of beach litter surveys, to OSPAR beach litter data from seven beaches bordering on the south-eastern North Sea, revealed 23 significant trends in the abundances of beach litter types for the period 2009-2014. Litter Analyst revealed a large variation in the abundance of litter types between beaches. To reduce the effects of spatial variation, trend analysis of beach litter data can most effectively be performed at the beach or national level. Spatial aggregation of beach litter data within a region is possible, but resulted in a considerable reduction in the number of significant trends. Copyright © 2017 Elsevier Ltd. All rights reserved.
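
    Two of the four methods in the recommended ensemble can be sketched on made-up survey counts (the Mann-Kendall test is computed here as Kendall's tau of the counts against time, which yields the same test statistic; the counts and trend below are invented):

```python
import numpy as np
from scipy.stats import kendalltau, theilslopes

rng = np.random.default_rng(6)

# Hypothetical litter counts from 24 consecutive surveys on one beach,
# with a downward trend of about 2 items per survey plus noise.
t = np.arange(24)
counts = 120 - 2.0 * t + rng.normal(0, 5, t.size)

tau, p = kendalltau(t, counts)              # Mann-Kendall-style trend test
slope, intercept, lo, hi = theilslopes(counts, t)
print(f"tau = {tau:.2f}, p = {p:.3g}, "
      f"Theil-Sen slope = {slope:.2f} items/survey [{lo:.2f}, {hi:.2f}]")
```

    The Theil-Sen slope (the median of all pairwise slopes) gives a trend magnitude that, like the Mann-Kendall test, is robust to the outliers common in litter counts.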

  20. Statistical Analysis of CFD Solutions from the Fourth AIAA Drag Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Morrison, Joseph H.

    2010-01-01

    A graphical framework is used for statistical analysis of the results from an extensive N-version test of a collection of Reynolds-averaged Navier-Stokes computational fluid dynamics codes. The solutions were obtained by code developers and users from the U.S., Europe, Asia, and Russia using a variety of grid systems and turbulence models for the June 2009 4th Drag Prediction Workshop sponsored by the AIAA Applied Aerodynamics Technical Committee. The aerodynamic configuration for this workshop was a new subsonic transport model, the Common Research Model, designed using a modern approach for the wing and included a horizontal tail. The fourth workshop focused on the prediction of both absolute and incremental drag levels for wing-body and wing-body-horizontal tail configurations. This work continues the statistical analysis begun in the earlier workshops and compares the results from the grid convergence study of the most recent workshop with earlier workshops using the statistical framework.

  1. Study on the relationship between the methylation of the MMP-9 gene promoter region and diabetic nephropathy.

    PubMed

    Yang, Xiao-Hui; Feng, Shi-Ya; Yu, Yang; Liang, Zhou

    2018-01-01

    This study aims to explore the relationship between the methylation of matrix metalloproteinase (MMP)-9 gene promoter region and diabetic nephropathy (DN) through the detection of the methylation level of MMP-9 gene promoter region in the peripheral blood of patients with DN in different periods and serum MMP-9 concentration. The methylation level of the MMP-9 gene promoter region was detected by methylation-specific polymerase chain reaction (MSP), and the content of MMP-9 in serum was determined by enzyme-linked immunosorbent assay (ELISA). Results of the statistical analysis revealed that serum MMP-9 protein expression levels gradually increased in patients in the simple diabetic group, early diabetic nephropathy group and clinical diabetic nephropathy group, compared with the control group; and the difference was statistically significant (P < 0.05). Compared with the control group, the methylation levels of MMP-9 gene promoter regions gradually decreased in patients in the simple diabetic group, early diabetic nephropathy group, and clinical diabetic nephropathy group; and the difference was statistically significant (P < 0.05). Furthermore, correlation analysis results indicated that the demethylation levels of the MMP-9 gene promoter region was positively correlated with serum protein levels, urinary albumin to creatinine ratio (UACR), urea and creatinine; and was negatively correlated with GFR. The demethylation of the MMP-9 gene promoter region may be involved in the occurrence and development of diabetic nephropathy by regulating the expression of MMP-9 protein in serum.

  2. Graphical augmentations to the funnel plot assess the impact of additional evidence on a meta-analysis.

    PubMed

    Langan, Dean; Higgins, Julian P T; Gregory, Walter; Sutton, Alexander J

    2012-05-01

    We aim to illustrate the potential impact of a new study on a meta-analysis, which gives an indication of the robustness of the meta-analysis. A number of augmentations are proposed to one of the most widely used of graphical displays, the funnel plot. Namely, 1) statistical significance contours, which define regions of the funnel plot in which a new study would have to be located to change the statistical significance of the meta-analysis; and 2) heterogeneity contours, which show how a new study would affect the extent of heterogeneity in a given meta-analysis. Several other features are also described, and the use of multiple features simultaneously is considered. The statistical significance contours suggest that one additional study, no matter how large, may have a very limited impact on the statistical significance of a meta-analysis. The heterogeneity contours illustrate that one outlying study can increase the level of heterogeneity dramatically. The additional features of the funnel plot have applications including 1) informing sample size calculations for the design of future studies eligible for inclusion in the meta-analysis; and 2) informing the updating prioritization of a portfolio of meta-analyses such as those prepared by the Cochrane Collaboration. Copyright © 2012 Elsevier Inc. All rights reserved.
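
    The idea behind the statistical significance contours can be sketched with a fixed-effect pooled estimate: scan hypothetical new-study effects at a fixed precision and mark where the updated pooled z-statistic crosses 1.96. The study effects and variances below are invented for illustration (the paper draws these contours on the funnel plot's effect/precision axes):

```python
import numpy as np

# Hypothetical meta-analysis: four study effects (log odds ratios,
# say) with their variances.
effects = np.array([0.10, 0.25, -0.05, 0.18])
variances = np.array([0.04, 0.09, 0.06, 0.05])

def pooled_z(extra_effect=None, extra_var=None):
    """Fixed-effect (inverse-variance) pooled z, optionally with one
    additional study appended."""
    y, v = effects, variances
    if extra_effect is not None:
        y = np.append(y, extra_effect)
        v = np.append(v, extra_var)
    w = 1.0 / v
    est = np.sum(w * y) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return est / se

z0 = pooled_z()
# The contour is where |z| crosses 1.96 as the new study's effect varies.
grid = np.linspace(-1, 1, 201)
sig = [abs(pooled_z(g, 0.05)) > 1.96 for g in grid]
print(f"current z = {z0:.2f}; new study flips significance on "
      f"{np.mean(sig):.0%} of the scanned effect range")
```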

  3. Cognitive-Developmental Hierarchies: A Search for Structure Using Item-Level Data.

    ERIC Educational Resources Information Center

    Martinez, Michael E.; Simpson, R. Scott

    Item-level statistics from ability and achievement tests have been underutilized as sources of data for building models of cognitive development. How item data can be used to build a cognitive-developmental map of proportional reasoning is demonstrated. The product of the analysis is a cognitive hierarchy with levels corresponding to categories of…

  4. Knowledge and utilization of computer-software for statistics among Nigerian dentists.

    PubMed

    Chukwuneke, F N; Anyanechi, C E; Obiakor, A O; Amobi, O; Onyejiaka, N; Alamba, I

    2013-01-01

    The use of computer software for statistical analysis has transformed health information and data into their simplest form in the areas of access, storage, retrieval and analysis in the field of research. This survey was therefore carried out to assess the level of knowledge and utilization of computer software for statistical analysis among dental researchers in eastern Nigeria. Questionnaires on the use of computer software for statistical analysis were randomly distributed to 65 practicing dental surgeons with more than 5 years' experience in the tertiary academic hospitals in eastern Nigeria. The focus was on: years of clinical experience; research work experience; and knowledge and application of computer software for data processing and statistical analysis. Sixty-two (62/65; 95.4%) of these questionnaires were returned anonymously and used in our data analysis. Twenty-nine (29/62; 46.8%) respondents fall within those with 5-10 years of clinical experience, of whom none has completed the specialist training programme. Practitioners with more than 10 years of clinical experience numbered 33 (33/62; 53.2%), of whom 15 (15/33; 45.5%) are specialists, representing 24.2% (15/62) of the total number of respondents. All 15 specialists are actively involved in research activities, and only five (5/15; 33.3%) can use statistical software unaided. This study has identified poor utilization of computer software for statistical analysis among dental researchers in eastern Nigeria. This is strongly associated with a lack of early exposure to the use of such software, especially during undergraduate training. This calls for the introduction of a computer training programme into the dental curriculum to enable practitioners to develop the habit of using computer software in their research.

  5. On the implications of the classical ergodic theorems: analysis of developmental processes has to focus on intra-individual variation.

    PubMed

    Molenaar, Peter C M

    2008-01-01

    It is argued that general mathematical-statistical theorems imply that standard statistical analysis techniques of inter-individual variation are invalid to investigate developmental processes. Developmental processes have to be analyzed at the level of individual subjects, using time series data characterizing the patterns of intra-individual variation. It is shown that standard statistical techniques based on the analysis of inter-individual variation appear to be insensitive to the presence of arbitrary large degrees of inter-individual heterogeneity in the population. An important class of nonlinear epigenetic models of neural growth is described which can explain the occurrence of such heterogeneity in brain structures and behavior. Links with models of developmental instability are discussed. A simulation study based on a chaotic growth model illustrates the invalidity of standard analysis of inter-individual variation, whereas time series analysis of intra-individual variation is able to recover the true state of affairs. (c) 2007 Wiley Periodicals, Inc.
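
    A toy simulation of the paper's central point: when subjects are heterogeneous, the inter-individual (cross-sectional) association can even have the opposite sign from the intra-individual one. All covariances below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)

# Each subject has a fixed (x, y) offset; offsets are NEGATIVELY
# coupled across subjects, while within-subject fluctuations of x and
# y are POSITIVELY coupled.
n_subj, n_time = 200, 200
offsets = rng.multivariate_normal([0, 0], [[1, -0.8], [-0.8, 1]], n_subj)
within = rng.multivariate_normal([0, 0], [[0.2, 0.16], [0.16, 0.2]],
                                 (n_subj, n_time))
x = offsets[:, 0:1] + within[:, :, 0]     # shape (n_subj, n_time)
y = offsets[:, 1:2] + within[:, :, 1]

# Inter-individual analysis: correlate subject means across people.
r_between = np.corrcoef(x.mean(axis=1), y.mean(axis=1))[0, 1]
# Intra-individual analysis: average the within-subject correlations.
r_within = np.mean([np.corrcoef(x[i], y[i])[0, 1] for i in range(n_subj)])
print(f"between-subject r = {r_between:.2f}, within-subject r = {r_within:.2f}")
```

    The cross-sectional analysis reports a strong negative association even though, for every single subject, the two variables move together, which is the ergodicity failure the paper describes.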

  6. Collagen morphology and texture analysis: from statistics to classification

    PubMed Central

    Mostaço-Guidolin, Leila B.; Ko, Alex C.-T.; Wang, Fei; Xiang, Bo; Hewko, Mark; Tian, Ganghong; Major, Arkady; Shiomi, Masashi; Sowa, Michael G.

    2013-01-01

    In this study we present an image analysis methodology capable of quantifying morphological changes in tissue collagen fibril organization caused by pathological conditions. Texture analysis based on first-order statistics (FOS) and second-order statistics such as gray level co-occurrence matrix (GLCM) was explored to extract second-harmonic generation (SHG) image features that are associated with the structural and biochemical changes of tissue collagen networks. Based on these extracted quantitative parameters, multi-group classification of SHG images was performed. With combined FOS and GLCM texture values, we achieved reliable classification of SHG collagen images acquired from atherosclerosis arteries with >90% accuracy, sensitivity and specificity. The proposed methodology can be applied to a wide range of conditions involving collagen re-modeling, such as in skin disorders, different types of fibrosis and muscular-skeletal diseases affecting ligaments and cartilage. PMID:23846580
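
    A minimal numpy sketch of GLCM texture features, one pixel offset and two of the classic features; the paper's pipeline uses FOS as well and more offsets and features, so this is only a simplified illustration on synthetic textures:

```python
import numpy as np

rng = np.random.default_rng(8)

def glcm_features(img, levels=8):
    """Normalised horizontal-offset GLCM plus contrast and energy,
    for an image with intensities in [0, 1)."""
    q = (img * levels).clip(0, levels - 1).astype(int)   # quantise
    m = np.zeros((levels, levels))
    # Count co-occurrences of (pixel, right neighbour) gray levels.
    np.add.at(m, (q[:, :-1].ravel(), q[:, 1:].ravel()), 1)
    p = m / m.sum()
    i, j = np.indices(p.shape)
    contrast = np.sum(p * (i - j) ** 2)   # high for rough textures
    energy = np.sum(p ** 2)               # high for uniform textures
    return contrast, energy

# Smooth gradient texture vs uncorrelated noisy texture (synthetic).
smooth = np.tile(np.linspace(0, 0.999, 64), (64, 1))
noisy = rng.random((64, 64))

c_s, e_s = glcm_features(smooth)
c_n, e_n = glcm_features(noisy)
print(f"smooth: contrast={c_s:.3f} energy={e_s:.3f}")
print(f"noisy:  contrast={c_n:.3f} energy={e_n:.3f}")
```

    Feature vectors of this kind, computed per SHG image, are what the classifier in the paper is trained on.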

  7. Saliva, Serum Levels of Interleukin-21, -33 and Prostaglandin E2 in Patients with Generalised Aggressive or Chronic Periodontitis.

    PubMed

    Gümüş, Pınar; Nizam, Nejat; Nalbantsoy, Ayşe; Özçaka, Özgün; Buduneli, Nurcan

    This cross-sectional study aims to evaluate saliva, serum levels of interleukin-21 (IL-21), IL-33, and prostaglandin E2 (PGE2) in patients with generalised chronic periodontitis or aggressive periodontitis. Before initiation of any periodontal treatment, saliva and serum samples were collected and clinical periodontal measurements were recorded from 94 participants (25 aggressive periodontitis patients, 25 chronic periodontitis patients, 44 periodontally healthy individuals). IL-21, IL-33 and PGE2 levels in serum and saliva samples were determined by ELISA. Data were tested statistically using Kruskal-Wallis, Mann-Whitney U-, and Spearman-rho rank tests. Saliva IL-33 levels were statistically significantly higher in the chronic than the aggressive group (p < 0.05). Serum IL-33, saliva and serum IL-21 and PGE2 levels were similar in the two periodontitis groups. Saliva IL-33 levels correlated with age in the chronic periodontitis group (p < 0.05). Statistically significant positive correlations were found between serum, saliva PGE2 levels and plaque index (p < 0.05). IL-33 and IL-21 levels in serum samples positively correlated in the periodontitis groups (p < 0.05). IL-21 and PGE2 analysis did not exhibit discriminating data between generalised chronic and aggressive periodontitis, but the present findings support the role of these cytokines in periodontitis. Statistically significantly higher saliva IL-33 levels in the chronic periodontitis group warrant further research.

  8. Statistical Significance and Baseline Monitoring.

    DTIC Science & Technology

    1984-07-01

    Observed versus nominal α levels for multivariate tests of data sets (50 runs of 4 groups each)... cumulative proportion of the observations found for each nominal level. The results of the comparisons of the observed versus nominal α levels for the... α values are always higher than nominal levels. Virtually all nominal α levels are below 0.20. In other words, the discriminant analysis models

  9. Temperature rise, sea level rise and increased radiative forcing - an application of cointegration methods

    NASA Astrophysics Data System (ADS)

    Schmith, Torben; Thejll, Peter; Johansen, Søren

    2016-04-01

    We analyse the statistical relationship between changes in global temperature, global steric sea level and radiative forcing in order to reveal causal relationships. There are, however, potential pitfalls here due to the trending nature of the time series. We therefore apply a statistical method called cointegration analysis, originating from the field of econometrics, which can correctly handle the analysis of series with trends and other long-range dependencies. We find a relationship between steric sea level and temperature, with temperature causally depending on the steric sea level, which can be understood as a consequence of the large heat capacity of the ocean. This result is obtained both when analyzing observed data and data from a CMIP5 historical model run. We also find that, in the data from the historical run, the steric sea level is in turn driven by the external forcing. Finally, we demonstrate that combining these two results can lead to a novel estimate of radiative forcing back in time based on observations.
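
    An Engle-Granger-style two-step sketch of cointegration on synthetic series sharing a common stochastic trend. This is deliberately simplified: a crude AR(1) check on the residual stands in for a formal unit-root test with proper critical values, and the series and coefficients are invented:

```python
import numpy as np

rng = np.random.default_rng(9)

# Two trending series driven by the same random walk ("common trend").
n = 2000
trend = np.cumsum(rng.normal(0, 1, n))          # shared stochastic trend
temp = 0.6 * trend + rng.normal(0, 1, n)        # stand-in "temperature"
steric = 1.0 * trend + rng.normal(0, 1, n)      # stand-in "steric sea level"

# Step 1: cointegrating regression, temp on steric.
A = np.column_stack([np.ones(n), steric])
beta, *_ = np.linalg.lstsq(A, temp, rcond=None)
resid = temp - A @ beta

# Step 2 (crude stationarity check): the lag-1 autoregressive
# coefficient of the residual is far below 1 when the series are
# cointegrated, but near 1 for the raw trending series themselves.
def ar1(z):
    return np.polyfit(z[:-1], z[1:], 1)[0]

print(f"AR(1) of residual: {ar1(resid):.3f}, of raw series: {ar1(temp):.3f}")
```

    Regressing one trending series on another without this residual check invites spurious regression, which is exactly the pitfall cointegration analysis is designed to avoid.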

  10. Research of epidermal cellular vegetal cycle of intravascular low level laser irradiation in treatment of psoriasis

    NASA Astrophysics Data System (ADS)

    Zhu, Jing; Bao, Xiaoqing; Zhang, Mei-Jue

    2005-07-01

    Objective: To study the epidermal cell cycle and the difference in DNA content before and after intravascular low-level laser irradiation treatment of psoriasis. Method: 15 patients suffering from psoriasis were treated by intravascular low-level laser irradiation (output power: 4-5 mW, 1 hour per day; one course of treatment is 10 days). We measured the DNA content of epidermal cells before and after treatment and in 8 healthy controls. The percentage of each phase of the cell cycle was then calculated and analyzed statistically. Results: The mean value of the G1/S phase decreased markedly while that of the G2+M phase increased markedly (t-test, P < 0.05); the difference between pre- and post-treatment values was statistically significant. Conclusions: Judging from the epidermal cell cycle and the pre- versus post-treatment difference in DNA content, intravascular low-level laser irradiation (ILLLI) is effective in the treatment of psoriasis.

  11. Probabilistic/Fracture-Mechanics Model For Service Life

    NASA Technical Reports Server (NTRS)

    Watkins, T., Jr.; Annis, C. G., Jr.

    1991-01-01

    Computer program makes probabilistic estimates of lifetime of engine and components thereof. Developed to fill need for more accurate life-assessment technique that avoids errors in estimated lives and provides for statistical assessment of levels of risk created by engineering decisions in designing system. Implements mathematical model combining techniques of statistics, fatigue, fracture mechanics, nondestructive analysis, life-cycle cost analysis, and management of engine parts. Used to investigate effects of such engine-component life-controlling parameters as return-to-service intervals, stresses, capabilities for nondestructive evaluation, and qualities of materials.

  12. Multi-scale statistical analysis of coronal solar activity

    DOE PAGES

    Gamborino, Diana; del-Castillo-Negrete, Diego; Martinell, Julio J.

    2016-07-08

    Multi-filter images of the solar corona are used to obtain temperature maps that are analyzed using techniques based on proper orthogonal decomposition (POD) in order to extract dynamical and structural information at various scales. Exploring active regions before and after a solar flare and comparing them with quiet regions, we show that the multi-scale behavior presents distinct statistical properties for each case that can be used to characterize the level of activity in a region. Information about the nature of heat transport can also be extracted from the analysis.
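
    POD of a snapshot sequence reduces to an SVD of the mean-subtracted snapshot matrix, with the squared singular values ranking the "energy" captured at each scale; the "temperature" snapshots below are synthetic illustrations, not coronal data.

```python
import numpy as np

rng = np.random.default_rng(5)
nt, nx = 40, 64   # snapshots x spatial points (flattened maps, illustrative)

# Synthetic data: two coherent structures plus weak noise
xgrid = np.linspace(0, 2 * np.pi, nx)
snapshots = (np.outer(np.sin(0.3 * np.arange(nt)), np.sin(xgrid))
             + 0.5 * np.outer(np.cos(0.7 * np.arange(nt)), np.sin(2 * xgrid))
             + 0.05 * rng.normal(size=(nt, nx)))

# POD = SVD of the mean-subtracted snapshot matrix: rows of vt are the
# spatial modes, u columns the temporal coefficients, s**2 the mode energy.
mean = snapshots.mean(axis=0)
u, s, vt = np.linalg.svd(snapshots - mean, full_matrices=False)
energy = s**2 / np.sum(s**2)
print(energy[:2].sum() > 0.9)   # two modes dominate this synthetic field
```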

  13. How Miniature/Microminiature (2M) Repair Capabilities Can Reduce the Impact of No Evidence of Failure (NEOF) Among Repairables on the Navy’s Operations and Maintenance Account

    DTIC Science & Technology

    1988-06-01

    …and PCBs. The pilot program involved screening, testing, and repairing of EMs/PCBs for both COMNAVSEASYSCOM and Commander, Naval Electronic Systems Command…were chosen from the Support and Test Equipment Engineering Program (STEEP) tests performed by SIMA San Diego during 1987. A statistical analysis and a Level…

  14. Fuzzy interval Finite Element/Statistical Energy Analysis for mid-frequency analysis of built-up systems with mixed fuzzy and interval parameters

    NASA Astrophysics Data System (ADS)

    Yin, Hui; Yu, Dejie; Yin, Shengwen; Xia, Baizhan

    2016-10-01

    This paper introduces mixed fuzzy and interval parametric uncertainties into the FE components of the hybrid Finite Element/Statistical Energy Analysis (FE/SEA) model for mid-frequency analysis of built-up systems, yielding an uncertain ensemble that combines non-parametric uncertainty with mixed fuzzy and interval parametric uncertainties. A fuzzy interval Finite Element/Statistical Energy Analysis (FIFE/SEA) framework is proposed to obtain the uncertain responses of built-up systems, which are described as intervals with fuzzy bounds, termed fuzzy-bounded intervals (FBIs) in this paper. Based on the level-cut technique, a first-order fuzzy interval perturbation FE/SEA (FFIPFE/SEA) and a second-order fuzzy interval perturbation FE/SEA (SFIPFE/SEA) method are developed to handle the mixed parametric uncertainties efficiently. FFIPFE/SEA approximates the response functions by a first-order Taylor series, while SFIPFE/SEA improves the accuracy by retaining second-order Taylor terms, with all mixed second-order terms neglected. To further improve the accuracy, a Chebyshev fuzzy interval method (CFIM) is proposed, in which Chebyshev polynomials are used to approximate the response functions. The FBIs are eventually reconstructed by assembling the extrema solutions at all cut levels. Numerical results on two built-up systems verify the effectiveness of the proposed methods.

  15. Statistical process control methods allow the analysis and improvement of anesthesia care.

    PubMed

    Fasting, Sigurd; Gisvold, Sven E

    2003-10-01

    Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels, according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%; mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
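
    The p-chart construction is straightforward from periodic counts: plot each period's event proportion against 3-sigma limits around the pooled proportion, with limits that widen for smaller subgroups. The monthly counts below are illustrative, not the study's data.

```python
import numpy as np

# Monthly anesthetic counts (n) and adverse-event counts (x); illustrative.
n = np.array([900, 1050, 980, 1100, 1020, 950])
x = np.array([160, 195, 170, 210, 185, 168])

p = x / n
p_bar = x.sum() / n.sum()            # pooled event proportion

# 3-sigma control limits for a p-chart; limits vary with subgroup size n_i
sigma = np.sqrt(p_bar * (1 - p_bar) / n)
ucl = p_bar + 3 * sigma
lcl = np.clip(p_bar - 3 * sigma, 0, None)

# Points outside the limits signal special-cause variation, i.e. an
# unstable process that merits investigation.
out_of_control = (p > ucl) | (p < lcl)
print(round(p_bar, 3), out_of_control.any())
```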

  16. Association of Glycemic Status with Bone Turnover Markers in Type 2 Diabetes Mellitus.

    PubMed

    Kulkarni, Sweta Vilas; Meenatchi, Suruthi; Reeta, R; Ramesh, Ramasamy; Srinivasan, A R; Lenin, C

    2017-01-01

    Type 2 diabetes mellitus has profound implications for the skeleton. Even though bone mineral density is increased in type 2 diabetes mellitus patients, they are more prone to fractures. The weakening of bone tissue in type 2 diabetes mellitus can be due to uncontrolled blood sugar levels leading to high levels of bone turnover markers in blood. The aim of this study is to find the association between glycemic status and bone turnover markers in type 2 diabetes mellitus. This case-control study was carried out in a tertiary health care hospital. Fifty clinically diagnosed type 2 diabetes mellitus patients between 30 and 50 years of age were included as cases. Fifty age- and gender-matched healthy nondiabetics were included as controls. Patients with complications and chronic illness were excluded from the study. Depending on glycated hemoglobin (HbA1c) levels, patients were grouped into uncontrolled (HbA1c >7%, n = 36) and controlled (HbA1c <7%, n = 14) diabetics. Based on duration of diabetes, patients were grouped into newly diagnosed, 1-2 years, 3-5 years, and >5 years. Serum osteocalcin (OC), bone alkaline phosphatase (BAP), acid phosphatase (ACP), and HbA1c levels were estimated, and the OC/BAP and OC/ACP ratios were calculated. Student's t-test, analysis of variance, and Chi-square tests were used for analysis. Receiver operating characteristic (ROC) curve analysis was done for the OC/BAP and OC/ACP ratios. Serum OC, HbA1c, and the OC/BAP ratio were increased in cases compared to controls, and the differences were statistically significant (P < 0.001). The OC/ACP ratio was decreased in type 2 diabetes mellitus and the difference was statistically significant (P = 0.01). In patients with >5 years' duration of diabetes, the HbA1c level was high and the difference was statistically significant (P < 0.042). BAP levels were high in uncontrolled diabetics but not statistically significantly so. The ROC curve showed the OC/BAP ratio to be a better marker than the OC/ACP ratio. Uncontrolled type 2 diabetes mellitus affects bone tissue, resulting in variations in bone turnover markers. Bone turnover markers are better at predicting recent changes in bone morphology and are cost effective.

  17. 'Chain pooling' model selection as developed for the statistical analysis of a rotor burst protection experiment

    NASA Technical Reports Server (NTRS)

    Holms, A. G.

    1977-01-01

    A statistical decision procedure called chain pooling had been developed for model selection in fitting the results of a two-level fixed-effects full or fractional factorial experiment not having replication. The basic strategy included the use of one nominal level of significance for a preliminary test and a second nominal level of significance for the final test. The subject has been reexamined from the point of view of using as many as three successive statistical model deletion procedures in fitting the results of a single experiment. The investigation consisted of random number studies intended to simulate the results of a proposed aircraft turbine-engine rotor-burst-protection experiment. As a conservative approach, population model coefficients were chosen to represent a saturated 2 to the 4th power experiment with a distribution of parameter values unfavorable to the decision procedures. Three model selection strategies were developed.

  18. A comprehensive review of arsenic levels in the semiconductor manufacturing industry.

    PubMed

    Park, Donguk; Yang, Haengsun; Jeong, Jeeyeon; Ha, Kwonchul; Choi, Sangjun; Kim, Chinyon; Yoon, Chungsik; Park, Dooyong; Paek, Domyung

    2010-11-01

    This paper presents a summary of arsenic level statistics from air and wipe samples taken from studies conducted in fabrication operations. The main objectives of this study were not only to describe arsenic measurement data but also, through a literature review, to categorize fabrication workers in accordance with observed arsenic levels. All airborne arsenic measurements reported were included in the summary statistics for analysis of the measurement data. The arithmetic mean was estimated assuming a lognormal distribution from the geometric mean and the geometric standard deviation or the range. In addition, weighted arithmetic means (WAMs) were calculated based on the number of measurements reported for each mean. Analysis of variance (ANOVA) was employed to compare arsenic levels classified according to several categories such as the year, sampling type, location sampled, operation type, and cleaning technique. Nine papers were found reporting airborne arsenic measurement data from maintenance workers or maintenance areas in semiconductor chip-making plants. A total of 40 statistical summaries from seven articles were identified that represented a total of 423 airborne arsenic measurements. Arsenic exposure levels taken during normal operating activities in implantation operations (WAM = 1.6 μg m⁻³, no. of samples = 77, no. of statistical summaries = 2) were found to be lower than exposure levels of engineers who were involved in maintenance works (7.7 μg m⁻³, no. of samples = 181, no. of statistical summaries = 19). The highest level (WAM = 218.6 μg m⁻³) was associated with various maintenance works performed inside an ion implantation chamber. ANOVA revealed no significant differences in the WAM arsenic levels among the categorizations based on operation and sampling characteristics. 
Arsenic levels (56.4 μg m⁻³) recorded during maintenance works performed in dry conditions were found to be much higher than those from maintenance works in wet conditions (0.6 μg m⁻³). Arsenic levels from wipe samples in process areas after maintenance activities ranged from non-detectable to 146 μg cm⁻², indicating the potential for dispersion into the air and hence inhalation. We conclude that workers who are regularly or occasionally involved in maintenance work have higher potential for occupational exposure than other employees who are in charge of routine production work. In addition, fabrication workers can be classified into two groups based on the reviewed arsenic exposure levels: operators with potential for low levels of exposure and maintenance engineers with high levels of exposure. These classifications could be used as a basis for a qualitative ordinal ranking of exposure in an epidemiological study.
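
    The summary-statistics machinery used in this review (recovering an arithmetic mean from a reported GM and GSD under a lognormal assumption, then weighting means by the number of measurements) can be sketched as follows; the numbers are illustrative, not the reviewed data, and `lognormal_am` is our own helper.

```python
import math

# For a lognormal distribution, the arithmetic mean follows from the
# geometric mean (GM) and geometric standard deviation (GSD):
#   AM = GM * exp(0.5 * ln(GSD)**2)
def lognormal_am(gm, gsd):
    return gm * math.exp(0.5 * math.log(gsd) ** 2)

# Weighted arithmetic mean (WAM) across study summaries, weighted by the
# number of measurements each summary represents (values illustrative).
summaries = [
    {"gm": 1.2, "gsd": 2.5, "n": 77},
    {"gm": 5.0, "gsd": 3.0, "n": 181},
]
ams = [lognormal_am(s["gm"], s["gsd"]) for s in summaries]
wam = (sum(a * s["n"] for a, s in zip(ams, summaries))
       / sum(s["n"] for s in summaries))
print(round(wam, 2))
```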

  19. Nonlinear histogram binning for quantitative analysis of lung tissue fibrosis in high-resolution CT data

    NASA Astrophysics Data System (ADS)

    Zavaletta, Vanessa A.; Bartholmai, Brian J.; Robb, Richard A.

    2007-03-01

    Diffuse lung diseases, such as idiopathic pulmonary fibrosis (IPF), can be characterized and quantified by analysis of volumetric high resolution CT scans of the lungs. These data sets typically have dimensions of 512 x 512 x 400. It is too subjective and labor intensive for a radiologist to analyze each slice and quantify regional abnormalities manually. Thus, computer aided techniques are necessary, particularly texture analysis techniques which classify various lung tissue types. Second and higher order statistics which relate the spatial variation of the intensity values are good discriminatory features for various textures. The intensity values in lung CT scans range between [-1024, 1024]. Calculation of second order statistics on this range is too computationally intensive so the data is typically binned between 16 or 32 gray levels. There are more effective ways of binning the gray level range to improve classification. An optimal and very efficient way to nonlinearly bin the histogram is to use a dynamic programming algorithm. The objective of this paper is to show that nonlinear binning using dynamic programming is computationally efficient and improves the discriminatory power of the second and higher order statistics for more accurate quantification of diffuse lung disease.
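
    One way to realize the dynamic-programming binning the authors describe is to choose bin boundaries over sorted intensities that minimize total within-bin squared error (Jenks-style natural breaks); this sketch is our reading of the approach, with synthetic data standing in for CT intensities.

```python
import numpy as np

def optimal_bins(values, k):
    """Partition sorted 1-D data into k contiguous bins minimizing total
    within-bin sum of squared deviations, via dynamic programming
    (O(k * n^2) using prefix sums for the per-bin cost)."""
    x = np.sort(np.asarray(values, dtype=float))
    n = len(x)
    pre = np.concatenate([[0.0], np.cumsum(x)])
    pre2 = np.concatenate([[0.0], np.cumsum(x * x)])

    def sse(i, j):  # cost of one bin covering x[i:j]
        s, s2, m = pre[j] - pre[i], pre2[j] - pre2[i], j - i
        return s2 - s * s / m

    INF = float("inf")
    cost = np.full((k + 1, n + 1), INF)
    split = np.zeros((k + 1, n + 1), dtype=int)
    cost[0, 0] = 0.0
    for b in range(1, k + 1):
        for j in range(b, n + 1):
            for i in range(b - 1, j):
                c = cost[b - 1, i] + sse(i, j)
                if c < cost[b, j]:
                    cost[b, j] = c
                    split[b, j] = i
    # Recover bin boundaries by walking the split table backwards
    bounds, j = [], n
    for b in range(k, 0, -1):
        bounds.append(j)
        j = split[b, j]
    return x, sorted(bounds)

rng = np.random.default_rng(2)
# Illustrative stand-in for a CT intensity sample: mixture of "tissue classes"
data = np.concatenate([rng.normal(-800, 60, 300), rng.normal(-400, 80, 200),
                       rng.normal(50, 100, 100)])
x, bounds = optimal_bins(data, 4)
print(len(bounds))
```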

  20. A global compilation of coral sea-level benchmarks: Implications and new challenges

    NASA Astrophysics Data System (ADS)

    Medina-Elizalde, Martín

    2013-01-01

    I present a quality-controlled compilation of sea-level data from U-Th dated corals, encompassing 30 studies of 13 locations around the world. The compilation contains relative sea level (RSL) data from each location based on both conventional and open-system U-Th ages. I have applied a commonly used age quality control criterion based on the initial 234U/238U activity ratios of corals in order to select reliable ages and to reconstruct sea level histories for the last 150,000 yr. This analysis reveals scatter of RSL estimates among coeval coral benchmarks both within individual locations and between locations, particularly during Marine Isotope Stage (MIS) 5a and the glacial inception following the last interglacial. The character of data scatter during these time intervals implies that uncertainties still exist regarding tectonics, glacio-isostacy, U-series dating, and/or coral position. To elucidate robust underlying patterns, with confidence limits, I performed a Monte Carlo-style statistical analysis of the compiled coral data considering appropriate age and sea-level uncertainties. By its nature, such an analysis tends to smooth or obscure millennial-scale (and finer) details that may be important in individual datasets, and to favour the major underlying patterns that are supported by all datasets. This statistical analysis thus serves to illustrate major trends that are statistically robust ('what we know'), trends that are suggested but as yet supported by few data ('what we might know, subject to addition of more supporting data and improved corrections'), and patterns or data that are clear outliers ('unlikely to be realistic given the rest of the global data and possibly needing further adjustments'). Prior to the last glacial maximum, and with the possible exception of the 130-120 ka period, available coral data generally have insufficient temporal resolution and unexplained scatter, which hinders identification of a well-defined pattern with usefully narrow confidence limits. This analysis thus provides a framework that objectively identifies critical targets for new data collection, improved corrections, and integration of coral data with independent, stratigraphically continuous methods of sea-level reconstruction.
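
    A Monte Carlo treatment of the kind described (perturbing each benchmark within its age and sea-level uncertainties, then summarizing the resampled histories on a common age grid) can be sketched as follows; the five benchmarks and their uncertainties are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical coral benchmarks: U-Th age (ka) and RSL (m), each with
# 1-sigma uncertainties; the numbers are illustrative only.
age = np.array([125.0, 122.0, 118.0, 110.0, 104.0])
rsl = np.array([  4.0,   2.0,  -8.0, -20.0, -18.0])
age_sd = np.full_like(age, 1.0)
rsl_sd = np.full_like(rsl, 2.5)

grid = np.linspace(100, 128, 57)          # 0.5 ka evaluation grid
sims = np.empty((2000, grid.size))
for s in range(sims.shape[0]):
    # Perturb every benchmark within its age and sea-level uncertainty,
    # then interpolate the realization onto the common age grid.
    a = age + rng.normal(0, age_sd)
    r = rsl + rng.normal(0, rsl_sd)
    order = np.argsort(a)
    sims[s] = np.interp(grid, a[order], r[order])

# Median history with a 95% envelope summarizes the robust pattern
median = np.percentile(sims, 50, axis=0)
lo, hi = np.percentile(sims, [2.5, 97.5], axis=0)
print(median.shape)
```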

  1. Analysis of ground-water data for selected wells near Holloman Air Force Base, New Mexico, 1950-95

    USGS Publications Warehouse

    Huff, G.F.

    1996-01-01

    Ground-water-level, ground-water-withdrawal, and ground-water-quality data were evaluated for trends. Holloman Air Force Base is located in the west-central part of Otero County, New Mexico. Ground-water-data analyses include assembly and inspection of U.S. Geological Survey and Holloman Air Force Base data, including ground-water-level data for public-supply and observation wells and withdrawal and water-quality data for public-supply wells in the area. Well Douglas 4 shows a statistically significant decreasing trend in water levels for 1972-86 and a statistically significant increasing trend in water levels for 1986-90. Water levels in wells San Andres 5 and San Andres 6 show statistically significant decreasing trends for 1972-93 and 1981-89, respectively. A mixture of statistically significant increasing trends, statistically significant decreasing trends, and lack of statistically significant trends over periods ranging from the early 1970's to the early 1990's is indicated for the Boles wells and wells near the Boles wells. Well Boles 5 shows a statistically significant increasing trend in water levels for 1981-90. Well Boles 5 and well 17S.09E.25.343 show no statistically significant trends in water levels for 1990-93 and 1988-93, respectively. For 1986-93, well Frenchy 1 shows a statistically significant decreasing trend in water levels. Ground-water withdrawal from the San Andres and Douglas wells regularly exceeded estimated ground-water recharge from San Andres Canyon for 1963-87. For 1951-57 and 1960-86, ground-water withdrawal from the Boles wells regularly exceeded total estimated ground-water recharge from Mule, Arrow, and Lead Canyons. Ground-water withdrawal from the San Andres and Douglas wells and from the Boles wells nearly equaled estimated ground-water recharge for 1989-93 and 1986-93, respectively. For 1987-93, ground-water withdrawal from the Escondido well regularly exceeded estimated ground-water recharge from Escondido Canyon, and ground-water withdrawal from the Frenchy wells regularly exceeded total estimated ground-water recharge from Dog and Deadman Canyons. Water-quality samples were collected from selected Douglas, San Andres, and Boles public-supply wells from December 1994 to February 1995. Concentrations of dissolved nitrate show the most consistent increases between current and historical data. Current concentrations of dissolved nitrate are greater than historical concentrations in 7 of 10 wells.
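
    A monotonic-trend screen of the kind applied to these water-level records is commonly done with the nonparametric Mann-Kendall test; the pure-Python sketch below (illustrative water levels, no tie correction) shows the idea and is not the USGS analysis itself.

```python
import math

def mann_kendall(y):
    """Mann-Kendall trend test (no tie correction): returns the S statistic
    and an approximate two-sided p-value via the normal approximation.
    A common choice for monotonic trends in water-level series."""
    n = len(y)
    s = sum((y[j] > y[i]) - (y[j] < y[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        z = (s - 1) / math.sqrt(var)
    elif s < 0:
        z = (s + 1) / math.sqrt(var)
    else:
        z = 0.0
    # Two-sided p-value from the standard normal CDF
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return s, p

# Illustrative declining water-level series (annual means, feet)
levels = [101.2, 100.8, 100.5, 100.6, 100.1, 99.7, 99.5, 99.0, 98.8, 98.4]
s, p = mann_kendall(levels)
print(s < 0, p < 0.05)   # negative S: decreasing trend
```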

  2. Analysis of the Level of Dysphagia, Anxiety, and Nutritional Status Before and After Speech Therapy in Patients with Stroke

    PubMed Central

    Drozdz, Daniela; Mancopes, Renata; Silva, Ana Maria Toniolo; Reppold, Caroline

    2014-01-01

    Introduction: Evidence-based rehabilitation in oropharyngeal dysphagia implies a relationship between the interventions and their results. Objective: To analyze the level of dysphagia, oral intake, anxiety levels, and nutritional status of patients diagnosed with stroke, before and after speech therapy. Method: Clinical assessment of dysphagia partially using the Protocol of Risk Assessment for Dysphagia (PARD), applying the Functional Oral Intake Scale for Dysphagia in Stroke Patients (FOIS), the Beck Anxiety Inventory (BAI), and the Mini Nutritional Assessment (MNA®). The sample consisted of 12 patients, mean age 64.6 years, with a medical diagnosis of hemorrhagic or ischemic stroke and without cognitive disorders. All tests were applied before and after speech therapy (15 sessions). Statistical analysis was performed using the chi-square or Fisher's exact test, McNemar's test, Bowker's symmetry test, and Wilcoxon's test. Results: In the pre-speech therapy assessments, 33.3% of patients had mild to moderate dysphagia, 88.2% did not receive food orally, 47.1% showed malnutrition, and 35.3% had a mild anxiety level. After the therapy sessions, 33.3% of patients had mild dysphagia, 16.7% were malnourished, and 50% had a minimal level of anxiety. Conclusion: There was statistically significant improvement in the level of dysphagia (p = 0.017) and oral intake (p = 0.003) after speech therapy. Although not statistically significant, there was considerable progress in the level of anxiety and nutritional status. PMID:25992086

  3. In vivo evaluation of the effect of stimulus distribution on FIR statistical efficiency in event-related fMRI.

    PubMed

    Jansma, J Martijn; de Zwart, Jacco A; van Gelderen, Peter; Duyn, Jeff H; Drevets, Wayne C; Furey, Maura L

    2013-05-15

    Technical developments in MRI have improved signal to noise, allowing use of analysis methods such as finite impulse response (FIR) analysis of rapid event-related functional MRI (er-fMRI). FIR is one of the most informative analysis methods, as it determines the onset and full shape of the hemodynamic response function (HRF) without any a priori assumptions. FIR is, however, vulnerable to multicollinearity, which is directly related to the distribution of stimuli over time. Efficiency can be optimized by simplifying a design and restricting stimulus distribution to specific sequences, while more design flexibility necessarily reduces efficiency. However, the actual effect of efficiency on fMRI results has never been tested in vivo. Thus, it is currently difficult to make an informed choice between protocol flexibility and statistical efficiency. The main goal of this study was to assign concrete fMRI signal to noise values to the abstract scale of FIR statistical efficiency. Ten subjects repeated a perception task with five random and m-sequence-based protocols, with varying but, according to literature, acceptable levels of multicollinearity. Results indicated substantial differences in signal standard deviation, with the level a function of multicollinearity. Experiment protocols varied up to 55.4% in standard deviation. Results confirm that quality of fMRI in an FIR analysis can significantly and substantially vary with statistical efficiency. Our in vivo measurements can be used to aid in making an informed decision between freedom in protocol design and statistical efficiency. Published by Elsevier B.V.

  4. Rating locomotive crew diesel emission exposure profiles using statistics and Bayesian Decision Analysis.

    PubMed

    Hewett, Paul; Bullock, William H

    2014-01-01

    For more than 20 years CSX Transportation (CSXT) has collected exposure measurements from locomotive engineers and conductors who are potentially exposed to diesel emissions. The database included measurements for elemental and total carbon, polycyclic aromatic hydrocarbons, aromatics, aldehydes, carbon monoxide, and nitrogen dioxide. This database was statistically analyzed and summarized, and the resulting statistics and exposure profiles were compared to relevant occupational exposure limits (OELs) using both parametric and non-parametric descriptive and compliance statistics. Exposure ratings, using the American Industrial Health Association (AIHA) exposure categorization scheme, were determined using both the compliance statistics and Bayesian Decision Analysis (BDA). The statistical analysis of the elemental carbon data (a marker for diesel particulate) strongly suggests that the majority of levels in the cabs of the lead locomotives (n = 156) were less than the California guideline of 0.020 mg/m³. The sample 95th percentile was roughly half the guideline, resulting in an AIHA exposure rating of category 2/3 (determined using BDA). The elemental carbon (EC) levels in the trailing locomotives tended to be greater than those in the lead locomotive; however, locomotive crews rarely ride in the trailing locomotive. Lead locomotive EC levels were similar to those reported by other investigators studying locomotive crew exposures and to levels measured in urban areas. Lastly, both the EC sample mean and 95% UCL were less than the Environmental Protection Agency (EPA) reference concentration of 0.005 mg/m³. With the exception of nitrogen dioxide, the overwhelming majority of the measurements for total carbon, polycyclic aromatic hydrocarbons, aromatics, aldehydes, and combustion gases in the cabs of CSXT locomotives were either non-detects or considerably less than the working OELs for the years represented in the database. 
When compared to the previous American Conference of Governmental Industrial Hygienists (ACGIH) threshold limit value (TLV) of 3 ppm the nitrogen dioxide exposure profile merits an exposure rating of AIHA exposure category 1. However, using the newly adopted TLV of 0.2 ppm the exposure profile receives an exposure rating of category 4. Further evaluation is recommended to determine the current status of nitrogen dioxide exposures. [Supplementary materials are available for this article. Go to the publisher's online edition of Journal of Occupational and Environmental Hygiene for the following free supplemental resource: additional text on OELs, methods, results, and additional figures and tables.].
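
    Parametric compliance statistics of the kind used above can be sketched by assuming a lognormal exposure distribution: the 95th percentile X95 = GM * GSD^1.645 is compared against the OEL, along with the exceedance fraction P(X > OEL). The GM/GSD inputs below are illustrative stand-ins and `profile_stats` is our own helper, not CSXT's method.

```python
import math

def profile_stats(gm, gsd, oel, z95=1.645):
    """95th percentile and exceedance fraction for a lognormal exposure
    profile summarized by its GM and GSD, relative to a limit `oel`."""
    x95 = gm * gsd ** z95
    # Exceedance fraction: P(ln X > ln OEL) under the lognormal model
    z = (math.log(oel) - math.log(gm)) / math.log(gsd)
    exceedance = 0.5 * (1 - math.erf(z / math.sqrt(2)))
    return x95, exceedance

# Hypothetical elemental-carbon profile vs. a 0.020 mg/m^3 guideline
x95, frac = profile_stats(gm=0.005, gsd=2.2, oel=0.020)
print(x95 < 0.020)
```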

  5. On Muthen's Maximum Likelihood for Two-Level Covariance Structure Models

    ERIC Educational Resources Information Center

    Yuan, Ke-Hai; Hayashi, Kentaro

    2005-01-01

    Data in social and behavioral sciences are often hierarchically organized. Special statistical procedures that take into account the dependence of such observations have been developed. Among procedures for 2-level covariance structure analysis, Muthen's maximum likelihood (MUML) has the advantage of easier computation and faster convergence. When…

  6. Climatological Study to Determine the Impact of Icing on the Low Level Windshear Alert System. Volume I. Analysis.

    DOT National Transportation Integrated Search

    1989-09-01

    The climatological study was performed to determine the impact of icing on the performance of the Low Level Windshear Alert System (LLWAS). This report presents the icing statistical profile in the form of data tables and histograms for 106 LLWAS sites....

  7. Periods of High Intensity Solar Proton Flux

    NASA Technical Reports Server (NTRS)

    Xapsos, Michael A.; Stauffer, Craig A.; Jordan, Thomas M.; Adams, James H.; Dietrich, William F.

    2012-01-01

    Analysis is presented for times during a space mission that specified solar proton flux levels are exceeded. This includes both total time and continuous time periods during missions. Results for the solar maximum and solar minimum phases of the solar cycle are presented and compared for a broad range of proton energies and shielding levels. This type of approach is more amenable to reliability analysis for spacecraft systems and instrumentation than standard statistical models.

  8. Time-Frequency Cross Mutual Information Analysis of the Brain Functional Networks Underlying Multiclass Motor Imagery.

    PubMed

    Gong, Anmin; Liu, Jianping; Chen, Si; Fu, Yunfa

    2018-01-01

    To study the physiologic mechanism of the brain during different motor imagery (MI) tasks, the authors employed a method of brain-network modeling based on time-frequency cross mutual information obtained from 4-class (left hand, right hand, feet, and tongue) MI tasks recorded as brain-computer interface (BCI) electroencephalography data. The authors explored the brain network revealed by these MI tasks using statistical analysis and the analysis of topologic characteristics, and observed significant differences in the reaction level, reaction time, and activated target during 4-class MI tasks. There was a great difference in the reaction level between the execution and resting states during different tasks: the reaction level of the left-hand MI task was the greatest, followed by that of the right-hand, feet, and tongue MI tasks. The reaction time required to perform the tasks also differed: during the left-hand and right-hand MI tasks, the brain networks of subjects reacted promptly and strongly, but there was a delay during the feet and tongue MI task. Statistical analysis and the analysis of network topology revealed the target regions of the brain network during different MI processes. In conclusion, our findings suggest a new way to explain the neural mechanism behind MI.

  9. Assessment of Online Patient Education Materials from Major Dermatologic Associations

    PubMed Central

    John, Ann M.; John, Elizabeth S.; Hansberry, David R.

    2016-01-01

    Objective: Patients increasingly use the internet to find medical information regarding their conditions and treatments. Physicians often supplement visits with written education materials. Online patient education materials from major dermatologic associations should be written at appropriate reading levels to optimize utility for patients. The purpose of this study is to assess online patient education materials from major dermatologic associations and determine if they are written at the fourth to sixth grade level recommended by the American Medical Association and National Institutes of Health. Design: This is a descriptive and correlational design. Setting: Academic institution. Participants/measurements: Patient education materials from eight major dermatology websites were downloaded and assessed using 10 readability scales. A one-way analysis of variance and Tukey's honestly significant difference (HSD) post hoc analysis were performed to determine the difference in readability levels between websites. Results: Two hundred and sixty patient education materials were assessed. Collectively, patient education materials were written at a mean grade level of 11.13, with 65.8 percent of articles written above a tenth grade level and no articles written at the American Medical Association/National Institutes of Health recommended grade levels. Analysis of variance demonstrated a significant difference between websites for each reading scale (p<0.001), which was confirmed with Tukey's HSD post hoc analysis. Conclusion: Online patient education materials from major dermatologic association websites are written well above recommended reading levels. Associations should consider revising patient education materials to allow more effective patient comprehension. (J ClinAesthet Dermatol. 2016;9(9):23–28.) PMID:27878059
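
    A readability screen along these lines can be sketched with one of the ten scales, the Flesch-Kincaid grade level; the syllable counter below is deliberately crude and the sample text is invented, so scores will differ somewhat from dedicated readability tools.

```python
import re

def count_syllables(word):
    """Crude vowel-group syllable counter; adequate for a readability sketch."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    n = len(groups)
    if word.lower().endswith("e") and n > 1:
        n -= 1          # rough silent-e adjustment
    return max(n, 1)

def flesch_kincaid_grade(text):
    """Flesch-Kincaid grade level:
    0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59"""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * len(words) / len(sentences)
            + 11.8 * syllables / len(words) - 15.59)

# Invented patient-education snippet with dense clinical vocabulary
sample = ("Seborrheic dermatitis is a chronic inflammatory condition. "
          "Topical antifungal therapy reduces colonization significantly.")
print(round(flesch_kincaid_grade(sample), 1))
```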

  10. Assessment of Online Patient Education Materials from Major Dermatologic Associations.

    PubMed

    John, Ann M; John, Elizabeth S; Hansberry, David R; Lambert, William Clark

    2016-09-01

    Objective: Patients increasingly use the internet to find medical information regarding their conditions and treatments. Physicians often supplement visits with written education materials. Online patient education materials from major dermatologic associations should be written at appropriate reading levels to optimize utility for patients. The purpose of this study is to assess online patient education materials from major dermatologic associations and determine if they are written at the fourth to sixth grade level recommended by the American Medical Association and National Institutes of Health. Design: This is a descriptive and correlational design. Setting: Academic institution. Participants/measurements: Patient education materials from eight major dermatology websites were downloaded and assessed using 10 readability scales. A one-way analysis of variance and Tukey's Honestly Significant Difference post hoc analysis were performed to determine the difference in readability levels between websites. Results: Two hundred and sixty patient education materials were assessed. Collectively, patient education materials were written at a mean grade level of 11.13, with 65.8 percent of articles written above a tenth grade level and no articles written at the American Medical Association/National Institutes of Health recommended grade levels. Analysis of variance demonstrated a significant difference between websites for each reading scale (p<0.001), which was confirmed with Tukey's Honestly Significant Difference post hoc analysis. Conclusion: Online patient education materials from major dermatologic association websites are written well above recommended reading levels. Associations should consider revising patient education materials to allow more effective patient comprehension. (J Clin Aesthet Dermatol. 2016;9(9):23-28.).

  11. Impact of a Single Unusually Large Rainfall Event on the Level of Risk Used for Infrastructure Design

    NASA Astrophysics Data System (ADS)

    Dhakal, N.; Jain, S.

    2013-12-01

    Rare and unusually large events (such as hurricanes and floods) can create unusual and interesting trends in statistics. The Generalized Extreme Value (GEV) distribution is typically used to statistically describe extreme rainfall events. A number of recent studies have shown that the frequency of extreme rainfall events has increased over the last century and that, as a result, the parameters of the GEV distribution have changed with time (non-stationarity). But what impact does a single unusually large rainfall event (e.g., Hurricane Irene) have on the GEV parameters and consequently on the level of risk, or the return periods, used in designing civil infrastructure? In other words, if such a large event occurs today, how will it influence the level of risk (estimated from past rainfall records) for civil infrastructure? To answer these questions, we performed a sensitivity analysis of the GEV distribution parameters and the return periods with respect to unusually large outlier events. Long-term precipitation records over the period 1981-2010 from 12 USHCN stations across the state of Maine were used for the analysis. For most of the stations, the addition of each outlier event caused an increase in the shape parameter with a large decrease in the corresponding return period. This is a key consideration for time-varying engineering design. Such isolated extreme weather events should be considered alongside traditional statistical methodology for extreme events when designing civil infrastructure (such as dams, bridges, and culverts). This analysis is also useful in understanding the statistical uncertainty of projecting extreme events into the future.
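
    The sensitivity experiment described above can be sketched with SciPy's GEV implementation on synthetic data. The rainfall values, the 200 mm outlier, and the generating GEV parameters below are hypothetical, not the Maine USHCN records (note SciPy's shape convention c = -ξ):

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
# Hypothetical 30-year annual-maximum rainfall series (mm); not the station records.
annual_max = genextreme.rvs(c=-0.1, loc=60, scale=15, size=30, random_state=rng)

def return_period(event, sample):
    """Estimated return period (years) of `event` under a GEV fitted to `sample`."""
    c, loc, scale = genextreme.fit(sample)
    return 1.0 / genextreme.sf(event, c, loc=loc, scale=scale)

outlier = 200.0  # a single unusually large event (hypothetical magnitude)
T_before = return_period(outlier, annual_max)
T_after = return_period(outlier, np.append(annual_max, outlier))
print(f"return period of the outlier: before vs after inclusion: "
      f"{T_before:.0f} yr -> {T_after:.0f} yr")
```

    Including the outlier shifts the fitted tail heavier, so the same event is assigned a much shorter return period, which is exactly the design-risk sensitivity the study examines.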

  12. Comparison of classical statistical methods and artificial neural network in traffic noise prediction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nedic, Vladimir, E-mail: vnedic@kg.ac.rs; Despotovic, Danijela, E-mail: ddespotovic@kg.ac.rs; Cvetanovic, Slobodan, E-mail: slobodan.cvetanovic@eknfak.ni.ac.rs

    2014-11-15

    Traffic is the main source of noise in urban environments and significantly affects human mental and physical health and labor productivity. It is therefore very important to model the noise produced by various vehicles. Techniques for traffic noise prediction are mainly based on regression analysis, which generally is not good enough to describe the trends of noise. In this paper the application of artificial neural networks (ANNs) for the prediction of traffic noise is presented. As input variables of the neural network, the structure of the traffic flow and the average speed of the traffic flow are chosen. The output variable of the network is the equivalent noise level in the given time period, L_eq. Based on these parameters, the network is modeled, trained and tested through a comparative analysis of the calculated values and measured levels of traffic noise, using an originally developed user-friendly software package. It is shown that artificial neural networks can be a useful tool for the prediction of noise with sufficient accuracy. In addition, the measured values were also used to calculate the equivalent noise level by means of classical methods, and a comparative analysis is given. The results clearly show that the ANN approach is superior to any other statistical method for traffic noise level prediction. Highlights: • We propose an ANN model for prediction of traffic noise. • We developed an originally designed user-friendly software package. • The results are compared with classical statistical methods. • The ANN model shows much better predictive capability.
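
    The flow-and-speed-to-L_eq mapping described above can be sketched with a small scikit-learn network. The traffic counts, speeds, and the logarithmic noise relation generating the synthetic L_eq values are hypothetical stand-ins for the paper's measured data and custom software:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
# Hypothetical traffic data: light vehicles/h, heavy vehicles/h, mean speed (km/h).
X = np.column_stack([rng.uniform(100, 2000, 300),
                     rng.uniform(0, 200, 300),
                     rng.uniform(20, 90, 300)])
# Synthetic equivalent level L_eq (dBA) from a rough logarithmic relation plus noise.
leq = 10 * np.log10(X[:, 0] + 8 * X[:, 1]) + 0.1 * X[:, 2] + 30 + rng.normal(0, 1, 300)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8), solver="lbfgs",
                                   max_iter=2000, random_state=0))
model.fit(X[:250], leq[:250])
rmse = float(np.sqrt(np.mean((model.predict(X[250:]) - leq[250:]) ** 2)))
print(f"hold-out RMSE: {rmse:.2f} dBA")
```

    Standardizing the inputs matters here because vehicle counts and speeds live on very different scales; without it the network trains poorly.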

  13. Strength and life criteria for corrugated fiberboard by three methods

    Treesearch

    Thomas J. Urbanik

    1997-01-01

    The conventional test method for determining the stacking life of corrugated containers at a fixed load level does not adequately predict a safe load when storage time is fixed. This study introduced multiple load levels and related the probability of time at failure to load. A statistical analysis of logarithm-of-time failure data varying with load level predicts the...

  14. Assessment of chamber pressure oscillations in the Shuttle SRB

    NASA Technical Reports Server (NTRS)

    Mathes, H. B.

    1980-01-01

    Combustion stability evaluations of the Shuttle solid propellant booster motor are reviewed. Measurements of the amplitude and frequency of low-level chamber pressure oscillations, which have been detected in motor firings, are discussed, and a statistical analysis of the data is presented. Oscillatory data from three recent motor firings are shown, and the results are compared with statistical predictions based on earlier motor firings.

  15. Voxel-based statistical analysis of cerebral glucose metabolism in patients with permanent vegetative state after acquired brain injury.

    PubMed

    Kim, Yong Wook; Kim, Hyoung Seop; An, Young-Sil; Im, Sang Hee

    2010-10-01

    Permanent vegetative state is defined as an impaired level of consciousness lasting longer than 12 months after traumatic causes and 3 months after non-traumatic causes of brain injury. Although many studies have assessed cerebral metabolism in patients with acute and persistent vegetative state after brain injury, few studies have investigated cerebral metabolism in patients with permanent vegetative state. In this study, we performed a voxel-based analysis of cerebral glucose metabolism and investigated the relationship between regional cerebral glucose metabolism and the severity of impaired consciousness in patients with permanent vegetative state after acquired brain injury. We compared the regional cerebral glucose metabolism demonstrated by F-18 fluorodeoxyglucose positron emission tomography in 12 patients with permanent vegetative state after acquired brain injury with that in 12 control subjects. Additionally, a covariance analysis was performed to identify regions where decreases in regional cerebral glucose metabolism significantly correlated with a decreased level of consciousness as measured by the JFK coma recovery scale. Statistical analysis was performed using statistical parametric mapping. Compared with controls, patients with permanent vegetative state demonstrated decreased cerebral glucose metabolism in the left precuneus, both posterior cingulate cortices, and the left superior parietal lobule (P(corrected) < 0.001), and increased cerebral glucose metabolism in both cerebellar hemispheres and the right supramarginal cortex (P(corrected) < 0.001). In the covariance analysis, a decrease in the level of consciousness was significantly correlated with decreased cerebral glucose metabolism in both posterior cingulate cortices (P(uncorrected) < 0.005).
Our findings suggest that the posteromedial parietal cortex, which is part of the neural network for consciousness, may be a relevant structure for the pathophysiological mechanism in patients with permanent vegetative state after acquired brain injury.

  16. Modeling and replicating statistical topology and evidence for CMB nonhomogeneity

    PubMed Central

    Agami, Sarit

    2017-01-01

    Under the banner of “big data,” the detection and classification of structure in extremely large, high-dimensional, data sets are two of the central statistical challenges of our times. Among the most intriguing new approaches to this challenge is “TDA,” or “topological data analysis,” one of the primary aims of which is providing nonmetric, but topologically informative, preanalyses of data which make later, more quantitative, analyses feasible. While TDA rests on strong mathematical foundations from topology, in applications, it has faced challenges due to difficulties in handling issues of statistical reliability and robustness, often leading to an inability to make scientific claims with verifiable levels of statistical confidence. We propose a methodology for the parametric representation, estimation, and replication of persistence diagrams, the main diagnostic tool of TDA. The power of the methodology lies in the fact that even if only one persistence diagram is available for analysis—the typical case for big data applications—the replications permit conventional statistical hypothesis testing. The methodology is conceptually simple and computationally practical, and provides a broadly effective statistical framework for persistence diagram TDA analysis. We demonstrate the basic ideas on a toy example, and the power of the parametric approach to TDA modeling in an analysis of cosmic microwave background (CMB) nonhomogeneity. PMID:29078301

  17. P-MartCancer-Interactive Online Software to Enable Analysis of Shotgun Cancer Proteomic Datasets.

    PubMed

    Webb-Robertson, Bobbie-Jo M; Bramer, Lisa M; Jensen, Jeffrey L; Kobold, Markus A; Stratton, Kelly G; White, Amanda M; Rodland, Karin D

    2017-11-01

    P-MartCancer is an interactive web-based software environment that enables statistical analyses of peptide or protein data, quantitated from mass spectrometry-based global proteomics experiments, without requiring in-depth knowledge of statistical programming. P-MartCancer offers a series of statistical modules associated with quality assessment, peptide and protein statistics, protein quantification, and exploratory data analyses driven by the user via customized workflows and interactive visualization. Currently, P-MartCancer offers access and the capability to analyze multiple cancer proteomic datasets generated through the Clinical Proteomics Tumor Analysis Consortium at the peptide, gene, and protein levels. P-MartCancer is deployed as a web service (https://pmart.labworks.org/cptac.html) and is alternatively available via Docker Hub (https://hub.docker.com/r/pnnl/pmart-web/). Cancer Res; 77(21); e47-50. ©2017 American Association for Cancer Research.

  18. Geostatistical applications in environmental remediation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, R.N.; Purucker, S.T.; Lyon, B.F.

    1995-02-01

    Geostatistical analysis refers to a collection of statistical methods for addressing data that vary in space. By incorporating spatial information into the analysis, geostatistics has advantages over traditional statistical analysis for problems with a spatial context. Geostatistics has a history of success in earth science applications, and its popularity is increasing in other areas, including environmental remediation. Due to recent advances in computer technology, geostatistical algorithms can be executed at a speed comparable to many standard statistical software packages. When used responsibly, geostatistics is a systematic and defensible tool that can be used in various decision frameworks, such as the Data Quality Objectives (DQO) process. At every point in the site, geostatistics can estimate both the concentration level and the probability or risk of exceeding a given value. These probability maps can assist in identifying clean-up zones. Given any decision threshold and an acceptable level of risk, the probability maps identify those areas that are estimated to be above or below the acceptable risk. Areas above the threshold are of the most concern with regard to remediation. In addition to estimating clean-up zones, geostatistics can assist in designing cost-effective secondary sampling schemes: areas of the probability map with high levels of estimated uncertainty are areas where more secondary sampling should occur. Geostatistics can also incorporate soft data directly into the analysis, including historical records, a highly correlated secondary contaminant, or expert judgment. Geostatistics is a tool that, in conjunction with other methods, can provide a common forum for building consensus in environmental remediation.
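
    The "probability of exceeding a given value at every point" idea can be sketched with Gaussian-process regression, a close relative of the kriging estimators geostatistics actually uses. The sample locations, concentrations, and clean-up action level below are all hypothetical:

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(2)
# Hypothetical soil samples: coordinates (m) and contaminant concentration (mg/kg).
pts = rng.uniform(0, 100, size=(40, 2))
conc = (50 + 30 * np.exp(-((pts[:, 0] - 70) ** 2 + (pts[:, 1] - 30) ** 2) / 800)
        + rng.normal(0, 2, 40))

gp = GaussianProcessRegressor(kernel=RBF(20.0) + WhiteKernel(4.0), normalize_y=True)
gp.fit(pts, conc)

# Probability map: P[concentration > action level] at each grid node.
xx, yy = np.meshgrid(np.linspace(0, 100, 25), np.linspace(0, 100, 25))
grid = np.column_stack([xx.ravel(), yy.ravel()])
mean, std = gp.predict(grid, return_std=True)
action_level = 70.0  # hypothetical clean-up threshold
p_exceed = norm.sf(action_level, loc=mean, scale=std)
print(f"nodes flagged for remediation: {int((p_exceed > 0.5).sum())} of {grid.shape[0]}")
```

    Thresholding the probability map at the acceptable risk level delineates candidate clean-up zones, and the nodes with large predictive standard deviation are natural targets for secondary sampling.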

  19. Ethnic disparities in the risk of colorectal adenomas associated with lipid levels: a retrospective multiethnic study.

    PubMed

    Davis-Yadley, Ashley H; Lipka, Seth; Shen, Huafeng; Devanney, Valerie; Swarup, Supreeya; Barnowsky, Alex; Silpe, Jeff; Mosdale, Josh; Pan, Qinshi; Fridlyand, Svetlana; Sreeharshan, Suhas; Abraham, Albin; Viswanathan, Prakash; Krishnamachari, Bhuma

    2015-03-01

    Although data exist showing that uncontrolled lipid levels in white and black patients are associated with colorectal adenomas, there are currently no studies looking only at the Hispanic population. With the rapid increase in the Hispanic population, we aimed to assess their risk of colorectal adenomas in association with lipid levels. We retrospectively analyzed 1473 patients undergoing colonoscopy from 2009 to 2011 at a community hospital. Statistical analysis was performed using chi-squared tests for categorical variables and t tests for continuous variables, with age-, gender-, and race-adjusted odds ratios. An unconditional logistic regression model was used to estimate 95 % confidence intervals (CI). SAS 9.3 software was used to perform all statistical analyses. In our general population, there was an association between elevated triglyceride levels greater than 150 and the presence of multiple colorectal adenomas, with odds ratio (OR) 1.60 (1.03, 2.48). There was an association between proximal colon adenomas and cholesterol levels between 200 and 239, with OR 1.57 (1.07, 2.30), and low-density lipoprotein (LDL) levels greater than 130, with OR 1.54 (1.04, 2.30). There was no association between high-density lipoprotein (HDL) levels and colorectal adenomas. The Hispanic population showed no statistical correlation between elevated triglycerides, cholesterol, or LDL and the presence, size, location, or multiplicity of colorectal adenomas. We found a significant correlation between elevated lipid levels and colorectal adenomas in white and black patients; however, there was no such association in the Hispanic population. This finding can possibly be due to environmental factors such as diet, colonic flora, or genetic susceptibility, which fosters further investigation and research.

  20. Summary and statistical analysis of precipitation and groundwater data for Brunswick County, North Carolina, Water Year 2008

    USGS Publications Warehouse

    McSwain, Kristen Bukowski; Strickland, A.G.

    2010-01-01

    Groundwater conditions in Brunswick County, North Carolina, have been monitored continuously since 2000 through the operation and maintenance of groundwater-level observation wells in the surficial, Castle Hayne, and Peedee aquifers of the North Atlantic Coastal Plain aquifer system. Groundwater-resource conditions for the Brunswick County area were evaluated by relating the normal range (25th to 75th percentile) monthly mean groundwater-level and precipitation data for water years 2001 to 2008 to median monthly mean groundwater levels and monthly sum of daily precipitation for water year 2008. Summaries of precipitation and groundwater conditions for the Brunswick County area and hydrographs and statistics of continuous groundwater levels collected during the 2008 water year are presented in this report. Groundwater levels varied by aquifer and geographic location within Brunswick County, but were influenced by drought conditions and groundwater withdrawals. Water levels were normal in two of the eight observation wells and below normal in the remaining six wells. Seasonal Kendall trend analysis performed on more than 9 years of monthly mean groundwater-level data collected in an observation well located within the Brunswick County well field indicated there is a strong downward trend, with water levels declining at a rate of about 2.2 feet per year.
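
    The seasonal Kendall trend test mentioned above works by testing for a monotone trend within each month separately, so the seasonal cycle cannot masquerade as a trend, and then combining across months. A minimal sketch on synthetic data; the 9-year span and ~2.2 ft/yr decline mirror the reported figures, but the levels below are simulated, not the Brunswick County record:

```python
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(6)
# Hypothetical 9 years of monthly mean groundwater levels (ft), declining ~2.2 ft/yr,
# with a seasonal cycle and observation noise.
years = np.arange(9)
season = 3 * np.sin(2 * np.pi * np.arange(12) / 12)
levels = np.array([[50 - 2.2 * y + s + rng.normal(0, 0.8) for y in years]
                   for s in season])  # shape: (12 months, 9 years)

# Within-month Kendall tau; the full seasonal Kendall statistic sums the
# concordance scores across months before computing a single significance test.
taus = [kendalltau(years, levels[month])[0] for month in range(12)]
print(f"mean within-month Kendall tau: {np.mean(taus):.2f}")
```

    A mean tau near -1 across all months is the signature of the strong downward trend the report describes.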

  1. A statistical analysis of the impact of advertising signs on road safety.

    PubMed

    Yannis, George; Papadimitriou, Eleonora; Papantoniou, Panagiotis; Voulgari, Chrisoula

    2013-01-01

    This research aims to investigate the impact of advertising signs on road safety. An exhaustive review of international literature was carried out on the effect of advertising signs on driver behaviour and safety. Moreover, a before-and-after statistical analysis with control groups was applied to several road sites with different characteristics in the Athens metropolitan area, in Greece, in order to investigate the correlation between the placement or removal of advertising signs and the related occurrence of road accidents. Road accident data for the 'before' and 'after' periods on the test sites and the control sites were extracted from the database of the Hellenic Statistical Authority, and the selected 'before' and 'after' periods vary from 2.5 to 6 years. The statistical analysis shows no statistical correlation between road accidents and advertising signs in any of the nine sites examined, as the confidence intervals of the estimated safety effects are non-significant at the 95% confidence level. This can be explained by the fact that, in the examined road sites, drivers are overloaded with information (traffic signs, direction signs, shop labels, pedestrians and other vehicles, etc.), so the additional information load from advertising signs may not further distract them.
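
    A simple before-and-after estimator with a comparison group, of the kind such studies build on, can be sketched as follows. The accident counts are hypothetical, and the log-scale variance formula is the standard approximation for a ratio of Poisson counts, not the paper's exact procedure:

```python
import math

def before_after_effect(b_t, a_t, b_c, a_c):
    """Naive before-and-after estimate with a comparison group (accident counts).
    theta < 1 suggests a safety improvement; returns (theta, 95% CI)."""
    expected = b_t * (a_c / b_c)          # expected 'after' count absent the change
    theta = a_t / expected
    se_log = math.sqrt(1/b_t + 1/a_t + 1/b_c + 1/a_c)  # approx. SD of log(theta)
    lo = theta * math.exp(-1.96 * se_log)
    hi = theta * math.exp(+1.96 * se_log)
    return theta, (lo, hi)

# Hypothetical counts for one site: 18 accidents before, 15 after;
# control sites: 120 before, 110 after.
theta, (lo, hi) = before_after_effect(18, 15, 120, 110)
print(f"effect {theta:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

    With counts this small the interval straddles 1, i.e. the effect is non-significant at the 95% level, which is the pattern the study reports across all nine sites.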

  2. Correlation between hospital-level antibiotic consumption and incident health care facility-onset Clostridium difficile infection.

    PubMed

    Crew, Page E; Rhodes, Nathaniel J; O'Donnell, J Nicholas; Miglis, Cristina; Gilbert, Elise M; Zembower, Teresa R; Qi, Chao; Silkaitis, Christina; Sutton, Sarah H; Scheetz, Marc H

    2018-03-01

    The purpose of this single-center, ecologic study is to characterize the relationship between facility-wide (FacWide) antibiotic consumption and incident health care facility-onset Clostridium difficile infection (HO-CDI). FacWide antibiotic consumption and incident HO-CDI were tallied on a monthly basis and standardized, from January 2013 through April 2015. Spearman rank-order correlation coefficients were calculated using matched-months analysis and a 1-month delay. Regression analyses were performed, with P < .05 considered statistically significant. FacWide analysis identified a matched-months correlation between ceftriaxone and HO-CDI (ρ = 0.44, P = .018). A unit of stem cell transplant recipients did not have significant correlation between carbapenems and HO-CDI in matched months (ρ = 0.37, P = .098), but a significant correlation was observed when a 1-month lag was applied (ρ = 0.54, P = .014). Three statistically significant lag associations were observed between FacWide/unit-level antibiotic consumption and HO-CDI, and 1 statistically significant nonlagged association was observed FacWide. Antibiotic consumption may convey extended ward-level risk for incident CDI. Consumption of antibiotic agents may have immediate and prolonged influence on incident CDI. Additional studies are needed to investigate the immediate and delayed associations between antibiotic consumption and C difficile colonization, infection, and transmission at the hospital level. Published by Elsevier Inc.
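
    The matched-months versus 1-month-lag correlation contrast described above can be sketched with synthetic monthly series. The series lengths match the study window, but the values and the lag-1 dependence are simulated assumptions, not the hospital's data:

```python
import numpy as np
import pandas as pd
from scipy.stats import spearmanr

rng = np.random.default_rng(3)
# Hypothetical standardized monthly series, Jan 2013 - Apr 2015 (28 months):
# HO-CDI incidence partly driven by the *previous* month's antibiotic consumption.
n = 28
use = pd.Series(rng.normal(size=n))
cdi = 0.6 * use.shift(1) + rng.normal(scale=0.5, size=n)

rho_matched, p_matched = spearmanr(use, cdi, nan_policy="omit")
rho_lagged, p_lagged = spearmanr(use[:-1], cdi[1:])  # this month's use vs next month's CDI
print(f"matched rho = {rho_matched:.2f}, 1-month-lag rho = {rho_lagged:.2f}")
```

    When the true dependence is delayed, the matched-months coefficient is weak while the lagged one is strong, mirroring the stem cell unit finding in the abstract.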

  3. Assessing landslide susceptibility by statistical data analysis and GIS: the case of Daunia (Apulian Apennines, Italy)

    NASA Astrophysics Data System (ADS)

    Ceppi, C.; Mancini, F.; Ritrovato, G.

    2009-04-01

    This study aims at landslide susceptibility mapping within an area of the Daunia (Apulian Apennines, Italy) using a multivariate statistical method and data manipulation in a Geographical Information System (GIS) environment. Among the variety of existing statistical data analysis techniques, logistic regression was chosen to produce a susceptibility map over an area where small settlements are historically threatened by landslide phenomena. In logistic regression, a best fit between the presence or absence of landslides (the dependent variable) and the set of independent variables is determined on the basis of a maximum likelihood criterion, leading to the estimation of regression coefficients. The reliability of the analysis therefore rests on its ability to quantify the proneness to landslide occurrence through the probability level it produces. The dependent and independent variables were managed in a GIS, where geometric properties and attributes were translated into raster cells in order to run the logistic regression by means of the SPSS (Statistical Package for the Social Sciences) package. A landslide inventory was used to produce the binary dependent variable, whereas the independent variables comprised slope, aspect, elevation, curvature, drained area, lithology and land use, after their reduction to dummy variables. The effect of each independent parameter on landslide occurrence was assessed by the corresponding coefficient in the logistic regression function, highlighting a major role played by the land use variable in determining the occurrence and distribution of phenomena. Once the outcomes of the logistic regression were determined, the data were re-introduced into the GIS to produce a map reporting the proneness to landslide as a predicted level of probability.
To validate the results and the regression model, a cell-by-cell comparison between the susceptibility map and the initial inventory of landslide events was performed, achieving agreement at the 75% level.
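
    The susceptibility workflow above, fit a logistic regression on per-cell predictors, map the predicted probability, then compare cell by cell against the inventory, can be sketched with scikit-learn in place of SPSS. The raster cells, predictor set, and generating coefficients below are synthetic:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
# Hypothetical raster cells: slope (deg), elevation (m), and a land-use dummy
# (1 = cultivated/bare, 0 = forest); landslide presence from a known logistic model.
n = 500
slope = rng.uniform(0, 40, n)
elev = rng.uniform(100, 900, n)
landuse = rng.integers(0, 2, n).astype(float)
true_logit = -4 + 0.12 * slope + 1.5 * landuse
landslide = rng.random(n) < 1 / (1 + np.exp(-true_logit))

X = np.column_stack([slope, elev, landuse])
model = LogisticRegression(max_iter=1000).fit(X, landslide)
susceptibility = model.predict_proba(X)[:, 1]  # per-cell landslide probability
agreement = float(((susceptibility > 0.5) == landslide).mean())
print(f"land-use coefficient: {model.coef_[0][2]:.2f}; "
      f"cell-by-cell agreement: {agreement:.2f}")
```

    The sign and size of each fitted coefficient play the role the abstract describes: they quantify how strongly each predictor, land use here, drives the mapped probability.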

  4. Missing Data and Multiple Imputation: An Unbiased Approach

    NASA Technical Reports Server (NTRS)

    Foy, M.; VanBaalen, M.; Wear, M.; Mendez, C.; Mason, S.; Meyers, V.; Alexander, D.; Law, J.

    2014-01-01

    The default method of dealing with missing data in statistical analyses is to only use the complete observations (complete case analysis), which can lead to unexpected bias when data do not meet the assumption of missing completely at random (MCAR). For the assumption of MCAR to be met, missingness cannot be related to either the observed or unobserved variables. A less stringent assumption, missing at random (MAR), requires that missingness not be associated with the value of the missing variable itself, but can be associated with the other observed variables. When data are truly MAR as opposed to MCAR, the default complete case analysis method can lead to biased results. There are statistical options available to adjust for data that are MAR, including multiple imputation (MI) which is consistent and efficient at estimating effects. Multiple imputation uses informing variables to determine statistical distributions for each piece of missing data. Then multiple datasets are created by randomly drawing on the distributions for each piece of missing data. Since MI is efficient, only a limited number, usually less than 20, of imputed datasets are required to get stable estimates. Each imputed dataset is analyzed using standard statistical techniques, and then results are combined to get overall estimates of effect. A simulation study will be demonstrated to show the results of using the default complete case analysis, and MI in a linear regression of MCAR and MAR simulated data. Further, MI was successfully applied to the association study of CO2 levels and headaches when initial analysis showed there may be an underlying association between missing CO2 levels and reported headaches. Through MI, we were able to show that there is a strong association between average CO2 levels and the risk of headaches. Each unit increase in CO2 (mmHg) resulted in a doubling in the odds of reported headaches.
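
    The MI mechanics described above, draw several imputed datasets from a distribution informed by the observed variables, analyze each, and pool, can be sketched with scikit-learn's IterativeImputer as a stand-in for a full MI procedure. The data are synthetic, only point estimates are pooled (Rubin's rules also combine the variances), and the 0.7 effect is a generating assumption, not the CO2-headache result:

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(5)
# Hypothetical data: CO2 (mmHg above ambient), a fully observed informing variable,
# and an outcome depending on CO2. Missingness depends on the covariate (MAR).
n = 400
co2 = rng.normal(4.0, 1.0, n)
covar = co2 + rng.normal(0, 0.5, n)
y = 0.7 * co2 + rng.normal(0, 0.5, n)
co2_obs = co2.copy()
co2_obs[covar > 4.5] = np.nan  # MAR: missing when the observed covariate is high

estimates = []
for m in range(10):  # a limited number of imputed datasets suffices
    imputer = IterativeImputer(sample_posterior=True, random_state=m)
    filled = imputer.fit_transform(np.column_stack([co2_obs, covar, y]))
    estimates.append(LinearRegression().fit(filled[:, [0]], y).coef_[0])
pooled = float(np.mean(estimates))  # pooled point estimate across imputations
print(f"pooled CO2 effect: {pooled:.2f} (generating value 0.7)")
```

    Note that the outcome is included in the imputation model, which is standard MI practice; `sample_posterior=True` makes each dataset a random draw rather than a single conditional-mean fill.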

  5. An Analysis of Operational Suitability for Test and Evaluation of Highly Reliable Systems

    DTIC Science & Technology

    1994-03-04

    Fragmented indexing snippets from the report: "…Exposition," Journal of the American Statistical Association, 59: 353-375 (June 1964); SYS 229, Test and Evaluation Management Coursebook, School of Systems…; "…in hours, θ is the desired MTBCF in hours, R is the number of critical failures, and α is the P[type-I error] of the χ² statistic with 2R+2…"; "…design of experiments (DOE) tables and the use of Bayesian statistics to increase the confidence level of the test results that will be obtained from…"
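
    The χ² statistic with 2R+2 degrees of freedom in the fragments matches the standard time-terminated reliability demonstration bound for mean time between critical failures. A sketch of that textbook formula; the 5000-hour test and two failures are a hypothetical illustration, not figures from the report:

```python
from scipy.stats import chi2

def mtbcf_lower_bound(total_hours, critical_failures, alpha=0.10):
    """One-sided lower (1 - alpha) confidence bound on MTBCF from a
    time-terminated test: theta_L = 2*T / chi2(1 - alpha; 2*R + 2)."""
    dof = 2 * critical_failures + 2
    return 2 * total_hours / chi2.ppf(1 - alpha, dof)

# Hypothetical demonstration: 5000 operating hours with R = 2 critical failures.
print(f"90% lower bound on MTBCF: {mtbcf_lower_bound(5000, 2):.0f} h")
```

    Raising the required confidence (smaller alpha) pushes the bound down, which is why the report explores Bayesian methods to raise the confidence achievable from a fixed test.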

  6. Chronic atrophic gastritis in association with hair mercury level.

    PubMed

    Xue, Zeyun; Xue, Huiping; Jiang, Jianlan; Lin, Bing; Zeng, Si; Huang, Xiaoyun; An, Jianfu

    2014-11-01

    The objective of this study was to explore hair mercury level in association with chronic atrophic gastritis, a precancerous stage of gastric cancer (GC), and thus provide a brand new angle of view on the timely intervention of the precancerous stage of GC. We recruited 149 healthy volunteers as controls and 152 patients suffering from chronic gastritis as cases. The controls denied upper gastrointestinal discomfort, and the cases were diagnosed as chronic superficial gastritis (n=68) or chronic atrophic gastritis (n=84). We utilized a Mercury Automated Analyzer (NIC MA-3000) to detect the hair mercury level of both healthy controls and cases of chronic gastritis. Measurement data were expressed as mean ± standard deviation (SD) and analyzed using Levene's test for equality of variances and the t test. Pearson correlation analysis was employed to determine factors associated with hair mercury levels, and multiple stepwise regression analysis was performed to deduce regression equations. Statistical significance was considered at p < 0.05. The overall hair mercury level was 0.908949 ± 0.8844490 ng/g (mean ± SD) in gastritis cases and 0.460198 ± 0.2712187 ng/g (mean ± SD) in healthy controls; the former was significantly higher than the latter (p<0.01). The hair mercury level in the chronic atrophic gastritis subgroup was 1.155220 ± 0.9470246 ng/g (mean ± SD) and that in the chronic superficial gastritis subgroup was 0.604732 ± 0.6942509 ng/g (mean ± SD); the former was significantly higher than the latter (p<0.01). The hair mercury level in chronic superficial gastritis cases was significantly higher than that in healthy controls (p<0.05). The hair mercury level in chronic atrophic gastritis cases was significantly higher than that in healthy controls (p<0.01).
Stratified analysis indicated that the hair mercury level in healthy controls who ate seafood was significantly higher than that in healthy controls who did not (p<0.01), and that the hair mercury level in chronic atrophic gastritis cases was significantly higher than that in chronic superficial gastritis cases (p<0.01). Pearson correlation analysis indicated that eating seafood was the factor most strongly, and positively, correlated with hair mercury level in the healthy controls, and that the severity of gastritis was the factor most strongly, and positively, correlated with hair mercury level in the gastritis cases. Multiple stepwise regression analysis indicated that the regression equation for hair mercury level in controls could be expressed as 0.262 × (eating seafood) + 0.434, a model that was statistically significant (p<0.01). Multiple stepwise regression analysis also indicated that the regression equation for hair mercury level in gastritis cases could be expressed as 0.305 × (severity of gastritis), a model that was also statistically significant (p<0.01). The graphs of regression standardized residuals for both controls and cases conformed to a normal distribution. The main positively correlated factor affecting hair mercury level is eating seafood in healthy people, whereas the predominant positively correlated factor is the severity of gastritis in chronic gastritis patients. That is to say, the severity of chronic gastritis is positively correlated with the level of hair mercury. The steadily increasing level of hair mercury possibly reflects the development from normal stomach to superficial gastritis and to atrophic gastritis. The detection of hair mercury is potentially a means to predict the severity of chronic gastritis and possibly to indicate the environmental mercury threat to human health in terms of gastritis or even carcinogenesis.

  7. Bayesian Statistical Analysis of Historical and Late Holocene Rates of Sea-Level Change

    NASA Astrophysics Data System (ADS)

    Cahill, Niamh; Parnell, Andrew; Kemp, Andrew; Horton, Benjamin

    2014-05-01

    A fundamental concern associated with climate change is the rate at which sea levels are rising. Studies of past sea level (particularly beyond the instrumental data range) allow modern sea-level rise to be placed in a more complete context. Considering this, we perform a Bayesian statistical analysis on historical and late Holocene rates of sea-level change. The data that form the input to the statistical model are tide-gauge measurements and proxy reconstructions from cores of coastal sediment. The aims are to estimate rates of sea-level rise, to determine when modern rates of sea-level rise began and to observe how these rates have been changing over time. Many of the current methods for doing this use simple linear regression to estimate rates. This is often inappropriate as it is too rigid and it can ignore uncertainties that arise as part of the data collection exercise. This can lead to over confidence in the sea-level trends being characterized. The proposed Bayesian model places a Gaussian process prior on the rate process (i.e. the process that determines how rates of sea-level are changing over time). The likelihood of the observed data is the integral of this process. When dealing with proxy reconstructions, this is set in an errors-in-variables framework so as to take account of age uncertainty. It is also necessary, in this case, for the model to account for glacio-isostatic adjustment, which introduces a covariance between individual age and sea-level observations. This method provides a flexible fit and it allows for the direct estimation of the rate process with full consideration of all sources of uncertainty. Analysis of tide-gauge datasets and proxy reconstructions in this way means that changing rates of sea level can be estimated more comprehensively and accurately than previously possible. 
The model captures the continuous and dynamic evolution of sea-level change, and results show not only that modern sea levels are rising but that the rates of rise are continuously increasing. Analysis of a global tide-gauge record (Church and White, 2011) indicated that the rate of sea-level rise has increased continuously since 1880 AD and is currently 2.57 mm/yr (95% credible interval of 1.71 to 4.35 mm/yr). Application of the model to a proxy reconstruction from North Carolina (Kemp et al., 2011) indicated that the mean rate of rise in this locality since the middle of the 19th century (current rate of 2.66 mm/yr with a 95% credible interval of 1.29 to 4.59 mm/yr) is in agreement with the results from the tide-gauge analysis and is unprecedented in at least the last 2000 years.
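The core of the method described above can be sketched numerically: place a Gaussian-process prior on the rate r(t) and treat sea level as its integral. The kernel choice, its hyperparameters, and the time grid below are invented for illustration and are not the paper's actual settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sketch: a squared-exponential GP prior on the *rate* of
# sea-level change r(t); sea level s(t) is the integral of r(t).
# Amplitude and length-scale are invented for the illustration.
t = np.linspace(1800, 2000, 201)            # years
amp, ell = 1.0, 50.0                        # rate scale (mm/yr), length-scale (yr)
K = amp**2 * np.exp(-0.5 * (t[:, None] - t[None, :])**2 / ell**2)

# Draw rate trajectories from the GP prior and integrate to sea level.
L = np.linalg.cholesky(K + 1e-6 * np.eye(len(t)))   # jitter for stability
rates = L @ rng.standard_normal((len(t), 5))        # 5 prior draws, mm/yr
dt = t[1] - t[0]
sea_level = np.cumsum(rates, axis=0) * dt           # mm, relative to t[0]

print(rates.shape, sea_level.shape)
```

In the paper the posterior (not the prior) of the rate process is what gets reported, conditioned on the tide-gauge and proxy data; this sketch only shows the prior-and-integral structure.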

  8. Application of short-data methods on extreme surge levels

    NASA Astrophysics Data System (ADS)

    Feng, X.

    2014-12-01

Tropical cyclone-induced storm surges are among the most destructive natural hazards that impact the United States. Unfortunately for academic research, the available time series for extreme surge analysis are very short. The limited data introduce uncertainty and affect the accuracy of statistical analyses of extreme surge levels. This study deals with techniques applicable to data sets spanning less than 20 years, including simulation modelling and methods based on the parameters of the parent distribution. The verified water levels from water gauges spread along the Southwest and Southeast Florida coasts, as well as the Florida Keys, are used in this study. Methods to calculate extreme storm surges are described and reviewed, including 'classical' methods based on the generalized extreme value (GEV) distribution and the generalized Pareto distribution (GPD), and approaches designed specifically to deal with short data sets. Incorporating global-warming influence, the statistical analysis reveals enhanced extreme surge magnitudes and frequencies during warm years, while reduced levels of extreme surge activity are observed in the same study domain during cold years. Furthermore, a non-stationary GEV distribution is applied to predict extreme surge levels under warming sea surface temperatures. The non-stationary GEV distribution indicates that with 1°C of warming in sea surface temperature from the baseline climate, the 100-year return surge level in Southwest and Southeast Florida will increase by up to 40 centimeters. The statistical approaches considered here for extreme surge estimation based on short data sets will be valuable to coastal stakeholders, including urban planners, emergency managers, and hurricane and storm surge forecasting and warning systems.
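The return-level calculation from a fitted GEV can be sketched as follows; the annual-maximum surge data are synthetic and the distribution parameters are invented, so the numbers are purely illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hedged sketch: fit a GEV distribution to annual-maximum surge levels
# (synthetic data here) and read off the 100-year return level as the
# quantile exceeded with probability 1/100 in any given year.
annual_maxima = stats.genextreme.rvs(c=-0.1, loc=100, scale=25,
                                     size=40, random_state=rng)  # cm, synthetic

c, loc, scale = stats.genextreme.fit(annual_maxima)
return_level_100 = stats.genextreme.ppf(1 - 1 / 100, c, loc=loc, scale=scale)
print(round(return_level_100, 1), "cm")
```

With only a few decades of data the fitted shape parameter is highly uncertain, which is exactly the short-record problem the abstract is concerned with.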

  9. Serum adipokines and HIV viral replication in patients undergoing antiretroviral therapy

    PubMed Central

    Aramă, Victoria; Tilişcan, Cătălin; Ion, Daniela Adriana; Mihăilescu, Raluca; Munteanu, Daniela; Streinu-Cercel, Anca; Tudor, Ana Maria; Hristea, Adriana; Leoveanu, Viorica; Olaru, Ioana; Aramă, Ştefan Sorin

    2012-01-01

Introduction Several studies have reported that cytokines secreted by adipose tissue (adipokines) may be linked to HIV replication. The aim of the study was to evaluate the relationship between HIV replication and serum levels of adipokines in a Caucasian HIV-infected population of men and women undergoing complex antiretroviral therapy. Methods A cross-sectional study was conducted in an unselected sample of 77 HIV-1-positive patients. Serum adipokine levels were measured, including circulating adiponectin, leptin, resistin, tumor necrosis factor alpha (TNF-alpha) and interleukin-6 (IL-6). Patients were divided into two groups: Group 1, with undetectable viral load, and Group 2, with persistent HIV viral replication. Differences between groups were tested using the independent-sample t-test for Gaussian variables and the Mann–Whitney–Wilcoxon test for non-parametric variables. Pearson's chi-squared test was used for correlation analysis. Results A total of 77 patients (age range: 17-65, mean: 32.5 years), including 44 men (57.1%, age range: 17–63 years, mean: 34.1 years) and 33 women (42.9%, age range: 19–65 years, mean: 30.3 years), were included in the study. TNF-alpha had significantly higher serum levels in patients with detectable viral load (16.89 vs. 9.35 pg/mL; p=0.043), but correlation analysis lacked statistical significance. Adiponectin had median serum levels of 9.22 µg/mL in Group 1 vs. 16.50 µg/mL in Group 2, but the results lacked statistical significance (p=0.059). Higher leptin, IL-6 and resistin serum levels were noted in patients with undetectable HIV viral load, without statistical significance. Conclusions The present study reported higher TNF-alpha serum levels in patients with persistent HIV viral load. We found no statistically significant correlations between adiponectin, leptin, resistin or IL-6 and HIV viral load in our Caucasian HIV-positive study population undergoing antiretroviral therapy. PMID:24432258

  10. The impact of mother's literacy on child dental caries: Individual data or aggregate data analysis?

    PubMed

    Haghdoost, Ali-Akbar; Hessari, Hossein; Baneshi, Mohammad Reza; Rad, Maryam; Shahravan, Arash

    2017-01-01

To evaluate the impact of mother's literacy on child dental caries based on a national oral health survey in Iran, and to investigate the possibility of ecological fallacy in aggregate data analysis. Existing data were from the second national oral health survey, carried out in 2004, which included 8725 six-year-old participants. The association of mother's literacy with caries occurrence (DMF (Decayed, Missing, Filled) total score >0) of her child was assessed using individual data with a logistic regression model. Then the association between the percentage of literate mothers and the percentage of decayed teeth in each of the 30 provinces of Iran was assessed using aggregated data retrieved from the second national oral health survey of Iran, and alternatively from the census of the "Statistical Center of Iran", using a linear regression model. The significance level was set at 0.05 for all analyses. Individual data analysis showed a statistically significant association between mother's literacy and decayed teeth of children (P = 0.02, odds ratio = 0.83). There was no statistically significant association between mother's literacy and child dental caries in the aggregate data analysis of the oral health survey (P = 0.79, B = 0.03) or the census of the "Statistical Center of Iran" (P = 0.60, B = 0.14). Maternal literacy has a preventive effect on the occurrence of dental caries in children. Given the high percentage of illiterate parents in Iran, it is reasonable to consider methods of oral health education that do not require reading or writing. Aggregate data analysis and individual data analysis had completely different results in this study.
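A toy simulation (entirely invented numbers, not the survey data) shows how the individual-level and aggregate-level analyses can disagree: a hypothetical province-level risk factor correlated with literacy masks, in the province averages, the protective effect that holds within every province.

```python
import numpy as np

rng = np.random.default_rng(7)

# Invented illustration of the ecological fallacy. Within each province a
# literate mother lowers her child's caries risk, but a province-level
# factor correlated with literacy is arranged to mask this in the
# aggregate analysis.
n_prov, n_per = 30, 300
literacy_rate = rng.uniform(0.3, 0.9, n_prov)
prov_risk = 0.2 * literacy_rate        # hypothetical province-level confounder

caries_rate, indiv_diff = [], []
for p in range(n_prov):
    literate = rng.random(n_per) < literacy_rate[p]
    p_caries = 0.5 + prov_risk[p] - 0.15 * literate   # literate mothers protect
    caries = rng.random(n_per) < p_caries
    caries_rate.append(caries.mean())
    indiv_diff.append(caries[literate].mean() - caries[~literate].mean())

# Individual level: literate mothers' children have fewer caries ...
print(round(float(np.mean(indiv_diff)), 2))
# ... yet the aggregate slope of caries rate on literacy rate is near zero.
slope = np.polyfit(literacy_rate, caries_rate, 1)[0]
print(round(float(slope), 2))
```

The within-province difference is clearly negative while the between-province slope is not, mirroring the mismatch the study reports between its individual and aggregate analyses.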

  11. Statistics of high-level scene context

    PubMed Central

    Greene, Michelle R.

    2013-01-01

Context is critical for recognizing environments and for searching for objects within them: contextual associations have been shown to modulate reaction time and object recognition accuracy, as well as influence the distribution of eye movements and patterns of brain activations. However, we have not yet systematically quantified the relationships between objects and their scene environments. Here I seek to fill this gap by providing descriptive statistics of object-scene relationships. A total of 48,167 objects were hand-labeled in 3499 scenes using the LabelMe tool (Russell et al., 2008). From these data, I computed a variety of descriptive statistics at three different levels of analysis: the ensemble statistics that describe the density and spatial distribution of unnamed “things” in the scene; the bag of words level where scenes are described by the list of objects contained within them; and the structural level where the spatial distribution and relationships between the objects are measured. The utility of each level of description for scene categorization was assessed through the use of linear classifiers, and the plausibility of each level for modeling human scene categorization is discussed. Of the three levels, ensemble statistics were found to be the most informative (per feature), and also best explained human patterns of categorization errors. Although a bag of words classifier had similar performance to human observers, it had a markedly different pattern of errors. However, certain objects are more useful than others, and ceiling classification performance could be achieved using only the 64 most informative objects. As object location tends not to vary as a function of category, structural information provided little additional information. 
Additionally, these data provide valuable information on natural scene redundancy that can be exploited for machine vision, and can help the visual cognition community to design experiments guided by statistics rather than intuition. PMID:24194723

  12. Space biology initiative program definition review. Trade study 3: Hardware miniaturization versus cost

    NASA Technical Reports Server (NTRS)

    Jackson, L. Neal; Crenshaw, John, Sr.; Davidson, William L.; Herbert, Frank J.; Bilodeau, James W.; Stoval, J. Michael; Sutton, Terry

    1989-01-01

The optimum hardware miniaturization level with the lowest cost impact for space biology hardware was determined. Space biology hardware and/or components/subassemblies/assemblies which are the most likely candidates for miniaturization are to be defined, and the relative cost impacts of such miniaturization are to be analyzed. A mathematical or statistical analysis method capable of supporting the development of parametric cost impact analyses for levels of production design miniaturization is provided.

  13. Eruption patterns of the chilean volcanoes Villarrica, Llaima, and Tupungatito

    NASA Astrophysics Data System (ADS)

    Muñoz, Miguel

    1983-09-01

The historical eruption records of three Chilean volcanoes have been subjected to many statistical tests, and none have been found to differ significantly from random, or Poissonian, behaviour. The statistical analysis shows rough conformity with the descriptions determined from the eruption rate functions. It is possible that a constant eruption rate describes the activity of Villarrica; Llaima and Tupungatito present complex eruption rate patterns that appear, however, to have no statistical significance. Questions related to loading and extinction processes and to the existence of shallow secondary magma chambers to which magma is supplied from a deeper system are also addressed. The analysis and the computation of the serial correlation coefficients indicate that the three series may be regarded as stationary renewal processes. None of the test statistics indicates rejection of the Poisson hypothesis at a level less than 5%, but the coefficient of variation for the eruption series at Llaima is significantly different from the value expected for a Poisson process. Also, the estimates of the normalized spectrum of the counting process for the three series suggest a departure from the random model, but the deviations are not found to be significant at the 5% level. Kolmogorov-Smirnov and chi-squared test statistics, applied directly to ascertain the probability P with which the random Poisson model fits the data, indicate that there is significant agreement in the case of Villarrica (P=0.59) and Tupungatito (P=0.3). Even though the P-value for Llaima is a marginally significant 0.1 (which is equivalent to rejecting the Poisson model at the 90% confidence level), the series suggests that nonrandom features are possibly present in the eruptive activity of this volcano.
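The Poisson-hypothesis check described above can be sketched as follows: under a Poisson process the inter-eruption intervals are exponentially distributed, so a Kolmogorov-Smirnov test of the intervals against an exponential with the observed mean should not reject. The eruption dates below are synthetic, not the Chilean catalogues, and estimating the scale from the same data makes the standard KS p-value only approximate.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Synthetic eruption record generated as a Poisson process: cumulative
# sums of exponential waiting times, offset to look like calendar years.
eruption_years = np.cumsum(rng.exponential(scale=3.0, size=60)) + 1900
intervals = np.diff(eruption_years)

# KS test of the intervals against an exponential with the observed mean.
ks_stat, p_value = stats.kstest(intervals, "expon",
                                args=(0, intervals.mean()))
print(f"KS={ks_stat:.3f}, p={p_value:.2f}")
```

A small p-value here would point toward non-random (e.g. clustered or renewal) eruptive behaviour, which is the kind of departure the paper probes with its spectrum and coefficient-of-variation statistics.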

  14. An Economic Analysis of the Demand for State and Local Government Employees.

    ERIC Educational Resources Information Center

    Ehrenberg, Ronald G.

This study presents estimates of the wage elasticities of demand for state and local government employees. Almost uniformly, the employment level of each functional category of state and local government employees is shown to be statistically significantly negatively related to the category's real and relative wage level. However, the magnitude of these…

  15. Statistical Modeling of the Individual: Rationale and Application of Multivariate Stationary Time Series Analysis

    ERIC Educational Resources Information Center

    Hamaker, Ellen L.; Dolan, Conor V.; Molenaar, Peter C. M.

    2005-01-01

    Results obtained with interindividual techniques in a representative sample of a population are not necessarily generalizable to the individual members of this population. In this article the specific condition is presented that must be satisfied to generalize from the interindividual level to the intraindividual level. A way to investigate…

  16. Knee osteoarthritis, dyslipidemia syndrome and exercise.

    PubMed

    Păstrăiguş, Carmen; Ancuţa, Codrina; Miu, Smaranda; Ancuţa, E; Chirieac, Rodica

    2012-01-01

The aim of our study was to evaluate the influence of aerobic training on dyslipidemia in patients with knee osteoarthritis (KOA). Prospective observational six-month study performed on 40 patients with KOA, fulfilling the inclusion criteria, classified into two subgroups according to their participation in a specific aerobic training program (30 minutes/day, 5 days/week). A standard evaluation protocol was followed assessing lipid parameters (total cholesterol, triglycerides, LDL-cholesterol, HDL-cholesterol levels) at baseline, three and six months. Statistical analysis was performed in SPSS 16.0, with significance set at p < 0.05. Subgroup analysis demonstrated a statistically significant improvement in plasma lipid levels (cholesterol, triglycerides, HDL-cholesterol, LDL-cholesterol) in all patients performing regular aerobic training (p < 0.05). Although the differences reported for total cholesterol, triglycerides and LDL-cholesterol between subgroups after six months were not significant (p > 0.05), the mean level of HDL-cholesterol was significantly higher in patients performing aerobic training, reaching cardio-vascular protective levels. Regular aerobic exercise has a positive effect on plasma lipoprotein concentrations; further research is needed to assess the long-term effects of physical exercise on both KOA and the lipid pattern.

  17. Correlation of FCGRT genomic structure with serum immunoglobulin, albumin and farletuzumab pharmacokinetics in patients with first relapsed ovarian cancer.

    PubMed

    O'Shannessy, Daniel J; Bendas, Katie; Schweizer, Charles; Wang, Wenquan; Albone, Earl; Somers, Elizabeth B; Weil, Susan; Meredith, Rhonda K; Wustner, Jason; Grasso, Luigi; Landers, Mark; Nicolaides, Nicholas C

    2017-07-01

Farletuzumab (FAR) is a humanized monoclonal antibody (mAb) that binds to folate receptor alpha. A Phase 3 (Ph3) trial in ovarian cancer patients treated with carboplatin/taxane plus FAR or placebo did not meet the primary statistical endpoint. Subgroup analysis demonstrated that subjects with high FAR exposure levels (Cmin > 57.6 μg/mL) showed statistically significant improvements in PFS and OS. The neonatal Fc receptor (fcgrt) plays a central role in albumin/IgG stasis and mAb pharmacokinetics (PK). Here we evaluated the fcgrt sequence and the association of its promoter variable number tandem repeats (VNTR) and coding single nucleotide variants (SNV) with albumin/IgG levels and FAR PK in the Ph3 patients. A statistical correlation existed between high FAR Cmin and AUC in patients with the highest quartile of albumin and lowest quartile of IgG1. Analysis of fcgrt identified 5 different VNTRs in the promoter region and 9 SNVs within the coding region, 4 of which are novel. Copyright © 2017. Published by Elsevier Inc.

  18. Statistical results on restorative dentistry experiments: effect of the interaction between main variables

    PubMed Central

    CAVALCANTI, Andrea Nóbrega; MARCHI, Giselle Maria; AMBROSANO, Gláucia Maria Bovi

    2010-01-01

Interpreting statistical analyses is a critical skill in scientific research. When more than one main variable is studied, the effect of the interaction between those variables is fundamental to the discussion of the experiment. However, uncertainty can arise when the p-value of the interaction is greater than the significance level. Objective To determine the most adequate interpretation for factorial experiments with interaction p-values only slightly higher than the significance level. Materials and methods The p-values of the interactions found in two restorative dentistry experiments (0.053 and 0.068) were interpreted in two distinct ways: considering the interaction as not significant and as significant. Results Different findings were observed between the two analyses, and the studies' results became more coherent when the interaction was treated as significant. Conclusion The p-value of the interaction between main variables must be analyzed with caution because it can change the outcomes of research studies. Researchers are strongly advised to interpret the results of their statistical analysis carefully in order to discuss the findings of their experiments properly. PMID:20857003
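For a balanced two-factor design like those discussed, the interaction F-test can be computed by hand as a sketch; the cell means, sample size, and error spread below are invented, not taken from the two dentistry experiments.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)

# Invented balanced 2x2 design with a mild interaction in the cell means;
# the interaction F-statistic and p-value are computed from scratch.
a_levels, b_levels, n = 2, 2, 10
cell_means = np.array([[10.0, 12.0], [11.0, 14.5]])
data = cell_means[:, :, None] + rng.normal(0, 2.5, (2, 2, n))

grand = data.mean()
a_m = data.mean(axis=(1, 2), keepdims=True)     # factor-A marginal means
b_m = data.mean(axis=(0, 2), keepdims=True)     # factor-B marginal means
cell = data.mean(axis=2, keepdims=True)         # cell means

ss_inter = n * ((cell - a_m - b_m + grand) ** 2).sum()
ss_error = ((data - cell) ** 2).sum()
df_inter = (a_levels - 1) * (b_levels - 1)
df_error = a_levels * b_levels * (n - 1)
F = (ss_inter / df_inter) / (ss_error / df_error)
p = stats.f.sf(F, df_inter, df_error)
print(f"F={F:.2f}, p={p:.3f}")
```

Whether p lands just above or just below 0.05 depends entirely on the assumed noise level, which is precisely the borderline situation the paper examines.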

  19. Eutrophication risk assessment in coastal embayments using simple statistical models.

    PubMed

    Arhonditsis, G; Eleftheriadou, M; Karydis, M; Tsirtsis, G

    2003-09-01

    A statistical methodology is proposed for assessing the risk of eutrophication in marine coastal embayments. The procedure followed was the development of regression models relating the levels of chlorophyll a (Chl) with the concentration of the limiting nutrient--usually nitrogen--and the renewal rate of the systems. The method was applied in the Gulf of Gera, Island of Lesvos, Aegean Sea and a surrogate for renewal rate was created using the Canberra metric as a measure of the resemblance between the Gulf and the oligotrophic waters of the open sea in terms of their physical, chemical and biological properties. The Chl-total dissolved nitrogen-renewal rate regression model was the most significant, accounting for 60% of the variation observed in Chl. Predicted distributions of Chl for various combinations of the independent variables, based on Bayesian analysis of the models, enabled comparison of the outcomes of specific scenarios of interest as well as further analysis of the system dynamics. The present statistical approach can be used as a methodological tool for testing the resilience of coastal ecosystems under alternative managerial schemes and levels of exogenous nutrient loading.
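The regression model at the heart of this approach can be sketched with ordinary least squares; the data, coefficients, and units below are invented for illustration and are not the Gulf of Gera measurements.

```python
import numpy as np

rng = np.random.default_rng(3)

# Invented data mimicking the model structure: chlorophyll-a regressed on
# total dissolved nitrogen and a renewal-rate surrogate.
n = 80
tdn = rng.uniform(5, 40, n)          # total dissolved nitrogen (hypothetical units)
renewal = rng.uniform(0.1, 1.0, n)   # renewal-rate surrogate (unitless)
chl = 0.12 * tdn - 2.0 * renewal + 1.5 + rng.normal(0, 0.5, n)

# Ordinary least squares fit and coefficient of determination.
X = np.column_stack([np.ones(n), tdn, renewal])
beta, *_ = np.linalg.lstsq(X, chl, rcond=None)
resid = chl - X @ beta
r_squared = 1 - resid.var() / chl.var()
print(np.round(beta, 2), round(r_squared, 2))
```

The positive nitrogen coefficient and negative renewal coefficient match the qualitative behaviour described in the abstract: more limiting nutrient raises Chl, faster flushing lowers it.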

  20. Mixed-Methods Research in the Discipline of Nursing.

    PubMed

    Beck, Cheryl Tatano; Harrison, Lisa

    2016-01-01

    In this review article, we examined the prevalence and characteristics of 294 mixed-methods studies in the discipline of nursing. Creswell and Plano Clark's typology was most frequently used along with concurrent timing. Bivariate statistics was most often the highest level of statistics reported in the results. As for qualitative data analysis, content analysis was most frequently used. The majority of nurse researchers did not specifically address the purpose, paradigm, typology, priority, timing, interaction, or integration of their mixed-methods studies. Strategies are suggested for improving the design, conduct, and reporting of mixed-methods studies in the discipline of nursing.

  1. HIV/AIDS information by African companies: an empirical analysis.

    PubMed

    Barako, Dulacha G; Taplin, Ross H; Brown, Alistair M

    2010-01-01

    This article investigates the extent of Human Immunodeficiency Virus/Acquired Immune Deficiency Syndrome Disclosures (HIV/AIDSD) in online annual reports by 200 listed companies from 10 African countries for the year ending 2006. Descriptive statistics reveal a very low level of overall HIV/AIDSD practices with a mean of 6 per cent disclosure, with half (100 out of 200) of the African companies making no disclosures at all. Logistic regression analysis reveals that company size and country are highly significant predictors of any disclosure of HIV/AIDS in annual reports. Profitability is also statistically significantly associated with the extent of disclosure.

  2. Statistical inference of dynamic resting-state functional connectivity using hierarchical observation modeling.

    PubMed

    Sojoudi, Alireza; Goodyear, Bradley G

    2016-12-01

Spontaneous fluctuations of blood-oxygenation level-dependent functional magnetic resonance imaging (BOLD fMRI) signals are highly synchronous between brain regions that serve similar functions. This provides a means to investigate functional networks; however, most analysis techniques assume functional connections are constant over time. This may be problematic in the case of neurological disease, where functional connections may be highly variable. Recently, several methods have been proposed to determine moment-to-moment changes in the strength of functional connections over an imaging session (so-called dynamic connectivity). Here, a novel analysis framework based on a hierarchical observation modeling approach was proposed to permit statistical inference of the presence of dynamic connectivity. A two-level linear model was described, composed of overlapping sliding windows of fMRI signals and incorporating the fact that overlapping windows are not independent. To test this approach, datasets were synthesized whereby functional connectivity was either constant (significant or insignificant) or modulated by an external input. The method successfully determines the statistical significance of a functional connection in phase with the modulation, and it exhibits greater sensitivity and specificity in detecting regions with variable connectivity when compared with sliding-window correlation analysis. For real data, this technique possesses greater reproducibility and provides a more discriminative estimate of dynamic connectivity than sliding-window correlation analysis. Hum Brain Mapp 37:4566-4580, 2016. © 2016 Wiley Periodicals, Inc.
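The sliding-window correlation baseline that the proposed framework is compared against can be sketched in a few lines; the window length and the on/off coupling signal are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

# Two synthetic "regions": y is coupled to x only while a slow square-wave
# coupling signal is on, mimicking externally modulated connectivity.
T, win = 300, 30
t = np.arange(T)
coupling = 0.8 * (np.sin(2 * np.pi * t / 150) > 0)   # coupling switches on/off
x = rng.standard_normal(T)
y = coupling * x + np.sqrt(1 - coupling**2) * rng.standard_normal(T)

# Sliding-window Pearson correlation: one estimate per window position.
dyn_corr = np.array([np.corrcoef(x[i:i + win], y[i:i + win])[0, 1]
                     for i in range(T - win + 1)])
print(dyn_corr.shape, round(float(dyn_corr.max()), 2))
```

The windowed estimates track the on/off modulation but are noisy, and successive windows share most of their samples; the paper's hierarchical model addresses exactly this overlap-induced dependence when assessing significance.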

  3. The effects of academic grouping on student performance in science

    NASA Astrophysics Data System (ADS)

    Scoggins, Sally Smykla

The current action research study explored how student placement in heterogeneous or homogeneous classes in seventh-grade science affected students' eighth-grade Science State of Texas Assessment of Academic Readiness (STAAR) scores, and how ability grouping affected students' scores based on race and socioeconomic status. The population included all eighth-grade students in the target district who took the regular eighth-grade science STAAR over four academic school years. The researcher ran three statistical tests: a t-test for independent samples, a one-way between-subjects analysis of variance (ANOVA), and a two-way between-subjects ANOVA. The results showed no statistically significant difference between eighth-grade Pre-AP students from seventh-grade Pre-AP classes and eighth-grade Pre-AP students from heterogeneous seventh-grade classes, and no statistically significant difference between Pre-AP students' scores based on socioeconomic status. There was no statistically significant interaction between socioeconomic status and the seventh-grade science classes. The scores of regular eighth-grade students who were in heterogeneous seventh-grade classes were statistically significantly higher than the scores of regular eighth-grade students who were in regular seventh-grade classes. The results also revealed that the scores of students who were White were statistically significantly higher than the scores of students who were Black and Hispanic. Black and Hispanic scores did not differ significantly. Further results indicated that the STAAR Level II and Level III scores were statistically significantly higher for the Pre-AP eighth-grade students who were in heterogeneous seventh-grade classes than the STAAR Level II and Level III scores of Pre-AP eighth-grade students who were in Pre-AP seventh-grade classes.

  4. Understanding nanocellulose chirality and structure–properties relationship at the single fibril level

    PubMed Central

    Usov, Ivan; Nyström, Gustav; Adamcik, Jozef; Handschin, Stephan; Schütz, Christina; Fall, Andreas; Bergström, Lennart; Mezzenga, Raffaele

    2015-01-01

    Nanocellulose fibrils are ubiquitous in nature and nanotechnologies but their mesoscopic structural assembly is not yet fully understood. Here we study the structural features of rod-like cellulose nanoparticles on a single particle level, by applying statistical polymer physics concepts on electron and atomic force microscopy images, and we assess their physical properties via quantitative nanomechanical mapping. We show evidence of right-handed chirality, observed on both bundles and on single fibrils. Statistical analysis of contours from microscopy images shows a non-Gaussian kink angle distribution. This is inconsistent with a structure consisting of alternating amorphous and crystalline domains along the contour and supports process-induced kink formation. The intrinsic mechanical properties of nanocellulose are extracted from nanoindentation and persistence length method for transversal and longitudinal directions, respectively. The structural analysis is pushed to the level of single cellulose polymer chains, and their smallest associated unit with a proposed 2 × 2 chain-packing arrangement. PMID:26108282

  5. [Music therapy and regional anesthesia in orthopedic surgery].

    PubMed

    Rupérez Ruiz, Ma Pilar; De San José, Isabel; Hermoso Montoya, Anna; Ferreira Valencia, Teresa; Gómez Sanz, Amelia; López Gutiérrez, Anna

    2014-06-01

To evaluate the intra-operative sedative effects of music therapy in orthopedic surgery patients under locoregional anesthesia in the Hospital Clínic i Provincial of Barcelona. Prospective comparative study on a random sample of 110 patients who did or did not undergo music therapy. The degree of anxiety was assessed with the STAIC questionnaire. The music was delivered with an MP3 player and headphones. The collected data were analyzed with Excel. For the statistical analysis we used the SPSS-18 software and the Chi-square test to test the hypothesis of whether there was a relationship between the level of calm and music therapy. After the analysis, the Chi-square result in the no-sedation group, with versus without music, was Chi2 = 2.01, P = 0.35. The statistical significance level was p < 0.05. No relationship was found between hearing music or not and the patient's comfort level. Most patients recommend listening to music in the operating room, even though the surrounding sounds did not bother them.

  6. Real-time movement detection and analysis for video surveillance applications

    NASA Astrophysics Data System (ADS)

    Hueber, Nicolas; Hennequin, Christophe; Raymond, Pierre; Moeglin, Jean-Pierre

    2014-06-01

Pedestrian movement along critical infrastructures such as pipes, railways or highways is of major interest in surveillance applications, as is pedestrian behavior in urban environments. The goal is to anticipate illicit or dangerous human activities. For this purpose, we propose an all-in-one small autonomous system which delivers high-level statistics and reports alerts in specific cases. This situational awareness project leads us to manage the scene efficiently by performing movement analysis. A dynamic background extraction algorithm is developed to reach the required degree of robustness against natural and urban environment perturbations and also to meet the embedded implementation constraints. When changes are detected in the scene, specific patterns are applied to detect and highlight relevant movements. Depending on the application, specific descriptors can be extracted and fused in order to reach a high level of interpretation. In this paper, our approach is applied to two operational use cases: pedestrian urban statistics and railway surveillance. In the first case, a grid of prototypes is deployed over a city centre to collect pedestrian movement statistics up to a macroscopic level of analysis. The results demonstrate the relevance of the delivered information; in particular, the flow density map highlights pedestrians' preferential paths along the streets. In the second case, one prototype is set next to high-speed train tracks to secure the area. The results exhibit a low false alarm rate and support our approach of a large sensor network delivering a precise operational picture without overwhelming a supervisor.
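A minimal sketch of the dynamic background extraction idea, assuming a simple running-average background model with an invented learning rate and threshold (the paper's actual algorithm is more elaborate and robust):

```python
import numpy as np

rng = np.random.default_rng(8)

# Running-average background model: the background estimate tracks the
# static scene; pixels deviating beyond a threshold are flagged as motion.
alpha, thresh = 0.05, 4.0            # invented learning rate and threshold
background = np.zeros((32, 32))
for _ in range(100):                 # learn the (static, noisy) scene
    frame = 10 + rng.normal(0, 1, (32, 32))
    background = (1 - alpha) * background + alpha * frame

frame = 10 + rng.normal(0, 1, (32, 32))
frame[10:14, 10:14] += 30            # a bright "pedestrian" appears
mask = np.abs(frame - background) > thresh
print(int(mask.sum()))
```

Connected regions of the mask would then be passed to the pattern- and descriptor-based stages the abstract describes.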

  7. SPSS and SAS programs for addressing interdependence and basic levels-of-analysis issues in psychological data.

    PubMed

    O'Connor, Brian P

    2004-02-01

    Levels-of-analysis issues arise whenever individual-level data are collected from more than one person from the same dyad, family, classroom, work group, or other interaction unit. Interdependence in data from individuals in the same interaction units also violates the independence-of-observations assumption that underlies commonly used statistical tests. This article describes the data analysis challenges that are presented by these issues and presents SPSS and SAS programs for conducting appropriate analyses. The programs conduct the within-and-between-analyses described by Dansereau, Alutto, and Yammarino (1984) and the dyad-level analyses described by Gonzalez and Griffin (1999) and Griffin and Gonzalez (1995). Contrasts with general multilevel modeling procedures are then discussed.
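The levels-of-analysis idea can be sketched by splitting each individual score into its group mean (the between part) and its deviation from that mean (the within part), then asking where the variance lives. The groups and scores below are synthetic, and this is only a fragment of the full WABA procedure of Dansereau et al., not the SPSS/SAS programs themselves.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic data: 20 groups of 5 individuals, with a shared group-level
# effect plus individual-level noise.
n_groups, n_per = 20, 5
group_effect = rng.normal(0, 2.0, n_groups)
scores = group_effect[:, None] + rng.normal(0, 1.0, (n_groups, n_per))

# Decompose each score into group mean + within-group deviation.
group_means = scores.mean(axis=1, keepdims=True)
between = np.broadcast_to(group_means, scores.shape)
within = scores - group_means

# Shares of total variance at each level (they sum to 1 by construction).
eta_between_sq = between.var() / scores.var()
eta_within_sq = within.var() / scores.var()
print(round(eta_between_sq, 2), round(eta_within_sq, 2))
```

A large between share, as here, signals that the group (not the individual) is the appropriate level of analysis, and that treating the individual scores as independent observations would be misleading.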

  8. Statistical analysis of arsenic contamination in drinking water in a city of Iran and its modeling using GIS.

    PubMed

    Sadeghi, Fatemeh; Nasseri, Simin; Mosaferi, Mohammad; Nabizadeh, Ramin; Yunesian, Masud; Mesdaghinia, Alireza

    2017-05-01

In this research, probable arsenic contamination in drinking water in the city of Ardabil was studied in 163 samples over four seasons. In each season, sampling was carried out randomly in the study area. Results were analyzed statistically using SPSS 19 software, and the data were also modeled with ArcGIS 10.1 software. The maximum permissible arsenic concentration in drinking water defined by the World Health Organization and the Iranian national standard is 10 μg/L. Statistical analysis showed that 75, 88, 47, and 69% of samples in autumn, winter, spring, and summer, respectively, had concentrations higher than the national standard. The mean concentrations of arsenic in autumn, winter, spring, and summer were 19.89, 15.9, 10.87, and 14.6 μg/L, respectively, and the overall average across all samples through the year was 15.32 μg/L. Although the GIS outputs indicated that the concentration distribution profiles changed over the four consecutive seasons, variance analysis showed that, statistically, there is no significant difference in arsenic levels across the four seasons.
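The seasonal comparison can be sketched with a one-way ANOVA; the per-sample values below are synthetic, drawn around the seasonal means reported above with an invented within-season spread, so whether the test reaches significance depends entirely on that assumed spread.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Synthetic seasonal samples centred on the reported means (ug/L);
# the within-season standard deviation of 8 is invented.
season_means = [19.89, 15.9, 10.87, 14.6]
samples = [rng.normal(m, 8.0, 40) for m in season_means]

# One-way ANOVA across the four seasons.
f_stat, p_value = stats.f_oneway(*samples)
print(f"F={f_stat:.2f}, p={p_value:.3f}")
```

With a large enough within-season spread the seasonal means fail to separate, consistent with the study's finding of no significant seasonal difference.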

  9. Manipulating measurement scales in medical statistical analysis and data mining: A review of methodologies

    PubMed Central

    Marateb, Hamid Reza; Mansourian, Marjan; Adibi, Peyman; Farina, Dario

    2014-01-01

Background: selecting the correct statistical test and data mining method depends highly on the measurement scale of the data, the type of variables, and the purpose of the analysis. Different measurement scales are studied in detail, and statistical comparison, modeling, and data mining methods are illustrated using several medical examples. We present two ordinal-variable clustering examples, ordinal variables being more challenging in analysis, using the Wisconsin Breast Cancer Data (WBCD). Ordinal-to-interval scale conversion example: a breast cancer database of nine 10-level ordinal variables for 683 patients was analyzed by two ordinal-scale clustering methods. The performance of the clustering methods was assessed by comparison with the gold-standard groups of malignant and benign cases that had been identified by clinical tests. Results: the sensitivity and accuracy of the two clustering methods were 98% and 96%, respectively. Their specificity was comparable. Conclusion: by using an appropriate clustering algorithm based on the measurement scale of the variables in the study, high performance is granted. Moreover, descriptive and inferential statistics, in addition to the modeling approach, must be selected based on the scale of the variables. PMID:24672565

  10. Statistical comparison of a hybrid approach with approximate and exact inference models for Fusion 2+

    NASA Astrophysics Data System (ADS)

    Lee, K. David; Wiesenfeld, Eric; Gelfand, Andrew

    2007-04-01

One of the greatest challenges in modern combat is maintaining a high level of timely Situational Awareness (SA). In many situations, computational complexity and accuracy considerations make the development and deployment of real-time, high-level inference tools very difficult. An innovative hybrid framework that combines Bayesian inference, in the form of Bayesian networks, and possibility theory, in the form of fuzzy logic systems, has recently been introduced to provide a rigorous framework for high-level inference. Previous research developed the theoretical basis and benefits of the hybrid approach. What has been lacking, however, is a concrete experimental comparison of the hybrid framework with traditional fusion methods that demonstrates and quantifies this benefit. The goal of this research, therefore, is to provide a statistical comparison of the accuracy and performance of hybrid network theory with pure Bayesian and fuzzy systems and with an inexact Bayesian system approximated using particle filtering. To accomplish this task, domain-specific models are developed under these different theoretical approaches and then evaluated, via Monte Carlo simulation, against situational ground truth to measure accuracy and fidelity. A rigorous statistical analysis of the performance results is then performed to quantify the benefit of hybrid inference relative to other fusion tools.

  11. Subtle Cognitive Effects of Moderate Hypoxia

    DTIC Science & Technology

    2009-08-01

using SPSS® 13.0 with significance set at an alpha level of .05 for all statistical tests. A repeated measures analysis of variance (ANOVA) was...there was no statistically significant change in reaction time (p=.781), accuracy (p=.152), or throughput (p=.967) with increasing altitude. The...results indicate that healthy individuals aged 19 to 45 years do not experience significant cognitive deficit, as measured by the CogScreen®-HE, when

  12. Adhesive properties and adhesive joints strength of graphite/epoxy composites

    NASA Astrophysics Data System (ADS)

    Rudawska, Anna; Stančeková, Dana; Cubonova, Nadezda; Vitenko, Tetiana; Müller, Miroslav; Valášek, Petr

    2017-05-01

The article presents the results of experimental research on the strength of adhesive joints of graphite/epoxy composites, along with measurements of the surface free energy of the composite surfaces. Two types of graphite/epoxy composites of different thicknesses, used in aircraft structures, were tested. Single-lap adhesive joints of the epoxy composites were considered. Adhesive properties were described by surface free energy, determined using the Owens-Wendt method. A two-component epoxy adhesive was used to prepare the adhesive joints, and a Zwick/Roell 100 strength testing machine was used to determine their shear strength. The strength tests showed that the highest value was obtained for adhesive joints of the graphite/epoxy composite of smaller material thickness (0.48 mm). Statistical analysis showed statistically significant differences between the strength values at the 0.95 confidence level. The analysis also showed no statistically significant differences in the average values of surface free energy (0.95 confidence level). In each of the results, the dispersive component of surface free energy was much greater than the polar component.

  13. Statistical Method for Identification of Potential Groundwater Recharge Zone

    NASA Astrophysics Data System (ADS)

    Banerjee, Pallavi; Singh, V. S.

    2010-05-01

The effective development of groundwater resources is essential for a country like India. Artificial recharge is the planned human activity of augmenting the amount of groundwater available through works designed to increase the natural replenishment or percolation of surface waters into groundwater aquifers, resulting in a corresponding increase in the amount of groundwater available for abstraction. India receives a good amount of average annual rainfall, about 114 cm, but most of it is lost through runoff. The imbalance between rainfall and recharge has caused a serious shortage of water for drinking, agriculture, and industrial purposes. The overexploitation of groundwater due to an increasing population is an additional cause of the water crisis, resulting in a reduction in the per capita availability of water in the country. Thus, planning for the effective development of groundwater through artificial recharge is essential. The objective of this paper is to identify artificial recharge zones, where runoff can be arrested at suitable sites to restore groundwater conditions, using statistical techniques. The water table variation follows a pattern similar to the rainfall variation, with a time delay. The relationship between rainfall and recharge is a very important process in a shallow aquifer system, and understanding this process is of critical importance to the management of groundwater resources in any terrain. Groundwater in the top weathered regolith of a basaltic terrain forms a shallow aquifer, often classified into the shallow water table category. In the present study, an effort has been made to identify suitable recharge zones from the relation between rainfall and water level using statistical analysis. Daily time series of rainfall and borehole water level data are cross-correlated to investigate variations in groundwater level response time during the monsoon months. This measurement facilitates demarcating favorable areas for artificial recharge.
KEYWORDS: Water level; Rainfall; Recharge; Statistical analysis; Cross correlation.
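The cross-correlation approach described above can be sketched as follows, using synthetic daily series in place of the study's gauge data (the rainfall distribution, noise level, and 5-day delay are all invented for illustration); the lag that maximizes the correlation estimates the groundwater response time:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily data (hypothetical): the borehole water level responds to
# rainfall with a 5-day delay plus measurement noise.
days, lag_true = 120, 5
rain_full = rng.gamma(shape=2.0, scale=5.0, size=days + lag_true)
rainfall = rain_full[lag_true:]                            # rainfall series
water_level = rain_full[:days] + rng.normal(0, 2.0, days)  # level[t] driven by rain[t - 5]

def cross_correlation(x, y, max_lag):
    """Correlation of x[t] with y[t + lag] for lag = 0 .. max_lag."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    return np.array([np.mean(x[: len(x) - lag] * y[lag:]) if lag else np.mean(x * y)
                     for lag in range(max_lag + 1)])

corrs = cross_correlation(rainfall, water_level, max_lag=15)
best_lag = int(np.argmax(corrs))
print(f"estimated response time: {best_lag} days (r = {corrs[best_lag]:.2f})")
```

On real gauge data the peak lag would be read off station by station, delineating areas where recharge responds quickly to rainfall.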

  14. Descriptive and inferential statistical methods used in burns research.

    PubMed

    Al-Benna, Sammy; Al-Ajam, Yazan; Way, Benjamin; Steinstraesser, Lars

    2010-05-01

Burns research articles utilise a variety of descriptive and inferential methods to present and analyse data. The aim of this study was to determine the descriptive methods (e.g. mean, median, SD, range, etc.) and survey the use of inferential methods (statistical tests) used in articles in the journal Burns. This study defined its population as all original articles published in the journal Burns in 2007. Letters to the editor, brief reports, reviews, and case reports were excluded. Study characteristics, use of descriptive statistics and the number and types of statistical methods employed were evaluated. Of the 51 articles analysed, 11 (22%) were randomised controlled trials, 18 (35%) were cohort studies, 11 (22%) were case control studies and 11 (22%) were case series. The study design and objectives were defined in all articles. All articles made use of continuous and descriptive data. Inferential statistics were used in 49 (96%) articles. Data dispersion was calculated by standard deviation in 30 (59%). Standard error of the mean was quoted in 19 (37%). The statistical software product was named in 33 (65%). Of the 49 articles that used inferential statistics, the tests were named in 47 (96%). The most common tests used (Student's t-test (53%), analysis of variance/covariance (33%), chi-squared test (27%), Wilcoxon & Mann-Whitney tests (22%), Fisher's exact test (12%)) accounted for the majority (72%) of statistical methods employed. A specified significance level was named in 43 (88%) and the exact significance levels were reported in 28 (57%). Descriptive analysis and basic statistical techniques account for most of the statistical tests reported. This information should prove useful in deciding which tests should be emphasised in educating burn care professionals. These results highlight the need for burn care professionals to have a sound understanding of basic statistics, which is crucial in interpreting and reporting data. Advice should be sought from professionals in the fields of biostatistics and epidemiology when using more advanced statistical techniques. Copyright 2009 Elsevier Ltd and ISBI. All rights reserved.

  15. A Tutorial in Bayesian Potential Outcomes Mediation Analysis.

    PubMed

    Miočević, Milica; Gonzalez, Oscar; Valente, Matthew J; MacKinnon, David P

    2018-01-01

    Statistical mediation analysis is used to investigate intermediate variables in the relation between independent and dependent variables. Causal interpretation of mediation analyses is challenging because randomization of subjects to levels of the independent variable does not rule out the possibility of unmeasured confounders of the mediator to outcome relation. Furthermore, commonly used frequentist methods for mediation analysis compute the probability of the data given the null hypothesis, which is not the probability of a hypothesis given the data as in Bayesian analysis. Under certain assumptions, applying the potential outcomes framework to mediation analysis allows for the computation of causal effects, and statistical mediation in the Bayesian framework gives indirect effects probabilistic interpretations. This tutorial combines causal inference and Bayesian methods for mediation analysis so the indirect and direct effects have both causal and probabilistic interpretations. Steps in Bayesian causal mediation analysis are shown in the application to an empirical example.
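The probabilistic interpretation of the indirect effect can be illustrated with a minimal numerical sketch. Under vague priors, the posterior of each regression coefficient is approximately normal around its OLS estimate, so drawing from those normals and multiplying gives an approximate posterior for the indirect effect a·b. The data, effect sizes, and sample size below are invented, and this normal approximation stands in for the full Bayesian machinery (e.g. MCMC) a tutorial analysis would use:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated mediation data (hypothetical): X -> M -> Y with a = 0.5, b = 0.7.
n = 500
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)             # mediator model, true a = 0.5
y = 0.7 * m + 0.2 * x + rng.normal(size=n)   # outcome model, true b = 0.7

def ols(design, response):
    """OLS estimates and their covariance matrix."""
    beta, *_ = np.linalg.lstsq(design, response, rcond=None)
    resid = response - design @ beta
    dof = design.shape[0] - design.shape[1]
    sigma2 = resid @ resid / dof
    return beta, sigma2 * np.linalg.inv(design.T @ design)

ones = np.ones(n)
a_hat, a_cov = ols(np.column_stack([ones, x]), m)      # a is the coef on x
b_hat, b_cov = ols(np.column_stack([ones, x, m]), y)   # b is the coef on m

# Approximate posterior draws of a and b under vague priors, then of a*b.
draws = 10_000
a_draw = rng.normal(a_hat[1], np.sqrt(a_cov[1, 1]), draws)
b_draw = rng.normal(b_hat[2], np.sqrt(b_cov[2, 2]), draws)
indirect = a_draw * b_draw

lo, hi = np.percentile(indirect, [2.5, 97.5])
print(f"indirect effect ≈ {indirect.mean():.2f}, 95% CrI [{lo:.2f}, {hi:.2f}]")
```

The credible interval has a direct probabilistic reading: given the data (and these approximations), the indirect effect lies in [lo, hi] with 95% probability.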

  16. Statistical analysis of subjective preferences for video enhancement

    NASA Astrophysics Data System (ADS)

    Woods, Russell L.; Satgunam, PremNandhini; Bronstad, P. Matthew; Peli, Eli

    2010-02-01

    Measuring preferences for moving video quality is harder than for static images due to the fleeting and variable nature of moving video. Subjective preferences for image quality can be tested by observers indicating their preference for one image over another. Such pairwise comparisons can be analyzed using Thurstone scaling (Farrell, 1999). Thurstone (1927) scaling is widely used in applied psychology, marketing, food tasting and advertising research. Thurstone analysis constructs an arbitrary perceptual scale for the items that are compared (e.g. enhancement levels). However, Thurstone scaling does not determine the statistical significance of the differences between items on that perceptual scale. Recent papers have provided inferential statistical methods that produce an outcome similar to Thurstone scaling (Lipovetsky and Conklin, 2004). Here, we demonstrate that binary logistic regression can analyze preferences for enhanced video.
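The kind of analysis described, recovering a perceptual scale from pairwise preferences with binary logistic regression, can be sketched as follows. The enhancement levels, trial counts, and true scale values are invented; each trial is coded with +1/-1 indicators for the two items compared, with item 0 as the reference whose scale value is fixed at 0:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical: 4 enhancement levels with true perceptual scale values.
true_scale = np.array([0.0, 0.6, 1.4, 1.1])
n_items = len(true_scale)

# Simulate pairwise trials: the observer prefers i over j with probability
# sigmoid(scale_i - scale_j) (a Thurstone/Bradley-Terry-like model).
pairs, outcomes = [], []
for _ in range(2000):
    i, j = rng.choice(n_items, size=2, replace=False)
    p = 1.0 / (1.0 + np.exp(-(true_scale[i] - true_scale[j])))
    pairs.append((i, j))
    outcomes.append(rng.random() < p)

# Design matrix: +1 for the first item, -1 for the second; drop the
# reference item's column so the model is identifiable.
X = np.zeros((len(pairs), n_items))
for row, (i, j) in enumerate(pairs):
    X[row, i], X[row, j] = 1.0, -1.0
X = X[:, 1:]
y = np.array(outcomes, dtype=float)

# Fit logistic regression (no intercept) by Newton's method.
beta = np.zeros(n_items - 1)
for _ in range(25):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    grad = X.T @ (y - p)
    hess = X.T @ (X * (p * (1 - p))[:, None])
    beta += np.linalg.solve(hess, grad)

print("estimated scale (item 0 = 0):", np.round(beta, 2))
```

Unlike plain Thurstone scaling, the logistic fit also yields standard errors (from the inverse Hessian), so differences between scale values can be tested for significance.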

  17. GIA Model Statistics for GRACE Hydrology, Cryosphere, and Ocean Science

    NASA Astrophysics Data System (ADS)

    Caron, L.; Ivins, E. R.; Larour, E.; Adhikari, S.; Nilsson, J.; Blewitt, G.

    2018-03-01

    We provide a new analysis of glacial isostatic adjustment (GIA) with the goal of assembling the model uncertainty statistics required for rigorously extracting trends in surface mass from the Gravity Recovery and Climate Experiment (GRACE) mission. Such statistics are essential for deciphering sea level, ocean mass, and hydrological changes because the latter signals can be relatively small (≤2 mm/yr water height equivalent) over very large regions, such as major ocean basins and watersheds. With abundant new >7 year continuous measurements of vertical land motion (VLM) reported by Global Positioning System stations on bedrock and new relative sea level records, our new statistical evaluation of GIA uncertainties incorporates Bayesian methodologies. A unique aspect of the method is that both the ice history and 1-D Earth structure vary through a total of 128,000 forward models. We find that best fit models poorly capture the statistical inferences needed to correctly invert for lower mantle viscosity and that GIA uncertainty exceeds the uncertainty ascribed to trends from 14 years of GRACE data in polar regions.

  18. Effect Size Measure and Analysis of Single Subject Designs

    ERIC Educational Resources Information Center

    Swaminathan, Hariharan; Horner, Robert H.; Rogers, H. Jane; Sugai, George

    2012-01-01

    This study is aimed at addressing the criticisms that have been leveled at the currently available statistical procedures for analyzing single subject designs (SSD). One of the vexing problems in the analysis of SSD is in the assessment of the effect of intervention. Serial dependence notwithstanding, the linear model approach that has been…

  19. The Effects of Ability Grouping: A Meta-Analysis of Research Findings.

    ERIC Educational Resources Information Center

    Noland, Theresa Koontz; Taylor, Bob L.

    The study reported in this paper quantitatively integrated the recent research findings on ability grouping in order to generalize about these effects on student achievement and student self-concept. Meta-analysis was used to statistically integrate the empirical data. The relationships among various experimental variables including grade level,…

  20. Grade Trend Analysis for a Credit-Bearing Library Instruction Course

    ERIC Educational Resources Information Center

    Guo, Shu

    2015-01-01

    Statistics suggest the prevalence of grade inflation nationwide, and researchers perform many analyses on student grades at both university and college levels. This analysis focuses on a one-credit library instruction course for undergraduate students at a large public university. The studies examine thirty semester GPAs and the percentages of As…

  1. Multilevel Factor Analysis by Model Segregation: New Applications for Robust Test Statistics

    ERIC Educational Resources Information Center

    Schweig, Jonathan

    2014-01-01

    Measures of classroom environments have become central to policy efforts that assess school and teacher quality. This has sparked a wide interest in using multilevel factor analysis to test measurement hypotheses about classroom-level variables. One approach partitions the total covariance matrix and tests models separately on the…

  2. HPLC determination of caffeine in coffee beverage

    NASA Astrophysics Data System (ADS)

    Fajara, B. E. P.; Susanti, H.

    2017-11-01

Coffee is the second most widely consumed beverage in the world, after water. One of the compounds contained in coffee is caffeine, which has pharmacological effects such as stimulating the central nervous system. The purpose of this study was to determine the level of caffeine in coffee beverages by the HPLC method. Three branded coffee beverages included in the top 3 of the Top Brand Index 2016 Phase 2 were used as samples. Qualitative analysis was performed by the Parry method, with Dragendorff reagent, and by comparing the retention times of the sample and the caffeine standard. Quantitative analysis was done by HPLC with methanol-water (95:5 v/v) as the mobile phase, an ODS stationary phase, a flow rate of 1 mL/min, and UV detection at 272 nm. The caffeine level data were statistically analyzed using ANOVA at the 95% confidence level. Qualitative analysis showed that all three samples contained caffeine. The average caffeine levels in coffee bottles X, Y, and Z were 138.048, 109.699, and 147.669 mg/bottle, respectively. The caffeine contents of the three coffee beverage samples are statistically different (p<0.05). The caffeine levels in the X, Y, and Z coffee beverage samples did not meet the requirement of 50 mg/serving set by the Indonesian Standard Agency.
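The one-way ANOVA step can be computed by hand. The replicate values below are invented to mimic the reported brand means (the study's raw replicates are not given); the p-value would come from the upper tail of the F distribution:

```python
import numpy as np

# Hypothetical replicate HPLC determinations (mg/bottle) for three brands;
# values are illustrative, not the study's data.
x_brand = np.array([137.2, 138.9, 138.0])
y_brand = np.array([109.1, 110.4, 109.6])
z_brand = np.array([147.0, 148.3, 147.7])
groups = [x_brand, y_brand, z_brand]

# One-way ANOVA by hand: F = between-group MS / within-group MS.
all_vals = np.concatenate(groups)
grand_mean = all_vals.mean()
k, n = len(groups), len(all_vals)

ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
f_stat = (ss_between / (k - 1)) / (ss_within / (n - k))
print(f"F({k - 1}, {n - k}) = {f_stat:.1f}")
# The p-value is the upper tail of the F distribution, e.g.
# scipy.stats.f.sf(f_stat, k - 1, n - k); at the 95% confidence level the
# brands differ if p < 0.05.
```

With brand means ~30 mg apart and within-brand scatter under 1 mg, F is enormous, matching the study's p < 0.05 conclusion.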

  3. Analysis strategies for longitudinal attachment loss data.

    PubMed

    Beck, J D; Elter, J R

    2000-02-01

    The purpose of this invited review is to describe and discuss methods currently in use to quantify the progression of attachment loss in epidemiological studies of periodontal disease, and to make recommendations for specific analytic methods based upon the particular design of the study and structure of the data. The review concentrates on the definition of incident attachment loss (ALOSS) and its component parts; measurement issues including thresholds and regression to the mean; methods of accounting for longitudinal change, including changes in means, changes in proportions of affected sites, incidence density, the effect of tooth loss and reversals, and repeated events; statistical models of longitudinal change, including the incorporation of the time element, use of linear, logistic or Poisson regression or survival analysis, and statistical tests; site vs person level of analysis, including statistical adjustment for correlated data; the strengths and limitations of ALOSS data. Examples from the Piedmont 65+ Dental Study are used to illustrate specific concepts. We conclude that incidence density is the preferred methodology to use for periodontal studies with more than one period of follow-up and that the use of studies not employing methods for dealing with complex samples, correlated data, and repeated measures does not take advantage of our current understanding of the site- and person-level variables important in periodontal disease and may generate biased results.
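Incidence density, the review's preferred measure for studies with more than one follow-up period, is simply the number of new events divided by the person-time at risk. A minimal sketch with hypothetical follow-up records:

```python
# Hypothetical follow-up records: (new attachment-loss events, years at risk)
# for each person in a cohort.
follow_up = [
    (1, 3.0),   # one new ALOSS event over 3 years of follow-up
    (0, 5.0),
    (2, 4.5),
    (0, 2.0),
    (1, 5.0),
]

events = sum(e for e, _ in follow_up)
person_years = sum(t for _, t in follow_up)
incidence_density = events / person_years
print(f"{events} events / {person_years} person-years "
      f"= {incidence_density:.3f} events per person-year")
```

Because the denominator is person-time rather than persons, the measure naturally accommodates unequal follow-up and repeated events, which is why it suits multi-period periodontal studies.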

  4. Serum Levels of 25-hydroxyvitamin D in Chronic Urticaria and its Association with Disease Activity: A Case Control Study.

    PubMed

    Rather, Shagufta; Keen, Abid; Sajad, Peerzada

    2018-01-01

To evaluate the relationship between vitamin D levels and chronic spontaneous urticaria (CSU), comparing patients with healthy age- and sex-matched controls. This was a hospital-based cross-sectional study conducted over a period of 1 year, in which 110 patients with CSU were recruited along with an equal number of age- and sex-matched healthy controls. For each patient, the urticaria activity score (UAS) was calculated and an autologous serum skin test (ASST) was performed. Plasma 25-hydroxyvitamin D [25-(OH)D] was analyzed by the chemiluminescence method. Vitamin D deficiency was defined as a serum 25-(OH)D concentration <30 ng/mL. The statistical analysis was carried out using appropriate statistical tests. The mean serum 25-(OH)D level of CSU patients was 19.6 ± 6.9 ng/mL, whereas in the control group the mean level was 38.5 ± 6.7 ng/mL, a statistically significant difference (P < 0.001). A significant negative correlation was found between vitamin D levels and UAS (P < 0.001). The number of patients with ASST positivity was 44 (40%). Patients with CSU had reduced levels of vitamin D compared to healthy controls. Furthermore, there was a significant negative correlation between serum vitamin D levels and the severity of CSU.

  5. Experimental design of an interlaboratory study for trace metal analysis of liquid fuels. [for aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Greenbauer-Seng, L. A.

    1983-01-01

The accurate determination of trace metals in fuels is an important requirement in much of the research into and development of alternative fuels for aerospace applications. Recognizing the detrimental effects of certain metals on fuel performance and fuel systems at the part-per-million, and in some cases part-per-billion, levels requires improved accuracy in determining these low-concentration elements. Accurate analyses are also required to ensure interchangeability of analysis results between vendor, researcher, and end user for purposes of quality control. Previous interlaboratory studies have demonstrated the inability of different laboratories to agree on the results of metal analysis, particularly at low concentration levels, even though good precision is typically reported within a laboratory. An interlaboratory study was designed to gain statistical information about the sources of variation in the reported concentrations. Five participant laboratories were used on a fee basis and were not informed of the purpose of the analyses. The effects of laboratory, analytical technique, concentration level, and ashing additive were studied in four fuel types for 20 elements of interest. The prescribed sample preparation schemes (variations of dry ashing) were used by all of the laboratories. The analytical data were statistically evaluated using a computer program for the analysis of variance technique.

  6. Statistics and bioinformatics in nutritional sciences: analysis of complex data in the era of systems biology⋆

    PubMed Central

    Fu, Wenjiang J.; Stromberg, Arnold J.; Viele, Kert; Carroll, Raymond J.; Wu, Guoyao

    2009-01-01

    Over the past two decades, there have been revolutionary developments in life science technologies characterized by high throughput, high efficiency, and rapid computation. Nutritionists now have the advanced methodologies for the analysis of DNA, RNA, protein, low-molecular-weight metabolites, as well as access to bioinformatics databases. Statistics, which can be defined as the process of making scientific inferences from data that contain variability, has historically played an integral role in advancing nutritional sciences. Currently, in the era of systems biology, statistics has become an increasingly important tool to quantitatively analyze information about biological macromolecules. This article describes general terms used in statistical analysis of large, complex experimental data. These terms include experimental design, power analysis, sample size calculation, and experimental errors (type I and II errors) for nutritional studies at population, tissue, cellular, and molecular levels. In addition, we highlighted various sources of experimental variations in studies involving microarray gene expression, real-time polymerase chain reaction, proteomics, and other bioinformatics technologies. Moreover, we provided guidelines for nutritionists and other biomedical scientists to plan and conduct studies and to analyze the complex data. Appropriate statistical analyses are expected to make an important contribution to solving major nutrition-associated problems in humans and animals (including obesity, diabetes, cardiovascular disease, cancer, ageing, and intrauterine fetal retardation). PMID:20233650
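As an example of the sample size calculations the article describes, the common normal-approximation formula for a two-group comparison of means can be sketched as follows (the effect size and SD are hypothetical, and the approximation slightly undershoots the exact t-test answer for small n):

```python
import math
from statistics import NormalDist

def two_sample_n(delta, sigma, alpha=0.05, power=0.80):
    """Per-group sample size for a two-sided two-sample comparison of means,
    using the standard normal-approximation formula
    n = 2 * ((z_{1-alpha/2} + z_{power}) * sigma / delta)^2."""
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)   # ~1.96 for alpha = 0.05
    z_beta = z(power)            # ~0.84 for 80% power
    return math.ceil(2 * ((z_alpha + z_beta) * sigma / delta) ** 2)

# Hypothetical nutrition example: detect a 5-unit mean difference in a
# metabolite with SD 10, at a 5% two-sided type I error rate and 80% power.
print(two_sample_n(delta=5, sigma=10))  # 63 per group
```

Raising the power (lowering the type II error rate) increases the required n, e.g. 90% power needs 85 per group under the same assumptions.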

  7. Single-row, double-row, and transosseous equivalent techniques for isolated supraspinatus tendon tears with minimal atrophy: A retrospective comparative outcome and radiographic analysis at minimum 2-year followup

    PubMed Central

    McCormick, Frank; Gupta, Anil; Bruce, Ben; Harris, Josh; Abrams, Geoff; Wilson, Hillary; Hussey, Kristen; Cole, Brian J.

    2014-01-01

Purpose: The purpose of this study was to measure and compare the subjective, objective, and radiographic healing outcomes of single-row (SR), double-row (DR), and transosseous equivalent (TOE) suture techniques for arthroscopic rotator cuff repair. Materials and Methods: A retrospective comparative analysis of arthroscopic rotator cuff repairs by one surgeon from 2004 to 2010 at minimum 2-year followup was performed. Cohorts were matched for age, sex, and tear size. Subjective outcome variables included ASES, Constant, SST, UCLA, and SF-12 scores. Objective outcome variables included strength and active range of motion (ROM). Radiographic healing was assessed by magnetic resonance imaging (MRI). Statistical analysis was performed using analysis of variance (ANOVA), the Mann-Whitney and Kruskal-Wallis tests, and the Fisher exact probability test, with significance set at P < 0.05. Results: Sixty-three patients completed the study requirements (20 SR, 21 DR, 22 TOE). There was a clinically and statistically significant improvement in outcomes with all repair techniques (ASES mean improvement, P < 0.0001). The mean final ASES scores were: SR 83 (SD 21.4); DR 87 (SD 18.2); TOE 87 (SD 13.2) (P = 0.73). There was a statistically significant improvement in strength for each repair technique (P < 0.001). There was no significant difference between techniques across all secondary outcome assessments: ASES improvement, Constant, SST, UCLA, SF-12, ROM, strength, and MRI re-tear rates. There was a decrease in re-tear rates from single-row (22%) to double-row (18%) to transosseous equivalent (11%); however, this difference was not statistically significant (P = 0.6). Conclusions: Compared to preoperatively, arthroscopic rotator cuff repair using SR, DR, or TOE techniques yielded a clinically and statistically significant improvement in subjective and objective outcomes at a minimum 2-year follow-up. Level of Evidence: Therapeutic level 3. PMID:24926159

  8. Dependency of high coastal water level and river discharge at the global scale

    NASA Astrophysics Data System (ADS)

    Ward, P.; Couasnon, A.; Haigh, I. D.; Muis, S.; Veldkamp, T.; Winsemius, H.; Wahl, T.

    2017-12-01

It is widely recognized that floods cause huge socioeconomic impacts. From 1980 to 2013, global flood losses exceeded $1 trillion, with 220,000 fatalities. These impacts are particularly hard felt in low-lying, densely populated deltas and estuaries, whose location at the coast-land interface makes them naturally prone to flooding. When river and coastal floods coincide, their impacts in these deltas and estuaries are often worse than when they occur in isolation. Such floods are examples of so-called 'compound events'. In this contribution, we present the first global-scale analysis of the statistical dependency of high coastal water levels (and the storm surge component alone) and river discharge. We show that there is statistical dependency between these components at more than half of the stations examined. We also show time-lags in the highest correlation between peak discharges and coastal water levels. Finally, we assess the probability of the simultaneous occurrence of design discharge and design coastal water levels, assuming both independence and statistical dependence. For those stations where we identified statistical dependency, the probability is between 1 and 5 times greater when the dependence structure is accounted for. This information is essential for understanding the likelihood of compound flood events occurring at locations around the world, as well as for accurate flood risk assessments and effective flood risk management. The research was carried out by analysing the statistical dependency between observed coastal water levels (and the storm surge component) from GESLA-2 and river discharge using gauged data from GRDC stations around the world. The dependence structure was examined using copula functions.
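Why the dependence structure matters can be sketched with synthetic data: when coastal water level and river discharge share a common storm driver, the probability that both exceed their design levels is far larger than independence predicts. This toy example uses invented Gaussian drivers rather than the GESLA-2/GRDC records, and skips the copula fitting step in favor of a direct empirical count:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic dependent drivers (hypothetical magnitudes): a shared storm
# factor induces dependence between the two flood drivers.
n = 20_000
storm = rng.normal(size=n)
surge = 0.7 * storm + 0.7 * rng.normal(size=n)       # coastal water level proxy
discharge = 0.7 * storm + 0.7 * rng.normal(size=n)   # river discharge proxy

# "Design" levels: the 99th percentile of each marginal.
s99 = np.quantile(surge, 0.99)
d99 = np.quantile(discharge, 0.99)

p_joint = np.mean((surge > s99) & (discharge > d99))
p_indep = 0.01 * 0.01  # what independence would predict
print(f"joint exceedance: {p_joint:.5f} vs {p_indep:.5f} under independence")
print(f"dependence inflates the probability ~{p_joint / p_indep:.0f}x")
```

A full analysis would fit a copula to the ranked pairs to model this joint tail explicitly, but even the raw count shows why assuming independence can badly understate compound flood risk.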

  9. Potentiation Following Ballistic and Nonballistic Complexes: The Effect of Strength Level.

    PubMed

    Suchomel, Timothy J; Sato, Kimitake; DeWeese, Brad H; Ebben, William P; Stone, Michael H

    2016-07-01

Suchomel, TJ, Sato, K, DeWeese, BH, Ebben, WP, and Stone, MH. Potentiation following ballistic and nonballistic complexes: the effect of strength level. J Strength Cond Res 30(7): 1825-1833, 2016-The purpose of this study was to compare the temporal profile of strong and weak subjects during ballistic and nonballistic potentiation complexes. Eight strong (relative back squat = 2.1 ± 0.1 times body mass) and 8 weak (relative back squat = 1.6 ± 0.2 times body mass) males performed squat jumps immediately and every minute up to 10 minutes following potentiation complexes that included ballistic or nonballistic concentric-only half-squats (COHS) performed at 90% of their 1 repetition maximum COHS. Jump height (JH) and allometrically scaled peak power (PPa) were compared using a series of 2 × 12 repeated measures analyses of variance. No statistically significant strength level main effects for JH (p = 0.442) or PPa (p = 0.078) existed during the ballistic condition. In contrast, statistically significant main effects for time existed for both JH (p = 0.014) and PPa (p < 0.001); however, no statistically significant pairwise comparisons were present (p > 0.05). Statistically significant strength level main effects existed for PPa (p = 0.039) but not for JH (p = 0.137) during the nonballistic condition. Post hoc analysis revealed that the strong subjects produced statistically greater PPa than the weaker subjects (p = 0.039). Statistically significant time main effects existed for PPa (p = 0.015), but not for JH (p = 0.178). No statistically significant strength level × time interaction effects for JH (p = 0.319) or PPa (p = 0.203) were present for the ballistic or nonballistic conditions.
Practical significance indicated by effect sizes and the relationships between maximum potentiation and relative strength suggest that stronger subjects potentiate earlier and to a greater extent than weaker subjects during ballistic and nonballistic potentiation complexes.

  10. Macro scale models for freight railroad terminals.

    DOT National Transportation Integrated Search

    2016-03-02

The project has developed a yard capacity model for macro-level analysis. The study considers the detailed sequence and scheduling in classification yards and their impacts on yard capacities, simulates typical freight railroad terminals, and statistic...

  11. VOXEL-LEVEL MAPPING OF TRACER KINETICS IN PET STUDIES: A STATISTICAL APPROACH EMPHASIZING TISSUE LIFE TABLES.

    PubMed

    O'Sullivan, Finbarr; Muzi, Mark; Mankoff, David A; Eary, Janet F; Spence, Alexander M; Krohn, Kenneth A

    2014-06-01

Most radiotracers used in dynamic positron emission tomography (PET) scanning act in a linear time-invariant fashion, so that the measured time-course data are a convolution between the time course of the tracer in the arterial supply and the local tissue impulse response, known as the tissue residue function. In statistical terms, the residue is a life table for the transit time of injected radiotracer atoms. The residue provides a description of the tracer kinetic information measurable by a dynamic PET scan. Decomposition of the residue function allows separation of rapid vascular kinetics from slower blood-tissue exchanges and tissue retention. For voxel-level analysis, we propose that residues be modeled by mixtures of nonparametrically derived basis residues obtained by segmentation of the full data volume. Spatial and temporal aspects of diagnostics associated with voxel-level model fitting are emphasized. Illustrative examples, some involving cancer imaging studies, are presented. Data from cerebral PET scanning with 18F-fluorodeoxyglucose (FDG) and 15O-water (H2O) in normal subjects are used to evaluate the approach. Cross-validation is used to make regional comparisons between residues estimated using adaptive mixture models and more conventional compartmental modeling techniques. Simulation studies are used to theoretically examine mean square error performance and to explore the benefit of voxel-level analysis when the primary interest is a statistical summary of regional kinetics. The work highlights the contribution that multivariate analysis tools and life-table concepts can make in the recovery of local metabolic information from dynamic PET studies, particularly ones in which the assumptions of compartmental-like models, with residues that are sums of exponentials, might not be certain.

  12. The predictive value of mean serum uric acid levels for developing prediabetes.

    PubMed

    Zhang, Qing; Bao, Xue; Meng, Ge; Liu, Li; Wu, Hongmei; Du, Huanmin; Shi, Hongbin; Xia, Yang; Guo, Xiaoyan; Liu, Xing; Li, Chunlei; Su, Qian; Gu, Yeqing; Fang, Liyun; Yu, Fei; Yang, Huijun; Yu, Bin; Sun, Shaomei; Wang, Xing; Zhou, Ming; Jia, Qiyu; Zhao, Honglin; Huang, Guowei; Song, Kun; Niu, Kaijun

    2016-08-01

We aimed to assess the predictive value of mean serum uric acid (SUA) levels for incident prediabetes. Normoglycemic adults (n=39,353) were followed for a median of 3.0 years. Prediabetes is defined as impaired fasting glucose (IFG), impaired glucose tolerance (IGT), or impaired HbA1c (IA1c), based on the American Diabetes Association criteria. SUA levels were measured annually. Four diagnostic strategies were used to detect prediabetes in four separate analyses (Analysis 1: IFG. Analysis 2: IFG+IGT. Analysis 3: IFG+IA1c. Analysis 4: IFG+IGT+IA1c). Cox proportional hazards regression models were used to assess the relationship between SUA quintiles and prediabetes. The C-statistic was additionally used in the final analysis to assess the accuracy of predictions based upon baseline SUA and mean SUA, respectively. After adjustment for potential confounders, the hazard ratios (95% confidence interval) of prediabetes for the highest versus lowest quintile of mean SUA were 1.22 (1.10, 1.36) in analysis 1; 1.59 (1.23, 2.05) in analysis 2; 1.62 (1.34, 1.95) in analysis 3; and 1.67 (1.31, 2.13) in analysis 4. In contrast, for baseline SUA, significance was only reached in analyses 3 and 4. Moreover, compared with baseline SUA, the mean SUA value was associated with a significant increase in the C-statistic (P<0.001). The mean SUA value was strongly and positively related to prediabetes risk, and showed better predictive ability for prediabetes than baseline SUA. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
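The C-statistic used in the final analysis is the concordance probability, equivalent to the ROC AUC. A small sketch (with invented data standing in for baseline and mean SUA, not the cohort's measurements) shows how averaging repeated measurements can raise it:

```python
import numpy as np

rng = np.random.default_rng(4)

def c_statistic(score, outcome):
    """Concordance probability P(score_case > score_control), counting
    ties as 1/2; equivalent to the area under the ROC curve."""
    cases = score[outcome == 1]
    controls = score[outcome == 0]
    diff = cases[:, None] - controls[None, :]
    return ((diff > 0).sum() + 0.5 * (diff == 0).sum()) / diff.size

# Hypothetical illustration: a "mean SUA"-style predictor (average of
# repeated measurements) tracks the underlying risk more closely than a
# single noisy "baseline SUA" measurement.
n = 2000
risk = rng.normal(size=n)
outcome = (risk + rng.normal(size=n) > 1.0).astype(int)
baseline = risk + rng.normal(0, 1.5, n)   # single noisy measurement
mean_sua = risk + rng.normal(0, 0.5, n)   # average of repeated measurements

print(f"C (baseline): {c_statistic(baseline, outcome):.2f}")
print(f"C (mean):     {c_statistic(mean_sua, outcome):.2f}")
```

Averaging shrinks measurement noise, so the mean-based score discriminates future cases from non-cases better, mirroring the study's finding that mean SUA outperforms baseline SUA.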

  13. Students' attitudes towards learning statistics

    NASA Astrophysics Data System (ADS)

    Ghulami, Hassan Rahnaward; Hamid, Mohd Rashid Ab; Zakaria, Roslinazairimah

    2015-05-01

    A positive attitude towards learning is vital for mastering the core content of any subject under study, and learning statistics at the university level is no exception. This study therefore investigates students' attitudes towards learning statistics. Six constructs were identified: affect, cognitive competence, value, difficulty, interest, and effort. The instrument used was a questionnaire adopted and adapted from the reliable Survey of Attitudes Towards Statistics (SATS©). The study was conducted among engineering undergraduate students at a university on the East Coast of Malaysia; the respondents were students from different faculties taking the applied statistics course. The results are analysed descriptively and contribute to a descriptive understanding of students' attitudes towards the teaching and learning of statistics.

  14. Detector noise statistics in the non-linear regime

    NASA Technical Reports Server (NTRS)

    Shopbell, P. L.; Bland-Hawthorn, J.

    1992-01-01

    The statistical behavior of an idealized linear detector in the presence of threshold and saturation levels is examined. It is assumed that the noise is governed by the statistical fluctuations in the number of photons emitted by the source during an exposure. Since physical detectors cannot have infinite dynamic range, our model illustrates that all devices have non-linear regimes, particularly at high count rates. The primary effect is a decrease in the statistical variance about the mean signal due to a portion of the expected noise distribution being removed via clipping. Higher order statistical moments are also examined, in particular, skewness and kurtosis. In principle, the expected distortion in the detector noise characteristics can be calibrated using flatfield observations with count rates matched to the observations. For this purpose, some basic statistical methods that utilize Fourier analysis techniques are described.
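    The variance-reduction effect described in this record can be reproduced directly: simulate Poisson photon counts, clip them at a saturation level, and compare moments. A minimal sketch in which the count rate and saturation level are illustrative choices, not values from the paper:

    ```python
    import math
    import random
    import statistics

    random.seed(1)

    def poisson(lam):
        """Knuth's algorithm for a Poisson-distributed photon count."""
        threshold, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= random.random()
            if p <= threshold:
                return k
            k += 1

    mean_rate = 100        # expected photon count per pixel (illustrative)
    saturation = 110       # detector clipping level (assumed, ~1 sigma above mean)

    raw = [poisson(mean_rate) for _ in range(20000)]
    clipped = [min(c, saturation) for c in raw]

    var_raw = statistics.pvariance(raw)        # ~ mean_rate for Poisson noise
    var_clipped = statistics.pvariance(clipped)

    def skewness(xs):
        """Third standardized moment."""
        m = statistics.fmean(xs)
        s = statistics.pstdev(xs)
        return sum((x - m) ** 3 for x in xs) / (len(xs) * s ** 3)
    ```

    Clipping removes the upper tail of the noise distribution, so the clipped variance falls below the Poisson value and the skewness turns negative, the distortion the paper proposes to calibrate with matched flatfield observations.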

  15. Evaluation of the VIDAS Listeria (LIS) immunoassay for the detection of Listeria in foods using demi-Fraser and Fraser enrichment broths, as modification of AOAC Official Method 999.06 (AOAC Official Method 2004.06).

    PubMed

    Silbernagel, Karen M; Jechorek, Robert P; Kaufer, Amanda L; Johnson, Ronald L; Aleo, V; Brown, B; Buen, M; Buresh, J; Carson, M; Franklin, J; Ham, P; Humes, L; Husby, G; Hutchins, J; Jechorek, R; Jenkins, J; Kaufer, A; Kexel, N; Kora, L; Lam, L; Lau, D; Leighton, S; Loftis, M; Luc, S; Martin, J; Nacar, I; Nogle, J; Park, J; Schultz, A; Seymore, D; Smith, C; Smith, J; Thou, P; Ulmer, M; Voss, R; Weaver, V

    2005-01-01

    A multilaboratory study was conducted to compare the VIDAS LIS immunoassay with the standard cultural methods for the detection of Listeria in foods, using an enrichment modification of AOAC Official Method 999.06. The modified enrichment protocol was implemented to harmonize the VIDAS LIS assay with the VIDAS LMO2 assay. Five food types--brie cheese, vanilla ice cream, frozen green beans, frozen raw tilapia fish, and cooked roast beef--at 3 inoculation levels, were analyzed by each method. A total of 15 laboratories representing government and industry participated. In this study, 1206 test portions were tested, of which 1170 were used in the statistical analysis. There were 433 positives by the VIDAS LIS assay and 396 positives by the standard culture methods. A Chi-square analysis of each of the 5 food types, at the 3 inoculation levels tested, was performed. The resulting average Chi-square value, 0.42, indicated that, overall, there were no statistically significant differences between the VIDAS LIS assay and the standard methods at the 5% level of significance.
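    The per-food Chi-square analyses cannot be reproduced from the abstract alone, but the form of the test can be. The sketch below applies Pearson's chi-square to the overall positive/negative counts reported (433/1170 vs 396/1170); pooling all foods into a single 2x2 table is our simplification, not the study's per-food analysis:

    ```python
    def chi_square_2x2(a, b, c, d):
        """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
        n = a + b + c + d
        return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

    # Overall positives/negatives for each method (counts from the abstract).
    vidas_pos, vidas_neg = 433, 1170 - 433
    culture_pos, culture_neg = 396, 1170 - 396

    stat = chi_square_2x2(vidas_pos, vidas_neg, culture_pos, culture_neg)
    significant = stat > 3.841  # 5% critical value with 1 degree of freedom
    ```

    Even on these pooled counts the statistic stays below the 5% critical value, consistent with the abstract's conclusion that the two methods do not differ significantly.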

  16. Comparative inter-institutional study of stress among dentists.

    PubMed

    Pozos-Radillo, Blanca E; Galván-Ramírez, Ma Luz; Pando, Manuel; Carrión, Ma De los Angeles; González, Guillermo J

    2010-01-01

    Dentistry is considered to be a stressful profession due to different work-related factors that represent a threat to dentists' health. The objectives of this work were to identify and compare chronic stress in dentists across different health institutions and to assess the association of stress with risk factors. The study is observational, cross-sectional, and comparative; 256 dentists were included, distributed among five public health institutions in the city of Guadalajara, Jalisco, Mexico, namely: the Mexican Institute of Social Security (IMSS), the Ministry of Health (SS), the Integral Development of the Family (DIF), the Social Security Services Institute for the Workers (ISSSTE), and the University of Guadalajara (U. de G). Data were obtained by means of the census technique. Stress was identified using the Stress Symptoms Inventory, and the statistical analysis was performed using the Odds Ratio (O.R.) and the chi-square statistic. Of the total population studied, 219 subjects presented high levels of chronic stress and 37, low levels. In the comparative analysis, significant differences were found between IMSS and U. de G, and likewise between IMSS and SS. However, in the analysis of association, only U. de G was found to be associated with a high level of chronic stress.

  17. Performance in College Chemistry: a Statistical Comparison Using Gender and Jungian Personality Type

    NASA Astrophysics Data System (ADS)

    Greene, Susan V.; Wheeler, Henry R.; Riley, Wayne D.

    This study sorted college introductory chemistry students by gender and Jungian personality type. It recognized differences from the general population distribution and statistically compared the students' grades with their Jungian personality types. Data from 577 female students indicated that ESFP (extroverted, sensory, feeling, perceiving) and ENFP (extroverted, intuitive, feeling, perceiving) profiles performed poorly at statistically significant levels when compared with the distribution of females enrolled in introductory chemistry. The comparable analysis using data from 422 male students indicated that the poorly performing male profiles were ISTP (introverted, sensory, thinking, perceiving) and ESTP (extroverted, sensory, thinking, perceiving). ESTJ (extroverted, sensory, thinking, judging) female students withdrew from the course at a statistically significant level. For both genders, INTJ (introverted, intuitive, thinking, judging) students were the best performers. By examining the documented characteristics of Jungian profiles that correspond with poorly performing students in chemistry, one may more effectively assist the learning process and the retention of these individuals in the fields of natural science, engineering, and technology.

  18. Regression Analysis of Long-term Profile Ozone Data Set from BUV Instruments

    NASA Technical Reports Server (NTRS)

    Frith, Stacey; Taylor, Steve; DeLand, Matt; Ahn, Chang-Woo; Stolarski, Richard S.

    2005-01-01

    We have produced a profile merged ozone data set (MOD) based on the SBUV/SBUV2 series of nadir-viewing satellite backscatter instruments, covering the period from November 1978 - December 2003. In 2004, data from the Nimbus 7 SBUV and NOAA 9, 11, and 16 SBUV/2 instruments were reprocessed using the Version 8 (V8) algorithm and most recent calibrations. More recently, data from the Nimbus 4 BUV instrument, which operated from 1970 - 1977, were also reprocessed using the V8 algorithm. As part of the V8 profile calibration, the Nimbus 7 and NOAA 9 (1993-1997 only) instrument calibrations have been adjusted to match the NOAA 11 calibration, which was established from comparisons with SSBUV shuttle flight data. Given the level of agreement between the data sets, we simply average the ozone values during periods of instrument overlap to produce the MOD profile data set. We use statistical time-series analysis of the MOD profile data set (1978-2003) to estimate the change in profile ozone due to changing stratospheric chlorine levels. The Nimbus 4 BUV data offer an opportunity to test the physical properties of our statistical model. We extrapolate our statistical model fit backwards in time and compare to the Nimbus 4 data. We compare the statistics of the residuals from the fit for the Nimbus 4 period to those obtained from the 1978-2003 period over which the statistical model coefficients were estimated.

  19. Methods for evaluating temporal groundwater quality data and results of decadal-scale changes in chloride, dissolved solids, and nitrate concentrations in groundwater in the United States, 1988-2010

    USGS Publications Warehouse

    Lindsey, Bruce D.; Rupert, Michael G.

    2012-01-01

    Decadal-scale changes in groundwater quality were evaluated by the U.S. Geological Survey National Water-Quality Assessment (NAWQA) Program. Samples of groundwater collected from wells during 1988-2000 - a first sampling event representing the decade ending the 20th century - were compared on a pair-wise basis to samples from the same wells collected during 2001-2010 - a second sampling event representing the decade beginning the 21st century. The data set consists of samples from 1,236 wells in 56 well networks, representing major aquifers and urban and agricultural land-use areas, with analytical results for chloride, dissolved solids, and nitrate. Statistical analysis was done on a network basis rather than by individual wells. Although spanning slightly more or less than a 10-year period, the two-sample comparison between the first and second sampling events is referred to as an analysis of decadal-scale change based on a step-trend analysis. The 22 principal aquifers represented by these 56 networks account for nearly 80 percent of the estimated withdrawals of groundwater used for drinking-water supply in the Nation. Well networks where decadal-scale changes in concentrations were statistically significant were identified using the Wilcoxon-Pratt signed-rank test. For the statistical analysis of chloride, dissolved solids, and nitrate concentrations at the network level, more than half revealed no statistically significant change over the decadal period. However, for networks that had statistically significant changes, increased concentrations outnumbered decreased concentrations by a large margin. Statistically significant increases of chloride concentrations were identified for 43 percent of 56 networks. Dissolved solids concentrations increased significantly in 41 percent of the 54 networks with dissolved solids data, and nitrate concentrations increased significantly in 23 percent of 56 networks. 
At least one of the three - chloride, dissolved solids, or nitrate - had a statistically significant increase in concentration in 66 percent of the networks. Statistically significant decreases in concentrations were identified in 4 percent of the networks for chloride, 2 percent of the networks for dissolved solids, and 9 percent of the networks for nitrate. A larger percentage of urban land-use networks had statistically significant increases in chloride, dissolved solids, and nitrate concentrations than agricultural land-use networks. In order to assess the magnitude of statistically significant changes, the median of the differences between constituent concentrations from the first full-network sampling event and those from the second full-network sampling event was calculated using the Turnbull method. The largest median decadal increases in chloride concentrations were in networks in the Upper Illinois River Basin (67 mg/L) and in the New England Coastal Basins (34 mg/L), whereas the largest median decadal decrease in chloride concentrations was in the Upper Snake River Basin (1 mg/L). The largest median decadal increases in dissolved solids concentrations were in networks in the Rio Grande Valley (260 mg/L) and the Upper Illinois River Basin (160 mg/L). The largest median decadal decrease in dissolved solids concentrations was in the Apalachicola-Chattahoochee-Flint River Basin (6.0 mg/L). The largest median decadal increases in nitrate as nitrogen (N) concentrations were in networks in the South Platte River Basin (2.0 mg/L as N) and the San Joaquin-Tulare Basins (1.0 mg/L as N). The largest median decadal decrease in nitrate concentrations was in the Santee River Basin and Coastal Drainages (0.63 mg/L). The magnitude of change in networks with statistically significant increases typically was much larger than the magnitude of change in networks with statistically significant decreases. 
The magnitude of change was greatest for chloride in the urban land-use networks and greatest for dissolved solids and nitrate in the agricultural land-use networks. Analysis of data from all networks combined indicated statistically significant increases for chloride, dissolved solids, and nitrate. Although chloride, dissolved solids, and nitrate concentrations were typically less than the drinking-water standards and guidelines, a statistical test was used to determine whether or not the proportion of samples exceeding the drinking-water standard or guideline changed significantly between the first and second full-network sampling events. The proportion of samples exceeding the U.S. Environmental Protection Agency (USEPA) Secondary Maximum Contaminant Level for dissolved solids (500 milligrams per liter) increased significantly between the first and second full-network sampling events when evaluating all networks combined at the national level. Also, for all networks combined, the proportion of samples exceeding the USEPA Maximum Contaminant Level (MCL) of 10 mg/L as N for nitrate increased significantly. One network in the Delmarva Peninsula had a significant increase in the proportion of samples exceeding the MCL for nitrate. A subset of 261 wells was sampled every other year (biennially) to evaluate decadal-scale changes using a time-series analysis. The analysis of the biennial data set showed that changes were generally similar to the findings from the analysis of decadal-scale change that was based on a step-trend analysis. Because of the small number of wells in a network with biennial data (typically 4-5 wells), the time-series analysis is more useful for understanding water-quality responses to changes in site-specific conditions rather than as an indicator of the change for the entire network.
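    The network-level step-trend test used throughout this report, the Wilcoxon-Pratt signed-rank test on paired decadal samples, can be sketched in a few lines. The version below uses the large-sample normal approximation and simply drops zero differences (Pratt's refinement, which ranks zeros before discarding them, is omitted for brevity); the well-network chloride values are synthetic:

    ```python
    import math
    import random

    random.seed(2)

    def wilcoxon_signed_rank(first, second):
        """Paired Wilcoxon signed-rank test, normal approximation.
        Zero differences are dropped; the Pratt variant used in the
        report instead ranks them before discarding."""
        diffs = sorted((b - a for a, b in zip(first, second) if b != a), key=abs)
        n = len(diffs)
        w_plus = sum(rank for rank, d in enumerate(diffs, start=1) if d > 0)
        mean = n * (n + 1) / 4
        sd = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
        z = (w_plus - mean) / sd
        p = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
        return z, p

    # Synthetic well network: chloride concentrations rise by ~8 mg/L
    # between the two decadal sampling events (all values invented).
    first_event = [random.gauss(30, 10) for _ in range(40)]
    second_event = [x + random.gauss(8, 6) for x in first_event]
    z, p = wilcoxon_signed_rank(first_event, second_event)
    ```

    A positive z with a small p-value is the pattern the report calls a statistically significant decadal increase for a network.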

  20. VoxelStats: A MATLAB Package for Multi-Modal Voxel-Wise Brain Image Analysis.

    PubMed

    Mathotaarachchi, Sulantha; Wang, Seqian; Shin, Monica; Pascoal, Tharick A; Benedet, Andrea L; Kang, Min Su; Beaudry, Thomas; Fonov, Vladimir S; Gauthier, Serge; Labbe, Aurélie; Rosa-Neto, Pedro

    2016-01-01

    In healthy individuals, behavioral outcomes are highly associated with variability in regional brain structure or neurochemical phenotypes. Similarly, in the context of neurodegenerative conditions, neuroimaging reveals that cognitive decline is linked to the magnitude of atrophy, neurochemical declines, or concentrations of abnormal protein aggregates across brain regions. However, modeling the effects of multiple regional abnormalities as determinants of cognitive decline at the voxel level remains largely unexplored by multimodal imaging research, given the high computational cost of estimating regression models for every single voxel from various imaging modalities. VoxelStats is a voxel-wise computational framework designed to overcome these computational limitations and to perform statistical operations on multiple scalar variables and imaging modalities at the voxel level. The VoxelStats package has been developed in MATLAB® and supports imaging formats such as Nifti-1, ANALYZE, and MINC v2. Prebuilt functions in VoxelStats enable the user to perform voxel-wise general and generalized linear models and mixed effects models with multiple volumetric covariates. Importantly, VoxelStats can recognize scalar values or image volumes as response variables and can accommodate volumetric statistical covariates as well as their interaction effects with other variables. Furthermore, this package includes built-in functionality to perform voxel-wise receiver operating characteristic analysis and paired and unpaired group contrast analysis. Validation of VoxelStats was conducted by comparing its linear regression functionality with existing toolboxes such as glim_image and RMINC. The validation results were identical to those of existing methods, and the additional functionality was demonstrated by generating feature case assessments (t-statistics, odds ratio, and true positive rate maps). 
In summary, VoxelStats expands the current methods for multimodal imaging analysis by allowing the estimation of advanced regional association metrics at the voxel level.
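    The computational pattern VoxelStats implements, fitting the same regression model independently at every voxel, can be sketched with a toy volume. The data, dimensions, and effect sizes below are invented for illustration (real use would go through the package's MATLAB functions):

    ```python
    import random

    random.seed(3)

    def fit_slope(xs, ys):
        """Ordinary least-squares slope for one voxel's regression."""
        n = len(xs)
        mx = sum(xs) / n
        my = sum(ys) / n
        sxx = sum((x - mx) ** 2 for x in xs)
        sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        return sxy / sxx

    # Toy "volume": 64 voxels, 20 subjects, one scalar covariate.
    covariate = [random.gauss(0, 1) for _ in range(20)]
    slope_map = {}
    for voxel in range(64):
        true_beta = 0.5 if voxel < 32 else 0.0   # half the voxels carry signal
        ys = [true_beta * x + random.gauss(0, 0.2) for x in covariate]
        slope_map[voxel] = fit_slope(covariate, ys)
    ```

    The result is a parameter map, one coefficient per voxel, which is the basic object that frameworks like VoxelStats then threshold and visualize; the package's value lies in doing this efficiently for real volumes and richer models.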

  1. Surgical resident supervision in the operating room and outcomes of care in Veterans Affairs hospitals.

    PubMed

    Itani, Kamal M F; DePalma, Ralph G; Schifftner, Tracy; Sanders, Karen M; Chang, Barbara K; Henderson, William G; Khuri, Shukri F

    2005-11-01

    There has been concern that a reduced level of surgical resident supervision in the operating room (OR) is correlated with worse patient outcomes. Until September 2004, Veterans Affairs (VA) hospitals entered level 3 supervision in the surgical record for every surgical case in which the attending physician was available but not physically present in the OR or the OR suite. In this study, we assessed the impact of level 3 supervision on risk-adjusted morbidity and mortality in the VA system. Surgical cases entered into the National Surgical Quality Improvement Program database between 1998 and 2004, from 99 VA teaching facilities, were included in a logistic regression analysis for each year. Level 3 versus all other levels of supervision was forced into the model, and patient characteristics then were selected stepwise to arrive at a final model. Confidence limits for the odds ratios were calculated by profile likelihood. A total of 610,660 cases were available for analysis. Thirty-day mortality and morbidity rates were reported in 14,441 (2.36%) and 63,079 (10.33%) cases, respectively. Level 3 supervision decreased from 8.72% in 1998 to 2.69% in 2004. In the logistic regression analysis, the odds ratios for mortality for level 3 ranged from .72 to 1.03. Only in the year 2000 was the odds ratio for mortality statistically significant at the .05 level (odds ratio, .72; 95% confidence interval, .594-.858). For morbidity, the odds ratios for level 3 supervision ranged from .66 to 1.01, and all odds ratios except for the year 2004 were statistically significant. Between 1998 and 2004, the level of resident supervision in the OR did not adversely affect clinical outcomes for surgical patients in the VA teaching hospitals.
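    The quantity being modeled here, an odds ratio for level 3 supervision with confidence limits, can be illustrated with a simple 2x2 Wald interval. The study itself used multivariable logistic regression with profile-likelihood limits; the Wald formula below is the simpler textbook stand-in, and the counts are hypothetical:

    ```python
    import math

    def odds_ratio_wald(a, b, c, d, z=1.96):
        """Odds ratio for the 2x2 table [[a, b], [c, d]] with a 95% Wald
        confidence interval on the log-odds scale."""
        or_ = (a * d) / (b * c)
        se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        lo = math.exp(math.log(or_) - z * se_log)
        hi = math.exp(math.log(or_) + z * se_log)
        return or_, lo, hi

    # Hypothetical counts: deaths/survivors under level 3 vs other supervision.
    or_, lo, hi = odds_ratio_wald(120, 4880, 160, 4840)
    ```

    An interval like this that excludes 1.0 would indicate a statistically significant association at the 5% level, which is the criterion the abstract applies year by year.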

  2. Analysis of the Relationship between the Emotional Intelligence and Professional Burnout Levels of Teachers

    ERIC Educational Resources Information Center

    Adilogullari, Ilhan

    2014-01-01

    The purpose of this study is to analyze the relationship between the emotional intelligence and professional burnout levels of teachers. The study population consists of high school teachers employed in the city center of Kirsehir Province; 563 volunteer teachers form the sample. The statistical implementation of the study is performed…

  3. The Performance of Methods to Test Upper-Level Mediation in the Presence of Nonnormal Data

    ERIC Educational Resources Information Center

    Pituch, Keenan A.; Stapleton, Laura M.

    2008-01-01

    A Monte Carlo study compared the statistical performance of standard and robust multilevel mediation analysis methods to test indirect effects for a cluster randomized experimental design under various departures from normality. The performance of these methods was examined for an upper-level mediation process, where the indirect effect is a fixed…

  4. Monitoring the Earth System Grid Federation through the ESGF Dashboard

    NASA Astrophysics Data System (ADS)

    Fiore, S.; Bell, G. M.; Drach, B.; Williams, D.; Aloisio, G.

    2012-12-01

    The Climate Model Intercomparison Project, phase 5 (CMIP5) is a global effort coordinated by the World Climate Research Programme (WCRP) involving tens of modeling groups spanning 19 countries. It is expected that the CMIP5 distributed data archive will total upwards of 3.5 petabytes, stored across several ESGF Nodes on four continents (North America, Europe, Asia, and Australia). The Earth System Grid Federation (ESGF) provides the IT infrastructure to support the CMIP5. Monitoring the distributed ESGF infrastructure is therefore a crucial task, carried out by the ESGF Dashboard. The ESGF Dashboard is a software component of the ESGF stack, responsible for collecting key information about the status of the federation in terms of: 1) Network topology (peer-groups composition), 2) Node type (host/services mapping), 3) Registered users (including their Identity Providers), 4) System metrics (e.g., round-trip time, service availability, CPU, memory, disk, processes, etc.), 5) Download metrics (both at the Node and federation level). The last class of information is very important since it provides strong insight into the CMIP5 experiment: the data usage statistics. To this end, CMCC and LLNL have developed a data analytics management system for the analysis of both node-level and federation-level data usage statistics. It provides data usage statistics aggregated by project, model, experiment, variable, realm, peer node, time, ensemble, datasetname (including version), etc. The back-end of the system is able to infer the data usage information of the entire federation by carrying out: - at node level: an 18-step reconciliation process on the peer node databases (i.e., node manager and publisher DB), which provides a 15-dimension data warehouse with local statistics; and - at global level: an aggregation process which federates the data usage statistics into a 16-dimension data warehouse with federation-level data usage statistics. 
The front-end of the Dashboard system adopts a web desktop approach, which combines the pervasiveness of a web application with the flexibility of a desktop one.

  5. Bureau of Labor Statistics Employment Projections: Detailed Analysis of Selected Occupations and Industries. Report to the Honorable Berkley Bedell, United States House of Representatives.

    ERIC Educational Resources Information Center

    General Accounting Office, Washington, DC.

    To compile its projections of future employment levels, the Bureau of Labor Statistics (BLS) combines the following five interlinked models in a six-step process: a labor force model, an econometric model of the U.S. economy, an industry activity model, an industry labor demand model, and an occupational labor demand model. The BLS was asked to…

  6. Models of dyadic social interaction.

    PubMed Central

    Griffin, Dale; Gonzalez, Richard

    2003-01-01

    We discuss the logic of research designs for dyadic interaction and present statistical models with parameters that are tied to psychologically relevant constructs. Building on Karl Pearson's classic nineteenth-century statistical analysis of within-organism similarity, we describe several approaches to indexing dyadic interdependence and provide graphical methods for visualizing dyadic data. We also describe several statistical and conceptual solutions to the 'levels of analysis' problem in analysing dyadic data. These analytic strategies allow the researcher to examine and measure psychological questions of interdependence and social influence. We provide illustrative data from casually interacting and romantic dyads. PMID:12689382

  7. Dihydrotestosterone and testosterone levels in men screened for prostate cancer: a study of a randomized population.

    PubMed

    Gustafsson, O; Norming, U; Gustafsson, S; Eneroth, P; Aström, G; Nyman, C R

    1996-03-01

    To investigate the possible relationship between serum levels of prostate specific antigen (PSA), dihydrotestosterone (DHT), testosterone, sex hormone-binding globulin (SHBG) and tumour stage, grade and ploidy in 65 cases of prostate cancer diagnosed in a screening study compared to 130 controls from the same population. From a population of 26,602 men between the ages of 55 and 70 years, 2400 were selected randomly and invited to undergo screening for prostate cancer using a digital rectal examination, transrectal ultrasonography and PSA analysis. Among the 1782 attendees, 65 cases of prostate cancer were diagnosed. Each case was matched with two control subjects of similar age and prostate volume from the screening population. Frozen serum samples were analysed for PSA, DHT, testosterone and SHBG, and compared to the diagnosis and tumour stage, grade and ploidy. Comparisons between these variables, and multivariate and regression analyses, were performed. There were significant differences in PSA level with all variables except tumour ploidy. DHT levels were slightly lower in patients with prostate cancer but the difference was not statistically significant. There was a trend towards lower DHT values in more advanced tumours and the difference for T-stages was close to statistical significance (P = 0.059). Testosterone levels were lower in patients with cancer than in the control group, but the differences were not significant. There was no correlation between testosterone levels, tumour stage and ploidy, but the difference in testosterone level between tumours of a low grade of differentiation and those of intermediate and high grade was nearly significant (P = 0.058). The testosterone/DHT ratio tended to be higher in patients with more advanced tumours. SHBG levels were lower in patients with cancer than in controls but the differences were not statistically significant. There were no systematic variations with tumour stage, grade and ploidy. 
Multivariate analysis showed that if the PSA level was known, then DHT, testosterone or SHBG added no further information concerning diagnosis, stage, grade or ploidy. Regression analysis on T-stage, PSA level and DHT showed an inverse linear relationship between PSA and DHT for stage T-3 (P = 0.035), but there was no relationship between PSA and testosterone. PSA was of value in discriminating between cases and controls and between various tumour stages and grades, but no statistically significant correlation was found for ploidy. If the PSA level was known, no other variable added information in individual cases. Within a group, DHT levels tended to be lower among cases and in those with more advanced tumours. There was an inverse relationship between tumour volume, as defined by PSA level, and 5 alpha-reductase activity, as defined by DHT level, and the testosterone/DHT ratio. This trend was most obvious with T-stage. No systematic variations were found in the levels of testosterone or SHBG.

  8. A Weibull statistics-based lignocellulose saccharification model and a built-in parameter accurately predict lignocellulose hydrolysis performance.

    PubMed

    Wang, Mingyu; Han, Lijuan; Liu, Shasha; Zhao, Xuebing; Yang, Jinghua; Loh, Soh Kheang; Sun, Xiaomin; Zhang, Chenxi; Fang, Xu

    2015-09-01

    Renewable energy from lignocellulosic biomass has been deemed an alternative to depleting fossil fuels. In order to improve this technology, we aim to develop robust mathematical models for the enzymatic lignocellulose degradation process. By analyzing 96 groups of previously published and newly obtained lignocellulose saccharification results and fitting them to the Weibull distribution, we discovered that Weibull statistics can accurately predict lignocellulose saccharification data, regardless of the type of substrates, enzymes and saccharification conditions. A mathematical model for enzymatic lignocellulose degradation was subsequently constructed based on Weibull statistics. Further analysis of the mathematical structure of the model and experimental saccharification data showed the significance of the two parameters in this model. In particular, the λ value, defined as the characteristic time, represents the overall performance of the saccharification system. This suggestion was further supported by statistical analysis of experimental saccharification data and analysis of the glucose production levels when λ and n values change. In conclusion, the constructed Weibull statistics-based model can accurately predict lignocellulose hydrolysis behavior, and the λ parameter can be used to assess the overall performance of enzymatic lignocellulose degradation. Advantages and potential applications of the model and the λ value in saccharification performance assessment were discussed. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
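    The role of λ as a characteristic time follows directly from the Weibull form. Assuming the model takes the standard shape Y(t) = Y_max·(1 − exp(−(t/λ)^n)), which is our reading of a Weibull-based yield curve, the yield at t = λ is always 1 − 1/e ≈ 63.2% of the plateau, whatever the shape parameter n. The parameter values below are illustrative, not fitted:

    ```python
    import math

    def weibull_yield(t, y_max, lam, n):
        """Weibull-type saccharification curve: cumulative sugar yield at
        time t (functional form assumed; parameters are illustrative)."""
        return y_max * (1 - math.exp(-((t / lam) ** n)))

    y_max, lam, n = 80.0, 12.0, 0.9   # assumed units: % yield, hours
    curve = [weibull_yield(t, y_max, lam, n) for t in range(1, 73)]

    # At t = λ the exponent (t/λ)^n equals 1 for any n, so the curve
    # reaches exactly 1 - 1/e of its plateau -- hence λ is a
    # shape-independent characteristic time for the system.
    fraction_at_lambda = weibull_yield(lam, y_max, lam, n) / y_max
    ```

    A smaller fitted λ means the system reaches the 63.2% mark sooner, which is why λ alone can rank the overall performance of different saccharification setups.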

  9. Does restoration of focal lumbar lordosis for single level degenerative spondylolisthesis result in better patient-reported clinical outcomes? A systematic literature review.

    PubMed

    Rhee, Chanseok; Visintini, Sarah; Dunning, Cynthia E; Oxner, William M; Glennie, R Andrew

    2017-10-01

    It is controversial whether the surgical restoration of sagittal balance and spinopelvic angulation in single-level lumbar degenerative spondylolisthesis results in clinical improvements. The purpose of this study was to systematically review the available literature to determine whether the surgical correction of malalignment in lumbar degenerative spondylolisthesis correlates with improvements in patient-reported clinical outcomes. Literature searches were performed via Ovid Medline, Embase, CENTRAL and Web of Science using the search terms "lumbar," "degenerative/spondylolisthesis" and "surgery/surgical/surgeries/fusion". This yielded 844 articles; after reviewing the abstracts and full texts, 13 articles were included for summary and final analysis. There were two Level II articles, four Level III articles and five Level IV articles. The most commonly used patient-reported outcome measures (PROMs) were the Oswestry Disability Index (ODI) and the visual analogue scale (VAS). Four articles were included in the final statistical analysis. There was no statistically significant difference between the patient groups who achieved successful surgical correction of malalignment and those who did not, for either ODI (mean difference -0.94, CI -8.89 to 7.00) or VAS (mean difference 1.57, CI -3.16 to 6.30). Two studies assessed the efficacy of manual reduction of lumbar degenerative spondylolisthesis and the clinical outcomes after the operation, and found no statistically significant improvement. Overall, the restoration of focal lumbar lordosis and sagittal balance for single-level lumbar degenerative spondylolisthesis does not seem to yield clinical improvements, but well-powered studies on this specific topic are lacking in the current literature. Future well-powered studies are needed for a more definitive conclusion. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Effect of a Dedicated Orthopaedic Advanced Practice Provider in a Level I Trauma Center: Analysis of Length of Stay and Cost.

    PubMed

    Hiza, Elise A; Gottschalk, Michael B; Umpierrez, Erica; Bush, Patricia; Reisman, William M

    2015-07-01

    The objective of this study is to analyze the effect of an orthopaedic trauma advanced practice provider on length of stay (LOS) and cost in a level I trauma center. The hypothesis is that the addition of a single full-time nurse practitioner (NP) to the orthopaedic trauma team at a level I trauma center would decrease overall LOS and hospital cost. Charts of all patients discharged from the orthopaedic surgery service 1 year before (pre-NP) and 1 year after (post-NP) the hiring of an NP were retrospectively reviewed. The chart review included age, gender, LOS, discharge destination, intravenous antibiotic use, wound VAC therapy, admission location, and length of time to surgery. Statistical analysis was performed using the Wilcoxon/Kruskal-Wallis test. The hiring of an NP yielded a statistically significant decrease in LOS across the following patient subgroups: patients transferred from the trauma service (13.56 compared with 7.02 days, P < 0.001), patients aged 60 years and older (7.34 compared with 5.04 days, P = 0.037), patients discharged to a rehabilitation facility (10.84 compared with 8.31 days, P = 0.002), and patients discharged on antibiotics/wound VAC therapy (15.16 compared with 11.24 days, P = 0.017). Length of time to surgery was also decreased (1.48 compared with 1.31 days, P = 0.37). The addition of a dedicated orthopaedic trauma advanced practice provider at a county level I trauma center resulted in a statistically significant decrease in LOS and thus reduced indirect costs to the hospital. Economic Level IV. See Instructions for Authors for a complete description of levels of evidence.

  11. Serum concentrations of soluble (s)L- and (s)P-selectins in women with ovarian cancer.

    PubMed

    Majchrzak-Baczmańska, Dominika B; Głowacka, Ewa; Wilczyński, Miłosz; Malinowski, Andrzej

    2018-03-01

    The aim of the study was to compare serum concentration of soluble L- and P-selectins in women with ovarian cancer (OC) and healthy controls, and to investigate sL- and sP-selectin levels with regard to clinical and pathological parameters. Correlation analysis was used to measure the following: sL- and sP-selectin concentration and Ca125; sP-selectin and platelet concentrations; and sL-selectin and serum leukocyte levels in women with OC. The study included 29 patients with OC and 23 healthy controls. Serum concentrations of sL- and sP-selectins were measured in all subjects. Routine diagnostic tests were performed: CBC and USG in both groups, and Ca125 in the study group. Significantly higher serum concentrations of sL- and sP-selectins were found in the study group as compared to controls. Lower levels of serum sL-selectin were observed in women with poorly differentiated OC (G3) and advanced stages of the disease (FIGO III, IV), but the results were statistically insignificant. No statistically significant relationship was detected between sP-selectin serum concentration in women with OC and tumour differentiation, histological type, and stage of the disease. No significant correlation was found between sL- and sP-selectins and Ca125 levels. A weak correlation was found between serum concentration of sP-selectin in women with OC and platelet count. No statistically significant correlation was observed between sL-selectin concentration and serum leukocyte levels in women with OC. The analysis of sL- and sP-selectin concentrations may be a useful tool in the diagnosis of OC. The levels of sL-selectin decrease with disease progression.

  12. Investigating spousal concordance of diabetes through statistical analysis and data mining.

    PubMed

    Wang, Jong-Yi; Liu, Chiu-Shong; Lung, Chi-Hsuan; Yang, Ya-Tun; Lin, Ming-Hung

    2017-01-01

    Spousal clustering of diabetes merits attention. Whether old-age vulnerability or a shared family environment determines the concordance of diabetes is also uncertain. This study investigated the spousal concordance of diabetes and compared the risk of diabetes concordance between couples and noncouples by using nationally representative data. A total of 22,572 individuals identified from the 2002-2013 National Health Insurance Research Database of Taiwan constituted 5,643 couples and 5,643 noncouples through 1:1 dual propensity score matching (PSM). Factors associated with concordance in both spouses with diabetes were analyzed at the individual level. The risk of diabetes concordance between couples and noncouples was compared at the couple level. Logistic regression was the main statistical method, and data were analyzed using SAS 9.4. The C&RT and Apriori data-mining algorithms, run in IBM SPSS Modeler 13, supplemented the statistical analysis. High odds of the spousal concordance of diabetes were associated with old age, middle levels of urbanization, and high comorbidities (all P < 0.05). The dual PSM analysis revealed that the risk of diabetes concordance was significantly higher in couples (5.19%) than in noncouples (0.09%; OR = 61.743, P < 0.0001). A high concordance rate of diabetes in couples may indicate the influences of assortative mating and shared environment. Diabetes in a spouse implicates its risk in the partner. Family-based diabetes care that emphasizes the screening of couples at risk of diabetes by using the identified risk factors is suggested in prospective clinical practice interventions.
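
    The 1:1 matching step described above can be illustrated with a greedy nearest-neighbour match on precomputed propensity scores. This is a sketch under the assumption that scores have already been estimated (e.g., by logistic regression); the IDs, scores, and caliper below are hypothetical, not the study's:

```python
def greedy_match(treated, controls, caliper=0.05):
    """1:1 greedy nearest-neighbour propensity score matching.
    treated, controls: lists of (id, score) pairs; returns matched ID pairs."""
    pairs = []
    available = dict(controls)  # id -> score, still unmatched
    for tid, ts in sorted(treated, key=lambda p: p[1]):
        if not available:
            break
        cid = min(available, key=lambda c: abs(available[c] - ts))
        if abs(available[cid] - ts) <= caliper:  # only match within the caliper
            pairs.append((tid, cid))
            del available[cid]
    return pairs

# Hypothetical subjects: two treated, three potential controls
treated = [("t1", 0.30), ("t2", 0.52)]
controls = [("c1", 0.31), ("c2", 0.50), ("c3", 0.90)]
pairs = greedy_match(treated, controls)
```

    Each treated subject is paired with the closest still-available control; controls with no sufficiently close score (like c3 here) remain unmatched.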

  13. Investigating spousal concordance of diabetes through statistical analysis and data mining

    PubMed Central

    Liu, Chiu-Shong; Lung, Chi-Hsuan; Yang, Ya-Tun; Lin, Ming-Hung

    2017-01-01

    Objective Spousal clustering of diabetes merits attention. Whether old-age vulnerability or a shared family environment determines the concordance of diabetes is also uncertain. This study investigated the spousal concordance of diabetes and compared the risk of diabetes concordance between couples and noncouples by using nationally representative data. Methods A total of 22,572 individuals identified from the 2002–2013 National Health Insurance Research Database of Taiwan constituted 5,643 couples and 5,643 noncouples through 1:1 dual propensity score matching (PSM). Factors associated with concordance in both spouses with diabetes were analyzed at the individual level. The risk of diabetes concordance between couples and noncouples was compared at the couple level. Logistic regression was the main statistical method, and data were analyzed using SAS 9.4. The C&RT and Apriori data-mining algorithms, run in IBM SPSS Modeler 13, supplemented the statistical analysis. Results High odds of the spousal concordance of diabetes were associated with old age, middle levels of urbanization, and high comorbidities (all P < 0.05). The dual PSM analysis revealed that the risk of diabetes concordance was significantly higher in couples (5.19%) than in noncouples (0.09%; OR = 61.743, P < 0.0001). Conclusions A high concordance rate of diabetes in couples may indicate the influences of assortative mating and shared environment. Diabetes in a spouse implicates its risk in the partner. Family-based diabetes care that emphasizes the screening of couples at risk of diabetes by using the identified risk factors is suggested in prospective clinical practice interventions. PMID:28817654

  14. Meta-Analysis: Effects of Probiotic Supplementation on Lipid Profiles in Normal to Mildly Hypercholesterolemic Individuals.

    PubMed

    Shimizu, Mikiko; Hashiguchi, Masayuki; Shiga, Tsuyoshi; Tamura, Hiro-omi; Mochizuki, Mayumi

    2015-01-01

    Recent experimental and clinical studies have suggested that probiotic supplementation has beneficial effects on serum lipid profiles. However, there are conflicting results on the efficacy of probiotic preparations in reducing serum cholesterol. To evaluate the effects of probiotics on human serum lipid levels, we conducted a meta-analysis of interventional studies. Eligible reports were obtained by searches of electronic databases. We included randomized, controlled clinical trials comparing probiotic supplementation with placebo or no treatment (control). Statistical analysis was performed with Review Manager 5.3.3. Subanalyses were also performed. Eleven of 33 randomized clinical trials retrieved were eligible for inclusion in the meta-analysis. No participant had received any cholesterol-lowering agent. Probiotic interventions (including fermented milk products and probiotics) produced changes in total cholesterol (TC) (mean difference -0.17 mmol/L, 95% CI: -0.27 to -0.07 mmol/L) and low-density lipoprotein cholesterol (LDL-C) (mean difference -0.22 mmol/L, 95% CI: -0.30 to -0.13 mmol/L). High-density lipoprotein cholesterol and triglyceride levels did not differ significantly between probiotic and control groups. In subanalysis, long-term (> 4-week) probiotic intervention was statistically more effective in decreasing TC and LDL-C than short-term (≤ 4-week) intervention. The decreases in TC and LDL-C levels with probiotic intervention were greater in mildly hypercholesterolemic than in normocholesterolemic individuals. Both fermented milk product and probiotic preparations decreased TC and LDL-C levels. Gaio and the Lactobacillus acidophilus strain reduced TC and LDL-C levels to a greater extent than other bacterial strains. In conclusion, this meta-analysis showed that probiotic supplementation could be useful in the primary prevention of hypercholesterolemia and may lead to reductions in risk factors for cardiovascular disease.

  15. Low-level processing for real-time image analysis

    NASA Technical Reports Server (NTRS)

    Eskenazi, R.; Wilf, J. M.

    1979-01-01

    A system that detects object outlines in television images in real time is described. A high-speed pipeline processor transforms the raw image into an edge map and a microprocessor, which is integrated into the system, clusters the edges, and represents them as chain codes. Image statistics, useful for higher level tasks such as pattern recognition, are computed by the microprocessor. Peak intensity and peak gradient values are extracted within a programmable window and are used for iris and focus control. The algorithms implemented in hardware and the pipeline processor architecture are described. The strategy for partitioning functions in the pipeline was chosen to make the implementation modular. The microprocessor interface allows flexible and adaptive control of the feature extraction process. The software algorithms for clustering edge segments, creating chain codes, and computing image statistics are also discussed. A strategy for real-time image analysis that uses this system is given.
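
    The chain-code representation mentioned above can be sketched with the standard 8-directional Freeman code, which records each step between successive edge pixels as a direction index. The coordinate convention (x right, y up) and the example path are illustrative assumptions, not details from the paper:

```python
# 8-directional Freeman chain code: map each unit step to a direction index.
DIRS = {(1, 0): 0, (1, 1): 1, (0, 1): 2, (-1, 1): 3,
        (-1, 0): 4, (-1, -1): 5, (0, -1): 6, (1, -1): 7}

def chain_code(points):
    """Encode a path of adjacent pixel coordinates as Freeman direction codes."""
    return [DIRS[(x2 - x1, y2 - y1)]
            for (x1, y1), (x2, y2) in zip(points, points[1:])]

codes = chain_code([(0, 0), (1, 0), (1, 1)])  # one step right, one step up
```

    Storing one 3-bit code per step instead of full coordinates is what makes chain codes attractive for a memory-constrained microprocessor.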

  16. An Investigation of the Variety and Complexity of Statistical Methods Used in Current Internal Medicine Literature.

    PubMed

    Narayanan, Roshni; Nugent, Rebecca; Nugent, Kenneth

    2015-10-01

    Accreditation Council for Graduate Medical Education guidelines require internal medicine residents to develop skills in the interpretation of medical literature and to understand the principles of research. A necessary component is the ability to understand the statistical methods used and their results, material that is not an in-depth focus of most medical school curricula and residency programs. Given the breadth and depth of the current medical literature and an increasing emphasis on complex, sophisticated statistical analyses, the statistical foundation and education necessary for residents are uncertain. We reviewed the statistical methods and terms used in 49 articles discussed at the journal club in the Department of Internal Medicine residency program at Texas Tech University between January 1, 2013 and June 30, 2013. We collected information on the study type and on the statistical methods used for summarizing and comparing samples, determining the relations between independent variables and dependent variables, and estimating models. We then identified the typical statistics education level at which each term or method is learned. A total of 14 articles came from the Journal of the American Medical Association Internal Medicine, 11 from the New England Journal of Medicine, 6 from the Annals of Internal Medicine, 5 from the Journal of the American Medical Association, and 13 from other journals. Twenty reported randomized controlled trials. Summary statistics included mean values (39 articles), category counts (38), and medians (28). Group comparisons were based on t tests (14 articles), χ2 tests (21), and nonparametric ranking tests (10). The relations between dependent and independent variables were analyzed with simple regression (6 articles), multivariate regression (11), and logistic regression (8). 
Nine studies reported odds ratios with 95% confidence intervals, and seven analyzed test performance using sensitivity and specificity calculations. These papers used 128 statistical terms and context-defined concepts, including some from data analysis (56), epidemiology-biostatistics (31), modeling (24), data collection (12), and meta-analysis (5). Ten different software programs were used in these articles. Based on usual undergraduate and graduate statistics curricula, 64.3% of the concepts and methods used in these papers required at least a master's degree-level statistics education. The interpretation of the current medical literature can require an extensive background in statistical methods at an education level exceeding the material and resources provided to most medical students and residents. Given the complexity and time pressure of medical education, these deficiencies will be hard to correct, but this project can serve as a basis for developing a curriculum in study design and statistical methods needed by physicians-in-training.

  17. An Efficient Objective Analysis System for Parallel Computers

    NASA Technical Reports Server (NTRS)

    Stobie, J.

    1999-01-01

    A new atmospheric objective analysis system designed for parallel computers will be described. The system can produce a global analysis (on a 1 X 1 lat-lon grid with 18 levels of heights and winds and 10 levels of moisture) using 120,000 observations in 17 minutes on 32 CPUs (SGI Origin 2000). No special parallel code is needed (e.g. MPI or multitasking) and the 32 CPUs do not have to be on the same platform. The system is totally portable and can run on several different architectures at once. In addition, the system can easily scale up to 100 or more CPUs. This will allow for much higher resolution and significant increases in input data. The system scales linearly with the number of observations and the number of grid points. The cost overhead in going from 1 to 32 CPUs is 18%. In addition, the analysis results are identical regardless of the number of processors used. This system has all the characteristics of optimal interpolation, combining detailed instrument and first guess error statistics to produce the best estimate of the atmospheric state. Static tests with a 2 X 2.5 resolution version of this system showed its analysis increments are comparable to the latest NASA operational system including maintenance of mass-wind balance. Results from several months of cycling tests in the Goddard EOS Data Assimilation System (GEOS DAS) show this new analysis retains the same level of agreement between the first guess and observations (O-F statistics) as the current operational system.
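
    The optimal-interpolation principle the abstract refers to, weighting the first guess and the observations by their respective error statistics, reduces in the scalar case to a simple variance-weighted update. This is a sketch with hypothetical numbers, not the system's actual multivariate solver:

```python
def oi_update(first_guess, observation, var_bg, var_obs):
    """Scalar optimal-interpolation update: the gain k weights the observation
    increment by the background and observation error variances."""
    k = var_bg / (var_bg + var_obs)
    return first_guess + k * (observation - first_guess)

# With equal error variances the analysis falls halfway between guess and obs
analysis = oi_update(10.0, 14.0, var_bg=1.0, var_obs=1.0)
```

    A trusted first guess (small var_bg) pulls the analysis toward the background; a trusted observation pulls it the other way.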

  18. P-MartCancer–Interactive Online Software to Enable Analysis of Shotgun Cancer Proteomic Datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Webb-Robertson, Bobbie-Jo M.; Bramer, Lisa M.; Jensen, Jeffrey L.

    P-MartCancer is a new interactive web-based software environment that enables biomedical and biological scientists to perform in-depth analyses of global proteomics data without requiring direct interaction with the data or with statistical software. P-MartCancer offers a series of statistical modules associated with quality assessment, peptide and protein statistics, protein quantification and exploratory data analyses driven by the user via customized workflows and interactive visualization. Currently, P-MartCancer offers access to multiple cancer proteomic datasets generated through the Clinical Proteomics Tumor Analysis Consortium (CPTAC) at the peptide, gene and protein levels. P-MartCancer is deployed using Azure technologies (http://pmart.labworks.org/cptac.html), the web service is alternatively available via Docker Hub (https://hub.docker.com/r/pnnl/pmart-web/), and many statistical functions can be utilized directly from an R package available on GitHub (https://github.com/pmartR).

  19. Statistical analysis of NaOH pretreatment effects on sweet sorghum bagasse characteristics

    NASA Astrophysics Data System (ADS)

    Putri, Ary Mauliva Hada; Wahyuni, Eka Tri; Sudiyani, Yanni

    2017-01-01

    We analyze the behavior of sweet sorghum bagasse characteristics before and after NaOH pretreatment by statistical analysis. These characteristics include the percentages of lignocellulosic materials and the degree of crystallinity. We use the chi-square method to obtain the values of the fitted parameters, and then apply Student's t-test to check whether they are significantly different from zero at the 99.73% confidence level (C.L.). We find that the percentages of hemicellulose and lignin decrease significantly after pretreatment. Crystallinity, on the other hand, does not show similar behavior: the fitted parameters in this case are all consistent with zero. Our statistical result is then cross-examined against observations from X-ray diffraction (XRD) and Fourier transform infrared (FTIR) spectroscopy, showing good agreement. This result may indicate that the 10% NaOH pretreatment might not be sufficient to change the crystallinity index of the sweet sorghum bagasse.
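
    Testing whether a fitted parameter differs from zero, as described above, amounts to comparing the ratio of the estimate to its standard error against a critical value; the quoted 99.73% C.L. is roughly the familiar 3-sigma threshold under a normal (large-sample) approximation. A sketch with hypothetical fit results:

```python
def t_statistic(estimate, std_error):
    """Ratio of a fitted parameter to its standard error."""
    return estimate / std_error

# 99.73% C.L. ~ |t| > 3 under a normal approximation.
# Hypothetical fitted change in lignin percentage and its standard error:
significant = abs(t_statistic(-12.4, 2.1)) > 3.0
```

    A parameter whose estimate is small relative to its standard error (|t| < 3) is "consistent with zero" in the sense used by the abstract.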

  20. Does the pre-operative serum phosphate level predict early hypocalcaemia following parathyroidectomy for primary hyperparathyroidism?

    PubMed

    Ellul, David; Townsley, Richard Brendan; Clark, Louise Jane

    2013-06-01

    Hypocalcaemia is a significant post-operative complication following parathyroidectomy. Early identification of risk factors can help pre-empt hypocalcaemia and avoid serious sequelae. It can also help identify those patients that are not suitable for day-case surgery. The aim of this study was to analyse the predictive value of the pre-operative serum phosphate level as an indicator for developing hypocalcaemia post-operatively in patients undergoing parathyroidectomy for primary hyperparathyroidism. We performed a retrospective review of all patients who underwent parathyroidectomy between 2008 and 2010 at the Southern General Hospital in Glasgow. Data collected included the number of parathyroid glands excised and their histology, pre-operative adjusted calcium (aCa) and phosphate levels, post-operative aCa at 6 and 24 h following surgery, and the fall in aCa levels in the first 6 h and 24 h following surgery. Minitab Statistical Analysis (Version 15) was used for data analysis. Fifty-six patients underwent parathyroidectomy in the study period. Twelve patients were excluded for various reasons including incomplete records and secondary hyperparathyroidism. Patients given calcium or Vitamin D supplements immediately post-operatively were also excluded. Statistical analysis showed no significant correlation between the pre-operative phosphate level and the post-operative decline in aCa level 6 h or 24 h following surgery. Patients with a lower phosphate level pre-operatively were not at risk of a more drastic fall in calcium levels following parathyroidectomy. The pre-operative phosphate level was not found to be predictive of post-operative hypocalcaemia in our study. Copyright © 2012 Royal College of Surgeons of Edinburgh (Scottish charity number SC005317) and Royal College of Surgeons in Ireland. Published by Elsevier Ltd. All rights reserved.

  1. Identifying What Student Affairs Professionals Value: A Mixed Methods Analysis of Professional Competencies Listed in Job Descriptions

    ERIC Educational Resources Information Center

    Hoffman, John L.; Bresciani, Marilee J.

    2012-01-01

    This mixed method study explored the professional competencies that administrators expect from entry-, mid-, and senior-level professionals as reflected in 1,759 job openings posted in 2008. Knowledge, skill, and dispositional competencies were identified during the qualitative phase of the study. Statistical analysis of the prevalence of…

  2. Sister chromatid exchanges and micronuclei analysis in lymphocytes of men exposed to simazine through drinking water.

    PubMed

    Suárez, Susanna; Rubio, Arantxa; Sueiro, Rosa Ana; Garrido, Joaquín

    2003-06-06

    In some cities of the autonomous community of Extremadura (south-west of Spain), levels of simazine from 10 to 30 ppm were detected in tap water. To analyse the possible effect of this herbicide, two biomarkers, sister chromatid exchanges (SCE) and micronuclei (MN), were used in peripheral blood lymphocytes from males exposed to simazine through drinking water. SCE and MN analysis failed to detect any statistically significant increase in the people exposed to simazine when compared with the controls. With respect to high frequency cells (HFC), a statistically significant difference was detected between exposed and control groups.

  3. Statistical Approaches Used to Assess the Equity of Access to Food Outlets: A Systematic Review

    PubMed Central

    Lamb, Karen E.; Thornton, Lukar E.; Cerin, Ester; Ball, Kylie

    2015-01-01

    Background Inequalities in eating behaviours are often linked to the types of food retailers accessible in neighbourhood environments. Numerous studies have aimed to identify if access to healthy and unhealthy food retailers is socioeconomically patterned across neighbourhoods, and thus a potential risk factor for dietary inequalities. Existing reviews have examined differences between methodologies, particularly focussing on neighbourhood and food outlet access measure definitions. However, no review has informatively discussed the suitability of the statistical methodologies employed; a key issue determining the validity of study findings. Our aim was to examine the suitability of statistical approaches adopted in these analyses. Methods Searches were conducted for articles published from 2000–2014. Eligible studies included objective measures of the neighbourhood food environment and neighbourhood-level socio-economic status, with a statistical analysis of the association between food outlet access and socio-economic status. Results Fifty-four papers were included. Outlet accessibility was typically defined as the distance to the nearest outlet from the neighbourhood centroid, or as the number of food outlets within a neighbourhood (or buffer). To assess if these measures were linked to neighbourhood disadvantage, common statistical methods included ANOVA, correlation, and Poisson or negative binomial regression. Although all studies involved spatial data, few considered spatial analysis techniques or spatial autocorrelation. Conclusions With advances in GIS software, sophisticated measures of neighbourhood outlet accessibility can be considered. However, approaches to statistical analysis often appear less sophisticated. Care should be taken to consider assumptions underlying the analysis and the possibility of spatially correlated residuals which could affect the results. PMID:29546115
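
    The spatial autocorrelation the review flags as commonly overlooked is typically checked with Moran's I on the model residuals. A minimal sketch, using a hypothetical binary adjacency matrix for four neighbourhoods in a row (not data from any reviewed study):

```python
def morans_i(values, w):
    """Moran's I for values observed over areas with spatial weight matrix w
    (w[i][j] > 0 when areas i and j are neighbours)."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    s0 = sum(sum(row) for row in w)
    num = sum(w[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / s0) * (num / den)

# Four areas in a line; spatially clustered values give a positive I
w = [[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]]
i_clustered = morans_i([1.0, 1.0, 5.0, 5.0], w)
```

    A clearly positive I on regression residuals would argue for spatial analysis techniques rather than ordinary Poisson or negative binomial models.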

  4. Bone and Soft Tissue Response in Bone-Level Implants Restored with Platform Switching: A 5-Year Clinical Prospective Study.

    PubMed

    Lago, Laura; da Silva, Luis; Gude, Francisco; Rilo, Benito

    The aim of this prospective study was to evaluate radiographic levels of peri-implant bone crest as well as soft tissue response, papilla height, and buccal mucosa recession, in bone-level implants restored with platform switching after 1-year and 5-year follow-ups. This prospective study called for the placement of 59 implants to obtain a target of 90% power. To compensate for possible dropouts, the sample size was adjusted to 67 implants. To assess marginal bone level changes, periapical radiographs were taken at baseline, 1 year, and 5 years after the definitive restorations. Peri-implant soft tissue modifications were evaluated by performing a photographic sequence at 15 days, 1 year, and 5 years after implant restoration. Parameters measured were: (1) distance from the tip of the papilla to the contact point and (2) apicocoronal crown length. A one-way analysis of variance (ANOVA rank test) was used to compare quantitative data among the three time points studied. Mean marginal bone level changes were as follows: -0.06 ± 0.32 mm from baseline to 1 year, -0.23 ± 0.38 mm from 1 to 5 years, and -0.28 ± 0.45 mm from baseline to 5 years. In bone-level outcomes, no statistically significant differences were found between baseline and 1 year, while the mean differences between 1 and 5 years and baseline and 5 years showed statistically significant differences. In the soft tissue analysis, the distance from the tip of the papilla to the contact point showed the following values: baseline, 2.08 mm; 1 year, 1.54 mm; 5 years, 1.31 mm. No statistically significant differences were found between baseline and 1 year, whereas statistically significant differences between 1 and 5 years and baseline and 5 years were found. Apicocoronal crown length measurements showed the following values: baseline, 9.44 mm; 1 year, 9.28 mm; 5 years, 9.81 mm. No significant differences were found between times studied. 
This prospective clinical study of 67 bone-level implants restored according to the platform-switching concept reported that radiographic levels of peri-implant bone crest were statistically significant between 1 and 5 years and baseline and 5 years. For the soft tissue response, the greatest reduction in the distance from the papilla to the contact point from 1 to 5 years and baseline to 5 years was observed. No significant differences were shown in the buccal margin.

  5. Assessment of levels of bacterial contamination of large wild game meat in Europe.

    PubMed

    Membré, Jeanne-Marie; Laroche, Michel; Magras, Catherine

    2011-08-01

    The variations in prevalence and levels of pathogens and fecal contamination indicators in large wild game meat were studied to assess their potential impact on consumers. This analysis was based on hazard analysis, data generation and statistical analysis. A total of 2919 meat samples from three species (red deer, roe deer, wild boar) were collected at French game meat traders' facilities using two sampling protocols. Information was gathered on the types of meat cuts (forequarter or haunch; first sampling protocol) or type of retail-ready meat (stewing meat or roasting meat; second protocol), and also on the meat storage conditions (frozen or chilled), country of origin (eight countries) and shooting season (autumn, winter, spring). The samples were analyzed in both protocols for detection and enumeration of Escherichia coli, coagulase+staphylococci and Clostridium perfringens. In addition, detection and enumeration of thermotolerant coliforms and Listeria monocytogenes were performed for samples collected in the first and second protocols, respectively. The levels of bacterial contamination of the raw meat were determined by performing statistical analysis involving probabilistic techniques and Bayesian inference. C. perfringens was found in the highest numbers for the three indicators of microbial quality, hygiene and good handling, and L. monocytogenes in the lowest. Differences in contamination levels between game species and between meats distributed as chilled or frozen products were not significant. These results might be included in quantitative exposure assessments. Copyright © 2011 Elsevier Ltd. All rights reserved.

  6. Quantitative comparison of tympanic membrane displacements using two optical methods to recover the optical phase

    NASA Astrophysics Data System (ADS)

    Santiago-Lona, Cynthia V.; Hernández-Montes, María del Socorro; Mendoza-Santoyo, Fernando; Esquivel-Tejeda, Jesús

    2018-02-01

    The study and quantification of the tympanic membrane (TM) displacements add important information to advance the knowledge about the hearing process. A comparative statistical analysis between two commonly used demodulation methods employed to recover the optical phase in digital holographic interferometry, namely the fast Fourier transform and phase-shifting interferometry, is presented as applied to study thin tissues such as the TM. The resulting experimental TM surface displacement data are used to contrast both methods through the analysis of variance and F tests. Data are gathered when the TMs are excited with continuous sound stimuli at levels 86, 89 and 93 dB SPL for the frequencies of 800, 1300 and 2500 Hz under the same experimental conditions. The statistical analysis shows repeatability in z-direction displacements with a standard deviation of 0.086, 0.098 and 0.080 μm using the Fourier method, and 0.080, 0.104 and 0.055 μm with the phase-shifting method at a 95% confidence level for all frequencies. The precision and accuracy are evaluated by means of the coefficient of variation; the results with the Fourier method are 0.06143, 0.06125, 0.06154 and 0.06154, 0.06118, 0.06111 with phase-shifting. The relative error between both methods is 7.143, 6.250 and 30.769%. On comparing the measured displacements, the results indicate that there is no statistically significant difference between both methods for frequencies at 800 and 1300 Hz; however, errors and other statistics increase at 2500 Hz.
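
    The coefficient of variation used above to assess precision, and the relative error used to compare the two demodulation methods, are both simple ratios. A sketch with illustrative values, not the measured displacement data:

```python
import statistics

def coefficient_of_variation(values):
    """Sample standard deviation divided by the mean (dimensionless)."""
    return statistics.stdev(values) / statistics.mean(values)

def relative_error_pct(reference, other):
    """Percent relative error of `other` with respect to `reference`."""
    return abs(reference - other) / reference * 100.0

cv = coefficient_of_variation([2.0, 4.0, 6.0])
```

    Because both quantities are normalized, they allow measurements at different frequencies and sound pressure levels to be compared on a common scale.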

  7. Statistical Assessment of a Paired-site Approach for Verification of Carbon and Nitrogen Sequestration on CRP Land

    NASA Astrophysics Data System (ADS)

    Kucharik, C.; Roth, J.

    2002-12-01

    The threat of global climate change has provoked policy-makers to consider plausible strategies to slow the accumulation of greenhouse gases, especially carbon dioxide, in the atmosphere. One such idea involves the sequestration of atmospheric carbon (C) in degraded agricultural soils as part of the Conservation Reserve Program (CRP). While the potential for significant C sequestration in CRP grassland ecosystems has been demonstrated, the paired-site sampling approach traditionally used to quantify soil C changes has not been evaluated with robust statistical analysis. In this study, 14 paired CRP (> 8 years old) and cropland sites in Dane County, Wisconsin (WI) were used to assess whether a paired-site sampling design could detect statistically significant differences (ANOVA) in mean soil organic C and total nitrogen (N) storage. We compared surface (0 to 10 cm) bulk density, and sampled soils (0 to 5, 5 to 10, and 10 to 25 cm) for textural differences and chemical analysis of organic matter (OM), soil organic C (SOC), total N, and pH. The CRP contributed to lowering soil bulk density by 13% (p < 0.0001) and increased SOC and OM storage (kg m-2) by 13 to 17% in the 0 to 5 cm layer (p = 0.1). We tested the statistical power associated with ANOVA for measured soil properties, and calculated minimum detectable differences (MDD). We concluded that 40 to 65 paired sites and soil sampling in 5 cm increments near the surface were needed to achieve 80% statistical power (α = 0.05; β = 0.20) in soil C and N sequestration rates. Because soil C and total N storage was highly variable among these sites (CVs > 20%), only a 23 to 29% change in existing total organic C and N pools could be reliably detected. While C and N sequestration (247 kg C ha-1 yr-1 and 17 kg N ha-1 yr-1) may be occurring and confined to the surface 5 cm as part of the WI CRP, our sampling design did not statistically support the desired 80% power. 
We conclude that the use of statistical power analysis is essential to ensure a high level of confidence in soil C and N sequestration rates that are quantified using paired plots.
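
    The minimum detectable difference (MDD) underlying the sample-size conclusion above has a standard normal-approximation form for a two-sample comparison. A sketch with z-values for α = 0.05 (two-sided) and 80% power; the inputs are illustrative, not the study's soil data:

```python
import math

def min_detectable_difference(sd, n_per_group, z_alpha=1.96, z_beta=0.84):
    """Approximate MDD for comparing two group means (normal approximation):
    MDD = (z_alpha + z_beta) * sd * sqrt(2 / n).
    Defaults correspond to alpha = 0.05 two-sided and 80% power."""
    return (z_alpha + z_beta) * sd * math.sqrt(2.0 / n_per_group)

# With 50 paired sites per group and unit standard deviation:
mdd = min_detectable_difference(sd=1.0, n_per_group=50)
```

    Because the MDD shrinks only as 1/sqrt(n), halving the detectable difference requires quadrupling the number of paired sites, which is why highly variable soil pools demand such large samples.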

  8. AutoBayes Program Synthesis System Users Manual

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Jafari, Hamed; Pressburger, Tom; Denney, Ewen; Buntine, Wray; Fischer, Bernd

    2008-01-01

    Program synthesis is the systematic, automatic construction of efficient executable code from high-level declarative specifications. AutoBayes is a fully automatic program synthesis system for the statistical data analysis domain; in particular, it solves parameter estimation problems. It has seen many successful applications at NASA and is currently being used, for example, to analyze simulation results for Orion. The input to AutoBayes is a concise description of a data analysis problem composed of a parameterized statistical model and a goal that is a probability term involving parameters and input data. The output is optimized and fully documented C/C++ code computing the values for those parameters that maximize the probability term. AutoBayes can solve many subproblems symbolically rather than having to rely on numeric approximation algorithms, thus yielding effective, efficient, and compact code. Statistical analysis is faster and more reliable, because effort can be focused on model development and validation rather than manual development of solution algorithms and code.

  9. Statistical approaches to account for missing values in accelerometer data: Applications to modeling physical activity.

    PubMed

    Yue Xu, Selene; Nelson, Sandahl; Kerr, Jacqueline; Godbole, Suneeta; Patterson, Ruth; Merchant, Gina; Abramson, Ian; Staudenmayer, John; Natarajan, Loki

    2018-04-01

    Physical inactivity is a recognized risk factor for many chronic diseases. Accelerometers are increasingly used as an objective means to measure daily physical activity. One challenge in using these devices is missing data due to device nonwear. We used a well-characterized cohort of 333 overweight postmenopausal breast cancer survivors to examine missing data patterns of accelerometer outputs over the day. Based on these observed missingness patterns, we created pseudo-simulated datasets with realistic missing data patterns. We developed statistical methods to design imputation and variance weighting algorithms to account for missing data effects when fitting regression models. Bias and precision of each method were evaluated and compared. Our results indicated that not accounting for missing data in the analysis yielded unstable estimates in the regression analysis. Incorporating variance weights and/or subject-level imputation improved precision by >50%, compared to ignoring missing data. We recommend that these simple, easy-to-implement statistical tools be used to improve analysis of accelerometer data.
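    One of the simple strategies the abstract motivates, subject-level imputation of nonwear gaps, can be sketched as below; the function name and data are illustrative, not taken from the study.

```python
def impute_subject_means(minutes):
    """Replace missing (None) activity counts with the subject's own mean."""
    observed = [m for m in minutes if m is not None]
    fill = sum(observed) / len(observed)
    return [fill if m is None else m for m in minutes]

# Hourly activity counts for one subject; None marks device nonwear.
day = [120, None, 80, 100, None, 60]
print(impute_subject_means(day))
```

The variance-weighting idea from the abstract would additionally down-weight days with little observed wear time when fitting the regression model.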

  10. Does objective cluster analysis serve as a useful precursor to seasonal precipitation prediction at local scale? Application to western Ethiopia

    NASA Astrophysics Data System (ADS)

    Zhang, Ying; Moges, Semu; Block, Paul

    2018-01-01

    Prediction of seasonal precipitation can provide actionable information to guide management of various sectoral activities. For instance, it is often translated into hydrological forecasts for better water resources management. However, many studies assume homogeneity in precipitation across an entire study region, which may prove ineffective for operational and local-level decisions, particularly for locations with high spatial variability. This study proposes advancing local-level seasonal precipitation predictions by first conditioning on regional-level predictions, as defined through objective cluster analysis, for western Ethiopia. To our knowledge, this is the first study predicting seasonal precipitation at high resolution in this region, where lives and livelihoods are vulnerable to precipitation variability given the high reliance on rain-fed agriculture and limited water resources infrastructure. The combination of objective cluster analysis, spatially high-resolution prediction of seasonal precipitation, and a modeling structure spanning statistical and dynamical approaches makes clear advances in prediction skill and resolution, as compared with previous studies. The statistical model improves versus the non-clustered case or dynamical models for a number of specific clusters in northwestern Ethiopia, with clusters having regional average correlation and ranked probability skill score (RPSS) values of up to 0.5 and 33 %, respectively. The general skill (after bias correction) of the two best-performing dynamical models over the entire study region is superior to that of the statistical models, although the dynamical models issue predictions at a lower resolution and the raw predictions require bias correction to guarantee comparable skills.
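    The regional grouping step can be illustrated with a toy clustering pass. The paper's exact objective-clustering algorithm may differ; a minimal one-dimensional k-means over station seasonal means stands in here, with invented rainfall values.

```python
def kmeans_1d(values, k, iters=50):
    """Tiny k-means on scalars: group stations by seasonal mean precipitation."""
    step = max(1, len(values) // k)
    centers = sorted(values)[::step][:k]  # spread the initial centers
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda j: abs(v - centers[j]))
            groups[nearest].append(v)
        # Move each center to the mean of its group (keep it if the group is empty).
        centers = [sum(g) / len(g) if g else centers[i] for i, g in enumerate(groups)]
    return centers, groups

# Invented seasonal means (mm/day) for six stations: two wet, clearly separated regimes.
means = [2.1, 2.3, 2.2, 8.0, 7.8, 8.3]
centers, groups = kmeans_1d(means, 2)
print(sorted(round(c, 2) for c in centers))
```

Predictions conditioned on such cluster membership can then be issued per region rather than assuming one homogeneous field.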

  11. Poisson statistics of PageRank probabilities of Twitter and Wikipedia networks

    NASA Astrophysics Data System (ADS)

    Frahm, Klaus M.; Shepelyansky, Dima L.

    2014-04-01

    We use the methods of quantum chaos and Random Matrix Theory for the analysis of statistical fluctuations of PageRank probabilities in directed networks. In this approach, the effective energy levels are given by the logarithm of the PageRank probability at a given node. After the standard energy-level unfolding procedure, we establish that the nearest-spacing distribution of PageRank probabilities is described by the Poisson law typical of integrable quantum systems. Our studies are done for the Twitter network and three networks of Wikipedia editions in English, French and German. We argue that, due to the absence of level repulsion, the PageRank order of nearby nodes can be easily interchanged. The obtained Poisson law implies that nearby PageRank probabilities fluctuate as random independent variables.
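    The spacing analysis can be sketched in simplified form: map PageRank probabilities to effective energies E = -log(P), sort them, and normalize the nearest-neighbor gaps to unit mean. This global normalization is a crude stand-in for the full unfolding procedure used in the paper, and the probabilities below are invented.

```python
from math import log

def normalized_spacings(pageranks):
    """Nearest-neighbor spacings of E = -log(P), rescaled to unit mean."""
    energies = sorted(-log(p) for p in pageranks)
    gaps = [b - a for a, b in zip(energies, energies[1:])]
    mean_gap = sum(gaps) / len(gaps)
    return [g / mean_gap for g in gaps]

s = normalized_spacings([0.30, 0.22, 0.18, 0.12, 0.10, 0.08])
print(round(sum(s) / len(s), 6))  # unit mean by construction
```

With real network data, a histogram of these spacings would be compared against the Poisson law p(s) = exp(-s).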

  12. Overall voice and strain level analysis in rock singers.

    PubMed

    Gonsalves, Aline; Amin, Elisabeth; Behlau, Mara

    2010-01-01

    Overall voice and strain level analysis in rock singers. To analyze the voice of rock singers according to two specific parameters, overall level of vocal deviation (OLVD) and strain level (SL), and to compare these parameters in three different music samples. Participants were 26 male rock singers, ranging in age from 17 to 46 years (mean = 29.8 years). All of the participants answered a questionnaire for sample characterization and recorded three voice samples: the Brazilian National Anthem (BNA), Satisfaction, and a self-selected repertoire song (RS). Voice samples were analyzed by five speech-language pathologists according to OLVD and SL. Statistical analysis was done using SPSS, version 13.0. Statistically significant differences were observed for the mean values of OLVD and SL during the performance of Satisfaction (OLVD = 32.8 and SL = 0.024 / p = 0.024) and during the RS performance (OLVD = 38.4 and SL = 55.8 / p = 0.010). The values of OLVD and SL are directly proportional in the samples of the BNA* and RS**, i.e., the higher the strain, the higher the OLVD (p < 0.001*; p = 0.010**). When the three song samples are analyzed individually, the OLVD does not vary significantly among them; however, the mean values show a trend to increase from non-rock to rock performances (24.0 BNA / 32.8 Satisfaction / 38.4 RS). The level of strain found during the BNA performance differs significantly from that of the rock performances (Satisfaction and RS; p = 0.008 and p = 0.001). The obtained data suggest that the rock style is related to greater use of vocal strain and that this strain does not necessarily impose a negative impression on the voice, but corresponds to a common interpretative factor of this style of music.

  13. Factors determining access to oral health services among children aged less than 12 years in Peru.

    PubMed

    Azañedo, Diego; Hernández-Vásquez, Akram; Casas-Bendezú, Mixsi; Gutiérrez, César; Agudelo-Suárez, Andrés A; Cortés, Sandra

    2017-01-01

    Background: Understanding problems of access to oral health services requires knowledge of the factors that determine access. This study aimed to evaluate factors that determine access to oral health services among children aged <12 years in Peru between 2014 and 2015. Methods: We performed a secondary data analysis of 71,614 Peruvian children aged <12 years and their caregivers. Data were obtained from the Survey on Demography and Family Health 2014-2015 (Encuesta Demográfica y de Salud Familiar - ENDES). Children's access to oral health services within the previous 6 months was used as the dependent variable (i.e., Yes/No), and the Andersen et al. model was used to select independent variables. Predisposing factors (e.g., language spoken by the tutor or guardian, wealth level, caregivers' educational level, area of residence, natural region of residence, age, and sex) and enabling factors (e.g., type of health insurance) were considered. Descriptive statistics were calculated, and multivariate analysis was performed using generalized linear models (Poisson family). Results: Of all the children, 51% were males, 56% were aged <5 years, and 62.6% lived in urban areas. The most common type of health insurance was Integral Health Insurance (57.8%), and most respondents were in the first quintile of wealth (31.6%). Regarding caregivers, the most common educational level was high school (43.02%) and the most frequently spoken language was Spanish (88.4%). Univariate analysis revealed that all variables, except sex and primary educational level, were statistically significant. After adjustment, sex, area of residence, and language were not significant, whereas the remaining variables were statistically significant. Conclusions: Wealth index, caregivers' education level, natural region of residence, age, and type of health insurance are factors that determine access to oral health services among children aged <12 years in Peru. These factors should be considered when devising strategies to mitigate inequities in access to oral health services.

  14. Factors determining access to oral health services among children aged less than 12 years in Peru

    PubMed Central

    Azañedo, Diego; Hernández-Vásquez, Akram; Casas-Bendezú, Mixsi; Gutiérrez, César; Agudelo-Suárez, Andrés A.; Cortés, Sandra

    2017-01-01

    Background: Understanding problems of access to oral health services requires knowledge of the factors that determine access. This study aimed to evaluate factors that determine access to oral health services among children aged <12 years in Peru between 2014 and 2015. Methods: We performed a secondary data analysis of 71,614 Peruvian children aged <12 years and their caregivers. Data were obtained from the Survey on Demography and Family Health 2014-2015 (Encuesta Demográfica y de Salud Familiar - ENDES). Children’s access to oral health services within the previous 6 months was used as the dependent variable (i.e., Yes/No), and the Andersen et al. model was used to select independent variables. Predisposing factors (e.g., language spoken by the tutor or guardian, wealth level, caregivers’ educational level, area of residence, natural region of residence, age, and sex) and enabling factors (e.g., type of health insurance) were considered. Descriptive statistics were calculated, and multivariate analysis was performed using generalized linear models (Poisson family). Results: Of all the children, 51% were males, 56% were aged <5 years, and 62.6% lived in urban areas. The most common type of health insurance was Integral Health Insurance (57.8%), and most respondents were in the first quintile of wealth (31.6%). Regarding caregivers, the most common educational level was high school (43.02%) and the most frequently spoken language was Spanish (88.4%). Univariate analysis revealed that all variables, except sex and primary educational level, were statistically significant. After adjustment, sex, area of residence, and language were not significant, whereas the remaining variables were statistically significant. Conclusions: Wealth index, caregivers’ education level, natural region of residence, age, and type of health insurance are factors that determine access to oral health services among children aged <12 years in Peru. These factors should be considered when devising strategies to mitigate inequities in access to oral health services. PMID:29527289

  15. GIS and statistical analysis for landslide susceptibility mapping in the Daunia area, Italy

    NASA Astrophysics Data System (ADS)

    Mancini, F.; Ceppi, C.; Ritrovato, G.

    2010-09-01

    This study focuses on landslide susceptibility mapping in the Daunia area (Apulian Apennines, Italy) and achieves this by using a multivariate statistical method and data processing in a Geographical Information System (GIS). The Logistic Regression (hereafter LR) method was chosen to produce a susceptibility map over an area of 130 000 ha where small settlements are historically threatened by landslide phenomena. By means of LR analysis, the tendency to landslide occurrences was, therefore, assessed by relating a landslide inventory (dependent variable) to a series of causal factors (independent variables) which were managed in the GIS, while the statistical analyses were performed by means of the SPSS (Statistical Package for the Social Sciences) software. The LR analysis produced a reliable susceptibility map of the investigated area and the probability level of landslide occurrence was ranked in four classes. The overall performance achieved by the LR analysis was assessed by local comparison between the expected susceptibility and an independent dataset extrapolated from the landslide inventory. Of the samples classified as susceptible to landslide occurrences, 85% correspond to areas where landslide phenomena have actually occurred. In addition, the consideration of the regression coefficients provided by the analysis demonstrated that a major role is played by the "land cover" and "lithology" causal factors in determining the occurrence and distribution of landslide phenomena in the Apulian Apennines.
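    A minimal sketch of the logistic-regression step follows. The study used SPSS on a full landslide inventory; here a tiny stochastic-gradient fit on invented data stands in, with features loosely named after two of the causal factors the abstract highlights.

```python
from math import exp

def sigmoid(z):
    # Clamp to avoid overflow in exp() for large |z|.
    if z < -30:
        return 0.0
    if z > 30:
        return 1.0
    return 1.0 / (1.0 + exp(-z))

def fit_logistic(X, y, lr=0.5, epochs=2000):
    """Plain stochastic gradient descent on the logistic log-loss."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

# Toy inventory: [slope (normalized), clay fraction] -> 1 = landslide occurred.
X = [[0.9, 0.8], [0.8, 0.7], [0.2, 0.3], [0.1, 0.2]]
y = [1, 1, 0, 0]
w, b = fit_logistic(X, y)
p_high = sigmoid(sum(wj * xj for wj, xj in zip(w, [0.85, 0.75])) + b)
print(p_high)
```

The fitted probabilities can then be binned into susceptibility classes, as the paper does with its four-class map.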

  16. Statistical model specification and power: recommendations on the use of test-qualified pooling in analysis of experimental data

    PubMed Central

    Colegrave, Nick

    2017-01-01

    A common approach to the analysis of experimental data across much of the biological sciences is test-qualified pooling. Here non-significant terms are dropped from a statistical model, effectively pooling the variation associated with each removed term with the error term used to test hypotheses (or estimate effect sizes). This pooling is carried out only if statistical testing of a previous, more complicated model fitted to the same data provides motivation for the simplification; hence the pooling is test-qualified. In pooling, the researcher increases the degrees of freedom of the error term with the aim of increasing statistical power to test their hypotheses of interest. Despite this approach being widely adopted and explicitly recommended by some of the most widely cited statistical textbooks aimed at biologists, here we argue that (except in highly specialized circumstances that we can identify) the hoped-for improvement in statistical power will be small or non-existent, and there is likely to be much reduced reliability of the statistical procedures through deviation of type I error rates from nominal levels. We thus call for greatly reduced use of test-qualified pooling across experimental biology, more careful justification of any use that continues, and a different philosophy for initial selection of statistical models in the light of this change in procedure. PMID:28330912

  17. Lack of evidence for meteorological effects on infradian dynamics of testosterone

    NASA Astrophysics Data System (ADS)

    Celec, Peter; Smreková, Lucia; Ostatníková, Daniela; Čabajová, Zlata; Hodosy, Július; Kúdela, Matúš

    2009-09-01

    Climatic factors are known to influence the endocrine system. Previous studies have shown that circannual seasonal variations of testosterone might be partly explained by changes in air temperature. Whether infradian variations are affected by meteorological factors is unknown. To analyze possible effects of meteorological parameters on infradian variations of salivary testosterone levels in both sexes, daily salivary testosterone levels were measured during 1 month in 14 men and 17 women. A correlation analysis between hormonal levels and selected meteorological parameters was performed. The results indicate that high testosterone levels are loosely associated with cold, sunny and dry weather in both sexes. However, only the correlations between testosterone and air temperature (men) and actual cloudiness (women) were statistically significant (p < 0.05). Although some correlations reached the level of statistical significance, the effects of selected meteorological parameters on salivary testosterone levels remain unclear. Further longer-term studies concentrating on air temperature, cloudiness and average relative humidity in relation to the sex hormone axis are needed.

  18. Effect of censoring trace-level water-quality data on trend-detection capability

    USGS Publications Warehouse

    Gilliom, R.J.; Hirsch, R.M.; Gilroy, E.J.

    1984-01-01

    Monte Carlo experiments were used to evaluate whether trace-level water-quality data that are routinely censored (not reported) contain valuable information for trend detection. Measurements are commonly censored if they fall below a level associated with some minimum acceptable level of reliability (detection limit). Trace-level organic data were simulated with best- and worst-case estimates of measurement uncertainty, various concentrations and degrees of linear trend, and different censoring rules. The resulting classes of data were subjected to a nonparametric statistical test for trend. For all classes of data evaluated, trends were detected more effectively in uncensored data than in censored data, even when the censored data were highly unreliable. Thus, censoring data at any concentration level may eliminate valuable information. Whether valuable information for trend analysis is, in fact, eliminated by censoring of actual rather than simulated data depends on whether the analytical process is in statistical control and bias is predictable for a particular type of chemical analysis.
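    The effect the abstract reports can be sketched with a Mann-Kendall S statistic, a common nonparametric trend measure (the paper does not name its exact test, so this is illustrative): substituting values below a detection limit with the limit itself, one common censoring convention, creates ties that weaken the trend signal. The series below is invented.

```python
def mann_kendall_s(x):
    """Mann-Kendall S: sum of sign(x[j] - x[i]) over all pairs i < j."""
    s = 0
    for i in range(len(x)):
        for j in range(i + 1, len(x)):
            s += (x[j] > x[i]) - (x[j] < x[i])
    return s

series = [0.2, 0.3, 0.5, 0.4, 0.7, 0.9, 1.1, 1.0]
# Values below a 0.5 detection limit reported as the limit (one common convention):
censored = [max(v, 0.5) for v in series]
print(mann_kendall_s(series), mann_kendall_s(censored))
```

The censored series yields a smaller S, i.e. weaker evidence for the same underlying upward trend.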

  19. Investigating the intrinsic and extrinsic work values of 10th grade students in science-oriented charter schools

    NASA Astrophysics Data System (ADS)

    Ozer, Ozgur

    The purpose of this study was to investigate to what extent gender, achievement level, and income level predict the intrinsic and extrinsic work values of 10th grade students. The study explored whether group differences were good predictors of scores in work values. The research was a descriptive, cross-sectional study conducted on 131 10th graders who attended science-oriented charter schools. Students took Super's Work Values Instrument, a Likert-type test that links to 15 work values, which can be categorized as intrinsic and extrinsic values (Super, 1970). Multiple regression analysis was employed as the main analysis, followed by ANCOVA. Multiple regression analysis results indicated that there is evidence that 8.9% of the variance in intrinsic work values and 10.2% of the variance in extrinsic work values can be explained by the independent variables (p < .05). Achievement Level and Income Level may help predict intrinsic work value scores; Achievement Level may also help predict extrinsic work values. Achievement Level was the covariate in ANCOVA. Results indicated that males (M = .174) in this sample have a higher mean of extrinsic work values than that of females (M = -.279). However, there was no statistically significant difference between the intrinsic work values by gender. One possible interpretation of this might be school choice; students in these science-oriented charter schools may have higher intrinsic work values regardless of gender. Results indicated that there was no statistically significant difference among the means of extrinsic work values by income level (p < .05). However, free lunch students (M = .268) have a higher mean of intrinsic work values than that of paid lunch students (M = -.279). A possible interpretation of this might be that lower income students benefit greatly from the intrinsic work values in overcoming obstacles. Further research is needed in each of these areas. The study produced statistically significant results with little practical significance. Students, parents, teachers, and counselors may still be advised to consider the work value orientations of students during the career choice process.

  20. A Three Dimensional Kinematic and Kinetic Study of the Golf Swing

    PubMed Central

    Nesbit, Steven M.

    2005-01-01

    This paper discusses the three-dimensional kinematics and kinetics of a golf swing as performed by 85 amateur subjects (84 male, 1 female) of various skill levels. The analysis was performed using a variable full-body computer model of a human coupled with a flexible model of a golf club. Data to drive the model were obtained from subject swings recorded using a multi-camera motion analysis system. Model output included club trajectories, golfer/club interaction forces and torques, work and power, and club deflections. These data formed the basis for a statistical analysis of all subjects, and a detailed analysis and comparison of the swing characteristics of four of the subjects. The analysis generated much new data concerning the mechanics of the golf swing. It revealed that a golf swing is a highly coordinated and individual motion and that subject-to-subject variations were significant. The study highlighted the importance of the wrists in generating club head velocity and orienting the club face. The trajectory of the hands and the ability to do work were the factors most closely related to skill level. Key Points: Full-body model of the golf swing. Mechanical description of the golf swing. Statistical analysis of golf swing mechanics. Comparisons of subject swing mechanics. PMID:24627665

  1. A three dimensional kinematic and kinetic study of the golf swing.

    PubMed

    Nesbit, Steven M

    2005-12-01

    This paper discusses the three-dimensional kinematics and kinetics of a golf swing as performed by 85 amateur subjects (84 male, 1 female) of various skill levels. The analysis was performed using a variable full-body computer model of a human coupled with a flexible model of a golf club. Data to drive the model were obtained from subject swings recorded using a multi-camera motion analysis system. Model output included club trajectories, golfer/club interaction forces and torques, work and power, and club deflections. These data formed the basis for a statistical analysis of all subjects, and a detailed analysis and comparison of the swing characteristics of four of the subjects. The analysis generated much new data concerning the mechanics of the golf swing. It revealed that a golf swing is a highly coordinated and individual motion and that subject-to-subject variations were significant. The study highlighted the importance of the wrists in generating club head velocity and orienting the club face. The trajectory of the hands and the ability to do work were the factors most closely related to skill level. Key Points: Full-body model of the golf swing. Mechanical description of the golf swing. Statistical analysis of golf swing mechanics. Comparisons of subject swing mechanics.

  2. [The application of the multidimensional statistical methods in the evaluation of the influence of atmospheric pollution on the population's health].

    PubMed

    Surzhikov, V D; Surzhikov, D V

    2014-01-01

    The search for, and measurement of, causal relationships between exposure to air pollution and the health state of the population is based on systems analysis and risk assessment to improve the quality of research. For this purpose, modern statistical analysis is applied, using tests of independence, principal component analysis, and discriminant function analysis. The analysis separated four main principal components from the full set of atmospheric pollutants: for diseases of the circulatory system, the main principal component is associated with concentrations of suspended solids, nitrogen dioxide, carbon monoxide, and hydrogen fluoride; for respiratory diseases, the main principal component is closely associated with suspended solids, sulfur dioxide, nitrogen dioxide, and charcoal black. The discriminant function was shown to be usable as a measure of the level of air pollution.

  3. Spatio-temporal hierarchical modeling of rates and variability of Holocene sea-level changes in the western North Atlantic and the Caribbean

    NASA Astrophysics Data System (ADS)

    Ashe, E.; Kopp, R. E.; Khan, N.; Horton, B.; Engelhart, S. E.

    2016-12-01

    Sea level varies over both space and time. Prior to the instrumental period, the sea-level record depends upon geological reconstructions that contain vertical and temporal uncertainty. Spatio-temporal statistical models enable the interpretation of RSL and rates of change, as well as the reconstruction of the entire sea-level field, from such noisy data. Hierarchical models explicitly distinguish between a process level, which characterizes the spatio-temporal field, and a data level, at which sparse, noisy proxy data are recorded. A hyperparameter level depicts prior expectations about the structure of variability in the spatio-temporal field. Spatio-temporal hierarchical models are amenable to several analysis approaches, with tradeoffs regarding computational efficiency and comprehensiveness of uncertainty characterization. A fully Bayesian hierarchical model (BHM), which places prior probability distributions upon the hyperparameters, is more computationally intensive than an empirical hierarchical model (EHM), which uses point estimates of hyperparameters derived from the data [1]. Here, we assess the sensitivity of posterior estimates of relative sea level (RSL) and rates to different statistical approaches by varying prior assumptions about the spatial and temporal structure of sea-level variability and applying multiple analytical approaches to Holocene sea-level proxies along the Atlantic coast of North America and the Caribbean [2]. References: 1. Cressie N, Wikle CK (2011) Statistics for spatio-temporal data (John Wiley & Sons). 2. Khan N et al. (2016). Quaternary Science Reviews (in revision).
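    The data-level idea (proxies with larger vertical uncertainty should carry less weight) can be illustrated far more simply than the paper's spatio-temporal hierarchical model, for instance with an inverse-variance weighted slope estimate. All values below are invented.

```python
def weighted_rate(ages, rsl, sigmas):
    """Inverse-variance weighted least-squares slope (meters per unit age)."""
    w = [1.0 / s ** 2 for s in sigmas]
    W = sum(w)
    tbar = sum(wi * t for wi, t in zip(w, ages)) / W
    ybar = sum(wi * y for wi, y in zip(w, rsl)) / W
    num = sum(wi * (t - tbar) * (y - ybar) for wi, t, y in zip(w, ages, rsl))
    den = sum(wi * (t - tbar) ** 2 for wi, t in zip(w, ages))
    return num / den

# Ages in ka BP, RSL in meters relative to present, 1-sigma vertical errors:
ages = [8.0, 6.0, 4.0, 2.0]
rsl = [-12.0, -8.0, -4.5, -1.5]
print(weighted_rate(ages, rsl, [0.5, 0.5, 0.5, 0.5]))
```

A hierarchical model generalizes this in both directions: the process level replaces the straight line with a spatio-temporal field, and the hyperparameter level governs how strongly that field is expected to vary.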

  4. Comparative Evaluation of C-reactive Proteins in Pregnant Women with and without Periodontal Pathologies: A Prospective Cohort Analysis.

    PubMed

    Mannava, Padmakanth; Gokhale, Sunil; Pujari, Sudarshan; Biswas, Krishna P; Kaliappan, Satish; Vijapure, Shashank

    2016-06-01

    Inflammation of the tooth-supporting structures is referred to as periodontitis. C-reactive protein (CRP) levels are usually increased in chronic inflammatory processes such as periodontitis. An association of CRP with adverse pregnancy outcomes, most commonly preterm delivery and preeclampsia, has been observed in the past. Therefore, it can be hypothesized that CRP may act as a link between periodontitis and adverse pregnancy outcomes. Hence, we aimed to evaluate plasma CRP levels in pregnant women with and without periodontal pathologies. The study included 210 pregnant women who reported to the hospital with periodontal problems or for routine checkups. All of the patients were divided into three groups based on the presence or absence of periodontal pathologies, and Russell's Periodontal Index Score was used to evaluate the periodontal status of the subjects. CRP levels were estimated from blood samples, and paired t-tests and one-way analysis of variance were used to assess the correlation between the two parameters. Statistically significant results were obtained when comparing the mean CRP levels across the three study groups, and when comparing the mean CRP levels in group C patients before and after treatment. A causal association might exist between CRP levels and periodontal disease in pregnant women, and CRP levels may also be elevated in pregnant women.

  5. C-reactive protein as a predictor of chorioamnionitis.

    PubMed

    Smith, Erik J; Muller, Corinna L; Sartorius, Jennifer A; White, David R; Maslow, Arthur S

    2012-10-01

    Chorioamnionitis (CAM) affects many pregnancies complicated by preterm premature rupture of membranes (PPROM). Finding a serum factor that could accurately predict the presence of CAM could potentially lead to more efficient management of PPROM and improved neonatal outcomes. To determine if C-reactive protein (CRP) is an effective early marker of CAM in patients with PPROM. A retrospective evaluation of pregnant women with PPROM at Geisinger Medical Center in Danville, Pennsylvania, between January 2005 and January 2009. Nonparametric statistical tests (ie, Wilcoxon rank sum and Spearman rank correlation) were used to compare distributions that were skewed. Characteristics of the study population were compared using 2-sample t tests for continuous variables and Fisher exact tests for discrete variables. Logistic regression analysis was used to generate receiver operating characteristic curves and obtain area under the curve estimates in stepwise fashion for predicting histologic CAM. A secondary analysis compared the characteristics among patients with clinical CAM, histologic CAM, or non-CAM. The total population of 73 women was subdivided into patients with histologic CAM (n=26) and patients without histologic CAM (ie, no evidence of CAM on placental pathology; n=47). There was no difference between groups in CRP levels, days of pregnancy latency, white blood cell count, smoking status, antibiotic administration, or steroid benefit. The group with histologic CAM delivered at earlier gestational ages: mean (standard deviation) age was 29.5 (4.4) weeks vs 31.9 (3.5) weeks (P=.02). For our primary analysis, we found no difference in CRP levels (P=.32). Receiver operating characteristic curve plots of CRP levels, temperature at delivery, and white blood cell count resulted in an area under the curve estimate of 0.696, which was 70% predictive of histologic CAM. 
In the secondary analysis, after adjusting for gestational age, the estimated hazard ratio for CRP change was 1.05 (95% confidence interval, 1.02-1.08; P=.001). Therefore, increasing CRP levels from PPROM was statistically significant in predicting clinical CAM development over time. C-reactive protein levels were not effective independent predictors of clinical or histologic CAM, nor was sequential CRP testing statistically significant for the identification of clinical or histologic CAM in patients with PPROM.

  6. Method of analysis of local neuronal circuits in the vertebrate central nervous system.

    PubMed

    Reinis, S; Weiss, D S; McGaraughty, S; Tsoukatos, J

    1992-06-01

    Although a considerable amount of knowledge has been accumulated about the activity of individual nerve cells in the brain, little is known about their mutual interactions at the local level. The method presented in this paper allows the reconstruction of functional relations within a group of neurons as recorded by a single microelectrode. Data are sampled at 10 or 13 kHz. Prominent spikes produced by one or more single cells are selected and sorted by K-means cluster analysis. The activities of single cells are then related to the background firing of neurons in their vicinity. Auto-correlograms of the leading cells, auto-correlograms of the background cells (mass correlograms) and cross-correlograms between these two levels of firing are computed and evaluated. The statistical probability of mutual interactions is determined, and the statistically significant, most common interspike intervals are stored and attributed to real pairs of spikes in the original record. Selected pairs of spikes, characterized by statistically significant intervals between them, are then assembled into a working model of the system. This method has revealed substantial differences between the information processing in the visual cortex, the inferior colliculus, the rostral ventromedial medulla and the ventrobasal complex of the thalamus. Even short 1-s records of the multiple neuronal activity may provide meaningful and statistically significant results.
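    The correlogram step of the method can be sketched as a lag histogram between two spike trains; the spike times and bin settings below are invented for illustration, using integer milliseconds to keep the binning exact.

```python
def cross_correlogram(ref, target, window=50, bin_width=10):
    """Histogram of target-minus-reference spike-time lags (ms) within +/-window."""
    n_bins = (2 * window) // bin_width
    counts = [0] * n_bins
    for r in ref:
        for t in target:
            lag = t - r
            if -window <= lag < window:
                counts[(lag + window) // bin_width] += 1
    return counts

# Spike times in ms: the target cell tends to fire 10-20 ms after the leading cell.
ref = [100, 300, 500]
target = [110, 310, 520, 900]
print(cross_correlogram(ref, target))
```

A peak just after zero lag, as here, is the kind of feature whose statistical significance the method then evaluates before assembling cell pairs into a circuit model.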

  7. Immunohistochemical Analysis of the Role of Connective Tissue Growth Factor in Drug-induced Gingival Overgrowth in Response to Phenytoin, Cyclosporine, and Nifedipine

    PubMed Central

    Anand, A. J.; Gopalakrishnan, Sivaram; Karthikeyan, R.; Mishra, Debasish; Mohapatra, Shreeyam

    2018-01-01

    Objective: To evaluate the presence of connective tissue growth factor (CTGF) in drug (phenytoin, cyclosporine, and nifedipine)-induced gingival overgrowth (DIGO) and to compare it with healthy controls in the absence of overgrowth. Materials and Methods: Thirty-five patients were chosen for the study and segregated into study (25) and control (10) groups. The study group consisted of phenytoin-induced (10), cyclosporine-induced (10), and nifedipine-induced (5) gingival overgrowth. After completing the necessary medical evaluations, a biopsy was done. The tissue samples were fixed in 10% formalin and then immunohistochemically evaluated for the presence of CTGF. Statistical analysis of the values was done using the statistical package SPSS PC+ (Statistical Package for the Social Sciences, version 4.01). Results: The immunohistochemistry shows that DIGO samples express more CTGF than the control group, and that phenytoin induces the most CTGF, followed by nifedipine and cyclosporine. Conclusion: The study shows that there is an increase in the levels of CTGF in patients with DIGO in comparison to the control group without any gingival overgrowth. Comparing the levels of CTGF in DIGO induced by the three most commonly used drugs, phenytoin, cyclosporine, and nifedipine, we find that cyclosporine induces the least CTGF. Therefore, it might be a more viable drug choice with reduced side effects. PMID:29629324

  8. The value of hypercalciuria in patients with osteopenia versus osteoporosis.

    PubMed

    Girón-Prieto, María Sierra; Del Carmen Cano-García, María; Poyatos-Andújar, Antonio; Arias-Santiago, Salvador; de Haro-Muñoz, Tomás; Arrabal-Martín, Miguel; Arrabal-Polo, Miguel Ángel

    2017-06-01

    The aim of this study was to analyze the presence of lithogenic metabolic factors in the blood and urine of patients with osteopenia versus osteoporosis. This is a cross-sectional study including 67 patients who were divided into two groups according to the presence of either osteopenia or osteoporosis as measured by bone densitometry: group 1-40 patients with osteopenia (22 men and 18 women) and group 2-27 patients with osteoporosis (13 men and 14 women). Metabolic studies were performed on the blood and urine; statistical analysis was performed comparing means and conducting linear correlation and multivariate analyses with SPSS. Statistical significance was considered to be p ≤ 0.05. The mean age of patients in group 1 was 52.9 ± 12.8 years versus 50.3 ± 11.4 in group 2; the difference was not statistically significant. In group 2, higher levels of osteocalcin, β-crosslaps, urinary calcium, fasting urine calcium/creatinine, 24 h urine calcium/creatinine and 24 h oxaluria were observed compared to group 1. In the multivariate analysis, only the β-crosslaps and urinary calcium were independently associated with osteoporosis. It would be advisable to determine the urinary calcium levels in patients with osteoporosis since altered levels may necessitate modifying the diagnostic and therapeutic approach to osteoporosis.

  9. Assessing the Primary Schools--A Multi-Dimensional Approach: A School Level Analysis Based on Indian Data

    ERIC Educational Resources Information Center

    Sengupta, Atanu; Pal, Naibedya Prasun

    2012-01-01

    Primary education is essential for the economic development in any country. Most studies give more emphasis to the final output (such as literacy, enrolment etc.) rather than the delivery of the entire primary education system. In this paper, we study the school level data from an Indian district, collected under the official DISE statistics. We…

  10. Investing in Upskilling: Gains for Individuals, Employers and Government. In Focus: Benefit Receipt Payments

    ERIC Educational Resources Information Center

    Murray, Scott; Shillington, Richard

    2012-01-01

    Examining costs and savings associated with moving every Canadian with a Literacy Level 1 or 2 (on the international literacy scale) to Level 3, this analysis is based upon statistically matched data from the "2003 International Adult Literacy and Skills Survey and the 2005-2009 Surveys of Labour and Income Dynamics." The methods provide…

  11. Principals' Time, Tasks, and Professional Development: An Analysis of Schools and Staffing Survey Data. REL 2017-201

    ERIC Educational Resources Information Center

    Lavigne, Heather J.; Shakman, Karen; Zweig, Jacqueline; Greller, Sara L.

    2016-01-01

    This study describes how principals reported spending their time and what professional development they reported participating in, based on data collected through the Schools and Staffing Survey by the National Center for Education Statistics during the 2011/12 school year. The study analyzes schools by grade level, poverty level, and within…

  12. Dental Composite Restorations and Neuropsychological Development in Children: Treatment Level Analysis from a Randomized Clinical Trial

    PubMed Central

    Maserejian, Nancy N.; Trachtenberg, Felicia L.; Hauser, Russ; McKinlay, Sonja; Shrader, Peter; Bellinger, David C.

    2012-01-01

    Background Resin-based dental restorations may intra-orally release their components and bisphenol A. Gestational bisphenol A exposure has been associated with poorer executive functioning in children. Objectives To examine whether exposure to resin-based composite restorations is associated with neuropsychological development in children. Methods Secondary analysis of treatment level data from the New England Children’s Amalgam Trial, a 2-group randomized safety trial conducted from 1997–2006. Children (N=534) aged 6–10 y with >2 posterior tooth caries were randomized to treatment with amalgam or resin-based composites (bisphenol-A-diglycidyl-dimethacrylate-composite for permanent teeth; urethane dimethacrylate-based polyacid-modified compomer for primary teeth). Neuropsychological function at 4- and 5-year follow-up (N=444) was measured by a battery of tests of executive function, intelligence, memory, visual-spatial skills, verbal fluency, and problem-solving. Multivariable generalized linear regression models were used to examine the association between composite exposure levels and changes in neuropsychological test scores from baseline to follow-up. For comparison, data on children randomized to amalgam treatment were similarly analyzed. Results With greater exposure to either dental composite material, results were generally consistent in the direction of slightly poorer changes in tests of intelligence, achievement or memory, but there were no statistically significant associations. For the four primary measures of executive function, scores were slightly worse with greater total composite exposure, but statistically significant only for the test of Letter Fluency (10-surface-years β= −0.8, SE=0.4, P=0.035), and the subtest of color naming (β= −1.5, SE=0.5, P=0.004) in the Stroop Color-Word Interference Test. 
Multivariate analysis of variance confirmed that the negative associations between composite level and executive function were not statistically significant (MANOVA P=0.18). Results for greater amalgam exposure were mostly nonsignificant in the opposite direction of slightly improved scores over follow-up. Conclusions Dental composite restorations had statistically insignificant associations of small magnitude with impairments in neuropsychological test change scores over 4- or 5-years of follow-up in this trial. PMID:22906860

  13. Estimated association between dwelling soil contamination and internal radiation contamination levels after the 2011 Fukushima Daiichi nuclear accident in Japan

    PubMed Central

    Tsubokura, Masaharu; Nomura, Shuhei; Sakaihara, Kikugoro; Kato, Shigeaki; Leppold, Claire; Furutani, Tomoyuki; Morita, Tomohiro; Oikawa, Tomoyoshi; Kanazawa, Yukio

    2016-01-01

    Objectives Measurement of soil contamination levels has been considered a feasible method for dose estimation of internal radiation exposure following the Chernobyl disaster by means of aggregate transfer factors; however, it is still unclear whether the estimation of internal contamination based on soil contamination levels is universally valid or incident specific. Methods To address this issue, we evaluated relationships between in vivo and soil cesium-137 (Cs-137) contamination using data on internal contamination levels among residents of Minamisoma, Fukushima (10–40 km north of the Fukushima Daiichi nuclear power plant) 2–3 years following the disaster, and constructed three models for statistical analysis based on continuous and categorical (equal intervals and quantiles) soil contamination levels. Results A total of 7987 people with a mean age of 55.4 years underwent screening of in vivo Cs-137 whole-body counting. A statistically significant association was noted between internal and continuous Cs-137 soil contamination levels (model 1, p value <0.001), although the association was slight (relative risk (RR): 1.03 per 10 kBq/m2 increase in soil contamination). Analysis of categorical soil contamination levels showed statistical (but not clinical) significance only in relatively higher soil contamination levels (model 2: Cs-137 levels above 100 kBq/m2 compared to those <25 kBq/m2, RR=1.75, p value <0.01; model 3: levels above 63 kBq/m2 compared to those <11 kBq/m2, RR=1.45, p value <0.05). Conclusions Low levels of internal and soil contamination were not associated, and only loose/small associations were observed in areas with slightly higher levels of soil contamination in Fukushima, representing a clear difference from the strong associations found in post-disaster Chernobyl. 
These results indicate that soil contamination levels generally do not contribute to the internal contamination of residents in Fukushima; thus, individual measurements are essential for the precise evaluation of chronic internal radiation contamination. PMID:27357196

  14. A Statistical Method for Synthesizing Mediation Analyses Using the Product of Coefficient Approach Across Multiple Trials

    PubMed Central

    Huang, Shi; MacKinnon, David P.; Perrino, Tatiana; Gallo, Carlos; Cruden, Gracelyn; Brown, C Hendricks

    2016-01-01

    Mediation analysis often requires larger sample sizes than main effect analysis to achieve the same statistical power. Combining results across similar trials may be the only practical option for increasing statistical power for mediation analysis in some situations. In this paper, we propose a method to estimate: 1) marginal means for mediation path a, the relation of the independent variable to the mediator; 2) marginal means for path b, the relation of the mediator to the outcome, across multiple trials; and 3) the between-trial level variance-covariance matrix based on a bivariate normal distribution. We present the statistical theory and an R computer program to combine regression coefficients from multiple trials to estimate a combined mediated effect and confidence interval under a random effects model. Values of coefficients a and b, along with their standard errors from each trial are the input for the method. This marginal likelihood based approach with Monte Carlo confidence intervals provides more accurate inference than the standard meta-analytic approach. We discuss computational issues, apply the method to two real-data examples and make recommendations for the use of the method in different settings. PMID:28239330
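
    A heavily simplified sketch of the combination step may help fix ideas. The example below pools per-trial estimates of path a and path b by fixed-effect inverse-variance weighting and builds a Monte Carlo confidence interval for the product; this is not the paper's random-effects, marginal-likelihood estimator, and all coefficients and standard errors are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

# Per-trial path coefficients and standard errors (hypothetical values):
# a = independent variable -> mediator, b = mediator -> outcome.
a = np.array([0.40, 0.35, 0.50]); se_a = np.array([0.10, 0.12, 0.09])
b = np.array([0.30, 0.25, 0.28]); se_b = np.array([0.08, 0.10, 0.07])

def pool(est, se):
    """Fixed-effect inverse-variance pooled estimate and its standard error."""
    w = 1.0 / se**2
    return (w * est).sum() / w.sum(), np.sqrt(1.0 / w.sum())

a_pool, se_a_pool = pool(a, se_a)
b_pool, se_b_pool = pool(b, se_b)
ab = a_pool * b_pool  # combined mediated effect

# Monte Carlo confidence interval for the product a*b.
draws = rng.normal(a_pool, se_a_pool, 100_000) * rng.normal(b_pool, se_b_pool, 100_000)
ci_low, ci_high = np.percentile(draws, [2.5, 97.5])
```

    The Monte Carlo interval respects the skewness of the distribution of a product of estimates, which is one reason such intervals tend to be more accurate than symmetric normal-theory intervals for mediated effects.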

  15. An Introduction to Macro- Level Spatial Nonstationarity: a Geographically Weighted Regression Analysis of Diabetes and Poverty

    PubMed Central

    Siordia, Carlos; Saenz, Joseph; Tom, Sarah E.

    2014-01-01

    Type II diabetes is a growing health problem in the United States. Understanding geographic variation in diabetes prevalence will inform where resources for management and prevention should be allocated. Investigations of the correlates of diabetes prevalence have largely ignored how spatial nonstationarity might play a role in the macro-level distribution of diabetes. This paper introduces the reader to the concept of spatial nonstationarity—variance in statistical relationships as a function of geographical location. Since spatial nonstationarity means different predictors can have varying effects on model outcomes, we make use of a geographically weighted regression to calculate correlates of diabetes as a function of geographic location. By doing so, we demonstrate an exploratory example in which the diabetes-poverty macro-level statistical relationship varies as a function of location. In particular, we provide evidence that when predicting macro-level diabetes prevalence, poverty is not always positively associated with diabetes. PMID:25414731
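
    The core of a geographically weighted regression is an ordinary least-squares fit repeated at every location with distance-decayed weights. Below is a minimal Python sketch using synthetic data in which the poverty effect on diabetes prevalence drifts with longitude; the coordinates, bandwidth, and coefficients are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical area-level data: coordinates, poverty rate, diabetes prevalence.
n = 200
coords = rng.uniform(0, 10, (n, 2))
poverty = rng.uniform(5, 30, n)
# The poverty effect varies smoothly with longitude (spatial nonstationarity).
true_beta = 0.05 + 0.02 * coords[:, 0]
diabetes = 4.0 + true_beta * poverty + rng.normal(0, 0.3, n)

def gwr_coefficients(coords, x, y, bandwidth=2.0):
    """Local intercept and slope of y on x at each point,
    via Gaussian-kernel weighted least squares."""
    X = np.column_stack([np.ones_like(x), x])
    betas = np.empty((len(x), 2))
    for i, c in enumerate(coords):
        d2 = ((coords - c) ** 2).sum(axis=1)
        w = np.exp(-d2 / (2 * bandwidth**2))   # distance-decayed weights
        W = np.diag(w)
        betas[i] = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return betas

betas = gwr_coefficients(coords, poverty, diabetes)
slopes = betas[:, 1]  # local poverty effect, one per location
```

    Plotting the local slopes over the map would reveal the nonstationarity directly: the poverty coefficient is not a single number but a surface that varies with location.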

  16. An Introduction to Macro- Level Spatial Nonstationarity: a Geographically Weighted Regression Analysis of Diabetes and Poverty.

    PubMed

    Siordia, Carlos; Saenz, Joseph; Tom, Sarah E

    2012-01-01

    Type II diabetes is a growing health problem in the United States. Understanding geographic variation in diabetes prevalence will inform where resources for management and prevention should be allocated. Investigations of the correlates of diabetes prevalence have largely ignored how spatial nonstationarity might play a role in the macro-level distribution of diabetes. This paper introduces the reader to the concept of spatial nonstationarity-variance in statistical relationships as a function of geographical location. Since spatial nonstationarity means different predictors can have varying effects on model outcomes, we make use of a geographically weighted regression to calculate correlates of diabetes as a function of geographic location. By doing so, we demonstrate an exploratory example in which the diabetes-poverty macro-level statistical relationship varies as a function of location. In particular, we provide evidence that when predicting macro-level diabetes prevalence, poverty is not always positively associated with diabetes.

  17. Distinguishing Man from Molecules: The Distinctiveness of Medical Concepts at Different Levels of Description

    PubMed Central

    Cole, William G.; Michael, Patricia; Blois, Marsden S.

    1987-01-01

    A computer program was created to use information about the statistical distribution of words in journal abstracts to make probabilistic judgments about the level of description (e.g. molecular, cell, organ) of medical text. Statistical analysis of 7,409 journal abstracts taken from three medical journals representing distinct levels of description revealed that many medical words seem to be highly specific to one or another level of description. For example, the word adrenoreceptors occurred only in the American Journal of Physiology, never in the Journal of Biological Chemistry or in the Journal of the American Medical Association. Such highly specific words occurred so frequently that the automatic classification program was able to classify correctly 45 out of 45 test abstracts, with 100% confidence. These findings are interpreted in terms of both a theory of the structure of medical knowledge and the pragmatics of automatic classification.
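
    The classification idea, scoring a text by how well each journal's word distribution explains it, amounts to a likelihood comparison over word frequencies. A toy Python sketch with invented miniature corpora follows; the study used 7,409 real abstracts, and its exact scoring scheme may differ from the add-alpha multinomial likelihood used here:

```python
import math
from collections import Counter

# Toy corpora standing in for journals at different levels of description
# (hypothetical word lists, not the study's data).
corpora = {
    "molecular": "kinase phosphorylation substrate enzyme binding kinase".split(),
    "organ":     "ventricle perfusion adrenoreceptors cardiac output perfusion".split(),
    "patient":   "diagnosis therapy prognosis trial patient outcome".split(),
}

counts = {lvl: Counter(words) for lvl, words in corpora.items()}
totals = {lvl: sum(c.values()) for lvl, c in counts.items()}
vocab = {w for c in counts.values() for w in c}

def classify(text, alpha=0.5):
    """Pick the level whose word distribution best explains the text
    (multinomial log-likelihood with add-alpha smoothing)."""
    best, best_ll = None, -math.inf
    for lvl in counts:
        ll = 0.0
        for w in text.split():
            p = (counts[lvl][w] + alpha) / (totals[lvl] + alpha * len(vocab))
            ll += math.log(p)
        if ll > best_ll:
            best, best_ll = lvl, ll
    return best

level = classify("adrenoreceptors modulate cardiac perfusion")
```

    Level-specific words such as adrenoreceptors dominate the likelihood, which mirrors the paper's finding that highly specific words make the classification easy.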

  18. Emergent irreversibility and entanglement spectrum statistics

    NASA Astrophysics Data System (ADS)

    Mucciolo, Eduardo; Chamon, Claudio; Hamma, Alioscia

    2014-03-01

    We study the problem of irreversibility when the dynamical evolution of a many-body system is described by a stochastic quantum circuit. Such evolution is more general than Hamiltonian dynamics, and since energy levels are not well defined, the well-established connection between the statistical fluctuations of the energy spectrum and irreversibility cannot be made. We show that the entanglement spectrum provides a more general connection. Irreversibility is marked by a failure of a disentangling algorithm and is preceded by the appearance of Wigner-Dyson statistical fluctuations in the entanglement spectrum. This analysis can be done at the wavefunction level and offers a new route to study quantum chaos and quantum integrability. We acknowledge financial support from the U.S. National Science Foundation through grants CCF 1116590 and CCF 1117241, from the National Basic Research Program of China through grants 2011CBA00300 and 2011CBA00301, and from the National Natural Science Fo.
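
    One common way to detect Wigner-Dyson versus Poisson statistics in a spectrum, without the unfolding that raw level-spacing analyses require, is the consecutive-gap ratio. The sketch below is a standard diagnostic, not the authors' disentangling algorithm; it contrasts an uncorrelated spectrum with the eigenvalues of a random real symmetric (GOE-like) matrix:

```python
import numpy as np

rng = np.random.default_rng(3)

def mean_gap_ratio(levels):
    """Mean consecutive-gap ratio r = min(s_n, s_{n+1}) / max(s_n, s_{n+1}),
    computed from the sorted spectrum's nearest-neighbor spacings s_n."""
    s = np.diff(np.sort(levels))
    s = s[s > 0]
    r = np.minimum(s[:-1], s[1:]) / np.maximum(s[:-1], s[1:])
    return r.mean()

# Uncorrelated (Poisson) spectrum: mean r approaches ~0.39.
poisson_r = mean_gap_ratio(rng.uniform(0, 1, 5000))

# Wigner-Dyson (GOE) spectrum from a random symmetric matrix: mean r ~0.53.
A = rng.normal(size=(1000, 1000))
H = (A + A.T) / np.sqrt(2)
goe_r = mean_gap_ratio(np.linalg.eigvalsh(H))
```

    The mean ratio is about 0.39 for uncorrelated levels and about 0.53 under GOE-type level repulsion; applied to an entanglement spectrum, the same statistic signals the onset of the irreversibility described above.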

  19. Likert scales, levels of measurement and the "laws" of statistics.

    PubMed

    Norman, Geoff

    2010-12-01

    Reviewers of research reports frequently criticize the choice of statistical methods. While some of these criticisms are well-founded, frequently the use of various parametric methods such as analysis of variance, regression, and correlation is faulted because: (a) the sample size is too small, (b) the data may not be normally distributed, or (c) the data are from Likert scales, which are ordinal, so parametric statistics cannot be used. In this paper, I dissect these arguments, and show that many studies, dating back to the 1930s, consistently show that parametric statistics are robust with respect to violations of these assumptions. Hence, challenges like those above are unfounded, and parametric methods can be utilized without concern for "getting the wrong answer".
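
    The robustness claim is easy to check by simulation. The sketch below draws two groups from the same skewed 5-point Likert distribution, so the null hypothesis is true by construction, and counts how often a t-test rejects at the 5% level; the response probabilities and sample sizes are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)

# Skewed 5-point Likert response distribution (hypothetical).
categories = np.arange(1, 6)
probs = np.array([0.10, 0.20, 0.30, 0.25, 0.15])
n, sims = 30, 4000

rejections = 0
for _ in range(sims):
    g1 = rng.choice(categories, n, p=probs)  # both groups drawn from the
    g2 = rng.choice(categories, n, p=probs)  # SAME distribution (null is true)
    se = np.sqrt(g1.var(ddof=1) / n + g2.var(ddof=1) / n)
    t = (g1.mean() - g2.mean()) / se         # Welch t statistic
    if abs(t) > 1.96:                        # ~5% two-sided critical value
        rejections += 1

type_i_error = rejections / sims             # stays close to the nominal 0.05
```

    Despite the ordinal, discrete, skewed data, the empirical type I error stays close to the nominal 5%, which is the pattern reported by the studies the paper cites.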

  20. Data exploration, quality control and statistical analysis of ChIP-exo/nexus experiments

    PubMed Central

    Welch, Rene; Chung, Dongjun; Grass, Jeffrey; Landick, Robert

    2017-01-01

    ChIP-exo/nexus experiments rely on innovative modifications of the commonly used ChIP-seq protocol for high resolution mapping of transcription factor binding sites. Although many aspects of the ChIP-exo data analysis are similar to those of ChIP-seq, these high throughput experiments pose a number of unique quality control and analysis challenges. We develop a novel statistical quality control pipeline and accompanying R/Bioconductor package, ChIPexoQual, to enable exploration and analysis of ChIP-exo and related experiments. ChIPexoQual evaluates a number of key issues including strand imbalance, library complexity, and signal enrichment of data. Assessment of these features is facilitated through diagnostic plots and summary statistics computed over regions of the genome with varying levels of coverage. We evaluated our QC pipeline with both large collections of public ChIP-exo/nexus data and multiple, new ChIP-exo datasets from Escherichia coli. ChIPexoQual analysis of these datasets resulted in guidelines for using these QC metrics across a wide range of sequencing depths and provided further insights for modelling ChIP-exo data. PMID:28911122

  1. Data exploration, quality control and statistical analysis of ChIP-exo/nexus experiments.

    PubMed

    Welch, Rene; Chung, Dongjun; Grass, Jeffrey; Landick, Robert; Keles, Sündüz

    2017-09-06

    ChIP-exo/nexus experiments rely on innovative modifications of the commonly used ChIP-seq protocol for high resolution mapping of transcription factor binding sites. Although many aspects of the ChIP-exo data analysis are similar to those of ChIP-seq, these high throughput experiments pose a number of unique quality control and analysis challenges. We develop a novel statistical quality control pipeline and accompanying R/Bioconductor package, ChIPexoQual, to enable exploration and analysis of ChIP-exo and related experiments. ChIPexoQual evaluates a number of key issues including strand imbalance, library complexity, and signal enrichment of data. Assessment of these features is facilitated through diagnostic plots and summary statistics computed over regions of the genome with varying levels of coverage. We evaluated our QC pipeline with both large collections of public ChIP-exo/nexus data and multiple, new ChIP-exo datasets from Escherichia coli. ChIPexoQual analysis of these datasets resulted in guidelines for using these QC metrics across a wide range of sequencing depths and provided further insights for modelling ChIP-exo data. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  2. Exocrine Dysfunction Correlates with Endocrinal Impairment of Pancreas in Type 2 Diabetes Mellitus.

    PubMed

    Prasanna Kumar, H R; Gowdappa, H Basavana; Hosmani, Tejashwi; Urs, Tejashri

    2018-01-01

    Diabetes mellitus (DM) is a chronic metabolic condition that manifests as an elevated blood sugar level over a prolonged period. The pancreatic endocrine system generally gets affected during diabetes, but abnormal exocrine functions are often also manifested because of the exocrine system's proximity to the endocrine system. Fecal elastase-1 (FE-1) is considered an ideal biomarker of the exocrine insufficiency of the pancreas. This study was conducted to assess exocrine dysfunction of the pancreas in patients with type 2 DM (T2DM) by measuring FE-1 levels and to associate the level of hyperglycemia with exocrine pancreatic dysfunction. A prospective, cross-sectional comparative study was conducted on both T2DM patients and healthy nondiabetic volunteers. FE-1 levels were measured using a commercial kit (Human Pancreatic Elastase ELISA BS 86-01 from Bioserv Diagnostics). Data analysis was performed using the mean, standard deviation, standard error, t-test for independent samples, and Chi-square test/cross tabulation in SPSS for Windows version 20.0. A statistically nonsignificant (P = 0.5051) relationship between FE-1 deficiency and age was obtained, implying that age is a noncontributing factor toward exocrine pancreatic insufficiency among diabetic patients. A statistically significant correlation (P = 0.003) between glycated hemoglobin and FE-1 levels was also noted. The associations of retinopathy (P = 0.001) and peripheral pulses (P = 0.001) with FE-1 levels were found to be statistically significant. This study validates the benefit of FE-1 estimation as a surrogate marker of exocrine pancreatic insufficiency, which otherwise remains unmanifest and subclinical.

  3. 2,4,6-Trinitrotoluene (TNT) air concentrations, hemoglobin changes, and anemia cases in respirator protected TNT munitions demilitarization workers.

    PubMed

    Bradley, Melville D

    2011-03-01

    2,4,6-Trinitrotoluene (TNT) is an explosive used in munitions production that is known to cause both aplastic and hemolytic anemia in exposed workers. Anemia in a TNT worker is considered a sentinel health event (occupational) (SHE(O)) in the United States (US). Deaths have been reported secondary to aplastic anemia. Studies have shown that TNT systemic absorption is significant by both the respiratory and dermal routes. None of the studies encountered examined hemoglobin change or anemia cases in respirator-protected workers. It is hypothesized that respiratory protection is insufficient to protect TNT workers from the risk of anemia development and hemoglobin concentration drop. In a records review, the pre-exposure hemoglobin levels of eight groups of respirator-protected TNT workers were compared with their during-exposure hemoglobin levels for statistically significant (alpha level 0.05) changes, and anemia cases were recorded. A curve estimation analysis was performed between mean TNT air concentrations and mean hemoglobin change values. Statistically significant hemoglobin level drops and anemia cases were apparent at TNT air concentrations about the REL and PEL in respirator-protected workers. There were no anemia cases or statistically significant hemoglobin level drops at concentrations about the TLV, however. A statistically significant inverse non-linear regression model was found to be the best fit for regressing hemoglobin change on TNT air concentration. Respiratory protection may be inadequate to prevent workers who are at risk for TNT skin absorption from developing anemia. This study contributes evidence that the TLV should be considered for adoption as the new PEL.
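
    Curve estimation with an inverse model, as used for the hemoglobin regression, is just linear regression on a transformed predictor. A sketch with invented illustrative numbers (not the study's data):

```python
import numpy as np

# Hypothetical group means: TNT air concentration (mg/m^3) and mean
# hemoglobin change (g/dL); the values are illustrative only.
conc = np.array([0.05, 0.10, 0.20, 0.40, 0.80, 1.60])
hgb_change = np.array([-0.05, -0.45, -0.65, -0.75, -0.80, -0.82])

# "Inverse" curve estimation: hgb_change = b0 + b1 * (1/conc),
# fitted by ordinary least squares on the transformed predictor.
X = np.column_stack([np.ones_like(conc), 1.0 / conc])
b0, b1 = np.linalg.lstsq(X, hgb_change, rcond=None)[0]

pred = b0 + b1 / conc
r2 = 1 - ((hgb_change - pred) ** 2).sum() / ((hgb_change - hgb_change.mean()) ** 2).sum()
```

    Because the predictor enters as 1/x, the fitted curve changes steeply at low concentrations and flattens toward the asymptote b0 at high ones.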

  4. Quantifying Climate Change Hydrologic Risk at NASA Ames Research Center

    NASA Astrophysics Data System (ADS)

    Mills, W. B.; Bromirski, P. D.; Coats, R. N.; Costa-Cabral, M.; Fong, J.; Loewenstein, M.; Milesi, C.; Miller, N.; Murphy, N.; Roy, S.

    2013-12-01

    In response to Executive Order 13514 of 2009, which mandates that U.S. federal agencies evaluate infrastructure vulnerabilities due to climate variability and change, we provide an analysis of future climate flood risk at NASA Ames Research Center (Ames) along South S.F. Bay. This includes likelihood analysis of large-scale water vapor transport, statistical analysis of intense precipitation, high winds, sea level rise, storm surge, estuary dynamics, saturated overland flooding, and likely impacts to wetlands and habitat loss near Ames. We use the IPCC CMIP5 data from three Atmosphere-Ocean General Circulation Models with Representative Concentration Pathways of 8.5 Wm-2 and 4.5 Wm-2 and provide an analysis of climate variability and change associated with flooding and impacts at Ames. Intense storms impacting Ames are due to two large-scale processes, sub-tropical atmospheric rivers (AR) and north Pacific Aleutian low-pressure (AL) storm systems, both of which are analyzed here in terms of the Integrated Water Vapor (IWV) exceeding a critical threshold within a search domain and the wind vector transporting the IWV from southerly to westerly to northwesterly for ARs and northwesterly to northerly for ALs and within the Ames impact area during 1970-1999, 2040-2069, and 2070-2099. We also include a statistical model of extreme precipitation at Ames based on large-scale climatic predictors, and characterize changes using CMIP5 projections. Requirements for levee height to protect Ames are projected to increase and continually accelerate throughout this century as sea level rises. We use empirical statistical and analytical methods to determine the likelihood, in each year from present through 2099, of water level surpassing different threshold values in SF Bay near NASA Ames. 
We study the sensitivity of the water level corresponding to a 1-in-10 and 1-in-100 likelihood of exceedance to changes in the statistical distribution of storm surge height and ENSO height, in addition to increasing mean sea level. We examine the implications in the face of the CMIP5 projections. Storm intensification may result in increased flooding hazards at Ames. We analyze how the changes in precipitation intensity will impact the storm drainage system at Ames through continuous stormwater modeling of runoff with the EPA model SWMM 5 and projected downscaled daily precipitation data. Although extreme events will not adversely affect wetland habitats, adaptation projects--especially levee construction and improvement--will require filling of wetlands. Federal law mandates mitigation for fill placed in wetlands. We are currently calculating the potential mitigation burden by habitat type.

  5. Evaluation of the ecological relevance of mysid toxicity tests using population modeling techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuhn-Hines, A.; Munns, W.R. Jr.; Lussier, S.

    1995-12-31

    A number of acute and chronic bioassay statistics are used to evaluate the toxicity and risks of chemical stressors to the mysid shrimp, Mysidopsis bahia. These include LC50s from acute tests, NOECs from 7-day and life-cycle tests, and the US EPA Water Quality Criteria Criterion Continuous Concentrations (CCC). Because these statistics are generated from endpoints which focus upon the responses of individual organisms, their relationships to significant effects at higher levels of ecological organization are unknown. This study was conducted to evaluate the quantitative relationships between toxicity test statistics and a concentration-based statistic derived from exposure-response models relating population growth rate (λ) to stressor concentration. This statistic, C* (the concentration where λ = 1, zero population growth), describes the concentration above which mysid populations are projected to decline in abundance as determined using population modeling techniques. An analysis of M. bahia responses to 9 metals and 9 organic contaminants indicated the NOEC from life-cycle tests to be the best predictor of C*, although the acute LC50 predicted population-level response surprisingly well. These analyses provide useful information regarding uncertainties of extrapolation among test statistics in assessments of ecological risk.
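
    The zero-growth concentration can be computed from any stage-structured model by finding where the dominant eigenvalue of the projection matrix crosses 1. A toy Python sketch with a 3-stage Leslie matrix and invented exponential dose-response curves for survival and fecundity (not measured mysid rates):

```python
import numpy as np

def growth_rate(conc):
    """Dominant eigenvalue (lambda) of a toy 3-stage Leslie matrix whose
    fecundity and survival decline exponentially with stressor concentration.
    All rates here are hypothetical, not measured mysid values."""
    f = 6.0 * np.exp(-0.05 * conc)   # fecundity of the adult stage
    s = 0.60 * np.exp(-0.02 * conc)  # stage-to-stage survival
    L = np.array([[0.0, 0.0, f],
                  [s,   0.0, 0.0],
                  [0.0, s,   0.0]])
    return max(abs(np.linalg.eigvals(L)))

def find_zero_growth_conc(lo=0.0, hi=200.0, tol=1e-6):
    """Bisection for the concentration where lambda = 1 (zero population growth);
    lambda decreases monotonically with concentration in this model."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if growth_rate(mid) > 1.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

c_star = find_zero_growth_conc()
```

    Comparing this zero-growth concentration against the life-cycle NOEC or the acute LC50 across many chemicals is exactly the kind of extrapolation analysis the study performs.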

  6. Serum Levels of 25-hydroxyvitamin D in Chronic Urticaria and its Association with Disease Activity: A Case Control Study

    PubMed Central

    Rather, Shagufta; Keen, Abid; Sajad, Peerzada

    2018-01-01

    Aim: To evaluate the relationship between vitamin D levels and chronic spontaneous urticaria (CSU) and compare with healthy age and sex matched controls. Material and Methods: This was a hospital-based cross-sectional study conducted over a period of 1 year, in which 110 patients with CSU were recruited along with an equal number of sex and age-matched healthy controls. For each patient, urticaria activity score (UAS) was calculated and autologous serum skin test (ASST) was performed. Plasma 25-hydroxyvitamin D [25-(OH)D] was analyzed by chemiluminescence method. A deficiency in vitamin D was defined as serum 25-(OH)D concentrations <30 ng/mL. The statistical analysis was carried out by using appropriate statistical tests. Results: The mean serum 25-(OH)D levels of CSU patients was 19.6 ± 6.9 ng/mL, whereas in control group, the mean level was 38.5 ± 6.7, the difference being statistically significant (P < 0.001). A significant negative correlation was found between vitamin D levels and UAS. (P < 0.001). The number of patients with ASST positivity was 44 (40%). Conclusion: The patients with CSU had reduced levels of vitamin D when compared to healthy controls. Furthermore, there was a significant negative correlation between the levels of serum vitamin D and severity of CSU. PMID:29854636

  7. [Ways of urban sanitary and epidemiological well-being management].

    PubMed

    Kreĭmer, M A

    2010-01-01

    The scientific rationale for preventive measures based on sanitary-and-epidemiological surveillance of environmental objects is considered. The sizes of functional zones and the space for various types of communal services, amenities and leisure are regulated to ensure good urban vital activities. Multistorey housing causes an increase in the number of negative factors per unit area and in their impact on health. A proposal has been made for the standardization of the ranges of urban population growth and size by using the sanitary-and-hygienic rules and norms rather than climatic parameters. A criterion system for assessing the data of statistical observations has been substantiated, and 5 levels of analysis and managerial decision-making have been proposed. Cause-and-effect relations may be determined for the parameters of the second level; models of program-oriented studies apply to the third level; only sanitary-and-epidemiological surveillance is possible for the fourth and fifth levels. The space planning scheme must provide for water supply reserves, generation areas for pure air coming into the town, and waste disposal areas. The general layout may use statistical observation parameters characterizing the second level of occurrence of negative phenomena. The statistical observation parameters characterizing the third and fourth levels of occurrence of negative phenomena may be used for municipal improvements and sanitary maintenance. Those characterizing the fourth and fifth levels may be used for prevention in therapeutic-and-prophylactic institutions.

  8. Serum erythropoietin levels in patients with central serous chorioretinopathy

    PubMed Central

    Turgut, Burak; Ilhan, Nevin; Uyar, Fatma Yayla; Celiker, Ulku; Demir, Tamer; Koca, Suleyman Serdar

    2010-01-01

    Objective To evaluate the levels of erythropoietin (EPO) in the serum in patients with central serous chorioretinopathy (CSC). Methods An institutional comparative clinical study. The serum EPO levels of 15 patients with active CSC (Group 1), 15 patients with inactive CSC (Group 2) and 15 healthy volunteers (Group 3) were measured with the enzyme-linked immunosorbent assay (ELISA) method. Kruskal–Wallis variance analysis and Mann–Whitney U test were used for statistical analysis. Results The patient and control groups were matched for age and sex. There was no statistically significant variation with regard to age and gender among the groups (P > 0.05). The mean serum EPO concentrations in patients with active CSC (Group 1), inactive CSC (Group 2) and in healthy controls (Group 3) were 11.39 ± 3.01 mIU/mL, 11.79 ± 3.78 mIU/mL and 11.95 ± 3.27 mIU/mL, respectively. There was no significant variation among the serum EPO concentrations of the study groups (P > 0.05). Conclusion These findings suggest no role of serum EPO in the pathogenesis of CSC. PMID:28539767

  9. Serum erythropoietin levels in patients with central serous chorioretinopathy.

    PubMed

    Turgut, Burak; Ilhan, Nevin; Uyar, Fatma Yayla; Celiker, Ulku; Demir, Tamer; Koca, Suleyman Serdar

    2010-01-01

    To evaluate the levels of erythropoietin (EPO) in the serum in patients with central serous chorioretinopathy (CSC). An institutional comparative clinical study. The serum EPO levels were measured with the enzyme-linked immunosorbent assay (ELISA) method in 15 patients with active CSC (Group 1), 15 patients with inactive CSC (Group 2) and 15 healthy volunteers (Group 3). Kruskal-Wallis variance analysis and Mann-Whitney U test were used for statistical analysis. The patient and control groups were matched for age and sex. There was no statistically significant variation with regard to age and gender among the groups (P > 0.05). The mean serum EPO concentrations in patients with active CSC (Group 1), inactive CSC (Group 2) and in healthy controls (Group 3) were 11.39 ± 3.01 mlU/mL, 11.79 ± 3.78 mlU/mL and 11.95 ± 3.27 mlU/mL, respectively. There was no significant variation among the serum EPO concentrations of the study groups (P > 0.05). These findings suggest no role of serum EPO in pathogenesis of CSC.
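    The Kruskal-Wallis test named above compares average ranks across the three groups. As an illustration only (not the authors' analysis code), a minimal pure-Python sketch of the Kruskal-Wallis H statistic, with tied values sharing their mean rank:

```python
def kruskal_wallis_h(*groups):
    """Kruskal-Wallis H statistic (no tie-correction factor applied)."""
    # Pool all observations, remembering which group each came from.
    pooled = sorted((x, gi) for gi, g in enumerate(groups) for x in g)
    n = len(pooled)
    rank_of = [0.0] * n
    i = 0
    while i < n:
        # Runs of equal values share the average of their 1-based ranks.
        j = i
        while j < n and pooled[j][0] == pooled[i][0]:
            j += 1
        avg_rank = (i + 1 + j) / 2
        for k in range(i, j):
            rank_of[k] = avg_rank
        i = j
    # Sum the ranks within each group.
    rank_sums = [0.0] * len(groups)
    for (_, gi), r in zip(pooled, rank_of):
        rank_sums[gi] += r
    return 12.0 / (n * (n + 1)) * sum(
        rs * rs / len(g) for rs, g in zip(rank_sums, groups)
    ) - 3 * (n + 1)
```

For three fully separated groups such as [1, 2, 3], [4, 5, 6], [7, 8, 9], H = 7.2; a p value would then be taken from a chi-squared distribution with k - 1 degrees of freedom.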

  10. Using multi-level data to estimate the effect of an 'alcogenic' environment on hazardous alcohol consumption in the former Soviet Union.

    PubMed

    Murphy, Adrianna; Roberts, Bayard; Ploubidis, George B; Stickley, Andrew; McKee, Martin

    2014-05-01

    The purpose of this study was to assess whether alcohol-related community characteristics act collectively to influence individual-level alcohol consumption in the former Soviet Union (fSU). Using multi-level data from nine countries in the fSU we conducted a factor analysis of seven alcohol-related community characteristics. The association between any latent factors underlying these characteristics and two measures of hazardous alcohol consumption was then analysed using a population average regression modelling approach. Our factor analysis produced one factor with an eigenvalue >1 (EV=1.28), which explained 94% of the variance. This factor was statistically significantly associated with increased odds of CAGE problem drinking (OR=1.40 (1.08-1.82)). The estimated association with EHD was not statistically significant (OR=1.10 (0.85-1.44)). Our findings suggest that a high number of beer, wine and spirit advertisements and high alcohol outlet density may work together to create an 'alcogenic' environment that encourages hazardous alcohol consumption in the fSU. Copyright © 2014 Elsevier Ltd. All rights reserved.
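    The single factor retained here passes the Kaiser criterion (eigenvalue > 1). As an illustrative sketch of how the dominant eigenvalue of a small symmetric correlation matrix can be found by power iteration (not the study's software, and real factor analysis involves more than this):

```python
def dominant_eigenvalue(m, iters=200):
    """Largest eigenvalue of a small symmetric matrix via power iteration."""
    n = len(m)
    v = [1.0] * n
    for _ in range(iters):
        # Multiply v by the matrix and renormalize.
        w = [sum(m[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = max(abs(x) for x in w)
        v = [x / norm for x in w]
    # Rayleigh quotient gives the eigenvalue estimate.
    mv = [sum(m[i][j] * v[j] for j in range(n)) for i in range(n)]
    return sum(v[i] * mv[i] for i in range(n)) / sum(x * x for x in v)
```

For a correlation matrix with all off-diagonal entries equal to 0.3 among 3 variables, the dominant eigenvalue is 1 + 2(0.3) = 1.6, which would pass the Kaiser criterion.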

  11. A quantitative study of factors influencing quality of life in rural Mexican women diagnosed with HIV.

    PubMed

    Holtz, Carol; Sowell, Richard; VanBrackle, Lewis; Velasquez, Gabriela; Hernandez-Alonso, Virginia

    2014-01-01

    This quantitative study explored the level of Quality of Life (QoL) in indigenous Mexican women and identified psychosocial factors that significantly influenced their QoL, using face-to-face interviews with 101 women accessing care in an HIV clinic in Oaxaca, Mexico. Variables included demographic characteristics, levels of depression, coping style, family functioning, HIV-related beliefs, and QoL. Descriptive statistics were used to analyze participant characteristics, and women's scores on data collection instruments. Pearson's R correlational statistics were used to determine the level of significance between study variables. Multiple regression analysis examined all variables that were significantly related to QoL. Pearson's correlational analysis of relationships between Spirituality, Educating Self about HIV, Family Functioning, Emotional Support, Physical Care, and Staying Positive demonstrated positive correlation to QoL. Stigma, depression, and avoidance coping were significantly and negatively associated with QoL. The final regression model indicated that depression and avoidance coping were the best predictor variables for QoL. Copyright © 2014 Association of Nurses in AIDS Care. Published by Elsevier Inc. All rights reserved.
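    Pearson's r, used above to screen variables for the regression model, can be computed directly from centered sums. A minimal sketch (illustrative, not the study's code):

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

Perfectly linear data gives r = 1 (or -1 for a decreasing relationship); values near 0 indicate no linear association.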

  12. Emission and reflection from healthy and stressed natural targets with computer analysis of spectroradiometric and multispectral scanner data

    NASA Technical Reports Server (NTRS)

    Kumar, R.; Silva, L. F.

    1973-01-01

    Special emphasis was placed on corn plants, and healthy targets were differentiated from stressed ones by remote sensing. Infrared radiometry of plants is reviewed thoroughly, with emphasis on agricultural crops. The theory and error analysis of determining the emittance of a natural target with a radiometer are discussed. Experiments were conducted on corn (Zea mays L.) plants with a long-wavelength spectroradiometer under field conditions. Analysis of multispectral scanner data from ten selected flightlines of the Corn Blight Watch Experiment of 1972 indicated: (1) there was no regular pattern in the mean response of corn at higher blight levels vs. corn at lower blight levels in any of the spectral channels; (2) the greater the difference between the blight levels, the more statistically separable they usually were in subsets of one, two, three and four spectral channels.

  13. Applications of external cavity diode laser-based technique to noninvasive clinical diagnosis using expired breath ammonia analysis: chronic kidney disease, epilepsy

    NASA Astrophysics Data System (ADS)

    Bayrakli, Ismail; Turkmen, Aysenur; Akman, Hatice; Sezer, M. Tugrul; Kutluhan, Suleyman

    2016-08-01

    An external cavity laser (ECL)-based off-axis cavity-enhanced absorption spectroscopy was applied to noninvasive clinical diagnosis using expired breath ammonia analysis: (1) the correlation between breath ammonia levels and blood parameters related to chronic kidney disease (CKD) was investigated and (2) the relationship between breath ammonia levels and blood concentrations of valproic acid (VAP) was studied. The concentrations of breath ammonia in 15 healthy volunteers, 10 epilepsy patients (before and after taking VAP), and 27 patients with different stages of CKD were examined. The range of breath ammonia levels was 120 to 530 ppb for healthy subjects and 710 to 10,400 ppb for patients with CKD. There was a statistically significant positive correlation between breath ammonia concentrations and urea, blood urea nitrogen, creatinine, or estimated glomerular filtration rate in 27 patients. It was demonstrated that taking VAP gave rise to increasing breath ammonia levels. A statistically significant difference was found between the levels of exhaled ammonia (NH3) in healthy subjects and in patients with epilepsy before and after taking VAP. The results suggest that our breath ammonia measurement system has great potential as an easy, noninvasive, real-time, and continuous monitor of the clinical parameters related to epilepsy and CKD.

  14. Increasing URM Undergraduate Student Success through Assessment-Driven Interventions: A Multiyear Study Using Freshman-Level General Biology as a Model System

    PubMed Central

    Carmichael, Mary C.; St. Clair, Candace; Edwards, Andrea M.; Barrett, Peter; McFerrin, Harris; Davenport, Ian; Awad, Mohamed; Kundu, Anup; Ireland, Shubha Kale

    2016-01-01

    Xavier University of Louisiana leads the nation in awarding BS degrees in the biological sciences to African-American students. In this multiyear study with ∼5500 participants, data-driven interventions were adopted to improve student academic performance in a freshman-level general biology course. The three hour-long exams were common and administered concurrently to all students. New exam questions were developed using Bloom’s taxonomy, and exam results were analyzed statistically with validated assessment tools. All but the comprehensive final exam were returned to students for self-evaluation and remediation. Among other approaches, course rigor was monitored by using an identical set of 60 questions on the final exam across 10 semesters. Analysis of the identical sets of 60 final exam questions revealed that overall averages increased from 72.9% (2010) to 83.5% (2015). Regression analysis demonstrated a statistically significant correlation between high-risk students and their averages on the 60 questions. Additional analysis demonstrated statistically significant improvements for at least one letter grade from midterm to final and a 20% increase in the course pass rates over time, also for the high-risk population. These results support the hypothesis that our data-driven interventions and assessment techniques are successful in improving student retention, particularly for our academically at-risk students. PMID:27543637

  15. [Intensification of post-traumatic stress disorder of Siberian deportees from the North-East region of Poland].

    PubMed

    Monieta, Adela; Anczurowski, Wojciech

    2004-01-01

    Presentation of Post-Traumatic Stress Disorder based on the approaches of various authors, concentrating upon the concept of the American classifications DSM III (1980) and DSM IV (1994). We acknowledged the necessity of displaying empirical results on the intensification of PTSD among the population of Siberian deportees in the North-East region of Poland. In our analysis, we stressed the importance of the distant-in-time psychological consequences of dwelling in extremely difficult living conditions that often threatened the lives of those who had been deported to Siberia between 1939 and 1956. 40 "Siberian deportees" (20 men and 20 women) were examined. The PTSD-Interview (PTSD-I) method was used in order to obtain, in each individual case, the indicatory number indispensable for the statistical analysis. The average result of PTSD intensification in the case of women reaches a "very significant" level, and in the case of men it is even higher. The disparity between the average results of women and men is statistically significant (p<0.05). This research has confirmed the assumption that suffering trauma in the early stage of development (within the age range of 8-15) leaves a permanent mark on the human psyche. Statistical analysis revealed a high level of intensification of PTSD among the population of the "Siberian deportees" from the North-East region of Poland.

  16. Implementation and evaluation of an efficient secure computation system using ‘R’ for healthcare statistics

    PubMed Central

    Chida, Koji; Morohashi, Gembu; Fuji, Hitoshi; Magata, Fumihiko; Fujimura, Akiko; Hamada, Koki; Ikarashi, Dai; Yamamoto, Ryuichi

    2014-01-01

    Background and objective While the secondary use of medical data has gained attention, its adoption has been constrained due to protection of patient privacy. Making medical data secure by de-identification can be problematic, especially when the data concerns rare diseases. We require rigorous security management measures. Materials and methods Using secure computation, an approach from cryptography, our system can compute various statistics over encrypted medical records without decrypting them. An issue of secure computation is that the amount of processing time required is immense. We implemented a system that securely computes healthcare statistics from the statistical computing software ‘R’ by effectively combining secret-sharing-based secure computation with original computation. Results Testing confirmed that our system could correctly complete computation of average and unbiased variance of approximately 50 000 records of dummy insurance claim data in a little over a second. Computation including conditional expressions and/or comparison of values, for example, t test and median, could also be correctly completed in several tens of seconds to a few minutes. Discussion If medical records are simply encrypted, the risk of leaks exists because decryption is usually required during statistical analysis. Our system possesses high-level security because medical records remain in encrypted state even during statistical analysis. Also, our system can securely compute some basic statistics with conditional expressions using ‘R’ that works interactively while secure computation protocols generally require a significant amount of processing time. Conclusions We propose a secure statistical analysis system using ‘R’ for medical data that effectively integrates secret-sharing-based secure computation and original computation. PMID:24763677

  17. Implementation and evaluation of an efficient secure computation system using 'R' for healthcare statistics.

    PubMed

    Chida, Koji; Morohashi, Gembu; Fuji, Hitoshi; Magata, Fumihiko; Fujimura, Akiko; Hamada, Koki; Ikarashi, Dai; Yamamoto, Ryuichi

    2014-10-01

    While the secondary use of medical data has gained attention, its adoption has been constrained due to protection of patient privacy. Making medical data secure by de-identification can be problematic, especially when the data concerns rare diseases. We require rigorous security management measures. Using secure computation, an approach from cryptography, our system can compute various statistics over encrypted medical records without decrypting them. An issue of secure computation is that the amount of processing time required is immense. We implemented a system that securely computes healthcare statistics from the statistical computing software 'R' by effectively combining secret-sharing-based secure computation with original computation. Testing confirmed that our system could correctly complete computation of average and unbiased variance of approximately 50,000 records of dummy insurance claim data in a little over a second. Computation including conditional expressions and/or comparison of values, for example, t test and median, could also be correctly completed in several tens of seconds to a few minutes. If medical records are simply encrypted, the risk of leaks exists because decryption is usually required during statistical analysis. Our system possesses high-level security because medical records remain in encrypted state even during statistical analysis. Also, our system can securely compute some basic statistics with conditional expressions using 'R' that works interactively while secure computation protocols generally require a significant amount of processing time. We propose a secure statistical analysis system using 'R' for medical data that effectively integrates secret-sharing-based secure computation and original computation. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
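    The secret-sharing idea behind the system can be illustrated with additive shares: each value is split into random shares that sum to the value modulo a prime, so no single party sees the data, yet aggregate statistics can still be reconstructed exactly. A toy sketch (the function names and the modulus are illustrative; the paper's actual 'R'-integrated protocols are far more involved):

```python
import random

P = 2**61 - 1  # a large prime modulus (illustrative choice)

def share(value, n_parties=3):
    """Split value into n additive shares that sum to value mod P."""
    shares = [random.randrange(P) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

def secure_mean(records, n_parties=3):
    """Each party sums only its own shares; only the total is reconstructed."""
    party_totals = [0] * n_parties
    for v in records:
        for i, s in enumerate(share(v, n_parties)):
            party_totals[i] = (party_totals[i] + s) % P
    total = sum(party_totals) % P  # reconstruction step
    return total / len(records)
```

Each individual share is uniformly random, so it reveals nothing about the underlying record; only the combined total is opened.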

  18. Assessing human metal accumulations in an urban superfund site.

    PubMed

    Hailer, M Katie; Peck, Christopher P; Calhoun, Michael W; West, Robert F; James, Kyle J; Siciliano, Steven D

    2017-09-01

    Butte, Montana is part of the largest superfund site in the continental United States. Open-pit mining continues in close proximity to Butte's urban population. This study seeks to establish baseline metal concentrations in the hair and blood of individuals living in Butte, MT and possible routes of exposure. Volunteers from Butte (n=116) and Bozeman (n=86) were recruited to submit hair and blood samples and asked to complete a lifestyle survey. Elemental analysis of hair and blood samples was performed by ICP-MS. Three air monitors were stationed in Butte to collect particulate and filters were analyzed by ICP-MS. Soil samples from the yards of Butte volunteers were quantified by ICP-MS. Hair analysis revealed concentrations of Al, As, Cd, Cu, Mn, Mo, and U to be statistically elevated in Butte's population. Blood analysis revealed that the concentration of As was also statistically elevated in the Butte population. Multiple regression analysis was performed for the elements As, Cu, and Mn for hair and blood samples. Soil samples revealed detectable levels of As, Pb, Cu, Mn, and Cd, with As and Cu levels being higher than expected in some of the samples. Air sampling revealed consistently elevated As and Mn levels in the larger particulate sampled as compared to average U.S. ambient air data. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Spatial analysis on future housing markets: economic development and housing implications.

    PubMed

    Liu, Xin; Wang, Lizhe

    2014-01-01

    A coupled projection method combining formal modelling and other statistical techniques was developed to delineate the relationship between economic and social drivers for net new housing allocations. Using the example of employment growth in Tyne and Wear, UK, until 2016, the empirical analysis yields housing projections at the macro- and microspatial levels (e.g., region to subregion to elected ward levels). The results have important implications for the strategic planning of locations for housing and employment, demonstrating both intuitively and quantitatively how local economic developments affect housing demand.

  20. Spatial Analysis on Future Housing Markets: Economic Development and Housing Implications

    PubMed Central

    Liu, Xin; Wang, Lizhe

    2014-01-01

    A coupled projection method combining formal modelling and other statistical techniques was developed to delineate the relationship between economic and social drivers for net new housing allocations. Using the example of employment growth in Tyne and Wear, UK, until 2016, the empirical analysis yields housing projections at the macro- and microspatial levels (e.g., region to subregion to elected ward levels). The results have important implications for the strategic planning of locations for housing and employment, demonstrating both intuitively and quantitatively how local economic developments affect housing demand. PMID:24892097

  1. The Impact of Measurement Noise in GPA Diagnostic Analysis of a Gas Turbine Engine

    NASA Astrophysics Data System (ADS)

    Ntantis, Efstratios L.; Li, Y. G.

    2013-12-01

    The performance diagnostic analysis of a gas turbine is accomplished by estimating a set of internal engine health parameters from available sensor measurements. No physical measuring instrument, however, can completely eliminate measurement uncertainties. Sensor measurements are often distorted by noise and bias, leading to inaccurate estimation results. This paper explores the impact of measurement noise on gas turbine GPA (gas path analysis). The analysis is demonstrated with a test case in which the gas turbine performance simulation and diagnostics code TURBOMATCH is used to build a performance model of an engine similar to the Rolls-Royce Trent 500 turbofan and to carry out the diagnostic analysis in the presence of different levels of measurement noise. Finally, to improve the reliability of the diagnostic results, a statistical analysis of the data scattering caused by sensor uncertainties is made. The diagnostic tool used for the statistical analysis of measurement noise impact is a model-based method utilizing non-linear GPA.
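    The scatter that sensor noise induces in diagnostic inputs can be explored with a simple Monte Carlo sketch (illustrative only; TURBOMATCH and the paper's model-based method are not reproduced here, and the sensor value and noise level are hypothetical):

```python
import random
from statistics import mean, stdev

def noisy_measurements(true_value, noise_sd, n=10000, seed=0):
    """Simulate n readings of a sensor corrupted by Gaussian noise."""
    rng = random.Random(seed)
    return [true_value + rng.gauss(0, noise_sd) for _ in range(n)]

# e.g. a turbine gas temperature sensor (values are made up)
samples = noisy_measurements(1500.0, 5.0)
```

The sample mean and standard deviation of such simulated readings recover the assumed true value and noise level, which is the kind of data-scattering statistic a diagnostic study would examine.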

  2. Evaluation of asbestos levels in two schools before and after asbestos removal. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karaffa, M.A.; Chesson, J.; Russell, J.

    This report presents a statistical evaluation of airborne asbestos data collected at two schools before and after removal of asbestos-containing material (ACM). Although the monitoring data are not totally consistent with new Asbestos Hazard Emergency Response Act (AHERA) requirements and recent EPA guidelines, the study evaluates these historical data by standard statistical methods to determine if abated work areas meet proposed clearance criteria. The objectives of this statistical analysis were to compare (1) airborne asbestos levels indoors after removal with levels outdoors, (2) airborne asbestos levels before and after removal of asbestos, and (3) static sampling and aggressive sampling of airborne asbestos. The results of this evaluation indicated the following: the effect of asbestos removal on indoor air quality is unpredictable; the variability in fiber concentrations among different sampling sites within the same building indicates the need to treat different sites as separate areas for the purpose of clearance; and aggressive sampling is appropriate for clearance testing because it captures more entrainable asbestos structures. Aggressive sampling lowers the chance of declaring a worksite clean when entrainable asbestos is still present.

  3. Students' Successes and Challenges Applying Data Analysis and Measurement Skills in a Fifth-Grade Integrated STEM Unit

    ERIC Educational Resources Information Center

    Glancy, Aran W.; Moore, Tamara J.; Guzey, Selcen; Smith, Karl A.

    2017-01-01

    An understanding of statistics and skills in data analysis are becoming more and more essential, yet research consistently shows that students struggle with these concepts at all levels. This case study documents some of the struggles four groups of fifth-grade students encounter as they collect, organize, and interpret data and then ultimately…

  4. When Does a Nation-Level Analysis Make Sense? ESD and Educational Governance in Brazil, South Africa, and the USA

    ERIC Educational Resources Information Center

    Feinstein, Noah Weeth; Jacobi, Pedro Roberto; Lotz-Sisitka, Heila

    2013-01-01

    International policy analysis tends to simplify the nation state, portraying countries as coherent units that can be described by one statistic or placed into one category. As scholars from Brazil, South Africa, and the USA, we find the nation-centric research perspective particularly challenging. In each of our home countries, the effective…

  5. Analysis of Pre-Service Science Teachers' Views about the Methods Which Develop Reflective Thinking

    ERIC Educational Resources Information Center

    Töman, Ufuk; Odabasi Çimer, Sabiha; Çimer, Atilla

    2014-01-01

    In this study, we investigated science and technology pre-service teachers' opinions about the methods that develop reflective thinking and determined their level of reflective thinking. This study is a descriptive study. Open-ended questions were used to determine the views of pre-service teachers. Questions used in the statistical analysis of…

  6. Ethics Education in University Aviation Management Programs in the US: Part Two B--Statistical Analysis of Current Practice.

    ERIC Educational Resources Information Center

    Oderman, Dale

    2003-01-01

    Part Two B of a three-part study examined how 40 universities with baccalaureate programs in aviation management include ethics education in the curricula. Analysis of responses suggests that there is strong support for ethics instruction and that active department head involvement leads to higher levels of planned ethics inclusion. (JOW)

  7. Using Key Part-of-Speech Analysis to Examine Spoken Discourse by Taiwanese EFL Learners

    ERIC Educational Resources Information Center

    Lin, Yen-Liang

    2015-01-01

    This study reports on a corpus analysis of samples of spoken discourse between a group of British and Taiwanese adolescents, with the aim of exploring the statistically significant differences in the use of grammatical categories between the two groups of participants. The key word method extended to a part-of-speech level using the web-based…

  8. [Factors associated with physical activity among Chinese immigrant women].

    PubMed

    Cho, Sung-Hye; Lee, Hyeonkyeong

    2013-12-01

    This study was done to assess the level of physical activity among Chinese immigrant women and to determine the relationships of physical activity with individual characteristics and behavior-specific cognition. A cross-sectional descriptive study was conducted with 161 Chinese immigrant women living in Busan. A health promotion model of physical activity adapted from Pender's Health Promotion Model was used. Self-administered questionnaires were used to collect data during the period from September 25 to November 20, 2012. Using SPSS 18.0 program, descriptive statistics, t-test, analysis of variance, correlation analysis, and multiple regression analysis were done. The average level of physical activity of the Chinese immigrant women was 1,050.06 ± 686.47 MET-min/week and the minimum activity among types of physical activity was most dominant (59.6%). As a result of multiple regression analysis, it was confirmed that self-efficacy and acculturation were statistically significant variables in the model (p<.001), with an explanatory power of 23.7%. The results indicate that the development and application of intervention strategies to increase acculturation and self-efficacy for immigrant women will aid in increasing the physical activity in Chinese immigrant women.

  9. Parameters optimization defined by statistical analysis for cysteine-dextran radiolabeling with technetium tricarbonyl core.

    PubMed

    Núñez, Eutimio Gustavo Fernández; Faintuch, Bluma Linkowski; Teodoro, Rodrigo; Wiecek, Danielle Pereira; da Silva, Natanael Gomes; Papadopoulos, Minas; Pelecanou, Maria; Pirmettis, Ioannis; de Oliveira Filho, Renato Santos; Duatti, Adriano; Pasqualini, Roberto

    2011-04-01

    The objective of this study was the development of a statistical approach for radiolabeling optimization of cysteine-dextran conjugates with Tc-99m tricarbonyl core. This strategy has been applied to the labeling of 2-propylene-S-cysteine-dextran in the attempt to prepare a new class of tracers for sentinel lymph node detection, and can be extended to other radiopharmaceuticals for different targets. The statistical routine was based on three-level factorial design. Best labeling conditions were achieved. The specific activity reached was 5 MBq/μg. Crown Copyright © 2011. Published by Elsevier Ltd. All rights reserved.
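    A three-level factorial design simply enumerates every combination of factor levels, so k factors at three levels each yield 3^k experimental runs. A minimal sketch (the factor names and level values are hypothetical, not those of the study):

```python
from itertools import product

def full_factorial(levels_per_factor):
    """All combinations of factor levels: a full factorial design."""
    return list(product(*levels_per_factor))

# Hypothetical radiolabeling parameters, three levels each -> 27 runs
design = full_factorial([
    (20, 50, 80),   # ligand amount (illustrative units)
    (25, 50, 75),   # temperature, deg C
    (15, 30, 60),   # reaction time, min
])
```

Each tuple in `design` is one run of the experiment; the response (e.g. labeling yield) measured at each run is then fitted to locate the best conditions.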

  10. Comparative role of 20% cord blood serum and 20% autologous serum in dry eye associated with Hansen's disease: a tear proteomic study.

    PubMed

    Mukhopadhyay, Somnath; Sen, Swarnali; Datta, Himadri

    2015-01-01

    To compare the role of topically applied serum therapy with preservative-free artificial tear (AT) drops in patients with moderate to severe dry eye in Hansen's disease along with change in tear protein profile. 144 consecutive patients were randomly divided into three groups. After a baseline examination of clinical parameters, each of the patients received the designated modality of topical therapy six times a day for 6 weeks. Post-treatment documentation of clinical parameters was done at 6 weeks, and then at 12 weeks after discontinuation of topical therapy. Analysis of three tear proteins using gel electrophoresis (sodium dodecyl sulfate polyacrylamide gel electrophoresis) was done at baseline and at the first and second post-treatment visits. In the cord blood serum (CBS) group, except for McMonnies score and staining score, all other clinical parameters showed continued improvement in the first and second post-treatment analyses. In the autologous serum (ALS) group, all the clinical parameters except Schirmer's I showed significant improvement in the first post-treatment analysis. This was sustained at a significant level in the second analysis except for tear film break-up time (TBUT) and conjunctival impression cytology grading. In the AT group, all the parameters improved at a non-significant level except for TBUT in the first analysis. In the next analysis, apart from McMonnies score and TBUT, other clinical parameters did not improve. In the ALS and CBS groups, tear lysozyme and lactoferrin levels improved in both post-treatment measurements (statistically insignificant). Total tear protein continued to increase at statistically significant levels in the first and second post-treatment analyses in the CBS group and at a statistically insignificant level in the ALS group. In the AT group, the three tear proteins continued to decrease in both analyses. In moderate to severe dry eye in Hansen's disease, serum therapy, in comparison with AT drops, improves clinical parameters and causes betterment in tear protein profile. CTRI/2013/07/003802. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  11. Balneotherapy for osteoarthritis. A cochrane review.

    PubMed

    Verhagen, Arianne; Bierma-Zeinstra, Sita; Lambeck, Johan; Cardoso, Jefferson Rosa; de Bie, Rob; Boers, Maarten; de Vet, Henrica C W

    2008-06-01

    Balneotherapy (or spa therapy, mineral baths) for patients with arthritis is one of the oldest forms of therapy. We assessed effectiveness of balneotherapy for patients with osteoarthritis (OA). We performed a broad search strategy to retrieve eligible studies, selecting randomized controlled trials comparing balneotherapy with any intervention or with no intervention. Two authors independently assessed quality and extracted data. Disagreements were solved by consensus. In the event of clinical heterogeneity or lack of data we refrained from statistical pooling. Seven trials (498 patients) were included in this review: one performed an intention-to-treat analysis, 2 provided data for our own analysis, and one reported a "quality of life" outcome. We found silver-level evidence of mineral baths compared to no treatment (effect sizes 0.34-1.82). Adverse events were not measured or found in included trials. We found silver-level evidence concerning the beneficial effects of mineral baths compared to no treatment. Of all other balneological treatments, no clear effects were found. However, the scientific evidence is weak because of the poor methodological quality and the absence of an adequate statistical analysis and data presentation.

  12. Correlation of spleen metabolism assessed by 18F-FDG PET with serum interleukin-2 receptor levels and other biomarkers in patients with untreated sarcoidosis.

    PubMed

    Kalkanis, Alexandros; Kalkanis, Dimitrios; Drougas, Dimitrios; Vavougios, George D; Datseris, Ioannis; Judson, Marc A; Georgiou, Evangelos

    2016-03-01

    The objective of our study was to assess the possible relationship between splenic F-18-fluorodeoxyglucose (18F-FDG) uptake and other established biochemical markers of sarcoidosis activity. Thirty treatment-naive sarcoidosis patients were prospectively enrolled in this study. They underwent biochemical laboratory tests, including serum interleukin-2 receptor (sIL-2R), serum C-reactive protein, serum angiotensin-I converting enzyme, and 24-h urine calcium levels, and a whole-body combined 18F-FDG PET/computed tomography (PET/CT) scan as a part of an ongoing study at our institute. These biomarkers were statistically compared in these patients. A statistically significant linear dependence was detected between sIL-2R and log-transformed spleen-average standard uptake value (SUV avg) (R2=0.488, P<0.0001) and log-transformed spleen-maximum standard uptake value (SUV max) (R2=0.490, P<0.0001). sIL-2R levels and splenic size correlated linearly (Pearson's r=0.373, P=0.042). Multivariate linear regression analysis revealed that this correlation remained significant after age and sex adjustment (β=0.001, SE=0.001, P=0.024). No statistically significant associations were detected between (a) any two serum biomarkers or (b) between spleen-SUV measurements and any serum biomarker other than sIL-2R. Our analysis revealed an association between sIL-2R levels and spleen 18F-FDG uptake and size, whereas all other serum biomarkers were not significantly associated with each other or with PET 18F-FDG uptake. Our results suggest that splenic inflammation may be related to the systemic inflammatory response in sarcoidosis that may be associated with elevated sIL-2R levels.

  13. Diet and liver apoptosis in rats: a particular metabolic pathway.

    PubMed

    Monteiro, Maria Emilia Lopes; Xavier, Analucia Rampazzo; Azeredo, Vilma Blondet

    2017-03-30

    Various studies have indicated an association between modification in dietary macronutrient composition and liver apoptosis. To explain how changes in metabolic pathways associated with a high-protein, high-fat, and low-carbohydrate diet cause liver apoptosis. Two groups of rats were compared: an experimental diet group (n = 8) fed a high-protein (59.46%), high-fat (31.77%), and low-carbohydrate (8.77%) diet versus a control group (n = 9) fed the American Institute of Nutrition (AIN)-93-M diet. Animals were sacrificed after eight weeks, the adipose tissue was weighed, the liver was removed for flow cytometry analysis, and blood was collected to measure glucose, insulin, glucagon, IL-6, TNF, triglycerides, malondialdehyde, and β-hydroxybutyrate. Statistical analysis was carried out using the unpaired, parametric Student's t-test and Pearson's correlation coefficients. Significance was set at p < 0.05. Animals from the experimental group presented less adipose tissue than those of the control group. The percentage of nonviable hepatocytes in the experimental group was 2.18 times larger than in the control group (p = 0.001). No statistically significant differences were found in capillary glucose, insulin, glucagon, IL-6, or TNF-α between the two groups. Plasmatic β-hydroxybutyrate and malondialdehyde of the experimental group showed higher levels, and triglycerides lower levels, compared with the control group. The results show a positive and significant correlation between the percentage of nonviable hepatocytes and malondialdehyde levels (p = 0.0217) and a statistically significant negative correlation with triglyceride levels (p = 0.006). The results suggest that plasmatic malondialdehyde and triglyceride levels are probably good predictors of liver damage associated with an experimental low-carbohydrate diet in rats.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad Allen

    EDENx is a multivariate data visualization tool that allows interactive, user-driven analysis of large-scale data sets with high dimensionality. EDENx builds on our earlier system, called EDEN, to enable analysis of more dimensions and larger scale data sets. EDENx provides an initial overview of summary statistics for each variable in the data set under investigation. EDENx allows the user to interact with graphical summary plots of the data to investigate subsets and their statistical associations. These plots include histograms, binned scatterplots, binned parallel coordinate plots, timeline plots, and graphical correlation indicators. From the EDENx interface, a user can select a subsample of interest and launch a more detailed data visualization via the EDEN system. EDENx is best suited for high-level, aggregate analysis tasks while EDEN is more appropriate for detailed data investigations.
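
    The binned scatterplots described above reduce millions of points to a fixed grid of counts before drawing. A minimal sketch of that aggregation step, with illustrative names only (not the EDENx API):

```python
def bin2d(xs, ys, nx, ny):
    """Aggregate (x, y) points into an nx-by-ny grid of counts,
    the data structure behind a binned scatterplot."""
    xmin, xmax = min(xs), max(xs)
    ymin, ymax = min(ys), max(ys)
    grid = [[0] * nx for _ in range(ny)]
    for x, y in zip(xs, ys):
        # Map each point to a cell index, clamping the maximum edge
        # into the last cell.
        i = min(int((x - xmin) / (xmax - xmin) * nx), nx - 1)
        j = min(int((y - ymin) / (ymax - ymin) * ny), ny - 1)
        grid[j][i] += 1
    return grid
```

    The renderer then colors each cell by its count, so drawing cost depends on the grid size rather than the number of records.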

  15. Statistical analysis of water-quality data containing multiple detection limits: S-language software for regression on order statistics

    USGS Publications Warehouse

    Lee, L.; Helsel, D.

    2005-01-01

    Trace contaminants in water, including metals and organics, often are measured at sufficiently low concentrations to be reported only as values below the instrument detection limit. Interpretation of these "less thans" is complicated when multiple detection limits occur. Statistical methods for multiply censored, or multiple-detection limit, datasets have been developed for medical and industrial statistics, and can be employed to estimate summary statistics or model the distributions of trace-level environmental data. We describe S-language-based software tools that perform robust linear regression on order statistics (ROS). The ROS method has been evaluated as one of the most reliable procedures for developing summary statistics of multiply censored data. It is applicable to any dataset that has 0 to 80% of its values censored. These tools are a part of a software library, or add-on package, for the R environment for statistical computing. This library can be used to generate ROS models and associated summary statistics, plot modeled distributions, and predict exceedance probabilities of water-quality standards. © 2005 Elsevier Ltd. All rights reserved.
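
    As a rough illustration of the ROS idea (not the S-language library described here): detected values are regressed on normal scores at their plotting positions, and the fitted line imputes the censored observations. A simplified sketch for a single detection limit, assuming roughly lognormal data:

```python
from statistics import NormalDist
import math

def simple_ros(detects, n_censored):
    """Simplified ROS: regress log-concentration of detected values on
    normal scores, then impute the n_censored values below one limit."""
    n = len(detects) + n_censored
    xs = sorted(detects)
    nd = NormalDist()
    pts = []
    for i, x in enumerate(xs):
        # Detected values occupy the upper ranks; Blom plotting positions.
        rank = n_censored + i + 1
        p = (rank - 0.375) / (n + 0.25)
        pts.append((nd.inv_cdf(p), math.log(x)))
    # Least-squares line: log(conc) = a + b * z
    zs = [z for z, _ in pts]
    ys = [y for _, y in pts]
    zbar = sum(zs) / len(zs)
    ybar = sum(ys) / len(ys)
    b = sum((z - zbar) * (y - ybar) for z, y in pts) / sum((z - zbar) ** 2 for z in zs)
    a = ybar - b * zbar
    # Impute the censored observations at their own plotting positions.
    return [math.exp(a + b * nd.inv_cdf((r - 0.375) / (n + 0.25)))
            for r in range(1, n_censored + 1)]
```

    Summary statistics (mean, percentiles) are then computed over the detected values together with the imputed ones, rather than by substituting half the detection limit.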

  16. Statistical analysis of effective singular values in matrix rank determination

    NASA Technical Reports Server (NTRS)

    Konstantinides, Konstantinos; Yao, Kung

    1988-01-01

    A major problem in using SVD (singular-value decomposition) as a tool in determining the effective rank of a perturbed matrix is that of distinguishing between significantly small and significantly large singular values. To this end, confidence regions are derived for the perturbed singular values of matrices with noisy observation data. The analysis is based on the theories of perturbations of singular values and statistical significance tests. Threshold bounds for perturbation due to finite-precision and i.i.d. random models are evaluated. In random models, the threshold bounds depend on the dimension of the matrix, the noise variance, and a predefined statistical level of significance. The results are applied to the problem of determining the effective order of a linear autoregressive system from the approximate rank of a sample autocorrelation matrix. Various numerical examples illustrating the usefulness of these bounds and comparisons to other previously known approaches are given.
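
    The basic mechanism can be sketched as follows: compute the singular values, then count how many exceed a noise-dependent threshold. The threshold below is a simple heuristic stand-in for the paper's statistically derived confidence bounds:

```python
import numpy as np

def effective_rank(A, noise_std, factor=3.0):
    """Count singular values above a noise-based threshold.

    The threshold (factor * noise_std * sqrt(max dimension)) is a
    heuristic sketch, not the confidence-region bound of the paper.
    """
    s = np.linalg.svd(A, compute_uv=False)
    tau = factor * noise_std * np.sqrt(max(A.shape))
    return int(np.sum(s > tau))
```

    For example, a rank-2 matrix perturbed by small i.i.d. noise is still reported as rank 2, because the noise-induced singular values stay below the threshold.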

  17. Laser diagnostics of native cervix dabs with human papilloma virus in high carcinogenic risk

    NASA Astrophysics Data System (ADS)

    Peresunko, O. P.; Karpenko, Ju. G.; Burkovets, D. N.; Ivashko, P. V.; Nikorych, A. V.; Yermolenko, S. B.; Gruia, Ion; Gruia, M. J.

    2015-11-01

    The results of experimental studies of the coordinate distributions of Mueller matrix elements for the following types of cervical scraping tissue are presented: normal tissue, low-grade to highly differentiated dysplasia (CIN1-CIN3), and adenocarcinoma of high, medium, and low levels of differentiation (G1-G3). The rationale is given for the choice of statistical moments of the 1st-4th orders of the polarized coherent radiation field, transformed as a result of interaction with the oncologically modified biological layers ("epithelium-stroma"), as a quantitative criterion for the polarimetric optical differentiation of the state of human biological tissues. The analysis of the obtained Mueller matrix elements by statistical and correlation methods, systematized by the types of tissues studied, is carried out. The results of studying images of the Mueller matrix element m34 for low-grade dysplasia (CIN2), together with its statistical and correlation analysis, are presented.

  18. A phylogenetic transform enhances analysis of compositional microbiota data.

    PubMed

    Silverman, Justin D; Washburne, Alex D; Mukherjee, Sayan; David, Lawrence A

    2017-02-15

    Surveys of microbial communities (microbiota), typically measured as relative abundance of species, have illustrated the importance of these communities in human health and disease. Yet, statistical artifacts commonly plague the analysis of relative abundance data. Here, we introduce the PhILR transform, which incorporates microbial evolutionary models with the isometric log-ratio transform to allow off-the-shelf statistical tools to be safely applied to microbiota surveys. We demonstrate that analyses of community-level structure can be applied to PhILR transformed data with performance on benchmarks rivaling or surpassing standard tools. Additionally, by decomposing distance in the PhILR transformed space, we identified neighboring clades that may have adapted to distinct human body sites. Decomposing variance revealed that covariation of bacterial clades within human body sites increases with phylogenetic relatedness. Together, these findings illustrate how the PhILR transform combines statistical and phylogenetic models to overcome compositional data challenges and enable evolutionary insights relevant to microbial communities.
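
    The isometric log-ratio transform underlying PhILR maps a composition to unconstrained "balance" coordinates so standard statistics apply. A minimal sketch for a 3-part composition under one sequential binary partition, using the standard balance formulas (illustrative only, not the PhILR package API):

```python
import math

def ilr_3part(x1, x2, x3):
    """ilr balances for a 3-part composition under the partition
    {x1 | x2}, then {x1, x2 | x3}."""
    b1 = math.sqrt(1 / 2) * math.log(x1 / x2)
    b2 = math.sqrt(2 / 3) * math.log(math.sqrt(x1 * x2) / x3)
    return b1, b2
```

    The balances are invariant to the total (only relative abundances matter), which is exactly why compositional artifacts such as spurious negative correlations are avoided; PhILR's contribution is choosing the partition from the phylogeny and weighting the parts accordingly.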

  19. Geostatistical interpolation model selection based on ArcGIS and spatio-temporal variability analysis of groundwater level in piedmont plains, northwest China.

    PubMed

    Xiao, Yong; Gu, Xiaomin; Yin, Shiyang; Shao, Jingli; Cui, Yali; Zhang, Qiulan; Niu, Yong

    2016-01-01

    Based on geostatistical theory and the ArcGIS geostatistical module, data from 30 groundwater level observation wells were used to estimate the decline of the groundwater level in the Beijing piedmont. Seven different interpolation methods (inverse distance weighted interpolation, global polynomial interpolation, local polynomial interpolation, tension spline interpolation, ordinary Kriging interpolation, simple Kriging interpolation, and universal Kriging interpolation) were used for interpolating the groundwater level between 2001 and 2013. Cross-validation, absolute error, and the coefficient of determination (R^2) were applied to evaluate the accuracy of the different methods. The results show that the simple Kriging method gave the best fit. The analysis of spatial and temporal variability suggests that the nugget effects from 2001 to 2013 were increasing, which means the spatial correlation weakened gradually under the influence of human activities. The spatial variability in the middle areas of the alluvial-proluvial fan is relatively higher than in the areas at the top and bottom. Owing to changes in land use, the groundwater level also shows temporal variation: the average decline rate of the groundwater level between 2007 and 2013 increased compared with 2001-2006. Urban development and population growth cause over-exploitation in residential and industrial areas. The decline rate of the groundwater level in residential, industrial, and river areas is relatively high, whereas the decrease in farmland area and the development of water-saving irrigation reduce the quantity of water used by agriculture, so the decline rate of the groundwater level in agricultural areas is not significant.
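
    The cross-validation used to rank the seven methods can be sketched with the simplest of them, inverse distance weighting: each well is left out in turn, predicted from the others, and the errors are averaged. A stdlib-only sketch (illustrative, not the ArcGIS implementation):

```python
import math

def idw(points, q, power=2):
    """Inverse-distance-weighted estimate at query point q
    from (x, y, z) observations."""
    num = den = 0.0
    for x, y, z in points:
        d = math.hypot(x - q[0], y - q[1])
        if d == 0:
            return z  # exact at an observation location
        w = d ** -power
        num += w * z
        den += w
    return num / den

def loo_mae(points):
    """Leave-one-out cross-validation: mean absolute error of
    predicting each well from all the others."""
    errs = []
    for i, (x, y, z) in enumerate(points):
        others = points[:i] + points[i + 1:]
        errs.append(abs(idw(others, (x, y)) - z))
    return sum(errs) / len(errs)
```

    Running the same loop with each interpolator and comparing the resulting error statistics is how "simple Kriging gave the best fit" is established.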

  20. Statistical analysis to assess automated level of suspicion scoring methods in breast ultrasound

    NASA Astrophysics Data System (ADS)

    Galperin, Michael

    2003-05-01

    A well-defined rule-based system has been developed for scoring the Level of Suspicion (LOS) from 0 to 5 based on a qualitative lexicon describing the ultrasound appearance of breast lesions. The purpose of the research is to assess and select one of the automated quantitative LOS scoring methods developed during preliminary studies on the reduction of benign biopsies. The study used a Computer Aided Imaging System (CAIS) to improve the uniformity and accuracy of applying the LOS scheme by automatically detecting, analyzing, and comparing breast masses. The overall goal is to reduce biopsies on masses with lower levels of suspicion, rather than to increase the accuracy of diagnosis of cancers (which will require biopsy anyway). On complex cyst and fibroadenoma cases, experienced radiologists were up to 50% less certain in true negatives than CAIS. Full correlation analysis was applied to determine which of the proposed LOS quantification methods serves CAIS accuracy best. This paper presents current results of applying statistical analysis to automated LOS scoring quantification for breast masses with known biopsy results. It was found that the First Order Ranking method yielded the most accurate results. The CAIS system (Image Companion, Data Companion software) was developed by Almen Laboratories and was used to achieve the results.

  1. Fast and accurate imputation of summary statistics enhances evidence of functional enrichment

    PubMed Central

    Pasaniuc, Bogdan; Zaitlen, Noah; Shi, Huwenbo; Bhatia, Gaurav; Gusev, Alexander; Pickrell, Joseph; Hirschhorn, Joel; Strachan, David P.; Patterson, Nick; Price, Alkes L.

    2014-01-01

    Motivation: Imputation using external reference panels (e.g. 1000 Genomes) is a widely used approach for increasing power in genome-wide association studies and meta-analysis. Existing hidden Markov models (HMM)-based imputation approaches require individual-level genotypes. Here, we develop a new method for Gaussian imputation from summary association statistics, a type of data that is becoming widely available. Results: In simulations using 1000 Genomes (1000G) data, this method recovers 84% (54%) of the effective sample size for common (>5%) and low-frequency (1–5%) variants [increasing to 87% (60%) when summary linkage disequilibrium information is available from target samples] versus the gold standard of 89% (67%) for HMM-based imputation, which cannot be applied to summary statistics. Our approach accounts for the limited sample size of the reference panel, a crucial step to eliminate false-positive associations, and it is computationally very fast. As an empirical demonstration, we apply our method to seven case–control phenotypes from the Wellcome Trust Case Control Consortium (WTCCC) data and a study of height in the British 1958 birth cohort (1958BC). Gaussian imputation from summary statistics recovers 95% (105%) of the effective sample size (as quantified by the ratio of χ2 association statistics) compared with HMM-based imputation from individual-level genotypes at the 227 (176) published single nucleotide polymorphisms (SNPs) in the WTCCC (1958BC height) data. In addition, for publicly available summary statistics from large meta-analyses of four lipid traits, we publicly release imputed summary statistics at 1000G SNPs, which could not have been obtained using previously published methods, and demonstrate their accuracy by masking subsets of the data. 
We show that 1000G imputation using our approach increases the magnitude and statistical evidence of enrichment at genic versus non-genic loci for these traits, as compared with an analysis without 1000G imputation. Thus, imputation of summary statistics will be a valuable tool in future functional enrichment analyses. Availability and implementation: Publicly available software package available at http://bogdan.bioinformatics.ucla.edu/software/. Contact: bpasaniuc@mednet.ucla.edu or aprice@hsph.harvard.edu Supplementary information: Supplementary materials are available at Bioinformatics online. PMID:24990607
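
    The core of Gaussian imputation from summary statistics is a conditional-mean calculation: under a multivariate normal model for association z-scores, untyped SNPs are imputed from typed ones through the LD correlation matrix, with a ridge term reflecting the limited reference-panel sample size. A minimal sketch with illustrative names (not the released software's API):

```python
import numpy as np

def impute_zscores(z_obs, R_oo, R_uo, lam=0.1):
    """Impute z-scores at untyped SNPs from typed ones.

    z_obs : observed z-scores at typed SNPs (length p)
    R_oo  : LD correlation among typed SNPs (p x p)
    R_uo  : LD correlation of untyped vs typed SNPs (q x p)
    lam   : ridge penalty accounting for reference-panel noise
    """
    R_reg = R_oo + lam * np.eye(len(z_obs))
    # Conditional mean of the unobserved z-scores given the observed ones.
    return R_uo @ np.linalg.solve(R_reg, z_obs)
```

    Intuitively, an untyped SNP in perfect LD with a typed one inherits its z-score (shrunk slightly by the ridge term), while an untyped SNP uncorrelated with everything typed is imputed to zero.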

  2. Archival Legacy Investigations of Circumstellar Environments (ALICE): Statistical assessment of point source detections

    NASA Astrophysics Data System (ADS)

    Choquet, Élodie; Pueyo, Laurent; Soummer, Rémi; Perrin, Marshall D.; Hagan, J. Brendan; Gofas-Salas, Elena; Rajan, Abhijith; Aguilar, Jonathan

    2015-09-01

    The ALICE program, for Archival Legacy Investigation of Circumstellar Environments, is currently conducting a virtual survey of about 400 stars by re-analyzing the HST-NICMOS coronagraphic archive with advanced post-processing techniques. We present here the strategy that we adopted to identify detections and potential candidates for follow-up observations, and we give a preliminary overview of our detections. We present a statistical analysis conducted to evaluate the confidence level of these detections and the completeness of our candidate search.

  3. The role of environmental heterogeneity in meta-analysis of gene-environment interactions with quantitative traits.

    PubMed

    Li, Shi; Mukherjee, Bhramar; Taylor, Jeremy M G; Rice, Kenneth M; Wen, Xiaoquan; Rice, John D; Stringham, Heather M; Boehnke, Michael

    2014-07-01

    With challenges in data harmonization and environmental heterogeneity across various data sources, meta-analysis of gene-environment interaction studies can often involve subtle statistical issues. In this paper, we study the effect of environmental covariate heterogeneity (within and between cohorts) on two approaches for fixed-effect meta-analysis: the standard inverse-variance weighted meta-analysis and a meta-regression approach. Akin to the results in Simmonds and Higgins (), we obtain analytic efficiency results for both methods under certain assumptions. The relative efficiency of the two methods depends on the ratio of within versus between cohort variability of the environmental covariate. We propose to use an adaptively weighted estimator (AWE), between meta-analysis and meta-regression, for the interaction parameter. The AWE retains the full efficiency of joint analysis of individual-level data under certain natural assumptions. Lin and Zeng (2010a, b) showed that a multivariate inverse-variance weighted estimator retains the full efficiency of joint analysis of individual-level data, if the estimates with full covariance matrices for all the common parameters are pooled across all studies. We show consistency of our work with Lin and Zeng (2010a, b). Without sacrificing much efficiency, the AWE uses only univariate summary statistics from each study, and bypasses issues with sharing individual-level data or full covariance matrices across studies. We compare the performance of the methods both analytically and numerically. The methods are illustrated through meta-analysis of interaction between Single Nucleotide Polymorphisms in the FTO gene and body mass index on high-density lipoprotein cholesterol data from a set of eight studies of type 2 diabetes. © 2014 WILEY PERIODICALS, INC.
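
    The standard fixed-effect inverse-variance weighted combination discussed above can be sketched in a few lines (this is the textbook estimator, not the authors' AWE):

```python
def ivw_meta(betas, ses):
    """Fixed-effect inverse-variance weighted meta-analysis.

    Combines per-study effect estimates and standard errors into a
    pooled estimate and its standard error.
    """
    weights = [1.0 / se ** 2 for se in ses]
    beta = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    se = (1.0 / sum(weights)) ** 0.5
    return beta, se
```

    Each study contributes in proportion to its precision, so a study with a tenfold smaller standard error dominates the pooled estimate; the AWE proposed in the paper interpolates between this estimator and a meta-regression estimator.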

  4. Chromosome breakage in humans exposed to methyl mercury through fish consumption

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skerfving, S.; Hansson, K.; Lindsten, J.

    1980-08-01

    Chromosome analysis was performed on cells from lymphocyte cultures from nine subjects with increased levels of mercury in their red blood cells and in four healthy controls. The elevated mercury levels were likely to have originated from dietary fish with high levels of methyl mercury. A statistically significant rank correlation was found between the frequency of cells with chromosome breaks and mercury concentration. The biological significance of these findings is at present unknown.

  5. Modulating toll-like receptor-mediated inflammatory responses following exposure of whole cell and lipopolysaccharide component from Porphyromonas gingivalis in wistar rat models

    PubMed Central

    Nelwan, Sindy Cornelia; Nugraha, Ricardo Adrian; Endaryanto, Anang; Retno, Indrawati

    2017-01-01

    Objective: To explore the host innate inflammatory response and the signaling pathway induced by Porphyromonas gingivalis by measuring the levels of toll-like receptor 2 (TLR2) and TLR4 activity. Materials and Methods: An animal experimental study with a pretest-posttest controlled group design was conducted between January 1 and December 10, 2016. A total of 28 Wistar rats were used, randomized into 7 groups, each given a different dose of an intra-sulcular injection of Porphyromonas gingivalis lipopolysaccharide. Statistical Analysis: Normality was assessed by the Shapiro–Wilk test, while statistical analysis was made by ANOVA, t test, Pearson correlation, and a linear regression model. Results: At day 0, no significant difference in TLR2 and TLR4 levels was measured. At day 4, there was a slight difference in TLR2 and TLR4 levels among the groups. At day 11, there was a significant difference in TLR2 and TLR4 levels among the groups. Groups exposed to whole cells developed greater TLR2 but lower TLR4 levels. On the contrary, groups exposed to LPS developed greater TLR4 but lower TLR2 levels. Conclusion: Our data support that P. gingivalis plays a vital role in the pathogenesis of pathogen-induced inflammatory responses, in which TLR2 and TLR4 have different molecular mechanisms following recognition of pathogens and the inflammatory response. PMID:29279665

  6. Water levels and groundwater and surface-water exchanges in lakes of the northeast Twin Cities Metropolitan Area, Minnesota, 2002 through 2015

    USGS Publications Warehouse

    Jones, Perry M.; Trost, Jared J.; Erickson, Melinda L.

    2016-10-19

    Overview: This study assessed lake-water levels and regional and local groundwater and surface-water exchanges near northeast Twin Cities Metropolitan Area lakes applying three approaches: statistical analysis, field study, and groundwater-flow modeling.  Statistical analyses of lake levels were completed to assess the effect of physical setting and climate on lake-level fluctuations of selected lakes. A field study of groundwater and surface-water interactions in selected lakes was completed to (1) estimate potential percentages of surface-water contributions to well water across the northeast Twin Cities Metropolitan Area, (2) estimate general ages for waters extracted from the wells, and (3) assess groundwater inflow to lakes and lake-water outflow to aquifers downgradient from White Bear Lake.  Groundwater flow was simulated using a steady-state, groundwater-flow model to assess regional groundwater and surface-water exchanges and the effects of groundwater withdrawals, climate, and other factors on water levels of northeast Twin Cities Metropolitan Area lakes.

  7. Economic stress or random variation? Revisiting German reunification as a natural experiment to investigate the effect of economic contraction on sex ratios at birth.

    PubMed

    Schnettler, Sebastian; Klüsener, Sebastian

    2014-12-22

    The economic stress hypothesis (ESH) predicts decreases in the sex ratio at birth (SRB) following economic decline. However, as many factors influence the SRB, this hypothesis is difficult to test empirically. Thus, researchers make use of quasi-experiments such as German reunification: The economy in East, but not in West Germany, underwent a rapid decline in 1991. A co-occurrence of a decline in the East German SRB in 1991 has been interpreted by some as support for the ESH. However, another explanation might be that the low SRB in 1991 stems from increased random variation in the East German SRB due to a drastically reduced number of births during the crisis. We look into this alternative random variation hypothesis (RVH) by re-examining the German case with more detailed data. Our analysis has two parts. First, using aggregate-level birth register data for all births in the period between 1946 and 2011, we plot the quantum and variance of the SRB and the number of births and unemployment rates, separately for East and West Germany, and conduct a time series analysis on the East German SRB over time. Second, we model the odds for a male birth at the individual level in a multiple logistic regression (1991-2010, ~13.9 million births). Explanatory variables are related to the level of the individual birth, the mother of the child born, and the regional economic context. The aggregate-level analysis reveals a higher degree of variation of the SRB in East Germany. Deviations from the time trend occur in several years, seemingly unrelated to economic development, and the deviation in 1991 is not statistically significant. The individual-level analysis confirms that the 1991-drop in the East German SRB cannot directly be attributed to economic development and that there is no statistically significant effect of economic development on sex determination in East or West Germany. Outcomes support the RVH but not the ESH. 
Furthermore, our results speak against a statistically significant effect of the reunification event itself on the East German SRB. We discuss the relative importance of behavioral and physiological responses to macro-level stressors, a distinction that may help integrate previously mixed findings.
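
    The random-variation hypothesis rests on simple binomial sampling: with a male-birth probability of roughly p = 0.514, the standard deviation of the observed sex ratio at birth scales as 1/sqrt(n), so the crisis-driven collapse in East German births mechanically inflates year-to-year SRB fluctuation. A sketch under these assumptions:

```python
import math

def srb_sd(n_births, p_male=0.514):
    """Binomial standard deviation of the observed male proportion
    among n births."""
    return math.sqrt(p_male * (1 - p_male) / n_births)

# Halving the annual number of births inflates the purely random
# component of SRB variation by a factor of sqrt(2).
```

    This is why a single-year dip in a small birth cohort is weak evidence on its own: the plausible range of chance fluctuation widens exactly when the economy contracts.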

  8. Introduction to bioinformatics.

    PubMed

    Can, Tolga

    2014-01-01

    Bioinformatics is an interdisciplinary field mainly involving molecular biology and genetics, computer science, mathematics, and statistics. Data intensive, large-scale biological problems are addressed from a computational point of view. The most common problems are modeling biological processes at the molecular level and making inferences from collected data. A bioinformatics solution usually involves the following steps: Collect statistics from biological data. Build a computational model. Solve a computational modeling problem. Test and evaluate a computational algorithm. This chapter gives a brief introduction to bioinformatics by first providing an introduction to biological terminology and then discussing some classical bioinformatics problems organized by the types of data sources. Sequence analysis is the analysis of DNA and protein sequences for clues regarding function and includes subproblems such as identification of homologs, multiple sequence alignment, searching sequence patterns, and evolutionary analyses. Protein structures are three-dimensional data and the associated problems are structure prediction (secondary and tertiary), analysis of protein structures for clues regarding function, and structural alignment. Gene expression data is usually represented as matrices and analysis of microarray data mostly involves statistical analysis, classification, and clustering approaches. Biological networks such as gene regulatory networks, metabolic pathways, and protein-protein interaction networks are usually modeled as graphs and graph theoretic approaches are used to solve associated problems such as construction and analysis of large-scale networks.

  9. Two-mode mazer injected with V-type three-level atoms

    NASA Astrophysics Data System (ADS)

    Liang, Wen-Qing; Zhang, Zhi-Ming; Xie, Sheng-Wu

    2003-12-01

    The properties of the two-mode mazer operating on V-type three-level atoms are studied. The effect of the one-atom pumping on the two modes of the cavity field in number-state is asymmetric, that is, the atom emits a photon into one mode with some probability and absorbs a photon from the other mode with some other probability. This effect makes the steady-state photon distribution and the steady-state photon statistics asymmetric for the two modes. The diagram of the probability currents for the photon distribution, given by the analysis of the master equation, reveals that there is no detailed balance solution for the master equation. The computations show that the photon statistics of one mode or both modes can be sub-Poissonian, that the two modes can have anticorrelation or correlation, that the photon statistics increases with the increase of thermal photons and that the resonant position and strength of the photon statistics are influenced by the ratio of the two coupling strengths of the two modes. These properties are also discussed physically.

  10. A statistical approach to optimizing concrete mixture design.

    PubMed

    Ahmad, Shamsad; Alghamdi, Saeid A

    2014-01-01

    A step-by-step statistical approach is proposed to obtain optimum proportioning of concrete mixtures using the data obtained through a statistically planned experimental program. The utility of the proposed approach for optimizing the design of concrete mixture is illustrated considering a typical case in which trial mixtures were considered according to a full factorial experiment design involving three factors and their three levels (3^3). A total of 27 concrete mixtures with three replicates (81 specimens) were considered by varying the levels of key factors affecting compressive strength of concrete, namely, water/cementitious materials ratio (0.38, 0.43, and 0.48), cementitious materials content (350, 375, and 400 kg/m^3), and fine/total aggregate ratio (0.35, 0.40, and 0.45). The experimental data were utilized to carry out analysis of variance (ANOVA) and to develop a polynomial regression model for compressive strength in terms of the three design factors considered in this study. The developed statistical model was used to show how optimization of concrete mixtures can be carried out with different possible options.

  12. An Adaptive Association Test for Multiple Phenotypes with GWAS Summary Statistics.

    PubMed

    Kim, Junghi; Bai, Yun; Pan, Wei

    2015-12-01

    We study the problem of testing for single marker-multiple phenotype associations based on genome-wide association study (GWAS) summary statistics without access to individual-level genotype and phenotype data. For most published GWASs, because obtaining summary data is substantially easier than accessing individual-level phenotype and genotype data, while often multiple correlated traits have been collected, the problem studied here has become increasingly important. We propose a powerful adaptive test and compare its performance with some existing tests. We illustrate its applications to analyses of a meta-analyzed GWAS dataset with three blood lipid traits and another with sex-stratified anthropometric traits, and further demonstrate its potential power gain over some existing methods through realistic simulation studies. We start from the situation with only one set of (possibly meta-analyzed) genome-wide summary statistics, then extend the method to meta-analysis of multiple sets of genome-wide summary statistics, each from one GWAS. We expect the proposed test to be useful in practice as more powerful than or complementary to existing methods. © 2015 WILEY PERIODICALS, INC.
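
    Adaptive tests of this kind are often built on a family of sum-of-powered-score (SPU) statistics over the per-phenotype z-scores, taking the smallest simulation-based p-value across powers. A simplified sketch, assuming independent phenotypes under the null and omitting the second simulation layer that calibrates the minimum p-value (both simplifications relative to a full adaptive test):

```python
import math
import random

def spu(z, gamma):
    """Sum-of-powered-score statistic over z-scores for one variant
    across multiple phenotypes; gamma = inf takes the max |z|."""
    if gamma == math.inf:
        return max(abs(v) for v in z)
    return abs(sum(v ** gamma for v in z))

def adaptive_pvalue(z, gammas=(1, 2, math.inf), n_sim=2000, seed=7):
    """Smallest simulation p-value over the SPU family (uncalibrated)."""
    rng = random.Random(seed)
    obs = {g: spu(z, g) for g in gammas}
    exceed = {g: 0 for g in gammas}
    for _ in range(n_sim):
        z0 = [rng.gauss(0, 1) for _ in z]  # null draw, independence assumed
        for g in gammas:
            if spu(z0, g) >= obs[g]:
                exceed[g] += 1
    return min((exceed[g] + 1) / (n_sim + 1) for g in gammas)
```

    Low powers favor many small, same-direction effects while high powers favor one strong effect; the adaptive step lets the data pick. In practice the null draws must respect the correlation among phenotype z-scores, which methods like the one above estimate from the GWAS summary statistics themselves.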

  13. Managing Complexity in Evidence Analysis: A Worked Example in Pediatric Weight Management.

    PubMed

    Parrott, James Scott; Henry, Beverly; Thompson, Kyle L; Ziegler, Jane; Handu, Deepa

    2018-05-02

    Nutrition interventions are often complex and multicomponent. Typical approaches to meta-analyses that focus on individual causal relationships to provide guideline recommendations are not sufficient to capture this complexity. The objective of this study is to describe the method of meta-analysis used for the Pediatric Weight Management (PWM) Guidelines update and provide a worked example that can be applied in other areas of dietetics practice. The effects of PWM interventions were examined for body mass index (BMI), body mass index z-score (BMIZ), and waist circumference at four different time periods. For intervention-level effects, intervention types were identified empirically using multiple correspondence analysis paired with cluster analysis. Pooled effects of identified types were examined using random effects meta-analysis models. Differences in effects among types were examined using meta-regression. Context-level effects were examined using qualitative comparative analysis. Three distinct types (or families) of PWM interventions were identified: medical nutrition, behavioral, and missing components. Medical nutrition and behavioral types showed statistically significant improvements in BMIZ across all time points. Results were less consistent for BMI and waist circumference, although four distinct patterns of weight status change were identified. These varied by intervention type as well as outcome measure. Meta-regression indicated statistically significant differences between the medical nutrition and behavioral types vs the missing component type for both BMIZ and BMI, although the pattern varied by time period and intervention type. Qualitative comparative analysis identified distinct configurations of context characteristics at each time point that were consistent with positive outcomes among the intervention types. 
Although analysis of individual causal relationships is invaluable, this approach is inadequate to capture the complexity of dietetics practice. An alternative approach that integrates intervention-level with context-level meta-analyses may provide deeper understanding in the development of practice guidelines. Copyright © 2018 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.
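
The random-effects pooling step described in this record is conventionally done with a DerSimonian-Laird estimator. As a minimal sketch (not the guideline authors' actual code), with hypothetical per-study BMIZ changes and variances standing in for real extracted data:

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate with DerSimonian-Laird tau^2."""
    y = np.asarray(effects, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v                                  # fixed-effect weights
    fixed = np.sum(w * y) / np.sum(w)            # fixed-effect pooled mean
    q = np.sum(w * (y - fixed) ** 2)             # Cochran's Q heterogeneity
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)      # between-study variance
    w_star = 1.0 / (v + tau2)                    # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    return pooled, tau2

# Hypothetical per-study BMIZ changes and their sampling variances.
pooled, tau2 = dersimonian_laird([-0.30, -0.25, -0.10, -0.45],
                                 [0.01, 0.02, 0.015, 0.03])
```

The pooled estimate falls between the most and least extreme study effects, and `tau2` quantifies the between-study heterogeneity that distinguishes the random-effects model from a fixed-effect one.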

  14. [Review of meta-analysis research on exercise in South Korea].

    PubMed

    Song, Youngshin; Gang, Moonhee; Kim, Sun Ae; Shin, In Soo

    2014-10-01

    The purpose of this study was to evaluate the quality of meta-analyses regarding exercise using the Assessment of Multiple Systematic Reviews (AMSTAR) tool, as well as to compare effect sizes according to outcomes. Electronic databases, including the Korean Studies Information Service System (KISS), the National Assembly Library, and DBpia, HAKJISA and RISS4U, were searched for the dates 1990 to January 2014 for 'meta-analysis' and 'exercise' in the fields of medicine, nursing, physical therapy and physical exercise in Korea. AMSTAR was scored for quality assessment of the 33 articles included in the study. Data were analyzed using descriptive statistics, t-test, ANOVA and χ²-test. The mean score for AMSTAR evaluations was 4.18 (SD=1.78); about 67% of the studies were classified at the low-quality level and 30% at the moderate-quality level. Quality scores differed statistically by field of research, number of participants, number of databases, financial support and approval by IRB. The effect sizes presented in individual studies differed by the type of exercise applied in the intervention. This critical appraisal of meta-analyses published in various fields that focused on exercise indicates that a guideline such as the PRISMA checklist should be strongly recommended for optimum reporting of meta-analyses across research fields.

  15. Multiple Phenotype Association Tests Using Summary Statistics in Genome-Wide Association Studies

    PubMed Central

    Liu, Zhonghua; Lin, Xihong

    2017-01-01

    We study in this paper jointly testing the associations of a genetic variant with correlated multiple phenotypes using the summary statistics of individual phenotype analysis from Genome-Wide Association Studies (GWASs). We estimated the between-phenotype correlation matrix using the summary statistics of individual phenotype GWAS analyses, and developed genetic association tests for multiple phenotypes by accounting for between-phenotype correlation without the need to access individual-level data. Since genetic variants often affect multiple phenotypes differently across the genome and the between-phenotype correlation can be arbitrary, we proposed robust and powerful multiple phenotype testing procedures by jointly testing a common mean and a variance component in linear mixed models for summary statistics. We computed the p-values of the proposed tests analytically. This computational advantage makes our methods practically appealing in large-scale GWASs. We performed simulation studies to show that the proposed tests maintained correct type I error rates, and to compare their powers in various settings with the existing methods. We applied the proposed tests to a GWAS Global Lipids Genetics Consortium summary statistics data set and identified additional genetic variants that were missed by the original single-trait analysis. PMID:28653391

  16. Multiple phenotype association tests using summary statistics in genome-wide association studies.

    PubMed

    Liu, Zhonghua; Lin, Xihong

    2018-03-01

    We study in this article jointly testing the associations of a genetic variant with correlated multiple phenotypes using the summary statistics of individual phenotype analysis from Genome-Wide Association Studies (GWASs). We estimated the between-phenotype correlation matrix using the summary statistics of individual phenotype GWAS analyses, and developed genetic association tests for multiple phenotypes by accounting for between-phenotype correlation without the need to access individual-level data. Since genetic variants often affect multiple phenotypes differently across the genome and the between-phenotype correlation can be arbitrary, we proposed robust and powerful multiple phenotype testing procedures by jointly testing a common mean and a variance component in linear mixed models for summary statistics. We computed the p-values of the proposed tests analytically. This computational advantage makes our methods practically appealing in large-scale GWASs. We performed simulation studies to show that the proposed tests maintained correct type I error rates, and to compare their powers in various settings with the existing methods. We applied the proposed tests to a GWAS Global Lipids Genetics Consortium summary statistics data set and identified additional genetic variants that were missed by the original single-trait analysis. © 2017, The International Biometric Society.
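
The core idea in this record — estimate the between-phenotype correlation from genome-wide null z-scores, then test all phenotypes jointly — can be illustrated with a simple omnibus quadratic-form test. This is a hedged sketch, not the paper's mixed-model procedure: the correlation matrix and z-scores below are simulated, and the chi-square statistic z'R⁻¹z stands in for the authors' more elaborate common-mean and variance-component tests:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulate per-phenotype GWAS z-scores for M null variants and K phenotypes
# whose test statistics are correlated (e.g., via sample overlap).
K, M = 3, 5000
true_R = np.array([[1.0, 0.5, 0.2],
                   [0.5, 1.0, 0.4],
                   [0.2, 0.4, 1.0]])
L = np.linalg.cholesky(true_R)
Z = rng.standard_normal((M, K)) @ L.T      # rows ~ MVN(0, true_R) under H0

# Step 1: estimate the between-phenotype correlation from null z-scores.
R_hat = np.corrcoef(Z, rowvar=False)

# Step 2: omnibus test for one variant: z' R^{-1} z ~ chi2_K under H0.
def omnibus_p(z, R):
    stat = z @ np.linalg.solve(R, z)
    return stats.chi2.sf(stat, df=len(z))

p_null = omnibus_p(Z[0], R_hat)                        # a null variant
p_assoc = omnibus_p(np.array([4.5, 4.0, 1.0]), R_hat)  # an associated one
```

No individual-level data are touched: both the correlation estimate and the test use only summary z-scores, which is the computational appeal the abstract describes.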

  17. A comparison of dental hygienists' salaries to state dental supervision levels.

    PubMed

    Catlett, April

    2014-12-01

    The purpose of this study is to evaluate the effect of dental supervision on registered dental hygienists' salaries in the 50 states and District of Columbia by comparing the average dental hygiene salaries from the largest metropolitan city within each state from May 2011, the most recent valid data, in relation to the required level of dental supervision. A retrospective contrasted-group quasi-experimental design analysis was conducted using the most current mean dental hygiene salaries for the largest metropolitan city within each state and the District of Columbia, which were matched to the appropriate dental supervision level. In addition, a dental assisting salary control group was utilized and correlated to the appropriate dental hygienist salary in the same metropolitan city and state. Samples were obtained from the U.S. Department of Labor. A multivariate analysis of variance (MANOVA) statistical analysis was utilized to assess the relationship of the 5 levels of dentist supervision with the registered dental hygienist salaries. The MANOVA analysis was also utilized to assess the control group, dental assistant salaries. No statistically significant results were found among the dental supervision levels on the measures of dental hygiene salaries and dental assistant salaries (Wilks's Λ=0.81, F(8, 90)=1.29, p=0.26). Analyses of variance (ANOVA) on the dependent variables were also conducted as follow-up tests to the MANOVA. Study results suggest dental hygienists who are required to have a dentist on the premises to complete any dental treatment obtain similar salaries to those dental hygienists who are allowed to work in some settings unsupervised by a dentist. Therefore, dental supervision does not seem to have an impact on dental hygienists' salaries. Copyright © 2014 The American Dental Hygienists' Association.
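
For a one-way design like the one in this record, the Wilks's Λ statistic reported above is det(W)/det(W+B), where W and B are the within- and between-group scatter matrices. A minimal numpy sketch with hypothetical (hygienist salary, assistant salary) pairs drawn from one distribution — a true null, mimicking the study's non-significant finding — might look like:

```python
import numpy as np

def wilks_lambda(groups):
    """One-way MANOVA Wilks's Lambda = det(W) / det(W + B)
    for a list of (n_i x p) group data matrices."""
    all_data = np.vstack(groups)
    grand = all_data.mean(axis=0)
    W = sum((g - g.mean(axis=0)).T @ (g - g.mean(axis=0)) for g in groups)
    B = sum(len(g) * np.outer(g.mean(axis=0) - grand, g.mean(axis=0) - grand)
            for g in groups)
    return np.linalg.det(W) / np.linalg.det(W + B)

rng = np.random.default_rng(3)
# Hypothetical salary pairs (in $1000s) for three supervision levels,
# all drawn from the same distribution (no true group effect).
groups = [rng.normal([70, 35], [8, 4], size=(17, 2)) for _ in range(3)]
lam = wilks_lambda(groups)   # values near 1 indicate little group separation
```

Because B is positive semidefinite, Λ always lies in (0, 1]; under a true null it stays close to 1, which is the pattern the study observed.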

  18. An Efficient Objective Analysis System for Parallel Computers

    NASA Technical Reports Server (NTRS)

    Stobie, James G.

    1999-01-01

    A new objective analysis system designed for parallel computers will be described. The system can produce a global analysis (on a 2 x 2.5 lat-lon grid with 20 levels of heights and winds and 10 levels of moisture) using 120,000 observations in less than 3 minutes on 32 CPUs (SGI Origin 2000). No special parallel code is needed (e.g., MPI or multitasking) and the 32 CPUs do not have to be on the same platform. The system is totally portable and can run on several different architectures at once. In addition, the system can easily scale up to 100 or more CPUs. This will allow for much higher resolution and significant increases in input data. The system scales linearly as the number of observations and the number of grid points. The cost overhead in going from 1 to 32 CPUs is 18%. In addition, the analysis results are identical regardless of the number of processors used. This system has all the characteristics of optimal interpolation, combining detailed instrument and first guess error statistics to produce the best estimate of the atmospheric state. It also includes a new quality control (buddy check) system. Static tests with the system showed its analysis increments are comparable to the latest NASA operational system, including maintenance of mass-wind balance. Results from a 2-month cycling test in the Goddard EOS Data Assimilation System (GEOS DAS) show this new analysis retains the same level of agreement between the first guess and observations (O-F statistics) throughout the entire two months.
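
The optimal-interpolation update underlying such a system weights the first guess against observations using the background error covariance B and the instrument error covariance R. A toy two-gridpoint sketch with made-up covariance values (not the GEOS system's actual statistics):

```python
import numpy as np

# Two grid points, one observation located at grid point 0.
xb = np.array([10.0, 12.0])          # first guess (background) field
B  = np.array([[1.0, 0.6],
               [0.6, 1.0]])          # background error covariance
H  = np.array([[1.0, 0.0]])          # observation operator
R  = np.array([[0.25]])              # instrument error variance
y  = np.array([11.0])                # the observation

# Optimal-interpolation gain and analysis update.
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
xa = xb + K @ (y - H @ xb)           # analysis = first guess + weighted innovation
```

Note how the unobserved grid point is also adjusted (12.0 → 12.48) because the background error correlation spreads the innovation spatially; this is the "combining detailed instrument and first guess error statistics" that the abstract describes.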

  19. Statistical methods for astronomical data with upper limits. II - Correlation and regression

    NASA Technical Reports Server (NTRS)

    Isobe, T.; Feigelson, E. D.; Nelson, P. I.

    1986-01-01

    Statistical methods for calculating correlations and regressions in bivariate censored data where the dependent variable can have upper or lower limits are presented. Cox's regression and the generalization of Kendall's rank correlation coefficient provide significance levels of correlations, and the EM algorithm, under the assumption of normally distributed errors, and its nonparametric analog using the Kaplan-Meier estimator, give estimates for the slope of a regression line. Monte Carlo simulations demonstrate that survival analysis is reliable in determining correlations between luminosities at different bands. Survival analysis is applied to CO emission in infrared galaxies, X-ray emission in radio galaxies, H-alpha emission in cooling cluster cores, and radio emission in Seyfert galaxies.
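
The Kaplan-Meier estimator mentioned above can be sketched in a few lines for right-censored data (for astronomical upper limits, the values are typically negated so that limits become right censoring). A minimal version with hypothetical detections and limits — illustrative data, not from the paper:

```python
import numpy as np

def kaplan_meier(times, observed):
    """Kaplan-Meier survival estimate for right-censored data.
    times: event or censoring values; observed: True for a detection
    (event), False for a censored value (limit). Assumes no ties."""
    times = np.asarray(times, dtype=float)
    observed = np.asarray(observed, dtype=bool)
    order = np.argsort(times)
    times, observed = times[order], observed[order]
    at_risk = len(times)
    surv = 1.0
    out_t, out_s = [], []
    for t, obs in zip(times, observed):
        if obs:                                   # an actual detection
            surv *= (at_risk - 1) / at_risk       # KM product-limit step
            out_t.append(t)
            out_s.append(surv)
        at_risk -= 1                              # censored points leave risk set
    return np.array(out_t), np.array(out_s)

# Three detections and two censored values (limits).
t, s = kaplan_meier([2.0, 3.0, 4.0, 5.0, 7.0],
                    [True, True, False, True, False])
```

The survival curve only steps down at detections, but censored points still shrink the risk set, which is how limits inform the estimate without being treated as detections.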

  20. Evaluation of higher order statistics parameters for multi channel sEMG using different force levels.

    PubMed

    Naik, Ganesh R; Kumar, Dinesh K

    2011-01-01

    The electromyography (EMG) signal provides information about the performance of muscles and nerves. The shape of the muscle signal and motor unit action potential (MUAP) varies due to the movement of the position of the electrode or due to changes in contraction level. This research deals with evaluating the non-Gaussianity in the surface electromyogram (sEMG) signal using higher order statistics (HOS) parameters. To achieve this, experiments were conducted for four different finger and wrist actions at different levels of Maximum Voluntary Contraction (MVC). Our experimental analysis shows that at constant force and for non-fatiguing contractions, the probability density functions (PDF) of sEMG signals were non-Gaussian. For lower MVCs (below 30% of MVC), the PDF measures tend toward a Gaussian process. The above measures were verified by computing the kurtosis values for different MVCs.
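
Kurtosis, the fourth-order HOS parameter used in this record, is straightforward to compute. A hedged sketch with synthetic signals standing in for real sEMG recordings: a Gaussian process has excess kurtosis near 0, while a heavy-tailed (non-Gaussian) process such as a Laplacian has clearly positive excess kurtosis:

```python
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(42)
n = 200_000

gaussian = rng.standard_normal(n)     # stand-in for a low-MVC signal
laplacian = rng.laplace(size=n)       # stand-in for a non-Gaussian signal

k_g = kurtosis(gaussian)    # Fisher definition: ~0 for a Gaussian PDF
k_l = kurtosis(laplacian)   # clearly positive: non-Gaussian, heavy tails
```

In practice one would window the sEMG recording and compute kurtosis per window, comparing across contraction levels as the study describes.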

  1. Second-hand smoking and carboxyhemoglobin levels in children: a prospective observational study.

    PubMed

    Yee, Branden E; Ahmed, Mohammed I; Brugge, Doug; Farrell, Maureen; Lozada, Gustavo; Idupaganthi, Raghu; Schumann, Roman

    2010-01-01

    To establish baseline noninvasive carboxyhemoglobin (COHb) levels in children and determine the influence of exposure to environmental sources of carbon monoxide (CO), especially environmental tobacco smoke, on such levels. Second-hand smoking may be a risk factor for adverse outcomes following anesthesia and surgery in children (1) and may potentially be preventable. Parents and their children between the ages of 1-12 were enrolled on the day of elective surgery. The preoperative COHb levels of the children were assessed noninvasively using a CO-Oximeter (Radical-7 Rainbow SET Pulse CO-Oximeter; Masimo, Irvine, CA, USA). The parents were asked to complete an environmental air-quality questionnaire. The COHb levels were tabulated and correlated with responses to the survey in aggregate analysis. Statistical analyses were performed using the nonparametric Mann-Whitney and Kruskal-Wallis tests. P < 0.05 was statistically significant. Two hundred children with their parents were enrolled. Children exposed to parental smoking had higher COHb levels than the children of nonsmoking controls. Higher COHb values were seen in the youngest children, ages 1-2, exposed to parental cigarette smoke. However, these trends did not reach statistical significance, and confidence intervals were wide. This study revealed interesting trends of COHb levels in children presenting for anesthesia and surgery. However, the COHb levels measured in our patients were close to the error margin of the device used in our study. An expected improvement in measurement technology may allow screening children for potential pulmonary perioperative risk factors in the future.
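
A group comparison of the kind described here can be sketched with scipy's Mann-Whitney test. The COHb group means and spreads below are illustrative, not the study's data, and the sample sizes are inflated so the difference is detectable:

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(1)

# Hypothetical noninvasive COHb readings (%) for exposed vs control children.
exposed = rng.normal(1.8, 0.6, size=300)
controls = rng.normal(1.5, 0.6, size=300)

# Nonparametric two-sided comparison, as used in the study.
stat, p = mannwhitneyu(exposed, controls, alternative='two-sided')
```

The nonparametric test makes no normality assumption, which matters for bounded, skewed measures like COHb percentages near a device's error margin.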

  2. Student performance on levels 1 and 2-CE of COMLEX-USA: do elective upper-level undergraduate science courses matter?

    PubMed

    Wong, Stanley K; Ramirez, Juan R; Helf, Scott C

    2009-11-01

    The effect of a variety of preadmission variables, including the number of elective preadmission upper-level science courses, on academic achievement is not well established. To investigate the relationship between number of preadmission variables and overall student academic achievement in osteopathic medical school. Academic records of osteopathic medical students in the 2008 and 2009 graduating classes of Western University of Health Sciences College of Osteopathic Medicine of the Pacific in Pomona, California, were analyzed. Multivariate linear regression analyses were performed to identify predictors of academic achievement based on Medical College Admission Test (MCAT) subscores, undergraduate grade point average (GPA), GPA in medical school basic science (preclinical GPA) and clinical clerkship (clinical GPA), and scores on the Comprehensive Osteopathic Medical Licensing Examination-USA (COMLEX-USA) Level 1 and Level 2-Cognitive Evaluation (CE). Records of 358 osteopathic medical students were evaluated. Analysis of beta coefficients suggested that undergraduate science GPA was the most important predictor of overall student academic achievement (P<.01). Biological sciences MCAT subscore was a more modest but still statistically significant predictor of preclinical GPA and COMLEX-USA Level 1 score (P<.01). Physical sciences MCAT subscore was also a statistically significant predictor of preclinical GPA, and verbal reasoning MCAT subscore was a statistically significant predictor of COMLEX-USA Level 2-CE score (both P<.01). Women had statistically significantly higher preclinical GPA and COMLEX-USA Level 2-CE scores than men (P<.05). Differences in some outcome variables were also associated with racial-ethnic background and age. Number of preadmission elective upper-level science courses taken by students before matriculation was not significantly correlated with any academic achievement variable. 
Although undergraduate science GPA and MCAT biological sciences subscore were significant predictors of overall academic achievement for osteopathic medical students, the number of elective upper-level science courses taken preadmission had no predictive value.

  3. ASSESSMENT OF SPATIAL AUTOCORRELATION IN EMPIRICAL MODELS IN ECOLOGY

    EPA Science Inventory

    Statistically assessing ecological models is inherently difficult because data are autocorrelated and this autocorrelation varies in an unknown fashion. At a simple level, the linking of a single species to a habitat type is a straightforward analysis. With some investigation int...

  4. Serum vitamin d level and susceptibility to multidrug-resistant tuberculosis among household contacts

    NASA Astrophysics Data System (ADS)

    Herlina, N.; Sinaga, B. Y. M.; Siagian, P.; Mutiara, E.

    2018-03-01

    Low levels of vitamin D are a predisposing factor for multidrug-resistant tuberculosis (MDR-TB). Family members in contact with the patient are also at risk of infection. Currently, there is no study that compares vitamin D levels between MDR-TB patients and household contacts. This study aims to identify the association between vitamin D level and MDR-TB occurrence. This was a case-control study, with 40 people in each group (MDR-TB patients and household contacts). Each member of each group was checked for vitamin D levels using the enzyme-linked immunosorbent assay (ELISA) technique. Statistical analysis was performed with the Chi-square test using SPSS. Mean vitamin D levels were 32.21 in MDR-TB patients and 31.7 in household contacts. There was no significant association between vitamin D levels and MDR-TB occurrence (p=1.0).
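
A Chi-square test on a case-control contingency table like the study's can be sketched with scipy. The 2x2 counts below are hypothetical (chosen to reproduce a null result like the one reported), not the study's data:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table: vitamin D status (rows) vs group (columns).
#                  MDR-TB  household contact
table = np.array([[22,     18],    # deficient
                  [18,     22]])   # sufficient

chi2, p, dof, expected = chi2_contingency(table)  # Yates correction by default
```

With balanced counts like these, the expected cell frequencies are all 20 and the test is far from significance, mirroring the study's p=1.0 finding.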

  5. The extended statistical analysis of toxicity tests using standardised effect sizes (SESs): a comparison of nine published papers.

    PubMed

    Festing, Michael F W

    2014-01-01

    The safety of chemicals, drugs, novel foods and genetically modified crops is often tested using repeat-dose sub-acute toxicity tests in rats or mice. It is important to avoid misinterpretations of the results, as these tests are used to help determine safe exposure levels in humans. Treated and control groups are compared for a range of haematological, biochemical and other biomarkers which may indicate tissue damage or other adverse effects. However, the statistical analysis and presentation of such data pose problems due to the large number of statistical tests involved. Often, it is not clear whether a "statistically significant" effect is real or a false positive (type I error) due to sampling variation. The authors' conclusions appear to be reached somewhat subjectively from the pattern of statistical significances, discounting those which they judge to be type I errors and ignoring any biomarker where the p-value is greater than p = 0.05. However, by using standardised effect sizes (SESs), a range of graphical methods and an overall assessment of the mean absolute response can be made. The approach is an extension, not a replacement, of existing methods. It is intended to assist toxicologists and regulators in the interpretation of the results. Here, the SES analysis has been applied to data from nine published sub-acute toxicity tests in order to compare the findings with those of the original authors. Line plots, box plots and bar plots show the pattern of response. Dose-response relationships are easily seen. A "bootstrap" test compares the mean absolute differences across dose groups. In four out of seven papers where the no observed adverse effect level (NOAEL) was estimated by the authors, it was set too high according to the bootstrap test, suggesting that possible toxicity is under-estimated.
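
A standardised effect size of the general kind described here is a mean difference divided by a pooled within-group standard deviation (the Cohen's-d form; the paper's exact SES definition may differ in detail). A sketch with hypothetical liver-enzyme values:

```python
import numpy as np

def standardised_effect_size(treated, control):
    """Cohen's-d style SES: mean difference / pooled within-group SD."""
    treated = np.asarray(treated, dtype=float)
    control = np.asarray(control, dtype=float)
    n1, n2 = len(treated), len(control)
    pooled_var = (((n1 - 1) * treated.var(ddof=1) +
                   (n2 - 1) * control.var(ddof=1)) / (n1 + n2 - 2))
    return (treated.mean() - control.mean()) / np.sqrt(pooled_var)

# Hypothetical ALT values (U/L) for a high-dose group vs concurrent controls.
d = standardised_effect_size([52, 58, 61, 55, 60], [45, 48, 50, 47, 49])
```

Expressing every biomarker on this common unit-free scale is what makes the cross-biomarker plots and the mean-absolute-response summary in the paper possible.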

  6. Mapping the global health employment market: an analysis of global health jobs.

    PubMed

    Keralis, Jessica M; Riggin-Pathak, Brianne L; Majeski, Theresa; Pathak, Bogdan A; Foggia, Janine; Cullinen, Kathleen M; Rajagopal, Abbhirami; West, Heidi S

    2018-02-27

    The number of university global health training programs has grown in recent years. However, there is little research on the needs of the global health profession. We therefore set out to characterize the global health employment market by analyzing global health job vacancies. We collected data from advertised, paid positions posted to web-based job boards, email listservs, and global health organization websites from November 2015 to May 2016. Data on requirements for education, language proficiency, technical expertise, physical location, and experience level were analyzed for all vacancies. Descriptive statistics were calculated for the aforementioned job characteristics. Associations between technical specialty area and requirements for non-English language proficiency and overseas experience were calculated using Chi-square statistics. A qualitative thematic analysis was performed on a subset of vacancies. We analyzed the data from 1007 global health job vacancies from 127 employers. Among private and non-profit sector vacancies, 40% (n = 354) were for technical or subject matter experts, 20% (n = 177) for program directors, and 16% (n = 139) for managers, compared to 9.8% (n = 87) for entry-level and 13.6% (n = 120) for mid-level positions. The most common technical focus area was program or project management, followed by HIV/AIDS and quantitative analysis. Thematic analysis demonstrated a common emphasis on program operations, relations, design and planning, communication, and management. Our analysis shows a demand for candidates with several years of experience with global health programs, particularly program managers/directors and technical experts, with very few entry-level positions accessible to recent graduates of global health training programs. It is unlikely that global health training programs equip graduates to be competitive for the majority of positions that are currently available in this field.

  7. Analysis of the sleep quality of elderly people using biomedical signals.

    PubMed

    Moreno-Alsasua, L; Garcia-Zapirain, B; Mendez-Zorrilla, A

    2015-01-01

    This paper presents a technical solution that analyses sleep signals captured by biomedical sensors to find possible disorders during rest. Specifically, the method evaluates electrooculogram (EOG) signals, skin conductance (GSR), air flow (AS), and body temperature. Next, a quantitative sleep quality analysis determines significant changes in the biological signals, and any similarities between them in a given time period. Filtering techniques such as the Fourier transform method and IIR filters process the signal and identify significant variations. Once these changes have been identified, all significant data are compared and a quantitative and statistical analysis is carried out to determine the level of a person's rest. To evaluate the correlations and significant differences, a statistical analysis was performed, showing correlations between the EOG and AS signals (p=0.005), the EOG and GSR signals (p=0.037) and, finally, the EOG and body temperature (p=0.04). Doctors could use this information to monitor changes within a patient.

  8. Deformable image registration as a tool to improve survival prediction after neoadjuvant chemotherapy for breast cancer: results from the ACRIN 6657/I-SPY-1 trial

    NASA Astrophysics Data System (ADS)

    Jahani, Nariman; Cohen, Eric; Hsieh, Meng-Kang; Weinstein, Susan P.; Pantalone, Lauren; Davatzikos, Christos; Kontos, Despina

    2018-02-01

    We examined the ability of DCE-MRI longitudinal features to give early prediction of recurrence-free survival (RFS) in women undergoing neoadjuvant chemotherapy for breast cancer, in a retrospective analysis of 106 women from the ISPY 1 cohort. These features were based on the voxel-wise changes seen in registered images taken before treatment and after the first round of chemotherapy. We computed the transformation field using a robust deformable image registration technique to match breast images from these two visits. Using the deformation field, parametric response maps (PRMs), a voxel-based feature analysis of longitudinal changes in images between visits, were computed for maps of four kinetic features (signal enhancement ratio, peak enhancement, and wash-in/wash-out slopes). A two-level discrete wavelet transform was applied to these PRMs to extract heterogeneity information about tumor change between visits. To estimate survival, a Cox proportional hazard model was applied with the C statistic as the measure of success in predicting RFS. The best PRM feature (as determined by C statistic in univariable analysis) was determined for each of the four kinetic features. The baseline model, incorporating functional tumor volume, age, race, and hormone response status, had a C statistic of 0.70 in predicting RFS. The model augmented with the four PRM features had a C statistic of 0.76. Thus, our results suggest that adding information on the texture of voxel-level changes in tumor kinetic response between registered images of first and second visits could improve early RFS prediction in breast cancer after neoadjuvant chemotherapy.
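
The C statistic used to score the models in this record is Harrell's concordance index: among usable pairs of subjects, the fraction in which the higher predicted risk belongs to the subject who experienced the event sooner. A minimal pure-Python sketch with a hypothetical four-patient example:

```python
from itertools import combinations

def c_statistic(risk, time, event):
    """Harrell's concordance index for right-censored survival data.
    A pair is usable when the earlier observed time is an actual event."""
    conc, usable = 0.0, 0
    for i, j in combinations(range(len(risk)), 2):
        a, b = (i, j) if time[i] < time[j] else (j, i)  # a = earlier time
        if time[a] == time[b] or not event[a]:
            continue                                    # unusable pair
        usable += 1
        if risk[a] > risk[b]:
            conc += 1.0      # concordant: higher risk failed first
        elif risk[a] == risk[b]:
            conc += 0.5      # tied predictions count half
    return conc / usable

risk  = [0.9, 0.4, 0.7, 0.2]        # hypothetical model risk scores
time  = [1.0, 2.0, 3.0, 4.0]        # follow-up times (years)
event = [True, True, False, True]   # recurrence observed?
c = c_statistic(risk, time, event)
```

A C of 0.5 means the model ranks pairs no better than chance, so the paper's move from 0.70 to 0.76 represents a meaningful gain in pairwise ranking accuracy.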

  9. Evaluation of person-level heterogeneity of treatment effects in published multiperson N-of-1 studies: systematic review and reanalysis.

    PubMed

    Raman, Gowri; Balk, Ethan M; Lai, Lana; Shi, Jennifer; Chan, Jeffrey; Lutz, Jennifer S; Dubois, Robert W; Kravitz, Richard L; Kent, David M

    2018-05-26

    Individual patients with the same condition may respond differently to similar treatments. Our aim is to summarise the reporting of person-level heterogeneity of treatment effects (HTE) in multiperson N-of-1 studies and to examine the evidence for person-level HTE through reanalysis. Systematic review and reanalysis of multiperson N-of-1 studies. Medline, Cochrane Controlled Trials, EMBASE, Web of Science and review of references through August 2017 for N-of-1 studies published in English. N-of-1 studies of pharmacological interventions with at least two subjects. Citation screening and data extractions were performed in duplicate. We performed statistical reanalysis testing for person-level HTE on all studies presenting person-level data. We identified 62 multiperson N-of-1 studies with at least two subjects. Statistical tests examining HTE were described in only 13 (21%), of which only two (3%) tested person-level HTE. Only 25 studies (40%) provided person-level data sufficient to reanalyse person-level HTE. Reanalysis using a fixed effect linear model identified statistically significant person-level HTE in 8 of the 13 studies (62%) reporting person-level treatment effects and in 8 of the 14 studies (57%) reporting person-level outcomes. Our analysis suggests that person-level HTE is common and often substantial. Reviewed studies had incomplete information on person-level treatment effects and their variation. Improved assessment and reporting of person-level treatment effects in multiperson N-of-1 studies are needed. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  10. Impact of sediment characteristics on the heavy metal concentration and their ecological risk level of surface sediments of Vaigai river, Tamilnadu, India.

    PubMed

    Paramasivam, K; Ramasamy, V; Suresh, G

    2015-02-25

    The distributions of the metals (Al, Fe, Mg, Cd, Cr, Cu, Ni, Pb and Zn) were measured for the surface sediments of the Vaigai river, Tamilnadu, India. These values are compared with different standard values to assess the level of toxicity of the heavy metals in the sediments. Risk indices (contamination factor [CF], pollution load index [PLI] and potential ecological risk [PER]) are also calculated to understand the level of toxicity of the metals. Multivariate statistical analyses (Pearson's correlation analysis, cluster analysis and factor analysis) are carried out to determine the inter-relationship between sediment characteristics and the heavy metals. From this analysis, it is confirmed that the contents of clay and organic matter play an important role in raising the level of heavy metal contents as well as PLI and PER (level of toxicity). Heavy metal concentrations of the samples (after removing silt and clay fractions from bulk samples) show a decrease in their concentrations and risk indices compared to the level of the bulk samples. Copyright © 2014 Elsevier B.V. All rights reserved.
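
The first two risk indices named in this record have simple closed forms: each contamination factor is the ratio of a measured concentration to its background (reference) value, and the pollution load index is the geometric mean of the CFs. A sketch with hypothetical sediment concentrations (not the study's measurements):

```python
import numpy as np

def contamination_factors(measured, background):
    """CF_i = measured concentration / background concentration."""
    return {m: measured[m] / background[m] for m in measured}

def pollution_load_index(cf):
    """PLI = n-th root of the product of the contamination factors."""
    vals = np.array(list(cf.values()), dtype=float)
    return float(np.exp(np.log(vals).mean()))   # geometric mean

# Hypothetical sediment concentrations (mg/kg) vs background reference values.
measured   = {'Cd': 0.6, 'Cr': 135.0, 'Cu': 90.0, 'Pb': 40.0}
background = {'Cd': 0.3, 'Cr':  90.0, 'Cu': 45.0, 'Pb': 20.0}

cf = contamination_factors(measured, background)
pli = pollution_load_index(cf)
```

By convention, PLI > 1 suggests progressive deterioration of the site, which is why the study tracks how PLI falls once silt and clay fractions are removed.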

  11. HydroApps: An R package for statistical simulation to use in regional analysis

    NASA Astrophysics Data System (ADS)

    Ganora, D.

    2013-12-01

    The HydroApps package is a newly developed R extension, initially created to support the use of a recent model for flood frequency estimation developed for applications in Northwestern Italy; it also contains some general tools for regional analyses and can be easily extended to include other statistical models. The package is currently at an experimental level of development. HydroApps is a corollary of the SSEM project for regional flood frequency analysis, although it was developed independently to support various instances of regional analyses. Its aim is to provide a basis for interplay between statistical simulation and practical operational use. In particular, the main module of the package deals with the building of the confidence bands of flood frequency curves expressed by means of their L-moments. Other functions include pre-processing and visualization of hydrologic time series and analysis of the optimal design-flood under uncertainty, but also tools useful in water resources management for the estimation of flow duration curves and their sensitivity to water withdrawals. Particular attention is devoted to the code granularity, i.e. the level of detail and aggregation of the code: greater detail means more low-level functions, which entails more flexibility but reduces the ease of practical use. A balance between detail and simplicity is necessary and can be achieved with appropriate wrapping functions and specific help pages for each working block. From a more general viewpoint, the package does not have a truly user-friendly interface, but it runs on multiple operating systems and is easy to update, as are many other open-source projects. The HydroApps functions and their features are reported in order to share ideas and materials and to improve the 'technological' and information transfer between scientific communities and final users such as policy makers.
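
The L-moments on which the package's flood frequency curves are built are standard summary statistics estimable from probability-weighted moments. Although HydroApps itself is written in R, a language-neutral sketch of the first two sample L-moments, using hypothetical annual flood peaks, is:

```python
import numpy as np

def sample_l_moments(x):
    """First two sample L-moments (l1, l2) via probability-weighted moments:
    b0 = mean; b1 = (1/n) * sum_{i} ((i-1)/(n-1)) * x_(i) over sorted x;
    l1 = b0; l2 = 2*b1 - b0."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    return b0, 2.0 * b1 - b0

# Hypothetical annual flood peaks (m^3/s), not real gauge data.
peaks = [120., 95., 310., 180., 150., 260., 140., 200., 110., 170.]
l1, l2 = sample_l_moments(peaks)
lcv = l2 / l1   # L-coefficient of variation, a standard regional statistic
```

Because L-moments are linear in the ordered data, they are far less sensitive to outliers than conventional moments, which is why regional flood frequency methods favour them.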

  12. Statistical analysis of the uncertainty related to flood hazard appraisal

    NASA Astrophysics Data System (ADS)

    Notaro, Vincenza; Freni, Gabriele

    2015-12-01

    The estimation of flood hazard frequency statistics for an urban catchment is of great interest in practice. It provides the evaluation of potential flood risk and related damage and supports decision making for flood risk management. Flood risk is usually defined as a function of the probability that a system deficiency can cause flooding (hazard) and the expected damage due to the flooding magnitude (damage), taking into account both the exposure and the vulnerability of the goods at risk. The expected flood damage can be evaluated by an a priori estimation of potential damage caused by flooding or by interpolating real damage data. With regard to flood hazard appraisal, several procedures propose to identify some hazard indicator (HI), such as flood depth or the combination of flood depth and velocity, and to assess the flood hazard of the analyzed area by comparing the HI variables with user-defined threshold values or curves (penalty curves or matrixes). However, flooding data are usually unavailable or piecemeal, preventing a reliable flood hazard analysis from being carried out; therefore, hazard analysis is often performed by means of mathematical simulations aimed at evaluating water levels and flow velocities over the catchment surface. As a result, a great part of the uncertainties intrinsic to flood risk appraisal can be related to the hazard evaluation, due both to the uncertainty inherent in modeling results and to the subjectivity of the user-defined hazard thresholds applied to link flood depth to a hazard level. In the present work, a statistical methodology is proposed for evaluating and reducing the uncertainties connected with hazard level estimation. The methodology has been applied to a real urban watershed as a case study.
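
The threshold-based mapping from a hazard indicator to a hazard class, whose subjectivity this record highlights, can be made concrete with a toy classifier. The depth x velocity indicator is a common choice in the literature, but the threshold values below are purely illustrative, not a published standard:

```python
def hazard_level(depth_m, velocity_ms):
    """Map a hazard indicator (depth x velocity) to a hazard class using
    user-defined thresholds -- illustrative values only."""
    hi = depth_m * velocity_ms        # composite hazard indicator (m^2/s)
    if depth_m < 0.1 or hi < 0.05:
        return 'low'
    elif hi < 0.5:
        return 'medium'
    return 'high'

# Classify three simulated (depth, velocity) pairs from a flood model run.
levels = [hazard_level(d, v) for d, v in [(0.05, 0.2), (0.4, 0.6), (1.2, 1.0)]]
```

Shifting any threshold reclassifies cells near the boundary, which is exactly the threshold-subjectivity uncertainty the proposed statistical methodology aims to quantify.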

  13. Compositional differences among Chinese soy sauce types studied by (13)C NMR spectroscopy coupled with multivariate statistical analysis.

    PubMed

    Kamal, Ghulam Mustafa; Wang, Xiaohua; Bin Yuan; Wang, Jie; Sun, Peng; Zhang, Xu; Liu, Maili

    2016-09-01

    Soy sauce, a well-known seasoning all over the world, especially in Asia, is available on the global market in a wide range of types based on its purpose and the processing methods. Its composition varies with respect to the fermentation processes and the addition of additives, preservatives and flavor enhancers. A comprehensive (1)H NMR based study of the metabonomic variations of soy sauce to differentiate among the different types available on the global market has been limited due to the complexity of the mixture. In the present study, (13)C NMR spectroscopy coupled with multivariate statistical data analysis, such as principal component analysis (PCA) and orthogonal partial least squares-discriminant analysis (OPLS-DA), was applied to investigate metabonomic variations among different types of soy sauce, namely super light, super dark, red cooking and mushroom soy sauce. The main additives in soy sauce, such as glutamate, sucrose and glucose, were easily distinguished and quantified using (13)C NMR spectroscopy; these were otherwise difficult to assign and quantify due to serious signal overlaps in (1)H NMR spectra. The significantly higher concentration of sucrose in dark, red cooking and mushroom flavored soy sauce can be linked directly to the addition of caramel in soy sauce. Similarly, the significantly higher level of glutamate in super light as compared to super dark and mushroom flavored soy sauce may come from the addition of monosodium glutamate. The study highlights the potential of (13)C NMR based metabonomics coupled with multivariate statistical data analysis in differentiating between the types of soy sauce on the basis of the level of additives, raw materials and fermentation procedures. Copyright © 2016 Elsevier B.V. All rights reserved.
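
The PCA step in a metabonomics workflow like this one reduces the spectral intensity matrix to a few components that separate sample classes. A hedged numpy sketch on synthetic NMR-like intensities (the two "soy sauce types", signal assignments, and intensity values are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical integrated 13C NMR intensities; rows are samples of two
# soy sauce types, columns are (sucrose, glucose, glutamate) signals.
light = rng.normal([1.0, 2.0, 5.0], 0.3, size=(10, 3))  # high glutamate
dark  = rng.normal([4.0, 2.0, 2.0], 0.3, size=(10, 3))  # high sucrose
X = np.vstack([light, dark])

# PCA via SVD of the mean-centred data matrix.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T                  # sample coordinates on the components
explained = S ** 2 / np.sum(S ** 2)  # variance fraction per component
```

With class differences this large relative to noise, the first component carries most of the variance and its scores separate the two types, which is the pattern a PCA scores plot in such a study visualises.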

  14. Integrative pathway analysis of a genome-wide association study of V̇o2max response to exercise training

    PubMed Central

    Vivar, Juan C.; Sarzynski, Mark A.; Sung, Yun Ju; Timmons, James A.; Bouchard, Claude; Rankinen, Tuomo

    2013-01-01

    We previously reported the findings from a genome-wide association study of the response of maximal oxygen uptake (V̇o2max) to an exercise program. Here we follow up on these results to generate hypotheses on genes, pathways, and systems involved in the ability to respond to exercise training. A systems biology approach can help us better establish a comprehensive physiological description of what underlies V̇o2max trainability. The primary material for this exploration was the individual single-nucleotide polymorphisms (SNPs), SNP-gene mappings, and statistical significance levels. We aimed to generate novel hypotheses through analyses that go beyond statistical association of single-locus markers. This was accomplished through three complementary approaches: 1) building de novo evidence of gene candidacy through informatics-driven literature mining; 2) aggregating evidence from statistical associations to link variant enrichment in biological pathways to V̇o2max trainability; and 3) predicting possible consequences of variants residing in the pathways of interest. We started with candidate gene prioritization followed by pathway analysis focused on overrepresentation analysis and gene set enrichment analysis. Subsequently, leads were followed using in silico analysis of predicted SNP functions. Pathways related to cellular energetics (pantothenate and CoA biosynthesis; PPAR signaling) and immune functions (complement and coagulation cascades) had the highest levels of SNP burden. In particular, long-chain fatty acid transport and fatty acid oxidation genes and sequence variants were found to influence differences in V̇o2max trainability. Together, these methods allow for the hypothesis-driven ranking and prioritization of genes and pathways for future experimental testing and validation. PMID:23990238

  15. Polypropylene Production Optimization in Fluidized Bed Catalytic Reactor (FBCR): Statistical Modeling and Pilot Scale Experimental Validation

    PubMed Central

    Khan, Mohammad Jakir Hossain; Hussain, Mohd Azlan; Mujtaba, Iqbal Mohammed

    2014-01-01

    Polypropylene is a plastic widely used in our everyday life. This study focuses on the identification and justification of the optimum process parameters for polypropylene production in a novel pilot-plant-based fluidized bed reactor. This first-of-its-kind statistical modeling with experimental validation for the process parameters of polypropylene production was conducted by applying the ANOVA (analysis of variance) method to response surface methodology (RSM). Three important process variables, namely reaction temperature, system pressure and hydrogen percentage, were considered as the input factors for polypropylene production in the analysis performed. In order to examine the effect of the process parameters and their interactions, the ANOVA method was utilized along with a range of other statistical diagnostic tools, such as the correlation between actual and predicted values, residuals versus predicted response, the outlier t plot, and 3D response surface and contour analysis plots. The statistical analysis showed that the proposed quadratic model fit the experimental results well. At the optimum conditions, with a temperature of 75°C, a system pressure of 25 bar and a hydrogen percentage of 2%, the highest polypropylene production obtained is 5.82% per pass. Hence it is concluded that the developed experimental design and proposed model can be successfully employed, with over a 95% confidence level, for optimum polypropylene production in a fluidized bed catalytic reactor (FBCR). PMID:28788576
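
    The quadratic response-surface model at the heart of RSM is an ordinary least-squares fit. Below is a one-factor sketch via the normal equations; it is illustrative only, since the study's actual model has three factors plus interaction terms, and `fit_quadratic` is a hypothetical helper, not part of the study's software.

    ```python
    # One-factor quadratic response-surface fit y = b0 + b1*x + b2*x^2 by
    # ordinary least squares (normal equations).  A sketch of the RSM idea
    # only: the study's actual model has three factors and interactions.
    def fit_quadratic(xs, ys):
        s = [sum(x ** k for x in xs) for k in range(5)]   # power sums
        A = [[s[0], s[1], s[2]],
             [s[1], s[2], s[3]],
             [s[2], s[3], s[4]]]
        c = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
        # Gaussian elimination with partial pivoting.
        for i in range(3):
            p = max(range(i, 3), key=lambda r: abs(A[r][i]))
            A[i], A[p] = A[p], A[i]
            c[i], c[p] = c[p], c[i]
            for r in range(i + 1, 3):
                f = A[r][i] / A[i][i]
                for j in range(i, 3):
                    A[r][j] -= f * A[i][j]
                c[r] -= f * c[i]
        b = [0.0, 0.0, 0.0]
        for i in (2, 1, 0):
            b[i] = (c[i] - sum(A[i][j] * b[j] for j in range(i + 1, 3))) / A[i][i]
        return b

    # Recover a known quadratic from noiseless data:
    xs = [0.0, 1.0, 2.0, 3.0, 4.0]
    ys = [1.0 + 2.0 * x + 3.0 * x ** 2 for x in xs]
    print(fit_quadratic(xs, ys))
    ```

    For a one-factor fit with b2 < 0, the fitted optimum lies at x = -b1/(2·b2); multi-factor RSM generalizes this to a stationary point of the fitted surface.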

  16. Clinical study of the Erlanger silver catheter--data management and biometry.

    PubMed

    Martus, P; Geis, C; Lugauer, S; Böswald, M; Guggenbichler, J P

    1999-01-01

    The clinical evaluation of venous catheters for catheter-induced infections must conform to a strict biometric methodology. The statistical planning of the study (target population, design, degree of blinding), data management (database design, definition of variables, coding), quality assurance (data inspection at several levels) and the biometric evaluation of the Erlanger silver catheter project are described. The three-step data flow included: 1) primary data from the hospital, 2) a relational database, and 3) files accessible for statistical evaluation. Two different statistical models were compared: analyzing only the first catheter of each patient (independent data) and analyzing several catheters from the same patient (dependent data) by means of the generalized estimating equations (GEE) method. The main result of the study was based on the comparison of both statistical models.

  17. An experiment on the impact of a neonicotinoid pesticide on honeybees: the value of a formal analysis of the data.

    PubMed

    Schick, Robert S; Greenwood, Jeremy J D; Buckland, Stephen T

    2017-01-01

    We assess the analysis of the data resulting from a field experiment conducted by Pilling et al. (PLoS ONE. doi: 10.1371/journal.pone.0077193, 5) on the potential effects of thiamethoxam on honeybees. The experiment had low levels of replication, so Pilling et al. concluded that formal statistical analysis would be misleading. This would be true if such an analysis merely comprised tests of statistical significance and if the investigators concluded that lack of significance meant little or no effect. However, an analysis that includes estimation of the size of any effects, with confidence limits, allows one to reach conclusions that are not misleading and that produce useful insights. For the data of Pilling et al., we use straightforward statistical analysis to show that the confidence limits are generally so wide that any effects of thiamethoxam could have been large without being statistically significant. Instead of formal analysis, Pilling et al. simply inspected the data and concluded that they provided no evidence of detrimental effects and, from this, that thiamethoxam poses a "low risk" to bees. Conclusions derived from inspection of the data were not just misleading in this case but are unacceptable in principle, for if data are inadequate for a formal analysis (or only good enough to provide estimates with wide confidence intervals), then they are bound to be inadequate as a basis for reaching any sound conclusions. Given that the data in this case are largely uninformative with respect to the treatment effect, any conclusions reached from such informal approaches can do little more than reflect the prior beliefs of those involved.

  18. gsSKAT: Rapid gene set analysis and multiple testing correction for rare-variant association studies using weighted linear kernels.

    PubMed

    Larson, Nicholas B; McDonnell, Shannon; Cannon Albright, Lisa; Teerlink, Craig; Stanford, Janet; Ostrander, Elaine A; Isaacs, William B; Xu, Jianfeng; Cooney, Kathleen A; Lange, Ethan; Schleutker, Johanna; Carpten, John D; Powell, Isaac; Bailey-Wilson, Joan E; Cussenot, Olivier; Cancel-Tassin, Geraldine; Giles, Graham G; MacInnis, Robert J; Maier, Christiane; Whittemore, Alice S; Hsieh, Chih-Lin; Wiklund, Fredrik; Catalona, William J; Foulkes, William; Mandal, Diptasri; Eeles, Rosalind; Kote-Jarai, Zsofia; Ackerman, Michael J; Olson, Timothy M; Klein, Christopher J; Thibodeau, Stephen N; Schaid, Daniel J

    2017-05-01

    Next-generation sequencing technologies have afforded unprecedented characterization of low-frequency and rare genetic variation. Due to low power for single-variant testing, aggregative methods are commonly used to combine observed rare variation within a single gene. Causal variation may also aggregate across multiple genes within relevant biomolecular pathways. Kernel-machine regression and adaptive testing methods for aggregative rare-variant association testing have been demonstrated to be powerful approaches for pathway-level analysis, although these methods tend to be computationally intensive at high-variant dimensionality and require access to complete data. An additional analytical issue in scans of large pathway definition sets is multiple testing correction. Gene set definitions may exhibit substantial genic overlap, and the impact of the resultant correlation in test statistics on Type I error rate control for large agnostic gene set scans has not been fully explored. Herein, we first outline a statistical strategy for aggregative rare-variant analysis using component gene-level linear kernel score test summary statistics as well as derive simple estimators of the effective number of tests for family-wise error rate control. We then conduct extensive simulation studies to characterize the behavior of our approach relative to direct application of kernel and adaptive methods under a variety of conditions. We also apply our method to two case-control studies, respectively, evaluating rare variation in hereditary prostate cancer and schizophrenia. Finally, we provide open-source R code for public use to facilitate easy application of our methods to existing rare-variant analysis results. © 2017 WILEY PERIODICALS, INC.

  19. Evaluating Courses of Actions at the Strategic Planning Level

    DTIC Science & Technology

    2013-03-01

    and statistical decision theory (Schultz, Borrowman and Small 2011). Nowadays, it is hard to make a decision by ourselves. Modern organizations... Analysis." Lecture Slides, October 2011. Schultz, Martin T., Thomas D. Borrowman, and Mitchell J. Small. Bayesian Networks for Modeling Dredging... www.ukessays.com/essays/business/strategic-analysis-of-procter-and-gamble.php (accessed October 09, 2012). Vego, Milan. Joint Operational Warfare. Vol. 1

  20. Novel Image Encryption Scheme Based on Chebyshev Polynomial and Duffing Map

    PubMed Central

    2014-01-01

    We present a novel image encryption algorithm using a Chebyshev polynomial based on permutation and substitution and a Duffing map based on substitution. Comprehensive security analysis has been performed on the designed scheme using key space analysis, visual testing, histogram analysis, information entropy calculation, correlation coefficient analysis, differential analysis, key sensitivity testing, and speed testing. The study demonstrates that the proposed image encryption algorithm offers a key space of more than 10^113 and a desirable level of security based on the good statistical results and theoretical arguments. PMID:25143970

  1. Electricity supply efficiency and organizational growth and profitability in Lagos, Nigeria

    NASA Astrophysics Data System (ADS)

    Adeleke, Adedeji Tajudeen

    A modern and efficient infrastructure is a basic necessity for economic development and integration into the global economy. The specific problem was the inadequate and unreliable supply of electricity to manufacturing corporations in Lagos, Nigeria. The purpose of the current quantitative correlational research study was to examine if there was a correlation between electricity supply efficiency and organizational growth and profitability in manufacturing corporations in Lagos, Nigeria. The population of the current correlational research study involved 28 out of 34 manufacturing corporations from various industrial sectors in Lagos, Nigeria, that are listed and traded on the Nigerian Stock Exchange. Spearman rho correlations were used to assess the relationships between independent variables of electricity supply efficiency levels and the dependent variables of organizational growth and profitability. The result of the correlational analysis of the data revealed that there was a statistically significant, strong positive correlation between the Average Gross Income (1998-2007) and Average Actual Electricity supply efficiency level (1998-2007), rho = 0.57; p = 0.002. A statistically significant, strong positive correlation was found between the Average Balance Sheet Size (1998-2007) and Average Actual Electricity Supply Efficiency Level (1998-2007), rho = 0.54; p = 0.003. A statistically significant, strong positive correlation between the Average Profit After Tax (1998-2007) and Average Actual Electricity Supply Efficiency Level (1998-2007), rho = 0.60; p = 0.001, was found. No statistically significant correlation between the Average Return on Investment (1998-2007) and Average Actual Electricity supply efficiency level (1998-2007), rho = 0.19; p = 0.33, was discovered.
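
    Spearman rho correlations of the kind reported above can be computed by ranking both variables and correlating the ranks. A minimal pure-Python sketch with no external dependencies; the efficiency and income figures are illustrative, not the study's raw data:

    ```python
    # Spearman rank correlation implemented from scratch.
    # The data below are illustrative, NOT the study's raw figures.
    from statistics import mean

    def ranks(xs):
        """1-based ranks, with ties given their average rank."""
        order = sorted(range(len(xs)), key=lambda i: xs[i])
        r = [0.0] * len(xs)
        i = 0
        while i < len(order):
            j = i
            while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1
            for k in range(i, j + 1):
                r[order[k]] = avg
            i = j + 1
        return r

    def spearman_rho(x, y):
        """Pearson correlation of the two rank vectors."""
        rx, ry = ranks(x), ranks(y)
        mx, my = mean(rx), mean(ry)
        num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
        den = (sum((a - mx) ** 2 for a in rx)
               * sum((b - my) ** 2 for b in ry)) ** 0.5
        return num / den

    # Hypothetical efficiency vs. income data for eight firms:
    efficiency = [0.61, 0.72, 0.55, 0.80, 0.66, 0.74, 0.59, 0.70]
    income = [120.0, 145.0, 95.0, 210.0, 130.0, 160.0, 100.0, 150.0]
    print(f"rho = {spearman_rho(efficiency, income):.2f}")
    ```

    Because Spearman's rho depends only on ranks, it is robust to the skewed monetary distributions typical of income and balance-sheet data, which is presumably why it was preferred over Pearson's r here.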

  2. FMAP: Functional Mapping and Analysis Pipeline for metagenomics and metatranscriptomics studies.

    PubMed

    Kim, Jiwoong; Kim, Min Soo; Koh, Andrew Y; Xie, Yang; Zhan, Xiaowei

    2016-10-10

    Given the lack of a complete and comprehensive library of microbial reference genomes, determining the functional profile of diverse microbial communities is challenging. The available functional analysis pipelines lack several key features: (i) an integrated alignment tool, (ii) operon-level analysis, and (iii) the ability to process large datasets. Here we introduce our open-sourced, stand-alone functional analysis pipeline for analyzing whole metagenomic and metatranscriptomic sequencing data, FMAP (Functional Mapping and Analysis Pipeline). FMAP performs alignment, gene family abundance calculations, and statistical analysis (three levels of analyses are provided: differentially-abundant genes, operons and pathways). The resulting output can be easily visualized with heatmaps and functional pathway diagrams. FMAP functional predictions are consistent with currently available functional analysis pipelines. FMAP is a comprehensive tool for providing functional analysis of metagenomic/metatranscriptomic sequencing data. With the added features of integrated alignment, operon-level analysis, and the ability to process large datasets, FMAP will be a valuable addition to the currently available functional analysis toolbox. We believe that this software will be of great value to the wider biology and bioinformatics communities.

  3. Exocrine Dysfunction Correlates with Endocrinal Impairment of Pancreas in Type 2 Diabetes Mellitus

    PubMed Central

    Prasanna Kumar, H. R.; Gowdappa, H. Basavana; Hosmani, Tejashwi; Urs, Tejashri

    2018-01-01

    Background: Diabetes mellitus (DM) is a chronic abnormal metabolic condition, which manifests as an elevated blood sugar level over a prolonged period. The pancreatic endocrine system generally gets affected during diabetes, but abnormal exocrine functions are often also manifested owing to its proximity to the endocrine system. Fecal elastase-1 (FE-1) is considered an ideal biomarker to reflect exocrine insufficiency of the pancreas. Aim: This study was conducted to assess exocrine dysfunction of the pancreas in patients with type-2 DM (T2DM) by measuring FE-1 levels and to relate the level of hyperglycemia to exocrine pancreatic dysfunction. Methodology: A prospective, cross-sectional comparative study was conducted on both T2DM patients and healthy nondiabetic volunteers. FE-1 levels were measured using a commercial kit (Human Pancreatic Elastase ELISA BS 86-01 from Bioserv Diagnostics). Data analysis was performed using statistical parameters such as the mean, standard deviation and standard error, the independent-samples t-test, and the Chi-square test/cross tabulation in SPSS for Windows version 20.0. Results: A statistically nonsignificant (P = 0.5051) relationship between FE-1 deficiency and age was obtained, which implies that age is a noncontributing factor toward exocrine pancreatic insufficiency among diabetic patients. A statistically significant correlation (P = 0.003) between glycated hemoglobin and FE-1 levels was also noted. The associations of retinopathy (P = 0.001) and peripheral pulses (P = 0.001) with FE-1 levels were found to be statistically significant. Conclusion: This study validates the benefit of FE-1 estimation as a surrogate marker of exocrine pancreatic insufficiency, which remains unmanifest and subclinical. PMID:29535950

  4. Alveolar bone level changes in maxillary expansion treatments assessed through CBCT.

    PubMed

    Pham, Vi; Lagravère, Manuel O

    2017-03-01

    Determine changes in alveolar bone levels during expansion treatments as assessed through cone-beam computed tomography (CBCT). Sixty-one patients from Edmonton, Canada, with maxillary transverse deficiencies were split into three groups. One group was treated with a bone-anchored expander, another with a tooth-borne maxillary expander (Hyrax), and one group was untreated. CBCTs were obtained from each patient at two time points (initial, T1, and at removal of the appliance after 6 months, T2). CBCTs were analyzed using AVIZO software and landmarks were placed on different dental and skeletal structures. Intra-examiner reliability for landmarks was assessed by randomly selecting 10 images and measuring each landmark 3 times, 1 week apart. Descriptive statistics, intraclass correlation coefficients (ICC) and ANOVA were used to determine whether there were changes to the alveolar bone levels and whether these changes were statistically significant within each group. Landmark reliability showed an ICC of at least 0.99 with a 95% confidence interval and a mean measurement error of at least 0.2067 mm. Descriptive statistics show that changes in alveolar bone levels were less than 1 mm for all three groups and therefore clinically insignificant. Changes between groups were not statistically different (P<0.05) from one another, with the exception of 8 distances; however, since these distances were small, they were not considered clinically significant. Alveolar bone level changes were similar in the maxillary expansion treatments and in the control group. The effects of maxillary expansion treatments on alveolar bone levels are not clinically significant. Copyright © 2016 CEO. Published by Elsevier Masson SAS. All rights reserved.

  5. Longitudinal and Immediate Effect of Kundalini Yoga on Salivary Levels of Cortisol and Activity of Alpha-Amylase and Its Effect on Perceived Stress

    PubMed Central

    García-Sesnich, Jocelyn N; Flores, Mauricio Garrido; Ríos, Marcela Hernández; Aravena, Jorge Gamonal

    2017-01-01

    Context: Stress is defined as an alteration of an organism's balance in response to a demand perceived from the environment. Diverse methods exist to evaluate the physiological response; a noninvasive one is the salivary measurement of cortisol and alpha-amylase. A growing body of evidence suggests that the regular practice of Yoga is an effective treatment for stress. Aims: To determine the effect of Kundalini Yoga (KY), immediately and after 3 months of regular practice, on the perception of psychological stress and on salivary cortisol levels and alpha-amylase activity. Settings and Design: Psychological perceived stress, salivary cortisol levels and alpha-amylase activity were determined and compared between participants attending KY classes for 3 months and a group not practicing any type of yoga. Subjects and Methods: The total sample consisted of 26 people between 18 and 45 years old: 13 taking part in KY classes given at the Faculty of Dentistry, University of Chile, and 13 controls. Salivary samples were collected; an enzyme-linked immunosorbent assay was performed to quantify cortisol, and a kinetic reaction test was used to determine alpha-amylase activity. The Perceived Stress Scale was applied at the beginning and at the end of the intervention. Statistical Analysis Used: Statistical analysis was performed using Stata v11.1 software. The Shapiro–Wilk test was used to determine the data distribution. Paired analyses were performed with the t-test or the Wilcoxon signed-rank test, and the t-test or Mann–Whitney test was applied to compare longitudinal data. Statistical significance was considered when P < 0.05. Results: KY practice had an immediate effect on salivary cortisol. The activity of alpha-amylase did not show significant changes. A significant decrease of perceived stress in the study group was found. Conclusions: KY practice shows an immediate effect on salivary cortisol levels and an effect on perceived stress after 3 months of practice. PMID:28546677

  6. DNA Damage Analysis in Children with Non-syndromic Developmental Delay by Comet Assay.

    PubMed

    Susai, Surraj; Chand, Parkash; Ballambattu, Vishnu Bhat; Hanumanthappa, Nandeesha; Veeramani, Raveendranath

    2016-05-01

    The majority of developmental delays in children are non-syndromic, and they are believed to have underlying DNA damage, though this is not well substantiated. Hence the present study was carried out to find out whether there is increased DNA damage in children with non-syndromic developmental delay, using the comet assay. This case-control study assessed the level of DNA damage in children with non-syndromic developmental delay and compared it with that of age- and sex-matched controls using submarine gel electrophoresis (comet assay). Blood from clinically diagnosed children with non-syndromic developmental delay and from controls was subjected to the alkaline version of the comet assay (single-cell gel electrophoresis) using lymphocytes isolated from peripheral blood. The comets were observed under a bright-field microscope, photocaptured, and scored using the ImageJ image quantification software. Comet parameters were compared between cases and controls, and statistical analysis and interpretation of the results were done using the statistical software SPSS version 20. The mean comet tail length in cases and controls was 20.77±7.659 μm and 8.97±4.398 μm, respectively, which was statistically significant (p<0.001). Other comet parameters, such as total comet length and % DNA in tail, also showed a statistically significant difference (p<0.001) between cases and controls. The current investigation revealed increased levels of DNA damage in children with non-syndromic developmental delay when compared to the controls.

  7. A Course on Macromolecules.

    ERIC Educational Resources Information Center

    Horta, Arturo

    1985-01-01

    Describes a senior-level course that: (1) focuses on the structure and reactions of macromolecules; (2) treats industrial polymers in a unified way; and (3) uses analysis of conformation and conformational statistics as a unifying approach. Also discusses course topics, including polysaccharides, proteins, nucleic acids, and others. (JN)

  8. Statistical analysis of alcohol-related driving trends, 1982-2005

    DOT National Transportation Integrated Search

    2008-05-01

    Overall, the percent of drivers involved in fatal crashes who had consumed alcohol and had blood alcohol concentration (BAC) of .08 or above prior to the crash steadily decreased from 1982 to 1997 and then leveled off (more or less). In an attempt to...

  9. An Establishment-Level Test of the Statistical Discrimination Hypothesis.

    ERIC Educational Resources Information Center

    Tomaskovic-Devey, Donald; Skaggs, Sheryl

    1999-01-01

    Analysis of a sample of 306 workers shows that neither the gender nor racial composition of the workplace is associated with productivity. An alternative explanation for lower wages of women and minorities is social closure--the monopolizing of desirable positions by advantaged workers. (SK)

  10. Bioconductor Workflow for Microbiome Data Analysis: from raw reads to community analyses

    PubMed Central

    Callahan, Ben J.; Sankaran, Kris; Fukuyama, Julia A.; McMurdie, Paul J.; Holmes, Susan P.

    2016-01-01

    High-throughput sequencing of PCR-amplified taxonomic markers (like the 16S rRNA gene) has enabled a new level of analysis of complex bacterial communities known as microbiomes. Many tools exist to quantify and compare abundance levels or OTU composition of communities in different conditions. The sequencing reads have to be denoised and assigned to the closest taxa from a reference database. Common approaches use a notion of 97% similarity and normalize the data by subsampling to equalize library sizes. In this paper, we show that statistical models allow more accurate abundance estimates. By providing a complete workflow in R, we enable the user to do sophisticated downstream statistical analyses, whether parametric or nonparametric. We provide examples of using the R packages dada2, phyloseq, DESeq2, ggplot2 and vegan to filter, visualize and test microbiome data. We also provide examples of supervised analyses using random forests and nonparametric testing using community networks and the ggnetwork package. PMID:27508062

  11. Distribution of the two-sample t-test statistic following blinded sample size re-estimation.

    PubMed

    Lu, Kaifeng

    2016-05-01

    We consider blinded sample size re-estimation based on the simple one-sample variance estimator at an interim analysis. We characterize the exact distribution of the standard two-sample t-test statistic at the final analysis. We describe a simulation algorithm for evaluating the probability of rejecting the null hypothesis at a given treatment effect. We compare the blinded sample size re-estimation method with two unblinded methods with respect to the empirical type I error, the empirical power, and the empirical distribution of the standard deviation estimator and final sample size. We characterize the type I error inflation across the range of standardized non-inferiority margins for non-inferiority trials, and derive the adjusted significance level to ensure type I error control for a given sample size of the internal pilot study. We show that the adjusted significance level increases as the sample size of the internal pilot study increases. Copyright © 2016 John Wiley & Sons, Ltd.
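
    The blinded one-sample variance estimator drives the sample-size update at the interim look. Here is a minimal sketch of that step using the standard normal-approximation sample-size formula (an assumption of this sketch; the paper's contribution is the exact distribution of the final t statistic, which is not reproduced here, and `blinded_reestimate_n` is an illustrative helper):

    ```python
    # Blinded sample-size re-estimation step: the interim data are pooled
    # across arms (treatment labels unknown), a one-sample variance is
    # computed, and the per-arm sample size is updated with the usual
    # normal-approximation formula for a two-sample comparison.
    import math
    from statistics import NormalDist, variance

    def blinded_reestimate_n(pooled_interim, delta, alpha=0.05, power=0.9):
        s2 = variance(pooled_interim)          # blinded (lumped) variance
        z = NormalDist().inv_cdf
        n = 2 * (z(1 - alpha / 2) + z(power)) ** 2 * s2 / delta ** 2
        return max(2, math.ceil(n))            # per-arm size, rounded up

    interim = [4.1, 5.0, 3.8, 6.2, 4.9, 5.5, 4.4, 5.8]  # hypothetical data
    print(blinded_reestimate_n(interim, delta=1.0))
    ```

    Because the lumped variance absorbs part of the between-arm mean difference, the blinded estimator is slightly biased upward under the alternative, which is one reason the exact distribution of the final t statistic matters.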

  12. The application of data mining techniques to oral cancer prognosis.

    PubMed

    Tseng, Wan-Ting; Chiang, Wei-Fan; Liu, Shyun-Yeu; Roan, Jinsheng; Lin, Chun-Nan

    2015-05-01

    This study adopted an integrated procedure that combines the clustering and classification features of data mining technology to determine the differences between the symptoms shown in past cases where patients died from or survived oral cancer. Two data mining tools, namely decision trees and artificial neural networks, were used to analyze the historical cases of oral cancer, and their performance was compared with that of logistic regression, the popular statistical analysis tool. Both the decision tree and artificial neural network models showed superiority to the traditional statistical model. However, for clinicians, the trees created by the decision tree models are easier to interpret than the artificial neural network models. Cluster analysis also revealed that stage 4 patients who possess the following four characteristics have an extremely low survival rate: pN is N2b, level of RLNM is level I-III, AJCC-T is T4, and the cell mutation grade (G) is moderate.

  13. Some Interesting Applications of Probabilistic Techiques in Structural Dynamic Analysis of Rocket Engines

    NASA Technical Reports Server (NTRS)

    Brown, Andrew M.

    2014-01-01

    Numerical and analytical methods were developed to determine damage accumulation in specific engine components when speed variation is included. The dither life ratio (DLR) is shown to be well over a factor of 2 for a specific example. The steady-state assumption is shown to be accurate for most turbopump cases, allowing rapid calculation of DLR. If hot-fire speed data are unknown, a Monte Carlo method is developed that uses speed statistics for similar engines. Application of these techniques allows the analyst to reduce both uncertainty and excess conservatism. High values of DLR could allow a previously unacceptable part to pass HCF criteria without redesign. Given the benefit and ease of implementation, it is recommended that any finite-life turbomachine component analysis adopt these techniques. Probability values were calculated, compared, and evaluated for several industry-proposed methods for combining random and harmonic loads. Two new Excel macros were written to calculate the combined load for any specific probability level. Closed-form curve fits were generated for the widely used 3σ and 2σ probability levels. For the design of lightweight aerospace components, obtaining an accurate, reproducible, statistically meaningful answer is critical.
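
    Combining random and harmonic loads at a chosen probability level can be sketched with a small Monte Carlo simulation. This is an illustrative sine-on-random model, assuming a Gaussian random load plus a harmonic load of uniformly distributed phase; it is not one of the specific industry-proposed methods evaluated in the work above:

    ```python
    # Monte Carlo combination of a Gaussian random load with a harmonic
    # load of uniformly distributed phase (sine-on-random).  Illustrative
    # only: assumes independence of the two load sources.
    import math
    import random

    def combined_load_at(prob, sigma_random, harmonic_amp, n=200_000, seed=1):
        rng = random.Random(seed)
        samples = sorted(
            rng.gauss(0.0, sigma_random)
            + harmonic_amp * math.sin(2.0 * math.pi * rng.random())
            for _ in range(n)
        )
        # Empirical quantile at the requested probability level.
        return samples[min(n - 1, int(prob * n))]

    # Load level exceeded only 0.135% of the time (the "3-sigma" level):
    print(combined_load_at(0.99865, sigma_random=1.0, harmonic_amp=2.0))
    ```

    With the harmonic amplitude set to zero the estimate collapses to the Gaussian quantile (about 3σ at the 0.99865 level), which provides a convenient sanity check on the simulation.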

  14. Nurses' autonomy level in teaching hospitals and its relationship with the underlying factors.

    PubMed

    Amini, Kourosh; Negarandeh, Reza; Ramezani-Badr, Farhad; Moosaeifard, Mahdi; Fallah, Ramezan

    2015-02-01

    This study aimed to determine the autonomy level of nurses in hospitals affiliated with Zanjan University of Medical Sciences, Iran. In this descriptive cross-sectional study, 252 subjects were recruited using a systematic random sampling method. The data were collected using a questionnaire including the Dempster Practice Behavior Scale. Descriptive statistics were used for data analysis, and the t-test and analysis of variance were used to compare the overall score and its subscales according to the demographic variables. The nurses in this study had medium professional autonomy. Statistical tests showed significant differences in the research sample according to age, gender, work experience, working position and place of work. The results of this study revealed that most of the nurses who participated in the study have lower professional autonomy compared with those in Western societies. More studies are needed to determine the factors related to this difference and how Iranian nurses' autonomy can be promoted. © 2013 Wiley Publishing Asia Pty Ltd.

  15. The Automotive Situation in Poznań versus other Cities and National Indexes

    NASA Astrophysics Data System (ADS)

    Kozak, Karolina; Kozak, Miłosław; Merkisz, Jerzy; Nijak, Dawid; Wiśniewska, Bożena

    2012-09-01

    Following the dynamic development of the automotive industry and the economic changes of the last 20 years, Polish transport-related needs and citizen mobility have changed as well. An increased demand for traveling and easy access to individual means of transport in the form of passenger cars put Poznań in the top ten of the largest cities of Poland in terms of motorization level. The paper analyses the current level of motorization of the city of Poznań based on statistical data from the Central Vehicle and Driver Register and the Department of Motor Vehicles in Poznań, and on data published by the Central Office of Statistics. A synthetic analysis of the situation in Poznań is presented against the other largest cities of Poland and the national average. The paper also presents an analysis of the preferences of the citizens of Poznań in terms of engine capacity, fuel type, engine type, and the most popular vehicle makes.

  16. Power flow as a complement to statistical energy analysis and finite element analysis

    NASA Technical Reports Server (NTRS)

    Cuschieri, J. M.

    1987-01-01

    Present methods of analysis of structural response and structure-borne transmission of vibrational energy use either finite element (FE) techniques or statistical energy analysis (SEA) methods. FE methods are a very useful tool at low frequencies, where the number of resonances involved in the analysis is rather small. On the other hand, SEA methods can predict with acceptable accuracy the response and energy transmission between coupled structures at relatively high frequencies, where the structural modal density is high and a statistical approach is the appropriate solution. In the mid-frequency range, a relatively large number of resonances exist, which makes finite element methods too costly; SEA methods, in turn, can only predict an average response level. In this mid-frequency range a possible alternative is to use power flow techniques, in which the input and flow of vibrational energy to excited and coupled structural components are expressed in terms of input and transfer mobilities. The power flow technique can be extended from low to high frequencies and can be integrated with established FE models at low frequencies and SEA models at high frequencies, providing a verification of the method. This method of structural analysis using power flow and mobility methods, and its integration with SEA and FE analysis, is applied to the case of two thin beams joined together at right angles.

  17. The resolving power of in vitro genotoxicity assays for cigarette smoke particulate matter.

    PubMed

    Scott, K; Saul, J; Crooks, I; Camacho, O M; Dillon, D; Meredith, C

    2013-06-01

    In vitro genotoxicity assays are often used to compare tobacco smoke particulate matter (PM) from different cigarettes. The quantitative aspect of the comparisons requires appropriate statistical methods and replication levels to support the interpretation in terms of power and significance. This paper recommends a uniform statistical analysis for the Ames test, the mouse lymphoma mammalian cell mutation assay (MLA) and the in vitro micronucleus test (IVMNT), involving a hierarchical decision process with respect to slope, fixed-effect and single-dose comparisons. With these methods, replication levels of 5 (Ames test TA98), 4 (Ames test TA100), 10 (Ames test TA1537), 6 (MLA) and 4 (IVMNT) resolved a 30% difference in PM genotoxicity. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. Level statistics of words: Finding keywords in literary texts and symbolic sequences

    NASA Astrophysics Data System (ADS)

    Carpena, P.; Bernaola-Galván, P.; Hackenberg, M.; Coronado, A. V.; Oliver, J. L.

    2009-03-01

    Using a generalization of the level statistics analysis of quantum disordered systems, we present an approach able to extract keywords from literary texts automatically. Our approach takes into account not only the frequencies of the words present in the text but also their spatial distribution along the text, and it is based on the fact that relevant words are significantly clustered (i.e., they self-attract each other), while irrelevant words are distributed randomly in the text. Since a reference corpus is not needed, our approach is especially suitable for single documents for which no a priori information is available. In addition, we show that our method also works on generic symbolic sequences (continuous texts without spaces), thus suggesting its general applicability.
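    A minimal sketch of the clustering idea, not the authors' exact level-statistics normalization: measure how irregular the gaps between successive occurrences of a word are. Randomly placed words give roughly geometric gaps, while clustered (relevant) words give a gap coefficient of variation well above that baseline.

```python
import numpy as np

def clustering_score(text, word):
    """Coefficient of variation of the gaps between successive
    occurrences of `word`. Strongly clustered words score high;
    evenly spread words score near zero. A simplified stand-in for
    the level-statistics keyword measure."""
    tokens = text.lower().split()
    positions = [i for i, t in enumerate(tokens) if t == word]
    if len(positions) < 3:
        return None  # too few occurrences to assess clustering
    gaps = np.diff(positions)
    return gaps.std() / gaps.mean()
```

    On toy inputs, a word appearing in two tight bursts scores far higher than the same word spread evenly through the text.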

  19. EVALUATION OF THE EXTRACELLULAR MATRIX OF INJURED SUPRASPINATUS IN RATS

    PubMed Central

    Almeida, Luiz Henrique Oliveira; Ikemoto, Roberto; Mader, Ana Maria; Pinhal, Maria Aparecida Silva; Munhoz, Bruna; Murachovsky, Joel

    2016-01-01

    ABSTRACT Objective: To evaluate the evolution of injuries of the supraspinatus muscle by immunohistochemistry (IHC) and anatomopathological analysis in an animal model (Wistar rats). Methods: Twenty-five Wistar rats were submitted to complete injury of the supraspinatus tendon and subsequently sacrificed in groups of five animals at the following periods: immediately after the injury, 24 h, 48 h, 30 days, and three months after the injury. All groups underwent histological and IHC analysis. Results: Regarding vascular proliferation and inflammatory infiltrate, we found a statistically significant difference between groups 1 (control group) and 2 (24 h after injury). IHC analysis showed that expression of vascular endothelial growth factor (VEGF) differed statistically significantly between groups 1 and 2, and collagen type 1 (Col-1) evaluation presented a statistically significant difference between groups 1 and 4. Conclusion: We observed changes in the extracellular matrix components compatible with remodeling and healing. Remodeling is more intense 24 h after injury. However, VEGF and Col-1 are substantially increased at 24 h and 30 days after the injury, respectively. Level of Evidence I, Experimental Study. PMID:26997907

  20. Modeling Signal-to-Noise Ratio of Otoacoustic Emissions in Workers Exposed to Different Industrial Noise Levels

    PubMed Central

    Nassiri, Parvin; Zare, Sajad; Monazzam, Mohammad R.; Pourbakht, Akram; Azam, Kamal; Golmohammadi, Taghi

    2016-01-01

    Introduction: Noise is considered the most common cause of harmful physical effects in the workplace. A sound that is generated from within the inner ear is known as an otoacoustic emission (OAE). Distortion-product otoacoustic emissions (DPOAEs) assess evoked emission and hearing capacity. The aim of this study was to assess the signal-to-noise ratio at different frequencies and at different times of the work shift in workers exposed to various levels of noise. It was also aimed to provide a statistical model for the signal-to-noise ratio (SNR) of OAEs at different frequencies based on the two variables of sound pressure level (SPL) and exposure time. Materials and Methods: This case-control study was conducted on 45 workers during autumn 2014. The workers were divided into three groups based on the level of noise exposure. The SNR was measured at frequencies of 1000, 2000, 3000, 4000, and 6000 Hz in both ears, and at three different time intervals during the work shift. According to the inclusion criterion, an SNR of 6 dB or greater was included in the study. The analysis was performed using repeated-measures analysis of variance, the Spearman correlation coefficient, and the paired-samples t-test. Results: The results showed that there was no statistically significant difference between the three exposed groups in terms of the mean values of SNR (P > 0.05). Only at a sound pressure level of 88 dBA, in the time interval of 10:30-11:00 AM, was there a statistically significant difference between the right and left ears in the mean SNR values at the 3000 Hz frequency (P = 0.038). The SPL had a significant effect on the SNR in both the right and left ears (P = 0.023, P = 0.041). The effect of the duration of measurement on the SNR was statistically significant in both the right and left ears (P = 0.027, P < 0.001). Conclusion: The findings of this study demonstrated that after noise exposure during the shift, the SNR of OAEs was reduced from the beginning to the end of the shift. PMID:27991472

  1. Hypocretin-1 Levels Associate with Fragmented Sleep in Patients with Narcolepsy Type 1.

    PubMed

    Alakuijala, Anniina; Sarkanen, Tomi; Partinen, Markku

    2016-05-01

    We aimed to analyze the nocturnal sleep characteristics of patients with narcolepsy type 1 (narcolepsy with cataplexy), measured by actigraphy, with respect to the cerebrospinal fluid hypocretin-1 levels of the same patients. Actigraphy recordings of 1-2 weeks and hypocretin-1 concentration analyses were performed on thirty-six unmedicated patients, aged 7 to 63 years, 50% female. Twenty-six of them had hypocretin-1 levels under 30 pg/mL and the rest had levels of 31-79 pg/mL. According to actigraphy, patients with very low hypocretin levels had statistically significantly longer sleep latency (P = 0.033) and more fragmented sleep, indicated by both the number of immobile phases of 1 min (P = 0.020) and the movement + fragmentation index (P = 0.049). There were no statistically significant differences in the actual sleep time or in circadian rhythm parameters measured by actigraphy. Actigraphy gives additional information about the stabilization of sleep in patients with narcolepsy type 1. Very low hypocretin levels are associated with more wake intruding into sleep. © 2016 Associated Professional Sleep Societies, LLC.

  2. Proton pump inhibitor use for 12 months is not associated with changes in serum magnesium levels: a prospective open label comparative study.

    PubMed

    Bahtiri, Elton; Islami, Hilmi; Hoxha, Rexhep; Gashi, Afrim; Thaçi, Kujtim; Karakulak, Çağla; Thaçi, Shpetim; Qorraj Bytyqi, Hasime

    2017-03-01

    Proton pump inhibitors (PPIs) are a widely used class of drugs because of a generally acceptable safety profile. Among recently raised safety issues of the long-term use of PPIs is the increased risk of developing hypomagnesemia. As there have been very few prospective studies measuring serum magnesium levels before and after PPI therapy, we aimed to prospectively assess the potential association between PPI therapy for 12 months and the risk of hypomagnesemia as well as the incidence of new-onset hypomagnesemia during the study. In addition, the association of PPI therapy with the risk of hypocalcemia was assessed. The study included 250 patients with normal serum magnesium and total calcium levels, who underwent a long-term PPI treatment. Serum magnesium, total calcium, and parathormone (PTH) levels were measured at baseline and after 12 months. Of the 250 study participants, 209 completed 12 months of treatment and were included in the statistical analysis. The Wilcoxon signed rank test showed no statistically significant differences in serum magnesium levels between measurements at two different time points. However, there were statistically significant differences in serum total calcium and PTH levels in PPI users. Stable serum magnesium levels were demonstrated after 12 months and no association between PPI use and risk of hypomagnesemia was shown in the general population. Significant reductions of serum total calcium levels were demonstrated among PPI users; nevertheless, further research is required before recommending any serum calcium and PTH level monitoring in patients initiated on long-term PPI therapy.
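    The Wilcoxon signed-rank comparison of baseline versus 12-month serum magnesium can be sketched with SciPy on synthetic data. The concentrations below are invented, simulated under the paper's null result of no true change:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Hypothetical serum magnesium (mmol/L) for 20 patients at baseline and
# after 12 months of PPI therapy, simulated with no true change
baseline = rng.normal(0.85, 0.05, size=20)
month12 = baseline + rng.normal(0.0, 0.03, size=20)

stat, p = stats.wilcoxon(baseline, month12)
# a large p here is consistent with "no statistically significant difference"
```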

  3. Sialochemical Analysis: A Portal for the Oral Diagnostician

    PubMed Central

    Bhambal, Ajay

    2014-01-01

    Background: Depressive disorders, worldwide, may rank second by the year 2020. In India; about 10 million people suffer from depressive disorders, the prevalence rate being recorded as 31.2 for every 1000 individuals. A significant impairment of all personal hygiene may occur due to a depressive episode which in turn may result in altered biochemical composition of some important salivary parameters. The present study was done to assess the relationship and bring about a comparison of certain selective sialochemical alterations between normal and subjects with depressive disorders. Settings and Design: The present study was a hospital- based clinical cross-sectional study which was conducted in Bhopal, the heart of Madhya Pradesh, India. The survey period extended over a period of one year and two months, from May 2009 to July 2010. Material and Methods: Unstimulated whole saliva was analysed biochemically for α- amylase, calcium, sodium, potassium, total proteins and urea. The data obtained in this study were statistically analyzed by using Unpaired Student’s t–test. Results: Salivary calcium and total protein levels were found to be statistically significant among all three groups (p< 0.0001). Salivary amylase levels between Groups II and III and between Groups I and III (p< 0.0001) was statistically significant while the salivary urea levels between Groups I and Group II and between Groups I and III were found to be statistically significant (p< 0.0001). However, there was no statistical difference in their sodium and potassium levels. Conclusions: It was observed that drugs do affect the salivary composition. It was observed that cyclic antidepressants produced significant alteration in the sialochemical constituents of saliva as compared to TCAs and TeCAs. PMID:24995243

  4. One-stage individual participant data meta-analysis models: estimation of treatment-covariate interactions must avoid ecological bias by separating out within-trial and across-trial information.

    PubMed

    Hua, Hairui; Burke, Danielle L; Crowther, Michael J; Ensor, Joie; Tudur Smith, Catrin; Riley, Richard D

    2017-02-28

    Stratified medicine utilizes individual-level covariates that are associated with a differential treatment effect, also known as treatment-covariate interactions. When multiple trials are available, meta-analysis is used to help detect true treatment-covariate interactions by combining their data. Meta-regression of trial-level information is prone to low power and ecological bias, and therefore, individual participant data (IPD) meta-analyses are preferable to examine interactions utilizing individual-level information. However, one-stage IPD models are often wrongly specified, such that interactions are based on amalgamating within- and across-trial information. We compare, through simulations and an applied example, fixed-effect and random-effects models for a one-stage IPD meta-analysis of time-to-event data where the goal is to estimate a treatment-covariate interaction. We show that it is crucial to centre patient-level covariates by their mean value in each trial, in order to separate out within-trial and across-trial information. Otherwise, bias and coverage of interaction estimates may be adversely affected, leading to potentially erroneous conclusions driven by ecological bias. We revisit an IPD meta-analysis of five epilepsy trials and examine age as a treatment effect modifier. The interaction is -0.011 (95% CI: -0.019 to -0.003; p = 0.004), and thus highly significant, when amalgamating within-trial and across-trial information. However, when separating within-trial from across-trial information, the interaction is -0.007 (95% CI: -0.019 to 0.005; p = 0.22), and thus its magnitude and statistical significance are greatly reduced. We recommend that meta-analysts should only use within-trial information to examine individual predictors of treatment effect and that one-stage IPD models should separate within-trial from across-trial information to avoid ecological bias. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. 
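    The recommended separation of within-trial from across-trial information begins by centring each patient-level covariate on its trial mean. A minimal sketch with invented IPD (the column names are hypothetical):

```python
import pandas as pd

# Hypothetical individual participant data: trial id and patient age
df = pd.DataFrame({
    "trial": [1, 1, 1, 2, 2, 2],
    "age":   [30, 40, 50, 60, 70, 80],
})

# Centre age on its mean within each trial, so a treatment-by-age
# interaction built from this column carries only within-trial
# information and avoids ecological bias
df["age_centred"] = df["age"] - df.groupby("trial")["age"].transform("mean")
```

    The trial means themselves can be kept as a separate trial-level covariate if the across-trial association is also of interest.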

  5. Sigma: Strain-level inference of genomes from metagenomic analysis for biosurveillance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahn, Tae-Hyuk; Chai, Juanjuan; Pan, Chongle

    Motivation: Metagenomic sequencing of clinical samples provides a promising technique for direct pathogen detection and characterization in biosurveillance. Taxonomic analysis at the strain level can be used to resolve serotypes of a pathogen in biosurveillance. Sigma was developed for strain-level identification and quantification of pathogens using their reference genomes based on metagenomic analysis. Results: Sigma provides not only accurate strain-level inferences, but also three unique capabilities: (i) Sigma quantifies the statistical uncertainty of its inferences, which includes hypothesis testing of identified genomes and confidence interval estimation of their relative abundances; (ii) Sigma enables strain variant calling by assigning metagenomic reads to their most likely reference genomes; and (iii) Sigma supports parallel computing for fast analysis of large datasets. The algorithm performance was evaluated using simulated mock communities and fecal samples with spike-in pathogen strains. Availability and Implementation: Sigma was implemented in C++ with source codes and binaries freely available at http://sigma.omicsbio.org.

  6. Sigma: Strain-level inference of genomes from metagenomic analysis for biosurveillance

    DOE PAGES

    Ahn, Tae-Hyuk; Chai, Juanjuan; Pan, Chongle

    2014-09-29

    Motivation: Metagenomic sequencing of clinical samples provides a promising technique for direct pathogen detection and characterization in biosurveillance. Taxonomic analysis at the strain level can be used to resolve serotypes of a pathogen in biosurveillance. Sigma was developed for strain-level identification and quantification of pathogens using their reference genomes based on metagenomic analysis. Results: Sigma provides not only accurate strain-level inferences, but also three unique capabilities: (i) Sigma quantifies the statistical uncertainty of its inferences, which includes hypothesis testing of identified genomes and confidence interval estimation of their relative abundances; (ii) Sigma enables strain variant calling by assigning metagenomic reads to their most likely reference genomes; and (iii) Sigma supports parallel computing for fast analysis of large datasets. The algorithm performance was evaluated using simulated mock communities and fecal samples with spike-in pathogen strains. Availability and Implementation: Sigma was implemented in C++ with source codes and binaries freely available at http://sigma.omicsbio.org.

  7. [Organizational climate and burnout syndrome].

    PubMed

    Lubrańska, Anna

    2011-01-01

    The paper addresses the issue of organizational climate and burnout syndrome. It has been assumed that burnout syndrome depends on the work climate (organizational climate); therefore, two concepts were analyzed: that of D. Kolb (organizational climate) and that of Ch. Maslach (burnout syndrome). The research involved 239 persons (122 women, 117 men), aged 21-66. The Maslach Burnout Inventory (MBI) and the Inventory of Organizational Climate were used. The results of statistical methods (correlation analysis, one-variable analysis of variance and regression analysis) evidenced a strong relationship between organizational climate and the burnout dimensions. As depicted by the results, there are important differences in the level of burnout between study participants who work in different types of organizational climate. The results of the statistical analyses indicate that the organizational climate determines burnout syndrome. Therefore, creating supportive conditions at the workplace might reduce the risk of burnout.
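    The regression step can be sketched with invented scores (not the study's data): fit one burnout dimension on a climate rating and inspect the slope and correlation.

```python
import numpy as np

# Hypothetical scores: organizational-climate rating (1 = hostile,
# 6 = supportive) vs. an emotional-exhaustion score (one MBI dimension)
climate = np.array([1, 2, 3, 4, 5, 6], dtype=float)
burnout = np.array([30, 27, 24, 22, 18, 15], dtype=float)

# Least-squares line: burnout ≈ slope * climate + intercept
slope, intercept = np.polyfit(climate, burnout, 1)
# Pearson correlation between the two scores
r = np.corrcoef(climate, burnout)[0, 1]
# a negative slope and strong negative r match the reported pattern:
# more supportive climate, less burnout
```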

  8. Effect of local and global geomagnetic activity on human cardiovascular homeostasis.

    PubMed

    Dimitrova, Svetla; Stoilova, Irina; Yanev, Toni; Cholakov, Ilia

    2004-02-01

    The authors investigated the effects of local and planetary geomagnetic activity on human physiology. They collected data in Sofia, Bulgaria, from a group of 86 volunteers during the periods of the autumnal and vernal equinoxes. They used the factors of local/planetary geomagnetic activity, day of measurement, gender, and medication use in a four-factor multiple analysis of variance. They also used post hoc analysis to establish the statistical significance of the differences between the average values of the measured physiological parameters at the separate factor levels. In addition, the authors performed correlation analysis between the physiological parameters examined and the geophysical factors. The results revealed that geomagnetic changes had a statistically significant influence on arterial blood pressure. Participants showed this reaction with weak local geomagnetic changes as well as when major and severe global geomagnetic storms took place.

  9. Multi-reader ROC studies with split-plot designs: a comparison of statistical methods.

    PubMed

    Obuchowski, Nancy A; Gallas, Brandon D; Hillis, Stephen L

    2012-12-01

    Multireader imaging trials often use a factorial design, in which study patients undergo testing with all imaging modalities and readers interpret the results of all tests for all patients. A drawback of this design is the large number of interpretations required of each reader. Split-plot designs have been proposed as an alternative, in which one or a subset of readers interprets all images of a sample of patients, while other readers interpret the images of other samples of patients. In this paper, the authors compare three methods of analysis for the split-plot design. Three statistical methods are presented: the Obuchowski-Rockette method modified for the split-plot design, a newly proposed marginal-mean analysis-of-variance approach, and an extension of the three-sample U-statistic method. A simulation study using the Roe-Metz model was performed to compare the type I error rate, power, and confidence interval coverage of the three test statistics. The type I error rates for all three methods are close to the nominal level but tend to be slightly conservative. The statistical power is nearly identical for the three methods. The coverage of 95% confidence intervals falls close to the nominal coverage for small and large sample sizes. The split-plot multireader, multicase study design can be statistically efficient compared to the factorial design, reducing the number of interpretations required per reader. Three methods of analysis, shown to have nominal type I error rates, similar power, and nominal confidence interval coverage, are available for this study design. Copyright © 2012 AUR. All rights reserved.

  10. A pedagogical derivation of the matrix element method in particle physics data analysis

    NASA Astrophysics Data System (ADS)

    Sumowidagdo, Suharyo

    2018-03-01

    The matrix element method provides a direct connection between the underlying theory of particle physics processes and detector-level physical observables. I present a pedagogically oriented derivation of the matrix element method, drawing on elementary concepts in probability theory, statistics, and the process of experimental measurement. The level of treatment should be suitable for a beginning research student in phenomenology or experimental high-energy physics.

  11. Multiple imputation methods for bivariate outcomes in cluster randomised trials.

    PubMed

    DiazOrdaz, K; Kenward, M G; Gomes, M; Grieve, R

    2016-09-10

    Missing observations are common in cluster randomised trials. The problem is exacerbated when modelling bivariate outcomes jointly, as the proportion of complete cases is often considerably smaller than the proportion having either of the outcomes fully observed. Approaches taken to handling such missing data include the following: complete case analysis, single-level multiple imputation that ignores the clustering, multiple imputation with a fixed effect for each cluster and multilevel multiple imputation. We contrasted the alternative approaches to handling missing data in a cost-effectiveness analysis that uses data from a cluster randomised trial to evaluate an exercise intervention for care home residents. We then conducted a simulation study to assess the performance of these approaches on bivariate continuous outcomes, in terms of confidence interval coverage and empirical bias in the estimated treatment effects. Missing-at-random clustered data scenarios were simulated following a full-factorial design. Across all the missing data mechanisms considered, the multiple imputation methods provided estimators with negligible bias, while complete case analysis resulted in biased treatment effect estimates in scenarios where the randomised treatment arm was associated with missingness. Confidence interval coverage was generally in excess of nominal levels (up to 99.8%) following fixed-effects multiple imputation and too low following single-level multiple imputation. Multilevel multiple imputation led to coverage levels of approximately 95% throughout. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.

  12. Cerebral oxygenation and desaturations in preterm infants - a longitudinal data analysis.

    PubMed

    Mayer, Benjamin; Pohl, Moritz; Hummler, Helmut D; Schmid, Manuel B

    2017-01-01

    Hypoxemic episodes commonly occur in very preterm infants and may be associated with several adverse effects. Cerebral tissue oxygen saturation (StO2) as measured by near infrared spectroscopy (NIRS) may be a useful measure to assess brain oxygenation. However, knowledge of the variability of StO2 in preterm infants is limited at this time, so the dependency of StO2 on arterial oxygenation (SpO2) and heart rate (HR) was assessed in preterm infants using statistical methods of time series analysis. StO2, SpO2, and HR were recorded from 15 preterm infants every 2 seconds for six hours. Statistical methods of time series and longitudinal data analysis were applied to the data. The mean StO2 level was found to be 72% (95% confidence interval (CI) 55.5%-85.5%) based on a moving-average process of 5-minute order. Accordingly, longitudinal SpO2 measurements showed a mean level of 91% (95% CI 69%-98%). Generally, compensation strategies to cope with both StO2 and SpO2 desaturations were observed in the studied patients. SpO2 had a significant effect on cerebral oxygenation (p < 0.001), but HR did not, which led to inconclusive results considering different time intervals. In infants with intermittent hypoxemia and bradycardia, we found a mean StO2 level of 72% and a strong correlation with SpO2. We observed large differences between individuals in the ability to maintain StO2 at a stable level.
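    With data sampled every 2 seconds, a 5-minute window corresponds to 150 samples. The sketch below shows only the simple smoothing reading of a moving average (the authors fit a moving-average time-series process, which is a related but more formal model):

```python
import numpy as np

def moving_average(x, window):
    """Uniform (unweighted) moving average over `window` samples.
    With 2-second sampling, a 5-minute window is window = 150."""
    kernel = np.ones(window) / window
    return np.convolve(np.asarray(x, dtype=float), kernel, mode="valid")

# toy check: a 3-point window over [1..5] gives the running means
level = moving_average([1, 2, 3, 4, 5], 3)  # [2., 3., 4.]
```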

  13. Patient-reported outcomes before and after treatment of major depressive disorder

    PubMed Central

    IsHak, Waguih William; Mirocha, James; Pi, Sarah; Tobia, Gabriel; Becker, Bret; Peselow, Eric D.; Cohen, Robert M.

    2014-01-01

    Patient-reported outcomes (PROs) of quality of life (QoL), functioning, and depressive symptom severity are important in assessing the burden of illness of major depressive disorder (MDD) and in evaluating the impact of treatment. We sought to provide a detailed analysis of PROs before and after treatment of MDD from the large Sequenced Treatment Alternatives to Relieve Depression (STAR*D) study. This analysis examines PROs before and after treatment in the second level of STAR*D. The complete data on QoL, functioning, and depressive symptom severity were analyzed for each STAR*D level 2 treatment. PROs of QoL, functioning, and depressive symptom severity showed substantial impairments after failing a selective serotonin reuptake inhibitor trial using citalopram (level 1). The seven therapeutic options in level 2 had statistically (P values) and clinically (Cohen's standardized differences [Cohen's d]) significant positive impacts on QoL, functioning, depressive symptom severity, and reduction in calculated burden of illness. There were no statistically significant differences between the interventions. However, a substantial proportion of patients still suffered from patient-reported QoL and functioning impairment after treatment, an effect that was more pronounced in nonremitters. PROs are crucial in understanding the impact of MDD and in examining the effects of treatment interventions, both in research and clinical settings. PMID:25152656

  14. Analyzing gene expression from relative codon usage bias in Yeast genome: a statistical significance and biological relevance.

    PubMed

    Das, Shibsankar; Roymondal, Uttam; Sahoo, Satyabrata

    2009-08-15

    Based on the hypothesis that highly expressed genes are often characterized by strong compositional bias in terms of codon usage, a number of measures are currently in use that quantify codon usage bias in genes and hence provide numerical indices to predict the expression levels of genes. With the recent advent of an expression measure based on the score of the relative codon usage bias (RCBS), we explicitly test the performance of this numerical measure in predicting gene expression level and illustrate this with an analysis of the Yeast genome. In contrast with other previous studies, we observe a weak correlation between GC content and RCBS, but a selective pressure on the codon preferences of highly expressed genes. The assertion that the expression of a given gene depends on the score of relative codon usage bias (RCBS) is supported by the data. We further observe a strong correlation between RCBS and protein length, indicating natural selection in favour of shorter genes being expressed at higher levels. We also attempt a statistical analysis to assess the strength of relative codon bias in genes as a guide to their likely expression level, suggesting a decrease of the informational entropy in highly expressed genes.
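    Codon-usage-bias scores such as RCBS are built on relative codon frequencies. The helper below shows only that counting step on a toy coding sequence; it is not the authors' RCBS formula:

```python
from collections import Counter

def codon_frequencies(cds):
    """Relative codon frequencies of a coding sequence read in frame
    (any trailing partial codon is dropped). A building block of
    codon-usage-bias scores, not the RCBS score itself."""
    cds = cds.upper()
    usable = len(cds) - len(cds) % 3
    codons = [cds[i:i + 3] for i in range(0, usable, 3)]
    n = len(codons)
    return {codon: count / n for codon, count in Counter(codons).items()}

# toy sequence: ATG ATG AAA -> ATG appears 2/3 of the time
freqs = codon_frequencies("ATGATGAAA")
```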

  15. Assistive technology: a health care reform for people with disabilities.

    PubMed

    Santiago-Pintor, Jorge; Hernández-Maldonado, María; Correa-Colón, Angela; Méndez-Fernández, Héctor L

    2009-03-01

    Assistive technology has become one of the most powerful tools in helping people with disabilities fight for social equality, in Puerto Rico as well as in other places worldwide. In spite of this, the availability of assistive technology equipment alone is not enough for people with disabilities to have all the technology resources needed to make them independent and productive in a society as competitive as ours. An assistive technology evaluation process is recommended in order to achieve an optimum level of self-sufficiency in people with disabilities. The evaluation process should take into consideration both the individual's needs and strengths and the advantages and disadvantages of the equipment. The main purpose of this research was to determine the satisfaction level of 69 consumers evaluated at the Assistive Technology Integrated Services Center. These evaluations were conducted during 2001-2005. Statistical tests including frequency distributions, chi-square, bivariate, and variance analyses were performed in order to determine whether an association existed between the consumers' level of satisfaction with the services and the assisted conditions. The data analysis showed a significant difference in satisfaction level by consumer age, type of disability, and recommended equipment acquisition. In addition, statistical associations were established between the dimensions of the general satisfaction concept, type of disability, and consumers' particular characteristics.
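    The chi-square association step can be sketched with an invented contingency table; the counts below are hypothetical, not the study's data:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x3 table: satisfaction (rows: satisfied / unsatisfied)
# by disability type (three columns)
table = np.array([[20, 15, 10],
                  [ 5,  8, 11]])

# Tests independence of satisfaction and disability type
chi2, p, dof, expected = chi2_contingency(table)
# dof = (rows - 1) * (cols - 1) = 2
```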

  16. Canal transportation and centering ability of protaper and self-adjusting file system in long oval canals: An ex-vivo cone-beam computed tomography analysis.

    PubMed

    Shah, Dipali Yogesh; Wadekar, Swati Ishwara; Dadpe, Ashwini Manish; Jadhav, Ganesh Ranganath; Choudhary, Lalit Jayant; Kalra, Dheeraj Deepak

    2017-01-01

    The purpose of this study was to compare and evaluate the shaping ability of the ProTaper (PT) and Self-Adjusting File (SAF) systems using cone-beam computed tomography (CBCT) to assess their performance in oval-shaped root canals. Sixty-two mandibular premolars with single oval canals were divided into two experimental groups (n = 31) according to the systems used: Group I - PT and Group II - SAF. Canals were evaluated before and after instrumentation using CBCT to assess centering ratio and canal transportation at three levels. Data were statistically analyzed using one-way analysis of variance, post hoc Tukey's test, and t-test. The SAF showed better centering ability and less canal transportation than the PT only in the buccolingual plane at the 6 and 9 mm levels. The shaping ability of the PT was best in the apical third in both planes. The SAF had statistically significantly better centering and less canal transportation in the buccolingual than in the mesiodistal plane at the middle and coronal levels. The SAF produced significantly less transportation and remained more centered than the PT at the middle and coronal levels in the buccolingual plane of oval canals. In the mesiodistal plane, the performance of the two systems was comparable.
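    The one-way ANOVA step can be sketched with synthetic transportation values (invented, not the study's measurements):

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(1)
# Hypothetical canal-transportation values (mm) at the three measured
# levels of one group; f_oneway tests whether the group means differ
coronal = rng.normal(0.10, 0.02, size=12)
middle  = rng.normal(0.09, 0.02, size=12)
apical  = rng.normal(0.05, 0.02, size=12)

F, p = f_oneway(coronal, middle, apical)
# a small p would then be followed by a post hoc test (e.g. Tukey's HSD)
# to locate which pairs of levels differ
```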

  17. Statistical comparison of pooled nitrogen washout data of various altitude decompression response groups

    NASA Technical Reports Server (NTRS)

    Edwards, B. F.; Waligora, J. M.; Horrigan, D. J., Jr.

    1985-01-01

    This analysis was done to determine whether various decompression response groups could be characterized by the pooled nitrogen (N2) washout profiles of the group members. Pooling individual washout profiles provided a smooth, time-dependent function of means representative of each decompression response group. No statistically significant differences were detected. The statistical comparisons of the profiles were performed by means of univariate weighted t-tests at each 5-minute profile point, at significance levels of 5 and 10 percent. The estimated powers of the tests (i.e., probabilities) to detect the observed differences in the pooled profiles were of the order of 8 to 30 percent.
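    The point-by-point comparison can be sketched as a t statistic computed at each 5-minute profile point; a plain Welch (unequal-variance) t stands in here for the paper's weighted variant, and the profiles are invented:

```python
import numpy as np

def welch_t(a, b):
    """Welch two-sample t statistic (unequal variances) for two
    groups of observations at a single profile point."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
    return (a.mean() - b.mean()) / se

# Hypothetical N2 washout profiles: rows = subjects, cols = 5-min points
grp1 = np.array([[100, 80, 60], [98, 78, 62], [102, 82, 58]], dtype=float)
grp2 = np.array([[101, 81, 61], [99, 79, 59], [100, 80, 60]], dtype=float)

# One t statistic per 5-minute profile point
t_per_point = [welch_t(grp1[:, j], grp2[:, j]) for j in range(grp1.shape[1])]
```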

  18. Dark Energy Survey Year 1 Results: Multi-Probe Methodology and Simulated Likelihood Analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krause, E.; et al.

    We present the methodology for and detail the implementation of the Dark Energy Survey (DES) 3x2pt DES Year 1 (Y1) analysis, which combines configuration-space two-point statistics from three different cosmological probes: cosmic shear, galaxy-galaxy lensing, and galaxy clustering, using data from the first year of DES observations. We have developed two independent modeling pipelines and describe the code validation process. We derive expressions for analytical real-space multi-probe covariances, and describe their validation with numerical simulations. We stress-test the inference pipelines in simulated likelihood analyses that vary 6-7 cosmology parameters plus 20 nuisance parameters and precisely resemble the analysis to be presented in the DES 3x2pt analysis paper, using a variety of simulated input data vectors with varying assumptions. We find that any disagreement between pipelines leads to changes in assigned likelihood of Δχ² ≤ 0.045 with respect to the statistical error of the DES Y1 data vector. We also find that angular binning and survey mask do not impact our analytic covariance at a significant level. We determine lower bounds on scales used for analysis of galaxy clustering (8 Mpc h⁻¹) and galaxy-galaxy lensing (12 Mpc h⁻¹) such that the impact of modeling uncertainties in the non-linear regime is well below statistical errors, and show that our analysis choices are robust against a variety of systematics. These tests demonstrate that we have a robust analysis pipeline that yields unbiased cosmological parameter inferences for the flagship 3x2pt DES Y1 analysis. We emphasize that the level of independent code development and subsequent code comparison as demonstrated in this paper is necessary to produce credible constraints from increasingly complex multi-probe analyses of current data.

  19. A clinical study of electrophysiological correlates of behavioural comfort levels in cochlear implantees.

    PubMed

    Raghunandhan, S; Ravikumar, A; Kameswaran, Mohan; Mandke, Kalyani; Ranjith, R

    2014-05-01

    Indications for cochlear implantation have expanded today to include very young children and those with syndromes/multiple handicaps. Programming the implant based on behavioural responses may be tedious for audiologists in such cases, wherein achieving an effective and appropriate MAP (Measurable Auditory Percept) becomes the key issue in the habilitation program. In 'Difficult to MAP' scenarios, objective measures become paramount to predict optimal current levels to be set in the MAP. We aimed to (a) study the trends in multi-modal electrophysiological tests and behavioural responses sequentially over the first year of implant use; (b) generate normative data from the above; (c) correlate the multi-modal electrophysiological threshold levels with behavioural comfort levels; and (d) create predictive formulae for deriving optimal comfort levels (if unknown), using linear and multiple regression analysis. This prospective study included 10 profoundly hearing-impaired children aged between 2 and 7 years with normal inner ear anatomy and no additional handicaps. They received the Advanced Bionics HiRes 90 K Implant with Harmony Speech processor and used HiRes-P with Fidelity 120 strategy. They underwent impedance telemetry, neural response imaging, electrically evoked stapedial response telemetry (ESRT), and electrically evoked auditory brainstem response (EABR) tests at 1, 4, 8, and 12 months of implant use, in conjunction with behavioural mapping. Trends in electrophysiological and behavioural responses were analyzed using paired t-tests. By Karl Pearson's correlation method, electrode-wise correlations were derived for neural response imaging (NRI) thresholds versus most comfortable levels (M-levels), and offset-based (apical, mid-array, and basal array) correlations for EABR and ESRT thresholds versus M-levels were calculated over time. These were used to derive predictive formulae by linear and multiple regression analysis. 
Such statistically predicted M-levels were compared with the behaviourally recorded M-levels among the cohort, using Cronbach's alpha reliability testing to confirm the efficacy of this approach. NRI, ESRT, and EABR thresholds showed statistically significant positive correlations with behavioural M-levels, which improved with implant use over time. These correlations were used to derive predicted M-levels using regression analysis. On average, predicted M-levels were found to be statistically reliable and were a fair match to the actual behavioural M-levels. When applied in clinical practice, the predicted values were found to be useful for programming members of the study group. However, individuals showed considerable deviations in behavioural M-levels, above and below the electrophysiologically predicted values, due to various factors. While the current method appears helpful as a reference to predict initial maps in 'difficult to Map' subjects, behavioural measures remain mandatory to further optimize the maps for these individuals. The study explores the trends, correlations and individual variabilities that occur between electrophysiological tests and behavioural responses, recorded over time among a cohort of cochlear implantees. The statistical method shown may be used as a guideline to predict optimal behavioural levels in difficult situations among future implantees, bearing in mind that optimal M-levels for individuals can vary from predicted values. In 'Difficult to MAP' scenarios, following a protocol of sequential behavioural programming in conjunction with electrophysiological correlates will provide the best outcomes.
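    In its simplest linear form, the regression step described above (predicting behavioural M-levels from electrophysiological thresholds) reduces to ordinary least squares. A minimal sketch with hypothetical NRI thresholds and M-levels (the numbers are illustrative, not the study's data):

```python
from statistics import mean

def fit_line(x, y):
    """Ordinary least squares for y = a + b*x."""
    mx, my = mean(x), mean(y)
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    return my - b * mx, b

# Hypothetical NRI thresholds and behavioural M-levels (device units)
# for one electrode across several subjects -- not the study's data:
nri = [180.0, 200.0, 210.0, 230.0, 250.0]
m_level = [190.0, 205.0, 215.0, 240.0, 255.0]
a, b = fit_line(nri, m_level)
predicted_m = a + b * 220.0  # predicted M-level for a new NRI threshold
```

The study's multiple-regression variant adds further predictors (ESRT, EABR) in the same spirit.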

  20. [How to fit and interpret multilevel models using SPSS].

    PubMed

    Pardo, Antonio; Ruiz, Miguel A; San Martín, Rafael

    2007-05-01

    Hierarchic or multilevel models are used to analyse data when cases belong to known groups and sample units are selected both from the individual level and from the group level. In this work, the multilevel models most commonly discussed in the statistical literature are described, explaining how to fit these models using the SPSS program (version 11 or later) and how to interpret the outcomes of the analysis. Five particular models are described, fitted, and interpreted: (1) one-way analysis of variance with random effects, (2) regression analysis with means-as-outcomes, (3) one-way analysis of covariance with random effects, (4) regression analysis with random coefficients, and (5) regression analysis with means- and slopes-as-outcomes. All models are explained with the aim of making them understandable to researchers in the health and behavioural sciences.
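    Model (1), the one-way analysis of variance with random effects, is the simplest of the five, and its variance components can be estimated by the method of moments even without SPSS. A sketch for balanced, hypothetical data (the model is y_ij = mu + u_j + e_ij):

```python
from statistics import mean

def variance_components(groups):
    """Method-of-moments estimates for the one-way random-effects model
    y_ij = mu + u_j + e_ij, assuming balanced groups of size n."""
    k = len(groups)
    n = len(groups[0])
    grand = mean(v for g in groups for v in g)
    group_means = [mean(g) for g in groups]
    msb = n * sum((m - grand) ** 2 for m in group_means) / (k - 1)
    msw = sum((v - m) ** 2
              for g, m in zip(groups, group_means) for v in g) / (k * (n - 1))
    sigma2_u = max((msb - msw) / n, 0.0)  # between-group variance
    icc = sigma2_u / (sigma2_u + msw)     # intraclass correlation
    return sigma2_u, msw, icc

# Hypothetical data: three groups of three observations each.
groups = [[10.0, 12.0, 11.0], [14.0, 15.0, 16.0], [9.0, 8.0, 10.0]]
sigma2_u, sigma2_e, icc = variance_components(groups)
```

SPSS fits the same model by (restricted) maximum likelihood, but the moment estimates above convey what the variance components mean.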

  1. Analysis of Information Content in High-Spectral Resolution Sounders using Subset Selection Analysis

    NASA Technical Reports Server (NTRS)

    Velez-Reyes, Miguel; Joiner, Joanna

    1998-01-01

    In this paper, we summarize the results of the sensitivity analysis and data reduction carried out to determine the information content of AIRS and IASI channels. The analysis and data reduction were based on the use of subset selection techniques developed in the linear algebra and statistics communities to study linear dependencies in high-dimensional data sets. We applied the subset selection method to study dependency among channels by studying the dependency among their weighting functions. We also applied the technique to study the information provided by the different levels into which the atmosphere is discretized for retrievals and analysis. Results from the method correlate well with intuition in many respects and point to possible modifications for band selection in sensor design and for the number and location of levels in the analysis process.

  2. Statistical analysis of early failures in electromigration

    NASA Astrophysics Data System (ADS)

    Gall, M.; Capasso, C.; Jawarani, D.; Hernandez, R.; Kawasaki, H.; Ho, P. S.

    2001-07-01

    The detection of early failures in electromigration (EM) and the complicated statistical nature of this important reliability phenomenon have been difficult issues to treat in the past. A satisfactory experimental approach for the detection and statistical analysis of early failures has not yet been established, mainly because of the rare occurrence of early failures and the difficulties of testing large sample populations. Furthermore, experimental data on EM behavior as a function of the number of failure links are scarce. In this study, a technique utilizing large interconnect arrays in conjunction with the well-known Wheatstone Bridge is presented. Three types of structures with a varying number of Ti/TiN/Al(Cu)/TiN-based interconnects were used, starting from a small unit of five lines in parallel. A serial arrangement of this unit enabled testing of interconnect arrays encompassing 480 possible failure links. In addition, a Wheatstone Bridge-type wiring using four large arrays in each device enabled simultaneous testing of 1920 interconnects. In conjunction with a statistical deconvolution to the single interconnect level, the results indicate that the electromigration failure mechanism studied here follows perfect lognormal behavior down to the four sigma level. The statistical deconvolution procedure is described in detail. Over a temperature range from 155 to 200 °C, a total of more than 75 000 interconnects were tested. None of the samples showed an indication of early, or alternate, failure mechanisms. The activation energy of the EM mechanism studied here, namely the Cu incubation time, was determined to be Q=1.08±0.05 eV. We surmise that interface diffusion of Cu along the Al(Cu) sidewalls and along the top and bottom refractory layers, coupled with grain boundary diffusion within the interconnects, constitutes the Cu incubation mechanism.
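    The deconvolution from array-level to single-interconnect failure statistics rests on the standard weakest-link relation for N elements in series: F_array(t) = 1 - (1 - F_single(t))^N, since an array fails as soon as its first link fails. Inverting it is a one-liner (the 10% failure fraction below is hypothetical, not from the study):

```python
def single_link_cdf(f_array, n_links):
    """Weakest-link deconvolution: an array of n_links elements in
    series fails when its first link fails, so
    F_array = 1 - (1 - F_single)**n_links, inverted here."""
    return 1.0 - (1.0 - f_array) ** (1.0 / n_links)

# Hypothetical: 10% of the 480-link arrays have failed by some time t.
f_single = single_link_cdf(0.10, 480)
```

Applying this point-by-point over the array failure-time distribution recovers the single-link CDF, which the study then checks against a lognormal down to the four-sigma level.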

  3. Effects of Omega-3 Fatty Acid Supplementation on Glucose Control and Lipid Levels in Type 2 Diabetes: A Meta-Analysis

    PubMed Central

    Chen, Cai; Yu, Xuefeng; Shao, Shiying

    2015-01-01

    Background Many studies assessed the impact of marine omega-3 fatty acids on glycemic homeostasis and lipid profiles in patients with type 2 diabetes (T2DM), but reported controversial results. Our goal was to systematically evaluate the effects of omega-3 on glucose control and lipid levels. Methods Medline, Pubmed, Cochrane Library, Embase, the National Research Register, and SIGLE were searched to identify eligible randomized clinical trials (RCTs). Extracted data from RCTs were analyzed using STATA 11.0 statistical software with fixed or random effects model. Effect sizes were presented as weighted mean differences (WMD) with 95% confidence intervals (95% CI). Heterogeneity was assessed using the Chi-square test with significance level set at p < 0.1. Results Twenty RCTs were included in this meta-analysis. Among patients with omega-3 supplementation, triglyceride (TG) levels were significantly decreased by 0.24 mmol/L. No marked change in total cholesterol (TC), HbA1c, fasting plasma glucose, postprandial plasma glucose, BMI or body weight was observed. A high EPA/DHA ratio contributed to a greater decreasing tendency in plasma insulin, HbA1c, TC, TG, and BMI measures, although no statistical significance was identified (except for TG). FPG levels were increased by 0.42 mmol/L in Asians. No evidence of publication bias was observed in this meta-analysis. Conclusions The EPA/DHA ratio and early intervention with omega-3 fatty acids may affect their effects on glucose control and lipid levels, which may serve as a dietary reference for clinicians or nutritionists who manage diabetic patients. PMID:26431431
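    The weighted-mean-difference pooling used in such meta-analyses can be sketched with the standard inverse-variance fixed-effect estimator (the per-study values below are hypothetical, not the trial data):

```python
from math import sqrt

def fixed_effect_wmd(diffs, ses):
    """Inverse-variance fixed-effect pooling of per-study mean
    differences: each study is weighted by 1/SE**2."""
    w = [1.0 / se ** 2 for se in ses]
    pooled = sum(wi * di for wi, di in zip(w, diffs)) / sum(w)
    se = sqrt(1.0 / sum(w))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical per-study TG mean differences (mmol/L) and standard errors:
diffs = [-0.30, -0.20, -0.25]
ses = [0.10, 0.08, 0.12]
wmd, ci = fixed_effect_wmd(diffs, ses)
```

A random-effects model adds a between-study variance term to each weight; STATA's metan command implements both.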

  4. Probabilistic models for reactive behaviour in heterogeneous condensed phase media

    NASA Astrophysics Data System (ADS)

    Baer, M. R.; Gartling, D. K.; DesJardin, P. E.

    2012-02-01

    This work presents statistically-based models to describe reactive behaviour in heterogeneous energetic materials. Mesoscale effects are incorporated in continuum-level reactive flow descriptions using probability density functions (pdfs) that are associated with thermodynamic and mechanical states. A generalised approach is presented that includes multimaterial behaviour by treating the volume fraction as a random kinematic variable. Model simplifications are then sought to reduce the complexity of the description without compromising the statistical approach. Reactive behaviour is first considered for non-deformable media having a random temperature field as an initial state. A pdf transport relationship is derived and an approximate moment approach is incorporated in finite element analysis to model an example application whereby a heated fragment impacts a reactive heterogeneous material which leads to a delayed cook-off event. Modelling is then extended to include deformation effects associated with shock loading of a heterogeneous medium whereby random variables of strain, strain-rate and temperature are considered. A demonstrative mesoscale simulation of a non-ideal explosive is discussed that illustrates the joint statistical nature of the strain and temperature fields during shock loading to motivate the probabilistic approach. This modelling is derived in a Lagrangian framework that can be incorporated in continuum-level shock physics analysis. Future work will consider particle-based methods for a numerical implementation of this modelling approach.

  5. Identifying Pleiotropic Genes in Genome-Wide Association Studies for Multivariate Phenotypes with Mixed Measurement Scales

    PubMed Central

    Williams, L. Keoki; Buu, Anne

    2017-01-01

    We propose a multivariate genome-wide association test for mixed continuous, binary, and ordinal phenotypes. A latent response model is used to estimate the correlation between phenotypes with different measurement scales so that the empirical distribution of the Fisher’s combination statistic under the null hypothesis is estimated efficiently. The simulation study shows that our proposed correlation estimation methods have high levels of accuracy. More importantly, our approach conservatively estimates the variance of the test statistic so that the type I error rate is controlled. The simulation also shows that the proposed test maintains the power at a level very close to that of the ideal analysis based on known latent phenotypes while controlling the type I error. In contrast, conventional approaches (dichotomizing all observed phenotypes or treating them as continuous variables) could either reduce the power or employ a linear regression model unfit for the data. Furthermore, the statistical analysis on the database of the Study of Addiction: Genetics and Environment (SAGE) demonstrates that conducting a multivariate test on multiple phenotypes can increase the power of identifying markers that may not otherwise be chosen using marginal tests. The proposed method also offers a new approach to analyzing the Fagerström Test for Nicotine Dependence as multivariate phenotypes in genome-wide association studies. PMID:28081206
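    The Fisher's combination statistic mentioned above combines per-phenotype p-values as X² = -2 Σ ln pᵢ, which is chi-square with 2k degrees of freedom under the null when the k tests are independent (handling the correlated case is the paper's contribution). A minimal sketch of the independent-test version:

```python
from math import exp, log

def fisher_combination(pvals):
    """Fisher's method for k independent p-values:
    X2 = -2 * sum(ln p_i) is chi-square with 2k df under the null."""
    k = len(pvals)
    x2 = -2.0 * sum(log(p) for p in pvals)
    # Chi-square survival function in closed form for even df = 2k:
    # P(X > x) = exp(-x/2) * sum_{i<k} (x/2)**i / i!
    half = x2 / 2.0
    term, tail = 1.0, 0.0
    for i in range(k):
        tail += term
        term *= half / (i + 1)
    return x2, exp(-half) * tail

x2, combined_p = fisher_combination([0.01, 0.20, 0.40])
```

For correlated phenotypes the chi-square reference distribution no longer holds, which is why the paper estimates the null distribution empirically.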

  6. Hydrostatic paradox: experimental verification of pressure equilibrium

    NASA Astrophysics Data System (ADS)

    Kodejška, Č.; Ganci, S.; Říha, J.; Sedláčková, H.

    2017-11-01

    This work is focused on the experimental verification of the balance between the atmospheric pressure acting on the sheet of paper that closes, from below, a cylinder completely or partially filled with water, with the hydrostatic pressure of the water column acting against the atmospheric pressure. The paper first presents a theoretical analysis of the problem, based on the equation for an isothermal process and on the equality of pressures inside and outside the cylinder. The measured values confirm the theoretically predicted quadratic dependence of the air pressure inside the cylinder on the level of the liquid; the maximum change in the volume of air within the cylinder occurs when the height of the water column L is one half of the total height of the vessel H. The measurements were made for different diameters of the cylinder and with plates made of different materials placed at the bottom of the cylinder to prevent liquid from flowing out. The measured values were subjected to statistical analysis, which confirmed the null hypothesis, i.e. that the measured values are not statistically significantly different from the theoretically calculated ones at the significance level α = 0.05.
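    The quadratic dependence follows from Boyle's law for the trapped air column combined with the pressure balance at the paper: Δh = (H - L)·ρgL/(p₀ - ρgL) ≈ (ρg/p₀)·L(H - L), which is maximal near L = H/2, as the abstract states. A numerical check of that derivation (the 30 cm cylinder height is an assumption, not the paper's apparatus):

```python
RHO, G, P0 = 1000.0, 9.81, 101325.0   # water, gravity, atmosphere (SI)

def air_expansion(L, H):
    """Expansion dh of the trapped air column for water level L in a
    cylinder of height H, from Boyle's law p0*(H-L) = p_in*(H-L+dh)
    and the balance p_in = p0 - RHO*G*L at the paper seal."""
    p_in = P0 - RHO * G * L
    return (H - L) * (P0 / p_in - 1.0)

H = 0.30                                    # assumed 30 cm cylinder
levels = [i * H / 1000.0 for i in range(1, 1000)]
dh = [air_expansion(L, H) for L in levels]
L_max = levels[dh.index(max(dh))]           # close to H/2
```

The exact maximum sits a fraction of a millimetre above H/2 because the denominator also depends on L; to first order in ρgH/p₀ it is at H/2.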

  7. Correlation of sweat chloride and percent predicted FEV1 in cystic fibrosis patients treated with ivacaftor.

    PubMed

    Fidler, Meredith C; Beusmans, Jack; Panorchan, Paul; Van Goor, Fredrick

    2017-01-01

    Ivacaftor, a CFTR potentiator that enhances chloride transport by acting directly on CFTR to increase its channel gating activity, has been evaluated in patients with different CFTR mutations. Several previous analyses have reported no statistical correlation between change from baseline in ppFEV1 and reduction in sweat chloride levels for individuals treated with ivacaftor. The objective of the post hoc analysis described here was to expand upon previous analyses and evaluate the correlation between sweat chloride levels and absolute ppFEV1 changes across multiple cohorts of patients with different CF-causing mutations who were treated with ivacaftor. The goal of the analysis was to help define the potential value of sweat chloride as a pharmacodynamic biomarker for use in CFTR modulator trials. For any given study, reductions in sweat chloride levels and improvements in absolute ppFEV1 were not correlated for individual patients. However, when the data from all studies were combined, a statistically significant correlation between sweat chloride levels and ppFEV1 changes was observed (p<0.0001). Thus, sweat chloride level changes in response to potentiation of the CFTR protein by ivacaftor appear to be a predictive pharmacodynamic biomarker of lung function changes on a population basis but are unsuitable for the prediction of treatment benefits for individuals. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
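    The pooling effect described above (no within-study correlation, yet a significant correlation once studies are combined) arises whenever cohort means differ systematically. A hypothetical illustration with made-up numbers in which larger sweat chloride reductions coincide with larger ppFEV1 gains only across cohorts, not within them:

```python
from math import sqrt

def pearson(x, y):
    """Plain Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / sqrt(sxx * syy)

# Hypothetical cohorts: within each, chloride change and ppFEV1 change
# are uncorrelated, but the stronger-responding cohort has both the
# larger chloride drop and the larger ppFEV1 gain.
study1_cl, study1_fev = [-52.0, -50.0, -48.0], [11.0, 8.0, 11.0]
study2_cl, study2_fev = [-22.0, -20.0, -18.0], [4.0, 1.0, 4.0]

r1 = pearson(study1_cl, study1_fev)   # no correlation within study 1
r2 = pearson(study2_cl, study2_fev)   # no correlation within study 2
r_pooled = pearson(study1_cl + study2_cl, study1_fev + study2_fev)
```

The strongly negative pooled r despite zero within-study correlation mirrors the population-versus-individual distinction drawn in the abstract.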

  8. Online and offline tools for head movement compensation in MEG.

    PubMed

    Stolk, Arjen; Todorovic, Ana; Schoffelen, Jan-Mathijs; Oostenveld, Robert

    2013-03-01

    Magnetoencephalography (MEG) is measured above the head, which makes it sensitive to variations of the head position with respect to the sensors. Head movements blur the topography of the neuronal sources of the MEG signal, increase localization errors, and reduce statistical sensitivity. Here we describe two novel and readily applicable methods that compensate for the detrimental effects of head motion on the statistical sensitivity of MEG experiments. First, we introduce an online procedure that continuously monitors head position. Second, we describe an offline analysis method that takes into account the head position time-series. We quantify the performance of these methods in the context of three different experimental settings, involving somatosensory, visual and auditory stimuli, assessing both individual and group-level statistics. The online head localization procedure allowed for optimal repositioning of the subjects over multiple sessions, resulting in a 28% reduction of the variance in dipole position and an improvement of up to 15% in statistical sensitivity. Offline incorporation of the head position time-series into the general linear model resulted in improvements of group-level statistical sensitivity between 15% and 29%. These tools can substantially reduce the influence of head movement within and between sessions, increasing the sensitivity of many cognitive neuroscience experiments. Copyright © 2012 Elsevier Inc. All rights reserved.

  9. Retrospective space-time cluster analysis of whooping cough, re-emergence in Barcelona, Spain, 2000-2011.

    PubMed

    Solano, Rubén; Gómez-Barroso, Diana; Simón, Fernando; Lafuente, Sarah; Simón, Pere; Rius, Cristina; Gorrindo, Pilar; Toledo, Diana; Caylà, Joan A

    2014-05-01

    A retrospective, space-time study of whooping cough cases reported to the Public Health Agency of Barcelona, Spain between 2000 and 2011 is presented. It is based on 633 individual whooping cough cases and the 2006 population census from the Spanish National Statistics Institute, stratified by age and sex at the census tract level. Cluster identification was attempted using the space-time scan statistic, assuming a Poisson distribution and restricting the temporal extent to 7 days and the spatial distance to 500 m. Statistical calculations were performed with Stata 11 and SaTScan, and mapping was performed with ArcGIS 10.0. Only clusters showing statistical significance (P <0.05) were mapped. The most likely cluster identified included five census tracts located in three neighbourhoods in central Barcelona during the week from 17 to 23 August 2011. This cluster included five cases compared with an expected count of 0.0021 (relative risk = 2436, P <0.001). In addition, 11 secondary significant space-time clusters were detected, occurring at different times and locations. Spatial statistics can usefully complement epidemiological surveillance systems by visualizing excesses in the number of cases in space and time, thereby increasing the possibility of identifying outbreaks not reported by the surveillance system.
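    The space-time scan statistic evaluates each candidate cluster with a Poisson likelihood ratio. A sketch of Kulldorff's log likelihood ratio, applied to numbers in the spirit of the reported cluster (the resulting relative risk differs slightly from the published 2436, which SaTScan computes from the full case data):

```python
from math import log

def poisson_scan_llr(c, expected, total):
    """Log likelihood ratio of Kulldorff's space-time scan statistic
    (Poisson model) for a candidate cluster: c observed cases,
    'expected' cases under the null, 'total' cases overall."""
    if c <= expected:
        return 0.0
    llr = c * log(c / expected)
    if total > c:
        llr += (total - c) * log((total - c) / (total - expected))
    return llr

# Numbers in the spirit of the most likely cluster: 5 observed cases
# where 0.0021 were expected, out of 633 cases in total.
llr = poisson_scan_llr(5, 0.0021, 633)
rr = (5 / 0.0021) / ((633 - 5) / (633 - 0.0021))   # indicative only
```

SaTScan maximizes this LLR over all candidate cylinders and obtains p-values by Monte Carlo replication under the null.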

  10. Meta-Analysis: Effects of Probiotic Supplementation on Lipid Profiles in Normal to Mildly Hypercholesterolemic Individuals

    PubMed Central

    Shimizu, Mikiko; Hashiguchi, Masayuki; Shiga, Tsuyoshi; Tamura, Hiro-omi; Mochizuki, Mayumi

    2015-01-01

    Introduction Recent experimental and clinical studies have suggested that probiotic supplementation has beneficial effects on serum lipid profiles. However, there are conflicting results on the efficacy of probiotic preparations in reducing serum cholesterol. Objective To evaluate the effects of probiotics on human serum lipid levels, we conducted a meta-analysis of interventional studies. Methods Eligible reports were obtained by searches of electronic databases. We included randomized, controlled clinical trials comparing probiotic supplementation with placebo or no treatment (control). Statistical analysis was performed with Review Manager 5.3.3. Subanalyses were also performed. Results Eleven of 33 randomized clinical trials retrieved were eligible for inclusion in the meta-analysis. No participant had received any cholesterol-lowering agent. Probiotic interventions (including fermented milk products and probiotics) produced changes in total cholesterol (TC) (mean difference –0.17 mmol/L, 95% CI: –0.27 to –0.07 mmol/L) and low-density lipoprotein cholesterol (LDL-C) (mean difference –0.22 mmol/L, 95% CI: –0.30 to –0.13 mmol/L). High-density lipoprotein cholesterol and triglyceride levels did not differ significantly between probiotic and control groups. In subanalysis, long-term (>4-week) probiotic intervention was statistically more effective in decreasing TC and LDL-C than short-term (≤4-week) intervention. The decreases in TC and LDL-C levels with probiotic intervention were greater in mildly hypercholesterolemic than in normocholesterolemic individuals. Both fermented milk product and probiotic preparations decreased TC and LDL-C levels. Gaio and the Lactobacillus acidophilus strain reduced TC and LDL-C levels to a greater extent than other bacterial strains. 
Conclusions In conclusion, this meta-analysis showed that probiotic supplementation could be useful in the primary prevention of hypercholesterolemia and may lead to reductions in risk factors for cardiovascular disease. PMID:26473340

  11. Development of computer-assisted instruction application for statistical data analysis android platform as learning resource

    NASA Astrophysics Data System (ADS)

    Hendikawati, P.; Arifudin, R.; Zahid, M. Z.

    2018-03-01

    This study aims to design an Android statistical data analysis application that can be accessed through mobile devices, making it easier for users to access. The application covers various topics in basic statistics along with a parametric statistical data analysis module. The output of the application is parametric statistical analysis that can be used by students, lecturers, and other users who need the results of statistical calculations quickly and in an easily understood form. The Android application is developed in the Java programming language. The server side uses PHP with the CodeIgniter framework, and the database used is MySQL. The system development methodology is the Waterfall methodology, with the stages of analysis, design, coding, testing, implementation, and system maintenance. This statistical data analysis application is expected to support statistics lectures and to make it easier for students to understand statistical analysis on mobile devices.

  12. Cigarette taxes and respiratory cancers: new evidence from panel co-integration analysis.

    PubMed

    Liu, Echu; Yu, Wei-Choun; Hsieh, Hsin-Ling

    2011-01-01

    Using a set of state-level longitudinal data from 1954 through 2005, this study investigates the "long-run equilibrium" relationship between cigarette excise taxes and the mortality rates of respiratory cancers in the United States. Statistical tests show that both cigarette excise taxes in real terms and mortality rates from respiratory cancers contain unit roots and are co-integrated. Estimates of co-integrating vectors indicated that a 10 percent increase in real cigarette excise tax rate leads to a 2.5 percent reduction in respiratory cancer mortality rate, implying a decline of 3,922 deaths per year, on a national level in the long run. These effects are statistically significant at the one percent level. Moreover, estimates of co-integrating vectors show that higher cigarette excise tax rates lead to lower mortality rates in most states; however, this relationship does not hold for Alaska, Florida, Hawaii, and Texas.
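    The headline numbers can be reconciled with a few lines of arithmetic, reading the co-integrating coefficient as a long-run elasticity (the log-log interpretation and the implied national baseline are inferences from the abstract's figures, not taken from the paper itself):

```python
# Reading the co-integrating coefficient as a long-run elasticity
# (log-log form): %change in mortality = elasticity * %change in tax.
elasticity = -0.25        # 10% tax increase -> 2.5% mortality reduction
tax_increase = 0.10
mortality_change = elasticity * tax_increase      # fractional change
deaths_avoided = 3922                             # per year, national
# Implied national baseline of respiratory cancer deaths per year:
implied_baseline = deaths_avoided / (-mortality_change)
```

The implied baseline of roughly 157,000 annual deaths is consistent in scale with U.S. respiratory cancer mortality.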

  13. Language Learning Strategy Use and Reading Achievement

    ERIC Educational Resources Information Center

    Ghafournia, Narjes

    2014-01-01

    The current study investigated the differences across the varying levels of EFL learners in the frequency and choice of learning strategies. Using a reading test, questionnaire, and parametric statistical analysis, the findings revealed discrepancies among the participants in the implementation of language-learning strategies concerning their…

  14. Research Analysis on MOOC Course Dropout and Retention Rates

    ERIC Educational Resources Information Center

    Gomez-Zermeno, Marcela Gerogina; Aleman de La Garza, Lorena

    2016-01-01

    This research's objective was to identify the terminal efficiency of the Massive Online Open Course "Educational Innovation with Open Resources" offered by a Mexican private university. A quantitative methodology was used, combining descriptive statistics and probabilistic models to analyze the levels of retention, completion, and…

  15. Economic Impacts of Wind Turbine Development in U.S. Counties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J., Brown; B., Hoen; E., Lantz

    2011-07-25

    The objective is to address the research question using post-construction, county-level data and econometric evaluation methods. Wind energy is expanding rapidly in the United States: over the last 4 years, wind power has contributed approximately 35 percent of all new electric power capacity. Wind power plants are often developed in rural areas where local economic development impacts from the installation are projected, including land lease and property tax payments and employment growth during plant construction and operation. Wind energy represented 2.3 percent of the U.S. electricity supply in 2010, but studies show that penetrations of at least 20 percent are feasible. Several studies have used input-output models to predict direct, indirect, and induced economic development impacts. These analyses have often been completed prior to project construction. Available studies have not yet investigated the economic development impacts of wind development at the county level using post-construction econometric evaluation methods. Analysis of county-level impacts is limited. However, previous county-level analyses have estimated operation-period employment at 0.2 to 0.6 jobs per megawatt (MW) of installed capacity and earnings at $9,000/MW to $50,000/MW. We find statistically significant evidence of positive impacts of wind development on county-level per capita income from the OLS and spatial lag models when they are applied to the full set of wind and non-wind counties. The total impact on annual per capita income of wind turbine development (measured in MW per capita) in the spatial lag model was $21,604 per MW. This estimate is within the range of values estimated in the literature using input-output models. OLS results for the wind-only counties and matched samples are similar in magnitude, but are not statistically significant at the 10-percent level. 
We find a statistically significant impact of wind development on employment in the OLS analysis for wind counties only, but not in the other models. Our estimates of employment impacts are not precise enough to assess the validity of employment impacts from input-output models applied in advance of wind energy project construction. The analysis provides empirical evidence of positive income effects at the county level from cumulative wind turbine development, consistent with the range of impacts estimated using input-output models. Employment impacts are less clear.

  16. Estimated association between dwelling soil contamination and internal radiation contamination levels after the 2011 Fukushima Daiichi nuclear accident in Japan.

    PubMed

    Tsubokura, Masaharu; Nomura, Shuhei; Sakaihara, Kikugoro; Kato, Shigeaki; Leppold, Claire; Furutani, Tomoyuki; Morita, Tomohiro; Oikawa, Tomoyoshi; Kanazawa, Yukio

    2016-06-29

    Measurement of soil contamination levels has been considered a feasible method for dose estimation of internal radiation exposure following the Chernobyl disaster by means of aggregate transfer factors; however, it is still unclear whether the estimation of internal contamination based on soil contamination levels is universally valid or incident specific. To address this issue, we evaluated relationships between in vivo and soil cesium-137 (Cs-137) contamination using data on internal contamination levels among residents of Minamisoma (10-40 km north of the Fukushima Daiichi nuclear power plant), Fukushima, 2-3 years after the disaster, and constructed three models for statistical analysis based on continuous and categorical (equal intervals and quantiles) soil contamination levels. A total of 7987 people with a mean age of 55.4 years underwent screening by in vivo Cs-137 whole-body counting. A statistically significant association was noted between internal and continuous Cs-137 soil contamination levels (model 1, p value <0.001), although the association was slight (relative risk (RR): 1.03 per 10 kBq/m(2) increase in soil contamination). Analysis of categorical soil contamination levels showed statistical (but not clinical) significance only at relatively higher soil contamination levels (model 2: Cs-137 levels above 100 kBq/m(2) compared to those <25 kBq/m(2), RR=1.75, p value <0.01; model 3: levels above 63 kBq/m(2) compared to those <11 kBq/m(2), RR=1.45, p value <0.05). Low levels of internal and soil contamination were not associated, and only loose/small associations were observed in areas with slightly higher levels of soil contamination in Fukushima, representing a clear difference from the strong associations found in post-disaster Chernobyl. 
These results indicate that soil contamination levels generally do not contribute to the internal contamination of residents in Fukushima; thus, individual measurements are essential for the precise evaluation of chronic internal radiation contamination. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
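    The reported RR of 1.03 per 10 kBq/m^2 can be extrapolated to higher soil levels if one assumes a log-linear dose-response; that assumption belongs to this sketch, not to the study's stated models:

    ```python
    # Sketch: extrapolating a per-unit relative risk under a log-linear
    # dose-response assumption (the abstract reports RR = 1.03 per
    # 10 kBq/m^2; the log-linear form is our assumption, not the paper's).

    def rr_at(soil_kbq_m2, rr_per_10kbq=1.03):
        """Relative risk at a given soil contamination level,
        relative to 0 kBq/m^2, assuming log-linearity in dose."""
        return rr_per_10kbq ** (soil_kbq_m2 / 10.0)

    # The implied RR at 100 kBq/m^2 is 1.03**10, about 1.34.
    print(round(rr_at(100), 2))
    ```

    Under that assumption the implied RR at 100 kBq/m^2 (about 1.34) is lower than the RR of 1.75 reported by the categorical model, so the two model forms are not interchangeable.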

  17. Increased fibulin-1 plasma levels in polycystic ovary syndrome (PCOS) patients: possible contribution to the link between PCOS and cardiovascular risk.

    PubMed

    Scarinci, E; Tropea, A; Russo, G; Notaristefano, G; Messana, C; Alesiani, O; Fabozzi, S M; Lanzone, A; Apa, R

    2018-04-21

To investigate a possible relation between fibulin-1 plasma levels and PCOS. ELISA quantitative determination of human fibulin-1. 50 women with PCOS and 40 control patients who attended the Unit of Human Reproductive Pathophysiology, Università Cattolica del Sacro Cuore, Rome, were enrolled. Ultrasonographic pelvic examinations, hormonal profile assays, an oral glucose tolerance test (OGTT), a lipid profile and ELISA quantitative determination of human fibulin-1 were performed. Fibulin-1 levels were statistically significantly higher in PCOS patients than in matched control women. No statistically significant positive correlation was found between fibulin-1 and AUCi, HOMA-IR, total cholesterol, LDL, AMH, androstenedione or FAI, whereas a statistically significant positive correlation was found between fibulin-1 and 17-OHP (p = 0.016) in the PCOS group. However, multivariable linear regression analysis showed that 17-OHP did not independently predict fibulin-1 levels (p = 0.089). Our data could help explain the hypothesized increased cardiovascular risk and vascular damage in patients with PCOS. A better understanding of the cellular and molecular mechanisms involved in cardiometabolic disorders associated with PCOS is mandatory to identify new therapeutic strategies to eventually prevent the progression of cardiovascular disease in these patients.

  18. Resolving the Antarctic contribution to sea-level rise: a hierarchical modelling framework.

    PubMed

    Zammit-Mangion, Andrew; Rougier, Jonathan; Bamber, Jonathan; Schön, Nana

    2014-06-01

    Determining the Antarctic contribution to sea-level rise from observational data is a complex problem. The number of physical processes involved (such as ice dynamics and surface climate) exceeds the number of observables, some of which have very poor spatial definition. This has led, in general, to solutions that utilise strong prior assumptions or physically based deterministic models to simplify the problem. Here, we present a new approach for estimating the Antarctic contribution, which only incorporates descriptive aspects of the physically based models in the analysis and in a statistical manner. By combining physical insights with modern spatial statistical modelling techniques, we are able to provide probability distributions on all processes deemed to play a role in both the observed data and the contribution to sea-level rise. Specifically, we use stochastic partial differential equations and their relation to geostatistical fields to capture our physical understanding and employ a Gaussian Markov random field approach for efficient computation. The method, an instantiation of Bayesian hierarchical modelling, naturally incorporates uncertainty in order to reveal credible intervals on all estimated quantities. The estimated sea-level rise contribution using this approach corroborates those found using a statistically independent method. © 2013 The Authors. Environmetrics Published by John Wiley & Sons, Ltd.

  19. Resolving the Antarctic contribution to sea-level rise: a hierarchical modelling framework†

    PubMed Central

    Zammit-Mangion, Andrew; Rougier, Jonathan; Bamber, Jonathan; Schön, Nana

    2014-01-01

    Determining the Antarctic contribution to sea-level rise from observational data is a complex problem. The number of physical processes involved (such as ice dynamics and surface climate) exceeds the number of observables, some of which have very poor spatial definition. This has led, in general, to solutions that utilise strong prior assumptions or physically based deterministic models to simplify the problem. Here, we present a new approach for estimating the Antarctic contribution, which only incorporates descriptive aspects of the physically based models in the analysis and in a statistical manner. By combining physical insights with modern spatial statistical modelling techniques, we are able to provide probability distributions on all processes deemed to play a role in both the observed data and the contribution to sea-level rise. Specifically, we use stochastic partial differential equations and their relation to geostatistical fields to capture our physical understanding and employ a Gaussian Markov random field approach for efficient computation. The method, an instantiation of Bayesian hierarchical modelling, naturally incorporates uncertainty in order to reveal credible intervals on all estimated quantities. The estimated sea-level rise contribution using this approach corroborates those found using a statistically independent method. © 2013 The Authors. Environmetrics Published by John Wiley & Sons, Ltd. PMID:25505370

  20. An Analysis of High School Students' Performance on Five Integrated Science Process Skills

    NASA Astrophysics Data System (ADS)

    Beaumont-Walters, Yvonne; Soyibo, Kola

    2001-02-01

This study determined Jamaican high school students' level of performance on five integrated science process skills and whether there were statistically significant differences in their performance linked to their gender, grade level, school location, school type, student type and socio-economic background (SEB). The 305 subjects comprised 133 males, 172 females, 146 ninth graders, 159 10th graders, 150 traditional and 155 comprehensive high school students, 164 students from the Reform of Secondary Education (ROSE) project and 141 non-ROSE students, 166 urban and 139 rural students and 110 students from a high SEB and 195 from a low SEB. Data were collected with an integrated science process skills test constructed by the authors. The results indicated that the subjects' mean score was low and unsatisfactory; their performance in decreasing order was: interpreting data, recording data, generalising, formulating hypotheses and identifying variables; there were statistically significant differences in their performance based on their grade level, school type, student type, and SEB in favour of the 10th graders, traditional high school students, ROSE students and students from a high SEB. There was a positive, statistically significant and fairly strong relationship between their performance and school type, but weak relationships between performance and student type, grade level and SEB.

  1. Edge co-occurrences can account for rapid categorization of natural versus animal images

    NASA Astrophysics Data System (ADS)

    Perrinet, Laurent U.; Bednar, James A.

    2015-06-01

    Making a judgment about the semantic category of a visual scene, such as whether it contains an animal, is typically assumed to involve high-level associative brain areas. Previous explanations require progressively analyzing the scene hierarchically at increasing levels of abstraction, from edge extraction to mid-level object recognition and then object categorization. Here we show that the statistics of edge co-occurrences alone are sufficient to perform a rough yet robust (translation, scale, and rotation invariant) scene categorization. We first extracted the edges from images using a scale-space analysis coupled with a sparse coding algorithm. We then computed the “association field” for different categories (natural, man-made, or containing an animal) by computing the statistics of edge co-occurrences. These differed strongly, with animal images having more curved configurations. We show that this geometry alone is sufficient for categorization, and that the pattern of errors made by humans is consistent with this procedure. Because these statistics could be measured as early as the primary visual cortex, the results challenge widely held assumptions about the flow of computations in the visual system. The results also suggest new algorithms for image classification and signal processing that exploit correlations between low-level structure and the underlying semantic category.

  2. A supportive-educative telephone program: impact on knowledge and anxiety after coronary artery bypass graft surgery.

    PubMed

    Beckie, T

    1989-01-01

    The purpose of this study was to investigate the impact of a supportive-educative telephone program on the levels of knowledge and anxiety of patients undergoing coronary artery bypass graft surgery during the first 6 weeks after hospital discharge. With a posttest-only control group design, the first 74 patients scheduled, between September 1986 and February 1987, for coronary artery bypass graft surgery in a large, western Canadian teaching hospital were randomly assigned to either an experimental or a control group. The effect of the intervention, which was implemented by a cardiac rehabilitation nurse specialist, was assessed by a knowledge test and a state anxiety inventory. Data were collected without knowledge of the participants' group assignment. As hypothesized, data analysis with independent t tests revealed a statistically significant (p less than 0.05) difference between the knowledge level of the experimental and the control group in the areas of coronary artery disease, diet, medications, physical activity restrictions, exercise, and rest. A statistically significant difference between the state anxiety level of the experimental and the control group was also evident, as was a statistically significant inverse relationship between participants' knowledge and anxiety levels. From these findings, several implications and recommendations for nursing practice and research have been generated.
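    The independent t tests mentioned in the abstract reduce to a simple statistic; a minimal sketch of Welch's form, with illustrative rather than study data:

    ```python
    # Minimal sketch of an independent two-sample (Welch) t statistic,
    # of the kind used to compare knowledge and anxiety levels between
    # groups; the scores below are illustrative, not the study's data.
    from math import sqrt
    from statistics import mean, variance

    def welch_t(a, b):
        va, vb = variance(a), variance(b)      # sample variances
        se = sqrt(va / len(a) + vb / len(b))   # SE of the mean difference
        return (mean(a) - mean(b)) / se

    experimental = [82, 75, 90, 68, 77]  # hypothetical knowledge scores
    control      = [70, 64, 72, 61, 66]
    print(round(welch_t(experimental, control), 2))
    ```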

  3. Non-operative management (NOM) of blunt hepatic trauma: 80 cases.

    PubMed

    Özoğul, Bünyami; Kısaoğlu, Abdullah; Aydınlı, Bülent; Öztürk, Gürkan; Bayramoğlu, Atıf; Sarıtemur, Murat; Aköz, Ayhan; Bulut, Özgür Hakan; Atamanalp, Sabri Selçuk

    2014-03-01

The liver is the most frequently injured organ in abdominal trauma. We present a group of patients with blunt hepatic trauma who were managed without any invasive diagnostic tools and/or surgical intervention. A total of 80 patients with blunt liver injury who were hospitalized in the general surgery clinic or other clinics due to concomitant injuries were followed non-operatively. Normally distributed numeric variables were evaluated by Student's t-test or one-way analysis of variance, while non-normally distributed variables were analyzed by the Mann-Whitney U-test or Kruskal-Wallis variance analysis. The chi-square test was also employed for the comparison of categorical variables. Statistical significance was assumed for p<0.05. There was no significant relationship between patients' Hgb level and liver injury grade, outcome, or mechanism of injury. Similarly, there was no statistically significant relationship between liver injury grade, outcome, or mechanism of injury and ALT or AST levels. There was no mortality in any of the patients. During the last quarter of a century, changes in the diagnosis and treatment of liver injury have been associated with increased survival. NOM of liver injury in patients with stable hemodynamics and hepatic trauma seems to be the gold standard.
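    Of the tests named in the abstract, the Mann-Whitney U-test is the simplest to sketch by direct pair counting; the hemoglobin values below are illustrative, not the study's data:

    ```python
    # Sketch of the Mann-Whitney U statistic via direct pair counting
    # (no large-sample normal approximation); illustrative data only.
    def mann_whitney_u(a, b):
        """U for sample a: number of (a_i, b_j) pairs with a_i > b_j,
        counting ties as half."""
        return sum((x > y) + 0.5 * (x == y) for x in a for y in b)

    hgb_grade1 = [12.1, 13.0, 11.8]  # hypothetical Hgb, low injury grade
    hgb_grade3 = [11.5, 12.4, 10.9]  # hypothetical Hgb, high injury grade
    u1 = mann_whitney_u(hgb_grade1, hgb_grade3)
    u2 = mann_whitney_u(hgb_grade3, hgb_grade1)
    assert u1 + u2 == len(hgb_grade1) * len(hgb_grade3)  # always n1 * n2
    print(u1, u2)
    ```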

  4. 2015 TRI National Analysis: Toxics Release Inventory Releases at Various Summary Levels

    EPA Pesticide Factsheets

    The TRI National Analysis is EPA's annual interpretation of TRI data at various summary levels. It highlights how toxic chemical wastes were managed, where toxic chemicals were released and how the 2015 TRI data compare to data from previous years. This dataset reports US state, county, large aquatic ecosystem, metro/micropolitan statistical area, and facility level statistics from 2015 TRI releases, including information on: number of 2015 TRI facilities in the geographic area and their releases (total, water, air, land); population information, including populations living within 1 mile of TRI facilities (total, minority, in poverty); and Risk Screening Environmental Indicators (RSEI) model related pounds, toxicity-weighted pounds, and RSEI score. The source of administrative boundary data is the 2013 cartographic boundary shapefiles. Location of facilities is provided by EPA's Facility Registry Service (FRS). Large Aquatic Ecosystems boundaries were dissolved from the hydrologic unit boundaries and codes for the United States, Puerto Rico, and the U.S. Virgin Islands. It was revised for inclusion in the National Atlas of the United States of America (November 2002), and updated to match the streams file created by the USGS National Mapping Division (NMD) for the National Atlas of the United States of America.

  5. Adopting a Patient-Centered Approach to Primary Outcome Analysis of Acute Stroke Trials Using a Utility-Weighted Modified Rankin Scale.

    PubMed

    Chaisinanunkul, Napasri; Adeoye, Opeolu; Lewis, Roger J; Grotta, James C; Broderick, Joseph; Jovin, Tudor G; Nogueira, Raul G; Elm, Jordan J; Graves, Todd; Berry, Scott; Lees, Kennedy R; Barreto, Andrew D; Saver, Jeffrey L

    2015-08-01

Although the modified Rankin Scale (mRS) is the most commonly used primary end point in acute stroke trials, its power is limited when analyzed in dichotomized fashion and its indication of effect size is challenging to interpret when analyzed ordinally. Weighting the 7 Rankin levels by utilities may improve scale interpretability while preserving statistical power. A utility-weighted mRS (UW-mRS) was derived by averaging values from time-tradeoff (patient centered) and person-tradeoff (clinician centered) studies. The UW-mRS, standard ordinal mRS, and dichotomized mRS were applied to 11 trials or meta-analyses of acute stroke treatments, including lytic, endovascular reperfusion, blood pressure moderation, and hemicraniectomy interventions. Utility values were 1.0 for mRS level 0; 0.91 for mRS level 1; 0.76 for mRS level 2; 0.65 for mRS level 3; 0.33 for mRS level 4; 0 for mRS level 5; and 0 for mRS level 6. For trials with unidirectional treatment effects, the UW-mRS paralleled the ordinal mRS and outperformed dichotomous mRS analyses. Both the UW-mRS and the ordinal mRS were statistically significant in 6 of 8 unidirectional effect trials, whereas dichotomous analyses were statistically significant in 2 to 4 of 8. In bidirectional effect trials, both the UW-mRS and ordinal tests captured the divergent treatment effects by showing neutral results, whereas some dichotomized analyses showed positive results. Mean utility differences in trials with statistically significant positive results ranged from 0.026 to 0.249. A UW-mRS performs similar to the standard ordinal mRS in detecting treatment effects in actual stroke trials and ensures the quantitative outcome is a valid reflection of patient-centered benefits. © 2015 American Heart Association, Inc.
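    The utility weighting is a direct weighted average; a minimal sketch using the utility values listed in the abstract, applied to hypothetical trial arms:

    ```python
    # Sketch of the utility-weighted mRS computation using the utility
    # values listed in the abstract; the trial-arm distributions below
    # are hypothetical, not from any of the 11 trials.
    UTILITIES = [1.0, 0.91, 0.76, 0.65, 0.33, 0.0, 0.0]  # mRS levels 0-6

    def mean_utility(counts):
        """Mean utility for one arm, given patient counts at mRS 0-6."""
        n = sum(counts)
        return sum(c * u for c, u in zip(counts, UTILITIES)) / n

    treated = [20, 30, 25, 15, 10, 5, 5]    # hypothetical arm, n=110
    control = [10, 20, 25, 20, 15, 10, 10]  # hypothetical arm, n=110
    diff = mean_utility(treated) - mean_utility(control)
    print(round(diff, 3))  # mean utility difference between arms
    ```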

  6. Exploratory study on a statistical method to analyse time resolved data obtained during nanomaterial exposure measurements

    NASA Astrophysics Data System (ADS)

    Clerc, F.; Njiki-Menga, G.-H.; Witschger, O.

    2013-04-01

Most of the measurement strategies suggested at the international level to assess workplace exposure to nanomaterials rely on devices measuring airborne particle concentrations (according to different metrics) in real time. Since none of the instruments used to measure aerosols can distinguish a particle of interest from the background aerosol, the statistical analysis of time-resolved data requires special attention. So far, very few approaches have been used for statistical analysis in the literature, ranging from simple qualitative analysis of graphs to the implementation of more complex statistical models. To date, there is still no consensus on a particular approach, and an appropriate and robust method is still being sought. In this context, this exploratory study investigates a statistical method for analysing time-resolved data based on a Bayesian probabilistic approach. To investigate and illustrate the use of this statistical method, particle number concentration data were used from a workplace study that investigated the potential for exposure via inhalation during cleanout operations, by sandpapering, of a reactor producing nanocomposite thin films. In this workplace study, the background issue was addressed through near-field and far-field approaches, and several size-integrated and time-resolved devices were used. The analysis of the results presented here focuses only on data obtained with two handheld condensation particle counters. While one was measuring at the source of the released particles, the other was measuring far-field in parallel. The Bayesian probabilistic approach allows a probabilistic modelling of data series, and the observed task is modelled in the form of probability distributions. 
The probability distributions from the time-resolved data obtained at the source can be compared with those from the time-resolved data obtained far-field, leading to a quantitative estimate of the airborne particles released at the source when the task is performed. Beyond the results obtained, this exploratory study indicates that such analysis requires specific experience in statistics.
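    As a minimal illustration of probabilistic modelling of particle-count data, a generic conjugate Gamma-Poisson update can stand in for the (unspecified) model of the paper; everything here, including the counts, is a labeled assumption:

    ```python
    # Minimal sketch of a Bayesian treatment of particle-count data:
    # a conjugate Gamma(alpha, beta) prior on a Poisson rate, updated
    # with observed counts. This is a generic stand-in, not the
    # authors' actual model, which the abstract does not specify.
    def gamma_poisson_posterior(counts, alpha=1.0, beta=1.0):
        """Return posterior (alpha, beta) and posterior mean rate."""
        a = alpha + sum(counts)
        b = beta + len(counts)
        return a, b, a / b

    near_field = [120, 135, 128, 140]  # illustrative counts at the source
    far_field  = [30, 28, 33, 29]      # illustrative background counts

    _, _, rate_near = gamma_poisson_posterior(near_field)
    _, _, rate_far  = gamma_poisson_posterior(far_field)
    print(round(rate_near - rate_far, 1))  # crude source-attributable rate
    ```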

  7. Feasibility study of using statistical process control to customize quality assurance in proton therapy.

    PubMed

    Rah, Jeong-Eun; Shin, Dongho; Oh, Do Hoon; Kim, Tae Hyun; Kim, Gwe-Ya

    2014-09-01

To evaluate and improve the reliability of proton quality assurance (QA) processes and to provide an optimal customized tolerance level using the statistical process control (SPC) methodology. The authors investigated the consistency check of dose per monitor unit (D/MU) and range in proton beams to see whether it was within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis to improve the patient-specific QA process in proton beams by using process capability indices. The authors established a customized tolerance level of ±2% for D/MU and ±0.5 mm for beam range in the daily proton QA process. In the authors' analysis of the process capability indices, the patient-specific range measurements were capable of a specification limit of ±2% in clinical plans. SPC methodology is a useful tool for customizing the optimal QA tolerance levels and improving the quality of proton machine maintenance, treatment delivery, and ultimately patient safety.
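    The process capability analysis can be sketched with the standard C_pk index against the ±2% tolerance; the deviation values below are illustrative, not the authors' measurements:

    ```python
    # Sketch of the process capability index C_pk, used to judge
    # whether a QA measurement process stays within specification
    # limits (here the +/-2% D/MU tolerance); illustrative data only.
    from statistics import mean, stdev

    def cpk(data, lsl, usl):
        """C_pk: distance from the mean to the nearer spec limit,
        in units of three standard deviations."""
        m, s = mean(data), stdev(data)
        return min(usl - m, m - lsl) / (3 * s)

    dmu_deviation_pct = [0.1, -0.3, 0.4, 0.0, -0.2, 0.3, -0.1, 0.2]
    print(round(cpk(dmu_deviation_pct, lsl=-2.0, usl=2.0), 2))
    ```

    A C_pk well above 1.33 is conventionally read as a capable process; a value near or below 1 would suggest the tolerance is too tight for the observed spread.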

  8. Evaluation of Mean Platelet Volume values in lean women with polycystic ovary syndrome.

    PubMed

    Silfeler, Dilek Benk; Kurt, Raziye Keskin; Yengil, Erhan; Un, Burak; Arica, Secil; Baloglu, Ali

    2014-05-01

Mean Platelet Volume (MPV) is an important indicator of platelet activation. It is known that MPV increases in patients with coronary artery disease, diabetes mellitus, atherosclerosis and polycystic ovary syndrome (PCOS). Our aim was to measure MPV in lean patients with PCOS. The present study was designed to examine platelet function by measuring MPV in non-obese women with PCOS. A total of 50 outpatients with PCOS were included. The control group consisted of 50 healthy subjects. Serum platelet, MPV, and white blood cell (WBC) levels were compared and evaluated retrospectively in all participants. These values were compared by statistical analysis. There were no statistically significant differences between groups regarding MPV (p = 0.357), WBC (p = 0.414) or platelet count (p = 0.666). Although some studies have reported increased MPV in PCOS patients, in our patients MPV levels did not correlate with PCOS except in those with obesity. We think that PCOS itself has no effect on MPV levels and that obesity changes MPV levels.

  9. [Analysis on theses of the Chinese Journal of Parasitology and Parasitic Diseases in 2009-2012].

    PubMed

    Yi, Feng-Yun; Qu, Lin-Ping; Yan, He; Sheng, Hui-Feng

    2013-12-01

The articles published in the Chinese Journal of Parasitology and Parasitic Diseases in 2009-2012 were statistically analyzed. Among the 547 papers published in the four years, original articles accounted for 45.3% (248/547). The number of authors was 2712, with an average cooperation degree of 5.0, and co-authored papers accounted for 95.4% of the total. Authors were mainly from colleges/universities (51.9%, 284/547), institutions for disease control (34.4%, 188/547) and hospitals/health centers (13.7%, 75/547). The average publishing delay was 212, 141, 191 and 207 days in 2009, 2010, 2011 and 2012, respectively. The statistical analysis reflects the characteristics and academic level of the journal, provides a basis for improving its quality, and reveals the latest developments and trends.

  10. Statistical errors in molecular dynamics averages

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schiferl, S.K.; Wallace, D.C.

    1985-11-15

A molecular dynamics calculation produces a time-dependent fluctuating signal whose average is a thermodynamic quantity of interest. The average of the kinetic energy, for example, is proportional to the temperature. A procedure is described for determining when the molecular dynamics system is in equilibrium with respect to a given variable, according to the condition that the mean and the bandwidth of the signal should be sensibly constant in time. Confidence limits for the mean are obtained from an analysis of a finite length of the equilibrium signal. The role of serial correlation in this analysis is discussed. The occurrence of unstable behavior in molecular dynamics data is noted, and a statistical test for a level shift is described.
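    One standard way to obtain confidence limits for a time average in the presence of serial correlation is the batch-means method; this sketch uses that common technique as a stand-in, not necessarily the exact procedure of the paper:

    ```python
    # Sketch of a batch-means estimate of the standard error of a
    # time average, which tolerates serial correlation by averaging
    # over long batches; the "signal" here is synthetic.
    from math import sqrt
    from statistics import mean, variance

    def batch_means_se(signal, n_batches=10):
        size = len(signal) // n_batches
        batches = [mean(signal[i * size:(i + 1) * size])
                   for i in range(n_batches)]
        # SE of the overall mean from the spread of batch means
        return sqrt(variance(batches) / n_batches)

    flat = [3.0] * 1000            # fully equilibrated, noiseless signal
    print(batch_means_se(flat))    # no fluctuation, so no uncertainty
    ```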

  11. Evaluation of Skylab IB sensitivity to on-pad winds with turbulence

    NASA Technical Reports Server (NTRS)

    Coffin, T.

    1972-01-01

    Computer simulation was performed to estimate displacements and bending moments experienced by the SKYLAB 1B vehicle on the launch pad due to atmospheric winds. The vehicle was assumed to be a beam-like structure represented by a finite number of generalized coordinates. Wind flow across the vehicle was treated as a nonhomogeneous, stationary random process. Response computations were performed by the assumption of simple strip theory and application of generalized harmonic analysis. Displacement and bending moment statistics were obtained for six vehicle propellant loading conditions and four representative reference wind profile and turbulence levels. Means, variances and probability distributions are presented graphically for each case. A separate analysis was performed to indicate the influence of wind gradient variations on vehicle response statistics.

  12. Statistical analysis and digital processing of the Mössbauer spectra

    NASA Astrophysics Data System (ADS)

    Prochazka, Roman; Tucek, Pavel; Tucek, Jiri; Marek, Jaroslav; Mashlan, Miroslav; Pechousek, Jiri

    2010-02-01

This work focuses on statistical methods and the development of filtration procedures for signal processing in Mössbauer spectroscopy. Statistical tools for noise filtering in measured spectra are used in many scientific areas. The use of a pure statistical approach in the filtration of accumulated Mössbauer spectra is described. In Mössbauer spectroscopy, the noise can be considered a Poisson statistical process with a Gaussian distribution for high numbers of observations. This noise is a superposition of non-resonant photon counting, electronic noise (from γ-ray detection and discrimination units), and velocity-system imperfections characterized by velocity nonlinearities. A noise-reducing process based on a newly designed statistical filter procedure is described. This mathematical procedure improves the signal-to-noise ratio and thus makes it easier to determine the hyperfine parameters of the given Mössbauer spectra. The filter procedure is based on a periodogram method that makes it possible to identify the statistically important components in the spectral domain. The significance level for these components is then feedback-controlled using the correlation coefficient test results. The theoretical correlation coefficient level corresponding to the spectrum resolution is estimated. The correlation coefficient test is based on a comparison of the theoretical and experimental correlation coefficients given by the Spearman method. The correctness of this solution was analyzed by a series of statistical tests and confirmed by many spectra measured with increasing statistical quality for a given sample (absorber). The effect of this filter procedure depends on the signal-to-noise ratio, and the applicability of the method is subject to binding conditions.
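    The periodogram at the core of the filter can be sketched as a direct DFT power spectrum, from which statistically significant components would then be selected; the signal below is illustrative:

    ```python
    # Sketch of the periodogram step: a direct DFT power spectrum in
    # pure Python. Selecting statistically significant components
    # would follow from this; the signal here is illustrative.
    import cmath, math

    def periodogram(x):
        n = len(x)
        return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                        for t in range(n))) ** 2 / n
                for k in range(n // 2 + 1)]

    n = 64
    signal = [math.sin(2 * math.pi * 5 * t / n) for t in range(n)]
    power = periodogram(signal)
    print(power.index(max(power)))  # peak at bin 5, the signal frequency
    ```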

  13. A procedure for combining acoustically induced and mechanically induced loads (first passage failure design criterion)

    NASA Technical Reports Server (NTRS)

    Crowe, D. R.; Henricks, W.

    1983-01-01

    The combined load statistics are developed by taking the acoustically induced load to be a random population, assumed to be stationary. Each element of this ensemble of acoustically induced loads is assumed to have the same power spectral density (PSD), obtained previously from a random response analysis employing the given acoustic field in the STS cargo bay as a stationary random excitation. The mechanically induced load is treated as either (1) a known deterministic transient, or (2) a nonstationary random variable of known first and second statistical moments which vary with time. A method is then shown for determining the probability that the combined load would, at any time, have a value equal to or less than a certain level. Having obtained a statistical representation of how the acoustic and mechanical loads are expected to combine, an analytical approximation for defining design levels for these loads is presented using the First Passage failure criterion.

  14. Terrestrial gamma radiation dose (TGRD) levels in northern zone of Jos Plateau, Nigeria: Statistical relationship between dose rates and geological formations

    NASA Astrophysics Data System (ADS)

    Abba, Habu Tela; Hassan, Wan Muhamad Saridan Wan; Saleh, Muneer Aziz; Aliyu, Abubakar Sadiq; Ramli, Ahmad Termizi

    2017-11-01

In-situ measurement of terrestrial gamma radiation dose (TGRD) rates was conducted in the northern zone of the Jos Plateau, and a statistical relationship between the TGRD and the underlying geological formations was investigated. The TGRD rates in all the measurements ranged from 40 to 1265 nGy h-1 with a mean value of 250 nGy h-1. The maximum TGRD was recorded on geological type G8 (Younger Granites) at Bisitchi, and the lowest TGRD was recorded on G6 (Basaltic rocks) at Gabia. A one-way analysis of variance (ANOVA) statistical test was used to compare the data. The results of this study indicate a strong relationship between TGRD levels and the geological structure of a place. An isodose map was plotted to represent exposure rates due to TGRD. The results of this investigation could serve multiple public interests, such as evaluating the public dose for the area.
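    The one-way ANOVA comparison reduces to the F statistic; a sketch on illustrative groups rather than the measured TGRD data:

    ```python
    # Sketch of the one-way ANOVA F statistic used to compare dose
    # rates across geological formations; the groups below are
    # illustrative, not the measured TGRD data.
    from statistics import mean

    def anova_f(groups):
        grand = mean(x for g in groups for x in g)
        # between-group and within-group sums of squares
        ssb = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
        ssw = sum(sum((x - mean(g)) ** 2 for x in g) for g in groups)
        df_b = len(groups) - 1
        df_w = sum(len(g) for g in groups) - len(groups)
        return (ssb / df_b) / (ssw / df_w)

    groups = [[1, 2, 3], [2, 3, 4], [3, 4, 5]]
    print(anova_f(groups))  # F = 3.0 for these groups
    ```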

  15. [A documentation procedure for community social psychiatry services--a pilot project in Bielefeld and Minden].

    PubMed

    Hellmeier, W; Genin, G; Klewe-Niemann, S

    1996-04-01

The status of health reporting (at the community level) has improved considerably during recent years. It is increasingly used as an instrument for planning, controlling and evaluating political processes. In addition to individual studies, the statistics kept within the departments of the health authorities are an important factor for meaningful health reporting at a local level. The IDIS (from Jan. 1st, 1995 LOGD) and the social psychiatric services of the Minden-Lübbecke district and the city of Bielefeld have developed a programme for automation-aided management of the statistics of social psychiatric services at a local level. Details on the personal situation and illnesses of the clients, as well as on the activities of the service staff, are recorded and analysed. Based on the WHO programme EPI-Info 6.01, the documentation programme SPD-STAT was developed. This programme is menu-driven and, in addition to the functions for statistical data input and retrieval of fixed table sets, also offers the possibility of processing data with the full functionality of the ANALYSIS module of EPI-Info. Thus interactive ad-hoc evaluations for current questions are made possible. Using SPD-STAT in as many local regions in NRW as possible may be a big step forward for health reporting at local levels as well as for health reporting at a state level.

  16. An Interactive Tool For Semi-automated Statistical Prediction Using Earth Observations and Models

    NASA Astrophysics Data System (ADS)

    Zaitchik, B. F.; Berhane, F.; Tadesse, T.

    2015-12-01

    We developed a semi-automated statistical prediction tool applicable to concurrent analysis or seasonal prediction of any time series variable in any geographic location. The tool was developed using Shiny, JavaScript, HTML and CSS. A user can extract a predictand by drawing a polygon over a region of interest on the provided user interface (global map). The user can select the Climatic Research Unit (CRU) precipitation or Climate Hazards Group InfraRed Precipitation with Station data (CHIRPS) as predictand. They can also upload their own predictand time series. Predictors can be extracted from sea surface temperature, sea level pressure, winds at different pressure levels, air temperature at various pressure levels, and geopotential height at different pressure levels. By default, reanalysis fields are applied as predictors, but the user can also upload their own predictors, including a wide range of compatible satellite-derived datasets. The package generates correlations of the variables selected with the predictand. The user also has the option to generate composites of the variables based on the predictand. Next, the user can extract predictors by drawing polygons over the regions that show strong correlations (composites). Then, the user can select some or all of the statistical prediction models provided. Provided models include Linear Regression models (GLM, SGLM), Tree-based models (bagging, random forest, boosting), Artificial Neural Network, and other non-linear models such as Generalized Additive Model (GAM) and Multivariate Adaptive Regression Splines (MARS). Finally, the user can download the analysis steps they used, such as the region they selected, the time period they specified, the predictand and predictors they chose and preprocessing options they used, and the model results in PDF or HTML format. Key words: Semi-automated prediction, Shiny, R, GLM, ANN, RF, GAM, MARS
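    The correlation screening step amounts to a Pearson correlation between each candidate predictor series and the predictand; a sketch with illustrative series (deliberately chosen to be exactly linear, so r = 1.0):

    ```python
    # Sketch of the predictor-screening step: Pearson correlation of
    # a candidate predictor series with the predictand, in pure
    # Python; the series below are illustrative, not real data.
    from math import sqrt
    from statistics import mean

    def pearson_r(x, y):
        mx, my = mean(x), mean(y)
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        return cov / sqrt(sum((a - mx) ** 2 for a in x) *
                          sum((b - my) ** 2 for b in y))

    sst_index = [0.2, 0.5, -0.1, 0.8, 0.3]  # hypothetical SST predictor
    rainfall  = [55, 70, 40, 85, 60]        # exactly linear in sst_index
    print(round(pearson_r(sst_index, rainfall), 3))  # r = 1.0 here
    ```

    In the tool described above, regions where such correlations are strong would then be selected as predictor domains for the regression models.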

  17. Nutritional status and CD4 cell counts in patients with HIV/AIDS receiving antiretroviral therapy.

    PubMed

    Santos, Ana Célia Oliveira dos; Almeida, Ana Maria Rampeloti

    2013-01-01

Even with current highly active antiretroviral therapy, individuals with AIDS continue to exhibit important nutritional deficits and reduced levels of albumin and hemoglobin, which may be directly related to their cluster of differentiation 4 (CD4) cell counts. The aim of this study was to characterize the nutritional status of individuals with human immunodeficiency virus/acquired immunodeficiency syndrome (HIV/AIDS) and relate the findings to the albumin level, hemoglobin level and CD4 cell count. Patients over 20 years of age with AIDS who were hospitalized in a university hospital and were receiving antiretroviral therapy were studied with regard to clinical, anthropometric, biochemical and sociodemographic characteristics. Body mass index, percentage of weight loss, arm circumference, triceps skinfold and arm muscle circumference were analyzed. Data on albumin, hemoglobin, hematocrit and CD4 cell count were obtained from patient charts. Statistical analysis was performed using Fisher's exact test, Student's t-test for independent samples and the Mann-Whitney U-test. The level of significance was set to 0.05 (α = 5%). Statistical analysis was performed using Statistical Package for the Social Sciences (SPSS) 17.0 software for Windows. Of the 50 patients evaluated, 70% were male. The prevalence of malnutrition was higher when the definition was based on arm circumference and triceps skinfold measurement. The concentrations of all biochemical variables were significantly lower among patients with a body mass index of less than 18.5 kg/m2. The CD4 cell count, albumin, hemoglobin, hematocrit and anthropometric measures were directly related to each other. These findings underscore the importance of nutritional follow-up for underweight patients with AIDS, as nutritional status proved to be related to important biochemical alterations.
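    The 18.5 kg/m2 cutoff used in the abstract is plain body mass index arithmetic; the patient values below are hypothetical:

    ```python
    # Sketch of the body mass index cutoff used in the abstract:
    # BMI = weight / height^2, with 18.5 kg/m^2 as the underweight
    # threshold; the patient values are hypothetical.
    def bmi(weight_kg, height_m):
        return weight_kg / height_m ** 2

    print(round(bmi(50, 1.70), 1))  # below 18.5, flagged underweight
    ```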

  18. Gram-Negative Bacterial Wound Infections

    DTIC Science & Technology

    2014-05-01

    shows an effect with increasing concentration; however, survival analysis does not show a significant difference between treatment groups and controls ...with 3 dead larvae in the 25 mM group compared to a single dead larva in the control group (Fig. 7). Probit analysis estimates the lethal...statistically different from that of the control group. The levels (CFU/g) of bacteria in lung tissue correlated with the survival curves. The median

  19. Exploratory Multivariate Analysis. A Graphical Approach.

    DTIC Science & Technology

    1981-01-01

    Gnanadesikan , 1977) but we feel that these should be used with great caution unless one really has good reason to believe that the data came from such a...are referred to Gnanadesikan (1977). The present author hopes that the convenience of a single summary or significance level will not deter his readers...fit of a harmonic model to meteorological data. (In preparation). Gnanadesikan , R. (1977). Methods for Statistical Data Analysis of Multivariate

  20. Predictive data modeling of human type II diabetes related statistics

    NASA Astrophysics Data System (ADS)

    Jaenisch, Kristina L.; Jaenisch, Holger M.; Handley, James W.; Albritton, Nathaniel G.

    2009-04-01

During the course of one author's routine Type II diabetes treatment, it was decided to derive predictive analytical Data Models of the daily sampled vital statistics, namely weight, blood pressure, and blood sugar, to determine whether the covariance among the observed variables could yield a descriptive equation-based model or, better still, a predictive analytical model that could forecast the expected future trend of the variables and possibly reduce the number of finger sticks required to monitor blood sugar levels. The personal history and analysis with resulting models are presented.
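    A minimal version of such a predictive model is an ordinary least-squares trend with a one-step forecast; the glucose readings below are hypothetical, not the author's data:

    ```python
    # Sketch of a minimal predictive model for a daily vital-statistics
    # series: an ordinary least-squares linear trend in pure Python,
    # with a one-step forecast; the readings are hypothetical.
    from statistics import mean

    def ols_trend(y):
        """Return (slope, intercept) of y against day index 0..n-1."""
        x = range(len(y))
        mx, my = mean(x), mean(y)
        slope = (sum((i - mx) * (v - my) for i, v in zip(x, y))
                 / sum((i - mx) ** 2 for i in x))
        return slope, my - slope * mx

    glucose = [110, 112, 111, 115, 114, 117, 118]  # hypothetical mg/dL
    slope, intercept = ols_trend(glucose)
    next_day = slope * len(glucose) + intercept     # one-step forecast
    print(round(slope, 2), round(next_day, 1))
    ```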

Top