Sample records for previous statistical analysis

  1. The Empirical Review of Meta-Analysis Published in Korea

    ERIC Educational Resources Information Center

    Park, Sunyoung; Hong, Sehee

    2016-01-01

    Meta-analysis is a statistical method that is increasingly utilized to combine and compare the results of previous primary studies. However, because of the lack of comprehensive guidelines for how to use meta-analysis, many meta-analysis studies have failed to consider important aspects, such as statistical programs, power analysis, publication…

  2. Mathematical background and attitudes toward statistics in a sample of Spanish college students.

    PubMed

    Carmona, José; Martínez, Rafael J; Sánchez, Manuel

    2005-08-01

    To examine the relation of mathematical background to the initial attitudes toward statistics of Spanish college students in the social sciences, the Survey of Attitudes Toward Statistics was given to 827 students. Multivariate analyses tested the effects of two indicators of mathematical background (amount of exposure and achievement in previous courses) on the four subscales. Analysis suggested that grades in previous courses are more related to initial attitudes toward statistics than the number of mathematics courses taken. Mathematical background was related to students' affective responses to statistics but not to their valuing of statistics. Implications for future research are discussed.

  3. Statistical Analysis of CFD Solutions From the Fifth AIAA Drag Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Morrison, Joseph H.

    2013-01-01

    A graphical framework is used for statistical analysis of the results from an extensive N-version test of a collection of Reynolds-averaged Navier-Stokes computational fluid dynamics codes. The solutions were obtained by code developers and users from North America, Europe, Asia, and South America using a common grid sequence and multiple turbulence models for the June 2012 fifth Drag Prediction Workshop sponsored by the AIAA Applied Aerodynamics Technical Committee. The aerodynamic configuration for this workshop was the Common Research Model subsonic transport wing-body previously used for the 4th Drag Prediction Workshop. This work continues the statistical analysis begun in the earlier workshops and compares the results from the grid convergence study of the most recent workshop with previous workshops.

  4. Study designs, use of statistical tests, and statistical analysis software choice in 2015: Results from two Pakistani monthly Medline indexed journals.

    PubMed

    Shaikh, Masood Ali

    2017-09-01

    Assessment of research articles in terms of the study designs used, the statistical tests applied, and the statistical analysis programmes employed helps determine the research activity profile and trends in a country. In this descriptive study, all original articles published in the year 2015 by the Journal of Pakistan Medical Association (JPMA) and the Journal of the College of Physicians and Surgeons Pakistan (JCPSP) were reviewed in terms of study designs used, application of statistical tests, and use of statistical analysis programmes. JPMA and JCPSP published 192 and 128 original articles, respectively, in 2015. Results of this study indicate that the cross-sectional study design, bivariate inferential statistical analysis comparing two variables/groups, and the statistical software programme SPSS were, respectively, the most common study design, inferential statistical analysis, and statistical analysis software. These results echo the previously published assessment of these two journals for the year 2014.

  5. The Use of a Context-Based Information Retrieval Technique

    DTIC Science & Technology

    2009-07-01

    provided in context. Latent Semantic Analysis (LSA) is a statistical technique for inferring contextual and structural information, and previous studies... LSA, which is also known as latent semantic indexing (LSI), uses a statistical and... In contrast, natural language models apply algorithms that combine statistical information with semantic information.

  6. Looking Back over Their Shoulders: A Qualitative Analysis of Portuguese Teachers' Attitudes towards Statistics

    ERIC Educational Resources Information Center

    Martins, Jose Alexandre; Nascimento, Maria Manuel; Estrada, Assumpta

    2012-01-01

    Teachers' attitudes towards statistics can have a significant effect on their own statistical training, their teaching of statistics, and the future attitudes of their students. The influence of attitudes in teaching statistics in different contexts was previously studied in the work of Estrada et al. (2004, 2010a, 2010b) and Martins et al.…

  7. Proceedings of the first ERDA statistical symposium, Los Alamos, NM, November 3-5, 1975 [Sixteen papers]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nicholson, W L; Harris, J L

    1976-03-01

    The First ERDA Statistical Symposium was organized to provide a means for communication among ERDA statisticians, and the sixteen papers presented at the meeting are given. Topics include techniques of numerical analysis used for accelerators, nuclear reactors, skewness and kurtosis statistics, radiochemical spectral analysis, quality control, and other statistics problems. Nine of the papers were previously announced in Nuclear Science Abstracts (NSA), while the remaining seven were abstracted for ERDA Energy Research Abstracts (ERA) and INIS Atomindex. (PMA)

  8. Citation of previous meta-analyses on the same topic: a clue to perpetuation of incorrect methods?

    PubMed

    Li, Tianjing; Dickersin, Kay

    2013-06-01

    Systematic reviews and meta-analyses serve as a basis for decision-making and clinical practice guidelines and should be carried out using appropriate methodology to avoid incorrect inferences. We describe the characteristics, statistical methods used for meta-analyses, and citation patterns of all 21 glaucoma systematic reviews we identified pertaining to the effectiveness of prostaglandin analog eye drops in treating primary open-angle glaucoma, published between December 2000 and February 2012. We abstracted data, assessed whether appropriate statistical methods were applied in meta-analyses, and examined citation patterns of included reviews. We identified two forms of problematic statistical analyses in 9 of the 21 systematic reviews examined. Only 1 of the 9 reviews that used incorrect statistical methods cited a previously published review that used appropriate methods. Reviews that used incorrect methods were cited 2.6 times more often than reviews that used appropriate statistical methods. We speculate that by emulating the statistical methodology of previous systematic reviews, systematic review authors may have perpetuated incorrect approaches to meta-analysis. The use of incorrect statistical methods, perhaps through emulating methods described in previous research, calls the conclusions of systematic reviews into question and may lead to inappropriate patient care. We urge systematic review authors and journal editors to seek the advice of experienced statisticians before undertaking or accepting for publication a systematic review and meta-analysis. The author(s) have no proprietary or commercial interest in any materials discussed in this article. Copyright © 2013 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.

  9. Statistical Analysis of CFD Solutions from the 6th AIAA CFD Drag Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Derlaga, Joseph M.; Morrison, Joseph H.

    2017-01-01

    A graphical framework is used for statistical analysis of the results from an extensive N-version test of a collection of Reynolds-averaged Navier-Stokes computational fluid dynamics codes. The solutions were obtained by code developers and users from North America, Europe, Asia, and South America using both common and custom grid sequences as well as multiple turbulence models for the June 2016 6th AIAA CFD Drag Prediction Workshop sponsored by the AIAA Applied Aerodynamics Technical Committee. The aerodynamic configuration for this workshop was the Common Research Model subsonic transport wing-body previously used for both the 4th and 5th Drag Prediction Workshops. This work continues the statistical analysis begun in the earlier workshops and compares the results from the grid convergence study of the most recent workshop with previous workshops.

  10. Examples of Data Analysis with SPSS/PC+ Studentware.

    ERIC Educational Resources Information Center

    MacFarland, Thomas W.

    Intended for classroom use only, these unpublished notes contain computer lessons on descriptive statistics with files previously created in WordPerfect 4.2 and Lotus 1-2-3 Version 1.A for the IBM PC+. The statistical measures covered include Student's t-test with two independent samples; Student's t-test with a paired sample; Chi-square analysis;…

  11. Low-flow frequency and flow duration of selected South Carolina streams in the Broad River basin through March 2008

    USGS Publications Warehouse

    Guimaraes, Wladmir B.; Feaster, Toby D.

    2010-01-01

    Of the 23 streamgaging stations for which recurrence interval computations were made, 14 had low-flow statistics that were published in previous U.S. Geological Survey reports. A comparison of the low-flow statistics for the minimum mean flow for a 7-consecutive-day period with a 10-year recurrence interval (7Q10) from this study with the most recently published values indicated that 8 of the 14 streamgaging stations had values that were within plus or minus 25 percent of the previous value. Ten of the 14 streamgaging stations had negative percent differences indicating the low-flow statistic had decreased since the previous study, and 4 streamgaging stations had positive percent differences indicating that the low-flow statistic had increased since the previous study. The low-flow statistics are influenced by length of record, hydrologic regime under which the record was collected, techniques used to do the analysis, and other changes, such as urbanization, diversions, and so on, that may have occurred in the basin.

  12. Nigerian pharmacists’ self-perceived competence and confidence to plan and conduct pharmacy practice research

    PubMed Central

    Usman, Mohammad N.; Umar, Muhammad D.

    2018-01-01

    Background: Recent studies have revealed that pharmacists have an interest in conducting research; however, lack of confidence is a major barrier. Objective: This study evaluated pharmacists' self-perceived competence and confidence to plan and conduct health-related research. Method: This cross-sectional study was conducted during the 89th Annual National Conference of the Pharmaceutical Society of Nigeria in November 2016. An adapted questionnaire was validated and administered to 200 pharmacist delegates during the conference. Result: Overall, 127 questionnaires were included in the analysis. At least 80% of the pharmacists had previous health-related research experience. Pharmacists' competence and confidence scores were lowest for research skills such as using software for statistical analysis, choosing and applying the appropriate inferential statistical test and method, and outlining a detailed statistical plan for data analysis. The highest competence and confidence scores were observed for conception of a research idea, literature search, and critical appraisal of the literature. Pharmacists with previous research experience had higher competence and confidence scores than those without (p<0.05). The only predictor of moderate-to-extreme self-competence and confidence was having at least one journal article published during the last 5 years. Conclusion: Nigerian pharmacists indicated an interest in participating in health-related research; however, self-competence and confidence to plan and conduct research were low, particularly for skills related to statistical analysis. Training programs and the building of a Pharmacy Practice Research Network are recommended to enhance pharmacists' research capacity. PMID:29619141

  13. Common pitfalls in statistical analysis: Absolute risk reduction, relative risk reduction, and number needed to treat

    PubMed Central

    Ranganathan, Priya; Pramesh, C. S.; Aggarwal, Rakesh

    2016-01-01

    In the previous article in this series on common pitfalls in statistical analysis, we looked at the difference between risk and odds. Risk, which refers to the probability of occurrence of an event or outcome, can be defined in absolute or relative terms. Understanding what these measures represent is essential for the accurate interpretation of study results. PMID:26952180
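
The three measures named in this record's title follow directly from the event rates in the two study arms. A minimal sketch, using hypothetical trial counts rather than data from the article:

```python
import math

def risk_measures(events_control, n_control, events_treated, n_treated):
    """Compute absolute risk reduction, relative risk reduction and NNT."""
    cer = events_control / n_control   # control event rate (baseline risk)
    eer = events_treated / n_treated   # experimental event rate
    arr = cer - eer                    # absolute risk reduction
    rrr = arr / cer                    # relative risk reduction
    nnt = math.ceil(1 / arr)           # number needed to treat, rounded up
    return arr, rrr, nnt

# Hypothetical trial: 20/100 events on control, 10/100 on treatment.
arr, rrr, nnt = risk_measures(20, 100, 10, 100)
print(round(arr, 3), round(rrr, 3), nnt)  # 0.1 0.5 10
```

Note how a modest absolute reduction (10 percentage points) corresponds to a large relative reduction (50%), which is exactly the distinction the article warns about.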

  14. Evolution of statistical properties for a nonlinearly propagating sinusoid.

    PubMed

    Shepherd, Micah R; Gee, Kent L; Hanford, Amanda D

    2011-07-01

    The nonlinear propagation of a pure sinusoid is considered using time domain statistics. The probability density function, standard deviation, skewness, kurtosis, and crest factor are computed for both the amplitude and amplitude time derivatives as a function of distance. The amplitude statistics vary only in the postshock realm, while the amplitude derivative statistics vary rapidly in the preshock realm. The statistical analysis also suggests that the sawtooth onset distance can be considered to be earlier than previously realized. © 2011 Acoustical Society of America
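
The time-domain statistics this abstract tracks (standard deviation, skewness, kurtosis, crest factor) can be computed directly from a sampled waveform. A minimal sketch for a pure sinusoid, for which the theoretical values are skewness 0, kurtosis 1.5 and crest factor sqrt(2), using only the Python standard library:

```python
import math

def time_domain_stats(x):
    """Standard deviation, skewness, kurtosis and crest factor of a waveform."""
    n = len(x)
    mean = sum(x) / n
    dev = [v - mean for v in x]
    std = math.sqrt(sum(d * d for d in dev) / n)
    skew = sum(d ** 3 for d in dev) / (n * std ** 3)
    kurt = sum(d ** 4 for d in dev) / (n * std ** 4)
    rms = math.sqrt(sum(v * v for v in x) / n)
    crest = max(abs(v) for v in x) / rms   # peak-to-RMS ratio
    return std, skew, kurt, crest

# One full period of a pure sinusoid, sampled at 1000 points.
x = [math.sin(2 * math.pi * k / 1000) for k in range(1000)]
std, skew, kurt, crest = time_domain_stats(x)
```

As a shock forms during nonlinear propagation, the amplitude-derivative statistics computed this way become strongly skewed and heavy-tailed, which is the behavior the study quantifies.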

  15. Teaching statistics in biology: using inquiry-based learning to strengthen understanding of statistical analysis in biology laboratory courses.

    PubMed

    Metz, Anneke M

    2008-01-01

    There is an increasing need for students in the biological sciences to build a strong foundation in quantitative approaches to data analyses. Although most science, engineering, and math field majors are required to take at least one statistics course, statistical analysis is poorly integrated into undergraduate biology course work, particularly at the lower-division level. Elements of statistics were incorporated into an introductory biology course, including a review of statistics concepts and opportunity for students to perform statistical analysis in a biological context. Learning gains were measured with an 11-item statistics learning survey instrument developed for the course. Students showed a statistically significant 25% (p < 0.005) increase in statistics knowledge after completing introductory biology. Students improved their scores on the survey after completing introductory biology, even if they had previously completed an introductory statistics course (9% improvement, p < 0.005). Students retested 1 yr after completing introductory biology showed no loss of their statistics knowledge as measured by this instrument, suggesting that the use of statistics in biology course work may aid long-term retention of statistics knowledge. No statistically significant differences in learning were detected between male and female students in the study.

  16. Ecological Momentary Assessments and Automated Time Series Analysis to Promote Tailored Health Care: A Proof-of-Principle Study.

    PubMed

    van der Krieke, Lian; Emerencia, Ando C; Bos, Elisabeth H; Rosmalen, Judith Gm; Riese, Harriëtte; Aiello, Marco; Sytema, Sjoerd; de Jonge, Peter

    2015-08-07

    Health promotion can be tailored by combining ecological momentary assessments (EMA) with time series analysis. This combined method allows for studying the temporal order of dynamic relationships among variables, which may provide concrete indications for intervention. However, application of this method in health care practice is hampered because analyses are conducted manually and advanced statistical expertise is required. This study aims to show how this limitation can be overcome by introducing automated vector autoregressive modeling (VAR) of EMA data and to evaluate its feasibility through comparisons with results of previously published manual analyses. We developed a Web-based open source application, called AutoVAR, which automates time series analyses of EMA data and provides output that is intended to be interpretable by nonexperts. The statistical technique we used was VAR. AutoVAR tests and evaluates all possible VAR models within a given combinatorial search space and summarizes their results, thereby replacing the researcher's tasks of conducting the analysis, making an informed selection of models, and choosing the best model. We compared the output of AutoVAR to the output of a previously published manual analysis (n=4). An illustrative example consisting of 4 analyses was provided. Compared to the manual output, the AutoVAR output presents similar model characteristics and statistical results in terms of the Akaike information criterion, the Bayesian information criterion, and the test statistic of the Granger causality test. Results suggest that automated analysis and interpretation of time series is feasible. Compared to a manual procedure, the automated procedure is more robust and can save days of time. These findings may pave the way for using time series analysis for health promotion on a larger scale. AutoVAR was evaluated using the results of a previously conducted manual analysis. Analysis of additional datasets is needed to validate and refine the application for general use.

  17. Ecological Momentary Assessments and Automated Time Series Analysis to Promote Tailored Health Care: A Proof-of-Principle Study

    PubMed Central

    Emerencia, Ando C; Bos, Elisabeth H; Rosmalen, Judith GM; Riese, Harriëtte; Aiello, Marco; Sytema, Sjoerd; de Jonge, Peter

    2015-01-01

    Background Health promotion can be tailored by combining ecological momentary assessments (EMA) with time series analysis. This combined method allows for studying the temporal order of dynamic relationships among variables, which may provide concrete indications for intervention. However, application of this method in health care practice is hampered because analyses are conducted manually and advanced statistical expertise is required. Objective This study aims to show how this limitation can be overcome by introducing automated vector autoregressive modeling (VAR) of EMA data and to evaluate its feasibility through comparisons with results of previously published manual analyses. Methods We developed a Web-based open source application, called AutoVAR, which automates time series analyses of EMA data and provides output that is intended to be interpretable by nonexperts. The statistical technique we used was VAR. AutoVAR tests and evaluates all possible VAR models within a given combinatorial search space and summarizes their results, thereby replacing the researcher's tasks of conducting the analysis, making an informed selection of models, and choosing the best model. We compared the output of AutoVAR to the output of a previously published manual analysis (n=4). Results An illustrative example consisting of 4 analyses was provided. Compared to the manual output, the AutoVAR output presents similar model characteristics and statistical results in terms of the Akaike information criterion, the Bayesian information criterion, and the test statistic of the Granger causality test. Conclusions Results suggest that automated analysis and interpretation of time series is feasible. Compared to a manual procedure, the automated procedure is more robust and can save days of time. These findings may pave the way for using time series analysis for health promotion on a larger scale. AutoVAR was evaluated using the results of a previously conducted manual analysis. Analysis of additional datasets is needed to validate and refine the application for general use. PMID:26254160
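
The model-selection step that AutoVAR automates (fitting VAR models at several lag orders and ranking them by an information criterion) can be sketched as follows. This is not AutoVAR's own code: it is an illustrative OLS implementation on simulated bivariate data, using one common AIC convention and ignoring refinements such as equalizing the estimation sample across lag orders.

```python
import numpy as np

def fit_var(data, p):
    """OLS fit of a VAR(p) with intercept; returns coefficients and residual covariance."""
    T, m = data.shape
    # Regressor rows: intercept followed by the p most recent observation vectors.
    X = np.array([np.concatenate([[1.0]] + [data[t - i] for i in range(1, p + 1)])
                  for t in range(p, T)])
    Y = data[p:]
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ B
    sigma = resid.T @ resid / len(Y)       # residual covariance matrix
    return B, sigma

def var_aic(data, p):
    """AIC for a VAR(p) fit (one common convention: n*log|Sigma| + 2*#params)."""
    _, sigma = fit_var(data, p)
    n_eff = data.shape[0] - p
    m = data.shape[1]
    k = m * (m * p + 1)                    # total estimated coefficients
    return n_eff * np.log(np.linalg.det(sigma)) + 2 * k

# Simulated bivariate series in which x2 depends on lagged x1 (true order is 1).
rng = np.random.default_rng(0)
e = rng.normal(size=(500, 2))
data = np.zeros((500, 2))
for t in range(1, 500):
    data[t, 0] = 0.5 * data[t - 1, 0] + e[t, 0]
    data[t, 1] = 0.4 * data[t - 1, 0] + 0.3 * data[t - 1, 1] + e[t, 1]

# Evaluate every candidate lag order and keep the AIC-minimizing one,
# which is the kind of exhaustive search-and-rank AutoVAR performs.
best_p = min(range(1, 6), key=lambda p: var_aic(data, p))
```

With a true lag order of 1 and 500 observations, the AIC search will typically select p=1; swapping in the BIC penalty (log(n_eff)*k instead of 2*k) reproduces the other criterion the abstract mentions.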

  18. Quantification of integrated HIV DNA by repetitive-sampling Alu-HIV PCR on the basis of Poisson statistics.

    PubMed

    De Spiegelaere, Ward; Malatinkova, Eva; Lynch, Lindsay; Van Nieuwerburgh, Filip; Messiaen, Peter; O'Doherty, Una; Vandekerckhove, Linos

    2014-06-01

    Quantification of integrated proviral HIV DNA by repetitive-sampling Alu-HIV PCR is a candidate virological tool to monitor the HIV reservoir in patients. However, the experimental procedures and data analysis of the assay are complex and hinder its widespread use. Here, we provide an improved and simplified data analysis method by adopting binomial and Poisson statistics. A modified analysis method on the basis of Poisson statistics was used to analyze the binomial data of positive and negative reactions from a 42-replicate Alu-HIV PCR by use of dilutions of an integration standard and on samples from 57 HIV-infected patients. Results were compared with the quantitative output of the previously described Alu-HIV PCR method. Poisson-based quantification of the Alu-HIV PCR was linearly correlated with the standard dilution series, indicating that absolute quantification with the Poisson method is a valid alternative for data analysis of repetitive-sampling Alu-HIV PCR data. Quantitative outputs of patient samples assessed by the Poisson method correlated with the previously described Alu-HIV PCR analysis, indicating that this method is a valid alternative for quantifying integrated HIV DNA. Poisson-based analysis of the Alu-HIV PCR data enables absolute quantification without the need for a standard dilution curve. Implementation of confidence interval (CI) estimation permits improved qualitative analysis of the data and provides a statistical basis for the required minimal number of technical replicates. © 2014 The American Association for Clinical Chemistry.
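
The Poisson-based quantification described above rests on the fact that, if target copies are Poisson-distributed across replicate reactions, the expected fraction of negative reactions is exp(-lambda), so lambda = -ln(fraction negative). A minimal sketch (not the authors' implementation: the 42-replicate count comes from the abstract, the negative count is hypothetical, and a simple Wald interval stands in for whatever CI method the paper uses):

```python
import math

def poisson_quantify(n_replicates, n_negative, z=1.96):
    """Estimate mean target copies per reaction from negative-reaction counts.

    Assumes copies are Poisson-distributed across replicates, so that
    P(negative) = exp(-lambda) and hence lambda = -ln(fraction negative).
    Requires at least one negative and one positive reaction.
    """
    f = n_negative / n_replicates
    lam = -math.log(f)
    # Wald CI on the negative fraction, transformed to the Poisson scale.
    se = math.sqrt(f * (1 - f) / n_replicates)
    lo = -math.log(min(f + z * se, 1.0))
    hi = -math.log(max(f - z * se, 1e-12))
    return lam, (lo, hi)

# Hypothetical 42-replicate run in which 15 reactions came up negative.
lam, ci = poisson_quantify(42, 15)
```

Because lambda is estimated from the negative fraction alone, no standard dilution curve is needed, which is the practical advantage the abstract highlights; the CI width also shows directly how the replicate count limits precision.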

  19. Statistically Characterizing Intra- and Inter-Individual Variability in Children with Developmental Coordination Disorder

    ERIC Educational Resources Information Center

    King, Bradley R.; Harring, Jeffrey R.; Oliveira, Marcio A.; Clark, Jane E.

    2011-01-01

    Previous research investigating children with Developmental Coordination Disorder (DCD) has consistently reported increased intra- and inter-individual variability during motor skill performance. Statistically characterizing this variability is not only critical for the analysis and interpretation of behavioral data, but also may facilitate our…

  20. Common pitfalls in statistical analysis: Linear regression analysis

    PubMed Central

    Aggarwal, Rakesh; Ranganathan, Priya

    2017-01-01

    In a previous article in this series, we explained correlation analysis which describes the strength of relationship between two continuous variables. In this article, we deal with linear regression analysis which predicts the value of one continuous variable from another. We also discuss the assumptions and pitfalls associated with this analysis. PMID:28447022
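
The analysis the article introduces can be illustrated with the standard least-squares formulas for a single predictor. A minimal sketch with made-up points lying exactly on the line y = 2x + 1:

```python
def linear_regression(x, y):
    """Least-squares fit y ~ a + b*x; returns intercept, slope and correlation r."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    syy = sum((yi - my) ** 2 for yi in y)
    b = sxy / sxx                      # slope
    a = my - b * mx                    # intercept
    r = sxy / (sxx * syy) ** 0.5       # correlation coefficient
    return a, b, r

# Exact line y = 2x + 1: slope 2, intercept 1, r = 1.
a, b, r = linear_regression([1, 2, 3, 4], [3, 5, 7, 9])
```

The slope formula makes the link to the previous article in the series explicit: b is the correlation r rescaled by the ratio of standard deviations, which is why a strong correlation does not by itself pin down the size of the predicted change.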

  1. Methodologies for the Statistical Analysis of Memory Response to Radiation

    NASA Astrophysics Data System (ADS)

    Bosser, Alexandre L.; Gupta, Viyas; Tsiligiannis, Georgios; Frost, Christopher D.; Zadeh, Ali; Jaatinen, Jukka; Javanainen, Arto; Puchner, Helmut; Saigné, Frédéric; Virtanen, Ari; Wrobel, Frédéric; Dilillo, Luigi

    2016-08-01

    Methodologies are proposed for in-depth statistical analysis of Single Event Upset data. The motivation for using these methodologies is to obtain precise information on the intrinsic defects and weaknesses of the tested devices, and to gain insight on their failure mechanisms, at no additional cost. The case study is a 65 nm SRAM irradiated with neutrons, protons and heavy ions. This publication is an extended version of a previous study [1].

  2. Teaching Statistics in Biology: Using Inquiry-based Learning to Strengthen Understanding of Statistical Analysis in Biology Laboratory Courses

    PubMed Central

    2008-01-01

    There is an increasing need for students in the biological sciences to build a strong foundation in quantitative approaches to data analyses. Although most science, engineering, and math field majors are required to take at least one statistics course, statistical analysis is poorly integrated into undergraduate biology course work, particularly at the lower-division level. Elements of statistics were incorporated into an introductory biology course, including a review of statistics concepts and opportunity for students to perform statistical analysis in a biological context. Learning gains were measured with an 11-item statistics learning survey instrument developed for the course. Students showed a statistically significant 25% (p < 0.005) increase in statistics knowledge after completing introductory biology. Students improved their scores on the survey after completing introductory biology, even if they had previously completed an introductory statistics course (9% improvement, p < 0.005). Students retested 1 yr after completing introductory biology showed no loss of their statistics knowledge as measured by this instrument, suggesting that the use of statistics in biology course work may aid long-term retention of statistics knowledge. No statistically significant differences in learning were detected between male and female students in the study. PMID:18765754

  3. Spatio-temporal dependencies between hospital beds, physicians and health expenditure using visual variables and data classification in statistical table

    NASA Astrophysics Data System (ADS)

    Medyńska-Gulij, Beata; Cybulski, Paweł

    2016-06-01

    This paper analyses the use of visual variables in tables of statistical data on hospital beds as an important tool for revealing spatio-temporal dependencies. It is argued that some of the conclusions drawn from the data about public health and public expenditure on health have a spatio-temporal reference. Unlike previous studies, this article combines cartographic pragmatics and spatial visualization with conclusions previously reported in the public health literature. While significant conclusions about health care and economic factors have been highlighted in research papers, this article is the first to apply visual analysis to a statistical table together with maps, an approach called previsualisation.

  4. Statistics Test Questions: Content and Trends

    ERIC Educational Resources Information Center

    Salcedo, Audy

    2014-01-01

    This study presents the results of the analysis of a group of teacher-made test questions for statistics courses at the university level. Teachers were asked to submit tests they had used in their previous two semesters. Ninety-seven tests containing 978 questions were gathered and classified according to the SOLO taxonomy (Biggs & Collis,…

  5. More on Time Series Designs: A Reanalysis of Mayer and Kozlow's Data.

    ERIC Educational Resources Information Center

    Willson, Victor L.

    1982-01-01

    Differentiating between time-series design and time-series analysis, examines design considerations and reanalyzes data previously reported by Mayer and Kozlow in this journal. The current analysis supports the analysis performed by Mayer and Kozlow but puts the results on a somewhat firmer statistical footing. (Author/JN)

  6. Mapping patent classifications: portfolio and statistical analysis, and the comparison of strengths and weaknesses.

    PubMed

    Leydesdorff, Loet; Kogler, Dieter Franz; Yan, Bowen

    2017-01-01

    The Cooperative Patent Classifications (CPC) recently developed cooperatively by the European and US Patent Offices provide a new basis for mapping patents and portfolio analysis. CPC replaces the International Patent Classifications (IPC) of the World Intellectual Property Organization. In this study, we update our routines previously based on IPC for CPC and use the occasion to rethink various parameter choices. The new maps are significantly different from the previous ones, although this may not always be obvious on visual inspection. We provide nested maps online and a routine for generating portfolio overlays on the maps; a new tool is provided for "difference maps" between the patent portfolios of organizations or firms. This is illustrated by comparing the portfolios of patents granted in 2016 to two competing firms, Novartis and MSD. Furthermore, the data is organized for the purpose of statistical analysis.

  7. Online Statistical Modeling (Regression Analysis) for Independent Responses

    NASA Astrophysics Data System (ADS)

    Made Tirta, I.; Anggraeni, Dian; Pandutama, Martinus

    2017-06-01

    Regression analysis (statistical modelling) is among the statistical methods most frequently needed in analyzing quantitative data, especially to model the relationship between response and explanatory variables. Nowadays, statistical models have been developed in various directions to handle various types of data and complex relationships. A rich variety of advanced and recent statistical models is available, mostly in open source software (one of them being R). However, these advanced statistical models are not very friendly to novice R users, since they are based on programming scripts or a command line interface. Our research aims to develop a web interface (based on R and Shiny) so that the most recent and advanced statistical models are readily available, accessible and applicable on the web. We previously built an interface in the form of an e-tutorial for several modern and advanced statistical models in R, especially for independent responses (including linear models/LM, generalized linear models/GLM, generalized additive models/GAM and generalized additive models for location, scale and shape/GAMLSS). In this research we unified them in the form of data analysis tools, including models using computer-intensive statistics (bootstrap and Markov chain Monte Carlo/MCMC). All are readily accessible in our online Virtual Statistics Laboratory. The web interface makes statistical modelling easier to apply and makes it easier to compare models in order to find the most appropriate one for the data.

  8. Research design and statistical methods in Pakistan Journal of Medical Sciences (PJMS).

    PubMed

    Akhtar, Sohail; Shah, Syed Wadood Ali; Rafiq, M; Khan, Ajmal

    2016-01-01

    This article compares the study designs and statistical methods used in the 2005, 2010 and 2015 volumes of the Pakistan Journal of Medical Sciences (PJMS). Only original articles of PJMS were considered for the analysis. The articles were carefully reviewed for statistical methods and designs, and then recorded accordingly. The frequency of each statistical method and research design was estimated and compared with previous years. A total of 429 articles were evaluated (n=74 in 2005, n=179 in 2010, n=176 in 2015), of which 171 (40%) were cross-sectional and 116 (27%) were prospective study designs. A variety of statistical methods were found in the analysis. The most frequent methods include descriptive statistics (n=315, 73.4%), chi-square/Fisher's exact tests (n=205, 47.8%) and Student's t-test (n=186, 43.4%). There was a significant increase in the use of statistical methods over the time period: t-test, chi-square/Fisher's exact test, logistic regression, epidemiological statistics, and non-parametric tests. This study shows that a diverse variety of statistical methods have been used in the research articles of PJMS, and their frequency improved from 2005 to 2015. However, descriptive statistics was the most frequent method of statistical analysis in the published articles, while the cross-sectional study design was the most common study design.

  9. Patterns of Puffery: An Analysis of Non-Fiction Blurbs

    ERIC Educational Resources Information Center

    Cronin, Blaise; La Barre, Kathryn

    2005-01-01

    The blurb is a paratextual element which has not previously been subjected to systematic analysis. We describe the nature and purpose of this publishing epiphenomenon, highlight some of the related marketing issues and ethical concerns and provide a statistical analysis of almost 2000 blurbs identified in a sample of 450 non-fiction books.…

  10. Wavelet Statistical Analysis of Low-Latitude Geomagnetic Measurements

    NASA Astrophysics Data System (ADS)

    Papa, A. R.; Akel, A. F.

    2009-05-01

    Following previous work by our group (Papa et al., JASTP, 2006), where we analyzed a series of records acquired at the Vassouras National Geomagnetic Observatory in Brazil during October 2000, we introduce a wavelet analysis for the same type of data and for other periods. It is well known that wavelets allow a more detailed study in several senses: the time window for analysis can be drastically reduced compared to other traditional methods (Fourier, for example), while at the same time allowing an almost continuous tracking of both the amplitude and frequency of signals as time goes by. This advantage opens possibilities for potentially useful forecasting methods of the type also advanced by our group in previous works (see, for example, Papa and Sosman, JASTP, 2008). However, the simultaneous statistical analysis of both time series (in our case, amplitude and frequency) is a challenging matter, and it is in this sense that we have set our main goal. Some possible directions for future work are outlined.
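
    The time-localization property described above can be illustrated with the Haar wavelet, the simplest member of the family (a generic sketch, not the transform used in the paper): a step change in the record shows up as a single large detail coefficient at the matching scale.

```python
def haar_step(signal):
    """One level of the Haar transform: pairwise averages
    (approximation) and pairwise half-differences (detail)."""
    half = len(signal) // 2
    approx = [(signal[2 * i] + signal[2 * i + 1]) / 2 for i in range(half)]
    detail = [(signal[2 * i] - signal[2 * i + 1]) / 2 for i in range(half)]
    return approx, detail

def haar_decompose(signal, levels):
    """Multi-level decomposition; returns the final coarse approximation
    and the details per level (length must be divisible by 2**levels)."""
    details = []
    approx = list(signal)
    for _ in range(levels):
        approx, d = haar_step(approx)
        details.append(d)
    return approx, details

# a step change mid-record localizes in a single detail coefficient
x = [1.0] * 4 + [5.0] * 4
approx, details = haar_decompose(x, 3)
print(approx, details)
```

    Unlike a global Fourier transform, the nonzero detail coefficient tells both that a change occurred and where, which is what makes wavelets attractive for forecasting.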

  11. Reliability and statistical power analysis of cortical and subcortical FreeSurfer metrics in a large sample of healthy elderly.

    PubMed

    Liem, Franziskus; Mérillat, Susan; Bezzola, Ladina; Hirsiger, Sarah; Philipp, Michel; Madhyastha, Tara; Jäncke, Lutz

    2015-03-01

    FreeSurfer is a tool to quantify cortical and subcortical brain anatomy automatically and noninvasively. Previous studies have reported reliability and statistical power analyses in relatively small samples or only selected one aspect of brain anatomy. Here, we investigated reliability and statistical power of cortical thickness, surface area, volume, and the volume of subcortical structures in a large sample (N=189) of healthy elderly subjects (64+ years). Reliability (intraclass correlation coefficient) of cortical and subcortical parameters is generally high (cortical: ICCs>0.87, subcortical: ICCs>0.95). Surface-based smoothing increases reliability of cortical thickness maps, while it decreases reliability of cortical surface area and volume. Nevertheless, statistical power of all measures benefits from smoothing. When aiming to detect a 10% difference between groups, the number of subjects required to test effects with sufficient power over the entire cortex varies between cortical measures (cortical thickness: N=39, surface area: N=21, volume: N=81; 10 mm smoothing, power=0.8, α=0.05). For subcortical regions this number is between 16 and 76 subjects, depending on the region. We also demonstrate the advantage of within-subject designs over between-subject designs. Furthermore, we publicly provide a tool that allows researchers to perform a priori power analysis and sensitivity analysis to help evaluate previously published studies and to design future studies with sufficient statistical power. Copyright © 2014 Elsevier Inc. All rights reserved.
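
    The a priori power analysis the authors' tool supports can be approximated by the textbook normal-approximation sample-size formula for comparing two group means; this generic sketch is not the FreeSurfer-specific calculation.

```python
from math import ceil
from statistics import NormalDist

def n_per_group(delta, sd, power=0.8, alpha=0.05):
    """Normal-approximation sample size per group for a two-sample
    comparison of means, two-sided alpha:
    n = 2 * ((z_{1-alpha/2} + z_{power}) * sd / delta)**2."""
    z = NormalDist().inv_cdf
    z_a = z(1 - alpha / 2)   # ~1.96 for alpha = 0.05
    z_b = z(power)           # ~0.84 for power = 0.8
    return ceil(2 * ((z_a + z_b) * sd / delta) ** 2)

# detecting a 10% group difference when the SD is 25% of the mean
print(n_per_group(delta=0.10, sd=0.25))
```

    The per-measure sample sizes quoted in the abstract come from this kind of calculation combined with measure-specific reliability and smoothing choices.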

  12. Alignments of parity even/odd-only multipoles in CMB

    NASA Astrophysics Data System (ADS)

    Aluri, Pavan K.; Ralston, John P.; Weltman, Amanda

    2017-12-01

    We compare the statistics of parity even and odd multipoles of the cosmic microwave background (CMB) sky from Planck full mission temperature measurements. An excess power in odd multipoles compared to even multipoles has previously been found on large angular scales. Motivated by this apparent parity asymmetry, we evaluate directional statistics associated with even compared to odd multipoles, along with their significances. Primary tools are the Power tensor and Alignment tensor statistics. We limit our analysis to the first 60 multipoles i.e. l = [2, 61]. We find no evidence for statistically unusual alignments of even parity multipoles. More than one independent statistic finds evidence for alignments of anisotropy axes of odd multipoles, with a significance equivalent to ∼2σ or more. The robustness of alignment axes is tested by making Galactic cuts and varying the multipole range. Very interestingly, the region spanned by the (a)symmetry axes is found to broadly contain other parity (a)symmetry axes previously observed in the literature.
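
    A toy version of the parity comparison (mean band power in odd versus even multipoles, not the Power tensor or Alignment tensor statistics) might look like this; the spectrum below is constructed to be parity-symmetric.

```python
def parity_ratio(cl, lmin=2, lmax=61):
    """Ratio of mean l(l+1)C_l band power in odd vs even multipoles
    over [lmin, lmax]; cl maps multipole l -> C_l. A ratio above 1
    indicates excess power in odd multipoles."""
    band = {l: l * (l + 1) * cl[l] for l in range(lmin, lmax + 1)}
    odd = [band[l] for l in band if l % 2 == 1]
    even = [band[l] for l in band if l % 2 == 0]
    return (sum(odd) / len(odd)) / (sum(even) / len(even))

# a spectrum with flat l(l+1)C_l has no parity asymmetry (ratio ~ 1)
cl = {l: 1.0 / (l * (l + 1)) for l in range(2, 62)}
print(parity_ratio(cl))
```

    Real analyses assess the significance of such a ratio against simulated isotropic skies rather than against a fixed threshold.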

  13. Performance evaluation of tile-based Fisher Ratio analysis using a benchmark yeast metabolome dataset.

    PubMed

    Watson, Nathanial E; Parsons, Brendon A; Synovec, Robert E

    2016-08-12

    Performance of tile-based Fisher Ratio (F-ratio) data analysis, recently developed for discovery-based studies using comprehensive two-dimensional gas chromatography coupled with time-of-flight mass spectrometry (GC×GC-TOFMS), is evaluated with a metabolomics dataset that had previously been analyzed in great detail using a brute-force approach. The previously analyzed data (referred to herein as the benchmark dataset) were intracellular extracts from Saccharomyces cerevisiae (yeast), either metabolizing glucose (repressed) or ethanol (derepressed), which define the two classes in the discovery-based analysis to find metabolites that differ significantly in concentration between the two classes. Beneficially, this previously analyzed dataset provides a concrete means to validate the tile-based F-ratio software. Herein, we demonstrate and validate the significant benefits of applying tile-based F-ratio analysis. The yeast metabolomics data are analyzed far more rapidly, in about one week versus one year for the prior studies with this dataset. Furthermore, a null distribution analysis is implemented to statistically determine an adequate F-ratio threshold, whereby variables with F-ratio values below the threshold can be ignored as not class-distinguishing, which gives the analyst confidence when examining the hit table. Forty-six of the fifty-four benchmarked changing metabolites were discovered by the new methodology, while all but one of the nineteen benchmarked false-positive metabolites previously identified were consistently excluded. Copyright © 2016 Elsevier B.V. All rights reserved.
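
    For a single variable (one chromatographic tile/metabolite), the two-class Fisher ratio reduces to between-class variance over within-class variance; a minimal sketch with invented peak intensities.

```python
def f_ratio(class_a, class_b):
    """Two-class Fisher ratio for one variable: between-class
    variance (df1 = 1) over within-class variance (df2 = na+nb-2)."""
    na, nb = len(class_a), len(class_b)
    ma, mb = sum(class_a) / na, sum(class_b) / nb
    grand = (sum(class_a) + sum(class_b)) / (na + nb)
    between = na * (ma - grand) ** 2 + nb * (mb - grand) ** 2
    within = (sum((x - ma) ** 2 for x in class_a)
              + sum((x - mb) ** 2 for x in class_b))
    return between * (na + nb - 2) / within

# hypothetical peak intensities for one metabolite in the two classes
repressed = [1.0, 2.0, 1.0, 2.0]
derepressed = [5.0, 6.0, 5.0, 6.0]
print(f_ratio(repressed, derepressed))  # 96.0
```

    In the tile-based method this quantity is computed per tile across the whole chromatogram, and the null-distribution step sets the threshold above which a tile counts as a hit.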

  14. A Divergence Statistics Extension to VTK for Performance Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pebay, Philippe Pierre; Bennett, Janine Camille

    This report follows the series of previous documents [PT08, BPRT09b, PT09, BPT09, PT10, PB13], where we presented the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, order and auto-correlative statistics engines which we developed within the Visualization Tool Kit (VTK) as a scalable, parallel and versatile statistics package. We now report on a new engine which we developed for the calculation of divergence statistics, a concept which we hereafter explain and whose main goal is to quantify the discrepancy, in a statistical manner akin to measuring a distance, between an observed empirical distribution and a theoretical, "ideal" one. The ease of use of the new divergence statistics engine is illustrated by means of C++ code snippets. Although this new engine does not yet have a parallel implementation, it has already been applied to HPC performance analysis, of which we provide an example.
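
    One standard way to quantify the discrepancy between an observed empirical distribution and a theoretical one is the Kullback-Leibler divergence; the report defines its own divergence measure, so the sketch below is illustrative only.

```python
from math import log

def kl_divergence(observed, expected):
    """Kullback-Leibler divergence D(P || Q) in nats between an
    observed distribution P and a theoretical one Q (both normalized,
    same support, q > 0 wherever p > 0). Zero iff P equals Q."""
    return sum(p * log(p / q) for p, q in zip(observed, expected) if p > 0)

# identical distributions diverge by zero; a skewed observation does not
uniform = [0.25] * 4
print(kl_divergence(uniform, uniform))
print(kl_divergence([0.4, 0.3, 0.2, 0.1], uniform))
```

    Like the engine's measure, this behaves distance-like (non-negative, zero only at equality) but is not symmetric in its arguments.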

  15. Statistical analysis and application of quasi experiments to antimicrobial resistance intervention studies.

    PubMed

    Shardell, Michelle; Harris, Anthony D; El-Kamary, Samer S; Furuno, Jon P; Miller, Ram R; Perencevich, Eli N

    2007-10-01

    Quasi-experimental study designs are frequently used to assess interventions that aim to limit the emergence of antimicrobial-resistant pathogens. However, previous studies using these designs have often used suboptimal statistical methods, which may result in researchers making spurious conclusions. Methods used to analyze quasi-experimental data include 2-group tests, regression analysis, and time-series analysis, and they all have specific assumptions, data requirements, strengths, and limitations. An example of a hospital-based intervention to reduce methicillin-resistant Staphylococcus aureus infection rates and reduce overall length of stay is used to explore these methods.
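
    A common remedy for the suboptimal two-group tests mentioned above is segmented (interrupted time-series) regression, which separates a level change at the intervention from the underlying trend; a minimal sketch with invented infection-rate data.

```python
def fit_line(ts, ys):
    """OLS slope and intercept for one segment."""
    n = len(ts)
    mt, my = sum(ts) / n, sum(ys) / n
    b = (sum((t - mt) * (y - my) for t, y in zip(ts, ys))
         / sum((t - mt) ** 2 for t in ts))
    return b, my - b * mt

def level_change(ts, ys, t0):
    """Fit separate trends before and after intervention time t0 and
    report the jump in level at t0 -- the effect a plain pre/post
    mean comparison would conflate with the secular trend."""
    pre = [(t, y) for t, y in zip(ts, ys) if t < t0]
    post = [(t, y) for t, y in zip(ts, ys) if t >= t0]
    b1, a1 = fit_line([t for t, _ in pre], [y for _, y in pre])
    b2, a2 = fit_line([t for t, _ in post], [y for _, y in post])
    return (a2 + b2 * t0) - (a1 + b1 * t0)

# invented monthly rates: steady decline plus a drop of 2 at t0 = 5
ts = list(range(10))
ys = [10 - 0.5 * t for t in ts[:5]] + [8 - 0.5 * t for t in ts[5:]]
print(level_change(ts, ys, 5))  # -2.0
```

    Because the secular trend is modeled explicitly, the estimated intervention effect is not inflated by improvement that was already underway.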

  16. Statistical Analysis of CFD Solutions from the Drag Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Hemsch, Michael J.

    2002-01-01

    A simple, graphical framework is presented for robust statistical evaluation of results obtained from N-version testing of a series of RANS CFD codes. The solutions were obtained by a variety of code developers and users for the June 2001 Drag Prediction Workshop sponsored by the AIAA Applied Aerodynamics Technical Committee. The aerodynamic configuration used for the computational tests is the DLR-F4 wing-body combination, previously tested in several European wind tunnels and for which a previous N-version test had been conducted. The statistical framework is used to evaluate code results for (1) a single cruise design point, (2) drag polars and (3) drag rise. The paper concludes with a discussion of the meaning of the results, especially with respect to predictability, validation, and reporting of solutions.
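
    Robust frameworks of this kind typically center on the median and the median absolute deviation (MAD); a hypothetical sketch for screening outlier solutions in an N-version test (the drag values below are invented, not workshop data).

```python
import statistics

def flag_outliers(values, k=3.0):
    """Median/MAD screen: flag code-to-code results more than k
    scaled MADs from the median. The 1.4826 factor makes the MAD
    consistent with the standard deviation for normal data."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    scale = 1.4826 * mad
    return [v for v in values if abs(v - med) > k * scale]

# invented drag counts from six hypothetical codes; one is far off
drags = [271.0, 272.5, 270.8, 272.0, 271.4, 290.0]
print(flag_outliers(drags))  # [290.0]
```

    Median-based statistics are preferred here because a single divergent solution would badly distort a mean-and-standard-deviation summary of the code collective.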

  17. Energy-density field approach for low- and medium-frequency vibroacoustic analysis of complex structures using a statistical computational model

    NASA Astrophysics Data System (ADS)

    Kassem, M.; Soize, C.; Gagliardini, L.

    2009-06-01

    In this paper, an energy-density field approach applied to the vibroacoustic analysis of complex industrial structures in the low- and medium-frequency ranges is presented. This approach uses a statistical computational model. The analyzed system consists of an automotive vehicle structure coupled with its internal acoustic cavity. The objective of this paper is to make use of the statistical properties of the frequency response functions of the vibroacoustic system observed from previous experimental and numerical work. The frequency response functions are expressed in terms of a dimensionless matrix which is estimated using the proposed energy approach. Using this dimensionless matrix, a simplified vibroacoustic model is proposed.

  18. Research design and statistical methods in Pakistan Journal of Medical Sciences (PJMS)

    PubMed Central

    Akhtar, Sohail; Shah, Syed Wadood Ali; Rafiq, M.; Khan, Ajmal

    2016-01-01

    Objective: This article compares the study designs and statistical methods used in the 2005, 2010 and 2015 volumes of the Pakistan Journal of Medical Sciences (PJMS). Methods: Only original articles of PJMS were considered for the analysis. The articles were carefully reviewed for statistical methods and designs, and then recorded accordingly. The frequency of each statistical method and research design was estimated and compared with previous years. Results: A total of 429 articles were evaluated (n=74 in 2005, n=179 in 2010, n=176 in 2015), of which 171 (40%) were cross-sectional and 116 (27%) were prospective study designs. A variety of statistical methods were found in the analysis. The most frequent methods included descriptive statistics (n=315, 73.4%), chi-square/Fisher's exact tests (n=205, 47.8%) and Student's t-test (n=186, 43.4%). There was a significant increase in the use of statistical methods over the time period: t-test, chi-square/Fisher's exact test, logistic regression, epidemiological statistics, and non-parametric tests. Conclusion: This study shows that a diverse variety of statistical methods have been used in the research articles of PJMS and that their frequency increased from 2005 to 2015. However, descriptive statistics remained the most frequent method of statistical analysis in the published articles, while the cross-sectional design was the most common study design. PMID:27022365

  19. Exceedance statistics of accelerations resulting from thruster firings on the Apollo-Soyuz mission

    NASA Technical Reports Server (NTRS)

    Fichtl, G. H.; Holland, R. L.

    1983-01-01

    Spacecraft accelerations resulting from firings of vernier control system thrusters are an important consideration in the design, planning, execution and post-flight analysis of laboratory experiments in space. In particular, scientists and technologists involved in the development of experiments to be performed in space in many instances require statistical information on the magnitude and rate of occurrence of spacecraft accelerations. Typically, these accelerations are stochastic in nature, so it is useful to characterize them in statistical terms. Statistics of spacecraft accelerations are summarized. Previously announced in STAR as N82-12127.
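
    For a discretely sampled record, exceedance statistics reduce to the fraction of samples above a threshold and the rate of upward threshold crossings; a minimal sketch with invented acceleration values.

```python
def exceedance_stats(accels, threshold):
    """Fraction of samples exceeding a threshold, and the number of
    upward crossings (prev <= threshold < cur) in a sampled record."""
    frac = sum(1 for a in accels if a > threshold) / len(accels)
    crossings = sum(1 for prev, cur in zip(accels, accels[1:])
                    if prev <= threshold < cur)
    return frac, crossings

# invented acceleration magnitudes (arbitrary units), threshold 1.0
accels = [0.1, 0.4, 0.9, 0.3, 1.2, 1.1, 0.2, 0.8, 1.5, 0.6]
frac, ups = exceedance_stats(accels, 1.0)
print(frac, ups)
```

    Dividing the crossing count by the record duration gives the rate-of-occurrence statistic that experimenters typically request.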

  20. Enhancement of CFD validation exercise along the roof profile of a low-rise building

    NASA Astrophysics Data System (ADS)

    Deraman, S. N. C.; Majid, T. A.; Zaini, S. S.; Yahya, W. N. W.; Abdullah, J.; Ismail, M. A.

    2018-04-01

    The aim of this study is to enhance the validation of a CFD exercise along the roof profile of a low-rise building. An isolated gabled-roof house with a 26.6° roof pitch was simulated to obtain the pressure coefficients around the house. Validation of a CFD analysis against experimental data requires many input parameters. This study performed the CFD simulation based on data from a previous study; where input parameters were not clearly stated, new values were established from the open literature. The numerical simulations were performed in FLUENT 14.0 by applying the Computational Fluid Dynamics (CFD) approach based on the steady RANS equations together with the RNG k-ɛ model. The CFD results were then analysed using quantitative tests (statistical analysis) and compared with the CFD results from the previous study. The statistical analysis, consisting of an ANOVA test and error measures, showed that the CFD results from the current study were in good agreement and exhibited the smallest error compared to the previous study. All the input data used in this study can be extended to other types of CFD simulation involving wind flow over an isolated single-storey house.

  1. Statistical testing of association between menstruation and migraine.

    PubMed

    Barra, Mathias; Dahl, Fredrik A; Vetvik, Kjersti G

    2015-02-01

    To repair and refine a previously proposed method for the statistical analysis of association between migraine and menstruation. Menstrually related migraine (MRM) affects about 20% of female migraineurs in the general population. The exact pathophysiological link from menstruation to migraine is hypothesized to act through fluctuations in female reproductive hormones, but the exact mechanisms remain unknown. Therefore, the main diagnostic criterion today is concurrency of migraine attacks with menstruation. Methods that exclude spurious associations are needed, so that further research into these mechanisms can be performed on a population with a true association. The statistical method is based on a simple two-parameter null model of MRM (which allows for simulation modeling), and Fisher's exact test (with mid-p correction) applied to standard 2 × 2 contingency tables derived from the patients' headache diaries. Our method is a corrected version of a previously published flawed framework. To the best of our knowledge, no other published methods for establishing a menstruation-migraine association by statistical means exist today. The probabilistic methodology shows good performance when subjected to receiver operating characteristic curve analysis. Quick-reference cutoff values for the clinical setting were tabulated for assessing association given a patient's headache history. In this paper, we correct a proposed method for establishing an association between menstruation and migraine by statistical methods. We conclude that the proposed standard of 3-cycle observations prior to setting an MRM diagnosis should be extended by at least one perimenstrual window to obtain sufficient information for statistical processing. © 2014 American Headache Society.
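
    The mid-p correction mentioned above assigns half weight to the observed table; a one-sided sketch via the hypergeometric distribution (the 2 × 2 counts below are invented, not taken from patient diaries).

```python
from math import comb

def hypergeom_pmf(k, n, K, N):
    """P(X = k): k marked items in a draw of n from N items, K marked."""
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)

def midp_fisher_one_sided(table):
    """One-sided mid-p Fisher test on a 2x2 table [[a, b], [c, d]]:
    full weight on tables more extreme than observed (larger a),
    half weight on the observed table itself."""
    (a, b), (c, d) = table
    n, K, N = a + b, a + c, a + b + c + d
    p = 0.5 * hypergeom_pmf(a, n, K, N)
    for k in range(a + 1, min(n, K) + 1):
        p += hypergeom_pmf(k, n, K, N)
    return p

# invented diary counts: attacks on perimenstrual vs other days
print(round(midp_fisher_one_sided([[8, 2], [3, 9]]), 4))
```

    The mid-p variant is less conservative than the classical exact test, which matters for the short (3-cycle) observation windows discussed in the paper.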

  2. A two-point diagnostic for the H II galaxy Hubble diagram

    NASA Astrophysics Data System (ADS)

    Leaf, Kyle; Melia, Fulvio

    2018-03-01

    A previous analysis of starburst-dominated H II galaxies and H II regions has demonstrated a statistically significant preference for the Friedmann-Robertson-Walker cosmology with zero active mass, known as the Rh = ct universe, over Λ cold dark matter (ΛCDM) and its related dark-matter parametrizations. In this paper, we employ a two-point diagnostic with these data to present a complementary statistical comparison of Rh = ct with Planck ΛCDM. Our two-point diagnostic compares, in a pairwise fashion, the difference between the distance modulus measured at two redshifts with that predicted by each cosmology. Our results support the conclusion drawn by a previous comparative analysis demonstrating that Rh = ct is statistically preferred over Planck ΛCDM. But we also find that the reported errors in the H II measurements may not be purely Gaussian, perhaps due to a partial contamination by non-Gaussian systematic effects. The use of H II galaxies and H II regions as standard candles may be improved even further with a better handling of the systematics in these sources.
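
    Generically, the two-point diagnostic compares, for every pair of sources, the observed difference in distance modulus with the model-predicted difference; a consistent model leaves the residuals scattered about zero. The logarithmic toy model below is purely illustrative, not the Rh = ct or ΛCDM prediction.

```python
from itertools import combinations
from math import log10

def two_point_diffs(z, mu_obs, mu_model):
    """For every redshift pair (i, j), the observed distance-modulus
    difference minus the model prediction. By differencing, any
    constant offset (e.g. absolute calibration) cancels out."""
    return [(mu_obs[i] - mu_obs[j]) - (mu_model(z[i]) - mu_model(z[j]))
            for i, j in combinations(range(len(z)), 2)]

# toy "model" and perfectly consistent mock data (illustration only)
model = lambda z: 5 * log10(z) + 25
z = [0.1, 0.5, 1.0]
mu = [model(zi) for zi in z]
print(two_point_diffs(z, mu, model))
```

    With real data, the distribution of these pairwise residuals (and whether it is Gaussian) is what the statistical comparison examines.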

  3. Toward Reflective Judgment in Exploratory Factor Analysis Decisions: Determining the Extraction Method and Number of Factors To Retain.

    ERIC Educational Resources Information Center

    Knight, Jennifer L.

    This paper considers some decisions that must be made by the researcher conducting an exploratory factor analysis. The primary purpose is to aid the researcher in making informed decisions during the factor analysis instead of relying on defaults in statistical programs or traditions of previous researchers. Three decision areas are addressed.…

  4. Exploratory Visual Analysis of Statistical Results from Microarray Experiments Comparing High and Low Grade Glioma

    PubMed Central

    Reif, David M.; Israel, Mark A.; Moore, Jason H.

    2007-01-01

    The biological interpretation of gene expression microarray results is a daunting challenge. For complex diseases such as cancer, wherein the body of published research is extensive, the incorporation of expert knowledge provides a useful analytical framework. We have previously developed the Exploratory Visual Analysis (EVA) software for exploring data analysis results in the context of annotation information about each gene, as well as biologically relevant groups of genes. We present EVA as a flexible combination of statistics and biological annotation that provides a straightforward visual interface for the interpretation of microarray analyses of gene expression in the most commonly occurring class of brain tumors, glioma. We demonstrate the utility of EVA for the biological interpretation of statistical results by analyzing publicly available gene expression profiles of two important glial tumors. The results of a statistical comparison between 21 malignant, high-grade glioblastoma multiforme (GBM) tumors and 19 indolent, low-grade pilocytic astrocytomas were analyzed using EVA. By using EVA to examine the results of a relatively simple statistical analysis, we were able to identify tumor class-specific gene expression patterns having both statistical and biological significance. Our interactive analysis highlighted the potential importance of genes involved in cell cycle progression, proliferation, signaling, adhesion, migration, motility, and structure, as well as candidate gene loci on a region of Chromosome 7 that has been implicated in glioma. Because EVA does not require statistical or computational expertise and has the flexibility to accommodate any type of statistical analysis, we anticipate EVA will prove a useful addition to the repertoire of computational methods used for microarray data analysis. EVA is available at no charge to academic users and can be found at http://www.epistasis.org. PMID:19390666

  5. Power Analysis in Two-Level Unbalanced Designs

    ERIC Educational Resources Information Center

    Konstantopoulos, Spyros

    2010-01-01

    Previous work on statistical power has discussed mainly single-level designs or 2-level balanced designs with random effects. Although balanced experiments are common, in practice balance cannot always be achieved. Work on class size is one example of unbalanced designs. This study provides methods for power analysis in 2-level unbalanced designs…

  6. Factor analysis in optimization of formulation of high content uniformity tablets containing low dose active substance.

    PubMed

    Lukášová, Ivana; Muselík, Jan; Franc, Aleš; Goněc, Roman; Mika, Filip; Vetchý, David

    2017-11-15

    Warfarin is an intensively discussed drug with a narrow therapeutic range. There have been cases of bleeding attributed to varying content or altered quality of the active substance. Factor analysis is useful for finding suitable technological parameters leading to high content uniformity of tablets containing a low amount of active substance. The composition of the tabletting blend and the technological procedure were set with respect to factor analysis of previously published results. The correctness of the set parameters was checked by manufacturing and evaluating tablets containing 1-10 mg of warfarin sodium. The robustness of the suggested technology was checked using a "worst case scenario" and statistical evaluation of European Pharmacopoeia (EP) content uniformity limits with respect to the Bergum division and the process capability index (Cpk). To evaluate the quality of the active substance and tablets, a dissolution method was developed (water; EP apparatus II; 25 rpm), allowing for statistical comparison of dissolution profiles. The obtained results prove the suitability of factor analysis to optimize the composition with respect to previously manufactured batches, and thus the use of meta-analysis under industrial conditions is feasible. Copyright © 2017 Elsevier B.V. All rights reserved.
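
    The process capability index mentioned above compares the distance from the process mean to the nearer specification limit against three standard deviations; a sketch with invented assay values and hypothetical 95-105% content limits.

```python
from statistics import mean, stdev

def cpk(samples, lsl, usl):
    """Process capability index: distance from the sample mean to
    the nearer specification limit, in units of 3 sample standard
    deviations. Values well above 1.33 indicate a capable process."""
    m, s = mean(samples), stdev(samples)
    return min(usl - m, m - lsl) / (3 * s)

# invented assay values (% of label claim) vs hypothetical 95-105% limits
assays = [99.2, 100.1, 99.8, 100.4, 99.6, 100.0, 99.9, 100.3]
print(round(cpk(assays, 95.0, 105.0), 2))
```

    Cpk (unlike Cp) penalizes a process that is tight but off-center, which is why it is used alongside the Bergum approach for content uniformity.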

  7. Practical steganalysis of digital images: state of the art

    NASA Astrophysics Data System (ADS)

    Fridrich, Jessica; Goljan, Miroslav

    2002-04-01

    Steganography is the art of hiding the very presence of communication by embedding secret messages into innocuous looking cover documents, such as digital images. Detection of steganography, estimation of message length, and its extraction belong to the field of steganalysis. Steganalysis has recently received a great deal of attention both from law enforcement and the media. In our paper, we classify and review current stego-detection algorithms that can be used to trace popular steganographic products. We recognize several qualitatively different approaches to practical steganalysis - visual detection, detection based on first order statistics (histogram analysis), dual statistics methods that use spatial correlations in images and higher-order statistics (RS steganalysis), universal blind detection schemes, and special cases, such as JPEG compatibility steganalysis. We also present some new results regarding our previously proposed detection of LSB embedding using sensitive dual statistics. The recent steganalytic methods indicate that the most common paradigm in image steganography - the bit-replacement or bit substitution - is inherently insecure with safe capacities far smaller than previously thought.

  8. Evidence-Based Medicine as a Tool for Undergraduate Probability and Statistics Education

    PubMed Central

    Masel, J.; Humphrey, P. T.; Blackburn, B.; Levine, J. A.

    2015-01-01

    Most students have difficulty reasoning about chance events, and misconceptions regarding probability can persist or even strengthen following traditional instruction. Many biostatistics classes sidestep this problem by prioritizing exploratory data analysis over probability. However, probability itself, in addition to statistics, is essential both to the biology curriculum and to informed decision making in daily life. One area in which probability is particularly important is medicine. Given the preponderance of pre-health students, in addition to more general interest in medicine, we capitalized on students' intrinsic motivation in this area to teach both probability and statistics. We use the randomized controlled trial as the centerpiece of the course, because it exemplifies the most salient features of the scientific method, and the application of critical thinking to medicine. The other two pillars of the course are biomedical applications of Bayes' theorem and science and society content. Backward design from these three overarching aims was used to select appropriate probability and statistics content, with a focus on eliciting and countering previously documented misconceptions in their medical context. Pretest/posttest assessments using the Quantitative Reasoning Quotient and Attitudes Toward Statistics instruments are positive, bucking several negative trends previously reported in statistics education. PMID:26582236
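
    The biomedical use of Bayes' theorem described above is commonly illustrated with diagnostic testing: when a disease is rare, even an accurate test yields mostly false positives. A minimal sketch with hypothetical test characteristics.

```python
def ppv(sensitivity, specificity, prevalence):
    """Positive predictive value via Bayes' theorem:
    P(disease | positive) = TP rate / (TP rate + FP rate)."""
    tp = sensitivity * prevalence
    fp = (1 - specificity) * (1 - prevalence)
    return tp / (tp + fp)

# a 99%-sensitive, 95%-specific test for a 1%-prevalence disease:
# only about 1 in 6 positive results is a true positive
print(round(ppv(0.99, 0.95, 0.01), 3))
```

    This base-rate effect is exactly the kind of documented misconception (confusing P(positive | disease) with P(disease | positive)) such a course targets.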

  9. Detailed Spectral Analysis of the 260 ks XMM-Newton Data of 1E 1207.4-5209 and Significance of a 2.1 keV Absorption Feature

    NASA Astrophysics Data System (ADS)

    Mori, Kaya; Chonko, James C.; Hailey, Charles J.

    2005-10-01

    We have reanalyzed the 260 ks XMM-Newton observation of 1E 1207.4-5209. There are several significant improvements over previous work. First, a much broader range of physically plausible spectral models was used. Second, we have used a more rigorous statistical analysis. The standard F-distribution was not employed, but rather the exact finite statistics F-distribution was determined by Monte Carlo simulations. This approach was motivated by the recent work of Protassov and coworkers and Freeman and coworkers. They demonstrated that the standard F-distribution is not even asymptotically correct when applied to assess the significance of additional absorption features in a spectrum. With our improved analysis we do not find a third and fourth spectral feature in 1E 1207.4-5209 but only the two broad absorption features previously reported. Two additional statistical tests, one line model dependent and the other line model independent, confirmed our modified F-test analysis. For all physically plausible continuum models in which the weak residuals are strong enough to fit, the residuals occur at the instrument Au M edge. As a sanity check we confirmed that the residuals are consistent in strength and position with the instrument Au M residuals observed in 3C 273.
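
    The Monte Carlo approach used in place of the standard F-distribution can be sketched generically: simulate the null hypothesis many times and read the p-value off the empirical distribution of the statistic. The variance-ratio statistic below is a stand-in, not the spectral-fit statistic of the paper.

```python
import random

def f_statistic(y1, y2):
    """Variance-ratio statistic between two samples (larger/smaller)."""
    def var(y):
        m = sum(y) / len(y)
        return sum((v - m) ** 2 for v in y) / (len(y) - 1)
    v1, v2 = var(y1), var(y2)
    return max(v1, v2) / min(v1, v2)

def monte_carlo_pvalue(observed, n1, n2, n_sim=5000, seed=1):
    """Empirical finite-statistics null distribution under H0 (both
    samples standard normal), rather than trusting the asymptotic
    F-distribution at small counts."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sim):
        y1 = [rng.gauss(0, 1) for _ in range(n1)]
        y2 = [rng.gauss(0, 1) for _ in range(n2)]
        if f_statistic(y1, y2) >= observed:
            hits += 1
    return (hits + 1) / (n_sim + 1)  # add-one rule avoids p = 0

p = monte_carlo_pvalue(observed=1.05, n1=8, n2=8)
print(p)
```

    A near-unity variance ratio is unremarkable at such small sample sizes, so the simulated p-value is large; the same logic, applied to nested spectral models, underlies the paper's corrected significance assessment.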

  10. Statistical mapping of zones of focused groundwater/surface-water exchange using fiber-optic distributed temperature sensing

    USGS Publications Warehouse

    Mwakanyamale, Kisa; Day-Lewis, Frederick D.; Slater, Lee D.

    2013-01-01

    Fiber-optic distributed temperature sensing (FO-DTS) increasingly is used to map zones of focused groundwater/surface-water exchange (GWSWE). Previous studies of GWSWE using FO-DTS involved identification of zones of focused GWSWE based on arbitrary cutoffs of FO-DTS time-series statistics (e.g., variance, cross-correlation between temperature and stage, or spectral power). New approaches are needed to extract more quantitative information from large, complex FO-DTS data sets while concurrently providing an assessment of uncertainty associated with mapping zones of focused GWSWE. Toward this end, we present a strategy combining discriminant analysis (DA) and spectral analysis (SA). We demonstrate the approach using field experimental data from a reach of the Columbia River adjacent to the Hanford 300 Area site. Results of the combined SA/DA approach are shown to be superior to previous results from qualitative interpretation of FO-DTS spectra alone.

  11. RepExplore: addressing technical replicate variance in proteomics and metabolomics data analysis.

    PubMed

    Glaab, Enrico; Schneider, Reinhard

    2015-07-01

    High-throughput omics datasets often contain technical replicates included to account for technical sources of noise in the measurement process. Although summarizing these replicate measurements by using robust averages may help to reduce the influence of noise on downstream data analysis, the information on the variance across the replicate measurements is lost in the averaging process and therefore typically disregarded in subsequent statistical analyses. We introduce RepExplore, a web service dedicated to exploiting the information captured in the technical replicate variance to provide more reliable and informative differential expression and abundance statistics for omics datasets. The software builds on previously published statistical methods, which have been applied successfully to biomedical omics data but are difficult to use without prior experience in programming or scripting. RepExplore facilitates the analysis by providing fully automated data processing and interactive ranking tables, whisker plot, heat map and principal component analysis visualizations to interpret omics data and derived statistics. Availability: freely available at http://www.repexplore.tk. Contact: enrico.glaab@uni.lu. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.

  12. Uncertainty Analysis of Inertial Model Attitude Sensor Calibration and Application with a Recommended New Calibration Method

    NASA Technical Reports Server (NTRS)

    Tripp, John S.; Tcheng, Ping

    1999-01-01

    Statistical tools, previously developed for nonlinear least-squares estimation of multivariate sensor calibration parameters and the associated calibration uncertainty analysis, have been applied to single- and multiple-axis inertial model attitude sensors used in wind tunnel testing to measure angle of attack and roll angle. The analysis provides confidence and prediction intervals of calibrated sensor measurement uncertainty as functions of applied input pitch and roll angles. A comparative performance study of various experimental designs for inertial sensor calibration is presented along with corroborating experimental data. The importance of replicated calibrations over extended time periods has been emphasized; replication provides independent estimates of calibration precision and bias uncertainties, statistical tests for calibration or modeling bias uncertainty, and statistical tests for sensor parameter drift over time. A set of recommendations for a new standardized model attitude sensor calibration method and usage procedures is included. The statistical information provided by these procedures is necessary for the uncertainty analysis of aerospace test results now required by users of industrial wind tunnel test facilities.

  13. Childbirth after pelvic floor surgery: analysis of Hospital Episode Statistics in England, 2002-2008.

    PubMed

    Pradhan, A; Tincello, D G; Kearney, R

    2013-01-01

    To report the numbers of patients having childbirth after pelvic floor surgery in England. Retrospective analysis of Hospital Episode Statistics data. Hospital Episode Statistics database. Women, aged 20-44 years, undergoing childbirth after pelvic floor surgery between the years 2002 and 2008. Analysis of the Hospital Episode Statistics database using Office of Population, Censuses and Surveys: Classification of Interventions and Procedures, 4th Revision (OPCS-4) code at the four-character level for pelvic floor surgery and delivery, in women aged 20-44 years, between the years 2002 and 2008. Numbers of women having delivery episodes after previous pelvic floor surgery, and numbers having further pelvic floor surgery after delivery. Six hundred and three women had a delivery episode after previous pelvic floor surgery in the time period 2002-2008. In this group of 603 women, 42 had a further pelvic floor surgery episode following delivery in the same time period. The incidence of repeat surgery episode following delivery was higher in the group delivered vaginally than in those delivered by caesarean (13.6 versus 4.4%; odds ratio, 3.38; 95% confidence interval, 1.87-6.10). There were 603 women having childbirth after pelvic floor surgery in the time period 2002-2008. The incidence of further pelvic floor surgery after childbirth was lower after caesarean delivery than after vaginal delivery, and this may indicate a protective effect of abdominal delivery. © 2012 The Authors BJOG An International Journal of Obstetrics and Gynaecology © 2012 RCOG.
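
    Odds ratios and their confidence intervals, as reported above, are computed on the log-odds scale; a sketch with invented 2 × 2 counts (not the paper's underlying data).

```python
from math import log, exp, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table [[a, b], [c, d]] with a Wald 95%
    confidence interval computed on the log scale (all cells > 0)."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo, hi = exp(log(or_) - z * se), exp(log(or_) + z * se)
    return or_, lo, hi

# invented counts: repeat surgery yes/no after vaginal vs caesarean delivery
or_, lo, hi = odds_ratio_ci(30, 190, 12, 371)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

    An interval whose lower bound stays above 1, as in the study's reported 3.38 (1.87-6.10), indicates an association unlikely to be due to chance alone.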

  14. Improved Statistics for Genome-Wide Interaction Analysis

    PubMed Central

    Ueki, Masao; Cordell, Heather J.

    2012-01-01

    Recently, Wu and colleagues [1] proposed two novel statistics for genome-wide interaction analysis using case/control or case-only data. In computer simulations, their proposed case/control statistic outperformed competing approaches, including the fast-epistasis option in PLINK and logistic regression analysis under the correct model; however, reasons for its superior performance were not fully explored. Here we investigate the theoretical properties and performance of Wu et al.'s proposed statistics and explain why, in some circumstances, they outperform competing approaches. Unfortunately, we find minor errors in the formulae for their statistics, resulting in tests that have higher than nominal type 1 error. We also find minor errors in PLINK's fast-epistasis and case-only statistics, although theory and simulations suggest that these errors have only negligible effect on type 1 error. We propose adjusted versions of all four statistics that, both theoretically and in computer simulations, maintain correct type 1 error rates under the null hypothesis. We also investigate statistics based on correlation coefficients that maintain similar control of type 1 error. Although designed to test specifically for interaction, we show that some of these previously-proposed statistics can, in fact, be sensitive to main effects at one or both loci, particularly in the presence of linkage disequilibrium. We propose two new “joint effects” statistics that, provided the disease is rare, are sensitive only to genuine interaction effects. In computer simulations we find, in most situations considered, that highest power is achieved by analysis under the correct genetic model. Such an analysis is unachievable in practice, as we do not know this model. However, generally high power over a wide range of scenarios is exhibited by our joint effects and adjusted Wu statistics. We recommend use of these alternative or adjusted statistics and urge caution when using Wu et al.'s originally-proposed statistics, on account of the inflated error rate that can result. PMID:22496670
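
As a generic point of reference for this class of tests (this is not Wu et al.'s statistic, nor PLINK's), a simple case-only interaction test asks whether two unlinked loci are correlated among cases: under the null of no interaction and no linkage disequilibrium, n·r² is approximately chi-square with 1 degree of freedom. A sketch with simulated genotypes:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated genotype scores (0/1/2 minor-allele copies) at two unlinked loci
# for n cases; under no interaction and no LD their correlation is near zero.
n = 5000
g1 = rng.binomial(2, 0.3, size=n)
g2 = rng.binomial(2, 0.3, size=n)

r = np.corrcoef(g1, g2)[0, 1]
chi2 = n * r ** 2      # approximately chi-square(1) under the null
```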

  15. Analysis and meta-analysis of single-case designs with a standardized mean difference statistic: a primer and applications.

    PubMed

    Shadish, William R; Hedges, Larry V; Pustejovsky, James E

    2014-04-01

    This article presents a d-statistic for single-case designs that is in the same metric as the d-statistic used in between-subjects designs such as randomized experiments and offers some reasons why such a statistic would be useful in SCD research. The d has a formal statistical development, is accompanied by appropriate power analyses, and can be estimated using user-friendly SPSS macros. We discuss both advantages and disadvantages of d compared to other approaches such as previous d-statistics, overlap statistics, and multilevel modeling. It requires at least three cases for computation and assumes normally distributed outcomes and stationarity, assumptions that are discussed in some detail. We also show how to test these assumptions. The core of the article then demonstrates in depth how to compute d for one study, including estimation of the autocorrelation and the ratio of between case variance to total variance (between case plus within case variance), how to compute power using a macro, and how to use the d to conduct a meta-analysis of studies using single-case designs in the free program R, including syntax in an appendix. This syntax includes how to read data, compute fixed and random effect average effect sizes, prepare a forest plot and a cumulative meta-analysis, estimate various influence statistics to identify studies contributing to heterogeneity and effect size, and do various kinds of publication bias analyses. This d may prove useful for both the analysis and meta-analysis of data from SCDs. Copyright © 2013 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.
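
The meta-analytic step the article carries out in R can be sketched in any language. Below is a minimal inverse-variance pooling of hypothetical d values with a DerSimonian-Laird random-effects variance; the effect sizes and variances are made up, and this is a generic sketch, not the authors' macros.

```python
from math import sqrt

# Hypothetical d estimates and their variances from several single-case studies.
d = [0.42, 0.61, 0.35, 0.50]
v = [0.020, 0.035, 0.015, 0.028]

# Fixed-effect pooling: inverse-variance weighted mean.
w = [1 / vi for vi in v]
d_fixed = sum(wi * di for wi, di in zip(w, d)) / sum(w)

# Random-effects: DerSimonian-Laird estimate of between-study variance tau^2.
k = len(d)
Q = sum(wi * (di - d_fixed) ** 2 for wi, di in zip(w, d))
C = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (Q - (k - 1)) / C)

w_re = [1 / (vi + tau2) for vi in v]
d_random = sum(wi * di for wi, di in zip(w_re, d)) / sum(w_re)
se_random = sqrt(1 / sum(w_re))
```

With these homogeneous made-up studies Q falls below its degrees of freedom, tau² truncates to zero, and the random-effects mean coincides with the fixed-effect mean.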

  16. Early-type galaxies in the Antlia cluster: catalogue and isophotal analysis

    NASA Astrophysics Data System (ADS)

    Calderón, Juan P.; Bassino, Lilia P.; Cellone, Sergio A.; Gómez, Matías

    2018-06-01

    We present a statistical isophotal analysis of 138 early-type galaxies in the Antlia cluster, located at a distance of ˜ 35 Mpc. The observational material consists of CCD images of four 36 × 36 arcmin2 fields obtained with the MOSAIC II camera at the Blanco 4-m telescope at Cerro Tololo Interamerican Observatory. Our present work supersedes previous Antlia studies in the sense that the covered area is four times larger, the limiting magnitude is MB ˜ -9.6 mag, and the surface photometry parameters of each galaxy are derived from Sérsic model fits extrapolated to infinity. In a previous companion study we focused on the scaling relations obtained by means of surface photometry; here we present the data on which that paper is based: the parameters of the isophotal fits, together with an isophotal analysis. For each galaxy, we derive isophotal shape parameters along the semimajor axis and search for correlations within different radial bins. Through extensive statistical tests, we also analyse the behaviour of these values against photometric and global parameters of the galaxies themselves. While some galaxies do display radial gradients in their ellipticity (ɛ) and/or their Fourier coefficients, differences in mean values between adjacent regions are not statistically significant. Regarding Fourier coefficients, dwarf galaxies usually display gradients between all adjacent regions, while non-dwarfs tend to show this behaviour just between the two outermost regions. Globally, there is no obvious correlation between Fourier coefficients and luminosity for the whole magnitude range (-12 ≳ MV ≳ -22); however, dwarfs display much higher dispersions at all radii.

  17. Benefits of a strategic national forest inventory to science and society: the USDA Forest Service Forest Inventory and Analysis program

    Treesearch

    J. D. Shaw

    2006-01-01

    Forest Inventory and Analysis, previously known as Forest Survey, is one of the oldest research and development programs in the USDA Forest Service. Statistically-based inventory efforts that started in Scandinavian countries in the...

  18. A comprehensive study on pavement edge line implementation.

    DOT National Transportation Integrated Search

    2014-04-01

    The previous 2011 study, Safety Improvement from Edge Lines on Rural Two-Lane Highways, analyzed the crash data of three years before and one year after edge line implementation by using the latest safety analysis statistical method. It concl...

  19. An overview of data acquisition, signal coding and data analysis techniques for MST radars

    NASA Technical Reports Server (NTRS)

    Rastogi, P. K.

    1986-01-01

    An overview is given of the data acquisition, signal processing, and data analysis techniques that are currently in use with high power MST/ST (mesosphere stratosphere troposphere/stratosphere troposphere) radars. This review supplements the works of Rastogi (1983) and Farley (1984) presented at previous MAP workshops. A general description is given of data acquisition and signal processing operations, which are characterized on the basis of their disparate time scales. Signal coding is then discussed, with a brief description of frequently used codes and their limitations; finally, several aspects of statistical data processing are considered, such as signal statistics, power spectrum and autocovariance analysis, and outlier removal techniques.

  20. Upgrade Summer Severe Weather Tool

    NASA Technical Reports Server (NTRS)

    Watson, Leela

    2011-01-01

    The goal of this task was to upgrade the existing severe weather database by adding observations from the 2010 warm season, update the verification dataset with results from the 2010 warm season, apply statistical logistic regression analysis to the database, and develop a new forecast tool. The AMU analyzed 7 stability parameters that showed the possibility of providing guidance in forecasting severe weather, calculated verification statistics for the Total Threat Score (TTS), and calculated warm season verification statistics for the 2010 season. The AMU also performed statistical logistic regression analysis on the 22-year severe weather database. The results indicated that the logistic regression equation did not show an increase in skill over the previously developed TTS. The equation showed less accuracy than TTS at predicting severe weather, little ability to distinguish between severe and non-severe weather days, and worse standard categorical accuracy measures and skill scores than TTS.
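
The categorical accuracy measures and skill scores mentioned above are conventionally computed from a 2x2 forecast/observation contingency table. A minimal sketch with hypothetical counts (not the AMU's verification data):

```python
# 2x2 contingency counts (hypothetical): a = hits, b = false alarms,
# c = misses, d = correct nulls.
a, b, c, d = 30, 20, 10, 140
n = a + b + c + d

pod = a / (a + c)            # probability of detection
far = b / (a + b)            # false alarm ratio
csi = a / (a + b + c)        # critical success index
# Heidke skill score: accuracy relative to random-chance agreement.
expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n
hss = (a + d - expected) / (n - expected)
```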

  1. Re-Analysis Report: Daylighting in Schools, Additional Analysis. Tasks 2.2.1 through 2.2.5.

    ERIC Educational Resources Information Center

    Heschong, Lisa; Elzeyadi, Ihab; Knecht, Carey

    This study expands and validates previous research that found a statistical correlation between the amount of daylight in elementary school classrooms and the performance of students on standardized math and reading tests. The researchers reanalyzed the 1997-1998 school year student performance data from the Capistrano Unified School District…

  2. Compositional Solution Space Quantification for Probabilistic Software Analysis

    NASA Technical Reports Server (NTRS)

    Borges, Mateus; Pasareanu, Corina S.; Filieri, Antonio; d'Amorim, Marcelo; Visser, Willem

    2014-01-01

    Probabilistic software analysis aims at quantifying how likely a target event is to occur during program execution. Current approaches rely on symbolic execution to identify the conditions to reach the target event and try to quantify the fraction of the input domain satisfying these conditions. Precise quantification is usually limited to linear constraints, while only approximate solutions can be provided in general through statistical approaches. However, statistical approaches may fail to converge to an acceptable accuracy within a reasonable time. We present a compositional statistical approach for the efficient quantification of solution spaces for arbitrarily complex constraints over bounded floating-point domains. The approach leverages interval constraint propagation to improve the accuracy of the estimation by focusing the sampling on the regions of the input domain containing the sought solutions. Preliminary experiments show significant improvement on previous approaches both in results accuracy and analysis time.
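
A stripped-down version of the core idea, hit-or-miss Monte Carlo over the full bounded domain versus sampling only a box that can contain solutions, can be sketched as follows. The constraint and the hand-derived interval restriction are assumptions for illustration; the authors' tool derives such boxes automatically via interval constraint propagation.

```python
import random

random.seed(1)

# Target event: x*x + y <= 1 over the bounded domain x, y in [0, 10].
def satisfies(x, y):
    return x * x + y <= 1.0

# Naive Monte Carlo over the full domain: almost all samples are wasted.
N = 100_000
hits = sum(satisfies(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(N))
p_naive = hits / N

# Interval reasoning: the constraint can only hold for x in [0,1], y in [0,1],
# a box covering 1/100 of the domain. Sample only there and rescale.
hits_box = sum(satisfies(random.uniform(0, 1), random.uniform(0, 1)) for _ in range(N))
p_focused = (hits_box / N) * (1.0 / 100.0)
```

Both estimators target the same probability (2/300 ≈ 0.0067 here), but the focused one spends its samples where solutions live, so its variance is far smaller for the same budget.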

  3. A benchmark for statistical microarray data analysis that preserves actual biological and technical variance.

    PubMed

    De Hertogh, Benoît; De Meulder, Bertrand; Berger, Fabrice; Pierre, Michael; Bareke, Eric; Gaigneaux, Anthoula; Depiereux, Eric

    2010-01-11

    Recent reanalysis of spike-in datasets underscored the need for new and more accurate benchmark datasets for statistical microarray analysis. We present here a fresh method using biologically-relevant data to evaluate the performance of statistical methods. Our novel method ranks the probesets from a dataset composed of publicly-available biological microarray data and extracts subset matrices with precise information/noise ratios. Our method can be used to determine the capability of different methods to better estimate variance for a given number of replicates. The mean-variance and mean-fold change relationships of the matrices revealed a closer approximation of biological reality. Performance analysis refined the results from benchmarks published previously. We show that the Shrinkage t test (close to Limma) was the best of the methods tested, except when two replicates were examined, where the Regularized t test and the Window t test performed slightly better. The R scripts used for the analysis are available at http://urbm-cluster.urbm.fundp.ac.be/~bdemeulder/.

  4. Consequences of common data analysis inaccuracies in CNS trauma injury basic research.

    PubMed

    Burke, Darlene A; Whittemore, Scott R; Magnuson, David S K

    2013-05-15

    The development of successful treatments for humans after traumatic brain or spinal cord injuries (TBI and SCI, respectively) requires animal research. This effort can be hampered when promising experimental results cannot be replicated because of incorrect data analysis procedures. To identify and hopefully avoid these errors in future studies, the articles in seven journals with the highest number of basic science central nervous system TBI and SCI animal research studies published in 2010 (N=125 articles) were reviewed for their data analysis procedures. After identifying the most common statistical errors, the implications of those findings were demonstrated by reanalyzing previously published data from our laboratories using the identified inappropriate statistical procedures, then comparing the two sets of results. Overall, 70% of the articles contained at least one type of inappropriate statistical procedure. The highest percentage involved incorrect post hoc t-tests (56.4%), followed by inappropriate parametric statistics (analysis of variance and t-test; 37.6%). Repeated Measures analysis was inappropriately missing in 52.0% of all articles and, among those with behavioral assessments, 58% were analyzed incorrectly. Reanalysis of our published data using the most common inappropriate statistical procedures resulted in a 14.1% average increase in significant effects compared to the original results. Specifically, an increase of 15.5% occurred with Independent t-tests and 11.1% after incorrect post hoc t-tests. Utilizing proper statistical procedures can allow more-definitive conclusions, facilitate replicability of research results, and enable more accurate translation of those results to the clinic.
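
The inflation produced by uncorrected post hoc t-tests is easy to quantify: with k independent comparisons each run at α = 0.05, the familywise error rate grows as 1 - (1 - α)^k, and a Bonferroni adjustment (α/k per test) restores it. A minimal illustration:

```python
# Familywise error rate for k independent tests at significance level alpha.
alpha, k = 0.05, 10

fwe_uncorrected = 1 - (1 - alpha) ** k          # chance of >= 1 false positive
alpha_bonferroni = alpha / k                    # per-test Bonferroni threshold
fwe_corrected = 1 - (1 - alpha_bonferroni) ** k # back near the nominal 0.05
```

Ten uncorrected comparisons push the familywise error rate to roughly 40%, which is one mechanism behind the spurious "significant effects" the reanalysis in this article demonstrates.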

  5. Selecting the "Best" Factor Structure and Moving Measurement Validation Forward: An Illustration.

    PubMed

    Schmitt, Thomas A; Sass, Daniel A; Chappelle, Wayne; Thompson, William

    2018-04-09

    Despite the broad literature base on factor analysis best practices, research seeking to evaluate a measure's psychometric properties frequently fails to consider or follow these recommendations. This leads to incorrect factor structures, numerous and often overly complex competing factor models and, perhaps most harmful, biased model results. Our goal is to demonstrate a practical and actionable process for factor analysis through (a) an overview of six statistical and psychometric issues and approaches to be aware of, investigate, and report when engaging in factor structure validation, along with a flowchart for recommended procedures to understand latent factor structures; (b) demonstrating these issues to provide a summary of the updated Posttraumatic Stress Disorder Checklist (PCL-5) factor models and a rationale for validation; and (c) conducting a comprehensive statistical and psychometric validation of the PCL-5 factor structure to demonstrate all the issues we described earlier. Considering previous research, the PCL-5 was evaluated using a sample of 1,403 U.S. Air Force remotely piloted aircraft operators with high levels of battlefield exposure. Previously proposed PCL-5 factor structures were not supported by the data, but instead a bifactor model is arguably more statistically appropriate.

  6. A Statistical Study of Eiscat Electron and Ion Temperature Measurements In The E-region

    NASA Astrophysics Data System (ADS)

    Hussey, G.; Haldoupis, C.; Schlegel, K.; Bösinger, T.

    Motivated by the large EISCAT data base, which covers over 15 years of common programme operation, and previous statistical work with EISCAT data (e.g., C. Haldoupis, K. Schlegel, and G. Hussey, Auroral E-region electron density gradients measured with EISCAT, Ann. Geophysicae, 18, 1172-1181, 2000), a detailed statistical analysis of electron and ion EISCAT temperature measurements has been undertaken. This study was specifically concerned with the statistical dependence of heating events on other ambient parameters such as the electric field and electron density. The results showed previously reported dependences, such as the electron temperature being directly correlated with the ambient electric field and inversely related to the electron density. However, these correlations were found to be also dependent upon altitude. There was also evidence of the so called "Schlegel effect" (K. Schlegel, Reduced effective recombination coefficient in the disturbed polar E-region, J. Atmos. Terr. Phys., 44, 183-185, 1982); that is, the heated electron gas leads to increases in electron density through a reduction in the recombination rate. This paper will present the statistical heating results and attempt to offer physical explanations and interpretations of the findings.

  7. IDENTIFICATION OF REGIME SHIFTS IN TIME SERIES USING NEIGHBORHOOD STATISTICS

    EPA Science Inventory

    The identification of alternative dynamic regimes in ecological systems requires several lines of evidence. Previous work on time series analysis of dynamic regimes includes mainly model-fitting methods. We introduce two methods that do not use models. These approaches use state-...

  8. Automatic Classification of Medical Text: The Influence of Publication Form

    PubMed Central

    Cole, William G.; Michael, Patricia A.; Stewart, James G.; Blois, Marsden S.

    1988-01-01

    Previous research has shown that within the domain of medical journal abstracts the statistical distribution of words is neither random nor uniform, but is highly characteristic. Many words are used mainly or solely by one medical specialty or when writing about one particular level of description. Due to this regularity of usage, automatic classification within journal abstracts has proved quite successful. The present research asks two further questions. It investigates whether this statistical regularity and automatic classification success can also be achieved in medical textbook chapters. It then goes on to see whether the statistical distribution found in textbooks is sufficiently similar to that found in abstracts to permit accurate classification of abstracts based solely on previous knowledge of textbooks. 14 textbook chapters and 45 MEDLINE abstracts were submitted to an automatic classification program that had been trained only on chapters drawn from a standard textbook series. Statistical analysis of the properties of abstracts vs. chapters revealed important differences in word use. Automatic classification performance was good for chapters, but poor for abstracts.

  9. An analysis of tire tread wear groove patterns and the effect of heteroscedasticity on tire tread wear statistics

    DOT National Transportation Integrated Search

    1985-09-01

    This report examines the groove wear variability among tires subjected to the Uniform Tire Quality Grading (UTQC) test procedure for determining tire tread wear. The effects of heteroscedasticity (variable variance) on a previously reported sta...

  10. LANDSCAPE STRUCTURE AND ESTUARINE CONDITION IN THE MID-ATLANTIC REGION OF THE UNITED STATES: I. DEVELOPING QUANTITATIVE RELATIONSHIPS

    EPA Science Inventory

    In a previously published study, quantitative relationships were developed between landscape metrics and sediment contamination for 25 small estuarine systems within Chesapeake Bay. Nonparametric statistical analysis (rank transformation) was used to develop an empirical relation...

  11. Multivariate analysis of fears in dental phobic patients according to a reduced FSS-II scale.

    PubMed

    Hakeberg, M; Gustafsson, J E; Berggren, U; Carlsson, S G

    1995-10-01

    This study analyzed and assessed dimensions of a questionnaire developed to measure general fears and phobias. A previous factor analysis among 109 dental phobics had revealed a five-factor structure with 22 items and an explained total variance of 54%. The present study analyzed the same material using a multivariate statistical procedure (LISREL) to reveal structural latent variables. The LISREL analysis, based on the correlation matrix, yielded a chi-square of 216.6 with 195 degrees of freedom (P = 0.138) and showed a model with seven latent variables. One was a general fear factor correlated to all 22 items. The other six factors concerned "Illness & Death" (5 items), "Failures & Embarrassment" (5 items), "Social situations" (5 items), "Physical injuries" (4 items), "Animals & Natural phenomena" (4 items). One item (opposite sex) was included in both "Failures & Embarrassment" and "Social situations". The last factor, "Social interaction", combined all the items in "Failures & Embarrassment" and "Social situations" (9 items). In conclusion, this multivariate statistical analysis (LISREL) revealed and confirmed a factor structure similar to our previous study, but added two important dimensions not shown with a traditional factor analysis. This reduced FSS-II version measures general fears and phobias and may be used on a routine clinical basis as well as in dental phobia research.

  12. A Third Moment Adjusted Test Statistic for Small Sample Factor Analysis.

    PubMed

    Lin, Johnny; Bentler, Peter M

    2012-01-01

    Goodness of fit testing in factor analysis is based on the assumption that the test statistic is asymptotically chi-square; but this property may not hold in small samples even when the factors and errors are normally distributed in the population. Robust methods such as Browne's asymptotically distribution-free method and Satorra and Bentler's mean-scaling statistic were developed under the presumption of non-normality in the factors and errors. This paper finds a new application to the case where factors and errors are normally distributed in the population but the skewness of the obtained test statistic is still high due to sampling error in the observed indicators. An extension of the Satorra-Bentler statistic is proposed that not only scales the mean but also adjusts the degrees of freedom based on the skewness of the obtained test statistic in order to improve its robustness under small samples. A simple simulation study shows that this third moment adjusted statistic asymptotically performs on par with previously proposed methods, and at a very small sample size offers superior Type I error rates under a properly specified model. Data from Mardia, Kent and Bibby's study of students tested for their ability in five content areas that were either open or closed book were used to illustrate the real-world performance of this statistic.

  13. The Simpson's paradox unraveled

    PubMed Central

    Hernán, Miguel A; Clayton, David; Keiding, Niels

    2011-01-01

    Background In a famous article, Simpson described a hypothetical data example that led to apparently paradoxical results. Methods We make the causal structure of Simpson's example explicit. Results We show how the paradox disappears when the statistical analysis is appropriately guided by subject-matter knowledge. We also review previous explanations of Simpson's paradox that attributed it to two distinct phenomena: confounding and non-collapsibility. Conclusion Analytical errors may occur when the problem is stripped of its causal context and analyzed merely in statistical terms. PMID:21454324
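
The reversal can be reproduced with a small made-up two-stratum dataset (all counts hypothetical): treatment A has the higher success rate within every stratum, yet the lower rate after aggregation, because the strata are unevenly distributed across treatments.

```python
# (successes, total) per treatment within each hypothetical stratum.
strata = {
    "mild":   {"A": (81, 87),   "B": (234, 270)},
    "severe": {"A": (192, 263), "B": (55, 80)},
}

def rate(successes, total):
    return successes / total

# A wins within every stratum...
within = all(rate(*s["A"]) > rate(*s["B"]) for s in strata.values())

# ...but B wins after aggregation over strata.
tot_A = tuple(map(sum, zip(*(s["A"] for s in strata.values()))))
tot_B = tuple(map(sum, zip(*(s["B"] for s in strata.values()))))
reversed_overall = rate(*tot_A) < rate(*tot_B)
```

Whether the stratified or the aggregated comparison answers the causal question depends on the causal structure, which is exactly the article's point.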

  14. Transition-Region Ultraviolet Explosive Events in IRIS Si IV: A Statistical Analysis

    NASA Astrophysics Data System (ADS)

    Bartz, Allison

    2018-01-01

    Explosive events (EEs) in the solar transition region are characterized by broad, non-Gaussian line profiles with wings at Doppler velocities exceeding the speed of sound. We present a statistical analysis of 23 IRIS (Interface Region Imaging Spectrograph) sit-and-stare observations, observed between April 2014 and March 2017. Using the IRIS Si IV 1394 Å and 1403 Å spectral windows and the 1400 Å Slit Jaw images we have identified 581 EEs. We found that most EEs last less than 20 min and have a spatial scale on the slit less than 10”, agreeing with measurements in previous work. We observed most EEs in active regions, regardless of date of observation, but selection bias of IRIS observations cannot be ruled out. We also present preliminary findings of optical depth effects from our statistical study.
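
The supersonic-wing criterion can be made concrete with the non-relativistic Doppler relation v = c·Δλ/λ0. The rest wavelength below is the standard Si IV 1394 value; the shift is a made-up example, not a measurement from this study.

```python
C_KM_S = 299_792.458       # speed of light, km/s
LAMBDA0 = 1393.755         # Si IV resonance line rest wavelength, Angstrom

def doppler_velocity(delta_lambda_angstrom):
    """Line-of-sight velocity (km/s) for a wavelength shift; positive = redshift."""
    return C_KM_S * delta_lambda_angstrom / LAMBDA0

# A wing feature ~0.465 Angstrom from line centre corresponds to ~100 km/s,
# well above the transition-region sound speed of a few tens of km/s.
v = doppler_velocity(0.465)
```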

  15. Survival of dental implants placed in sites of previously failed implants.

    PubMed

    Chrcanovic, Bruno R; Kisch, Jenö; Albrektsson, Tomas; Wennerberg, Ann

    2017-11-01

    To assess the survival of dental implants placed in sites of previously failed implants and to explore the possible factors that might affect the outcome of this reimplantation procedure. Patients that had failed dental implants, which were replaced with the same implant type at the same site, were included. Descriptive statistics were used to describe the patients and implants; survival analysis was also performed. The effect of systemic, environmental, and local factors on the survival of the reoperated implants was evaluated. 175 of 10,096 implants in 98 patients were replaced by another implant at the same location (159, 14, and 2 implants at second, third, and fourth surgeries, respectively). Newly replaced implants were generally of similar diameter but of shorter length compared to the previously placed fixtures. A statistically significant greater percentage of lost implants were placed in sites with low bone quantity. There was a statistically significant difference (P = 0.032) in the survival rates between implants that were inserted for the first time (94%) and implants that replaced the ones lost (73%). There was a statistically higher failure rate of the reoperated implants for patients taking antidepressants and antithrombotic agents. Dental implants replacing failed implants had lower survival rates than the rates reported for the previous attempts of implant placement. It is suggested that a site-specific negative effect may possibly be associated with this phenomenon, as well as the intake of antidepressants and antithrombotic agents. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  16. Hydrochemical evolution and groundwater flow processes in the Galilee and Eromanga basins, Great Artesian Basin, Australia: a multivariate statistical approach.

    PubMed

    Moya, Claudio E; Raiber, Matthias; Taulis, Mauricio; Cox, Malcolm E

    2015-03-01

    The Galilee and Eromanga basins are sub-basins of the Great Artesian Basin (GAB). In this study, a multivariate statistical approach (hierarchical cluster analysis, principal component analysis and factor analysis) is carried out to identify hydrochemical patterns and assess the processes that control hydrochemical evolution within key aquifers of the GAB in these basins. The results of the hydrochemical assessment are integrated into a 3D geological model (previously developed) to support the analysis of spatial patterns of hydrochemistry, and to identify the hydrochemical and hydrological processes that control hydrochemical variability. In this area of the GAB, the hydrochemical evolution of groundwater is dominated by evapotranspiration near the recharge area resulting in a dominance of the Na-Cl water types. This is shown conceptually using two selected cross-sections which represent discrete groundwater flow paths from the recharge areas to the deeper parts of the basins. With increasing distance from the recharge area, a shift towards a dominance of carbonate (e.g. Na-HCO3 water type) has been observed. The assessment of hydrochemical changes along groundwater flow paths highlights how aquifers are separated in some areas, and how mixing between groundwater from different aquifers occurs elsewhere controlled by geological structures, including between GAB aquifers and coal bearing strata of the Galilee Basin. The results of this study suggest that distinct hydrochemical differences can be observed within the previously defined Early Cretaceous-Jurassic aquifer sequence of the GAB. A revision of the two previously recognised hydrochemical sequences is proposed, resulting in three hydrochemical sequences based on systematic differences in hydrochemistry, salinity and dominant hydrochemical processes. The integrated approach presented in this study, which combines different complementary multivariate statistical techniques with a detailed assessment of the geological framework of these sedimentary basins, can be adopted in other complex multi-aquifer systems to assess hydrochemical evolution and its geological controls. Copyright © 2014 Elsevier B.V. All rights reserved.
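
The principal-component step of such a workflow reduces to an eigendecomposition of the correlation matrix of standardized concentrations. The sketch below uses made-up Na/Cl/HCO3 values and generic PCA, not the study's data or pipeline.

```python
import numpy as np

# Made-up concentrations (rows = samples; columns = Na, Cl, HCO3), used only
# to illustrate the principal-component step of a hydrochemical workflow.
X = np.array([
    [120.0, 180.0, 40.0],
    [150.0, 230.0, 45.0],
    [ 90.0, 130.0, 60.0],
    [200.0, 310.0, 50.0],
    [ 60.0,  80.0, 75.0],
])

# Standardize each analyte, then eigendecompose the correlation structure.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
R = np.cov(Z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()   # variance share of each component
scores = Z @ eigvecs                  # sample coordinates in PC space
```

Because Na and Cl co-vary strongly (and HCO3 runs opposite to them) in this toy dataset, the first component dominates, mirroring how a salinity-driven axis typically emerges in real hydrochemical PCA.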

  17. A comparative analysis of the statistical properties of large mobile phone calling networks.

    PubMed

    Li, Ming-Xia; Jiang, Zhi-Qiang; Xie, Wen-Jie; Miccichè, Salvatore; Tumminello, Michele; Zhou, Wei-Xing; Mantegna, Rosario N

    2014-05-30

    Mobile phone calling is one of the most widely used communication methods in modern society. The records of calls among mobile phone users provide a valuable proxy for the understanding of human communication patterns embedded in social networks. Mobile phone users call each other forming a directed calling network. If only reciprocal calls are considered, we obtain an undirected mutual calling network. The preferential communication behavior between two connected users can be statistically tested and it results in two Bonferroni networks with statistically validated edges. We perform a comparative analysis of the statistical properties of these four networks, which are constructed from the calling records of more than nine million individuals in Shanghai over a period of 110 days. We find that these networks share many common structural properties and also exhibit idiosyncratic features when compared with previously studied large mobile calling networks. The empirical findings provide an intriguing picture of a representative large social network that might shed new light on the modelling of large social networks.
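
The distinction between the directed calling network and the mutual (reciprocal-call) network reduces to a filter over edge pairs. A toy sketch with a handful of call records; the study's networks are built from the records of over nine million users in the same spirit.

```python
# Toy call records as (caller, callee) pairs; duplicates collapse into edges.
calls = [("a", "b"), ("b", "a"), ("a", "c"), ("c", "d"), ("d", "c"), ("a", "b")]

directed = set(calls)                        # directed calling network edges
mutual = {frozenset(e) for e in directed
          if (e[1], e[0]) in directed}       # keep only reciprocated pairs
```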

  18. Methods for Assessment of Memory Reactivation.

    PubMed

    Liu, Shizhao; Grosmark, Andres D; Chen, Zhe

    2018-04-13

    It has been suggested that reactivation of previously acquired experiences or stored information in declarative memories in the hippocampus and neocortex contributes to memory consolidation and learning. Understanding memory consolidation depends crucially on the development of robust statistical methods for assessing memory reactivation. To date, several statistical methods have been established for assessing memory reactivation based on bursts of ensemble neural spike activity during offline states. Using population-decoding methods, we propose a new statistical metric, the weighted distance correlation, to assess hippocampal memory reactivation (i.e., spatial memory replay) during quiet wakefulness and slow-wave sleep. The new metric can be combined with an unsupervised population decoding analysis, which is invariant to latent state labeling and allows us to detect statistical dependency beyond linearity in memory traces. We validate the new metric using two rat hippocampal recordings in spatial navigation tasks. Our proposed analysis framework may have a broader impact on assessing memory reactivations in other brain regions under different behavioral tasks.
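
The paper's weighted metric builds on the plain (unweighted) sample distance correlation of Székely, Rizzo and Bakirov, which, unlike Pearson's r, is zero in the population only under independence. A minimal NumPy version of the unweighted statistic, a sketch rather than the paper's weighted variant:

```python
import numpy as np

def distance_correlation(x, y):
    """Sample (unweighted) distance correlation of two 1-D samples."""
    x = np.asarray(x, dtype=float)[:, None]
    y = np.asarray(y, dtype=float)[:, None]
    a = np.abs(x - x.T)                                 # pairwise distances
    b = np.abs(y - y.T)
    A = a - a.mean(0) - a.mean(1)[:, None] + a.mean()   # double centering
    B = b - b.mean(0) - b.mean(1)[:, None] + b.mean()
    dcov2 = (A * B).mean()                              # squared dist. covariance
    return np.sqrt(dcov2 / np.sqrt((A * A).mean() * (B * B).mean()))

t = np.linspace(0, 2 * np.pi, 200)
dc_linear = distance_correlation(t, 3 * t + 1)     # exactly 1 for linear maps
dc_nonlinear = distance_correlation(t, np.cos(t))  # clearly positive...
pearson = np.corrcoef(t, np.cos(t))[0, 1]          # ...although r is ~0 here
```

The cosine example shows why such a metric "detects statistical dependency beyond linearity": Pearson's r vanishes by symmetry while the distance correlation stays well away from zero.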

  19. School Readiness Factor Analyzed.

    ERIC Educational Resources Information Center

    Brenner, Anton; Scott, Leland H.

    This paper is an empirical statistical analysis and interpretation of data relating to school readiness previously examined and reported on a theoretical basis. A total of 118 white, middle class children from six consecutive kindergarten groups in Dearborn, Michigan were tested with seven instruments, evaluated in terms of achievement, ability,…

  20. Higher order statistical analysis of /x/ in male speech.

    PubMed

    Orr, M C; Lithgow, B

    2005-03-01

    This paper presents a study of kurtosis analysis for the sound /x/ in male speech; /x/ is the sound of the 'o' at the end of words such as 'ago'. The sound analysed for this paper came from the Australian National Database of Spoken Language, specifically from male speaker 17. The /x/ was isolated and extracted from the database by the author in a quiet booth using standard multimedia software. A 5 millisecond window was used for the analysis, as it was previously shown by the author to be the most appropriate size for speech phoneme analysis. The significance of the research presented here is shown in the results, where the majority of coefficients had a platykurtic value (kurtosis between 0 and 3), as opposed to the previously held leptokurtic (kurtosis > 3) belief.
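The windowed kurtosis classification described here is straightforward to sketch. The snippet below assumes a 16 kHz sampling rate and the Pearson convention (normal distribution has kurtosis 3) implied by the thresholds quoted in the abstract; the function name is an illustration, not the author's code.

```python
import numpy as np
from scipy.stats import kurtosis

def classify_kurtosis(signal, fs=16000, window_ms=5):
    """Label each short analysis window of a speech signal as
    platykurtic (Pearson kurtosis < 3) or leptokurtic (>= 3)."""
    win = int(fs * window_ms / 1000)
    labels = []
    for start in range(0, len(signal) - win + 1, win):
        frame = signal[start:start + win]
        # fisher=False returns Pearson kurtosis, where the normal value is 3
        k = kurtosis(frame, fisher=False)
        labels.append('platykurtic' if k < 3 else 'leptokurtic')
    return labels
```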

  1. Dealing with missing standard deviation and mean values in meta-analysis of continuous outcomes: a systematic review.

    PubMed

    Weir, Christopher J; Butcher, Isabella; Assi, Valentina; Lewis, Stephanie C; Murray, Gordon D; Langhorne, Peter; Brady, Marian C

    2018-03-07

    Rigorous, informative meta-analyses rely on availability of appropriate summary statistics or individual participant data. For continuous outcomes, especially those with naturally skewed distributions, summary information on the mean or variability often goes unreported. While full reporting of original trial data is the ideal, we sought to identify methods for handling unreported mean or variability summary statistics in meta-analysis. We undertook two systematic literature reviews to identify methodological approaches used to deal with missing mean or variability summary statistics. Five electronic databases were searched, in addition to the Cochrane Colloquium abstract books and the Cochrane Statistics Methods Group mailing list archive. We also conducted cited reference searching and emailed topic experts to identify recent methodological developments. Details recorded included the description of the method, the information required to implement the method, any underlying assumptions and whether the method could be readily applied in standard statistical software. We provided a summary description of the methods identified, illustrating selected methods in example meta-analysis scenarios. For missing standard deviations (SDs), following screening of 503 articles, fifteen methods were identified in addition to those reported in a previous review. These included Bayesian hierarchical modelling at the meta-analysis level; summary statistic level imputation based on observed SD values from other trials in the meta-analysis; a practical approximation based on the range; and algebraic estimation of the SD based on other summary statistics. Following screening of 1124 articles for methods estimating the mean, one approximate Bayesian computation approach and three papers based on alternative summary statistics were identified. 
Illustrative meta-analyses showed that when replacing a missing SD the approximation using the range minimised loss of precision and generally performed better than omitting trials. When estimating missing means, a formula using the median, lower quartile and upper quartile performed best in preserving the precision of the meta-analysis findings, although in some scenarios, omitting trials gave superior results. Methods based on summary statistics (minimum, maximum, lower quartile, upper quartile, median) reported in the literature facilitate more comprehensive inclusion of randomised controlled trials with missing mean or variability summary statistics within meta-analyses.
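Two of the simplest approximations from this literature can be written in one line each: the range/4 rule for a missing SD, and a quartile-based estimate of a missing mean. These are illustrative formulas of the kind reviewed above, not necessarily the best performers in every scenario.

```python
def sd_from_range(minimum, maximum):
    """Rough SD approximation from the sample range (the range/4 rule)."""
    return (maximum - minimum) / 4.0

def mean_from_quartiles(q1, median, q3):
    """Approximate the mean from the median and quartiles; for symmetric
    data this reduces to the median."""
    return (q1 + median + q3) / 3.0
```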

  2. Toward improved analysis of concentration data: Embracing nondetects.

    PubMed

    Shoari, Niloofar; Dubé, Jean-Sébastien

    2018-03-01

    Various statistical tests on concentration data serve to support decision-making regarding characterization and monitoring of contaminated media, assessing exposure to a chemical, and quantifying the associated risks. However, the routine statistical protocols cannot be directly applied because of challenges arising from nondetects or left-censored observations, which are concentration measurements below the detection limit of measuring instruments. Despite the existence of techniques based on survival analysis that can adjust for nondetects, these are seldom taken into account properly. A comprehensive review of the literature showed that managing policies regarding analysis of censored data do not always agree and that guidance from regulatory agencies may be outdated. Therefore, researchers and practitioners commonly resort to the most convenient way of tackling the censored data problem by substituting nondetects with arbitrary constants prior to data analysis, although this is generally regarded as a bias-prone approach. Hoping to improve the interpretation of concentration data, the present article aims to familiarize researchers in different disciplines with the significance of left-censored observations and provides theoretical and computational recommendations (under both frequentist and Bayesian frameworks) for adequate analysis of censored data. In particular, the present article synthesizes key findings from previous research with respect to 3 noteworthy aspects of inferential statistics: estimation of descriptive statistics, hypothesis testing, and regression analysis. Environ Toxicol Chem 2018;37:643-656. © 2017 SETAC.
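As an illustration of the survival-analysis style of approach, the sketch below fits a lognormal distribution to left-censored concentrations by maximizing the censored likelihood, using only the detected values and the number of nondetects. It is a minimal sketch under the lognormality assumption; the function name and interface are assumptions, not a recommendation from the article.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

def censored_lognormal_mle(detects, n_censored, detection_limit):
    """ML estimate of (mu, sigma) of a lognormal, given detected values plus
    n_censored observations known only to lie below the detection limit."""
    logs = np.log(np.asarray(detects))
    logdl = np.log(detection_limit)

    def neg_loglik(params):
        mu, log_sigma = params
        sigma = np.exp(log_sigma)                        # keep sigma positive
        ll = norm.logpdf(logs, mu, sigma).sum()          # detected values
        ll += n_censored * norm.logcdf(logdl, mu, sigma) # nondetects
        return -ll

    res = minimize(neg_loglik, x0=[logs.mean(), 0.0])
    return res.x[0], float(np.exp(res.x[1]))
```

Unlike substitution of nondetects by an arbitrary constant, this uses only what is actually known about each censored observation.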

  3. Network Meta-Analysis Using R: A Review of Currently Available Automated Packages

    PubMed Central

    Neupane, Binod; Richer, Danielle; Bonner, Ashley Joel; Kibret, Taddele; Beyene, Joseph

    2014-01-01

    Network meta-analysis (NMA) – a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously – has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely, gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools, are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined provide users with nearly all functionality that might be desired when conducting an NMA. PMID:25541687

  4. Network meta-analysis using R: a review of currently available automated packages.

    PubMed

    Neupane, Binod; Richer, Danielle; Bonner, Ashley Joel; Kibret, Taddele; Beyene, Joseph

    2014-01-01

    Network meta-analysis (NMA)--a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously--has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely, gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools, are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined provide users with nearly all functionality that might be desired when conducting an NMA.

  5. Conditional statistics in a turbulent premixed flame derived from direct numerical simulation

    NASA Technical Reports Server (NTRS)

    Mantel, Thierry; Bilger, Robert W.

    1994-01-01

    The objective of this paper is to briefly introduce conditional moment closure (CMC) methods for premixed systems and to derive the transport equation for the conditional species mass fraction conditioned on the progress variable based on the enthalpy. Our statistical analysis will be based on the 3-D DNS database of Trouve and Poinsot available at the Center for Turbulence Research. The initial conditions and characteristics (turbulence, thermo-diffusive properties) as well as the numerical method utilized in the DNS of Trouve and Poinsot are presented, and some details concerning our statistical analysis are also given. From the analysis of DNS results, the effects of the position in the flame brush and of the Damkoehler and Lewis numbers on the conditional mean scalar dissipation and conditional mean velocity are presented and discussed. Information concerning unconditional turbulent fluxes is also presented. The anomaly found in previous studies of counter-gradient diffusion for the turbulent flux of the progress variable is investigated.

  6. Waveform classification and statistical analysis of seismic precursors to the July 2008 Vulcanian Eruption of Soufrière Hills Volcano, Montserrat

    NASA Astrophysics Data System (ADS)

    Rodgers, Mel; Smith, Patrick; Pyle, David; Mather, Tamsin

    2016-04-01

    Understanding the transition between quiescence and eruption at dome-forming volcanoes, such as Soufrière Hills Volcano (SHV), Montserrat, is important for monitoring volcanic activity during long-lived eruptions. Statistical analysis of seismic events (e.g. spectral analysis and identification of multiplets via cross-correlation) can be useful for characterising seismicity patterns and can be a powerful tool for analysing temporal changes in behaviour. Waveform classification is crucial for volcano monitoring, but consistent classification, both during real-time analysis and for retrospective analysis of previous volcanic activity, remains a challenge. Automated classification allows consistent re-classification of events. We present a machine learning (random forest) approach to rapidly classify waveforms that requires minimal training data. We analyse the seismic precursors to the July 2008 Vulcanian explosion at SHV and show systematic changes in frequency content and multiplet behaviour that had not previously been recognised. These precursory patterns of seismicity may be interpreted as changes in pressure conditions within the conduit during magma ascent and could be linked to magma flow rates. Frequency analysis of the different waveform classes supports the growing consensus that LP and Hybrid events should be considered end members of a continuum of low-frequency source processes. By using both supervised and unsupervised machine-learning methods we investigate the nature of waveform classification and assess current classification schemes.

  7. Bayesian approach for counting experiment statistics applied to a neutrino point source analysis

    NASA Astrophysics Data System (ADS)

    Bose, D.; Brayeur, L.; Casier, M.; de Vries, K. D.; Golup, G.; van Eijndhoven, N.

    2013-12-01

    In this paper we present a model-independent analysis method following Bayesian statistics to analyse data from a generic counting experiment, and apply it to the search for neutrinos from point sources. We discuss a test statistic defined within a Bayesian framework that will be used in the search for a signal. In case no signal is found, we derive an upper limit without the introduction of approximations. The Bayesian approach allows us to obtain the full probability density function for both the background and the signal rate, and as such we have direct access to any signal upper limit. The upper limit derivation compares directly with a frequentist approach and is robust in the case of low-counting observations. Furthermore, it also allows previous upper limits obtained by other analyses to be taken into account via the concept of prior information, without the need for the ad hoc application of trial factors. To investigate the validity of the presented Bayesian approach, we have applied this method to the public IceCube 40-string configuration data for 10 nearby blazars and we have obtained a flux upper limit, which is in agreement with the upper limits determined via a frequentist approach. Furthermore, the upper limit obtained compares well with the previously published result of IceCube, using the same data set.
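For the simplest version of such a counting experiment, the posterior and the upper limit can be computed directly. The sketch below assumes a flat prior on the signal rate s >= 0 and a known expected background b, so the posterior is proportional to exp(-(s+b))·(s+b)^n; it is a generic illustration of the setup, not the paper's exact test statistic.

```python
import numpy as np
from scipy.integrate import quad

def bayesian_upper_limit(n_obs, background, cl=0.90):
    """Credible upper limit on a Poisson signal rate s, given n_obs observed
    counts and a known expected background, under a flat prior on s >= 0."""
    post = lambda s: np.exp(-(s + background)) * (s + background) ** n_obs
    norm_const, _ = quad(post, 0, np.inf)
    # scan upward until the integrated posterior mass reaches the credible level
    s = 0.0
    while quad(post, 0, s)[0] / norm_const < cl:
        s += 0.01
    return s
```

For zero observed counts and zero background this reproduces the textbook 90% limit of -ln(0.1) ≈ 2.3 events.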

  8. A Weibull statistics-based lignocellulose saccharification model and a built-in parameter accurately predict lignocellulose hydrolysis performance.

    PubMed

    Wang, Mingyu; Han, Lijuan; Liu, Shasha; Zhao, Xuebing; Yang, Jinghua; Loh, Soh Kheang; Sun, Xiaomin; Zhang, Chenxi; Fang, Xu

    2015-09-01

    Renewable energy from lignocellulosic biomass has been deemed an alternative to depleting fossil fuels. In order to improve this technology, we aim to develop robust mathematical models for the enzymatic lignocellulose degradation process. By analyzing 96 groups of previously published and newly obtained lignocellulose saccharification results and fitting them to the Weibull distribution, we discovered that Weibull statistics can accurately predict lignocellulose saccharification data, regardless of the type of substrate, enzyme and saccharification conditions. A mathematical model for enzymatic lignocellulose degradation was subsequently constructed based on Weibull statistics. Further analysis of the mathematical structure of the model and of experimental saccharification data showed the significance of the two parameters in this model. In particular, the λ value, defined as the characteristic time, represents the overall performance of the saccharification system. This suggestion was further supported by statistical analysis of experimental saccharification data and by analysis of glucose production levels as the λ and n values change. In conclusion, the constructed Weibull statistics-based model can accurately predict lignocellulose hydrolysis behavior, and the λ parameter can be used to assess the overall performance of enzymatic lignocellulose degradation. Advantages and potential applications of the model and the λ value in saccharification performance assessment are discussed. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
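A two-parameter Weibull-type saccharification curve of the kind described can be fitted in a few lines. The cumulative form y_max·(1 − exp(−(t/λ)^n)) and the time-course numbers below are assumptions chosen for illustration, not data from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_conversion(t, y_max, lam, n):
    """Weibull-type saccharification curve: conversion rises toward y_max,
    with lam the characteristic time and n the shape parameter."""
    return y_max * (1.0 - np.exp(-(t / lam) ** n))

# Hypothetical time course (hours vs. fractional glucose yield)
t = np.array([2, 4, 8, 12, 24, 48, 72], dtype=float)
y = weibull_conversion(t, 0.85, 10.0, 0.9)

params, _ = curve_fit(weibull_conversion, t, y, p0=[1.0, 5.0, 1.0])
y_max_hat, lam_hat, n_hat = params
```

Here λ (lam) is the time at which conversion reaches 1 − 1/e ≈ 63% of its plateau, which is why it summarizes the overall speed of the system.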

  9. Modelling multiple sources of dissemination bias in meta-analysis.

    PubMed

    Bowden, Jack; Jackson, Dan; Thompson, Simon G

    2010-03-30

    Asymmetry in the funnel plot for a meta-analysis suggests the presence of dissemination bias. This may be caused by publication bias through the decisions of journal editors, by selective reporting of research results by authors or by a combination of both. Typically, study results that are statistically significant or have larger estimated effect sizes are more likely to appear in the published literature, hence giving a biased picture of the evidence-base. Previous statistical approaches for addressing dissemination bias have assumed only a single selection mechanism. Here we consider a more realistic scenario in which multiple dissemination processes, involving both the publishing authors and journals, are operating. In practical applications, the methods can be used to provide sensitivity analyses for the potential effects of multiple dissemination biases operating in meta-analysis.

  10. The Novice-Expert Continuum in Astronomy Knowledge

    ERIC Educational Resources Information Center

    Bryce, T. G. K.; Blown, E. J.

    2012-01-01

    The nature of expertise in astronomy was investigated across a broad spectrum of ages and experience in China and New Zealand. Five hypotheses (capable of quantification and statistical analysis) were used to probe types of expertise identified by previous researchers: (a) domain-specific knowledge-skill in the use of scientific vocabulary and…

  11. Season of Birth in Autism: A Fiction Revisited.

    ERIC Educational Resources Information Center

    Landau, Edwina C.; Cicchetti, Domenic V.; Klin, Ami; Volkmar, Fred R.

    1999-01-01

    This study attempted to replicate previously reported increases in birth rates in March and August for individuals with autism. Statistical analysis of 904 cases revealed no significant seasonal effect. Samples were subcategorized into verbal and mute groups and again results failed to support the seasonal hypothesis. (Author/DB)

  12. Statistical Analysis of PDF's for Na Released by Photons from Solid Surfaces

    NASA Astrophysics Data System (ADS)

    Gamborino, D.; Wurz, P.

    2018-05-01

    We analyse the adequacy of three model speed PDF's previously used to describe the desorption of Na from a solid surface either by ESD or PSD. We found that the Maxwell PDF is too wide compared to measurements and non-thermal PDF's are better suited.

  13. A multi-scale analysis of landscape statistics

    Treesearch

    Douglas H. Cain; Kurt H. Riitters; Kenneth Orvis

    1997-01-01

    It is now feasible to monitor some aspects of landscape ecological condition nationwide using remotely- sensed imagery and indicators of land cover pattern. Previous research showed redundancies among many reported pattern indicators and identified six unique dimensions of land cover pattern. This study tested the stability of those dimensions and representative...

  14. Bayesian analysis of spatially-dependent functional responses with spatially-dependent multi-dimensional functional predictors

    USDA-ARS?s Scientific Manuscript database

    Recent advances in technology have led to the collection of high-dimensional data not previously encountered in many scientific environments. As a result, scientists are often faced with the challenging task of including these high-dimensional data into statistical models. For example, data from sen...

  15. Development of a Bayesian Belief Network Runway Incursion and Excursion Model

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.

    2014-01-01

    In a previous work, a statistical analysis of runway incursion (RI) event data was conducted to ascertain the relevance of this data to the top ten Technical Challenges (TC) of the National Aeronautics and Space Administration (NASA) Aviation Safety Program (AvSP). The study revealed connections to several of the AvSP top ten TC and identified numerous primary causes and contributing factors of RI events. The statistical analysis served as the basis for developing a system-level Bayesian Belief Network (BBN) model for RI events, also previously reported. Through literature searches and data analysis, this RI event network has now been extended to also model runway excursion (RE) events. These RI and RE event networks have been further modified and vetted by a Subject Matter Expert (SME) panel. The combined system-level BBN model will allow NASA to generically model the causes of RI and RE events and to assess the effectiveness of technology products being developed under NASA funding. These products are intended to reduce the frequency of runway safety incidents/accidents, and to improve runway safety in general. The development and structure of the BBN for both RI and RE events are documented in this paper.

  16. Neandertal admixture in Eurasia confirmed by maximum-likelihood analysis of three genomes.

    PubMed

    Lohse, Konrad; Frantz, Laurent A F

    2014-04-01

    Although there has been much interest in estimating histories of divergence and admixture from genomic data, it has proved difficult to distinguish recent admixture from long-term structure in the ancestral population. Thus, recent genome-wide analyses based on summary statistics have sparked controversy about the possibility of interbreeding between Neandertals and modern humans in Eurasia. Here we derive the probability of full mutational configurations in nonrecombining sequence blocks under both admixture and ancestral structure scenarios. Dividing the genome into short blocks gives an efficient way to compute maximum-likelihood estimates of parameters. We apply this likelihood scheme to triplets of human and Neandertal genomes and compare the relative support for a model of admixture from Neandertals into Eurasian populations after their expansion out of Africa against a history of persistent structure in their common ancestral population in Africa. Our analysis allows us to conclusively reject a model of ancestral structure in Africa and instead reveals strong support for Neandertal admixture in Eurasia at a higher rate (3.4-7.3%) than suggested previously. Using analysis and simulations we show that our inference is more powerful than previous summary statistics and robust to realistic levels of recombination.

  17. Neandertal Admixture in Eurasia Confirmed by Maximum-Likelihood Analysis of Three Genomes

    PubMed Central

    Lohse, Konrad; Frantz, Laurent A. F.

    2014-01-01

    Although there has been much interest in estimating histories of divergence and admixture from genomic data, it has proved difficult to distinguish recent admixture from long-term structure in the ancestral population. Thus, recent genome-wide analyses based on summary statistics have sparked controversy about the possibility of interbreeding between Neandertals and modern humans in Eurasia. Here we derive the probability of full mutational configurations in nonrecombining sequence blocks under both admixture and ancestral structure scenarios. Dividing the genome into short blocks gives an efficient way to compute maximum-likelihood estimates of parameters. We apply this likelihood scheme to triplets of human and Neandertal genomes and compare the relative support for a model of admixture from Neandertals into Eurasian populations after their expansion out of Africa against a history of persistent structure in their common ancestral population in Africa. Our analysis allows us to conclusively reject a model of ancestral structure in Africa and instead reveals strong support for Neandertal admixture in Eurasia at a higher rate (3.4−7.3%) than suggested previously. Using analysis and simulations we show that our inference is more powerful than previous summary statistics and robust to realistic levels of recombination. PMID:24532731

  18. Living Animals in the Classroom: A Meta-Analysis on Learning Outcome and a Treatment-Control Study Focusing on Knowledge and Motivation

    ERIC Educational Resources Information Center

    Hummel, Eberhard; Randler, Christoph

    2012-01-01

    Prior research states that the use of living animals in the classroom leads to a higher knowledge but those previous studies have methodological and statistical problems. We applied a meta-analysis and developed a treatment-control study in a middle school classroom. The treatments (film vs. living animal) differed only by the presence of the…

  19. Contour plot assessment of existing meta-analyses confirms robust association of statin use and acute kidney injury risk.

    PubMed

    Chevance, Aurélie; Schuster, Tibor; Steele, Russell; Ternès, Nils; Platt, Robert W

    2015-10-01

    Robustness of an existing meta-analysis can justify decisions on whether to conduct an additional study addressing the same research question. We illustrate the graphical assessment of the potential impact of an additional study on an existing meta-analysis using published data on statin use and the risk of acute kidney injury. A previously proposed graphical augmentation approach is used to assess the sensitivity of the current test and heterogeneity statistics extracted from existing meta-analysis data. In addition, we extended the graphical augmentation approach to assess potential changes in the pooled effect estimate after updating a current meta-analysis and applied the three graphical contour definitions to data from meta-analyses on statin use and acute kidney injury risk. In the considered example data, the pooled effect estimates and heterogeneity indices proved to be considerably robust to the addition of a future study. Moreover, for some previously inconclusive meta-analyses, a study update might yield a statistically significant increase in kidney injury risk associated with higher statin exposure. The illustrated contour approach should become a standard tool for the assessment of the robustness of meta-analyses. It can guide decisions on whether to conduct additional studies addressing a relevant research question. Copyright © 2015 Elsevier Inc. All rights reserved.

  20. Re-Evaluation of Event Correlations in Virtual California Using Statistical Analysis

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Heflin, M. B.; Granat, R. A.; Yikilmaz, M. B.; Heien, E.; Rundle, J.; Donnellan, A.

    2010-12-01

    Fusing the results of simulation tools with statistical analysis methods has contributed to a better understanding of the earthquake process. In a previous study, we used a statistical method to investigate emergent phenomena in data produced by the Virtual California earthquake simulator. The analysis indicated that there were some interesting fault interactions and possible triggering and quiescence relationships between events. We have converted the original code from Matlab to python/C++ and are now evaluating data from the most recent version of Virtual California in order to analyze and compare any new behavior exhibited by the model. The Virtual California earthquake simulator can be used to study fault and stress interaction scenarios for realistic California earthquakes. The simulation generates a synthetic earthquake catalog of events with a minimum size of ~M 5.8 that can be evaluated using statistical analysis methods. Virtual California utilizes realistic fault geometries and a simple Amontons-Coulomb stick and slip friction law in order to drive the earthquake process by means of a back-slip model, where loading of each segment occurs due to the accumulation of a slip deficit at the prescribed slip rate of the segment. Like any complex system, Virtual California may generate emergent phenomena unexpected even by its designers. In order to investigate this, we have developed a statistical method that analyzes the interaction between Virtual California fault elements and thereby determines whether events on any given fault element show correlated behavior. Our method examines events on one fault element and then determines whether there is an associated event within a specified time window on a second fault element. Note that an event in our analysis is defined as any time an element slips, rather than any particular “earthquake” along the entire fault length. 
Results are then tabulated and differenced with the expected correlation, calculated by assuming a uniform distribution of events in time. We generate a correlation score matrix, which indicates how weakly or strongly correlated each fault element is to every other in the course of the VC simulation. We calculate correlation scores by summing the difference between the actual and expected correlations over all time window lengths and normalizing by the time window size. The correlation score matrix can focus attention on the most interesting areas for more in-depth analysis of event correlation vs. time. The previous study included 59 faults (639 elements) in the model, comprising all the faults save the creeping section of the San Andreas. The analysis spanned 40,000 yrs of Virtual California-generated earthquake data. The newly revised VC model includes 70 faults, 8720 fault elements, and spans 110,000 years. Due to computational considerations, we will evaluate the elements comprising the southern California region, which our previous study indicated showed interesting fault interaction and event triggering/quiescence relationships.
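The pairwise scoring procedure described above can be sketched for a single pair of fault elements. The window lengths, the uniform-null expectation, and the function name below are simplifying assumptions for illustration, not the study's exact implementation.

```python
import numpy as np

def correlation_score(events_a, events_b, total_time, windows=(1.0, 5.0, 10.0)):
    """Score how strongly events on element b follow events on element a,
    relative to what uniformly distributed events on b would predict.
    Scores are summed over several window lengths, each normalized by the
    window size, and differenced against the uniform-null expectation."""
    events_b = np.asarray(events_b, dtype=float)
    score = 0.0
    for w in windows:
        # observed: a-events followed by at least one b-event within window w
        observed = sum(
            np.any((events_b > t) & (events_b <= t + w)) for t in events_a
        )
        # expected count if b's events were uniform in time
        expected = len(events_a) * min(1.0, len(events_b) * w / total_time)
        score += (observed - expected) / w
    return score
```

Applying this to every ordered pair of elements fills the correlation score matrix; large positive entries flag candidate triggering relationships, and negative entries flag quiescence.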

  1. Statistical analysis of effective singular values in matrix rank determination

    NASA Technical Reports Server (NTRS)

    Konstantinides, Konstantinos; Yao, Kung

    1988-01-01

    A major problem in using SVD (singular-value decomposition) as a tool in determining the effective rank of a perturbed matrix is that of distinguishing between significantly small and significantly large singular values. To this end, confidence regions are derived for the perturbed singular values of matrices with noisy observation data. The analysis is based on the theory of perturbations of singular values and on statistical significance testing. Threshold bounds for perturbations due to finite-precision and i.i.d. random models are evaluated. In random models, the threshold bounds depend on the dimension of the matrix, the noise variance, and a predefined statistical level of significance. The results are applied to the problem of determining the effective order of a linear autoregressive system from the approximate rank of a sample autocorrelation matrix. Various numerical examples illustrating the usefulness of these bounds, and comparisons to other previously known approaches, are given.
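The thresholding idea can be illustrated with a simple rule that compares singular values against a noise-dependent bound. The bound below is based on the expected largest singular value of an i.i.d. Gaussian noise matrix, roughly σ(√m + √n); it is an illustrative stand-in, not the paper's exact confidence regions.

```python
import numpy as np

def effective_rank(A, noise_std, significance=3.0):
    """Estimate the effective rank of a noisy matrix: count singular values
    exceeding a threshold proportional to the noise level."""
    m, n = A.shape
    s = np.linalg.svd(A, compute_uv=False)
    # singular values of a pure-noise matrix concentrate below roughly
    # noise_std * (sqrt(m) + sqrt(n)); scale by a significance factor
    threshold = significance * noise_std * (np.sqrt(m) + np.sqrt(n)) / 2
    return int(np.sum(s > threshold))
```

Without such a threshold, numerical rank computed by counting nonzero singular values of noisy data is almost always full, which is exactly the problem the abstract describes.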

  2. A statistically derived index for classifying East Coast fever reactions in cattle challenged with Theileria parva under experimental conditions.

    PubMed

    Rowlands, G J; Musoke, A J; Morzaria, S P; Nagda, S M; Ballingall, K T; McKeever, D J

    2000-04-01

    A statistically derived disease reaction index based on parasitological, clinical and haematological measurements observed in 309 five- to eight-month-old Boran cattle following laboratory challenge with Theileria parva is described. Principal component analysis was applied to 13 measures including first appearance of schizonts, first appearance of piroplasms and first occurrence of pyrexia, together with the duration and severity of these symptoms, and white blood cell count. The first principal component, which was based on approximately equal contributions of the 13 variables, provided the definition for the disease reaction index, defined on a scale of 0-10. As well as providing a more objective measure of the severity of the reaction, the continuous nature of the index score enables more powerful statistical analysis of the data than was previously possible with clinically derived categories of non-, mild, moderate and severe reactions.

  3. [PASS neurocognitive dysfunction in attention deficit].

    PubMed

    Pérez-Alvarez, F; Timoneda-Gallart, C

    Attention deficit disorder shows both cognitive and behavioral patterns. Our aim was to determine a particular PASS (planning, attention, successive and simultaneous) pattern in order to support early diagnosis and remediation according to PASS theory. Eighty patients, aged 6 to 12 years (55 boys and 25 girls), were selected from the neuropediatric clinic. Inclusion criteria were inattention (80 cases) and inattention with hyperactive symptoms (40 cases) according to the Diagnostic and Statistical Manual (DSM-IV). Exclusion criteria were the previously reported criteria of phonologic awareness, considered useful for diagnosing dyslexia. A control group of 300 individuals, aged 5 to 12 years, was used, with the criteria mentioned above controlled. The DN:CAS (Das-Naglieri Cognitive Assessment System) battery, translated into the native language, was given to assess the PASS cognitive processes. Results were analyzed with cluster analysis and Student's t test. Statistical factor analysis of the control group had previously identified the four PASS processes: planning, attention, successive and simultaneous. The dendrogram of the cluster analysis discriminated three categories of attention deficit disorder: 1. The most frequent, with planning deficit; 2. Without planning deficit but with deficits in other processes; and 3. A few cases without cognitive processing deficit. Cognitive deficiency, in terms of mean scores, was statistically significant compared with the control group (p = 0.001). According to the PASS pattern, planning deficiency is a relevant factor. Neurological planning is not exactly the same as neurological executive function. The behavioral pattern is mainly linked to planning deficiency, but also to other PASS processing deficits and even to no processing deficit.

  4. Predicting Cortical Dark/Bright Asymmetries from Natural Image Statistics and Early Visual Transforms

    PubMed Central

    Cooper, Emily A.; Norcia, Anthony M.

    2015-01-01

    The nervous system has evolved in an environment with structure and predictability. One of the ubiquitous principles of sensory systems is the creation of circuits that capitalize on this predictability. Previous work has identified predictable non-uniformities in the distributions of basic visual features in natural images that are relevant to the encoding tasks of the visual system. Here, we report that the well-established statistical distributions of visual features, such as visual contrast, spatial scale, and depth, differ between bright and dark image components. Following this analysis, we go on to trace how these differences in natural images translate into different patterns of cortical input that arise from the separate bright (ON) and dark (OFF) pathways originating in the retina. We use models of these early visual pathways to transform natural images into statistical patterns of cortical input. The models include the receptive fields and non-linear response properties of the magnocellular (M) and parvocellular (P) pathways, with their ON and OFF pathway divisions. The results indicate that there are regularities in visual cortical input beyond those that have previously been appreciated from the direct analysis of natural images. In particular, several dark/bright asymmetries provide a potential account for recently discovered asymmetries in how the brain processes visual features, such as violations of classic energy-type models. On the basis of our analysis, we expect that the dark/bright dichotomy in natural images plays a key role in the generation of both cortical and perceptual asymmetries. PMID:26020624

  5. Characterizing microstructural features of biomedical samples by statistical analysis of Mueller matrix images

    NASA Astrophysics Data System (ADS)

    He, Honghui; Dong, Yang; Zhou, Jialing; Ma, Hui

    2017-03-01

    As one of the salient features of light, polarization contains abundant structural and optical information about media. Recently, as a comprehensive description of polarization properties, Mueller matrix polarimetry has been applied to various biomedical studies, such as the detection of cancerous tissues. In previous works, it has been found that the structural information encoded in the 2D Mueller matrix images can be presented by transformed parameters with a more explicit relationship to certain microstructural features. In this paper, we present a statistical analysis method that transforms the 2D Mueller matrix images into frequency distribution histograms (FDHs) and their central moments to reveal the dominant structural features of samples quantitatively. The experimental results for porcine heart, intestine, stomach, and liver tissues demonstrate that the transformation parameters and central moments based on the statistical analysis of Mueller matrix elements have simple relationships to the dominant microstructural properties of biomedical samples, including the density and orientation of fibrous structures and the depolarization power, diattenuation, and absorption abilities. It is shown in this paper that the statistical analysis of 2D images of Mueller matrix elements may provide quantitative or semi-quantitative criteria for biomedical diagnosis.
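The transformation from an image to an FDH and its central moments can be sketched as follows (synthetic pixel values standing in for one Mueller matrix element image):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in for one 2D Mueller matrix element image.
img = rng.normal(loc=0.5, scale=0.1, size=(128, 128)).ravel()

# Frequency distribution histogram (FDH) of the pixel values.
counts, edges = np.histogram(img, bins=50, density=True)

# Central moments of the distribution: variance, skewness, kurtosis.
mu = img.mean()
var = np.mean((img - mu) ** 2)
skew = np.mean((img - mu) ** 3) / var ** 1.5
kurt = np.mean((img - mu) ** 4) / var ** 2
```

The histogram shape and the moments serve as compact, quantitative summaries of the spatial image, along the lines the abstract describes.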

  6. Statistical approaches in published ophthalmic clinical science papers: a comparison to statistical practice two decades ago.

    PubMed

    Zhang, Harrison G; Ying, Gui-Shuang

    2018-02-09

    The aim of this study is to evaluate the current practice of statistical analysis of eye data in clinical science papers published in the British Journal of Ophthalmology (BJO) and to determine whether the practice of statistical analysis has improved in the past two decades. All clinical science papers (n=125) published in BJO in January-June 2017 were reviewed for their statistical approaches to analysing the primary ocular measure. We compared our findings with the results of a previous paper that reviewed BJO papers from 1995. Of 112 papers eligible for analysis, half of the studies analysed the data at an individual level because of the nature of the observations, 16 (14%) studies analysed data from one eye only, 36 (32%) analysed data from both eyes at the ocular level, one study (1%) analysed an overall summary of the ocular findings per individual, and three (3%) used paired comparisons. Among studies with data available from both eyes, 50 (89%) of 56 papers in 2017 did not analyse data from both eyes or ignored the intereye correlation, compared with 60 (90%) of 67 papers in 1995 (P=0.96). Among studies that analysed data from both eyes at the ocular level, 33 (92%) of 36 studies completely ignored the intereye correlation in 2017, compared with 16 (89%) of 18 studies in 1995 (P=0.40). A majority of studies did not analyse the data properly when data from both eyes were available. The practice of statistical analysis has not improved over the past two decades. Collaborative efforts should be made in the vision research community to improve the practice of statistical analysis for ocular data. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
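The consequence of ignoring the intereye correlation can be illustrated with a small simulation (a sketch under assumed values, not data from the review): treating both eyes of each patient as independent observations understates the standard error relative to a person-level summary analysis.

```python
import numpy as np

rng = np.random.default_rng(2)
n, rho = 1000, 0.6   # hypothetical sample size and intereye correlation

# Simulate a measure on both eyes with intereye correlation rho.
cov = np.array([[1.0, rho], [rho, 1.0]])
eyes = rng.multivariate_normal([0.0, 0.0], cov, size=n)   # shape (n, 2)

# Naive analysis: pool all 2n eyes as if they were independent.
naive_se = eyes.ravel().std(ddof=1) / np.sqrt(2 * n)

# Person-level summary: average the two eyes, then analyse n values.
person_means = eyes.mean(axis=1)
correct_se = person_means.std(ddof=1) / np.sqrt(n)

# With rho > 0 the naive SE understates the true uncertainty.
```

Here `naive_se` comes out smaller than `correct_se`, which is the direction of error the review warns about: nominally significant results that are not.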

  7. VAGINAL PROGESTERONE VERSUS CERVICAL CERCLAGE FOR THE PREVENTION OF PRETERM BIRTH IN WOMEN WITH A SONOGRAPHIC SHORT CERVIX, SINGLETON GESTATION, AND PREVIOUS PRETERM BIRTH: A SYSTEMATIC REVIEW AND INDIRECT COMPARISON META-ANALYSIS

    PubMed Central

    CONDE-AGUDELO, Agustin; ROMERO, Roberto; NICOLAIDES, Kypros; CHAIWORAPONGSA, Tinnakorn; O'BRIEN, John M.; CETINGOZ, Elcin; DA FONSECA, Eduardo; CREASY, George; SOMA-PILLAY, Priya; FUSEY, Shalini; CAM, Cetin; ALFIREVIC, Zarko; HASSAN, Sonia S.

    2012-01-01

    OBJECTIVE No randomized controlled trial has directly compared vaginal progesterone and cervical cerclage for the prevention of preterm birth in women with a sonographic short cervix in the midtrimester, singleton gestation, and previous spontaneous preterm birth. We performed an indirect comparison of vaginal progesterone versus cerclage, using placebo/no cerclage as the common comparator. STUDY DESIGN Adjusted indirect meta-analysis of randomized controlled trials. RESULTS Four studies evaluating vaginal progesterone versus placebo (158 patients) and five evaluating cerclage versus no cerclage (504 patients) were included. Both interventions were associated with a statistically significant reduction in the risk of preterm birth at <32 weeks of gestation and in composite perinatal morbidity and mortality compared with placebo/no cerclage. Adjusted indirect meta-analyses did not show statistically significant differences between vaginal progesterone and cerclage in reducing preterm birth or adverse perinatal outcomes. CONCLUSION Based on state-of-the-art methodology for indirect comparisons, vaginal progesterone and cerclage appear equally efficacious for the prevention of preterm birth in women with a sonographic short cervix in the midtrimester, singleton gestation, and previous preterm birth. The selection of the optimal treatment may depend upon adverse events, cost, and patient/clinician preferences. PMID:23157855

  8. [Risk factors for infection in total knee arthroplasty, including previously unreported intraoperative fracture and deep venous thrombosis].

    PubMed

    de Dios, M; Cordero-Ampuero, J

    2015-01-01

    To carry out a statistical analysis of the significant risk factors for deep late infection (prosthetic joint infection, PJI) in patients with a total knee arthroplasty (TKA). A retrospective observational case-control study was conducted on a series of 32 consecutive knee infections, analysing all the risk factors reported in the literature. The control series comprised 100 randomly selected patients operated on in the same Department of a University General Hospital during the same period, with no sign of deep infection in their knee arthroplasty during follow-up. Statistical comparisons were made using Pearson's test for qualitative and ANOVA for quantitative variables. The significant (p<0.05) factors found in the series were: preoperative: previous knee surgery, glucocorticoids, immunosuppressants, inflammatory arthritis; intraoperative: prolonged surgical time, inadequate antibiotic prophylaxis, intraoperative fractures; postoperative: wound secretion lasting longer than 10 days, deep palpable haematoma, need for further surgery, and deep venous thrombosis in the lower limbs; and distant infections: cutaneous, urinary tract, pneumonia, abdominal, and generalized sepsis. This is the first report of intraoperative fractures and deep venous thrombosis as significantly more frequent factors in infected TKAs. Other previously described risk factors for TKA PJI are also confirmed. Copyright © 2014 SECOT. Published by Elsevier Espana. All rights reserved.

  9. A Third Moment Adjusted Test Statistic for Small Sample Factor Analysis

    PubMed Central

    Lin, Johnny; Bentler, Peter M.

    2012-01-01

    Goodness-of-fit testing in factor analysis is based on the assumption that the test statistic is asymptotically chi-square, but this property may not hold in small samples even when the factors and errors are normally distributed in the population. Robust methods such as Browne's asymptotically distribution-free method and the Satorra-Bentler mean-scaled statistic were developed under the presumption of non-normality in the factors and errors. This paper finds a new application for them in the case where factors and errors are normally distributed in the population but the skewness of the obtained test statistic is still high due to sampling error in the observed indicators. An extension of the Satorra-Bentler statistic is proposed that not only scales the mean but also adjusts the degrees of freedom based on the skewness of the obtained test statistic, in order to improve its robustness in small samples. A simple simulation study shows that this third-moment adjusted statistic asymptotically performs on par with previously proposed methods and, at very small sample sizes, offers superior Type I error rates under a properly specified model. Data from Mardia, Kent and Bibby's study of students tested for their ability in five content areas, under either open- or closed-book conditions, were used to illustrate the real-world performance of this statistic. PMID:23144511
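A toy moment-matching sketch (not the paper's estimator, which derives the corrections from model quantities rather than from replications) illustrates the mean-and-degrees-of-freedom adjustment idea: match the first two moments of an observed statistic to a scaled chi-square.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical replicated goodness-of-fit statistics whose chi-square
# approximation is poor (here: artificially inflated chi-square draws).
T = 1.3 * rng.chisquare(df=10, size=5000)

# Match the first two moments of T to a scaled chi-square c * chi2(d):
#   E[T] = c*d,  Var[T] = 2*c^2*d   =>   c = Var/(2E),  d = 2E^2/Var
m, v = T.mean(), T.var(ddof=1)
c = v / (2 * m)
d = 2 * m**2 / v

# The adjusted statistic T/c is then referred to chi-square with d df.
# (The third-moment extension additionally matches the skewness of T.)
```

With the simulated inflation factor of 1.3 and 10 true degrees of freedom, the recovered `c` and `d` land near 1.3 and 10.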

  10. Statistical model specification and power: recommendations on the use of test-qualified pooling in analysis of experimental data

    PubMed Central

    Colegrave, Nick

    2017-01-01

    A common approach to the analysis of experimental data across much of the biological sciences is test-qualified pooling. Here non-significant terms are dropped from a statistical model, effectively pooling the variation associated with each removed term into the error term used to test hypotheses (or estimate effect sizes). This pooling is carried out only if statistical testing of a previous, more complicated model fitted to the data provides motivation for the simplification; hence the pooling is test-qualified. In pooling, the researcher increases the degrees of freedom of the error term with the aim of increasing the statistical power to test the hypotheses of interest. Despite this approach being widely adopted and explicitly recommended by some of the most widely cited statistical textbooks aimed at biologists, here we argue that (except in highly specialized circumstances that we identify) the hoped-for improvement in statistical power will be small or non-existent, and the reliability of the statistical procedures is likely to be much reduced through deviation of type I error rates from nominal levels. We thus call for greatly reduced use of test-qualified pooling across experimental biology, more careful justification of any use that continues, and a different philosophy for the initial selection of statistical models in the light of this change in procedure. PMID:28330912
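The motivation for pooling, that extra error degrees of freedom lower the critical value of the test, can be sketched with a Monte Carlo estimate of F critical values (the df values below are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(4)

def f_crit(dfn, dfd, alpha=0.05, n=200000):
    """Monte Carlo estimate of the upper critical value of F(dfn, dfd)."""
    F = (rng.chisquare(dfn, n) / dfn) / (rng.chisquare(dfd, n) / dfd)
    return np.quantile(F, 1 - alpha)

# A 1-df hypothesis test, before and after pooling a dropped term's
# 10 df into a 20-df error term.
before = f_crit(1, 20)
after = f_crit(1, 30)
```

The drop from `before` to `after` is the hoped-for power gain; the abstract's point is that this gain is typically small while the type I error rate of the overall two-stage procedure drifts from its nominal level.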

  11. Morphological image analysis for classification of gastrointestinal tissues using optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Garcia-Allende, P. Beatriz; Amygdalos, Iakovos; Dhanapala, Hiruni; Goldin, Robert D.; Hanna, George B.; Elson, Daniel S.

    2012-01-01

    Computer-aided diagnosis of ophthalmic diseases using optical coherence tomography (OCT) relies on the extraction of thickness and size measures from the OCT images, but such defined layers are usually not observed in emerging OCT applications aimed at "optical biopsy", such as pulmonology or gastroenterology. Mathematical methods such as principal component analysis (PCA) or textural analyses, including both spatial textural analysis derived from the two-dimensional discrete Fourier transform (DFT) and statistical texture analysis obtained independently from center-symmetric auto-correlation (CSAC) and spatial grey-level dependency matrices (SGLDM), as well as quantitative measurements of the attenuation coefficient, have previously been proposed to overcome this problem. We recently proposed an alternative approach consisting of a region segmentation according to the intensity variation along the vertical axis and a purely statistical technique for feature quantification. OCT images were first segmented in the axial direction in an automated manner according to intensity. Afterwards, a morphological analysis of the segmented OCT images was employed to quantify the features that served for tissue classification. In this study, PCA processing of the extracted features is performed to combine their discriminative power in a lower number of dimensions. Ready discrimination of gastrointestinal surgical specimens is attained, demonstrating that the approach surpasses the algorithms previously reported and is feasible for tissue classification in the clinical setting.

  12. Do emotional intelligence and previous caring experience influence student nurse performance? A comparative analysis.

    PubMed

    Stenhouse, Rosie; Snowden, Austyn; Young, Jenny; Carver, Fiona; Carver, Hannah; Brown, Norrie

    2016-08-01

    Reports of poor nursing care have focused attention on values-based selection of candidates onto nursing programmes. Values-based selection lacks clarity and valid measures. Previous caring experience might lead to better care. Emotional intelligence (EI) might be associated with performance, and is conceptualised and measurable. To examine the impact of 1) previous caring experience, 2) emotional intelligence and 3) social connection scores on performance and retention in a cohort of first-year nursing and midwifery students in Scotland. A longitudinal, quasi-experimental design. Adult and mental health nursing, and midwifery programmes in a Scottish University. Adult, mental health and midwifery students (n=598) completed the Trait Emotional Intelligence Questionnaire-short form and Schutte's Emotional Intelligence Scale on entry to their programmes at a Scottish University, alongside demographic and previous caring experience data. Social connection was calculated from a subset of questions identified within the TEIQue-SF in a prior factor and Rasch analysis. Student performance was calculated as the mean mark across the year. Withdrawal data were gathered. 598 students completed baseline measures. 315 students declared previous caring experience, 277 did not. An independent-samples t-test identified that those without previous caring experience scored higher on performance (57.33±11.38) than those with previous caring experience (54.87±11.19), a statistically significant difference of 2.47 (95% CI, 0.54 to 4.38), t(533)=2.52, p=.012. Emotional intelligence scores were not associated with performance. Social connection scores for those withdrawing (mean rank=249) and those remaining (mean rank=304.75) were statistically significantly different, U=15,300, z=-2.61, p<0.009. Previous caring experience led to worse performance in this cohort. Emotional intelligence was not a useful indicator of performance. Lower scores on the social connection factor were associated with withdrawal from the course. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Can PC-9 Zhong chong replace K-1 Yong quan for the acupunctural resuscitation of a bilateral double-amputee? Stating the “random criterion problem” in its statistical analysis

    PubMed Central

    Inchauspe, Adrián Angel

    2016-01-01

    AIM: To present an inclusion criterion for patients who have suffered bilateral amputation, so that they can be treated with the supplementary resuscitation treatment proposed here by the author. METHODS: This work is based on a retrospective cohort model, so that exposing a control group to an almost certainly lethal risk is avoided. RESULTS: This paper presents a hypothesis on the acupunctural PC-9 Zhong chong point, further supported by previous statistical work recorded for the K-1 Yong quan resuscitation point. CONCLUSION: Thanks to the application of the resuscitation maneuver herein proposed on the previously mentioned point, patients with bilateral amputation would have an alternative treatment available in case basic and advanced CPR should fail. PMID:27152257

  14. How does the past of a soccer match influence its future? Concepts and statistical analysis.

    PubMed

    Heuer, Andreas; Rubner, Oliver

    2012-01-01

    Scoring goals in a soccer match can be interpreted as a stochastic process. In the simplest description of a soccer match, one assumes that goal scoring follows independent rate processes for both teams, which would imply simple Poissonian and Markovian behavior. Deviations from this behavior would imply that the previous course of the match has an impact on present match behavior. Here a general framework for identifying such deviations is presented. For this endeavor it is essential to formulate an a priori estimate of the expected number of goals per team in a specific match, which can be done based on our previous work on the estimation of team strengths. Furthermore, the well-known general increase in the number of goals during the course of a soccer match has to be removed by appropriate normalization. In general, three different types of deviation from a simple rate process can exist: first, the goal rate may depend on the exact times of the previous goals; second, it may be influenced by the time passed since the previous goal; and third, it may reflect the present score. We show that the Poissonian scenario is fulfilled quite well for the German Bundesliga. However, a detailed analysis reveals significant deviations for the second and third aspects. Dramatic effects are observed if the away team leads by one or two goals in the final part of the match. This analysis allows one to identify generic features of soccer matches and to learn about the hidden complexities behind scoring goals. Among other things, it identifies the reason why the number of draws is larger than statistically expected.
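A minimal sketch of the independent-rate null model, with hypothetical scoring rates, gives the baseline draw probability against which an excess of draws in real league data would be measured:

```python
import numpy as np

rng = np.random.default_rng(5)

# Independent-rate ("Poissonian") null model of a match: each team
# scores at a constant rate; the per-match rates below are hypothetical.
home_rate, away_rate = 1.6, 1.2
n_matches = 100000

home = rng.poisson(home_rate, n_matches)
away = rng.poisson(away_rate, n_matches)

# Fraction of simulated matches ending level.
draw_fraction = np.mean(home == away)
```

An observed draw frequency above `draw_fraction` would be the kind of deviation from the simple rate process the abstract describes.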

  15. How Does the Past of a Soccer Match Influence Its Future? Concepts and Statistical Analysis

    PubMed Central

    Heuer, Andreas; Rubner, Oliver

    2012-01-01

    Scoring goals in a soccer match can be interpreted as a stochastic process. In the simplest description of a soccer match, one assumes that goal scoring follows independent rate processes for both teams, which would imply simple Poissonian and Markovian behavior. Deviations from this behavior would imply that the previous course of the match has an impact on present match behavior. Here a general framework for identifying such deviations is presented. For this endeavor it is essential to formulate an a priori estimate of the expected number of goals per team in a specific match, which can be done based on our previous work on the estimation of team strengths. Furthermore, the well-known general increase in the number of goals during the course of a soccer match has to be removed by appropriate normalization. In general, three different types of deviation from a simple rate process can exist: first, the goal rate may depend on the exact times of the previous goals; second, it may be influenced by the time passed since the previous goal; and third, it may reflect the present score. We show that the Poissonian scenario is fulfilled quite well for the German Bundesliga. However, a detailed analysis reveals significant deviations for the second and third aspects. Dramatic effects are observed if the away team leads by one or two goals in the final part of the match. This analysis allows one to identify generic features of soccer matches and to learn about the hidden complexities behind scoring goals. Among other things, it identifies the reason why the number of draws is larger than statistically expected. PMID:23226200

  16. The minimal SUSY B - L model: simultaneous Wilson lines and string thresholds

    DOE PAGES

    Deen, Rehan; Ovrut, Burt A.; Purves, Austin

    2016-07-08

    In previous work, we presented a statistical scan over the soft supersymmetry breaking parameters of the minimal SUSY B - L model. For specificity of calculation, unification of the gauge parameters was enforced by allowing the two Z3 × Z3 Wilson lines to have mass scales separated by approximately an order of magnitude. This introduced an additional "left-right" sector below the unification scale. In this paper, for three important reasons, we modify our previous analysis by demanding that the mass scales of the two Wilson lines be simultaneous and equal to an "average unification" mass ⟨M_U⟩. The present analysis is 1) more "natural" than the previous calculations, which were only valid in a very specific region of the Calabi-Yau moduli space, 2) conceptually simpler in that the left-right sector has been removed, and 3) such that the lack of gauge unification is due to threshold effects, particularly heavy string thresholds, which we calculate statistically in detail. As in our previous work, the theory is renormalization group evolved from ⟨M_U⟩ to the electroweak scale, being subjected, sequentially, to the requirement of radiative B - L and electroweak symmetry breaking, the present experimental lower bounds on the B - L vector boson and sparticle masses, as well as the lightest neutral Higgs mass of ~125 GeV. The subspace of soft supersymmetry breaking masses that satisfies all such constraints is presented and shown to be substantial.

  17. Explanation of Two Anomalous Results in Statistical Mediation Analysis

    ERIC Educational Resources Information Center

    Fritz, Matthew S.; Taylor, Aaron B.; MacKinnon, David P.

    2012-01-01

    Previous studies of different methods of testing mediation models have consistently found two anomalous results. The first result is elevated Type I error rates for the bias-corrected and accelerated bias-corrected bootstrap tests not found in nonresampling tests or in resampling tests that did not include a bias correction. This is of special…

  18. Clinical and Electrodiagnostic Abnormalities of the Median Nerve in US Army Dental Assistants at the Onset of Training

    DTIC Science & Technology

    2012-01-01

    [Fragmentary record excerpt: descriptive statistics for subject demographics and nerve conduction study (NCS) variables were calculated; a risk-factor table lists items such as family history of CTS and previous work history (electrician, guitar player, dental assistant, waiter).]

  19. Recent Reliability Reporting Practices in "Psychological Assessment": Recognizing the People behind the Data

    ERIC Educational Resources Information Center

    Green, Carlton E.; Chen, Cynthia E.; Helms, Janet E.; Henze, Kevin T.

    2011-01-01

    Helms, Henze, Sass, and Mifsud (2006) defined good practices for internal consistency reporting, interpretation, and analysis consistent with an alpha-as-data perspective. Their viewpoint (a) expands on previous arguments that reliability coefficients are group-level summary statistics of samples' responses rather than stable properties of scales…

  20. Relationship of Class-Size to Classroom Processes, Teacher Satisfaction and Pupil Affect: A Meta-Analysis.

    ERIC Educational Resources Information Center

    Smith, Mary Lee; Glass, Gene V.

    Using data from previously completed research, the authors of this report attempted to examine the relationship between class size and measures of outcomes such as student attitudes and behavior, classroom processes and learning environment, and teacher satisfaction. The authors report that statistical integration of the existing research…

  1. Queensland Teachers' Conceptions of Assessment: The Impact of Policy Priorities on Teacher Attitudes

    ERIC Educational Resources Information Center

    Brown, Gavin T. L.; Lake, Robert; Matters, Gabrielle

    2011-01-01

    The conceptions Queensland teachers have about assessment purposes were surveyed in 2003 with an abridged version of the Teacher Conceptions of Assessment Inventory. Multi-group analysis found that a model with four factors, somewhat different in structure to previous studies, was statistically different between Queensland primary and (lower)…

  2. Thrust imbalance of solid rocket motor pairs on Space Shuttle flights

    NASA Technical Reports Server (NTRS)

    Foster, W. A., Jr.; Shu, P. H.; Sforzini, R. H.

    1986-01-01

    This analysis extends the investigation presented at the 17th Joint Propulsion Conference in 1981 to include fifteen sets of Space Shuttle flight data. The previous report dealt only with static test data and the first flight pair. The objective is to compare the authors' previous theoretical analysis of thrust imbalance with actual Space Shuttle performance. The theoretical prediction method, which involves a Monte Carlo technique, is reviewed briefly as are salient features of the flight instrumentation system and the statistical analysis. A scheme for smoothing flight data is discussed. The effects of changes in design parameters are discussed with special emphasis on the filament wound motor case being developed to replace the steel case. Good agreement between the predictions and the flight data is demonstrated.

  3. Exercise and Bone Density: Meta-Analysis

    DTIC Science & Technology

    2003-10-01

    [Fragmentary record excerpt from interleaved columns: bone mineral density (BMD) outcomes, including studies in which BMD was assessed in women, were estimated using previously developed methods; figures range from 12.6% in the placebo group to 9.0% in the alendronate group; unpublished work was excluded as inappropriate because it has not gone through the peer review process; intake of agents that could enhance BMD and cigarette smoking were recorded. Cites: Hedges L, Olkin I. Statistical Methods for Meta-Analysis. San Diego, CA: Academic Press; 1985.]

  4. A methodological analysis of chaplaincy research: 2000-2009.

    PubMed

    Galek, Kathleen; Flannelly, Kevin J; Jankowski, Katherine R B; Handzo, George F

    2011-01-01

    The present article presents a comprehensive review and analysis of quantitative research conducted in the United States on chaplaincy and closely related topics published between 2000 and 2009. A combined search strategy identified 49 quantitative studies in 13 journals. The analysis focuses on the methodological sophistication of the studies, compared to earlier research on chaplaincy and pastoral care. Cross-sectional surveys of convenience samples still dominate the field, but sample sizes have increased somewhat over the past three decades. Reporting of the validity and reliability of measures continues to be low, although reporting of response rates has improved. Improvements in the use of inferential statistics and statistical controls were also observed, compared to previous research. The authors conclude that more experimental research is needed on chaplaincy, along with an increased use of hypothesis testing, regardless of the research designs that are used.

  5. Disutility analysis of oil spills: graphs and trends.

    PubMed

    Ventikos, Nikolaos P; Sotiropoulos, Foivos S

    2014-04-15

    This paper reports the results of an analysis of oil spill cost data assembled from a worldwide pollution database that mainly includes data from the International Oil Pollution Compensation Fund. The purpose of the study is to analyze the conditions of marine pollution accidents and the factors that impact the costs of oil spills worldwide. The accidents are classified into categories based on their characteristics, and the cases are compared using charts to show how the costs are affected under all conditions. This study can be used as a helpful reference for developing a detailed statistical model that is capable of reliably and realistically estimating the total costs of oil spills. To illustrate the differences identified by this statistical analysis, the results are compared with the results of previous studies, and the findings are discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. 77 FR 17405 - Notice of Intent To Revise a Previously Approved Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-26

    ... DEPARTMENT OF AGRICULTURE National Agricultural Statistics Service Notice of Intent To Revise a Previously Approved Information Collection AGENCY: National Agricultural Statistics Service, USDA. ACTION... notice announces the intent of the National Agricultural Statistics Service (NASS) to seek reinstatement...

  7. Discovering genetic variants in Crohn's disease by exploring genomic regions enriched of weak association signals.

    PubMed

    D'Addabbo, Annarita; Palmieri, Orazio; Maglietta, Rosalia; Latiano, Anna; Mukherjee, Sayan; Annese, Vito; Ancona, Nicola

    2011-08-01

    A meta-analysis has re-analysed previous genome-wide association scans, definitively confirming eleven genes and further identifying 21 new loci. However, the identified genes/loci still explain only a minority of the genetic predisposition to Crohn's disease. The aim was to identify genes weakly involved in disease predisposition by analysing chromosomal regions enriched in single nucleotide polymorphisms with modest statistical association. We utilized the WTCCC data set, evaluating 1748 CD cases and 2938 controls. Candidate genes/loci were identified by a two-step procedure: first, chromosomal regions enriched in weak association signals were localized; subsequently, weak signals clustered in gene regions were identified. Statistical significance was assessed by nonparametric permutation tests. The cytoband enrichment analysis highlighted 44 regions (P≤0.05) enriched in single nucleotide polymorphisms significantly associated with the trait, including 23 of the 31 previously confirmed and replicated genes. Importantly, we highlight a further 20 novel chromosomal regions carrying approximately one hundred genes/loci with modest association. Amongst these are compelling functional candidate genes such as MAPT, GRB2, CREM, LCT, and IL12RB2. Our study suggests a different statistical perspective for discovering genes weakly associated with a given trait, although further confirmatory functional studies are needed. Copyright © 2011 Editrice Gastroenterologica Italiana S.r.l. All rights reserved.
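A sketch of a nonparametric permutation test for a region enriched in weak signals, using entirely synthetic p-values (not the WTCCC data):

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical genome-wide SNP p-values; 30 of the 50 SNPs in one
# candidate cytoband carry weak (p < 0.05) association signals.
p = rng.uniform(size=10000)
region = np.arange(50)                       # SNP indices in the region
p[region[:30]] = rng.uniform(0, 0.05, size=30)

observed = np.sum(p[region] < 0.05)

# Permutation null: how often does a random set of 50 SNPs contain at
# least as many weak signals as the candidate region?
n_perm = 2000
count = 0
for _ in range(n_perm):
    perm = rng.choice(len(p), size=len(region), replace=False)
    count += np.sum(p[perm] < 0.05) >= observed
p_enrich = (count + 1) / (n_perm + 1)        # permutation p-value
```

A small `p_enrich` flags the region as enriched in modest signals even though no single SNP need reach genome-wide significance.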

  8. Meta-analyses and Forest plots using a microsoft excel spreadsheet: step-by-step guide focusing on descriptive data analysis.

    PubMed

    Neyeloff, Jeruza L; Fuchs, Sandra C; Moreira, Leila B

    2012-01-20

    Meta-analyses are necessary to synthesize data obtained from primary research, and in many situations reviews of observational studies are the only available alternative. General purpose statistical packages can meta-analyze data, but usually require external macros or coding. Commercial specialist software is available, but may be expensive and focused on a particular type of primary data. Most available software packages have limitations in dealing with descriptive data, and the graphical display of summary statistics such as incidence and prevalence is unsatisfactory. Analyses can be conducted using Microsoft Excel, but no previous guide was available. We constructed a step-by-step guide to performing a meta-analysis in a Microsoft Excel spreadsheet, using either fixed-effect or random-effects models. We have also developed a second spreadsheet capable of producing customized forest plots. It is possible to conduct a meta-analysis using only Microsoft Excel. More importantly, to our knowledge this is the first description of a method for producing a statistically adequate but graphically appealing forest plot summarizing descriptive data, using widely available software.
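    The spreadsheet guide above implements standard inverse-variance pooling. As a sketch of the same computation outside Excel (Python here, with made-up effect sizes and variances), the fixed-effect and DerSimonian-Laird random-effects estimates are:

```python
import math

def meta_analysis(effects, variances):
    """Inverse-variance fixed-effect and DerSimonian-Laird random-effects pooling."""
    w = [1.0 / v for v in variances]                      # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    # Cochran's Q and the DerSimonian-Laird between-study variance tau^2
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    w_re = [1.0 / (v + tau2) for v in variances]          # random-effects weights
    random = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se_re = math.sqrt(1.0 / sum(w_re))
    return fixed, random, tau2, se_re

# Three hypothetical studies: effect sizes and within-study variances
fixed, random, tau2, se = meta_analysis([0.1, 0.5, 0.2], [0.01, 0.01, 0.04])
print(f"fixed={fixed:.4f} random={random:.4f} tau2={tau2:.4f}")
```

    With no between-study heterogeneity (tau² = 0) the two estimates coincide; here the heterogeneity gives relatively more weight to the less precise study, pulling the random-effects estimate toward it.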

  9. Meta-analyses and Forest plots using a microsoft excel spreadsheet: step-by-step guide focusing on descriptive data analysis

    PubMed Central

    2012-01-01

    Background Meta-analyses are necessary to synthesize data obtained from primary research, and in many situations reviews of observational studies are the only available alternative. General purpose statistical packages can meta-analyze data, but usually require external macros or coding. Commercial specialist software is available, but may be expensive and focused on a particular type of primary data. Most available software packages have limitations in dealing with descriptive data, and the graphical display of summary statistics such as incidence and prevalence is unsatisfactory. Analyses can be conducted using Microsoft Excel, but no previous guide was available. Findings We constructed a step-by-step guide to performing a meta-analysis in a Microsoft Excel spreadsheet, using either fixed-effect or random-effects models. We have also developed a second spreadsheet capable of producing customized forest plots. Conclusions It is possible to conduct a meta-analysis using only Microsoft Excel. More importantly, to our knowledge this is the first description of a method for producing a statistically adequate but graphically appealing forest plot summarizing descriptive data, using widely available software. PMID:22264277

  10. Multiple comparison analysis testing in ANOVA.

    PubMed

    McHugh, Mary L

    2011-01-01

    The Analysis of Variance (ANOVA) test has long been an important tool for researchers conducting studies on multiple experimental groups and one or more control groups. However, ANOVA cannot provide detailed information on differences among the various study groups, or on complex combinations of study groups. To fully understand group differences in an ANOVA, researchers must conduct tests of the differences between particular pairs of experimental and control groups. Tests conducted on subsets of data tested previously in another analysis are called post hoc tests. The class of post hoc tests that provides this type of detailed information for ANOVA results is called "multiple comparison analysis" tests. The most commonly used multiple comparison analysis statistics include the Tukey, Newman-Keuls, Scheffé, Bonferroni, and Dunnett tests. These statistical tools each have specific uses, advantages, and disadvantages. Some are best used for testing theory while others are useful in generating new theory. Selection of the appropriate post hoc test will provide researchers with the most detailed information while limiting Type I errors due to alpha inflation.
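    Of the tests listed, the Bonferroni procedure is the simplest to sketch: run every pairwise comparison, then multiply each p-value by the number of comparisons. A minimal Python illustration with made-up group data (using SciPy's Welch t-test as the base comparison):

```python
from itertools import combinations
from scipy import stats

def bonferroni_pairwise(groups):
    """All pairwise Welch t-tests with Bonferroni-adjusted p-values."""
    pairs = list(combinations(sorted(groups), 2))
    m = len(pairs)                          # number of comparisons for the correction
    results = {}
    for a, b in pairs:
        p = stats.ttest_ind(groups[a], groups[b], equal_var=False).pvalue
        results[(a, b)] = min(1.0, p * m)   # Bonferroni: cap adjusted p at 1
    return results

# Hypothetical data: two similar groups and one clearly shifted group
groups = {
    "control": [5.1, 4.9, 5.0, 5.2, 4.8, 5.0],
    "treat_a": [5.0, 5.1, 4.9, 5.1, 5.0, 4.9],
    "treat_b": [7.0, 7.2, 6.9, 7.1, 7.0, 7.1],
}
adj = bonferroni_pairwise(groups)
for pair, p in sorted(adj.items()):
    print(pair, round(p, 4))
```

    Tukey's HSD would instead use the studentized range distribution; Bonferroni is the most conservative of the listed procedures, which is why it limits Type I errors at some cost in power.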

  11. Evidence-based orthodontics. Current statistical trends in published articles in one journal.

    PubMed

    Law, Scott V; Chudasama, Dipak N; Rinchuse, Donald J

    2010-09-01

    The aim was to ascertain the number, type, and overall usage of statistics in American Journal of Orthodontics and Dentofacial Orthopedics (AJODO) articles for 2008, and to compare these data with data from three previous years: 1975, 1985, and 2003. The AJODO original articles for 2008 were dichotomized into those using statistics and those not using statistics. Statistical procedures were then broadly divided into descriptive statistics (mean, standard deviation, range, percentage) and inferential statistics (t-test, analysis of variance). Descriptive statistics were used to make comparisons. In 1975, 1985, 2003, and 2008, AJODO published 72, 87, 134, and 141 original articles, respectively. The percentage of original articles using statistics was 43.1% in 1975, 75.9% in 1985, 94.0% in 2003, and 92.9% in 2008; the proportion of original articles using statistics stayed relatively constant from 2003 to 2008, with only a small 1.1% decrease. The percentage of articles using inferential statistical analyses was 23.7% in 1975, 74.2% in 1985, 92.9% in 2003, and 84.4% in 2008. Comparing AJODO publications in 2003 and 2008, there was an 8.5% increase in articles using only descriptive statistics (from 7.1% to 15.6%) and a corresponding 8.5% decrease in articles using inferential statistics (from 92.9% to 84.4%).

  12. Analysis on Time-Lag Effect of Research and Development Investment in the Pharmaceutical Industry in Korea

    PubMed Central

    Lee, Munjae; Choi, Mankyu

    2015-01-01

    Objectives The aim of this study is to analyze the influence of the research and development (R&D) investment of pharmaceutical companies on enterprise value. Methods The period of the empirical analysis is from 2000 to 2012, chosen to capture the period after the financial crisis. Financial statements and accompanying notes on general and internal transactions were extracted from TS-2000 of the Korea Listed Company Association, and stock price data were extracted from KISVALUE-III of National Information and Credit Evaluation Information Service Co., Ltd. STATA 12.0 was used as the statistical package for the panel analysis. Results In the pharmaceutical firms, the influence of R&D intensity on Tobin's q was found to be positive. However, only the R&D expenditure intensities of previous years 2 and 5 (t–2 and t–5, respectively) were statistically significant (p < 0.1), whereas those of previous years 1, 3, and 4 (t–1, t–3, and t–4, respectively) were not. Conclusion R&D investment not only affects enterprise value but is also an investment activity that raises long-term enterprise value. The findings serve as valuable data for understanding the enterprise value of the Korean pharmaceutical industry and for strengthening reform measures. Beyond new drug development, investment and support should be provided according to the factors best suited to improving each company's competitiveness, such as generics, incrementally modified drugs, and biosimilar products. PMID:26473091

  13. Analysis on Time-Lag Effect of Research and Development Investment in the Pharmaceutical Industry in Korea.

    PubMed

    Lee, Munjae; Choi, Mankyu

    2015-08-01

    The aim of this study is to analyze the influence of the research and development (R&D) investment of pharmaceutical companies on enterprise value. The period of the empirical analysis is from 2000 to 2012, chosen to capture the period after the financial crisis. Financial statements and accompanying notes on general and internal transactions were extracted from TS-2000 of the Korea Listed Company Association, and stock price data were extracted from KISVALUE-III of National Information and Credit Evaluation Information Service Co., Ltd. STATA 12.0 was used as the statistical package for the panel analysis. In the pharmaceutical firms, the influence of R&D intensity on Tobin's q was found to be positive. However, only the R&D expenditure intensities of previous years 2 and 5 (t-2 and t-5, respectively) were statistically significant (p < 0.1), whereas those of previous years 1, 3, and 4 (t-1, t-3, and t-4, respectively) were not. R&D investment not only affects enterprise value but is also an investment activity that raises long-term enterprise value. The findings serve as valuable data for understanding the enterprise value of the Korean pharmaceutical industry and for strengthening reform measures. Beyond new drug development, investment and support should be provided according to the factors best suited to improving each company's competitiveness, such as generics, incrementally modified drugs, and biosimilar products.

  14. Significant Association of Urinary Toxic Metals and Autism-Related Symptoms—A Nonlinear Statistical Analysis with Cross Validation

    PubMed Central

    Adams, James; Kruger, Uwe; Geis, Elizabeth; Gehn, Eva; Fimbres, Valeria; Pollard, Elena; Mitchell, Jessica; Ingram, Julie; Hellmers, Robert; Quig, David; Hahn, Juergen

    2017-01-01

    Introduction A number of previous studies examined a possible association of toxic metals and autism, and over half of those studies suggest that toxic metal levels are different in individuals with Autism Spectrum Disorders (ASD). Additionally, several studies found that those levels correlate with the severity of ASD. Methods In order to further investigate these points, this paper performs the most detailed statistical analysis to date of a data set in this field. First morning urine samples were collected from 67 children and adults with ASD and 50 neurotypical controls of similar age and gender. The samples were analyzed to determine the levels of 10 urinary toxic metals (UTM). Autism-related symptoms were assessed with eleven behavioral measures. Statistical analysis was used to distinguish participants on the ASD spectrum from neurotypical participants based upon the UTM data alone. The analysis also included examining the association of autism severity with toxic metal excretion data using linear and nonlinear analysis. “Leave-one-out” cross-validation was used to ensure statistical independence of results. Results and Discussion Average excretion levels of several toxic metals (lead, tin, thallium, antimony) were significantly higher in the ASD group. ASD classification using univariate statistics proved difficult due to large variability, but nonlinear multivariate statistical analysis significantly improved ASD classification, with Type I/II errors of 15% and 18%, respectively. These results clearly indicate that the urinary toxic metal excretion profiles of participants in the ASD group were significantly different from those of the neurotypical participants. Similarly, nonlinear methods determined a significantly stronger association between the behavioral measures and toxic metal excretion. The association was strongest for the Aberrant Behavior Checklist (including subscales on Irritability, Stereotypy, Hyperactivity, and Inappropriate Speech), but significant associations were found between UTM and all eleven autism-related assessments, with cross-validation R2 values ranging from 0.12 to 0.48. PMID:28068407
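    The "leave-one-out" cross-validation mentioned above has a simple generic form: refit the model n times, each time holding out one observation and predicting it from the rest. A minimal sketch for a cross-validated R² on a linear fit (synthetic data, not the study's):

```python
import numpy as np

def loo_cv_r2(x, y, deg=1):
    """Leave-one-out cross-validated R^2 for a polynomial fit."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    preds = np.empty_like(y)
    for i in range(len(y)):
        mask = np.arange(len(y)) != i          # hold out observation i
        coeffs = np.polyfit(x[mask], y[mask], deg)
        preds[i] = np.polyval(coeffs, x[i])    # predict the held-out point
    ss_res = np.sum((y - preds) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

x = np.arange(10.0)
y_linear = 2.0 * x + 1.0          # exact linear relation
y_noise = np.array([3.1, 0.4, 9.8, 1.2, 7.7, 2.5, 8.9, 0.3, 6.6, 4.4])
print(loo_cv_r2(x, y_linear), loo_cv_r2(x, y_noise))
```

    Because every prediction comes from a model that never saw the predicted point, the cross-validated R² cannot be inflated by overfitting, which is the sense in which it ensures statistical independence of the reported associations.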

  15. A Meta-Analysis and Multisite Time-Series Analysis of the Differential Toxicity of Major Fine Particulate Matter Constituents

    PubMed Central

    Levy, Jonathan I.; Diez, David; Dou, Yiping; Barr, Christopher D.; Dominici, Francesca

    2012-01-01

    Health risk assessments of particulate matter less than 2.5 μm in diameter (PM2.5) often assume that all constituents of PM2.5 are equally toxic. While investigators in previous epidemiologic studies have evaluated health risks from various PM2.5 constituents, few have conducted the analyses needed to directly inform risk assessments. In this study, the authors performed a literature review and conducted a multisite time-series analysis of hospital admissions and exposure to PM2.5 constituents (elemental carbon, organic carbon matter, sulfate, and nitrate) in a population of 12 million US Medicare enrollees for the period 2000–2008. The literature review illustrated a general lack of multiconstituent models or insight about probabilities of differential impacts per unit of concentration change. Consistent with previous results, the multisite time-series analysis found statistically significant associations between short-term changes in elemental carbon and cardiovascular hospital admissions. Posterior probabilities from multiconstituent models provided evidence that some individual constituents were more toxic than others, and posterior parameter estimates coupled with correlations among these estimates provided necessary information for risk assessment. Ratios of constituent toxicities, commonly used in risk assessment to describe differential toxicity, were extremely uncertain for all comparisons. These analyses emphasize the subtlety of the statistical techniques and epidemiologic studies necessary to inform risk assessments of particle constituents. PMID:22510275

  16. Mars Pathfinder Near-Field Rock Distribution Re-Evaluation

    NASA Technical Reports Server (NTRS)

    Haldemann, A. F. C.; Golombek, M. P.

    2003-01-01

    We have completed analysis of a new near-field rock count at the Mars Pathfinder landing site and determined that the previously published rock count, suggesting 16% cumulative fractional area (CFA) covered by rocks, is incorrect. The earlier value is not so much wrong (our new CFA is 20%) as right for the wrong reason: both the old and the new CFAs are consistent with remote sensing data; however, the earlier determination incorrectly calculated rock coverage using apparent width rather than average diameter. Here we present details of the new rock database and the new statistics, as well as the importance of using rock average diameter for rock population statistics. The changes to the near-field data do not affect the far-field rock statistics.
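    The correction hinges on which size measure enters the area sum. A minimal sketch of the cumulative-fractional-area computation in Python (toy numbers, not the Pathfinder database), treating each rock as a circle of the chosen diameter:

```python
import math

def cumulative_fractional_area(diameters, survey_area):
    """Fraction of the surveyed area covered by rocks, modeled as circles."""
    rock_area = sum(math.pi * (d / 2.0) ** 2 for d in diameters)
    return rock_area / survey_area

# Toy example: elongated rocks seen end-on, so the apparent width
# understates coverage. Each rock is 0.4 m long x 0.2 m wide:
# average diameter 0.3 m, apparent width 0.2 m.
avg_diameters = [0.3] * 50
apparent_widths = [0.2] * 50
area = 25.0  # m^2 surveyed

cfa_avg = cumulative_fractional_area(avg_diameters, area)
cfa_wide = cumulative_fractional_area(apparent_widths, area)
print(f"CFA from average diameter: {cfa_avg:.3f}")
print(f"CFA from apparent width:   {cfa_wide:.3f}")
```

    With non-circular rocks the two measures systematically diverge, which is the sense in which the earlier 16% figure could agree with remote sensing data while still resting on the wrong size measure.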

  17. Cancer incidence and mortality in workers employed at a transformer manufacturing plant: update to a cohort study.

    PubMed

    Yassi, Annalee; Tate, Robert B; Routledge, Michael

    2003-07-01

    This study is an extension of a previously published analysis of cancer mortality at a transformer manufacturing plant where there had been extensive use of mineral oil transformer fluid. The objectives of the present study were to update the mortality analysis to include deaths from the past 6 years and to analyze cancer incidence in the cohort. A cohort of 2,222 males working at a transformer manufacturing plant between 1946 and 1975 was constructed. Using a classical historical cohort study design, cancer incidence and mortality were determined through record linkage with Canadian provincial and national registries. The rates of cancer incidence and mortality experienced by this cohort were compared to those of the Canadian male population. A statistically significant increased risk of developing and dying of pancreatic cancer was found, but no increase in overall cancer mortality. This was consistent with the previous report from this group. Interestingly, the cohort demonstrated a statistically significant increase in overall cancer incidence and a specific increased incidence of gallbladder cancer. This study contributes further evidence to the growing body of literature indicating the carcinogenic properties of mineral oils used in occupational settings, in particular those used prior to the 1970s. Copyright 2003 Wiley-Liss, Inc.

  18. A comparison of several techniques for imputing tree level data

    Treesearch

    David Gartner

    2002-01-01

    As Forest Inventory and Analysis (FIA) changes from periodic surveys to the multipanel annual survey, new analytical methods become available. The current official statistic is the moving average. One alternative is an updated moving average. Several methods of updating plot per acre volume have been discussed previously. However, these methods may not be appropriate...

  19. Forest statistics for Southwest-South Alabama counties - 1990

    Treesearch

    William H. McWilliams; Patrick E. Miller; John S. Vissage

    1990-01-01

    Tabulated results were derived from data obtained during a recent forest inventory of southeast Alabama (fig. 1). Core tables (1 to 25) are compatible among Forest Inventory and Analysis units in the Eastern U.S. Other tables (26 to 43) supplement the information contained in the core tables. Comparisons are made between results of the 1990 inventory and previous...

  20. Homologues of insulinase, a new superfamily of metalloendopeptidases.

    PubMed Central

    Rawlings, N D; Barrett, A J

    1991-01-01

    On the basis of a statistical analysis of an alignment of the amino acid sequences, a new superfamily of metalloendopeptidases is proposed, consisting of human insulinase, Escherichia coli protease III and mitochondrial processing endopeptidases from Saccharomyces and Neurospora. These enzymes do not contain the 'HEXXH' consensus sequence found in all previously recognized zinc metalloendopeptidases. PMID:2025223

  1. The Cost of a Tuition Tax Credit Reconsidered in the Light of New Evidence.

    ERIC Educational Resources Information Center

    Frey, Donald E.

    1982-01-01

    Using regression analysis on 1976-78 data from the National Center for Education Statistics, the author estimates demand and supply elasticities for nonpublic school tuition and enrollment. Application of the elasticities to data from a 1978 study indicates that federal tuition tax credits would be more costly than previously projected. (Author/RW)

  2. Taxonomic evaluation of species in the Streptomyces hirsutus clade using multi-locus sequence analysis and proposals to reclassify several species in this clade

    USDA-ARS?s Scientific Manuscript database

    Previous phylogenetic analyses of species of Streptomyces based on 16S rRNA gene sequences resulted in a statistically well-supported clade (100% bootstrap value) containing 8 species that exhibited very similar gross morphology in producing open looped (Retinaculum-Apertum) to spiral (Spira) chains...

  3. First Monte Carlo analysis of fragmentation functions from single-inclusive e + e - annihilation

    DOE PAGES

    Sato, Nobuo; Ethier, J. J.; Melnitchouk, W.; ...

    2016-12-02

    Here, we perform the first iterative Monte Carlo (IMC) analysis of fragmentation functions constrained by all available data from single-inclusive $e^+ e^-$ annihilation into pions and kaons. The IMC method eliminates potential bias in traditional analyses based on single fits, introduced by fixing parameters not well constrained by the data, and provides a statistically rigorous determination of uncertainties. Our analysis reveals specific differences between the fragmentation functions obtained using the new IMC methodology and those obtained from previous analyses, especially for light quarks and for strange quark fragmentation to kaons.

  4. An improved multiple linear regression and data analysis computer program package

    NASA Technical Reports Server (NTRS)

    Sidik, S. M.

    1972-01-01

    NEWRAP (an improved version of a previous multiple linear regression program called RAPIER), together with CREDUC and CRSPLT, allows for a complete regression analysis including cross plots of the independent and dependent variables, correlation coefficients, regression coefficients, analysis of variance tables, t-statistics and their probability levels, rejection of independent variables, plots of residuals against the independent and dependent variables, and a canonical reduction of quadratic response functions useful in optimum-seeking experimentation. A major improvement over RAPIER is that all regression calculations are done in double precision arithmetic.
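    The regression quantities the package reports (coefficients, t-statistics, probability levels) follow directly from ordinary least squares. A compact sketch in Python with synthetic data (an illustration of the standard formulas, not the original FORTRAN routines):

```python
import numpy as np

def ols_with_tstats(X, y):
    """OLS coefficients and t-statistics; X gains an intercept column."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    dof = len(y) - X.shape[1]
    sigma2 = resid @ resid / dof                 # residual variance
    cov = sigma2 * np.linalg.inv(X.T @ X)        # covariance of the estimates
    se = np.sqrt(np.diag(cov))
    return beta, beta / se

# Synthetic data with known coefficients [3.0, 2.0, -0.5] and small noise
rng = np.random.default_rng(0)
x1 = rng.uniform(0, 10, 40)
x2 = rng.uniform(0, 10, 40)
y = 3.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(0, 0.1, 40)
beta, t = ols_with_tstats(np.column_stack([x1, x2]), y)
print(beta)
```

    Probability levels for the t-statistics would come from the t distribution with the residual degrees of freedom (e.g., `scipy.stats.t.sf`).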

  5. Quantitative three-dimensional ice roughness from scanning electron microscopy

    NASA Astrophysics Data System (ADS)

    Butterfield, Nicholas; Rowe, Penny M.; Stewart, Emily; Roesel, David; Neshyba, Steven

    2017-03-01

    We present a method for inferring surface morphology of ice from scanning electron microscope images. We first develop a novel functional form for the backscattered electron intensity as a function of ice facet orientation; this form is parameterized using smooth ice facets of known orientation. Three-dimensional representations of rough surfaces are retrieved at approximately micrometer resolution using Gauss-Newton inversion within a Bayesian framework. Statistical analysis of the resulting data sets permits characterization of ice surface roughness with a much higher statistical confidence than previously possible. A survey of results in the range -39°C to -29°C shows that characteristics of the roughness (e.g., Weibull parameters) are sensitive not only to the degree of roughening but also to the symmetry of the roughening. These results suggest that roughening characteristics obtained by remote sensing and in situ measurements of atmospheric ice clouds can potentially provide more facet-specific information than has previously been appreciated.

  6. Statistics of Magnetic Reconnection X-Lines in Kinetic Turbulence

    NASA Astrophysics Data System (ADS)

    Haggerty, C. C.; Parashar, T.; Matthaeus, W. H.; Shay, M. A.; Wan, M.; Servidio, S.; Wu, P.

    2016-12-01

    In this work we examine the statistics of magnetic reconnection sites (x-lines) and their associated reconnection rates in intermittent current sheets generated in turbulent plasmas. Although such statistics have been studied previously for fluid simulations (e.g. [1]), they have not yet been generalized to fully kinetic particle-in-cell (PIC) simulations. A significant problem with PIC simulations, however, is electrostatic fluctuations generated by numerical particle counting statistics. We find that analyzing gradients of the magnetic vector potential from the raw PIC field data identifies numerous artificial (non-physical) x-points. Using small Orszag-Tang vortex PIC simulations, we analyze x-line identification and show that these artificial x-lines can be removed using sub-Debye-length filtering of the data. We examine how turbulent properties such as the magnetic spectrum and scale-dependent kurtosis are affected by particle noise and sub-Debye-length filtering. We subsequently apply these analysis methods to a large-scale kinetic PIC turbulence simulation. Consistent with previous fluid models, we find a range of normalized reconnection rates as large as ½, with the bulk of the rates less than approximately 0.1. [1] Servidio, S., W. H. Matthaeus, M. A. Shay, P. A. Cassak, and P. Dmitruk (2009), Magnetic reconnection and two-dimensional magnetohydrodynamic turbulence, Phys. Rev. Lett., 102, 115003.

  7. Statistical analysis for improving data precision in the SPME GC-MS analysis of blackberry (Rubus ulmifolius Schott) volatiles.

    PubMed

    D'Agostino, M F; Sanz, J; Martínez-Castro, I; Giuffrè, A M; Sicari, V; Soria, A C

    2014-07-01

    Statistical analysis has been used for the first time to evaluate the dispersion of quantitative data in the solid-phase microextraction (SPME) followed by gas chromatography-mass spectrometry (GC-MS) analysis of blackberry (Rubus ulmifolius Schott) volatiles, with the aim of improving their precision. Experimental and randomly simulated data were compared using different statistical parameters (correlation coefficients, Principal Component Analysis loadings and eigenvalues). Non-random factors were shown to contribute significantly to total dispersion; groups of volatile compounds could be associated with these factors. A significant improvement in precision was achieved when considering percent concentration ratios, rather than percent values, among those blackberry volatiles with a similar dispersion behavior. As a novelty over previous reports, and to complement this main objective, non-random dispersion trends were also demonstrated in data from simple blackberry model systems. Although the influence of the type of matrix on data precision was proved, the model systems did not permit a better understanding of the dispersion patterns in real samples. The approach used here was validated for the first time through the multicomponent characterization of Italian blackberries from different harvest years. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. Bispectral analysis of equatorial spread F density irregularities

    NASA Technical Reports Server (NTRS)

    Labelle, J.; Lund, E. J.

    1992-01-01

    Bispectral analysis has been applied to density irregularities at frequencies 5-30 Hz observed with a sounding rocket launched from Peru in March 1983. Unlike the power spectrum, the bispectrum contains statistical information about the phase relations between the Fourier components which make up the waveform. In the case of spread F data from 475 km, the 5-30 Hz portion of the spectrum displays overall enhanced bicoherence relative to that of the background instrumental noise and to that expected due to statistical considerations, implying that the observed f^(-2.5) power-law spectrum has a significant non-Gaussian component. This is consistent with previous qualitative analyses. The bicoherence has also been calculated for simulated equatorial spread F density irregularities in approximately the same wavelength regime, and the resulting bispectrum has some features in common with that of the rocket data. The implications of this analysis for equatorial spread F are discussed, and some future investigations are suggested.
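    Bicoherence detects exactly this kind of phase coupling between Fourier components. A minimal sketch in Python (synthetic signals, not the rocket data): a signal whose third tone carries the sum of the first two phases yields high bicoherence at that frequency pair, while independent phases do not.

```python
import numpy as np

def bicoherence(segments, k1, k2):
    """Normalized bispectrum magnitude at frequency bins (k1, k2)."""
    num, d1, d2 = 0.0, 0.0, 0.0
    for seg in segments:
        X = np.fft.fft(seg)
        num += X[k1] * X[k2] * np.conj(X[k1 + k2])   # triple product
        d1 += abs(X[k1] * X[k2]) ** 2
        d2 += abs(X[k1 + k2]) ** 2
    return abs(num) / np.sqrt(d1 * d2)

def make_segments(n_seg=200, n=64, k1=8, k2=12, coupled=True, seed=0):
    """Segments with tones at bins k1, k2, k1+k2; phases random per segment."""
    rng = np.random.default_rng(seed)
    t = np.arange(n)
    segs = []
    for _ in range(n_seg):
        p1, p2, p3 = rng.uniform(0, 2 * np.pi, 3)
        if coupled:
            p3 = p1 + p2          # quadratic phase coupling
        segs.append(np.cos(2 * np.pi * k1 * t / n + p1)
                    + np.cos(2 * np.pi * k2 * t / n + p2)
                    + np.cos(2 * np.pi * (k1 + k2) * t / n + p3))
    return segs

b_coupled = bicoherence(make_segments(coupled=True), 8, 12)
b_random = bicoherence(make_segments(coupled=False), 8, 12)
print(round(b_coupled, 3), round(b_random, 3))
```

    For the coupled segments the phase of the triple product cancels in every segment, so the bicoherence approaches 1; with independent phases the segment averages interfere destructively and the bicoherence stays near zero, as it would for a Gaussian process.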

  9. Student failures on first-year medical basic science courses and the USMLE step 1: a retrospective study over a 20-year period.

    PubMed

    Burns, E Robert; Garrett, Judy

    2015-01-01

    Correlates of achievement in the basic science years in medical school and on Step 1 of the United States Medical Licensing Examination® (USMLE®) in relation to preadmission variables have been the subject of considerable study. Preadmission variables such as the undergraduate grade point average (uGPA) and Medical College Admission Test® (MCAT®) scores, solely or in combination, have previously been found to be predictors of achievement in the basic science years and/or on Step 1. The purposes of this retrospective study were to: (1) determine whether our statistical analysis confirmed previously published relationships between preadmission variables (MCAT, uGPA, and applicant pool size), and (2) study correlates of the number of failures in five M1 courses with those preadmission variables and with failures on Step 1. Statistical analysis confirmed previously published relationships between all preadmission variables. Only one course, Microscopic Anatomy, demonstrated significant correlations with all variables studied, including Step 1 failures. Physiology correlated with three of the four variables studied, but not with Step 1 failures. Analyses such as these provide a tool by which administrators can identify which courses are or are not responding in appropriate ways to changes in the preadmission variables that signal student performance on Step 1. © 2014 American Association of Anatomists.

  10. MMASS: an optimized array-based method for assessing CpG island methylation.

    PubMed

    Ibrahim, Ashraf E K; Thorne, Natalie P; Baird, Katie; Barbosa-Morais, Nuno L; Tavaré, Simon; Collins, V Peter; Wyllie, Andrew H; Arends, Mark J; Brenton, James D

    2006-01-01

    We describe an optimized microarray method for identifying genome-wide CpG island methylation called microarray-based methylation assessment of single samples (MMASS), which directly compares methylated to unmethylated sequences within a single sample. To improve on previous methods, we used bioinformatic analysis to predict an optimized combination of methylation-sensitive enzymes that had the highest utility for CpG-island probes, and different methods to produce unmethylated representations of test DNA for more sensitive detection of differential methylation by hybridization. Subtraction or methylation-dependent digestion with McrBC was used with optimized (MMASS-v2) or previously described (MMASS-v1, MMASS-sub) methylation-sensitive enzyme combinations and compared with a published McrBC method. Comparison was performed using DNA from the cell line HCT116. We show that the distribution of methylation microarray data is inherently skewed and requires exogenous spiked controls for normalization, and that analysis of digestion of methylated and unmethylated control sequences, together with linear fit models of replicate data, showed superior statistical power for the MMASS-v2 method. Comparison with previous methylation data for HCT116 and validation of CpG islands from PXMP4, SFRP2, DCC, RARB and TSEN2 confirmed the accuracy of MMASS-v2 results. The MMASS-v2 method offers improved sensitivity and statistical power for high-throughput microarray identification of differential methylation.

  11. Analysis of covariance as a remedy for demographic mismatch of research subject groups: some sobering simulations.

    PubMed

    Adams, K M; Brown, G G; Grant, I

    1985-08-01

    Analysis of Covariance (ANCOVA) is often used in neuropsychological studies to effect ex post facto adjustment of performance variables amongst groups of subjects mismatched on some relevant demographic variable. This paper reviews some of the statistical assumptions underlying this usage. In an attempt to illustrate the complexities of this statistical technique, three sham studies using actual patient data are presented. These staged simulations have varying relationships between group test performance differences and levels of covariate discrepancy. The results were robust and consistent in nature, and support the wisdom of previous cautions by statisticians concerning the employment of ANCOVA to justify comparisons between incomparable groups. ANCOVA should not be used in neuropsychological research to equate groups unequal on variables such as age and education, or to exert statistical control whose objective is to eliminate consideration of the covariate as an explanation for results. Finally, the report advocates by example the use of simulation to further our understanding of neuropsychological variables.
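    The adjustment being critiqued can be sketched as a linear model with a group indicator and the covariate. A toy Python simulation (illustrative, not the paper's patient data): when the covariate alone drives the outcome, the covariate-adjusted group effect collapses to zero even though the raw group means differ, which is exactly the kind of statistical "equating" the authors warn against over-interpreting.

```python
import numpy as np

def ancova_group_effect(y, group, covariate):
    """Group effect adjusted for a covariate: y ~ intercept + group + covariate."""
    X = np.column_stack([np.ones(len(y)), group, covariate])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]          # coefficient on the group indicator

# Two groups mismatched on the covariate (e.g., years of education), with
# the outcome driven purely by the covariate: y = 2 * covariate, no group effect.
covariate = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
group = np.array([0, 0, 0, 0, 1, 1, 1, 1])
y = 2.0 * covariate

raw_diff = y[group == 1].mean() - y[group == 0].mean()
adj_effect = ancova_group_effect(y, group, covariate)
print(raw_diff, adj_effect)
```

    The adjustment makes the large raw difference vanish, but that arithmetic says nothing about whether treating the mismatched groups as comparable was justified in the first place, which is the paper's point.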

  12. The Statistical Package for the Social Sciences (SPSS) as an adjunct to pharmacokinetic analysis.

    PubMed

    Mather, L E; Austin, K L

    1983-01-01

    Computer techniques for numerical analysis are well known to pharmacokineticists. Powerful techniques for data file management have been developed by social scientists but have, in general, been ignored by pharmacokineticists because of their apparent lack of ability to interface with pharmacokinetic programs. Extensive use has been made of the Statistical Package for the Social Sciences (SPSS) for its data handling capabilities, but at the same time, techniques have been developed within SPSS to interface with pharmacokinetic programs of the users' choice and to carry out a variety of user-defined pharmacokinetic tasks within SPSS commands, apart from the expected variety of statistical tasks. Because it is based on a ubiquitous package, this methodology has all of the benefits of excellent documentation, interchangeability between different types and sizes of machines and true portability of techniques and data files. An example is given of the total management of a pharmacokinetic study previously reported in the literature by the authors.

  13. Peculiarities of the statistics of spectrally selected fluorescence radiation in laser-pumped dye-doped random media

    NASA Astrophysics Data System (ADS)

    Yuvchenko, S. A.; Ushakova, E. V.; Pavlova, M. V.; Alonova, M. V.; Zimnyakov, D. A.

    2018-04-01

    We consider the practical realization of a new optical probing method for random media, defined as reference-free path-length interferometry with intensity-moment analysis. A peculiarity in the statistics of the spectrally selected fluorescence radiation in a laser-pumped dye-doped random medium is discussed. Previously established correlations between the second- and third-order moments of the intensity fluctuations in random interference patterns, the coherence function of the probe radiation, and the path-difference probability density for the interfering partial waves in the medium are confirmed. The correlations were verified using statistical analysis of the spectrally selected fluorescence radiation emitted by a laser-pumped dye-doped random medium. A water solution of Rhodamine 6G was applied as the doping fluorescent agent for ensembles of densely packed silica grains, which were pumped by the 532 nm radiation of a solid-state laser. The spectrum of the mean path length for the random medium was reconstructed.

  14. Solar granulation and statistical crystallography: A modeling approach using size-shape relations

    NASA Technical Reports Server (NTRS)

    Noever, D. A.

    1994-01-01

    The irregular polygonal pattern of solar granulation is analyzed for size-shape relations using statistical crystallography. In contrast to previous work which has assumed perfectly hexagonal patterns for granulation, more realistic accounting of cell (granule) shapes reveals a broader basis for quantitative analysis. Several features emerge as noteworthy: (1) a linear correlation between number of cell-sides and neighboring shapes (called Aboav-Weaire's law); (2) a linear correlation between both average cell area and perimeter and the number of cell-sides (called Lewis's law and a perimeter law, respectively) and (3) a linear correlation between cell area and squared perimeter (called convolution index). This statistical picture of granulation is consistent with a finding of no correlation in cell shapes beyond nearest neighbors. A comparative calculation between existing model predictions taken from luminosity data and the present analysis shows substantial agreements for cell-size distributions. A model for understanding grain lifetimes is proposed which links convective times to cell shape using crystallographic results.

  15. Spatial statistical analysis of tree deaths using airborne digital imagery

    NASA Astrophysics Data System (ADS)

    Chang, Ya-Mei; Baddeley, Adrian; Wallace, Jeremy; Canci, Michael

    2013-04-01

    High resolution digital airborne imagery offers unprecedented opportunities for observation and monitoring of vegetation, providing the potential to identify, locate and track individual vegetation objects over time. Analytical tools are required to quantify relevant information. In this paper, locations of trees over a large area of native woodland vegetation were identified using morphological image analysis techniques. Methods of spatial point process statistics were then applied to estimate the spatially-varying tree death risk, and to show that it is significantly non-uniform. [Tree deaths over the area were detected in our previous work (Wallace et al., 2008).] The study area is a major source of ground water for the city of Perth, and the work was motivated by the need to understand and quantify vegetation changes in the context of water extraction and drying climate. The influence of hydrological variables on tree death risk was investigated using spatial statistics (graphical exploratory methods, spatial point pattern modelling and diagnostics).

  16. Markov Logic Networks in the Analysis of Genetic Data

    PubMed Central

    Sakhanenko, Nikita A.

    2010-01-01

    Abstract Complex, non-additive genetic interactions are common and can be critical in determining phenotypes. Genome-wide association studies (GWAS) and similar statistical studies of linkage data, however, assume additive models of gene interactions in looking for genotype-phenotype associations. These statistical methods view the compound effects of multiple genes on a phenotype as a sum of influences of each gene and often miss a substantial part of the heritable effect. Such methods do not use any biological knowledge about underlying mechanisms. Modeling approaches from the artificial intelligence (AI) field that incorporate deterministic knowledge into models to perform statistical analysis can be applied to include prior knowledge in genetic analysis. We chose to use the most general such approach, Markov Logic Networks (MLNs), for combining deterministic knowledge with statistical analysis. Using simple, logistic regression-type MLNs we can replicate the results of traditional statistical methods, but we also show that we are able to go beyond finding independent markers linked to a phenotype by using joint inference without an independence assumption. The method is applied to genetic data on yeast sporulation, a complex phenotype with gene interactions. In addition to detecting all of the previously identified loci associated with sporulation, our method identifies four loci with smaller effects. Since their effect on sporulation is small, these four loci were not detected with methods that do not account for dependence between markers due to gene interactions. We show how gene interactions can be detected using more complex models, which can be used as a general framework for incorporating systems biology with genetics. PMID:20958249

  17. Exploring the statistics of magnetic reconnection X-points in kinetic particle-in-cell turbulence

    NASA Astrophysics Data System (ADS)

    Haggerty, C. C.; Parashar, T. N.; Matthaeus, W. H.; Shay, M. A.; Yang, Y.; Wan, M.; Wu, P.; Servidio, S.

    2017-10-01

    Magnetic reconnection is a ubiquitous phenomenon in turbulent plasmas. It is an important part of the turbulent dynamics and heating of space and astrophysical plasmas. We examine the statistics of magnetic reconnection using a quantitative local analysis of the magnetic vector potential, previously used in magnetohydrodynamics (MHD) simulations and now applied to fully kinetic particle-in-cell (PIC) simulations. Different ways of reducing the particle noise for analysis purposes, including multiple smoothing techniques, are explored. We find that a Fourier filter applied at the Debye scale is an optimal choice for analyzing PIC data. Finally, we find a broader distribution of normalized reconnection rates compared to the MHD limit, with rates as large as 0.5 but with an average of approximately 0.1.
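
    The Debye-scale Fourier filtering described above amounts to a low-pass mask in wavenumber space. A minimal sketch on a synthetic noisy 2D field (the field, noise level, and cutoff below are invented for illustration, not taken from the paper's simulations):

```python
# Low-pass Fourier filtering of a noisy 2D field, in the spirit of the
# particle-noise smoothing described above. All quantities are synthetic.
import numpy as np

rng = np.random.default_rng(4)
n = 128
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
signal = np.sin(3 * x)[None, :] * np.cos(2 * x)[:, None]   # smooth "field"
noisy = signal + 0.5 * rng.normal(size=(n, n))             # particle-like noise

# Radial wavenumber mask; k_cut stands in for the Debye-scale cutoff.
k = np.fft.fftfreq(n, d=x[1] - x[0]) * 2 * np.pi
kx, ky = np.meshgrid(k, k)
k_cut = 8.0
mask = np.sqrt(kx**2 + ky**2) <= k_cut

filtered = np.real(np.fft.ifft2(np.fft.fft2(noisy) * mask))

# Filtering should bring the field closer to the underlying smooth signal.
err_before = np.sqrt(np.mean((noisy - signal) ** 2))
err_after = np.sqrt(np.mean((filtered - signal) ** 2))
print(err_after < err_before)
```

    Because the signal's spectral content lies well inside the cutoff while the noise is broadband, the mask removes most of the noise power at little cost to the signal.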

  18. [Incidence and surgical wound infection risk factors in breast cancer surgery].

    PubMed

    Lefebvre, D; Penel, N; Deberles, M F; Fournier, C

    2000-11-18

    In order to evaluate occurrence and risk factors for wound infection (WI) in breast cancer surgery, we carried out a prospective study. From September 1996 through April 1997, an infection control physician prospectively evaluated 542 wounds of all patients having breast cancer surgery at the Oscar Lambret Cancer Center. WI was defined as a wound with pus. Antibiotic prophylaxis was given in case of immediate breast reconstruction. Statistical evaluation was performed using the chi-square test for categorical data and the non-parametric Mann-Whitney test for continuous data. In univariate analysis, differences were considered significant at p < 0.01. The overall WI rate was 3.51% (19/542). In univariate analysis, risk factors for WI were: total preoperative hospital stay (p = 0.01), previous chemotherapy (p = 0.01), previous oncologic surgery (p = 0.03) and immediate breast reconstruction (p = 0.002). In multivariate analysis, we observed two independent predictive factors for WI: previous chemotherapy (p = 0.05) and immediate breast reconstruction (p = 0.02). Previous anticancer chemotherapy was a major risk factor. In these cases, a phase III trial could confirm the efficacy of standard antibiotic prophylaxis. Breast reconstruction was the second major risk factor. Standard antibiotic prophylaxis (used in our study) was insufficient.
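
    The univariate screening described in this abstract pairs a chi-square test for categorical factors with a Mann-Whitney test for continuous ones. A minimal sketch, assuming `scipy` is available; the counts and stay lengths below are invented, not the study's data:

```python
# Illustrative univariate screening of wound-infection risk factors:
# chi-square for a categorical factor, Mann-Whitney for a continuous one.
import numpy as np
from scipy.stats import chi2_contingency, mannwhitneyu

rng = np.random.default_rng(0)

# Hypothetical 2x2 table: infection status vs. immediate reconstruction.
table = np.array([[8, 11],     # reconstruction: infected / not infected
                  [11, 512]])  # no reconstruction (counts are made up)
chi2_stat, p_cat, _, _ = chi2_contingency(table)

# Hypothetical continuous factor: preoperative stay (days) by infection status.
stay_infected = rng.poisson(5, size=19) + 1
stay_clean = rng.poisson(3, size=523) + 1
u_stat, p_cont = mannwhitneyu(stay_infected, stay_clean)

print(f"chi-square p = {p_cat:.4f}, Mann-Whitney p = {p_cont:.4f}")
```

    Both tests are distribution-light choices appropriate for the small event count (19 infections) in the study.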

  19. Seeing Prehistory through New Lenses: Using Geophysical and Statistical Analysis to Identify Fresh Perspectives of a 15th Century Mandan Occupation

    NASA Astrophysics Data System (ADS)

    Mitchum, Amber Marie

    Great Plains prehistoric research has evolved over the course of a century, with many sites like Huff Village (32MO11) in North Dakota recently coming back to the forefront of discussion through new technological applications. Through a majority of its studies and excavations, Huff Village appeared to endure as the final stage in the Middle Missouri tradition. Long thought to reflect only the systematically placed long-rectangular structure types of its Middle Missouri predecessors, recent magnetic gradiometry and topographic mapping data revealed circular structure types that deviated from long-held traditions, highlighting new associations with Coalescent groups. A compact system for food capacity was also discovered, with more than 1,500 storage pits visible inside and outside of all structures delineated. Archaeological applications of these new technologies have provided a near-complete picture of this 15th century Mandan expression, allowing new questions to be raised about its previous taxonomic placement. Using a combination of GIS and statistical analysis, an attempt is made to quantitatively examine whether it truly represented the Terminal Middle Missouri variant or whether Huff diverged in new directions. Statistical analysis disagrees with previous conclusions that a patterned layout of structures existed, with significant clustering among structures shown through point pattern analysis and Ripley's K function. Clustering of external storage pits also emerged from similar analysis, highlighting a connection between external storage features and the structures they surrounded. A combination of documented defensive features, a much higher estimation of caloric support for the population present, and a short occupation leads us to believe that a significant transition was occurring, one that incorporated attributes of both the Middle Missouri tradition and the Coalescent tradition. With more refined taxonomies currently developing, it is hoped that these data will help in the effort to develop future classifications that represent this complex period in prehistory.
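
    The clustering test mentioned in this abstract can be illustrated with a naive (edge-correction-free) Ripley's K estimate on simulated coordinates; the cluster layout below is invented, not the Huff Village data:

```python
# Naive Ripley's K estimate for detecting clustering in a point pattern.
# Coordinates are simulated, not archaeological survey data.
import numpy as np

rng = np.random.default_rng(1)

def ripley_k(points, r, area):
    """K(r) estimate without edge correction: average number of further
    points within distance r of a typical point, scaled by intensity."""
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    counts = ((d < r) & (d > 0)).sum()       # ordered pairs closer than r
    lam = n / area                           # point intensity
    return counts / (n * lam)

# Clustered pattern: points scattered tightly around a few parent locations
# in a 10 x 10 window.
parents = rng.uniform(0, 10, size=(5, 2))
pts = np.vstack([p + rng.normal(0, 0.3, size=(20, 2)) for p in parents])

r = 1.0
k_obs = ripley_k(pts, r, area=100.0)
k_csr = np.pi * r**2          # expected K(r) under complete spatial randomness
print(k_obs > k_csr)          # clustering inflates K above the CSR benchmark
```

    Comparing the observed K(r) against the complete-spatial-randomness benchmark pi r^2 is the basic logic behind the significance claims in the abstract; production analyses add edge corrections and simulation envelopes.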

  20. Analysis of Garment Production Methods. Part 2: Comparison of Cost and Production between a Traditional Bundle System and Modular Manufacturing

    DTIC Science & Technology

    1992-02-01

    configuration. We have spent the last year observing two firms as they experimented with modular manufacturing. The following report will track the progress of...the transitions as they moved through the year. Incorporated into the analysis is the statistical interpretation of data collected from each firm, as...during the year. FEBRUARY The most noticeable change this month was the introduction of the new ergonomic chairs for the operators. Previously the

  1. Pan-Cancer Analysis of Mutation Hotspots in Protein Domains.

    PubMed

    Miller, Martin L; Reznik, Ed; Gauthier, Nicholas P; Aksoy, Bülent Arman; Korkut, Anil; Gao, Jianjiong; Ciriello, Giovanni; Schultz, Nikolaus; Sander, Chris

    2015-09-23

    In cancer genomics, recurrence of mutations in independent tumor samples is a strong indicator of functional impact. However, rare functional mutations can escape detection by recurrence analysis owing to lack of statistical power. We enhance statistical power by extending the notion of recurrence of mutations from single genes to gene families that share homologous protein domains. Domain mutation analysis also sharpens the functional interpretation of the impact of mutations, as domains more succinctly embody function than entire genes. By mapping mutations in 22 different tumor types to equivalent positions in multiple sequence alignments of domains, we confirm well-known functional mutation hotspots, identify uncharacterized rare variants in one gene that are equivalent to well-characterized mutations in another gene, detect previously unknown mutation hotspots, and provide hypotheses about molecular mechanisms and downstream effects of domain mutations. With the rapid expansion of cancer genomics projects, protein domain hotspot analysis will likely provide many more leads linking mutations in proteins to the cancer phenotype. Copyright © 2015 Elsevier Inc. All rights reserved.
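
    The core mapping step in this abstract (pooling per-gene mutations by their equivalent alignment column) can be sketched with toy data; the gene names, positions, and alignment maps below are invented, not TCGA data:

```python
# Count mutation recurrences per domain-alignment column, pooling across
# genes that share the domain. Toy example, not real cancer genomics data.
from collections import Counter

# Per-gene mapping: protein position -> column in the domain alignment.
aln_map = {
    "GENE_A": {12: 1, 13: 2, 14: 3, 15: 4},
    "GENE_B": {45: 1, 46: 2, 48: 4},
}

# Observed mutations as (gene, protein position) across tumor samples.
mutations = [("GENE_A", 13), ("GENE_B", 46), ("GENE_A", 13),
             ("GENE_B", 46), ("GENE_A", 15)]

# Pooling by alignment column lets rare per-gene variants reinforce each
# other: column 2 aggregates hits from both genes.
hotspots = Counter(aln_map[g][pos] for g, pos in mutations)
print(hotspots.most_common(1))   # → [(2, 4)]
```

    This is why the approach gains statistical power: a position mutated twice in each of two homologous genes becomes a four-hit hotspot at the domain level, even though neither gene alone would pass a recurrence threshold.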

  2. Gene-Based Association Analysis for Censored Traits Via Fixed Effect Functional Regressions.

    PubMed

    Fan, Ruzong; Wang, Yifan; Yan, Qi; Ding, Ying; Weeks, Daniel E; Lu, Zhaohui; Ren, Haobo; Cook, Richard J; Xiong, Momiao; Swaroop, Anand; Chew, Emily Y; Chen, Wei

    2016-02-01

    Genetic studies of survival outcomes have been proposed and conducted recently, but statistical methods for identifying genetic variants that affect disease progression are rarely developed. Motivated by our ongoing real studies, here we develop Cox proportional hazard models using functional regression (FR) to perform gene-based association analysis of survival traits while adjusting for covariates. The proposed Cox models are fixed effect models where the genetic effects of multiple genetic variants are assumed to be fixed. We introduce likelihood ratio test (LRT) statistics to test for associations between the survival traits and multiple genetic variants in a genetic region. Extensive simulation studies demonstrate that the proposed Cox FR LRT statistics have well-controlled type I error rates. To evaluate power, we compare the Cox FR LRT with the previously developed burden test (BT) in a Cox model and sequence kernel association test (SKAT), which is based on mixed effect Cox models. The Cox FR LRT statistics have higher power than or similar power as Cox SKAT LRT except when 50%/50% causal variants had negative/positive effects and all causal variants are rare. In addition, the Cox FR LRT statistics have higher power than Cox BT LRT. The models and related test statistics can be useful in whole-genome and whole-exome association studies. An age-related macular degeneration dataset was analyzed as an example. © 2016 WILEY PERIODICALS, INC.
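
    The likelihood-ratio test at the heart of the Cox FR approach reduces to comparing the maximized log-likelihoods of nested models. A minimal sketch, assuming `scipy` is available; the log-likelihood values below are invented numbers, not output of a fitted Cox model:

```python
# Likelihood ratio test: 2 * (loglik_full - loglik_null) referred to a
# chi-square distribution with df = number of extra parameters.
from scipy.stats import chi2

def lrt_pvalue(loglik_null, loglik_full, df):
    """Return the LRT statistic and its chi-square p-value."""
    stat = 2.0 * (loglik_full - loglik_null)
    return stat, chi2.sf(stat, df)

# Null model: covariates only. Full model: covariates + 3 genetic-effect
# terms (hypothetical log-likelihoods for illustration).
stat, p = lrt_pvalue(loglik_null=-512.4, loglik_full=-504.1, df=3)
print(f"LRT statistic = {stat:.2f}, p = {p:.4f}")
```

    In the paper's setting the genetic-effect terms come from the functional-regression expansion of variant effects within a gene region; the asymptotic chi-square reference is what gives the reported type I error control.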

  3. Gene-based Association Analysis for Censored Traits Via Fixed Effect Functional Regressions

    PubMed Central

    Fan, Ruzong; Wang, Yifan; Yan, Qi; Ding, Ying; Weeks, Daniel E.; Lu, Zhaohui; Ren, Haobo; Cook, Richard J; Xiong, Momiao; Swaroop, Anand; Chew, Emily Y.; Chen, Wei

    2015-01-01

    Summary Genetic studies of survival outcomes have been proposed and conducted recently, but statistical methods for identifying genetic variants that affect disease progression are rarely developed. Motivated by our ongoing real studies, we develop here Cox proportional hazard models using functional regression (FR) to perform gene-based association analysis of survival traits while adjusting for covariates. The proposed Cox models are fixed effect models where the genetic effects of multiple genetic variants are assumed to be fixed. We introduce likelihood ratio test (LRT) statistics to test for associations between the survival traits and multiple genetic variants in a genetic region. Extensive simulation studies demonstrate that the proposed Cox FR LRT statistics have well-controlled type I error rates. To evaluate power, we compare the Cox FR LRT with the previously developed burden test (BT) in a Cox model and sequence kernel association test (SKAT), which is based on mixed effect Cox models. The Cox FR LRT statistics have higher power than or similar power as Cox SKAT LRT except when 50%/50% causal variants had negative/positive effects and all causal variants are rare. In addition, the Cox FR LRT statistics have higher power than Cox BT LRT. The models and related test statistics can be useful in whole-genome and whole-exome association studies. An age-related macular degeneration dataset was analyzed as an example. PMID:26782979

  4. A hands-on practical tutorial on performing meta-analysis with Stata.

    PubMed

    Chaimani, Anna; Mavridis, Dimitris; Salanti, Georgia

    2014-11-01

    Statistical synthesis of research findings via meta-analysis is widely used to assess the relative effectiveness of competing interventions. A series of three papers aimed at familiarising mental health scientists with the key statistical concepts and problems in meta-analysis was recently published in this journal. One paper focused on the selection and interpretation of the appropriate model to synthesise results (fixed effect or random effects model) whereas the other two papers focused on two major threats that compromise the validity of meta-analysis results, namely publication bias and missing outcome data. In this paper we provide guidance on how to undertake meta-analysis using Stata, one of the most commonly used software packages for meta-analysis. We address the three topics covered in the previous issues of the journal, focusing on their implementation in Stata using a working example from mental health research. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  5. Evidence-Based Medicine as a Tool for Undergraduate Probability and Statistics Education.

    PubMed

    Masel, J; Humphrey, P T; Blackburn, B; Levine, J A

    2015-01-01

    Most students have difficulty reasoning about chance events, and misconceptions regarding probability can persist or even strengthen following traditional instruction. Many biostatistics classes sidestep this problem by prioritizing exploratory data analysis over probability. However, probability itself, in addition to statistics, is essential both to the biology curriculum and to informed decision making in daily life. One area in which probability is particularly important is medicine. Given the preponderance of pre-health students, in addition to more general interest in medicine, we capitalized on students' intrinsic motivation in this area to teach both probability and statistics. We use the randomized controlled trial as the centerpiece of the course, because it exemplifies the most salient features of the scientific method, and the application of critical thinking to medicine. The other two pillars of the course are biomedical applications of Bayes' theorem and science and society content. Backward design from these three overarching aims was used to select appropriate probability and statistics content, with a focus on eliciting and countering previously documented misconceptions in their medical context. Pretest/posttest assessments using the Quantitative Reasoning Quotient and Attitudes Toward Statistics instruments are positive, bucking several negative trends previously reported in statistics education. © 2015 J. Masel et al. CBE—Life Sciences Education © 2015 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
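
    A typical biomedical application of Bayes' theorem of the kind this course centers on is the positive predictive value of a diagnostic test. A worked example with invented prevalence and test characteristics (not figures from the course):

```python
# Bayes' theorem for diagnostic testing: P(disease | test+).
prevalence = 0.01          # P(disease), hypothetical
sensitivity = 0.95         # P(test+ | disease), hypothetical
specificity = 0.90         # P(test- | no disease), hypothetical

# Total probability of a positive test: true positives + false positives.
p_pos = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)

# Positive predictive value via Bayes' formula.
ppv = sensitivity * prevalence / p_pos
print(f"positive predictive value = {ppv:.3f}")   # low despite a "good" test
```

    The counterintuitive result (under 9% of positives are true positives at 1% prevalence) is exactly the kind of documented misconception the course design targets.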

  6. Daniel Goodman’s empirical approach to Bayesian statistics

    USGS Publications Warehouse

    Gerrodette, Tim; Ward, Eric; Taylor, Rebecca L.; Schwarz, Lisa K.; Eguchi, Tomoharu; Wade, Paul; Himes Boor, Gina

    2016-01-01

    Bayesian statistics, in contrast to classical statistics, uses probability to represent uncertainty about the state of knowledge. Bayesian statistics has often been associated with the idea that knowledge is subjective and that a probability distribution represents a personal degree of belief. Dr. Daniel Goodman considered this viewpoint problematic for issues of public policy. He sought to ground his Bayesian approach in data, and advocated the construction of a prior as an empirical histogram of “similar” cases. In this way, the posterior distribution that results from a Bayesian analysis combined comparable previous data with case-specific current data, using Bayes’ formula. Goodman championed such a data-based approach, but he acknowledged that it was difficult in practice. If based on a true representation of our knowledge and uncertainty, Goodman argued that risk assessment and decision-making could be an exact science, despite the uncertainties. In his view, Bayesian statistics is a critical component of this science because a Bayesian analysis produces the probabilities of future outcomes. Indeed, Goodman maintained that the Bayesian machinery, following the rules of conditional probability, offered the best legitimate inference from available data. We give an example of an informative prior in a recent study of Steller sea lion spatial use patterns in Alaska.
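
    Goodman's empirical-histogram prior can be sketched on a grid: build the prior from parameter estimates of similar previous cases, multiply by the case-specific likelihood, and renormalize. All numbers below are simulated for illustration, not drawn from Goodman's work:

```python
# Empirical-Bayes sketch: histogram prior from previous cases combined with
# a case-specific binomial likelihood via Bayes' formula on a grid.
import numpy as np

rng = np.random.default_rng(2)

# Empirical prior: histogram of survival-rate estimates from "similar"
# previous cases (hypothetical values).
previous_estimates = rng.beta(8, 2, size=50)
grid = np.linspace(0.005, 0.995, 199)
prior, edges = np.histogram(previous_estimates, bins=20, range=(0, 1),
                            density=True)
prior_on_grid = prior[np.clip(np.digitize(grid, edges) - 1, 0, 19)]

# Case-specific likelihood: 18 survivors out of 22 (binomial kernel; the
# normalizing constant is irrelevant to the posterior shape).
likelihood = grid**18 * (1 - grid)**4

posterior = prior_on_grid * likelihood
dx = grid[1] - grid[0]
posterior /= posterior.sum() * dx        # normalize on the grid

post_mean = (grid * posterior).sum() * dx
print(f"posterior mean = {post_mean:.3f}")
```

    The posterior blends the previous-case histogram with the current data, which is the "comparable previous data with case-specific current data" combination described above.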

  7. Modular reweighting software for statistical mechanical analysis of biased equilibrium data

    NASA Astrophysics Data System (ADS)

    Sindhikara, Daniel J.

    2012-07-01

    Here a simple, useful, modular approach and software suite designed for statistical reweighting and analysis of equilibrium ensembles is presented. Statistical reweighting is useful and sometimes necessary for analysis of equilibrium enhanced sampling methods, such as umbrella sampling or replica exchange, and also in experimental cases where biasing factors are explicitly known. Essentially, statistical reweighting allows extrapolation of data from one or more equilibrium ensembles to another. Here, the fundamental separable steps of statistical reweighting are broken up into modules, allowing for application to the general case and avoiding the black-box nature of some "all-inclusive" reweighting programs. Additionally, the programs included are, by design, written with few dependencies. The compilers required are either pre-installed on most systems or freely available for download with minimal trouble. Examples of the use of this suite applied to umbrella sampling and replica exchange molecular dynamics simulations are shown, along with advice on how to apply it in the general case.

    New version program summary
    Program title: Modular reweighting version 2
    Catalogue identifier: AEJH_v2_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEJH_v2_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU General Public License, version 3
    No. of lines in distributed program, including test data, etc.: 179 118
    No. of bytes in distributed program, including test data, etc.: 8 518 178
    Distribution format: tar.gz
    Programming language: C++, Python 2.6+, Perl 5+
    Computer: Any
    Operating system: Any
    RAM: 50-500 MB
    Supplementary material: An updated version of the original manuscript (Comput. Phys. Commun. 182 (2011) 2227) is available
    Classification: 4.13
    Catalogue identifier of previous version: AEJH_v1_0
    Journal reference of previous version: Comput. Phys. Commun. 182 (2011) 2227
    Does the new version supersede the previous version?: Yes
    Nature of problem: While equilibrium reweighting is ubiquitous, there are no public programs available to perform the reweighting in the general case. Further, specific programs often suffer from many library dependencies and numerical instability.
    Solution method: This package is written in a modular format that allows for easy applicability of reweighting in the general case. Modules are small, numerically stable, and require minimal libraries.
    Reasons for new version: Some minor bugs, some upgrades needed, error analysis added. analyzeweight.py/analyzeweight.py2 has been replaced by "multihist.py". This new program performs all the functions of its predecessor while being versatile enough to handle other types of histograms and probability analysis. "bootstrap.py" was added. This script performs basic bootstrap resampling, allowing for error analysis of data. "avg_dev_distribution.py" was added. This program computes the averages and standard deviations of multiple distributions, making error analysis (e.g. from bootstrap resampling) easier to visualize. WRE.cpp was slightly modified, purely for cosmetic reasons. The manual was updated for clarity and to reflect version updates. Examples were removed from the manual in favor of online tutorials (packaged examples remain). Examples were updated to reflect the new format. An additional example is included to demonstrate error analysis.
    Running time: Preprocessing scripts 1-5 minutes, WHAM engine <1 minute, postprocessing script ∼1-5 minutes.
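
    The bootstrap error analysis that the added "bootstrap.py" script is described as performing can be sketched as follows; the data and statistic here are illustrative, not the package's actual implementation:

```python
# Basic bootstrap resampling: estimate the uncertainty of a statistic by
# recomputing it over many resamples drawn with replacement.
import numpy as np

rng = np.random.default_rng(3)
data = rng.normal(loc=5.0, scale=2.0, size=200)   # e.g. one sampling window

def bootstrap_error(sample, stat, n_resamples=2000, rng=rng):
    """Standard deviation of `stat` over bootstrap resamples of `sample`."""
    resampled = rng.choice(sample, size=(n_resamples, len(sample)),
                           replace=True)
    return np.std([stat(r) for r in resampled], ddof=1)

err = bootstrap_error(data, np.mean)
print(f"mean = {data.mean():.3f} +/- {err:.3f}")
```

    For the mean of 200 points with standard deviation near 2, the bootstrap error should land close to the analytic standard error 2/sqrt(200), which is a useful sanity check when applying the same idea to reweighted distributions.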

  8. Secure and scalable deduplication of horizontally partitioned health data for privacy-preserving distributed statistical computation.

    PubMed

    Yigzaw, Kassaye Yitbarek; Michalas, Antonis; Bellika, Johan Gustav

    2017-01-03

    Techniques have been developed to compute statistics on distributed datasets without revealing private information except the statistical results. However, duplicate records in a distributed dataset may lead to incorrect statistical results. Therefore, to increase the accuracy of the statistical analysis of a distributed dataset, secure deduplication is an important preprocessing step. We designed a secure protocol for the deduplication of horizontally partitioned datasets with deterministic record linkage algorithms. We provided a formal security analysis of the protocol in the presence of semi-honest adversaries. The protocol was implemented and deployed across three microbiology laboratories located in Norway, and we ran experiments on the datasets in which the number of records for each laboratory varied. Experiments were also performed on simulated microbiology datasets and data custodians connected through a local area network. The security analysis demonstrated that the protocol protects the privacy of individuals and data custodians under a semi-honest adversarial model. More precisely, the protocol remains secure with the collusion of up to N - 2 corrupt data custodians. The total runtime for the protocol scales linearly with the addition of data custodians and records. One million simulated records distributed across 20 data custodians were deduplicated within 45 s. The experimental results showed that the protocol is more efficient and scalable than previous protocols for the same problem. The proposed deduplication protocol is efficient and scalable for practical uses while protecting the privacy of patients and data custodians.

  9. The relationship between emotional intelligence, previous caring experience and mindfulness in student nurses and midwives: a cross sectional analysis.

    PubMed

    Snowden, Austyn; Stenhouse, Rosie; Young, Jenny; Carver, Hannah; Carver, Fiona; Brown, Norrie

    2015-01-01

    Emotional Intelligence (EI), previous caring experience and mindfulness training may have a positive impact on nurse education. More evidence is needed to support the use of these variables in nurse recruitment and retention. To explore the relationship between EI, gender, age, programme of study, previous caring experience and mindfulness training. Cross sectional element of a longitudinal study. 938 year-one nursing, midwifery and computing students at two Scottish Higher Education Institutes (HEIs) who entered their programme in September 2013. Participants completed a measure of 'trait' EI: the Trait Emotional Intelligence Questionnaire Short Form (TEIQue-SF); and of 'ability' EI: Schutte et al.'s (1998) Emotional Intelligence Scale (SEIS). Demographics, previous caring experience and previous training in mindfulness were recorded. Relationships between variables were tested using non-parametric tests. Emotional intelligence increased with age on both measures of EI [TEIQ-SF H(5)=15.157, p=0.001; SEIS H(5)=11.388, p=0.044]. Females (n=786) scored higher than males (n=149) on both measures [TEIQ-SF, U=44,931, z=-4.509, p<0.001; SEIS, U=44,744, z=-5.563, p<0.001]. Nursing students scored higher than computing students [TEIQ-SF H(5)=46.496, p<0.001; SEIS H(5)=33.309, p<0.001]. There were no statistically significant differences in TEIQ-SF scores between those who had previous mindfulness training (n=50) and those who had not (n=857) [U=22,980, z=0.864, p=0.388]. However, median SEIS was statistically significantly different according to mindfulness training [U=25,115.5, z=2.05, p=0.039]. Neither measure demonstrated statistically significant differences between those with (n=492) and without (n=479) previous caring experience [TEIQ-SF, U=112,102, z=0.938, p=0.348; SEIS, U=115,194.5, z=1.863, p=0.063]. Previous caring experience was not associated with higher emotional intelligence. Mindfulness training was associated with higher 'ability' emotional intelligence. Implications for recruitment, retention and further research are explored. Copyright © 2014. Published by Elsevier Ltd.

  10. The influence of previous subject experience on interactions during peer instruction in an introductory physics course: A mixed methods analysis

    NASA Astrophysics Data System (ADS)

    Vondruska, Judy A.

    Over the past decade, peer instruction and the introduction of student response systems has provided a means of improving student engagement and achievement in large-lecture settings. While the nature of the student discourse occurring during peer instruction is less understood, existing studies have shown student ideas about the subject, extraneous cues, and confidence level appear to matter in the student-student discourse. Using a mixed methods research design, this study examined the influence of previous subject experience on peer instruction in an introductory, one-semester Survey of Physics course. Quantitative results indicated students in discussion pairs where both had previous subject experience were more likely to answer clicker questions correctly both before and after peer discussion compared to student groups where neither partner had previous subject experience. Students in mixed discussion pairs were not statistically different in correct response rates from the other pairings. There was no statistically significant difference between the experience pairs on unit exam scores or the Peer Instruction Partner Survey. Although there was a statistically significant difference between the pre-MPEX and post-MPEX scores, there was no difference between the members of the various subject experience peer discussion pairs. The qualitative study, conducted after the quantitative study, helped to inform the quantitative results by exploring the nature of the peer interactions through survey questions and a series of focus group discussions. While the majority of participants described a benefit to the use of clickers in the lecture, their experience with their discussion partners varied. Students with previous subject experience tended to describe peer instruction more positively than students who did not have previous subject experience, regardless of the experience level of their partner. 
They were also more likely to report favorable levels of comfort with the peer instruction experience. Students with no previous subject experience were more likely to describe a level of discomfort being assigned a stranger for a discussion partner and were more likely to report communication issues with their partner. Most group members, regardless of previous subject experience, related deeper discussions occurring when partners did not initially have the same answer to the clicker questions.

  11. Forest canopy effects on snow accumulation and ablation: an integrative review of empirical results

    Treesearch

    Andres Varhola; Nicholas C. Coops; Markus Weiler; R. Dan Moore

    2010-01-01

    The past century has seen significant research comparing snow accumulation and ablation in forested and open sites. In this review we compile and standardize the results of previous empirical studies to generate statistical relations between changes in forest cover and the associated changes in snow accumulation and ablation rate. The analysis drew upon 33 articles...

  12. Trends in the Use of School Choice: 1993 to 2007. Statistical Analysis Report. NCES 2010-004

    ERIC Educational Resources Information Center

    Grady, Sarah; Bielick, Stacey; Aud, Susan

    2010-01-01

    This report updates two previous reports: "Trends in the Use of School Choice: 1993 to 1999" (Bielick and Chapman 2003) and "Trends in the Use of School Choice: 1993 to 2003" (Tice et al. 2006). Using data from the National Household Education Survey (NHES) of the U.S. Department of Education's National Center for Education…

  13. Forest statistics for Southwest-North Alabama counties - 1990

    Treesearch

    William H. McWilliams; Patrick E. Miller; John S. Vissage

    1990-01-01

    Tabulated results were derived from data obtained during a recent forest inventory of southwest-North Alabama (fig. 1). Core tables (1 to 25) are compatible among Forest Inventory and Analysis units in the Eastern U.S. Other tables (26 to 43) supplement the information contained in the core tables. Comparisons are made between results of the 1990 inventory and previous...

  14. Revealing Future Research Capacity from an Analysis of a National Database of Discipline-Coded Australian PhD Thesis Records

    ERIC Educational Resources Information Center

    Pittayachawan, Siddhi; Macauley, Peter; Evans, Terry

    2016-01-01

    This article reports how statistical analyses of PhD thesis records can reveal future research capacities for disciplines beyond their primary fields. The previous research showed that most theses contributed to and/or used methodologies from more than one discipline. In Australia, there was a concern for declining mathematical teaching and…

  15. Evidence-based pathology in its second decade: toward probabilistic cognitive computing.

    PubMed

    Marchevsky, Alberto M; Walts, Ann E; Wick, Mark R

    2017-03-01

    Evidence-based pathology advocates using a combination of best available data ("evidence") from the literature and personal experience for the diagnosis, estimation of prognosis, and assessment of other variables that impact individual patient care. Evidence-based pathology relies on systematic reviews of the literature, evaluation of the quality of evidence as categorized by evidence levels and statistical tools such as meta-analyses, estimates of probabilities and odds, and others. However, it is well known that previously "statistically significant" information usually does not accurately forecast the future for individual patients. There is great interest in "cognitive computing" in which "data mining" is combined with "predictive analytics" designed to forecast future events and estimate the strength of those predictions. This study demonstrates the use of IBM Watson Analytics software to evaluate and predict the prognosis of 101 patients with typical and atypical pulmonary carcinoid tumors in which Ki-67 indices have been determined. The results obtained with this system are compared with those previously reported using "routine" statistical software and the help of a professional statistician. IBM Watson Analytics interactively provides statistical results that are comparable to those obtained with routine statistical tools but much more rapidly, with considerably less effort and with interactive graphics that are intuitively easy to apply. It also enables analysis of natural language variables and yields detailed survival predictions for patient subgroups selected by the user. Potential applications of this tool and basic concepts of cognitive computing are discussed. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Prediction and analysis of beta-turns in proteins by support vector machine.

    PubMed

    Pham, Tho Hoan; Satou, Kenji; Ho, Tu Bao

    2003-01-01

    The tight turn has long been recognized as one of the three important features of proteins, after the alpha-helix and beta-sheet. Tight turns play an important role in globular proteins from both the structural and functional points of view. More than 90% of tight turns are beta-turns. Analysis and prediction of beta-turns in particular, and tight turns in general, are very useful for the design of new molecules such as drugs, pesticides, and antigens. In this paper, we introduce a support vector machine (SVM) approach to the prediction and analysis of beta-turns. We investigated two aspects of applying SVMs to the prediction and analysis of beta-turns. First, we developed a new SVM method, called BTSVM, which predicts the beta-turns of a protein from its sequence. Prediction results on a dataset of 426 non-homologous protein chains, using a sevenfold cross-validation technique, showed that our method is superior to previous methods. Second, we analyzed how amino acid positions support (or prevent) the formation of beta-turns based on the "multivariable" classification model of a linear SVM. This model is more general than those of previous statistical methods. Our analysis results are more comprehensive and easier to use than previously published ones.
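The linear-SVM analysis the abstract describes can be illustrated with a minimal hinge-loss classifier trained by batch sub-gradient descent. This is a hedged sketch, not BTSVM itself: the two-dimensional points below are synthetic stand-ins for sequence-window features, and all names are illustrative.

```python
import numpy as np

# Synthetic, linearly separable two-class data (stand-ins for window features)
X = np.array([[2.0, 2.1], [1.8, 2.4], [2.5, 1.9], [2.2, 2.8], [3.0, 2.0],
              [-2.0, -2.1], [-1.8, -2.4], [-2.5, -1.9], [-2.2, -2.8], [-3.0, -2.0]])
y = np.array([1, 1, 1, 1, 1, -1, -1, -1, -1, -1], dtype=float)

lam, eta = 0.01, 0.1          # L2 regularization strength and learning rate
w, b = np.zeros(2), 0.0
for _ in range(1000):
    margins = y * (X @ w + b)
    viol = margins < 1         # points inside the margin drive the hinge-loss update
    if viol.any():
        w -= eta * (lam * w - (y[viol, None] * X[viol]).mean(axis=0))
        b -= eta * (-y[viol].mean())
    else:
        w -= eta * lam * w     # only the regularizer acts when no point violates

accuracy = float(np.mean(np.sign(X @ w + b) == y))
```

After training, the weight vector `w` plays the role the abstract assigns to the linear SVM's "multivariable" model: the sign and magnitude of each weight indicate how the corresponding feature supports or prevents the positive class.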

  17. Determinants of Chronic Respiratory Symptoms among Pharmaceutical Factory Workers

    PubMed Central

    Enquselassie, Fikre; Tefera, Yifokire; Gizaw, Muluken; Wakuma, Samson; Woldemariam, Messay

    2018-01-01

    Background Chronic respiratory symptoms including chronic cough, chronic phlegm, wheezing, shortness of breath, and chest pain are manifestations of respiratory problems that mainly develop as a result of occupational exposures. This study aims to assess determinants of chronic respiratory symptoms among pharmaceutical factory workers. Methods A case control study was carried out among 453 pharmaceutical factory workers, with 151 cases and 302 controls. Data were collected using a pretested, structured questionnaire and analyzed using descriptive statistics and bivariate and multivariate analysis. Result Previous history of chronic respiratory diseases (AOR = 3.36, 95% CI = 1.85–6.12), family history of chronic respiratory diseases (AOR = 2.55, 95% CI = 1.51–4.32), previous dusty working environment (AOR = 2.26, 95% CI = 1.07–4.78), ever smoking (AOR = 3.66, 95% CI = 1.05–12.72), and service years (AOR = 1.86, 95% CI = 1.16–2.99) showed statistically significant associations with chronic respiratory symptoms. Conclusion Previous history of respiratory diseases, family history of chronic respiratory diseases, previous dusty working environment, smoking, and service years were determinants of chronic respiratory symptoms. Public health endeavors to prevent the burden of chronic respiratory symptoms among pharmaceutical factory workers should target the reduction of adverse workplace exposures and the discouragement of smoking. PMID:29666655

  18. Logistic regression applied to natural hazards: rare event logistic regression with replications

    NASA Astrophysics Data System (ADS)

    Guns, M.; Vanacker, V.

    2012-06-01

    Statistical analysis of natural hazards needs particular attention, as most of these phenomena are rare events. This study shows that ordinary rare event logistic regression, as it is now commonly used in geomorphologic studies, does not always lead to a robust detection of controlling factors, as the results can be strongly sample-dependent. In this paper, we introduce concepts from Monte Carlo simulation into rare event logistic regression. This technique, called rare event logistic regression with replications, combines the strengths of probabilistic and statistical methods and overcomes some of the limitations of previous developments through robust variable selection. The technique was developed here for the analysis of landslide controlling factors, but the concept is widely applicable to statistical analyses of natural hazards.
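The replication idea can be sketched in a few lines of NumPy, under stated assumptions: the data are synthetic (30 "event" cells with one informative predictor x1 and one noise predictor x2, plus 2000 stable cells), every rare event is kept in each replication, the control sample is redrawn, and a factor is judged robust by how stably its fitted coefficient keeps its sign across replications. This is an illustration of the concept, not the authors' exact procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic rare-event data: x1 is informative (mean-shifted for events), x2 is noise
n_ev, n_ctrl = 30, 2000
X_ev = np.column_stack([rng.normal(1.5, 1.0, n_ev), rng.normal(0.0, 1.0, n_ev)])
X_ctrl = np.column_stack([rng.normal(0.0, 1.0, n_ctrl), rng.normal(0.0, 1.0, n_ctrl)])

def fit_logistic(X, y, iters=25, ridge=1e-6):
    # Newton-Raphson (IRLS) logistic regression with an intercept column;
    # a small ridge term keeps the Hessian invertible under near-separation.
    Xd = np.column_stack([np.ones(len(y)), X])
    beta = np.zeros(Xd.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-np.clip(Xd @ beta, -30, 30)))
        W = p * (1.0 - p)
        H = Xd.T @ (Xd * W[:, None]) + ridge * np.eye(Xd.shape[1])
        beta += np.linalg.solve(H, Xd.T @ (y - p))
    return beta

# Replications: keep all rare events, redraw an equal-sized control sample each time
B = 200
pos_sign = np.zeros((B, 2))
for b in range(B):
    ctrl = X_ctrl[rng.choice(n_ctrl, size=n_ev, replace=False)]
    Xb = np.vstack([X_ev, ctrl])
    yb = np.concatenate([np.ones(n_ev), np.zeros(n_ev)])
    pos_sign[b] = fit_logistic(Xb, yb)[1:] > 0

stability = np.abs(pos_sign.mean(axis=0) - 0.5) * 2.0  # 1 = perfectly stable sign
```

A genuinely controlling factor keeps the same coefficient sign in essentially every replication (stability near 1), whereas a sample-dependent factor flips; this is the kind of robust variable selection the abstract refers to.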

  19. [How to start a neuroimaging study].

    PubMed

    Narumoto, Jin

    2012-06-01

    In order to help researchers understand how to start a neuroimaging study, several tips are described in this paper. These include 1) Choice of an imaging modality, 2) Statistical method, and 3) Interpretation of the results. 1) There are several imaging modalities available in clinical research. Advantages and disadvantages of each modality are described. 2) Statistical Parametric Mapping, which is the most common statistical software for neuroimaging analysis, is described in terms of parameter setting in normalization and level of significance. 3) In the discussion section, the region which shows a significant difference between patients and normal controls should be discussed in relation to the neurophysiology of the disease, making reference to previous reports from neuroimaging studies in normal controls, lesion studies and animal studies. A typical pattern of discussion is described.

  20. A Numerical Simulation and Statistical Modeling of High Intensity Radiated Fields Experiment Data

    NASA Technical Reports Server (NTRS)

    Smith, Laura J.

    2004-01-01

    Tests are conducted on a quad-redundant fault-tolerant flight control computer to establish the upset characteristics of an avionics system in an electromagnetic field. A numerical simulation and a statistical model are described in this work to analyze the open-loop experiment data collected in the reverberation chamber at NASA LaRC as part of an effort to examine the effects of electromagnetic interference on fly-by-wire aircraft control systems. By comparing thousands of simulation and model outputs, the models that best describe the data are first identified, and then a systematic statistical analysis is performed on the data. These efforts culminate in an extrapolation of values that are in turn used to support previous efforts in evaluating the data.

  1. A statistical study of EMIC waves observed by Cluster: 1. Wave properties

    NASA Astrophysics Data System (ADS)

    Allen, R. C.; Zhang, J.-C.; Kistler, L. M.; Spence, H. E.; Lin, R.-L.; Klecker, B.; Dunlop, M. W.; André, M.; Jordanova, V. K.

    2015-07-01

    Electromagnetic ion cyclotron (EMIC) waves are an important mechanism for particle energization and losses inside the magnetosphere. In order to better understand the effects of these waves on particle dynamics, detailed information about the occurrence rate, wave power, ellipticity, normal angle, energy propagation angle distributions, and local plasma parameters are required. Previous statistical studies have used in situ observations to investigate the distribution of these parameters in the magnetic local time versus L-shell (MLT-L) frame within a limited magnetic latitude (MLAT) range. In this study, we present a statistical analysis of EMIC wave properties using 10 years (2001-2010) of data from Cluster, totaling 25,431 min of wave activity. Due to the polar orbit of Cluster, we are able to investigate EMIC waves at all MLATs and MLTs. This allows us to further investigate the MLAT dependence of various wave properties inside different MLT sectors and further explore the effects of Shabansky orbits on EMIC wave generation and propagation. The statistical analysis is presented in two papers. This paper focuses on the wave occurrence distribution as well as the distribution of wave properties. The companion paper focuses on local plasma parameters during wave observations as well as wave generation proxies.

  2. A Statistics-based Platform for Quantitative N-terminome Analysis and Identification of Protease Cleavage Products*

    PubMed Central

    auf dem Keller, Ulrich; Prudova, Anna; Gioia, Magda; Butler, Georgina S.; Overall, Christopher M.

    2010-01-01

    Terminal amine isotopic labeling of substrates (TAILS), our recently introduced platform for quantitative N-terminome analysis, enables wide dynamic range identification of original mature protein N-termini and protease cleavage products. Modifying TAILS by use of isobaric tag for relative and absolute quantification (iTRAQ)-like labels for quantification together with a robust statistical classifier derived from experimental protease cleavage data, we report reliable and statistically valid identification of proteolytic events in complex biological systems in MS2 mode. The statistical classifier is supported by a novel parameter evaluating ion intensity-dependent quantification confidences of single peptide quantifications, the quantification confidence factor (QCF). Furthermore, the isoform assignment score (IAS) is introduced, a new scoring system for the evaluation of single peptide-to-protein assignments based on high confidence protein identifications in the same sample prior to negative selection enrichment of N-terminal peptides. By these approaches, we identified and validated, in addition to known substrates, low abundance novel bioactive MMP-2 targets including the plasminogen receptor S100A10 (p11) and the proinflammatory cytokine proEMAP/p43 that were previously undescribed. PMID:20305283

  3. The Interaction of TXNIP and AF1q Genes Increases the Susceptibility of Schizophrenia.

    PubMed

    Su, Yousong; Ding, Wenhua; Xing, Mengjuan; Qi, Dake; Li, Zezhi; Cui, Donghong

    2017-08-01

    Although previous studies showed a reduced risk of cancer in patients with schizophrenia, whether patients with schizophrenia possess genetic factors that also contribute to tumor suppression is still unknown. In the present study, based on our previous microarray data, we focused on the tumor suppressor genes TXNIP and AF1q, which were differentially expressed in patients with schizophrenia. A total of 413 patients and 578 healthy controls were recruited. We found no significant differences in genotype, allele, or haplotype frequencies at the five selected single nucleotide polymorphisms (SNPs) (rs2236566 and rs7211 in the TXNIP gene; rs10749659, rs2140709, and rs3738481 in the AF1q gene) between patients with schizophrenia and controls. However, we found an association between the interaction of TXNIP and AF1q and schizophrenia by using the MDR method followed by traditional statistical analysis. The best gene-gene interaction model identified was a three-locus model, TXNIP (rs2236566, rs7211)-AF1q (rs2140709). After traditional statistical analysis, we found that the high-risk genotype combination was rs2236566 (GG)-rs7211(CC)-rs2140709(CC) (OR = 1.35 [1.03-1.76]) and the low-risk genotype combination was rs2236566 (GT)-rs7211(CC)-rs2140709(CC) (OR = 0.67 [0.49-0.91]). Our findings suggest a statistically significant role for the interaction of TXNIP and AF1q polymorphisms (TXNIP-rs2236566, TXNIP-rs7211, and AF1q-rs2769605) in schizophrenia susceptibility.

  4. Time series regression-based pairs trading in the Korean equities market

    NASA Astrophysics Data System (ADS)

    Kim, Saejoon; Heo, Jun

    2017-07-01

    Pairs trading is an instance of statistical arbitrage that relies on heavy quantitative data analysis to profit by capitalising on low-risk trading opportunities provided by anomalies of related assets. A key element in pairs trading is the rule by which open and close trading triggers are defined. This paper investigates the use of time series regression to define this rule, which has previously been identified with fixed threshold-based approaches. Empirical results indicate that our approach may yield significantly increased excess returns compared to those obtained by previous approaches on large-capitalisation stocks in the Korean equities market.

  5. Analysis of basic clustering algorithms for numerical estimation of statistical averages in biomolecules.

    PubMed

    Anandakrishnan, Ramu; Onufriev, Alexey

    2008-03-01

    In statistical mechanics, the equilibrium properties of a physical system of particles can be calculated as the statistical average over the accessible microstates of the system. In general, these calculations are computationally intractable since they involve summations over an exponentially large number of microstates. Clustering algorithms are one of the methods used to numerically approximate these sums. The most basic clustering algorithms first subdivide the system into a set of smaller subsets (clusters). Interactions between particles within each cluster are then treated exactly, while all interactions between different clusters are ignored. These smaller clusters have far fewer microstates, making the summation over these microstates tractable. These algorithms have been used previously for biomolecular computations but remain relatively unexplored in this context. Presented here is a theoretical analysis of the error and computational complexity of the two most basic clustering algorithms previously applied in the context of biomolecular electrostatics. We derive a tight, computationally inexpensive error bound for the equilibrium state of a particle computed via these clustering algorithms. For some practical applications it is the root mean square error, which can be significantly lower than the error bound, that may be more important. We show that there is a strong empirical relationship between the error bound and the root mean square error, suggesting that the error bound could be used as a computationally inexpensive metric for predicting the accuracy of clustering algorithms in practical applications. An example of error analysis for one such application, the computation of the average charge of ionizable amino acids in proteins, is given, demonstrating that the clustering algorithm can be accurate enough for practical purposes.
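The basic clustering idea can be demonstrated on a toy spin system rather than biomolecular electrostatics (the couplings and cluster assignments below are invented for illustration): each cluster's microstates are enumerated exactly, while couplings that cross cluster boundaries are dropped.

```python
import itertools
import math

def average_energy(J, n, beta=1.0):
    # Exact Boltzmann average of the energy over all 2^n spin microstates.
    # J maps site pairs (i, j) to coupling constants; E = sum c * s_i * s_j.
    Z = E_sum = 0.0
    for s in itertools.product((-1, 1), repeat=n):
        E = sum(c * s[i] * s[j] for (i, j), c in J.items())
        w = math.exp(-beta * E)
        Z += w
        E_sum += E * w
    return E_sum / Z

def clustered_average(J, clusters, beta=1.0):
    # Treat each cluster exactly; interactions between clusters are ignored.
    total = 0.0
    for cl in clusters:
        idx = {site: k for k, site in enumerate(cl)}
        Jc = {(idx[i], idx[j]): c for (i, j), c in J.items()
              if i in idx and j in idx}
        total += average_energy(Jc, len(cl), beta)
    return total

# Two truly independent clusters: the approximation is exact here
J_indep = {(0, 1): 1.0, (1, 2): -0.5, (3, 4): 0.8, (4, 5): 0.3}
clusters = [(0, 1, 2), (3, 4, 5)]
exact = average_energy(J_indep, 6)
approx = clustered_average(J_indep, clusters)

# Adding a cross-cluster coupling introduces the approximation error
J_cross = {**J_indep, (2, 3): 0.7}
exact_cross = average_energy(J_cross, 6)
approx_cross = clustered_average(J_cross, clusters)
```

When the clusters are genuinely independent, the average energy is additive and the clustered sum reproduces the exact result; the ignored cross-cluster coupling is precisely what the error bound discussed in the abstract must account for.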

  6. pROC: an open-source package for R and S+ to analyze and compare ROC curves.

    PubMed

    Robin, Xavier; Turck, Natacha; Hainard, Alexandre; Tiberti, Natalia; Lisacek, Frédérique; Sanchez, Jean-Charles; Müller, Markus

    2011-03-17

    Receiver operating characteristic (ROC) curves are useful tools to evaluate classifiers in biomedical and bioinformatics applications. However, conclusions are often reached through inconsistent use or insufficient statistical analysis. To support researchers in their ROC curve analysis we developed pROC, a package for R and S+ that contains a set of tools for displaying, analyzing, smoothing, and comparing ROC curves in a user-friendly, object-oriented and flexible interface. With data previously imported into the R or S+ environment, the pROC package builds ROC curves and includes functions for computing confidence intervals, statistical tests for comparing total or partial area under the curve or the operating points of different classifiers, and methods for smoothing ROC curves. Intermediary and final results are visualised in user-friendly interfaces. A case study based on published clinical and biomarker data shows how to perform a typical ROC analysis with pROC. pROC is a package for R and S+ specifically dedicated to ROC analysis. It proposes multiple statistical tests to compare ROC curves, and in particular partial areas under the curve, allowing proper ROC interpretation. pROC is available in two versions: in the R programming language or with a graphical user interface in the S+ statistical software. It is accessible at http://expasy.org/tools/pROC/ under the GNU General Public License. It is also distributed through the CRAN and CSAN public repositories, facilitating its installation.
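Two of the core quantities pROC computes can be sketched in plain Python (a hedged analogue, not pROC's actual R implementation): the AUC as the rank statistic over positive-negative score pairs, and a stratified percentile bootstrap for a confidence interval, similar in spirit to pROC's bootstrap CI option.

```python
import random

def auc(pos, neg):
    # Probability that a random positive scores above a random negative;
    # ties count one half. Equivalent to the area under the empirical ROC.
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def auc_bootstrap_ci(pos, neg, B=2000, alpha=0.05, seed=42):
    # Percentile bootstrap, resampling each class separately (stratified)
    rng = random.Random(seed)
    stats = sorted(
        auc([rng.choice(pos) for _ in pos], [rng.choice(neg) for _ in neg])
        for _ in range(B)
    )
    return stats[int(B * alpha / 2)], stats[int(B * (1 - alpha / 2)) - 1]

# Illustrative scores: diseased cases vs. healthy controls
area = auc([0.9, 0.8, 0.7], [0.1, 0.2, 0.75])        # 8 of 9 pairs ordered correctly
lo, hi = auc_bootstrap_ci([0.9, 0.8, 0.7], [0.1, 0.2, 0.75])
```

For formal curve comparison pROC additionally offers DeLong's test and partial-AUC tests, which this sketch does not attempt to reproduce.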

  7. STAPP: Spatiotemporal analysis of plantar pressure measurements using statistical parametric mapping.

    PubMed

    Booth, Brian G; Keijsers, Noël L W; Sijbers, Jan; Huysmans, Toon

    2018-05-03

    Pedobarography produces large sets of plantar pressure samples that are routinely subsampled (e.g. using regions of interest) or aggregated (e.g. center of pressure trajectories, peak pressure images) in order to simplify statistical analysis and provide intuitive clinical measures. We hypothesize that these data reductions discard gait information that can be used to differentiate between groups or conditions. To test the hypothesis of null information loss, we created an implementation of statistical parametric mapping (SPM) for dynamic plantar pressure datasets (i.e. plantar pressure videos). Our SPM software framework brings all plantar pressure videos into anatomical and temporal correspondence, then performs statistical tests at each sampling location in space and time. As a novel element, we introduce non-linear temporal registration into the framework in order to normalize for timing differences within the stance phase. We refer to our software framework as STAPP: spatiotemporal analysis of plantar pressure measurements. Using STAPP, we tested our hypothesis on plantar pressure videos from 33 healthy subjects walking at different speeds. As walking speed increased, STAPP was able to identify significant decreases in plantar pressure at mid-stance from the heel through the lateral forefoot. The extent of these plantar pressure decreases has not previously been observed using existing plantar pressure analysis techniques. We therefore conclude that the subsampling of plantar pressure videos, a step which led to the discarding of gait information in our study, can be avoided using STAPP. Copyright © 2018 Elsevier B.V. All rights reserved.
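The per-location testing at the heart of an SPM-style analysis can be sketched with a pixel-wise two-sample t statistic on synthetic data. This assumes the frames are already registered (the correspondence step STAPP performs); the 8x8 grids, group sizes, and effect patch below are invented for illustration, and no multiple-comparison correction is applied.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two groups of synthetic 8x8 "pressure frames"; group B carries an
# extra load in a 3x3 patch, mimicking a localized group difference.
n = 10
effect = np.zeros((8, 8))
effect[2:5, 2:5] = 5.0
group_a = rng.normal(0.0, 1.0, (n, 8, 8))
group_b = effect + rng.normal(0.0, 1.0, (n, 8, 8))

# Pixel-wise two-sample t statistic with pooled variance
ma, mb = group_a.mean(axis=0), group_b.mean(axis=0)
va, vb = group_a.var(axis=0, ddof=1), group_b.var(axis=0, ddof=1)
sp = np.sqrt(((n - 1) * va + (n - 1) * vb) / (2 * n - 2))
t_map = (mb - ma) / (sp * np.sqrt(2.0 / n))

in_patch = t_map[2:5, 2:5]       # where the true effect lives
outside = t_map[effect == 0]     # null pixels
```

The t values inside the effect patch stand far above those elsewhere, which is exactly the kind of spatially resolved difference that region-of-interest subsampling can average away.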

  8. A graph theory approach to identify resonant and non-resonant transmission paths in statistical modal energy distribution analysis

    NASA Astrophysics Data System (ADS)

    Aragonès, Àngels; Maxit, Laurent; Guasch, Oriol

    2015-08-01

    Statistical modal energy distribution analysis (SmEdA) extends classical statistical energy analysis (SEA) to the mid-frequency range by establishing power balance equations between modes in different subsystems. This circumvents the SEA requirement of modal energy equipartition and enables applying SmEdA to cases of low modal overlap and locally excited subsystems, as well as to complex heterogeneous subsystems. Yet widening the range of application of SEA comes at the price of large models, because the number of modes per subsystem can become considerable as the frequency increases. It would therefore be worthwhile to have at one's disposal tools for the quick identification and ranking of the resonant and non-resonant paths involved in modal energy transmission between subsystems. It will be shown that graph theory algorithms previously developed for transmission path analysis (TPA) in SEA can be adapted to SmEdA and prove useful for that purpose. The case of airborne transmission between two cavities separated by homogeneous and ribbed plates will first be addressed to illustrate the potential of the graph approach. A more complex case representing transmission between non-contiguous cavities in a shipbuilding structure will also be presented.

  9. Scripts for TRUMP data analyses. Part II (HLA-related data): statistical analyses specific for hematopoietic stem cell transplantation.

    PubMed

    Kanda, Junya

    2016-01-01

    The Transplant Registry Unified Management Program (TRUMP) made it possible for members of the Japan Society for Hematopoietic Cell Transplantation (JSHCT) to analyze large sets of national registry data on autologous and allogeneic hematopoietic stem cell transplantation. However, as the processes used to collect transplantation information are complex and differed over time, the background of these processes should be understood when using TRUMP data. Previously, information on the HLA locus of patients and donors had been collected using a questionnaire-based free-description method, resulting in some input errors. To correct minor but significant errors and provide accurate HLA matching data, the use of a Stata or EZR/R script offered by the JSHCT is strongly recommended when analyzing HLA data in the TRUMP dataset. The HLA mismatch direction, mismatch counting method, and different impacts of HLA mismatches by stem cell source are other important factors in the analysis of HLA data. Additionally, researchers should understand the statistical analyses specific for hematopoietic stem cell transplantation, such as competing risk, landmark analysis, and time-dependent analysis, to correctly analyze transplant data. The data center of the JSHCT can be contacted if statistical assistance is required.

  10. Compositional data analysis for physical activity, sedentary time and sleep research.

    PubMed

    Dumuid, Dorothea; Stanford, Tyman E; Martin-Fernández, Josep-Antoni; Pedišić, Željko; Maher, Carol A; Lewis, Lucy K; Hron, Karel; Katzmarzyk, Peter T; Chaput, Jean-Philippe; Fogelholm, Mikael; Hu, Gang; Lambert, Estelle V; Maia, José; Sarmiento, Olga L; Standage, Martyn; Barreira, Tiago V; Broyles, Stephanie T; Tudor-Locke, Catrine; Tremblay, Mark S; Olds, Timothy

    2017-01-01

    The health effects of daily activity behaviours (physical activity, sedentary time and sleep) are widely studied. While previous research has largely examined activity behaviours in isolation, recent studies have adjusted for multiple behaviours. However, the inclusion of all activity behaviours in traditional multivariate analyses has not been possible due to the perfect multicollinearity of 24-h time budget data. The ensuing lack of adjustment for known effects on the outcome undermines the validity of study findings. We describe a statistical approach that enables the inclusion of all daily activity behaviours, based on the principles of compositional data analysis. Using data from the International Study of Childhood Obesity, Lifestyle and the Environment, we demonstrate the application of compositional multiple linear regression to estimate adiposity from children's daily activity behaviours expressed as isometric log-ratio coordinates. We present a novel method for predicting change in a continuous outcome based on relative changes within a composition, and for calculating associated confidence intervals to allow for statistical inference. The compositional data analysis presented overcomes the lack of adjustment that has plagued traditional statistical methods in the field, and provides robust and reliable insights into the health effects of daily activity behaviours.
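The isometric log-ratio (ilr) transformation that makes this regression possible can be sketched in a few lines of NumPy, using the standard pivot-balance basis; the example day below is illustrative, not from the ISCOLE dataset.

```python
import numpy as np

def ilr(parts):
    # Isometric log-ratio coordinates of a D-part composition using the
    # pivot (balance) basis: z_j = sqrt(j/(j+1)) * ln(g(x_1..x_j) / x_{j+1}).
    # A D-part composition maps to D-1 unconstrained real coordinates.
    x = np.asarray(parts, dtype=float)
    x = x / x.sum()                       # closure: re-express as proportions
    z = []
    for j in range(1, len(x)):
        g = np.exp(np.log(x[:j]).mean())  # geometric mean of the first j parts
        z.append(np.sqrt(j / (j + 1.0)) * np.log(g / x[j]))
    return np.array(z)

# A 24-h day as (sleep, sedentary time, light activity, MVPA) in hours
day = ilr([9.5, 8.0, 5.5, 1.0])
```

Because the coordinates depend only on ratios between parts, they are free of the perfect multicollinearity of raw time-budget data, so all D-1 of them can enter an ordinary multiple linear regression together.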

  11. A pathway in the brainstem for roll-tilt of the subjective visual vertical: evidence from a lesion-behavior mapping study.

    PubMed

    Baier, Bernhard; Thömke, Frank; Wilting, Janine; Heinze, Caroline; Geber, Christian; Dieterich, Marianne

    2012-10-24

    The perceived subjective visual vertical (SVV) is an important sign of a vestibular otolith tone imbalance in the roll plane. Previous studies suggested that unilateral pontomedullary brainstem lesions cause ipsiversive roll-tilt of the SVV, whereas pontomesencephalic lesions cause contraversive roll-tilt of the SVV. However, previous data were of limited quality and lacked a statistical approach. We therefore tested roll-tilt of the SVV in 79 human patients with acute unilateral brainstem lesions due to stroke by applying modern statistical lesion-behavior mapping analysis. Roll-tilt of the SVV was verified to be a brainstem sign, and for the first time it was confirmed statistically that lesions of the medial longitudinal fasciculus (MLF) and the medial vestibular nucleus are associated with ipsiversive tilt of the SVV, whereas contraversive tilts are associated with lesions affecting the rostral interstitial nucleus of the MLF, the superior cerebellar peduncle, the oculomotor nucleus, and the interstitial nucleus of Cajal. These structures thus constitute the anatomical pathway in the brainstem for verticality perception. The present data indicate that graviceptive otolith signals play a predominant role in the multisensory system of verticality perception.

  12. Weighing of risk factors for penetrating keratoplasty graft failure: application of Risk Score System.

    PubMed

    Tourkmani, Abdo Karim; Sánchez-Huerta, Valeria; De Wit, Guillermo; Martínez, Jaime D; Mingo, David; Mahillo-Fernández, Ignacio; Jiménez-Alfaro, Ignacio

    2017-01-01

    To analyze the relationship between the score obtained in the Risk Score System (RSS) proposed by Hicks et al and penetrating keratoplasty (PKP) graft failure at 1y postoperatively, and between each factor in the RSS and the risk of PKP graft failure, using univariate and multivariate analysis. The retrospective cohort study had 152 PKPs from 152 patients. Eighteen cases were excluded from our study due to primary failure (10 cases), incomplete medical notes (5 cases) and follow-up less than 1y (3 cases). We included 134 PKPs from 134 patients stratified by preoperative risk score. The Spearman coefficient was calculated for the relationship between the score obtained and the risk of failure at 1y. Univariate and multivariate analyses were calculated for the impact of every single risk factor included in the RSS on graft failure at 1y. The Spearman coefficient showed a statistically significant correlation between the score in the RSS and graft failure (P<0.05). Multivariate logistic regression analysis showed no statistically significant relationship (P>0.05) between diagnosis or lens status and graft failure. The relationship between the other risk factors studied and graft failure was significant (P<0.05), although the results for previous grafts and graft failure were unreliable. None of our patients had a previous blood transfusion, thus it had no impact. After the application of multivariate analysis techniques, some risk factors do not show the expected impact on graft failure at 1y.

  13. Weighing of risk factors for penetrating keratoplasty graft failure: application of Risk Score System

    PubMed Central

    Tourkmani, Abdo Karim; Sánchez-Huerta, Valeria; De Wit, Guillermo; Martínez, Jaime D.; Mingo, David; Mahillo-Fernández, Ignacio; Jiménez-Alfaro, Ignacio

    2017-01-01

    AIM To analyze the relationship between the score obtained in the Risk Score System (RSS) proposed by Hicks et al and penetrating keratoplasty (PKP) graft failure at 1y postoperatively, and between each factor in the RSS and the risk of PKP graft failure, using univariate and multivariate analysis. METHODS The retrospective cohort study had 152 PKPs from 152 patients. Eighteen cases were excluded from our study due to primary failure (10 cases), incomplete medical notes (5 cases) and follow-up less than 1y (3 cases). We included 134 PKPs from 134 patients stratified by preoperative risk score. The Spearman coefficient was calculated for the relationship between the score obtained and the risk of failure at 1y. Univariate and multivariate analyses were calculated for the impact of every single risk factor included in the RSS on graft failure at 1y. RESULTS The Spearman coefficient showed a statistically significant correlation between the score in the RSS and graft failure (P<0.05). Multivariate logistic regression analysis showed no statistically significant relationship (P>0.05) between diagnosis or lens status and graft failure. The relationship between the other risk factors studied and graft failure was significant (P<0.05), although the results for previous grafts and graft failure were unreliable. None of our patients had a previous blood transfusion, thus it had no impact. CONCLUSION After the application of multivariate analysis techniques, some risk factors do not show the expected impact on graft failure at 1y. PMID:28393027

  14. Developing and validating a nutrition knowledge questionnaire: key methods and considerations.

    PubMed

    Trakman, Gina Louise; Forsyth, Adrienne; Hoye, Russell; Belski, Regina

    2017-10-01

    To outline key statistical considerations and detailed methodologies for the development and evaluation of a valid and reliable nutrition knowledge questionnaire. Literature on questionnaire development in a range of fields was reviewed and a set of evidence-based guidelines specific to the creation of a nutrition knowledge questionnaire has been developed. The recommendations describe key qualitative methods and statistical considerations, and include relevant examples from previous papers and existing nutrition knowledge questionnaires. Where details have been omitted for the sake of brevity, the reader has been directed to suitable references. We recommend an eight-step methodology for nutrition knowledge questionnaire development as follows: (i) definition of the construct and development of a test plan; (ii) generation of the item pool; (iii) choice of the scoring system and response format; (iv) assessment of content validity; (v) assessment of face validity; (vi) purification of the scale using item analysis, including item characteristics, difficulty and discrimination; (vii) evaluation of the scale including its factor structure and internal reliability, or Rasch analysis, including assessment of dimensionality and internal reliability; and (viii) gathering of data to re-examine the questionnaire's properties, assess temporal stability and confirm construct validity. Several of these methods have previously been overlooked. The measurement of nutrition knowledge is an important consideration for individuals working in the nutrition field. Improved methods in the development of nutrition knowledge questionnaires, such as the use of factor analysis or Rasch analysis, will enable more confidence in reported measures of nutrition knowledge.
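
Step (vi) above, item analysis, can be sketched briefly: item difficulty is the proportion answering correctly, and discrimination is the corrected item-total correlation. The 0/1 response matrix below is simulated, not from any real questionnaire:

```python
# Minimal item-analysis sketch on a synthetic scored questionnaire.
import numpy as np

rng = np.random.default_rng(1)
ability = rng.normal(size=200)                 # latent knowledge, synthetic
difficulty_true = np.linspace(-1.5, 1.5, 10)
# probability of a correct answer rises with ability (logistic model)
p = 1 / (1 + np.exp(-(ability[:, None] - difficulty_true[None, :])))
responses = (rng.random((200, 10)) < p).astype(int)

difficulty = responses.mean(axis=0)            # proportion answering correctly
total = responses.sum(axis=1)
# discrimination: corrected item-total correlation (item vs. rest of test)
discrimination = np.array([
    np.corrcoef(responses[:, j], total - responses[:, j])[0, 1]
    for j in range(responses.shape[1])
])
print(np.round(difficulty, 2))
print(np.round(discrimination, 2))
```

Items with very extreme difficulty or near-zero discrimination are the usual candidates for removal during scale purification.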

  15. Gene network inherent in genomic big data improves the accuracy of prognostic prediction for cancer patients.

    PubMed

    Kim, Yun Hak; Jeong, Dae Cheon; Pak, Kyoungjune; Goh, Tae Sik; Lee, Chi-Seung; Han, Myoung-Eun; Kim, Ji-Young; Liangwen, Liu; Kim, Chi Dae; Jang, Jeon Yeob; Cha, Wonjae; Oh, Sae-Ock

    2017-09-29

    Accurate prediction of prognosis is critical for therapeutic decisions regarding cancer patients. Many previously developed prognostic scoring systems have limitations in reflecting recent progress in the field of cancer biology such as microarray, next-generation sequencing, and signaling pathways. To develop a new prognostic scoring system for cancer patients, we used mRNA expression and clinical data in various independent breast cancer cohorts (n=1214) from the Molecular Taxonomy of Breast Cancer International Consortium (METABRIC) and Gene Expression Omnibus (GEO). A new prognostic score that reflects gene network inherent in genomic big data was calculated using Network-Regularized high-dimensional Cox-regression (Net-score). We compared its discriminatory power with those of two previously used statistical methods: stepwise variable selection via univariate Cox regression (Uni-score) and Cox regression via Elastic net (Enet-score). The Net scoring system showed better discriminatory power in prediction of disease-specific survival (DSS) than other statistical methods (p=0 in METABRIC training cohort, p=0.000331, 4.58e-06 in two METABRIC validation cohorts) when accuracy was examined by log-rank test. Notably, comparison of C-index and AUC values in receiver operating characteristic analysis at 5 years showed fewer differences between training and validation cohorts with the Net scoring system than other statistical methods, suggesting minimal overfitting. The Net-based scoring system also successfully predicted prognosis in various independent GEO cohorts with high discriminatory power. In conclusion, the Net-based scoring system showed better discriminative power than previous statistical methods in prognostic prediction for breast cancer patients. This new system will mark a new era in prognosis prediction for cancer patients.
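
The discriminatory-power comparison above leans on the concordance index (C-index). As a hedged, self-contained sketch (a pure-Python pairwise version on synthetic uncensored data, not the authors' network-regularized Cox pipeline):

```python
# Harrell's C-index: fraction of comparable pairs in which the
# higher-risk subject fails first. Synthetic data, no censoring.
import numpy as np

def c_index(risk, time):
    concordant = comparable = 0
    n = len(risk)
    for i in range(n):
        for j in range(i + 1, n):
            if time[i] == time[j]:
                continue
            comparable += 1
            # shorter survival should pair with the higher risk score
            first, second = (i, j) if time[i] < time[j] else (j, i)
            if risk[first] > risk[second]:
                concordant += 1
            elif risk[first] == risk[second]:
                concordant += 0.5
    return concordant / comparable

rng = np.random.default_rng(2)
risk = rng.normal(size=100)
time = rng.exponential(np.exp(-risk))   # higher risk -> shorter survival
print(round(c_index(risk, time), 2))
```

A value of 0.5 is chance-level discrimination; the study's point is that the Net-score keeps its C-index closer between training and validation cohorts.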

  16. Gene network inherent in genomic big data improves the accuracy of prognostic prediction for cancer patients

    PubMed Central

    Kim, Yun Hak; Jeong, Dae Cheon; Pak, Kyoungjune; Goh, Tae Sik; Lee, Chi-Seung; Han, Myoung-Eun; Kim, Ji-Young; Liangwen, Liu; Kim, Chi Dae; Jang, Jeon Yeob; Cha, Wonjae; Oh, Sae-Ock

    2017-01-01

    Accurate prediction of prognosis is critical for therapeutic decisions regarding cancer patients. Many previously developed prognostic scoring systems have limitations in reflecting recent progress in the field of cancer biology such as microarray, next-generation sequencing, and signaling pathways. To develop a new prognostic scoring system for cancer patients, we used mRNA expression and clinical data in various independent breast cancer cohorts (n=1214) from the Molecular Taxonomy of Breast Cancer International Consortium (METABRIC) and Gene Expression Omnibus (GEO). A new prognostic score that reflects gene network inherent in genomic big data was calculated using Network-Regularized high-dimensional Cox-regression (Net-score). We compared its discriminatory power with those of two previously used statistical methods: stepwise variable selection via univariate Cox regression (Uni-score) and Cox regression via Elastic net (Enet-score). The Net scoring system showed better discriminatory power in prediction of disease-specific survival (DSS) than other statistical methods (p=0 in METABRIC training cohort, p=0.000331, 4.58e-06 in two METABRIC validation cohorts) when accuracy was examined by log-rank test. Notably, comparison of C-index and AUC values in receiver operating characteristic analysis at 5 years showed fewer differences between training and validation cohorts with the Net scoring system than other statistical methods, suggesting minimal overfitting. The Net-based scoring system also successfully predicted prognosis in various independent GEO cohorts with high discriminatory power. In conclusion, the Net-based scoring system showed better discriminative power than previous statistical methods in prognostic prediction for breast cancer patients. This new system will mark a new era in prognosis prediction for cancer patients. PMID:29100405

  17. A follow-up power analysis of the statistical tests used in the Journal of Research in Science Teaching

    NASA Astrophysics Data System (ADS)

    Woolley, Thomas W.; Dawson, George O.

    It has been two decades since the first power analysis of a psychological journal and 10 years since the Journal of Research in Science Teaching made its contribution to this debate. One purpose of this article is to investigate what power-related changes, if any, have occurred in science education research over the past decade as a result of the earlier survey. In addition, previous recommendations are expanded and expounded upon within the context of more recent work in this area. The absence of any consistent mode of presenting statistical results, as well as little change with regard to power-related issues, is reported. Guidelines for reporting the minimal amount of information demanded for clear and independent evaluation of research results by readers are also proposed.
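
The a priori power calculation this article advocates can be made concrete. A hedged sketch using the normal approximation to the two-sample t-test (so values are approximate, not exact noncentral-t figures):

```python
# Approximate power of a two-sided, two-sample t-test for effect size d
# (Cohen's d) with equal group sizes, via the normal approximation.
from scipy.stats import norm

def power_two_sample(d, n_per_group, alpha=0.05):
    z_crit = norm.ppf(1 - alpha / 2)
    shift = d * (n_per_group / 2) ** 0.5    # noncentrality under H1
    return norm.cdf(shift - z_crit) + norm.cdf(-shift - z_crit)

# power for a "medium" effect (d = 0.5) at common sample sizes
for n in (20, 50, 100):
    print(n, round(power_two_sample(0.5, n), 2))
```

For d = 0.5 this reproduces the textbook benchmark of roughly 0.80 power at 64 subjects per group.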

  18. Non-Abelian statistics of vortices with non-Abelian Dirac fermions.

    PubMed

    Yasui, Shigehiro; Hirono, Yuji; Itakura, Kazunori; Nitta, Muneto

    2013-05-01

    We extend our previous analysis on the exchange statistics of vortices having a single Dirac fermion trapped in each core to the case where vortices trap two Dirac fermions with U(2) symmetry. Such a system of vortices with non-Abelian Dirac fermions appears in color superconductors at extremely high densities and in supersymmetric QCD. We show that the exchange of two vortices having doublet Dirac fermions in each core is expressed by non-Abelian representations of a braid group, which is explicitly verified in the matrix representation of the exchange operators when the number of vortices is up to four. We find that the result contains the matrices previously obtained for the vortices with a single Dirac fermion in each core as a special case. The whole braid group does not immediately imply non-Abelian statistics of identical particles because it also contains exchanges between vortices with different numbers of Dirac fermions. However, we find that it does contain, as its subgroup, genuine non-Abelian statistics for the exchange of the identical particles, that is, vortices with the same number of Dirac fermions. This result is surprising compared with conventional understanding because all Dirac fermions are defined locally at each vortex, unlike the case of Majorana fermions for which Dirac fermions are defined nonlocally by Majorana fermions located at two spatially separated vortices.
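
The algebraic heart of the claim above is that exchange operators form a non-Abelian braid-group representation: they satisfy the braid relation yet fail to commute. As a stand-in illustration (the reduced Burau representation of B_3, not the paper's vortex operators):

```python
# Braid generators that satisfy s1 s2 s1 = s2 s1 s2 but do not commute.
import numpy as np

t = 2.0  # generic parameter; any value away from roots of unity works
s1 = np.array([[-t, 1.0], [0.0, 1.0]])
s2 = np.array([[1.0, 0.0], [t, -t]])

braid_ok = np.allclose(s1 @ s2 @ s1, s2 @ s1 @ s2)   # braid relation holds
commute = np.allclose(s1 @ s2, s2 @ s1)               # generators commute?
print(braid_ok, commute)  # → True False
```

Non-commuting exchange matrices are exactly what "non-Abelian statistics" means operationally: the final state depends on the order of exchanges.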

  19. Cascaded Amplitude Modulations in Sound Texture Perception

    PubMed Central

    McWalter, Richard; Dau, Torsten

    2017-01-01

    Sound textures, such as crackling fire or chirping crickets, represent a broad class of sounds defined by their homogeneous temporal structure. It has been suggested that the perception of texture is mediated by time-averaged summary statistics measured from early auditory representations. In this study, we investigated the perception of sound textures that contain rhythmic structure, specifically second-order amplitude modulations that arise from the interaction of different modulation rates, previously described as “beating” in the envelope-frequency domain. We developed an auditory texture model that utilizes a cascade of modulation filterbanks that capture the structure of simple rhythmic patterns. The model was examined in a series of psychophysical listening experiments using synthetic sound textures—stimuli generated using time-averaged statistics measured from real-world textures. In a texture identification task, our results indicated that second-order amplitude modulation sensitivity enhanced recognition. Next, we examined the contribution of the second-order modulation analysis in a preference task, where the proposed auditory texture model was preferred over a range of model deviants that lacked second-order modulation rate sensitivity. Lastly, the discriminability of textures that included second-order amplitude modulations appeared to be perceived using a time-averaging process. Overall, our results demonstrate that the inclusion of second-order modulation analysis generates improvements in the perceived quality of synthetic textures compared to the first-order modulation analysis considered in previous approaches. PMID:28955191

  20. Cascaded Amplitude Modulations in Sound Texture Perception.

    PubMed

    McWalter, Richard; Dau, Torsten

    2017-01-01

    Sound textures, such as crackling fire or chirping crickets, represent a broad class of sounds defined by their homogeneous temporal structure. It has been suggested that the perception of texture is mediated by time-averaged summary statistics measured from early auditory representations. In this study, we investigated the perception of sound textures that contain rhythmic structure, specifically second-order amplitude modulations that arise from the interaction of different modulation rates, previously described as "beating" in the envelope-frequency domain. We developed an auditory texture model that utilizes a cascade of modulation filterbanks that capture the structure of simple rhythmic patterns. The model was examined in a series of psychophysical listening experiments using synthetic sound textures-stimuli generated using time-averaged statistics measured from real-world textures. In a texture identification task, our results indicated that second-order amplitude modulation sensitivity enhanced recognition. Next, we examined the contribution of the second-order modulation analysis in a preference task, where the proposed auditory texture model was preferred over a range of model deviants that lacked second-order modulation rate sensitivity. Lastly, the discriminability of textures that included second-order amplitude modulations appeared to be perceived using a time-averaging process. Overall, our results demonstrate that the inclusion of second-order modulation analysis generates improvements in the perceived quality of synthetic textures compared to the first-order modulation analysis considered in previous approaches.
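
The time-averaged summary statistics both versions of this abstract describe can be sketched minimally. The filter choices below (a Hilbert envelope plus one Butterworth modulation band) are my own assumptions, not the authors' auditory model:

```python
# Time-averaged texture statistics from an amplitude envelope.
import numpy as np
from scipy.signal import hilbert, butter, sosfiltfilt

fs = 8000
rng = np.random.default_rng(3)
# synthetic "texture": a noise carrier amplitude-modulated at 4 Hz
tt = np.arange(fs * 2) / fs
x = (1 + 0.8 * np.sin(2 * np.pi * 4 * tt)) * rng.normal(size=tt.size)

envelope = np.abs(hilbert(x))                       # first-order envelope
# one band of a modulation filterbank centred near the 4 Hz rate
sos = butter(2, [2, 8], btype="bandpass", fs=fs, output="sos")
mod_band = sosfiltfilt(sos, envelope - envelope.mean())

# time-averaged summary statistics of the kind texture synthesis matches
stats = {"env_mean": envelope.mean(),
         "env_cv": envelope.std() / envelope.mean(),
         "mod_power": (mod_band ** 2).mean()}
print({k: round(v, 3) for k, v in stats.items()})
```

The paper's second-order ("cascaded") analysis would apply a further filterbank to `mod_band` itself, capturing beating between modulation rates.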

  1. Improving accuracy and power with transfer learning using a meta-analytic database.

    PubMed

    Schwartz, Yannick; Varoquaux, Gaël; Pallier, Christophe; Pinel, Philippe; Poline, Jean-Baptiste; Thirion, Bertrand

    2012-01-01

    Typical cohorts in brain imaging studies are not large enough for systematic testing of all the information contained in the images. To build testable working hypotheses, investigators thus rely on analysis of previous work, sometimes formalized in a so-called meta-analysis. In brain imaging, this approach underlies the specification of regions of interest (ROIs) that are usually selected on the basis of the coordinates of previously detected effects. In this paper, we propose to use a database of images, rather than coordinates, and frame the problem as transfer learning: learning a discriminant model on a reference task to apply it to a different but related new task. To facilitate statistical analysis of small cohorts, we use a sparse discriminant model that selects predictive voxels on the reference task and thus provides a principled procedure to define ROIs. The benefits of our approach are twofold. First it uses the reference database for prediction, i.e., to provide potential biomarkers in a clinical setting. Second it increases statistical power on the new task. We demonstrate on a set of 18 pairs of functional MRI experimental conditions that our approach gives good prediction. In addition, on a specific transfer situation involving different scanners at different locations, we show that voxel selection based on transfer learning leads to higher detection power on small cohorts.
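
A hedged sketch of the transfer idea above: fit a sparse (L1) discriminant model on a "reference task", keep the voxels it selects as an ROI, and reuse them for a small "new task" cohort. Everything here is synthetic, and scikit-learn stands in for whatever solver the authors used:

```python
# Sparse voxel selection on a reference task, transferred to a small cohort.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
n_ref, n_new, n_vox = 200, 30, 500
informative = np.arange(10)                   # 10 truly predictive voxels

def make_task(n):
    X = rng.normal(size=(n, n_vox))
    y = (X[:, informative].sum(axis=1) + rng.normal(size=n)) > 0
    return X, y.astype(int)

X_ref, y_ref = make_task(n_ref)
X_new, y_new = make_task(n_new)

sparse = LogisticRegression(penalty="l1", C=0.5,
                            solver="liblinear").fit(X_ref, y_ref)
roi = np.flatnonzero(sparse.coef_[0])         # voxels kept on the reference task

# small-cohort model restricted to the transferred ROI
clf = LogisticRegression().fit(X_new[:, roi], y_new)
print(len(roi), round(clf.score(X_new[:, roi], y_new), 2))
```

Restricting the small cohort to the transferred ROI is what buys the statistical power the abstract emphasizes: far fewer parameters are estimated on the small sample.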

  2. Statistical issues on the analysis of change in follow-up studies in dental research.

    PubMed

    Blance, Andrew; Tu, Yu-Kang; Baelum, Vibeke; Gilthorpe, Mark S

    2007-12-01

    To provide an overview of the problems in study design and associated analyses of follow-up studies in dental research, particularly addressing three issues: treatment-baseline interactions; statistical power; and nonrandomization. Our previous work has shown that many studies report an interaction between change (from baseline) and baseline values, which is often based on inappropriate statistical analyses. A priori power calculations are essential for randomized controlled trials (RCTs), but in the pre-test/post-test RCT design it is not well known to dental researchers that the choice of statistical method affects power, and that power is affected by treatment-baseline interactions. A common (good) practice in the analysis of RCT data is to adjust for baseline outcome values using ANCOVA, thereby increasing statistical power. However, an important requirement for ANCOVA is that there be no interaction between group and baseline outcome (i.e. effective randomization); the patient-selection process should not cause differences in mean baseline values across groups. This assumption is often violated in nonrandomized (observational) studies, and the use of ANCOVA is thus problematic, potentially giving biased estimates, invoking Lord's paradox and leading to difficulties in the interpretation of results. Baseline interaction issues can be overcome by statistical methods not widely practiced in dental research: Oldham's method and multilevel modelling; the latter is preferred for its greater flexibility in dealing with more than one follow-up occasion as well as additional covariates. To illustrate these three key issues, hypothetical examples are considered from the fields of periodontology, orthodontics, and oral implantology. Caution needs to be exercised when considering the design and analysis of follow-up studies. ANCOVA is generally inappropriate for nonrandomized studies, and causal inferences from observational data should be avoided.
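
Two of the approaches discussed above can be contrasted on synthetic pre-test/post-test data: ANCOVA-style adjustment (regress follow-up on group plus baseline) and Oldham's method (correlate change with the mean of the two occasions). This is a sketch under simulated conditions, not a reanalysis of any dental dataset:

```python
# ANCOVA-style baseline adjustment vs. Oldham's method, synthetic data.
import numpy as np

rng = np.random.default_rng(5)
n = 100
group = rng.integers(0, 2, n)                 # 0 = control, 1 = treatment
baseline = rng.normal(50, 10, n)
followup = baseline + 5 * group + rng.normal(0, 5, n)   # true effect = 5

# ANCOVA via ordinary least squares: followup ~ intercept + group + baseline
X = np.column_stack([np.ones(n), group, baseline])
beta, *_ = np.linalg.lstsq(X, followup, rcond=None)
print("adjusted treatment effect:", round(beta[1], 1))

# Oldham's method: change vs. mean of the two occasions
change = followup - baseline
mean_bf = (followup + baseline) / 2
r_oldham = np.corrcoef(change, mean_bf)[0, 1]
print("Oldham correlation:", round(r_oldham, 2))
```

With randomized groups the ANCOVA coefficient recovers the true effect; the abstract's warning is that the same model is biased when group and baseline are confounded.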

  3. Association analysis of 9,560 prostate cancer cases from the International Consortium of Prostate Cancer Genetics confirms the role of reported prostate-cancer associated SNPs for familial disease

    PubMed Central

    Teerlink, Craig C.; Thibodeau, Stephen N.; McDonnell, Shannon K.; Schaid, Daniel J.; Rinckleb, Antje; Maier, Christiane; Vogel, Walther; Cancel-Tassin, Geraldine; Egrot, Christophe; Cussenot, Olivier; Foulkes, William D.; Giles, Graham G.; Hopper, John L.; Severi, Gianluca; Eeles, Ros; Easton, Douglas; Kote-Jarai, Zsofia; Guy, Michelle; Cooney, Kathleen A.; Ray, Anna M.; Zuhlke, Kimberly A.; Lange, Ethan M.; FitzGerald, Liesel M.; Stanford, Janet L.; Ostrander, Elaine A.; Wiley, Kathleen E.; Isaacs, Sarah D.; Walsh, Patrick C.; Isaacs, William B.; Wahlfors, Tiina; Tammela, Teuvo; Schleutker, Johanna; Wiklund, Fredrik; Grönberg, Henrik; Emanuelsson, Monica; Carpten, John; Bailey-Wilson, Joan; Whittemore, Alice S.; Oakley-Girvan, Ingrid; Hsieh, Chih-Lin; Catalona, William J.; Zheng, S. Lilly; Jin, Guangfu; Lu, Lingyi; Xu, Jianfeng; Camp, Nicola J.; Cannon-Albright, Lisa A.

    2013-01-01

    Previous GWAS studies have reported significant associations between various common SNPs and prostate cancer risk using cases unselected for family history. How these variants influence risk in familial prostate cancer is not well studied. Here, we analyzed 25 previously reported SNPs across 14 loci from prior prostate cancer GWAS. The International Consortium for Prostate Cancer Genetics (ICPCG) previously validated some of these using a family-based association method (FBAT). However, this approach suffered reduced power due to the conditional statistics implemented in FBAT. Here, we use a case-control design with an empirical analysis strategy to analyze the ICPCG resource for association between these 25 SNPs and familial prostate cancer risk. Fourteen sites contributed 12,506 samples (9,560 prostate cancer cases, 3,368 with aggressive disease, and 2,946 controls from 2,283 pedigrees). We performed association analysis with Genie software which accounts for relationships. We analyzed all familial prostate cancer cases and the subset of aggressive cases. For the familial prostate cancer phenotype, 20 of the 25 SNPs were at least nominally associated with prostate cancer and 16 remained significant after multiple testing correction (p≤1E−3), occurring on chromosomal bands 6q25, 7p15, 8q24, 10q11, 11q13, 17q12, 17q24, and Xp11. For aggressive disease, 16 of the SNPs had at least nominal evidence and 8 were statistically significant, including 2p15. The results indicate that the majority of common, low-risk alleles identified in GWAS studies for all prostate cancer also contribute risk for familial prostate cancer, and that some may also contribute risk to aggressive disease. PMID:24162621
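
The per-SNP testing with a multiple-testing threshold described above can be sketched as allele-count contingency tests. The genotype counts below are simulated, and a simple Bonferroni-style threshold stands in for the study's empirical correction:

```python
# Per-SNP allele-count chi-square tests with a Bonferroni threshold.
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(6)
n_cases, n_controls, n_snps = 1000, 1000, 25
p_values = []
for s in range(n_snps):
    # the first 5 SNPs carry a real allele-frequency difference
    f_case = 0.35 if s < 5 else 0.30
    case_alt = rng.binomial(2 * n_cases, f_case)
    ctrl_alt = rng.binomial(2 * n_controls, 0.30)
    table = [[case_alt, 2 * n_cases - case_alt],
             [ctrl_alt, 2 * n_controls - ctrl_alt]]
    p_values.append(chi2_contingency(table)[1])

threshold = 0.05 / n_snps                    # Bonferroni-corrected alpha
print(sum(p < threshold for p in p_values), "of", n_snps, "significant")
```

Family data additionally require accounting for relatedness (as Genie does), which this independent-sample sketch deliberately omits.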

  4. The Associations between Infant Homicide, Homicide, and Suicide Rates: An Analysis of World Health Organization and Centers for Disease Control Statistics

    ERIC Educational Resources Information Center

    Large, Matthew; Nielssen, Olav; Lackersteen, Steven; Smith, Glen

    2010-01-01

    Previous studies have found that rates of homicide of children aged under one (infant homicide) are associated with rates of suicide, but not with rates of homicide. Linear regression was used to examine associations among infant homicide, homicide, and suicide in samples of regions in the United States and other countries. Infant homicide rates…

  5. CALL versus Paper: In Which Context Are L1 Glosses More Effective?

    ERIC Educational Resources Information Center

    Taylor, Alan M.

    2013-01-01

    CALL glossing in first language (L1) or second language (L2) texts has been shown by previous studies to be more effective than traditional, paper-and-pen L1 glossing. Using a pool of studies with much more statistical power and more accurate results, this meta-analysis demonstrates more precisely the degree to which CALL L1 glossing can be more…

  6. Persistence of discrimination: Revisiting Axtell, Epstein and Young

    NASA Astrophysics Data System (ADS)

    Weisbuch, Gérard

    2018-02-01

    We reformulate an earlier model of the "Emergence of classes..." proposed by Axtell et al. (2001) using more elaborate cognitive processes allowing a statistical physics approach. The thorough analysis of the phase space and of the basins of attraction leads to a reconsideration of the previous social interpretations: our model predicts the reinforcement of discrimination biases and their long term stability rather than the emergence of classes.

  7. Reproductive function in relation to duty assignments among military personnel.

    PubMed

    Schrader, S M; Langford, R E; Turner, T W; Breitenstein, M J; Clark, J C; Jenkins, B L; Lundy, D O; Simon, S D; Weyandt, T B

    1998-01-01

    As a follow-up to the pilot study of semen quality of soldiers with various military assignments, a larger, more complete study was conducted. Soldiers were recruited at Fort Hood, Texas. Thirty-three men were exposed to radar as part of their duty assignment in the Signal Corps, 57 men were involved with firing the 155-mm howitzer (potential lead exposure), and 103 soldiers had neither lead nor radar exposure and served as the comparison control group. Both serum and urinary follicle-stimulating hormone and luteinizing hormone and serum, salivary, and urine testosterone levels were determined in all men. A complete semen analysis was conducted on each soldier. For statistical analysis, the primary study variables were: sperm concentration, sperm/ejaculate, semen volume, percent normal morphology, percent motile, percent viable (both vital stain and hypoosmotic swelling), curvilinear velocity, straight-line velocity, linearity, sperm head length, width, area, and perimeter. Variables were adjusted for significant confounders (e.g., abstinence, sample age, race). No statistical differences (P < 0.05) were observed in any measurement. While these results are in agreement with two previous studies assessing soldiers firing the 155-mm howitzer, they contradict our previous report indicating that radar exposure caused a significant decrease in sperm numbers. A possible explanation is that the radar exposure in this study was that used in Signal Corps operations, while the men in the previous study were using different radar as part of military intelligence operations. The data presented here on men firing the 155-mm howitzer, combined with the results from the previous studies, confirm that there are no deficits in semen quality in these men. The contradiction between the results of the radar exposure studies indicates that more data are needed to evaluate the relationship of military radar and male reproductive health.

  8. A Physics-Inspired Mechanistic Model of Migratory Movement Patterns in Birds.

    PubMed

    Revell, Christopher; Somveille, Marius

    2017-08-29

    In this paper, we introduce a mechanistic model of migratory movement patterns in birds, inspired by ideas and methods from physics. Previous studies have shed light on the factors influencing bird migration but have mainly relied on statistical correlative analysis of tracking data. Our novel method offers a bottom-up explanation of population-level migratory movement patterns. It differs from previous mechanistic models of animal migration and enables predictions of pathways and destinations from a given starting location. We define an environmental potential landscape from environmental data and simulate bird movement within this landscape based on simple decision rules drawn from statistical mechanics. We explore the capacity of the model by qualitatively comparing simulation results to the non-breeding migration patterns of a seabird species, the Black-browed Albatross (Thalassarche melanophris). This minimal, two-parameter model was able to capture remarkably well the previously documented migration patterns of the Black-browed Albatross, with the best combination of parameter values conserved across multiple geographically separate populations. Our physics-inspired mechanistic model could be applied to other bird and highly mobile species, improving our understanding of the relative importance of various factors driving migration and making predictions that could be useful for conservation.
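
The "decision rules drawn from statistical mechanics" can be sketched in one dimension: an agent on an environmental potential chooses among neighbouring cells with Boltzmann-like probabilities, so it drifts toward favourable (low-potential) regions. The rules and parameters below are my own minimal assumptions, not the authors' model:

```python
# A walker on a 1-D potential choosing moves with Boltzmann weights.
import numpy as np

rng = np.random.default_rng(7)
x_grid = np.linspace(-5, 5, 201)
potential = (x_grid - 2.0) ** 2           # single favourable region near x = 2

def step(i, beta=2.0):
    """Move left/right/stay with probabilities ~ exp(-beta * potential)."""
    neighbours = [max(i - 1, 0), i, min(i + 1, len(x_grid) - 1)]
    w = np.exp(-beta * potential[neighbours])
    return int(rng.choice(neighbours, p=w / w.sum()))

i = 0                                      # start far from the minimum
for _ in range(2000):
    i = step(i)
print("final position:", round(x_grid[i], 1))
```

The inverse-temperature parameter `beta` plays the role of one of the model's two free parameters: higher values make movement more deterministic.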

  9. Incidence of post-operative adhesions following Misgav Ladach caesarean section--a comparative study.

    PubMed

    Fatusić, Zlatan; Hudić, Igor

    2009-02-01

    To evaluate the incidence of peritoneal adhesions as a post-operative complication after caesarean section following the Misgav Ladach method and compare it with peritoneal adhesions following traditional caesarean section methods (Pfannenstiel-Dörffler, low midline laparotomy-Dörffler). The analysis is retrospective and is based on medical documentation of the Clinic for Gynecology and Obstetrics, University Clinical Centre, Tuzla, Bosnia and Herzegovina (data from 1 January 2001 to 31 December 2005). We analysed previous caesarean sections according to method (200 by the Misgav Ladach method, 100 by the Pfannenstiel-Dörffler method and 100 by low midline laparotomy-Dörffler). Adhesion scores were assigned using a previously validated scoring system. We found a statistically significant difference (p < 0.05) in the incidence of peritoneal adhesions at second and third caesarean section between the Misgav Ladach method and the Pfannenstiel-Dörffler and low midline laparotomy-Dörffler methods. The difference in incidence of peritoneal adhesions between the low midline laparotomy-Dörffler and Pfannenstiel-Dörffler methods was not statistically significant (p > 0.05). The mean pelvic adhesion score was statistically lower in the Misgav Ladach group (0.43 +/- 0.79) than in the Pfannenstiel-Dörffler (0.71 +/- 1.27) and low midline laparotomy-Dörffler groups (0.99 +/- 1.49) (p < 0.05). Our study showed that the Misgav Ladach method of caesarean section results in a lower incidence of peritoneal adhesions as a post-operative complication of previous caesarean section.

  10. A method for screening active components from Chinese herbs by cell membrane chromatography-offline-high performance liquid chromatography/mass spectrometry and an online statistical tool for data processing.

    PubMed

    Cao, Yan; Wang, Shaozhan; Li, Yinghua; Chen, Xiaofei; Chen, Langdong; Wang, Dongyao; Zhu, Zhenyu; Yuan, Yongfang; Lv, Diya

    2018-03-09

    Cell membrane chromatography (CMC) has been successfully applied to screen bioactive compounds from Chinese herbs for many years, and some offline and online two-dimensional (2D) CMC-high performance liquid chromatography (HPLC) hyphenated systems have been established to perform screening assays. However, the requirement of sample preparation steps for the second-dimensional analysis in offline systems and the need for an interface device and technical expertise in the online system limit their extensive use. In the present study, an offline 2D CMC-HPLC analysis combined with the XCMS (various forms of chromatography coupled to mass spectrometry) Online statistical tool for data processing was established. First, our previously reported online 2D screening system was used to analyze three Chinese herbs that were reported to have potential anti-inflammatory effects, and two binding components were identified. By contrast, the proposed offline 2D screening method with XCMS Online analysis was applied, and three more ingredients were discovered in addition to the two compounds revealed by the online system. Then, cross-validation of the three compounds was performed, and they were confirmed to be included in the online data as well, but were not identified there because of their low concentrations and lack of credible statistical approaches. Last, pharmacological experiments showed that these five ingredients could inhibit IL-6 release and IL-6 gene expression on LPS-induced RAW cells in a dose-dependent manner. Compared with previous 2D CMC screening systems, this newly developed offline 2D method needs no sample preparation steps for the second-dimensional analysis, and it is sensitive, efficient, and convenient. It will be applicable in identifying active components from Chinese herbs and practical in discovery of lead compounds derived from herbs. Copyright © 2018 Elsevier B.V. All rights reserved.

  11. Risk Factors for Developing Scoliosis in Cerebral Palsy: A Cross-Sectional Descriptive Study.

    PubMed

    Bertoncelli, Carlo M; Solla, Federico; Loughenbury, Peter R; Tsirikos, Athanasios I; Bertoncelli, Domenico; Rampal, Virginie

    2017-06-01

    This study aims to identify the risk factors leading to the development of severe scoliosis among children with cerebral palsy. A cross-sectional descriptive study of 70 children (aged 12-18 years) with severe spastic and/or dystonic cerebral palsy treated in a single specialist unit is described. Statistical analysis included Fisher exact test and logistic regression analysis to identify risk factors. Severe scoliosis is more likely to occur in patients with intractable epilepsy (P = .008), poor gross motor functional assessment scores (P = .018), limb spasticity (P = .045), a history of previous hip surgery (P = .048), and in nonambulatory patients (P = .013). The logistic regression model confirmed that the major risk factors are previous hip surgery (P = .001), moderate to severe epilepsy (P = .007), and female gender (P = .03). History of previous hip surgery, intractable epilepsy, and female gender are predictors of developing severe scoliosis in children with cerebral palsy. This knowledge should aid in the early diagnosis of scoliosis and timely referral to specialist services.

  12. Propositional idea density in older men's written language: findings from the HIMS study using computerised analysis.

    PubMed

    Spencer, Elizabeth; Ferguson, Alison; Craig, Hugh; Colyvas, Kim; Hankey, Graeme J; Flicker, Leon

    2015-02-01

    Decline in linguistic function has been associated with decline in cognitive function in previous research. This research investigated the informativeness of written language samples of Australian men from the Health in Men's Study (HIMS) aged from 76 to 93 years using the Computerised Propositional Idea Density Rater (CPIDR 5.1). In total, 60,255 words in 1147 comments were analysed using a linear mixed model for statistical analysis. Results indicated no relationship with education level (p = 0.79). Participants for whom English was not their first learnt language showed slightly lower propositional idea density (PD) scores (by 0.018 per word). For those whose first language was English, mean PD per word was 0.494 for comments below 60 words and 0.526 for comments above 60 words. Text length was found to have an effect (p < 0.0001). The mean PD was higher than previously reported for men and lower than previously reported for a similar cohort of Australian women.

  13. Statistical wind analysis for near-space applications

    NASA Astrophysics Data System (ADS)

    Roney, Jason A.

    2007-09-01

    Statistical wind models were developed based on the existing observational wind data for near-space altitudes between 60 000 and 100 000 ft (18-30 km) above ground level (AGL) at two locations, Akron, OH, USA, and White Sands, NM, USA. These two sites are envisioned as playing a crucial role in the first flights of high-altitude airships. The analysis shown in this paper has not been previously applied to this region of the stratosphere for such an application. Standard statistics were compiled for these data such as mean, median, maximum wind speed, and standard deviation, and the data were modeled with Weibull distributions. These statistics indicated that, on a yearly average, there is a lull or a “knee” in the wind between 65 000 and 72 000 ft AGL (20-22 km). From the standard statistics, trends at both locations indicated substantial seasonal variation in the mean wind speed at these heights. The yearly and monthly statistical modeling indicated that Weibull distributions were a reasonable model for the data. Forecasts and hindcasts were done by using a Weibull model based on 2004 data and comparing the model with the 2003 and 2005 data. The 2004 distribution was also a reasonable model for these years. Lastly, the Weibull distribution and cumulative function were used to predict the 50%, 95%, and 99% winds, which are directly related to the expected power requirements of a near-space station-keeping airship. These values indicated that using only the standard deviation of the mean may underestimate the operational conditions.
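
The Weibull modelling and percentile-wind prediction described above can be sketched directly: fit a Weibull distribution to wind-speed samples and read the 50%, 95%, and 99% winds off the fitted quantile function. The wind speeds below are simulated, not the Akron or White Sands observations:

```python
# Fit a Weibull distribution to wind speeds and predict percentile winds.
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(8)
# synthetic wind speeds (m/s) with a Weibull-like shape
speeds = weibull_min.rvs(2.0, scale=10.0, size=1000, random_state=rng)

shape, loc, scale = weibull_min.fit(speeds, floc=0)   # fix location at zero
for q in (0.50, 0.95, 0.99):
    print(f"{q:.0%} wind: {weibull_min.ppf(q, shape, loc, scale):.1f} m/s")
```

The upper percentiles, not the mean plus one standard deviation, are what size the station-keeping power requirement, which is the paper's closing point.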

  14. Maximal use of kinematic information for the extraction of the mass of the top quark in single-lepton tt bar events at DØ

    NASA Astrophysics Data System (ADS)

    Estrada Vigil, Juan Cruz

    The mass of the top (t) quark has been measured in the lepton+jets channel of tt¯ final states studied by the DØ and CDF experiments at Fermilab using data from Run I of the Tevatron pp¯ collider. The result published by DØ is 173.3 +/- 5.6(stat) +/- 5.5(syst) GeV. We present a different method to perform this measurement using the existing data. The new technique uses all available kinematic information in an event, and provides a significantly smaller statistical uncertainty than achieved in previous analyses. The preliminary results presented in this thesis indicate a statistical uncertainty for the extracted mass of the top quark of 3.5 GeV, which represents a significant improvement over the previous value of 5.6 GeV. The method of analysis is very general, and may be particularly useful in situations where there is a small signal and a large background.

  15. Spectral Discrete Probability Density Function of Measured Wind Turbine Noise in the Far Field

    PubMed Central

    Ashtiani, Payam; Denison, Adelaide

    2015-01-01

    Of interest is the spectral character of wind turbine noise at typical residential set-back distances. In this paper, a spectral statistical analysis has been applied to immission measurements conducted at three locations. This method provides discrete probability density functions for the Turbine ONLY component of the measured noise. This analysis is completed for one-third octave sound levels, at integer wind speeds, and is compared to existing metrics for measuring acoustic comfort as well as previous discussions on low-frequency noise sources. PMID:25905097

  16. Application of probabilistic analysis/design methods in space programs - The approaches, the status, and the needs

    NASA Technical Reports Server (NTRS)

    Ryan, Robert S.; Townsend, John S.

    1993-01-01

    The prospective improvement of probabilistic methods for space program analysis/design entails the further development of theories, codes, and tools which match specific areas of application, the drawing of lessons from previous uses of probability and statistics data bases, the enlargement of data bases (especially in the field of structural failures), and the education of engineers and managers on the advantages of these methods. An evaluation is presently made of the current limitations of probabilistic engineering methods. Recommendations are made for specific applications.

  17. NASA DOE POD NDE Capabilities Data Book

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.

    2015-01-01

    This data book contains the Directed Design of Experiments for Validating Probability of Detection (POD) Capability of NDE Systems (DOEPOD) analyses of the nondestructive inspection data presented in the NTIAC, Nondestructive Evaluation (NDE) Capabilities Data Book, 3rd ed., NTIAC DB-97-02. DOEPOD is designed as a decision support system to validate inspection system, personnel, and protocol demonstrating 0.90 POD with 95% confidence at critical flaw sizes, a90/95. The test methodology used in DOEPOD is based on the field of statistical sequential analysis founded by Abraham Wald. Sequential analysis is a method of statistical inference whose characteristic feature is that the number of observations required by the procedure is not determined in advance of the experiment. The decision to terminate the experiment depends, at each stage, on the results of the observations previously made. A merit of the sequential method, as applied to testing statistical hypotheses, is that test procedures can be constructed which require, on average, a substantially smaller number of observations than equally reliable test procedures based on a predetermined number of observations.
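
Wald's sequential analysis, which underlies the DOEPOD test methodology, can be illustrated with a minimal binomial sequential probability ratio test (SPRT). The hypotheses, error rates, and thresholds below are illustrative assumptions chosen for the sketch, not the DOEPOD a90/95 decision rules themselves:

```python
import math

def sprt(outcomes, p0=0.70, p1=0.90, alpha=0.05, beta=0.05):
    """Wald's sequential probability ratio test for a binomial hit rate.
    Tests H1: POD >= p1 against H0: POD <= p0, stopping as soon as the
    accumulated evidence crosses either threshold."""
    upper = math.log((1 - beta) / alpha)   # cross upward -> accept H1
    lower = math.log(beta / (1 - alpha))   # cross downward -> accept H0
    llr = 0.0
    for n, hit in enumerate(outcomes, 1):
        llr += math.log(p1 / p0) if hit else math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "accept p>=0.90", n
        if llr <= lower:
            return "reject", n
    return "continue", len(outcomes)

# An unbroken run of detections terminates early, after far fewer trials
# than a fixed-sample-size demonstration would require
print(sprt([1] * 29))
```

The early stopping shown here is exactly the "substantially smaller number of observations" advantage the abstract attributes to the sequential method.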

  18. Fully Bayesian tests of neutrality using genealogical summary statistics.

    PubMed

    Drummond, Alexei J; Suchard, Marc A

    2008-10-31

    Many data summary statistics have been developed to detect departures from neutral expectations of evolutionary models. However questions about the neutrality of the evolution of genetic loci within natural populations remain difficult to assess. One critical cause of this difficulty is that most methods for testing neutrality make simplifying assumptions simultaneously about the mutational model and the population size model. Consequentially, rejecting the null hypothesis of neutrality under these methods could result from violations of either or both assumptions, making interpretation troublesome. Here we harness posterior predictive simulation to exploit summary statistics of both the data and model parameters to test the goodness-of-fit of standard models of evolution. We apply the method to test the selective neutrality of molecular evolution in non-recombining gene genealogies and we demonstrate the utility of our method on four real data sets, identifying significant departures of neutrality in human influenza A virus, even after controlling for variation in population size. Importantly, by employing a full model-based Bayesian analysis, our method separates the effects of demography from the effects of selection. The method also allows multiple summary statistics to be used in concert, thus potentially increasing sensitivity. Furthermore, our method remains useful in situations where analytical expectations and variances of summary statistics are not available. This aspect has great potential for the analysis of temporally spaced data, an expanding area previously ignored for limited availability of theory and methods.
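
The posterior predictive simulation idea can be shown on a deliberately simple toy: a normal model checked against heavier-tailed data. The coalescent genealogies and neutrality statistics of the paper are far richer; every distribution and summary below is an assumption made purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)
# "Observed" data with heavier tails than the (deliberately wrong) normal model
observed = rng.standard_t(df=3, size=200)

def summary(x):
    # Discrepancy statistic: magnitude of the most extreme value
    return np.max(np.abs(x))

# Stand-in posterior draws for (mu, sigma) under a normal model
mus = observed.mean() + rng.normal(0, observed.std() / np.sqrt(200), 1000)
sigmas = observed.std() * np.sqrt(199 / rng.chisquare(199, 1000))

# Posterior predictive check: simulate replicate data sets from the posterior
# draws and compare their summaries with the observed summary
reps = np.array([summary(rng.normal(m, s, 200)) for m, s in zip(mus, sigmas)])
ppp = np.mean(reps >= summary(observed))  # posterior predictive p-value
print(f"posterior predictive p = {ppp:.3f}")
```

An extreme posterior predictive p-value (near 0 or 1) flags lack of fit, which is the same logic the paper applies with genealogical summary statistics to separate demography from selection.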

  19. MetaboLyzer: A Novel Statistical Workflow for Analyzing Post-Processed LC/MS Metabolomics Data

    PubMed Central

    Mak, Tytus D.; Laiakis, Evagelia C.; Goudarzi, Maryam; Fornace, Albert J.

    2014-01-01

    Metabolomics, the global study of small molecules in a particular system, has in the last few years risen to become a primary –omics platform for the study of metabolic processes. With the ever-increasing pool of quantitative data yielded from metabolomic research, specialized methods and tools with which to analyze and extract meaningful conclusions from these data are becoming more and more crucial. Furthermore, the depth of knowledge and expertise required to undertake a metabolomics oriented study is a daunting obstacle to investigators new to the field. As such, we have created a new statistical analysis workflow, MetaboLyzer, which aims to both simplify analysis for investigators new to metabolomics, as well as provide experienced investigators the flexibility to conduct sophisticated analysis. MetaboLyzer’s workflow is specifically tailored to the unique characteristics and idiosyncrasies of postprocessed liquid chromatography/mass spectrometry (LC/MS) based metabolomic datasets. It utilizes a wide gamut of statistical tests, procedures, and methodologies that belong to classical biostatistics, as well as several novel statistical techniques that we have developed specifically for metabolomics data. Furthermore, MetaboLyzer conducts rapid putative ion identification and putative biologically relevant analysis via incorporation of four major small molecule databases: KEGG, HMDB, Lipid Maps, and BioCyc. MetaboLyzer incorporates these aspects into a comprehensive workflow that outputs easy to understand statistically significant and potentially biologically relevant information in the form of heatmaps, volcano plots, 3D visualization plots, correlation maps, and metabolic pathway hit histograms. For demonstration purposes, a urine metabolomics data set from a previously reported radiobiology study in which samples were collected from mice exposed to gamma radiation was analyzed. 
MetaboLyzer was able to identify 243 statistically significant ions out of a total of 1942. Numerous putative metabolites and pathways were found to be biologically significant from the putative ion identification workflow. PMID:24266674

  20. Analysis of the Einstein sample of early-type galaxies

    NASA Technical Reports Server (NTRS)

    Eskridge, Paul B.; Fabbiano, Giuseppina

    1993-01-01

    The EINSTEIN galaxy catalog contains x-ray data for 148 early-type (E and S0) galaxies. A detailed analysis of the global properties of this sample is presented. By comparing the x-ray properties with other tracers of the ISM, as well as with observables related to the stellar dynamics and populations of the sample, we expect to determine more clearly the physical relationships that determine the evolution of early-type galaxies. Previous studies with smaller samples have explored the relationships between x-ray luminosity (L(sub X)) and luminosities in other bands. Using our larger sample and the statistical techniques of survival analysis, a number of these earlier analyses were repeated. For our full sample, a strong statistical correlation is found between L(sub X) and L(sub B) (the probability that the null hypothesis is upheld is P less than 10(exp -4)) from a variety of rank correlation tests. Regressions with several algorithms yield consistent results.

  1. A comparison of healing rates on two pressure-relieving systems.

    PubMed

    Russell, L; Reynolds, T; Carr, J; Evans, A; Holmes, M

    The authors have previously reported the preliminary results of a randomized-controlled trial comparing the relative efficacy of two pressure-relieving systems: Huntleigh Nimbus 3 and Aura Cushion, and Pegasus Cairwave Therapy System and ProActive Seating Cushion (Russell et al, 2000). Although both the mattresses and cushions were effective treatments for pressure ulcers, the Huntleigh equipment was demonstrated to be statistically more effective for heel ulcers, but no differences were demonstrated for sacral ulcers. This article gives a more detailed analysis of the 141 patients assessed using computerized-image analysis of the digital images of sacral ulcers captured during the trial and specifically discusses the healing rates and other patient characteristics. Ninety-eight per cent of ulcers examined were deemed superficial (Torrance grade 2a, 2b, 3). Precision of image analysis assessed by within- and between-batch coefficients of variation was excellent: calibration CV 0.93-1.84%; area CV 4.61-5.72%. The healing rates on the two mattresses were not shown to be statistically different from each other.

  2. Effect of non-normality on test statistics for one-way independent groups designs.

    PubMed

    Cribbie, Robert A; Fiksenbaum, Lisa; Keselman, H J; Wilcox, Rand R

    2012-02-01

    The data obtained from one-way independent groups designs are typically non-normal in form and rarely equally variable across treatment populations (i.e., population variances are heterogeneous). Consequently, the classical test statistic that is used to assess statistical significance (i.e., the analysis of variance F test) typically provides invalid results (e.g., too many Type I errors, reduced power). For this reason, there has been considerable interest in finding a test statistic that is appropriate under conditions of non-normality and variance heterogeneity. Previously recommended procedures for analysing such data include the James test, the Welch test applied to the usual least squares estimators of central tendency and variability, or the Welch test with robust estimators (i.e., trimmed means and Winsorized variances). A new statistic proposed by Krishnamoorthy, Lu, and Mathew, intended to deal with heterogeneous variances, though not non-normality, uses a parametric bootstrap procedure. In their investigation of the parametric bootstrap test, the authors examined its operating characteristics under limited conditions and did not compare it to the Welch test based on robust estimators. Thus, we investigated how the parametric bootstrap procedure and a modified parametric bootstrap procedure based on trimmed means perform relative to previously recommended procedures when data are non-normal and heterogeneous. The results indicated that the tests based on trimmed means offer the best Type I error control and power when variances are unequal and at least some of the distribution shapes are non-normal. © 2011 The British Psychological Society.
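
The robust-estimator approach the abstract favours can be sketched for the two-group special case. SciPy's `trim` argument to `ttest_ind` implements Yuen's trimmed-means t-test with Winsorized variances; this is a two-sample stand-in, not the multi-group Welch/James procedures the study actually compares, and the data here are synthetic:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Heavy-tailed, heteroscedastic samples (lognormal with different spreads)
g1 = rng.lognormal(0.0, 0.5, 40)
g2 = rng.lognormal(0.3, 1.0, 40)

# Classical Welch test on the raw means (unequal variances, no trimming)
t_welch, p_welch = stats.ttest_ind(g1, g2, equal_var=False)

# Yuen-Welch: 20% trimmed means with Winsorized variances, the robust
# variant recommended in the abstract
t_yuen, p_yuen = stats.ttest_ind(g1, g2, equal_var=False, trim=0.2)
print(f"Welch p={p_welch:.3f}  Yuen-Welch p={p_yuen:.3f}")
```

Under non-normality the trimmed version keeps Type I error closer to nominal because extreme observations no longer dominate the group means and variances.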

  3. Implementing statistical equating for MRCP(UK) Parts 1 and 2.

    PubMed

    McManus, I C; Chis, Liliana; Fox, Ray; Waller, Derek; Tang, Peter

    2014-09-26

    The MRCP(UK) exam, in 2008 and 2010, changed the standard-setting of its Part 1 and Part 2 examinations from a hybrid Angoff/Hofstee method to statistical equating using Item Response Theory, the reference group being UK graduates. The present paper considers the implementation of the change, the question of whether the pass rate increased amongst non-UK candidates, any possible role of Differential Item Functioning (DIF), and changes in examination predictive validity after the change. Data from the MRCP(UK) Part 1 exam (2003-2013) and Part 2 exam (2005-2013) were analysed. Inspection suggested that Part 1 pass rates were stable after the introduction of statistical equating, but showed greater annual variation, probably due to stronger candidates taking the examination earlier. Pass rates seemed to have increased in non-UK graduates after equating was introduced, but this was not associated with any change in DIF after statistical equating. Statistical modelling of the pass rates for non-UK graduates found that pass rates, in both Part 1 and Part 2, were increasing year on year, with the changes probably beginning before the introduction of equating. The predictive validity of Part 1 for Part 2 was higher with statistical equating than with the previous hybrid Angoff/Hofstee method, confirming the utility of IRT-based statistical equating. Statistical equating was successfully introduced into the MRCP(UK) Part 1 and Part 2 written examinations, resulting in higher predictive validity than the previous Angoff/Hofstee standard setting. Concerns about an artefactual increase in pass rates for non-UK candidates after equating were shown not to be well-founded. Most likely the changes resulted from a genuine increase in candidate ability, albeit for reasons which remain unclear, coupled with a cognitive illusion giving the impression of a step-change immediately after equating began. 
Statistical equating provides a robust standard-setting method, with a better theoretical foundation than judgemental techniques such as Angoff, and is more straightforward and requires far less examiner time to provide a more valid result. The present study provides a detailed case study of introducing statistical equating, and issues which may need to be considered with its introduction.

  4. Studies of vorticity imbalance and stability, moisture budget, atmospheric energetics, and gradients of meteorological parameters during AVE 3

    NASA Technical Reports Server (NTRS)

    Scoggins, J. R. (Editor)

    1978-01-01

    Four diagnostic studies of AVE 3 are presented. AVE 3 represents a high wind speed wintertime situation, while most AVEs analyzed previously represented springtime conditions with rather low wind speeds. The general areas of analysis include the examination of budgets of vorticity, moisture, kinetic energy, and potential energy, and a synoptic and statistical study of the horizontal gradients of meteorological parameters. Conclusions are integrated with and compared to those obtained in previously analyzed experiments (mostly springtime weather situations) so as to establish a more definitive understanding of the structure and dynamics of the atmosphere under a wide range of synoptic conditions.

  5. A statistical analysis of energy and power demand for the tractive purposes of an electric vehicle in urban traffic - an analysis of a short and long observation period

    NASA Astrophysics Data System (ADS)

    Slaski, G.; Ohde, B.

    2016-09-01

    The article presents the results of a statistical dispersion analysis of the energy and power demand for tractive purposes of a battery electric vehicle. The authors compare data distributions for different values of average speed in two approaches, namely a short and a long period of observation. The short period of observation (generally around several hundred meters) results from a previously proposed macroscopic energy consumption model based on an average speed per road section. This approach yielded high values of standard deviation and of the coefficient of variation (the ratio between standard deviation and the mean), around 0.7-1.2. The long period of observation (several kilometers long) is similar in length to the standardized speed cycles used in testing a vehicle's energy consumption and available range. The data were analysed to determine the impact of observation length on the variation in energy and power demand. The analysis was based on a simulation of electric power and energy consumption performed with speed profile data recorded in the Poznan agglomeration.
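
The effect of observation length on the coefficient of variation can be reproduced with a toy calculation. The per-section energy figures below are synthetic (the recorded Poznan speed profiles are not available here); the point is only that aggregating many short sections into trip-length windows shrinks the relative dispersion:

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic per-section (~100 m) tractive energy demand in Wh for one
# average-speed bin; a gamma shape of 2 gives CV ~ 0.71, inside the
# 0.7-1.2 range reported for short observations
per_section = rng.gamma(shape=2.0, scale=30.0, size=4000)

def cv(x):
    """Coefficient of variation: standard deviation over the mean."""
    return x.std(ddof=1) / x.mean()

# Short observation period: individual road sections
cv_short = cv(per_section)

# Long observation period: aggregate 40 consecutive sections (~4 km)
per_trip = per_section.reshape(-1, 40).sum(axis=1)
cv_long = cv(per_trip)
print(f"CV short={cv_short:.2f}  CV long={cv_long:.2f}")
```

For independent sections the CV of a sum of n sections falls roughly as 1/sqrt(n), which is why cycle-length observation windows show far less dispersion than per-section ones.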

  6. [Basic concepts for network meta-analysis].

    PubMed

    Catalá-López, Ferrán; Tobías, Aurelio; Roqué, Marta

    2014-12-01

    Systematic reviews and meta-analyses have long been fundamental tools for evidence-based clinical practice. Initially, meta-analyses were proposed as a technique that could improve the accuracy and the statistical power of previous research from individual studies with small sample size. However, one of its main limitations has been the fact of being able to compare no more than two treatments in an analysis, even when the clinical research question necessitates that we compare multiple interventions. Network meta-analysis (NMA) uses novel statistical methods that incorporate information from both direct and indirect treatment comparisons in a network of studies examining the effects of various competing treatments, estimating comparisons between many treatments in a single analysis. Despite its potential limitations, NMA applications in clinical epidemiology can be of great value in situations where there are several treatments that have been compared against a common comparator. Also, NMA can be relevant to a research or clinical question when many treatments must be considered or when there is a mix of both direct and indirect information in the body of evidence. Copyright © 2013 Elsevier España, S.L.U. All rights reserved.

  7. Use of density equalizing map projections (DEMP) in the analysis of childhood cancer in four California counties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merrill, D.W.; Selvin, S.; Close, E.R.

    In studying geographic disease distributions, one normally compares rates of arbitrarily defined geographic subareas (e.g. census tracts), thereby sacrificing the geographic detail of the original data. The sparser the data, the larger the subareas must be in order to calculate stable rates. This dilemma is avoided with the technique of Density Equalizing Map Projections (DEMP). Boundaries of geographic subregions are adjusted to equalize population density over the entire study area. Case locations plotted on the transformed map should have a uniform distribution if the underlying disease rates are constant. On the transformed map, the statistical analysis of the observed distribution is greatly simplified. Even for sparse distributions, the statistical significance of a supposed disease cluster can be reliably calculated. The present report describes the first successful application of the DEMP technique to a sizeable "real-world" data set of epidemiologic interest. An improved DEMP algorithm [GUSE93, CLOS94] was applied to a data set previously analyzed with conventional techniques [SATA90, REYN91]. The results from the DEMP analysis and a conventional analysis are compared.

  8. General Aviation Avionics Statistics : 1975

    DOT National Transportation Integrated Search

    1978-06-01

    This report presents avionics statistics for the 1975 general aviation (GA) aircraft fleet and updates a previous publication, General Aviation Avionics Statistics: 1974. The statistics are presented in a capability group framework which enables one ...

  9. The writer independent online handwriting recognition system frog on hand and cluster generative statistical dynamic time warping.

    PubMed

    Bahlmann, Claus; Burkhardt, Hans

    2004-03-01

    In this paper, we give a comprehensive description of our writer-independent online handwriting recognition system frog on hand. The focus of this work concerns the presentation of the classification/training approach, which we call cluster generative statistical dynamic time warping (CSDTW). CSDTW is a general, scalable, HMM-based method for variable-sized, sequential data that holistically combines cluster analysis and statistical sequence modeling. It can handle general classification problems that rely on this sequential type of data, e.g., speech recognition, genome processing, robotics, etc. Contrary to previous attempts, clustering and statistical sequence modeling are embedded in a single feature space and use a closely related distance measure. We show character recognition experiments of frog on hand using CSDTW on the UNIPEN online handwriting database. The recognition accuracy is significantly higher than reported results of other handwriting recognition systems. Finally, we describe the real-time implementation of frog on hand on a Linux Compaq iPAQ embedded device.
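
The alignment machinery underlying CSDTW can be illustrated with plain dynamic time warping. This is a textbook sketch of the distance computation only, not the cluster-generative statistical variant the paper develops:

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D sequences, using the
    symmetric step pattern (match, insertion, deletion) and absolute-value
    local cost. Runs in O(len(a) * len(b))."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j - 1],  # match
                                 D[i - 1, j],      # insertion
                                 D[i, j - 1])      # deletion
    return D[n, m]

# Warping absorbs the duplicated sample, so these two strokes align perfectly
print(dtw_distance([0, 1, 2, 1, 0], [0, 0, 1, 2, 1, 0]))  # → 0.0
```

CSDTW replaces the fixed reference sequence with cluster prototypes carrying statistical models at each position, but the same lattice recursion drives the alignment.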

  10. Association between Asymptomatic Urinary Tract Infection and Postoperative Spine Infection in Elderly Women : A Retrospective Analysis Study

    PubMed Central

    Lee, Seung-Eun; Park, Yong-Sook; Kim, Young-Baeg

    2010-01-01

    Objective The purpose of this study is to identify the relationship between asymptomatic urinary tract infection (aUTI) and postoperative spine infection. Methods A retrospective review was done in 355 women more than 65 years old who had undergone laminectomy and/or discectomy, and spinal fusion, between January 2004 and December 2008. Previously postulated risk factors (i.e., instrumentation, diabetes, prior corticosteroid therapy, previous spinal surgery, and smoking) were investigated. Furthermore, we added aUTI that was not previously considered. Results Among 355 patients, 42 met the criteria for aUTI (bacteriuria ≥ 10⁵ CFU/mL and no associated symptoms). A postoperative spine infection was evident in 15 of 355 patients. Of the previously described risk factors, multi-levels (p < 0.05), instrumentation (p < 0.05) and diabetes (p < 0.05) were proven risk factors, whereas aUTI (p > 0.05) was not statistically significant. However, aUTI with Foley catheterization was statistically significant when Foley catheterization was added as a variable to all the existing risk factors. Conclusion aUTI is not rare in elderly women admitted to the hospital for lumbar spine surgery. The results of this study suggest that aUTI with Foley catheterization may be considered a risk factor for postoperative spine infection in elderly women. Therefore, we would consider treating aUTI before operating on elderly women who will need Foley catheterization. PMID:20461166

  11. Non-linear dielectric spectroscopy of microbiological suspensions

    PubMed Central

    Treo, Ernesto F; Felice, Carmelo J

    2009-01-01

    Background Non-linear dielectric spectroscopy (NLDS) of microorganisms is characterized by the generation of harmonics in the polarization current when a microorganism suspension is exposed to a sinusoidal electric field. The biological nonlinear response initially described was not well verified by other authors, and the results were susceptible to ambiguous interpretation. In this paper NLDS was performed on yeast suspensions in tripolar and tetrapolar configurations with a recently developed analyzer. Methods Tripolar analysis was carried out by applying sinusoidal voltages up to 1 V at the electrode interface. Tetrapolar analysis was carried out with sinusoidal field strengths from 0.1 V cm-1 to 70 V cm-1. Both analyses were performed within a frequency range from 1 Hz through 100 Hz. The harmonic amplitudes were Fourier-analyzed and expressed in dB. The third harmonic, as reported previously, was investigated. Statistical analysis (ANOVA) was used to test the effect of an inhibitor and an activator of a plasma membrane enzyme on the measured response. Results No significant non-linearities were observed in tetrapolar analysis, and no observable changes occurred when the inhibitor and activator were added to the suspension. Statistical analysis confirmed these results. When a pure sinusoidal voltage was applied to an electrode-yeast suspension interface, variations higher than 25 dB in the 3rd harmonic were observed. Variations higher than 20 dB in the 3rd harmonic were also found when adding an inhibitor or activator of the membrane-bound enzymes. These variations did not occur when the suspension was boiled. Discussion The lack of results in tetrapolar cells suggests that there is little, if any, harmonic generation in the bulk of a microbiological suspension. The non-linear response observed originated in the electrode-electrolyte interface.
    The frequency and voltage windows observed in previous tetrapolar analyses were repeated in the tripolar measurements, but maxima were not observed at the same values. Conclusion Contrary to previous assertions, no repeatable dielectric non-linearity was exhibited in the bulk suspensions tested under the field and frequency conditions reported with this recently designed analyzer. Indeed, interface-related harmonics were observed and monitored during biochemical stimuli. The changes were coherent with the expected biological response. PMID:19772595

  12. Breastfeeding is positively associated with child intelligence even net of parental IQ.

    PubMed

    Kanazawa, Satoshi

    2015-12-01

    Some previous reviews conclude that breastfeeding is not significantly associated with increased intelligence in children once mother's IQ is statistically controlled. The conclusion may potentially have both theoretical and methodological problems. The National Child Development Study allows the examination of the effect of breastfeeding on intelligence in two consecutive generations of British children. The analysis of the first generation shows that the effect of breastfeeding on intelligence increases from Age 7 to 16. The analysis of the second generation shows that each month of breastfeeding, net of parental IQ and other potential confounds, is associated with an increase of .16 IQ points. Further analyses suggest that some previous studies may have failed to uncover the effect of breastfeeding on child intelligence because of their reliance on one IQ test. (c) 2015 APA, all rights reserved.

  13. [New design of the Health Survey of Catalonia (Spain, 2010-2014): a step forward in health planning and evaluation].

    PubMed

    Alcañiz-Zanón, Manuela; Mompart-Penina, Anna; Guillén-Estany, Montserrat; Medina-Bustos, Antonia; Aragay-Barbany, Josep M; Brugulat-Guiteras, Pilar; Tresserras-Gaju, Ricard

    2014-01-01

    This article presents the genesis of the Health Survey of Catalonia (Spain, 2010-2014) with its semiannual subsamples and explains the basic characteristics of its multistage sampling design. In comparison with previous surveys, the organizational advantages of this new statistical operation include rapid data availability and the ability to continuously monitor the population. The main benefits are timeliness in the production of indicators and the possibility of introducing new topics through the supplemental questionnaire as a function of needs. Limitations consist of the complexity of the sample design and the lack of longitudinal follow-up of the sample. Suitable sampling weights for each specific subsample are necessary for any statistical analysis of micro-data. Accuracy in the analysis of territorial disaggregation or population subgroups increases if annual samples are accumulated. Copyright © 2013 SESPAS. Published by Elsevier Espana. All rights reserved.

  14. Effects of research tool patents on biotechnology innovation in a developing country: A case study of South Korea

    PubMed Central

    Kang, Kyung-Nam; Ryu, Tae-Kyu; Lee, Yoon-Sik

    2009-01-01

    Background Concerns have recently been raised about the negative effects of patents on innovation. In this study, the effects of patents on innovations in the Korean biotech SMEs (small and medium-sized entrepreneurs) were examined using survey data and statistical analysis. Results The survey results of this study provided some evidence that restricted access problems have occurred even though their frequency was not high. Statistical analysis revealed that difficulties in accessing patented research tools were not negatively correlated with the level of innovation performance and attitudes toward the patent system. Conclusion On the basis of the results of this investigation in combination with those of previous studies, we concluded that although restricted access problems have occurred, this has not yet deterred innovation in Korea. However, potential problems do exist, and the effects of restricted access should be constantly scrutinized. PMID:19321013

  15. Statistical intensity variation analysis for rapid volumetric imaging of capillary network flux

    PubMed Central

    Lee, Jonghwan; Jiang, James Y.; Wu, Weicheng; Lesage, Frederic; Boas, David A.

    2014-01-01

    We present a novel optical coherence tomography (OCT)-based technique for rapid volumetric imaging of red blood cell (RBC) flux in capillary networks. Previously we reported that OCT can capture individual RBC passage within a capillary, where the OCT intensity signal at a voxel fluctuates when an RBC passes the voxel. Based on this finding, we defined a metric of statistical intensity variation (SIV) and validated that the mean SIV is proportional to the RBC flux [RBC/s] through simulations and measurements. From rapidly scanned volume data, we used Hessian matrix analysis to vectorize a segment path of each capillary and estimate its flux from the mean of the SIVs gathered along the path. Repeating this process led to a 3D flux map of the capillary network. The present technique enabled us to trace the RBC flux changes over hundreds of capillaries with a temporal resolution of ~1 s during functional activation. PMID:24761298
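
The SIV metric can be sketched as the temporal standard deviation of a voxel's intensity time series, which rises with the number of RBC transits. Everything below is a toy simulation under stated assumptions; it reads the paper's metric as a plain temporal standard deviation and invents the transit amplitude and noise level for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

def siv(intensity_ts):
    """Statistical intensity variation: temporal standard deviation of the
    OCT intensity at a voxel (one plausible reading of the paper's metric)."""
    return np.std(intensity_ts, ddof=1)

def simulate_voxel(n_frames=200, flux=5, noise=0.05):
    """Toy voxel time series: baseline noise plus `flux` brief RBC transits,
    each modeled as a one-frame intensity jump (hypothetical amplitude)."""
    ts = rng.normal(1.0, noise, n_frames)
    for t in rng.choice(n_frames, size=flux, replace=False):
        ts[t] += 0.5  # intensity change while an RBC crosses the voxel
    return ts

low = np.mean([siv(simulate_voxel(flux=2)) for _ in range(50)])
high = np.mean([siv(simulate_voxel(flux=20)) for _ in range(50)])
print(f"mean SIV at low flux={low:.3f}, at high flux={high:.3f}")
```

Averaging SIVs along a vectorized capillary path, as the paper does, then turns this per-voxel quantity into a flux estimate for each segment of the network.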

  16. Analysis of First-Time Unsuccessful Attempts on the Certified Nurse Educator Examination.

    PubMed

    Lundeen, John D

    This retrospective analysis examined first-time unsuccessful attempts on the Certified Nurse Educator (CNE) examination from September 2005 through September 2011 (n = 390). There are few studies examining certification within the academic nurse educator role, and there is a lack of evidence to help nurse educators understand the factors that best support success on the CNE exam. The study used a nonexperimental, descriptive, retrospective correlational design with chi-square tests of independence and factorial analyses of variance. A statistically significant relationship was found between first-time failure on the CNE exam and both highest degree obtained and institutional affiliation. There was no statistically significant effect of highest degree or institutional affiliation on mean scores in any of the six content areas measured by the CNE exam. The findings from this study support a previous recommendation for faculty development, experience in the role, and doctoral preparation prior to seeking certification.

  17. Effects of research tool patents on biotechnology innovation in a developing country: a case study of South Korea.

    PubMed

    Kang, Kyung-Nam; Ryu, Tae-Kyu; Lee, Yoon-Sik

    2009-03-26

Concerns have recently been raised about the negative effects of patents on innovation. In this study, the effects of patents on innovation in Korean biotech SMEs (small and medium-sized enterprises) were examined using survey data and statistical analysis. The survey results provided some evidence that restricted-access problems have occurred, even though their frequency was not high. Statistical analysis revealed that difficulties in accessing patented research tools were not negatively correlated with the level of innovation performance or attitudes toward the patent system. On the basis of these results, in combination with those of previous studies, we concluded that although restricted-access problems have occurred, they have not yet deterred innovation in Korea. However, potential problems do exist, and the effects of restricted access should be constantly scrutinized.

  18. Geographically Sourcing Cocaine's Origin - Delineation of the Nineteen Major Coca Growing Regions in South America.

    PubMed

    Mallette, Jennifer R; Casale, John F; Jordan, James; Morello, David R; Beyer, Paul M

    2016-03-23

Previously, geo-sourcing to five major coca growing regions within South America was accomplished. However, the expansion of coca cultivation throughout South America made sub-regional origin determinations increasingly difficult. The former methodology was recently enhanced with additional stable isotope analyses (²H and ¹⁸O) to fully characterize cocaine due to the varying environmental conditions in which the coca was grown. An improved data analysis method was implemented with the combination of machine learning and multivariate statistical analysis methods to provide further partitioning between growing regions. Here, we show how the combination of trace cocaine alkaloids, stable isotopes, and multivariate statistical analyses can be used to classify illicit cocaine as originating from one of 19 growing regions within South America. The data obtained through this approach can be used to describe current coca cultivation and production trends, highlight trafficking routes, as well as identify new coca growing regions.

  19. Geographically Sourcing Cocaine’s Origin - Delineation of the Nineteen Major Coca Growing Regions in South America

    NASA Astrophysics Data System (ADS)

    Mallette, Jennifer R.; Casale, John F.; Jordan, James; Morello, David R.; Beyer, Paul M.

    2016-03-01

Previously, geo-sourcing to five major coca growing regions within South America was accomplished. However, the expansion of coca cultivation throughout South America made sub-regional origin determinations increasingly difficult. The former methodology was recently enhanced with additional stable isotope analyses (²H and ¹⁸O) to fully characterize cocaine due to the varying environmental conditions in which the coca was grown. An improved data analysis method was implemented with the combination of machine learning and multivariate statistical analysis methods to provide further partitioning between growing regions. Here, we show how the combination of trace cocaine alkaloids, stable isotopes, and multivariate statistical analyses can be used to classify illicit cocaine as originating from one of 19 growing regions within South America. The data obtained through this approach can be used to describe current coca cultivation and production trends, highlight trafficking routes, as well as identify new coca growing regions.

  20. Dietary fat intake and risk of epithelial ovarian cancer: a meta-analysis of 6,689 subjects from 8 observational studies.

    PubMed

    Huncharek, M; Kupelnick, B

    2001-01-01

    The etiology of epithelial ovarian cancer is unknown. Prior work suggests that high dietary fat intake is associated with an increased risk of this tumor, although this association remains speculative. A meta-analysis was performed to evaluate this suspected relationship. Using previously described methods, a protocol was developed for a meta-analysis examining the association between high vs. low dietary fat intake and the risk of epithelial ovarian cancer. Literature search techniques, study inclusion criteria, and statistical procedures were prospectively defined. Data from observational studies were pooled using a general variance-based meta-analytic method employing confidence intervals (CI) previously described by Greenland. The outcome of interest was a summary relative risk (RRs) reflecting the risk of ovarian cancer associated with high vs. low dietary fat intake. Sensitivity analyses were performed when necessary to evaluate any observed statistical heterogeneity. The literature search yielded 8 observational studies enrolling 6,689 subjects. Data were stratified into three dietary fat intake categories: total fat, animal fat, and saturated fat. Initial tests for statistical homogeneity demonstrated that hospital-based studies accounted for observed heterogeneity possibly because of selection bias. Accounting for this, an RRs was calculated for high vs. low total fat intake, yielding a value of 1.24 (95% CI = 1.07-1.43), a statistically significant result. That is, high total fat intake is associated with a 24% increased risk of ovarian cancer development. The RRs for high saturated fat intake was 1.20 (95% CI = 1.04-1.39), suggesting a 20% increased risk of ovarian cancer among subjects with these dietary habits. High vs. low animal fat diet gave an RRs of 1.70 (95% CI = 1.43-2.03), consistent with a statistically significant 70% increased ovarian cancer risk. 
High dietary fat intake appears to represent a significant risk factor for the development of ovarian cancer. The magnitude of this risk associated with total fat and saturated fat is rather modest. Ovarian cancer risk associated with high animal fat intake appears significantly greater than that associated with the other types of fat intake studied, although this requires confirmation via larger analyses. Further work is needed to clarify factors that may modify the effects of dietary fat in vivo.
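The general variance-based pooling the authors cite (Greenland's confidence-interval method) amounts to inverse-variance weighting on the log relative-risk scale, recovering each study's standard error from its 95% CI. A minimal sketch follows; the study inputs here are made up for illustration and are not the paper's data:

```python
import math

def pooled_rr(studies):
    """Fixed-effect, general variance-based pooling of relative risks:
    each study contributes (RR, CI_low, CI_high); the SE of log RR is
    recovered from the 95% CI half-width and used as an inverse-variance
    weight. Returns the summary RR with its 95% CI."""
    wsum = wx = 0.0
    for rr, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE of log RR
        w = 1.0 / se ** 2
        wsum += w
        wx += w * math.log(rr)
    log_rrs = wx / wsum
    se_s = math.sqrt(1.0 / wsum)
    return (math.exp(log_rrs),
            math.exp(log_rrs - 1.96 * se_s),
            math.exp(log_rrs + 1.96 * se_s))

# Illustrative (made-up) study RRs with 95% CIs:
rrs, lo, hi = pooled_rr([(1.3, 0.9, 1.9), (1.2, 1.0, 1.44), (1.15, 0.85, 1.56)])
```

A summary CI that excludes 1.0, as for the total-fat estimate of 1.24 (1.07-1.43) above, is what makes the pooled result statistically significant.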

  1. One-Carbon Metabolism and Breast Cancer Survival in a Population-Based Study

    DTIC Science & Technology

    2008-06-01

examine the dietary intake of one-carbon-related micronutrients/compounds (e.g., folate, methionine, choline, B vitamins, alcohol, etc.) in relation to...of dietary methyl content and overall survival. Some descriptive statistical analysis has been reported in the previous annual report. The Kaplan-Meier

  2. Quantum noise in SIS mixers

    NASA Astrophysics Data System (ADS)

    Zorin, A. B.

    1985-03-01

In the present quantum-statistical analysis of SIS heterodyne mixer performance, the conventional three-port model of the mixer circuit and the microscopic theory of superconducting tunnel junctions are used to derive a general expression for a noise parameter previously used for the case of parametric amplifiers. This expression is numerically evaluated for various quasiparticle current step widths, dc bias voltages, local oscillator powers, signal frequencies, signal source admittances, and operation temperatures.

  3. Causes of Urban Sprawl in the United States: Auto Reliance as Compared to Natural Evolution, Flight from Blight, and Local Revenue Reliance

    ERIC Educational Resources Information Center

    Wassmer, Robert W.

    2008-01-01

    This paper describes a statistical study of the contribution of theories previously offered by economists to explain differences in the degree of urban decentralization in the U.S. The focus is on a relative comparison of the influence of auto reliance. A regression analysis reveals that a 10 percent reduction in the percentage of households…

  4. [Risk factors associated with dystocic delivery].

    PubMed

    Romero Gutiérrez, Gustavo; Ríos López, Juan Carlos; Cortés Salim, Patricia; Ponce Ponce de León, Ana Lilia

    2007-09-01

Dystocic delivery is a frequent complication whose perinatal repercussions range from minor lesions to severe brain damage, and diverse factors associated with it have been reported. The objective was to identify risk factors significantly associated with dystocic delivery. A case-control study was carried out including 750 patients, divided into 250 women with dystocic deliveries (cases) and 500 women with eutocic deliveries (controls). Demographic and clinical variables were registered. The statistical analysis was performed with percentages, arithmetic means, standard deviations, Student's t test, chi-square, and logistic regression analysis; alpha was set at 0.05. The factors reaching statistical significance were advanced age (p < 0.001), greater patient height (p < 0.001), greater newborn weight (p = 0.009), lower parity (p < 0.001), and prolonged duration of labor (p = 0.04). Other variables, such as number of pregnancies, previous cesarean sections, spontaneous abortions, patient weight, weight gained during pregnancy, number of medical appointments during antenatal care, previous dystocic delivery, and premature rupture of the membranes, were not significant. There are clinical and demographic risk factors associated with dystocic delivery; identifying them during antenatal care could diminish the frequency of dystocic deliveries and thereby avoid the associated maternal-fetal complications.
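The chi-square test of independence used in the study above compares observed cell counts in a contingency table against the counts expected if the two variables were unrelated. A self-contained sketch with hypothetical counts (the record does not report the raw tables):

```python
def chi2_independence(table):
    """Pearson chi-square test of independence for an r x c contingency
    table, given as a list of rows of observed counts.
    Returns the chi-square statistic and the degrees of freedom."""
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    n = sum(rows)
    # Expected count for cell (i, j) is row_total * col_total / n.
    chi2 = sum((obs - rows[i] * cols[j] / n) ** 2 / (rows[i] * cols[j] / n)
               for i, r in enumerate(table) for j, obs in enumerate(r))
    dof = (len(rows) - 1) * (len(cols) - 1)
    return chi2, dof

# Hypothetical 2x2: prolonged labor (yes/no) by delivery type, with the
# study's 250 cases and 500 controls as row totals:
chi2, dof = chi2_independence([[90, 160], [120, 380]])
```

With 1 degree of freedom, a chi-square statistic above 3.84 corresponds to p < 0.05, i.e. significance at the study's alpha of 0.05.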

  5. A Statistical Approach to Identify Superluminous Supernovae and Probe Their Diversity

    NASA Astrophysics Data System (ADS)

    Inserra, C.; Prajs, S.; Gutierrez, C. P.; Angus, C.; Smith, M.; Sullivan, M.

    2018-02-01

    We investigate the identification of hydrogen-poor superluminous supernovae (SLSNe I) using a photometric analysis, without including an arbitrary magnitude threshold. We assemble a homogeneous sample of previously classified SLSNe I from the literature, and fit their light curves using Gaussian processes. From the fits, we identify four photometric parameters that have a high statistical significance when correlated, and combine them in a parameter space that conveys information on their luminosity and color evolution. This parameter space presents a new definition for SLSNe I, which can be used to analyze existing and future transient data sets. We find that 90% of previously classified SLSNe I meet our new definition. We also examine the evidence for two subclasses of SLSNe I, combining their photometric evolution with spectroscopic information, namely the photospheric velocity and its gradient. A cluster analysis reveals the presence of two distinct groups. “Fast” SLSNe show fast light curves and color evolution, large velocities, and a large velocity gradient. “Slow” SLSNe show slow light curve and color evolution, small expansion velocities, and an almost non-existent velocity gradient. Finally, we discuss the impact of our analyses in the understanding of the powering engine of SLSNe, and their implementation as cosmological probes in current and future surveys.

  6. Optoelectronics-related competence building in Japanese and Western firms

    NASA Astrophysics Data System (ADS)

    Miyazaki, Kumiko

    1992-05-01

In this paper, an analysis is made of how different firms in Japan and the West have developed competence related to optoelectronics on the basis of their previous experience and corporate strategies. The sample consists of a set of seven Japanese and four Western firms in the industrial, consumer electronics and materials sectors. Optoelectronics is divided into subfields including optical communications systems, optical fibers, optoelectronic key components, liquid crystal displays, optical disks, and others. The relative strengths and weaknesses of companies in the various subfields are determined using the INSPEC database, from 1976 to 1989. Parallel data are analyzed using OTAF U.S. patent statistics and the two sets of data are compared. The statistical analysis from the database is summarized for firms in each subfield in the form of an intra-firm technology index (IFTI), a new technique introduced to assess the revealed technology advantage of firms. The quantitative evaluation is complemented by results from intensive interviews with the management and scientists of the firms involved. The findings show a marked variation in how firms' technological trajectories have evolved, giving rise to strengths in some subfields and weaknesses in others for the different companies, related to their accumulated core competencies, previous core business activities, and organizational, marketing, and competitive factors.
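Revealed-advantage indices of the kind the IFTI represents are typically computed as the firm's share of activity in a subfield divided by that subfield's share across all firms; the record does not give the IFTI's exact formula, so this common form is assumed here, with made-up publication counts:

```python
def rta_index(counts, firm, subfield):
    """Revealed technology advantage, in the spirit of the IFTI described
    in the abstract (exact formula assumed, not given in the record):
    the firm's share of activity in a subfield relative to the subfield's
    share across all firms. Values > 1 suggest relative strength."""
    firm_total = sum(counts[firm].values())
    field_total = sum(c[subfield] for c in counts.values())
    grand_total = sum(sum(c.values()) for c in counts.values())
    return (counts[firm][subfield] / firm_total) / (field_total / grand_total)

# Hypothetical INSPEC publication counts per firm and subfield:
counts = {
    "FirmA": {"optical_fibers": 40, "lcd": 10},
    "FirmB": {"optical_fibers": 10, "lcd": 40},
}
strength = rta_index(counts, "FirmA", "optical_fibers")  # 1.6: relative strength
```

Because the index normalizes by overall field size, it lets firms of very different scale be compared subfield by subfield.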

  7. A statistical study of EMIC waves observed by Cluster. 1. Wave properties. EMIC Wave Properties

    DOE PAGES

    Allen, R. C.; Zhang, J. -C.; Kistler, L. M.; ...

    2015-07-23

Electromagnetic ion cyclotron (EMIC) waves are an important mechanism for particle energization and losses inside the magnetosphere. In order to better understand the effects of these waves on particle dynamics, detailed information about the occurrence rate, wave power, ellipticity, normal angle, energy propagation angle distributions, and local plasma parameters are required. Previous statistical studies have used in situ observations to investigate the distribution of these parameters in the magnetic local time versus L-shell (MLT-L) frame within a limited magnetic latitude (MLAT) range. In our study, we present a statistical analysis of EMIC wave properties using 10 years (2001-2010) of data from Cluster, totaling 25,431 min of wave activity. Due to the polar orbit of Cluster, we are able to investigate EMIC waves at all MLATs and MLTs. This allows us to further investigate the MLAT dependence of various wave properties inside different MLT sectors and further explore the effects of Shabansky orbits on EMIC wave generation and propagation. Thus, the statistical analysis is presented in two papers. Our paper focuses on the wave occurrence distribution as well as the distribution of wave properties. The companion paper focuses on local plasma parameters during wave observations as well as wave generation proxies.

  8. A 20-year period of orthotopic liver transplantation activity in a single center: a time series analysis performed using the R Statistical Software.

    PubMed

    Santori, G; Andorno, E; Morelli, N; Casaccia, M; Bottino, G; Di Domenico, S; Valente, U

    2009-05-01

    In many Western countries a "minimum volume rule" policy has been adopted as a quality measure for complex surgical procedures. In Italy, the National Transplant Centre set the minimum number of orthotopic liver transplantation (OLT) procedures/y at 25/center. OLT procedures performed in a single center for a reasonably large period may be treated as a time series to evaluate trend, seasonal cycles, and nonsystematic fluctuations. Between January 1, 1987 and December 31, 2006, we performed 563 cadaveric donor OLTs to adult recipients. During 2007, there were another 28 procedures. The greatest numbers of OLTs/y were performed in 2001 (n = 51), 2005 (n = 50), and 2004 (n = 49). A time series analysis performed using R Statistical Software (Foundation for Statistical Computing, Vienna, Austria), a free software environment for statistical computing and graphics, showed an incremental trend after exponential smoothing as well as after seasonal decomposition. The predicted OLT/mo for 2007 calculated with the Holt-Winters exponential smoothing applied to the previous period 1987-2006 helped to identify the months where there was a major difference between predicted and performed procedures. The time series approach may be helpful to establish a minimum volume/y at a single-center level.
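R's `HoltWinters()` fit on the monthly OLT series can be sketched as additive Holt-Winters exponential smoothing: a level, a trend, and twelve seasonal terms updated month by month, then projected forward. The smoothing constants below are arbitrary placeholders (R optimizes them), and the series is synthetic:

```python
def holt_winters_additive(x, m, alpha=0.3, beta=0.1, gamma=0.2, h=12):
    """Additive Holt-Winters smoothing: a minimal sketch of the kind of
    fit R's HoltWinters() performs. x is the series, m the season length,
    h the number of steps to forecast ahead."""
    # Initialize level, trend, and seasonal terms from the first two seasons.
    level = sum(x[:m]) / m
    trend = (sum(x[m:2 * m]) - sum(x[:m])) / (m * m)
    season = [x[i] - level for i in range(m)]
    for t in range(m, len(x)):
        last_level = level
        level = alpha * (x[t] - season[t % m]) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        season[t % m] = gamma * (x[t] - level) + (1 - gamma) * season[t % m]
    # h-step-ahead forecasts, e.g. the 12 months of the following year.
    return [level + (k + 1) * trend + season[(len(x) + k) % m] for k in range(h)]

# Synthetic monthly OLT counts, 10 years, with a summer bump:
x = [3 + (1 if i % 12 in (5, 6) else 0) for i in range(120)]
f = holt_winters_additive(x, 12)
```

Comparing such predicted monthly counts against the procedures actually performed is what identifies the months with the largest discrepancies, as described above.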

  9. Atom-scale compositional distribution in InAlAsSb-based triple junction solar cells by atom probe tomography.

    PubMed

    Hernández-Saz, J; Herrera, M; Delgado, F J; Duguay, S; Philippe, T; Gonzalez, M; Abell, J; Walters, R J; Molina, S I

    2016-07-29

    The analysis by atom probe tomography (APT) of InAlAsSb layers with applications in triple junction solar cells (TJSCs) has shown the existence of In- and Sb-rich regions in the material. The composition variation found is not evident from the direct observation of the 3D atomic distribution and because of this a statistical analysis has been required. From previous analysis of these samples, it is shown that the small compositional fluctuations determined have a strong effect on the optical properties of the material and ultimately on the performance of TJSCs.

  10. Reproducibility-optimized test statistic for ranking genes in microarray studies.

    PubMed

    Elo, Laura L; Filén, Sanna; Lahesmaa, Riitta; Aittokallio, Tero

    2008-01-01

A principal goal of microarray studies is to identify the genes showing differential expression under distinct conditions. In such studies, the selection of an optimal test statistic is a crucial challenge, which depends on the type and amount of data under analysis. While previous studies on simulated or spike-in datasets do not provide practical guidance on how to choose the best method for a given real dataset, we introduce an enhanced reproducibility-optimization procedure, which enables the selection of a suitable gene-ranking statistic directly from the data. In comparison with existing ranking methods, the reproducibility-optimized statistic shows good performance consistently under various simulated conditions and on an Affymetrix spike-in dataset. Further, the feasibility of the novel statistic is confirmed in a practical research setting using data from an in-house cDNA microarray study of asthma-related gene expression changes. These results suggest that the procedure facilitates the selection of an appropriate test statistic for a given dataset without relying on a priori assumptions, which may bias the findings and their interpretation. Moreover, the general reproducibility-optimization procedure is not limited to detecting differential expression only but could be extended to a wide range of other applications as well.

  11. Survival Model for Foot and Leg High Rate Axial Impact Injury Data.

    PubMed

    Bailey, Ann M; McMurry, Timothy L; Poplin, Gerald S; Salzar, Robert S; Crandall, Jeff R

    2015-01-01

    Understanding how lower extremity injuries from automotive intrusion and underbody blast (UBB) differ is of key importance when determining whether automotive injury criteria can be applied to blast rate scenarios. This article provides a review of existing injury risk analyses and outlines an approach to improve injury prediction for an expanded range of loading rates. This analysis will address issues with existing injury risk functions including inaccuracies due to inertial and potential viscous resistance at higher loading rates. This survival analysis attempts to minimize these errors by considering injury location statistics and a predictor variable selection process dependent upon failure mechanisms of bone. Distribution of foot/ankle/leg injuries induced by axial impact loading at rates characteristic of UBB as well as automotive intrusion was studied and calcaneus injuries were found to be the most common injury; thus, footplate force was chosen as the main predictor variable because of its proximity to injury location to prevent inaccuracies associated with inertial differences due to loading rate. A survival analysis was then performed with age, sex, dorsiflexion angle, and mass as covariates. This statistical analysis uses data from previous axial postmortem human surrogate (PMHS) component leg tests to provide perspectives on how proximal boundary conditions and loading rate affect injury probability in the foot/ankle/leg (n = 82). Tibia force-at-fracture proved to be up to 20% inaccurate in previous analyses because of viscous resistance and inertial effects within the data set used, suggesting that previous injury criteria are accurate only for specific rates of loading and boundary conditions. The statistical model presented in this article predicts 50% probability of injury for a plantar force of 10.2 kN for a 50th percentile male with a neutral ankle position. 
Force rate was found to be an insignificant covariate because of the limited range of loading rate differences within the data set; however, compensation for inertial effects caused by measuring the force-at-fracture in a location closer to expected injury location improved the model's predictive capabilities for the entire data set. This study provides better injury prediction capabilities for both automotive and blast rates because of reduced sensitivity to inertial effects and tibia-fibula load sharing. Further, a framework is provided for future injury criteria generation for high rate loading scenarios. This analysis also suggests key improvements to be made to existing anthropomorphic test device (ATD) lower extremities to provide accurate injury prediction for high rate applications such as UBB.

  12. Is the concomitant use of clopidogrel and Proton Pump Inhibitors still associated with increased adverse cardiovascular outcomes following coronary angioplasty?: a systematic review and meta-analysis of recently published studies (2012 - 2016).

    PubMed

    Bundhun, Pravesh Kumar; Teeluck, Abhishek Rishikesh; Bhurtu, Akash; Huang, Wei-Qiang

    2017-01-05

    Controversies were previously observed with the concomitant use of clopidogrel and Proton Pump Inhibitors (PPIs), especially omeprazole, following coronary angioplasty. Even though several studies showed no interaction between clopidogrel and PPIs, questions have been raised about the decrease in antiplatelet effects of clopidogrel with PPIs. A previously published meta-analysis showed concomitant use of clopidogrel and PPIs to be associated with higher adverse cardiovascular outcomes. However, data which were used were extracted from studies published before the year 2012. Whether these controversies still exist in this new era is not clear. Therefore, we aim to show if the concomitant use of clopidogrel and PPIs is still associated with higher adverse outcomes following Percutaneous Coronary Intervention (PCI) using data obtained from recently published studies (2012 to 2016). Electronic databases were searched for recent publications (2012-2016) comparing (clopidogrel plus PPIs) versus clopidogrel alone following PCI. Adverse cardiovascular outcomes were considered as the clinical endpoints. Odds Ratios (OR) with 95% Confidence Intervals (CI) were used as the statistical parameters and the pooled analyses were performed with RevMan 5.3 software. Eleven studies with a total number of 84,729 patients (29,235 patients from the PPIs group versus 55,494 patients from the non-PPIs group) were included. Results of this analysis showed that short term mortality and Target Vessel Revascularization (TVR) significantly favored the non-PPIs group with OR: 1.55; 95% CI: 1.43-1.68, P < 0.00001 and OR: 1.26; 95% CI: 1.06-1.49, P = 0.009 respectively. 
Long-term Major Adverse Cardiac Events (MACEs), Myocardial Infarction (MI), Stent Thrombosis (ST) and TVR significantly favored patients who did not use PPIs with OR: 1.37; 95% CI: 1.23-1.53, P < 0.00001, OR: 1.41; 95% CI: 1.26-1.57, P < 0.00001 and OR: 1.38; 95% CI: 1.13-1.70, P = 0.002 and OR: 1.28; 95% CI: 1.01-1.61, P = 0.04 respectively. However, the result for long term mortality was not statistically significant. The combined use of clopidogrel with PPIs is still associated with significantly higher adverse cardiovascular events such as MACEs, ST and MI following PCI supporting results of the previously published meta-analysis. However, long-term mortality is not statistically significant warranting further analysis with randomized patients.

  13. Protein Interaction Networks Reveal Novel Autism Risk Genes within GWAS Statistical Noise

    PubMed Central

    Correia, Catarina; Oliveira, Guiomar; Vicente, Astrid M.

    2014-01-01

    Genome-wide association studies (GWAS) for Autism Spectrum Disorder (ASD) thus far met limited success in the identification of common risk variants, consistent with the notion that variants with small individual effects cannot be detected individually in single SNP analysis. To further capture disease risk gene information from ASD association studies, we applied a network-based strategy to the Autism Genome Project (AGP) and the Autism Genetics Resource Exchange GWAS datasets, combining family-based association data with Human Protein-Protein interaction (PPI) data. Our analysis showed that autism-associated proteins at higher than conventional levels of significance (P<0.1) directly interact more than random expectation and are involved in a limited number of interconnected biological processes, indicating that they are functionally related. The functionally coherent networks generated by this approach contain ASD-relevant disease biology, as demonstrated by an improved positive predictive value and sensitivity in retrieving known ASD candidate genes relative to the top associated genes from either GWAS, as well as a higher gene overlap between the two ASD datasets. Analysis of the intersection between the networks obtained from the two ASD GWAS and six unrelated disease datasets identified fourteen genes exclusively present in the ASD networks. These are mostly novel genes involved in abnormal nervous system phenotypes in animal models, and in fundamental biological processes previously implicated in ASD, such as axon guidance, cell adhesion or cytoskeleton organization. Overall, our results highlighted novel susceptibility genes previously hidden within GWAS statistical “noise” that warrant further analysis for causal variants. PMID:25409314

  14. Protein interaction networks reveal novel autism risk genes within GWAS statistical noise.

    PubMed

    Correia, Catarina; Oliveira, Guiomar; Vicente, Astrid M

    2014-01-01

    Genome-wide association studies (GWAS) for Autism Spectrum Disorder (ASD) thus far met limited success in the identification of common risk variants, consistent with the notion that variants with small individual effects cannot be detected individually in single SNP analysis. To further capture disease risk gene information from ASD association studies, we applied a network-based strategy to the Autism Genome Project (AGP) and the Autism Genetics Resource Exchange GWAS datasets, combining family-based association data with Human Protein-Protein interaction (PPI) data. Our analysis showed that autism-associated proteins at higher than conventional levels of significance (P<0.1) directly interact more than random expectation and are involved in a limited number of interconnected biological processes, indicating that they are functionally related. The functionally coherent networks generated by this approach contain ASD-relevant disease biology, as demonstrated by an improved positive predictive value and sensitivity in retrieving known ASD candidate genes relative to the top associated genes from either GWAS, as well as a higher gene overlap between the two ASD datasets. Analysis of the intersection between the networks obtained from the two ASD GWAS and six unrelated disease datasets identified fourteen genes exclusively present in the ASD networks. These are mostly novel genes involved in abnormal nervous system phenotypes in animal models, and in fundamental biological processes previously implicated in ASD, such as axon guidance, cell adhesion or cytoskeleton organization. Overall, our results highlighted novel susceptibility genes previously hidden within GWAS statistical "noise" that warrant further analysis for causal variants.

  15. Statistical summaries of selected Iowa streamflow data through September 2013

    USGS Publications Warehouse

    Eash, David A.; O'Shea, Padraic S.; Weber, Jared R.; Nguyen, Kevin T.; Montgomery, Nicholas L.; Simonson, Adrian J.

    2016-01-04

Statistical summaries of streamflow data collected at 184 streamgages in Iowa are presented in this report. All streamgages included for analysis have at least 10 years of continuous record collected before or through September 2013. This report is an update to two previously published reports that presented statistical summaries of selected Iowa streamflow data through September 1988 and September 1996. The statistical summaries include (1) monthly and annual flow durations, (2) annual exceedance probabilities of instantaneous peak discharges (flood frequencies), (3) annual exceedance probabilities of high discharges, and (4) annual nonexceedance probabilities of low discharges and seasonal low discharges. Also presented for each streamgage are graphs of the annual mean discharges, mean annual mean discharges, 50-percent annual flow-duration discharges (median flows), harmonic mean flows, mean daily mean discharges, and flow-duration curves. Two sets of statistical summaries are presented for each streamgage: (1) long-term statistics for the entire period of streamflow record and (2) recent-term statistics for the 30-year period of record from 1984 to 2013. The recent-term statistics are only calculated for streamgages with streamflow records pre-dating the 1984 water year and with at least 10 years of record during 1984-2013. The streamflow statistics in this report are not adjusted for the effects of water use; although some of this water is used consumptively, most of it is returned to the streams.
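A flow-duration discharge is the streamflow equaled or exceeded a given percentage of the time; the 50-percent value is the median flow reported in the summaries above. The sketch below is a simple empirical calculation using a Weibull plotting position, not the USGS production method, and the discharges are hypothetical:

```python
def flow_duration(discharges, exceedance_pcts=(10, 50, 90)):
    """Empirical flow-duration discharges: for each exceedance percentage,
    return the discharge equaled or exceeded that fraction of the time.
    Uses the Weibull plotting position i/(n+1) for rank i."""
    ranked = sorted(discharges, reverse=True)
    n = len(ranked)
    out = {}
    for p in exceedance_pcts:
        # Rank whose exceedance probability is closest to p percent.
        i = max(1, min(n, round(p / 100 * (n + 1))))
        out[p] = ranked[i - 1]
    return out

# Hypothetical daily mean discharges (cubic feet per second):
q = flow_duration([5, 8, 12, 20, 35, 50, 80, 120, 200, 400])
```

Low exceedance percentages pick out high flows and high percentages pick out low flows, so the curve always decreases from q[10] to q[90].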

  16. Multivariate Statistical Analysis of Orthogonal Mass Spectral Data for the Identification of Chemical Attribution Signatures of 3-Methylfentanyl

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayer, B. P.; Valdez, C. A.; DeHope, A. J.

Critical to many modern forensic investigations is the chemical attribution of the origin of an illegal drug. This process greatly relies on identification of compounds indicative of its clandestine or commercial production. The results of these studies can yield detailed information on method of manufacture, sophistication of the synthesis operation, starting material source, and final product. In the present work, chemical attribution signatures (CAS) associated with the synthesis of the analgesic 3-methylfentanyl, N-(3-methyl-1-phenethylpiperidin-4-yl)-N-phenylpropanamide, were investigated. Six synthesis methods were studied in an effort to identify and classify route-specific signatures. These methods were chosen to minimize the use of scheduled precursors, complicated laboratory equipment, number of overall steps, and demanding reaction conditions. Using gas and liquid chromatographies combined with mass spectrometric methods (GC-QTOF and LC-QTOF) in conjunction with inductively coupled plasma mass spectrometry (ICP-MS), over 240 distinct compounds and elements were monitored. As seen in our previous work with CAS of fentanyl synthesis, the complexity of the resultant data matrix necessitated the use of multivariate statistical analysis. Using partial least squares discriminant analysis (PLS-DA), 62 statistically significant, route-specific CAS were identified. Statistical classification models using a variety of machine learning techniques were then developed with the ability to predict the method of 3-methylfentanyl synthesis from three blind crude samples generated by synthetic chemists without prior experience with these methods.

  17. Measurement and statistical analysis of single-molecule current-voltage characteristics, transition voltage spectroscopy, and tunneling barrier height.

    PubMed

    Guo, Shaoyin; Hihath, Joshua; Díez-Pérez, Ismael; Tao, Nongjian

    2011-11-30

    We report on the measurement and statistical study of thousands of current-voltage characteristics and transition voltage spectra (TVS) of single-molecule junctions with different contact geometries that are rapidly acquired using a new break junction method at room temperature. This capability allows one to obtain current-voltage, conductance-voltage, and transition voltage histograms, thus adding a new dimension to the previous conductance histogram analysis at a fixed low-bias voltage for single molecules. This method confirms the low-bias conductance values of alkanedithiols and biphenyldithiol reported in the literature. However, at high biases the current shows large nonlinearity and asymmetry, and TVS allows for the determination of a critically important parameter, the tunneling barrier height or energy level alignment between the molecule and the electrodes of single-molecule junctions. The energy level alignment is found to depend on the molecule and also on the contact geometry, revealing the role of contact geometry in both the contact resistance and energy level alignment of a molecular junction. Detailed statistical analysis further reveals that, despite the dependence of the energy level alignment on contact geometry, the variation in single-molecule conductance is primarily due to contact resistance rather than variations in the energy level alignment.

  18. Analysis of Loss-of-Offsite-Power Events 1997-2015

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Nancy Ellen; Schroeder, John Alton

    2016-07-01

    Loss of offsite power (LOOP) can have a major negative impact on a power plant’s ability to achieve and maintain safe shutdown conditions. LOOP event frequencies and the times required for subsequent restoration of offsite power are important inputs to plant probabilistic risk assessments. This report presents a statistical and engineering analysis of LOOP frequencies and durations at U.S. commercial nuclear power plants. The data used in this study are based on operating experience during calendar years 1997 through 2015. LOOP events during critical operation that do not result in a reactor trip are not included. Frequencies and durations were determined for four event categories: plant-centered, switchyard-centered, grid-related, and weather-related. Emergency diesel generator reliability is also considered (failure to start, failure to load and run, and failure to run for more than 1 hour). There is an adverse trend in LOOP durations. The previously reported adverse trend in LOOP frequency was not statistically significant for 2006-2015. Grid-related LOOPs happen predominantly in the summer. Switchyard-centered LOOPs happen predominantly in winter and spring. Plant-centered and weather-related LOOPs do not show statistically significant seasonality. The engineering analysis of LOOP data shows that human errors have been much less frequent since 1997 than in the 1986-1996 time period.
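
    A trend-significance test of the kind described (checking whether the LOOP frequency trend is statistically significant) can be sketched as a Poisson regression of annual event counts on year, fitted by Newton-Raphson with a Wald test on the slope. The counts below are hypothetical, not the report's data.

```python
# Poisson trend test sketch: log-linear regression of annual event counts
# on centered year, with a two-sided Wald test on the slope.
import numpy as np
from math import erf, sqrt

def poisson_trend(years, counts, iters=25):
    """Return (slope, two-sided p-value) for a log-linear trend in counts."""
    x = years - years.mean()                      # center to aid convergence
    X = np.column_stack([np.ones_like(x), x])
    beta = np.zeros(2)
    for _ in range(iters):                        # Newton-Raphson / IRLS
        mu = np.exp(X @ beta)
        H = (X.T * mu) @ X                        # Fisher information
        beta += np.linalg.solve(H, X.T @ (counts - mu))
    mu = np.exp(X @ beta)
    se = np.sqrt(np.linalg.inv((X.T * mu) @ X)[1, 1])
    z = beta[1] / se
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))   # normal two-sided p-value
    return beta[1], p

years = np.arange(2006, 2016, dtype=float)
counts = np.array([4, 3, 5, 2, 4, 3, 4, 2, 3, 3], dtype=float)
slope, p = poisson_trend(years, counts)
print(f"log-linear slope = {slope:.3f}/yr, p = {p:.2f}")
```

    With these hypothetical counts the slope is small and the p-value is well above 0.05, mirroring a "no statistically significant trend" conclusion.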

  19. Saponin Profile of Wild Asparagus Species.

    PubMed

    Jaramillo-Carmona, Sara; Rodriguez-Arcos, Rocío; Jiménez-Araujo, Ana; López, Sergio; Gil, Juan; Moreno, Roberto; Guillén-Bejarano, Rafael

    2017-03-01

    The aim of this work was to study the saponin profiles of spears of different wild asparagus species in the context of their genetic diversity as well as geographical seed origin. They included Asparagus pseudoscaber Grecescu, Asparagus maritimus (L.) Mill., Asparagus brachiphyllus Turcz., Asparagus prostratus Dumort., and Asparagus officinalis L. The saponin analysis by LC-MS showed that the saponin profile of wild asparagus is similar to that previously described for triguero asparagus from the Huétor-Tájar landrace (triguero HT), which had never before been reported in the edible part of asparagus. All the samples, except A. officinalis, were characterized by having saponins distinct from protodioscin, and their total saponin contents were 10-fold higher than those described for commercial hybrids of green asparagus. In particular, A. maritimus samples from different origins were rich in saponins previously found in triguero HT. These findings support a previous suggestion, based on genetic analysis, that A. maritimus is the origin of triguero HT. Multivariate statistics, including principal component analysis and hierarchical clustering analysis, were used to define both similarities and differences among samples. The results showed that the greatest variance among the tested wild asparagus could be attributed to differences in the concentration of particular saponins, and this knowledge could be a tool for identifying similar species. © 2017 Institute of Food Technologists®.

  20. Creamatocrit analysis of human milk overestimates fat and energy content when compared to a human milk analyzer using mid-infrared spectroscopy.

    PubMed

    O'Neill, Edward F; Radmacher, Paula G; Sparks, Blake; Adamkin, David H

    2013-05-01

    Human milk (HM) is the preferred feeding for human infants but may be inadequate to support the rapid growth of the very-low-birth-weight infant. The creamatocrit (CMCT) has been widely used to guide health care professionals as they analyze HM fortification; however, the CMCT method is based on an equation using assumptions for protein and carbohydrate, with fat as the only measured variable. The aim of the present study was to test the hypothesis that a human milk analyzer (HMA) would provide more accurate data for fat and energy content than analysis by CMCT. Fifty-one well-mixed samples of previously frozen expressed HM were obtained after thawing. Previously assayed "control" milk samples were thawed and run with the unknowns. All milk samples were prewarmed at 40°C and then analyzed by both CMCT and HMA. CMCT fat results were substituted into the CMCT equation to reach a value for energy (kcal/oz). Fat results from HMA were entered into a computer model to reach a value for energy (kcal/oz). Fat and energy results were compared by paired t test, with statistical significance set at P < 0.05. An additional 10 samples were analyzed locally by both methods and then sent to a certified laboratory for quantitative analysis. Results for fat and energy were analyzed by 1-way analysis of variance, with statistical significance set at P < 0.05. Mean fat content by CMCT (5.8 ± 1.9 g/dL) was significantly higher than by HMA (3.2 ± 1.1 g/dL, P < 0.001). Mean energy by CMCT (21.8 ± 3.4 kcal/oz) was also significantly higher than by HMA (17.1 ± 2.9, P < 0.001). Comparison of biochemical analysis with HMA for the subset of milk samples showed no statistical difference for fat and energy, whereas CMCT was significantly higher for both fat (P < 0.001) and energy (P = 0.002). The CMCT method appears to overestimate the fat and energy content of HM samples when compared with HMA and biochemical methods.
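
    The paired design above (two methods measured on the same samples) can be sketched with SciPy's paired t-test. The values below are synthetic numbers generated to resemble the reported means, not the study's measurements.

```python
# Paired comparison of two fat-measurement methods on the same milk samples.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 51
hma_fat = rng.normal(3.2, 1.1, size=n)              # analyzer values (g/dL)
cmct_fat = hma_fat + rng.normal(2.6, 0.8, size=n)   # creamatocrit reads higher

t_stat, p_value = stats.ttest_rel(cmct_fat, hma_fat)
print(f"mean difference = {np.mean(cmct_fat - hma_fat):.2f} g/dL, p = {p_value:.2g}")
```

    Because each sample is its own control, the paired test removes between-sample variability and detects the systematic offset with high power.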

  1. Investigation of Association Between Hip Osteoarthritis Susceptibility Loci and Radiographic Proximal Femur Shape

    PubMed Central

    Thiagarajah, Shankar; Wilkinson, J. Mark; Panoutsopoulou, Kalliope; Day‐Williams, Aaron G.; Cootes, Timothy F.; Wallis, Gillian A.; Loughlin, John; Arden, Nigel; Birrell, Fraser; Carr, Andrew; Chapman, Kay; Deloukas, Panos; Doherty, Michael; McCaskie, Andrew; Ollier, William E. R.; Rai, Ashok; Ralston, Stuart H.; Spector, Timothy D.; Valdes, Ana M.; Wallis, Gillian A.; Mark Wilkinson, J.; Zeggini, Eleftheria

    2015-01-01

    Objective To test whether previously reported hip morphology or osteoarthritis (OA) susceptibility loci are associated with proximal femur shape as represented by statistical shape model (SSM) modes and as univariate or multivariate quantitative traits. Methods We used pelvic radiographs and genotype data from 929 subjects with unilateral hip OA who had been recruited previously for the Arthritis Research UK Osteoarthritis Genetics Consortium genome‐wide association study. We built 3 SSMs capturing the shape variation of the OA‐unaffected proximal femur in the entire mixed‐sex cohort and for male/female‐stratified cohorts. We selected 41 candidate single‐nucleotide polymorphisms (SNPs) previously reported as being associated with hip morphology (for replication analysis) or OA (for discovery analysis) and for which genotype data were available. We performed 2 types of analysis for genotype–phenotype associations between these SNPs and the modes of the SSMs: 1) a univariate analysis using individual SSM modes and 2) a multivariate analysis using combinations of SSM modes. Results The univariate analysis identified association between rs4836732 (within the ASTN2 gene) and mode 5 of the female SSM (P = 0.0016) and between rs6976 (within the GLT8D1 gene) and mode 7 of the mixed‐sex SSM (P = 0.0003). The multivariate analysis identified association between rs5009270 (near the IFRD1 gene) and a combination of modes 3, 4, and 9 of the mixed‐sex SSM (P = 0.0004). Evidence of associations remained significant following adjustment for multiple testing. All 3 SNPs had previously been associated with hip OA. Conclusion These de novo findings suggest that rs4836732, rs6976, and rs5009270 may contribute to hip OA susceptibility by altering proximal femur shape. PMID:25939412

  2. Surface inspection of flat products by means of texture analysis: on-line implementation using neural networks

    NASA Astrophysics Data System (ADS)

    Fernandez, Carlos; Platero, Carlos; Campoy, Pascual; Aracil, Rafael

    1994-11-01

    This paper describes some texture-based techniques that can be applied to quality assessment of continuously produced flat products (metal strips, wooden surfaces, cork, textile products, ...). Since the most difficult task is that of inspecting for product appearance, human-like inspection ability is required. A common feature of all these products is the presence of non-deterministic texture on their surfaces. Two main subjects are discussed: statistical techniques for both surface finishing determination and surface defect analysis, as well as real-time implementation for on-line inspection in high-speed applications. For surface finishing determination, a Gray Level Difference technique is presented that operates on low-resolution (non-zoomed) images. Defect analysis is performed by means of statistical texture analysis over defective portions of the surface. On-line implementation is accomplished by means of neural networks. When a defect arises, textural analysis is applied, which results in a data vector acting as the input of a neural net previously trained in a supervised way. This approach aims to reach on-line performance in automated visual inspection applications when texture is present on flat product surfaces.
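
    The Gray Level Difference idea mentioned above can be sketched as a histogram of absolute pixel differences at a fixed offset, summarized into scalar texture features. The image is synthetic and the feature names are generic illustrations, not the paper's exact feature set.

```python
# Gray Level Difference sketch: histogram of |pixel - neighbor| at a fixed
# offset, summarized by contrast and entropy texture features.
import numpy as np

rng = np.random.default_rng(5)
img = rng.integers(0, 256, size=(64, 64))        # synthetic grayscale image

d = 1                                            # horizontal pixel offset
diff = np.abs(img[:, d:] - img[:, :-d]).ravel()  # gray level differences
hist = np.bincount(diff, minlength=256).astype(float)
p = hist / hist.sum()                            # difference histogram

contrast = np.sum(np.arange(256) ** 2 * p)       # second moment of differences
nz = p[p > 0]
entropy = -np.sum(nz * np.log2(nz))              # histogram entropy (bits)
print(f"GLD contrast = {contrast:.0f}, entropy = {entropy:.2f}")
```

    Features like these, computed per region, would form the data vector fed to the neural net described in the abstract.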

  3. Application of a data-mining method based on Bayesian networks to lesion-deficit analysis

    NASA Technical Reports Server (NTRS)

    Herskovits, Edward H.; Gerring, Joan P.

    2003-01-01

    Although lesion-deficit analysis (LDA) has provided extensive information about structure-function associations in the human brain, LDA has suffered from the difficulties inherent to the analysis of spatial data, i.e., there are many more variables than subjects, and data may be difficult to model using standard distributions, such as the normal distribution. We herein describe a Bayesian method for LDA; this method is based on data-mining techniques that employ Bayesian networks to represent structure-function associations. These methods are computationally tractable, and can represent complex, nonlinear structure-function associations. When applied to the evaluation of data obtained from a study of the psychiatric sequelae of traumatic brain injury in children, this method generates a Bayesian network that demonstrates complex, nonlinear associations among lesions in the left caudate, right globus pallidus, right side of the corpus callosum, right caudate, and left thalamus, and subsequent development of attention-deficit hyperactivity disorder, confirming and extending our previous statistical analysis of these data. Furthermore, analysis of simulated data indicates that methods based on Bayesian networks may be more sensitive and specific for detecting associations among categorical variables than methods based on chi-square and Fisher exact statistics.
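
    For contrast with the Bayesian-network approach, the chi-square/Fisher-style lesion-deficit test the authors benchmark against can be sketched on a single 2x2 lesion-by-outcome table. The counts below are hypothetical, not the study's data.

```python
# Baseline lesion-deficit association test: Fisher's exact test on a
# 2x2 table of lesion presence vs. subsequent diagnosis.
from scipy.stats import fisher_exact

#                 diagnosis+  diagnosis-
table = [[12,  8],    # lesion present
         [ 5, 25]]    # lesion absent

odds_ratio, p = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p:.3g}")
```

    A Bayesian network generalizes this pairwise test by modeling many lesion variables and the outcome jointly, which is how the method above captures nonlinear, multi-region associations.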

  4. Fracture overprinting history using Markov chain analysis: Windsor-Kennetcook subbasin, Maritimes Basin, Canada

    NASA Astrophysics Data System (ADS)

    Snyder, Morgan E.; Waldron, John W. F.

    2018-03-01

    The deformation history of the Upper Paleozoic Maritimes Basin, Atlantic Canada, can be partially unraveled by examining fractures (joints, veins, and faults) that are well exposed on the shorelines of the macrotidal Bay of Fundy, in subsurface core, and on image logs. Data were collected from coastal outcrops and well core across the Windsor-Kennetcook subbasin, a subbasin in the Maritimes Basin, using the circular scan-line and vertical scan-line methods in outcrop, and FMI Image log analysis of core. We use cross-cutting and abutting relationships between fractures to understand relative timing of fracturing, followed by a statistical test (Markov chain analysis) to separate groups of fractures. This analysis, previously used in sedimentology, was modified to statistically test the randomness of fracture timing relationships. The results of the Markov chain analysis suggest that fracture initiation can be attributed to movement along the Minas Fault Zone, an E-W fault system that bounds the Windsor-Kennetcook subbasin to the north. Four sets of fractures are related to dextral strike slip along the Minas Fault Zone in the late Paleozoic, and four sets are related to sinistral reactivation of the same boundary in the Mesozoic.
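
    The Markov-chain step (testing whether the observed overprinting transitions between fracture sets depart from randomness) can be sketched as a chi-square test of a transition-count matrix against the counts expected under independence. The matrix below is hypothetical, not the Windsor-Kennetcook data.

```python
# Chi-square test of a fracture-set transition-count matrix against the
# counts expected under random (independent) ordering.
import numpy as np
from scipy.stats import chi2_contingency

# rows = earlier fracture set, cols = later set, entries = observed transitions
observed = np.array([[ 2, 14,  3],
                     [ 4,  1, 12],
                     [11,  3,  2]])

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.3g}")
# a small p-value indicates non-random ordering, i.e. the fracture sets
# have preferred overprinting relationships
```
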

  5. Reframing Serial Murder Within Empirical Research.

    PubMed

    Gurian, Elizabeth A

    2017-04-01

    Empirical research on serial murder is limited due to the lack of consensus on a definition, the continued use of primarily descriptive statistics, and linkage to popular culture depictions. These limitations also inhibit our understanding of these offenders and affect credibility in the field of research. Therefore, this comprehensive overview of a sample of 508 cases (738 total offenders, including partnered groups of two or more offenders) provides analyses of solo male, solo female, and partnered serial killers to elucidate statistical differences and similarities in offending and adjudication patterns among the three groups. This analysis of serial homicide offenders not only supports previous research on offending patterns present in the serial homicide literature but also reveals that empirically based analyses can enhance our understanding beyond traditional case studies and descriptive statistics. Further research based on these empirical analyses can aid in the development of more accurate classifications and definitions of serial murderers.

  6. Propagation of a Free Flame in a Turbulent Gas Stream

    NASA Technical Reports Server (NTRS)

    Mickelsen, William R; Ernstein, Norman E

    1956-01-01

    Effective flame speeds of free turbulent flames were measured by photographic, ionization-gap, and photomultiplier-tube methods, and were found to have a statistical distribution attributed to the nature of the turbulent field. The effective turbulent flame speeds for the free flame were less than those previously measured for flames stabilized on nozzle burners, Bunsen burners, and bluff bodies. The statistical spread of the effective turbulent flame speeds was markedly wider in the lean and rich fuel-air-ratio regions, which might be attributed to the greater sensitivity of laminar flame speed to flame temperature in those regions. Values calculated from the turbulent free-flame-speed analysis proposed by Tucker apparently form upper limits for the statistical spread of free-flame-speed data. Hot-wire anemometer measurements of the longitudinal velocity fluctuation intensity and longitudinal correlation coefficient were made and were employed in the comparison of data and in the theoretical calculation of turbulent flame speed.

  7. Modified optimal control pilot model for computer-aided design and analysis

    NASA Technical Reports Server (NTRS)

    Davidson, John B.; Schmidt, David K.

    1992-01-01

    This paper presents the theoretical development of a modified optimal control pilot model based upon the optimal control model (OCM) of the human operator developed by Kleinman, Baron, and Levison. This model is input compatible with the OCM and retains other key aspects of the OCM, such as a linear quadratic solution for the pilot gains with inclusion of control rate in the cost function, a Kalman estimator, and the ability to account for attention allocation and perception threshold effects. An algorithm designed for easy implementation in current dynamic systems analysis and design software is presented. Example results based upon the analysis of a tracking task using three basic dynamic systems are compared with measured results and with similar analyses performed with the OCM and two previously proposed simplified optimal pilot models. The pilot frequency responses and error statistics obtained with this modified optimal control model are shown to compare more favorably to the measured experimental results than the other previously proposed simplified models evaluated.

  8. Tissue classification using depth-dependent ultrasound time series analysis: in-vitro animal study

    NASA Astrophysics Data System (ADS)

    Imani, Farhad; Daoud, Mohammad; Moradi, Mehdi; Abolmaesumi, Purang; Mousavi, Parvin

    2011-03-01

    Time series analysis of ultrasound radio-frequency (RF) signals has been shown to be an effective tissue classification method. Previous studies of this method for tissue differentiation at high and clinical frequencies have been reported. In this paper, analysis of RF time series is extended to improve tissue classification at clinical frequencies by including novel features extracted from the time series spectrum. The primary feature examined is the Mean Central Frequency (MCF) computed for regions of interest (ROIs) in the tissue extending along the axial axis of the transducer. In addition, the intercept and slope of a line fitted to the MCF values of the RF time series as a function of depth have been included. To evaluate the accuracy of the new features, an in vitro animal study was performed using three tissue types: bovine muscle, bovine liver, and chicken breast, in which perfect two-way classification was achieved. The results show statistically significant improvements over the classification accuracies with previously reported features.
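
    The MCF-versus-depth features described above can be sketched as a spectral-centroid computation per depth ROI followed by a straight-line fit. The spectra below are synthetic, with a centroid that drifts down with depth to mimic attenuation; none of this is the study's RF data.

```python
# Feature extraction sketch: mean central frequency (MCF) per depth ROI,
# plus the intercept and slope of a line fitted to MCF vs depth.
import numpy as np

rng = np.random.default_rng(3)
freqs = np.linspace(2e6, 8e6, 256)                 # Hz, spectrum bins
depths = np.arange(10)                             # ROI index along the beam

# synthetic power spectra whose centroid drifts down with depth (attenuation)
centers = 5e6 - 0.15e6 * depths
spectra = np.exp(-((freqs[None, :] - centers[:, None]) / 0.8e6) ** 2)
spectra += rng.uniform(0, 0.02, spectra.shape)     # noise floor

mcf = (spectra * freqs).sum(axis=1) / spectra.sum(axis=1)   # spectral centroid
slope, intercept = np.polyfit(depths, mcf, 1)
print(f"MCF slope = {slope/1e3:.1f} kHz/ROI, intercept = {intercept/1e6:.2f} MHz")
```

    The slope and intercept of this fit are exactly the kind of depth-dependent features the abstract adds to the classifier's input.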

  9. Evaluation on the use of cerium in the NBL Titrimetric Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zebrowski, J.P.; Orlowicz, G.J.; Johnson, K.D.

    An alternative to potassium dichromate as the titrant in the New Brunswick Laboratory Titrimetric Method for uranium analysis was sought, since chromium in the waste makes disposal difficult. Substitution of a ceric-based titrant was statistically evaluated. Analysis of the data indicated statistically equivalent precisions for the two methods, but a significant overall bias of +0.035% for the ceric titrant procedure. The cause of the bias was investigated, alterations to the procedure were made, and a second statistical study was performed. This second study revealed no statistically significant bias, nor any analyst-to-analyst variation in the ceric titration procedure. A statistically significant day-to-day variation was detected, but this was physically small (0.015%) and was only detected because of the within-day precision of the method. The standard deviation of the %RD for a single measurement was found to be 0.031%. A comparison with quality control blind dichromate titration data again indicated similar overall precision. The effect of ten elements (Co, Ti, Cu, Ni, Na, Mg, Gd, Zn, Cd, and Cr) on the ceric titration's performance was determined; in previous work at NBL these impurities did not interfere with the potassium dichromate titrant. This study indicated similar results for the ceric titrant, with the exception of Ti. All the elements (excluding Ti and Cr) caused no statistically significant bias in uranium measurements at levels of 10 mg impurity per 20-40 mg uranium. The presence of Ti was found to cause a bias of −0.05%; this is attributed to the presence of sulfate ions, resulting in precipitation of titanium sulfate and occlusion of uranium. A statistically significant negative bias of 0.012% was also observed in the samples containing chromium impurities.

  10. Total hip arthroplasty after a previous pelvic osteotomy: A systematic review and meta-analysis.

    PubMed

    Shigemura, T; Yamamoto, Y; Murata, Y; Sato, T; Tsuchiya, R; Wada, Y

    2018-06-01

    There are several reports regarding total hip arthroplasty (THA) after a previous pelvic osteotomy (PO). However, to our knowledge, no formal systematic review and meta-analysis had previously been published summarizing the clinical results of THA after a previous PO. We therefore conducted a systematic review and meta-analysis of the results of THA after a previous PO, focusing on the following question: does a previous PO affect the results of subsequent THA, such as clinical outcomes, operative time, operative blood loss, and radiological parameters? Using PubMed, Web of Science, and the Cochrane Library, we searched for relevant original papers. The pooling of data was performed using RevMan software (version 5.3, Cochrane Collaboration, Oxford, UK). A p-value < 0.05 was judged significant. Standardized mean differences (SMD) with 95% confidence intervals (CI) were calculated for continuous data. Statistical heterogeneity was assessed based on I² using a standard χ² test. When I² > 50%, significant heterogeneity was assumed and a random-effects model was applied for the meta-analysis; a fixed-effects model was applied in the absence of significant heterogeneity. Eleven studies were included in this meta-analysis. The pooled results indicated no significant difference in postoperative Merle d'Aubigné-Postel score (I² = 0%, SMD = -0.15, 95% CI: -0.36 to 0.06, p = 0.17), postoperative Harris hip score (I² = 60%, SMD = -0.23, 95% CI: -0.50 to 0.05, p = 0.10), operative time (I² = 86%, SMD = 0.37, 95% CI: -0.09 to 0.82, p = 0.11), operative blood loss (I² = 82%, SMD = 0.23, 95% CI: -0.17 to 0.63, p = 0.25), or cup abduction angle (I² = 43%, SMD = -0.08, 95% CI: -0.25 to 0.09, p = 0.38) between THA with and without a previous PO. However, the cup anteversion angle of THA with a previous PO was significantly smaller than that without a previous PO (I² = 77%, SMD = -0.63, 95% CI: -1.13 to -0.13, p = 0.01). A previous PO did not affect the results of subsequent THA, except for cup anteversion. Because of the low quality of the evidence currently available, high-quality randomized controlled trials are required. Level III, meta-analysis of case-control studies. Copyright © 2018 Elsevier Masson SAS. All rights reserved.
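
    The pooling rule described above (fixed-effect model when heterogeneity is low, random-effects when I² > 50%) can be sketched with the DerSimonian-Laird estimator. The per-study SMDs and variances below are illustrative, not the review's extracted data.

```python
# Fixed- vs random-effects pooling of standardized mean differences with
# I^2 heterogeneity and the DerSimonian-Laird between-study variance.
import numpy as np

smd = np.array([-0.6, 0.4, -0.5, 0.3, -0.1])   # per-study effect sizes
var = np.array([0.04, 0.05, 0.03, 0.06, 0.05])  # per-study variances

w = 1.0 / var                                   # fixed-effect (inverse-variance) weights
pooled_fixed = np.sum(w * smd) / np.sum(w)
Q = np.sum(w * (smd - pooled_fixed) ** 2)       # Cochran's Q
df = len(smd) - 1
I2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0

# DerSimonian-Laird between-study variance tau^2
tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
w_re = 1.0 / (var + tau2)                       # random-effects weights
pooled_random = np.sum(w_re * smd) / np.sum(w_re)

model = "random" if I2 > 50 else "fixed"
pooled = pooled_random if model == "random" else pooled_fixed
print(f"Q = {Q:.2f}, I^2 = {I2:.0f}%, {model}-effects SMD = {pooled:.3f}")
```

    With these illustrative inputs I² exceeds 50%, so the random-effects estimate is reported, matching the decision rule in the methods above.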

  11. Formalizing the definition of meta-analysis in Molecular Ecology.

    PubMed

    ArchMiller, Althea A; Bauer, Eric F; Koch, Rebecca E; Wijayawardena, Bhagya K; Anil, Ammu; Kottwitz, Jack J; Munsterman, Amelia S; Wilson, Alan E

    2015-08-01

    Meta-analysis, the statistical synthesis of pertinent literature to develop evidence-based conclusions, is relatively new to the field of molecular ecology, with the first meta-analysis published in the journal Molecular Ecology in 2003 (Slate & Phua 2003). The goal of this article is to formalize the definition of meta-analysis for the authors, editors, reviewers and readers of Molecular Ecology by completing a review of the meta-analyses previously published in this journal. We also provide a brief overview of the many components required for meta-analysis with a more specific discussion of the issues related to the field of molecular ecology, including the use and statistical considerations of Wright's FST and its related analogues as effect sizes in meta-analysis. We performed a literature review to identify articles published as 'meta-analyses' in Molecular Ecology, which were then evaluated by at least two reviewers. We specifically targeted Molecular Ecology publications because as a flagship journal in this field, meta-analyses published in Molecular Ecology have the potential to set the standard for meta-analyses in other journals. We found that while many of these reviewed articles were strong meta-analyses, others failed to follow standard meta-analytical techniques. One of these unsatisfactory meta-analyses was in fact a secondary analysis. Other studies attempted meta-analyses but lacked the fundamental statistics that are considered necessary for an effective and powerful meta-analysis. By drawing attention to the inconsistency of studies labelled as meta-analyses, we emphasize the importance of understanding the components of traditional meta-analyses to fully embrace the strengths of quantitative data synthesis in the field of molecular ecology. © 2015 John Wiley & Sons Ltd.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    The plpdfa software is a product of an LDRD project at LLNL entitled "Adaptive Sampling for Very High Throughput Data Streams" (tracking number 11-ERD-035). This software was developed by a graduate student summer intern, Chris Challis, who worked under project PI Dan Merl during the summer of 2011. The source code implements a statistical analysis technique for clustering and classification of text-valued data. The method had been previously published by the PI in the open literature.

  13. Heavy flavor decay of Zγ at CDF

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Timothy M. Harrington-Taber

    2013-01-01

    Diboson production is an important and frequently measured Standard Model process. This analysis considers the previously neglected pp̄ → Zγ → bb̄ channel, as measured at the Collider Detector at Fermilab. Using the entire Tevatron Run II dataset, the measured result is consistent with Standard Model predictions, but the statistical error associated with this method of measurement limits the strength of this conclusion.

  14. Multivariate statistical analysis of hemlock (Tsuga) volatiles by SPME/GC/MS: insights into the phytochemistry of the hemlock woolly adelgid (Adelges tsugae Annand)

    Treesearch

    Anthony Lagalante; Frank Calvosa; Michael Mirzabeigi; Vikram Iyengar; Michael Montgomery; Kathleen Shields

    2007-01-01

    A previously developed single-needle, SPME/GC/MS technique was used to measure the terpenoid content of T. canadensis growing in a hemlock forest at Lake Scranton, PA (Lagalante and Montgomery 2003). The volatile terpenoid composition was measured over a 1-year period from June 2003 to May 2004 to follow the annual cycle of foliage development from...

  15. The battle against violence in U.S. hospitals: an analysis of the recent IAHSS Foundation's healthcare crime surveys.

    PubMed

    Vellani, Karim H

    2016-10-01

    In this article, the author analyzes the possible reasons for the reported drop in hospital violence in the 2016 IAHSS Crime Survey compared to previous surveys. He also reviews the one statistic that has remained constant in all the recent crime surveys and recommends an approach in violence prevention programs that may prove successful in reducing workplace violence and staff injuries.

  16. Neighbourhood Socio Economic Disadvantage Index’s Analysis of the Flood Disasters Area at East Jakarta in 1996 and 2016

    NASA Astrophysics Data System (ADS)

    Ranti Ristiani, Christina; Rokhmatuloh; Hernina, Revi

    2017-12-01

    Flooding is one of the natural disasters that frequently occur in East Jakarta. Floods have several negative impacts and can affect all aspects of community life, including economic, political, cultural, and social aspects. East Jakarta is an urban area that continues to grow and develop rapidly. This can be seen from its high population density (BPS, 2016), and it is categorized as a flood-prone region based on Prone Flood Map data from 1996 and 2016. The larger the population of East Jakarta, the greater the possible negative effects of a disaster. The negative impacts of a flood disaster particularly affect communities with socio-economic disadvantage. One index for measuring socio-economic disadvantage is the NSDI (Neighbourhood Socio-economic Disadvantage Index). However, adapting the indicators used in the NSDI to the statistical data available in Indonesia requires further assessment and evaluation. Therefore, this paper evaluates the main indicators used in previous NSDI studies and improves them with indicators more suitable for Indonesian statistical records. As a result, 19 improved indicators are proposed for use in the NSDI, while the indicator groups remain the same as before, namely income, education, occupation, housing, and population.

  17. Lung Cancer Risk Prediction Model Incorporating Lung Function: Development and Validation in the UK Biobank Prospective Cohort Study.

    PubMed

    Muller, David C; Johansson, Mattias; Brennan, Paul

    2017-03-10

    Purpose Several lung cancer risk prediction models have been developed, but none to date have assessed the predictive ability of lung function in a population-based cohort. We sought to develop and internally validate a model incorporating lung function using data from the UK Biobank prospective cohort study. Methods This analysis included 502,321 participants without a previous diagnosis of lung cancer, predominantly between 40 and 70 years of age. We used flexible parametric survival models to estimate the 2-year probability of lung cancer, accounting for the competing risk of death. Models included predictors previously shown to be associated with lung cancer risk, including sex, variables related to smoking history and nicotine addiction, medical history, family history of lung cancer, and lung function (forced expiratory volume in 1 second [FEV1]). Results During accumulated follow-up of 1,469,518 person-years, there were 738 lung cancer diagnoses. A model incorporating all predictors had excellent discrimination (concordance (c)-statistic [95% CI] = 0.85 [0.82 to 0.87]). Internal validation suggested that the model will discriminate well when applied to new data (optimism-corrected c-statistic = 0.84). The full model, including FEV1, also had modestly superior discriminatory power compared with one designed solely on the basis of questionnaire variables (c-statistic = 0.84 [0.82 to 0.86]; optimism-corrected c-statistic = 0.83; p for FEV1 = 3.4 × 10⁻¹³). The full model had better discrimination than standard lung cancer screening eligibility criteria (c-statistic = 0.66 [0.64 to 0.69]). Conclusion A risk prediction model that includes lung function has strong predictive ability, which could improve eligibility criteria for lung cancer screening programs.
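
    The concordance statistic reported throughout these results can be sketched directly from its definition: the proportion of case/non-case pairs in which the case receives the higher predicted risk, with ties counted as half. The risk scores below are synthetic, not UK Biobank predictions, and the helper function name is our own.

```python
# Concordance (c) statistic: probability that a randomly chosen case has a
# higher predicted risk than a randomly chosen non-case (ties count 0.5).
import numpy as np

def c_statistic(risk, event):
    cases = risk[event == 1]
    controls = risk[event == 0]
    greater = (cases[:, None] > controls[None, :]).sum()   # pairwise comparisons
    ties = (cases[:, None] == controls[None, :]).sum()
    return (greater + 0.5 * ties) / (len(cases) * len(controls))

rng = np.random.default_rng(1)
risk = np.concatenate([rng.beta(2, 8, 500), rng.beta(5, 5, 50)])  # controls, cases
event = np.concatenate([np.zeros(500, dtype=int), np.ones(50, dtype=int)])
c = c_statistic(risk, event)
print(f"c-statistic = {c:.2f}")
```

    A value of 0.5 means no better than chance; values above 0.8, like those reported above, indicate strong discrimination.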

  18. Temperature threshold models for benthic macroinvertebrates in Idaho wadeable streams and neighboring ecoregions.

    PubMed

    Richards, David C; Lester, Gary; Pfeiffer, John; Pappani, Jason

    2018-02-07

    Water temperatures are warming throughout the world including the Pacific Northwest, USA. Benthic macroinvertebrates are one of the most important and widely used indicators of freshwater impairment; however, their response to increased water temperatures and their use for monitoring water temperature impairment has been hindered by lack of knowledge of temperature occurrences, threshold change points, or indicator taxa. We present new analysis of a large macroinvertebrate database provided by Idaho Department of Environmental Quality from wadeable streams in Idaho that is to be used in conjunction with our previous analyses. This new analysis provides threshold change points for over 400 taxa along an increasing temperature gradient and provides a list of statistically important indicator taxa. The macroinvertebrate assemblage temperature change point for the taxa that decreased with increased temperatures was determined to be about 20.5 °C and for the taxa assemblage that increased with increased temperatures was about 11.5 °C. Results of this new analysis combined with our previous analysis will also be useful for others in neighboring regions where these taxa occur.

  19. Asthma phenotypes in childhood.

    PubMed

    Reddy, Monica B; Covar, Ronina A

    2016-04-01

    This review describes the literature over the past 18 months that evaluated childhood asthma phenotypes, highlighting the key aspects of these studies, and comparing these studies to previous ones in this area. Recent studies on asthma phenotypes have identified new phenotypes on the basis of statistical analyses (using cluster analysis and latent class analysis methodology) and have evaluated the outcomes and associated risk factors of previously established early childhood asthma phenotypes that are based on asthma onset and patterns of wheezing illness. There have also been investigations focusing on immunologic, physiologic, and genetic correlates of various phenotypes, as well as identification of subphenotypes of severe childhood asthma. Childhood asthma remains a heterogeneous condition, and investigations into these various presentations, risk factors, and outcomes are important since they can offer therapeutic and prognostic relevance. Further investigation into the immunopathology and genetic basis underlying childhood phenotypes is important so therapy can be tailored accordingly.

  20. Quasi-experimental Studies in the Fields of Infection Control and Antibiotic Resistance, Ten Years Later: A Systematic Review.

    PubMed

    Alsaggaf, Rotana; O'Hara, Lyndsay M; Stafford, Kristen A; Leekha, Surbhi; Harris, Anthony D

    2018-02-01

    OBJECTIVE A systematic review of quasi-experimental studies in the field of infectious diseases was published in 2005. The aim of this study was to assess improvements in the design and reporting of quasi-experiments 10 years after the initial review. We also aimed to report the statistical methods used to analyze quasi-experimental data. DESIGN Systematic review of articles published from January 1, 2013, to December 31, 2014, in 4 major infectious disease journals. METHODS Quasi-experimental studies focused on infection control and antibiotic resistance were identified and classified based on 4 criteria: (1) type of quasi-experimental design used, (2) justification of the use of the design, (3) use of correct nomenclature to describe the design, and (4) statistical methods used. RESULTS Of 2,600 articles, 173 (7%) featured a quasi-experimental design, compared to 73 of 2,320 articles (3%) in the previous review (P<.01). Moreover, 21 articles (12%) utilized a study design with a control group; 6 (3.5%) justified the use of a quasi-experimental design; and 68 (39%) identified their design using the correct nomenclature. In addition, 2-group statistical tests were used in 75 studies (43%); 58 studies (34%) used standard regression analysis; 18 (10%) used segmented regression analysis; 7 (4%) used standard time-series analysis; 5 (3%) used segmented time-series analysis; and 10 (6%) did not utilize statistical methods for comparisons. CONCLUSIONS While some progress occurred over the decade, it is crucial to continue improving the design and reporting of quasi-experimental studies in the fields of infection control and antibiotic resistance to better evaluate the effectiveness of important interventions. Infect Control Hosp Epidemiol 2018;39:170-176.
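
    Segmented regression, one of the analysis methods tallied in the review above, fits separate level and slope terms before and after an intervention. A minimal sketch on simulated monthly infection rates (illustrative data, not from the review):

    ```python
    import numpy as np

    # Interrupted time-series (segmented) regression:
    #   rate = b0 + b1*t + b2*post + b3*(t - t0)*post
    # where `post` flags observations after the intervention at time t0,
    # so b2 estimates the immediate level change and b3 the slope change.
    rng = np.random.default_rng(0)
    t = np.arange(24)                     # 24 months of surveillance
    t0 = 12                               # intervention after month 11
    post = (t >= t0).astype(float)
    true_rate = 10.0 + 0.1 * t - 3.0 * post - 0.2 * (t - t0) * post
    rate = true_rate + rng.normal(0, 0.3, size=t.size)  # observed rates

    X = np.column_stack([np.ones_like(t, dtype=float), t, post, (t - t0) * post])
    coef, *_ = np.linalg.lstsq(X, rate, rcond=None)
    b0, b1, b2, b3 = coef
    print(f"level change at intervention: {b2:.2f}, slope change: {b3:.2f}")
    ```

    The estimates should recover the simulated level drop of -3.0 and slope change of -0.2, which is the advantage segmented regression has over a simple two-group test: it separates an abrupt effect from a gradual trend.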

  1. Extreme events in total ozone over Arosa - Part 1: Application of extreme value theory

    NASA Astrophysics Data System (ADS)

    Rieder, H. E.; Staehelin, J.; Maeder, J. A.; Peter, T.; Ribatet, M.; Davison, A. C.; Stübi, R.; Weihs, P.; Holawe, F.

    2010-10-01

    In this study ideas from extreme value theory are for the first time applied in the field of stratospheric ozone research, because statistical analysis showed that previously used concepts assuming a Gaussian distribution (e.g. fixed deviations from mean values) of total ozone data do not adequately address the structure of the extremes. We show that statistical extreme value methods are appropriate to identify ozone extremes and to describe the tails of the Arosa (Switzerland) total ozone time series. In order to accommodate the seasonal cycle in total ozone, a daily moving threshold was determined and used, with tools from extreme value theory, to analyse the frequency of days with extreme low (termed ELOs) and high (termed EHOs) total ozone at Arosa. The analysis shows that the Generalized Pareto Distribution (GPD) provides an appropriate model for the frequency distribution of total ozone above or below a mathematically well-defined threshold, thus providing a statistical description of ELOs and EHOs. The results show an increase in ELOs and a decrease in EHOs during the last decades. The fitted model represents the tails of the total ozone data set with high accuracy over the entire range (including absolute monthly minima and maxima), and enables a precise computation of the frequency distribution of ozone mini-holes (using constant thresholds). Analyzing the tails instead of a small fraction of days below constant thresholds provides deeper insight into the time series properties. Fingerprints of dynamical (e.g. ENSO, NAO) and chemical features (e.g. strong polar vortex ozone loss), and major volcanic eruptions, can be identified in the observed frequency of extreme events throughout the time series. Overall the new approach to analysis of extremes provides more information on time series properties and variability than previous approaches that use only monthly averages and/or mini-holes and mini-highs.
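
    The peaks-over-threshold approach described above can be sketched with scipy's Generalized Pareto Distribution. The data below are synthetic Gaussian "total ozone" values, not the Arosa series, and the 95th percentile threshold is an arbitrary illustrative choice:

    ```python
    import numpy as np
    from scipy import stats

    # Model exceedances above a high threshold with the GPD, as the
    # study does for extreme-high total ozone (EHOs).
    rng = np.random.default_rng(42)
    ozone = rng.normal(330, 30, size=5000)      # synthetic daily totals (DU)
    threshold = np.quantile(ozone, 0.95)        # high-ozone threshold
    exceedances = ozone[ozone > threshold] - threshold

    # Fit the GPD to the exceedances (location fixed at 0 by construction).
    shape, loc, scale = stats.genpareto.fit(exceedances, floc=0)

    # Return level: the value exceeded on average once per m observations.
    m = 1000
    p_exc = exceedances.size / ozone.size
    ret = threshold + stats.genpareto.ppf(1 - 1 / (m * p_exc), shape, scale=scale)
    print(f"shape={shape:.2f}, scale={scale:.1f}, 1-in-{m} level={ret:.0f}")
    ```

    For extreme lows (ELOs) the same fit would be applied to deficits below a low threshold; the study additionally lets the threshold vary by day of year to absorb the seasonal cycle.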

  2. Extreme events in total ozone over Arosa - Part 1: Application of extreme value theory

    NASA Astrophysics Data System (ADS)

    Rieder, H. E.; Staehelin, J.; Maeder, J. A.; Peter, T.; Ribatet, M.; Davison, A. C.; Stübi, R.; Weihs, P.; Holawe, F.

    2010-05-01

    In this study ideas from extreme value theory are for the first time applied in the field of stratospheric ozone research, because statistical analysis showed that previously used concepts assuming a Gaussian distribution (e.g. fixed deviations from mean values) of total ozone data do not adequately address the structure of the extremes. We show that statistical extreme value methods are appropriate to identify ozone extremes and to describe the tails of the Arosa (Switzerland) total ozone time series. In order to accommodate the seasonal cycle in total ozone, a daily moving threshold was determined and used, with tools from extreme value theory, to analyse the frequency of days with extreme low (termed ELOs) and high (termed EHOs) total ozone at Arosa. The analysis shows that the Generalized Pareto Distribution (GPD) provides an appropriate model for the frequency distribution of total ozone above or below a mathematically well-defined threshold, thus providing a statistical description of ELOs and EHOs. The results show an increase in ELOs and a decrease in EHOs during the last decades. The fitted model represents the tails of the total ozone data set with high accuracy over the entire range (including absolute monthly minima and maxima), and enables a precise computation of the frequency distribution of ozone mini-holes (using constant thresholds). Analyzing the tails instead of a small fraction of days below constant thresholds provides deeper insight into the time series properties. Fingerprints of dynamical (e.g. ENSO, NAO) and chemical features (e.g. strong polar vortex ozone loss), and major volcanic eruptions, can be identified in the observed frequency of extreme events throughout the time series. Overall the new approach to analysis of extremes provides more information on time series properties and variability than previous approaches that use only monthly averages and/or mini-holes and mini-highs.

  3. Previous vertebral compression fractures add to the deterioration of the disability and quality of life after an acute compression fracture.

    PubMed

    Suzuki, Nobuyuki; Ogikubo, Osamu; Hansson, Tommy

    2010-04-01

    Prevalent vertebral compression fracture(s) have been reported as having a negative impact on pain, disability, and quality of life. But no study has evaluated the effect of previous fracture on the course of acute compression fractures. The aim of the present study was to compare the natural course of the acute compression fracture in patients with (n = 51) and without (n = 56) previous vertebral compression fracture(s). The study is a retrospective analysis of a prospective cohort followed with postal questionnaires during a 12-month period after an acute fracture event. Eligible patients were those over 40 years of age, who were admitted to the emergency unit because of back pain and had an X-ray confirmed acute vertebral body fracture. A total of 107 patients were included in the study. The pain, disability (von Korff pain and disability scores), ADL (Hannover ADL score), and quality of life (QoL) (EQ-5D) were measured after 3 weeks, and 3, 6, and 12 months. The X-rays from the first visit to the emergency unit were evaluated. The difference of the scores between the groups with and without previous fracture was statistically significant (P < 0.05) at 3 weeks, 6 and 12 months for von Korff disability score, at all occasions for EQ-5D and at 3-12 months for Hannover ADL score, but only at 12 months for the von Korff pain intensity score. In both the groups all scores had improved in a statistically significant way at 3 months. The number of previous fractures was related to all the outcome scores in a statistically significant way (P < 0.05) except von Korff pain intensity score at 3 weeks and 3 months and von Korff disability score at 3 months. In conclusion, disability, ADL, and QoL scores, but not pain intensity score, were significantly worse in the patients with previous fracture from the fracture episode through the first 12 months. However, the improvements during the follow-up year seen in both groups were of a similar magnitude. The presence or absence of a previous fracture in an acutely fractured patient will influence the prognosis and thus possibly also the indications for treatments.

  4. National mandatory motorcycle helmet laws may save $2.2 billion annually: An inpatient and value of statistical life analysis.

    PubMed

    Dua, Anahita; Wei, Shuyan; Safarik, Justin; Furlough, Courtney; Desai, Sapan S

    2015-06-01

    While statistics exist regarding the overall rate of fatalities in motorcyclists with and without helmets, a combined inpatient and value of statistical life (VSL) analysis has not previously been reported. Statistical data of motorcycle collisions were obtained from the Centers for Disease Control, National Highway Transportation Safety Board, and Governors Highway Safety Association. The VSL estimate was obtained from the 2002 Department of Transportation calculation. Statistics on helmeted versus nonhelmeted motorcyclists, death at the scene, and inpatient death were obtained using the 2010 National Trauma Data Bank. Inpatient costs were obtained from the 2010 National Inpatient Sample. Population estimates were generated using weighted samples, and all costs are reported using 2010 US dollars using the Consumer Price Index. A total of 3,951 fatal motorcycle collisions were reported in 2010, of which 77% of patients died at the scene, 10% in the emergency department, and 13% as inpatients. Thirty-seven percent of all riders did not wear a helmet but accounted for 69% of all deaths. Of those motorcyclists who survived to the hospital, the odds ratio of surviving with a helmet was 1.51 compared with those without a helmet (p < 0.001). Total costs for nonhelmeted motorcyclists were 66% greater at $5.5 billion, compared with $3.3 billion for helmeted motorcyclists (p < 0.001). Direct inpatient costs were 16% greater for helmeted riders ($203,248 vs. $175,006) but led to more than 50% greater VSL generated (absolute benefit, $602,519 per helmeted survivor). A cost analysis of inpatient care and indirect costs of motorcycle riders who do not wear helmets leads to nearly $2.2 billion in losses per year, with almost 1.9 times as many deaths compared with helmeted motorcyclists. The per capita cost per fatality is more than $800,000. Institution of a mandatory helmet law could lead to an annual cost savings of almost $2.2 billion. Economic analysis, level III.

  5. Spatial Statistics for Tumor Cell Counting and Classification

    NASA Astrophysics Data System (ADS)

    Wirjadi, Oliver; Kim, Yoo-Jin; Breuel, Thomas

    To count and classify cells in histological sections is a standard task in histology. One example is the grading of meningiomas, benign tumors of the meninges, which requires assessing the fraction of proliferating cells in an image. As this process is very time consuming when performed manually, automation is required. To address such problems, we propose a novel application of Markov point process methods in computer vision, leading to algorithms for computing the locations of circular objects in images. In contrast to previous algorithms using such spatial statistics methods in image analysis, the present one is fully trainable. This is achieved by combining point process methods with statistical classifiers. Using simulated data, the method proposed in this paper will be shown to be more accurate and more robust to noise than standard image processing methods. On the publicly available SIMCEP benchmark for cell image analysis algorithms, the present method's cell counts are significantly more accurate than results published elsewhere, especially when cells form dense clusters. Furthermore, the proposed system performs as well as a state-of-the-art algorithm for the computer-aided histological grading of meningiomas when combined with a simple k-nearest neighbor classifier for identifying proliferating cells.

  6. Utilization of medical services in the public health system in the Southern Brazil.

    PubMed

    Bastos, Gisele Alsina Nader; Duca, Giovâni Firpo Del; Hallal, Pedro Curi; Santos, Iná S

    2011-06-01

    To estimate the prevalence and analyze factors associated with the utilization of medical services in the public health system. Cross-sectional population-based study with 2,706 individuals aged 20-69 years carried out in Pelotas, Southern Brazil, in 2008. A systematic sampling with probability proportional to the number of households in each sector was adopted. The outcome was defined by the combination of the questions related to medical consultation in the previous three months and place. The exposure variables were: sex, age, marital status, level of schooling, family income, self-reported hospital admission in the previous year, having a regular physician, self-perception of health, and the main reason for the last consultation. Descriptive analysis was stratified by sex and the analytical statistics included the use of the Wald test for tendency and heterogeneity in the crude analysis and Poisson regression with robust variance in the adjusted analysis, taking into consideration cluster sampling. The prevalence of utilization of medical services in the three previous months was 60.6%, almost half of these (42.0%, 95%CI: 36.6;47.5) in public services. The most utilized public services were the primary care units (49.5%). In the adjusted analysis stratified by sex, older men and younger women had a higher probability of using the medical services in the public system. In both sexes, low level of schooling, low per capita family income, not having a regular physician and hospital admission in the previous year were associated with the outcome. Despite the marked reduction in the utilization of medical health services in the public system in the last 15 years, the public services are now reaching a previously unassisted portion of the population (individuals with low income and schooling).

  7. State transportation statistics 2003

    DOT National Transportation Integrated Search

    2003-12-01

    The Bureau of Transportation Statistics (BTS) presents a statistical : profile of transportation in the 50 states and the District of Columbia. : This document supplements a previously published series of individual : state profiles. Like the individ...

  8. An Integrated Analysis of the Physiological Effects of Space Flight: Executive Summary

    NASA Technical Reports Server (NTRS)

    Leonard, J. I.

    1985-01-01

    A large array of models was applied in a unified manner to solve problems in space flight physiology. Mathematical simulation was used as an alternative way of looking at physiological systems and maximizing the yield from previous space flight experiments. A medical data analysis system was created which consists of an automated data base, a computerized biostatistical and data analysis system, and a set of simulation models of physiological systems. Five basic models were employed: (1) a pulsatile cardiovascular model; (2) a respiratory model; (3) a thermoregulatory model; (4) a circulatory, fluid, and electrolyte balance model; and (5) an erythropoiesis regulatory model. Algorithms were provided to perform routine statistical tests, multivariate analysis, nonlinear regression analysis, and autocorrelation analysis. Special purpose programs were prepared for rank correlation, factor analysis, and the integration of the metabolic balance data.

  9. Therapeutic whole-body hypothermia reduces mortality in severe traumatic brain injury if the cooling index is sufficiently high: meta-analyses of the effect of single cooling parameters and their integrated measure.

    PubMed

    Olah, Emoke; Poto, Laszlo; Hegyi, Peter; Szabo, Imre; Hartmann, Petra; Solymar, Margit; Petervari, Erika; Balasko, Marta; Habon, Tamas; Rumbus, Zoltan; Tenk, Judit; Rostas, Ildiko; Weinberg, Jordan; Romanovsky, Andrej A; Garami, Andras

    2018-04-21

    Therapeutic hypothermia was investigated repeatedly as a tool to improve the outcome of severe traumatic brain injury (TBI), but previous clinical trials and meta-analyses found contradictory results. We aimed to determine the effectiveness of therapeutic whole-body hypothermia on the mortality of adult patients with severe TBI by using a novel approach of meta-analysis. We searched the PubMed, EMBASE, and Cochrane Library databases from inception to February 2017. The identified human studies were evaluated regarding statistical, clinical, and methodological designs to ensure inter-study homogeneity. We extracted data on TBI severity, body temperature, mortality, and cooling parameters; then we calculated the cooling index, an integrated measure of therapeutic hypothermia. A forest plot of all identified studies showed no difference in the outcome of TBI between cooled and not cooled patients, but inter-study heterogeneity was high. On the contrary, by meta-analysis of randomized controlled trials (RCTs) that were homogeneous with regard to statistical and clinical design and precisely reported the cooling protocol, we showed a decreased odds ratio for mortality with therapeutic hypothermia compared to no cooling. As independent factors, milder and longer cooling, and rewarming at < 0.25°C/h, were associated with better outcome. Therapeutic hypothermia was beneficial only if the cooling index (a measure of the combination of cooling parameters) was sufficiently high. We conclude that high methodological and statistical inter-study heterogeneity could underlie the contradictory results obtained in previous studies. By analyzing methodologically homogeneous studies, we show that cooling improves the outcome of severe TBI and that this beneficial effect depends on certain cooling parameters and on their integrated measure, the cooling index.

  10. Antibiotic treatment of bacterial vaginosis in pregnancy: a meta-analysis.

    PubMed

    Leitich, Harald; Brunbauer, Mathias; Bodner-Adler, Barbara; Kaider, Alexandra; Egarter, Christian; Husslein, Peter

    2003-03-01

    The purpose of this study was to evaluate the effectiveness of antibiotic treatment of bacterial vaginosis in pregnancy to reduce preterm delivery. We performed a meta-analysis of published, English-language, randomized, placebo-controlled clinical trials of antibiotic treatment of bacterial vaginosis in pregnant women with intact amniotic membranes at <37 weeks of gestation. Primary outcomes included preterm delivery, perinatal or neonatal death, and neonatal morbidity. Ten studies with results for 3969 patients were included. In patients without preterm labor, antibiotic treatment did not significantly decrease preterm delivery at <37 weeks of gestation, either in all patients combined (odds ratio, 0.83; 95% CI, 0.57-1.21) or in high-risk patients with a previous preterm delivery (odds ratio, 0.50; 95% CI, 0.22-1.12). In both groups, significant statistical heterogeneity was observed. A significant reduction in preterm delivery and no statistical heterogeneity were observed in 338 high-risk patients who received oral regimens with treatment durations of > or =7 days (odds ratio, 0.42; 95% CI, 0.27-0.67). Nonsignificant effects and no statistical heterogeneity were observed in low-risk patients (odds ratio, 0.94; 95% CI, 0.71-1.25) and with vaginal regimens (odds ratio, 1.25; 95% CI, 0.86-1.81). In one study antibiotic treatment in patients with preterm labor led to a nonsignificant decrease in the rate of preterm deliveries (odds ratio, 0.31; 95% CI, 0.03-3.24). The screening of pregnant women who have bacterial vaginosis and who have had a previous preterm delivery, and treatment with an oral regimen of longer duration, can be justified on the basis of current evidence. More studies are needed to confirm the effectiveness of this strategy, both in high-risk patients without preterm labor and in patients with preterm labor.
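
    The pooled odds ratios above come from combining per-trial 2x2 tables. A minimal fixed-effect inverse-variance pooling sketch, on hypothetical trial counts rather than the reviewed data, might look like this:

    ```python
    import math

    def pool_odds_ratios(trials):
        """Fixed-effect inverse-variance pooling of odds ratios.

        Each trial is a 2x2 table (events_trt, n_trt, events_ctl, n_ctl);
        log odds ratios are weighted by the inverse of their Woolf variance.
        """
        num = den = 0.0
        for a, n1, c, n2 in trials:
            b, d = n1 - a, n2 - c
            log_or = math.log((a * d) / (b * c))
            var = 1 / a + 1 / b + 1 / c + 1 / d   # Woolf variance of log OR
            num += log_or / var
            den += 1 / var
        pooled = num / den
        se = math.sqrt(1 / den)
        return (math.exp(pooled),                 # pooled OR
                math.exp(pooled - 1.96 * se),     # lower 95% CI
                math.exp(pooled + 1.96 * se))     # upper 95% CI

    # Hypothetical trials: (preterm deliveries on antibiotics, n treated,
    # preterm deliveries on placebo, n placebo).
    trials = [(12, 100, 24, 100), (8, 80, 15, 80), (20, 150, 30, 150)]
    or_, lo, hi = pool_odds_ratios(trials)
    print(f"pooled OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
    ```

    Where the abstract reports "significant statistical heterogeneity", a random-effects model (e.g. DerSimonian-Laird) would be preferred over this fixed-effect pooling; the arithmetic above only illustrates the basic weighting.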

  11. Reexamining Sample Size Requirements for Multivariate, Abundance-Based Community Research: When Resources are Limited, the Research Does Not Have to Be.

    PubMed

    Forcino, Frank L; Leighton, Lindsey R; Twerdy, Pamela; Cahill, James F

    2015-01-01

    Community ecologists commonly perform multivariate techniques (e.g., ordination, cluster analysis) to assess patterns and gradients of taxonomic variation. A critical requirement for a meaningful statistical analysis is accurate information on the taxa found within an ecological sample. However, oversampling (too many individuals counted per sample) also comes at a cost, particularly for ecological systems in which identification and quantification is substantially more resource consuming than the field expedition itself. In such systems, an increasingly larger sample size will eventually result in diminishing returns in improving any pattern or gradient revealed by the data, but will also lead to continually increasing costs. Here, we examine 396 datasets: 44 previously published and 352 created datasets. Using meta-analytic and simulation-based approaches, we seek (1) to determine the minimal sample sizes required to produce robust multivariate statistical results when conducting abundance-based community ecology research, and (2) to determine the dataset parameters (i.e., evenness, number of taxa, number of samples) that require larger sample sizes, regardless of resource availability. We found that in the 44 previously published and the 220 created datasets with randomly chosen abundances, a conservative estimate of a sample size of 58 produced the same multivariate results as all larger sample sizes. However, this minimal number varies as a function of evenness, where increased evenness resulted in increased minimal sample sizes. Sample sizes as small as 58 individuals are sufficient for a broad range of multivariate abundance-based research. In cases when resource availability is the limiting factor for conducting a project (e.g., small university, time to conduct the research project), statistically viable results can still be obtained with less of an investment.

  12. Automated neurovascular tracing and analysis of the knife-edge scanning microscope Rat Nissl data set using a computing cluster.

    PubMed

    Sungjun Lim; Nowak, Michael R; Yoonsuck Choe

    2016-08-01

    We present a novel, parallelizable algorithm capable of automatically reconstructing and calculating anatomical statistics of cerebral vascular networks embedded in large volumes of Rat Nissl-stained data. In this paper, we report the results of our method using Rattus somatosensory cortical data acquired using Knife-Edge Scanning Microscopy. Our algorithm performs the reconstruction task with averaged precision, recall, and F2-score of 0.978, 0.892, and 0.902 respectively. Calculated anatomical statistics show some conformance to values previously reported. The results that can be obtained from our method are expected to help explicate the relationship between the structural organization of the microcirculation and normal (and abnormal) cerebral functioning.

  13. [Pathogenetic therapy of mastopathies in the prevention of breast cancer].

    PubMed

    Iaritsyn, S S; Sidorenko, L N

    1979-01-01

    The breast cancer morbidity among the population of the city of Leningrad has been analysed. It was shown that there is a tendency to an increased number of breast cancer patients. In this respect attention is given to the prophylactic measures accomplished in the Leningrad City oncological dispensary. As proved statistically, the pathogenetic therapy of mastopathy is a factor contributing to a lower risk of malignant transformation. For the statistical analysis the authors used the data of 132 breast cancer patients previously operated on for local fibroadenomatosis, and the data of 259 control patients. It was found that among the patients with fibroadenomatosis who subsequently developed cancer of the mammary gland, the proportion of untreated patients was 2.8 times that in the control group.

  14. Reassessment of the relationship between M-protein decrement and survival in multiple myeloma.

    PubMed

    Palmer, M; Belch, A; Hanson, J; Brox, L

    1989-01-01

    The relationship between percentage M-protein decrement and survival is assessed in 134 multiple myeloma patients. The correlation did not achieve statistical significance (P = 0.069). Multivariate analysis using the Cox proportional hazards model, including a number of previously recognised prognostic factors, showed only percentage M-protein decrement, creatinine and haemoglobin to be significantly correlated with survival. However, the R'-statistic for each of these variables was low, indicating that their prognostic power is weak. We conclude that neither the percentage M-protein decrement nor the response derived from it can be used as an accurate means of assessing the efficacy of treatment in myeloma. Mature survival data alone should be used for this purpose.

  15. Reassessment of the relationship between M-protein decrement and survival in multiple myeloma.

    PubMed Central

    Palmer, M.; Belch, A.; Hanson, J.; Brox, L.

    1989-01-01

    The relationship between percentage M-protein decrement and survival is assessed in 134 multiple myeloma patients. The correlation did not achieve statistical significance (P = 0.069). Multivariate analysis using the Cox proportional hazards model, including a number of previously recognised prognostic factors, showed only percentage M-protein decrement, creatinine and haemoglobin to be significantly correlated with survival. However, the R'-statistic for each of these variables was low, indicating that their prognostic power is weak. We conclude that neither the percentage M-protein decrement nor the response derived from it can be used as an accurate means of assessing the efficacy of treatment in myeloma. Mature survival data alone should be used for this purpose. PMID:2757916

  16. Haplotype-based association analysis of general cognitive ability in Generation Scotland, the English Longitudinal Study of Ageing, and UK Biobank.

    PubMed

    Howard, David M; Adams, Mark J; Clarke, Toni-Kim; Wigmore, Eleanor M; Zeng, Yanni; Hagenaars, Saskia P; Lyall, Donald M; Thomson, Pippa A; Evans, Kathryn L; Porteous, David J; Nagy, Reka; Hayward, Caroline; Haley, Chris S; Smith, Blair H; Murray, Alison D; Batty, G David; Deary, Ian J; McIntosh, Andrew M

    2017-01-01

    Cognitive ability is a heritable trait with a polygenic architecture, for which several associated variants have been identified using genotype-based and candidate gene approaches. Haplotype-based analyses are a complementary technique that take phased genotype data into account, and potentially provide greater statistical power to detect lower frequency variants. In the present analysis, three cohort studies (total n = 48,002) were utilised: Generation Scotland: Scottish Family Health Study (GS:SFHS), the English Longitudinal Study of Ageing (ELSA), and the UK Biobank. A genome-wide haplotype-based meta-analysis of cognitive ability was performed, as well as a targeted meta-analysis of several gene coding regions. None of the assessed haplotypes provided evidence of a statistically significant association with cognitive ability in either the individual cohorts or the meta-analysis. Within the meta-analysis, the haplotype with the lowest observed P-value overlapped with the D-amino acid oxidase activator (DAOA) gene coding region. This coding region has previously been associated with bipolar disorder, schizophrenia and Alzheimer's disease, which have all been shown to impact upon cognitive ability. Another potentially interesting region highlighted within the current genome-wide association analysis (GS:SFHS: P = 4.09 × 10^-7) was the butyrylcholinesterase (BCHE) gene coding region. The protein encoded by BCHE has been shown to influence the progression of Alzheimer's disease and its role in cognitive ability merits further investigation. Although no evidence was found for any haplotypes with a statistically significant association with cognitive ability, our results did provide further evidence that the genetic variants contributing to the variance of cognitive ability are likely to be of small effect.

  17. Kinetics of fast short-term depression are matched to spike train statistics to reduce noise.

    PubMed

    Khanbabaie, Reza; Nesse, William H; Longtin, Andre; Maler, Leonard

    2010-06-01

    Short-term depression (STD) is observed at many synapses of the CNS and is important for diverse computations. We have discovered a form of fast STD (FSTD) in the synaptic responses of pyramidal cells evoked by stimulation of their electrosensory afferent fibers (P-units). The dynamics of the FSTD are matched to the mean and variance of natural P-unit discharge. FSTD exhibits switch-like behavior in that it is immediately activated with stimulus intervals near the mean interspike interval (ISI) of P-units (approximately 5 ms) and recovers immediately after stimulation with the slightly longer intervals (>7.5 ms) that also occur during P-unit natural and evoked discharge patterns. Remarkably, the magnitude of evoked excitatory postsynaptic potentials appears to depend only on the duration of the previous ISI. Our theoretical analysis suggests that FSTD can serve as a mechanism for noise reduction. Because the kinetics of depression are as fast as the natural spike statistics, this role is distinct from previously ascribed functional roles of STD in gain modulation, synchrony detection or as a temporal filter.
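
    The switch-like, previous-ISI-only dependence described above can be caricatured in a few lines. The 5 ms and >7.5 ms intervals come from the abstract; the amplitude values and the linear interpolation between them are illustrative assumptions, not the paper's fitted model:

    ```python
    def epsp_amplitude(prev_isi_ms, depressed=0.4, recovered=1.0):
        """Toy switch-like FSTD rule: the response to a spike depends only
        on the previous interspike interval (ISI). Intervals at or below
        the mean P-unit ISI (~5 ms) give the depressed amplitude; intervals
        of 7.5 ms or more give the fully recovered amplitude; in between
        we interpolate linearly (an assumption for illustration)."""
        if prev_isi_ms <= 5.0:
            return depressed
        if prev_isi_ms >= 7.5:
            return recovered
        frac = (prev_isi_ms - 5.0) / 2.5
        return depressed + frac * (recovered - depressed)

    def respond(spike_times_ms):
        """Amplitude for each spike after the first, from its previous ISI."""
        return [epsp_amplitude(t1 - t0)
                for t0, t1 in zip(spike_times_ms, spike_times_ms[1:])]

    # A 5 ms train depresses immediately; one 10 ms gap recovers it fully.
    print(respond([0, 5, 10, 20, 25]))
    ```

    Because the rule has no slow state variable, the response tracks the instantaneous spike statistics, which is what distinguishes FSTD from conventional, slowly recovering STD.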

  18. The Effect of a Student-Designed Data Collection: Project on Attitudes toward Statistics

    ERIC Educational Resources Information Center

    Carnell, Lisa J.

    2008-01-01

    Students often enter an introductory statistics class with less than positive attitudes about the subject. They tend to believe statistics is difficult and irrelevant to their lives. Observational evidence from previous studies suggests including projects in a statistics course may enhance students' attitudes toward statistics. This study examines…

  19. Scalability of Several Asynchronous Many-Task Models for In Situ Statistical Analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pebay, Philippe Pierre; Bennett, Janine Camille; Kolla, Hemanth

    This report is a sequel to [PB16], in which we provided a first progress report on research and development towards a scalable, asynchronous many-task, in situ statistical analysis engine using the Legion runtime system. This earlier work included a prototype implementation of a proposed solution, using a proxy mini-application as a surrogate for a full-scale scientific simulation code. The first scalability studies were conducted with the above on modestly-sized experimental clusters. In contrast, in the current work we have integrated our in situ analysis engines with a full-size scientific application (S3D, using the Legion-SPMD model), and have conducted numerical tests on the largest computational platform currently available for DOE science applications. We also provide details regarding the design and development of a light-weight asynchronous collectives library. We describe how this library is utilized within our SPMD-Legion S3D workflow, and compare the data aggregation technique deployed herein to the approach taken within our previous work.

  20. A network meta-analysis on the effects of information technology application on preoperative knowledge of patients.

    PubMed

    Lai, Yi-Horng

    2015-01-01

    The application of information technology in health education plans in Taiwan has existed for a long time. The purpose of this study is to explore the relationship between information technology application in health education and patients' preoperative knowledge by synthesizing existing studies that compare the effectiveness of information technology application and traditional instruction in the health education plan. In spite of claims regarding the potential benefits of using information technology in health education plans, the results of previous studies were conflicting. This study was carried out to examine the effectiveness of information technology by using network meta-analysis, which is a statistical analysis of separate but similar studies in order to test the pooled data for statistical significance. The information technology applications in health education discussed in this study include interactive technology therapy (person-computer), group interactive technology therapy (person-person), multimedia technology therapy and video therapy. The results show that group interactive technology therapy is the most effective, followed by interactive technology therapy. All four of these information technology therapies are superior to the traditional health education plan (leaflet therapy).

  1. Artificial neural network models for prediction of cardiovascular autonomic dysfunction in general Chinese population

    PubMed Central

    2013-01-01

    Background The present study aimed to develop an artificial neural network (ANN) based prediction model for cardiovascular autonomic (CA) dysfunction in the general population. Methods We analyzed a previous dataset based on a population sample consisting of 2,092 individuals aged 30–80 years. The prediction models were derived from an exploratory set using ANN analysis. Performances of these prediction models were evaluated in the validation set. Results Univariate analysis indicated that 14 risk factors showed statistically significant association with CA dysfunction (P < 0.05). The mean area under the receiver-operating curve was 0.762 (95% CI 0.732–0.793) for the prediction model developed using ANN analysis. The mean sensitivity, specificity, and positive and negative predictive values of the prediction models were 0.751, 0.665, 0.330, and 0.924, respectively. All HL statistics were less than 15.0. Conclusion ANN is an effective tool for developing prediction models with high value for predicting CA dysfunction among the general population. PMID:23902963
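    The record above evaluates its models by the area under the receiver-operating curve (AUC). As an illustrative sketch (not the authors' code), the AUC can be computed directly from predicted scores and true labels via the rank-sum identity, without building the ROC curve explicitly:

```python
import numpy as np

def roc_auc(y_true, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity:
    AUC = P(score of a random positive > score of a random negative),
    counting ties as 1/2."""
    y = np.asarray(y_true)
    s = np.asarray(scores, dtype=float)
    pos, neg = s[y == 1], s[y == 0]
    greater = (pos[:, None] > neg[None, :]).sum()   # positive outranks negative
    ties = (pos[:, None] == neg[None, :]).sum()     # tied scores count half
    return (greater + 0.5 * ties) / (len(pos) * len(neg))
```

A perfectly separating model scores 1.0; a model that assigns identical scores to all subjects scores 0.5, matching the diagonal of the ROC plot.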

  2. A Comparison of Atmospheric Quantities Determined from Advanced WVR and Weather Analysis Data

    NASA Astrophysics Data System (ADS)

    Morabito, D.; Wu, L.; Slobin, S.

    2017-05-01

    Lower frequency bands used for deep space communications (e.g., 2.3 GHz and 8.4 GHz) are oversubscribed. Thus, NASA has become interested in using higher frequency bands (e.g., 26 GHz and 32 GHz) for telemetry, making use of the available wider bandwidth. However, these bands are more susceptible to atmospheric degradation. Currently, flight projects tend to be conservative in preparing their communications links by using worst-case or conservative assumptions, which result in nonoptimum data return. We previously explored the use of weather forecasting over different weather condition scenarios to determine more optimal values of atmospheric attenuation and atmospheric noise temperature for use in telecommunications link design. In this article, we present the results of a comparison of meteorological parameters (columnar water vapor and liquid water content) estimated from multifrequency Advanced Water Vapor Radiometer (AWVR) data with those estimated from weather analysis tools (FNL). We find that for the Deep Space Network's Goldstone and Madrid tracking sites, the statistics are in reasonable agreement between the two methods. We can then use the statistics of these quantities based on FNL runs to estimate statistics of atmospheric signal degradation for tracking sites that do not have the benefit of possessing multiyear WVR data sets, such as those of the NASA Near-Earth Network (NEN). The resulting statistics of atmospheric attenuation and atmospheric noise temperature increase can then be used in link budget calculations.

  3. Temperature, Not Fine Particulate Matter (PM2.5), is Causally Associated with Short-Term Acute Daily Mortality Rates: Results from One Hundred United States Cities

    PubMed Central

    Cox, Tony; Popken, Douglas; Ricci, Paolo F

    2013-01-01

    Exposures to fine particulate matter (PM2.5) in air (C) have been suspected of contributing causally to increased acute (e.g., same-day or next-day) human mortality rates (R). We tested this causal hypothesis in 100 United States cities using the publicly available NMMAPS database. Although a significant, approximately linear, statistical C-R association exists in simple statistical models, closer analysis suggests that it is not causal. Surprisingly, conditioning on other variables that have been extensively considered in previous analyses (usually using splines or other smoothers to approximate their effects), such as month of the year and mean daily temperature, suggests that they create strong, nonlinear confounding that explains the statistical association between PM2.5 and mortality rates in this data set. As this finding disagrees with conventional wisdom, we apply several different techniques to examine it. Conditional independence tests for potential causation, non-parametric classification tree analysis, Bayesian Model Averaging (BMA), and Granger-Sims causality testing, show no evidence that PM2.5 concentrations have any causal impact on increasing mortality rates. This apparent absence of a causal C-R relation, despite their statistical association, has potentially important implications for managing and communicating the uncertain health risks associated with, but not necessarily caused by, PM2.5 exposures. PMID:23983662

  4. The Essential Genome of Escherichia coli K-12

    PubMed Central

    2018-01-01

    ABSTRACT Transposon-directed insertion site sequencing (TraDIS) is a high-throughput method coupling transposon mutagenesis with short-fragment DNA sequencing. It is commonly used to identify essential genes. Single gene deletion libraries are considered the gold standard for identifying essential genes. Currently, the TraDIS method has not been benchmarked against such libraries, and therefore, it remains unclear whether the two methodologies are comparable. To address this, a high-density transposon library was constructed in Escherichia coli K-12. Essential genes predicted from sequencing of this library were compared to existing essential gene databases. To decrease false-positive identification of essential genes, statistical data analysis included corrections for both gene length and genome length. Through this analysis, new essential genes and genes previously incorrectly designated essential were identified. We show that manual analysis of TraDIS data reveals novel features that would not have been detected by statistical analysis alone. Examples include short essential regions within genes, orientation-dependent effects, and fine-resolution identification of genome and protein features. Recognition of these insertion profiles in transposon mutagenesis data sets will assist genome annotation of less well characterized genomes and provides new insights into bacterial physiology and biochemistry. PMID:29463657

  5. Applications of Remote Sensing and GIS(Geographic Information System) in Crime Analysis of Gujranwala City.

    NASA Astrophysics Data System (ADS)

    Munawar, Iqra

    2016-07-01

    Crime mapping is a dynamic process. It can be used to assist all stages of the problem solving process. Mapping crime can help police protect citizens more effectively. The decision to utilize a certain type of map or design element may change based on the purpose of a map, the audience or the available data. If the purpose of the crime analysis map is to assist in the identification of a particular problem, selected data may be mapped to identify patterns of activity that have been previously undetected. The main objective of this research was to study the spatial distribution patterns of four common crimes, i.e., narcotics, arms, burglary, and robbery, in Gujranwala City using spatial statistical techniques to identify hotspots. Hotspots, or locations of clusters, were identified using the Getis-Ord Gi* statistic. Crime analysis mapping can be used to conduct a comprehensive spatial analysis of the problem. Graphic presentations of such findings provide a powerful medium to communicate conditions, patterns and trends, thus creating an avenue for analysts to bring about significant policy changes. Moreover, crime mapping also helps in the reduction of crime rates.
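    The hotspot detection described above relies on the Getis-Ord Gi* statistic. A minimal sketch of that statistic, assuming a precomputed spatial weights matrix (real crime-mapping workflows would typically use a GIS package rather than hand-rolled code):

```python
import numpy as np

def getis_ord_gi_star(x, w):
    """Getis-Ord Gi* z-score for each location.

    x : 1-D array of attribute values (e.g., crime counts per grid cell)
    w : n x n spatial weights matrix; for the '*' variant each row
        includes the location itself (e.g., binary contiguity weights).
    Large positive z-scores indicate hotspots, large negative ones coldspots.
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    xbar = x.mean()
    s = np.sqrt((x ** 2).mean() - xbar ** 2)       # global std. deviation
    wx = w @ x                                     # weighted local sums
    wsum = w.sum(axis=1)                           # sum of weights per location
    w2sum = (w ** 2).sum(axis=1)
    num = wx - xbar * wsum                         # local excess over expectation
    den = s * np.sqrt((n * w2sum - wsum ** 2) / (n - 1))
    return num / den
```

On a small transect with a cluster of high counts at one end, the cells inside the cluster receive the largest z-scores, which is exactly how hotspot cells are flagged.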

  6. Implantable cardioverter defibrillators for primary prevention in patients with nonischemic cardiomyopathy: A systematic review and meta-analysis.

    PubMed

    Akel, Tamer; Lafferty, James

    2017-06-01

    Implantable cardioverter defibrillators (ICDs) have demonstrated favorable survival outcomes in selected patients with cardiomyopathy. Although previous meta-analyses have shown benefit for their use in primary prevention, the evidence remains less robust for patients with nonischemic cardiomyopathy (NICM) in comparison to patients with coronary artery disease (CAD). To evaluate the effect of ICD therapy on reducing all-cause mortality and sudden cardiac death (SCD) in patients with NICM. PubMed (1993-2016), the Cochrane Central Register of Controlled Trials (2000-2016), reference lists of relevant articles, and previous meta-analyses. Search terms included defibrillator, heart failure, cardiomyopathy, randomized controlled trials, and clinical trials. Eligible trials were randomized controlled trials that included at least one ICD arm and one medical therapy arm and enrolled patients with NICM. The primary endpoint in the trials should include all-cause mortality or mortality from SCD. Hazard ratios (HRs) for all-cause mortality and mortality from SCD were either extracted or calculated along with their standard errors. Of the 1047 abstracts retained by the initial screen, eight randomized controlled trials were identified. Five of these trials reported relevant data regarding patients with NICM and were subsequently included in this meta-analysis. Pooled analysis of HRs suggested a statistically significant reduction in all-cause mortality among a total of 2573 patients randomized to ICD vs medical therapy (HR 0.80; 95% CI, 0.67-0.96; P=.02). Pooled analysis of HRs for mortality from SCD was also statistically significant (n=1677) (HR 0.51; 95% CI, 0.34-0.76; P=.001). ICD implantation is beneficial in terms of all-cause mortality and mortality from SCD in certain subgroups of patients with NICM. © 2017 John Wiley & Sons Ltd.
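    The meta-analysis above pools hazard ratios and their standard errors across trials. As a hedged illustration (the abstract does not name its estimator; a DerSimonian-Laird random-effects model is assumed here), inverse-variance pooling of log hazard ratios looks like this:

```python
import numpy as np

def pool_hazard_ratios(hr, se_log_hr):
    """DerSimonian-Laird random-effects pooling of log hazard ratios.

    hr         : per-study hazard ratios
    se_log_hr  : per-study standard errors of log(HR)
    Returns (pooled HR, 95% CI as an array, I^2 heterogeneity in %).
    """
    y = np.log(np.asarray(hr, dtype=float))
    v = np.asarray(se_log_hr, dtype=float) ** 2
    w = 1.0 / v                                   # fixed-effect weights
    ybar = (w * y).sum() / w.sum()
    q = (w * (y - ybar) ** 2).sum()               # Cochran's Q
    df = len(y) - 1
    c = w.sum() - (w ** 2).sum() / w.sum()
    tau2 = max(0.0, (q - df) / c)                 # between-study variance
    w_re = 1.0 / (v + tau2)                       # random-effects weights
    mu = (w_re * y).sum() / w_re.sum()
    se = np.sqrt(1.0 / w_re.sum())
    ci = np.exp(np.array([mu - 1.96 * se, mu + 1.96 * se]))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return np.exp(mu), ci, i2
```

With homogeneous inputs the between-study variance collapses to zero and the random-effects estimate coincides with the fixed-effect one.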

  7. Neurological Outcomes Following Suicidal Hanging: A Prospective Study of 101 Patients

    PubMed Central

    Jawaid, Mohammed Turab; Amalnath, S. Deepak; Subrahmanyam, D. K. S.

    2017-01-01

    Context: Survivors of suicidal hanging can have variable neurological outcomes – from complete recovery to irreversible brain damage. Literature on the neurological outcomes in these patients is confined to retrospective studies and case series. Hence, this prospective study was carried out. Aims: The aim is to study the neurological outcomes in suicidal hanging. Settings and Design: This was a prospective observational study carried out from July 2014 to July 2016. Subjects and Methods: Consecutive patients admitted to the emergency and medicine wards were included in the study. Details of the clinical and radiological findings, course in hospital and at 1 month postdischarge were analyzed. Statistical Analysis Used: Statistical analysis was performed using IBM SPSS advanced statistics 20.0 (SPSS Inc., Chicago, USA). Univariate analysis was performed using the Chi-square test for significance, and odds ratios were calculated. Results: Of the 101 patients, 6 died and 4 had residual neurological deficits. Cervical spine injury was seen in 3 patients. Interestingly, 39 patients could not remember the act of hanging (retrograde amnesia). Hypotension, pulmonary edema, Glasgow coma scale (GCS) score <8 at admission, need for mechanical ventilation, and cerebral edema on plain computed tomography were more frequent in those with amnesia as compared to those with normal memory, and these findings were statistically significant. Conclusions: The majority of patients recovered without any sequelae. Routine imaging of the cervical spine may not be warranted in all patients, even in those with poor GCS. Retrograde amnesia might be more common than previously believed and further studies are needed to analyze this peculiar feature. PMID:28584409

  8. Medial Tibial Stress Syndrome in Active Individuals: A Systematic Review and Meta-analysis of Risk Factors

    PubMed Central

    Reinking, Mark F.; Austin, Tricia M.; Richter, Randy R.; Krieger, Mary M.

    2016-01-01

    Context: Medial tibial stress syndrome (MTSS) is a common condition in active individuals and presents as diffuse pain along the posteromedial border of the tibia. Objective: To use cross-sectional, case-control, and cohort studies to identify significant MTSS risk factors. Data Sources: Bibliographic databases (PubMed, Scopus, CINAHL, SPORTDiscus, EMBASE, EBM Reviews, PEDRo), grey literature, electronic search of full text of journals, manual review of reference lists, and automatically executed PubMed MTSS searches were utilized. All searches were conducted between 2011 and 2015. Study Selection: Inclusion criteria were determined a priori and included original research with participants’ pain diffuse, located in the posterior medial tibial region, and activity related. Study Design: Systematic review with meta-analysis. Level of evidence: Level 4. Data Extraction: Titles and abstracts were reviewed to eliminate citations that did not meet the criteria for inclusion. Study characteristics identified a priori were extracted for data analysis. Statistical heterogeneity was examined using the I2 index and Cochran Q test, and a random-effects model was used to calculate the meta-analysis when 2 or more studies examined a risk factor. Two authors independently assessed study quality. Results: Eighty-three articles met the inclusion criteria, and 22 articles included risk factor data. Of the 27 risk factors that were in 2 or more studies, 5 risk factors showed a significant pooled effect and low statistical heterogeneity, including female sex (odds ratio [OR], 2.35; CI, 1.58-3.50), increased weight (standardized mean difference [SMD], 0.24; CI, 0.03-0.45), higher navicular drop (SMD, 0.44; CI, 0.21-0.67), previous running injury (OR, 2.18; CI, 1.00-4.72), and greater hip external rotation with the hip in flexion (SMD, 0.44; CI, 0.23-0.65). The remaining risk factors had a nonsignificant pooled effect or significant pooled effect with high statistical heterogeneity. 
Conclusion: Female sex, increased weight, higher navicular drop, previous running injury, and greater hip external rotation with the hip in flexion are risk factors for the development of MTSS. PMID:27729482

  9. Summary of the COS Cycle 22 Calibration Program

    NASA Astrophysics Data System (ADS)

    Sonnentrucker, Paule; Becker, George; Bostroem, Azalee; Debes, John H.; Ely, Justin; Fox, Andrew; Lockwood, Sean; Oliveira, Cristina; Penton, Steven; Proffitt, Charles; Roman-Duval, Julia; Sahnow, David; Sana, Hugues; Taylor, Jo; Welty, Alan D.; Wheeler, Thomas

    2016-09-01

    We summarize the calibration activities for the Cosmic Origins Spectrograph (COS) on the Hubble Space Telescope during Cycle 22 which ran from November 2014 through October 2015. We give an overview of the COS calibration plan, COS usage statistics and we briefly describe major changes with respect to the previous cycle. High-level executive summaries for each calibration program comprising Cycle 22 are also given here. Results of the analysis attached to each program are published in separate ISRs.

  10. Correlation of sweat chloride and percent predicted FEV1 in cystic fibrosis patients treated with ivacaftor.

    PubMed

    Fidler, Meredith C; Beusmans, Jack; Panorchan, Paul; Van Goor, Fredrick

    2017-01-01

    Ivacaftor, a CFTR potentiator that enhances chloride transport by acting directly on CFTR to increase its channel gating activity, has been evaluated in patients with different CFTR mutations. Several previous analyses have reported no statistical correlation between change from baseline in ppFEV1 and reduction in sweat chloride levels for individuals treated with ivacaftor. The objective of the post hoc analysis described here was to expand upon previous analyses and evaluate the correlation between sweat chloride levels and absolute ppFEV1 changes across multiple cohorts of patients with different CF-causing mutations who were treated with ivacaftor. The goal of the analysis was to help define the potential value of sweat chloride as a pharmacodynamic biomarker for use in CFTR modulator trials. For any given study, reductions in sweat chloride levels and improvements in absolute ppFEV1 were not correlated for individual patients. However, when the data from all studies were combined, a statistically significant correlation between sweat chloride levels and ppFEV1 changes was observed (p<0.0001). Thus, sweat chloride level changes in response to potentiation of the CFTR protein by ivacaftor appear to be a predictive pharmacodynamic biomarker of lung function changes on a population basis but are unsuitable for the prediction of treatment benefits for individuals. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  11. Developing a Continuous Quality Improvement Assessment Using a Patient-Centered Approach in Optimizing Systemic Lupus Erythematosus Disease Control.

    PubMed

    Updyke, Katelyn Mariko; Urso, Brittany; Beg, Shazia; Solomon, James

    2017-10-09

    Systemic lupus erythematosus (SLE) is a multi-organ, autoimmune disease in which patients lose self-tolerance and develop immune complexes which deposit systemically causing multi-organ damage and inflammation. Patients often experience unpredictable flares of symptoms with poorly identified triggers. Literature suggests exogenous exposures may contribute to flares in symptoms. An online pilot survey was marketed globally through social media to self-reported SLE patients with the goal to identify specific subpopulations who are susceptible to disease state changes based on analyzed exogenous factors. The pilot survey was promoted for two weeks, 80 respondents fully completed the survey and were included in statistical analysis. Descriptive statistical analysis was performed on de-identified patient surveys and compared to previous literature studies reporting known or theorized triggers in the SLE disease state. The pilot survey identified similar exogenous triggers compared to previous literature, including antibiotics, increasing beef intake, and metal implants. The goal of the pilot survey is to utilize similar questions to develop a detailed internet-based patient interactive form that can be edited and time stamped as a method to promote continuous quality improvement assessments. The ultimate objective of the platform is to interact with SLE patients from across the globe longitudinally to optimize disease control and improve quality of care by allowing them to avoid harmful triggers.

  12. Developing a Continuous Quality Improvement Assessment Using a Patient-Centered Approach in Optimizing Systemic Lupus Erythematosus Disease Control

    PubMed Central

    Urso, Brittany; Beg, Shazia; Solomon, James

    2017-01-01

    Systemic lupus erythematosus (SLE) is a multi-organ, autoimmune disease in which patients lose self-tolerance and develop immune complexes which deposit systemically causing multi-organ damage and inflammation. Patients often experience unpredictable flares of symptoms with poorly identified triggers. Literature suggests exogenous exposures may contribute to flares in symptoms. An online pilot survey was marketed globally through social media to self-reported SLE patients with the goal to identify specific subpopulations who are susceptible to disease state changes based on analyzed exogenous factors. The pilot survey was promoted for two weeks, 80 respondents fully completed the survey and were included in statistical analysis. Descriptive statistical analysis was performed on de-identified patient surveys and compared to previous literature studies reporting known or theorized triggers in the SLE disease state. The pilot survey identified similar exogenous triggers compared to previous literature, including antibiotics, increasing beef intake, and metal implants. The goal of the pilot survey is to utilize similar questions to develop a detailed internet-based patient interactive form that can be edited and time stamped as a method to promote continuous quality improvement assessments. The ultimate objective of the platform is to interact with SLE patients from across the globe longitudinally to optimize disease control and improve quality of care by allowing them to avoid harmful triggers. PMID:29226052

  13. Evidence, temperature, and the laws of thermodynamics.

    PubMed

    Vieland, Veronica J

    2014-01-01

    A primary purpose of statistical analysis in genetics is the measurement of the strength of evidence for or against hypotheses. As with any type of measurement, a properly calibrated measurement scale is necessary if we want to be able to meaningfully compare degrees of evidence across genetic data sets, across different types of genetic studies and/or across distinct experimental modalities. In previous papers in this journal and elsewhere, my colleagues and I have argued that geneticists ought to care about the scale on which statistical evidence is measured, and we have proposed the Kelvin temperature scale as a template for a context-independent measurement scale for statistical evidence. Moreover, we have claimed that, mathematically speaking, evidence and temperature may be one and the same thing. On first blush, this might seem absurd. Temperature is a property of systems following certain laws of nature (in particular, the 1st and 2nd Law of Thermodynamics) involving very physical quantities (e.g., energy) and processes (e.g., mechanical work). But what do the laws of thermodynamics have to do with statistical systems? Here I address that question. © 2014 S. Karger AG, Basel.

  14. Damages detection in cylindrical metallic specimens by means of statistical baseline models and updated daily temperature profiles

    NASA Astrophysics Data System (ADS)

    Villamizar-Mejia, Rodolfo; Mujica-Delgado, Luis-Eduardo; Ruiz-Ordóñez, Magda-Liliana; Camacho-Navarro, Jhonatan; Moreno-Beltrán, Gustavo

    2017-05-01

    In previous works, damage detection in metallic specimens exposed to temperature changes has been achieved using a statistical baseline model based on Principal Component Analysis (PCA) and the piezodiagnostics principle, accounting for the temperature effect by augmenting the baseline model or by using several baseline models according to the current temperature. In this paper a new approach is presented, in which damage detection is based on a new index that combines the Q and T2 statistical indices with current temperature measurements. Experimental tests were performed on a carbon-steel pipe of 1 m length and 1.5 inches diameter, instrumented with piezodevices acting as actuators or sensors. A PCA baseline model was obtained at a temperature of 21 °C, and the T2 and Q statistical indices were then computed for a 24 h temperature profile. Adding mass at different points of the pipe between sensor and actuator was used as damage. By using the combined index, the temperature contribution can be separated and a better differentiation of damaged with respect to undamaged cases can be obtained graphically.
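    The T2 and Q indices mentioned above are standard PCA monitoring statistics: T2 measures distance inside the retained subspace, Q (squared prediction error) measures what the model fails to explain. A minimal sketch of how they are computed from a PCA baseline (illustrative only; the paper's combined temperature-aware index is not reproduced here):

```python
import numpy as np

def pca_baseline(X, k):
    """Fit a PCA baseline model from healthy data X (samples x features)."""
    mu = X.mean(axis=0)
    Xc = X - mu
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:k].T                        # retained loadings (features x k)
    lam = (s[:k] ** 2) / (len(X) - 1)   # retained eigenvalues (score variances)
    return mu, P, lam

def t2_q_indices(x, mu, P, lam):
    """Hotelling T^2 and Q (squared prediction error) for one observation."""
    xc = x - mu
    t = P.T @ xc                        # scores in the PCA subspace
    t2 = float(np.sum(t ** 2 / lam))    # variance-normalized in-model distance
    resid = xc - P @ t                  # part not explained by the baseline
    q = float(resid @ resid)
    return t2, q
```

An observation consistent with the baseline yields a near-zero Q, while damage (or any effect orthogonal to the baseline subspace) inflates it, which is why Q is the usual damage-sensitive index in such schemes.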

  15. The “χ” of the Matter: Testing the Relationship between Paleoenvironments and Three Theropod Clades

    PubMed Central

    Sales, Marcos A. F.; Lacerda, Marcel B.; Horn, Bruno L. D.; de Oliveira, Isabel A. P.; Schultz, Cesar L.

    2016-01-01

    The view of spinosaurs as dinosaurs of semi-aquatic habits, strongly associated with marginal and coastal habitats, is deeply rooted in both scientific and popular knowledge, but it has never been statistically tested. Inspired by a previous analysis of other dinosaur clades and major paleoenvironmental categories, here we present our own statistical evaluation of the association between coastal and terrestrial paleoenvironments and spinosaurids, along with two other theropod taxa: abelisaurids and carcharodontosaurids. We also included a taphonomic perspective and classified the occurrences in categories related to potential biases in order to better address our interpretations. Our main results can be summarized as follows: 1) the taxon with the largest amount of statistical evidence showing it positively associated to coastal paleoenvironments is Spinosauridae; 2) abelisaurids and carcharodontosaurids had more statistical evidence showing them positively associated with terrestrial paleoenvironments; 3) it is likely that spinosaurids also occupied spatially inland areas in a way somehow comparable at least to carcharodontosaurids; 4) abelisaurids may have been more common than the other two taxa in inland habitats. PMID:26829315

  16. Detecting the contagion effect in mass killings; a constructive example of the statistical advantages of unbinned likelihood methods.

    PubMed

    Towers, Sherry; Mubayi, Anuj; Castillo-Chavez, Carlos

    2018-01-01

    When attempting to statistically distinguish between a null and an alternative hypothesis, many researchers in the life and social sciences turn to binned statistical analysis methods, or methods that are simply based on the moments of a distribution (such as the mean, and variance). These methods have the advantage of simplicity of implementation, and simplicity of explanation. However, when null and alternative hypotheses manifest themselves in subtle differences in patterns in the data, binned analysis methods may be insensitive to these differences, and researchers may erroneously fail to reject the null hypothesis when in fact more sensitive statistical analysis methods might produce a different result when the null hypothesis is actually false. Here, with a focus on two recent conflicting studies of contagion in mass killings as instructive examples, we discuss how the use of unbinned likelihood methods makes optimal use of the information in the data; a fact that has been long known in statistical theory, but perhaps is not as widely appreciated amongst general researchers in the life and social sciences. In 2015, Towers et al published a paper that quantified the long-suspected contagion effect in mass killings. However, in 2017, Lankford & Tomek subsequently published a paper, based upon the same data, that claimed to contradict the results of the earlier study. The former used unbinned likelihood methods, and the latter used binned methods, and comparison of distribution moments. Using these analyses, we also discuss how visualization of the data can aid in determination of the most appropriate statistical analysis methods to distinguish between a null and alternate hypothesis. 
We also discuss the importance of assessment of the robustness of analysis results to methodological assumptions made (for example, arbitrary choices of number of bins and bin widths when using binned methods); an issue that is widely overlooked in the literature, but is critical to analysis reproducibility and robustness. When an analysis cannot distinguish between a null and alternate hypothesis, care must be taken to ensure that the analysis methodology itself maximizes the use of information in the data that can distinguish between the two hypotheses. The use of binned methods by Lankford & Tomek (2017), that examined how many mass killings fell within a 14 day window from a previous mass killing, substantially reduced the sensitivity of their analysis to contagion effects. The unbinned likelihood methods used by Towers et al (2015) did not suffer from this problem. While a binned analysis might be favorable for simplicity and clarity of presentation, unbinned likelihood methods are preferable when effects might be somewhat subtle.
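    To illustrate the binned-versus-unbinned distinction discussed above, the following toy sketch contrasts a single-exponential null against a mixture with an excess of short inter-event intervals, tested with an unbinned likelihood ratio. This is a hypothetical construction, not the model of Towers et al; the 14-day scale merely echoes the window mentioned above, and the coarse grid-search fit is for illustration only:

```python
import numpy as np

def unbinned_llr(intervals, contagion_scale=14.0):
    """Unbinned likelihood-ratio statistic for an excess of short
    inter-event intervals (a simple proxy for contagion).

    Null: intervals ~ Exp(rate), rate fit by maximum likelihood (1/mean).
    Alt:  mixture  f * Exp(1/contagion_scale) + (1 - f) * Exp(rate),
          with f and rate fit by a coarse grid search (illustration only).
    Returns 2*(lnL_alt - lnL_null), which is >= 0 by construction.
    """
    t = np.asarray(intervals, dtype=float)

    def loglik(f, rate):
        short = np.exp(-t / contagion_scale) / contagion_scale
        base = rate * np.exp(-rate * t)
        return np.log(f * short + (1 - f) * base).sum()

    rate0 = 1.0 / t.mean()
    ll0 = loglik(0.0, rate0)            # null fit
    best = ll0
    for f in np.linspace(0.0, 0.5, 26):
        for rate in rate0 * np.linspace(0.5, 1.5, 21):
            best = max(best, loglik(f, rate))
    return 2.0 * (best - ll0)
```

Because every interval contributes its exact likelihood, a modest excess of short intervals moves the statistic sharply, whereas a binned count of "events within 14 days" would discard the within-bin information that drives this sensitivity.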

  17. Detecting the contagion effect in mass killings; a constructive example of the statistical advantages of unbinned likelihood methods

    PubMed Central

    Mubayi, Anuj; Castillo-Chavez, Carlos

    2018-01-01

    Background When attempting to statistically distinguish between a null and an alternative hypothesis, many researchers in the life and social sciences turn to binned statistical analysis methods, or methods that are simply based on the moments of a distribution (such as the mean, and variance). These methods have the advantage of simplicity of implementation, and simplicity of explanation. However, when null and alternative hypotheses manifest themselves in subtle differences in patterns in the data, binned analysis methods may be insensitive to these differences, and researchers may erroneously fail to reject the null hypothesis when in fact more sensitive statistical analysis methods might produce a different result when the null hypothesis is actually false. Here, with a focus on two recent conflicting studies of contagion in mass killings as instructive examples, we discuss how the use of unbinned likelihood methods makes optimal use of the information in the data; a fact that has been long known in statistical theory, but perhaps is not as widely appreciated amongst general researchers in the life and social sciences. Methods In 2015, Towers et al published a paper that quantified the long-suspected contagion effect in mass killings. However, in 2017, Lankford & Tomek subsequently published a paper, based upon the same data, that claimed to contradict the results of the earlier study. The former used unbinned likelihood methods, and the latter used binned methods, and comparison of distribution moments. Using these analyses, we also discuss how visualization of the data can aid in determination of the most appropriate statistical analysis methods to distinguish between a null and alternate hypothesis. 
    We also discuss the importance of assessment of the robustness of analysis results to methodological assumptions made (for example, arbitrary choices of the number of bins and bin widths when using binned methods); an issue that is widely overlooked in the literature, but is critical to analysis reproducibility and robustness. Conclusions When an analysis cannot distinguish between a null and alternative hypothesis, care must be taken to ensure that the analysis methodology itself maximizes the use of information in the data that can distinguish between the two hypotheses. The use of binned methods by Lankford & Tomek (2017), which examined how many mass killings fell within a 14 day window from a previous mass killing, substantially reduced the sensitivity of their analysis to contagion effects. The unbinned likelihood methods used by Towers et al (2015) did not suffer from this problem. While a binned analysis might be favorable for simplicity and clarity of presentation, unbinned likelihood methods are preferable when effects might be somewhat subtle. PMID:29742115
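
    The contrast the abstract draws can be made concrete with a toy simulation: an unbinned exponential likelihood uses the exact value of every inter-event time, while a fixed 14-day window reduces each observation to a single yes/no bit. A minimal sketch, with entirely synthetic data (not the mass-killing dataset analyzed in either study):

```python
import numpy as np

def unbinned_exp_loglik(times, rate):
    """Unbinned log-likelihood of i.i.d. exponential inter-event times."""
    times = np.asarray(times, dtype=float)
    return len(times) * np.log(rate) - rate * times.sum()

def unbinned_rate_mle(times):
    """MLE of the exponential rate: uses the exact value of every waiting time."""
    return 1.0 / np.mean(times)

def binned_14day_fraction(times, window=14.0):
    """Binned summary in the spirit of a fixed 14-day window:
    each waiting time is reduced to a single bit (inside/outside the window)."""
    times = np.asarray(times, dtype=float)
    return np.mean(times < window)

rng = np.random.default_rng(0)
sample = rng.exponential(scale=10.0, size=1000)  # mean waiting time of 10 days
rate_hat = unbinned_rate_mle(sample)             # close to the true rate 0.1/day
frac_hat = binned_14day_fraction(sample)         # one number summarizing 1000 times
```

Because the binned summary discards everything about a waiting time except which side of 14 days it falls on, subtle departures from the exponential null (such as a contagion-induced excess of short waits) are harder to detect from `frac_hat` than from the full unbinned likelihood.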

  18. H.E.S.S. Limits on Linelike Dark Matter Signatures in the 100 GeV to 2 TeV Energy Range Close to the Galactic Center.

    PubMed

    Abdalla, H; Abramowski, A; Aharonian, F; Ait Benkhali, F; Akhperjanian, A G; Andersson, T; Angüner, E O; Arrieta, M; Aubert, P; Backes, M; Balzer, A; Barnard, M; Becherini, Y; Becker Tjus, J; Berge, D; Bernhard, S; Bernlöhr, K; Birsin, E; Blackwell, R; Böttcher, M; Boisson, C; Bolmont, J; Bordas, P; Bregeon, J; Brun, F; Brun, P; Bryan, M; Bulik, T; Capasso, M; Carr, J; Casanova, S; Chakraborty, N; Chalme-Calvet, R; Chaves, R C G; Chen, A; Chevalier, J; Chrétien, M; Colafrancesco, S; Cologna, G; Condon, B; Conrad, J; Couturier, C; Cui, Y; Davids, I D; Degrange, B; Deil, C; Devin, J; deWilt, P; Djannati-Ataï, A; Domainko, W; Donath, A; Drury, L O'C; Dubus, G; Dutson, K; Dyks, J; Dyrda, M; Edwards, T; Egberts, K; Eger, P; Ernenwein, J-P; Eschbach, S; Farnier, C; Fegan, S; Fernandes, M V; Fiasson, A; Fontaine, G; Förster, A; Funk, S; Füßling, M; Gabici, S; Gajdus, M; Gallant, Y A; Garrigoux, T; Giavitto, G; Giebels, B; Glicenstein, J F; Gottschall, D; Goyal, A; Grondin, M-H; Grudzińska, M; Hadasch, D; Hahn, J; Hawkes, J; Heinzelmann, G; Henri, G; Hermann, G; Hervet, O; Hillert, A; Hinton, J A; Hofmann, W; Hoischen, C; Holler, M; Horns, D; Ivascenko, A; Jacholkowska, A; Jamrozy, M; Janiak, M; Jankowsky, D; Jankowsky, F; Jingo, M; Jogler, T; Jouvin, L; Jung-Richardt, I; Kastendieck, M A; Katarzyński, K; Katz, U; Kerszberg, D; Khélifi, B; Kieffer, M; King, J; Klepser, S; Klochkov, D; Kluźniak, W; Kolitzus, D; Komin, Nu; Kosack, K; Krakau, S; Kraus, M; Krayzel, F; Krüger, P P; Laffon, H; Lamanna, G; Lau, J; Lees, J-P; Lefaucheur, J; Lefranc, V; Lemière, A; Lemoine-Goumard, M; Lenain, J-P; Leser, E; Liu, R; Lohse, T; Lorentz, M; Lypova, I; Marandon, V; Marcowith, A; Mariaud, C; Marx, R; Maurin, G; Maxted, N; Mayer, M; Meintjes, P J; Meyer, M; Mitchell, A M W; Moderski, R; Mohamed, M; Morå, K; Moulin, E; Murach, T; de Naurois, M; Niederwanger, F; Niemiec, J; Oakes, L; O'Brien, P; Odaka, H; Ohm, S; Ostrowski, M; Öttl, S; Oya, I; Padovani, M; Panter, M; Parsons, R D; Paz 
Arribas, M; Pekeur, N W; Pelletier, G; Perennes, C; Petrucci, P-O; Peyaud, B; Pita, S; Poon, H; Prokhorov, D; Prokoph, H; Pühlhofer, G; Punch, M; Quirrenbach, A; Raab, S; Reimer, A; Reimer, O; Renaud, M; de Los Reyes, R; Rieger, F; Romoli, C; Rosier-Lees, S; Rowell, G; Rudak, B; Rulten, C B; Sahakian, V; Salek, D; Sanchez, D A; Santangelo, A; Sasaki, M; Schlickeiser, R; Schüssler, F; Schulz, A; Schwanke, U; Schwemmer, S; Settimo, M; Seyffert, A S; Shafi, N; Shilon, I; Simoni, R; Sol, H; Spanier, F; Spengler, G; Spies, F; Stawarz, Ł; Steenkamp, R; Stegmann, C; Stinzing, F; Stycz, K; Sushch, I; Tavernet, J-P; Tavernier, T; Taylor, A M; Terrier, R; Tibaldo, L; Tluczykont, M; Trichard, C; Tuffs, R; van der Walt, J; van Eldik, C; van Soelen, B; Vasileiadis, G; Veh, J; Venter, C; Viana, A; Vincent, P; Vink, J; Voisin, F; Völk, H J; Vuillaume, T; Wadiasingh, Z; Wagner, S J; Wagner, P; Wagner, R M; White, R; Wierzcholska, A; Willmann, P; Wörnlein, A; Wouters, D; Yang, R; Zabalza, V; Zaborov, D; Zacharias, M; Zdziarski, A A; Zech, A; Zefi, F; Ziegler, A; Żywucka, N

    2016-10-07

    A search for dark matter linelike signals is performed in the vicinity of the Galactic Center by the H.E.S.S. experiment on observational data taken in 2014. An unbinned likelihood analysis is developed to improve the sensitivity to linelike signals. The upgraded analysis along with newer data extend the energy coverage of the previous measurement down to 100 GeV. The 18 h of data collected with the H.E.S.S. array allow one to rule out at 95% C.L. the presence of a 130 GeV line (at l=-1.5°, b=0° and for a dark matter profile centered at this location) previously reported in Fermi-LAT data. This new analysis overlaps significantly in energy with previous Fermi-LAT and H.E.S.S. results. No significant excess associated with dark matter annihilations was found in the energy range of 100 GeV to 2 TeV, and upper limits on the gamma-ray flux and the velocity-weighted annihilation cross section are derived adopting an Einasto dark matter halo profile. Expected limits for present and future large-statistics H.E.S.S. observations are also given.

  19. H.E.S.S. Limits on Linelike Dark Matter Signatures in the 100 GeV to 2 TeV Energy Range Close to the Galactic Center

    NASA Astrophysics Data System (ADS)

    Abdalla, H.; Abramowski, A.; Aharonian, F.; Ait Benkhali, F.; Akhperjanian, A. G.; Andersson, T.; Angüner, E. O.; Arrieta, M.; Aubert, P.; Backes, M.; Balzer, A.; Barnard, M.; Becherini, Y.; Becker Tjus, J.; Berge, D.; Bernhard, S.; Bernlöhr, K.; Birsin, E.; Blackwell, R.; Böttcher, M.; Boisson, C.; Bolmont, J.; Bordas, P.; Bregeon, J.; Brun, F.; Brun, P.; Bryan, M.; Bulik, T.; Capasso, M.; Carr, J.; Casanova, S.; Chakraborty, N.; Chalme-Calvet, R.; Chaves, R. C. G.; Chen, A.; Chevalier, J.; Chrétien, M.; Colafrancesco, S.; Cologna, G.; Condon, B.; Conrad, J.; Couturier, C.; Cui, Y.; Davids, I. D.; Degrange, B.; Deil, C.; Devin, J.; deWilt, P.; Djannati-Ataï, A.; Domainko, W.; Donath, A.; Drury, L. O'C.; Dubus, G.; Dutson, K.; Dyks, J.; Dyrda, M.; Edwards, T.; Egberts, K.; Eger, P.; Ernenwein, J.-P.; Eschbach, S.; Farnier, C.; Fegan, S.; Fernandes, M. V.; Fiasson, A.; Fontaine, G.; Förster, A.; Funk, S.; Füßling, M.; Gabici, S.; Gajdus, M.; Gallant, Y. A.; Garrigoux, T.; Giavitto, G.; Giebels, B.; Glicenstein, J. F.; Gottschall, D.; Goyal, A.; Grondin, M.-H.; Grudzińska, M.; Hadasch, D.; Hahn, J.; Hawkes, J.; Heinzelmann, G.; Henri, G.; Hermann, G.; Hervet, O.; Hillert, A.; Hinton, J. A.; Hofmann, W.; Hoischen, C.; Holler, M.; Horns, D.; Ivascenko, A.; Jacholkowska, A.; Jamrozy, M.; Janiak, M.; Jankowsky, D.; Jankowsky, F.; Jingo, M.; Jogler, T.; Jouvin, L.; Jung-Richardt, I.; Kastendieck, M. A.; Katarzyński, K.; Katz, U.; Kerszberg, D.; Khélifi, B.; Kieffer, M.; King, J.; Klepser, S.; Klochkov, D.; Kluźniak, W.; Kolitzus, D.; Komin, Nu.; Kosack, K.; Krakau, S.; Kraus, M.; Krayzel, F.; Krüger, P. P.; Laffon, H.; Lamanna, G.; Lau, J.; Lees, J.-P.; Lefaucheur, J.; Lefranc, V.; Lemière, A.; Lemoine-Goumard, M.; Lenain, J.-P.; Leser, E.; Liu, R.; Lohse, T.; Lorentz, M.; Lypova, I.; Marandon, V.; Marcowith, A.; Mariaud, C.; Marx, R.; Maurin, G.; Maxted, N.; Mayer, M.; Meintjes, P. J.; Meyer, M.; Mitchell, A. M. 
W.; Moderski, R.; Mohamed, M.; Morâ, K.; Moulin, E.; Murach, T.; de Naurois, M.; Niederwanger, F.; Niemiec, J.; Oakes, L.; O'Brien, P.; Odaka, H.; Ohm, S.; Ostrowski, M.; Öttl, S.; Oya, I.; Padovani, M.; Panter, M.; Parsons, R. D.; Paz Arribas, M.; Pekeur, N. W.; Pelletier, G.; Perennes, C.; Petrucci, P.-O.; Peyaud, B.; Pita, S.; Poon, H.; Prokhorov, D.; Prokoph, H.; Pühlhofer, G.; Punch, M.; Quirrenbach, A.; Raab, S.; Reimer, A.; Reimer, O.; Renaud, M.; de los Reyes, R.; Rieger, F.; Romoli, C.; Rosier-Lees, S.; Rowell, G.; Rudak, B.; Rulten, C. B.; Sahakian, V.; Salek, D.; Sanchez, D. A.; Santangelo, A.; Sasaki, M.; Schlickeiser, R.; Schüssler, F.; Schulz, A.; Schwanke, U.; Schwemmer, S.; Settimo, M.; Seyffert, A. S.; Shafi, N.; Shilon, I.; Simoni, R.; Sol, H.; Spanier, F.; Spengler, G.; Spies, F.; Stawarz, Ł.; Steenkamp, R.; Stegmann, C.; Stinzing, F.; Stycz, K.; Sushch, I.; Tavernet, J.-P.; Tavernier, T.; Taylor, A. M.; Terrier, R.; Tibaldo, L.; Tluczykont, M.; Trichard, C.; Tuffs, R.; van der Walt, J.; van Eldik, C.; van Soelen, B.; Vasileiadis, G.; Veh, J.; Venter, C.; Viana, A.; Vincent, P.; Vink, J.; Voisin, F.; Völk, H. J.; Vuillaume, T.; Wadiasingh, Z.; Wagner, S. J.; Wagner, P.; Wagner, R. M.; White, R.; Wierzcholska, A.; Willmann, P.; Wörnlein, A.; Wouters, D.; Yang, R.; Zabalza, V.; Zaborov, D.; Zacharias, M.; Zdziarski, A. A.; Zech, A.; Zefi, F.; Ziegler, A.; Żywucka, N.; H. E. S. S. Collaboration

    2016-10-01

    A search for dark matter linelike signals is performed in the vicinity of the Galactic Center by the H.E.S.S. experiment on observational data taken in 2014. An unbinned likelihood analysis is developed to improve the sensitivity to linelike signals. The upgraded analysis along with newer data extend the energy coverage of the previous measurement down to 100 GeV. The 18 h of data collected with the H.E.S.S. array allow one to rule out at 95% C.L. the presence of a 130 GeV line (at l = -1.5°, b = 0° and for a dark matter profile centered at this location) previously reported in Fermi-LAT data. This new analysis overlaps significantly in energy with previous Fermi-LAT and H.E.S.S. results. No significant excess associated with dark matter annihilations was found in the energy range of 100 GeV to 2 TeV, and upper limits on the gamma-ray flux and the velocity-weighted annihilation cross section are derived adopting an Einasto dark matter halo profile. Expected limits for present and future large-statistics H.E.S.S. observations are also given.

  20. AGR-1 Thermocouple Data Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeff Einerson

    2012-05-01

    This report documents an effort to analyze measured and simulated data obtained in the Advanced Gas Reactor (AGR) fuel irradiation test program conducted in the INL's Advanced Test Reactor (ATR) to support the Next Generation Nuclear Plant (NGNP) R&D program. The work follows up on a previous study (Pham and Einerson, 2010), in which statistical analysis methods were applied for AGR-1 thermocouple data qualification. The present work exercises the idea that, while recognizing uncertainties inherent in physics and thermal simulations of the AGR-1 test, results of the numerical simulations can be used in combination with the statistical analysis methods to further improve qualification of measured data. Additionally, the combined analysis of measured and simulation data can generate insights about simulation model uncertainty that can be useful for model improvement. This report also describes an experimental control procedure to maintain fuel target temperature in the future AGR tests using regression relationships that include simulation results. The report is organized into four chapters. Chapter 1 introduces the AGR Fuel Development and Qualification program, the AGR-1 test configuration and test procedure, an overview of AGR-1 measured data, and an overview of the physics and thermal simulation, including modeling assumptions and uncertainties. A brief summary of the statistical analysis methods developed in (Pham and Einerson 2010) for AGR-1 measured data qualification within the NGNP Data Management and Analysis System (NDMAS) is also included for completeness. Chapters 2-3 describe and discuss cases in which the combined use of experimental and simulation data is realized. Issues associated with measurement and modeling uncertainties arising from the combined analysis are identified. 
    This includes demonstration that such a combined analysis led to important insights for reducing uncertainty in presentation of AGR-1 measured data (Chapter 2) and interpretation of simulation results (Chapter 3). The statistics-based, simulation-aided experimental control procedure for the future AGR tests is developed and demonstrated in Chapter 4. The procedure for controlling the target fuel temperature (capsule peak or average) is based on regression functions of thermocouple readings and other relevant parameters, accounting for possible changes in both physical and thermal conditions and in instrument performance.
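
    The regression idea behind the Chapter 4 control procedure can be illustrated with ordinary least squares: regress the quantity of interest on thermocouple readings and use the fitted relationship for prediction. A minimal sketch with synthetic data (the coefficients, noise levels, and variable names are invented for illustration and are not taken from the AGR-1 report):

```python
import numpy as np

# Hypothetical setup: regress an (unmeasured) peak fuel temperature on two
# thermocouple readings, then use the fit to predict it from new readings.
rng = np.random.default_rng(3)
tc1 = 950.0 + 30.0 * rng.standard_normal(40)    # thermocouple 1 readings, deg C
tc2 = 980.0 + 25.0 * rng.standard_normal(40)    # thermocouple 2 readings, deg C
fuel = 1.1 * tc1 + 0.4 * tc2 + 50.0 + 5.0 * rng.standard_normal(40)  # synthetic response

# Ordinary least squares with an intercept column
X = np.column_stack([np.ones_like(tc1), tc1, tc2])
coef, *_ = np.linalg.lstsq(X, fuel, rcond=None)
predicted = X @ coef   # fitted fuel temperatures from the regression function
```

In the report's procedure the regression functions also fold in simulation results and drift terms; this sketch shows only the bare least-squares core.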

  1. Prognostic impact of the red cell distribution width in esophageal cancer patients: A systematic review and meta-analysis.

    PubMed

    Xu, Wei-Yu; Yang, Xiao-Bo; Wang, Wen-Qin; Bai, Yi; Long, Jun-Yu; Lin, Jian-Zhen; Xiong, Jian-Ping; Zheng, Yong-Chang; He, Xiao-Dong; Zhao, Hai-Tao; Sang, Xin-Ting

    2018-05-21

    To clarify the previous discrepant conclusions, we performed a meta-analysis to evaluate the prognostic value of red cell distribution width (RDW) in esophageal cancer (EC). We searched the PubMed, EMBASE, Web of Science and Cochrane Library databases to identify clinical studies, with STATA version 12.0 used for statistical analysis. Studies that met the following criteria were considered eligible: (1) Studies including EC patients who underwent radical esophagectomy; (2) studies including patients with localized disease without distant metastasis; (3) studies including patients without preoperative neoadjuvant therapy; (4) studies including patients without previous anti-inflammatory therapies and with available preoperative laboratory outcomes; (5) studies reporting association between the preoperative RDW and overall survival (OS)/disease-free survival (DFS)/cancer-specific survival (CSS); and (6) studies published in English. A total of six articles, published between 2015 and 2017, ultimately fulfilled the selection criteria. Statistical analysis showed that RDW was not associated with the prognosis of EC patients, irrespective of OS/CSS [hazard ratio (HR) = 1.27, 95% confidence interval (CI): 0.97-1.57, P = 0.000] or DFS (HR = 1.42, 95%CI: 0.96-1.88, P = 0.000). Subgroup analysis indicated that elevated RDW was significantly associated with worse OS/CSS of EC patients when RDW > 13% (HR = 1.45, 95%CI: 1.13-1.76, P = 0.000), when the patient number was ≤ 400 (HR = 1.45, 95%CI: 1.13-1.76, P = 0.000) and when the study type was retrospective (HR = 1.42, 95%CI: 1.16-1.69, P = 0.000). Contrary to our general understanding, this meta-analysis revealed that RDW cannot serve as an indicator of poor prognosis in patients with EC. However, it may still be a useful predictor of unfavorable prognosis using an appropriate cut-off value.
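
    The inverse-variance pooling behind such a meta-analysis can be sketched in a few lines: each study's log hazard ratio is weighted by the reciprocal of its variance, with the standard error recovered from the reported 95% CI. A minimal fixed-effect sketch (the two input studies are illustrative made-up entries, not the six articles analyzed above):

```python
import math

def pooled_hazard_ratio(hrs, ci_lows, ci_highs):
    """Fixed-effect inverse-variance pooling of log hazard ratios.
    Standard errors are recovered from the 95% CI width on the log scale."""
    z = 1.96
    logs = [math.log(hr) for hr in hrs]
    ses = [(math.log(hi) - math.log(lo)) / (2 * z)
           for lo, hi in zip(ci_lows, ci_highs)]
    ws = [1.0 / se ** 2 for se in ses]                      # inverse-variance weights
    pooled_log = sum(w * l for w, l in zip(ws, logs)) / sum(ws)
    pooled_se = (1.0 / sum(ws)) ** 0.5
    return (math.exp(pooled_log),
            math.exp(pooled_log - z * pooled_se),
            math.exp(pooled_log + z * pooled_se))

# Two hypothetical studies: HR with 95% CI bounds
hr, lo, hi = pooled_hazard_ratio([1.45, 1.10], [1.13, 0.85], [1.76, 1.42])
```

A random-effects model would additionally inflate each weight's denominator by a between-study variance estimate; the fixed-effect version above is the simplest starting point.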

  2. Association of Soccer and Genu Varum in Adolescents

    PubMed Central

    Asadi, Kamran; Mirbolook, Ahmadreza; Heidarzadeh, Abtin; Mardani Kivi, Mohsen; Emami Meybodi, Mohammad Kazem; Rouhi Rad, Melina

    2015-01-01

    Background: Genu varum is a physical deformity marked by bowing of the leg. One of the risk factors for this musculoskeletal malalignment is stress on the knee joint, such as with exercise. Objectives: Since genu varum has not been widely studied, this study was conducted to examine the association between genu varum and playing soccer. Materials and Methods: Between September 2010 and September 2012, 750 soccer players and 750 non-soccer players 10-18 years of age were included in the study. A questionnaire covering age, height, weight, body mass index (BMI), years of soccer participation, average time playing soccer per week, previous trauma to the lower limbs, history of any fractures of the knee, and previous hospitalizations was completed for all subjects, and the distance between the knee joint lines was measured. Chi-square, Student's t-test, and one-way ANOVA were used for statistical analysis with SPSS v.19.0 software. In all tests, a P value of less than 0.05 was construed as statistically significant. Results: Both soccer players and controls had genu varum. However, the incidence of genu varum was higher in the soccer players (P = 0.0001), and it was more prevalent in the 16-18 year age group (P = 0.0001). The results revealed a statistically significant association between the amount of practice and the prevalence of genu varum (P = 0.0001). Moreover, previous trauma to the knees and practicing load-bearing sports led to an increase in the degree of genu varum (P = 0.0001). Conclusions: There was a higher incidence of genu varum in soccer players than in control adolescents; the stress and load imposed on the knee joint led to more severe genu varum. PMID:26290852
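
    The chi-square test used for such an association can be computed by hand from a 2x2 table of counts and compared against the 5% critical value. A minimal sketch (the counts below are hypothetical, not the study's data):

```python
import numpy as np

def chi2_statistic(table):
    """Pearson chi-square statistic for an r x c contingency table."""
    table = np.asarray(table, dtype=float)
    row = table.sum(axis=1, keepdims=True)
    col = table.sum(axis=0, keepdims=True)
    expected = row @ col / table.sum()          # expected counts under independence
    return float(((table - expected) ** 2 / expected).sum())

# Hypothetical 2x2 counts: rows = soccer players / controls,
# columns = genu varum present / absent (illustrative numbers only).
stat = chi2_statistic([[210, 540],
                       [120, 630]])
significant = stat > 3.841   # 5% critical value of chi-square with 1 degree of freedom
```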

  3. Methods for estimating the magnitude and frequency of floods for urban and small, rural streams in Georgia, South Carolina, and North Carolina, 2011

    USGS Publications Warehouse

    Feaster, Toby D.; Gotvald, Anthony J.; Weaver, J. Curtis

    2014-01-01

    Reliable estimates of the magnitude and frequency of floods are essential for the design of transportation and water-conveyance structures, flood-insurance studies, and flood-plain management. Such estimates are particularly important in densely populated urban areas. In order to increase the number of streamflow-gaging stations (streamgages) available for analysis, expand the geographical coverage that would allow for application of regional regression equations across State boundaries, and build on a previous flood-frequency investigation of rural U.S. Geological Survey streamgages in the Southeast United States, a multistate approach was used to update methods for determining the magnitude and frequency of floods in urban and small, rural streams that are not substantially affected by regulation or tidal fluctuations in Georgia, South Carolina, and North Carolina. The at-site flood-frequency analysis of annual peak-flow data for urban and small, rural streams (through September 30, 2011) included 116 urban streamgages and 32 small, rural streamgages, defined in this report as basins draining less than 1 square mile. The regional regression analysis included annual peak-flow data from an additional 338 rural streamgages previously included in U.S. Geological Survey flood-frequency reports and 2 additional rural streamgages in North Carolina that were not included in the previous Southeast rural flood-frequency investigation for a total of 488 streamgages included in the urban and small, rural regression analysis. The at-site flood-frequency analyses for the urban and small, rural streamgages included the expected moments algorithm, which is a modification of the Bulletin 17B log-Pearson type III method for fitting the statistical distribution to the logarithms of the annual peak flows. Where applicable, the flood-frequency analysis also included low-outlier and historic information. 
Additionally, the application of a generalized Grubbs-Becks test allowed for the detection of multiple potentially influential low outliers. Streamgage basin characteristics were determined using geographical information system techniques. Initial ordinary least squares regression simulations reduced the number of basin characteristics on the basis of such factors as statistical significance, coefficient of determination, Mallow’s Cp statistic, and ease of measurement of the explanatory variable. Application of generalized least squares regression techniques produced final predictive (regression) equations for estimating the 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance probability flows for urban and small, rural ungaged basins for three hydrologic regions (HR1, Piedmont–Ridge and Valley; HR3, Sand Hills; and HR4, Coastal Plain), which previously had been defined from exploratory regression analysis in the Southeast rural flood-frequency investigation. Because of the limited availability of urban streamgages in the Coastal Plain of Georgia, South Carolina, and North Carolina, additional urban streamgages in Florida and New Jersey were used in the regression analysis for this region. Including the urban streamgages in New Jersey allowed for the expansion of the applicability of the predictive equations in the Coastal Plain from 3.5 to 53.5 square miles. Average standard error of prediction for the predictive equations, which is a measure of the average accuracy of the regression equations when predicting flood estimates for ungaged sites, range from 25.0 percent for the 10-percent annual exceedance probability regression equation for the Piedmont–Ridge and Valley region to 73.3 percent for the 0.2-percent annual exceedance probability regression equation for the Sand Hills region.
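
    The log-Pearson Type III fit referred to above can be approximated with a simple method-of-moments sketch using the Wilson-Hilferty frequency factor. Note this is a simplification for illustration only, not the expected moments algorithm (EMA) used in the report, and the peak-flow sample below is invented:

```python
from statistics import NormalDist, mean, stdev
import math

def lp3_quantile(peaks, aep):
    """Simplified log-Pearson Type III flood quantile via the method of moments,
    using the Wilson-Hilferty frequency-factor approximation."""
    logs = [math.log10(q) for q in peaks]
    m, s = mean(logs), stdev(logs)
    n = len(logs)
    # sample skew of the logs, with the standard small-sample adjustment factor
    g = (n / ((n - 1) * (n - 2))) * sum((x - m) ** 3 for x in logs) / (s ** 3)
    z = NormalDist().inv_cdf(1 - aep)                    # standard normal quantile
    if abs(g) > 1e-9:
        k = (2 / g) * ((1 + g * z / 6 - g ** 2 / 36) ** 3 - 1)
    else:
        k = z
    return 10 ** (m + k * s)

# Invented annual peak flows (cubic feet per second) at a hypothetical streamgage
peaks = [3200, 1800, 2500, 4100, 1500, 2900, 3600, 2200, 1950, 2750]
q100 = lp3_quantile(peaks, 0.01)   # 1-percent annual exceedance probability flow
```

EMA generalizes this moment fit to handle censored data, historic floods, and low outliers, which is why the report uses it instead of the plain moments shown here.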

  4. General solution of the chemical master equation and modality of marginal distributions for hierarchic first-order reaction networks.

    PubMed

    Reis, Matthias; Kromer, Justus A; Klipp, Edda

    2018-01-20

    Multimodality is a phenomenon that complicates the analysis of statistical data based exclusively on mean and variance. Here, we present criteria for multimodality in hierarchic first-order reaction networks, consisting of catalytic and splitting reactions. Those networks are characterized by independent and dependent subnetworks. First, we prove the general solvability of the Chemical Master Equation (CME) for this type of reaction network and thereby extend the class of solvable CMEs. Our general solution is analytical in the sense that it allows for a detailed analysis of its statistical properties. Given Poisson/deterministic initial conditions, we then prove the independent species to be Poisson/binomially distributed, while the dependent species exhibit generalized Poisson/Khatri Type B distributions. Generalized Poisson/Khatri Type B distributions are multimodal for an appropriate choice of parameters. We illustrate our criteria for multimodality by several basic models, as well as the well-known two-stage transcription-translation network and Bateman's model from nuclear physics. For both examples, multimodality was previously not reported.
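
    The point that mean and variance can hide multimodality is easy to demonstrate numerically: count the strict local maxima of a probability mass function. A minimal sketch using a Poisson mixture as a stand-in for the multimodal distributions discussed above (the rates are illustrative, not from the paper):

```python
import math

def poisson_pmf(k, lam):
    """Poisson probability mass function."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

def count_modes(pmf):
    """Count strict local maxima of a discrete probability mass function."""
    modes = 0
    for i in range(len(pmf)):
        left = pmf[i - 1] if i > 0 else -1.0
        right = pmf[i + 1] if i < len(pmf) - 1 else -1.0
        if pmf[i] > left and pmf[i] > right:
            modes += 1
    return modes

# An equal mixture of Poisson(2.5) and Poisson(15.5) is bimodal, while a single
# Poisson(7.5) is unimodal -- even though means and variances alone would not
# reveal the difference in shape.
support = range(30)
mixture = [0.5 * poisson_pmf(k, 2.5) + 0.5 * poisson_pmf(k, 15.5) for k in support]
single = [poisson_pmf(k, 7.5) for k in support]
```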

  5. Statistical and clustering analysis for disturbances: A case study of voltage dips in wind farms

    DOE PAGES

    Garcia-Sanchez, Tania; Gomez-Lazaro, Emilio; Muljadi, Eduard; ...

    2016-01-28

    This study proposes and evaluates an alternative statistical methodology to analyze a large number of voltage dips. For a given voltage dip, a set of lengths is first identified to characterize the root mean square (rms) voltage evolution along the disturbance, deduced from partial linearized time intervals and trajectories. Principal component analysis and K-means clustering are then applied to identify rms-voltage patterns and propose a reduced number of representative rms-voltage profiles from the linearized trajectories. This reduced group of averaged rms-voltage profiles enables the representation of a large number of disturbances, offering a visual and graphical representation of their evolution along the events; aspects that were not previously considered in other contributions. The complete process is evaluated on real voltage dips collected in intensive field-measurement campaigns carried out at a wind farm in Spain over several years. The results are included in this paper.
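
    The PCA-plus-K-means pipeline described above can be sketched with plain NumPy: project the linearized rms-voltage profiles onto their top principal components, then cluster the scores with Lloyd's algorithm. All profiles below are synthetic stand-ins for measured voltage dips:

```python
import numpy as np

def pca_project(X, n_components=2):
    """Project rows of X onto the top principal components (via SVD)."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

def kmeans(X, k, iters=50):
    """Plain Lloyd's algorithm with deterministic farthest-point initialization."""
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[d.argmax()])
    centers = np.array(centers)
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# Two synthetic families of linearized rms-voltage profiles (10 segments each)
rng = np.random.default_rng(1)
deep = 0.2 + 0.05 * rng.standard_normal((30, 10))      # deep dips, rms near 0.2 pu
shallow = 0.8 + 0.05 * rng.standard_normal((30, 10))   # shallow dips, rms near 0.8 pu
profiles = np.vstack([deep, shallow])

scores = pca_project(profiles, 2)      # 2-D representation of each disturbance
labels, centers = kmeans(scores, 2)    # representative profile groups
```

The cluster centers play the role of the paper's averaged representative rms-voltage profiles, summarizing many disturbances with a handful of patterns.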

  6. Geographically Sourcing Cocaine’s Origin – Delineation of the Nineteen Major Coca Growing Regions in South America

    PubMed Central

    Mallette, Jennifer R.; Casale, John F.; Jordan, James; Morello, David R.; Beyer, Paul M.

    2016-01-01

    Previously, geo-sourcing to five major coca growing regions within South America was accomplished. However, the expansion of coca cultivation throughout South America made sub-regional origin determinations increasingly difficult. The former methodology was recently enhanced with additional stable isotope analyses (2H and 18O) to fully characterize cocaine due to the varying environmental conditions in which the coca was grown. An improved data analysis method was implemented with the combination of machine learning and multivariate statistical analysis methods to provide further partitioning between growing regions. Here, we show how the combination of trace cocaine alkaloids, stable isotopes, and multivariate statistical analyses can be used to classify illicit cocaine as originating from one of 19 growing regions within South America. The data obtained through this approach can be used to describe current coca cultivation and production trends, highlight trafficking routes, as well as identify new coca growing regions. PMID:27006288

  7. Statistical methodology: II. Reliability and validity assessment in study design, Part B.

    PubMed

    Karras, D J

    1997-02-01

    Validity measures the correspondence between a test and other purported measures of the same or similar qualities. When a reference standard exists, a criterion-based validity coefficient can be calculated. If no such standard is available, the concepts of content and construct validity may be used, but quantitative analysis may not be possible. The Pearson and Spearman tests of correlation are often used to assess the correspondence between tests, but do not account for measurement biases and may yield misleading results. Techniques that measure intertest differences may be more meaningful in validity assessment, and the kappa statistic is useful for analyzing categorical variables. Questionnaires often can be designed to allow quantitative assessment of reliability and validity, although this may be difficult. Inclusion of homogeneous questions is necessary to assess reliability. Analysis is enhanced by using Likert scales or similar techniques that yield ordinal data. Validity assessment of questionnaires requires careful definition of the scope of the test and comparison with previously validated tools.
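
    The kappa statistic mentioned for categorical variables corrects the observed agreement between two ratings for the agreement expected by chance alone. A minimal sketch of Cohen's kappa with hypothetical ratings:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two categorical ratings of the same items:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum(ca[c] * cb[c] for c in set(ca) | set(cb)) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical categorical ratings of ten items by two tests
a = ["pos", "pos", "neg", "neg", "pos", "neg", "pos", "neg", "pos", "pos"]
b = ["pos", "pos", "neg", "pos", "pos", "neg", "pos", "neg", "neg", "pos"]
kappa = cohens_kappa(a, b)
```

Here the raw agreement is 80%, but kappa is noticeably lower because two raters calling most items "pos" would agree often by chance alone.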

  8. Echinococcus vogeli Rausch and Bernstein, 1972, from the paca, Cuniculus paca L. (Rodentia: Dasyproctidae), in the Departamento de Santa Cruz, Bolivia.

    PubMed

    Gardner, S L; Rausch, R L; Camacho, O C

    1988-06-01

    Among approximately 2,000 mammals examined for helminths in various regions of Bolivia during 1983-1987, cysts of Echinococcus vogeli Rausch and Bernstein, 1972, were found in a single paca, Cuniculus paca L., collected at La Laguna, Departamento de Santa Cruz (lat. 16 degrees 36'S; long. 62 degrees 42'W). This record, the first from Bolivia, represents a considerable extension of the known geographic range of this species in South America. Upon analysis of the morphologic characteristics of the protoscoleces derived from the cysts, the sizes of rostellar hooks from the material from the paca were found to be well within the ranges reported in previous studies. Statistical analysis of frequency distributions of hook characteristics revealed some deviations from normality. These results indicate that parametric statistics should be applied with caution in analyses of inter- and intraspecific variation of morphologic characteristics of hooks of metacestodes of the genus Echinococcus.

  9. Longevity of metal-ceramic crowns cemented with self-adhesive resin cement: a prospective clinical study

    PubMed

    Brondani, Lucas Pradebon; Pereira-Cenci, Tatiana; Wandsher, Vinicius Felipe; Pereira, Gabriel Kalil; Valandro, Luis Felipe; Bergoli, César Dalmolin

    2017-04-10

    Resin cements are often used for single-crown cementation due to their physical properties. Self-adhesive resin cements have gained widespread use owing to their simplified technique compared with regular resin cements. However, clinical evidence about the long-term behavior of this material is lacking. The aim of this prospective clinical trial was to assess the survival rates of metal-ceramic crowns cemented with self-adhesive resin cement for up to six years. One hundred and twenty-nine subjects received 152 metal-ceramic crowns. The cementation procedures were standardized and performed by previously trained operators. The crowns were assessed for the primary outcome (debonding) and according to FDI criteria. Statistical analysis was performed using Kaplan-Meier statistics and descriptive analysis. Three failures (debondings) occurred, resulting in a 97.6% survival rate. FDI criteria assessment resulted in scores 1 and 2 (clinically acceptable) for all surviving crowns. The use of self-adhesive resin cement is a feasible alternative for metal-ceramic crown cementation, achieving high and adequate survival rates.
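
    The Kaplan-Meier statistics used in such a trial can be sketched directly: at each failure time, the survival estimate is multiplied by the fraction of at-risk units that survive, while censored observations only shrink the risk set. The follow-up data below are illustrative, not the trial's (which reported a 97.6% survival rate over 152 crowns):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.
    times: follow-up time per unit (years); events: 1 = failure, 0 = censored.
    Returns (time, survival) pairs at each failure time."""
    pairs = sorted(zip(times, events))   # censored ties sort before failures
    n_at_risk = len(pairs)
    curve, s = [], 1.0
    for t, e in pairs:
        if e:
            s *= (n_at_risk - 1) / n_at_risk   # step down at each failure
            curve.append((t, s))
        n_at_risk -= 1                         # every observation leaves the risk set
    return curve

# Hypothetical follow-up of 10 crowns: failures (debonding) at 1.5, 3.1, 4.2 years
times = [0.8, 1.5, 2.0, 2.0, 3.1, 3.5, 4.0, 4.2, 5.0, 6.0]
events = [0, 1, 0, 0, 1, 0, 0, 1, 0, 0]
curve = kaplan_meier(times, events)
```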

  10. Rainfall Threshold Assessment Corresponding to the Maximum Allowable Turbidity for Source Water.

    PubMed

    Fan, Shu-Kai S; Kuan, Wen-Hui; Fan, Chihhao; Chen, Chiu-Yang

    2016-12-01

      This study aims to assess the upstream rainfall thresholds corresponding to the maximum allowable turbidity of source water, using monitoring data and artificial neural network computation. The Taipei Water Source Domain was selected as the study area, and the upstream rainfall records were collected for statistical analysis. Using analysis of variance (ANOVA), the cumulative rainfall records of one-day Ping-lin, two-day Ping-lin, two-day Tong-hou, one-day Guie-shan, and one-day Tai-ping (rainfall in the previous 24 or 48 hours at the named weather stations) were found to be the five most significant parameters for downstream turbidity development. An artificial neural network model was constructed to predict the downstream turbidity in the area investigated. The observed and model-calculated turbidity data were applied to assess the rainfall thresholds in the studied area. By setting preselected turbidity criteria, the upstream rainfall thresholds for these statistically determined rain gauge stations were calculated.
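
    Once a rainfall-to-turbidity model has been trained, the rainfall threshold for a maximum allowable turbidity is found by inverting the model. A minimal sketch via bisection, with a toy exponential curve standing in for the trained artificial neural network (all numbers are illustrative):

```python
def rainfall_threshold(turbidity_model, max_turbidity, lo=0.0, hi=500.0, tol=0.01):
    """Invert a fitted monotone rainfall -> turbidity model by bisection to find
    the cumulative-rainfall threshold at which turbidity reaches the allowable
    maximum. Assumes turbidity increases with rainfall on [lo, hi]."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if turbidity_model(mid) < max_turbidity:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Stand-in for a trained ANN: an illustrative exponential response (NTU vs. mm)
model = lambda rain_mm: 5.0 * (1.025 ** rain_mm)
threshold = rainfall_threshold(model, max_turbidity=250.0)
```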

  11. Some Interesting Applications of Probabilistic Techniques in Structural Dynamic Analysis of Rocket Engines

    NASA Technical Reports Server (NTRS)

    Brown, Andrew M.

    2014-01-01

    Numerical and analytical methods were developed to determine damage accumulation in specific engine components when speed variation is included. The Dither Life Ratio (DLR) is shown to be well over a factor of 2 for a specific example. The steady-state assumption is shown to be accurate for most turbopump cases, allowing rapid calculation of DLR. If hot-fire speed data are unknown, a Monte Carlo method was developed that uses speed statistics for similar engines. Application of these techniques allows the analyst to reduce both uncertainty and excess conservatism. High values of DLR could allow a previously unacceptable part to pass HCF criteria without redesign. Given the benefit and ease of implementation, it is recommended that any finite-life turbomachine component analysis adopt these techniques. Probability values were calculated, compared, and evaluated for several industry-proposed methods for combining random and harmonic loads. Two new Excel macros were written to calculate the combined load for any specific probability level. Closed-form curve fits were generated for the widely used 3(sigma) and 2(sigma) probability levels. For the design of lightweight aerospace components, obtaining an accurate, reproducible, statistically meaningful answer is critical.
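
    A Monte Carlo combination of random and harmonic loads can be sketched by sampling a Gaussian random component plus a harmonic of uniformly random phase and reading off the 3-sigma-equivalent (99.865th) percentile of the sum. The amplitudes below are illustrative, and this sketch is not any of the specific industry-proposed methods evaluated in the work:

```python
import math
import random

def combined_load_p3sigma(sigma_random, harmonic_amp, n=200_000, seed=42):
    """Monte Carlo estimate of the 3-sigma-equivalent (99.865th percentile)
    combined load: Gaussian random part plus a harmonic with random phase."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        r = rng.gauss(0.0, sigma_random)                     # random vibration part
        h = harmonic_amp * math.sin(2 * math.pi * rng.random())  # random-phase harmonic
        samples.append(r + h)
    samples.sort()
    return samples[int(0.99865 * n)]

p3 = combined_load_p3sigma(sigma_random=1.0, harmonic_amp=2.0)
```

Note the result is below the naive root-sum-square Gaussian value (about 5.2 here), because the bounded harmonic has a lighter tail than a Gaussian of the same variance; this is exactly the kind of conservatism the probabilistic treatment can quantify.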

  12. Effect of environment and genotype on commercial maize hybrids using LC/MS-based metabolomics.

    PubMed

    Baniasadi, Hamid; Vlahakis, Chris; Hazebroek, Jan; Zhong, Cathy; Asiago, Vincent

    2014-02-12

    We recently applied gas chromatography coupled to time-of-flight mass spectrometry (GC/TOF-MS) and multivariate statistical analysis to measure biological variation of many metabolites due to environment and genotype in forage and grain samples collected from 50 genetically diverse nongenetically modified (non-GM) DuPont Pioneer commercial maize hybrids grown at six North American locations. In the present study, the metabolome coverage was extended using a core subset of these grain and forage samples employing ultra high pressure liquid chromatography (uHPLC) mass spectrometry (LC/MS). A total of 286 and 857 metabolites were detected in grain and forage samples, respectively, using LC/MS. Multivariate statistical analysis was utilized to compare and correlate the metabolite profiles. Environment had a greater effect on the metabolome than genetic background. The results of this study support and extend previously published insights into the environmental and genetic associated perturbations to the metabolome that are not associated with transgenic modification.

  13. Statistical Diversions

    ERIC Educational Resources Information Center

    Petocz, Peter; Sowey, Eric

    2008-01-01

    As a branch of knowledge, Statistics is ubiquitous and its applications can be found in (almost) every field of human endeavour. In this article, the authors track down the possible source of the link between the "Siren song" and applications of Statistics. Answers to their previous five questions and five new questions on Statistics are presented.

  14. Some Variables in Relation to Students' Anxiety in Learning Statistics.

    ERIC Educational Resources Information Center

    Sutarso, Toto

    The purpose of this study was to investigate some variables that relate to students' anxiety in learning statistics. The variables included sex, class level, students' achievement, school, mathematical background, previous statistics courses, and race. The instrument used was the 24-item Students' Attitudes Toward Statistics (STATS), which was…

  15. An experimental validation of a statistical-based damage detection approach.

    DOT National Transportation Integrated Search

    2011-01-01

    In this work, a previously developed, statistics-based damage-detection approach was validated for its ability to autonomously detect damage in bridges. The damage-detection approach uses statistical differences in the actual and predicted beha...

  16. Real-time Raman spectroscopy for automatic in vivo skin cancer detection: an independent validation.

    PubMed

    Zhao, Jianhua; Lui, Harvey; Kalia, Sunil; Zeng, Haishan

    2015-11-01

    In a recent study, we demonstrated that real-time Raman spectroscopy could be used for skin cancer diagnosis. As a translational study, the objective of this work was to validate those findings through a completely independent clinical test. In total, 645 confirmed cases were included in the analysis: a cohort of 518 cases from the previous study and an independent cohort of 127 new cases. Multivariate statistical analyses, including principal component analysis with general discriminant analysis (PC-GDA) and partial least squares (PLS), were used separately for lesion classification and generated similar results. When the previous cohort (n = 518) was used for training and the new cohort (n = 127) for testing, the area under the receiver operating characteristic curve (ROC AUC) was 0.889 (95% CI 0.834-0.944; PLS); when the two cohorts were combined, the ROC AUC was 0.894 (95% CI 0.870-0.918; PLS), with the narrowest confidence intervals. Both analyses were comparable to the previous findings, where the ROC AUC was 0.896 (95% CI 0.846-0.946; PLS). The independent study validates that real-time Raman spectroscopy can be used for automatic in vivo skin cancer diagnosis with good accuracy.
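
The ROC AUC figure of merit reported above can be computed directly from classifier scores via the Mann-Whitney rank formulation; a minimal sketch with made-up scores (not the study's Raman data):

```python
def roc_auc(scores_pos, scores_neg):
    """ROC AUC via the Mann-Whitney U statistic: the probability that a
    randomly chosen positive case scores higher than a randomly chosen
    negative case, counting ties as one half."""
    wins = 0.0
    for p in scores_pos:
        for q in scores_neg:
            if p > q:
                wins += 1.0
            elif p == q:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical classifier scores for lesion (positive) and benign (negative) cases
auc = roc_auc([0.9, 0.8, 0.7, 0.4], [0.6, 0.3, 0.2, 0.1])  # -> 0.9375
```

An AUC of 0.5 corresponds to chance-level discrimination and 1.0 to perfect separation, which is what makes the 0.889-0.896 values above directly comparable across cohorts.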

  17. Power-up: A Reanalysis of 'Power Failure' in Neuroscience Using Mixture Modeling

    PubMed Central

    Wood, John

    2017-01-01

    Recently, evidence for endemically low statistical power has cast neuroscience findings into doubt. If low statistical power plagues neuroscience, then this reduces confidence in the reported effects. However, if statistical power is not uniformly low, then such blanket mistrust might not be warranted. Here, we provide a different perspective on this issue, analyzing data from an influential study reporting a median power of 21% across 49 meta-analyses (Button et al., 2013). We demonstrate, using Gaussian mixture modeling, that the sample of 730 studies included in that analysis comprises several subcomponents so the use of a single summary statistic is insufficient to characterize the nature of the distribution. We find that statistical power is extremely low for studies included in meta-analyses that reported a null result and that it varies substantially across subfields of neuroscience, with particularly low power in candidate gene association studies. Therefore, whereas power in neuroscience remains a critical issue, the notion that studies are systematically underpowered is not the full story: low power is far from a universal problem. SIGNIFICANCE STATEMENT Recently, researchers across the biomedical and psychological sciences have become concerned with the reliability of results. One marker for reliability is statistical power: the probability of finding a statistically significant result given that the effect exists. Previous evidence suggests that statistical power is low across the field of neuroscience. Our results present a more comprehensive picture of statistical power in neuroscience: on average, studies are indeed underpowered—some very seriously so—but many studies show acceptable or even exemplary statistical power. We show that this heterogeneity in statistical power is common across most subfields in neuroscience. 
This new, more nuanced picture of statistical power in neuroscience could affect not only scientific understanding, but potentially policy and funding decisions for neuroscience research. PMID:28706080
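
The Gaussian mixture modeling used in this reanalysis can be sketched with a plain expectation-maximization (EM) loop for a two-component, one-dimensional mixture; the "power" values below are synthetic, not the 730 studies analyzed in the paper.

```python
import math
import random

def fit_two_gaussians(xs, iters=200):
    """Fit a two-component 1-D Gaussian mixture by EM; returns
    (weight of component 1, mean1, sd1, mean2, sd2)."""
    xs = sorted(xs)
    n = len(xs)
    # Crude initialization: split the sorted sample in half
    m1 = sum(xs[: n // 2]) / (n // 2)
    m2 = sum(xs[n // 2 :]) / (n - n // 2)
    s1 = s2 = (xs[-1] - xs[0]) / 4 or 1.0
    w = 0.5
    for _ in range(iters):
        # E-step: responsibility of component 1 for each point
        r = []
        for x in xs:
            p1 = w * math.exp(-0.5 * ((x - m1) / s1) ** 2) / s1
            p2 = (1 - w) * math.exp(-0.5 * ((x - m2) / s2) ** 2) / s2
            r.append(p1 / (p1 + p2))
        # M-step: re-estimate weight, means, and standard deviations
        n1 = sum(r)
        n2 = n - n1
        w = n1 / n
        m1 = sum(ri * x for ri, x in zip(r, xs)) / n1
        m2 = sum((1 - ri) * x for ri, x in zip(r, xs)) / n2
        s1 = max(math.sqrt(sum(ri * (x - m1) ** 2 for ri, x in zip(r, xs)) / n1), 1e-6)
        s2 = max(math.sqrt(sum((1 - ri) * (x - m2) ** 2 for ri, x in zip(r, xs)) / n2), 1e-6)
    return w, m1, s1, m2, s2

# Synthetic "statistical power" values: a low-power and a high-power subgroup
rng = random.Random(0)
data = [min(max(rng.gauss(0.15, 0.05), 0.01), 0.99) for _ in range(300)]
data += [min(max(rng.gauss(0.85, 0.05), 0.01), 0.99) for _ in range(200)]
w, m1, s1, m2, s2 = fit_two_gaussians(data)
```

Recovering distinct subcomponent means like these is exactly why a single summary statistic (a median of 21%) can mischaracterize a heterogeneous power distribution.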

  18. Leasing Into the Sun: A Mixed Method Analysis of Transactions of Homes with Third Party Owned Solar

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoen, Ben; Rand, Joseph; Adomatis, Sandra

    This analysis is the first to examine whether homes with third-party owned (TPO) PV systems are unique in the marketplace compared with non-PV or non-TPO PV homes. This question is of growing importance, as the number of US homes with TPO systems is currently nearly half a million and growing. A hedonic pricing model analysis of 20,106 homes that sold in California between 2011 and 2013 was conducted, along with a paired sales analysis of 18 pairs of TPO PV and non-PV homes in San Diego spanning 2012 and 2013. The hedonic model examined 2,914 non-TPO PV home sales and 113 TPO PV sales and fails to uncover statistically significant premiums for TPO PV homes, or for those with prepaid leases, as compared to non-PV homes. Similarly, the paired sales analysis does not find evidence of an impact on value for the TPO homes when compared to non-PV homes. Analyses of non-TPO PV sales, both here and previously, have found larger and statistically significant premiums. Collection of a larger dataset that covers the present period is recommended for future analyses so that smaller, more nuanced, and more recent effects can be discovered.

  19. New subtype of familial achondrogenesis type IA (Houston-Harris).

    PubMed

    Ramírez-García, Sergio Alberto; García-Cruz, Diana; Cervantes-Aragón, Iván; Bitar-Alatorre, Wadih Emilio; Dávalos-Rodríguez, Ingrid Patricia; Dávalos-Rodríguez, Nory Omayra; Corona-Rivera, Jorge Román; Sánchez-Corona, José

    2018-01-01

    Achondrogenesis is a skeletal dysplasia characterized primarily by short stature, severe micromelia, short and narrow chest, prematurity, polyhydramnios, fetal hydrops, and in utero or neonatal death. Based on the radiological and histopathological findings, there are three types of achondrogenesis: type 1A (Houston-Harris), type 1B (Fraccaro) and type 2 (Langer-Saldino). A premature female product was studied whose clinical, radiological and histopathological characteristics were compatible with achondrogenesis type 1A. The family information allowed us to conclude that 4 of the products of the 6 previous pregnancies were affected. Statistical analysis of at least 4 previously described families, including the present case, showed significant differences between the expected and observed numbers of affected members, which is inconsistent with the autosomal recessive mode of inheritance previously reported; therefore, it could be considered a new subtype of achondrogenesis type 1A due to the presence of a preferential germline mutation. Copyright: © 2018 Permanyer.

  20. Use of a statistical model of the whole femur in a large scale, multi-model study of femoral neck fracture risk.

    PubMed

    Bryan, Rebecca; Nair, Prasanth B; Taylor, Mark

    2009-09-18

    Interpatient variability is often overlooked in orthopaedic computational studies due to the substantial challenges involved in sourcing and generating large numbers of bone models. A statistical model of the whole femur incorporating both geometric and material property variation was developed as a potential solution to this problem. The statistical model was constructed using principal component analysis, applied to 21 individual computed tomography scans. To test the ability of the statistical model to generate realistic, unique, finite element (FE) femur models, it was used as a source of 1000 femurs to drive a study on femoral neck fracture risk. The study simulated the impact of an oblique fall to the side, a scenario known to account for a large proportion of hip fractures in the elderly and to have a lower fracture load than alternative loading approaches. FE model generation, application of subject-specific loading and boundary conditions, FE processing and post-processing of the solutions were completed automatically. The generated models were within the bounds of the training data used to create the statistical model, with a mesh quality high enough to be used directly by the FE solver without remeshing. The results indicated that 28 of the 1000 femurs were at highest risk of fracture. Closer analysis revealed the percentage of cortical bone in the proximal femur to be a crucial differentiator between the failed and non-failed groups. The likely fracture location was indicated to be intertrochanteric. Comparison to previous computational, clinical and experimental work revealed support for these findings.

  1. Using SERVQUAL and Kano research techniques in a patient service quality survey.

    PubMed

    Christoglou, Konstantinos; Vassiliadis, Chris; Sigalas, Ioakim

    2006-01-01

    This article presents the results of a service quality study. After an introduction to the SERVQUAL and the Kano research techniques, a Kano analysis of 75 patients from the General Hospital of Katerini in Greece is presented. The service quality criterion used satisfaction and dissatisfaction indices. The Kano statistical analysis process results strengthened the hypothesis of previous research regarding the importance of personal knowledge, the courtesy of the hospital employees and their ability to convey trust and confidence (assurance dimension). Managerial suggestions are made regarding the best way of acting and approaching hospital patients based on the basic SERVQUAL model.

  2. Validating MEDIQUAL Constructs

    NASA Astrophysics Data System (ADS)

    Lee, Sang-Gun; Min, Jae H.

    In this paper, we validate MEDIQUAL constructs across different media users in help desk service. Previous research used only two end-user constructs: assurance and responsiveness. Here, we extend the MEDIQUAL constructs to include reliability, empathy, assurance, tangibles, and responsiveness, based on SERVQUAL theory. The results suggest that: 1) the five MEDIQUAL constructs are validated through factor analysis; that is, measures of the same construct obtained by different methods correlate relatively highly, while measures of constructs expected to differ correlate weakly; and 2) regression analysis shows that all five MEDIQUAL constructs have statistically significant effects on media users' satisfaction in help desk service.

  3. STS-121/Discovery: Imagery Quick-Look Briefing

    NASA Technical Reports Server (NTRS)

    2006-01-01

    Kyle Herring (NASA Public Affairs) introduced Wayne Hale (Space Shuttle Program Manager), who stated that the imagery for the Space Shuttle external tank showed the tank performed very well. Image analysis showed small pieces of foam falling off the rocket booster and external tank; there was no risk involved in these minor incidents. Statistical models were built to assist in risk analysis. The orbiter performed excellently. Hale also provided some close-up pictures of small pieces of foam separating from the external tank during launch. He said the crew would also perform a 100% inspection of the heat shield. This flight showed great improvement over previous flights.

  4. Association Testing of Previously Reported Variants in a Large Case-Control Meta-analysis of Diabetic Nephropathy

    PubMed Central

    Williams, Winfred W.; Salem, Rany M.; McKnight, Amy Jayne; Sandholm, Niina; Forsblom, Carol; Taylor, Andrew; Guiducci, Candace; McAteer, Jarred B.; McKay, Gareth J.; Isakova, Tamara; Brennan, Eoin P.; Sadlier, Denise M.; Palmer, Cameron; Söderlund, Jenny; Fagerholm, Emma; Harjutsalo, Valma; Lithovius, Raija; Gordin, Daniel; Hietala, Kustaa; Kytö, Janne; Parkkonen, Maija; Rosengård-Bärlund, Milla; Thorn, Lena; Syreeni, Anna; Tolonen, Nina; Saraheimo, Markku; Wadén, Johan; Pitkäniemi, Janne; Sarti, Cinzia; Tuomilehto, Jaakko; Tryggvason, Karl; Österholm, Anne-May; He, Bing; Bain, Steve; Martin, Finian; Godson, Catherine; Hirschhorn, Joel N.; Maxwell, Alexander P.; Groop, Per-Henrik; Florez, Jose C.

    2012-01-01

    We formed the GEnetics of Nephropathy–an International Effort (GENIE) consortium to examine previously reported genetic associations with diabetic nephropathy (DN) in type 1 diabetes. GENIE consists of 6,366 similarly ascertained participants of European ancestry with type 1 diabetes, with and without DN, from the All Ireland-Warren 3-Genetics of Kidneys in Diabetes U.K. and Republic of Ireland (U.K.-R.O.I.) collection and the Finnish Diabetic Nephropathy Study (FinnDiane), combined with reanalyzed data from the Genetics of Kidneys in Diabetes U.S. Study (U.S. GoKinD). We found little evidence for the association of the EPO promoter polymorphism, rs161740, with the combined phenotype of proliferative retinopathy and end-stage renal disease in U.K.-R.O.I. (odds ratio [OR] 1.14, P = 0.19) or FinnDiane (OR 1.06, P = 0.60). However, a fixed-effects meta-analysis that included the previously reported cohorts retained a genome-wide significant association with that phenotype (OR 1.31, P = 2 × 10−9). An expanded investigation of the ELMO1 locus and genetic regions reported to be associated with DN in the U.S. GoKinD yielded only nominal statistical significance for these loci. Finally, top candidates identified in a recent meta-analysis failed to reach genome-wide significance. In conclusion, we were unable to replicate most of the previously reported genetic associations for DN, and significance for the EPO promoter association was attenuated. PMID:22721967

  5. Measurement of the W charge asymmetry in production with jets using 5 inverse-femtobarns of data measured at center of mass energy = 7TeV with CMS

    NASA Astrophysics Data System (ADS)

    Lawson, Philip Daniel

    A measurement of the electron charge asymmetry in p+p→W production in association with jets at √s = 7 TeV is presented. The dataset corresponds to an integrated luminosity of L = 5 fb⁻¹ recorded by the CMS detector in proton-proton collisions at the LHC. The sample represents a large increase in statistical precision with respect to previous CMS results and provides a first study of the charge asymmetry measured in p+p→W + 1 jet events. Full comparisons to previous results and theoretical predictions are provided, and recommendations are made for extending the analysis to produce valuable input for future PDF models.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Le, K. C.; Tran, T. M.; Langer, J. S.

    The statistical-thermodynamic dislocation theory developed in previous papers is used here in an analysis of high-temperature deformation of aluminum and steel. Using physics-based parameters that we expect theoretically to be independent of strain rate and temperature, we are able to fit experimental stress-strain curves for three different strain rates and three different temperatures for each of these two materials. Here, our theoretical curves include yielding transitions at zero strain in agreement with experiment. We find that thermal softening effects are important even at the lowest temperatures and smallest strain rates.

  7. An Analysis of the Impact of Job Search Behaviors on Air Force Company Grade Officer Turnover

    DTIC Science & Technology

    2012-03-01

    pilot tested on Air Force CGOs. Participants were given the definitions of passive job search and active job search used in this research effort, and... identifying these different groups and testing the modified model separately within each could yield more accuracy in predicting turnover. This research... the model the same way. Use of the pseudo R², the reported statistics, and the table design were done in the same manner as previous research

  8. Orientation of Hittite Monuments

    NASA Astrophysics Data System (ADS)

    González-García, A. César; Belmonte, Juan Antonio

    The possible astronomical or topographical orientations of the Hittite monuments of the Bronze Age has remained unexplored until recently. This would provide an important insight into how temporality was imprinted by this culture in sacred spaces and in the landscape. The authors' analysis of a statistically significant sample of Hittite temples - and a few monumental gates - has demonstrated that ancient Hittite monuments were not randomly orientated as previously thought. On the contrary, there were well-defined patterns of orientation that can be interpreted within the context of Hittite culture and religion.

  9. A case-control study of malignant melanoma among Lawrence Livermore National Laboratory employees: A critical evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kupper, L.L.; Setzer, R.W.; Schwartzbaum, J.

    1987-07-01

    This document reports on a reevaluation of data obtained in a previous report on occupational factors associated with the development of malignant melanomas at Lawrence Livermore National Laboratory. The current report reduces the number of these factors from five to three based on a rigorous statistical analysis of the original data. Recommendations include restructuring the original questionnaire and trying to contact more of the individuals who worked with volatile photographic chemicals. 17 refs., 7 figs., 22 tabs. (TEM)

  10. Revisiting the Estimation of Dinosaur Growth Rates

    PubMed Central

    Myhrvold, Nathan P.

    2013-01-01

    Previous growth-rate studies covering 14 dinosaur taxa, as represented by 31 data sets, are critically examined and reanalyzed by using improved statistical techniques. The examination reveals that some previously reported results cannot be replicated by using the methods originally reported; results from new methods are in many cases different, in both the quantitative rates and the qualitative nature of the growth, from results in the prior literature. Asymptotic growth curves, which have been hypothesized to be ubiquitous, are shown to provide best fits for only four of the 14 taxa. Possible reasons for non-asymptotic growth patterns are discussed; they include systematic errors in the age-estimation process and, more likely, a bias toward younger ages among the specimens analyzed. Analysis of the data sets finds that only three taxa include specimens that could be considered skeletally mature (i.e., having attained 90% of maximum body size predicted by asymptotic curve fits), and eleven taxa are quite immature, with the largest specimen having attained less than 62% of predicted asymptotic size. The three taxa that include skeletally mature specimens are included in the four taxa that are best fit by asymptotic curves. The totality of results presented here suggests that previous estimates of both maximum dinosaur growth rates and maximum dinosaur sizes have little statistical support. Suggestions for future research are presented. PMID:24358133

  11. Dentists' attitude to provision of care for people with learning disabilities in Udaipur, India.

    PubMed

    Nagarajappa, Ramesh; Tak, Mridula; Sharda, Archana J; Asawa, Kailash; Jalihal, Sagar; Kakatkar, Gauri

    2013-03-01

    This study determines and compares the attitudes of dentists to the provision of care for people with learning disabilities according to gender, qualification, previous experience of treating patients with learning disabilities and work experience of dentists. A cross-sectional study was conducted among 247 dentists (166 men and 81 women) using a pretested structured questionnaire. This questionnaire assessed the respondents' attitudes towards learning-disabled patients in five categories: beliefs about treating them, their capabilities, discrimination against these patients, their social behaviour and the quality of care they should receive. Information on the dentists' gender, qualification, work experience and previous experience of treating patients with learning disabilities was also collected through the questionnaire. Student's t-test and ANOVA were used for statistical analysis. The mean attitude score was found to be 71.13 ± 8.97. A statistically significant difference was found in the mean attitude scores of dentists by work experience (p = 0.000). Subjects with a postgraduate qualification and previous experience of treating patients with learning disabilities had significantly greater mean attitude scores than their counterparts (p = 0.000). The overall attitude of dentists towards provision of care for people with learning disabilities was favourable, and it increased with higher qualification and past experience. © 2012 The Authors. Scandinavian Journal of Caring Sciences © 2012 Nordic College of Caring Science.

  12. Scaling analysis of Anderson localizing optical fibers

    NASA Astrophysics Data System (ADS)

    Abaie, Behnam; Mafi, Arash

    2017-02-01

    Anderson localizing optical fibers (ALOF) enable a novel optical waveguiding mechanism: if a narrow beam is scanned across the input facet of the disordered fiber, the output beam follows the transverse position of the incoming wave. Strong transverse disorder induces several localized modes uniformly spread across the transverse structure of the fiber. Each localized mode acts like a transmission channel that carries a narrow input beam along the fiber without transverse expansion. Here, we investigate the scaling of the transverse size of the localized modes of ALOF with respect to the transverse dimensions of the fiber. The probability density function (PDF) of the mode area is computed, and it is shown that the PDF converges to a terminal shape at transverse dimensions considerably smaller than those of previous experimental implementations. Our analysis turns the formidable numerical task of ALOF simulations into a much simpler problem, because the convergence of the mode-area PDF to a terminal shape indicates that a much smaller disordered fiber, compared to previous numerical and experimental implementations, provides all the statistical information required for precise analysis of the fiber.

  13. The application of the statistical theory of extreme values to gust-load problems

    NASA Technical Reports Server (NTRS)

    Press, Harry

    1950-01-01

    An analysis is presented which indicates that the statistical theory of extreme values is applicable to the problems of predicting the frequency of encountering the larger gust loads and gust velocities, both for specific test conditions and for commercial transport operations. The extreme-value theory provides an analytic form for the distributions of maximum values of gust load and velocity. Methods of fitting the distribution are given, along with a method of estimating the reliability of the predictions. The theory of extreme values is applied to available load data from commercial transport operations. The results indicate that the estimates of the frequency of encountering the larger loads are more consistent with the data and more reliable than those obtained in previous analyses. (author)
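
The extreme-value fitting described here is classically done with the Gumbel (type I) distribution for maxima; a minimal method-of-moments sketch on illustrative per-flight maxima (not the NACA load data, and with hypothetical parameters):

```python
import math
import random
import statistics

EULER_GAMMA = 0.5772156649015329

def fit_gumbel(maxima):
    """Method-of-moments fit of a Gumbel (extreme-value type I) distribution
    to a sample of per-period maximum loads; returns (location mu, scale beta).
    Gumbel mean = mu + gamma*beta, sd = pi*beta/sqrt(6)."""
    beta = statistics.stdev(maxima) * math.sqrt(6.0) / math.pi
    mu = statistics.mean(maxima) - EULER_GAMMA * beta
    return mu, beta

def return_level(mu, beta, t):
    """Load level expected to be exceeded once in t observation periods,
    i.e. the (1 - 1/t) quantile of the fitted Gumbel distribution."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / t))

# Illustrative per-flight maximum loads: each is the max of 50 Gaussian samples
rng = random.Random(2)
maxima = [max(rng.gauss(0.0, 1.0) for _ in range(50)) for _ in range(500)]
mu, beta = fit_gumbel(maxima)
load_1_in_1000 = return_level(mu, beta, 1000)
```

The return-level extrapolation is the step the abstract refers to: predicting the frequency of loads larger than any yet observed from the analytic form of the extreme-value distribution.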

  14. VizieR Online Data Catalog: HARPS timeseries data for HD41248 (Jenkins+, 2014)

    NASA Astrophysics Data System (ADS)

    Jenkins, J. S.; Tuomi, M.

    2017-05-01

    We modeled the HARPS radial velocities of HD 41248 by adopting the analysis techniques and the statistical model applied in Tuomi et al. (2014, arXiv:1405.2016). This model contains Keplerian signals, a linear trend, a moving average component with exponential smoothing, and linear correlations with activity indices, namely BIS, FWHM, and the chromospheric activity S index. We applied the statistical model outlined above to the full data set of radial velocities for HD 41248, combining the previously published data in Jenkins et al. (2013ApJ...771...41J) with the newly published data in Santos et al. (2014, J/A+A/566/A35), giving rise to a total time series of 223 HARPS (Mayor et al. 2003Msngr.114...20M) velocities. (1 data file).

  15. Statistical Analysis of Complexity Generators for Cost Estimation

    NASA Technical Reports Server (NTRS)

    Rowell, Ginger Holmes

    1999-01-01

    Predicting the cost of cutting edge new technologies involved with spacecraft hardware can be quite complicated. A new feature of the NASA Air Force Cost Model (NAFCOM), called the Complexity Generator, is being developed to model the complexity factors that drive the cost of space hardware. This parametric approach is also designed to account for the differences in cost, based on factors that are unique to each system and subsystem. The cost driver categories included in this model are weight, inheritance from previous missions, technical complexity, and management factors. This paper explains the Complexity Generator framework, the statistical methods used to select the best model within this framework, and the procedures used to find the region of predictability and the prediction intervals for the cost of a mission.

  16. Lightweight and Statistical Techniques for Petascale Debugging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Barton

    2014-06-30

    This project investigated novel techniques for debugging scientific applications on petascale architectures. In particular, we developed lightweight tools that narrow the problem space when bugs are encountered. We also developed techniques that either limit the number of tasks and the code regions to which a developer must apply a traditional debugger, or that apply statistical techniques to provide direct suggestions of the location and type of error. We extended previous work on the Stack Trace Analysis Tool (STAT), which has already demonstrated scalability to over one hundred thousand MPI tasks. We also extended statistical techniques developed to isolate programming errors in widely used sequential or threaded applications in the Cooperative Bug Isolation (CBI) project to large-scale parallel applications. Overall, our research substantially improved productivity on petascale platforms through a tool set for debugging that complements existing commercial tools. Previously, Office of Science application developers relied either on primitive manual debugging techniques based on printf or on tools, such as TotalView, that do not scale beyond a few thousand processors. However, bugs often arise at scale, and substantial effort and computation cycles are wasted either in reproducing the problem in a smaller run that can be analyzed with the traditional tools or in repeated runs at scale that use the primitive techniques. New techniques that work at scale and automate the process of identifying the root cause of errors were needed. These techniques significantly reduced the time spent debugging petascale applications, thus leading to a greater overall amount of time for application scientists to pursue the scientific objectives for which the systems are purchased.
    We developed a new paradigm for debugging at scale: techniques that reduce the debugging scenario to a scale suitable for traditional debuggers, e.g., by narrowing the search for the root-cause analysis to a small set of nodes or by identifying equivalence classes of nodes and sampling our debug targets from them. We implemented these techniques as lightweight tools that work efficiently at the full scale of the target machine. We explored four lightweight debugging refinements: generic classification parameters, such as stack traces; application-specific classification parameters, such as global variables; statistical data acquisition techniques; and machine-learning-based approaches to perform root cause analysis. Work done under this project can be divided into two categories: new algorithms and techniques for scalable debugging, and foundation infrastructure work on our MRNet multicast-reduction framework for scalability and our Dyninst binary analysis and instrumentation toolkits.

  17. SEDIDAT: A BASIC program for the collection and statistical analysis of particle settling velocity data

    NASA Astrophysics Data System (ADS)

    Wright, Robyn; Thornberg, Steven M.

    SEDIDAT is a series of compiled IBM-BASIC (version 2.0) programs that direct the collection, statistical calculation, and graphic presentation of particle settling velocity and equivalent spherical diameter for samples analyzed using the settling tube technique. The programs follow a menu-driven format that is easily understood by students and scientists with little previous computer experience. Settling velocity is measured directly (cm/sec) and also converted into Chi units. Equivalent spherical diameter (reported in Phi units) is calculated using a modified Gibbs equation for different particle densities. Input parameters, such as water temperature, settling distance, particle density, run time, and Phi/Chi interval, are changed easily at operator discretion. Optional output to a dot-matrix printer includes a summary of moment and graphic statistical parameters, a tabulation of individual and cumulative weight percents, a listing of major distribution modes, and cumulative and histogram plots of raw time, settling velocity, Chi, and Phi data.
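
The unit conversion and moment statistics that SEDIDAT automates can be sketched briefly. The Chi-unit definition used below (-log2 of settling velocity in cm/s, by analogy with Phi units for diameter) and the bin data are assumptions for illustration; the modified Gibbs equation itself is not reproduced here.

```python
import math

def to_chi(velocity_cm_per_s):
    """Convert settling velocity to Chi units, taken here as -log2 of the
    velocity in cm/s (analogous to Phi units for grain diameter in mm)."""
    return -math.log2(velocity_cm_per_s)

def moment_stats(values, weights):
    """Weight-percent-based moment statistics of a grain-size distribution:
    mean, sorting (standard deviation), and skewness."""
    total = sum(weights)
    mean = sum(w * v for v, w in zip(values, weights)) / total
    var = sum(w * (v - mean) ** 2 for v, w in zip(values, weights)) / total
    sd = math.sqrt(var)
    skew = sum(w * (v - mean) ** 3 for v, w in zip(values, weights)) / (total * sd ** 3)
    return mean, sd, skew

# Hypothetical settling-velocity bins (cm/s) and their weight percents
velocities = [8.0, 4.0, 2.0, 1.0, 0.5]
weights = [5.0, 20.0, 50.0, 20.0, 5.0]
chi = [to_chi(v) for v in velocities]  # [-3.0, -2.0, -1.0, 0.0, 1.0]
mean_chi, sorting, skewness = moment_stats(chi, weights)
```

Because the hypothetical distribution is symmetric about 2 cm/s, the moment skewness comes out as zero, which is the kind of sanity check the tabulated output makes easy.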

  18. A procedure for combining acoustically induced and mechanically induced loads (first passage failure design criterion)

    NASA Technical Reports Server (NTRS)

    Crowe, D. R.; Henricks, W.

    1983-01-01

    The combined load statistics are developed by taking the acoustically induced load to be a random population, assumed to be stationary. Each element of this ensemble of acoustically induced loads is assumed to have the same power spectral density (PSD), obtained previously from a random response analysis employing the given acoustic field in the STS cargo bay as a stationary random excitation. The mechanically induced load is treated as either (1) a known deterministic transient, or (2) a nonstationary random variable of known first and second statistical moments which vary with time. A method is then shown for determining the probability that the combined load would, at any time, have a value equal to or less than a certain level. Having obtained a statistical representation of how the acoustic and mechanical loads are expected to combine, an analytical approximation for defining design levels for these loads is presented using the First Passage failure criterion.

  19. Breast cancer statistics and prediction methodology: a systematic review and analysis.

    PubMed

    Dubey, Ashutosh Kumar; Gupta, Umesh; Jain, Sonal

    2015-01-01

    Breast cancer is a menacing cancer that primarily affects women. Research into detecting breast cancer at an early stage is ongoing, as the possibility of a cure in the early stages is high. This study has two main objectives: first, to establish statistics for breast cancer, and second, to identify methodologies that can be helpful in early-stage detection of breast cancer based on previous studies. Breast cancer statistics for incidence and mortality in the UK, US, India, and Egypt were considered for this study. The findings show that the overall mortality rates of the UK and US have improved because of awareness, improved medical technology, and screening, but in India and Egypt the situation is less positive because of a lack of awareness. The methodological findings of this study suggest a combined framework based on data mining and evolutionary algorithms. It provides a strong bridge toward improving the classification and detection accuracy of breast cancer data.

  20. Descriptive Statistics for Modern Test Score Distributions: Skewness, Kurtosis, Discreteness, and Ceiling Effects.

    PubMed

    Ho, Andrew D; Yu, Carol C

    2015-06-01

Many statistical analyses benefit from the assumption that unconditional or conditional distributions are continuous and normal. More than 50 years ago in this journal, Lord and Cook chronicled departures from normality in educational tests, and Micceri similarly showed that the normality assumption is met rarely in educational and psychological practice. In this article, the authors extend these previous analyses to state-level educational test score distributions that are an increasingly common target of high-stakes analysis and interpretation. Among 504 scale-score and raw-score distributions from state testing programs from recent years, nonnormal distributions are common and are often associated with particular state programs. The authors explain how scaling procedures from item response theory lead to nonnormal distributions as well as unusual patterns of discreteness. The authors recommend that distributional descriptive statistics be calculated routinely to inform model selection for large-scale test score data, and they illustrate consequences of nonnormality using sensitivity studies that compare baseline results to those from normalized score scales.
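The routine distributional descriptives the authors recommend are straightforward to compute. A minimal sketch using the standardized-moment definitions (data hypothetical):

```python
import numpy as np

def describe_shape(scores):
    """Return skewness and excess kurtosis of a score distribution."""
    x = np.asarray(scores, dtype=float)
    z = (x - x.mean()) / x.std()          # standardize (population SD)
    skew = np.mean(z ** 3)                # third standardized moment
    kurt = np.mean(z ** 4) - 3.0          # fourth moment minus 3 (normal -> 0)
    return skew, kurt

# A normal-looking sample has skewness and excess kurtosis near 0;
# a ceiling-affected distribution is left-skewed (negative skewness).
rng = np.random.default_rng(0)
normal = rng.normal(size=100_000)
ceiling = np.minimum(rng.normal(size=100_000), 1.0)   # hard ceiling at +1
print(describe_shape(normal))
print(describe_shape(ceiling))
```

A markedly negative skewness is the typical signature of the ceiling effects the article documents.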

  1. A Statistical Analysis of Reviewer Agreement and Bias in Evaluating Medical Abstracts 1

    PubMed Central

    Cicchetti, Domenic V.; Conn, Harold O.

    1976-01-01

    Observer variability affects virtually all aspects of clinical medicine and investigation. One important aspect, not previously examined, is the selection of abstracts for presentation at national medical meetings. In the present study, 109 abstracts, submitted to the American Association for the Study of Liver Disease, were evaluated by three “blind” reviewers for originality, design-execution, importance, and overall scientific merit. Of the 77 abstracts rated for all parameters by all observers, interobserver agreement ranged between 81 and 88%. However, corresponding intraclass correlations varied between 0.16 (approaching statistical significance) and 0.37 (p < 0.01). Specific tests of systematic differences in scoring revealed statistically significant levels of observer bias on most of the abstract components. Moreover, the mean differences in interobserver ratings were quite small compared to the standard deviations of these differences. These results emphasize the importance of evaluating the simple percentage of rater agreement within the broader context of observer variability and systematic bias. PMID:997596

  2. Design and analysis of randomized clinical trials requiring prolonged observation of each patient. II. analysis and examples.

    PubMed Central

    Peto, R.; Pike, M. C.; Armitage, P.; Breslow, N. E.; Cox, D. R.; Howard, S. V.; Mantel, N.; McPherson, K.; Peto, J.; Smith, P. G.

    1977-01-01

    Part I of this report appeared in the previous issue (Br. J. Cancer (1976) 34,585), and discussed the design of randomized clinical trials. Part II now describes efficient methods of analysis of randomized clinical trials in which we wish to compare the duration of survival (or the time until some other untoward event first occurs) among different groups of patients. It is intended to enable physicians without statistical training either to analyse such data themselves using life tables, the logrank test and retrospective stratification, or, when such analyses are presented, to appreciate them more critically, but the discussion may also be of interest to statisticians who have not yet specialized in clinical trial analyses. PMID:831755

  3. Improved Bond Equations for Fiber-Reinforced Polymer Bars in Concrete.

    PubMed

    Pour, Sadaf Moallemi; Alam, M Shahria; Milani, Abbas S

    2016-08-30

This paper explores a set of new equations to predict the bond strength between fiber-reinforced polymer (FRP) rebar and concrete. The proposed equations are based on a comprehensive statistical analysis of existing experimental results in the literature. Specifically, the parameters with the greatest effect on the bond behavior of FRP-reinforced concrete were first identified by applying a factorial analysis to part of the available database. The database, which contains 250 pullout tests, was then divided into four groups based on concrete compressive strength and rebar surface. Afterward, nonlinear regression analysis was performed for each group in order to determine the bond equations. The results show that the proposed equations predict bond strengths more accurately than previously reported models.
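The abstract does not give the fitted equations, but the regression step can be sketched with a hypothetical power-law bond model (coefficients, variable names, and value ranges invented for illustration), linearized with logarithms so ordinary least squares applies:

```python
import numpy as np

# Hypothetical power-law bond model: tau = a * fc**b * db**c
# (fc = concrete compressive strength, db = bar diameter; the paper's
# actual functional forms are not given in the abstract).
rng = np.random.default_rng(1)
fc = rng.uniform(20, 60, 200)        # MPa
db = rng.uniform(8, 25, 200)         # mm
true_a, true_b, true_c = 2.5, 0.5, -0.3
tau = true_a * fc**true_b * db**true_c * rng.lognormal(0, 0.05, 200)

# Taking logs turns the model into a linear regression:
# log tau = log a + b * log fc + c * log db
X = np.column_stack([np.ones_like(fc), np.log(fc), np.log(db)])
coef, *_ = np.linalg.lstsq(X, np.log(tau), rcond=None)
a_hat, b_hat, c_hat = np.exp(coef[0]), coef[1], coef[2]
print(a_hat, b_hat, c_hat)
```

In practice one would fit the paper's actual functional forms, separately per group, against the pullout-test database.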

  4. Unsupervised Outlier Profile Analysis

    PubMed Central

    Ghosh, Debashis; Li, Song

    2014-01-01

    In much of the analysis of high-throughput genomic data, “interesting” genes have been selected based on assessment of differential expression between two groups or generalizations thereof. Most of the literature focuses on changes in mean expression or the entire distribution. In this article, we explore the use of C(α) tests, which have been applied in other genomic data settings. Their use for the outlier expression problem, in particular with continuous data, is problematic but nevertheless motivates new statistics that give an unsupervised analog to previously developed outlier profile analysis approaches. Some simulation studies are used to evaluate the proposal. A bivariate extension is described that can accommodate data from two platforms on matched samples. The proposed methods are applied to data from a prostate cancer study. PMID:25452686

  5. Fostering Change in College Students' Statistical Reasoning and Motivation through Statistical Investigation

    ERIC Educational Resources Information Center

    Ramirez-Faghih, Caroline Ann

    2012-01-01

    The goal of this study was to examine the reciprocal relationship between statistical investigation and motivation of college students in a Mathematical Reasoning course (Math 1). Unlike previous studies in which students' projects or statistical investigations have been examined as the final product that shows evidence of statistical…

  6. An integrated workflow for robust alignment and simplified quantitative analysis of NMR spectrometry data.

    PubMed

    Vu, Trung N; Valkenborg, Dirk; Smets, Koen; Verwaest, Kim A; Dommisse, Roger; Lemière, Filip; Verschoren, Alain; Goethals, Bart; Laukens, Kris

    2011-10-20

    Nuclear magnetic resonance spectroscopy (NMR) is a powerful technique to reveal and compare quantitative metabolic profiles of biological tissues. However, chemical and physical sample variations make the analysis of the data challenging, and typically require the application of a number of preprocessing steps prior to data interpretation. For example, noise reduction, normalization, baseline correction, peak picking, spectrum alignment and statistical analysis are indispensable components in any NMR analysis pipeline. We introduce a novel suite of informatics tools for the quantitative analysis of NMR metabolomic profile data. The core of the processing cascade is a novel peak alignment algorithm, called hierarchical Cluster-based Peak Alignment (CluPA). The algorithm aligns a target spectrum to the reference spectrum in a top-down fashion by building a hierarchical cluster tree from peak lists of reference and target spectra and then dividing the spectra into smaller segments based on the most distant clusters of the tree. To reduce the computational time to estimate the spectral misalignment, the method makes use of Fast Fourier Transformation (FFT) cross-correlation. Since the method returns a high-quality alignment, we can propose a simple methodology to study the variability of the NMR spectra. For each aligned NMR data point the ratio of the between-group and within-group sum of squares (BW-ratio) is calculated to quantify the difference in variability between and within predefined groups of NMR spectra. This differential analysis is related to the calculation of the F-statistic or a one-way ANOVA, but without distributional assumptions. Statistical inference based on the BW-ratio is achieved by bootstrapping the null distribution from the experimental data. The workflow performance was evaluated using a previously published dataset. 
Correlation maps, spectral and grey scale plots show clear improvements in comparison to other methods, and the down-to-earth quantitative analysis works well for the CluPA-aligned spectra. The whole workflow is embedded into a modular and statistically sound framework that is implemented as an R package called "speaq" ("spectrum alignment and quantitation"), which is freely available from http://code.google.com/p/speaq/.
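The BW-ratio itself is a one-liner per aligned data point. The sketch below uses hypothetical two-group data for a single spectral point, with a label-permutation null as a simple stand-in for the paper's bootstrap scheme:

```python
import numpy as np

def bw_ratio(x, groups):
    """Between-group / within-group sum of squares at one data point."""
    x = np.asarray(x, float)
    grand = x.mean()
    bss = wss = 0.0
    for g in np.unique(groups):
        xg = x[groups == g]
        bss += len(xg) * (xg.mean() - grand) ** 2
        wss += ((xg - xg.mean()) ** 2).sum()
    return bss / wss

# Inference without distributional assumptions: resample the null by
# permuting the group labels and compare the observed ratio against it.
rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(0, 1, 30), rng.normal(1.5, 1, 30)])
groups = np.array([0] * 30 + [1] * 30)
observed = bw_ratio(x, groups)
null = [bw_ratio(x, rng.permutation(groups)) for _ in range(999)]
p = (1 + sum(n >= observed for n in null)) / 1000
print(observed, p)
```

In speaq this computation is repeated for every aligned NMR data point, yielding a differential profile across the spectrum.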

  7. An instrument to assess the statistical intensity of medical research papers.

    PubMed

    Nieminen, Pentti; Virtanen, Jorma I; Vähänikkilä, Hannu

    2017-01-01

There is widespread evidence that statistical methods play an important role in original research articles, especially in medical research. The evaluation of statistical methods and reporting in journals suffers from a lack of standardized methods for assessing the use of statistics. The objective of this study was to develop and evaluate an instrument to assess the statistical intensity in research articles in a standardized way. A checklist-type measure scale was developed by selecting and refining items from previous reports about the statistical contents of medical journal articles and from published guidelines for statistical reporting. A total of 840 original medical research articles that were published between 2007 and 2015 in 16 journals were evaluated to test the scoring instrument. The total sum of all items was used to assess the intensity between sub-fields and journals. Inter-rater agreement was examined using a random sample of 40 articles. Four raters read and evaluated the selected articles using the developed instrument. The scale consisted of 66 items. The total summary score adequately discriminated between research articles according to their study design characteristics. The new instrument could also discriminate between journals according to their statistical intensity. The inter-observer agreement measured by the ICC was 0.88 between all four raters. Individual item analysis showed very high agreement between the rater pairs; the percentage agreement ranged from 91.7% to 95.2%. A reliable and applicable instrument for evaluating the statistical intensity in research papers was developed. It is a helpful tool for comparing the statistical intensity between sub-fields and journals. The novel instrument may be applied in manuscript peer review to identify papers in need of additional statistical review.

  8. Statistical evaluation of rainfall-simulator and erosion testing procedure : final report.

    DOT National Transportation Integrated Search

    1977-01-01

The specific aims of this study were (1) to supply documentation of statistical repeatability and precision of the rainfall-simulator and to document the statistical repeatability of the soil-loss data when using the previously recommended tentative l...

  9. Transportation statistics annual report 1997 : mobility and access

    DOT National Transportation Integrated Search

    1997-01-01

This document is the fourth Transportation Statistics Annual Report (TSAR) prepared by the Bureau of Transportation Statistics (BTS) for the President and Congress. As in previous years, it reports on the state of the U.S. transportation system at two le...

  10. Using structural equation modeling for network meta-analysis.

    PubMed

    Tu, Yu-Kang; Wu, Yun-Chun

    2017-07-14

Network meta-analysis overcomes the limitations of traditional pair-wise meta-analysis by incorporating all available evidence into a general statistical framework for simultaneous comparisons of several treatments. Currently, network meta-analyses are undertaken either within Bayesian hierarchical linear models or frequentist generalized linear mixed models. Structural equation modeling (SEM) is a statistical method originally developed for modeling causal relations among observed and latent variables. As the random effect is explicitly modeled as a latent variable in SEM, analysts have great flexibility to specify complex random effect structures and to place linear and nonlinear constraints on parameters. The aim of this article is to show how to undertake a network meta-analysis within the statistical framework of SEM. We used an example dataset to demonstrate that the standard fixed and random effect network meta-analysis models can be easily implemented in SEM. It contains results of 26 studies that directly compared three treatment groups A, B and C for prevention of first bleeding in patients with liver cirrhosis. We also showed that a new approach to network meta-analysis based on the unrestricted weighted least squares (UWLS) method can be undertaken using SEM. For both the fixed and random effect network meta-analysis, SEM yielded coefficients and confidence intervals similar to those reported in the previous literature. The point estimates of the two UWLS models were identical to those in the fixed effect model, but the confidence intervals were wider. This is consistent with results from the traditional pairwise meta-analyses. Compared with the UWLS model with a common variance adjustment factor, the UWLS model with a unique variance adjustment factor has wider confidence intervals when the heterogeneity is larger in the pairwise comparison. 
The UWLS model with unique variance adjusted factor reflects the difference in heterogeneity within each comparison. SEM provides a very flexible framework for univariate and multivariate meta-analysis, and its potential as a powerful tool for advanced meta-analysis is still to be explored.
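The SEM formulation cannot be reconstructed from the abstract, but the fixed-effect network model underneath is ordinary inverse-variance weighted least squares on a treatment-contrast design. A sketch with invented study estimates:

```python
import numpy as np

# Hypothetical log-odds-ratio estimates and variances from 5 studies.
# Basic parameters are d_AB and d_AC (effects of B and C vs. reference A);
# a C-vs-B study estimates d_AC - d_AB, giving the design row [-1, 1].
y = np.array([0.50, 0.45, 0.90, 0.38, 0.42])   # study estimates
v = np.array([0.04, 0.09, 0.08, 0.05, 0.10])   # their variances
X = np.array([[1, 0],     # study 1: B vs A
              [1, 0],     # study 2: B vs A
              [0, 1],     # study 3: C vs A
              [-1, 1],    # study 4: C vs B
              [-1, 1]],   # study 5: C vs B
             dtype=float)

# Fixed-effect pooling = generalized least squares with W = diag(1/v)
W = np.diag(1.0 / v)
cov = np.linalg.inv(X.T @ W @ X)
theta = cov @ X.T @ W @ y        # pooled [d_AB, d_AC]
se = np.sqrt(np.diag(cov))
print(theta, se)
```

The indirect evidence from the C-vs-B studies tightens both pooled estimates, which is the key payoff of the network formulation over separate pairwise analyses.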

  11. Low-flow frequency and flow duration of selected South Carolina streams in the Savannah and Salkehatchie River Basins through March 2014

    USGS Publications Warehouse

    Feaster, Toby D.; Guimaraes, Wladmir B.

    2016-07-14

An ongoing understanding of streamflow characteristics of the rivers and streams in South Carolina is important for the protection and preservation of the State’s water resources. Information concerning the low-flow characteristics of streams is especially important during critical flow periods, such as during the historic droughts that South Carolina has experienced in the past few decades. In 2008, the U.S. Geological Survey, in cooperation with the South Carolina Department of Health and Environmental Control, initiated a study to update low-flow statistics at continuous-record streamgaging stations operated by the U.S. Geological Survey in South Carolina. This report presents the low-flow statistics for 28 selected streamgaging stations in the Savannah and Salkehatchie River Basins in South Carolina. The low-flow statistics include daily mean flow durations for the 5-, 10-, 25-, 50-, 75-, 90-, and 95-percent probability of exceedance and the annual minimum 1-, 3-, 7-, 14-, 30-, 60-, and 90-day mean flows with recurrence intervals of 2, 5, 10, 20, 30, and 50 years, depending on the length of record available at the streamgaging station. The low-flow statistics were computed from records available through March 31, 2014. Low-flow statistics are influenced by length of record, hydrologic regime under which the data were collected, analytical techniques used, and other factors, such as urbanization, diversions, and droughts that may have occurred in the basin. To assess changes in the low-flow statistics from the previously published values, a comparison of the low-flow statistics for the annual minimum 7-day average streamflow with a 10-year recurrence interval (7Q10) from this study was made with the most recently published values. Of the 28 streamgaging stations for which recurrence interval computations were made, 14 streamgaging stations were suitable for comparing to low-flow statistics that were previously published in U.S. Geological Survey reports. 
These comparisons indicated that seven of the streamgaging stations had values lower than the previous values, two streamgaging stations had values higher than the previous values, and two streamgaging stations had values that were unchanged from previous values. The remaining three stations for which previous 7Q10 values were computed, which are located on the main stem of the Savannah River, were not compared with current estimates because of differences in the way the pre-regulation and regulated flow data were analyzed.
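The 7Q10 statistic the comparison is based on starts from annual minimum 7-day mean flows. A deliberately simplified sketch (synthetic daily flows, calendar complications ignored, and a crude empirical quantile standing in for the log-Pearson Type III fit USGS actually uses):

```python
import numpy as np

def annual_7day_minima(daily_flow, days_per_year=365):
    """Minimum 7-day mean flow in each (simplified 365-day) year."""
    years = np.asarray(daily_flow, float).reshape(-1, days_per_year)
    mins = []
    for y in years:
        # 7-day moving average; 'valid' keeps only full 7-day windows
        seven_day = np.convolve(y, np.ones(7) / 7, mode="valid")
        mins.append(seven_day.min())
    return np.array(mins)

# 30 hypothetical years of synthetic daily flow with a seasonal cycle
rng = np.random.default_rng(3)
t = np.arange(365)
flow = np.vstack([
    50 + 40 * np.sin(2 * np.pi * (t - 100) / 365) + rng.gamma(2.0, 5.0, 365)
    for _ in range(30)
]).ravel()
minima = annual_7day_minima(flow)
# Crude nonparametric 7Q10: the annual minimum with a 10-year recurrence,
# i.e. the 10th-percentile annual minimum (a frequency-curve fit in practice).
q7_10 = np.quantile(minima, 0.1)
print(q7_10)
```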

  12. Back to "once a caesarean: always a caesarean"? A trend analysis in Switzerland.

    PubMed

    Christmann-Schmid, Corina; Raio, Luigi; Scheibner, Katrin; Müller, Martin; Surbek, Daniel

    2016-11-01

Caesarean sections (CS) have significantly increased worldwide, and a previous CS is nowadays an important and increasingly reported indication for a repeat CS. There is a paucity of information in Switzerland on the incidence of repeat CS after a previous CS and on trends in the rate of vaginal birth after CS (VBAC). The aim of this study was to analyse the current trend in VBAC in Switzerland. We performed a retrospective cohort study to analyse the proportion of VBAC among all pregnant women with a previous CS who gave birth during two time periods (group 1: 1998/1999 vs. group 2: 2004/2005) in our tertiary care referral hospital and in the annual statistics of Swiss Women's Hospitals (ASF statistics). In addition, the proportion of inductions of labour after a previous caesarean, and their success, was analysed. In both cohorts studied, we found a significant decrease in vaginal births (p < 0.05) and a significant increase in primary elective repeat caesarean sections (p < 0.05) from the first to the second time period, while secondary repeat caesarean sections decreased. The prevalence of labour induction did not decrease. Our study shows that vaginal birth after a prior caesarean section has decreased over time in Switzerland. There was no significant change in labour induction during the study period. While this trend might reflect an increasing demand for safety in pregnancy and childbirth, it concomitantly increases maternal risks in further pregnancies, and women need to be appropriately informed about the long-term risks.

  13. Prevalence, correlates and pattern of hepatitis B surface antigen in a low resource setting.

    PubMed

    Eke, Ahizechukwu C; Eke, Uzoamaka A; Okafor, Charles I; Ezebialu, Ifeanyichukwu U; Ogbuagu, Chukwuanugo

    2011-01-12

Hepatitis B virus (HBV) infection in Nigeria has remained a public health issue. It is a major cause of mortality, especially in developing countries. Vertical transmission of hepatitis B virus infection is thought to be a major route of transmission in low resource areas. In spite of this, routine antenatal screening for hepatitis B infection is not yet practiced in many Nigerian hospitals. This paper presents the findings of a study conducted among antenatal women in Nnewi, Nigeria. It was a cross-sectional study carried out over a 3-month period (August-October, 2009). Recruitment of 480 women attending antenatal clinics in Nnewi, Nigeria was done by simple random sampling using computer-generated random numbers. HBsAg screening was done using rapid ELISA kits. Statistical analysis was performed using the STATA 11 package. The results were subjected to analysis using cross tabulations to explore statistical relationships between variables. The chi-square test was used to explore proportional relationships between groups. The level of statistical significance was set at p < 0.05 (providing a 95% confidence interval). Four hundred and eighty pregnant women were recruited into the study. Of these, 40 tested positive for HBsAg, accounting for 8.3% of the sample population. The age of the subjects studied varied from 14 to 45 years (mean age, 24.3 years), while the mean parity was 2.18. The HIV/HBV co-infection rate was 4.2%. The vertical transmission rate was 51.6%. There were statistically significant relationships between HBV infection and a previous history of tribal marks/tattoos (χ2 = 27.39, P = 0.001, df = 1), a history of contact with previously infected HBV patients (χ2 = 23.11, P = 0.001, df = 1) and the occupation of the women (χ2 = 51.22, P = 0.001, df = 1). Multiple sexual partners, blood transfusion, dental manipulations, sharing of sharps/needles, and circumcision were not significant modes of transmission. 
There was no statistically significant relationship between maternal age, educational level and HBV infection. The authors argued that hepatitis B screening in pregnancy should be made routine practice in Nigeria because of the low pick-up rate of the infection based only on risk factors for the disease.
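The χ² associations reported above follow the standard contingency-table recipe. A sketch with an invented 2×2 table for one risk factor (the paper's raw counts are not given in the abstract):

```python
import numpy as np

def chi_square(table):
    """Pearson chi-square statistic for an r x c contingency table."""
    obs = np.asarray(table, float)
    # expected counts under independence: (row total * column total) / grand total
    expected = np.outer(obs.sum(axis=1), obs.sum(axis=0)) / obs.sum()
    stat = ((obs - expected) ** 2 / expected).sum()
    df = (obs.shape[0] - 1) * (obs.shape[1] - 1)
    return stat, df

# Hypothetical layout: rows = exposed / not exposed (e.g. tribal marks),
# columns = HBsAg positive / negative.
table = [[20, 80], [20, 360]]
stat, df = chi_square(table)
print(stat, df)
```

With df = 1, any statistic above the 3.84 critical value is significant at p < 0.05, matching the style of the reported χ² results.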

  14. Meta-analysis to determine the effects of plant disease management measures: review and case studies on soybean and apple.

    PubMed

    Ngugi, Henry K; Esker, Paul D; Scherm, Harald

    2011-01-01

    The continuing exponential increase in scientific knowledge, the growing availability of large databases containing raw or partially annotated information, and the increased need to document impacts of large-scale research and funding programs provide a great incentive for integrating and adding value to previously published (or unpublished) research through quantitative synthesis. Meta-analysis has become the standard for quantitative evidence synthesis in many disciplines, offering a broadly accepted and statistically powerful framework for estimating the magnitude, consistency, and homogeneity of the effect of interest across studies. Here, we review previous and current uses of meta-analysis in plant pathology with a focus on applications in epidemiology and disease management. About a dozen formal meta-analyses have been published in the plant pathological literature in the past decade, and several more are currently in progress. Three broad research questions have been addressed, the most common being the comparative efficacy of chemical treatments for managing disease and reducing yield loss across environments. The second most common application has been the quantification of relationships between disease intensity and yield, or between different measures of disease, across studies. Lastly, meta-analysis has been applied to assess factors affecting pathogen-biocontrol agent interactions or the effectiveness of biological control of plant disease or weeds. In recent years, fixed-effects meta-analysis has been largely replaced by random- (or mixed-) effects analysis owing to the statistical benefits associated with the latter and the wider availability of computer software to conduct these analyses. Another recent trend has been the more common use of multivariate meta-analysis or meta-regression to analyze the impacts of study-level independent variables (moderator variables) on the response of interest. 
The application of meta-analysis to practical problems in epidemiology and disease management is illustrated with case studies from our work on Phakopsora pachyrhizi on soybean and Erwinia amylovora on apple. We show that although meta-analyses are often used to corroborate and validate general conclusions drawn from more traditional, qualitative reviews, they can also reveal new patterns and interpretations not obvious from individual studies.

  15. Effect of ultrasound frequency on the Nakagami statistics of human liver tissues.

    PubMed

    Tsui, Po-Hsiang; Zhou, Zhuhuang; Lin, Ying-Hsiu; Hung, Chieh-Ming; Chung, Shih-Jou; Wan, Yung-Liang

    2017-01-01

The analysis of the backscattered statistics using the Nakagami parameter is an emerging ultrasound technique for assessing hepatic steatosis and fibrosis. Previous studies indicated that the echo amplitude distribution of a normal liver follows the Rayleigh distribution (the Nakagami parameter m is close to 1). However, using different frequencies may change the backscattered statistics of normal livers. This study explored the frequency dependence of the backscattered statistics in human livers and then discussed the sources of ultrasound scattering in the liver. A total of 30 healthy participants were enrolled to undergo a standard care ultrasound examination of the liver, which is a natural model containing diffuse and coherent scatterers. The liver of each volunteer was scanned from the right intercostal view to obtain image raw data at different central frequencies ranging from 2 to 3.5 MHz. Phantoms with diffuse scatterers only were also made to perform ultrasound scanning using the same protocol for comparison with the clinical data. The Nakagami parameter-frequency correlation was evaluated using Pearson correlation analysis. The median and interquartile range of the Nakagami parameter obtained from livers was 1.00 (0.98-1.05) for 2 MHz, 0.93 (0.89-0.98) for 2.3 MHz, 0.87 (0.84-0.92) for 2.5 MHz, 0.82 (0.77-0.88) for 3.3 MHz, and 0.81 (0.76-0.88) for 3.5 MHz. The Nakagami parameter decreased with increasing central frequency (r = -0.67, p < 0.0001). However, this effect of ultrasound frequency on the statistical distribution of the backscattered envelopes was not found in the phantom results (r = -0.147, p = 0.0727). The current results demonstrate that the backscattered statistics of normal livers are frequency-dependent. Moreover, coherent scatterers may be the primary factor dominating the frequency dependence of the backscattered statistics in a liver.
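The Nakagami parameter the study tracks can be estimated from the echo envelope by moments. A sketch with simulated envelopes (purely diffuse scattering yields a Rayleigh envelope and m near 1; a strong coherent component pushes m above 1):

```python
import numpy as np

def nakagami_m(envelope):
    """Moment estimator of the Nakagami parameter: m = E[R^2]^2 / Var(R^2)."""
    r2 = np.asarray(envelope, float) ** 2
    return r2.mean() ** 2 / r2.var()

rng = np.random.default_rng(4)
# Diffuse scattering only: complex Gaussian backscatter -> Rayleigh envelope
diffuse = np.hypot(rng.normal(size=200_000), rng.normal(size=200_000))
# Diffuse scattering plus a strong coherent (specular) component
coherent = np.hypot(3.0 + rng.normal(size=200_000), rng.normal(size=200_000))
print(nakagami_m(diffuse))    # close to 1 (Rayleigh)
print(nakagami_m(coherent))   # well above 1
```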

  16. Challenges Associated with Estimating Utility in Wet Age-Related Macular Degeneration: A Novel Regression Analysis to Capture the Bilateral Nature of the Disease.

    PubMed

    Hodgson, Robert; Reason, Timothy; Trueman, David; Wickstead, Rose; Kusel, Jeanette; Jasilek, Adam; Claxton, Lindsay; Taylor, Matthew; Pulikottil-Jacob, Ruth

    2017-10-01

    The estimation of utility values for the economic evaluation of therapies for wet age-related macular degeneration (AMD) is a particular challenge. Previous economic models in wet AMD have been criticized for failing to capture the bilateral nature of wet AMD by modelling visual acuity (VA) and utility values associated with the better-seeing eye only. Here we present a de novo regression analysis using generalized estimating equations (GEE) applied to a previous dataset of time trade-off (TTO)-derived utility values from a sample of the UK population that wore contact lenses to simulate visual deterioration in wet AMD. This analysis allows utility values to be estimated as a function of VA in both the better-seeing eye (BSE) and worse-seeing eye (WSE). VAs in both the BSE and WSE were found to be statistically significant (p < 0.05) when regressed separately. When included without an interaction term, only the coefficient for VA in the BSE was significant (p = 0.04), but when an interaction term between VA in the BSE and WSE was included, only the constant term (mean TTO utility value) was significant, potentially a result of the collinearity between the VA of the two eyes. The lack of both formal model fit statistics from the GEE approach and theoretical knowledge to support the superiority of one model over another make it difficult to select the best model. Limitations of this analysis arise from the potential influence of collinearity between the VA of both eyes, and the use of contact lenses to reflect VA states to obtain the original dataset. Whilst further research is required to elicit more accurate utility values for wet AMD, this novel regression analysis provides a possible source of utility values to allow future economic models to capture the quality of life impact of changes in VA in both eyes. Novartis Pharmaceuticals UK Limited.

  17. Statistical comparison of a hybrid approach with approximate and exact inference models for Fusion 2+

    NASA Astrophysics Data System (ADS)

    Lee, K. David; Wiesenfeld, Eric; Gelfand, Andrew

    2007-04-01

One of the greatest challenges in modern combat is maintaining a high level of timely Situational Awareness (SA). In many situations, computational complexity and accuracy considerations make the development and deployment of real-time, high-level inference tools very difficult. An innovative hybrid framework that combines Bayesian inference, in the form of Bayesian Networks, and Possibility Theory, in the form of Fuzzy Logic systems, has recently been introduced to provide a rigorous framework for high-level inference. Previous research developed the theoretical basis and benefits of the hybrid approach. However, a concrete experimental comparison of the hybrid framework with traditional fusion methods, demonstrating and quantifying this benefit, has been lacking. The goal of this research, therefore, is to provide a statistical analysis comparing the accuracy and performance of hybrid network theory with pure Bayesian and Fuzzy systems and with an inexact Bayesian system approximated using Particle Filtering. To accomplish this task, domain-specific models will be developed under these different theoretical approaches and then evaluated, via Monte Carlo simulation, against situational ground truth to measure accuracy and fidelity. Following this, a rigorous statistical analysis of the performance results will be performed to quantify the benefit of hybrid inference over other fusion tools.

  18. Direction dependence analysis: A framework to test the direction of effects in linear models with an implementation in SPSS.

    PubMed

    Wiedermann, Wolfgang; Li, Xintong

    2018-04-16

    In nonexperimental data, at least three possible explanations exist for the association of two variables x and y: (1) x is the cause of y, (2) y is the cause of x, or (3) an unmeasured confounder is present. Statistical tests that identify which of the three explanatory models fits best would be a useful adjunct to the use of theory alone. The present article introduces one such statistical method, direction dependence analysis (DDA), which assesses the relative plausibility of the three explanatory models on the basis of higher-moment information about the variables (i.e., skewness and kurtosis). DDA involves the evaluation of three properties of the data: (1) the observed distributions of the variables, (2) the residual distributions of the competing models, and (3) the independence properties of the predictors and residuals of the competing models. When the observed variables are nonnormally distributed, we show that DDA components can be used to uniquely identify each explanatory model. Statistical inference methods for model selection are presented, and macros to implement DDA in SPSS are provided. An empirical example is given to illustrate the approach. Conceptual and empirical considerations are discussed for best-practice applications in psychological data, and sample size recommendations based on previous simulation studies are provided.
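One ingredient of DDA, the comparison of observed distributions, rests on the fact that skewness attenuates in the causal direction: for standardized variables with a normal error term, skew(y) = r³ · skew(x) when x causes y, so the outcome is always less skewed than the cause. A sketch with simulated data (variable choices hypothetical):

```python
import numpy as np

def skew(v):
    """Sample skewness (third standardized moment)."""
    z = (v - v.mean()) / v.std()
    return np.mean(z ** 3)

rng = np.random.default_rng(5)
x = rng.exponential(1.0, 50_000)               # nonnormal cause, skewness ~2
# y = 0.7*x + e, with e scaled so that sd(y) = 1 and corr(x, y) = 0.7
y = 0.7 * (x - 1.0) + rng.normal(0.0, np.sqrt(0.51), 50_000)
sx, sy = skew(x), skew(y)
print(sx, sy)   # the putative outcome is markedly less skewed than the cause
```

Comparing the skewness of the two variables, and of the competing models' residuals, then provides evidence about which direction, x to y or y to x, fits the data better.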

  19. Thermal heterogeneity within aqueous materials quantified by 1H NMR spectroscopy: Multiparametric validation in silico and in vitro

    NASA Astrophysics Data System (ADS)

    Lutz, Norbert W.; Bernard, Monique

    2018-02-01

    We recently suggested a new paradigm for statistical analysis of thermal heterogeneity in (semi-)aqueous materials by 1H NMR spectroscopy, using water as a temperature probe. Here, we present a comprehensive in silico and in vitro validation that demonstrates the ability of this new technique to provide accurate quantitative parameters characterizing the statistical distribution of temperature values in a volume of (semi-)aqueous matter. First, line shape parameters of numerically simulated water 1H NMR spectra are systematically varied to study a range of mathematically well-defined temperature distributions. Then, corresponding models based on measured 1H NMR spectra of agarose gel are analyzed. In addition, dedicated samples based on hydrogels or biological tissue are designed to produce temperature gradients changing over time, and dynamic NMR spectroscopy is employed to analyze the resulting temperature profiles at sub-second temporal resolution. Accuracy and consistency of the previously introduced statistical descriptors of temperature heterogeneity are determined: weighted median and mean temperature, standard deviation, temperature range, temperature mode(s), kurtosis, skewness, entropy, and relative areas under temperature curves. Potential and limitations of this method for quantitative analysis of thermal heterogeneity in (semi-)aqueous materials are discussed in view of prospective applications in materials science as well as biology and medicine.

  20. Women victims of intentional homicide in Italy: New insights comparing Italian trends to German and U.S. trends, 2008-2014.

    PubMed

    Terranova, Claudio; Zen, Margherita

    2018-01-01

    National statistics on female homicide could be a useful tool to evaluate the phenomenon and plan adequate strategies to prevent and reduce this crime. The aim of the study is to contribute to the analysis of intentional female homicides in Italy by comparing Italian trends to German and United States trends from 2008 to 2014. This is a population study based on data deriving primarily from national and European statistical institutes, from the U.S. Federal Bureau of Investigation's Uniform Crime Reporting program, and from the National Center for Health Statistics. Data were analyzed in relation to trends and age by Chi-square test, Student's t-test, and linear regression. Results show that female homicides, unlike male homicides, remained stable in the three countries. Regression analysis showed a higher risk for female homicide in all age groups in the U.S. Middle-aged women are at higher risk, and the majority of murdered women are killed by people they know. These results confirm previous findings and suggest the need, in Italy as well, to focus on preventive strategies that reduce the precipitating factors linked to violence in the course of a relationship or within the family. Copyright © 2017 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  1. Non-targeted 1H NMR fingerprinting and multivariate statistical analyses for the characterisation of the geographical origin of Italian sweet cherries.

    PubMed

    Longobardi, F; Ventrella, A; Bianco, A; Catucci, L; Cafagna, I; Gallo, V; Mastrorilli, P; Agostiano, A

    2013-12-01

    In this study, non-targeted 1H NMR fingerprinting was used in combination with multivariate statistical techniques for the classification of Italian sweet cherries based on their different geographical origins (Emilia Romagna and Puglia). As classification techniques, Soft Independent Modelling of Class Analogy (SIMCA), Partial Least Squares Discriminant Analysis (PLS-DA), and Linear Discriminant Analysis (LDA) were carried out and the results were compared. For LDA, before performing a refined selection of the number/combination of variables, two different strategies for a preliminary reduction of the variable number were tested. The best average recognition and CV prediction abilities (both 100.0%) were obtained for all the LDA models, although PLS-DA also showed remarkable performance (94.6%). All the statistical models were validated by observing the prediction abilities with respect to an external set of cherry samples. The best result (94.9%) was obtained with LDA by performing a best-subset selection procedure on a set of 30 principal components previously selected by a stepwise decorrelation. The metabolites that contributed most to the classification performance of this LDA model were found to be malate, glucose, fructose, glutamine and succinate. Copyright © 2013 Elsevier Ltd. All rights reserved.
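    The discrimination step can be illustrated with a two-class Fisher discriminant on synthetic data standing in for NMR variables. This is only a sketch of the LDA idea, not the study's actual variable-selection or cross-validation workflow; group sizes, dimensions, and the mean shift are invented:

```python
import numpy as np

def fisher_lda_direction(X0, X1):
    """Two-class Fisher discriminant direction: w = Sw^-1 (m1 - m0)."""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    Sw = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)  # within-class scatter
    return np.linalg.solve(Sw, m1 - m0)

rng = np.random.default_rng(1)
X0 = rng.normal(size=(40, 5))          # samples from origin A
X1 = rng.normal(size=(40, 5)) + 1.0    # origin B, shifted class mean
w = fisher_lda_direction(X0, X1)

# classify by projecting onto w and thresholding at the midpoint of the means
threshold = 0.5 * ((X0 @ w).mean() + (X1 @ w).mean())
rate_a = (X0 @ w > threshold).mean()   # origin A misclassification rate
rate_b = (X1 @ w > threshold).mean()   # origin B correct-classification rate
```

    With a clear class separation, most samples fall on the correct side of the midpoint threshold; in practice recognition and prediction abilities are estimated by cross-validation and an external test set, as in the abstract.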

  2. Prognostic value of GLUT-1 expression in ovarian surface epithelial tumors: a morphometric study.

    PubMed

    Ozcan, Ayhan; Deveci, Mehmet Salih; Oztas, Emin; Dede, Murat; Yenen, Mufit Cemal; Korgun, Emin Turkay; Gunhan, Omer

    2005-08-01

    To investigate the reported increase in the expression of the glucose transporter GLUT-1 in borderline and malignant ovarian epithelial tumors and its relationship to prognosis. In this study, areas in which immunohistochemical membranous staining with GLUT-1 was most evident were selected, and the proportions of GLUT-1 expression in 46 benign, 11 borderline and 42 malignant cases of ovarian epithelial tumors were determined quantitatively using the Zeiss Vision KS 400 3.0 (Göttingen, Germany) image analysis software for Windows (Microsoft, Redmond, Washington, U.S.A.). GLUT-1 expression was determined in all borderline tumors (11 of 11) and in 97.6% of malignant tumors (41 of 42). No GLUT-1 expression was observed in benign tumors. The intensity of GLUT-1 staining was lower in borderline tumors than in malignant cases, a statistically significant difference (p = 0.005). As differentiation in malignant tumors increased, proportions of GLUT-1 expression showed a relative increase, but this difference was not statistically significant (p = 0.68). When GLUT-1 expression in borderline and malignant ovarian epithelial tumors was analyzed against prognosis, no statistically significant difference was identified. Assessment of GLUT-1 expression using the image analysis program was more reliable, with higher reproducibility, than in previous studies.

  3. A retrospective analysis of Mathieu and TIP urethroplasty techniques for distal hypospadias repair: a 20-year experience.

    PubMed

    Oztorun, Kenan; Bagbanci, Sahin; Dadali, Mumtaz; Emir, Levent; Karabulut, Ayhan

    2017-09-01

    We aimed to identify changes over a 20-year period in the application rates of the two most popular surgical techniques for distal hypospadias repair, and to compare these techniques in terms of surgical outcomes and the factors affecting those outcomes. In this study, the records of 492 consecutive patients who had undergone an operation for distal hypospadias in the urology clinic of Ankara between May 1990 and December 2010, using either the Mathieu or the TIP urethroplasty (TIPU) technique, were reviewed retrospectively. Patients with glanular, coronal, or subcoronal meatus were accepted as distal hypospadias cases. Among the 492 examined medical records, 331 and 161 surgical interventions had been performed using the Mathieu urethroplasty technique (Group-1) and the TIP urethroplasty technique (Group-2), respectively. Group-1 was divided into two subgroups: Group-1a (patients with primary hypospadias) and Group-1b (patients with a previous hypospadias operation). Likewise, Group-2 was divided into two subgroups, Group-2a and Group-2b. The patients' ages, the number of previous urethroplasty operations, the localization of the external urethral meatus prior to the operation, chordee state, the length of the newly formed urethra, whether urinary diversion was performed, post-operative complications, and follow-up data were evaluated, and the effects of these variables on the surgical outcome were investigated via statistical analyses. 
The independent-samples t-test and Pearson's Chi-square test were used for statistical analysis; p<0.05 was considered statistically significant. There were no statistically significant differences between the subgroups in terms of age, length of the neo-urethra, number of previously performed urethroplasty operations, surgical success rates, or complications (p>0.05). The concurrent utilization of cystostomy and a urethral stent was significantly more frequent in Group-1 (p<0.05; Pearson's Chi-square test). Over time, TIP urethroplasty has become the preferred technique for the repair of distal hypospadias, and both surgical techniques have similar success rates in distal hypospadias cases.

  4. Variability in source sediment contributions by applying different statistic test for a Pyrenean catchment.

    PubMed

    Palazón, L; Navas, A

    2017-06-01

    Information on sediment contribution and transport dynamics from the contributing catchments is needed to develop management plans to tackle environmental problems related to the effects of fine sediment, such as reservoir siltation. In this respect, the fingerprinting technique is an indirect technique known to be valuable and effective for sediment source identification in river catchments. Large variability in sediment delivery was found in previous studies in the Barasona catchment (1509 km², Central Spanish Pyrenees). Simulation results with SWAT and fingerprinting approaches identified badlands and agricultural uses as the main contributors to sediment supply in the reservoir. In this study the <63 μm sediment fraction from the surface reservoir sediments (2 cm) is investigated following the fingerprinting procedure to assess how the use of different statistical procedures affects the estimated source contributions. Three optimum composite fingerprints were selected from the same dataset to discriminate between source contributions based on land use/land cover, by applying (1) discriminant function analysis alone, and discriminant function analysis combined (as a second step) with either (2) the Kruskal-Wallis H-test or (3) principal components analysis. Source contribution results differed between the assessed options, with the greatest differences observed for option #3, the two-step process of principal components analysis followed by discriminant function analysis. The characteristics of the solutions produced by the applied mixing model and the conceptual understanding of the catchment showed that the most reliable solution was achieved with option #2, the two-step process of the Kruskal-Wallis H-test followed by discriminant function analysis. The assessment showed the importance of the statistical procedure used to define the optimum composite fingerprint for sediment fingerprinting applications. Copyright © 2016 Elsevier Ltd. All rights reserved.
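    The screening step of option #2 can be sketched as a Kruskal-Wallis test applied tracer-by-tracer before discriminant analysis: only properties that differ significantly among source groups are retained in the composite fingerprint. The source groups and tracer values below are invented for illustration and are not the catchment data:

```python
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(2)
# three hypothetical source groups x four candidate tracer properties
sources = {
    "badlands":    rng.normal(loc=5.0, size=(20, 4)),
    "agriculture": rng.normal(loc=5.0, size=(20, 4)),
    "forest":      rng.normal(loc=5.0, size=(20, 4)),
}
sources["badlands"][:, 0] += 3.0       # only tracer 0 discriminates sources
sources["agriculture"][:, 0] -= 3.0

selected = []
for j in range(4):
    h_stat, p = kruskal(*(g[:, j] for g in sources.values()))
    if p < 0.05:                       # keep tracers that differ among sources
        selected.append(j)
```

    The retained tracers would then feed a discriminant function analysis to build the final composite fingerprint for the mixing model.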

  5. A Non-Destructive Method for Distinguishing Reindeer Antler (Rangifer tarandus) from Red Deer Antler (Cervus elaphus) Using X-Ray Micro-Tomography Coupled with SVM Classifiers

    PubMed Central

    Lefebvre, Alexandre; Rochefort, Gael Y.; Santos, Frédéric; Le Denmat, Dominique; Salmon, Benjamin; Pétillon, Jean-Marc

    2016-01-01

    Over the last decade, biomedical 3D-imaging tools have gained widespread use in the analysis of prehistoric bone artefacts. While initial attempts to characterise the major categories used in osseous industry (i.e. bone, antler, and dentine/ivory) have been successful, the taxonomic determination of prehistoric artefacts remains to be investigated. The distinction between reindeer and red deer antler can be challenging, particularly in cases of anthropic and/or taphonomic modifications. In addition to the range of destructive physicochemical identification methods available (mass spectrometry, isotopic ratio, and DNA analysis), X-ray micro-tomography (micro-CT) provides convincing non-destructive 3D images and analyses. This paper presents the experimental protocol (sample scans, image processing, and statistical analysis) we have developed in order to identify modern and archaeological antler collections (from Isturitz, France). This original method is based on bone microstructure analysis combined with advanced statistical support vector machine (SVM) classifiers. A combination of six microarchitecture biomarkers (bone volume fraction, trabecular number, trabecular separation, trabecular thickness, trabecular bone pattern factor, and structure model index) were screened using micro-CT in order to characterise internal alveolar structure. Overall, reindeer alveoli presented a tighter mesh than red deer alveoli, and statistical analysis allowed us to distinguish archaeological antler by species with an accuracy of 96%, regardless of anatomical location on the antler. In conclusion, micro-CT combined with SVM classifiers proves to be a promising additional non-destructive method for antler identification, suitable for archaeological artefacts whose degree of human modification and cultural heritage or scientific value has previously made it impossible (tools, ornaments, etc.). PMID:26901355

  6. Statistical analysis of the horizontal divergent flow in emerging solar active regions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toriumi, Shin; Hayashi, Keiji; Yokoyama, Takaaki, E-mail: shin.toriumi@nao.ac.jp

    Solar active regions (ARs) are thought to be formed by magnetic fields from the convection zone. Our flux emergence simulations revealed that a strong horizontal divergent flow (HDF) of unmagnetized plasma appears at the photosphere before the flux begins to emerge. In our earlier study, we analyzed HMI data for a single AR and confirmed the presence of this precursor plasma flow in the actual Sun. In this paper, as an extension of our earlier study, we conducted a statistical analysis of the HDFs to further investigate their characteristics and better determine their properties. From SDO/HMI data, we picked up 23 flux emergence events over a period of 14 months, the total flux of which ranges from 10²⁰ to 10²² Mx. Out of the 23 selected events, 6 clear HDFs were detected by the method we developed in our earlier study, and 7 HDFs detected by visual inspection were added to this statistical analysis. We found that the duration of the HDF is on average 61 minutes and the maximum HDF speed is on average 3.1 km s⁻¹. We also estimated the rising speed of the subsurface magnetic flux to be 0.6-1.4 km s⁻¹. These values are highly consistent with our previous one-event analysis as well as our simulation results. The observational results lead us to the conclusion that the HDF is a rather common feature in the earliest phase of AR emergence. Moreover, our HDF analysis has the capability of determining the subsurface properties of emerging fields that cannot be directly measured.

  7. Integrative pathway analysis of a genome-wide association study of V̇o2max response to exercise training

    PubMed Central

    Vivar, Juan C.; Sarzynski, Mark A.; Sung, Yun Ju; Timmons, James A.; Bouchard, Claude; Rankinen, Tuomo

    2013-01-01

    We previously reported the findings from a genome-wide association study of the response of maximal oxygen uptake (V̇o2max) to an exercise program. Here we follow up on these results to generate hypotheses on genes, pathways, and systems involved in the ability to respond to exercise training. A systems biology approach can help us better establish a comprehensive physiological description of what underlies V̇o2maxtrainability. The primary material for this exploration was the individual single-nucleotide polymorphism (SNP), SNP-gene mapping, and statistical significance levels. We aimed to generate novel hypotheses through analyses that go beyond statistical association of single-locus markers. This was accomplished through three complementary approaches: 1) building de novo evidence of gene candidacy through informatics-driven literature mining; 2) aggregating evidence from statistical associations to link variant enrichment in biological pathways to V̇o2max trainability; and 3) predicting possible consequences of variants residing in the pathways of interest. We started with candidate gene prioritization followed by pathway analysis focused on overrepresentation analysis and gene set enrichment analysis. Subsequently, leads were followed using in silico analysis of predicted SNP functions. Pathways related to cellular energetics (pantothenate and CoA biosynthesis; PPAR signaling) and immune functions (complement and coagulation cascades) had the highest levels of SNP burden. In particular, long-chain fatty acid transport and fatty acid oxidation genes and sequence variants were found to influence differences in V̇o2max trainability. Together, these methods allow for the hypothesis-driven ranking and prioritization of genes and pathways for future experimental testing and validation. PMID:23990238

  8. Evidential Value That Exercise Improves BMI z-Score in Overweight and Obese Children and Adolescents

    PubMed Central

    Kelley, George A.; Kelley, Kristi S.

    2015-01-01

    Background. Given the cardiovascular disease (CVD) related importance of understanding the true effects of exercise on adiposity in overweight and obese children and adolescents, this study examined whether there is evidential value to rule out excessive and inappropriate reporting of statistically significant results, a major problem in the published literature, with respect to exercise-induced improvements in BMI z-score among overweight and obese children and adolescents. Methods. Using data from a previous meta-analysis of 10 published studies that included 835 overweight and obese children and adolescents, a novel, recently developed approach (p-curve) was used to test for evidential value and rule out selective reporting of findings. Chi-squared tests (χ²) were used to test for statistical significance, with alpha (p) values <0.05 considered statistically significant. Results. Six of 10 findings (60%) were statistically significant. Statistically significant right-skew to rule out selective reporting was found (χ² = 38.8, p = 0.0001). Conversely, studies neither lacked evidential value (χ² = 6.8, p = 0.87) nor lacked evidential value and were intensely p-hacked (χ² = 4.3, p = 0.98). Conclusion. Evidential value results confirm that exercise reduces BMI z-score in overweight and obese children and adolescents, an important therapeutic strategy for treating and preventing CVD. PMID:26509145
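    The right-skew test can be sketched with the Fisher-style aggregation commonly used in p-curve analysis: each statistically significant p-value is rescaled to a "pp-value" (p/0.05, its probability conditional on significance), and the pp-values are combined into a chi-square statistic with 2k degrees of freedom. The p-values below are invented for illustration, not the meta-analysis data:

```python
import numpy as np
from scipy.stats import chi2

def p_curve_right_skew(p_values, alpha=0.05):
    """Fisher-style right-skew test on significant p-values (p-curve)."""
    sig = np.asarray([p for p in p_values if p < alpha])
    pp = sig / alpha                      # pp-values: p conditional on p < alpha
    stat = -2.0 * np.sum(np.log(pp))      # chi-square with 2k degrees of freedom
    df = 2 * len(pp)
    return stat, chi2.sf(stat, df)

# a set of small (right-skewed) p-values suggests genuine evidential value
stat, p = p_curve_right_skew([0.001, 0.002, 0.0005, 0.01, 0.003, 0.004])
```

    A significant result indicates the distribution of significant p-values is right-skewed, which is the signature of evidential value rather than selective reporting.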

  10. Statistics of Scientific Procedures on Living Animals Great Britain 2015 - highlighting an ongoing upward trend in animal use and missed opportunities.

    PubMed

    Hudson-Shore, Michelle

    2016-12-01

    The Annual Statistics of Scientific Procedures on Living Animals Great Britain 2015 indicate that the Home Office were correct in recommending that caution should be exercised when interpreting the 2014 data as an apparent decline in animal experiments. The 2015 report shows that, as the changes to the format of the annual statistics have become more familiar and less problematic, there has been a re-emergence of the upward trend in animal research and testing in Great Britain. The 2015 statistics report an increase in animal procedures (up to 4,142,631) and in the number of animals used (up to 4,069,349). This represents 1% more than the totals in 2013, and a 7% increase on the procedures reported in 2014. This paper details an analysis of these most recent statistics, providing information on overall animal use and highlighting specific issues associated with genetically-altered animals, dogs and primates. It also reflects on areas of the new format that have previously been highlighted as being problematic, and concludes with a discussion about the use of animals in regulatory research and testing, and how there are significant missed opportunities for replacing some of the animal-based tests in this area. 2016 FRAME.

  11. Fast and accurate imputation of summary statistics enhances evidence of functional enrichment

    PubMed Central

    Pasaniuc, Bogdan; Zaitlen, Noah; Shi, Huwenbo; Bhatia, Gaurav; Gusev, Alexander; Pickrell, Joseph; Hirschhorn, Joel; Strachan, David P.; Patterson, Nick; Price, Alkes L.

    2014-01-01

    Motivation: Imputation using external reference panels (e.g. 1000 Genomes) is a widely used approach for increasing power in genome-wide association studies and meta-analysis. Existing hidden Markov models (HMM)-based imputation approaches require individual-level genotypes. Here, we develop a new method for Gaussian imputation from summary association statistics, a type of data that is becoming widely available. Results: In simulations using 1000 Genomes (1000G) data, this method recovers 84% (54%) of the effective sample size for common (>5%) and low-frequency (1–5%) variants [increasing to 87% (60%) when summary linkage disequilibrium information is available from target samples] versus the gold standard of 89% (67%) for HMM-based imputation, which cannot be applied to summary statistics. Our approach accounts for the limited sample size of the reference panel, a crucial step to eliminate false-positive associations, and it is computationally very fast. As an empirical demonstration, we apply our method to seven case–control phenotypes from the Wellcome Trust Case Control Consortium (WTCCC) data and a study of height in the British 1958 birth cohort (1958BC). Gaussian imputation from summary statistics recovers 95% (105%) of the effective sample size (as quantified by the ratio of χ2 association statistics) compared with HMM-based imputation from individual-level genotypes at the 227 (176) published single nucleotide polymorphisms (SNPs) in the WTCCC (1958BC height) data. In addition, for publicly available summary statistics from large meta-analyses of four lipid traits, we publicly release imputed summary statistics at 1000G SNPs, which could not have been obtained using previously published methods, and demonstrate their accuracy by masking subsets of the data. 
We show that 1000G imputation using our approach increases the magnitude and statistical evidence of enrichment at genic versus non-genic loci for these traits, as compared with an analysis without 1000G imputation. Thus, imputation of summary statistics will be a valuable tool in future functional enrichment analyses. Availability and implementation: Publicly available software package available at http://bogdan.bioinformatics.ucla.edu/software/. Contact: bpasaniuc@mednet.ucla.edu or aprice@hsph.harvard.edu Supplementary information: Supplementary materials are available at Bioinformatics online. PMID:24990607
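    The Gaussian imputation idea reduces to the conditional mean of a multivariate normal: z-scores at untyped SNPs are predicted from typed-SNP z-scores, weighted by reference-panel linkage disequilibrium (LD), with the LD matrix regularized to account for the panel's finite sample size. A toy numpy sketch of the conditional-mean formula; the LD values and regularization weight are invented, not taken from the paper:

```python
import numpy as np

def impute_z(z_obs, R_oo, R_uo, lam=0.1):
    """Impute z-scores at untyped SNPs from typed-SNP z-scores and
    reference-panel LD (conditional mean of a multivariate normal).
    lam shrinks the LD matrix toward the identity as regularization."""
    n = R_oo.shape[0]
    R_reg = (1 - lam) * R_oo + lam * np.eye(n)
    return R_uo @ np.linalg.solve(R_reg, z_obs)

# toy LD: the untyped SNP is correlated 0.9 with the first typed SNP
R_oo = np.array([[1.0, 0.2], [0.2, 1.0]])   # LD among typed SNPs
R_uo = np.array([[0.9, 0.2]])               # LD of untyped vs typed SNPs
z_obs = np.array([4.0, 1.0])                # observed association z-scores
z_imp = impute_z(z_obs, R_oo, R_uo)
```

    The strongly associated typed SNP pulls the imputed z-score close to its own value, scaled down by the LD; the regularization step is what prevents the limited reference-panel size from inflating imputed statistics.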

  12. Development of computer-assisted instruction application for statistical data analysis android platform as learning resource

    NASA Astrophysics Data System (ADS)

    Hendikawati, P.; Arifudin, R.; Zahid, M. Z.

    2018-03-01

    This study aims to design an Android statistical data analysis application that can be accessed through mobile devices, making it easier for users to access. The application covers various topics in basic statistics along with a parametric statistical data analysis module. The output of the system is parametric statistical data analysis that can be used by students, lecturers, and other users who need the results of statistical calculations quickly and in an easily understood form. The Android application was developed in the Java programming language; the server side uses PHP with the CodeIgniter framework, and the database uses MySQL. The system development methodology used is the Waterfall methodology, with stages of analysis, design, coding, testing, implementation, and system maintenance. This statistical data analysis application is expected to support statistics lecturing activities and make it easier for students to understand statistical analysis on mobile devices.

  13. Repeatability Modeling for Wind-Tunnel Measurements: Results for Three Langley Facilities

    NASA Technical Reports Server (NTRS)

    Hemsch, Michael J.; Houlden, Heather P.

    2014-01-01

    Data from extensive check standard tests of seven measurement processes in three NASA Langley Research Center wind tunnels are statistically analyzed to test a simple model previously presented in 2000 for characterizing short-term, within-test and across-test repeatability. The analysis is intended to support process improvement and development of uncertainty models for the measurements. The analysis suggests that the repeatability can be estimated adequately as a function of only the test section dynamic pressure over a two-orders-of-magnitude dynamic pressure range. As expected for low instrument loading, short-term coefficient repeatability is determined by the resolution of the instrument alone (air off). However, as previously pointed out, for the highest dynamic pressure range the coefficient repeatability appears to be independent of dynamic pressure, thus presenting a lower floor for the standard deviation for all three time frames. The simple repeatability model is shown to be adequate for all of the cases presented and for all three time frames.

  14. Analysis of mammalian gene function through broad-based phenotypic screens across a consortium of mouse clinics.

    PubMed

    de Angelis, Martin Hrabě; Nicholson, George; Selloum, Mohammed; White, Jacqui; Morgan, Hugh; Ramirez-Solis, Ramiro; Sorg, Tania; Wells, Sara; Fuchs, Helmut; Fray, Martin; Adams, David J; Adams, Niels C; Adler, Thure; Aguilar-Pimentel, Antonio; Ali-Hadji, Dalila; Amann, Gregory; André, Philippe; Atkins, Sarah; Auburtin, Aurelie; Ayadi, Abdel; Becker, Julien; Becker, Lore; Bedu, Elodie; Bekeredjian, Raffi; Birling, Marie-Christine; Blake, Andrew; Bottomley, Joanna; Bowl, Mike; Brault, Véronique; Busch, Dirk H; Bussell, James N; Calzada-Wack, Julia; Cater, Heather; Champy, Marie-France; Charles, Philippe; Chevalier, Claire; Chiani, Francesco; Codner, Gemma F; Combe, Roy; Cox, Roger; Dalloneau, Emilie; Dierich, André; Di Fenza, Armida; Doe, Brendan; Duchon, Arnaud; Eickelberg, Oliver; Esapa, Chris T; El Fertak, Lahcen; Feigel, Tanja; Emelyanova, Irina; Estabel, Jeanne; Favor, Jack; Flenniken, Ann; Gambadoro, Alessia; Garrett, Lilian; Gates, Hilary; Gerdin, Anna-Karin; Gkoutos, George; Greenaway, Simon; Glasl, Lisa; Goetz, Patrice; Da Cruz, Isabelle Goncalves; Götz, Alexander; Graw, Jochen; Guimond, Alain; Hans, Wolfgang; Hicks, Geoff; Hölter, Sabine M; Höfler, Heinz; Hancock, John M; Hoehndorf, Robert; Hough, Tertius; Houghton, Richard; Hurt, Anja; Ivandic, Boris; Jacobs, Hughes; Jacquot, Sylvie; Jones, Nora; Karp, Natasha A; Katus, Hugo A; Kitchen, Sharon; Klein-Rodewald, Tanja; Klingenspor, Martin; Klopstock, Thomas; Lalanne, Valerie; Leblanc, Sophie; Lengger, Christoph; le Marchand, Elise; Ludwig, Tonia; Lux, Aline; McKerlie, Colin; Maier, Holger; Mandel, Jean-Louis; Marschall, Susan; Mark, Manuel; Melvin, David G; Meziane, Hamid; Micklich, Kateryna; Mittelhauser, Christophe; Monassier, Laurent; Moulaert, David; Muller, Stéphanie; Naton, Beatrix; Neff, Frauke; Nolan, Patrick M; Nutter, Lauryl Mj; Ollert, Markus; Pavlovic, Guillaume; Pellegata, Natalia S; Peter, Emilie; Petit-Demoulière, Benoit; Pickard, Amanda; Podrini, Christine; Potter, Paul; Pouilly, Laurent; 
Puk, Oliver; Richardson, David; Rousseau, Stephane; Quintanilla-Fend, Leticia; Quwailid, Mohamed M; Racz, Ildiko; Rathkolb, Birgit; Riet, Fabrice; Rossant, Janet; Roux, Michel; Rozman, Jan; Ryder, Ed; Salisbury, Jennifer; Santos, Luis; Schäble, Karl-Heinz; Schiller, Evelyn; Schrewe, Anja; Schulz, Holger; Steinkamp, Ralf; Simon, Michelle; Stewart, Michelle; Stöger, Claudia; Stöger, Tobias; Sun, Minxuan; Sunter, David; Teboul, Lydia; Tilly, Isabelle; Tocchini-Valentini, Glauco P; Tost, Monica; Treise, Irina; Vasseur, Laurent; Velot, Emilie; Vogt-Weisenhorn, Daniela; Wagner, Christelle; Walling, Alison; Weber, Bruno; Wendling, Olivia; Westerberg, Henrik; Willershäuser, Monja; Wolf, Eckhard; Wolter, Anne; Wood, Joe; Wurst, Wolfgang; Yildirim, Ali Önder; Zeh, Ramona; Zimmer, Andreas; Zimprich, Annemarie; Holmes, Chris; Steel, Karen P; Herault, Yann; Gailus-Durner, Valérie; Mallon, Ann-Marie; Brown, Steve Dm

    2015-09-01

    The function of the majority of genes in the mouse and human genomes remains unknown. The mouse embryonic stem cell knockout resource provides a basis for the characterization of relationships between genes and phenotypes. The EUMODIC consortium developed and validated robust methodologies for the broad-based phenotyping of knockouts through a pipeline comprising 20 disease-oriented platforms. We developed new statistical methods for pipeline design and data analysis aimed at detecting reproducible phenotypes with high power. We acquired phenotype data from 449 mutant alleles, representing 320 unique genes, of which half had no previous functional annotation. We captured data from over 27,000 mice, finding that 83% of the mutant lines are phenodeviant, with 65% demonstrating pleiotropy. Surprisingly, we found significant differences in phenotype annotation according to zygosity. New phenotypes were uncovered for many genes with previously unknown function, providing a powerful basis for hypothesis generation and further investigation in diverse systems.

  15. High Variability in Cellular Stoichiometry of Carbon, Nitrogen, and Phosphorus Within Classes of Marine Eukaryotic Phytoplankton Under Sufficient Nutrient Conditions.

    PubMed

    Garcia, Nathan S; Sexton, Julie; Riggins, Tracey; Brown, Jeff; Lomas, Michael W; Martiny, Adam C

    2018-01-01

    Current hypotheses suggest that cellular elemental stoichiometry of marine eukaryotic phytoplankton such as the ratios of cellular carbon:nitrogen:phosphorus (C:N:P) vary between phylogenetic groups. To investigate how phylogenetic structure, cell volume, growth rate, and temperature interact to affect the cellular elemental stoichiometry of marine eukaryotic phytoplankton, we examined the C:N:P composition in 30 isolates across 7 classes of marine phytoplankton that were grown with a sufficient supply of nutrients and nitrate as the nitrogen source. The isolates covered a wide range in cell volume (5 orders of magnitude), growth rate (<0.01-0.9 d⁻¹), and habitat temperature (2-24°C). Our analysis indicates that C:N:P is highly variable, with statistical model residuals accounting for over half of the total variance and no relationship between phylogeny and elemental stoichiometry. Furthermore, our data indicated that variability in C:P, N:P, and C:N within Bacillariophyceae (diatoms) was as high as that among all of the isolates that we examined. In addition, a linear statistical model identified a positive relationship between diatom cell volume and C:P and N:P. Among all of the isolates that we examined, the statistical model identified temperature as a significant factor, consistent with the temperature-dependent translation efficiency model, but temperature only explained 5% of the total statistical model variance. While some of our results support data from previous field studies, the high variability of elemental ratios within Bacillariophyceae contradicts previous work that suggests that this cosmopolitan group of microalgae has consistently low C:P and N:P ratios in comparison with other groups.

  16. Quantification and statistical significance analysis of group separation in NMR-based metabonomics studies

    PubMed Central

    Goodpaster, Aaron M.; Kennedy, Michael A.

    2015-01-01

Currently, no standard metrics are used to quantify cluster separation in PCA or PLS-DA scores plots for metabonomics studies or to determine if cluster separation is statistically significant. The lack of such measures makes it virtually impossible to compare independent or inter-laboratory studies and can lead to confusion in the metabonomics literature when authors putatively identify metabolites distinguishing classes of samples based on visual and qualitative inspection of scores plots that exhibit marginal separation. While previous papers have addressed quantification of cluster separation in PCA scores plots, none have advocated routine use of a quantitative measure of separation that is supported by a standard and rigorous assessment of whether or not the cluster separation is statistically significant. Here, quantification and statistical significance of the separation of group centroids in PCA and PLS-DA scores plots are considered. The Mahalanobis distance is used to quantify the distance between group centroids, and the two-sample Hotelling's T² statistic is computed for the data, converted to an F-statistic, and an F-test is applied to determine if the cluster separation is statistically significant. We demonstrate the value of this approach using four datasets containing various degrees of separation, ranging from groups that had no apparent visual cluster separation to groups that had no visual cluster overlap. Widespread adoption of such concrete metrics to quantify and evaluate the statistical significance of PCA and PLS-DA cluster separation would help standardize the reporting of metabonomics data. PMID:26246647
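    The distance-plus-significance recipe the abstract describes can be sketched in a few lines. The implementation below is a minimal illustration on synthetic two-group scores, not the paper's datasets:

    ```python
    import numpy as np
    from scipy import stats

    def centroid_separation(X, Y):
        """Mahalanobis distance between group centroids plus a two-sample
        Hotelling's T^2 test, converted to an F-statistic for a p-value."""
        n1, p = X.shape
        n2 = Y.shape[0]
        d = X.mean(axis=0) - Y.mean(axis=0)
        # pooled within-group covariance
        S = ((n1 - 1) * np.cov(X, rowvar=False)
             + (n2 - 1) * np.cov(Y, rowvar=False)) / (n1 + n2 - 2)
        m2 = float(d @ np.linalg.solve(S, d))          # squared Mahalanobis distance
        t2 = (n1 * n2) / (n1 + n2) * m2                # Hotelling's T^2
        f = (n1 + n2 - p - 1) / (p * (n1 + n2 - 2)) * t2
        p_value = stats.f.sf(f, p, n1 + n2 - p - 1)    # F-test for separation
        return np.sqrt(m2), p_value
    ```

    Applied to the first two PCA or PLS-DA score components, well-separated groups yield a tiny p-value while visually marginal clusters do not, which is exactly the ambiguity the authors want quantified.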

  17. A New Scoring System to Predict the Risk for High-risk Adenoma and Comparison of Existing Risk Calculators.

    PubMed

    Murchie, Brent; Tandon, Kanwarpreet; Hakim, Seifeldin; Shah, Kinchit; O'Rourke, Colin; Castro, Fernando J

    2017-04-01

Colorectal cancer (CRC) screening guidelines likely over-generalize CRC risk, 35% of Americans are not up to date with screening, and the incidence of CRC in younger patients is growing. We developed a practical prediction model for high-risk colon adenomas in an average-risk population, using an expanded definition of high-risk polyps (≥3 nonadvanced adenomas) to identify patients at higher-than-average risk. We also compared our results with previously created calculators. Patients aged 40 to 59 years undergoing first-time average-risk screening or diagnostic colonoscopies were evaluated. Risk calculators for advanced adenomas and high-risk adenomas were created based on age, body mass index, sex, race, and smoking history. Previously established calculators with similar risk factors were selected for comparison of the concordance statistic (c-statistic) and external validation. A total of 5063 patients were included. Advanced adenomas and high-risk adenomas were seen in 5.7% and 7.4% of the patient population, respectively. The c-statistic for our calculator was 0.639 for the prediction of advanced adenomas and 0.650 for high-risk adenomas. When applied to our population, all previous models had lower c-statistics, although one performed similarly. Our model compares favorably with previously established prediction models. Age and body mass index were used as continuous variables, likely improving the c-statistic. It also reports absolute predicted probabilities of advanced and high-risk polyps, allowing for more individualized risk assessment of CRC.
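    The concordance statistic the authors report is the probability that a randomly drawn case is ranked above a randomly drawn non-case by the model's risk score (ties counting one half). A minimal sketch, with made-up scores rather than the study's model:

    ```python
    import numpy as np

    def c_statistic(y_true, y_score):
        """Concordance statistic (equivalent to ROC AUC for a binary outcome):
        the fraction of case/non-case pairs in which the case gets the higher
        predicted risk, counting tied scores as 1/2."""
        y_true = np.asarray(y_true)
        y_score = np.asarray(y_score)
        pos = y_score[y_true == 1]            # scores of patients with the outcome
        neg = y_score[y_true == 0]            # scores of patients without it
        diff = pos[:, None] - neg[None, :]    # compare every case/non-case pair
        return ((diff > 0).sum() + 0.5 * (diff == 0).sum()) / diff.size
    ```

    A value of 0.5 is chance-level discrimination and 1.0 is perfect ranking, which puts the reported 0.639-0.650 in context as modest but better-than-chance discrimination.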

  18. An online sleep apnea detection method based on recurrence quantification analysis.

    PubMed

    Nguyen, Hoa Dinh; Wilkins, Brek A; Cheng, Qi; Benjamin, Bruce Allen

    2014-07-01

This paper introduces an online sleep apnea detection method based on heart rate complexity as measured by recurrence quantification analysis (RQA) statistics of heart rate variability (HRV) data. RQA statistics can capture the nonlinear dynamics of a complex cardiorespiratory system during obstructive sleep apnea. To obtain a more robust measurement of the nonstationarity of the cardiorespiratory system, we use several fixed-amount-of-neighbors thresholds for the recurrence plot calculation. We integrate a feature selection algorithm based on conditional mutual information to select the most informative RQA features for classification and, hence, to speed up the real-time classification process without degrading the performance of the system. Two types of binary classifiers, i.e., support vector machine and neural network, are used to differentiate apnea from normal sleep. A soft decision fusion rule is developed to combine the results of these classifiers in order to improve the classification performance of the whole system. Experimental results show that our proposed method achieves better classification results compared with the previous recurrence analysis-based approach. We also show that our method is flexible and a strong candidate for a practical, efficient sleep apnea detection system.
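    The fixed-amount-of-neighbors (FAN) thresholding mentioned above marks, for each sample, its k nearest neighbors as recurrent, which fixes the recurrence rate at k/N regardless of signal amplitude. A toy sketch for a scalar series (the paper works on HRV with the full set of RQA statistics):

    ```python
    import numpy as np

    def recurrence_matrix_fan(x, k):
        # FAN recurrence plot: row i marks the k nearest neighbors of sample i,
        # giving a constant recurrence rate independent of the series' scale
        x = np.asarray(x, dtype=float)
        dist = np.abs(x[:, None] - x[None, :])       # pairwise distances (scalar series)
        np.fill_diagonal(dist, np.inf)               # exclude trivial self-matches
        R = np.zeros_like(dist, dtype=bool)
        idx = np.argsort(dist, axis=1)[:, :k]        # k nearest neighbors per point
        rows = np.repeat(np.arange(len(x)), k)
        R[rows, idx.ravel()] = True
        return R

    def recurrence_rate(R):
        return R.mean()
    ```

    RQA features such as determinism and laminarity are then read off diagonal and vertical line structures in `R`; only the recurrence rate is shown here to keep the sketch short.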

  19. The analysis of factors of management of safety of critical information infrastructure with use of dynamic models

    NASA Astrophysics Data System (ADS)

    Trostyansky, S. N.; Kalach, A. V.; Lavlinsky, V. V.; Lankin, O. V.

    2018-03-01

Based on the analysis of a dynamic model of regional panel data, including fire statistics for surveillance sites, statistics for a set of regional socio-economic indicators, and the rapid-response time of the state fire service to fires, the probability of fires at surveillance sites and the risk of human death resulting from such fires are estimated from the previous year's values of the corresponding indicators: the set of regional socio-economic factors and the regional indicators of the rapid-response time of the state fire service to fires. The results obtained are consistent with applying the rational-offender model to fire risks. An estimate of the economic equivalent of human life, calculated from surveillance-site data for Russia on the basis of the presented dynamic model of fire risks, agrees well with known data from the literature. The results obtained on the basis of this econometric approach to fire risks allow us to forecast fire risks at the surveillance sites in the regions of Russia and to develop management decisions that minimize such risks.

  20. A systematic review and meta-analysis of tract-based spatial statistics studies regarding attention-deficit/hyperactivity disorder.

    PubMed

    Chen, Lizhou; Hu, Xinyu; Ouyang, Luo; He, Ning; Liao, Yi; Liu, Qi; Zhou, Ming; Wu, Min; Huang, Xiaoqi; Gong, Qiyong

    2016-09-01

Diffusion tensor imaging (DTI) studies that use tract-based spatial statistics (TBSS) have demonstrated the microstructural abnormalities of white matter (WM) in patients with attention-deficit/hyperactivity disorder (ADHD); however, robust conclusions have not yet been drawn. The present study integrated the findings of previous TBSS studies to determine the most consistent WM alterations in ADHD via a narrative review and meta-analysis. The literature search was conducted through October 2015 to identify TBSS studies that compared fractional anisotropy (FA) between ADHD patients and healthy controls. FA reductions were identified in the splenium of the corpus callosum (CC) that extended to the right cingulum, right sagittal stratum, and left tapetum. The first two clusters retained significance in the sensitivity analysis and in all subgroup analyses. The FA reduction in the CC splenium was negatively associated with the mean age of the ADHD group. We hypothesize that, in addition to the fronto-striatal-cerebellar circuit, the disturbed WM tracts that integrate the bilateral hemispheres and posterior-brain circuitries play a crucial role in the pathophysiology of ADHD. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Seeking a fingerprint: analysis of point processes in actigraphy recording

    NASA Astrophysics Data System (ADS)

    Gudowska-Nowak, Ewa; Ochab, Jeremi K.; Oleś, Katarzyna; Beldzik, Ewa; Chialvo, Dante R.; Domagalik, Aleksandra; Fąfrowicz, Magdalena; Marek, Tadeusz; Nowak, Maciej A.; Ogińska, Halszka; Szwed, Jerzy; Tyburczyk, Jacek

    2016-05-01

    Motor activity of humans displays complex temporal fluctuations which can be characterised by scale-invariant statistics, thus demonstrating that structure and fluctuations of such kinetics remain similar over a broad range of time scales. Previous studies on humans regularly deprived of sleep or suffering from sleep disorders predicted a change in the invariant scale parameters with respect to those for healthy subjects. In this study we investigate the signal patterns from actigraphy recordings by means of characteristic measures of fractional point processes. We analyse spontaneous locomotor activity of healthy individuals recorded during a week of regular sleep and a week of chronic partial sleep deprivation. Behavioural symptoms of lack of sleep can be evaluated by analysing statistics of duration times during active and resting states, and alteration of behavioural organisation can be assessed by analysis of power laws detected in the event count distribution, distribution of waiting times between consecutive movements and detrended fluctuation analysis of recorded time series. We claim that among different measures characterising complexity of the actigraphy recordings and their variations implied by chronic sleep distress, the exponents characterising slopes of survival functions in resting states are the most effective biomarkers distinguishing between healthy and sleep-deprived groups.
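    The survival-function exponents highlighted above as biomarkers can be estimated, in the simplest possible way, as the log-log slope of the empirical survival function of bout durations. The snippet below is a hedged sketch on synthetic power-law durations, not the actigraphy data:

    ```python
    import numpy as np

    def survival_slope(durations):
        # empirical survival function S(t) = P(T > t) of activity/rest bout
        # durations, with the power-law exponent read off as the log-log slope
        t = np.sort(np.asarray(durations, dtype=float))
        s = 1.0 - np.arange(1, len(t) + 1) / len(t)
        mask = (s > 0) & (t > 0)                 # drop the last (S = 0) point
        slope, _ = np.polyfit(np.log(t[mask]), np.log(s[mask]), 1)
        return slope
    ```

    For a pure Pareto tail with index alpha the slope recovers roughly -alpha; on real recordings a maximum-likelihood tail fit would be preferable to this quick least-squares estimate.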

  2. The Lévy flight foraging hypothesis: forgetting about memory may lead to false verification of Brownian motion.

    PubMed

    Gautestad, Arild O; Mysterud, Atle

    2013-01-01

    The Lévy flight foraging hypothesis predicts a transition from scale-free Lévy walk (LW) to scale-specific Brownian motion (BM) as an animal moves from resource-poor towards resource-rich environment. However, the LW-BM continuum implies a premise of memory-less search, which contradicts the cognitive capacity of vertebrates. We describe methods to test if apparent support for LW-BM transitions may rather be a statistical artifact from movement under varying intensity of site fidelity. A higher frequency of returns to previously visited patches (stronger site fidelity) may erroneously be interpreted as a switch from LW towards BM. Simulations of scale-free, memory-enhanced space use illustrate how the ratio between return events and scale-free exploratory movement translates to varying strength of site fidelity. An expanded analysis of GPS data of 18 female red deer, Cervus elaphus, strengthens previous empirical support of memory-enhanced and scale-free space use in a northern forest ecosystem. A statistical mechanical model architecture that describes foraging under environment-dependent variation of site fidelity may allow for higher realism of optimal search models and movement ecology in general, in particular for vertebrates with high cognitive capacity.

  3. Using transportation accident databases to investigate ignition and explosion probabilities of flammable spills.

    PubMed

    Ronza, A; Vílchez, J A; Casal, J

    2007-07-19

    Risk assessment of hazardous material spill scenarios, and quantitative risk assessment in particular, make use of event trees to account for the possible outcomes of hazardous releases. Using event trees entails the definition of probabilities of occurrence for events such as spill ignition and blast formation. This study comprises an extensive analysis of ignition and explosion probability data proposed in previous work. Subsequently, the results of the survey of two vast US federal spill databases (HMIRS, by the Department of Transportation, and MINMOD, by the US Coast Guard) are reported and commented on. Some tens of thousands of records of hydrocarbon spills were analysed. The general pattern of statistical ignition and explosion probabilities as a function of the amount and the substance spilled is discussed. Equations are proposed based on statistical data that predict the ignition probability of hydrocarbon spills as a function of the amount and the substance spilled. Explosion probabilities are put forth as well. Two sets of probability data are proposed: it is suggested that figures deduced from HMIRS be used in land transportation risk assessment, and MINMOD results with maritime scenarios assessment. Results are discussed and compared with previous technical literature.
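    The abstract proposes equations giving ignition probability as a function of amount spilled. A hedged sketch of how such a curve could be fit: the logistic functional form, the binned frequencies, and the starting values below are illustrative assumptions, not the published equations or the HMIRS/MINMOD data.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(log_mass, a, b):
        # candidate ignition-probability curve as a function of log10(amount)
        return 1.0 / (1.0 + np.exp(-(a + b * log_mass)))

    # hypothetical ignition frequencies per spill-size bin
    log_mass = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
    p_obs = np.array([0.01, 0.02, 0.05, 0.12, 0.25, 0.45])

    (a, b), _ = curve_fit(logistic, log_mass, p_obs, p0=(-4.0, 0.8))
    ```

    A positive fitted `b` encodes the pattern the survey reports: larger spills ignite more often. In a quantitative risk assessment the fitted curve would supply the ignition branch probability of the event tree.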

  4. Identification and characterization of near-fatal asthma phenotypes by cluster analysis.

    PubMed

    Serrano-Pariente, J; Rodrigo, G; Fiz, J A; Crespo, A; Plaza, V

    2015-09-01

Near-fatal asthma (NFA) is a heterogeneous clinical entity and several profiles of patients have been described according to different clinical, pathophysiological and histological features. However, no previous studies have identified different phenotypes of NFA in an unbiased way, using statistical methods such as cluster analysis. Therefore, the aim of the present study was to identify and characterize phenotypes of near-fatal asthma using a cluster analysis. Over a period of 2 years, 33 Spanish hospitals enrolled 179 asthmatics admitted for an episode of NFA. A cluster analysis using a two-step algorithm was performed on data from 84 of these cases. The analysis defined three clusters of patients with NFA: cluster 1, the largest, including older patients with clinical and therapeutic criteria of severe asthma; cluster 2, with a high proportion of respiratory arrest (68%), impaired consciousness level (82%) and mechanical ventilation (93%); and cluster 3, which included younger patients, characterized by insufficient anti-inflammatory treatment and frequent sensitization to Alternaria alternata and soybean. These results identify specific asthma phenotypes involved in NFA, confirming in part previous findings observed in studies with a clinical approach. The identification of patients with a specific NFA phenotype could suggest interventions to prevent future severe asthma exacerbations. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
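    The phenotype-discovery step can be sketched generically. This is an illustrative stand-in only: k-means on fabricated, well-separated "clinical feature" groups, substituting for the study's two-step algorithm and real patient data.

    ```python
    import numpy as np
    from scipy.cluster.vq import kmeans2

    # three synthetic patient groups in a 2-D standardized feature space
    rng = np.random.default_rng(7)
    centers = np.array([[0.0, 0.0], [8.0, 0.0], [0.0, 8.0]])
    data = np.vstack([rng.normal(c, 1.0, (40, 2)) for c in centers])

    np.random.seed(7)                      # kmeans2 initializes from the global state
    _, labels = kmeans2(data, 3, minit='++')
    ```

    In practice the number of clusters is itself a modeling choice; the two-step algorithm the authors used selects it automatically, whereas k-means requires it up front.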

  5. Statistical Learning of Probabilistic Nonadjacent Dependencies by Multiple-Cue Integration

    ERIC Educational Resources Information Center

    van den Bos, Esther; Christiansen, Morten H.; Misyak, Jennifer B.

    2012-01-01

    Previous studies have indicated that dependencies between nonadjacent elements can be acquired by statistical learning when each element predicts only one other element (deterministic dependencies). The present study investigates statistical learning of probabilistic nonadjacent dependencies, in which each element predicts several other elements…

  6. A geographical information system-based analysis of cancer mortality and population exposure to coal mining activities in West Virginia, United States of America.

    PubMed

    Hendryx, Michael; Fedorko, Evan; Anesetti-Rothermel, Andrew

    2010-05-01

    Cancer incidence and mortality rates are high in West Virginia compared to the rest of the United States of America. Previous research has suggested that exposure to activities of the coal mining industry may contribute to elevated cancer mortality, although exposure measures have been limited. This study tests alternative specifications of exposure to mining activity to determine whether a measure based on location of mines, processing plants, coal slurry impoundments and underground slurry injection sites relative to population levels is superior to a previously-reported measure of exposure based on tons mined at the county level, in the prediction of age-adjusted cancer mortality rates. To this end, we utilize two geographical information system (GIS) techniques--exploratory spatial data analysis and inverse distance mapping--to construct new statistical analyses. Total, respiratory and "other" age-adjusted cancer mortality rates in West Virginia were found to be more highly associated with the GIS-exposure measure than the tonnage measure, before and after statistical control for smoking rates. The superior performance of the GIS measure, based on where people in the state live relative to mining activity, suggests that activities of the industry contribute to cancer mortality. Further confirmation of observed phenomena is necessary with person-level studies, but the results add to the body of evidence that coal mining poses environmental risks to population health in West Virginia.
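    The GIS-based exposure measure described above scores populations by where they live relative to mining activity. A toy inverse-distance sketch of that idea; the coordinates, the distance cutoff, and the 1/d kernel are all assumptions for illustration, not the study's actual measure:

    ```python
    import numpy as np

    def idw_exposure(pop_xy, site_xy, max_km=30.0):
        # exposure of each population point = sum of 1/d over all
        # mining-activity sites within max_km; distant sites contribute 0
        d = np.linalg.norm(pop_xy[:, None, :] - site_xy[None, :, :], axis=2)
        d = np.where(d < max_km, np.maximum(d, 1e-6), np.inf)
        return (1.0 / d).sum(axis=1)

    pop = np.array([[0.0, 0.0], [100.0, 0.0]])            # two population points
    sites = np.array([[1.0, 0.0], [2.0, 0.0], [99.0, 0.0]])  # three activity sites
    exposure = idw_exposure(pop, sites)
    ```

    Scores like these can then be regressed against age-adjusted mortality rates with smoking as a covariate, which is the comparison the abstract describes against the county-tonnage measure.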

  7. Exposure to the dental environment and prevalence of respiratory illness in dental student populations.

    PubMed

    Scannapieco, Frank A; Ho, Alex W; DiTolla, Maris; Chen, Casey; Dentino, Andrew R

    2004-03-01

    To determine if the prevalence of respiratory disease among dental students and dental residents varies with their exposure to the clinical dental environment. A detailed questionnaire was administered to 817 students at 3 dental schools. The questionnaire sought information concerning demographic characteristics, school year, exposure to the dental environment and dental procedures, and history of respiratory disease. The data obtained were subjected to bivariate and multiple logistic regression analysis. Respondents reported experiencing the following respiratory conditions during the previous year: asthma (26 cases), bronchitis (11 cases), chronic lung disease (6 cases), pneumonia (5 cases) and streptococcal pharyngitis (50 cases). Bivariate statistical analyses indicated no significant associations between the prevalence of any of the respiratory conditions and year in dental school, except for asthma, for which there was a significantly higher prevalence at 1 school compared to the other 2 schools. When all cases of respiratory disease were combined as a composite variable and subjected to multivariate logistic regression analysis controlling for age, sex, race, dental school, smoking history and alcohol consumption, no statistically significant association was observed between respiratory condition and year in dental school or exposure to the dental environment as a dental patient. No association was found between the prevalence of respiratory disease and a student's year in dental school or previous exposure to the dental environment as a patient. These results suggest that exposure to the dental environment does not increase the risk for respiratory infection in healthy dental health care workers.

  8. An analysis of fosaprepitant-induced venous toxicity in patients receiving highly emetogenic chemotherapy

    PubMed Central

    Leal, Alexis D.; Grendahl, Darryl C.; Seisler, Drew K.; Sorgatz, Kristine M.; Anderson, Kari J.; Hilger, Crystal R.; Loprinzi, Charles L.

    2015-01-01

    Purpose Fosaprepitant is an antiemetic used for chemotherapy-induced nausea and vomiting. We recently reported increased infusion site adverse events (ISAE) in a cohort of breast cancer patients receiving chemotherapy with doxorubicin and cyclophosphamide (AC). In this current study, we evaluated the venous toxicity of fosaprepitant use with non-anthracycline platinum-based antineoplastic regimens. Methods A retrospective review was conducted of the first 81 patients initiated on fosaprepitant among patients receiving highly emetogenic chemotherapy, on or after January 1, 2011 at Mayo Clinic Rochester. None of these regimens included an anthracycline. Data collected included baseline demographics, chemotherapy regimen, type of intravenous access and type, and severity of ISAE. Data from these patients were compared to previously collected data from patients who had received AC. Statistical analysis using χ2 and univariate logistic regression was used to evaluate the association between treatment regimen, fosaprepitant, and risk of ISAE. Results Among these 81 patients, the incidence of ISAE was 7.4 % in the non-anthracycline platinum group. The most commonly reported ISAE were swelling (3 %), extravasation (3 %), and phlebitis (3 %). When stratified by regimen, fosaprepitant was associated with a statistically significant increased risk of ISAE in the anthracycline group (OR 8.1; 95 % CI 2.0–31.9) compared to the platinum group. Conclusions Fosaprepitant antiemetic therapy causes significant ISAE that are appreciably higher than previous reports. Patients receiving platinum-based chemotherapy appear to have less significant ISAE than do patients who receive anthracycline-based regimens. PMID:24964876

  9. [Analysis of the Structure of Acute Psychotic Disorder].

    PubMed

    Gerardo, Téllez R; Ricardo, Sánchez P; Luis, Eduardo Jaramillo

    2012-03-01

Schizophrenia is a clinically heterogeneous disorder. A multifactorial structure of this syndrome has been described in previous reports. The aim of this study was to evaluate the possible diagnostic categories in patients with acute psychotic symptoms by studying their clinical characteristics in a cross-sectional study. An instrument for measuring psychotic symptoms was created from previous scales (SANS, SAPS, BPRS, EMUN, Zung depression scale). Using statistical indices and item redundancy as criteria, the initial 101-item instrument was reduced to 57 items. 232 patients with acute psychotic symptoms, in most cases schizophrenia, attending Clínica Nuestra Señora de la Paz in Bogotá and Hospital San Juan de Dios in Chía were evaluated from April 2008 to December 2009. Multivariate statistical methods were used to analyze the data. A six-factor structure was found (deficit, paranoid-aggressive, disorganized, depressive, bizarre delusions, hallucinations). Cluster analysis showed eight subtypes that can be described as: 1) bizarre delusions-hallucinations; 2) deterioration and disorganized behavior; 3) deterioration; 4) deterioration and paranoid-aggressive behavior; 5) bizarre delusions; 6) paranoia-anxiety-aggressiveness; 7) depressive symptoms and bizarre delusions; 8) paranoia and aggressiveness with depressive symptoms. These subtypes allow a more exhaustive characterization than those included in standard classification schemes and should be validated in longitudinal studies. Copyright © 2012 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.

  10. A meta-analysis of neuropsychological outcome after mild traumatic brain injury: re-analyses and reconsiderations of Binder et al. (1997), Frencham et al. (2005), and Pertab et al. (2009).

    PubMed

    Rohling, Martin L; Binder, Laurence M; Demakis, George J; Larrabee, Glenn J; Ploetz, Danielle M; Langhinrichsen-Rohling, Jennifer

    2011-05-01

    The meta-analytic findings of Binder et al. (1997) and Frencham et al. (2005) showed that the neuropsychological effect of mild traumatic brain injury (mTBI) was negligible in adults by 3 months post injury. Pertab et al. (2009) reported that verbal paired associates, coding tasks, and digit span yielded significant differences between mTBI and control groups. We re-analyzed data from the 25 studies used in the prior meta-analyses, correcting statistical and methodological limitations of previous efforts, and analyzed the chronicity data by discrete epochs. Three months post injury the effect size of -0.07 was not statistically different from zero and similar to that which has been found in several other meta-analyses (Belanger et al., 2005; Schretlen & Shapiro, 2003). The effect size 7 days post injury was -0.39. The effect of mTBI immediately post injury was largest on Verbal and Visual Memory domains. However, 3 months post injury all domains improved to show non-significant effect sizes. These findings indicate that mTBI has an initial small effect on neuropsychological functioning that dissipates quickly. The evidence of recovery in the present meta-analysis is consistent with previous conclusions of both Binder et al. and Frencham et al. Our findings may not apply to people with a history of multiple concussions or complicated mTBIs.
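    The pooled effect sizes discussed here come from inverse-variance weighting across studies. A minimal fixed-effect sketch with illustrative numbers (not the re-analyzed data), showing how a small pooled effect can be statistically indistinguishable from zero:

    ```python
    import numpy as np

    def fixed_effect_pool(effects, variances):
        """Inverse-variance-weighted pooled effect size with a 95% CI,
        the standard fixed-effect meta-analytic combination."""
        w = 1.0 / np.asarray(variances, dtype=float)
        d = np.asarray(effects, dtype=float)
        pooled = np.sum(w * d) / np.sum(w)          # weighted mean effect
        se = np.sqrt(1.0 / np.sum(w))               # standard error of the pool
        return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)
    ```

    A random-effects model would additionally estimate between-study heterogeneity; the fixed-effect version is shown only to make the weighting explicit.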

  11. Experimental design of an interlaboratory study for trace metal analysis of liquid fuels [for aerospace vehicles]

    NASA Technical Reports Server (NTRS)

    Greenbauer-Seng, L. A.

    1983-01-01

The accurate determination of trace metals in fuels is an important requirement in much of the research into and development of alternative fuels for aerospace applications. Recognizing the detrimental effects of certain metals on fuel performance and fuel systems at the part-per-million, and in some cases part-per-billion, level requires improved accuracy in determining these low-concentration elements. Accurate analyses are also required to ensure interchangeability of analysis results between vendor, researcher, and end user for purposes of quality control. Previous interlaboratory studies have demonstrated the inability of different laboratories to agree on the results of metal analysis, particularly at low concentration levels, even though good precision is typically reported within a laboratory. An interlaboratory study was designed to gain statistical information about the sources of variation in the reported concentrations. Five participant laboratories were used on a fee basis and were not informed of the purpose of the analyses. The effects of laboratory, analytical technique, concentration level, and ashing additive were studied in four fuel types for 20 elements of interest. The prescribed sample preparation schemes (variations of dry ashing) were used by all of the laboratories. The analytical data were statistically evaluated using a computer program for the analysis of variance technique.

  12. CT coronary angiography: impact of adapted statistical iterative reconstruction (ASIR) on coronary stenosis and plaque composition analysis.

    PubMed

    Fuchs, Tobias A; Fiechter, Michael; Gebhard, Cathérine; Stehli, Julia; Ghadri, Jelena R; Kazakauskaite, Egle; Herzog, Bernhard A; Husmann, Lars; Gaemperli, Oliver; Kaufmann, Philipp A

    2013-03-01

    To assess the impact of adaptive statistical iterative reconstruction (ASIR) on coronary plaque volume and composition analysis as well as on stenosis quantification in high definition coronary computed tomography angiography (CCTA). We included 50 plaques in 29 consecutive patients who were referred for the assessment of known or suspected coronary artery disease (CAD) with contrast-enhanced CCTA on a 64-slice high definition CT scanner (Discovery HD 750, GE Healthcare). CCTA scans were reconstructed with standard filtered back projection (FBP) with no ASIR (0 %) or with increasing contributions of ASIR, i.e. 20, 40, 60, 80 and 100 % (no FBP). Plaque analysis (volume, components and stenosis degree) was performed using a previously validated automated software. Mean values for minimal diameter and minimal area as well as degree of stenosis did not change significantly using different ASIR reconstructions. There was virtually no impact of reconstruction algorithms on mean plaque volume or plaque composition (e.g. soft, intermediate and calcified component). However, with increasing ASIR contribution, the percentage of plaque volume component between 401 and 500 HU decreased significantly (p < 0.05). Modern image reconstruction algorithms such as ASIR, which has been developed for noise reduction in latest high resolution CCTA scans, can be used reliably without interfering with the plaque analysis and stenosis severity assessment.

  13. Physical and genetic-interaction density reveals functional organization and informs significance cutoffs in genome-wide screens

    PubMed Central

    Dittmar, John C.; Pierce, Steven; Rothstein, Rodney; Reid, Robert J. D.

    2013-01-01

    Genome-wide experiments often measure quantitative differences between treated and untreated cells to identify affected strains. For these studies, statistical models are typically used to determine significance cutoffs. We developed a method termed “CLIK” (Cutoff Linked to Interaction Knowledge) that overlays biological knowledge from the interactome on screen results to derive a cutoff. The method takes advantage of the fact that groups of functionally related interacting genes often respond similarly to experimental conditions and, thus, cluster in a ranked list of screen results. We applied CLIK analysis to five screens of the yeast gene disruption library and found that it defined a significance cutoff that differed from traditional statistics. Importantly, verification experiments revealed that the CLIK cutoff correlated with the position in the rank order where the rate of true positives drops off significantly. In addition, the gene sets defined by CLIK analysis often provide further biological perspectives. For example, applying CLIK analysis retrospectively to a screen for cisplatin sensitivity allowed us to identify the importance of the Hrq1 helicase in DNA crosslink repair. Furthermore, we demonstrate the utility of CLIK to determine optimal treatment conditions by analyzing genome-wide screens at multiple rapamycin concentrations. We show that CLIK is an extremely useful tool for evaluating screen quality, determining screen cutoffs, and comparing results between screens. Furthermore, because CLIK uses previously annotated interaction data to determine biologically informed cutoffs, it provides additional insights into screen results, which supplement traditional statistical approaches. PMID:23589890

  14. Association of bladder sensation measures and bladder diary in patients with urinary incontinence.

    PubMed

    King, Ashley B; Wolters, Jeff P; Klausner, Adam P; Rapp, David E

    2012-04-01

Investigation suggests the involvement of afferent actions in the pathophysiology of urinary incontinence. Current diagnostic modalities do not allow for the accurate identification of sensory dysfunction. We previously reported urodynamic derivatives that may be useful in assessing bladder sensation. We sought to further investigate these derivatives by assessing for a relationship with a 3-day bladder diary. Subset analysis was performed in patients without stress urinary incontinence (SUI), attempting to isolate patients with urgency symptoms. No association was demonstrated between bladder diary parameters and urodynamic derivatives (r coefficient range -0.06 to 0.08; p > 0.05). However, subset analysis demonstrated an association between detrusor overactivity (DO) and bladder urgency velocity (BUV), with a lower BUV identified in patients without DO. Subset analysis of patients with isolated urgency/urge incontinence identified weak associations between voiding frequency and FSR (r = 0.39) and between daily incontinence episodes and BUV (r = 0.35). However, these associations failed to demonstrate statistical significance. No statistical association was seen between bladder diary and urodynamic derivatives. This is not unexpected, given that bladder diary parameters may reflect numerous pathologies, including not only sensory dysfunction but also SUI and DO. However, weak associations were identified in patients without SUI and, further, a statistical relationship between DO and BUV was seen. Additional research is needed to assess the utility of FSR/BUV in characterizing sensory dysfunction, especially in patients without concurrent pathology (e.g. SUI, DO).

  15. Identification of Chemical Attribution Signatures of Fentanyl Syntheses Using Multivariate Statistical Analysis of Orthogonal Analytical Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayer, B. P.; Mew, D. A.; DeHope, A.

    Attribution of the origin of an illicit drug relies on identification of compounds indicative of its clandestine production and is a key component of many modern forensic investigations. The results of these studies can yield detailed information on method of manufacture, starting material source, and final product - all critical forensic evidence. In the present work, chemical attribution signatures (CAS) associated with the synthesis of the analgesic fentanyl, N-(1-phenylethylpiperidin-4-yl)-N-phenylpropanamide, were investigated. Six synthesis methods, all previously published fentanyl synthetic routes or hybrid versions thereof, were studied in an effort to identify and classify route-specific signatures. In total, 160 distinct compounds and inorganic species were identified using gas and liquid chromatographies combined with mass spectrometric methods (GC-MS and LC-MS/MS-TOF) in conjunction with inductively coupled plasma mass spectrometry (ICP-MS). The complexity of the resultant data matrix urged the use of multivariate statistical analysis. Using partial least squares discriminant analysis (PLS-DA), 87 route-specific CAS were classified and a statistical model capable of predicting the method of fentanyl synthesis was validated and tested against CAS profiles from crude fentanyl products deposited and later extracted from two operationally relevant surfaces: stainless steel and vinyl tile. This work provides the most detailed fentanyl CAS investigation to date by using orthogonal mass spectral data to identify CAS of forensic significance for illicit drug detection, profiling, and attribution.
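
    The PLS-DA classification step this record describes can be sketched in miniature. Nothing below comes from the study itself: the data are synthetic, and the NIPALS-style PLS1 implementation, the +/-1 class coding, and the two-component choice are all illustrative assumptions.

```python
import numpy as np

def pls_da_fit(X, y, n_comp=2):
    """Minimal PLS1 (NIPALS-style) fit for PLS-DA; y holds +/-1 class codes."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, Q = [], [], []
    for _ in range(n_comp):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)        # weight vector
        t = Xc @ w                    # component scores
        tt = t @ t
        p = Xc.T @ t / tt             # X loadings
        q = (yc @ t) / tt             # y loading
        Xc -= np.outer(t, p)          # deflate X
        yc = yc - q * t               # deflate y
        W.append(w); P.append(p); Q.append(q)
    W, P = np.array(W).T, np.array(P).T
    B = W @ np.linalg.solve(P.T @ W, np.array(Q))  # regression coefficients
    return B, x_mean, y_mean

def pls_da_predict(X, B, x_mean, y_mean):
    """Classify by the sign of the continuous PLS prediction."""
    return np.where((X - x_mean) @ B + y_mean >= 0.0, 1, -1)

# synthetic "signature" data: the class depends on the first two features
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 6))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
B, xm, ym = pls_da_fit(X, y.astype(float), n_comp=2)
acc = float(np.mean(pls_da_predict(X, B, xm, ym) == y))
```

    A real CAS model would also require cross-validation and permutation testing before any claim about route prediction.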

  16. Mining Claim Activity on Federal Land in the United States

    USGS Publications Warehouse

    Causey, J. Douglas

    2007-01-01

    Several statistical compilations of mining claim activity on Federal land derived from the Bureau of Land Management's LR2000 database have previously been published by the U.S. Geological Survey (USGS). The work in the 1990s did not include Arkansas or Florida. None of the previous reports included Alaska because its records are stored in a separate database (the Alaska Land Information System) and in a different format. This report includes data for all states for which there are Federal mining claim records, beginning in 1976 and continuing to the present. The intent is to update the spatial and statistical data associated with this report on an annual basis, beginning with 2005 data. The statistics compiled from the databases are counts of the number of active mining claims in a section of land each year from 1976 to the present for all states within the United States. Claim statistics are subset by lode and placer types, along with a dataset summarizing all claims, including mill site and tunnel site claims. One table presents data by case type, case status, and number of claims in a section. This report includes a spatial database for each state in which mining claims were recorded, except North Dakota, which has had only two claims. A field is present that allows the statistical data to be joined to the spatial databases so that spatial displays and analysis can be done using appropriate geographic information system (GIS) software. The data show how mining claim activity has changed in intensity, space, and time. Variations can be examined at the state as well as the national level. The data are tied to a section of land, approximately 640 acres, which allows them to be used at regional as well as local scales. The data pertain only to Federal land and mineral estate that was open to mining claim location at the time the claims were staked.

  17. A global estimate of the Earth's magnetic crustal thickness

    NASA Astrophysics Data System (ADS)

    Vervelidou, Foteini; Thébault, Erwan

    2014-05-01

    The Earth's lithosphere is considered to be magnetic only down to the Curie isotherm. Therefore, the Curie isotherm can, in principle, be estimated by analysis of magnetic data. Here, we propose such an analysis in the spectral domain by means of a newly introduced regional spatial power spectrum. This spectrum is based on the Revised Spherical Cap Harmonic Analysis (R-SCHA) formalism (Thébault et al., 2006). We briefly discuss its properties and its relationship with the Spherical Harmonic spatial power spectrum. This relationship allows us to adapt any theoretical expression of the lithospheric field power spectrum expressed in Spherical Harmonic degrees to the regional formulation. We compared previously published statistical expressions (Jackson, 1994; Voorhies et al., 2002) to recent lithospheric field models derived from CHAMP and airborne measurements, and we finally developed a new statistical form for the power spectrum of the Earth's magnetic lithosphere that we think provides more consistent results. This expression depends on the mean magnetization, the mean crustal thickness, and a power law value that describes the amount of spatial correlation of the sources. In this study, we make combined use of the R-SCHA surface power spectrum and this statistical form. We conduct a series of regional spectral analyses for the entire Earth. For each region, we estimate the R-SCHA surface power spectrum of the NGDC-720 Spherical Harmonic model (Maus, 2010). We then fit each of these observational spectra to the statistical expression of the power spectrum of the Earth's lithosphere. By doing so, we estimate the long wavelengths of the magnetic crustal thickness on a global scale, which are not accessible directly from the magnetic measurements because of the masking core field.
    We then discuss these results and compare them to the results we obtained by conducting a similar spectral analysis, but this time in Cartesian coordinates, by means of a published statistical expression (Maus et al., 1997). We also compare our results to crustal thickness global maps derived from additional geophysical data (Purucker et al., 2002).

  18. Emotional intelligence and perceived stress.

    PubMed

    Naidoo, Sudeshni; Pau, Allan

    2008-04-01

    Many studies have reported that high levels of stress and psychological morbidity occur in students in the health care professions. Stress has been defined as the strain that accompanies a demand perceived to be either challenging (positive) or threatening (negative) and, depending on the appraisal, may be either adaptive or debilitating. The aim of the present survey was to gain some understanding of the explanatory factors for stress and to evaluate the role that emotional intelligence (EI) plays in the experience of perceived stress (PS). It also aimed to compare EI and PS and to explore the associations among academic background, satisfaction with career choice, EI, and PS in first year dental students. A cross-sectional survey was conducted at the Faculty of Dentistry, University of the Western Cape. First year dental undergraduates who had completed at least six months of their dental degree course during 2005/06 were invited to complete a set of questionnaires on emotional intelligence and perceived stress. Demographic questions included gender and age. Students were also asked if they had a previous qualification from a higher education institution and if they were satisfied with their decision to study dentistry. Ninety-eight students completed the questionnaires, representing a response rate of 96%; 43 were male (44%) and 55 female (56%). Results of t-tests indicated that low scorers on the EI scale were more likely to be (i) younger compared to older students (p<0.001), (ii) those without compared to those with a previous higher education qualification (p<0.001), and (iii) those who were not satisfied compared to those who were satisfied with their decision to study dentistry (p<0.001).
    Statistically significant differences were noted in mean PS scores between (i) male and female students (p<0.05), (ii) younger compared to older students (p<0.001), (iii) those without compared to those with a previous higher education qualification (p<0.001), and (iv) those who were not satisfied compared to those who were satisfied with their decision to study dentistry (p<0.001). Correlation analysis indicated a statistically significant inverse relationship between EI and PS (coefficient = -0.50, p = 0.001). Stepwise regression analysis identified gender, previous higher education qualification, satisfaction with decision to study dentistry, and EI as significant predictors of PS. The t statistic indicates that EI is relatively the most important predictor of PS. The finding that low EI is associated with stress suggests two possible strategies: first, selection of prospective students could take EI into account; second, interventions could be introduced to enhance students' emotional intelligence.

  19. Statistical detection of EEG synchrony using empirical bayesian inference.

    PubMed

    Singh, Archana K; Asoh, Hideki; Takeda, Yuji; Phillips, Steven

    2015-01-01

    There is growing interest in understanding how the brain utilizes synchronized oscillatory activity to integrate information across functionally connected regions. Computing phase-locking values (PLV) between EEG signals is a popular method for quantifying such synchronizations and elucidating their role in cognitive tasks. However, the high dimensionality of PLV data incurs a serious multiple testing problem. Standard multiple testing methods in neuroimaging research (e.g., false discovery rate, FDR) suffer severe loss of power, because they fail to exploit the complex dependence structure between hypotheses that vary across spectral, temporal, and spatial dimensions. Previously, we showed that hierarchical FDR and optimal discovery procedures could be effectively applied to PLV analysis to provide better power than FDR. In this article, we revisit the multiple comparison problem from a new Empirical Bayes perspective and propose the application of the local FDR method (locFDR; Efron, 2001) for PLV synchrony analysis, which computes FDR as the posterior probability that an observed statistic belongs to the null hypothesis. We demonstrate the application of Efron's Empirical Bayes approach to PLV synchrony analysis for the first time. We use simulations to validate the specificity and sensitivity of locFDR, and a real EEG dataset from a visual search study for experimental validation. We also compare locFDR with hierarchical FDR and optimal discovery procedures in both simulation and experimental analyses. Our simulation results showed that locFDR can effectively control false positives without compromising the power of PLV synchrony inference. Applying locFDR to the experimental data detected more significant discoveries than our previously proposed methods, whereas the standard FDR method failed to detect any significant discoveries.
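
    The PLV statistic at the center of this record has a compact definition: the magnitude of the trial-averaged unit phasor of the phase difference between two signals. The sketch below is illustrative only; the phase data are simulated directly rather than extracted from EEG (a real pipeline would first band-pass filter and apply a Hilbert or wavelet transform).

```python
import numpy as np

def plv(phase1, phase2):
    """Phase-locking value: |mean over trials of exp(i * (phase1 - phase2))|."""
    return np.abs(np.mean(np.exp(1j * (phase1 - phase2))))

rng = np.random.default_rng(1)
n_trials = 200
base = rng.uniform(0.0, 2.0 * np.pi, n_trials)
locked = base + 0.1 * rng.standard_normal(n_trials)   # tightly coupled phases
unlocked = rng.uniform(0.0, 2.0 * np.pi, n_trials)    # independent phases

plv_locked = plv(base, locked)       # close to 1
plv_unlocked = plv(base, unlocked)   # close to 0
```

    It is exactly this statistic, computed over many channel pairs, frequencies, and time points, that produces the multiple testing problem the locFDR approach addresses.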

  20. Cone Beam Computed Tomography Analysis of Oropharyngeal Airway in Preadolescent Nonsyndromic Bilateral and Unilateral Cleft Lip and Palate Patients.

    PubMed

    Al-Fahdawi, Mahmood Abd; El-Kassaby, Marwa Abdelwahab; Farid, Mary Medhat; El-Fotouh, Mona Abou

    2018-01-01

    Objective The objective of this study was to assess the volume, area, and dimensions of the oropharyngeal airway (OPA) in previously repaired nonsyndromic unilateral cleft lip and palate (UCLP) versus bilateral cleft lip and palate (BCLP) patients compared with noncleft controls using cone beam computed tomography (CBCT). Design This was a retrospective case-control study. Setting The Cleft Care Center and the outpatient clinic affiliated with our faculty were the settings for the study. Participants A total of 58 CBCT scans of preadolescent individuals were selected: 14 BCLP, 20 UCLP, and 24 age- and gender-matched noncleft controls. Variables Variables were volume, cross-sectional area (CSA), midsagittal area (MSA), and dimensions of the OPA. Statistical analysis One-way analysis of variance and post hoc tests were used to compare variables. Statistical significance was set at P ≤ .05. Results UCLP patients showed significantly smaller superior oropharyngeal airway volume than both controls and BCLP patients (P ≤ .05). BCLP patients showed significantly larger CSA at the soft palate plane and significantly larger MSA than both UCLP patients and controls (P < .05). Conclusions UCLP patients at the studied age and stage of previously repaired clefts have significantly less superior oropharyngeal airway volume than both controls and BCLP patients. This suggests that preadolescents with UCLP are at greater risk for superior oropharyngeal airway obstruction compared with those with BCLP and with controls. Furthermore, BCLP patients showed significantly larger CSA at the soft palate plane and larger MSA than both controls and UCLP patients. These variations in OPA characteristics of cleft patients can influence function in terms of respiration and vocalization.

  1. Volume analysis of heat-induced cracks in human molars: A preliminary study

    PubMed Central

    Sandholzer, Michael A.; Baron, Katharina; Heimel, Patrick; Metscher, Brian D.

    2014-01-01

    Context: Only a few methods have been published dealing with the visualization of heat-induced cracks inside bones and teeth. Aims: As a novel approach, this study used nondestructive X-ray microtomography (micro-CT) for volume analysis of heat-induced cracks to observe the reaction of human molars to various levels of thermal stress. Materials and Methods: Eighteen clinically extracted third molars were rehydrated and burned at controlled temperatures (400, 650, and 800°C) in an electric furnace with a heating rate of 25°C/min. The subsequent high-resolution scans (voxel size 17.7 μm) were made with a compact micro-CT scanner (SkyScan 1174). In total, 14 scans were automatically segmented with Definiens XD Developer 1.2, and three-dimensional (3D) models were computed with Visage Imaging Amira 5.2.2. The results of the automated segmentation were analyzed with an analysis of variance (ANOVA) and uncorrected post hoc least significant difference (LSD) tests using the Statistical Package for the Social Sciences (SPSS) 17. A probability level of P < 0.05 was used as the index of statistical significance. Results: A temperature-dependent increase in heat-induced cracks was observed between the three temperature groups (P < 0.05, ANOVA post hoc LSD). In addition, the distribution and shape of the heat-induced changes could be classified using the computed 3D models. Conclusion: The macroscopic heat-induced changes observed in this preliminary study correspond with previous observations of unrestored human teeth, yet the current observations also take into account the entire microscopic 3D expansion of heat-induced cracks within the dental hard tissues. Using the same experimental conditions proposed in the literature, this study confirms previous results, adds new observations, and offers new perspectives in the investigation of forensic evidence. PMID:25125923

  2. A Cladistic Analysis of Phenotypic Associations with Haplotypes Inferred from Restriction Endonuclease Mapping. IV. Nested Analyses with Cladogram Uncertainty and Recombination

    PubMed Central

    Templeton, A. R.; Sing, C. F.

    1993-01-01

    We previously developed an analytical strategy based on cladistic theory to identify subsets of haplotypes that are associated with significant phenotypic deviations. Our initial approach was limited to segments of DNA in which little recombination occurs. In such cases, a cladogram can be constructed from the restriction site data to estimate the evolutionary steps that interrelate the observed haplotypes. The cladogram is then used to define a nested statistical design for identifying mutational steps associated with significant phenotypic deviations. The central assumption behind this strategy is that a mutation responsible for a particular phenotypic effect is embedded within the evolutionary history that is represented by the cladogram. The power of this approach depends on the accuracy of the cladogram in portraying the evolutionary history of the DNA region. This accuracy can be diminished both by recombination and by uncertainty in the estimated cladogram topology. In a previous paper, we presented an algorithm for estimating the set of likely cladograms and recombination events. In this paper, we present an algorithm for defining a nested statistical design under cladogram uncertainty and recombination. Given the nested design, phenotypic associations can be examined using either a nested analysis of variance (for haploids or homozygous strains) or permutation testing (for outcrossed, diploid gene regions). We also extend this analytical strategy to include categorical phenotypes in addition to quantitative phenotypes. Some worked examples are presented using Drosophila data sets. These examples illustrate that having some recombination may actually enhance the biological inferences that may be derived from a cladistic analysis. In particular, recombination can be used to physically localize mutations responsible for significant phenotypic effects to a given subregion. PMID:8100789

  3. How reliable are gray matter disruptions in specific reading disability across multiple countries and languages? Insights from a large-scale voxel-based morphometry study.

    PubMed

    Jednoróg, Katarzyna; Marchewka, Artur; Altarelli, Irene; Monzalvo Lopez, Ana Karla; van Ermingen-Marbach, Muna; Grande, Marion; Grabowska, Anna; Heim, Stefan; Ramus, Franck

    2015-05-01

    The neural basis of specific reading disability (SRD) remains only partly understood. A dozen studies have used voxel-based morphometry (VBM) to investigate gray matter volume (GMV) differences between SRD and control children; however, recent meta-analyses suggest that few regions are consistent across studies. We used data collected across three countries (France, Poland, and Germany) with the aim both of increasing sample size (236 SRD and control children) to obtain a clearer picture of group differences, and of further assessing the consistency of the findings across languages. VBM analysis reveals a significant group difference in a single cluster in the left thalamus. Furthermore, we observe correlations between reading accuracy and GMV in the left supramarginal gyrus and in the left cerebellum, in controls only. Most strikingly, we fail to replicate all the group differences in GMV reported in previous studies, despite the superior statistical power. The main limitation of this study is the heterogeneity of the sample, drawn from different countries (i.e., speaking languages with varying orthographic transparencies) and selected based on different assessment batteries. Nevertheless, analyses within each country support the conclusions of the cross-linguistic analysis. Explanations for the discrepancy between the present and previous studies may include: (1) the limited suitability of VBM for revealing the subtle brain disruptions underlying SRD; (2) insufficient correction for multiple statistical tests and flexibility in data analysis; and (3) publication bias in favor of positive results. Thus, the study echoes widespread concerns about the risk of false-positive results inherent to small-scale VBM studies. © 2015 Wiley Periodicals, Inc.

  4. Bubbles and denaturation in DNA

    NASA Astrophysics Data System (ADS)

    van Erp, T. S.; Cuesta-López, S.; Peyrard, M.

    2006-08-01

    The local opening of DNA is an intriguing phenomenon from a statistical-physics point of view, but it is also essential for the molecule's biological function. For instance, the transcription and replication of our genetic code cannot take place without the unwinding of the DNA double helix. Although these biological processes are driven by proteins, there might well be a relation between these biological openings and the spontaneous bubble formation due to thermal fluctuations. Mesoscopic models, like the Peyrard-Bishop-Dauxois (PBD) model, have fairly accurately reproduced some experimental denaturation curves and the sharp phase transition in the thermodynamic limit. It is, hence, tempting to see whether these models could be used to predict the biological activity of DNA. In a previous study, we introduced a method that allows one to obtain very accurate results on this subject, which showed that some earlier claims in this direction, based on molecular-dynamics studies, were premature. This could either imply that the present PBD model should be improved or that biological activity can only be predicted in a more complex framework that involves interactions with proteins and superhelical stresses. In this article, we give a detailed description of the statistical method introduced before. Moreover, for several DNA sequences, we give a thorough analysis of the bubble statistics as a function of position and bubble size, and of the so-called l-denaturation curves that can be measured experimentally. These show that some important experimental observations are missing from the present model. We discuss how the present model could be improved.

  5. Statistical Study of Nightside Quiet Time Midlatitude Ionospheric Convection

    NASA Astrophysics Data System (ADS)

    Maimaiti, M.; Ruohoniemi, J. M.; Baker, J. B. H.; Ribeiro, A. J.

    2018-03-01

    Previous studies have shown that F region midlatitude ionospheric plasma exhibits drifts of a few tens of meters per second during quiet geomagnetic conditions, predominantly in the westward direction. However, detailed morphology of this plasma motion and its drivers are still not well understood. In this study, we have used 2 years of data obtained from six midlatitude SuperDARN radars in the North American sector to derive a statistical model of quiet time midlatitude plasma convection between 52° and 58° magnetic latitude (MLAT). The model is organized in MLAT-MLT (magnetic local time) coordinates and has a spatial resolution of 1° × 7 min with thousands of velocity measurements contributing to most grid cells. Our results show that the flow is predominantly westward (20-55 m/s) and weakly northward (0-20 m/s) deep on the nightside but with a strong seasonal dependence such that the flows tend to be strongest and most structured in winter. These statistical results are in good agreement with previously reported observations from Millstone Hill incoherent scatter radar measurements for a single latitude but also show some interesting new features, one being a significant latitudinal variation of zonal flow velocity near midnight in winter. Our analysis suggests that penetration of the high-latitude convection electric fields can account for the direction of midlatitude convection in the premidnight sector, but postmidnight midlatitude convection is dominated by the neutral wind dynamo.

  6. [Gender-sensitive epidemiological data analysis: methodological aspects and empirical outcomes. Illustrated by a health reporting example].

    PubMed

    Jahn, I; Foraita, R

    2008-01-01

    In Germany, gender-sensitive approaches are part of the guidelines for good epidemiological practice as well as health reporting. They are increasingly demanded in order to realize the gender mainstreaming strategy in research funding by the federation and the federal states. This paper focuses on methodological aspects of data analysis; the health report of Bremen, a population-based cross-sectional study, serves as the empirical example. Health reporting requires analysis and reporting methods that are able to uncover the sex/gender dimensions of a question, on the one hand, and to consider how results can be adequately communicated, on the other. The core question is: what consequences does the differing inclusion of the category sex/gender in statistical analyses have for the identification of potential target groups? Logistic regression and a two-stage procedure were conducted exploratively as evaluation methods. The two-stage procedure combines graphical models with CHAID decision trees and allows complex results to be visualized. Both methods were run stratified by sex/gender as well as adjusted for sex/gender, and the results were compared. Only stratified analyses are able to detect differences between the sexes and within the sex/gender groups, as long as one cannot draw on previous knowledge. Adjusted analyses can detect sex/gender differences only if interaction terms are included in the model. Results are discussed from a statistical-epidemiological perspective as well as in the context of health reporting. In conclusion, whether a statistical method is gender-sensitive can only be judged for concrete research questions under known conditions. Often, an appropriate statistical procedure can be chosen after conducting separate analyses for women and men.
    Future gender studies require innovative study designs as well as conceptual distinctness with regard to the biological and sociocultural elements of the category sex/gender.

  7. Statistical Association Criteria in Forensic Psychiatry: A criminological evaluation of casuistry

    PubMed Central

    Gheorghiu, V; Buda, O; Popescu, I; Trandafir, MS

    2011-01-01

    Purpose. Identification of potential shared primary psychoprophylaxis and crime prevention measures by analyzing the rate of commitments among patients who are subjects of forensic examination. Material and method. This is a retrospective, document-based statistical study. The statistical lot consists of 770 initial examination reports performed and completed during 2007, primarily analyzed in order to summarize the data within the National Institute of Forensic Medicine, Bucharest, Romania (INML), with one of the group variables being ‘particularities of the psychiatric patient history’, containing the items ‘forensic onset’, ‘commitments within the last year prior to the examination’ and ‘absence of commitments within the last year prior to the examination’. The method used was the Kendall bivariate correlation. For this study, the authors separately analyze only the two items regarding commitments, using other correlation alternatives and more elaborate statistical analyses, i.e. recording of the standard case study variables, Kendall bivariate correlation, cross tabulation, factor analysis and hierarchical cluster analysis. Results. The results are varied, ranging from theoretically presumed clinical nosography (such as schizophrenia or manic depression) to non-presumed (conduct disorders) or unexpected behavioral acts, and are therefore difficult to interpret. Conclusions. The analysis took into consideration the features of the lot as well as the results of the previous standard correlation of the whole statistical lot. The authors emphasize the role of medical security measures that are actually applied in therapeutic management in general and in risk and second-offence management in particular, as well as the role of forensic psychiatric examinations in the detection of certain aspects related to the monitoring of mental patients. PMID:21505571
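
    The Kendall bivariate correlation this record relies on reduces to counting concordant and discordant pairs. A minimal, stdlib-only version is sketched below; the toy inputs are invented, and a real analysis would use a statistics package that also handles ties and reports p-values.

```python
from itertools import combinations

def kendall_tau(x, y):
    """Kendall rank correlation: (concordant - discordant) / total pairs.

    Assumes no ties, i.e. the tau-a variant.
    """
    n = len(x)
    concordant = discordant = 0
    for i, j in combinations(range(n), 2):
        s = (x[i] - x[j]) * (y[i] - y[j])
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

tau_same = kendall_tau([1, 2, 3, 4], [1, 2, 3, 4])  # perfect agreement
tau_rev = kendall_tau([1, 2, 3, 4], [4, 3, 2, 1])   # perfect reversal
```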

  8. Genome-wide comparative analysis of four Indian Drosophila species.

    PubMed

    Mohanty, Sujata; Khanna, Radhika

    2017-12-01

    Comparative analysis of multiple genomes of closely or distantly related Drosophila species undoubtedly creates excitement among evolutionary biologists exploring genomic changes from an ecological and evolutionary perspective. We present herewith the de novo assembled whole genome sequences of four Drosophila species of Indian origin, D. bipectinata, D. takahashii, D. biarmipes and D. nasuta, generated using Next Generation Sequencing technology on an Illumina platform, along with their detailed assembly statistics. Comparative genomics analyses, e.g. gene prediction and annotation, functional and orthogroup analysis of coding sequences, and genome-wide SNP distribution, were performed. The whole genome of Zaprionus indianus of Indian origin, published earlier by us, and the genome sequences of the previously sequenced 12 Drosophila species available in the NCBI database were included in the analysis. The present work is part of our ongoing genomics project on Indian Drosophila species.

  9. Parallel line analysis: multifunctional software for the biomedical sciences

    NASA Technical Reports Server (NTRS)

    Swank, P. R.; Lewis, M. L.; Damron, K. L.; Morrison, D. R.

    1990-01-01

    An easy to use, interactive FORTRAN program for analyzing the results of parallel line assays is described. The program is menu driven and consists of five major components: data entry, data editing, manual analysis, manual plotting, and automatic analysis and plotting. Data can be entered from the terminal or from previously created data files. The data editing portion of the program is used to inspect and modify data and to statistically identify outliers. The manual analysis component is used to test the assumptions necessary for parallel line assays using analysis of covariance techniques and to determine potency ratios with confidence limits. The manual plotting component provides a graphic display of the data on the terminal screen or on a standard line printer. The automatic portion runs through multiple analyses without operator input. Data may be saved in a special file to expedite input at a future time.
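
    The parallelism-and-potency logic behind a parallel line assay can be sketched in a few lines. This is not the FORTRAN program the record describes; the dose-response values below are fabricated, noise-free numbers constructed so the two fitted lines are exactly parallel and the test preparation is exactly twice as potent.

```python
import math

def fit_line(x, y):
    """Ordinary least squares fit of y = a + b*x; returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# log-dose vs. response for a standard and a test preparation
log_dose = [0.0, 1.0, 2.0]
resp_std = [1.0 + 2.0 * x for x in log_dose]
resp_test = [1.0 + 2.0 * (x + math.log(2.0)) for x in log_dose]

a_s, b_s = fit_line(log_dose, resp_std)
a_t, b_t = fit_line(log_dose, resp_test)

# parallelism: the slopes must agree before a potency ratio is meaningful
parallel = abs(b_s - b_t) < 1e-9
# potency ratio: the horizontal shift between the two parallel lines
potency_ratio = math.exp((a_t - a_s) / b_s)
```

    With real assay data, the parallelism check would be a formal test (e.g. an F-test on the slope difference) rather than an exact comparison, and confidence limits would accompany the potency estimate.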

  10. Pivot tables for mortality analysis, or who needs life tables anyway?

    PubMed

    Wesley, David; Cox, Hugh F

    2007-01-01

    Actuarial life-table analysis has long been used by life insurance medical directors for mortality abstraction from clinical studies. Ironically, today's life actuary instead uses pivot tables to analyze mortality. Pivot tables (a feature/function in MS Excel) collapse various dimensions of data that were previously arranged in an "experience study" format. Summary statistics such as actual deaths, actual and expected mortality (usually measured in dollars), and calculated results such as actual to expected ratios, are then displayed in a 2-dimensional grid. The same analytic process, excluding the dollar focus, can be used for clinical mortality studies. For raw survival data, especially large datasets, this combination of experience study data and pivot tables has clear advantages over life-table analysis in both accuracy and flexibility. Using the SEER breast cancer data, we compare the results of life-table analysis and pivot-table analysis.
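
    The experience-study-plus-pivot workflow the authors describe can be mimicked with stdlib Python: arrange the data as one row per exposure record, then collapse dimensions and form actual-to-expected (A/E) ratios. The rows below are invented toy values, not SEER data, and a real actuarial study would also pivot dollar-weighted amounts.

```python
from collections import defaultdict

# toy "experience study" rows: (age_band, actual_deaths, expected_deaths)
rows = [
    ("40-49", 2, 4.0), ("40-49", 3, 3.0),
    ("50-59", 5, 4.0), ("50-59", 6, 5.5),
]

# collapse the age-band dimension, summing actuals and expecteds
pivot = defaultdict(lambda: [0, 0.0])
for band, actual, expected in rows:
    pivot[band][0] += actual
    pivot[band][1] += expected

# A/E ratio per age band, the summary statistic a pivot table would display
ae_ratio = {band: a / e for band, (a, e) in pivot.items()}
```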

  11. Allometric scaling: analysis of LD50 data.

    PubMed

    Burzala-Kowalczyk, Lidia; Jongbloed, Geurt

    2011-04-01

    The need to identify toxicologically equivalent doses across different species is a major issue in toxicology and risk assessment. In this article, we investigate interspecies scaling based on the allometric equation applied to the single, oral LD50 data previously analyzed by Rhomberg and Wolff. We focus on the statistical approach, namely, regression analysis of the mentioned data. In contrast to Rhomberg and Wolff's analysis of species pairs, we perform an overall analysis based on the whole data set. From our study it follows that if one assumes a single scaling rule for all species and substances in the data set, then β = 1 is the most natural choice among the set of candidates known in the literature. In fact, we obtain quite narrow confidence intervals for this parameter. However, the estimate of the variance in the model is relatively high, resulting in rather wide prediction intervals. © 2010 Society for Risk Analysis.
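
    The regression step behind allometric scaling, fitting log(dose) = log(a) + β·log(W) and reading β off as the slope, can be sketched as follows. The body weights and doses are fabricated, noise-free values constructed so the true exponent is exactly β = 1; they are not the LD50 data analyzed in the article.

```python
import math

def fit_slope(x, y):
    """OLS slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
           sum((xi - mx) ** 2 for xi in x)

# toy interspecies data obeying dose = a * W**beta with a = 0.5, beta = 1
weights = [0.02, 0.25, 3.0, 70.0]      # body weights in kg (illustrative)
doses = [0.5 * w for w in weights]     # equivalent doses, scaling with weight

# the allometric exponent is the slope on the log-log scale
beta_hat = fit_slope([math.log(w) for w in weights],
                     [math.log(d) for d in doses])
```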

  12. Some statistical features of the seismic activity related to the recent M8.2 and M7.1 earthquakes in Mexico

    NASA Astrophysics Data System (ADS)

    Guzman, L.; Baeza-Blancas, E.; Reyes, I.; Angulo Brown, F.; Rudolf Navarro, A.

    2017-12-01

    By studying the magnitude earthquake catalogs, previous studies have reported evidence that some changes in the spatial and temporal organization of earthquake activity is observedbefore and after of a main-shock. These previous studies have used different approach methods for detecting clustering behavior and distance-events density in order topoint out the asymmetric behavior of before shocks and aftershocks. Here, we present a statistical analysis of the seismic activity related to the M8.2 and M7.1 earthquakes occurredon Sept. 7th and Sept. 19th, respectively. First, we calculated the interevent time and distance for the period Sept. 7th 2016 until Oct. 20th 2017 for each seismic region ( a radius of 150 km centeredat coordinates of the M8.1 and M7.1). Next, we calculated the "velocity" of the walker as the ratio between the interevent distance and interevent time, and similarly, we also constructed the"acceleration". A slider pointer is considered to estimate some statistical features within time windows of size τ for the velocity and acceleration sequences before and after the main shocks. Specifically, we applied the fractal dimension method to detect changes in the correlation (persistence) behavior of events in the period before the main events.Our preliminary results pointed out that the fractal dimension associated to the velocity and acceleration sequences exhibits changes in the persistence behavior before the mainshock, while thescaling dimension values after the main events resemble a more uncorrelated behavior. Moreover, the relationship between the standard deviation of the velocity and the local mean velocity valuefor a given time window-size τ is described by an exponent close to 1.5, and the cumulative distribution of velocity and acceleration are well described by power law functions after the crash and stretched-exponential-like distribution before the main shock. 
    On the other hand, we present an analysis of patterns of seismic quiescence before the M8.2 earthquake based on the Schreider algorithm over a period of 27 years. This analysis also includes the modification of the Schreider method proposed by Muñoz-Diosdado et al. (2015).
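
    The interevent "velocity" described in this abstract can be sketched as follows. This is a minimal illustration, not the authors' code; it assumes a catalog given as (time in days, latitude, longitude) tuples sorted by time, with great-circle distances in km.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in km."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def interevent_velocities(catalog):
    """catalog: list of (t_days, lat, lon) sorted by time.
    Returns a list of (dt, dr, v) tuples with v = dr / dt,
    the 'velocity' of the walker between consecutive events."""
    out = []
    for (t0, la0, lo0), (t1, la1, lo1) in zip(catalog, catalog[1:]):
        dt = t1 - t0
        dr = haversine_km(la0, lo0, la1, lo1)
        if dt > 0:
            out.append((dt, dr, dr / dt))
    return out
```

    The "acceleration" sequence of the abstract would be obtained the same way from consecutive velocity differences.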

  13. A Statistical Portrait of Women in the United States. Current Population Reports, Special Studies Series P-32, No. 58.

    ERIC Educational Resources Information Center

    Bureau of the Census (DOC), Suitland, MD.

    This report presents a statistical portrait of the changing role of women in the United States during the 20th century. Data are from United States Government sources--from surveys, decennial censuses, vital statistics, and administrative records. The majority of the statistics have been published previously, either in government documents or…

  14. Descriptive data analysis.

    PubMed

    Thompson, Cheryl Bagley

    2009-01-01

    This 13th article of the Basics of Research series is the first in a short series on statistical analysis. These articles will discuss creating your statistical analysis plan, levels of measurement, descriptive statistics, probability theory, inferential statistics, and general considerations for interpretation of the results of a statistical analysis.

  15. A Documentary Analysis of Abstracts Presented in European Congresses on Adapted Physical Activity.

    PubMed

    Sklenarikova, Jana; Kudlacek, Martin; Baloun, Ladislav; Causgrove Dunn, Janice

    2016-07-01

    The purpose of the study was to identify trends in research abstracts published in the books of abstracts of the European Congress of Adapted Physical Activity from 2004 to 2012. A documentary analysis of the contents of 459 abstracts was completed. Data were coded based on subcategories used in a previous study by Zhang, deLisle, and Chen (2006) and by Porretta and Sherrill (2005): number of authors, data source, sample size, type of disability, data analyses, type of study, and focus of study. Descriptive statistics calculated for each subcategory revealed an overall picture of the state and trends of scientific inquiry in adapted physical activity research in Europe.

  16. An absolute chronology for early Egypt using radiocarbon dating and Bayesian statistical modelling

    PubMed Central

    Dee, Michael; Wengrow, David; Shortland, Andrew; Stevenson, Alice; Brock, Fiona; Girdland Flink, Linus; Bronk Ramsey, Christopher

    2013-01-01

    The Egyptian state was formed prior to the existence of verifiable historical records. Conventional dates for its formation are based on the relative ordering of artefacts. This approach is no longer considered sufficient for cogent historical analysis. Here, we produce an absolute chronology for Early Egypt by combining radiocarbon and archaeological evidence within a Bayesian paradigm. Our data cover the full trajectory of Egyptian state formation and indicate that the process occurred more rapidly than previously thought. We provide a timeline for the First Dynasty of Egypt of generational-scale resolution that concurs with prevailing archaeological analysis and produce a chronometric date for the foundation of Egypt that distinguishes between historical estimates. PMID:24204188

  17. The deuteron-radius puzzle is alive: A new analysis of nuclear structure uncertainties

    NASA Astrophysics Data System (ADS)

    Hernandez, O. J.; Ekström, A.; Nevo Dinur, N.; Ji, C.; Bacca, S.; Barnea, N.

    2018-03-01

    To shed light on the deuteron radius puzzle we analyze the theoretical uncertainties of the nuclear structure corrections to the Lamb shift in muonic deuterium. We find that the discrepancy between the calculated two-photon exchange correction and the corresponding experimentally inferred value by Pohl et al. [1] remains. The present result is consistent with our previous estimate, although the discrepancy is reduced from 2.6 σ to about 2 σ. The error analysis includes statistical as well as systematic uncertainties stemming from the use of nucleon-nucleon interactions derived from chiral effective field theory at various orders. We therefore conclude that nuclear theory uncertainty is likely not the source of the discrepancy.

  18. Volcanic eruptions and solar activity

    NASA Technical Reports Server (NTRS)

    Stothers, Richard B.

    1989-01-01

    The historical record of large volcanic eruptions from 1500 to 1980 is subjected to detailed time series analysis. Two weak but probably statistically significant periodicities, of about 11 and 80 yr, are found: the frequency of volcanic eruptions increases (decreases) slightly around the times of solar minimum (maximum). Time series analysis of the volcanogenic acidities in a deep ice core from Greenland reveals several very long periods ranging from about 80 to about 350 yr which are similar to the very slow solar cycles previously detected in auroral and C-14 records. Solar flares may cause changes in atmospheric circulation patterns that abruptly alter the earth's spin. The resulting jolt probably triggers small earthquakes which affect volcanism.
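
    A periodicity search of this kind can be illustrated with a naive discrete-Fourier periodogram. This is a generic sketch only; the abstract does not specify the paper's actual time-series method.

```python
import math

def power_spectrum(x):
    """Naive O(n^2) discrete-Fourier power spectrum of a real series.
    Returns a list of (period, power) pairs; peaks mark periodicities."""
    n = len(x)
    mean = sum(x) / n
    xs = [v - mean for v in x]          # remove the mean first
    powers = []
    for k in range(1, n // 2 + 1):
        re = sum(v * math.cos(2 * math.pi * k * t / n) for t, v in enumerate(xs))
        im = sum(v * math.sin(2 * math.pi * k * t / n) for t, v in enumerate(xs))
        powers.append((n / k, (re * re + im * im) / n))
    return powers
```

    Applied to an annual eruption-count series, the periods with the largest power would be the candidates for cycles such as the ~11 yr and ~80 yr ones reported above.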

  19. Multiresolution Wavelet Analysis of Heartbeat Intervals Discriminates Healthy Patients from Those with Cardiac Pathology

    NASA Astrophysics Data System (ADS)

    Thurner, Stefan; Feurstein, Markus C.; Teich, Malvin C.

    1998-02-01

    We applied multiresolution wavelet analysis to the sequence of times between human heartbeats ( R-R intervals) and have found a scale window, between 16 and 32 heartbeat intervals, over which the widths of the R-R wavelet coefficients fall into disjoint sets for normal and heart-failure patients. This has enabled us to correctly classify every patient in a standard data set as belonging either to the heart-failure or normal group with 100% accuracy, thereby providing a clinically significant measure of the presence of heart failure from the R-R intervals alone. Comparison is made with previous approaches, which have provided only statistically significant measures.
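
    The scale-dependent width of wavelet coefficients used above can be sketched with a plain Haar decomposition. The paper's exact wavelet is not stated here, so Haar is an illustrative assumption; the idea is that the standard deviation of the detail coefficients at each dyadic scale is the discriminating statistic.

```python
import statistics

def haar_detail_std(x):
    """Std of Haar wavelet detail coefficients at each dyadic scale.
    Scale m (1-based) corresponds to windows of 2**m samples."""
    stds = []
    approx = list(x)
    while len(approx) >= 4:              # need at least 2 details for a std
        pairs = len(approx) // 2
        detail = [(approx[2 * i] - approx[2 * i + 1]) / 2 ** 0.5 for i in range(pairs)]
        approx = [(approx[2 * i] + approx[2 * i + 1]) / 2 ** 0.5 for i in range(pairs)]
        stds.append(statistics.pstdev(detail))
    return stds
```

    For R-R interval data, one would inspect the entries of this list at the scales spanning 16-32 heartbeats and compare them against a classification threshold.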

  20. An asymptotic analysis of the logrank test.

    PubMed

    Strawderman, R L

    1997-01-01

    Asymptotic expansions for the null distribution of the logrank statistic and its distribution under local proportional hazards alternatives are developed in the case of iid observations. The results, which are derived from the work of Gu (1992) and Taniguchi (1992), are easy to interpret, and provide some theoretical justification for many behavioral characteristics of the logrank test that have been previously observed in simulation studies. We focus primarily upon (i) the inadequacy of the usual normal approximation under treatment group imbalance; and, (ii) the effects of treatment group imbalance on power and sample size calculations. A simple transformation of the logrank statistic is also derived based on results in Konishi (1991) and is found to substantially improve the standard normal approximation to its distribution under the null hypothesis of no survival difference when there is treatment group imbalance.
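
    For reference, the logrank statistic itself can be computed with a bare-bones sketch like the following (the standard chi-square form, not the paper's asymptotic expansions or transformation):

```python
def logrank_statistic(times1, events1, times2, events2):
    """Two-group logrank chi-square statistic.
    times*: survival/censoring times; events*: 1 = event, 0 = censored."""
    data = [(t, e, 0) for t, e in zip(times1, events1)] + \
           [(t, e, 1) for t, e in zip(times2, events2)]
    death_times = sorted({t for t, e, _ in data if e == 1})
    o_minus_e = 0.0                      # observed minus expected in group 1
    var = 0.0
    for td in death_times:
        at_risk = [r for r in data if r[0] >= td]
        n = len(at_risk)
        n1 = sum(1 for r in at_risk if r[2] == 0)
        d = sum(1 for r in at_risk if r[0] == td and r[1] == 1)
        d1 = sum(1 for r in at_risk if r[0] == td and r[1] == 1 and r[2] == 0)
        o_minus_e += d1 - d * n1 / n
        if n > 1:                        # hypergeometric variance term
            var += d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
    return o_minus_e ** 2 / var
```

    Under the null hypothesis this statistic is compared to a chi-square distribution with one degree of freedom; the paper's point is that this approximation degrades under treatment group imbalance.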

  1. Database Performance Monitoring for the Photovoltaic Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klise, Katherine A.

    The Database Performance Monitoring (DPM) software (copyright in process) is being developed at Sandia National Laboratories to perform quality control analysis on time series data. The software loads time-indexed databases (currently csv format), performs a series of quality control tests defined by the user, and creates reports which include summary statistics, tables, and graphics. DPM can be set up to run on an automated schedule defined by the user. For example, the software can be run once per day to analyze data collected on the previous day. HTML formatted reports can be sent via email or hosted on a website. To compare performance of several databases, summary statistics and graphics can be gathered in a dashboard view which links to detailed reporting information for each database. The software can be customized for specific applications.

  2. Vitamin D and depression: a systematic review and meta-analysis comparing studies with and without biological flaws.

    PubMed

    Spedding, Simon

    2014-04-11

    Efficacy of Vitamin D supplements in depression is controversial, awaiting further literature analysis. Biological flaws in primary studies are a possible reason meta-analyses of Vitamin D have failed to demonstrate efficacy. This systematic review and meta-analysis of Vitamin D and depression compared studies with and without biological flaws. The systematic review followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. The literature search was undertaken through four databases for randomized controlled trials (RCTs). Studies were critically appraised for methodological quality and biological flaws, in relation to the hypothesis and study design. Meta-analyses were performed for studies according to the presence of biological flaws. The 15 RCTs identified provide a more comprehensive evidence base than previous systematic reviews; the methodological quality of studies was generally good and methodology was diverse. A meta-analysis of all studies without flaws demonstrated a statistically significant improvement in depression with Vitamin D supplements (+0.78 CI +0.24, +1.27). Studies with biological flaws were mainly inconclusive, with the meta-analysis demonstrating a statistically significant worsening in depression by taking Vitamin D supplements (-1.1 CI -0.7, -1.5). Vitamin D supplementation (≥800 I.U. daily) was somewhat favorable in the management of depression in studies that demonstrate a change in vitamin levels, and the effect size was comparable to that of anti-depressant medication.
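
    The pooling step in a meta-analysis like this is typically an inverse-variance weighted average of study effect sizes. A minimal fixed-effect sketch follows; the numbers in the test are hypothetical, not the review's data.

```python
import math

def fixed_effect_meta(effects, std_errors):
    """Inverse-variance fixed-effect pooled estimate with a 95% CI.
    effects: per-study effect sizes; std_errors: their standard errors."""
    w = [1 / se ** 2 for se in std_errors]          # inverse-variance weights
    pooled = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    se_pooled = math.sqrt(1 / sum(w))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
```

    A random-effects model would additionally estimate between-study heterogeneity and widen the weights accordingly; the fixed-effect form above is the simplest version of the idea.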

  3. Improved Bond Equations for Fiber-Reinforced Polymer Bars in Concrete

    PubMed Central

    Pour, Sadaf Moallemi; Alam, M. Shahria; Milani, Abbas S.

    2016-01-01

    This paper explores a set of new equations to predict the bond strength between fiber-reinforced polymer (FRP) rebar and concrete. The proposed equations are based on a comprehensive statistical analysis of existing experimental results in the literature. Namely, the parameters with the strongest effect on the bond behavior of FRP-reinforced concrete were first identified by applying a factorial analysis to a part of the available database. Then the database, which contains 250 pullout tests, was divided into four groups based on the concrete compressive strength and the rebar surface. Afterward, nonlinear regression analysis was performed for each study group in order to determine the bond equations. The results show that the proposed equations predict bond strengths more accurately than other previously reported models. PMID:28773859

  4. Tsallis q-triplet, intermittent turbulence and Portevin-Le Chatelier effect

    NASA Astrophysics Data System (ADS)

    Iliopoulos, A. C.; Aifantis, E. C.

    2018-05-01

    In this paper, we extend a previous study concerning the Portevin-Le Chatelier (PLC) effect and Tsallis statistics (Iliopoulos et al., 2015). In particular, we estimate Tsallis' q-triplet, namely {qstat, qsens, qrel}, for two sets of stress serration time series concerning the deformation of a Cu-15%Al alloy corresponding to different deformation temperatures and thus types (A and B) of PLC bands. The results concerning the stress serrations analysis reveal that the Tsallis q-triplet attains values different from unity ({qstat, qsens, qrel} ≠ {1,1,1}). In particular, PLC type A bands' serrations were found to follow Tsallis super-q-Gaussian, non-extensive, sub-additive, multifractal statistics, indicating that the underlying dynamics are at the edge of chaos, characterized by global long-range correlations and power-law scaling. For PLC type B bands' serrations, the results revealed a Tsallis sub-q-Gaussian, non-extensive, super-additive, multifractal statistical profile. In addition, our results also reveal significant differences in statistical and dynamical features, indicating important variations of the stress field dynamics in terms of rate of entropy production, relaxation dynamics and non-equilibrium meta-stable stationary states. We also estimate parameters commonly used for characterizing fully developed turbulence, such as structure functions and the flatness coefficient (F), in order to provide further information about the underlying dynamics of the jerky flow. Finally, we use two multifractal models developed to describe turbulence, namely the Arimitsu and Arimitsu (A&A) [2000, 2001] theoretical model, which is based on Tsallis statistics, and the p-model, to estimate theoretical multifractal spectra f(α). Furthermore, we estimate the flatness coefficient (F) using a theoretical formula based on Tsallis statistics. The theoretical results are compared with the experimental ones, showing a remarkable agreement between modeling and experiment.
    Finally, the results of this study verify and extend previous studies which stated that the underlying dynamics of type B and type A PLC bands are connected with distinct dynamical behavior, namely chaotic behavior for the former and self-organized critical (SOC) behavior for the latter, while they shed new light on the turbulent character of the PLC jerky flow.

  5. Does a balance deficit persist in Australian Football players with previous lower limb ligament injury?

    PubMed

    Hrysomallis, C; McLaughlin, P; Goodman, C

    2005-03-01

    A history of lower limb ligament injury is a commonly cited risk factor for another similar injury. During the acute phase of injury, there is a balancing skill deficit in the injured limb. It has been unclear whether this deficit persists in the medium-to-long term for previously injured Australian footballers, contributing to the risk of re-injury. This study compared the balance ability of footballers with and without previous lower limb ligament injury and, for previously injured players, the balance ability of the previously injured limb to the opposite uninjured limb. A total of 216 players from 6 teams from the Australian Football League were tested. The balance task comprised stepping on to a foam mat on top of a force plate and maintaining one-legged balance. The subjects were divided into 4 groups based on their injury history: all ankle injuries to only one limb, recent ankle injuries to only one limb (within the last 12 months), knee ligament injury only to one limb, and no previous ankle or knee ligament injury. Statistical analysis revealed that there was no significant difference between the balance scores of any of the previously injured players and those with no previous lower limb ligament injury. There was no significant difference between the balance score of the previously injured limb and the opposite uninjured limb. It appears that a balance deficit does not persist in Australian Football players with previous lower limb ligament injury.

  6. K-nearest neighbors based methods for identification of different gear crack levels under different motor speeds and loads: Revisited

    NASA Astrophysics Data System (ADS)

    Wang, Dong

    2016-03-01

    Gears are the most commonly used components in mechanical transmission systems. Their failures may cause transmission system breakdown and result in economic loss. Identification of different gear crack levels is important to prevent any unexpected gear failure, because gear cracks lead to gear tooth breakage. Signal processing based methods mainly require expertise to explain gear fault signatures, which is usually not easy for ordinary users to achieve. In order to automatically identify different gear crack levels, intelligent gear crack identification methods should be developed. Previous case studies experimentally proved that K-nearest neighbors based methods exhibit high prediction accuracies for identification of 3 different gear crack levels under different motor speeds and loads. In this short communication, to further enhance prediction accuracies of existing K-nearest neighbors based methods and extend identification of 3 different gear crack levels to identification of 5 different gear crack levels, redundant statistical features are constructed by using the Daubechies 44 (db44) binary wavelet packet transform at different wavelet decomposition levels, prior to the use of a K-nearest neighbors method. The dimensionality of the redundant statistical features is 620, which provides richer gear fault signatures. Since many of these statistical features are redundant and highly correlated with each other, dimensionality reduction of the redundant statistical features is conducted to obtain new significant statistical features. Finally, the K-nearest neighbors method is used to identify 5 different gear crack levels under different motor speeds and loads. A case study including 3 experiments is investigated to demonstrate that the developed method provides higher prediction accuracies than the existing K-nearest neighbors based methods for recognizing different gear crack levels under different motor speeds and loads.
Based on the new significant statistical features, some other popular statistical models including linear discriminant analysis, quadratic discriminant analysis, classification and regression tree and naive Bayes classifier, are compared with the developed method. The results show that the developed method has the highest prediction accuracies among these statistical models. Additionally, selection of the number of new significant features and parameter selection of K-nearest neighbors are thoroughly investigated.
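
    The classification step described above rests on a standard K-nearest neighbors majority vote over the reduced feature vectors. A generic sketch (not the paper's implementation) looks like this:

```python
from collections import Counter

def knn_predict(train_X, train_y, x, k=3):
    """Majority vote among the k nearest training points (Euclidean).
    train_X: feature vectors; train_y: class labels; x: query vector."""
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(xi, x)), yi)
        for xi, yi in zip(train_X, train_y)
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]
```

    In the paper's setting, each training vector would be a dimensionally-reduced statistical feature vector and each label one of the 5 gear crack levels.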

  7. Propositional idea density in women's written language over the lifespan: computerized analysis.

    PubMed

    Ferguson, Alison; Spencer, Elizabeth; Craig, Hugh; Colyvas, Kim

    2014-06-01

    The informativeness of written language, as measured by Propositional Idea Density (PD), has been shown to be a sensitive predictive index of language decline with age and dementia in previous research. The present study investigated the influence of age and education on the written language of three large cohorts of women from the general community, born between 1973 and 1978, 1946-51 and 1921-26. Written texts were obtained from the Australian Longitudinal Study on Women's Health in which participants were invited to respond to an open-ended question about their health. The informativeness of written comments of 10 words or more (90% of the total number of comments) was analyzed using the Computerized Propositional Idea Density Rater 3 (CPIDR-3). Over 2.5 million words used in 37,705 written responses from 19,512 respondents were analyzed. Based on a linear mixed model approach to statistical analysis with adjustment for several factors including number of comments per respondent and number of words per comment, a small but statistically significant effect of age was identified for the older cohort with mean age 78 years. The mean PD per word for this cohort was lower than the younger and mid-aged cohorts with mean age 27 and 53 years respectively, with mean reduction in PD 95% confidence interval (CI) of .006 (.003, .008) and .009 (.008, .011) respectively. This suggests that PD for this population of women was relatively more stable over the adult lifespan than has been reported previously even in late old age. There was no statistically significant effect of education level. Computerized analyses were found to greatly facilitate the study of informativeness of this large corpus of written language. Directions for further research are discussed in relation to the need for extended investigation of the variability of the measure for potential application to the identification of acquired language pathologies. Copyright © 2013 Elsevier Ltd. 
All rights reserved.

  8. The impact of meteorology on the occurrence of waterborne outbreaks of vero cytotoxin-producing Escherichia coli (VTEC): a logistic regression approach.

    PubMed

    O'Dwyer, Jean; Morris Downes, Margaret; Adley, Catherine C

    2016-02-01

    This study analyses the relationship between meteorological phenomena and outbreaks of waterborne-transmitted vero cytotoxin-producing Escherichia coli (VTEC) in the Republic of Ireland over an 8-year period (2005-2012). Data pertaining to the notification of waterborne VTEC outbreaks were extracted from the Computerised Infectious Disease Reporting system, which is administered through the national Health Protection Surveillance Centre as part of the Health Service Executive. Rainfall and temperature data were obtained from the national meteorological office and categorised as cumulative rainfall, heavy rainfall events in the previous 7 days, and mean temperature. Regression analysis was performed using logistic regression (LR) analysis. The LR model was significant (p < 0.001), with all independent variables (cumulative rainfall, heavy rainfall and mean temperature) making a statistically significant contribution to the model. The study has found that rainfall, particularly heavy rainfall in the 7 days preceding an outbreak, is a strong statistical indicator of a waterborne outbreak and that temperature also impacts waterborne VTEC outbreak occurrence.
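
    The logistic-regression setup described here can be sketched generically with plain gradient descent. The predictor layout (one or more meteorological covariates, a 0/1 outbreak outcome) mirrors the study's design, but the code and any fitted values are illustrative placeholders, not the study's model.

```python
import math

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Gradient-descent logistic regression.
    X: list of feature lists; y: 0/1 outcomes. Returns (bias, weights)."""
    n_feat = len(X[0])
    w = [0.0] * n_feat
    b = 0.0
    for _ in range(epochs):
        gw = [0.0] * n_feat
        gb = 0.0
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1 / (1 + math.exp(-z))   # predicted outbreak probability
            err = p - yi
            gb += err
            for j, xj in enumerate(xi):
                gw[j] += err * xj
        b -= lr * gb / len(X)
        w = [wj - lr * gj / len(X) for wj, gj in zip(w, gw)]
    return b, w
```

    A positive fitted weight on, say, a heavy-rainfall indicator would correspond to the study's finding that heavy rain in the preceding week raises outbreak odds.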

  9. A statistical study of ionopause perturbation and associated boundary wave formation at Venus.

    NASA Astrophysics Data System (ADS)

    Chong, G. S.; Pope, S. A.; Walker, S. N.; Zhang, T.; Balikhin, M. A.

    2017-12-01

    In contrast to Earth, Venus does not possess an intrinsic magnetic field. Hence the interaction between the solar wind and Venus is significantly different from that at Earth, even though these two planets were once considered similar. Within the induced magnetosphere and ionosphere of Venus, previous studies have shown the existence of ionospheric boundary waves. These structures may play an important role in the atmospheric evolution of Venus. By using Venus Express data, crossings of the ionopause boundary are determined based on the observations of photoelectrons during 2011. Pulses of dropouts in the electron energy spectrometer were observed in 92 events, which suggests potential perturbations of the boundary. Minimum variance analysis of the 1 Hz magnetic field data for the perturbations is conducted and used to confirm the occurrence of the boundary waves. Statistical analysis shows that they were propagating mainly in the ±VSO-Y direction in the polar north terminator region. The generation mechanisms of boundary waves and their evolution into the potential nonlinear regime are discussed and analysed.

  10. The accurate assessment of small-angle X-ray scattering data

    DOE PAGES

    Grant, Thomas D.; Luft, Joseph R.; Carter, Lester G.; ...

    2015-01-23

    Small-angle X-ray scattering (SAXS) has grown in popularity in recent times with the advent of bright synchrotron X-ray sources, powerful computational resources and algorithms enabling the calculation of increasingly complex models. However, the lack of standardized data-quality metrics presents difficulties for the growing user community in accurately assessing the quality of experimental SAXS data. Here, a series of metrics to quantitatively describe SAXS data in an objective manner using statistical evaluations is defined. These metrics are applied to identify the effects of radiation damage, concentration dependence and interparticle interactions on SAXS data from a set of 27 previously described targets for which high-resolution structures have been determined via X-ray crystallography or nuclear magnetic resonance (NMR) spectroscopy. Studies show that these metrics are sufficient to characterize SAXS data quality on a small sample set with statistical rigor and sensitivity similar to or better than manual analysis. The development of data-quality analysis strategies such as these initial efforts is needed to enable the accurate and unbiased assessment of SAXS data quality.

  11. A mixed-effects model approach for the statistical analysis of vocal fold viscoelastic shear properties.

    PubMed

    Xu, Chet C; Chan, Roger W; Sun, Han; Zhan, Xiaowei

    2017-11-01

    A mixed-effects model approach was introduced in this study for the statistical analysis of rheological data of vocal fold tissues, in order to account for the data correlation caused by multiple measurements of each tissue sample across the test frequency range. Such data correlation had often been overlooked in previous studies over the past decades. The viscoelastic shear properties of the vocal fold lamina propria of two commonly used laryngeal research animal species (i.e. rabbit, porcine) were measured by a linear, controlled-strain simple-shear rheometer. Along with published canine and human rheological data, the vocal fold viscoelastic shear moduli of these animal species were compared to those of human over a frequency range of 1-250 Hz using the mixed-effects models. Our results indicated that tissues of the rabbit, canine and porcine vocal fold lamina propria were significantly stiffer and more viscous than those of human. Mixed-effects models were shown to be able to more accurately analyze rheological data generated from repeated measurements. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Measuring outcome from vestibular rehabilitation, part II: refinement and validation of a new self-report measure.

    PubMed

    Morris, Anna E; Lutman, Mark E; Yardley, Lucy

    2009-01-01

    A prototype self-report measure of vestibular rehabilitation outcome is described in a previous paper. The objectives of the present work were to identify the most useful items and assess their psychometric properties. Stage 1: One hundred fifty-five participants completed a prototype 36-item Vestibular Rehabilitation Benefit Questionnaire (VRBQ). Statistical analysis demonstrated its subscale structure and identified redundant items. Stage 2: One hundred twenty-four participants completed a refined 22-item VRBQ and three established questionnaires (Dizziness Handicap Inventory, DHI; Vertigo Symptom Scale short form, VSS-sf; Medical Outcomes Study short form 36, SF-36) in a longitudinal study. Statistical analysis revealed four internally consistent subscales of the VRBQ: Dizziness, Anxiety, Motion-Provoked Dizziness, and Quality of Life. Correlations with the DHI, VSS-sf, and SF-36 support the validity of the VRBQ, and effect size estimates suggest that the VRBQ is more responsive than comparable questionnaires. Twenty participants completed the VRBQ twice in a 24-hour period, indicating excellent test-retest reliability. The VRBQ appears to be a concise and psychometrically robust questionnaire that addresses the main aspects of dizziness impact.

  13. Shilling Attacks Detection in Recommender Systems Based on Target Item Analysis

    PubMed Central

    Zhou, Wei; Wen, Junhao; Koh, Yun Sing; Xiong, Qingyu; Gao, Min; Dobbie, Gillian; Alam, Shafiq

    2015-01-01

    Recommender systems are highly vulnerable to shilling attacks, both by individuals and groups. Attackers who introduce biased ratings in order to affect recommendations have been shown to negatively affect collaborative filtering (CF) algorithms. Previous research focuses only on the differences between genuine profiles and attack profiles, ignoring the group characteristics in attack profiles. In this paper, we study the use of statistical metrics to detect rating patterns of attackers and group characteristics in attack profiles. Another issue is that most existing detection methods are model-specific. Two metrics, Rating Deviation from Mean Agreement (RDMA) and Degree of Similarity with Top Neighbors (DegSim), are used for analyzing rating patterns between malicious profiles and genuine profiles in attack models. Building upon this, we also propose and evaluate a detection structure called RD-TIA for detecting shilling attacks in recommender systems using a statistical approach. In order to detect more complicated attack models, we propose a novel metric called DegSim' based on DegSim. The experimental results show that our detection model based on target item analysis is an effective approach for detecting shilling attacks. PMID:26222882
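
    One common formulation of the RDMA metric named above averages each rated item's deviation from its mean rating, down-weighted by the item's rating count; profiles with unusually high RDMA are flagged as suspicious. The sketch below follows that assumed definition, not necessarily the paper's exact variant.

```python
def rdma(user_ratings, all_ratings):
    """Rating Deviation from Mean Agreement for one user profile.
    user_ratings: {item: rating by this user}
    all_ratings:  {item: list of all ratings for that item}"""
    total = 0.0
    for item, r in user_ratings.items():
        rs = all_ratings[item]
        avg = sum(rs) / len(rs)
        total += abs(r - avg) / len(rs)   # deviation scaled by rating count
    return total / len(user_ratings)
```

    DegSim, by contrast, would average a profile's Pearson similarity with its top-k nearest neighbors; attack profiles tend to score anomalously on one or both metrics.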

  14. Persistent homology and non-Gaussianity

    NASA Astrophysics Data System (ADS)

    Cole, Alex; Shiu, Gary

    2018-03-01

    In this paper, we introduce the topological persistence diagram as a statistic for Cosmic Microwave Background (CMB) temperature anisotropy maps. A central concept in 'Topological Data Analysis' (TDA), the idea of persistence is to represent a data set by a family of topological spaces. One then examines how long topological features 'persist' as the family of spaces is traversed. We compute persistence diagrams for simulated CMB temperature anisotropy maps featuring various levels of primordial non-Gaussianity of local type. Postponing the analysis of observational effects, we show that persistence diagrams are more sensitive to local non-Gaussianity than previous topological statistics including the genus and Betti number curves, and can constrain Δf_NL^loc = 35.8 at the 68% confidence level on the simulation set, compared to Δf_NL^loc = 60.6 for the Betti number curves. Given the resolution of our simulations, we expect applying persistence diagrams to observational data will give constraints competitive with those of the Minkowski Functionals. This is the first in a series of papers where we plan to apply TDA to different shapes of non-Gaussianity in the CMB and Large Scale Structure.
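
    The 0-dimensional part of a persistence diagram can be illustrated on a 1-D signal with a union-find sublevel-set filtration: each local minimum gives birth to a component, and a component dies when it merges with an older one at a saddle. This is a toy sketch only; CMB maps are 2-D and the paper's pipeline is more involved.

```python
def sublevel_persistence(values):
    """0-dimensional persistence pairs (birth, death) of the sublevel-set
    filtration of a 1-D sequence. The global minimum never dies and is
    reported with death = None; zero-persistence pairs are dropped."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    parent = {}

    def find(i):                         # union-find with path halving
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    birth = {}
    pairs = []
    for i in order:                      # add samples in increasing height
        parent[i] = i
        birth[i] = values[i]
        for j in (i - 1, i + 1):         # connect to already-added neighbors
            if j in parent:
                ri, rj = find(i), find(j)
                if ri != rj:
                    if birth[ri] > birth[rj]:
                        ri, rj = rj, ri  # the younger component dies
                    if birth[rj] < values[i]:
                        pairs.append((birth[rj], values[i]))
                    parent[rj] = ri
    for r in {find(i) for i in parent}:
        pairs.append((birth[r], None))
    return pairs
```

    Long-lived pairs (large death minus birth) are the topologically significant features; the diagram of all such pairs is the statistic compared across non-Gaussianity levels.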

  15. The influence of economic business cycles on United States suicide rates.

    PubMed

    Wasserman, I M

    1984-01-01

    A number of social science investigators have shown that a downturn in the economy leads to an increase in the suicide rate. However, the previous works on the subject are flawed by the fact that they employ years as their temporal unit of analysis. This time period is so large that it makes it difficult for investigators to precisely determine the length of the lag effect, while at the same time removing the autocorrelation effects. Also, although most works on suicide and the business cycle employ unemployment as a measure of a downturn in the business cycle, the average duration of unemployment represents a better measure for determining the social impact of an economic downturn. From 1947 to 1977 the average monthly duration of unemployment is statistically related to the suicide rate using multivariate time-series analysis. From 1910 to 1939 the Ayres business index, a surrogate measure for movement in the business cycle, is statistically related to the monthly suicide rate. An examination of the findings confirms that in most cases a downturn in the economy causes an increase in the suicide rate.
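
    The lag-detection idea in this abstract can be sketched with simple lagged Pearson correlations between a monthly economic indicator and the suicide series. This is illustrative only; the study itself used multivariate time-series models, and the series in the test are made-up numbers.

```python
def pearson(x, y):
    """Pearson correlation of two equal-length, non-constant series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def lagged_correlations(econ, outcome, max_lag=12):
    """Correlate outcome[t] with econ[t - lag] for lag = 0..max_lag.
    The lag with the strongest correlation estimates the delay effect."""
    out = {}
    for lag in range(max_lag + 1):
        out[lag] = pearson(econ[:len(econ) - lag] if lag else econ,
                           outcome[lag:])
    return out
```

    Using months rather than years as the temporal unit, as the author argues, is what makes the lag length resolvable at all.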

  16. Two Populations of Sunspots: Differential Rotation

    NASA Astrophysics Data System (ADS)

    Nagovitsyn, Yu. A.; Pevtsov, A. A.; Osipova, A. A.

    2018-03-01

    To investigate the differential rotation of sunspot groups using the Greenwich data, we propose an approach based on a statistical analysis of the histograms of particular longitudinal velocities in different latitude intervals. The general statistical velocity distributions for all such intervals are shown to be described by two rather than one normal distribution, so that two fundamental rotation modes exist simultaneously: fast and slow. The differentiality of rotation is the same for both modes: the coefficient of the sin²φ term in Faye's law is 2.87-2.88 deg/day, while the equatorial rotation rates differ significantly, by 0.27 deg/day. On the other hand, an analysis of the longitudinal velocities for the previously revealed two differing populations of sunspot groups has shown that small short-lived groups (SSGs) are associated with the fast rotation mode, while large long-lived groups (LLGs) are associated with both fast and slow modes. The results obtained not only suggest a real physical difference between the two populations of sunspots but also give new empirical data for the development of a dynamo theory, in particular, for the theory of a spatially distributed dynamo.

  17. Electron-positron momentum distribution measurements of high-T superconductors and related systems

    NASA Astrophysics Data System (ADS)

    Wachs, A. L.; Turchi, P. E. A.; Howell, R. J.; Jean, Y. C.; Fluss, M. J.; West, R. N.; Kaiser, J. H.; Rayner, S.; Haghighi, H.; Merkle, K. L.

    1989-08-01

    Measurements are discussed of the 2-D angular correlation of positron annihilation radiation (ACAR) in La2CuO4, YBa2Cu3O7 (YBCO), and NiO. The measurements for NiO are the first such 2-D ACAR measurements; the YBCO results are of higher statistical quality than previously reported in the literature. The data are compared with complementary theoretical calculations and with each other. The implications of the analysis for ACAR studies of similar and related systems are discussed.

  18. Vortex Thermometry for Turbulent Two-Dimensional Fluids.

    PubMed

    Groszek, Andrew J; Davis, Matthew J; Paganin, David M; Helmerson, Kristian; Simula, Tapio P

    2018-01-19

    We introduce a new method of statistical analysis to characterize the dynamics of turbulent fluids in two dimensions. We establish that, in equilibrium, the vortex distributions can be uniquely connected to the temperature of the vortex gas, and we apply this vortex thermometry to characterize simulations of decaying superfluid turbulence. We confirm the hypothesis of vortex evaporative heating leading to Onsager vortices proposed in Phys. Rev. Lett. 113, 165302 (2014), and we find previously unidentified vortex power-law distributions that emerge from the dynamics.

  19. Electron-positron momentum distribution measurements of high-T c superconductors and related systems

    NASA Astrophysics Data System (ADS)

    Wachs, A. L.; Turchi, P. E. A.; Howell, R. H.; Jean, Y. C.; Fluss, M. J.; West, R. N.; Kaiser, J. H.; Rayner, S.; Haghighi, H.; Merkle, K. L.; Revcolevschi, A.; Wang, Z. Z.

    1989-12-01

    We discuss our measurements of the 2D-angular correlation of positron annihilation radiation (ACAR) in La 2CuO 4, YBa 2Cu 3O 7 (YBCO), and NiO. The measurements for NiO are the first such 2D-ACAR measurements; the YBCO results are of a higher statistical quality than previously reported in the literature. The data are compared with complementary theoretical calculations and with each other. We discuss the implication of our analysis for ACAR studies of similar and related systems.

  20. An experimental investigation of masking in the US FDA adverse event reporting system database.

    PubMed

    Wang, Hsin-wei; Hochberg, Alan M; Pearson, Ronald K; Hauben, Manfred

    2010-12-01

    A phenomenon of 'masking' or 'cloaking' in pharmacovigilance data mining has been described, which can potentially cause signals of disproportionate reporting (SDRs) to be missed, particularly in pharmaceutical company databases. Masking has been predicted theoretically, observed anecdotally or studied to a limited extent in both pharmaceutical company and health authority databases, but no previous publication systematically assesses its occurrence in a large health authority database. To explore the nature, extent and possible consequences of masking in the US FDA Adverse Event Reporting System (AERS) database by applying various experimental unmasking protocols to a set of drugs and events representing realistic pharmacovigilance analysis conditions. This study employed AERS data from 2001 through 2005. For a set of 63 Medical Dictionary for Regulatory Activities (MedDRA®) Preferred Terms (PTs), disproportionality analysis was carried out with respect to all drugs included in the AERS database, using a previously described urn-model-based algorithm. We specifically sought masking in which drug removal induced an increase in the statistical representation of a drug-event combination (DEC) that resulted in the emergence of a new SDR. We performed a series of unmasking experiments selecting drugs for removal using rational statistical decision rules based on the requirement of a reporting ratio (RR) >1, top-ranked statistical unexpectedness (SU) and relatedness as reflected in the WHO Anatomical Therapeutic Chemical level 4 (ATC4) grouping. In order to assess the possible extent of residual masking we performed two supplemental purely empirical analyses on a limited subset of data. This entailed testing every drug and drug group to determine which was most influential in uncovering masked SDRs. 
We assessed the strength of external evidence for a causal association for a small number of masked SDRs involving a subset of 29 drugs for which level of evidence adjudication was available from a previous study. The original disproportionality analysis identified 8719 SDRs for the 63 PTs. The SU-based unmasking protocols generated variable numbers of masked SDRs ranging from 38 to 156, representing a 0.43-1.8% increase over the number of baseline SDRs. A significant number of baseline SDRs were also lost in the course of our experiments. The trend in the number of gained SDRs per report removed was inversely related to the number of lost SDRs per protocol. Both the number and nature of the reports removed influenced the number of gained SDRs observed. The purely empirical protocols unmasked up to ten times as many SDRs. None of the masked SDRs had strong external evidence supporting a causal association. Most involved associations for which there was no external supporting evidence or were in the original product label. For two masked SDRs, there was external evidence of a possible causal association. We documented masking in the FDA AERS database. Attempts at unmasking SDRs using practically implementable protocols produced only small changes in the output of SDRs in our analysis. This is undoubtedly related to the large size and diversity of the database, but the complex interdependencies between drugs and events in authentic spontaneous reporting system (SRS) databases, and the impact of measures of statistical variability that are typically used in real-world disproportionality analysis, may be additional factors that constrain the discovery of masked SDRs and which may also operate in pharmaceutical company databases. Empirical determination of the most influential drugs may uncover significantly more SDRs than protocols based on predetermined statistical selection rules but are impractical except possibly for evaluating specific events. 
Routine global exercises to elicit masking, especially in large health authority databases, are not justified on the basis of the results available to date. Exercises to elicit unmasking should be driven by prior knowledge or obvious data imbalances.
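The masking mechanism described above can be illustrated with a minimal reporting-ratio sketch (the simple observed/expected disproportionality measure; the study itself used a more elaborate urn-model-based algorithm). All counts below are hypothetical:

```python
def reporting_ratio(n_drug_event, n_drug, n_event, n_total):
    """Observed count of a drug-event combination (DEC) divided by
    its expected count under independence of drug and event."""
    expected = n_drug * n_event / n_total
    return n_drug_event / expected

# Drug X reports the event 30 times in 1,000 reports; the database
# holds 100,000 reports, 5,000 of them for this event.
rr_before = reporting_ratio(30, 1000, 5000, 100_000)            # 0.6 -> no SDR

# Remove a heavily reporting "masking" drug (10,000 reports,
# 4,000 of them for this event) and recompute the background.
rr_after = reporting_ratio(30, 1000, 5000 - 4000, 100_000 - 10_000)  # 2.7

print(rr_before, rr_after)
```

Removing the influential drug shrinks the background expectation for the event, so a DEC that previously sat below the disproportionality threshold (RR < 1) emerges as a new SDR (RR > 1).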

  1. [Pregnancy outcome during the bombing of Yugoslavia from March 24 to June 9, 1999].

    PubMed

    Krstić, Dragan; Marinković, Darko; Mirković, Ljiljana; Krstić, Jelena

    2006-04-01

    The aim of this study was to evaluate pregnancy outcome during the bombing of Yugoslavia in the period from March 24 to June 9, 1999. A retrospective study included a total of 81 spontaneous abortions after the 12th gestational week and 1448 deliveries hospitalized in the regional hospital. The incidences of spontaneous abortion, Cesarean section, post-term delivery, and vaginal delivery following a previous Cesarean section within the period from March 24 to June 9, 1999 were analyzed and compared with the same periods in 1998 and 2000 using the chi2 and Kolmogorov-Smirnov tests. Under the conditions of the three-month stress imposed by the bombing, the incidences of spontaneous abortion and of vaginal delivery following a previous Cesarean section were significantly increased, while the incidences of Cesarean section and post-term delivery decreased; paradoxically, perinatal outcome improved. Analysis of the admission findings revealed that repeat Cesarean sections were performed electively, close to the expected term of delivery, whereas vaginal delivery following a previous Cesarean section occurred mainly two weeks before that term, with admission findings confirming advanced active labor. During the bombing, the percentage of abortions after the 12th gestational week was statistically significantly increased and the biological duration of pregnancy was reduced. The shortened duration of pregnancy, together with accelerated fetal maturation (also caused by the stress), resulted in a better perinatal outcome and a statistically significantly lower percentage of Cesarean sections.

  2. Dental esthetic satisfaction, received and desired dental treatments for improvement of esthetics.

    PubMed

    Akarslan, Zühre Zafersoy; Sadik, Burak; Erten, Hüya; Karabulut, Erdem

    2009-01-01

    The purposes of this research were to investigate factors influencing patients' satisfaction with their present dental esthetics, previous dental treatments received on anterior teeth, and the basic treatments patients wanted to undergo to improve their dental appearance. A total of 1014 patients who attended a dental school in a major city in Turkey participated in the study. The participants were surveyed with a questionnaire containing questions about gender, age, education level, self-reported tooth appearance, previous dental treatments received on anterior teeth and desired basic esthetic dental treatments. Statistical analysis of the data was performed with descriptive statistics, the chi2 test and multiple logistic regression analyses. According to these analyses, 55.1% of the patients were dissatisfied with the color of their teeth, 42.7% with their dental appearance, 29.9% with crowding of anterior teeth, 23.3% hid their teeth while smiling, 16.1% had non-esthetic restorations and 11.9% thought that their anterior teeth were protruding. Esthetic restoration was the most recently performed treatment (29.0%) and whitening of teeth was the most desired dental treatment (49.0%). Gender, age and education level had an effect on satisfaction, on previously received treatments and on the dental treatments desired for improvement of esthetics. Many of the Turkish patients surveyed were dissatisfied and desired improvement of their dental esthetics. Therefore, dentists should consider this an important dimension in their practice.

  3. Efficacy of clinical and radiological methods to identify second mesiobuccal canals in maxillary first molars.

    PubMed

    Abuabara, Allan; Baratto-Filho, Flares; Aguiar Anele, Juliana; Leonardi, Denise Piotto; Sousa-Neto, Manoel Damião

    2013-01-01

    The success of endodontic treatment depends on the identification of all root canals. Technological advances have facilitated this process as well as the assessment of internal anatomical variations. The aim of this study was to compare the efficacy of clinical and radiological methods in locating second mesiobuccal canals (MB2) in maxillary first molars. Fifty patients referred for endodontic treatment of their maxillary first molars were submitted to the following assessments: periapical radiographic analysis; access and clinical analysis; cone-beam computed tomography (CBCT); post-CBCT clinical analysis; clinical analysis using an operating microscope; and clinical analysis after the use of Start X ultrasonic inserts in teeth with negative results in all previous analyses. Periapical radiographic analysis revealed the presence of MB2 in four (8%) teeth, clinical analysis in 25 (50%), CBCT analysis in 27 (54%) and clinical analysis following CBCT and using an operating microscope in 27 (54%) and 29 (58%) teeth, respectively. The use of Start X ultrasonic inserts allowed the detection of two additional teeth with MB2 (62%). According to Vertucci's classification, 48% of the mesiobuccal canals found were type I, 28% type II, 18% type IV and 6% type V. Statistical analysis showed no significant differences (p > 0.5) in the ability of CBCT to detect MB2 canals compared with clinical assessment with or without an operating microscope. A significant difference (p < 0.001) was found only between periapical radiography and clinical/CBCT evaluations. Combined use of different methods increased the detection of the second canal in MB roots, but without statistical difference among CBCT, operating microscope, Start X and clinical analysis.

  4. Statistical principle and methodology in the NISAN system.

    PubMed Central

    Asano, C

    1979-01-01

    The NISAN system is a new interactive statistical analysis program package constructed by an organization of Japanese statisticians. The package is widely applicable to both confirmatory and exploratory analysis, and is designed to capture statistical wisdom and to help senior statisticians choose an optimal process of statistical analysis. PMID:540594

  5. Railroad safety statistics annual report 1999

    DOT National Transportation Integrated Search

    2000-08-01

    This edition of the Railroad Safety Statistics compiles previous safety bulletins prepared by the Federal Railroad Administration (FRA). These include: the Accident/Incident Bulletin; the Highway-Rail Crossing Accident/Incident and Inventory Bulletin...

  6. Railroad safety statistics annual report 2005

    DOT National Transportation Integrated Search

    2006-12-01

    This edition of the Railroad Safety Statistics compiles previous safety bulletins prepared by the Federal Railroad Administration (FRA). These include: the Accident/Incident Bulletin; the Highway-Rail Crossing Accident/Incident And Inventory Bulletin...

  7. Railroad safety statistics annual report 2003

    DOT National Transportation Integrated Search

    2005-10-01

    This edition of the Railroad Safety Statistics compiles previous safety bulletins prepared by the Federal Railroad Administration (FRA). These include: the Accident/Incident Bulletin; the Highway-Rail Crossing Accident/Incident And Inventory Bulletin...

  8. Railroad safety statistics annual report 2004

    DOT National Transportation Integrated Search

    2005-11-01

    This edition of the Railroad Safety Statistics compiles previous safety bulletins prepared by the Federal Railroad Administration (FRA). These include: the Accident/Incident Bulletin; the Highway-Rail Crossing Accident/Incident And Inventory Bulletin...

  9. State transportation profile : summary

    DOT National Transportation Integrated Search

    2003-12-01

    The Bureau of Transportation Statistics (BTS) presents a statistical : profile of transportation in the 50 states and the District of Columbia. : This document supplements a previously published series of individual : state profiles. Like the individ...

  10. Railroad safety statistics annual report 2000

    DOT National Transportation Integrated Search

    2001-07-01

    This edition of the Railroad Safety Statistics compiles previous safety bulletins prepared by the : Federal Railroad Administration (FRA). These include: the Accident/Incident Bulletin; the : Highway-Rail Crossing Accident/Incident And Inventory Bull...

  11. Power-up: A Reanalysis of 'Power Failure' in Neuroscience Using Mixture Modeling.

    PubMed

    Nord, Camilla L; Valton, Vincent; Wood, John; Roiser, Jonathan P

    2017-08-23

    Recently, evidence for endemically low statistical power has cast neuroscience findings into doubt. If low statistical power plagues neuroscience, then this reduces confidence in the reported effects. However, if statistical power is not uniformly low, then such blanket mistrust might not be warranted. Here, we provide a different perspective on this issue, analyzing data from an influential study reporting a median power of 21% across 49 meta-analyses (Button et al., 2013). We demonstrate, using Gaussian mixture modeling, that the sample of 730 studies included in that analysis comprises several subcomponents, so the use of a single summary statistic is insufficient to characterize the nature of the distribution. We find that statistical power is extremely low for studies included in meta-analyses that reported a null result and that it varies substantially across subfields of neuroscience, with particularly low power in candidate gene association studies. Therefore, whereas power in neuroscience remains a critical issue, the notion that studies are systematically underpowered is not the full story: low power is far from a universal problem. SIGNIFICANCE STATEMENT Recently, researchers across the biomedical and psychological sciences have become concerned with the reliability of results. One marker for reliability is statistical power: the probability of finding a statistically significant result given that the effect exists. Previous evidence suggests that statistical power is low across the field of neuroscience. Our results present a more comprehensive picture of statistical power in neuroscience: on average, studies are indeed underpowered, some very seriously so, but many studies show acceptable or even exemplary statistical power. We show that this heterogeneity in statistical power is common across most subfields in neuroscience. 
This new, more nuanced picture of statistical power in neuroscience could affect not only scientific understanding, but potentially policy and funding decisions for neuroscience research. Copyright © 2017 Nord, Valton et al.
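The definition of power quoted above, the probability of a statistically significant result given that the effect exists, can be made concrete with a standard normal-approximation calculation. This is a generic illustration of the concept, not the mixture-modeling analysis of the paper:

```python
from math import sqrt
from statistics import NormalDist

def power_z_test(effect_size, n, alpha=0.05):
    """Approximate power of a two-sided one-sample z-test for a
    standardized effect size with n observations. The negligible
    contribution from the opposite tail is ignored."""
    norm = NormalDist()
    z_crit = norm.inv_cdf(1 - alpha / 2)
    return norm.cdf(effect_size * sqrt(n) - z_crit)

# A medium standardized effect (0.5) with n = 20 yields roughly 61%
# power; n = 80 lifts it well past the conventional 80% benchmark.
print(power_z_test(0.5, 20), power_z_test(0.5, 80))
```

The steep dependence of power on sample size is one reason a single median over heterogeneous subfields, each with its own typical n and effect size, obscures more than it reveals.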

  12. The use of imputed sibling genotypes in sibship-based association analysis: on modeling alternatives, power and model misspecification.

    PubMed

    Minică, Camelia C; Dolan, Conor V; Hottenga, Jouke-Jan; Willemsen, Gonneke; Vink, Jacqueline M; Boomsma, Dorret I

    2013-05-01

    When phenotypic, but no genotypic, data are available for relatives of participants in genetic association studies, previous research has shown that family-based imputed genotypes can boost statistical power when included in such studies. Here, using simulations, we compared the performance of two statistical approaches suitable for modeling imputed genotype data: the mixture approach, which involves the full distribution of the imputed genotypes, and the dosage approach, where the mean of the conditional distribution features as the imputed genotype. Simulations were run by varying sibship size, the size of the phenotypic correlations among siblings, imputation accuracy and the minor allele frequency of the causal SNP. Furthermore, as imputing sibling data and extending the model to sibships of size two or greater requires modeling the familial covariance matrix, we inquired whether model misspecification affects power. Finally, the results obtained via simulations were empirically verified in two datasets, one with a continuous phenotype (height) and one with a dichotomous phenotype (smoking initiation). Across the settings considered, the mixture and the dosage approach are equally powerful and both produce unbiased parameter estimates. In addition, the likelihood-ratio test in the linear mixed model appears to be robust to the considered misspecification in the background covariance structure, given low to moderate phenotypic correlations among siblings. Empirical results show that the inclusion of imputed sibling genotypes in association analysis does not always result in a larger test statistic. The actual test statistic may drop in value due to small effect sizes: if the power benefit is small, i.e., the change in the distribution of the test statistic under the alternative is relatively small, the probability of obtaining a smaller test statistic is greater. 
As the genetic effects are typically hypothesized to be small, in practice, the decision on whether family-based imputation could be used as a means to increase power should be informed by prior power calculations and by the consideration of the background correlation.
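The two approaches compared above can be sketched in miniature: the dosage approach replaces the unobserved sibling genotype with the mean of its conditional distribution, while the mixture approach averages the likelihood over the full distribution. The genotype probabilities and phenotype likelihoods below are hypothetical placeholders:

```python
from math import log

def genotype_dosage(p0, p1, p2):
    """Dosage approach: the expected count of the coded allele,
    i.e. the mean of the conditional genotype distribution."""
    return 0.0 * p0 + 1.0 * p1 + 2.0 * p2

def mixture_loglik(p0, p1, p2, lik0, lik1, lik2):
    """Mixture approach: one individual's log-likelihood is the log
    of the phenotype likelihood averaged over the three possible
    genotypes, weighted by their imputed probabilities."""
    return log(p0 * lik0 + p1 * lik1 + p2 * lik2)

# Imputed genotype distribution P(g=0,1,2) = (0.1, 0.3, 0.6)
print(genotype_dosage(0.1, 0.3, 0.6))                    # dosage 1.5
print(mixture_loglik(0.1, 0.3, 0.6, 0.2, 0.5, 0.3))      # uses full distribution
```

As the abstract reports, with realistic imputation accuracy the two formulations tend to deliver essentially the same power.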

  13. Comparison of untreated adolescent idiopathic scoliosis with normal controls: a review and statistical analysis of the literature.

    PubMed

    Rushton, Paul R P; Grevitt, Michael P

    2013-04-20

    Review and statistical analysis of studies evaluating health-related quality of life (HRQOL) in adolescents with untreated adolescent idiopathic scoliosis (AIS) using Scoliosis Research Society (SRS) outcomes. To apply normative values and minimum clinical important differences for the SRS-22r to the literature. Identify whether the HRQOL of adolescents with untreated AIS differs from unaffected peers and whether any differences are clinically relevant. The effect of untreated AIS on adolescent HRQOL is uncertain. The lack of published normative values and minimum clinical important difference for the SRS-22r has so far hindered our interpretation of previous studies. The publication of this background data allows these studies to be re-examined. Using suitable inclusion criteria, a literature search identified studies examining HRQOL in untreated adolescents with AIS. Each cohort was analyzed individually. Statistically significant differences were identified by using 95% confidence intervals for the difference in SRS-22r domain mean scores between the cohorts with AIS and the published data for unaffected adolescents. If the lower bound of the confidence interval was greater than the minimum clinical important difference, the difference was considered clinically significant. Of the 21 included patient cohorts, 81% reported statistically worse pain than those unaffected. Yet in only 5% of cohorts was this difference clinically important. Of the 11 cohorts included examining patient self-image, 91% reported statistically worse scores than those unaffected. In 73% of cohorts this difference was clinically significant. Affected cohorts tended to score well in function/activity and mental health domains and differences from those unaffected rarely reached clinically significant values. Pain and self-image tend to be statistically lower among cohorts with AIS than those unaffected. 
The literature to date suggests that it is only self-image which consistently differs clinically. This should be considered when assessing the possible benefits of surgery.
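The review's two-stage decision rule (statistical significance when the 95% CI for the difference in domain means excludes zero, clinical significance when the CI's lower bound exceeds the minimum clinically important difference) can be sketched as follows. The means, standard errors and MCID are hypothetical illustrative values, not figures from the paper:

```python
from math import sqrt

def domain_difference(mean_ais, se_ais, mean_norm, se_norm, mcid):
    """Apply the review's rule to one SRS-22r domain: build a 95% CI
    for the difference in mean scores between unaffected and AIS
    cohorts, then compare its lower bound with the MCID."""
    diff = mean_norm - mean_ais               # worse (lower) score in AIS cohort
    se = sqrt(se_ais ** 2 + se_norm ** 2)     # SE of the difference
    lo, hi = diff - 1.96 * se, diff + 1.96 * se
    return {
        "statistically_significant": lo > 0,
        "clinically_significant": lo > mcid,
        "ci": (lo, hi),
    }

# Pain domain (hypothetical): a 0.30-point deficit is statistically
# detectable but its CI lower bound fails to clear an MCID of 0.20.
print(domain_difference(3.90, 0.05, 4.20, 0.05, mcid=0.20))
```

This separation is exactly the pattern reported for pain: statistically worse in 81% of cohorts, yet clinically important in only 5%.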

  14. rMATS: robust and flexible detection of differential alternative splicing from replicate RNA-Seq data.

    PubMed

    Shen, Shihao; Park, Juw Won; Lu, Zhi-xiang; Lin, Lan; Henry, Michael D; Wu, Ying Nian; Zhou, Qing; Xing, Yi

    2014-12-23

    Ultra-deep RNA sequencing (RNA-Seq) has become a powerful approach for genome-wide analysis of pre-mRNA alternative splicing. We previously developed multivariate analysis of transcript splicing (MATS), a statistical method for detecting differential alternative splicing between two RNA-Seq samples. Here we describe a new statistical model and computer program, replicate MATS (rMATS), designed for detection of differential alternative splicing from replicate RNA-Seq data. rMATS uses a hierarchical model to simultaneously account for sampling uncertainty in individual replicates and variability among replicates. In addition to the analysis of unpaired replicates, rMATS also includes a model specifically designed for paired replicates between sample groups. The hypothesis-testing framework of rMATS is flexible and can assess the statistical significance over any user-defined magnitude of splicing change. The performance of rMATS is evaluated by the analysis of simulated and real RNA-Seq data. rMATS outperformed two existing methods for replicate RNA-Seq data in all simulation settings, and RT-PCR yielded a high validation rate (94%) in an RNA-Seq dataset of prostate cancer cell lines. Our data also provide guiding principles for designing RNA-Seq studies of alternative splicing. We demonstrate that it is essential to incorporate biological replicates in the study design. Of note, pooling RNAs or merging RNA-Seq data from multiple replicates is not an effective approach to account for variability, and the result is particularly sensitive to outliers. The rMATS source code is freely available at rnaseq-mats.sourceforge.net/. As the popularity of RNA-Seq continues to grow, we expect rMATS will be useful for studies of alternative splicing in diverse RNA-Seq projects.

  15. COgnitive behavioural therapy versus standardised medical care for adults with Dissociative non-Epileptic Seizures (CODES): statistical and economic analysis plan for a randomised controlled trial.

    PubMed

    Robinson, Emily J; Goldstein, Laura H; McCrone, Paul; Perdue, Iain; Chalder, Trudie; Mellers, John D C; Richardson, Mark P; Murray, Joanna; Reuber, Markus; Medford, Nick; Stone, Jon; Carson, Alan; Landau, Sabine

    2017-06-06

    Dissociative seizures (DSs), also called psychogenic non-epileptic seizures, are a distressing and disabling problem for many patients in neurological settings with high and often unnecessary economic costs. The COgnitive behavioural therapy versus standardised medical care for adults with Dissociative non-Epileptic Seizures (CODES) trial is an evaluation of a specifically tailored psychological intervention with the aims of reducing seizure frequency and severity and improving psychological well-being in adults with DS. The aim of this paper is to report in detail the quantitative and economic analysis plan for the CODES trial, as agreed by the trial steering committee. The CODES trial is a multicentre, pragmatic, parallel group, randomised controlled trial performed to evaluate the clinical effectiveness and cost-effectiveness of 13 sessions of cognitive behavioural therapy (CBT) plus standardised medical care (SMC) compared with SMC alone for adult outpatients with DS. The objectives and design of the trial are summarised, and the aims and procedures of the planned analyses are illustrated. The proposed analysis plan addresses statistical considerations such as maintaining blinding, monitoring adherence with the protocol, describing aspects of treatment and dealing with missing data. The formal analysis approach for the primary and secondary outcomes is described, as are the descriptive statistics that will be reported. This paper provides transparency to the planned inferential analyses for the CODES trial prior to the extraction of outcome data. It also provides an update to the previously published trial protocol and guidance to those conducting similar trials. ISRCTN registry ISRCTN05681227 (registered on 5 March 2014); ClinicalTrials.gov NCT02325544 (registered on 15 December 2014).

  16. Joint multi-population analysis for genetic linkage of bipolar disorder or "wellness" to chromosome 4p.

    PubMed

    Visscher, P M; Haley, C S; Ewald, H; Mors, O; Egeland, J; Thiel, B; Ginns, E; Muir, W; Blackwood, D H

    2005-02-05

    To test the hypothesis that the same genetic loci confer susceptibility to, or protection from, disease in different populations, and that a combined analysis would improve the map resolution of a common susceptibility locus, we analyzed data from three studies that had reported linkage to bipolar disorder in a small region on chromosome 4p. Data sets comprised phenotypic information and genetic marker data on Scottish, Danish, and USA extended pedigrees. Across the three data sets, 913 individuals appeared in the pedigrees; 462 were classified either as unaffected (323) or affected (139) with unipolar or bipolar disorder. A consensus linkage map was created from 14 microsatellite markers in a 33 cM region. Phenotypic and genetic data were analyzed using a variance component (VC) and an allele sharing method. All previously reported elevated test statistics in the region were confirmed with one or both analysis methods, indicating the presence of one or more susceptibility genes for bipolar disorder in the studied chromosome segment in the three populations. When the results from both the VC and allele sharing methods were considered, there was strong evidence for a susceptibility locus in the data from Scotland, some evidence in the data from Denmark and relatively less evidence in the data from the USA. The test statistics from the Scottish data set dominated those from the other studies, and no improved map resolution for a putative genetic locus underlying susceptibility in all three studies was obtained. Studies reporting linkage to the same region require careful scrutiny and preferably joint or meta-analysis on the same basis in order to ensure that the results are truly comparable. (c) 2004 Wiley-Liss, Inc.

  17. Railroad safety statistics annual report 1998

    DOT National Transportation Integrated Search

    1999-07-01

    This edition of the Railroad Safety Statistics is a composite of previous safety bulletins prepared by the Federal Railroad Administration (FRA). These include: the Accident/Incident Bulletin; the Highway-Rail Crossing Accident/Incident And Inventory...

  18. Matrix metalloproteinases and educational attainment in refractive error: evidence of gene-environment interactions in the AREDS study

    PubMed Central

    Wojciechowski, Robert; Yee, Stephanie S.; Simpson, Claire L.; Bailey-Wilson, Joan E.; Stambolian, Dwight

    2012-01-01

    Purpose A previous study of Old Order Amish families has shown association of ocular refraction with markers proximal to matrix metalloproteinase (MMP) genes MMP1 and MMP10 and intragenic to MMP2. We conducted a candidate gene replication study of association between refraction and single nucleotide polymorphisms (SNPs) within these genomic regions. Design Candidate gene genetic association study. Participants 2,000 participants drawn from the Age Related Eye Disease Study (AREDS) were chosen for genotyping. After quality control filtering, 1912 individuals were available for analysis. Methods Microarray genotyping was performed using the HumanOmni 2.5 bead array. SNPs originally typed in the previous Amish association study were extracted for analysis. In addition, haplotype tagging SNPs were genotyped using TaqMan assays. Quantitative trait association analyses of mean spherical equivalent refraction (MSE) were performed on 30 markers using linear regression models and an additive genetic risk model, while adjusting for age, sex, education, and population substructure. Post-hoc analyses were performed after stratifying on a dichotomous education variable. Pointwise (P-emp) and multiple-test study-wise (P-multi) significance levels were calculated empirically through permutation. Main outcome measures MSE was used as a quantitative measure of ocular refraction. Results The mean age and ocular refraction were 68 years (SD=4.7) and +0.55 D (SD=2.14), respectively. Pointwise statistical significance was obtained for rs1939008 (P-emp=0.0326). No SNP attained statistical significance after correcting for multiple testing. In stratified analyses, multiple SNPs reached pointwise significance in the lower-education group: 2 of these were statistically significant after multiple testing correction. The two highest-ranking SNPs in Amish families (rs1939008 and rs9928731) showed pointwise P-emp<0.01 in the lower-education stratum of AREDS participants. 
Conclusions We show suggestive evidence of replication of an association signal for ocular refraction at a marker between MMP1 and MMP10. We also provide evidence of a gene-environment interaction between previously reported markers and education on refractive error. Variants in the MMP1-MMP10 and MMP2 regions appear to affect population variation in ocular refraction under environmental conditions less favorable for myopia development. PMID:23098370

  19. Classification of Rotor Induced Shearing Events in the Near Wake of a Wind Turbine Array Boundary Layer

    NASA Astrophysics Data System (ADS)

    Smith, Sarah; Viggiano, Bianca; Ali, Naseem; Cal, Raul Bayoan

    2017-11-01

    Flow perturbation induced by a turbine rotor imposes considerable turbulence and shearing effects in the near wake of a turbine, altering the efficiency of subsequent units within a wind farm array. Previous methods have characterized the near wake vorticity of a turbine and the recovery distance of various turbine array configurations. This study builds on previous analysis of a turbine rotor within an array and develops a model to examine stress events and energy contribution in the near wake due to rotational effects. Hot wire anemometry was employed downstream of a turbine centrally located in the third row of a 3x3 array. Data were collected at points planar to the rotor and included simultaneous streamwise and wall-normal velocities as well as concurrent streamwise and transverse velocities. Conditional analysis of the Reynolds stresses induced by the rotor agrees with earlier near wake research, and examination of the stresses in terms of streamwise and transverse velocity components depicts areas of significant rotational effects. Continued analysis includes spectral decomposition and conditional statistics to further characterize shearing events at various points across the swept area of the rotor.
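Conditional analysis of Reynolds stresses of the kind described above is commonly done by quadrant decomposition: each sample is classified by the signs of its streamwise and wall-normal fluctuations, and the share of u'v' carried by each quadrant is tallied. A minimal sketch with hypothetical fluctuation samples, not the study's measurements:

```python
def quadrant_shares(u_fluct, v_fluct):
    """Quadrant decomposition of the Reynolds shear stress u'v':
    Q1 (+u', +v') outward interaction, Q2 (-u', +v') ejection,
    Q3 (-u', -v') inward interaction, Q4 (+u', -v') sweep.
    Returns each quadrant's contribution to the mean of u'v'."""
    shares = {1: 0.0, 2: 0.0, 3: 0.0, 4: 0.0}
    for u, v in zip(u_fluct, v_fluct):
        if u >= 0 and v >= 0:
            q = 1
        elif u < 0 and v >= 0:
            q = 2
        elif u < 0:
            q = 3
        else:
            q = 4
        shares[q] += u * v
    n = len(u_fluct)
    return {q: s / n for q, s in shares.items()}

# Hypothetical fluctuation samples: ejections (Q2) and sweeps (Q4)
# carry the negative stress typical of a sheared wake.
u = [0.5, -0.4, -0.3, 0.6]
v = [0.2, 0.3, -0.2, -0.5]
print(quadrant_shares(u, v))
```

Summing the four shares recovers the total mean Reynolds shear stress, so the decomposition attributes the stress to event types without losing any of it.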

  20. Facts about Newspapers '85: A Statistical Summary of the Newspaper Business.

    ERIC Educational Resources Information Center

    American Newspaper Publishers Association, Washington, DC.

    A statistical summary of the newspaper industry for 1984 and previous years is presented in this brochure. Focusing primarily on the United States newspaper industry, the brochure also contains some information on Canadian newspapers. The brochure presents statistics in the following categories: (1) number of daily newspapers, (2) daily newspaper…

  1. Retention of Statistical Concepts in a Preliminary Randomization-Based Introductory Statistics Curriculum

    ERIC Educational Resources Information Center

    Tintle, Nathan; Topliff, Kylie; VanderStoep, Jill; Holmes, Vicki-Lynn; Swanson, Todd

    2012-01-01

    Previous research suggests that a randomization-based introductory statistics course may improve student learning compared to the consensus curriculum. However, it is unclear whether these gains are retained by students post-course. We compared the conceptual understanding of a cohort of students who took a randomization-based curriculum (n = 76)…

  2. Comparing Student Success and Understanding in Introductory Statistics under Consensus and Simulation-Based Curricula

    ERIC Educational Resources Information Center

    Hildreth, Laura A.; Robison-Cox, Jim; Schmidt, Jade

    2018-01-01

    This study examines the transferability of results from previous studies of simulation-based curriculum in introductory statistics using data from 3,500 students enrolled in an introductory statistics course at Montana State University from fall 2013 through spring 2016. During this time, four different curricula, a traditional curriculum and…

  3. Periodicity of Strong Seismicity in Italy: Schuster Spectrum Analysis Extended to the Destructive Earthquakes of 2016

    NASA Astrophysics Data System (ADS)

    Bragato, P. L.

    2017-10-01

    The strong earthquakes that occurred in Italy between 2009 and 2016 represent an abrupt acceleration of seismicity compared with the previous 30 years. Such behavior seems to agree with the periodic rate change I observed in a previous paper. The present work improves that study by extending the data set up to the end of 2016, adopting the latest version of the historical seismic catalog of Italy, and introducing Schuster spectrum analysis for the detection of the oscillatory period and the assessment of its statistical significance. Applied to the declustered catalog of Mw ≥ 6 earthquakes that occurred between 1600 and 2016, the analysis identifies a marked periodicity of 46 years, which is recognized above the 95% confidence level. Monte Carlo simulation shows that the oscillatory behavior is stable with respect to random errors in magnitude estimation. A parametric oscillatory model for the annual rate of seismicity is estimated by likelihood maximization under the hypothesis of an inhomogeneous Poisson point process. According to the Akaike Information Criterion, this model outperforms the simpler homogeneous one with constant annual rate. A further element emerges from the analysis: so far, despite the recent earthquakes, Italian seismicity remains within a long-term decreasing trend established since the first half of the twentieth century.
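Schuster spectrum analysis assesses periodicity by wrapping event times onto a trial period and testing whether the resulting phases cluster. A minimal sketch on synthetic event years (the catalog, the built-in 46-year cycle, and the period grid are all invented for illustration, not the Italian catalog):

```python
import numpy as np

def schuster_p(times, period):
    """Schuster-test p-value for periodicity at one trial period.

    Event times are wrapped onto the trial period; under the null of no
    periodicity the resultant R of the unit phase vectors is small and
    p ≈ exp(-R^2 / n).
    """
    phases = 2.0 * np.pi * np.asarray(times, dtype=float) / period
    r2 = np.cos(phases).sum() ** 2 + np.sin(phases).sum() ** 2
    return np.exp(-r2 / len(phases))

# Hypothetical declustered catalog: strong events roughly every 46 years.
rng = np.random.default_rng(1)
events = 1600.0 + 46.0 * np.arange(9) + rng.normal(0.0, 1.0, 9)

periods = np.arange(30, 101)               # trial periods in years
pvals = np.array([schuster_p(events, T) for T in periods])
best = periods[pvals.argmin()]
print(best)
```

Scanning p-values over a grid of periods is the "spectrum" part; the empirical significance of the minimum would then be calibrated against catalogs with randomized event times.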

  4. Modeling spatiotemporal covariance for magnetoencephalography or electroencephalography source analysis.

    PubMed

    Plis, Sergey M; George, J S; Jun, S C; Paré-Blagoev, J; Ranken, D M; Wood, C C; Schmidt, D M

    2007-01-01

    We propose a new model to approximate spatiotemporal noise covariance for use in neural electromagnetic source analysis, which better captures temporal variability in background activity. As with other existing formalisms, our model employs a Kronecker product of matrices representing temporal and spatial covariance. In our model, spatial components are allowed to have differing temporal covariances. Variability is represented as a series of Kronecker products of spatial component covariances and corresponding temporal covariances. Unlike previous attempts to model covariance through a sum of Kronecker products, our model is designed to have a computationally manageable inverse. Despite increased descriptive power, inversion of the model is fast, making it useful in source analysis. We have explored two versions of the model. One is estimated based on the assumption that spatial components of background noise have uncorrelated time courses. Another version, which gives a closer approximation, is based on the assumption that time courses are statistically independent. The accuracy of the structural approximation is compared to an existing model, based on a single Kronecker product, using both the Frobenius norm of the difference between the spatiotemporal sample covariance and the model, and scatter plots. The performance of our model and previous models is compared in source analysis of a large number of single dipole problems with simulated time courses and with background from authentic magnetoencephalography data.
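The computational appeal of a Kronecker-product covariance is that its inverse factors into the inverses of the small spatial and temporal matrices, (S ⊗ T)⁻¹ = S⁻¹ ⊗ T⁻¹. A minimal numerical check with made-up dimensions (the paper's model sums several such products; only the single-product identity is shown here):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical sizes: a small "spatial" and "temporal" dimension.
ns, nt = 4, 6

def random_spd(n):
    """A random symmetric positive-definite matrix."""
    a = rng.normal(size=(n, n))
    return a @ a.T + n * np.eye(n)

S = random_spd(ns)   # spatial covariance
T = random_spd(nt)   # temporal covariance

# Single Kronecker-product model: C = S ⊗ T, of size (ns*nt) x (ns*nt).
C = np.kron(S, T)

# Key property: inverting C only requires the two small inverses.
C_inv_fast = np.kron(np.linalg.inv(S), np.linalg.inv(T))
C_inv_direct = np.linalg.inv(C)
print(np.allclose(C_inv_fast, C_inv_direct))   # prints True
```

For realistic MEG sizes (hundreds of sensors, hundreds of time samples) this factorization is what keeps whitening with the model tractable.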

  5. Latent transition analysis of pre-service teachers' efficacy in mathematics and science

    NASA Astrophysics Data System (ADS)

    Ward, Elizabeth Kennedy

    This study modeled changes in pre-service teacher efficacy in mathematics and science over the final year of teacher preparation using latent transition analysis (LTA), a longitudinal form of analysis that builds on two modeling traditions: latent class analysis (LCA) and auto-regressive modeling. Data were collected using the STEBI-B, MTEBI-r, and ABNTMS instruments. The findings suggest that LTA is a viable technique for use in teacher efficacy research. Teacher efficacy is modeled as a construct with two dimensions: personal teaching efficacy (PTE) and outcome expectancy (OE). Findings suggest that the mathematics and science teaching efficacy (PTE) of pre-service teachers is a multi-class phenomenon. The analyses revealed a four-class model of PTE at the beginning and end of the final year of teacher training. Results indicate that when pre-service teachers transition between classes, they tend to move from a lower efficacy class into a higher efficacy class. In addition, the findings suggest that time-varying variables (attitudes and beliefs) and time-invariant variables (previous coursework, previous experiences, and teacher perceptions) are statistically significant predictors of efficacy class membership. Further, analyses suggest that the measures used to assess outcome expectancy are not suitable for LCA and LTA procedures.

  6. Brief communication: Skeletal biology past and present: Are we moving in the right direction?

    PubMed

    Hens, Samantha M; Godde, Kanya

    2008-10-01

    In 1982, Spencer's edited volume A History of American Physical Anthropology: 1930-1980 allowed numerous authors to document the state of our science, including a critical examination of skeletal biology. Some authors argued that the first 50 years of skeletal biology were characterized by the descriptive-historical approach with little regard for processual problems and that technological and statistical analyses were not rooted in theory. In an effort to determine whether Spencer's landmark volume impacted the field of skeletal biology, a content analysis was carried out for the American Journal of Physical Anthropology from 1980 to 2004. The percentage of skeletal biology articles is similar to that of previous decades. Analytical articles averaged only 32% and are defined by three criteria: statistical analysis, hypothesis testing, and broader explanatory context. However, when these criteria were scored individually, nearly 80% of papers attempted a broader theoretical explanation, 44% tested hypotheses, and 67% used advanced statistics, suggesting that the skeletal biology papers in the journal have an analytical emphasis. Considerable fluctuation exists between subfields; trends toward a more analytical approach are witnessed in the subfields of age/sex/stature/demography, skeletal maturation, anatomy, and nonhuman primate studies, which also increased in frequency, while paleontology and pathology were largely descriptive. Comparisons to the International Journal of Osteoarchaeology indicate that there are statistically significant differences between the two journals in terms of analytical criteria. These data indicate a positive shift in theoretical thinking, i.e., an attempt by most to explain processes rather than present a simple description of events.

  7. StreamStats: a U.S. geological survey web site for stream information

    USGS Publications Warehouse

    Kernell, G. Ries; Gray, John R.; Renard, Kenneth G.; McElroy, Stephen A.; Gburek, William J.; Canfield, H. Evan; Scott, Russell L.

    2003-01-01

    The U.S. Geological Survey has developed a Web application, named StreamStats, for providing streamflow statistics, such as the 100-year flood and the 7-day, 10-year low flow, to the public. Statistics can be obtained for data-collection stations and for ungaged sites. Streamflow statistics are needed for water-resources planning and management; for design of bridges, culverts, and flood-control structures; and for many other purposes. StreamStats users can point and click on data-collection stations shown on a map in their Web browser window to obtain previously determined streamflow statistics and other information for the stations. Users also can point and click on any stream shown on the map to get estimates of streamflow statistics for ungaged sites. StreamStats determines the watershed boundaries and measures physical and climatic characteristics of the watersheds for the ungaged sites by use of a Geographic Information System (GIS), and then it inserts the characteristics into previously determined regression equations to estimate the streamflow statistics. Compared to manual methods, StreamStats reduces the average time needed to estimate streamflow statistics for ungaged sites from several hours to several minutes.
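The "insert characteristics into previously determined regression equations" step typically takes a log-linear form like the one below. The coefficients and the two watershed characteristics are entirely hypothetical, for illustration only; real StreamStats equations vary by region and statistic.

```python
import math

# Hypothetical regional regression equation of the kind StreamStats applies:
#   log10(Q100) = a + b*log10(drainage area) + c*log10(mean annual precip)
# The coefficients below are made up and belong to no real region.
a, b, c = 0.5, 0.85, 0.9

def q100_estimate(drainage_area_mi2, mean_annual_precip_in):
    """Estimate the 100-year flood (cfs) from two watershed characteristics."""
    log_q = (a + b * math.log10(drainage_area_mi2)
             + c * math.log10(mean_annual_precip_in))
    return 10.0 ** log_q

print(round(q100_estimate(120.0, 45.0)))
```

The GIS part of StreamStats automates producing the inputs (delineating the watershed and measuring its characteristics); the estimate itself is then a direct evaluation like this.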

  8. Quantitative knowledge acquisition for expert systems

    NASA Technical Reports Server (NTRS)

    Belkin, Brenda L.; Stengel, Robert F.

    1991-01-01

    A common problem in the design of expert systems is the definition of rules from data obtained in system operation or simulation. While it is relatively easy to collect data and to log the comments of human operators engaged in experiments, generalizing such information to a set of rules has not previously been a direct task. A statistical method is presented for generating rule bases from numerical data, motivated by an example based on aircraft navigation with multiple sensors. The specific objective is to design an expert system that selects a satisfactory suite of measurements from a dissimilar, redundant set, given an arbitrary navigation geometry and possible sensor failures. The systematic development of a Navigation Sensor Management (NSM) expert system from Kalman filter covariance data is described. The method invokes two statistical techniques: Analysis of Variance (ANOVA) and the ID3 algorithm. The ANOVA technique indicates whether variations of problem parameters give statistically different covariance results, and the ID3 algorithm identifies the relationships between the problem parameters using probabilistic knowledge extracted from a simulation example set. Both are detailed.
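The ID3 criterion mentioned above selects splits by information gain, i.e. the reduction in label entropy after partitioning the examples on one attribute. A toy sketch with invented sensor-selection examples (the attribute names and labels are hypothetical, not from the NSM system):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a label list."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, attr_index, labels):
    """ID3 split criterion: entropy reduction from splitting on one attribute."""
    by_value = {}
    for row, lab in zip(rows, labels):
        by_value.setdefault(row[attr_index], []).append(lab)
    remainder = sum(len(sub) / len(labels) * entropy(sub)
                    for sub in by_value.values())
    return entropy(labels) - remainder

# Toy examples: (navigation_geometry, sensor_failed) -> suite satisfactory?
rows = [("good", "no"), ("good", "yes"), ("poor", "no"), ("poor", "yes")]
labels = ["yes", "yes", "no", "no"]
gain_geometry = information_gain(rows, 0, labels)
gain_failure = information_gain(rows, 1, labels)
print(gain_geometry, gain_failure)   # prints: 1.0 0.0
```

Here geometry perfectly predicts the label (gain 1 bit) while the failure flag carries no information (gain 0), so ID3 would split on geometry first.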

  9. Modeling of a Robust Confidence Band for the Power Curve of a Wind Turbine.

    PubMed

    Hernandez, Wilmar; Méndez, Alfredo; Maldonado-Correa, Jorge L; Balleteros, Francisco

    2016-12-07

    Having an accurate model of the power curve of a wind turbine allows us to better monitor its operation and plan storage capacity. Since wind speed and direction are of a highly stochastic nature, the forecasting of the power generated by the wind turbine is of the same nature as well. In this paper, a method for obtaining a robust confidence band containing the power curve of a wind turbine under test conditions is presented. Here, the confidence band is bounded by two curves which are estimated using parametric statistical inference techniques. However, the observations that are used for carrying out the statistical analysis are obtained by using the binning method, and in each bin, the outliers are eliminated by using a censorship process based on robust statistical techniques. Then, the observations that are not outliers are divided into observation sets. Finally, both the power curve of the wind turbine and the two curves that define the robust confidence band are estimated using each of the previously mentioned observation sets.
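The binning-plus-robust-censoring idea can be sketched roughly as follows: sort observations into wind-speed bins, discard outliers in each bin with a robust (median/MAD) rule, and estimate the power curve and a band from what remains. The turbine response, noise levels, bin width, and the quantile-based band below are illustrative assumptions, not the paper's parametric procedure.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical 10-min records: wind speed (m/s) and power (kW), with a few
# spurious zero-power records standing in for curtailment/downtime outliers.
speed = rng.uniform(3.0, 15.0, 2000)
power = 1500.0 / (1.0 + np.exp(-(speed - 9.0))) + rng.normal(0.0, 40.0, speed.size)
power[rng.choice(speed.size, 40, replace=False)] = 0.0

edges = np.arange(3.0, 15.5, 0.5)          # 0.5 m/s bins (binning method)
centers, lo, mid, hi = [], [], [], []
for a, b in zip(edges[:-1], edges[1:]):
    p = power[(speed >= a) & (speed < b)]
    med = np.median(p)
    mad = np.median(np.abs(p - med))
    # Robust censorship: drop points farther than ~3 sigma (MAD-scaled) from the median.
    keep = p[np.abs(p - med) <= 3.0 * 1.4826 * mad + 1e-9]
    centers.append(0.5 * (a + b))
    lo.append(np.quantile(keep, 0.025))    # lower curve of the band
    mid.append(keep.mean())                # power curve estimate
    hi.append(np.quantile(keep, 0.975))    # upper curve of the band
print(len(centers))
```

The zero-power records survive censoring only in the lowest bins, where zero output is plausible; near rated power they sit far from the bin median and are removed.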

  10. Modeling of a Robust Confidence Band for the Power Curve of a Wind Turbine

    PubMed Central

    Hernandez, Wilmar; Méndez, Alfredo; Maldonado-Correa, Jorge L.; Balleteros, Francisco

    2016-01-01

    Having an accurate model of the power curve of a wind turbine allows us to better monitor its operation and plan storage capacity. Since wind speed and direction are of a highly stochastic nature, the forecasting of the power generated by the wind turbine is of the same nature as well. In this paper, a method for obtaining a robust confidence band containing the power curve of a wind turbine under test conditions is presented. Here, the confidence band is bounded by two curves which are estimated using parametric statistical inference techniques. However, the observations that are used for carrying out the statistical analysis are obtained by using the binning method, and in each bin, the outliers are eliminated by using a censorship process based on robust statistical techniques. Then, the observations that are not outliers are divided into observation sets. Finally, both the power curve of the wind turbine and the two curves that define the robust confidence band are estimated using each of the previously mentioned observation sets. PMID:27941604

  11. Continuum radiation from active galactic nuclei: A statistical study

    NASA Technical Reports Server (NTRS)

    Isobe, T.; Feigelson, E. D.; Singh, K. P.; Kembhavi, A.

    1986-01-01

    The physics of the continuum spectrum of active galactic nuclei (AGNs) was examined using a large data set and rigorous statistical methods. A data base was constructed for 469 objects, including radio-selected quasars, optically selected quasars, X-ray-selected AGNs, BL Lac objects, and optically unidentified compact radio sources. Each object has measurements of its radio, optical, and X-ray core continuum luminosity, though many of them are upper limits. Since many radio sources have extended components, the core components were carefully separated out from the total radio luminosity. With survival analysis statistical methods, which can treat upper limits correctly, these data yield better statistical results than those previously obtained. A variety of statistical tests are performed, such as the comparison of the luminosity functions in different subsamples and linear regressions of luminosities in different bands. Interpretation of the results leads to the following tentative conclusions: the main emission mechanism of optically selected quasars and X-ray-selected AGNs is thermal, while that of BL Lac objects is synchrotron; radio-selected quasars may have two different emission mechanisms in the X-ray band; BL Lac objects appear to be special cases of the radio-selected quasars; some compact radio sources show the possibility of synchrotron self-Compton (SSC) emission in the optical band; and the spectral index between the optical and the X-ray bands depends on the optical luminosity.

  12. Improved Statistical Fault Detection Technique and Application to Biological Phenomena Modeled by S-Systems.

    PubMed

    Mansouri, Majdi; Nounou, Mohamed N; Nounou, Hazem N

    2017-09-01

    In our previous work, we demonstrated the effectiveness of the linear multiscale principal component analysis (PCA)-based moving window (MW)-generalized likelihood ratio test (GLRT) technique over the classical PCA and multiscale principal component analysis (MSPCA)-based GLRT methods. The developed fault detection algorithm provided optimal properties by maximizing the detection probability for a particular false alarm rate (FAR) across different window lengths. However, most real systems are nonlinear, which limits how well the linear PCA method can handle nonlinearity. Thus, in this paper, we first apply a nonlinear PCA to obtain an accurate principal component of a set of data and handle a wide range of nonlinearities using the kernel principal component analysis (KPCA) model, one of the most popular nonlinear statistical methods. Second, we extend the MW-GLRT technique to one that applies exponential weights to the residuals in the moving window (instead of equal weighting), which can further improve fault detection performance by reducing the FAR through an exponentially weighted moving average (EWMA). The developed detection method, called EWMA-GLRT, provides improved properties, such as smaller missed detection rates and FARs and a smaller average run length. The idea behind the developed EWMA-GLRT is to compute a new GLRT statistic that integrates current and previous data in a decreasing exponential fashion, giving more weight to the more recent data. This provides a more accurate estimation of the GLRT statistic and a stronger memory that enables better decision making with respect to fault detection. Therefore, in this paper, a KPCA-based EWMA-GLRT method is developed and applied to improve fault detection in biological phenomena modeled by S-systems and to enhance monitoring of the process mean.
The idea behind the KPCA-based EWMA-GLRT fault detection algorithm is to combine the advantages of the proposed EWMA-GLRT fault detection chart with the KPCA model. It is used to enhance fault detection in the Cad System in E. coli model by monitoring some of the key variables involved in this model, such as enzymes, transport proteins, regulatory proteins, lysine, and cadaverine. The results demonstrate the effectiveness of the proposed KPCA-based EWMA-GLRT method over the Q, GLRT, EWMA, Shewhart, and moving window-GLRT methods. The detection performance is assessed and evaluated in terms of FAR, missed detection rates, and average run length (ARL1) values.
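The exponential weighting at the heart of EWMA can be sketched with a plain EWMA chart on a residual stream. This is the standard textbook chart, not the authors' full EWMA-GLRT statistic; the shift size, smoothing weight λ, and 3-sigma limit are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical residual stream: in control for 200 samples, then a mean shift
# of 2 standard deviations, standing in for a fault.
resid = np.concatenate([rng.normal(0.0, 1.0, 200), rng.normal(2.0, 1.0, 100)])

lam = 0.2                                   # exponential weight on recent data
sigma = 1.0                                 # in-control residual std
z = np.zeros_like(resid)
for t in range(1, resid.size):
    # EWMA recursion: current sample weighted lam, history decays as (1-lam)^k.
    z[t] = lam * resid[t] + (1.0 - lam) * z[t - 1]

# Asymptotic 3-sigma control limit for the smoothed statistic.
limit = 3.0 * sigma * np.sqrt(lam / (2.0 - lam))
alarm = int(np.argmax(np.abs(z) > limit))   # first sample beyond the limit
print(alarm)
```

Because the smoothed statistic has much lower variance than the raw residuals, small sustained shifts cross the limit that a Shewhart chart on the raw stream would miss.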

  13. PAH Baselines for Amazonic Surficial Sediments: A Case of Study in Guajará Bay and Guamá River (Northern Brazil).

    PubMed

    Rodrigues, Camila Carneiro Dos Santos; Santos, Ewerton; Ramos, Brunalisa Silva; Damasceno, Flaviana Cardoso; Correa, José Augusto Martins

    2018-06-01

    The 16 priority PAH were determined in sediment samples from the insular zone of Guajará Bay and Guamá River (Southern Amazon River mouth). Low hydrocarbon levels were observed, and naphthalene was the most representative PAH. The low molecular weight PAH represented 51% of the total PAH. Statistical analysis showed that the sampling sites are not significantly different. Source analysis by PAH ratios and principal component analysis revealed that the PAH derive primarily from a low rate of fossil fuel combustion, mainly related to local small-community activity. According to the sediment quality guidelines, no samples presented biological stress or damage potential. This study discusses baselines for PAH in surface sediments from Amazonic aquatic systems based on source determination by PAH ratios and principal component analysis, sediment quality guidelines, and comparison with data from previous studies.

  14. An overview of groundwater chemistry studies in Malaysia.

    PubMed

    Kura, Nura Umar; Ramli, Mohammad Firuz; Sulaiman, Wan Nor Azmin; Ibrahim, Shaharin; Aris, Ahmad Zaharin

    2018-03-01

    In this paper, numerous studies on groundwater in Malaysia were reviewed with the aim of evaluating past trends and the current status for discerning the sustainability of the water resources in the country. It was found that most of the previous groundwater studies (44 %) focused on the islands and mostly concentrated on qualitative assessment, with more emphasis being placed on seawater intrusion studies. This was then followed by inland-based studies, with Selangor state leading the studies, reflecting the current water challenges facing the state. From a methodological perspective, geophysics, graphical methods, and statistical analysis are the dominant techniques (38, 25, and 25 %, respectively). The geophysical methods, especially the 2D resistivity method, cut across many subjects such as seawater intrusion studies, quantitative assessment, and hydraulic parameter estimation. The statistical techniques used include multivariate statistical analysis techniques and ANOVA among others, most of which are quality-related studies using major ions, in situ parameters, and heavy metals. Conversely, numerical techniques like MODFLOW were somewhat less popular, likely due to their complexity and high data demand. This work will facilitate researchers in identifying the specific areas which need improvement and focus, while, at the same time, providing policymakers and managers with an executive summary and knowledge of the current situation in groundwater studies and where more work needs to be done for sustainable development.

  15. Analysis of a genetically structured variance heterogeneity model using the Box-Cox transformation.

    PubMed

    Yang, Ye; Christensen, Ole F; Sorensen, Daniel

    2011-02-01

    Over recent years, statistical support for the presence of genetic factors operating at the level of the environmental variance has come from fitting a genetically structured heterogeneous variance model to field or experimental data in various species. Misleading results may arise due to skewness of the marginal distribution of the data. To investigate how the scale of measurement affects inferences, the genetically structured heterogeneous variance model is extended to accommodate the family of Box-Cox transformations. Litter size data in rabbits and pigs that had previously been analysed in the untransformed scale were reanalysed in a scale equal to the mode of the marginal posterior distribution of the Box-Cox parameter. In the rabbit data, the statistical evidence for a genetic component at the level of the environmental variance is considerably weaker than that resulting from an analysis in the original metric. In the pig data, the statistical evidence is stronger, but the coefficient of correlation between additive genetic effects affecting mean and variance changes sign, compared to the results in the untransformed scale. The study confirms that inferences on variances can be strongly affected by the presence of asymmetry in the distribution of data. We recommend that to avoid one important source of spurious inferences, future work seeking support for a genetic component acting on environmental variation using a parametric approach based on normality assumptions confirms that these are met.
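The Box-Cox reanalysis step can be sketched with SciPy, which estimates the transformation parameter by maximum likelihood; the simulated skewed trait below is a stand-in for the litter size data, not the actual rabbit or pig records.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Hypothetical positively skewed trait (e.g. a litter-size-like variable).
y = rng.lognormal(mean=2.0, sigma=0.4, size=1000)

# Box-Cox: estimate the transformation parameter lambda by maximum likelihood,
# then reanalyse on the transformed scale.
y_bc, lam_hat = stats.boxcox(y)

print(round(float(lam_hat), 2), round(float(stats.skew(y)), 2),
      round(float(stats.skew(y_bc)), 2))
```

For lognormal-like data the fitted lambda lands near zero (the log transform), and the marginal skewness of the transformed trait shrinks toward zero, which is exactly the condition under which normality-based variance-heterogeneity inferences become trustworthy.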

  16. Continuous diffraction of molecules and disordered molecular crystals

    PubMed Central

    Yefanov, Oleksandr M.; Ayyer, Kartik; White, Thomas A.; Barty, Anton; Morgan, Andrew; Mariani, Valerio; Oberthuer, Dominik; Pande, Kanupriya

    2017-01-01

    The intensities of far-field diffraction patterns of orientationally aligned molecules obey Wilson statistics, whether those molecules are in isolation (giving rise to a continuous diffraction pattern) or arranged in a crystal (giving rise to Bragg peaks). Ensembles of molecules in several orientations, but uncorrelated in position, give rise to the incoherent sum of the diffraction from those objects, modifying the statistics in a similar way as crystal twinning modifies the distribution of Bragg intensities. This situation arises in the continuous diffraction of laser-aligned molecules or translationally disordered molecular crystals. This paper develops the analysis of the intensity statistics of such continuous diffraction to obtain parameters such as scaling, beam coherence and the number of contributing independent object orientations. When measured, continuous molecular diffraction is generally weak and accompanied by a background that far exceeds the strength of the signal. Instead of just relying upon the smallest measured intensities or their mean value to guide the subtraction of the background, it is shown how all measured values can be utilized to estimate the background, noise and signal, by employing a modified ‘noisy Wilson’ distribution that explicitly includes the background. Parameters relating to the background and signal quantities can be estimated from the moments of the measured intensities. The analysis method is demonstrated on previously published continuous diffraction data measured from crystals of photosystem II [Ayyer et al. (2016), Nature, 530, 202–206]. PMID:28808434
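Estimating background and signal from moments of the measured intensities can be sketched under a simplified noisy-Wilson-style model: an exponential (acentric Wilson) signal plus an independent Gaussian background. The moment identities then give closed-form estimates. All numbers below are invented, and the paper's actual model and estimators may differ; this only illustrates the moment-matching idea.

```python
import numpy as np

rng = np.random.default_rng(6)

# Invented ground truth: mean signal S, background level B, background noise sigma.
S_true, B_true, sig_true = 50.0, 120.0, 25.0
n = 200_000
I = rng.exponential(S_true, n) + rng.normal(B_true, sig_true, n)

# Moment identities for exponential signal + Gaussian background:
#   mean               = S + B
#   variance           = S^2 + sigma^2
#   3rd central moment = 2 S^3   (the Gaussian part is symmetric)
m1 = I.mean()
m2 = I.var()
m3 = ((I - m1) ** 3).mean()

S_hat = (m3 / 2.0) ** (1.0 / 3.0)
B_hat = m1 - S_hat
sig_hat = np.sqrt(m2 - S_hat ** 2)
print(round(S_hat, 1), round(B_hat, 1), round(sig_hat, 1))
```

The skewness of the measured intensities is carried entirely by the signal term here, which is why the third moment isolates S and the lower moments then recover the background level and noise.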

  17. Evaluation of Allele-Specific Somatic Changes of Genome-Wide Association Study Susceptibility Alleles in Human Colorectal Cancers

    PubMed Central

    Gerber, Madelyn M.; Hampel, Heather; Schulz, Nathan P.; Fernandez, Soledad; Wei, Lai; Zhou, Xiao-Ping; de la Chapelle, Albert; Toland, Amanda Ewart

    2012-01-01

    Background Tumors frequently exhibit loss of tumor suppressor genes or allelic gains of activated oncogenes. A significant proportion of cancer susceptibility loci in the mouse show somatic losses or gains consistent with the presence of a tumor susceptibility or resistance allele. Thus, allele-specific somatic gains or losses at loci may demarcate the presence of resistance or susceptibility alleles. The goal of this study was to determine if previously mapped susceptibility loci for colorectal cancer show evidence of allele-specific somatic events in colon tumors. Methods We performed quantitative genotyping of 16 single nucleotide polymorphisms (SNPs) showing statistically significant association with colorectal cancer in published genome-wide association studies (GWAS). We genotyped 194 paired normal and colorectal tumor DNA samples and 296 paired validation samples to investigate these SNPs for allele-specific somatic gains and losses. We combined analysis of our data with published data for seven of these SNPs. Results No statistically significant evidence for allele-specific somatic selection was observed for the tested polymorphisms in the discovery set. The rs6983267 variant, which has shown preferential loss of the non-risk T allele and relative gain of the risk G allele in previous studies, favored relative gain of the G allele in the combined discovery and validation samples (corrected p-value = 0.03). When we combined our data with published allele-specific imbalance data for this SNP, the G allele of rs6983267 showed statistically significant evidence of relative retention (p-value = 2.06×10⁻⁴). Conclusions Our results suggest that the majority of variants identified as colon cancer susceptibility alleles through GWAS do not exhibit somatic allele-specific imbalance in colon tumors. Our data confirm previously published results showing allele-specific imbalance for rs6983267.
These results indicate that allele-specific imbalance of cancer susceptibility alleles may not be a common phenomenon in colon cancer. PMID:22629442

  18. EzArray: A web-based highly automated Affymetrix expression array data management and analysis system

    PubMed Central

    Zhu, Yuerong; Zhu, Yuelin; Xu, Wei

    2008-01-01

    Background Though microarray experiments are very popular in life science research, managing and analyzing microarray data are still challenging tasks for many biologists. Most microarray programs require users to have sophisticated knowledge of mathematics, statistics and computer skills for usage. With accumulating microarray data deposited in public databases, easy-to-use programs to re-analyze previously published microarray data are in high demand. Results EzArray is a web-based Affymetrix expression array data management and analysis system for researchers who need to organize microarray data efficiently and get data analyzed instantly. EzArray organizes microarray data into projects that can be analyzed online with predefined or custom procedures. EzArray performs data preprocessing and detection of differentially expressed genes with statistical methods. All analysis procedures are optimized and highly automated so that even novice users with limited pre-knowledge of microarray data analysis can complete initial analysis quickly. Since all input files, analysis parameters, and executed scripts can be downloaded, EzArray provides maximum reproducibility for each analysis. In addition, EzArray integrates with Gene Expression Omnibus (GEO) and allows instantaneous re-analysis of published array data. Conclusion EzArray is a novel Affymetrix expression array data analysis and sharing system. EzArray provides easy-to-use tools for re-analyzing published microarray data and will help both novice and experienced users perform initial analysis of their microarray data from the location of data storage. We believe EzArray will be a useful system for facilities with microarray services and laboratories with multiple members involved in microarray data analysis. EzArray is freely available from . PMID:18218103

  19. Assessing Statistical Competencies in Clinical and Translational Science Education: One Size Does Not Fit All

    PubMed Central

    Lindsell, Christopher J.; Welty, Leah J.; Mazumdar, Madhu; Thurston, Sally W.; Rahbar, Mohammad H.; Carter, Rickey E.; Pollock, Bradley H.; Cucchiara, Andrew J.; Kopras, Elizabeth J.; Jovanovic, Borko D.; Enders, Felicity T.

    2014-01-01

    Abstract Introduction Statistics is an essential training component for a career in clinical and translational science (CTS). Given the increasing complexity of statistics, learners may have difficulty selecting appropriate courses. Our question was: what depth of statistical knowledge do different CTS learners require? Methods For three types of CTS learners (principal investigator, co‐investigator, informed reader of the literature), each with different backgrounds in research (no previous research experience, reader of the research literature, previous research experience), 18 experts in biostatistics, epidemiology, and research design proposed levels for 21 statistical competencies. Results Statistical competencies were categorized as fundamental, intermediate, or specialized. CTS learners who intend to become independent principal investigators require more specialized training, while those intending to become informed consumers of the medical literature require more fundamental education. For most competencies, less training was proposed for those with more research background. Discussion When selecting statistical coursework, the learner's research background and career goal should guide the decision. Some statistical competencies are considered to be more important than others. Baseline knowledge assessments may help learners identify appropriate coursework. Conclusion Rather than one size fits all, tailoring education to baseline knowledge, learner background, and future goals increases learning potential while minimizing classroom time. PMID:25212569

  20. Statistical Analysis of Research Data | Center for Cancer Research

    Cancer.gov

    Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data. The Statistical Analysis of Research Data (SARD) course will be held on April 5-6, 2018, from 9 a.m. to 5 p.m. at the National Institutes of Health's Natcher Conference Center, Balcony C, on the Bethesda campus. SARD is designed to provide an overview of the general principles of statistical analysis of research data. The first day will feature univariate data analysis, including descriptive statistics, probability distributions, and one- and two-sample inferential statistics.
