Sample records for controls statistical analysis

  1. Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.

    PubMed

    Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias

    2016-01-01

    To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that measure data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using an example study on an evaluation of a computerized decision support system.

  2. Application of multivariable statistical techniques in plant-wide WWTP control strategies analysis.

    PubMed

    Flores, X; Comas, J; Roda, I R; Jiménez, L; Gernaey, K V

    2007-01-01

    The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant analysis (DA) are applied to the evaluation matrix data set obtained by simulation of several control strategies applied to the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques allow one to i) determine natural groups or clusters of control strategies with similar behaviour, ii) find and interpret hidden, complex and causal relationships in the data set, and iii) identify important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for the analysis and interpretation of complex multicriteria data sets and enables an improved use of information for effective evaluation of control strategies.
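
    A minimal sketch (not taken from the paper) of how such a grouping step can be reproduced: PCA to compress a strategy-by-criteria evaluation matrix, followed by k-means clustering of the component scores. The evaluation matrix and the number of clusters below are invented for illustration.

    ```python
    # Hypothetical example: cluster control strategies by their evaluation criteria.
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    X = rng.normal(size=(12, 5))        # 12 strategies x 5 performance criteria (made up)

    Xs = StandardScaler().fit_transform(X)                 # put criteria on comparable scales
    scores = PCA(n_components=2).fit_transform(Xs)         # project onto two principal components
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)

    for strategy, cluster in enumerate(labels):
        print(f"strategy {strategy}: cluster {cluster}")
    ```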

  3. Risk Factors for Sexual Violence in the Military: An Analysis of Sexual Assault and Sexual Harassment Incidents and Reporting

    DTIC Science & Technology

    2017-03-01

    LIST OF TABLES: Table 1. Descriptive Statistics for Control Variables by Gender (Random Subsample with Complete Survey)...empirical analysis. Chapter IV describes the summary statistics and results. Finally, Chapter V offers concluding thoughts, study limitations, and

  4. A Study on Predictive Analytics Application to Ship Machinery Maintenance

    DTIC Science & Technology

    2013-09-01

    Looking at the nature of the time series forecasting method, it would be better applied to offline analysis. The application for real-time online...other system attributes in future. Two techniques of statistical analysis, mainly time series models and cumulative sum control charts, are discussed in...statistical tool employed for the two techniques of statistical analysis. Both time series forecasting as well as CUSUM control charts are shown to be

  5. Statistical process control methods allow the analysis and improvement of anesthesia care.

    PubMed

    Fasting, Sigurd; Gisvold, Sven E

    2003-10-01

    Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels, according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%; mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
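
    As a rough illustration of the p-chart analysis described above, the sketch below computes a centre line and 3-sigma control limits for monthly adverse-event proportions; the counts are invented and the subgrouping is simplified.

    ```python
    import numpy as np

    events = np.array([110, 95, 130, 102, 125, 140, 118, 99])    # adverse events per month (hypothetical)
    cases = np.array([600, 580, 640, 610, 630, 650, 620, 590])   # anesthetics per month (hypothetical)

    p = events / cases
    p_bar = events.sum() / cases.sum()              # centre line: overall proportion
    sigma = np.sqrt(p_bar * (1 - p_bar) / cases)    # binomial standard error per subgroup
    ucl = p_bar + 3 * sigma
    lcl = np.clip(p_bar - 3 * sigma, 0, None)       # lower limit cannot go below zero

    for i, (pi, u, l) in enumerate(zip(p, ucl, lcl), start=1):
        flag = "out of control" if (pi > u or pi < l) else "in control"
        print(f"month {i}: p={pi:.3f}  limits=({l:.3f}, {u:.3f})  {flag}")
    ```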

  6. Applying Statistical Process Control to Clinical Data: An Illustration.

    ERIC Educational Resources Information Center

    Pfadt, Al; And Others

    1992-01-01

    Principles of statistical process control are applied to a clinical setting through the use of control charts to detect changes, as part of treatment planning and clinical decision-making processes. The logic of control chart analysis is derived from principles of statistical inference. Sample charts offer examples of evaluating baselines and…

  7. The Use of Statistical Process Control-Charts for Person-Fit Analysis on Computerized Adaptive Testing. LSAC Research Report Series.

    ERIC Educational Resources Information Center

    Meijer, Rob R.; van Krimpen-Stoop, Edith M. L. A.

    In this study a cumulative-sum (CUSUM) procedure from the theory of Statistical Process Control was modified and applied in the context of person-fit analysis in a computerized adaptive testing (CAT) environment. Six person-fit statistics were proposed using the CUSUM procedure, and three of them could be used to investigate the CAT in online test…
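
    The sketch below is an illustrative two-sided tabular CUSUM of the general kind referred to above, applied to standardized residuals (observed minus model-expected item scores); it is not the authors' exact statistics, and the reference value and decision threshold are assumed.

    ```python
    import numpy as np

    def cusum(residuals, k=0.5):
        """Return upper and lower CUSUM paths for a sequence of residuals."""
        c_plus, c_minus = 0.0, 0.0
        upper, lower = [], []
        for r in residuals:
            c_plus = max(0.0, c_plus + r - k)    # accumulates unexpectedly high scores
            c_minus = min(0.0, c_minus + r + k)  # accumulates unexpectedly low scores
            upper.append(c_plus)
            lower.append(c_minus)
        return np.array(upper), np.array(lower)

    rng = np.random.default_rng(1)
    residuals = rng.normal(0, 1, size=30)   # hypothetical standardized item residuals
    up, lo = cusum(residuals)
    h = 4.0                                 # decision threshold (assumed)
    print("flagged items:", np.where((up > h) | (lo < -h))[0])
    ```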

  8. Analysis and interpretation of cost data in randomised controlled trials: review of published studies

    PubMed Central

    Barber, Julie A; Thompson, Simon G

    1998-01-01

    Objective To review critically the statistical methods used for health economic evaluations in randomised controlled trials where an estimate of cost is available for each patient in the study. Design Survey of published randomised trials including an economic evaluation with cost values suitable for statistical analysis; 45 such trials published in 1995 were identified from Medline. Main outcome measures The use of statistical methods for cost data was assessed in terms of the descriptive statistics reported, use of statistical inference, and whether the reported conclusions were justified. Results Although all 45 trials reviewed apparently had cost data for each patient, only 9 (20%) reported adequate measures of variability for these data and only 25 (56%) gave results of statistical tests or a measure of precision for the comparison of costs between the randomised groups. Only 16 (36%) of the articles gave conclusions which were justified on the basis of results presented in the paper. No paper reported sample size calculations for costs. Conclusions The analysis and interpretation of cost data from published trials reveal a lack of statistical awareness. Strong and potentially misleading conclusions about the relative costs of alternative therapies have often been reported in the absence of supporting statistical evidence. Improvements in the analysis and reporting of health economic assessments are urgently required. Health economic guidelines need to be revised to incorporate more detailed statistical advice. Key messages: Health economic evaluations required for important healthcare policy decisions are often carried out in randomised controlled trials. A review of such published economic evaluations assessed whether statistical methods for cost outcomes have been appropriately used and interpreted. Few publications presented adequate descriptive information for costs or performed appropriate statistical analyses. In at least two thirds of the papers, the main conclusions regarding costs were not justified. The analysis and reporting of health economic assessments within randomised controlled trials urgently need improving. PMID:9794854

  9. Applied Behavior Analysis and Statistical Process Control?

    ERIC Educational Resources Information Center

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…

  10. Design and analysis of multiple diseases genome-wide association studies without controls.

    PubMed

    Chen, Zhongxue; Huang, Hanwen; Ng, Hon Keung Tony

    2012-11-15

    In genome-wide association studies (GWAS), studying multiple diseases with shared controls is one of the case-control study designs. If data obtained from these studies are appropriately analyzed, this design can have several advantages, such as improving statistical power in detecting associations and reducing the time and cost of the data collection process. In this paper, we propose a study design for GWAS which involves multiple diseases but no controls. We also propose a corresponding statistical data analysis strategy for GWAS with multiple diseases but no controls. Through a simulation study, we show that the statistical association test with the proposed study design is more powerful than the test with a single disease sharing common controls, and it has comparable power to the overall test based on the whole dataset including the controls. We also apply the proposed method to a real GWAS dataset to illustrate the methodologies and the advantages of the proposed design. Some possible limitations of this study design and testing method, and their solutions, are also discussed. Our findings indicate that the proposed study design and statistical analysis strategy could be more efficient than the usual case-control GWAS as well as those with shared controls. Copyright © 2012 Elsevier B.V. All rights reserved.

  11. Improved Statistics for Genome-Wide Interaction Analysis

    PubMed Central

    Ueki, Masao; Cordell, Heather J.

    2012-01-01

    Recently, Wu and colleagues [1] proposed two novel statistics for genome-wide interaction analysis using case/control or case-only data. In computer simulations, their proposed case/control statistic outperformed competing approaches, including the fast-epistasis option in PLINK and logistic regression analysis under the correct model; however, reasons for its superior performance were not fully explored. Here we investigate the theoretical properties and performance of Wu et al.'s proposed statistics and explain why, in some circumstances, they outperform competing approaches. Unfortunately, we find minor errors in the formulae for their statistics, resulting in tests that have higher than nominal type 1 error. We also find minor errors in PLINK's fast-epistasis and case-only statistics, although theory and simulations suggest that these errors have only negligible effect on type 1 error. We propose adjusted versions of all four statistics that, both theoretically and in computer simulations, maintain correct type 1 error rates under the null hypothesis. We also investigate statistics based on correlation coefficients that maintain similar control of type 1 error. Although designed to test specifically for interaction, we show that some of these previously-proposed statistics can, in fact, be sensitive to main effects at one or both loci, particularly in the presence of linkage disequilibrium. We propose two new “joint effects” statistics that, provided the disease is rare, are sensitive only to genuine interaction effects. In computer simulations we find, in most situations considered, that highest power is achieved by analysis under the correct genetic model. Such an analysis is unachievable in practice, as we do not know this model. However, generally high power over a wide range of scenarios is exhibited by our joint effects and adjusted Wu statistics. We recommend use of these alternative or adjusted statistics and urge caution when using Wu et al.'s originally-proposed statistics, on account of the inflated error rate that can result. PMID:22496670

  12. Root Cause Analysis of Quality Defects Using HPLC-MS Fingerprint Knowledgebase for Batch-to-batch Quality Control of Herbal Drugs.

    PubMed

    Yan, Binjun; Fang, Zhonghua; Shen, Lijuan; Qu, Haibin

    2015-01-01

    The batch-to-batch quality consistency of herbal drugs has always been an important issue. The aim was to propose a methodology for batch-to-batch quality control based on HPLC-MS fingerprints and a process knowledgebase. The extraction process of Compound E-jiao Oral Liquid was taken as a case study. After establishing the HPLC-MS fingerprint analysis method, the fingerprints of the extract solutions produced under normal and abnormal operation conditions were obtained. Multivariate statistical models were built for fault detection, and a discriminant analysis model was built using the probabilistic discriminant partial-least-squares method for fault diagnosis. Based on multivariate statistical analysis, process knowledge was acquired and the cause-effect relationship between process deviations and quality defects was revealed. The quality defects were detected successfully by multivariate statistical control charts, and the types of process deviations were diagnosed correctly by discriminant analysis. This work has demonstrated the benefits of combining HPLC-MS fingerprints, process knowledge and multivariate analysis for the quality control of herbal drugs. Copyright © 2015 John Wiley & Sons, Ltd.

  13. Controlling the joint local false discovery rate is more powerful than meta-analysis methods in joint analysis of summary statistics from multiple genome-wide association studies.

    PubMed

    Jiang, Wei; Yu, Weichuan

    2017-02-15

    In genome-wide association studies (GWASs) of common diseases/traits, we often analyze multiple GWASs with the same phenotype together to discover associated genetic variants with higher power. Since it is difficult to access data with detailed individual measurements, summary-statistics-based meta-analysis methods have become popular to jointly analyze datasets from multiple GWASs. In this paper, we propose a novel summary-statistics-based joint analysis method based on controlling the joint local false discovery rate (Jlfdr). We prove that our method is the most powerful summary-statistics-based joint analysis method when controlling the false discovery rate at a certain level. In particular, the Jlfdr-based method achieves higher power than commonly used meta-analysis methods when analyzing heterogeneous datasets from multiple GWASs. Simulation experiments demonstrate the superior power of our method over meta-analysis methods. Also, our method discovers more associations than meta-analysis methods from empirical datasets of four phenotypes. The R-package is available at: http://bioinformatics.ust.hk/Jlfdr.html. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  14. Efficiency Analysis: Enhancing the Statistical and Evaluative Power of the Regression-Discontinuity Design.

    ERIC Educational Resources Information Center

    Madhere, Serge

    An analytic procedure, efficiency analysis, is proposed for improving the utility of quantitative program evaluation for decision making. The three features of the procedure are explained: (1) for statistical control, it adopts and extends the regression-discontinuity design; (2) for statistical inferences, it de-emphasizes hypothesis testing in…

  15. The estimation of the measurement results with using statistical methods

    NASA Astrophysics Data System (ADS)

    Velychko, O.; Gordiyenko, T.

    2015-02-01

    A range of international standards and guides describe statistical methods that can be applied to the management, control and improvement of processes for the analysis of technical measurement results. An analysis of these international standards and guides on statistical methods for the estimation of measurement results, and of their recommendations for application in laboratories, is described. To support this analysis, cause-and-effect (Ishikawa) diagrams concerning the application of statistical methods to the estimation of measurement results are constructed.

  16. Statistical power in parallel group point exposure studies with time-to-event outcomes: an empirical comparison of the performance of randomized controlled trials and the inverse probability of treatment weighting (IPTW) approach.

    PubMed

    Austin, Peter C; Schuster, Tibor; Platt, Robert W

    2015-10-15

    Estimating statistical power is an important component of the design of both randomized controlled trials (RCTs) and observational studies. Methods for estimating statistical power in RCTs have been well described and can be implemented simply. In observational studies, statistical methods must be used to remove the effects of confounding that can occur due to non-random treatment assignment. Inverse probability of treatment weighting (IPTW) using the propensity score is an attractive method for estimating the effects of treatment using observational data. However, sample size and power calculations have not been adequately described for these methods. We used an extensive series of Monte Carlo simulations to compare the statistical power of an IPTW analysis of an observational study with time-to-event outcomes with that of an analysis of a similarly-structured RCT. We examined the impact of four factors on the statistical power function: number of observed events, prevalence of treatment, the marginal hazard ratio, and the strength of the treatment-selection process. We found that, on average, an IPTW analysis had lower statistical power compared to an analysis of a similarly-structured RCT. The difference in statistical power increased as the magnitude of the treatment-selection model increased. The statistical power of an IPTW analysis tended to be lower than the statistical power of a similarly-structured RCT.
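
    A minimal sketch of the weighting step involved in such an IPTW analysis, under assumed variable names and simulated data: estimate propensity scores with logistic regression, then form stabilized inverse-probability weights. The weighted time-to-event (e.g. Cox) model that would follow is omitted.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(2)
    X = rng.normal(size=(500, 4))                         # baseline confounders (simulated)
    treat = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))   # treatment assignment depends on X[:, 0]

    ps = LogisticRegression().fit(X, treat).predict_proba(X)[:, 1]   # propensity scores
    p_treat = treat.mean()
    weights = np.where(treat == 1, p_treat / ps, (1 - p_treat) / (1 - ps))  # stabilized IPTW weights

    print("weight summary:", weights.min().round(2), weights.mean().round(2), weights.max().round(2))
    ```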

  17. Some issues in the statistical analysis of vehicle emissions

    DOT National Transportation Integrated Search

    2000-09-01

    Some of the issues complicating the statistical analysis of vehicle emissions and the effectiveness of emissions control programs are presented in this article. Issues discussed include: the variability of inter- and intra-vehicle emissions; the skew...

  18. Statistical tools for transgene copy number estimation based on real-time PCR.

    PubMed

    Yuan, Joshua S; Burris, Jason; Stewart, Nathan R; Mentewab, Ayalew; Stewart, C Neal

    2007-11-01

    As compared with traditional transgene copy number detection technologies such as Southern blot analysis, real-time PCR provides a fast, inexpensive and high-throughput alternative. However, real-time PCR based transgene copy number estimation tends to be ambiguous and subjective, stemming from the lack of proper statistical analysis and data quality control needed to render a reliable estimation of copy number with a prediction value. Despite recent progress in the statistical analysis of real-time PCR, few publications have integrated these advancements into real-time PCR based transgene copy number determination. Three experimental designs and four data quality control integrated statistical models are presented. For the first method, external calibration curves are established for the transgene based on serially-diluted templates. The Ct numbers from a control transgenic event and a putative transgenic event are compared to derive the transgene copy number or zygosity estimation. Simple linear regression and two-group t-test procedures were combined to model the data from this design. For the second experimental design, standard curves were generated for both an internal reference gene and the transgene, and the copy number of the transgene was compared with that of the internal reference gene. Multiple regression models and ANOVA models can be employed to analyze the data and perform quality control for this approach. In the third experimental design, transgene copy number is compared with the reference gene without a standard curve, but rather is based directly on fluorescence data. Two different multiple regression models were proposed to analyze the data based on two different approaches to amplification efficiency integration. Our results highlight the importance of proper statistical treatment and quality control integration in real-time PCR-based transgene copy number determination. These statistical methods allow real-time PCR-based transgene copy number estimation to be more reliable and precise. Proper confidence intervals are necessary for unambiguous prediction of transgene copy number. The four different statistical methods are compared for their advantages and disadvantages. Moreover, the statistical methods can also be applied to other real-time PCR-based quantification assays, including transfection efficiency analysis and pathogen quantification.
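
    A simplified sketch of the first design described above (external calibration curve fitted by simple linear regression, then comparison of a candidate event against a single-copy control event); all Ct values and dilution amounts are invented.

    ```python
    import numpy as np
    from scipy import stats

    log_template = np.log10([1, 10, 100, 1000, 10000])      # serially diluted template amounts
    ct_curve = np.array([33.1, 29.8, 26.4, 23.0, 19.7])     # measured Ct values (hypothetical)

    slope, intercept, r, p, se = stats.linregress(log_template, ct_curve)
    print(f"standard curve: Ct = {slope:.2f}*log10(amount) + {intercept:.2f}, R^2 = {r**2:.3f}")

    def estimated_amount(ct):
        """Invert the standard curve to estimate template amount from a Ct value."""
        return 10 ** ((ct - intercept) / slope)

    ct_control, ct_candidate = 26.5, 25.4    # single-copy control event vs. putative event
    ratio = estimated_amount(ct_candidate) / estimated_amount(ct_control)
    print(f"estimated copy number relative to the single-copy control: {ratio:.1f}")
    ```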

  19. NAUSEA and the Principle of Supplementarity of Damping and Isolation in Noise Control.

    DTIC Science & Technology

    1980-02-01

    New approaches and uses of the statistical energy analysis (NAUSEA) have been considered and developed in recent months. The advances were made...possible in that the requirement, in the older statistical energy analysis, that the dynamic systems be highly reverberant and the couplings between the...analytical consideration in terms of the statistical energy analysis (SEA). A brief discussion and simple examples that relate to these recent advances

  20. Statistical quality control through overall vibration analysis

    NASA Astrophysics Data System (ADS)

    Carnero, M. Carmen; González-Palma, Rafael; Almorza, David; Mayorga, Pedro; López-Escobar, Carlos

    2010-05-01

    The present study introduces the concept of statistical quality control in automotive wheel bearing manufacturing processes. Defects on the products under analysis can have a direct influence on passengers' safety and comfort. At present, the use of vibration analysis on machine tools for quality control purposes is not very extensive in manufacturing facilities. Noise and vibration are common quality problems in bearings. These failure modes likely occur under certain operating conditions and do not require high vibration amplitudes but relate to certain vibration frequencies. The vibration frequencies are affected by the type of surface problems (chattering) of ball races that are generated through grinding processes. The purpose of this paper is to identify grinding process variables that affect the quality of bearings by using statistical principles in the field of machine tools. In addition, the quality results of the finished parts under different combinations of process variables are evaluated. This paper intends to establish the foundations to predict the quality of the products through the analysis of self-induced vibrations during the contact between the grinding wheel and the parts. To achieve this goal, the overall self-induced vibration readings under different combinations of process variables are analysed using statistical tools. The analysis of data and design of experiments follows a classical approach, considering all potential interactions between variables. The analysis of data is conducted through analysis of variance (ANOVA) for data sets that meet normality and homoscedasticity criteria. This paper utilizes different statistical tools to support the conclusions, such as chi-squared, Shapiro-Wilk, symmetry, kurtosis, Cochran, Bartlett, Hartley and Kruskal-Wallis tests. The analysis presented is the starting point to extend the use of predictive techniques (vibration analysis) for quality control. This paper demonstrates the existence of predictive variables (high-frequency vibration displacements) that are sensitive to the process setup and the quality of the products obtained. Based on the results of this overall vibration analysis, a second paper will analyse self-induced vibration spectra in order to define limit vibration bands, controllable every cycle or connected to permanent vibration-monitoring systems able to adjust sensitive process variables identified by ANOVA once the vibration readings exceed established quality limits.
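
    A toy version of the ANOVA step described above, using invented overall-vibration readings from three hypothetical grinding setups: assumption checks (normality, homogeneity of variances) followed by a one-way ANOVA.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    setup_a = rng.normal(1.00, 0.08, 20)   # overall vibration displacement, arbitrary units (simulated)
    setup_b = rng.normal(1.05, 0.08, 20)
    setup_c = rng.normal(1.30, 0.08, 20)

    # Assumption checks before the ANOVA.
    print("Shapiro-Wilk p-values:", [round(stats.shapiro(g).pvalue, 3) for g in (setup_a, setup_b, setup_c)])
    print("Bartlett p-value:", round(stats.bartlett(setup_a, setup_b, setup_c).pvalue, 3))

    # Does mean vibration differ between setups?
    f_stat, p_value = stats.f_oneway(setup_a, setup_b, setup_c)
    print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")
    ```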

  1. Systems and methods for detection of blowout precursors in combustors

    DOEpatents

    Lieuwen, Tim C.; Nair, Suraj

    2006-08-15

    The present invention comprises systems and methods for detecting flame blowout precursors in combustors. The blowout precursor detection system comprises a combustor, a pressure measuring device, and a blowout precursor detection unit. A combustion controller may also be used to control combustor parameters. The methods of the present invention comprise receiving pressure data measured by an acoustic pressure measuring device, performing one or a combination of spectral analysis, statistical analysis, and wavelet analysis on the received pressure data, and determining the existence of a blowout precursor based on such analyses. The spectral analysis, statistical analysis, and wavelet analysis further comprise their respective sub-methods to determine the existence of blowout precursors.

  2. Policy Safeguards and the Legitimacy of Highway Interdiction

    DTIC Science & Technology

    2016-12-01

    B. BIAS WITHIN LAW ENFORCEMENT; C. STATISTICAL DATA GATHERING; 3. Controlling Discretion; 4. Statistical Data Collection for Traffic Stops; A. DESCRIPTION OF STATISTICAL DATA COLLECTED; B. DATA ORGANIZATION AND ANALYSIS

  3. The effect of berberine on insulin resistance in women with polycystic ovary syndrome: detailed statistical analysis plan (SAP) for a multicenter randomized controlled trial.

    PubMed

    Zhang, Ying; Sun, Jin; Zhang, Yun-Jiao; Chai, Qian-Yun; Zhang, Kang; Ma, Hong-Li; Wu, Xiao-Ke; Liu, Jian-Ping

    2016-10-21

    Although Traditional Chinese Medicine (TCM) has been widely used in clinical settings, a major challenge that remains in TCM is to evaluate its efficacy scientifically. This randomized controlled trial aims to evaluate the efficacy and safety of berberine in the treatment of patients with polycystic ovary syndrome. In order to improve the transparency and research quality of this clinical trial, we prepared this statistical analysis plan (SAP). The trial design, primary and secondary outcomes, and safety outcomes were declared to reduce selection biases in data analysis and result reporting. We specified detailed methods for data management and statistical analyses. Statistics in corresponding tables, listings, and graphs were outlined. The SAP provided more detailed information than trial protocol on data management and statistical analysis methods. Any post hoc analyses could be identified via referring to this SAP, and the possible selection bias and performance bias will be reduced in the trial. This study is registered at ClinicalTrials.gov, NCT01138930 , registered on 7 June 2010.

  4. An Automated Statistical Process Control Study of Inline Mixing Using Spectrophotometric Detection

    ERIC Educational Resources Information Center

    Dickey, Michael D.; Stewart, Michael D.; Willson, C. Grant

    2006-01-01

    An experiment is described, which is designed for a junior-level chemical engineering "fundamentals of measurements and data analysis" course, where students are introduced to the concept of statistical process control (SPC) through a simple inline mixing experiment. The students learn how to create and analyze control charts in an effort to…

  5. Robust Fixed-Structure Control

    DTIC Science & Technology

    1994-10-30

    Deterministic Foundation for Statistical Energy Analysis," J. Sound Vibr., to appear. 1.96 D. S. Bernstein and S. P. Bhat, "Lyapunov Stability, Semistability...S. Bernstein, "Power Flow, Energy Balance, and Statistical Energy Analysis for Large Scale, Interconnected Systems," Proc. Amer. Contr. Conf., pp

  6. Learning Model and Form of Assesment toward the Inferensial Statistical Achievement by Controlling Numeric Thinking Skills

    ERIC Educational Resources Information Center

    Widiana, I. Wayan; Jampel, I. Nyoman

    2016-01-01

    This study aimed to determine the effect of learning model and form of assessment on inferential statistics achievement after controlling for numeric thinking skills. The study was a quasi-experimental study with a sample of 130 students. The data were analyzed using ANCOVA. After controlling for numeric thinking skills, the results of this study show that:…

  7. The Warning System in Disaster Situations: A Selective Analysis.

    DTIC Science & Technology

    (*DISASTERS, *WARNING SYSTEMS), CIVIL DEFENSE, SOCIAL PSYCHOLOGY, REACTION(PSYCHOLOGY), FACTOR ANALYSIS, CLASSIFICATION, STATISTICAL DATA, TIME, MANAGEMENT PLANNING AND CONTROL, DAMAGE, CONTROL SYSTEMS, THREAT EVALUATION, DECISION MAKING, DATA PROCESSING, COMMUNICATION SYSTEMS, NUCLEAR EXPLOSIONS

  8. Active Nonlinear Feedback Control for Aerospace Systems. Processor

    DTIC Science & Technology

    1990-12-01

    relating to the role of nonlinearities in feedback control. These areas include Lyapunov function theory, chaotic controllers, statistical energy analysis, phase robustness, and optimal nonlinear control theory.

  9. Quality control analysis : part V : review of data generated by statistical specifications on asphaltic concrete : final report.

    DOT National Transportation Integrated Search

    1975-12-01

    The primary objective of this study was to review the data generated by projects governed by a statistically oriented system of specifications for the control and acceptance of asphaltic concrete, and to recommend any revisions that may be deemed necess...

  10. Analyzing a Mature Software Inspection Process Using Statistical Process Control (SPC)

    NASA Technical Reports Server (NTRS)

    Barnard, Julie; Carleton, Anita; Stamper, Darrell E. (Technical Monitor)

    1999-01-01

    This paper presents a cooperative effort in which the Software Engineering Institute and the Space Shuttle Onboard Software Project could experiment with applying Statistical Process Control (SPC) analysis to inspection activities. The topics include: 1) SPC Collaboration Overview; 2) SPC Collaboration Approach and Results; and 3) Lessons Learned.

  11. Proceedings of the first ERDA statistical symposium, Los Alamos, NM, November 3-5, 1975. [Sixteen papers]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nicholson, W L; Harris, J L

    1976-03-01

    The First ERDA Statistical Symposium was organized to provide a means for communication among ERDA statisticians, and the sixteen papers presented at the meeting are given. Topics include techniques of numerical analysis used for accelerators, nuclear reactors, skewness and kurtosis statistics, radiochemical spectral analysis, quality control, and other statistics problems. Nine of the papers were previously announced in Nuclear Science Abstracts (NSA), while the remaining seven were abstracted for ERDA Energy Research Abstracts (ERA) and INIS Atomindex. (PMA)

  12. Statistical analysis plan for evaluating low- vs. standard-dose alteplase in the ENhanced Control of Hypertension and Thrombolysis strokE stuDy (ENCHANTED).

    PubMed

    Anderson, Craig S; Woodward, Mark; Arima, Hisatomi; Chen, Xiaoying; Lindley, Richard I; Wang, Xia; Chalmers, John

    2015-12-01

    The ENhanced Control of Hypertension And Thrombolysis strokE stuDy trial is a 2 × 2 quasi-factorial active-comparison, prospective, randomized, open, blinded endpoint clinical trial that is evaluating in thrombolysis-eligible acute ischemic stroke patients whether: (1) low-dose (0·6 mg/kg body weight) intravenous alteplase has noninferior efficacy and lower risk of symptomatic intracerebral hemorrhage compared with standard-dose (0·9 mg/kg body weight) intravenous alteplase; and (2) early intensive blood pressure lowering (systolic target 130-140 mmHg) has superior efficacy and lower risk of any intracerebral hemorrhage compared with guideline-recommended blood pressure control (systolic target <180 mmHg). To outline in detail the predetermined statistical analysis plan for the 'alteplase dose arm' of the study. All data collected by participating researchers will be reviewed and formally assessed. Information pertaining to the baseline characteristics of patients, their process of care, and the delivery of treatments will be classified, and for each item, appropriate descriptive statistical analyses are planned with appropriate comparisons made between randomized groups. For the trial outcomes, the most appropriate statistical comparisons to be made between groups are planned and described. A statistical analysis plan was developed for the results of the alteplase dose arm of the study that is transparent, available to the public, verifiable, and predetermined before completion of data collection. We have developed a predetermined statistical analysis plan for the ENhanced Control of Hypertension And Thrombolysis strokE stuDy alteplase dose arm which is to be followed to avoid analysis bias arising from prior knowledge of the study findings. © 2015 The Authors. International Journal of Stroke published by John Wiley & Sons Ltd on behalf of World Stroke Organization.

  13. Analysis of Exhaled Breath Volatile Organic Compounds in Inflammatory Bowel Disease: A Pilot Study.

    PubMed

    Hicks, Lucy C; Huang, Juzheng; Kumar, Sacheen; Powles, Sam T; Orchard, Timothy R; Hanna, George B; Williams, Horace R T

    2015-09-01

    Distinguishing between the inflammatory bowel diseases [IBD], Crohn's disease [CD] and ulcerative colitis [UC], is important for determining management and prognosis. Selected ion flow tube mass spectrometry [SIFT-MS] may be used to analyse volatile organic compounds [VOCs] in exhaled breath: these may be altered in disease states, and distinguishing breath VOC profiles can be identified. The aim of this pilot study was to identify, quantify, and analyse VOCs present in the breath of IBD patients and controls, potentially providing insights into disease pathogenesis and complementing current diagnostic algorithms. SIFT-MS breath profiling of 56 individuals [20 UC, 18 CD, and 18 healthy controls] was undertaken. Multivariate analysis included principal components analysis and partial least squares discriminant analysis with orthogonal signal correction [OSC-PLS-DA]. Receiver operating characteristic [ROC] analysis was performed for each comparative analysis using statistically significant VOCs. OSC-PLS-DA modelling was able to distinguish both CD and UC from healthy controls and from one other with good sensitivity and specificity. ROC analysis using combinations of statistically significant VOCs [dimethyl sulphide, hydrogen sulphide, hydrogen cyanide, ammonia, butanal, and nonanal] gave integrated areas under the curve of 0.86 [CD vs healthy controls], 0.74 [UC vs healthy controls], and 0.83 [CD vs UC]. Exhaled breath VOC profiling was able to distinguish IBD patients from controls, as well as to separate UC from CD, using both multivariate and univariate statistical techniques. Copyright © 2015 European Crohn’s and Colitis Organisation (ECCO). Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  14. Active Structural Acoustic Control as an Approach to Acoustic Optimization of Lightweight Structures

    DTIC Science & Technology

    2001-06-01

    appropriate approach based on Statistical Energy Analysis (SEA) would facilitate investigations of the structural behavior at a high modal density. On the way...higher frequency investigations an approach based on the Statistical Energy Analysis (SEA) is recommended to describe the structural dynamic behavior

  15. Implementation of statistical process control for proteomic experiments via LC MS/MS.

    PubMed

    Bereman, Michael S; Johnson, Richard; Bollinger, James; Boss, Yuval; Shulman, Nick; MacLean, Brendan; Hoofnagle, Andrew N; MacCoss, Michael J

    2014-04-01

    Statistical process control (SPC) is a robust set of tools that aids in the visualization, detection, and identification of assignable causes of variation in any process that creates products, services, or information. A tool has been developed termed Statistical Process Control in Proteomics (SProCoP) which implements aspects of SPC (e.g., control charts and Pareto analysis) into the Skyline proteomics software. It monitors five quality control metrics in a shotgun or targeted proteomic workflow. None of these metrics require peptide identification. The source code, written in the R statistical language, runs directly from the Skyline interface, which supports the use of raw data files from several of the mass spectrometry vendors. It provides real time evaluation of the chromatographic performance (e.g., retention time reproducibility, peak asymmetry, and resolution), and mass spectrometric performance (targeted peptide ion intensity and mass measurement accuracy for high resolving power instruments) via control charts. Thresholds are experiment- and instrument-specific and are determined empirically from user-defined quality control standards that enable the separation of random noise and systematic error. Finally, Pareto analysis provides a summary of performance metrics and guides the user to metrics with high variance. The utility of these charts to evaluate proteomic experiments is illustrated in two case studies.

  16. Statistical process control for residential treated wood

    Treesearch

    Patricia K. Lebow; Timothy M. Young; Stan Lebow

    2017-01-01

    This paper is the first stage of a study that attempts to improve the process of manufacturing treated lumber through the use of statistical process control (SPC). Analysis of industrial and auditing agency data sets revealed there are differences between the industry and agency probability density functions (pdf) for normalized retention data. Resampling of batches of...

  17. Sexual Abuse, Family Environment, and Psychological Symptoms: On the Validity of Statistical Control.

    ERIC Educational Resources Information Center

    Briere, John; Elliott, Diana M.

    1993-01-01

    Responds to article in which Nash et al. reported on effects of controlling for family environment when studying sexual abuse sequelae. Considers findings in terms of theoretical and statistical constraints placed on analysis of covariance and other partializing procedures. Questions use of covariate techniques to test hypotheses about causal role…

  18. Evaluation of statistical protocols for quality control of ecosystem carbon dioxide fluxes

    Treesearch

    Jorge F. Perez-Quezada; Nicanor Z. Saliendra; William E. Emmerich; Emilio A. Laca

    2007-01-01

    The process of quality control of micrometeorological and carbon dioxide (CO2) flux data can be subjective and may lack repeatability, which would undermine the results of many studies. Multivariate statistical methods and time series analysis were used together and independently to detect and replace outliers in CO2 flux...

  19. Statistical process control: separating signal from noise in emergency department operations.

    PubMed

    Pimentel, Laura; Barrueto, Fermin

    2015-05-01

    Statistical process control (SPC) is a visually appealing and statistically rigorous methodology very suitable to the analysis of emergency department (ED) operations. We demonstrate that the control chart is the primary tool of SPC; it is constructed by plotting data measuring the key quality indicators of operational processes in rationally ordered subgroups such as units of time. Control limits are calculated using formulas reflecting the variation in the data points from one another and from the mean. SPC allows managers to determine whether operational processes are controlled and predictable. We review why the moving range chart is most appropriate for use in the complex ED milieu, how to apply SPC to ED operations, and how to determine when performance improvement is needed. SPC is an excellent tool for operational analysis and quality improvement for these reasons: 1) control charts make large data sets intuitively coherent by integrating statistical and visual descriptions; 2) SPC provides analysis of process stability and capability rather than simple comparison with a benchmark; 3) SPC allows distinction between special cause variation (signal), indicating an unstable process requiring action, and common cause variation (noise), reflecting a stable process; and 4) SPC keeps the focus of quality improvement on process rather than individual performance. Because data have no meaning apart from their context, and every process generates information that can be used to improve it, we contend that SPC should be seriously considered for driving quality improvement in emergency medicine. Copyright © 2015 Elsevier Inc. All rights reserved.
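
    As a concrete illustration of the individuals/moving-range chart recommended above, the sketch below computes XmR control limits for a daily ED metric; the values are invented and the standard constants 2.66 and 3.267 are used.

    ```python
    import numpy as np

    # Hypothetical daily values of an ED metric (e.g. median door-to-provider time, minutes).
    x = np.array([42, 38, 45, 40, 44, 39, 41, 55, 43, 40, 42, 44], dtype=float)

    mr = np.abs(np.diff(x))             # moving ranges between consecutive days
    x_bar, mr_bar = x.mean(), mr.mean()

    x_ucl, x_lcl = x_bar + 2.66 * mr_bar, x_bar - 2.66 * mr_bar   # individuals chart limits
    mr_ucl = 3.267 * mr_bar                                       # moving-range chart upper limit

    signals = np.where((x > x_ucl) | (x < x_lcl))[0]
    print(f"individuals limits: ({x_lcl:.1f}, {x_ucl:.1f}); MR upper limit: {mr_ucl:.1f}")
    print("special-cause (signal) days:", signals)
    ```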

  20. Genome-wide association analysis of secondary imaging phenotypes from the Alzheimer's disease neuroimaging initiative study.

    PubMed

    Zhu, Wensheng; Yuan, Ying; Zhang, Jingwen; Zhou, Fan; Knickmeyer, Rebecca C; Zhu, Hongtu

    2017-02-01

    The aim of this paper is to systematically evaluate a biased sampling issue associated with genome-wide association analysis (GWAS) of imaging phenotypes for most imaging genetic studies, including the Alzheimer's Disease Neuroimaging Initiative (ADNI). Specifically, the original sampling scheme of these imaging genetic studies is primarily the retrospective case-control design, whereas most existing statistical analyses of these studies ignore such sampling scheme by directly correlating imaging phenotypes (called the secondary traits) with genotype. Although it has been well documented in genetic epidemiology that ignoring the case-control sampling scheme can produce highly biased estimates, and subsequently lead to misleading results and suspicious associations, such findings are not well documented in imaging genetics. We use extensive simulations and a large-scale imaging genetic data analysis of the Alzheimer's Disease Neuroimaging Initiative (ADNI) data to evaluate the effects of the case-control sampling scheme on GWAS results based on some standard statistical methods, such as linear regression methods, while comparing it with several advanced statistical methods that appropriately adjust for the case-control sampling scheme. Copyright © 2016 Elsevier Inc. All rights reserved.

  1. Statistics for nuclear engineers and scientists. Part 1. Basic statistical inference

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beggs, W.J.

    1981-02-01

    This report is intended for the use of engineers and scientists working in the nuclear industry, especially at the Bettis Atomic Power Laboratory. It serves as the basis for several Bettis in-house statistics courses. The objectives of the report are to introduce the reader to the language and concepts of statistics and to provide a basic set of techniques to apply to problems of the collection and analysis of data. Part 1 covers subjects of basic inference. The subjects include: descriptive statistics; probability; simple inference for normally distributed populations, and for non-normal populations as well; comparison of two populations; the analysis of variance; quality control procedures; and linear regression analysis.

  2. Statistical evaluation of vibration analysis techniques

    NASA Technical Reports Server (NTRS)

    Milner, G. Martin; Miller, Patrice S.

    1987-01-01

    An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.

  3. Coordinate based random effect size meta-analysis of neuroimaging studies.

    PubMed

    Tench, C R; Tanasescu, Radu; Constantinescu, C S; Auer, D P; Cottam, W J

    2017-06-01

    Low power in neuroimaging studies can make them difficult to interpret, and Coordinate based meta-analysis (CBMA) may go some way to mitigating this issue. CBMA has been used in many analyses to detect where published functional MRI or voxel-based morphometry studies testing similar hypotheses report significant summary results (coordinates) consistently. Only the reported coordinates and possibly t statistics are analysed, and statistical significance of clusters is determined by coordinate density. Here a method of performing coordinate based random effect size meta-analysis and meta-regression is introduced. The algorithm (ClusterZ) analyses both coordinates and reported t statistic or Z score, standardised by the number of subjects. Statistical significance is determined not by coordinate density, but by a random effects meta-analyses of reported effects performed cluster-wise using standard statistical methods and taking account of censoring inherent in the published summary results. Type 1 error control is achieved using the false cluster discovery rate (FCDR), which is based on the false discovery rate. This controls both the family wise error rate under the null hypothesis that coordinates are randomly drawn from a standard stereotaxic space, and the proportion of significant clusters that are expected under the null. Such control is necessary to avoid propagating and even amplifying the very issues motivating the meta-analysis in the first place. ClusterZ is demonstrated on both numerically simulated data and on real data from reports of grey matter loss in multiple sclerosis (MS) and syndromes suggestive of MS, and of painful stimulus in healthy controls. The software implementation is available to download and use freely. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Data exploration, quality control and statistical analysis of ChIP-exo/nexus experiments

    PubMed Central

    Welch, Rene; Chung, Dongjun; Grass, Jeffrey; Landick, Robert

    2017-01-01

    ChIP-exo/nexus experiments rely on innovative modifications of the commonly used ChIP-seq protocol for high resolution mapping of transcription factor binding sites. Although many aspects of the ChIP-exo data analysis are similar to those of ChIP-seq, these high throughput experiments pose a number of unique quality control and analysis challenges. We develop a novel statistical quality control pipeline and accompanying R/Bioconductor package, ChIPexoQual, to enable exploration and analysis of ChIP-exo and related experiments. ChIPexoQual evaluates a number of key issues including strand imbalance, library complexity, and signal enrichment of data. Assessment of these features is facilitated through diagnostic plots and summary statistics computed over regions of the genome with varying levels of coverage. We evaluated our QC pipeline with both large collections of public ChIP-exo/nexus data and multiple, new ChIP-exo datasets from Escherichia coli. ChIPexoQual analysis of these datasets resulted in guidelines for using these QC metrics across a wide range of sequencing depths and provided further insights for modelling ChIP-exo data. PMID:28911122

  5. Data exploration, quality control and statistical analysis of ChIP-exo/nexus experiments.

    PubMed

    Welch, Rene; Chung, Dongjun; Grass, Jeffrey; Landick, Robert; Keles, Sündüz

    2017-09-06

    ChIP-exo/nexus experiments rely on innovative modifications of the commonly used ChIP-seq protocol for high resolution mapping of transcription factor binding sites. Although many aspects of the ChIP-exo data analysis are similar to those of ChIP-seq, these high throughput experiments pose a number of unique quality control and analysis challenges. We develop a novel statistical quality control pipeline and accompanying R/Bioconductor package, ChIPexoQual, to enable exploration and analysis of ChIP-exo and related experiments. ChIPexoQual evaluates a number of key issues including strand imbalance, library complexity, and signal enrichment of data. Assessment of these features is facilitated through diagnostic plots and summary statistics computed over regions of the genome with varying levels of coverage. We evaluated our QC pipeline with both large collections of public ChIP-exo/nexus data and multiple, new ChIP-exo datasets from Escherichia coli. ChIPexoQual analysis of these datasets resulted in guidelines for using these QC metrics across a wide range of sequencing depths and provided further insights for modelling ChIP-exo data. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  6. Antecedents to Organizational Performance: Theoretical and Practical Implications for Aircraft Maintenance Officer Force Development

    DTIC Science & Technology

    2015-03-26

    to my reader, Lieutenant Colonel Robert Overstreet, for helping solidify my research, coaching me through the statistical analysis, and positive...Descriptive Statistics...common-method bias requires careful assessment of potential sources of bias and implementing procedural and statistical control methods. Podsakoff

  7. Quality control analysis : part II : soil and aggregate base course.

    DOT National Transportation Integrated Search

    1966-07-01

    This is the second of the three reports on the quality control analysis of highway construction materials. : It deals with the statistical evaluation of results from several construction projects to determine the basic pattern of variability with res...

  8. Quality control analysis : part III : concrete and concrete aggregates.

    DOT National Transportation Integrated Search

    1966-11-01

    This is the third and last report on the Quality Control Analysis of highway construction materials. : It deals with the statistical evaluation of data from several construction projects to determine the basic pattern of variability with respect to s...

  9. Neural network approach in multichannel auditory event-related potential analysis.

    PubMed

    Wu, F Y; Slater, J D; Ramsay, R E

    1994-04-01

    Even though there are presently no clearly defined criteria for the assessment of P300 event-related potential (ERP) abnormality, it is strongly indicated through statistical analysis that such criteria exist for classifying control subjects and patients with diseases resulting in neuropsychological impairment such as multiple sclerosis (MS). We have demonstrated the feasibility of artificial neural network (ANN) methods in classifying ERP waveforms measured at a single channel (Cz) from control subjects and MS patients. In this paper, we report the results of multichannel ERP analysis and a modified network analysis methodology to enhance automation of the classification rule extraction process. The proposed methodology significantly reduces the work of statistical analysis. It also helps to standardize the criteria of P300 ERP assessment and facilitate the computer-aided analysis on neuropsychological functions.

  10. Statistical issues in quality control of proteomic analyses: good experimental design and planning.

    PubMed

    Cairns, David A

    2011-03-01

    Quality control is becoming increasingly important in proteomic investigations as experiments become more multivariate and quantitative. Quality control applies to all stages of an investigation and statistics can play a key role. In this review, the role of statistical ideas in the design and planning of an investigation is described. This involves the design of unbiased experiments using key concepts from statistical experimental design, the understanding of the biological and analytical variation in a system using variance components analysis and the determination of a required sample size to perform a statistically powerful investigation. These concepts are described through simple examples and an example data set from a 2-D DIGE pilot experiment. Each of these concepts can prove useful in producing better and more reproducible data. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
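
    A small sketch of the sample-size reasoning mentioned above, for a two-group comparison with a standardized effect size; the effect size, significance level and target power are illustrative, not values from the review.

    ```python
    from statsmodels.stats.power import TTestIndPower

    analysis = TTestIndPower()
    n_per_group = analysis.solve_power(effect_size=0.8,   # standardized difference to detect (assumed)
                                       alpha=0.05,        # two-sided significance level
                                       power=0.9)         # desired statistical power
    print(f"samples required per group: {n_per_group:.1f}")   # roughly 34 per group
    ```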

  11. Clinical effectiveness of palifermin in prevention and treatment of oral mucositis in children with acute lymphoblastic leukaemia: a case-control study.

    PubMed

    Lauritano, Dorina; Petruzzi, Massimo; Di Stasio, Dario; Lucchese, Alberta

    2014-03-01

    The aim of this study was to evaluate the efficacy of palifermin, an N-terminal truncated version of endogenous keratinocyte growth factor, in the control of oral mucositis during antiblastic therapy. Twenty patients undergoing allogeneic stem-cell transplantation for acute lymphoblastic leukaemia were treated with palifermin, and compared to a control group with the same number of subjects and similar inclusion criteria. Statistical analyses were performed to compare the outcomes in the treatment vs. control groups. In the treatment group, we found a statistically significant reduction in the duration of parenteral nutrition (P=0.002), duration of mucositis (P=0.003) and the average grade of mucositis (P=0.03). The statistical analysis showed that the drug was able to decrease the severity of mucositis. These data, although preliminary, suggest that palifermin could be a valid therapeutic adjuvant to improve the quality of life of patients suffering from leukaemia.

  12. Design, analysis, and interpretation of field quality-control data for water-sampling projects

    USGS Publications Warehouse

    Mueller, David K.; Schertz, Terry L.; Martin, Jeffrey D.; Sandstrom, Mark W.

    2015-01-01

    The report provides extensive information about statistical methods used to analyze quality-control data in order to estimate potential bias and variability in environmental data. These methods include construction of confidence intervals on various statistical measures, such as the mean, percentiles and percentages, and standard deviation. The methods are used to compare quality-control results with the larger set of environmental data in order to determine whether the effects of bias and variability might interfere with interpretation of these data. Examples from published reports are presented to illustrate how the methods are applied, how bias and variability are reported, and how the interpretation of environmental data can be qualified based on the quality-control analysis.
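
    A minimal sketch of one such analysis: a t-based confidence interval on the mean of quality-control results (for example, concentrations measured in field blanks) to judge potential bias; the values and units are invented.

    ```python
    import numpy as np
    from scipy import stats

    blanks = np.array([0.02, 0.00, 0.05, 0.01, 0.03, 0.00, 0.04, 0.02])   # field-blank results, ug/L (hypothetical)

    mean = blanks.mean()
    sem = stats.sem(blanks)                 # standard error of the mean
    ci_low, ci_high = stats.t.interval(0.95, len(blanks) - 1, loc=mean, scale=sem)
    print(f"mean blank concentration: {mean:.3f} ug/L (95% CI {ci_low:.3f} to {ci_high:.3f})")
    ```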

  13. Analysis of spontaneous MEG activity in mild cognitive impairment and Alzheimer's disease using spectral entropies and statistical complexity measures

    NASA Astrophysics Data System (ADS)

    Bruña, Ricardo; Poza, Jesús; Gómez, Carlos; García, María; Fernández, Alberto; Hornero, Roberto

    2012-06-01

    Alzheimer's disease (AD) is the most common cause of dementia. Over the last few years, a considerable effort has been devoted to exploring new biomarkers. Nevertheless, a better understanding of brain dynamics is still required to optimize therapeutic strategies. In this regard, the characterization of mild cognitive impairment (MCI) is crucial, due to the high conversion rate from MCI to AD. However, only a few studies have focused on the analysis of magnetoencephalographic (MEG) rhythms to characterize AD and MCI. In this study, we assess the ability of several parameters derived from information theory to describe spontaneous MEG activity from 36 AD patients, 18 MCI subjects and 26 controls. Three entropies (Shannon, Tsallis and Rényi entropies), one disequilibrium measure (based on Euclidean distance ED) and three statistical complexities (based on Lopez Ruiz-Mancini-Calbet complexity LMC) were used to estimate the irregularity and statistical complexity of MEG activity. Statistically significant differences between AD patients and controls were obtained with all parameters (p < 0.01). In addition, statistically significant differences between MCI subjects and controls were achieved by ED and LMC (p < 0.05). In order to assess the diagnostic ability of the parameters, a linear discriminant analysis with a leave-one-out cross-validation procedure was applied. The accuracies reached 83.9% and 65.9% to discriminate AD and MCI subjects from controls, respectively. Our findings suggest that MCI subjects exhibit an intermediate pattern of abnormalities between normal aging and AD. Furthermore, the proposed parameters provide a new description of brain dynamics in AD and MCI.
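
    A compact, hypothetical illustration of the information-theoretic descriptors named above, computed from the normalized power spectrum of a simulated signal: a normalized Shannon entropy, a Euclidean disequilibrium, and an LMC-style statistical complexity (their product).

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    signal = rng.normal(size=2048)                  # stand-in for one MEG epoch

    psd = np.abs(np.fft.rfft(signal)) ** 2
    p = psd / psd.sum()                             # normalized spectral distribution
    n = p.size

    shannon = -np.sum(p * np.log(p + 1e-12)) / np.log(n)   # normalized Shannon entropy
    diseq = np.sum((p - 1.0 / n) ** 2)                      # Euclidean distance to the uniform distribution
    lmc = shannon * diseq                                   # LMC-style statistical complexity

    print(f"H = {shannon:.3f}  D = {diseq:.5f}  C = {lmc:.5f}")
    ```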

  14. Simulation Study of Evacuation Control Center Operations Analysis

    DTIC Science & Technology

    2011-06-01

    Abstract text is not available for this record; the indexed excerpt consists only of table-of-contents and list-of-tables fragments. Recoverable headings include Baseline Manning (Runs 1, 2, & 3), Baseline Statistics Interpretation, a Key Statistic Matrix for Runs 1-12, Appendix C (Blue Dart), and paired t-test comparisons of Evacuation Control Center (ECC) completion times between simulation runs (Run 5 vs. Run 6; Run 3 vs. Run 9).

  15. Quasi-experimental Studies in the Fields of Infection Control and Antibiotic Resistance, Ten Years Later: A Systematic Review.

    PubMed

    Alsaggaf, Rotana; O'Hara, Lyndsay M; Stafford, Kristen A; Leekha, Surbhi; Harris, Anthony D

    2018-02-01

    OBJECTIVE A systematic review of quasi-experimental studies in the field of infectious diseases was published in 2005. The aim of this study was to assess improvements in the design and reporting of quasi-experiments 10 years after the initial review. We also aimed to report the statistical methods used to analyze quasi-experimental data. DESIGN Systematic review of articles published from January 1, 2013, to December 31, 2014, in 4 major infectious disease journals. METHODS Quasi-experimental studies focused on infection control and antibiotic resistance were identified and classified based on 4 criteria: (1) type of quasi-experimental design used, (2) justification of the use of the design, (3) use of correct nomenclature to describe the design, and (4) statistical methods used. RESULTS Of 2,600 articles, 173 (7%) featured a quasi-experimental design, compared to 73 of 2,320 articles (3%) in the previous review (P<.01). Moreover, 21 articles (12%) utilized a study design with a control group; 6 (3.5%) justified the use of a quasi-experimental design; and 68 (39%) identified their design using the correct nomenclature. In addition, 2-group statistical tests were used in 75 studies (43%); 58 studies (34%) used standard regression analysis; 18 (10%) used segmented regression analysis; 7 (4%) used standard time-series analysis; 5 (3%) used segmented time-series analysis; and 10 (6%) did not utilize statistical methods for comparisons. CONCLUSIONS While some progress occurred over the decade, it is crucial to continue improving the design and reporting of quasi-experimental studies in the fields of infection control and antibiotic resistance to better evaluate the effectiveness of important interventions. Infect Control Hosp Epidemiol 2018;39:170-176.
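
    Segmented regression is one of the analysis methods tallied in this review. The sketch below shows one common parameterization (a level change plus a slope change at the intervention point) fitted with ordinary least squares on simulated monthly rates; the data and model form are illustrative assumptions, not taken from any reviewed study.

```python
# Minimal sketch of segmented regression for an interrupted time series.
# Hypothetical monthly infection rates with an intervention at month 24.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
months = np.arange(48)
post = (months >= 24).astype(int)
rate = (10 - 0.05 * months - 2.0 * post
        - 0.10 * post * (months - 24) + rng.normal(0, 0.5, 48))
df = pd.DataFrame({"rate": rate, "time": months, "post": post,
                   "time_after": post * (months - 24)})

# 'post' estimates the immediate level change, 'time_after' the slope change
model = smf.ols("rate ~ time + post + time_after", data=df).fit()
print(model.summary())
```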

  16. CADDIS Volume 4. Data Analysis: Basic Principles & Issues

    EPA Pesticide Factsheets

    Use of inferential statistics in causal analysis, an introduction to data independence and autocorrelation, methods for identifying and controlling for confounding variables, and references for the Basic Principles section of Data Analysis.

  17. A statistical analysis of the impact of advertising signs on road safety.

    PubMed

    Yannis, George; Papadimitriou, Eleonora; Papantoniou, Panagiotis; Voulgari, Chrisoula

    2013-01-01

    This research aims to investigate the impact of advertising signs on road safety. An exhaustive review of international literature was carried out on the effect of advertising signs on driver behaviour and safety. Moreover, a before-and-after statistical analysis with control groups was applied to several road sites with different characteristics in the Athens metropolitan area, in Greece, in order to investigate the correlation between the placement or removal of advertising signs and the related occurrence of road accidents. Road accident data for the 'before' and 'after' periods on the test sites and the control sites were extracted from the database of the Hellenic Statistical Authority, and the selected 'before' and 'after' periods vary from 2.5 to 6 years. The statistical analysis shows no statistical correlation between road accidents and advertising signs in any of the nine sites examined, as the confidence intervals of the estimated safety effects are non-significant at the 95% confidence level. This can be explained by the fact that, in the examined road sites, drivers are overloaded with information (traffic signs, direction signs, labels of shops, pedestrians and other vehicles, etc.) so that the additional information load from advertising signs may not further distract them.

  18. Trial Sequential Analysis in systematic reviews with meta-analysis.

    PubMed

    Wetterslev, Jørn; Jakobsen, Janus Christian; Gluud, Christian

    2017-03-06

    Most meta-analyses in systematic reviews, including Cochrane ones, do not have sufficient statistical power to detect or refute even large intervention effects. This is why a meta-analysis ought to be regarded as an interim analysis on its way towards a required information size. The results of the meta-analyses should relate the total number of randomised participants to the estimated required meta-analytic information size accounting for statistical diversity. When the number of participants and the corresponding number of trials in a meta-analysis are insufficient, the use of the traditional 95% confidence interval or the 5% statistical significance threshold will lead to too many false positive conclusions (type I errors) and too many false negative conclusions (type II errors). We developed a methodology for interpreting meta-analysis results, using generally accepted, valid evidence on how to adjust thresholds for significance in randomised clinical trials when the required sample size has not been reached. The Lan-DeMets trial sequential monitoring boundaries in Trial Sequential Analysis offer adjusted confidence intervals and restricted thresholds for statistical significance when the diversity-adjusted required information size and the corresponding number of required trials for the meta-analysis have not been reached. Trial Sequential Analysis provides a frequentistic approach to control both type I and type II errors. We define the required information size and the corresponding number of required trials in a meta-analysis and the diversity (D 2 ) measure of heterogeneity. We explain the reasons for using Trial Sequential Analysis of meta-analysis when the actual information size fails to reach the required information size. We present examples drawn from traditional meta-analyses using unadjusted naïve 95% confidence intervals and 5% thresholds for statistical significance. Spurious conclusions in systematic reviews with traditional meta-analyses can be reduced using Trial Sequential Analysis. Several empirical studies have demonstrated that the Trial Sequential Analysis provides better control of type I errors and of type II errors than the traditional naïve meta-analysis. Trial Sequential Analysis represents analysis of meta-analytic data, with transparent assumptions, and better control of type I and type II errors than the traditional meta-analysis using naïve unadjusted confidence intervals.
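
    As a rough illustration of the ideas above (not the TSA software itself), the sketch below computes a diversity-adjusted required information size for a binary outcome, assuming the standard two-proportion sample-size formula and dividing by (1 - D^2); the event rates, relative risk reduction, and D^2 value are invented.

```python
# Minimal sketch: diversity-adjusted required information size for a
# meta-analysis of a binary outcome (pooled-variance approximation).
from scipy import stats

def required_information_size(p_control, rrr, alpha=0.05, power=0.90, d2=0.0):
    p_exp = p_control * (1 - rrr)          # intervention event proportion
    p_bar = (p_control + p_exp) / 2
    z_a = stats.norm.ppf(1 - alpha / 2)
    z_b = stats.norm.ppf(power)
    n_per_group = ((z_a + z_b) ** 2 * 2 * p_bar * (1 - p_bar)
                   / (p_control - p_exp) ** 2)
    fixed_is = 2 * n_per_group              # total participants, both groups
    return fixed_is / (1 - d2)              # diversity (D^2) adjustment

# hypothetical: 10% control risk, 20% relative risk reduction, D^2 = 0.25
print(round(required_information_size(0.10, 0.20, d2=0.25)))
```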

  19. Risk of thromboembolism with thrombopoietin receptor agonists in adult patients with thrombocytopenia: Systematic review and meta-analysis of randomized controlled trials.

    PubMed

    Catalá-López, Ferrán; Corrales, Inmaculada; de la Fuente-Honrubia, César; González-Bermejo, Diana; Martín-Serrano, Gloria; Montero, Dolores; Saint-Gerons, Diego Macías

    2015-12-21

    Romiplostim and eltrombopag are thrombopoietin receptor (TPOr) agonists that promote megakaryocyte differentiation, proliferation and platelet production. In 2012, a systematic review and meta-analysis reported a non-statistically significant increased risk of thromboembolic events for these drugs, but analyses were limited by lack of statistical power. Our objective was to update the 2012 meta-analysis examining whether TPOr agonists affect thromboembolism occurrence in adult thrombocytopenic patients. We conducted a systematic review and meta-analysis of randomized controlled trials (RCTs). Updated searches were conducted on PubMed, Cochrane Central, and publicly available registries (up to December 2014). RCTs using romiplostim or eltrombopag in at least one group were included. Relative risks (RR), absolute risk ratios (ARR) and number needed to harm (NNH) were estimated. Heterogeneity was analyzed using Cochran's Q test and the I(2) statistic. Fifteen studies with 3026 adult thrombocytopenic patients were included. Estimated frequency of thromboembolism was 3.69% (95% CI: 2.95-4.61%) for TPOr agonists and 1.46% (95% CI: 0.89-2.40%) for controls. TPOr agonists were associated with an RR of thromboembolism of 1.81 (95% CI: 1.04-3.14) and an ARR of 2.10% (95% CI: 0.03-3.90%), corresponding to an NNH of 48. Overall, we did not find evidence of statistical heterogeneity (p=0.43; I(2)=1.60%). Our updated meta-analysis suggests that TPOr agonists are associated with a higher risk of thromboembolic events compared with controls, and supports the current recommendations included in the European product information in this respect. Copyright © 2015 Elsevier España, S.L.U. All rights reserved.
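
    The effect measures named in the abstract (RR, ARR, NNH) can be illustrated with a short calculation on pooled 2x2 counts; the counts below are invented and do not reproduce the study's data.

```python
# Minimal sketch: relative risk, absolute risk difference and number
# needed to harm from hypothetical pooled event counts.
events_tpo, n_tpo = 56, 1518       # TPOr agonist arms (invented)
events_ctrl, n_ctrl = 22, 1508     # control arms (invented)

risk_tpo = events_tpo / n_tpo
risk_ctrl = events_ctrl / n_ctrl

rr = risk_tpo / risk_ctrl          # relative risk
arr = risk_tpo - risk_ctrl         # absolute risk difference (excess harm)
nnh = 1 / arr                      # number needed to harm

print(f"RR = {rr:.2f}, ARR = {100 * arr:.2f}%, NNH = {nnh:.0f}")
```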

  20. Cognition of and Demand for Education and Teaching in Medical Statistics in China: A Systematic Review and Meta-Analysis

    PubMed Central

    Li, Gaoming; Yi, Dali; Wu, Xiaojiao; Liu, Xiaoyu; Zhang, Yanqi; Liu, Ling; Yi, Dong

    2015-01-01

    Background Although a substantial number of studies focus on the teaching and application of medical statistics in China, few studies comprehensively evaluate the recognition of and demand for medical statistics. In addition, the results of these various studies differ and are insufficiently comprehensive and systematic. Objectives This investigation aimed to evaluate the general cognition of and demand for medical statistics by undergraduates, graduates, and medical staff in China. Methods We performed a comprehensive database search related to the cognition of and demand for medical statistics from January 2007 to July 2014 and conducted a meta-analysis of non-controlled studies with sub-group analysis for undergraduates, graduates, and medical staff. Results There are substantial differences with respect to the cognition of theory in medical statistics among undergraduates (73.5%), graduates (60.7%), and medical staff (39.6%). The demand for theory in medical statistics is high among graduates (94.6%), undergraduates (86.1%), and medical staff (88.3%). Regarding specific statistical methods, the cognition of basic statistical methods is higher than of advanced statistical methods. The demand for certain advanced statistical methods, including (but not limited to) multiple analysis of variance (ANOVA), multiple linear regression, and logistic regression, is higher than that for basic statistical methods. The use rates of the Statistical Package for the Social Sciences (SPSS) software and statistical analysis software (SAS) are only 55% and 15%, respectively. Conclusion The overall statistical competence of undergraduates, graduates, and medical staff is insufficient, and their ability to practically apply their statistical knowledge is limited, which constitutes an unsatisfactory state of affairs for medical statistics education. Because the demand for skills in this area is increasing, the need to reform medical statistics education in China has become urgent. PMID:26053876

  1. Cognition of and Demand for Education and Teaching in Medical Statistics in China: A Systematic Review and Meta-Analysis.

    PubMed

    Wu, Yazhou; Zhou, Liang; Li, Gaoming; Yi, Dali; Wu, Xiaojiao; Liu, Xiaoyu; Zhang, Yanqi; Liu, Ling; Yi, Dong

    2015-01-01

    Although a substantial number of studies focus on the teaching and application of medical statistics in China, few studies comprehensively evaluate the recognition of and demand for medical statistics. In addition, the results of these various studies differ and are insufficiently comprehensive and systematic. This investigation aimed to evaluate the general cognition of and demand for medical statistics by undergraduates, graduates, and medical staff in China. We performed a comprehensive database search related to the cognition of and demand for medical statistics from January 2007 to July 2014 and conducted a meta-analysis of non-controlled studies with sub-group analysis for undergraduates, graduates, and medical staff. There are substantial differences with respect to the cognition of theory in medical statistics among undergraduates (73.5%), graduates (60.7%), and medical staff (39.6%). The demand for theory in medical statistics is high among graduates (94.6%), undergraduates (86.1%), and medical staff (88.3%). Regarding specific statistical methods, the cognition of basic statistical methods is higher than of advanced statistical methods. The demand for certain advanced statistical methods, including (but not limited to) multiple analysis of variance (ANOVA), multiple linear regression, and logistic regression, is higher than that for basic statistical methods. The use rates of the Statistical Package for the Social Sciences (SPSS) software and statistical analysis software (SAS) are only 55% and 15%, respectively. The overall statistical competence of undergraduates, graduates, and medical staff is insufficient, and their ability to practically apply their statistical knowledge is limited, which constitutes an unsatisfactory state of affairs for medical statistics education. Because the demand for skills in this area is increasing, the need to reform medical statistics education in China has become urgent.

  2. Quasi-experimental study designs series-paper 10: synthesizing evidence for effects collected from quasi-experimental studies presents surmountable challenges.

    PubMed

    Becker, Betsy Jane; Aloe, Ariel M; Duvendack, Maren; Stanley, T D; Valentine, Jeffrey C; Fretheim, Atle; Tugwell, Peter

    2017-09-01

    To outline issues of importance to analytic approaches to the synthesis of quasi-experiments (QEs) and to provide a statistical model for use in analysis. We drew on studies of statistics, epidemiology, and social-science methodology to outline methods for synthesis of QE studies. The design and conduct of QEs, effect sizes from QEs, and moderator variables for the analysis of those effect sizes were discussed. Biases, confounding, design complexities, and comparisons across designs offer serious challenges to syntheses of QEs. Key components of meta-analyses of QEs were identified, including the aspects of QE study design to be coded and analyzed. Of utmost importance are the design and statistical controls implemented in the QEs. Such controls and any potential sources of bias and confounding must be modeled in analyses, along with aspects of the interventions and populations studied. Because of such controls, effect sizes from QEs are more complex than those from randomized experiments. A statistical meta-regression model that incorporates important features of the QEs under review was presented. Meta-analyses of QEs provide particular challenges, but thorough coding of intervention characteristics and study methods, along with careful analysis, should allow for sound inferences. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. The Bayesian New Statistics: Hypothesis testing, estimation, meta-analysis, and power analysis from a Bayesian perspective.

    PubMed

    Kruschke, John K; Liddell, Torrin M

    2018-02-01

    In the practice of data analysis, there is a conceptual distinction between hypothesis testing, on the one hand, and estimation with quantified uncertainty on the other. Among frequentists in psychology, a shift of emphasis from hypothesis testing to estimation has been dubbed "the New Statistics" (Cumming 2014). A second conceptual distinction is between frequentist methods and Bayesian methods. Our main goal in this article is to explain how Bayesian methods achieve the goals of the New Statistics better than frequentist methods. The article reviews frequentist and Bayesian approaches to hypothesis testing and to estimation with confidence or credible intervals. The article also describes Bayesian approaches to meta-analysis, randomized controlled trials, and power analysis.

  4. DNA Damage Analysis in Children with Non-syndromic Developmental Delay by Comet Assay.

    PubMed

    Susai, Surraj; Chand, Parkash; Ballambattu, Vishnu Bhat; Hanumanthappa, Nandeesha; Veeramani, Raveendranath

    2016-05-01

    The majority of developmental delays in children are non-syndromic, and they are believed to involve underlying DNA damage, though this is not well substantiated. Hence the present study was carried out to find out if there is any increased DNA damage in children with non-syndromic developmental delay by using the comet assay. The present case-control study was undertaken to assess the level of DNA damage in children with non-syndromic developmental delay and compare it with that of age- and sex-matched controls using submarine gel electrophoresis (comet assay). Blood from clinically diagnosed children with non-syndromic developmental delay and from controls was subjected to the alkaline version of the comet assay (single-cell gel electrophoresis) using lymphocytes isolated from peripheral blood. The comets were observed under a bright field microscope, photo-captured and scored using the Image J image quantification software. Comet parameters were compared between the cases and controls, and statistical analysis and interpretation of results were done using the statistical software SPSS version 20. The mean comet tail length in cases and controls was 20.77±7.659 μm and 8.97±4.398 μm, respectively, which was statistically significant (p<0.001). Other comet parameters like total comet length and % DNA in tail also showed a statistically significant difference (p < 0.001) between cases and controls. The current investigation revealed increased levels of DNA damage in children with non-syndromic developmental delay when compared to the controls.
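
    A minimal sketch of the kind of case-control comparison reported above: a two-sample test of comet tail length between cases and controls. The samples are simulated from the reported means and standard deviations purely for illustration; the exact test used in the original SPSS analysis is not specified here.

```python
# Minimal sketch: independent two-sample comparison of comet tail lengths.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
cases = rng.normal(20.77, 7.659, 30)      # tail length, micrometres (simulated)
controls = rng.normal(8.97, 4.398, 30)    # simulated controls

t, p = stats.ttest_ind(cases, controls, equal_var=False)  # Welch's t-test
print(f"t = {t:.2f}, p = {p:.3g}")
```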

  5. Outbreak of resistant Acinetobacter baumannii- measures and proposal for prevention and control.

    PubMed

    Romanelli, Roberta Maia de Castro; Jesus, Lenize Adriana de; Clemente, Wanessa Trindade; Lima, Stella Sala Soares; Rezende, Edna Maria; Coutinho, Rosane Luiza; Moreira, Ricardo Luiz Fontes; Neves, Francelli Aparecida Cordeiro; Brás, Nelma de Jesus

    2009-10-01

    Acinetobacter baumannii colonization and infection, frequent in Intensive Care Unit (ICU) patients, are commonly associated with high morbidity and mortality. Several outbreaks due to multidrug-resistant (MDR) A. baumannii have been reported, but few of them from Brazil. This study aimed to identify risk factors associated with colonization and infection by MDR and carbapenem-resistant A. baumannii strains isolated from patients admitted to the adult ICU at HC/UFMG. A case-control study was performed from January 2007 to June 2008. Cases were defined as patients colonized or infected by MDR/carbapenem-resistant A. baumannii, and controls were patients without MDR/carbapenem-resistant A. baumannii isolation, in a 1:2 proportion. For statistical analysis, due to changes in infection control guidelines, infection criteria and the notification process, this study was divided into two periods. During the first period analyzed, from January to December 2007, colonization or infection by MDR/carbapenem-resistant A. baumannii was associated with prior infection, invasive device utilization, prior carbapenem use and clinical severity. In the multivariate analysis, prior infection and mechanical ventilation proved to be statistically significant risk factors. Carbapenem use showed a tendency towards a statistical association. During the second study period, from January to June 2008, variables with a significant association with MDR/carbapenem-resistant A. baumannii colonization/infection were catheter utilization, carbapenem and third-generation cephalosporin use, hepatic transplantation, and clinical severity. In the multivariate analysis, only CVC use showed a statistical difference. Carbapenem and third-generation cephalosporin use displayed a tendency to be risk factors. Infection control and prevention measures must focus on these risk factors, considering A. baumannii dissemination.

  6. Statistical Analysis Tools for Learning in Engineering Laboratories.

    ERIC Educational Resources Information Center

    Maher, Carolyn A.

    1990-01-01

    Described are engineering programs that have used automated data acquisition systems to implement data collection and analyze experiments. Applications include a biochemical engineering laboratory, heat transfer performance, engineering materials testing, mechanical system reliability, statistical control laboratory, thermo-fluid laboratory, and a…

  7. The Importance of Practice in the Development of Statistics.

    DTIC Science & Technology

    1983-01-01

    Technical Summary Report #2471, The Importance of Practice in the Development of Statistics. The indexed excerpt lists topics covered, including component analysis, bioassay, limits for a ratio, quality control, sampling inspection, non-parametric tests, transformation theory, ARIMA time-series models, sequential tests, cumulative sum charts, data-analysis plotting techniques, and a resolution of the Bayes-frequentist controversy.

  8. Digital immunohistochemistry platform for the staining variation monitoring based on integration of image and statistical analyses with laboratory information system.

    PubMed

    Laurinaviciene, Aida; Plancoulaine, Benoit; Baltrusaityte, Indra; Meskauskas, Raimundas; Besusparis, Justinas; Lesciute-Krilaviciene, Daiva; Raudeliunas, Darius; Iqbal, Yasir; Herlin, Paulette; Laurinavicius, Arvydas

    2014-01-01

    Digital immunohistochemistry (IHC) is one of the most promising applications brought by new generation image analysis (IA). While conventional IHC staining quality is monitored by semi-quantitative visual evaluation of tissue controls, IA may require more sensitive measurement. We designed an automated system to digitally monitor IHC multi-tissue controls, based on SQL-level integration of the laboratory information system with image and statistical analysis tools. Consecutive sections of a TMA containing 10 cores of breast cancer tissue were used as tissue controls in routine Ki67 IHC testing. The Ventana slide label barcode ID was sent to the LIS to register the serial section sequence. The slides were stained and scanned (Aperio ScanScope XT), and IA was performed by the Aperio/Leica Colocalization and Genie Classifier/Nuclear algorithms. SQL-based integration ensured automated statistical analysis of the IA data by the SAS Enterprise Guide project. Factor analysis and plot visualizations were performed to explore slide-to-slide variation of the Ki67 IHC staining results in the control tissue. Slide-to-slide intra-core IHC staining analysis revealed rather significant variation of the variables reflecting the sample size, while Brown and Blue Intensity were relatively stable. To further investigate this variation, the IA results from the 10 cores were aggregated to minimize tissue-related variance. Factor analysis revealed an association between the variables reflecting the sample size detected by IA and Blue Intensity. Since the main feature to be extracted from the tissue controls was staining intensity, we further explored the variation of the intensity variables in the individual cores. MeanBrownBlue Intensity ((Brown+Blue)/2) and DiffBrownBlue Intensity (Brown-Blue) were introduced to better contrast the absolute intensity and the colour balance variation in each core; relevant factor scores were extracted. Finally, tissue-related factors of IHC staining variance were explored in the individual tissue cores. Our solution enabled monitoring of the staining of IHC multi-tissue controls by means of IA, followed by automated statistical analysis, integrated into the laboratory workflow. We found that, even in consecutive serial tissue sections, tissue-related factors affected the IHC IA results; meanwhile, a less intense blue counterstain was associated with a smaller amount of tissue detected by the IA tools.

  9. Digital immunohistochemistry platform for the staining variation monitoring based on integration of image and statistical analyses with laboratory information system

    PubMed Central

    2014-01-01

    Background Digital immunohistochemistry (IHC) is one of the most promising applications brought by new generation image analysis (IA). While conventional IHC staining quality is monitored by semi-quantitative visual evaluation of tissue controls, IA may require more sensitive measurement. We designed an automated system to digitally monitor IHC multi-tissue controls, based on SQL-level integration of the laboratory information system with image and statistical analysis tools. Methods Consecutive sections of a TMA containing 10 cores of breast cancer tissue were used as tissue controls in routine Ki67 IHC testing. The Ventana slide label barcode ID was sent to the LIS to register the serial section sequence. The slides were stained and scanned (Aperio ScanScope XT), and IA was performed by the Aperio/Leica Colocalization and Genie Classifier/Nuclear algorithms. SQL-based integration ensured automated statistical analysis of the IA data by the SAS Enterprise Guide project. Factor analysis and plot visualizations were performed to explore slide-to-slide variation of the Ki67 IHC staining results in the control tissue. Results Slide-to-slide intra-core IHC staining analysis revealed rather significant variation of the variables reflecting the sample size, while Brown and Blue Intensity were relatively stable. To further investigate this variation, the IA results from the 10 cores were aggregated to minimize tissue-related variance. Factor analysis revealed an association between the variables reflecting the sample size detected by IA and Blue Intensity. Since the main feature to be extracted from the tissue controls was staining intensity, we further explored the variation of the intensity variables in the individual cores. MeanBrownBlue Intensity ((Brown+Blue)/2) and DiffBrownBlue Intensity (Brown-Blue) were introduced to better contrast the absolute intensity and the colour balance variation in each core; relevant factor scores were extracted. Finally, tissue-related factors of IHC staining variance were explored in the individual tissue cores. Conclusions Our solution enabled monitoring of the staining of IHC multi-tissue controls by means of IA, followed by automated statistical analysis, integrated into the laboratory workflow. We found that, even in consecutive serial tissue sections, tissue-related factors affected the IHC IA results; meanwhile, a less intense blue counterstain was associated with a smaller amount of tissue detected by the IA tools. PMID:25565007

  10. Comparative efficacy of two battery-powered toothbrushes on dental plaque removal.

    PubMed

    Ruhlman, C Douglas; Bartizek, Robert D; Biesbrock, Aaron R

    2002-01-01

    A number of clinical studies have consistently demonstrated that power toothbrushes deliver superior plaque removal compared to manual toothbrushes. Recently, a new power toothbrush (Crest SpinBrush) has been marketed with a design that fundamentally differs from other marketed power toothbrushes. Other power toothbrushes feature a small, round head designed to oscillate for enhanced cleaning between the teeth and below the gumline. The new power toothbrush incorporates a similar round oscillating head in conjunction with fixed bristles, which allows the user to brush with optimal manual brushing technique. The objective of this randomized, examiner-blind, parallel design study was to compare the plaque removal efficacy of a positive control power toothbrush (Colgate Actibrush) to an experimental toothbrush (Crest SpinBrush) following a single use among 59 subjects. Baseline plaque scores were 1.64 and 1.40 for the experimental toothbrush and control toothbrush treatment groups, respectively. With regard to all surfaces examined, the experimental toothbrush delivered an adjusted (via analysis of covariance) mean difference between baseline and post-brushing plaque scores of 0.47, while the control toothbrush delivered an adjusted mean difference of 0.33. On average, the difference between toothbrushes was statistically significant (p = 0.013). Because the covariate slope for the experimental group was statistically significantly greater (p = 0.001) than the slope for the control group, a separate slope model was used. Further analysis demonstrated that the experimental group had statistically significantly greater plaque removal than the control group for baseline plaque scores above 1.43. With respect to buccal surfaces, using a separate slope analysis of covariance, the experimental toothbrush delivered an adjusted mean difference between baseline and post-brushing plaque scores of 0.61, while the control toothbrush delivered an adjusted mean difference of 0.39. This difference between toothbrushes was also statistically significant (p = 0.002). On average, the results on lingual surfaces demonstrated similar directional scores favoring the experimental toothbrush; however these results did not achieve statistical significance. In conclusion, the experimental Crest SpinBrush, with its novel fixed and oscillating bristle design, was found to be more effective than the positive control Colgate Actibrush, which is designed with a small round oscillating cluster of bristles.

  11. A randomized, placebo-controlled trial of patient education for acute low back pain (PREVENT Trial): statistical analysis plan.

    PubMed

    Traeger, Adrian C; Skinner, Ian W; Hübscher, Markus; Lee, Hopin; Moseley, G Lorimer; Nicholas, Michael K; Henschke, Nicholas; Refshauge, Kathryn M; Blyth, Fiona M; Main, Chris J; Hush, Julia M; Pearce, Garry; Lo, Serigne; McAuley, James H

    Statistical analysis plans increase the transparency of decisions made in the analysis of clinical trial results. The purpose of this paper is to detail the planned analyses for the PREVENT trial, a randomized, placebo-controlled trial of patient education for acute low back pain. We report the pre-specified principles, methods, and procedures to be adhered to in the main analysis of the PREVENT trial data. The primary outcome analysis will be based on Mixed Models for Repeated Measures (MMRM), which can test treatment effects at specific time points, and the assumptions of this analysis are outlined. We also outline the treatment of secondary outcomes and planned sensitivity analyses. We provide decisions regarding the treatment of missing data, handling of descriptive and process measure data, and blinded review procedures. Making public the pre-specified statistical analysis plan for the PREVENT trial minimizes the potential for bias in the analysis of trial data, and in the interpretation and reporting of trial results. ACTRN12612001180808 (https://www.anzctr.org.au/Trial/Registration/TrialReview.aspx?ACTRN=12612001180808). Copyright © 2017 Associação Brasileira de Pesquisa e Pós-Graduação em Fisioterapia. Publicado por Elsevier Editora Ltda. All rights reserved.

  12. Control of hot mix production by cold feed only : final report.

    DOT National Transportation Integrated Search

    1978-04-01

    This report is concerned with an analysis of the gradation control possible with recently improved aggregate cold feed systems. The gradation control for three mix types produced in a screenless batch plant was monitored and statistically compared wi...

  13. The Statistical Analysis Techniques to Support the NGNP Fuel Performance Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bihn T. Pham; Jeffrey J. Einerson

    2010-06-01

    This paper describes the development and application of statistical analysis techniques to support the AGR experimental program on NGNP fuel performance. The experiments conducted in the Idaho National Laboratory’s Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel/graphite temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the SAS-based NGNP Data Management and Analysis System (NDMAS) for automated processing and qualification of the AGR measured data. The NDMAS also stores daily neutronic (power) and thermal (heat transfer) code simulation results along with the measurement data, allowing for their combined use and comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the target quantity (fuel temperature) within a given range.

  14. Dangers in Using Analysis of Covariance Procedures.

    ERIC Educational Resources Information Center

    Campbell, Kathleen T.

    Problems associated with the use of analysis of covariance (ANCOVA) as a statistical control technique are explained. Three problems relate to the use of "OVA" methods (analysis of variance, analysis of covariance, multivariate analysis of variance, and multivariate analysis of covariance) in general. These are: (1) the wasting of information when…

  15. Performance analysis of Integrated Communication and Control System networks

    NASA Technical Reports Server (NTRS)

    Halevi, Y.; Ray, A.

    1990-01-01

    This paper presents statistical analysis of delays in Integrated Communication and Control System (ICCS) networks that are based on asynchronous time-division multiplexing. The models are obtained in closed form for analyzing control systems with randomly varying delays. The results of this research are applicable to ICCS design for complex dynamical processes like advanced aircraft and spacecraft, autonomous manufacturing plants, and chemical and processing plants.

  16. Statistical Quality Control of Moisture Data in GEOS DAS

    NASA Technical Reports Server (NTRS)

    Dee, D. P.; Rukhovets, L.; Todling, R.

    1999-01-01

    A new statistical quality control algorithm was recently implemented in the Goddard Earth Observing System Data Assimilation System (GEOS DAS). The final step in the algorithm consists of an adaptive buddy check that either accepts or rejects outlier observations based on a local statistical analysis of nearby data. A basic assumption in any such test is that the observed field is spatially coherent, in the sense that nearby data can be expected to confirm each other. However, the buddy check resulted in excessive rejection of moisture data, especially during the Northern Hemisphere summer. The analysis moisture variable in GEOS DAS is water vapor mixing ratio. Observational evidence shows that the distribution of mixing ratio errors is far from normal. Furthermore, spatial correlations among mixing ratio errors are highly anisotropic and difficult to identify. Both factors contribute to the poor performance of the statistical quality control algorithm. To alleviate the problem, we applied the buddy check to relative humidity data instead. This variable explicitly depends on temperature and therefore exhibits a much greater spatial coherence. As a result, reject rates of moisture data are much more reasonable and homogeneous in time and space.

  17. Does daily nurse staffing match ward workload variability? Three hospitals' experiences.

    PubMed

    Gabbay, Uri; Bukchin, Michael

    2009-01-01

    Nurse shortage and rising healthcare resource burdens mean that appropriate workforce use is imperative. This paper aims to evaluate whether daily nursing staffing meets ward workload needs. Nurse attendance and daily nurses' workload capacity in three hospitals were evaluated. Statistical process control was used to evaluate intra-ward nurse workload capacity and day-to-day variations. Statistical process control is a statistics-based method for process monitoring that uses charts with predefined target measure and control limits. Standardization was performed for inter-ward analysis by converting ward-specific crude measures to ward-specific relative measures by dividing observed/expected. Two charts: acceptable and tolerable daily nurse workload intensity, were defined. Appropriate staffing indicators were defined as those exceeding predefined rates within acceptable and tolerable limits (50 percent and 80 percent respectively). A total of 42 percent of the overall days fell within acceptable control limits and 71 percent within tolerable control limits. Appropriate staffing indicators were met in only 33 percent of wards regarding acceptable nurse workload intensity and in only 45 percent of wards regarding tolerable workloads. The study work did not differentiate crude nurse attendance and it did not take into account patient severity since crude bed occupancy was used. Double statistical process control charts and certain staffing indicators were used, which is open to debate. Wards that met appropriate staffing indicators prove the method's feasibility. Wards that did not meet appropriate staffing indicators prove the importance and the need for process evaluations and monitoring. Methods presented for monitoring daily staffing appropriateness are simple to implement either for intra-ward day-to-day variation by using nurse workload capacity statistical process control charts or for inter-ward evaluation using standardized measure of nurse workload intensity. The real challenge will be to develop planning systems and implement corrective interventions such as dynamic and flexible daily staffing, which will face difficulties and barriers. The paper fulfils the need for workforce utilization evaluation. A simple method using available data for daily staffing appropriateness evaluation, which is easy to implement and operate, is presented. The statistical process control method enables intra-ward evaluation, while standardization by converting crude into relative measures enables inter-ward analysis. The staffing indicator definitions enable performance evaluation. This original study uses statistical process control to develop simple standardization methods and applies straightforward statistical tools. This method is not limited to crude measures, rather it uses weighted workload measures such as nursing acuity or weighted nurse level (i.e. grade/band).
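
    As a rough sketch of the statistical process control idea described above (not the authors' exact charts), the code below builds an individuals (X-mR) control chart on a standardized observed/expected daily staffing ratio and counts the days falling inside the 3-sigma limits; the daily ratios are simulated.

```python
# Minimal sketch: individuals (X-mR) control chart on a standardized
# daily staffing ratio (observed / expected nurse capacity).
import numpy as np

rng = np.random.default_rng(2)
ratio = rng.normal(1.0, 0.08, 60)          # 60 days of simulated ratios

center = ratio.mean()
moving_range = np.abs(np.diff(ratio))
sigma_hat = moving_range.mean() / 1.128    # d2 constant for subgroup size 2
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

within = np.mean((ratio >= lcl) & (ratio <= ucl))
print(f"CL={center:.3f}  LCL={lcl:.3f}  UCL={ucl:.3f}  in-control days: {within:.0%}")
```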

  18. Investigation of energy management strategies for photovoltaic systems - A predictive control algorithm

    NASA Technical Reports Server (NTRS)

    Cull, R. C.; Eltimsahy, A. H.

    1983-01-01

    The present investigation is concerned with the formulation of energy management strategies for stand-alone photovoltaic (PV) systems, taking into account a basic control algorithm for a possible predictive, (and adaptive) controller. The control system controls the flow of energy in the system according to the amount of energy available, and predicts the appropriate control set-points based on the energy (insolation) available by using an appropriate system model. Aspects of adaptation to the conditions of the system are also considered. Attention is given to a statistical analysis technique, the analysis inputs, the analysis procedure, and details regarding the basic control algorithm.

  19. Multivariate approaches for stability control of the olive oil reference materials for sensory analysis - part I: framework and fundamentals.

    PubMed

    Valverde-Som, Lucia; Ruiz-Samblás, Cristina; Rodríguez-García, Francisco P; Cuadros-Rodríguez, Luis

    2018-02-09

    Virgin olive oil is the only food product for which sensory analysis is regulated to classify it in different quality categories. To harmonize the results of the sensory method, the use of standards or reference materials is crucial. The stability of sensory reference materials is required to enable their suitable control, aiming to confirm that their specific target values are maintained on an ongoing basis. Currently, such stability is monitored by means of sensory analysis, and the sensory panels are in the paradoxical situation of controlling the standards that are devoted to controlling the panels. In the present study, several approaches based on similarity analysis are exploited. For each approach, the specific methodology to build a proper multivariate control chart to monitor the stability of the sensory properties is explained and discussed. The normalized Euclidean and Mahalanobis distances, the so-called nearness and hardiness indices respectively, have been defined as new similarity indices whose values range from 0 to 1. Also, the squared mean from Hotelling's T²-statistic and the Q²-statistic has been proposed as another similarity index. © 2018 Society of Chemical Industry.
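
    A minimal sketch of distance-based similarity indices of the kind described above: the Euclidean and Mahalanobis distances of a candidate sensory profile from a set of reference profiles, mapped onto a 0-1 scale. The attribute data and the particular 0-1 transformation below are illustrative assumptions, not the indices defined in the paper.

```python
# Minimal sketch: Euclidean and Mahalanobis distances of a candidate
# sensory profile to the mean of a reference set, rescaled to 0-1.
import numpy as np

rng = np.random.default_rng(3)
reference = rng.normal(5.0, 0.3, size=(15, 4))   # 15 sessions x 4 attributes
candidate = np.array([5.1, 4.8, 5.2, 5.0])

mu = reference.mean(axis=0)
cov = np.cov(reference, rowvar=False)

d_euclid = np.linalg.norm(candidate - mu)
d_mahal = np.sqrt((candidate - mu) @ np.linalg.inv(cov) @ (candidate - mu))

# one possible 0-1 scaling: similarity decays with distance (assumption)
nearness = 1.0 / (1.0 + d_euclid)
hardiness = 1.0 / (1.0 + d_mahal)
print(f"Euclidean {d_euclid:.2f} (nearness {nearness:.2f}), "
      f"Mahalanobis {d_mahal:.2f} (hardiness {hardiness:.2f})")
```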

  20. HDBStat!: a platform-independent software suite for statistical analysis of high dimensional biology data.

    PubMed

    Trivedi, Prinal; Edwards, Jode W; Wang, Jelai; Gadbury, Gary L; Srinivasasainagendra, Vinodh; Zakharkin, Stanislav O; Kim, Kyoungmi; Mehta, Tapan; Brand, Jacob P L; Patki, Amit; Page, Grier P; Allison, David B

    2005-04-06

    Many efforts in microarray data analysis are focused on providing tools and methods for the qualitative analysis of microarray data. HDBStat! (High-Dimensional Biology-Statistics) is a software package designed for analysis of high dimensional biology data such as microarray data. It was initially developed for the analysis of microarray gene expression data, but it can also be used for some applications in proteomics and other aspects of genomics. HDBStat! provides statisticians and biologists a flexible and easy-to-use interface to analyze complex microarray data using a variety of methods for data preprocessing, quality control analysis and hypothesis testing. Results generated from data preprocessing methods, quality control analysis and hypothesis testing methods are output in the form of Excel CSV tables, graphs and an Html report summarizing data analysis. HDBStat! is a platform-independent software that is freely available to academic institutions and non-profit organizations. It can be downloaded from our website http://www.soph.uab.edu/ssg_content.asp?id=1164.

  1. Sister chromatid exchanges and micronuclei analysis in lymphocytes of men exposed to simazine through drinking water.

    PubMed

    Suárez, Susanna; Rubio, Arantxa; Sueiro, Rosa Ana; Garrido, Joaquín

    2003-06-06

    In some cities of the autonomous community of Extremadura (south-west of Spain), levels of simazine from 10 to 30 ppm were detected in tap water. To analyse the possible effect of this herbicide, two biomarkers, sister chromatid exchanges (SCE) and micronuclei (MN), were used in peripheral blood lymphocytes from males exposed to simazine through drinking water. SCE and MN analysis failed to detect any statistically significant increase in the people exposed to simazine when compared with the controls. With respect to high frequency cells (HFC), a statistically significant difference was detected between exposed and control groups.

  2. A generic approach for examining the effectiveness of traffic control devices in school zones.

    PubMed

    Zhao, Xiaohua; Li, Jiahui; Ding, Han; Zhang, Guohui; Rong, Jian

    2015-09-01

    The effectiveness and performance of traffic control devices in school zones are impacted significantly by many factors, such as driver behavioral attributes, roadway geometric features, environmental characteristics, weather and visibility conditions, region-wide traffic regulations and policies, control modes, etc. When deploying traffic control devices in school zones, efforts are needed to clarify: (1) whether traffic control device installation is warranted; and (2) whether other devices effectively complement a given traffic control device and strengthen its effectiveness. In this study, a generic approach is developed to examine and evaluate the effectiveness of various traffic control devices deployed in school zones through driving simulator-based experiments. A Traffic Control Device Selection Model (TCDSM) is developed, and two representative school zones in Beijing are selected as the testbed for the driving simulation implementation to enhance its applicability. Statistical analyses are conducted to extract knowledge from test data recorded by a driving simulator. Multiple measures of effectiveness (MOEs) are developed and adopted, including average speed, relative speed difference, and standard deviation of acceleration, for traffic control device performance quantification. The experimental tests and analysis results reveal that the appropriateness of the installation of certain traffic control devices can be statistically verified by the TCDSM. The proposed approach provides a generic framework to assess traffic control device performance in school zones, including experiment design, statistical formulation, data analysis, simulation model implementation, data interpretation, and recommendation development. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Statistical analysis plan for the Alveolar Recruitment for Acute Respiratory Distress Syndrome Trial (ART). A randomized controlled trial

    PubMed Central

    Damiani, Lucas Petri; Berwanger, Otavio; Paisani, Denise; Laranjeira, Ligia Nasi; Suzumura, Erica Aranha; Amato, Marcelo Britto Passos; Carvalho, Carlos Roberto Ribeiro; Cavalcanti, Alexandre Biasi

    2017-01-01

    Background The Alveolar Recruitment for Acute Respiratory Distress Syndrome Trial (ART) is an international multicenter randomized pragmatic controlled trial with allocation concealment involving 120 intensive care units in Brazil, Argentina, Colombia, Italy, Poland, Portugal, Malaysia, Spain, and Uruguay. The primary objective of ART is to determine whether maximum stepwise alveolar recruitment associated with PEEP titration, adjusted according to the static compliance of the respiratory system (ART strategy), is able to increase 28-day survival in patients with acute respiratory distress syndrome compared to conventional treatment (ARDSNet strategy). Objective To describe the data management process and statistical analysis plan. Methods The statistical analysis plan was designed by the trial executive committee and reviewed and approved by the trial steering committee. We provide an overview of the trial design with a special focus on describing the primary (28-day survival) and secondary outcomes. We describe our data management process, data monitoring committee, interim analyses, and sample size calculation. We describe our planned statistical analyses for primary and secondary outcomes as well as pre-specified subgroup analyses. We also provide details for presenting results, including mock tables for baseline characteristics, adherence to the protocol and effect on clinical outcomes. Conclusion According to best trial practice, we report our statistical analysis plan and data management plan prior to locking the database and beginning analyses. We anticipate that this document will prevent analysis bias and enhance the utility of the reported results. Trial registration ClinicalTrials.gov number, NCT01374022. PMID:28977255

  4. A Finite-Volume "Shaving" Method for Interfacing NASA/DAO's Physical Space Statistical Analysis System to the Finite-Volume GCM with a Lagrangian Control-Volume Vertical Coordinate

    NASA Technical Reports Server (NTRS)

    Lin, Shian-Jiann; DaSilva, Arlindo; Atlas, Robert (Technical Monitor)

    2001-01-01

    Toward the development of a finite-volume Data Assimilation System (fvDAS), a consistent finite-volume methodology is developed for interfacing the NASA/DAO's Physical Space Statistical Analysis System (PSAS) to the joint NASA/NCAR finite volume CCM3 (fvCCM3). To take advantage of the Lagrangian control-volume vertical coordinate of the fvCCM3, a novel "shaving" method is applied to the lowest few model layers to reflect the surface pressure changes as implied by the final analysis. Analysis increments (from PSAS) to the upper air variables are then consistently put onto the Lagrangian layers as adjustments to the volume-mean quantities during the analysis cycle. This approach is demonstrated to be superior to the conventional method of using independently computed "tendency terms" for surface pressure and upper air prognostic variables.

  5. [Adequate application of quantitative and qualitative statistic analytic methods in acupuncture clinical trials].

    PubMed

    Tan, Ming T; Liu, Jian-ping; Lao, Lixing

    2012-08-01

    Recently, proper use of the statistical methods in traditional Chinese medicine (TCM) randomized controlled trials (RCTs) has received increased attention. Statistical inference based on hypothesis testing is the foundation of clinical trials and evidence-based medicine. In this article, the authors described the methodological differences between literature published in Chinese and Western journals in the design and analysis of acupuncture RCTs and the application of basic statistical principles. In China, qualitative analysis method has been widely used in acupuncture and TCM clinical trials, while the between-group quantitative analysis methods on clinical symptom scores are commonly used in the West. The evidence for and against these analytical differences were discussed based on the data of RCTs assessing acupuncture for pain relief. The authors concluded that although both methods have their unique advantages, quantitative analysis should be used as the primary analysis while qualitative analysis can be a secondary criterion for analysis. The purpose of this paper is to inspire further discussion of such special issues in clinical research design and thus contribute to the increased scientific rigor of TCM research.

  6. Transfusion Indication Threshold Reduction (TITRe2) randomized controlled trial in cardiac surgery: statistical analysis plan.

    PubMed

    Pike, Katie; Nash, Rachel L; Murphy, Gavin J; Reeves, Barnaby C; Rogers, Chris A

    2015-02-22

    The Transfusion Indication Threshold Reduction (TITRe2) trial is the largest randomized controlled trial to date to compare red blood cell transfusion strategies following cardiac surgery. This update presents the statistical analysis plan, detailing how the study will be analyzed and presented. The statistical analysis plan has been written following recommendations from the International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use, prior to database lock and the final analysis of trial data. Outlined analyses are in line with the Consolidated Standards of Reporting Trials (CONSORT). The study aims to randomize 2000 patients from 17 UK centres. Patients are randomized to either a restrictive (transfuse if haemoglobin concentration <7.5 g/dl) or liberal (transfuse if haemoglobin concentration <9 g/dl) transfusion strategy. The primary outcome is a binary composite outcome of any serious infectious or ischaemic event in the first 3 months following randomization. The statistical analysis plan details how non-adherence with the intervention, withdrawals from the study, and the study population will be derived and dealt with in the analysis. The planned analyses of the trial primary and secondary outcome measures are described in detail, including approaches taken to deal with multiple testing, model assumptions not being met and missing data. Details of planned subgroup and sensitivity analyses and pre-specified ancillary analyses are given, along with potential issues that have been identified with such analyses and possible approaches to overcome such issues. ISRCTN70923932 .

  7. Laryngospasm during emergency department ketamine sedation: a case-control study.

    PubMed

    Green, Steven M; Roback, Mark G; Krauss, Baruch

    2010-11-01

    The objective of this study was to assess predictors of emergency department (ED) ketamine-associated laryngospasm using case-control techniques. We performed a matched case-control analysis of a sample of 8282 ED ketamine sedations (including 22 occurrences of laryngospasm) assembled from 32 prior published series. We sequentially studied the association of each of 7 clinical variables with laryngospasm by assigning 4 controls to each case while matching for the remaining 6 variables. We then used univariate statistics and conditional logistic regression to analyze the matched sets. We found no statistical association of age, dose, oropharyngeal procedure, underlying physical illness, route, or coadministered anticholinergics with laryngospasm. Coadministered benzodiazepines showed a borderline association in the multivariate but not univariate analysis that was considered anomalous. This case-control analysis of the largest available sample of ED ketamine-associated laryngospasm did not demonstrate evidence of association with age, dose, or other clinical factors. Such laryngospasm seems to be idiosyncratic, and accordingly, clinicians administering ketamine must be prepared for its rapid identification and management. Given no evidence that they decrease the risk of laryngospasm, coadministered anticholinergics seem unnecessary.

  8. Assessment of trace elements levels in patients with Type 2 diabetes using multivariate statistical analysis.

    PubMed

    Badran, M; Morsy, R; Soliman, H; Elnimr, T

    2016-01-01

    Trace element metabolism has been reported to play specific roles in the pathogenesis and progression of diabetes mellitus. Due to the continuous increase in the population of patients with Type 2 diabetes (T2D), this study aims to assess the levels and inter-relationships of fasting blood glucose (FBG) and serum trace elements in Type 2 diabetic patients. This study was conducted on 40 Egyptian Type 2 diabetic patients and 36 healthy volunteers (Hospital of Tanta University, Tanta, Egypt). The blood serum was digested and then used to determine the levels of 24 trace elements using inductively coupled plasma mass spectrometry (ICP-MS). Multivariate statistical analyses based on correlation coefficients, cluster analysis (CA) and principal component analysis (PCA) were used to analyze the data. The results exhibited significant changes in FBG and in the serum levels of eight trace elements (Zn, Cu, Se, Fe, Mn, Cr, Mg, and As) in Type 2 diabetic patients relative to those of healthy controls. The multivariate statistical techniques were effective in reducing the experimental variables and in grouping the trace elements in patients into three clusters. The application of PCA revealed a distinct difference in the associations of trace elements and their clustering patterns between the control and patient groups, in particular for Mg, Fe, Cu, and Zn, which appeared to be the factors most closely related to Type 2 diabetes. Therefore, on the basis of this study, the contribution of trace element content in Type 2 diabetic patients can be determined and characterized using correlation relationships and multivariate statistical analysis, which confirms that the alteration of some essential trace metals may play a role in the development of diabetes mellitus. Copyright © 2015 Elsevier GmbH. All rights reserved.
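
    A minimal sketch of the multivariate step described above: standardizing serum trace-element concentrations and extracting principal components with scikit-learn. The element list follows the abstract, but the data are simulated and the code is not the authors' analysis.

```python
# Minimal sketch: PCA on standardized trace-element concentrations.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
elements = ["Zn", "Cu", "Se", "Fe", "Mn", "Cr", "Mg", "As"]
X = rng.normal(size=(76, len(elements)))   # 40 patients + 36 controls (simulated)

X_std = StandardScaler().fit_transform(X)
pca = PCA(n_components=3).fit(X_std)

print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 2))
for i, comp in enumerate(pca.components_, 1):
    top = [elements[j] for j in np.argsort(np.abs(comp))[::-1][:3]]
    print(f"PC{i} loads mainly on {top}")
```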

  9. Statistical and Economic Techniques for Site-specific Nematode Management.

    PubMed

    Liu, Zheng; Griffin, Terry; Kirkpatrick, Terrence L

    2014-03-01

    Recent advances in precision agriculture technologies and spatial statistics allow realistic, site-specific estimation of nematode damage to field crops and provide a platform for the site-specific delivery of nematicides within individual fields. This paper reviews the spatial statistical techniques that model correlations among neighboring observations and develop a spatial economic analysis to determine the potential of site-specific nematicide application. The spatial econometric methodology applied in the context of site-specific crop yield response contributes to closing the gap between data analysis and realistic site-specific nematicide recommendations and helps to provide a practical method of site-specifically controlling nematodes.

  10. 78 FR 8682 - Shipping Coordinating Committee; Notice of Committee Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-06

    ... the Protocol of 1978 (MARPOL 73/78); Casualty statistics and investigations; Harmonization of port State control activities; Port State Control (PSC) Guidelines on seafarers' hours of rest and PSC... control under the 2004 Ballast Water Management (BWM) Convention; Comprehensive analysis of difficulties...

  11. Diagnosis checking of statistical analysis in RCTs indexed in PubMed.

    PubMed

    Lee, Paul H; Tse, Andy C Y

    2017-11-01

    Statistical analysis is essential for reporting the results of randomized controlled trials (RCTs), as well as for evaluating their effectiveness. However, the validity of a statistical analysis also depends on whether the assumptions of that analysis are valid. The objective was to review all RCTs published in journals indexed in PubMed during December 2014 to provide a complete picture of how RCTs handle assumptions of statistical analysis. We reviewed all RCTs published in December 2014 that appeared in journals indexed in PubMed, using the Cochrane highly sensitive search strategy. The 2014 impact factors of the journals were used as proxies for their quality. The type of statistical analysis used and whether the assumptions of the analysis were tested were reviewed. In total, 451 papers were included. Of the 278 papers that reported a crude analysis for the primary outcomes, 31 (11·2%) reported whether the outcome was normally distributed. Of the 172 papers that reported an adjusted analysis for the primary outcomes, diagnosis checking was rarely conducted, with only 20%, 8·6% and 7% checked for the generalized linear model, Cox proportional hazards model and multilevel model, respectively. Study characteristics (study type, drug trial, funding sources, journal type and endorsement of CONSORT guidelines) were not associated with the reporting of diagnosis checking. Diagnosis checking of statistical analyses in RCTs published in PubMed-indexed journals was usually absent. Journals should provide guidelines about the reporting of diagnosis checking of assumptions. © 2017 Stichting European Society for Clinical Investigation Journal Foundation.
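
    As an illustration of the kind of assumption checking the review looked for, the sketch below tests normality of a continuous outcome per arm (crude analysis) and of the residuals of an adjusted linear model, on simulated trial data. The variable names and model are assumptions chosen for illustration; the review itself does not prescribe any particular test.

      import numpy as np
      from scipy import stats
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      # Hypothetical trial data: continuous outcome, treatment indicator and one covariate
      n = 120
      treat = rng.integers(0, 2, n)
      age = rng.normal(50, 10, n)
      outcome = 2.0 * treat + 0.1 * age + rng.normal(0, 1, n)

      # Crude analysis: check whether the outcome is (roughly) normally distributed per arm
      for arm in (0, 1):
          w, p = stats.shapiro(outcome[treat == arm])
          print(f"arm {arm}: Shapiro-Wilk p = {p:.3f}")

      # Adjusted analysis: fit a linear model and check the residuals instead of the raw outcome
      X = sm.add_constant(np.column_stack([treat, age]))
      fit = sm.OLS(outcome, X).fit()
      w, p = stats.shapiro(fit.resid)
      print(f"model residuals: Shapiro-Wilk p = {p:.3f}")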

  12. Gram-Negative Bacterial Wound Infections

    DTIC Science & Technology

    2014-05-01

    shows an effect with increasing concentration, however survival analysis does not show a significant difference between treatment groups and controls ...with 3 dead larvae in the 25 mM group compared to a single dead larva in the control group (Fig. 7). Probit analysis estimates the lethal...statistically different from that of the control group. The levels (CFU/g) of bacteria in lung tissue correlated with the survival curves. The median

  13. Diagnostic Value of Serum YKL-40 Level for Coronary Artery Disease: A Meta-Analysis.

    PubMed

    Song, Chun-Li; Bin-Li; Diao, Hong-Ying; Wang, Jiang-Hua; Shi, Yong-fei; Lu, Yang; Wang, Guan; Guo, Zi-Yuan; Li, Yang-Xue; Liu, Jian-Gen; Wang, Jin-Peng; Zhang, Ji-Chang; Zhao, Zhuo; Liu, Yi-Hang; Li, Ying; Cai, Dan; Li, Qian

    2016-01-01

    This meta-analysis aimed to identify the value of serum YKL-40 level for the diagnosis of coronary artery disease (CAD). By searching the following electronic databases: the Cochrane Library Database (Issue 12, 2013), Web of Science (1945 ∼ 2013), PubMed (1966 ∼ 2013), CINAHL (1982 ∼ 2013), EMBASE (1980 ∼ 2013), and the Chinese Biomedical Database (CBM; 1982 ∼ 2013), related articles were retrieved without any language restrictions. STATA statistical software (Version 12.0, Stata Corporation, College Station, TX) was used for the statistical analysis. The standardized mean difference (SMD) and its corresponding 95% confidence interval (95% CI) were calculated. Eleven clinical case-control studies that recruited 1,175 CAD patients and 1,261 healthy controls were selected for statistical analysis. The main findings of our meta-analysis showed that serum YKL-40 level in CAD patients was significantly higher than that in control subjects (SMD = 2.79, 95% CI = 1.73 ∼ 3.85, P < 0.001). Ethnicity-stratified analysis indicated a higher serum YKL-40 level in CAD patients than in control subjects in the Chinese, Korean, and Danish populations (China: SMD = 2.97, 95% CI = 1.21 ∼ 4.74, P = 0.001; Korea: SMD = 0.66, 95% CI = 0.17 ∼ 1.15, P = 0.008; Denmark: SMD = 1.85, 95% CI = 1.42 ∼ 2.29, P < 0.001; respectively), but not in the Turkish population (SMD = 4.52, 95% CI = -2.87 ∼ 11.91, P = 0.231). The present meta-analysis suggests that an elevated serum YKL-40 level may be used as a promising diagnostic tool for early identification of CAD.
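
    The pooling behind an SMD meta-analysis of this kind can be sketched with a DerSimonian-Laird random-effects model, as below. The per-study effect sizes and variances here are invented for illustration and are not the eleven studies analysed above.

      import numpy as np

      def dersimonian_laird(effects, variances):
          """Random-effects pooled estimate and 95% CI for per-study effect sizes (e.g. SMDs)."""
          y = np.asarray(effects, dtype=float)
          v = np.asarray(variances, dtype=float)
          w = 1.0 / v                                  # fixed-effect weights
          y_fe = np.sum(w * y) / np.sum(w)
          Q = np.sum(w * (y - y_fe) ** 2)              # Cochran's Q
          df = len(y) - 1
          c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
          tau2 = max(0.0, (Q - df) / c)                # between-study variance
          w_re = 1.0 / (v + tau2)
          y_re = np.sum(w_re * y) / np.sum(w_re)
          se = np.sqrt(1.0 / np.sum(w_re))
          return y_re, (y_re - 1.96 * se, y_re + 1.96 * se), tau2

      # Illustrative per-study SMDs (cases vs controls) with their variances
      smd = [2.1, 0.7, 1.9, 3.0, 1.4]
      var = [0.10, 0.05, 0.08, 0.30, 0.12]
      pooled, ci, tau2 = dersimonian_laird(smd, var)
      print(f"pooled SMD = {pooled:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f}), tau^2 = {tau2:.2f}")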

  14. [Statistical process control applied to intensity modulated radiotherapy pretreatment controls with portal dosimetry].

    PubMed

    Villani, N; Gérard, K; Marchesi, V; Huger, S; François, P; Noël, A

    2010-06-01

    The first purpose of this study was to illustrate the contribution of statistical process control to improved safety in intensity-modulated radiotherapy (IMRT) treatments. This improvement is achieved by controlling the dose delivery process, characterized by the pretreatment quality control results, which requires putting the portal dosimetry measurements under statistical control (the ionisation chamber measurements were already monitored with statistical process control tools). The second objective was to determine whether portal dosimetry could be substituted for the ionisation chamber in order to optimize the time devoted to pretreatment quality control. At the Alexis-Vautrin center, pretreatment quality controls in IMRT for prostate and head and neck treatments were performed for each beam of each patient. These controls were made with an ionisation chamber, which is the reference detector for absolute dose measurement, and with portal dosimetry for the verification of the dose distribution. Statistical process control is a statistical analysis method, originating in industry, used to control and improve the quality of the studied process. It uses graphical tools, such as control charts, to follow up the process and warn the operator in case of failure, and quantitative tools to evaluate the ability of the process to respect guidelines: the capability study. The study was performed on 450 head and neck beams and on 100 prostate beams. Control charts of the mean and standard deviation, showing both slow, weak drifts and strong, fast drifts, were established and revealed a special cause that had been introduced (a manual shift of the leaf gap of the multileaf collimator). The correlation between the dose measured at one point with the EPID and with the ionisation chamber was evaluated at more than 97%, and cases of disagreement between the two measurements were identified. The study demonstrated the feasibility of reducing the time devoted to pretreatment controls by substituting the EPID measurements for those of the ionisation chamber, and showed that statistical process control monitoring of the data provided a guarantee of safety. 2010 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.
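
    A minimal sketch of the kind of control chart used to put per-beam pretreatment measurements under statistical control is an individuals (X) chart with moving-range-based limits, as below; the QA values are invented, and the study described above also used capability indices and separate charts for the standard deviation.

      import numpy as np

      # Hypothetical per-beam QA results: relative dose difference (%) between EPID and plan
      x = np.array([0.4, -0.2, 0.1, 0.6, -0.5, 0.3, 0.0, 0.2, -0.1, 3.4, 0.5, -0.3])

      # Individuals chart: centre line = mean, limits = mean +/- 2.66 * mean moving range
      mr = np.abs(np.diff(x))                 # moving ranges of successive points
      centre = x.mean()
      sigma_hat = mr.mean() / 1.128           # d2 = 1.128 for subgroups of size 2
      ucl = centre + 3 * sigma_hat
      lcl = centre - 3 * sigma_hat

      out_of_control = np.where((x > ucl) | (x < lcl))[0]
      print(f"centre = {centre:.2f}, LCL = {lcl:.2f}, UCL = {ucl:.2f}")
      print("out-of-control beams (indices):", out_of_control)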

  15. Statistical process management: An essential element of quality improvement

    NASA Astrophysics Data System (ADS)

    Buckner, M. R.

    Successful quality improvement requires a balanced program involving the three elements that control quality: organization, people and technology. The focus of the SPC/SPM User's Group is to advance the technology component of Total Quality by networking within the Group and by providing an outreach within Westinghouse to foster the appropriate use of statistical techniques to achieve Total Quality. SPM encompasses the disciplines by which a process is measured against its intrinsic design capability, in the face of measurement noise and other obscuring variability. SPM tools facilitate decisions about the process that generated the data. SPM deals typically with manufacturing processes, but with some flexibility of definition and technique it accommodates many administrative processes as well. The techniques of SPM are those of Statistical Process Control, Statistical Quality Control, Measurement Control, and Experimental Design. In addition, techniques such as job and task analysis, and concurrent engineering are important elements of systematic planning and analysis that are needed early in the design process to ensure success. The SPC/SPM User's Group is endeavoring to achieve its objectives by sharing successes that have occurred within members' own Westinghouse departments as well as within other US and foreign industry. In addition, failures are reviewed to establish lessons learned in order to improve future applications. In broader terms, the Group is interested in making SPM the accepted way of doing business within Westinghouse.

  16. Marketing Analysis and Strategy for a Small Business in the Beekeeping Industry.

    DTIC Science & Technology

    1980-08-18

    segment has opportunities associated with it that may be profitably ... Philip Kotler, "Marketing Management: Analysis, Planning, and Control," Prentice... BIBLIOGRAPHY: Abel, Derek and John Hammond. Strategic Market Planning. Englewood Cliffs, N.J.: Prentice-Hall, Inc., 1979. Kotler, Philip. Marketing Management: Analysis, Planning, and Control. Englewood Cliffs, N.J.: Prentice-Hall, Inc., 1976. Ott, Hyman. Introduction to Statistical

  17. Measuring and improving the quality of postoperative epidural analgesia for major abdominal surgery using statistical process control charts.

    PubMed

    Duncan, Fiona; Haigh, Carol

    2013-10-01

    To explore and improve the quality of continuous epidural analgesia for pain relief using Statistical Process Control tools. Measuring the quality of pain management interventions is complex. Intermittent audits do not accurately capture the results of quality improvement initiatives. The failure rate for one intervention, epidural analgesia, is approximately 30% in everyday practice, so it is an important area for improvement. Continuous measurement and analysis are required to understand the multiple factors involved in providing effective pain relief. A process control and quality improvement design was used. Routine, prospectively acquired data collection started in 2006. Patients were asked about their pain and the side effects of treatment. Statistical Process Control methods were applied for continuous data analysis. A multidisciplinary group worked together to identify reasons for variation in the data and instigated ideas for improvement. The key measure for improvement was a reduction in the percentage of patients with an epidural in severe pain. The baseline control charts illustrated the recorded variation in the rate of several processes and outcomes for 293 surgical patients. The mean pain score on a verbal numerical rating scale (VNRS) was four. There was no special-cause variation when data were stratified by surgeon, clinical area or patients who had experienced pain before surgery. Fifty-seven per cent of patients were hypotensive on the first day after surgery. We were able to demonstrate a significant improvement in the failure rate of epidurals as the project continued with quality improvement interventions. Statistical Process Control is a useful tool for measuring and improving the quality of pain management. The applications of Statistical Process Control methods offer the potential to learn more about the process of change and outcomes in an Acute Pain Service, both locally and nationally. We have been able to develop measures for improvement and benchmarking in routine care that have led to the establishment of a national pain registry. © 2013 Blackwell Publishing Ltd.
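
    Because the key measure here is a proportion (patients with an epidural in severe pain), one natural Statistical Process Control tool is a p-chart. The sketch below builds one from hypothetical monthly audit counts; the numbers are illustrative and not the study's data.

      import numpy as np

      # Hypothetical monthly audits: number of epidural patients and number in severe pain
      n = np.array([25, 30, 22, 28, 26, 31, 24, 27])
      severe = np.array([6, 7, 5, 9, 4, 6, 3, 2])

      p = severe / n
      p_bar = severe.sum() / n.sum()                      # overall proportion (centre line)
      sigma = np.sqrt(p_bar * (1 - p_bar) / n)            # per-month standard error
      ucl = p_bar + 3 * sigma
      lcl = np.clip(p_bar - 3 * sigma, 0, None)           # proportions cannot be negative

      for month, (pi, lo, hi) in enumerate(zip(p, lcl, ucl), start=1):
          flag = "special cause" if (pi > hi or pi < lo) else "common cause"
          print(f"month {month}: p = {pi:.2f}, limits = [{lo:.2f}, {hi:.2f}] -> {flag}")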

  18. Performance analysis of different tuning rules for an isothermal CSTR using integrated EPC and SPC

    NASA Astrophysics Data System (ADS)

    Roslan, A. H.; Karim, S. F. Abd; Hamzah, N.

    2018-03-01

    This paper demonstrates the integration of Engineering Process Control (EPC) and Statistical Process Control (SPC) for the control of the product concentration of an isothermal CSTR. The objectives of this study are to evaluate the performance of the Ziegler-Nichols (Z-N), Direct Synthesis (DS) and Internal Model Control (IMC) tuning methods and to determine the most effective method for this process. The simulation model was obtained from past literature and reconstructed in MATLAB/Simulink to evaluate the process response. Additionally, the process stability, capability and normality were analyzed using Process Capability Sixpack reports in Minitab. Based on the results, DS gives the best response, having the smallest rise time, settling time, overshoot, undershoot, Integral Time Absolute Error (ITAE) and Integral Square Error (ISE). Based on the statistical analysis, DS also emerges as the best tuning method, exhibiting the highest process stability and capability.
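
    The error integrals used to rank the tuning methods can be computed directly from a simulated closed-loop step response, as in the sketch below. The response curve here is a made-up second-order-like trace standing in for the controlled concentration, not the paper's Simulink model.

      import numpy as np

      # Hypothetical closed-loop step response (setpoint = 1.0), sampled every 0.1 s
      dt = 0.1
      t = np.arange(0, 20, dt)
      y = 1 - np.exp(-0.6 * t) * np.cos(1.2 * t)      # stand-in for the simulated concentration
      e = 1.0 - y                                     # control error

      ise = np.sum(e ** 2) * dt                   # Integral of Squared Error
      itae = np.sum(t * np.abs(e)) * dt           # Integral of Time-weighted Absolute Error
      overshoot = max(0.0, y.max() - 1.0) * 100   # percent overshoot

      print(f"ISE = {ise:.3f}, ITAE = {itae:.3f}, overshoot = {overshoot:.1f}%")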

  19. A database application for pre-processing, storage and comparison of mass spectra derived from patients and controls

    PubMed Central

    Titulaer, Mark K; Siccama, Ivar; Dekker, Lennard J; van Rijswijk, Angelique LCT; Heeren, Ron MA; Sillevis Smitt, Peter A; Luider, Theo M

    2006-01-01

    Background Statistical comparison of peptide profiles in biomarker discovery requires fast, user-friendly software for high-throughput data analysis. Important features are flexibility in changing input variables and statistical analysis of peptides that are differentially expressed between patient and control groups. In addition, integration of the mass spectrometry data with the results of other experiments, such as microarray analysis, and with information from other databases requires central storage of the profile matrix, where protein IDs can be added to peptide masses of interest. Results A new database application is presented to detect and identify significantly differentially expressed peptides in peptide profiles obtained from body fluids of patient and control groups. The presented modular software is capable of central storage of mass spectra, and this results in fast analysis. The software architecture consists of 4 pillars: 1) a Graphical User Interface written in Java, 2) a MySQL database, which contains all metadata, such as experiment numbers and sample codes, 3) an FTP (File Transfer Protocol) server to store all raw mass spectrometry files and processed data, and 4) the software package R, which is used for modular statistical calculations, such as the Wilcoxon-Mann-Whitney rank sum test. Statistical analysis by the Wilcoxon-Mann-Whitney test in R demonstrates that peptide profiles of two patient groups, 1) breast cancer patients with leptomeningeal metastases and 2) prostate cancer patients with end-stage disease, can be distinguished from those of control groups. Conclusion The database application is able to distinguish patient Matrix Assisted Laser Desorption Ionization (MALDI-TOF) peptide profiles from those of control groups using large datasets. The modular architecture of the application makes it possible to adapt it to also handle large-sized data from MS/MS and Fourier Transform Ion Cyclotron Resonance (FT-ICR) mass spectrometry experiments. It is expected that the higher resolution and mass accuracy of FT-ICR mass spectrometry prevent the clustering of peaks of different peptides and allow the identification of differentially expressed proteins from the peptide profiles. PMID:16953879

  20. A database application for pre-processing, storage and comparison of mass spectra derived from patients and controls.

    PubMed

    Titulaer, Mark K; Siccama, Ivar; Dekker, Lennard J; van Rijswijk, Angelique L C T; Heeren, Ron M A; Sillevis Smitt, Peter A; Luider, Theo M

    2006-09-05

    Statistical comparison of peptide profiles in biomarker discovery requires fast, user-friendly software for high-throughput data analysis. Important features are flexibility in changing input variables and statistical analysis of peptides that are differentially expressed between patient and control groups. In addition, integration of the mass spectrometry data with the results of other experiments, such as microarray analysis, and with information from other databases requires central storage of the profile matrix, where protein IDs can be added to peptide masses of interest. A new database application is presented to detect and identify significantly differentially expressed peptides in peptide profiles obtained from body fluids of patient and control groups. The presented modular software is capable of central storage of mass spectra, and this results in fast analysis. The software architecture consists of 4 pillars: 1) a Graphical User Interface written in Java, 2) a MySQL database, which contains all metadata, such as experiment numbers and sample codes, 3) an FTP (File Transfer Protocol) server to store all raw mass spectrometry files and processed data, and 4) the software package R, which is used for modular statistical calculations, such as the Wilcoxon-Mann-Whitney rank sum test. Statistical analysis by the Wilcoxon-Mann-Whitney test in R demonstrates that peptide profiles of two patient groups, 1) breast cancer patients with leptomeningeal metastases and 2) prostate cancer patients with end-stage disease, can be distinguished from those of control groups. The database application is able to distinguish patient Matrix Assisted Laser Desorption Ionization (MALDI-TOF) peptide profiles from those of control groups using large datasets. The modular architecture of the application makes it possible to adapt it to also handle large-sized data from MS/MS and Fourier Transform Ion Cyclotron Resonance (FT-ICR) mass spectrometry experiments. It is expected that the higher resolution and mass accuracy of FT-ICR mass spectrometry prevent the clustering of peaks of different peptides and allow the identification of differentially expressed proteins from the peptide profiles.
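
    The core comparison described in these two records, a Wilcoxon-Mann-Whitney rank sum test per peptide mass between patient and control profiles, can be sketched as below. The records describe it being run in R; this equivalent uses SciPy, and the intensities are simulated rather than taken from the application's database.

      import numpy as np
      from scipy.stats import mannwhitneyu

      rng = np.random.default_rng(2)
      # Hypothetical peak intensities for one peptide mass in patient and control profiles
      patients = rng.lognormal(mean=1.4, sigma=0.4, size=30)
      controls = rng.lognormal(mean=1.0, sigma=0.4, size=30)

      u, p = mannwhitneyu(patients, controls, alternative="two-sided")
      print(f"Mann-Whitney U = {u:.1f}, p = {p:.4f}")
      # In a full profile matrix this test would be repeated for every peptide mass,
      # after which the p-values need a multiple-testing correction.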

  1. Sex genes for genomic analysis in human brain: internal controls for comparison of probe level data extraction.

    PubMed Central

    Galfalvy, Hanga C; Erraji-Benchekroun, Loubna; Smyrniotopoulos, Peggy; Pavlidis, Paul; Ellis, Steven P; Mann, J John; Sibille, Etienne; Arango, Victoria

    2003-01-01

    Background Genomic studies of complex tissues pose unique analytical challenges for assessment of data quality, performance of statistical methods used for data extraction, and detection of differentially expressed genes. Ideally, to assess the accuracy of gene expression analysis methods, one needs a set of genes which are known to be differentially expressed in the samples and which can be used as a "gold standard". We introduce the idea of using sex-chromosome genes as an alternative to spiked-in control genes or simulations for assessment of microarray data and analysis methods. Results Expression of sex-chromosome genes was used as a set of true internal biological controls to compare alternate probe-level data extraction algorithms (Microarray Suite 5.0 [MAS5.0], Model Based Expression Index [MBEI] and Robust Multi-array Average [RMA]), to assess microarray data quality and to establish some statistical guidelines for analyzing large-scale gene expression. These approaches were implemented on a large new dataset of human brain samples. RMA-generated gene expression values were markedly less variable and more reliable than MAS5.0 and MBEI-derived values. A statistical technique controlling the false discovery rate was applied to adjust for multiple testing, as an alternative to the Bonferroni method, and showed no evidence of false negative results. Fourteen probesets, representing nine Y- and two X-chromosome linked genes, displayed significant sex differences in brain prefrontal cortex gene expression. Conclusion In this study, we have demonstrated the use of sex genes as true biological internal controls for genomic analysis of complex tissues, and suggested analytical guidelines for testing alternate oligonucleotide microarray data extraction protocols and for adjusting statistical analyses of differentially expressed genes for multiple testing. Our results also provided evidence for sex differences in gene expression in the brain prefrontal cortex, supporting the notion of a putative direct role of sex-chromosome genes in differentiation and maintenance of sexual dimorphism of the central nervous system. Importantly, these analytical approaches are applicable to all microarray studies that include male and female human or animal subjects. PMID:12962547

  2. Sex genes for genomic analysis in human brain: internal controls for comparison of probe level data extraction.

    PubMed

    Galfalvy, Hanga C; Erraji-Benchekroun, Loubna; Smyrniotopoulos, Peggy; Pavlidis, Paul; Ellis, Steven P; Mann, J John; Sibille, Etienne; Arango, Victoria

    2003-09-08

    Genomic studies of complex tissues pose unique analytical challenges for assessment of data quality, performance of statistical methods used for data extraction, and detection of differentially expressed genes. Ideally, to assess the accuracy of gene expression analysis methods, one needs a set of genes which are known to be differentially expressed in the samples and which can be used as a "gold standard". We introduce the idea of using sex-chromosome genes as an alternative to spiked-in control genes or simulations for assessment of microarray data and analysis methods. Expression of sex-chromosome genes was used as a set of true internal biological controls to compare alternate probe-level data extraction algorithms (Microarray Suite 5.0 [MAS5.0], Model Based Expression Index [MBEI] and Robust Multi-array Average [RMA]), to assess microarray data quality and to establish some statistical guidelines for analyzing large-scale gene expression. These approaches were implemented on a large new dataset of human brain samples. RMA-generated gene expression values were markedly less variable and more reliable than MAS5.0 and MBEI-derived values. A statistical technique controlling the false discovery rate was applied to adjust for multiple testing, as an alternative to the Bonferroni method, and showed no evidence of false negative results. Fourteen probesets, representing nine Y- and two X-chromosome linked genes, displayed significant sex differences in brain prefrontal cortex gene expression. In this study, we have demonstrated the use of sex genes as true biological internal controls for genomic analysis of complex tissues, and suggested analytical guidelines for testing alternate oligonucleotide microarray data extraction protocols and for adjusting statistical analyses of differentially expressed genes for multiple testing. Our results also provided evidence for sex differences in gene expression in the brain prefrontal cortex, supporting the notion of a putative direct role of sex-chromosome genes in differentiation and maintenance of sexual dimorphism of the central nervous system. Importantly, these analytical approaches are applicable to all microarray studies that include male and female human or animal subjects.
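
    The multiple-testing step described above, controlling the false discovery rate rather than using Bonferroni, can be sketched with the Benjamini-Hochberg procedure as below; the p-values are simulated, with a handful of genuinely small ones standing in for the sex-linked probesets.

      import numpy as np
      from statsmodels.stats.multitest import multipletests

      rng = np.random.default_rng(3)
      # Simulated p-values: most genes null (uniform), a few truly sex-linked (very small p)
      pvals = np.concatenate([rng.uniform(size=2000), rng.uniform(0, 1e-4, size=14)])

      rej_bonf, _, _, _ = multipletests(pvals, alpha=0.05, method="bonferroni")
      rej_fdr, _, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")

      print("significant after Bonferroni:", rej_bonf.sum())
      print("significant after Benjamini-Hochberg FDR:", rej_fdr.sum())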

  3. Statistical plant set estimation using Schroeder-phased multisinusoidal input design

    NASA Technical Reports Server (NTRS)

    Bayard, D. S.

    1992-01-01

    A frequency domain method is developed for plant set estimation. The estimation of a plant 'set' rather than a point estimate is required to support many methods of modern robust control design. The approach here is based on using a Schroeder-phased multisinusoid input design which has the special property of placing input energy only at the discrete frequency points used in the computation. A detailed analysis of the statistical properties of the frequency domain estimator is given, leading to exact expressions for the probability distribution of the estimation error, and many important properties. It is shown that, for any nominal parametric plant estimate, one can use these results to construct an overbound on the additive uncertainty to any prescribed statistical confidence. The 'soft' bound thus obtained can be used to replace 'hard' bounds presently used in many robust control analysis and synthesis methods.
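
    A Schroeder-phased multisine of the kind described places equal-amplitude energy at a chosen set of harmonics while keeping the peak-to-RMS ratio low by setting the phases to phi_k = -pi k(k-1)/N. The sketch below generates such a signal; the harmonic count, fundamental frequency and sampling rate are arbitrary illustration values, not those of the reported design.

      import numpy as np

      def schroeder_multisine(n_harmonics, f0, fs, duration, amplitude=1.0):
          """Sum of equal-amplitude harmonics k*f0 with Schroeder phases phi_k = -pi*k*(k-1)/N."""
          t = np.arange(0, duration, 1.0 / fs)
          u = np.zeros_like(t)
          for k in range(1, n_harmonics + 1):
              phase = -np.pi * k * (k - 1) / n_harmonics
              u += amplitude * np.cos(2 * np.pi * k * f0 * t + phase)
          return t, u

      # Example: 16 harmonics of 0.5 Hz, sampled at 100 Hz over one full period (2 s)
      t, u = schroeder_multisine(n_harmonics=16, f0=0.5, fs=100.0, duration=2.0)
      crest_factor = np.max(np.abs(u)) / np.sqrt(np.mean(u ** 2))
      print(f"crest factor = {crest_factor:.2f}")   # much lower than for zero-phase harmonics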

  4. Application of statistical process control and process capability analysis procedures in orbiter processing activities at the Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Safford, Robert R.; Jackson, Andrew E.; Swart, William W.; Barth, Timothy S.

    1994-01-01

    Successful ground processing at KSC requires that flight hardware and ground support equipment conform to specifications at tens of thousands of checkpoints. Knowledge of conformance is an essential requirement for launch. That knowledge of conformance at every requisite point does not, however, enable identification of past problems with equipment, or potential problem areas. This paper describes how the introduction of Statistical Process Control and Process Capability Analysis identification procedures into existing shuttle processing procedures can enable identification of potential problem areas and candidates for improvements to increase processing performance measures. Results of a case study describing application of the analysis procedures to Thermal Protection System processing are used to illustrate the benefits of the approaches described in the paper.

  5. EEG Correlates of Fluctuation in Cognitive Performance in an Air Traffic Control Task

    DTIC Science & Technology

    2014-11-01

    using non-parametric statistical analysis to identify neurophysiological patterns due to the time-on-task effect. Significant changes in EEG power... EEG, Cognitive Performance, Power Spectral Analysis, Non-Parametric Analysis. Document is available to the public through the Internet...

  6. Workers' Participation and the Distribution of Control as Perceived by Members of Ten German Companies.

    ERIC Educational Resources Information Center

    Bartolke, Klaus; And Others

    1982-01-01

    A survey of 601 managers and workers in 10 German manufacturing companies studied the implications of workers' participation for the exercise of control. Statistical analysis of data on control over work environments, production organization, personnel, and finance indicated that, in more participative companies, distribution of control is more…

  7. Tsallis statistics and neurodegenerative disorders

    NASA Astrophysics Data System (ADS)

    Iliopoulos, Aggelos C.; Tsolaki, Magdalini; Aifantis, Elias C.

    2016-08-01

    In this paper, we perform statistical analysis of time series deriving from four neurodegenerative disorders, namely epilepsy, amyotrophic lateral sclerosis (ALS), Parkinson's disease (PD), Huntington's disease (HD). The time series are concerned with electroencephalograms (EEGs) of healthy and epileptic states, as well as gait dynamics (in particular stride intervals) of the ALS, PD and HDs. We study data concerning one subject for each neurodegenerative disorder and one healthy control. The analysis is based on Tsallis non-extensive statistical mechanics and in particular on the estimation of Tsallis q-triplet, namely {qstat, qsen, qrel}. The deviation of Tsallis q-triplet from unity indicates non-Gaussian statistics and long-range dependencies for all time series considered. In addition, the results reveal the efficiency of Tsallis statistics in capturing differences in brain dynamics between healthy and epileptic states, as well as differences between ALS, PD, HDs from healthy control subjects. The results indicate that estimations of Tsallis q-indices could be used as possible biomarkers, along with others, for improving classification and prediction of epileptic seizures, as well as for studying the gait complex dynamics of various diseases providing new insights into severity, medications and fall risk, improving therapeutic interventions.
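
    A minimal sketch of one element of the q-triplet, q_stat, is to fit a q-Gaussian to the empirical distribution of normalized increments, as below. The data are synthetic heavy-tailed noise rather than the EEG or gait series studied above, and the paper's full q-triplet estimation involves further steps not shown here.

      import numpy as np
      from scipy.optimize import curve_fit

      def q_gaussian(x, a, beta, q):
          """Unnormalized q-Gaussian: a * [1 - (1 - q) * beta * x^2]^(1/(1-q)), for 1 < q < 3."""
          base = 1.0 - (1.0 - q) * beta * x ** 2
          return a * np.power(np.clip(base, 1e-12, None), 1.0 / (1.0 - q))

      rng = np.random.default_rng(4)
      # Synthetic heavy-tailed "increments" standing in for EEG or stride-interval differences
      incr = rng.standard_t(df=4, size=20000)
      incr = (incr - incr.mean()) / incr.std()

      hist, edges = np.histogram(incr, bins=101, range=(-6, 6), density=True)
      centers = 0.5 * (edges[:-1] + edges[1:])

      popt, _ = curve_fit(q_gaussian, centers, hist, p0=[0.4, 1.0, 1.5],
                          bounds=([0, 1e-3, 1.0001], [10, 100, 2.999]))
      print(f"estimated q_stat = {popt[2]:.2f}  (q = 1 recovers the ordinary Gaussian)")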

  8. Statistical analysis of nonmonotonic dose-response relationships: research design and analysis of nasal cell proliferation in rats exposed to formaldehyde.

    PubMed

    Gaylor, David W; Lutz, Werner K; Conolly, Rory B

    2004-01-01

    Statistical analyses of nonmonotonic dose-response curves are proposed, experimental designs to detect low-dose effects of J-shaped curves are suggested, and sample sizes are provided. For quantal data such as cancer incidence rates, much larger numbers of animals are required than for continuous data such as biomarker measurements. For example, 155 animals per dose group are required to have at least an 80% chance of detecting a decrease from a 20% incidence in controls to an incidence of 10% at a low dose. For a continuous measurement, only 14 animals per group are required to have at least an 80% chance of detecting a change of the mean by one standard deviation of the control group. Experimental designs based on three dose groups plus controls are discussed to detect nonmonotonicity or to estimate the zero equivalent dose (ZED), i.e., the dose that produces a response equal to the average response in the controls. Cell proliferation data in the nasal respiratory epithelium of rats exposed to formaldehyde by inhalation are used to illustrate the statistical procedures. Statistically significant departures from a monotonic dose response were obtained for time-weighted average labeling indices with an estimated ZED at a formaldehyde dose of 5.4 ppm, with a lower 95% confidence limit of 2.7 ppm. It is concluded that demonstration of a statistically significant bi-phasic dose-response curve, together with estimation of the resulting ZED, could serve as a point of departure in establishing a reference dose for low-dose risk assessment.
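
    The two sample-size statements quoted above (quantal incidence versus continuous biomarker endpoints) can be approximated with standard power calculations, as sketched below. The exact figures depend on the test, sidedness and continuity corrections assumed, so this reproduces the contrast in magnitude rather than the paper's exact numbers.

      from statsmodels.stats.power import NormalIndPower, TTestIndPower
      from statsmodels.stats.proportion import proportion_effectsize

      # Quantal endpoint: detect a drop from 20% incidence (controls) to 10% (low dose)
      h = proportion_effectsize(0.20, 0.10)
      n_quantal = NormalIndPower().solve_power(effect_size=h, power=0.80, alpha=0.05)

      # Continuous endpoint: detect a shift of one control-group standard deviation
      n_continuous = TTestIndPower().solve_power(effect_size=1.0, power=0.80, alpha=0.05)

      print(f"animals per group, quantal endpoint:    {n_quantal:.0f}")
      print(f"animals per group, continuous endpoint: {n_continuous:.0f}")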

  9. Multisample adjusted U-statistics that account for confounding covariates.

    PubMed

    Satten, Glen A; Kong, Maiying; Datta, Somnath

    2018-06-19

    Multisample U-statistics encompass a wide class of test statistics that allow the comparison of 2 or more distributions. U-statistics are especially powerful because they can be applied to both numeric and nonnumeric data, eg, ordinal and categorical data where a pairwise similarity or distance-like measure between categories is available. However, when comparing the distribution of a variable across 2 or more groups, observed differences may be due to confounding covariates. For example, in a case-control study, the distribution of exposure in cases may differ from that in controls entirely because of variables that are related to both exposure and case status and are distributed differently among case and control participants. We propose to use individually reweighted data (ie, using the stratification score for retrospective data or the propensity score for prospective data) to construct adjusted U-statistics that can test the equality of distributions across 2 (or more) groups in the presence of confounding covariates. Asymptotic normality of our adjusted U-statistics is established and a closed form expression of their asymptotic variance is presented. The utility of our approach is demonstrated through simulation studies, as well as in an analysis of data from a case-control study conducted among African-Americans, comparing whether the similarity in haplotypes (ie, sets of adjacent genetic loci inherited from the same parent) occurring in a case and a control participant differs from the similarity in haplotypes occurring in 2 control participants. Copyright © 2018 John Wiley & Sons, Ltd.
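
    The idea of individually reweighting observations before forming a two-sample U-statistic can be sketched as below: propensity scores from a logistic regression give inverse-probability weights, which then weight a Mann-Whitney-type kernel. This is an illustrative simplification, not the authors' estimator or its variance formula, and all data are simulated.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(5)
      n = 400
      covar = rng.normal(size=(n, 2))                            # confounding covariates
      group = rng.binomial(1, 1 / (1 + np.exp(-covar[:, 0])))    # group membership depends on covariates
      x = covar[:, 0] + rng.normal(size=n)                       # outcome depends on the same covariate

      # Propensity of being in group 1 given the covariates, then inverse-probability weights
      ps = LogisticRegression().fit(covar, group).predict_proba(covar)[:, 1]
      w = np.where(group == 1, 1 / ps, 1 / (1 - ps))

      a, wa = x[group == 1], w[group == 1]
      b, wb = x[group == 0], w[group == 0]

      # Weighted Mann-Whitney-type kernel: indicator(X_a > X_b) + 0.5 * indicator(X_a = X_b)
      kernel = (a[:, None] > b[None, :]) + 0.5 * (a[:, None] == b[None, :])
      u_adj = np.sum(wa[:, None] * wb[None, :] * kernel) / (wa.sum() * wb.sum())
      print(f"adjusted U (approx. P(X1 > X0)) = {u_adj:.3f}")   # near 0.5 once confounding is removed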

  10. Statistical analysis of sparse infection data and its implications for retroviral treatment trials in primates.

    PubMed Central

    Spouge, J L

    1992-01-01

    Reports on retroviral primate trials rarely publish any statistical analysis. Present statistical methodology lacks appropriate tests for these trials and effectively discourages quantitative assessment. This paper describes the theory behind VACMAN, a user-friendly computer program that calculates statistics for in vitro and in vivo infectivity data. VACMAN's analysis applies to many retroviral trials using i.v. challenges and is valid whenever the viral dose-response curve has a particular shape. Statistics from actual i.v. retroviral trials illustrate some unappreciated principles of effective animal use: dilutions other than 1:10 can improve titration accuracy; infecting titration animals at the lowest doses possible can lower challenge doses; and finally, challenging test animals in small trials with more virus than controls safeguards against false successes, "reuses" animals, and strengthens experimental conclusions. The theory presented also explains the important concept of viral saturation, a phenomenon that may cause in vitro and in vivo titrations to agree for some retroviral strains and disagree for others. PMID:1323844

  11. Statistical analysis plan for the family-led rehabilitation after stroke in India (ATTEND) trial: A multicenter randomized controlled trial of a new model of stroke rehabilitation compared to usual care.

    PubMed

    Billot, Laurent; Lindley, Richard I; Harvey, Lisa A; Maulik, Pallab K; Hackett, Maree L; Murthy, Gudlavalleti Vs; Anderson, Craig S; Shamanna, Bindiganavale R; Jan, Stephen; Walker, Marion; Forster, Anne; Langhorne, Peter; Verma, Shweta J; Felix, Cynthia; Alim, Mohammed; Gandhi, Dorcas Bc; Pandian, Jeyaraj Durai

    2017-02-01

    Background In low- and middle-income countries, few patients receive organized rehabilitation after stroke, yet the burden of chronic diseases such as stroke is increasing in these countries. Affordable models of effective rehabilitation could have a major impact. The ATTEND trial is evaluating a family-led caregiver delivered rehabilitation program after stroke. Objective To publish the detailed statistical analysis plan for the ATTEND trial prior to trial unblinding. Methods Based upon the published registration and protocol, the blinded steering committee and management team, led by the trial statistician, have developed a statistical analysis plan. The plan has been informed by the chosen outcome measures, the data collection forms and knowledge of key baseline data. Results The resulting statistical analysis plan is consistent with best practice and will allow open and transparent reporting. Conclusions Publication of the trial statistical analysis plan reduces potential bias in trial reporting, and clearly outlines pre-specified analyses. Clinical Trial Registrations India CTRI/2013/04/003557; Australian New Zealand Clinical Trials Registry ACTRN1261000078752; Universal Trial Number U1111-1138-6707.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plemons, R.E.; Hopwood, W.H. Jr.; Hamilton, J.H.

    For a number of years the Oak Ridge Y-12 Plant Laboratory has been analyzing coal predominately for the utilities department of the Y-12 Plant. All laboratory procedures, except a Leco sulfur method which used the Leco Instruction Manual as a reference, were written based on the ASTM coal analyses. Sulfur is analyzed at the present time by two methods, gravimetric and Leco. The laboratory has two major endeavors for monitoring the quality of its coal analyses. (1) A control program by the Plant Statistical Quality Control Department. Quality Control submits one sample for every nine samples submitted by the utilities departments and the laboratory analyzes a control sample along with the utilities samples. (2) An exchange program with the DOE Coal Analysis Laboratory in Bruceton, Pennsylvania. The Y-12 Laboratory submits to the DOE Coal Laboratory, on even numbered months, a sample that Y-12 has analyzed. The DOE Coal Laboratory submits, on odd numbered months, one of their analyzed samples to the Y-12 Plant Laboratory to be analyzed. The results of these control and exchange programs are monitored not only by laboratory personnel, but also by Statistical Quality Control personnel who provide statistical evaluations. After analysis and reporting of results, all utilities samples are retained by the laboratory until the coal contracts have been settled. The utilities departments have responsibility for the initiation and preparation of the coal samples. The samples normally received by the laboratory have been ground to 4-mesh, reduced to 0.5-gallon quantities, and sealed in air-tight containers. Sample identification numbers and a Request for Analysis are generated by the utilities departments.

  13. A case study: application of statistical process control tool for determining process capability and sigma level.

    PubMed

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

    Statistical process control is the application of statistical methods to the measurement and analysis of process variation. Various regulatory authorities, such as the Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), the Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005, provide regulatory support for the application of statistical process control for better process control and understanding. In this study, risk assessment, normal probability distributions, control charts, and capability charts are employed for the selection of critical quality attributes and the determination of the normal probability distribution, statistical stability, and capability of the production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, the forecasting of critical quality attributes, sigma process capability, and process stability were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study points to areas where the application of quality improvement and quality risk assessment principles makes the achievement of six sigma-capable processes possible. Statistical process control is the most advantageous tool for determining the quality of any production process. This tool is new for the pharmaceutical tablet production process. In the case of pharmaceutical tablet production processes, the quality control parameters act as quality assessment parameters. Application of risk assessment provides selection of critical quality attributes among the quality control parameters. Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical process control study of the process. Interpretation of such a study provides information about stability, process variability, changing trends, and quantification of process ability against defective production. Comparative evaluation of critical quality attributes by Pareto charts identifies the least capable and most variable process, which is the candidate for improvement. Statistical process control thus proves to be an important tool for six sigma-capable process development and continuous quality improvement.
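
    The capability and sigma-level calculation at the heart of such a study can be sketched as below for a single critical quality attribute; the specification limits and assay values are invented for illustration.

      import numpy as np

      rng = np.random.default_rng(6)
      # Hypothetical tablet assay results (% label claim) with specification limits 95-105%
      assay = rng.normal(loc=100.3, scale=1.2, size=200)
      lsl, usl = 95.0, 105.0

      mu, sigma = assay.mean(), assay.std(ddof=1)
      cp = (usl - lsl) / (6 * sigma)                   # potential capability
      cpk = min(usl - mu, mu - lsl) / (3 * sigma)      # actual capability (accounts for centring)
      sigma_level = 3 * cpk                            # short-term sigma level: distance to the nearer limit, in sigmas

      print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}, sigma level = {sigma_level:.1f}")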

  14. Statistical testing and power analysis for brain-wide association study.

    PubMed

    Gong, Weikang; Wan, Lin; Lu, Wenlian; Ma, Liang; Cheng, Fan; Cheng, Wei; Grünewald, Stefan; Feng, Jianfeng

    2018-04-05

    The identification of connexel-wise associations, which involves examining functional connectivities between pairwise voxels across the whole brain, is both statistically and computationally challenging. Although such a connexel-wise methodology has recently been adopted by brain-wide association studies (BWAS) to identify connectivity changes in several mental disorders, such as schizophrenia, autism and depression, the multiple correction and power analysis methods designed specifically for connexel-wise analysis are still lacking. Therefore, we herein report the development of a rigorous statistical framework for connexel-wise significance testing based on the Gaussian random field theory. It includes controlling the family-wise error rate (FWER) of multiple hypothesis testings using topological inference methods, and calculating power and sample size for a connexel-wise study. Our theoretical framework can control the false-positive rate accurately, as validated empirically using two resting-state fMRI datasets. Compared with Bonferroni correction and false discovery rate (FDR), it can reduce false-positive rate and increase statistical power by appropriately utilizing the spatial information of fMRI data. Importantly, our method bypasses the need of non-parametric permutation to correct for multiple comparison, thus, it can efficiently tackle large datasets with high resolution fMRI images. The utility of our method is shown in a case-control study. Our approach can identify altered functional connectivities in a major depression disorder dataset, whereas existing methods fail. A software package is available at https://github.com/weikanggong/BWAS. Copyright © 2018 Elsevier B.V. All rights reserved.

  15. Inhibition of Orthopaedic Implant Infections by Immunomodulatory Effects of Host Defense Peptides

    DTIC Science & Technology

    2014-12-01

    significance was determined by t-tests or by one-way analysis of variance (ANOVA) followed by Bonferroni post hoc tests in experiments with multiple...groups. Non-parametric Mann-Whitney tests, Kruskal-Wallis ANOVA followed by Newman-Keuls post hoc tests, or van Elteren's two-way tests were applied to...in D, and black symbols in A), statistical analysis was by one-way ANOVA followed by Bonferroni versus control post hoc tests. Otherwise, statistical

  16. Navigation analysis for Viking 1979, option B

    NASA Technical Reports Server (NTRS)

    Mitchell, P. H.

    1971-01-01

    A parametric study performed for 48 trans-Mars reference missions in support of the Viking program is reported. The launch dates cover several months in the year 1979, and each launch date has multiple arrival dates in 1980. A plot of launch versus arrival dates with case numbers designated for reference purposes is included. The analysis consists of the computation of statistical covariance matrices based on certain assumptions about the ground-based tracking systems. The error model statistics are listed in tables. Tracking systems were assumed at three sites: Goldstone, California; Canberra, Australia; and Madrid, Spain. The tracking data consisted of range and Doppler measurements taken during the tracking intervals starting at E-30(d) and ending at E-10(d) for the control data and ending at E-18(h) for the knowledge data. The control and knowledge covariance matrices were delivered to the Planetary Mission Analysis Branch for inputs into a delta V dispersion analysis.

  17. Integrated Assessment and Improvement of the Quality Assurance System for the Cosworth Casting Process

    NASA Astrophysics Data System (ADS)

    Yousif, Dilon

    The purpose of this study was to improve the Quality Assurance (QA) System at the Nemak Windsor Aluminum Plant (WAP). The project used the Six Sigma method based on Define, Measure, Analyze, Improve, and Control (DMAIC). Analysis of the in-process melt at WAP was based on chemical, thermal, and mechanical testing. The control limits for the W319 Al Alloy were statistically recalculated using the composition measured under stable conditions. The "Chemistry Viewer" software was developed for statistical analysis of alloy composition. This software features the Silicon Equivalency (SiBQ) developed by the IRC. The Melt Sampling Device (MSD) was designed and evaluated at WAP to overcome traditional sampling limitations. The Thermal Analysis "Filters" software was developed for cooling curve analysis of the 3XX Al Alloy(s) using IRC techniques. The impact of low-melting-point impurities on the start of melting was evaluated using the Universal Metallurgical Simulator and Analyzer (UMSA).

  18. Feminist identity as a predictor of eating disorder diagnostic status.

    PubMed

    Green, Melinda A; Scott, Norman A; Riopel, Cori M; Skaggs, Anna K

    2008-06-01

    Passive Acceptance (PA) and Active Commitment (AC) subscales of the Feminist Identity Development Scale (FIDS) were examined as predictors of eating disorder diagnostic status as assessed by the Questionnaire for Eating Disorder Diagnoses (Q-EDD). Results of a hierarchical regression analysis revealed PA and AC scores were not statistically significant predictors of ED diagnostic status after controlling for diagnostic subtype. Results of a multiple regression analysis revealed FIDS as a statistically significant predictor of ED diagnostic status when failing to control for ED diagnostic subtype. Discrepancies suggest ED diagnostic subtype may serve as a moderator variable in the relationship between ED diagnostic status and FIDS. (c) 2008 Wiley Periodicals, Inc.

  19. The Power of 'Evidence': Reliable Science or a Set of Blunt Tools?

    ERIC Educational Resources Information Center

    Wrigley, Terry

    2018-01-01

    In response to the increasing emphasis on 'evidence-based teaching', this article examines the privileging of randomised controlled trials and their statistical synthesis (meta-analysis). It also pays particular attention to two third-level statistical syntheses: John Hattie's "Visible learning" project and the EEF's "Teaching and…

  20. Statistical Analysis and Time Series Modeling of Air Traffic Operations Data From Flight Service Stations and Terminal Radar Approach Control Facilities : Two Case Studies

    DOT National Transportation Integrated Search

    1981-10-01

    Two statistical procedures have been developed to estimate hourly or daily aircraft counts. These counts can then be transformed into estimates of instantaneous air counts. The first procedure estimates the stable (deterministic) mean level of hourly...

  1. Luster measurements of lips treated with lipstick formulations.

    PubMed

    Yadav, Santosh; Issa, Nevine; Streuli, David; McMullen, Roger; Fares, Hani

    2011-01-01

    In this study, digital photography in combination with image analysis was used to measure the luster of several lipstick formulations containing varying amounts and types of polymers. A weighed amount of lipstick was applied to a mannequin's lips and the mannequin was illuminated by a uniform beam of a white light source. Digital images of the mannequin were captured with a high-resolution camera and the images were analyzed using image analysis software. Luster analysis was performed using the Stamm (L(Stamm)) and Reich-Robbins (L(R-R)) luster parameters. Statistical analysis was performed on each luster parameter (L(Stamm) and L(R-R)), peak height, and peak width. Peak heights for the lipstick formulations containing 11% and 5% VP/eicosene copolymer were statistically different from those of the control. The L(Stamm) and L(R-R) parameters for the treatment containing 11% VP/eicosene copolymer were statistically different from those of the control. Based on the results obtained in this study, we are able to determine whether a polymer is a good pigment dispersant and contributes to the visually detected shine of a lipstick upon application. The methodology presented in this paper could serve as a tool for investigators to screen their ingredients for shine in lipstick formulations.

  2. Automated system for the on-line monitoring of powder blending processes using near-infrared spectroscopy. Part I. System development and control.

    PubMed

    Hailey, P A; Doherty, P; Tapsell, P; Oliver, T; Aldridge, P K

    1996-03-01

    An automated system for the on-line monitoring of powder blending processes is described. The system employs near-infrared (NIR) spectroscopy using fibre-optics and a graphical user interface (GUI) developed in the LabVIEW environment. The complete supervisory control and data analysis (SCADA) software controls blender and spectrophotometer operation and performs statistical spectral data analysis in real time. A data analysis routine using standard deviation is described to demonstrate an approach to the real-time determination of blend homogeneity.
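
    The abstract mentions a standard-deviation-based routine for judging blend homogeneity in real time. One common variant is a moving-block standard deviation over consecutive spectra, sketched below on simulated NIR spectra; the block size and the way the simulated spectra converge are assumptions for illustration, not the published routine.

      import numpy as np

      def moving_block_std(spectra, block=5):
          """Mean over wavelengths of the std-dev across each block of consecutive spectra.
          A value that levels off near zero suggests the blend has become homogeneous."""
          s = np.asarray(spectra, dtype=float)
          out = []
          for i in range(block, s.shape[0] + 1):
              out.append(s[i - block:i].std(axis=0).mean())
          return np.array(out)

      rng = np.random.default_rng(7)
      wavelengths = 256
      # Simulated NIR spectra during blending: spectral differences shrink as mixing proceeds
      base = rng.normal(size=wavelengths)
      spectra = [base + rng.normal(scale=1.0 / (1 + 0.2 * t), size=wavelengths) for t in range(60)]

      mbsd = moving_block_std(spectra, block=5)
      print("moving-block SD, first 3 and last 3 values:", mbsd[:3].round(3), mbsd[-3:].round(3))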

  3. Dietary Soy Supplement on Fibromyalgia Symptoms: A Randomized, Double-Blind, Placebo-Controlled, Early Phase Trial

    PubMed Central

    Wahner-Roedler, Dietlind L.; Thompson, Jeffrey M.; Luedtke, Connie A.; King, Susan M.; Cha, Stephen S.; Elkin, Peter L.; Bruce, Barbara K.; Townsend, Cynthia O.; Bergeson, Jody R.; Eickhoff, Andrea L.; Loehrer, Laura L.; Sood, Amit; Bauer, Brent A.

    2011-01-01

    Most patients with fibromyalgia use complementary and alternative medicine (CAM). Properly designed controlled trials are necessary to assess the effectiveness of these practices. This study was a randomized, double-blind, placebo-controlled, early phase trial. Fifty patients seen at a fibromyalgia outpatient treatment program were randomly assigned to a daily soy or placebo (casein) shake. Outcome measures were scores of the Fibromyalgia Impact Questionnaire (FIQ) and the Center for Epidemiologic Studies Depression Scale (CES-D) at baseline and after 6 weeks of intervention. Analysis was with standard statistics based on the null hypothesis, and separation test for early phase CAM comparative trials. Twenty-eight patients completed the study. Use of standard statistics with intent-to-treat analysis showed that total FIQ scores decreased by 14% in the soy group (P = .02) and by 18% in the placebo group (P < .001). The difference in change in scores between the groups was not significant (P = .16). With the same analysis, CES-D scores decreased in the soy group by 16% (P = .004) and in the placebo group by 15% (P = .05). The change in scores was similar in the groups (P = .83). Results of statistical analysis using the separation test and intent-to-treat analysis revealed no benefit of soy compared with placebo. Shakes that contain soy and shakes that contain casein, when combined with a multidisciplinary fibromyalgia treatment program, provide a decrease in fibromyalgia symptoms. Separation between the effects of soy and casein (control) shakes did not favor the intervention. Therefore, large-sample studies using soy for patients with fibromyalgia are probably not indicated. PMID:18990724

  4. Detection of changes of high-frequency activity by statistical time-frequency analysis in epileptic spikes

    PubMed Central

    Kobayashi, Katsuhiro; Jacobs, Julia; Gotman, Jean

    2013-01-01

    Objective A novel type of statistical time-frequency analysis was developed to elucidate changes of high-frequency EEG activity associated with epileptic spikes. Methods The method uses the Gabor Transform and detects changes of power in comparison to background activity using t-statistics that are controlled by the false discovery rate (FDR) to correct type I error of multiple testing. The analysis was applied to EEGs recorded at 2000 Hz from three patients with mesial temporal lobe epilepsy. Results Spike-related increase of high-frequency oscillations (HFOs) was clearly shown in the FDR-controlled t-spectra: it was most dramatic in spikes recorded from the hippocampus when the hippocampus was the seizure onset zone (SOZ). Depression of fast activity was observed immediately after the spikes, especially consistently in the discharges from the hippocampal SOZ. It corresponded to the slow wave part in case of spike-and-slow-wave complexes, but it was noted even in spikes without apparent slow waves. In one patient, a gradual increase of power above 200 Hz preceded spikes. Conclusions FDR-controlled t-spectra clearly detected the spike-related changes of HFOs that were unclear in standard power spectra. Significance We developed a promising tool to study the HFOs that may be closely linked to the pathophysiology of epileptogenesis. PMID:19394892
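
    The analysis strategy described (time-frequency power, per-bin t-statistics against background, FDR control over the many bins) can be sketched as below. It uses a short-time Fourier transform rather than the Gabor transform proper, synthetic signals rather than depth-EEG recordings, and an arbitrary 250 Hz burst amplitude.

      import numpy as np
      from scipy.signal import stft
      from scipy.stats import ttest_ind
      from statsmodels.stats.multitest import multipletests

      fs = 2000
      rng = np.random.default_rng(8)

      def tf_power(trials):
          """Time-frequency power for a set of 1-s trials (n_trials x samples)."""
          f, t, Z = stft(trials, fs=fs, nperseg=256, noverlap=192, axis=-1)
          return f, t, np.abs(Z) ** 2          # shape: trials x freqs x times

      # Synthetic EEG: baseline noise vs "spike" trials with an added burst of ~250 Hz activity
      time = np.arange(fs) / fs
      baseline = rng.normal(size=(40, fs))
      spikes = rng.normal(size=(40, fs))
      burst = np.exp(-((time - 0.5) ** 2) / (2 * 0.01 ** 2)) * np.sin(2 * np.pi * 250 * time)
      spikes += 3 * burst

      f, t, p_base = tf_power(baseline)
      _, _, p_spk = tf_power(spikes)

      # t-statistic per time-frequency bin, then FDR (Benjamini-Hochberg) over all bins
      tvals, pvals = ttest_ind(p_spk, p_base, axis=0)
      reject, _, _, _ = multipletests(pvals.ravel(), alpha=0.05, method="fdr_bh")
      print("bins with a significant power change:", int(reject.sum()), "of", pvals.size)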

  5. Statistical analysis plan for the Pneumatic CompREssion for PreVENting Venous Thromboembolism (PREVENT) trial: a study protocol for a randomized controlled trial.

    PubMed

    Arabi, Yaseen; Al-Hameed, Fahad; Burns, Karen E A; Mehta, Sangeeta; Alsolamy, Sami; Almaani, Mohammed; Mandourah, Yasser; Almekhlafi, Ghaleb A; Al Bshabshe, Ali; Finfer, Simon; Alshahrani, Mohammed; Khalid, Imran; Mehta, Yatin; Gaur, Atul; Hawa, Hassan; Buscher, Hergen; Arshad, Zia; Lababidi, Hani; Al Aithan, Abdulsalam; Jose, Jesna; Abdukahil, Sheryl Ann I; Afesh, Lara Y; Dbsawy, Maamoun; Al-Dawood, Abdulaziz

    2018-03-15

    The Pneumatic CompREssion for Preventing VENous Thromboembolism (PREVENT) trial evaluates the effect of adjunctive intermittent pneumatic compression (IPC) with pharmacologic thromboprophylaxis compared to pharmacologic thromboprophylaxis alone on venous thromboembolism (VTE) in critically ill adults. In this multicenter randomized trial, critically ill patients receiving pharmacologic thromboprophylaxis will be randomized to an IPC or a no IPC (control) group. The primary outcome is "incident" proximal lower-extremity deep vein thrombosis (DVT) within 28 days after randomization. Radiologists interpreting the lower-extremity ultrasonography will be blinded to intervention allocation, whereas the patients and treating team will be unblinded. The trial has 80% power to detect a 3% absolute risk reduction in the rate of proximal DVT from 7% to 4%. Consistent with international guidelines, we have developed a detailed plan to guide the analysis of the PREVENT trial. This plan specifies the statistical methods for the evaluation of primary and secondary outcomes, and defines covariates for adjusted analyses a priori. Application of this statistical analysis plan to the PREVENT trial will facilitate unbiased analyses of clinical data. ClinicalTrials.gov , ID: NCT02040103 . Registered on 3 November 2013; Current controlled trials, ID: ISRCTN44653506 . Registered on 30 October 2013.

  6. The Relationship between Zinc Levels and Autism: A Systematic Review and Meta-analysis.

    PubMed

    Babaknejad, Nasim; Sayehmiri, Fatemeh; Sayehmiri, Kourosh; Mohamadkhani, Ashraf; Bahrami, Somaye

    2016-01-01

    Autism is a complex, behaviorally defined disorder. A relationship between zinc (Zn) levels in autistic patients and the pathogenesis of the disorder has been reported, but the conclusion is not definitive. The present study was conducted to estimate this association using meta-analysis methods. Using a fixed-effect model, twelve articles published from 1978 to 2012 were selected by searching Google Scholar, PubMed, ISI Web of Science, and Scopus, and the extracted information was analyzed. I² statistics were calculated to examine heterogeneity. The information was analyzed using R and STATA Ver. 12.2. There was no statistically significant difference in hair, nail, and teeth Zn levels between controls and autistic patients: -0.471 [95% confidence interval (95% CI): -1.172 to 0.231]. There was a statistically significant difference in plasma Zn concentration between autistic patients and healthy controls: -0.253 (95% CI: -0.498 to -0.007). Using a random-effects model, the overall integration of data from the two groups was -0.414 (95% CI: -0.878 to -0.051). Based on sensitivity analysis, zinc supplements can be used in nutritional therapy for autistic patients.

  7. Testing and Evaluating C3I Systems That Employ AI. Volume 1. Handbook for Testing Expert Systems

    DTIC Science & Technology

    1991-01-31

    Nonequivalent Control Group Design ...does not receive the system; and (c) nonequivalent (and nonrandomized) control group designs that rely on statistical techniques like analysis of...implementation); (b) multiple time-series designs using a control group; and (c) nonequivalent control group designs that obtain pretest and

  8. Effectiveness of perioperative antiepileptic drug prophylaxis for early and late seizures following oncologic neurosurgery: a meta-analysis.

    PubMed

    Joiner, Evan F; Youngerman, Brett E; Hudson, Taylor S; Yang, Jingyan; Welch, Mary R; McKhann, Guy M; Neugut, Alfred I; Bruce, Jeffrey N

    2018-04-27

    OBJECTIVE The purpose of this meta-analysis was to evaluate the impact of perioperative antiepileptic drug (AED) prophylaxis on short- and long-term seizure incidence among patients undergoing brain tumor surgery. It is the first meta-analysis to focus exclusively on perioperative AED prophylaxis among patients undergoing brain tumor surgery. METHODS The authors searched PubMed/MEDLINE, Embase, Cochrane Central Register of Controlled Trials, clinicaltrials.gov, and the System for Information on Gray Literature in Europe for records related to perioperative AED prophylaxis for patients with brain tumors. Risk of bias in the included studies was assessed using the Cochrane risk of bias tool. Incidence rates for early seizures (within the first postoperative week) and total seizures were estimated based on data from randomized controlled trials. A Mantel-Haenszel random-effects model was used to analyze pooled relative risk (RR) of early seizures (within the first postoperative week) and total seizures associated with perioperative AED prophylaxis versus control. RESULTS Four RCTs involving 352 patients met the criteria of inclusion. The results demonstrated that perioperative AED prophylaxis for patients undergoing brain tumor surgery provides a statistically significant reduction in risk of early postoperative seizures compared with control (RR = 0.352, 95% confidence interval 0.130-0.949, p = 0.039). AED prophylaxis had no statistically significant effect on the total (combined short- and long-term) incidence of seizures. CONCLUSIONS This meta-analysis demonstrates for the first time that perioperative AED prophylaxis for brain tumor surgery provides a statistically significant reduction in early postoperative seizure risk.

  9. Living systematic reviews: 3. Statistical methods for updating meta-analyses.

    PubMed

    Simmonds, Mark; Salanti, Georgia; McKenzie, Joanne; Elliott, Julian

    2017-11-01

    A living systematic review (LSR) should keep the review current as new research evidence emerges. Any meta-analyses included in the review will also need updating as new material is identified. If the aim of the review is solely to present the best current evidence, standard meta-analysis may be sufficient, provided reviewers are aware that results may change at later updates. If the review is used in a decision-making context, more caution may be needed. When using standard meta-analysis methods, the chance of incorrectly concluding that any updated meta-analysis is statistically significant when there is no effect (the type I error) increases rapidly as more updates are performed. Inaccurate estimation of any heterogeneity across studies may also lead to inappropriate conclusions. This paper considers four methods to avoid some of these statistical problems when updating meta-analyses: two methods (the law of the iterated logarithm and the Shuster method) control primarily for inflation of type I error, and two other methods (trial sequential analysis and sequential meta-analysis) control for type I and II errors (failing to detect a genuine effect) and take account of heterogeneity. This paper compares the methods and considers how they could be applied to LSRs. Copyright © 2017 Elsevier Inc. All rights reserved.
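
    To make the inflation argument concrete, the sketch below simulates repeated naive fixed-effect updates of a meta-analysis of null studies and estimates the cumulative type I error; the study sizes, update schedule, and simulation settings are illustrative assumptions, not the methods evaluated in the paper.

        import numpy as np

        rng = np.random.default_rng(0)

        def inflated_alpha(n_updates, studies_per_update=1, n_per_arm=50, n_sim=5000):
            """Monte Carlo estimate of the chance that at least one interim meta-analysis
            of a null effect is declared significant at the nominal two-sided 5% level
            when the pooled estimate is re-tested after every update (no correction)."""
            hits = 0
            for _ in range(n_sim):
                effects, variances, significant = [], [], False
                for _ in range(n_updates):
                    for _ in range(studies_per_update):
                        # simulate one null study: difference of two arm means, true effect = 0
                        effects.append(rng.normal(0, np.sqrt(2.0 / n_per_arm)))
                        variances.append(2.0 / n_per_arm)
                    w = 1.0 / np.asarray(variances)
                    z = np.sum(w * np.asarray(effects)) / np.sqrt(np.sum(w))
                    if abs(z) > 1.96:          # naive fixed-effect test at each update
                        significant = True
                hits += significant
            return hits / n_sim

        for updates in (1, 5, 10, 25):
            print(updates, "updates -> cumulative type I error ~", round(inflated_alpha(updates), 3))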

  10. Exceedance statistics of accelerations resulting from thruster firings on the Apollo-Soyuz mission

    NASA Technical Reports Server (NTRS)

    Fichtl, G. H.; Holland, R. L.

    1981-01-01

    Spacecraft acceleration resulting from firings of vernier control system thrusters is an important consideration in the design, planning, execution and post-flight analysis of laboratory experiments in space. In particular, scientists and technologists involved with the development of experiments to be performed in space in many instances require statistical information on the magnitude and rate of occurrence of spacecraft accelerations. Typically, these accelerations are stochastic in nature, so that it is useful to characterize them in statistical terms. Statistics of spacecraft accelerations are summarized.

  11. Plan delivery quality assurance for CyberKnife: Statistical process control analysis of 350 film-based patient-specific QAs.

    PubMed

    Bellec, J; Delaby, N; Jouyaux, F; Perdrieux, M; Bouvier, J; Sorel, S; Henry, O; Lafond, C

    2017-07-01

    Robotic radiosurgery requires plan delivery quality assurance (DQA) but there has never been a published comprehensive analysis of a patient-specific DQA process in a clinic. We proposed to evaluate 350 consecutive film-based patient-specific DQAs using statistical process control. We evaluated the performance of the process to propose achievable tolerance criteria for DQA validation and we sought to identify suboptimal DQA using control charts. DQAs were performed on a CyberKnife-M6 using Gafchromic-EBT3 films. The signal-to-dose conversion was performed using a multichannel-correction and a scanning protocol that combined measurement and calibration in a single scan. The DQA analysis comprised a gamma-index analysis at 3%/1.5mm and a separate evaluation of spatial and dosimetric accuracy of the plan delivery. Each parameter was plotted on a control chart and control limits were calculated. A capability index (Cpm) was calculated to evaluate the ability of the process to produce results within specifications. The analysis of capability showed that a gamma pass rate of 85% at 3%/1.5mm was highly achievable as acceptance criteria for DQA validation using a film-based protocol (Cpm>1.33). 3.4% of DQA were outside a control limit of 88% for gamma pass-rate. The analysis of the out-of-control DQA helped identify a dosimetric error in our institute for a specific treatment type. We have defined initial tolerance criteria for DQA validations. We have shown that the implementation of a film-based patient-specific DQA protocol with the use of control charts is an effective method to improve patient treatment safety on CyberKnife. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
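
    The following minimal Python sketch shows one way such a monitoring step could look: control limits and a one-sided capability index estimated from a baseline series of gamma pass rates, with later DQAs checked against the lower limit. The 85% lower specification, the 3-sigma limit, and all pass-rate values are assumptions for illustration, not the clinic's data.

        import numpy as np

        def dqa_monitor(baseline, new_rates, lower_spec=85.0):
            """Minimal DQA monitoring sketch: limits and a one-sided capability index
            are estimated from an assumed in-control baseline of gamma pass rates (%),
            then subsequent DQAs are flagged if they fall below the lower control limit."""
            base = np.asarray(baseline, dtype=float)
            mu, sigma = base.mean(), base.std(ddof=1)
            lcl = mu - 3.0 * sigma                      # lower control limit (3-sigma)
            cpl = (mu - lower_spec) / (3.0 * sigma)     # one-sided capability vs. lower spec
            flagged = [i for i, r in enumerate(new_rates) if r < lcl]
            return lcl, cpl, flagged

        baseline_rates = [96.2, 94.8, 97.1, 95.5, 96.6, 95.9, 93.8, 95.2]   # hypothetical
        new_rates = [95.7, 88.5, 96.1]                                      # hypothetical
        lcl, cpl, flagged = dqa_monitor(baseline_rates, new_rates)
        print(f"LCL = {lcl:.1f}%, Cpl = {cpl:.2f}, suspect new DQAs at indices {flagged}")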

  12. A user-friendly workflow for analysis of Illumina gene expression bead array data available at the arrayanalysis.org portal.

    PubMed

    Eijssen, Lars M T; Goelela, Varshna S; Kelder, Thomas; Adriaens, Michiel E; Evelo, Chris T; Radonjic, Marijana

    2015-06-30

    Illumina whole-genome expression bead arrays are a widely used platform for transcriptomics. Most of the tools available for the analysis of the resulting data are not easily applicable by less experienced users. ArrayAnalysis.org provides researchers with an easy-to-use and comprehensive interface to the functionality of R and Bioconductor packages for microarray data analysis. As a modular open source project, it allows developers to contribute modules that provide support for additional types of data or extend workflows. To enable data analysis of Illumina bead arrays for a broad user community, we have developed a module for ArrayAnalysis.org that provides a free and user-friendly web interface for quality control and pre-processing for these arrays. This module can be used together with existing modules for statistical and pathway analysis to provide a full workflow for Illumina gene expression data analysis. The module accepts data exported from Illumina's GenomeStudio, and provides the user with quality control plots and normalized data. The outputs are directly linked to the existing statistics module of ArrayAnalysis.org, but can also be downloaded for further downstream analysis in third-party tools. The Illumina bead arrays analysis module is available at http://www.arrayanalysis.org . A user guide, a tutorial demonstrating the analysis of an example dataset, and R scripts are available. The module can be used as a starting point for statistical evaluation and pathway analysis provided on the website or to generate processed input data for a broad range of applications in life sciences research.

  13. [Evaluation on application of China Disease Prevention and Control Information System of Hydatid Disease Ⅰ Current status at the provincial level].

    PubMed

    Zhi-Hua, Zhang; Qing, Yu; Tian, Tian; Wei-Ping, Wu; Ning, Xiao

    2016-03-31

    To evaluate the application status of the China Disease Prevention and Control Information System of Hydatid Disease, summarize the existing problems, and promote updating of the system. A questionnaire was designed and distributed to Inner Mongolia, Sichuan, Tibet, Gansu, Qinghai, Ningxia, Xinjiang, and the Xinjiang Production and Construction Corps to evaluate the application status of the system, supplemented by telephone follow-up. The questionnaire response rate was 87.5%. The statistics for the closed questions showed that the national application rate of the China Disease Prevention and Control Information System of Hydatid Disease was 100%; 15.3% of users were low-frequency users, 57.1% believed the system was necessary, 28.6% considered it dispensable, and 14.3% believed it was totally unnecessary. The statistics for the open-ended questions indicated that 6 endemic regions suggested increasing guidance and training; 4 endemic regions suggested sharing information between the national infectious disease reporting system and the hydatid disease prevention and control information system, changing monthly reporting to quarterly reporting, and adding a statistics and analysis module; and 3 endemic regions reported that the system had logic errors and defects. The problems with the system mainly concern systemic deficiencies and logic errors, a lack of statistical parameters and corresponding analysis modules, and a lack of guidance and training, which limit its use. Therefore, these problems should be resolved.

  14. Multiple comparison analysis testing in ANOVA.

    PubMed

    McHugh, Mary L

    2011-01-01

    The Analysis of Variance (ANOVA) test has long been an important tool for researchers conducting studies on multiple experimental groups and one or more control groups. However, ANOVA cannot provide detailed information on differences among the various study groups, or on complex combinations of study groups. To fully understand group differences in an ANOVA, researchers must conduct tests of the differences between particular pairs of experimental and control groups. Tests conducted on subsets of data tested previously in another analysis are called post hoc tests. A class of post hoc tests that provides this type of detailed information for ANOVA results is called "multiple comparison analysis" tests. The most commonly used multiple comparison analysis statistics include the following tests: Tukey, Newman-Keuls, Scheffé, Bonferroni and Dunnett. These statistical tools each have specific uses, advantages and disadvantages. Some are best used for testing theory while others are useful in generating new theory. Selection of the appropriate post hoc test will provide researchers with the most detailed information while limiting Type 1 errors due to alpha inflation.
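
    As an illustration of one such post hoc procedure, the sketch below runs Tukey's HSD on three hypothetical groups using statsmodels' pairwise_tukeyhsd; the data and group labels are invented for demonstration.

        import numpy as np
        from statsmodels.stats.multicomp import pairwise_tukeyhsd

        # Hypothetical outcome scores for two experimental groups and one control group
        rng = np.random.default_rng(1)
        scores = np.concatenate([rng.normal(52, 8, 30),    # treatment A
                                 rng.normal(57, 8, 30),    # treatment B
                                 rng.normal(50, 8, 30)])   # control
        groups = np.repeat(["A", "B", "control"], 30)

        # Tukey's HSD compares every pair of groups while controlling the family-wise
        # error rate, so it can follow up a significant omnibus ANOVA result.
        result = pairwise_tukeyhsd(endog=scores, groups=groups, alpha=0.05)
        print(result)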

  15. Methods for trend analysis: Examples with problem/failure data

    NASA Technical Reports Server (NTRS)

    Church, Curtis K.

    1989-01-01

    Statistics play an important role in quality control and reliability. Consequently, the NASA standard on Trend Analysis Techniques recommends a variety of statistical methodologies that can be applied to time series data. The major goal of this working handbook, which uses data from the MSFC Problem Assessment System, is to illustrate some of the techniques in the NASA standard, along with some additional techniques, and to highlight patterns in the data. The trend estimation techniques used are regression (exponential, power, reciprocal, straight line) and Kendall's rank correlation coefficient. The important details of a statistical strategy for estimating a trend component are covered in the examples. However, careful analysis and interpretation are necessary because of small samples and frequent zero problem reports in a given time period. Further investigations to address these issues are being conducted.
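
    A minimal example of the Kendall rank-correlation approach mentioned above, applied to a hypothetical monthly count series with many zero periods (scipy's kendalltau handles the resulting ties):

        import numpy as np
        from scipy.stats import kendalltau

        # Hypothetical monthly problem/failure report counts (note several zero months)
        counts = np.array([3, 1, 0, 2, 0, 1, 0, 0, 1, 0, 0, 0])
        months = np.arange(len(counts))

        # Kendall's rank correlation between time and counts gives a nonparametric
        # trend test that tolerates ties and small samples better than least squares.
        tau, p_value = kendalltau(months, counts)
        print(f"Kendall tau = {tau:.2f}, p = {p_value:.3f} "
              f"({'downward' if tau < 0 else 'upward'} trend suggested if p is small)")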

  16. AGR-1 Thermocouple Data Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeff Einerson

    2012-05-01

    This report documents an effort to analyze measured and simulated data obtained in the Advanced Gas Reactor (AGR) fuel irradiation test program conducted in the INL's Advanced Test Reactor (ATR) to support the Next Generation Nuclear Plant (NGNP) R&D program. The work follows up on a previous study (Pham and Einerson, 2010), in which statistical analysis methods were applied for AGR-1 thermocouple data qualification. The present work exercises the idea that, while recognizing uncertainties inherent in physics and thermal simulations of the AGR-1 test, results of the numerical simulations can be used in combination with the statistical analysis methods to further improve qualification of measured data. Additionally, the combined analysis of measured and simulation data can generate insights about simulation model uncertainty that can be useful for model improvement. This report also describes an experimental control procedure to maintain fuel target temperature in the future AGR tests using regression relationships that include simulation results. The report is organized into four chapters. Chapter 1 introduces the AGR Fuel Development and Qualification program, AGR-1 test configuration and test procedure, overview of AGR-1 measured data, and overview of physics and thermal simulation, including modeling assumptions and uncertainties. A brief summary of statistical analysis methods developed in (Pham and Einerson 2010) for AGR-1 measured data qualification within the NGNP Data Management and Analysis System (NDMAS) is also included for completeness. Chapters 2-3 describe and discuss cases in which the combined use of experimental and simulation data is realized. A set of issues associated with measurement and modeling uncertainties resulting from the combined analysis is identified. This includes demonstration that such a combined analysis led to important insights for reducing uncertainty in presentation of AGR-1 measured data (Chapter 2) and interpretation of simulation results (Chapter 3). The statistics-based, simulation-aided experimental control procedure described for the future AGR tests is developed and demonstrated in Chapter 4. The procedure for controlling the target fuel temperature (capsule peak or average) is based on regression functions of thermocouple readings and other relevant parameters, accounting for possible changes in both physical and thermal conditions and in instrument performance.

  17. Measuring the Effects of Peer Learning on Students' Academic Achievement in First-Year Business Statistics

    ERIC Educational Resources Information Center

    Dancer, Diane; Morrison, Kellie; Tarr, Garth

    2015-01-01

    Peer-assisted study session (PASS) programs have been shown to positively affect students' grades in a majority of studies. This study extends that analysis in two ways: controlling for ability and other factors, with focus on international students, and by presenting results for PASS in business statistics. Ordinary least squares, random effects…

  18. The Effect of Using Case Studies in Business Statistics

    ERIC Educational Resources Information Center

    Pariseau, Susan E.; Kezim, Boualem

    2007-01-01

    The authors evaluated the effect on learning of using case studies in business statistics courses. The authors divided students into 3 groups: a control group, a group that completed 1 case study, and a group that completed 3 case studies. Results evidenced that, on average, students whom the authors required to complete a case analysis received…

  19. Treated cabin acoustic prediction using statistical energy analysis

    NASA Technical Reports Server (NTRS)

    Yoerkie, Charles A.; Ingraham, Steven T.; Moore, James A.

    1987-01-01

    The application of statistical energy analysis (SEA) to the modeling and design of helicopter cabin interior noise control treatment is demonstrated. The information presented here is obtained from work sponsored at NASA Langley for the development of analytic modeling techniques and the basic understanding of cabin noise. Utility and executive interior models are developed directly from existing S-76 aircraft designs. The relative importance of panel transmission loss (TL), acoustic leakage, and absorption to the control of cabin noise is shown using the SEA modeling parameters. It is shown that the major cabin noise improvement below 1000 Hz comes from increased panel TL, while above 1000 Hz it comes from reduced acoustic leakage and increased absorption in the cabin and overhead cavities.

  20. Skin antiseptics in venous puncture site disinfection for preventing blood culture contamination: A Bayesian network meta-analysis of randomized controlled trials.

    PubMed

    Liu, Wenjie; Duan, Yuchen; Cui, Wenyao; Li, Li; Wang, Xia; Dai, Heling; You, Chao; Chen, Maojun

    2016-07-01

    To compare the efficacy of several antiseptics in decreasing the blood culture contamination rate. Network meta-analysis. Electronic searches of PubMed and Embase were conducted up to November 2015. Only randomized controlled trials or quasi-randomized controlled trials were eligible. We applied no language restriction. A comprehensive review of articles in the reference lists was also conducted to identify possibly relevant studies. Relevant studies evaluating the efficacy of different antiseptics at the venous puncture site for decreasing the blood culture contamination rate were included. The data were extracted from the included randomized controlled trials by two authors independently. The risk of bias was evaluated using the Detsky scale by two authors independently. We used WinBUGS 1.43 software and the statistical model described by Chaimani to perform this network meta-analysis. Graphs of the statistical results from WinBUGS 1.43 were then generated using the 'networkplot', 'ifplot', 'netfunnel' and 'sucra' procedures in STATA 13.0. Odds ratios and 95% confidence intervals were assessed for dichotomous data. A probability of p less than 0.05 was considered statistically significant. Compared with ordinary meta-analyses, this network meta-analysis offered hierarchies for the efficacy of different antiseptics in decreasing the blood culture contamination rate. Seven randomized controlled trials involving 34,408 blood samples were eligible for the meta-analysis. No significant difference was found in blood culture contamination rate among the different antiseptics. No significant differences were found between non-alcoholic and alcoholic antiseptics, alcoholic chlorhexidine and povidone iodine, chlorhexidine and iodine compounds, or povidone iodine and iodine tincture, respectively. Different antiseptics may not affect the blood culture contamination rate. Different intervals between skin disinfection and venous puncture, different settings (emergency room, medical wards, and intensive care units) and the performance of the phlebotomy may affect the blood culture contamination rate. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. [PASS neurocognitive dysfunction in attention deficit].

    PubMed

    Pérez-Alvarez, F; Timoneda-Gallart, C

    Attention deficit disorder shows both cognitive and behavioral patterns. The aim was to determine a particular PASS (planning, attention, successive and simultaneous) pattern in order to allow early diagnosis and remediation according to PASS theory. Eighty patients, aged 6 to 12 years (55 boys and 25 girls), were selected from the neuropediatric clinic. Inclusion criteria were inattention (80 cases) and inattention with hyperactive symptoms (40 cases) according to the Diagnostic and Statistical Manual (DSM-IV). Exclusion criteria were the previously reported phonologic awareness criteria, considered useful for diagnosing dyslexia. A control group of 300 individuals, aged 5 to 12 years, was used, with the above criteria controlled. The DN:CAS (Das-Naglieri Cognitive Assessment System) battery, translated into the native language, was administered to assess the PASS cognitive processes. Results were analyzed with cluster analysis and Student's t-test. Statistical factor analysis of the control group had previously identified the four PASS processes: planning, attention, successive and simultaneous. The dendrogram of the cluster analysis discriminated three categories of attention deficit disorder: 1. the most frequent, with a planning deficit; 2. without a planning deficit but with deficits in other processes; and 3. only a few cases, without any cognitive processing deficit. Cognitive deficiency, in terms of mean scores, was statistically significant when compared with the control group (p = 0.001). According to the PASS pattern, planning deficiency is a relevant factor. Neurological planning is not exactly the same as neurological executive function. The behavioral pattern is mainly linked to planning deficiency, but also to other PASS processing deficits and even to no processing deficit.

  2. Expression Profiling of Nonpolar Lipids in Meibum From Patients With Dry Eye: A Pilot Study

    PubMed Central

    Chen, Jianzhong; Keirsey, Jeremy K.; Green, Kari B.; Nichols, Kelly K.

    2017-01-01

    Purpose The purpose of this investigation was to characterize differentially expressed lipids in meibum samples from patients with dry eye disease (DED) in order to better understand the underlying pathologic mechanisms. Methods Meibum samples were collected from postmenopausal women with DED (PW-DED; n = 5) and a control group of postmenopausal women without DED (n = 4). Lipid profiles were analyzed by direct infusion full-scan electrospray ionization mass spectrometry (ESI-MS). An initial analysis of 145 representative peaks from four classes of lipids in PW-DED samples revealed that additional manual corrections for peak overlap and isotopes only slightly affected the statistical analysis. Therefore, analysis of uncorrected data, which can be applied to a greater number of peaks, was used to compare more than 500 lipid peaks common to PW-DED and control samples. Statistical analysis of peak intensities identified several lipid species that differed significantly between the two groups. Data from contact lens wearers with DED (CL-DED; n = 5) were also analyzed. Results Many species of the two types of diesters (DE) and very long chain wax esters (WE) were decreased by ∼20% in PW-DED, whereas levels of triacylglycerols were increased by an average of 39% ± 3% in meibum from PW-DED compared to that in the control group. Approximately the same reduction (20%) of similar DE and WE was observed for CL-DED. Conclusions Statistical analysis of peak intensities from direct infusion ESI-MS results identified differentially expressed lipids in meibum from dry eye patients. Further studies are warranted to support these findings. PMID:28426869

  3. No direct correlation between rotavirus diarrhea and breast feeding: A meta-analysis.

    PubMed

    Shen, Jian; Zhang, Bi-Meng; Zhu, Sheng-Guo; Chen, Jian-Jie

    2018-04-01

    Some studies have indicated that children who are exclusively breast fed have a lower prevalence of rotavirus diarrhea, while others have held the opposite view. In this study, we aimed to systematically assess the association between rotavirus diarrhea and breast feeding. A literature search up to June 2016 in electronic literature databases, including PubMed and Embase, was performed. The Newcastle-Ottawa Scale was used to conduct the quality assessment of all the selected studies. Statistical analyses were performed using R software version 3.12 (R Foundation for Statistical Computing, Beijing, China; meta package), and the odds ratio (OR) and 95% confidence interval (CI) were used to assess the strength of the association. Heterogeneity was assessed by Cochran's Q-statistic and the I² test, and the sensitivity analysis was performed by trimming one study at a time. A total of 17 articles, which included 10,841 participants, were investigated in the present meta-analysis. There was no significant difference between the case group and control group (OR, 0.59; 95% CI 0.33-1.07) in the meta-analysis of exclusive breast feeding, and no significant difference was found between the case group and the control group (OR, 0.86; 95% CI 0.63-1.16) in the meta-analysis of breast feeding. No significant difference was found between the case group and control group (OR, 0.78; 95% CI 0.59-1.04) for all quantitative data. There may be no direct correlation between rotavirus diarrhea and breast feeding. Copyright © 2017. Published by Elsevier B.V.

  4. Robot-assisted walking training for individuals with Parkinson's disease: a pilot randomized controlled trial.

    PubMed

    Sale, Patrizio; De Pandis, Maria Francesca; Le Pera, Domenica; Sova, Ivan; Cimolin, Veronica; Ancillao, Andrea; Albertini, Giorgio; Galli, Manuela; Stocchi, Fabrizio; Franceschini, Marco

    2013-05-24

    Over recent years, the introduction of robotic technologies into Parkinson's disease rehabilitation settings has progressed from concept to reality. However, the benefit of robotic training remains elusive. This pilot randomized controlled observer trial is aimed at investigating the feasibility, effectiveness and efficacy of new end-effector robot training in people with mild Parkinson's disease. Design: pilot randomized controlled trial. Robot training was feasible, acceptable and safe, and the participants completed 100% of the prescribed training sessions. A statistically significant improvement in gait index was found in favour of the EG (T0 versus T1). In particular, the statistical analysis of the primary outcome (gait speed) using the Friedman test showed statistically significant improvements for the EG (p = 0.0195). The Friedman test analysis of step length left (p = 0.0195) and right (p = 0.0195) and stride length left (p = 0.0078) and right (p = 0.0195) also showed statistically significant gains. No statistically significant improvements were found in the CG. Robot training is a feasible and safe form of rehabilitative exercise for cognitively intact people with mild PD. This original approach can contribute to short-term lower-limb motor recovery in idiopathic PD patients. The focus on gait recovery is a further characteristic that makes this research relevant to clinical practice. On the whole, the simplicity of treatment, the lack of side effects, and the positive results from patients support the recommendation to extend the use of this treatment. Further investigation regarding the long-term effectiveness of robot training is warranted. ClinicalTrials.gov NCT01668407.

  5. [Monitoring method of extraction process for Schisandrae Chinensis Fructus based on near infrared spectroscopy and multivariate statistical process control].

    PubMed

    Xu, Min; Zhang, Lei; Yue, Hong-Shui; Pang, Hong-Wei; Ye, Zheng-Liang; Ding, Li

    2017-10-01

    To establish an on-line monitoring method for the extraction process of Schisandrae Chinensis Fructus, a medicinal material in the formula of Yiqi Fumai lyophilized injection, by combining near infrared spectroscopy with multivariate data analysis technology. The multivariate statistical process control (MSPC) model was established based on 5 normal production batches, and 2 test batches were monitored using PC score, DModX and Hotelling T² control charts. The results showed that the MSPC model had good monitoring ability for the extraction process. Applying the MSPC model to the actual production process could effectively achieve on-line monitoring of the extraction process of Schisandrae Chinensis Fructus and reflect changes in material properties during production in real time. The established process monitoring method could provide a reference for the application of process analysis technology in the process quality control of traditional Chinese medicine injections. Copyright© by the Chinese Pharmaceutical Association.
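
    The sketch below illustrates the general idea of a PCA-based Hotelling T² chart of the kind used in MSPC: fit principal components on normal batches, score new data, and compare T² against an F-distribution-based limit. The variables, limit level, and batch data are illustrative assumptions, not the paper's NIR model.

        import numpy as np
        from scipy.stats import f

        def hotelling_t2_chart(X_train, X_new, n_components=2, alpha=0.01):
            """Minimal MSPC sketch: fit PCA on normal batches (rows = observations,
            columns = process variables), then score new data with Hotelling's T^2."""
            mu = X_train.mean(axis=0)
            Xc = X_train - mu
            U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
            P = Vt[:n_components].T                              # loadings
            lam = (s[:n_components] ** 2) / (len(X_train) - 1)   # score variances
            t2_new = np.sum(((X_new - mu) @ P) ** 2 / lam, axis=1)
            n, k = len(X_train), n_components
            # F-distribution based control limit for T^2 of new observations
            limit = k * (n - 1) * (n + 1) / (n * (n - k)) * f.ppf(1 - alpha, k, n - k)
            return t2_new, limit

        rng = np.random.default_rng(2)
        normal_batches = rng.normal(size=(50, 6))     # hypothetical normal-batch features
        test_batch = rng.normal(size=(10, 6)) + np.array([0, 0, 0.5, 0, 0, 0])
        t2, ucl = hotelling_t2_chart(normal_batches, test_batch)
        print("T^2 limit:", round(ucl, 2), "out-of-control points:", np.where(t2 > ucl)[0].tolist())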

  6. An Adaptive Buddy Check for Observational Quality Control

    NASA Technical Reports Server (NTRS)

    Dee, Dick P.; Rukhovets, Leonid; Todling, Ricardo; DaSilva, Arlindo M.; Larson, Jay W.; Einaudi, Franco (Technical Monitor)

    2000-01-01

    An adaptive buddy check algorithm is presented that adjusts tolerances for outlier observations based on the variability of surrounding data. The algorithm derives from a statistical hypothesis test combined with maximum-likelihood covariance estimation. Its stability is shown to depend on the initial identification of outliers by a simple background check. The adaptive feature ensures that the final quality control decisions are not very sensitive to prescribed statistics of first-guess and observation errors, nor to other approximations introduced into the algorithm. The implementation of the algorithm in a global atmospheric data assimilation system is described. Its performance is contrasted with that of a non-adaptive buddy check, for the surface analysis of an extreme storm that took place in Europe on 27 December 1999. The adaptive algorithm allowed the inclusion of many important observations that differed greatly from the first guess and that would have been excluded on the basis of prescribed statistics. The analysis of the storm development was much improved as a result of these additional observations.
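
    The following is a deliberately simplified sketch of the buddy-check idea, with a tolerance that widens where neighboring data are more variable; it is an assumption-laden toy, not the operational algorithm or its maximum-likelihood covariance estimation.

        import numpy as np

        def buddy_check(obs, neighbor_lists, base_tol, inflate=2.0):
            """Toy buddy check with an adaptive tolerance: each observation is compared
            to the mean of its 'buddies', and the rejection threshold is widened where
            the surrounding data are themselves highly variable. The inflation rule and
            tolerances are illustrative assumptions only."""
            flags = np.zeros(len(obs), dtype=bool)
            for i, buddies in enumerate(neighbor_lists):
                vals = obs[buddies]
                spread = vals.std(ddof=1) if len(vals) > 1 else 0.0
                tol = base_tol + inflate * spread          # adaptive tolerance
                flags[i] = abs(obs[i] - vals.mean()) > tol
            return flags                                    # True = suspect observation

        # Hypothetical surface-pressure innovations (hPa) and neighbour indices
        innovations = np.array([0.4, -0.2, 6.5, 0.1, 0.3, -0.5])
        neighbors = [[1, 3], [0, 3, 4], [1, 4], [0, 1, 5], [1, 2, 5], [3, 4]]
        print(buddy_check(innovations, neighbors, base_tol=2.0))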

  7. Surgical Treatment for Discogenic Low-Back Pain: Lumbar Arthroplasty Results in Superior Pain Reduction and Disability Level Improvement Compared With Lumbar Fusion

    PubMed Central

    2007-01-01

    Background The US Food and Drug Administration approved the Charité artificial disc on October 26, 2004. This approval was based on an extensive analysis and review process; 20 years of disc usage worldwide; and the results of a prospective, randomized, controlled clinical trial that compared lumbar artificial disc replacement to fusion. The results of the investigational device exemption (IDE) study led to a conclusion that clinical outcomes following lumbar arthroplasty were at least as good as outcomes from fusion. Methods The author performed a new analysis of the Visual Analog Scale pain scores and the Oswestry Disability Index scores from the Charité artificial disc IDE study and used a nonparametric statistical test, because observed data distributions were not normal. The analysis included all of the enrolled subjects in both the nonrandomized and randomized phases of the study. Results Subjects from both the treatment and control groups improved from the baseline situation (P < .001) at all follow-up times (6 weeks to 24 months). Additionally, these pain and disability levels with artificial disc replacement were superior (P < .05) to the fusion treatment at all follow-up times including 2 years. Conclusions The a priori statistical plan for an IDE study may not adequately address the final distribution of the data. Therefore, statistical analyses more appropriate to the distribution may be necessary to develop meaningful statistical conclusions from the study. A nonparametric statistical analysis of the Charité artificial disc IDE outcomes scores demonstrates superiority for lumbar arthroplasty versus fusion at all follow-up time points to 24 months. PMID:25802574

  8. Phosphorylated neurofilament heavy: A potential blood biomarker to evaluate the severity of acute spinal cord injuries in adults

    PubMed Central

    Singh, Ajai; Kumar, Vineet; Ali, Sabir; Mahdi, Abbas Ali; Srivastava, Rajeshwer Nath

    2017-01-01

    Aims: The aim of this study is to analyze the serial estimation of phosphorylated neurofilament heavy (pNF-H) in blood plasma that would act as a potential biomarker for early prediction of the neurological severity of acute spinal cord injuries (SCI) in adults. Settings and Design: Pilot study/observational study. Subjects and Methods: A total of 40 patients (28 cases and 12 controls) of spine injury were included in this study. In the enrolled cases, plasma level of pNF-H was evaluated in blood samples and neurological evaluation was performed by the American Spinal Injury Association Injury Scale at specified period. Serial plasma neurofilament heavy values were then correlated with the neurological status of these patients during follow-up visits and were analyzed statistically. Statistical Analysis Used: Statistical analysis was performed using GraphPad InStat software (version 3.05 for Windows, San Diego, CA, USA). The correlation analysis between the clinical progression and pNF-H expression was done using Spearman's correlation. Results: The mean baseline level of pNF-H in cases was 6.40 ± 2.49 ng/ml, whereas in controls it was 0.54 ± 0.27 ng/ml. On analyzing the association between the two by Mann–Whitney U–test, the difference in levels was found to be statistically significant. The association between the neurological progression and pNF-H expression was determined using correlation analysis (Spearman's correlation). At 95% confidence interval, the correlation coefficient was found to be 0.64, and the correlation was statistically significant. Conclusions: Plasma pNF-H levels were elevated in accordance with the severity of SCI. Therefore, pNF-H may be considered as a potential biomarker to determine early the severity of SCI in adult patients. PMID:29291173
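
    For readers wanting to reproduce this style of analysis, a minimal Python equivalent of the two tests named above (group comparison and rank correlation) might look like the sketch below; the values are hypothetical, not the study data.

        import numpy as np
        from scipy.stats import mannwhitneyu, spearmanr

        # Hypothetical plasma pNF-H levels (ng/ml) in SCI cases vs. controls,
        # and an ordinal severity grade recorded for each case at follow-up.
        cases    = np.array([6.1, 8.3, 4.9, 7.2, 5.5, 9.0, 3.8, 6.7])
        controls = np.array([0.4, 0.7, 0.3, 0.9, 0.5, 0.6])
        severity = np.array([3, 4, 2, 4, 3, 5, 1, 3])        # higher = more severe injury

        u_stat, p_groups = mannwhitneyu(cases, controls, alternative="two-sided")
        rho, p_corr = spearmanr(cases, severity)

        print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_groups:.4f}")
        print(f"Spearman rho = {rho:.2f}, p = {p_corr:.4f}")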

  9. LP-search and its use in analysis of the accuracy of control systems with acoustical models

    NASA Technical Reports Server (NTRS)

    Sergeyev, V. I.; Sobol, I. M.; Statnikov, R. B.; Statnikov, I. N.

    1973-01-01

    The LP-search is proposed as an analog of the Monte Carlo method for finding values in nonlinear statistical systems. It is concluded that: To attain the required accuracy in solution to the problem of control for a statistical system in the LP-search, a considerably smaller number of tests is required than in the Monte Carlo method. The LP-search allows the possibility of multiple repetitions of tests under identical conditions and observability of the output variables of the system.
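
    A rough modern analogue of the comparison can be sketched with SciPy's quasi-Monte Carlo module, contrasting plain Monte Carlo points with a scrambled Sobol (LP-type) sequence on a toy response surface; the test function and sample sizes are assumptions for illustration.

        import numpy as np
        from scipy.stats import qmc

        def response(x):
            # illustrative nonlinear "system output" over the unit square
            return np.sin(3 * x[:, 0]) * np.exp(-x[:, 1])

        # dense-grid reference value of the mean response, for comparison only
        g = np.linspace(0.0, 1.0, 1001)
        gx, gy = np.meshgrid(g, g)
        reference = np.mean(np.sin(3 * gx) * np.exp(-gy))

        n = 256                                        # number of "tests"
        rng = np.random.default_rng(3)
        mc_points = rng.random((n, 2))                 # plain Monte Carlo points
        sobol_points = qmc.Sobol(d=2, scramble=True, seed=3).random(n)  # LP/Sobol points

        print("reference      :", round(float(reference), 4))
        print("Monte Carlo    :", round(float(response(mc_points).mean()), 4))
        print("Sobol (LP-type):", round(float(response(sobol_points).mean()), 4))
        # The low-discrepancy points typically land closer to the reference for the
        # same number of tests, which is the practical claim made for LP-search.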

  10. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (±4% of deviation between the calculated and measured doses) by calculating a control process capability (C(pc)) index. The C(pc) index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should improve the security of treatments. They also showed that the dose delivery processes in the cancer center were in control for prostate and head-and-neck treatments. In parallel, long term process performance indices (P(p), P(pk), and P(pm)) have been analyzed. Their analysis helped defining which actions should be undertaken in order to improve the performance of the process. The prostate dose delivery process has been shown statistically capable (0.08% of the results is expected to be outside the clinical tolerances) contrary to the head-and-neck dose delivery process (5.76% of the results are expected to be outside the clinical tolerances).

  11. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC).

    PubMed

    Gérard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, François; Aletti, Pierre

    2009-04-01

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (+/- 4% of deviation between the calculated and measured doses) by calculating a control process capability (C(pc)) index. The C(pc) index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should improve the security of treatments. They also showed that the dose delivery processes in the cancer center were in control for prostate and head-and-neck treatments. In parallel, long-term process performance indices (P(p), P(pk), and P(pm)) have been analyzed. Their analysis helped defining which actions should be undertaken in order to improve the performance of the process. The prostate dose delivery process has been shown statistically capable (0.08% of the results is expected to be outside the clinical tolerances) contrary to the head-and-neck dose delivery process (5.76% of the results are expected to be outside the clinical tolerances).
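
    Of the three chart types named, the EWMA chart is the one most directly aimed at slow drift; a minimal sketch, with typical textbook parameter choices assumed and hypothetical deviation data, is given below.

        import numpy as np

        def ewma_chart(x, lam=0.2, L=3.0):
            """Minimal EWMA chart sketch for patient-specific QC deviations (%).
            lam and L are common textbook choices, assumed here for illustration;
            in practice the center line and sigma come from a reference period."""
            x = np.asarray(x, dtype=float)
            mu, sigma = x.mean(), x.std(ddof=1)
            z = np.empty_like(x)
            z[0] = lam * x[0] + (1 - lam) * mu
            for i in range(1, len(x)):
                z[i] = lam * x[i] + (1 - lam) * z[i - 1]
            idx = np.arange(1, len(x) + 1)
            half_width = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * idx)))
            out = (z > mu + half_width) | (z < mu - half_width)
            return z, mu + half_width, mu - half_width, np.where(out)[0]

        # Hypothetical measured-vs-calculated dose deviations (%) for consecutive patients
        devs = [0.5, -0.8, 1.1, 0.2, 1.6, 1.9, 2.3, 2.8, 3.1, 2.9]
        z, ucl, lcl, flagged = ewma_chart(devs)
        print("first EWMA values:", np.round(z[:3], 2), "drift flagged at indices:", flagged.tolist())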

  12. Accounting for competing risks in randomized controlled trials: a review and recommendations for improvement.

    PubMed

    Austin, Peter C; Fine, Jason P

    2017-04-15

    In studies with survival or time-to-event outcomes, a competing risk is an event whose occurrence precludes the occurrence of the primary event of interest. Specialized statistical methods must be used to analyze survival data in the presence of competing risks. We conducted a review of randomized controlled trials with survival outcomes that were published in high-impact general medical journals. Of 40 studies that we identified, 31 (77.5%) were potentially susceptible to competing risks. However, in the majority of these studies, the potential presence of competing risks was not accounted for in the statistical analyses that were described. Of the 31 studies potentially susceptible to competing risks, 24 (77.4%) reported the results of a Kaplan-Meier survival analysis, while only five (16.1%) reported using cumulative incidence functions to estimate the incidence of the outcome over time in the presence of competing risks. The former approach will tend to result in an overestimate of the incidence of the outcome over time, while the latter approach will result in unbiased estimation of the incidence of the primary outcome over time. We provide recommendations on the analysis and reporting of randomized controlled trials with survival outcomes in the presence of competing risks. © 2017 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. © 2017 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
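
    A minimal sketch of the recommended estimator, a nonparametric cumulative incidence function in the presence of a competing event, is given below; the event codes and follow-up times are hypothetical.

        import numpy as np

        def cumulative_incidence(time, event, cause=1):
            """Nonparametric cumulative incidence function (CIF) for one cause in the
            presence of competing risks. event codes: 0 = censored, 1 = primary event,
            2 = competing event."""
            time = np.asarray(time, dtype=float)
            event = np.asarray(event)
            order = np.argsort(time)
            time, event = time[order], event[order]
            surv = 1.0          # overall event-free survival just before current time
            cif = 0.0
            times_out, cif_out = [], []
            for t in np.unique(time):
                at_risk = np.sum(time >= t)
                d_cause = np.sum((time == t) & (event == cause))
                d_all = np.sum((time == t) & (event != 0))
                if at_risk > 0:
                    cif += surv * d_cause / at_risk          # increment from this time point
                    surv *= 1.0 - d_all / at_risk            # update overall survival
                times_out.append(t)
                cif_out.append(cif)
            return np.array(times_out), np.array(cif_out)

        # Hypothetical follow-up times (months) and event codes
        t = [2, 3, 3, 5, 7, 8, 8, 10, 12, 12]
        e = [1, 0, 2, 1, 0, 1, 2, 0, 1, 0]
        times, cif = cumulative_incidence(t, e, cause=1)
        print(np.column_stack([times, np.round(cif, 3)]))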

  13. Industry sector analysis: The profile of the market for water and wastewater pollution control systems (the Philippines). Export trade information

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miranda, A.L.

    1990-11-01

    The market survey covers the water and wastewater pollution control systems market in the Philippines. The analysis contains statistical and narrative information on projected market demand, end-users; receptivity of Philippine consumers to U.S. products; the competitive situation, and market access (tariffs, non-tariff barriers, standards, taxes, distribution channels). It also contains key contact information.

  14. Living Animals in the Classroom: A Meta-Analysis on Learning Outcome and a Treatment-Control Study Focusing on Knowledge and Motivation

    ERIC Educational Resources Information Center

    Hummel, Eberhard; Randler, Christoph

    2012-01-01

    Prior research states that the use of living animals in the classroom leads to a higher knowledge but those previous studies have methodological and statistical problems. We applied a meta-analysis and developed a treatment-control study in a middle school classroom. The treatments (film vs. living animal) differed only by the presence of the…

  15. Radiosurgery of Glomus Jugulare Tumors: A Meta-Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guss, Zachary D.; Batra, Sachin; Limb, Charles J.

    2011-11-15

    Purpose: During the past two decades, radiosurgery has arisen as a promising approach to the management of glomus jugulare. In the present study, we report on a systematic review and meta-analysis of the available published data on the radiosurgical management of glomus jugulare tumors. Methods and Materials: To identify eligible studies, systematic searches of all glomus jugulare tumors treated with radiosurgery were conducted in major scientific publication databases. The data search yielded 19 studies, which were included in the meta-analysis. The data from 335 glomus jugulare patients were extracted. The fixed effects pooled proportions were calculated from the data when Cochran's statistic was statistically insignificant and the inconsistency among studies was <25%. Bias was assessed using the Egger funnel plot test. Results: Across all studies, 97% of patients achieved tumor control, and 95% of patients achieved clinical control. Eight studies reported a mean or median follow-up time of >36 months. In these studies, 95% of patients achieved clinical control and 96% achieved tumor control. The gamma knife, linear accelerator, and CyberKnife technologies all exhibited high rates of tumor and clinical control. Conclusions: The present study reports the results of a meta-analysis for the radiosurgical management of glomus jugulare. Because of its high effectiveness, we suggest considering radiosurgery for the primary management of glomus jugulare tumors.

  16. Exceedance statistics of accelerations resulting from thruster firings on the Apollo-Soyuz mission

    NASA Technical Reports Server (NTRS)

    Fichtl, G. H.; Holland, R. L.

    1983-01-01

    Spacecraft acceleration resulting from firings of vernier control system thrusters is an important consideration in the design, planning, execution and post-flight analysis of laboratory experiments in space. In particular, scientists and technologists involved with the development of experiments to be performed in space in many instances require statistical information on the magnitude and rate of occurrence of spacecraft accelerations. Typically, these accelerations are stochastic in nature, so that it is useful to characterize them in statistical terms. Statistics of spacecraft accelerations are summarized. Previously announced in STAR as N82-12127

  17. Gain optimization with non-linear controls

    NASA Technical Reports Server (NTRS)

    Slater, G. L.; Kandadai, R. D.

    1984-01-01

    An algorithm has been developed for the analysis and design of controls for non-linear systems. The technical approach is to use statistical linearization to model the non-linear dynamics of a system by a quasi-Gaussian model. A covariance analysis is performed to determine the behavior of the dynamical system and a quadratic cost function. Expressions for the cost function and its derivatives are determined so that numerical optimization techniques can be applied to determine optimal feedback laws. The primary application for this paper is centered on the design of controls for nominally linear systems in which the controls are saturated or limited by fixed constraints. The analysis is general, however, and numerical computation requires only that the specific non-linearity be considered in the analysis.
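
    The core statistical-linearization step can be illustrated with a saturation nonlinearity: replace it by the mean-square-optimal equivalent gain for a Gaussian input, here estimated by Monte Carlo. The saturation limit and input levels are assumptions for illustration, not the paper's system.

        import numpy as np

        def equivalent_gain_saturation(sigma, limit, n_samples=200_000, seed=4):
            """Statistical (stochastic) linearization sketch: the gain K that minimizes
            mean-square error for a zero-mean Gaussian input of standard deviation sigma
            is K = E[x * sat(x)] / E[x^2], computed here by Monte Carlo."""
            rng = np.random.default_rng(seed)
            x = rng.normal(0.0, sigma, n_samples)
            y = np.clip(x, -limit, limit)                 # saturation nonlinearity
            return np.mean(x * y) / np.mean(x * x)

        for sigma in (0.2, 0.5, 1.0, 2.0):
            K = equivalent_gain_saturation(sigma, limit=1.0)
            print(f"sigma = {sigma:4.1f}  equivalent gain K = {K:.3f}")
        # As sigma grows past the limit, K drops below 1, which is how the quasi-Gaussian
        # model captures the loss of control authority due to actuator saturation.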

  18. Oral cancer associated with chronic mechanical irritation of the oral mucosa.

    PubMed

    Piemonte, E; Lazos, J; Belardinelli, P; Secchi, D; Brunotto, M; Lanfranchi-Tizeira, H

    2018-03-01

    Most of the studies dealing with Chronic Mechanical Irritation (CMI) and Oral Cancer (OC) have considered prosthetic and dental variables only separately, and functional CMI factors are not recorded. Thus, the aim of this study was to assess OC risk in individuals with dental, prosthetic and functional CMI. We also examined the presence of CMI in relation to tumor size. A case-control study was carried out from 2009 to 2013. The study group comprised squamous cell carcinoma cases; the control group comprised patients seeking dental treatment at the same institution. In total, 153 patients were studied (study group n=53, control group n=100). CMI reproducibility displayed a correlation coefficient of 1 (p<0.0001). Bivariate analysis showed statistically significant associations for all variables (age, gender, tobacco and alcohol consumption, and CMI). Multivariate analysis exhibited statistical significance for age, alcohol, and CMI, but not for gender or tobacco. The relationship of CMI with tumor size showed no statistically significant differences. CMI could be regarded as a risk factor for oral cancer. In individuals with other OC risk factors, proper treatment of the mechanically injurious factors (dental, prosthetic and functional) could be an important measure to reduce the risk of oral cancer.

  19. Prevalence of dental attrition in in vitro fertilization children of West Bengal

    PubMed Central

    Kar, Sudipta; Sarkar, Subrata; Mukherjee, Ananya

    2014-01-01

    CONTEXT: Dental attrition is one of the problems affecting tooth structure. It may affect both in vitro fertilization (IVF) and spontaneously conceived children. AIMS: This study aimed to evaluate and compare the prevalence of dental attrition in the deciduous dentition of IVF and spontaneously conceived children. SETTINGS AND DESIGN: In a cross-sectional case-control study, the dental attrition status of 3-5-year-old children was assessed. The case group consisted of term, singleton babies who were the outcome of IVF in the studied area in 2009. SUBJECTS AND METHODS: The control group consisted of term, first-born, singleton, spontaneously conceived 3-5-year-old children who were also residents of the studied area. A sample of 153 IVF and 153 spontaneously conceived children was examined according to the Hansson and Nilner classification. STATISTICAL ANALYSIS USED: Statistical analysis was carried out using Chi-square tests (χ²) or the Z test. RESULTS: No statistically significant difference was found between the study group (IVF children) and the control group (spontaneously conceived children). CONCLUSIONS: IVF children can be considered the same as spontaneously conceived children with respect to dental attrition status. PMID:24829529

  20. Effect of Table Tennis Trainings on Biomotor Capacities in Boys

    ERIC Educational Resources Information Center

    Tas, Murat

    2017-01-01

    The aim of this study is to investigate whether table tennis training affects the biomotor capacities of boys. A total of 40 students aged 10-12, randomly selected into a test group (n = 20) and a control group (n = 20), participated in the research. Statistical analysis of the data was performed using the Statistical Package for the Social Sciences…

  1. Ensemble of Thermostatically Controlled Loads: Statistical Physics Approach.

    PubMed

    Chertkov, Michael; Chernyak, Vladimir

    2017-08-17

    Thermostatically controlled loads, e.g., air conditioners and heaters, are by far the most widespread consumers of electricity. Normally the devices are calibrated to provide the so-called bang-bang control - changing from on to off, and vice versa, depending on temperature. We considered aggregation of a large group of similar devices into a statistical ensemble, where the devices operate following the same dynamics, subject to stochastic perturbations and randomized, Poisson on/off switching policy. Using theoretical and computational tools of statistical physics, we analyzed how the ensemble relaxes to a stationary distribution and established a relationship between the relaxation and the statistics of the probability flux associated with devices' cycling in the mixed (discrete, switch on/off, and continuous temperature) phase space. This allowed us to derive the spectrum of the non-equilibrium (detailed balance broken) statistical system and uncover how switching policy affects oscillatory trends and the speed of the relaxation. Relaxation of the ensemble is of practical interest because it describes how the ensemble recovers from significant perturbations, e.g., forced temporary switching off aimed at utilizing the flexibility of the ensemble to provide "demand response" services to change consumption temporarily to balance a larger power grid. We discuss how the statistical analysis can guide further development of the emerging demand response technology.
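
    A toy simulation of such an ensemble, with exponential temperature relaxation, small stochastic perturbations, and Poisson-like randomized switching near the comfort band edges, is sketched below; every parameter value is an illustrative assumption rather than the calibrated model of the paper.

        import numpy as np

        def simulate_tcl_ensemble(n_devices=5000, steps=2000, dt=0.01, seed=5):
            """Toy ensemble of thermostatically controlled loads (cooling case).
            Temperature relaxes toward the ambient value when a device is off and toward
            a lower value when it is on; devices switch on/off at random (Poisson-like)
            rates once they cross the comfort band edges. All parameters are illustrative."""
            rng = np.random.default_rng(seed)
            T_ambient, T_on, T_min, T_max = 1.0, -1.0, -0.5, 0.5
            rate = 5.0                                     # switching intensity
            T = rng.uniform(T_min, T_max, n_devices)       # initial temperatures
            on = rng.random(n_devices) < 0.5               # initial on/off states
            frac_on = []
            for _ in range(steps):
                target = np.where(on, T_on, T_ambient)
                T += dt * (target - T) + 0.05 * np.sqrt(dt) * rng.normal(size=n_devices)
                # randomized switching: devices past a band edge switch with prob ~ rate*dt
                switch_on = (~on) & (T > T_max) & (rng.random(n_devices) < rate * dt)
                switch_off = on & (T < T_min) & (rng.random(n_devices) < rate * dt)
                on = (on | switch_on) & ~switch_off
                frac_on.append(on.mean())
            return np.array(frac_on)

        frac = simulate_tcl_ensemble()
        # The fraction of devices that are on oscillates and then relaxes toward a
        # stationary value; the decay of these oscillations is the relaxation analyzed
        # in the paper with statistical-physics tools.
        print("early vs late aggregate duty cycle:", frac[:5].round(3), frac[-5:].round(3))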

  2. Ensemble of Thermostatically Controlled Loads: Statistical Physics Approach

    DOE PAGES

    Chertkov, Michael; Chernyak, Vladimir

    2017-01-17

    Thermostatically Controlled Loads (TCL), e.g. air-conditioners and heaters, are by far the most wide-spread consumers of electricity. Normally the devices are calibrated to provide the so-called bang-bang control of temperature - changing from on to off, and vice versa, depending on temperature. Aggregation of a large group of similar devices into a statistical ensemble is considered, where the devices operate following the same dynamics subject to stochastic perturbations and randomized, Poisson on/off switching policy. We analyze, using theoretical and computational tools of statistical physics, how the ensemble relaxes to a stationary distribution and establish relation between the relaxation and statistics of the probability flux, associated with devices' cycling in the mixed (discrete, switch on/off, and continuous, temperature) phase space. This allowed us to derive and analyze spectrum of the non-equilibrium (detailed balance broken) statistical system and uncover how switching policy affects oscillatory trend and speed of the relaxation. Relaxation of the ensemble is of a practical interest because it describes how the ensemble recovers from significant perturbations, e.g. forceful temporary switching off aimed at utilizing flexibility of the ensemble in providing "demand response" services relieving consumption temporarily to balance larger power grid. We discuss how the statistical analysis can guide further development of the emerging demand response technology.

  3. Ensemble of Thermostatically Controlled Loads: Statistical Physics Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chertkov, Michael; Chernyak, Vladimir

    Thermostatically Controlled Loads (TCL), e.g. air-conditioners and heaters, are by far the most wide-spread consumers of electricity. Normally the devices are calibrated to provide the so-called bang-bang control of temperature - changing from on to off, and vice versa, depending on temperature. Aggregation of a large group of similar devices into a statistical ensemble is considered, where the devices operate following the same dynamics subject to stochastic perturbations and randomized, Poisson on/off switching policy. We analyze, using theoretical and computational tools of statistical physics, how the ensemble relaxes to a stationary distribution and establish relation between the relaxation and statistics of the probability flux, associated with devices' cycling in the mixed (discrete, switch on/off, and continuous, temperature) phase space. This allowed us to derive and analyze spectrum of the non-equilibrium (detailed balance broken) statistical system and uncover how switching policy affects oscillatory trend and speed of the relaxation. Relaxation of the ensemble is of a practical interest because it describes how the ensemble recovers from significant perturbations, e.g. forceful temporary switching off aimed at utilizing flexibility of the ensemble in providing "demand response" services relieving consumption temporarily to balance larger power grid. We discuss how the statistical analysis can guide further development of the emerging demand response technology.

  4. 77 FR 17460 - Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-26

    ..., Associated Form, and OMB Control Number: The 2012 Post- Election Survey of State and Local Election Officials; OMB Control Number 0704-0125. Needs and Uses: The information collection requirement is necessary to.... 1973ff]). UOCAVA requires a statistical analysis report to the President and Congress on the...

  5. Simulation on a car interior aerodynamic noise control based on statistical energy analysis

    NASA Astrophysics Data System (ADS)

    Chen, Xin; Wang, Dengfeng; Ma, Zhengdong

    2012-09-01

    How to simulate interior aerodynamic noise accurately is an important question in car interior noise reduction. The unsteady aerodynamic pressure on body surfaces proves to be the key factor in controlling high-frequency car interior aerodynamic noise at high speed. In this paper, a detailed statistical energy analysis (SEA) model is built, and the vibro-acoustic power inputs are applied to the model to obtain valid results for car interior noise analysis. The model is a solid foundation for further optimization of car interior noise control. Once a comprehensive SEA analysis has identified the subsystems whose power contributions to car interior noise are most sensitive, the sound pressure level of car interior aerodynamic noise can be reduced by improving their acoustic and damping characteristics. Further vehicle testing results show that the interior acoustic performance can be improved by using a detailed SEA model comprising more than 80 subsystems, together with calculation of the unsteady aerodynamic pressure on body surfaces and improvement of the sound and damping properties of materials. A reduction of more than 2 dB is achieved at band center frequencies above 800 Hz. The proposed optimization method can serve as a reference for car interior aerodynamic noise control using a detailed SEA model integrated with unsteady computational fluid dynamics (CFD) and sensitivity analysis of acoustic contributions.
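
    The underlying SEA bookkeeping is a linear power balance in the subsystem energies; the sketch below solves a two-subsystem version (for example, a body panel coupled to the cabin cavity) with made-up loss factors, purely to illustrate the mechanism the sensitivity analysis relies on.

        import numpy as np

        def sea_two_subsystem(power_in, omega, eta1, eta2, eta12, eta21):
            """Minimal two-subsystem SEA power balance: solve omega * L * E = P for the
            subsystem energies E, where L contains damping and coupling loss factors.
            All loss-factor values used below are illustrative assumptions, not
            measured vehicle data."""
            L = np.array([[eta1 + eta12, -eta21],
                          [-eta12,       eta2 + eta21]])
            return np.linalg.solve(omega * L, np.asarray(power_in, dtype=float))

        omega = 2 * np.pi * 1000.0                       # 1 kHz band center frequency (rad/s)
        E_panel, E_cabin = sea_two_subsystem(
            power_in=[1.0, 0.0],                         # unsteady aerodynamic load on the panel only
            omega=omega, eta1=0.02, eta2=0.01, eta12=0.004, eta21=0.002)
        print(f"panel energy = {E_panel:.3e} J, cabin energy = {E_cabin:.3e} J")
        # Raising the cabin damping loss factor eta2 (more absorption) lowers E_cabin,
        # which is the mechanism the SEA sensitivity analysis exploits.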

  6. ON MODEL SELECTION STRATEGIES TO IDENTIFY GENES UNDERLYING BINARY TRAITS USING GENOME-WIDE ASSOCIATION DATA.

    PubMed

    Wu, Zheyang; Zhao, Hongyu

    2012-01-01

    For more fruitful discoveries of genetic variants associated with diseases in genome-wide association studies, it is important to know whether joint analysis of multiple markers is more powerful than the commonly used single-marker analysis, especially in the presence of gene-gene interactions. This article provides a statistical framework to rigorously address this question through analytical power calculations for common model search strategies to detect binary trait loci: marginal search, exhaustive search, forward search, and two-stage screening search. Our approach incorporates linkage disequilibrium, random genotypes, and correlations among score test statistics of logistic regressions. We derive analytical results under two power definitions: the power of finding all the associated markers and the power of finding at least one associated marker. We also consider two types of error controls: the discovery number control and the Bonferroni type I error rate control. After demonstrating the accuracy of our analytical results by simulations, we apply them to consider a broad genetic model space to investigate the relative performances of different model search strategies. Our analytical study provides rapid computation as well as insights into the statistical mechanism of capturing genetic signals under different genetic models including gene-gene interactions. Even though we focus on genetic association analysis, our results on the power of model selection procedures are clearly very general and applicable to other studies.

  7. ON MODEL SELECTION STRATEGIES TO IDENTIFY GENES UNDERLYING BINARY TRAITS USING GENOME-WIDE ASSOCIATION DATA

    PubMed Central

    Wu, Zheyang; Zhao, Hongyu

    2013-01-01

    For more fruitful discoveries of genetic variants associated with diseases in genome-wide association studies, it is important to know whether joint analysis of multiple markers is more powerful than the commonly used single-marker analysis, especially in the presence of gene-gene interactions. This article provides a statistical framework to rigorously address this question through analytical power calculations for common model search strategies to detect binary trait loci: marginal search, exhaustive search, forward search, and two-stage screening search. Our approach incorporates linkage disequilibrium, random genotypes, and correlations among score test statistics of logistic regressions. We derive analytical results under two power definitions: the power of finding all the associated markers and the power of finding at least one associated marker. We also consider two types of error controls: the discovery number control and the Bonferroni type I error rate control. After demonstrating the accuracy of our analytical results by simulations, we apply them to consider a broad genetic model space to investigate the relative performances of different model search strategies. Our analytical study provides rapid computation as well as insights into the statistical mechanism of capturing genetic signals under different genetic models including gene-gene interactions. Even though we focus on genetic association analysis, our results on the power of model selection procedures are clearly very general and applicable to other studies. PMID:23956610
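
    As a toy illustration of the kind of power calculation this framework formalizes, the sketch below computes the power of a plain marginal (single-marker) search to find at least one associated marker under Bonferroni type I error rate control. It assumes independent, approximately normal score statistics; the marker count and non-centrality parameters are made-up inputs, and the paper's analytical results additionally handle linkage disequilibrium and correlated test statistics.

```python
import numpy as np
from scipy.stats import norm

def marginal_search_power(ncp, m_total, alpha=0.05):
    """Power of detecting at least one truly associated marker with a
    two-sided z (score) test, Bonferroni-corrected over m_total markers.
    ncp: non-centrality parameters of the associated markers.
    Assumes independent, approximately normal test statistics."""
    z_crit = norm.ppf(1 - alpha / (2 * m_total))
    # per-marker power of a two-sided test with mean shift ncp
    power_each = norm.sf(z_crit - ncp) + norm.cdf(-z_crit - ncp)
    # power of finding at least one associated marker (independence assumption)
    return 1 - np.prod(1 - power_each)

# hypothetical example: two causal markers among 500,000 tested
print(marginal_search_power(ncp=np.array([4.5, 5.0]), m_total=5e5))
```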

  8. Treatment of control data in lunar phototriangulation. [application of statistical procedures and development of mathematical and computer techniques

    NASA Technical Reports Server (NTRS)

    Wong, K. W.

    1974-01-01

    In lunar phototriangulation, there is a complete lack of accurate ground control points. The accuracy analysis of the results of lunar phototriangulation must, therefore, be completely dependent on statistical procedures. It was the objective of this investigation to examine the validity of the commonly used statistical procedures, and to develop both mathematical techniques and computer software for evaluating (1) the accuracy of lunar phototriangulation; (2) the contribution of the different types of photo support data to the accuracy of lunar phototriangulation; (3) the accuracy of absolute orientation as a function of the accuracy and distribution of both the ground and model points; and (4) the relative slope accuracy between any triangulated pass points.

  9. mapDIA: Preprocessing and statistical analysis of quantitative proteomics data from data independent acquisition mass spectrometry.

    PubMed

    Teo, Guoshou; Kim, Sinae; Tsou, Chih-Chiang; Collins, Ben; Gingras, Anne-Claude; Nesvizhskii, Alexey I; Choi, Hyungwon

    2015-11-03

    Data independent acquisition (DIA) mass spectrometry is an emerging technique that offers more complete detection and quantification of peptides and proteins across multiple samples. DIA allows fragment-level quantification, which can be considered as repeated measurements of the abundance of the corresponding peptides and proteins in the downstream statistical analysis. However, few statistical approaches are available for aggregating these complex fragment-level data into peptide- or protein-level statistical summaries. In this work, we describe a software package, mapDIA, for statistical analysis of differential protein expression using DIA fragment-level intensities. The workflow consists of three major steps: intensity normalization, peptide/fragment selection, and statistical analysis. First, mapDIA offers normalization of fragment-level intensities by total intensity sums as well as a novel alternative normalization by local intensity sums in retention time space. Second, mapDIA removes outlier observations and selects peptides/fragments that preserve the major quantitative patterns across all samples for each protein. Last, using the selected fragments and peptides, mapDIA performs model-based statistical significance analysis of protein-level differential expression between specified groups of samples. Using a comprehensive set of simulation datasets, we show that mapDIA detects differentially expressed proteins with accurate control of the false discovery rates. We also describe the analysis procedure in detail using two recently published DIA datasets generated for the 14-3-3β dynamic interaction network and the prostate cancer glycoproteome. The software was written in C++ and the source code is available for free through the SourceForge website http://sourceforge.net/projects/mapdia/. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.
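
    The first workflow step, normalization of fragment intensities by total intensity sums, is conceptually simple. A minimal sketch is shown below; the matrix layout, scaling convention, and log2 transform are assumptions made for illustration and do not reflect mapDIA's actual input format.

```python
import numpy as np

def normalize_total_intensity(intensity):
    """Scale each sample (column) so that its total fragment intensity equals
    the median total across samples, then log2-transform.
    intensity: fragments x samples array of raw DIA fragment intensities."""
    totals = intensity.sum(axis=0)
    scaled = intensity * (np.median(totals) / totals)
    return np.log2(scaled + 1.0)   # +1 avoids taking the log of zero

# hypothetical 4-fragment x 3-sample example
raw = np.array([[1200.0,  900.0, 1500.0],
                [ 300.0,  250.0,  420.0],
                [  80.0,   60.0,  110.0],
                [2000.0, 1700.0, 2600.0]])
print(normalize_total_intensity(raw).round(2))
```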

  10. Optimality, stochasticity, and variability in motor behavior

    PubMed Central

    Guigon, Emmanuel; Baraduc, Pierre; Desmurget, Michel

    2008-01-01

    Recent theories of motor control have proposed that the nervous system acts as a stochastically optimal controller, i.e. it plans and executes motor behaviors taking into account the nature and statistics of noise. Detrimental effects of noise are converted into a principled way of controlling movements. Attractive aspects of such theories are their ability to explain not only characteristic features of single motor acts, but also statistical properties of repeated actions. Here, we present a critical analysis of stochastic optimality in motor control which reveals several difficulties with this hypothesis. We show that stochastic control may not be necessary to explain the stochastic nature of motor behavior, and we propose an alternative framework, based on the action of a deterministic controller coupled with an optimal state estimator, which relieves drawbacks of stochastic optimality and appropriately explains movement variability. PMID:18202922

  11. Anomalous heat transfer modes of nanofluids: a review based on statistical analysis

    NASA Astrophysics Data System (ADS)

    Sergis, Antonis; Hardalupas, Yannis

    2011-05-01

    This paper contains the results of a concise statistical review analysis of a large amount of publications regarding the anomalous heat transfer modes of nanofluids. The application of nanofluids as coolants is a novel practise with no established physical foundations explaining the observed anomalous heat transfer. As a consequence, traditional methods of performing a literature review may not be adequate in presenting objectively the results representing the bulk of the available literature. The current literature review analysis aims to resolve the problems faced by researchers in the past by employing an unbiased statistical analysis to present and reveal the current trends and general belief of the scientific community regarding the anomalous heat transfer modes of nanofluids. The thermal performance analysis indicated that statistically there exists a variable enhancement for conduction, convection/mixed heat transfer, pool boiling heat transfer and critical heat flux modes. The most popular proposed mechanisms in the literature to explain heat transfer in nanofluids are revealed, as well as possible trends between nanofluid properties and thermal performance. The review also suggests future experimentation to provide more conclusive answers to the control mechanisms and influential parameters of heat transfer in nanofluids.

  12. Anomalous heat transfer modes of nanofluids: a review based on statistical analysis.

    PubMed

    Sergis, Antonis; Hardalupas, Yannis

    2011-05-19

    This paper contains the results of a concise statistical review analysis of a large amount of publications regarding the anomalous heat transfer modes of nanofluids. The application of nanofluids as coolants is a novel practise with no established physical foundations explaining the observed anomalous heat transfer. As a consequence, traditional methods of performing a literature review may not be adequate in presenting objectively the results representing the bulk of the available literature. The current literature review analysis aims to resolve the problems faced by researchers in the past by employing an unbiased statistical analysis to present and reveal the current trends and general belief of the scientific community regarding the anomalous heat transfer modes of nanofluids. The thermal performance analysis indicated that statistically there exists a variable enhancement for conduction, convection/mixed heat transfer, pool boiling heat transfer and critical heat flux modes. The most popular proposed mechanisms in the literature to explain heat transfer in nanofluids are revealed, as well as possible trends between nanofluid properties and thermal performance. The review also suggests future experimentation to provide more conclusive answers to the control mechanisms and influential parameters of heat transfer in nanofluids.

  13. Anomalous heat transfer modes of nanofluids: a review based on statistical analysis

    PubMed Central

    2011-01-01

    This paper contains the results of a concise statistical review analysis of a large amount of publications regarding the anomalous heat transfer modes of nanofluids. The application of nanofluids as coolants is a novel practise with no established physical foundations explaining the observed anomalous heat transfer. As a consequence, traditional methods of performing a literature review may not be adequate in presenting objectively the results representing the bulk of the available literature. The current literature review analysis aims to resolve the problems faced by researchers in the past by employing an unbiased statistical analysis to present and reveal the current trends and general belief of the scientific community regarding the anomalous heat transfer modes of nanofluids. The thermal performance analysis indicated that statistically there exists a variable enhancement for conduction, convection/mixed heat transfer, pool boiling heat transfer and critical heat flux modes. The most popular proposed mechanisms in the literature to explain heat transfer in nanofluids are revealed, as well as possible trends between nanofluid properties and thermal performance. The review also suggests future experimentation to provide more conclusive answers to the control mechanisms and influential parameters of heat transfer in nanofluids. PMID:21711932

  14. Development of a Comprehensive Digital Avionics Curriculum for the Aeronautical Engineer

    DTIC Science & Technology

    2006-03-01

    able to analyze and design aircraft and missile guidance and control systems, including feedback stabilization schemes and stochastic processes, using ... Uncertainty modeling for robust control; robust closed-loop stability and performance; robust H-infinity control; robustness checks using mu-analysis ... Controlled feedback (reduces noise); 3. Statistical group response (reduces pressure toward conformity). When used as a tool to study a complex problem ...

  15. Probabilistic/Fracture-Mechanics Model For Service Life

    NASA Technical Reports Server (NTRS)

    Watkins, T., Jr.; Annis, C. G., Jr.

    1991-01-01

    Computer program makes probabilistic estimates of lifetime of engine and components thereof. Developed to fill need for more accurate life-assessment technique that avoids errors in estimated lives and provides for statistical assessment of levels of risk created by engineering decisions in designing system. Implements mathematical model combining techniques of statistics, fatigue, fracture mechanics, nondestructive analysis, life-cycle cost analysis, and management of engine parts. Used to investigate effects of such engine-component life-controlling parameters as return-to-service intervals, stresses, capabilities for nondestructive evaluation, and qualities of materials.

  16. A clinicomicrobiological study to evaluate the efficacy of manual and powered toothbrushes among autistic patients

    PubMed Central

    Vajawat, Mayuri; Deepika, P. C.; Kumar, Vijay; Rajeshwari, P.

    2015-01-01

    Aim: To compare the efficacy of powered toothbrushes, as compared to manual toothbrushes, in improving gingival health and reducing salivary red complex counts among autistic individuals. Materials and Methods: Forty autistic individuals were selected. The test group received powered toothbrushes and the control group received manual toothbrushes. Plaque index and gingival index were recorded. Unstimulated saliva was collected for analysis of red complex organisms using polymerase chain reaction. Results: A statistically significant reduction in the plaque scores was seen over a period of 12 weeks in both groups (P < 0.001 for tests and P = 0.002 for controls). This reduction was statistically more significant in the test group (P = 0.024). A statistically significant reduction in the gingival scores was seen over a period of 12 weeks in both groups (P < 0.001 for tests and P = 0.001 for controls). This reduction was statistically more significant in the test group (P = 0.042). No statistically significant reduction in the detection rate of red complex organisms was seen at 4 weeks in either group. Conclusion: Powered toothbrushes result in a significant overall improvement in gingival health when constant reinforcement of oral hygiene instructions is given. PMID:26681855

  17. Temporal scaling and spatial statistical analyses of groundwater level fluctuations

    NASA Astrophysics Data System (ADS)

    Sun, H.; Yuan, L., Sr.; Zhang, Y.

    2017-12-01

    Natural dynamics such as groundwater level fluctuations can exhibit multifractionality and/or multifractality, likely due to multi-scale aquifer heterogeneity and controlling factors, whose statistics require efficient quantification methods. This study explores multifractionality and non-Gaussian properties in groundwater dynamics expressed by time series of daily level fluctuations at three wells located in the lower Mississippi valley, after removing the seasonal cycle, in the temporal scaling and spatial statistical analyses. First, using time-scale multifractional analysis, a systematic statistical method is developed to analyze groundwater level fluctuations quantified by the time-scale local Hurst exponent (TS-LHE). Results show that the TS-LHE does not remain constant, implying that the fractal-scaling behavior changes with time and location. Hence, we can distinguish a potentially location-dependent scaling feature, which may characterize the dynamics of the hydrologic system. Second, spatial statistical analysis shows that the increments of groundwater level fluctuations exhibit a heavy-tailed, non-Gaussian distribution, which can be better quantified by a Lévy stable distribution. Monte Carlo simulations of the fluctuation process also show that the linear fractional stable motion model depicts well the transient dynamics (i.e., the fractal non-Gaussian property) of groundwater levels, whereas fractional Brownian motion is inadequate to describe natural processes with anomalous dynamics. Analysis of temporal scaling and spatial statistics may therefore provide useful information and quantification for further understanding the nature of complex dynamics in hydrology.
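
    A rough feel for local Hurst exponent estimation can be had from a windowed rescaled-range (R/S) calculation. The sketch below is a generic estimator applied to the increments of a synthetic level-like series, not the study's TS-LHE method; the scales, window size, and step are arbitrary choices.

```python
import numpy as np

def hurst_rs(x, scales=(8, 16, 32, 64, 128)):
    """Crude rescaled-range (R/S) Hurst exponent estimate for a 1-D series."""
    rs = []
    for s in scales:
        vals = []
        for k in range(len(x) // s):
            seg = x[k * s:(k + 1) * s]
            dev = np.cumsum(seg - seg.mean())
            sd = seg.std(ddof=1)
            if sd > 0:
                vals.append((dev.max() - dev.min()) / sd)
        rs.append(np.mean(vals))
    # slope of log(R/S) versus log(scale) estimates the Hurst exponent
    return np.polyfit(np.log(scales), np.log(rs), 1)[0]

def local_hurst(x, window=256, step=32):
    """Hurst exponent in sliding windows, i.e. a crude 'local' scaling exponent."""
    return [hurst_rs(x[i:i + window]) for i in range(0, len(x) - window + 1, step)]

rng = np.random.default_rng(0)
levels = np.cumsum(rng.standard_normal(2000))   # synthetic groundwater-level-like series
incr = np.diff(levels)                          # R/S is applied to the increments
print(np.round(local_hurst(incr)[:5], 2))       # ~0.5 expected for uncorrelated increments
```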

  18. Robust inference from multiple test statistics via permutations: a better alternative to the single test statistic approach for randomized trials.

    PubMed

    Ganju, Jitendra; Yu, Xinxin; Ma, Guoguang Julie

    2013-01-01

    Formal inference in randomized clinical trials is based on controlling the type I error rate associated with a single pre-specified statistic. The deficiency of using just one method of analysis is that it depends on assumptions that may not be met. For robust inference, we propose pre-specifying multiple test statistics and relying on the minimum p-value for testing the null hypothesis of no treatment effect. The null hypothesis associated with the various test statistics is that the treatment groups are indistinguishable. The critical value for hypothesis testing comes from permutation distributions. Rejection of the null hypothesis when the smallest p-value is less than the critical value controls the type I error rate at its designated value. Even if one of the candidate test statistics has low power, the adverse effect on the power of the minimum p-value statistic is small. Its use is illustrated with examples. We conclude that it is better to rely on the minimum p-value rather than a single statistic, particularly when that single statistic is the logrank test, because of the cost and complexity of many survival trials. Copyright © 2013 John Wiley & Sons, Ltd.
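
    A minimal sketch of the minimum p-value permutation idea is shown below, assuming a two-arm trial with a continuous outcome and using the t-test and Wilcoxon rank-sum test as the two candidate statistics; the paper's candidate set and trial endpoints (e.g., survival with the logrank test) will differ.

```python
import numpy as np
from scipy import stats

def min_p_permutation_test(y_treat, y_ctrl, n_perm=2000, seed=0):
    """Permutation test based on the minimum of several candidate p-values."""
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([y_treat, y_ctrl])
    n_t = len(y_treat)

    def min_p(a, b):
        p1 = stats.ttest_ind(a, b).pvalue
        p2 = stats.mannwhitneyu(a, b, alternative="two-sided").pvalue
        return min(p1, p2)

    observed = min_p(y_treat, y_ctrl)
    null = np.empty(n_perm)
    for i in range(n_perm):
        perm = rng.permutation(pooled)          # relabel arms under the null of no effect
        null[i] = min_p(perm[:n_t], perm[n_t:])
    # permutation p-value: how often a random relabelling beats the observed min-p
    return (np.sum(null <= observed) + 1) / (n_perm + 1)

rng = np.random.default_rng(1)
print(min_p_permutation_test(rng.normal(0.5, 1, 40), rng.normal(0.0, 1, 40)))
```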

  19. Experience with multiple control groups in a large population-based case-control study on genetic and environmental risk factors.

    PubMed

    Pomp, E R; Van Stralen, K J; Le Cessie, S; Vandenbroucke, J P; Rosendaal, F R; Doggen, C J M

    2010-07-01

    We discuss the analytic and practical considerations in a large case-control study that had two control groups; the first control group consisting of partners of patients and the second obtained by random digit dialling (RDD). As an example of the evaluation of a general lifestyle factor, we present body mass index (BMI). Both control groups had lower BMIs than the patients. The distribution in the partner controls was closer to that of the patients, likely due to similar lifestyles. A statistical approach was used to pool the results of both analyses, wherein partners were analyzed with a matched analysis, while RDDs were analyzed without matching. Even with a matched analysis, the odds ratio with partner controls remained closer to unity than with RDD controls, which is probably due to unmeasured confounders in the comparison with the random controls as well as intermediary factors. However, when studying injuries as a risk factor, the odds ratio remained higher with partner control subjects than with RDD control subjects, even after taking the matching into account. Finally, we used factor V Leiden as an example of a genetic risk factor. The frequencies of factor V Leiden were identical in both control groups, indicating that for the analyses of this genetic risk factor the two control groups could be combined in a single unmatched analysis. In conclusion, the effect measures with the two control groups were in the same direction, and of the same order of magnitude. Moreover, it was not always the same control group that produced the higher or lower estimates, and a matched analysis did not remedy the differences. Our experience with the intricacies of dealing with two control groups may be useful to others when thinking about an optimal research design or the best statistical approach.

  20. Graph theory applied to noise and vibration control in statistical energy analysis models.

    PubMed

    Guasch, Oriol; Cortés, Lluís

    2009-06-01

    A fundamental aspect of noise and vibration control in statistical energy analysis (SEA) models consists of first identifying and then reducing the energy flow paths between subsystems. In this work, it is proposed to make use of some results from graph theory to address both issues. On the one hand, linear and path algebras applied to adjacency matrices of SEA graphs are used to determine the existence of paths of any order between subsystems, counting and labeling them, finding extremal paths, or determining the power flow contributions from groups of paths. On the other hand, a strategy is presented that makes use of graph cut algorithms to reduce the energy flow from a source subsystem to a receiver one, modifying as few internal and coupling loss factors as possible.
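
    The path-counting idea can be illustrated with plain matrix powers of a subsystem adjacency matrix: entry (i, j) of A^k counts the walks of order k from subsystem i to j. The 4-subsystem connectivity below is a made-up example, not an SEA model from the paper, and counting walks is only the simplest of the path algebras the authors discuss.

```python
import numpy as np

# Hypothetical SEA connectivity graph: A[i, j] = 1 if energy can flow i -> j.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]])

def count_walks(adjacency, source, receiver, max_order):
    """Number of walks of each order (1..max_order) from source to receiver."""
    counts, power = [], np.eye(len(adjacency), dtype=int)
    for _ in range(max_order):
        power = power @ adjacency          # A^k after k multiplications
        counts.append(int(power[source, receiver]))
    return counts

# walks of order 1..4 from subsystem 0 (source) to subsystem 3 (receiver)
print(count_walks(A, source=0, receiver=3, max_order=4))
```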

  1. Analyzing Randomized Controlled Interventions: Three Notes for Applied Linguists

    ERIC Educational Resources Information Center

    Vanhove, Jan

    2015-01-01

    I discuss three common practices that obfuscate or invalidate the statistical analysis of randomized controlled interventions in applied linguistics. These are (a) checking whether randomization produced groups that are balanced on a number of possibly relevant covariates, (b) using repeated measures ANOVA to analyze pretest-posttest designs, and…

  2. The Hard but Necessary Task of Gathering Order-One Effect Size Indices in Meta-Analysis

    ERIC Educational Resources Information Center

    Ortego, Carmen; Botella, Juan

    2010-01-01

    Meta-analysis of studies with two groups and two measurement occasions must employ order-one effect size indices to represent study outcomes. Especially with non-random assignment, non-equivalent control group designs, a statistical analysis restricted to post-treatment scores can lead to severely biased conclusions. The 109 primary studies…

  3. Performance of Modified Test Statistics in Covariance and Correlation Structure Analysis under Conditions of Multivariate Nonnormality.

    ERIC Educational Resources Information Center

    Fouladi, Rachel T.

    2000-01-01

    Provides an overview of standard and modified normal theory and asymptotically distribution-free covariance and correlation structure analysis techniques and details Monte Carlo simulation results on Type I and Type II error control. Demonstrates through the simulation that robustness and nonrobustness of structure analysis techniques vary as a…

  4. Visualization and statistical comparisons of microbial communities using R packages on Phylochip data.

    PubMed

    Holmes, Susan; Alekseyenko, Alexander; Timme, Alden; Nelson, Tyrrell; Pasricha, Pankaj Jay; Spormann, Alfred

    2011-01-01

    This article explains the statistical and computational methodology used to analyze species abundances collected using the LBNL Phylochip in a study of Irritable Bowel Syndrome (IBS) in rats. Some tools already available for the analysis of ordinary microarray data are useful in this type of statistical analysis. For instance, in correcting for multiple testing we use family-wise error rate control and step-down tests (available in the multtest package). Once the most significant species are chosen, we use the hypergeometric tests familiar from testing GO categories to test specific phyla and families. We provide examples of normalization, multivariate projections, batch effect detection and integration of phylogenetic covariation, as well as tree equalization and robustification methods.
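
    The family-wise error rate step-down correction used in that workflow (via the R multtest package) has close analogues in most statistics libraries. Below is a hedged sketch using the Holm step-down procedure from statsmodels on hypothetical per-species p-values; the specific step-down method in the original analysis may differ.

```python
import numpy as np
from statsmodels.stats.multitest import multipletests

# Hypothetical per-species (per-probe) p-values from abundance comparisons.
pvals = np.array([0.0004, 0.0031, 0.012, 0.049, 0.21, 0.37, 0.74])

# Holm's step-down procedure controls the family-wise error rate at alpha.
reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="holm")
for p, pa, r in zip(pvals, p_adj, reject):
    print(f"raw p = {p:.4f}  adjusted p = {pa:.4f}  significant: {r}")
```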

  5. Meta-analysis of thirty-two case-control and two ecological radon studies of lung cancer.

    PubMed

    Dobrzynski, Ludwik; Fornalski, Krzysztof W; Reszczynska, Joanna

    2018-03-01

    A re-analysis has been carried out of thirty-two case-control and two ecological studies concerning the influence of radon, a radioactive gas, on the risk of lung cancer. Three mathematically simplest dose-response relationships (models) were tested: constant (zero health effect), linear, and parabolic (linear-quadratic). Health effect end-points reported in the analysed studies are odds ratios or relative risk ratios, related either to morbidity or mortality. In our preliminary analysis, we show that the results of dose-response fitting are qualitatively (within uncertainties, given as error bars) the same, whichever of these health effect end-points are applied. Therefore, we deemed it reasonable to aggregate all response data into the so-called Relative Health Factor and jointly analysed such mixed data, to obtain better statistical power. In the second part of our analysis, robust Bayesian and classical methods of analysis were applied to this combined dataset. In this part of our analysis, we selected different subranges of radon concentrations. In view of substantial differences between the methodology used by the authors of case-control and ecological studies, the mathematical relationships (models) were applied mainly to the thirty-two case-control studies. The degree to which the two ecological studies, analysed separately, affect the overall results when combined with the thirty-two case-control studies, has also been evaluated. In all, as a result of our meta-analysis of the combined cohort, we conclude that the analysed data concerning radon concentrations below ~1000 Bq/m3 (~20 mSv/year of effective dose to the whole body) do not support the thesis that radon may be a cause of any statistically significant increase in lung cancer incidence.
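
    The three candidate dose-response shapes are straightforward to fit by least squares once per-study relative health factors and radon concentrations are tabulated. The sketch below uses scipy's curve_fit on placeholder points that are not data from the analysed studies; the actual meta-analysis used robust Bayesian and classical fitting with study uncertainties rather than this unweighted fit.

```python
import numpy as np
from scipy.optimize import curve_fit

# Placeholder (dose, relative health factor) points -- not data from the studies.
dose = np.array([25.0, 50.0, 100.0, 200.0, 400.0, 800.0])     # Bq/m^3
rhf = np.array([1.02, 0.98, 1.01, 1.05, 1.03, 1.08])

constant = lambda d, a: a + 0 * d
linear = lambda d, a, b: a + b * d
lin_quad = lambda d, a, b, c: a + b * d + c * d ** 2

for name, model in [("constant", constant), ("linear", linear), ("linear-quadratic", lin_quad)]:
    params, _ = curve_fit(model, dose, rhf)
    rss = np.sum((rhf - model(dose, *params)) ** 2)     # residual sum of squares
    print(f"{name:18s} params = {np.round(params, 6)}  RSS = {rss:.5f}")
```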

  6. Meta-analysis of gene-level associations for rare variants based on single-variant statistics.

    PubMed

    Hu, Yi-Juan; Berndt, Sonja I; Gustafsson, Stefan; Ganna, Andrea; Hirschhorn, Joel; North, Kari E; Ingelsson, Erik; Lin, Dan-Yu

    2013-08-08

    Meta-analysis of genome-wide association studies (GWASs) has led to the discoveries of many common variants associated with complex human diseases. There is a growing recognition that identifying "causal" rare variants also requires large-scale meta-analysis. The fact that association tests with rare variants are performed at the gene level rather than at the variant level poses unprecedented challenges in the meta-analysis. First, different studies may adopt different gene-level tests, so the results are not compatible. Second, gene-level tests require multivariate statistics (i.e., components of the test statistic and their covariance matrix), which are difficult to obtain. To overcome these challenges, we propose to perform gene-level tests for rare variants by combining the results of single-variant analysis (i.e., p values of association tests and effect estimates) from participating studies. This simple strategy is possible because of an insight that multivariate statistics can be recovered from single-variant statistics, together with the correlation matrix of the single-variant test statistics, which can be estimated from one of the participating studies or from a publicly available database. We show both theoretically and numerically that the proposed meta-analysis approach provides accurate control of the type I error and is as powerful as joint analysis of individual participant data. This approach accommodates any disease phenotype and any study design and produces all commonly used gene-level tests. An application to the GWAS summary results of the Genetic Investigation of ANthropometric Traits (GIANT) consortium reveals rare and low-frequency variants associated with human height. The relevant software is freely available. Copyright © 2013 The American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.
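
    The core insight, that a gene-level statistic can be rebuilt from per-variant summary statistics plus their correlation matrix, can be sketched for the simplest case: a weighted burden-style test. The z-scores, weights, and correlation matrix below are hypothetical, and the paper's recovery of full multivariate statistics covers a richer family of gene-level tests than this.

```python
import numpy as np
from scipy.stats import norm

def burden_from_single_variant(z, w, R):
    """Gene-level burden-style test built from single-variant summary statistics.
    z: per-variant z-scores (signed, e.g., derived from p-values and effect directions)
    w: per-variant weights (e.g., a function of minor allele frequency)
    R: correlation matrix of the single-variant test statistics (e.g., LD-based)
    """
    t = w @ z / np.sqrt(w @ R @ w)           # standardized combined statistic
    return t, 2 * norm.sf(abs(t))            # two-sided p-value

z = np.array([1.8, 2.3, -0.4, 1.1])          # hypothetical per-variant z-scores
w = np.array([1.0, 1.0, 0.5, 1.0])           # hypothetical weights
R = np.array([[1.0, 0.3, 0.1, 0.0],
              [0.3, 1.0, 0.2, 0.1],
              [0.1, 0.2, 1.0, 0.4],
              [0.0, 0.1, 0.4, 1.0]])          # hypothetical correlation of the statistics
print(burden_from_single_variant(z, w, R))
```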

  7. Preparing for the first meeting with a statistician.

    PubMed

    De Muth, James E

    2008-12-15

    Practical statistical issues that should be considered when performing data collection and analysis are reviewed. The meeting with a statistician should take place early in the research development before any study data are collected. The process of statistical analysis involves establishing the research question, formulating a hypothesis, selecting an appropriate test, sampling correctly, collecting data, performing tests, and making decisions. Once the objectives are established, the researcher can determine the characteristics or demographics of the individuals required for the study, how to recruit volunteers, what type of data are needed to answer the research question(s), and the best methods for collecting the required information. There are two general types of statistics: descriptive and inferential. Presenting data in a more palatable format for the reader is called descriptive statistics. Inferential statistics involve making an inference or decision about a population based on results obtained from a sample of that population. In order for the results of a statistical test to be valid, the sample should be representative of the population from which it is drawn. When collecting information about volunteers, researchers should only collect information that is directly related to the study objectives. Important information that a statistician will require first is an understanding of the type of variables involved in the study and which variables can be controlled by researchers and which are beyond their control. Data can be presented in one of four different measurement scales: nominal, ordinal, interval, or ratio. Hypothesis testing involves two mutually exclusive and exhaustive statements related to the research question. Statisticians should not be replaced by computer software, and they should be consulted before any research data are collected. When preparing to meet with a statistician, the pharmacist researcher should be familiar with the steps of statistical analysis and consider several questions related to the study to be conducted.

  8. On-Orbit System Identification

    NASA Technical Reports Server (NTRS)

    Mettler, E.; Milman, M. H.; Bayard, D.; Eldred, D. B.

    1987-01-01

    Information derived from accelerometer readings benefits important engineering and control functions. Report discusses methodology for detection, identification, and analysis of motions within space station. Techniques of vibration and rotation analyses, control theory, statistics, filter theory, and transform methods integrated to form system for generating models and model parameters that characterize total motion of complicated space station, with respect to both control-induced and random mechanical disturbances.

  9. Bias, precision and statistical power of analysis of covariance in the analysis of randomized trials with baseline imbalance: a simulation study.

    PubMed

    Egbewale, Bolaji E; Lewis, Martyn; Sim, Julius

    2014-04-09

    Analysis of variance (ANOVA), change-score analysis (CSA) and analysis of covariance (ANCOVA) respond differently to baseline imbalance in randomized controlled trials. However, no empirical studies appear to have quantified the differential bias and precision of estimates derived from these methods of analysis, and their relative statistical power, in relation to combinations of levels of key trial characteristics. This simulation study therefore examined the relative bias, precision and statistical power of these three analyses using simulated trial data. 126 hypothetical trial scenarios were evaluated (126,000 datasets), each with continuous data simulated by using a combination of levels of: treatment effect; pretest-posttest correlation; direction and magnitude of baseline imbalance. The bias, precision and power of each method of analysis were calculated for each scenario. Compared to the unbiased estimates produced by ANCOVA, both ANOVA and CSA are subject to bias, in relation to pretest-posttest correlation and the direction of baseline imbalance. Additionally, ANOVA and CSA are less precise than ANCOVA, especially when pretest-posttest correlation ≥ 0.3. When groups are balanced at baseline, ANCOVA is at least as powerful as the other analyses. Apparently greater power of ANOVA and CSA at certain imbalances is achieved in respect of a biased treatment effect. Across a range of correlations between pre- and post-treatment scores and at varying levels and direction of baseline imbalance, ANCOVA remains the optimum statistical method for the analysis of continuous outcomes in RCTs, in terms of bias, precision and statistical power.

  10. Bias, precision and statistical power of analysis of covariance in the analysis of randomized trials with baseline imbalance: a simulation study

    PubMed Central

    2014-01-01

    Background Analysis of variance (ANOVA), change-score analysis (CSA) and analysis of covariance (ANCOVA) respond differently to baseline imbalance in randomized controlled trials. However, no empirical studies appear to have quantified the differential bias and precision of estimates derived from these methods of analysis, and their relative statistical power, in relation to combinations of levels of key trial characteristics. This simulation study therefore examined the relative bias, precision and statistical power of these three analyses using simulated trial data. Methods 126 hypothetical trial scenarios were evaluated (126 000 datasets), each with continuous data simulated by using a combination of levels of: treatment effect; pretest-posttest correlation; direction and magnitude of baseline imbalance. The bias, precision and power of each method of analysis were calculated for each scenario. Results Compared to the unbiased estimates produced by ANCOVA, both ANOVA and CSA are subject to bias, in relation to pretest-posttest correlation and the direction of baseline imbalance. Additionally, ANOVA and CSA are less precise than ANCOVA, especially when pretest-posttest correlation ≥ 0.3. When groups are balanced at baseline, ANCOVA is at least as powerful as the other analyses. Apparently greater power of ANOVA and CSA at certain imbalances is achieved in respect of a biased treatment effect. Conclusions Across a range of correlations between pre- and post-treatment scores and at varying levels and direction of baseline imbalance, ANCOVA remains the optimum statistical method for the analysis of continuous outcomes in RCTs, in terms of bias, precision and statistical power. PMID:24712304
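
    One scenario of the kind evaluated in this simulation study can be reproduced in a few lines: generate correlated pre/post scores with a deliberate baseline imbalance and compare the treatment-effect estimates from ANOVA (post scores only), change-score analysis, and ANCOVA. The sample size, correlation, effect size, and imbalance below are illustrative choices, not the study's 126 scenarios.

```python
import numpy as np

rng = np.random.default_rng(42)
n, rho, effect, imbalance = 100, 0.6, 0.5, 0.3   # per-arm size, pre-post correlation, true effect, baseline shift

def one_trial():
    pre_t = rng.standard_normal(n) + imbalance        # treated arm starts higher at baseline
    pre_c = rng.standard_normal(n)
    post_t = rho * pre_t + np.sqrt(1 - rho**2) * rng.standard_normal(n) + effect
    post_c = rho * pre_c + np.sqrt(1 - rho**2) * rng.standard_normal(n)
    anova = post_t.mean() - post_c.mean()                       # difference in post scores
    csa = (post_t - pre_t).mean() - (post_c - pre_c).mean()     # change-score analysis
    # ANCOVA: regress post score on group indicator and pre score, take the group coefficient
    y = np.concatenate([post_t, post_c])
    X = np.column_stack([np.ones(2 * n),
                         np.r_[np.ones(n), np.zeros(n)],
                         np.concatenate([pre_t, pre_c])])
    ancova = np.linalg.lstsq(X, y, rcond=None)[0][1]
    return anova, csa, ancova

estimates = np.array([one_trial() for _ in range(2000)])
print("mean estimates (true effect = 0.5):")
for name, col in zip(["ANOVA", "CSA", "ANCOVA"], estimates.T):
    print(f"  {name:6s} {col.mean():.3f}  (bias {col.mean() - effect:+.3f})")
```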

  11. Subjective memory complaints, depressive symptoms and cognition in patients attending a memory outpatient clinic.

    PubMed

    Lehrner, J; Moser, D; Klug, S; Gleiß, A; Auff, E; Dal-Bianco, P; Pusswald, G

    2014-03-01

    The goals of this study were to establish prevalence of subjective memory complaints (SMC) and depressive symptoms (DS) and their relation to cognitive functioning and cognitive status in an outpatient memory clinic cohort. Two hundred forty-eight cognitively healthy controls and 581 consecutive patients with cognitive complaints who fulfilled the inclusion criteria were included in the study. A statistically significant difference (p < 0.001) between control group and patient group regarding mean SMC was detected. 7.7% of controls reported a considerable degree of SMC, whereas 35.8% of patients reported considerable SMC. Additionally, a statistically significant difference (p < 0.001) between controls and patient group regarding Beck depression score was detected. 16.6% of controls showed a clinical relevant degree of DS, whereas 48.5% of patients showed DS. An analysis of variance revealed a statistically significant difference across all four groups (control group, SCI group, naMCI group, aMCI group) (p < 0.001). Whereas 8% of controls reported a considerable degree of SMC, 34% of the SCI group, 31% of the naMCI group, and 54% of the aMCI group reported considerable SMC. A two-factor analysis of variance with the factors cognitive status (controls, SCI group, naMCI group, aMCI group) and depressive status (depressed vs. not depressed) and SMC as dependent variable revealed that both factors were significant (p < 0.001), whereas the interaction was not (p = 0.820). A large proportion of patients seeking help in a memory outpatient clinic report considerable SMC, with an increasing degree from cognitively healthy elderly to aMCI. Depressive status increases SMC consistently across groups with different cognitive status.

  12. gsSKAT: Rapid gene set analysis and multiple testing correction for rare-variant association studies using weighted linear kernels.

    PubMed

    Larson, Nicholas B; McDonnell, Shannon; Cannon Albright, Lisa; Teerlink, Craig; Stanford, Janet; Ostrander, Elaine A; Isaacs, William B; Xu, Jianfeng; Cooney, Kathleen A; Lange, Ethan; Schleutker, Johanna; Carpten, John D; Powell, Isaac; Bailey-Wilson, Joan E; Cussenot, Olivier; Cancel-Tassin, Geraldine; Giles, Graham G; MacInnis, Robert J; Maier, Christiane; Whittemore, Alice S; Hsieh, Chih-Lin; Wiklund, Fredrik; Catalona, William J; Foulkes, William; Mandal, Diptasri; Eeles, Rosalind; Kote-Jarai, Zsofia; Ackerman, Michael J; Olson, Timothy M; Klein, Christopher J; Thibodeau, Stephen N; Schaid, Daniel J

    2017-05-01

    Next-generation sequencing technologies have afforded unprecedented characterization of low-frequency and rare genetic variation. Due to low power for single-variant testing, aggregative methods are commonly used to combine observed rare variation within a single gene. Causal variation may also aggregate across multiple genes within relevant biomolecular pathways. Kernel-machine regression and adaptive testing methods for aggregative rare-variant association testing have been demonstrated to be powerful approaches for pathway-level analysis, although these methods tend to be computationally intensive at high-variant dimensionality and require access to complete data. An additional analytical issue in scans of large pathway definition sets is multiple testing correction. Gene set definitions may exhibit substantial genic overlap, and the impact of the resultant correlation in test statistics on Type I error rate control for large agnostic gene set scans has not been fully explored. Herein, we first outline a statistical strategy for aggregative rare-variant analysis using component gene-level linear kernel score test summary statistics as well as derive simple estimators of the effective number of tests for family-wise error rate control. We then conduct extensive simulation studies to characterize the behavior of our approach relative to direct application of kernel and adaptive methods under a variety of conditions. We also apply our method to two case-control studies, respectively, evaluating rare variation in hereditary prostate cancer and schizophrenia. Finally, we provide open-source R code for public use to facilitate easy application of our methods to existing rare-variant analysis results. © 2017 WILEY PERIODICALS, INC.
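
    The notion of an effective number of tests for family-wise error rate control can be illustrated with a simple eigenvalue-based estimator (a Nyholt/Cheverud-style formula). This is only an illustration of the idea rather than the estimators derived in the paper, and the correlation matrix of gene-set statistics below is hypothetical.

```python
import numpy as np

def effective_number_of_tests(R):
    """Eigenvalue-based effective number of tests (Nyholt/Cheverud-style)
    from the correlation matrix R of the test statistics."""
    m = R.shape[0]
    eigvals = np.linalg.eigvalsh(R)
    return 1 + (m - 1) * (1 - np.var(eigvals, ddof=1) / m)

# Hypothetical correlation among four overlapping gene-set statistics.
R = np.array([[1.0, 0.7, 0.4, 0.2],
              [0.7, 1.0, 0.5, 0.3],
              [0.4, 0.5, 1.0, 0.6],
              [0.2, 0.3, 0.6, 1.0]])
m_eff = effective_number_of_tests(R)
print(f"effective tests: {m_eff:.2f}; FWER threshold 0.05 / m_eff = {0.05 / m_eff:.4f}")
```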

  13. [Analysis the epidemiological features of 3,258 patients with allergic rhinitis in Yichang City].

    PubMed

    Chen, Bo; Zhang, Zhimao; Pei, Zhi; Chen, Shihan; Du, Zhimei; Lan, Yan; Han, Bei; Qi, Qi

    2015-02-01

    To investigate the epidemiological features of patients with allergic rhinitis (AR) in Yichang city and to propose effective prevention and control measures. Data on allergic rhinitis in the city proper from 2010 to 2013 were collected, entered into a database, and analyzed statistically. In recent years, the number of AR patients in this area has increased year by year. Spring and winter were the peak seasons of onset, and young men made up the largest share of patients. There were statistically significant differences by age, area, and gender (P < 0.01). Allergy history and related diseases differed significantly by gender (P < 0.05), and allergens and the degree of positivity differed significantly by gender and age structure (P < 0.01). Health education and publicity, improvement of the environment, changes in unhealthy habits, timely medical care, and standardized treatment are needed.

  14. Statistical Analysis of Spectral Properties and Prosodic Parameters of Emotional Speech

    NASA Astrophysics Data System (ADS)

    Přibil, J.; Přibilová, A.

    2009-01-01

    The paper addresses the reflection of microintonation and spectral properties in male and female acted emotional speech. The microintonation component of speech melody is analyzed with regard to its spectral and statistical parameters. According to psychological research on emotional speech, different emotions are accompanied by different spectral noise. We control its amount through spectral flatness, according to which high-frequency noise is mixed into voiced frames during cepstral speech synthesis. Our experiments are aimed at statistical analysis of cepstral coefficient values and ranges of spectral flatness in three emotions (joy, sadness, anger), and a neutral state for comparison. Calculated histograms of the spectral flatness distribution are visually compared and modelled by a Gamma probability distribution. Histograms of the cepstral coefficient distribution are evaluated and compared using skewness and kurtosis. The statistical results show good agreement between male and female voices for all emotional states portrayed by several Czech and Slovak professional actors.
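
    Spectral flatness itself, the quantity used here to control the amount of high-frequency noise, is just the ratio of the geometric to the arithmetic mean of the power spectrum. A minimal sketch follows; the frame length, window, and test signals are arbitrary choices for illustration.

```python
import numpy as np

def spectral_flatness(frame):
    """Spectral flatness of one frame: geometric mean / arithmetic mean of the
    power spectrum. Values near 1 indicate noise-like spectra; values near 0
    indicate tonal (voiced) spectra."""
    power = np.abs(np.fft.rfft(frame * np.hanning(len(frame)))) ** 2
    power = power[power > 0]                 # avoid log(0)
    return np.exp(np.mean(np.log(power))) / np.mean(power)

fs = 16000
t = np.arange(1024) / fs
voiced_like = np.sin(2 * np.pi * 150 * t) + 0.3 * np.sin(2 * np.pi * 300 * t)
noise_like = np.random.default_rng(0).standard_normal(1024)
print(spectral_flatness(voiced_like), spectral_flatness(noise_like))
```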

  15. Analysis of vehicle classification and truck weight data of the New England States : is data sharing a good idea?

    DOT National Transportation Integrated Search

    1998-01-01

    This paper is about a statistical research analysis of 1995-96 classification and weigh-in-motion (WIM) data from seventeen continuous traffic-monitoring sites in New England. Data screening is discussed briefly, and a cusum data quality control ...

  16. Opportunities for Applied Behavior Analysis in the Total Quality Movement.

    ERIC Educational Resources Information Center

    Redmon, William K.

    1992-01-01

    This paper identifies critical components of recent organizational quality improvement programs and specifies how applied behavior analysis can contribute to quality technology. Statistical Process Control and Total Quality Management approaches are compared, and behavior analysts are urged to build their research base and market behavior change…

  17. Exploration of the Maximum Entropy/Optimal Projection Approach to Control Design Synthesis for Large Space Structures.

    DTIC Science & Technology

    1985-02-01

    Maximum Entropy Stochastic Modelling and Reduced-Order Design Synthesis is a rigorous new approach to this class of problems. Inspired by Statistical Energy Analysis, a branch of dynamic modal analysis developed for analyzing acoustic vibration problems, its present stage of development embodies a...

  18. Use of Computer Statistical Packages to Generate Quality Control Reports on Training

    DTIC Science & Technology

    1980-01-01

    Quality Control; Statistical Analysis. Obtaining timely and efficient ... Extremely dissatisfied ... Extremely satisfied ... How many men in your unit want to do a good job in training? ... All of them ... Some of them ... permanent disk storage space within the computer account. The user may not wish to run the "Audit" program in the same batch flow as the other three.

  19. Dealing with missing standard deviation and mean values in meta-analysis of continuous outcomes: a systematic review.

    PubMed

    Weir, Christopher J; Butcher, Isabella; Assi, Valentina; Lewis, Stephanie C; Murray, Gordon D; Langhorne, Peter; Brady, Marian C

    2018-03-07

    Rigorous, informative meta-analyses rely on availability of appropriate summary statistics or individual participant data. For continuous outcomes, especially those with naturally skewed distributions, summary information on the mean or variability often goes unreported. While full reporting of original trial data is the ideal, we sought to identify methods for handling unreported mean or variability summary statistics in meta-analysis. We undertook two systematic literature reviews to identify methodological approaches used to deal with missing mean or variability summary statistics. Five electronic databases were searched, in addition to the Cochrane Colloquium abstract books and the Cochrane Statistics Methods Group mailing list archive. We also conducted cited reference searching and emailed topic experts to identify recent methodological developments. Details recorded included the description of the method, the information required to implement the method, any underlying assumptions and whether the method could be readily applied in standard statistical software. We provided a summary description of the methods identified, illustrating selected methods in example meta-analysis scenarios. For missing standard deviations (SDs), following screening of 503 articles, fifteen methods were identified in addition to those reported in a previous review. These included Bayesian hierarchical modelling at the meta-analysis level; summary statistic level imputation based on observed SD values from other trials in the meta-analysis; a practical approximation based on the range; and algebraic estimation of the SD based on other summary statistics. Following screening of 1124 articles for methods estimating the mean, one approximate Bayesian computation approach and three papers based on alternative summary statistics were identified. Illustrative meta-analyses showed that when replacing a missing SD the approximation using the range minimised loss of precision and generally performed better than omitting trials. When estimating missing means, a formula using the median, lower quartile and upper quartile performed best in preserving the precision of the meta-analysis findings, although in some scenarios, omitting trials gave superior results. Methods based on summary statistics (minimum, maximum, lower quartile, upper quartile, median) reported in the literature facilitate more comprehensive inclusion of randomised controlled trials with missing mean or variability summary statistics within meta-analyses.
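
    Two of the simplest substitutions the review covers, a range-based approximation for a missing SD and a quartile-based formula for a missing mean, are one-liners. The specific constants below (range/4, and the average of the quartiles and median) are common textbook choices used here for illustration; they stand in for, rather than reproduce, the methods catalogued and compared in the review.

```python
def sd_from_range(minimum, maximum):
    """Rough SD approximation from the reported range (assumes roughly normal data)."""
    return (maximum - minimum) / 4.0

def mean_from_quartiles(q1, median, q3):
    """Rough mean estimate from the median and quartiles (roughly symmetric data)."""
    return (q1 + median + q3) / 3.0

# hypothetical trial reporting only median/quartiles/range for a continuous outcome
print(sd_from_range(minimum=12.0, maximum=48.0))            # ~9.0
print(mean_from_quartiles(q1=21.0, median=27.0, q3=36.0))   # 28.0
```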

  20. Gene Level Meta-Analysis of Quantitative Traits by Functional Linear Models.

    PubMed

    Fan, Ruzong; Wang, Yifan; Boehnke, Michael; Chen, Wei; Li, Yun; Ren, Haobo; Lobach, Iryna; Xiong, Momiao

    2015-08-01

    Meta-analysis of genetic data must account for differences among studies including study designs, markers genotyped, and covariates. The effects of genetic variants may differ from population to population, i.e., heterogeneity. Thus, meta-analysis of combining data of multiple studies is difficult. Novel statistical methods for meta-analysis are needed. In this article, functional linear models are developed for meta-analyses that connect genetic data to quantitative traits, adjusting for covariates. The models can be used to analyze rare variants, common variants, or a combination of the two. Both likelihood-ratio test (LRT) and F-distributed statistics are introduced to test association between quantitative traits and multiple variants in one genetic region. Extensive simulations are performed to evaluate empirical type I error rates and power performance of the proposed tests. The proposed LRT and F-distributed statistics control the type I error very well and have higher power than the existing methods of the meta-analysis sequence kernel association test (MetaSKAT). We analyze four blood lipid levels in data from a meta-analysis of eight European studies. The proposed methods detect more significant associations than MetaSKAT and the P-values of the proposed LRT and F-distributed statistics are usually much smaller than those of MetaSKAT. The functional linear models and related test statistics can be useful in whole-genome and whole-exome association studies. Copyright © 2015 by the Genetics Society of America.

  1. Immunohistochemical Analysis of the Role Connective Tissue Growth Factor in Drug-induced Gingival Overgrowth in Response to Phenytoin, Cyclosporine, and Nifedipine

    PubMed Central

    Anand, A. J.; Gopalakrishnan, Sivaram; Karthikeyan, R.; Mishra, Debasish; Mohapatra, Shreeyam

    2018-01-01

    Objective: To evaluate the presence of connective tissue growth factor (CTGF) in drug (phenytoin, cyclosporine, and nifedipine)-induced gingival overgrowth (DIGO) and to compare it with healthy controls without overgrowth. Materials and Methods: Thirty-five patients were chosen for the study and segregated into a study group (25) and a control group (10). The study group consisted of phenytoin-induced (10), cyclosporine-induced (10), and nifedipine-induced (5) gingival overgrowth. After completing the necessary medical evaluations, biopsies were taken. The tissue samples were fixed in 10% formalin and then immunohistochemically evaluated for the presence of CTGF. Statistical analysis of the values was done using the statistical package SPSS PC+ (Statistical Package for the Social Sciences, version 4.01). Results: Immunohistochemistry shows that DIGO samples express more CTGF than the control group, with phenytoin-induced overgrowth expressing the most CTGF, followed by nifedipine and cyclosporine. Conclusion: The study shows an increase in the levels of CTGF in patients with DIGO in comparison to the control group without any gingival overgrowth. We compared the levels of CTGF in DIGO induced by the three most commonly used drugs: phenytoin, cyclosporine, and nifedipine. By comparing the levels of CTGF, we find that cyclosporine induces the least CTGF; therefore, it might be a more viable drug choice with reduced side effects. PMID:29629324

  2. Using statistical process control to make data-based clinical decisions.

    PubMed

    Pfadt, A; Wheeler, D J

    1995-01-01

    Applied behavior analysis is based on an investigation of variability due to interrelationships among antecedents, behavior, and consequences. This permits testable hypotheses about the causes of behavior, as well as the course of treatment, to be evaluated empirically. Such information provides corrective feedback for making data-based clinical decisions. This paper considers how a different approach to the analysis of variability, based on the writings of Walter Shewhart and W. Edwards Deming in the area of industrial quality control, helps to achieve similar objectives. Statistical process control (SPC) was developed to implement a process of continual product improvement while achieving compliance with production standards and other requirements for promoting customer satisfaction. SPC involves the use of simple statistical tools, such as histograms and control charts, as well as problem-solving techniques, such as flow charts, cause-and-effect diagrams, and Pareto charts, to implement Deming's management philosophy. These data-analytic procedures can be incorporated into a human service organization to help to achieve its stated objectives in a manner that leads to continuous improvement in the functioning of the clients who are its customers. Examples are provided to illustrate how SPC procedures can be used to analyze behavioral data. Issues related to the application of these tools for making data-based clinical decisions and for creating an organizational climate that promotes their routine use in applied settings are also considered.
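
    The control charts mentioned above involve arithmetic a practitioner can reproduce directly. A minimal sketch of an individuals (XmR) chart is shown below, using the conventional 2.66 moving-range constant and a made-up series of session counts rather than any clinical data from the paper.

```python
import numpy as np

def xmr_limits(values):
    """Control limits for an individuals (XmR) chart from a 1-D data series."""
    x = np.asarray(values, dtype=float)
    mr_bar = np.abs(np.diff(x)).mean()    # average moving range
    center = x.mean()
    ucl = center + 2.66 * mr_bar          # upper control limit
    lcl = center - 2.66 * mr_bar          # lower control limit
    return center, lcl, ucl

# hypothetical per-session counts of a target behavior
counts = [7, 9, 6, 8, 10, 7, 9, 15, 8, 7, 6, 9]
center, lcl, ucl = xmr_limits(counts)
flags = [c for c in counts if c > ucl or c < lcl]   # points signalling special-cause variation
print(f"center = {center:.2f}, limits = ({lcl:.2f}, {ucl:.2f}), out-of-control points: {flags}")
```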

  3. Protein Sectors: Statistical Coupling Analysis versus Conservation

    PubMed Central

    Teşileanu, Tiberiu; Colwell, Lucy J.; Leibler, Stanislas

    2015-01-01

    Statistical coupling analysis (SCA) is a method for analyzing multiple sequence alignments that was used to identify groups of coevolving residues termed “sectors”. The method applies spectral analysis to a matrix obtained by combining correlation information with sequence conservation. It has been asserted that the protein sectors identified by SCA are functionally significant, with different sectors controlling different biochemical properties of the protein. Here we reconsider the available experimental data and note that it involves almost exclusively proteins with a single sector. We show that in this case sequence conservation is the dominating factor in SCA, and can alone be used to make statistically equivalent functional predictions. Therefore, we suggest shifting the experimental focus to proteins for which SCA identifies several sectors. Correlations in protein alignments, which have been shown to be informative in a number of independent studies, would then be less dominated by sequence conservation. PMID:25723535

  4. A probabilistic analysis of electrical equipment vulnerability to carbon fibers

    NASA Technical Reports Server (NTRS)

    Elber, W.

    1980-01-01

    The statistical problems of airborne carbon fibers falling onto electrical circuits were idealized and analyzed. The probability of contact between randomly oriented, finite-length fibers and sets of parallel conductors with various spacings and lengths was developed theoretically. The probability of multiple fibers joining to bridge a single gap between conductors, or forming continuous networks, is included. From these theoretical considerations, practical statistical analyses to assess the likelihood of causing electrical malfunctions were produced. The statistics obtained were confirmed by comparison with results of controlled experiments.

  5. Exercise and Bone Mineral Density in Premenopausal Women: A Meta-Analysis of Randomized Controlled Trials

    PubMed Central

    Kelley, George A.; Kelley, Kristi S.; Kohrt, Wendy M.

    2013-01-01

    Objective. Examine the effects of exercise on femoral neck (FN) and lumbar spine (LS) bone mineral density (BMD) in premenopausal women. Methods. Meta-analysis of randomized controlled exercise trials ≥24 weeks in premenopausal women. Standardized effect sizes (g) were calculated for each result and pooled using random-effects models, Z score alpha values, 95% confidence intervals (CIs), and number needed to treat (NNT). Heterogeneity was examined using Q and I². Moderator and predictor analyses using mixed-effects ANOVA and simple metaregression were conducted. Statistical significance was set at P ≤ 0.05. Results. Statistically significant improvements were found for both FN (7 g's, 466 participants, g = 0.342, 95% CI = 0.132, 0.553, P = 0.001, Q = 10.8, P = 0.22, I² = 25.7%, NNT = 5) and LS (6 g's, 402 participants, g = 0.201, 95% CI = 0.009, 0.394, P = 0.04, Q = 3.3, P = 0.65, I² = 0%, NNT = 9) BMD. A trend for greater benefits in FN BMD was observed for studies published in countries other than the United States and for those who participated in home versus facility-based exercise. Statistically significant, or a trend for statistically significant, associations were observed for 7 different moderators and predictors, 6 for FN BMD and 1 for LS BMD. Conclusions. Exercise benefits FN and LS BMD in premenopausal women. The observed moderators and predictors deserve further investigation in well-designed randomized controlled trials. PMID:23401684

  6. Exercise and bone mineral density in premenopausal women: a meta-analysis of randomized controlled trials.

    PubMed

    Kelley, George A; Kelley, Kristi S; Kohrt, Wendy M

    2013-01-01

    Objective. Examine the effects of exercise on femoral neck (FN) and lumbar spine (LS) bone mineral density (BMD) in premenopausal women. Methods. Meta-analysis of randomized controlled exercise trials ≥24 weeks in premenopausal women. Standardized effect sizes (g) were calculated for each result and pooled using random-effects models, Z score alpha values, 95% confidence intervals (CIs), and number needed to treat (NNT). Heterogeneity was examined using Q and I(2). Moderator and predictor analyses using mixed-effects ANOVA and simple metaregression were conducted. Statistical significance was set at P ≤ 0.05. Results. Statistically significant improvements were found for both FN (7g's, 466 participants, g = 0.342, 95%  CI = 0.132, 0.553, P = 0.001, Q = 10.8, P = 0.22, I(2) = 25.7%, NNT = 5) and LS (6g's, 402 participants, g = 0.201, 95%  CI = 0.009, 0.394, P = 0.04, Q = 3.3, P = 0.65, I(2) = 0%, NNT = 9) BMD. A trend for greater benefits in FN BMD was observed for studies published in countries other than the United States and for those who participated in home versus facility-based exercise. Statistically significant, or a trend for statistically significant, associations were observed for 7 different moderators and predictors, 6 for FN BMD and 1 for LS BMD. Conclusions. Exercise benefits FN and LS BMD in premenopausal women. The observed moderators and predictors deserve further investigation in well-designed randomized controlled trials.

  7. Logistic regression applied to natural hazards: rare event logistic regression with replications

    NASA Astrophysics Data System (ADS)

    Guns, M.; Vanacker, V.

    2012-06-01

    Statistical analysis of natural hazards needs particular attention, as most of these phenomena are rare events. This study shows that ordinary rare event logistic regression, as it is now commonly used in geomorphologic studies, does not always lead to a robust detection of controlling factors, as the results can be strongly sample-dependent. In this paper, we introduce some concepts of Monte Carlo simulation into rare event logistic regression. This technique, called rare event logistic regression with replications, combines the strengths of probabilistic and statistical methods and makes it possible to overcome some of the limitations of previous developments through robust variable selection. The technique was developed here for the analysis of landslide controlling factors, but the concept is widely applicable to statistical analyses of natural hazards.

  8. [How to start a neuroimaging study].

    PubMed

    Narumoto, Jin

    2012-06-01

    In order to help researchers understand how to start a neuroimaging study, several tips are described in this paper. These include 1) Choice of an imaging modality, 2) Statistical method, and 3) Interpretation of the results. 1) There are several imaging modalities available in clinical research. Advantages and disadvantages of each modality are described. 2) Statistical Parametric Mapping, which is the most common statistical software for neuroimaging analysis, is described in terms of parameter setting in normalization and level of significance. 3) In the discussion section, the region which shows a significant difference between patients and normal controls should be discussed in relation to the neurophysiology of the disease, making reference to previous reports from neuroimaging studies in normal controls, lesion studies and animal studies. A typical pattern of discussion is described.

  9. A Numerical Simulation and Statistical Modeling of High Intensity Radiated Fields Experiment Data

    NASA Technical Reports Server (NTRS)

    Smith, Laura J.

    2004-01-01

    Tests are conducted on a quad-redundant fault tolerant flight control computer to establish upset characteristics of an avionics system in an electromagnetic field. A numerical simulation and statistical model are described in this work to analyze the open loop experiment data collected in the reverberation chamber at NASA LaRC as a part of an effort to examine the effects of electromagnetic interference on fly-by-wire aircraft control systems. By comparing thousands of simulation and model outputs, the models that best describe the data are first identified and then a systematic statistical analysis is performed on the data. These combined efforts culminate in an extrapolation of values that is, in turn, used to support earlier efforts at evaluating the data.

  10. Multivariate statistical process control of a continuous pharmaceutical twin-screw granulation and fluid bed drying process.

    PubMed

    Silva, A F; Sarraguça, M C; Fonteyne, M; Vercruysse, J; De Leersnyder, F; Vanhoorne, V; Bostijn, N; Verstraeten, M; Vervaet, C; Remon, J P; De Beer, T; Lopes, J A

    2017-08-07

    A multivariate statistical process control (MSPC) strategy was developed for the monitoring of the ConsiGma™-25 continuous tablet manufacturing line. Thirty-five logged variables encompassing three major units, namely a twin screw high shear granulator, a fluid bed dryer and a product control unit, were used to monitor the process. The MSPC strategy was based on principal component analysis of data acquired under normal operating conditions using a series of four process runs. Runs with imposed disturbances in the dryer air flow and temperature, in the granulator barrel temperature, speed and liquid mass flow and in the powder dosing unit mass flow were utilized to evaluate the model's monitoring performance. The impact of the imposed deviations on process continuity was also evaluated using Hotelling's T² and Q residuals statistics control charts. The influence of the individual process variables was assessed by analyzing contribution plots at specific time points. Results show that the imposed disturbances were all detected in both control charts. Overall, the MSPC strategy was successfully developed and applied. Additionally, deviations not associated with the imposed changes were detected, mainly in the granulator barrel temperature control. Copyright © 2017 Elsevier B.V. All rights reserved.
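
    The monitoring scheme above can be sketched compactly: a PCA model is fit on standardized normal-operating-condition (NOC) data, and new observations are scored with Hotelling's T² in the retained-component plane and with the Q (squared prediction error) statistic on the residuals. The class below is an illustrative sketch, not the study's code; the number of retained components and the variable scaling are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA

class PCAMonitor:
    """PCA-based MSPC: Hotelling's T^2 and Q (SPE) statistics for new data."""

    def __init__(self, n_components=3):
        self.pca = PCA(n_components=n_components)

    def fit(self, X_noc):
        # autoscale using the normal-operating-condition runs only
        self.mean_ = X_noc.mean(axis=0)
        self.std_ = X_noc.std(axis=0, ddof=1)
        self.pca.fit((X_noc - self.mean_) / self.std_)
        return self

    def statistics(self, X_new):
        Z = (X_new - self.mean_) / self.std_
        scores = self.pca.transform(Z)
        t2 = np.sum(scores**2 / self.pca.explained_variance_, axis=1)  # Hotelling's T^2
        residual = Z - self.pca.inverse_transform(scores)
        q = np.sum(residual**2, axis=1)                                # Q / SPE statistic
        return t2, q
```

    In practice both statistics are plotted against control limits derived from the NOC runs, and contribution plots (the per-variable terms of T² or Q at a flagged time point) are used to trace which logged variable drove the alarm.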

  11. Statistical Analysis of Human Body Movement and Group Interactions in Response to Music

    NASA Astrophysics Data System (ADS)

    Desmet, Frank; Leman, Marc; Lesaffre, Micheline; de Bruyn, Leen

    Quantification of time series that relate to physiological data is challenging for empirical music research. Up to now, most studies have focused on time-dependent responses of individual subjects in controlled environments. However, little is known about time-dependent responses of between-subject interactions in an ecological context. This paper provides new findings on the statistical analysis of group synchronicity in response to musical stimuli. Different statistical techniques were applied to time-dependent data obtained from an experiment on embodied listening in individual and group settings. Analyses of inter-group synchronicity are described. Dynamic Time Warping (DTW) and Cross Correlation Function (CCF) were found to be valid methods to estimate group coherence of the resulting movements. It was found that synchronicity of movements between individuals (human-human interactions) increases significantly in the social context. Moreover, Analysis of Variance (ANOVA) revealed that the type of music is the predominant factor in both the individual and the social context.
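
    Of the two techniques named above, the cross correlation function is the simpler to sketch. The helper below computes a normalized CCF between two equally long movement signals, so the peak height indicates the strength of synchrony and its lag indicates who leads; the signal names and the lag window are illustrative assumptions, not details from the study.

```python
import numpy as np

def ccf(x, y, max_lag=50):
    """Normalized cross-correlation of two equal-length movement time series."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    x = (x - x.mean()) / (x.std() * len(x))   # scaling so that lag 0 equals Pearson r
    y = (y - y.mean()) / y.std()
    full = np.correlate(x, y, mode="full")    # correlation at every possible lag
    mid = len(full) // 2                      # index of zero lag
    lags = np.arange(-max_lag, max_lag + 1)
    return lags, full[mid - max_lag: mid + max_lag + 1]

# the peak of the CCF estimates how synchronized two participants are,
# and the lag of that peak estimates which of the two signals leads
```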

  12. Association between ErbB4 single nucleotide polymorphisms and susceptibility to schizophrenia: A meta-analysis of case-control studies.

    PubMed

    Feng, Yanguo; Cheng, Dejun; Zhang, Chaofeng; Li, Yuchun; Zhang, Zhiying; Wang, Juan; Feng, Xiao

    2017-02-01

    Accumulating studies have reported inconsistent association between ErbB4 single nucleotide polymorphisms (SNPs) and predisposition to schizophrenia. To better interpret this issue, here we conducted a meta-analysis using published case-control studies. We conducted a systematic search of MEDLINE (Pubmed), Embase (Ovid), Web of Science (Thomson-Reuters) to identify relevant references. The association between ErbB4 SNPs and schizophrenia was assessed by odds ratios (ORs) and 95% confidence intervals (CIs). Between-study heterogeneity was evaluated with the I² statistic and Cochran's Q test. To appraise the stability of results, we employed sensitivity analysis by omitting one study at a time. To assess the potential publication bias, we conducted trim and fill analysis. Seven studies published in English comprising 3162 cases and 4264 controls were included in this meta-analysis. Meta-analyses showed that rs707284 is statistically significantly associated with schizophrenia susceptibility among Asian and Caucasian populations under the allelic model (OR = 0.91, 95% CI: 0.83-0.99, P = 0.035). Additionally, a marginal association (P < 0.1) was observed between rs707284 and schizophrenia risk among Asian and Caucasian populations under the recessive (OR = 0.85, 95% CI: 0.72-1.01, P = 0.065) and homozygous (OR = 0.84, 95% CI: 0.68-1.03, P = 0.094) models. In the Asian subgroup, rs707284 was also noted to be marginally associated with schizophrenia under the recessive model (OR = 0.84, 95% CI: 0.70-1.00, P = 0.053). However, no statistically significant association was found between rs839523, rs7598440, rs3748962, and rs2371276 and schizophrenia risk. This meta-analysis suggested that rs707284 may be a potential ErbB4 SNP associated with susceptibility to schizophrenia. Nevertheless, due to the limited sample size in this meta-analysis, more large-scale association studies are still needed to confirm the results.
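
    The pooling arithmetic behind such a meta-analysis is compact. The sketch below (illustrative only, not the authors' code) combines per-study 2×2 allele counts into a DerSimonian-Laird random-effects odds ratio, with Cochran's Q and I² as the heterogeneity measures mentioned above; the input format is an assumption.

```python
import numpy as np

def pool_or(tables):
    """tables: list of (case_alt, case_ref, ctrl_alt, ctrl_ref) allele counts."""
    log_or, var = [], []
    for a, b, c, d in tables:
        log_or.append(np.log((a * d) / (b * c)))   # per-study log odds ratio
        var.append(1/a + 1/b + 1/c + 1/d)          # its approximate variance
    log_or, var = np.array(log_or), np.array(var)
    w = 1 / var                                     # fixed-effect weights
    q = np.sum(w * (log_or - np.average(log_or, weights=w))**2)   # Cochran's Q
    df = len(tables) - 1
    tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))  # DL tau^2
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0                 # I^2 (%)
    w_re = 1 / (var + tau2)                         # random-effects weights
    pooled = np.average(log_or, weights=w_re)
    se = np.sqrt(1 / np.sum(w_re))
    return (np.exp(pooled), np.exp(pooled - 1.96*se), np.exp(pooled + 1.96*se), q, i2)
```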

  13. A Simple Method to Control Positive Baseline Trend within Data Nonoverlap

    ERIC Educational Resources Information Center

    Parker, Richard I.; Vannest, Kimberly J.; Davis, John L.

    2014-01-01

    Nonoverlap is widely used as a statistical summary of data; however, these analyses rarely correct unwanted positive baseline trend. This article presents and validates the graph rotation for overlap and trend (GROT) technique, a hand calculation method for controlling positive baseline trend within an analysis of data nonoverlap. GROT is…

  14. Sudden death and cervical spine: A new contribution to pathogenesis for sudden death in critical care unit from subarachnoid hemorrhage; first report – An experimental study

    PubMed Central

    Kazdal, Hizir; Kanat, Ayhan; Aydin, Mehmet Dumlu; Yazar, Ugur; Guvercin, Ali Riza; Calik, Muhammet; Gundogdu, Betul

    2017-01-01

    Context: Sudden death from subarachnoid hemorrhage (SAH) is not uncommon. Aims: The goal of this study is to elucidate the effect of the cervical spinal roots and the related dorsal root ganglions (DRGs) on cardiorespiratory arrest following SAH. Settings and Design: This was an experimental study conducted on rabbits. Materials and Methods: This study was conducted on 22 rabbits which were randomly divided into three groups: control (n = 5), physiologic serum saline (SS; n = 6), and SAH (n = 11). Experimental SAH was performed. Seven of 11 rabbits with SAH died within the first 2 weeks. After 20 days, the remaining animals were sacrificed. The anterior spinal arteries, arteriae nervorum of cervical nerve roots (C6–C8), DRGs, and lungs were examined histopathologically and estimated stereologically. Statistical Analysis Used: Statistical analysis was performed using PASW Statistics 18.0 for Windows (SPSS Inc., Chicago, Illinois, USA). Intergroup differences were assessed using a one-way ANOVA. Statistical significance was set at P < 0.05. Results: In the SAH group, histopathologically, severe anterior spinal artery (ASA) and arteriae nervorum vasospasm, axonal and neuronal degeneration, and neuronal apoptosis were observed. Vasospasm of ASA did not occur in the SS and control groups. There was a statistically significant increase in the degenerated neuron density in the SAH group as compared to the control and SS groups (P < 0.05). Cardiorespiratory disturbances, arrest, and lung edema more commonly developed in animals in the SAH group. Conclusion: Interestingly, we noted that C6–C8 DRG degeneration was secondary to vasospasm of the ASA following SAH. Cardiorespiratory disturbances or arrest can be explained by this mechanism. PMID:28250634

  15. White Matter Integrity Deficit Associated with Betel Quid Dependence.

    PubMed

    Yuan, Fulai; Zhu, Xueling; Kong, Lingyu; Shen, Huaizhen; Liao, Weihua; Jiang, Canhua

    2017-01-01

    Betel quid (BQ) is a commonly consumed psychoactive substance, which has been regarded as a human carcinogen. Long-term BQ chewing may cause Diagnostic and Statistical Manual of Mental Disorders-IV dependence symptoms, which can lead to decreased cognitive functions, such as attention and inhibition control. Although betel quid dependence (BQD) individuals have been reported with altered brain structure and function, there is little evidence showing white matter microstructure alteration in BQD individuals. The present study aimed to investigate altered white matter microstructure in BQD individuals using diffusion tensor imaging. Tract-based spatial statistics was used to analyze the data. Compared with healthy controls, BQD individuals exhibited higher mean diffusivity (MD) in anterior thalamic radiation (ATR). Further analysis revealed that the ATR in BQD individuals showed less fractional anisotropy (FA) than that in healthy controls. Correlation analysis showed that both the increase of MD and reduction of FA in BQD individuals were associated with severity of BQ dependence. These results suggested that BQD would disrupt the balance between prefrontal cortex and subcortical areas, causing a decline in inhibition control.

  16. Application of spatial technology in malaria research & control: some new insights.

    PubMed

    Saxena, Rekha; Nagpal, B N; Srivastava, Aruna; Gupta, S K; Dash, A P

    2009-08-01

    Geographical Information System (GIS) has emerged as the core of the spatial technology which integrates wide range of dataset available from different sources including Remote Sensing (RS) and Global Positioning System (GPS). Literature published during the decade (1998-2007) has been compiled and grouped into six categories according to the usage of the technology in malaria epidemiology. Different GIS modules like spatial data sources, mapping and geo-processing tools, distance calculation, digital elevation model (DEM), buffer zone and geo-statistical analysis have been investigated in detail, illustrated with examples of the derived results. These GIS tools have contributed immensely to understanding the epidemiological processes of malaria, and the examples drawn show that GIS is now widely used for research and decision making in malaria control. Statistical data analysis currently is the most consistent and established set of tools to analyze spatial datasets. The desired future development of GIS is in line with the utilization of geo-statistical tools which, combined with high quality data, have the capability to provide new insight into malaria epidemiology and the complexity of its transmission potential in endemic areas.

  17. NASA Marshall Space Flight Center Controls Systems Design and Analysis Branch

    NASA Technical Reports Server (NTRS)

    Gilligan, Eric

    2014-01-01

    Marshall Space Flight Center maintains a critical national capability in the analysis of launch vehicle flight dynamics and flight certification of GN&C algorithms. MSFC analysts are domain experts in the areas of flexible-body dynamics and control-structure interaction, thrust vector control, sloshing propellant dynamics, and advanced statistical methods. Marshall's modeling and simulation expertise has supported manned spaceflight for over 50 years. Marshall's unparalleled capability in launch vehicle guidance, navigation, and control technology stems from its rich heritage in developing, integrating, and testing launch vehicle GN&C systems dating to the early Mercury-Redstone and Saturn vehicles. The Marshall team is continuously developing novel methods for design, including advanced techniques for large-scale optimization and analysis.

  18. Statistical evidence of strain induced breaking of metallic point contacts

    NASA Astrophysics Data System (ADS)

    Alwan, Monzer; Candoni, Nadine; Dumas, Philippe; Klein, Hubert R.

    2013-06-01

    A scanning tunneling microscope in break-junction regime and a mechanically controllable break junction are used to acquire thousands of conductance-elongation curves by stretching Au junctions until breaking and re-connecting them. From a robust statistical analysis performed on large sets of experiments, parameters such as lifetime, elongation and occurrence probabilities are extracted. The analysis of results obtained for different stretching speeds of the electrodes indicates that the breaking mechanism of di- and mono-atomic junctions is identical, and that the junctions undergo atomic rearrangement during their stretching and at the moment of breaking.

  19. Statistical analysis plan of the head position in acute ischemic stroke trial pilot (HEADPOST pilot).

    PubMed

    Olavarría, Verónica V; Arima, Hisatomi; Anderson, Craig S; Brunser, Alejandro; Muñoz-Venturelli, Paula; Billot, Laurent; Lavados, Pablo M

    2017-02-01

    Background The HEADPOST Pilot is a proof-of-concept, open, prospective, multicenter, international, cluster randomized, phase IIb controlled trial, with masked outcome assessment. The trial will test if lying flat head position initiated in patients within 12 h of onset of acute ischemic stroke involving the anterior circulation increases cerebral blood flow in the middle cerebral arteries, as measured by transcranial Doppler. The study will also assess the safety and feasibility of patients lying flat for ≥24 h. The trial was conducted in centers in three countries, with ability to perform early transcranial Doppler. A feature of this trial was that patients were randomized to a certain position according to the month of admission to hospital. Objective To outline in detail the predetermined statistical analysis plan for HEADPOST Pilot study. Methods All data collected by participating researchers will be reviewed and formally assessed. Information pertaining to the baseline characteristics of patients, their process of care, and the delivery of treatments will be classified, and for each item, appropriate descriptive statistical analyses are planned with comparisons made between randomized groups. For the outcomes, statistical comparisons to be made between groups are planned and described. Results This statistical analysis plan was developed for the analysis of the results of the HEADPOST Pilot study to be transparent, available, verifiable, and predetermined before data lock. Conclusions We have developed a statistical analysis plan for the HEADPOST Pilot study which is to be followed to avoid analysis bias arising from prior knowledge of the study findings. Trial registration The study is registered under HEADPOST-Pilot, ClinicalTrials.gov Identifier NCT01706094.

  20. Meta-analysis of correlated traits via summary statistics from GWASs with an application in hypertension.

    PubMed

    Zhu, Xiaofeng; Feng, Tao; Tayo, Bamidele O; Liang, Jingjing; Young, J Hunter; Franceschini, Nora; Smith, Jennifer A; Yanek, Lisa R; Sun, Yan V; Edwards, Todd L; Chen, Wei; Nalls, Mike; Fox, Ervin; Sale, Michele; Bottinger, Erwin; Rotimi, Charles; Liu, Yongmei; McKnight, Barbara; Liu, Kiang; Arnett, Donna K; Chakravati, Aravinda; Cooper, Richard S; Redline, Susan

    2015-01-08

    Genome-wide association studies (GWASs) have identified many genetic variants underlying complex traits. Many detected genetic loci harbor variants that associate with multiple-even distinct-traits. Most current analysis approaches focus on single traits, even though the final results from multiple traits are evaluated together. Such approaches miss the opportunity to systemically integrate the phenome-wide data available for genetic association analysis. In this study, we propose a general approach that can integrate association evidence from summary statistics of multiple traits, either correlated, independent, continuous, or binary traits, which might come from the same or different studies. We allow for trait heterogeneity effects. Population structure and cryptic relatedness can also be controlled. Our simulations suggest that the proposed method has improved statistical power over single-trait analysis in most of the cases we studied. We applied our method to the Continental Origins and Genetic Epidemiology Network (COGENT) African ancestry samples for three blood pressure traits and identified four loci (CHIC2, HOXA-EVX1, IGFBP1/IGFBP3, and CDH17; p < 5.0 × 10(-8)) associated with hypertension-related traits that were missed by a single-trait analysis in the original report. Six additional loci with suggestive association evidence (p < 5.0 × 10(-7)) were also observed, including CACNA1D and WNT3. Our study strongly suggests that analyzing multiple phenotypes can improve statistical power and that such analysis can be executed with the summary statistics from GWASs. Our method also provides a way to study a cross phenotype (CP) association by using summary statistics from GWASs of multiple phenotypes. Copyright © 2015 The American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.
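
    The general idea of combining per-trait summary statistics can be illustrated with a homogeneous-effect test on the vector of Z-scores for a single SNP, using the correlation matrix of the Z statistics (estimable from genome-wide null SNPs) to account for trait correlation and sample overlap. The sketch below shows that generic combination; it is not necessarily the exact statistic of the cited paper, and the variable names are assumptions.

```python
import numpy as np
from scipy.stats import chi2

def combined_test(z, R):
    """z: per-trait Z-scores for one SNP; R: correlation matrix of those Z-scores."""
    z = np.asarray(z, dtype=float)
    R = np.asarray(R, dtype=float)
    ones = np.ones_like(z)
    Rinv = np.linalg.inv(R)
    # under the null, z ~ N(0, R), so this quadratic form is chi-square with 1 df
    stat = (ones @ Rinv @ z) ** 2 / (ones @ Rinv @ ones)
    return stat, chi2.sf(stat, df=1)

# example: combined_test(z=[2.1, 1.8, 2.4], R=[[1, .4, .3], [.4, 1, .5], [.3, .5, 1]])
```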

  1. Design and control strategies for CELSS - Integrating mechanistic paradigms and biological complexities

    NASA Technical Reports Server (NTRS)

    Moore, B., III; Kaufmann, R.; Reinhold, C.

    1981-01-01

    Systems analysis and control theory considerations are given to simulations of both individual components and total systems, in order to develop a reliable control strategy for a Controlled Ecological Life Support System (CELSS) which includes complex biological components. Because of the numerous nonlinearities and tight coupling within the biological component, classical control theory may be inadequate and the statistical analysis of factorial experiments more useful. The range in control characteristics of particular species may simplify the overall task by providing an appropriate balance of stability and controllability to match species function in the overall design. The ultimate goal of this research is the coordination of biological and mechanical subsystems in order to achieve a self-supporting environment.

  2. Development of polytoxicomania in function of defence from psychoticism.

    PubMed

    Nenadović, Milutin M; Sapić, Rosa

    2011-01-01

    The proportion of polytoxicomania in youth subpopulations has been growing steadily in recent decades, and this trend is pan-continental. Psychoticism is a psychological construct that assumes special basic dimensions of personality disintegration and cognitive functions. Psychoticism may, in general, be the basis of pathological functioning of youth and influence the patterns of thought, feelings and actions that cause dysfunction. The aim of this study was to determine the distribution of basic dimensions of psychoticism for commitment of youth to abuse psychoactive substances (PAS) in order to reduce disturbing intrapsychic experiences or manifestation of psychotic symptoms. For the purpose of this study, two groups of respondents were formed, balanced by age, gender and family structure of origin (at least one parent alive). The study applied a DELTA-9 instrument for assessment of cognitive disintegration in function of establishing psychoticism and its operationalization. The obtained results were statistically analyzed. From the parameters of descriptive statistics, the arithmetic mean was calculated with measures of dispersion. Cross-tabulation of the tested variables was performed, and statistical significance was assessed with Pearson's chi2 test and analysis of variance. Age structure and gender were approximately equally represented in the polytoxicomaniac and control groups; testing did not confirm a statistically significant difference (p > 0.5). Polytoxicomaniacs differed significantly from the control group of respondents on most variables of psychoticism; testing confirmed highly statistically significant differences in the psychoticism variables (p < 0.001 to p < 0.01). A statistically significant representation of the dimension of psychoticism in the polytoxicomaniac group was established. The presence of factors concerning common executive dysfunction was emphasized.

  3. Descriptive and inferential statistical methods used in burns research.

    PubMed

    Al-Benna, Sammy; Al-Ajam, Yazan; Way, Benjamin; Steinstraesser, Lars

    2010-05-01

    Burns research articles utilise a variety of descriptive and inferential methods to present and analyse data. The aim of this study was to determine the descriptive methods (e.g. mean, median, SD, range, etc.) and survey the use of inferential methods (statistical tests) used in articles in the journal Burns. This study defined its population as all original articles published in the journal Burns in 2007. Letters to the editor, brief reports, reviews, and case reports were excluded. Study characteristics, use of descriptive statistics and the number and types of statistical methods employed were evaluated. Of the 51 articles analysed, 11(22%) were randomised controlled trials, 18(35%) were cohort studies, 11(22%) were case control studies and 11(22%) were case series. The study design and objectives were defined in all articles. All articles made use of continuous and descriptive data. Inferential statistics were used in 49(96%) articles. Data dispersion was calculated by standard deviation in 30(59%). Standard error of the mean was quoted in 19(37%). The statistical software product was named in 33(65%). Of the 49 articles that used inferential statistics, the tests were named in 47(96%). The 6 most common tests used (Student's t-test (53%), analysis of variance/co-variance (33%), chi(2) test (27%), Wilcoxon & Mann-Whitney tests (22%), Fisher's exact test (12%)) accounted for the majority (72%) of statistical methods employed. A specified significance level was named in 43(88%) and the exact significance levels were reported in 28(57%). Descriptive analysis and basic statistical techniques account for most of the statistical tests reported. This information should prove useful in deciding which tests should be emphasised in educating burn care professionals. These results highlight the need for burn care professionals to have a sound understanding of basic statistics, which is crucial in interpreting and reporting data. Advice should be sought from professionals in the fields of biostatistics and epidemiology when using more advanced statistical techniques. Copyright 2009 Elsevier Ltd and ISBI. All rights reserved.
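
    For reference, the six tests that dominate the surveyed articles are all single calls in SciPy. The snippet below runs them on invented example data; the arrays and the 2×2 table are illustrative, not values from the study.

```python
import numpy as np
from scipy import stats

a = np.array([36.2, 41.0, 39.5, 44.1, 38.7])   # invented group A measurements
b = np.array([30.1, 33.4, 35.0, 29.8, 31.2])   # invented group B measurements
c = np.array([28.0, 27.5, 30.2, 26.9, 29.1])   # invented group C measurements
table = np.array([[12, 8], [5, 15]])            # invented 2x2 outcome-by-group counts

print(stats.ttest_ind(a, b))                    # Student's t-test
print(stats.f_oneway(a, b, c))                  # one-way analysis of variance
chi2_stat, p, dof, expected = stats.chi2_contingency(table)
print(chi2_stat, p)                             # chi-square test of independence
print(stats.mannwhitneyu(a, b))                 # Mann-Whitney U test
print(stats.wilcoxon(a - b))                    # Wilcoxon signed-rank test (paired data)
print(stats.fisher_exact(table))                # Fisher's exact test
```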

  4. A robust and efficient statistical method for genetic association studies using case and control samples from multiple cohorts

    PubMed Central

    2013-01-01

    Background The theoretical basis of genome-wide association studies (GWAS) is statistical inference of linkage disequilibrium (LD) between any polymorphic marker and a putative disease locus. Most methods widely implemented for such analyses are vulnerable to several key demographic factors, deliver poor statistical power for detecting genuine associations, and also yield a high false positive rate. Here, we present a likelihood-based statistical approach that accounts properly for the non-random nature of case–control samples with regard to the genotypic distribution at the loci in the populations under study, and that confers flexibility to test for genetic association in the presence of different confounding factors such as population structure and non-randomness of samples. Results We implemented this novel method, together with several methods popular in the GWAS literature, to re-analyze recently published Parkinson’s disease (PD) case–control samples. The real data analysis and computer simulation show that the new method confers not only significantly improved statistical power for detecting the associations but also robustness to the difficulties stemming from non-random sampling and genetic structure when compared to its rivals. In particular, the new method detected 44 significant SNPs within 25 chromosomal regions of size < 1 Mb, but only 6 SNPs in two of these regions were previously detected by the trend test based methods. It discovered two SNPs located 1.18 Mb and 0.18 Mb from the PD candidates, FGF20 and PARK8, without incurring a false positive risk. Conclusions We developed a novel likelihood-based method which provides adequate estimation of LD and other population model parameters from case and control samples, allows easy integration of samples from multiple genetically divergent populations, and thus confers statistically robust and powerful GWAS analyses. On the basis of simulation studies and analysis of real datasets, we demonstrated significant improvement of the new method over the non-parametric trend test, which is the most widely implemented in the GWAS literature. PMID:23394771
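
    For context, the "trend test based methods" used as the comparator above are typified by the Cochran-Armitage trend test on case/control genotype counts (scores 0/1/2 copies of the minor allele). The sketch below implements that standard comparator, not the authors' likelihood method; the example counts are invented.

```python
import numpy as np
from scipy.stats import norm

def cochran_armitage(cases, controls, scores=(0, 1, 2)):
    """Cochran-Armitage trend test for a 2 x k case/control genotype table."""
    cases = np.asarray(cases, float)
    controls = np.asarray(controls, float)
    t = np.asarray(scores, float)
    R, S = cases.sum(), controls.sum()          # total cases, total controls
    n = cases + controls                        # per-genotype column totals
    N = R + S
    T = np.sum(t * (cases * S - controls * R))
    var = R * S / N * (np.sum(t**2 * n * (N - n))
                       - 2 * sum(t[i] * t[j] * n[i] * n[j]
                                 for i in range(len(t)) for j in range(i + 1, len(t))))
    z = T / np.sqrt(var)
    return z, 2 * norm.sf(abs(z))               # two-sided p-value

# example (invented counts): cochran_armitage(cases=[300, 150, 50], controls=[350, 130, 20])
```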

  5. The platelet activating factor acetyl hydrolase, oxidized low-density lipoprotein, paraoxonase 1 and arylesterase levels in treated and untreated patients with polycystic ovary syndrome.

    PubMed

    Carlioglu, Ayse; Kaygusuz, Ikbal; Karakurt, Feridun; Gumus, Ilknur Inegol; Uysal, Aysel; Kasapoglu, Benan; Armutcu, Ferah; Uysal, Sema; Keskin, Esra Aktepe; Koca, Cemile

    2014-11-01

    To evaluate the platelet activating factor acetyl hydrolase (PAF-AH), oxidized low-density lipoprotein (ox-LDL), paraoxonase 1 (PON1), arylesterase (ARE) levels and the effects of metformin and Diane-35 (ethinyl oestradiol + cyproterone acetate) therapies on these parameters and to determine the PON1 polymorphisms among PCOS patients. Ninety patients with PCOS, age 30, and body mass index-matched healthy controls were included in the study. Patients were divided into three groups: metformin treatment, Diane-35 treatment and no medication groups. The treatment with metformin or Diane-35 was continued for 6 months and all subjects were evaluated with clinical and biochemical parameters 6 months later. One-way ANOVA, t tests and the non-parametric Mann-Whitney U test were used for statistical analysis. PAF-AH and ox-LDL levels were statistically significantly higher in untreated PCOS patients than controls, and they were statistically significantly lower in patients treated with metformin or Diane-35 than untreated PCOS patients. In contrast, there were lower PON1 (not statistically significant) and ARE (statistically significant) levels in untreated PCOS patients than in the control group, and they increased significantly after metformin and Diane-35 treatment. In PCOS patients, serum PON1 levels for QQ, QR and RR phenotypes were statistically significantly lower than in the control group. In patients with PCOS, proatherogenic markers increase. The treatment of PCOS with metformin or Diane-35 had positive effects on the lipid profile, increased the PON1 level, which protects against atherosclerosis, and decreased the proatherogenic PAF-AH and ox-LDL levels.

  6. Statistical analysis of cascading failures in power grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chertkov, Michael; Pfitzner, Rene; Turitsyn, Konstantin

    2010-12-01

    We introduce a new microscopic model of cascading failures in transmission power grids. This model accounts for automatic response of the grid to load fluctuations that take place on the scale of minutes, when optimum power flow adjustments and load shedding controls are unavailable. We describe extreme events, caused by load fluctuations, which cause cascading failures of loads, generators and lines. Our model is quasi-static in the causal, discrete time and sequential resolution of individual failures. The model, in its simplest realization based on the direct current (DC) description of the power flow problem, is tested on three standard IEEE systems consisting of 30, 39 and 118 buses. Our statistical analysis suggests a straightforward classification of cascading and islanding phases in terms of the ratios between average number of removed loads, generators and links. The analysis also demonstrates sensitivity to variations in line capacities. Future research challenges in modeling and control of cascading outages over real-world power networks are discussed.

  7. Individualism: a valid and important dimension of cultural differences between nations.

    PubMed

    Schimmack, Ulrich; Oishi, Shigehiro; Diener, Ed

    2005-01-01

    Oyserman, Coon, and Kemmelmeier's (2002) meta-analysis suggested problems in the measurement of individualism and collectivism. Studies using Hofstede's individualism scores show little convergent validity with more recent measures of individualism and collectivism. We propose that the lack of convergent validity is due to national differences in response styles. Whereas Hofstede statistically controlled for response styles, Oyserman et al.'s meta-analysis relied on uncorrected ratings. Data from an international student survey demonstrated convergent validity between Hofstede's individualism dimension and horizontal individualism when response styles were statistically controlled, whereas uncorrected scores correlated highly with the individualism scores in Oyserman et al.'s meta-analysis. Uncorrected horizontal individualism scores and meta-analytic individualism scores did not correlate significantly with nations' development, whereas corrected horizontal individualism scores and Hofstede's individualism dimension were significantly correlated with development. This pattern of results suggests that individualism is a valid construct for cross-cultural comparisons, but that the measurement of this construct needs improvement.

  8. Using statistical process control for monitoring the prevalence of hospital-acquired pressure ulcers.

    PubMed

    Kottner, Jan; Halfens, Ruud

    2010-05-01

    Institutionally acquired pressure ulcers are used as outcome indicators to assess the quality of pressure ulcer prevention programs. Determining whether quality improvement projects that aim to decrease the proportions of institutionally acquired pressure ulcers lead to real changes in clinical practice depends on the measurement method and statistical analysis used. To examine whether nosocomial pressure ulcer prevalence rates in hospitals in the Netherlands changed, a secondary data analysis using different statistical approaches was conducted of annual (1998-2008) nationwide nursing-sensitive health problem prevalence studies in the Netherlands. Institutions that participated regularly in all survey years were identified. Risk-adjusted nosocomial pressure ulcers prevalence rates, grade 2 to 4 (European Pressure Ulcer Advisory Panel system) were calculated per year and hospital. Descriptive statistics, chi-square trend tests, and P charts based on statistical process control (SPC) were applied and compared. Six of the 905 healthcare institutions participated in every survey year and 11,444 patients in these six hospitals were identified as being at risk for pressure ulcers. Prevalence rates per year ranged from 0.05 to 0.22. Chi-square trend tests revealed statistically significant downward trends in four hospitals but based on SPC methods, prevalence rates of five hospitals varied by chance only. Results of chi-square trend tests and SPC methods were not comparable, making it impossible to decide which approach is more appropriate. P charts provide more valuable information than single P values and are more helpful for monitoring institutional performance. Empirical evidence about the decrease of nosocomial pressure ulcer prevalence rates in the Netherlands is contradictory and limited.
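
    The p-chart logic referred to above is easy to sketch: each survey year's pressure-ulcer proportion is compared with 3-sigma binomial control limits around the pooled proportion, so that only points outside the limits are flagged as special-cause variation rather than chance. The counts in the example call are invented, not data from the Dutch surveys.

```python
import numpy as np

def p_chart(events, at_risk):
    """Per-period proportions with 3-sigma p-chart control limits."""
    events = np.asarray(events, float)
    at_risk = np.asarray(at_risk, float)
    p = events / at_risk                        # per-year prevalence proportion
    p_bar = events.sum() / at_risk.sum()        # centre line: pooled proportion
    sigma = np.sqrt(p_bar * (1 - p_bar) / at_risk)
    ucl = p_bar + 3 * sigma
    lcl = np.clip(p_bar - 3 * sigma, 0, None)
    out_of_control = (p > ucl) | (p < lcl)      # special-cause signals
    return p, p_bar, lcl, ucl, out_of_control

# example (invented counts): p_chart(events=[31, 27, 40, 22], at_risk=[250, 240, 260, 230])
```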

  9. Methods for computational disease surveillance in infection prevention and control: Statistical process control versus Twitter's anomaly and breakout detection algorithms.

    PubMed

    Wiemken, Timothy L; Furmanek, Stephen P; Mattingly, William A; Wright, Marc-Oliver; Persaud, Annuradha K; Guinn, Brian E; Carrico, Ruth M; Arnold, Forest W; Ramirez, Julio A

    2018-02-01

    Although not all health care-associated infections (HAIs) are preventable, reducing HAIs through targeted intervention is key to a successful infection prevention program. To identify areas in need of targeted intervention, robust statistical methods must be used when analyzing surveillance data. The objective of this study was to compare and contrast statistical process control (SPC) charts with Twitter's anomaly and breakout detection algorithms. SPC and anomaly/breakout detection (ABD) charts were created for vancomycin-resistant Enterococcus, Acinetobacter baumannii, catheter-associated urinary tract infection, and central line-associated bloodstream infection data. Both SPC and ABD charts detected similar data points as anomalous/out of control on most charts. The vancomycin-resistant Enterococcus ABD chart detected an extra anomalous point that appeared to be higher than the same time period in prior years. Using a small subset of the central line-associated bloodstream infection data, the ABD chart was able to detect anomalies where the SPC chart was not. SPC charts and ABD charts both performed well, although ABD charts appeared to work better in the context of seasonal variation and autocorrelation. Because they account for common statistical issues in HAI data, ABD charts may be useful for practitioners for analysis of HAI surveillance data. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.

  10. A Novel Genome-Information Content-Based Statistic for Genome-Wide Association Analysis Designed for Next-Generation Sequencing Data

    PubMed Central

    Luo, Li; Zhu, Yun

    2012-01-01

    The genome-wide association studies (GWAS) designed for next-generation sequencing data involve testing association of genomic variants, including common, low frequency, and rare variants. The current strategies for association studies are well developed for identifying association of common variants with the common diseases, but may be ill-suited when large amounts of allelic heterogeneity are present in sequence data. Recently, group tests that analyze the collective frequency differences between cases and controls shift the current variant-by-variant analysis paradigm for GWAS of common variants to the collective test of multiple variants in the association analysis of rare variants. However, group tests ignore differences in genetic effects among SNPs at different genomic locations. As an alternative to group tests, we developed a novel genome-information content-based statistic for testing association of the entire allele frequency spectrum of genomic variation with the diseases. To evaluate the performance of the proposed statistic, we use large-scale simulations based on whole genome low coverage pilot data in the 1000 Genomes Project to calculate the type 1 error rates and power of seven alternative statistics: a genome-information content-based statistic, the generalized T², collapsing method, multivariate and collapsing (CMC) method, individual χ2 test, weighted-sum statistic, and variable threshold statistic. Finally, we apply the seven statistics to published resequencing dataset from ANGPTL3, ANGPTL4, ANGPTL5, and ANGPTL6 genes in the Dallas Heart Study. We report that the genome-information content-based statistic has significantly improved type 1 error rates and higher power than the other six statistics in both simulated and empirical datasets. PMID:22651812

  11. A novel genome-information content-based statistic for genome-wide association analysis designed for next-generation sequencing data.

    PubMed

    Luo, Li; Zhu, Yun; Xiong, Momiao

    2012-06-01

    The genome-wide association studies (GWAS) designed for next-generation sequencing data involve testing association of genomic variants, including common, low frequency, and rare variants. The current strategies for association studies are well developed for identifying association of common variants with the common diseases, but may be ill-suited when large amounts of allelic heterogeneity are present in sequence data. Recently, group tests that analyze their collective frequency differences between cases and controls shift the current variant-by-variant analysis paradigm for GWAS of common variants to the collective test of multiple variants in the association analysis of rare variants. However, group tests ignore differences in genetic effects among SNPs at different genomic locations. As an alternative to group tests, we developed a novel genome-information content-based statistics for testing association of the entire allele frequency spectrum of genomic variation with the diseases. To evaluate the performance of the proposed statistics, we use large-scale simulations based on whole genome low coverage pilot data in the 1000 Genomes Project to calculate the type 1 error rates and power of seven alternative statistics: a genome-information content-based statistic, the generalized T(2), collapsing method, multivariate and collapsing (CMC) method, individual χ(2) test, weighted-sum statistic, and variable threshold statistic. Finally, we apply the seven statistics to published resequencing dataset from ANGPTL3, ANGPTL4, ANGPTL5, and ANGPTL6 genes in the Dallas Heart Study. We report that the genome-information content-based statistic has significantly improved type 1 error rates and higher power than the other six statistics in both simulated and empirical datasets.

  12. Correlation of histogram analysis of apparent diffusion coefficient with uterine cervical pathologic finding.

    PubMed

    Lin, Yuning; Li, Hui; Chen, Ziqian; Ni, Ping; Zhong, Qun; Huang, Huijuan; Sandrasegaran, Kumar

    2015-05-01

    The purpose of this study was to investigate the application of histogram analysis of apparent diffusion coefficient (ADC) in characterizing pathologic features of cervical cancer and benign cervical lesions. This prospective study was approved by the institutional review board, and written informed consent was obtained. Seventy-three patients with cervical cancer (33-69 years old; 35 patients with International Federation of Gynecology and Obstetrics stage IB cervical cancer) and 38 patients (38-61 years old) with normal cervix or cervical benign lesions (control group) were enrolled. All patients underwent 3-T diffusion-weighted imaging (DWI) with b values of 0 and 800 s/mm(2). ADC values of the entire tumor in the patient group and the whole cervix volume in the control group were assessed. Mean ADC, median ADC, 25th and 75th percentiles of ADC, skewness, and kurtosis were calculated. Histogram parameters were compared between different pathologic features, as well as between stage IB cervical cancer and control groups. Mean ADC, median ADC, and 25th percentile of ADC were significantly higher for adenocarcinoma (p = 0.021, 0.006, and 0.004, respectively), and skewness was significantly higher for squamous cell carcinoma (p = 0.011). Median ADC was statistically significantly higher for well or moderately differentiated tumors (p = 0.044), and skewness was statistically significantly higher for poorly differentiated tumors (p = 0.004). No statistically significant difference of ADC histogram was observed between lymphovascular space invasion subgroups. All histogram parameters differed significantly between stage IB cervical cancer and control groups (p < 0.05). Distribution of ADCs characterized by histogram analysis may help to distinguish early-stage cervical cancer from normal cervix or cervical benign lesions and may be useful for evaluating the different pathologic features of cervical cancer.
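
    The whole-lesion histogram parameters named above are straightforward to compute once the ADC values inside a tumour (or cervix) mask have been collected; the helper below is an illustrative sketch, with the array names being assumptions rather than the study's processing pipeline.

```python
import numpy as np
from scipy import stats

def adc_histogram_features(adc_map, mask):
    """Histogram parameters of ADC values inside a binary region-of-interest mask."""
    vals = adc_map[mask > 0]                    # ADC values of all voxels in the volume
    return {
        "mean": vals.mean(),
        "median": np.median(vals),
        "p25": np.percentile(vals, 25),         # 25th percentile of ADC
        "p75": np.percentile(vals, 75),         # 75th percentile of ADC
        "skewness": stats.skew(vals),
        "kurtosis": stats.kurtosis(vals),       # excess kurtosis by default
    }
```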

  13. Quality control analysis : part I : asphaltic concrete.

    DOT National Transportation Integrated Search

    1964-11-01

    This report deals with the statistical evaluation of results from several hot mix plants to determine the pattern of variability with respect to bituminous hot mix characteristics. : Individual tests results when subjected to frequency distribution i...

  14. Ridge preservation using a composite bone graft and a bioabsorbable membrane with and without primary wound closure: a comparative clinical trial.

    PubMed

    Engler-Hamm, Daniel; Cheung, Wai S; Yen, Alec; Stark, Paul C; Griffin, Terrence

    2011-03-01

    The aim of this single-masked, randomized controlled clinical trial is to compare hard and soft tissue changes after ridge preservation performed with (control, RPc) and without (test, RPe) primary soft tissue closure in a split-mouth design. Eleven patients completed this 6-month trial. Extraction and ridge preservation were performed using a composite bone graft of inorganic bovine-derived hydroxyapatite matrix and cell binding peptide P-15 (ABM/P-15), demineralized freeze-dried bone allograft, and a copolymer bioabsorbable membrane. Primary wound closure was achieved on the control sites (RPc), whereas test sites (RPe) left the membrane exposed. Pocket probing depth on adjacent teeth, repositioning of the mucogingival junction, bone width, bone fill, and postoperative discomfort were assessed. Bone cores were obtained for histological examination. Intragroup analyses for both groups demonstrated statistically significant mean reductions in probing depth (RPc: 0.42 mm, P = 0.012; RPe: 0.25 mm, P = 0.012) and bone width (RPc: 3 mm, P = 0.002; RPe: 3.42 mm, P <0.001). However, intergroup analysis did not find these parameters to be statistically different at 6 months. The test group showed statistically significant mean change in bone fill (7.21 mm; P <0.001). Compared to the control group, the test group showed statistically significant lower mean postoperative discomfort (RPc 4 versus RPe 2; P = 0.002). Histomorphometric analysis showed presence of 0% to 40% of ABM/P-15 and 5% to 20% of new bone formation in both groups. Comparison of clinical variables between the two groups at 6 months revealed that the mucogingival junction was statistically significantly more coronally displaced in the control group than in the test group, with a mean of 3.83 mm versus 1.21 mm (P = 0.002). Ridge preservation without flap advancement preserves more keratinized tissue and has less postoperative discomfort and swelling. Although ridge preservation is performed with either method, ≈27% to 30% of bone width is lost.

  15. Comments on: blood product transfusion in emergency department patients: a case control study of practice patterns and impact on outcome.

    PubMed

    Karami, Manoochehr; Khazaei, Salman

    2017-12-06

    Clinical decision making based on study results requires valid and correct data collection and analysis. However, there are some common methodological and statistical issues that may be ignored by authors. In an individually matched case-control design, bias arises from using an unconditional analysis instead of a conditional analysis. Using unconditional logistic regression for matched data imposes a large number of nuisance parameters, which may result in seriously biased estimates.
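
    The point being made is that matched case-control data should be analysed by stratifying on the matched set. A hedged sketch of a conditional logistic regression, using statsmodels' ConditionalLogit (available in recent statsmodels versions), is shown below; the data frame and column names are assumptions for illustration.

```python
import statsmodels.api as sm
from statsmodels.discrete.conditional_models import ConditionalLogit

def matched_analysis(df, outcome, predictors, match_id):
    """Conditional logistic regression stratified on the matched set identifier."""
    X = df[predictors]                       # no intercept: it is conditioned away per set
    model = ConditionalLogit(df[outcome], X, groups=df[match_id])
    return model.fit()

# the naive (and biased) alternative the comment warns against would be an
# ordinary unconditional fit that ignores the matching, e.g.:
#   sm.Logit(df[outcome], sm.add_constant(df[predictors])).fit()
```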

  16. Visual Data Analysis for Satellites

    NASA Technical Reports Server (NTRS)

    Lau, Yee; Bhate, Sachin; Fitzpatrick, Patrick

    2008-01-01

    The Visual Data Analysis Package is a collection of programs and scripts that facilitate visual analysis of data available from NASA and NOAA satellites, as well as dropsonde, buoy, and conventional in-situ observations. The package features utilities for data extraction, data quality control, statistical analysis, and data visualization. The Hierarchical Data Format (HDF) satellite data extraction routines from NASA's Jet Propulsion Laboratory were customized for specific spatial coverage and file input/output. Statistical analysis includes the calculation of the relative error, the absolute error, and the root mean square error. Other capabilities include curve fitting through the data points to fill in missing data points between satellite passes or where clouds obscure satellite data. For data visualization, the software provides customizable Generic Mapping Tool (GMT) scripts to generate difference maps, scatter plots, line plots, vector plots, histograms, timeseries, and color fill images.
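
    The error statistics mentioned (relative error, absolute error, root mean square error) reduce to a few lines of array arithmetic when satellite retrievals are matched with in-situ observations; the sketch below is illustrative and does not reflect the package's actual API.

```python
import numpy as np

def validation_stats(satellite, in_situ):
    """Basic comparison statistics between satellite retrievals and in-situ matchups."""
    satellite = np.asarray(satellite, float)
    in_situ = np.asarray(in_situ, float)
    diff = satellite - in_situ
    return {
        "bias": diff.mean(),                                  # mean error
        "mean_absolute_error": np.abs(diff).mean(),
        "root_mean_square_error": np.sqrt((diff ** 2).mean()),
        "mean_relative_error": (diff / in_situ).mean(),
    }
```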

  17. Linnorm: improved statistical analysis for single cell RNA-seq expression data

    PubMed Central

    Yip, Shun H.; Wang, Panwen; Kocher, Jean-Pierre A.; Sham, Pak Chung

    2017-01-01

    Linnorm is a novel normalization and transformation method for the analysis of single cell RNA sequencing (scRNA-seq) data. Linnorm is developed to remove technical noise and simultaneously preserve biological variations in scRNA-seq data, such that existing statistical methods can be improved. Using real scRNA-seq data, we compared Linnorm with existing normalization methods, including NODES, SAMstrt, SCnorm, scran, DESeq and TMM. Linnorm shows advantages in speed, technical noise removal and preservation of cell heterogeneity, which can improve existing methods in the discovery of novel subtypes, pseudo-temporal ordering of cells, clustering analysis, etc. Linnorm also performs better than existing DEG analysis methods, including BASiCS, NODES, SAMstrt, Seurat and DESeq2, in false positive rate control and accuracy. PMID:28981748

  18. Differentiation of chocolates according to the cocoa's geographical origin using chemometrics.

    PubMed

    Cambrai, Amandine; Marcic, Christophe; Morville, Stéphane; Sae Houer, Pierre; Bindler, Françoise; Marchioni, Eric

    2010-02-10

    The determination of the geographical origin of cocoa used to produce chocolate has been assessed through the analysis of the volatile compounds of chocolate samples. Analysis of the volatile content and its statistical processing by multivariate analyses tended to form independent groups for both Africa and Madagascar, even if some of the chocolate samples analyzed appeared in a mixed zone together with those from America. This analysis also allowed a clear separation between Caribbean chocolates and those from other origins. Eight compounds (such as linalool or (E,E)-2,4-decadienal) characteristic of chocolate's different geographical origins were also identified. The method described in this work (hydrodistillation, GC analysis, and statistical treatment) may improve the control of the geographical origin of chocolate during its long production process.

  19. Voice Tremor in Parkinson's Disease: An Acoustic Study.

    PubMed

    Gillivan-Murphy, Patricia; Miller, Nick; Carding, Paul

    2018-01-30

    Voice tremor associated with Parkinson disease (PD) has not been characterized. Its relationship with voice disability and disease variables is unknown. This study aimed to evaluate voice tremor in people with PD (pwPD) and a matched control group using acoustic analysis, and to examine correlations with voice disability and disease variables. Acoustic voice tremor analysis was completed on 30 pwPD and 28 age-gender matched controls. Voice disability (Voice Handicap Index), and disease variables of disease duration, Activities of Daily Living (Unified Parkinson's Disease Rating Scale [UPDRS II]), and motor symptoms related to PD (UPDRS III) were examined for relationship with voice tremor measures. Voice tremor was detected acoustically in pwPD and controls with similar frequency. PwPD had a statistically significantly higher rate of amplitude tremor (Hz) than controls (P = 0.001). Rate of amplitude tremor was negatively and significantly correlated with UPDRS III total score (rho -0.509). For pwPD, the magnitude and periodicity of acoustic tremor was higher than for controls without statistical significance. The magnitude of frequency tremor (Mftr%) was positively and significantly correlated with disease duration (rho 0.463). PwPD had higher Voice Handicap Index total, functional, emotional, and physical subscale scores than matched controls (P < 0.001). Voice disability did not correlate significantly with acoustic voice tremor measures. Acoustic analysis enhances understanding of PD voice tremor characteristics, its pathophysiology, and its relationship with voice disability and disease symptomatology. Copyright © 2018 The Voice Foundation. All rights reserved.

  20. Evaluation of structural connectivity changes in betel-quid chewers using generalized q-sampling MRI.

    PubMed

    Weng, Jun-Cheng; Kao, Te-Wei; Huang, Guo-Joe; Tyan, Yeu-Sheng; Tseng, Hsien-Chun; Ho, Ming-Chou

    2017-07-01

    Betel quid (BQ) is a common addictive substance in many Asian countries. However, few studies have focused on the influences of BQ on the brain, and it remains unclear how BQ may produce structural brain abnormalities in BQ chewers. We aimed to use generalized q-sampling imaging (GQI) to evaluate the impact of BQ on the neurological structure of white matter. The study population comprised 16 BQ chewers, 15 tobacco and alcohol controls, and 17 healthy controls. We used GQI with voxel-based statistical analysis (VBA) to evaluate structural brain and connectivity abnormalities in the BQ chewers compared to the tobacco and alcohol controls and the healthy controls. Graph theoretical analysis (GTA) and network-based statistical (NBS) analysis were also performed to identify the structural network differences among the three groups. Using GQI, we found increases in diffusion anisotropy in the right anterior cingulate cortex (ACC), the midbrain, the bilateral angular gyrus, the right superior temporal gyrus (rSTG), the bilateral superior occipital gyrus, the left middle occipital gyrus, the bilateral superior and inferior parietal lobule, and the bilateral postcentral and precentral gyrus in the BQ chewers when compared to the tobacco and alcohol controls and the healthy controls. In the GTA and NBS analyses, we found more connections in the structural connectivity of the BQ chewers, particularly in the bilateral anterior cingulum. Our results provided further evidence indicating that BQ chewing may lead to brain structure and connectivity changes in BQ chewers.

  1. Bangladesh.

    PubMed

    Ahmed, K S

    1979-01-01

    In Bangladesh the Population Control and Family Planning Division of the Ministry of Health and Population Control has decided to delegate increased financial and administrative powers to the officers of the family planning program at the district level and below. Currently, about 20,000 family planning workers and officials are at work in rural areas. The government believes that the success of the entire family planning program depends on the performance of workers in rural areas, because that is where about 90% of the population lives. Awareness of the need to improve statistical data in Bangladesh has been increasing, particularly in regard to the development of rural areas. An accurate statistical profile of rural Bangladesh is crucial to the formation, implementation and evaluation of rural development programs. A Seminar on Statistics for Rural Development will be held from June 18-20, 1980. The primary objectives of the Seminar are to make an exhaustive analysis of the current availability of statistics required for rural development programs and to consider methodological and operational improvements toward building up an adequate data base.

  2. Tips and Tricks for Successful Application of Statistical Methods to Biological Data.

    PubMed

    Schlenker, Evelyn

    2016-01-01

    This chapter discusses experimental design and use of statistics to describe characteristics of data (descriptive statistics) and inferential statistics that test the hypothesis posed by the investigator. Inferential statistics, based on probability distributions, depend upon the type and distribution of the data. For data that are continuous, randomly and independently selected, and normally distributed, more powerful parametric tests such as Student's t test and analysis of variance (ANOVA) can be used. For non-normally distributed or skewed data, transformation of the data (using logarithms) may normalize the data allowing use of parametric tests. Alternatively, with skewed data nonparametric tests can be utilized, some of which rely on data that are ranked prior to statistical analysis. Experimental designs and analyses need to balance between committing type 1 errors (false positives) and type 2 errors (false negatives). For a variety of clinical studies that determine risk or benefit, relative risk ratios (random clinical trials and cohort studies) or odds ratios (case-control studies) are utilized. Although both use 2 × 2 tables, their premise and calculations differ. Finally, special statistical methods are applied to microarray and proteomics data, since the large number of genes or proteins evaluated increases the likelihood of false discoveries. Additional studies in separate samples are used to verify microarray and proteomic data. Examples in this chapter and references are available to help continued investigation of experimental designs and appropriate data analysis.
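
    The distinction drawn above between relative risk and odds ratio can be made concrete with one worked 2×2 table; the counts below are invented purely for illustration.

```python
# invented 2x2 table: rows are exposed / unexposed, columns are event / no event
exposed_event, exposed_no = 30, 70
unexposed_event, unexposed_no = 15, 85

# relative risk: appropriate for cohort studies and randomized trials
risk_exposed = exposed_event / (exposed_event + exposed_no)          # 0.30
risk_unexposed = unexposed_event / (unexposed_event + unexposed_no)  # 0.15
relative_risk = risk_exposed / risk_unexposed                        # 2.0

# odds ratio: the estimable quantity in case-control studies
odds_ratio = (exposed_event * unexposed_no) / (exposed_no * unexposed_event)  # about 2.43

print(relative_risk, odds_ratio)   # note the OR overstates the RR when the event is common
```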

  3. The emergence of modern statistics in agricultural science: analysis of variance, experimental design and the reshaping of research at Rothamsted Experimental Station, 1919-1933.

    PubMed

    Parolini, Giuditta

    2015-01-01

    During the twentieth century statistical methods have transformed research in the experimental and social sciences. Qualitative evidence has largely been replaced by quantitative results and the tools of statistical inference have helped foster a new ideal of objectivity in scientific knowledge. The paper will investigate this transformation by considering the genesis of analysis of variance and experimental design, statistical methods nowadays taught in every elementary course of statistics for the experimental and social sciences. These methods were developed by the mathematician and geneticist R. A. Fisher during the 1920s, while he was working at Rothamsted Experimental Station, where agricultural research was in turn reshaped by Fisher's methods. Analysis of variance and experimental design required new practices and instruments in field and laboratory research, and imposed a redistribution of expertise among statisticians, experimental scientists and the farm staff. On the other hand the use of statistical methods in agricultural science called for a systematization of information management and made computing an activity integral to the experimental research done at Rothamsted, permanently integrating the statisticians' tools and expertise into the station research programme. Fisher's statistical methods did not remain confined within agricultural research and by the end of the 1950s they had come to stay in psychology, sociology, education, chemistry, medicine, engineering, economics, quality control, just to mention a few of the disciplines which adopted them.

  4. Impact of Group Sandtray Therapy on the Self-Esteem of Young Adolescent Girls

    ERIC Educational Resources Information Center

    Shen, Yu-Pei; Armstrong, Stephen A.

    2008-01-01

    The effectiveness of group sandtray therapy was examined using a pretest-posttest control group design with young adolescent girls (n = 37) identified as having low self-esteem. A split-plot analysis of variance (SPANOVA) revealed statistically significant differences between participants in the treatment and control groups in self-esteem on five…

  5. Cognitive-Behavioral Treatment for Panic Disorder with Agoraphobia: A Randomized, Controlled Trial and Cost-Effectiveness Analysis

    ERIC Educational Resources Information Center

    Roberge, Pasquale; Marchand, Andre; Reinharz, Daniel; Savard, Pierre

    2008-01-01

    A randomized, controlled trial was conducted to examine the cost-effectiveness of cognitive-behavioral treatment (CBT) for panic disorder with agoraphobia. A total of 100 participants were randomly assigned to standard (n = 33), group (n = 35), and brief (n = 32) treatment conditions. Results show significant clinical and statistical improvement…

  6. Preliminary Retrospective Analysis of Daily Tomotherapy Output Constancy Checks Using Statistical Process Control.

    PubMed

    Mezzenga, Emilio; D'Errico, Vincenzo; Sarnelli, Anna; Strigari, Lidia; Menghi, Enrico; Marcocci, Francesco; Bianchini, David; Benassi, Marcello

    2016-01-01

    The purpose of this study was to retrospectively evaluate quality control results from a Helical TomoTherapy Hi-Art treatment system, based on daily static and dynamic output checks, using statistical process control methods. Individual value X-charts, exponentially weighted moving average charts, and process capability and acceptability indices were used to monitor treatment system performance. Daily output values measured from January 2014 to January 2015 were considered. The results showed that, although the process was generally in control, an out-of-control situation coincided with the principal maintenance intervention on the treatment system. In particular, process capability indices showed a decreasing percentage of points in control, which was nevertheless acceptable according to AAPM TG-148 guidelines. Our findings underline the importance of restricting the acceptable range of daily output checks and suggest a future line of investigation into detailed process control of daily output checks for the Helical TomoTherapy Hi-Art treatment system.
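
    A minimal sketch of the two chart types named above, applied to hypothetical daily output deviations; the moving-range constant (d2 = 1.128) and the EWMA smoothing constant are standard SPC choices, not necessarily the authors' settings:

    ```python
    import numpy as np

    # Hypothetical daily output deviations from the reference value (%).
    outputs = np.array([0.2, -0.1, 0.4, 0.3, -0.2, 0.5, 0.1, 0.0, 0.6, 0.3, 0.2, -0.3])

    # Individual value X-chart: limits from the average moving range (d2 = 1.128 for n = 2).
    sigma_hat = np.abs(np.diff(outputs)).mean() / 1.128
    center = outputs.mean()
    ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

    # Exponentially weighted moving average with smoothing constant lam.
    lam = 0.2
    ewma = [center]
    for x in outputs:
        ewma.append(lam * x + (1 - lam) * ewma[-1])

    print(f"X-chart limits: {lcl:.2f} to {ucl:.2f}")
    print("Out-of-control days:", np.where((outputs > ucl) | (outputs < lcl))[0])
    print("EWMA trace:", np.round(ewma[1:], 3))
    ```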

  7. The efficacy of problem-solving treatments after deliberate self-harm: meta-analysis of randomized controlled trials with respect to depression, hopelessness and improvement in problems.

    PubMed

    Townsend, E; Hawton, K; Altman, D G; Arensman, E; Gunnell, D; Hazell, P; House, A; Van Heeringen, K

    2001-08-01

    Brief problem-solving therapy is regarded as a pragmatic treatment for deliberate self-harm (DSH) patients. A recent meta-analysis of randomized controlled trials (RCTs) evaluating this approach indicated a trend towards reduced repetition of DSH but the pooled odds ratio was not statistically significant. We have now examined other important outcomes using this procedure, namely depression, hopelessness and improvement in problems. Six trials in which problem-solving therapy was compared with control treatment were identified from an extensive literature review of RCTs of treatments for DSH patients. Data concerning depression, hopelessness and improvement in problems were extracted. Where relevant statistical data (e.g. standard deviations) were missing these were imputed using various statistical methods. Results were pooled using meta-analytical procedures. At follow-up, patients who were offered problem-solving therapy had significantly greater improvement in scores for depression (standardized mean difference = -0.36; 95% CI -0.61 to -0.11) and hopelessness (weighted mean difference =-3.2; 95% CI -4.0 to -2.41), and significantly more reported improvement in their problems (odds ratio = 2.31; 95% CI 1.29 to 4.13), than patients who were in the control treatment groups. Problem-solving therapy for DSH patients appears to produce better results than control treatment with regard to improvement in depression, hopelessness and problems. It is desirable that this finding is confirmed in a large trial, which will also allow adequate testing of the impact of this treatment on repetition of DSH.
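
    A minimal sketch of the inverse-variance (fixed-effect) pooling used for outcomes of this kind; the per-trial standardized mean differences and standard errors below are hypothetical, not the trial data:

    ```python
    import numpy as np

    # Hypothetical per-trial standardized mean differences (treatment minus control) and SEs.
    smd = np.array([-0.45, -0.20, -0.55, -0.10, -0.35, -0.40])
    se  = np.array([ 0.20,  0.25,  0.30,  0.22,  0.18,  0.28])

    # Fixed-effect inverse-variance pooling.
    w = 1.0 / se**2
    pooled = np.sum(w * smd) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se

    print(f"Pooled SMD = {pooled:.2f}, 95% CI {lo:.2f} to {hi:.2f}")
    ```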

  8. Self-reported data on sleep quality and psychologic characteristics in patients with myofascial pain and disc displacement versus asymptomatic controls.

    PubMed

    Sener, Sevgi; Guler, Ozkan

    2012-01-01

    The aim of this research was to compare the differences between patients with myofascial pain and disc displacement and asymptomatic individuals based on aspects of psychologic status and sleep quality. One hundred thirty patients (81 women, 49 men; mean ages: 30.0 and 31.0 years, respectively) with temporomandibular disorder were selected, and 64 control subjects (32 women, 32 men; mean ages: 27.2 and 27.5 years, respectively) were included in the investigation over a period of 1 year. Clinical diagnosis of 65 patients with myofascial pain and 65 patients with disc displacement with or without limitation and joint pain was determined according to the Research Diagnostic Criteria for Temporomandibular Disorders. The Pittsburgh Sleep Quality Index (PSQI) was used to evaluate sleep quality. Psychologic status was assessed using Symptom Checklist-90-Revised (SCL-90-R). Chi-square, Kolmogorov-Smirnov, one-way analysis of variance, and Tukey Honestly Significant Difference post hoc multiple comparison or Tamhane T2 tests were used for statistical analysis. There was a significant difference between patients with myofascial pain and disc displacement regarding somatization and paranoid ideation. No statistically significant difference was found between patients with disc displacements and controls in all dimensions of the SCL-90-R. Total score for the PSQI was statistically significantly different between patients with myofascial pain and controls; no significant differences were found between patients with disc displacement and those with myofascial pain or controls regarding the PSQI. To manage patients with myofascial pain, psychologic assessments including sleep quality should be considered.

  9. Estimation of elastic moduli of graphene monolayer in lattice statics approach at nonzero temperature

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zubko, I. Yu., E-mail: zoubko@list.ru; Kochurov, V. I.

    2015-10-27

    With the aim of controlling crystal temperature, a computational-statistical approach to studying the thermo-mechanical properties of finite-sized crystals is presented. The approach is based on the combination of high-performance computational techniques and statistical analysis of the crystal response to external thermo-mechanical actions for specimens with a statistically small number of atoms (for instance, nanoparticles). The thermal motion of atoms is imitated within the statics approach by including independent degrees of freedom associated with atomic oscillations. We found that, under heating, the mechanical response of the graphene material is nonsymmetric.

  10. Hydrochemical evolution and groundwater flow processes in the Galilee and Eromanga basins, Great Artesian Basin, Australia: a multivariate statistical approach.

    PubMed

    Moya, Claudio E; Raiber, Matthias; Taulis, Mauricio; Cox, Malcolm E

    2015-03-01

    The Galilee and Eromanga basins are sub-basins of the Great Artesian Basin (GAB). In this study, a multivariate statistical approach (hierarchical cluster analysis, principal component analysis and factor analysis) is carried out to identify hydrochemical patterns and assess the processes that control hydrochemical evolution within key aquifers of the GAB in these basins. The results of the hydrochemical assessment are integrated into a 3D geological model (previously developed) to support the analysis of spatial patterns of hydrochemistry, and to identify the hydrochemical and hydrological processes that control hydrochemical variability. In this area of the GAB, the hydrochemical evolution of groundwater is dominated by evapotranspiration near the recharge area resulting in a dominance of the Na-Cl water types. This is shown conceptually using two selected cross-sections which represent discrete groundwater flow paths from the recharge areas to the deeper parts of the basins. With increasing distance from the recharge area, a shift towards a dominance of carbonate (e.g. Na-HCO3 water type) has been observed. The assessment of hydrochemical changes along groundwater flow paths highlights how aquifers are separated in some areas, and how mixing between groundwater from different aquifers occurs elsewhere controlled by geological structures, including between GAB aquifers and coal bearing strata of the Galilee Basin. The results of this study suggest that distinct hydrochemical differences can be observed within the previously defined Early Cretaceous-Jurassic aquifer sequence of the GAB. A revision of the two previously recognised hydrochemical sequences is being proposed, resulting in three hydrochemical sequences based on systematic differences in hydrochemistry, salinity and dominant hydrochemical processes. The integrated approach presented in this study which combines different complementary multivariate statistical techniques with a detailed assessment of the geological framework of these sedimentary basins, can be adopted in other complex multi-aquifer systems to assess hydrochemical evolution and its geological controls. Copyright © 2014 Elsevier B.V. All rights reserved.
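
    A minimal sketch, on synthetic ion concentrations, of the kind of workflow described above (hierarchical cluster analysis and PCA on standardized, log-transformed hydrochemical data); the variables and data are illustrative only:

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    # Synthetic major-ion concentrations (rows = samples; columns = Na, Cl, HCO3, Ca, Mg, SO4).
    rng = np.random.default_rng(0)
    X = rng.lognormal(mean=2.0, sigma=0.5, size=(40, 6))

    # Log-transform and standardize, then Ward-linkage clustering into water-type groups.
    Xs = StandardScaler().fit_transform(np.log10(X))
    clusters = fcluster(linkage(Xs, method="ward"), t=3, criterion="maxclust")

    # PCA to see which ions dominate the main axes of hydrochemical variability.
    pca = PCA(n_components=2).fit(Xs)
    print("Cluster sizes:", np.bincount(clusters)[1:])
    print("Explained variance ratios:", np.round(pca.explained_variance_ratio_, 2))
    print("PC1 loadings:", np.round(pca.components_[0], 2))
    ```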

  11. [Antiurolithiasic effect of a plant mixture of Herniaria glabra, Agropyron repens, Equisetum arvense and Sambucus nigra (Herbensurina®) in the prevention of experimentally induced nephrolithiasis in rats].

    PubMed

    Crescenti, Anna; Puiggròs, Francesc; Colomé, Arnau; Poch, Josep Antón; Caimari, Antoni; Bas, Josep Maria; Boqué, Noemí; Arola, Lluís

    2015-12-01

    To determine the effect of a botanical formulation of Herniaria glabra, Agropyron repens, Equisetum arvense, and Sambucus nigra as a preventive agent in an experimentally induced nephrolithiasis model in rats. Nephrolithiasis was induced in six groups of six male Wistar rats each by treatment with 0.75% ethylene glycol (EG) and 1% ammonium chloride for three days and then EG only for 15 days. One group was treated with placebo (control group) and the other groups (treated groups) received 30 mg/kg, 60 mg/kg, 125 mg/kg, 250 mg/kg or 500 mg/kg of the plant extract formulation (PEF). 24-h urine and water samples were collected one day before EG administration and at 7, 13 and 18 days to determine diuresis, crystalluria and urine biochemistry. The kidneys were removed for histological analysis. The phytochemical characterization of PEF and each of its component plant extracts was performed using gas chromatography-mass spectrometry and liquid chromatography-mass spectrometry. Animals treated with 125 mg/kg of the PEF had a statistically significantly lower content of calcium oxalate crystal deposits compared with the control group. All PEF doses statistically significantly decreased the number of microcalcifications compared with the control group. Furthermore, the number of kidneys affected by subcapsular fibrosis was statistically significantly higher in the control group than in the PEF-treated groups. The diuresis of the 125 mg/kg and 500 mg/kg PEF-treated groups was statistically significantly higher than that of the control group. A phytochemical analysis demonstrated the presence of flavonoids, dicarboxylic acids and saponins. Treatment with PEF prevents the formation of calcium oxalate crystal deposits and microcalcifications in the kidney, and reduces the risk of subcapsular fibrosis. The 125 mg/kg dose of PEF had the greatest effect on the studied parameters.

  12. Pembrolizumab Injection

    MedlinePlus

    ... with an immunomodulatory agent as compared to the control group (see statistical analysis section below). Merck & Co., Inc. was made aware of the issue through an external data monitoring committee recommendation and suspended the ... and http://www.fda.gov/Drugs/DrugSafety.

  13. Optimizing construction quality management of pavements using mechanistic performance analysis.

    DOT National Transportation Integrated Search

    2004-08-01

    This report presents a statistical-based algorithm that was developed to reconcile the results from several pavement performance models used in the state of practice with systematic process control techniques. These algorithms identify project-specif...

  14. Low-level processing for real-time image analysis

    NASA Technical Reports Server (NTRS)

    Eskenazi, R.; Wilf, J. M.

    1979-01-01

    A system that detects object outlines in television images in real time is described. A high-speed pipeline processor transforms the raw image into an edge map and a microprocessor, which is integrated into the system, clusters the edges, and represents them as chain codes. Image statistics, useful for higher level tasks such as pattern recognition, are computed by the microprocessor. Peak intensity and peak gradient values are extracted within a programmable window and are used for iris and focus control. The algorithms implemented in hardware and the pipeline processor architecture are described. The strategy for partitioning functions in the pipeline was chosen to make the implementation modular. The microprocessor interface allows flexible and adaptive control of the feature extraction process. The software algorithms for clustering edge segments, creating chain codes, and computing image statistics are also discussed. A strategy for real time image analysis that uses this system is given.
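
    A minimal sketch of the chain-code representation mentioned above: 8-direction (Freeman) codes for a contour of adjacent pixels. The contour is hypothetical and the sketch is not the flight hardware implementation:

    ```python
    # 8-direction Freeman chain code; each symbol encodes the step to the next contour pixel.
    DIRECTIONS = {(1, 0): 0, (1, 1): 1, (0, 1): 2, (-1, 1): 3,
                  (-1, 0): 4, (-1, -1): 5, (0, -1): 6, (1, -1): 7}

    def chain_code(contour):
        """Encode a list of adjacent (x, y) pixel coordinates as chain-code symbols."""
        return [DIRECTIONS[(x1 - x0, y1 - y0)]
                for (x0, y0), (x1, y1) in zip(contour, contour[1:])]

    # A small square traced around its boundary from one corner.
    square = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (1, 2), (0, 2), (0, 1), (0, 0)]
    print(chain_code(square))   # [0, 0, 2, 2, 4, 4, 6, 6]
    ```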

  15. Block observations of neighbourhood physical disorder are associated with neighbourhood crime, firearm injuries and deaths, and teen births.

    PubMed

    Wei, Evelyn; Hipwell, Alison; Pardini, Dustin; Beyers, Jennifer M; Loeber, Rolf

    2005-10-01

    To provide reliability information for a brief observational measure of physical disorder and determine its relation with neighbourhood level crime and health variables after controlling for census based measures of concentrated poverty and minority concentration. Psychometric analysis of block observation data comprising a brief measure of neighbourhood physical disorder, and cross sectional analysis of neighbourhood physical disorder, neighbourhood crime and birth statistics, and neighbourhood level poverty and minority concentration. Pittsburgh, Pennsylvania, US (2000 population=334 563). Pittsburgh neighbourhoods (n=82) and their residents (as reflected in neighbourhood level statistics). The physical disorder index showed adequate reliability and validity and was associated significantly with rates of crime, firearm injuries and homicides, and teen births, while controlling for concentrated poverty and minority population. This brief measure of neighbourhood physical disorder may help increase our understanding of how community level factors reflect health and crime outcomes.

  16. Rolling-Element Fatigue Testing and Data Analysis - A Tutorial

    NASA Technical Reports Server (NTRS)

    Vlcek, Brian L.; Zaretsky, Erwin V.

    2011-01-01

    In order to rank bearing materials, lubricants and other design variables using rolling-element bench type fatigue testing of bearing components and full-scale rolling-element bearing tests, the investigator needs to be cognizant of the variables that affect rolling-element fatigue life and be able to maintain and control them within an acceptable experimental tolerance. Once these variables are controlled, the number of tests and the test conditions must be specified to assure reasonable statistical certainty of the final results. There is a reasonable correlation between the results from elemental test rigs with those results obtained with full-scale bearings. Using the statistical methods of W. Weibull and L. Johnson, the minimum number of tests required can be determined. This paper brings together and discusses the technical aspects of rolling-element fatigue testing and data analysis as well as making recommendations to assure quality and reliable testing of rolling-element specimens and full-scale rolling-element bearings.
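
    A sketch of the Weibull life analysis referred to above, fitted by median-rank regression on hypothetical fatigue lives; the Johnson confidence-number machinery for comparing test groups is omitted:

    ```python
    import numpy as np

    # Hypothetical rolling-element fatigue lives (millions of stress cycles), all failures.
    lives = np.sort(np.array([12.1, 17.8, 21.5, 26.3, 31.0, 38.4, 47.2, 55.9, 71.3, 90.6]))

    # Benard median-rank plotting positions, then a least-squares fit on Weibull axes.
    n = len(lives)
    ranks = (np.arange(1, n + 1) - 0.3) / (n + 0.4)
    x = np.log(lives)
    y = np.log(-np.log(1.0 - ranks))
    slope, intercept = np.polyfit(x, y, 1)

    beta = slope                                  # Weibull slope (shape parameter)
    eta = np.exp(-intercept / slope)              # characteristic life (63.2% failed)
    l10 = eta * (-np.log(0.9)) ** (1.0 / beta)    # life at 10% failure probability

    print(f"Weibull slope = {beta:.2f}, characteristic life = {eta:.1f} million cycles")
    print(f"L10 life = {l10:.1f} million cycles")
    ```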

  17. A double-blind placebo-controlled cross-over clinical trial of DONepezil In Posterior cortical atrophy due to underlying Alzheimer's Disease: DONIPAD study.

    PubMed

    Ridha, Basil H; Crutch, Sebastian; Cutler, Dawn; Frost, Christopher; Knight, William; Barker, Suzie; Epie, Norah; Warrington, Elizabeth K; Kukkastenvehmas, Riitta; Douglas, Jane; Rossor, Martin N

    2018-05-01

    The study investigated whether donepezil exerts symptomatic benefit in patients with posterior cortical atrophy (PCA), an atypical variant of Alzheimer's disease. A single-centre, double-blind, placebo-controlled, cross-over clinical trial was performed to assess the efficacy of donepezil in patients with PCA. Each patient received either donepezil (5 mg once daily in the first 6 weeks and 10 mg once daily in the second 6 weeks) or placebo for 12 weeks. After a 2-week washout period, each patient received the other treatment arm during the following 12 weeks followed by another 2-week washout period. The primary outcome was the Mini-Mental State Examination (MMSE) at 12 weeks. Secondary outcome measures were five neuropsychological tests reflecting parieto-occipital function. Intention-to-treat analysis was used. For each outcome measure, carry-over effects were first assessed. If present, then analysis was restricted to the first 12-week period. Otherwise, the standard approach to the analysis of a 2 × 2 cross-over trial was used. Eighteen patients (13 females) were recruited (mean age 61.6 years). There was a protocol violation in one patient, who subsequently withdrew from the study due to gastrointestinal side effects. There was statistically significant (p < 0.05) evidence of a carry-over effect on MMSE. Therefore, the analysis of treatment effect on MMSE was restricted to the first 12-week period. Treatment effect at 6 weeks was statistically significant (difference = 2.5 in favour of donepezil, 95% CI 0.1 to 5.0, p < 0.05). Treatment effect at 12 weeks was close, but not statistically significant (difference = 2.0 in favour of donepezil, 95% CI -0.1 to 4.5, p > 0.05). There were no statistically significant treatment effects on any of the five neuropsychological tests, except for digit span at 12 weeks (higher by 0.5 digits in favour of placebo, 95% CI 0.1 to 0.9). Gastrointestinal side effects occurred most frequently, affecting 13/18 subjects (72%), and were the cause of study discontinuation in one subject. Nightmares and vivid dreams occurred in 8/18 subjects (44%), and were statistically more frequent during treatment with donepezil. In this small study, there was no statistically significant treatment effect of donepezil on the primary outcome measure (MMSE score at 12 weeks) in PCA patients, who appear to be particularly susceptible to the development of nightmares and vivid dreams when treated. Trial registration: Current Controlled Trials ISRCTN22636071 . Retrospectively registered 19 May 2010.

  18. A Statistical Analysis of Brain Morphology Using Wild Bootstrapping

    PubMed Central

    Ibrahim, Joseph G.; Tang, Niansheng; Rowe, Daniel B.; Hao, Xuejun; Bansal, Ravi; Peterson, Bradley S.

    2008-01-01

    Methods for the analysis of brain morphology, including voxel-based morphology and surface-based morphometries, have been used to detect associations between brain structure and covariates of interest, such as diagnosis, severity of disease, age, IQ, and genotype. The statistical analysis of morphometric measures usually involves two statistical procedures: 1) invoking a statistical model at each voxel (or point) on the surface of the brain or brain subregion, followed by mapping test statistics (e.g., t test) or their associated p values at each of those voxels; 2) correction for the multiple statistical tests conducted across all voxels on the surface of the brain region under investigation. We propose the use of new statistical methods for each of these procedures. We first use a heteroscedastic linear model to test the associations between the morphological measures at each voxel on the surface of the specified subregion (e.g., cortical or subcortical surfaces) and the covariates of interest. Moreover, we develop a robust test procedure that is based on a resampling method, called wild bootstrapping. This procedure assesses the statistical significance of the associations between a measure of given brain structure and the covariates of interest. The value of this robust test procedure lies in its computational simplicity and in its applicability to a wide range of imaging data, including data from both anatomical and functional magnetic resonance imaging (fMRI). Simulation studies demonstrate that this robust test procedure can accurately control the family-wise error rate. We demonstrate the application of this robust test procedure to the detection of statistically significant differences in the morphology of the hippocampus over time across gender groups in a large sample of healthy subjects. PMID:17649909
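
    A minimal sketch of a wild bootstrap for a single regression coefficient under heteroscedastic errors, using Rademacher weights and synthetic data; the voxel-wise model and family-wise error correction of the paper are not reproduced here:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic data: morphological measure y, covariate of interest x, heteroscedastic noise.
    n = 200
    x = rng.normal(size=n)
    y = 0.3 * x + rng.normal(size=n) * (0.5 + 0.5 * np.abs(x))

    X = np.column_stack([np.ones(n), x])
    beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ beta_hat

    # Wild bootstrap: keep X fixed, perturb residuals with random +/-1 (Rademacher) weights.
    B = 2000
    boot_slopes = np.empty(B)
    for b in range(B):
        y_star = X @ beta_hat + resid * rng.choice([-1.0, 1.0], size=n)
        boot_slopes[b] = np.linalg.lstsq(X, y_star, rcond=None)[0][1]

    se_wild = boot_slopes.std(ddof=1)
    print(f"slope = {beta_hat[1]:.3f}, wild-bootstrap SE = {se_wild:.3f}, z = {beta_hat[1] / se_wild:.2f}")
    ```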

  19. Radiographic comparison of different concentrations of recombinant human bone morphogenetic protein with allogenic bone compared with the use of 100% mineralized cancellous bone allograft in maxillary sinus grafting.

    PubMed

    Froum, Stuart J; Wallace, Stephen; Cho, Sang-Choon; Khouly, Ismael; Rosenberg, Edwin; Corby, Patricia; Froum, Scott; Mascarenhas, Patrick; Tarnow, Dennis P

    2014-01-01

    The purpose of this study was to radiographically evaluate, then analyze, bone height, volume, and density with reference to percentage of vital bone after maxillary sinuses were grafted using two different doses of recombinant human bone morphogenetic protein 2/acellular collagen sponge (rhBMP-2/ACS) combined with mineralized cancellous bone allograft (MCBA) and a control sinus grafted with MCBA only. A total of 18 patients (36 sinuses) were used for analysis of height and volume measurements, having two of three graft combinations (one in each sinus): (1) control, MCBA only; (2) test 1, MCBA + 5.6 mL of rhBMP-2/ACS (containing 8.4 mg of rhBMP-2); and (3) test 2, MCBA + 2.8 mL of rhBMP-2/ACS (containing 4.2 mg of rhBMP-2). The study was completed with 16 patients who also had bilateral cores removed 6 to 9 months following sinus augmentation. A computer software system was used to evaluate 36 computed tomography scans. Two time points were selected for measurements of height. The results indicated that height of the grafted sinus was significantly greater in the treatment groups compared with the control. However, by the second time point, there were no statistically significant differences. Three weeks post-surgery, bone volume measurements showed similar statistically significant differences between test and controls. However, prior to core removal, test group 1 with the greater dose of rhBMP-2 showed a statistically significantly greater increase compared with test group 2 and the control. There was no statistically significant difference between the latter two groups. All three groups had similar volume and shrinkage. Density measurements varied from the above results, with the control showing statistically significantly greater density at both time points. By contrast, the density increase over time in both rhBMP groups was similar and statistically significantly higher than in the control group. There were strong associations between height and volume in all groups and between volume and new vital bone only in the control group. There were no statistically significant relationships observed between height and bone density or between volume and bone density for any parameter measured. More cases and monitoring of the future survival of implants placed in these augmented sinuses are needed to verify these results.

  20. The change of adjacent segment after cervical disc arthroplasty compared with anterior cervical discectomy and fusion: a meta-analysis of randomized controlled trials.

    PubMed

    Dong, Liang; Xu, Zhengwei; Chen, Xiujin; Wang, Dongqi; Li, Dichen; Liu, Tuanjing; Hao, Dingjun

    2017-10-01

    Many meta-analyses have been performed to study the efficacy of cervical disc arthroplasty (CDA) compared with anterior cervical discectomy and fusion (ACDF); however, there are few data referring to adjacent segment within these meta-analyses, or investigators are unable to arrive at the same conclusion in the few meta-analyses about adjacent segment. With the increased concerns surrounding adjacent segment degeneration (ASDeg) and adjacent segment disease (ASDis) after anterior cervical surgery, it is necessary to perform a comprehensive meta-analysis to analyze adjacent segment parameters. To perform a comprehensive meta-analysis to elaborate adjacent segment motion, degeneration, disease, and reoperation of CDA compared with ACDF. Meta-analysis of randomized controlled trials (RCTs). PubMed, Embase, and Cochrane Library were searched for RCTs comparing CDA and ACDF before May 2016. The analysis parameters included follow-up time, operative segments, adjacent segment motion, ASDeg, ASDis, and adjacent segment reoperation. The risk of bias scale was used to assess the papers. Subgroup analysis and sensitivity analysis were used to analyze the reason for high heterogeneity. Twenty-nine RCTs fulfilled the inclusion criteria. Compared with ACDF, the rate of adjacent segment reoperation in the CDA group was significantly lower (p<.01), and the advantage of that group in reducing adjacent segment reoperation increases with increasing follow-up time by subgroup analysis. There was no statistically significant difference in ASDeg between CDA and ACDF within the 24-month follow-up period; however, the rate of ASDeg in CDA was significantly lower than that of ACDF with the increase in follow-up time (p<.01). There was no statistically significant difference in ASDis between CDA and ACDF (p>.05). Cervical disc arthroplasty provided a lower adjacent segment range of motion (ROM) than did ACDF, but the difference was not statistically significant. Compared with ACDF, the advantages of CDA were lower ASDeg and adjacent segment reoperation. However, there was no statistically significant difference in ASDis and adjacent segment ROM. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Validation of a proposal for evaluating hospital infection control programs.

    PubMed

    Silva, Cristiane Pavanello Rodrigues; Lacerda, Rúbia Aparecida

    2011-02-01

    To validate the construct and discriminant properties of a hospital infection prevention and control program. The program consisted of four indicators: technical-operational structure; operational prevention and control guidelines; epidemiological surveillance system; and prevention and control activities. These indicators, with previously validated content, were applied to 50 healthcare institutions in the city of São Paulo, Southeastern Brazil, in 2009. Descriptive statistics were used to characterize the hospitals and indicator scores, and Cronbach's α coefficient was used to evaluate the internal consistency. The discriminant validity was analyzed by comparing indicator scores between groups of hospitals: with versus without quality certification. The construct validity analysis was based on exploratory factor analysis with a tetrachoric correlation matrix. The indicators for the technical-operational structure and epidemiological surveillance presented almost 100% conformity in the whole sample. The indicators for the operational prevention and control guidelines and the prevention and control activities presented internal consistency ranging from 0.67 to 0.80. The discriminant validity of these indicators indicated higher and statistically significant mean conformity scores among the group of institutions with healthcare certification or accreditation processes. In the construct validation, two dimensions were identified for the operational prevention and control guidelines: recommendations for preventing hospital infection and recommendations for standardizing prophylaxis procedures, with good correlation between the analysis units that formed the guidelines. The same was found for the prevention and control activities: interfaces with treatment units and support units were identified. Validation of the measurement properties of the hospital infection prevention and control program indicators made it possible to develop a tool for evaluating these programs in an ethical and scientific manner in order to obtain a quality diagnosis in this field.
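
    A small sketch of the internal-consistency calculation (Cronbach's alpha) on hypothetical 0/1 conformity scores; the tetrachoric-correlation factor analysis step is not shown:

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """items: 2-D array, rows = institutions, columns = indicator items (scores)."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1).sum()
        total_variance = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1.0 - item_variances / total_variance)

    # Hypothetical conformity scores (1 = conform, 0 = not) for 6 items across 8 hospitals.
    scores = np.array([[1, 1, 1, 1, 0, 1],
                       [1, 1, 1, 1, 1, 1],
                       [0, 1, 0, 1, 0, 0],
                       [1, 0, 1, 1, 1, 1],
                       [0, 0, 0, 1, 0, 0],
                       [1, 1, 1, 1, 1, 0],
                       [0, 0, 1, 0, 0, 0],
                       [1, 1, 0, 1, 1, 1]])
    print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")
    ```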

  2. Once-only sigmoidoscopy in colorectal cancer screening: follow-up findings of the Italian Randomized Controlled Trial--SCORE.

    PubMed

    Segnan, Nereo; Armaroli, Paola; Bonelli, Luigina; Risio, Mauro; Sciallero, Stefania; Zappa, Marco; Andreoni, Bruno; Arrigoni, Arrigo; Bisanti, Luigi; Casella, Claudia; Crosta, Cristiano; Falcini, Fabio; Ferrero, Franco; Giacomin, Adriano; Giuliani, Orietta; Santarelli, Alessandra; Visioli, Carmen Beatriz; Zanetti, Roberto; Atkin, Wendy S; Senore, Carlo

    2011-09-07

    A single flexible sigmoidoscopy at around the age of 60 years has been proposed as an effective strategy for colorectal cancer (CRC) screening. We conducted a randomized controlled trial to evaluate the effect of flexible sigmoidoscopy screening on CRC incidence and mortality. A questionnaire to assess the eligibility and interest in screening was mailed to 236,568 men and women, aged 55-64 years, who were randomly selected from six trial centers in Italy. Of the 56,532 respondents, interested and eligible subjects were randomly assigned to the intervention group (invitation for flexible sigmoidoscopy; n = 17,148) or the control group (no further contact; n = 17,144), between June 14, 1995, and May 10, 1999. Flexible sigmoidoscopy was performed on 9911 subjects. Intention-to-treat and per-protocol analyses were performed to compare the CRC incidence and mortality rates in the intervention and control groups. Per-protocol analysis was adjusted for noncompliance. A total of 34,272 subjects (17,136 in each group) were included in the follow-up analysis. The median follow-up period was 10.5 years for incidence and 11.4 years for mortality; 251 subjects were diagnosed with CRC in the intervention group and 306 in the control group. Overall incidence rates in the intervention and control groups were 144.11 and 176.43, respectively, per 100,000 person-years. CRC-related death was noted in 65 subjects in the intervention group and 83 subjects in the control group. Mortality rates in the intervention and control groups were 34.66 and 44.45, respectively, per 100,000 person-years. In the intention-to-treat analysis, the rate of CRC incidence was statistically significantly reduced in the intervention group by 18% (rate ratio [RR] = 0.82, 95% confidence interval [CI] = 0.69 to 0.96), and the mortality rate was non-statistically significantly reduced by 22% (RR = 0.78; 95% CI = 0.56 to 1.08) compared with the control group. In the per-protocol analysis, both CRC incidence and mortality rates were statistically significantly reduced among the screened subjects; CRC incidence was reduced by 31% (RR = 0.69; 95% CI = 0.56 to 0.86) and mortality was reduced by 38% (RR = 0.62; 95% CI = 0.40 to 0.96) compared with the control group. A single flexible sigmoidoscopy screening between ages 55 and 64 years was associated with a substantial reduction of CRC incidence and mortality.
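
    A small worked sketch of the rate-ratio arithmetic behind the intention-to-treat result; the event counts are taken from the abstract, the person-years are back-calculated from the reported rates purely for illustration, and the confidence interval uses a simple Poisson log-scale approximation rather than the trial's exact method:

    ```python
    import math

    # Events and (back-calculated, illustrative) person-years in each group.
    events_int, py_int = 251, 174_172   # ~144 per 100,000 person-years
    events_ctl, py_ctl = 306, 173_440   # ~176 per 100,000 person-years

    rate_int = events_int / py_int * 1e5
    rate_ctl = events_ctl / py_ctl * 1e5
    rr = rate_int / rate_ctl

    # Approximate 95% CI for the rate ratio on the log scale.
    se_log_rr = math.sqrt(1 / events_int + 1 / events_ctl)
    lo, hi = (math.exp(math.log(rr) + s * 1.96 * se_log_rr) for s in (-1, 1))

    print(f"rates: {rate_int:.1f} vs {rate_ctl:.1f} per 100,000 person-years")
    print(f"rate ratio = {rr:.2f}, 95% CI {lo:.2f} to {hi:.2f}")
    ```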

  3. Statistical Surrogate Modeling of Atmospheric Dispersion Events Using Bayesian Adaptive Splines

    NASA Astrophysics Data System (ADS)

    Francom, D.; Sansó, B.; Bulaevskaya, V.; Lucas, D. D.

    2016-12-01

    Uncertainty in the inputs of complex computer models, including atmospheric dispersion and transport codes, is often assessed via statistical surrogate models. Surrogate models are computationally efficient statistical approximations of expensive computer models that enable uncertainty analysis. We introduce Bayesian adaptive spline methods for producing surrogate models that capture the major spatiotemporal patterns of the parent model, while satisfying all the necessities of flexibility, accuracy and computational feasibility. We present novel methodological and computational approaches motivated by a controlled atmospheric tracer release experiment conducted at the Diablo Canyon nuclear power plant in California. Traditional methods for building statistical surrogate models often do not scale well to experiments with large amounts of data. Our approach is well suited to experiments involving large numbers of model inputs, large numbers of simulations, and functional output for each simulation. Our approach allows us to perform global sensitivity analysis with ease. We also present an approach to calibration of simulators using field data.

  4. An ANOVA approach for statistical comparisons of brain networks.

    PubMed

    Fraiman, Daniel; Fraiman, Ricardo

    2018-03-16

    The study of brain networks has developed extensively over the last couple of decades. By contrast, techniques for the statistical analysis of these networks are less developed. In this paper, we focus on the statistical comparison of brain networks in a nonparametric framework and discuss the associated detection and identification problems. We tested network differences between groups with an analysis of variance (ANOVA) test we developed specifically for networks. We also propose and analyse the behaviour of a new statistical procedure designed to identify different subnetworks. As an example, we show the application of this tool in resting-state fMRI data obtained from the Human Connectome Project. We identify, among other variables, that the amount of sleep the days before the scan is a relevant variable that must be controlled. Finally, we discuss the potential bias in neuroimaging findings that is generated by some behavioural and brain structure variables. Our method can also be applied to other kind of networks such as protein interaction networks, gene networks or social networks.

  5. The effect of project-based learning on students' statistical literacy levels for data representation

    NASA Astrophysics Data System (ADS)

    Koparan, Timur; Güven, Bülent

    2015-07-01

    The aim of this study is to determine the effect of a project-based learning approach on 8th grade secondary-school students' statistical literacy levels for data representation. To achieve this goal, a test consisting of 12 open-ended questions was developed in accordance with expert views. Seventy 8th grade secondary-school students, 35 in the experimental group and 35 in the control group, took this test twice, once before and once after the intervention. All raw scores were converted into linear measures using the Winsteps 3.72 Rasch modelling program, and t-tests and an ANCOVA were carried out on the linear measures. Based on the findings, it was concluded that the project-based learning approach increases students' level of statistical literacy for data representation. Students' levels of statistical literacy before and after the intervention were illustrated through the resulting person-item maps.

  6. MWASTools: an R/bioconductor package for metabolome-wide association studies.

    PubMed

    Rodriguez-Martinez, Andrea; Posma, Joram M; Ayala, Rafael; Neves, Ana L; Anwar, Maryam; Petretto, Enrico; Emanueli, Costanza; Gauguier, Dominique; Nicholson, Jeremy K; Dumas, Marc-Emmanuel

    2018-03-01

    MWASTools is an R package designed to provide an integrated pipeline to analyse metabonomic data in large-scale epidemiological studies. Key functionalities of our package include: quality control analysis; metabolome-wide association analysis using various models (partial correlations, generalized linear models); visualization of statistical outcomes; metabolite assignment using statistical total correlation spectroscopy (STOCSY); and biological interpretation of metabolome-wide association study results. The MWASTools R package is implemented in R (version >= 3.4) and is available from Bioconductor: https://bioconductor.org/packages/MWASTools/. m.dumas@imperial.ac.uk. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  7. Iodixanol versus low-osmolar contrast media for prevention of contrast induced nephropathy: meta-analysis of randomized, controlled trials.

    PubMed

    From, Aaron M; Al Badarin, Firas J; McDonald, Furman S; Bartholmai, Brian J; Cha, Stephen S; Rihal, Charanjit S

    2010-08-01

    Contrast-induced nephropathy (CIN) is associated with significant morbidity and mortality. The objective of our meta-analysis was to assess the efficacy of iodixanol compared with low-osmolar contrast media (LOCM) for prevention of CIN. We searched MEDLINE, the Cochrane Central Register of Controlled Trials, and internet sources of cardiology trial results for individual and relevant reviews of randomized, controlled trials, for the terms contrast media, contrast nephropathy, renal failure, iodixanol, Visipaque, and low-osmolar contrast media. All studies reported an incidence rate of CIN for each study group; there was no restriction on the definition of CIN. There were no restrictions on journal type or patient population. Overall, 36 trials were identified for analysis of aggregated summary data on 7166 patients; 3672 patients received iodixanol and 3494 patients received LOCM. Overall, iodixanol showed no statistically significant reduction in CIN incidence below that observed with heterogeneous comparator agents (P=0.11). Analysis of patient subgroups revealed that there was a significant benefit of iodixanol when compared with iohexol alone (odds ratio, 0.25; 95% confidence interval, 0.11 to 0.55; P<0.001) but not when compared with LOCM other than iohexol or with other ionic dimers or among patients receiving intra-arterial contrast injections or among patients undergoing coronary angiography with or without percutaneous intervention. Analysis of aggregated summary data from multiple randomized, controlled trials of iodixanol against diverse LOCMs for heterogeneous procedures and definitions of CIN show an iodixanol-associated reduction that is suggestive but statistically nonsignificant.

  8. How to Make Nothing Out of Something: Analyses of the Impact of Study Sampling and Statistical Interpretation in Misleading Meta-Analytic Conclusions

    PubMed Central

    Cunningham, Michael R.; Baumeister, Roy F.

    2016-01-01

    The limited resource model states that self-control is governed by a relatively finite set of inner resources on which people draw when exerting willpower. Once self-control resources have been used up or depleted, they are less available for other self-control tasks, leading to a decrement in subsequent self-control success. The depletion effect has been studied for over 20 years, tested or extended in more than 600 studies, and supported in an independent meta-analysis (Hagger et al., 2010). Meta-analyses are supposed to reduce bias in literature reviews. Carter et al.’s (2015) meta-analysis, by contrast, included a series of questionable decisions involving sampling, methods, and data analysis. We provide quantitative analyses of key sampling issues: exclusion of many of the best depletion studies based on idiosyncratic criteria and the emphasis on mini meta-analyses with low statistical power as opposed to the overall depletion effect. We discuss two key methodological issues: failure to code for research quality, and the quantitative impact of weak studies by novice researchers. We discuss two key data analysis issues: questionable interpretation of the results of trim and fill and Funnel Plot Asymmetry test procedures, and the use and misinterpretation of the untested Precision Effect Test and Precision Effect Estimate with Standard Error (PEESE) procedures. Despite these serious problems, the Carter et al. (2015) meta-analysis results actually indicate that there is a real depletion effect – contrary to their title. PMID:27826272

  9. An analysis of the cognitive deficit of schizophrenia based on the Piaget developmental theory.

    PubMed

    Torres, Alejandro; Olivares, Jose M; Rodriguez, Angel; Vaamonde, Antonio; Berrios, German E

    2007-01-01

    The objective of the study was to evaluate from the perspective of the Piaget developmental model the cognitive functioning of a sample of patients diagnosed with schizophrenia. Fifty patients with schizophrenia (Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition) and 40 healthy matched controls were evaluated by means of the Longeot Logical Thought Evaluation Scale. Only 6% of the subjects with schizophrenia reached the "formal period," and 70% remained at the "concrete operations" stage. The corresponding figures for the control sample were 25% and 15%, respectively. These differences were statistically significant. The samples were specifically differentiable on the permutation, probabilities, and pendulum tests of the scale. The Longeot Logical Thought Evaluation Scale can discriminate between subjects with schizophrenia and healthy controls.

  10. Fully Bayesian inference for structural MRI: application to segmentation and statistical analysis of T2-hypointensities.

    PubMed

    Schmidt, Paul; Schmid, Volker J; Gaser, Christian; Buck, Dorothea; Bührlen, Susanne; Förschler, Annette; Mühlau, Mark

    2013-01-01

    Aiming at iron-related T2-hypointensity, which is related to normal aging and neurodegenerative processes, we here present two practicable approaches, based on Bayesian inference, for preprocessing and statistical analysis of a complex set of structural MRI data. In particular, Markov Chain Monte Carlo methods were used to simulate posterior distributions. First, we rendered a segmentation algorithm that uses outlier detection based on model checking techniques within a Bayesian mixture model. Second, we rendered an analytical tool comprising a Bayesian regression model with smoothness priors (in the form of Gaussian Markov random fields) mitigating the necessity to smooth data prior to statistical analysis. For validation, we used simulated data and MRI data of 27 healthy controls (age: [Formula: see text]; range, [Formula: see text]). We first observed robust segmentation of both simulated T2-hypointensities and gray-matter regions known to be T2-hypointense. Second, simulated data and images of segmented T2-hypointensity were analyzed. We found not only robust identification of simulated effects but also a biologically plausible age-related increase of T2-hypointensity primarily within the dentate nucleus but also within the globus pallidus, substantia nigra, and red nucleus. Our results indicate that fully Bayesian inference can successfully be applied for preprocessing and statistical analysis of structural MRI data.

  11. Statistical analysis plan for the Laser-1st versus Drops-1st for Glaucoma and Ocular Hypertension Trial (LiGHT): a multi-centre randomised controlled trial.

    PubMed

    Vickerstaff, Victoria; Ambler, Gareth; Bunce, Catey; Xing, Wen; Gazzard, Gus

    2015-11-11

    The LiGHT trial (Laser-1st versus Drops-1st for Glaucoma and Ocular Hypertension Trial) is a multicentre randomised controlled trial of two treatment pathways for patients who are newly diagnosed with open-angle glaucoma (OAG) and ocular hypertension (OHT). The main hypothesis for the trial is that lowering intraocular pressure (IOP) with selective laser trabeculoplasty (SLT) as the primary treatment ('Laser-1st') leads to a better health-related quality of life than for those started on IOP-lowering drops as their primary treatment ('Medicine-1st') and that this is associated with reduced costs and improved tolerability of treatment. This paper describes the statistical analysis plan for the study. The LiGHT trial is an unmasked, multi-centre randomised controlled trial. A total of 718 patients (359 per arm) are being randomised to two groups: medicine-first or laser-first treatment. Outcomes are recorded at baseline and at 6-month intervals up to 36 months. The primary outcome measure is health-related quality of life (HRQL) at 36 months measured using the EQ-5D-5L. The main secondary outcome is the Glaucoma Utility Index. We plan to analyse the patient outcome data according to the group to which the patient was originally assigned. Methods of statistical analysis are described, including the handling of missing data, the covariates used in the adjusted analyses and the planned sensitivity analyses. The trial was registered with the ISRCTN register on 23/07/2012, number ISRCTN32038223 .

  12. TU-FG-201-05: Varian MPC as a Statistical Process Control Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carver, A; Rowbottom, C

    Purpose: Quality assurance in radiotherapy requires the measurement of various machine parameters to ensure they remain within permitted values over time. In Truebeam release 2.0, the Machine Performance Check (MPC) was released, allowing beam output and machine axis movements to be assessed in a single test. We aim to evaluate the Varian Machine Performance Check (MPC) as a tool for Statistical Process Control (SPC). Methods: Varian’s MPC tool was used on three Truebeam and one EDGE linac for a period of approximately one year. MPC was commissioned against independent systems. After this period the data were reviewed to determine whether or not the MPC was useful as a process control tool. Individual tests were analysed using Shewhart control charts, with Matlab used for the analysis. Principal component analysis was used to determine if a multivariate model was of any benefit in analysing the data. Results: Control charts were found to be useful to detect beam output changes, worn T-nuts and jaw calibration issues. Upper and lower control limits were defined at the 95% level. Multivariate SPC was performed using Principal Component Analysis. We found little evidence of clustering beyond that which might be naively expected, such as beam uniformity and beam output. Whilst this makes multivariate analysis of little use, it suggests that each test is giving independent information. Conclusion: The variety of independent parameters tested in MPC makes it a sensitive tool for routine machine QA. We have determined that using control charts in our QA programme would rapidly detect changes in machine performance. The use of control charts allows large quantities of tests to be performed on all linacs without visual inspection of all results. The use of control limits alerts users when data are inconsistent with previous measurements before they become out of specification. A. Carver has received a speaker’s honorarium from Varian.
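
    A small sketch of the two analyses described above, on synthetic data: Shewhart individuals limits for a single MPC parameter, and PCA across parameters to look for shared variance; parameter names and scales are assumptions:

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(3)

    # Synthetic daily MPC results: columns such as output change, uniformity, jaw/couch offsets.
    X = rng.normal(size=(250, 6)) * np.array([0.5, 0.3, 0.2, 0.2, 0.1, 0.1])

    # Shewhart individuals limits for one parameter (e.g. beam output change, %).
    out = X[:, 0]
    sigma_hat = np.abs(np.diff(out)).mean() / 1.128
    ucl, lcl = out.mean() + 3 * sigma_hat, out.mean() - 3 * sigma_hat

    # PCA across parameters: strongly correlated tests would concentrate variance in a few
    # components, while roughly equal ratios suggest each test gives independent information.
    ratios = PCA().fit(X).explained_variance_ratio_
    print(f"output control limits: {lcl:.2f} to {ucl:.2f}")
    print("explained variance ratios:", np.round(ratios, 2))
    ```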

  13. Stress and adult smartphone addiction: Mediation by self-control, neuroticism, and extraversion.

    PubMed

    Cho, Hea-Young; Kim, Dai Jin; Park, Jae Woo

    2017-12-01

    This study employed descriptive statistics and correlation analysis, followed by structural equation modelling, to examine the influence of stress on smartphone addiction and the mediating effects of self-control, neuroticism, and extraversion in 400 men and women in their 20s to 40s. Our findings indicate that stress had a significant influence on smartphone addiction, and that self-control mediates the influence of stress on smartphone addiction. As stress increases, self-control decreases, which subsequently leads to increased smartphone addiction. Self-control was confirmed as an important factor in the prevention of smartphone addiction. Finally, among personality factors, neuroticism and extraversion mediate the influence of stress on smartphone addiction. Copyright © 2017 John Wiley & Sons, Ltd.

  14. Lactobacillus for preventing recurrent urinary tract infections in women: meta-analysis.

    PubMed

    Grin, Peter M; Kowalewska, Paulina M; Alhazzan, Waleed; Fox-Robichaud, Alison E

    2013-02-01

    Urinary tract infections (UTIs) are the most common infections affecting women, and often recur. Lactobacillus probiotics could potentially replace low dose, long term antibiotics as a safer prophylactic for recurrent UTI (rUTI). This systematic review and meta-analysis was performed to compile the results of existing randomized clinical trials (RCTs) to determine the efficacy of probiotic Lactobacillus species in preventing rUTI. MEDLINE and EMBASE were searched from inception to July 2012 for RCTs using a Lactobacillus prophylactic against rUTI in premenopausal adult women. A random-effects model meta-analysis was performed using a pooled risk ratio, comparing incidence of rUTI in patients receiving Lactobacillus to control. Data from 294 patients across five studies were included. There was no statistically significant difference in the risk for rUTI in patients receiving Lactobacillus versus controls, as indicated by the pooled risk ratio of 0.85 (95% confidence interval of 0.58-1.25, p = 0.41). A sensitivity analysis was performed, excluding studies using ineffective strains and studies testing for safety. Data from 127 patients in two studies were included. A statistically significant decrease in rUTI was found in patients given Lactobacillus, denoted by the pooled risk ratio of 0.51 (95% confidence interval 0.26-0.99, p = 0.05) with no statistical heterogeneity (I2 = 0%). Probiotic strains of Lactobacillus are safe and effective in preventing rUTI in adult women. However, more RCTs are required before a definitive recommendation can be made since the patient population contributing data to this meta-analysis was small.
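
    A minimal sketch of pooling trial-level risk ratios and computing the I² heterogeneity statistic; the counts are hypothetical, and fixed-effect weights are used for brevity whereas the review itself used a random-effects model:

    ```python
    import numpy as np

    # Hypothetical per-trial counts: (events, n) in the Lactobacillus arm, then the control arm.
    trials = [(10, 50, 14, 48), (8, 40, 9, 42), (12, 60, 20, 58), (5, 25, 6, 24), (9, 55, 11, 50)]

    log_rr = np.array([np.log((e1 / n1) / (e0 / n0)) for e1, n1, e0, n0 in trials])
    var_log_rr = np.array([1/e1 - 1/n1 + 1/e0 - 1/n0 for e1, n1, e0, n0 in trials])

    # Fixed-effect pooling plus Cochran's Q and the I^2 heterogeneity statistic.
    w = 1.0 / var_log_rr
    pooled = np.sum(w * log_rr) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    q = np.sum(w * (log_rr - pooled) ** 2)
    i2 = max(0.0, (q - (len(trials) - 1)) / q) * 100 if q > 0 else 0.0

    print(f"pooled RR = {np.exp(pooled):.2f} "
          f"(95% CI {np.exp(pooled - 1.96 * se):.2f} to {np.exp(pooled + 1.96 * se):.2f}), I^2 = {i2:.0f}%")
    ```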

  15. Analysis of Rare, Exonic Variation amongst Subjects with Autism Spectrum Disorders and Population Controls

    PubMed Central

    Liu, Li; Sabo, Aniko; Neale, Benjamin M.; Nagaswamy, Uma; Stevens, Christine; Lim, Elaine; Bodea, Corneliu A.; Muzny, Donna; Reid, Jeffrey G.; Banks, Eric; Coon, Hillary; DePristo, Mark; Dinh, Huyen; Fennel, Tim; Flannick, Jason; Gabriel, Stacey; Garimella, Kiran; Gross, Shannon; Hawes, Alicia; Lewis, Lora; Makarov, Vladimir; Maguire, Jared; Newsham, Irene; Poplin, Ryan; Ripke, Stephan; Shakir, Khalid; Samocha, Kaitlin E.; Wu, Yuanqing; Boerwinkle, Eric; Buxbaum, Joseph D.; Cook, Edwin H.; Devlin, Bernie; Schellenberg, Gerard D.; Sutcliffe, James S.; Daly, Mark J.; Gibbs, Richard A.; Roeder, Kathryn

    2013-01-01

    We report on results from whole-exome sequencing (WES) of 1,039 subjects diagnosed with autism spectrum disorders (ASD) and 870 controls selected from the NIMH repository to be of similar ancestry to cases. The WES data came from two centers using different methods to produce sequence and to call variants from it. Therefore, an initial goal was to ensure the distribution of rare variation was similar for data from different centers. This proved straightforward by filtering called variants by fraction of missing data, read depth, and balance of alternative to reference reads. Results were evaluated using seven samples sequenced at both centers and by results from the association study. Next we addressed how the data and/or results from the centers should be combined. Gene-based analyses of association was an obvious choice, but should statistics for association be combined across centers (meta-analysis) or should data be combined and then analyzed (mega-analysis)? Because of the nature of many gene-based tests, we showed by theory and simulations that mega-analysis has better power than meta-analysis. Finally, before analyzing the data for association, we explored the impact of population structure on rare variant analysis in these data. Like other recent studies, we found evidence that population structure can confound case-control studies by the clustering of rare variants in ancestry space; yet, unlike some recent studies, for these data we found that principal component-based analyses were sufficient to control for ancestry and produce test statistics with appropriate distributions. After using a variety of gene-based tests and both meta- and mega-analysis, we found no new risk genes for ASD in this sample. Our results suggest that standard gene-based tests will require much larger samples of cases and controls before being effective for gene discovery, even for a disorder like ASD. PMID:23593035

  16. Boosting Stochastic Problem Solvers Through Online Self-Analysis of Performance

    DTIC Science & Technology

    2003-07-21

    Boosting Stochastic Problem Solvers Through Online Self-Analysis of Performance. Vincent A. Cicirello, CMU-RI-TR-03-27. Submitted in partial fulfillment... ...lead to the development of a search control framework, called QD-BEACON, that uses online-generated statistical models of search performance to...

  17. A systematic review of the quality of statistical methods employed for analysing quality of life data in cancer randomised controlled trials.

    PubMed

    Hamel, Jean-Francois; Saulnier, Patrick; Pe, Madeline; Zikos, Efstathios; Musoro, Jammbe; Coens, Corneel; Bottomley, Andrew

    2017-09-01

    Over the last decades, Health-related Quality of Life (HRQoL) end-points have become an important outcome of the randomised controlled trials (RCTs). HRQoL methodology in RCTs has improved following international consensus recommendations. However, no international recommendations exist concerning the statistical analysis of such data. The aim of our study was to identify and characterise the quality of the statistical methods commonly used for analysing HRQoL data in cancer RCTs. Building on our recently published systematic review, we analysed a total of 33 published RCTs studying the HRQoL methods reported in RCTs since 1991. We focussed on the ability of the methods to deal with the three major problems commonly encountered when analysing HRQoL data: their multidimensional and longitudinal structure and the commonly high rate of missing data. All studies reported HRQoL being assessed repeatedly over time for a period ranging from 2 to 36 months. Missing data were common, with compliance rates ranging from 45% to 90%. From the 33 studies considered, 12 different statistical methods were identified. Twenty-nine studies analysed each of the questionnaire sub-dimensions without type I error adjustment. Thirteen studies repeated the HRQoL analysis at each assessment time again without type I error adjustment. Only 8 studies used methods suitable for repeated measurements. Our findings show a lack of consistency in statistical methods for analysing HRQoL data. Problems related to multiple comparisons were rarely considered leading to a high risk of false positive results. It is therefore critical that international recommendations for improving such statistical practices are developed. Copyright © 2017. Published by Elsevier Ltd.
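
    One standard way to make the type I error adjustment called for above is a Holm step-down correction across the questionnaire sub-dimensions; a small sketch with hypothetical p-values:

    ```python
    # Hypothetical p-values for several HRQoL sub-dimensions tested at one assessment time.
    p_values = {"global": 0.012, "physical": 0.048, "emotional": 0.030,
                "fatigue": 0.004, "pain": 0.200, "social": 0.650}

    def holm_adjust(pvals):
        """Holm step-down adjusted p-values, controlling the family-wise error rate."""
        ordered = sorted(pvals.items(), key=lambda kv: kv[1])
        m = len(ordered)
        adjusted, running_max = {}, 0.0
        for i, (name, p) in enumerate(ordered):
            running_max = max(running_max, (m - i) * p)
            adjusted[name] = min(1.0, running_max)
        return adjusted

    for name, p_adj in holm_adjust(p_values).items():
        print(f"{name:10s} adjusted p = {p_adj:.3f}")
    ```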

  18. 14 CFR 23.621 - Casting factors.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... either magnetic particle, penetrant or other approved equivalent non-destructive inspection method; or... percent approved non-destructive inspection. When an approved quality control procedure is established and an acceptable statistical analysis supports reduction, non-destructive inspection may be reduced from...

  19. 14 CFR 23.621 - Casting factors.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... either magnetic particle, penetrant or other approved equivalent non-destructive inspection method; or... percent approved non-destructive inspection. When an approved quality control procedure is established and an acceptable statistical analysis supports reduction, non-destructive inspection may be reduced from...

  20. 14 CFR 23.621 - Casting factors.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... either magnetic particle, penetrant or other approved equivalent non-destructive inspection method; or... percent approved non-destructive inspection. When an approved quality control procedure is established and an acceptable statistical analysis supports reduction, non-destructive inspection may be reduced from...

  1. 14 CFR 23.621 - Casting factors.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... either magnetic particle, penetrant or other approved equivalent non-destructive inspection method; or... percent approved non-destructive inspection. When an approved quality control procedure is established and an acceptable statistical analysis supports reduction, non-destructive inspection may be reduced from...

  2. Rapid analysis of pharmaceutical drugs using LIBS coupled with multivariate analysis.

    PubMed

    Tiwari, P K; Awasthi, S; Kumar, R; Anand, R K; Rai, P K; Rai, A K

    2018-02-01

    Type 2 diabetes drug tablets of various brands containing voglibose at dose strengths of 0.2 and 0.3 mg were examined using the laser-induced breakdown spectroscopy (LIBS) technique. Statistical methods, namely principal component analysis (PCA) and partial least squares regression (PLSR), were applied to the LIBS spectral data to classify the drug samples and develop calibration models. A ratio-based calibration model was developed with PLSR using the relative spectral intensity ratios H/C, H/N and O/N, and this model was then used to predict the relative elemental concentrations in unknown drug samples. The experiment was performed in air and in an argon atmosphere, and the results were compared. The present model provides a rapid spectroscopic method for drug analysis with high statistical significance for online control and measurement processes in a wide variety of pharmaceutical industrial applications.
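
    As a rough illustration of the workflow this record describes (not the authors' code), the sketch below fits a PCA view and a PLSR calibration model on synthetic intensity-ratio features; the ratio values and sample counts are invented for the example.

```python
# Minimal sketch: PCA for class separation plus a PLSR calibration model on
# LIBS-style intensity ratios (H/C, H/N, O/N), using synthetic data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
# Hypothetical feature matrix: rows = spectra, columns = ratios (H/C, H/N, O/N)
X = rng.normal(loc=[1.2, 0.8, 1.5], scale=0.05, size=(40, 3))
y = 0.2 + 0.5 * X[:, 0] - 0.3 * X[:, 2] + rng.normal(0, 0.01, 40)  # relative concentration

scores = PCA(n_components=2).fit_transform(X)   # exploratory view for classification
pls = PLSRegression(n_components=2).fit(X, y)   # ratio-based calibration model
print("Predicted concentration for one sample:", pls.predict(X[:1]))
```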

  3. Flight path control strategies and preliminary deltaV requirements for the 2007 Mars Phoenix (PHX) mission

    NASA Technical Reports Server (NTRS)

    Raofi, Behzad

    2005-01-01

    This paper describes the methods used to estimate the statistical deltaV requirements for the propulsive maneuvers that will deliver the spacecraft to its target landing site while satisfying planetary protection requirements. The paper presents flight path control analysis results for three different trajectories: the open, middle, and close of the launch period for the mission.

  4. A statistical-based scheduling algorithm in automated data path synthesis

    NASA Technical Reports Server (NTRS)

    Jeon, Byung Wook; Lursinsap, Chidchanok

    1992-01-01

    In this paper, we propose a new heuristic scheduling algorithm based on the statistical analysis of the cumulative frequency distribution of operations among control steps. It has a tendency of escaping from local minima and therefore reaching a globally optimal solution. The presented algorithm considers the real world constraints such as chained operations, multicycle operations, and pipelined data paths. The result of the experiment shows that it gives optimal solutions, even though it is greedy in nature.

  5. Accounting for competing risks in randomized controlled trials: a review and recommendations for improvement

    PubMed Central

    Fine, Jason P.

    2017-01-01

    In studies with survival or time‐to‐event outcomes, a competing risk is an event whose occurrence precludes the occurrence of the primary event of interest. Specialized statistical methods must be used to analyze survival data in the presence of competing risks. We conducted a review of randomized controlled trials with survival outcomes that were published in high‐impact general medical journals. Of 40 studies that we identified, 31 (77.5%) were potentially susceptible to competing risks. However, in the majority of these studies, the potential presence of competing risks was not accounted for in the statistical analyses that were described. Of the 31 studies potentially susceptible to competing risks, 24 (77.4%) reported the results of a Kaplan–Meier survival analysis, while only five (16.1%) reported using cumulative incidence functions to estimate the incidence of the outcome over time in the presence of competing risks. The former approach will tend to result in an overestimate of the incidence of the outcome over time, while the latter approach will result in unbiased estimation of the incidence of the primary outcome over time. We provide recommendations on the analysis and reporting of randomized controlled trials with survival outcomes in the presence of competing risks. © 2017 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. PMID:28102550
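
    To make the contrast in this record concrete, the hedged sketch below (assuming the `lifelines` package and wholly synthetic data) compares a naive Kaplan-Meier complement, which treats the competing event as censoring, with the Aalen-Johansen cumulative incidence estimator.

```python
# Illustrative sketch: Kaplan-Meier overestimates incidence when a competing
# event is treated as censoring; Aalen-Johansen gives the cumulative incidence.
import numpy as np
from lifelines import KaplanMeierFitter, AalenJohansenFitter

rng = np.random.default_rng(1)
durations = rng.exponential(12, size=300)
# event code: 0 = censored, 1 = primary event, 2 = competing event
events = rng.choice([0, 1, 2], size=300, p=[0.2, 0.5, 0.3])

km = KaplanMeierFitter().fit(durations, event_observed=(events == 1))
naive_risk = 1 - km.survival_function_          # tends to overestimate incidence

aj = AalenJohansenFitter().fit(durations, events, event_of_interest=1)
cif = aj.cumulative_density_                    # incidence accounting for competing risks
print(naive_risk.tail(1), cif.tail(1))
```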

  6. Genetic association between the dopamine D1-receptor gene and paranoid schizophrenia in a northern Han Chinese population.

    PubMed

    Yao, Jun; Ding, Mei; Xing, Jiaxin; Xuan, Jinfeng; Pang, Hao; Pan, Yuqing; Wang, Baojie

    2014-01-01

    Dysregulation of dopaminergic neurotransmission at the D1 receptor in the prefrontal cortex has been implicated in the pathogenesis of schizophrenia. Genetic polymorphisms of the dopamine D1-receptor gene have a plausible role in modulating the risk of schizophrenia. To determine the role of DRD1 genetic polymorphisms as a risk factor for schizophrenia, we undertook a case-control study to look for an association between the DRD1 gene and schizophrenia. We genotyped eleven single-nucleotide polymorphisms within the DRD1 gene by deoxyribonucleic acid sequencing involving 173 paranoid schizophrenia patients and 213 unrelated healthy individuals. Statistical analysis was performed to identify the difference of genotype, allele, or haplotype distribution between cases and controls. A significantly lower risk of paranoid schizophrenia was associated with the AG + GG genotype of rs5326 and the AG + GG genotype of rs4532 compared to the AA genotype and the AA genotype, respectively. Distribution of haplotypes was no different between controls and paranoid schizophrenia patients. In the males, the genotype distribution of rs5326 was statistically different between cases and controls. In the females, the genotype distribution of rs4532 was statistically different between cases and controls. However, the aforementioned statistical significances were lost after Bonferroni correction. It is unlikely that DRD1 accounts for a substantial proportion of the genetic risk for schizophrenia. As an important dopaminergic gene, DRD1 may contribute to schizophrenia by interacting with other genes, and further relevant studies are warranted.

  7. Association of Glycemic Status with Bone Turnover Markers in Type 2 Diabetes Mellitus.

    PubMed

    Kulkarni, Sweta Vilas; Meenatchi, Suruthi; Reeta, R; Ramesh, Ramasamy; Srinivasan, A R; Lenin, C

    2017-01-01

    Type 2 diabetes mellitus has profound implications for the skeleton. Even though bone mineral density is increased in type 2 diabetes mellitus patients, they are more prone to fractures. The weakening of bone tissue in type 2 diabetes mellitus can be due to uncontrolled blood sugar levels leading to high levels of bone turnover markers in blood. The aim of this study was to find the association between glycemic status and bone turnover markers in type 2 diabetes mellitus. This case-control study was carried out in a tertiary health care hospital. Fifty clinically diagnosed type 2 diabetes mellitus patients aged between 30 and 50 years were included as cases, and fifty age- and gender-matched healthy nondiabetics were included as controls. Patients with complications and chronic illness were excluded from the study. Depending on glycated hemoglobin (HbA1c) levels, patients were grouped into uncontrolled (HbA1c >7%, n = 36) and controlled (HbA1c <7%, n = 14) diabetics. Based on duration of diabetes, patients were grouped into newly diagnosed, 1-2 years, 3-5 years, and >5 years. Serum osteocalcin (OC), bone alkaline phosphatase (BAP), acid phosphatase (ACP), and HbA1c levels were estimated, and the OC/BAP and OC/ACP ratios were calculated. Student's t-test, analysis of variance, and Chi-square tests were used for analysis, and receiver operating characteristic (ROC) curve analysis was done for the OC/BAP and OC/ACP ratios. Serum OC, HbA1c, and the OC/BAP ratio were significantly increased in cases compared to controls (P < 0.001), whereas the OC/ACP ratio was significantly decreased in type 2 diabetes mellitus (P = 0.01). In patients with more than 5 years' duration of diabetes, HbA1c was significantly higher (P < 0.042). BAP levels were high in uncontrolled diabetics but not statistically significant. ROC curve analysis showed the OC/BAP ratio to be a better marker than the OC/ACP ratio. Uncontrolled type 2 diabetes mellitus affects bone tissue, resulting in variations in bone turnover markers. Bone turnover markers are better at predicting recent changes in bone morphology and are cost effective.

  8. A Powerful Procedure for Pathway-Based Meta-analysis Using Summary Statistics Identifies 43 Pathways Associated with Type II Diabetes in European Populations.

    PubMed

    Zhang, Han; Wheeler, William; Hyland, Paula L; Yang, Yifan; Shi, Jianxin; Chatterjee, Nilanjan; Yu, Kai

    2016-06-01

    Meta-analysis of multiple genome-wide association studies (GWAS) has become an effective approach for detecting single nucleotide polymorphism (SNP) associations with complex traits. However, it is difficult to integrate the readily accessible SNP-level summary statistics from a meta-analysis into more powerful multi-marker testing procedures, which generally require individual-level genetic data. We developed a general procedure called Summary based Adaptive Rank Truncated Product (sARTP) for conducting gene and pathway meta-analysis that uses only SNP-level summary statistics in combination with genotype correlation estimated from a panel of individual-level genetic data. We demonstrated the validity and power advantage of sARTP through empirical and simulated data. We conducted a comprehensive pathway-based meta-analysis with sARTP on type 2 diabetes (T2D) by integrating SNP-level summary statistics from two large studies consisting of 19,809 T2D cases and 111,181 controls with European ancestry. Among 4,713 candidate pathways from which genes in neighborhoods of 170 GWAS established T2D loci were excluded, we detected 43 T2D globally significant pathways (with Bonferroni corrected p-values < 0.05), which included the insulin signaling pathway and T2D pathway defined by KEGG, as well as the pathways defined according to specific gene expression patterns on pancreatic adenocarcinoma, hepatocellular carcinoma, and bladder carcinoma. Using summary data from 8 eastern Asian T2D GWAS with 6,952 cases and 11,865 controls, we showed 7 out of the 43 pathways identified in European populations remained to be significant in eastern Asians at the false discovery rate of 0.1. We created an R package and a web-based tool for sARTP with the capability to analyze pathways with thousands of genes and tens of thousands of SNPs.

  9. A Powerful Procedure for Pathway-Based Meta-analysis Using Summary Statistics Identifies 43 Pathways Associated with Type II Diabetes in European Populations

    PubMed Central

    Zhang, Han; Wheeler, William; Hyland, Paula L.; Yang, Yifan; Shi, Jianxin; Chatterjee, Nilanjan; Yu, Kai

    2016-01-01

    Meta-analysis of multiple genome-wide association studies (GWAS) has become an effective approach for detecting single nucleotide polymorphism (SNP) associations with complex traits. However, it is difficult to integrate the readily accessible SNP-level summary statistics from a meta-analysis into more powerful multi-marker testing procedures, which generally require individual-level genetic data. We developed a general procedure called Summary based Adaptive Rank Truncated Product (sARTP) for conducting gene and pathway meta-analysis that uses only SNP-level summary statistics in combination with genotype correlation estimated from a panel of individual-level genetic data. We demonstrated the validity and power advantage of sARTP through empirical and simulated data. We conducted a comprehensive pathway-based meta-analysis with sARTP on type 2 diabetes (T2D) by integrating SNP-level summary statistics from two large studies consisting of 19,809 T2D cases and 111,181 controls with European ancestry. Among 4,713 candidate pathways from which genes in neighborhoods of 170 GWAS established T2D loci were excluded, we detected 43 T2D globally significant pathways (with Bonferroni corrected p-values < 0.05), which included the insulin signaling pathway and T2D pathway defined by KEGG, as well as the pathways defined according to specific gene expression patterns on pancreatic adenocarcinoma, hepatocellular carcinoma, and bladder carcinoma. Using summary data from 8 eastern Asian T2D GWAS with 6,952 cases and 11,865 controls, we showed 7 out of the 43 pathways identified in European populations remained to be significant in eastern Asians at the false discovery rate of 0.1. We created an R package and a web-based tool for sARTP with the capability to analyze pathways with thousands of genes and tens of thousands of SNPs. PMID:27362418

  10. Linnorm: improved statistical analysis for single cell RNA-seq expression data.

    PubMed

    Yip, Shun H; Wang, Panwen; Kocher, Jean-Pierre A; Sham, Pak Chung; Wang, Junwen

    2017-12-15

    Linnorm is a novel normalization and transformation method for the analysis of single cell RNA sequencing (scRNA-seq) data. Linnorm is developed to remove technical noises and simultaneously preserve biological variations in scRNA-seq data, such that existing statistical methods can be improved. Using real scRNA-seq data, we compared Linnorm with existing normalization methods, including NODES, SAMstrt, SCnorm, scran, DESeq and TMM. Linnorm shows advantages in speed, technical noise removal and preservation of cell heterogeneity, which can improve existing methods in the discovery of novel subtypes, pseudo-temporal ordering of cells, clustering analysis, etc. Linnorm also performs better than existing DEG analysis methods, including BASiCS, NODES, SAMstrt, Seurat and DESeq2, in false positive rate control and accuracy. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  11. Mass univariate analysis of event-related brain potentials/fields I: a critical tutorial review.

    PubMed

    Groppe, David M; Urbach, Thomas P; Kutas, Marta

    2011-12-01

    Event-related potentials (ERPs) and magnetic fields (ERFs) are typically analyzed via ANOVAs on mean activity in a priori windows. Advances in computing power and statistics have produced an alternative, mass univariate analyses consisting of thousands of statistical tests and powerful corrections for multiple comparisons. Such analyses are most useful when one has little a priori knowledge of effect locations or latencies, and for delineating effect boundaries. Mass univariate analyses complement and, at times, obviate traditional analyses. Here we review this approach as applied to ERP/ERF data and four methods for multiple comparison correction: strong control of the familywise error rate (FWER) via permutation tests, weak control of FWER via cluster-based permutation tests, false discovery rate control, and control of the generalized FWER. We end with recommendations for their use and introduce free MATLAB software for their implementation. Copyright © 2011 Society for Psychophysiological Research.
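
    As a small illustration of two of the corrections reviewed here, the sketch below applies Benjamini-Hochberg FDR control and a max-statistic permutation test (strong FWER control) to fabricated condition-difference data; all data and thresholds are invented for the example.

```python
# Hedged sketch: FDR correction and a max-statistic permutation test applied to
# fake "ERP" difference data (subjects x time points).
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(2)
n_subj, n_points = 20, 200
diff = rng.normal(0, 1, (n_subj, n_points))
diff[:, 50:70] += 0.8                                  # a real effect in one window

t_obs, p_obs = stats.ttest_1samp(diff, 0.0, axis=0)
rejected_fdr, _, _, _ = multipletests(p_obs, alpha=0.05, method="fdr_bh")

# Max-statistic permutation: randomly flip subject signs, keep the largest |t|.
max_null = np.empty(1000)
for i in range(1000):
    signs = rng.choice([-1, 1], size=(n_subj, 1))
    max_null[i] = np.abs(stats.ttest_1samp(diff * signs, 0.0, axis=0)[0]).max()
threshold = np.quantile(max_null, 0.95)
rejected_fwer = np.abs(t_obs) > threshold
print(rejected_fdr.sum(), rejected_fwer.sum())
```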

  12. Impact of Integrated Science and English Language Arts Literacy Supplemental Instructional Intervention on Science Academic Achievement of Elementary Students

    NASA Astrophysics Data System (ADS)

    Marks, Jamar Terry

    The purpose of this quasi-experimental, nonequivalent pretest-posttest control group design study was to determine if any differences existed in upper elementary school students' science academic achievement when instructed using an 8-week integrated science and English language arts literacy supplemental instructional intervention in conjunction with traditional science classroom instruction as compared to when instructed using solely traditional science classroom instruction. The targeted sample population consisted of fourth-grade students enrolled in a public elementary school located in the southeastern region of the United States. The convenience sample size consisted of 115 fourth-grade students enrolled in science classes. The pretest and posttest academic achievement data collected consisted of the science segment from the Spring 2015, and Spring 2016 state standardized assessments. Pretest and posttest academic achievement data were analyzed using an ANCOVA statistical procedure to test for differences, and the researcher reported the results of the statistical analysis. The results of the study show no significant difference in science academic achievement between treatment and control groups. An interpretation of the results and recommendations for future research were provided by the researcher upon completion of the statistical analysis.

  13. Comparison of a non-stationary voxelation-corrected cluster-size test with TFCE for group-Level MRI inference.

    PubMed

    Li, Huanjie; Nickerson, Lisa D; Nichols, Thomas E; Gao, Jia-Hong

    2017-03-01

    Two powerful methods for statistical inference on MRI brain images have been proposed recently, a non-stationary voxelation-corrected cluster-size test (CST) based on random field theory and threshold-free cluster enhancement (TFCE) based on calculating the level of local support for a cluster, then using permutation testing for inference. Unlike other statistical approaches, these two methods do not rest on the assumptions of a uniform and high degree of spatial smoothness of the statistic image. Thus, they are strongly recommended for group-level fMRI analysis compared to other statistical methods. In this work, the non-stationary voxelation-corrected CST and TFCE methods for group-level analysis were evaluated for both stationary and non-stationary images under varying smoothness levels, degrees of freedom and signal to noise ratios. Our results suggest that both methods provide adequate control for the number of voxel-wise statistical tests being performed during inference on fMRI data and that they are both superior to current CSTs implemented in popular MRI data analysis software packages. However, TFCE is more sensitive and stable for group-level analysis of VBM data. Thus, the voxelation-corrected CST approach may confer some advantages by being computationally less demanding for fMRI data analysis than TFCE with permutation testing and by also being applicable for single-subject fMRI analyses, while the TFCE approach is advantageous for VBM data. Hum Brain Mapp 38:1269-1280, 2017. © 2016 Wiley Periodicals, Inc.

  14. Improved score statistics for meta-analysis in single-variant and gene-level association studies.

    PubMed

    Yang, Jingjing; Chen, Sai; Abecasis, Gonçalo

    2018-06-01

    Meta-analysis is now an essential tool for genetic association studies, allowing them to combine large studies and greatly accelerating the pace of genetic discovery. Although the standard meta-analysis methods perform equivalently to the more cumbersome joint analysis under ideal settings, they result in substantial power loss under unbalanced settings with various case-control ratios. Here, we investigate the power loss problem of the standard meta-analysis methods for unbalanced studies, and further propose novel meta-analysis methods performing equivalently to the joint analysis under both balanced and unbalanced settings. We derive improved meta-score-statistics that can accurately approximate the joint-score-statistics with combined individual-level data, for both linear and logistic regression models, with and without covariates. In addition, we propose a novel approach to adjust for population stratification by correcting for known population structures through minor allele frequencies. In the simulated gene-level association studies under unbalanced settings, our method recovered up to 85% of the power loss caused by the standard methods. We further showed the power gain of our methods in gene-level tests with 26 unbalanced studies of age-related macular degeneration. In addition, we took the meta-analysis of three unbalanced studies of type 2 diabetes as an example to discuss the challenges of meta-analyzing multi-ethnic samples. In summary, our improved meta-score-statistics with corrections for population stratification can be used to construct both single-variant and gene-level association studies, providing a useful framework for ensuring well-powered, convenient, cross-study analyses. © 2018 WILEY PERIODICALS, INC.

  15. Batch Statistical Process Monitoring Approach to a Cocrystallization Process.

    PubMed

    Sarraguça, Mafalda C; Ribeiro, Paulo R S; Dos Santos, Adenilson O; Lopes, João A

    2015-12-01

    Cocrystals are defined as crystalline structures composed of two or more compounds that are solid at room temperature held together by noncovalent bonds. Their main advantages are the increase of solubility, bioavailability, permeability, stability, and at the same time retaining active pharmaceutical ingredient bioactivity. The cocrystallization between furosemide and nicotinamide by solvent evaporation was monitored on-line using near-infrared spectroscopy (NIRS) as a process analytical technology tool. The near-infrared spectra were analyzed using principal component analysis. Batch statistical process monitoring was used to create control charts to perceive the process trajectory and define control limits. Normal and non-normal operating condition batches were performed and monitored with NIRS. The use of NIRS associated with batch statistical process models allowed the detection of abnormal variations in critical process parameters, like the amount of solvent or amount of initial components present in the cocrystallization. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
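
    The sketch below is a generic stand-in for the kind of multivariate control chart described here (not the authors' model): it fits PCA on spectra from normal operating batches and flags new observations whose Hotelling T2 statistic exceeds an F-based limit; the spectra and limits are simulated.

```python
# Rough sketch: PCA fitted on normal-operating-condition spectra, then a
# Hotelling T2 control chart for new observations.
import numpy as np
from scipy import stats
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
normal_spectra = rng.normal(0, 1, (60, 50))        # hypothetical NIR spectra (NOC batches)
new_spectra = rng.normal(0, 1, (10, 50))
new_spectra[-2:] += 1.5                             # simulate an abnormal batch

k = 3
pca = PCA(n_components=k).fit(normal_spectra)
scores_new = pca.transform(new_spectra)
t2 = np.sum(scores_new**2 / pca.explained_variance_, axis=1)

n = normal_spectra.shape[0]
limit = k * (n - 1) * (n + 1) / (n * (n - k)) * stats.f.ppf(0.99, k, n - k)
print("Out-of-control observations:", np.where(t2 > limit)[0])
```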

  16. Cone Beam Computed Tomography Analysis of Oropharyngeal Airway in Preadolescent Nonsyndromic Bilateral and Unilateral Cleft Lip and Palate Patients.

    PubMed

    Al-Fahdawi, Mahmood Abd; El-Kassaby, Marwa Abdelwahab; Farid, Mary Medhat; El-Fotouh, Mona Abou

    2018-01-01

    Objective: The objective of this study was to assess the volume, area, and dimensions of the oropharyngeal airway (OPA) in previously repaired nonsyndromic unilateral cleft lip and palate (UCLP) versus bilateral cleft lip and palate (BCLP) patients when compared with noncleft controls using cone beam computed tomography (CBCT). Design: This was a retrospective case-control study. Setting: The Cleft Care Center and outpatient clinic affiliated with our faculty were the settings for the study. Participants: A total of 58 CBCT scans of preadolescent individuals were selected: 14 BCLP, 20 UCLP, and 24 age- and gender-matched noncleft controls. Variables: The variables were volume, cross-sectional area (CSA), midsagittal area (MSA), and dimensions of the OPA. Statistical analysis: One-way analysis of variance and post hoc tests were used to compare variables; statistical significance was set at P ≤ .05. Results: UCLP showed significantly smaller superior oropharyngeal airway volume than both controls and BCLP (P ≤ .05). BCLP showed significantly larger CSA at the soft palate plane and significantly larger MSA than both UCLP and controls (P < .05). Conclusions: UCLP patients at the studied age, with previously repaired clefts, have significantly less superior oropharyngeal airway volume than both controls and BCLP patients, which confirms that preadolescents with UCLP are at greater risk for superior oropharyngeal airway obstruction than those with BCLP and controls. Furthermore, BCLP patients showed significantly larger CSA at the soft palate plane and MSA than both controls and UCLP patients. These variations in OPA characteristics of cleft patients can influence function in terms of respiration and vocalization.

  17. Childhood obesity in relation to poor asthma control and exacerbation: a meta-analysis.

    PubMed

    Ahmadizar, Fariba; Vijverberg, Susanne J H; Arets, Hubertus G M; de Boer, Anthonius; Lang, Jason E; Kattan, Meyer; Palmer, Colin N A; Mukhopadhyay, Somnath; Turner, Steve; Maitland-van der Zee, Anke H

    2016-10-01

    To estimate the association between obesity and poor asthma control or risk of exacerbations in asthmatic children and adolescents, and to assess whether these associations differ by sex. A meta-analysis was performed on unpublished data from three North-European paediatric asthma cohorts (BREATHE, PACMAN (Pharmacogenetics of Asthma medication in Children: Medication with Anti-inflammatory effects) and PAGES (Pediatric Asthma Gene Environment Study)) and 11 previously published studies (cross-sectional and longitudinal studies). Outcomes were poor asthma control (based on asthma symptoms) and exacerbation rates (asthma-related visits to the emergency department, asthma-related hospitalisations or use of oral corticosteroids). Overall pooled estimates of the odds ratios were obtained using fixed- or random-effects models. In a meta-analysis of 46 070 asthmatic children and adolescents, obese children (body mass index ≥95th percentile) compared with non-obese peers had a small but significant increase in the risk of asthma exacerbations (OR 1.17, 95% CI 1.03-1.34; I2: 54.7%). However, there was no statistically significant association between obesity and poor asthma control (n=4973, OR 1.23, 95% CI 0.99-1.53; I2: 0.0%). After stratification for sex, the differences in odds ratios for girls and boys were similar, yet no longer statistically significant. In asthmatic children, obesity is associated with a minor increased risk of asthma exacerbations but not with poor asthma control. Sex does not appear to modify this risk. Copyright ©ERS 2016.
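
    For readers unfamiliar with the pooling step, the sketch below shows fixed-effect inverse-variance pooling of log odds ratios with made-up per-study estimates; the random-effects variant used in such analyses additionally adds a between-study variance term to each weight.

```python
# Minimal sketch of fixed-effect inverse-variance pooling of odds ratios,
# with hypothetical per-study ORs and 95% confidence intervals.
import numpy as np

or_i = np.array([1.10, 1.25, 1.05])                     # hypothetical study ORs
ci_low = np.array([0.90, 1.02, 0.85])
ci_high = np.array([1.34, 1.53, 1.30])

log_or = np.log(or_i)
se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)    # back out SE from the 95% CI
w = 1 / se**2                                           # inverse-variance weights
pooled = np.sum(w * log_or) / np.sum(w)
pooled_se = np.sqrt(1 / np.sum(w))
print("Pooled OR: %.2f (95%% CI %.2f-%.2f)" % (
    np.exp(pooled),
    np.exp(pooled - 1.96 * pooled_se),
    np.exp(pooled + 1.96 * pooled_se),
))
```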

  18. Measuring skin necrosis in a randomised controlled feasibility trial of heat preconditioning on wound healing after reconstructive breast surgery: study protocol and statistical analysis plan for the PREHEAT trial.

    PubMed

    Cro, Suzie; Mehta, Saahil; Farhadi, Jian; Coomber, Billie; Cornelius, Victoria

    2018-01-01

    Essential strategies are needed to help reduce the number of post-operative complications and associated costs for breast cancer patients undergoing reconstructive breast surgery. Evidence suggests that local heat preconditioning could help improve the provision of this procedure by reducing skin necrosis. Before testing the effectiveness of heat preconditioning in a definitive randomised controlled trial (RCT), we must first establish the best way to measure skin necrosis and estimate the event rate using this definition. PREHEAT is a single-blind randomised controlled feasibility trial comparing local heat preconditioning, using a hot water bottle, against standard care on skin necrosis among breast cancer patients undergoing reconstructive breast surgery. The primary objective of this study is to determine the best way to measure skin necrosis and to estimate the event rate using this definition in each trial arm. Secondary feasibility objectives include estimating recruitment and 30 day follow-up retention rates, levels of compliance with the heating protocol, length of stay in hospital and the rates of surgical versus conservative management of skin necrosis. The information from these objectives will inform the design of a larger definitive effectiveness and cost-effectiveness RCT. This article describes the PREHEAT trial protocol and detailed statistical analysis plan, which includes the pre-specified criteria and process for establishing the best way to measure necrosis. This study will provide the evidence needed to establish the best way to measure skin necrosis, to use as the primary outcome in a future RCT to definitively test the effectiveness of local heat preconditioning. The pre-specified statistical analysis plan, developed prior to unblinded data extraction, sets out the analysis strategy and a comparative framework to support a committee evaluation of skin necrosis measurements. It will increase the transparency of the data analysis for the PREHEAT trial. ISRCTN ISRCTN15744669. Registered 25 February 2015.

  19. Carbon film coating of abutment surfaces: effect on the abutment screw removal torque.

    PubMed

    Corazza, Pedro Henrique; de Moura Silva, Alecsandro; Cavalcanti Queiroz, José Renato; Salazar Marocho, Susana María; Bottino, Marco Antonia; Massi, Marcos; de Assunção e Souza, Rodrigo Othávio

    2014-08-01

    To evaluate the effect of diamond-like carbon (DLC) coating of prefabricated implant abutments on screw removal torque (RT) before and after mechanical cycling (MC). Fifty-four abutments for external-hex implants were divided among 6 groups (n = 9): S, straight abutment (control); SC, straight coated abutment; SCy, straight abutment and MC; SCCy, straight coated abutment and MC; ACy, angled abutment and MC; and ACCy, angled coated abutment and MC. The abutments were attached to the implants by a titanium screw, and RT values were measured and recorded. Data (in Newton centimeters) were analyzed with analysis of variance and Dunnett's test (α = 0.05). RT values were significantly affected by MC (P = 0.001) and by the interaction between DLC coating and MC (P = 0.038). SCy and ACy showed the lowest RT values, statistically different from the control, whereas the coated abutment groups showed no statistically significant difference from the control. Scanning electron microscopy analysis showed DLC film with a thickness of 3 μm uniformly coating the hexagonal abutment. DLC film deposited on the abutment can be used as an alternative procedure to reduce abutment screw loosening.

  20. Study design and statistical analysis of data in human population studies with the micronucleus assay.

    PubMed

    Ceppi, Marcello; Gallo, Fabio; Bonassi, Stefano

    2011-01-01

    The most common study design performed in population studies based on the micronucleus (MN) assay is the cross-sectional study, which is largely performed to evaluate the DNA damaging effects of exposure to genotoxic agents in the workplace, in the environment, as well as from diet or lifestyle factors. Sample size is still a critical issue in the design of MN studies, since most recent studies considering gene-environment interaction often require a sample size of several hundred subjects, which is in many cases difficult to achieve. The control of confounding is another major threat to the validity of causal inference. The most popular confounders considered in population studies using MN are age, gender and smoking habit. Extensive attention is given to the assessment of effect modification, given the increasing inclusion of biomarkers of genetic susceptibility in the study design. Selected issues concerning the statistical treatment of data have been addressed in this mini-review, starting from data description, which is a critical step of statistical analysis, since it allows the detection of possible errors in the dataset to be analysed and checks on the validity of the assumptions required for more complex analyses. Basic issues dealing with the statistical analysis of biomarkers are extensively evaluated, including methods to explore the dose-response relationship between two continuous variables and inferential analysis. A critical approach to the use of parametric and non-parametric methods is presented, before addressing the issue of the most suitable multivariate models to fit MN data. In the last decade, the quality of statistical analysis of MN data has certainly evolved, although even nowadays only a small number of studies apply the Poisson model, which is the most suitable method for the analysis of MN data.
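
    Since the review singles out the Poisson model as the most suitable choice for MN counts, the sketch below fits a Poisson regression of simulated micronucleus frequencies on an exposure indicator, adjusting for the usual confounders; all variable names and coefficients are invented for illustration.

```python
# Hedged sketch: Poisson regression of MN counts on exposure, adjusted for
# age, sex and smoking, using statsmodels with simulated data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 200
df = pd.DataFrame({
    "age": rng.integers(20, 65, n),
    "male": rng.integers(0, 2, n),
    "smoker": rng.integers(0, 2, n),
    "exposed": rng.integers(0, 2, n),
})
rate = np.exp(0.3 + 0.01 * df["age"] + 0.2 * df["smoker"] + 0.3 * df["exposed"])
df["mn_count"] = rng.poisson(rate)

fit = smf.glm("mn_count ~ exposed + age + male + smoker",
              data=df, family=sm.families.Poisson()).fit()
print(np.exp(fit.params["exposed"]))   # frequency ratio for the exposed group
```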

  1. Statistical analysis of long-term monitoring data for persistent organic pollutants in the atmosphere at 20 monitoring stations broadly indicates declining concentrations.

    PubMed

    Kong, Deguo; MacLeod, Matthew; Hung, Hayley; Cousins, Ian T

    2014-11-04

    During recent decades concentrations of persistent organic pollutants (POPs) in the atmosphere have been monitored at multiple stations worldwide. We used three statistical methods to analyze a total of 748 time series of selected POPs in the atmosphere to determine if there are statistically significant reductions in levels of POPs that have had control actions enacted to restrict or eliminate manufacture, use and emissions. Significant decreasing trends were identified in 560 (75%) of the 748 time series collected from the Arctic, North America, and Europe, indicating that the atmospheric concentrations of these POPs are generally decreasing, consistent with the overall effectiveness of emission control actions. Statistically significant trends in synthetic time series could be reliably identified with the improved Mann-Kendall (iMK) test and the digital filtration (DF) technique in time series longer than 5 years. The temporal trends of new (or emerging) POPs in the atmosphere are often unclear because time series are too short. A statistical detrending method based on the iMK test was not able to identify abrupt changes in the rates of decline of atmospheric POP concentrations encoded into synthetic time series.
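
    As a rough illustration of the trend testing applied to each series, the sketch below implements a simplified Mann-Kendall test (no tie or autocorrelation correction, unlike the improved iMK test used in the study) on a synthetic declining concentration series.

```python
# Simplified Mann-Kendall monotonic-trend test on a synthetic POP time series.
import numpy as np
from scipy import stats

def mann_kendall(x):
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0          # variance assuming no ties
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    p = 2 * stats.norm.sf(abs(z))
    return z, p

years = np.arange(2000, 2014)
noise = np.random.default_rng(5).normal(0, 0.1, len(years))
conc = 10 * np.exp(-0.08 * (years - 2000)) * np.exp(noise)   # declining concentrations
print(mann_kendall(np.log(conc)))    # negative z indicates a significant decline
```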

  2. Mean template for tensor-based morphometry using deformation tensors.

    PubMed

    Leporé, Natasha; Brun, Caroline; Pennec, Xavier; Chou, Yi-Yu; Lopez, Oscar L; Aizenstein, Howard J; Becker, James T; Toga, Arthur W; Thompson, Paul M

    2007-01-01

    Tensor-based morphometry (TBM) studies anatomical differences between brain images statistically, to identify regions that differ between groups, over time, or correlate with cognitive or clinical measures. Using a nonlinear registration algorithm, all images are mapped to a common space, and statistics are most commonly performed on the Jacobian determinant (local expansion factor) of the deformation fields. It was previously shown that the detection sensitivity of the standard TBM approach could be increased by using the full deformation tensors in a multivariate statistical analysis. Here we set out to improve the common space itself, by choosing the shape that minimizes a natural metric on the deformation tensors from that space to the population of control subjects. This method avoids statistical bias and should ease nonlinear registration of new subjects' data to a template that is 'closest' to all subjects' anatomies. As deformation tensors are symmetric positive-definite matrices and do not form a vector space, all computations are performed in the log-Euclidean framework. The control brain B that is already the closest to 'average' is found. A gradient descent algorithm is then used to perform the minimization that iteratively deforms this template and obtains the mean shape. We apply our method to map the profile of anatomical differences in a dataset of 26 HIV/AIDS patients and 14 controls, via a log-Euclidean Hotelling's T2 test on the deformation tensors. These results are compared to the ones found using the 'best' control, B. Statistics on both shapes are evaluated using cumulative distribution functions of the p-values in maps of inter-group differences.
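
    The log-Euclidean averaging at the heart of this framework can be sketched in a few lines: take matrix logarithms of the symmetric positive-definite tensors, average in that vector space, and map back with the matrix exponential. The example below uses random SPD matrices purely for illustration.

```python
# Sketch of the log-Euclidean mean of symmetric positive-definite tensors.
import numpy as np
from scipy.linalg import expm, logm

def log_euclidean_mean(tensors):
    logs = [logm(t) for t in tensors]       # map SPD matrices to the log domain
    return expm(np.mean(logs, axis=0))      # average, then map back

rng = np.random.default_rng(6)
def random_spd(d=3):
    a = rng.normal(size=(d, d))
    return a @ a.T + d * np.eye(d)          # guaranteed symmetric positive-definite

tensors = [random_spd() for _ in range(10)]
print(log_euclidean_mean(tensors))
```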

  3. Diagnostic index of 3D osteoarthritic changes in TMJ condylar morphology

    NASA Astrophysics Data System (ADS)

    Gomes, Liliane R.; Gomes, Marcelo; Jung, Bryan; Paniagua, Beatriz; Ruellas, Antonio C.; Gonçalves, João. Roberto; Styner, Martin A.; Wolford, Larry; Cevidanes, Lucia

    2015-03-01

    The aim of this study was to investigate imaging statistical approaches for classifying 3D osteoarthritic morphological variations among 169 Temporomandibular Joint (TMJ) condyles. Cone beam Computed Tomography (CBCT) scans were acquired from 69 patients with long-term TMJ Osteoarthritis (OA) (39.1 ± 15.7 years), 15 patients at initial diagnosis of OA (44.9 ± 14.8 years) and 7 healthy controls (43 ± 12.4 years). 3D surface models of the condyles were constructed and Shape Correspondence was used to establish correspondent points on each model. The statistical framework included a multivariate analysis of covariance (MANCOVA) and Direction-Projection-Permutation (DiProPerm) for testing the statistical significance of the differences between the healthy control group and the OA group determined by clinical and radiographic diagnoses. Unsupervised classification using hierarchical agglomerative clustering (HAC) was then conducted. Condylar morphology in OA and healthy subjects varied widely. Compared with healthy controls, the OA average condyle was statistically significantly smaller in all dimensions except its anterior surface. Significant flattening of the lateral pole was noticed at initial diagnosis (p < 0.05). Areas of 3.88 mm of bone resorption at the superior surface and 3.10 mm of bone apposition at the anterior aspect were observed in the long-term OA average model. DiProPerm statistics based on 1000 permutations supported a significant difference between the healthy control group and the OA group (t = 6.7, empirical p-value = 0.001). Clinically meaningful unsupervised classification of TMJ condylar morphology yielded a preliminary diagnostic index of 3D osteoarthritic changes, which may be the first step towards a more targeted diagnosis of this condition.

  4. Learning and understanding the Kruskal-Wallis one-way analysis-of-variance-by-ranks test for differences among three or more independent groups.

    PubMed

    Chan, Y; Walmsley, R P

    1997-12-01

    When several treatment methods are available for the same problem, many clinicians are faced with the task of deciding which treatment to use. Many clinicians may have conducted informal "mini-experiments" on their own to determine which treatment is best suited for the problem. These results are usually not documented or reported in a formal manner because many clinicians feel that they are "statistically challenged." Another reason may be because clinicians do not feel they have controlled enough test conditions to warrant analysis. In this update, a statistic is described that does not involve complicated statistical assumptions, making it a simple and easy-to-use statistical method. This update examines the use of two statistics and does not deal with other issues that could affect clinical research such as issues affecting credibility. For readers who want a more in-depth examination of this topic, references have been provided. The Kruskal-Wallis one-way analysis-of-variance-by-ranks test (or H test) is used to determine whether three or more independent groups are the same or different on some variable of interest when an ordinal level of data or an interval or ratio level of data is available. A hypothetical example will be presented to explain when and how to use this statistic, how to interpret results using the statistic, the advantages and disadvantages of the statistic, and what to look for in a written report. This hypothetical example will involve the use of ratio data to demonstrate how to choose between using the nonparametric H test and the more powerful parametric F test.
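
    A minimal version of the H test described in this tutorial can be run in a few lines; the groups below are hypothetical clinical measurements invented for the example.

```python
# Quick sketch of the Kruskal-Wallis H test on three independent groups.
from scipy import stats

group_a = [27, 30, 28, 35, 33, 29]      # e.g., outcome scores after treatment A
group_b = [31, 36, 34, 38, 32, 37]
group_c = [25, 26, 29, 27, 24, 28]

h, p = stats.kruskal(group_a, group_b, group_c)
print(f"H = {h:.2f}, p = {p:.4f}")      # reject H0 of identical distributions if p < 0.05
```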

  5. Statistical transformation and the interpretation of inpatient glucose control data.

    PubMed

    Saulnier, George E; Castro, Janna C; Cook, Curtiss B

    2014-03-01

    To introduce a statistical method of assessing hospital-based non-intensive care unit (non-ICU) inpatient glucose control. Point-of-care blood glucose (POC-BG) data from hospital non-ICUs were extracted for January 1 through December 31, 2011. Glucose data distribution was examined before and after Box-Cox transformations and compared to normality. Different subsets of data were used to establish upper and lower control limits, and exponentially weighted moving average (EWMA) control charts were constructed from June, July, and October data as examples to determine if out-of-control events were identified differently in nontransformed versus transformed data. A total of 36,381 POC-BG values were analyzed. In all 3 monthly test samples, glucose distributions in nontransformed data were skewed but approached a normal distribution once transformed. Interpretation of out-of-control events from EWMA control chart analyses also revealed differences. In the June test data, an out-of-control process was identified at sample 53 with nontransformed data, whereas the transformed data remained in control for the duration of the observed period. Analysis of July data demonstrated an out-of-control process sooner in the transformed (sample 55) than nontransformed (sample 111) data, whereas for October, transformed data remained in control longer than nontransformed data. Statistical transformations increase the normal behavior of inpatient non-ICU glycemic data sets. The decision to transform glucose data could influence the interpretation and conclusions about the status of inpatient glycemic control. Further study is required to determine whether transformed versus nontransformed data influence clinical decisions or evaluation of interventions.
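
    The workflow the article describes can be sketched as follows, assuming simulated skewed glucose values rather than actual POC-BG data: Box-Cox-transform the values, then chart their exponentially weighted moving average against approximate three-sigma limits.

```python
# Hedged sketch: Box-Cox transformation of skewed glucose values followed by
# an EWMA control chart with asymptotic 3-sigma limits.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
glucose = rng.lognormal(mean=5.0, sigma=0.3, size=500)        # skewed POC-BG-like values
transformed, lam = stats.boxcox(glucose)

lam_w, target, sigma = 0.2, transformed.mean(), transformed.std(ddof=1)
ewma = np.zeros_like(transformed)
ewma[0] = target
for t in range(1, len(transformed)):
    ewma[t] = lam_w * transformed[t] + (1 - lam_w) * ewma[t - 1]

se = sigma * np.sqrt(lam_w / (2 - lam_w))      # asymptotic EWMA standard error
out_of_control = np.where(np.abs(ewma - target) > 3 * se)[0]
print("First out-of-control sample:", out_of_control[0] if out_of_control.size else None)
```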

  6. Combining synthetic controls and interrupted time series analysis to improve causal inference in program evaluation.

    PubMed

    Linden, Ariel

    2018-04-01

    Interrupted time series analysis (ITSA) is an evaluation methodology in which a single treatment unit's outcome is studied over time and the intervention is expected to "interrupt" the level and/or trend of the outcome. The internal validity is strengthened considerably when the treated unit is contrasted with a comparable control group. In this paper, we introduce a robust evaluation framework that combines the synthetic controls method (SYNTH) to generate a comparable control group and ITSA regression to assess covariate balance and estimate treatment effects. We evaluate the effect of California's Proposition 99 for reducing cigarette sales, by comparing California to other states not exposed to smoking reduction initiatives. SYNTH is used to reweight nontreated units to make them comparable to the treated unit. These weights are then used in ITSA regression models to assess covariate balance and estimate treatment effects. Covariate balance was achieved for all but one covariate. While California experienced a significant decrease in the annual trend of cigarette sales after Proposition 99, there was no statistically significant treatment effect when compared to synthetic controls. The advantage of using this framework over regression alone is that it ensures that a comparable control group is generated. Additionally, it offers a common set of statistical measures familiar to investigators, the capability for assessing covariate balance, and enhancement of the evaluation with a comprehensive set of postestimation measures. Therefore, this robust framework should be considered as a primary approach for evaluating treatment effects in multiple group time series analysis. © 2018 John Wiley & Sons, Ltd.
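
    To show the ITSA regression component in isolation (the paper's framework additionally reweights control units with synthetic controls), the sketch below fits a single-group segmented regression with level-change and trend-change terms on simulated data.

```python
# Minimal single-group ITSA sketch: segmented regression with level- and
# trend-change terms at the intervention point, on simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(8)
t = np.arange(30)                      # e.g., 30 yearly observations
post = (t >= 18).astype(int)           # intervention occurs at t = 18
y = 120 - 1.0 * t - 8 * post - 1.5 * post * (t - 18) + rng.normal(0, 3, t.size)

df = pd.DataFrame({"y": y, "t": t, "post": post, "t_since": post * (t - 18)})
fit = smf.ols("y ~ t + post + t_since", data=df).fit()
print(fit.params[["post", "t_since"]])   # immediate level change and trend change
```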

  7. Two-Year versus One-Year Head Start Program Impact: Addressing Selection Bias by Comparing Regression Modeling with Propensity Score Analysis

    ERIC Educational Resources Information Center

    Leow, Christine; Wen, Xiaoli; Korfmacher, Jon

    2015-01-01

    This article compares regression modeling and propensity score analysis as different types of statistical techniques used in addressing selection bias when estimating the impact of two-year versus one-year Head Start on children's school readiness. The analyses were based on the national Head Start secondary dataset. After controlling for…

  8. Defect Analysis Of Quality Palm Kernel Meal Using Statistical Quality Control In Kernels Factory

    NASA Astrophysics Data System (ADS)

    Sembiring, M. T.; Marbun, N. J.

    2018-04-01

    Production quality matters because it determines whether the totality of characteristics of a product or service meets the requirements that have been established. The quality criteria for Palm Kernel Meal (PKM) set by the kernel factory are: oil content max 8.50%, water content max 12.00% and impurity content max 4.00%, whereas the average measured quality was 8.94% oil content, 5.51% water content and 8.45% impurity content. To identify defects in the quality of the PKM produced, an analysis using Statistical Quality Control (SQC) was carried out. The factory's PKM exceeded the specified maximum oil content by 0.44% and the impurity content by 4.50%; this excess oil and impurity content corresponds to 854.6078 kg of defective PKM for oil content and 8643.193 kg for impurity content. The results of the cause-and-effect diagram and the SQC analysis indicate that the factors leading to poor PKM quality are the amperage and the operating hours of the second-press oil expeller.

  9. An information-theoretic approach to the modeling and analysis of whole-genome bisulfite sequencing data.

    PubMed

    Jenkinson, Garrett; Abante, Jordi; Feinberg, Andrew P; Goutsias, John

    2018-03-07

    DNA methylation is a stable form of epigenetic memory used by cells to control gene expression. Whole genome bisulfite sequencing (WGBS) has emerged as a gold-standard experimental technique for studying DNA methylation by producing high resolution genome-wide methylation profiles. Statistical modeling and analysis is employed to computationally extract and quantify information from these profiles in an effort to identify regions of the genome that demonstrate crucial or aberrant epigenetic behavior. However, the performance of most currently available methods for methylation analysis is hampered by their inability to directly account for statistical dependencies between neighboring methylation sites, thus ignoring significant information available in WGBS reads. We present a powerful information-theoretic approach for genome-wide modeling and analysis of WGBS data based on the 1D Ising model of statistical physics. This approach takes into account correlations in methylation by utilizing a joint probability model that encapsulates all information available in WGBS methylation reads and produces accurate results even when applied on single WGBS samples with low coverage. Using the Shannon entropy, our approach provides a rigorous quantification of methylation stochasticity in individual WGBS samples genome-wide. Furthermore, it utilizes the Jensen-Shannon distance to evaluate differences in methylation distributions between a test and a reference sample. Differential performance assessment using simulated and real human lung normal/cancer data demonstrate a clear superiority of our approach over DSS, a recently proposed method for WGBS data analysis. Critically, these results demonstrate that marginal methods become statistically invalid when correlations are present in the data. This contribution demonstrates clear benefits and the necessity of modeling joint probability distributions of methylation using the 1D Ising model of statistical physics and of quantifying methylation stochasticity using concepts from information theory. By employing this methodology, substantial improvement of DNA methylation analysis can be achieved by effectively taking into account the massive amount of statistical information available in WGBS data, which is largely ignored by existing methods.
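
    The two information-theoretic quantities named here can be illustrated with a toy example: Shannon entropy of a methylation-level distribution as a measure of stochasticity, and the Jensen-Shannon distance between a test and a reference distribution. The probability vectors below are invented for the example.

```python
# Small sketch of the information-theoretic quantities mentioned above.
import numpy as np
from scipy.spatial.distance import jensenshannon
from scipy.stats import entropy

# Hypothetical probability distributions over methylation states within a region
p_test = np.array([0.70, 0.20, 0.10])
p_ref = np.array([0.30, 0.40, 0.30])

print("Entropy (test):", entropy(p_test, base=2))            # methylation stochasticity
print("JS distance:", jensenshannon(p_test, p_ref, base=2))  # dissimilarity in [0, 1]
```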

  10. Significant Association of Urinary Toxic Metals and Autism-Related Symptoms—A Nonlinear Statistical Analysis with Cross Validation

    PubMed Central

    Adams, James; Kruger, Uwe; Geis, Elizabeth; Gehn, Eva; Fimbres, Valeria; Pollard, Elena; Mitchell, Jessica; Ingram, Julie; Hellmers, Robert; Quig, David; Hahn, Juergen

    2017-01-01

    Introduction A number of previous studies examined a possible association of toxic metals and autism, and over half of those studies suggest that toxic metal levels are different in individuals with Autism Spectrum Disorders (ASD). Additionally, several studies found that those levels correlate with the severity of ASD. Methods In order to further investigate these points, this paper performs the most detailed statistical analysis to date of a data set in this field. First morning urine samples were collected from 67 children and adults with ASD and 50 neurotypical controls of similar age and gender. The samples were analyzed to determine the levels of 10 urinary toxic metals (UTM). Autism-related symptoms were assessed with eleven behavioral measures. Statistical analysis was used to distinguish participants on the ASD spectrum and neurotypical participants based upon the UTM data alone. The analysis also included examining the association of autism severity with toxic metal excretion data using linear and nonlinear analysis. “Leave-one-out” cross-validation was used to ensure statistical independence of results. Results and Discussion Average excretion levels of several toxic metals (lead, tin, thallium, antimony) were significantly higher in the ASD group. However, ASD classification using univariate statistics proved difficult due to large variability, but nonlinear multivariate statistical analysis significantly improved ASD classification with Type I/II errors of 15% and 18%, respectively. These results clearly indicate that the urinary toxic metal excretion profiles of participants in the ASD group were significantly different from those of the neurotypical participants. Similarly, nonlinear methods determined a significantly stronger association between the behavioral measures and toxic metal excretion. The association was strongest for the Aberrant Behavior Checklist (including subscales on Irritability, Stereotypy, Hyperactivity, and Inappropriate Speech), but significant associations were found for UTM with all eleven autism-related assessments with cross-validation R2 values ranging from 0.12–0.48. PMID:28068407
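
    The classification-with-cross-validation step can be sketched generically as below: a nonlinear classifier evaluated with leave-one-out cross-validation on made-up features shaped like the study's two groups; the data, model choice and accuracy are illustrative only, not the study's results.

```python
# Hedged sketch: leave-one-out cross-validated nonlinear classification on
# synthetic urinary-toxic-metal-style features.
import numpy as np
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(9)
X = np.vstack([rng.normal(0.0, 1.0, (50, 10)),      # "control" group
               rng.normal(0.6, 1.2, (67, 10))])     # "case" group, shifted and more variable
y = np.r_[np.zeros(50), np.ones(67)]

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
acc = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()
print(f"LOO accuracy: {acc:.2f}")
```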

  11. Serum Levels of 25-hydroxyvitamin D in Chronic Urticaria and its Association with Disease Activity: A Case Control Study.

    PubMed

    Rather, Shagufta; Keen, Abid; Sajad, Peerzada

    2018-01-01

    To evaluate the relationship between vitamin D levels and chronic spontaneous urticaria (CSU) and compare with healthy age- and sex-matched controls. This was a hospital-based cross-sectional study conducted over a period of 1 year, in which 110 patients with CSU were recruited along with an equal number of sex- and age-matched healthy controls. For each patient, the urticaria activity score (UAS) was calculated and an autologous serum skin test (ASST) was performed. Plasma 25-hydroxyvitamin D [25-(OH)D] was analyzed by the chemiluminescence method. A deficiency in vitamin D was defined as serum 25-(OH)D concentrations <30 ng/mL. The statistical analysis was carried out using appropriate statistical tests. The mean serum 25-(OH)D level of CSU patients was 19.6 ± 6.9 ng/mL, whereas in the control group the mean level was 38.5 ± 6.7 ng/mL, the difference being statistically significant (P < 0.001). A significant negative correlation was found between vitamin D levels and UAS (P < 0.001). The number of patients with ASST positivity was 44 (40%). The patients with CSU had reduced levels of vitamin D when compared to healthy controls. Furthermore, there was a significant negative correlation between the levels of serum vitamin D and severity of CSU.

  12. Vibroacoustic optimization using a statistical energy analysis model

    NASA Astrophysics Data System (ADS)

    Culla, Antonio; D'Ambrogio, Walter; Fregolent, Annalisa; Milana, Silvia

    2016-08-01

    In this paper, an optimization technique for medium-high frequency dynamic problems based on Statistical Energy Analysis (SEA) method is presented. Using a SEA model, the subsystem energies are controlled by internal loss factors (ILF) and coupling loss factors (CLF), which in turn depend on the physical parameters of the subsystems. A preliminary sensitivity analysis of subsystem energy to CLF's is performed to select CLF's that are most effective on subsystem energies. Since the injected power depends not only on the external loads but on the physical parameters of the subsystems as well, it must be taken into account under certain conditions. This is accomplished in the optimization procedure, where approximate relationships between CLF's, injected power and physical parameters are derived. The approach is applied on a typical aeronautical structure: the cabin of a helicopter.

  13. Processes and subdivisions in diogenites, a multivariate statistical analysis

    NASA Technical Reports Server (NTRS)

    Harriott, T. A.; Hewins, R. H.

    1984-01-01

    Multivariate statistical techniques used on diogenite orthopyroxene analyses show the relationships that occur within diogenites and the two orthopyroxenite components (class I and II) in the polymict diogenite Garland. Cluster analysis shows that only Peckelsheim is similar to Garland class I (Fe-rich) and the other diogenites resemble Garland class II. The unique diogenite Y 75032 may be related to type I by fractionation. Factor analysis confirms the subdivision and shows that Fe does not correlate with the weakly incompatible elements across the entire pyroxene composition range, indicating that igneous fractionation is not the process controlling total diogenite composition variation. The occurrence of two groups of diogenites is interpreted as the result of sampling or mixing of two main sequences of orthopyroxene cumulates with slightly different compositions.

  14. Common statistical and research design problems in manuscripts submitted to high-impact psychiatry journals: what editors and reviewers want authors to know.

    PubMed

    Harris, Alex H S; Reeder, Rachelle; Hyun, Jenny K

    2009-10-01

    Journal editors and statistical reviewers are often in the difficult position of catching serious problems in submitted manuscripts after the research is conducted and data have been analyzed. We sought to learn from editors and reviewers of major psychiatry journals what common statistical and design problems they most often find in submitted manuscripts and what they wished to communicate to authors regarding these issues. Our primary goal was to facilitate communication between journal editors/reviewers and researchers/authors and thereby improve the scientific and statistical quality of research and submitted manuscripts. Editors and statistical reviewers of 54 high-impact psychiatry journals were surveyed to learn what statistical or design problems they encounter most often in submitted manuscripts. Respondents completed the survey online. The authors analyzed survey text responses using content analysis procedures to identify major themes related to commonly encountered statistical or research design problems. Editors and reviewers (n=15) who handle manuscripts from 39 different high-impact psychiatry journals responded to the survey. The most commonly cited problems regarded failure to map statistical models onto research questions, improper handling of missing data, not controlling for multiple comparisons, not understanding the difference between equivalence and difference trials, and poor controls in quasi-experimental designs. The scientific quality of psychiatry research and submitted reports could be greatly improved if researchers became sensitive to, or sought consultation on frequently encountered methodological and analytic issues.

  15. A comparative evaluation of dental caries status among hearing-impaired and normal children of Malda, West Bengal, evaluated with the Caries Assessment Spectrum and Treatment.

    PubMed

    Kar, Sudipta; Kundu, Goutam; Maiti, Shyamal Kumar; Ghosh, Chiranjit; Bazmi, Badruddin Ahamed; Mukhopadhyay, Santanu

    2016-01-01

    Dental caries is one of the major modern-day diseases of dental hard tissue. It may affect both normal and hearing-impaired children. This study aimed to evaluate and compare the prevalence of dental caries in hearing-impaired and normal children of Malda, West Bengal, utilizing the Caries Assessment Spectrum and Treatment (CAST). In a cross-sectional, case-control study, the dental caries status of 6-12-year-old children was assessed. Statistical analysis was carried out utilizing the Z-test. A statistically significant difference was found between the studied (hearing-impaired) and control (normal children) groups: about 30.51% of hearing-impaired children were affected by caries, compared to 15.81% of normal children (P < 0.05). Regarding individual caries assessment criteria, nearly all subgroups showed a statistically significant difference, except the sealed tooth structure, internal caries-related discoloration in dentin, and distinct cavitation into dentine groups. The dental health of hearing-impaired children was found to be less satisfactory than that of normal children with respect to dental caries status evaluated with CAST.

  16. Effects of far-infrared irradiation on myofascial neck pain: a randomized, double-blind, placebo-controlled pilot study.

    PubMed

    Lai, Chien-Hung; Leung, Ting-Kai; Peng, Chih-Wei; Chang, Kwang-Hwa; Lai, Ming-Jun; Lai, Wen-Fu; Chen, Shih-Ching

    2014-02-01

    The objective of this study was to determine the relative efficacy of irradiation using a device containing a far-infrared emitting ceramic powder (cFIR) for the management of chronic myofascial neck pain compared with a control treatment. This was a randomized, double-blind, placebo-controlled pilot study. The study comprised 48 patients with chronic, myofascial neck pain. Patients were randomly assigned to the experimental group or the control (sham-treatment) group. The patients in the experimental group wore a cFIR neck device for 1 week, and the control group wore an inert neck device for 1 week. Quantitative measurements based on a visual analogue scale (VAS) scoring of pain, a sleep quality assessment, pressure-pain threshold (PPT) testing, muscle tone and compliance analysis, and skin temperature analysis were obtained. Both the experimental and control groups demonstrated significant improvement in pain scores. However, no statistically significant difference in the pain scores was observed between the experimental and control groups. Significant decreases in muscle stiffness in the upper regions of the trapezius muscles were reported in the experimental group after 1 week of treatment. Short-term treatment using the cFIR neck device partly reduced muscle stiffness. Although the differences in the VAS and PPT scores for the experimental and control groups were not statistically significant, the improvement in muscle stiffness in the experimental group warrants further investigation of the long-term effects of cFIR treatment for pain management.

  17. [How reliable is the monitoring for doping?].

    PubMed

    Hüsler, J

    1990-12-01

    The reliability of doping control, that is, of the chemical analysis of urine samples in the accredited laboratories and the resulting decisions, is discussed using probabilistic and statistical methods. Essentially, we evaluated and estimated the positive predictive value, meaning the probability that a urine sample contains prohibited doping substances given a positive test decision. Since there are no statistical data or evidence for some important quantities related to the predictive value, an exact evaluation is not possible; only conservative lower bounds can be given. We found that the predictive value is at least 90% or 95% with respect to the analysis and decision based on the A sample only, and at least 99% with respect to both the A and B samples. A more realistic assessment, though without sufficient statistical confidence, suggests that the true predictive value is considerably larger than these lower estimates.
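    The positive predictive value discussed in this record follows directly from Bayes' theorem. The sketch below illustrates the calculation with sensitivity, specificity and prevalence figures chosen purely for illustration; they are not the quantities estimated in the article.

```python
def positive_predictive_value(sensitivity, specificity, prevalence):
    """P(substance actually present | positive test), by Bayes' theorem."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# Illustrative values only: even a highly specific assay yields a modest PPV
# when the prevalence of doping among tested athletes is low.
for prevalence in (0.02, 0.05, 0.10):
    ppv = positive_predictive_value(sensitivity=0.95, specificity=0.995, prevalence=prevalence)
    print(f"prevalence={prevalence:.2f}  PPV={ppv:.3f}")
```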

  18. A PLSPM-Based Test Statistic for Detecting Gene-Gene Co-Association in Genome-Wide Association Study with Case-Control Design

    PubMed Central

    Zhang, Xiaoshuai; Yang, Xiaowei; Yuan, Zhongshang; Liu, Yanxun; Li, Fangyu; Peng, Bin; Zhu, Dianwen; Zhao, Jinghua; Xue, Fuzhong

    2013-01-01

    In genome-wide association data analysis, two genes in a pathway, or two SNPs located in two linked gene regions or in two linked exons within one gene, are often correlated with each other. We therefore proposed the concept of gene-gene co-association, which refers to effects arising not only from the traditional interaction under a nearly independent condition but also from the correlation between two genes. Furthermore, we constructed a novel statistic for detecting gene-gene co-association based on Partial Least Squares Path Modeling (PLSPM). Through simulation, the relationship between traditional interaction and co-association was highlighted under three different types of co-association. Both simulation and real data analysis demonstrated that the proposed PLSPM-based statistic performs better than a single SNP-based logistic model, a PCA-based logistic model, and other gene-based methods. PMID:23620809

  19. A PLSPM-based test statistic for detecting gene-gene co-association in genome-wide association study with case-control design.

    PubMed

    Zhang, Xiaoshuai; Yang, Xiaowei; Yuan, Zhongshang; Liu, Yanxun; Li, Fangyu; Peng, Bin; Zhu, Dianwen; Zhao, Jinghua; Xue, Fuzhong

    2013-01-01

    In genome-wide association data analysis, two genes in a pathway, or two SNPs located in two linked gene regions or in two linked exons within one gene, are often correlated with each other. We therefore proposed the concept of gene-gene co-association, which refers to effects arising not only from the traditional interaction under a nearly independent condition but also from the correlation between two genes. Furthermore, we constructed a novel statistic for detecting gene-gene co-association based on Partial Least Squares Path Modeling (PLSPM). Through simulation, the relationship between traditional interaction and co-association was highlighted under three different types of co-association. Both simulation and real data analysis demonstrated that the proposed PLSPM-based statistic performs better than a single SNP-based logistic model, a PCA-based logistic model, and other gene-based methods.
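    The PLSPM-based statistic itself is not reproduced here, but the underlying notion of gene-gene co-association (dependence between two genes' summary scores rather than a single-SNP interaction) can be illustrated with a simplified, hypothetical sketch: each gene is reduced to a latent score and the correlation between the two scores is tested by permutation.

```python
import numpy as np

rng = np.random.default_rng(0)

def gene_score(snps):
    """Summarize a gene's SNP matrix (subjects x SNPs) by its first principal component."""
    centered = snps - snps.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[0]

def co_association(snps_a, snps_b, n_perm=2000):
    """Correlation between two genes' latent scores with a permutation p-value."""
    a, b = gene_score(snps_a), gene_score(snps_b)
    r_obs = np.corrcoef(a, b)[0, 1]
    perms = np.array([np.corrcoef(a, rng.permutation(b))[0, 1] for _ in range(n_perm)])
    p = (np.sum(np.abs(perms) >= abs(r_obs)) + 1) / (n_perm + 1)
    return r_obs, p

# Simulated toy data: 200 subjects, two genes with 5 and 8 SNPs (genotype codes 0/1/2)
g1 = rng.integers(0, 3, size=(200, 5)).astype(float)
g2 = rng.integers(0, 3, size=(200, 8)).astype(float)
print(co_association(g1, g2))
```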

  20. A Critical Analysis of U.S. Army Accessions through Socioeconomic Consideration between 1970 and 1984.

    DTIC Science & Technology

    1985-06-01

    Naval Postgraduate School, Monterey, California 93943. ...determine the socioeconomic representativeness of the Army's enlistees in that particular year. In addition, the socioeconomic overview of Republic of...accomplished with the use of the Statistical Analysis System (SAS), an integrated computer system for data analysis.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chertkov, Michael; Turitsyn, Konstantin; Sulc, Petr

    The anticipated increase in the number of plug-in electric vehicles (EV) will put additional strain on electrical distribution circuits. Many control schemes have been proposed to control EV charging. Here, we develop control algorithms based on randomized EV charging start times and simple one-way broadcast communication allowing for a time delay between communication events. Using arguments from queuing theory and statistical analysis, we seek to maximize the utilization of excess distribution circuit capacity while keeping the probability of a circuit overload negligible.
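    As a rough illustration of the kind of overload-probability reasoning described above (the binomial model, vehicle count, duty cycle and capacity below are hypothetical simplifications, not the authors' queuing analysis):

```python
from math import comb

def overload_probability(n_vehicles, p_charging, capacity):
    """P(more than `capacity` of n vehicles charge simultaneously), simple binomial model."""
    return sum(comb(n_vehicles, k) * p_charging**k * (1 - p_charging)**(n_vehicles - k)
               for k in range(capacity + 1, n_vehicles + 1))

# Hypothetical circuit: 50 EVs, each drawing power 30% of the time, room for 25 simultaneous charges
print(f"P(overload) = {overload_probability(50, 0.30, 25):.2e}")
```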

  2. Sulcal depth-based cortical shape analysis in normal healthy control and schizophrenia groups

    NASA Astrophysics Data System (ADS)

    Lyu, Ilwoo; Kang, Hakmook; Woodward, Neil D.; Landman, Bennett A.

    2018-03-01

    Sulcal depth is an important marker of brain anatomy in neuroscience and neurological function. Previously, sulcal depth has been explored at the region-of-interest (ROI) level to increase statistical sensitivity to group differences. In this paper, we present a fully automated method that enables inferences of ROI properties from a sulcal-region-focused perspective, consisting of two main components: 1) sulcal depth computation and 2) sulcal curve-based refined ROIs. In conventional statistical analysis, average sulcal depth measurements are employed in several ROIs of the cortical surface. However, averaging sulcal depth over the full ROI blurs the overall sulcal depth measurements, which may reduce sensitivity to detect sulcal depth changes in neurological and psychiatric disorders. To overcome this blurring effect, we focus on sulcal fundic regions in each ROI by filtering out gyral regions. Consequently, the proposed method is more sensitive to group differences than a traditional ROI approach. In the experiment, we focused on a cortical morphological analysis of sulcal depth reduction in schizophrenia in comparison to a normal healthy control group. We show that the proposed method is more sensitive to abnormalities of sulcal depth in schizophrenia; sulcal depth is significantly smaller in most cortical lobes in schizophrenia compared to healthy controls (p < 0.05).

  3. Schooling mediates brain reserve in Alzheimer's disease: findings of fluoro-deoxy-glucose-positron emission tomography.

    PubMed

    Perneczky, R; Drzezga, A; Diehl-Schmid, J; Schmid, G; Wohlschläger, A; Kars, S; Grimmer, T; Wagenpfeil, S; Monsch, A; Kurz, A

    2006-09-01

    Functional imaging studies report that higher education is associated with more severe pathology in patients with Alzheimer's disease, controlling for disease severity. Therefore, schooling seems to provide brain reserve against neurodegeneration. To provide further evidence for brain reserve in a large sample, using a sensitive technique for the indirect assessment of brain abnormality (18F-fluoro-deoxy-glucose-positron emission tomography (FDG-PET)), a comprehensive measure of global cognitive impairment to control for disease severity (total score of the Consortium to Establish a Registry for Alzheimer's Disease Neuropsychological Battery) and an approach unbiased by predefined regions of interest for the statistical analysis (statistical parametric mapping (SPM)). 93 patients with mild Alzheimer's disease and 16 healthy controls underwent 18F-FDG-PET imaging of the brain. A linear regression analysis with education as independent and glucose utilisation as dependent variables, adjusted for global cognitive status and demographic variables, was conducted in SPM2. The regression analysis showed a marked inverse association between years of schooling and glucose metabolism in the posterior temporo-occipital association cortex and the precuneus in the left hemisphere. In line with previous reports, the findings suggest that education is associated with brain reserve and that people with higher education can cope with brain damage for a longer time.
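    The regression described here can be sketched outside SPM as an ordinary least-squares fit of regional glucose metabolism on years of schooling with covariate adjustment; the variable values below are simulated placeholders, not study data.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 93  # matches the number of patients mentioned; the values themselves are simulated

education = rng.normal(11, 3, n)          # years of schooling
cognition = rng.normal(60, 12, n)         # global cognitive score (covariate)
age = rng.normal(70, 8, n)                # demographic covariate
metabolism = 2.0 - 0.03 * education + 0.01 * cognition + rng.normal(0, 0.1, n)

# Design matrix: intercept, education, covariates
X = np.column_stack([np.ones(n), education, cognition, age])
beta, *_ = np.linalg.lstsq(X, metabolism, rcond=None)
print(f"adjusted education coefficient: {beta[1]:.4f}")  # negative sign = inverse association
```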

  4. [Establishment of diagnostic model to monitor minimal residual disease of acute promyelocytic leukemia by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry].

    PubMed

    Zhang, Lin-lin; Xu, Zhi-fang; Tan, Yan-hong; Chen, Xiu-hua; Xu, Ai-ning; Ren, Fang-gang; Wang, Hong-wei

    2013-01-01

    To screen potential protein biomarkers of minimal residual disease (MRD) in acute promyelocytic leukemia (APL) by comparing differentially expressed serum proteins between APL patients at diagnosis, APL patients after complete remission (CR) and healthy controls, and to establish and verify a diagnostic model. Serum proteins from 36 cases of primary APL, 29 cases of APL in complete remission and 32 healthy controls were purified with magnetic beads and then analyzed by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). The spectra were analyzed statistically using FlexAnalysis(TM) and ClinProt(TM) software. Two prediction models, primary APL/healthy control and primary APL/APL CR, were developed. Thirty-four statistically significant peptide peaks with m/z values ranging from 1000 to 10 000 were obtained in the primary APL/healthy control model (P < 0.001). Seven statistically significant peptide peaks were obtained in the primary APL/APL CR model (P < 0.001). On comparison of the protein profiles between the two models, three peptides with m/z 4642, 7764 and 9289 were considered protein biomarkers of APL MRD. A diagnostic pattern for APL CR using m/z 4642 and 9289 was established. Blind validation yielded correct classification of 6 out of 8 cases. MALDI-TOF MS analysis of APL patients' serum proteins can serve as a promising dynamic method for MRD detection, and the two peptides with m/z 4642 and 9289 may be suitable biomarkers.

  5. Statistical Process Control for KSC Processing

    NASA Technical Reports Server (NTRS)

    Ford, Roger G.; Delgado, Hector; Tilley, Randy

    1996-01-01

    The 1996 Summer Faculty Fellowship Program and Kennedy Space Center (KSC) served as the basis for a research effort into statistical process control for KSC processing. The effort entailed several tasks and goals. The first was to develop a customized statistical process control (SPC) course for the Safety and Mission Assurance Trends Analysis Group. The actual teaching of this course took place over several weeks. In addition, an Internet version of the same course, complete with animation and video excerpts from the course as taught at KSC, was developed. The application of SPC to shuttle processing took up the rest of the summer research project. This effort entailed the evaluation of SPC use at KSC, both present and potential, due to the change in roles for NASA and the Single Flight Operations Contractor (SFOC). Individual consulting on SPC use was accomplished, as well as an evaluation of SPC software for KSC use in the future. A final accomplishment was the orientation of the author to NASA changes, terminology, data formats, and new NASA task definitions, which will allow future consultation when the need arises.

  6. A clinical research analytics toolkit for cohort study.

    PubMed

    Yu, Yiqin; Zhu, Yu; Sun, Xingzhi; Tao, Ying; Zhang, Shuo; Xu, Linhao; Pan, Yue

    2012-01-01

    This paper presents a clinical informatics toolkit that can assist physicians to conduct cohort studies effectively and efficiently. The toolkit has three key features: 1) support of procedures defined in epidemiology, 2) recommendation of statistical methods in data analysis, and 3) automatic generation of research reports. On one hand, our system can help physicians control research quality by leveraging the integrated knowledge of epidemiology and medical statistics; on the other hand, it can improve productivity by reducing the complexities for physicians during their cohort studies.

  7. Statistical Performances of Resistive Active Power Splitter

    NASA Astrophysics Data System (ADS)

    Lalléchère, Sébastien; Ravelo, Blaise; Thakur, Atul

    2016-03-01

    In this paper, the synthesis and sensitivity analysis of an active power splitter (PWS) is proposed. It is based on an active cell composed of a field-effect transistor in cascade with shunted resistors at the input and the output (resistive amplifier topology). The uncertainty of the PWS with respect to resistance tolerances is investigated using a stochastic method. Furthermore, with the proposed topology, the device gain can easily be controlled by varying a resistance. This provides a useful tool to analyse the statistical sensitivity of the system in an uncertain environment.
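    The sensitivity analysis described above can be illustrated with a Monte Carlo sketch; the simple resistive-divider gain, nominal values and tolerance below are hypothetical stand-ins for the paper's FET-based topology.

```python
import numpy as np

rng = np.random.default_rng(2)

def divider_gain(r_series, r_shunt):
    """Voltage gain of a simple resistive divider (stand-in for the gain-setting resistance)."""
    return r_shunt / (r_series + r_shunt)

# Nominal values and a +/-5% tolerance, both chosen purely for illustration
n_trials = 100_000
r_series = rng.normal(100.0, 100.0 * 0.05 / 3, n_trials)   # 5% spread treated as 3-sigma
r_shunt = rng.normal(50.0, 50.0 * 0.05 / 3, n_trials)

gains = divider_gain(r_series, r_shunt)
print(f"mean gain = {gains.mean():.4f}, std = {gains.std():.5f}")
```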

  8. Fast and accurate imputation of summary statistics enhances evidence of functional enrichment

    PubMed Central

    Pasaniuc, Bogdan; Zaitlen, Noah; Shi, Huwenbo; Bhatia, Gaurav; Gusev, Alexander; Pickrell, Joseph; Hirschhorn, Joel; Strachan, David P.; Patterson, Nick; Price, Alkes L.

    2014-01-01

    Motivation: Imputation using external reference panels (e.g. 1000 Genomes) is a widely used approach for increasing power in genome-wide association studies and meta-analysis. Existing hidden Markov models (HMM)-based imputation approaches require individual-level genotypes. Here, we develop a new method for Gaussian imputation from summary association statistics, a type of data that is becoming widely available. Results: In simulations using 1000 Genomes (1000G) data, this method recovers 84% (54%) of the effective sample size for common (>5%) and low-frequency (1–5%) variants [increasing to 87% (60%) when summary linkage disequilibrium information is available from target samples] versus the gold standard of 89% (67%) for HMM-based imputation, which cannot be applied to summary statistics. Our approach accounts for the limited sample size of the reference panel, a crucial step to eliminate false-positive associations, and it is computationally very fast. As an empirical demonstration, we apply our method to seven case–control phenotypes from the Wellcome Trust Case Control Consortium (WTCCC) data and a study of height in the British 1958 birth cohort (1958BC). Gaussian imputation from summary statistics recovers 95% (105%) of the effective sample size (as quantified by the ratio of χ2 association statistics) compared with HMM-based imputation from individual-level genotypes at the 227 (176) published single nucleotide polymorphisms (SNPs) in the WTCCC (1958BC height) data. In addition, for publicly available summary statistics from large meta-analyses of four lipid traits, we publicly release imputed summary statistics at 1000G SNPs, which could not have been obtained using previously published methods, and demonstrate their accuracy by masking subsets of the data. We show that 1000G imputation using our approach increases the magnitude and statistical evidence of enrichment at genic versus non-genic loci for these traits, as compared with an analysis without 1000G imputation. Thus, imputation of summary statistics will be a valuable tool in future functional enrichment analyses. Availability and implementation: Publicly available software package available at http://bogdan.bioinformatics.ucla.edu/software/. Contact: bpasaniuc@mednet.ucla.edu or aprice@hsph.harvard.edu Supplementary information: Supplementary materials are available at Bioinformatics online. PMID:24990607
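    A minimal sketch of Gaussian imputation of a z-score at an untyped SNP from typed SNPs, using the standard conditional-mean expression with a small ridge term for the finite reference panel; the LD values and z-scores are toy numbers, and the ridge constant is an assumption rather than the value used in the paper.

```python
import numpy as np

def impute_z(z_typed, ld_tt, ld_ut, lam=0.1):
    """
    Gaussian imputation of an untyped SNP's z-score from typed SNPs' z-scores.
    z_typed : (m,) z-scores at typed SNPs
    ld_tt   : (m, m) LD (correlation) matrix among typed SNPs
    ld_ut   : (m,) LD between the untyped SNP and each typed SNP
    lam     : ridge term accounting for the limited reference panel (assumed value)
    """
    weights = np.linalg.solve(ld_tt + lam * np.eye(len(z_typed)), ld_ut)
    z_imp = weights @ z_typed
    info = ld_ut @ weights            # rough imputation-quality measure (r^2-like)
    return z_imp, info

# Toy example with made-up LD values
z_typed = np.array([2.5, 1.8, 0.4])
ld_tt = np.array([[1.0, 0.6, 0.2],
                  [0.6, 1.0, 0.3],
                  [0.2, 0.3, 1.0]])
ld_ut = np.array([0.7, 0.5, 0.1])
print(impute_z(z_typed, ld_tt, ld_ut))
```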

  9. GWAR: robust analysis and meta-analysis of genome-wide association studies.

    PubMed

    Dimou, Niki L; Tsirigos, Konstantinos D; Elofsson, Arne; Bagos, Pantelis G

    2017-05-15

    In the context of genome-wide association studies (GWAS), there is a variety of statistical techniques in order to conduct the analysis, but, in most cases, the underlying genetic model is usually unknown. Under these circumstances, the classical Cochran-Armitage trend test (CATT) is suboptimal. Robust procedures that maximize the power and preserve the nominal type I error rate are preferable. Moreover, performing a meta-analysis using robust procedures is of great interest and has never been addressed in the past. The primary goal of this work is to implement several robust methods for analysis and meta-analysis in the statistical package Stata and subsequently to make the software available to the scientific community. The CATT under a recessive, additive and dominant model of inheritance as well as robust methods based on the Maximum Efficiency Robust Test statistic, the MAX statistic and the MIN2 were implemented in Stata. Concerning MAX and MIN2, we calculated their asymptotic null distributions relying on numerical integration resulting in a great gain in computational time without losing accuracy. All the aforementioned approaches were employed in a fixed or a random effects meta-analysis setting using summary data with weights equal to the reciprocal of the combined cases and controls. Overall, this is the first complete effort to implement procedures for analysis and meta-analysis in GWAS using Stata. A Stata program and a web-server are freely available for academic users at http://www.compgen.org/tools/GWAR. pbagos@compgen.org. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
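    The Cochran-Armitage trend test (CATT) mentioned above has a simple closed form. The article's implementation is in Stata; the sketch below is an independent Python illustration with toy genotype counts.

```python
from math import sqrt, erf

def cochran_armitage_trend(cases, controls, weights=(0, 1, 2)):
    """
    Cochran-Armitage trend test for a 2 x k case-control genotype table.
    cases, controls: counts per genotype category; weights: model scores
    (0,1,2 = additive; 0,1,1 = dominant; 0,0,1 = recessive).
    """
    R, S = sum(cases), sum(controls)
    N = R + S
    n = [c + d for c, d in zip(cases, controls)]          # column totals
    T = sum(w * (r * S - s * R) for w, r, s in zip(weights, cases, controls))
    var = (R * S / N) * (sum(w * w * ni * (N - ni) for w, ni in zip(weights, n))
                         - 2 * sum(weights[i] * weights[j] * n[i] * n[j]
                                   for i in range(len(n)) for j in range(i + 1, len(n))))
    z = T / sqrt(var)
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))       # two-sided p-value
    return z, p

# Toy genotype counts (AA, Aa, aa) for cases and controls
print(cochran_armitage_trend(cases=(30, 50, 20), controls=(45, 40, 15)))
```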

  10. MethVisual - visualization and exploratory statistical analysis of DNA methylation profiles from bisulfite sequencing.

    PubMed

    Zackay, Arie; Steinhoff, Christine

    2010-12-15

    Exploration of DNA methylation and its impact on various regulatory mechanisms has become a very active field of research. Simultaneously there is an arising need for tools to process and analyse the data together with statistical investigation and visualisation. MethVisual is a new application that enables exploratory analysis and intuitive visualization of DNA methylation data as is typically generated by bisulfite sequencing. The package allows the import of DNA methylation sequences, aligns them and performs quality control comparison. It comprises basic analysis steps such as lollipop visualization, co-occurrence display of methylation of neighbouring and distant CpG sites, summary statistics on methylation status, clustering and correspondence analysis. The package has been developed for methylation data but can also be used for other data types for which binary coding can be inferred. The application of the package, as well as a comparison to existing DNA methylation analysis tools and its workflow based on two datasets, is presented in this paper. The R package MethVisual offers various analysis procedures for data that can be binarized, in particular for bisulfite-sequenced methylation data. R/Bioconductor has become one of the most important environments for statistical analysis of various types of biological and medical data. Therefore, any data analysis within R that allows the integration of various data types as provided from different technological platforms is convenient. It is the first and so far the only specific package for DNA methylation analysis, in particular for bisulfite-sequenced data, available in the R/Bioconductor environment. The package is available for free at http://methvisual.molgen.mpg.de/ and from the Bioconductor Consortium http://www.bioconductor.org.

  11. MethVisual - visualization and exploratory statistical analysis of DNA methylation profiles from bisulfite sequencing

    PubMed Central

    2010-01-01

    Background Exploration of DNA methylation and its impact on various regulatory mechanisms has become a very active field of research. Simultaneously there is an arising need for tools to process and analyse the data together with statistical investigation and visualisation. Findings MethVisual is a new application that enables exploratory analysis and intuitive visualization of DNA methylation data as is typically generated by bisulfite sequencing. The package allows the import of DNA methylation sequences, aligns them and performs quality control comparison. It comprises basic analysis steps such as lollipop visualization, co-occurrence display of methylation of neighbouring and distant CpG sites, summary statistics on methylation status, clustering and correspondence analysis. The package has been developed for methylation data but can also be used for other data types for which binary coding can be inferred. The application of the package, as well as a comparison to existing DNA methylation analysis tools and its workflow based on two datasets, is presented in this paper. Conclusions The R package MethVisual offers various analysis procedures for data that can be binarized, in particular for bisulfite-sequenced methylation data. R/Bioconductor has become one of the most important environments for statistical analysis of various types of biological and medical data. Therefore, any data analysis within R that allows the integration of various data types as provided from different technological platforms is convenient. It is the first and so far the only specific package for DNA methylation analysis, in particular for bisulfite-sequenced data, available in the R/Bioconductor environment. The package is available for free at http://methvisual.molgen.mpg.de/ and from the Bioconductor Consortium http://www.bioconductor.org. PMID:21159174

  12. 77 FR 46096 - Statistical Process Controls for Blood Establishments; Public Workshop

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-02

    ...] Statistical Process Controls for Blood Establishments; Public Workshop AGENCY: Food and Drug Administration... workshop entitled: ``Statistical Process Controls for Blood Establishments.'' The purpose of this public workshop is to discuss the implementation of statistical process controls to validate and monitor...

  13. TQM (Total Quality Management) SPARC (Special Process Action Review Committees) Handbook

    DTIC Science & Technology

    1989-08-01

    This document describes the techniques used to support and guide the Special Process Action Review Committees for accomplishing their goals for Total Quality Management (TQM). It includes concepts and definitions, checklists, sample formats, and assessment criteria. Keywords: Continuous process improvement; Logistics information; Process analysis; Quality control; Quality assurance; Total Quality Management; Statistical processes; Management planning and control; Management training; Management information systems.

  14. Interior Noise

    NASA Technical Reports Server (NTRS)

    Mixson, John S.; Wilby, John F.

    1991-01-01

    The generation and control of flight vehicle interior noise is discussed. Emphasis is placed on the mechanisms of transmission through airborne and structure-borne paths and the control of cabin noise by path modification. Techniques for identifying the relative contributions of the various source-path combinations are also discussed along with methods for the prediction of aircraft interior noise such as those based on the general modal theory and statistical energy analysis.

  15. Effectiveness of Riluzole as a pharmacotherapeutic treatment option for early cervical myelopathy: a double-blinded, placebo-controlled randomised controlled trial.

    PubMed

    Rajasekaran, S; Aiyer, Siddharth N; Shetty, Ajoy Prasad; Kanna, Rishi Mugesh; Maheswaran, Anupama; Shetty, Janardhan Yerram

    2016-06-01

    To evaluate the effectiveness of Riluzole as a pharmacotherapeutic treatment option for early cervical myelopathy using clinical parameters and DTI analysis. Early cervical myelopathy cases with MJOA scores ≥13 were recruited for the double-blinded, placebo-controlled randomised controlled trial. Thirty cases, with fifteen each in the test and placebo groups, were studied. Analysis was done using diffusion tensor imaging (DTI) and clinical evaluation, before and after institution of the sodium channel blocker Riluzole for a period of 1 month (50 mg twice daily). The placebo group was treated with Vitamin B complex tablets. The diffusion coefficients fractional anisotropy (FA), apparent diffusion coefficient (ADC), volume ratio (VR), relative anisotropy (RA) and eigenvectors were calculated. Outcome analysis was based on the clinical scores of MJOA, Nurick grading, SF-12 and NDI, and on statistical analysis of the DTI data metrics. The mean MJOA score was 15.6 (13-17) with no significant change in the test and control groups. The mean ADC and FA values were 1533.36 (1238-1779) and 494.36 (364-628) and changed to 1531.57 (1312-2091) and 484.86 (294-597), respectively, in the Riluzole group. However, the changes in the values of ADC, FA, and other coefficients including VR, RA and eigenvectors in the two groups were not statistically significant. The functional scores in the SF-12 and NDI questionnaires did not change significantly. Our study did not show a significant change in clinical outcome or DTI indices with the use of Riluzole as a standalone pharmacotherapeutic agent for early cervical myelopathy. More studies may be needed to confirm the usefulness of Riluzole as a treatment option for cervical myelopathy.

  16. Identifying functional reorganization of spelling networks: an individual peak probability comparison approach

    PubMed Central

    Purcell, Jeremy J.; Rapp, Brenda

    2013-01-01

    Previous research has shown that damage to the neural substrates of orthographic processing can lead to functional reorganization during reading (Tsapkini et al., 2011); in this research we ask if the same is true for spelling. To examine the functional reorganization of spelling networks we present a novel three-stage Individual Peak Probability Comparison (IPPC) analysis approach for comparing the activation patterns obtained during fMRI of spelling in a single brain-damaged individual with dysgraphia to those obtained in a set of non-impaired control participants. The first analysis stage characterizes the convergence in activations across non-impaired control participants by applying a technique typically used for characterizing activations across studies: Activation Likelihood Estimate (ALE) (Turkeltaub et al., 2002). This method was used to identify locations that have a high likelihood of yielding activation peaks in the non-impaired participants. The second stage provides a characterization of the degree to which the brain-damaged individual's activations correspond to the group pattern identified in Stage 1. This involves performing a Mahalanobis distance statistics analysis (Tsapkini et al., 2011) that compares each of a control group's peak activation locations to the nearest peak generated by the brain-damaged individual. The third stage evaluates the extent to which the brain-damaged individual's peaks are atypical relative to the range of individual variation among the control participants. This IPPC analysis allows for a quantifiable, statistically sound method for comparing an individual's activation pattern to the patterns observed in a control group and, thus, provides a valuable tool for identifying functional reorganization in a brain-damaged individual with impaired spelling. Furthermore, this approach can be applied more generally to compare any individual's activation pattern with that of a set of other individuals. PMID:24399981
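    The second IPPC stage compares an individual's activation peaks with the control group's peaks via Mahalanobis distance. A minimal sketch with hypothetical peak coordinates (not data from the study):

```python
import numpy as np

def mahalanobis_to_group(peak, group_peaks):
    """Distance of one individual's activation peak from the distribution of
    control-group peaks (x, y, z coordinates in mm) for the same region."""
    group_peaks = np.asarray(group_peaks, dtype=float)
    mean = group_peaks.mean(axis=0)
    cov = np.cov(group_peaks, rowvar=False)
    diff = np.asarray(peak, dtype=float) - mean
    # pseudo-inverse guards against a near-singular covariance when control peaks are few
    return float(np.sqrt(diff @ np.linalg.pinv(cov) @ diff))

# Hypothetical peak coordinates (mm); not data from the study
controls = [[-44, -60, 52], [-46, -58, 50], [-42, -62, 54],
            [-45, -57, 51], [-43, -61, 49], [-47, -59, 53]]
patient_peak = [-38, -52, 44]
print(f"Mahalanobis distance = {mahalanobis_to_group(patient_peak, controls):.2f}")
```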

  17. Brush head composition, wear profile, and cleaning efficacy: an assessment of three electric brush heads using in vitro methods.

    PubMed

    Kaiser, Eva; Meyners, Michael; Markgraf, Dirk; Stoerkel, Ulrich; von Koppenfels, Roxana; Adam, Ralf; Soukup, Martin; Wehrbein, Heinrich; Erbe, Christina

    2014-01-01

    The objective of this research was to evaluate a current store brand (SB) brush head for composition/physical characteristics, Wear Index (WI), and cleaning efficacy versus the previous SB brush head refill design (SB control) and the Oral-B Precision Clean brush head (positive control, PC). This research consisted of three parts: 1) Analytical analysis using Fourier Transform Infrared (FT-IR) spectrometry to evaluate the chemical composition of the current SB brush head bristles relative to the SB control. In addition, physical parameters such as bristle count and diameter were determined. 2) Wear Index (WI) investigation to determine the Wear Index scores of in vitro-aged brush heads at four weeks (one month) and 13 weeks (three months) by a trained investigator. To "age" the brush heads, a robot system was used as a new alternative in vitro method to simulate aging by consumer use. 3) Robot testing to determine the cleaning performance of in vitro-aged brush heads, comparing one month-aged current SB brush heads with the SB control (one and three months-aged) and the PC brush heads (three months-aged) in a standardized fashion. 1) FT-IR analysis revealed that the chemical composition of the current and control SB refill brush heads is identical. In terms of physical parameters, the current SB brush head has 12% more bristles and a slightly oval brush head compared to the round brush head of the SB control. 2) Wear Index analysis showed there was no difference in the one month-aged current SB brush head versus the one month-aged SB control (1.67 vs. 1.50, p = 0.65) or versus the three months-aged PC brush head (1.67 vs. 1.50, p = 0.65). The one month-aged current SB brush head demonstrated statistically significantly less wear than the three months-aged SB control (1.67 vs. 2.67, p = 0.01). 3) Analysis of cleaning efficacy shows that the one month-aged current SB brush head had improved cleaning performance over the one month-aged SB control brush head (p < 0.05), despite no statistically significant difference in wear. Both the one month-aged current and control SB brush heads showed statistically significantly lower cleaning performance compared to the three months-aged PC brush heads (p < 0.01). While the current SB brush head showed improved cleaning over the SB control, it demonstrated significantly lower durability and cleaning in comparison to the PC brush head. Dental professionals should be aware of these differences, both in durability and in cleaning performance, when recommending brush heads to their patients.

  18. FADTTSter: accelerating hypothesis testing with functional analysis of diffusion tensor tract statistics

    NASA Astrophysics Data System (ADS)

    Noel, Jean; Prieto, Juan C.; Styner, Martin

    2017-03-01

    Functional Analysis of Diffusion Tensor Tract Statistics (FADTTS) is a toolbox for analysis of white matter (WM) fiber tracts. It allows associating diffusion properties along major WM bundles with a set of covariates of interest, such as age, diagnostic status and gender, and the structure of the variability of these WM tract properties. However, to use this toolbox, a user must have an intermediate knowledge in scripting languages (MATLAB). FADTTSter was created to overcome this issue and make the statistical analysis accessible to any non-technical researcher. FADTTSter is actively being used by researchers at the University of North Carolina. FADTTSter guides non-technical users through a series of steps including quality control of subjects and fibers in order to setup the necessary parameters to run FADTTS. Additionally, FADTTSter implements interactive charts for FADTTS' outputs. This interactive chart enhances the researcher experience and facilitates the analysis of the results. FADTTSter's motivation is to improve usability and provide a new analysis tool to the community that complements FADTTS. Ultimately, by enabling FADTTS to a broader audience, FADTTSter seeks to accelerate hypothesis testing in neuroimaging studies involving heterogeneous clinical data and diffusion tensor imaging. This work is submitted to the Biomedical Applications in Molecular, Structural, and Functional Imaging conference. The source code of this application is available in NITRC.

  19. Statistical Significance of Optical Map Alignments

    PubMed Central

    Sarkar, Deepayan; Goldstein, Steve; Schwartz, David C.

    2012-01-01

    The Optical Mapping System constructs ordered restriction maps spanning entire genomes through the assembly and analysis of large datasets comprising individually analyzed genomic DNA molecules. Such restriction maps uniquely reveal mammalian genome structure and variation, but also raise computational and statistical questions beyond those that have been solved in the analysis of smaller, microbial genomes. We address the problem of how to filter maps that align poorly to a reference genome. We obtain map-specific thresholds that control errors and improve iterative assembly. We also show how an optimal self-alignment score provides an accurate approximation to the probability of alignment, which is useful in applications seeking to identify structural genomic abnormalities. PMID:22506568

  20. The social construction of "evidence-based" drug prevention programs: a reanalysis of data from the Drug Abuse Resistance Education (DARE) program.

    PubMed

    Gorman, Dennis M; Huber, J Charles

    2009-08-01

    This study explores the possibility that any drug prevention program might be considered "evidence-based" given the use of data analysis procedures that optimize the chance of producing statistically significant results, by reanalyzing data from a Drug Abuse Resistance Education (DARE) program evaluation. The analysis produced a number of statistically significant differences between the DARE and control conditions on alcohol and marijuana use measures. Many of these differences occurred at cutoff points on the assessment scales for which post hoc meaningful labels were created. Our results are compared to those from evaluations of programs that appear on evidence-based drug prevention lists.

  1. The Timing of First Marriage: Are There Religious Variations?

    ERIC Educational Resources Information Center

    Xu, Xiaohe; Hudspeth, Clark D.; Bartkowski, John P.

    2005-01-01

    Using survey data from a nationally representative sample, this article explores how marriage timing varies across major religious denominations. Survival analysis indicates that net of statistical controls, Catholics, moderate Protestants, conservative Protestants, and Mormons marry significantly earlier than their unaffiliated counterparts. This…

  2. Assessment of the beryllium lymphocyte proliferation test using statistical process control.

    PubMed

    Cher, Daniel J; Deubner, David C; Kelsh, Michael A; Chapman, Pamela S; Ray, Rose M

    2006-10-01

    Despite more than 20 years of surveillance and epidemiologic studies using the beryllium blood lymphocyte proliferation test (BeBLPT) as a measure of beryllium sensitization (BeS) and as an aid for diagnosing subclinical chronic beryllium disease (CBD), improvements in specific understanding of the inhalation toxicology of CBD have been limited. Although epidemiologic data suggest that BeS and CBD risks vary by process/work activity, it has proven difficult to reach specific conclusions regarding the dose-response relationship between workplace beryllium exposure and BeS or subclinical CBD. One possible reason for this uncertainty could be misclassification of BeS resulting from variation in BeBLPT testing performance. The reliability of the BeBLPT, a biological assay that measures beryllium sensitization, is unknown. To assess the performance of four laboratories that conducted this test, we used data from a medical surveillance program that offered testing for beryllium sensitization with the BeBLPT. The study population was workers exposed to beryllium at various facilities over a 10-year period (1992-2001). Workers with abnormal results were offered diagnostic workups for CBD. Our analyses used a standard statistical technique, statistical process control (SPC), to evaluate test reliability. The study design involved a repeated measures analysis of BeBLPT results generated from the company-wide, longitudinal testing. Analytical methods included use of (1) statistical process control charts that examined temporal patterns of variation for the stimulation index, a measure of cell reactivity to beryllium; (2) correlation analysis that compared prior perceptions of BeBLPT instability to the statistical measures of test variation; and (3) assessment of the variation in the proportion of missing test results and how time periods with more missing data influenced SPC findings. During the period of this study, all laboratories displayed variation in test results that was beyond what would be expected due to chance alone. Patterns of test results suggested that variations were systematic. We conclude that laboratories performing the BeBLPT or other similar biological assays of immunological response could benefit from a statistical approach such as SPC to improve quality management.
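    A minimal sketch of the kind of control charting applied to the stimulation index: a Shewhart individuals chart with limits derived from the average moving range. The weekly values below are made up for illustration and are not surveillance data.

```python
import numpy as np

def individuals_chart_limits(values):
    """
    Shewhart individuals (X) chart limits from the average moving range:
    center +/- 2.66 * mean moving range (2.66 = 3 / d2 with d2 = 1.128 for n = 2).
    """
    values = np.asarray(values, dtype=float)
    center = values.mean()
    mr_bar = np.mean(np.abs(np.diff(values)))
    return center - 2.66 * mr_bar, center, center + 2.66 * mr_bar

# Hypothetical weekly stimulation-index values from one laboratory (illustrative only)
si = [2.1, 1.9, 2.4, 2.2, 2.0, 4.5, 2.3, 2.1, 1.8, 2.2]
lcl, cl, ucl = individuals_chart_limits(si)
out_of_control = [x for x in si if not lcl <= x <= ucl]
print(f"LCL={lcl:.2f} CL={cl:.2f} UCL={ucl:.2f}, flagged points: {out_of_control}")
```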

  3. Method and algorithm of automatic estimation of road surface type for variable damping control

    NASA Astrophysics Data System (ADS)

    Dąbrowski, K.; Ślaski, G.

    2016-09-01

    In this paper the authors present an approach to road surface estimation (recognition) based on statistical analysis of suspension dynamic response signals. For preliminary analysis the cumulative distribution function (CDF) was used, leading to the observation that different road surfaces produce response values in different ranges for the same percentage of samples or, for the same limits, different percentages of samples located within the range between the limit values. This observation formed the basis of the developed and presented algorithm, which was tested using suspension response signals recorded during road tests over various surfaces. The proposed algorithm can be an essential part of an adaptive damping control algorithm for a vehicle suspension or of an adaptive control strategy for suspension damping control.
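    The CDF-based idea can be illustrated with a toy classifier that labels a surface by how often the suspension response exceeds a limit; the limit, thresholds and simulated signals below are assumptions for illustration, not the calibrated values from the paper.

```python
import numpy as np

def exceedance_fraction(signal, limit):
    """Fraction of suspension-response samples whose magnitude exceeds a limit."""
    signal = np.asarray(signal, dtype=float)
    return float(np.mean(np.abs(signal) > limit))

def classify_road(signal, limit=1.0, thresholds=(0.02, 0.10)):
    """Toy classifier: label surface roughness by how often the response exceeds the limit."""
    frac = exceedance_fraction(signal, limit)
    if frac < thresholds[0]:
        return "smooth", frac
    if frac < thresholds[1]:
        return "medium", frac
    return "rough", frac

rng = np.random.default_rng(3)
smooth = rng.normal(0, 0.3, 5000)     # simulated body-acceleration samples
rough = rng.normal(0, 1.2, 5000)
print(classify_road(smooth), classify_road(rough))
```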

  4. Effects of a Minimal Workplace Intervention to Reduce Sedentary Behaviors and Improve Perceived Wellness in Middle-Aged Women Office Workers.

    PubMed

    Urda, Joyan L; Lynn, Jeffrey S; Gorman, Andrea; Larouere, Beth

    2016-08-01

    The purpose of this study was to determine whether an alert to get up once per hour while at work would reduce sitting time, increase sit-to-stand transitions, and improve perceived wellness in women with sedentary jobs. Female university staff and administrators (48 ± 10 years) were randomly assigned to control-control (CC) (n = 22) or control-intervention (CI) (n = 22) groups. Both used a thigh-worn postural-based activity monitor for 2 weeks. The CC group maintained normal behaviors, whereas the CI group maintained behaviors during control week, but received hourly alerts on their computer during work hours in the intervention week. Time sitting and sit-to-stand transitions during an 8.5-hour workday were examined. A perceived wellness survey was completed at baseline and after the control and intervention weeks. Among all participants (N = 44) during the control week, 68% of the workday was spent sitting and 41 sit-to-stand transitions occurred. An analysis of variance revealed no statistically significant differences in variables over time (P > .05). There was a significant increase in perceived wellness from baseline in both groups (P ≤ .05). Perceived wellness showed no statistically significant difference between groups. The intervention had no statistically significant effect on sitting time or sit-to-stand transitions. Participation improved perceived wellness in the absence of behavior change.

  5. Steep discounting of delayed monetary and food rewards in obesity: a meta-analysis.

    PubMed

    Amlung, M; Petker, T; Jackson, J; Balodis, I; MacKillop, J

    2016-08-01

    An increasing number of studies have investigated delay discounting (DD) in relation to obesity, but with mixed findings. This meta-analysis synthesized the literature on the relationship between monetary and food DD and obesity, with three objectives: (1) to characterize the relationship between DD and obesity in both case-control comparisons and continuous designs; (2) to examine potential moderators, including case-control v. continuous design, money v. food rewards, sample sex distribution, and sample age (18 years); and (3) to evaluate publication bias. From 134 candidate articles, 39 independent investigations yielded 29 case-control and 30 continuous comparisons (total n = 10 278). Random-effects meta-analysis was conducted using Cohen's d as the effect size. Publication bias was evaluated using fail-safe N, Begg-Mazumdar and Egger tests, meta-regression of publication year and effect size, and imputation of missing studies. The primary analysis revealed a medium effect size across studies that was highly statistically significant (d = 0.43, p < 10-14). None of the moderators examined yielded statistically significant differences, although notably larger effect sizes were found for studies with case-control designs, food rewards and child/adolescent samples. Limited evidence of publication bias was present, although the Begg-Mazumdar test and meta-regression suggested a slightly diminishing effect size over time. Steep DD of food and money appears to be a robust feature of obesity that is relatively consistent across the DD assessment methodologies and study designs examined. These findings are discussed in the context of research on DD in drug addiction, the neural bases of DD in obesity, and potential clinical applications.
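    A minimal sketch of the random-effects pooling used in such meta-analyses (DerSimonian-Laird); the effect sizes and variances below are invented toy inputs, not the studies synthesized here.

```python
import numpy as np

def random_effects_meta(d, var):
    """DerSimonian-Laird random-effects pooling of effect sizes (e.g., Cohen's d)."""
    d, var = np.asarray(d, float), np.asarray(var, float)
    w = 1.0 / var                               # fixed-effect weights
    d_fixed = np.sum(w * d) / np.sum(w)
    q = np.sum(w * (d - d_fixed) ** 2)          # Cochran's Q
    df = len(d) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)               # between-study variance
    w_star = 1.0 / (var + tau2)
    d_pooled = np.sum(w_star * d) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return d_pooled, (d_pooled - 1.96 * se, d_pooled + 1.96 * se), tau2

# Toy effect sizes and variances from hypothetical studies (not the meta-analysis data)
d = [0.55, 0.30, 0.62, 0.20, 0.48]
var = [0.04, 0.02, 0.06, 0.03, 0.05]
print(random_effects_meta(d, var))
```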

  6. Voice analysis before and after vocal rehabilitation in patients following open surgery on vocal cords.

    PubMed

    Bunijevac, Mila; Petrović-Lazić, Mirjana; Jovanović-Simić, Nadica; Vuković, Mile

    2016-02-01

    The major role of the larynx in speech, respiration and swallowing makes carcinomas of this region and their treatment very influential on patients' quality of life. The aim of this study was to assess the importance of voice therapy in patients after open surgery on the vocal cords. This study included 21 male patients and a control group of 19 subjects. The vowel (A) was recorded and analyzed for each examinee. All the patients were recorded twice: first when they contacted the clinic and again after three months of vocal therapy, which was held twice per week on an outpatient basis. The voice analysis was carried out in the Ear, Nose and Throat (ENT) Clinic, Clinical Hospital Center "Zvezdara" in Belgrade. The values of the acoustic parameters in the patients who underwent open surgery on the vocal cords before vocal rehabilitation and in the control group subjects differed significantly in all specified parameters. These results suggest that the voice of the patients was damaged before vocal rehabilitation. The acoustic parameters of the vowel (A) before and after vocal rehabilitation of the patients with open surgery on the vocal cords were statistically significantly different. Among the parameters Jitter (%) and Shimmer (%), the observed difference was highly statistically significant (p < 0.01). The voice turbulence index and the noise/harmonic ratio were also notably improved, and the observed difference was statistically significant (p < 0.05). The analysis of the tremor intensity index showed no significant improvement, and the observed difference was not statistically significant (p > 0.05). In conclusion, there was a significant improvement of the acoustic parameters of the vowel (A) in the study subjects three months after vocal therapy; only one of the five representative parameters showed no significant improvement.

  7. Antithrombotic drug therapy for IgA nephropathy: a meta-analysis of randomized controlled trials.

    PubMed

    Liu, Xiu-Juan; Geng, Yan-Qiu; Xin, Shao-Nan; Huang, Guo-Ming; Tu, Xiao-Wen; Ding, Zhong-Ru; Chen, Xiang-Mei

    2011-01-01

    Antithrombotic agents, including antiplatelet agents, anticoagulants and thrombolysis agents, have been widely used in the management of immunoglobulin A (IgA) nephropathy in Chinese and Japanese populations. To systematically evaluate the effects of antithrombotic agents for IgA nephropathy. Data sources consisted of MEDLINE, EMBASE, the Cochrane Library, Chinese Biomedical Literature Database (CBM), Chinese Science and Technology Periodicals Databases (CNKI) and Japana Centra Revuo Medicina (http://www.jamas.gr.jp) up to April 5, 2011. The quality of the studies was evaluated from the intention to treat analysis and allocation concealment, as well as by the Jadad method. Meta-analyses were performed on the outcomes of proteinuria and renal function. Six articles met the predetermined inclusion criteria. Antithrombotic agents showed statistically significant effects on proteinuria (p<0.0001) but not on the protection of renal function (p=0.07). The pooled risk ratio for proteinuria was 0.53, [95% confidence intervals (CI): 0.41-0.68; I(2)=0%] and for renal function it was 0.42 (95% CI 0.17-1.06; I(2)=72%). Subgroup analysis showed that dipyridamole was beneficial for proteinuria (p=0.0003) but had no significant effects on protecting renal function. Urokinase had statistically significant effects both on the reduction of proteinuria (p=0.0005) and protecting renal function (p<0.00001) when compared with the control group. Antithrombotic agents had statistically significant effects on the reduction of proteinuria but not on the protection of renal function in patients with IgAN. Urokinase had statistically significant effects both on the reduction of proteinuria and on protecting renal function. Urokinase was shown to be a promising medication and should be investigated further.

  8. The investigation of the some body parameters of obese and (obese+diabetes) patients with using bioelectrical impedance analysis techniques

    NASA Astrophysics Data System (ADS)

    Yerlikaya, Emrah; Karageçili, Hasan; Aydin, Ruken Zeynep

    2016-04-01

    Obesity is a key risk factor for the development of hyperglycemia, hypertension, hyperlipidemia and insulin resistance, which are collectively referred to as metabolic disorders. Diabetes mellitus, a metabolic disorder, is associated with hyperglycemia and altered metabolism of lipids, carbohydrates and proteins. The minimum defining characteristic used to identify diabetes mellitus is a chronic, substantiated elevation of the circulating glucose concentration. This study aimed to analyze the body composition of obese and obese+diabetic patients. We studied data taken from three independent groups with a body composition analyzer. The body composition analyzer calculates body parameters such as body fat ratio, body fat mass, fat-free mass, estimated muscle mass and basal metabolic rate on the basis of data obtained by Dual Energy X-ray Absorptiometry using Bioelectrical Impedance Analysis. All patients and healthy subjects had presented to the Siirt University Medico center, where their data were collected. The Statistical Package for the Social Sciences version 21 was used for descriptive data analysis. When we compared and analyzed the data of the three groups, we found statistically significant differences between the values of the obese, obese+diabetic and control groups. ANOVA and Tukey tests were used to analyze the differences between groups and to perform multiple comparisons; a t-test was also used to analyze the difference between genders. We observed a statistically significant difference in age and mineral amount (p<0.00) between the obese+diabetic and obese groups. Moreover, when these patient groups and the control group were analyzed, there were significant differences in most parameters. In terms of education level, between the illiterate and the university graduates, statistically significant differences (p<0.05) were observed in fat mass (kg), fat percentage, internal fat, body mass index, water percentage, protein mass percentage and mineral percentage. This difference may especially result from a sedentary lifestyle.
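    A minimal sketch of the ANOVA-style comparison described above, on simulated values (group means and spreads are invented); Tukey's HSD follow-up is available in recent SciPy versions but is omitted here for brevity.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Simulated body-fat percentages for three groups (values are illustrative only)
control = rng.normal(24, 5, 40)
obese = rng.normal(36, 6, 40)
obese_diabetic = rng.normal(38, 6, 40)

# One-way ANOVA across the three groups, followed by a pairwise two-sample t-test
f_stat, p_anova = stats.f_oneway(control, obese, obese_diabetic)
t_stat, p_t = stats.ttest_ind(obese, obese_diabetic)
print(f"ANOVA: F={f_stat:.1f}, p={p_anova:.2e};  obese vs obese+diabetic: t={t_stat:.2f}, p={p_t:.3f}")
```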

  9. Training in metabolomics research. II. Processing and statistical analysis of metabolomics data, metabolite identification, pathway analysis, applications of metabolomics and its future

    PubMed Central

    Barnes, Stephen; Benton, H. Paul; Casazza, Krista; Cooper, Sara; Cui, Xiangqin; Du, Xiuxia; Engler, Jeffrey; Kabarowski, Janusz H.; Li, Shuzhao; Pathmasiri, Wimal; Prasain, Jeevan K.; Renfrow, Matthew B.; Tiwari, Hemant K.

    2017-01-01

    Metabolomics, a systems biology discipline representing analysis of known and unknown pathways of metabolism, has grown tremendously over the past 20 years. Because of its comprehensive nature, metabolomics requires careful consideration of the question(s) being asked, the scale needed to answer the question(s), collection and storage of the sample specimens, methods for extraction of the metabolites from biological matrices, the analytical method(s) to be employed and the quality control of the analyses, how collected data are correlated, the statistical methods to determine metabolites undergoing significant change, putative identification of metabolites, and the use of stable isotopes to aid in verifying metabolite identity and establishing pathway connections and fluxes. This second part of a comprehensive description of the methods of metabolomics focuses on data analysis, emerging methods in metabolomics and the future of this discipline. PMID:28239968

  10. Transdermal granisetron for the prevention of nausea and vomiting following moderately or highly emetogenic chemotherapy in Chinese patients: a randomized, double-blind, phase III study.

    PubMed

    Yang, Liu-Qing; Sun, Xin-Chen; Qin, Shu-Kui; Chen, Ying-Xia; Zhang, He-Long; Cheng, Ying; Chen, Zhen-Dong; Shi, Jian-Hua; Wu, Qiong; Bai, Yu-Xian; Han, Bao-Hui; Liu, Wei; Ouyang, Xue-Nong; Liu, Ji-Wei; Zhang, Zhi-Hui; Li, Yong-Qiang; Xu, Jian-Ming; Yu, Shi-Ying

    2016-12-01

    The granisetron transdermal delivery system (GTDS) has demonstrated effectiveness in the control of chemotherapy-induced nausea and vomiting (CINV) in previous studies. This is the first phase III study to evaluate the efficacy and tolerability of GTDS in patients receiving moderately emetogenic chemotherapy (MEC) or highly emetogenic chemotherapy (HEC) in China. A total of 313 patients were randomized into the GTDS group (one transdermal granisetron patch, 7 days) or the oral granisetron group (granisetron oral 2 mg/day, ≥2 days). The primary endpoint was the percentage of patients achieving complete control (CC) from chemotherapy initiation until 24 h after final administration (PEEP). The chi-square test and Fisher's exact test were used for statistical analysis. Two hundred eighty-one patients were included in the per-protocol analysis. During PEEP, CC was achieved by 67 (47.52%) patients in the GTDS group and 83 (59.29%) patients in the oral granisetron group; the difference between the groups was not statistically significant (P=0.0559). However, the difference in the CC percentage arose mainly on the first day of chemotherapy. The CC rate was 70.13% on day 1 in the GTDS group, which was significantly lower than the 91.03% in the oral granisetron group in the full analysis set. On the following days of chemotherapy, the CC rates were similar between the groups. In the subgroups of cisplatin-containing regimens and of female patients, the difference between the groups was statistically significant. Both treatments were well tolerated and safe. The most common adverse event was constipation. GTDS provided effective and well-tolerated control of CINV in Chinese patients, especially with non-cisplatin-containing regimens.

  11. Statistical modeling of crystalline silica exposure by trade in the construction industry using a database compiled from the literature.

    PubMed

    Sauvé, Jean-François; Beaudry, Charles; Bégin, Denis; Dion, Chantal; Gérin, Michel; Lavoué, Jérôme

    2012-09-01

    A quantitative determinants-of-exposure analysis of respirable crystalline silica (RCS) levels in the construction industry was performed using a database compiled from an extensive literature review. Statistical models were developed to predict work-shift exposure levels by trade. Monte Carlo simulation was used to recreate exposures derived from summarized measurements which were combined with single measurements for analysis. Modeling was performed using Tobit models within a multimodel inference framework, with year, sampling duration, type of environment, project purpose, project type, sampling strategy and use of exposure controls as potential predictors. 1346 RCS measurements were included in the analysis, of which 318 were non-detects and 228 were simulated from summary statistics. The model containing all the variables explained 22% of total variability. Apart from trade, sampling duration, year and strategy were the most influential predictors of RCS levels. The use of exposure controls was associated with an average decrease of 19% in exposure levels compared to none, and increased concentrations were found for industrial, demolition and renovation projects. Predicted geometric means for year 1999 were the highest for drilling rig operators (0.238 mg m(-3)) and tunnel construction workers (0.224 mg m(-3)), while the estimated exceedance fraction of the ACGIH TLV by trade ranged from 47% to 91%. The predicted geometric means in this study indicated important overexposure compared to the TLV. However, the low proportion of variability explained by the models suggests that the construction trade is only a moderate predictor of work-shift exposure levels. The impact of the different tasks performed during a work shift should also be assessed to provide better management and control of RCS exposure levels on construction sites.
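    Tobit-type models handle the non-detects as left-censored observations. The sketch below fits only a censored lognormal mean and spread by maximum likelihood, a much-simplified stand-in for the multi-predictor Tobit models in the paper; the measurements and detection limit are hypothetical.

```python
import numpy as np
from scipy import stats, optimize

def censored_lognormal_negloglik(params, log_values, detected):
    """Negative log-likelihood for log-transformed exposures left-censored at the LOD."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)                     # keep sigma positive
    ll_detected = stats.norm.logpdf(log_values[detected], mu, sigma)
    ll_censored = stats.norm.logcdf(log_values[~detected], mu, sigma)  # value = log(LOD)
    return -(ll_detected.sum() + ll_censored.sum())

# Hypothetical RCS measurements (mg/m^3); non-detects carry the detection limit (0.02)
values = np.array([0.12, 0.05, 0.30, 0.02, 0.02, 0.08, 0.45, 0.02, 0.17, 0.06])
detected = np.array([True, True, True, False, False, True, True, False, True, True])

res = optimize.minimize(censored_lognormal_negloglik, x0=[np.log(0.1), 0.0],
                        args=(np.log(values), detected), method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(f"geometric mean = {np.exp(mu_hat):.3f} mg/m^3, GSD = {np.exp(sigma_hat):.2f}")
```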

  12. EFFECTS OF FUNCTIONAL ELECTRICAL STIMULATION IN REHABILITATION WITH HEMIPARESIS PATIENTS

    PubMed Central

    Tanović, Edina

    2009-01-01

    Cerebrovascular accident is a focal neurological deficit occurring suddenly and lasting for more than 24 hours. The purpose of our work is to determine the role of functional electrical stimulation (FES) in the rehabilitation of patients with hemiparesis occurring as a consequence of a cerebrovascular accident. This study includes the analysis of two groups of 40 patients with hemiparesis (20 patients with deep hemiparesis and 20 patients with light hemiparesis): a control group treated only with kinesiotherapy and a tested group treated with kinesiotherapy and functional electrical stimulation. Both groups of patients were analyzed with respect to sex and age. An additional analysis of the walking function was completed in accordance with the BI and RAP indices. The analysis of the basic demographic data demonstrated no significant difference between the control and tested groups; the patients of both groups were comparable with respect to age and sex. After 4 weeks of rehabilitation of patients with deep and light hemiparesis there were no statistically significant differences between the groups on evaluation by the BI index. However, a statistically significant difference was noted between the groups by the RAP index among patients with deep hemiparesis. After 8 weeks of rehabilitation, the group of patients treated with kinesiotherapy and functional electrical stimulation showed statistically significantly better rehabilitation results than the control group on both the BI index and the RAP index (p<0.001). In conclusion, we can state that patients in rehabilitation after a cerebrovascular accident require rehabilitation longer than 4 weeks. Walking rehabilitation after stroke is faster and more successful if functional electrical stimulation is used in combination with kinesiotherapy in patients with disabled extremities. PMID:19284395

  13. Low power and type II errors in recent ophthalmology research.

    PubMed

    Khan, Zainab; Milko, Jordan; Iqbal, Munir; Masri, Moness; Almeida, David R P

    2016-10-01

    To investigate the power of unpaired t tests in prospective, randomized controlled trials when these tests failed to detect a statistically significant difference and to determine the frequency of type II errors. Systematic review and meta-analysis. We examined all prospective, randomized controlled trials published between 2010 and 2012 in 4 major ophthalmology journals (Archives of Ophthalmology, British Journal of Ophthalmology, Ophthalmology, and American Journal of Ophthalmology). Studies that used unpaired t tests were included. Power was calculated using the number of subjects in each group, standard deviations, and α = 0.05. The difference between control and experimental means was set to be (1) 20% and (2) 50% of the absolute value of the control's initial conditions. Power and Precision version 4.0 software was used to carry out calculations. Finally, the proportion of articles with type II errors was calculated. β = 0.3 was set as the largest acceptable value for the probability of type II errors. In total, 280 articles were screened. Final analysis included 50 prospective, randomized controlled trials using unpaired t tests. The median power of tests to detect a 50% difference between means was 0.9 and was the same for all 4 journals regardless of the statistical significance of the test. The median power of tests to detect a 20% difference between means ranged from 0.26 to 0.9 for the 4 journals. The median power of these tests to detect a 50% and 20% difference between means was 0.9 and 0.5, respectively, for tests that did not achieve statistical significance. A total of 14% and 57% of articles with negative unpaired t tests contained results with β > 0.3 when power was calculated for differences between means of 50% and 20%, respectively. A large portion of studies demonstrates high probabilities of type II errors when detecting small differences between means. The power to detect small differences between means varies across journals. It is, therefore, worthwhile for authors to mention the minimum clinically important difference for individual studies. Journals can consider publishing statistical guidelines for authors to use. Day-to-day clinical decisions rely heavily on the evidence base formed by the plethora of studies available to clinicians. Prospective, randomized controlled clinical trials are highly regarded as a robust study design and are used to make important clinical decisions that directly affect patient care. The quality of study designs and statistical methods in major clinical journals is improving over time [1], and researchers and journals are being more attentive to statistical methodologies incorporated by studies. The results of well-designed ophthalmic studies with robust methodologies, therefore, have the ability to modify the ways in which diseases are managed. Copyright © 2016 Canadian Ophthalmological Society. Published by Elsevier Inc. All rights reserved.
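
    The power calculation described here (power of an unpaired t test to detect a difference equal to 20% or 50% of the control mean) can be reproduced with standard software. A minimal Python sketch follows; it uses statsmodels rather than the Power and Precision package named in the abstract, and the control mean, SD and group size are hypothetical.

```python
from statsmodels.stats.power import TTestIndPower

def power_for_fraction(control_mean, sd, n_per_group, fraction, alpha=0.05):
    """Power of a two-sample t test to detect a difference equal to
    `fraction` of the control mean, given a common SD (Cohen's d = diff/SD)."""
    effect_size = fraction * abs(control_mean) / sd
    return TTestIndPower().power(effect_size=effect_size,
                                 nobs1=n_per_group, alpha=alpha, ratio=1.0)

# Hypothetical trial: control mean 14.0, SD 6.0, 25 subjects per arm
for frac in (0.20, 0.50):
    print(f"{frac:.0%} difference: power = {power_for_fraction(14.0, 6.0, 25, frac):.2f}")
```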

  14. Voxel-based statistical analysis of cerebral glucose metabolism in patients with permanent vegetative state after acquired brain injury.

    PubMed

    Kim, Yong Wook; Kim, Hyoung Seop; An, Young-Sil; Im, Sang Hee

    2010-10-01

    Permanent vegetative state is defined as an impaired level of consciousness lasting longer than 12 months after traumatic causes and 3 months after non-traumatic causes of brain injury. Although many studies have assessed cerebral metabolism in patients with acute and persistent vegetative state after brain injury, few studies have investigated cerebral metabolism in patients with permanent vegetative state. In this study, we performed a voxel-based analysis of cerebral glucose metabolism and investigated the relationship between regional cerebral glucose metabolism and the severity of impaired consciousness in patients with permanent vegetative state after acquired brain injury. We compared the regional cerebral glucose metabolism as demonstrated by F-18 fluorodeoxyglucose positron emission tomography from 12 patients with permanent vegetative state after acquired brain injury with those from 12 control subjects. Additionally, covariance analysis was performed to identify regions where decreases in regional cerebral glucose metabolism significantly correlated with a decrease in the level of consciousness as measured by the JFK coma recovery scale. Statistical analysis was performed using statistical parametric mapping. Compared with controls, patients with permanent vegetative state demonstrated decreased cerebral glucose metabolism in the left precuneus, both posterior cingulate cortices, and the left superior parietal lobule (P(corrected) < 0.001), and increased cerebral glucose metabolism in both cerebellar hemispheres and the right supramarginal cortex (P(corrected) < 0.001). In the covariance analysis, a decrease in the level of consciousness was significantly correlated with decreased cerebral glucose metabolism in both posterior cingulate cortices (P(uncorrected) < 0.005). Our findings suggest that the posteromedial parietal cortex, which is part of the neural network for consciousness, may be a relevant structure in the pathophysiological mechanism of permanent vegetative state after acquired brain injury.

  15. The Ups and Downs of Repeated Cleavage and Internal Fragment Production in Top-Down Proteomics.

    PubMed

    Lyon, Yana A; Riggs, Dylan; Fornelli, Luca; Compton, Philip D; Julian, Ryan R

    2018-01-01

    Analysis of whole proteins by mass spectrometry, or top-down proteomics, has several advantages over methods relying on proteolysis. For example, proteoforms can be unambiguously identified and examined. However, from a gas-phase ion-chemistry perspective, proteins are enormous molecules that present novel challenges relative to peptide analysis. Herein, the statistics of cleaving the peptide backbone multiple times are examined to evaluate the inherent propensity for generating internal versus terminal ions. The raw statistics reveal an inherent bias favoring production of terminal ions, which holds true regardless of protein size. Importantly, even if the full suite of internal ions is generated by statistical dissociation, terminal ions are predicted to account for at least 50% of the total ion current, regardless of protein size, if there are three backbone dissociations or fewer. Top-down analysis should therefore be a viable approach for examining proteins of significant size. Comparison of the purely statistical analysis with actual top-down data derived from ultraviolet photodissociation (UVPD) and higher-energy collisional dissociation (HCD) reveals that terminal ions account for much of the total ion current in both experiments. Terminal ion production is more favored in UVPD relative to HCD, which is likely due to differences in the mechanisms controlling fragmentation. Importantly, internal ions are not found to dominate from either the theoretical or experimental point of view.
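
    The combinatorial argument about terminal versus internal ions can be illustrated with a short calculation. Assuming a linear backbone and equal ion current per fragment (a simplification of the authors' statistical treatment), d simultaneous cleavages yield d+1 fragments, of which exactly 2 retain a terminus; the sketch below, using a hypothetical 200-residue protein, shows the terminal fraction dropping to 50% at three dissociations.

```python
from math import comb

def fragment_counts(n_bonds, d):
    """For a linear backbone with n_bonds cleavable bonds and exactly d
    simultaneous cleavages (d >= 1), count terminus-containing and internal
    fragments summed over all C(n_bonds, d) cleavage-site combinations."""
    combos = comb(n_bonds, d)
    terminal = 2 * combos        # every combination yields 2 terminus-containing pieces
    internal = (d - 1) * combos  # and d-1 internal pieces
    return terminal, internal

# Hypothetical 200-residue protein (199 backbone bonds)
for d in range(1, 6):
    t, i = fragment_counts(199, d)
    print(f"{d} cleavages: terminal fragments = {t / (t + i):.0%} of all fragments")
```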

  16. The Ups and Downs of Repeated Cleavage and Internal Fragment Production in Top-Down Proteomics

    NASA Astrophysics Data System (ADS)

    Lyon, Yana A.; Riggs, Dylan; Fornelli, Luca; Compton, Philip D.; Julian, Ryan R.

    2018-01-01

    Analysis of whole proteins by mass spectrometry, or top-down proteomics, has several advantages over methods relying on proteolysis. For example, proteoforms can be unambiguously identified and examined. However, from a gas-phase ion-chemistry perspective, proteins are enormous molecules that present novel challenges relative to peptide analysis. Herein, the statistics of cleaving the peptide backbone multiple times are examined to evaluate the inherent propensity for generating internal versus terminal ions. The raw statistics reveal an inherent bias favoring production of terminal ions, which holds true regardless of protein size. Importantly, even if the full suite of internal ions is generated by statistical dissociation, terminal ions are predicted to account for at least 50% of the total ion current, regardless of protein size, if there are three backbone dissociations or fewer. Top-down analysis should therefore be a viable approach for examining proteins of significant size. Comparison of the purely statistical analysis with actual top-down data derived from ultraviolet photodissociation (UVPD) and higher-energy collisional dissociation (HCD) reveals that terminal ions account for much of the total ion current in both experiments. Terminal ion production is more favored in UVPD relative to HCD, which is likely due to differences in the mechanisms controlling fragmentation. Importantly, internal ions are not found to dominate from either the theoretical or experimental point of view.

  17. Antiviral treatment of Bell's palsy based on baseline severity: a systematic review and meta-analysis.

    PubMed

    Turgeon, Ricky D; Wilby, Kyle J; Ensom, Mary H H

    2015-06-01

    We conducted a systematic review with meta-analysis to evaluate the efficacy of antiviral agents on complete recovery of Bell's palsy. We searched CENTRAL, Embase, MEDLINE, International Pharmaceutical Abstracts, and sources of unpublished literature to November 1, 2014. Primary and secondary outcomes were complete and satisfactory recovery, respectively. To evaluate statistical heterogeneity, we performed subgroup analysis of baseline severity of Bell's palsy and between-study sensitivity analyses based on risk of allocation and detection bias. The 10 included randomized controlled trials (2419 patients; 807 with severe Bell's palsy at onset) had variable risk of bias, with 9 trials having a high risk of bias in at least 1 domain. Complete recovery was not statistically significantly greater with antiviral use versus no antiviral use in the random-effects meta-analysis of 6 trials (relative risk, 1.06; 95% confidence interval, 0.97-1.16; I² = 65%). Conversely, random-effects meta-analysis of 9 trials showed a statistically significant difference in satisfactory recovery (relative risk, 1.10; 95% confidence interval, 1.02-1.18; I² = 63%). Response to antiviral agents did not differ visually or statistically between patients with severe symptoms at baseline and those with milder disease (test for interaction, P = .11). Sensitivity analyses did not show a clear effect of bias on outcomes. Antiviral agents are not efficacious in increasing the proportion of patients with Bell's palsy who achieved complete recovery, regardless of baseline symptom severity. Copyright © 2015 Elsevier Inc. All rights reserved.
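
    A random-effects meta-analysis of relative risks of the kind reported above can be sketched in a few lines. The example below uses the DerSimonian-Laird estimator on hypothetical per-trial log relative risks and variances; it is a generic illustration, not a re-analysis of the Bell's palsy trials.

```python
import numpy as np

def dersimonian_laird(log_rr, var_log_rr):
    """Random-effects pooling of log relative risks (DerSimonian-Laird)."""
    log_rr, var = np.asarray(log_rr, float), np.asarray(var_log_rr, float)
    w = 1.0 / var                                  # fixed-effect weights
    fixed = np.sum(w * log_rr) / np.sum(w)
    q = np.sum(w * (log_rr - fixed) ** 2)          # Cochran's Q
    df = len(log_rr) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                  # between-study variance
    w_star = 1.0 / (var + tau2)
    pooled = np.sum(w_star * log_rr) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    ci = np.exp([pooled - 1.96 * se, pooled + 1.96 * se])
    return np.exp(pooled), ci, i2

# Hypothetical per-trial log RRs and their variances
rr, ci, i2 = dersimonian_laird([0.05, 0.15, -0.02, 0.20],
                               [0.004, 0.010, 0.006, 0.012])
print(f"pooled RR = {rr:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}, I^2 = {i2:.0f}%")
```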

  18. Chronic atrophic gastritis in association with hair mercury level.

    PubMed

    Xue, Zeyun; Xue, Huiping; Jiang, Jianlan; Lin, Bing; Zeng, Si; Huang, Xiaoyun; An, Jianfu

    2014-11-01

    The objective of this study was to explore hair mercury level in association with chronic atrophic gastritis, a precancerous stage of gastric cancer (GC), and thus provide a new angle of view on the timely intervention of the precancerous stage of GC. We recruited 149 healthy volunteers as controls and 152 patients suffering from chronic gastritis as cases. The controls denied upper gastrointestinal discomforts, and the cases were diagnosed as chronic superficial gastritis (n=68) or chronic atrophic gastritis (n=84). We used a Mercury Automated Analyzer (NIC MA-3000) to measure the hair mercury level of both the healthy controls and the chronic gastritis cases. Measurement data were expressed as mean ± standard deviation (SD) and analyzed using Levene's test for equality of variances and the t test. Pearson correlation analysis was employed to determine factors associated with hair mercury levels, and multiple stepwise regression analysis was performed to derive regression equations. Statistical significance was accepted at p values less than 0.05. The overall hair mercury level was 0.908949 ± 0.8844490 ng/g (mean ± SD) in gastritis cases and 0.460198 ± 0.2712187 ng/g (mean ± SD) in healthy controls; the former was significantly higher than the latter (p<0.01). The hair mercury level in the chronic atrophic gastritis subgroup was 1.155220 ± 0.9470246 ng/g (mean ± SD) and that in the chronic superficial gastritis subgroup was 0.604732 ± 0.6942509 ng/g (mean ± SD); the former was significantly higher than the latter (p<0.01). The hair mercury level in chronic superficial gastritis cases was significantly higher than that in healthy controls (p<0.05), and the level in chronic atrophic gastritis cases was significantly higher than that in healthy controls (p<0.01). Stratified analysis indicated that the hair mercury level in healthy controls who ate seafood was significantly higher than that in healthy controls who did not (p<0.01), and that the hair mercury level in chronic atrophic gastritis cases was significantly higher than that in chronic superficial gastritis cases (p<0.01). Pearson correlation analysis indicated that seafood consumption was the factor most strongly and positively correlated with hair mercury level in the healthy controls, whereas the severity of gastritis was the factor most strongly and positively correlated with hair mercury level in the gastritis cases. Multiple stepwise regression gave the equation hair mercury level = 0.262 × (seafood consumption) + 0.434 for the controls and hair mercury level = 0.305 × (severity of gastritis) for the gastritis cases; both models were statistically significant (p<0.01). The plots of regression standardized residuals for both controls and cases conformed to a normal distribution. The main positively correlated factor affecting the hair mercury level is seafood consumption in healthy people, whereas the predominant positively correlated factor is the severity of gastritis in chronic gastritis patients; that is, the severity of chronic gastritis is positively correlated with the level of hair mercury. The steadily increasing level of hair mercury possibly reflects the progression from normal stomach to superficial gastritis and then to atrophic gastritis. The detection of hair mercury is potentially a means to predict the severity of chronic gastritis and possibly to indicate the environmental mercury threat to human health in terms of gastritis or even carcinogenesis.

  19. The Australasian Resuscitation in Sepsis Evaluation (ARISE) trial statistical analysis plan.

    PubMed

    Delaney, Anthony P; Peake, Sandra L; Bellomo, Rinaldo; Cameron, Peter; Holdgate, Anna; Howe, Belinda; Higgins, Alisa; Presneill, Jeffrey; Webb, Steve

    2013-09-01

    The Australasian Resuscitation in Sepsis Evaluation (ARISE) study is an international, multicentre, randomised, controlled trial designed to evaluate the effectiveness of early goal-directed therapy compared with standard care for patients presenting to the emergency department with severe sepsis. In keeping with current practice, and considering aspects of trial design and reporting specific to non-pharmacological interventions, our plan outlines the principles and methods for analysing and reporting the trial results. The document is prepared before completion of recruitment into the ARISE study, without knowledge of the results of the interim analysis conducted by the data safety and monitoring committee and before completion of the two related international studies. Our statistical analysis plan was designed by the ARISE chief investigators, and reviewed and approved by the ARISE steering committee. We reviewed the data collected by the research team as specified in the study protocol and detailed in the study case report form. We describe information related to baseline characteristics, characteristics of delivery of the trial interventions, details of resuscitation, other related therapies and other relevant data with appropriate comparisons between groups. We define the primary, secondary and tertiary outcomes for the study, with description of the planned statistical analyses. We have developed a statistical analysis plan with a trial profile, mock-up tables and figures. We describe a plan for presenting baseline characteristics, microbiological and antibiotic therapy, details of the interventions, processes of care and concomitant therapies and adverse events. We describe the primary, secondary and tertiary outcomes with identification of subgroups to be analysed. We have developed a statistical analysis plan for the ARISE study, available in the public domain, before the completion of recruitment into the study. This will minimise analytical bias and conforms to current best practice in conducting clinical trials.

  20. Endpoint in plasma etch process using new modified w-multivariate charts and windowed regression

    NASA Astrophysics Data System (ADS)

    Zakour, Sihem Ben; Taleb, Hassen

    2017-09-01

    Endpoint detection is a very important undertaking for understanding and determining whether a plasma etching process has been carried out correctly, especially if the etched area is very small (0.1%). It is a crucial part of delivering repeatable results on every wafer. When the film being etched has been completely cleared, the endpoint is reached. To ensure the desired device performance on the produced integrated circuit, an optical emission spectroscopy (OES) sensor is employed. The large number of gathered wavelengths (profiles) is then analyzed and pre-processed using a newly proposed simple algorithm named spectra peak selection (SPS) to select the important wavelengths, and wavelet analysis (WA) is then employed to enhance detection performance by suppressing noise and redundant information. The selected and treated OES wavelengths are then used in modified multivariate control charts (MEWMA and Hotelling) for three statistics (mean, SD and CV) and in windowed polynomial regression for the mean. The use of the three aforementioned statistics is motivated by the need to control shifts in the mean, shifts in the variance, and their ratio (CV) when neither the mean nor the SD is stable. The control charts demonstrate their performance in detecting the endpoint, with the W-mean Hotelling chart performing best and the CV statistic giving the worst result. As the best detection of the endpoint is given by the W-Hotelling mean statistic, this statistic is used to construct a windowed wavelet Hotelling polynomial regression. The latter can only identify the window containing the endpoint phenomenon.
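
    A Hotelling-type multivariate control chart of the sort used here can be illustrated compactly. The sketch below computes the Hotelling T² statistic for streaming OES-like observations against a baseline (in-control) window and flags the first sample exceeding an empirical control limit; the data, the number of wavelengths and the limit are hypothetical, and the wavelet pre-processing and windowed regression steps of the paper are omitted.

```python
import numpy as np

def hotelling_t2(X, baseline):
    """Hotelling T^2 of each observation in X relative to the mean and
    covariance estimated from a baseline (in-control) window."""
    mu = baseline.mean(axis=0)
    S_inv = np.linalg.inv(np.cov(baseline, rowvar=False))
    d = X - mu
    return np.einsum('ij,jk,ik->i', d, S_inv, d)

rng = np.random.default_rng(0)
baseline = rng.normal(size=(200, 3))              # pre-endpoint intensities at 3 wavelengths
online = np.vstack([rng.normal(size=(50, 3)),     # still etching
                    rng.normal(loc=3.0, size=(20, 3))])  # emission shift near endpoint

t2 = hotelling_t2(online, baseline)
limit = np.quantile(hotelling_t2(baseline, baseline), 0.999)  # empirical control limit
flag = np.argmax(t2 > limit)                      # index of first out-of-control sample
print(f"endpoint flagged at sample {flag}")
```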

  1. Local image statistics: maximum-entropy constructions and perceptual salience

    PubMed Central

    Victor, Jonathan D.; Conte, Mary M.

    2012-01-01

    The space of visual signals is high-dimensional and natural visual images have a highly complex statistical structure. While many studies suggest that only a limited number of image statistics are used for perceptual judgments, a full understanding of visual function requires analysis not only of the impact of individual image statistics, but also, how they interact. In natural images, these statistical elements (luminance distributions, correlations of low and high order, edges, occlusions, etc.) are intermixed, and their effects are difficult to disentangle. Thus, there is a need for construction of stimuli in which one or more statistical elements are introduced in a controlled fashion, so that their individual and joint contributions can be analyzed. With this as motivation, we present algorithms to construct synthetic images in which local image statistics—including luminance distributions, pair-wise correlations, and higher-order correlations—are explicitly specified and all other statistics are determined implicitly by maximum-entropy. We then apply this approach to measure the sensitivity of the human visual system to local image statistics and to sample their interactions. PMID:22751397

  2. Metabolomics analysis of follicular fluid in women with ovarian endometriosis undergoing in vitro fertilization.

    PubMed

    Karaer, Abdullah; Tuncay, Gorkem; Mumcu, Akın; Dogan, Berat

    2018-05-28

    The purpose of this study was to investigate whether a change in the follicular fluid metabolomics profile due to endometrioma is identifiable. Twelve women with ovarian endometriosis (aged <40 years, with a body mass index [BMI] of <30 kg/m2) and 12 age- and BMI-matched controls (women with infertility purely due to a male factor) underwent ovarian stimulation for intracytoplasmic sperm injection (ICSI). Follicular fluid samples were collected from both groups at the time of oocyte retrieval for ICSI. Next, nuclear magnetic resonance (NMR) spectroscopy was performed on the collected follicular fluids. The metabolic compositions of the follicular fluids were then compared using univariate and multivariate statistical analyses of the NMR data. These analyses showed that the metabolomic profiles of the follicular fluids obtained from the women with ovarian endometriosis were distinctly different from those obtained from the control group. In comparison with the controls, the follicular fluids of the women with ovarian endometriosis had statistically significantly elevated levels of lactate, β-glucose, pyruvate, and valine. We conclude that the levels of lactate, β-glucose, pyruvate, and valine in the follicular fluid of the women with endometrioma were higher than those of the controls. ASRM: American Society for Reproductive Medicine; BMI: body mass index; CPMG: Carr-Purcell-Meiboom-Gill; E2: estradiol; ESHRE: European Society of Human Reproduction and Embryology; ERETIC: electronic reference to access in vivo concentrations; FF: follicular fluid; FSH: follicle-stimulating hormone; hCG: human chorionic gonadotropin; HEPES: 4-(2-hydroxyethyl)-1-piperazineethanesulfonic acid; ICSI: intracytoplasmic sperm injection; IVF: in vitro fertilization; NMR: nuclear magnetic resonance spectroscopy; PCA: principal component analysis; PCOS: polycystic ovary syndrome; PLS-DA: partial least squares discriminant analysis; ppm: parts per million; PULCON: pulse length-based concentration determination; TSP: 3-(trimethylsilyl)-1-propanesulfonic acid sodium salt; VIP: variable importance in projection.

  3. Using Technology to Expand and Enhance Applied Behavioral Analysis Programs for Children with Autism in Military Families

    DTIC Science & Technology

    2014-07-01

    ... statistically controlling for the effects of the pretest, the difference between the treatment and control group means on the posttest was large (i.e., partial eta squared of .708) ... posttest results on the BISPA are shown in the upper-left panel of Fig. 1. Both groups performed poorly on the BISPA during the pretest, although the ...

  4. Lessons Learned from the Implementation of Total Quality Management at the Naval Aviation Depot, North Island, California

    DTIC Science & Technology

    1988-12-01

    Kaoru Ishikawa recognized the potential of statistical process control during one of Dr. Deming's many instructional visits to Japan. He wrote the Guide to Quality Control, which has been utilized for both self-study and classroom training. In the Guide to Quality Control, Dr. Ishikawa describes ... job data are essential for making a proper evaluation (Ishikawa, p. 14). The gathering of data and its subsequent analysis are the foundation of ...

  5. Design and Analysis of A Multi-Backend Database System for Performance Improvement, Functionality Expansion and Capacity Growth. Part II.

    DTIC Science & Technology

    1981-08-01

    ... basic access control mechanism for statistical security and value-dependent security. In Section 5.5, we describe the process of execution of ... the process of request execution with access control for insert and non-insert requests in MDBS. We recall again (see Chapter 4) that the process ...

  6. USAF (United States Air Force) Stability and Control DATCOM (Data Compendium)

    DTIC Science & Technology

    1978-04-01

    ... regression analysis of mathematical statistics. In general, a regression analysis involves the study of a group of variables to determine their effect on a given parameter. Because of the empirical nature of this ...

  7. Planning representation for automated exploratory data analysis

    NASA Astrophysics Data System (ADS)

    St. Amant, Robert; Cohen, Paul R.

    1994-03-01

    Igor is a knowledge-based system for exploratory statistical analysis of complex systems and environments. Igor has two related goals: to help automate the search for interesting patterns in data sets, and to help develop models that capture significant relationships in the data. We outline a language for Igor, based on techniques of opportunistic planning, which balances control and opportunism. We describe the application of Igor to the analysis of the behavior of Phoenix, an artificial intelligence planning system.

  8. Spatial characterization of dissolved trace elements and heavy metals in the upper Han River (China) using multivariate statistical techniques.

    PubMed

    Li, Siyue; Zhang, Quanfa

    2010-04-15

    A data matrix (4032 observations), obtained during a 2-year monitoring period (2005-2006) from 42 sites in the upper Han River, was subjected to various multivariate statistical techniques including cluster analysis, principal component analysis (PCA), factor analysis (FA), correlation analysis and analysis of variance to determine the spatial characterization of dissolved trace elements and heavy metals. Our results indicate that waters in the upper Han River are primarily polluted by Al, As, Cd, Pb, Sb and Se, and the potential pollutants include Ba, Cr, Hg, Mn and Ni. The spatial distribution of trace metals indicates that the polluted sections are mainly concentrated in the Danjiang, the Danjiangkou Reservoir catchment and the Hanzhong Plain, with the most contaminated river reach located in the Hanzhong Plain. Q-mode clustering, which depends on the geographical location of the sampling sites, groups the 42 sampling sites into four clusters with respect to water quality, i.e., Danjiang, Danjiangkou Reservoir region (lower catchment), upper catchment and one river in the headwaters. The headwaters, the Danjiang and lower catchment, and the upper catchment correspond to very highly polluted, moderately polluted and relatively low-polluted regions, respectively. Additionally, PCA/FA and correlation analysis demonstrate that Al, Cd, Mn, Ni, Fe, Si and Sr are controlled by natural sources, whereas the other metals appear to be primarily controlled by anthropogenic origins, although geogenic sources also contribute to them. 2009 Elsevier B.V. All rights reserved.
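
    The combination of standardization, hierarchical clustering and PCA applied to a sites-by-metals matrix can be sketched briefly. The example below uses scikit-learn and SciPy on a synthetic 42-site by 10-variable matrix; the data are random placeholders, and the number of clusters is fixed at four only to mirror the grouping reported above.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical water-quality matrix: rows = sampling sites, columns = metal concentrations
rng = np.random.default_rng(1)
X = rng.lognormal(mean=0.0, sigma=1.0, size=(42, 10))

Z = StandardScaler().fit_transform(np.log(X))      # log-transform and standardize

# Q-mode hierarchical clustering of sites (Ward linkage), cut into four groups
groups = fcluster(linkage(Z, method='ward'), t=4, criterion='maxclust')

# PCA to identify the dominant sources of variance and their loadings
pca = PCA(n_components=3).fit(Z)
print("cluster membership:", groups)
print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 2))
print("loadings of PC1:", np.round(pca.components_[0], 2))
```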

  9. Energy Savings Analysis for Energy Monitoring and Control Systems

    DTIC Science & Technology

    1995-01-01

    ... for evaluating design and construction quality, and for studying the effectiveness of air-tightening AC retrofits. No simple relationship ... These models of residential infiltration are based on statistical fits of ... Energy Resource Center (1983) include information on air tightening in ...

  10. Risk Driven Outcome-Based Command and Control (C2) Assessment

    DTIC Science & Technology

    2000-01-01

    ... shaping the risk ranking scores into more interpretable and statistically sound risk measures. Regression analysis was applied to determine what ...

  11. Quality control analysis : part IV : field simulation of asphaltic concrete specifications.

    DOT National Transportation Integrated Search

    1969-02-01

    The report presents some of the major findings from a simulated study of statistical specifications on three asphaltic concrete projects representing a total of approximately 30,000 tons of hot mix. The major emphasis of the study has been on the a...

  12. Energy Monitoring and Targeting as diagnosis; Applying work analysis to adapt a statistical change detection strategy using representation aiding

    NASA Astrophysics Data System (ADS)

    Hilliard, Antony

    Energy Monitoring and Targeting (M&T) is a well-established business process that develops information about utility energy consumption in a business or institution. While M&T has persisted as a worthwhile energy conservation support activity, it has not been widely adopted. This dissertation explains M&T challenges in terms of diagnosing and controlling energy consumption, informed by a naturalistic field study of M&T work. A Cognitive Work Analysis of M&T identifies structures that diagnosis can search, information flows unsupported in canonical support tools, and opportunities to extend the most popular tool for M&T: Cumulative Sum of Residuals (CUSUM) charts. A design application outlines how CUSUM charts were augmented with a more contemporary statistical change detection strategy, Recursive Parameter Estimates, modified to better suit the M&T task using Representation Aiding principles. The design was experimentally evaluated in a controlled M&T synthetic task and was shown to significantly improve diagnosis performance.
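
    The CUSUM-of-residuals tool discussed here is simple to express in code. The sketch below fits a baseline energy model against a weather driver (degree-days) and accumulates the residuals, so that a sustained drift signals a change in consumption; the monthly figures are invented for illustration and the Recursive Parameter Estimates extension is not shown.

```python
import numpy as np

def cusum_of_residuals(consumption, driver):
    """Fit a baseline energy model (consumption vs. a driver such as
    degree-days) and return the cumulative sum of its residuals."""
    slope, intercept = np.polyfit(driver, consumption, deg=1)
    residuals = consumption - (slope * driver + intercept)
    return np.cumsum(residuals)

# Hypothetical monthly data: degree-days and metered energy use (kWh)
dd = np.array([300, 280, 220, 150, 90, 60, 55, 70, 120, 200, 260, 310], dtype=float)
kwh = 50 * dd + 2000 + np.random.default_rng(2).normal(0, 500, size=12)
kwh[8:] -= 3000                        # pretend a conservation measure kicks in

cusum = cusum_of_residuals(kwh, dd)    # a sustained downward drift flags the change
print(np.round(cusum, 0))
```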

  13. Walking execution is not affected by divided attention in patients with multiple sclerosis with no disability, but there is a motor planning impairment.

    PubMed

    Nogueira, Leandro Alberto Calazans; Santos, Luciano Teixeira Dos; Sabino, Pollyane Galinari; Alvarenga, Regina Maria Papais; Thuler, Luiz Claudio Santos

    2013-08-01

    We analysed the cognitive influence on walking in multiple sclerosis (MS) patients in the absence of clinical disability. A case-control study was conducted with 12 MS patients with no disability and 12 matched healthy controls. Subjects were referred for completion of a 10-m timed walk test and a 3D kinematic analysis. Participants were instructed to walk at a comfortable speed in a dual-task (arithmetic task) condition, and motor planning was measured by mental chronometry. Walking speed and cadence showed no statistically significant differences between the groups in the three conditions. The dual-task condition showed an increase in double support duration in both groups. Motor imagery analysis showed statistically significant differences between real and imagined walking in patients. MS patients with no disability did not show any influence of divided attention on walking execution. However, motor planning was overestimated as compared with real walking.

  14. Data on xylem sap proteins from Mn- and Fe-deficient tomato plants obtained using shotgun proteomics.

    PubMed

    Ceballos-Laita, Laura; Gutierrez-Carbonell, Elain; Takahashi, Daisuke; Abadía, Anunciación; Uemura, Matsuo; Abadía, Javier; López-Millán, Ana Flor

    2018-04-01

    This article contains consolidated proteomic data obtained from xylem sap collected from tomato plants grown under Fe- and Mn-sufficient control conditions, as well as under Fe-deficient and Mn-deficient conditions. Data presented here cover proteins identified and quantified by shotgun proteomics and Progenesis LC-MS analyses: proteins identified with at least two peptides and showing statistically significant changes (ANOVA; p ≤ 0.05) above a biologically relevant threshold (fold change ≥ 2) between treatments are listed. The comparison between Fe-deficient, Mn-deficient and control xylem sap samples using a multivariate statistical data analysis (Principal Component Analysis, PCA) is also included. Data included in this article are discussed in depth in the research article entitled "Effects of Fe and Mn deficiencies on the protein profiles of tomato (Solanum lycopersicum) xylem sap as revealed by shotgun analyses" [1]. This dataset is made available to support the cited study as well as to extend analyses at a later stage.

  15. MAGMA: analysis of two-channel microarrays made easy.

    PubMed

    Rehrauer, Hubert; Zoller, Stefan; Schlapbach, Ralph

    2007-07-01

    The web application MAGMA provides a simple and intuitive interface to identify differentially expressed genes from two-channel microarray data. While the underlying algorithms are not superior to those of similar web applications, MAGMA is particularly user friendly and can be used without prior training. The user interface guides the novice user through the most typical microarray analysis workflow consisting of data upload, annotation, normalization and statistical analysis. It automatically generates R-scripts that document MAGMA's entire data processing steps, thereby allowing the user to regenerate all results in his local R installation. The implementation of MAGMA follows the model-view-controller design pattern that strictly separates the R-based statistical data processing, the web-representation and the application logic. This modular design makes the application flexible and easily extendible by experts in one of the fields: statistical microarray analysis, web design or software development. State-of-the-art Java Server Faces technology was used to generate the web interface and to perform user input processing. MAGMA's object-oriented modular framework makes it easily extendible and applicable to other fields and demonstrates that modern Java technology is also suitable for rather small and concise academic projects. MAGMA is freely available at www.magma-fgcz.uzh.ch.

  16. Water quality analysis of the Rapur area, Andhra Pradesh, South India using multivariate techniques

    NASA Astrophysics Data System (ADS)

    Nagaraju, A.; Sreedhar, Y.; Thejaswi, A.; Sayadi, Mohammad Hossein

    2017-10-01

    Groundwater samples from the Rapur area were collected from different sites to evaluate the major ion chemistry. The large number of data can lead to difficulties in the integration, interpretation, and representation of the results. Two multivariate statistical methods, hierarchical cluster analysis (HCA) and factor analysis (FA), were applied to evaluate their usefulness in classifying and identifying the geochemical processes controlling groundwater geochemistry. Four statistically significant clusters were obtained from 30 sampling stations. This resulted in two important clusters, viz., cluster 1 (pH, Si, CO3, Mg, SO4, Ca, K, HCO3, alkalinity, Na, Na + K, Cl, and hardness) and cluster 2 (EC and TDS), which are released to the study area from different sources. The application of different multivariate statistical techniques, such as principal component analysis (PCA), assists in the interpretation of complex data matrices for a better understanding of the water quality of a study area. From PCA, it is clear that the first factor (factor 1), which accounted for 36.2% of the total variance, had high positive loadings for EC, Mg, Cl, TDS, and hardness. Based on the PCA scores, four significant cluster groups of sampling locations were detected on the basis of the similarity of their water quality.

  17. Global hospital bed utilization crisis. A different approach.

    PubMed

    Waness, Abdelkarim; Akbar, Jalal U; Kharal, Mubashar; BinSalih, Salih; Harakati, Mohammed

    2010-04-01

    To test the effect of improved physician availability on hospital bed utilization. A prospective cohort study was conducted from 1st January 2009 to 31st March 2009 in the Division of Internal Medicine (DIM), King Abdul-Aziz Medical City (KAMC), Riyadh, Kingdom of Saudi Arabia. Two clinical teaching units (CTU) were compared head-to-head. Each CTU has 3 consultants. The CTU-control provided standard care, while the CTU-intervention was designed to provide better physician-consultant availability. Three outcomes were evaluated: patient outsourcing to another hospital, patient discharge during weekends, and overall admissions. Statistical analysis was carried out with the electronic statistics calculator from the Center for Evidence-Based Medicine. Three hundred and thirty-four patients were evaluated for admission at the Emergency Room by both CTUs. One hundred and eighty-three patients were seen by the CTU-control; 6 patients were outsourced, and 177 were admitted. One hundred and fifty-one patients were seen by the CTU-intervention; 39 of them were outsourced, and 112 were admitted. Forty-eight weekend patient discharges occurred during this period of time: 21 by the CTU-control and 27 by the CTU-intervention. Odds ratio analysis of both the outsourcing rate and weekend discharges showed statistical significance in favor of the intervention group. The continuous availability of a physician-consultant for patient admission evaluation, outsourcing, or discharge during regular weekdays and weekends at DIM, KAMC proved to have a positive impact on bed utilization.

  18. Apolipoprotein E gene polymorphism and Alzheimer's disease in Chinese population: a meta-analysis

    NASA Astrophysics Data System (ADS)

    Liu, Mengying; Bian, Chen; Zhang, Jiqiang; Wen, Feng

    2014-03-01

    The relationship between apolipoprotein E (ApoE) genotype and the risk of Alzheimer's disease (AD) is relatively well established in Caucasians, but less established in other ethnicities. To examine the association between ApoE polymorphism and the onset of AD in the Chinese population, we searched the commonly used electronic databases between January 2000 and November 2013 for relevant studies. In total, 20 studies, including 1576 cases and 1741 controls, were retrieved. The results showed a statistically significant positive association between carriage of the risk factor ɛ4 allele and AD in the Chinese population (OR = 3.93, 95% CI = 3.37-4.58, P < 0.00001). The genotypes ApoE ɛ4/ɛ4 and ɛ4/ɛ3 had statistically significant associations with AD as well (ɛ4/ɛ4: OR = 11.76, 95% CI = 6.38-21.47, P < 0.00001; ɛ4/ɛ3: OR = 3.08, 95% CI = 2.57-3.69, P < 0.00001). Furthermore, the frequency of the ApoE ɛ3 allele was lower in AD than in the healthy controls, and this difference was also statistically significant (OR = 0.42, 95% CI = 0.37-0.47, P < 0.00001). No significant heterogeneity was observed among the studies. This meta-analysis suggests that subjects with at least one ApoE ɛ4 allele have a higher risk of suffering from AD than controls in the Chinese population. The results also provide support for a protective effect of the ApoE ɛ3 allele against developing AD.

  19. Effects of microcurrent stimulation on hyaline cartilage repair in immature male rats (Rattus norvegicus).

    PubMed

    de Campos Ciccone, Carla; Zuzzi, Denise Cristina; Neves, Lia Mara Grosso; Mendonça, Josué Sampaio; Joazeiro, Paulo Pinto; Esquisatto, Marcelo Augusto Marretto

    2013-01-19

    In this study, we investigated the effects of microcurrent stimulation on the repair process of xiphoid cartilage in 45-day-old rats. Twenty male rats were divided into a control group and a treated group. A 3-mm defect was created with a punch in anesthetized animals. In the treated group, animals were submitted to daily 5-min applications of a biphasic square-pulse microgalvanic continuous electrical current. Each application used a frequency of 0.3 Hz and an intensity of 20 μA. The animals were sacrificed at 7, 21 and 35 days after injury for structural analysis. Basophilia increased gradually in control animals during the experimental period. In treated animals, newly formed cartilage was observed on days 21 and 35. No statistically significant differences in birefringent collagen fibers were seen between groups at any of the time points. Treated animals presented a statistically larger number of chondroblasts. Calcification points were observed in treated animals on day 35. Ultrastructural analysis revealed differences in cell and matrix characteristics between the two groups. Chondrocyte-like cells were seen in control animals only after 35 days, whereas they were present in treated animals as early as day 21. The number of cuprolinic blue-stained proteoglycans was statistically higher in treated animals on days 21 and 35. We conclude that microcurrent stimulation accelerates cartilage repair at a non-articular site in prepubertal animals.

  20. Effects of microcurrent stimulation on Hyaline cartilage repair in immature male rats (Rattus norvegicus)

    PubMed Central

    2013-01-01

    Background In this study, we investigated the effects of microcurrent stimulation on the repair process of xiphoid cartilage in 45-day-old rats. Methods Twenty male rats were divided into a control group and a treated group. A 3-mm defect was created with a punch in anesthetized animals. In the treated group, animals were submitted to daily 5-min applications of a biphasic square-pulse microgalvanic continuous electrical current. Each application used a frequency of 0.3 Hz and an intensity of 20 μA. The animals were sacrificed at 7, 21 and 35 days after injury for structural analysis. Results Basophilia increased gradually in control animals during the experimental period. In treated animals, newly formed cartilage was observed on days 21 and 35. No statistically significant differences in birefringent collagen fibers were seen between groups at any of the time points. Treated animals presented a statistically larger number of chondroblasts. Calcification points were observed in treated animals on day 35. Ultrastructural analysis revealed differences in cell and matrix characteristics between the two groups. Chondrocyte-like cells were seen in control animals only after 35 days, whereas they were present in treated animals as early as day 21. The number of cuprolinic blue-stained proteoglycans was statistically higher in treated animals on days 21 and 35. Conclusion We conclude that microcurrent stimulation accelerates cartilage repair at a non-articular site in prepubertal animals. PMID:23331612

  1. Breast cancer lymphoscintigraphy: Factors associated with sentinel lymph node non visualization.

    PubMed

    Vaz, S C; Silva, Â; Sousa, R; Ferreira, T C; Esteves, S; Carvalho, I P; Ratão, P; Daniel, A; Salgado, L

    2015-01-01

    To evaluate factors associated with non-identification of the sentinel lymph node (SLN) in lymphoscintigraphy of breast cancer patients and to analyze the relationship with SLN metastases. A single-center, cross-sectional and retrospective study was performed. Forty patients with lymphoscintigraphy without sentinel lymph node identification (negative lymphoscintigraphy - NL) were enrolled. The control group included 184 patients with SLN identification (positive lymphoscintigraphy - PL). Evaluated factors were age, body mass index (BMI), tumor size, histology, localization, preoperative breast lesion hookwire (harpoon) marking and SLN metastases. The statistical analysis was performed with uni- and multivariate logistic regression models and matched-pairs analysis. Age (p=0.036) and BMI (p=0.047) were the only factors significantly associated with NL. Being ≥60 years of age or having a BMI ≥30 increased the odds of having an NL 2 and 3.8 times, respectively. Marking with a hookwire seems to increase the likelihood of NL, but this did not reach statistical significance (p=0.087). The other tested variables did not affect the examination result. When controlling for age, BMI and marking with the harpoon, a significant association between lymph node metastization and NL was not found (p=0.565). The most important factors related to non-identification of the SLN in these patients were age, BMI and marking with a hookwire; however, only the first two reached statistical significance. When these variables were controlled, no association was found between NL and axillary metastases. Copyright © 2015 Elsevier España, S.L.U. and SEMNIM. All rights reserved.
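
    A multivariate logistic regression of the kind used in this study can be sketched with statsmodels. The example below simulates patient-level data (age, BMI, hookwire marking) and a binary negative-lymphoscintigraphy outcome, then reports odds ratios for dichotomized predictors; all values and coefficients are invented and do not reproduce the study's estimates (only the total of 224 patients mirrors the study's sample size).

```python
import numpy as np
import statsmodels.api as sm

# Simulated patient-level data: outcome 1 = negative lymphoscintigraphy (NL)
rng = np.random.default_rng(3)
n = 224
age = rng.normal(60, 12, n)
bmi = rng.normal(27, 5, n)
hookwire = rng.integers(0, 2, n)
linpred = -6 + 0.03 * age + 0.08 * bmi + 0.4 * hookwire   # invented coefficients
nl = rng.binomial(1, 1 / (1 + np.exp(-linpred)))

# Dichotomized predictors, as in the age/BMI thresholds reported above
X = sm.add_constant(np.column_stack([age >= 60, bmi >= 30, hookwire]).astype(float))
fit = sm.Logit(nl, X).fit(disp=0)
print(np.exp(fit.params[1:]))   # odds ratios for age >=60, BMI >=30, hookwire marking
```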

  2. Scalable privacy-preserving data sharing methodology for genome-wide association studies.

    PubMed

    Yu, Fei; Fienberg, Stephen E; Slavković, Aleksandra B; Uhler, Caroline

    2014-08-01

    The protection of privacy of individual-level information in genome-wide association study (GWAS) databases has been a major concern of researchers following the publication of "an attack" on GWAS data by Homer et al. (2008). Traditional statistical methods for confidentiality and privacy protection of statistical databases do not scale well to deal with GWAS data, especially in terms of guarantees regarding protection from linkage to external information. The more recent concept of differential privacy, introduced by the cryptographic community, is an approach that provides a rigorous definition of privacy with meaningful privacy guarantees in the presence of arbitrary external information, although the guarantees may come at a serious price in terms of data utility. Building on such notions, Uhler et al. (2013) proposed new methods to release aggregate GWAS data without compromising an individual's privacy. We extend the methods developed in Uhler et al. (2013) for releasing differentially-private χ²-statistics by allowing for an arbitrary number of cases and controls, and for releasing differentially-private allelic test statistics. We also provide a new interpretation by assuming the controls' data are known, which is a realistic assumption because some GWAS use publicly available data as controls. We assess the performance of the proposed methods through a risk-utility analysis on a real data set consisting of DNA samples collected by the Wellcome Trust Case Control Consortium and compare the methods with the differentially-private release mechanism proposed by Johnson and Shmatikov (2013). Copyright © 2014 Elsevier Inc. All rights reserved.
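
    The basic idea behind releasing a differentially private test statistic is the addition of calibrated noise. The sketch below applies the Laplace mechanism to a single chi-square statistic; the sensitivity bound and epsilon are placeholders, since the sensitivity analysis for GWAS χ² and allelic tests is the substance of the cited papers and is not reproduced here.

```python
import numpy as np

def laplace_release(statistic, sensitivity, epsilon, rng):
    """Release a statistic with epsilon-differential privacy by adding
    Laplace noise scaled to the statistic's global sensitivity."""
    return statistic + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

rng = np.random.default_rng(5)
chi2_true = 7.9        # hypothetical allelic chi-square statistic for one SNP
sensitivity = 8.0      # placeholder bound; the real bound depends on the test and sample sizes
epsilon = 1.0
print(f"noisy statistic: {laplace_release(chi2_true, sensitivity, epsilon, rng):.2f}")
```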

  3. Robust inference for group sequential trials.

    PubMed

    Ganju, Jitendra; Lin, Yunzhi; Zhou, Kefei

    2017-03-01

    For ethical reasons, group sequential trials were introduced to allow trials to stop early in the event of extreme results. Endpoints in such trials are usually mortality or irreversible morbidity. For a given endpoint, the norm is to use a single test statistic and to use that same statistic for each analysis. This approach is risky because the test statistic has to be specified before the study is unblinded, and there is a loss in power if the assumptions that ensure optimality for each analysis are not met. To minimize the risk of moderate to substantial loss in power due to a suboptimal choice of a statistic, a robust method was developed for nonsequential trials. The concept is analogous to diversification of financial investments to minimize risk. The method is based on combining P values from multiple test statistics for formal inference while controlling the type I error rate at its designated value. This article evaluates the performance of 2 P value combining methods for group sequential trials. The emphasis is on time-to-event trials, although results from less complex trials are also included. The gain or loss in power with the combination method relative to a single statistic is asymmetric in its favor. Depending on the power of each individual test, the combination method can give more power than any single test or give power that is closer to the test with the most power. The versatility of the method is that it can combine P values from different test statistics for analysis at different times. The robustness of results suggests that inference from group sequential trials can be strengthened with the use of combined tests. Copyright © 2017 John Wiley & Sons, Ltd.
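
    Generic P value combination methods such as Fisher's and Stouffer's illustrate the kind of combined test discussed here; note that these generic formulas assume the component statistics are handled appropriately and do not by themselves address correlation between statistics computed on the same data or group sequential alpha spending. A minimal Python sketch with hypothetical p values:

```python
import numpy as np
from scipy import stats

def fisher_combined(pvals):
    """Fisher's method: -2 * sum(log p) ~ chi-square with 2k df under H0."""
    stat = -2.0 * np.sum(np.log(pvals))
    return stats.chi2.sf(stat, df=2 * len(pvals))

def stouffer_combined(pvals, weights=None):
    """Stouffer's (weighted Z) method for one-sided p values."""
    z = stats.norm.isf(pvals)
    w = np.ones_like(z) if weights is None else np.asarray(weights, float)
    return stats.norm.sf(np.sum(w * z) / np.sqrt(np.sum(w ** 2)))

# Hypothetical p values from two different test statistics at one analysis
p = np.array([0.031, 0.078])
print(f"Fisher: {fisher_combined(p):.3f}  Stouffer: {stouffer_combined(p):.3f}")
```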

  4. Surgical Correction of Metopic Craniosynostosis: A 3-D Photogrammetric Analysis of Cranial Vault Outcomes.

    PubMed

    Linden, Olivia E; Baratta, Vanessa M; Gonzalez, Jose A; Byrne, Margaret E; Klinge, Petra M; Sullivan, Stephen R; Taylor, Helena O

    2018-01-01

    To evaluate 3-dimensional (3-D) photogrammetry as a tool for assessing the postoperative head shape of patients who had undergone cranial vault remodeling for metopic synostosis. We prospectively analyzed images of patients with metopic craniosynostosis who had undergone anterior cranial vault remodeling and age-matched controls. To ensure standardized facial orientation, each 3-D image was positioned to "best fit" the preoperative face by aligning 6 soft tissue landmarks. Forehead measurements were taken from a standardized position behind the surface of the face to landmarks placed in a ray configuration across the forehead. Academic teaching hospital. Thirteen pediatric patients with metopic craniosynostosis who had undergone anterior cranial vault remodeling and age-matched controls. Images were taken preoperatively, immediately postoperatively, and over 1-year postoperatively. Forehead contours preoperatively and postoperatively, with statistics performed using a multivariate analysis of variance shape analysis. Mean postoperative follow-up was 1.8 (0.6) years. The average distance from the origin to forehead landmarks was 55.1 (3.4) mm preoperatively, 59.3 (0.7) mm immediately postoperatively, 59.1 (1.0) mm 1-year postoperatively, and 59.4 (0.6) mm in controls. Postoperative metopic forehead contours varied significantly from preoperative contours (P < .01), while there was no statistical difference between the 2 postoperative time points (P = .70). One-year postoperative patients were not significantly different from their age-matched controls (P > .99). Preoperative metopic forehead contours varied significantly from postoperative contours. Cranial reconstructions approximated the foreheads of normal controls, and reconstructions were stable at more than 1-year follow-up.

  5. The Microbiome in Posttraumatic Stress Disorder and Trauma-Exposed Controls: An Exploratory Study.

    PubMed

    Hemmings, Sian M J; Malan-Müller, Stefanie; van den Heuvel, Leigh L; Demmitt, Brittany A; Stanislawski, Maggie A; Smith, David G; Bohr, Adam D; Stamper, Christopher E; Hyde, Embriette R; Morton, James T; Marotz, Clarisse A; Siebler, Philip H; Braspenning, Maarten; Van Criekinge, Wim; Hoisington, Andrew J; Brenner, Lisa A; Postolache, Teodor T; McQueen, Matthew B; Krauter, Kenneth S; Knight, Rob; Seedat, Soraya; Lowry, Christopher A

    2017-10-01

    Inadequate immunoregulation and elevated inflammation may be risk factors for posttraumatic stress disorder (PTSD), and microbial inputs are important determinants of immunoregulation; however, the association between the gut microbiota and PTSD is unknown. This study investigated the gut microbiome in a South African sample of PTSD-affected individuals and trauma-exposed (TE) controls to identify potential differences in microbial diversity or microbial community structure. The Clinician-Administered PTSD Scale for DSM-5 was used to diagnose PTSD according to Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition criteria. Microbial DNA was extracted from stool samples obtained from 18 individuals with PTSD and 12 TE control participants. Bacterial 16S ribosomal RNA gene V3/V4 amplicons were generated and sequenced. Microbial community structure, α-diversity, and β-diversity were analyzed; random forest analysis was used to identify associations between bacterial taxa and PTSD. There were no differences between PTSD and TE control groups in α- or β-diversity measures (e.g., α-diversity: Shannon index, t = 0.386, p = .70; β-diversity, on the basis of analysis of similarities: Bray-Curtis test statistic = -0.033, p = .70); however, random forest analysis highlighted three phyla as important to distinguish PTSD status: Actinobacteria, Lentisphaerae, and Verrucomicrobia. Decreased total abundance of these taxa was associated with higher Clinician-Administered PTSD Scale scores (r = -0.387, p = .035). In this exploratory study, measures of overall microbial diversity were similar among individuals with PTSD and TE controls; however, decreased total abundance of Actinobacteria, Lentisphaerae, and Verrucomicrobia was associated with PTSD status.

  6. [Pathogenetic therapy of mastopathies in the prevention of breast cancer].

    PubMed

    Iaritsyn, S S; Sidorenko, L N

    1979-01-01

    The breast cancer morbidity among the population of the city of Leningrad has been analysed. It was shown that there is a tendency toward an increased number of breast cancer patients. In this respect, attention is given to the prophylactic measures accomplished at the Leningrad City oncological dispensary. As proved statistically, the pathogenetic therapy of mastopathy is a factor contributing to a lower risk of malignant transformation. For the statistical analysis the authors used the data of 132 breast cancer patients previously operated upon for local fibroadenomatosis, and the data of 259 control patients. It was found that among the patients with fibroadenomatosis who subsequently developed cancer of the mammary gland, the proportion of untreated patients was 2.8 times as high as in the control group.

  7. Comparative Efficacy of Tongxinluo Capsule and Beta-Blockers in Treating Angina Pectoris: Meta-Analysis of Randomized Controlled Trials.

    PubMed

    Jia, Yongliang; Leung, Siu-wai

    2015-11-01

    There have been no systematic reviews, let alone meta-analyses, of randomized controlled trials (RCTs) comparing tongxinluo capsule (TXL) and beta-blockers in treating angina pectoris. This study aimed to evaluate the efficacy of TXL and beta-blockers in treating angina pectoris by a meta-analysis of eligible RCTs. RCTs comparing TXL with beta-blockers (including metoprolol) in treating angina pectoris were searched and retrieved from databases including PubMed, Chinese National Knowledge Infrastructure, and WanFang Data. Eligible RCTs were selected according to prespecified criteria. Meta-analysis was performed on the odds ratios (OR) of symptomatic and electrocardiographic (ECG) improvements after treatment. Subgroup analysis, sensitivity analysis, meta-regression, and publication bias analysis were conducted to evaluate the robustness of the results. Seventy-three RCTs published between 2000 and 2014 with 7424 participants were eligible. Overall ORs comparing TXL with beta-blockers were 3.40 (95% confidence interval [CI], 2.97-3.89; p<0.0001) for symptomatic improvement and 2.63 (95% CI, 2.29-3.02; p<0.0001) for ECG improvement. Subgroup analysis and sensitivity analysis found no statistically significant dependence of overall ORs on specific study characteristics except efficacy criteria. Meta-regression found no significant covariates except sample size for the data on symptomatic improvement. Publication biases were statistically significant. TXL appears to be more effective than beta-blockers in treating angina pectoris, on the basis of the eligible RCTs. Further RCTs are warranted to reduce publication bias and verify efficacy.

  8. EVALUATION OF THE EXTRACELLULAR MATRIX OF INJURED SUPRASPINATUS IN RATS

    PubMed Central

    Almeida, Luiz Henrique Oliveira; Ikemoto, Roberto; Mader, Ana Maria; Pinhal, Maria Aparecida Silva; Munhoz, Bruna; Murachovsky, Joel

    2016-01-01

    ABSTRACT Objective: To evaluate the evolution of injuries of the supraspinatus muscle by immunohistochemistry (IHC) and anatomopathological analysis in an animal model (Wistar rats). Methods: Twenty-five Wistar rats were submitted to complete injury of the supraspinatus tendon and subsequently sacrificed in groups of five animals at the following time points: immediately after the injury, 24 h after the injury, 48 h after, 30 days after and three months after the injury. All groups underwent histological and IHC analysis. Results: Regarding vascular proliferation and inflammatory infiltrate, we found a statistically significant difference between groups 1 (control group) and 2 (24 h after injury). IHC analysis showed that the expression of vascular endothelial growth factor (VEGF) differed with statistical significance between groups 1 and 2, and the evaluation of collagen type 1 (Col-1) presented a statistically significant difference between groups 1 and 4. Conclusion: We observed changes in the extracellular matrix components compatible with remodeling and healing. Remodeling is more intense 24 h after injury. However, VEGF and Col-1 are substantially increased at 24 h and 30 days after the injury, respectively. Level of Evidence I, Experimental Study. PMID:26997907

  9. Weighted analysis of composite endpoints with simultaneous inference for flexible weight constraints.

    PubMed

    Duc, Anh Nguyen; Wolbers, Marcel

    2017-02-10

    Composite endpoints are widely used as primary endpoints of randomized controlled trials across clinical disciplines. A common critique of the conventional analysis of composite endpoints is that all disease events are weighted equally, whereas their clinical relevance may differ substantially. We address this by introducing a framework for the weighted analysis of composite endpoints and interpretable test statistics, which are applicable to both binary and time-to-event data. To cope with the difficulty of selecting an exact set of weights, we propose a method for constructing simultaneous confidence intervals and tests that asymptotically preserve the family-wise type I error in the strong sense across families of weights satisfying flexible inequality or order constraints based on the theory of chi-bar-squared (χ̄²) distributions. We show that the method achieves the nominal simultaneous coverage rate with substantial efficiency gains over Scheffé's procedure in a simulation study and apply it to trials in cardiovascular disease and enteric fever. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
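
    A minimal sketch of the simplest case discussed above: a weighted composite of binary component events compared between two arms with a Wald test for one fixed weight vector. The simultaneous inference across families of weights (the chi-bar-squared theory that is the paper's contribution) is not reproduced here; the event probabilities and weights are invented.

        # Hedged sketch: weighted binary composite endpoint, fixed weights, Wald test.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        n = 200                                            # patients per arm
        # per-patient indicators for three component events (e.g. death, stroke, rehospitalisation)
        control = rng.binomial(1, [0.10, 0.15, 0.30], size=(n, 3))
        treated = rng.binomial(1, [0.07, 0.12, 0.22], size=(n, 3))

        weights = np.array([1.0, 0.7, 0.3])                # clinical-relevance weights (illustrative)
        s_c, s_t = control @ weights, treated @ weights    # per-patient weighted composite scores

        diff = s_t.mean() - s_c.mean()
        se = np.sqrt(s_t.var(ddof=1) / n + s_c.var(ddof=1) / n)
        z = diff / se
        print(f"weighted composite difference = {diff:.3f}, z = {z:.2f}, p = {2 * stats.norm.sf(abs(z)):.3f}")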

  10. Statistical analysis of the calibration procedure for personnel radiation measurement instruments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bush, W.J.; Bengston, S.J.; Kalbeitzer, F.L.

    1980-11-01

    Thermoluminescent analyzer (TLA) calibration procedures were used to estimate personnel radiation exposure levels at the Idaho National Engineering Laboratory (INEL). A statistical analysis is presented herein based on data collected over a six month period in 1979 on four TLA's located in the Department of Energy (DOE) Radiological and Environmental Sciences Laboratory at the INEL. The data were collected according to the day-to-day procedure in effect at that time. Both gamma and beta radiation models are developed. Observed TLA readings of thermoluminescent dosimeters are correlated with known radiation levels. This correlation is then used to predict unknown radiation doses from future analyzer readings of personnel thermoluminescent dosimeters. The statistical techniques applied in this analysis include weighted linear regression, estimation of systematic and random error variances, prediction interval estimation using Scheffe's theory of calibration, the estimation of the ratio of the means of two normal bivariate distributed random variables and their corresponding confidence limits according to Kendall and Stuart, tests of normality, experimental design, a comparison between instruments, and quality control.
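
    A minimal sketch of the core calibration step named above: weighted least squares of analyzer reading on known dose, followed by classical inverse prediction of an unknown dose from a new reading. The doses, readings and weighting scheme are invented, and the Scheffé-type calibration intervals described in the abstract are not reproduced.

        # Hedged sketch: weighted linear regression calibration and inverse prediction.
        import numpy as np
        import statsmodels.api as sm

        dose = np.array([10., 20., 50., 100., 200., 500.])          # known delivered doses (arbitrary units)
        reading = np.array([11.2, 19.5, 52.1, 97.8, 204.3, 509.9])  # observed analyzer readings
        weights = 1.0 / dose                 # assume reading variance roughly proportional to dose

        X = sm.add_constant(dose)
        fit = sm.WLS(reading, X, weights=weights).fit()
        b0, b1 = fit.params

        new_reading = 150.0
        estimated_dose = (new_reading - b0) / b1                    # classical inverse prediction
        print(f"calibration line: reading = {b0:.2f} + {b1:.3f} * dose")
        print(f"estimated dose for reading {new_reading}: {estimated_dose:.1f}")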

  11. Hitting Is Contagious in Baseball: Evidence from Long Hitting Streaks

    PubMed Central

    Bock, Joel R.; Maewal, Akhilesh; Gough, David A.

    2012-01-01

    Data analysis is used to test the hypothesis that “hitting is contagious”. A statistical model is described to study the effect of a hot hitter upon his teammates’ batting during a consecutive game hitting streak. Box score data for entire seasons comprising long consecutive-game hitting streaks were compiled. Treatment and control sample groups were constructed from core lineups of players on the streaking batter’s team. The percentile method bootstrap was used to calculate confidence intervals for statistics representing differences in the mean distributions of two batting statistics between groups. Batters in the treatment group (hot streak active) showed statistically significant improvements in hitting performance, as compared against the control. Mean batting performance for the treatment group was found to be higher during hot streaks, and the batting heat index introduced in the study was also observed to increase. For each performance statistic, the null hypothesis was rejected at the chosen significance level. We conclude that the evidence suggests the potential existence of a “statistical contagion effect”. Psychological mechanisms essential to the empirical results are suggested, as several studies from the scientific literature lend credence to contagious phenomena in sports. Causal inference from these results is difficult, but we suggest and discuss several latent variables that may contribute to the observed results, and offer possible directions for future research. PMID:23251507
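
    A minimal sketch of the percentile-bootstrap step described above: a confidence interval for the difference in mean batting performance between a streak-active treatment group and a control group. The two samples are simulated stand-ins, not the study's box-score data.

        # Hedged sketch: percentile bootstrap CI for a difference in group means.
        import numpy as np

        rng = np.random.default_rng(42)
        treatment = rng.normal(0.285, 0.030, size=400)   # hypothetical batting averages, streak active
        control = rng.normal(0.270, 0.030, size=400)     # hypothetical batting averages, control

        def boot_ci(x, y, n_boot=10_000, alpha=0.05):
            """Percentile bootstrap CI for mean(x) - mean(y), resampling each group independently."""
            boots = np.empty(n_boot)
            for i in range(n_boot):
                boots[i] = (rng.choice(x, x.size, replace=True).mean()
                            - rng.choice(y, y.size, replace=True).mean())
            return np.quantile(boots, [alpha / 2, 1 - alpha / 2])

        lo, hi = boot_ci(treatment, control)
        print(f"observed difference: {treatment.mean() - control.mean():.4f}")
        print(f"95% percentile bootstrap CI: [{lo:.4f}, {hi:.4f}]")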

  12. Efficacy of workplace interventions for shoulder pain: A systematic review and meta-analysis.

    PubMed

    Lowry, Veronique; Desjardins-Charbonneau, Ariel; Roy, Jean-Sébastien; Dionne, Clermont E; Frémont, Pierre; MacDermid, Joy C; Desmeules, François

    2017-07-07

    To perform a systematic review and meta-analysis of randomized controlled trials on the efficacy of workplace-based interventions to prevent or treat shoulder pain. A systematic review of 4 databases was performed up to January 2016. Randomized controlled trials were included if the intervention under study was a workplace-based intervention performed to prevent or reduce shoulder pain and disability in workers. The methodological quality of the studies was evaluated and meta-analyses were conducted. Pooled mean differences and risk ratios were calculated. Data from 4 studies on strengthening exercises performed in the workplace for workers with shoulder pain (n = 368) were pooled. A statistically significant reduction in pain intensity was observed compared with different control interventions (mean difference 1.31 on a 0-10 scale; 95% confidence interval (95% CI) 0.86-1.76). Pooled data from 5 studies on the efficacy of workstation modifications (n = 2,148) showed a statistically significant reduction in the prevalence of shoulder pain with a risk ratio of 1.88 (95% CI 1.20-2.96) compared with different control interventions. Low-grade evidence exists that a workplace exercise programme may reduce the intensity of shoulder pain, and that workstation modifications may reduce the prevalence of shoulder pain.

  13. Relationship of saving habit determinants among undergraduate students: A case study of UiTM Negeri Sembilan, Kampus Seremban

    NASA Astrophysics Data System (ADS)

    Syahrom, N. S.; Nasrudin, N. S.; Mohamad Yasin, N.; Azlan, N.; Manap, N.

    2017-08-01

    It has been reported that students are already accumulating substantial debt from higher education fees and that their spending habits are at a critical stage. These conditions may leave young people facing bankruptcy if they cannot improve their saving habit. Many studies have acknowledged that the determinants of saving habit include financial management, parental socialization, peer influence, and self-control. However, there remains a need to investigate the relationships between these determinants in order to help young people avoid bankruptcy. The objectives of this study are to investigate the relationships between saving habit determinants and to generate a statistical model based on the determinants of saving habit. Data were collected by questionnaire from undergraduate students of UiTM Negeri Sembilan, Kampus Seremban. Samples were chosen using a stratified probability sampling technique, and a cross-sectional method was used as the research design. The results were then analysed using descriptive analysis, reliability testing, Pearson correlation, and multiple linear regression analysis. The analysis indicates a relationship between saving habit and self-control: students with less self-control tend to save less. A statistical model incorporating this relationship has been generated. This study is significant in helping students save more by exercising high self-control.
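
    A minimal sketch of the analysis pipeline named above: Pearson correlations among the four saving-habit determinants, then a multiple linear regression of saving habit on those determinants. The survey responses are simulated; only the variable names are taken from the abstract.

        # Hedged sketch: Pearson correlation matrix plus multiple linear regression.
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n = 250
        df = pd.DataFrame({
            "financial_management": rng.normal(3.5, 0.6, n),
            "parental_socialization": rng.normal(3.2, 0.7, n),
            "peer_influence": rng.normal(3.0, 0.8, n),
            "self_control": rng.normal(3.4, 0.7, n),
        })
        df["saving_habit"] = (0.4 * df["financial_management"] + 0.3 * df["self_control"]
                              + 0.2 * df["parental_socialization"] + rng.normal(0, 0.5, n))

        print(df.corr(method="pearson").round(2))            # bivariate relationships

        X = sm.add_constant(df.drop(columns="saving_habit"))
        model = sm.OLS(df["saving_habit"], X).fit()          # statistical model of saving habit
        print(model.params.round(3))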

  14. Does Topical Ozone Therapy Improve Patient Comfort After Surgical Removal of Impacted Mandibular Third Molar? A Randomized Controlled Trial.

    PubMed

    Sivalingam, Varun P; Panneerselvam, Elavenil; Raja, Krishnakumar V B; Gopi, Gayathri

    2017-01-01

    To assess the influence of topical ozone administration on patient comfort after third molar surgery. A single-blinded randomized controlled clinical trial was designed involving patients who required removal of bilateral impacted mandibular third molars. The predictor variable was the postoperative medication used after third molar surgery. Using the split-mouth design, the study group received topical ozone without postoperative systemic antibiotics, whereas the control group did not receive ozone but only systemic antibiotics. The 2 groups were prescribed analgesics for 2 days. The assessing surgeon was blinded to treatment assignment. The primary outcome variables were postoperative mouth opening, pain, and swelling. The secondary outcome variable was the number of analgesic doses required by each group on postoperative days 3 to 5. Data analysis involved descriptive statistics, paired t tests, and 2-way analysis of variance with repeated measures (P < .05). SPSS 20.0 was used for data analysis. The study sample included 33 patients (n = 33 in each group). The study group showed statistically significant decreases in postoperative pain, swelling, and trismus. Further, the number of analgesic doses required was smaller than in the control group. No adverse effects of ozone gel were observed in any patient. Ozone gel was found to be an effective topical agent that considerably improves patient comfort postoperatively and can be considered a substitute for postoperative systemic antibiotics. Copyright © 2016 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
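
    A minimal sketch of the within-patient comparison used in a split-mouth design like the one above: a paired t-test of the ozone side against the control side in the same patients. The pain scores are simulated, not the trial data.

        # Hedged sketch: paired t-test for a split-mouth comparison.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        n = 33
        pain_control = rng.normal(5.0, 1.2, n)                  # hypothetical pain score, control side
        pain_ozone = pain_control - rng.normal(1.0, 0.8, n)     # same patients, ozone-treated side

        t, p = stats.ttest_rel(pain_ozone, pain_control)
        print(f"paired t = {t:.2f}, p = {p:.4f}")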

  15. Trend analysis of body weight parameters, mortality, and incidence of spontaneous tumors in Tg.rasH2 mice.

    PubMed

    Paranjpe, Madhav G; Denton, Melissa D; Vidmar, Tom; Elbekai, Reem H

    2014-01-01

    Carcinogenicity studies have been performed in conventional 2-year rodent studies for at least 3 decades, whereas the short-term carcinogenicity studies in transgenic mice, such as Tg.rasH2, have only been performed over the last decade. In the 2-year conventional rodent studies, interlinked problems, such as increasing trends in the initial body weights, increased body weight gains, high incidence of spontaneous tumors, and low survival, that complicate the interpretation of findings have been well established. However, these end points have not been evaluated in the short-term carcinogenicity studies involving the Tg.rasH2 mice. In this article, we present a retrospective analysis of data obtained from control groups in 26-week carcinogenicity studies conducted in Tg.rasH2 mice since 2004. Our analysis showed statistically significant decreasing trends in initial body weights of both sexes. Although the terminal body weights did not show any significant trends, there was a statistically significant increasing trend in body weight gains, more so in males than in females, which correlated with increasing trends in food consumption. There were no statistically significant alterations in mortality trends. In addition, the incidence of all common spontaneous tumors remained fairly constant with no statistically significant differences in trends. © The Author(s) 2014.
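
    A minimal sketch of one simple way to test a trend of the kind described above: regressing a control-group endpoint on study year and testing whether the slope differs from zero. The yearly values are invented, and the article's exact trend-test method is not specified in this record.

        # Hedged sketch: linear trend test of a control-group endpoint across study years.
        import numpy as np
        from scipy import stats

        year = np.arange(2004, 2014)
        initial_bw = np.array([26.1, 25.9, 25.8, 25.6, 25.7, 25.4, 25.3, 25.2, 25.0, 24.9])  # hypothetical grams

        res = stats.linregress(year, initial_bw)
        print(f"slope = {res.slope:.3f} g/year, p = {res.pvalue:.4g}")  # negative slope with small p suggests a decreasing trend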

  16. A methodological analysis of chaplaincy research: 2000-2009.

    PubMed

    Galek, Kathleen; Flannelly, Kevin J; Jankowski, Katherine R B; Handzo, George F

    2011-01-01

    The present article presents a comprehensive review and analysis of quantitative research conducted in the United States on chaplaincy and closely related topics published between 2000 and 2009. A combined search strategy identified 49 quantitative studies in 13 journals. The analysis focuses on the methodological sophistication of the studies, compared to earlier research on chaplaincy and pastoral care. Cross-sectional surveys of convenience samples still dominate the field, but sample sizes have increased somewhat over the past three decades. Reporting of the validity and reliability of measures continues to be low, although reporting of response rates has improved. Improvements in the use of inferential statistics and statistical controls were also observed, compared to previous research. The authors conclude that more experimental research is needed on chaplaincy, along with an increased use of hypothesis testing, regardless of the research designs that are used.

  17. Development of new on-line statistical program for the Korean Society for Radiation Oncology

    PubMed Central

    Song, Si Yeol; Ahn, Seung Do; Chung, Weon Kuu; Choi, Eun Kyung; Cho, Kwan Ho

    2015-01-01

    Purpose To develop a new on-line statistical program for the Korean Society for Radiation Oncology (KOSRO) to collect and extract medical data in radiation oncology more efficiently. Materials and Methods The statistical program is a web-based program. The directory was placed in a sub-folder of the homepage of KOSRO and its web address is http://www.kosro.or.kr/asda. The operating system of the server is Linux and the web server is the Apache HTTP server. MySQL is adopted as the database (DB) server and PHP is the dedicated scripting language. Each ID and password are controlled independently and all screen pages for data input or analysis are designed to be user-friendly. Scroll-down menus are used extensively for user convenience and consistency of data analysis. Results Year of data is one of the top categories, and main topics include human resources, equipment, clinical statistics, specialized treatment and research achievement. Each topic or category has several subcategorized topics. A real-time on-line report of the analysis is produced immediately after each data entry, and the administrator is able to monitor the status of data input of each hospital. A backup of the data as spreadsheets can be accessed by the administrator and be used for academic work by any member of the KOSRO. Conclusion The new on-line statistical program was developed to collect data from nationwide departments of radiation oncology. Intuitive screens and a consistent input structure are expected to promote data entry by member hospitals, and annual statistics should be a cornerstone of advances in radiation oncology. PMID:26157684

  18. Development of new on-line statistical program for the Korean Society for Radiation Oncology.

    PubMed

    Song, Si Yeol; Ahn, Seung Do; Chung, Weon Kuu; Shin, Kyung Hwan; Choi, Eun Kyung; Cho, Kwan Ho

    2015-06-01

    To develop a new on-line statistical program for the Korean Society for Radiation Oncology (KOSRO) to collect and extract medical data in radiation oncology more efficiently. The statistical program is a web-based program. The directory was placed in a sub-folder of the homepage of KOSRO and its web address is http://www.kosro.or.kr/asda. The operating system of the server is Linux and the web server is the Apache HTTP server. MySQL is adopted as the database (DB) server and PHP is the dedicated scripting language. Each ID and password are controlled independently and all screen pages for data input or analysis are designed to be user-friendly. Scroll-down menus are used extensively for user convenience and consistency of data analysis. Year of data is one of the top categories, and main topics include human resources, equipment, clinical statistics, specialized treatment and research achievement. Each topic or category has several subcategorized topics. A real-time on-line report of the analysis is produced immediately after each data entry, and the administrator is able to monitor the status of data input of each hospital. A backup of the data as spreadsheets can be accessed by the administrator and be used for academic work by any member of the KOSRO. The new on-line statistical program was developed to collect data from nationwide departments of radiation oncology. Intuitive screens and a consistent input structure are expected to promote data entry by member hospitals, and annual statistics should be a cornerstone of advances in radiation oncology.

  19. Effectiveness of Treatment Approaches for Children and Adolescents with Reading Disabilities: A Meta-Analysis of Randomized Controlled Trials

    PubMed Central

    Galuschka, Katharina; Ise, Elena; Krick, Kathrin; Schulte-Körne, Gerd

    2014-01-01

    Children and adolescents with reading disabilities experience a significant impairment in the acquisition of reading and spelling skills. Given the emotional and academic consequences for children with persistent reading disorders, evidence-based interventions are critically needed. The present meta-analysis extracts the results of all available randomized controlled trials. The aims were to determine the effectiveness of different treatment approaches and the impact of various factors on the efficacy of interventions. The literature search for published randomized-controlled trials comprised an electronic search in the databases ERIC, PsycINFO, PubMed, and Cochrane, and an examination of bibliographical references. To check for unpublished trials, we searched the websites clinicaltrials.com and ProQuest, and contacted experts in the field. Twenty-two randomized controlled trials with a total of 49 comparisons of experimental and control groups could be included. The comparisons evaluated five reading fluency trainings, three phonemic awareness instructions, three reading comprehension trainings, 29 phonics instructions, three auditory trainings, two medical treatments, and four interventions with coloured overlays or lenses. One trial evaluated the effectiveness of sunflower therapy and another investigated the effectiveness of motor exercises. The results revealed that phonics instruction is not only the most frequently investigated treatment approach, but also the only approach whose efficacy on reading and spelling performance in children and adolescents with reading disabilities is statistically confirmed. The mean effect sizes of the remaining treatment approaches did not reach statistical significance. The present meta-analysis demonstrates that severe reading and spelling difficulties can be ameliorated with appropriate treatment. In order to be better able to provide evidence-based interventions to children and adolescents with reading disabilities, research should intensify the application of blinded randomized controlled trials. PMID:24587110

  20. Utility of Shear Wave Elastography for Differentiating Biliary Atresia From Infantile Hepatitis Syndrome.

    PubMed

    Wang, Xiaoman; Qian, Linxue; Jia, Liqun; Bellah, Richard; Wang, Ning; Xin, Yue; Liu, Qinglin

    2016-07-01

    The purpose of this study was to investigate the potential utility of shear wave elastography (SWE) for diagnosis of biliary atresia and for differentiating biliary atresia from infantile hepatitis syndrome by measuring liver stiffness. Thirty-eight patients with biliary atresia and 17 patients with infantile hepatitis syndrome were included, along with 31 healthy control infants. The 3 groups underwent SWE. The hepatic tissue of each patient with biliary atresia had been surgically biopsied. Statistical analyses for mean values of the 3 groups were performed. Optimum cutoff values using SWE for differentiation between the biliary atresia and control groups were calculated by a receiver operating characteristic (ROC) analysis. The mean SWE values ± SD for the 3 groups were as follows: biliary atresia group, 20.46 ± 10.19 kPa; infantile hepatitis syndrome group, 6.29 ± 0.99 kPa; and control group, 6.41 ± 1.08 kPa. The mean SWE value for the biliary atresia group was higher than the values for the control and infantile hepatitis syndrome groups (P < .01). The mean SWE values between the control and infantile hepatitis syndrome groups were not statistically different. The ROC analysis showed a cutoff value of 8.68 kPa for differentiation between the biliary atresia and control groups. The area under the ROC curve was 0.997, with sensitivity of 97.4%, specificity of 100%, a positive predictive value of 100%, and a negative predictive value of 96.9%. Correlation analysis suggested a positive correlation between SWE values and age for patients with biliary atresia, with a Pearson correlation coefficient of 0.463 (P < .05). The significant increase in liver SWE values in neonates and infants with biliary atresia supports their application for differentiating biliary atresia from infantile hepatitis syndrome.
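
    A minimal sketch of the ROC cut-off analysis described above: the AUC and the stiffness threshold that maximises Youden's J for separating biliary atresia from controls. The SWE values are simulated from the reported group means and are not the study data.

        # Hedged sketch: ROC analysis with an optimal cut-off via Youden's J.
        import numpy as np
        from sklearn.metrics import roc_curve, roc_auc_score

        rng = np.random.default_rng(3)
        stiff_ba = rng.normal(20.5, 10.0, 38)          # kPa, simulated biliary atresia group
        stiff_ctrl = rng.normal(6.4, 1.1, 31)          # kPa, simulated control group

        y = np.r_[np.ones_like(stiff_ba), np.zeros_like(stiff_ctrl)]
        x = np.r_[stiff_ba, stiff_ctrl]

        fpr, tpr, thresholds = roc_curve(y, x)
        best = np.argmax(tpr - fpr)                    # Youden's J = sensitivity + specificity - 1
        print(f"AUC = {roc_auc_score(y, x):.3f}")
        print(f"cut-off = {thresholds[best]:.2f} kPa, sensitivity = {tpr[best]:.1%}, specificity = {1 - fpr[best]:.1%}")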

  1. Adaptation of Chain Event Graphs for use with Case-Control Studies in Epidemiology.

    PubMed

    Keeble, Claire; Thwaites, Peter Adam; Barber, Stuart; Law, Graham Richard; Baxter, Paul David

    2017-09-26

    Case-control studies are used in epidemiology to try to uncover the causes of diseases, but are a retrospective study design known to suffer from non-participation and recall bias, which may explain their decreased popularity in recent years. Traditional analyses usually report only the odds ratio for given exposures and the binary disease status. Chain event graphs are a graphical representation of a statistical model derived from event trees which have been developed in artificial intelligence and statistics, and only recently introduced to the epidemiology literature. They are a modern Bayesian technique which enables prior knowledge to be incorporated into the data analysis using the agglomerative hierarchical clustering algorithm, used to form a suitable chain event graph. Additionally, they can account for missing data and be used to explore missingness mechanisms. Here we adapt the chain event graph framework to suit scenarios often encountered in case-control studies, to strengthen this study design which is time- and cost-efficient. We demonstrate eight adaptations to the graphs, which consist of two suitable for full case-control study analysis, four which can be used in interim analyses to explore biases, and two which aim to improve the ease and accuracy of analyses. The adaptations are illustrated with complete, reproducible, fully-interpreted examples, including the event tree and chain event graph. Chain event graphs are used here for the first time to summarise non-participation, data collection techniques, data reliability, and disease severity in case-control studies. We demonstrate how these features of a case-control study can be incorporated into the analysis to provide further insight, which can help to identify potential biases and lead to more accurate study results.

  2. Robot-assisted walking training for individuals with Parkinson’s disease: a pilot randomized controlled trial

    PubMed Central

    2013-01-01

    Background Over the last years, the introduction of robotic technologies into Parkinson’s disease rehabilitation settings has progressed from concept to reality. However, the benefit of robotic training remains elusive. This pilot randomized controlled observer trial is aimed at investigating the feasibility, the effectiveness and the efficacy of new end-effector robot training in people with mild Parkinson’s disease. Methods Design. Pilot randomized controlled trial. Setting. Robot assisted gait training (EG) compared to treadmill training (CG). Participants. Twenty cognitively intact participants with mild Parkinson’s disease and gait disturbance. Interventions. The EG underwent a rehabilitation programme of robot assisted walking for 40 minutes, 5 times a week for 4 weeks. The CG received a treadmill training programme for 40 minutes, 5 times a week for 4 weeks. Main outcome measures. The outcome measure of efficacy was recorded in a gait analysis laboratory. The assessments were performed at the beginning (T0) and at the end of the treatment (T1). The main outcome was the change in velocity. The feasibility of the intervention was assessed by recording exercise adherence and acceptability by a specific test. Results Robot training was feasible, acceptable, safe, and the participants completed 100% of the prescribed training sessions. A statistically significant improvement in gait index was found in favour of the EG (T0 versus T1). In particular, the statistical analysis of the primary outcome (gait speed) using the Friedman test showed statistically significant improvements for the EG (p = 0.0195). The statistical analysis performed by Friedman test of step length left (p = 0.0195) and right (p = 0.0195) and stride length left (p = 0.0078) and right (p = 0.0195) showed a statistically significant gain. No statistically significant improvements in the CG were found. Conclusions Robot training is a feasible and safe form of rehabilitative exercise for cognitively intact people with mild PD. This original approach can contribute to increasing short-term lower limb motor recovery in idiopathic PD patients. The focus on gait recovery is a further characteristic that makes this research relevant to clinical practice. On the whole, the simplicity of treatment, the lack of side effects, and the positive results from patients support the recommendation to extend the use of this treatment. Further investigation regarding the long-term effectiveness of robot training is warranted. Trial registration ClinicalTrials.gov NCT01668407 PMID:23706025
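
    A minimal sketch of the nonparametric repeated-measures test named above (the Friedman test) applied to gait speed. scipy's implementation requires at least three related samples, so three assessment points are simulated here, which differs from the two-point comparison reported in the abstract.

        # Hedged sketch: Friedman test on repeated gait-speed assessments.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(5)
        n = 10                                         # participants in one arm
        t0 = rng.normal(0.80, 0.10, n)                 # gait speed (m/s), baseline
        t1 = t0 + rng.normal(0.08, 0.04, n)            # end of treatment
        t2 = t1 + rng.normal(0.02, 0.04, n)            # follow-up

        chi2, p = stats.friedmanchisquare(t0, t1, t2)
        print(f"Friedman chi-square = {chi2:.2f}, p = {p:.4f}")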

  3. Treatment of missing data in follow-up studies of randomised controlled trials: A systematic review of the literature.

    PubMed

    Sullivan, Thomas R; Yelland, Lisa N; Lee, Katherine J; Ryan, Philip; Salter, Amy B

    2017-08-01

    After completion of a randomised controlled trial, an extended follow-up period may be initiated to learn about longer term impacts of the intervention. Since extended follow-up studies often involve additional eligibility restrictions and consent processes for participation, and a longer duration of follow-up entails a greater risk of participant attrition, missing data can be a considerable threat in this setting. As a potential source of bias, it is critical that missing data are appropriately handled in the statistical analysis, yet little is known about the treatment of missing data in extended follow-up studies. The aims of this review were to summarise the extent of missing data in extended follow-up studies and the use of statistical approaches to address this potentially serious problem. We performed a systematic literature search in PubMed to identify extended follow-up studies published from January to June 2015. Studies were eligible for inclusion if the original randomised controlled trial results were also published and if the main objective of extended follow-up was to compare the original randomised groups. We recorded information on the extent of missing data and the approach used to treat missing data in the statistical analysis of the primary outcome of the extended follow-up study. Of the 81 studies included in the review, 36 (44%) reported additional eligibility restrictions and 24 (30%) consent processes for entry into extended follow-up. Data were collected at a median of 7 years after randomisation. Excluding 28 studies with a time to event primary outcome, 51/53 studies (96%) reported missing data on the primary outcome. The median percentage of randomised participants with complete data on the primary outcome was just 66% in these studies. The most common statistical approach to address missing data was complete case analysis (51% of studies), while likelihood-based analyses were also well represented (25%). Sensitivity analyses around the missing data mechanism were rarely performed (25% of studies), and when they were, they often involved unrealistic assumptions about the mechanism. Despite missing data being a serious problem in extended follow-up studies, statistical approaches to addressing missing data were often inadequate. We recommend researchers clearly specify all sources of missing data in follow-up studies and use statistical methods that are valid under a plausible assumption about the missing data mechanism. Sensitivity analyses should also be undertaken to assess the robustness of findings to assumptions about the missing data mechanism.
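
    A minimal sketch contrasting the two approaches discussed above: complete-case analysis versus a basic multiple imputation of a continuous outcome (stochastic regression imputation pooled with Rubin's rules; parameter uncertainty in the imputation model is ignored for brevity). The trial data and the missing-at-random mechanism are simulated.

        # Hedged sketch: complete-case analysis vs simplified multiple imputation.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(11)
        n = 500
        group = rng.integers(0, 2, n)                           # randomised arm
        baseline = rng.normal(50, 10, n)
        outcome = 0.5 * group + 0.3 * baseline + rng.normal(0, 5, n)
        missing = rng.random(n) < 0.4 * (baseline > 50)         # MAR: depends on observed baseline
        y_obs = np.where(missing, np.nan, outcome)

        X = sm.add_constant(np.column_stack([group, baseline]))
        cc = ~np.isnan(y_obs)
        print("complete-case group effect:", round(sm.OLS(y_obs[cc], X[cc]).fit().params[1], 3))

        m, est, var = 20, [], []
        imp_fit = sm.OLS(y_obs[cc], X[cc]).fit()                # imputation model fitted to observed cases
        sigma = np.sqrt(imp_fit.scale)
        for _ in range(m):
            y_i = y_obs.copy()
            y_i[~cc] = imp_fit.predict(X[~cc]) + rng.normal(0, sigma, (~cc).sum())
            f = sm.OLS(y_i, X).fit()
            est.append(f.params[1])
            var.append(f.bse[1] ** 2)
        pooled = np.mean(est)                                   # Rubin's rules pooling
        total_var = np.mean(var) + (1 + 1 / m) * np.var(est, ddof=1)
        print(f"multiple-imputation group effect: {pooled:.3f} (SE {np.sqrt(total_var):.3f})")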

  4. Effectiveness of Platelet Rich Plasma and Bone Graft in the Treatment of Intrabony Defects: A Clinico-radiographic Study

    PubMed Central

    Jalaluddin, Mohammad; Mahesh, Jayachandran; Mahesh, Rethi; Jayanti, Ipsita; Faizuddin, Mohamed; Kripal, Krishna; Nazeer, Nazia

    2018-01-01

    Background & Objectives: Periodontal disease is characterized by the presence of gingival inflammation, periodontal pocket formation, loss of connective tissue attachment and alveolar bone around the affected tooth. Different modalities have been employed in the treatment and regeneration of periodontal defects, including the use of bone grafts, PRP and other growth factors. The purpose of this prospective, randomized controlled study was to compare the regenerative efficacy of PRP and bone graft in intrabony periodontal defects. Methodology: This randomized control trial was carried out in the Department of Periodontics & Oral Implantology, Kalinga Institute of Dental Sciences and Hospital, KIIT University, Bhubaneswar. The study sample included 20 periodontal infrabony defects in 20 patients, 12 males and 8 females. The patients were aged between 25 and 45 years (mean age 35 years). The 20 sites selected for the study were randomly divided into 2 groups of 10 sites each. Group A: PRP alone; Group B: bone graft. Statistical Analysis & Results: Statistical analysis was done using SPSS (version 18.0), with paired ‘t’ tests and ANOVA, which revealed a significant reduction in gingival index, plaque index, and probing pocket depth, and a gain in clinical attachment level, at various time intervals within both groups. Radiographic evaluation revealed statistically significant defect fill (p<0.001) at the end of 6 months within both groups. However, there was a statistically significant difference seen in group B radiographically when compared to group A. Conclusion: Both groups showed promising results in enhancing periodontal regeneration; however, the results with bone graft were comparatively better, although not statistically significant when compared to PRP alone. PMID:29682091

  5. The influence of control group reproduction on the statistical ...

    EPA Pesticide Factsheets

    Because of various Congressional mandates to protect the environment from endocrine disrupting chemicals (EDCs), the United States Environmental Protection Agency (USEPA) initiated the Endocrine Disruptor Screening Program. In the context of this framework, the Office of Research and Development within the USEPA developed the Medaka Extended One Generation Reproduction Test (MEOGRT) to characterize the endocrine action of a suspected EDC. One important endpoint of the MEOGRT is fecundity of breeding pairs of medaka. Power analyses were conducted to determine the number of replicates needed in proposed test designs and to determine the effects that varying reproductive parameters (e.g. mean fecundity, variance, and days with no egg production) will have on the statistical power of the test. A software tool, the MEOGRT Reproduction Power Analysis Tool, was developed to expedite these power analyses by both calculating estimates of the needed reproductive parameters (e.g. population mean and variance) and performing the power analysis under user-specified scenarios. The manuscript illustrates how the reproductive performance of the control medaka that are used in a MEOGRT influences statistical power, and therefore the successful implementation of the protocol. Example scenarios, based upon medaka reproduction data collected at MED, are discussed that bolster the recommendation that facilities planning to implement the MEOGRT should have a culture of medaka with hi
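
    A minimal sketch of a simulation-based power calculation of the kind such a tool automates: negative-binomial daily egg counts per breeding pair, a two-group comparison, and power estimated as the rejection rate over simulated experiments. All parameter values are invented, and this is not the MEOGRT Reproduction Power Analysis Tool itself.

        # Hedged sketch: simulation-based power for a two-group fecundity comparison.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2024)

        def simulated_power(n_pairs, mean_ctrl, effect, dispersion, n_days=21, n_sim=2000, alpha=0.05):
            """Power of a Mann-Whitney test on per-pair mean daily fecundity."""
            rejections = 0
            p_ctrl = dispersion / (dispersion + mean_ctrl)                 # NB success probability, control
            p_trt = dispersion / (dispersion + mean_ctrl * (1 - effect))   # treated mean reduced by `effect`
            for _ in range(n_sim):
                ctrl = rng.negative_binomial(dispersion, p_ctrl, (n_pairs, n_days)).mean(axis=1)
                trt = rng.negative_binomial(dispersion, p_trt, (n_pairs, n_days)).mean(axis=1)
                if stats.mannwhitneyu(ctrl, trt, alternative="two-sided").pvalue < alpha:
                    rejections += 1
            return rejections / n_sim

        print("estimated power:", simulated_power(n_pairs=12, mean_ctrl=20, effect=0.25, dispersion=5))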

  6. Identifying Pleiotropic Genes in Genome-Wide Association Studies for Multivariate Phenotypes with Mixed Measurement Scales

    PubMed Central

    Williams, L. Keoki; Buu, Anne

    2017-01-01

    We propose a multivariate genome-wide association test for mixed continuous, binary, and ordinal phenotypes. A latent response model is used to estimate the correlation between phenotypes with different measurement scales so that the empirical distribution of the Fisher’s combination statistic under the null hypothesis is estimated efficiently. The simulation study shows that our proposed correlation estimation methods have high levels of accuracy. More importantly, our approach conservatively estimates the variance of the test statistic so that the type I error rate is controlled. The simulation also shows that the proposed test maintains the power at the level very close to that of the ideal analysis based on known latent phenotypes while controlling the type I error. In contrast, conventional approaches–dichotomizing all observed phenotypes or treating them as continuous variables–could either reduce the power or employ a linear regression model unfit for the data. Furthermore, the statistical analysis on the database of the Study of Addiction: Genetics and Environment (SAGE) demonstrates that conducting a multivariate test on multiple phenotypes can increase the power of identifying markers that may not be, otherwise, chosen using marginal tests. The proposed method also offers a new approach to analyzing the Fagerström Test for Nicotine Dependence as multivariate phenotypes in genome-wide association studies. PMID:28081206
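
    A minimal sketch of the Fisher's combination statistic mentioned above, applied to per-phenotype p-values at one marker. With correlated phenotypes the null distribution is not the nominal chi-square, which is why the paper estimates it empirically; only the nominal version is shown here, with illustrative p-values.

        # Hedged sketch: Fisher's combination of p-values across phenotypes.
        import numpy as np
        from scipy import stats

        pvals = np.array([0.03, 0.20, 0.008])          # illustrative per-phenotype association p-values
        T = -2 * np.sum(np.log(pvals))                 # Fisher's combination statistic
        p_nominal = stats.chi2.sf(T, df=2 * len(pvals))
        print(f"T = {T:.2f}, nominal combined p = {p_nominal:.4f}")

        # scipy provides the same nominal combination directly:
        print(stats.combine_pvalues(pvals, method="fisher"))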

  7. Nateglinide versus repaglinide for type 2 diabetes mellitus in China.

    PubMed

    Li, Chanjuan; Xia, Jielai; Zhang, Gaokui; Wang, Suzhen; Wang, Ling

    2009-12-01

    The purpose of this study was to evaluate the efficacy and safety of nateglinide tablets in comparison with repaglinide tablets as the control treatment for type 2 diabetes mellitus in China. A pooled analysis with the analysis of covariance (ANCOVA) method was applied to assess efficacy and safety based on original data collected from four independent randomized clinical trials with similar research protocols, while a meta-analysis was applied to the reported outcomes of the four studies. The results of the meta-analysis were comparable to those obtained by the pooled analysis. The means of HbA(1c) and fasting blood glucose in both the nateglinide and repaglinide groups were reduced significantly after 12 weeks, with no statistical difference in reduction between the two groups. The adverse reaction rates were 9.89% and 6.51% in the nateglinide and repaglinide groups respectively, with the rate difference showing no statistical significance; the odds ratio of the adverse reaction rate (95% confidence interval) was 1.59 (0.99, 2.55). Both nateglinide and repaglinide administration have similarly significant effects on reducing HbA(1c) and FBG. The adverse reaction rate in the nateglinide group was higher than that in the repaglinide group, but the difference was not statistically significant in the four clinical trials.

  8. Multi-resolutional shape features via non-Euclidean wavelets: Applications to statistical analysis of cortical thickness

    PubMed Central

    Kim, Won Hwa; Singh, Vikas; Chung, Moo K.; Hinrichs, Chris; Pachauri, Deepti; Okonkwo, Ozioma C.; Johnson, Sterling C.

    2014-01-01

    Statistical analysis on arbitrary surface meshes such as the cortical surface is an important approach to understanding brain diseases such as Alzheimer’s disease (AD). Surface analysis may be able to identify specific cortical patterns that relate to certain disease characteristics or exhibit differences between groups. Our goal in this paper is to make group analysis of signals on surfaces more sensitive. To do this, we derive multi-scale shape descriptors that characterize the signal around each mesh vertex, i.e., its local context, at varying levels of resolution. In order to define such a shape descriptor, we make use of recent results from harmonic analysis that extend traditional continuous wavelet theory from the Euclidean to a non-Euclidean setting (i.e., a graph, mesh or network). Using this descriptor, we conduct experiments on two different datasets, the Alzheimer’s Disease NeuroImaging Initiative (ADNI) data and images acquired at the Wisconsin Alzheimer’s Disease Research Center (W-ADRC), focusing on individuals labeled as having Alzheimer’s disease (AD), mild cognitive impairment (MCI) and healthy controls. In particular, we contrast traditional univariate methods with our multi-resolution approach, which shows increased sensitivity and improved statistical power to detect group-level effects. We also provide an open source implementation. PMID:24614060

  9. SQC: secure quality control for meta-analysis of genome-wide association studies.

    PubMed

    Huang, Zhicong; Lin, Huang; Fellay, Jacques; Kutalik, Zoltán; Hubaux, Jean-Pierre

    2017-08-01

    Due to the limited power of small-scale genome-wide association studies (GWAS), researchers tend to collaborate and establish a larger consortium in order to perform large-scale GWAS. Genome-wide association meta-analysis (GWAMA) is a statistical tool that aims to synthesize results from multiple independent studies to increase the statistical power and reduce false-positive findings of GWAS. However, it has been demonstrated that the aggregate data of individual studies are subject to inference attacks, hence privacy concerns arise when researchers share study data in GWAMA. In this article, we propose a secure quality control (SQC) protocol, which enables checking the quality of data in a privacy-preserving way without revealing sensitive information to a potential adversary. SQC employs state-of-the-art cryptographic and statistical techniques for privacy protection. We implement the solution in a meta-analysis pipeline with real data to demonstrate the efficiency and scalability on commodity machines. The distributed execution of SQC on a cluster of 128 cores for one million genetic variants takes less than one hour, which is a modest cost considering the 10-month time span usually observed for the completion of the QC procedure that includes timing of logistics. SQC is implemented in Java and is publicly available at https://github.com/acs6610987/secureqc. jean-pierre.hubaux@epfl.ch. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  10. Statistical methods for launch vehicle guidance, navigation, and control (GN&C) system design and analysis

    NASA Astrophysics Data System (ADS)

    Rose, Michael Benjamin

    A novel trajectory and attitude control and navigation analysis tool for powered ascent is developed. The tool is capable of rapid trade-space analysis and is designed to ultimately reduce turnaround time for launch vehicle design, mission planning, and redesign work. It is streamlined to quickly determine trajectory and attitude control dispersions, propellant dispersions, orbit insertion dispersions, and navigation errors and their sensitivities to sensor errors, actuator execution uncertainties, and random disturbances. The tool is developed by applying both Monte Carlo and linear covariance analysis techniques to a closed-loop, launch vehicle guidance, navigation, and control (GN&C) system. The nonlinear dynamics and flight GN&C software models of a closed-loop, six-degree-of-freedom (6-DOF), Monte Carlo simulation are formulated and developed. The nominal reference trajectory (NRT) for the proposed lunar ascent trajectory is defined and generated. The Monte Carlo truth models and GN&C algorithms are linearized about the NRT, the linear covariance equations are formulated, and the linear covariance simulation is developed. The performance of the launch vehicle GN&C system is evaluated using both Monte Carlo and linear covariance techniques and their trajectory and attitude control dispersion, propellant dispersion, orbit insertion dispersion, and navigation error results are validated and compared. Statistical results from linear covariance analysis are generally within 10% of Monte Carlo results, and in most cases the differences are less than 5%. This is an excellent result given the many complex nonlinearities that are embedded in the ascent GN&C problem. Moreover, the real value of this tool lies in its speed, where the linear covariance simulation is 1036.62 times faster than the Monte Carlo simulation. Although the application and results presented are for a lunar, single-stage-to-orbit (SSTO), ascent vehicle, the tools, techniques, and mathematical formulations that are discussed are applicable to ascent on Earth or other planets as well as other rocket-powered systems such as sounding rockets and ballistic missiles.
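
    A minimal sketch of the basic idea behind checking linear covariance analysis against Monte Carlo: propagate an initial state covariance through a (here, linear) dynamics map analytically and compare with the sample covariance of Monte Carlo trajectories. The toy two-state system is a stand-in for the nonlinear GN&C models described above.

        # Hedged sketch: linear covariance propagation vs Monte Carlo dispersion.
        import numpy as np

        rng = np.random.default_rng(0)
        A = np.array([[1.0, 0.1],                     # simple discrete-time state transition (position, velocity)
                      [0.0, 1.0]])
        P0 = np.diag([0.5 ** 2, 0.1 ** 2])            # initial covariance
        steps = 50

        P = P0.copy()
        for _ in range(steps):
            P = A @ P @ A.T                           # linear covariance propagation

        x = rng.multivariate_normal(np.zeros(2), P0, size=20_000)
        for _ in range(steps):
            x = x @ A.T                               # Monte Carlo propagation of the same dynamics
        P_mc = np.cov(x, rowvar=False)

        print("lin-cov position sigma:   ", round(float(np.sqrt(P[0, 0])), 3))
        print("Monte Carlo position sigma:", round(float(np.sqrt(P_mc[0, 0])), 3))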

  11. Chemical Structure and Molecular Dimension As Controls on the Inherent Stability of Charcoal in Boreal Forest Soil

    NASA Astrophysics Data System (ADS)

    Hockaday, W. C.; Kane, E. S.; Ohlson, M.; Huang, R.; Von Bargen, J.; Davis, R.

    2014-12-01


  12. Microbial facies distribution and its geological and geochemical controls at the Hanford 300 area

    NASA Astrophysics Data System (ADS)

    Hou, Z.; Nelson, W.; Stegen, J.; Murray, C. J.; Arntzen, E.

    2015-12-01

    Efforts have been made by various scientific disciplines to study hyporheic zones and characterize their associated processes. One way to approach the study of the hyporheic zone is to define facies, which are elements of a (hydrobio) geologic classification scheme that groups components of a complex system with high variability into a manageable set of discrete classes. In this study, we try to classify the hyporheic zone based on the geology, geochemistry, microbiology, and understand their interactive influences on the integrated biogeochemical distributions and processes. A number of measurements have been taken for 21 freeze core samples along the Columbia River bank in the Hanford 300 Area, and unique datasets have been obtained on biomass, pH, number of microbial taxa, percentage of N/C/H/S, microbial activity parameters, as well as microbial community attributes/modules. In order to gain a complete understanding of the geological control on these variables and processes, the explanatory variables are set to include quantitative gravel/sand/mud/silt/clay percentages, statistical moments of grain size distributions, as well as geological (e.g., Folk-Wentworth) and statistical (e.g., hierarchical) clusters. The dominant factors for major microbial and geochemical variables are identified and summarized using exploratory data analysis approaches (e.g., principal component analysis, hierarchical clustering, factor analysis, multivariate analysis of variance). The feasibility of extending the facies definition and its control of microbial and geochemical properties to larger scales is discussed.

  13. Medial prefrontal aberrations in major depressive disorder revealed by cytoarchitectonically informed voxel-based morphometry

    PubMed Central

    Bludau, Sebastian; Bzdok, Danilo; Gruber, Oliver; Kohn, Nils; Riedl, Valentin; Sorg, Christian; Palomero-Gallagher, Nicola; Müller, Veronika I.; Hoffstaedter, Felix; Amunts, Katrin; Eickhoff, Simon B.

    2017-01-01

    Objective The heterogeneous human frontal pole has been identified as a node in the dysfunctional network of major depressive disorder. The contribution of the medial (socio-affective) versus lateral (cognitive) frontal pole to major depression pathogenesis is currently unclear. The present study performs morphometric comparison of the microstructurally informed subdivisions of human frontal pole between depressed patients and controls using both uni- and multivariate statistics. Methods Multi-site voxel- and region-based morphometric MRI analysis of 73 depressed patients and 73 matched controls without psychiatric history. Frontal pole volume was first compared between depressed patients and controls by subdivision-wise classical morphometric analysis. In a second approach, frontal pole volume was compared by subdivision-naive multivariate searchlight analysis based on support vector machines. Results Subdivision-wise morphometric analysis found a significantly smaller medial frontal pole in depressed patients with a negative correlation of disease severity and duration. Histologically uninformed multivariate voxel-wise statistics provided converging evidence for structural aberrations specific to the microstructurally defined medial area of the frontal pole in depressed patients. Conclusions Across disparate methods, we demonstrated subregion specificity in the left medial frontal pole volume in depressed patients. Indeed, the frontal pole was shown to structurally and functionally connect to other key regions in major depression pathology like the anterior cingulate cortex and the amygdala via the uncinate fasciculus. Present and previous findings consolidate the left medial portion of the frontal pole as particularly altered in major depression. PMID:26621569

  14. Morphometric analysis of long-term dentoskeletal effects induced by treatment with Balters bionator.

    PubMed

    Bigliazzi, Renato; Franchi, Lorenzo; Bertoz, André Pinheiro de Magalhães; McNamara, James A; Faltin, Kurt; Bertoz, Francisco Antonio

    2015-09-01

    To evaluate the long-term effects of the standard (Class II) Balters bionator in growing patients with Class II malocclusion with mandibular retrusion by using morphometrics (thin-plate spline [TPS] analysis). Twenty-three Class II patients (8 male, 15 female) were treated consecutively with the Balters bionator (bionator group). The sample was evaluated at T0, start of treatment; T1, end of bionator therapy; and T2, long-term observation (including fixed appliances). Mean age at the start of treatment was 10 years 2 months (T0); at posttreatment, 12 years 3 months (T1); and at long-term follow-up, 18 years 2 months (T2). The control group consisted of 22 subjects (11 male, 11 female) with untreated Class II malocclusion. Lateral cephalograms were analyzed at the three time points for all groups. TPS analysis evaluated statistical differences (permutation tests) in the craniofacial shape and size between the bionator and control groups. TPS analysis showed that treatment with the bionator is able to produce favorable mandibular shape changes (forward and downward displacement) that contribute significantly to the correction of the Class II dentoskeletal imbalance. These results are maintained at a long-term observation after completion of growth. The control group showed no statistically significant differences in the correction of Class II malocclusion. This study suggests that bionator treatment of Class II malocclusion produces favorable results over the long term with a combination of skeletal and dentoalveolar shape changes.

  15. Methods to control for unmeasured confounding in pharmacoepidemiology: an overview.

    PubMed

    Uddin, Md Jamal; Groenwold, Rolf H H; Ali, Mohammed Sanni; de Boer, Anthonius; Roes, Kit C B; Chowdhury, Muhammad A B; Klungel, Olaf H

    2016-06-01

    Background Unmeasured confounding is one of the principal problems in pharmacoepidemiologic studies. Several methods have been proposed to detect or control for unmeasured confounding either at the study design phase or the data analysis phase. Aim of the Review To provide an overview of commonly used methods to detect or control for unmeasured confounding and to provide recommendations for proper application in pharmacoepidemiology. Methods/Results Methods to control for unmeasured confounding in the design phase of a study are case only designs (e.g., case-crossover, case-time control, self-controlled case series) and the prior event rate ratio adjustment method. Methods that can be applied in the data analysis phase include the negative control method, the perturbation variable method, instrumental variable methods, sensitivity analysis, and ecological analysis. A separate group of methods are those in which additional information on confounders is collected from a substudy. The latter group includes external adjustment, propensity score calibration, two-stage sampling, and multiple imputation. Conclusion As the performance and application of the methods to handle unmeasured confounding may differ across studies and across databases, we stress the importance of using both statistical evidence and substantial clinical knowledge for interpretation of the study results.

  16. Dacron® vs. PTFE as bypass materials in peripheral vascular surgery – systematic review and meta-analysis

    PubMed Central

    Roll, Stephanie; Müller-Nordhorn, Jacqueline; Keil, Thomas; Scholz, Hans; Eidt, Daniela; Greiner, Wolfgang; Willich, Stefan N

    2008-01-01

    Background In peripheral vascular bypass surgery different synthetic materials are available for bypass grafting. It is unclear which of the two commonly used materials, polytetrafluoroethylene (PTFE) or polyester (Dacron®) grafts, is to be preferred. Thus, the aim of this meta-analysis and systematic review was to compare the effectiveness of these two prosthetic bypass materials (Dacron® and PTFE). Methods We performed a systematic literature search in MEDLINE, Cochrane-Library – CENTRAL, EMBASE and other databases for relevant publications in English and German published between 1999 and 2008. Only randomized controlled trials were considered for inclusion. We assessed the methodological quality by means of standardized checklists. Primary patency was used as the main endpoint. Random-effect meta-analysis as well as pooling data in life table format was performed to combine study results. Results Nine randomized controlled trials (RCT) were included. Two trials showed statistically significant differences in primary patency, one favouring Dacron® and one favouring PTFE grafts, while 7 trials did not show statistically significant differences between the two materials. Meta-analysis on the comparison of PTFE vs. Dacron® grafts yielded no differences with regard to primary patency rates (hazard ratio 1.04 (95% confidence interval [0.85;1.28]), no significant heterogeneity (p = 0.32, I2 = 14%)). Similarly, there were no significant differences with regard to secondary patency rates. Conclusion Systematic evaluation and meta-analysis of randomized controlled trials comparing Dacron® and PTFE as bypass materials for peripheral vascular surgery showed no evidence of an advantage of one synthetic material over the other. PMID:19099583

  17. The contribution of executive functions to emergent mathematic skills in preschool children.

    PubMed

    Espy, Kimberly Andrews; McDiarmid, Melanie M; Cwik, Mary F; Stalets, Melissa Meade; Hamby, Arlena; Senn, Theresa E

    2004-01-01

    Mathematical ability is related both to activation of the prefrontal cortex in neuroimaging studies of adults and to executive functions in school-age children. The purpose of this study was to determine whether executive functions were related to emergent mathematical proficiency in preschool children. Preschool children (N = 96) were administered an executive function battery that was reduced empirically to working memory (WM), inhibitory control (IC), and shifting abilities by calculating composite scores derived from principal component analysis. Both WM and IC predicted early arithmetic competency, with the observed relations robust after controlling statistically for child age, maternal education, and child vocabulary. Only IC accounted for unique variance in mathematical skills after the contributions of the other executive functions were controlled statistically as well. Specific executive functions are related to emergent mathematical proficiency in this age range. Longitudinal studies using structural equation modeling are necessary to better characterize these ontogenetic relations.
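
    A minimal sketch of the two analysis steps named above: composite executive-function scores from a principal component analysis of task scores, then regression of arithmetic competency on those composites while statistically controlling for age, maternal education and vocabulary. All data are simulated and the component labels are purely illustrative.

        # Hedged sketch: PCA composite scores followed by hierarchical regression.
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(9)
        n = 96
        tasks = rng.normal(size=(n, 6))                               # six executive-function task scores
        scores = PCA(n_components=3).fit_transform(StandardScaler().fit_transform(tasks))
        df = pd.DataFrame(scores, columns=["WM", "IC", "shifting"])   # illustrative component labels
        df["age"] = rng.uniform(3, 5.5, n)
        df["maternal_edu"] = rng.integers(10, 20, n)
        df["vocab"] = rng.normal(100, 15, n)
        df["math"] = 2 * df["age"] + 0.2 * df["vocab"] + 1.5 * df["IC"] + rng.normal(0, 3, n)

        covariates = sm.add_constant(df[["age", "maternal_edu", "vocab"]])
        full = sm.add_constant(df[["age", "maternal_edu", "vocab", "WM", "IC", "shifting"]])
        print("covariates-only R2:", round(sm.OLS(df["math"], covariates).fit().rsquared, 3))
        print("with EF composites R2:", round(sm.OLS(df["math"], full).fit().rsquared, 3))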

  18. Chromosome aberration analysis in peripheral lymphocytes of Gulf War and Balkans War veterans.

    PubMed

    Schröder, H; Heimers, A; Frentzel-Beyme, R; Schott, A; Hoffmann, W

    2003-01-01

    Chromosome aberrations and sister chromatid exchanges (SCEs) were determined in standard peripheral lymphocyte metaphase preparations of 13 British Gulf War veterans, two veterans of the recent war in the Balkans and one veteran of both wars. All 16 volunteers suspected exposure to depleted uranium (DU) while deployed in the two different theatres of war in 1990 and later. The Bremen laboratory control served as a reference in this study. Compared with this control there was a statistically significant increase in the frequency of dicentric chromosomes (dic) and centric ring chromosomes (cR) in the veterans' group, indicating a previous exposure to ionising radiation. The statistically significant overdispersion of dic and cR indicates non-uniform irradiation as would be expected after non-uniform exposure and/or exposure to radiation with a high linear energy transfer (LET). The frequency of SCEs was decreased when compared with the laboratory control.
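
    A minimal sketch of the dispersion test implied by "statistically significant overdispersion" above: the variance-to-mean ratio of dicentric counts per cell and the u statistic, which is approximately standard normal under a Poisson (uniform-exposure) null. The per-cell counts are simulated, not the veterans' data.

        # Hedged sketch: Poisson dispersion (u) test for dicentric counts per cell.
        import numpy as np

        rng = np.random.default_rng(13)
        n_cells = 500
        lam = rng.gamma(shape=2.0, scale=0.05, size=n_cells)    # cell-to-cell variation in expected yield
        counts = rng.poisson(lam)                               # simulated, overdispersed dicentric counts

        mean, variance = counts.mean(), counts.var(ddof=1)
        dispersion = variance / mean                            # variance-to-mean ratio
        u = (dispersion - 1) * np.sqrt((n_cells - 1) / (2 * (1 - 1 / counts.sum())))
        print(f"dispersion index = {dispersion:.2f}, u = {u:.2f}")  # |u| > 1.96 suggests departure from Poisson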

  19. Toxicity of zero-valent iron nanoparticles to a trichloroethylene-degrading groundwater microbial community.

    PubMed

    Zabetakis, Kara M; Niño de Guzmán, Gabriela T; Torrents, Alba; Yarwood, Stephanie

    2015-01-01

    The microbiological impact of zero-valent iron used in the remediation of groundwater was investigated by exposing a trichloroethylene-degrading anaerobic microbial community to two types of iron nanoparticles. Changes in total bacterial and archaeal population numbers were analyzed using qPCR and were compared to results from a blank and negative control to assess for microbial toxicity. Additionally, the results were compared to those of samples exposed to silver nanoparticles and iron filings in an attempt to discern the source of toxicity. Statistical analysis revealed that the three different iron treatments were equally toxic to the total bacteria and archaea populations, as compared with the controls. Conversely, the silver nanoparticles had a limited statistical impact when compared to the controls and increased the microbial populations in some instances. Therefore, the findings suggest that zero-valent iron toxicity does not result from a unique nanoparticle-based effect.

  20. Evaluation of Nasal Mucociliary Transport Rate by 99mTc-Macroaggregated Albumin Rhinoscintigraphy in Woodworkers.

    PubMed

    Dostbil, Zeki; Polat, Cahit; Uysal, Ismail Önder; Bakır, Salih; Karakuş, Askeri; Altındağ, Serdar

    2011-01-01

    Woodworkers in the furniture industry are exposed to wood dust in their workplaces. The aim of this study is to investigate the effect of occupational wood dust exposure on the nasal mucociliary transport rates (NMTRs) in woodworkers. Twenty-five woodworkers and 30 healthy controls were included in this study. Wood dust concentration in workplaces was measured using a sampling device. 99mTc-macroaggregated albumin (99mTc-MAA) rhinoscintigraphy was performed, and NMTR was calculated in all cases. In statistical analysis, an independent samples t-test was used to compare NMTR of woodworkers and control subjects. We found that the mean NMTR of the woodworkers was lower than that of the healthy controls. However, there was not a statistically significant difference between them (P = 0.066). In conclusion, our findings suggested that wood dust exposure may not impair nasal mucociliary transport rate in woodworkers employed in joinery workshops.

  1. Evaluation of Nasal Mucociliary Transport Rate by 99mTc-Macroaggregated Albumin Rhinoscintigraphy in Woodworkers

    PubMed Central

    Dostbil, Zeki; Polat, Cahit; Uysal, İsmail Önder; Bakır, Salih; Karakuş, Askeri; Altındağ, Serdar

    2011-01-01

    Woodworkers in the furniture industry are exposed to wood dust in their workplaces. The aim of this study is to investigate the effect of occupational wood dust exposure on the nasal mucociliary transport rates (NMTRs) in woodworkers. Twenty-five woodworkers and 30 healthy controls were included in this study. Wood dust concentration in workplaces was measured using the sampling device. 99mTc-macroaggregated albumin (99mTc-MAA) rhinoscintigraphy was performed, and NMTR was calculated in all cases. In statistical analysis, an independent samples t-test was used to compare NMTR of woodworkers and control subjects. We found that the mean NMTR of the woodworkers was lower than that of the healthy controls. However, there was not a statistically significant difference between them (P = 0.066). In conclusion, our findings suggested that wood dust exposure may not impair nasal mucociliary transport rate in woodworkers employed in joinery workshops. PMID:21804940

  2. Multifractal Properties of Process Control Variables

    NASA Astrophysics Data System (ADS)

    Domański, Paweł D.

    2017-06-01

    A control system is an inevitable element of any industrial installation, and its quality significantly affects overall process performance. Assessing whether a control system needs improvement requires relevant and constructive measures. There are various methods, such as time-domain measures, minimum variance indices, Gaussian and non-Gaussian statistical factors, and fractal and entropy indexes. The majority of approaches use time series of control variables and are able to cover many phenomena, but process complexities and human interventions cause effects that are hardly visible to standard measures. It is shown that the signals originating from industrial installations have multifractal properties, and such an analysis may extend the standard approach with further observations. The work is based on industrial and simulation data. The analysis delivers additional insight into the properties of the control system and the process. It helps to discover internal dependencies and human factors, which are otherwise hardly detectable.
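    As a rough illustration of the kind of analysis meant here (not code from the paper), a minimal multifractal detrended fluctuation analysis (MFDFA) can be sketched in Python; the synthetic signal, scales and q-values below are illustrative assumptions, and a q-dependent generalized Hurst exponent h(q) is what would indicate multifractality.

      import numpy as np

      def mfdfa(x, scales, q_values, order=1):
          """Generalized fluctuation functions F_q(s) of a 1-D signal."""
          profile = np.cumsum(x - np.mean(x))            # integrated (profile) series
          fq = np.zeros((len(q_values), len(scales)))
          for j, s in enumerate(scales):
              n_seg = len(profile) // s
              rms = np.empty(n_seg)
              for i in range(n_seg):
                  seg = profile[i * s:(i + 1) * s]
                  t = np.arange(s)
                  trend = np.polyval(np.polyfit(t, seg, order), t)   # local detrending
                  rms[i] = np.sqrt(np.mean((seg - trend) ** 2))
              for k, q in enumerate(q_values):
                  if np.isclose(q, 0.0):
                      fq[k, j] = np.exp(0.5 * np.mean(np.log(rms ** 2)))
                  else:
                      fq[k, j] = np.mean(rms ** q) ** (1.0 / q)
          return fq

      # Synthetic stand-in for a control-loop signal (e.g. a control error time series).
      rng = np.random.default_rng(0)
      signal = 0.01 * np.cumsum(rng.standard_normal(4096)) + rng.standard_normal(4096)
      scales = np.unique(np.logspace(np.log10(16), np.log10(512), 12).astype(int))
      q_values = np.array([-4.0, -2.0, -1.0, 1.0, 2.0, 4.0])
      fq = mfdfa(signal, scales, q_values)
      # Slope of log F_q(s) versus log s gives h(q); variation of h(q) with q suggests multifractality.
      h_q = [np.polyfit(np.log(scales), np.log(fq[k]), 1)[0] for k in range(len(q_values))]
      print(dict(zip(q_values.tolist(), np.round(h_q, 3).tolist())))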

  3. Statistical analysis of wavefront fluctuations from measurements of a wave-front sensor

    NASA Astrophysics Data System (ADS)

    Botygina, N. N.; Emaleev, O. N.; Konyaev, P. A.; Lukin, V. P.

    2017-11-01

    Measurements of the wave front aberrations at the input aperture of the Big Solar Vacuum Telescope (BSVT) were carried out by a wave-front sensor (WFS) of an adaptive optical system when the controlled deformable mirror was replaced by a plane one.

  4. The Use of Citation Counting to Identify Research Trends

    ERIC Educational Resources Information Center

    Rothman, Harry; Woodhead, Michael

    1971-01-01

    The analysis and application of manpower statistics to identify some long-term international research trends in economic entomology and pest control are described. Movements in research interests, particularly towards biological methods of control, correlations between these sectors, and the difficulties encountered in the construction of a…

  5. Title VII and the Male/Female Earnings Gap: An Economic Analysis.

    ERIC Educational Resources Information Center

    Beller, Andrea

    1978-01-01

    After controlling statistically for the effects of other factors that affect earnings, it was found that enforcement of sex discrimination charges under Title VII increased the relative demand for women and thus decreased the male/female earnings differential between 1967 and 1974. (Author)

  6. An introduction to statistical process control in research proteomics.

    PubMed

    Bramwell, David

    2013-12-16

    Statistical process control is a well-established and respected method which provides a general-purpose, consistent framework for monitoring and improving the quality of a process. It is routinely used in many industries where the quality of final products is critical and is often required in clinical diagnostic laboratories [1,2]. To date, the methodology has been little utilised in research proteomics. It has been shown to be capable of delivering quantitative QC procedures for qualitative clinical assays [3], making it an ideal methodology to apply to this area of biological research. To introduce statistical process control as an objective strategy for quality control and show how it could be used to benefit proteomics researchers and enhance the quality of the results they generate. We demonstrate that rules which provide basic quality control are easy to derive and implement and could have a major impact on data quality for many studies. Statistical process control is a powerful tool for investigating and improving proteomics research work-flows. The process of characterising measurement systems and defining control rules forces the exploration of key questions that can lead to significant improvements in performance. This work asserts that QC is essential to proteomics discovery experiments. Every experimenter must know the current capabilities of their measurement system and have an objective means for tracking and ensuring that performance. Proteomic analysis work-flows are complicated and multi-variate. QC is critical for clinical chemistry measurements and huge strides have been made in ensuring the quality and validity of results in clinical biochemistry labs. This work introduces some of these QC concepts and works to bridge their use from single analyte QC to applications in multi-analyte systems. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. Copyright © 2013 The Author. Published by Elsevier B.V. All rights reserved.
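    A minimal sketch of one such easily derived rule (a Shewhart individuals chart with the basic 3-sigma limit, estimated from the average moving range) is shown below; the run-to-run QC metric and its values are hypothetical, not taken from the article.

      import numpy as np

      def individuals_chart(x):
          """Center line, control limits and out-of-control points for an individuals chart."""
          x = np.asarray(x, dtype=float)
          center = x.mean()
          sigma_hat = np.abs(np.diff(x)).mean() / 1.128      # d2 constant for moving ranges of 2
          lcl, ucl = center - 3 * sigma_hat, center + 3 * sigma_hat
          flagged = np.where((x < lcl) | (x > ucl))[0]
          return center, lcl, ucl, flagged

      # Hypothetical run-to-run QC metric, e.g. the intensity of a spiked standard.
      qc_metric = [101, 99, 103, 98, 100, 102, 97, 110, 99, 101]
      center, lcl, ucl, flagged = individuals_chart(qc_metric)
      print(f"center={center:.1f}, limits=({lcl:.1f}, {ucl:.1f}), flagged runs={flagged.tolist()}")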

  7. Three-dimensional images contribute to the diagnosis of mucous retention cyst in maxillary sinus.

    PubMed

    Donizeth-Rodrigues, Cleomar; Fonseca-Da Silveira, Márcia; Gonçalves-De Alencar, Ana-Helena; Garcia-Santos-Silva, Maria-Alves; Francisco-De-Mendonça, Elismauro; Estrela, Carlos

    2013-01-01

    To evaluate the detection of mucous retention cyst of maxillary sinus (MRCMS) using panoramic radiography and cone beam computed tomography (CBCT). A digital database with 6,000 panoramic radiographs was reviewed for MRCMS. Suggestive images of MRCMS were detected on 185 radiographs, and patients were located and invited to return for follow-up. Thirty patients returned, and control panoramic radiographs were obtained 6 to 46 months after the initial radiograph. When MRCMS was found on control radiographs, CBCT scans were obtained. Cysts were measured and compared on radiographs and scans. The Wilcoxon, Spearman and Kolmogorov-Smirnov tests were used for statistical analysis. The level of significance was set at 5%. There were statistically significant differences between the two methods (p<0.05): 23 MRCMS detected on panoramic radiographs were confirmed by CBCT, but 5 MRCMS detected on CBCT images had not been identified by panoramic radiography. Eight MRCMS detected on control radiographs were not confirmed by CBCT. MRCMS size differences from initial to control panoramic radiographs and CBCT scans were not statistically significant (p= 0.617 and p= 0.626). The correlation between time and MRCMS size differences was not significant (r = -0.16, p = 0.381). CBCT scanning detects MRCMS more accurately than panoramic radiography.

  8. Piloting a Sex-Specific, Technology-Enhanced, Active Learning Intervention for Stroke Prevention in Women.

    PubMed

    Dirickson, Amanda; Stutzman, Sonja E; Alberts, Mark J; Novakovic, Roberta L; Stowe, Ann M; Beal, Claudia C; Goldberg, Mark P; Olson, DaiWai M

    2017-12-01

    Recent studies reveal deficiencies in stroke awareness and knowledge of risk factors among women. Existing stroke education interventions may not address common and sex-specific risk factors in the population with the highest stroke-related rate of mortality. This pilot study assessed the efficacy of a technology-enhanced, sex-specific educational program ("SISTERS") for women's knowledge of stroke. This was an experimental pretest-posttest design. The sample consisted of 150 women (mean age, 55 years) with at least 1 stroke risk factor. Participants were randomized to either the intervention (n = 75) or control (n = 75) group. Data were collected at baseline and at a 2-week posttest. There was no statistically significant difference in mean knowledge score (P = .67), mean confidence score (P = .77), or mean accuracy score (P = .75) between the intervention and control groups at posttest. Regression analysis revealed that older age was associated with lower knowledge scores (P < .001) and lower confidence scores (P < .001). After controlling for age, the SISTERS program was associated with a statistically significant difference in knowledge (P < .001) and confidence (P < .001). Although no change occurred overall, after controlling for age, there was a statistically significant benefit. Older women may have less comfort with technology and require consideration for cognitive differences.

  9. Eye-gaze determination of user intent at the computer interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldberg, J.H.; Schryver, J.C.

    1993-12-31

    Determination of user intent at the computer interface through eye-gaze monitoring can significantly aid applications for the disabled, as well as telerobotics and process control interfaces. Whereas current eye-gaze control applications are limited to object selection and x/y gazepoint tracking, a methodology was developed here to discriminate a more abstract interface operation: zooming in or out. This methodology first collects samples of eye-gaze location looking at controlled stimuli, at 30 Hz, just prior to a user's decision to zoom. The sample is broken into data frames, or temporal snapshots. Within a data frame, all spatial samples are connected into a minimum spanning tree, then clustered, according to user defined parameters. Each cluster is mapped to one in the prior data frame, and statistics are computed from each cluster. These characteristics include cluster size, position, and pupil size. A multiple discriminant analysis uses these statistics both within and between data frames to formulate optimal rules for assigning the observations into zoom-in, zoom-out, or no-zoom conditions. The statistical procedure effectively generates heuristics for future assignments, based upon these variables. Future work will enhance the accuracy and precision of the modeling technique, and will empirically test users in controlled experiments.
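    The minimum-spanning-tree clustering step described above can be sketched roughly as follows (assuming scipy; the gaze samples and edge-cut threshold are invented for illustration and are not the study's data or parameters):

      import numpy as np
      from scipy.spatial.distance import pdist, squareform
      from scipy.sparse.csgraph import minimum_spanning_tree, connected_components

      rng = np.random.default_rng(1)
      # Two synthetic fixation clusters standing in for one data frame of gaze samples (pixels).
      gaze = np.vstack([rng.normal([200, 150], 10, (15, 2)),
                        rng.normal([420, 310], 12, (15, 2))])

      dist = squareform(pdist(gaze))                  # pairwise distances between gaze samples
      mst = minimum_spanning_tree(dist).toarray()     # MST as a weighted adjacency matrix
      edge_cut = 50.0                                 # user-defined clustering parameter (pixels)
      mst[mst > edge_cut] = 0.0                       # cut long edges to split the tree into clusters
      n_clusters, labels = connected_components(mst + mst.T, directed=False)

      for c in range(n_clusters):
          pts = gaze[labels == c]
          print(f"cluster {c}: size={len(pts)}, centroid={pts.mean(axis=0).round(1)}")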

  10. A supportive-educative telephone program: impact on knowledge and anxiety after coronary artery bypass graft surgery.

    PubMed

    Beckie, T

    1989-01-01

    The purpose of this study was to investigate the impact of a supportive-educative telephone program on the levels of knowledge and anxiety of patients undergoing coronary artery bypass graft surgery during the first 6 weeks after hospital discharge. With a posttest-only control group design, the first 74 patients scheduled, between September 1986 and February 1987, for coronary artery bypass graft surgery in a large, western Canadian teaching hospital were randomly assigned to either an experimental or a control group. The effect of the intervention, which was implemented by a cardiac rehabilitation nurse specialist, was assessed by a knowledge test and a state anxiety inventory. Data were collected without knowledge of the participants' group assignment. As hypothesized, data analysis with independent t tests revealed a statistically significant (p less than 0.05) difference between the knowledge level of the experimental and the control group in the areas of coronary artery disease, diet, medications, physical activity restrictions, exercise, and rest. A statistically significant difference between the state anxiety level of the experimental and the control group was also evident, as was a statistically significant inverse relationship between participants' knowledge and anxiety levels. From these findings, several implications and recommendations for nursing practice and research have been generated.

  11. Does chess instruction improve mathematical problem-solving ability? Two experimental studies with an active control group.

    PubMed

    Sala, Giovanni; Gobet, Fernand

    2017-12-01

    It has been proposed that playing chess enables children to improve their ability in mathematics. These claims have been recently evaluated in a meta-analysis (Sala & Gobet, 2016, Educational Research Review, 18, 46-57), which indicated a significant effect in favor of the groups playing chess. However, the meta-analysis also showed that most of the reviewed studies used a poor experimental design (in particular, they lacked an active control group). We ran two experiments that used a three-group design including both an active and a passive control group, with a focus on mathematical ability. In the first experiment (N = 233), a group of third and fourth graders was taught chess for 25 hours and tested on mathematical problem-solving tasks. Participants also filled in a questionnaire assessing their meta-cognitive ability for mathematics problems. The group playing chess was compared to an active control group (playing checkers) and a passive control group. The three groups showed no statistically significant difference in mathematical problem-solving or metacognitive abilities in the posttest. The second experiment (N = 52) broadly used the same design, but the Oriental game of Go replaced checkers in the active control group. While the chess-treated group and the passive control group slightly outperformed the active control group with mathematical problem solving, the differences were not statistically significant. No differences were found with respect to metacognitive ability. These results suggest that the effects (if any) of chess instruction, when rigorously tested, are modest and that such interventions should not replace the traditional curriculum in mathematics.

  12. An experimental evaluation of the Sternberg task as a workload metric for helicopter Flight Handling Qualities (FHQ) research

    NASA Technical Reports Server (NTRS)

    Hemingway, J. C.

    1984-01-01

    The objective was to determine whether the Sternberg item-recognition task, employed as a secondary task measure of spare mental capacity for flight handling qualities (FHQ) simulation research, could help to differentiate between different flight-control conditions. FHQ evaluations were conducted on the Vertical Motion Simulator at Ames Research Center to investigate different primary flight-control configurations, and selected stability and control augmentation levels for helicopters engaged in low-level flight regimes. The Sternberg task was superimposed upon the primary flight-control task in a balanced experimental design. The results of parametric statistical analysis of Sternberg secondary task data failed to support the continued use of this task as a measure of pilot workload. In addition to the secondary task, subjects provided Cooper-Harper pilot ratings (CHPR) and responded to a workload questionnaire. The CHPR data also failed to provide reliable statistical discrimination between FHQ treatment conditions; some insight into the behavior of the secondary task was gained from the workload questionnaire data.

  13. The Effects of Energy Drinks on Cognitive Ability

    NASA Astrophysics Data System (ADS)

    Lucas, Marlon R.

    Fatigue problems have been widespread in the air traffic control industry; in past years a common practice among air traffic controllers has been to consume highly caffeinated beverages to maintain awareness and thwart sleep deprivation. This study sought to examine what impact the consumption of an energy drink had on the ability of Air Traffic Control Collegiate Training Initiative students at Middle Tennessee State University to solve Air Traffic Selection and Training Battery Applied Math-type test problems. Participants consumed a Red Bull energy drink or a placebo and then were asked to complete speed, time, distance, and rate of climb and descent questions, in addition to answering questions regarding their perception of energy drinks. An appropriate statistical analysis was applied to compare the scores of participants. The experimental group, which received the energy drink, averaged slightly lower (M=77.27, SD=19.79) than the control group, which consumed the placebo beverage (M=81.5, SD=19.01), but this difference was not statistically significant.

  14. The Sternberg Task as a Workload Metric in Flight Handling Qualities Research

    NASA Technical Reports Server (NTRS)

    Hemingway, J. C.

    1984-01-01

    The objective of this research was to determine whether the Sternberg item-recognition task, employed as a secondary task measure of spare mental capacity for flight handling qualities (FHQ) simulation research, could help to differentiate between different flight-control conditions. FHQ evaluations were conducted on the Vertical Motion Simulator at Ames Research Center to investigate different primary flight-control configurations, and selected stability and control augmentation levels for helicopters engaged in low-level flight regimes. The Sternberg task was superimposed upon the primary flight-control task in a balanced experimental design. The results of parametric statistical analysis of Sternberg secondary task data failed to support the continued use of this task as a measure of pilot workload. In addition to the secondary task, subjects provided Cooper-Harper pilot ratings (CHPR) and responded to a workload questionnaire. The CHPR data also failed to provide reliable statistical discrimination between FHQ treatment conditions; some insight into the behavior of the secondary task was gained from the workload questionnaire data.

  15. [Effect of overdose fluoride on expression of bone sialoprotein in developing dental tissues of rats].

    PubMed

    Xu, Zhi-ling; Wang, Qiang; Liu, Tian-lin; Guo, Li-ying; Jing, Feng-qiu; Liu, Hui

    2006-04-01

    To investigate the changes of bone sialoprotein (BSP) in developing dental tissues of rats exposed to fluoride. Twenty rats were randomly divided into two groups: one received distilled water (control group), the other received distilled water treated with fluoride (experimental group). When the fluorosis model was established, the changes in the expression of BSP were investigated and compared between the two groups. HE staining was used to observe the morphology of the cells, and an immunohistochemistry assay was used to determine the expression of BSP in rat incisors. Student's t test was used for statistical analysis. In the control group, the ameloblasts had normal morphology and were arranged in an orderly manner, and immunoreactivity for BSP was present in matured ameloblasts, dentinoblasts, cementoblasts, and the matrix. In the experimental group, however, the ameloblasts were arranged in multiple layers, the enamel matrix was disorganized, and the expression of BSP was significantly lower than that of the control group. Statistical analysis showed significant differences between the two groups (P<0.01). Fluoride can inhibit the expression of BSP in developing dental tissues of rats, and then inhibit differentiation of the tooth epithelial cells and secretion of matrix. This is a probable intracellular mechanism of dental fluorosis.

  16. Effectiveness of Conceptual Change Text-oriented Instruction on Students' Understanding of Energy in Chemical Reactions

    NASA Astrophysics Data System (ADS)

    Taştan, Özgecan; Yalçınkaya, Eylem; Boz, Yezdan

    2008-10-01

    The aim of this study is to compare the effectiveness of conceptual change text (CCT) instruction with that of traditional instruction in the context of energy in chemical reactions. The subjects of the study were 60 10th-grade students at a high school, who were in two different classes and taught by the same teacher. One of the classes was randomly selected as the experimental group, in which CCT instruction was applied, and the other as the control group, in which the traditional teaching method was used. The data were obtained through the use of the Energy Concept Test (ECT), the Attitude Scale towards Chemistry (ASC) and the Science Process Skill Test (SPST). In order to find out the effect of the conceptual change text on students' learning of the energy concept, independent sample t-tests, ANCOVA (analysis of covariance) and ANOVA (analysis of variance) were used. Results revealed that there was a statistically significant mean difference between the experimental and control group in terms of students' ECT total mean scores; however, there was no statistically significant difference between the experimental and control group in terms of students' attitude towards chemistry. These findings suggest that conceptual change text instruction enhances understanding and achievement.

  17. Lack of genotoxicity in medical oncology nurses handling antineoplastic drugs: effect of work environment and protective equipment.

    PubMed

    Gulten, Tuna; Evke, Elif; Ercan, Ilker; Evrensel, Turkkan; Kurt, Ender; Manavoglu, Osman

    2011-01-01

    In this study we aimed to investigate the genotoxic effects of antineoplastic agents in occupationally exposed oncology nurses. Genotoxic effects are disruptions of the integrity of DNA and are associated with cancer development. Biomonitoring of health care workers handling antineoplastic agents is helpful for the evaluation of exposure to cytostatics. The study included an exposed and two control groups. The exposed group (n=9) was comprised of oncology nurses. The first (n=9) and second (n=10) control groups were comprised of subjects who did not come into contact with antineoplastic drugs, working in the same department as the oncology nurses and in different departments, respectively. Genotoxicity evaluation was performed using SCE analysis. After applying culture, harvest and chromosome staining procedures, a total of 25 metaphases were analyzed per person. The Kruskal-Wallis test was used to perform statistical analysis. A statistically significant difference in sister chromatid exchange frequencies was not observed between the exposed and control groups. The lack of genotoxicity in medical oncology nurses might be due to good working conditions with high standards of technical equipment and improved personal protection.

  18. Serum Levels of 25-hydroxyvitamin D in Chronic Urticaria and its Association with Disease Activity: A Case Control Study

    PubMed Central

    Rather, Shagufta; Keen, Abid; Sajad, Peerzada

    2018-01-01

    Aim: To evaluate the relationship between vitamin D levels and chronic spontaneous urticaria (CSU) and compare with healthy age and sex matched controls. Material and Methods: This was a hospital-based cross-sectional study conducted over a period of 1 year, in which 110 patients with CSU were recruited along with an equal number of sex and age-matched healthy controls. For each patient, the urticaria activity score (UAS) was calculated and the autologous serum skin test (ASST) was performed. Plasma 25-hydroxyvitamin D [25-(OH)D] was analyzed by the chemiluminescence method. Vitamin D deficiency was defined as a serum 25-(OH)D concentration <30 ng/mL. The statistical analysis was carried out by using appropriate statistical tests. Results: The mean serum 25-(OH)D level of CSU patients was 19.6 ± 6.9 ng/mL, whereas in the control group the mean level was 38.5 ± 6.7 ng/mL, the difference being statistically significant (P < 0.001). A significant negative correlation was found between vitamin D levels and UAS (P < 0.001). The number of patients with ASST positivity was 44 (40%). Conclusion: The patients with CSU had reduced levels of vitamin D when compared to healthy controls. Furthermore, there was a significant negative correlation between the levels of serum vitamin D and severity of CSU. PMID:29854636

  19. Perceived Health Locus of Control, Self-Esteem, and Its Relations to Psychological Well-Being Status in Iranian Students

    PubMed Central

    Moshki, M; Ashtarian, H

    2010-01-01

    Background: Health locus of control (HLC) has been associated with a variety of ailments and health outcomes and is designed to predict behaviors and cognitive processes relevant to mental and physical health. This study investigated the relationships between perceived health locus of control, self-esteem, and mental health status among Iranian students. Methods: In this analytical study the subjects were recruited from students at Gonabad University of Medical Sciences, Iran, who were in their first year of study (N=154). Students completed questionnaires assessing demographic data, perceived health locus of control, self-esteem and psychological well-being. Results: The statistical analysis revealed a negative relationship between perceived Internal HLC and self-esteem with psychological well-being. The positive correlation of the perceived Chance HLC with psychological well-being was statistically significant (r= 0.21, P< 0.01) and the positive correlation of the perceived Internal HLC with self-esteem was statistically significant (r= 0.25, P< 0.01). A significantly direct relationship between low perceived Internal HLC, self-esteem and psychological problems was found among these students. Conclusion: The findings will be addressed in relation to their implications for effective mental health education based on health locus of control, especially internal and powerful-others beliefs associated with self-esteem for students. This will require additional monitoring and sustained effort in order to be effective. PMID:23113040

  20. Quantifying intrinsic and extrinsic control of single-cell fates in cancer and stem/progenitor cell pedigrees with competing risks analysis

    PubMed Central

    Cornwell, J. A.; Hallett, R. M.; Auf der Mauer, S.; Motazedian, A.; Schroeder, T.; Draper, J. S.; Harvey, R. P.; Nordon, R. E.

    2016-01-01

    The molecular control of cell fate and behaviour is a central theme in biology. Inherent heterogeneity within cell populations requires that control of cell fate is studied at the single-cell level. Time-lapse imaging and single-cell tracking are powerful technologies for acquiring cell lifetime data, allowing quantification of how cell-intrinsic and extrinsic factors control single-cell fates over time. However, cell lifetime data contain complex features. Competing cell fates, censoring, and the possible inter-dependence of competing fates, currently present challenges to modelling cell lifetime data. Thus far such features are largely ignored, resulting in loss of data and introducing a source of bias. Here we show that competing risks and concordance statistics, previously applied to clinical data and the study of genetic influences on life events in twins, respectively, can be used to quantify intrinsic and extrinsic control of single-cell fates. Using these statistics we demonstrate that 1) breast cancer cell fate after chemotherapy is dependent on p53 genotype; 2) granulocyte macrophage progenitors and their differentiated progeny have concordant fates; and 3) cytokines promote self-renewal of cardiac mesenchymal stem cells by symmetric divisions. Therefore, competing risks and concordance statistics provide a robust and unbiased approach for evaluating hypotheses at the single-cell level. PMID:27250534
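    As a hedged illustration of the competing-risks machinery referred to above (not the authors' code), the cause-specific cumulative incidence of one fate can be estimated nonparametrically from right-censored single-cell lifetimes; the lifetimes and fate labels below are synthetic.

      import numpy as np

      def cumulative_incidence(times, events, cause):
          """Aalen-Johansen-style cumulative incidence for one competing cause.
          events: 0 = right-censored, positive integers label competing fates."""
          times = np.asarray(times, dtype=float)
          events = np.asarray(events, dtype=int)
          surv, cif, curve = 1.0, 0.0, []
          for t in np.unique(times[events > 0]):
              at_risk = np.sum(times >= t)
              d_cause = np.sum((times == t) & (events == cause))
              d_any = np.sum((times == t) & (events > 0))
              cif += surv * d_cause / at_risk            # increment for the cause of interest
              surv *= 1.0 - d_any / at_risk              # update all-cause survival
              curve.append((t, cif))
          return curve

      # Synthetic cell lifetimes (hours): fate 1 = division, fate 2 = death, 0 = censored.
      times  = [10, 12, 15, 15, 18, 20, 22, 25, 26, 30, 31, 35, 40, 41, 45]
      events = [ 1,  2,  1,  0,  1,  2,  1,  1,  0,  2,  1,  0,  1,  2,  0]
      for t, p in cumulative_incidence(times, events, cause=1):
          print(f"t = {t:4.0f} h   P(divided by t) = {p:.2f}")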

  1. Modified optimal control pilot model for computer-aided design and analysis

    NASA Technical Reports Server (NTRS)

    Davidson, John B.; Schmidt, David K.

    1992-01-01

    This paper presents the theoretical development of a modified optimal control pilot model based upon the optimal control model (OCM) of the human operator developed by Kleinman, Baron, and Levison. This model is input compatible with the OCM and retains other key aspects of the OCM, such as a linear quadratic solution for the pilot gains with inclusion of control rate in the cost function, a Kalman estimator, and the ability to account for attention allocation and perception threshold effects. An algorithm designed for ease of implementation in current dynamic systems analysis and design software is presented. Example results based upon the analysis of a tracking task using three basic dynamic systems are compared with measured results and with similar analyses performed with the OCM and two previously proposed simplified optimal pilot models. The pilot frequency responses and error statistics obtained with this modified optimal control model are shown to compare more favorably to the measured experimental results than the other previously proposed simplified models evaluated.
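    The two building blocks named above, a linear quadratic solution with control rate in the cost function and a Kalman estimator, can be sketched for a toy plant as follows; the plant matrices, weights and noise intensities are invented for illustration and are not the model's parameters.

      import numpy as np
      from scipy.linalg import solve_continuous_are

      # Toy tracking plant: x = [error, error rate], scalar control u.
      A = np.array([[0.0, 1.0], [0.0, 0.0]])
      B = np.array([[0.0], [1.0]])
      C = np.array([[1.0, 0.0]])

      # Augment the state with u so the new input is u_dot, letting the cost penalize control rate.
      A_aug = np.block([[A, B], [np.zeros((1, 2)), np.zeros((1, 1))]])
      B_aug = np.vstack([np.zeros((2, 1)), np.ones((1, 1))])
      Q_aug = np.diag([10.0, 0.0, 1.0])      # weights on error, error rate, control
      R_rate = np.array([[0.1]])             # weight on control rate

      P = solve_continuous_are(A_aug, B_aug, Q_aug, R_rate)
      K = np.linalg.solve(R_rate, B_aug.T @ P)        # linear quadratic gains for the augmented state
      print("LQ gains:", K.round(3))

      # Steady-state Kalman estimator gain for the unaugmented plant.
      W = np.diag([0.0, 1.0])                # process (remnant-like) noise intensity
      V = np.array([[0.01]])                 # observation noise intensity
      Sigma = solve_continuous_are(A.T, C.T, W, V)
      L = Sigma @ C.T @ np.linalg.inv(V)     # Kalman gain
      print("Kalman gain:", L.round(3).ravel())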

  2. Information integration and diagnosis analysis of equipment status and production quality for machining process

    NASA Astrophysics Data System (ADS)

    Zan, Tao; Wang, Min; Hu, Jianzhong

    2010-12-01

    Machining status monitoring with multiple sensors can acquire and analyze machining process information to implement abnormality diagnosis and fault warning. Statistical quality control is normally used to distinguish abnormal fluctuations from normal ones by statistical methods. In this paper, the advantages and disadvantages of the two methods are compared, and the necessity and feasibility of their integration and fusion are discussed. An approach that integrates multi-sensor status monitoring and statistical process control, based on artificial intelligence, internet, and database techniques, is then proposed. Based on virtual instrument techniques, the authors developed the machining quality assurance system MoniSysOnline, which has been used to monitor the grinding process. By analyzing the quality data and acoustic emission (AE) signal information from the wheel dressing process, the cause of machining quality fluctuation was identified. The experimental results indicate that the approach is suitable for status monitoring and analysis of the machining process.

  3. Statistically Controlling for Confounding Constructs Is Harder than You Think

    PubMed Central

    Westfall, Jacob; Yarkoni, Tal

    2016-01-01

    Social scientists often seek to demonstrate that a construct has incremental validity over and above other related constructs. However, these claims are typically supported by measurement-level models that fail to consider the effects of measurement (un)reliability. We use intuitive examples, Monte Carlo simulations, and a novel analytical framework to demonstrate that common strategies for establishing incremental construct validity using multiple regression analysis exhibit extremely high Type I error rates under parameter regimes common in many psychological domains. Counterintuitively, we find that error rates are highest—in some cases approaching 100%—when sample sizes are large and reliability is moderate. Our findings suggest that a potentially large proportion of incremental validity claims made in the literature are spurious. We present a web application (http://jakewestfall.org/ivy/) that readers can use to explore the statistical properties of these and other incremental validity arguments. We conclude by reviewing SEM-based statistical approaches that appropriately control the Type I error rate when attempting to establish incremental validity. PMID:27031707
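    A minimal Monte Carlo sketch of the phenomenon (an assumption-laden toy version, not the authors' simulations): the outcome depends only on a latent construct, both the covariate and the focal predictor are noisy measures of that same construct with reliability 0.7, and yet the focal predictor is declared "incrementally valid" far more often than the nominal 5%.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(42)
      n, reliability, n_sims = 500, 0.7, 2000
      err_sd = np.sqrt((1 - reliability) / reliability)    # error SD giving the chosen reliability
      false_positives = 0

      for _ in range(n_sims):
          construct = rng.standard_normal(n)               # latent confounding construct
          y = construct + rng.standard_normal(n)           # outcome depends only on the construct
          x1 = construct + err_sd * rng.standard_normal(n) # covariate: noisy measure of construct
          x2 = construct + err_sd * rng.standard_normal(n) # predictor: second noisy measure

          X = np.column_stack([np.ones(n), x1, x2])
          beta, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
          resid = y - X @ beta
          dof = n - X.shape[1]
          se_b2 = np.sqrt(resid @ resid / dof * np.linalg.inv(X.T @ X)[2, 2])
          p = 2 * stats.t.sf(abs(beta[2] / se_b2), dof)
          false_positives += p < 0.05                      # x2 truly has no incremental effect

      print(f"empirical Type I error rate: {false_positives / n_sims:.3f} (nominal 0.05)")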

  4. Brain serotonin transporter density and aggression in abstinent methamphetamine abusers.

    PubMed

    Sekine, Yoshimoto; Ouchi, Yasuomi; Takei, Nori; Yoshikawa, Etsuji; Nakamura, Kazuhiko; Futatsubashi, Masami; Okada, Hiroyuki; Minabe, Yoshio; Suzuki, Katsuaki; Iwata, Yasuhide; Tsuchiya, Kenji J; Tsukada, Hideo; Iyo, Masaomi; Mori, Norio

    2006-01-01

    In animals, methamphetamine is known to have a neurotoxic effect on serotonin neurons, which have been implicated in the regulation of mood, anxiety, and aggression. It remains unknown whether methamphetamine damages serotonin neurons in humans. To investigate the status of brain serotonin neurons and their possible relationship with clinical characteristics in currently abstinent methamphetamine abusers. Case-control analysis. A hospital research center. Twelve currently abstinent former methamphetamine abusers (5 women and 7 men) and 12 age-, sex-, and education-matched control subjects recruited from the community. The brain regional density of the serotonin transporter, a structural component of serotonin neurons, was estimated using positron emission tomography and trans-1,2,3,5,6,10-beta-hexahydro-6-[4-(methylthio)phenyl]pyrrolo-[2,1-a]isoquinoline ([(11)C](+)McN-5652). Estimates were derived from region-of-interest and statistical parametric mapping methods, followed by within-case analysis using the measures of clinical variables. The duration of methamphetamine use, the magnitude of aggression and depressive symptoms, and changes in serotonin transporter density represented by the [(11)C](+)McN-5652 distribution volume. Methamphetamine abusers showed increased levels of aggression compared with controls. Region-of-interest and statistical parametric mapping analyses revealed that the serotonin transporter density in global brain regions (eg, the midbrain, thalamus, caudate, putamen, cerebral cortex, and cerebellum) was significantly lower in methamphetamine abusers than in control subjects, and this reduction was significantly inversely correlated with the duration of methamphetamine use. Furthermore, statistical parametric mapping analyses indicated that the density in the orbitofrontal, temporal, and anterior cingulate areas was closely associated with the magnitude of aggression in methamphetamine abusers. Protracted abuse of methamphetamine may reduce the density of the serotonin transporter in the brain, leading to elevated aggression, even in currently abstinent abusers.

  5. The influence of control group reproduction on the statistical power of the Environmental Protection Agency's Medaka Extended One Generation Reproduction Test (MEOGRT).

    PubMed

    Flynn, Kevin; Swintek, Joe; Johnson, Rodney

    2017-02-01

    Because of various Congressional mandates to protect the environment from endocrine disrupting chemicals (EDCs), the United States Environmental Protection Agency (USEPA) initiated the Endocrine Disruptor Screening Program. In the context of this framework, the Office of Research and Development within the USEPA developed the Medaka Extended One Generation Reproduction Test (MEOGRT) to characterize the endocrine action of a suspected EDC. One important endpoint of the MEOGRT is fecundity of medaka breeding pairs. Power analyses were conducted to determine the number of replicates needed in proposed test designs and to determine the effects that varying reproductive parameters (e.g. mean fecundity, variance, and days with no egg production) would have on the statistical power of the test. The MEOGRT Reproduction Power Analysis Tool (MRPAT) is a software tool developed to expedite these power analyses by both calculating estimates of the needed reproductive parameters (e.g. population mean and variance) and performing the power analysis under user-specified scenarios. Example scenarios are detailed that highlight the importance of the reproductive parameters on statistical power. When control fecundity is increased from 21 to 38 eggs per pair per day and the variance decreased from 49 to 20, the gain in power is equivalent to increasing replication by 2.5 times. On the other hand, if 10% of the breeding pairs, including controls, do not spawn, the power to detect a 40% decrease in fecundity drops to 0.54 from nearly 0.98 when all pairs have some level of egg production. Perhaps most importantly, MRPAT was used to inform the decision-making process that led to the final recommendation of the MEOGRT to have 24 control breeding pairs and 12 breeding pairs in each exposure group. Published by Elsevier Inc.
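    A simulation-based power calculation in the spirit of the scenarios above can be sketched as follows; the normal model for per-pair fecundity, the Welch t-test and all numbers are illustrative assumptions and are not MRPAT itself.

      import numpy as np
      from scipy import stats

      def power(mean_ctrl, variance, decrease, n_ctrl=24, n_trt=12,
                p_nonspawning=0.0, n_sims=5000, alpha=0.05, seed=0):
          """Fraction of simulated studies that detect the given decrease in fecundity."""
          rng = np.random.default_rng(seed)
          sd = np.sqrt(variance)
          hits = 0
          for _ in range(n_sims):
              ctrl = np.clip(rng.normal(mean_ctrl, sd, n_ctrl), 0, None)
              trt = np.clip(rng.normal(mean_ctrl * (1 - decrease), sd, n_trt), 0, None)
              ctrl[rng.random(n_ctrl) < p_nonspawning] = 0.0   # non-spawning control pairs
              trt[rng.random(n_trt) < p_nonspawning] = 0.0     # non-spawning treated pairs
              _, p = stats.ttest_ind(ctrl, trt, equal_var=False)
              hits += p < alpha
          return hits / n_sims

      print("all pairs spawning:     ", power(38, 20, 0.40))
      print("10% non-spawning pairs: ", power(38, 20, 0.40, p_nonspawning=0.10))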

  6. Diet and liver apoptosis in rats: a particular metabolic pathway.

    PubMed

    Monteiro, Maria Emilia Lopes; Xavier, Analucia Rampazzo; Azeredo, Vilma Blondet

    2017-03-30

    Various studies have indicated an association between modification in dietary macronutrient composition and liver apoptosis. To explain how changes in metabolic pathways associated with a high-protein, high-fat, and low-carbohydrate diet cause liver apoptosis. Two groups of rats were compared. An experimental diet group (n = 8) using a high-protein (59.46%), high-fat (31.77%), and low-carbohydrate (8.77%) diet versus a control one (n = 9) with the American Institute of Nutrition (AIN)-93-M diet. Animals were sacrificed after eight weeks, the adipose tissue weighed, the liver removed for flow cytometry analysis, and blood collected to measure glucose, insulin, glucagon, IL-6, TNF, triglycerides, malondialdehyde, and β-hydroxybutyrate. Statistical analysis was carried out using the unpaired and parametric Student's t-test and Pearson's correlation coefficients. Significance was set at p < 0.05. Animals from the experimental group presented less adipose tissue than those of the control group. The percentage of nonviable hepatocytes in the experimental group was 2.18 times larger than in the control group (p = 0.001). No statistically significant differences were found in capillary glucose, insulin, glucagon, IL-6, or TNF-α between the two groups. Plasmatic β-hydroxybutyrate and malondialdehyde of the experimental group showed higher levels, and triglycerides lower levels, compared with the control group. The results show a positive and significant correlation between the percentage of nonviable hepatocytes and malondialdehyde levels (p = 0.0217) and a statistically significant negative correlation with triglyceride levels (p = 0.006). The results suggest that plasmatic malondialdehyde and triglyceride levels are probably good predictors of liver damage associated with an experimental low-carbohydrate diet in rats.

  7. Development of computer-assisted instruction application for statistical data analysis android platform as learning resource

    NASA Astrophysics Data System (ADS)

    Hendikawati, P.; Arifudin, R.; Zahid, M. Z.

    2018-03-01

    This study aims to design an Android Statistics Data Analysis application that can be accessed through mobile devices, making it easier for users to access. The Statistics Data Analysis application covers various topics in basic statistics along with parametric statistical data analysis. The output of this application is parametric statistical data analysis that can be used by students, lecturers, and other users who need the results of statistical calculations quickly and in an easily understood form. The Android application was developed using the Java programming language. The server-side programming language is PHP with the CodeIgniter framework, and the database used is MySQL. The system development methodology used is the Waterfall methodology, with the stages of analysis, design, coding, testing, implementation and system maintenance. This statistical data analysis application is expected to support statistics lecturing activities and make it easier for students to understand statistical analysis on mobile devices.

  8. Epidemiologic methods in clinical trials.

    PubMed

    Rothman, K J

    1977-04-01

    Epidemiologic methods developed to control confounding in non-experimental studies are equally applicable for experiments. In experiments, most confounding is usually controlled by random allocation of subjects to treatment groups, but randomization does not preclude confounding except for extremely large studies, the degree of confounding expected being inversely related to the size of the treatment groups. In experiments, as in non-experimental studies, the extent of confounding for each risk indicator should be assessed, and if sufficiently large, controlled. Confounding is properly assessed by comparing the unconfounded effect estimate to the crude effect estimate; a common error is to assess confounding by statistical tests of significance. Assessment of confounding involves its control as a prerequisite. Control is most readily and cogently achieved by stratification of the data, though with many factors to control simultaneously, multivariate analysis or a combination of multivariate analysis and stratification might be necessary.
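    A small worked sketch of that comparison, with hypothetical 2x2 tables: confounding is assessed by contrasting the crude odds ratio with a stratified Mantel-Haenszel estimate.

      import numpy as np

      def odds_ratio(table):
          a, b, c, d = table.ravel()
          return (a * d) / (b * c)

      def mantel_haenszel_or(strata):
          """Mantel-Haenszel odds ratio pooled over strata of 2x2 tables."""
          cells = [s.ravel() for s in strata]
          num = sum(a * d / (a + b + c + d) for a, b, c, d in cells)
          den = sum(b * c / (a + b + c + d) for a, b, c, d in cells)
          return num / den

      # Hypothetical stratum-specific tables: rows = exposed/unexposed, columns = cases/non-cases.
      stratum_low  = np.array([[10, 90], [20, 380]])
      stratum_high = np.array([[40, 60], [30, 70]])

      crude = odds_ratio(stratum_low + stratum_high)
      adjusted = mantel_haenszel_or([stratum_low, stratum_high])
      # A material gap between the crude and stratified estimates indicates confounding.
      print(f"crude OR = {crude:.2f}, Mantel-Haenszel OR = {adjusted:.2f}")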

  9. [Analysis on theses of the Chinese Journal of Parasitology and Parasitic Diseases in 2009-2012].

    PubMed

    Yi, Feng-Yun; Qu, Lin-Ping; Yan, He; Sheng, Hui-Feng

    2013-12-01

    The articles published in the Chinese Journal of Parasitology and Parasitic Diseases in 2009-2012 were statistically analyzed. Among the 547 papers published in the four years, original articles accounted for 45.3% (248/547). The number of authors was 2712, with an average cooperation degree of 5.0, and co-authored papers accounted for 95.4% of the total. Authors were mainly from colleges/universities (51.9%, 284/547), institutions for disease control (34.4%, 188/547) and hospitals/health centers (13.7%, 75/547). The average publishing delay was 212, 141, 191 and 207 days in 2009-2012, respectively. The statistical analysis reflects the characteristics and academic level of the journal, which is useful for improving its quality, and reveals the latest developments and trends.

  10. The Determination of Self-Control Skill of Tuberculosis Patients According to Some Variables.

    PubMed

    Pehlivan, Şenay Akgün; Purutcuoğlu, Eda; Duyan, Gülsüm; Duyan, Veli

    2015-01-01

    This study was conducted in Ankara, Turkey. In the study, a questionnaire form and the Self-Control Schedule (SCS) were used. According to the t-test, a statistically significant difference in self-control skill according to "support of family members to each other" was found. Analysis showed that occupational status and educational level had a significant effect on self-control skill. In addition, there is a positive relationship between average monthly income and self-control skill, while there is a negative correlation among number of hospitalizations, diagnosis period, and self-control skill. Our results may be a guide to developing more effective intervention strategies for tuberculosis management.

  11. [Evaluation of using statistical methods in selected national medical journals].

    PubMed

    Sych, Z

    1996-01-01

    The paper evaluates the frequency with which statistical methods were applied in works published in six selected national medical journals in the years 1988-1992. The following journals were chosen for analysis: Klinika Oczna, Medycyna Pracy, Pediatria Polska, Polski Tygodnik Lekarski, Roczniki Państwowego Zakładu Higieny, and Zdrowie Publiczne. A number of works corresponding to the average of the remaining journals was randomly selected from the respective volumes of Pol. Tyg. Lek. The analysis did not include works in which no statistical analysis was implemented, whether national or international publications; this exemption also extended to review papers, case reports, reviews of books, handbooks, monographs, reports from scientific congresses, and papers on historical topics. The number of works was determined in each volume. Next, the mode of selecting a sample in each study was established, distinguishing two categories: random and purposive selection. Attention was also paid to the presence of a control sample in the individual works, and to the completeness of the sample characteristics, using three categories: complete, partial, and lacking. An effort was made to present the results in tables and figures (Tab. 1, 3). The rate of employing statistical methods in the analyzed works was determined for the relevant volumes of the six journals for 1988-1992, together with the number of works in which no statistical methods were used. Concurrently, the frequency of applying individual statistical methods was analyzed. Prominence was given to fundamental methods of descriptive statistics (measures of position, measures of dispersion) and to the most important methods of mathematical statistics, such as parametric tests of significance, analysis of variance (in single and dual classifications), non-parametric tests of significance, and correlation and regression. Works using multiple correlation, multiple regression, or more complex methods for studying relationships among two or more variables were counted among those using correlation and regression, as well as other methods, e.g. statistical methods used in epidemiology (incidence and morbidity coefficients, standardization of coefficients, survival tables), factor analysis by the Jacobi-Hotelling method, taxonomic methods, and others. On the basis of the study it was established that the frequency of employing statistical methods in the six selected national medical journals in 1988-1992 was 61.1-66.0% of the analyzed works (Tab. 3), generally similar to the frequency reported in English-language medical journals. On the whole, no significant differences were found in the frequency of applied statistical methods (Tab. 4) or in the frequency of random sampling (Tab. 3) in the analyzed works across the years 1988-1992. The statistical methods most frequently used in the analyzed works for 1988-1992 were measures of position (44.2-55.6%), measures of dispersion (32.5-38.5%), and parametric tests of significance (26.3-33.1% of the works analyzed) (Tab. 4). To increase the frequency and reliability of the statistical methods used, the teaching of biostatistics should be expanded both in medical studies and in postgraduate training for physicians and academic staff.

  12. Impact of traumatic dental injury on the quality of life of young children: a case-control study.

    PubMed

    Vieira-Andrade, Raquel Gonçalves; Siqueira, Maria Betânia Lins; Gomes, Genara Brum; D'Avila, Sérgio; Pordeus, Isabela Almeida; Paiva, Saul Martins; Granville-Garcia, Ana Flávia

    2015-10-01

    There are no longitudinal studies that assess the impact of traumatic dental injury (TDI) on the oral health-related quality of life (OHRQoL) of preschool children. To investigate the impact of TDI on OHRQoL among preschool children, a population-based case-control study was carried out with a representative sample of 335 children, 3-5 years of age, enrolled at public and private preschools in the city of Campina Grande, Brazil. The case group and the control group were matched for age, gender, type of preschool and monthly household income at a ratio of 1:4 (67 cases and 286 controls). Impact on the OHRQoL of children was assessed through administration of the Early Childhood Oral Health Impact Scale (ECOHIS). The occurrence of TDI was determined through clinical examinations performed by three calibrated dentists. Data analysis involved descriptive statistics, McNemar's test, the chi-square test with linear trend and conditional logistic regression analysis [P≤0.05; 95% confidence interval (95% CI)]. The most frequent responses were 'felt pain' (19.4%) and 'difficulty eating' (16.4%). The prevalence of TDI was 37.3% in the case group and 33.9% in the control group. No statistically significant differences were found between case and control groups regarding the presence of TDI (odds ratio=1.16; 95% CI: 0.66-2.02). TDI had no impact on the quality of life of preschool children. © 2015 FDI World Dental Federation.

  13. Investigation of attention deficit hyperactivity disorder (ADHD) sub-types in children via EEG frequency domain analysis.

    PubMed

    Aldemir, Ramazan; Demirci, Esra; Per, Huseyin; Canpolat, Mehmet; Özmen, Sevgi; Tokmakçı, Mahmut

    2018-04-01

    To investigate the frequency domain effects and changes in electroencephalography (EEG) signals in children diagnosed with attention deficit hyperactivity disorder (ADHD). The study included 40 children, all between the ages of 7 and 12 years. Participants were classified into four groups: ADHD (n=20), ADHD-I (ADHD-Inattentive type) (n=10), ADHD-C (ADHD-Combined type) (n=10), and control (n=20). In this study, the frequency domain of EEG signals for the ADHD, subtype and control groups was analyzed and compared using Matlab software. The mean age of the ADHD children was 8.7 years and of the control group 9.1 years. Spectral analysis of mean power (μV²) and relative mean power (%) was carried out for four frequency bands: delta (0-4 Hz), theta (4-8 Hz), alpha (8-13 Hz) and beta (13-32 Hz). The ADHD group and the ADHD-I and ADHD-C subtypes had higher average delta and theta band power than the control group; however, this was not the case for the alpha and beta bands. A statistically significant increase in the delta/beta ratio was found only between the ADHD-I and control groups, whereas statistically significant differences in the delta/beta and theta/delta ratios were found between the ADHD-C and control groups. EEG analyses can be used as an alternative method when ADHD subgroups are identified.
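    The band-power computation described above can be sketched with Welch's method in Python/scipy; the sampling rate, synthetic signal and the 0.5 Hz lower delta edge (used to exclude the DC bin) are illustrative assumptions rather than the study's processing pipeline.

      import numpy as np
      from scipy.signal import welch

      fs = 256                                         # assumed sampling rate (Hz)
      rng = np.random.default_rng(0)
      t = np.arange(0, 60, 1 / fs)
      # Synthetic EEG-like trace: a dominant theta rhythm plus broadband noise (in microvolts).
      eeg = 20 * np.sin(2 * np.pi * 6 * t) + 10 * rng.standard_normal(t.size)

      freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)   # power spectral density (uV^2/Hz)
      df = freqs[1] - freqs[0]
      bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 32)}

      power = {name: psd[(freqs >= lo) & (freqs < hi)].sum() * df
               for name, (lo, hi) in bands.items()}
      total = sum(power.values())
      relative = {name: p / total for name, p in power.items()}

      print({name: round(p, 1) for name, p in power.items()})      # absolute band power
      print({name: round(r, 2) for name, r in relative.items()})   # relative band power
      print("theta/beta ratio:", round(power["theta"] / power["beta"], 2))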

  14. Performance evaluation of mobile downflow booths for reducing airborne particles in the workplace.

    PubMed

    Lo, Li-Ming; Hocker, Braden; Steltz, Austin E; Kremer, John; Feng, H Amy

    2017-11-01

    Compared to other common control measures, the downflow booth is a costly engineering control used to contain airborne dust or particles. The downflow booth provides unidirectional filtered airflow from the ceiling, entraining released particles away from the workers' breathing zone, and delivers contained airflow to a lower level exhaust for removing particulates by filtering media. In this study, we designed and built a mobile downflow booth that is capable of quick assembly and easy size change to provide greater flexibility and particle control for various manufacturing processes or tasks. An experimental study was conducted to thoroughly evaluate the control performance of downflow booths used for removing airborne particles generated by the transfer of powdered lactose between two containers. Statistical analysis compared particle reduction ratios obtained from various test conditions including booth size (short, regular, or extended), supply air velocity (0.41 and 0.51 m/s or 80 and 100 feet per minute, fpm), powder transfer location (near or far from the booth exhaust), and inclusion or exclusion of curtains at the booth entrance. Our study results show that only short-depth downflow booths failed to protect the worker performing powder transfer far from the booth exhausts. Statistical analysis shows that better control performance can be obtained with supply air velocity of 0.51 m/s (100 fpm) than with 0.41 m/s (80 fpm) and that use of curtains for downflow booths did not improve their control performance.

  15. Research on Time Selection of Mass Sports in Tibetan Areas Plateau of Gansu Province Based on Environmental Science

    NASA Astrophysics Data System (ADS)

    Gao, Jike

    2018-01-01

    Using the methods of literature review, instrument measurement, questionnaire and mathematical statistics, this paper analyzed the current situation of mass sports in the Tibetan plateau areas of Gansu Province. Based on experimentally measured air pollutant and meteorological index data from the Tibetan areas of Gansu Province, compared against the relevant national standards and exercise science, and on statistical analysis of these data, the study aims to provide scientific methods and appropriate times for people of the Gansu Tibetan plateau to participate in physical exercise.

  16. Wavelet and receiver operating characteristic analysis of heart rate variability

    NASA Astrophysics Data System (ADS)

    McCaffery, G.; Griffith, T. M.; Naka, K.; Frennaux, M. P.; Matthai, C. C.

    2002-02-01

    Multiresolution wavelet analysis has been used to study the heart rate variability in two classes of patients with different pathological conditions. The scale-dependent measure of Thurner et al. was found to be statistically significant in discriminating patients suffering from hypertrophic cardiomyopathy from a control set of normal subjects. We have performed Receiver Operating Characteristic (ROC) analysis and found the ROC area to be a useful measure by which to label the significance of the discrimination, as well as to describe the severity of heart dysfunction.
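    Assuming the PyWavelets package, the scale-dependent measure (the standard deviation of wavelet detail coefficients at a fixed scale, after Thurner et al.) and a nonparametric ROC area can be sketched as follows; the RR-interval series, wavelet and scale are illustrative assumptions, not the study's data.

      import numpy as np
      import pywt

      def wavelet_scale_std(rr, wavelet="db4", level=5):
          """Standard deviation of the detail coefficients at the coarsest requested scale."""
          coeffs = pywt.wavedec(rr, wavelet, level=level)   # [cA_L, cD_L, ..., cD_1]
          return np.std(coeffs[1])

      def roc_area(scores_pos, scores_neg):
          """ROC area via its Mann-Whitney interpretation: P(score_pos > score_neg)."""
          pos, neg = np.asarray(scores_pos), np.asarray(scores_neg)
          wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
          return wins / (len(pos) * len(neg))

      # Synthetic RR-interval series: "patients" with reduced beat-to-beat variability.
      rng = np.random.default_rng(3)
      patients = [wavelet_scale_std(rng.normal(0.8, 0.02, 1024)) for _ in range(20)]
      controls = [wavelet_scale_std(rng.normal(0.8, 0.05, 1024)) for _ in range(20)]
      print("ROC area:", round(roc_area(controls, patients), 2))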

  17. Gene-Based Association Analysis for Censored Traits Via Fixed Effect Functional Regressions.

    PubMed

    Fan, Ruzong; Wang, Yifan; Yan, Qi; Ding, Ying; Weeks, Daniel E; Lu, Zhaohui; Ren, Haobo; Cook, Richard J; Xiong, Momiao; Swaroop, Anand; Chew, Emily Y; Chen, Wei

    2016-02-01

    Genetic studies of survival outcomes have been proposed and conducted recently, but statistical methods for identifying genetic variants that affect disease progression are rarely developed. Motivated by our ongoing real studies, here we develop Cox proportional hazard models using functional regression (FR) to perform gene-based association analysis of survival traits while adjusting for covariates. The proposed Cox models are fixed effect models where the genetic effects of multiple genetic variants are assumed to be fixed. We introduce likelihood ratio test (LRT) statistics to test for associations between the survival traits and multiple genetic variants in a genetic region. Extensive simulation studies demonstrate that the proposed Cox FR LRT statistics have well-controlled type I error rates. To evaluate power, we compare the Cox FR LRT with the previously developed burden test (BT) in a Cox model and sequence kernel association test (SKAT), which is based on mixed effect Cox models. The Cox FR LRT statistics have higher power than or similar power as Cox SKAT LRT except when 50%/50% causal variants had negative/positive effects and all causal variants are rare. In addition, the Cox FR LRT statistics have higher power than Cox BT LRT. The models and related test statistics can be useful in the whole genome and whole exome association studies. An age-related macular degeneration dataset was analyzed as an example. © 2016 WILEY PERIODICALS, INC.

  18. Gene-based Association Analysis for Censored Traits Via Fixed Effect Functional Regressions

    PubMed Central

    Fan, Ruzong; Wang, Yifan; Yan, Qi; Ding, Ying; Weeks, Daniel E.; Lu, Zhaohui; Ren, Haobo; Cook, Richard J; Xiong, Momiao; Swaroop, Anand; Chew, Emily Y.; Chen, Wei

    2015-01-01

    Summary Genetic studies of survival outcomes have been proposed and conducted recently, but statistical methods for identifying genetic variants that affect disease progression are rarely developed. Motivated by our ongoing real studies, we develop here Cox proportional hazard models using functional regression (FR) to perform gene-based association analysis of survival traits while adjusting for covariates. The proposed Cox models are fixed effect models where the genetic effects of multiple genetic variants are assumed to be fixed. We introduce likelihood ratio test (LRT) statistics to test for associations between the survival traits and multiple genetic variants in a genetic region. Extensive simulation studies demonstrate that the proposed Cox FR LRT statistics have well-controlled type I error rates. To evaluate power, we compare the Cox FR LRT with the previously developed burden test (BT) in a Cox model and sequence kernel association test (SKAT) which is based on mixed effect Cox models. The Cox FR LRT statistics have higher power than or similar power as Cox SKAT LRT except when 50%/50% causal variants had negative/positive effects and all causal variants are rare. In addition, the Cox FR LRT statistics have higher power than Cox BT LRT. The models and related test statistics can be useful in the whole genome and whole exome association studies. An age-related macular degeneration dataset was analyzed as an example. PMID:26782979

  19. Efficacy of Exclusive Lingual Nerve Block versus Conventional Inferior Alveolar Nerve Block in Achieving Lingual Soft-tissue Anesthesia.

    PubMed

    Balasubramanian, Sasikala; Paneerselvam, Elavenil; Guruprasad, T; Pathumai, M; Abraham, Simin; Krishnakumar Raja, V B

    2017-01-01

    The aim of this randomized clinical trial was to assess the efficacy of exclusive lingual nerve block (LNB) in achieving selective lingual soft-tissue anesthesia in comparison with conventional inferior alveolar nerve block (IANB). A total of 200 patients indicated for the extraction of lower premolars were recruited for the study. The samples were allocated by randomization into control and study groups. Lingual soft-tissue anesthesia was achieved by IANB and exclusive LNB in the control and study group, respectively. The primary outcome variable studied was anesthesia of ipsilateral lingual mucoperiosteum, floor of mouth and tongue. The secondary variables assessed were (1) taste sensation immediately following administration of local anesthesia and (2) mouth opening and lingual nerve paresthesia on the first postoperative day. Data analysis for descriptive and inferential statistics was performed using SPSS (IBM SPSS Statistics for Windows, Version 22.0, Armonk, NY: IBM Corp. Released 2013) and a P < 0.05 was considered statistically significant. In comparison with the control group, the study group (LNB) showed statistically significant anesthesia of the lingual gingiva of incisors, molars, anterior floor of the mouth, and anterior tongue. Exclusive LNB is superior to IAN nerve block in achieving selective anesthesia of lingual soft tissues. It is technically simple and associated with minimal complications as compared to IAN block.

  20. Efficacy of Exclusive Lingual Nerve Block versus Conventional Inferior Alveolar Nerve Block in Achieving Lingual Soft-tissue Anesthesia

    PubMed Central

    Balasubramanian, Sasikala; Paneerselvam, Elavenil; Guruprasad, T; Pathumai, M; Abraham, Simin; Krishnakumar Raja, V. B.

    2017-01-01

    Objective: The aim of this randomized clinical trial was to assess the efficacy of exclusive lingual nerve block (LNB) in achieving selective lingual soft-tissue anesthesia in comparison with conventional inferior alveolar nerve block (IANB). Materials and Methods: A total of 200 patients indicated for the extraction of lower premolars were recruited for the study. The samples were allocated by randomization into control and study groups. Lingual soft-tissue anesthesia was achieved by IANB and exclusive LNB in the control and study group, respectively. The primary outcome variable studied was anesthesia of ipsilateral lingual mucoperiosteum, floor of mouth and tongue. The secondary variables assessed were (1) taste sensation immediately following administration of local anesthesia and (2) mouth opening and lingual nerve paresthesia on the first postoperative day. Results: Data analysis for descriptive and inferential statistics was performed using SPSS (IBM SPSS Statistics for Windows, Version 22.0, Armonk, NY: IBM Corp. Released 2013) and a P < 0.05 was considered statistically significant. In comparison with the control group, the study group (LNB) showed statistically significant anesthesia of the lingual gingiva of incisors, molars, anterior floor of the mouth, and anterior tongue. Conclusion: Exclusive LNB is superior to IAN nerve block in achieving selective anesthesia of lingual soft tissues. It is technically simple and associated with minimal complications as compared to IAN block. PMID:29264294

  1. Fumonisin B1 and Risk of Hepatocellular Carcinoma in Two Chinese Cohorts

    PubMed Central

    Persson, E. Christina; Sewram, Vikash; Evans, Alison A.; London, W. Thomas; Volkwyn, Yvette; Shen, Yen-Ju; Van Zyl, Jacobus A.; Chen, Gang; Lin, Wenyao; Shephard, Gordon S.; Taylor, Philip R.; Fan, Jin-Hu; Dawsey, Sanford M.; Qiao, You-Lin; McGlynn, Katherine A.; Abnet, Christian C.

    2011-01-01

    Fumonisin B1 (FB1), a mycotoxin that contaminates corn in certain climates, has been demonstrated to cause hepatocellular cancer (HCC) in animal models. Whether a relationship between FB1 and HCC exists in humans is not known. To examine the hypothesis, we conducted case-control studies nested within two large cohorts in China: the Haimen City Cohort and the General Population Study of the Nutritional Intervention Trials cohort in Linxian. In the Haimen City Cohort, nail FB1 levels were determined in 271 HCC cases and 280 controls. In the General Population Nutritional Intervention Trial, nail FB1 levels were determined in 72 HCC cases and 147 controls. In each population, odds ratios and 95% confidence intervals (95%CI) from logistic regression models estimated the association between measurable FB1 and HCC, adjusting for hepatitis B virus infection and other factors. A meta-analysis that included both populations was also conducted. The analysis revealed no statistically significant association between FB1 and HCC in either Haimen City (OR=1.10, 95%CI=0.64–1.89) or in Linxian (OR=1.47, 95%CI=0.70–3.07). Similarly, the pooled meta-analysis showed no statistically significant association between FB1 exposure and HCC (OR=1.22, 95%CI=0.79–1.89). These findings, although somewhat preliminary, do not support an association between FB1 and HCC. PMID:22142693

  2. Statistical Analysis of Individual Participant Data Meta-Analyses: A Comparison of Methods and Recommendations for Practice

    PubMed Central

    Stewart, Gavin B.; Altman, Douglas G.; Askie, Lisa M.; Duley, Lelia; Simmonds, Mark C.; Stewart, Lesley A.

    2012-01-01

    Background Individual participant data (IPD) meta-analyses that obtain “raw” data from studies rather than summary data typically adopt a “two-stage” approach to analysis whereby IPD within trials generate summary measures, which are combined using standard meta-analytical methods. Recently, a range of “one-stage” approaches which combine all individual participant data in a single meta-analysis have been suggested as providing a more powerful and flexible approach. However, they are more complex to implement and require statistical support. This study uses a dataset to compare “two-stage” and “one-stage” models of varying complexity, to ascertain whether results obtained from the approaches differ in a clinically meaningful way. Methods and Findings We included data from 24 randomised controlled trials, evaluating antiplatelet agents, for the prevention of pre-eclampsia in pregnancy. We performed two-stage and one-stage IPD meta-analyses to estimate overall treatment effect and to explore potential treatment interactions whereby particular types of women and their babies might benefit differentially from receiving antiplatelets. Two-stage and one-stage approaches gave similar results, showing a benefit of using anti-platelets (Relative risk 0.90, 95% CI 0.84 to 0.97). Neither approach suggested that any particular type of women benefited more or less from antiplatelets. There were no material differences in results between different types of one-stage model. Conclusions For these data, two-stage and one-stage approaches to analysis produce similar results. Although one-stage models offer a flexible environment for exploring model structure and are useful where across study patterns relating to types of participant, intervention and outcome mask similar relationships within trials, the additional insights provided by their usage may not outweigh the costs of statistical support for routine application in syntheses of randomised controlled trials. Researchers considering undertaking an IPD meta-analysis should not necessarily be deterred by a perceived need for sophisticated statistical methods when combining information from large randomised trials. PMID:23056232
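
    To make the two-stage/one-stage distinction concrete, the sketch below contrasts the approaches on binary individual participant data. It uses odds ratios for simplicity rather than the relative risks reported above, assumes pandas and statsmodels, and the file and column names are placeholders.

        # Hypothetical sketch: two-stage vs one-stage IPD meta-analysis of a binary outcome.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        ipd = pd.read_csv("ipd.csv")   # one row per participant: trial, treat (0/1), event (0/1)

        # Two-stage: estimate a log odds ratio per trial, then pool by inverse variance.
        estimates = []
        for _, trial in ipd.groupby("trial"):
            fit = smf.logit("event ~ treat", data=trial).fit(disp=0)
            estimates.append((fit.params["treat"], fit.bse["treat"] ** 2))
        logor, var = np.array(estimates).T
        w = 1.0 / var
        pooled = np.sum(w * logor) / np.sum(w)
        se = np.sqrt(1.0 / np.sum(w))
        print("two-stage OR:", np.exp(pooled), "95% CI:",
              np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se))

        # One-stage: all participants in a single model with trial as a fixed effect.
        one_stage = smf.logit("event ~ treat + C(trial)", data=ipd).fit(disp=0)
        print("one-stage OR:", np.exp(one_stage.params["treat"]))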

  3. Topical tranexamic acid in total knee replacement: a systematic review and meta-analysis.

    PubMed

    Panteli, Michalis; Papakostidis, Costas; Dahabreh, Ziad; Giannoudis, Peter V

    2013-10-01

    To examine the safety and efficacy of topical use of tranexamic acid (TA) in total knee arthroplasty (TKA). An electronic literature search of PubMed Medline, Ovid Medline, Embase, and the Cochrane Library was performed, identifying studies published in any language from 1966 to February 2013. The studies enrolled adults undergoing a primary TKA, where topical TA was used. An inverse variance statistical method and either a fixed or random effect model, depending on the absence or presence of statistical heterogeneity, were used; subgroup analysis was performed when possible. We identified a total of seven eligible reports for analysis. Our meta-analysis indicated that, when compared with the control group, topical application of TA significantly limited postoperative drain output (mean difference: -268.36 ml), total blood loss (mean difference=-220.08 ml), Hb drop (mean difference=-0.94 g/dL) and lowered the risk of transfusion requirements (risk ratio=0.47, 95% CI=0.26-0.84), without increased risk of thromboembolic events. Sub-group analysis indicated that a higher dose of topical TA (>2 g) significantly reduced transfusion requirements. Although the present meta-analysis demonstrated a statistically significant reduction of postoperative blood loss and transfusion requirements with topical use of TA in TKA, the clinical importance of the respective estimates of effect size should be interpreted with caution. Copyright © 2013 Elsevier B.V. All rights reserved.
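
    The pooling step described above can be illustrated with a short sketch of inverse-variance weighting, including a DerSimonian-Laird random-effects variant. The mean differences and standard errors below are placeholders, not the study estimates.

        # Hypothetical sketch: fixed- and random-effects pooling of mean differences.
        import numpy as np

        md = np.array([-250.0, -310.0, -180.0, -275.0])   # per-study mean differences (mL)
        se = np.array([60.0, 85.0, 70.0, 55.0])           # per-study standard errors

        w_fixed = 1.0 / se**2
        pooled_fixed = np.sum(w_fixed * md) / np.sum(w_fixed)

        # Cochran's Q and the DerSimonian-Laird between-study variance tau^2.
        q = np.sum(w_fixed * (md - pooled_fixed) ** 2)
        dof = len(md) - 1
        c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
        tau2 = max(0.0, (q - dof) / c)

        w_random = 1.0 / (se**2 + tau2)
        pooled_random = np.sum(w_random * md) / np.sum(w_random)
        se_random = np.sqrt(1.0 / np.sum(w_random))
        print(f"fixed effect: {pooled_fixed:.1f} mL, random effects: {pooled_random:.1f} mL "
              f"(95% CI {pooled_random - 1.96 * se_random:.1f} to {pooled_random + 1.96 * se_random:.1f})")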

  4. Effects of lifestyle change interventions for people with intellectual disabilities: Systematic review and meta-analysis of randomized controlled trials.

    PubMed

    Willems, Mariël; Waninge, Aly; Hilgenkamp, Thessa I M; van Empelen, Pepijn; Krijnen, Wim P; van der Schans, Cees P; Melville, Craig A

    2018-05-08

    Promotion of a healthy lifestyle for people with intellectual disabilities is important; however, the effectiveness of lifestyle change interventions is unclear. This research will examine the effectiveness of lifestyle change interventions for people with intellectual disabilities. Randomized controlled trials (RCTs) of lifestyle change interventions for people with intellectual disabilities were included in a systematic review and meta-analysis. Data on study and intervention characteristics were extracted, as well as data on outcome measures and results. Internal validity of the selected papers was assessed using the Cochrane Collaboration's risk bias tool. Eight RCTs were included. Multiple outcome measures were used, whereby outcome measures targeting environmental factors and participation were lacking and personal outcome measures were mostly used by a single study. Risks of bias were found for all studies. Meta-analysis showed some effectiveness for lifestyle change interventions, and a statistically significant decrease was found for waist circumference. Some effectiveness was found for lifestyle change interventions for people with intellectual disabilities. However, the effects were only statistically significant for waist circumference, so current lifestyle change interventions may not be optimally tailored to meet the needs of people with intellectual disabilities. © 2018 John Wiley & Sons Ltd.

  5. Duration on unemployment: geographic mobility and selectivity bias.

    PubMed

    Goss, E P; Paul, C; Wilhite, A

    1994-01-01

    Modeling of the factors affecting the duration of unemployment was found to be influenced by the inclusion of migration factors. Traditional models that did not control for migration factors were found to underestimate movers' probability of finding an acceptable job. The empirical test of the theory, based on the analysis of data on US household heads unemployed in 1982 and employed in 1982 and 1983, found that the cumulative probability of reemployment in the traditional model was .422 and in the migration selectivity model was .624 after 30 weeks of searching. In addition, controlling for selectivity eliminated the significance of the relationship between race and job search duration in the model. The relationship between search duration and the county unemployment rate in 1982 became statistically significant, and the relationship between search duration and 1980 population per square mile in the 1982 county of residence became statistically insignificant. The finding that non-Whites have a longer duration of unemployment can be better understood in terms of non-Whites' lower geographic mobility and more limited job contacts. The statistical significance of a high unemployment rate in the home labor market reducing the probability of finding employment was more in keeping with expectations. The findings assumed that the duration of unemployment accurately reflected the length of job search. The sample was redrawn to exclude discouraged workers and the analysis was repeated. The findings were similar to those for the full sample, with the coefficient for the migration variable being negative and statistically significant and the coefficient for alpha remaining positive and statistically significant. Race in the selectivity model remained statistically insignificant. The findings supported the Schwartz model hypothesizing that the expansion of the radius of the search would reduce the duration of unemployment. The exclusion of the migration factor misspecified the equation for unemployment duration. Policy should be directed to the problems of geographic mobility, particularly among non-Whites.

  6. Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction

    PubMed Central

    Gallistel, C. R.; Balci, Fuat; Freestone, David; Kheifets, Aaron; King, Adam

    2014-01-01

    We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer. PMID:24637442

  7. Automated, quantitative cognitive/behavioral screening of mice: for genetics, pharmacology, animal cognition and undergraduate instruction.

    PubMed

    Gallistel, C R; Balci, Fuat; Freestone, David; Kheifets, Aaron; King, Adam

    2014-02-26

    We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.

  8. Is combined topical with intravenous tranexamic acid superior than topical, intravenous tranexamic acid alone and control groups for blood loss controlling after total knee arthroplasty: A meta-analysis.

    PubMed

    Lin, Chunmei; Qi, Yingmei; Jie, Li; Li, Hong-Biao; Zhao, Xi-Cheng; Qin, Lei; Jiang, Xin-Qiang; Zhang, Zhen-Hua; Ma, Liping

    2016-12-01

    The purpose of this systematic review and meta-analysis of randomized controlled trials (RCTs) was to evaluate the efficacy and safety of combined topical with intravenous tranexamic acid (TXA) versus topical TXA, intravenous TXA alone, or control for reducing blood loss after a total knee arthroplasty (TKA). In May 2016, a systematic computer-based search was conducted in the PubMed, Embase, Cochrane Library, Web of Science, and Chinese Wanfang databases. This systematic review and meta-analysis were performed according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement criteria. Only patients undergoing primary TKA in whom combined topical and intravenous TXA was compared with topical TXA, intravenous (IV) TXA, or a control group for reducing blood loss were included. Eligible studies were published RCTs comparing combined topical and intravenous TXA with topical or intravenous administration alone. The primary endpoints were the total blood loss and need for transfusion. Complications of deep venous thrombosis (DVT) were also compiled to assess the safety of combined topical TXA with intravenous TXA. Relative risks (RRs) with 95% CIs were estimated for dichotomous outcomes, and mean differences (MDs) with 95% CIs for continuous outcomes. The Cochrane risk of bias tool was used to appraise risk of bias. Stata 12.0 software was used for meta-analysis. Fifteen studies involving 1495 patients met the inclusion criteria. The pooled meta-analysis indicated that combined topical and intravenous TXA can reduce the total blood loss compared with placebo by a mean of 458.66 mL, and the difference is statistically significant (MD = -458.66, 95% CI: -655.40 to -261.91, P < 0.001). Compared with intravenous TXA, combined administration of TXA can decrease the total blood loss, and the difference is statistically significant (MD = -554.03, 95% CI: -1066.21 to -41.85, P = 0.034). Compared with topical administration of TXA, the pooled meta-analysis indicated that combined TXA can decrease the total blood loss by a mean of 107.65 mL, and the difference is statistically significant (MD = -107.65, 95% CI: -525.55 to -239.91, P = 0.001). The pooled results indicated that combined topical with intravenous TXA can decrease the need for transfusion (RR = 0.34, 95% CI: 0.23-0.50, P < 0.001). There is no significant difference between combined topical with intravenous TXA and topical or intravenous TXA alone (P > 0.05) in terms of need for transfusion and the occurrence of DVT. Compared with topical TXA, intravenous TXA alone, or the control group, combined topical with intravenous TXA can decrease the total blood loss and subsequent need for transfusion without increasing the occurrence of DVT. The dose and timing of TXA administration differed across studies, and more randomized controlled trials are warranted to clarify the optimal dose and timing of TXA administration.

  9. Investigations of interference between electromagnetic transponders and wireless MOSFET dosimeters: a phantom study.

    PubMed

    Su, Zhong; Zhang, Lisha; Ramakrishnan, V; Hagan, Michael; Anscher, Mitchell

    2011-05-01

    To evaluate both the Calypso Systems' (Calypso Medical Technologies, Inc., Seattle, WA) localization accuracy in the presence of wireless metal-oxide-semiconductor field-effect transistor (MOSFET) dosimeters of dose verification system (DVS, Sicel Technologies, Inc., Morrisville, NC) and the dosimeters' reading accuracy in the presence of wireless electromagnetic transponders inside a phantom. A custom-made, solid-water phantom was fabricated with space for transponders and dosimeters. Two inserts were machined with positioning grooves precisely matching the dimensions of the transponders and dosimeters and were arranged in orthogonal and parallel orientations, respectively. To test the transponder localization accuracy with/without presence of dosimeters (hypothesis 1), multivariate analyses were performed on transponder-derived localization data with and without dosimeters at each preset distance to detect statistically significant localization differences between the control and test sets. To test dosimeter dose-reading accuracy with/without presence of transponders (hypothesis 2), an approach of alternating the transponder presence in seven identical fraction dose (100 cGy) deliveries and measurements was implemented. Two-way analysis of variance was performed to examine statistically significant dose-reading differences between the two groups and the different fractions. A relative-dose analysis method was also used to evaluate transponder impact on dose-reading accuracy after dose-fading effect was removed by a second-order polynomial fit. Multivariate analysis indicated that hypothesis 1 was false; there was a statistically significant difference between the localization data from the control and test sets. However, the upper and lower bounds of the 95% confidence intervals of the localized positional differences between the control and test sets were less than 0.1 mm, which was significantly smaller than the minimum clinical localization resolution of 0.5 mm. For hypothesis 2, analysis of variance indicated that there was no statistically significant difference between the dosimeter readings with and without the presence of transponders. Both orthogonal and parallel configurations had difference of polynomial-fit dose to measured dose values within 1.75%. The phantom study indicated that the Calypso System's localization accuracy was not affected clinically due to the presence of DVS wireless MOSFET dosimeters and the dosimeter-measured doses were not affected by the presence of transponders. Thus, the same patients could be implanted with both transponders and dosimeters to benefit from improved accuracy of radiotherapy treatments offered by conjunctional use of the two systems.

  10. Study of Porphyromonas gingivalis in periodontal diseases: A systematic review and meta-analysis.

    PubMed

    Rafiei, Mohammad; Kiani, Faezeh; Sayehmiri, Fatemeh; Sayehmiri, Kourosh; Sheikhi, Abdolkarim; Zamanian Azodi, Mona

    2017-01-01

    Background: The mouth cavity hosts various types of anaerobic bacteria including Porphyromonas gingivalis, which causes periodontal inflammatory diseases. P. gingivalis is a gram-negative oral anaerobe and is considered a main etiological factor in periodontal diseases. Several studies have reported a relationship between P. gingivalis in individuals with periodontal diseases and a critical role of this bacterium in the pathogenesis of periodontal diseases. The present study aimed at estimating this probability using a meta-analysis. Methods: We searched several databases including PubMed, Scopus, Google Scholar, and Web of Science to identify case-control studies addressing the relationship between P. gingivalis and periodontal diseases. A total of 49 reports published from different countries from 1993 to 2014 were included in this study. I² (heterogeneity index) statistics were calculated to examine heterogeneity. Data were analyzed using STATA Version 11. Results: After a detailed analysis of the selected articles, 49 case-control studies with 5924 individuals fulfilled the inclusion criteria for the meta-analysis. The healthy controls included 2600 healthy individuals with a Mean±SD age of 36.56±7.45 years. The periodontal diseases group included 3356 patients with a mean age of 43.62±8.35 years. There was a statistically significant difference in P. gingivalis between periodontal patients and healthy controls: 9.24 (95% CI: 5.78 to 14.77; P = 0.000). In other words, there was a significant relationship between the presence of P. gingivalis and periodontal diseases. Conclusion: Analyzing the results of the present study, we found a strong association between the presence of P. gingivalis and periodontal diseases. This result suggests that further research is needed to assess this subject.
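
    As a brief, hedged illustration of the heterogeneity assessment mentioned above, the sketch below computes Cochran's Q and the I² index from per-study odds ratios; the odds ratios and standard errors are invented placeholders, not the study data.

        # Hypothetical sketch: pooled log odds ratio with Q and I^2 heterogeneity.
        import numpy as np

        log_or = np.log(np.array([8.0, 11.5, 6.2, 14.0, 9.1]))   # placeholder study ORs
        se = np.array([0.35, 0.40, 0.50, 0.45, 0.30])            # SE of each log OR

        w = 1.0 / se**2
        pooled = np.sum(w * log_or) / np.sum(w)
        q = np.sum(w * (log_or - pooled) ** 2)
        dof = len(log_or) - 1
        i_squared = max(0.0, (q - dof) / q) * 100.0   # % of variation beyond chance
        print(f"pooled OR = {np.exp(pooled):.2f}, Q = {q:.2f}, I^2 = {i_squared:.1f}%")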

  11. Physical therapy treatments for low back pain in children and adolescents: a meta-analysis

    PubMed Central

    2013-01-01

    Background Low back pain (LBP) in adolescents is associated with LBP in later years. In recent years treatments have been administered to adolescents for LBP, but it is not known which physical therapy treatment is the most efficacious. By means of a meta-analysis, the current study investigated the effectiveness of the physical therapy treatments for LBP in children and adolescents. Methods Studies in English, Spanish, French, Italian and Portuguese, and carried out by March 2011, were selected by electronic and manual search. Two independent researchers coded the moderator variables of the studies, and performed the effect size calculations. The mean effect size index used was the standardized mean change between the pretest and posttest, and it was applied separately for each combination of outcome measures, (pain, disability, flexibility, endurance and mental health) and measurement type (self-reports, and clinician assessments). Results Eight articles that met the selection criteria enabled us to define 11 treatment groups and 5 control groups using the group as the unit of analysis. The 16 groups involved a total sample of 334 subjects at the posttest (221 in the treatment groups and 113 in the control groups). For all outcome measures, the average effect size of the treatment groups was statistically and clinically significant, whereas the control groups had negative average effect sizes that were not statistically significant. Conclusions Of all the physical therapy treatments for LBP in children and adolescents, the combination of therapeutic physical conditioning and manual therapy is the most effective. The low number of studies and control groups, and the methodological limitations in this meta-analysis prevent us from drawing definitive conclusions in relation to the efficacy of physical therapy treatments in LBP. PMID:23374375

  12. Implantable cardioverter defibrillators for primary prevention in patients with nonischemic cardiomyopathy: A systematic review and meta-analysis.

    PubMed

    Akel, Tamer; Lafferty, James

    2017-06-01

    Implantable cardioverter defibrillators (ICDs) have proved their favorable outcomes on survival in selected patients with cardiomyopathy. Although previous meta-analyses have shown benefit for their use in primary prevention, the evidence remains less robust for patients with nonischemic cardiomyopathy (NICM) in comparison to patients with coronary artery disease (CAD). To evaluate the effect of ICD therapy on reducing all-cause mortality and sudden cardiac death (SCD) in patients with NICM. PubMed (1993-2016), the Cochrane Central Register of Controlled Trials (2000-2016), reference lists of relevant articles, and previous meta-analyses. Search terms included defibrillator, heart failure, cardiomyopathy, randomized controlled trials, and clinical trials. Eligible trials were randomized controlled trials with at least an arm of ICD, an arm of medical therapy and enrolled some patients with NICM. The primary endpoint in the trials should include all-cause mortality or mortality from SCD. Hazard ratios (HRs) for all-cause mortality and mortality from SCD were either extracted or calculated along with their standard errors. Of the 1047 abstracts retained by the initial screen, eight randomized controlled trials were identified. Five of these trials reported relevant data regarding patients with NICM and were subsequently included in this meta-analysis. Pooled analysis of HRs suggested a statistically significant reduction in all-cause mortality among a total of 2573 patients randomized to ICD vs medical therapy (HR 0.80; 95% CI, 0.67-0.96; P=.02). Pooled analysis of HRs for mortality from SCD was also statistically significant (n=1677) (HR 0.51; 95% CI, 0.34-0.76; P=.001). ICD implantation is beneficial in terms of all-cause mortality and mortality from SCD in certain subgroups of patients with NICM. © 2017 John Wiley & Sons Ltd.

  13. Physical therapy treatments for low back pain in children and adolescents: a meta-analysis.

    PubMed

    Calvo-Muñoz, Inmaculada; Gómez-Conesa, Antonia; Sánchez-Meca, Julio

    2013-02-02

    Low back pain (LBP) in adolescents is associated with LBP in later years. In recent years treatments have been administered to adolescents for LBP, but it is not known which physical therapy treatment is the most efficacious. By means of a meta-analysis, the current study investigated the effectiveness of the physical therapy treatments for LBP in children and adolescents. Studies in English, Spanish, French, Italian and Portuguese, and carried out by March 2011, were selected by electronic and manual search. Two independent researchers coded the moderator variables of the studies, and performed the effect size calculations. The mean effect size index used was the standardized mean change between the pretest and posttest, and it was applied separately for each combination of outcome measures, (pain, disability, flexibility, endurance and mental health) and measurement type (self-reports, and clinician assessments). Eight articles that met the selection criteria enabled us to define 11 treatment groups and 5 control groups using the group as the unit of analysis. The 16 groups involved a total sample of 334 subjects at the posttest (221 in the treatment groups and 113 in the control groups). For all outcome measures, the average effect size of the treatment groups was statistically and clinically significant, whereas the control groups had negative average effect sizes that were not statistically significant. Of all the physical therapy treatments for LBP in children and adolescents, the combination of therapeutic physical conditioning and manual therapy is the most effective. The low number of studies and control groups, and the methodological limitations in this meta-analysis prevent us from drawing definitive conclusions in relation to the efficacy of physical therapy treatments in LBP.

  14. Effect of psycho-educational interventions on quality of life in patients with implantable cardioverter defibrillators: a meta-analysis of randomized controlled trials.

    PubMed

    Kao, Chi-Wen; Chen, Miao-Yi; Chen, Ting-Yu; Lin, Pai-Hui

    2016-09-30

    Implantable cardioverter defibrillators (ICD) were developed for primary and secondary prevention of sudden cardiac death. However, ICD recipients' mortality is significantly predicted by their quality of life (QOL). The aim of this meta-analysis was to evaluate the effects of psycho-educational interventions on QOL in patients with ICDs. We systematically searched PubMed, Medline, Cochrane Library, and CINAHL through April 2015 and references of relevant articles. Studies were reviewed if they met the following criteria: (1) randomized controlled trial, (2) participants were adults with an ICD, and (3) data were sufficient to evaluate the effect of psychological or educational interventions on QOL measured by the SF-36 or SF-12. Studies were independently selected and their data were extracted by two reviewers. Study quality was evaluated using a modified Jadad scale. The meta-analysis was conducted using the Cochrane Collaboration's Review Manager Software Package (RevMan 5). Study heterogeneity was assessed by the Q statistic and the I² statistic. Depending on heterogeneity, data were pooled across trials using fixed-effect or random-effect modeling. Seven randomized controlled trials fulfilled the inclusion and exclusion criteria, and included 1017 participants. The psycho-educational interventions improved physical component summary (PCS) scores in the intervention groups more than in control groups (mean difference 2.08, 95% CI 0.86 to 3.29, p < 0.001), but did not significantly affect mental component summary (MCS) scores (mean difference 0.84, 95% CI -1.68 to 3.35, p = 0.52). Our meta-analysis demonstrates that psycho-educational interventions improved the physical component, but not the mental component of QOL in patients with ICDs.

  15. pcr: an R package for quality assessment, analysis and testing of qPCR data

    PubMed Central

    Ahmed, Mahmoud

    2018-01-01

    Background Real-time quantitative PCR (qPCR) is a broadly used technique in biomedical research. Currently, a few different analysis models are used to determine the quality of data and to quantify the mRNA level across experimental conditions. Methods We developed an R package to implement methods for quality assessment, analysis and testing qPCR data for statistical significance. Double Delta CT and standard curve models were implemented to quantify the relative expression of target genes from CT values in standard qPCR control-group experiments. In addition, calculation of amplification efficiency and curves from serial dilution qPCR experiments is used to assess the quality of the data. Finally, two-group testing and linear models were used to test for significance of the difference in expression between control groups and conditions of interest. Results Using two datasets from qPCR experiments, we applied different quality assessment, analysis and statistical testing in the pcr package and compared the results to the original published articles. The final relative expression values from the different models, as well as the intermediary outputs, were checked against the expected results in the original papers and were found to be accurate and reliable. Conclusion The pcr package provides an intuitive and unified interface for its main functions to allow biologists to perform all necessary steps of qPCR analysis and produce graphs in a uniform way. PMID:29576953
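
    A minimal sketch of the double delta CT model the package implements is given below, written in Python rather than R; the gene names and CT values are hypothetical placeholders.

        # Hypothetical sketch: relative expression via the double delta CT method.
        # Assumes roughly 100% amplification efficiency, i.e. a doubling per cycle.
        ct = {"control": {"target": 30.1, "ref": 17.2},
              "treated": {"target": 27.6, "ref": 17.0}}

        d_ct_control = ct["control"]["target"] - ct["control"]["ref"]   # delta CT, control group
        d_ct_treated = ct["treated"]["target"] - ct["treated"]["ref"]   # delta CT, treated group
        dd_ct = d_ct_treated - d_ct_control                             # delta delta CT

        relative_expression = 2.0 ** (-dd_ct)
        print(f"relative expression of target vs control = {relative_expression:.2f}")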

  16. Cognitive Stimulation of Elderly Residents in Social Protection Centers in Cartagena, 2014.

    PubMed

    Melguizo Herrera, Estela; Bertel De La Hoz, Anyel; Paternina Osorio, Diego; Felfle Fuentes, Yurani; Porto Osorio, Leidy

    To determine the effectiveness of a program of cognitive stimulation of elderly residents in Social Protection Centers in Cartagena, 2014. Quasi-experimental study with pre and post tests in control and experimental groups. A sample of 37 elderly residents in Social Protection Centers participated: 23 in the experimental group and 14 in the control group. A survey and a mental evaluation test (Pfeiffer) were applied. The experimental group participated in 10 sessions of cognitive stimulation. The paired t-test showed statistically significant differences in the Pfeiffer test, pre and post intervention, in the experimental group (P=.0005). The unpaired t-test showed statistically significant differences in Pfeiffer test results between the experimental and control groups (P=.0450). Principal component analysis showed that the most interrelated variables were age, diseases, number of errors and test results, which were grouped around the disease variable, with a negative association. The intervention demonstrated a statistically significant improvement in cognitive functionality of the elderly. Nursing can lead this type of intervention. It should be studied further to strengthen and clarify these results. Copyright © 2016 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.

  17. Postural-Sway Response in Learning-Disabled Children: Pilot Data.

    ERIC Educational Resources Information Center

    Polatajko, H. J.

    1987-01-01

    The postural-sway response of five learning disabled (LD) and five nondisabled children was evaluated using a force platform. From statistical analysis of the two groups, the LD children appeared to use visual input to compensate for postural problems and had significant difficulty controlling posture with eyes closed. (SK)

  18. Mental-behavioral health data: 2001 NHIS.

    PubMed

    Lied, Terry R

    2004-01-01

    These data highlights are based on analysis of the 2001 National Health Interview Survey (NHIS) public use data (http://www.cdc.gov/nchs/nhis.htm). NHIS is a multi-purpose survey conducted by the National Center for Health Statistics, Centers for Disease Control and Prevention. NHIS has been conducted continuously since 1957.

  19. Statistical analysis of hail characteristics in the hail-protected western part of Croatia using data from hail suppression stations

    NASA Astrophysics Data System (ADS)

    Počakal, Damir; Štalec, Janez

    In the continental part of Croatia, operational hail suppression has been conducted for more than 30 years. The current protected area is 25,177 km² and has about 492 hail suppression stations, which are managed with eight weather radar centres. This paper presents a statistical analysis of parameters connected with hail occurrence on hail suppression stations in the western part of the protected area in the 1981-2000 period. This analysis compares data from two periods with different intensities of hail suppression activity and is made as part of a project for assessment of hail suppression efficiency in Croatia. Because of disruption of the hail suppression system during the independence war in Croatia (1991-1995), a lack of rockets and other objective circumstances, it is considered that in the 1991-2000 period the hail suppression system could not act properly. Because of that, a comparison of hail suppression data for two periods was made. The first period (1981-1990), which is characterised by full application of hail suppression technology, is compared with the second period (1991-2000). The protected area is divided into quadrants (9×9 km), such that every quadrant has at least one hail suppression station and intercomparison is more precise. Discriminant analysis was performed for the yearly values of each quadrant. These values included number of cases with solid precipitation, hail damage, heavy hail damage, number of active hail suppression stations, number of days with solid precipitation, solid precipitation damage, heavy solid precipitation damage and the number and duration of air traffic control bans. The discriminant analysis shows that there is a significant difference between the two periods. The average values of the observed periods on the isolated discriminant function 1 are -0.36 standard deviations of all observations for the first period (1981-1990) and +0.23 for the second period. The analysis for all eight variables shows statistically substantial differences in the number of hail suppression stations (which have a positive correlation) and in the number of cases with air traffic control ban, which have, like all other variables, a negative correlation. Results of the statistical analysis for the two periods show a positive influence of the hail suppression system. The discriminant analysis made for three periods shows that these three periods cannot be compared because of the short time period, the difference in hail suppression technology, working conditions and possible differences in meteorological conditions. Therefore, neither the effectiveness nor ineffectiveness of hail suppression operations nor their efficiency can be statistically proven. For an exact assessment of hail suppression effectiveness, it is necessary to develop a project, such as a hailpad polygon, which would take into consideration all the parameters used in such previous projects around the world.
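
    As a rough sketch of the discriminant analysis step (not the original computation), the example below fits a two-group linear discriminant analysis to yearly quadrant data. It assumes scikit-learn; the feature values and period labels are invented placeholders.

        # Hypothetical sketch: discriminating two periods from yearly quadrant variables.
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        # Rows = quadrant-years; columns could be hail cases, damage cases, active
        # stations (placeholder values only).
        X = np.array([[4, 2, 6], [5, 3, 7], [6, 2, 8], [2, 4, 3], [1, 5, 2], [2, 3, 3]], float)
        period = np.array([0, 0, 0, 1, 1, 1])     # 0 = 1981-1990, 1 = 1991-2000

        lda = LinearDiscriminantAnalysis(n_components=1).fit(X, period)
        scores = lda.transform(X).ravel()          # positions on discriminant function 1
        print("mean score, period 1:", scores[period == 0].mean())
        print("mean score, period 2:", scores[period == 1].mean())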

  20. Effect of a stress management program on subjects with neck pain: A pilot randomized controlled trial.

    PubMed

    Metikaridis, T Damianos; Hadjipavlou, Alexander; Artemiadis, Artemios; Chrousos, George; Darviri, Christina

    2016-05-20

    Studies have shown that stress is implicated in the cause of neck pain (NP). The purpose of this study is to examine the effect of a simple, zero-cost stress management program on patients suffering from NP. This study is a parallel-type randomized clinical study. People suffering from chronic non-specific NP were chosen randomly to participate in an eight-week stress management program (N= 28) (including diaphragmatic breathing and progressive muscle relaxation) or in a no intervention control condition (N= 25). Self-report measures were used for the evaluation of various variables at the beginning and at the end of the eight-week monitoring period. Descriptive and inferential statistical methods were used for the analysis. At the end of the monitoring period, the intervention group showed a statistically significant reduction of stress and anxiety (p= 0.03, p= 0.01), reports of stress-related symptoms (p= 0.003), percentage of disability due to NP (p= 0.000) and NP intensity (p= 0.002). At the same time, daily routine satisfaction levels were elevated (p= 0.019). No statistically significant difference was observed in cortisol measurements. Stress management has positive effects on NP patients.

  1. Comparing meta-analysis and ecological-longitudinal analysis in time-series studies. A case study of the effects of air pollution on mortality in three Spanish cities.

    PubMed

    Saez, M; Figueiras, A; Ballester, F; Pérez-Hoyos, S; Ocaña, R; Tobías, A

    2001-06-01

    The objective of this paper is to introduce a different approach, called the ecological-longitudinal, to carrying out pooled analysis in time series ecological studies. Because it gives a larger number of data points and, hence, increases the statistical power of the analysis, this approach allows the accommodation of aspects such as random effect models, lags, and interactions between pollutants and between pollutants and meteorological variables, which are hardly implemented in conventional approaches. The approach is illustrated by providing quantitative estimates of the short-term effects of air pollution on mortality in three Spanish cities, Barcelona, Valencia and Vigo, for the period 1992-1994. Because the dependent variable was a count, a Poisson generalised linear model was first specified. Several modelling issues are worth mentioning. Firstly, because the relations between mortality and explanatory variables were non-linear, cubic splines were used for covariate control, leading to a generalised additive model, GAM. Secondly, the effects of the predictors on the response were allowed to occur with some lag. Thirdly, the residual autocorrelation, because of imperfect control, was controlled for by means of an autoregressive Poisson GAM. Finally, the longitudinal design demanded consideration of individual heterogeneity, requiring the use of mixed models. The estimates of the relative risks obtained from the individual analyses varied across cities, particularly those associated with sulphur dioxide. The highest relative risks corresponded to black smoke in Valencia. These estimates were higher than those obtained from the ecological-longitudinal analysis. Relative risks estimated from this latter analysis were practically identical across cities, 1.00638 (95% confidence interval 1.0002, 1.0011) for a black smoke increase of 10 µg/m³ and 1.00415 (95% CI 1.0001, 1.0007) for an increase of 10 µg/m³ of sulphur dioxide. Because the statistical power is higher than in the individual analysis, more interactions were statistically significant, especially those among air pollutants and meteorological variables. Air pollutant levels were related to mortality in the three cities of the study, Barcelona, Valencia and Vigo. These results were consistent with similar studies in other cities and with other multicentric studies, and coherent with both the previous individual analyses for each city and multicentric studies of all three cities.
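
    The core of the modelling strategy (a Poisson regression with smooth covariate control and lagged pollutant terms) can be sketched as below. This simplified version assumes pandas, statsmodels and patsy, uses hypothetical column names, and omits the autoregressive and mixed-model extensions described above.

        # Hypothetical sketch: Poisson model of daily deaths with a lagged pollutant
        # and a spline for temperature (a simplified stand-in for the full GAM).
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        daily = pd.read_csv("city_daily.csv")              # assumed: date, deaths, black_smoke, temp
        daily["bs_lag1"] = daily["black_smoke"].shift(1)   # pollutant lagged one day
        daily["dow"] = pd.to_datetime(daily["date"]).dt.dayofweek
        daily = daily.dropna()

        model = smf.glm("deaths ~ bs_lag1 + bs(temp, df=4) + C(dow)",
                        data=daily, family=sm.families.Poisson()).fit()

        rr_per_10 = np.exp(10 * model.params["bs_lag1"])   # relative risk per 10 ug/m3
        print(model.summary())
        print("RR per 10 ug/m3 of black smoke:", round(rr_per_10, 5))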

  2. Rigorous Statistical Bounds in Uncertainty Quantification for One-Layer Turbulent Geophysical Flows

    NASA Astrophysics Data System (ADS)

    Qi, Di; Majda, Andrew J.

    2018-04-01

    Statistical bounds controlling the total fluctuations in mean and variance about a basic steady-state solution are developed for the truncated barotropic flow over topography. Statistical ensemble prediction is an important topic in weather and climate research. Here, the evolution of an ensemble of trajectories is considered using statistical instability analysis and is compared and contrasted with the classical deterministic instability for the growth of perturbations in one pointwise trajectory. The maximum growth of the total statistics in fluctuations is derived relying on the statistical conservation principle of the pseudo-energy. The saturation bound of the statistical mean fluctuation and variance in the unstable regimes with non-positive-definite pseudo-energy is achieved by linking with a class of stable reference states and minimizing the stable statistical energy. Two cases with dependence on initial statistical uncertainty and on external forcing and dissipation are compared and unified under a consistent statistical stability framework. The flow structures and statistical stability bounds are illustrated and verified by numerical simulations among a wide range of dynamical regimes, where subtle transient statistical instability exists in general with positive short-time exponential growth in the covariance even when the pseudo-energy is positive-definite. Among the various scenarios in this paper, there exist strong forward and backward energy exchanges between different scales which are estimated by the rigorous statistical bounds.

  3. COgnitive behavioural therapy versus standardised medical care for adults with Dissociative non-Epileptic Seizures (CODES): statistical and economic analysis plan for a randomised controlled trial.

    PubMed

    Robinson, Emily J; Goldstein, Laura H; McCrone, Paul; Perdue, Iain; Chalder, Trudie; Mellers, John D C; Richardson, Mark P; Murray, Joanna; Reuber, Markus; Medford, Nick; Stone, Jon; Carson, Alan; Landau, Sabine

    2017-06-06

    Dissociative seizures (DSs), also called psychogenic non-epileptic seizures, are a distressing and disabling problem for many patients in neurological settings with high and often unnecessary economic costs. The COgnitive behavioural therapy versus standardised medical care for adults with Dissociative non-Epileptic Seizures (CODES) trial is an evaluation of a specifically tailored psychological intervention with the aims of reducing seizure frequency and severity and improving psychological well-being in adults with DS. The aim of this paper is to report in detail the quantitative and economic analysis plan for the CODES trial, as agreed by the trial steering committee. The CODES trial is a multicentre, pragmatic, parallel group, randomised controlled trial performed to evaluate the clinical effectiveness and cost-effectiveness of 13 sessions of cognitive behavioural therapy (CBT) plus standardised medical care (SMC) compared with SMC alone for adult outpatients with DS. The objectives and design of the trial are summarised, and the aims and procedures of the planned analyses are illustrated. The proposed analysis plan addresses statistical considerations such as maintaining blinding, monitoring adherence with the protocol, describing aspects of treatment and dealing with missing data. The formal analysis approach for the primary and secondary outcomes is described, as are the descriptive statistics that will be reported. This paper provides transparency to the planned inferential analyses for the CODES trial prior to the extraction of outcome data. It also provides an update to the previously published trial protocol and guidance to those conducting similar trials. ISRCTN registry ISRCTN05681227 (registered on 5 March 2014); ClinicalTrials.gov NCT02325544 (registered on 15 December 2014).

  4. Thr105Ile (rs11558538) polymorphism in the histamine N-methyltransferase (HNMT) gene and risk for Parkinson disease

    PubMed Central

    Jiménez-Jiménez, Félix Javier; Alonso-Navarro, Hortensia; García-Martín, Elena; Agúndez, José A.G.

    2016-01-01

    Background/aims: Several neuropathological, biochemical, and pharmacological data suggested a possible role of histamine in the etiopathogenesis of Parkinson disease (PD). The single nucleotide polymorphism (SNP) rs11558538 in the histamine N-methyltransferase (HNMT) gene has been associated with the risk of developing PD by several studies but not by some others. We carried out a systematic review that included all the studies published on PD risk related to the rs11558538 SNP, and we conducted a meta-analysis following Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. Methods: We used several databases to perform the systematic review, the software Meta-DiSc 1.1.1 to perform the meta-analysis of the eligible studies, and the Q-statistic to test heterogeneity between studies. Results: The meta-analysis included 4 eligible case–control association studies for the HNMT rs11558538 SNP and the risk for PD (2108 patients, 2158 controls). Positivity for the minor allele showed a statistically significant association with a decreased risk for PD, both in the total series and in Caucasians. Although homozygosity for the minor allele did not reach statistical significance, the test for trend indicates the occurrence of a gene–dose effect. Global diagnostic odds ratios (95% confidence intervals) for rs11558538T were 0.61 (0.46–0.81) for the total group, and 0.63 (0.45–0.88) for Caucasian patients. Conclusion: The present meta-analysis confirms published evidence suggesting that the HNMT rs11558538 minor allele is related to a reduced risk of developing PD. PMID:27399132

  5. CTLA-4 gene polymorphisms and their influence on predisposition to autoimmune thyroid diseases (Graves’ disease and Hashimoto's thyroiditis)

    PubMed Central

    Pastuszak-Lewandoska, Dorota; Sewerynek, Ewa; Domańska, Daria; Gładyś, Aleksandra; Skrzypczak, Renata

    2012-01-01

    Introduction Autoimmune thyroid disease (AITD) is associated with both genetic and environmental factors which lead to the overactivity of immune system. Cytotoxic T-Lymphocyte Antigen 4 (CTLA-4) gene polymorphisms belong to the main genetic factors determining the susceptibility to AITD (Hashimoto's thyroiditis, HT and Graves' disease, GD) development. The aim of the study was to evaluate the relationship between CTLA-4 polymorphisms (A49G, 1822 C/T and CT60 A/G) and HT and/or GD in Polish patients. Material and methods Molecular analysis involved AITD group, consisting of HT (n=28) and GD (n=14) patients, and a control group of healthy persons (n=20). Genomic DNA was isolated from peripheral blood and CTLA-4 polymorphisms were assessed by polymerase chain reaction-restriction fragment length polymorphism method, using three restriction enzymes: Fnu4HI (A49G), BsmAI (1822 C/T) and BsaAI (CT60 A/G). Results Statistical analysis (χ2 test) confirmed significant differences between the studied groups concerning CTLA-4 A49G genotypes. CTLA-4 A/G genotype was significantly more frequent in AITD group and OR analysis suggested that it might increase the susceptibility to HT. In GD patients, OR analysis revealed statistically significant relationship with the presence of G allele. In controls, CTLA-4 A/A genotype frequency was significantly increased suggesting a protective effect. There were no statistically significant differences regarding frequencies of other genotypes and polymorphic alleles of the CTLA-4 gene (1822 C/T and CT60 A/G) between the studied groups. Conclusions CTLA-4 A49G polymorphism seems to be an important genetic determinant of the risk of HT and GD in Polish patients. PMID:22851994
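
    A minimal sketch of the kind of genotype-frequency comparison described above (a chi-square test plus a carrier odds ratio) is given below; it assumes scipy, and the genotype counts are illustrative placeholders rather than the study data.

        # Hypothetical sketch: chi-square test of A49G genotype counts, patients vs controls.
        import numpy as np
        from scipy.stats import chi2_contingency

        #                   A/A  A/G  G/G
        counts = np.array([[10,  24,   8],    # AITD patients (placeholder counts)
                           [12,   6,   2]])   # healthy controls (placeholder counts)

        chi2_stat, p, dof, expected = chi2_contingency(counts)
        print(f"chi-square = {chi2_stat:.2f}, dof = {dof}, p = {p:.3f}")

        # Odds ratio for carrying the G allele (A/G or G/G) in patients vs controls.
        carriers = np.array([[24 + 8, 10], [6 + 2, 12]])
        odds_ratio = (carriers[0, 0] * carriers[1, 1]) / (carriers[0, 1] * carriers[1, 0])
        print(f"OR for G carriage = {odds_ratio:.2f}")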

  6. CTLA-4 gene polymorphisms and their influence on predisposition to autoimmune thyroid diseases (Graves' disease and Hashimoto's thyroiditis).

    PubMed

    Pastuszak-Lewandoska, Dorota; Sewerynek, Ewa; Domańska, Daria; Gładyś, Aleksandra; Skrzypczak, Renata; Brzeziańska, Ewa

    2012-07-04

    Autoimmune thyroid disease (AITD) is associated with both genetic and environmental factors which lead to the overactivity of immune system. Cytotoxic T-Lymphocyte Antigen 4 (CTLA-4) gene polymorphisms belong to the main genetic factors determining the susceptibility to AITD (Hashimoto's thyroiditis, HT and Graves' disease, GD) development. The aim of the study was to evaluate the relationship between CTLA-4 polymorphisms (A49G, 1822 C/T and CT60 A/G) and HT and/or GD in Polish patients. Molecular analysis involved AITD group, consisting of HT (n=28) and GD (n=14) patients, and a control group of healthy persons (n=20). Genomic DNA was isolated from peripheral blood and CTLA-4 polymorphisms were assessed by polymerase chain reaction-restriction fragment length polymorphism method, using three restriction enzymes: Fnu4HI (A49G), BsmAI (1822 C/T) and BsaAI (CT60 A/G). Statistical analysis (χ(2) test) confirmed significant differences between the studied groups concerning CTLA-4 A49G genotypes. CTLA-4 A/G genotype was significantly more frequent in AITD group and OR analysis suggested that it might increase the susceptibility to HT. In GD patients, OR analysis revealed statistically significant relationship with the presence of G allele. In controls, CTLA-4 A/A genotype frequency was significantly increased suggesting a protective effect. There were no statistically significant differences regarding frequencies of other genotypes and polymorphic alleles of the CTLA-4 gene (1822 C/T and CT60 A/G) between the studied groups. CTLA-4 A49G polymorphism seems to be an important genetic determinant of the risk of HT and GD in Polish patients.

  7. Applying quantitative bias analysis to estimate the plausible effects of selection bias in a cluster randomised controlled trial: secondary analysis of the Primary care Osteoarthritis Screening Trial (POST).

    PubMed

    Barnett, L A; Lewis, M; Mallen, C D; Peat, G

    2017-12-04

    Selection bias is a concern when designing cluster randomised controlled trials (c-RCT). Despite addressing potential issues at the design stage, bias cannot always be eradicated from a trial design. The application of bias analysis presents an important step forward in evaluating whether trial findings are credible. The aim of this paper is to give an example of the technique to quantify potential selection bias in c-RCTs. This analysis uses data from the Primary care Osteoarthritis Screening Trial (POST). The primary aim of this trial was to test whether screening for anxiety and depression, and providing appropriate care for patients consulting their GP with osteoarthritis, would improve clinical outcomes. Quantitative bias analysis is a seldom-used technique that can quantify types of bias present in studies. Due to lack of information on the selection probability, probabilistic bias analysis with a range of triangular distributions was also used, applied at all three follow-up time points: 3, 6, and 12 months post consultation. A simple bias analysis was also applied to the study. Worse pain outcomes were observed among intervention participants than control participants (crude odds ratio at 3, 6, and 12 months: 1.30 (95% CI 1.01, 1.67), 1.39 (1.07, 1.80), and 1.17 (95% CI 0.90, 1.53), respectively). Probabilistic bias analysis suggested that the observed effect became statistically non-significant if the selection probability ratio was between 1.2 and 1.4. Selection probability ratios of > 1.8 were needed to mask a statistically significant benefit of the intervention. The use of probabilistic bias analysis in this c-RCT suggested that worse outcomes observed in the intervention arm could plausibly be attributed to selection bias. A very large degree of selection bias would have been needed to mask a beneficial effect of the intervention, making this interpretation less plausible.
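
    To show roughly how such a probabilistic bias analysis works, the sketch below divides an observed odds ratio by selection-probability ratios drawn from a triangular distribution and adds sampling error. The observed OR and CI are the 6-month estimate quoted above; the triangular limits and the simple bias model are assumptions.

        # Hypothetical sketch: probabilistic (triangular) bias analysis of an odds ratio.
        import numpy as np

        rng = np.random.default_rng(1)
        n_sim = 10_000

        log_or_obs = np.log(1.39)                             # observed OR at 6 months
        se = (np.log(1.80) - np.log(1.07)) / (2 * 1.96)       # SE back-calculated from the 95% CI

        # Selection-probability ratio drawn from a triangular prior (assumed limits).
        spr = rng.triangular(left=1.0, mode=1.2, right=1.4, size=n_sim)

        # Bias-adjusted OR: divide out the selection effect, then add random error.
        log_or_adj = log_or_obs - np.log(spr) + rng.normal(0.0, se, n_sim)
        or_adj = np.exp(log_or_adj)
        print("median adjusted OR:", round(float(np.median(or_adj)), 2))
        print("95% simulation interval:", np.round(np.percentile(or_adj, [2.5, 97.5]), 2))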

  8. Fuzzy Adaptive Control for Intelligent Autonomous Space Exploration Problems

    NASA Technical Reports Server (NTRS)

    Esogbue, Augustine O.

    1998-01-01

    The principal objective of the research reported here is the re-design, analysis and optimization of our newly developed neural network fuzzy adaptive controller model for complex processes, which is capable of learning fuzzy control rules from process data and improving its control through on-line adaptation. The learned improvement is measured against a performance objective function that provides evaluative feedback; this performance objective is broadly defined to meet long-range goals over time. Although fuzzy control had proven effective for complex, nonlinear, imprecisely defined processes for which standard models and controls are inefficient, impractical, or impossible to derive, the state of the art prior to our work was that procedures for deriving fuzzy control rules were mostly ad hoc heuristics. The learning ability of neural networks was exploited to derive fuzzy control systematically, to permit on-line adaptation, and, in the process, to optimize control. The operation of neural networks integrates very naturally with fuzzy logic. The neural networks, which were designed and tested using simulation software and simulated data and subsequently realistic industrial data, were reconfigured for application on several platforms and for the employment of improved algorithms. The statistical properties of the learning process were investigated and evaluated with standard statistical procedures (such as ANOVA and graphical analysis of residuals). The computational advantages of dynamic-programming-like methods of optimal control were used to permit on-line fuzzy adaptive control. Tests for the consistency, completeness and interaction of the control rules were applied. Comparisons to other methods and controllers were made to identify the major advantages of the resulting controller model. Several specific modifications and extensions were made to the original controller, and additional modifications and explorations have been proposed for further study. Some of these are in progress in our laboratory while others await additional support. All of these enhancements will improve the attractiveness of the controller as an effective tool for the on-line control of an array of complex process environments.
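
    As a loose illustration of the general idea of adapting fuzzy control rules on-line from an error signal, the sketch below implements a generic Sugeno-style controller whose rule consequents are updated as the process runs. It is not the neural-network fuzzy architecture developed in this work; the membership functions, learning rate, and toy plant are all assumptions.

```python
# Generic sketch: a fuzzy controller with on-line adaptation of rule consequents.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0)

# Rule antecedents over the error signal: negative, zero, positive.
centers = [(-2.0, -1.0, 0.0), (-1.0, 0.0, 1.0), (0.0, 1.0, 2.0)]
consequents = np.array([-0.5, 0.0, 0.5])   # initial crisp control actions per rule
learning_rate = 0.05

def control(error):
    """Weighted-average (Sugeno-style) defuzzification of the fired rules."""
    firing = np.array([tri(error, *c) for c in centers])
    if firing.sum() == 0:
        return 0.0, firing
    return float(firing @ consequents / firing.sum()), firing

def adapt(error, firing):
    """On-line update: push the consequents of fired rules toward reducing the error."""
    if firing.sum() > 0:
        consequents[:] -= learning_rate * error * firing / firing.sum()

# Toy closed loop on a first-order plant x' = u (illustrative only).
x, setpoint = 1.5, 0.0
for _ in range(50):
    e = x - setpoint
    u, firing = control(e)
    adapt(e, firing)
    x += 0.1 * u
print("final state:", round(x, 3))
```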

  9. Statistical Inference at Work: Statistical Process Control as an Example

    ERIC Educational Resources Information Center

    Bakker, Arthur; Kent, Phillip; Derry, Jan; Noss, Richard; Hoyles, Celia

    2008-01-01

    To characterise statistical inference in the workplace this paper compares a prototypical type of statistical inference at work, statistical process control (SPC), with a type of statistical inference that is better known in educational settings, hypothesis testing. Although there are some similarities between the reasoning structure involved in…

  10. Control of maglev vehicles with aerodynamic and guideway disturbances

    NASA Technical Reports Server (NTRS)

    Flueckiger, Karl; Mark, Steve; Caswell, Ruth; Mccallum, Duncan

    1994-01-01

    A modeling, analysis, and control design methodology is presented for maglev vehicle ride quality performance improvement as measured by the Pepler Index. Ride quality enhancement is considered through active control of secondary suspension elements and active aerodynamic surfaces mounted on the train. To analyze and quantify the benefits of active control, the authors have developed a five degree-of-freedom lumped parameter model suitable for describing a large class of maglev vehicles, including both channel and box-beam guideway configurations. Elements of this modeling capability have been recently employed in studies sponsored by the U.S. Department of Transportation (DOT). A perturbation analysis about an operating point, defined by vehicle and average crosswind velocities, yields a suitable linearized state space model for multivariable control system analysis and synthesis. Neglecting passenger compartment noise, the ride quality as quantified by the Pepler Index is readily computed from the system states. A statistical analysis is performed by modeling the crosswind disturbances and guideway variations as filtered white noise, whereby the Pepler Index is established in closed form through the solution to a matrix Lyapunov equation. Data is presented which indicates the anticipated ride quality achieved through various closed-loop control arrangements.
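
    The closed-form computation described above can be sketched as follows: for a stable linear model driven by filtered white noise, the stationary state covariance solves a matrix Lyapunov equation, and a quadratic ride-quality measure is then a trace of a weighted covariance. The matrices below are placeholders, not the five-degree-of-freedom maglev model, and the Pepler Index itself involves additional weighting of the motion quantities.

```python
# Hedged sketch: stationary covariance of a linear system from a Lyapunov equation,
# then a quadratic ride-quality index E[x' Q x] = trace(Q P).
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Illustrative stable 2-state system (e.g., one suspension mode).
A = np.array([[0.0, 1.0],
              [-40.0, -4.0]])
B = np.array([[0.0],
              [1.0]])
W = np.array([[0.5]])           # white-noise intensity of the disturbance input

# Solve A P + P A^T = -B W B^T for the stationary state covariance P.
P = solve_continuous_lyapunov(A, -B @ W @ B.T)

# Quadratic comfort index with assumed weights on the states.
Q = np.diag([1.0, 0.2])
ride_index = np.trace(Q @ P)
print("stationary covariance:\n", P)
print("quadratic ride-quality index:", round(float(ride_index), 4))
```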

  11. Analysis of covariance as a remedy for demographic mismatch of research subject groups: some sobering simulations.

    PubMed

    Adams, K M; Brown, G G; Grant, I

    1985-08-01

    Analysis of Covariance (ANCOVA) is often used in neuropsychological studies to effect ex-post-facto adjustment of performance variables among groups of subjects mismatched on some relevant demographic variable. This paper reviews some of the statistical assumptions underlying this usage. To illustrate the complexities of this statistical technique, three sham studies using actual patient data are presented. These staged simulations have varying relationships between group differences in test performance and levels of covariate discrepancy. The results were robust and consistent, and were held to support the wisdom of previous cautions by statisticians concerning the use of ANCOVA to justify comparisons between incomparable groups. ANCOVA should not be used in neuropsychological research to equate groups that are unequal on variables such as age and education, or to exert statistical control whose objective is to eliminate consideration of the covariate as an explanation for the results. Finally, the report advocates by example the use of simulation to further our understanding of neuropsychological variables.
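
    In the same spirit as the sham studies, the sketch below simulates two groups that do not overlap on a covariate and whose outcome depends only on that covariate; the ANCOVA-adjusted group effect then rests on extrapolating the covariate-outcome relationship rather than on any observed overlap. Variable names, sample sizes, and effect sizes are illustrative assumptions, not the paper's data.

```python
# Simulation sketch: ANCOVA with demographically mismatched (non-overlapping) groups.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 60

# Non-overlapping covariate distributions: patients ~ 8-12 years, controls ~ 14-18 years.
education = np.concatenate([rng.uniform(8, 12, n), rng.uniform(14, 18, n)])
group = np.repeat(["patient", "control"], n)

# Outcome depends only on education; there is no true group effect.
score = 2.0 * education + rng.normal(0, 2, 2 * n)
df = pd.DataFrame({"score": score, "education": education, "group": group})

ancova = smf.ols("score ~ C(group) + education", data=df).fit()
print(ancova.params)   # the group coefficient is near zero here because the linear
print(ancova.pvalues)  # model is correct; with no covariate overlap, that estimate
                       # rests entirely on the assumed-linear extrapolation.
```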

  12. A PDF-based classification of gait cadence patterns in patients with amyotrophic lateral sclerosis.

    PubMed

    Wu, Yunfeng; Ng, Sin Chun

    2010-01-01

    Amyotrophic lateral sclerosis (ALS) is a neurological disease caused by the degeneration of motor neurons. As this progressive disease advances, it becomes difficult for ALS patients to regulate normal locomotion, and gait stability is perturbed. This paper presents a pilot statistical study of gait cadence (or stride interval) in ALS. The probability density functions (PDFs) of the stride interval were first estimated with the nonparametric Parzen-window method. We then computed the mean of the left-foot stride interval and the modified Kullback-Leibler divergence (MKLD) from the estimated PDFs. The analysis results suggested that both statistical parameters were significantly altered in ALS, and that a least-squares support vector machine (LS-SVM) may effectively distinguish the stride patterns of ALS patients from those of healthy controls, with an accuracy of 82.8% and an area of 0.87 under the receiver operating characteristic curve.
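
    A hedged sketch of the two descriptors used above: a Parzen-window (Gaussian kernel) estimate of the stride-interval PDF and a symmetrised Kullback-Leibler divergence between a subject's PDF and a reference PDF. The stride-interval arrays are synthetic placeholders, and the paper's "modified" KL divergence may differ in detail from the symmetrised form shown here.

```python
# Parzen-window PDF estimation and a symmetrised KL divergence on synthetic strides.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
strides_control = rng.normal(1.10, 0.03, 300)   # stride intervals in seconds (synthetic)
strides_patient = rng.normal(1.25, 0.08, 300)

kde_control = gaussian_kde(strides_control)     # Parzen-window (Gaussian kernel) estimate
kde_patient = gaussian_kde(strides_patient)

grid = np.linspace(0.9, 1.6, 1000)
dx = grid[1] - grid[0]
p = kde_patient(grid) + 1e-12
q = kde_control(grid) + 1e-12
p /= p.sum() * dx                               # renormalise on the grid
q /= q.sum() * dx

# Symmetrised Kullback-Leibler divergence between the two estimated PDFs.
kl_pq = np.sum(p * np.log(p / q)) * dx
kl_qp = np.sum(q * np.log(q / p)) * dx
print("mean left-foot stride interval (patient):", round(float(strides_patient.mean()), 3))
print("symmetrised KL divergence:", round(0.5 * (kl_pq + kl_qp), 3))
```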

  13. Descriptive data analysis.

    PubMed

    Thompson, Cheryl Bagley

    2009-01-01

    This 13th article of the Basics of Research series is the first in a short series on statistical analysis. These articles will discuss creating your statistical analysis plan, levels of measurement, descriptive statistics, probability theory, inferential statistics, and general considerations for interpreting the results of a statistical analysis.

  14. Analysis of broadcasting satellite service feeder link power control and polarization

    NASA Technical Reports Server (NTRS)

    Sullivan, T. M.

    1982-01-01

    Statistical analyses of carrier-to-interference power ratios (C/Is) were performed to assess 17.5 GHz feeder links using (1) fixed power and power control, and (2) orthogonal linear and orthogonal circular polarizations. The analysis methods and the attenuation/depolarization data base were based on CCIR findings to the greatest possible extent. Feeder links using adaptive power control were found neither to cause nor to suffer significant C/I degradation relative to that for fixed-power feeder links having similar or less stringent availability objectives. The C/Is for sharing between orthogonal linearly polarized feeder links were found to be significantly higher than those for circular polarization only in links to nominally colocated satellites from nominally colocated Earth stations in high-attenuation environments.

  15. Surface electromyography and ultrasound evaluation of pelvic floor muscles in hyperandrogenic women.

    PubMed

    Vassimon, Flávia Ignácio Antonio; Ferreira, Cristine Homsi Jorge; Martins, Wellington Paula; Ferriani, Rui Alberto; Batista, Roberta Leopoldino de Andrade; Bo, Kari

    2016-04-01

    High levels of androgens increase muscle mass. Given the characteristics of hyperandrogenism in polycystic ovary syndrome (PCOS), it is plausible that women with PCOS have increased pelvic floor muscle (PFM) thickness and neuromuscular activity compared with controls. The aim of this study was to assess PFM thickness and neuromuscular activity in hyperandrogenic women with PCOS and in controls. This was an observational, cross-sectional, case-control study evaluating the PFM by ultrasound (US) and surface electromyography (sEMG) in nonobese women with and without PCOS. Seventy-two women were divided into two groups: PCOS (n = 33) and controls (n = 39). PFM thickness during contraction was assessed by US (Vingmed CFM 800). PFM activity was assessed by sEMG (MyoTrac Infinit) during contractions of different durations: quick, 8 s, and 60 s. Descriptive analysis, analysis of variance (ANOVA), and Student's t test were used for the statistical analyses. There were no significant differences in PFM sEMG activity between PCOS and controls for any of the contractions: quick contraction (73.23 mV vs. 71.56 mV; p = 0.62), 8 s (55.77 mV vs. 54.17 mV; p = 0.74), and 60 s (49.26 mV vs. 47.32 mV; p = 0.68), respectively. There was also no difference between PCOS and controls in PFM thickness during contraction evaluated by US (12.78 mm vs. 13.43 mm; p = 0.48). This study did not find statistically significant differences in pelvic floor muscle thickness or muscle activity between women with PCOS and controls.

  16. Fast and accurate imputation of summary statistics enhances evidence of functional enrichment.

    PubMed

    Pasaniuc, Bogdan; Zaitlen, Noah; Shi, Huwenbo; Bhatia, Gaurav; Gusev, Alexander; Pickrell, Joseph; Hirschhorn, Joel; Strachan, David P; Patterson, Nick; Price, Alkes L

    2014-10-15

    Imputation using external reference panels (e.g. 1000 Genomes) is a widely used approach for increasing power in genome-wide association studies and meta-analysis. Existing hidden Markov model (HMM)-based imputation approaches require individual-level genotypes. Here, we develop a new method for Gaussian imputation from summary association statistics, a type of data that is becoming widely available. In simulations using 1000 Genomes (1000G) data, this method recovers 84% (54%) of the effective sample size for common (>5%) and low-frequency (1-5%) variants [increasing to 87% (60%) when summary linkage disequilibrium information is available from target samples] versus the gold standard of 89% (67%) for HMM-based imputation, which cannot be applied to summary statistics. Our approach accounts for the limited sample size of the reference panel, a crucial step to eliminate false-positive associations, and it is computationally very fast. As an empirical demonstration, we apply our method to seven case-control phenotypes from the Wellcome Trust Case Control Consortium (WTCCC) data and a study of height in the British 1958 birth cohort (1958BC). Gaussian imputation from summary statistics recovers 95% (105%) of the effective sample size (as quantified by the ratio of [Formula: see text] association statistics) compared with HMM-based imputation from individual-level genotypes at the 227 (176) published single nucleotide polymorphisms (SNPs) in the WTCCC (1958BC height) data. In addition, for publicly available summary statistics from large meta-analyses of four lipid traits, we publicly release imputed summary statistics at 1000G SNPs, which could not have been obtained using previously published methods, and demonstrate their accuracy by masking subsets of the data. We show that 1000G imputation using our approach increases the magnitude and statistical evidence of enrichment at genic versus non-genic loci for these traits, as compared with an analysis without 1000G imputation. Thus, imputation of summary statistics will be a valuable tool in future functional enrichment analyses. The software package is publicly available at http://bogdan.bioinformatics.ucla.edu/software/. Contact: bpasaniuc@mednet.ucla.edu or aprice@hsph.harvard.edu. Supplementary materials are available at Bioinformatics online.
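
    A hedged sketch of the core linear-algebra step typically used in Gaussian imputation of association z-scores from an LD reference panel: the imputed z-score at an untyped SNP is a ridge-regularised linear combination of the z-scores at nearby typed SNPs, with weights derived from the reference-panel LD matrix. The toy LD matrix and z-scores are assumptions; the published method adds further details (e.g. windowing and the exact imputation-quality metric).

```python
# Toy Gaussian imputation of a summary z-score from typed neighbours via reference LD.
import numpy as np

def impute_z(z_obs, ld_obs_obs, ld_target_obs, ridge=0.1):
    """Impute the z-score of one untyped SNP from typed neighbours."""
    n = ld_obs_obs.shape[0]
    weights = ld_target_obs @ np.linalg.inv(ld_obs_obs + ridge * np.eye(n))
    z_imp = weights @ z_obs
    # A common imputation-quality measure: predicted r^2 of the imputed statistic.
    r2 = weights @ ld_target_obs
    return z_imp, r2

# Toy example: 3 typed SNPs in LD with 1 untyped target SNP.
ld_obs_obs = np.array([[1.0, 0.6, 0.3],
                       [0.6, 1.0, 0.5],
                       [0.3, 0.5, 1.0]])
ld_target_obs = np.array([0.7, 0.8, 0.4])
z_obs = np.array([3.1, 3.6, 1.9])

z_imp, r2 = impute_z(z_obs, ld_obs_obs, ld_target_obs)
print(f"imputed z = {z_imp:.2f}, imputation r2 = {r2:.2f}")
```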

  17. Monitoring the healing process of rat bones using Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Gamulin, O.; Serec, K.; Bilić, V.; Balarin, M.; Kosović, M.; Drmić, D.; Brčić, L.; Seiwerth, S.; Sikirić, P.

    2013-07-01

    The healing effect of BPC 157 on rat femoral head osteonecrosis was monitored by Raman spectroscopy. Three groups of rats were defined: an injured group treated with BPC 157 (10 μg/kg/day ip), an injured control group (treated with saline, 5 ml/kg/day ip), and an uninjured healthy group. Spectra were recorded and the healing effect assessed on samples harvested from animals sacrificed 3 and 6 weeks after injury. Statistical analysis of the recorded spectra showed significant differences between the BPC 157-treated, control, and healthy groups of animals. In particular, after 6 weeks the spectral resemblance between the healthy and BPC 157-treated samples indicated a positive influence of BPC 157 on the healing process of the rat femoral head.

  18. Automatic detection of health changes using statistical process control techniques on measured transfer times of elderly.

    PubMed

    Baldewijns, Greet; Luca, Stijn; Nagels, William; Vanrumste, Bart; Croonenborghs, Tom

    2015-01-01

    It has been shown that gait speed and transfer times are good measures of functional ability in the elderly. However, data currently acquired by systems that measure either gait speed or transfer times in the homes of elderly people require manual review by healthcare workers, and this reviewing process is time-consuming. To alleviate this burden, this paper proposes the use of statistical process control (SPC) methods to automatically detect both positive and negative changes in transfer times. Three SPC techniques known for their ability to detect small shifts in the data (tabular CUSUM, standardized CUSUM, and EWMA) are evaluated on simulated transfer times. This analysis shows that EWMA is the best-suited method, with a detection accuracy of 82% and an average detection time of 9.64 days.
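
    A minimal sketch of the EWMA control chart evaluated above, applied to simulated daily transfer times with an upward shift partway through the series. The smoothing constant, control-limit width, and simulated shift are illustrative choices, not the paper's exact settings.

```python
# EWMA control chart on simulated transfer times with an injected mean shift.
import numpy as np

rng = np.random.default_rng(7)
baseline = rng.normal(20.0, 2.0, 60)            # seconds, in-control phase
shifted = rng.normal(23.0, 2.0, 40)             # slower transfers after a health change
times = np.concatenate([baseline, shifted])

mu0, sigma0 = baseline.mean(), baseline.std(ddof=1)   # estimated in-control parameters
lam, L = 0.2, 3.0                                      # EWMA weight and limit width

ewma = mu0
for i, x in enumerate(times):
    ewma = lam * x + (1 - lam) * ewma
    # Time-varying control limits of the EWMA statistic.
    var_i = sigma0**2 * (lam / (2 - lam)) * (1 - (1 - lam) ** (2 * (i + 1)))
    ucl = mu0 + L * np.sqrt(var_i)
    lcl = mu0 - L * np.sqrt(var_i)
    if ewma > ucl or ewma < lcl:
        print(f"shift signalled on day {i + 1} (EWMA = {ewma:.2f}, UCL = {ucl:.2f})")
        break
```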

  19. Mechanical properties of silicate glasses exposed to a low-Earth orbit

    NASA Technical Reports Server (NTRS)

    Wiedlocher, David E.; Tucker, Dennis S.; Nichols, Ron; Kinser, Donald L.

    1992-01-01

    The effects of a 5.8-year exposure to the low-Earth-orbit environment on the mechanical properties of commercial optical fused silica, low-iron soda-lime-silica glass, Pyrex 7740, Vycor 7913, BK-7, and the glass ceramic Zerodur were examined. Mechanical testing employed the ASTM F-394 piston-on-3-ball method in a liquid nitrogen environment. Samples were exposed on the Long Duration Exposure Facility (LDEF) in two locations. Impacts were observed on all specimens except Vycor. A Weibull analysis as well as a standard statistical evaluation were conducted. The Weibull analysis revealed no differences between the control samples and the two sets of exposed samples. We thus concluded that, within the limits of experimental error, the radiation components of the Earth-orbital environment did not degrade the mechanical strength of the samples examined. The upper bound of strength degradation for meteorite-impacted samples, based upon statistical analysis and observation, was 50 percent.
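
    A hedged sketch of the kind of two-parameter Weibull fit commonly used to compare fracture-strength distributions between control and flight-exposed specimens. The strength values below are synthetic placeholders, not LDEF measurements.

```python
# Two-parameter Weibull fits to synthetic fracture-strength data for two groups.
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(3)
control_strength = weibull_min.rvs(c=8.0, scale=110.0, size=30, random_state=rng)  # MPa
exposed_strength = weibull_min.rvs(c=8.0, scale=108.0, size=30, random_state=rng)  # MPa

# Fit with the location parameter fixed at zero (standard 2-parameter Weibull).
for label, data in [("control", control_strength), ("exposed", exposed_strength)]:
    shape, loc, scale = weibull_min.fit(data, floc=0)
    print(f"{label}: Weibull modulus m = {shape:.1f}, characteristic strength = {scale:.1f} MPa")
```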

  20. Combat Ration Advanced Manufacturing Technology Demonstration (CRAMTD). ’Generic Inspection-Statistical Process Control System for a Combat Ration Manufacturing Facility’. Short Term Project (STP) Number 3.

    DTIC Science & Technology

    1996-01-01

    failure as due to an adhesive layer between the foil and inner polypropylene layers. ... Under subcontract, NFPA provided HACCP draft manuals for the ... parameters of the production process and to ensure that they are within their target values. In addition, a HACCP program was used to assure product ... played an important part in implementing Hazard Analysis Critical Control Points (HACCP) as part of the Process and Quality Control manual. The National
