Sample records for statistical procedures including

  1. A Statistical Analysis of Brain Morphology Using Wild Bootstrapping

    PubMed Central

    Ibrahim, Joseph G.; Tang, Niansheng; Rowe, Daniel B.; Hao, Xuejun; Bansal, Ravi; Peterson, Bradley S.

    2008-01-01

    Methods for the analysis of brain morphology, including voxel-based morphology and surface-based morphometries, have been used to detect associations between brain structure and covariates of interest, such as diagnosis, severity of disease, age, IQ, and genotype. The statistical analysis of morphometric measures usually involves two statistical procedures: 1) invoking a statistical model at each voxel (or point) on the surface of the brain or brain subregion, followed by mapping test statistics (e.g., t test) or their associated p values at each of those voxels; 2) correction for the multiple statistical tests conducted across all voxels on the surface of the brain region under investigation. We propose the use of new statistical methods for each of these procedures. We first use a heteroscedastic linear model to test the associations between the morphological measures at each voxel on the surface of the specified subregion (e.g., cortical or subcortical surfaces) and the covariates of interest. Moreover, we develop a robust test procedure that is based on a resampling method, called wild bootstrapping. This procedure assesses the statistical significance of the associations between a measure of given brain structure and the covariates of interest. The value of this robust test procedure lies in its computational simplicity and in its applicability to a wide range of imaging data, including data from both anatomical and functional magnetic resonance imaging (fMRI). Simulation studies demonstrate that this robust test procedure can accurately control the family-wise error rate. We demonstrate the application of this robust test procedure to the detection of statistically significant differences in the morphology of the hippocampus over time across gender groups in a large sample of healthy subjects. PMID:17649909
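
The wild bootstrap described in this abstract can be sketched in a few lines: residuals from a null fit are multiplied by random signs (Rademacher weights), so each resample preserves the voxel-wise error variance even under heteroscedasticity. The following is an illustrative single-voxel sketch with simulated data, not the authors' implementation; all parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated morphometric measure y at a single voxel vs. one covariate x,
# with heteroscedastic noise (the noise sd grows with x).
n = 200
x = rng.uniform(0, 1, n)
y = 1.0 + 0.5 * x + rng.normal(0, 0.2 + 0.5 * x, n)

X = np.column_stack([np.ones(n), x])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

# Wild bootstrap under the null (slope = 0): keep each residual's
# magnitude but flip its sign at random (Rademacher weights), which
# preserves the heteroscedastic error structure.
b0 = y.mean()
r0 = y - b0
B = 500
slopes_null = np.empty(B)
for b in range(B):
    y_star = b0 + r0 * rng.choice([-1.0, 1.0], n)
    bb, *_ = np.linalg.lstsq(X, y_star, rcond=None)
    slopes_null[b] = bb[1]

# Bootstrap p-value for the observed slope
p = np.mean(np.abs(slopes_null) >= abs(beta_hat[1]))
```

In the paper's setting this resampling is repeated at every voxel and the maximum statistic over voxels is used to control the family-wise error rate; the sketch above shows only the single-voxel step.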

  2. Statistics and Discoveries at the LHC (1/4)

    ScienceCinema

    Cowan, Glen

    2018-02-09

    The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit-setting procedures, and the treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect, and the treatment of systematic uncertainties.
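
The link between p-values and discovery significance mentioned here is a one-line computation: a one-sided p-value p maps to a Gaussian significance Z via Z = Φ⁻¹(1 − p), and the conventional 5σ discovery threshold corresponds to p ≈ 2.87 × 10⁻⁷. A minimal illustration with SciPy (not part of the lecture materials):

```python
from scipy.stats import norm

def z_to_p(z):
    """One-sided p-value corresponding to a significance of z sigma."""
    return norm.sf(z)

def p_to_z(p):
    """Gaussian significance Z for a one-sided p-value p."""
    return norm.isf(p)

p_discovery = z_to_p(5.0)      # the conventional 5-sigma discovery threshold
z_evidence = p_to_z(0.00135)   # roughly 3 sigma, often called "evidence"
```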

  3. Statistics and Discoveries at the LHC (3/4)

    ScienceCinema

    Cowan, Glen

    2018-02-19

    The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit-setting procedures, and the treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect, and the treatment of systematic uncertainties.

  4. Statistics and Discoveries at the LHC (4/4)

    ScienceCinema

    Cowan, Glen

    2018-05-22

    The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit-setting procedures, and the treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect, and the treatment of systematic uncertainties.

  5. Statistics and Discoveries at the LHC (2/4)

    ScienceCinema

    Cowan, Glen

    2018-04-26

    The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit-setting procedures, and the treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect, and the treatment of systematic uncertainties.

  6. Statistics in the pharmacy literature.

    PubMed

    Lee, Charlene M; Soin, Herpreet K; Einarson, Thomas R

    2004-09-01

    Research in statistical methods is essential for maintenance of high quality of the published literature. To update previous reports of the types and frequencies of statistical terms and procedures in research studies of selected professional pharmacy journals, we obtained all research articles published in 2001 in 6 journals: American Journal of Health-System Pharmacy, The Annals of Pharmacotherapy, Canadian Journal of Hospital Pharmacy, Formulary, Hospital Pharmacy, and Journal of the American Pharmaceutical Association. Two independent reviewers identified and recorded descriptive and inferential statistical terms/procedures found in the methods, results, and discussion sections of each article. Results were determined by tallying the total number of times, as well as the percentage, that each statistical term or procedure appeared in the articles. One hundred forty-four articles were included. Ninety-eight percent employed descriptive statistics; of these, 28% used only descriptive statistics. The most common descriptive statistical terms were percentage (90%), mean (74%), standard deviation (58%), and range (46%). Sixty-nine percent of the articles used inferential statistics, the most frequent being chi-square (33%), Student's t-test (26%), Pearson's correlation coefficient r (18%), ANOVA (14%), and logistic regression (11%). Statistical terms and procedures were found in nearly all of the research articles published in pharmacy journals. Thus, pharmacy education should aim to provide current and future pharmacists with an understanding of the common statistical terms and procedures identified, to facilitate appropriate appraisal and subsequent use of the information available in research articles.
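
The two most frequent inferential procedures in the survey, the chi-square test and Student's t-test, take only a few lines in any modern statistics environment. A hypothetical illustration with SciPy (the contingency table and both samples are invented):

```python
import numpy as np
from scipy.stats import chi2_contingency, ttest_ind

# Chi-square test of independence on a hypothetical 2x2 outcome table
table = np.array([[30, 10],
                  [20, 20]])
chi2, p_chi, dof, expected = chi2_contingency(table)

# Student's t-test comparing two invented treatment groups
rng = np.random.default_rng(1)
group_a = rng.normal(0.0, 1.0, 100)
group_b = rng.normal(1.0, 1.0, 100)
t_stat, p_t = ttest_ind(group_a, group_b)
```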

  7. An Empirical Investigation of Methods for Assessing Item Fit for Mixed Format Tests

    ERIC Educational Resources Information Center

    Chon, Kyong Hee; Lee, Won-Chan; Ansley, Timothy N.

    2013-01-01

    Empirical information regarding performance of model-fit procedures has been a persistent need in measurement practice. Statistical procedures for evaluating item fit were applied to real test examples that consist of both dichotomously and polytomously scored items. The item fit statistics used in this study included PARSCALE's G²,…

  8. 42 CFR 493.1256 - Standard: Control procedures.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... for having control procedures that monitor the accuracy and precision of the complete analytic process..., include two control materials, including one that is capable of detecting errors in the extraction process... control materials having previously determined statistical parameters. (e) For reagent, media, and supply...

  9. Using SPSS to Analyze Book Collection Data.

    ERIC Educational Resources Information Center

    Townley, Charles T.

    1981-01-01

    Describes and illustrates Statistical Package for the Social Sciences (SPSS) procedures appropriate for book collection data analysis. Several different procedures for univariate, bivariate, and multivariate analysis are discussed, and applications of procedures for book collection studies are presented. Included are 24 tables illustrating output…

  10. Statistical correlation of structural mode shapes from test measurements and NASTRAN analytical values

    NASA Technical Reports Server (NTRS)

    Purves, L.; Strang, R. F.; Dube, M. P.; Alea, P.; Ferragut, N.; Hershfeld, D.

    1983-01-01

    The software and procedures of a system of programs used to generate a report of the statistical correlation between NASTRAN modal analysis results and physical tests results from modal surveys are described. Topics discussed include: a mathematical description of statistical correlation, a user's guide for generating a statistical correlation report, a programmer's guide describing the organization and functions of individual programs leading to a statistical correlation report, and a set of examples including complete listings of programs, and input and output data.

  11. 5 CFR 532.215 - Establishments included in regular appropriated fund surveys.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... in surveys shall be selected under standard probability sample selection procedures. In areas with... establishment list drawn under statistical sampling procedures. [55 FR 46142, Nov. 1, 1990] ...

  12. Analyzing longitudinal data with the linear mixed models procedure in SPSS.

    PubMed

    West, Brady T

    2009-09-01

    Many applied researchers analyzing longitudinal data share a common misconception: that specialized statistical software is necessary to fit hierarchical linear models (also known as linear mixed models [LMMs], or multilevel models) to longitudinal data sets. Although several specialized statistical software programs of high quality are available that allow researchers to fit these models to longitudinal data sets (e.g., HLM), rapid advances in general purpose statistical software packages have recently enabled analysts to fit these same models when using preferred packages that also enable other more common analyses. One of these general purpose statistical packages is SPSS, which includes a very flexible and powerful procedure for fitting LMMs to longitudinal data sets with continuous outcomes. This article aims to present readers with a practical discussion of how to analyze longitudinal data using the LMMs procedure in the SPSS statistical software package.
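
As a point of comparison, the random-intercept LMM that SPSS fits with its linear mixed models procedure can also be fitted in other general-purpose packages. The sketch below uses Python's statsmodels on simulated longitudinal data; it is not SPSS syntax, and all values (30 subjects, 5 time points, slope 0.5) are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n_subj, n_obs = 30, 5
subj = np.repeat(np.arange(n_subj), n_obs)
time = np.tile(np.arange(n_obs), n_subj)

# Continuous outcome with a subject-specific random intercept u
u = rng.normal(0, 1.0, n_subj)
y = 2.0 + 0.5 * time + u[subj] + rng.normal(0, 0.5, subj.size)
df = pd.DataFrame({"y": y, "time": time, "subject": subj})

# Random-intercept linear mixed model: y ~ time, grouped by subject
model = smf.mixedlm("y ~ time", df, groups=df["subject"])
fit = model.fit()
slope = fit.params["time"]
```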

  13. Statistical methods in personality assessment research.

    PubMed

    Schinka, J A; LaLone, L; Broeckel, J A

    1997-06-01

    Emerging models of personality structure and advances in the measurement of personality and psychopathology suggest that research in personality and personality assessment has entered a stage of advanced development. In this article we examine whether researchers in these areas have taken advantage of new and evolving statistical procedures. We conducted a review of articles published in the Journal of Personality Assessment during the past 5 years. Of the 449 articles that included some form of data analysis, 12.7% used only descriptive statistics, most employed only univariate statistics, and fewer than 10% used multivariate methods of data analysis. We discuss the cost of using limited statistical methods, the possible reasons for the apparent reluctance to employ advanced statistical procedures, and potential solutions to this technical shortcoming.

  14. 50 CFR 600.135 - Meeting procedures.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 50 Wildlife and Fisheries 8 2010-10-01 2010-10-01 false Meeting procedures. 600.135 Section 600....135 Meeting procedures. Link to an amendment published at 75 FR 59150, Sept. 27, 2010. (a) Public notice of regular meetings of the Council, scientific statistical committee or advisory panels, including...

  15. 34 CFR 668.46 - Institutional security policies and crime statistics.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    .... (11) A statement of policy regarding the institution's campus sexual assault programs to prevent sex offenses, and procedures to follow when a sex offense occurs. The statement must include— (i) A description... nonforcible sex offenses; (ii) Procedures students should follow if a sex offense occurs, including procedures...

  16. The Incoming Statistical Knowledge of Undergraduate Majors in a Department of Mathematics and Statistics

    ERIC Educational Resources Information Center

    Cook, Samuel A.; Fukawa-Connelly, Timothy

    2016-01-01

    Studies have shown that at the end of an introductory statistics course, students struggle with building block concepts, such as mean and standard deviation, and rely on procedural understandings of the concepts. This study aims to investigate the understandings of entering freshmen in a department of mathematics and statistics (including mathematics…

  17. Changes in Occupational Radiation Exposures after Incorporation of a Real-time Dosimetry System in the Interventional Radiology Suite.

    PubMed

    Poudel, Sashi; Weir, Lori; Dowling, Dawn; Medich, David C

    2016-08-01

    A statistical pilot study was retrospectively performed to analyze potential changes in occupational radiation exposures to Interventional Radiology (IR) staff at Lawrence General Hospital after implementation of the i2 Active Radiation Dosimetry System (Unfors RaySafe Inc, 6045 Cochran Road Cleveland, OH 44139-3302). In this study, the monthly OSL dosimetry records obtained during the eight-month period prior to i2 implementation were normalized to the number of procedures performed during each month and statistically compared to the normalized dosimetry records obtained for the eight-month period after i2 implementation. The resulting statistics included calculation of the mean and standard deviation of the dose equivalences per procedure and included appropriate hypothesis tests to assess for statistically valid differences between the pre- and post-i2 study periods. Hypothesis testing was performed on three groups of staff present during an IR procedure: the first group included all members of the IR staff, the second group consisted of the IR radiologists, and the third group consisted of the IR technician staff. After implementing the i2 active dosimetry system, participating members of the Lawrence General IR staff had a reduction in the average dose equivalence per procedure of 43.1% ± 16.7% (p = 0.04). Similarly, Lawrence General IR radiologists had a 65.8% ± 33.6% (p = 0.01) reduction, while the technologists had a 45.0% ± 14.4% (p = 0.03) reduction.

  18. Fitting a three-parameter lognormal distribution with applications to hydrogeochemical data from the National Uranium Resource Evaluation Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kane, V.E.

    1979-10-01

    The standard maximum likelihood and moment estimation procedures are shown to have some undesirable characteristics for estimating the parameters in a three-parameter lognormal distribution. A class of goodness-of-fit estimators is found which provides a useful alternative to the standard methods. The class of goodness-of-fit tests considered include the Shapiro-Wilk and Shapiro-Francia tests, which reduce to a weighted linear combination of the order statistics that can be maximized in estimation problems. The weighted-order statistic estimators are compared to the standard procedures in Monte Carlo simulations. Bias and robustness of the procedures are examined, and example data sets are analyzed, including geochemical data from the National Uranium Resource Evaluation Program.
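
The goodness-of-fit idea sketched in this abstract, choosing the threshold so that the log-transformed data look most normal under the Shapiro-Wilk W statistic, can be illustrated directly. This is a simplified sketch on simulated data with an assumed grid search, not Kane's full procedure; all parameter values are invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
tau_true, mu, sigma = 5.0, 1.0, 0.4        # threshold, log-mean, log-sd
data = tau_true + rng.lognormal(mu, sigma, 500)

# Choose the threshold tau so that log(data - tau) looks most normal,
# scoring each candidate by the Shapiro-Wilk W statistic.
taus = np.linspace(0.0, data.min() - 1e-3, 200)
W = [stats.shapiro(np.log(data - t))[0] for t in taus]
tau_hat = taus[int(np.argmax(W))]

# With the threshold fixed, the remaining MLEs are closed-form.
logs = np.log(data - tau_hat)
mu_hat, sigma_hat = logs.mean(), logs.std()
```

Note that straight maximum likelihood for this model is unstable (the likelihood is unbounded as the threshold approaches the sample minimum), which is exactly the abstract's motivation for goodness-of-fit estimators.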

  19. Quantifying the impact of between-study heterogeneity in multivariate meta-analyses

    PubMed Central

    Jackson, Dan; White, Ian R; Riley, Richard D

    2012-01-01

    Measures that quantify the impact of heterogeneity in univariate meta-analysis, including the very popular I2 statistic, are now well established. Multivariate meta-analysis, where studies provide multiple outcomes that are pooled in a single analysis, is also becoming more commonly used. The question of how to quantify heterogeneity in the multivariate setting is therefore raised. It is the univariate R2 statistic, the ratio of the variance of the estimated treatment effect under the random and fixed effects models, that generalises most naturally, so this statistic provides our basis. This statistic is then used to derive a multivariate analogue of I2, which we call . We also provide a multivariate H2 statistic, the ratio of a generalisation of Cochran's heterogeneity statistic and its associated degrees of freedom, with an accompanying generalisation of the usual I2 statistic, . Our proposed heterogeneity statistics can be used alongside all the usual estimates and inferential procedures used in multivariate meta-analysis. We apply our methods to some real datasets and show how our statistics are equally appropriate in the context of multivariate meta-regression, where study level covariate effects are included in the model. Our heterogeneity statistics may be used when applying any procedure for fitting the multivariate random effects model. Copyright © 2012 John Wiley & Sons, Ltd. PMID:22763950
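
In the univariate case that these multivariate statistics generalise, the quantities have closed forms: with Cochran's Q computed from k studies, H² = Q/(k − 1) and I² = max{0, (Q − (k − 1))/Q}. A small self-contained sketch with invented effect sizes and variances:

```python
import numpy as np

def heterogeneity(effects, variances):
    """Cochran's Q, H^2 = Q/df, and I^2 for a univariate meta-analysis."""
    effects = np.asarray(effects, float)
    w = 1.0 / np.asarray(variances, float)
    mu_fixed = np.sum(w * effects) / np.sum(w)   # fixed-effect estimate
    Q = np.sum(w * (effects - mu_fixed) ** 2)
    df = len(effects) - 1
    H2 = Q / df
    I2 = max(0.0, (Q - df) / Q) if Q > 0 else 0.0
    return Q, H2, I2

# Homogeneous studies: I^2 should be 0
Q_hom, H2_hom, I2_hom = heterogeneity([0.50, 0.52, 0.48, 0.51], [0.01] * 4)

# Strongly heterogeneous studies: I^2 should be large
Q_het, H2_het, I2_het = heterogeneity([0.2, 0.9, -0.3, 0.6], [0.01] * 4)
```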

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, Richard O.

    The application of statistics to environmental pollution monitoring studies requires a knowledge of statistical analysis methods particularly well suited to pollution data. This book fills that need by providing sampling plans, statistical tests, parameter estimation procedures, and references to pertinent publications. Most of the statistical techniques are relatively simple, and examples, exercises, and case studies are provided to illustrate procedures. The book is logically divided into three parts. Chapters 1, 2, and 3 are introductory chapters. Chapters 4 through 10 discuss field sampling designs and Chapters 11 through 18 deal with a broad range of statistical analysis procedures. Some statistical techniques given here are not commonly seen in statistics books. For example, see methods for handling correlated data (Sections 4.5 and 11.12), for detecting hot spots (Chapter 10), and for estimating a confidence interval for the mean of a lognormal distribution (Section 13.2). Also, Appendix B lists a computer code that estimates and tests for trends over time at one or more monitoring stations using nonparametric methods (Chapters 16 and 17). Unfortunately, some important topics could not be included because of their complexity and the need to limit the length of the book. For example, only brief mention could be made of time series analysis using Box-Jenkins methods and of kriging techniques for estimating spatial and spatial-time patterns of pollution, although multiple references on these topics are provided. Also, no discussion of methods for assessing risks from environmental pollution could be included.
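
A representative nonparametric trend procedure of the kind covered in Chapters 16 and 17 is the Mann-Kendall test, which scores concordant versus discordant pairs over time. A minimal sketch without tie correction, assumed for illustration and not the book's Appendix B code:

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Mann-Kendall S statistic and two-sided normal-approximation p-value
    (no correction for ties)."""
    x = np.asarray(x, float)
    n = len(x)
    # S counts later-minus-earlier sign agreements over all pairs
    s = sum(np.sign(x[j] - x[i]) for i in range(n) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)   # continuity correction
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    return s, 2 * norm.sf(abs(z))

# A strictly increasing monitoring series: S = n(n-1)/2 = 28 for n = 8
s_up, p_up = mann_kendall([1, 2, 3, 4, 5, 6, 7, 8])
```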

  1. Statistical Analyses of Scatterplots to Identify Important Factors in Large-Scale Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kleijnen, J.P.C.; Helton, J.C.

    1999-04-01

    The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (1) linear relationships with correlation coefficients, (2) monotonic relationships with rank correlation coefficients, (3) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (4) trends in variability as defined by variances and interquartile ranges, and (5) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from the analysis include: (1) Type I errors are unavoidable, (2) Type II errors can occur when inappropriate analysis procedures are used, (3) physical explanations should always be sought for why statistical procedures identify variables as being important, and (4) the identification of important variables tends to be stable for independent Latin hypercube samples.
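
The first three pattern tests listed above, linear (correlation coefficient), monotonic (rank correlation), and trends in central tendency (Kruskal-Wallis across bins of the input), can be sketched as follows. The data are simulated with a deliberately nonmonotonic relationship, not drawn from the two-phase flow model:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
x = rng.uniform(0, 1, 300)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, 300)   # nonmonotonic pattern

r_lin, p_lin = stats.pearsonr(x, y)     # (1) linear relationship
rho, p_mono = stats.spearmanr(x, y)     # (2) monotonic relationship

# (3) trend in central tendency: Kruskal-Wallis across bins of x
bins = np.digitize(x, [0.25, 0.5, 0.75])
groups = [y[bins == k] for k in range(4)]
H, p_kw = stats.kruskal(*groups)
```

On a pattern like this the bin-based Kruskal-Wallis test detects the relationship decisively even when linear and rank correlations understate it, which is the motivation for trying increasingly complex tests in sequence.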

  2. Revised Planning Methodology For Signalized Intersections And Operational Analysis Of Exclusive Left-Turn Lanes, Part-II: Models And Procedures (Final Report)

    DOT National Transportation Integrated Search

    1996-04-01

    This report also describes the procedures for direct estimation of intersection capacity with simulation, including a set of rigorous statistical tests for simulation parameter calibration from field data.

  3. Statistical Software and Artificial Intelligence: A Watershed in Applications Programming.

    ERIC Educational Resources Information Center

    Pickett, John C.

    1984-01-01

    AUTOBJ and AUTOBOX are revolutionary software programs which contain the first application of artificial intelligence to statistical procedures used in analysis of time series data. The artificial intelligence included in the programs and program features are discussed. (JN)

  4. An operational GLS model for hydrologic regression

    USGS Publications Warehouse

    Tasker, Gary D.; Stedinger, J.R.

    1989-01-01

    Recent Monte Carlo studies have documented the value of generalized least squares (GLS) procedures to estimate empirical relationships between streamflow statistics and physiographic basin characteristics. This paper presents a number of extensions of the GLS method that deal with realities and complexities of regional hydrologic data sets that were not addressed in the simulation studies. These extensions include: (1) a more realistic model of the underlying model errors; (2) smoothed estimates of cross correlation of flows; (3) procedures for including historical flow data; (4) diagnostic statistics describing leverage and influence for GLS regression; and (5) the formulation of a mathematical program for evaluating future gaging activities. © 1989.

  5. Regression Models for Identifying Noise Sources in Magnetic Resonance Images

    PubMed Central

    Zhu, Hongtu; Li, Yimei; Ibrahim, Joseph G.; Shi, Xiaoyan; An, Hongyu; Chen, Yashen; Gao, Wei; Lin, Weili; Rowe, Daniel B.; Peterson, Bradley S.

    2009-01-01

    Stochastic noise, susceptibility artifacts, magnetic field and radiofrequency inhomogeneities, and other noise components in magnetic resonance images (MRIs) can introduce serious bias into any measurements made with those images. We formally introduce three regression models including a Rician regression model and two associated normal models to characterize stochastic noise in various magnetic resonance imaging modalities, including diffusion-weighted imaging (DWI) and functional MRI (fMRI). Estimation algorithms are introduced to maximize the likelihood function of the three regression models. We also develop a diagnostic procedure for systematically exploring MR images to identify noise components other than simple stochastic noise, and to detect discrepancies between the fitted regression models and MRI data. The diagnostic procedure includes goodness-of-fit statistics, measures of influence, and tools for graphical display. The goodness-of-fit statistics can assess the key assumptions of the three regression models, whereas measures of influence can isolate outliers caused by certain noise components, including motion artifacts. The tools for graphical display permit graphical visualization of the values for the goodness-of-fit statistic and influence measures. Finally, we conduct simulation studies to evaluate performance of these methods, and we analyze a real dataset to illustrate how our diagnostic procedure localizes subtle image artifacts by detecting intravoxel variability that is not captured by the regression models. PMID:19890478
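
The Rician model arises because an MR magnitude image is the modulus of two Gaussian-corrupted channels, which also produces the well-known positive bias at finite SNR. A simulation sketch (signal and noise values invented):

```python
import numpy as np

rng = np.random.default_rng(6)
S = 100.0        # true MR signal intensity
sigma = 10.0     # per-channel Gaussian noise sd
n = 200_000

# A magnitude voxel is the modulus of noisy real and imaginary channels
real = S + rng.normal(0, sigma, n)
imag = rng.normal(0, sigma, n)
m = np.hypot(real, imag)

# At high SNR the Rician mean is approximately sqrt(S^2 + sigma^2),
# so the magnitude overestimates the true signal.
approx_mean = np.sqrt(S**2 + sigma**2)
bias = m.mean() - S
```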

  6. The transfer of analytical procedures.

    PubMed

    Ermer, J; Limberger, M; Lis, K; Wätzig, H

    2013-11-01

    Analytical method transfers are certainly among the most discussed topics in the GMP regulated sector. However, they are surprisingly little regulated in detail. General information is provided by USP, WHO, and ISPE in particular. Most recently, the EU emphasized the importance of analytical transfer by including it in their draft of the revised GMP Guideline. In this article, an overview and comparison of these guidelines is provided. The key to success for method transfers is excellent communication between the sending and receiving units. In order to facilitate this communication, procedures, flow charts, and checklists are provided here for responsibilities, success factors, transfer categories, the transfer plan and report, strategies in case of failed transfers, and acceptance limits, together with a comprehensive glossary. Potential pitfalls are described such that they can be avoided. In order to assure an efficient and sustainable transfer of analytical procedures, a practically relevant and scientifically sound evaluation with corresponding acceptance criteria is crucial. Various strategies and statistical tools such as significance tests, absolute acceptance criteria, and equivalence tests are thoroughly described and compared in detail, with examples. Significance tests should be avoided. The success criterion is not statistical significance, but rather analytical relevance. Depending on a risk assessment of the analytical procedure in question, statistical equivalence tests are recommended, because they include both a practically relevant acceptance limit and direct control of the statistical risks. However, for lower risk procedures, a simple comparison of the transfer performance parameters to absolute limits is also regarded as sufficient. Copyright © 2013 Elsevier B.V. All rights reserved.

  7. Nevada Applied Ecology Group procedures handbook for environmental transuranics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, M.G.; Dunaway, P.B.

    The activities of the Nevada Applied Ecology Group (NAEG) integrated research studies of environmental plutonium and other transuranics at the Nevada Test Site have required many standardized field and laboratory procedures. These include sampling techniques, collection and preparation, radiochemical and wet chemistry analysis, data bank storage and reporting, and statistical considerations for environmental samples of soil, vegetation, resuspended particles, animals, and others. This document, printed in two volumes, includes most of the Nevada Applied Ecology Group standard procedures, with explanations as to the specific applications involved in the environmental studies. Where there is more than one document concerning a procedure, it has been included to indicate special studies or applications perhaps more complex than the routine standard sampling procedures utilized.

  8. Nevada Applied Ecology Group procedures handbook for environmental transuranics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, M.G.; Dunaway, P.B.

    The activities of the Nevada Applied Ecology Group (NAEG) integrated research studies of environmental plutonium and other transuranics at the Nevada Test Site have required many standardized field and laboratory procedures. These include sampling techniques, collection and preparation, radiochemical and wet chemistry analysis, data bank storage and reporting, and statistical considerations for environmental samples of soil, vegetation, resuspended particles, animals, and other biological material. This document, printed in two volumes, includes most of the Nevada Applied Ecology Group standard procedures, with explanations as to the specific applications involved in the environmental studies. Where there is more than one document concerning a procedure, it has been included to indicate special studies or applications more complex than the routine standard sampling procedures utilized.

  9. PROC IRT: A SAS Procedure for Item Response Theory

    PubMed Central

    Matlock Cole, Ki; Paek, Insu

    2017-01-01

    This article reviews the item response theory procedure (PROC IRT) in SAS/STAT 14.1 for conducting item response theory (IRT) analyses of dichotomous and polytomous datasets that are unidimensional or multidimensional. The review provides an overview of available features, including models, estimation procedures, interfacing, input, and output files. A small-scale simulation study evaluates the IRT model parameter recovery of the PROC IRT procedure. The use of the IRT procedure in Statistical Analysis Software (SAS) may be useful for researchers who frequently utilize SAS for analyses, research, and teaching.

  10. Data management in large-scale collaborative toxicity studies: how to file experimental data for automated statistical analysis.

    PubMed

    Stanzel, Sven; Weimer, Marc; Kopp-Schneider, Annette

    2013-06-01

    High-throughput screening approaches are carried out for the toxicity assessment of a large number of chemical compounds. In such large-scale in vitro toxicity studies several hundred or thousand concentration-response experiments are conducted. The automated evaluation of concentration-response data using statistical analysis scripts saves time and yields more consistent results in comparison to data analysis performed by the use of menu-driven statistical software. Automated statistical analysis requires that concentration-response data are available in a standardised data format across all compounds. To obtain consistent data formats, a standardised data management workflow must be established, including guidelines for data storage, data handling and data extraction. In this paper two procedures for data management within large-scale toxicological projects are proposed. Both procedures are based on Microsoft Excel files as the researcher's primary data format and use a computer programme to automate the handling of data files. The first procedure assumes that data collection has not yet started whereas the second procedure can be used when data files already exist. Successful implementation of the two approaches into the European project ACuteTox is illustrated. Copyright © 2012 Elsevier Ltd. All rights reserved.
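
The standardisation step described here, mapping heterogeneous per-compound files onto one common long format before automated analysis, can be sketched with pandas. Real spreadsheets would be read with pandas.read_excel; in-memory CSV stands in below, and the column names and compound identifiers are hypothetical:

```python
import io
import pandas as pd

# Two hypothetical per-compound files with inconsistent column names
file_a = io.StringIO("conc,response\n0.1,98\n1.0,80\n10.0,15\n")
file_b = io.StringIO("Concentration,Viability\n0.1,95\n1.0,70\n10.0,20\n")

# Mapping from known column-name variants to the standard names
RENAME = {"Concentration": "conc", "Viability": "response"}

frames = []
for name, fh in [("cmpdA", file_a), ("cmpdB", file_b)]:
    df = pd.read_csv(fh).rename(columns=RENAME)
    df["compound"] = name
    frames.append(df[["compound", "conc", "response"]])

# One long-format table that an analysis script can consume uniformly
tidy = pd.concat(frames, ignore_index=True)
```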

  11. 'Chain pooling' model selection as developed for the statistical analysis of a rotor burst protection experiment

    NASA Technical Reports Server (NTRS)

    Holms, A. G.

    1977-01-01

    A statistical decision procedure called chain pooling had been developed for model selection in fitting the results of a two-level fixed-effects full or fractional factorial experiment without replication. The basic strategy included the use of one nominal level of significance for a preliminary test and a second nominal level of significance for the final test. The subject has been reexamined from the point of view of using as many as three successive statistical model deletion procedures in fitting the results of a single experiment. The investigation consisted of random number studies intended to simulate the results of a proposed aircraft turbine-engine rotor-burst-protection experiment. As a conservative approach, population model coefficients were chosen to represent a saturated 2^4 experiment with a distribution of parameter values unfavorable to the decision procedures. Three model selection strategies were developed.
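
The setting is an unreplicated two-level factorial, where every degree of freedom is consumed by effect estimates, so an error term must be assembled by pooling the smallest effects. The sketch below illustrates that pooling idea on a simulated 2^4 design; it is a deliberate simplification, not Holms' chain-pooling rules with their two nominal significance levels:

```python
import itertools
import numpy as np

rng = np.random.default_rng(7)

# Unreplicated 2^4 factorial: build +/-1 contrast columns for all
# 15 effects (4 mains, 6 two-way, 4 three-way, 1 four-way).
levels = np.array(list(itertools.product([-1, 1], repeat=4)))
cols = {}
for r in range(1, 5):
    for combo in itertools.combinations(range(4), r):
        cols["".join("ABCD"[i] for i in combo)] = levels[:, combo].prod(axis=1)

# Simulated response in which only factors A and B are truly active
y = 10 + 3.0 * cols["A"] + 2.0 * cols["B"] + rng.normal(0, 0.5, 16)

# Regression-scale effect estimate = contrast / 16 for a 16-run design
effects = {k: float(v @ y) / 16.0 for k, v in cols.items()}

# Pooling idea: treat the smallest effects as noise, estimate an error
# level from them, then flag effects that stand well above it.
mags = sorted(abs(e) for e in effects.values())
error_sd = np.sqrt(np.mean(np.square(mags[:8])))   # pool smallest 8
active = [k for k, e in effects.items() if abs(e) > 3 * error_sd]
```

Pooling selected-small effects biases the error estimate downward, which is why the actual procedure controls the preliminary and final tests at separate nominal significance levels.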

  12. Do statistical segmentation abilities predict lexical-phonological and lexical-semantic abilities in children with and without SLI?

    PubMed Central

    Mainela-Arnold, Elina; Evans, Julia L.

    2014-01-01

    This study tested the predictions of the procedural deficit hypothesis by investigating the relationship between sequential statistical learning and two aspects of lexical ability, lexical-phonological and lexical-semantic, in children with and without specific language impairment (SLI). Participants included 40 children (ages 8;5–12;3), 20 children with SLI and 20 with typical development. Children completed Saffran’s statistical word segmentation task, a lexical-phonological access task (gating task), and a word definition task. Poor statistical learners were also poor at managing lexical-phonological competition during the gating task. However, statistical learning was not a significant predictor of semantic richness in word definitions. The ability to track statistical sequential regularities may be important for learning the inherently sequential structure of lexical-phonology, but not as important for learning lexical-semantic knowledge. Consistent with the procedural/declarative memory distinction, the brain networks associated with the two types of lexical learning are likely to have different learning properties. PMID:23425593
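The statistic behind Saffran-style segmentation tasks is the syllable transitional probability, which stays high inside words and dips at word boundaries. A minimal sketch with an invented syllable stream (not the study's stimuli):

```python
import random
from collections import Counter

random.seed(1)
words = ["tupiro", "golabu", "bidaku"]     # invented trisyllabic "words"
stream = []
for _ in range(100):                        # continuous speech: shuffled repetitions
    block = words[:]
    random.shuffle(block)
    stream.extend(block)
syllables = [w[i:i + 2] for w in stream for i in range(0, 6, 2)]

pairs = Counter(zip(syllables, syllables[1:]))
firsts = Counter(syllables[:-1])

def transitional_probability(a, b):
    """Estimated P(next syllable = b | current syllable = a)."""
    return pairs[(a, b)] / firsts[a]

within = transitional_probability("tu", "pi")    # word-internal transition
across = transitional_probability("ro", "go")    # word-boundary transition
```

Learners who track these conditional statistics can place boundaries where the probability drops, which is the ability the gating and definition tasks are then related to.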

  13. [Quality of clinical studies published in the RBGO over one decade (1999-2009): methodological and ethical aspects and statistical procedures].

    PubMed

    de Sá, Joceline Cássia Ferezini; Marini, Gabriela; Gelaleti, Rafael Bottaro; da Silva, João Batista; de Azevedo, George Gantas; Rudge, Marilza Vieira Cunha

    2013-11-01

    To evaluate the methodological and statistical design evolution of publications in the Brazilian Journal of Gynecology and Obstetrics (RBGO) since resolution 196/96. A review of 133 articles published in 1999 (65) and 2009 (68) was performed by two independent reviewers with training in clinical epidemiology and the methodology of scientific research. We included all original clinical articles, case and series reports, and excluded editorials, letters to the editor, systematic reviews, experimental studies, opinion articles, and abstracts of theses and dissertations. Characteristics related to the methodological quality of the studies were analyzed in each article using a checklist that evaluated two criteria: methodological aspects and statistical procedures. We used descriptive statistics and the χ2 test for comparison of the two years. There was a difference between 1999 and 2009 in study and statistical design, with more accurate procedures and the use of more robust tests in 2009. In RBGO, we observed an evolution in the methods of published articles and a more in-depth use of statistical analyses, with more sophisticated tests such as regression and multilevel analyses, techniques that are essential for the understanding and planning of health interventions, leading to fewer interpretation errors.
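The year-by-year χ2 comparison the review describes can be illustrated with scipy; the counts below are invented stand-ins, not the review's data.

```python
from scipy.stats import chi2_contingency

# Invented counts of articles reporting a given methodological feature.
#            feature present, feature absent
table = [[12, 53],   # 1999 (65 articles)
         [31, 37]]   # 2009 (68 articles)

chi2, p, dof, expected = chi2_contingency(table)
```

A small p-value here would indicate that the feature's frequency changed between the two publication years, which is exactly the kind of contrast the checklist comparison tests.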

  14. Physics in Perspective Volume II, Part C, Statistical Data.

    ERIC Educational Resources Information Center

    National Academy of Sciences - National Research Council, Washington, DC. Physics Survey Committee.

    Statistical data relating to the sociology and economics of the physics enterprise are presented and explained. The data are divided into three sections: manpower data, data on funding and costs, and data on the literature of physics. Each section includes numerous studies, with notes on the sources and types of data, gathering procedures, and…

  15. Forest statistics for east Oklahoma counties - 1993

    Treesearch

    Patrick E. Miller; Andrew J. Hartsell; Jack D. London

    1993-01-01

    This report contains the statistical tables and figures derived from data obtained during a recent inventory of east Oklahoma. The multiresource inventory included 18 counties and two survey regions. Estimating forest acreage and timber volume involved a three-step procedure. First, estimates of forest acreage were made for each county using aerial photographs....

  16. Characterizations of linear sufficient statistics

    NASA Technical Reports Server (NTRS)

    Peters, B. C., Jr.; Reoner, R.; Decell, H. P., Jr.

    1977-01-01

    A surjective bounded linear operator T from a Banach space X to a Banach space Y must be a sufficient statistic for a dominated family of probability measures defined on the Borel sets of X. These results were applied to characterize linear sufficient statistics for families of the exponential type, including the Wishart and multivariate normal distributions as special cases. The latter result was used to establish precisely which procedures for sampling from a normal population have the property that the sample mean is a sufficient statistic.

  17. Effect of non-normality on test statistics for one-way independent groups designs.

    PubMed

    Cribbie, Robert A; Fiksenbaum, Lisa; Keselman, H J; Wilcox, Rand R

    2012-02-01

    The data obtained from one-way independent groups designs are typically non-normal in form and rarely equally variable across treatment populations (i.e., population variances are heterogeneous). Consequently, the classical test statistic that is used to assess statistical significance (i.e., the analysis of variance F test) typically provides invalid results (e.g., too many Type I errors, reduced power). For this reason, there has been considerable interest in finding a test statistic that is appropriate under conditions of non-normality and variance heterogeneity. Previously recommended procedures for analysing such data include the James test, the Welch test applied either to the usual least squares estimators of central tendency and variability, or the Welch test with robust estimators (i.e., trimmed means and Winsorized variances). A new statistic proposed by Krishnamoorthy, Lu, and Mathew, intended to deal with heterogeneous variances, though not non-normality, uses a parametric bootstrap procedure. In their investigation of the parametric bootstrap test, the authors examined its operating characteristics under limited conditions and did not compare it to the Welch test based on robust estimators. Thus, we investigated how the parametric bootstrap procedure and a modified parametric bootstrap procedure based on trimmed means perform relative to previously recommended procedures when data are non-normal and heterogeneous. The results indicated that the tests based on trimmed means offer the best Type I error control and power when variances are unequal and at least some of the distribution shapes are non-normal. © 2011 The British Psychological Society.
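A compact sketch of the robust ingredients discussed above: 20% trimmed means combined with Winsorized variances in a Yuen/Welch-type statistic. Two groups keep the arithmetic visible (the paper studies designs with more groups), and the heavy-tailed simulated data are assumptions for illustration.

```python
import numpy as np
from scipy import stats

def yuen_two_group(x, y, trim=0.2):
    """Welch-type test on trimmed means with Winsorized variances (Yuen, 1974)."""
    x, y = np.sort(x), np.sort(y)
    g1, g2 = int(trim * len(x)), int(trim * len(y))
    h1, h2 = len(x) - 2 * g1, len(y) - 2 * g2               # effective sample sizes
    tm1, tm2 = stats.trim_mean(x, trim), stats.trim_mean(y, trim)
    # Winsorize: replace the trimmed tails with the most extreme retained values.
    wx = np.concatenate(([x[g1]] * g1, x[g1:len(x) - g1], [x[-g1 - 1]] * g1))
    wy = np.concatenate(([y[g2]] * g2, y[g2:len(y) - g2], [y[-g2 - 1]] * g2))
    d1 = (len(x) - 1) * np.var(wx, ddof=1) / (h1 * (h1 - 1))
    d2 = (len(y) - 1) * np.var(wy, ddof=1) / (h2 * (h2 - 1))
    t = (tm1 - tm2) / np.sqrt(d1 + d2)
    df = (d1 + d2) ** 2 / (d1 ** 2 / (h1 - 1) + d2 ** 2 / (h2 - 1))
    return t, 2.0 * stats.t.sf(abs(t), df)

rng = np.random.default_rng(42)
x = rng.standard_t(df=3, size=30) + 3.0      # heavy-tailed group with a real shift
y = rng.standard_t(df=3, size=30)            # heavy-tailed null group
t_stat, p_val = yuen_two_group(x, y)
```

Trimming protects the location estimate from outliers while the Welch-type degrees of freedom absorb unequal variances, the two problems the abstract identifies.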

  18. Application of the QSDC procedure to the formulation of space shuttle design criteria. Volume 2: Applications guide

    NASA Technical Reports Server (NTRS)

    Bouton, I.; Martin, G. L.

    1972-01-01

    Criteria to determine the probability of aircraft structural failure were established according to the Quantitative Structural Design Criteria by Statistical Methods (the QSDC Procedure). This method was applied to the design of the space shuttle during this contract. An Applications Guide was developed to demonstrate the use of the QSDC Procedure, with examples based on a hypothetical space shuttle illustrating its application to specific design problems. Discussions of the basic parameters of the QSDC Procedure (the Limit and Omega Conditions, and the strength scatter) have been included. Available data pertinent to the estimation of the strength scatter have also been included.

  19. Standard and goodness-of-fit parameter estimation methods for the three-parameter lognormal distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kane, V.E.

    1982-01-01

    A class of goodness-of-fit estimators is found to provide a useful alternative, in certain situations, to the standard maximum likelihood method, which has some undesirable characteristics when estimating from the three-parameter lognormal distribution. The class of goodness-of-fit tests considered includes the Shapiro-Wilk and Filliben tests, which reduce to a weighted linear combination of the order statistics that can be maximized in estimation problems. The weighted order statistic estimators are compared to the standard procedures in Monte Carlo simulations. Robustness of the procedures is examined and example data sets are analyzed.
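The weighted-order-statistic idea can be sketched as choosing the threshold of a three-parameter lognormal so that the Shapiro-Wilk W of log(x - threshold) is maximised. The grid search and the synthetic sample below are simplifications assumed for illustration, not the paper's estimation procedure.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
x = 10.0 + rng.lognormal(mean=1.0, sigma=0.5, size=200)   # shifted lognormal sample

# Candidate thresholds must stay strictly below the sample minimum.
grid = np.linspace(0.0, x.min() - 1e-6, 200)
w_stats = [stats.shapiro(np.log(x - t)).statistic for t in grid]
t_hat = grid[int(np.argmax(w_stats))]

# With the threshold fixed, mu and sigma are the usual log-scale moments.
mu_hat = np.log(x - t_hat).mean()
sigma_hat = np.log(x - t_hat).std(ddof=1)
```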

  20. Comparative evaluation of stress levels before, during, and after periodontal surgical procedures with and without nitrous oxide-oxygen inhalation sedation

    PubMed Central

    Sandhu, Gurkirat; Khinda, Paramjit Kaur; Gill, Amarjit Singh; Singh Khinda, Vineet Inder; Baghi, Kamal; Chahal, Gurparkash Singh

    2017-01-01

    Context: Periodontal surgical procedures produce varying degrees of stress in all patients. Nitrous oxide-oxygen inhalation sedation is very effective for adult patients with mild-to-moderate anxiety due to dental procedures and needle phobia. Aim: The present study was designed to perform periodontal surgical procedures under nitrous oxide-oxygen inhalation sedation and assess whether this technique actually reduces stress physiologically, in comparison to local anesthesia alone (LA), during lengthy periodontal surgical procedures. Settings and Design: This was a randomized, split-mouth, cross-over study. Materials and Methods: A total of 16 patients were selected. One surgical session (SS) was performed under local anesthesia aided by nitrous oxide-oxygen inhalation sedation, and the other SS was performed on the contralateral quadrant under LA. For each session, blood samples to measure serum cortisol levels were obtained, and vital parameters including blood pressure, heart rate, respiratory rate, and arterial blood oxygen saturation were monitored before, during, and after the periodontal surgical procedures. Statistical Analysis Used: Paired t-test and repeated-measures ANOVA. Results: The findings revealed a statistically significant decrease in serum cortisol levels, blood pressure and pulse rate, and a statistically significant increase in respiratory rate and arterial blood oxygen saturation during periodontal surgical procedures under nitrous oxide inhalation sedation. Conclusion: Nitrous oxide-oxygen inhalation sedation for periodontal surgical procedures is capable of reducing stress physiologically, in comparison to LA, during lengthy periodontal surgical procedures. PMID:29386796

  1. 78 FR 43002 - Proposed Collection; Comment Request for Revenue Procedure 2004-29

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-18

    ... comments concerning statistical sampling in Sec. 274 Context. DATES: Written comments should be received on... INFORMATION: Title: Statistical Sampling in Sec. 274 Context. OMB Number: 1545-1847. Revenue Procedure Number: Revenue Procedure 2004-29. Abstract: Revenue Procedure 2004-29 prescribes the statistical sampling...

  2. Do Statistical Segmentation Abilities Predict Lexical-Phonological and Lexical-Semantic Abilities in Children with and without SLI?

    ERIC Educational Resources Information Center

    Mainela-Arnold, Elina; Evans, Julia L.

    2014-01-01

    This study tested the predictions of the procedural deficit hypothesis by investigating the relationship between sequential statistical learning and two aspects of lexical ability, lexical-phonological and lexical-semantic, in children with and without specific language impairment (SLI). Participants included forty children (ages 8;5-12;3), twenty…

  3. AGR-1 Thermocouple Data Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeff Einerson

    2012-05-01

    This report documents an effort to analyze measured and simulated data obtained in the Advanced Gas Reactor (AGR) fuel irradiation test program conducted in the INL's Advanced Test Reactor (ATR) to support the Next Generation Nuclear Plant (NGNP) R&D program. The work follows up on a previous study (Pham and Einerson, 2010), in which statistical analysis methods were applied for AGR-1 thermocouple data qualification. The present work exercises the idea that, while recognizing uncertainties inherent in physics and thermal simulations of the AGR-1 test, results of the numerical simulations can be used in combination with the statistical analysis methods to further improve qualification of measured data. Additionally, the combined analysis of measured and simulation data can generate insights about simulation model uncertainty that can be useful for model improvement. This report also describes an experimental control procedure to maintain fuel target temperature in the future AGR tests using regression relationships that include simulation results. The report is organized into four chapters. Chapter 1 introduces the AGR Fuel Development and Qualification program, the AGR-1 test configuration and test procedure, an overview of AGR-1 measured data, and an overview of the physics and thermal simulation, including modeling assumptions and uncertainties. A brief summary of statistical analysis methods developed in (Pham and Einerson, 2010) for AGR-1 measured data qualification within the NGNP Data Management and Analysis System (NDMAS) is also included for completeness. Chapters 2-3 describe and discuss cases in which the combined use of experimental and simulation data is realized. A set of issues associated with measurement and modeling uncertainties resulting from the combined analysis is identified.
This includes demonstration that such a combined analysis led to important insights for reducing uncertainty in the presentation of AGR-1 measured data (Chapter 2) and the interpretation of simulation results (Chapter 3). The statistics-based, simulation-aided experimental control procedure described for the future AGR tests is developed and demonstrated in Chapter 4. The procedure for controlling the target fuel temperature (capsule peak or average) is based on regression functions of thermocouple readings and other relevant parameters, accounting for possible changes in both physical and thermal conditions and in instrument performance.
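The regression relationships of Chapter 4 might look roughly like the following: an unmeasurable target (peak fuel temperature) is predicted from thermocouple readings via a linear model trained on simulation output. All numbers here are synthetic stand-ins, not AGR-1 data.

```python
import numpy as np

rng = np.random.default_rng(2)
tc = rng.uniform(900.0, 1100.0, size=(50, 3))                   # three TC readings per run
peak = 1.1 * tc.mean(axis=1) + 80.0 + rng.normal(0.0, 5.0, 50)  # simulated "true" peak temp

X = np.column_stack([np.ones(50), tc])                          # intercept + readings
beta, *_ = np.linalg.lstsq(X, peak, rcond=None)

predicted = X @ beta
rmse = float(np.sqrt(np.mean((peak - predicted) ** 2)))
```

Once fitted, such a relationship lets operators steer the (unobservable) target temperature by monitoring only the thermocouples, which is the control idea the report describes.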

  4. 28 CFR Appendix D to Part 61 - Office of Justice Assistance, Research, and Statistics Procedures Relating to the Implementation...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., and Statistics Procedures Relating to the Implementation of the National Environmental Policy Act D... Assistance, Research, and Statistics Procedures Relating to the Implementation of the National Environmental... Statistics (OJARS) assists State and local units of government in strengthening and improving law enforcement...

  5. 28 CFR Appendix D to Part 61 - Office of Justice Assistance, Research, and Statistics Procedures Relating to the Implementation...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., and Statistics Procedures Relating to the Implementation of the National Environmental Policy Act D... Assistance, Research, and Statistics Procedures Relating to the Implementation of the National Environmental... Statistics (OJARS) assists State and local units of government in strengthening and improving law enforcement...

  6. Why McNemar's Procedure Needs to Be Included in the Business Statistics Curriculum

    ERIC Educational Resources Information Center

    Berenson, Mark L.; Koppel, Nicole B.

    2005-01-01

    In business research situations it is often of interest to examine the differences in the responses in repeated measurements of the same subjects or from among matched or paired subjects. A simple and useful procedure for comparing differences between proportions in two related samples was devised by McNemar (1947) nearly 60 years ago. Although…

  7. Monitoring the use and outcomes of new devices and procedures: how does coding affect what Hospital Episode Statistics contribute? Lessons from 12 emerging procedures 2006-10.

    PubMed

    Patrick, Hannah; Sims, Andrew; Burn, Julie; Bousfield, Derek; Colechin, Elaine; Reay, Christopher; Alderson, Neil; Goode, Stephen; Cunningham, David; Campbell, Bruce

    2013-03-01

    New devices and procedures are often introduced into health services when the evidence base for their efficacy and safety is limited. The authors sought to assess the availability and accuracy of routinely collected Hospital Episodes Statistics (HES) data in the UK and their potential contribution to the monitoring of new procedures. Four years of HES data (April 2006-March 2010) were analysed to identify episodes of hospital care involving a sample of 12 new interventional procedures. HES data were cross checked against other relevant sources including national or local registers and manufacturers' information. HES records were available for all 12 procedures during the entire study period. Comparative data sources were available from national (5), local (2) and manufacturer (2) registers. Factors found to affect comparisons were miscoding, alternative coding and inconsistent use of subsidiary codes. The analysis of provider coverage showed that HES is sensitive at detecting centres which carry out procedures, but specificity is poor in some cases. Routinely collected HES data have the potential to support quality improvements and evidence-based commissioning of devices and procedures in health services but achievement of this potential depends upon the accurate coding of procedures.

  8. Randomised controlled trial to assess the effect of a Just-in-Time training on procedural performance: a proof-of-concept study to address procedural skill decay.

    PubMed

    Branzetti, Jeremy B; Adedipe, Adeyinka A; Gittinger, Matthew J; Rosenman, Elizabeth D; Brolliar, Sarah; Chipman, Anne K; Grand, James A; Fernandez, Rosemarie

    2017-11-01

    A subset of high-risk procedures present significant safety threats due to their (1) infrequent occurrence, (2) execution under time constraints and (3) immediate necessity for patient survival. A Just-in-Time (JIT) intervention could provide real-time bedside guidance to improve high-risk procedural performance and address procedural deficits associated with skill decay. To evaluate the impact of a novel JIT intervention on transvenous pacemaker (TVP) placement during a simulated patient event. This was a prospective, randomised controlled study to determine the effect of a JIT intervention on performance of TVP placement. Subjects included board-certified emergency medicine physicians from two hospitals. The JIT intervention consisted of a portable, bedside computer-based procedural adjunct. The primary outcome was performance during a simulated patient encounter requiring TVP placement, as assessed by trained raters using a technical skills checklist. Secondary outcomes included global performance ratings, time to TVP placement, number of critical omissions and System Usability Scale scores (intervention only). Groups were similar at baseline across all outcomes. Compared with the control group, the intervention group demonstrated statistically significant improvement in the technical checklist score (11.45 vs 23.44, p<0.001, Cohen's d effect size 4.64), the global rating scale (2.27 vs 4.54, p<0.001, Cohen's d effect size 3.76), and a statistically significant reduction in critical omissions (2.23 vs 0.68, p<0.001, Cohen's d effect size -1.86). The difference in time to procedural completion was not statistically significant between conditions (11.15 min vs 12.80 min, p=0.12, Cohen's d effect size 0.65). System Usability Scale scores demonstrated excellent usability. A JIT intervention improved procedure performance, suggesting a role for JIT interventions in rarely performed procedures.
© Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
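For readers unfamiliar with the effect sizes reported above, a pooled-standard-deviation Cohen's d can be computed as below; the two samples are invented illustration data, not the trial's measurements.

```python
import numpy as np

def cohens_d(x, y):
    """Pooled-standard-deviation Cohen's d for two independent samples."""
    nx, ny = len(x), len(y)
    pooled_sd = np.sqrt(((nx - 1) * np.var(x, ddof=1) + (ny - 1) * np.var(y, ddof=1))
                        / (nx + ny - 2))
    return (np.mean(x) - np.mean(y)) / pooled_sd

# Invented checklist scores for two hypothetical groups.
control = np.array([10.0, 12.0, 11.0, 13.0, 12.0, 11.0])
intervention = np.array([20.0, 22.0, 21.0, 23.0, 24.0, 22.0])
d = cohens_d(intervention, control)
```

Values above 0.8 are conventionally called large, which puts the trial's checklist effect (d = 4.64) well beyond that threshold.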

  9. 15-Year-Experience of a Knee Arthroscopist

    PubMed Central

    Tatari, Mehmet Hasan; Bektaş, Yunus Emre; Demirkıran, Demirhan; Ellidokuz, Hülya

    2014-01-01

    Objectives: Arthroscopic knee surgery is an experience-demanding procedure throughout its diagnostic and reconstructive parts. Although the literature says that there should be no need for diagnostic arthroscopy today, most arthroscopic surgeons have gained experience and developed their skills with the help of diagnostic arthroscopy and some basic procedures like debridement and lavage. The purpose of this study was to observe what happened over the 15-year experience of an orthopaedic surgeon who deals with knee arthroscopy. The hypothesis was that the mean age of the patients who had undergone arthroscopic procedures would decrease, the percentage of diagnostic and debridement applications would diminish, and reconstructive procedures would increase. Methods: For this purpose, 959 patients who had undergone knee arthroscopy over 15 years were evaluated retrospectively. The gender, age, operation year and procedure applied were recorded in an Excel file. The chi-square test was used for statistical evaluation. The patients were divided into three groups according to the year they were operated on. Period 1 included the patients operated on between 1999-2003, Period 2 between 2004-2008 and Period 3 between 2009-2013. According to their ages, the patients were evaluated in three groups: Group 1 included patients ≤ 25 years old, while Group 2 covered ages 26-40 and Group 3 ages ≥ 41. Arthroscopic procedures were evaluated in three groups. Group X: meniscectomy, chondral debridement, lavage, synovectomy, loose body removal. Group Y: ACL and PCL reconstruction, meniscal repair. Group Z: microfracture, lateral release, meniscal normalization, second-look arthroscopy, diagnostic arthroscopy before osteotomy. Results: Among all patients, 60% were male, and Group 3 (45.4%) was the largest group. The procedures in Group X were used in most of the operations (59.2%).
The number of patients increased gradually across the periods: 24% in Period 1, 36.6% in Period 2 and 39.4% in Period 3. While Group 3 was the largest age group in the first two periods, Group 2 was the largest in the last period (p < 0.001). While the male/female ratio was statistically insignificant in Periods 1 and 2, the number of males in Period 3 was statistically higher than the number of females (p < 0.001). The procedures in Group Y were used significantly more often for males in Periods 2 and 3 (p < 0.001). Overall, the procedures in Group X were used significantly more often for females (p < 0.001), while those in Group Y were applied more often for males (p < 0.001). Among all arthroscopic procedures, Group X was the leader in Period 1 (85%), but this frequency decreased throughout the years, and the procedures in Group Y increased gradually more than twofold, constituting more than half of the procedures in Period 3 (p < 0.001). Conclusion: Throughout the years, the age of the patients undergoing arthroscopic procedures and the percentage of debridement and diagnostic procedures have decreased, while the number of patients and the number of reconstructive procedures, especially for males, have increased. The results were statistically significant. In our opinion, this statistical trend reflects the usual academic development of an orthopaedic surgeon who deals mostly with knee arthroscopy in daily practice, and it can serve as a guide for young arthroscopists.

  10. 75 FR 38871 - Proposed Collection; Comment Request for Revenue Procedure 2004-29

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-06

    ... comments concerning Revenue Procedure 2004-29, Statistical Sampling in Sec. 274 Context. DATES: Written... Internet, at [email protected] . SUPPLEMENTARY INFORMATION: Title: Statistical Sampling in Sec...: Revenue Procedure 2004-29 prescribes the statistical sampling methodology by which taxpayers under...

  11. Improving the efficiency of the cardiac catheterization laboratories through understanding the stochastic behavior of the scheduled procedures.

    PubMed

    Stepaniak, Pieter S; Soliman Hamad, Mohamed A; Dekker, Lukas R C; Koolen, Jacques J

    2014-01-01

    In this study, we sought to analyze the stochastic behavior of Catheterization Laboratory (Cath Lab) procedures in our institution. Statistical models may help to improve estimated case durations to support management in the cost-effective use of expensive surgical resources. We retrospectively analyzed all the procedures performed in the Cath Labs in 2012. Procedure durations are strictly positive (larger than zero) and mostly have a large minimum duration. Because of the strictly positive character of Cath Lab procedures, a lognormal model may be a desirable fit. Having a minimum duration requires an estimate of the threshold (shift) parameter of the lognormal model; therefore, the 3-parameter lognormal model is of interest. To avoid heterogeneous groups of observations, we tested every group-cardiologist-procedure combination for the normal, 2- and 3-parameter lognormal distributions. The total number of elective and emergency procedures performed was 6,393 (8,186 h). The final analysis included 6,135 procedures (7,779 h). Electrophysiology (intervention) procedures fit the 3-parameter lognormal model in 86.1% (80.1%) of cases. Using Friedman test statistics, we conclude that the 3-parameter lognormal model is superior to the 2-parameter lognormal model, and the 2-parameter lognormal model is superior to the normal model. Cath Lab procedures are well modelled by lognormal models. This information helps to improve and refine Cath Lab schedules and hence their efficient use.
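In scipy's parameterisation of the lognormal, the `loc` parameter plays the role of the threshold (shift) parameter of the 3-parameter model described above. A hedged sketch on simulated durations follows; maximum likelihood estimation of the threshold can be numerically delicate, so treat this as illustration only, and note that the minimum-duration value and sample are invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
min_duration = 20.0                                   # assumed fixed setup time, minutes
durations = min_duration + rng.lognormal(mean=3.0, sigma=0.4, size=500)

# Three-parameter fit: shape ~ sigma, loc ~ threshold, scale ~ exp(mu).
shape, loc, scale = stats.lognorm.fit(durations)
log_data = np.log(durations - loc)                    # back on the (approx.) normal scale
```

Per-combination fits like this, repeated over every group-cardiologist-procedure cell, are what the distributional comparison in the study rests on.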

  12. 75 FR 53738 - Proposed Collection; Comment Request for Rev. Proc. 2007-35

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-01

    ... Revenue Procedure Revenue Procedure 2007-35, Statistical Sampling for purposes of Section 199. DATES... through the Internet, at [email protected] . SUPPLEMENTARY INFORMATION: Title: Statistical Sampling...: This revenue procedure provides for determining when statistical sampling may be used in purposes of...

  13. Statistical assessment of the learning curves of health technologies.

    PubMed

    Ramsay, C R; Grant, A M; Wallace, S A; Garthwaite, P H; Monk, A F; Russell, I T

    2001-01-01

    (1) To describe systematically studies that directly assessed the learning curve effect of health technologies. (2) To identify systematically 'novel' statistical techniques applied to learning curve data in other fields, such as psychology and manufacturing. (3) To test these statistical techniques on data sets from studies of varying designs to assess health technologies in which learning curve effects are known to exist. METHODS - STUDY SELECTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): For a study to be included, it had to include a formal analysis of the learning curve of a health technology using a graphical, tabular or statistical technique. METHODS - STUDY SELECTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): For a study to be included, it had to include a formal assessment of a learning curve using a statistical technique that had not been identified in the previous search. METHODS - DATA SOURCES: Six clinical and 16 non-clinical biomedical databases were searched. A limited amount of handsearching and scanning of reference lists was also undertaken. METHODS - DATA EXTRACTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): A number of study characteristics were abstracted from the papers, such as study design, study size, number of operators and the statistical method used. METHODS - DATA EXTRACTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): The new statistical techniques identified were categorised into four subgroups of increasing complexity: exploratory data analysis; simple series data analysis; complex data structure analysis; generic techniques. METHODS - TESTING OF STATISTICAL METHODS: Some of the statistical methods identified in the systematic searches for single (simple) operator series data and for multiple (complex) operator series data were illustrated and explored using three data sets.
The first was a case series of 190 consecutive laparoscopic fundoplication procedures performed by a single surgeon; the second was a case series of consecutive laparoscopic cholecystectomy procedures performed by ten surgeons; the third was randomised trial data derived from the laparoscopic procedure arm of a multicentre trial of groin hernia repair, supplemented by data from non-randomised operations performed during the trial. RESULTS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: Of 4571 abstracts identified, 272 (6%) were later included in the study after review of the full paper. Some 51% of studies assessed a surgical minimal access technique and 95% were case series. The statistical method used most often (60%) was splitting the data into consecutive parts (such as halves or thirds), with only 14% attempting a more formal statistical analysis. The reporting of the studies was poor, with 31% giving no details of data collection methods. RESULTS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: Of 9431 abstracts assessed, 115 (1%) were deemed appropriate for further investigation and, of these, 18 were included in the study. All of the methods for complex data sets were identified in the non-clinical literature. These were discriminant analysis, two-stage estimation of learning rates, generalised estimating equations, multilevel models, latent curve models, time series models and stochastic parameter models. In addition, eight new shapes of learning curves were identified. RESULTS - TESTING OF STATISTICAL METHODS: No one particular shape of learning curve performed significantly better than another. The performance of 'operation time' as a proxy for learning differed between the three procedures. Multilevel modelling using the laparoscopic cholecystectomy data demonstrated and measured surgeon-specific and confounding effects. 
The inclusion of non-randomised cases, despite the possible limitations of the method, enhanced the interpretation of learning effects. CONCLUSIONS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: The statistical methods used for assessing learning effects in health technology assessment have been crude and the reporting of studies poor. CONCLUSIONS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: A number of statistical methods for assessing learning effects were identified that had not hitherto been used in health technology assessment. There was a hierarchy of methods for the identification and measurement of learning, and the more sophisticated methods for both have had little if any use in health technology assessment. This demonstrated the value of considering fields outside clinical research when addressing methodological issues in health technology assessment. CONCLUSIONS - TESTING OF STATISTICAL METHODS: It has been demonstrated that the portfolio of techniques identified can enhance investigations of learning curve effects. (ABSTRACT TRUNCATED)
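In the spirit of the simpler single-operator analyses reviewed above, a power-law learning curve can be fitted to operation time against case number by linear regression on the log-log scale. The simulated series below is an assumption for illustration, not the report's case-series data, and the richer multilevel models it recommends are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(11)
case = np.arange(1, 191, dtype=float)                     # 190 consecutive procedures
time = 120.0 * case ** -0.2 * rng.lognormal(0.0, 0.1, case.size)

# log(time) = log(c) + slope * log(case): ordinary least squares on the log scale.
slope, intercept = np.polyfit(np.log(case), np.log(time), 1)
learning_rate = 2.0 ** slope   # proportional change in time per doubling of experience
```

A learning rate below 1 means each doubling of experience shortens the operation by a fixed proportion, one of the classic curve shapes catalogued in the review.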

  14. Statistical analysis of the calibration procedure for personnel radiation measurement instruments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bush, W.J.; Bengston, S.J.; Kalbeitzer, F.L.

    1980-11-01

    Thermoluminescent analyzer (TLA) calibration procedures were used to estimate personnel radiation exposure levels at the Idaho National Engineering Laboratory (INEL). A statistical analysis is presented herein based on data collected over a six-month period in 1979 on four TLAs located in the Department of Energy (DOE) Radiological and Environmental Sciences Laboratory at the INEL. The data were collected according to the day-to-day procedure in effect at that time. Both gamma and beta radiation models are developed. Observed TLA readings of thermoluminescent dosimeters are correlated with known radiation levels. This correlation is then used to predict unknown radiation doses from future analyzer readings of personnel thermoluminescent dosimeters. The statistical techniques applied in this analysis include weighted linear regression, estimation of systematic and random error variances, prediction interval estimation using Scheffe's theory of calibration, estimation of the ratio of the means of two bivariate normally distributed random variables and their corresponding confidence limits according to Kendall and Stuart, tests of normality, experimental design, a comparison between instruments, and quality control.
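An assumed toy version of the calibration step: weighted least squares relating analyzer readings to known delivered doses, followed by inverse (classical) prediction of an unknown dose from a new reading. All numbers are invented, and the prediction-interval machinery from Scheffe's theory is omitted.

```python
import numpy as np

known_dose = np.array([10.0, 20.0, 50.0, 100.0, 200.0])   # delivered doses
reading = np.array([21.0, 39.5, 101.0, 198.0, 405.0])     # analyzer output
weights = 1.0 / known_dose                                 # assume variance grows with dose

# Weighted least squares fit: reading = a + b * dose.
b, a = np.polyfit(known_dose, reading, deg=1, w=np.sqrt(weights))

def dose_from_reading(r):
    """Inverse prediction (classical calibration estimator)."""
    return (r - a) / b

estimate = dose_from_reading(150.0)
```

Weighting by the reciprocal dose down-weights the noisier high-dose points, the same motivation the abstract gives for using weighted rather than ordinary regression.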

  15. Statistical methodology for the analysis of dye-switch microarray experiments

    PubMed Central

    Mary-Huard, Tristan; Aubert, Julie; Mansouri-Attia, Nadera; Sandra, Olivier; Daudin, Jean-Jacques

    2008-01-01

    Background In individually dye-balanced microarray designs, each biological sample is hybridized on two different slides, once with Cy3 and once with Cy5. While this strategy ensures an automatic correction of the gene-specific labelling bias, it also induces dependencies between log-ratio measurements that must be taken into account in the statistical analysis. Results We present two original procedures for the statistical analysis of individually dye-balanced designs. These procedures are compared with the usual ML and REML mixed model procedures proposed in most statistical toolboxes, on both simulated and real data. Conclusion The UP procedure we propose as an alternative to the usual mixed model procedures is more efficient and significantly faster to compute. This result provides some useful guidelines for the analysis of complex designs. PMID:18271965

  16. Expected p-values in light of an ROC curve analysis applied to optimal multiple testing procedures.

    PubMed

    Vexler, Albert; Yu, Jihnhee; Zhao, Yang; Hutson, Alan D; Gurevich, Gregory

    2017-01-01

    Many statistical studies report p-values for inferential purposes. In several scenarios, the stochastic aspect of p-values is neglected, which may contribute to drawing wrong conclusions in real data experiments. The stochastic nature of p-values makes it difficult to use them to examine the performance of given testing procedures or the associations between investigated factors. We turn our focus to the modern statistical literature to address the expected p-value (EPV) as a measure of the performance of decision-making rules. During the course of our study, we prove that the EPV can be considered in the context of receiver operating characteristic (ROC) curve analysis, a well-established biostatistical methodology. The ROC-based framework provides a new and efficient methodology for investigating and constructing statistical decision-making procedures, including: (1) evaluation and visualization of properties of the testing mechanisms, considering, e.g., partial EPVs; (2) developing optimal tests via the minimization of EPVs; (3) creation of novel methods for optimally combining multiple test statistics. We demonstrate that the proposed EPV-based approach allows us to maximize the integrated power of testing algorithms with respect to various significance levels. In an application, we use the proposed method to construct the optimal test and analyze a myocardial infarction disease dataset. We outline the usefulness of the "EPV/ROC" technique for evaluating different decision-making procedures, their construction, and their properties with an eye towards practical applications.
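
    The EPV/ROC connection described in this record can be illustrated with a small Monte Carlo sketch (not the authors' implementation): for a test that rejects for large values of a statistic T, the expected p-value under the alternative equals P(T_null >= T_alt), i.e. one minus the area under the ROC curve separating the null and alternative distributions of T.

```python
# Monte Carlo sketch of the expected p-value (EPV) as an ROC-type quantity.
# The Gaussian shift alternative below is invented for illustration.
import random

random.seed(0)
N = 20000
t_null = [random.gauss(0.0, 1.0) for _ in range(N)]  # T under H0
t_alt  = [random.gauss(1.0, 1.0) for _ in range(N)]  # T under H1 (unit shift)

# EPV estimated directly as P(T0 >= T1) over independent (null, alt) pairs;
# this equals 1 - AUC for the ROC curve of T.
epv = sum(t0 >= t1 for t0, t1 in zip(t_null, t_alt)) / N
print(round(epv, 3))  # smaller EPV = better-separated test statistic
```

    A more powerful test statistic pushes the alternative distribution further from the null, which raises the AUC and lowers the EPV.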

  17. Engineering Students Designing a Statistical Procedure for Quantifying Variability

    ERIC Educational Resources Information Center

    Hjalmarson, Margret A.

    2007-01-01

    The study examined first-year engineering students' responses to a statistics task that asked them to generate a procedure for quantifying variability in a data set from an engineering context. Teams used technological tools to perform computations, and their final product was a ranking procedure. The students could use any statistical measures,…

  18. A comparative study of restricted randomization procedures for multiarm trials with equal or unequal treatment allocation ratios.

    PubMed

    Ryeznik, Yevgen; Sverdlov, Oleksandr

    2018-06-04

    Randomization designs for multiarm clinical trials are increasingly used in practice, especially in phase II dose-ranging studies. Many new methods have been proposed in the literature; however, there is a lack of systematic, head-to-head comparison of the competing designs. In this paper, we systematically investigate statistical properties of various restricted randomization procedures for multiarm trials with fixed and possibly unequal allocation ratios. The design operating characteristics include measures of allocation balance, randomness of treatment assignments, variations in the allocation ratio, and statistical characteristics such as type I error rate and power. The results from the current paper should help clinical investigators select an appropriate randomization procedure for their clinical trial. We also provide a web-based R shiny application that can be used to reproduce all results in this paper and run simulations under additional user-defined experimental scenarios. Copyright © 2018 John Wiley & Sons, Ltd.
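
    One restricted randomization procedure such comparisons typically include can be sketched as follows: permuted-block randomization with a fixed, possibly unequal allocation ratio. The three-arm 2:1:1 setup below is a generic illustration, not one of the paper's specific designs.

```python
# Permuted-block randomization sketch: within each block, the treatment
# labels appear exactly in the target ratio, and the block order is shuffled,
# so the realized allocation never drifts far from 2:1:1.
import random

def permuted_block_sequence(n_subjects, ratio, arms, rng):
    """Yield treatment assignments in shuffled blocks matching `ratio`."""
    block = [arm for arm, k in zip(arms, ratio) for _ in range(k)]
    seq = []
    while len(seq) < n_subjects:
        b = block[:]
        rng.shuffle(b)  # randomize the order within each block
        seq.extend(b)
    return seq[:n_subjects]

rng = random.Random(42)
assignments = permuted_block_sequence(40, ratio=(2, 1, 1), arms="ABC", rng=rng)
print(assignments.count("A"), assignments.count("B"), assignments.count("C"))
# → 20 10 10
```

    The trade-off the paper quantifies is visible even here: small blocks give tight balance but make upcoming assignments more predictable, while larger blocks (or other restricted procedures) trade balance for randomness.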

  19. An evaluation of three statistical estimation methods for assessing health policy effects on prescription drug claims.

    PubMed

    Mittal, Manish; Harrison, Donald L; Thompson, David M; Miller, Michael J; Farmer, Kevin C; Ng, Yu-Tze

    2016-01-01

    While the choice of analytical approach affects study results and their interpretation, there is no consensus to guide the choice of statistical approaches to evaluate public health policy change. This study compared and contrasted three statistical estimation procedures in the assessment of a U.S. Food and Drug Administration (FDA) suicidality warning, communicated in January 2008 and implemented in May 2009, on antiepileptic drug (AED) prescription claims. Longitudinal designs were utilized to evaluate Oklahoma (U.S. State) Medicaid claim data from January 2006 through December 2009. The study included 9289 continuously eligible individuals with prevalent diagnoses of epilepsy and/or psychiatric disorder. Segmented regression models using three estimation procedures [i.e., generalized linear models (GLM), generalized estimating equations (GEE), and generalized linear mixed models (GLMM)] were used to estimate trends of AED prescription claims across three time periods: before (January 2006-January 2008); during (February 2008-May 2009); and after (June 2009-December 2009) the FDA warning. All three statistical procedures estimated an increasing trend (P < 0.0001) in AED prescription claims before the FDA warning period. No procedure detected a significant change in trend during (GLM: -30.0%, 99% CI: -60.0% to 10.0%; GEE: -20.0%, 99% CI: -70.0% to 30.0%; GLMM: -23.5%, 99% CI: -58.8% to 1.2%) or after (GLM: 50.0%, 99% CI: -70.0% to 160.0%; GEE: 80.0%, 99% CI: -20.0% to 200.0%; GLMM: 47.1%, 99% CI: -41.2% to 135.3%) the FDA warning when compared to the pre-warning period. Although the three procedures provided consistent inferences, the GEE and GLMM approaches accounted appropriately for correlation. Further, marginal models estimated using GEE produced more robust and valid population-level estimations. Copyright © 2016 Elsevier Inc. All rights reserved.
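
    The segmented-regression design underlying all three estimation procedures can be sketched with ordinary least squares on synthetic data (the GLM/GEE/GLMM machinery and the real claims data are beyond a toy example): an intercept, a pre-period trend, and a slope-change term that switches on after the interruption point.

```python
# Segmented ("interrupted time series") regression sketch, fitted by OLS via
# the normal equations. The monthly series is synthetic, not Medicaid claims.

def solve(A, b):
    """Gaussian elimination with partial pivoting for small linear systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_segmented(ts, ys, t0):
    """OLS for y = b0 + b1*t + b2*max(t - t0, 0)."""
    X = [[1.0, t, max(t - t0, 0.0)] for t in ts]
    k = 3
    XtX = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    Xty = [sum(row[i] * y for row, y in zip(X, ys)) for i in range(k)]
    return solve(XtX, Xty)

ts = list(range(24))                                     # months
ys = [10 + 0.5 * t - 0.8 * max(t - 12, 0) for t in ts]  # trend break at month 12
b0, b1, b2 = fit_segmented(ts, ys, t0=12)
print(round(b1, 2), round(b2, 2))  # pre-trend and post-break slope change
# → 0.5 -0.8
```

    GEE or GLMM estimation of the same design matrix changes how the within-subject correlation is handled, not the segmented structure itself.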

  20. Preservation of Mercury in Polyethylene Containers.

    ERIC Educational Resources Information Center

    Piccolino, Samuel Paul

    1983-01-01

    Reports results of experiments favoring use of 0.5 percent nitric acid with an oxidant (potassium dichromate or potassium permanganate) to preserve samples in polyethylene containers for mercury analysis. Includes procedures used and statistical data obtained from the experiments. (JN)

  1. Efficiency and Safety of One-Step Procedure Combined Laparoscopic Cholecystectomy and Endoscopic Retrograde Cholangiopancreatography for Treatment of Cholecysto-Choledocholithiasis: A Randomized Controlled Trial.

    PubMed

    Liu, Zhiyi; Zhang, Luyao; Liu, Yanling; Gu, Yang; Sun, Tieliang

    2017-11-01

    We aimed to evaluate the efficiency and safety of a one-step procedure combining endoscopic retrograde cholangiopancreatography (ERCP) and laparoscopic cholecystectomy (LC) for the treatment of patients with cholecysto-choledocholithiasis. A prospective randomized study was performed on 63 consecutive cholecysto-choledocholithiasis patients between 2008 and 2011. The efficiency and safety of the one-step procedure were assessed by comparison with the two-step procedure of LC with ERCP + endoscopic sphincterotomy (EST). Outcomes, including intraoperative features and postoperative features (length of stay and postoperative complications), were evaluated. The one- or two-step procedure of LC with ERCP + EST was successfully performed in all patients, and common bile duct stones were completely removed. Statistical analyses showed that length of stay and pulmonary infection rate were significantly lower in the test group than in the control group (P < 0.05), whereas no statistical difference in the other outcomes was found between the two groups (all P > 0.05). The one-step procedure of LC with ERCP + EST is superior to the two-step procedure for the treatment of patients with cholecysto-choledocholithiasis with regard to reduced hospital stay and a lower rate of pulmonary infections, and may therefore be the preferable option for these patients.

  2. Meta-analysis of the technical performance of an imaging procedure: guidelines and statistical methodology.

    PubMed

    Huang, Erich P; Wang, Xiao-Feng; Choudhury, Kingshuk Roy; McShane, Lisa M; Gönen, Mithat; Ye, Jingjing; Buckler, Andrew J; Kinahan, Paul E; Reeves, Anthony P; Jackson, Edward F; Guimaraes, Alexander R; Zahlmann, Gudrun

    2015-02-01

    Medical imaging serves many roles in patient care and the drug approval process, including assessing treatment response and guiding treatment decisions. These roles often involve a quantitative imaging biomarker, an objectively measured characteristic of the underlying anatomic structure or biochemical process derived from medical images. Before a quantitative imaging biomarker is accepted for use in such roles, the imaging procedure to acquire it must undergo evaluation of its technical performance, which entails assessment of performance metrics such as repeatability and reproducibility of the quantitative imaging biomarker. Ideally, this evaluation will involve quantitative summaries of results from multiple studies to overcome limitations due to the typically small sample sizes of technical performance studies and/or to include a broader range of clinical settings and patient populations. This paper is a review of meta-analysis procedures for such an evaluation, including identification of suitable studies, statistical methodology to evaluate and summarize the performance metrics, and complete and transparent reporting of the results. This review addresses challenges typical of meta-analyses of technical performance, particularly small study sizes, which often cause violations of assumptions underlying standard meta-analysis techniques. Alternative approaches to address these difficulties are also presented; simulation studies indicate that they outperform standard techniques when some studies are small. The meta-analysis procedures presented are also applied to actual [18F]-fluorodeoxyglucose positron emission tomography (FDG-PET) test-retest repeatability data for illustrative purposes.
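
    As a hedged sketch of the kind of pooling step such a meta-analysis builds on, the following implements the standard DerSimonian-Laird random-effects estimator; the small-sample alternatives the review favors are more involved, and the effect sizes and variances below are invented, not FDG-PET data.

```python
# DerSimonian-Laird random-effects pooling of per-study performance estimates.

def dersimonian_laird(effects, variances):
    """Return (pooled effect, between-study variance tau^2)."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)            # truncated at zero
    wr = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(wr, effects)) / sum(wr)
    return pooled, tau2

# e.g. within-subject coefficients of variation (%) from five test-retest
# studies, with their estimated variances (all numbers illustrative)
effects   = [9.0, 11.5, 10.2, 14.0, 8.5]
variances = [1.2, 2.0, 0.8, 3.5, 1.0]
pooled, tau2 = dersimonian_laird(effects, variances)
print(round(pooled, 2), round(tau2, 2))
```

    With few, small studies the tau^2 estimate is unstable, which is exactly the difficulty that motivates the alternative approaches the review presents.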

  3. Meta-analysis of the technical performance of an imaging procedure: Guidelines and statistical methodology

    PubMed Central

    Huang, Erich P; Wang, Xiao-Feng; Choudhury, Kingshuk Roy; McShane, Lisa M; Gönen, Mithat; Ye, Jingjing; Buckler, Andrew J; Kinahan, Paul E; Reeves, Anthony P; Jackson, Edward F; Guimaraes, Alexander R; Zahlmann, Gudrun

    2017-01-01

    Medical imaging serves many roles in patient care and the drug approval process, including assessing treatment response and guiding treatment decisions. These roles often involve a quantitative imaging biomarker, an objectively measured characteristic of the underlying anatomic structure or biochemical process derived from medical images. Before a quantitative imaging biomarker is accepted for use in such roles, the imaging procedure to acquire it must undergo evaluation of its technical performance, which entails assessment of performance metrics such as repeatability and reproducibility of the quantitative imaging biomarker. Ideally, this evaluation will involve quantitative summaries of results from multiple studies to overcome limitations due to the typically small sample sizes of technical performance studies and/or to include a broader range of clinical settings and patient populations. This paper is a review of meta-analysis procedures for such an evaluation, including identification of suitable studies, statistical methodology to evaluate and summarize the performance metrics, and complete and transparent reporting of the results. This review addresses challenges typical of meta-analyses of technical performance, particularly small study sizes, which often cause violations of assumptions underlying standard meta-analysis techniques. Alternative approaches to address these difficulties are also presented; simulation studies indicate that they outperform standard techniques when some studies are small. The meta-analysis procedures presented are also applied to actual [18F]-fluorodeoxyglucose positron emission tomography (FDG-PET) test–retest repeatability data for illustrative purposes. PMID:24872353

  4. 7 CFR 52.38c - Statistical sampling procedures for lot inspection of processed fruits and vegetables by attributes.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 2 2011-01-01 2011-01-01 false Statistical sampling procedures for lot inspection of processed fruits and vegetables by attributes. 52.38c Section 52.38c Agriculture Regulations of the... Regulations Governing Inspection and Certification Sampling § 52.38c Statistical sampling procedures for lot...

  5. 7 CFR 52.38b - Statistical sampling procedures for on-line inspection by attributes of processed fruits and...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 2 2011-01-01 2011-01-01 false Statistical sampling procedures for on-line inspection by attributes of processed fruits and vegetables. 52.38b Section 52.38b Agriculture Regulations of... Regulations Governing Inspection and Certification Sampling § 52.38b Statistical sampling procedures for on...

  6. 75 FR 79320 - Animal Drugs, Feeds, and Related Products; Regulation of Carcinogenic Compounds in Food-Producing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-20

    ... is calculated from tumor data of the cancer bioassays using a statistical extrapolation procedure... carcinogenic concern currently set forth in Sec. 500.84 utilizes a statistical extrapolation procedure that... procedures did not rely on a statistical extrapolation of the data to a 1 in 1 million risk of cancer to test...

  7. 7 CFR 52.38b - Statistical sampling procedures for on-line inspection by attributes of processed fruits and...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Statistical sampling procedures for on-line inspection by attributes of processed fruits and vegetables. 52.38b Section 52.38b Agriculture Regulations of... Regulations Governing Inspection and Certification Sampling § 52.38b Statistical sampling procedures for on...

  8. 7 CFR 52.38c - Statistical sampling procedures for lot inspection of processed fruits and vegetables by attributes.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Statistical sampling procedures for lot inspection of processed fruits and vegetables by attributes. 52.38c Section 52.38c Agriculture Regulations of the... Regulations Governing Inspection and Certification Sampling § 52.38c Statistical sampling procedures for lot...

  9. Risk of miscarriage following amniocentesis and chorionic villus sampling: a systematic review of the literature.

    PubMed

    Beta, Jaroslaw; Lesmes-Heredia, Cristina; Bedetti, Chiara; Akolekar, Ranjit

    2018-04-01

    The aim of this paper was to estimate the risk of miscarriage after amniocentesis or chorionic villus sampling (CVS) based on a systematic review of the literature. A search of Medline, Embase, and The Cochrane Library (2000-2017) was carried out to identify studies reporting complications following CVS or amniocentesis. The inclusion criteria for the systematic review were studies reporting results from large controlled studies (n ≥ 1000 invasive procedures) and those reporting data for pregnancy loss prior to 24 weeks' gestation. Data for cases that had an invasive procedure and controls were entered in contingency tables, and the risk of miscarriage was estimated for each study. Summary statistics were calculated after taking into account the weighting for each study included in the systematic review. Procedure-related risk of miscarriage was estimated as a weighted risk difference from the summary statistics for cases and controls. The electronic search of the databases yielded 2465 potential citations, of which 2431 were excluded, leaving 34 studies for full-text review. The final review included 10 studies for amniocentesis and 6 studies for CVS, which were used to estimate the risk of miscarriage in pregnancies that had an invasive procedure and in control pregnancies that did not. The procedure-related risk of miscarriage following amniocentesis was 0.35% (95% confidence interval [CI]: 0.07 to 0.63) and that following CVS was 0.35% (95% CI: -0.31 to 1.00). The procedure-related risks of miscarriage following amniocentesis and CVS are lower than currently quoted to women.
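
    The weighted-risk-difference summary described above can be sketched as follows; the per-study counts are invented, not the review's data, and the pooling shown is simple inverse-variance weighting of per-study risk differences.

```python
# Pooled risk difference sketch: compute each study's risk difference between
# procedure and control groups, then combine with inverse-variance weights.

def risk_difference(events_t, n_t, events_c, n_c):
    """Risk difference and its (binomial) variance for one study."""
    p_t, p_c = events_t / n_t, events_c / n_c
    var = p_t * (1 - p_t) / n_t + p_c * (1 - p_c) / n_c
    return p_t - p_c, var

# (procedure losses, procedure n, control losses, control n) per study
studies = [(12, 1500, 10, 1500), (25, 3000, 18, 3000), (8, 1200, 7, 1200)]
rds, vars_ = zip(*(risk_difference(*s) for s in studies))
w = [1 / v for v in vars_]
pooled_rd = sum(wi * rd for wi, rd in zip(w, rds)) / sum(w)
print(f"pooled risk difference: {100 * pooled_rd:.2f}%")
```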

  10. A General Procedure to Assess the Internal Structure of a Noncognitive Measure--The Student360 Insight Program (S360) Time Management Scale. Research Report. ETS RR-11-42

    ERIC Educational Resources Information Center

    Ling, Guangming; Rijmen, Frank

    2011-01-01

    The factorial structure of the Time Management (TM) scale of the Student 360: Insight Program (S360) was evaluated based on a national sample. A general procedure with a variety of methods was introduced and implemented, including the computation of descriptive statistics, exploratory factor analysis (EFA), and confirmatory factor analysis (CFA).…

  11. TRAN-STAT: statistics for environmental transuranic studies, July 1978, Number 5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This issue is concerned with nonparametric procedures for (1) estimating the central tendency of a population, (2) describing data sets through estimating percentiles, (3) estimating confidence limits for the median and other percentiles, (4) estimating tolerance limits and associated numbers of samples, and (5) tests of significance and associated procedures for a variety of testing situations (counterparts to t-tests and analysis of variance). Some characteristics of several nonparametric tests are illustrated using the NAEG ²⁴¹Am aliquot data presented and discussed in the April issue of TRAN-STAT. Some of the statistical terms used here are defined in a glossary. The reference list also includes short descriptions of nonparametric books. 31 references, 3 figures, 1 table.
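
    One of the nonparametric procedures listed, a distribution-free confidence interval for the median, can be sketched from order statistics: choose the symmetric ranks whose Binomial(n, 1/2) coverage still meets the nominal level. The data below are synthetic, not the NAEG aliquot values.

```python
# Order-statistic confidence interval for the population median: the interval
# (x_(k), x_(n-k+1)) has exact coverage 1 - 2*P(Binomial(n, 1/2) < k),
# independent of the underlying distribution.
from math import comb

def median_ci(data, conf=0.95):
    """Return (lower, upper, exact coverage) for the median."""
    x = sorted(data)
    n = len(x)
    best = None
    for k in range(1, n // 2 + 1):
        coverage = 1 - 2 * sum(comb(n, i) for i in range(k)) / 2 ** n
        if coverage >= conf:
            best = (x[k - 1], x[n - k], coverage)  # narrowest valid interval
    return best

data = list(range(1, 21))     # 20 synthetic measurements
lo, hi, cov = median_ci(data)
print(lo, hi, round(cov, 3))  # → 6 15 0.959
```

    Because the coverage comes from the binomial distribution of signs, no normality assumption is needed, which is the point of the nonparametric counterparts this issue surveys.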

  12. Comparative evaluation of stress levels before, during, and after periodontal surgical procedures with and without nitrous oxide-oxygen inhalation sedation.

    PubMed

    Sandhu, Gurkirat; Khinda, Paramjit Kaur; Gill, Amarjit Singh; Singh Khinda, Vineet Inder; Baghi, Kamal; Chahal, Gurparkash Singh

    2017-01-01

    Periodontal surgical procedures produce varying degrees of stress in all patients. Nitrous oxide-oxygen inhalation sedation is very effective for adult patients with mild-to-moderate anxiety due to dental procedures and needle phobia. The present study was designed to perform periodontal surgical procedures under nitrous oxide-oxygen inhalation sedation and assess whether this technique actually reduces stress physiologically, in comparison to local anesthesia alone (LA), during lengthy periodontal surgical procedures. A total of 16 patients were selected for this randomized, split-mouth, cross-over study. One surgical session (SS) was performed under local anesthesia aided by nitrous oxide-oxygen inhalation sedation, and the other SS was performed on the contralateral quadrant under LA. For each session, blood samples to measure and evaluate serum cortisol levels were obtained, and vital parameters including blood pressure, heart rate, respiratory rate, and arterial blood oxygen saturation were monitored before, during, and after the periodontal surgical procedures. Statistical analysis used the paired t-test and repeated-measures ANOVA. The findings of the present study revealed a statistically significant decrease in serum cortisol levels, blood pressure, and pulse rate, and a statistically significant increase in respiratory rate and arterial blood oxygen saturation during periodontal surgical procedures under nitrous oxide inhalation sedation. Nitrous oxide-oxygen inhalation sedation for periodontal surgical procedures is capable of reducing stress physiologically, in comparison to LA, during lengthy periodontal surgical procedures.

  13. Descriptive study of perioperative analgesic medications associated with general anesthesia for dental rehabilitation of children.

    PubMed

    Carter, Laura; Wilson, Stephen; Tumer, Erwin G

    2010-01-01

    The purpose of this retrospective chart review was to document sedation and analgesic medications administered preoperatively, intraoperatively, and during postanesthesia care for children undergoing dental rehabilitation using general anesthesia (GA). Patient gender, age, procedure type performed, and ASA status were recorded from the medical charts of children undergoing GA for dental rehabilitation. The sedative and analgesic drugs administered pre-, intra-, and postoperatively were recorded. Statistical analysis included descriptive statistics and cross-tabulation. A sample of 115 patients with a mean age of 64 (+/-30) months was studied; 47% were females, and 71% were healthy. Over 80% of the patients were administered medications primarily during pre- and intraoperative phases, with fewer than 25% receiving medications postoperatively. Morphine and fentanyl were the most frequently administered agents intraoperatively. The procedure type, gender, and health status were not statistically associated with the number of agents administered. Younger patients, however, were statistically more likely to receive additional analgesic medications. Our study suggests that a minority of patients have postoperative discomfort in the postanesthesia care unit; mild to moderate analgesics were administered during intraoperative phases of dental rehabilitation.

  14. Uncertainty Analysis of Inertial Model Attitude Sensor Calibration and Application with a Recommended New Calibration Method

    NASA Technical Reports Server (NTRS)

    Tripp, John S.; Tcheng, Ping

    1999-01-01

    Statistical tools, previously developed for nonlinear least-squares estimation of multivariate sensor calibration parameters and the associated calibration uncertainty analysis, have been applied to single- and multiple-axis inertial model attitude sensors used in wind tunnel testing to measure angle of attack and roll angle. The analysis provides confidence and prediction intervals of calibrated sensor measurement uncertainty as functions of applied input pitch and roll angles. A comparative performance study of various experimental designs for inertial sensor calibration is presented along with corroborating experimental data. The importance of replicated calibrations over extended time periods has been emphasized; replication provides independent estimates of calibration precision and bias uncertainties, statistical tests for calibration or modeling bias uncertainty, and statistical tests for sensor parameter drift over time. A set of recommendations for a new standardized model attitude sensor calibration method and usage procedures is included. The statistical information provided by these procedures is necessary for the uncertainty analysis of aerospace test results now required by users of industrial wind tunnel test facilities.

  15. An Apple for the Librarian: The OUC Experience.

    ERIC Educational Resources Information Center

    Planton, Stanley; Phillips, Susan

    1986-01-01

    Describes computerization of routine library procedures on Apple microcomputers at a small regional campus of Ohio University. Highlights include use of a database management program--PFS:FILE--for acquisition lists, equipment/supplies inventory, microfilm and periodicals management, and statistical manipulations, and a spreadsheet…

  16. Avoid Age Discrimination.

    ERIC Educational Resources Information Center

    Bernstein, Michael I.

    1982-01-01

    Steps a school board can take to minimize the risk of age discrimination suits include reviewing all written policies, forms, files, and collective bargaining agreements for age discriminatory items; preparing a detailed statistical analysis of the age of personnel; and reviewing reduction-in-force procedures. (Author/MLF)

  17. Statistics for nuclear engineers and scientists. Part 1. Basic statistical inference

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beggs, W.J.

    1981-02-01

    This report is intended for the use of engineers and scientists working in the nuclear industry, especially at the Bettis Atomic Power Laboratory. It serves as the basis for several Bettis in-house statistics courses. The objectives of the report are to introduce the reader to the language and concepts of statistics and to provide a basic set of techniques to apply to problems of the collection and analysis of data. Part 1 covers subjects of basic inference. The subjects include: descriptive statistics; probability; simple inference for normally distributed populations, and for non-normal populations as well; comparison of two populations; the analysis of variance; quality control procedures; and linear regression analysis.
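
    In the spirit of the report's quality-control topic, a minimal Shewhart-style chart check might look like the following; it is a simplified sketch (limits taken from the standard deviation of the subgroup means rather than the usual R-bar/d2 estimate), with invented numbers.

```python
# Shewhart x-bar chart sketch: center line at the grand mean, control limits
# at +/- 3 standard deviations of the subgroup means, then flag any point
# falling outside the limits.
from statistics import mean, stdev

subgroup_means = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.1, 9.7, 10.0, 10.4]
center = mean(subgroup_means)
s = stdev(subgroup_means)
ucl, lcl = center + 3 * s, center - 3 * s
out_of_control = [m for m in subgroup_means if not lcl <= m <= ucl]
print(round(center, 2), round(lcl, 2), round(ucl, 2), out_of_control)
```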

  18. 40 CFR Appendix XVIII to Part 86 - Statistical Outlier Identification Procedure for Light-Duty Vehicles and Light Light-Duty Trucks...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 19 2011-07-01 2011-07-01 false Statistical Outlier Identification... (CONTINUED) Pt. 86, App. XVIII Appendix XVIII to Part 86—Statistical Outlier Identification Procedure for..., but suffer theoretical deficiencies if statistical significance tests are required. Consequently, the...

  19. 40 CFR Appendix XVIII to Part 86 - Statistical Outlier Identification Procedure for Light-Duty Vehicles and Light Light-Duty Trucks...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Statistical Outlier Identification... (CONTINUED) Pt. 86, App. XVIII Appendix XVIII to Part 86—Statistical Outlier Identification Procedure for..., but suffer theoretical deficiencies if statistical significance tests are required. Consequently, the...

  20. Applications of statistics to medical science, II overview of statistical procedures for general use.

    PubMed

    Watanabe, Hiroshi

    2012-01-01

    Procedures of statistical analysis are reviewed to provide an overview of applications of statistics for general use. Topics that are dealt with are inference on a population, comparison of two populations with respect to means and probabilities, and multiple comparisons. This study is the second part of series in which we survey medical statistics. Arguments related to statistical associations and regressions will be made in subsequent papers.
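
    One of the multiple-comparison procedures such an overview covers, the Holm step-down adjustment, can be sketched in a few lines; it controls the family-wise error rate while being uniformly more powerful than plain Bonferroni.

```python
# Holm step-down adjustment: sort the p-values, multiply the i-th smallest by
# (m - i), and enforce monotonicity so adjusted p-values never decrease.

def holm_adjust(pvals):
    """Return Holm-adjusted p-values in the original order."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    adjusted = [0.0] * m
    running_max = 0.0
    for rank, i in enumerate(order):
        running_max = max(running_max, (m - rank) * pvals[i])
        adjusted[i] = min(1.0, running_max)
    return adjusted

raw = [0.010, 0.040, 0.030, 0.005]
print([round(p, 3) for p in holm_adjust(raw)])
# → [0.03, 0.06, 0.06, 0.02]
```

    At a 0.05 family-wise level, only the first and fourth hypotheses would be rejected here, whereas an unadjusted comparison would reject all four.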

  1. Factors that influence length of stay for in-patient gynaecology surgery: is the Case Mix Group (CMG) or type of procedure more important?

    PubMed

    Carey, Mark S; Victory, Rahi; Stitt, Larry; Tsang, Nicole

    2006-02-01

    To compare the association between the Case Mix Group (CMG) code and length of stay (LOS) with the association between the type of procedure and LOS in patients admitted for gynaecology surgery. We examined the records of women admitted for surgery in CMG 579 (major uterine/adnexal procedure, no malignancy) or 577 (major surgery ovary/adnexa with malignancy) between April 1997 and March 1999. Factors thought to influence LOS included age, weight, American Society of Anesthesiologists (ASA) score, physician, day of the week on which surgery was performed, and procedure type. Procedures were divided into six categories, four for CMG 579 and two for CMG 577. Data were abstracted from the hospital information costing system (T2 system) and by retrospective chart review. Multivariable analysis was performed using linear regression with backwards elimination. There were 606 patients in CMG 579 and 101 patients in CMG 577, and the corresponding median LOS was four days (range 1-19) for CMG 579 and nine days (range 3-30) for CMG 577. Combined analysis of both CMGs 577 and 579 revealed the following factors as highly significant determinants of LOS: procedure, age, physician, and ASA score. Although confounded by procedure type, the CMG did not significantly account for differences in LOS in the model if procedure was considered. Pairwise comparisons of procedure categories were all found to be statistically significant, even when controlled for other important variables. The type of procedure better accounts for differences in LOS by describing six statistically distinct procedure groups rather than the traditional two CMGs. It is reasonable therefore to consider changing the current CMG codes for gynaecology to a classification based on the type of procedure.

  2. Randomization Procedures Applied to Analysis of Ballistic Data

    DTIC Science & Technology

    1991-06-01

    BRL Technical Report BRL-TR-3245 (AD-A238 389), "Randomization Procedures Applied to Analysis of Ballistic Data," by Malcolm S. Taylor and Barry A. Bodt, June 1991. Subject terms: data analysis; computationally intensive statistics; randomization tests; permutation tests; nonparametric statistics. Recoverable text fragment: "... be 0.13. Any reasonable statistical procedure would fail to support the notion of improvement of dynamic over standard indexing based on this data."
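
    The randomization-test idea behind this report can be sketched as a permutation test for a difference in means: repeatedly shuffle the group labels and recompute the statistic to build a reference distribution. The two samples below are illustrative, not ballistic data.

```python
# Permutation (randomization) test sketch for a two-sample mean difference.
import random
from statistics import mean

def permutation_pvalue(a, b, n_perm=5000, rng=None):
    """Two-sided Monte Carlo permutation p-value for a difference in means."""
    rng = rng or random.Random(0)
    observed = abs(mean(a) - mean(b))
    pooled = list(a) + list(b)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # relabel the observations at random
        d = abs(mean(pooled[:len(a)]) - mean(pooled[len(a):]))
        if d >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)  # add-one Monte Carlo correction

standard = [5.1, 4.8, 5.3, 5.0, 4.9, 5.2]   # invented scores, standard indexing
dynamic  = [5.6, 5.9, 5.7, 6.0, 5.8, 5.5]   # invented scores, dynamic indexing
print(round(permutation_pvalue(standard, dynamic), 4))
```

    No distributional assumptions are needed: the p-value is the fraction of relabelings that produce a difference at least as extreme as the one observed.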

  3. Management of vocal fold scar with autologous fat implantation: perceptual results.

    PubMed

    Neuenschwander, M C; Sataloff, R T; Abaza, M M; Hawkshaw, M J; Reiter, D; Spiegel, J R

    2001-06-01

    Vocal fold scar disrupts the mucosal wave and interferes with glottic closure. Treatment involves a multidisciplinary approach that includes voice therapy, medical management, and sometimes surgery. We reviewed the records of the first eight patients who underwent autologous fat implantation for vocal fold scar. Information on the etiology of scar, physical findings, and prior interventions were collected. Videotapes of videostroboscopic findings and perceptual voice ratings [Grade, Roughness, Breathiness, Asthenia, Strain (GRBAS)] were randomized and analyzed independently by four blinded observers. Etiology of scar included mass excision (7), vocal fold stripping (3), congenital sulcus (2), and hemorrhage (1). Prior surgical procedures performed included thyroplasty (1), autologous fat injection (9), excision of scar (2), and lysis of adhesions (2). Strobovideolaryngoscopy: Statistically significant improvement was found in glottic closure, mucosal wave, and stiffness (P = 0.05). Perceptual ratings (GRBAS): Statistically significant improvement was found in all five parameters, including overall Grade, Roughness, Breathiness, Asthenia, and Strain (P = 0.05). Patients appear to have improved vocal fold function and quality of voice after autologous fat implantation in the vocal fold. Autologous fat implantation is an important adjunctive procedure in the management of vocal fold scar, and a useful addition to the armamentarium of the experienced phonomicrosurgeon.

  4. Using a Five-Step Procedure for Inferential Statistical Analyses

    ERIC Educational Resources Information Center

    Kamin, Lawrence F.

    2010-01-01

    Many statistics texts pose inferential statistical problems in a disjointed way. By using a simple five-step procedure as a template for statistical inference problems, the student can solve problems in an organized fashion. The problem and its solution will thus be a stand-by-itself organic whole and a single unit of thought and effort. The…
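
    One common five-step template (state the hypotheses, choose the significance level, compute the test statistic, state the decision rule, draw the conclusion) can be walked through on hypothetical data; the article's exact wording of the steps is not reproduced here.

```python
from statistics import mean, stdev
from math import sqrt

# Hypothetical fill weights (g) from a bottling line
sample = [9.8, 10.2, 10.4, 9.9, 10.6, 10.1, 10.3, 9.7]

# Step 1: state the hypotheses.  H0: mu = 10  vs.  H1: mu != 10
mu0 = 10.0
# Step 2: choose the significance level.
alpha = 0.05
# Step 3: compute the test statistic (one-sample t).
n = len(sample)
t = (mean(sample) - mu0) / (stdev(sample) / sqrt(n))
# Step 4: state the decision rule (two-sided critical value, df = 7).
t_crit = 2.365  # t_{0.975, 7} from a t table
# Step 5: decide and state the conclusion.
reject = abs(t) > t_crit
```

    Here t is about 1.14, well inside the critical value, so the null hypothesis is not rejected at the 0.05 level.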

  5. Risk factors for early post-operative neurological deterioration in dogs undergoing a cervical dorsal laminectomy or hemilaminectomy: 100 cases (2002-2014).

    PubMed

    Taylor-Brown, F E; Cardy, T J A; Liebel, F X; Garosi, L; Kenny, P J; Volk, H A; De Decker, S

    2015-12-01

    Early post-operative neurological deterioration is a well-known complication following dorsal cervical laminectomies and hemilaminectomies in dogs. This study aimed to evaluate potential risk factors for early post-operative neurological deterioration following these surgical procedures. Medical records of 100 dogs that had undergone a cervical dorsal laminectomy or hemilaminectomy between 2002 and 2014 were assessed retrospectively. Assessed variables included signalment, bodyweight, duration of clinical signs, neurological status before surgery, diagnosis, surgical site, type and extent of surgery and duration of procedure. Outcome measures were neurological status immediately following surgery and duration of hospitalisation. Univariate statistical analysis was performed to identify variables to be included in a multivariate model. Diagnoses included osseous associated cervical spondylomyelopathy (OACSM; n = 41), acute intervertebral disk extrusion (IVDE; 31), meningioma (11), spinal arachnoid diverticulum (10) and vertebral arch anomalies (7). Overall 54% (95% CI 45.25-64.75) of dogs were neurologically worse 48 h post-operatively. Multivariate statistical analysis identified four factors significantly related to early post-operative neurological outcome. Diagnoses of OACSM or meningioma were considered the strongest variables to predict early post-operative neurological deterioration, followed by higher (more severely affected) neurological grade before surgery and longer surgery time. This information can aid in the management of expectations of clinical staff and owners with dogs undergoing these surgical procedures. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. Longitudinal trends with improvement in quality of life after TVT, TVT O and Burch colposuspension procedures.

    PubMed

    Drahoradova, Petra; Martan, Alois; Svabik, Kamil; Zvara, Karel; Otava, Martin; Masata, Jaromir

    2011-02-01

    Comparison of the quality of life (QoL) trends after TVT, TVT O and Burch colposuspension (BCS) procedures and comparison of long-term subjective and objective outcomes. The study included 215 women who underwent a TVT, TVT O or BCS procedure. We monitored QoL after each procedure and the effect of complications on the QoL as assessed by the IQOL questionnaire over a 3-year period. The study was completed by 74.5% of women after TVT, 74.5% after TVT O, and 65.2% after BCS procedure. In the long-term, the QoL improved from 46.9 to 88.7 and remained stable after BCS; after TVT and TVT O, it declined, but only after TVT O was the decline statistically significant compared to BCS. The IQOL for women with post-operative complications has a clear descending tendency. The effect of the complications is highly significant (p<0.001). Only the OAB complication had a statistically significant effect on QoL (p<0.001). Preexistent OAB does not negatively affect postoperative results of anti-incontinence surgery. There was a statistically significant decline with the longitudinal values of IQOL with TVT O, but not with TVT or BCS. Anti-incontinence operations significantly improve quality of life for women with MI, but compared to the SI group, the quality of life is worse when measured at a longer time interval after the operation. Anti-incontinence operations significantly improve quality of life, and the difference in preoperative status in the long-term follow-up is demonstrable.

  7. Statistical Study of High-Velocity Compact Clouds Based on the Complete CO Imagings of the Central Molecular Zone

    NASA Astrophysics Data System (ADS)

    Tokuyama, Sekito; Oka, Tomoharu; Takekawa, Shunya; Yamada, Masaya; Iwata, Yuhei; Tsujimoto, Shiho

    2017-01-01

    High-velocity compact clouds (HVCCs) are one of the populations of peculiar clouds detected in the Central Molecular Zone (CMZ) of our Galaxy. They have compact appearances (< 5 pc) and large velocity widths (> 50 km s-1). Several explanations for the origin of HVCCs have been proposed, e.g., a series of supernova (SN) explosions (Oka et al. 1999) or a gravitational kick by a point-like gravitational source (Oka et al. 2016). To investigate the statistical properties of HVCCs, a complete list of them is essential. However, the previous list is incomplete, since the identification procedure mixed automated processes with manual selection (Nagai 2008). Here we have developed an automated procedure to identify HVCCs in spectral line data.
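
    Automated identification of connected structures in gridded intensity data can be illustrated with a toy flood-fill clump finder. This is not the authors' algorithm, and their actual HVCC criteria (compactness < 5 pc, velocity width > 50 km s-1) are not encoded; the grid is hypothetical.

```python
from collections import deque

def find_clumps(grid, threshold, min_cells=3):
    """Label 4-connected regions of a 2-D intensity map above a threshold,
    discarding regions smaller than min_cells."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    clumps = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] > threshold and not seen[r][c]:
                # Breadth-first flood fill over the above-threshold mask
                cells, queue = [], deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    cells.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] > threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                if len(cells) >= min_cells:
                    clumps.append(cells)
    return clumps

# Toy longitude-velocity map: one bright feature plus an isolated spike
grid = [
    [0, 0, 0, 0, 0, 0],
    [0, 9, 9, 0, 0, 0],
    [0, 9, 9, 0, 0, 7],
    [0, 0, 0, 0, 0, 0],
]
clumps = find_clumps(grid, threshold=5.0)
```

    The minimum-size cut is what removes single-pixel noise spikes, the same role a size criterion plays in a real identification pipeline.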

  8. Load research manual. Volume 2: Fundamentals of implementing load research procedures

    NASA Astrophysics Data System (ADS)

    1980-11-01

    This manual will assist electric utilities and state regulatory authorities in investigating customer electricity demand as part of cost-of-service studies, rate design, marketing research, system design, load forecasting, rate reform analysis, and load management research. Load research procedures are described in detail. Research programs at three utilities are compared: Carolina Power and Light Company, Long Island Lighting Company, and Southern California Edison Company. A load research bibliography and glossaries of load research and statistical terms are also included.

  9. 40 CFR 1065.12 - Approval of alternate procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... engine meets all applicable emission standards according to specified procedures. (iii) Use statistical.... (e) We may give you specific directions regarding methods for statistical analysis, or we may approve... statistical tests. Perform the tests as follows: (1) Repeat measurements for all applicable duty cycles at...

  10. Analysis of half diallel mating designs I: a practical analysis procedure for ANOVA approximation.

    Treesearch

    G.R. Johnson; J.N. King

    1998-01-01

    Procedures to analyze half-diallel mating designs using the SAS statistical package are presented. The procedure requires two runs of PROC and VARCOMP and results in estimates of additive and non-additive genetic variation. The procedures described can be modified to work on most statistical software packages which can compute variance component estimates. The...

  11. 75 FR 2488 - Mid-Atlantic Fishery Management Council; Public Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-15

    ... Mid-Atlantic Fishery Management Council's (MAFMC) Scientific and Statistical Committee (SSC) will hold... include new member orientation (overview of Council process and role of the SSC), review and adoption of SSC Standard Operating Practices and Procedures, ABC Control Rule Framework and Council Risk Policy...

  12. [Clinical study on vocal cords spontaneous rehabilitation after CO2 laser surgery].

    PubMed

    Zhang, Qingxiang; Hu, Huiying; Sun, Guoyan; Yu, Zhenkun

    2014-10-01

    To study the spontaneous rehabilitation and phonation quality of the vocal cords after different types of CO2 laser microsurgery. Surgical procedures were based on the Remacle system, Types I through Va. Three hundred and fifteen patients with hoarseness, evaluated by stroboscopic laryngoscopy, were prospectively assigned to groups according to the appearance of the vocal lesions, vocal fold vibration, and CT/MRI imaging of the larynx; each group held 63 cases. The investigation covered the morphological features of the vocal cords, the patients' subjective assessments, and objective voice measurements. No severe perioperative complications occurred in any patient. After surgery, vocal scarring was found in 1 case of Type I, 9 cases of Type II, 47 cases of Type III, 61 cases of Type IV, and 63 cases of Type Va; the difference in scar formation between surgical procedures was statistically significant (χ2 = 222.24, P < 0.05). Hoarseness improved after surgery in 59 cases of Type I, 51 of Type II, 43 of Type III, 21 of Type IV, and 17 of Type Va, also a statistically significant difference between procedures (χ2 = 89.46, P < 0.05). Among the stroboscopic parameters, jitter differed significantly between procedures (F = 44.51, P < 0.05) but not between Types I and II (P > 0.05); shimmer and maximum phonation time (MPT) showed the same pattern, and MPT did not differ between Types IV and Va (P > 0.05). Morphological and functional rehabilitation of the vocal cord is markedly impaired when the body layer is injured; the depth and extent of CO2 laser microsurgery are the key factors affecting vocal rehabilitation.

  13. Medicare payment data for spine reimbursement; important but flawed data for evaluating utilization of resources.

    PubMed

    Menger, Richard P; Wolf, Michael E; Kukreja, Sunil; Sin, Anthony; Nanda, Anil

    2015-01-01

    Medicare data showing physician-specific reimbursement for 2012 were recently made public in the mainstream media. Given the ongoing interest in containing healthcare costs, we analyze these data in the context of the delivery of spinal surgery. Demographics of 206 leading surgeons were extracted including state, geographic area, residency training program, fellowship training, and academic affiliation. Using current procedural terminology (CPT) codes, information was evaluated regarding the number of lumbar laminectomies, lumbar fusions, add-on laminectomy levels, and anterior cervical fusions reimbursed by Medicare in 2012. In 2012 Medicare reimbursed the average neurosurgeon slightly more than an orthopedic surgeon for all procedures ($142,075 vs. $110,920), but this was not found to be statistically significant (P = 0.218). Orthopedic surgeons had a statistical trend illustrating increased reimbursement for lumbar fusions specifically, $1187 versus $1073 (P = 0.07). Fellowship trained spinal surgeons also, on average, received more from Medicare ($125,407 vs. $76,551), but again this was not statistically significant (P = 0.112). A surgeon in private practice, on average, was reimbursed $137,495 while their academic counterparts were reimbursed $103,144 (P = 0.127). Surgeons performing cervical fusions in the Centers for Disease Control West Region did receive statistically significantly less reimbursement for that procedure than those surgeons in other parts of the country (P = 0.015). Surgeons in the West were reimbursed on average $849 for CPT code 22551 while those in the Midwest received $1475 per procedure. Medicare reimbursement data are fundamentally flawed in determining healthcare expenditure as they show a bias toward delivery of care in specific patient demographics.
However, neurosurgeons, not just policy makers, must take ownership to analyze, investigate, and interpret these data as it will affect healthcare reimbursement and delivery moving forward.

  14. Monitoring Items in Real Time to Enhance CAT Security

    ERIC Educational Resources Information Center

    Zhang, Jinming; Li, Jie

    2016-01-01

    An IRT-based sequential procedure is developed to monitor items for enhancing test security. The procedure uses a series of statistical hypothesis tests to examine whether the statistical characteristics of each item under inspection have changed significantly during CAT administration. This procedure is compared with a previously developed…

  15. Using Statistical Process Control to Make Data-Based Clinical Decisions.

    ERIC Educational Resources Information Center

    Pfadt, Al; Wheeler, Donald J.

    1995-01-01

    Statistical process control (SPC), which employs simple statistical tools and problem-solving techniques such as histograms, control charts, flow charts, and Pareto charts to implement continual product improvement procedures, can be incorporated into human service organizations. Examples illustrate use of SPC procedures to analyze behavioral data…
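
    The control chart for individual measurements (the XmR chart associated with Wheeler) can be computed in a few lines; the daily behavior counts below are hypothetical, not drawn from the article.

```python
def xmr_limits(values):
    """Natural process limits for an individuals (XmR) control chart.

    Uses the classic constant 2.66 (= 3 / d2, with d2 = 1.128 for
    moving ranges of two consecutive points).
    """
    center = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return center - 2.66 * mr_bar, center, center + 2.66 * mr_bar

# Hypothetical daily counts of a target behavior
counts = [12, 14, 11, 13, 15, 12, 10, 14, 13, 12]
lo, center, hi = xmr_limits(counts)
out_of_control = [x for x in counts if x < lo or x > hi]
```

    Points outside the computed limits signal a change in the underlying process rather than routine variation, which is the data-based decision rule SPC provides.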

  16. Statistical Cost Estimation in Higher Education: Some Alternatives.

    ERIC Educational Resources Information Center

    Brinkman, Paul T.; Niwa, Shelley

    Recent developments in econometrics that are relevant to the task of estimating costs in higher education are reviewed. The relative effectiveness of alternative statistical procedures for estimating costs is also tested. Statistical cost estimation involves three basic parts: a model, a data set, and an estimation procedure. Actual data are used…

  17. Scaling up to address data science challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendelberger, Joanne R.

    Statistics and Data Science provide a variety of perspectives and technical approaches for exploring and understanding Big Data. Partnerships between scientists from different fields such as statistics, machine learning, computer science, and applied mathematics can lead to innovative approaches for addressing problems involving increasingly large amounts of data in a rigorous and effective manner that takes advantage of advances in computing. Here, this article will explore various challenges in Data Science and will highlight statistical approaches that can facilitate analysis of large-scale data including sampling and data reduction methods, techniques for effective analysis and visualization of large-scale simulations, and algorithms and procedures for efficient processing.

  18. Scaling up to address data science challenges

    DOE PAGES

    Wendelberger, Joanne R.

    2017-04-27

    Statistics and Data Science provide a variety of perspectives and technical approaches for exploring and understanding Big Data. Partnerships between scientists from different fields such as statistics, machine learning, computer science, and applied mathematics can lead to innovative approaches for addressing problems involving increasingly large amounts of data in a rigorous and effective manner that takes advantage of advances in computing. Here, this article will explore various challenges in Data Science and will highlight statistical approaches that can facilitate analysis of large-scale data including sampling and data reduction methods, techniques for effective analysis and visualization of large-scale simulations, and algorithms and procedures for efficient processing.

  19. ESEA Title I Migrant. Final Technical Report. Publication 80.40.

    ERIC Educational Resources Information Center

    Austin Independent School District, TX. Office of Research and Evaluation.

    Data from 24 instruments used to evaluate the 1980-81 ESEA Title I Migrant program in the Austin (Texas) Independent School District are presented. A separate section for each instrument includes a description of purpose; procedures and results; and, where appropriate, relevant communications, instructions and statistical data. Summaries describe…

  20. Drug and Alcohol Use by Canadian University Athletes: A National Survey.

    ERIC Educational Resources Information Center

    Spence, John C.; Gauvin, Lise

    1996-01-01

    Using a stratified random sampling procedure, 754 student athletes were surveyed regarding drug and alcohol use in eight different sports from eight universities across Canada. Provides statistics of substances athletes reported using, including pain medications, weight loss products, anabolic steroids, smokeless tobacco products, alcohol,…

  1. 76 FR 647 - Energy Conservation Program: Test Procedures for Electric Motors and Small Electric Motors

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-05

    ... determination method (AEDM) for small electric motors, including the statistical requirements to substantiate... restriction to a particular application or type of application; or (2) Standard operating characteristics or... application, and which can be used in most general purpose applications.

  2. 75 FR 59143 - Magnuson-Stevens Fishery Conservation and Management Act; Regional Fishery Management Councils...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-27

    ... Coordinating Committee (CCC), require that the Councils' science and statistical committee (SSC) members... Council's Internet site, with alternative methods of retrieval for specific documents. The words ``to the... restrictions on lobbying; the procedures for Council member nomination, including timing for submission of...

  3. Information Input and Performance in Small Decision Making Groups.

    ERIC Educational Resources Information Center

    Ryland, Edwin Holman

    It was hypothesized that increases in the amount and specificity of information furnished to a discussion group would facilitate group decision making and improve other aspects of group and individual performance. Procedures in testing these assumptions included varying the amounts of statistics, examples, testimony, and augmented information…

  4. Academic Achievement of Girls in Rural Schools in Kenya

    ERIC Educational Resources Information Center

    Mungai, A. M.

    2012-01-01

    This study examined the effect of two family factors (financial, social capital) and school factors on students' achievement. One hundred eighty-two seventh-grade female students from nine schools in Muranga district, Kenya, were studied. The statistical procedures included logit regression, cross-tabulations, frequency counting and chi-square…

  5. Algorithm for Identifying Erroneous Rain-Gauge Readings

    NASA Technical Reports Server (NTRS)

    Rickman, Doug

    2005-01-01

    An algorithm analyzes rain-gauge data to identify statistical outliers that could be deemed to be erroneous readings. Heretofore, analyses of this type have been performed in burdensome manual procedures that have involved subjective judgements. Sometimes, the analyses have included computational assistance for detecting values falling outside of arbitrary limits. The analyses have been performed without statistically valid knowledge of the spatial and temporal variations of precipitation within rain events. In contrast, the present algorithm makes it possible to automate such an analysis, makes the analysis objective, takes account of the spatial distribution of rain gauges in conjunction with the statistical nature of spatial variations in rainfall readings, and minimizes the use of arbitrary criteria. The algorithm implements an iterative process that involves nonparametric statistics.
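
    A minimal nonparametric outlier screen in the spirit of this record can be sketched as follows. This is not the algorithm the report describes (which also exploits the spatial arrangement of the gauges); it flags readings that depart from the network median by more than a few robust standard deviations, and the gauge data are hypothetical.

```python
from statistics import median

def flag_outliers(readings, k=5.0):
    """Flag gauges whose reading departs from the network median by more
    than k robust standard deviations (scaled median absolute deviation)."""
    vals = list(readings.values())
    med = median(vals)
    mad = median(abs(v - med) for v in vals)
    scale = 1.4826 * mad or 1.0   # fall back if all readings agree
    return {g: v for g, v in readings.items() if abs(v - med) > k * scale}

# Hypothetical hourly totals (mm); gauge G7 is a suspected bad reading
readings = {"G1": 4.2, "G2": 3.9, "G3": 4.5, "G4": 4.0,
            "G5": 4.4, "G6": 3.8, "G7": 41.0}
flags = flag_outliers(readings)
```

    Because the median and MAD are insensitive to the outlier itself, the screen avoids the arbitrary fixed limits that the manual procedures relied on.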

  6. Hemodynamic Parameters during Laryngoscopic Procedures in the Office and in the Operating Room.

    PubMed

    Tierney, William S; Chota, Rebecca L; Benninger, Michael S; Nowacki, Amy S; Bryson, Paul C

    2016-09-01

    Previous research has shown that office-based laryngoscopic procedures can induce hemodynamic changes, including tachycardia and severe hypertension, calling into question the safety of these procedures. However, comparison between office and operating room (OR) procedures has not been carried out. Therefore, we prospectively measured hemodynamic variables in both settings to compare hemodynamic changes between office and OR procedures. Prospective cohort study. Single academic center. Subjects undergoing office and OR laryngoscopic procedures were prospectively identified, and 92 OR and 70 office subjects were included. Heart rate and blood pressure were measured at established time points before, during, and after the procedures. Descriptive and comparative statistical analyses were conducted. Severe hemodynamic events, either tachycardia or severe hypertension (blood pressure >180 mm Hg systolic or >110 mm Hg diastolic), occurred significantly more frequently in OR than office procedures (41% vs 20%; P = .006). OR severe hemodynamic events occurred more commonly than previously reported rates in the office (41% vs 28%; P = .012). Regression analyses showed that the odds of having a severe hemodynamic event were 3.66 times higher in OR versus office procedures. Severe hemodynamic events are more likely to occur in the OR than in the office during laryngologic procedures. While larger studies will be required to establish rates of dangerous cardiovascular events in laryngoscopic procedures, hemodynamic parameters indicate that office-based procedures have a safety benefit for procedures that can be conducted in either setting. © American Academy of Otolaryngology—Head and Neck Surgery Foundation 2016.

  7. Cost analysis of robotic versus laparoscopic general surgery procedures.

    PubMed

    Higgins, Rana M; Frelich, Matthew J; Bosler, Matthew E; Gould, Jon C

    2017-01-01

    Robotic surgical systems have been used at a rapidly increasing rate in general surgery. Many of these procedures have been performed laparoscopically for years. In a surgical encounter, a significant portion of the total costs is associated with consumable supplies. Our hospital system has invested in a software program that can track the costs of consumable surgical supplies. We sought to determine the differences in cost of consumables with elective laparoscopic and robotic procedures for our health care organization. De-identified procedural cost and equipment utilization data were collected from the Surgical Profitability Compass Procedure Cost Manager System (The Advisory Board Company, Washington, DC) for our health care system for laparoscopic and robotic cholecystectomy, fundoplication, and inguinal hernia between the years 2013 and 2015. Outcomes were length of stay, case duration, and supply cost. Statistical analysis was performed using a t-test for continuous variables, and statistical significance was defined as p < 0.05. The total cost of consumable surgical supplies was significantly greater for all robotic procedures. Length of stay did not differ for fundoplication or cholecystectomy. Length of stay was greater for robotic inguinal hernia repair. Case duration was similar for cholecystectomy (84.3 robotic and 75.5 min laparoscopic, p = 0.08), but significantly longer for robotic fundoplication (197.2 robotic and 162.1 min laparoscopic, p = 0.01) and inguinal hernia repair (124.0 robotic and 84.4 min laparoscopic, p < 0.01). We found a significantly increased cost of general surgery procedures for our health care system when cases commonly performed laparoscopically are instead performed robotically. Our analysis is limited by the fact that we only included costs associated with consumable surgical supplies.
The initial acquisition cost (over $1 million for robotic surgical system), depreciation, and service contract for the robotic and laparoscopic systems were not included in this analysis.

  8. Simulation center training as a means to improve resident performance in percutaneous noncontinuous CT-guided fluoroscopic procedures with dose reduction.

    PubMed

    Mendiratta-Lala, Mishal; Williams, Todd R; Mendiratta, Vivek; Ahmed, Hafeez; Bonnett, John W

    2015-04-01

    The purpose of this study was to evaluate the effectiveness of a multifaceted simulation-based resident training for CT-guided fluoroscopic procedures by measuring procedural and technical skills, radiation dose, and procedure times before and after simulation training. A prospective analysis included 40 radiology residents and eight staff radiologists. Residents took an online pretest to assess baseline procedural knowledge. Second- through fourth-year residents' baseline technical skills with a procedural phantom were evaluated. First- through third-year residents then underwent formal didactic and simulation-based procedural and technical training with one of two interventional radiologists and followed the training with 1 month of supervised phantom-based practice. Thereafter, residents underwent final written and practical examinations. The practical examination included essential items from a 20-point checklist, including site and side marking, consent, time-out, and sterile technique along with a technical skills portion assessing pedal steps, radiation dose, needle redirects, and procedure time. The results indicated statistically significant improvement in procedural and technical skills after simulation training. For residents, the median number of pedal steps decreased by three (p=0.001), median dose decreased by 15.4 mGy (p<0.001), median procedure time decreased by 4.0 minutes (p<0.001), median number of needle redirects decreased by 1.0 (p=0.005), and median number of 20-point checklist items successfully completed increased by three (p<0.001). The results suggest that procedural skills can be acquired and improved by simulation-based training of residents, regardless of experience. CT simulation training decreases procedural time, decreases radiation dose, and improves resident efficiency and confidence, which may transfer to clinical practice with improved patient care and safety.

  9. Metabolic effects of large-volume liposuction for obese healthy women: a meta-analysis of fasting insulin levels.

    PubMed

    Boriani, Filippo; Villani, Riccardo; Morselli, Paolo Giovanni

    2014-10-01

    Obesity is increasingly frequent in our society and is associated closely with metabolic disorders. As some studies have suggested, removal of fat tissue through liposuction and dermolipectomies may be of some benefit in the improvement of metabolic indices. This article aimed to review the published literature on this topic and to evaluate metabolic variations meta-analytically after liposuction, dermolipectomy, or both. Through a literature search with the PubMed/Medline database, 14 studies were identified. All articles were analyzed, and several metabolic variables were chosen in the attempt to meta-analyze the effect of adipose tissue removal through the various studies. All statistical calculations were performed with Review Manager (RevMan), version 5.0. Several cardiovascular and metabolic variables are described as prone to variations after body-contouring procedures when a significant amount of adipose tissue has been excised. Four of the studies included in the analysis reported improvements in all the parameters examined. Seven articles showed improvement in some variables and no improvement in others, whereas three studies showed no beneficial variation in any of the considered indicators after body-contouring procedures. Fasting plasma insulin was identified as the only variable for which a meta-analysis of five included studies was possible. The meta-analysis showed a statistically significant reduction in fasting plasma insulin resulting from large-volume liposuction in obese healthy women. Many beneficial metabolic effects resulting from dermolipectomy and liposuction procedures are described in the literature. In particular, fasting plasma insulin and thus insulin sensitivity seem to be positively influenced. Further research, including prospective clinical studies, is necessary for better exploration of the effects that body-contouring plastic surgery procedures have on metabolic parameters.
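
    The inverse-variance pooling underlying such a fixed-effect meta-analysis takes only a few lines. The study effects below are hypothetical, labeled as such; they are not the five insulin studies from the review.

```python
from math import sqrt

def inverse_variance_pool(effects, ses):
    """Fixed-effect inverse-variance pooling of study effect sizes.

    Returns the pooled effect and its standard error: each study is
    weighted by 1/SE^2, so more precise studies count for more.
    """
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    return pooled, 1.0 / sqrt(sum(weights))

# Hypothetical mean changes in fasting insulin (microU/mL) and their SEs
effects = [-2.1, -1.4, -3.0, -0.8, -1.9]
ses = [0.9, 0.7, 1.2, 0.6, 0.8]
pooled, se = inverse_variance_pool(effects, ses)
ci = (pooled - 1.96 * se, pooled + 1.96 * se)
```

    A pooled confidence interval that excludes zero, as in this sketch, is the kind of result reported for the five-study insulin analysis.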

  10. Load research manual. Volume 1. Load research procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brandenburg, L.; Clarkson, G.; Grund, Jr., C.

    1980-11-01

    This three-volume manual presents technical guidelines for electric utility load research. Special attention is given to issues raised by the load data reporting requirements of the Public Utility Regulatory Policies Act of 1978 and to problems faced by smaller utilities that are initiating load research programs. In Volumes 1 and 2, procedures are suggested for determining data requirements for load research, establishing the size and customer composition of a load survey sample, selecting and using equipment to record customer electricity usage, processing data tapes from the recording equipment, and analyzing the data. Statistical techniques used in customer sampling are discussed in detail. The costs of load research also are estimated, and ongoing load research programs at three utilities are described. The manual includes guides to load research literature and glossaries of load research and statistical terms.

  11. Introducing Statistical Inference to Biology Students through Bootstrapping and Randomization

    ERIC Educational Resources Information Center

    Lock, Robin H.; Lock, Patti Frazer

    2008-01-01

    Bootstrap methods and randomization tests are increasingly being used as alternatives to standard statistical procedures in biology. They also serve as an effective introduction to the key ideas of statistical inference in introductory courses for biology students. We discuss the use of such simulation based procedures in an integrated curriculum…
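
    A percentile bootstrap confidence interval, one of the simulation-based procedures the article advocates for introductory courses, can be sketched as follows; the beak measurements are hypothetical classroom data.

```python
import random

def bootstrap_ci(data, n_boot=5000, alpha=0.05, seed=1):
    """Percentile bootstrap confidence interval for the mean:
    resample with replacement, collect the resample means, and
    read off the alpha/2 and 1 - alpha/2 quantiles."""
    rng = random.Random(seed)
    means = sorted(
        sum(rng.choices(data, k=len(data))) / len(data)
        for _ in range(n_boot)
    )
    lo = means[int(n_boot * alpha / 2)]
    hi = means[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi

# Hypothetical finch beak depths (mm) from an introductory biology lab
beaks = [9.1, 8.7, 9.5, 10.2, 8.9, 9.8, 9.3, 9.0, 9.6, 9.4]
lo, hi = bootstrap_ci(beaks)
```

    The appeal for students is that the interval comes straight from resampling the data, with no distributional formula to memorize.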

  12. Efficiency Analysis: Enhancing the Statistical and Evaluative Power of the Regression-Discontinuity Design.

    ERIC Educational Resources Information Center

    Madhere, Serge

    An analytic procedure, efficiency analysis, is proposed for improving the utility of quantitative program evaluation for decision making. The three features of the procedure are explained: (1) for statistical control, it adopts and extends the regression-discontinuity design; (2) for statistical inferences, it de-emphasizes hypothesis testing in…

  13. Longitudinal trends with Improvement in Quality of Life after TVT, TVT O and Burch Colposuspension Procedures

    PubMed Central

    Drahoradova, Petra; Martan, Alois; Svabik, Kamil; Zvara, Karel; Otava, Martin; Masata, Jaromir

    2011-01-01

    Background: Comparison of the quality of life (QoL) trends after TVT, TVT O and Burch colposuspension (BCS) procedures and comparison of long-term subjective and objective outcomes. Material/Methods: The study included 215 women who underwent a TVT, TVT O or BCS procedure. We monitored QoL after each procedure and the effect of complications on the QoL as assessed by the IQOL questionnaire over a 3-year period. Results: The study was completed by 74.5% of women after TVT, 74.5% after TVT O, and 65.2% after BCS procedure. In the long-term, the QoL improved from 46.9 to 88.7 and remained stable after BCS; after TVT and TVT O, it declined, but only after TVT O was the decline statistically significant compared to BCS. The IQOL for women with post-operative complications has a clear descending tendency. The effect of the complications is highly significant (p<0.001). Only the OAB complication had a statistically significant effect on QoL (p<0.001). Preexistent OAB does not negatively affect postoperative results of anti-incontinence surgery. Conclusions: There was a statistically significant decline with the longitudinal values of IQOL with TVT O, but not with TVT or BCS. Anti-incontinence operations significantly improve quality of life for women with MI, but compared to the SI group, the quality of life is worse when measured at a longer time interval after the operation. Anti-incontinence operations significantly improve quality of life, and the difference in preoperative status in the long-term follow-up is demonstrable. PMID:21278690

  14. Evaluating measurement models in clinical research: covariance structure analysis of latent variable models of self-conception.

    PubMed

    Hoyle, R H

    1991-02-01

    Indirect measures of psychological constructs are vital to clinical research. On occasion, however, the meaning of indirect measures of psychological constructs is obfuscated by statistical procedures that do not account for the complex relations between items and latent variables and among latent variables. Covariance structure analysis (CSA) is a statistical procedure for testing hypotheses about the relations among items that indirectly measure a psychological construct and relations among psychological constructs. This article introduces clinical researchers to the strengths and limitations of CSA as a statistical procedure for conceiving and testing structural hypotheses that are not tested adequately with other statistical procedures. The article is organized around two empirical examples that illustrate the use of CSA for evaluating measurement models with correlated error terms, higher-order factors, and measured and latent variables.

  15. Consequences of common data analysis inaccuracies in CNS trauma injury basic research.

    PubMed

    Burke, Darlene A; Whittemore, Scott R; Magnuson, David S K

    2013-05-15

    The development of successful treatments for humans after traumatic brain or spinal cord injuries (TBI and SCI, respectively) requires animal research. This effort can be hampered when promising experimental results cannot be replicated because of incorrect data analysis procedures. To identify and hopefully avoid these errors in future studies, the articles in seven journals with the highest number of basic science central nervous system TBI and SCI animal research studies published in 2010 (N=125 articles) were reviewed for their data analysis procedures. After identifying the most common statistical errors, the implications of those findings were demonstrated by reanalyzing previously published data from our laboratories using the identified inappropriate statistical procedures, then comparing the two sets of results. Overall, 70% of the articles contained at least one type of inappropriate statistical procedure. The highest percentage involved incorrect post hoc t-tests (56.4%), followed by inappropriate parametric statistics (analysis of variance and t-test; 37.6%). Repeated Measures analysis was inappropriately missing in 52.0% of all articles and, among those with behavioral assessments, 58% were analyzed incorrectly. Reanalysis of our published data using the most common inappropriate statistical procedures resulted in a 14.1% average increase in significant effects compared to the original results. Specifically, an increase of 15.5% occurred with Independent t-tests and 11.1% after incorrect post hoc t-tests. Utilizing proper statistical procedures can allow more-definitive conclusions, facilitate replicability of research results, and enable more accurate translation of those results to the clinic.
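The most common error the survey identifies, uncorrected post hoc t-tests, is easy to demonstrate. Below is a minimal Python sketch (simulated data, not the authors' datasets) contrasting the naive all-pairs t-test shortcut with an omnibus ANOVA followed by Bonferroni-corrected comparisons; the function names are ours.

```python
import numpy as np
from scipy import stats

def pairwise_p_values(groups):
    """All pairwise independent t-test p-values (the error-prone shortcut)."""
    ps = []
    for i in range(len(groups)):
        for j in range(i + 1, len(groups)):
            ps.append(stats.ttest_ind(groups[i], groups[j]).pvalue)
    return ps

def anova_then_bonferroni(groups, alpha=0.05):
    """Gate pairwise tests behind a significant omnibus ANOVA, then
    Bonferroni-correct the post hoc comparisons."""
    if stats.f_oneway(*groups).pvalue >= alpha:
        return []                                # omnibus not significant: stop
    m = len(groups) * (len(groups) - 1) // 2     # number of comparisons
    return [min(1.0, p * m) for p in pairwise_p_values(groups)]

rng = np.random.default_rng(0)
# Three groups drawn from the SAME distribution: any "significant"
# pairwise difference here would be a false positive.
groups = [rng.normal(0.0, 1.0, 12) for _ in range(3)]
naive = pairwise_p_values(groups)
corrected = anova_then_bonferroni(groups)
```

The corrected procedure both reduces the number of tests actually run and inflates each surviving p-value, which is why switching to it tends to remove spurious "significant" effects rather than add them.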

  16. Efficacy of a radiation absorbing shield in reducing dose to the interventionalist during peripheral endovascular procedures: a single centre pilot study.

    PubMed

    Power, S; Mirza, M; Thakorlal, A; Ganai, B; Gavagan, L D; Given, M F; Lee, M J

    2015-06-01

This prospective pilot study was undertaken to evaluate the feasibility and effectiveness of using a radiation absorbing shield to reduce operator dose from scatter during lower limb endovascular procedures. A commercially available bismuth shield system (RADPAD) was used. Sixty consecutive patients undergoing lower limb angioplasty were included. Thirty procedures were performed without the RADPAD (control group) and thirty with the RADPAD (study group). Two separate methods were used to measure dose to a single operator. Thermoluminescent dosimeter (TLD) badges were used to measure hand, eye, and unshielded body dose. A direct dosimeter with digital readout was also used to measure eye and unshielded body dose. To allow for variation between the control and study groups, dose per unit time was calculated. TLD results demonstrated a significant reduction in median body dose per unit time for the study group compared with controls (p = 0.001), corresponding to a mean dose reduction rate of 65%. Median eye and hand dose per unit time were also reduced in the study group compared with the control group; however, this was not statistically significant (p = 0.081 for eye, p = 0.628 for hand). Direct dosimeter readings also showed a statistically significant reduction in median unshielded body dose rate for the study group compared with controls (p = 0.037). Eye dose rate was reduced for the study group but this was not statistically significant (p = 0.142). Initial results are encouraging. Use of the shield resulted in a statistically significant reduction in unshielded dose to the operator's body. Measured doses to the operator's eye and hand were also reduced but did not reach statistical significance in this pilot study.

  17. Global aesthetic surgery statistics: a closer look.

    PubMed

    Heidekrueger, Paul I; Juran, S; Ehrl, D; Aung, T; Tanna, N; Broer, P Niclas

    2017-08-01

    Obtaining quality global statistics about surgical procedures remains an important yet challenging task. The International Society of Aesthetic Plastic Surgery (ISAPS) reports the total number of surgical and non-surgical procedures performed worldwide on a yearly basis. While providing valuable insight, ISAPS' statistics leave two important factors unaccounted for: (1) the underlying base population, and (2) the number of surgeons performing the procedures. Statistics of the published ISAPS' 'International Survey on Aesthetic/Cosmetic Surgery' were analysed by country, taking into account the underlying national base population according to the official United Nations population estimates. Further, the number of surgeons per country was used to calculate the number of surgeries performed per surgeon. In 2014, based on ISAPS statistics, national surgical procedures ranked in the following order: 1st USA, 2nd Brazil, 3rd South Korea, 4th Mexico, 5th Japan, 6th Germany, 7th Colombia, and 8th France. When considering the size of the underlying national populations, the demand for surgical procedures per 100,000 people changes the overall ranking substantially. It was also found that the rate of surgical procedures per surgeon shows great variation between the responding countries. While the US and Brazil are often quoted as the countries with the highest demand for plastic surgery, according to the presented analysis, other countries surpass these countries in surgical procedures per capita. While data acquisition and quality should be improved in the future, valuable insight regarding the demand for surgical procedures can be gained by taking specific demographic and geographic factors into consideration.

  18. Comparison of drug-induced sleep endoscopy and Müller's maneuver in diagnosing obstructive sleep apnea using the VOTE classification system.

    PubMed

Yeğin, Yakup; Çelik, Mustafa; Kaya, Kamil Hakan; Koç, Arzu Karaman; Kayhan, Fatma Tülin

    Knowledge of the site of obstruction and the pattern of airway collapse is essential for determining correct surgical and medical management of patients with Obstructive Sleep Apnea Syndrome (OSAS). To this end, several diagnostic tests and procedures have been developed. To determine whether drug-induced sleep endoscopy (DISE) or Müller's maneuver (MM) would be more successful at identifying the site of obstruction and the pattern of upper airway collapse in patients with OSAS. The study included 63 patients (52 male and 11 female) who were diagnosed with OSAS at our clinic. Ages ranged from 30 to 66 years old and the average age was 48.5 years. All patients underwent DISE and MM and the results of these examinations were characterized according to the region/degree of obstruction as well as the VOTE classification. The results of each test were analyzed per upper airway level and compared using statistical analysis (Cohen's kappa statistic test). There was statistically significant concordance between the results from DISE and MM for procedures involving the anteroposterior (73%), lateral (92.1%), and concentric (74.6%) configuration of the velum. Results from the lateral part of the oropharynx were also in concordance between the tests (58.7%). Results from the lateral configuration of the epiglottis were in concordance between the tests (87.3%). There was no statistically significant concordance between the two examinations for procedures involving the anteroposterior of the tongue (23.8%) and epiglottis (42.9%). We suggest that DISE has several advantages including safety, ease of use, and reliability, which outweigh MM in terms of the ability to diagnose sites of obstruction and the pattern of upper airway collapse. Also, MM can provide some knowledge of the pattern of pharyngeal collapse. Furthermore, we also recommend using the VOTE classification in combination with DISE. 
Copyright © 2016 Associação Brasileira de Otorrinolaringologia e Cirurgia Cérvico-Facial. Published by Elsevier Editora Ltda. All rights reserved.

  19. Statistical analysis and digital processing of the Mössbauer spectra

    NASA Astrophysics Data System (ADS)

    Prochazka, Roman; Tucek, Pavel; Tucek, Jiri; Marek, Jaroslav; Mashlan, Miroslav; Pechousek, Jiri

    2010-02-01

This work focuses on the use of statistical methods and the development of filtration procedures for signal processing in Mössbauer spectroscopy. Statistical tools for noise filtering in measured spectra are used in many scientific areas. The use of a pure statistical approach to the filtration of accumulated Mössbauer spectra is described. In Mössbauer spectroscopy, the noise can be considered a Poisson statistical process that approaches a Gaussian distribution for high numbers of observations. This noise is a superposition of non-resonant photon counting, electronic noise (from the γ-ray detection and discrimination units), and velocity-system imperfections characterized by velocity nonlinearities. The possibility of noise reduction using a newly designed statistical filter procedure is described. This mathematical procedure improves the signal-to-noise ratio and thus makes it easier to determine the hyperfine parameters of the given Mössbauer spectra. The filter procedure is based on a periodogram method that makes it possible to identify the statistically important components in the spectral domain. The significance level for these components is then feedback-controlled using the correlation coefficient test results. The theoretical correlation coefficient level corresponding to the spectrum resolution is estimated. The correlation coefficient test is based on a comparison of the theoretical and experimental correlation coefficients given by the Spearman method. The correctness of this solution was analyzed by a series of statistical tests and confirmed by many spectra measured with increasing statistical quality for a given sample (absorber). The effect of this filter procedure depends on the signal-to-noise ratio, and the applicability of the method is subject to certain conditions.
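A heavily simplified sketch of the core idea: keep only the dominant periodogram components and invert the FFT. The significance-level feedback via the Spearman correlation test is omitted; the fixed `keep_fraction` threshold and the synthetic spectrum below are our illustrative assumptions, not the paper's procedure.

```python
import numpy as np

def periodogram_filter(signal, keep_fraction=0.05):
    """Keep only the highest-power spectral components (a simplified
    stand-in for the paper's significance-controlled periodogram
    threshold), zero the rest, and invert the FFT."""
    spec = np.fft.rfft(signal)
    power = np.abs(spec) ** 2
    n_keep = max(1, int(keep_fraction * len(power)))
    threshold = np.sort(power)[-n_keep]
    spec[power < threshold] = 0.0
    return np.fft.irfft(spec, n=len(signal))

rng = np.random.default_rng(1)
x = np.linspace(0, 4 * np.pi, 512, endpoint=False)
clean = np.sin(3 * x)                        # stand-in for a resonance line
noisy = clean + rng.normal(0, 0.5, x.size)   # counting noise, Gaussian limit
filtered = periodogram_filter(noisy)
```

With a strong line and broadband noise, almost all of the signal power sits in a few bins, so discarding the rest improves the signal-to-noise ratio, which is exactly the effect the authors exploit before fitting hyperfine parameters.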

  20. Two Paradoxes in Linear Regression Analysis.

    PubMed

    Feng, Ge; Peng, Jing; Tu, Dongke; Zheng, Julia Z; Feng, Changyong

    2016-12-25

    Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection.
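The abstract does not spell out which selection procedure is "wrong", but a widely criticized one is univariate pre-screening of predictors before fitting the multivariable model. The simulation below (our construction, not the authors') builds a suppressor variable whose marginal correlation with the outcome is near zero, so screening would discard it even though the full model shows it is strongly significant.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 2000
x1 = rng.normal(size=n)
x2 = 0.7 * x1 + np.sqrt(1 - 0.49) * rng.normal(size=n)
# Coefficients chosen so the MARGINAL covariance of x2 with y is zero:
# cov(y, x2) = 1.0 * 0.7 - 0.7 * 1.0 = 0.
y = 1.0 * x1 - 0.7 * x2 + rng.normal(size=n)

# Step 1 of the flawed procedure: univariate screening of x2.
univariate_p = stats.pearsonr(x2, y)[1]      # x2 looks useless on its own

# Full multivariable OLS model with both predictors.
X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
sigma2 = resid @ resid / (n - X.shape[1])
se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X).diagonal())
t_x2 = beta[2] / se[2]
joint_p = 2 * stats.t.sf(abs(t_x2), df=n - X.shape[1])
```

In the joint model x2 is overwhelmingly significant while its univariate p-value is unremarkable, which is why formal, theory-based selection procedures are preferred over marginal screening.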

  1. FACTORS ASSOCIATED WITH ODONTOGENIC BACTERAEMIA IN ORTHODONTIC PATIENTS.

    PubMed

    Umeh, O D; Sanu, O O; Utomi, I L; Nwaokorie, F O

    2016-01-01

Various studies have investigated factors associated with the prevalence and intensity of bacteraemia following oral procedures, including orthodontic procedures. The aim of this study was to determine the effect of age, gender, plaque and gingival indices on the occurrence of odontogenic bacteraemia following orthodontic treatment procedures. Orthodontic Clinic, Lagos University Teaching Hospital (LUTH), Lagos, Nigeria. Using the consecutive, convenience sampling method, a total of 100 subjects who met the inclusion criteria were recruited for the study, and peripheral blood was collected before and again within 2 minutes of completion of orthodontic procedures for microbiologic analysis using the BACTEC automated blood culture system and the lysis filtration method of blood culturing. The subjects were randomly placed in one of four orthodontic procedures investigated: alginate impression making (Group I), separator placement (Group II), band cementation (Group III) and arch wire change (Group IV). Plaque and gingival indices were assessed using the plaque component of the Simplified Oral Hygiene Index (OHI-S) (Greene & Vermillion) and the Modified Gingival Index (Lobene), respectively, before blood collection. Spearman point bi-serial correlations and logistic regression statistics were used for statistical evaluations at the p < 0.05 level. Overall baseline prevalences of bacteraemia of 3% and 17% were observed using the BACTEC and lysis filtration methods, respectively. Similarly, overall prevalences of bacteraemia following orthodontic treatment procedures of 16% and 28% were observed, respectively, using the BACTEC and lysis filtration methods. A statistically significant increase in the prevalence of bacteraemia was observed following separator placement (p=0.016). An increase in age, plaque index scores and modified gingival index scores of the subjects was found to be associated with an increase in the prevalence of bacteraemia following orthodontic treatment procedures, with plaque index score showing the strongest correlation. Separator placement was found to induce the highest level of bacteraemia. Meticulous oral hygiene practice and the use of 0.2% chlorhexidine mouth rinse prior to separator placement may be considered effective measures in reducing oral bacterial load and consequently the occurrence of bacteraemia following orthodontic treatment procedures.

  2. Probabilistic micromechanics for metal matrix composites

    NASA Astrophysics Data System (ADS)

    Engelstad, S. P.; Reddy, J. N.; Hopkins, Dale A.

A probabilistic micromechanics-based nonlinear analysis procedure is developed to predict and quantify the variability in the properties of high temperature metal matrix composites. Monte Carlo simulation is used to model the probabilistic distributions of the constituent level properties including fiber, matrix, and interphase properties, volume and void ratios, strengths, fiber misalignment, and nonlinear empirical parameters. The procedure predicts the resultant ply properties and quantifies their statistical scatter. Graphite copper and Silicon Carbide Titanium Aluminide (SCS-6 TI15) unidirectional plies are considered to demonstrate the predictive capabilities. The procedure is believed to have a high potential for use in material characterization and selection to precede and assist in experimental studies of new high temperature metal matrix composites.
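A toy Monte Carlo in the spirit of the procedure: propagate assumed scatter in constituent properties through the simple rule of mixtures and report the resulting ply-level statistics. The distributions and moduli below are illustrative placeholders, not the paper's constituent data or its nonlinear micromechanics model.

```python
import numpy as np

def simulate_ply_modulus(n_draws=10_000, seed=3):
    """Monte Carlo sketch of constituent-level scatter propagating to a
    ply property via the rule of mixtures E = Vf*Ef + (1 - Vf)*Em.
    All distributions are hypothetical."""
    rng = np.random.default_rng(seed)
    Ef = rng.normal(380.0, 15.0, n_draws)                    # fiber modulus, GPa
    Em = rng.normal(110.0, 8.0, n_draws)                     # matrix modulus, GPa
    Vf = np.clip(rng.normal(0.35, 0.02, n_draws), 0.0, 1.0)  # fiber volume fraction
    E_ply = Vf * Ef + (1.0 - Vf) * Em
    return E_ply.mean(), E_ply.std()

mean_E, std_E = simulate_ply_modulus()
```

The point of the exercise is the second return value: the standard deviation quantifies the statistical scatter of the ply property implied by the assumed constituent-level variability.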

  3. Effects of Instructional Design with Mental Model Analysis on Learning.

    ERIC Educational Resources Information Center

    Hong, Eunsook

    This paper presents a model for systematic instructional design that includes mental model analysis together with the procedures used in developing computer-based instructional materials in the area of statistical hypothesis testing. The instructional design model is based on the premise that the objective for learning is to achieve expert-like…

  4. Return to Our Roots: Raising Radishes to Teach Experimental Design. Methods and Techniques.

    ERIC Educational Resources Information Center

    Stallings, William M.

    1993-01-01

    Reviews research in teaching applied statistics. Concludes that students should analyze data from studies they have designed and conducted. Describes an activity in which students study germination and growth of radish seeds. Includes a table providing student instructions for both the experimental procedure and data analysis. (CFR)

  5. More Powerful Tests of Simple Interaction Contrasts in the Two-Way Factorial Design

    ERIC Educational Resources Information Center

    Hancock, Gregory R.; McNeish, Daniel M.

    2017-01-01

    For the two-way factorial design in analysis of variance, the current article explicates and compares three methods for controlling the Type I error rate for all possible simple interaction contrasts following a statistically significant interaction, including a proposed modification to the Bonferroni procedure that increases the power of…

  6. A Case Study to Examine Peer Grouping and Aspirant Selection. Professional File. Article 132, Fall 2013

    ERIC Educational Resources Information Center

    D'Allegro, Mary Lou; Zhou, Kai

    2013-01-01

    Peer selection based on the similarity of a couple of institutional parameters, by itself, is insufficient. Several other considerations, including clarity of purpose, alignment of institutional information to that purpose, identification of appropriate statistical procedures, review of preliminary peer sets, and the application of additional…

  7. A New Way for Antihelixplasty in Prominent Ear Surgery: Modified Postauricular Fascial Flap.

    PubMed

    Taş, Süleyman; Benlier, Erol

    2016-06-01

Otoplasty procedures aim to reduce the concha-mastoid angle and recreate the antihelical fold. Here, we describe the modified postauricular fascial flap, a new way of recreating the antihelical fold, and report the results of patients on whom this flap was used. The technique was used on 24 patients (10 females and 14 males; age, 6-27 years; mean, 16.7 years) between June 2009 and July 2012, for a total of 48 procedures (bilateral). Follow-up ranged from 1 to 3 years (mean, 1.5 years). At the preoperative and postoperative time points (1 and 12 months after surgery), all patients were measured for upper and middle helix-head distance and were photographed. The records were analyzed statistically using the t test and analysis of variance. The procedure resulted in ears that were natural in appearance without any significant visible evidence of surgery. The operations resulted in no complications except in 1 patient, who developed a small skin ulcer on the left ear because of band pressure. When we compared the preoperative and postoperative upper and middle helix-head distances, the difference was highly statistically significant. The modified postauricular fascial flap is a simple and safe procedure for recreating an antihelical fold. This procedure led to several benefits, including a natural-appearing antihelical fold, prevention of suture extrusion and granuloma, and minimized risk of recurrence due to neochondrogenesis. This method may be used as a standard procedure for treating prominent ears surgically.

  8. Statistical analysis of multivariate atmospheric variables. [cloud cover

    NASA Technical Reports Server (NTRS)

    Tubbs, J. D.

    1979-01-01

    Topics covered include: (1) estimation in discrete multivariate distributions; (2) a procedure to predict cloud cover frequencies in the bivariate case; (3) a program to compute conditional bivariate normal parameters; (4) the transformation of nonnormal multivariate to near-normal; (5) test of fit for the extreme value distribution based upon the generalized minimum chi-square; (6) test of fit for continuous distributions based upon the generalized minimum chi-square; (7) effect of correlated observations on confidence sets based upon chi-square statistics; and (8) generation of random variates from specified distributions.
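Item (3) of the list, conditional bivariate normal parameters, follows directly from the standard formulas; a minimal implementation (function name and example values are ours):

```python
import math

def conditional_bivariate_normal(mu1, mu2, sigma1, sigma2, rho, x2):
    """Parameters of X1 | X2 = x2 for a bivariate normal:
    mean  = mu1 + rho * (sigma1 / sigma2) * (x2 - mu2)
    stdev = sigma1 * sqrt(1 - rho^2)"""
    mean = mu1 + rho * (sigma1 / sigma2) * (x2 - mu2)
    sd = sigma1 * math.sqrt(1.0 - rho ** 2)
    return mean, sd

# Example: standard setup with sigma1 = 2, rho = 0.5, observed x2 = 1.
m, s = conditional_bivariate_normal(0.0, 0.0, 2.0, 1.0, 0.5, 1.0)
```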

  9. The compartment bag test (CBT) for enumerating fecal indicator bacteria: Basis for design and interpretation of results.

    PubMed

    Gronewold, Andrew D; Sobsey, Mark D; McMahan, Lanakila

    2017-06-01

    For the past several years, the compartment bag test (CBT) has been employed in water quality monitoring and public health protection around the world. To date, however, the statistical basis for the design and recommended procedures for enumerating fecal indicator bacteria (FIB) concentrations from CBT results have not been formally documented. Here, we provide that documentation following protocols for communicating the evolution of similar water quality testing procedures. We begin with an overview of the statistical theory behind the CBT, followed by a description of how that theory was applied to determine an optimal CBT design. We then provide recommendations for interpreting CBT results, including procedures for estimating quantiles of the FIB concentration probability distribution, and the confidence of compliance with recognized water quality guidelines. We synthesize these values in custom user-oriented 'look-up' tables similar to those developed for other FIB water quality testing methods. Modified versions of our tables are currently distributed commercially as part of the CBT testing kit. Published by Elsevier B.V.
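The statistical basis described here is the classic most-probable-number model: organisms Poisson-distributed across compartments of known volume, with the concentration estimated by maximum likelihood from the presence/absence pattern. The compartment volumes below are illustrative rather than necessarily the commercial kit's, and the grid-search MLE is our simplification of the documented procedure.

```python
import math

def mpn_estimate(volumes_ml, positives, grid=None):
    """Maximum-likelihood most-probable-number estimate of FIB
    concentration (per mL) from presence/absence results, assuming a
    Poisson distribution of organisms across compartments."""
    if grid is None:
        # log-spaced candidate concentrations, 1e-4 .. 1e2 per mL
        grid = [10 ** (k / 200.0) for k in range(-800, 401)]

    def log_lik(c):
        ll = 0.0
        for v, pos in zip(volumes_ml, positives):
            p_pos = 1.0 - math.exp(-c * v)           # P(compartment positive)
            ll += math.log(max(p_pos, 1e-300)) if pos else -c * v
        return ll

    return max(grid, key=log_lik)

# Hypothetical compartment volumes; only the two largest test positive.
volumes = [1.0, 3.0, 10.0, 30.0, 56.0]
c_hat = mpn_estimate(volumes, [False, False, False, True, True])
```

The published look-up tables the authors mention can be generated by running such an estimator (plus interval bounds) over every possible positive/negative pattern.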

  10. VARIABLE SELECTION FOR REGRESSION MODELS WITH MISSING DATA

    PubMed Central

    Garcia, Ramon I.; Ibrahim, Joseph G.; Zhu, Hongtu

    2009-01-01

    We consider the variable selection problem for a class of statistical models with missing data, including missing covariate and/or response data. We investigate the smoothly clipped absolute deviation penalty (SCAD) and adaptive LASSO and propose a unified model selection and estimation procedure for use in the presence of missing data. We develop a computationally attractive algorithm for simultaneously optimizing the penalized likelihood function and estimating the penalty parameters. Particularly, we propose to use a model selection criterion, called the ICQ statistic, for selecting the penalty parameters. We show that the variable selection procedure based on ICQ automatically and consistently selects the important covariates and leads to efficient estimates with oracle properties. The methodology is very general and can be applied to numerous situations involving missing data, from covariates missing at random in arbitrary regression models to nonignorably missing longitudinal responses and/or covariates. Simulations are given to demonstrate the methodology and examine the finite sample performance of the variable selection procedures. Melanoma data from a cancer clinical trial is presented to illustrate the proposed methodology. PMID:20336190
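A sketch of the adaptive LASSO step alone (without the missing-data machinery or the ICQ criterion, which are the paper's actual contributions): weight each coefficient's penalty by the inverse of its OLS estimate, which can be absorbed by rescaling the design matrix. All code below is our illustration on simulated complete data.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Plain coordinate-descent LASSO with soft-thresholding updates."""
    n, p = X.shape
    beta = np.zeros(p)
    col_ss = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ beta + X[:, j] * beta[j]   # partial residual
            z = X[:, j] @ r
            beta[j] = np.sign(z) * max(abs(z) - lam, 0.0) / col_ss[j]
    return beta

def adaptive_lasso(X, y, lam, gamma=1.0):
    """Adaptive LASSO: penalty weights 1/|beta_ols|^gamma, absorbed by
    rescaling the columns of X before a standard LASSO fit."""
    beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
    w = 1.0 / np.maximum(np.abs(beta_ols), 1e-8) ** gamma
    Xs = X / w                          # column j scaled by 1/w_j
    return lasso_cd(Xs, y, lam) / w     # undo the rescaling

rng = np.random.default_rng(4)
n, p = 200, 8
X = rng.normal(size=(n, p))
true_beta = np.array([3.0, -2.0, 0, 0, 0, 0, 0, 0])
y = X @ true_beta + rng.normal(size=n)
beta_hat = adaptive_lasso(X, y, lam=20.0)
```

Because noise coefficients get large penalty weights, they are driven exactly to zero while the strong coefficients are barely shrunk, which is the oracle-like behavior the paper builds on.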

  11. Precision of guided scanning procedures for full-arch digital impressions in vivo.

    PubMed

    Zimmermann, Moritz; Koller, Christina; Rumetsch, Moritz; Ender, Andreas; Mehl, Albert

    2017-11-01

System-specific scanning strategies have been shown to influence the accuracy of full-arch digital impressions. Special guided scanning procedures have been implemented for specific intraoral scanning systems with special regard to the digital orthodontic workflow. The aim of this study was to evaluate the precision of guided scanning procedures compared to conventional impression techniques in vivo. Two intraoral scanning systems with implemented full-arch guided scanning procedures (Cerec Omnicam Ortho; Ormco Lythos) were included along with one conventional impression technique with irreversible hydrocolloid material (alginate). Full-arch impressions were taken three times each from 5 participants (n = 15). Impressions were then compared within the test groups using a point-to-surface distance method after best-fit model matching (OraCheck). Precision was calculated using the (90%-10%)/2 quantile, and statistical analysis with one-way repeated measures ANOVA and post hoc Bonferroni test was performed. The conventional impression technique with alginate showed the lowest precision for full-arch impressions, at 162.2 ± 71.3 µm. Both guided scanning procedures performed statistically significantly better than the conventional impression technique (p < 0.05). Mean values were 74.5 ± 39.2 µm for the Cerec Omnicam Ortho group and 91.4 ± 48.8 µm for the Ormco Lythos group. The in vivo precision of guided scanning procedures exceeds that of conventional impression techniques with the irreversible hydrocolloid material alginate. Guided scanning procedures may be highly promising for clinical applications, especially for digital orthodontic workflows.
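The precision metric quoted, the (90%-10%)/2 quantile of point-to-surface distances, is straightforward to compute; simulated deviations stand in for the OraCheck output here.

```python
import numpy as np

def precision_90_10(distances_um):
    """(90th percentile - 10th percentile) / 2 of the point-to-surface
    distances: the precision metric used in the study."""
    d = np.asarray(distances_um, dtype=float)
    return (np.percentile(d, 90) - np.percentile(d, 10)) / 2.0

rng = np.random.default_rng(5)
# Hypothetical signed point-to-surface deviations between two scans, in µm.
distances = rng.normal(0.0, 100.0, 5000)
prec = precision_90_10(distances)
```

Unlike a standard deviation, this trimmed-range metric ignores the extreme 10% tails on either side, making it robust to outlier points from matching artifacts.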

  12. A Low-Cost Method for Multiple Disease Prediction.

    PubMed

    Bayati, Mohsen; Bhaskar, Sonia; Montanari, Andrea

Recently, in response to the rising costs of healthcare services, employers that are financially responsible for the healthcare costs of their workforce have been investing in health improvement programs for their employees. A main objective of these so-called "wellness programs" is to reduce the incidence of chronic illnesses such as cardiovascular disease, cancer, diabetes, and obesity, with the goal of reducing future medical costs. The majority of these wellness programs include an annual screening to detect individuals with the highest risk of developing chronic disease. Once these individuals are identified, the company can invest in interventions to reduce the risk of those individuals. However, capturing many biomarkers per employee makes the screening procedure costly. We propose a statistical data-driven method to address this challenge by minimizing the number of biomarkers in the screening procedure while maximizing the predictive power over a broad spectrum of diseases. Our solution uses multi-task learning and group dimensionality reduction from machine learning and statistics. We provide empirical validation of the proposed solution using data from two different electronic medical records systems, with comparisons to a statistical benchmark.

  13. Estimating multilevel logistic regression models when the number of clusters is low: a comparison of different statistical software procedures.

    PubMed

    Austin, Peter C

    2010-04-22

    Multilevel logistic regression models are increasingly being used to analyze clustered data in medical, public health, epidemiological, and educational research. Procedures for estimating the parameters of such models are available in many statistical software packages. There is currently little evidence on the minimum number of clusters necessary to reliably fit multilevel regression models. We conducted a Monte Carlo study to compare the performance of different statistical software procedures for estimating multilevel logistic regression models when the number of clusters was low. We examined procedures available in BUGS, HLM, R, SAS, and Stata. We found that there were qualitative differences in the performance of different software procedures for estimating multilevel logistic models when the number of clusters was low. Among the likelihood-based procedures, estimation methods based on adaptive Gauss-Hermite approximations to the likelihood (glmer in R and xtlogit in Stata) or adaptive Gaussian quadrature (Proc NLMIXED in SAS) tended to have superior performance for estimating variance components when the number of clusters was small, compared to software procedures based on penalized quasi-likelihood. However, only Bayesian estimation with BUGS allowed for accurate estimation of variance components when there were fewer than 10 clusters. For all statistical software procedures, estimation of variance components tended to be poor when there were only five subjects per cluster, regardless of the number of clusters.
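The adaptive Gauss-Hermite approximation mentioned for glmer and Proc NLMIXED integrates each cluster's likelihood over the random intercept. Below is a non-adaptive Gauss-Hermite version (simpler, same idea) for one cluster of a random-intercept logistic model; the data and parameter values are invented for illustration.

```python
import numpy as np

def cluster_marginal_lik(y, x, beta, sigma_b, n_nodes=30):
    """Marginal likelihood of one cluster's binary outcomes in a
    random-intercept logistic model, integrating out b ~ N(0, sigma_b^2)
    by Gauss-Hermite quadrature (non-adaptive, for brevity; glmer and
    NLMIXED center/rescale the nodes adaptively per cluster)."""
    nodes, weights = np.polynomial.hermite.hermgauss(n_nodes)
    total = 0.0
    for t, w in zip(nodes, weights):
        b = np.sqrt(2.0) * sigma_b * t           # change of variables
        eta = x * beta + b                       # linear predictor
        p = 1.0 / (1.0 + np.exp(-eta))
        lik = np.prod(np.where(y == 1, p, 1.0 - p))
        total += w * lik
    return total / np.sqrt(np.pi)

y = np.array([1, 0, 1, 1])                       # one cluster's outcomes
x = np.array([0.2, -1.0, 0.5, 1.3])              # one covariate
L = cluster_marginal_lik(y, x, beta=0.8, sigma_b=1.0)
```

Maximizing the product of such cluster likelihoods over `beta` and `sigma_b` is exactly the estimation problem whose software implementations the study benchmarks; with few clusters, the variance component `sigma_b` is the hard part.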

  14. Applying a statistical PTB detection procedure to complement the gold standard.

    PubMed

    Noor, Norliza Mohd; Yunus, Ashari; Bakar, S A R Abu; Hussin, Amran; Rijal, Omar Mohd

    2011-04-01

    This paper investigates a novel statistical discrimination procedure to detect PTB when the gold standard requirement is taken into consideration. Archived data were used to establish two groups of patients which are the control and test group. The control group was used to develop the statistical discrimination procedure using four vectors of wavelet coefficients as feature vectors for the detection of pulmonary tuberculosis (PTB), lung cancer (LC), and normal lung (NL). This discrimination procedure was investigated using the test group where the number of sputum positive and sputum negative cases that were correctly classified as PTB cases were noted. The proposed statistical discrimination method is able to detect PTB patients and LC with high true positive fraction. The method is also able to detect PTB patients that are sputum negative and therefore may be used as a complement to the gold standard. Copyright © 2010 Elsevier Ltd. All rights reserved.

  15. Clinical skills temporal degradation assessment in undergraduate medical education.

    PubMed

    Fisher, Joseph; Viscusi, Rebecca; Ratesic, Adam; Johnstone, Cameron; Kelley, Ross; Tegethoff, Angela M; Bates, Jessica; Situ-Lacasse, Elaine H; Adamas-Rappaport, William J; Amini, Richard

    2018-01-01

Medical students' ability to learn clinical procedures and competently apply these skills is an essential component of medical education. Complex skills with limited opportunity for practice have been shown to degrade without continued refresher training. To our knowledge, there is no evidence that objectively evaluates temporal degradation of clinical skills in undergraduate medical education. The purpose of this study was to evaluate temporal retention of clinical skills among third-year medical students. This was a cross-sectional study conducted at four separate time intervals in the cadaver laboratory at a public medical school. Forty-five novice third-year medical students were evaluated for retention of skills in the following three procedures: pigtail thoracostomy, femoral line placement, and endotracheal intubation. Prior to the start of third-year medical clerkships, medical students participated in a two-hour didactic session designed to teach clinically relevant materials including the procedures. Prior to the start of their respective surgery clerkships, students were asked to perform the same three procedures and were evaluated by trained emergency medicine and surgery faculty for retention rates, using three validated checklists. Students were then reassessed at six-week intervals in four separate groups based on the start date of their respective surgical clerkships. We compared the evaluation results between students tested one week after training and those tested at three later dates for statistically significant differences in score distribution using a one-tailed Wilcoxon Mann-Whitney U-test for non-parametric rank-sum analysis. Retention rates were shown to have a statistically significant decline between six and 12 weeks for all three procedural skills. In the instruction of medical students, skill degradation should be considered when teaching complex technical skills. Based on the statistically significant decline in procedural skills noted in our investigation, instructors should consider administering a refresher course between six and 12 weeks from initial training.
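The one-tailed Wilcoxon Mann-Whitney comparison the authors describe can be reproduced on hypothetical checklist scores (the numbers below are invented for illustration, not the study's data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
# Hypothetical procedural checklist scores: one week vs. 12 weeks
# after training, 45 students per group.
week1 = rng.normal(17.0, 2.0, 45).round()
week12 = rng.normal(14.0, 2.5, 45).round()

# One-sided rank-sum test of the degradation hypothesis:
# week-12 scores tend to be LOWER than week-1 scores.
u_stat, p_value = stats.mannwhitneyu(week12, week1, alternative='less')
```

The rank-sum test is appropriate here because checklist scores are bounded, discrete, and often non-normal, which is presumably why the authors chose a non-parametric comparison.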

  16. The effect of vasodilatory medications on radial artery spasm in patients undergoing transradial coronary artery procedures: a systematic review.

    PubMed

    Curtis, Elizabeth; Fernandez, Ritin; Lee, Astin

    2017-07-01

    The uptake of percutaneous coronary procedures via the radial artery has increased internationally due to the decreased risk of complications and increased patient satisfaction. The increased susceptibility of the radial artery to spasm however presents a potential risk for procedural failure. Although most experts agree on the need for prophylactic medications to reduce radial artery spasm, currently there is inconsistency in literature regarding the most effective vasodilatory medication or combination of medications. The objective of this study is to identify the effectiveness of vasodilatory medications on radial artery spasm in patients undergoing transradial coronary artery procedures. This review considered studies that included participants aged 18 years and over undergoing non-emergent transradial percutaneous coronary artery procedures. This review considered studies that used vasodilating intravenous and intra-arterial medications or combinations of medications prior to commencing and during transradial coronary approaches to reduce radial artery spasm. The outcomes of interest were the incidence of radial artery spasm during percutaneous coronary procedure using objective and/or subjective measures and its effect on the successful completion of the procedure. Randomized controlled trials published in the English language between 1989 to date were considered for inclusion. The search strategy aimed to find both published and unpublished studies. A three-step search strategy was utilized in this review. An initial search of MEDLINE, CINAHL and Scopus was undertaken, followed by a search for unpublished studies. Papers selected for retrieval were assessed by two independent reviewers for methodological validity prior to inclusion in the review using standardized critical appraisal instruments. Any disagreements that arose between the reviewers were resolved through discussion. 
Quantitative data were extracted from papers included in the review using the standardized data extraction tool from RevMan5 (Copenhagen: The Nordic Cochrane Centre, Cochrane). Quantitative data, where possible, were pooled in statistical meta-analysis using RevMan5. All results were subject to double data entry. Effect sizes were expressed as risk ratios (for categorical data) and weighted mean differences (for continuous data), and their 95% confidence intervals were calculated for analysis. Nine trials involving 3614 patients were included in the final review. Pooled data involving 992 patients on the effect of calcium channel blockers demonstrated a statistically significant reduction in the incidence of vasospasm in patients who received verapamil 5 mg compared to those who received a placebo (OR 0.33; 95% CI 0.19, 0.58). Similarly, patients who received verapamil 2.5 mg or 1.25 mg had significantly fewer incidences of vasospasm when compared to those who received a placebo. Nitroglycerine 100 mcg was demonstrated to be associated with a statistically significant reduction in the incidence of vasospasm. The evidence demonstrates a benefit in the use of vasodilatory medications for the reduction of vasospasm in patients having radial coronary procedures. Further large-scale multi-center trials are needed to determine the preferred medication.
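An effect size of the kind reported here (OR 0.33; 95% CI 0.19, 0.58) is computed from a 2x2 table of events by treatment arm. A minimal sketch using the standard Woolf (log-OR delta method) interval; the counts below are hypothetical, not the review's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and approximate 95% CI for a 2x2 table.

    a = events in treatment, b = non-events in treatment,
    c = events in control,   d = non-events in control.
    """
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) via the Woolf (delta) method.
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 12/250 spasms on active drug vs 30/240 on placebo.
or_, lo, hi = odds_ratio_ci(12, 238, 30, 210)
```

An OR below 1 with an upper confidence bound below 1, as in the verapamil result above, indicates a statistically significant reduction in spasm.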

  17. Consensus building for interlaboratory studies, key comparisons, and meta-analysis

    NASA Astrophysics Data System (ADS)

    Koepke, Amanda; Lafarge, Thomas; Possolo, Antonio; Toman, Blaza

    2017-06-01

    Interlaboratory studies in measurement science, including key comparisons, and meta-analyses in several fields, including medicine, serve to intercompare measurement results obtained independently, and typically produce a consensus value for the common measurand that blends the values measured by the participants. Since interlaboratory studies and meta-analyses reveal and quantify differences between measured values, regardless of the underlying causes for such differences, they also provide so-called ‘top-down’ evaluations of measurement uncertainty. Measured values are often substantially over-dispersed by comparison with their individual, stated uncertainties, thus suggesting the existence of yet unrecognized sources of uncertainty (dark uncertainty). We contrast two different approaches to take dark uncertainty into account both in the computation of consensus values and in the evaluation of the associated uncertainty, which have traditionally been preferred by different scientific communities. One inflates the stated uncertainties by a multiplicative factor. The other adds laboratory-specific ‘effects’ to the value of the measurand. After distinguishing what we call recipe-based and model-based approaches to data reductions in interlaboratory studies, we state six guiding principles that should inform such reductions. These principles favor model-based approaches that expose and facilitate the critical assessment of validating assumptions, and give preeminence to substantive criteria to determine which measurement results to include, and which to exclude, as opposed to purely statistical considerations, and also how to weigh them. 
Following an overview of maximum likelihood methods, three general purpose procedures for data reduction are described in detail, including explanations of how the consensus value and degrees of equivalence are computed, and the associated uncertainty evaluated: the DerSimonian-Laird procedure; a hierarchical Bayesian procedure; and the Linear Pool. These three procedures have been implemented and made widely accessible in a Web-based application (NIST Consensus Builder). We illustrate principles, statistical models, and data reduction procedures in four examples: (i) the measurement of the Newtonian constant of gravitation; (ii) the measurement of the half-lives of radioactive isotopes of caesium and strontium; (iii) the comparison of two alternative treatments for carotid artery stenosis; and (iv) a key comparison where the measurand was the calibration factor of a radio-frequency power sensor.
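Of the three procedures named, the DerSimonian-Laird estimator is compact enough to sketch. The following is an illustrative implementation of the standard moment estimator, not the NIST Consensus Builder itself:

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects consensus value via the DerSimonian-Laird estimator.

    effects: per-study (or per-laboratory) estimates; variances: their squared
    standard uncertainties. Returns (pooled estimate, its standard uncertainty,
    tau2), where tau2 is the between-study variance -- the 'dark uncertainty'
    component discussed above.
    """
    w = [1.0 / v for v in variances]
    sw = sum(w)
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sw
    # Cochran's Q and the moment estimate of tau^2.
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    k = len(effects)
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)
    # Re-weight with the between-study variance added in.
    w_star = [1.0 / (v + tau2) for v in variances]
    mu = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return mu, se, tau2
```

When the measured values are consistent with their stated uncertainties, tau2 collapses to zero and the result reduces to the fixed-effect weighted mean; over-dispersed inputs inflate both tau2 and the consensus uncertainty.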

  18. Statistical prediction with Kanerva's sparse distributed memory

    NASA Technical Reports Server (NTRS)

    Rogers, David

    1989-01-01

    A new viewpoint of the processing performed by Kanerva's sparse distributed memory (SDM) is presented. In conditions of near- or over-capacity, where the associative-memory behavior of the model breaks down, the processing performed by the model can be interpreted as that of a statistical predictor. Mathematical results are presented which serve as the framework for a new statistical viewpoint of sparse distributed memory, of which the standard formulation of SDM is a special case. This viewpoint suggests possible enhancements to the SDM model, including a procedure for improving the predictiveness of the system based on Holland's work with genetic algorithms, and a method for improving the capacity of SDM even when used as an associative memory.
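The SDM mechanism the abstract refers to can be sketched minimally: random binary hard locations, Hamming-radius activation, and counter-based storage. Parameter values here are illustrative assumptions, not Kanerva's:

```python
import random

class SparseDistributedMemory:
    """A minimal sketch of Kanerva's SDM (binary addresses, counter storage).

    n: address/word length in bits; m: number of hard locations;
    radius: Hamming-distance activation threshold. These defaults are
    illustrative, not tuned for capacity.
    """
    def __init__(self, n=64, m=200, radius=28, seed=0):
        rng = random.Random(seed)
        self.n, self.radius = n, radius
        self.addresses = [[rng.randint(0, 1) for _ in range(n)] for _ in range(m)]
        self.counters = [[0] * n for _ in range(m)]

    def _active(self, addr):
        # Hard locations within the activation radius of the cue.
        return [i for i, a in enumerate(self.addresses)
                if sum(x != y for x, y in zip(a, addr)) <= self.radius]

    def write(self, addr, word):
        # Increment counters for 1-bits, decrement for 0-bits.
        for i in self._active(addr):
            for j, bit in enumerate(word):
                self.counters[i][j] += 1 if bit else -1

    def read(self, addr):
        # Sum counters over active locations; threshold each bit at zero.
        sums = [0] * self.n
        for i in self._active(addr):
            for j in range(self.n):
                sums[j] += self.counters[i][j]
        return [1 if s > 0 else 0 for s in sums]
```

Reading an address after many interfering writes yields, per bit, the majority outcome of the stored data -- the statistical-predictor behavior the paper analyzes in the over-capacity regime.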

  19. Computing Inter-Rater Reliability for Observational Data: An Overview and Tutorial

    PubMed Central

    Hallgren, Kevin A.

    2012-01-01

    Many research designs require the assessment of inter-rater reliability (IRR) to demonstrate consistency among observational ratings provided by multiple coders. However, many studies use incorrect statistical procedures, fail to fully report the information necessary to interpret their results, or do not address how IRR affects the power of their subsequent analyses for hypothesis testing. This paper provides an overview of methodological issues related to the assessment of IRR with a focus on study design, selection of appropriate statistics, and the computation, interpretation, and reporting of some commonly-used IRR statistics. Computational examples include SPSS and R syntax for computing Cohen’s kappa and intra-class correlations to assess IRR. PMID:22833776
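The paper's computational examples use SPSS and R; the arithmetic behind Cohen's kappa itself is simple enough to sketch directly with a hypothetical two-rater coding task:

```python
def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters assigning nominal codes to the same items:
    observed agreement corrected for agreement expected by chance."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    categories = set(rater1) | set(rater2)
    p_obs = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement from each rater's marginal category proportions.
    p_exp = sum((rater1.count(c) / n) * (rater2.count(c) / n) for c in categories)
    return (p_obs - p_exp) / (1 - p_exp)
```

Kappa is 1 for perfect agreement and 0 when agreement is no better than chance, which is why it is preferred over raw percent agreement for nominal codes.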

  20. Statistical procedures for analyzing mental health services data.

    PubMed

    Elhai, Jon D; Calhoun, Patrick S; Ford, Julian D

    2008-08-15

    In mental health services research, analyzing service utilization data often poses serious problems, given the presence of substantially skewed data distributions. This article presents a non-technical introduction to statistical methods specifically designed to handle the complexly distributed datasets that represent mental health service use, including Poisson, negative binomial, zero-inflated, and zero-truncated regression models. A flowchart is provided to assist the investigator in selecting the most appropriate method. Finally, a dataset of mental health service use reported by medical patients is described, and a comparison of results across several different statistical methods is presented. Implications of matching data analytic techniques appropriately with the often complexly distributed datasets of mental health services utilization variables are discussed.
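One quick way to see why zero-inflated models arise with service-use counts is to compare the observed share of zeros against what a plain Poisson fit to the same mean would predict. This is a rough screening heuristic of my own construction, not one of the article's formal procedures:

```python
import math

def poisson_zero_check(counts):
    """Compare the observed proportion of zeros with the proportion a Poisson
    model fitted to the sample mean would predict. A large excess of observed
    zeros suggests considering a zero-inflated (or hurdle) model."""
    mean = sum(counts) / len(counts)
    expected_zero = math.exp(-mean)          # P(Y = 0) under Poisson(mean)
    observed_zero = counts.count(0) / len(counts)
    return observed_zero, expected_zero
```

For a hypothetical sample in which 70% of patients report no service use, the observed zero fraction (0.70) clearly exceeds the Poisson prediction, pointing toward the zero-inflated branch of the selection flowchart.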

  1. Philosophers assess randomized clinical trials: the need for dialogue.

    PubMed

    Miké, V

    1989-09-01

    In recent years a growing number of professional philosophers have joined in the controversy over ethical aspects of randomized clinical trials (RCTs). Morally questionable in their utilitarian approach, RCTs are claimed by some to be in direct violation of the second form of Kant's Categorical Imperative. But the arguments used in these critiques at times derive from a lack of insight into basic statistical procedures and the realities of the biomedical research process. Presented to physicians and other nonspecialists, including the lay public, such distortions can be harmful. Given the great complexity of statistical methodology and the anomalous nature of concepts of evidence, more sustained input into the interdisciplinary dialogue is needed from the statistical profession.

  2. Two Paradoxes in Linear Regression Analysis

    PubMed Central

    FENG, Ge; PENG, Jing; TU, Dongke; ZHENG, Julia Z.; FENG, Changyong

    2016-01-01

    Summary Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection. PMID:28638214

  3. Uncertainty Analysis for DAM Projects.

    DTIC Science & Technology

    1987-09-01

    …overwhelming majority of articles published on the use of statistical methodology for geotechnical engineering focus on performance predictions and design… Results of the present study do not support the adoption of more esoteric statistical procedures except on a special-case basis or in research… influence that recommended statistical procedures might have had on the Carters Project, had they been applied during planning and design phases

  4. Load research manual. Volume 2. Fundamentals of implementing load research procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brandenburg, L.; Clarkson, G.; Grund, Jr., C.

    This three-volume manual presents technical guidelines for electric utility load research. Special attention is given to issues raised by the load data reporting requirements of the Public Utility Regulatory Policies Act of 1978 and to problems faced by smaller utilities that are initiating load research programs. In Volumes 1 and 2, procedures are suggested for determining data requirements for load research, establishing the size and customer composition of a load survey sample, selecting and using equipment to record customer electricity usage, processing data tapes from the recording equipment, and analyzing the data. Statistical techniques used in customer sampling are discussed in detail. The costs of load research also are estimated, and ongoing load research programs at three utilities are described. The manual includes guides to load research literature and glossaries of load research and statistical terms.

  5. Statistical analysis of water-quality data containing multiple detection limits: S-language software for regression on order statistics

    USGS Publications Warehouse

    Lee, L.; Helsel, D.

    2005-01-01

    Trace contaminants in water, including metals and organics, often are measured at sufficiently low concentrations to be reported only as values below the instrument detection limit. Interpretation of these "less thans" is complicated when multiple detection limits occur. Statistical methods for multiply censored, or multiple-detection limit, datasets have been developed for medical and industrial statistics, and can be employed to estimate summary statistics or model the distributions of trace-level environmental data. We describe S-language-based software tools that perform robust linear regression on order statistics (ROS). The ROS method has been evaluated as one of the most reliable procedures for developing summary statistics of multiply censored data. It is applicable to any dataset that has 0 to 80% of its values censored. These tools are a part of a software library, or add-on package, for the R environment for statistical computing. This library can be used to generate ROS models and associated summary statistics, plot modeled distributions, and predict exceedance probabilities of water-quality standards. © 2005 Elsevier Ltd. All rights reserved.

  6. Testing homogeneity of proportion ratios for stratified correlated bilateral data in two-arm randomized clinical trials.

    PubMed

    Pei, Yanbo; Tian, Guo-Liang; Tang, Man-Lai

    2014-11-10

    Stratified data analysis is an important research topic in many biomedical studies and clinical trials. In this article, we develop five test statistics for testing the homogeneity of proportion ratios for stratified correlated bilateral binary data based on an equal correlation model assumption. Bootstrap procedures based on these test statistics are also considered. To evaluate the performance of these statistics and procedures, we conduct Monte Carlo simulations to study their empirical sizes and powers under various scenarios. Our results suggest that the procedure based on score statistic performs well generally and is highly recommended. When the sample size is large, procedures based on the commonly used weighted least square estimate and logarithmic transformation with Mantel-Haenszel estimate are recommended as they do not involve any computation of maximum likelihood estimates requiring iterative algorithms. We also derive approximate sample size formulas based on the recommended test procedures. Finally, we apply the proposed methods to analyze a multi-center randomized clinical trial for scleroderma patients. Copyright © 2014 John Wiley & Sons, Ltd.

  7. Statistical Reform in School Psychology Research: A Synthesis

    ERIC Educational Resources Information Center

    Swaminathan, Hariharan; Rogers, H. Jane

    2007-01-01

    Statistical reform in school psychology research is discussed in terms of research designs, measurement issues, statistical modeling and analysis procedures, interpretation and reporting of statistical results, and finally statistics education.

  8. A closer look at diagnosis in clinical dental practice: part 1. Reliability, validity, specificity and sensitivity of diagnostic procedures.

    PubMed

    Pretty, Iain A; Maupomé, Gerardo

    2004-04-01

    Dentists are involved in diagnosing disease in every aspect of their clinical practice. A range of tests, systems, guides and equipment--which can be generally referred to as diagnostic procedures--are available to aid in diagnostic decision making. In this era of evidence-based dentistry, and given the increasing demand for diagnostic accuracy and properly targeted health care, it is important to assess the value of such diagnostic procedures. Doing so allows dentists to weight appropriately the information these procedures supply, to purchase new equipment if it proves more reliable than existing equipment or even to discard a commonly used procedure if it is shown to be unreliable. This article, the first in a 6-part series, defines several concepts used to express the usefulness of diagnostic procedures, including reliability and validity, and describes some of their operating characteristics (statistical measures of performance), in particular, specificity and sensitivity. Subsequent articles in the series will discuss the value of diagnostic procedures used in daily dental practice and will compare today's most innovative procedures with established methods.

  9. Empirical-statistical downscaling of reanalysis data to high-resolution air temperature and specific humidity above a glacier surface (Cordillera Blanca, Peru)

    NASA Astrophysics Data System (ADS)

    Hofer, Marlis; MöLg, Thomas; Marzeion, Ben; Kaser, Georg

    2010-06-01

    Recently initiated observation networks in the Cordillera Blanca (Peru) provide temporally high-resolution, yet short-term, atmospheric data. The aim of this study is to extend the existing time series into the past. We present an empirical-statistical downscaling (ESD) model that links 6-hourly National Centers for Environmental Prediction (NCEP)/National Center for Atmospheric Research (NCAR) reanalysis data to air temperature and specific humidity, measured at the tropical glacier Artesonraju (northern Cordillera Blanca). The ESD modeling procedure includes combined empirical orthogonal function and multiple regression analyses and a double cross-validation scheme for model evaluation. Apart from the selection of predictor fields, the modeling procedure is automated and does not include subjective choices. We assess the ESD model sensitivity to the predictor choice using both single-field and mixed-field predictors. Statistical transfer functions are derived individually for different months and times of day. The forecast skill largely depends on month and time of day, ranging from 0 to 0.8. The mixed-field predictors perform better than the single-field predictors. The ESD model shows added value, at all time scales, against simpler reference models (e.g., the direct use of reanalysis grid point values). The ESD model forecast 1960-2008 clearly reflects interannual variability related to the El Niño/Southern Oscillation but is sensitive to the chosen predictor type.

  10. Compendium of Methods for Applying Measured Data to Vibration and Acoustic Problems

    DTIC Science & Technology

    1985-10-01

    statistical energy analysis, finite element models, transfer function… Procedures for the Modal Analysis Method… 8.4 Summary of the Procedures for the Statistical Energy Analysis Method… statistical energy analysis.

  11. On Improving the Experiment Methodology in Pedagogical Research

    ERIC Educational Resources Information Center

    Horakova, Tereza; Houska, Milan

    2014-01-01

    The paper shows how the methodology for a pedagogical experiment can be improved through including the pre-research stage. If the experiment has the form of a test procedure, an improvement of methodology can be achieved using for example the methods of statistical and didactic analysis of tests which are traditionally used in other areas, i.e.…

  12. A Survey of the Practices, Procedures, and Techniques in Undergraduate Organic Chemistry Teaching Laboratories

    ERIC Educational Resources Information Center

    Martin, Christopher B.; Schmidt, Monica; Soniat, Michael

    2011-01-01

    A survey was conducted of four-year institutions that teach undergraduate organic chemistry laboratories in the United States. The data include results from over 130 schools, describe the current practices at these institutions, and cover statistical results such as the scale of the laboratories performed, the chemical techniques applied,…

  13. Optimal Sample Size Determinations for the Heteroscedastic Two One-Sided Tests of Mean Equivalence: Design Schemes and Software Implementations

    ERIC Educational Resources Information Center

    Jan, Show-Li; Shieh, Gwowen

    2017-01-01

    Equivalence assessment is becoming an increasingly important topic in many application areas including behavioral and social sciences research. Although there exist more powerful tests, the two one-sided tests (TOST) procedure is a technically transparent and widely accepted method for establishing statistical equivalence. Alternatively, a direct…
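The TOST logic is transparent enough to sketch. This version uses a normal (z) approximation because the standard library lacks a t-distribution CDF; the paper's procedures use t-based tests and exact power calculations:

```python
from statistics import NormalDist

def tost_z(mean_diff, se, lower, upper):
    """Two one-sided tests (TOST) for mean equivalence, normal approximation.

    Equivalence is declared only when both one-sided tests reject: the
    difference is significantly above the lower bound AND significantly below
    the upper bound. Returns the larger of the two one-sided p-values, which
    governs the overall decision."""
    z_lower = (mean_diff - lower) / se    # H0: difference <= lower bound
    z_upper = (mean_diff - upper) / se    # H0: difference >= upper bound
    p_lower = 1 - NormalDist().cdf(z_lower)
    p_upper = NormalDist().cdf(z_upper)
    return max(p_lower, p_upper)
```

Taking the maximum of the two p-values is what makes TOST technically transparent: a single number below alpha certifies that the difference lies inside the equivalence bounds.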

  14. Suggestions for Teaching Mathematics Using Laboratory Approaches. 6. Probability. Experimental Edition.

    ERIC Educational Resources Information Center

    New York State Education Dept., Albany. Bureau of Elementary Curriculum Development.

    This guide is the sixth in a series of publications to assist teachers in using a laboratory approach to mathematics. Twenty activities on probability and statistics for the elementary grades are described in terms of purpose, materials needed, and procedures to be used. Objectives of these activities include basic probability concepts; gathering,…

  15. Development of QC Procedures for Ocean Data Obtained by National Research Projects of Korea

    NASA Astrophysics Data System (ADS)

    Kim, S. D.; Park, H. M.

    2017-12-01

    To establish a data management system for ocean data obtained by national research projects of the Ministry of Oceans and Fisheries of Korea, KIOST conducted standardization and development of QC procedures. After reviewing and analyzing the existing international and domestic ocean-data standards and QC procedures, draft versions of the standards and QC procedures were prepared. The proposed standards and QC procedures were reviewed and revised by experts in the field of oceanography and academic societies several times. A technical report was produced on the standards for 25 data items and 12 QC procedures covering physical, chemical, biological and geological data items. The QC procedure for temperature and salinity data was set up by referring to the manuals published by GTSPP, ARGO and IOOS QARTOD. It consists of 16 QC tests applicable to vertical profile data and time series data obtained in real-time mode and delay mode. Three regional range tests to inspect annual, seasonal and monthly variations were included in the procedure. Three programs were developed to calculate and provide upper and lower limits of temperature and salinity at depths from 0 to 1550 m. TS data from the World Ocean Database, ARGO, GTSPP and in-house data of KIOST were analysed statistically to calculate regional limits for the Northwest Pacific area. Based on this statistical analysis, the programs calculate regional ranges using the mean and standard deviation on three grid systems (3° grid, 1° grid and 0.5° grid) and provide recommendations. The QC procedures for the 12 data items were set up during the 1st phase of the national program for data management (2012-2015) and are being applied to national research projects in the 2nd phase (2016-2019). The QC procedures will be revised by reviewing the results of QC application when the 2nd phase of the data management program is completed.
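A regional range test of the kind described reduces, in essence, to checking a value against climatological bounds for its grid cell. A minimal sketch with an assumed multiplier k and illustrative reference values (the actual procedure derives per-cell limits from World Ocean Database, ARGO and GTSPP statistics):

```python
from statistics import mean, stdev

def regional_range_test(value, reference_values, k=3.0):
    """Flag a temperature or salinity value that falls outside
    mean +/- k * standard deviation of a regional reference sample.
    Returns 'pass' or 'fail' in the spirit of QARTOD-style flags;
    k = 3 is an illustrative choice, not the KIOST setting."""
    m, s = mean(reference_values), stdev(reference_values)
    lower, upper = m - k * s, m + k * s
    return "pass" if lower <= value <= upper else "fail"
```

In the real procedure the reference sample, and hence the limits, would change with grid resolution (3°, 1°, 0.5°) and with the annual, seasonal or monthly variant of the test.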

  16. Decision rules for unbiased inventory estimates

    NASA Technical Reports Server (NTRS)

    Argentiero, P. D.; Koch, D.

    1979-01-01

    An efficient and accurate procedure for estimating inventories from remote sensing scenes is presented. In place of the conventional and expensive full dimensional Bayes decision rule, a one-dimensional feature extraction and classification technique was employed. It is shown that this efficient decision rule can be used to develop unbiased inventory estimates and that for large sample sizes typical of satellite derived remote sensing scenes, resulting accuracies are comparable or superior to more expensive alternative procedures. Mathematical details of the procedure are provided in the body of the report and in the appendix. Results of a numerical simulation of the technique using statistics obtained from an observed LANDSAT scene are included. The simulation demonstrates the effectiveness of the technique in computing accurate inventory estimates.

  17. Utilizing formative evaluation to enhance the understanding of chemistry and the methods and procedures of science

    NASA Astrophysics Data System (ADS)

    Pizzini, Edward L.; Treagust, David F.; Cody, John

    The purpose of this study was to determine whether or not formative evaluation could facilitate goal attainment in a biochemistry course and produce desired learning outcomes consistently by altering course materials and/or instruction. Formative evaluation procedures included the administration of the Inorganic-Organic-Biological Chemistry Test Form 1974 and the Methods and Procedures of Science test to course participants over three consecutive years. A one-group pretest-posttest design was used. The statistical analysis involved the use of the Wilcoxon matched-pairs signed-ranks test. The study involved 64 participants. The findings indicate that the use of formative evaluation can be effective in producing desired learning outcomes to facilitate goal attainment.

  18. A demonstration of the application of the new paradigm for the evaluation of forensic evidence under conditions reflecting those of a real forensic-voice-comparison case.

    PubMed

    Enzinger, Ewald; Morrison, Geoffrey Stewart; Ochoa, Felipe

    2016-01-01

    The new paradigm for the evaluation of the strength of forensic evidence includes: The use of the likelihood-ratio framework. The use of relevant data, quantitative measurements, and statistical models. Empirical testing of validity and reliability under conditions reflecting those of the case under investigation. Transparency as to decisions made and procedures employed. The present paper illustrates the use of the new paradigm to evaluate strength of evidence under conditions reflecting those of a real forensic-voice-comparison case. The offender recording was from a landline telephone system, had background office noise, and was saved in a compressed format. The suspect recording included substantial reverberation and ventilation system noise, and was saved in a different compressed format. The present paper includes descriptions of the selection of the relevant hypotheses, sampling of data from the relevant population, simulation of suspect and offender recording conditions, and acoustic measurement and statistical modelling procedures. The present paper also explores the use of different techniques to compensate for the mismatch in recording conditions. It also examines how system performance would have differed had the suspect recording been of better quality. Copyright © 2015 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.

  19. Precipitate statistics in an Al-Mg-Si-Cu alloy from scanning precession electron diffraction data

    NASA Astrophysics Data System (ADS)

    Sunde, J. K.; Paulsen, Ø.; Wenner, S.; Holmestad, R.

    2017-09-01

    The key microstructural feature providing strength to age-hardenable Al alloys is nanoscale precipitates. Alloy development requires a reliable statistical assessment of these precipitates, in order to link the microstructure with material properties. Here, it is demonstrated that scanning precession electron diffraction combined with computational analysis enables the semi-automated extraction of precipitate statistics in an Al-Mg-Si-Cu alloy. Among the main findings is the precipitate number density, which agrees well with a conventional method based on manual counting and measurements. By virtue of the objectivity of its data analysis, our methodology is therefore seen as an advantageous alternative to existing routines, offering reproducibility and efficiency in alloy statistics. Additional results include improved qualitative information on phase distributions. The developed procedure is generic and applicable to any material containing nanoscale precipitates.

  20. BTS statistical standards manual

    DOT National Transportation Integrated Search

    2005-10-01

    The Bureau of Transportation Statistics (BTS), like other federal statistical agencies, establishes professional standards to guide the methods and procedures for the collection, processing, storage, and presentation of statistical data. Standards an...

  1. Is it safe to combine abdominoplasty with elective breast surgery? A review of 151 consecutive cases.

    PubMed

    Stevens, W Grant; Cohen, Robert; Vath, Steven D; Stoker, David A; Hirsch, Elliot M

    2006-07-01

    This study was designed to evaluate and compare the complication rates of patients having abdominoplasty without breast surgery with the rates of those having abdominoplasty with various types of elective breast surgery, including breast augmentation, breast reduction, mastopexy, and mastopexy combined with simultaneous augmentation. The data collected represent a retrospective chart review of consecutive abdominoplasty procedures performed at a single outpatient facility by the senior surgeon (W.G.S.) over a 15-year period (1989 to 2004). Two groups were compared: patients who underwent abdominoplasty without breast surgery and those who had abdominoplasty with breast surgery. The second group was subdivided by the various types of breast procedures noted above. The minor complications assessed included seromas, hematomas, infections, and small (<5 cm) wound breakdowns. Major complications evaluated included large (>5 cm) flap necrosis, need for blood transfusion, deep vein thrombosis, pulmonary embolus, myocardial infarction, and death. Additional data compiled included age, sex, tobacco use, body mass index, past medical history, American Society of Anesthesiologists physical status level, and operative times. Of the 415 abdominoplasty procedures, 264 (group 1) did not include simultaneous breast surgery. One hundred fifty-one procedures (group 2) involved simultaneous breast surgery, representing 36 percent of the total. Group 2 was further subdivided into those who had breast augmentation surgery (group 2A, n = 50), those who had breast reduction surgery (group 2B, n = 31), those who had mastopexy surgery (group 2C, n = 28), and those who had simultaneous mastopexy and breast augmentation surgery (group 2D, n = 42). Removal and replacement of implants and capsulectomy/capsulotomy procedures were included in the augmentation group (group 2A). 
There were no major complications, including flap necrosis (open wound >5 cm), blood transfusions, deep vein thrombosis, pulmonary embolus, myocardial infarction, or death. No patients required hospitalization. No statistically significant associations with complications were noted between groups 1 and 2 (chi-square, 0.0045; p > 0.95, not significant). Furthermore, when subdivided by type of breast surgery, no statistically significant associations were noted among subgroups: group 1 versus 2A (chi-square, 0.96; p > 0.05, not significant), group 1 versus 2B (chi-square, 0.032; p > 0.9, not significant), group 1 versus 2C (chi-square, 0.003; p > 0.975, not significant), and group 1 versus 2D (chi-square, 0.83; p > 0.5, not significant). The results of this retrospective review indicate that combining elective breast surgery with abdominoplasty does not appear to significantly increase the number of major or minor complications.
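The chi-square comparisons reported above come from 2x2 tables of complication counts by group. A minimal sketch of the Pearson statistic for such a table; the counts used here are hypothetical, not the study's data:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table, e.g.
    rows = complication yes/no, columns = group 1 / group 2.

    With 1 degree of freedom, a statistic below 3.84 corresponds to p > 0.05.
    """
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den
```

Identical complication proportions in the two groups give a statistic of exactly zero, which is essentially what the study's near-zero chi-square values (e.g. 0.0045 for groups 1 vs. 2) reflect.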

  2. The Use of Statistical Process Control-Charts for Person-Fit Analysis on Computerized Adaptive Testing. LSAC Research Report Series.

    ERIC Educational Resources Information Center

    Meijer, Rob R.; van Krimpen-Stoop, Edith M. L. A.

    In this study a cumulative-sum (CUSUM) procedure from the theory of Statistical Process Control was modified and applied in the context of person-fit analysis in a computerized adaptive testing (CAT) environment. Six person-fit statistics were proposed using the CUSUM procedure, and three of them could be used to investigate the CAT in online test…

  3. 77 FR 53889 - Statement of Organization, Functions, and Delegations of Authority

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-04

    ..., methods, and statistical procedures for assessing and monitoring the health of communities and measuring... methods and the Community Guide, and coordinates division responses to requests for technical assistance...-federal partners in developing indicators, methods, and statistical procedures for measuring and reporting...

  4. 10 CFR Appendix II to Part 504 - Fuel Price Computation

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 504—Fuel Price Computation (a) Introduction. This appendix provides the equations and parameters... inflation indices must follow standard statistical procedures and must be fully documented within the... the weighted average fuel price must follow standard statistical procedures and be fully documented...

  5. Correlation of a Novel Noninvasive Tissue Oxygen Saturation Monitor to Serum Central Venous Oxygen Saturation in Pediatric Patients with Postoperative Congenital Cyanotic Heart Disease

    PubMed Central

    Yadlapati, Ajay; Grogan, Tristan; Elashoff, David; Kelly, Robert B.

    2013-01-01

    Abstract: Using a novel noninvasive, visible-light optical diffusion oximeter (T-Stat VLS Tissue Oximeter; Spectros Corporation, Portola Valley, CA) to measure the tissue oxygen saturation (StO2) of the buccal mucosa, the correlation between StO2 and central venous oxygen saturation (ScvO2) was examined in children with congenital cyanotic heart disease undergoing a cardiac surgical procedure. Paired StO2 and serum ScvO2 measurements were obtained postoperatively and statistically analyzed for agreement and association. Thirteen children (nine male) participated in the study (age range, 4 days to 18 months). Surgeries included Glenn shunt procedures, Norwood procedures, unifocalization procedures with Blalock-Taussig shunt placement, a Kawashima/Glenn shunt procedure, a Blalock-Taussig shunt placement, and a modified Norwood procedure. A total of 45 paired StO2-ScvO2 measurements was obtained. Linear regression demonstrated a Pearson’s correlation of .58 (95% confidence interval [CI], .35–.75; p < .0001). The regression slope coefficient estimate was .95 (95% CI, .54–1.36) with an interclass correlation coefficient of .48 (95% CI, .22–.68). Below a clinically relevant average ScvO2 value, a receiver operator characteristic analysis yielded an area under the curve of .78. Statistical methods to control for repeatedly measuring the same subjects produced similar results. This study shows a moderate relationship and agreement between StO2 and ScvO2 measurements in pediatric patients with a history of congenital cyanotic heart disease undergoing a cardiac surgical procedure. This real-time monitoring device can act as a valuable adjunct to standard noninvasive monitoring in which serum ScvO2 sampling currently assists in the diagnosis of low cardiac output after pediatric cardiac surgery. PMID:23691783

  6. Scientific procedures on living animals in Great Britain in 2003: the facts, figures and consequences.

    PubMed

    Hudson, Michelle; Bhogal, Nirmala

    2004-11-01

    The statistics for animal procedures performed in 2003 were recently released by the Home Office. They indicate that, for the second year running, there was a significant increase in the number of laboratory animal procedures undertaken in Great Britain. The species and genera used, the numbers of toxicology and non-toxicology procedures, and the overall trends, are described. The implications of these latest statistics are discussed with reference to key areas of interest and to the impact of existing regulations and pending legislative reforms.

  7. Random forests for classification in ecology

    USGS Publications Warehouse

    Cutler, D.R.; Edwards, T.C.; Beard, K.H.; Cutler, A.; Hess, K.T.; Gibson, J.; Lawler, J.J.

    2007-01-01

Classification procedures are some of the most widely used statistical methods in ecology. Random forests (RF) is a new and powerful statistical classifier that is well established in other disciplines but is relatively unknown in ecology. Advantages of RF compared to other statistical classifiers include (1) very high classification accuracy; (2) a novel method of determining variable importance; (3) ability to model complex interactions among predictor variables; (4) flexibility to perform several types of statistical data analysis, including regression, classification, survival analysis, and unsupervised learning; and (5) an algorithm for imputing missing values. We compared the accuracies of RF and four other commonly used statistical classifiers using data on invasive plant species presence in Lava Beds National Monument, California, USA, rare lichen species presence in the Pacific Northwest, USA, and nest sites for cavity nesting birds in the Uinta Mountains, Utah, USA. We observed high classification accuracy in all applications as measured by cross-validation and, in the case of the lichen data, by independent test data, when comparing RF to other common classification methods. We also observed that the variables that RF identified as most important for classifying invasive plant species coincided with expectations based on the literature. © 2007 by the Ecological Society of America.
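Random forests grow many decision trees on bootstrap samples and random feature subsets, then classify by majority vote; variable importance can be estimated by permuting one predictor and measuring the accuracy drop. The authors used existing RF software; as an illustrative sketch only, here is a toy "forest" of bagged decision stumps on invented data (not the ecological datasets above):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic presence/absence data: features 0 and 1 are informative,
# feature 2 is pure noise (all values invented for illustration).
n = 200
X = rng.normal(size=(n, 3))
y = (X[:, 0] + 0.5 * X[:, 1] + 0.3 * rng.normal(size=n) > 0).astype(int)

def fit_stump(X, y):
    """Pick the (feature, threshold, polarity) split with lowest error."""
    best = None
    for j in range(X.shape[1]):
        for t in np.quantile(X[:, j], np.linspace(0.1, 0.9, 9)):
            for pol in (0, 1):
                pred = (X[:, j] > t).astype(int) ^ pol
                err = (pred != y).mean()
                if best is None or err < best[0]:
                    best = (err, j, t, pol)
    return best[1:]

def predict_stump(stump, X):
    j, t, pol = stump
    return (X[:, j] > t).astype(int) ^ pol

def fit_forest(X, y, n_trees=50):
    forest = []
    for _ in range(n_trees):
        idx = rng.integers(0, len(y), len(y))             # bootstrap rows
        feats = rng.choice(X.shape[1], 2, replace=False)  # random feature subset
        j, t, pol = fit_stump(X[idx][:, feats], y[idx])
        forest.append((feats[j], t, pol))                 # map back to global index
    return forest

def predict_forest(forest, X):
    votes = np.mean([predict_stump(s, X) for s in forest], axis=0)
    return (votes > 0.5).astype(int)

forest = fit_forest(X, y)
acc = (predict_forest(forest, X) == y).mean()

# Permutation importance: accuracy drop when one feature is shuffled.
imp = []
for j in range(3):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])
    imp.append(acc - (predict_forest(forest, Xp) == y).mean())

print(round(acc, 2), [round(v, 2) for v in imp])
```

Real RF uses full trees and out-of-bag error rather than stumps and training accuracy; the stumps only keep the sketch short.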

  8. Temperature and Voltage Offsets in High-ZT Thermoelectrics

    NASA Astrophysics Data System (ADS)

    Levy, George S.

    2018-06-01

    Thermodynamic temperature can take on different meanings. Kinetic temperature is an expectation value and a function of the kinetic energy distribution. Statistical temperature is a parameter of the distribution. Kinetic temperature and statistical temperature, identical in Maxwell-Boltzmann statistics, can differ in other statistics such as those of Fermi-Dirac or Bose-Einstein when a field is present. Thermal equilibrium corresponds to zero statistical temperature gradient, not zero kinetic temperature gradient. Since heat carriers in thermoelectrics are fermions, the difference between these two temperatures may explain voltage and temperature offsets observed during meticulous Seebeck measurements in which the temperature-voltage curve does not go through the origin. In conventional semiconductors, temperature offsets produced by fermionic electrical carriers are not observable because they are shorted by heat phonons in the lattice. In high-ZT materials, however, these offsets have been detected but attributed to faulty laboratory procedures. Additional supporting evidence for spontaneous voltages and temperature gradients includes data collected in epistatic experiments and in the plasma Q-machine. Device fabrication guidelines for testing the hypothesis are suggested, including using unipolar junctions stacked in a superlattice, alternating n/n+ and p/p+ junctions, selecting appropriate dimensions, doping, and loading.

  9. Temperature and Voltage Offsets in High-ZT Thermoelectrics

    NASA Astrophysics Data System (ADS)

    Levy, George S.

    2017-10-01

    Thermodynamic temperature can take on different meanings. Kinetic temperature is an expectation value and a function of the kinetic energy distribution. Statistical temperature is a parameter of the distribution. Kinetic temperature and statistical temperature, identical in Maxwell-Boltzmann statistics, can differ in other statistics such as those of Fermi-Dirac or Bose-Einstein when a field is present. Thermal equilibrium corresponds to zero statistical temperature gradient, not zero kinetic temperature gradient. Since heat carriers in thermoelectrics are fermions, the difference between these two temperatures may explain voltage and temperature offsets observed during meticulous Seebeck measurements in which the temperature-voltage curve does not go through the origin. In conventional semiconductors, temperature offsets produced by fermionic electrical carriers are not observable because they are shorted by heat phonons in the lattice. In high-ZT materials, however, these offsets have been detected but attributed to faulty laboratory procedures. Additional supporting evidence for spontaneous voltages and temperature gradients includes data collected in epistatic experiments and in the plasma Q-machine. Device fabrication guidelines for testing the hypothesis are suggested, including using unipolar junctions stacked in a superlattice, alternating n/n+ and p/p+ junctions, selecting appropriate dimensions, doping, and loading.

  10. Robust model selection and the statistical classification of languages

    NASA Astrophysics Data System (ADS)

    García, J. E.; González-López, V. A.; Viola, M. L. L.

    2012-10-01

    In this paper we address the problem of model selection for the set of finite memory stochastic processes with finite alphabet, when the data is contaminated. We consider m independent samples, with more than half of them being realizations of the same stochastic process with law Q, which is the one we want to retrieve. We devise a model selection procedure such that, for a sample size large enough, the selected process is the one with law Q. Our model selection strategy is based on estimating relative entropies to select a subset of samples that are realizations of the same law. Although the procedure is valid for any family of finite order Markov models, we focus on the family of variable length Markov chain models, which includes the fixed order Markov chain model family. We define the asymptotic breakdown point (ABDP) for a model selection procedure, and we derive the ABDP for our procedure. This means that if the proportion of contaminated samples is smaller than the ABDP, then, as the sample size grows, our procedure selects a model for the process with law Q. We also use our procedure in a setting where one sample consists of the concatenation of subsamples from two or more stochastic processes, with most of the subsamples having law Q, and we conducted a simulation study. In the application section we address the question of the statistical classification of languages according to their rhythmic features using speech samples, an important open problem in phonology. A persistent difficulty in this problem is that the speech samples comprise several sentences produced by diverse speakers, and so correspond to a mixture of distributions. The usual way to deal with this has been to choose, by listening to the samples, a subset of the original sample that seems to best represent each language. In our application we use the full dataset without any preselection of samples. We apply our robust methodology to estimate a model representing the main law for each language. Our findings agree with the linguistic conjecture regarding the rhythm of the languages included in our dataset.
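The core of the selection strategy, estimating relative entropies between samples and keeping the mutually close majority, can be sketched on a deliberately simple case. The snippet below uses an i.i.d. binary source rather than a variable length Markov chain, and the majority/contaminant setup is invented, so it illustrates only the divergence-based outlier flagging, not the authors' full procedure:

```python
import numpy as np
from itertools import product

def block_dist(sample, alphabet, order=1):
    """Empirical distribution of length-(order+1) blocks, add-one smoothed."""
    blocks = [tuple(sample[i:i + order + 1]) for i in range(len(sample) - order)]
    keys = list(product(alphabet, repeat=order + 1))
    counts = np.array([blocks.count(k) + 1 for k in keys], dtype=float)
    return counts / counts.sum()

def kl(p, q):
    """Relative entropy D(p || q)."""
    return float(np.sum(p * np.log(p / q)))

rng = np.random.default_rng(1)
# Five samples: four realizations of the majority law Q (a biased binary
# source here, for simplicity), plus one contaminated sample.
samples = [rng.choice([0, 1], 400, p=[0.7, 0.3]) for _ in range(4)]
samples.append(rng.choice([0, 1], 400, p=[0.2, 0.8]))

dists = [block_dist(list(s), [0, 1]) for s in samples]
m = len(samples)
# Score each sample by its median symmetrized divergence to the others;
# realizations of Q cluster together, so the contaminant scores highest.
score = [np.median([kl(dists[i], dists[j]) + kl(dists[j], dists[i])
                    for j in range(m) if j != i]) for i in range(m)]
outlier = int(np.argmax(score))
print(outlier)  # index of the sample flagged as not following Q
```

Using the median distance (rather than the mean) is what gives the scheme its robustness: a single contaminant cannot inflate a majority sample's score.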

  11. A comparison of vowel normalization procedures for language variation research

    NASA Astrophysics Data System (ADS)

    Adank, Patti; Smits, Roel; van Hout, Roeland

    2004-11-01

    An evaluation of vowel normalization procedures for the purpose of studying language variation is presented. The procedures were compared on how effectively they (a) preserve phonemic information, (b) preserve information about the talker's regional background (or sociolinguistic information), and (c) minimize anatomical/physiological variation in acoustic representations of vowels. Recordings were made for 80 female talkers and 80 male talkers of Dutch. These talkers were stratified according to their gender and regional background. The normalization procedures were applied to measurements of the fundamental frequency and the first three formant frequencies for a large set of vowel tokens. The normalization procedures were evaluated through statistical pattern analysis. The results show that normalization procedures that use information across multiple vowels ("vowel-extrinsic" information) to normalize a single vowel token performed better than those that include only information contained in the vowel token itself ("vowel-intrinsic" information). Furthermore, the results show that normalization procedures that operate on individual formants performed better than those that use information across multiple formants (e.g., "formant-extrinsic" F2-F1).

  12. A comparison of vowel normalization procedures for language variation research.

    PubMed

    Adank, Patti; Smits, Roel; van Hout, Roeland

    2004-11-01

    An evaluation of vowel normalization procedures for the purpose of studying language variation is presented. The procedures were compared on how effectively they (a) preserve phonemic information, (b) preserve information about the talker's regional background (or sociolinguistic information), and (c) minimize anatomical/physiological variation in acoustic representations of vowels. Recordings were made for 80 female talkers and 80 male talkers of Dutch. These talkers were stratified according to their gender and regional background. The normalization procedures were applied to measurements of the fundamental frequency and the first three formant frequencies for a large set of vowel tokens. The normalization procedures were evaluated through statistical pattern analysis. The results show that normalization procedures that use information across multiple vowels ("vowel-extrinsic" information) to normalize a single vowel token performed better than those that include only information contained in the vowel token itself ("vowel-intrinsic" information). Furthermore, the results show that normalization procedures that operate on individual formants performed better than those that use information across multiple formants (e.g., "formant-extrinsic" F2-F1).
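Lobanov's z-score transform is a standard member of the vowel-extrinsic, formant-intrinsic class that such evaluations favor: each formant is standardized against the talker's own mean and standard deviation computed over all of that talker's vowel tokens. A minimal sketch, with invented formant values:

```python
import numpy as np

# Hypothetical F1/F2 measurements (Hz) for one talker across several vowels;
# the values are invented for illustration.
formants = np.array([
    [310, 2200],   # /i/-like token
    [640, 1200],   # /a/-like token
    [350,  800],   # /u/-like token
    [500, 1500],   # /e/-like token
], dtype=float)

# Lobanov (z-score) normalization: vowel-extrinsic (uses the talker's whole
# vowel set) and formant-intrinsic (each formant standardized on its own).
norm = (formants - formants.mean(axis=0)) / formants.std(axis=0)

print(norm.round(2))
```

After the transform, each formant column has zero mean and unit variance within the talker, which removes much of the anatomical between-talker variation while preserving the relative positions of the vowels.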

  13. Statistical significance of task related deep brain EEG dynamic changes in the time-frequency domain.

    PubMed

    Chládek, J; Brázdil, M; Halámek, J; Plešinger, F; Jurák, P

    2013-01-01

    We present an off-line analysis procedure for exploring brain activity recorded from intra-cerebral electroencephalographic data (SEEG). The objective is to determine the statistical differences between different types of stimulations in the time-frequency domain. The procedure is based on computing relative signal power change and subsequent statistical analysis. An example of characteristic statistically significant event-related de/synchronization (ERD/ERS) detected across different frequency bands following different oddball stimuli is presented. The method is used for off-line functional classification of different brain areas.
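The relative power change at the heart of ERD/ERS analysis is usually expressed as 100 × (activation − reference)/reference, in percent. A sketch on a synthetic channel (all values invented) in which a 10 Hz rhythm attenuates after a stimulus, producing a negative change, i.e., a desynchronization:

```python
import numpy as np

fs = 256                      # sampling rate (Hz), illustrative
t = np.arange(0, 2, 1 / fs)   # 2 s epoch; "stimulus" at t = 1 s

rng = np.random.default_rng(2)
# Synthetic channel: an alpha-band (10 Hz) oscillation that attenuates
# after the stimulus, on top of broadband noise -- a classic ERD.
amp = np.where(t < 1.0, 1.0, 0.3)
sig = amp * np.sin(2 * np.pi * 10 * t) + 0.2 * rng.normal(size=t.size)

def band_power(x, lo, hi):
    """Mean spectral power of x inside [lo, hi] Hz via the FFT."""
    f = np.fft.rfftfreq(x.size, 1 / fs)
    p = np.abs(np.fft.rfft(x)) ** 2 / x.size
    return p[(f >= lo) & (f <= hi)].mean()

ref = band_power(sig[t < 1.0], 8, 12)    # reference (pre-stimulus) power
act = band_power(sig[t >= 1.0], 8, 12)   # activation (post-stimulus) power
erd = 100 * (act - ref) / ref            # relative power change, percent

print(round(erd, 1))
```

A strongly negative value indicates ERD; a positive value would indicate ERS. The statistical step described in the abstract would then test such values across trials and stimulus types.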

  14. A Framework for Assessing High School Students' Statistical Reasoning.

    PubMed

    Chan, Shiau Wei; Ismail, Zaleha; Sumintono, Bambang

    2016-01-01

    Based on a synthesis of literature, earlier studies, analyses and observations on high school students, this study developed an initial framework for assessing students' statistical reasoning about descriptive statistics. Framework descriptors were established across five levels of statistical reasoning and four key constructs. The former consisted of idiosyncratic reasoning, verbal reasoning, transitional reasoning, procedural reasoning, and integrated process reasoning. The latter include describing data, organizing and reducing data, representing data, and analyzing and interpreting data. In contrast to earlier studies, this initial framework formulated a complete and coherent statistical reasoning framework. A statistical reasoning assessment tool was then constructed from this initial framework. The tool was administered to 10 tenth-grade students in a task-based interview. The initial framework was refined, and the statistical reasoning assessment tool was revised. The ten students then participated in the second task-based interview, and the data obtained were used to validate the framework. The findings showed that the students' statistical reasoning levels were consistent across the four constructs, and this result confirmed the framework's cohesion. Developed to contribute to statistics education, this newly developed statistical reasoning framework provides a guide for planning learning goals and designing instruction and assessments.

  15. A Framework for Assessing High School Students' Statistical Reasoning

    PubMed Central

    2016-01-01

    Based on a synthesis of literature, earlier studies, analyses and observations on high school students, this study developed an initial framework for assessing students’ statistical reasoning about descriptive statistics. Framework descriptors were established across five levels of statistical reasoning and four key constructs. The former consisted of idiosyncratic reasoning, verbal reasoning, transitional reasoning, procedural reasoning, and integrated process reasoning. The latter include describing data, organizing and reducing data, representing data, and analyzing and interpreting data. In contrast to earlier studies, this initial framework formulated a complete and coherent statistical reasoning framework. A statistical reasoning assessment tool was then constructed from this initial framework. The tool was administered to 10 tenth-grade students in a task-based interview. The initial framework was refined, and the statistical reasoning assessment tool was revised. The ten students then participated in the second task-based interview, and the data obtained were used to validate the framework. The findings showed that the students’ statistical reasoning levels were consistent across the four constructs, and this result confirmed the framework’s cohesion. Developed to contribute to statistics education, this newly developed statistical reasoning framework provides a guide for planning learning goals and designing instruction and assessments. PMID:27812091

  16. Predicting juvenile recidivism: new method, old problems.

    PubMed

    Benda, B B

    1987-01-01

    This prediction study compared three statistical procedures for accuracy using two assessment methods. The criterion is return to a juvenile prison after the first release, and the models tested are logit analysis, predictive attribute analysis, and a Burgess procedure. No significant differences in predictive accuracy were found among the three procedures.
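The contrast between a Burgess-style score (one unweighted point per risk factor) and logit analysis (regression-weighted factors) can be sketched on synthetic data; the factors, weights, and outcome model below are invented and do not reflect the study's variables:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic binary risk factors and a recidivism outcome; factor 3 is noise.
n = 500
X = rng.integers(0, 2, size=(n, 4))
true_logit = -1.0 + 1.2 * X[:, 0] + 0.8 * X[:, 1] + 0.4 * X[:, 2]
y = (rng.random(n) < 1 / (1 + np.exp(-true_logit))).astype(int)

# Burgess-style prediction: one unweighted point per risk factor present,
# with an arbitrary cutoff of 2+ points.
burgess = X.sum(axis=1)
pred_b = (burgess >= 2).astype(int)

# Logit analysis: logistic regression fit by plain gradient ascent on the
# log-likelihood (no external libraries).
Xb = np.column_stack([np.ones(n), X])
w = np.zeros(5)
for _ in range(2000):
    p = 1 / (1 + np.exp(-Xb @ w))
    w += 0.1 * Xb.T @ (y - p) / n
pred_l = ((Xb @ w) > 0).astype(int)

acc_b = (pred_b == y).mean()
acc_l = (pred_l == y).mean()
print(round(acc_b, 2), round(acc_l, 2))  # the two accuracies are typically comparable
```

That the weighted and unweighted schemes often land close together is a long-standing observation in the risk-prediction literature, which is what the abstract's finding echoes.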

  17. Analytical procedure validation and the quality by design paradigm.

    PubMed

    Rozet, Eric; Lebrun, Pierre; Michiels, Jean-François; Sondag, Perceval; Scherder, Tara; Boulanger, Bruno

    2015-01-01

    Since the adoption of the ICH Q8 document concerning the development of pharmaceutical processes following a quality by design (QbD) approach, there have been many discussions on the opportunity for analytical procedure development to follow a similar approach. While the development and optimization of analytical procedures following QbD principles have been largely discussed and described, the place of analytical procedure validation in this framework has not been clarified. This article aims to show that analytical procedure validation is fully integrated into the QbD paradigm and is an essential step in developing analytical procedures that are effectively fit for purpose. Adequate statistical methodologies also have their role to play, such as design of experiments, statistical modeling, and probabilistic statements. The outcome of analytical procedure validation is also an analytical procedure design space, and from it a control strategy can be set.

  18. [What is the methodological quality of articles on therapeutic procedures published in Cirugía Española?].

    PubMed

    Manterola, Carlos; Busquets, Juli; Pascual, Marta; Grande, Luis

    2006-02-01

    The aim of this study was to determine the methodological quality of articles on therapeutic procedures published in Cirugía Española and to study its association with the publication year, center, and subject-matter. A bibliometric study that included all articles on therapeutic procedures published in Cirugía Española between 2001 and 2004 was performed. All kinds of clinical designs were considered, excluding editorials, review articles, letters to editor, and experimental studies. The variables analyzed were: year of publication, center, design, and methodological quality. Methodological quality was determined by a valid and reliable scale. Descriptive statistics (calculation of means, standard deviation and medians) and analytical statistics (Pearson's χ², nonparametric, ANOVA and Bonferroni tests) were used. A total of 244 articles were studied (197 case series [81%], 28 cohort studies [12%], 17 clinical trials [7%], 1 cross-sectional study and 1 case-control study [0.8%]). The studies were performed mainly in Catalonia and Murcia (22% and 16%, respectively). The most frequent subject areas were soft tissue and hepatobiliopancreatic surgery (23% and 19%, respectively). The mean and median of the methodological quality score calculated for the entire series was 10.2 +/- 3.9 points and 9.5 points, respectively. Methodological quality significantly increased by publication year (p < 0.001). An association between methodological quality and subject area was observed but no association was detected with the center performing the study. The methodological quality of articles on therapeutic procedures published in Cirugía Española between 2001 and 2004 is low. However, a statistically significant trend toward improvement was observed.

  19. Image-guided placement of port catheters: is there an increased risk of infection if the port is immediately accessed and used?

    PubMed

    Salazar, Gloria; Yeddula, Kalpana; Wicky, Stephan; Oklu, Rahmi; Ganguli, Suvranu; Waltman, Arthur C; Walker, Thomas G; Kalva, Sanjeeva P

    2013-01-01

    To compare complication rates in patients who have port catheters inserted and left accessed for immediate use and those who have ports inserted but not accessed. In this retrospective, IRB-approved study, medical records of patients who received a port catheter between 9/2009 and 2/2010 were reviewed. The data collected included patient demographics, diagnosis, procedure, and complications. The patients were categorized into two groups: accessed (patients in whom the port was accessed with a Huber needle for immediate intravenous use and the patient left the procedure area with the needle indwelling) and control (patients in whom the ports were not accessed). Complications were classified according to Society of Interventional Radiology guidelines. Results are given as mean ± SD. Statistical analysis was performed with Student's t test, and statistical significance was considered at P<.05. A total of 467 ports were placed in 465 patients (men: 206); 10.7% in the accessed group (n=50, age: 60±13.9) and 89.3% in the control group (n=417, age: 59±13.5). There were no statistically significant differences in patient demographics between the groups. The overall complication rate was 0.6% (n=3). Two complications (hematoma causing skin necrosis and thrombosis of the port) occurred in the control group and one (infection) in the accessed group. Infection rates after procedures were 2% (1/50) in the accessed group and 0% (0/417) in the control group. There was no statistically significant difference in overall complication (P=.1) or infection (P=.1) rates between the groups. Leaving the port accessed immediately after placement does not increase the risk of infection or other complications.

  20. The role of ultrasound guidance in pediatric caudal block

    PubMed Central

    Erbüyün, Koray; Açıkgöz, Barış; Ok, Gülay; Yılmaz, Ömer; Temeltaş, Gökhan; Tekin, İdil; Tok, Demet

    2016-01-01

    Objectives: To compare the time interval of the procedure, possible complications, post-operative pain levels, additional analgesics, and nurse satisfaction in ultrasonography-guided and standard caudal block applications. Methods: This retrospective study, conducted in Celal Bayar University Hospital, Manisa, Turkey, between January and December 2014, included 78 pediatric patients. Caudal block was applied to 2 different groups: one with ultrasound guidance, and the other using the standard method. Results: The time interval of the procedure was significantly shorter in the standard application group compared with the ultrasound-guided group (p=0.020). Wong-Baker FACES Pain Rating Scale values obtained at the 90th minute were statistically lower in the standard application group compared with the ultrasound-guided group (p=0.035). No statistically significant difference was found in the other parameters between the 2 groups. The shorter time interval of the procedure in the standard application group should not be considered a distinctive mark by pediatric anesthesiologists, because this time difference was as short as seconds. Conclusion: Ultrasound guidance for caudal block applications would neither increase nor decrease the success of the treatment. However, ultrasound guidance should be needed in cases where the detection of sacral anatomy is difficult, especially by palpation. PMID:26837396

  1. Quality Assurance for Rapid Airfield Construction

    DTIC Science & Technology

    2008-05-01

    necessary to conduct a volume-replacement density test for in-place soil. This density test, which was developed during this investigation, involves...the test both simpler and quicker. The Clegg hammer results are the primary means of judging compaction; thus, the requirements for density tests are...minimized through a stepwise acceptance procedure. Statistical criteria for evaluating Clegg hammer and density measurements are also included

  2. Impact of Monetary Incentives and Mailing Procedures: An Experiment in a Federally Sponsored Telephone Survey. Methodology Report. NCES 2006-066

    ERIC Educational Resources Information Center

    Brick, J. Michael; Hagedorn, Mary Collins; Montaquila, Jill; Roth, Shelley Brock; Chapman, Christopher

    2006-01-01

    The National Household Education Surveys Program (NHES) includes a series of random digit dial (RDD) surveys developed by the National Center for Education Statistics (NCES) in the Institute of Education Sciences, U.S. Department of Education. It is designed to collect information on important educational issues through telephone surveys of…

  3. Standard Setting in a Small Scale OSCE: A Comparison of the Modified Borderline-Group Method and the Borderline Regression Method

    ERIC Educational Resources Information Center

    Wood, Timothy J.; Humphrey-Murto, Susan M.; Norman, Geoffrey R.

    2006-01-01

    When setting standards, administrators of small-scale OSCEs often face several challenges, including a lack of resources, a lack of available expertise in statistics, and difficulty in recruiting judges. The Modified Borderline-Group Method is a standard setting procedure that compensates for these challenges by using physician examiners and is…

  4. A Comparison of Item Fit Statistics for Mixed IRT Models

    ERIC Educational Resources Information Center

    Chon, Kyong Hee; Lee, Won-Chan; Dunbar, Stephen B.

    2010-01-01

    In this study we examined procedures for assessing model-data fit of item response theory (IRT) models for mixed format data. The model fit indices used in this study include PARSCALE's G[superscript 2], Orlando and Thissen's S-X[superscript 2] and S-G[superscript 2], and Stone's chi[superscript 2*] and G[superscript 2*]. To investigate the…

  5. A new organic-rich soil reference material certified for its EDTA- and acetic acid- extractable contents of Cd, Cr, Cu, Ni, Pb and Zn, following collaboratively tested and harmonised procedures.

    PubMed

    Pueyo, M; Rauret, G; Bacon, J R; Gomez, A; Muntau, H; Quevauviller, P; López-Sánchez, J F

    2001-02-01

    There is an increasing requirement for assessment of the bioavailable metal fraction and the mobility of trace elements in soils upon disposal. One of the approaches is the use of leaching procedures, but the results obtained are operationally defined; therefore, their significance is highly dependent on the extraction protocol performed. So, for this type of study, there is a need for reference materials that allow the quality of measurements to be controlled. This paper describes the steps involved in the certification of an organic-rich soil reference material, BCR-700, for the EDTA- and acetic acid-extractable contents of some trace elements, following collaboratively tested and harmonised extraction procedures. Details are given for the preparation of the soil, homogeneity and stability testing, analytical procedures and the statistical selection of data to be included in the certification.

  6. A Primer on Multivariate Analysis of Variance (MANOVA) for Behavioral Scientists

    ERIC Educational Resources Information Center

    Warne, Russell T.

    2014-01-01

    Reviews of statistical procedures (e.g., Bangert & Baumberger, 2005; Kieffer, Reese, & Thompson, 2001; Warne, Lazo, Ramos, & Ritter, 2012) show that one of the most common multivariate statistical methods in psychological research is multivariate analysis of variance (MANOVA). However, MANOVA and its associated procedures are often not…

  7. Martian cratering 11. Utilizing decameter scale crater populations to study Martian history

    NASA Astrophysics Data System (ADS)

    Hartmann, W. K.; Daubar, I. J.

    2017-03-01

    New information has been obtained in recent years regarding formation rates and the production size-frequency distribution (PSFD) of decameter-scale primary Martian craters formed during recent orbiter missions. Here we compare the PSFD of the currently forming small primaries (P) with new data on the PSFD of the total small crater population that includes primaries and field secondaries (P + fS), which represents an average over longer time periods. The two data sets, if used in a combined manner, have extraordinary potential for clarifying not only the evolutionary history and resurfacing episodes of small Martian geological formations (as small as one or a few km²) but also possible episodes of recent climatic change. In response to recent discussions of statistical methodologies, we point out that crater counts do not produce idealized statistics, and that inherent uncertainties limit improvements that can be made by more sophisticated statistical analyses. We propose three mutually supportive procedures for interpreting crater counts of small craters in this context. Applications of these procedures support suggestions that topographic features in upper meters of mid-latitude ice-rich areas date only from the last few periods of extreme Martian obliquity, and associated predicted climate excursions.

  8. Knowledge dimensions in hypothesis test problems

    NASA Astrophysics Data System (ADS)

    Krishnan, Saras; Idris, Noraini

    2012-05-01

    The reformation in statistics education over the past two decades has predominantly shifted the focus of statistical teaching and learning from procedural understanding to conceptual understanding. The emphasis of procedural understanding is on formulas and calculation procedures. Meanwhile, conceptual understanding emphasizes students knowing why they are using a particular formula or executing a specific procedure. In addition, the Revised Bloom's Taxonomy offers a two-dimensional framework to describe learning objectives, comprising the six revised cognition levels of the original Bloom's taxonomy and four knowledge dimensions. Depending on the level of complexity, the four knowledge dimensions essentially distinguish basic understanding from the more connected understanding. This study identifies the factual, procedural and conceptual knowledge dimensions in hypothesis test problems. Hypothesis testing, being an important tool in making inferences about a population from sample information, is taught in many introductory statistics courses. However, researchers find that students in these courses still have difficulty in understanding the underlying concepts of hypothesis tests. Past studies also show that even though students can perform the hypothesis testing procedure, they may not understand the rationale of executing these steps or know how to apply them in novel contexts. Besides knowing the procedural steps in conducting a hypothesis test, students must have fundamental statistical knowledge and a deep understanding of the underlying inferential concepts such as the sampling distribution and the central limit theorem. By identifying the knowledge dimensions of hypothesis test problems in this study, suitable instructional and assessment strategies can be developed in future to enhance students' learning of hypothesis testing as a valuable inferential tool.
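The procedural/conceptual distinction can be made concrete with a one-sample test: computing the t statistic is the procedural step, while understanding the p-value as a tail probability under a null sampling distribution is the concept. The sketch below (invented sample, hypothesized mean mu0) builds that null distribution directly with a sign-flip randomization instead of looking it up in a t table:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical sample: does the population mean differ from mu0 = 50?
x = np.array([52.1, 53.4, 49.8, 55.0, 51.2, 54.3, 50.9, 52.7])
mu0 = 50.0

# Procedural step: the one-sample t statistic.
n = x.size
tstat = (x.mean() - mu0) / (x.std(ddof=1) / np.sqrt(n))

# Conceptual step: the p-value is a tail probability under the sampling
# distribution implied by H0. A sign-flip randomization test builds that
# distribution directly: under H0 the deviations from mu0 are symmetric,
# so each sign pattern is equally likely.
d = x - mu0
null_t = []
for _ in range(10000):
    s = rng.choice([-1, 1], n) * d
    null_t.append(s.mean() / (s.std(ddof=1) / np.sqrt(n)))
pval = np.mean(np.abs(null_t) >= abs(tstat))

print(round(tstat, 2), pval)
```

Seeing the null distribution assembled from resamples, rather than quoted from a table, is exactly the kind of link between the sampling-distribution concept and the testing procedure that the abstract argues students need.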

  9. The Progressive Approach to EMDR Group Therapy for Complex Trauma and Dissociation: A Case-Control Study.

    PubMed

    Gonzalez-Vazquez, Ana I; Rodriguez-Lago, Lucía; Seoane-Pillado, Maria T; Fernández, Isabel; García-Guerrero, Francisca; Santed-Germán, Miguel A

    2017-01-01

    Eye Movement Desensitization and Reprocessing is a psychotherapeutic approach with recognized efficiency in treating post-traumatic stress disorder (PTSD), which is being used and studied in other psychiatric diagnoses partially based on adverse and traumatic life experiences. Nevertheless, there is not enough empirical evidence at the moment to support its usefulness in a diagnosis other than PTSD. It is commonly accepted that the use of EMDR in severely traumatized patients requires an extended stabilization phase. Some authors have proposed integrating both the theory of structural dissociation of the personality and the adaptive information processing model guiding EMDR therapy. One of these proposals is the Progressive Approach. Some of these EMDR procedures will be evaluated in a group therapy format, integrating them along with emotional regulation, dissociation, and trauma-oriented psychoeducational interventions. Patients presenting a history of severe traumatization, mostly early severe and interpersonal trauma, combined with additional significant traumatizing events in adulthood were included. In order to discriminate the specific effect of EMDR procedures, two types of groups were compared: TAU (treatment as usual: psychoeducational intervention only) vs. TAU+EMDR (the same psychoeducational intervention plus EMDR specific procedures). In pre-post comparison, more variables presented positive changes in the group including EMDR procedures. In the TAU+EMDR group, 4 of the 5 measured variables presented significant and positive changes: general health (GHQ), general satisfaction (Schwartz), subjective well-being, and therapy session usefulness assessment. On the contrary, only 2 of the 5 variables in the TAU group showed statistically significant changes: general health (GHQ), and general satisfaction (Schwartz). 
Regarding post-test inter-group comparison, improvement in subjective well-being was related to belonging to the group that included EMDR procedures, with the difference between the TAU and TAU+EMDR groups being statistically significant [χ²(1) = 14.226; p < 0.0001]. In the TAU+EMDR group, not one patient got worse or failed to improve; 100% experienced some improvement. In the TAU group, 70.6% reported some improvement, and 29.4% reported having gotten worse or not improved.
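    An inter-group comparison of this kind can be reproduced in outline with a Pearson chi-square test on a 2x2 contingency table. The counts below are illustrative only (the abstract reports percentages, not raw cell counts), assuming 17 patients per group:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table (counts are illustrative, not the study's raw data):
# rows = TAU+EMDR, TAU; columns = improved, not improved/worse
table = np.array([[17, 0],
                  [12, 5]])

# correction=False gives the plain Pearson chi-square, the form quoted in abstracts
stat, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi2({dof}) = {stat:.3f}, p = {p:.4f}")
```

    With these invented counts the test is significant at the 0.05 level, though the statistic naturally differs from the study's reported 14.226.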

  10. MIRACAL: A mission radiation calculation program for analysis of lunar and interplanetary missions

    NASA Technical Reports Server (NTRS)

    Nealy, John E.; Striepe, Scott A.; Simonsen, Lisa C.

    1992-01-01

    A computational procedure and data base are developed for manned space exploration missions, producing estimates of the energetic particle fluences encountered and the resulting dose equivalent incurred. The data base includes the following options: a statistical or continuum model for ordinary solar proton events, selection of up to six large proton flare spectra, and galactic cosmic ray fluxes for elemental nuclei of charge numbers 1 through 92. The program requires trajectory definition information and specification of optional parameters as input, including the desired spectral data and nominal shield thickness. The procedure may be implemented as an independent program or as a subroutine in trajectory codes. This code should be most useful in mission optimization and selection studies for which radiation exposure is of special importance.

  11. The efficacy of anterior and posterior arch suturing at the inferior tonsillar pole for posttonsillectomy pain control.

    PubMed

    Sakallioğlu, Oner; Düzer, Sertaç; Kapusuz, Zeliha

    2014-01-01

    The aim of our study was to investigate the efficacy of a suturing technique, applied after completion of the tonsillectomy procedure, for posttonsillectomy pain control in adult patients. Between August 2010 and February 2011, 44 adult patients aged 16 to 41 years who underwent tonsillectomy at the Elazığ Training and Research Hospital Otorhinolaryngology Clinic were included in the study. After the tonsillectomy procedure, the anterior and posterior tonsillar arches were sutured to each other, thereby increasing the mucosa-covered area of the tonsillectomy fossae. Twenty-two patients who received posttonsillectomy suturing served as the study group, and the remaining 22 patients who did not served as the control group. A visual analogue scale (VAS) was used to evaluate the degree of postoperative pain (0 = no pain, 10 = worst pain). A two-way ANOVA with repeated measures was used for statistical analysis of the VAS values, with P < 0.05 accepted as statistically significant. The effect of time (each postoperative day) on VAS values was significant, and the differences in mean VAS values between the study and control groups on postoperative days 1, 3, 7, and 10 were statistically significant (P < 0.05). Posttonsillectomy pain was less severe in the study group than in the control group. Suturing the anterior and posterior tonsillar arches after tonsillectomy was found to be effective in alleviating posttonsillectomy pain in adult patients.

  12. Mediation Analysis with Survival Outcomes: Accelerated Failure Time vs. Proportional Hazards Models.

    PubMed

    Gelfand, Lois A; MacKinnon, David P; DeRubeis, Robert J; Baraldi, Amanda N

    2016-01-01

    Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors, and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects, and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome: underestimation in LIFEREG, and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained from combining other coefficients, whereas this is not possible with PHREG. When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results.
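    For uncensored log-normal survival times, the AFT model reduces to ordinary regression on log(T), so the product-of-coefficients mediation estimate can be sketched with plain least squares. This is a simplified illustration on simulated data, not the SAS LIFEREG/PHREG workflow the authors evaluate; the coefficients and sample size are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
x = rng.integers(0, 2, n)             # treatment indicator
m = 0.5 * x + rng.normal(size=n)      # mediator model: a = 0.5
# log-normal AFT: log(T) = b0 + c'*x + b*m + error  (direct c' = 0.2, b = 0.3)
log_t = 1.0 + 0.2 * x + 0.3 * m + rng.normal(scale=0.5, size=n)

def ols(cols, y):
    """Least-squares fit with an intercept; returns the coefficient vector."""
    X = np.column_stack([np.ones(len(y))] + list(cols))
    return np.linalg.lstsq(X, y, rcond=None)[0]

a = ols([x], m)[1]                    # effect of treatment on the mediator
b = ols([x, m], log_t)[2]             # effect of mediator on log survival time
indirect = a * b                      # product-of-coefficients mediated effect
print(f"a = {a:.3f}, b = {b:.3f}, indirect = {indirect:.3f}")
```

    The estimated indirect effect should be close to the generating value 0.5 * 0.3 = 0.15; with censoring, the OLS step must be replaced by a censored-likelihood fit, which is where the LIFEREG/PHREG differences discussed above arise.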

  13. Efficacy of a Radiation Absorbing Shield in Reducing Dose to the Interventionalist During Peripheral Endovascular Procedures: A Single Centre Pilot Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Power, S.; Mirza, M.; Thakorlal, A.

    Purpose: This prospective pilot study was undertaken to evaluate the feasibility and effectiveness of using a radiation absorbing shield to reduce operator dose from scatter during lower limb endovascular procedures. Materials and Methods: A commercially available bismuth shield system (RADPAD) was used. Sixty consecutive patients undergoing lower limb angioplasty were included: thirty procedures were performed without the RADPAD (control group) and thirty with the RADPAD (study group). Two separate methods were used to measure dose to a single operator. Thermoluminescent dosimeter (TLD) badges were used to measure hand, eye, and unshielded body dose, and a direct dosimeter with digital readout was also used to measure eye and unshielded body dose. To allow for variation between control and study groups, dose per unit time was calculated. Results: TLD results demonstrated a significant reduction in median body dose per unit time for the study group compared with controls (p = 0.001), corresponding to a mean dose reduction rate of 65%. Median eye and hand dose per unit time were also reduced in the study group compared with the control group; however, this was not statistically significant (p = 0.081 for eye, p = 0.628 for hand). Direct dosimeter readings also showed a statistically significant reduction in median unshielded body dose rate for the study group compared with controls (p = 0.037). Eye dose rate was reduced for the study group, but this was not statistically significant (p = 0.142). Conclusion: Initial results are encouraging. Use of the shield resulted in a statistically significant reduction in unshielded dose to the operator's body. Measured doses to the eye and hand of the operator were also reduced but did not reach statistical significance in this pilot study.

  14. Protocol for monitoring metals in Ozark National Scenic Riverways, Missouri: Version 1.0

    USGS Publications Warehouse

    Schmitt, Christopher J.; Brumbaugh, William G.; Besser, John M.; Hinck, Jo Ellen; Bowles, David E.; Morrison, Lloyd W.; Williams, Michael H.

    2008-01-01

    The National Park Service is developing a monitoring plan for the Ozark National Scenic Riverways in southeastern Missouri. Because of concerns about the release of lead, zinc, and other metals from lead-zinc mining to streams, the monitoring plan will include mining-related metals. After considering a variety of alternatives, the plan will consist of measuring the concentrations of cadmium, cobalt, lead, nickel, and zinc in composite samples of crayfish (Orconectes luteus or alternate species) and Asian clam (Corbicula fluminea) collected periodically from selected sites. This document, which comprises a protocol narrative and supporting standard operating procedures, describes the methods to be employed prior to, during, and after collection of the organisms, along with procedures for their chemical analysis and quality assurance; statistical analysis, interpretation, and reporting of the data; and for modifying the protocol narrative and supporting standard operating procedures. A list of supplies and equipment, data forms, and sample labels are also included. An example based on data from a pilot study is presented.

  15. Hypnosis for procedure-related pain and distress in pediatric cancer patients: a systematic review of effectiveness and methodology related to hypnosis interventions.

    PubMed

    Richardson, Janet; Smith, Joanna E; McCall, Gillian; Pilkington, Karen

    2006-01-01

    The aim of this study was to systematically review and critically appraise the evidence on the effectiveness of hypnosis for procedure-related pain and distress in pediatric cancer patients. A comprehensive search of major biomedical and specialist complementary and alternative medicine databases was conducted. Citations were included from the databases' inception to March 2005. Efforts were made to identify unpublished and ongoing research. Controlled trials were appraised using predefined criteria. Clinical commentaries were obtained for each study. Seven randomized controlled clinical trials and one controlled clinical trial were found. Studies report positive results, including statistically significant reductions in pain and anxiety/distress, but a number of methodological limitations were identified. Systematic searching and appraisal has demonstrated that hypnosis has potential as a clinically valuable intervention for procedure-related pain and distress in pediatric cancer patients. Further research into the effectiveness and acceptability of hypnosis for pediatric cancer patients is recommended.

  16. A unified framework for weighted parametric multiple test procedures.

    PubMed

    Xi, Dong; Glimm, Ekkehard; Maurer, Willi; Bretz, Frank

    2017-09-01

    We describe a general framework for weighted parametric multiple test procedures based on the closure principle. We utilize general weighting strategies that can reflect complex study objectives and include many procedures in the literature as special cases. The proposed weighted parametric tests bridge the gap between rejection rules using either adjusted significance levels or adjusted p-values. This connection is made by allowing intersection hypotheses of the underlying closed test procedure to be tested at a level smaller than α. This may also be necessary to take certain study situations into account. For such cases we introduce a subclass of exact α-level parametric tests that satisfy the consonance property. When the correlation is known only for certain subsets of the test statistics, a new procedure is proposed to fully utilize this knowledge within each subset. We illustrate the proposed weighted parametric tests using a clinical trial example and conduct a simulation study to investigate their operating characteristics. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
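    A minimal sketch of the closure principle with weighted Bonferroni tests, one member of the family of weighted parametric procedures discussed above (ignoring correlation between test statistics): each intersection hypothesis is tested with its weights renormalized, and an elementary hypothesis is rejected only if every intersection containing it is rejected. The p-values and weights below are invented for illustration:

```python
from itertools import combinations

def closed_weighted_bonferroni(pvals, weights, alpha=0.05):
    """Closed test: H_i is rejected iff every intersection hypothesis
    containing i is rejected by a weighted Bonferroni test whose weights
    are renormalized over that intersection."""
    m = len(pvals)
    rejected = [True] * m
    for k in range(1, m + 1):
        for J in combinations(range(m), k):
            w_sum = sum(weights[j] for j in J)
            # reject H_J if any component p-value clears its weighted share of alpha
            ok = any(pvals[j] <= alpha * weights[j] / w_sum for j in J)
            if not ok:
                for j in J:
                    rejected[j] = False
    return rejected

# three hypotheses, weights reflecting study priorities (illustrative numbers)
rej = closed_weighted_bonferroni([0.005, 0.04, 0.30], [0.5, 0.3, 0.2])
print(rej)
```

    With these numbers only the first hypothesis survives the closure: H2's raw p-value of 0.04 would pass its own weighted level, but the intersection with H3 is not rejected.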

  17. Topical anaesthesia for needle-related pain in newborn infants.

    PubMed

    Foster, Jann P; Taylor, Christine; Spence, Kaye

    2017-02-04

    Hospitalised newborn neonates frequently undergo painful invasive procedures that involve penetration of the skin and other tissues by a needle. One intervention that can be used prior to a needle insertion procedure is application of a topical local anaesthetic. To evaluate the efficacy and safety of topical anaesthetics such as amethocaine and EMLA in newborn term or preterm infants requiring an invasive procedure involving puncture of skin and other tissues with a needle. We searched the Cochrane Central Register of Controlled Trials (CENTRAL), PubMed, Embase and CINAHL up to 15 May 2016; previous reviews including cross-references, abstracts, and conference proceedings. We contacted expert informants. We contacted authors directly to obtain additional data. We imposed no language restrictions. Randomised, quasi-randomised controlled trials, and cluster and cross-over randomised trials that compared the topical anaesthetics amethocaine and eutectic mixture of local anaesthetics (EMLA) in terms of anaesthetic efficacy and safety in newborn term or preterm infants requiring an invasive procedure involving puncture of skin and other tissues with a needle. Data collection and analysis: From the reports of the clinical trials we extracted data regarding clinical outcomes including pain, number of infants with a methaemoglobin level of 5% or above, number of needle prick attempts prior to a successful needle-related procedure, crying, time taken to complete the procedure, episodes of apnoea, episodes of bradycardia, episodes of oxygen desaturation, neurodevelopmental disability and other adverse events. Eight small randomised controlled trials met the inclusion criteria (n = 506). These studies compared either EMLA and placebo or amethocaine and placebo. No studies compared EMLA and amethocaine. We were unable to meta-analyse the outcome of pain due to differing outcome measures and methods of reporting. 
For EMLA, two individual studies reported a statistically significant reduction in pain compared to placebo during lumbar puncture and venepuncture. Three studies found no statistical difference between the groups during heel lancing. For amethocaine, three studies reported a statistically significant reduction in pain compared to placebo during venepuncture and one study reported a statistically significant reduction in pain compared to placebo during cannulation. One study reported no statistical difference between the two groups during intramuscular injection. One study reported no statistical difference between EMLA and the placebo group for successful venepuncture at first attempt. One study similarly reported no statistically significant difference between amethocaine and the placebo group for successful cannulation at first attempt. Risk of local redness, swelling or blanching was significantly higher with EMLA (typical risk ratio (RR) 1.65, 95% confidence interval (CI) 1.24 to 2.19; typical risk difference (RD) 0.17, 95% CI 0.09 to 0.26; n = 272; number needed to treat for an additional harmful outcome (NNTH) 6, 95% CI 4 to 11; I² = 92%, indicating considerable heterogeneity), although not for amethocaine (typical RR 2.11, 95% CI 0.72 to 6.16; typical RD 0.05, 95% CI -0.02 to 0.11, n = 221). These local skin reactions for EMLA and amethocaine were reported as short-lasting. Two studies reported no methaemoglobinaemia with single application of EMLA. The quality of the evidence on outcomes assessed according to GRADE was low to moderate. Overall, all the trials were small, and the effects were of uncertain clinical significance. The evidence regarding the effectiveness or safety of the interventions studied is inadequate to support clinical recommendations. 
There has been no evaluation regarding any long-term effects of topical anaesthetics in newborn infants. High-quality studies evaluating the efficacy and safety of topical anaesthetics such as amethocaine and EMLA for needle-related pain in newborn term or preterm infants are required. These studies should aim to determine the efficacy of these topical anaesthetics in groups of infants that are homogeneous for gestational age. While there was no methaemoglobinaemia in the studies that reported methaemoglobin, the efficacy and safety of EMLA, especially in very preterm infants and with repeated application, need to be further evaluated in future studies.
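    The risk ratio, risk difference, and NNTH figures quoted above are all derived from 2x2 tables of counts. The sketch below uses invented counts, not the review's pooled data, purely to show the arithmetic:

```python
# events of local skin reaction per arm (illustrative counts, not the review's data)
e_t, n_t = 45, 136   # treated (EMLA) arm: events, total
e_c, n_c = 27, 136   # placebo arm: events, total

rr = (e_t / n_t) / (e_c / n_c)   # risk ratio
rd = e_t / n_t - e_c / n_c       # risk difference
nnth = 1 / rd                    # number needed to treat for one additional harm
print(f"RR = {rr:.2f}, RD = {rd:.2f}, NNTH = {nnth:.1f}")
```

    Meta-analytic "typical" RR and RD additionally weight each trial's table (e.g. by inverse variance), which the single-table arithmetic above does not show.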

  18. Generalized massive optimal data compression

    NASA Astrophysics Data System (ADS)

    Alsing, Justin; Wandelt, Benjamin

    2018-05-01

    In this paper, we provide a general procedure for optimally compressing N data down to n summary statistics, where n is equal to the number of parameters of interest. We show that compression to the score function - the gradient of the log-likelihood with respect to the parameters - yields n compressed statistics that are optimal in the sense that they preserve the Fisher information content of the data. Our method generalizes earlier work on linear Karhunen-Loève compression for Gaussian data whilst recovering both lossless linear compression and quadratic estimation as special cases when they are optimal. We give a unified treatment that also includes the general non-Gaussian case as long as mild regularity conditions are satisfied, producing optimal non-linear summary statistics when appropriate. As a worked example, we derive explicitly the n optimal compressed statistics for Gaussian data in the general case where both the mean and covariance depend on the parameters.
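    Score compression can be sketched in its simplest case: Gaussian data with known unit covariance and a mean linear in a single parameter. The N data points are compressed to the single score statistic t, which preserves the Fisher information, and a quasi-maximum-likelihood estimate is recovered from t alone. All numbers here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
N, theta_true, theta_fid = 500, 2.0, 0.0
# Gaussian data with mean mu(theta) = theta * x and known unit covariance (C = I)
x = rng.normal(size=N)
d = theta_true * x + rng.normal(size=N)

# score compression: N numbers -> 1 summary, t = mu_dot^T C^{-1} (d - mu(theta_fid))
mu_dot = x                           # d mu / d theta
t = mu_dot @ (d - theta_fid * x)
F = mu_dot @ mu_dot                  # Fisher information (C = I)
theta_hat = theta_fid + t / F        # estimate from the single compressed statistic
print(f"theta_hat = {theta_hat:.3f}")
```

    In this linear-mean case the compressed estimate coincides with the full-data maximum-likelihood estimate, which is the sense in which the compression is lossless.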

  19. The Development of Statistical Models for Predicting Surgical Site Infections in Japan: Toward a Statistical Model-Based Standardized Infection Ratio.

    PubMed

    Fukuda, Haruhisa; Kuroki, Manabu

    2016-03-01

    To develop and internally validate a surgical site infection (SSI) prediction model for Japan. Retrospective observational cohort study. We analyzed surveillance data submitted to the Japan Nosocomial Infections Surveillance system for patients who had undergone target surgical procedures from January 1, 2010, through December 31, 2012. Logistic regression analyses were used to develop statistical models for predicting SSIs. An SSI prediction model was constructed for each of the procedure categories by statistically selecting the appropriate risk factors from among the collected surveillance data and determining their optimal categorization. Standard bootstrapping techniques were applied to assess potential overfitting. The C-index was used to compare the predictive performances of the new statistical models with those of models based on conventional risk index variables. The study sample comprised 349,987 cases from 428 participant hospitals throughout Japan, and the overall SSI incidence was 7.0%. The C-indices of the new statistical models were significantly higher than those of the conventional risk index models in 21 (67.7%) of the 31 procedure categories (P<.05). No significant overfitting was detected. Japan-specific SSI prediction models were shown to generally have higher accuracy than conventional risk index models. These new models may have applications in assessing hospital performance and identifying high-risk patients in specific procedure categories.
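    The C-index used above to compare predictive performance can be computed directly as the proportion of concordant case/non-case pairs. The sketch below scores a hypothetical logistic model on simulated data; it is not the Japan Nosocomial Infections Surveillance models themselves:

```python
import numpy as np

def c_index(y, risk):
    """C-index for a binary outcome: the probability that a randomly chosen
    case receives a higher predicted risk than a randomly chosen non-case
    (ties counted as 1/2)."""
    cases, controls = risk[y == 1], risk[y == 0]
    diff = cases[:, None] - controls[None, :]
    return ((diff > 0).sum() + 0.5 * (diff == 0).sum()) / diff.size

rng = np.random.default_rng(2)
n = 1000
x = rng.normal(size=n)                     # a single risk factor
p = 1 / (1 + np.exp(-(-2.5 + 1.2 * x)))    # hypothetical logistic SSI model
y = rng.binomial(1, p)                     # observed infections
ci = c_index(y, p)
print(f"C-index = {ci:.3f}")
```

    A C-index of 0.5 is chance-level discrimination and 1.0 is perfect; comparing the C-indices of two models on the same cohort is the comparison the abstract describes.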

  20. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data.

    PubMed

    Tekwe, Carmen D; Carroll, Raymond J; Dabney, Alan R

    2012-08-01

    Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques, including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and parametric survival and accelerated failure time (AFT) models with log-normal, log-logistic and Weibull distributions, were used to detect differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening as the proportion of missing data increases. The testing procedures discussed in this article can all be performed using readily available software such as R. The R codes are provided as supplemental materials. ctekwe@stat.tamu.edu.
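    Left-censoring by a detection limit, as described above, can be handled with a censored log-normal likelihood: observed log-intensities contribute density terms and censored observations contribute a CDF term. A minimal sketch with simulated data (the detection limit, sample size, and parameters are invented, and this is a single-group fit rather than the article's full differential-expression comparison):

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(3)
mu_true, sigma_true, lod = 10.0, 1.0, 9.0     # log-intensities; limit of detection
y = rng.normal(mu_true, sigma_true, 500)
observed = y[y >= lod]
n_cens = (y < lod).sum()                       # left-censored: only the count is known

def negloglik(params):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)                  # keep sigma positive
    ll = stats.norm.logpdf(observed, mu, sigma).sum()
    ll += n_cens * stats.norm.logcdf((lod - mu) / sigma)  # censored contribution
    return -ll

res = optimize.minimize(negloglik, x0=[np.mean(observed), 0.0])
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(f"mu_hat = {mu_hat:.2f}, sigma_hat = {sigma_hat:.2f}")
```

    Ignoring the censored count and averaging only the observed values would bias the mean upward; the CDF term is what corrects for the missing low-abundance features.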

  1. 7 CFR 800.86 - Inspection of shiplot, unit train, and lash barge grain in single lots.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... prescribed in the instructions. (b) Application procedure. Applications for the official inspection of... statistical acceptance sampling and inspection plan according to the provisions of this section and procedures... inspection as part of a single lot and accepted by a statistical acceptance sampling and inspection plan...

  2. Statistical Assessment of Variability of Terminal Restriction Fragment Length Polymorphism Analysis Applied to Complex Microbial Communities

    PubMed Central

    Rossi, Pierre; Gillet, François; Rohrbach, Emmanuelle; Diaby, Nouhou; Holliger, Christof

    2009-01-01

    The variability of terminal restriction fragment length polymorphism analysis applied to complex microbial communities was assessed statistically. Recent technological improvements were implemented in the successive steps of the procedure, resulting in a standardized procedure which provided a high level of reproducibility. PMID:19749066

  3. Statistical Analysis and Time Series Modeling of Air Traffic Operations Data From Flight Service Stations and Terminal Radar Approach Control Facilities : Two Case Studies

    DOT National Transportation Integrated Search

    1981-10-01

    Two statistical procedures have been developed to estimate hourly or daily aircraft counts. These counts can then be transformed into estimates of instantaneous air counts. The first procedure estimates the stable (deterministic) mean level of hourly...

  4. The change and development of statistical methods used in research articles in child development 1930-2010.

    PubMed

    Køppe, Simo; Dammeyer, Jesper

    2014-09-01

    The evolution of developmental psychology has been characterized by the use of different quantitative and qualitative methods and procedures. But how does the use of methods and procedures change over time? This study explores the change and development of statistical methods used in articles published in Child Development from 1930 to 2010. The methods used in every article in the first issue of every volume were categorized into four categories. Until 1980, relatively simple statistical methods were used. During the last 30 years there has been an explosive growth in the use of more advanced statistical methods, and the absence of statistical methods, or the use of only simple methods, has essentially been eliminated.

  5. Statistical properties of filtered pseudorandom digital sequences formed from the sum of maximum-length sequences

    NASA Technical Reports Server (NTRS)

    Wallace, G. R.; Weathers, G. D.; Graf, E. R.

    1973-01-01

    The statistics of filtered pseudorandom digital sequences called hybrid-sum sequences, formed from the modulo-two sum of several maximum-length sequences, are analyzed. The results indicate that a relation exists between the statistics of the filtered sequence and the characteristic polynomials of the component maximum-length sequences. An analysis procedure is developed for identifying a large group of sequences with good statistical properties for applications requiring the generation of analog pseudorandom noise. In this approach, the filtering process is approximated by the convolution of the sequence with a sum of unit step functions. A parameter reflecting the overall statistical properties of filtered pseudorandom sequences is derived; this parameter is called the statistical quality factor. A computer algorithm to calculate the statistical quality factor for the filtered sequences is presented, and the results for two examples of sequence combinations are included. The analysis reveals that the statistics of the signals generated with the hybrid-sum generator are potentially superior to the statistics of signals generated with maximum-length generators. Furthermore, fewer calculations are required to evaluate the statistics of a large group of hybrid-sum generators than are required to evaluate the statistics of the same size group of approximately equivalent maximum-length sequences.
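    The modulo-two combination of maximum-length sequences described above can be sketched with a small Fibonacci linear feedback shift register. The two degree-4 primitive polynomials below are standard textbook examples, not the specific generators analyzed in the report:

```python
def lfsr(taps, state, length):
    """Generate a binary sequence from a Fibonacci LFSR. `taps` are the
    feedback tap positions of the characteristic polynomial; with a
    primitive polynomial the output is a maximum-length (m-)sequence."""
    seq = []
    for _ in range(length):
        seq.append(state[-1])            # output the last stage
        fb = 0
        for t in taps:
            fb ^= state[t - 1]           # modulo-two feedback sum
        state = [fb] + state[:-1]        # shift right, insert feedback
    return seq

# two primitive polynomials of degree 4: x^4 + x + 1 and x^4 + x^3 + 1
a = lfsr([4, 1], [1, 0, 0, 0], 15)
b = lfsr([4, 3], [1, 0, 0, 0], 15)
hybrid = [u ^ v for u, v in zip(a, b)]   # modulo-two (XOR) hybrid-sum sequence
print(a, b, hybrid, sep="\n")
```

    Each degree-4 m-sequence has period 2^4 - 1 = 15 with exactly 8 ones per period; the hybrid-sum sequence is the elementwise XOR the report's generators build on before analog filtering.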

  6. A new zero-inflated negative binomial methodology for latent category identification.

    PubMed

    Blanchard, Simon J; DeSarbo, Wayne S

    2013-04-01

    We introduce a new statistical procedure for the identification of unobserved categories that vary between individuals and in which objects may span multiple categories. This procedure can be used to analyze data from a proposed sorting task in which individuals may simultaneously assign objects to multiple piles. The results of a synthetic example and a consumer psychology study involving categories of restaurant brands illustrate how the application of the proposed methodology to the new sorting task can account for a variety of categorization phenomena including multiple category memberships and for heterogeneity through individual differences in the saliency of latent category structures.
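    The zero-inflated negative binomial at the core of the proposed procedure mixes a point mass at zero (e.g. objects never assigned to any pile) with an ordinary negative binomial count distribution. A minimal sketch of the ZINB probability mass function; the parameter values are arbitrary:

```python
import numpy as np
from scipy.stats import nbinom

def zinb_pmf(k, pi, r, p):
    """Zero-inflated negative binomial: with probability pi the count is a
    structural zero; otherwise it follows NB(r, p)."""
    pmf = (1 - pi) * nbinom.pmf(k, r, p)
    return np.where(k == 0, pi + pmf, pmf)

k = np.arange(6)
probs = zinb_pmf(k, pi=0.3, r=2.0, p=0.5)
print(probs)
```

    Note that P(0) = pi + (1 - pi) * NB(0; r, p), so zero counts are inflated above what the negative binomial alone would predict; fitting pi, r, and p by maximum likelihood is the estimation step the procedure builds on.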

  7. Cleanroom certification model

    NASA Technical Reports Server (NTRS)

    Currit, P. A.

    1983-01-01

    The Cleanroom software development methodology is designed to take the gamble out of product releases for both suppliers and receivers of the software. The ingredients of this procedure are a life cycle of executable product increments, representative statistical testing, and a standard estimate of the MTTF (Mean Time To Failure) of the product at the time of its release. A statistical approach to software product testing using randomly selected samples of test cases is considered. A statistical model is defined for the certification process which uses the timing data recorded during test. A reasonableness argument for this model is provided that uses previously published data on software product execution. Also included is a derivation of the certification model estimators and a comparison of the proposed least squares technique with the more commonly used maximum likelihood estimators.
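    Under the exponential failure model often used in reliability certification, the maximum likelihood estimate of MTTF is simply the mean observed interfailure time, and mission reliability follows from it. This is a generic reliability-theory sketch on simulated test data, not the paper's specific least-squares certification estimators:

```python
import numpy as np

rng = np.random.default_rng(4)
mttf_true = 100.0
interfailure = rng.exponential(mttf_true, 40)   # times between failures during statistical test

# Under the exponential model, the MLE of MTTF is the sample mean interfailure time
mttf_hat = interfailure.mean()
t_mission = 50.0
reliability = np.exp(-t_mission / mttf_hat)      # P(no failure over the mission time)
print(f"MTTF estimate = {mttf_hat:.1f}, R({t_mission:.0f}) = {reliability:.2f}")
```

    The certification question is then whether the estimated MTTF, computed from representative random test cases, meets the release standard; the paper compares least-squares estimators of this quantity against the maximum likelihood estimator sketched here.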

  8. The use of analysis of variance procedures in biological studies

    USGS Publications Warehouse

    Williams, B.K.

    1987-01-01

    The analysis of variance (ANOVA) is widely used in biological studies, yet there remains considerable confusion among researchers about the interpretation of hypotheses being tested. Ambiguities arise when statistical designs are unbalanced, and in particular when not all combinations of design factors are represented in the data. This paper clarifies the relationship among hypothesis testing, statistical modelling and computing procedures in ANOVA for unbalanced data. A simple two-factor fixed effects design is used to illustrate three common parametrizations for ANOVA models, and some associations among these parametrizations are developed. Biologically meaningful hypotheses for main effects and interactions are given in terms of each parametrization, and procedures for testing the hypotheses are described. The standard statistical computing procedures in ANOVA are given along with their corresponding hypotheses. Throughout the development unbalanced designs are assumed and attention is given to problems that arise with missing cells.
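    One concrete source of the ambiguity discussed above: in an unbalanced design, "the main effect of A" can mean a hypothesis about unweighted cell means or about sample-size-weighted marginal means, and the two differ. A small numeric illustration with invented cell means and counts:

```python
import numpy as np

# Unbalanced 2x2 layout: cell means mu_ij with unequal cell counts n_ij
cell_means = np.array([[10.0, 14.0],
                       [12.0, 16.0]])
n = np.array([[30, 5],
              [5, 30]])

# Unweighted marginal means of factor A: each cell of B weighted equally
unweighted = cell_means.mean(axis=1)
# Weighted marginal means: each cell weighted by its sample size
weighted = (cell_means * n).sum(axis=1) / n.sum(axis=1)
print("unweighted:", unweighted)
print("weighted:  ", weighted)
```

    Here the unweighted A difference is 2.0 but the weighted difference is about 4.9, so the two parametrizations test genuinely different main-effect hypotheses; the standard computing procedures differ in which of these they report for unbalanced data.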

  9. A 20-year period of orthotopic liver transplantation activity in a single center: a time series analysis performed using the R Statistical Software.

    PubMed

    Santori, G; Andorno, E; Morelli, N; Casaccia, M; Bottino, G; Di Domenico, S; Valente, U

    2009-05-01

    In many Western countries a "minimum volume rule" policy has been adopted as a quality measure for complex surgical procedures. In Italy, the National Transplant Centre set the minimum number of orthotopic liver transplantation (OLT) procedures/y at 25/center. OLT procedures performed in a single center for a reasonably large period may be treated as a time series to evaluate trend, seasonal cycles, and nonsystematic fluctuations. Between January 1, 1987 and December 31, 2006, we performed 563 cadaveric donor OLTs to adult recipients. During 2007, there were another 28 procedures. The greatest numbers of OLTs/y were performed in 2001 (n = 51), 2005 (n = 50), and 2004 (n = 49). A time series analysis performed using R Statistical Software (Foundation for Statistical Computing, Vienna, Austria), a free software environment for statistical computing and graphics, showed an incremental trend after exponential smoothing as well as after seasonal decomposition. The predicted OLT/mo for 2007 calculated with the Holt-Winters exponential smoothing applied to the previous period 1987-2006 helped to identify the months where there was a major difference between predicted and performed procedures. The time series approach may be helpful to establish a minimum volume/y at a single-center level.
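    The trend extrapolation described above can be sketched with Holt's linear exponential smoothing, a simplified non-seasonal relative of the Holt-Winters procedure in R. The yearly counts below are illustrative, not the center's exact series:

```python
def holt(series, alpha=0.3, beta=0.1, horizon=3):
    """Holt's linear (double) exponential smoothing: maintain a smoothed
    level and trend, then extrapolate linearly for the forecast horizon."""
    level, trend = series[0], series[1] - series[0]
    for x in series[1:]:
        prev_level = level
        level = alpha * x + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return [level + h * trend for h in range(1, horizon + 1)]

# yearly OLT counts with an incremental trend (illustrative numbers)
olt_per_year = [20, 24, 27, 31, 35, 38, 43, 49, 50, 51]
forecasts = holt(olt_per_year)
print(forecasts)
```

    The full Holt-Winters procedure adds a seasonal component, which is what lets the monthly comparison in the abstract flag months where performed procedures deviate from prediction.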

  10. A close examination of double filtering with fold change and t test in microarray analysis

    PubMed Central

    2009-01-01

    Background: Many researchers use the double filtering procedure with fold change and t test to identify differentially expressed genes, in the hope that the double filtering will provide extra confidence in the results. Due to its simplicity, the double filtering procedure has been popular with applied researchers despite the development of more sophisticated methods. Results: This paper, for the first time to our knowledge, provides theoretical insight on the drawback of the double filtering procedure. We show that fold change assumes all genes to have a common variance while the t statistic assumes gene-specific variances. The two statistics are based on contradicting assumptions. Under the assumption that gene variances arise from a mixture of a common variance and gene-specific variances, we develop the theoretically most powerful likelihood ratio test statistic. We further demonstrate that the posterior inference based on a Bayesian mixture model and the widely used significance analysis of microarrays (SAM) statistic are better approximations to the likelihood ratio test than the double filtering procedure. Conclusion: We demonstrate through hypothesis testing theory, simulation studies and real data examples, that well constructed shrinkage testing methods, which can be united under the mixture gene variance assumption, can considerably outperform the double filtering procedure. PMID:19995439
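    The double filtering procedure critiqued above is easy to state in code: keep a gene only if it passes both a fold-change cutoff and a t-test cutoff. A sketch on simulated expression data; the thresholds and effect sizes are arbitrary:

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(5)
genes, reps = 1000, 5
ctrl = rng.normal(0.0, 1.0, (genes, reps))   # log-scale expression, control
trt = rng.normal(0.0, 1.0, (genes, reps))    # treatment
trt[:50] += 2.0                              # first 50 genes truly differential

log_fc = trt.mean(axis=1) - ctrl.mean(axis=1)     # log-scale fold change
t_stat, p = ttest_ind(trt, ctrl, axis=1)

# the double filter: require BOTH a large fold change AND a small p-value
selected = (np.abs(log_fc) > 1.0) & (p < 0.05)
print(f"{selected.sum()} genes pass the double filter "
      f"({selected[:50].sum()} of the 50 true positives)")
```

    The paper's point is that the two criteria encode contradictory variance assumptions; shrinkage statistics such as SAM replace this hard double cutoff with a single test that interpolates between the two.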

  11. New Mexico State Annual Evaluation Report, Fiscal Year Ending June 30, 1970. P. L. 89-10, Title 1 ESEA Projects.

    ERIC Educational Resources Information Center

    Legant, Jean; Eakens, Doyle R.

    Contents of the New Mexico State Annual Evaluation Report for ESEA Title I Projects, for fiscal year ending June 30, 1970, include: New Mexico allocations for 1969-70; school districts allocations for 1969-70--basic statistics, state education agency staff visits to local education agencies, changes in the effect of state agency procedures, effect…

  12. Statistics of Scientific Procedures on Living Animals Great Britain 2015 - highlighting an ongoing upward trend in animal use and missed opportunities.

    PubMed

    Hudson-Shore, Michelle

    2016-12-01

    The Annual Statistics of Scientific Procedures on Living Animals Great Britain 2015 indicate that the Home Office were correct in recommending that caution should be exercised when interpreting the 2014 data as an apparent decline in animal experiments. The 2015 report shows that, as the changes to the format of the annual statistics have become more familiar and less problematic, there has been a re-emergence of the upward trend in animal research and testing in Great Britain. The 2015 statistics report an increase in animal procedures (up to 4,142,631) and in the number of animals used (up to 4,069,349). This represents 1% more than the totals in 2013, and a 7% increase on the procedures reported in 2014. This paper details an analysis of these most recent statistics, providing information on overall animal use and highlighting specific issues associated with genetically-altered animals, dogs and primates. It also reflects on areas of the new format that have previously been highlighted as being problematic, and concludes with a discussion about the use of animals in regulatory research and testing, and how there are significant missed opportunities for replacing some of the animal-based tests in this area. 2016 FRAME.

  13. Quantifying the Diversity and Similarity of Surgical Procedures Among Hospitals and Anesthesia Providers.

    PubMed

    Dexter, Franklin; Ledolter, Johannes; Hindman, Bradley J

    2016-01-01

    In this Statistical Grand Rounds, we review methods for the analysis of the diversity of procedures among hospitals, the activities among anesthesia providers, etc. We apply multiple methods and consider their relative reliability and usefulness for perioperative applications, including calculations of SEs. We also review methods for comparing the similarity of procedures among hospitals, activities among anesthesia providers, etc. We again apply multiple methods and consider their relative reliability and usefulness for perioperative applications. The applications include strategic analyses (e.g., hospital marketing) and human resource analytics (e.g., comparisons among providers). Measures of diversity of procedures and activities (e.g., Herfindahl and Gini-Simpson index) are used for quantification of each facility (hospital) or anesthesia provider, one at a time. Diversity can be thought of as a summary measure. Thus, if the diversity of procedures for 48 hospitals is studied, the diversity (and its SE) is being calculated for each hospital. Likewise, the effective numbers of common procedures at each hospital can be calculated (e.g., by using the exponential of the Shannon index). Measures of similarity are pairwise assessments. Thus, if quantifying the similarity of procedures among cases with a break or handoff versus cases without a break or handoff, a similarity index represents a correlation coefficient. There are several different measures of similarity, and we compare their features and applicability for perioperative data. We rely extensively on sensitivity analyses to interpret observed values of the similarity index.
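    The diversity measures named in this abstract have simple closed forms and can be computed directly from a facility's case counts. A minimal sketch (the counts below are invented for illustration):

```python
import math

def diversity_measures(case_counts):
    """Compute common diversity indices for a vector of procedure counts.

    Returns the Herfindahl index, the Gini-Simpson index (1 - Herfindahl),
    and the effective number of common procedures, exp(Shannon index).
    """
    total = sum(case_counts)
    p = [c / total for c in case_counts if c > 0]
    herfindahl = sum(pi ** 2 for pi in p)          # concentration
    gini_simpson = 1.0 - herfindahl                # diversity
    shannon = -sum(pi * math.log(pi) for pi in p)  # entropy (nats)
    effective_n = math.exp(shannon)                # effective number of procedures
    return herfindahl, gini_simpson, effective_n

# A hypothetical hospital performing 4 procedures with uneven volume:
h, gs, n_eff = diversity_measures([400, 300, 200, 100])
```

    The reciprocal of the Herfindahl index is another widely used "effective number of common procedures"; the standard errors discussed in the article require resampling or analytic formulas not shown in this sketch.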

  14. Methods for collection and analysis of aquatic biological and microbiological samples

    USGS Publications Warehouse

    Greeson, Phillip E.; Ehlke, T.A.; Irwin, G.A.; Lium, B.W.; Slack, K.V.

    1977-01-01

    Chapter A4 contains methods used by the U.S. Geological Survey to collect, preserve, and analyze waters to determine their biological and microbiological properties. Part 1 discusses biological sampling and sampling statistics. The statistical procedures are accompanied by examples. Part 2 consists of detailed descriptions of more than 45 individual methods, including those for bacteria, phytoplankton, zooplankton, seston, periphyton, macrophytes, benthic invertebrates, fish and other vertebrates, cellular contents, productivity, and bioassays. Each method is summarized, and the application, interferences, apparatus, reagents, collection, analysis, calculations, reporting of results, precision and references are given. Part 3 consists of a glossary. Part 4 is a list of taxonomic references.

  15. Estimating times of surgeries with two component procedures: comparison of the lognormal and normal models.

    PubMed

    Strum, David P; May, Jerrold H; Sampson, Allan R; Vargas, Luis G; Spangler, William E

    2003-01-01

    Variability inherent in the duration of surgical procedures complicates surgical scheduling. Modeling the duration and variability of surgeries might improve time estimates. Accurate time estimates are important operationally to improve utilization, reduce costs, and identify surgeries that might be considered outliers. Surgeries with multiple procedures are difficult to model because they are difficult to segment into homogeneous groups and because they are performed less frequently than single-procedure surgeries. The authors studied, retrospectively, 10,740 surgeries each with exactly two CPTs and 46,322 surgical cases with only one CPT from a large teaching hospital to determine whether the distribution of dual-procedure surgery times fits a lognormal or a normal model more closely. The authors tested model goodness of fit to their data using Shapiro-Wilk tests, studied factors affecting the variability of time estimates, and examined the impact of coding permutations (ordered combinations) on modeling. The Shapiro-Wilk tests indicated that the lognormal model is statistically superior to the normal model for modeling dual-procedure surgeries. Permutations of component codes did not appear to differ significantly with respect to total procedure time and surgical time. To improve individual models for infrequent dual-procedure surgeries, permutations may be reduced and estimates may be based on the longest component procedure and type of anesthesia. The authors recommend use of the lognormal model for estimating surgical times for surgeries with two component procedures. Their results help legitimize the use of log transforms to normalize surgical procedure times prior to hypothesis testing using linear statistical models. Multiple-procedure surgeries may be modeled using the longest (statistically most important) component procedure and type of anesthesia.
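    The paper's comparison rests on Shapiro-Wilk tests; a simpler way to see the same lognormal-versus-normal contrast is to compare maximized Gaussian log-likelihoods on the raw and log-transformed times. A sketch under that simplification (the durations, in minutes, are invented):

```python
import math
from statistics import fmean, pvariance

def gaussian_loglik(xs):
    """Maximized Gaussian log-likelihood: -n/2 * (log(2*pi*var_mle) + 1)."""
    mu = fmean(xs)
    var = pvariance(xs, mu)  # MLE (population) variance
    n = len(xs)
    return -0.5 * n * (math.log(2 * math.pi * var) + 1)

def compare_normal_vs_lognormal(times):
    ll_normal = gaussian_loglik(times)
    logs = [math.log(t) for t in times]
    # Lognormal log-likelihood = Gaussian log-likelihood of the log-times
    # minus the Jacobian term sum(log t).
    ll_lognormal = gaussian_loglik(logs) - sum(logs)
    return ll_normal, ll_lognormal

# Right-skewed surgical times favor the lognormal model:
ll_n, ll_ln = compare_normal_vs_lognormal([10, 12, 15, 20, 30, 60, 120])
```

    For strongly right-skewed samples `ll_ln` exceeds `ll_n`, which is the same qualitative conclusion the authors reach with formal goodness-of-fit testing.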

  16. Applications of statistics to medical science (1) Fundamental concepts.

    PubMed

    Watanabe, Hiroshi

    2011-01-01

    The conceptual framework of statistical tests and statistical inference is discussed, and the epidemiological background of statistics is briefly reviewed. This study is one of a series in which we survey the basics of statistics and practical methods used in medical statistics. Arguments related to actual statistical analysis procedures will be made in subsequent papers.

  17. Comparing outcomes of pediatric and adult external dacryocystorhinostomy in Nepal: Is age a prognostic factor?

    PubMed

    Limbu, Ben; Katwal, Sulaxmi; Lim, Nicole S; Faierman, Michelle L; Gushchin, Anna G; Saiju, Rohit

    2017-08-01

    We determine whether age is a prognostic factor for surgical outcomes of external dacryocystorhinostomy (Ex-DCR). This retrospective cohort study conducted at Tilganga Institute of Ophthalmology (Kathmandu, Nepal) compared pediatric Ex-DCR procedures (age ≤ 15 years) to adult Ex-DCR procedures (age > 15 years) and was performed between January 2013 and December 2013, with a minimum follow-up period of 6 months. Primary outcome measure was rate of success, defined as complete resolution of subjective symptom(s) of epiphora (subjective success), combined with patent lacrimal passage on syringing (anatomical success) at last follow-up visit. Other outcome measures included clinical presentation, diagnosis, intraoperative complications and post-operative complications. In total, 154 Ex-DCR procedures were included, with an age range of 8 months to 81 years (mean age 36.4 ± 21.0 years). In all, 38 pediatric Ex-DCR procedures were compared to 116 adult procedures. Success rates were 97% in the pediatric group and 95% in the adult group, with no clinically or statistically significant difference in success rate or complication rate between groups (p > 0.05). Our study yielded high success rates of Ex-DCR in both pediatric and adult age groups suggesting that Ex-DCR remains an optimal treatment choice for all age groups. With no difference in surgical outcomes between pediatric and adult patients, including complication rate, we conclude that age is not a prognostic factor for Ex-DCR failure. We do not recommend adjuvant therapy for pediatric patients.

  18. Walking through the statistical black boxes of plant breeding.

    PubMed

    Xavier, Alencar; Muir, William M; Craig, Bruce; Rainey, Katy Martin

    2016-10-01

    The main statistical procedures in plant breeding are based on Gaussian process and can be computed through mixed linear models. Intelligent decision making relies on our ability to extract useful information from data to help us achieve our goals more efficiently. Many plant breeders and geneticists perform statistical analyses without understanding the underlying assumptions of the methods or their strengths and pitfalls. In other words, they treat these statistical methods (software and programs) like black boxes. Black boxes represent complex pieces of machinery with contents that are not fully understood by the user. The user sees the inputs and outputs without knowing how the outputs are generated. By providing a general background on statistical methodologies, this review aims (1) to introduce basic concepts of machine learning and its applications to plant breeding; (2) to link classical selection theory to current statistical approaches; (3) to show how to solve mixed models and extend their application to pedigree-based and genomic-based prediction; and (4) to clarify how the algorithms of genome-wide association studies work, including their assumptions and limitations.
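    Aim (3) of the review, solving mixed models, comes down to Henderson's mixed model equations. A minimal sketch for a one-way random-effects model with an intercept and a known variance ratio lambda = sigma_e^2 / sigma_u^2 (all numbers are illustrative, not from the review):

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def henderson_mme(y, groups, n_groups, lam):
    """Set up and solve Henderson's mixed model equations for y = 1*b + Z u + e:

        [ X'X      X'Z        ] [b]   [X'y]
        [ Z'X   Z'Z + lam * I ] [u] = [Z'y]

    with X a column of ones and Z the group indicator matrix.
    """
    n = len(y)
    counts = [groups.count(j) for j in range(n_groups)]
    sums = [sum(yi for yi, g in zip(y, groups) if g == j) for j in range(n_groups)]
    dim = 1 + n_groups
    A = [[0.0] * dim for _ in range(dim)]
    rhs = [0.0] * dim
    A[0][0] = float(n)
    rhs[0] = sum(y)
    for j in range(n_groups):
        A[0][1 + j] = A[1 + j][0] = float(counts[j])
        A[1 + j][1 + j] = counts[j] + lam
        rhs[1 + j] = sums[j]
    return solve(A, rhs)

# Two groups of two observations each, lambda = 1:
b, u1, u2 = henderson_mme([5.0, 6.0, 9.0, 10.0], [0, 0, 1, 1], 2, 1.0)
```

    The random-effect solutions are shrunken group-mean deviations (here ±4/3 instead of the raw ±2), the shrinkage that links classical selection theory to the pedigree-based and genomic prediction methods the review discusses.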

  19. An automated approach to the design of decision tree classifiers

    NASA Technical Reports Server (NTRS)

    Argentiero, P.; Chin, R.; Beaudet, P.

    1982-01-01

    An automated technique is presented for designing effective decision tree classifiers predicated only on a priori class statistics. The procedure relies on linear feature extractions and Bayes table look-up decision rules. Associated error matrices are computed and utilized to provide an optimal design of the decision tree at each so-called 'node'. A by-product of this procedure is a simple algorithm for computing the global probability of correct classification assuming the statistical independence of the decision rules. Attention is given to a more precise definition of decision tree classification, the mathematical details on the technique for automated decision tree design, and an example of a simple application of the procedure using class statistics acquired from an actual Landsat scene.
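    The "Bayes table look-up" idea at each node can be sketched for a single extracted scalar feature: precompute the maximum-posterior class for each quantized feature bin from the a priori class statistics. This sketch assumes Gaussian class-conditional densities, and all numbers are invented:

```python
import math

def gaussian_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def build_lookup_table(class_stats, priors, lo, hi, n_bins):
    """Precompute the Bayes decision (argmax of prior * likelihood) for each
    bin of a scalar feature, given per-class (mean, variance) statistics."""
    table = []
    for i in range(n_bins):
        x = lo + (i + 0.5) * (hi - lo) / n_bins   # bin midpoint
        scores = [p * gaussian_pdf(x, mu, var)
                  for (mu, var), p in zip(class_stats, priors)]
        table.append(max(range(len(scores)), key=scores.__getitem__))
    return table

def classify(x, table, lo, hi):
    """Classification is then a constant-time table look-up."""
    n_bins = len(table)
    i = min(n_bins - 1, max(0, int((x - lo) / (hi - lo) * n_bins)))
    return table[i]

# Two classes with unit variance and means 0 and 4; equal priors put the
# decision boundary at x = 2:
table = build_lookup_table([(0.0, 1.0), (4.0, 1.0)], [0.5, 0.5], -4.0, 8.0, 48)
```

    Chaining such per-node rules, each driven by its own linear feature extraction, yields the decision tree design the paper automates.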

  20. Biostatistical analysis of quantitative immunofluorescence microscopy images.

    PubMed

    Giles, C; Albrecht, M A; Lam, V; Takechi, R; Mamo, J C

    2016-12-01

    Semiquantitative immunofluorescence microscopy has become a key methodology in biomedical research. Typical statistical workflows are considered in the context of avoiding pseudo-replication and marginalising experimental error. However, immunofluorescence microscopy naturally generates hierarchically structured data that can be leveraged to improve statistical power and enrich biological interpretation. Herein, we describe a robust distribution fitting procedure and compare several statistical tests, outlining their potential advantages/disadvantages in the context of biological interpretation. Further, we describe tractable procedures for power analysis that incorporates the underlying distribution, sample size and number of images captured per sample. The procedures outlined have significant potential for increasing understanding of biological processes and decreasing both ethical and financial burden through experimental optimization. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.

  1. Treatment of control data in lunar phototriangulation. [application of statistical procedures and development of mathematical and computer techniques

    NASA Technical Reports Server (NTRS)

    Wong, K. W.

    1974-01-01

    In lunar phototriangulation, there is a complete lack of accurate ground control points. The accuracy analysis of the results of lunar phototriangulation must, therefore, be completely dependent on statistical procedures. It was the objective of this investigation to examine the validity of the commonly used statistical procedures, and to develop both mathematical techniques and computer software for evaluating (1) the accuracy of lunar phototriangulation; (2) the contribution of the different types of photo support data to the accuracy of lunar phototriangulation; (3) the accuracy of absolute orientation as a function of the accuracy and distribution of both the ground and model points; and (4) the relative slope accuracy between any triangulated pass points.

  2. Generalized Appended Product Indicator Procedure for Nonlinear Structural Equation Analysis.

    ERIC Educational Resources Information Center

    Wall, Melanie M.; Amemiya, Yasuo

    2001-01-01

    Considers the estimation of polynomial structural models and shows a limitation of an existing method. Introduces a new procedure, the generalized appended product indicator procedure, for nonlinear structural equation analysis. Addresses statistical issues associated with the procedure through simulation. (SLD)

  3. A Multidisciplinary Approach for Teaching Statistics and Probability

    ERIC Educational Resources Information Center

    Rao, C. Radhakrishna

    1971-01-01

    The author presents a syllabus for an introductory (first year after high school) course in statistics and probability and some methods of teaching statistical techniques. The description comes basically from the procedures used at the Indian Statistical Institute, Calcutta. (JG)

  4. Reproducibility-optimized test statistic for ranking genes in microarray studies.

    PubMed

    Elo, Laura L; Filén, Sanna; Lahesmaa, Riitta; Aittokallio, Tero

    2008-01-01

    A principal goal of microarray studies is to identify the genes showing differential expression under distinct conditions. In such studies, the selection of an optimal test statistic is a crucial challenge, which depends on the type and amount of data under analysis. While previous studies on simulated or spike-in datasets do not provide practical guidance on how to choose the best method for a given real dataset, we introduce an enhanced reproducibility-optimization procedure, which enables the selection of a suitable gene-ranking statistic directly from the data. In comparison with existing ranking methods, the reproducibility-optimized statistic shows consistently good performance under various simulated conditions and on an Affymetrix spike-in dataset. Further, the feasibility of the novel statistic is confirmed in a practical research setting using data from an in-house cDNA microarray study of asthma-related gene expression changes. These results suggest that the procedure facilitates the selection of an appropriate test statistic for a given dataset without relying on a priori assumptions, which may bias the findings and their interpretation. Moreover, the general reproducibility-optimization procedure is not limited to detecting differential expression only but could be extended to a wide range of other applications as well.
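    The published reproducibility-optimization procedure is more involved, but its core idea, that a good ranking statistic keeps its top genes stable under resampling, can be illustrated by the bootstrap overlap of top-k gene lists. A generic sketch, not the authors' algorithm, on toy expression values:

```python
import random
from statistics import fmean, pvariance

def score(xs, ys):
    """Signal-to-noise style ranking statistic for one gene."""
    return abs(fmean(xs) - fmean(ys)) / (
        (pvariance(xs) + pvariance(ys)) ** 0.5 + 1e-9)

def top_k(data_a, data_b, k):
    """Indices of the k highest-scoring genes (data are genes x samples)."""
    scores = [score(a, b) for a, b in zip(data_a, data_b)]
    return set(sorted(range(len(scores)), key=lambda g: -scores[g])[:k])

def reproducibility(data_a, data_b, k, n_boot=50, seed=0):
    """Average overlap between the observed top-k list and top-k lists
    recomputed on bootstrap resamples of the arrays in each group."""
    rng = random.Random(seed)
    observed = top_k(data_a, data_b, k)
    overlaps = []
    for _ in range(n_boot):
        idx_a = [rng.randrange(len(data_a[0])) for _ in data_a[0]]
        idx_b = [rng.randrange(len(data_b[0])) for _ in data_b[0]]
        boot_a = [[g[i] for i in idx_a] for g in data_a]
        boot_b = [[g[i] for i in idx_b] for g in data_b]
        overlaps.append(len(top_k(boot_a, boot_b, k) & observed) / k)
    return fmean(overlaps)

# Gene 0 is strongly differential; genes 1-3 are flat in both groups:
data_a = [[10, 11, 10, 11, 10, 11], [5] * 6, [5] * 6, [5] * 6]
data_b = [[0, 1, 0, 1, 0, 1], [5] * 6, [5] * 6, [5] * 6]
r = reproducibility(data_a, data_b, k=1)
```

    A statistic could then be tuned (as the authors do in a more principled way) by choosing the variant that maximizes this kind of reproducibility on the data at hand.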

  5. Sources of funding for adult and paediatric CT procedures at a metropolitan tertiary hospital: How much do Medicare statistics really cover?

    PubMed

    Hayton, Anna; Wallace, Anthony; Johnston, Peter

    2015-12-01

    The radiation dose to the Australian paediatric population as a result of medical imaging is of growing concern, in particular the dose from CT. Estimates of the Australian population dose have largely relied on Medicare Australia statistics, which capture only a fraction of those imaging procedures actually performed. The fraction not captured has been estimated using a value obtained from a survey of the adult population in the mid-1990s. To better quantify the fraction of procedures that are not captured by Medicare Australia, procedure frequency and funding data for adult and paediatric patients were obtained from a metropolitan tertiary teaching and research hospital. Five calendar years of data were obtained with a financial class specified for each individual procedure. The financial classes were grouped to give the percentage of Medicare Australia billable procedures for both adult and paediatric patients. The data were also grouped to align with the Medicare Australia age cohorts. The percentage of CT procedures billable to Medicare Australia increased from 16% to 28% between 2008 and 2012. In 2012, the percentage billable for adult and paediatric patients was 28% and 33%, respectively; however, many adult CT procedures are performed at stand-alone clinics, which bulk bill. Using Medicare Australia statistics alone, the frequency of paediatric CT procedures performed on the Australian paediatric population will be grossly underestimated. A correction factor of 4.5 is suggested for paediatric procedures and 1.5 for adult procedures. The fraction of actual procedures performed that are captured by Medicare Australia will vary with time. © 2015 The Royal Australian and New Zealand College of Radiologists.

  6. Streamflow measurements, basin characteristics, and streamflow statistics for low-flow partial-record stations operated in Massachusetts from 1989 through 1996

    USGS Publications Warehouse

    Ries, Kernell G.

    1999-01-01

    A network of 148 low-flow partial-record stations was operated on streams in Massachusetts during the summers of 1989 through 1996. Streamflow measurements (including historical measurements), measured basin characteristics, and estimated streamflow statistics are provided in the report for each low-flow partial-record station. Also included for each station are location information, streamflow-gaging stations for which flows were correlated to those at the low-flow partial-record station, years of operation, and remarks indicating human influences on streamflows at the station. Three or four streamflow measurements were made each year for three years during times of low flow to obtain nine or ten measurements for each station. Measured flows at the low-flow partial-record stations were correlated with same-day mean flows at a nearby gaging station to estimate streamflow statistics for the low-flow partial-record stations. The estimated streamflow statistics include the 99-, 98-, 97-, 95-, 93-, 90-, 85-, 80-, 75-, 70-, 65-, 60-, 55-, and 50-percent duration flows; the 7-day, 10-year and 2-year low flows; and the August median flow. Characteristics of the drainage basins for the stations that theoretically relate to the response of the station to climatic variations were measured from digital map data by use of an automated geographic information system procedure. Basin characteristics measured include drainage area; total stream length; mean basin slope; area of surficial stratified drift; area of wetlands; area of water bodies; and mean, maximum, and minimum basin elevation. Station descriptions and calculated streamflow statistics are also included in the report for the 50 continuous gaging stations used in correlations with the low-flow partial-record stations.
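    The correlation step described above can be sketched as a record-extension fit on log-transformed flows. The MOVE.1 (line of organic correlation) form shown here is one method commonly used for this purpose; the report's exact procedure may differ, and all flow values below are invented:

```python
import math
from statistics import fmean, stdev

def move1_fit(x_logs, y_logs):
    """MOVE.1 (line of organic correlation) fit on log flows:
    y = ybar + (Sy/Sx) * (x - xbar)."""
    xbar, ybar = fmean(x_logs), fmean(y_logs)
    slope = stdev(y_logs) / stdev(x_logs)
    return slope, ybar - slope * xbar

def estimate_statistic(index_gage_stat, measured_pairs):
    """Estimate a flow statistic at a partial-record station from the same
    statistic at a correlated index gaging station.

    measured_pairs: (same-day index-gage flow, partial-record measured flow).
    """
    x_logs = [math.log(q_gage) for q_gage, _ in measured_pairs]
    y_logs = [math.log(q_pr) for _, q_pr in measured_pairs]
    slope, intercept = move1_fit(x_logs, y_logs)
    return math.exp(intercept + slope * math.log(index_gage_stat))

# Nine or ten low-flow measurements would be used in practice; four here:
pairs = [(10.0, 5.0), (20.0, 10.0), (40.0, 20.0), (80.0, 40.0)]
q_est = estimate_statistic(30.0, pairs)
```

    With the exact half-flow relation in `pairs`, the estimate for an index-gage statistic of 30 is 15, illustrating how the gage's long record transfers to the short-record station.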

  7. Large Variability in the Diversity of Physiologically Complex Surgical Procedures Exists Nationwide Among All Hospitals Including Among Large Teaching Hospitals.

    PubMed

    Dexter, Franklin; Epstein, Richard H; Thenuwara, Kokila; Lubarsky, David A

    2017-11-22

    Multiple previous studies have shown that having a large diversity of procedures has a substantial impact on quality management of hospital surgical suites. At hospitals with substantial diversity, unless sophisticated statistical methods suitable for rare events are used, anesthesiologists working in surgical suites will have inaccurate predictions of surgical blood usage, case durations, cost accounting and price transparency, times remaining in late running cases, and use of intraoperative equipment. What is unknown is whether large diversity is a feature of only a small, distinctive set of hospitals nationwide (eg, the largest hospitals in each state or province). The 2013 United States Nationwide Readmissions Database was used to study heterogeneity among 1981 hospitals in their diversities of physiologically complex surgical procedures (ie, the procedure codes). The diversity of surgical procedures performed at each hospital was quantified using a summary measure, the number of different physiologically complex surgical procedures commonly performed at the hospital (ie, 1/Herfindahl). A total of 53.9% of all hospitals commonly performed <10 physiologically complex procedures (lower 99% confidence limit [CL], 51.3%). A total of 14.2% (lower 99% CL, 12.4%) of hospitals had >3-fold larger diversity (ie, >30 commonly performed physiologically complex procedures). Larger hospitals had greater diversity than the small- and medium-sized hospitals (P < .0001). Teaching hospitals had greater diversity than did the rural and urban nonteaching hospitals (P < .0001). A total of 80.0% of the 170 large teaching hospitals commonly performed >30 procedures (lower 99% CL, 71.9% of hospitals). However, there was considerable variability among the large teaching hospitals in their diversity (interquartile range of the numbers of commonly performed physiologically complex procedures = 19.3; lower 99% CL, 12.8 procedures).
The diversity of procedures represents a substantive differentiator among hospitals. Thus, the usefulness of statistical methods for operating room management should be expected to be heterogeneous among hospitals. Our results also show that "large teaching hospital" alone is an insufficient description for accurate prediction of the extent to which a hospital sustains the operational and financial consequences of performing a wide diversity of surgical procedures. Future research can evaluate the extent to which hospitals with very large diversity are indispensable in their catchment area.

  8. Intradural Procedural Time to Assess Technical Difficulty of Superciliary Keyhole and Pterional Approaches for Unruptured Middle Cerebral Artery Aneurysms

    PubMed Central

    Choi, Yeon-Ju; Son, Wonsoo; Park, Ki-Su

    2016-01-01

    Objective This study used the intradural procedural time to assess the overall technical difficulty involved in surgically clipping an unruptured middle cerebral artery (MCA) aneurysm via a pterional or superciliary approach. The clinical and radiological variables affecting the intradural procedural time were investigated, and the intradural procedural time compared between a superciliary keyhole approach and a pterional approach. Methods During a 5.5-year period, patients with a single MCA aneurysm were enrolled in this retrospective study. The selection criteria for a superciliary keyhole approach included: 1) maximum diameter of the unruptured MCA aneurysm <15 mm, 2) neck diameter of the MCA aneurysm <10 mm, and 3) aneurysm location involving the sphenoidal or horizontal (M1) segment of the MCA and the MCA bifurcation, excluding aneurysms distal to the MCA genu. Meanwhile, the control comparison group included patients with the same selection criteria as for a superciliary approach, yet who preferred a pterional approach to avoid a postoperative facial wound or due to preoperative skin trouble in the supraorbital area. To determine the variables affecting the intradural procedural time, a multiple regression analysis was performed using such data as the patient age and gender, maximum aneurysm diameter, aneurysm neck diameter, and length of the pre-aneurysm M1 segment. In addition, the intradural procedural times were compared between the superciliary and pterional patient groups, along with the other variables. Results A total of 160 patients underwent a superciliary (n=124) or pterional (n=36) approach for an unruptured MCA aneurysm. In the multiple regression analysis, an increase in the diameter of the aneurysm neck (p<0.001) was identified as a statistically significant factor increasing the intradural procedural time.
A Pearson correlation analysis also showed a positive correlation (r=0.340) between the neck diameter and the intradural procedural time. When comparing the superciliary and pterional groups, no statistically significant between-group difference was found in terms of the intradural procedural time reflecting the technical difficulty (mean ± standard deviation: 29.8 ± 13.0 min versus 27.7 ± 9.6 min). Conclusion A superciliary keyhole approach can be a useful alternative to a pterional approach for an unruptured MCA aneurysm with a maximum diameter <15 mm and neck diameter <10 mm, without posing any greater technical challenge. For both surgical approaches, the technical difficulty increases along with the neck diameter of the MCA aneurysm. PMID:27847568

  9. Evolution of treatment of fistula in ano.

    PubMed

    Blumetti, J; Abcarian, A; Quinteros, F; Chaudhry, V; Prasad, L; Abcarian, H

    2012-05-01

    Fistula-in-ano is a common medical problem affecting thousands of patients annually. In the past, the options for treatment of fistula-in-ano were limited to fistulotomy and/or seton placement. Current treatment options also include muscle-sparing techniques such as a dermal island flap, endorectal advancement flap, fibrin sealant injection, anal fistula plug, and most recently ligation of the intersphincteric fistula tract (LIFT procedure). This study seeks to evaluate types and time trends for treatment of fistula-in-ano. A retrospective review from 1975 to 2009 was performed. Data were collected and sorted into 5-year increments for type and time trends of treatment. Fistulotomy and partial fistulotomy were grouped as cutting procedures. Seton placement, fibrin sealant, dermal flap, endorectal flap, and fistula plug were grouped as noncutting procedures. Statistical analysis was performed for each time period to determine trends. With institutional review board approval, the records of 2,267 fistula operations available for analysis were included. Most of the patients were men (74 vs. 26%). Cutting procedures comprised 66.6% (n = 1510) of all procedures. Noncutting procedures were utilized in 33.4% (n = 757), including seton placement alone 370 (16.3%), fibrin sealant 168 (7.4%), dermal or endorectal flap 147 (6.5%), and fistula plug 72 (3.2%). The distribution of operations grouped in 5-year intervals is as follows: 1975-1979, 78 cutting and one noncutting; 1980-1984, 170 cutting and 10 noncutting; 1985-1989, 54 cutting and five noncutting; 1990-1994, 37 cutting and six noncutting; 1995-1999, 367 cutting and 167 noncutting; 2000-2004, 514 cutting and 283 noncutting; 2005-2009, 290 cutting and 285 noncutting. The percentage of cutting and noncutting procedures significantly differed over time, with cutting procedures decreasing and noncutting procedures increasing proportionally (χ(2) linear-by-linear association, p < 0.05).
Fistula-in-ano remains a common complex disease process. Its treatment has evolved to include a variety of noncutting techniques in addition to traditional fistulotomy. With the advent of more sphincter-sparing techniques, the number of patients undergoing fistulotomy should continue to decrease over time. Surgeons should become familiar with various surgical techniques so the treatment can be tailored to the patient.

  10. MiDAS ENCORE: Randomized Controlled Clinical Trial Report of 6-Month Results.

    PubMed

    Staats, Peter S; Benyamin, Ramsin M

    2016-02-01

    Patients suffering from neurogenic claudication due to lumbar spinal stenosis (LSS) often experience moderate to severe pain and significant functional disability. Neurogenic claudication results from progressive degenerative changes in the spine, and most often affects the elderly. Both the MILD® procedure and epidural steroid injections (ESIs) offer interventional pain treatment options for LSS patients experiencing neurogenic claudication refractory to more conservative therapies. MILD provides an alternative to ESIs via minimally invasive lumbar decompression. Prospective, multi-center, randomized controlled clinical trial. Twenty-six US interventional pain management centers. To compare patient outcomes following treatment with either MILD (treatment group) or ESIs (active control group) in LSS patients with neurogenic claudication and verified ligamentum flavum hypertrophy. This prospective, multi-center, randomized controlled clinical trial includes 2 study arms with a 1-to-1 randomization ratio. A total of 302 patients were enrolled, with 149 randomized to MILD and 153 to the active control. Six-month follow-up has been completed and is presented in this report. In addition, one year follow-up will be conducted for patients in both study arms, and supplementary 2 year outcome data will be collected for patients in the MILD group only. Outcomes are assessed using the Oswestry Disability Index (ODI), numeric pain rating scale (NPRS) and Zurich Claudication Questionnaire (ZCQ). Primary efficacy is the proportion of ODI responders, tested for statistical superiority of the MILD group versus the active control group. ODI responders are defined as patients achieving the validated Minimal Important Change (MIC) of ≥10-point improvement in ODI from baseline to follow-up. Similarly, secondary efficacy includes proportion of NPRS and ZCQ responders using validated MIC thresholds.
Primary safety is the incidence of device or procedure-related adverse events in each group. At 6 months, all primary and secondary efficacy results provided statistically significant evidence that MILD is superior to the active control. For primary efficacy, the proportion of ODI responders in the MILD group (62.2%) was statistically significantly higher than for the epidural steroid group (35.7%) (P < 0.001). Further, all secondary efficacy parameters demonstrated statistical superiority of MILD versus the active control. The primary safety endpoint was achieved, demonstrating that there is no difference in safety between MILD and ESIs (P = 1.00). Limitations include lack of patient blinding due to considerable differences in treatment protocols, and a potentially higher non-responder rate for both groups versus standard-of-care due to study restrictions on adjunctive pain therapies. Six month follow-up data from this trial demonstrate that the MILD procedure is statistically superior to epidural steroids, a known active treatment for LSS patients with neurogenic claudication and verified central stenosis due to ligamentum flavum hypertrophy. The results of all primary and secondary efficacy outcome measures achieved statistically superior outcomes in the MILD group versus ESIs. Further, there were no statistically significant differences in the safety profile between study groups. This prospective, multi-center, randomized controlled clinical trial provides strong evidence of the effectiveness of MILD versus epidural steroids in this patient population. NCT02093520.

  11. Comparison of Dissolution Similarity Assessment Methods for Products with Large Variations: f2 Statistics and Model-Independent Multivariate Confidence Region Procedure for Dissolution Profiles of Multiple Oral Products.

    PubMed

    Yoshida, Hiroyuki; Shibata, Hiroko; Izutsu, Ken-Ichi; Goda, Yukihiro

    2017-01-01

    The current Japanese Ministry of Health Labour and Welfare (MHLW)'s Guideline for Bioequivalence Studies of Generic Products uses averaged dissolution rates for the assessment of dissolution similarity between test and reference formulations. This study clarifies how the application of model-independent multivariate confidence region procedure (Method B), described in the European Medical Agency and U.S. Food and Drug Administration guidelines, affects similarity outcomes obtained empirically from dissolution profiles with large variations in individual dissolution rates. Sixty-one datasets of dissolution profiles for immediate release, oral generic, and corresponding innovator products that showed large variation in individual dissolution rates in generic products were assessed on their similarity by using the f2 statistics defined in the MHLW guidelines (MHLW f2 method) and two different Method B procedures, including a bootstrap method applied with f2 statistics (BS method) and a multivariate analysis method using the Mahalanobis distance (MV method). The MHLW f2 and BS methods provided similar dissolution similarities between reference and generic products. Although a small difference in the similarity assessment may be due to the decrease in the lower confidence interval for expected f2 values derived from the large variation in individual dissolution rates, the MV method provided results different from those obtained through the MHLW f2 and BS methods. Analysis of actual dissolution data for products with large individual variations would provide valuable information towards an enhanced understanding of these methods and their possible incorporation in the MHLW guidelines.
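    The f2 statistic at the heart of these comparisons has a closed form: f2 = 50 · log10(100 / sqrt(1 + mean squared difference)) between reference and test percent-dissolved values at matched time points. A minimal sketch (the profile values are invented):

```python
import math

def f2_similarity(ref, test):
    """f2 dissolution similarity factor.  Inputs are percent dissolved at
    matched time points; f2 >= 50 is the conventional similarity criterion."""
    n = len(ref)
    msd = sum((r - t) ** 2 for r, t in zip(ref, test)) / n
    return 50.0 * math.log10(100.0 / math.sqrt(1.0 + msd))

# Identical profiles give the maximum f2 of 100:
f2_same = f2_similarity([30, 55, 80, 92], [30, 55, 80, 92])
```

    A uniform 10-percentage-point difference at every time point gives an f2 just under 50, which is why 50 is the usual cutoff. The BS method studied here uses the lower bound of a bootstrap confidence interval for f2 rather than this point estimate.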

  12. Mediation Analysis with Survival Outcomes: Accelerated Failure Time vs. Proportional Hazards Models

    PubMed Central

    Gelfand, Lois A.; MacKinnon, David P.; DeRubeis, Robert J.; Baraldi, Amanda N.

    2016-01-01

    Objective: Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors, and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. Method: We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. Results: AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects, and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome—underestimation in LIFEREG, and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained from combining other coefficients, whereas this is not possible with PHREG. Conclusions: When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results. PMID:27065906

  13. Graft survival of diabetic versus nondiabetic donor tissue after initial keratoplasty.

    PubMed

    Vislisel, Jesse M; Liaboe, Chase A; Wagoner, Michael D; Goins, Kenneth M; Sutphin, John E; Schmidt, Gregory A; Zimmerman, M Bridget; Greiner, Mark A

    2015-04-01

    To compare corneal graft survival using tissue from diabetic and nondiabetic donors in patients undergoing initial Descemet stripping automated endothelial keratoplasty (DSAEK) or penetrating keratoplasty (PKP). A retrospective chart review of pseudophakic eyes that underwent DSAEK or PKP was performed. The primary outcome measure was graft failure. Cox proportional hazard regression and Kaplan-Meier survival analyses were used to compare diabetic versus nondiabetic donor tissue for all keratoplasty cases. A total of 183 eyes (136 DSAEK, 47 PKP) were included in the statistical analysis. Among 24 procedures performed using diabetic donor tissue, there were 4 cases (16.7%) of graft failure (3 DSAEK, 1 PKP), and among 159 procedures performed using nondiabetic donor tissue, there were 18 cases (11.3%) of graft failure (12 DSAEK, 6 PKP). Cox proportional hazard ratio of graft failure for all cases comparing diabetic with nondiabetic donor tissue was 1.69, but this difference was not statistically significant (95% confidence interval, 0.56-5.06; P = 0.348). There were no significant differences in Kaplan-Meier curves comparing diabetic with nondiabetic donor tissue for all cases (P = 0.380). Statistical analysis of graft failure by donor diabetes status within each procedure type was not possible because of the small number of graft failure events involving diabetic tissue. We found similar rates of graft failure in all keratoplasty cases when comparing tissue from diabetic and nondiabetic donors, but further investigation is needed to determine whether diabetic donor tissue results in different graft failure rates after DSAEK compared with PKP.
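The Kaplan-Meier comparison above rests on the product-limit estimator. A minimal sketch with invented follow-up times, not the study's data:

```python
def kaplan_meier(times, events):
    """Product-limit survival curve. `times` are months of follow-up;
    `events` are 1 for graft failure, 0 for censored observations.
    Returns (time, S(t)) pairs at each failure time."""
    data = sorted(zip(times, events))
    at_risk, s, curve = len(data), 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        leaving = [e for (tt, e) in data if tt == t]  # all subjects leaving at t
        failures = sum(leaving)
        if failures:
            s *= 1.0 - failures / at_risk
            curve.append((t, s))
        at_risk -= len(leaving)
        i += len(leaving)
    return curve

# Hypothetical follow-up (months) for one donor group:
print(kaplan_meier([6, 12, 12, 24, 36], [1, 1, 0, 0, 1]))
```

Curves like these, one per donor group, are what the log-rank comparison (P = 0.380 here) evaluates.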

  14. Procedure-related risk of miscarriage following amniocentesis and chorionic villus sampling: a systematic review and meta-analysis.

    PubMed

    Akolekar, R; Beta, J; Picciarelli, G; Ogilvie, C; D'Antonio, F

    2015-01-01

    To estimate procedure-related risks of miscarriage following amniocentesis and chorionic villus sampling (CVS) based on a systematic review of the literature and a meta-analysis. A search of MEDLINE, EMBASE, CINAHL and The Cochrane Library (2000-2014) was performed to review relevant citations reporting procedure-related complications of amniocentesis and CVS. Only studies reporting data on more than 1000 procedures were included in this review to minimize the effect of bias from smaller studies. Heterogeneity between studies was estimated using Cochran's Q, the I² statistic and Egger bias. Meta-analysis of proportions was used to derive weighted pooled estimates for the risk of miscarriage before 24 weeks' gestation. Incidence-rate difference meta-analysis was used to estimate pooled procedure-related risks. The weighted pooled risks of miscarriage following invasive procedures were estimated from analysis of controlled studies including 324 losses in 42 716 women who underwent amniocentesis and 207 losses in 8899 women who underwent CVS. The risk of miscarriage prior to 24 weeks in women who underwent amniocentesis and CVS was 0.81% (95% CI, 0.58-1.08%) and 2.18% (95% CI, 1.61-2.82%), respectively. The background rates of miscarriage in women from the control group who did not undergo any procedures were 0.67% (95% CI, 0.46-0.91%) for amniocentesis and 1.79% (95% CI, 0.61-3.58%) for CVS. The weighted pooled procedure-related risks of miscarriage for amniocentesis and CVS were 0.11% (95% CI, -0.04 to 0.26%) and 0.22% (95% CI, -0.71 to 1.16%), respectively. The procedure-related risks of miscarriage following amniocentesis and CVS are much lower than currently quoted. Copyright © 2014 ISUOG. Published by John Wiley & Sons Ltd.
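The "procedure-related risk" in this design is a risk difference: the miscarriage rate in the procedure group minus the background rate in controls. A minimal single-study sketch with a Wald interval, using hypothetical counts rather than the pooled study data (the actual meta-analysis weights multiple studies):

```python
import math

def risk_difference(ev1, n1, ev0, n0, z=1.96):
    """Risk difference with a Wald 95% CI: risk among women who underwent
    the procedure minus the background risk among unexposed controls."""
    p1, p0 = ev1 / n1, ev0 / n0
    rd = p1 - p0
    se = math.sqrt(p1 * (1 - p1) / n1 + p0 * (1 - p0) / n0)
    return rd, rd - z * se, rd + z * se

# Hypothetical single-study counts (losses / women):
rd, lo, hi = risk_difference(34, 4200, 28, 4200)
print(f"RD = {rd:.4%} (95% CI {lo:.4%} to {hi:.4%})")
```

When the interval crosses zero, as in the pooled estimates above, the data are compatible with no excess procedure-related loss.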

  15. Simulation-based educational curriculum for fluoroscopically guided lumbar puncture improves operator confidence and reduces patient dose.

    PubMed

    Faulkner, Austin R; Bourgeois, Austin C; Bradley, Yong C; Hudson, Kathleen B; Heidel, R Eric; Pasciak, Alexander S

    2015-05-01

    Fluoroscopically guided lumbar puncture (FGLP) is a commonly performed procedure with increased success rates relative to bedside technique. However, FGLP also exposes both patient and staff to ionizing radiation. The purpose of this study was to determine if the use of a simulation-based FGLP training program using an original, inexpensive lumbar spine phantom could improve operator confidence and efficiency, while also reducing patient dose. A didactic and simulation-based FGLP curriculum was designed, including a 1-hour lecture and hands-on training with a lumbar spine phantom prototype developed at our institution. Six incoming post-graduate year 2 (PGY-2) radiology residents completed a short survey before taking the course, and each resident practiced 20 simulated FGLPs using the phantom before their first clinical procedure. Data from the 114 lumbar punctures (LPs) performed by the six trained residents (prospective cohort) were compared to data from 514 LPs performed by 17 residents who did not receive simulation-based training (retrospective cohort). Fluoroscopy time (FT), FGLP success rate, and indication were compared. There was a statistically significant reduction in average FT for the 114 procedures performed by the prospective study cohort compared to the 514 procedures performed by the retrospective cohort. This held true for all procedures in aggregate, LPs for myelography, and all procedures performed for a diagnostic indication. Aggregate FT for the prospective group (0.87 ± 0.68 minutes) was significantly lower compared to the retrospective group (1.09 ± 0.65 minutes) and resulted in a 25% reduction in average FT (P = .002). There was no statistically significant difference in the number of failed FGLPs between the two groups. Our simulation-based FGLP curriculum resulted in improved operator confidence and reduced FT. These changes suggest that resident procedure efficiency was improved, whereas patient dose was reduced. 
The FGLP training program was implemented by radiology residents and required a minimal investment of time and resources. The LP spine phantom used during training was inexpensive, durable, and effective. In addition, the phantom is compatible with multiple modalities including fluoroscopy, computed tomography, and ultrasound and could be easily adapted to other applications such as facet injections or joint arthrograms. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.
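The fluoroscopy-time comparison can be reproduced approximately from the reported summary statistics with Welch's t test. This is a sketch under assumptions, not the authors' exact analysis; the p-value uses a normal approximation, which is adequate at these sample sizes:

```python
import math

def welch_t_from_summary(m1, sd1, n1, m2, sd2, n2):
    """Welch's t statistic from group summaries, with a two-sided p-value
    from the normal approximation."""
    se = math.sqrt(sd1 ** 2 / n1 + sd2 ** 2 / n2)
    t = (m1 - m2) / se
    p = math.erfc(abs(t) / math.sqrt(2))  # two-sided tail probability
    return t, p

# Reported fluoroscopy times: retrospective 1.09 +/- 0.65 min (n = 514)
# vs. prospective 0.87 +/- 0.68 min (n = 114)
t, p = welch_t_from_summary(1.09, 0.65, 514, 0.87, 0.68, 114)
print(round(t, 2), round(p, 3))  # t ≈ 3.15, p ≈ 0.002
```

The result is consistent with the reported P = .002 for the aggregate comparison.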

  16. Development, implementation and evaluation of satellite-aided agricultural monitoring systems

    NASA Technical Reports Server (NTRS)

    Cicone, R. C.; Crist, E. P.; Metzler, M.; Nuesch, D.

    1982-01-01

    Research activities in support of the AgRISTARS Inventory Technology Development Project on the use of aerospace remote sensing for agricultural inventory are described, including: (1) characterization of corn and soybean crop spectral-temporal signatures; (2) development of efficient area estimation techniques; and (3) definition of advanced satellite and sensor systems. Studies include a statistical evaluation of the impact of cultural and environmental factors on crop spectral profiles, the development and evaluation of an automatic crop area estimation procedure, and the joint use of SEASAT-SAR and LANDSAT MSS for crop inventory.

  17. Regression methods for spatially correlated data: an example using beetle attacks in a seed orchard

    Treesearch

    Preisler Haiganoush; Nancy G. Rappaport; David L. Wood

    1997-01-01

    We present a statistical procedure for studying the simultaneous effects of observed covariates and unmeasured spatial variables on responses of interest. The procedure uses regression type analyses that can be used with existing statistical software packages. An example using the rate of twig beetle attacks on Douglas-fir trees in a seed orchard illustrates the...

  18. 40 CFR Appendix XVIII to Part 86 - Statistical Outlier Identification Procedure for Light-Duty Vehicles and Light Light-Duty Trucks...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 20 2012-07-01 2012-07-01 false Statistical Outlier Identification Procedure for Light-Duty Vehicles and Light Light-Duty Trucks Certifying to the Provisions of Part 86... (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF EMISSIONS FROM NEW AND IN-USE HIGHWAY VEHICLES AND ENGINES...

  19. 40 CFR Appendix XVIII to Part 86 - Statistical Outlier Identification Procedure for Light-Duty Vehicles and Light Light-Duty Trucks...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 20 2013-07-01 2013-07-01 false Statistical Outlier Identification Procedure for Light-Duty Vehicles and Light Light-Duty Trucks Certifying to the Provisions of Part 86... (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF EMISSIONS FROM NEW AND IN-USE HIGHWAY VEHICLES AND ENGINES...

  20. Information flow and quantum cryptography using statistical fluctuations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Home, D.; Whitaker, M.A.B.

    2003-02-01

    A procedure is formulated, using the quantum teleportation arrangement, that communicates knowledge of an apparatus setting between the wings of the experiment, using statistical fluctuations in a sequence of measurement results. It requires an entangled state, and transmission of classical information totally unrelated to the apparatus setting actually communicated. Our procedure has conceptual interest, and has applications to quantum cryptography.

  1. Assessing residents' operative skills for external ventricular drain placement and shunt surgery in pediatric neurosurgery.

    PubMed

    Aldave, Guillermo; Hansen, Daniel; Briceño, Valentina; Luerssen, Thomas G; Jea, Andrew

    2017-04-01

    OBJECTIVE The authors previously demonstrated the use of a validated Objective Structured Assessment of Technical Skills (OSATS) tool for evaluating residents' operative skills in pediatric neurosurgery. However, no benchmarks have been established for specific pediatric procedures despite an increased need for meaningful assessments that can either allow for early intervention for underperforming trainees or allow for proficient residents to progress to conducting operations independently with more passive supervision. This validated methodology and tool for assessment of operative skills for common pediatric neurosurgical procedures (external ventricular drain [EVD] placement and shunt surgery) was applied to establish its procedure-based feasibility and reliability, and to document the effect of repetition on achieving surgical skill proficiency in pediatric EVD placement and shunt surgery. METHODS A procedure-based technical skills assessment for EVD placements and shunt surgeries in pediatric neurosurgery was established through the use of task analysis. The authors enrolled all residents from 3 training programs (Baylor College of Medicine, Houston Methodist Hospital, and University of Texas-Medical Branch) who rotated through pediatric neurosurgery at Texas Children's Hospital over a 26-month period. For each EVD placement or shunt procedure performed with a resident, the faculty and resident (for self-assessment) completed an evaluation form (OSATS) based on a 5-point Likert scale with 7 categories. Data forms were then grouped according to faculty versus resident (self) assessment, length of pediatric neurosurgery rotation, postgraduate year level, and date of evaluation ("beginning of rotation," within 1 month of start date; "end of rotation," within 1 month of completion date; or "middle of rotation"). Descriptive statistical analyses were performed with the commercially available SPSS statistical software package.
A p value < 0.05 was considered statistically significant. RESULTS Five attending evaluators (including 2 fellows who acted as attending surgeons) completed 260 evaluations. Twenty house staff completed 269 evaluations for self-assessment. Evaluations were completed in 562 EVD and shunt procedures before the surgeons left the operating room. There were statistically significant differences (p < 0.05) between overall attending (mean 4.3) and junior resident (self; mean 3.6) assessments, and between overall attending (mean 4.8) and senior resident (self; mean 4.6) assessment scores on general performance and technical skills. The learning curves produced for the residents demonstrate a stereotypical U- or V-shaped curve for acquiring skills, with a significant improvement in overall scores at the end of the rotation compared with the beginning. The improvement for junior residents (Δ score = 0.5; p = 0.002) was larger than for senior residents (Δ score = 0.2; p = 0.018). CONCLUSIONS The OSATS is an effective assessment tool as part of a comprehensive evaluation of neurosurgery residents' performance for specific pediatric procedures. The authors observed a U-shaped learning curve, contradicting the idea that developing one's surgical technique and learning a procedure represents a monotonic, cumulative process of repetitions and improvement.

  2. 2012/14 Beginning Postsecondary Students Longitudinal Study: (BPS:12/14). Supporting Statement Part A. Request for OMB Review. OMB #1850-0631 v.8. Revised May 28, 2014

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2014

    2014-01-01

    The National Center for Education Statistics (NCES), within the U.S. Department of Education (ED), Institute of Education Sciences, is requesting clearance for data collection materials and procedures for the full-scale collection of the 2012/14 Beginning Postsecondary Students Longitudinal Study (BPS:12/14) first follow-up, including the student…

  3. Selected aspects of microelectronics technology and applications: Numerically controlled machine tools. Technology trends series no. 2

    NASA Astrophysics Data System (ADS)

    Sigurdson, J.; Tagerud, J.

    1986-05-01

    A UNIDO publication about machine tools with automatic control discusses the following: (1) numerical control (NC) machine tool perspectives, definition of NC, flexible manufacturing systems, robots and their industrial application, research and development, and sensors; (2) experience in developing a capability in NC machine tools; (3) policy issues; (4) procedures for retrieval of relevant documentation from data bases. Diagrams, statistics, bibliography are included.

  4. Reconsidering barriers to wind power projects: community engagement, developer transparency and place

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Firestone, Jeremy; Hoen, Ben; Rand, Joseph

    In 2016, we undertook a nationally representative wind power perceptions survey of individuals living within 8 km of over 600 projects in the United States, generating 1705 telephone, web, and mail responses. We sought information on a variety of topics, including procedural fairness and its relationship to project attitude, the foci of the present analysis. Here, we present a series of descriptive statistics and regression results, emphasizing those residents who were aware of their local project prior to construction. Sample weighting is employed to account for stratification and non-response. We find that a developer being open and transparent, a community being able to influence the outcome, and having a say in the planning process are all statistically significant predictors of a process perceived as being ‘fair,’ with an open and transparent developer having the largest effect. We also find developer transparency and ability to influence outcomes to have statistically significant relationships to a more positive attitude, with those findings holding when aesthetics, landscape, and wind turbine sound considerations are controlled for. The results indicate that jurisdictions might consider developing procedures, which ensure citizens are consulted and heard, and benchmarks or best practices for developer interaction with communities and citizens.

  5. Reconsidering barriers to wind power projects: community engagement, developer transparency and place

    DOE PAGES

    Firestone, Jeremy; Hoen, Ben; Rand, Joseph; ...

    2017-12-21

    In 2016, we undertook a nationally representative wind power perceptions survey of individuals living within 8 km of over 600 projects in the United States, generating 1705 telephone, web, and mail responses. We sought information on a variety of topics, including procedural fairness and its relationship to project attitude, the foci of the present analysis. Here, we present a series of descriptive statistics and regression results, emphasizing those residents who were aware of their local project prior to construction. Sample weighting is employed to account for stratification and non-response. We find that a developer being open and transparent, a community being able to influence the outcome, and having a say in the planning process are all statistically significant predictors of a process perceived as being ‘fair,’ with an open and transparent developer having the largest effect. We also find developer transparency and ability to influence outcomes to have statistically significant relationships to a more positive attitude, with those findings holding when aesthetics, landscape, and wind turbine sound considerations are controlled for. The results indicate that jurisdictions might consider developing procedures, which ensure citizens are consulted and heard, and benchmarks or best practices for developer interaction with communities and citizens.

  6. Fast Identification of Biological Pathways Associated with a Quantitative Trait Using Group Lasso with Overlaps

    PubMed Central

    Silver, Matt; Montana, Giovanni

    2012-01-01

    Where causal SNPs (single nucleotide polymorphisms) tend to accumulate within biological pathways, the incorporation of prior pathways information into a statistical model is expected to increase the power to detect true associations in a genetic association study. Most existing pathways-based methods rely on marginal SNP statistics and do not fully exploit the dependence patterns among SNPs within pathways. We use a sparse regression model, with SNPs grouped into pathways, to identify causal pathways associated with a quantitative trait. Notable features of our “pathways group lasso with adaptive weights” (P-GLAW) algorithm include the incorporation of all pathways in a single regression model, an adaptive pathway weighting procedure that accounts for factors biasing pathway selection, and the use of a bootstrap sampling procedure for the ranking of important pathways. P-GLAW takes account of the presence of overlapping pathways and uses a novel combination of techniques to optimise model estimation, making it fast to run, even on whole genome datasets. In a comparison study with an alternative pathways method based on univariate SNP statistics, our method demonstrates high sensitivity and specificity for the detection of important pathways, showing the greatest relative gains in performance where marginal SNP effect sizes are small. PMID:22499682

  7. Statistical monitoring of the hand, foot and mouth disease in China.

    PubMed

    Zhang, Jingnan; Kang, Yicheng; Yang, Yang; Qiu, Peihua

    2015-09-01

    In a period starting around 2007, Hand, Foot, and Mouth Disease (HFMD) became widespread in China, seriously threatening public health. To help prevent outbreaks of infectious diseases like HFMD, effective disease surveillance systems should signal disease outbreaks as early as possible. Statistical process control (SPC) charts provide a major statistical tool in industrial quality control for detecting product defectives in a timely manner. In recent years, SPC charts have been used for disease surveillance. However, disease surveillance data often have much more complicated structures than data collected from industrial production lines. Major challenges, including lack of in-control data, complex seasonal effects, and spatio-temporal correlations, make the surveillance data difficult to handle. In this article, we propose a three-step procedure for analyzing disease surveillance data, and our procedure is demonstrated using the HFMD data collected during 2008-2009 in China. Our method uses nonparametric longitudinal data and time series analysis methods to eliminate the possible impact of seasonality and temporal correlation before the disease incidence data are sequentially monitored by an SPC chart. At both national and provincial levels, our proposed method can effectively detect an increasing trend in disease incidence rate before the disease becomes widespread. © 2015, The International Biometric Society.
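The final monitoring step can be sketched with a one-sided CUSUM chart applied after seasonality has been removed. The series, reference value k, and decision limit h below are illustrative assumptions, not the paper's data or tuning:

```python
def cusum_upper(x, mean, sd, k=0.5, h=4.0):
    """One-sided upper CUSUM on standardized observations.
    Returns the index of the first out-of-control signal, or None."""
    s = 0.0
    for i, xi in enumerate(x):
        z = (xi - mean) / sd       # standardize against in-control behavior
        s = max(0.0, s + z - k)    # accumulate upward deviations beyond k
        if s > h:
            return i
    return None

# In-control incidence followed by a sustained upward shift:
series = [10, 11, 9, 10, 12, 9, 10, 14, 15, 16, 17, 18]
print(cusum_upper(series, mean=10.0, sd=1.5))  # signals at index 8
```

The chart accumulates small sustained deviations, which is why it flags an emerging upward trend shortly after the shift begins rather than waiting for a single extreme count.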

  8. The incidence of vitamin, mineral, herbal, and other supplement use in facial cosmetic patients.

    PubMed

    Zwiebel, Samantha J; Lee, Michelle; Alleyne, Brendan; Guyuron, Bahman

    2013-07-01

    Dietary supplement use is common in the United States. Some herbal supplements may cause coagulopathy, hypertension, or dry eyes. The goal of this study is to reveal the incidence of herbal supplement use in the cosmetic surgery population. A retrospective chart review of 200 patients undergoing facial cosmetic surgery performed by a single surgeon was performed. Variables studied included patient age, sex, surgical procedure, herbal medication use, and intraoperative variables. Exclusion criteria were age younger than 15 years, noncosmetic procedures such as trauma, and an incomplete preoperative medication form. Patients were subdivided into the supplement user group (herbal) and the supplement nonuser group (nonherbal). Statistical analysis included descriptive statistics, t test, and chi-square analysis. The incidence of supplement use was 49 percent in the 200 patients; 24.5 percent of patients used only vitamins or minerals, 2.5 percent used only animal- and plant-based (nonvitamin/mineral) supplements, and 22 percent used both types of supplements. In the herbal group, patients used an average of 2.8 supplements. The herbal and nonherbal groups differed significantly in sex (herbal, 89.8 percent female; nonherbal, 77.5 percent; p < 0.04) and age (herbal, 51.4 years; nonherbal, 38.5 years; p < 0.001). Herbal supplement use is prevalent in the facial cosmetic surgery population, especially among older women. Given the potential ill effects of these products on surgery and recovery, awareness, careful documentation, and advising patients to discontinue these products before surgery will increase safety and improve recovery following cosmetic procedures.

  9. Improved staff procedure skills lead to improved management skills: an observational study in an educational setting.

    PubMed

    Rüter, Anders; Vikstrom, Tore

    2009-01-01

    Good staff procedure skills in a management group during incidents and disasters are believed to be a prerequisite for good management of the situation. However, this has not been demonstrated scientifically. Templates for evaluating results from performance indicators during simulation exercises have previously been tested. The aim of this study was to demonstrate that these indicators can be used as a tool for studying the relationship between good management skills and good staff procedure skills. The hypothesis was that good and structured work (staff procedure skills) in a hospital management group during simulation exercises in disaster medicine is related to good and timely decisions (good management skills). Results from 29 consecutive simulation exercises in which staff procedure skills and management skills were evaluated using quantitative measurements were included. The statistical analysis method used was simple linear regression with staff procedure skills as the response variable and management skills as the predictor variable. An overall significant relationship was identified between staff procedure skills and management skills (p < 0.05). This study suggests that there is a relationship between staff procedure skills and management skills in the educational setting used. Future studies are needed to determine whether this can also be observed during actual incidents.
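Simple linear regression of the kind described has a closed-form least-squares solution. A minimal sketch with invented exercise scores (the variable names mirror the study's roles; the numbers are hypothetical):

```python
def simple_linear_regression(x, y):
    """Ordinary least squares intercept a and slope b for y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sxy / sxx
    return my - b * mx, b

# Hypothetical scores from simulation exercises
# (predictor: management skills; response: staff procedure skills):
management = [3.0, 4.5, 5.0, 6.5, 8.0]
procedure = [2.5, 4.0, 5.5, 6.0, 7.5]
a, b = simple_linear_regression(management, procedure)
print(round(a, 2), round(b, 2))  # slope near 1: the two skill scores move together
```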

  10. User manual for Blossom statistical package for R

    USGS Publications Warehouse

    Talbert, Marian; Cade, Brian S.

    2005-01-01

    Blossom is an R package with functions for making statistical comparisons with distance-function based permutation tests developed by P.W. Mielke, Jr. and colleagues at Colorado State University (Mielke and Berry, 2001) and for testing parameters estimated in linear models with permutation procedures developed by B. S. Cade and colleagues at the Fort Collins Science Center, U.S. Geological Survey. This manual is intended to provide identical documentation of the statistical methods and interpretations as the manual by Cade and Richards (2005) does for the original Fortran program, but with changes made with respect to command inputs and outputs to reflect the new implementation as a package for R (R Development Core Team, 2012). This implementation in R has allowed for numerous improvements not supported by the Cade and Richards (2005) Fortran implementation, including use of categorical predictor variables in most routines.
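Permutation tests of the kind Blossom implements follow a generic recipe: compute a statistic on the observed labeling, then re-compute it under random relabelings. A minimal two-sample sketch using an ordinary mean difference (not Mielke's distance-function statistics), with made-up data:

```python
import random

def permutation_test(a, b, n_perm=10000, seed=1):
    """Two-sided permutation p-value for a difference in group means."""
    rng = random.Random(seed)
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = list(a) + list(b)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                       # random relabeling
        pa, pb = pooled[:len(a)], pooled[len(a):]
        if abs(sum(pa) / len(pa) - sum(pb) / len(pb)) >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)             # add-one correction for validity

print(permutation_test([1.2, 1.9, 2.1, 2.8], [3.9, 4.4, 5.1, 5.6]))
```

Because the null distribution is built from the data themselves, no distributional assumption about the errors is needed, which is the appeal of the permutation approach the package takes.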

  11. 50 CFR 600.130 - Protection of confidentiality of statistics.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... statistics. 600.130 Section 600.130 Wildlife and Fisheries FISHERY CONSERVATION AND MANAGEMENT, NATIONAL... Fishery Management Councils § 600.130 Protection of confidentiality of statistics. Each Council must establish appropriate procedures for ensuring the confidentiality of the statistics that may be submitted to...

  12. 50 CFR 600.130 - Protection of confidentiality of statistics.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... statistics. 600.130 Section 600.130 Wildlife and Fisheries FISHERY CONSERVATION AND MANAGEMENT, NATIONAL... Fishery Management Councils § 600.130 Protection of confidentiality of statistics. Each Council must establish appropriate procedures for ensuring the confidentiality of the statistics that may be submitted to...

  13. Probability of identification: a statistical model for the validation of qualitative botanical identification methods.

    PubMed

    LaBudde, Robert A; Harnly, James M

    2012-01-01

    A qualitative botanical identification method (BIM) is an analytical procedure that returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer, or regulator to determine whether a botanical material being tested is the same as the target (desired) material, or whether it contains excessive nontarget (undesirable) material. This report describes the development of validation studies for a BIM based on the proportion of replicates identified, or probability of identification (POI), as the basic observed statistic. The statistical procedures proposed for data analysis follow closely those for the probability of detection, and harmonize the statistical concepts and parameters between quantitative and qualitative method validation. Use of POI statistics also harmonizes statistical concepts for botanical, microbiological, toxin, and other analyte identification methods that produce binary results. The POI statistical model provides a tool for graphical representation of response curves for qualitative methods, reporting of descriptive statistics, and application of performance requirements. Single-collaborator and multicollaborative study examples are given.
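The basic observed statistic, POI, is a binomial proportion. A minimal sketch that adds a Wilson score interval; the counts are hypothetical, and the choice of interval is an assumption for illustration, not taken from the report:

```python
import math

def poi(identified, replicates, z=1.96):
    """Probability of identification with a 95% Wilson score interval,
    which stays inside (0, 1) even when POI is near the boundaries."""
    p = identified / replicates
    denom = 1 + z ** 2 / replicates
    center = (p + z ** 2 / (2 * replicates)) / denom
    half = z * math.sqrt(p * (1 - p) / replicates
                         + z ** 2 / (4 * replicates ** 2)) / denom
    return p, center - half, center + half

# Hypothetical: target material identified in 58 of 60 replicates
p, lo, hi = poi(58, 60)
print(f"POI = {p:.3f}, 95% CI ({lo:.3f}, {hi:.3f})")
```

Plotting POI against analyte concentration or nontarget fraction gives the response curves the model uses for performance requirements.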

  14. Prospective Cohort Study Investigating Changes in Body Image, Quality of Life, and Self-Esteem Following Minimally Invasive Cosmetic Procedures.

    PubMed

    Sobanko, Joseph F; Dai, Julia; Gelfand, Joel M; Sarwer, David B; Percec, Ivona

    2018-04-13

    Minimally invasive cosmetic injectable procedures are increasingly common. However, few studies have investigated changes in psychosocial functioning following these treatments. To assess changes in body image, quality of life, and self-esteem following cosmetic injectable treatment with soft tissue fillers and neuromodulators. Open, prospective study of 75 patients undergoing cosmetic injectable procedures for facial aging to evaluate changes in psychosocial functioning within 6 weeks of treatment. Outcome measures included the Derriford Appearance Scale (DAS-24), the Body Image Quality of Life Inventory (BIQLI), and the Rosenberg Self-Esteem Scale. Body image dissatisfaction, as assessed by the DAS-24, improved significantly 6 weeks after treatment. Body image quality of life, as assessed by the BIQLI, improved, but the change did not reach statistical significance. Self-esteem was unchanged after treatment. Minimally invasive cosmetic injectable procedures were associated with reductions in body image dissatisfaction. Future research, using recently developed cosmetic surgery-specific instruments, may provide further insight into the psychosocial benefits of minimally invasive procedures.

  15. A Web Terminology Server Using UMLS for the Description of Medical Procedures

    PubMed Central

    Burgun, Anita; Denier, Patrick; Bodenreider, Olivier; Botti, Geneviève; Delamarre, Denis; Pouliquen, Bruno; Oberlin, Philippe; Lévéque, Jean M.; Lukacs, Bertrand; Kohler, François; Fieschi, Marius; Le Beux, Pierre

    1997-01-01

    The Model for Assistance in the Orientation of a User within Coding Systems (MAOUSSC) project has been designed to provide a representation for medical and surgical procedures that allows several applications to be developed from several viewpoints. It is based on a conceptual model, a controlled set of terms, and Web server development. The design includes the UMLS knowledge sources associated with additional knowledge about medico-surgical procedures. The model was implemented using a relational database. The authors developed a complete interface for the Web presentation, with the intermediary layer being written in Perl. The server has been used for the representation of medico-surgical procedures that occur in the discharge summaries of the national survey of hospital activities that is performed by the French Health Statistics Agency in order to produce inpatient profiles. The authors describe the current status of the MAOUSSC server and discuss their interest in using such a server to assist in the coordination of terminology tasks and in the sharing of controlled terminologies. PMID:9292841

  16. An Evaluation of the Effects of the Transobturator Tape Procedure on Sexual Satisfaction in Women with Stress Urinary Incontinence Using the Libido Scoring System

    PubMed Central

    Narin, Raziye; Nazik, Hakan; Narin, Mehmet Ali; Aytan, Hakan; Api, Murat

    2013-01-01

    Introduction and Hypothesis. Most women experience involuntary urine leakage at some point in their lifetimes. Stress urinary incontinence (SUI) is the most common type in women. Suburethral slings have become a standard surgical procedure for the treatment of SUI when conservative therapy has failed. Treating SUI with a suburethral sling may improve body image by reducing urinary leakage and may thereby improve sexual satisfaction. Methods. A total of 59 sexually active patients were included in the study and underwent a transobturator tape (TOT) outside-in procedure. The Libido Scoring System (LSS) was administered to all patients as a self-completed questionnaire preoperatively and 6 months after the operation. General satisfaction with the operation was measured by visual analogue score (VAS). Pre- and postoperative scores were recorded and analyzed using SPSS 11.5. Results. Two parameters of the LSS, orgasm and initiation of sexual activity, showed statistically significant increases. Conclusion. Sexual satisfaction and desire partially improved after the TOT procedure. PMID:24288621

  17. Rapid detection of foodborne microorganisms on food surface using Fourier transform Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Yang, Hong; Irudayaraj, Joseph

    2003-02-01

    Fourier transform (FT) Raman spectroscopy was used for non-destructive characterization and differentiation of six different microorganisms, including the pathogen Escherichia coli O157:H7, on whole apples. The Mahalanobis distance metric was used to evaluate and quantify the statistical differences between the spectra of the six microorganisms. The same procedure was extended to discriminate six different strains of E. coli. The FT-Raman procedure not only successfully discriminated the different E. coli strains but also accurately differentiated the pathogen from non-pathogens. The results demonstrate that FT-Raman spectroscopy can be an excellent tool for rapid examination of food surfaces for microbial contamination and for the classification of microbial cultures.
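The Mahalanobis metric scales distance by the covariance of the reference spectra, so a new spectrum can be assigned to the class whose training cluster it lies closest to. A toy sketch with two-dimensional features (standing in for two Raman band intensities; a real application would use many spectral bands and a proper matrix inverse):

```python
def mean(vectors):
    """Componentwise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def cov2x2(vectors):
    """Sample covariance matrix of 2-D feature vectors."""
    m = mean(vectors)
    n = len(vectors)
    sxx = sum((v[0] - m[0]) ** 2 for v in vectors) / (n - 1)
    syy = sum((v[1] - m[1]) ** 2 for v in vectors) / (n - 1)
    sxy = sum((v[0] - m[0]) * (v[1] - m[1]) for v in vectors) / (n - 1)
    return [[sxx, sxy], [sxy, syy]]

def mahalanobis2(x, m, S):
    """Squared Mahalanobis distance of point x from mean m, using the
    explicit inverse of the 2x2 covariance S."""
    det = S[0][0] * S[1][1] - S[0][1] ** 2
    inv = [[S[1][1] / det, -S[0][1] / det],
           [-S[0][1] / det, S[0][0] / det]]
    d = [x[0] - m[0], x[1] - m[1]]
    return (d[0] * (inv[0][0] * d[0] + inv[0][1] * d[1])
            + d[1] * (inv[1][0] * d[0] + inv[1][1] * d[1]))

ref = [(1.0, 1.0), (2.0, 1.0), (1.0, 2.0), (2.0, 2.0)]  # hypothetical class spectra
m, S = mean(ref), cov2x2(ref)
print(mahalanobis2((2.5, 1.5), m, S))  # ≈ 3.0: distance of a new spectrum
```

Classification then picks, for each unknown spectrum, the reference class with the smallest distance.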

  18. The risk of shorter fasting time for pediatric deep sedation.

    PubMed

    Clark, Mathew; Birisci, Esma; Anderson, Jordan E; Anliker, Christina M; Bryant, Micheal A; Downs, Craig; Dalabih, Abdallah

    2016-01-01

    Current guidelines adopted by the American Academy of Pediatrics call for prolonged fasting times before performing pediatric procedural sedation and analgesia (PSA). PSA is increasingly provided to children outside of the operating theater by sedation-trained pediatric providers and does not require airway manipulation. We investigated the safety of a shorter fasting time compared to a longer, guideline-compliant fasting time, and tried to identify the association between fasting time and sedation-related complications. This is a prospective observational study that included children 2 months to 18 years of age who had an American Society of Anesthesiologists physical status classification of I or II and underwent deep sedation for elective procedures performed by pediatric critical care providers. Procedures included radiologic imaging studies, electroencephalograms, auditory brainstem response testing, echocardiograms, Botox injections, and other minor surgical procedures. Subjects were divided into two groups depending on the length of their fasting time (4-6 h and >6 h). Complication rates were calculated and compared between the two groups. In the studied group of 2487 subjects, 1007 (40.5%) had a fasting time of 4-6 h and the remaining 1480 (59.5%) subjects had fasted for >6 h. There were no statistically significant differences in any of the studied complications between the two groups. This study found no difference in complication rate with regard to fasting time in our subject cohort, which included only healthy children receiving elective procedures performed by sedation-trained pediatric critical care providers. This suggests that a shorter fasting time may be safe for procedures performed outside of the operating theater that do not involve high-risk patients or airway manipulation.

  19. A direct-gradient multivariate index of biotic condition

    USGS Publications Warehouse

    Miranda, Leandro E.; Aycock, J.N.; Killgore, K. J.

    2012-01-01

    Multimetric indexes constructed by summing metric scores have been criticized despite many of their merits. A leading criticism is the potential for investigator bias involved in metric selection and scoring. Often there is a large number of competing metrics equally well correlated with environmental stressors, requiring a judgment call by the investigator to select the most suitable metrics to include in the index and how to score them. Data-driven procedures for multimetric index formulation published during the last decade have reduced this limitation, yet apprehension remains. Multivariate approaches that select metrics with statistical algorithms may reduce the level of investigator bias and alleviate a weakness of multimetric indexes. We investigated the suitability of a direct-gradient multivariate procedure to derive an index of biotic condition for fish assemblages in oxbow lakes in the Lower Mississippi Alluvial Valley. Although this multivariate procedure also requires that the investigator identify a set of suitable metrics potentially associated with a set of environmental stressors, it is different from multimetric procedures because it limits investigator judgment in selecting a subset of biotic metrics to include in the index and because it produces metric weights suitable for computation of index scores. The procedure, applied to a sample of 35 competing biotic metrics measured at 50 oxbow lakes distributed over a wide geographical region in the Lower Mississippi Alluvial Valley, selected 11 metrics that adequately indexed the biotic condition of five test lakes. Because the multivariate index includes only metrics that explain the maximum variability in the stressor variables rather than a balanced set of metrics chosen to reflect various fish assemblage attributes, it is fundamentally different from multimetric indexes of biotic integrity with advantages and disadvantages. As such, it provides an alternative to multimetric procedures.

  20. Laparoscopic colorectal surgery in obese and nonobese patients: do differences in body mass indices lead to different outcomes?

    PubMed

    Schwandner, O; Farke, S; Schiedeck, T H K; Bruch, H-P

    2004-10-01

    The aim of this prospective study was to compare the outcome of laparoscopic colorectal surgery in obese and nonobese patients. All patients who underwent laparoscopic surgery for both benign and malignant disease within the past 5 years were entered into the prospective database registry. Body mass index (BMI; kg/m(2)) was used as the objective measure to indicate morbid obesity. Patients with a BMI >30 were defined as obese, and patients with a BMI <30 were defined as nonobese. The parameters analyzed included age, gender, comorbid conditions, diagnosis, procedure, duration of surgery, transfusion requirements, conversion rate, overall morbidity rate including major complications (requiring reoperation), minor complications (conservative treatment) and late-onset complications (postdischarge), stay in the intensive care unit, hospitalization, and mortality. For objective evaluation, only laparoscopically completed procedures were analyzed. Statistics included Student's t test and chi-square analysis. Statistical significance was assessed at the 5% level (p < 0.05 statistically significant). A total of 589 patients were evaluated, including 95 patients in the obese group and 494 patients in the nonobese group. There was no significant difference in conversion rate (9.5% in the obese group vs 7.3% in the nonobese group, p > 0.05), so that the laparoscopic completion rate was 90.5% (n = 86) in the obese and 92.7% (n = 458) in the nonobese group. The rate of females was significantly lower among obese patients (55.8% in the obese group vs 74.2% in the nonobese group, p = 0.001). No significant differences were observed with respect to age, diagnosis, procedure, duration of surgery, and transfusion requirements (p > 0.05). In terms of morbidity, there were no significant differences in overall complication rates with respect to BMI (23.3% in the obese group vs 24.5% in the nonobese group, p > 0.05).
Major complications were more common in the obese group without reaching statistical significance (12.8% in the obese group vs 6.6% in the nonobese group, p = 0.078). Conversely, minor complications were more frequently documented in the nonobese group (8.1% in the obese group vs 15.5% in the nonobese group, p = 0.080). In the postoperative course, no differences were documented in terms of return of bowel function, duration of analgesics required, oral feeding, and length of hospitalization (p > 0.05). These data indicate that laparoscopic colorectal surgery is feasible and effective in both obese and nonobese patients. Obese patients, who are thought to be at increased risk of postoperative morbidity, derive a similar benefit from laparoscopic surgery as nonobese patients with colorectal disease.
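The group comparisons above rest on Pearson's chi-square test for 2x2 tables. As a sketch, the conversion-rate comparison can be approximated from counts back-calculated out of the reported group sizes (these counts are hypothetical reconstructions, not the paper's raw data):

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic (df = 1) for a 2x2 table
    [[a, b], [c, d]] via the shortcut formula n(ad - bc)^2 / row-col products."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# hypothetical counts back-calculated from the reported group sizes:
# obese: 9 conversions / 86 completions; nonobese: 36 / 458
stat = chi2_2x2(9, 86, 36, 458)
significant = stat > 3.841  # chi-square critical value, df = 1, alpha = 0.05
print(round(stat, 3), significant)  # ≈ 0.54, False: conversion rates comparable
```

The non-significant result matches the paper's reported p > 0.05 for conversion rate.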

  1. 48 CFR 215.404-76 - Reporting profit and fee statistics.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... statistics. 215.404-76 Section 215.404-76 Federal Acquisition Regulations System DEFENSE ACQUISITION... Contract Pricing 215.404-76 Reporting profit and fee statistics. Follow the procedures at PGI 215.404-76 for reporting profit and fee statistics. [71 FR 69494, Dec. 1, 2006] ...

  2. 40 CFR 1065.602 - Statistics.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Statistics. 1065.602 Section 1065.602... PROCEDURES Calculations and Data Requirements § 1065.602 Statistics. (a) Overview. This section contains equations and example calculations for statistics that are specified in this part. In this section we use...

  3. 48 CFR 215.404-76 - Reporting profit and fee statistics.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... statistics. 215.404-76 Section 215.404-76 Federal Acquisition Regulations System DEFENSE ACQUISITION... Contract Pricing 215.404-76 Reporting profit and fee statistics. Follow the procedures at PGI 215.404-76 for reporting profit and fee statistics. [71 FR 69494, Dec. 1, 2006] ...

  4. A Total Quality-Control Plan with Right-Sized Statistical Quality-Control.

    PubMed

    Westgard, James O

    2017-03-01

    A new Clinical Laboratory Improvement Amendments option for risk-based quality-control (QC) plans became effective in January 2016. Called an Individualized QC Plan, this option requires the laboratory to perform a risk assessment, develop a QC plan, and implement a QC program to monitor ongoing performance of the QC plan. Difficulties in performing a risk assessment may limit the validity of an Individualized QC Plan. A better alternative is to develop a Total QC Plan including a right-sized statistical QC procedure to detect medically important errors. Westgard Sigma Rules provide a simple way to select the right control rules and the right number of control measurements. Copyright © 2016 Elsevier Inc. All rights reserved.
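The "right-sizing" idea starts from the sigma-metric of the assay, sigma = (TEa − |bias|) / CV (all in percent), which is then mapped to a choice of control rules and number of control measurements. A sketch (the rule map below is an illustrative paraphrase of the higher-sigma-means-simpler-QC idea, not the official Westgard Sigma Rules table):

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma-metric of an assay: (allowable total error - |bias|) / imprecision,
    with all three quantities expressed as percentages."""
    return (tea_pct - abs(bias_pct)) / cv_pct

def qc_recommendation(sigma):
    """Illustrative mapping from sigma to QC intensity: higher sigma allows
    simpler rules and fewer control measurements (hypothetical wording)."""
    if sigma >= 6:
        return "single rule 1:3s, N = 2"
    if sigma >= 5:
        return "1:3s / 2:2s / R:4s, N = 2"
    if sigma >= 4:
        return "1:3s / 2:2s / R:4s / 4:1s, N = 4"
    return "full multirule with N >= 6; consider method improvement"

s = sigma_metric(tea_pct=10.0, bias_pct=1.0, cv_pct=1.5)
print(s, "->", qc_recommendation(s))  # 6.0 -> single rule 1:3s, N = 2
```

An assay at six sigma rarely produces medically important errors, so minimal statistical QC suffices.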

  5. Statistical Signal Models and Algorithms for Image Analysis

    DTIC Science & Technology

    1984-10-25

    In this report, two-dimensional stochastic linear models are used in developing algorithms for image analysis such as classification, segmentation, and object detection in images characterized by textured backgrounds. These models generate two-dimensional random processes as outputs to which statistical inference procedures can naturally be applied. A common thread throughout our algorithms is the interpretation of the inference procedures in terms of linear prediction.

  6. Comparison of the visual results after SMILE and femtosecond laser-assisted LASIK for myopia.

    PubMed

    Lin, Fangyu; Xu, Yesheng; Yang, Yabo

    2014-04-01

    To perform a comparative clinical analysis of the safety, efficacy, and predictability of two surgical procedures (ie, small incision lenticule extraction [SMILE] and femtosecond laser-assisted LASIK [FS-LASIK]) to correct myopia. Sixty eyes of 31 patients with a mean spherical equivalent of -5.13 ± 1.75 diopters underwent myopia correction with the SMILE procedure. Fifty-one eyes of 27 patients with a mean spherical equivalent of -5.58 ± 2.41 diopters were treated with the FS-LASIK procedure. Postoperative uncorrected and corrected distance visual acuity, manifest refraction, and higher-order aberrations were analyzed statistically at 1 and 3 months postoperatively. No statistically significant differences were found at 1 and 3 months in parameters that included the percentage of eyes with an uncorrected distance visual acuity of 20/20 or better (P = .556, .920) and mean spherical equivalent refraction (P = .055, .335). At 1 month, 4 SMILE-treated eyes and 1 FS-LASIK-treated eye lost one or more lines of visual acuity (P = .214, chi-square test). At 3 months, 2 SMILE-treated eyes lost one or more lines of visual acuity, whereas all FS-LASIK-treated eyes had unchanged or improved corrected distance visual acuity. Higher-order aberrations and spherical aberration were significantly lower in the SMILE group than in the FS-LASIK group at 1 (P = .007, .000) and 3 (P = .006, .000) months of follow-up. SMILE and FS-LASIK are safe, effective, and predictable surgical procedures to treat myopia. SMILE has a lower induction rate of higher-order aberrations and spherical aberration than the FS-LASIK procedure. Copyright 2014, SLACK Incorporated.

  7. Temperature-Controlled Delivery of Radiofrequency Energy in Fecal Incontinence: A Randomized Sham-Controlled Clinical Trial.

    PubMed

    Visscher, Arjan P; Lam, Tze J; Meurs-Szojda, Maria M; Felt-Bersma, Richelle J F

    2017-08-01

    Controlled delivery of radiofrequency energy has been suggested as treatment for fecal incontinence. The aim of this study was to determine whether the clinical response to the radiofrequency energy procedure is superior to sham in patients with fecal incontinence. This was a randomized sham-controlled clinical trial from 2008 to 2015. This study was conducted in an outpatient clinic. Forty patients with fecal incontinence in whom maximal conservative management had failed were randomly assigned to receiving either radiofrequency energy or sham procedure. Fecal incontinence was measured using the Vaizey incontinence score (range, 0-24). The impact of fecal incontinence on quality of life was measured by using the fecal incontinence quality-of-life score (range, 1-4). Measurements were performed at baseline and at 6 months. Anorectal function was evaluated using anal manometry and anorectal endosonography at baseline and at 3 months. At baseline, Vaizey incontinence score was 16.8 (SD 2.9). At t = 6 months, the radiofrequency energy group improved by 2.5 points on the Vaizey incontinence score compared with the sham group (13.2 (SD 3.1), 15.6 (SD 3.3), p = 0.02). The fecal incontinence quality-of-life score at t = 6 months was not statistically different. Anorectal function did not show any alteration. Patients with severe fecal incontinence were included in the study, thus making it difficult to generalize the results. Both radiofrequency energy and sham procedure improved the fecal incontinence score, the radiofrequency energy procedure more than sham. Although statistically significant, the clinical impact for most of the patients was negligible. Therefore, the radiofrequency energy procedure should not be recommended for patients with fecal incontinence until patient-related factors associated with treatment success are known. See Video Abstract at http://links.lww.com/DCR/A373.

  8. Use of repeat anterior maxillary distraction to correct residual midface hypoplasia in cleft patients.

    PubMed

    Richardson, Sunil; Krishna, Shreya; Bansal, Avi

    2017-12-01

    The study was designed to evaluate the efficacy of performing a second, repeat anterior maxillary distraction (AMD) to treat residual cleft maxillary hypoplasia. Five patients between the ages of 12 and 15 years with a history of AMD and with residual cleft maxillary hypoplasia were included in the study. Inclusion was irrespective of gender, type of cleft lip and palate, and the amount of advancement needed. Repeat AMD was executed in these patients 4 to 5 years after the primary AMD procedure to correct the cleft maxillary hypoplasia that had developed since the initial procedure. Orthopantomograms (OPG) and lateral cephalograms were taken for evaluation preoperatively, immediately after distraction, after consolidation, and one year postoperatively. The data obtained were tabulated, and a Mann-Whitney U-test was used for statistical comparisons. At the time of presentation, residual maxillary hypoplasia was observed with a well-maintained distraction gap on the OPG, which ruled out the occurrence of a relapse. Favorable movement of the segments without any resistance was seen in all patients. Mean maxillary advancement of 10.56 mm was achieved at repeat AMD. Statistically significant increases in midfacial length, SNA angle, and nasion perpendicular to point A distance were achieved (P = 0.012, P = 0.011, and P = 0.012, respectively). A good profile was achieved for all patients. Minimal transient complications, such as anterior open bite and bleeding episodes, were managed. Addressing the problem of cleft maxillary hypoplasia at an early age (12-15 years) is beneficial for the child. Residual hypoplasia may develop in some patients, which may require additional corrective procedures. The results of our study show that AMD can be repeated when residual deformity develops, with the previous procedure having no negative impact on the results of the repeat procedure.

  9. Statistical Analyses of Raw Material Data for MTM45-1/CF7442A-36% RW: CMH Cure Cycle

    NASA Technical Reports Server (NTRS)

    Coroneos, Rula; Pai, Shantaram, S.; Murthy, Pappu

    2013-01-01

    This report describes statistical characterization of physical properties of the composite material system MTM45-1/CF7442A, which has been tested and is currently being considered for use on spacecraft structures. This composite system is made of 6K plain weave graphite fibers in a highly toughened resin system. This report summarizes the distribution types and statistical details of the tests and the conditions for the experimental data generated. These distributions will be used in multivariate regression analyses to help determine material and design allowables for similar material systems and to establish a procedure for other material systems. Additionally, these distributions will be used in future probabilistic analyses of spacecraft structures. The specific properties that are characterized are the ultimate strength, modulus, and Poisson's ratio by using a commercially available statistical package. Results are displayed using graphical and semigraphical methods and are included in the accompanying appendixes.

  10. [Gender-sensitive epidemiological data analysis: methodological aspects and empirical outcomes. Illustrated by a health reporting example].

    PubMed

    Jahn, I; Foraita, R

    2008-01-01

    In Germany, gender-sensitive approaches are part of the guidelines for good epidemiological practice as well as of health reporting, and they are increasingly demanded in order to implement the gender mainstreaming strategy in research funding by the federation and the federal states. This paper focuses on methodological aspects of data analysis; the health report of Bremen, a population-based cross-sectional study, serves as the empirical example. Health reporting requires analysis and reporting methods that, on the one hand, can uncover the sex/gender dimension of a question and, on the other hand, take into account how results can be communicated adequately. The core question is: what consequences does the way the category sex/gender is included in different statistical analyses have for the identification of potential target groups? As evaluation methods, logistic regression and a two-stage procedure were applied exploratively; the latter combines graphical models with CHAID decision trees and allows complex results to be visualized. Both methods were run stratified by sex/gender as well as adjusted for sex/gender, and the results were compared. Without recourse to prior knowledge, only the stratified analyses were able to detect differences between the sexes and within the sex/gender groups; adjusted analyses can detect sex/gender differences only if interaction terms are included in the model. The results are discussed from a statistical-epidemiological perspective as well as in the context of health reporting. In conclusion, whether a statistical method is gender-sensitive can only be judged for a concrete research question under known conditions. Often an appropriate statistical procedure can be chosen after conducting separate analyses for women and men. Future gender studies require innovative study designs as well as conceptual clarity regarding the biological and sociocultural components of the category sex/gender.

  11. A Powerful Procedure for Pathway-Based Meta-analysis Using Summary Statistics Identifies 43 Pathways Associated with Type II Diabetes in European Populations.

    PubMed

    Zhang, Han; Wheeler, William; Hyland, Paula L; Yang, Yifan; Shi, Jianxin; Chatterjee, Nilanjan; Yu, Kai

    2016-06-01

    Meta-analysis of multiple genome-wide association studies (GWAS) has become an effective approach for detecting single nucleotide polymorphism (SNP) associations with complex traits. However, it is difficult to integrate the readily accessible SNP-level summary statistics from a meta-analysis into more powerful multi-marker testing procedures, which generally require individual-level genetic data. We developed a general procedure called Summary based Adaptive Rank Truncated Product (sARTP) for conducting gene and pathway meta-analysis that uses only SNP-level summary statistics in combination with genotype correlation estimated from a panel of individual-level genetic data. We demonstrated the validity and power advantage of sARTP through empirical and simulated data. We conducted a comprehensive pathway-based meta-analysis with sARTP on type 2 diabetes (T2D) by integrating SNP-level summary statistics from two large studies consisting of 19,809 T2D cases and 111,181 controls with European ancestry. Among 4,713 candidate pathways from which genes in the neighborhoods of 170 GWAS-established T2D loci were excluded, we detected 43 globally significant T2D pathways (with Bonferroni-corrected p-values < 0.05), which included the insulin signaling pathway and the T2D pathway defined by KEGG, as well as pathways defined according to specific gene expression patterns in pancreatic adenocarcinoma, hepatocellular carcinoma, and bladder carcinoma. Using summary data from 8 East Asian T2D GWAS with 6,952 cases and 11,865 controls, we showed that 7 of the 43 pathways identified in European populations remained significant in East Asians at a false discovery rate of 0.1. We created an R package and a web-based tool for sARTP with the capability to analyze pathways with thousands of genes and tens of thousands of SNPs.
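The core of ARTP-style testing is the rank truncated product: combine the k smallest SNP p-values in a gene or pathway and calibrate the result by resampling. A stripped-down sketch under an independent uniform null (sARTP's actual null is built from the summary statistics plus a genotype-correlation reference panel, and it additionally adapts over several truncation points k):

```python
import math
import random

def rtp_stat(pvals, k):
    """Rank truncated product: -log of the product of the k smallest p-values."""
    return -sum(math.log(p) for p in sorted(pvals)[:k])

def rtp_pvalue(pvals, k, n_perm=5000, seed=1):
    """Permutation p-value for the RTP statistic under an independent uniform
    null (a toy stand-in for sARTP's correlation-aware resampling null)."""
    rng = random.Random(seed)
    obs = rtp_stat(pvals, k)
    hits = sum(
        rtp_stat([1.0 - rng.random() for _ in pvals], k) >= obs
        for _ in range(n_perm)
    )
    return (hits + 1) / (n_perm + 1)

# hypothetical pathway with three strongly associated SNPs
signal = [1e-6, 1e-5, 1e-4, 0.2, 0.3, 0.5, 0.6, 0.7, 0.8, 0.9]
print(rtp_pvalue(signal, k=3))  # small: the three top SNPs drive the pathway
```

The adaptive step then takes the minimum p-value over a grid of k values and corrects for that second minimization with another resampling layer.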

  12. A Powerful Procedure for Pathway-Based Meta-analysis Using Summary Statistics Identifies 43 Pathways Associated with Type II Diabetes in European Populations

    PubMed Central

    Zhang, Han; Wheeler, William; Hyland, Paula L.; Yang, Yifan; Shi, Jianxin; Chatterjee, Nilanjan; Yu, Kai

    2016-01-01

    Meta-analysis of multiple genome-wide association studies (GWAS) has become an effective approach for detecting single nucleotide polymorphism (SNP) associations with complex traits. However, it is difficult to integrate the readily accessible SNP-level summary statistics from a meta-analysis into more powerful multi-marker testing procedures, which generally require individual-level genetic data. We developed a general procedure called Summary based Adaptive Rank Truncated Product (sARTP) for conducting gene and pathway meta-analysis that uses only SNP-level summary statistics in combination with genotype correlation estimated from a panel of individual-level genetic data. We demonstrated the validity and power advantage of sARTP through empirical and simulated data. We conducted a comprehensive pathway-based meta-analysis with sARTP on type 2 diabetes (T2D) by integrating SNP-level summary statistics from two large studies consisting of 19,809 T2D cases and 111,181 controls with European ancestry. Among 4,713 candidate pathways from which genes in the neighborhoods of 170 GWAS-established T2D loci were excluded, we detected 43 globally significant T2D pathways (with Bonferroni-corrected p-values < 0.05), which included the insulin signaling pathway and the T2D pathway defined by KEGG, as well as pathways defined according to specific gene expression patterns in pancreatic adenocarcinoma, hepatocellular carcinoma, and bladder carcinoma. Using summary data from 8 East Asian T2D GWAS with 6,952 cases and 11,865 controls, we showed that 7 of the 43 pathways identified in European populations remained significant in East Asians at a false discovery rate of 0.1. We created an R package and a web-based tool for sARTP with the capability to analyze pathways with thousands of genes and tens of thousands of SNPs. PMID:27362418

  13. Regularized learning of linear ordered-statistic constant false alarm rate filters (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Havens, Timothy C.; Cummings, Ian; Botts, Jonathan; Summers, Jason E.

    2017-05-01

    The linear ordered statistic (LOS) is a parameterized ordered statistic (OS) that is a weighted average of a rank-ordered sample. LOS operators are useful generalizations of aggregation, as they can represent any linear aggregation, from minimum to maximum, including conventional aggregations such as mean and median. In the fuzzy logic field, these aggregations are called ordered weighted averages (OWAs). Here, we present a method for learning LOS operators from training data, that is, data for which the output of the desired LOS is known. We then extend the learning process with regularization, such that a lower-complexity or sparse LOS can be learned. We discuss what 'lower complexity' means in this context and how to represent it in the optimization procedure. Finally, we apply our learning methods to the well-known constant-false-alarm-rate (CFAR) detection problem, specifically for the case of background levels modeled by long-tailed distributions, such as the K-distribution. These backgrounds arise in several pertinent imaging problems, including the modeling of clutter in synthetic aperture radar and sonar (SAR and SAS) and in wireless communications.
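A linear ordered statistic is simply a dot product between a weight vector and the descending-sorted sample, which is why it subsumes minimum, maximum, mean, and median as special cases. A minimal sketch (the OS-CFAR connection: the detection threshold is such an aggregate of background cells times a scale factor; the values below are hypothetical):

```python
def los(sample, weights):
    """Linear ordered statistic / OWA: weighted average of the sample sorted
    in descending order; weights must sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    ordered = sorted(sample, reverse=True)
    return sum(w * x for w, x in zip(weights, ordered))

x = [3.0, 9.0, 1.0, 5.0]
print(los(x, [1, 0, 0, 0]))      # 9.0 -> maximum
print(los(x, [0, 0, 0, 1]))      # 1.0 -> minimum
print(los(x, [0.25] * 4))        # 4.5 -> mean
print(los(x, [0, 0.5, 0.5, 0]))  # 4.0 -> median (average of middle two)

# OS-CFAR-style threshold sketch: scale an order statistic of background cells
background = [0.8, 1.2, 0.9, 1.1]
threshold = 4.0 * los(background, [0, 1, 0, 0])  # 2nd largest, hypothetical alpha
```

Learning an LOS then means fitting the weight vector to input/output pairs, with regularization pushing many weights toward zero.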

  14. Permutation tests for goodness-of-fit testing of mathematical models to experimental data.

    PubMed

    Fişek, M Hamit; Barlas, Zeynep

    2013-03-01

    This paper presents statistical procedures for improving the goodness-of-fit testing of theoretical models to data obtained from laboratory experiments. We use an experimental study in the expectation states research tradition, carried out in the "standardized experimental situation" associated with the program, to illustrate the application of our procedures. We briefly review the expectation states research program and the fundamentals of resampling statistics as we develop our procedures in the resampling context. The first procedure is a modification of the chi-square test, which has been the primary statistical tool for assessing goodness of fit in the expectation states research program but has problems associated with its use. We discuss these problems and suggest a procedure to overcome them. The second procedure, the "Average Absolute Deviation" test, is a new test proposed as a simpler and more informative alternative to the chi-square test. The third and fourth procedures are permutation versions of Jonckheere's test for ordered alternatives and Kendall's tau-b, a rank-order correlation coefficient. The fifth procedure is a new rank-order goodness-of-fit test, which we call the "Deviation from Ideal Ranking" index, and which we believe may be more useful than other rank-order tests for assessing the goodness of fit of models to experimental data. The application of these procedures to the sample data is illustrated in detail. We then present another laboratory study from a different experimental paradigm, the "network exchange" paradigm, and describe how our procedures may be applied to this data set. Copyright © 2012 Elsevier Inc. All rights reserved.
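The "Average Absolute Deviation" idea can be sketched as a Monte Carlo goodness-of-fit test: simulate datasets from the model's predicted probabilities and count how often simulated data deviate from the model at least as much as the real data did. This is a sketch of the resampling logic under a simple multinomial model, not the paper's exact procedure:

```python
import random

def aad(observed, predicted):
    """Average absolute deviation between observed and predicted proportions."""
    return sum(abs(o - p) for o, p in zip(observed, predicted)) / len(observed)

def aad_test(counts, probs, n_resamples=2000, seed=7):
    """Monte Carlo goodness of fit: resample datasets of the same size from the
    model probabilities and estimate P(simulated AAD >= observed AAD)."""
    rng = random.Random(seed)
    n = sum(counts)
    obs = aad([c / n for c in counts], probs)
    hits = 0
    for _ in range(n_resamples):
        draws = rng.choices(range(len(probs)), weights=probs, k=n)
        sim = [draws.count(i) / n for i in range(len(probs))]
        if aad(sim, probs) >= obs:
            hits += 1
    return obs, (hits + 1) / (n_resamples + 1)

# hypothetical two-category model predicting equal proportions
obs, p = aad_test([52, 48], [0.5, 0.5])
print(round(obs, 3), p > 0.05)  # 0.02 True: the model fits
```

A small p-value here means the data deviate from the model more than resampling under the model can explain.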

  15. System engineering toolbox for design-oriented engineers

    NASA Technical Reports Server (NTRS)

    Goldberg, B. E.; Everhart, K.; Stevens, R.; Babbitt, N., III; Clemens, P.; Stout, L.

    1994-01-01

    This system engineering toolbox is designed to provide tools and methodologies to the design-oriented systems engineer. A tool is defined as a set of procedures to accomplish a specific function. A methodology is defined as a collection of tools, rules, and postulates to accomplish a purpose. For each concept addressed in the toolbox, the following information is provided: (1) description, (2) application, (3) procedures, (4) examples, if practical, (5) advantages, (6) limitations, and (7) bibliography and/or references. The scope of the document includes concept development tools, system safety and reliability tools, design-related analytical tools, graphical data interpretation tools, a brief description of common statistical tools and methodologies, so-called total quality management tools, and trend analysis tools. Both relationship to project phase and primary functional usage of the tools are also delineated. The toolbox also includes a case study for illustrative purposes. Fifty-five tools are delineated in the text.

  16. 45 CFR 160.536 - Statistical sampling.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false Statistical sampling. 160.536 Section 160.536... REQUIREMENTS GENERAL ADMINISTRATIVE REQUIREMENTS Procedures for Hearings § 160.536 Statistical sampling. (a) In... statistical sampling study as evidence of the number of violations under § 160.406 of this part, or the...

  17. 45 CFR 160.536 - Statistical sampling.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 1 2011-10-01 2011-10-01 false Statistical sampling. 160.536 Section 160.536... REQUIREMENTS GENERAL ADMINISTRATIVE REQUIREMENTS Procedures for Hearings § 160.536 Statistical sampling. (a) In... statistical sampling study as evidence of the number of violations under § 160.406 of this part, or the...

  18. Postoperative Early Major and Minor Complications in Laparoscopic Vertical Sleeve Gastrectomy (LVSG) Versus Laparoscopic Roux-en-Y Gastric Bypass (LRYGB) Procedures: A Meta-Analysis and Systematic Review.

    PubMed

    Osland, Emma; Yunus, Rossita Mohamad; Khan, Shahjahan; Alodat, Tareq; Memon, Breda; Memon, Muhammed Ashraf

    2016-10-01

    Laparoscopic Roux-en-Y gastric bypass (LRYGB) and laparoscopic vertical sleeve gastrectomy (LVSG) have been proposed as cost-effective strategies to manage obesity-related chronic disease. The aim of this meta-analysis and systematic review was to compare the early postoperative complication rate (i.e., within 30 days) reported from randomized controlled trials (RCTs) comparing these two procedures. RCTs comparing the early complication rates following LVSG and LRYGB between 2000 and 2015 were selected from PubMed, Medline, Embase, Science Citation Index, Current Contents, and the Cochrane database. The outcome variables analyzed included 30-day mortality, major and minor complications and interventions required for their management, length of hospital stay, readmission rates, operating time, and conversions from laparoscopic to open procedures. Six RCTs involving a total of 695 patients (LVSG n = 347, LRYGB n = 348) reported on early major complications. A statistically significant reduction in the relative odds of early major complications favoring the LVSG procedure was noted (p = 0.05). Five RCTs representing 633 patients (LVSG n = 317, LRYGB n = 316) reported early minor complications. A statistically non-significant reduction of 29 % in the relative odds of early minor complications favoring the LVSG procedure was observed (p = 0.4). However, other outcomes directly related to complications, including reoperation rates, readmission rate, and 30-day mortality, showed comparable effect sizes for both surgical procedures. This meta-analysis and systematic review of RCTs suggests that fewer early major and minor complications are associated with LVSG compared with the LRYGB procedure. However, this does not translate into a higher readmission rate, reoperation rate, or 30-day mortality for either procedure.
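
    The abstract reports pooled relative odds across trials without giving the pooling formula. A minimal sketch of one standard fixed-effect approach, the Mantel-Haenszel pooled odds ratio, is shown below; the review's actual model is not stated in the abstract, and the 2x2 tables here are invented for illustration only.

```python
def mantel_haenszel_or(tables):
    """Mantel-Haenszel pooled odds ratio over 2x2 tables, each given as
    ((events_A, no_events_A), (events_B, no_events_B)) for one trial."""
    num = sum(a * d / (a + b + c + d) for (a, b), (c, d) in tables)
    den = sum(b * c / (a + b + c + d) for (a, b), (c, d) in tables)
    return num / den

# Two invented trials, each with a within-trial odds ratio of 2.
pooled = mantel_haenszel_or([((2, 1), (1, 1)), ((4, 2), (2, 2))])
```

    Because each trial contributes its own stratum, the pooled estimate compares arms within trials rather than lumping all patients together, which is what makes it a meta-analytic rather than a naive pooled estimate.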

  19. Left spermatic vein retrograde sclerosis: comparison between sclerosant agent injection through a diagnostic catheter versus through an occluding balloon catheter.

    PubMed

    Basile, Antonio; Failla, Giovanni; La Vignera, Sandro; Condorelli, Rosita Angela; Calogero, Aldo; Vicari, Enzo; Granata, Antonio; Mundo, Elena; Caltabiano, Giuseppe; Pizzarelli, Marco; Messina, Martina; Scavone, Giovanni; Lanzafame, Franz; Iezzi, Roberto; Tsetis, Dimitrios

    2015-05-01

    The aim of this study was to compare the technical success of left spermatic vein (LSV) scleroembolisation achieved by injecting sclerosant through a diagnostic catheter versus through an occluding balloon (OB) catheter, in the treatment of male varicocele. From January 2012 to September 2013, we prospectively enrolled 100 patients with left varicocele and an indication for LSV scleroembolisation related to symptoms or spermiogram anomalies; patients were randomised to two groups (we wrote a list of 100 lines randomly assigned A or B, and each patient was consecutively allocated to group A or B on the basis of this list). Patients in group A underwent injection of the sclerosing agent through an angiographic diagnostic catheter (free catheter technique) and patients in group B through an OB catheter (OB technique). In cases of incomplete occlusion of the LSV, the procedure was completed with coils. Total occlusion of the LSV at post-treatment phlebography during a Valsalva manoeuvre, before any coil embolisation, was considered a technical success. The rate of complications was also evaluated. Fisher's exact test was used for statistical analysis. We evaluated a total of 90 patients, because five patients in each group were excluded from the statistical analysis owing to technical problems or complications. In group A we had a technical success rate of 75.6 % versus 93.4 % in group B, and the difference was statistically significant (P = 0.003); in particular, we had to complete the embolisation with insertion of coils in 11 cases (24.4 %) in group A and in three cases (6.6 %) in group B. In group A, LSV rupture occurred in four cases (8 %), so the procedure was completed by sclerosant injection through the OB located distally to the lesion. These patients were not considered for evaluation. 
In another case, a high flow shunt towards the inferior vena cava was detected, so the patient underwent OB injection to stop the flow to the shunt, and was not included for statistical evaluation. In group B, vein rupture with contrast leakage was noted in six cases (12 %); nonetheless, all the procedures were completed because the OB was positioned distally to the vessel tear, obviating any retrograde leakage of sclerosant. In group B, in five cases (10 %), we were unable to advance the OB through the LSV ostium, so the procedures were completed with the diagnostic catheter and not considered for statistical evaluation. On the basis of our data, the embolisation of the LSV obtained by injecting the sclerosant through an OB rather than through a diagnostic catheter seems to be more effective in achieving total vein embolisation, as well as allowing a controlled injection of sclerosant even in cases of vein rupture.
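
    The exact-test comparison can be illustrated from counts reconstructed out of the reported figures: 45 evaluable patients per group, with success without coils in 34/45 cases in group A and 42/45 in group B. The sketch below computes a one-sided Fisher exact p-value from the hypergeometric distribution using only the standard library; the paper reports P = 0.003, presumably from a different (two-sided) configuration, so treat this as an illustration of the mechanics rather than a re-derivation of that figure.

```python
from math import comb

def fisher_one_sided(a, b, c, d):
    """One-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]:
    the probability, with all margins fixed, of a top-left cell count
    as small as or smaller than a."""
    row1, col1, n = a + b, a + c, a + b + c + d
    lo = max(0, row1 + col1 - n)          # smallest feasible top-left count
    denom = comb(n, row1)
    return sum(comb(col1, x) * comb(n - col1, row1 - x)
               for x in range(lo, a + 1)) / denom

# Group A: 34 successes, 11 completed with coils; group B: 42 and 3.
p = fisher_one_sided(34, 11, 42, 3)
```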

  20. Analysis of Two Different Arthroscopic Broström Repair Constructs for Treatment of Chronic Lateral Ankle Instability in 110 Patients: A Retrospective Cohort Study.

    PubMed

    Cottom, James M; Baker, Joseph; Plemmons, Britton S

    Chronic lateral ankle instability is a common condition treated by most foot and ankle surgeons. Once conservative treatment has failed, patients often undergo surgical reconstruction, either anatomic or nonanatomic. The present retrospective cohort study compared the clinical outcomes of 2 different arthroscopic Broström procedures. A total of 110 patients (83 females [75.5%] and 27 males [24.5%]) were treated with 1 of the 2 lateral ankle stabilization techniques from October 1, 2014 to December 31, 2015. Of the 110 patients, 75 were included in the arthroscopic lateral ankle stabilization group with an additional suture anchor used proximally and 35 were included in the arthroscopic lateral ankle stabilization group using the knotless design. The mean age of the cohort was 46.05 ± 17.89 (range 12 to 83) years. The mean body mass index was 30.03 ± 7.42 (range 18.3 to 52.5) kg/m². Of the 110 patients, 25 (22.7%) had undergone concomitant procedures during lateral ankle stabilization. Overall, postoperative complications occurred in 14 patients (12.7%). No statistically significant differences were found between the 2 groups regarding the complication rates, use of concomitant procedures, and the presence of diabetes and workers' compensation claims. No statistically significant differences were found in the mean age, body mass index, or gender distribution between the 2 groups. The preoperative American Orthopaedic Foot and Ankle Society (AOFAS) Ankle-Hindfoot scores were 50.85 ± 13.56 (range 18 to 76) and 51.26 ± 13.32 (range 18 to 69) in groups 1 and 2, respectively. The postoperative AOFAS Ankle-Hindfoot scores were 88.19 ± 10.72 (range 54 to 100) and 84 ± 15.41 (range 16 to 100) in groups 1 and 2, respectively. No statistically significant difference was found between these 2 groups. 
The preoperative visual analog scale score was 7.45 ± 1.39 (range 3 to 10) and 6.97 ± 1.25 (range 5 to 10), which had improved to 1.12 ± 1.38 (range 0 to 5) and 1.8 ± 1.98 (range 1 to 9) postoperatively for groups 1 and 2, respectively. The difference in the postoperative visual analog scale score between the 2 groups was statistically significant. The preoperative and postoperative AOFAS scale, Foot Function Index, and Karlsson-Peterson scores showed no statistically significant differences between the 2 groups. From our experience, either procedure is an acceptable treatment option for chronic lateral ankle instability, with the knotless technique showing a trend toward more complications. Copyright © 2017 American College of Foot and Ankle Surgeons. Published by Elsevier Inc. All rights reserved.

  1. PROMISE: a tool to identify genomic features with a specific biologically interesting pattern of associations with multiple endpoint variables.

    PubMed

    Pounds, Stan; Cheng, Cheng; Cao, Xueyuan; Crews, Kristine R; Plunkett, William; Gandhi, Varsha; Rubnitz, Jeffrey; Ribeiro, Raul C; Downing, James R; Lamba, Jatinder

    2009-08-15

    In some applications, prior biological knowledge can be used to define a specific pattern of association of multiple endpoint variables with a genomic variable that is biologically most interesting. However, to our knowledge, there is no statistical procedure designed to detect specific patterns of association with multiple endpoint variables. Projection onto the most interesting statistical evidence (PROMISE) is proposed as a general procedure to identify genomic variables that exhibit a specific biologically interesting pattern of association with multiple endpoint variables. Biological knowledge of the endpoint variables is used to define a vector that represents the biologically most interesting values for statistics that characterize the associations of the endpoint variables with a genomic variable. A test statistic is defined as the dot-product of the vector of the observed association statistics and the vector of the most interesting values of the association statistics. By definition, this test statistic is proportional to the length of the projection of the observed vector of correlations onto the vector of most interesting associations. Statistical significance is determined via permutation. In simulation studies and an example application, PROMISE shows greater statistical power to identify genes with the interesting pattern of associations than classical multivariate procedures, individual endpoint analyses or listing genes that have the pattern of interest and are significant in more than one individual endpoint analysis. Documented R routines are freely available from www.stjuderesearch.org/depts/biostats and will soon be available as a Bioconductor package from www.bioconductor.org.
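
    The abstract specifies the PROMISE statistic precisely: the dot product of the observed association statistics with the vector of biologically most interesting values, with significance from permutation. A minimal sketch under assumed conventions (Pearson correlations as the association statistics, joint permutation of the subject labels of the genomic variable) is:

```python
import numpy as np

rng = np.random.default_rng(1)

def promise_stat(genomic, endpoints, pattern):
    """Dot product of the observed association statistics (here Pearson
    correlations of the genomic variable with each endpoint) with the
    vector encoding the biologically most interesting pattern."""
    corrs = np.array([np.corrcoef(genomic, e)[0, 1] for e in endpoints])
    return float(corrs @ pattern)

def promise_pvalue(genomic, endpoints, pattern, n_perm=2000):
    """Permutation p-value: the genomic variable's subject labels are
    permuted, preserving the correlation structure among the endpoints."""
    obs = promise_stat(genomic, endpoints, pattern)
    perms = np.array([promise_stat(rng.permutation(genomic), endpoints, pattern)
                      for _ in range(n_perm)])
    return obs, float(np.mean(np.abs(perms) >= abs(obs)))

# Invented example: a variable expected to correlate positively with one
# endpoint (e.g., treatment response) and negatively with another (resistance).
n = 60
g = rng.normal(size=n)
response = g + 0.5 * rng.normal(size=n)
resistance = -g + 0.5 * rng.normal(size=n)
obs, p = promise_pvalue(g, [response, resistance], np.array([1.0, -1.0]))
```

    The key design choice, as the abstract notes, is that a gene is rewarded only when its whole vector of associations points in the pre-specified biological direction, rather than when any single endpoint association happens to be large.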

  2. Performance-Based Outcomes after Operative Management of Athletic Pubalgia / Core Muscle Injury in National Football League Players

    PubMed Central

    Lynch, Thomas Sean; Kosanovic, Radomir; Gibbs, Daniel Bradley; Park, Caroline; Bedi, Asheesh; Larson, Christopher M.; Ahmad, Christopher S.

    2017-01-01

    Objectives: Athletic pubalgia is a condition in which there is an injury to the core musculature that precipitates groin and lower abdominal pain, particularly in cutting and pivoting sports. These are common injury patterns in the National Football League (NFL); however, the effect of surgery on performance for these players has not been described. Methods: Athletes in the NFL who underwent a surgical procedure for athletic pubalgia / core muscle injury (CMI) were identified through team injury reports and archives on public record since 2004. Outcome data were collected for athletes who met inclusion criteria, which included total games played after the season of injury / surgery, number of Pro Bowl selections, yearly total yards and touchdowns for offensive players, and yearly total tackles, sacks, and interceptions for defensive players. Previously validated performance scores were calculated from these data for each player one season before and one season after their procedure for a CMI. Athletes were then matched to control professional football players without a diagnosis of athletic pubalgia by age, position, and year and round drafted. Statistical analysis was used to compare pre-injury and post-injury performance measures for players treated with operative management to their case controls. Results: The study group was composed of 32 NFL athletes who underwent operative management for athletic pubalgia and met inclusion criteria during the study period, including 18 offensive players and 16 defensive players. The average age of athletes undergoing this surgery was 27 years. Analysis of pre- and post-injury athletic performance revealed no statistically significant changes after return to sport following surgical intervention; however, there was a statistically significant difference in the number of Pro Bowls that affected athletes participated in before surgery (8) compared with the season after surgery (3). 
Analysis of durability, as measured by total number of games played before and after surgery, revealed no statistically significant difference. Conclusion: National Football League players who undergo operative care for athletic pubalgia have a high return to play with no decrease in performance scores when compared to case-matched controls. However, the indications for operative intervention and the type of procedure performed are heterogeneous. Further research is warranted to better understand how these injuries occur, what can be done to prevent their occurrence, and the long term career ramifications of this disorder.

  3. SU-D-209-03: Radiation Dose Reduction Using Real-Time Image Processing in Interventional Radiology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kanal, K; Moirano, J; Zamora, D

    Purpose: To characterize changes in radiation dose after introducing a new real-time image processing technology in interventional radiology systems. Methods: Interventional radiology (IR) procedures are increasingly complex, at times requiring substantial time and radiation dose. The risk of inducing tissue reactions as well as long-term stochastic effects such as radiation-induced cancer is not trivial. To reduce this risk, IR systems are increasingly equipped with dose reduction technologies. Recently, ClarityIQ (Philips Healthcare) technology was installed in our existing neuroradiology IR (NIR) and vascular IR (VIR) suites. ClarityIQ includes real-time image processing that reduces noise/artifacts, enhances images, and sharpens edges while also reducing radiation dose rates. We reviewed 412 NIR (175 pre- and 237 post-ClarityIQ) procedures and 329 VIR (156 pre- and 173 post-ClarityIQ) procedures performed at our institution pre- and post-ClarityIQ implementation. NIR procedures were primarily classified as interventional or diagnostic. VIR procedures included drain port, drain placement, tube change, mesenteric, and implanted venous procedures. Air kerma (AK, in units of mGy) was documented for all cases using a commercial radiation exposure management system. Results: When considering all NIR procedures, median AK decreased from 1194 mGy to 561 mGy. When considering all VIR procedures, median AK decreased from 49 to 14 mGy. Both NIR and VIR exhibited a decrease in AK exceeding 50% after ClarityIQ implementation, a statistically significant (p<0.05) difference. Of the 5 most common VIR procedures, all median AK values decreased, but significance (p<0.05) was only reached in venous access (N=53), angio mesenteric (N=41), and drain placement procedures (N=31). Conclusion: ClarityIQ can reduce dose significantly for both NIR and VIR procedures. Image quality was not assessed in conjunction with the dose reduction.

  4. "Hyperstat": an educational and working tool in epidemiology.

    PubMed

    Nicolosi, A

    1995-01-01

    The work of a researcher in epidemiology is based on studying literature, planning studies, gathering data, analyzing data, and writing results. The researcher therefore needs to perform more or less simple calculations, to consult or quote the literature, to consult textbooks about particular issues or procedures, and to look up specific formulas. There are no programs conceived as a workstation to assist the different aspects of a researcher's work in an integrated fashion. A hypertextual system was developed which supports the different stages of the epidemiologist's work. It combines database management, statistical analysis or planning, and literature searches. The software was developed on the Apple Macintosh using Hypercard 2.1 as a database and HyperTalk as a programming language. The program is structured in 7 "stacks" or files: Procedures; Statistical Tables; Graphs; References; Text; Formulas; Help. Each stack has its own management system with an automated Table of Contents. Stacks contain "cards" which make up the databases and carry executable programs. The programs are of four kinds: association; statistical procedure; formatting (input/output); database management. The system performs general statistical procedures, procedures applicable to epidemiological studies only (follow-up and case-control), and procedures for clinical trials. All commands are given by clicking the mouse on self-explanatory "buttons". In order to perform calculations, the user only needs to enter the data into the appropriate cells and then click on the selected procedure's button. The system has a hypertextual structure. The user can go from a procedure to other cards in any preferred order of succession and according to built-in associations. The user can access different levels of knowledge or information from any stack he is consulting or operating. 
From every card, the user can go to a selected procedure to perform statistical calculations, to the reference database management system, to the textbook in which all procedures and issues are discussed in detail, to the database of statistical formulas with an automated table of contents, to statistical tables with an automated table of contents, or to the help module. The program has a very user-friendly interface and leaves the user free to use the same format he would use on paper. The interface does not require special skills. It reflects the Macintosh philosophy of using windows, buttons, and the mouse. This allows the user to perform complicated calculations without losing the "feel" of the data, to weigh alternatives, and to run simulations. This program shares many features in common with hypertexts. It has an underlying network database where the nodes consist of text, graphics, executable procedures, and combinations of these; the nodes in the database correspond to windows on the screen; the links between the nodes in the database are visible as "active" text or icons in the windows; the text is read by following links and opening new windows. The program is especially useful as an educational tool, directed to medical and epidemiology students. The combination of computing capabilities with a textbook and databases of formulas and literature references makes the program versatile and attractive as a learning tool. The program is also helpful in the work done at the desk, where the researcher examines results, consults literature, explores different analytic approaches, plans new studies, or writes grant proposals or scientific articles.

  5. Load research manual. Volume 3: Load research for advanced technologies

    NASA Astrophysics Data System (ADS)

    1980-11-01

    Technical guidelines for electric utility load research are presented. Special attention is given to issues raised by the load reporting requirements of the Public Utility Regulatory Policies Act of 1978 and to problems faced by smaller utilities that are initiating load research programs. The manual includes guides to load research literature and glossaries of load research and statistical terms. Special load research procedures are presented for solar, wind, and cogeneration technologies.

  6. Prevention of the Posttraumatic Fibrotic Response in Joints

    DTIC Science & Technology

    2015-10-01

    surgical procedures and subsequent collection of tissues have been developed and are currently used on a regular basis. Major Task 4: Evaluating the...needed to evaluate the utility of the inhibitory antibody to reduce the flexion contracture of injured knee joints. The employed techniques include...second surgery to remove a pin, and it did not change by the end of the 32nd week 1. Major Task 5: Task 4. Data analysis and statistical evaluation

  7. U.S. Marine Corps Communication-Electronics School Training Process: Discrete-Event Simulation and Lean Options

    DTIC Science & Technology

    2007-12-01

    acknowledged that Continuous Improvement (CI), or Kaizen in Japanese, is practiced in some way, shape, or form by most if not all Fortune 500 companies...greater resistance in the individualistic U.S. culture. Kaizen generally involves methodical examination and testing, followed by the adoption of new...or streamlined procedures, including scrupulous measurement and changes based on statistical deviation formulas. Kaizen appears to be a perfect fit

  8. A proposed method to minimize waste from institutional radiation safety surveillance programs through the application of expected value statistics.

    PubMed

    Emery, R J

    1997-03-01

    Institutional radiation safety programs routinely use wipe test sampling and liquid scintillation counting analysis to indicate the presence of removable radioactive contamination. Significant volumes of liquid waste can be generated by such surveillance activities, and the subsequent disposal of these materials can sometimes be difficult and costly. In settings where large numbers of negative results are regularly obtained, limited grouping of samples for analysis, based on expected value statistical techniques, is possible. To demonstrate the plausibility of the approach, single wipe samples exposed to varying amounts of contamination were analyzed concurrently with nine non-contaminated samples. Although the sample grouping inevitably leads to increased quenching with liquid scintillation counting systems, the effect did not impair the ability to detect removable contamination in amounts well below recommended action levels. Opportunities to further improve this cost-effective, semi-quantitative screening procedure are described, including improvements in sample collection procedures, enhancing sample-counting media contact through mixing and extended elution periods, increasing sample counting times, and adjusting institutional action levels.
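
    The abstract does not give the expected-value calculation itself; the sketch below works out the standard two-stage (Dorfman-style) pooled-screening expectation that motivates this kind of design: analyse one pooled vial per group of k wipes, and re-analyse individually only when the pool is positive. The function and parameter names are assumptions, not the paper's notation.

```python
def expected_tests_per_sample(k, q):
    """Expected number of LSC analyses per wipe sample under two-stage
    pooled screening: one pooled analysis per group of k samples, plus k
    individual analyses whenever the pool shows contamination.
    q is the per-sample probability of removable contamination."""
    p_pool_positive = 1.0 - (1.0 - q) ** k
    return 1.0 / k + p_pool_positive

# With largely negative surveys (q = 1%), pooling 10 wipes per vial cuts the
# expected analytical workload, and hence the liquid waste, to about a fifth
# of one vial per wipe.
rate = expected_tests_per_sample(10, 0.01)
```

    The savings evaporate as q grows, which is why the approach is framed as suitable only for programs that regularly obtain large numbers of negative results.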

  9. Semisupervised Clustering by Iterative Partition and Regression with Neuroscience Applications

    PubMed Central

    Qian, Guoqi; Wu, Yuehua; Ferrari, Davide; Qiao, Puxue; Hollande, Frédéric

    2016-01-01

    Regression clustering is a combined unsupervised and supervised statistical learning and data mining method found in a wide range of applications, including artificial intelligence and neuroscience. It performs unsupervised learning when it clusters the data according to their respective unobserved regression hyperplanes. The method also performs supervised learning when it fits regression hyperplanes to the corresponding data clusters. Applying regression clustering in practice requires means of determining the underlying number of clusters in the data, finding the cluster label of each data point, and estimating the regression coefficients of the model. In this paper, we review the estimation and selection issues in regression clustering with regard to least squares and robust statistical methods. We also provide a model-selection-based technique to determine the number of regression clusters underlying the data. We further develop a computing procedure for regression clustering estimation and selection. Finally, simulation studies are presented for assessing the procedure, together with an analysis of a real data set on RGB cell marking in neuroscience to illustrate and interpret the method. PMID:27212939
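
    A least-squares sketch of the iterative partition-and-regression idea follows: alternate between assigning each point to the best-fitting regression hyperplane and refitting each hyperplane to its cluster. The robust fitting and the model-selection rule for choosing the number of clusters treated in the paper are omitted, and all names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

def regression_clustering(X, y, k, n_iter=50):
    """Alternate between (1) refitting one least-squares hyperplane per
    cluster and (2) reassigning each point to the hyperplane with the
    smallest squared residual, until the labels stop changing."""
    n = X.shape[0]
    Z = np.column_stack([np.ones(n), X])        # design matrix with intercept
    labels = rng.integers(k, size=n)            # random initial partition
    coefs = np.zeros((k, Z.shape[1]))
    for _ in range(n_iter):
        for j in range(k):
            mask = labels == j
            if mask.sum() >= Z.shape[1]:        # enough points to fit
                coefs[j], *_ = np.linalg.lstsq(Z[mask], y[mask], rcond=None)
        sq_resid = (y[:, None] - Z @ coefs.T) ** 2
        new_labels = sq_resid.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break
        labels = new_labels
    return labels, coefs
```

    With k = 1 this reduces to ordinary least squares; with larger k the result depends on the random initial partition, which is one reason the paper pairs the estimation step with selection and robustness machinery.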

  10. Communication skills in individuals with spastic diplegia.

    PubMed

    Lamônica, Dionísia Aparecida Cusin; Paiva, Cora Sofia Takaya; Abramides, Dagma Venturini Marques; Biazon, Jamile Lozano

    2015-01-01

    To assess communication skills in children with spastic diplegia. The study included 20 subjects: 10 preschool children with spastic diplegia and 10 typically developing children matched according to gender, mental age, and socioeconomic status. Assessment procedures were the following: interviews with parents, the Stanford-Binet test, the Gross Motor Function Classification System, Observation of Communicative Behavior, the Peabody Picture Vocabulary Test, the Denver Developmental Screening Test II, and the MacArthur Communicative Development Inventory. Statistical analysis was performed using mean, median, minimum, and maximum values, and using Student's t-test, the Mann-Whitney test, and the paired t-test. Individuals with spastic diplegia, when compared with their peers of the same mental age, presented no significant difference in receptive and expressive vocabulary or in fine motor, adaptive, personal-social, and language skills. Gross motor skills were the most affected area in individuals with spastic cerebral palsy. Participation in intervention procedures and the pairing of participants according to mental age may have brought the groups' performance closer together. There was no statistically significant difference between the groups, indicating appropriate communication skills, although the experimental group did not behave homogeneously.

  11. An automated approach to the design of decision tree classifiers

    NASA Technical Reports Server (NTRS)

    Argentiero, P.; Chin, P.; Beaudet, P.

    1980-01-01

    The classification of large dimensional data sets arising from the merging of remote sensing data with more traditional forms of ancillary data is considered. Decision tree classification, a popular approach to the problem, is characterized by the property that samples are subjected to a sequence of decision rules before they are assigned to a unique class. An automated technique for effective decision tree design which relies only on a priori statistics is presented. This procedure utilizes a set of two dimensional canonical transforms and Bayes table look-up decision rules. An optimal design at each node is derived based on the associated decision table. A procedure for computing the global probability of correct classification is also provided. An example is given in which class statistics obtained from an actual LANDSAT scene are used as input to the program. The resulting decision tree design has an associated probability of correct classification of 0.76, compared to the theoretically optimal 0.79 probability of correct classification associated with a full dimensional Bayes classifier. Recommendations for future research are included.

  12. Genome-wide regression and prediction with the BGLR statistical package.

    PubMed

    Pérez, Paulino; de los Campos, Gustavo

    2014-10-01

    Many modern genomic data analyses require implementing regressions where the number of parameters (p, e.g., the number of marker effects) exceeds sample size (n). Implementing these large-p-with-small-n regressions poses several statistical and computational challenges, some of which can be confronted using Bayesian methods. This approach allows integrating various parametric and nonparametric shrinkage and variable selection procedures in a unified and consistent manner. The BGLR R-package implements a large collection of Bayesian regression models, including parametric variable selection and shrinkage methods and semiparametric procedures (Bayesian reproducing kernel Hilbert spaces regressions, RKHS). The software was originally developed for genomic applications; however, the methods implemented are useful for many nongenomic applications as well. The response can be continuous (censored or not) or categorical (either binary or ordinal). The algorithm is based on a Gibbs sampler with scalar updates and the implementation takes advantage of efficient compiled C and Fortran routines. In this article we describe the methods implemented in BGLR, present examples of the use of the package, and discuss practical issues emerging in real-data analysis. Copyright © 2014 by the Genetics Society of America.
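
    BGLR itself is an R package; as a rough Python analogue of the kind of sampler it implements, the sketch below runs a Gibbs sampler with scalar (coordinate-wise) updates for the simplest member of that model family, a Gaussian ridge prior on the marker effects. The priors, hyperparameters, and names here are illustrative assumptions, not BGLR's defaults.

```python
import numpy as np

rng = np.random.default_rng(2)

def gibbs_ridge(X, y, n_iter=1500, burn_in=500):
    """Gibbs sampler with scalar updates for y = Xb + e,
    b_j ~ N(0, vb), e_i ~ N(0, ve), with vague scaled-inverse-chi-square
    priors on the two variances. Returns the posterior mean of b."""
    n, p = X.shape
    b = np.zeros(p)
    ve, vb = float(np.var(y)), 1.0
    col_ss = np.einsum("ij,ij->j", X, X)      # per-column sums of squares
    resid = y - X @ b
    total = np.zeros(p)
    for it in range(n_iter):
        for j in range(p):                    # scalar update of each effect
            resid += X[:, j] * b[j]           # remove effect j from residual
            lhs = col_ss[j] + ve / vb
            b[j] = rng.normal(X[:, j] @ resid / lhs, np.sqrt(ve / lhs))
            resid -= X[:, j] * b[j]
        ve = (resid @ resid + 1.0) / rng.chisquare(n + 4)
        vb = (b @ b + 1.0) / rng.chisquare(p + 4)
        if it >= burn_in:
            total += b
    return total / (n_iter - burn_in)
```

    Because the residual is updated in place, each scalar draw costs O(n), which is the property that lets samplers of this kind remain tractable in large-p-with-small-n regressions.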

  13. Quality Assurance in Clinical Chemistry: A Touch of Statistics and A Lot of Common Sense

    PubMed Central

    2016-01-01

    Summary Working in laboratories of clinical chemistry, we risk feeling that our personal contribution to quality is small and that statistical models and manufacturers play the major roles. It is seldom sufficiently acknowledged that personal knowledge, skills, and common sense are crucial for quality assurance in the interest of patients. The employees, environment, and procedures inherent to the laboratory, including its interactions with its clients, are crucial for the overall result of the total testing chain. As the measurement systems, reagents, and procedures are gradually improved, work on the preanalytical, postanalytical, and clinical phases is likely to pay the most substantial dividends in accomplishing further quality improvements. This means changing attitudes and behaviour, especially of the users of the laboratory. It requires understanding people and how to engage them in joint improvement processes. We need to use our knowledge and common sense, expanded with new skills, e.g., from the humanities, management, business, and change sciences, in order to bring this about together with the users of the laboratory. PMID:28356868

  14. Cosmic shear measurements with Dark Energy Survey Science Verification data

    DOE PAGES

    Becker, M. R.

    2016-07-06

    Here, we present measurements of weak gravitational lensing cosmic shear two-point statistics using Dark Energy Survey Science Verification data. We demonstrate that our results are robust to the choice of shear measurement pipeline, either ngmix or im3shape, and robust to the choice of two-point statistic, including both real- and Fourier-space statistics. Our results pass a suite of null tests, including tests for B-mode contamination and direct tests for any dependence of the two-point functions on a set of 16 observing conditions and galaxy properties, such as seeing, airmass, galaxy color, and galaxy magnitude. We use a large suite of simulations to compute the covariance matrix of the cosmic shear measurements and assign statistical significance to our null tests. We find that our covariance matrix is consistent with the halo model prediction, indicating that it has the appropriate level of halo sample variance. We also compare the same jackknife procedure applied to the data and the simulations in order to search for additional sources of noise not captured by the simulations. We find no statistically significant extra sources of noise in the data. The overall detection significance with tomography for our highest source density catalog is 9.7σ. Cosmological constraints from the measurements in this work are presented in a companion paper.
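
    The jackknife comparison referred to can be sketched generically. Assuming the common delete-one-patch convention for survey two-point statistics (the paper's exact estimator may differ), the covariance of a binned statistic is estimated from leave-one-patch-out remeasurements:

```python
import numpy as np

def jackknife_covariance(patch_stats):
    """Delete-one jackknife covariance for a binned statistic measured in
    n sky patches. `patch_stats` has shape (n_patches, n_bins); each row is
    one patch's contribution, so a leave-one-out estimate is the mean of
    the remaining rows, and the covariance is the (n-1)/n-scaled scatter
    of those leave-one-out estimates."""
    patch_stats = np.asarray(patch_stats, dtype=float)
    n = patch_stats.shape[0]
    loo = (patch_stats.sum(axis=0) - patch_stats) / (n - 1)  # leave-one-out means
    diff = loo - loo.mean(axis=0)
    return (n - 1) / n * diff.T @ diff
```

    For independent, identically distributed patch contributions this reduces to the usual variance of the mean, a convenient sanity check; applying the same estimator to data and to simulations, as the abstract describes, then probes for noise sources the simulations miss.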

  15. Efficacy of Negative Pressure Wound Treatment in Preventing Surgical Site Infections after Whipple Procedures.

    PubMed

    Gupta, Ryan; Darby, Geoffrey C; Imagawa, David K

    2017-10-01

    Surgical site infections (SSIs) occur at an average rate of 21.1 per cent after Whipple procedures per NSQIP data. In the setting of adherence to standard National Surgical Quality Improvement Program (NSQIP) Hepatopancreatobiliary recommendations, including wound protector use and glove change before closing, this study seeks to evaluate the efficacy of using negative pressure wound treatment (NPWT) over closed incision sites after a Whipple procedure to prevent SSI formation. We retrospectively examined consecutive patients from January 2014 to July 2016 who underwent Whipple procedures with full primary incision closure performed by a single surgeon at a single institution. Sixty-one patients were included in the study across two cohorts: traditional dressing (TD) (n = 36) and NPWT dressing (n = 25). There was a statistically significant difference (P = 0.01) in SSI formation between the TD cohort (n = 15, SSI rate = 0.41) and the NPWT cohort (n = 3, SSI rate = 0.12). The adjusted odds ratio (OR) of SSI formation was significant for NPWT use [OR = 0.15, P = 0.036] and for hospital length of stay [OR = 1.21, P = 0.024]. Operative length, operative blood loss, units of perioperative blood transfusion, intraoperative gastrojejunal tube placement, preoperative stent placement, and postoperative antibiotic duration did not significantly impact SSI formation (P > 0.05).

  16. "Low-field" intraoperative MRI: a new scenario, a new adaptation.

    PubMed

    Iturri-Clavero, F; Galbarriatu-Gutierrez, L; Gonzalez-Uriarte, A; Tamayo-Medel, G; de Orte, K; Martinez-Ruiz, A; Castellon-Larios, K; Bergese, S D

    2016-11-01

    To describe the adaptation of Cruces University Hospital to the use of intraoperative magnetic resonance imaging (ioMRI), and how the acquisition and use of this technology would impact the day-to-day running of the neurosurgical suite. With the approval of the ethics committee, an observational, prospective study was performed from June 2012 to April 2014, which included 109 neurosurgical procedures performed with the assistance of ioMRI. These were performed using the Polestar N-30 system (PSN30; Medtronic Navigation, Louisville, CO), which was integrated into the operating room. A total of 159 procedures were included: 109 cranial surgeries assisted with ioMRI and 50 control cases (no ioMRI use). There were no statistically significant differences when anaesthetic time (p=0.587) and surgical time (p=0.792) were compared; however, an important difference was shown in the duration of patient positioning (p<0.0009) and the total duration of the procedure (p<0.0009) between the two groups. The introduction of ioMRI is necessary for most neurosurgical suites; however, a few things need to be taken into consideration when adapting to it. Increased procedure time, the use of specific MRI-safe devices, and a checklist for each patient to minimise risks should all be taken into consideration. Published by Elsevier Ltd.

  17. Pulmonary Screening in Subjects after the Fontan Procedure.

    PubMed

    Liptzin, Deborah R; Di Maria, Michael V; Younoszai, Adel; Narkewicz, Michael R; Kelly, Sarah L; Wolfe, Kelly R; Veress, Livia A

    2018-05-07

    To review the pulmonary findings of the first 51 patients who presented to our interdisciplinary single-ventricle clinic after undergoing the Fontan procedure. We performed an Institutional Review Board-approved retrospective review of 51 patients evaluated following the Fontan procedure. Evaluation included history, physical examination, pulmonary function testing, and 6-minute walk. Descriptive statistics were used to describe the population and testing data. Sixty-one percent of the patients had a pulmonary concern raised during the visit. Three patients had plastic bronchitis. Abnormal lung function testing was present in 46% of patients. Two-thirds (66%) of the patients had significant desaturation during the 6-minute walk test. Patients who underwent a fenestrated Fontan procedure and those who underwent unfenestrated Fontan were compared in terms of saturation and 6-minute walk test results. Sleep concerns were present in 45% of the patients. Pulmonary morbidities are common in patients after Fontan surgery and include plastic bronchitis, abnormal lung function, desaturations with walking, and sleep concerns. Abnormal lung function and obstructive sleep apnea may stress the Fontan circuit and may have implications for cognitive and emotional functioning. A pulmonologist involved in the care of patients after Fontan surgery can assist in screening for comorbidities and recommend interventions. Copyright © 2018 Elsevier Inc. All rights reserved.

  18. Post-operative shampoo effects in neurosurgical patients: a pilot experimental study.

    PubMed

    Palese, Alvisa; Moreale, Renzo; Noacco, Massimo; Pistrino, Flavia; Mastrolia, Irene; Sartor, Assunta; Scarparo, Claudio; Skrap, Miran

    2015-04-01

    Neurosurgical site infections are an important issue. Among the acknowledged preventive tactics, the non-shaving technique is well established in the neurosurgical setting. However, because patients' hair around the surgical site may retain biologic material that emerges during the surgical procedure, or may simply become dirty, potentially increasing the risk of surgical site infections (SSIs), if and when shampooing should be offered remains under debate. A pilot experimental study was undertaken from 2011 to 2012. A series of neurosurgical patients not affected by conditions that would increase the risk of post-operative infection were assigned randomly to the exposed group (receiving a shampoo 72 h after the surgical procedure) or the control group (receiving standard dressing surveillance without shampooing). Comfort, surgical site contamination (measured as the number of colony-forming units [CFU]), and SSIs at 30 d after surgery were the main study outcomes. A total of 53 patients were included: 25 (47.2%) received a shampoo after 72 h whereas 28 (52.8%) received standard care. Patients who received a shampoo reported a similar level of comfort (average 8.04; standard deviation [SD] 1.05) compared with those receiving standard care (average 7.3; SD 3.2), although the difference was not statistically significant (p=0.345). No statistically significant difference emerged in the occurrence of surgical site contamination between the groups, and no SSIs were detected within 30 d. In our pilot study, the results of which are not generalizable because of the limited sample of patients involved, a gentle shampoo offered 72 h after the surgical procedure did not increase SSI occurrence or contamination of the surgical site, although it may increase patients' perception of comfort. Further studies are strongly recommended, involving a larger sample size and designed to include more diversified neurosurgical patients undergoing surgical procedures in different centers.

  19. On Fitting Generalized Linear Mixed-effects Models for Binary Responses using Different Statistical Packages

    PubMed Central

    Zhang, Hui; Lu, Naiji; Feng, Changyong; Thurston, Sally W.; Xia, Yinglin; Tu, Xin M.

    2011-01-01

    Summary The generalized linear mixed-effects model (GLMM) is a popular paradigm to extend models for cross-sectional data to a longitudinal setting. When applied to modeling binary responses, different software packages and even different procedures within a package may give quite different results. In this report, we describe the statistical approaches that underlie these different procedures and discuss their strengths and weaknesses when applied to fit correlated binary responses. We then illustrate these considerations by applying these procedures implemented in some popular software packages to simulated and real study data. Our simulation results indicate a lack of reliability for most of the procedures considered, which carries significant implications for applying such popular software packages in practice. PMID:21671252

  20. A method for determining the weak statistical stationarity of a random process

    NASA Technical Reports Server (NTRS)

    Sadeh, W. Z.; Koper, C. A., Jr.

    1978-01-01

    A method for determining the weak statistical stationarity of a random process is presented. The core of this testing procedure consists of generating an equivalent ensemble which approximates a true ensemble. Formation of an equivalent ensemble is accomplished through segmenting a sufficiently long time history of a random process into equal, finite, and statistically independent sample records. The weak statistical stationarity is ascertained based on the time invariance of the equivalent-ensemble averages. Comparison of these averages with their corresponding time averages over a single sample record leads to a heuristic estimate of the ergodicity of a random process. Specific variance tests are introduced for evaluating the statistical independence of the sample records, the time invariance of the equivalent-ensemble autocorrelations, and the ergodicity. Examination and substantiation of these procedures were conducted utilizing turbulent velocity signals.
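A minimal sketch of the equivalent-ensemble idea follows, using a synthetic white-noise signal in place of the paper's turbulent velocity records; the record count and lengths are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(2)

# A long single time history of a (here, stationary by construction) process.
x = rng.normal(0.0, 1.0, 100_000)

# Form the "equivalent ensemble": segment the history into equal, finite
# sample records (assumed long enough to be statistically independent).
n_rec, rec_len = 100, 1000
records = x.reshape(n_rec, rec_len)

# Equivalent-ensemble averages at each instant within a record; for weak
# stationarity these should be invariant in time (flat across the record).
ens_mean = records.mean(axis=0)
ens_msq = (records**2).mean(axis=0)

# Heuristic ergodicity check: compare the ensemble average with the time
# average over a single sample record.
time_mean = records[0].mean()
print(ens_mean.std(), abs(ens_mean.mean() - time_mean))
```

The paper's variance tests formalize what this sketch eyeballs: whether the spread of the ensemble averages over time is consistent with sampling noise alone.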

  1. An Empirical Comparison of Selected Two-Sample Hypothesis Testing Procedures Which Are Locally Most Powerful Under Certain Conditions.

    ERIC Educational Resources Information Center

    Hoover, H. D.; Plake, Barbara

    The relative power of the Mann-Whitney statistic, the t-statistic, the median test, a test based on exceedances (A,B), and two special cases of (A,B), the Tukey quick test and the revised Tukey quick test, was investigated via a Monte Carlo experiment. These procedures were compared across four population probability models: uniform, beta, normal,…
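A Monte Carlo power comparison of this kind can be sketched as follows; the sample size, location shift, and normal population are illustrative choices, not the study's exact design:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Empirical power of two-sample tests under a normal location shift:
# n = 20 per group, shift of 0.8 SD, alpha = 0.05, 2000 replications.
n, shift, n_sim, alpha = 20, 0.8, 2000, 0.05
reject_t = reject_mw = 0
for _ in range(n_sim):
    a = rng.normal(0.0, 1.0, n)
    b = rng.normal(shift, 1.0, n)
    if stats.ttest_ind(a, b).pvalue < alpha:
        reject_t += 1
    if stats.mannwhitneyu(a, b, alternative="two-sided").pvalue < alpha:
        reject_mw += 1

power_t, power_mw = reject_t / n_sim, reject_mw / n_sim
print(power_t, power_mw)
```

Repeating this over the different population shapes (uniform, beta, etc.) is what lets such a study rank the procedures by relative power.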

  2. Spatial Statistical Models and Optimal Survey Design for Rapid Geophysical characterization of UXO Sites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ostrouchov, G.; Doll, W. E.; Wolf, D. A.

    2003-07-01

    Unexploded ordnance (UXO) surveys encompass large areas, and the cost of surveying these areas can be high. Enactment of earlier protocols for sampling UXO sites has shown the shortcomings of these procedures and led to a call for the development of scientifically defensible statistical procedures for survey design and analysis. This project is one of three funded by SERDP to address this need.

  3. Statistical evaluation of rainfall-simulator and erosion testing procedure : final report.

    DOT National Transportation Integrated Search

    1977-01-01

    The specific aims of this study were (1) to supply documentation of statistical repeatability and precision of the rainfall-simulator and to document the statistical repeatability of the soil-loss data when using the previously recommended tentative l...

  4. The statistical analysis of circadian phase and amplitude in constant-routine core-temperature data

    NASA Technical Reports Server (NTRS)

    Brown, E. N.; Czeisler, C. A.

    1992-01-01

    Accurate estimation of the phases and amplitude of the endogenous circadian pacemaker from constant-routine core-temperature series is crucial for making inferences about the properties of the human biological clock from data collected under this protocol. This paper presents a set of statistical methods based on a harmonic-regression-plus-correlated-noise model for estimating the phases and the amplitude of the endogenous circadian pacemaker from constant-routine core-temperature data. The methods include a Bayesian Monte Carlo procedure for computing the uncertainty in these circadian functions. We illustrate the techniques with a detailed study of a single subject's core-temperature series and describe their relationship to other statistical methods for circadian data analysis. In our laboratory, these methods have been successfully used to analyze more than 300 constant routines and provide a highly reliable means of extracting phase and amplitude information from core-temperature data.
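The harmonic-regression core of such a model (setting aside the correlated-noise component and the Bayesian uncertainty step) can be sketched with ordinary least squares; the temperature series below is simulated, not a subject's data, and the period, amplitude, and phase are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical core-temperature series sampled every 6 min over 30 h,
# with a 24 h rhythm plus independent noise.
t = np.arange(0.0, 30.0, 0.1)                 # hours
period = 24.0
true_amp, true_phase = 0.4, 5.0               # degC, hours
w = 2.0 * np.pi / period
y = 37.0 + true_amp * np.cos(w * (t - true_phase)) + rng.normal(0, 0.05, t.size)

# Harmonic regression: y = mu + A cos(wt) + B sin(wt), solved by least squares.
X = np.column_stack([np.ones_like(t), np.cos(w * t), np.sin(w * t)])
mu, A, B = np.linalg.lstsq(X, y, rcond=None)[0]

amp = np.hypot(A, B)                          # fitted circadian amplitude
phase = np.arctan2(B, A) / w % period         # fitted acrophase, in hours
print(amp, phase)
```

The paper's contribution is what this sketch omits: modeling the serially correlated noise and propagating it, via Bayesian Monte Carlo, into uncertainty on the phase and amplitude.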

  5. Patient use of social media to evaluate cosmetic treatments and procedures.

    PubMed

    Schlichte, Megan J; Karimkhani, Chante; Jones, Trevor; Trikha, Ritika; Dellavalle, Robert P

    2015-04-16

    With a growing sphere of influence in the modern world, online social media serves as a readily accessible interface for communication of information. Aesthetic medicine is one of many industries increasingly influenced by social media, as evidenced by the popular website, "RealSelf," an online community founded in 2006 that compiles ratings, reviews, photographs, and expert physician commentary for nearly 300 cosmetic treatments. To investigate the current preferences of patients regarding cosmetic non-surgical, surgical, and dental treatments on RealSelf and in the documented medical literature. On a single day of data collection, all cosmetic treatments or procedures reviewed on the RealSelf website were tabulated, including name, percent "worth it" rating, total number of reviews, and average cost. Patient satisfaction rates documented in the current medical literature for each cosmetic treatment or procedure were also recorded. Statistical t-testing comparing RealSelf ratings and satisfaction rates in the literature was performed for each category: non-surgical, surgical, and dental. The top ten most-commonly reviewed non-surgical treatments, top ten most-commonly reviewed surgical procedures, and top five most-commonly reviewed dental treatments, along with documented satisfaction rates in the medical literature for each treatment or procedure, were recorded in table format and ranked by RealSelf "worth it" rating. Paired t-testing revealed that satisfaction rates documented in the literature were significantly higher than RealSelf "worth it" ratings for both non-surgical cosmetic treatments (p=0.00076) and surgical cosmetic procedures (p=0.00056), with no statistically significant difference for dental treatments. For prospective patients interested in cosmetic treatments or procedures, social media sites such as RealSelf may offer information helpful to decision-making as well as enable cosmetic treatment providers to build reputations and expand practices.
"Worth it" ratings on RealSelf may, in fact, represent a more transparent view of cosmetic treatment or procedural outcomes relative to the high satisfaction rates documented in medical literature. Massive online communication of patient experiences made possible through social media will continue to influence the practice of medicine, both aesthetic and otherwise.

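The paired t-testing described in the record above can be sketched as follows; the ten paired percentages are invented for illustration and are not the study's data:

```python
import numpy as np
from scipy import stats

# Hypothetical paired data: for ten treatments, a literature satisfaction
# rate and the corresponding "worth it" rating (both in percent).
literature = np.array([95, 92, 90, 89, 88, 87, 85, 84, 83, 80], float)
worth_it   = np.array([90, 85, 88, 80, 82, 79, 78, 80, 75, 72], float)

# Paired t-test on the per-treatment differences, as in the study's
# category-wise comparisons of the two sources.
res = stats.ttest_rel(literature, worth_it)
print(res.statistic, res.pvalue)
```

A paired test is the right choice here because each treatment contributes one value to both sources, so the per-treatment differences, not the two pooled samples, carry the comparison.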
  6. SU-F-I-77: Radiation Dose in Cardiac Catheterization Procedures: Impact of a Systematic Reduction in Pulsed Fluoroscopy Frame Rate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schultz, C; Dixon, S

    Purpose: To evaluate whether one small systematic reduction in fluoroscopy frame rate has a significant effect on the total air kerma and/or dose area product for diagnostic and interventional cardiac catheterization procedures. Methods: The default fluoroscopy frame rate (FFR) was lowered from 15 to 10 fps in 5 Siemens™ Axiom Artis cardiac catheterization labs (CCL) on July 1, 2013. A total of 7212 consecutive diagnostic and interventional CCL procedures were divided into two study groups: 3602 procedures from 10/1/12 – 6/30/13 with FFR of 15 fps; and 3610 procedures from 7/1/13 – 3/31/14 at 10 fps. For each procedure, total air kerma (TAK), fluoroscopy skin dose (FSD), total/fluoroscopy dose area products (TAD, FAD), and total fluoroscopy time (FT) were recorded. Patient specific data collected for each procedure included: BSA, sex, height, weight, interventional versus diagnostic; and elective versus emergent. Results: For the pre to post change in FFR, each categorical variable was compared using Pearson's Chi-square test, odds ratios, and 95% confidence intervals. No statistically significant difference in BSA, height, weight, number of interventional versus diagnostic, or elective versus emergent procedures was found between the two study groups. Decreasing the default FFR from 15 fps to 10 fps significantly reduced TAK from 1305 to 1061 mGy (p<0.0001), FSD from 627 to 454 mGy (p<0.0001), TAD from 8681 to 6991 µGy·m² (p<0.0001), and FAD from 4493 to 3297 µGy·m² (p<0.0001). No statistically significant difference in FT was noted. Clinical image quality was not analyzed, and reports of noticeable effects were minimal. From July 1, 2013 to date, the default FFR has remained 10 fps. Conclusion: Reducing the FFR from 15 to 10 fps significantly reduced total air kerma and dose area product, which may decrease risk for potential radiation-induced skin injuries and improve patient outcomes.

  7. MiDAS I (mild Decompression Alternative to Open Surgery): a preliminary report of a prospective, multi-center clinical study.

    PubMed

    Chopko, Bohdan; Caraway, David L

    2010-01-01

    Neurogenic claudication due to lumbar spinal stenosis is a common problem that can be caused by many factors including hypertrophic ligamentum flavum, facet hypertrophy, and disc protrusion. When standard medical therapies such as pain medication, epidural steroid injections, and physical therapy fail, or when the patient is unwilling, unable, or not severe enough to advance to more invasive surgical procedures, both physicians and patients are often left with a treatment dilemma. Patients in this study were treated with mild, an ultra-minimally invasive lumbar decompression procedure using a dorsal approach. The mild procedure is performed under fluoroscopic imaging to resect bone adjacent to, and achieve partial resection of, the hypertrophic ligamentum flavum with minimal disruption of surrounding muscular and skeletal structure. To assess the clinical application and patient safety and functional outcomes of the mild lumbar decompression procedure in the treatment of symptomatic central canal spinal stenosis. Multi-center, non-blinded, prospective clinical study. Fourteen US spine specialist practices. Between July 2008 and January 2010, 78 patients were enrolled in the MiDAS I Study and treated with the mild procedure for lumbar decompression. Of these patients, 6-week follow-up was available for 75 patients. Visual Analog Score (VAS), Oswestry Disability Index (ODI), Zurich Claudication Questionnaire (ZCQ), and SF-12v2 Health Survey. Outcomes were assessed at baseline and 6 weeks post-treatment. There were no major device or procedure-related complications reported in this patient cohort. At 6 weeks, the MiDAS I Study showed statistically and clinically significant reduction of pain as measured by VAS, ZCQ, and SF-12v2. In addition, improvement in physical function and mobility as measured by ODI, ZCQ, and SF-12v2 was statistically and clinically significant in this study. This is a preliminary report encompassing 6-week follow-up. 
There was no control group. In this 75-patient series, and in keeping with a previously published 90-patient safety cohort, the mild procedure proved to be safe. Further, based on near-term follow-up, the mild procedure demonstrated efficacy in improving mobility and reducing pain associated with lumbar spinal canal stenosis.

  8. Variability in source sediment contributions by applying different statistic test for a Pyrenean catchment.

    PubMed

    Palazón, L; Navas, A

    2017-06-01

    Information on sediment contribution and transport dynamics from the contributing catchments is needed to develop management plans to tackle environmental problems related to the effects of fine sediment, such as reservoir siltation. In this respect, the fingerprinting technique is an indirect technique known to be valuable and effective for sediment source identification in river catchments. Large variability in sediment delivery was found in previous studies in the Barasona catchment (1509 km², Central Spanish Pyrenees). Simulation results with SWAT and fingerprinting approaches identified badlands and agricultural uses as the main contributors to sediment supply in the reservoir. In this study the <63 μm sediment fraction from the surface reservoir sediments (2 cm) is investigated following the fingerprinting procedure to assess how the use of different statistical procedures affects the amounts of source contributions. Three optimum composite fingerprints were selected to discriminate between source contributions based on land uses/land covers from the same dataset by the application of (1) discriminant function analysis, and its combination (as a second step) with (2) the Kruskal-Wallis H-test and (3) principal components analysis. Source contribution results differed between the assessed options, with the greatest differences observed for option #3, the two-step process of principal components analysis and discriminant function analysis. The characteristics of the solutions given by the applied mixing model and the conceptual understanding of the catchment showed that the most reliable solution was achieved using option #2, the two-step process of the Kruskal-Wallis H-test and discriminant function analysis. The assessment showed the importance of the statistical procedure used to define the optimum composite fingerprint for sediment fingerprinting applications. Copyright © 2016 Elsevier Ltd. All rights reserved.
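The first step of option #2, screening candidate tracers with the Kruskal-Wallis H-test before discriminant function analysis, might look like the sketch below; the source groups, tracer names, and values are hypothetical, not the catchment's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Hypothetical tracer measurements for three land-use source groups.
n_per_source = 15
sources = {
    "badlands":    {"tracer_a": rng.normal(10, 1, n_per_source),
                    "tracer_b": rng.normal(5, 2, n_per_source)},
    "agriculture": {"tracer_a": rng.normal(14, 1, n_per_source),
                    "tracer_b": rng.normal(5, 2, n_per_source)},
    "forest":      {"tracer_a": rng.normal(18, 1, n_per_source),
                    "tracer_b": rng.normal(5, 2, n_per_source)},
}

# Keep only tracers whose Kruskal-Wallis H-test discriminates among the
# source groups; the survivors feed the discriminant function analysis.
selected = []
for tracer in ("tracer_a", "tracer_b"):
    groups = [sources[s][tracer] for s in sources]
    h, p = stats.kruskal(*groups)
    if p < 0.05:
        selected.append(tracer)
print(selected)
```

Because this screening step changes which tracers enter the composite fingerprint, it directly shapes the mixing-model source apportionment, which is the sensitivity the paper quantifies.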

  9. Minimally invasive repair of pectus excavatum (MIRPE) in adults: is it a proper choice?

    PubMed Central

    Demirkaya, Ahmet; Kılıç, Burcu; Kara, Hasan Volkan; Yakşi, Osman; Alizade, Nurlan; Demirhan, Özkan; Sayılgan, Cem; Turna, Akif; Kaynak, Kamil

    2016-01-01

    Introduction The Nuss procedure is suitable for prepubertal and early pubertal patients but can also be used in adult patients. Aim To determine whether the minimally invasive technique (MIRPE) can also be performed successfully in adults. Material and methods Between July 2006 and January 2016, 836 patients (744 male, 92 female) underwent correction of pectus excavatum with the MIRPE technique at our institution. The mean age was 16.8 years (2–45 years). There were 236 adult patients (28.2%) (> 18 years) – 20 female, 216 male. The mean age among the adult patients was 23.2 years (18–45 years). The recorded data included length of hospital stay, postoperative complications, number of bars used, duration of the surgical procedure and signs of pneumothorax on the postoperative chest X-ray. Results The MIRPE was performed in 236 adult patients. The average operative time was 44.4 min (25–90 min). The median postoperative stay was 4.92 ±2.81 days (3–21 days) in adults and 4.64 ±1.58 (2–13) in younger patients. The difference was not statistically significant (p = 0.637). Two or more bars were used in 36 (15.8%) adult patients and in 44 (7.5%) younger patients. The difference was not statistically significant either (p = 0.068). Regarding the overall complications, complication rates among the adult patients and younger patients were 26.2% and 11.8% respectively. The difference was statistically significant (p = 0.007). Conclusions MIRPE is a feasible procedure that produces good long-term results in the treatment of pectus excavatum in adults. PMID:27458490

  10. Post-surgical infections: prevalence associated with various periodontal surgical procedures.

    PubMed

    Powell, Charles A; Mealey, Brian L; Deas, David E; McDonnell, Howard T; Moritz, Alan J

    2005-03-01

    Of the various adverse outcomes that may be encountered following periodontal surgery, the risk of infection stands at the forefront of concern to the surgeon, since infection can lead to morbidity and poor healing outcomes. This paper describes a large-scale retrospective study of multiple surgical modalities in a diverse periodontal practice undertaken to explore the prevalence of clinical infections post-surgically and the relationship between diverse treatment variables and infection rates. A retrospective review of all available periodontal surgical records of patients treated in the Department of Periodontics at Wilford Hall Medical Center, San Antonio, Texas, was conducted. The sample comprised 395 patients and included 1,053 fully documented surgical procedures. Surgical techniques reviewed included osseous resective surgery, flap curettage, distal wedge procedures, gingivectomy, root resection, guided tissue regeneration, dental implant surgery, epithelialized free soft tissue autografts, subepithelial connective tissue autografts, coronally positioned flaps, sinus augmentations, and ridge preservation or augmentation procedures. Infection was defined as increasing and progressive swelling with the presence of suppuration. The impact of various treatment variables was examined, including the use of bone grafts, membranes, soft tissue grafts, post-surgical chlorhexidine rinses, systemic antibiotics, and dressings. Results were analyzed using Fisher's exact test and Pearson's chi-square test. Of the 1,053 surgical procedures evaluated in this study, there were a total of 22 infections for an overall prevalence of 2.09%. Patients who received antibiotics as part of the surgical protocol (pre- and/or post-surgically) developed eight infections in 281 procedures (2.85%) compared to 14 infections in 772 procedures (1.81%) where antibiotics were not used.
Procedures in which chlorhexidine was used during post-surgical care had a lower infection rate (17 infections in 900 procedures, 1.89%) compared to procedures after which chlorhexidine was not used as part of post-surgical care (five infections in 153 procedures, 3.27%). The use of a post-surgical dressing demonstrated a slightly higher rate of infection (eight infections in 300 procedures, 2.67%) than non-use of a dressing (14 infections in 753 procedures, 1.86%). Despite these trends, no statistically significant relationship was found between post-surgical infection and any of the treatment variables examined, including the use of perioperative antibiotics. The results of this study confirm previous research demonstrating a low rate of postoperative infection following periodontal surgical procedures. Although perioperative antibiotics are commonly used when performing certain regenerative and implant surgical procedures, data from this and other studies suggest that there may be no benefit in using antibiotics for the sole purpose of preventing post-surgical infections. Further large-scale, controlled clinical studies are warranted to determine the role of perioperative antibiotics in the prevention of periodontal post-surgical infections.
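The antibiotic comparison above can be checked with Fisher's exact test on the reported counts (8 infections in 281 procedures with perioperative antibiotics vs 14 in 772 without); a short sketch:

```python
from scipy import stats

# 2x2 table from the study's reported counts: rows are antibiotics /
# no antibiotics, columns are infected / not infected.
table = [[8, 281 - 8],
         [14, 772 - 14]]
odds_ratio, p = stats.fisher_exact(table)
print(round(odds_ratio, 2), round(p, 3))
```

Consistent with the paper's conclusion, the raw difference in infection rates is small relative to the counts involved, so the test does not reach significance.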

  11. Integration of modern statistical tools for the analysis of climate extremes into the web-GIS “CLIMATE”

    NASA Astrophysics Data System (ADS)

    Ryazanova, A. A.; Okladnikov, I. G.; Gordov, E. P.

    2017-11-01

    The frequency of occurrence and magnitude of precipitation and temperature extreme events show positive trends in several geographical regions. These events must be analyzed and studied in order to better understand their impact on the environment, predict their occurrences, and mitigate their effects. For this purpose, we augmented the web-GIS “CLIMATE” with a dedicated statistical package developed in the R language. The web-GIS “CLIMATE” is a software platform for cloud storage, processing, and visualization of distributed archives of spatial datasets. It is based on a combined use of web and GIS technologies with reliable procedures for searching, extracting, processing, and visualizing spatial data archives. The system provides a set of thematic online tools for the complex analysis of current and future climate changes and their effects on the environment. The package includes new powerful methods of time-dependent statistics of extremes, quantile regression, and the copula approach for the detailed analysis of various climate extreme events. Specifically, the very promising copula approach allows obtaining the structural connections between the extremes and various environmental characteristics. The new statistical methods integrated into the web-GIS “CLIMATE” can significantly facilitate and accelerate the complex analysis of climate extremes using only a desktop PC connected to the Internet.

  12. A Retrospective Analysis of Complications Associated With Bone Morphogenetic Protein 2 in Anterior Lumbar Interbody Fusion.

    PubMed

    Hindoyan, Kevork; Tilan, Justin; Buser, Zorica; Cohen, Jeremiah R; Brodke, Darrel S; Youssef, Jim A; Park, Jong-Beom; Yoon, S Tim; Meisel, Hans-Joerg; Wang, Jeffrey C

    2017-04-01

    Retrospective review. The aim of our study was to quantify the frequency of complications associated with recombinant human bone morphogenetic protein 2 (rhBMP-2) use in anterior lumbar interbody fusion (ALIF). The orthopedic subset of the Medicare database (PearlDiver) was queried for this retrospective cohort study using International Statistical Classification of Diseases 9 (ICD-9) and Current Procedural Terminology (CPT) codes for ALIF procedures with and without rhBMP-2 between 2005 and 2010. Frequencies of complications and reoperations were then identified within 1 year from the index procedure. Complications included reoperations, pulmonary embolus, deep vein thrombosis, myocardial infarction, nerve-related complications, incision and drainage procedures, wound, sepsis, pneumonia, urinary tract infections, respiratory, heterotopic ossification, retrograde ejaculation, radiculopathy, and other medical complications. Odds ratios (ORs) and 95% confidence intervals (CIs) were used to assess the statistical significance. We identified a total of 41 865 patients who had an ALIF procedure. A total of 14 384 patients received rhBMP-2 while 27 481 did not. Overall, 6016 (41.8%) complications within 1 year from surgery were noted within the group who received rhBMP-2 and 12 950 (47.1%) complications within 1 year from surgery were recorded in those who did not receive rhBMP-2 (OR = 0.81, CI = 0.77-0.84). Overall, exposure to rhBMP-2 was associated with significantly decreased odds of complications, with the exception of reoperation rates (0.9% rhBMP-2 vs 1.0% no rhBMP-2; OR = 0.88, CI = 0.71-1.09) and radiculopathy (4.4% rhBMP-2 vs 4.3% no rhBMP-2; OR = 1.02, CI = 0.93-1.13). The use of rhBMP-2 in patients undergoing an ALIF procedure was associated with a significantly decreased rate of complications. Further studies are needed to elucidate the true incidence of complications.
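The headline odds ratio and confidence interval can be reproduced from the reported counts (6016/14 384 complications with rhBMP-2 vs 12 950/27 481 without) with a standard Wald calculation on the log-odds scale:

```python
import math

# Reported counts: events / total in each cohort.
a, n1 = 6016, 14384      # complications, rhBMP-2 group
c, n2 = 12950, 27481     # complications, no rhBMP-2
b, d = n1 - a, n2 - c    # non-events in each group

odds_ratio = (a * d) / (b * c)

# Wald 95% CI: exponentiate log(OR) +/- 1.96 standard errors, where the
# SE of log(OR) is sqrt of the summed reciprocal cell counts.
se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se)
hi = math.exp(math.log(odds_ratio) + 1.96 * se)
print(round(odds_ratio, 2), round(lo, 2), round(hi, 2))  # 0.81 0.77 0.84
```

This matches the paper's reported OR = 0.81 (CI 0.77-0.84), confirming the headline figure is a simple unadjusted odds ratio on the overall counts.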

  13. A Retrospective Analysis of Complications Associated With Bone Morphogenetic Protein 2 in Anterior Lumbar Interbody Fusion

    PubMed Central

    Hindoyan, Kevork; Tilan, Justin; Cohen, Jeremiah R.; Brodke, Darrel S.; Youssef, Jim A.; Park, Jong-Beom; Yoon, S. Tim; Meisel, Hans-Joerg; Wang, Jeffrey C.

    2017-01-01

    Study Design: Retrospective review. Objective: The aim of our study was to quantify the frequency of complications associated with recombinant human bone morphogenetic protein 2 (rhBMP-2) use in anterior lumbar interbody fusion (ALIF). Methods: The orthopedic subset of the Medicare database (PearlDiver) was queried for this retrospective cohort study using International Statistical Classification of Diseases 9 (ICD-9) and Current Procedural Terminology (CPT) codes for ALIF procedures with and without rhBMP-2 between 2005 and 2010. Frequencies of complications and reoperations were then identified within 1 year from the index procedure. Complications included reoperations, pulmonary embolus, deep vein thrombosis, myocardial infarction, nerve-related complications, incision and drainage procedures, wound, sepsis, pneumonia, urinary tract infections, respiratory, heterotopic ossification, retrograde ejaculation, radiculopathy, and other medical complications. Odds ratios (ORs) and 95% confidence intervals (CIs) were used to assess the statistical significance. Results: We identified a total of 41 865 patients who had an ALIF procedure. A total of 14 384 patients received rhBMP-2 while 27 481 did not. Overall, 6016 (41.8%) complications within 1 year from surgery were noted within the group who received rhBMP-2 and 12 950 (47.1%) complications within 1 year from surgery were recorded in those who did not receive rhBMP-2 (OR = 0.81, CI = 0.77-0.84). Overall, exposure to rhBMP-2 was associated with significantly decreased odds of complications, with the exception of reoperation rates (0.9% rhBMP-2 vs 1.0% no rhBMP-2; OR = 0.88, CI = 0.71-1.09) and radiculopathy (4.4% rhBMP-2 vs 4.3% no rhBMP-2; OR = 1.02, CI = 0.93-1.13). Conclusions: The use of rhBMP-2 in patients undergoing an ALIF procedure was associated with a significantly decreased rate of complications. Further studies are needed to elucidate the true incidence of complications. PMID:28507884

  14. Complication rates of ostomy surgery are high and vary significantly between hospitals.

    PubMed

    Sheetz, Kyle H; Waits, Seth A; Krell, Robert W; Morris, Arden M; Englesbe, Michael J; Mullard, Andrew; Campbell, Darrell A; Hendren, Samantha

    2014-05-01

    Ostomy surgery is common and has traditionally been associated with high rates of morbidity and mortality, suggesting an important target for quality improvement. The purpose of this work was to evaluate the variation in outcomes after ostomy creation surgery within Michigan to identify targets for quality improvement. This was a retrospective cohort study. The study took place within the 34-hospital Michigan Surgical Quality Collaborative. Patients included were those undergoing ostomy creation surgery between 2006 and 2011. We evaluated hospital morbidity and mortality rates after risk adjustment (age, comorbidities, emergency vs elective, and procedure type). A total of 4250 patients underwent ostomy creation surgery; 3866 procedures (91.0%) were open and 384 (9.0%) were laparoscopic. Unadjusted morbidity and mortality rates were 43.9% and 10.7%. Unadjusted morbidity rates for specific procedures ranged from 32.7% for ostomy-creation-only procedures to 47.8% for Hartmann procedures. Risk-adjusted morbidity rates varied significantly between hospitals, ranging from 31.2% (95% CI, 18.4-43.9) to 60.8% (95% CI, 48.9-72.6). There were 5 statistically significant high-outlier hospitals and 3 statistically significant low-outlier hospitals for risk-adjusted morbidity. The pattern of complication types was similar between high- and low-outlier hospitals. Case volume, operative duration, and use of laparoscopic surgery did not explain the variation in morbidity rates across hospitals. This work was limited by its retrospective study design, by unmeasured variation in case severity, and by our inability to differentiate between colostomies and ileostomies because of the use of Current Procedural Terminology codes. Morbidity and mortality rates for modern ostomy surgery are high. Although this type of surgery has received little attention in healthcare policy, these data reveal that it is both common and uncommonly morbid. 
Variation in hospital performance provides an opportunity to identify quality improvement practices that could be disseminated among hospitals.

  15. Procedures for determination of detection limits: application to high-performance liquid chromatography analysis of fat-soluble vitamins in human serum.

    PubMed

    Browne, Richard W; Whitcomb, Brian W

    2010-07-01

    Problems in the analysis of laboratory data commonly arise in epidemiologic studies in which biomarkers subject to lower detection thresholds are used. Various thresholds exist, including the limit of detection (LOD), limit of quantification (LOQ), and limit of blank (LOB). Choosing appropriate strategies for dealing with data affected by such limits relies on a proper understanding of the nature of the detection limit and its determination. In this paper, we demonstrate experimental and statistical procedures generally used for estimating different detection limits according to standard procedures, in the context of analysis of fat-soluble vitamins and micronutrients in human serum. Fat-soluble vitamins and micronutrients were analyzed by high-performance liquid chromatography with diode array detection. A simulated serum matrix blank was repeatedly analyzed for determination of the LOB, both parametrically, by using the observed blank distribution, and nonparametrically, by using ranks. The LOD was determined by combining information regarding the LOB with data from repeated analysis of standard reference materials (SRMs) diluted to low levels (from the LOB to 2-3 times the LOB). The LOQ was determined experimentally by plotting the observed relative standard deviation (RSD) of SRM replicates against concentration, where the LOQ is the concentration at an RSD of 20%. Experimental approaches and example statistical procedures are given for determination of the LOB, LOD, and LOQ, and these quantities are reported for each measured analyte. For many analyses, there is considerable information available below the LOQ. Epidemiologic studies must account for the nature of these detection limits and how they have been estimated in order to treat affected data appropriately.
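The three limits described in this abstract can be estimated in a few lines of code. The sketch below is a minimal illustration, not the authors' implementation: it assumes the common parametric LOB/LOD formulas with a one-sided 95% normal allowance (z ≈ 1.645), and finds the LOQ by interpolating the RSD-versus-concentration curve to the 20% point. All function names and data are hypothetical.

```python
import numpy as np

Z95 = 1.645  # one-sided 95% standard-normal quantile

def limit_of_blank(blank_reps):
    """Parametric LOB from repeated blank measurements:
    mean_blank + 1.645 * SD_blank."""
    return float(np.mean(blank_reps) + Z95 * np.std(blank_reps, ddof=1))

def limit_of_detection(lob, low_conc_reps):
    """LOD combines the LOB with the spread of replicates of a
    low-concentration standard reference material."""
    return float(lob + Z95 * np.std(low_conc_reps, ddof=1))

def limit_of_quantification(concentrations, rsd_percent, target_rsd=20.0):
    """LOQ: the concentration at which the replicate RSD falls to the
    target (20%), found by interpolating the RSD-vs-concentration curve."""
    c = np.asarray(concentrations, float)
    r = np.asarray(rsd_percent, float)
    order = np.argsort(c)
    # np.interp needs increasing x values; RSD decreases with concentration,
    # so reverse both arrays before interpolating
    return float(np.interp(target_rsd, r[order][::-1], c[order][::-1]))
```

In practice the blank and low-level replicate counts, and the choice of parametric versus rank-based LOB, would follow the laboratory's validation protocol rather than the defaults sketched here.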

  16. Internal Delorme's Procedure for Treating ODS Associated With Impaired Anal Continence.

    PubMed

    Liu, Weicheng; Sturiale, Alessandro; Fabiani, Bernardina; Giani, Iacopo; Menconi, Claudia; Naldini, Gabriele

    2017-12-01

    The aim of this study was to evaluate the medium-term outcomes of internal Delorme's procedure for treating obstructed defecation syndrome (ODS) patients with impaired anal continence. In a retrospective study, 41 ODS patients who underwent internal Delorme's procedure between 2011 and 2015 were divided into 3 subgroups according to their associated symptoms of impaired continence: urgency, passive fecal incontinence, or both. The patients' preoperative statuses, perioperative complications, and postoperative outcomes were then collected from standardized questionnaires, including the Altomare ODS score, Fecal Incontinence Severity Index (FISI), Patient Assessment of Constipation-Quality of Life Questionnaire (PAC-QoL), and Fecal Incontinence Quality of Life Scale (FIQLS). All results with a 2-tailed P < .05 were considered statistically significant. At an average 2.8 years of follow-up, there were significant improvements (P < .01) in Altomare ODS score, FISI, PAC-QoL, and FIQLS in all patients when comparing scores from before the operation with those at the final follow-up. Similar results were also observed in both the urgency subgroup and the passive fecal incontinence subgroup, but there were no statistically significant improvements (P > .05) in Altomare ODS score, FISI, PAC-QoL, or FIQLS in the subgroup with both urgency and passive fecal incontinence. Anorectal manometry showed that the mean anal resting pressure increased by 20%. Additionally, no major complications occurred. Internal Delorme's procedure is effective without major morbidity for treating ODS associated with urgency or passive fecal incontinence, but it may be less effective for treating ODS associated with both urgency and passive fecal incontinence.

  17. Hemispherectomy for catastrophic epilepsy in infants.

    PubMed

    González-Martínez, Jorge A; Gupta, Ajay; Kotagal, Prakash; Lachhwani, Deepak; Wyllie, Elaine; Lüders, Hans O; Bingaman, William E

    2005-09-01

    To report our experience with hemispherectomy in the treatment of catastrophic epilepsy in children younger than 2 years. In a single-surgeon series, we performed a retrospective analysis of 18 patients with refractory epilepsy undergoing hemispherectomy (22 procedures). Three different surgical techniques were performed: anatomic hemispherectomy, functional hemispherectomy, and modified anatomic hemispherectomy. Pre- and postoperative evaluations included extensive video-EEG monitoring, magnetic resonance imaging, and positron emission tomography scanning. Seizure outcome was correlated with possible variables associated with persistent postoperative seizures. Generalized estimating equations (GEE) and Barnard's exact test were used for statistical analysis. The follow-up was 12-74 months (mean, 34.8 months). Mean weight was 9.3 kg (range, 6-12.3 kg). The population age was 3-22 months (mean, 11.7 months). Thirteen (66%) patients were seizure free, and four patients had >90% reduction in seizure frequency and intensity. The overall complication rate was 16.7%. No deaths occurred. Twelve (54.5%) of 22 procedures resulted in incomplete disconnection, evidenced on postoperative images. Type of surgical procedure, diagnosis category, persistence of insular cortex, and bilateral interictal epileptiform activity were not associated with persistent seizures after surgery. Incomplete disconnection was the only variable statistically associated with persistent seizures after surgery (p<0.05). Hemispherectomy for seizure control provides excellent and dramatic results with a satisfactory complication rate. Our results support the concept that early surgery should be indicated in highly selected patients with catastrophic epilepsy. Safety factors such as an expert team in the pediatric intensive care unit, neuroanesthesia, and a pediatric epilepsy surgeon familiar with the procedure are mandatory.

  18. Improving education: just-in-time splinting video.

    PubMed

    Wang, Vincent; Cheng, Yu-Tsun; Liu, Deborah

    2016-06-01

    Just-in-time training (JITT) is an emerging concept in medical procedural education, but with few studies to support its routine use. Providing a brief educational intervention in the form of a digital video immediately prior to patient care may be an effective method to reteach procedural techniques learned previously. Paediatric resident physicians were taught to perform a volar splint in a small workshop setting. Subsequently, they were asked to demonstrate their splinting proficiency by performing a splint on another doctor. Proficiency was scored on a five-point assessment tool. After 2-12 months, participants were asked to demonstrate their splinting proficiency on one of the investigators, and were divided into a control group (no further instruction) and an intervention group, which viewed a 3-minute JITT digital video demonstrating the splinting technique prior to performing the procedure. Thirty subjects were enrolled between August 2012 and July 2013, and 29 of 30 completed the study. The retest splinting time was not significantly different between groups, but when the time spent watching the video was included for the JITT group, the total time difference was statistically significant: 3.86 minutes (control) versus 7.07 minutes (JITT) (95% confidence interval: 2.20-3.90 minutes). The average assessment score was 1.87 points higher for the JITT group, a statistically significant difference (95% confidence interval: 1.00-3.00). JITT seems to be an effective tool in medical education for reinforcing previously learned skills, and it may offer other possibilities for enhancing medical education. © 2015 John Wiley & Sons Ltd.

  19. The prognostic value of clinical characteristics and parameters of cerebrospinal fluid hydrodynamics in shunting for idiopathic normal pressure hydrocephalus.

    PubMed

    Delwel, E J; de Jong, D A; Avezaat, C J J

    2005-10-01

    It is difficult to predict which patients with symptoms and radiological signs of normal pressure hydrocephalus (NPH) will benefit from a shunting procedure and which patients will not. The risk of this procedure is also higher in patients with NPH than in the overall population of hydrocephalic patients. The aim of this study was to investigate which clinical characteristics, CT parameters, and parameters of cerebrospinal fluid dynamics could predict improvement after shunting. Eighty-three consecutive patients with symptoms and radiological signs of NPH were included in a prospective study. Parameters of cerebrospinal fluid dynamics were calculated from computerised data obtained by a constant-flow lumbar infusion test. Sixty-six patients considered candidates for surgery were treated with a medium-pressure Spitz-Holter valve; in seventeen patients a shunting procedure was not considered indicated. Clinical and radiological follow-up was performed for at least one year postoperatively. Neither the odds ratios, the sensitivity and specificity, nor the positive and negative predictive values of individual parameters or of combinations of measured parameters showed a statistically significant relation to clinical improvement after shunting. We conclude that neither individual parameters nor combinations of measured parameters show any statistically significant relation to clinical improvement following shunting procedures in patients suspected of NPH. We suggest restricting the term normal pressure hydrocephalus to cases that improve after shunting, and using the term normal pressure hydrocephalus syndrome for patients suspected of NPH and for patients not improving after implantation of a proven well-functioning shunt.

  20. Validation and extraction of molecular-geometry information from small-molecule databases.

    PubMed

    Long, Fei; Nicholls, Robert A; Emsley, Paul; Gražulis, Saulius; Merkys, Andrius; Vaitkus, Antanas; Murshudov, Garib N

    2017-02-01

    A freely available small-molecule structure database, the Crystallography Open Database (COD), is used for the extraction of molecular-geometry information on small-molecule compounds. The results are used for the generation of new ligand descriptions, which are subsequently used by macromolecular model-building and structure-refinement software. To increase the reliability of the derived data, and therefore of the new ligand descriptions, the entries from this database were subjected to very strict validation. The selection criteria ensured that the crystal structures used to derive atom types and bond and angle classes were of sufficiently high quality. Any entries that were suspicious at the crystal or molecular level were removed from further consideration. The selection criteria included (i) the resolution of the data used for refinement (entries solved at 0.84 Å resolution or higher) and (ii) the structure-solution method (structures must be from a single-crystal experiment and all atoms of generated molecules must have full occupancies), as well as basic sanity checks such as (iii) consistency between the valences and the number of connections between atoms, (iv) acceptable bond-length deviations from the expected values and (v) detection of atomic collisions. The derived atom types and bond classes were then validated using high-order moment-based statistical techniques. The results of the statistical analyses were fed back to fine-tune the atom typing. The developed procedure was repeated four times, resulting in fine-grained atom typing and bond and angle classes. The procedure will be repeated in the future as and when new entries are deposited in the COD. The whole procedure can also be applied to any source of small-molecule structures, including the Cambridge Structural Database and the ZINC database.

  1. Finnish upper secondary students' collaborative processes in learning statistics in a CSCL environment

    NASA Astrophysics Data System (ADS)

    Kaleva Oikarinen, Juho; Järvelä, Sanna; Kaasila, Raimo

    2014-04-01

    This design-based research project focuses on documenting statistical learning among 16-17-year-old Finnish upper secondary school students (N = 78) in a computer-supported collaborative learning (CSCL) environment. One novel value of this study is in reporting the shift from teacher-led mathematical teaching to autonomous small-group learning in statistics. The main aim of this study is to examine how student collaboration occurs in learning statistics in a CSCL environment. The data include material from videotaped classroom observations and the researcher's notes. In this paper, the inter-subjective phenomena of students' interactions in a CSCL environment are analysed by using a contact summary sheet (CSS). The development of the multi-dimensional coding procedure of the CSS instrument is presented. Aptly selected video episodes were transcribed and coded in terms of conversational acts, which were divided into non-task-related and task-related categories to depict students' levels of collaboration. The results show that collaborative learning (CL) can facilitate cohesion and responsibility and reduce students' feelings of detachment in our classless, periodic school system. The interactive .pdf material and collaboration in small groups enable statistical learning. It is concluded that CSCL is one possible method of promoting statistical teaching. CL using interactive materials seems to foster and facilitate statistical learning processes.

  2. NHEXAS PHASE I MARYLAND STUDY--STANDARD OPERATING PROCEDURE FOR EXPLORATORY DATA ANALYSIS AND SUMMARY STATISTICS (D05)

    EPA Science Inventory

    This SOP describes the methods and procedures for two types of QA procedures: spot checks of hand entered data, and QA procedures for co-located and split samples. The spot checks were used to determine whether the error rate goal for the input of hand entered data was being att...

  3. Lasers and losers in the eyes of the law: liability for head and neck procedures.

    PubMed

    Svider, Peter F; Carron, Michael A; Zuliani, Giancarlo F; Eloy, Jean Anderson; Setzen, Michael; Folbe, Adam J

    2014-01-01

    Although some have noted that malpractice litigation may be "plateauing," defensive medical practices are pervasive and make up a considerable proportion of the "indirect" costs that medicolegal issues contribute to our health care system. Accordingly, these trends have spurred considerable interest in characterizing factors that play a role in alleged medical negligence, along with outcomes and awards. To conduct a focused examination of malpractice litigation regarding laser procedures in the head and neck and to determine the reasons for initiating litigation as well as outcomes and awards. Retrospective analysis of the WestlawNext legal database, encompassing publicly available federal and state court records, to identify malpractice cases involving laser procedures in the head and neck. Outcomes, awards, defendant specialty, and other allegations. Most cases (28 [82%]) included in this analysis involved female plaintiffs. Of 34 cases, 19 (56%) were resolved with a defendant verdict. The median indemnity was $150 000, and dermatologists, otolaryngologists, and plastic surgeons were the most commonly named defendants. The most common procedures were performed for age-related changes, acne scarring, hair removal, and vascular lesions, although there were also several rhinologic and airway cases. Of all cases, 25 (74%) involved cutaneous procedures, and common allegations included permanent injury (24 cases [71%]), disfigurement/scarring (23 [68%]), inadequate informed consent (17 [50%]), unnecessary/inappropriate procedure (15 [44%]), and burns (11 [32%]). Noncutaneous procedures trended toward higher median payments ($600 000 vs $103 000), although this comparison did not reach statistical significance (P = .09). Procedures using lasers represent a potential target for malpractice litigation should an adverse event occur. 
Although cutaneous/cosmetic procedures were noted among cases included in this analysis, as well as other head and neck interventions, otolaryngologists were more likely to be named as defendants in the latter category. Although cases had modest indemnities compared with prior analyses, the potential for significant amounts was present. Inclusion in the informed consent process of the specific factors detailed in this analysis may potentially decrease liability. In addition, physicians and patients should undergo comprehensive discussion regarding expectations as well as contingencies should adverse events occur.

  4. Shelf Acetabuloplasty in the Treatment of Severe Legg-Calvé-Perthes Disease: Good Outcomes at Midterm Follow-Up

    PubMed Central

    Grzegorzewski, Andrzej; Synder, Marek; Kmieć, Krzysztof; Krajewski, Karol; Polguj, Michał; Sibiński, Marcin

    2013-01-01

    The aim of the study was to retrospectively review the results of operative treatment for coverage deficit of the femoral head in children with severe epiphysis displacement in Legg-Calvé-Perthes (LCP) disease. The material included 23 shelf acetabuloplasty procedures for LCP disease. The average age at diagnosis was 8.1 years (range 4–12). Mean follow-up was 5.8 years (range from 2.2 to 11.2 years). The mean Reimer's index decreased statistically significantly, from 32% before surgery to 10.0% at the last follow-up (P < 0.00001). The mean Wiberg center-edge angle also increased statistically significantly, from 17.3° before the procedure to 32.3° at the last follow-up (P < 0.00001). According to the Stulberg classification, type I was observed in 2, type II in 13, type III in 6, and type IV in 2 hips. There were no differences in range of motion or leg length discrepancy between the preoperative and postoperative assessments. Partial, nonsignificant bone graft resorption was noted in 6 cases in the first 6–9 months after surgery. To conclude, shelf acetabuloplasty achieves good midterm results in the treatment of severe stages of LCP disease. The procedure improves coverage of the femoral head and allows its remodelling. PMID:24377097

  5. 24 CFR 180.650 - Public document items.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... AND BUSINESS OPPORTUNITY CONSOLIDATED HUD HEARING PROCEDURES FOR CIVIL RIGHTS MATTERS Procedures at..., opinion, or published scientific or economic statistical data issued by any of the executive departments...

  6. Estimating Selected Streamflow Statistics Representative of 1930-2002 in West Virginia

    USGS Publications Warehouse

    Wiley, Jeffrey B.

    2008-01-01

    Regional equations and procedures were developed for estimating 1-, 3-, 7-, 14-, and 30-day 2-year; 1-, 3-, 7-, 14-, and 30-day 5-year; and 1-, 3-, 7-, 14-, and 30-day 10-year hydrologically based low-flow frequency values for unregulated streams in West Virginia. Regional equations and procedures also were developed for estimating the 1-day, 3-year and 4-day, 3-year biologically based low-flow frequency values; the U.S. Environmental Protection Agency harmonic-mean flows; and the 10-, 25-, 50-, 75-, and 90-percent flow-duration values. Regional equations were developed by ordinary least-squares regression, with statistics from 117 U.S. Geological Survey continuous streamflow-gaging stations as dependent variables and basin characteristics as independent variables. Equations for three regions in West Virginia - North, South-Central, and Eastern Panhandle - were determined. Drainage area, precipitation, and longitude of the basin centroid are significant independent variables in one or more of the equations. Estimating procedures are presented for determining statistics at a gaging station, a partial-record station, and an ungaged location. Examples of some estimating procedures are presented.
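As a rough illustration of how such regional equations are fitted (the report's actual equations, coefficients, and station data are not reproduced here), one can regress log-transformed low-flow statistics on basin characteristics by ordinary least squares. Every number below is invented for illustration only; the variable names are hypothetical.

```python
import numpy as np

# Hypothetical regional regression of the form
#   log10(Q7,10) = b0 + b1 * log10(drainage_area) + b2 * precip
# fitted by ordinary least squares, in the spirit of regional
# low-flow studies. Data values are made up for illustration.
area   = np.array([ 25.,  60., 120., 300., 550.])   # drainage area, mi^2
precip = np.array([ 40.,  44.,  42.,  48.,  46.])   # mean annual precip, in
q7_10  = np.array([ 1.2,  3.5,  6.0, 20.0, 35.0])   # 7-day 10-year low flow, ft^3/s

X = np.column_stack([np.ones_like(area), np.log10(area), precip])
y = np.log10(q7_10)
coef, *_ = np.linalg.lstsq(X, y, rcond=None)  # [b0, b1, b2]

def estimate_q7_10(drainage_area, mean_precip):
    """Apply the fitted regional equation at an ungaged site."""
    logq = coef[0] + coef[1] * np.log10(drainage_area) + coef[2] * mean_precip
    return float(10.0 ** logq)
```

Real regional equations are developed per region, with screened station records and tested residuals; this sketch only shows the mechanical fitting step.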

  7. PROMISE: a tool to identify genomic features with a specific biologically interesting pattern of associations with multiple endpoint variables

    PubMed Central

    Pounds, Stan; Cheng, Cheng; Cao, Xueyuan; Crews, Kristine R.; Plunkett, William; Gandhi, Varsha; Rubnitz, Jeffrey; Ribeiro, Raul C.; Downing, James R.; Lamba, Jatinder

    2009-01-01

    Motivation: In some applications, prior biological knowledge can be used to define a specific pattern of association of multiple endpoint variables with a genomic variable that is biologically most interesting. However, to our knowledge, there is no statistical procedure designed to detect specific patterns of association with multiple endpoint variables. Results: Projection onto the most interesting statistical evidence (PROMISE) is proposed as a general procedure to identify genomic variables that exhibit a specific biologically interesting pattern of association with multiple endpoint variables. Biological knowledge of the endpoint variables is used to define a vector that represents the biologically most interesting values for statistics that characterize the associations of the endpoint variables with a genomic variable. A test statistic is defined as the dot-product of the vector of the observed association statistics and the vector of the most interesting values of the association statistics. By definition, this test statistic is proportional to the length of the projection of the observed vector of correlations onto the vector of most interesting associations. Statistical significance is determined via permutation. In simulation studies and an example application, PROMISE shows greater statistical power to identify genes with the interesting pattern of associations than classical multivariate procedures, individual endpoint analyses or listing genes that have the pattern of interest and are significant in more than one individual endpoint analysis. Availability: Documented R routines are freely available from www.stjuderesearch.org/depts/biostats and will soon be available as a Bioconductor package from www.bioconductor.org. Contact: stanley.pounds@stjude.org Supplementary information: Supplementary data are available at Bioinformatics online. PMID:19528086
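A minimal sketch of the PROMISE idea follows, assuming Pearson correlations as the per-endpoint association statistics (the published method supports other association statistics and is distributed as the documented R routines cited above; this Python version is only illustrative).

```python
import numpy as np

rng = np.random.default_rng(0)

def promise_stat(genomic, endpoints, interest):
    """PROMISE-style statistic: associations (here, correlations) of one
    genomic variable with each endpoint, projected onto the vector of
    biologically 'most interesting' association values."""
    r = np.array([np.corrcoef(genomic, e)[0, 1] for e in endpoints])
    return float(np.dot(r, interest))

def promise_pvalue(genomic, endpoints, interest, n_perm=2000):
    """Two-sided permutation p value: shuffle the genomic variable
    across subjects to break all endpoint associations."""
    obs = promise_stat(genomic, endpoints, interest)
    null = np.array([
        promise_stat(rng.permutation(genomic), endpoints, interest)
        for _ in range(n_perm)
    ])
    return float(np.mean(np.abs(null) >= abs(obs)))
```

A genomic variable whose correlations line up with the interest vector (e.g. positive with response, negative with toxicity) yields a large projection and a small permutation p value, which is the pattern-detection behavior described in the abstract.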

  8. Development and Validation of an Agency for Healthcare Research and Quality Indicator for Mortality After Congenital Heart Surgery Harmonized With Risk Adjustment for Congenital Heart Surgery (RACHS-1) Methodology.

    PubMed

    Jenkins, Kathy J; Koch Kupiec, Jennifer; Owens, Pamela L; Romano, Patrick S; Geppert, Jeffrey J; Gauvreau, Kimberlee

    2016-05-20

    The National Quality Forum previously approved a quality indicator for mortality after congenital heart surgery developed by the Agency for Healthcare Research and Quality (AHRQ). Several parameters of the validated Risk Adjustment for Congenital Heart Surgery (RACHS-1) method were included, but others differed. As part of the National Quality Forum endorsement maintenance process, developers were asked to harmonize the 2 methodologies. Parameters that were identical between the 2 methods were retained. AHRQ's Healthcare Cost and Utilization Project State Inpatient Databases (SID) 2008 were used to select optimal parameters where differences existed, with a goal to maximize model performance and face validity. Inclusion criteria were not changed and included all discharges for patients <18 years with International Classification of Diseases, Ninth Revision, Clinical Modification procedure codes for congenital heart surgery or nonspecific heart surgery combined with congenital heart disease diagnosis codes. The final model includes procedure risk group, age (0-28 days, 29-90 days, 91-364 days, 1-17 years), low birth weight (500-2499 g), other congenital anomalies (Clinical Classifications Software 217, except for 758.xx), multiple procedures, and transfer-in status. Among 17 945 eligible cases in the SID 2008, the c statistic for model performance was 0.82. In the SID 2013 validation data set, the c statistic was 0.82. Risk-adjusted mortality rates by center ranged from 0.9% to 4.1% (5th-95th percentile). Congenital heart surgery programs can now obtain national benchmarking reports by applying AHRQ Quality Indicator software to hospital administrative data, based on the harmonized RACHS-1 method, with high discrimination and face validity. © 2016 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
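The c statistic reported for model performance is the probability that a randomly chosen death received a higher predicted risk than a randomly chosen survivor, with ties counted as half. A small illustrative implementation (not the AHRQ Quality Indicator software):

```python
import numpy as np

def c_statistic(risk, died):
    """Model discrimination (c statistic / AUC): the fraction of
    death-survivor pairs in which the death got the higher predicted
    risk, counting tied predictions as half a concordant pair."""
    risk = np.asarray(risk, float)
    died = np.asarray(died, bool)
    pos, neg = risk[died], risk[~died]          # deaths vs survivors
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return float((greater + 0.5 * ties) / (len(pos) * len(neg)))
```

A value of 0.82, as in the SID 2008 and 2013 data sets, means the model ranks a randomly selected death above a randomly selected survivor 82% of the time.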

  9. Right Brodmann area 18 predicts tremor arrest after Vim radiosurgery: a voxel-based morphometry study.

    PubMed

    Tuleasca, Constantin; Witjas, Tatiana; Van de Ville, Dimitri; Najdenovska, Elena; Verger, Antoine; Girard, Nadine; Champoudry, Jerome; Thiran, Jean-Philippe; Cuadra, Meritxell Bach; Levivier, Marc; Guedj, Eric; Régis, Jean

    2018-03-01

    Drug-resistant essential tremor (ET) can benefit from open standard stereotactic procedures, such as deep-brain stimulation or radiofrequency thalamotomy. Non-surgical candidates can be offered either high-intensity focused ultrasound (HIFU) or radiosurgery (RS). All procedures aim to target the same thalamic site, the ventro-intermediate nucleus (Vim). The mechanisms by which tremor stops after Vim RS or HIFU remain unknown. We used voxel-based morphometry (VBM) on pretherapeutic neuroimaging data and assessed which anatomical site would best correlate with tremor arrest 1 year after Vim RS. Fifty-two patients (30 male, 22 female; mean age 71.6 years, range 49-82) with right-sided ET benefited from left unilateral Vim RS in Marseille, France. Targeting was performed in a uniform manner, using 130 Gy and a single 4-mm collimator. Neurological (pretherapeutic and at 1 year) and neuroimaging (baseline) assessments were completed. Tremor score on the treated hand (TSTH) at 1 year after Vim RS was included in a statistical parametric mapping analysis of variance (ANOVA) model as a continuous variable with pretherapeutic neuroimaging data. Pretherapeutic gray matter density (GMD) was further correlated with TSTH improvement. No a priori hypothesis was used in the statistical model. The only statistically significant region was right Brodmann area (BA) 18 (visual association area V2; p = 0.05, cluster size Kc = 71). Higher baseline GMD correlated with better TSTH improvement at 1 year after Vim RS (Spearman's rank correlation, p = 0.002). Routine baseline structural neuroimaging predicts TSTH improvement 1 year after Vim RS. The relevant anatomical area is the right visual association cortex (BA 18, V2). The question of whether visual areas should be included in the targeting remains open.

  10. Differences in Temperature Changes in Premature Infants During Invasive Procedures in Incubators and Radiant Warmers.

    PubMed

    Handhayanti, Ludwy; Rustina, Yeni; Budiati, Tri

    Premature infants tend to lose heat quickly, and this loss can be aggravated when they undergo an invasive procedure involving a venous puncture. This research used a crossover design, administering 2 interventions to the same sample in order to compare 2 different treatments, and involved 2 groups of 18 premature infants each. Data were analyzed with an independent t test. For interventions conducted in an open incubator, a p value of .001 indicated a statistically significant association with heat loss in premature infants; for the radiant warmer, a p value of .001 indicated a significantly different range of heat gain before and after the venous puncture. The radiant warmer protected the premature infants from hypothermia during the invasive procedure. However, it is inadvisable for routine care of newborn infants, since it can increase insensible water loss.

  11. Intercomparison of textural parameters of intertidal sediments generated by different statistical procedures, and implications for a unifying descriptive nomenclature

    NASA Astrophysics Data System (ADS)

    Fan, Daidu; Tu, Junbiao; Cai, Guofu; Shang, Shuai

    2015-06-01

    Grain-size analysis is a basic routine in sedimentology and related fields, but diverse methods of sample collection, processing and statistical analysis often make direct comparisons and interpretations difficult or even impossible. In this paper, 586 published grain-size datasets from the Qiantang Estuary (East China Sea) sampled and analyzed by the same procedures were merged and their textural parameters calculated by a percentile and two moment methods. The aim was to explore which of the statistical procedures performed best in the discrimination of three distinct sedimentary units on the tidal flats of the middle Qiantang Estuary. A Gaussian curve-fitting method served to simulate mixtures of two normal populations having different modal sizes, sorting values and size distributions, enabling a better understanding of the impact of finer tail components on textural parameters, as well as the proposal of a unifying descriptive nomenclature. The results show that percentile and moment procedures yield almost identical results for mean grain size, and that sorting values are also highly correlated. However, more complex relationships exist between percentile and moment skewness (kurtosis), changing from positive to negative correlations when the proportions of the finer populations decrease below 35% (10%). This change results from the overweighting of tail components in moment statistics, which stands in sharp contrast to the underweighting or complete amputation of small tail components by the percentile procedure. Intercomparisons of bivariate plots suggest an advantage of the Friedman & Johnson moment procedure over the McManus moment method in terms of the description of grain-size distributions, and over the percentile method by virtue of a greater sensitivity to small variations in tail components. 
The textural parameter scalings of Folk & Ward were translated into their Friedman & Johnson moment counterparts by application of mathematical functions derived by regression analysis of measured and modeled grain-size data, or by determining the abscissa values of intersections between auxiliary lines running parallel to the x-axis and vertical lines corresponding to the descriptive percentile limits along the ordinate of representative bivariate plots. Twofold limits were extrapolated for the moment statistics in relation to single descriptive terms in the cases of skewness and kurtosis by considering both positive and negative correlations between percentile and moment statistics. The extrapolated descriptive scalings were further validated by examining entire size-frequency distributions simulated by mixing two normal populations of designated modal size and sorting values, but varying in mixing ratios. These were found to match well in most of the proposed scalings, although platykurtic and very platykurtic categories were questionable when the proportion of the finer population was below 5%. Irrespective of the statistical procedure, descriptive nomenclatures should therefore be cautiously used when tail components contribute less than 5% to grain-size distributions.
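The contrast between percentile and moment statistics discussed in this record can be reproduced with a simple two-population simulation. The sketch below mixes a coarse main mode with a 10% finer tail (modal sizes, sorting values, and mixing ratio are arbitrary assumptions, not the paper's values) and computes the Folk & Ward graphic measures alongside conventional moment statistics.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated mixture of two normal grain-size populations (phi units):
# a coarse main mode (90%) plus a finer tail population (10%)
phi = np.concatenate([rng.normal(2.0, 0.4, 9000),
                      rng.normal(5.0, 0.6, 1000)])

def folk_ward(phi):
    """Graphic (percentile) mean, sorting, and skewness of Folk & Ward."""
    p5, p16, p25, p50, p75, p84, p95 = np.percentile(
        phi, [5, 16, 25, 50, 75, 84, 95])
    mean = (p16 + p50 + p84) / 3
    sorting = (p84 - p16) / 4 + (p95 - p5) / 6.6
    skew = ((p16 + p84 - 2 * p50) / (2 * (p84 - p16))
            + (p5 + p95 - 2 * p50) / (2 * (p95 - p5)))
    return mean, sorting, skew

def moment_stats(phi):
    """Moment mean, standard deviation, and skewness."""
    m = phi.mean()
    sd = phi.std(ddof=0)
    skew = np.mean(((phi - m) / sd) ** 3)
    return m, sd, skew
```

Running both on the simulated mixture shows the effect described above: the 10% finer tail pulls the moment skewness strongly positive, while the graphic skewness, which truncates the distribution tails at fixed percentiles, responds only weakly.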

  12. Systematic comparisons between PRISM version 1.0.0, BAP, and CSMIP ground-motion processing

    USGS Publications Warehouse

    Kalkan, Erol; Stephens, Christopher

    2017-02-23

    A series of benchmark tests was run comparing results of the Processing and Review Interface for Strong Motion data (PRISM) software version 1.0.0 to the Basic Strong-Motion Accelerogram Processing Software (BAP; Converse and Brady, 1992) and to California Strong Motion Instrumentation Program (CSMIP) processing (Shakal and others, 2003, 2004). These tests were performed by using the MATLAB implementation of PRISM, which is equivalent to its public release version in the Java language. Systematic comparisons were made in the time and frequency domains of records processed in PRISM, BAP, and CSMIP, using a set of representative input motions with varying resolutions, frequency content, and amplitudes. Although the details of strong-motion records vary among the processing procedures, there are only minor differences among the waveforms for each component within the frequency passband common to these procedures. A comprehensive statistical evaluation considering more than 1,800 ground-motion components demonstrates that differences in peak amplitudes of acceleration, velocity, and displacement time series obtained from PRISM and CSMIP processing are equal to or less than 4 percent for 99 percent of the data, and equal to or less than 2 percent for 96 percent of the data. Other statistical measures, including the Euclidean distance (L2 norm) and the windowed root-mean-square level of processed time series, also indicate that both processing schemes produce statistically similar products.
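The comparison measures mentioned here (peak-amplitude percent differences, the Euclidean/L2 distance, and windowed RMS levels) are straightforward to compute. A minimal sketch follows; it is not the PRISM or BAP code, just the measures themselves.

```python
import numpy as np

def l2_distance(a, b):
    """Euclidean (L2) distance between two processed time series."""
    return float(np.sqrt(np.sum((np.asarray(a, float) - np.asarray(b, float)) ** 2)))

def windowed_rms(x, window):
    """Root-mean-square level over non-overlapping windows of a series
    (trailing samples that do not fill a window are dropped)."""
    x = np.asarray(x, float)
    n = (len(x) // window) * window
    segments = x[:n].reshape(-1, window)
    return np.sqrt(np.mean(segments ** 2, axis=1))

def peak_amplitude_diff_percent(a, b):
    """Percent difference in peak absolute amplitude, the quantity used to
    compare peak acceleration, velocity, and displacement values."""
    pa = np.max(np.abs(np.asarray(a, float)))
    pb = np.max(np.abs(np.asarray(b, float)))
    return float(100.0 * abs(pa - pb) / max(pa, pb))
```

Applied to two processed versions of the same record, small values of all three measures indicate the processing schemes produce statistically similar products, which is the conclusion reported above.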

  13. Comparison of Bile Drainage Methods after Laparoscopic CBD Exploration.

    PubMed

    Kwon, Seong Uk; Choi, In Seok; Moon, Ju Ik; Ra, Yu Mi; Lee, Sang Eok; Choi, Won Jun; Yoon, Dae Sung; Min, Hyun Sik

    2011-05-01

    T-tube drainage is a major procedure that prevents complications by biliary decompression, but it is accompanied by complications of its own. Therefore, several procedures such as ENBD, PTBD, and antegrade biliary stenting have been attempted, but it remains controversial which procedure is superior, and there is no standard procedure after laparoscopic CBD exploration. We performed this study to ascertain the most appropriate biliary drainage procedure after laparoscopic CBD exploration. From March 2001 to December 2009, 121 patients who underwent laparoscopic CBD exploration at Gunyang University were included for retrospective analysis. The patients were divided into 4 groups according to the type of procedure, and we compared clinical parameters including age and gender, operation time, hospital stay, start of postoperative diet, and complications. There was no difference in age, gender, mean operation time, or postoperative diet between the 4 groups. Hospital stay in the stent group was shorter than in the T-tube group. Ten (7%) complications occurred: 2 in the T-tube group, 3 in the PTBD group, and 5 in the antegrade stent group. There were more complications in the stent group, but the difference was not statistically significant. In 5 cases with remnant CBD stones, endoscopic CBD stone removal was performed in 4 (3 PTBD, 1 stent). In one T-tube case, the stone was removed easily by choledochoscopy through the T-tube. Three migrated or impacted stents were removed by additional endoscopy. Perioperative biliary leakage (1) and peritonitis after T-tube removal (1) were resolved by conservative treatment. T-tube drainage appears to be an appropriate method for patients suspected to have remnant CBD stones. Multiple procedures may be performed on a case-by-case basis, such as performing PTBD first in a suspected cholangitis patient.

  14. Approach for Input Uncertainty Propagation and Robust Design in CFD Using Sensitivity Derivatives

    NASA Technical Reports Server (NTRS)

    Putko, Michele M.; Taylor, Arthur C., III; Newman, Perry A.; Green, Lawrence L.

    2002-01-01

    An implementation of the approximate statistical moment method for uncertainty propagation and robust optimization for a quasi 3-D Euler CFD code is presented. Given uncertainties in statistically independent, random, normally distributed input variables, first- and second-order statistical moment procedures are performed to approximate the uncertainty in the CFD output. Efficient calculation of both first- and second-order sensitivity derivatives is required. In order to assess the validity of the approximations, these moments are compared with statistical moments generated through Monte Carlo simulations. The uncertainties in the CFD input variables are also incorporated into a robust optimization procedure. For this optimization, statistical moments involving first-order sensitivity derivatives appear in the objective function and system constraints. Second-order sensitivity derivatives are used in a gradient-based search to successfully execute a robust optimization. The approximate methods used throughout the analyses are found to be valid when considering robustness about input parameter mean values.
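The first-order moment method described above, including its check against Monte Carlo, can be sketched with a simple algebraic function standing in for the CFD output; the function and its gradient below are hypothetical, with the gradient playing the role of the sensitivity derivatives:

```python
import numpy as np

# First-order statistical moment method: for an output f(x) with
# independent normal inputs x_i ~ N(mu_i, sigma_i^2),
#   mean(f) ~= f(mu)   and   var(f) ~= sum_i (df/dx_i)^2 * sigma_i^2.
def f(x):
    return x[0] ** 2 + 3.0 * x[1]

def grad_f(x):
    return np.array([2.0 * x[0], 3.0])

mu = np.array([1.0, 2.0])
sigma = np.array([0.05, 0.10])

mean_approx = f(mu)
var_approx = float(np.sum(grad_f(mu) ** 2 * sigma ** 2))

# Validate the approximation against Monte Carlo, as in the paper.
rng = np.random.default_rng(0)
samples = rng.normal(mu, sigma, size=(200_000, 2))
mc = f(samples.T)
```

For mildly nonlinear outputs and small input variances the two estimates agree closely, which is the regime in which the abstract reports the approximation to be valid.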

  15. Statistical Ensemble of Large Eddy Simulations

    NASA Technical Reports Server (NTRS)

    Carati, Daniele; Rogers, Michael M.; Wray, Alan A.; Mansour, Nagi N. (Technical Monitor)

    2001-01-01

    A statistical ensemble of large eddy simulations (LES) is run simultaneously for the same flow. The information provided by the different large scale velocity fields is used to propose an ensemble averaged version of the dynamic model. This produces local model parameters that only depend on the statistical properties of the flow. An important property of the ensemble averaged dynamic procedure is that it does not require any spatial averaging and can thus be used in fully inhomogeneous flows. Also, the ensemble of LES's provides statistics of the large scale velocity that can be used for building new models for the subgrid-scale stress tensor. The ensemble averaged dynamic procedure has been implemented with various models for three flows: decaying isotropic turbulence, forced isotropic turbulence, and the time developing plane wake. It is found that the results are almost independent of the number of LES's in the statistical ensemble provided that the ensemble contains at least 16 realizations.

  16. The Society of Thoracic Surgeons Congenital Heart Surgery Database Mortality Risk Model: Part 1—Statistical Methodology

    PubMed Central

    O’Brien, Sean M.; Jacobs, Jeffrey P.; Pasquali, Sara K.; Gaynor, J. William; Karamlou, Tara; Welke, Karl F.; Filardo, Giovanni; Han, Jane M.; Kim, Sunghee; Shahian, David M.; Jacobs, Marshall L.

    2016-01-01

    Background This study’s objective was to develop a risk model incorporating procedure type and patient factors to be used for case-mix adjustment in the analysis of hospital-specific operative mortality rates after congenital cardiac operations. Methods Included were patients of all ages undergoing cardiac operations, with or without cardiopulmonary bypass, at centers participating in The Society of Thoracic Surgeons Congenital Heart Surgery Database during January 1, 2010, to December 31, 2013. Excluded were isolated patent ductus arteriosus closures in patients weighing less than or equal to 2.5 kg, centers with more than 10% missing data, and patients with missing data for key variables. Data from the first 3.5 years were used for model development, and data from the last 0.5 year were used for assessing model discrimination and calibration. Potential risk factors were proposed based on expert consensus and selected after empirically comparing a variety of modeling options. Results The study cohort included 52,224 patients from 86 centers with 1,931 deaths (3.7%). Covariates included in the model were primary procedure, age, weight, and 11 additional patient factors reflecting acuity status and comorbidities. The C statistic in the validation sample was 0.858. Plots of observed-vs-expected mortality rates revealed good calibration overall and within subgroups, except for a slight overestimation of risk in the highest decile of predicted risk. Removing patient preoperative factors from the model reduced the C statistic to 0.831 and affected the performance classification for 12 of 86 hospitals. Conclusions The risk model is well suited to adjust for case mix in the analysis and reporting of hospital-specific mortality for congenital heart operations. Inclusion of patient factors added useful discriminatory power and reduced bias in the calculation of hospital-specific mortality metrics. PMID:26245502
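The C statistic reported above is the probability that a randomly chosen death received a higher predicted risk than a randomly chosen survivor. A minimal sketch, with hypothetical predicted risks rather than anything from the STS model:

```python
import numpy as np

def c_statistic(predicted_risk, died):
    """C statistic (concordance index): probability that a randomly
    chosen death had a higher predicted risk than a randomly chosen
    survivor, counting ties as one half."""
    risk = np.asarray(predicted_risk, dtype=float)
    died = np.asarray(died, dtype=bool)
    diff = risk[died][:, None] - risk[~died][None, :]
    return (np.sum(diff > 0) + 0.5 * np.sum(diff == 0)) / diff.size

# Hypothetical predictions for six patients (1 = operative mortality).
risk = [0.01, 0.02, 0.05, 0.10, 0.40, 0.60]
died = [0, 0, 0, 1, 0, 1]
c = c_statistic(risk, died)
```

A value of 0.5 indicates no discrimination and 1.0 perfect discrimination, so the validation value of 0.858 reported above reflects strong case-mix discrimination.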

  17. Using Online Databases to Determine the Correlation between Ranked Lists of Journals.

    DTIC Science & Technology

    1984-12-30

    ...Communications Agency. The purpose of the study was to use citation analysis and statistical testing in journal selection. Bibliographic databases...sources to justify the cost of the journals selected. The research procedures used in this study included the compilation of a list of 157 technical

  18. The clean restaurant. II: Employee hygiene.

    PubMed

    Weinstein, J

    1991-05-15

    Poor personal hygiene causes more than 90% of the sanitation problems in the foodservice industry. Government statistics show improper hand washing alone accounts for more than 25% of all foodborne illnesses. In Part II of R&I's sanitation series, experts describe in detail proper procedures for reducing cross-contamination in the workplace and suggest ways to deal with a new problem--style vs. safety, including what apparel, jewelry, cosmetics and hair styles can and cannot be worn on the job.

  19. Application of statistical process control and process capability analysis procedures in orbiter processing activities at the Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Safford, Robert R.; Jackson, Andrew E.; Swart, William W.; Barth, Timothy S.

    1994-01-01

    Successful ground processing at KSC requires that flight hardware and ground support equipment conform to specifications at tens of thousands of checkpoints. Knowledge of conformance is an essential requirement for launch. That knowledge of conformance at every requisite point does not, however, enable identification of past problems with equipment, or potential problem areas. This paper describes how the introduction of Statistical Process Control and Process Capability Analysis identification procedures into existing shuttle processing procedures can enable identification of potential problem areas and candidates for improvements to increase processing performance measures. Results of a case study describing application of the analysis procedures to Thermal Protection System processing are used to illustrate the benefits of the approaches described in the paper.
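The SPC and process capability calculations referred to above reduce to a few formulas: Shewhart control limits at plus or minus three standard deviations, and the capability indices Cp and Cpk against specification limits. A sketch with invented checkpoint data and spec limits (nothing here is from the KSC case study):

```python
import numpy as np

def individuals_limits(x):
    """Shewhart individuals-chart control limits at +/- 3 sigma."""
    m, s = np.mean(x), np.std(x, ddof=1)
    return m - 3.0 * s, m + 3.0 * s

def capability(x, lsl, usl):
    """Process capability indices Cp and Cpk against spec limits:
    Cp compares spec width to process spread; Cpk also penalizes
    off-center processes."""
    m, s = np.mean(x), np.std(x, ddof=1)
    cp = (usl - lsl) / (6.0 * s)
    cpk = min(usl - m, m - lsl) / (3.0 * s)
    return cp, cpk

# Hypothetical checkpoint measurements (e.g. a bond-line thickness
# in mm) with specification limits of 8 to 12 mm.
rng = np.random.default_rng(1)
readings = rng.normal(10.0, 0.5, size=200)
lcl, ucl = individuals_limits(readings)
cp, cpk = capability(readings, lsl=8.0, usl=12.0)
```

Points outside the control limits flag potential problem areas; a Cpk well below Cp flags a process that conforms but is drifting off center, i.e., a candidate for improvement before any checkpoint actually fails.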

  20. On fitting generalized linear mixed-effects models for binary responses using different statistical packages.

    PubMed

    Zhang, Hui; Lu, Naiji; Feng, Changyong; Thurston, Sally W; Xia, Yinglin; Zhu, Liang; Tu, Xin M

    2011-09-10

    The generalized linear mixed-effects model (GLMM) is a popular paradigm to extend models for cross-sectional data to a longitudinal setting. When applied to modeling binary responses, different software packages and even different procedures within a package may give quite different results. In this report, we describe the statistical approaches that underlie these different procedures and discuss their strengths and weaknesses when applied to fit correlated binary responses. We then illustrate these considerations by applying these procedures implemented in some popular software packages to simulated and real study data. Our simulation results indicate a lack of reliability for most of the procedures considered, which carries significant implications for applying such popular software packages in practice. Copyright © 2011 John Wiley & Sons, Ltd.

  1. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics

    PubMed Central

    2016-01-01

    Background We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. Objective To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. Methods The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Results Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. 
Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix. Conclusions IBMWA is a new alternative for data analytics software that automates descriptive, predictive, and visual analytics. This program is very user-friendly but requires data preprocessing, statistical conceptual understanding, and domain expertise. PMID:27729304

  2. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics.

    PubMed

    Hoyt, Robert Eugene; Snider, Dallas; Thompson, Carla; Mantravadi, Sarita

    2016-10-11

    We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix. 
IBMWA is a new alternative for data analytics software that automates descriptive, predictive, and visual analytics. This program is very user-friendly but requires data preprocessing, statistical conceptual understanding, and domain expertise.

  3. SU-G-IeP3-05: Effects of Image Receptor Technology and Dose Reduction Software On Radiation Dose Estimates for Fluoroscopically-Guided Interventional (FGI) Procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merritt, Z; Dave, J; Eschelman, D

    Purpose: To investigate the effects of image receptor technology and dose reduction software on radiation dose estimates for the most frequently performed fluoroscopically-guided interventional (FGI) procedures at a tertiary health care center. Methods: IRB approval was obtained for retrospective analysis of FGI procedures performed in the interventional radiology suites between January-2011 and December-2015. This included procedures performed using image-intensifier (II) based systems which were subsequently replaced, flat-panel-detector (FPD) based systems which were later upgraded with ClarityIQ dose reduction software (Philips Healthcare), and a relatively new FPD system already equipped with ClarityIQ. Post procedure, technologists entered system-reported cumulative air kerma (CAK) and kerma-area product (KAP; only KAP for II based systems) in RIS; these values were analyzed. Data pre-processing included correcting typographical errors and cross-verifying CAK and KAP. The most frequent high and low dose FGI procedures were identified and corresponding CAK and KAP values were compared. Results: Out of 27,251 procedures within this time period, the most frequent high and low dose procedures were chemo/immuno-embolization (n=1967) and abscess drainage (n=1821). Mean KAP for embolization and abscess drainage procedures were 260,657, 310,304 and 94,908 mGy·cm², and 14,497, 15,040 and 6307 mGy·cm², using II-, FPD- and FPD with ClarityIQ- based systems, respectively. Statistically significant differences were observed in KAP values for embolization procedures with respect to different systems, but for abscess drainage procedures significant differences were only noted between systems with FPD and FPD with ClarityIQ (p<0.05). Mean CAK reduced significantly from 823 to 308 mGy and from 43 to 21 mGy for embolization and abscess drainage procedures, respectively, in transitioning to FPD systems with ClarityIQ (p<0.05). Conclusion: While transitioning from II- to FPD- based systems was not associated with dose reduction for the most frequently performed FGI procedures, substantial dose reduction was noted with relatively newer systems and dose reduction software.

  4. The Statistical Power of Planned Comparisons.

    ERIC Educational Resources Information Center

    Benton, Roberta L.

    Basic principles underlying statistical power are examined; and issues pertaining to effect size, sample size, error variance, and significance level are highlighted via the use of specific hypothetical examples. Analysis of variance (ANOVA) and related methods remain popular, although other procedures sometimes have more statistical power against…

  5. A Statistical Test for Comparing Nonnested Covariance Structure Models.

    ERIC Educational Resources Information Center

    Levy, Roy; Hancock, Gregory R.

    While statistical procedures are well known for comparing hierarchically related (nested) covariance structure models, statistical tests for comparing nonhierarchically related (nonnested) models have proven more elusive. While isolated attempts have been made, none exists within the commonly used maximum likelihood estimation framework, thereby…

  6. Monitoring Statistics Which Have Increased Power over a Reduced Time Range.

    ERIC Educational Resources Information Center

    Tang, S. M.; MacNeill, I. B.

    1992-01-01

    The problem of monitoring trends for changes at unknown times is considered. Statistics that permit one to focus high power on a segment of the monitored period are studied. Numerical procedures are developed to compute the null distribution of these statistics. (Author)

  7. Improving the Crossing-SIBTEST Statistic for Detecting Non-uniform DIF.

    PubMed

    Chalmers, R Philip

    2018-06-01

    This paper demonstrates that, after applying a simple modification to Li and Stout's (Psychometrika 61(4):647-677, 1996) CSIBTEST statistic, an improved variant of the statistic could be realized. It is shown that this modified version of CSIBTEST has a more direct association with the SIBTEST statistic presented by Shealy and Stout (Psychometrika 58(2):159-194, 1993). In particular, the asymptotic sampling distributions and general interpretation of the effect size estimates are the same for SIBTEST and the new CSIBTEST. Given the more natural connection to SIBTEST, it is shown that Li and Stout's hypothesis testing approach is insufficient for CSIBTEST; thus, an improved hypothesis testing procedure is required. Based on the presented arguments, a new chi-squared-based hypothesis testing approach is proposed for the modified CSIBTEST statistic. Positive results from a modest Monte Carlo simulation study strongly suggest the original CSIBTEST procedure and randomization hypothesis testing approach should be replaced by the modified statistic and hypothesis testing method.

  8. Estimating procedure times for surgeries by determining location parameters for the lognormal model.

    PubMed

    Spangler, William E; Strum, David P; Vargas, Luis G; May, Jerrold H

    2004-05-01

    We present an empirical study of methods for estimating the location parameter of the lognormal distribution. Our results identify the best order statistic to use, and indicate that using the best order statistic instead of the median may lead to less frequent incorrect rejection of the lognormal model, more accurate critical value estimates, and higher goodness-of-fit. Using simulation data, we constructed and compared two models for identifying the best order statistic, one based on conventional nonlinear regression and the other using a data mining/machine learning technique. Better surgical procedure time estimates may lead to improved surgical operations.
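The idea of the paper, estimating the location (shift) parameter of a three-parameter lognormal from an order statistic before fitting the remaining parameters on the log scale, can be sketched as follows. The shift rule used here (just below the sample minimum) and the simulated setup time are illustrative only; the paper's contribution is precisely the empirical choice of the best order statistic:

```python
import numpy as np

# Simulated surgical procedure times: a fixed setup time (location
# parameter) plus a lognormal operative component.
rng = np.random.default_rng(2)
true_shift = 30.0                        # minutes of fixed setup time
times = true_shift + rng.lognormal(mean=3.0, sigma=0.5, size=500)

# Estimate the shift from a low order statistic, then fit the
# remaining lognormal parameters on log(x - shift).
xs = np.sort(times)
shift_hat = xs[0] - (xs[1] - xs[0])      # just below the sample minimum
logs = np.log(times - shift_hat)
mu_hat, sigma_hat = logs.mean(), logs.std(ddof=1)
```

A poorly chosen shift makes the log-transformed data visibly non-normal, which is how a bad location estimate leads to the incorrect rejections of the lognormal model mentioned above.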

  9. Additivity of nonlinear biomass equations

    Treesearch

    Bernard R. Parresol

    2001-01-01

    Two procedures that guarantee the property of additivity among the components of tree biomass and total tree biomass utilizing nonlinear functions are developed. Procedure 1 is a simple combination approach, and procedure 2 is based on nonlinear joint-generalized regression (nonlinear seemingly unrelated regressions) with parameter restrictions. Statistical theory is...

  10. Vertebroplasty versus sham procedure for painful acute osteoporotic vertebral compression fractures (VERTOS IV): randomised sham controlled clinical trial.

    PubMed

    Firanescu, Cristina E; de Vries, Jolanda; Lodder, Paul; Venmans, Alexander; Schoemaker, Marinus C; Smeet, Albert J; Donga, Esther; Juttmann, Job R; Klazen, Caroline A H; Elgersma, Otto E H; Jansen, Frits H; Tielbeek, Alexander V; Boukrab, Issam; Schonenberg, Karen; van Rooij, Willem Jan J; Hirsch, Joshua A; Lohle, Paul N M

    2018-05-09

    To assess whether percutaneous vertebroplasty results in more pain relief than a sham procedure in patients with acute osteoporotic compression fractures of the vertebral body. Randomised, double blind, sham controlled clinical trial. Four community hospitals in the Netherlands, 2011-15. 180 participants requiring treatment for acute osteoporotic vertebral compression fractures were randomised to either vertebroplasty (n=91) or a sham procedure (n=89). Participants received local subcutaneous lidocaine (lignocaine) and bupivacaine at each pedicle. The vertebroplasty group also received cementation, which was simulated in the sham procedure group. Main outcome measure was mean reduction in visual analogue scale (VAS) scores at one day, one week, and one, three, six, and 12 months. Clinically significant pain relief was defined as a decrease of 1.5 points in VAS scores from baseline. Secondary outcome measures were the differences between groups for changes in the quality of life for osteoporosis and Roland-Morris disability questionnaire scores during 12 months' follow-up. The mean reduction in VAS score was statistically significant in the vertebroplasty and sham procedure groups at all follow-up points after the procedure compared with baseline. The mean difference in VAS scores between groups was 0.20 (95% confidence interval -0.53 to 0.94) at baseline, -0.43 (-1.17 to 0.31) at one day, -0.11 (-0.85 to 0.63) at one week, 0.41 (-0.33 to 1.15) at one month, 0.21 (-0.54 to 0.96) at three months, 0.39 (-0.37 to 1.15) at six months, and 0.45 (-0.37 to 1.24) at 12 months. These changes in VAS scores did not, however, differ statistically significantly between the groups during 12 months' follow-up. The results for secondary outcomes were not statistically significant. Use of analgesics (non-opioids, weak opioids, strong opioids) decreased statistically significantly in both groups at all time points, with no statistically significant differences between groups. 
Two adverse events occurred in the vertebroplasty group: one respiratory insufficiency and one vasovagal reaction. Percutaneous vertebroplasty did not result in statistically significantly greater pain relief than a sham procedure during 12 months' follow-up among patients with acute osteoporotic vertebral compression fractures. ClinicalTrials.gov NCT01200277. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  11. Single-Port Surgery: Laboratory Experience with the daVinci Single-Site Platform

    PubMed Central

    Haber, Georges-Pascal; Kaouk, Jihad; Kroh, Matthew; Chalikonda, Sricharan; Falcone, Tommaso

    2011-01-01

    Background and Objectives: The purpose of this study was to evaluate the feasibility and validity of a dedicated da Vinci single-port platform in the porcine model in the performance of gynecologic surgery. Methods: This pilot study was conducted in 4 female pigs. All pigs had a general anesthetic and were placed in the supine and flank position. A 2-cm umbilical incision was made, through which a robotic single-port device was placed and pneumoperitoneum obtained. A data set was collected for each procedure and included port placement time, docking time, operative time, blood loss, and complications. Operative times were compared between cases and procedures by use of the Student t test. Results: A total of 28 surgical procedures (8 oophorectomies, 4 hysterectomies, 8 pelvic lymph node dissections, 4 aorto-caval nodal dissections, 2 bladder repairs, 1 uterine horn anastomosis, and 1 radical cystectomy) were performed. There was no statistically significant difference in operating times for symmetrical procedures among animals (P=0.3215). Conclusions: This animal study demonstrates that single-port robotic surgery using a dedicated single-site platform allows performing technically challenging procedures within acceptable operative times and without complications or insertion of additional trocars. PMID:21902962

  12. Irradiation-hyperthermia in canine hemangiopericytomas: large-animal model for therapeutic response.

    PubMed

    Richardson, R C; Anderson, V L; Voorhees, W D; Blevins, W E; Inskeep, T K; Janas, W; Shupe, R E; Babbs, C F

    1984-11-01

    Results of irradiation-hyperthermia treatment in 11 dogs with naturally occurring hemangiopericytoma were reported. Similarities of canine and human hemangiopericytomas were described. Orthovoltage X-irradiation followed by microwave-induced hyperthermia resulted in a 91% objective response rate. A statistical procedure was given to evaluate quantitatively the clinical behavior of locally invasive, nonmetastatic tumors in dogs that were undergoing therapy for control of local disease. The procedure used a small sample size and demonstrated distribution of the data on a scaled response as well as transformation of the data through classical parametric and nonparametric statistical methods. These statistical methods set confidence limits on the population mean and placed tolerance limits on a population percentage. Application of the statistical methods to human and animal clinical trials was apparent.

  13. A Recommended Procedure for Estimating the Cosmic-Ray Spectral Parameter of a Simple Power Law With Applications to Detector Design

    NASA Technical Reports Server (NTRS)

    Howell, L. W.

    2001-01-01

    A simple power law model consisting of a single spectral index alpha-1 is believed to be an adequate description of the galactic cosmic-ray (GCR) proton flux at energies below 10(exp 13) eV. Two procedures for estimating alpha-1, the method of moments and maximum likelihood (ML), are developed and their statistical performance compared. It is concluded that the ML procedure attains the most desirable statistical properties and is hence the recommended statistical estimation procedure for estimating alpha-1. The ML procedure is then generalized for application to a set of real cosmic-ray data and thereby makes this approach applicable to existing cosmic-ray data sets. Several other important results, such as the relationship between collecting power and detector energy resolution, as well as inclusion of a non-Gaussian detector response function, are presented. These results have many practical benefits in the design phase of a cosmic-ray detector as they permit instrument developers to make important trade studies in design parameters as a function of one of the science objectives. This is particularly important for space-based detectors where physical parameters, such as dimension and weight, impose rigorous practical limits to the design envelope.
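For an idealized detector with perfect energy resolution, the ML estimator of a simple power-law index has a well-known closed form, which the following sketch demonstrates with simulated events (the index 2.7 and sample size are illustrative, and none of the detector-response refinements from the paper are included):

```python
import numpy as np

# Flux ~ E^(-alpha) above E_min.  The ML estimate from n observed
# energies is  alpha_hat = 1 + n / sum(ln(E_i / E_min)).
rng = np.random.default_rng(3)
alpha_true, e_min, n = 2.7, 1.0, 100_000

# Inverse-CDF sampling from the power law.
u = rng.random(n)
energies = e_min * (1.0 - u) ** (-1.0 / (alpha_true - 1.0))

alpha_hat = 1.0 + n / np.sum(np.log(energies / e_min))
se = (alpha_hat - 1.0) / np.sqrt(n)   # asymptotic standard error
```

The standard error shrinking as 1/sqrt(n) is what links collecting power (event count) to the attainable precision on the spectral index, the trade study highlighted above.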

  14. Forensic analysis of Salvia divinorum using multivariate statistical procedures. Part I: discrimination from related Salvia species.

    PubMed

    Willard, Melissa A Bodnar; McGuffin, Victoria L; Smith, Ruth Waddell

    2012-01-01

    Salvia divinorum is a hallucinogenic herb that is internationally regulated. In this study, salvinorin A, the active compound in S. divinorum, was extracted from S. divinorum plant leaves using a 5-min extraction with dichloromethane. Four additional Salvia species (Salvia officinalis, Salvia guaranitica, Salvia splendens, and Salvia nemorosa) were extracted using this procedure, and all extracts were analyzed by gas chromatography-mass spectrometry. Differentiation of S. divinorum from other Salvia species was successful based on visual assessment of the resulting chromatograms. To provide a more objective comparison, the total ion chromatograms (TICs) were subjected to principal components analysis (PCA). Prior to PCA, the TICs were subjected to a series of data pretreatment procedures to minimize non-chemical sources of variance in the data set. Successful discrimination of S. divinorum from the other four Salvia species was possible based on visual assessment of the PCA scores plot. To provide a numerical assessment of the discrimination, a series of statistical procedures such as Euclidean distance measurement, hierarchical cluster analysis, Student's t tests, Wilcoxon rank-sum tests, and Pearson product moment correlation were also applied to the PCA scores. The statistical procedures were then compared to determine the advantages and disadvantages for forensic applications.
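The pretreatment-plus-PCA pipeline described above can be sketched with simulated chromatograms. The retention times, peak heights, and noise level below are invented, and the "species" are simply two distinct peak patterns:

```python
import numpy as np

# Simulated total ion chromatograms: Gaussian peaks at
# species-specific retention times plus baseline noise.
rng = np.random.default_rng(4)
t = np.linspace(0.0, 10.0, 500)

def tic(peaks):
    y = sum(h * np.exp(-0.5 * ((t - c) / 0.15) ** 2) for c, h in peaks)
    return y + rng.normal(0.0, 0.01, t.size)

species_a = [tic([(3.0, 1.0), (6.0, 0.5)]) for _ in range(5)]
species_b = [tic([(4.0, 0.8), (7.5, 0.6)]) for _ in range(5)]
X = np.array(species_a + species_b)

# Data pretreatment: normalization, then mean centering.
X = X / np.linalg.norm(X, axis=1, keepdims=True)
Xc = X - X.mean(axis=0)

# PCA via SVD; rows of `scores` are the replicates' PC scores.
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * S

# Euclidean distance between class means in the first two PCs --
# one of the numerical assessments applied to the scores.
d = np.linalg.norm(scores[:5, :2].mean(axis=0) - scores[5:, :2].mean(axis=0))
```

When the between-species distance in the scores plot is large relative to the within-species scatter, the visual discrimination described in the abstract can be backed by the numerical measures (distances, cluster analysis, t tests) it lists.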

  15. Effectiveness of nonpharmacological interventions to reduce procedural anxiety in children and adolescents undergoing treatment for cancer: A systematic review and meta-analysis.

    PubMed

    Nunns, Michael; Mayhew, Dominic; Ford, Tamsin; Rogers, Morwenna; Curle, Christine; Logan, Stuart; Moore, Darren

    2018-04-30

    Children and young people (CYP) with cancer undergo painful and distressing procedures. We aimed to systematically review the effectiveness of nonpharmacological interventions to reduce procedural anxiety in CYP. Extensive literature searches sought randomised controlled trials that quantified the effect of any nonpharmacological intervention for procedural anxiety in CYP with cancer aged 0 to 25. Study selection involved independent title and abstract screening and full text screening by two reviewers. Anxiety, distress, fear, and pain outcomes were extracted from included studies. Where similar intervention, comparator, and outcomes presented, meta-analysis was performed, producing pooled effect sizes (Cohen's d) and 95% confidence intervals (95% CI). All other data were narratively described. Quality and risk of bias appraisal was performed, based on the Cochrane risk of bias tool. Screening of 11 727 records yielded 56 relevant full texts. There were 15 included studies, eight trialling hypnosis, and seven nonhypnosis interventions. There were large, statistically significant reductions in anxiety and pain for hypnosis, particularly compared with treatment as usual (anxiety: d = 2.30; 95% CI, 1.30-3.30; P < .001; pain: d = 2.16; 95% CI, 1.41-2.92; P < .001). Evidence from nonhypnosis interventions was equivocal, with some promising individual studies. There was high risk of bias across included studies limiting confidence in some positive effects. Evidence suggests promise for hypnosis interventions to reduce procedural anxiety in CYP undergoing cancer treatment. These results largely emerge from one research group, therefore wider research is required. Promising evidence for individual nonhypnosis interventions must be evaluated through rigorously conducted randomised controlled trials. Copyright © 2018 John Wiley & Sons, Ltd.
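The pooled effect sizes and 95% CIs reported above come from standard meta-analytic formulas. A sketch using inverse-variance (fixed-effect) pooling of Cohen's d, with three entirely hypothetical trials (the review itself does not publish these numbers):

```python
import numpy as np

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Cohen's d from group summaries (group 1 minus group 2, pooled
    SD), plus its approximate sampling variance."""
    sp = np.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    var = (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))
    return d, var

def pool_fixed_effect(ds, vs):
    """Inverse-variance (fixed-effect) pooled d with a 95% CI."""
    w = 1.0 / np.asarray(vs, dtype=float)
    d = float(np.sum(w * np.asarray(ds)) / np.sum(w))
    se = 1.0 / np.sqrt(np.sum(w))
    return d, (d - 1.96 * se, d + 1.96 * se)

# Hypothetical trials: (control mean, SD, n, intervention mean, SD, n)
# on an anxiety scale where lower is better, so d > 0 favours the
# intervention.
trials = [(6.0, 2.1, 25, 3.1, 1.9, 25),
          (5.5, 2.2, 20, 2.8, 2.0, 18),
          (5.2, 2.0, 28, 3.5, 1.8, 30)]
ds, vs = zip(*(cohens_d(*t) for t in trials))
pooled, (lo, hi) = pool_fixed_effect(ds, vs)
```

By the conventional benchmarks (0.2 small, 0.5 medium, 0.8 large), the pooled hypnosis effects above (d around 2.2 to 2.3) are very large, which is why the risk-of-bias caveat matters.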

  16. Health significance and statistical uncertainty. The value of P-value.

    PubMed

    Consonni, Dario; Bertazzi, Pier Alberto

    2017-10-27

    The P-value is widely used as a summary statistic of scientific results. Unfortunately, there is a widespread tendency to dichotomize its value into "P<0.05" (defined as "statistically significant") and "P>0.05" ("statistically not significant"), with the former implying a "positive" result and the latter a "negative" one. To show the unsuitability of such an approach when evaluating the effects of environmental and occupational risk factors. We provide examples of distorted use of the P-value and of the negative consequences for science and public health of such a black-and-white vision. The rigid interpretation of the P-value as a dichotomy favors the confusion between health relevance and statistical significance, discourages thoughtful thinking, and diverts attention from what really matters, the health significance. A much better way to express and communicate scientific results involves reporting effect estimates (e.g., risks, risk ratios, or risk differences) and their confidence intervals (CI), which summarize and convey both health significance and statistical uncertainty. Unfortunately, many researchers do not usually consider the whole confidence interval but only examine whether it includes the null value, thereby degrading this procedure to the same P-value dichotomy (statistically significant or not). When reporting statistical results of scientific research, present effect estimates with their confidence intervals and do not qualify the P-value as "significant" or "not significant".
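
    The recommendation above — report effect estimates with confidence intervals rather than a bare P-value — can be illustrated with a stdlib-only sketch. This is an assumption-laden Wald approximation on the log scale, not the authors' method:

```python
import math

def risk_ratio_ci(a, n1, b, n2, z=1.96):
    """Risk ratio for exposed (a cases of n1) vs unexposed (b cases of n2),
    with a Wald-type 95% CI computed on the log scale."""
    rr = (a / n1) / (b / n2)
    # Approximate standard error of log(RR).
    se_log = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, lo, hi
```

    With 30/100 exposed cases versus 15/100 unexposed, this yields RR = 2.0; the width of the interval, not a significance label, conveys the statistical uncertainty.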

  17. Adaptive graph-based multiple testing procedures

    PubMed Central

    Klinglmueller, Florian; Posch, Martin; Koenig, Franz

    2016-01-01

    Multiple testing procedures defined by directed, weighted graphs have recently been proposed as an intuitive visual tool for constructing multiple testing strategies that reflect the often complex contextual relations between hypotheses in clinical trials. Many well-known sequentially rejective tests, such as (parallel) gatekeeping tests or hierarchical testing procedures, are special cases of the graph-based tests. We generalize these graph-based multiple testing procedures to adaptive trial designs with an interim analysis. These designs permit mid-trial design modifications based on unblinded interim data as well as external information, while providing strong familywise error rate control. To maintain the familywise error rate, it is not required to prespecify the adaptation rule in detail. Because the adaptive test does not require knowledge of the multivariate distribution of test statistics, it is applicable in a wide range of scenarios including trials with multiple treatment comparisons, endpoints or subgroups, or combinations thereof. Examples of adaptations are dropping of treatment arms, selection of subpopulations, and sample size reassessment. If, in the interim analysis, it is decided to continue the trial as planned, the adaptive test reduces to the originally planned multiple testing procedure. Only if adaptations are actually implemented does an adjusted test need to be applied. The procedure is illustrated with a case study and its operating characteristics are investigated by simulations. PMID:25319733
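
    A minimal sketch of a (non-adaptive) sequentially rejective graphical procedure of the kind the abstract builds on: hypotheses carry weights, and a transition matrix redistributes the weight of each rejected hypothesis. The update rule follows the commonly published form; the function name and structure are my own, not the paper's code:

```python
def graph_test(p, w, G, alpha=0.025):
    """Sequentially rejective graphical multiple testing procedure.
    p: p-values; w: initial weights (summing to <= 1); G: transition matrix
    with G[i][j] = fraction of H_i's weight passed to H_j on rejection."""
    m = len(p)
    active = set(range(m))
    rejected = set()
    w = list(w)
    G = [row[:] for row in G]
    while True:
        # Reject any active hypothesis significant at its current local level.
        cand = [i for i in active if w[i] > 0 and p[i] <= w[i] * alpha]
        if not cand:
            return rejected
        i = cand[0]
        active.discard(i)
        rejected.add(i)
        # Redistribute H_i's weight and update the transition matrix.
        new_w, new_G = w[:], [row[:] for row in G]
        for j in active:
            new_w[j] = w[j] + w[i] * G[i][j]
            for k in active:
                if j == k:
                    continue
                denom = 1 - G[j][i] * G[i][j]
                new_G[j][k] = ((G[j][k] + G[j][i] * G[i][k]) / denom
                               if denom > 0 else 0.0)
        w, G = new_w, new_G
```

    With two hypotheses, equal weights, and full weight transfer between them, this reduces to the Holm procedure, one of the well-known special cases mentioned above.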

  18. Impaired Statistical Learning in Developmental Dyslexia

    PubMed Central

    Thiessen, Erik D.; Holt, Lori L.

    2015-01-01

    Purpose Developmental dyslexia (DD) is commonly thought to arise from phonological impairments. However, an emerging perspective is that a more general procedural learning deficit, not specific to phonological processing, may underlie DD. The current study examined whether individuals with DD are capable of extracting statistical regularities across sequences of passively experienced speech and nonspeech sounds. Such statistical learning is believed to be domain-general, to draw upon procedural learning systems, and to relate to language outcomes. Method DD and control groups were familiarized with a continuous stream of syllables or sine-wave tones, the ordering of which was defined by high or low transitional probabilities across adjacent stimulus pairs. Participants subsequently judged two 3-stimulus test items with either high or low statistical coherence as being the most similar to the sounds heard during familiarization. Results As with control participants, the DD group was sensitive to the transitional probability structure of the familiarization materials, as evidenced by above-chance performance. However, the performance of participants with DD was significantly poorer than that of controls across linguistic and nonlinguistic stimuli. In addition, reading-related measures were significantly correlated with statistical learning performance for both speech and nonspeech material. Conclusion Results are discussed in light of procedural learning impairments among participants with DD. PMID:25860795
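
    The transitional-probability structure that participants learn can be made concrete with a small sketch (illustrative only; the actual stimuli were syllable and sine-wave tone streams):

```python
from collections import Counter

def transitional_probabilities(stream):
    """Estimate P(B | A) for each adjacent pair (A, B) in a stimulus stream:
    count of the pair divided by how often A occurs in first position."""
    pairs = Counter(zip(stream, stream[1:]))
    firsts = Counter(stream[:-1])
    return {(a, b): c / firsts[a] for (a, b), c in pairs.items()}
```

    In a stream like A B A B A C, the transition B→A is perfectly predictable (probability 1.0) while A is followed by B only two-thirds of the time; statistical learning paradigms exploit exactly this contrast between high- and low-probability transitions.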

  19. Variability in the Use of Simulation for Procedural Training in Radiology Residency: Opportunities for Improvement.

    PubMed

    Matalon, Shanna A; Chikarmane, Sona A; Yeh, Eren D; Smith, Stacy E; Mayo-Smith, William W; Giess, Catherine S

    2018-03-19

    Increased attention to quality and safety has led to a re-evaluation of the classic apprenticeship model for procedural training. Many have proposed simulation as a supplementary teaching tool. The purpose of this study was to assess radiology resident exposure to procedural training and procedural simulation. An IRB-exempt online survey was distributed to current radiology residents in the United States by e-mail. Survey results were summarized using frequency and percentages. Chi-square tests were used for statistical analysis where appropriate. A total of 353 current residents completed the survey. 37% (n = 129/353) of respondents had never used procedure simulation. Of the residents who had used simulation, most did not do so until after having already performed procedures on patients (59%, n = 132/223). The presence of a dedicated simulation center was reported by over half of residents (56%, n = 196/353) and was associated with prior simulation experience (P = 0.007). Residents who had not had procedural simulation were somewhat likely or highly likely (3 and 4 on a 4-point Likert-scale) to participate if it were available (81%, n = 104/129). Simulation training was associated with higher comfort levels in performing procedures (P < 0.001). Although procedural simulation training is associated with higher comfort levels when performing procedures, there is variable use in radiology resident training and its use is not currently optimized. Given the increased emphasis on patient safety, these results suggest the need to increase procedural simulation use during residency, including an earlier introduction to simulation before patient exposure. Copyright © 2018 Elsevier Inc. All rights reserved.

  20. Intermediate/Advanced Research Design and Statistics

    NASA Technical Reports Server (NTRS)

    Ploutz-Snyder, Robert

    2009-01-01

    The purpose of this module is to provide Institutional Researchers (IRs) with an understanding of the principles of advanced research design and of the intermediate/advanced statistical procedures consistent with such designs.

  1. Analyzing Faculty Salaries When Statistics Fail.

    ERIC Educational Resources Information Center

    Simpson, William A.

    The role played by nonstatistical procedures, in contrast to multivariate statistical approaches, in analyzing faculty salaries is discussed. Multivariate statistical methods are usually used to establish or defend against prima facie cases of gender and ethnic discrimination with respect to faculty salaries. These techniques are not applicable,…

  2. Comparing Assessment Methods in Undergraduate Statistics Courses

    ERIC Educational Resources Information Center

    Baxter, Sarah E.

    2017-01-01

    The purpose of this study was to compare undergraduate students' academic performance and attitudes about statistics in the context of two different types of assessment structures for an introductory statistics course. One assessment structure used in-class quizzes that emphasized computation and procedural fluency as well as vocabulary…

  3. Procedural instruction in invasive bedside procedures: a systematic review and meta-analysis of effective teaching approaches.

    PubMed

    Huang, Grace C; McSparron, Jakob I; Balk, Ethan M; Richards, Jeremy B; Smith, C Christopher; Whelan, Julia S; Newman, Lori R; Smetana, Gerald W

    2016-04-01

    Optimal approaches to teaching bedside procedures are unknown. To identify effective instructional approaches in procedural training. We searched PubMed, EMBASE, Web of Science and Cochrane Library through December 2014. We included research articles that addressed procedural training among physicians or physician trainees for 12 bedside procedures. Two independent reviewers screened 9312 citations and identified 344 articles for full-text review. Two independent reviewers extracted data from full-text articles. We included measurements as classified by translational science outcomes T1 (testing settings), T2 (patient care practices) and T3 (patient/public health outcomes). Due to incomplete reporting, we post hoc classified study outcomes as 'negative' or 'positive' based on statistical significance. We performed meta-analyses of outcomes on the subset of studies sharing similar outcomes. We found 161 eligible studies (44 randomised controlled trials (RCTs), 34 non-RCTs and 83 uncontrolled trials). Simulation was the most frequently published educational mode (78%). Our post hoc classification showed that studies involving simulation, competency-based approaches and RCTs had higher frequencies of T2/T3 outcomes. Meta-analyses showed that simulation (risk ratio (RR) 1.54 vs 0.55 for studies with vs without simulation, p=0.013) and competency-based approaches (RR 3.17 vs 0.89, p<0.001) were effective forms of training. This systematic review of bedside procedural skills demonstrates that the current literature is heterogeneous and of varying quality and rigour. Evidence is strongest for the use of simulation and competency-based paradigms in teaching procedures, and these approaches should be the mainstay of programmes that train physicians to perform procedures. Further research should clarify differences among instructional methods (eg, forms of hands-on training) rather than among educational modes (eg, lecture vs simulation). 
Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  4. Selecting Statistical Procedures for Quality Control Planning Based on Risk Management.

    PubMed

    Yago, Martín; Alcover, Silvia

    2016-07-01

    According to the traditional approach to statistical QC planning, the performance of QC procedures is assessed in terms of their probability of rejecting an analytical run that contains critical size errors (PEDC). Recently, the maximum expected increase in the number of unacceptable patient results reported during the presence of an undetected out-of-control error condition [Max E(NUF)] has been proposed as an alternative QC performance measure because it aligns more closely with the current introduction of risk management concepts for QC planning in the clinical laboratory. We used a statistical model to investigate the relationship between PEDC and Max E(NUF) for simple QC procedures widely used in clinical laboratories and to construct charts relating Max E(NUF) to the capability of the analytical process, which allow for QC planning based on the risk of harm to a patient due to the report of erroneous results. A QC procedure shows nearly the same Max E(NUF) value when used for controlling analytical processes with the same capability, and there is a close relationship between PEDC and Max E(NUF) for simple QC procedures; therefore, the value of PEDC can be estimated from the value of Max E(NUF) and vice versa. QC procedures selected for their high PEDC value are also characterized by a low value of Max E(NUF). The PEDC value can be used for estimating the probability of patient harm, allowing for the selection of appropriate QC procedures in QC planning based on risk management. © 2016 American Association for Clinical Chemistry.
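
    As a hedged illustration of how the rejection probability of a simple QC procedure might be estimated by simulation: the rule assumed here is a 1:3s-type rule (flag the run if any control result falls beyond ±3 SD), and the shift is expressed in SD units. PEDC in the study refers specifically to critical-size errors; this sketch only shows the general mechanics.

```python
import random

def p_detect(shift_sd, n_controls=1, limit=3.0, trials=20000, seed=1):
    """Monte Carlo estimate of the probability that a 1:3s-style QC rule
    (any control beyond +/- limit SD) flags a run with a systematic shift."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        # Each control result: true shift plus unit analytical imprecision.
        if any(abs(rng.gauss(shift_sd, 1.0)) > limit for _ in range(n_controls)):
            hits += 1
    return hits / trials
```

    With no shift the estimate approximates the false-rejection rate of the rule (well under 1%), while a 4 SD shift is detected most of the time; charting such probabilities against process capability is the kind of exercise the paper formalizes.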

  5. Systematic review of general thoracic surgery articles to identify predictors of operating room case durations.

    PubMed

    Dexter, Franklin; Dexter, Elisabeth U; Masursky, Danielle; Nussmeier, Nancy A

    2008-04-01

    Previous studies of operating room (OR) information systems data over the past two decades have shown how to predict case durations using the combination of scheduled procedure(s), individual surgeon and assistant(s), and type of anesthetic(s). We hypothesized that the accuracy of case duration prediction could be improved by the use of other electronic medical record data (e.g., patient weight or surgeon notes using standardized vocabularies). General thoracic surgery was used as a model specialty because much of its workload is elective (scheduled) and many of its cases are long. PubMed was searched for thoracic surgery papers reporting operative time, surgical time, etc. The systematic literature review identified 48 papers reporting statistically significant differences in perioperative times. There were multiple reports of differences in OR times based on the procedure(s), perioperative team including primary surgeon, and type of anesthetic, in that sequence of importance. All such detail may not be known when the case is originally scheduled and thus may require an updated duration the day before surgery. Although the use of these categorical data from OR systems can result in few historical data for estimating each case's duration, bias and imprecision of case duration estimates are unlikely to be affected. There was a report of a difference in case duration based on additional information. However, the incidence of the procedure for the diagnosis was so uncommon as to be unlikely to affect OR management. Matching findings of prior studies using OR information system data, multiple case series show that it is important to rely on the precise procedure(s), surgical team, and type of anesthetic when estimating case durations. OR information systems need to incorporate the statistical methods designed for small numbers of prior surgical cases. 
Future research should focus on the most effective methods to update the prediction of each case's duration as these data become available. The case series did not reveal additional data which could be cost-effectively integrated with OR information systems data to improve the accuracy of predicted durations for general thoracic surgery cases.

  6. Propofol for procedural sedation and analgesia reduced dedicated emergency nursing time while maintaining safety in a community emergency department.

    PubMed

    Reynolds, Joshua C; Abraham, Michael K; Barrueto, Fermin F; Lemkin, Daniel L; Hirshon, Jon M

    2013-09-01

    Procedural sedation and analgesia is a core competency in emergency medicine. Propofol is replacing midazolam in many emergency departments. Barriers to performing procedural sedation include resource utilization. We hypothesized that emergency nursing time is shorter with propofol than with midazolam, without increasing complications. Retrospective analysis of a procedural sedation registry for two community emergency departments with a combined census of 100,000 patients/year. Demographics, procedure, and ASA physical status classification of adult patients receiving procedural sedation with midazolam or propofol between 2007 and 2010 were analyzed. Primary outcome was dedicated emergency nursing time. Secondary outcomes were procedural success, ED length of stay, and complication rate. Comparative statistics were performed with Mann-Whitney, Kruskal-Wallis, chi-square, or Fisher's exact test. Linear regression was performed with log-transformed procedural sedation time to define predictors. Of 328 procedural sedation and analgesia encounters, 316 met inclusion criteria, of which 60 received midazolam and 256 propofol. Sex distribution varied between groups (midazolam 3% male; propofol 55% male; P = 0.04). Age, procedure, and ASA status were not significantly different. Propofol had shorter procedural sedation time (propofol 32.5 ± 24.2 minutes; midazolam 78.7 ± 51.5 minutes; P < 0.001) and higher rates of procedural success (propofol 98%; midazolam 92%; P = 0.02). There were no significant differences between complication rates (propofol 14%; midazolam 13%; P = 0.88) or emergency department length of stay (propofol 262.5 ± 132.8 minutes; midazolam 288.6 ± 130.6 minutes; P = 0.09). Use of propofol resulted in shorter emergency nursing time and a higher procedural success rate than midazolam, with a comparable safety profile. Copyright © 2013 Emergency Nurses Association. Published by Mosby, Inc. All rights reserved.

  7. Missing data treatments matter: an analysis of multiple imputation for anterior cervical discectomy and fusion procedures.

    PubMed

    Ondeck, Nathaniel T; Fu, Michael C; Skrip, Laura A; McLynn, Ryan P; Cui, Jonathan J; Basques, Bryce A; Albert, Todd J; Grauer, Jonathan N

    2018-04-09

    The presence of missing data is a limitation of large datasets, including the National Surgical Quality Improvement Program (NSQIP). In addressing this issue, most studies use complete case analysis, which excludes cases with missing data, thus potentially introducing selection bias. Multiple imputation, a statistically rigorous approach that approximates missing data and preserves sample size, may be an improvement over complete case analysis. The present study aims to evaluate the impact of using multiple imputation in comparison with complete case analysis for assessing the associations between preoperative laboratory values and adverse outcomes following anterior cervical discectomy and fusion (ACDF) procedures. This is a retrospective review of prospectively collected data. Patients undergoing one-level ACDF were identified in NSQIP 2012-2015. Perioperative adverse outcome variables assessed included the occurrence of any adverse event, severe adverse events, and hospital readmission. Missing preoperative albumin and hematocrit values were handled using complete case analysis and multiple imputation. These preoperative laboratory levels were then tested for associations with 30-day postoperative outcomes using logistic regression. A total of 11,999 patients were included. Of this cohort, 63.5% of patients had missing preoperative albumin and 9.9% had missing preoperative hematocrit. When using complete case analysis, only 4,311 patients were studied. The removed patients were significantly younger, healthier, of a common body mass index, and male. Logistic regression analysis failed to identify either preoperative hypoalbuminemia or preoperative anemia as significantly associated with adverse outcomes. When employing multiple imputation, all 11,999 patients were included. Preoperative hypoalbuminemia was significantly associated with the occurrence of any adverse event and severe adverse events. 
Preoperative anemia was significantly associated with the occurrence of any adverse event, severe adverse events, and hospital readmission. Multiple imputation is a rigorous statistical procedure that is being increasingly used to address missing values in large datasets. Using this technique for ACDF avoided the loss of cases that may have affected the representativeness and power of the study and led to different results than complete case analysis. Multiple imputation should be considered for future spine studies. Copyright © 2018 Elsevier Inc. All rights reserved.
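
    A toy stdlib sketch of the multiple-imputation idea for a single variable. Real analyses, including the NSQIP study above, use model-based imputations and Rubin's rules for variance estimation; simple hot-deck draws from the observed values are used here purely for illustration:

```python
import random
import statistics

def multiple_imputation_means(values, m=5, seed=0):
    """Replace each missing value (None) with a random draw from the observed
    values, m separate times, then pool the m complete-data means
    (the point-estimate half of Rubin's rules)."""
    rng = random.Random(seed)
    observed = [v for v in values if v is not None]
    estimates = []
    for _ in range(m):
        completed = [v if v is not None else rng.choice(observed)
                     for v in values]
        estimates.append(statistics.mean(completed))
    return statistics.mean(estimates)
```

    Unlike complete case analysis, every record contributes to each of the m completed datasets, which is precisely how the study preserved all 11,999 patients.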

  8. Normality Tests for Statistical Analysis: A Guide for Non-Statisticians

    PubMed Central

    Ghasemi, Asghar; Zahediasl, Saleh

    2012-01-01

    Statistical errors are common in scientific literature, and about 50% of published articles have at least one error. The assumption of normality needs to be checked for many statistical procedures, namely parametric tests, because their validity depends on it. The aim of this commentary is to provide an overview of checking for normality in statistical analysis using SPSS. PMID:23843808
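
    As a quick stdlib companion to the SPSS-based overview: sample skewness and excess kurtosis are two descriptive checks of normality (both near 0 for normal data). These are not the formal Shapiro-Wilk or Kolmogorov-Smirnov tests the commentary covers, just a first-pass screen:

```python
import statistics

def skew_kurtosis(xs):
    """Sample skewness and excess kurtosis, using population moments."""
    n = len(xs)
    m = statistics.fmean(xs)
    s = statistics.pstdev(xs)
    skew = sum((x - m) ** 3 for x in xs) / (n * s ** 3)
    kurt = sum((x - m) ** 4 for x in xs) / (n * s ** 4) - 3.0
    return skew, kurt
```

    Strongly right-skewed data (e.g., exponential, with skewness near 2) fail this screen immediately, signaling that a parametric test's normality assumption deserves formal checking.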

  9. Prevalidation in pharmaceutical analysis. Part I. Fundamentals and critical discussion.

    PubMed

    Grdinić, Vladimir; Vuković, Jadranka

    2004-05-28

    A complete prevalidation was introduced as a basic strategy for the quality control and standardization of analytical procedures. A fast and simple prevalidation methodology, based on mathematical/statistical evaluation of a reduced number of experiments (N ≤ 24), was elaborated, and guidelines as well as algorithms are given in detail. This strategy was produced for pharmaceutical applications and is dedicated to the preliminary evaluation of analytical methods for which a linear calibration model, which very often occurs in practice, is the most appropriate fit to experimental data. The requirements presented in this paper should therefore help the analyst to design and perform the minimum number of prevalidation experiments needed to obtain all the information required to evaluate and demonstrate the reliability of the analytical procedure. The complete prevalidation process includes characterization of analytical groups, checking of two limiting groups, testing of data homogeneity, establishment of analytical functions, recognition of outliers, evaluation of limiting values, and extraction of prevalidation parameters. Moreover, a system of diagnosis for each prevalidation step is suggested. As an illustrative example demonstrating the feasibility of the prevalidation methodology, a Vis-spectrophotometric procedure for the determination of tannins with Folin-Ciocalteu's phenol reagent was selected from among a great number of analytical procedures. The favourable metrological characteristics of this analytical procedure, expressed as prevalidation figures of merit, confirmed prevalidation as a valuable concept in the preliminary evaluation of the quality of analytical procedures.

  10. The kappa statistic in rehabilitation research: an examination.

    PubMed

    Tooth, Leigh R; Ottenbacher, Kenneth J

    2004-08-01

    The number and sophistication of statistical procedures reported in medical rehabilitation research is increasing. Application of the principles and methods associated with evidence-based practice has contributed to the need for rehabilitation practitioners to understand quantitative methods in published articles. Outcomes measurement and determination of reliability are areas that have experienced rapid change during the past decade. In this study, distinctions between reliability and agreement are examined. Information is presented on analytical approaches for addressing reliability and agreement with the focus on the application of the kappa statistic. The following assumptions are discussed: (1) kappa should be used with data measured on a categorical scale, (2) the patients or objects categorized should be independent, and (3) the observers or raters must make their measurement decisions and judgments independently. Several issues related to using kappa in measurement studies are described, including use of weighted kappa, methods of reporting kappa, the effect of bias and prevalence on kappa, and sample size and power requirements for kappa. The kappa statistic is useful for assessing agreement among raters, and it is being used more frequently in rehabilitation research. Correct interpretation of the kappa statistic depends on meeting the required assumptions and accurate reporting.
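
    Assumption (1) — data measured on a categorical scale by independent raters — leads directly to the computation. A minimal sketch of unweighted kappa from a square agreement table (weighted kappa, also discussed in the article, is not shown):

```python
def cohens_kappa(table):
    """Cohen's kappa from a square agreement table, where table[i][j] is the
    number of items rater 1 placed in category i and rater 2 in category j."""
    n = sum(sum(row) for row in table)
    # Observed agreement: proportion of items on the diagonal.
    po = sum(table[i][i] for i in range(len(table))) / n
    # Chance-expected agreement from the marginal category proportions.
    pe = sum(sum(table[i]) * sum(row[i] for row in table)
             for i in range(len(table))) / n ** 2
    return (po - pe) / (1 - pe)
```

    For the table [[20, 5], [10, 15]], observed agreement is 0.70 but chance agreement is 0.50, so kappa is 0.40 — a reminder of why raw percent agreement overstates reliability.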

  11. 78 FR 63568 - Proposed Collection; Comment Request for Rev. Proc. 2007-35

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-24

    ... Revenue Procedure 2007-35, Statistical Sampling for purposes of Section 199. DATES: Written comments... . SUPPLEMENTARY INFORMATION: Title: Statistical Sampling for purposes of Section 199. OMB Number: 1545-2072... statistical sampling may be used in purposes of section 199, which provides a deduction for income...

  12. Antecedents to Organizational Performance: Theoretical and Practical Implications for Aircraft Maintenance Officer Force Development

    DTIC Science & Technology

    2015-03-26

    ...to my reader, Lieutenant Colonel Robert Overstreet, for helping solidify my research, coaching me through the statistical analysis, and positive... common-method bias requires careful assessment of potential sources of bias and implementing procedural and statistical control methods. Podsakoff...

  13. Explorations in Statistics: Permutation Methods

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2012-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This eighth installment of "Explorations in Statistics" explores permutation methods, empiric procedures we can use to assess an experimental result--to test a null hypothesis--when we are reluctant to trust statistical…
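
    A two-sample permutation test of the kind this installment explores can be sketched in a few lines (an illustration in the spirit of the column, not its own worked example): shuffle the group labels, recompute the statistic, and count how often the shuffled result is at least as extreme as the observed one.

```python
import random
import statistics

def permutation_test(a, b, n_perm=10000, seed=0):
    """Permutation p-value for a difference in means between samples a and b."""
    rng = random.Random(seed)
    observed = abs(statistics.fmean(a) - statistics.fmean(b))
    pooled = list(a) + list(b)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = abs(statistics.fmean(pooled[:len(a)])
                   - statistics.fmean(pooled[len(a):]))
        if diff >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)  # add-one correction avoids p = 0
```

    No distributional assumption is needed, which is exactly why permutation methods appeal when we are reluctant to trust a parametric statistic.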

  14. Teaching Classical Statistical Mechanics: A Simulation Approach.

    ERIC Educational Resources Information Center

    Sauer, G.

    1981-01-01

    Describes a one-dimensional model for an ideal gas to study development of disordered motion in Newtonian mechanics. A Monte Carlo procedure for simulation of the statistical ensemble of an ideal gas with fixed total energy is developed. Compares both approaches for a pseudoexperimental foundation of statistical mechanics. (Author/JN)

  15. Illustrating Sampling Distribution of a Statistic: Minitab Revisited

    ERIC Educational Resources Information Center

    Johnson, H. Dean; Evans, Marc A.

    2008-01-01

    Understanding the concept of the sampling distribution of a statistic is essential for the understanding of inferential procedures. Unfortunately, this topic proves to be a stumbling block for students in introductory statistics classes. In efforts to aid students in their understanding of this concept, alternatives to a lecture-based mode of…
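
    The sampling distribution illustrated in Minitab can equally be simulated in a few lines of stdlib Python (a sketch, not the article's worked example): draw many samples from a population and collect the sample means.

```python
import random
import statistics

def sampling_distribution(pop, n, reps=2000, seed=42):
    """Simulate the sampling distribution of the sample mean by repeatedly
    drawing samples of size n (with replacement) from a population."""
    rng = random.Random(seed)
    return [statistics.fmean(rng.choices(pop, k=n)) for _ in range(reps)]
```

    The simulated means center on the population mean, and their spread shrinks roughly as sigma divided by the square root of n — the concrete behavior students need to see before inferential procedures make sense.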

  16. Statistical baseline assessment in cardiotocography.

    PubMed

    Agostinelli, Angela; Braccili, Eleonora; Marchegiani, Enrico; Rosati, Riccardo; Sbrollini, Agnese; Burattini, Luca; Morettini, Micaela; Di Nardo, Francesco; Fioretti, Sandro; Burattini, Laura

    2017-07-01

    Cardiotocography (CTG) is the most common non-invasive diagnostic technique to evaluate fetal well-being. It consists of the recording of fetal heart rate (FHR; bpm) and maternal uterine contractions. Among the main parameters characterizing FHR, the baseline (BL) is fundamental for determining fetal hypoxia and distress. In computerized applications, BL is typically computed as mean FHR±ΔFHR, with ΔFHR=8 bpm or ΔFHR=10 bpm, both values being experimentally fixed. In this context, the present work aims: to propose a statistical procedure for ΔFHR assessment; to quantitatively determine the ΔFHR value by applying such procedure to clinical data; and to compare the statistically-determined ΔFHR value against the experimentally-determined ΔFHR values. To these aims, the 552 recordings of the "CTU-UHB intrapartum CTG database" from Physionet were submitted to an automatic procedure, which consisted of an FHR preprocessing phase and a statistical BL assessment. During preprocessing, FHR time series were divided into 20-min sliding windows, in which missing data were removed by linear interpolation. Only windows with a correction rate lower than 10% were further processed for BL assessment, in which ΔFHR was computed as the FHR standard deviation. The total number of accepted windows was 1192 (38.5%) over 383 recordings (69.4%) with at least one accepted window. The statistically-determined ΔFHR value was 9.7 bpm. This value was statistically different from 8 bpm (P < 10^-19) but not from 10 bpm (P = 0.16). Thus, ΔFHR=10 bpm is preferable over 8 bpm because it is both experimentally and statistically validated.
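
    A simplified sketch of the windowed assessment the abstract describes: split an FHR series into fixed windows, discard windows with too many missing samples, and pool the per-window standard deviation as ΔFHR. The linear-interpolation step is omitted here, and the parameter names are my own:

```python
import statistics

def baseline_delta(fhr, fs=4, window_min=20, max_missing=0.10):
    """Pooled per-window SD of an FHR series (None marks missing samples).
    fs: sampling rate in Hz; windows with >= max_missing missing are skipped."""
    win = fs * 60 * window_min
    sds = []
    for start in range(0, len(fhr) - win + 1, win):
        w = fhr[start:start + win]
        if sum(v is None for v in w) / win >= max_missing:
            continue  # correction rate too high; reject the window
        vals = [v for v in w if v is not None]  # interpolation omitted in sketch
        sds.append(statistics.pstdev(vals))
    return statistics.fmean(sds) if sds else None
```

    This mirrors the paper's logic of accepting only windows with a correction rate below 10% before computing the statistic.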

  17. The sumLINK statistic for genetic linkage analysis in the presence of heterogeneity.

    PubMed

    Christensen, G B; Knight, S; Camp, N J

    2009-11-01

    We present the "sumLINK" statistic--the sum of multipoint LOD scores for the subset of pedigrees with nominally significant linkage evidence at a given locus--as an alternative to common methods to identify susceptibility loci in the presence of heterogeneity. We also suggest the "sumLOD" statistic (the sum of positive multipoint LOD scores) as a companion to the sumLINK. sumLINK analysis identifies genetic regions of extreme consistency across pedigrees without regard to negative evidence from unlinked or uninformative pedigrees. Significance is determined by an innovative permutation procedure based on genome shuffling that randomizes linkage information across pedigrees. This procedure for generating the empirical null distribution may be useful for other linkage-based statistics as well. Using 500 genome-wide analyses of simulated null data, we show that the genome shuffling procedure results in the correct type 1 error rates for both the sumLINK and sumLOD. The power of the statistics was tested using 100 sets of simulated genome-wide data from the alternative hypothesis from GAW13. Finally, we illustrate the statistics in an analysis of 190 aggressive prostate cancer pedigrees from the International Consortium for Prostate Cancer Genetics, where we identified a new susceptibility locus. We propose that the sumLINK and sumLOD are ideal for collaborative projects and meta-analyses, as they do not require any sharing of identifiable data between contributing institutions. Further, loci identified with the sumLINK have good potential for gene localization via statistical recombinant mapping, as, by definition, several linked pedigrees contribute to each peak.
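
    The two statistics themselves are simple sums over per-pedigree LOD scores. In this sketch, the 0.588 threshold for "nominally significant" pointwise linkage is my assumption (it corresponds roughly to p = 0.05); the paper's exact criterion may differ:

```python
def sumlink_sumlod(lods, threshold=0.588):
    """sumLINK: sum of multipoint LOD scores over pedigrees with nominally
    significant linkage (LOD >= threshold); sumLOD: sum of all positive LODs."""
    sumlink = sum(l for l in lods if l >= threshold)
    sumlod = sum(l for l in lods if l > 0)
    return sumlink, sumlod
```

    Note that unlinked or uninformative pedigrees (LOD near or below zero) contribute nothing, which is exactly the "no negative evidence" property the abstract emphasizes; the hard part, which this sketch omits, is the genome-shuffling permutation procedure used to assess significance.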

  18. Statistical methods for change-point detection in surface temperature records

    NASA Astrophysics Data System (ADS)

    Pintar, A. L.; Possolo, A.; Zhang, N. F.

    2013-09-01

    We describe several statistical methods to detect possible change-points in a time series of values of surface temperature measured at a meteorological station, and to assess the statistical significance of such changes, taking into account the natural variability of the measured values, and the autocorrelations between them. These methods serve to determine whether the record may suffer from biases unrelated to the climate signal, hence whether there may be a need for adjustments as considered by M. J. Menne and C. N. Williams (2009) "Homogenization of Temperature Series via Pairwise Comparisons", Journal of Climate 22 (7), 1700-1717. We also review methods to characterize patterns of seasonality (seasonal decomposition using monthly medians or robust local regression), and explain the role they play in the imputation of missing values, and in enabling robust decompositions of the measured values into a seasonal component, a possible climate signal, and a station-specific remainder. The methods for change-point detection that we describe include statistical process control, wavelet multi-resolution analysis, adaptive weights smoothing, and a Bayesian procedure, all of which are applicable to single station records.
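
    One of the simplest members of this family of methods, a CUSUM-style scan for a single shift in mean, can be sketched as follows (illustrative only; the paper's methods, which handle autocorrelation and seasonality, are considerably more elaborate):

```python
def cusum_changepoint(xs):
    """Locate the most likely single change in mean: the split index
    maximizing the variance-weighted contrast between segment means."""
    n = len(xs)
    total = sum(xs)
    best_k, best_stat = None, 0.0
    for k in range(1, n):
        left = sum(xs[:k]) / k
        right = (total - sum(xs[:k])) / (n - k)
        # Weight the mean difference by the effective sample size of the split.
        stat = abs(left - right) * (k * (n - k) / n) ** 0.5
        if stat > best_stat:
            best_k, best_stat = k, stat
    return best_k, best_stat
```

    In practice the maximal statistic would then be compared against a null distribution that accounts for the series' natural variability and autocorrelation, as the abstract stresses.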

  19. Load research manual. Volume 3. Load research for advanced technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brandenburg, L.; Clarkson, G.; Grund, Jr., C.

    1980-11-01

    This three-volume manual presents technical guidelines for electric utility load research. Special attention is given to issues raised by the load data reporting requirements of the Public Utility Regulatory Policies Act of 1978 and to problems faced by smaller utilities that are initiating load research programs. The manual includes guides to load research literature and glossaries of load research and statistical terms. In Volume 3, special load research procedures are presented for solar, wind, and cogeneration technologies.

  20. An investigation into pilot and system response to critical in-flight events, volume 1

    NASA Technical Reports Server (NTRS)

    Rockwell, T. H.; Giffin, W. C.

    1981-01-01

    The scope of a critical in-flight event (CIFE) is described, with emphasis on pilot management of available resources. Detailed scenarios for both full-mission simulation and written testing of pilot responses to CIFEs are developed, along with statistical relationships among pilot characteristics and observed responses. A model developed to describe pilot response to CIFEs and an analysis of professional flight crews' compliance with specified operating procedures, and its relationship to in-flight errors, are included.

  1. The breaking load method - Results and statistical modification from the ASTM interlaboratory test program

    NASA Technical Reports Server (NTRS)

    Colvin, E. L.; Emptage, M. R.

    1992-01-01

    The breaking load test provides quantitative stress corrosion cracking data by determining the residual strength of tension specimens that have been exposed to corrosive environments. Eight laboratories have participated in a cooperative test program under the auspices of ASTM Committee G-1 to evaluate the new test method. All eight laboratories were able to distinguish between three tempers of aluminum alloy 7075. The statistical analysis procedures that were used in the test program do not work well in all situations. An alternative procedure using Box-Cox transformations shows a great deal of promise. An ASTM standard method has been drafted which incorporates the Box-Cox procedure.
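
    The Box-Cox step referred to can be sketched with SciPy's `stats.boxcox`, which selects the transformation exponent lambda by maximum likelihood. The data below are synthetic, positively skewed stand-ins, not the ASTM interlaboratory measurements:

```python
import numpy as np
from scipy import stats

# Synthetic skewed data standing in for residual-strength measurements.
rng = np.random.default_rng(1)
raw = rng.lognormal(mean=3.0, sigma=0.5, size=60)

# stats.boxcox picks lambda by maximum likelihood and returns the
# transformed sample along with it.
transformed, lam = stats.boxcox(raw)
print(round(float(lam), 2))   # near 0 for log-normal data (lambda = 0 is the log)

# Approximate normality before and after (Shapiro-Wilk p-values):
print(round(stats.shapiro(raw).pvalue, 4), round(stats.shapiro(transformed).pvalue, 4))
```

    Analyses of variance on the transformed scale then behave far better when the raw data are skewed, which is presumably why the drafted ASTM method incorporates the procedure.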

  2. Interventional therapeutic procedures in the musculoskeletal system: an Italian Survey by the Italian College of Musculoskeletal Radiology.

    PubMed

    Silvestri, Enzo; Barile, Antonio; Albano, Domenico; Messina, Carmelo; Orlandi, Davide; Corazza, Angelo; Zugaro, Luigi; Masciocchi, Carlo; Sconfienza, Luca Maria

    2018-04-01

    To perform an online survey among all members of the Italian College of Musculoskeletal Radiology to understand how therapeutic musculoskeletal procedures are performed in daily practice in Italy. We administered an online survey to all 2405 members, asking 16 questions about the use of therapeutic musculoskeletal procedures in their institutions. Subgroup analysis comparing general and orthopaedic hospitals was performed with Mann-Whitney U and χ2 statistics. A total of 129/2405 answers (5.4% of members) were included in our analysis. A median of 142.5 (25th-75th percentiles: 50-535.5; range 10-5000) therapeutic musculoskeletal procedures per single institution was performed in 2016. Arthropathic pain was the main indication. The most common procedures were joint injection, bursal/tendon injection, and irrigation of calcific tendinopathy. Ultrasound-guided procedures were mainly performed in ultrasonography rooms (77.4%) rather than in dedicated interventional rooms (22.6%). Conversely, fluoroscopic procedures were performed with almost the same frequency in interventional radiology suites (52.4%) and in general radiology rooms (47.6%). In most institutions (72%), autologous blood or blood components were not used. The median number of therapeutic musculoskeletal procedures performed in orthopaedic hospitals was significantly higher than in general hospitals (P = 0.002), as was the use of autologous preparations (P = 0.004). Joint injection, bursal/tendon injection, and irrigation of calcific tendinopathy were the most common therapeutic musculoskeletal procedures, with arthropathic pain being the main indication. The number of procedures and the use of autologous preparations were significantly higher in orthopaedic hospitals than in general hospitals.

  3. Economic Statistics and Information Concerning the Japanese Auto Industry

    DOT National Transportation Integrated Search

    1980-12-01

    The report examines the following aspects of the Japanese automobile Industry: Identification of Japanese agencies that receive statistical data on the automobile industry; Determination of research and development and capital investment procedures; ...

  4. New robust statistical procedures for the polytomous logistic regression models.

    PubMed

    Castilla, Elena; Ghosh, Abhik; Martin, Nirian; Pardo, Leandro

    2018-05-17

    This article derives a new family of estimators, namely the minimum density power divergence estimators, as a robust generalization of the maximum likelihood estimator for the polytomous logistic regression model. Based on these estimators, a family of Wald-type test statistics for linear hypotheses is introduced. Robustness properties of both the proposed estimators and the test statistics are theoretically studied through the classical influence function analysis. Appropriate real-life examples are presented to justify the requirement of suitable robust statistical procedures in place of likelihood-based inference for the polytomous logistic regression model. The validity of the theoretical results established in the article is further confirmed empirically through suitable simulation studies. Finally, an approach for the data-driven selection of the robustness tuning parameter is proposed with empirical justifications. © 2018, The International Biometric Society.

  5. Petroleum supply monthly, February 1991. [Glossary included

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-02-01

    Data presented in the Petroleum Supply Monthly (PSM) describe the supply and disposition of petroleum products in the United States and major US geographic regions. The data series describe production, imports and exports, inter-Petroleum Administration for Defense (PAD) District movements, and inventories by the primary suppliers of petroleum products in the United States (50 States and the District of Columbia). The reporting universe includes those petroleum sectors in Primary Supply. Included are: petroleum refiners, motor gasoline blenders, operators of natural gas processing plants and fractionators, inter-PAD transporters, importers, and major inventory holders of petroleum products and crude oil. When aggregated, the data reported by these sectors approximately represent the consumption of petroleum products in the United States. Data presented in the PSM are divided into two sections (1) the Summary Statistics and (2) the Detailed Statistics. Explanatory Notes, located at the end of this publication, present information describing data collection, sources, estimation methodology, data quality control procedures, modifications to reporting requirements and interpretation of tables. Industry terminology and product definitions are listed alphabetically in the Glossary. 12 figs., 54 tabs.

  6. A Procedure To Detect Test Bias Present Simultaneously in Several Items.

    ERIC Educational Resources Information Center

    Shealy, Robin; Stout, William

    A statistical procedure is presented that is designed to test for unidirectional test bias existing simultaneously in several items of an ability test, based on the assumption that test bias is incipient within the two groups' ability differences. The proposed procedure--Simultaneous Item Bias (SIB)--is based on a multidimensional item response…

  7. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR SAMPLING WEIGHT CALCULATION (IIT-A-9.0)

    EPA Science Inventory

    The purpose of this SOP is to describe the procedures undertaken to calculate sampling weights. The sampling weights are needed to obtain weighted statistics of the NHEXAS data. This SOP uses data that have been properly coded and certified with appropriate QA/QC procedures by t...

  8. Percutaneous Tracheostomy under Bronchoscopic Visualization Does Not Affect Short-Term or Long-Term Complications.

    PubMed

    Easterday, Thomas S; Moore, Joshua W; Redden, Meredith H; Feliciano, David V; Henderson, Vernon J; Humphries, Timothy; Kohler, Katherine E; Ramsay, Philip T; Spence, Stanston D; Walker, Mark; Wyrzykowski, Amy D

    2017-07-01

    Percutaneous tracheostomy is a safe and effective bedside procedure. Some advocate the use of bronchoscopy during the procedure to reduce the rate of complications. We evaluated our complication rate in trauma patients undergoing percutaneous tracheostomy with and without bronchoscopic guidance to ascertain if there was a difference in the rate of complications. A retrospective review of all tracheostomies performed in critically ill trauma patients was performed using the trauma registry from an urban, Level I Trauma Center. Bronchoscopy assistance was used based on surgeon preference. Standard statistical methodology was used to determine if there was a difference in complication rates for procedures performed with and without the bronchoscope. From January 2007 to April 2016, 649 patients underwent modified percutaneous tracheostomy; 289 with the aid of a bronchoscope and 360 without. There were no statistically significant differences in any type of complication regardless of utilization of a bronchoscope. The addition of bronchoscopy provides several theoretical benefits when performing percutaneous tracheostomy. Our findings, however, do not demonstrate a statistically significant difference in complications between procedures performed with and without a bronchoscope. Use of the bronchoscope should, therefore, be left to the discretion of the performing physician.

  9. Active control of aerothermoelastic effects for a conceptual hypersonic aircraft

    NASA Technical Reports Server (NTRS)

    Heeg, Jennifer; Gilbert, Michael G.; Pototzky, Anthony S.

    1990-01-01

    This paper describes the procedures for and results of aeroservothermoelastic studies. The objectives of these studies were to develop the necessary procedures for performing an aeroelastic analysis of an aerodynamically heated vehicle and to analyze a configuration in the classical 'cold' state and in a 'hot' state. Major tasks include the development of the structural and aerodynamic models, open loop analyses, design of active control laws for improving dynamic responses and analyses of the closed loop vehicles. The analyses performed focused on flutter speed calculations, short-period eigenvalue trends and statistical analyses of the vehicle response to controls and turbulence. Improving the ride quality of the vehicle and raising the flutter boundary of the aerodynamically-heated vehicle up to that of the cold vehicle were the objectives of the control law design investigations.

  10. Soils element activities for the period October 1973--September 1974

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fowler, E.B.; Essington, E.H.; White, M.G.

    Soils Element activities were conducted on behalf of the U.S. Atomic Energy Commission's Nevada Applied Ecology Group (NAEG) program to provide source term information for the other program elements and maintain continuous cognizance of program requirements for sampling, sample preparation, and analysis. Activities included presentation of papers; participation in workshops; analysis of soil, vegetation, and animal tissue samples for ²³⁸Pu, ²³⁹⁻²⁴⁰Pu, ²⁴¹Am, ¹³⁷Cs, ⁶⁰Co, and gamma scan for routine and laboratory quality control purposes; preparation and analysis of animal tissue samples for NAEG laboratory certification; studies on a number of analytical, sample preparation, and sample collection procedures; and contributions to the evaluation of procedures for calculation of specialized counting statistics. (auth)

  11. Validation of phenol red versus gravimetric method for water reabsorption correction and study of gender differences in Doluisio's absorption technique.

    PubMed

    Tuğcu-Demiröz, Fatmanur; Gonzalez-Alvarez, Isabel; Gonzalez-Alvarez, Marta; Bermejo, Marival

    2014-10-01

    The aim of the present study was to develop a method for water flux reabsorption measurement in Doluisio's Perfusion Technique based on the use of phenol red as a non-absorbable marker and to validate it by comparison with the gravimetric procedure. The compounds selected for the study were metoprolol, atenolol, cimetidine and cefadroxil, in order to include low, intermediate and high permeability drugs absorbed by passive diffusion and by carrier-mediated mechanisms. The intestinal permeabilities (Peff) of the drugs were obtained in male and female Wistar rats and calculated using both methods of water flux correction. The absorption rate coefficients of the assayed compounds did not show statistically significant differences between male and female rats; consequently, all individual values were combined to compare the reabsorption methods. The absorption rate coefficients and permeability values did not show statistically significant differences between the two strategies of concentration correction. The apparent zero-order water absorption coefficients were also similar in both correction procedures. In conclusion, the gravimetric and phenol red methods for water reabsorption correction are accurate and interchangeable for permeability estimation in the closed-loop perfusion method. Copyright © 2014 Elsevier B.V. All rights reserved.

  12. Confidence intervals for expected moments algorithm flood quantile estimates

    USGS Publications Warehouse

    Cohn, Timothy A.; Lane, William L.; Stedinger, Jery R.

    2001-01-01

    Historical and paleoflood information can substantially improve flood frequency estimates if appropriate statistical procedures are properly applied. However, the Federal guidelines for flood frequency analysis, set forth in Bulletin 17B, rely on an inefficient “weighting” procedure that fails to take advantage of historical and paleoflood information. This has led researchers to propose several more efficient alternatives including the Expected Moments Algorithm (EMA), which is attractive because it retains Bulletin 17B's statistical structure (method of moments with the Log Pearson Type 3 distribution) and thus can be easily integrated into flood analyses employing the rest of the Bulletin 17B approach. The practical utility of EMA, however, has been limited because no closed‐form method has been available for quantifying the uncertainty of EMA‐based flood quantile estimates. This paper addresses that concern by providing analytical expressions for the asymptotic variance of EMA flood‐quantile estimators and confidence intervals for flood quantile estimates. Monte Carlo simulations demonstrate the properties of such confidence intervals for sites where a 25‐ to 100‐year streamgage record is augmented by 50 to 150 years of historical information. The experiments show that the confidence intervals, though not exact, should be acceptable for most purposes.

  13. Longitudinal outcomes after tibioperoneal angioplasty alone compared to tibial stenting and atherectomy for critical limb ischemia.

    PubMed

    Reynolds, Shaun; Galiñanes, Edgar Luis; Dombrovskiy, Viktor Y; Vogel, Todd R

    2013-10-01

    There are limited data available evaluating longitudinal outcomes after tibioperoneal angioplasty (TA) alone compared to adjunctive tibial procedures including stenting and atherectomy. Using the Centers for Medicare & Medicaid Services inpatient claims (2005-2007), we evaluated patients who underwent TA only, TA plus stent placement (TA + S), or TA plus atherectomy (TA + A). A total of 2080 patients with critical limb ischemia underwent percutaneous tibioperoneal intervention for the indication of ulceration. Procedures included TA (56.3%), TA + S (16.2%), and TA + A (27.5%). Rates of amputation were not statistically different between the groups at 30, 90, and 365 days after the intervention. Mean total hospital charges were TA ($35,867), TA + A ($41,698; P = .0004), and TA + S ($51,040; P < .0001). Patients undergoing TA alone compared to concomitant stenting or atherectomy for ulceration demonstrated no improvement in limb salvage. Future analysis of adjunctive tibioperoneal interventions is essential to temper cost, as they fail to improve long-term limb salvage.

  14. Round-off errors in cutting plane algorithms based on the revised simplex procedure

    NASA Technical Reports Server (NTRS)

    Moore, J. E.

    1973-01-01

    This report statistically analyzes computational round-off errors associated with the cutting plane approach to solving linear integer programming problems. Cutting plane methods require that the inverse of a sequence of matrices be computed. The problem basically reduces to one of minimizing round-off errors in the sequence of inverses. Two procedures for minimizing this problem are presented, and their influence on error accumulation is statistically analyzed. One procedure employs a very small tolerance factor to round computed values to zero. The other procedure is a numerical analysis technique for reinverting or improving the approximate inverse of a matrix. The results indicated that round-off accumulation can be effectively minimized by employing a tolerance factor which reflects the number of significant digits carried for each calculation and by applying the reinversion procedure once to each computed inverse. If 18 significant digits plus an exponent are carried for each variable during computations, then a tolerance value of 0.1 × 10⁻¹² is reasonable.
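
    The two safeguards described, rounding near-zero values with a tolerance and reinverting to improve an approximate inverse, can be sketched as follows. The Newton-Schulz correction shown is one standard reinversion technique; the report's exact numerical procedure may differ:

```python
import numpy as np

TOL = 1e-13  # on the scale of the report's 0.1e-12 tolerance

def round_small(M, tol=TOL):
    """Zero out entries small enough to be round-off noise."""
    M = M.copy()
    M[np.abs(M) < tol] = 0.0
    return M

def refine_inverse(A, X):
    """One Newton-Schulz correction X <- X(2I - AX) of an approximate inverse X."""
    n = A.shape[0]
    return X @ (2.0 * np.eye(n) - A @ X)

A = np.array([[4.0, 1.0], [2.0, 3.0]])
X = np.linalg.inv(A) + 1e-4            # deliberately perturbed approximate inverse
err_before = np.linalg.norm(A @ X - np.eye(2))
err_after = np.linalg.norm(A @ refine_inverse(A, X) - np.eye(2))
print(err_after < err_before)  # True: the residual I - AX shrinks quadratically
```

    Because the new residual equals the square of the old one, a single refinement step is often enough, which matches the report's finding that one reinversion per computed inverse suffices.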

  15. Evaluation of procedures for quality assurance specifications

    DOT National Transportation Integrated Search

    2004-10-01

    The objective of this project was to develop a comprehensive quality assurance (QA) manual, supported by scientific evidence and statistical theory, which provides step-by-step procedures and instructions for developing effective and efficient QA spe...

  16. Evaluating sufficient similarity for drinking-water disinfection by-product (DBP) mixtures with bootstrap hypothesis test procedures.

    PubMed

    Feder, Paul I; Ma, Zhenxu J; Bull, Richard J; Teuschler, Linda K; Rice, Glenn

    2009-01-01

    In chemical mixtures risk assessment, the use of dose-response data developed for one mixture to estimate risk posed by a second mixture depends on whether the two mixtures are sufficiently similar. While evaluations of similarity may be made using qualitative judgments, this article uses nonparametric statistical methods based on the "bootstrap" resampling technique to address the question of similarity among mixtures of chemical disinfectant by-products (DBP) in drinking water. The bootstrap resampling technique is a general-purpose, computer-intensive approach to statistical inference that substitutes empirical sampling for theoretically based parametric mathematical modeling. Nonparametric, bootstrap-based inference involves fewer assumptions than parametric normal theory based inference. The bootstrap procedure is appropriate, at least in an asymptotic sense, whether or not the parametric, distributional assumptions hold, even approximately. The statistical analysis procedures in this article are initially illustrated with data from 5 water treatment plants (Schenck et al., 2009), and then extended using data developed from a study of 35 drinking-water utilities (U.S. EPA/AMWA, 1989), which permits inclusion of a greater number of water constituents and increased structure in the statistical models.
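
    The core bootstrap idea, substituting empirical resampling for parametric assumptions, can be illustrated with a toy two-sample comparison. The concentration values below are hypothetical, not the treatment-plant or utility data, and the real similarity analyses are considerably more elaborate:

```python
import numpy as np

# Hypothetical DBP concentrations for two water mixtures. Under the null that
# the mixtures are interchangeable, both samples are redrawn with replacement
# from the pooled data.
rng = np.random.default_rng(42)
mix_a = np.array([12.1, 9.8, 11.4, 10.9, 12.7, 11.0, 10.2, 11.8])
mix_b = np.array([11.5, 10.4, 12.0, 11.1, 12.3, 10.8, 11.6, 11.2])

observed = mix_a.mean() - mix_b.mean()
pooled = np.concatenate([mix_a, mix_b])

B = 10000
null_diffs = np.empty(B)
for b in range(B):
    a_star = rng.choice(pooled, size=mix_a.size, replace=True)
    b_star = rng.choice(pooled, size=mix_b.size, replace=True)
    null_diffs[b] = a_star.mean() - b_star.mean()

# Two-sided bootstrap p-value for the observed mean difference.
p_value = float(np.mean(np.abs(null_diffs) >= abs(observed)))
print(round(p_value, 2))   # large p: no evidence against similarity here
```

    No distributional form is assumed anywhere, which is the property the article exploits when the parametric assumptions are in doubt.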

  17. The Application of Kidney Injury Molecule-1 to Determine the Duration Between Shockwave Lithotripsy Sessions.

    PubMed

    Aydin, Hasan R; Irkilata, Lokman; Aydin, Mustafa; Daggulli, Mansur; Taskin, Mehmet H; Demirel, Huseyin C; Adanur, Senol; Moral, Caner; Atilla, Mustafa K; Sancaktutar, Ahmet A

    2016-01-01

    We aimed to evaluate the role of kidney injury molecule-1 (KIM-1) in determining the intervals between shockwave lithotripsy (SWL) sessions. This was a prospective, controlled study. It included 40 patients with unilateral kidney stones and 40 healthy persons of a similar age group as controls. The patients' midflow urine samples were collected before SWL and 1 hour, 1 day, 1 week, and 1 month after the procedure. The average age in the SWL and control groups was 45 ± 14 and 39 ± 15 years, respectively (P = 0.336). The average KIM-1 value before SWL was 0.74 ± 0.35 ng/mL, which was significantly higher than that of the control group (0.51 ± 0.14 ng/mL) (P < 0.001). Similarly, the average values of the urine samples after SWL were higher than those of the control group (P < 0.001). When the KIM-1 values of the patients given SWL were compared within the group, the values 1 hour (1.06 ± 0.51) and 1 day (0.99 ± 0.67) after the procedure were significantly higher than those before the procedure (P < 0.001) and than those of the control group (P = 0.005). The KIM-1 values 1 week and 1 month after the procedure were not significantly different from the preprocedure values (P = 0.652 and P = 0.747, respectively). KIM-1 is a noninvasive biomarker that may be used to show renal damage caused by stones and early-stage renal damage linked to SWL. In addition, post-SWL KIM-1 values may be used to determine the interval between SWL sessions.

  18. Kepler AutoRegressive Planet Search

    NASA Astrophysics Data System (ADS)

    Feigelson, Eric

    NASA's Kepler mission is the source of more exoplanets than any other instrument, but its discoveries depend on complex statistical analysis procedures embedded in the Kepler pipeline. A particular challenge is mitigating irregular stellar variability without loss of sensitivity to faint periodic planetary transits. This proposal presents a two-stage alternative analysis procedure. First, parametric autoregressive ARFIMA models, commonly used in econometrics, remove most of the stellar variations. Second, a novel matched filter is used to create a periodogram from which transit-like periodicities are identified. This analysis procedure, the Kepler AutoRegressive Planet Search (KARPS), is confirming most of the Kepler Objects of Interest and is expected to identify additional planetary candidates. The proposed research will complete application of the KARPS methodology to the prime Kepler mission light curves of 200,000 stars, and compare the results with Kepler Objects of Interest obtained with the Kepler pipeline. We will then conduct a variety of astronomical studies based on the KARPS results. Important subsamples will be extracted including Habitable Zone planets, hot super-Earths, grazing-transit hot Jupiters, and multi-planet systems. Ground-based spectroscopy of poorly studied candidates will be performed to better characterize the host stars. Studies of stellar variability will then be pursued based on KARPS analysis. The autocorrelation function and nonstationarity measures will be used to identify spotted stars at different stages of autoregressive modeling. Periodic variables with folded light curves inconsistent with planetary transits will be identified; they may be eclipsing or mutually illuminating binary star systems. Classification of stellar variables with KARPS-derived statistical properties will be attempted. KARPS procedures will then be applied to archived K2 data to identify planetary transits and characterize stellar variability.
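
    The two-stage idea can be caricatured in a few lines: fit an autoregressive model, then scan the prewhitened residuals for periodic transit-like dips. This sketch substitutes a toy AR(1) fit and a simple phase-folding search for the ARFIMA models and matched filter actually proposed, on a fully synthetic light curve:

```python
import numpy as np

rng = np.random.default_rng(7)
n, period, depth = 2000, 97, 0.8

# Synthetic light curve: AR(1) "stellar variability" plus periodic transit dips.
flux = np.zeros(n)
for t in range(1, n):
    flux[t] = 0.9 * flux[t - 1] + rng.normal(0.0, 0.2)
flux[np.arange(n) % period == 0] -= depth

# Stage 1: estimate the autoregressive coefficient and prewhiten the series.
phi = float(np.dot(flux[1:], flux[:-1]) / np.dot(flux[:-1], flux[:-1]))
resid = flux[1:] - phi * flux[:-1]

# Stage 2: crude periodicity search; for each trial period, find the phase bin
# with the deepest mean residual (a stand-in for the proposed matched filter).
def deepest_phase(resid, p):
    phases = np.arange(len(resid)) % p
    return min(resid[phases == k].mean() for k in range(p))

best = min(range(50, 150), key=lambda p: deepest_phase(resid, p))
print(best)  # recovers a period matching the injected 97
```

    Prewhitening is what keeps the transit signature detectable: folding the raw correlated flux at trial periods would smear the dips into the slow stellar variations.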

  19. Nonabsorbable urethral bulking agent - clinical effectiveness and late complications rates in the treatment of recurrent stress urinary incontinence after 2 years of follow-up.

    PubMed

    Futyma, Konrad; Nowakowski, Łukasz; Gałczyński, Krzysztof; Miotła, Paweł; Rechberger, Tomasz

    2016-12-01

    Patients who fail to achieve continence after a corrective procedure require careful, individualized management because of the complex anatomical and functional field involved. The purpose of the present study was to assess long-term clinical efficacy and to evaluate the frequency and severity of complications related to treatment of recurrent stress urinary incontinence with periurethral injections of a non-absorbable bulking agent. Between February 2012 and September 2013, 66 patients with recurrent stress urinary incontinence were treated with Urolastic in a tertiary referral gynecologic department. The efficacy of the procedure was assessed objectively at each follow-up visit, scheduled at 2 and 6 weeks and at 3, 6, 12, and 24 months after the primary procedure. Material was injected under local anesthesia according to the manufacturer's instructions, at the 10, 2, 4, and 8 o'clock positions, with 0.5-1.25 mL per spot. Statistical analyses were performed with the Statistica package, version 8.0 (StatSoft Inc., Tulsa, OK, USA). A p value <0.05 was considered statistically significant. Objective success at 24 months was observed in 32.7% of patients, including 22.4% who were completely dry. In the intention-to-treat analysis, the corresponding rates were 24.2% and 16.7%. In 4.5% of patients, oval-shaped material was found inside the bladder. Overall, complications were observed in 17 (25.8%) patients. Although only about 30% of patients benefit from Urolastic injection in the long term, it appears to be a safe procedure for the treatment of recurrent stress urinary incontinence. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
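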

  20. Quantized correlation coefficient for measuring reproducibility of ChIP-chip data.

    PubMed

    Peng, Shouyong; Kuroda, Mitzi I; Park, Peter J

    2010-07-27

    Chromatin immunoprecipitation followed by microarray hybridization (ChIP-chip) is used to study protein-DNA interactions and histone modifications on a genome scale. To ensure data quality, these experiments are usually performed in replicates, and a correlation coefficient between replicates is often used to assess reproducibility. However, the correlation coefficient can be misleading because it is affected not only by the reproducibility of the signal but also by the amount of binding signal present in the data. We develop the quantized correlation coefficient (QCC), which is much less dependent on the amount of signal. This involves discretization of data into a set of quantiles (quantization), a merging procedure to group the background probes, and recalculation of the Pearson correlation coefficient. This procedure reduces the influence of the background noise on the statistic, so that it focuses on the reproducibility of the signal. The performance of this procedure is tested in both simulated and real ChIP-chip data. For replicates with different levels of enrichment over background and coverage, we find that QCC reflects reproducibility more accurately and is more robust than the standard Pearson or Spearman correlation coefficients. The quantization and the merging procedure can also suggest a proper quantile threshold for separating signal from background for further analysis. To measure reproducibility of ChIP-chip data correctly, a correlation coefficient that is robust to the amount of signal present should be used. QCC is one such measure. The QCC statistic can also be applied in a variety of other contexts for measuring reproducibility, including analysis of array CGH data for DNA copy number and gene expression data.
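
    A rough sketch of the quantization-plus-merging idea follows. This is not the authors' exact QCC algorithm: the number of quantile bins and the background cutoff are illustrative choices, and the simulated replicates are toy data:

```python
import numpy as np

def qcc(x, y, n_bins=10, background_bins=8):
    """Pearson correlation on quantile-quantized data with background bins merged."""
    def quantize(v):
        edges = np.quantile(v, np.linspace(0.0, 1.0, n_bins + 1)[1:-1])
        bins = np.searchsorted(edges, v)   # quantile bin index, 0..n_bins-1
        bins[bins < background_bins] = 0   # merge background probes into one group
        return bins.astype(float)
    return float(np.corrcoef(quantize(x), quantize(y))[0, 1])

# Simulated replicates: 15% "bound" probes with enriched signal, rest background.
rng = np.random.default_rng(3)
n = 1000
truth = np.zeros(n)
truth[:150] = 3.0
rep1 = truth + rng.normal(0.0, 1.0, n)
rep2 = truth + rng.normal(0.0, 1.0, n)

print(qcc(rep1, rep2) > qcc(rep1, rng.permutation(rep2)))  # True: QCC tracks reproducibility
```

    Because all background probes collapse into a single group, fluctuations among them cannot inflate or deflate the statistic, which is the robustness property the abstract describes.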

  1. The subdural evacuation port system: outcomes from a single institution experience and predictors of success.

    PubMed

    Neal, Matthew T; Hsu, Wesley; Urban, Jillian E; Angelo, Nicole M; Sweasey, Thomas A; Branch, Charles L

    2013-06-01

    Numerous surgical options for treatment of chronic subdural hematomas (cSDH) exist. Several reports have examined the Subdural Evacuating Port System (SEPS), a variation of the twist drill craniotomy (TDC) technique. Although high success rates have been reported, a significant portion of patients treated with SEPS fail and require additional procedures. This report examines the largest single-institution experience with the SEPS and explores patient and imaging characteristics associated with successful procedures. A retrospective chart review was performed to identify all patients who have undergone SEPS drainage of cSDH. Demographic and radiographic characteristics were evaluated. Demographic data included patients' age, sex, presenting symptoms, pre-procedural GCS score, and use of anticoagulation or antiplatelet agents. The volume of drainage per procedure and radiographic data including laterality, density, and maximal diameter of the collection, presence of septations, midline shift, resolution of the collection 3 weeks post procedure, and measurements to assess atrophy were collected. Total length of stay and time in the intensive care unit were also recorded. Results were classified as a success or failure based on the need for additional procedures including craniotomy or burr hole craniotomy in the operating room. Patients treated with two SEPS procedures during the same hospitalization and no other procedures were included in the success group for statistical analyses. 171 subdural collections were treated in 159 patients (147 unilateral and 12 bilateral). One hundred thirty-three collections (77.8%) were successfully drained.
In a comparison of the success and failure groups, there were no statistically significant differences (p<0.05) in the patients' mean age, sex, presenting Glasgow Coma Scale score, coagulation profile, presenting symptoms (except altered mental status and language disturbance), subdural diameter or laterality, midline shift, presence of atrophy, density of most acute portion, or time in hospital. In the success group, there was a shorter mean stay in the intensive care unit (S: 4.1±4.5 days vs F: 5.4±4.6 days; p=0.03) and a larger output drained (S: 131.1±71.2 ml vs F: 99.0±84.2 ml; p=0.04). Success was less likely with mixed-density collections (S: 38.2% vs F: 64.3%; p=0.02) and with collections containing greater than 2 intrahematomal septations (S: 17.1% vs F: 40.7%; p=0.007). In successful cases, mean volumes prior to SEPS, immediately after SEPS, and on delayed scans (≥30 days after SEPS placement) were 83.1±35.1 ml, 41.5±23.2 ml, and 37.9±26.5 ml, respectively. Both post-SEPS volumes were less than the pre-SEPS volume (p<0.0001). 76.0% of patients with delayed scans had complete resolution of cSDH or minimal residual cSDH with no local mass effect on the most recent imaging. The mean period of follow-up imaging was 95.6±196.2 days. Only one patient in our series required an emergent craniotomy following immediate complications from SEPS placement. The SEPS is an effective, safe, and durable treatment for cSDH. Although we consider the SEPS a first-line treatment for the majority of patients with cSDH, management of cSDH must be tailored to each patient. In mixed-density collections with large proportions of acute hemorrhage and in collections with numerous intrahematomal septations, alternative surgical techniques should be considered as first-line therapies. Copyright © 2012 Elsevier B.V. All rights reserved.

  2. Simple estimation procedures for regression analysis of interval-censored failure time data under the proportional hazards model.

    PubMed

    Sun, Jianguo; Feng, Yanqin; Zhao, Hui

    2015-01-01

    Interval-censored failure time data occur in many fields, including epidemiological and medical studies as well as financial and sociological studies, and many authors have investigated their analysis (Sun, The Statistical Analysis of Interval-Censored Failure Time Data, 2006; Zhang, Stat Modeling 9:321-343, 2009). In particular, a number of procedures have been developed for regression analysis of interval-censored data arising from the proportional hazards model (Finkelstein, Biometrics 42:845-854, 1986; Huang, Ann Stat 24:540-568, 1996; Pan, Biometrics 56:199-203, 2000). One drawback of most of these procedures, however, is that they involve estimation of both the regression parameters and the baseline cumulative hazard function. In this paper, we propose two simple estimation approaches that do not require estimation of the baseline cumulative hazard function. The asymptotic properties of the resulting estimates are given, and an extensive simulation study indicates that they work well in practical situations.

  3. A method to estimate statistical errors of properties derived from charge-density modelling

    PubMed Central

    Lecomte, Claude

    2018-01-01

    Estimating uncertainties of property values derived from a charge-density model is not straightforward. A methodology, based on the calculation of sample standard deviations (SSD) of properties using randomly deviating charge-density models, is proposed with the MoPro software. The parameter shifts applied in the deviating models are generated so as to respect the variance–covariance matrix obtained from the least-squares refinement. This ‘SSD methodology’ can be applied to estimate uncertainties of any property related to a charge-density model obtained by least-squares fitting. This includes topological properties such as critical-point coordinates, electron density, Laplacian and ellipticity at critical points, and charges integrated over atomic basins. Errors on electrostatic potentials and interaction energies are now also available through this procedure. The method is exemplified with the charge density of the compound (E)-5-phenylpent-1-enylboronic acid, refined at 0.45 Å resolution. The procedure is implemented in the freely available MoPro program dedicated to charge-density refinement and modelling. PMID:29724964
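
The core of the SSD methodology, drawing parameter sets consistent with the refinement's variance-covariance matrix and recomputing the property of interest for each, can be sketched in a few lines. This is a minimal illustration with a hypothetical two-parameter model and a made-up property function, not MoPro's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

p_hat = np.array([1.0, 0.5])                 # refined parameter values (hypothetical)
cov = np.array([[0.01, 0.002],
                [0.002, 0.04]])              # variance-covariance matrix from refinement

def property_of(p):
    # hypothetical derived property (stands in for, e.g., a density at a critical point)
    return p[0] ** 2 + 3.0 * p[1]

# Generate deviating models whose parameter shifts respect the covariance matrix
samples = rng.multivariate_normal(p_hat, cov, size=5000)
values = np.array([property_of(p) for p in samples])

ssd = values.std(ddof=1)                     # sample standard deviation of the property
print(f"property = {property_of(p_hat):.3f} +/- {ssd:.3f}")
```

The same recipe applies to any scalar derived from the fitted parameters; only `property_of` changes.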

  4. Frequency of adverse events in plateletpheresis donors in regional transfusion centre in North India.

    PubMed

    Patidar, Gopal Kumar; Sharma, Ratti Ram; Marwaha, Neelam

    2013-10-01

    Although automated cell separators have undergone many technical refinements, attention has focused more on the quality of platelet concentrates than on donor safety. We planned this prospective study to examine donor safety by studying adverse events in normal healthy plateletpheresis donors. The study included 500 healthy, first-time (n=301) and repeat (n=199) plateletpheresis donors after informed consent. The plateletpheresis procedures were performed on Trima Accel (version 5.1, GAMBRO BCT) and Amicus (version 3.2, FENWAL) cell separators. The adverse events during the procedure were recorded and classified according to their nature. The pre- and post-procedure hematological and biochemical profiles of these donors were assessed with an automated cell counter and analyser, respectively. Adverse events were recorded in 18% (n=90) of the 500 plateletpheresis donors, of which 9% were hypocalcaemic in nature, followed by hematoma (7.4%), vasovagal reaction (0.8%) and kit-related adverse events (0.8%). There was a statistically significant post-procedure drop in the donors' Hb, Hct and platelet count (p<0.0001), whereas the WBC count showed a statistically significant rise (p<0.0001). Divalent cations (iCa(+), TCa(+), TMg(+)) also showed a statistically significant decline after donation (p<0.0001). However, there was no statistically significant difference in adverse events between the Trima Accel and Amicus cell separators. Donor reactions can adversely affect voluntary donor recruitment strategies aimed at increasing public awareness of the constant need for blood and blood products. The commonly observed adverse events in plateletpheresis donors were hypocalcemia, hematoma formation and vasovagal reactions, which can be prevented by pre-donation education of donors and changes to machine configuration.
Nevertheless, more prospective studies on this aspect are required in order to establish guidelines for donor safety in apheresis and also to help in assessing donor suitability, especially given the present trend of double product apheresis collections. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. Coracoid process x-ray investigation before Latarjet procedure: a radioanatomic study.

    PubMed

    Bachy, Manon; Lapner, Peter L C; Goutallier, Daniel; Allain, Jérôme; Hernigou, Phillipe; Bénichou, Jacques; Zilber, Sébastien

    2013-12-01

    The purpose of this study was to determine whether a preoperative radiologic assessment of the coracoid process is predictive of the amount of bone available for coracoid transfer by the Latarjet procedure. Thirty-five patients with anterior instability undergoing a Latarjet procedure were included. A preoperative radiologic assessment was performed with the Bernageau and true anteroposterior (true AP) views. The length of the coracoid process was measured on both radiographic views and the values were compared with the length of the bone block during surgery. Statistical analysis was carried out by ANOVA and Wilcoxon tests (P < .05). On radiologic examination, the mean coracoid process length was 29 ± 4 and 33 ± 4 mm on the Bernageau and true AP views, respectively. The mean bone block length during surgery was 21.6 ± 2.7 mm. A significant correlation was found (P = .032) between the coracoid process length on the true AP view and the intraoperative bone block length. Preoperative planning for the Latarjet procedure, including graft orientation and screw placement, requires knowledge of the length of coracoid bone available for transfer. This can be facilitated with the use of preoperative standard radiographs, thus avoiding computed tomography. This planning allows the detection of coracoid process anatomic variations or the analysis of the remaining part of the coracoid process after failure of a first Latarjet procedure to avoid an iliac bone graft. Radiologic preoperative coracoid process measurement is an easy, reliable method to aid preoperative planning of the Latarjet procedure in primary surgery and reoperations. Copyright © 2013 Journal of Shoulder and Elbow Surgery Board of Trustees. All rights reserved.

  6. Plastic Surgery Statistics in the US: Evidence and Implications.

    PubMed

    Heidekrueger, Paul I; Juran, Sabrina; Patel, Anup; Tanna, Neil; Broer, P Niclas

    2016-04-01

    The American Society of Plastic Surgeons (ASPS) publishes yearly procedural statistics, collected through questionnaires and online via Tracking Operations and Outcomes for Plastic Surgeons (TOPS). The statistics, disaggregated by U.S. region, leave two important factors unaccounted for: (1) the underlying base population and (2) the number of surgeons performing the procedures. The presented analysis puts the regional distribution of surgeries into perspective and contributes to fulfilling the TOPS objectives. ASPS statistics from 2005 to 2013 were analyzed by geographic region in the U.S. Using 2010 population estimates from the U.S. Census Bureau, procedures were calculated per 100,000 population. Then, based on the ASPS member roster, the rate of surgeries per surgeon by region was calculated, and these two variables were related to each other. In 2013, 1,668,420 esthetic surgeries were performed in the U.S., resulting in the following ASPS ranking: 1st Mountain/Pacific (Region 5; 502,094 procedures, 30 % share), 2nd New England/Middle Atlantic (Region 1; 319,515, 19 %), 3rd South Atlantic (Region 3; 310,441, 19 %), 4th East/West South Central (Region 4; 274,282, 16 %), and 5th East/West North Central (Region 2; 262,088, 16 %). However, when the underlying populations are considered, the distribution and ranking differ, displaying a smaller variance in surgical demand. Further, the number of surgeons and the rate of procedures show great regional variation. Demand for plastic surgery is influenced by patients' geographic background and varies among U.S. regions. While ASPS data provide important information, additional insight regarding the demand for surgical procedures can be gained by taking certain demographic factors into consideration. This journal requires that the authors assign a level of evidence to each article. 
For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors www.springer.com/00266.

  7. Accuracy assessment: The statistical approach to performance evaluation in LACIE. [Great Plains corridor, United States

    NASA Technical Reports Server (NTRS)

    Houston, A. G.; Feiveson, A. H.; Chhikara, R. S.; Hsu, E. M. (Principal Investigator)

    1979-01-01

    A statistical methodology was developed to check the accuracy of the products of the experimental operations throughout crop growth and to determine whether the procedures are adequate to accomplish the desired accuracy and reliability goals. It has allowed the identification and isolation of key problems in wheat area yield estimation, some of which have been corrected and some of which remain to be resolved. The major unresolved problem in accuracy assessment is that of precisely estimating the bias of the LACIE production estimator. Topics covered include: (1) evaluation techniques; (2) variance and bias estimation for the wheat production estimate; (3) the 90/90 evaluation; (4) comparison of the LACIE estimate with reference standards; and (5) first and second order error source investigations.

  8. Statistical Analysis of Complexity Generators for Cost Estimation

    NASA Technical Reports Server (NTRS)

    Rowell, Ginger Holmes

    1999-01-01

    Predicting the cost of cutting edge new technologies involved with spacecraft hardware can be quite complicated. A new feature of the NASA Air Force Cost Model (NAFCOM), called the Complexity Generator, is being developed to model the complexity factors that drive the cost of space hardware. This parametric approach is also designed to account for the differences in cost, based on factors that are unique to each system and subsystem. The cost driver categories included in this model are weight, inheritance from previous missions, technical complexity, and management factors. This paper explains the Complexity Generator framework, the statistical methods used to select the best model within this framework, and the procedures used to find the region of predictability and the prediction intervals for the cost of a mission.

  9. A time series analysis performed on a 25-year period of kidney transplantation activity in a single center.

    PubMed

    Santori, G; Fontana, I; Bertocchi, M; Gasloli, G; Valente, U

    2010-05-01

    Following the example of many Western countries, where a "minimum volume rule" policy has been adopted as a quality parameter for complex surgical procedures, the Italian National Transplant Centre set the minimum number of kidney transplantation procedures per year at 30 per center. The number of procedures performed in a single center over a long period may be treated as a time series to evaluate trends, seasonal cycles, and nonsystematic fluctuations. Between January 1, 1983, and December 31, 2007, we performed 1376 procedures in adult or pediatric recipients from living or cadaveric donors. The greatest numbers of cases per year were performed in 1998 (n = 86), followed by 2004 (n = 82), 1996 (n = 75), and 2003 (n = 73). A time series analysis performed using R (R Foundation for Statistical Computing, Vienna, Austria), a free software environment for statistical computing and graphics, showed an overall incremental trend after exponential smoothing as well as after seasonal decomposition. However, starting from 2005, we observed a decreasing trend in the series. Holt-Winters exponential smoothing applied to the period 1983 to 2007 forecast 58 procedures for 2008, while in that year 52 were performed. The time series approach may be helpful in establishing a minimum volume per year at the single-center level. Copyright (c) 2010 Elsevier Inc. All rights reserved.
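
For a yearly series with no seasonal component, the Holt-Winters forecast mentioned in the abstract reduces to Holt's linear (trend) exponential smoothing. A minimal sketch, using illustrative yearly counts rather than the center's actual data, and assumed smoothing constants:

```python
def holt_forecast(y, alpha=0.8, beta=0.2, horizon=1):
    """Holt's linear exponential smoothing: level + trend, no seasonality."""
    level, trend = y[0], y[1] - y[0]          # simple initialization
    for obs in y[1:]:
        prev_level = level
        level = alpha * obs + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return level + horizon * trend            # h-step-ahead forecast

counts = [52, 58, 61, 66, 75, 86, 73, 82, 70, 64, 58]  # illustrative transplants/year
print(f"forecast for next year: {holt_forecast(counts):.1f}")
```

On a perfectly linear series the level and trend track the data exactly, so the one-step forecast continues the line; on noisy counts like the above it smooths the recent decline into the forecast.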

  10. Canine periodontal disease control using a clindamycin hydrochloride gel.

    PubMed

    Johnston, Thomas P; Mondal, Pravakar; Pal, Dhananjay; MacGee, Scott; Stromberg, Arnold J; Alur, Hemant

    2011-01-01

    Stabilizing or reducing periodontal pocket depth can have a positive influence on the retention of teeth in dogs. A topical 2% clindamycin hydrochloride gel (CHgel) was evaluated for the treatment of periodontal disease in dogs. The CHgel formulation provides for the sustained erosion of the matrix; it flows into the periodontal pocket as a viscous liquid and then rapidly forms a gel that has mucoadhesive properties and may also function as a physical barrier to the introduction of bacteria. A professional teeth cleaning procedure including scaling and root planing was performed in dogs, with one group receiving CHgel following treatment. Periodontal health was determined before and after the procedure, including measurement of periodontal pocket depth, gingival index, gingival bleeding sites, and number of suppurating sites. There was a statistically significant decrease in periodontal pocket depth (19%), gingival index (16%), and the number of bleeding sites (64%) at 90 days in dogs receiving CHgel. Additionally, the number of suppurating sites was lower (93%) at 90 days for the group receiving CHgel. The addition of CHgel effectively controlled the bacterial burden (e.g., Fusobacterium nucleatum) at both day 14 and day 90. Gingival cells in culture were shown to rapidly incorporate clindamycin and attain saturation in approximately 20 minutes. In summary, a professional teeth cleaning procedure including root planing, with the addition of CHgel, improves the gingival index and reduces periodontal pocket depth.

  11. Statistical Research of Investment Development of Russian Regions

    ERIC Educational Resources Information Center

    Burtseva, Tatiana A.; Aleshnikova, Vera I.; Dubovik, Mayya V.; Naidenkova, Ksenya V.; Kovalchuk, Nadezda B.; Repetskaya, Natalia V.; Kuzmina, Oksana G.; Surkov, Anton A.; Bershadskaya, Olga I.; Smirennikova, Anna V.

    2016-01-01

    This article is concerned with the substantiation of procedures ensuring the implementation of statistical research and monitoring of the investment development of the Russian regions, pertinent to the modern development of state statistics. The aim of the study is to develop the methodological framework in order to estimate…

  12. Decision Support Systems: Applications in Statistics and Hypothesis Testing.

    ERIC Educational Resources Information Center

    Olsen, Christopher R.; Bozeman, William C.

    1988-01-01

    Discussion of the selection of appropriate statistical procedures by educators highlights a study conducted to investigate the effectiveness of decision aids in facilitating the use of appropriate statistics. Experimental groups and a control group using a printed flow chart, a computer-based decision aid, and a standard text are described. (11…

  13. Use of Management Statistics in ARL Libraries. SPEC Kit #153.

    ERIC Educational Resources Information Center

    Vasi, John

    A Systems and Procedures Exchange Center (SPEC) survey conducted in 1986 investigated the collection and use of management statistics in Association of Research Libraries (ARL) member libraries, and SPEC Kit #134 (May 1987) summarized the kinds of statistics collected and the reasons given by the 91 respondents for collecting them. This more…

  14. Strategies Used by Students to Compare Two Data Sets

    ERIC Educational Resources Information Center

    Reaburn, Robyn

    2012-01-01

    One of the common tasks of inferential statistics is to compare two data sets. Long before formal statistical procedures, however, students can be encouraged to make comparisons between data sets and therefore build up intuitive statistical reasoning. Such tasks also give meaning to the data collection students may do. This study describes the…

  15. Statistics for People Who (Think They) Hate Statistics. Third Edition

    ERIC Educational Resources Information Center

    Salkind, Neil J.

    2007-01-01

    This text teaches an often intimidating and difficult subject in a way that is informative, personable, and clear. The author takes students through various statistical procedures, beginning with correlation and graphical representation of data and ending with inferential techniques and analysis of variance. In addition, the text covers SPSS, and…

  16. Item Analysis Appropriate for Domain-Referenced Classroom Testing. (Project Technical Report Number 1).

    ERIC Educational Resources Information Center

    Nitko, Anthony J.; Hsu, Tse-chi

    Item analysis procedures appropriate for domain-referenced classroom testing are described. A conceptual framework within which item statistics can be considered and promising statistics in light of this framework are presented. The sampling fluctuations of the more promising item statistics for sample sizes comparable to the typical classroom…

  17. A statistical study of atypical wave modes in the Earth's foreshock region

    NASA Astrophysics Data System (ADS)

    Hsieh, W.; Shue, J.; Lee, B.

    2010-12-01

    The Earth's foreshock, the region upstream of the Earth's bow shock, is filled with back-streaming particles and ultra-low-frequency waves. Three different wave modes have been identified in the region: 30-sec waves, 3-sec waves, and shocklets. Time History of Events and Macroscale Interactions during Substorms (THEMIS), a satellite mission that consists of five probes, provides multiple measurements of the Earth's foreshock region. The Hilbert-Huang transform (HHT) comprises empirical mode decomposition followed by instantaneous frequency calculation. In this study, we use the HHT to decompose intrinsic wave modes and perform a wave analysis of chaotic magnetic fields in the Earth's foreshock region. We find that some individual atypical wave modes other than the 30-sec and 3-sec waves appear in the region. In this presentation, we will show the statistical characteristics, such as wave frequency, wave amplitude, and wave polarization, of the atypical intrinsic wave modes with respect to different locations in the foreshock region and different solar wind conditions.
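
The instantaneous-frequency step of the HHT can be illustrated on a synthetic signal: form the analytic signal via the Hilbert transform, unwrap its phase, and differentiate. The sketch below uses a clean 30-second-period wave and a made-up sampling rate; a full HHT would first separate intrinsic modes by empirical mode decomposition:

```python
import numpy as np

fs = 4.0                                   # samples per second (assumed)
t = np.arange(0, 600, 1 / fs)              # 10 minutes of synthetic data
x = np.sin(2 * np.pi * t / 30.0)           # 30-second-period wave

# Analytic signal via FFT (equivalent to scipy.signal.hilbert)
n = x.size
spec = np.fft.fft(x)
h = np.zeros(n)
h[0] = 1.0
h[1:n // 2] = 2.0                          # double positive frequencies
h[n // 2] = 1.0                            # Nyquist bin (n is even here)
analytic = np.fft.ifft(spec * h)

phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) * fs / (2 * np.pi)   # instantaneous frequency in Hz

# Trim the ends before summarising to avoid edge effects
core = inst_freq[n // 10: -n // 10]
print(f"mean instantaneous period: {1.0 / core.mean():.1f} s")
```

For a real foreshock magnetic-field series, this step would be applied to each intrinsic mode function rather than to the raw signal.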

  18. Approach for Uncertainty Propagation and Robust Design in CFD Using Sensitivity Derivatives

    NASA Technical Reports Server (NTRS)

    Putko, Michele M.; Newman, Perry A.; Taylor, Arthur C., III; Green, Lawrence L.

    2001-01-01

    This paper presents an implementation of the approximate statistical moment method for uncertainty propagation and robust optimization for a quasi 1-D Euler CFD (computational fluid dynamics) code. Given uncertainties in statistically independent, random, normally distributed input variables, a first- and second-order statistical moment matching procedure is performed to approximate the uncertainty in the CFD output. Efficient calculation of both first- and second-order sensitivity derivatives is required. In order to assess the validity of the approximations, the moments are compared with statistical moments generated through Monte Carlo simulations. The uncertainties in the CFD input variables are also incorporated into a robust optimization procedure. For this optimization, statistical moments involving first-order sensitivity derivatives appear in the objective function and system constraints. Second-order sensitivity derivatives are used in a gradient-based search to successfully execute a robust optimization. The approximate methods used throughout the analyses are found to be valid when considering robustness about input parameter mean values.
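
First-order moment matching propagates independent input variances through the output's sensitivity derivatives: sigma_f^2 ≈ Σ (∂f/∂x_i)^2 sigma_i^2. A minimal sketch on a toy function standing in for the CFD output, with the derivatives written analytically where a real code would obtain them from sensitivity analysis, checked against Monte Carlo:

```python
import numpy as np

rng = np.random.default_rng(1)

mu = np.array([2.0, 1.0])       # input means (hypothetical)
sigma = np.array([0.1, 0.05])   # input standard deviations, independent inputs

def f(x):
    return x[0] ** 2 * x[1]     # toy stand-in for the CFD output

def grad_f(x):
    return np.array([2 * x[0] * x[1], x[0] ** 2])   # analytic sensitivities

# First-order moment propagation
var_f = np.sum(grad_f(mu) ** 2 * sigma ** 2)
print(f"first-order sigma_f = {np.sqrt(var_f):.4f}")

# Monte Carlo check of the approximation
samples = mu + sigma * rng.standard_normal((200_000, 2))
mc_sigma = f(samples.T).std()
print(f"Monte Carlo sigma_f  = {mc_sigma:.4f}")
```

The two estimates agree closely here because the toy function is nearly linear over the input spread; the paper's second-order terms matter when that is not the case.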

  19. Evaluation of efficacy of a bioresorbable membrane in the treatment of oral lichen planus

    PubMed Central

    Kapoor, Anoop; Sikri, Poonam; Grover, Vishakha; Malhotra, Ranjan; Sachdeva, Sonia

    2014-01-01

    Background: Gingival involvement is commonly seen in lichen planus, a chronic mucocutaneous inflammatory condition of the stratified squamous epithelia. It is often painful and may undergo malignant transformation and thus warrants early diagnosis and prompt treatment. The aim of this study is to evaluate the use of a bioresorbable membrane (Polyglactin 910) in the management of erosive lichen planus of gingiva. Materials and Methods: A split-mouth randomized controlled trial was carried out. Fifteen patients with identical bilateral lesions of lichen planus on gingiva were included in the study. Three parameters were selected for the clinical assessment of gingival lesions: Surface texture, color, and burning sensation. After complete oral prophylaxis, an excisional biopsy procedure was carried out for lesions on both sides, but on the experimental side, the biopsy procedure was combined with placement of the bioresorbable membrane. The statistical significance of intergroup differences in measurements was tested by using an independent sample t-test. A two-tailed P-value less than 0.05 was considered as statistically significant. Results: Intragroup comparisons revealed a statistically significant difference between mean value of grades at 6, 12, and 24 weeks in both groups for the surface texture, color, and burning sensation of gingiva, respectively. For intergroup comparison of change in surface texture, color, and burning sensation of gingiva between group A and group B, differences were statistically nonsignificant. Conclusion: Surgical management of the lesion accomplished significant improvement of lesion with no significant additional clinical benefits with the application of bioresorbable membrane. Worsening of baseline scores was not observed in any case at the end of the study. PMID:25097651

  20. Cochlear Implant Electrode Array From Partial to Full Insertion in Non-Human Primate Model.

    PubMed

    Manrique-Huarte, Raquel; Calavia, Diego; Gallego, Maria Antonia; Manrique, Manuel

    2018-04-01

    To determine the feasibility of progressive insertion (two sequential surgeries: partial to full insertion) of an electrode array and to compare functional outcomes. Eight normal-hearing animals (Macaca fascicularis, MF) were included. A 14-contact electrode array, suitably sized for the MF cochlea, was partially inserted (PI) in 16 ears. After 3 months of follow-up, at revision surgery the electrode was advanced to a full insertion (FI) in 8 ears. Radiological examination and auditory testing were performed monthly for 6 months. To compare the values, a two-way repeated-measures ANOVA was used; a p-value below 0.05 was considered statistically significant. IBM SPSS Statistics V20 was used. The surgical procedure was completed in all cases with no complications. The mean auditory threshold shift (ABR click tones) after 6 months of follow-up was 19 dB for the PI group and 27 dB for the FI group. For frequencies of 4, 6, 8, 12, and 16 kHz in the FI group, tone-burst auditory thresholds increased after the revision surgery and showed no recovery thereafter. The mean threshold shift at 6 months of follow-up was 19.8 dB (range 2 to 36 dB) for the PI group and 33.14 dB (range 8 to 48 dB) for the FI group. Statistical analysis yielded no significant differences between groups. It is feasible to perform a partial insertion of an electrode array and advance it in a second surgical stage to a full insertion (up to 270°). Hearing preservation is feasible with both procedures. Note that minimal threshold deterioration was observed in the full-insertion group, especially at high frequencies, with no statistical differences.

  1. The benefit of silicone stents in primary endonasal dacryocystorhinostomy: a systematic review and meta-analysis.

    PubMed

    Sarode, D; Bari, D A; Cain, A C; Syed, M I; Williams, A T

    2017-04-01

    To critically evaluate the evidence comparing success rates of endonasal dacryocystorhinostomy (EN-DCR) with and without silicone tubing and thus to determine whether silicone intubation is beneficial in primary EN-DCR. Systematic review and meta-analysis. A literature search was performed on AMED, EMBASE, HMIC, MEDLINE, PsycINFO, BNI, CINAHL, HEALTH BUSINESS ELITE, CENTRAL and the Cochrane Ear, Nose and Throat Disorders Group trials register using a combination of MeSH terms. The date of the last search was January 2016. This review was limited to randomised controlled trials (RCTs) in the English language. Risk of bias was assessed using the Cochrane Collaboration's risk of bias tool. Chi-square and I² statistics were calculated to determine the presence and extent of statistical heterogeneity. Study selection, data extraction and risk of bias scoring were performed independently by two authors in concordance with the PRISMA statement. Five RCTs (447 primary EN-DCR procedures in 426 patients) were included for analysis. Moderate interstudy statistical heterogeneity was demonstrated (χ² = 6.18; d.f. = 4; I² = 35%). Bicanalicular silicone stents were used in 229 procedures and not used in 218. The overall success rate of EN-DCR was 92.8% (415/447). The success rate was 93.4% (214/229) with silicone tubing and 92.2% (201/218) without. Meta-analysis using a random-effects model showed no statistically significant difference in outcomes between the two groups (P = 0.63; RR = 0.79; 95% CI = 0.3-2.06). Our review and meta-analysis did not demonstrate an additional advantage of silicone stenting. A high-quality, well-powered, prospective multicentre RCT is needed to further clarify the benefit of silicone stents. © 2016 John Wiley & Sons Ltd.
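
The reported heterogeneity figure follows Higgins' I² formula, I² = max(0, (Q - d.f.)/Q) × 100, applied to the chi-square statistic Q. A quick check using the values from the abstract:

```python
def i_squared(q, df):
    """Higgins' I² as a percentage, floored at 0 when Q < d.f."""
    return max(0.0, (q - df) / q) * 100.0

q, df = 6.18, 4            # chi-square and degrees of freedom from the abstract
print(f"I² = {i_squared(q, df):.0f}%")   # prints "I² = 35%", matching the report
```

Values around 35% are conventionally read as moderate heterogeneity, consistent with the review's characterisation.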

  2. The use of oral sucrose for procedural pain relief in infants up to six months of age: a randomized controlled trial.

    PubMed

    Wilson, Sally; Bremner, Alexandra P; Mathews, Judy; Pearson, Diane

    2013-12-01

    The aim of this study was to evaluate the effectiveness of oral sucrose in decreasing pain during minor procedures in infants of 1-6 months corrected age. A blinded randomized controlled trial was conducted with infants aged 4-26 weeks who underwent venipuncture, heel lance or intravenous cannulation, stratified by corrected age into >4-12 weeks and >12-26 weeks. They received 2 mL of either 25% sucrose or sterile water orally 2 minutes before the painful procedure. Nonnutritional sucking and parental comfort, provided in adherence to hospital guidelines, were recorded. Pain behavior was recorded using a validated 10-point scale at baseline, during, and following the procedure. Data collectors were blinded to the intervention. A total of 21 and 20 infants received sucrose and water, respectively, in the >4-12-week age group, and 21 and 22, respectively, in the >12-26-week age group. No statistical differences were found in pain scores between treatment and control groups at any data collection point in either age group. Infants aged >4-12 weeks who did nonnutritional sucking showed statistically significantly lower median pain scores at 1, 2, and 3 minutes after the procedure than those who did not suck. Infants aged >4-26 weeks exhibited pain behavior scores indicating moderate to large pain during painful procedures; however, there was insufficient evidence to show that 2 mL of 25% sucrose had a statistically significant effect in decreasing pain. Infants should be offered nonnutritional sucking in compliance with the Baby Friendly Health Initiative during painful procedures. Crown Copyright © 2013. Published by Elsevier Inc. All rights reserved.

  3. Systematic and fully automated identification of protein sequence patterns.

    PubMed

    Hart, R K; Royyuru, A K; Stolovitzky, G; Califano, A

    2000-01-01

    We present an efficient algorithm to systematically and automatically identify patterns in protein sequence families. The procedure is based on the Splash deterministic pattern discovery algorithm and on a framework to assess the statistical significance of patterns. We demonstrate its application to the fully automated discovery of patterns in 974 PROSITE families (the complete subset of PROSITE families which are defined by patterns and contain DR records). Splash generates patterns with better specificity and undiminished sensitivity, or vice versa, in 28% of the families; identical statistics were obtained in 48% of the families, worse statistics in 15%, and mixed behavior in the remaining 9%. In about 75% of the cases, Splash patterns identify sequence sites that overlap more than 50% with the corresponding PROSITE pattern. The procedure is sufficiently rapid to enable its use for daily curation of existing motif and profile databases. Furthermore, our results show that the statistical significance of discovered patterns correlates well with their biological significance. The trypsin subfamily of serine proteases is used to illustrate the method's ability to exhaustively discover all motifs in a family that are statistically and biologically significant. Finally, we discuss applications of sequence patterns to multiple sequence alignment and the training of more sensitive score-based motif models, akin to the procedure used by PSI-BLAST. All results are available at http://www.research.ibm.com/spat/.
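
The sensitivity/specificity comparison above rests on matching each pattern against labelled sequences. PROSITE-style patterns translate naturally to regular expressions, so the scoring step can be sketched directly; the motif and sequences below are toy data, and this is not the Splash algorithm itself:

```python
import re

def prosite_to_regex(pattern):
    """Naive translation of a simple PROSITE-style pattern to a regex.
    Handles separators, wildcards 'x', ambiguity [..], exclusion {..},
    and repeats (n); a toy converter, not a full PROSITE parser."""
    return (pattern.replace('-', '')
                   .replace('x', '.')
                   .replace('{', '[^').replace('}', ']')
                   .replace('(', '{').replace(')', '}'))

pattern = 'G-D-S-G-[GS]-P'     # toy serine-protease-like motif
regex = re.compile(prosite_to_regex(pattern))

family = ['MKGDSGGPLVA', 'AAGDSGSPKK']        # true members (toy data)
decoys = ['MKGDAGGPLVA', 'PLVAKKQQWW']        # non-members (toy data)

tp = sum(bool(regex.search(s)) for s in family)   # true positives
fp = sum(bool(regex.search(s)) for s in decoys)   # false positives
sensitivity = tp / len(family)
specificity = 1 - fp / len(decoys)
print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")
```

Scoring two candidate patterns this way against the same labelled set is what makes "better specificity and undiminished sensitivity" a well-defined comparison.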

  4. Functional outcome after the Hoffer procedure.

    PubMed

    Murabit, Amera; Gnarra, Maria; O'Grady, Kathleen; Morhart, Michael; Olson, Jaret L

    2013-06-01

    Children with obstetrical brachial plexus injury often develop an internal rotation and adduction contracture about the shoulder as a secondary deformity, resulting in an inability to externally rotate and abduct the shoulder. The Hoffer procedure is evaluated for its potential benefit in improving shoulder abduction and external rotation and its impact on activities of daily living. This is a retrospective review of patients treated in a brachial plexus injury clinic who underwent tendon transfer procedures about the shoulder. Preoperative and postoperative active movement and active range of motion were measured and recorded using the Mallet scale and the Active Movement Scale. Twenty patients were included in the study. The average age at time of surgery was 6.35 years. Thirteen patients had primary brachial plexus reconstructive surgery and four patients had concomitant wrist extension tendon transfer procedures. All patients had full passive range of motion preoperatively. The average follow-up period was 25.45 months. Average differences in pre-Hoffer and post-Hoffer Mallet scale scores are as follows: active abduction, 1.20; external rotation, 1.35; hand-to-neck, 1.25; hand-to-back, 0.75; hand-to-mouth, 0.65; and aggregate score, 5.20 (p<0.001 for all). Average differences in relevant pre-Hoffer and post-Hoffer Active Movement Scale scores are as follows: shoulder abduction, 2.10; shoulder external rotation, 4.25; and shoulder internal rotation, -0.80. All patients maintained full range of motion passively; thus, no functional loss was experienced. These results showed very high statistical significance (p<0.001 for all) and clinical significance. Younger patients (≤6 years) and those with better preoperative shoulder flexion and shoulder internal rotation yielded better postoperative results. 
The Hoffer procedure provides clinically and statistically significant improvement in external rotation and abduction while preserving functional internal rotation range in the child with obstetrical brachial plexus palsy and secondary shoulder deformity. Therapeutic, IV.

  5. Use of repeat anterior maxillary distraction to correct residual midface hypoplasia in cleft patients

    PubMed Central

    2017-01-01

    Objectives The study was designed to evaluate the efficacy of performing a second, repeat anterior maxillary distraction (AMD) to treat residual cleft maxillary hypoplasia. Materials and Methods Five patients between 12 and 15 years of age with a history of AMD and with residual cleft maxillary hypoplasia were included in the study. Inclusion was irrespective of gender, type of cleft lip and palate, and the amount of advancement needed. Repeat AMD was executed in these patients 4 to 5 years after the primary AMD procedure to correct the cleft maxillary hypoplasia that had developed since the initial procedure. Orthopantomograms (OPG) and lateral cephalograms were taken for evaluation preoperatively, immediately after distraction, after consolidation, and one year postoperatively. The data obtained were tabulated, and a Mann-Whitney U-test was used for statistical comparisons. Results At the time of presentation, residual maxillary hypoplasia was observed with a well-maintained distraction gap on the OPG, which ruled out the occurrence of a relapse. Favorable movement of the segments without any resistance was seen in all patients. Mean maxillary advancement of 10.56 mm was achieved at repeat AMD. Statistically significant increases in midfacial length, SNA angle, and nasion perpendicular to point A distance were achieved (P=0.012, P=0.011, and P=0.012, respectively). A good profile was achieved for all patients. Minor transient complications, such as anterior open bite and bleeding episodes, were managed. Conclusion Addressing the problem of cleft maxillary hypoplasia at an early age (12–15 years) is beneficial for the child. Residual hypoplasia may develop in some patients, which may require additional corrective procedures. The results of our study show that AMD can be repeated when residual deformity develops, with the previous procedure having no negative impact on the results of the repeat procedure. PMID:29333371
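
    The cephalometric comparisons above rely on the Mann-Whitney U-test. A minimal sketch of how such a comparison runs in scipy — the SNA angle values below are invented for illustration, not the study's data:

```python
import numpy as np
from scipy.stats import mannwhitneyu

# hypothetical pre- and post-distraction SNA angles (degrees); illustrative only
pre = np.array([74.0, 75.5, 73.0, 76.0, 74.5])
post = np.array([79.0, 80.5, 78.5, 81.0, 79.5])

# one-sided test: are post-distraction angles stochastically larger?
stat, p = mannwhitneyu(post, pre, alternative="greater")
```

    With samples this small and no ties, scipy computes the exact permutation p-value rather than a normal approximation.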

  6. Estimation of descriptive statistics for multiply censored water quality data

    USGS Publications Warehouse

    Helsel, Dennis R.; Cohn, Timothy A.

    1988-01-01

    This paper extends the work of Gilliom and Helsel (1986) on procedures for estimating descriptive statistics of water quality data that contain “less than” observations. Previously, procedures were evaluated when only one detection limit was present. Here we investigate the performance of estimators for data that have multiple detection limits. Probability plotting and maximum likelihood methods perform substantially better than the simple substitution procedures now commonly in use; simple substitution (e.g., substitution of the detection limit) should therefore be avoided. Probability plotting methods are more robust than maximum likelihood methods to misspecification of the parent distribution, and their use should be encouraged in the typical situation where the parent distribution is unknown. When utilized correctly, “less than” values frequently contain nearly as much information for estimating population moments and quantiles as would the same observations had the detection limit been below them.
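
    The contrast drawn above can be sketched in a few lines: a lognormal maximum likelihood fit that lets each “less than” value contribute the probability of falling below its own detection limit, versus naive substitution of the limit. The data, distribution, and limits are assumptions for illustration, not the paper's datasets:

```python
import numpy as np
from scipy import stats
from scipy.optimize import minimize

rng = np.random.default_rng(0)
true = rng.lognormal(mean=1.0, sigma=0.75, size=500)   # "true" concentrations
limits = np.where(np.arange(500) % 2 == 0, 1.0, 2.0)   # two detection limits
censored = true < limits                               # reported only as "< limit"

# Simple substitution: replace each censored value with its detection limit
substituted = np.where(censored, limits, true)

# Maximum likelihood: detected values contribute the (log-scale) density,
# censored values contribute the log-probability of lying below their limit.
# (The Jacobian term of the lognormal density is constant in (mu, sigma),
# so it is dropped; the MLE is unaffected.)
def negloglik(params):
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    obs = stats.norm.logpdf(np.log(true[~censored]), mu, sigma)
    cens = stats.norm.logcdf(np.log(limits[censored]), mu, sigma)
    return -(obs.sum() + cens.sum())

res = minimize(negloglik, x0=[0.0, 1.0], method="Nelder-Mead")
mu_hat, sigma_hat = res.x
```

    Substitution at the detection limit necessarily inflates the mean, while the censored-likelihood fit recovers the generating parameters.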

  7. Screen and clean: a tool for identifying interactions in genome-wide association studies.

    PubMed

    Wu, Jing; Devlin, Bernie; Ringquist, Steven; Trucco, Massimo; Roeder, Kathryn

    2010-04-01

    Epistasis could be an important source of risk for disease. How interacting loci might be discovered is an open question for genome-wide association studies (GWAS). Most researchers limit their statistical analyses to testing individual pairwise interactions (i.e., marginal tests for association). A more effective means of identifying important predictors is to fit models that include many predictors simultaneously (i.e., higher-dimensional models). We explore a procedure called screen and clean (SC) for identifying liability loci, including interactions, by using the lasso procedure, which is a model selection tool for high-dimensional regression. We approach the problem by using a varying dictionary consisting of terms to include in the model. In the first step the lasso dictionary includes only main effects. The most promising single-nucleotide polymorphisms (SNPs) are identified using a screening procedure. Next the lasso dictionary is adjusted to include these main effects and the corresponding interaction terms. Again, promising terms are identified using lasso screening. Then significant terms are identified through the cleaning process. Implementation of SC for GWAS requires algorithms to explore the complex model space induced by the many SNPs genotyped and their interactions. We propose and explore a set of algorithms and find that SC successfully controls Type I error while yielding good power to identify risk loci and their interactions. When the method is applied to data obtained from the Wellcome Trust Case Control Consortium study of Type 1 Diabetes it uncovers evidence supporting interaction within the HLA class II region as well as within Chromosome 12q24.
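
    The two-stage "varying dictionary" idea — screen main effects with the lasso, then expand the dictionary with interactions of the survivors — can be sketched with scikit-learn on synthetic genotypes. The data, effect sizes, and penalty below are assumptions for illustration, not the authors' implementation:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
n, p = 400, 50
X = rng.binomial(2, 0.3, size=(n, p)).astype(float)    # SNP minor-allele counts
# true model: main effects at SNPs 0 and 1 plus their interaction
y = X[:, 0] + X[:, 1] + 1.5 * X[:, 0] * X[:, 1] + rng.normal(0, 1, n)

# Step 1: screen -- the lasso dictionary holds main effects only
screened = np.flatnonzero(Lasso(alpha=0.1).fit(X, y).coef_)

# Step 2: expand the dictionary with pairwise interactions of screened SNPs, refit
pairs = [(i, j) for k, i in enumerate(screened) for j in screened[k + 1:]]
X2 = np.column_stack([X[:, screened]] + [X[:, i] * X[:, j] for i, j in pairs])
coef2 = Lasso(alpha=0.1).fit(X2, y).coef_
# (the "clean" step would then re-test the surviving terms on held-out data)
```

    With a strong simulated interaction, the screening stage retains the two causal SNPs and the second-stage fit assigns their interaction a clearly nonzero coefficient.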

  8. Comparison of revision surgeries for one- to two-level cervical TDR and ACDF from 2002 to 2011.

    PubMed

    Nandyala, Sreeharsha V; Marquez-Lara, Alejandro; Fineberg, Steven J; Singh, Kern

    2014-12-01

    Cervical total disc replacement (TDR) and anterior cervical discectomy and fusion (ACDF) provide comparable outcomes for degenerative cervical pathology. However, revisions of these procedures are not well characterized. The purpose of this study is to examine the rates, epidemiology, perioperative complications, and costs of the revision procedures and to compare these outcomes with those of primary cases. This study is a retrospective database analysis. A total of 3,792 revision and 183,430 primary cases from the Nationwide Inpatient Sample (NIS) database from 2002 to 2011 were included. Outcome measures included the incidence of revision cases, patient demographics, length of stay (LOS), in-hospital costs, mortality, and perioperative complications. Patients who underwent revision for either one- to two-level cervical TDR or ACDF were identified. SPSS v.20 was used for statistical analysis, with the χ² test for categorical data and the independent-sample t test for continuous data. The relative risk for perioperative complications with revisions was calculated in comparison with primary cases using a 95% confidence interval. An alpha level of less than 0.05 denoted statistical significance. There were 3,536 revision one- to two-level ACDFs and 256 revision cervical TDRs recorded in the NIS database from 2002 to 2011. The revision cervical TDR cohort demonstrated a significantly greater LOS (3.18 vs. 2.25, p<.001), cost ($16,998 vs. $15,222, p=.03), and incidence of perioperative wound infections (13.6 vs. 5.3 per 1,000, p<.001) compared with the ACDF revision cohort. There were no differences in mortality between the revision surgical cohorts. Compared with primary cases, both revision cohorts demonstrated a significantly greater LOS and cost. Furthermore, patients who underwent revision demonstrated a greater incidence of and risk for perioperative wound infections, hematomas, dysphagia, and neurologic complications relative to the primary procedures.
This study demonstrated a significantly greater incidence of perioperative wound infection, LOS, and costs associated with a TDR revision compared with a revision ACDF. We propose that these differences are by virtue of the inherently more invasive nature of revising TDRs. In addition, compared with primary cases, revision procedures are associated with greater costs, LOS, and complications including wound infections, dysphagia, hematomas, and neurologic events. These additional risks must be considered before opting for a revision procedure. Copyright © 2014 Elsevier Inc. All rights reserved.
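
    The relative-risk-with-95%-CI comparison this study describes follows the standard log-scale normal approximation. A minimal sketch — the counts below are invented for illustration, not the NIS figures:

```python
import math

def relative_risk(a, n1, b, n2):
    """RR of an event in an exposed group (a of n1) vs a reference group (b of n2),
    with a 95% CI from the usual normal approximation on the log scale."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo, hi = (math.exp(math.log(rr) + z * se) for z in (-1.96, 1.96))
    return rr, lo, hi

# hypothetical counts: 14 wound infections in 1,030 revisions vs 5 in 950 primaries
rr, lo, hi = relative_risk(14, 1030, 5, 950)
```

    A CI whose lower bound crosses 1 would indicate the elevated risk is not statistically significant at the 0.05 level.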

  9. A Bayesian statistical analysis of mouse dermal tumor promotion assay data for evaluating cigarette smoke condensate.

    PubMed

    Kathman, Steven J; Potts, Ryan J; Ayres, Paul H; Harp, Paul R; Wilson, Cody L; Garner, Charles D

    2010-10-01

    The mouse dermal assay has long been used to assess the dermal tumorigenicity of cigarette smoke condensate (CSC). This mouse skin model has been developed for use in carcinogenicity testing utilizing the SENCAR mouse as the standard strain. Though the model has limitations, it remains the most relevant method available to study the dermal tumor promoting potential of mainstream cigarette smoke. In the typical SENCAR mouse CSC bioassay, CSC is applied for 29 weeks following the application of a tumor initiator such as 7,12-dimethylbenz[a]anthracene (DMBA). Several endpoints are considered for analysis, including the percentage of animals with at least one mass, latency, and the number of masses per animal. In this paper, a relatively straightforward analytic model and procedure are presented for analyzing the time course of the incidence of masses. The procedure considered here takes advantage of Bayesian statistical techniques, which provide powerful methods for model fitting and simulation. Two datasets are analyzed to illustrate how the model fits the data, how well the model may perform in predicting data from such trials, and how the model may be used as a decision tool when comparing the dermal tumorigenicity of cigarette smoke condensate from multiple cigarette types. The analysis presented here was developed as a statistical decision tool for differentiating between two or more prototype products based on their dermal tumorigenicity. Copyright (c) 2010 Elsevier Inc. All rights reserved.
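
    The paper's model tracks the full time course of mass incidence; as a much-reduced sketch of the Bayesian decision idea, a conjugate beta-binomial comparison of end-of-study incidence between two condensates is shown below. The counts and the flat prior are assumptions for illustration only:

```python
import numpy as np
from scipy import stats

# hypothetical end-of-study mass incidence for two condensates (invented counts)
a_masses, a_n = 18, 40
b_masses, b_n = 9, 40

# Beta(1, 1) prior -> Beta posterior for each incidence probability
post_a = stats.beta(1 + a_masses, 1 + a_n - a_masses)
post_b = stats.beta(1 + b_masses, 1 + b_n - b_masses)

# posterior probability that condensate A is the more tumorigenic, by simulation
rng = np.random.default_rng(6)
prob_a_higher = (post_a.rvs(100_000, random_state=rng)
                 > post_b.rvs(100_000, random_state=rng)).mean()
```

    The posterior probability itself is the decision quantity: a value near 1 differentiates the two prototypes, while a value near 0.5 says the assay cannot.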

  10. Clinical evaluation of subepithelial connective tissue graft and guided tissue regeneration for treatment of Miller’s class 1 gingival recession (comparative, split mouth, six months study)

    PubMed Central

    Bhavsar, Neeta-V.; Dulani, Kirti; Trivedi, Rahul

    2014-01-01

    Objectives: The present study aims to clinically compare and evaluate subepithelial connective tissue graft and the GTR based root coverage in treatment of Miller’s Class I gingival recession. Study Design: 30 patients with at least one pair of Miller’s Class I gingival recession were treated either with Subepithelial connective tissue graft (Group A) or Guided tissue regeneration (Group B). Clinical parameters monitored included recession RD, width of keratinized gingiva (KG), probing depth (PD), clinical attachment level (CAL), attached gingiva (AG), residual probing depth (RPD) and % of Root coverage(%RC). Measurements were taken at baseline, three months and six months. A standard surgical procedure was used for both Group A and Group B. Data were recorded and statistical analysis was done for both intergroup and intragroup. Results: At end of six months % RC obtained were 84.47% (Group A) and 81.67% (Group B). Both treatments resulted in statistically significant improvement in clinical parameters. When compared, no statistically significant difference was found between both groups except in RPD, where it was significantly greater in Group A. Conclusions: GTR technique has advantages over subepithelial connective tissue graft for shallow Miller’s Class I defects and this procedure can be used to avoid patient discomfort and reduce treatment time. Key words:Collagen membrane, comparative split mouth study, gingival recession, subepithelial connective tissue graft, guided tissue regeneration (GTR). PMID:25136420

  11. Application of multivariate statistical techniques in microbial ecology

    PubMed Central

    Paliy, O.; Shankar, V.

    2016-01-01

    Recent advances in high-throughput methods of molecular analyses have led to an explosion of studies generating large scale ecological datasets. Especially noticeable effect has been attained in the field of microbial ecology, where new experimental approaches provided in-depth assessments of the composition, functions, and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces large amounts of data, powerful statistical techniques of multivariate analysis are well suited to analyze and interpret these datasets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular dataset. In this review we describe and compare the most widely used multivariate statistical techniques including exploratory, interpretive, and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and dataset structure. PMID:26786791
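
    As a toy example of one exploratory procedure from this family, the sketch below runs a principal components ordination of a Hellinger-transformed abundance table. The community matrix is synthetic and the two "community types" are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
# toy abundance table: 20 samples x 8 taxa, drawn from two community types
base = np.array([[10, 8, 1, 1, 5, 2, 0, 3],
                 [1, 2, 9, 10, 4, 1, 6, 0]], float)
groups = np.repeat([0, 1], 10)
counts = rng.poisson(base[groups])

# Hellinger transform (sqrt of relative abundance), a common pre-PCA step in ecology
rel = counts / counts.sum(axis=1, keepdims=True)
hell = np.sqrt(rel)

# PCA via SVD of the column-centered matrix
centered = hell - hell.mean(axis=0)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
scores = U * s                      # sample coordinates on the principal axes
explained = s**2 / (s**2).sum()     # fraction of variance per axis
```

    With group structure this strong, the first axis separates the two community types and carries most of the variance — the kind of pattern an ordination plot is meant to reveal.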

  12. Evidence-based value of subcutaneous surgical wound drainage: the largest systematic review and meta-analysis.

    PubMed

    Kosins, Aaron M; Scholz, Thomas; Cetinkaya, Mine; Evans, Gregory R D

    2013-08-01

    The purpose of this study was to determine the evidence-based value of prophylactic drainage of subcutaneous wounds in surgery. An electronic search was performed. Articles comparing subcutaneous prophylactic drainage with no drainage were identified and classified by level of evidence. If sufficient randomized controlled trials were included, a meta-analysis was performed using the random-effects model. Fifty-two randomized controlled trials were included in the meta-analysis, and subgroups were determined by specific surgical procedures or characteristics (cesarean delivery, abdominal wound, breast reduction, breast biopsy, femoral wound, axillary lymph node dissection, hip and knee arthroplasty, obesity, and clean-contaminated wound). Studies were compared for the following endpoints: hematoma, wound healing issues, seroma, abscess, and infection. Fifty-two studies with a total of 6930 operations were identified as suitable for this analysis. There were 3495 operations in the drain group and 3435 in the no-drain group. Prophylactic subcutaneous drainage offered a statistically significant advantage only for (1) prevention of hematomas in breast biopsy procedures and (2) prevention of seromas in axillary node dissections. In all other procedures studied, drainage did not offer an advantage. Many surgical operations can be performed safely without prophylactic drainage. Surgeons can consider omitting drains after cesarean section, breast reduction, abdominal wounds, femoral wounds, and hip and knee joint replacement. Furthermore, surgeons should consider not placing drains prophylactically in obese patients. However, drain placement following a surgical procedure is the surgeon's choice and can be based on multiple factors beyond the type of procedure being performed or the patient's body habitus. Therapeutic, II.

  13. Nitrous Oxide 70% for Procedural Analgosedation in a Pediatric Emergency Department With or Without Intranasal Fentanyl?: Analgesic Efficacy and Adverse Events if Combined With Intranasal Fentanyl.

    PubMed

    Seiler, Michelle; Landolt, Markus A; Staubli, Georg

    2017-07-03

    Nitrous oxide 70% (N2O 70%) is an excellent medication for procedural analgosedation in a pediatric emergency department. However, its analgesic efficacy remains uncertain for painful procedures; therefore, a combination with intranasal fentanyl (INF), an opioid, was suggested. This study aimed at observing and assessing the analgesic efficacy and rate of adverse events using N2O 70% with and without INF. Children who received N2O 70% in a tertiary children's hospital emergency department from January 1, 2014 to June 30, 2015 were included in this observational study with prospective data collection. Physicians decided individually whether INF was administered. Medical staff documented the child's behavior during the procedure, adverse events, and satisfaction rate. A total of 442 children were included; 206 (46.6%) received INF. Group differences regarding patient behavior were not statistically significant; however, N2O 70% application time was longer in the INF group (P = .02). Nausea was the most frequent adverse event, with 13.1% in the INF group versus 8.1% without INF. Inadequate procedural analgosedation was documented only in the INF group, affecting 1.8% of all patients (P = .002). In contrast, anxiety was exclusively observed in the group without INF, which was presumably misjudged pain (P = .03); the satisfaction rate in the INF group was 95.6% compared with 98.7% without INF. Because of the study design and limitations, no conclusions about adding INF to N2O 70% can be made. Additional research is needed to investigate the effect of combining N2O 70% with INF.

  14. Economic analysis of the future growth of cosmetic surgery procedures.

    PubMed

    Liu, Tom S; Miller, Timothy A

    2008-06-01

    The economic growth of cosmetic surgical and nonsurgical procedures has been tremendous. Between 1992 and 2005, annual U.S. cosmetic surgery volume increased by 725 percent, with over $10 billion spent in 2005. It is unknown whether this growth will continue for the next decade and, if so, what impact it will have on the plastic surgeon workforce. The authors analyzed annual U.S. cosmetic surgery procedure volume reported by the American Society of Plastic Surgeons (ASPS) National Clearinghouse of Plastic Surgery Statistics between 1992 and 2005. Reconstructive plastic surgery volume was not included in the analysis. The authors analyzed the ability of economic and noneconomic variables to predict annual cosmetic surgery volume. The authors also used growth rate analyses to construct models with which to predict the future growth of cosmetic surgery. None of the economic and noneconomic variables was a significant predictor of annual cosmetic surgery volume. Instead, based on current compound annual growth rates, the authors predict that total cosmetic surgery volume (surgical and nonsurgical) will exceed 55 million annual procedures by 2015. ASPS members are projected to perform 299 surgical and 2165 nonsurgical annual procedures. Non-ASPS members are projected to perform 39 surgical and 1448 nonsurgical annual procedures. If current growth rates continue into the next decade, the future demand in cosmetic surgery will be driven largely by nonsurgical procedures. The growth of surgical procedures will be met by ASPS members. However, meeting the projected growth in nonsurgical procedures could be a potential challenge and a potential area for increased competition.

  15. Learning curves for urological procedures: a systematic review.

    PubMed

    Abboudi, Hamid; Khan, Mohammed Shamim; Guru, Khurshid A; Froghi, Saied; de Win, Gunter; Van Poppel, Hendrik; Dasgupta, Prokar; Ahmed, Kamran

    2014-10-01

    To determine the number of cases a urological surgeon must complete to achieve proficiency for various urological procedures. The MEDLINE, EMBASE and PsycINFO databases were systematically searched for studies published up to December 2011. Studies pertaining to learning curves of urological procedures were included. Two reviewers independently identified potentially relevant articles. Procedure name, statistical analysis, procedure setting, number of participants, outcomes and learning curves were analysed. Forty-four studies described the learning curve for different urological procedures. The learning curve for open radical prostatectomy ranged from 250 to 1000 cases and for laparoscopic radical prostatectomy from 200 to 750 cases. The learning curve for robot-assisted laparoscopic prostatectomy (RALP) has been reported to be a minimum of 40 procedures. Robot-assisted radical cystectomy has a documented learning curve of 16-30 cases, depending on which outcome variable is measured. Irrespective of previous laparoscopic experience, there is a significant reduction in operating time (P = 0.008), estimated blood loss (P = 0.008) and complication rates (P = 0.042) after 100 RALPs. The available literature can act as a guide to the learning curves of trainee urologists. Although the learning curve may vary among individual surgeons, a consensus should exist for the minimum number of cases to achieve proficiency. The complexities associated with defining procedural competence are vast. The majority of learning curve trials have focused on the latest surgical techniques and there is a paucity of data pertaining to basic urological procedures. © 2013 The Authors. BJU International © 2013 BJU International.

  16. A review of mammalian carcinogenicity study design and potential effects of alternate test procedures on the safety evaluation of food ingredients.

    PubMed

    Hayes, A W; Dayan, A D; Hall, W C; Kodell, R L; Williams, G M; Waddell, W D; Slesinski, R S; Kruger, C L

    2011-06-01

    Extensive experience in conducting long term cancer bioassays has been gained over the past 50 years of animal testing on drugs, pesticides, industrial chemicals, food additives and consumer products. Testing protocols for the conduct of carcinogenicity studies in rodents have been developed in Guidelines promulgated by regulatory agencies, including the US EPA (Environmental Protection Agency), the US FDA (Food and Drug Administration), the OECD (Organization for Economic Co-operation and Development) for the EU member states, and the MAFF (Ministry of Agriculture, Forestry and Fisheries) and MHW (Ministry of Health and Welfare) in Japan. The basis of critical elements of the study design that lead to an accepted identification of the carcinogenic hazard of substances in food and beverages is the focus of this review. The approaches used by entities well-known for carcinogenicity testing and/or guideline development are discussed. Particular focus is placed on comparison of testing programs used by the US National Toxicology Program (NTP) and advocated in OECD guidelines to the testing programs of the European Ramazzini Foundation (ERF), an organization with numerous published carcinogenicity studies. This focus allows for a good comparison of differences in approaches to carcinogenicity testing and allows for a critical consideration of elements important to appropriate carcinogenicity study designs and practices. OECD protocols serve as good standard models for carcinogenicity testing protocol design. Additionally, the detailed design of any protocol should include attention to the rationale for inclusion of particular elements, including the impact of those elements on study interpretations. Appropriate interpretation of study results is dependent on rigorous evaluation of the study design and conduct, including differences from standard practices. 
Important considerations are differences in the strain of animal used, diet and housing practices, rigorousness of test procedures, dose selection, histopathology procedures, application of historical control data, statistical evaluations, and whether statistical extrapolations are supported by, or are beyond the limits of, the data generated. Without due consideration, conflicting data interpretations and uncertainty about the relevance of a study's results to human risk can result. This paper discusses the critical elements of rodent (rat) carcinogenicity studies, particularly with respect to the study of food ingredients. It also highlights study practices and procedures that can detract from the appropriate evaluation of the human relevance of results, indicating the importance of adherence to international consensus protocols, such as those detailed by the OECD. Copyright © 2010. Published by Elsevier Inc.

  17. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR SAMPLING WEIGHT CALCULATION (IIT-A-9.0)

    EPA Science Inventory

    The purpose of this SOP is to describe the procedures undertaken to calculate sampling weights. The sampling weights are needed to obtain weighted statistics of the study data. This SOP uses data that have been properly coded and certified with appropriate QA/QC procedures by th...

  18. Assessing the Item Response Theory with Covariate (IRT-C) Procedure for Ascertaining Differential Item Functioning

    ERIC Educational Resources Information Center

    Tay, Louis; Vermunt, Jeroen K.; Wang, Chun

    2013-01-01

    We evaluate the item response theory with covariates (IRT-C) procedure for assessing differential item functioning (DIF) without preknowledge of anchor items (Tay, Newman, & Vermunt, 2011). This procedure begins with a fully constrained baseline model, and candidate items are tested for uniform and/or nonuniform DIF using the Wald statistic.…

  19. Evaluation on the use of cerium in the NBL Titrimetric Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zebrowski, J.P.; Orlowicz, G.J.; Johnson, K.D.

    An alternative to potassium dichromate as the titrant in the New Brunswick Laboratory Titrimetric Method for uranium analysis was sought, since chromium in the waste makes disposal difficult. Substitution of a ceric-based titrant was statistically evaluated. Analysis of the data indicated statistically equivalent precisions for the two methods, but a significant overall bias of +0.035% for the ceric titrant procedure. The cause of the bias was investigated, alterations to the procedure were made, and a second statistical study was performed. This second study revealed no statistically significant bias, nor any analyst-to-analyst variation in the ceric titration procedure. A statistically significant day-to-day variation was detected, but this was physically small (0.015%) and was only detected because of the within-day precision of the method. The mean and standard deviation of the %RD for a single measurement were found to be 0.031%. A comparison with quality control blind dichromate titration data again indicated similar overall precision. The effects of ten elements (Co, Ti, Cu, Ni, Na, Mg, Gd, Zn, Cd, and Cr) on the ceric titration's performance were determined; in previous work at NBL these impurities did not interfere with the potassium dichromate titrant. This study indicated similar results for the ceric titrant, with the exception of Ti. All the elements (excluding Ti and Cr) caused no statistically significant bias in uranium measurements at levels of 10 mg impurity per 20-40 mg uranium. The presence of Ti was found to cause a bias of −0.05%; this is attributed to the presence of sulfate ions, resulting in precipitation of titanium sulfate and occlusion of uranium. A statistically significant negative bias of 0.012% was also observed in the samples containing chromium impurities.

  20. Ultraclean air for prevention of postoperative infection after posterior spinal fusion with instrumentation: a comparison between surgeries performed with and without a vertical exponential filtered air-flow system.

    PubMed

    Gruenberg, Marcelo F; Campaner, Gustavo L; Sola, Carlos A; Ortolan, Eligio G

    2004-10-15

    This study retrospectively compared infection rates between adult patients after posterior spinal instrumentation procedures performed in a conventional versus an ultraclean air operating room. To evaluate whether the use of ultraclean air technology could decrease the infection rate after posterior spinal arthrodesis with instrumentation. Postoperative wound infection after posterior arthrodesis remains a feared complication in spinal surgery. Although this frequent complication results in a significant problem, the employment of ultraclean air technology, as it is commonly used for arthroplasty, has not been reported as a possible alternative to reduce the infection rate after complex spine surgery. One hundred seventy-nine patients having posterior spinal fusion with instrumentation were divided into 2 groups: group I included 139 patients operated on in a conventional operating room, and group II included 40 patients operated on in a vertical laminar flow operating room. Patient selection favored ultraclean air technology for elective cases in which a high infection risk was considered. A statistical analysis of the infection rate and its associated risk factors between the groups was performed. We observed 18 wound infections in group I and 0 in group II. Comparison of infection rates using the chi-squared test showed a statistically significant difference (P < 0.017). The use of ultraclean air technology reduced the infection rate after complex spinal procedures and appears to be an interesting alternative that still needs to be prospectively studied with a randomized protocol.
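
    The 18/139 vs. 0/40 comparison above can be reproduced with a standard Pearson χ² test on a 2×2 table; a minimal scipy sketch using the counts reported in the abstract (passing correction=False gives the uncorrected statistic, which is consistent with the reported P < 0.017):

```python
import numpy as np
from scipy.stats import chi2_contingency

# infections / no infections: conventional OR (group I) vs. laminar-flow OR (group II)
table = np.array([[18, 121],
                  [0, 40]])
chi2, p, dof, expected = chi2_contingency(table, correction=False)
```

    With a zero cell and a small second group, Fisher's exact test (`scipy.stats.fisher_exact`) would be the more conservative choice here.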

  1. Relevance of the c-statistic when evaluating risk-adjustment models in surgery.

    PubMed

    Merkow, Ryan P; Hall, Bruce L; Cohen, Mark E; Dimick, Justin B; Wang, Edward; Chow, Warren B; Ko, Clifford Y; Bilimoria, Karl Y

    2012-05-01

    The measurement of hospital quality based on outcomes requires risk adjustment. The c-statistic is a popular tool used to judge model performance, but can be limited, particularly when evaluating specific operations in focused populations. Our objectives were to examine the interpretation and relevance of the c-statistic when used in models with increasingly similar case mix and to consider an alternative perspective on model calibration based on a graphical depiction of model fit. From the American College of Surgeons National Surgical Quality Improvement Program (2008-2009), patients were identified who underwent a general surgery procedure, and procedure groups were increasingly restricted: colorectal-all, colorectal-elective cases only, and colorectal-elective cancer cases only. Mortality and serious morbidity outcomes were evaluated using logistic regression-based risk adjustment, and model c-statistics and calibration curves were used to compare model performance. During the study period, 323,427 general, 47,605 colorectal-all, 39,860 colorectal-elective, and 21,680 colorectal cancer patients were studied. Mortality ranged from 1.0% in general surgery to 4.1% in the colorectal-all group, and serious morbidity ranged from 3.9% in general surgery to 12.4% in the colorectal-all procedural group. As case mix was restricted, c-statistics progressively declined from the general to the colorectal cancer surgery cohorts for both mortality and serious morbidity (mortality: 0.949 to 0.866; serious morbidity: 0.861 to 0.668). Calibration was evaluated graphically by examining predicted vs observed number of events over risk deciles. For both mortality and serious morbidity, there was no qualitative difference in calibration identified between the procedure groups. 
In the present study, we demonstrate how the c-statistic can become less informative and, in certain circumstances, can lead to incorrect model-based conclusions as case mix is restricted and patients become more homogeneous. Although it remains an important tool, caution is advised when the c-statistic is advanced as the sole measure of model performance. Copyright © 2012 American College of Surgeons. All rights reserved.
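
    The study's central point — that discrimination (the c-statistic) falls as case mix narrows even when the risk model is exactly right — can be shown with a small simulation. The logistic model and the case-mix spreads below are assumptions for illustration, not the NSQIP models:

```python
import numpy as np

rng = np.random.default_rng(3)

def c_statistic(y, p):
    """Probability that a random event outranks a random non-event (ties count half)."""
    diff = p[y == 1][:, None] - p[y == 0][None, :]
    return (diff > 0).mean() + 0.5 * (diff == 0).mean()

def simulate_auc(spread):
    x = rng.normal(0.0, spread, 4000)          # case-mix spread of the risk factor
    p = 1.0 / (1.0 + np.exp(-(-3.0 + x)))      # true (perfectly calibrated) risk model
    y = (rng.random(4000) < p).astype(int)
    return c_statistic(y, p)

auc_wide = simulate_auc(2.0)      # heterogeneous case mix: easy to discriminate
auc_narrow = simulate_auc(0.5)    # restricted case mix: same model, lower c-statistic
```

    Both cohorts use the identical, perfectly calibrated model; only the homogeneity of the patients changes, yet the c-statistic drops — exactly the caution the abstract raises.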

  2. A Simple Illustration for the Need of Multiple Comparison Procedures

    ERIC Educational Resources Information Center

    Carter, Rickey E.

    2010-01-01

    Statistical adjustments to accommodate multiple comparisons are routinely covered in introductory statistical courses. The fundamental rationale for such adjustments, however, may not be readily understood. This article presents a simple illustration to help remedy this.
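
    The rationale can also be shown numerically: with m independent tests at α = 0.05, the chance of at least one false positive is 1 − 0.95^m, and a Bonferroni threshold of α/m restores family-wise control. A minimal simulation sketch:

```python
import numpy as np

rng = np.random.default_rng(4)
m, trials, alpha = 20, 2000, 0.05

# simulate m independent true-null hypotheses per trial;
# under the null, p-values are Uniform(0, 1)
pvals = rng.random((trials, m))
any_raw = (pvals < alpha).any(axis=1).mean()        # family-wise error, no adjustment
any_bonf = (pvals < alpha / m).any(axis=1).mean()   # family-wise error, Bonferroni

expected_fwer = 1 - (1 - alpha) ** m                # analytic value, about 0.64 for m = 20
```

    With 20 tests, roughly two trials in three produce at least one spurious "significant" result unless the threshold is adjusted.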

  3. Validating Coherence Measurements Using Aligned and Unaligned Coherence Functions

    NASA Technical Reports Server (NTRS)

    Miles, Jeffrey Hilton

    2006-01-01

    This paper describes a novel approach, based on the use of coherence functions and statistical theory, to sensor validation in a harsh environment. By using aligned and unaligned coherence functions together with statistical theory, one can test for sensor degradation, total sensor failure, or changes in the signal. This advanced diagnostic approach and the novel data processing methodology discussed provide a single number that conveys this information. This number, as calculated with standard statistical procedures for comparing the means of two distributions, is compared with results obtained using Yuen's robust statistical method to create confidence intervals. Examination of experimental data from Kulite pressure transducers mounted in a Pratt & Whitney PW4098 combustor, using spectrum analysis methods on aligned and unaligned time histories, has verified the effectiveness of the proposed method. All the procedures produce good results, which demonstrates the robustness of the technique.
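
    The paper's own estimator details are not reproduced here, but its basic ingredient — a magnitude-squared coherence that sits near 1 at a shared frequency for a healthy sensor pair and near the bias floor for an unaligned or failed channel — can be sketched with scipy.signal on synthetic data (the tone, noise level, and sampling rate are assumptions):

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(5)
fs = 1024.0
t = np.arange(0, 8, 1 / fs)
shared = np.sin(2 * np.pi * 100 * t)            # common 100 Hz combustor tone
x = shared + 0.5 * rng.normal(size=t.size)      # sensor 1
y = shared + 0.5 * rng.normal(size=t.size)      # sensor 2, aligned with sensor 1
noise_only = 0.5 * rng.normal(size=t.size)      # stand-in for a failed channel

f, cxy = coherence(x, y, fs=fs, nperseg=512)            # healthy, aligned pair
_, cxn = coherence(x, noise_only, fs=fs, nperseg=512)   # no shared signal
tone = np.argmin(np.abs(f - 100.0))
```

    `cxy[tone]` is close to 1, while `cxn` never rises far above the 1/K bias floor set by the number of averaged segments K — a drop in the former toward the latter is the degradation signature the diagnostic looks for.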

  4. There is no benefit to universal carotid artery duplex screening before a major cardiac surgical procedure.

    PubMed

    Adams, Brian C; Clark, Ross M; Paap, Christina; Goff, James M

    2014-01-01

Perioperative stroke is a devastating complication after cardiac surgery. In an attempt to minimize this complication, many cardiac surgeons routinely order preoperative carotid artery duplex scans to assess for significant carotid stenosis. We hypothesized that routine screening of preoperative cardiac surgery patients with carotid artery duplex scans detects few patients who would benefit from carotid intervention and that significant carotid stenosis does not reliably predict stroke risk after cardiac surgery. A retrospective review identified 1,499 patients who underwent cardiac surgical procedures between July 1999 and September 2010. Data collected included patient demographics, comorbidities, history of previous stroke, preoperative carotid artery duplex scan results, location of postoperative stroke, and details of carotid endarterectomy (CEA) procedures before, in conjunction with, or after cardiac surgery. Statistical methods included univariate analysis and Fisher's exact test. Twenty-six perioperative strokes were identified (1.7%). Of the 21 postoperative stroke patients with complete carotid artery duplex scan data, 3 had a hemodynamically significant lesion (>70%), and 1 underwent unilateral CEA for bilateral disease. Postoperative strokes occurred in the anterior cerebral circulation (69.2%), posterior cerebral circulation (15.4%), or both (15.4%). Patient comorbidities, preoperative carotid artery duplex scan screening velocities, and type of cardiac surgical procedure were not predictive of stroke. Thirteen patients (0.86%) underwent CEA before, in conjunction with, or after cardiac surgery. Two of these patients had symptomatic disease, 1 of whom underwent CEA before and the other after his cardiac surgery. Of the 11 asymptomatic patients, 2 underwent CEA before, 3 concurrently with, and 6 after cardiac surgery. 
Left main disease (≥50% stenosis), previous stroke, and peripheral vascular disease were statistically significant predictors of carotid revascularization. A cost analysis of universal screening yielded an estimated net cost of $378,918 during the study period. The majority of postoperative strokes after cardiac surgery are not related to extracranial carotid artery disease, nor are they predicted by preoperative carotid artery duplex scan screening. Consequently, universal carotid artery duplex scan screening cannot be recommended, and a selective approach should be adopted. Published by Elsevier Inc.

  5. Systematic Review and Meta-analysis of Osteochondral Autograft Transplantation versus Debridement in the Treatment of Osteochondritis Dissecans of the Capitellum

    PubMed Central

    Bowman, Seth; Braunstein, Jacob; Rabinowitz, Justin; Barfield, William R.; Chhabra, Bobby; Haro, Marc Scott

    2016-01-01

    Objectives: The purpose of this systematic review and meta-analysis is to compare clinical results and functional outcomes in patients with osteochondritis dissecans (OCD) lesions of the capitellum treated with either osteochondral autograft transplantation (OATS) or debridement with or without microfracture. Methods: A systematic review of multiple medical databases was performed after PROSPERO registration and using PRISMA guidelines. The methodological quality of the individual studies was assessed by two review authors using the Cochrane Collaboration’s “Risk of Bias” tool. Case reports were excluded; only case series of more than five patients and studies with a higher level of evidence were included. All study, subject, and surgery parameters were collected. Data were analyzed using statistical software. Odds ratios (OR) were calculated when possible. Data were compared using Pearson chi-square and independent-sample t tests when applicable. Results: Fifteen studies were included involving 368 patients (326 males and 42 females). There were 197 patients in the debridement group and 171 patients in the OATS group. The mean age was 16.9 ± 4.1 for the debridement group and 14.6 ± 1.2 for the OATS group. Mean follow-up was 29.0 ± 24.3 and 38.0 ± 12.8 for the debridement and OATS groups, respectively. Patients who underwent an OATS procedure had a statistically significant improvement in overall arc of motion compared with patients who had a debridement (P≤0.001). When compared with patients with debridement, patients with OATS were 5.6 times more likely to return to at least their pre-injury level of sports participation (p≤0.002). Conclusion: Post-operative range of motion was significantly improved in patients undergoing an OATS procedure versus a debridement for OCD lesions of the capitellum. 
Patients with an OATS were 5.6 times more likely to return to at least the pre-injury level of sports participation compared with patients undergoing a debridement. Further studies are necessary to directly compare functional outcomes in patients undergoing a debridement procedure versus an OATS procedure.

  6. Hypothesis testing for band size detection of high-dimensional banded precision matrices.

    PubMed

    An, Baiguo; Guo, Jianhua; Liu, Yufeng

    2014-06-01

    Many statistical analysis procedures require a good estimator for a high-dimensional covariance matrix or its inverse, the precision matrix. When the precision matrix is banded, the Cholesky-based method often yields a good estimator of the precision matrix. One important aspect of this method is determination of the band size of the precision matrix. In practice, crossvalidation is commonly used; however, we show that crossvalidation not only is computationally intensive but can be very unstable. In this paper, we propose a new hypothesis testing procedure to determine the band size in high dimensions. Our proposed test statistic is shown to be asymptotically normal under the null hypothesis, and its theoretical power is studied. Numerical examples demonstrate the effectiveness of our testing procedure.
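
    To make "banded precision matrix" concrete (an illustration, not the paper's test procedure): an AR(1) covariance matrix with entries rho^|i-j| has a tridiagonal inverse, that is, a precision matrix with band size 1.

```python
# An illustration of "banded precision" (not the paper's test): an AR(1)
# covariance matrix with entries rho**|i-j| has a tridiagonal inverse,
# i.e. a precision matrix with band size 1.
import numpy as np

rho, p = 0.6, 6
idx = np.arange(p)
cov = rho ** np.abs(np.subtract.outer(idx, idx))   # AR(1) covariance
prec = np.linalg.inv(cov)                          # precision matrix

off_band = np.abs(np.subtract.outer(idx, idx)) > 1 # entries outside band 1
print(np.allclose(prec[off_band], 0.0))            # True: tridiagonal
```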

  7. The Thurgood Marshall School of Law Empirical Findings: A Report of the Statistical Analysis of the July 2010 TMSL Texas Bar Results

    ERIC Educational Resources Information Center

    Kadhi, Tau; Holley, D.

    2010-01-01

    The following report gives the statistical findings of the July 2010 TMSL Bar results. Procedures: Data are pre-existing and were given to the Evaluator by email from the Registrar and Dean. Statistical analyses were run using SPSS 17 to address the following research questions: 1. What are the statistical descriptors of the July 2010 overall TMSL…

  8. Severe postpartum haemorrhage after vaginal delivery: a statistical process control chart to report seven years of continuous quality improvement.

    PubMed

    Dupont, Corinne; Occelli, Pauline; Deneux-Tharaux, Catherine; Touzet, Sandrine; Duclos, Antoine; Bouvier-Colle, Marie-Hélène; Rudigoz, René-Charles; Huissoud, Cyril

    2014-07-01

    To use statistical process control charts to describe trends in the prevalence of severe postpartum haemorrhage after vaginal delivery. This assessment was performed 7 years after we initiated a continuous quality improvement programme that began with regular criteria-based audits. Observational descriptive study in a French maternity unit in the Rhône-Alpes region. Quarterly clinical audit meetings were held to analyse all cases of severe postpartum haemorrhage after vaginal delivery and to provide feedback on quality of care with statistical process control tools. The primary outcomes were the prevalence of severe PPH after vaginal delivery and its quarterly monitoring with a control chart. The secondary outcomes included the global quality of care for women with severe postpartum haemorrhage, including the performance rate of each recommended procedure. Differences in these variables between 2005 and 2012 were tested. From 2005 to 2012, the prevalence of severe postpartum haemorrhage declined significantly, from 1.2% to 0.6% of vaginal deliveries (p<0.001). Since 2010, the quarterly rate of severe PPH has not exceeded the upper control limit, that is, it has not been out of statistical control. The proportion of cases managed consistently with the guidelines increased for all of their main components. Continuous quality improvement efforts, begun seven years ago and using, among other tools, statistical process control charts, have accompanied a 50% reduction in the prevalence of severe postpartum haemorrhage after vaginal delivery. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
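
    The control-chart logic can be sketched as follows (a p-chart with assumed, illustrative figures, not the study's data): each quarterly proportion is compared against 3-sigma limits around a centreline; a point above the upper control limit signals special-cause variation, while points within the limits are in statistical control.

```python
# A p-chart sketch for a quarterly proportion such as severe PPH per
# vaginal delivery (all numbers below are assumed, not the study's).
import math

def p_chart_limits(pbar, n):
    """3-sigma control limits for a proportion with subgroup size n."""
    sigma = math.sqrt(pbar * (1 - pbar) / n)
    return max(0.0, pbar - 3 * sigma), pbar + 3 * sigma

pbar = 0.009                  # assumed centreline: 0.9% severe PPH
n = 800                       # assumed deliveries per quarter
lcl, ucl = p_chart_limits(pbar, n)
print(f"LCL={lcl:.4f}  UCL={ucl:.4f}")

quarterly = [0.012, 0.010, 0.008, 0.006, 0.005]  # hypothetical rates
flags = [p > ucl for p in quarterly]             # out-of-control signals
print(flags)
```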

  9. [Bronchoscopy in Germany. Cross-sectional inquiry with 681 institutions].

    PubMed

    Markus, A; Häussinger, K; Kohlhäufl, M; Hauck, R W

    2000-11-01

    Bronchoscopy represents an integral part of the diagnostic tools in pulmonary medicine. Recently, it has also gained considerable attention for its therapeutic properties. To elucidate the equipment, indications, and procedural techniques of bronchoscopy units, a retrospective survey of 1232 hospitals and practices was conducted. 687 questionnaires were returned (response rate 56%), of which 681 were statistically evaluated. Two thirds of the physicians in charge were internists; one third were pulmonary care specialists. A total of 200,596 endoscopic procedures was included. The majority of procedures were performed by units with an average of 3 bronchoscopists, and over 57% (388) of the units performed an average of 100 or fewer procedures per year. The five main indications were tumor, hemoptysis, infection or pneumonia, drainage of secretions, and suspected interstitial disease. The overall complication rate amounted to 2.7%, with an incidence of 4.6% minor and 0.7% major complications and a bronchoscopy-related mortality of 0.02%. The patterns seen in premedication, intra- and post-procedural monitoring, disinfection practices, and documentation were quite heterogeneous. It is suggested that revised and updated standards for bronchoscopy be established, taking the data collected into particular account. These standards should provide the basis for high-level bronchological care throughout Germany.

  10. A comparison of unsupervised classification procedures on LANDSAT MSS data for an area of complex surface conditions in Basilicata, Southern Italy

    NASA Technical Reports Server (NTRS)

    Justice, C.; Townshend, J. (Principal Investigator)

    1981-01-01

    Two unsupervised classification procedures were applied to ratioed and unratioed LANDSAT multispectral scanner data of an area of spatially complex vegetation and terrain. An objective accuracy assessment was undertaken for each classification, and the classification accuracies were compared. The two unsupervised procedures use the same clustering algorithm. In one procedure the entire area is clustered; in the other, a representative sample of the area is clustered and the resulting statistics are extrapolated to the remaining area using a maximum likelihood classifier. The major steps in the classification procedures are explained, including image preprocessing, classification, interpretation of cluster classes, and accuracy assessment. Of the four classifications undertaken, the monocluster block approach on the unratioed data gave the highest accuracy, 80% for five coarse cover classes. This accuracy was increased to 84% by applying a 3 x 3 contextual filter to the classified image. A detailed description and partial explanation are provided for the major misclassifications. The classification of the unratioed data produced higher percentage accuracies than that of the ratioed data, and the monocluster block approach gave higher accuracies than clustering the entire area. The monocluster block approach was also the most economical in terms of computing time.

  11. Pilot study of proposed revisions to specifications for hydraulic cement concrete.

    DOT National Transportation Integrated Search

    1985-01-01

    This report summarizes the results of a pilot study of the statistical acceptance procedures proposed for adoption by the Virginia Department of Highways and Transportation. The proposed procedures were recommended in the report titled "Improved Spec...

  12. Using Cochran's Z Statistic to Test the Kernel-Smoothed Item Response Function Differences between Focal and Reference Groups

    ERIC Educational Resources Information Center

    Zheng, Yinggan; Gierl, Mark J.; Cui, Ying

    2010-01-01

    This study combined the kernel smoothing procedure and a nonparametric differential item functioning statistic--Cochran's Z--to statistically test the difference between the kernel-smoothed item response functions for reference and focal groups. Simulation studies were conducted to investigate the Type I error and power of the proposed…
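
    A kernel-smoothed item response function can be sketched as follows (a Nadaraya-Watson estimate on simulated data, an illustration rather than the study's exact procedure): the probability of a correct response is estimated at each ability value as a kernel-weighted average of binary item scores.

```python
# A sketch of a kernel-smoothed item response function (Nadaraya-Watson
# smoothing on simulated data; not the study's procedure or data).
import numpy as np

def kernel_smooth(theta, y, grid, h=0.5):
    """Gaussian-kernel estimate of P(correct | ability) on `grid`."""
    w = np.exp(-0.5 * ((grid[:, None] - theta[None, :]) / h) ** 2)
    return (w * y).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(1)
theta = rng.normal(size=500)                     # examinee abilities
p_true = 1 / (1 + np.exp(-1.5 * (theta - 0.2)))  # a 2PL-style true curve
y = rng.binomial(1, p_true)                      # binary item scores

grid = np.linspace(-2, 2, 9)
irf_hat = kernel_smooth(theta, y, grid)
print(np.round(irf_hat, 2))  # rises with ability, as an IRF should
```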

  13. Comparison of ketamine and ketofol for deep sedation and analgesia in children undergoing laser procedure.

    PubMed

    Stevic, Marija; Ristic, Nina; Budic, Ivana; Ladjevic, Nebojsa; Trifunovic, Branislav; Rakic, Ivan; Majstorovic, Marko; Burazor, Ivana; Simic, Dusica

    2017-09-01

    The aim of our study was to evaluate the cardiovascular and respiratory stability, clinical efficacy, and safety of two different anesthetic agents in pediatric patients who underwent Pulse dye (wavelength 595 nm, pulse duration 0-40 ms, power 0-40 J) and CO2 (wavelength 10,600 nm, intensity-fraxel mode with SX index 4 to 8, power 0-30 W) laser procedures. This prospective non-blinded study included 203 pediatric patients, ASA I-II, aged between 1 month and 12 years, who underwent short-term procedural sedation and analgesia for the laser procedure. After oral premedication with midazolam, 103 children were analgo-sedated with ketamine and fentanyl (K group) and 100 with ketofol and fentanyl (KT group). Vital signs, applied drug doses, pulse oximetry, and a parental satisfaction questionnaire were used to compare the two groups. Statistical differences were tested using Student's t test, the Mann-Whitney U test, the chi-square test, and Fisher's exact test. Receiver operating characteristic (ROC) curve analysis was used to assess the cut-off value of the duration of anesthesia predicting apnea. Tachycardia was recorded in a significantly higher number of patients who received ketamine as the anesthetic agent (35.9 vs. 3%). Hypertension was also significantly more frequent in patients who received ketamine than in patients who received ketofol (25.2 vs. 3%). Laryngospasm was not observed in either group. There was no statistically significant difference between groups in the satisfaction of parents and doctors. Apnea and respiratory depression occurred significantly more frequently in the ketofol group than in the ketamine group (12 vs. 0.97% and 13 vs. 0%, respectively). Based on ROC analysis for apnea, we found a significantly higher number of patients with apnea in the ketofol group when the duration of anesthesia was longer than 17 min. 
Our study has shown that ketofol is more comfortable than ketamine for short-term laser procedures in children, causing less hemodynamic alteration, only mild respiratory depression, and fewer post-procedural adverse events.

  14. Effects of live music therapy sessions on quality of life indicators, medications administered and hospital length of stay for patients undergoing elective surgical procedures for brain.

    PubMed

    Walworth, Darcy; Rumana, Christopher S; Nguyen, Judy; Jarred, Jennifer

    2008-01-01

    The physiological and psychological stress that brain tumor patients undergo during the entire surgical experience can considerably affect several aspects of their hospitalization. The purpose of this study was to examine the effects of live music therapy on quality of life indicators, amount of medications administered, and length of stay for persons receiving elective surgical procedures of the brain. Subjects (N = 27) were patients admitted for some type of surgical procedure of the brain. Subjects were randomly assigned to either the control group receiving no music intervention (n = 13) or the experimental group receiving pre- and postoperative live music therapy sessions (n = 14). Anxiety, mood, pain, perception of hospitalization or procedure, relaxation, and stress were measured using a self-report Visual Analog Scale (VAS) for each of the variables. The documented administration of postoperative pain medications (frequency, dosage, type, and route of administration) was also compared between groups. Experimental subjects received live and interactive music therapy sessions, beginning with a preoperative session and continuing with daily sessions until the patient was discharged home. Control subjects received routine hospital care without any music therapy intervention. Differences in experimental pretest and posttest scores were analyzed using a Wilcoxon Matched-Pairs Signed-Rank test. Results indicated statistically significant differences for 4 of the 6 quality of life measures: anxiety (p = .03), perception of hospitalization (p = .03), relaxation (p = .001), and stress (p = .001). No statistically significant differences were found for mood (p > .05) or pain (p > .05) levels. Administration amounts of nausea and pain medications were compared with a Two-Way ANOVA with One Repeated Measure, resulting in no significant differences between groups and medications, F(1, 51) = 0.03; p > .05. 
Results indicate no significant differences between groups for length of stay (t = .97, df = 25, p > .05). This research study indicates that live music therapy using patient-preferred music can be beneficial in improving quality of life indicators such as anxiety, perception of the hospitalization or procedure, relaxation, and stress in patients undergoing surgical procedures of the brain.

  15. Uncertainties in Estimates of Fleet Average Fuel Economy : A Statistical Evaluation

    DOT National Transportation Integrated Search

    1977-01-01

    Research was performed to assess the current Federal procedure for estimating the average fuel economy of each automobile manufacturer's new car fleet. Test vehicle selection and fuel economy estimation methods were characterized statistically and so...

  16. 40 CFR 91.512 - Request for public hearing.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... plans and statistical analyses have been properly applied (specifically, whether sampling procedures and statistical analyses specified in this subpart were followed and whether there exists a basis for... will be made available to the public during Agency business hours. ...

  17. Projector-based virtual reality dome environment for procedural pain and anxiety in young children with burn injuries: a pilot study.

    PubMed

    Khadra, Christelle; Ballard, Ariane; Déry, Johanne; Paquin, David; Fortin, Jean-Simon; Perreault, Isabelle; Labbe, David R; Hoffman, Hunter G; Bouchard, Stéphane; LeMay, Sylvie

    2018-01-01

    Virtual reality (VR) is a non-pharmacological method to distract from pain during painful procedures. However, it was never tested in young children with burn injuries undergoing wound care. We aimed to assess the feasibility and acceptability of the study process and the use of VR for procedural pain management. From June 2016 to January 2017, we recruited children from 2 months to 10 years of age with burn injuries requiring a hydrotherapy session in a pediatric university teaching hospital in Montreal. Each child received the projector-based VR intervention in addition to the standard pharmacological treatment. Data on intervention and study feasibility and acceptability in addition to measures on pain (Face, Legs, Activity, Cry, Consolability scale), baseline (Modified Smith Scale) and procedural (Procedure Behavior Check List) anxiety, comfort (OCCEB-BECCO [behavioral observational scale of comfort level for child burn victims]), and sedation (Ramsay Sedation Scale) were collected before, during, and after the procedure. Data analyses included descriptive and non-parametric inferential statistics. We recruited 15 children with a mean age of 2.2±2.1 years and a mean total body surface area of 5% (±4). Mean pain score during the procedure was low (2.9/10, ±3), as was the discomfort level (2.9/10, ±2.8). Most children were cooperative, oriented, and calm. Assessing anxiety was not feasible with our sample of participants. The prototype did not interfere with the procedure and was considered useful for procedural pain management by most health care professionals. The projector-based VR is a feasible and acceptable intervention for procedural pain management in young children with burn injuries. A larger trial with a control group is required to assess its efficacy.

  18. Projector-based virtual reality dome environment for procedural pain and anxiety in young children with burn injuries: a pilot study

    PubMed Central

    Khadra, Christelle; Ballard, Ariane; Déry, Johanne; Paquin, David; Fortin, Jean-Simon; Perreault, Isabelle; Labbe, David R; Hoffman, Hunter G; Bouchard, Stéphane

    2018-01-01

    Background Virtual reality (VR) is a non-pharmacological method to distract from pain during painful procedures. However, it was never tested in young children with burn injuries undergoing wound care. Aim We aimed to assess the feasibility and acceptability of the study process and the use of VR for procedural pain management. Methods From June 2016 to January 2017, we recruited children from 2 months to 10 years of age with burn injuries requiring a hydrotherapy session in a pediatric university teaching hospital in Montreal. Each child received the projector-based VR intervention in addition to the standard pharmacological treatment. Data on intervention and study feasibility and acceptability in addition to measures on pain (Face, Legs, Activity, Cry, Consolability scale), baseline (Modified Smith Scale) and procedural (Procedure Behavior Check List) anxiety, comfort (OCCEB-BECCO [behavioral observational scale of comfort level for child burn victims]), and sedation (Ramsay Sedation Scale) were collected before, during, and after the procedure. Data analyses included descriptive and non-parametric inferential statistics. Results We recruited 15 children with a mean age of 2.2±2.1 years and a mean total body surface area of 5% (±4). Mean pain score during the procedure was low (2.9/10, ±3), as was the discomfort level (2.9/10, ±2.8). Most children were cooperative, oriented, and calm. Assessing anxiety was not feasible with our sample of participants. The prototype did not interfere with the procedure and was considered useful for procedural pain management by most health care professionals. Conclusion The projector-based VR is a feasible and acceptable intervention for procedural pain management in young children with burn injuries. A larger trial with a control group is required to assess its efficacy. PMID:29491717

  19. Integrative review of clinical decision support for registered nurses in acute care settings.

    PubMed

    Dunn Lopez, Karen; Gephart, Sheila M; Raszewski, Rebecca; Sousa, Vanessa; Shehorn, Lauren E; Abraham, Joanna

    2017-03-01

    To report on the state of the science of clinical decision support (CDS) for hospital bedside nurses. We performed an integrative review of qualitative and quantitative peer-reviewed original research studies using a structured search of PubMed, Embase, Cumulative Index to Nursing and Applied Health Literature (CINAHL), Scopus, Web of Science, and IEEE Xplore (Institute of Electrical and Electronics Engineers Xplore Digital Library). We included articles that reported on CDS targeting bedside nurses and excluded in stages based on rules for titles, abstracts, and full articles. We extracted research design and methods, CDS purpose, electronic health record integration, usability, and process and patient outcomes. Our search yielded 3157 articles. After removing duplicates and applying exclusion rules, 28 articles met the inclusion criteria. The majority of studies were single-site, descriptive or qualitative (43%) or quasi-experimental (36%). There was only 1 randomized controlled trial. The purpose of most CDS was to support diagnostic decision-making (36%), guideline adherence (32%), medication management (29%), and situational awareness (25%). All the studies that included process outcomes (7) and usability outcomes (4) and also had analytic procedures to detect changes in outcomes demonstrated statistically significant improvements. Three of 4 studies that included patient outcomes and also had analytic procedures to detect change showed statistically significant improvements. No negative effects of CDS were found on process, usability, or patient outcomes. Clinical support systems targeting bedside nurses have positive effects on outcomes and hold promise for improving care quality; however, this research is lagging behind studies of CDS targeting medical decision-making in both volume and level of evidence. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. 

  20. Single-incision laparoscopic cholecystectomy: a cost comparison.

    PubMed

    Love, Katie M; Durham, Christopher A; Meara, Michael P; Mays, Ashley C; Bower, Curtis E

    2011-05-01

    Single-incision laparoscopic cholecystectomy (SILC) should not cost more or less than traditional laparoscopic cholecystectomy (LC). Retrospective cost data were collected from the accounting records of a single institution. A direct comparison of LC and SILC was conducted. Data on the SILC cases converted to LC were included. The total operating room (OR) cost (actual cost to the hospital for equipment, time, and personnel) and the total OR charges (total derived from the OR cost plus a margin to cover overhead costs beyond material costs) were examined. The total hospital charges (OR charges plus hospital charges accrued in the perioperative period) also were included. Descriptive statistics were used to analyze the data, with p values less than 0.05 considered statistically significant. Over a period of 19 months, 116 cases of minimally invasive cholecystectomy were evaluated. Of the 116 patients, 48 underwent LC during the first half of that period, and 68 patients underwent SILC during the second half of that period. Nine of the single-incision procedures were converted to traditional LC, for a 13% conversion rate. The groups were well matched from a demographics standpoint, with no significant differences in age, gender, body mass index (BMI), diagnoses, American Society of Anesthesiology (ASA) class, or payment. Comparison of all attempted SILCs, including those converted, with all LCs showed no significant difference in cost category totals. A significant difference among all cost variables was found when SILCs were compared with SILCs that required conversion to LC. A significant difference among the cost variables also was found when LCs were compared with converted SILCs. The cost for SILC did not differ significantly from that for LC when standard materials were used and the duration of the procedure was considered. Converted cases were significantly more expensive than completed SILC and LC cases.

  1. Evaluative procedures to detect, characterize, and assess the severity of diabetic neuropathy.

    PubMed

    Dyck, P J

    1991-01-01

    Minimal criteria for diabetic neuropathy need to be defined and universally applied. Standardized evaluative procedures need to be agreed and normal ranges determined from healthy volunteers. Types and stages of neuropathy should be established and assessments performed on representative populations of both Type 1 and Type 2 diabetic patients. Potential minimal criteria include absent ankle reflexes and vibratory sensation, and abnormalities of nerve conduction. However, the preferred criterion is the identification of more than two statistically defined abnormalities among symptoms and deficits, nerve conduction, quantitative sensory examination or quantitative autonomic examination. Various evaluative procedures are available. Symptoms should be assessed and scores can be assigned to neurological deficits. However, assessments of nerve conduction provide the most specific, objective, sensitive, and repeatable procedures, although these may be the least meaningful. Many techniques are available for quantitative sensory examination, but are poorly standardized and normal values are not available. For quantitative autonomic examination, tests are available for the adequacy of cardiovascular and peripheral vascular reflexes and increasingly for other autonomic functions. In any assessment of nerve function the conditions should be optimized and standardized, and stimuli defined. Specific instructions should be given and normal ranges established in healthy volunteers.

  2. Behavior analytic approaches to problem behavior in intellectual disabilities.

    PubMed

    Hagopian, Louis P; Gregory, Meagan K

    2016-03-01

    The purpose of the current review is to summarize recent behavior analytic research on problem behavior in individuals with intellectual disabilities. We have focused our review on studies published from 2013 to 2015, but also included earlier studies that were relevant. Behavior analytic research on problem behavior continues to focus on the use and refinement of functional behavioral assessment procedures and function-based interventions. During the review period, a number of studies reported on procedures aimed at making functional analysis procedures more time efficient. Behavioral interventions continue to evolve, and there were several larger scale clinical studies reporting on multiple individuals. There was increased attention on the part of behavioral researchers to develop statistical methods for analysis of within subject data and continued efforts to aggregate findings across studies through evaluative reviews and meta-analyses. Findings support continued utility of functional analysis for guiding individualized interventions and for classifying problem behavior. Modifications designed to make functional analysis more efficient relative to the standard method of functional analysis were reported; however, these require further validation. Larger scale studies on behavioral assessment and treatment procedures provided additional empirical support for effectiveness of these approaches and their sustainability outside controlled clinical settings.

  3. The role of surgical expertise with regard to chronic postoperative inguinal pain (CPIP) after Lichtenstein correction of inguinal hernia: a systematic review.

    PubMed

    Lange, J F M; Meyer, V M; Voropai, D A; Keus, E; Wijsmuller, A R; Ploeg, R J; Pierie, J P E N

    2016-06-01

    The aim of this study was to evaluate whether a relation exists between surgical expertise and the incidence of chronic postoperative inguinal pain (CPIP) after inguinal hernia repair using the Lichtenstein procedure. CPIP after inguinal hernia repair remains a major clinical problem despite many efforts to address it. Recently, case volume and specialisation have been found to correlate with significantly improved outcomes in other fields of surgery; to date these important factors have not been reviewed extensively in the context of inguinal hernia surgery. A systematic literature review was performed to identify randomised controlled trials reporting on the incidence of CPIP after the Lichtenstein procedure and including the expertise of the surgeon. Surgical expertise was subdivided into expert and non-expert. A total of 16 studies comprising 3086 Lichtenstein procedures were included. In the expert group the incidence of CPIP varied between 6.9 and 11.7%, versus an incidence of 18.1 to 39.4% in the non-expert group. Owing to the heterogeneity between groups, no statistical significance could be demonstrated. The results of this evaluation suggest that an association between surgical expertise and CPIP is highly likely, warranting further analysis in a prospectively designed study.

  4. Commodity Movements on the Texas Highway System: Data Collection and Survey Results

    DOT National Transportation Integrated Search

    1991-11-01

    This report presents the survey procedures used and data collected in the : development of commodity flow statistics for movements over Texas Highways. : Response rates, sampling procedures, questionnaire design and the types of data : provided by th...

  5. 10 CFR 431.329 - Enforcement.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Metal Halide Lamp Ballasts and Fixtures Energy Conservation Standards § 431.329 Enforcement. Process for Metal Halide Lamp Ballasts. This section sets forth procedures DOE will follow in pursuing alleged... with the following statistical sampling procedures for metal halide lamp ballasts, with the methods...

  6. SNW 2000 Proceedings. Oxide Thickness Variation Induced Threshold Voltage Fluctuations in Decanano MOSFETs: a 3D Density Gradient Simulation Study

    NASA Technical Reports Server (NTRS)

    Asenov, Asen; Kaya, S.; Davies, J. H.; Saini, S.

    2000-01-01

    We use the density gradient (DG) simulation approach to study, in 3D, the effect of local oxide thickness fluctuations on the threshold voltage of decanano MOSFETs in a statistical manner. A description of the reconstruction procedure for the random 2D surfaces representing the 'atomistic' Si-SiO2 interface variations is presented. The procedure is based on power spectrum synthesis in the Fourier domain and can include either Gaussian or exponential spectra. The simulations show that threshold voltage variations induced by oxide thickness fluctuations become significant when the gate length of the devices becomes comparable to the correlation length of the fluctuations. The extent of quantum corrections in the simulations with respect to the classical case and the dependence of threshold variations on the oxide thickness are examined.
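    The power spectrum synthesis step can be illustrated with a short numpy sketch (an illustrative reconstruction, not the authors' code; the grid size, correlation length, and rms amplitude are placeholder values): white noise is shaped in the Fourier domain by the square root of a Gaussian spectrum and transformed back.

```python
import numpy as np

def random_surface(n=128, dx=1.0, corr_len=10.0, rms=0.2, seed=0):
    """Synthesize a random 2D surface with an approximately Gaussian
    power spectrum by filtering white noise in the Fourier domain."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal((n, n))
    k = 2 * np.pi * np.fft.fftfreq(n, d=dx)
    kx, ky = np.meshgrid(k, k, indexing="ij")
    k2 = kx**2 + ky**2
    # Gaussian power spectrum ~ exp(-k^2 L^2 / 4); the amplitude filter is its sqrt
    amp = np.exp(-k2 * corr_len**2 / 8.0)
    surf = np.fft.ifft2(np.fft.fft2(noise) * amp).real
    surf *= rms / surf.std()  # rescale to the target rms roughness
    return surf

surface = random_surface()
print(surface.shape, round(float(surface.std()), 6))  # → (128, 128) 0.2
```

An exponential spectrum would use a different `amp` filter; everything else stays the same.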

  7. Introducing StatHand: A Cross-Platform Mobile Application to Support Students' Statistical Decision Making.

    PubMed

    Allen, Peter J; Roberts, Lynne D; Baughman, Frank D; Loxton, Natalie J; Van Rooy, Dirk; Rock, Adam J; Finlay, James

    2016-01-01

    Although essential to professional competence in psychology, quantitative research methods are a known area of weakness for many undergraduate psychology students. Students find selecting appropriate statistical tests and procedures for different types of research questions, hypotheses and data types particularly challenging, and these skills are not often practiced in class. Decision trees (a type of graphic organizer) are known to facilitate this decision making process, but extant trees have a number of limitations. Furthermore, emerging research suggests that mobile technologies offer many possibilities for facilitating learning. It is within this context that we have developed StatHand, a free cross-platform application designed to support students' statistical decision making. Developed with the support of the Australian Government Office for Learning and Teaching, StatHand guides users through a series of simple, annotated questions to help them identify a statistical test or procedure appropriate to their circumstances. It further offers the guidance necessary to run these tests and procedures, then interpret and report their results. In this Technology Report we will overview the rationale behind StatHand, before describing the feature set of the application. We will then provide guidelines for integrating StatHand into the research methods curriculum, before concluding by outlining our road map for the ongoing development and evaluation of StatHand.

  8. Harnessing Multivariate Statistics for Ellipsoidal Data in Structural Geology

    NASA Astrophysics Data System (ADS)

    Roberts, N.; Davis, J. R.; Titus, S.; Tikoff, B.

    2015-12-01

    Most structural geology articles do not state significance levels, report confidence intervals, or perform regressions to find trends. This is, in part, because structural data tend to include directions, orientations, ellipsoids, and tensors, which are not treatable by elementary statistics. We describe a full procedural methodology for the statistical treatment of ellipsoidal data. We use a reconstructed dataset of deformed ooids in Maryland from Cloos (1947) to illustrate the process. Normalized ellipsoids have five degrees of freedom and can be represented by a second-order tensor. This tensor can be permuted into a five-dimensional vector that belongs to a vector space and can be treated with standard multivariate statistics. Cloos made several claims about the distribution of deformation in the South Mountain fold, Maryland, and we reexamine two particular claims using hypothesis testing: 1) octahedral shear strain increases towards the axial plane of the fold; 2) finite strain orientation varies systematically along the trend of the axial trace as it bends with the Appalachian orogen. We then test the null hypothesis that the southern segment of South Mountain is the same as the northern segment. This test illustrates the application of ellipsoidal statistics, which combine both orientation and shape. We report confidence intervals for each test, and graphically display our results with novel plots. This poster illustrates the importance of statistics in structural geology, especially when working with noisy or small datasets.
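    The flattening step can be made concrete. The sketch below shows one possible construction (an assumption for illustration, not necessarily the authors' exact basis): take the matrix logarithm of the normalized ellipsoid tensor, remove the trace, and pack the five remaining degrees of freedom into a vector whose Euclidean norm equals the Frobenius norm of the deviatoric log-tensor.

```python
import numpy as np

SQRT2, SQRT32 = np.sqrt(2.0), np.sqrt(1.5)

def ellipsoid_to_vector(E):
    """Map a normalized (symmetric, positive-definite) ellipsoid tensor
    to a 5-vector via the deviatoric matrix logarithm, so that the
    Euclidean norm of the vector equals the Frobenius norm of the
    traceless log-tensor."""
    w, V = np.linalg.eigh(E)
    L = V @ np.diag(np.log(w)) @ V.T          # matrix logarithm
    L -= np.trace(L) / 3.0 * np.eye(3)        # deviatoric (traceless) part
    return np.array([
        (L[0, 0] - L[1, 1]) / SQRT2,
        SQRT32 * L[2, 2],
        SQRT2 * L[0, 1],
        SQRT2 * L[0, 2],
        SQRT2 * L[1, 2],
    ])

# A prolate ellipsoid with semi-axes (2, 1, 0.5): their product is 1,
# so the tensor (eigenvalues = squared semi-axes) is already normalized
E = np.diag([4.0, 1.0, 0.25])
v = ellipsoid_to_vector(E)
print(np.round(v, 3))
```

Standard multivariate procedures (e.g. a Hotelling T²-style comparison of the northern and southern segments) can then be applied to the resulting 5-vectors.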

  9. Audiovisual distraction reduces pain perception during aural microsuction.

    PubMed

    Choudhury, N; Amer, I; Daniels, M; Wareing, M J

    2013-01-01

    Aural microsuction is a common ear, nose and throat procedure used in the outpatient setting. Some patients, however, find it difficult to tolerate owing to discomfort, pain or noise. This study evaluated the effect of audiovisual distraction on patients' pain perception and overall satisfaction. A prospective study was conducted of patients attending our aural care clinic requiring aural toileting of bilateral mastoid cavities over a three-month period. All microsuction was performed by a single clinical nurse specialist. Any patients with active infection were excluded. For each patient, during microsuction of one ear they watched the procedure on a television screen, while for the other ear they did not view the procedure. All patients received the same real-time explanations during microsuction of both ears. After the procedure, each patient completed a visual analogue scale (VAS) to rate the pain they experienced for each ear, with and without access to the television screen. They also documented their preference and the reasons why. A total of 37 patients were included in the study. The mean pain score when patients viewed the procedure was 2.43, compared with a mean of 3.48 without the television view. Pain scores were statistically significantly lower when patients observed the procedure on the television (p=0.003), consistent with the majority of patients (65%) reporting a preference for viewing their procedure. Audiovisual distraction significantly lowered patients' VAS pain scores during aural microsuction. This simple intervention can therefore reduce patients' perceived pain and help improve acceptance of this procedure.
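    Since each patient contributed both ears, the comparison is a paired design. A minimal sketch with synthetic scores (NOT the study's data; scipy's paired t-test stands in for whichever test the authors actually used):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 37                                   # same number of patients as the study
# Synthetic VAS scores for illustration only -- NOT the study's data
vas_screen = np.clip(rng.normal(2.4, 1.5, n), 0, 10)     # ear with TV view
vas_no_screen = np.clip(vas_screen + rng.normal(1.0, 1.5, n), 0, 10)

# Each patient contributes one score per condition, so the test is paired
t, p = stats.ttest_rel(vas_no_screen, vas_screen)
print(f"t = {t:.2f}, p = {p:.4f}")
```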

  10. Are procedures codes in claims data a reliable indicator of intraoperative splenic injury compared with clinical registry data?

    PubMed

    Stey, Anne M; Ko, Clifford Y; Hall, Bruce Lee; Louie, Rachel; Lawson, Elise H; Gibbons, Melinda M; Zingmond, David S; Russell, Marcia M

    2014-08-01

    Identifying iatrogenic injuries using existing data sources is important for improved transparency in the occurrence of intraoperative events. There is evidence that procedure codes are reliably recorded in claims data. The objective of this study was to assess whether concurrent splenic procedure codes in patients undergoing colectomy procedures are reliably coded in claims data as compared with clinical registry data. Patients who underwent colectomy procedures in the absence of neoplastic diagnosis codes were identified from American College of Surgeons (ACS) NSQIP data linked with Medicare inpatient claims data file (2005 to 2008). A κ statistic was used to assess coding concordance between ACS NSQIP and Medicare inpatient claims, with ACS NSQIP serving as the reference standard. A total of 11,367 colectomy patients were identified from 212 hospitals. There were 114 patients (1%) who had a concurrent splenic procedure code recorded in either ACS NSQIP or Medicare inpatient claims. There were 7 patients who had a splenic injury diagnosis code recorded in either data source. Agreement of splenic procedure codes between the data sources was substantial (κ statistic 0.72; 95% CI, 0.64-0.79). Medicare inpatient claims identified 81% of the splenic procedure codes recorded in ACS NSQIP, and 99% of the patients without a splenic procedure code. It is feasible to use Medicare claims data to identify splenic injuries occurring during colectomy procedures, as claims data have moderate sensitivity and excellent specificity for capturing concurrent splenic procedure codes compared with ACS NSQIP. Copyright © 2014 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
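    The κ statistic quantifies agreement beyond chance between the two binary codings (splenic-procedure code present or absent in each data source). A minimal sketch with toy data, not the study's records:

```python
import numpy as np

def cohens_kappa(a, b):
    """Cohen's kappa for two binary coders: observed agreement corrected
    for the agreement expected under independent coding with the same
    marginal frequencies."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    p_obs = np.mean(a == b)                                  # observed agreement
    p_exp = a.mean() * b.mean() + (1 - a.mean()) * (1 - b.mean())
    return (p_obs - p_exp) / (1 - p_exp)

# Toy example: 10 records, the two sources agree on 9 of them
registry = [1, 1, 0, 0, 0, 0, 0, 0, 0, 0]
claims   = [1, 0, 0, 0, 0, 0, 0, 0, 0, 0]
print(round(cohens_kappa(registry, claims), 3))  # → 0.615
```

Note how 90% raw agreement yields a much lower κ: when one code is rare, most agreement is expected by chance, which is exactly why the study reports κ rather than raw concordance.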

  11. Statistical analyses of commercial vehicle accident factors. Volume 1 Part 1

    DOT National Transportation Integrated Search

    1978-02-01

    Procedures for conducting statistical analyses of commercial vehicle accidents have been established and initially applied. A file of some 3,000 California Highway Patrol accident reports from two areas of California during a period of about one year...

  12. Statistical methods for the quality control of steam cured concrete : final report.

    DOT National Transportation Integrated Search

    1971-01-01

    Concrete strength test results from three prestressing plants utilizing steam curing were evaluated statistically in terms of the concrete as received and the effectiveness of the plants' steaming procedures. Control charts were prepared to show tren...

  13. 40 CFR 90.712 - Request for public hearing.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... sampling plans and statistical analyses have been properly applied (specifically, whether sampling procedures and statistical analyses specified in this subpart were followed and whether there exists a basis... Clerk and will be made available to the public during Agency business hours. ...

  14. The Krigifier: A Procedure for Generating Pseudorandom Nonlinear Objective Functions for Computational Experimentation

    NASA Technical Reports Server (NTRS)

    Trosset, Michael W.

    1999-01-01

    Comprehensive computational experiments to assess the performance of algorithms for numerical optimization require (among other things) a practical procedure for generating pseudorandom nonlinear objective functions. We propose a procedure that is based on the convenient fiction that objective functions are realizations of stochastic processes. This report details the calculations necessary to implement our procedure for the case of certain stationary Gaussian processes and presents a specific implementation in the statistical programming language S-PLUS.
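    The idea can be sketched in one dimension: draw a Gaussian-process realization at a set of anchor points, then return its conditional-mean interpolant as a deterministic pseudorandom objective. This is an illustrative Python reimplementation, not the report's S-PLUS code; the kernel and its parameters are arbitrary choices.

```python
import numpy as np

def gp_objective(n_anchors=50, length_scale=0.2, seed=0):
    """Return a deterministic objective f(x) on [0, 1] built from one
    realization of a stationary Gaussian process sampled at anchor points."""
    rng = np.random.default_rng(seed)
    xa = np.linspace(0.0, 1.0, n_anchors)

    def k(u, v):  # squared-exponential covariance
        return np.exp(-0.5 * (u[:, None] - v[None, :])**2 / length_scale**2)

    K = k(xa, xa) + 1e-9 * np.eye(n_anchors)         # jitter for stability
    ya = np.linalg.cholesky(K) @ rng.standard_normal(n_anchors)
    Kinv_y = np.linalg.solve(K, ya)

    def f(x):
        x = np.atleast_1d(np.asarray(x, float))
        return k(x, xa) @ Kinv_y                     # GP mean interpolant
    return f

f = gp_objective()
xs = np.linspace(0, 1, 201)
print(round(float(f(xs).min()), 3))  # a pseudorandom minimum to optimize for
```

Changing the seed produces a fresh, smooth, nonlinear test function, which is the point of the krigifier: a reproducible supply of objectives for optimization experiments.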

  15. Sequential Tests of Multiple Hypotheses Controlling Type I and II Familywise Error Rates

    PubMed Central

    Bartroff, Jay; Song, Jinlin

    2014-01-01

    This paper addresses the following general scenario: A scientist wishes to perform a battery of experiments, each generating a sequential stream of data, to investigate some phenomenon. The scientist would like to control the overall error rate in order to draw statistically-valid conclusions from each experiment, while being as efficient as possible. The between-stream data may differ in distribution and dimension but also may be highly correlated, even duplicated exactly in some cases. Treating each experiment as a hypothesis test and adopting the familywise error rate (FWER) metric, we give a procedure that sequentially tests each hypothesis while controlling both the type I and II FWERs regardless of the between-stream correlation, and only requires arbitrary sequential test statistics that control the error rates for a given stream in isolation. The proposed procedure, which we call the sequential Holm procedure because of its inspiration from Holm’s (1979) seminal fixed-sample procedure, shows simultaneous savings in expected sample size and less conservative error control relative to fixed sample, sequential Bonferroni, and other recently proposed sequential procedures in a simulation study. PMID:25092948
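    For reference, the fixed-sample Holm procedure that inspired the sequential version compares the ordered p-values against increasingly lenient thresholds α/m, α/(m−1), … and stops at the first acceptance. A minimal sketch:

```python
def holm(p_values, alpha=0.05):
    """Holm's (1979) step-down procedure. Returns a reject flag per
    hypothesis while controlling the familywise type I error at alpha."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    reject = [False] * m
    for rank, i in enumerate(order):
        if p_values[i] <= alpha / (m - rank):  # thresholds a/m, a/(m-1), ...
            reject[i] = True
        else:
            break                              # stop at the first acceptance
    return reject

print(holm([0.001, 0.02, 0.04, 0.30]))  # → [True, False, False, False]
```

The sequential version in the paper replaces the fixed-sample p-values with sequential test statistics for each data stream, but keeps this step-down structure.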

  16. Pile Driving

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Machine-oriented structural engineering firm TERA, Inc. is engaged in a project to evaluate the reliability of offshore pile driving prediction methods to eventually predict the best pile driving technique for each new offshore oil platform. In Phase I, pile driving records of 48 offshore platforms, including such information as blow counts, soil composition and pertinent construction details, were digitized. In Phase II, pile driving records were statistically compared with current methods of prediction. The result was the development of modular software, the CRIPS80 Software Design Analyzer System, that companies can use to evaluate other prediction procedures or other databases.

  17. Fluorescent-Antibody Measurement Of Cancer-Cell Urokinase

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R.

    1993-01-01

    Combination of laboratory techniques provides measurements of amounts of urokinase in and between normal and cancer cells. Includes use of fluorescent antibodies specific against different forms of urokinase-type plasminogen activator, (uPA), fluorescence microscopy, quantitative analysis of images of sections of tumor tissue, and flow cytometry of different uPA's and deoxyribonucleic acid (DNA) found in suspended-tumor-cell preparations. Measurements provide statistical method for indicating or predicting metastatic potentials of some invasive tumors. Assessments of metastatic potentials based on such measurements used in determining appropriate follow-up procedures after surgical removal of tumors.

  18. An Analysis of LANDSAT-4 Thematic Mapper Geometric Properties

    NASA Technical Reports Server (NTRS)

    Walker, R. E.; Zobrist, A. L.; Bryant, N. A.; Gokhman, B.; Friedman, S. Z.; Logan, T. L.

    1984-01-01

    LANDSAT Thematic Mapper P-data of Washington, D. C., Harrisburg, PA, and Salton Sea, CA are analyzed to determine magnitudes and causes of error in the geometric conformity of the data to known Earth surface geometry. Several tests of data geometry are performed. Intraband and interband correlation and registration are investigated, exclusive of map based ground truth. The magnitudes and statistical trends of pixel offsets between a single band's mirror scans (due to processing procedures) are computed, and the inter-band integrity of registration is analyzed. A line to line correlation analysis is included.

  19. Brightness temperature and attenuation statistics at 20.6 and 31.65 GHz

    NASA Technical Reports Server (NTRS)

    Westwater, Edgeworth R.; Falls, M. J.

    1991-01-01

    Attenuation and brightness temperature statistics at 20.6 and 31.65 GHz are analyzed for a year's worth of data. The data were collected in 1988 at Denver and Platteville, Colorado. The locations are separated by 49 km. Single-station statistics are derived for the entire year. Quality control procedures are discussed and examples of their application are given.

  20. Data-driven inference for the spatial scan statistic.

    PubMed

    Almeida, Alexandre C L; Duarte, Anderson R; Duczmal, Luiz H; Oliveira, Fernando L P; Takahashi, Ricardo H C

    2011-08-02

    Kulldorff's spatial scan statistic for aggregated area maps searches for clusters of cases without specifying their size (number of areas) or geographic location in advance. Their statistical significance is tested while adjusting for the multiple testing inherent in such a procedure. However, as is shown in this work, this adjustment is not done in an even manner for all possible cluster sizes. A modification is proposed to the usual inference test of the spatial scan statistic, incorporating additional information about the size of the most likely cluster found. A new interpretation of the results of the spatial scan statistic is proposed, posing a modified inference question: what is the probability that the null hypothesis is rejected for the original observed cases map with a most likely cluster of size k, taking into account only those most likely clusters of size k found under the null hypothesis for comparison? This question is especially important when the p-value computed by the usual inference process is near the alpha significance level, regarding the correctness of the decision based on this inference. A practical procedure is provided to make more accurate inferences about the most likely cluster found by the spatial scan statistic.
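    The unconditional Monte Carlo inference being modified can be sketched as follows. This is a 1D toy (contiguous windows instead of circular zones on a map), and it shows only the standard inference, not the paper's size-conditioned variant:

```python
import numpy as np

def scan_max_llr(cases, expected):
    """Maximum Poisson log-likelihood ratio over contiguous windows: a 1D
    sketch of the scan statistic (the real method scans zones on a map)."""
    C, best, n = cases.sum(), 0.0, len(cases)
    for i in range(n):
        c = e = 0.0
        for j in range(i, n):
            c += cases[j]
            e += expected[j]
            if c > 0 and c > e and c < C:
                llr = c * np.log(c / e) + (C - c) * np.log((C - c) / (C - e))
                best = max(best, llr)
    return best

rng = np.random.default_rng(0)
pop = np.full(20, 1000.0)
cases = rng.poisson(5.0, 20).astype(float)
cases[8:11] += 10.0                       # implant a hot spot in three areas
expected = pop / pop.sum() * cases.sum()  # null: cases follow population

obs = scan_max_llr(cases, expected)
# Monte Carlo inference: redistribute the same total count under the null
# and rank the observed maximum against the simulated maxima
sims = [scan_max_llr(rng.multinomial(int(cases.sum()),
                                     pop / pop.sum()).astype(float), expected)
        for _ in range(199)]
p = (1 + sum(s >= obs for s in sims)) / (1 + len(sims))
print(f"max LLR = {obs:.2f}, Monte Carlo p = {p:.3f}")
```

The paper's point is that this single null distribution of maxima mixes clusters of all sizes; its modification compares the observed statistic only against null maxima whose most likely cluster has the same size k.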

  1. Effect of Audioanalgesia in 6- to 12-year-old Children during Dental Treatment Procedure.

    PubMed

    Ramar, Kavitha; Hariharavel, V P; Sinnaduri, Gayathri; Sambath, Gayathri; Zohni, Fathima; Alagu, Palani J

    2016-12-01

    To evaluate the effect of audioanalgesia in 6- to 12-year-old children during dental treatment procedures. A total of 40 children were selected and divided into two groups: a study group, with audioanalgesia, and a control group, without audioanalgesia. Their pain was evaluated using Venham's pain rating scale. Data were compared using a one-sample t-test in the Statistical Package for the Social Sciences (SPSS) (Inc.; Chicago, IL, USA), version 17.0. The difference between the control group and the study group was statistically significant (p < 0.05). The method of distraction using audioanalgesia instills a more positive dental attitude in children and decreases their pain perception. Playing or hearing music during a dental procedure significantly alters the perception of pain in 6- to 12-year-old children.

  2. Statistical Properties of a Two-Stage Procedure for Creating Sky Flats

    NASA Astrophysics Data System (ADS)

    Crawford, R. W.; Trueblood, M.

    2004-05-01

    Accurate flat fielding is an essential factor in image calibration and good photometry, yet no single method for creating flat fields is both practical and effective in all cases. At Winer Observatory, robotic telescope operation and the research program of Near Earth Object follow-up astrometry favor the use of sky flats formed from the many images that are acquired during a night. This paper reviews the statistical properties of the median-combine process used to create sky flats and discusses a computationally efficient procedure for two-stage combining of many images to form sky flats with relatively high signal-to-noise ratio (SNR). This procedure is in use at Winer for the flat field calibration of unfiltered images taken for NEO follow-up astrometry.
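    The two-stage combine might look like the following numpy sketch (the unit-median normalization and the group size of eight are assumptions for illustration, not Winer's actual parameters):

```python
import numpy as np

def sky_flat(frames, group_size=8):
    """Two-stage median combine: normalize each frame to unit median,
    median-combine within groups, then median-combine the group results.
    Grouping keeps memory bounded while the medians reject stars."""
    norm = [f / np.median(f) for f in frames]
    groups = [np.median(norm[i:i + group_size], axis=0)
              for i in range(0, len(norm), group_size)]
    flat = np.median(groups, axis=0)
    return flat / np.median(flat)  # final unit-median flat field

rng = np.random.default_rng(0)
true_flat = 1 + 0.05 * np.linspace(-1, 1, 64)[None, :]  # smooth gain gradient
frames = [rng.uniform(800, 1200) * true_flat + rng.normal(0, 5, (64, 64))
          for _ in range(48)]  # 48 sky frames of varying brightness

flat = sky_flat(frames)
print(float(np.abs(flat - true_flat).max()) < 0.02)  # → True
```

With many frames per night, the combined flat's SNR grows roughly with the square root of the number of frames, which is the motivation given in the abstract.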

  3. Operational NDT simulator, towards human factors integration in simulated probability of detection

    NASA Astrophysics Data System (ADS)

    Rodat, Damien; Guibert, Frank; Dominguez, Nicolas; Calmon, Pierre

    2017-02-01

    In the aeronautic industry, the performance demonstration of Non-Destructive Testing (NDT) procedures relies on Probability Of Detection (POD) analyses. This statistical approach measures the ability of the procedure to detect a flaw with regard to one of its characteristic dimensions. The inspection chain is evaluated as a whole, including equipment configuration, probe efficiency but also operator manipulations. Traditionally, a POD study requires an expensive campaign during which several operators apply the procedure on a large set of representative samples. Recently, new perspectives for the POD estimation have been introduced using NDT simulation to generate data. However, these approaches do not offer straightforward solutions to take the operator into account. The simulation of human factors, including cognitive aspects, often raises questions. To address these difficulties, we propose a concept of operational NDT simulator [1]. This work presents the first steps in the implementation of such a simulator for ultrasound phased array inspection of composite parts containing Flat Bottom Holes (FBHs). The final system will look like classical ultrasound testing equipment with a single exception: the displayed signals will be synthesized. Our hardware (ultrasound acquisition card, 3D position tracker) and software (position analysis, inspection scenario, synchronization, simulations) environments are developed as a bench to test the meta-modeling techniques able to provide fast-simulated realistic ultrasound signals. The results presented here are obtained by on-the-fly merging of real and simulated signals. They confirm the feasibility of our approach: the replacement of real signals by purely simulated ones went unnoticed by operators. We believe this simulator is a great prospect for POD evaluation including human factors, and may also find applications for training or procedure set-up.

  4. Hemostasis and Post-operative Care of Oral Surgical Wounds by Hemcon Dental Dressing in Patients on Oral Anticoagulant Therapy: A Split Mouth Randomized Controlled Clinical Trial.

    PubMed

    Kumar, K R Ashok; Kumar, Jambukeshwar; Sarvagna, Jagadesh; Gadde, Praveen; Chikkaboriah, Shwetha

    2016-09-01

    Hemostasis is a fundamental management issue post-operatively in minor oral surgical procedures. Ensuring safety and therapeutic efficacy in patients under oral anticoagulant therapy is complicated by the necessity for frequent determination of prothrombin time or international normalised ratio. The aim of the study was to determine whether early hemostasis achieved by using the Hemcon Dental Dressing (HDD) affects post-operative care and surgical healing outcomes in minor oral surgical procedures. A total of 30 patients, aged 18 years to 90 years, except those allergic to seafood, who consented to participate, were enrolled into this study. Patients were required to have two or more surgical sites so that they would have both surgical and control sites. All patients taking Oral Anticoagulation Therapy (OAT) were included for treatment in the study without altering the anticoagulant regimens. Institutional Review Board approval was obtained for the study. The collected data were subjected to statistical analysis using an unpaired t-test. All HDD-treated surgical sites achieved hemostasis in 1.49 minutes and control wounds in 4.06 minutes (p < 0.001). Post-operative pain at HDD-treated sites (1.87 and 1.27 on the 1st and 3rd days, respectively) was significantly lower than at the control sites (4.0 and 1.87 on the 1st and 3rd days, respectively; p = 0.001 for both). HDD-treated oral surgery wounds achieved statistically significantly improved healing at both the 1st and 3rd post-operative days (p < 0.0001). The HDD has been proven to be a clinically effective hemostatic dressing material that significantly shortens bleeding time following minor oral surgical procedures under local anaesthesia, including in patients taking OAT. Patients receiving the HDD had improved surgical wound healing compared with controls.

  5. Understanding administrative abdominal aortic aneurysm mortality data.

    PubMed

    Hussey, K; Siddiqui, T; Burton, P; Welch, G H; Stuart, W P

    2015-03-01

    Administrative data in the form of Hospital Episode Statistics (HES) and the Scottish Morbidity Record (SMR) have been used to describe surgical activity. These data have also been used to compare outcomes from different hospitals and regions, and to corroborate data submitted to national audits and registries. The aim of this observational study was to examine the completeness and accuracy of administrative data relating to abdominal aortic aneurysm (AAA) repair. Administrative data (SMR-01 returns) from a single health board relating to AAA repair were requested (September 2007 to August 2012). A complete list of validated procedures, termed the reference data set, was compiled from all available sources (clinical and administrative). For each patient episode electronic health records were scrutinised to confirm urgency of admission, diagnosis, and operative repair. The 30-day mortality was recorded. The reference data set was used to systematically validate the SMR-01 returns. The reference data set contained 608 verified procedures. SMR-01 returns identified 2433 episodes of care (1724 patients) in which a discharge diagnosis included AAA. This included 574 operative repairs. There were 34 missing cases (5.6%) from the SMR-01 returns; nine of these patients died within 30 days of the index procedure. Omission of these cases made a statistically significant improvement to perceived 30-day mortality (p < .05, chi-square test). If inconsistent SMR-01 data (in terms of ICD-10 and OPCS-4 codes) were excluded, only 81.9% of operative repairs were correctly identified and only 30.9% of deaths were captured. The SMR-01 returns contain multiple errors. There also appears to be a systematic bias that reduces apparent 30-day mortality. Using these data alone to describe or compare activity or outcomes must be done with caution. Copyright © 2014 European Society for Vascular Surgery. Published by Elsevier Ltd. All rights reserved.

  6. Complications after laparoscopic and open subtotal colectomy for inflammatory colitis: a case-matched comparison.

    PubMed

    Parnaby, C N; Ramsay, G; Macleod, C S; Hope, N R; Jansen, J O; McAdam, T K

    2013-11-01

    The aim of this study was to compare the early postoperative outcome of patients undergoing laparoscopic subtotal colectomy with those undergoing open subtotal colectomy for colitis refractory to medical treatment. A retrospective observational study was carried out of patients who underwent subtotal colectomy for refractory colitis, at a single centre, between 2006 and 2012. Patients were matched for age, gender, American Society of Anesthesiology (ASA) grade, urgency of operation and immunosuppressant/modulator treatment. The primary outcome measure was the number of postoperative complications, classified using the Clavien-Dindo scale. Secondary end-points included procedure duration, laparoscopic conversion rates, blood loss, 30-day readmission rates and length of hospital stay. Ninety-six patients were included, 39 of whom had laparoscopic surgery. Thirty-two of these were matched to similar patients who underwent an open procedure. The overall duration of the procedure was longer for laparoscopic surgery than for open surgery (median: 240 vs 150 min, P < 0.005) but estimated blood loss was less (median: 75 vs 400 ml, P < 0.005). In the laparoscopic group, 23 patients experienced 27 complications, and in the open surgery group, 23 patients experienced 30 complications. Most complications were minor (Grade I/II), and the distribution of complications, by grade, was similar between the two groups. There was no statistically significant difference in 30-day readmission rates between the laparoscopic and open groups (five readmissions vs eight readmissions, P = 0.536). Length of hospital stay was 4 days shorter for laparoscopic surgery, but this difference was not statistically significant (median: 7 vs 11 days, P = 0.159). 
In patients requiring colectomy for acute severe colitis, laparoscopic surgery reduced blood loss but increased operating time and was not associated with a reduction in early postoperative complications, length of hospital stay or readmission rates. Colorectal Disease © 2013 The Association of Coloproctology of Great Britain and Ireland.

  7. Statistical theory and methodology for remote sensing data analysis with special emphasis on LACIE

    NASA Technical Reports Server (NTRS)

    Odell, P. L.

    1975-01-01

    Crop proportion estimators for determining crop acreage through the use of remote sensing were evaluated. Several studies of these estimators were conducted, including an empirical comparison of the different estimators (using actual data) and an empirical study of the sensitivity (robustness) of the class of mixture estimators. The effect of missing data upon crop classification procedures is discussed in detail including a simulation of the missing data effect. The final problem addressed is that of taking yield data (bushels per acre) gathered at several yield stations and extrapolating these values over some specified large region. Computer programs developed in support of some of these activities are described.

  8. A review of the application of nonattenuating frequency radars for estimating rain attenuation and space-diversity performance

    NASA Technical Reports Server (NTRS)

    Goldhirsh, J.

    1979-01-01

    Cumulative rain fade statistics are used by space communications engineers to establish transmitter power and receiver sensitivities for systems operating under various geometries, climates, and radio frequencies. Space-diversity performance criteria are also of interest. This work represents a review, in which are examined the many elements involved in the employment of single nonattenuating frequency radars for arriving at the desired information. The elements examined include radar techniques and requirements, phenomenological assumptions, path attenuation formulations and procedures, as well as error budgeting and calibration analysis. Included are the pertinent results of previous investigators who have used radar for rain-attenuation modeling. Suggestions are made for improving present methods.

  9. Surgeon Training, Protocol Compliance, and Technical Outcomes From Breast Cancer Sentinel Lymph Node Randomized Trial

    PubMed Central

    Ashikaga, Takamaru; Harlow, Seth P.; Skelly, Joan M.; Julian, Thomas B.; Brown, Ann M.; Weaver, Donald L.; Wolmark, Norman

    2009-01-01

    Background The National Surgical Adjuvant Breast and Bowel Project B-32 trial was designed to determine whether sentinel lymph node resection can achieve the same therapeutic outcomes as axillary lymph node resection but with fewer side effects and is one of the most carefully controlled and monitored randomized trials in the field of surgical oncology. We evaluated the relationship of surgeon trial preparation, protocol compliance audit, and technical outcomes. Methods Preparation for this trial included a protocol manual, a site visit with key participants, an intraoperative session with the surgeon, and prerandomization documentation of protocol compliance. Training categories included surgeons who submitted material on five prerandomization surgeries and were trained by a core trainer (category 1) or by a site trainer (category 2). An expedited group (category 3) included surgeons with extensive experience who submitted material on one prerandomization surgery. At completion of training, surgeons could accrue patients. Two hundred twenty-four surgeons enrolled 4994 patients with breast cancer and were audited for 94 specific items in the following four categories: procedural, operative note, pathology report, and data entry. The relationship of training method; protocol compliance performance audit; and the technical outcomes of the sentinel lymph node resection rate, false-negative rate, and number of sentinel lymph nodes removed was determined. All statistical tests were two-sided. Results The overall sentinel lymph node resection success rate was 96.9% (95% confidence interval [CI] = 96.4% to 97.4%), and the overall false-negative rate was 9.5% (95% CI = 7.4% to 12.0%), with no statistical differences between training methods. Overall audit outcomes were excellent in all four categories. 
For all three training groups combined, a statistically significant positive association was observed between surgeons’ average number of procedural errors and their false-negative rate (ρ = +0.188, P = .021). Conclusions All three training methods resulted in uniform and high overall sentinel lymph node resection rates. Subgroup analyses identified some variation in false-negative rates that were related to audited outcome performance measures. PMID:19704072
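
The association reported above (ρ = +0.188) is a Spearman rank correlation. As a minimal illustration of how ρ is computed (illustrative only; the values below are hypothetical, not the trial's data):

```python
def spearman_rho(x, y):
    """Spearman rank correlation: the Pearson correlation of the ranks."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        i = 0
        while i < len(v):
            j = i
            # extend j over any run of tied values
            while j + 1 < len(v) and v[order[j + 1]] == v[order[i]]:
                j += 1
            avg_rank = (i + j) / 2 + 1  # average rank for the tied run
            for k in range(i, j + 1):
                r[order[k]] = avg_rank
            i = j + 1
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# example: monotone but not perfectly ordered association
rho = spearman_rho([1, 2, 3, 4, 5], [2, 1, 4, 3, 5])  # → 0.8
```

Because ranks replace raw values, the statistic is robust to outliers and monotone transformations, which suits audit-score data like error counts.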

  10. Meta-analysis of Pentacam vs. ultrasound pachymetry in central corneal thickness measurement in normal, post-LASIK or PRK, and keratoconic or keratoconus-suspect eyes.

    PubMed

    Wu, Wenjing; Wang, Yan; Xu, Lulu

    2014-01-01

    The aim of this meta-analysis is to evaluate differences in central corneal thickness (CCT) measurement between the Pentacam (Oculus Inc., Germany) and ultrasound pachymetry (USP) in normal eyes (unoperated myopic and astigmatic eyes without corneal disease or topographic irregularity), in eyes after laser in situ keratomileusis (LASIK) or photorefractive keratectomy (PRK), and in keratoconic or keratoconus-suspect eyes. We assess whether Pentacam and USP yield similar CCT measurements in normal corneas, in thinner corneas after LASIK or PRK procedures, and in keratoconic or keratoconus-suspect eyes. Data sources, including PubMed, Medline, EMBASE, and the Cochrane Central Register of Controlled Trials on the Cochrane Library, were searched for relevant studies. The primary outcome measure was the CCT difference between Pentacam and USP. Three groups of eyes were analyzed: normal eyes; eyes after LASIK or PRK; and keratoconus-suspect or keratoconic eyes. Nineteen studies describing 1,908 eyes were included in the normal group. Pentacam readings were 1.47 μm higher than USP (95% confidence interval [CI] -2.32 to 5.27), a difference that was not statistically significant (P = 0.45). Nine studies with a total of 539 eyes were included in the post-LASIK or post-PRK group; the mean difference between Pentacam and ultrasound pachymetry was 1.03 μm (95% CI -3.36 to 5.42), which was not statistically significant (P = 0.64). Four studies with a total of 185 eyes were included in the keratoconic or keratoconus-suspect group; here the mean difference was -6.33 μm (95% CI -9.17 to -3.49), a statistically significant difference between Pentacam and ultrasound pachymetry (P < 0.0001). Pentacam therefore offers CCT results similar to ultrasound pachymetry in normal eyes and in thinner corneas after LASIK or PRK procedures. In keratoconic or keratoconus-suspect eyes, however, Pentacam slightly underestimates central corneal thickness relative to ultrasound pachymetry, which may result from difficulty of fixation in keratoconic eyes, misalignment of the Pentacam, and variation in ultrasonic velocity due to histological deformation of the cornea.
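
The pooled differences quoted above come from inverse-variance meta-analysis. A minimal fixed-effect sketch (the per-study values below are hypothetical, not the data from the included studies):

```python
import math

def pooled_mean_difference(diffs, ses, z=1.96):
    """Fixed-effect inverse-variance pooling of per-study mean differences.
    Returns the pooled estimate and its approximate 95% confidence interval."""
    weights = [1.0 / se ** 2 for se in ses]          # weight = 1 / variance
    pooled = sum(w * d for w, d in zip(weights, diffs)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - z * se_pooled, pooled + z * se_pooled)

# hypothetical per-study CCT differences (Pentacam minus USP, micrometers)
diffs = [2.1, -0.5, 1.8, 0.9]
ses = [1.5, 2.0, 1.2, 1.8]
est, (lo, hi) = pooled_mean_difference(diffs, ses)
```

Studies with smaller standard errors dominate the pooled estimate; a random-effects model would additionally widen the interval to account for between-study heterogeneity.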

  11. On representing the prognostic value of continuous gene expression biomarkers with the restricted mean survival curve.

    PubMed

    Eng, Kevin H; Schiller, Emily; Morrell, Kayla

    2015-11-03

    Researchers developing biomarkers for cancer prognosis from quantitative gene expression data are often faced with an odd methodological discrepancy: while Cox's proportional hazards model, the appropriate and popular technique, produces a continuous and relative risk score, it is hard to cast the estimate in clear clinical terms like median months of survival and percent of patients affected. To produce a familiar Kaplan-Meier plot, researchers commonly make the decision to dichotomize a continuous (often unimodal and symmetric) score. It is well known in the statistical literature that this procedure induces significant bias. We illustrate the liabilities of common techniques for categorizing a risk score and discuss alternative approaches. We promote the use of the restricted mean survival (RMS) and the corresponding RMS curve that may be thought of as an analog to the best fit line from simple linear regression. Continuous biomarker workflows should be modified to include the more rigorous statistical techniques and descriptive plots described in this article. All statistics discussed can be computed via standard functions in the Survival package of the R statistical programming language. Example R language code for the RMS curve is presented in the appendix.
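
The paper's example code (in its appendix) uses R's Survival package; as a language-neutral illustration, the restricted mean survival is simply the area under the Kaplan-Meier curve up to a truncation time τ. A minimal sketch with hypothetical event times, not the authors' workflow:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate: returns (time, survival) step-function points.
    events[i] is 1 for an observed event, 0 for right-censoring."""
    data = sorted(zip(times, events))
    n = len(data)
    at_risk, surv = n, 1.0
    pts = [(0.0, 1.0)]
    i = 0
    while i < n:
        t = data[i][0]
        deaths = removed = 0
        while i < n and data[i][0] == t:   # group ties at the same time
            deaths += data[i][1]
            removed += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / at_risk
            pts.append((t, surv))
        at_risk -= removed
    return pts

def restricted_mean_survival(pts, tau):
    """Area under the Kaplan-Meier step function on [0, tau]."""
    area = 0.0
    t_prev, s_prev = pts[0]
    for t, s in pts[1:]:
        if t >= tau:
            break
        area += s_prev * (t - t_prev)
        t_prev, s_prev = t, s
    return area + s_prev * (tau - t_prev)

# hypothetical survival times (all events observed), truncated at tau = 10
pts = kaplan_meier([2, 3, 5, 7, 11], [1, 1, 1, 1, 1])
rms = restricted_mean_survival(pts, 10.0)   # ≈ 5.4
```

Unlike a dichotomized risk score, the RMS is reported directly in time units ("months of survival up to τ"), which is the clinically interpretable quantity the paper advocates.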

  12. Statistical variances of diffusional properties from ab initio molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    He, Xingfeng; Zhu, Yizhou; Epstein, Alexander; Mo, Yifei

    2018-12-01

    Ab initio molecular dynamics (AIMD) simulation is widely employed in studying diffusion mechanisms and in quantifying diffusional properties of materials. However, AIMD simulations are often limited to a few hundred atoms and a short, sub-nanosecond physical timescale, which leads to models that include only a limited number of diffusion events. As a result, the diffusional properties obtained from AIMD simulations are often plagued by poor statistics. In this paper, we re-examine the process to estimate diffusivity and ionic conductivity from the AIMD simulations and establish the procedure to minimize the fitting errors. In addition, we propose methods for quantifying the statistical variance of the diffusivity and ionic conductivity from the number of diffusion events observed during the AIMD simulation. Since an adequate number of diffusion events must be sampled, AIMD simulations should be sufficiently long and can only be performed on materials with reasonably fast diffusion. We chart the ranges of materials and physical conditions that can be accessible by AIMD simulations in studying diffusional properties. Our work provides the foundation for quantifying the statistical confidence levels of diffusion results from AIMD simulations and for correctly employing this powerful technique.
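
As background for the quantity being estimated: diffusivity is commonly extracted from an AIMD trajectory via the Einstein relation, i.e., the slope of mean squared displacement (MSD) versus time. A minimal sketch with synthetic MSD data (the paper's actual fitting and variance-quantification procedure is more involved):

```python
def diffusivity_from_msd(times, msd, dim=3):
    """Einstein relation: D = slope(MSD vs. t) / (2 * dim),
    with the slope from an ordinary least-squares fit."""
    n = len(times)
    tbar = sum(times) / n
    mbar = sum(msd) / n
    slope = (sum((t - tbar) * (m - mbar) for t, m in zip(times, msd))
             / sum((t - tbar) ** 2 for t in times))
    return slope / (2 * dim)

# synthetic MSD growing linearly with slope 0.6 (units of length^2 / time)
times = [float(t) for t in range(1, 11)]
msd = [0.6 * t for t in times]
D = diffusivity_from_msd(times, msd)   # ≈ 0.1
```

In practice the fitting window matters: short-time ballistic motion and the noisy long-time tail of the MSD should be excluded, and, as the paper emphasizes, the statistical variance of D shrinks only as more diffusion events are sampled.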

  13. Quality assurance software inspections at NASA Ames: Metrics for feedback and modification

    NASA Technical Reports Server (NTRS)

    Wenneson, G.

    1985-01-01

    Software inspections--a set of formal technical review procedures held at selected key points during software development to find defects in software documents--are described in terms of history, participants, tools, procedures, statistics, and database analysis.

  14. Least Squares Procedures.

    ERIC Educational Resources Information Center

    Hester, Yvette

    Least squares methods are sophisticated mathematical curve fitting procedures used in all classical parametric methods. The linear least squares approximation is most often associated with finding the "line of best fit" or the regression line. Since all statistical analyses are correlational and all classical parametric methods are least…

  15. 40 CFR 610.10 - Program purpose.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... DEVICES Test Procedures and Evaluation Criteria General Provisions § 610.10 Program purpose. (a) The... standardized procedures, the performance of various retrofit devices applicable to automobiles for which fuel... statistical analysis of data from vehicle tests, the evaluation program will determine the effects on fuel...

  16. 40 CFR 610.10 - Program purpose.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... DEVICES Test Procedures and Evaluation Criteria General Provisions § 610.10 Program purpose. (a) The... standardized procedures, the performance of various retrofit devices applicable to automobiles for which fuel... statistical analysis of data from vehicle tests, the evaluation program will determine the effects on fuel...

  17. 40 CFR 610.10 - Program purpose.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... DEVICES Test Procedures and Evaluation Criteria General Provisions § 610.10 Program purpose. (a) The... standardized procedures, the performance of various retrofit devices applicable to automobiles for which fuel... statistical analysis of data from vehicle tests, the evaluation program will determine the effects on fuel...

  18. 40 CFR 610.10 - Program purpose.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... DEVICES Test Procedures and Evaluation Criteria General Provisions § 610.10 Program purpose. (a) The... standardized procedures, the performance of various retrofit devices applicable to automobiles for which fuel... statistical analysis of data from vehicle tests, the evaluation program will determine the effects on fuel...

  19. Statistical and optimal learning with applications in business analytics

    NASA Astrophysics Data System (ADS)

    Han, Bin

    Statistical learning is widely used in business analytics to discover structure or exploit patterns in historical data and to build models that capture relationships between an outcome of interest and a set of variables. Optimal learning, on the other hand, addresses the operational side of the problem by iterating between decision making and data acquisition/learning. The two problems often go hand in hand, exhibiting a feedback loop between statistics and optimization. We apply this statistical/optimal learning concept to a fundraising marketing campaign problem arising in many non-profit organizations. Many such organizations use direct-mail marketing to cultivate one-time donors and convert them into recurring contributors. Cultivated donors generate much more revenue than new donors, but also lapse with time, making it important to steadily draw in new cultivations. The direct-mail budget is limited, but better-designed mailings can improve success rates without increasing costs. We first apply statistical learning to analyze the effectiveness of several design approaches used in practice, based on a massive dataset covering 8.6 million direct-mail communications with donors to the American Red Cross during 2009-2011. We find evidence that mailed appeals are more effective when they emphasize disaster preparedness and training efforts over post-disaster cleanup. Including small cards that affirm donors' identity as Red Cross supporters is an effective strategy, while including gift items such as address labels is not. Finally, very recent acquisitions are more likely to respond to appeals that ask them to contribute an amount similar to their most recent donation, but this approach has an adverse effect on donors with a longer history. We show via simulation that a simple design strategy based on these insights has the potential to improve success rates from 5.4% to 8.1%.
When a new scenario arises, however, new data must be acquired to update the model and decisions; this is studied under the optimal learning framework. The goal becomes discovering a sequential information-collection strategy that identifies the best campaign design alternative as quickly as possible. A regression structure is used to learn about a set of unknown parameters, alternating with optimization to design new data points. Such problems have been extensively studied in the ranking and selection (R&S) community, but traditional R&S procedures incur high computational costs when the decision space grows combinatorially. We present a value of information procedure for simultaneously learning unknown regression parameters and unknown sampling noise. We then develop an approximate version of the procedure, based on semidefinite programming relaxation, that retains good performance and scales better to large problems. We also prove the asymptotic consistency of the algorithm in the parametric model, a result that was not previously available even for the known-variance case.
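
For intuition on the value-of-information idea: under independent normal beliefs with known sampling noise, the textbook knowledge-gradient formula scores each alternative by the expected improvement in the best posterior mean from one more sample. A minimal sketch of that standard formula (not the regression/semidefinite-programming procedure developed in this work):

```python
import math

def kg_values(mu, sigma2, noise_var):
    """Knowledge-gradient value of one more sample of each alternative,
    under independent normal beliefs with known sampling-noise variance."""
    def f(z):
        phi = math.exp(-z * z / 2) / math.sqrt(2 * math.pi)   # normal pdf
        Phi = 0.5 * (1 + math.erf(z / math.sqrt(2)))          # normal cdf
        return z * Phi + phi
    values = []
    for i, (m, s2) in enumerate(zip(mu, sigma2)):
        best_other = max(mu[j] for j in range(len(mu)) if j != i)
        # std. dev. of the change in the posterior mean after one sample
        s_tilde = s2 / math.sqrt(s2 + noise_var)
        values.append(s_tilde * f(-abs(m - best_other) / s_tilde))
    return values

# three designs with equal means: sampling the most uncertain one is worth most
vals = kg_values([0.0, 0.0, 0.0], [1.0, 1.0, 4.0], 1.0)
```

The policy samples the alternative with the largest value, then updates its belief and repeats; the computational burden discussed above arises when the alternatives are combinatorial design vectors rather than a small discrete list.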

  20. The epistemology of mathematical and statistical modeling: a quiet methodological revolution.

    PubMed

    Rodgers, Joseph Lee

    2010-01-01

    A quiet methodological revolution, a modeling revolution, has occurred over the past several decades, almost without discussion. In contrast, the 20th century ended with contentious argument over the utility of null hypothesis significance testing (NHST). The NHST controversy may have been at least partially irrelevant, because in certain ways the modeling revolution obviated the NHST argument. I begin with a history of NHST and modeling and their relation to one another. Next, I define and illustrate principles involved in developing and evaluating mathematical models. I then discuss the difference between using statistical procedures within a rule-based framework and building mathematical models from a scientific epistemology. Only the former is treated carefully in most psychology graduate training. The pedagogical implications of this imbalance and the revised pedagogy required to account for the modeling revolution are described. To conclude, I discuss how attention to modeling implies shifting statistical practice in certain progressive ways. The epistemological basis of statistics has moved away from being a set of procedures applied mechanistically and toward building and evaluating statistical and scientific models. Copyright 2009 APA, all rights reserved.
