Sample records for standard statistical procedures

  1. BTS statistical standards manual

    DOT National Transportation Integrated Search

    2005-10-01

    The Bureau of Transportation Statistics (BTS), like other federal statistical agencies, establishes professional standards to guide the methods and procedures for the collection, processing, storage, and presentation of statistical data. Standards an...

  2. 10 CFR Appendix II to Part 504 - Fuel Price Computation

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 504—Fuel Price Computation (a) Introduction. This appendix provides the equations and parameters... inflation indices must follow standard statistical procedures and must be fully documented within the... the weighted average fuel price must follow standard statistical procedures and be fully documented...

  3. Applying a statistical PTB detection procedure to complement the gold standard.

    PubMed

    Noor, Norliza Mohd; Yunus, Ashari; Bakar, S A R Abu; Hussin, Amran; Rijal, Omar Mohd

    2011-04-01

    This paper investigates a novel statistical discrimination procedure to detect PTB when the gold standard requirement is taken into consideration. Archived data were used to establish two groups of patients: a control group and a test group. The control group was used to develop the statistical discrimination procedure, using four vectors of wavelet coefficients as feature vectors for the detection of pulmonary tuberculosis (PTB), lung cancer (LC), and normal lung (NL). This discrimination procedure was investigated using the test group, where the numbers of sputum-positive and sputum-negative cases that were correctly classified as PTB cases were noted. The proposed statistical discrimination method is able to detect PTB patients and LC with a high true positive fraction. The method is also able to detect PTB patients who are sputum negative and therefore may be used as a complement to the gold standard. Copyright © 2010 Elsevier Ltd. All rights reserved.

  4. 40 CFR 1065.12 - Approval of alternate procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... engine meets all applicable emission standards according to specified procedures. (iii) Use statistical.... (e) We may give you specific directions regarding methods for statistical analysis, or we may approve... statistical tests. Perform the tests as follows: (1) Repeat measurements for all applicable duty cycles at...

  5. Randomization Procedures Applied to Analysis of Ballistic Data

    DTIC Science & Technology

    1991-06-01

    ...Keywords: data analysis; computationally intensive statistics; randomization tests; permutation tests; nonparametric statistics... Any reasonable statistical procedure would fail to support the notion of improvement of dynamic over standard indexing based on this data... AD-A238 389, Technical Report BRL-TR-3245, Randomization Procedures Applied to Analysis of Ballistic Data, Malcolm S. Taylor and Barry A. Bodt, June 1991.

  6. Introducing Statistical Inference to Biology Students through Bootstrapping and Randomization

    ERIC Educational Resources Information Center

    Lock, Robin H.; Lock, Patti Frazer

    2008-01-01

    Bootstrap methods and randomization tests are increasingly being used as alternatives to standard statistical procedures in biology. They also serve as an effective introduction to the key ideas of statistical inference in introductory courses for biology students. We discuss the use of such simulation-based procedures in an integrated curriculum…
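
The two simulation-based procedures this record names can be sketched in a few lines of stdlib Python; the two "growth" samples below are invented purely for illustration:

```python
import random
import statistics

random.seed(42)

def permutation_test(a, b, n_perm=5000):
    """Two-sample randomization test: p-value for the observed difference
    in means under random relabeling of the pooled observations."""
    observed = statistics.mean(a) - statistics.mean(b)
    pooled = a + b
    count = 0
    for _ in range(n_perm):
        random.shuffle(pooled)
        diff = statistics.mean(pooled[:len(a)]) - statistics.mean(pooled[len(a):])
        if abs(diff) >= abs(observed):
            count += 1
    return count / n_perm

def bootstrap_ci(sample, n_boot=5000, alpha=0.05):
    """Percentile bootstrap confidence interval for the mean."""
    means = sorted(
        statistics.mean(random.choices(sample, k=len(sample)))
        for _ in range(n_boot)
    )
    return means[int(n_boot * alpha / 2)], means[int(n_boot * (1 - alpha / 2))]

# Hypothetical plant-growth measurements (illustrative only).
control = [4.1, 3.9, 4.4, 4.0, 3.8, 4.2]
treated = [4.9, 5.1, 4.7, 5.3, 4.8, 5.0]

p = permutation_test(treated, control)
ci = bootstrap_ci(treated)
```

Both procedures make no normality assumption, which is exactly why they suit an introductory biology course.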

  7. Standard and goodness-of-fit parameter estimation methods for the three-parameter lognormal distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kane, V.E.

    1982-01-01

    A class of goodness-of-fit estimators is found to provide a useful alternative, in certain situations, to the standard maximum likelihood method, which has some undesirable estimation characteristics for the three-parameter lognormal distribution. The class of goodness-of-fit tests considered includes the Shapiro-Wilk and Filliben tests, which reduce to a weighted linear combination of the order statistics that can be maximized in estimation problems. The weighted order statistic estimators are compared to the standard procedures in Monte Carlo simulations. Robustness of the procedures is examined and example data sets are analyzed.
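
A minimal sketch of threshold estimation by maximizing a probability-plot correlation, in the spirit of the weighted order-statistic estimators described here; the Blom plotting positions, the grid search, and the simulated data are illustrative assumptions, not the author's exact procedure:

```python
import math
import random
from statistics import NormalDist

def pearson(u, v):
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    num = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    den = (math.sqrt(sum((a - mu) ** 2 for a in u))
           * math.sqrt(sum((b - mv) ** 2 for b in v)))
    return num / den

def ppcc_threshold(x, n_grid=200):
    """Estimate the threshold of a three-parameter lognormal by maximizing
    the correlation between ordered log(x - tau) and approximate normal
    order-statistic medians (Blom plotting positions)."""
    xs = sorted(x)
    n = len(xs)
    scores = [NormalDist().inv_cdf((i - 0.375) / (n + 0.25))
              for i in range(1, n + 1)]
    span = xs[-1] - xs[0]
    lo, hi = xs[0] - span, xs[0] - 1e-6 * span   # tau must stay below min(x)
    best_tau, best_r = lo, -1.0
    for k in range(n_grid):
        tau = lo + (hi - lo) * k / (n_grid - 1)
        y = [math.log(v - tau) for v in xs]
        r = pearson(y, scores)
        if r > best_r:
            best_tau, best_r = tau, r
    return best_tau, best_r

# Simulated three-parameter lognormal data with true threshold 10.
random.seed(1)
data = [10.0 + math.exp(random.gauss(0.0, 1.0)) for _ in range(200)]
tau_hat, r_hat = ppcc_threshold(data)
```

The correlation plays the role of the weighted linear combination of order statistics that the abstract says can be maximized in estimation problems.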

  8. Fitting a three-parameter lognormal distribution with applications to hydrogeochemical data from the National Uranium Resource Evaluation Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kane, V.E.

    1979-10-01

    The standard maximum likelihood and moment estimation procedures are shown to have some undesirable characteristics for estimating the parameters in a three-parameter lognormal distribution. A class of goodness-of-fit estimators is found which provides a useful alternative to the standard methods. The class of goodness-of-fit tests considered includes the Shapiro-Wilk and Shapiro-Francia tests, which reduce to a weighted linear combination of the order statistics that can be maximized in estimation problems. The weighted-order statistic estimators are compared to the standard procedures in Monte Carlo simulations. Bias and robustness of the procedures are examined and example data sets analyzed, including geochemical data from the National Uranium Resource Evaluation Program.

  9. A Comparison of Approaches for Setting Proficiency Standards.

    ERIC Educational Resources Information Center

    Koffler, Stephen L.

    This research compared the cut-off scores estimated from an empirical procedure (Contrasting group method) to those determined from a more theoretical process (Nedelsky method). A methodological and statistical framework was also provided for analysis of the data to obtain the most appropriate standard using the empirical procedure. Data were…

  10. Statistical Assessment of Variability of Terminal Restriction Fragment Length Polymorphism Analysis Applied to Complex Microbial Communities

    PubMed Central

    Rossi, Pierre; Gillet, François; Rohrbach, Emmanuelle; Diaby, Nouhou; Holliger, Christof

    2009-01-01

    The variability of terminal restriction fragment length polymorphism analysis applied to complex microbial communities was assessed statistically. Recent technological improvements were implemented in the successive steps of the procedure, resulting in a standardized procedure which provided a high level of reproducibility. PMID:19749066

  11. 5 CFR 532.215 - Establishments included in regular appropriated fund surveys.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... in surveys shall be selected under standard probability sample selection procedures. In areas with... establishment list drawn under statistical sampling procedures. [55 FR 46142, Nov. 1, 1990] ...

  12. The Development of Statistical Models for Predicting Surgical Site Infections in Japan: Toward a Statistical Model-Based Standardized Infection Ratio.

    PubMed

    Fukuda, Haruhisa; Kuroki, Manabu

    2016-03-01

    To develop and internally validate a surgical site infection (SSI) prediction model for Japan. Retrospective observational cohort study. We analyzed surveillance data submitted to the Japan Nosocomial Infections Surveillance system for patients who had undergone target surgical procedures from January 1, 2010, through December 31, 2012. Logistic regression analyses were used to develop statistical models for predicting SSIs. An SSI prediction model was constructed for each of the procedure categories by statistically selecting the appropriate risk factors from among the collected surveillance data and determining their optimal categorization. Standard bootstrapping techniques were applied to assess potential overfitting. The C-index was used to compare the predictive performances of the new statistical models with those of models based on conventional risk index variables. The study sample comprised 349,987 cases from 428 participant hospitals throughout Japan, and the overall SSI incidence was 7.0%. The C-indices of the new statistical models were significantly higher than those of the conventional risk index models in 21 (67.7%) of the 31 procedure categories (P<.05). No significant overfitting was detected. Japan-specific SSI prediction models were shown to generally have higher accuracy than conventional risk index models. These new models may have applications in assessing hospital performance and identifying high-risk patients in specific procedure categories.
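
The C-index used to compare models in this record is the probability that, of a randomly chosen infected/uninfected pair, the infected patient received the higher predicted risk. A minimal sketch on hypothetical risks and outcomes (not study data):

```python
from itertools import combinations

def c_index(risk, outcome):
    """C-statistic: among all pairs with different outcomes, the fraction
    in which the case (outcome 1) received the higher predicted risk.
    Ties in predicted risk count as half-concordant."""
    concordant = 0.0
    usable = 0
    for i, j in combinations(range(len(risk)), 2):
        if outcome[i] == outcome[j]:
            continue
        usable += 1
        case, ctrl = (i, j) if outcome[i] == 1 else (j, i)
        if risk[case] > risk[ctrl]:
            concordant += 1.0
        elif risk[case] == risk[ctrl]:
            concordant += 0.5
    return concordant / usable

# Hypothetical predicted SSI risks and observed infections (illustrative).
risk    = [0.02, 0.10, 0.08, 0.30, 0.05, 0.22, 0.15, 0.04]
outcome = [0,    0,    1,    1,    0,    1,    0,    0]
c = c_index(risk, outcome)
```

A value of 0.5 indicates no discrimination and 1.0 perfect discrimination, which is why the study reports model comparisons in terms of C-indices.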

  13. Development of QC Procedures for Ocean Data Obtained by National Research Projects of Korea

    NASA Astrophysics Data System (ADS)

    Kim, S. D.; Park, H. M.

    2017-12-01

    To establish a data management system for ocean data obtained by national research projects of the Ministry of Oceans and Fisheries of Korea, KIOST conducted standardization and development of QC procedures. After reviewing and analyzing existing international and domestic ocean-data standards and QC procedures, draft standards and QC procedures were prepared. The proposed standards and QC procedures were reviewed and revised several times by experts in the field of oceanography and by academic societies. A technical report was prepared covering standards for 25 data items and 12 QC procedures for physical, chemical, biological and geological data. The QC procedure for temperature and salinity data was set up by reference to the manuals published by GTSPP, ARGO and IOOS QARTOD. It consists of 16 QC tests applicable to vertical profile data and time series data obtained in real-time and delayed mode. Three regional range tests to inspect annual, seasonal and monthly variations were included in the procedure. Three programs were developed to calculate and provide upper and lower limits of temperature and salinity at depths from 0 to 1550 m. TS data from the World Ocean Database, ARGO, GTSPP and in-house data of KIOST were analysed statistically to calculate regional limits for the Northwest Pacific area. Based on this statistical analysis, the programs calculate regional ranges using the mean and standard deviation on three grid systems (3° grid, 1° grid and 0.5° grid) and provide recommendations. The QC procedures for 12 data items were set up during the 1st phase of the national program for data management (2012-2015) and are being applied practically to national research projects in the 2nd phase (2016-2019). The QC procedures will be revised by reviewing the results of QC application when the 2nd phase of the data management program is completed.
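
A regional range test of the kind described can be sketched as follows; the grid-cell size, the mean ± 3·SD limits, the QARTOD-style flags (1 = pass, 4 = fail, 0 = no limits available), and the observations themselves are illustrative assumptions:

```python
import statistics
from collections import defaultdict

def regional_limits(records, cell_deg=1.0, k=3.0):
    """Per-grid-cell mean +/- k*sd limits for a variable, computed from a
    reference (historical) data set, in the spirit of a regional range test."""
    cells = defaultdict(list)
    for lat, lon, value in records:
        cells[(int(lat // cell_deg), int(lon // cell_deg))].append(value)
    limits = {}
    for key, vals in cells.items():
        if len(vals) < 2:
            continue
        m, s = statistics.mean(vals), statistics.stdev(vals)
        limits[key] = (m - k * s, m + k * s)
    return limits

def range_flags(records, limits, cell_deg=1.0):
    """Flag each new observation against the precomputed regional limits."""
    flags = []
    for lat, lon, value in records:
        key = (int(lat // cell_deg), int(lon // cell_deg))
        if key not in limits:
            flags.append(0)
        else:
            lo, hi = limits[key]
            flags.append(1 if lo <= value <= hi else 4)
    return flags

# Hypothetical reference temperatures (lat, lon, temp_C) and new observations.
historical = [(35.2, 129.1, 18.1), (35.3, 129.4, 18.4), (35.1, 129.2, 18.0),
              (35.4, 129.3, 18.2), (35.25, 129.2, 18.3), (35.15, 129.35, 17.9)]
new_obs = [(35.2, 129.5, 18.5),    # plausible value: passes
           (35.3, 129.1, 31.5),    # spike: fails the regional range
           (40.0, 140.0, 12.0)]    # cell with no reference data
lims = regional_limits(historical)
flags = range_flags(new_obs, lims)
```

Computing the limits from a reference data set (as the record does with World Ocean Database, ARGO and GTSPP data) keeps an outlier from inflating its own acceptance range.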

  14. Nevada Applied Ecology Group procedures handbook for environmental transuranics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, M.G.; Dunaway, P.B.

    The activities of the Nevada Applied Ecology Group (NAEG) integrated research studies of environmental plutonium and other transuranics at the Nevada Test Site have required many standardized field and laboratory procedures. These include sampling techniques, collection and preparation, radiochemical and wet chemistry analysis, data bank storage and reporting, and statistical considerations for environmental samples of soil, vegetation, resuspended particles, animals, and others. This document, printed in two volumes, includes most of the Nevada Applied Ecology Group standard procedures, with explanations as to the specific applications involved in the environmental studies. Where there is more than one document concerning a procedure, it has been included to indicate special studies or applications perhaps more complex than the routine standard sampling procedures utilized.

  15. Nevada Applied Ecology Group procedures handbook for environmental transuranics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, M.G.; Dunaway, P.B.

    The activities of the Nevada Applied Ecology Group (NAEG) integrated research studies of environmental plutonium and other transuranics at the Nevada Test Site have required many standardized field and laboratory procedures. These include sampling techniques, collection and preparation, radiochemical and wet chemistry analysis, data bank storage and reporting, and statistical considerations for environmental samples of soil, vegetation, resuspended particles, animals, and other biological material. This document, printed in two volumes, includes most of the Nevada Applied Ecology Group standard procedures, with explanations as to the specific applications involved in the environmental studies. Where there is more than one document concerning a procedure, it has been included to indicate special studies or applications more complex than the routine standard sampling procedures utilized.

  16. NHEXAS PHASE I MARYLAND STUDY--STANDARD OPERATING PROCEDURE FOR EXPLORATORY DATA ANALYSIS AND SUMMARY STATISTICS (D05)

    EPA Science Inventory

    This SOP describes the methods and procedures for two types of QA procedures: spot checks of hand entered data, and QA procedures for co-located and split samples. The spot checks were used to determine whether the error rate goal for the input of hand entered data was being att...

  17. The Incoming Statistical Knowledge of Undergraduate Majors in a Department of Mathematics and Statistics

    ERIC Educational Resources Information Center

    Cook, Samuel A.; Fukawa-Connelly, Timothy

    2016-01-01

    Studies have shown that at the end of an introductory statistics course, students struggle with building-block concepts, such as mean and standard deviation, and rely on procedural understandings of the concepts. This study aims to investigate the understandings of entering freshmen in a department of mathematics and statistics (including mathematics…

  18. 10 CFR Appendix D to Subpart T of... - Enforcement for Performance Standards; Compliance Determination Procedure for Certain Commercial...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Determination Procedure for Certain Commercial Equipment D Appendix D to Subpart T of Part 431 Energy DEPARTMENT... EQUIPMENT Certification and Enforcement Pt. 431, Subpt. T, App. D Appendix D to Subpart T of Part 431... where EPS is the energy performance standard and t is a statistic based on a 97.5 percent, one-sided...

  19. Scaled test statistics and robust standard errors for non-normal data in covariance structure analysis: a Monte Carlo study.

    PubMed

    Chou, C P; Bentler, P M; Satorra, A

    1991-11-01

    Research studying robustness of maximum likelihood (ML) statistics in covariance structure analysis has concluded that test statistics and standard errors are biased under severe non-normality. An estimation procedure known as asymptotic distribution free (ADF), making no distributional assumption, has been suggested to avoid these biases. Corrections to the normal theory statistics to yield more adequate performance have also been proposed. This study compares the performance of a scaled test statistic and robust standard errors for two models under several non-normal conditions and also compares these with the results from ML and ADF methods. Both ML and ADF test statistics performed rather well in one model and considerably worse in the other. In general, the scaled test statistic seemed to behave better than the ML test statistic and the ADF statistic performed the worst. The robust and ADF standard errors yielded more appropriate estimates of sampling variability than the ML standard errors, which were usually downward biased, in both models under most of the non-normal conditions. ML test statistics and standard errors were found to be quite robust to the violation of the normality assumption when data had either symmetric and platykurtic distributions, or non-symmetric and zero kurtotic distributions.

  20. The role of ultrasound guidance in pediatric caudal block

    PubMed Central

    Erbüyün, Koray; Açıkgöz, Barış; Ok, Gülay; Yılmaz, Ömer; Temeltaş, Gökhan; Tekin, İdil; Tok, Demet

    2016-01-01

    Objectives: To compare the time interval of the procedure, possible complications, post-operative pain levels, additional analgesics, and nurse satisfaction in ultrasonography-guided and standard caudal block applications. Methods: This retrospective study was conducted in Celal Bayar University Hospital, Manisa, Turkey, between January and December 2014, and included 78 pediatric patients. Caudal block was applied in 2 different groups: one with ultrasound guidance, and the other using the standard method. Results: The time interval of the procedure was significantly shorter in the standard application group compared with the ultrasound-guided group (p=0.020). Wong-Baker FACES Pain Rating Scale values obtained at the 90th minute were statistically lower in the standard application group compared with the ultrasound-guided group (p=0.035). No statistically significant difference was found in the other parameters between the 2 groups. The shorter time interval of the procedure in the standard application group should not be considered a distinctive mark by pediatric anesthesiologists, because this time difference was as short as seconds. Conclusion: Ultrasound guidance for caudal block applications would neither increase nor decrease the success of the treatment. However, ultrasound guidance may be needed in cases where the detection of sacral anatomy is difficult, especially by palpation. PMID:26837396

  1. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR SAMPLING WEIGHT CALCULATION (IIT-A-9.0)

    EPA Science Inventory

    The purpose of this SOP is to describe the procedures undertaken to calculate sampling weights. The sampling weights are needed to obtain weighted statistics of the NHEXAS data. This SOP uses data that have been properly coded and certified with appropriate QA/QC procedures by t...

  2. 10 CFR 431.329 - Enforcement.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Metal Halide Lamp Ballasts and Fixtures Energy Conservation Standards § 431.329 Enforcement. Process for Metal Halide Lamp Ballasts. This section sets forth procedures DOE will follow in pursuing alleged... with the following statistical sampling procedures for metal halide lamp ballasts, with the methods...

  3. Decision Support Systems: Applications in Statistics and Hypothesis Testing.

    ERIC Educational Resources Information Center

    Olsen, Christopher R.; Bozeman, William C.

    1988-01-01

    Discussion of the selection of appropriate statistical procedures by educators highlights a study conducted to investigate the effectiveness of decision aids in facilitating the use of appropriate statistics. Experimental groups and a control group using a printed flow chart, a computer-based decision aid, and a standard text are described. (11…

  4. Standard Errors and Confidence Intervals of Norm Statistics for Educational and Psychological Tests.

    PubMed

    Oosterhuis, Hannah E M; van der Ark, L Andries; Sijtsma, Klaas

    2016-11-14

    Norm statistics allow for the interpretation of scores on psychological and educational tests, by relating the test score of an individual test taker to the test scores of individuals belonging to the same gender, age, or education groups, et cetera. Given the uncertainty due to sampling error, one would expect researchers to report standard errors for norm statistics. In practice, standard errors are seldom reported; they are either unavailable or derived under strong distributional assumptions that may not be realistic for test scores. We derived standard errors for four norm statistics (standard deviation, percentile ranks, stanine boundaries and Z-scores) under the mild assumption that the test scores are multinomially distributed. A simulation study showed that the standard errors were unbiased and that corresponding Wald-based confidence intervals had good coverage. Finally, we discuss the possibilities for applying the standard errors in practical test use in education and psychology. The procedure is provided via the R function check.norms, which is available in the mokken package.
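
The four norm statistics treated in this record are straightforward to compute as point estimates; the sketch below uses a hypothetical norm group and one common percentile-rank convention (the paper's multinomial standard errors, and the R function check.norms it mentions, are not reproduced here):

```python
import statistics

def z_scores(scores):
    """Z-scores of each test score relative to the norm group."""
    m, s = statistics.mean(scores), statistics.stdev(scores)
    return [(x - m) / s for x in scores]

def percentile_rank(scores, x):
    """Percentage of the norm group scoring below x, plus half of those
    scoring exactly x (a common convention; definitions vary)."""
    below = sum(1 for s in scores if s < x)
    equal = sum(1 for s in scores if s == x)
    return 100.0 * (below + 0.5 * equal) / len(scores)

def stanine(pr):
    """Map a percentile rank to a stanine using the conventional
    4-7-12-17-20-17-12-7-4 percent bands."""
    bounds = [4, 11, 23, 40, 60, 77, 89, 96]
    for cat, b in enumerate(bounds, start=1):
        if pr <= b:
            return cat
    return 9

norms = [12, 15, 15, 18, 20, 21, 21, 22, 24, 27]   # hypothetical norm scores
pr = percentile_rank(norms, 21)
st = stanine(pr)
z = z_scores(norms)
```

The paper's contribution is precisely the sampling uncertainty around such point estimates, so in practice each of these numbers should be reported with its standard error.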

  5. 42 CFR 493.1256 - Standard: Control procedures.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... for having control procedures that monitor the accuracy and precision of the complete analytic process..., include two control materials, including one that is capable of detecting errors in the extraction process... control materials having previously determined statistical parameters. (e) For reagent, media, and supply...

  6. 40 CFR 610.10 - Program purpose.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... DEVICES Test Procedures and Evaluation Criteria General Provisions § 610.10 Program purpose. (a) The... standardized procedures, the performance of various retrofit devices applicable to automobiles for which fuel... statistical analysis of data from vehicle tests, the evaluation program will determine the effects on fuel...

  7. 40 CFR 610.10 - Program purpose.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... DEVICES Test Procedures and Evaluation Criteria General Provisions § 610.10 Program purpose. (a) The... standardized procedures, the performance of various retrofit devices applicable to automobiles for which fuel... statistical analysis of data from vehicle tests, the evaluation program will determine the effects on fuel...

  8. 40 CFR 610.10 - Program purpose.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... DEVICES Test Procedures and Evaluation Criteria General Provisions § 610.10 Program purpose. (a) The... standardized procedures, the performance of various retrofit devices applicable to automobiles for which fuel... statistical analysis of data from vehicle tests, the evaluation program will determine the effects on fuel...

  9. 40 CFR 610.10 - Program purpose.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... DEVICES Test Procedures and Evaluation Criteria General Provisions § 610.10 Program purpose. (a) The... standardized procedures, the performance of various retrofit devices applicable to automobiles for which fuel... statistical analysis of data from vehicle tests, the evaluation program will determine the effects on fuel...

  10. Statistics in the pharmacy literature.

    PubMed

    Lee, Charlene M; Soin, Herpreet K; Einarson, Thomas R

    2004-09-01

    Research in statistical methods is essential for maintenance of high quality of the published literature. To update previous reports of the types and frequencies of statistical terms and procedures in research studies of selected professional pharmacy journals. We obtained all research articles published in 2001 in 6 journals: American Journal of Health-System Pharmacy, The Annals of Pharmacotherapy, Canadian Journal of Hospital Pharmacy, Formulary, Hospital Pharmacy, and Journal of the American Pharmaceutical Association. Two independent reviewers identified and recorded descriptive and inferential statistical terms/procedures found in the methods, results, and discussion sections of each article. Results were determined by tallying the total number of times, as well as the percentage, that each statistical term or procedure appeared in the articles. One hundred forty-four articles were included. Ninety-eight percent employed descriptive statistics; of these, 28% used only descriptive statistics. The most common descriptive statistical terms were percentage (90%), mean (74%), standard deviation (58%), and range (46%). Sixty-nine percent of the articles used inferential statistics, the most frequent being chi-square (33%), Student's t-test (26%), Pearson's correlation coefficient r (18%), ANOVA (14%), and logistic regression (11%). Statistical terms and procedures were found in nearly all of the research articles published in pharmacy journals. Thus, pharmacy education should aim to provide current and future pharmacists with an understanding of the common statistical terms and procedures identified to facilitate the appropriate appraisal and consequential utilization of the information available in research articles.

  11. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR SAMPLING WEIGHT CALCULATION (IIT-A-9.0)

    EPA Science Inventory

    The purpose of this SOP is to describe the procedures undertaken to calculate sampling weights. The sampling weights are needed to obtain weighted statistics of the study data. This SOP uses data that have been properly coded and certified with appropriate QA/QC procedures by th...

  12. Evaluation of noise pollution level in the operating rooms of hospitals: A study in Iran.

    PubMed

    Giv, Masoumeh Dorri; Sani, Karim Ghazikhanlou; Alizadeh, Majid; Valinejadi, Ali; Majdabadi, Hesamedin Askari

    2017-06-01

    Noise pollution in the operating rooms is one of the remaining challenges. Both patients and physicians are exposed to different sound levels during operative cases, many of which can last for hours. This study aims to evaluate the noise pollution in the operating rooms during different surgical procedures. In this cross-sectional study, the sound level in the operating rooms of Hamadan University-affiliated hospitals (10 in total) in Iran during different surgical procedures was measured using a B&K sound meter. The gathered data were compared with national and international standards. Statistical analysis was performed using descriptive statistics and one-way ANOVA, t-test, and Pearson's correlation test. The noise pollution level for the majority of surgical procedures is higher than national and international documented standards. The highest level of noise pollution is related to orthopedic procedures, and the lowest to laparoscopic and heart surgery procedures. The highest and lowest registered sound levels during the operation were 93 and 55 dB, respectively. Sound levels generated by equipment (69 ± 4.1 dB), trolley movement (66 ± 2.3 dB), and personnel conversations (64 ± 3.9 dB) are the main sources of noise. The noise pollution of operating rooms is higher than available standards. The procedure needs to be corrected to achieve proper conditions.

  13. The use of analysis of variance procedures in biological studies

    USGS Publications Warehouse

    Williams, B.K.

    1987-01-01

    The analysis of variance (ANOVA) is widely used in biological studies, yet there remains considerable confusion among researchers about the interpretation of hypotheses being tested. Ambiguities arise when statistical designs are unbalanced, and in particular when not all combinations of design factors are represented in the data. This paper clarifies the relationship among hypothesis testing, statistical modelling and computing procedures in ANOVA for unbalanced data. A simple two-factor fixed effects design is used to illustrate three common parametrizations for ANOVA models, and some associations among these parametrizations are developed. Biologically meaningful hypotheses for main effects and interactions are given in terms of each parametrization, and procedures for testing the hypotheses are described. The standard statistical computing procedures in ANOVA are given along with their corresponding hypotheses. Throughout the development unbalanced designs are assumed and attention is given to problems that arise with missing cells.
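
The ambiguity described here can be seen numerically: in an unbalanced layout, the "weighted" marginal mean (averaging all observations at a factor level) and the "unweighted" one (averaging the cell means) disagree, which is why different ANOVA parametrizations test different main-effect hypotheses. A small demonstration with invented data:

```python
import statistics
from collections import defaultdict

# Hypothetical unbalanced two-factor layout: (A level, B level, response).
data = [("a1", "b1", 10), ("a1", "b1", 10), ("a1", "b2", 20),
        ("a2", "b1", 10), ("a2", "b2", 20), ("a2", "b2", 20), ("a2", "b2", 20)]

cells = defaultdict(list)
by_a = defaultdict(list)
for a, b, y in data:
    cells[(a, b)].append(y)
    by_a[a].append(y)

# Weighted marginal means: average all observations at each A level.
weighted = {a: statistics.mean(ys) for a, ys in by_a.items()}

# Unweighted marginal means: average the cell means at each A level.
unweighted = {a: statistics.mean(
                  statistics.mean(cells[(a, b)]) for b in ("b1", "b2"))
              for a in ("a1", "a2")}
```

Here the unweighted marginals are identical (no A effect at the cell-mean level), yet the weighted marginals differ purely because of the unequal cell counts, so a procedure testing weighted means would report an apparent A effect.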

  14. Cost Finding Principles and Procedures. Preliminary Field Review Edition. Technical Report 26.

    ERIC Educational Resources Information Center

    Ziemer, Gordon; And Others

    This report is part of the Larger Cost Finding Principles Project designed to develop a uniform set of standards, definitions, and alternative procedures that will use accounting and statistical data to find the full cost of resources utilized in the process of producing institutional outputs. This technical report describes preliminary procedures…

  15. Standard Setting as Psychometric Due Process: Going a Little Further Down an Uncertain Road.

    ERIC Educational Resources Information Center

    Cizek, Gregory J.

    The concept of due process provides an analogy for the process of standard setting that emphasizes many of the procedural and substantive elements of the process over technical and statistical concerns. Surely such concerns can and should continue to be addressed. However, a sound rationale for standard setting does not rest on this foundation.…

  16. Standard Setting in a Small Scale OSCE: A Comparison of the Modified Borderline-Group Method and the Borderline Regression Method

    ERIC Educational Resources Information Center

    Wood, Timothy J.; Humphrey-Murto, Susan M.; Norman, Geoffrey R.

    2006-01-01

    When setting standards, administrators of small-scale OSCEs often face several challenges, including a lack of resources, a lack of available expertise in statistics, and difficulty in recruiting judges. The Modified Borderline-Group Method is a standard setting procedure that compensates for these challenges by using physician examiners and is…

  17. Student Distractor Choices on the Mathematics Virginia Standards of Learning Middle School Assessments

    ERIC Educational Resources Information Center

    Lewis, Virginia Vimpeny

    2011-01-01

    …Number Concepts; Measurement; Geometry; Probability; Statistics; and Patterns, Functions and Algebra. Procedural errors were further categorized into the following content categories: Computation; Measurement; Statistics; and Patterns, Functions, and Algebra. The results of the analysis showed the main sources of error for 6th, 7th, and 8th…

  18. The breaking load method - Results and statistical modification from the ASTM interlaboratory test program

    NASA Technical Reports Server (NTRS)

    Colvin, E. L.; Emptage, M. R.

    1992-01-01

    The breaking load test provides quantitative stress corrosion cracking data by determining the residual strength of tension specimens that have been exposed to corrosive environments. Eight laboratories have participated in a cooperative test program under the auspices of ASTM Committee G-1 to evaluate the new test method. All eight laboratories were able to distinguish between three tempers of aluminum alloy 7075. The statistical analysis procedures that were used in the test program do not work well in all situations. An alternative procedure using Box-Cox transformations shows a great deal of promise. An ASTM standard method has been drafted which incorporates the Box-Cox procedure.
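
A Box-Cox transformation like the one the drafted ASTM method incorporates chooses the power lambda that maximizes a profile log-likelihood. The sketch below does this by grid search on simulated right-skewed "breaking load" data; the data and the search range are illustrative assumptions, not the interlaboratory results:

```python
import math
import random

def boxcox_loglik(y, lam):
    """Profile log-likelihood of the Box-Cox transform at lambda,
    with the maximum-likelihood variance plugged in (constants dropped)."""
    n = len(y)
    if abs(lam) < 1e-12:
        t = [math.log(v) for v in y]
    else:
        t = [(v ** lam - 1.0) / lam for v in y]
    m = sum(t) / n
    var = sum((v - m) ** 2 for v in t) / n
    return -0.5 * n * math.log(var) + (lam - 1.0) * sum(math.log(v) for v in y)

def boxcox_lambda(y, lo=-2.0, hi=2.0, steps=401):
    """Grid search for the lambda maximizing the Box-Cox log-likelihood."""
    best_lam, best_ll = lo, float("-inf")
    for k in range(steps):
        lam = lo + (hi - lo) * k / (steps - 1)
        ll = boxcox_loglik(y, lam)
        if ll > best_ll:
            best_lam, best_ll = lam, ll
    return best_lam

# Right-skewed hypothetical breaking loads; a log-like transform should fit.
random.seed(7)
loads = [math.exp(random.gauss(3.0, 0.4)) for _ in range(150)]
lam_hat = boxcox_lambda(loads)
```

Since the simulated loads are lognormal, the selected lambda lands near zero, i.e. close to a log transform; residual-strength data with a different shape would select a different power.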

  19. Optimizing ACS NSQIP modeling for evaluation of surgical quality and risk: patient risk adjustment, procedure mix adjustment, shrinkage adjustment, and surgical focus.

    PubMed

    Cohen, Mark E; Ko, Clifford Y; Bilimoria, Karl Y; Zhou, Lynn; Huffman, Kristopher; Wang, Xue; Liu, Yaoming; Kraemer, Kari; Meng, Xiangju; Merkow, Ryan; Chow, Warren; Matel, Brian; Richards, Karen; Hart, Amy J; Dimick, Justin B; Hall, Bruce L

    2013-08-01

    The American College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP) collects detailed clinical data from participating hospitals using standardized data definitions, analyzes these data, and provides participating hospitals with reports that permit risk-adjusted comparisons with a surgical quality standard. Since its inception, the ACS NSQIP has worked to refine surgical outcomes measurements and enhance statistical methods to improve the reliability and validity of this hospital profiling. From an original focus on controlling for between-hospital differences in patient risk factors with logistic regression, ACS NSQIP has added a variable to better adjust for the complexity and risk profile of surgical procedures (procedure mix adjustment) and stabilized estimates derived from small samples by using a hierarchical model with shrinkage adjustment. New models have been developed focusing on specific surgical procedures (eg, "Procedure Targeted" models), which provide opportunities to incorporate indication and other procedure-specific variables and outcomes to improve risk adjustment. In addition, comparative benchmark reports given to participating hospitals have been expanded considerably to allow more detailed evaluations of performance. Finally, procedures have been developed to estimate surgical risk for individual patients. This article describes the development of, and justification for, these new statistical methods and reporting strategies in ACS NSQIP. Copyright © 2013 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
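
The shrinkage adjustment mentioned above can be illustrated with a deliberately simplified stand-in for the hierarchical model: pull each hospital's observed rate toward the overall rate with weight n/(n+k), so low-volume hospitals are shrunk hardest. The hospitals and the prior-strength constant k below are hypothetical:

```python
def shrink_rates(events, cases, k=50.0):
    """Shrink each hospital's observed event rate toward the overall rate,
    with weight n / (n + k); small hospitals are pulled in more strongly.
    A simplified empirical-Bayes-style stand-in, not the ACS NSQIP model."""
    overall = sum(events) / sum(cases)
    adjusted = []
    for e, n in zip(events, cases):
        w = n / (n + k)
        adjusted.append(w * (e / n) + (1.0 - w) * overall)
    return adjusted

# Hypothetical hospitals: (complications, cases).
events = [3, 30, 6]
cases = [10, 300, 120]
adjusted = shrink_rates(events, cases)
```

The 3/10 hospital's alarming 30% rate is pulled most of the way back toward the overall rate, reflecting how little evidence ten cases provide, while the 300-case hospital's estimate barely moves.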

  20. How Does One Assess the Accuracy of Academic Success Predictors? ROC Analysis Applied to University Entrance Factors

    ERIC Educational Resources Information Center

    Vivo, Juana-Maria; Franco, Manuel

    2008-01-01

    This article attempts to present a novel application of a method of measuring accuracy for academic success predictors that could be used as a standard. This procedure is known as the receiver operating characteristic (ROC) curve, which comes from statistical decision techniques. The statistical prediction techniques provide predictor models and…
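
    As a sketch of the ROC idea with invented entrance scores (not data from the article), the AUC can be computed directly as a pairwise comparison probability:

```python
# A minimal sketch of the ROC idea with invented entrance scores: the AUC
# equals the probability that a randomly chosen later-successful student
# scores above a randomly chosen unsuccessful one (ties count one half).

def roc_auc(pos_scores, neg_scores):
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

passed = [7.1, 6.4, 8.0, 5.9]    # hypothetical entrance scores, later success
failed = [5.0, 6.0, 4.8]         # hypothetical entrance scores, later failure

print(roc_auc(passed, failed))   # 1.0 = perfect predictor, 0.5 = chance
```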

  1. Standards for reporting fish toxicity tests

    USGS Publications Warehouse

    Cope, O.B.

    1961-01-01

    The growing impetus of studies on fish and pesticides focuses attention on the need for standardized reporting procedures. Good methods have been developed for laboratory and field procedures in testing programs and for the statistical features of assay experiments, and improvements are being made in methods of collecting and preserving fish, invertebrates, and other materials exposed to economic poisons. On the other hand, the reporting of toxicity data in a complete manner has lagged behind, and today's literature is little improved over yesterday's with regard to completeness and ease of interpretation.

  2. Effect of Flexible Duty Hour Policies on Length of Stay for Complex Intra-Abdominal Operations: A Flexibility in Duty Hour Requirements for Surgical Trainees (FIRST) Trial Analysis.

    PubMed

    Stulberg, Jonah J; Pavey, Emily S; Cohen, Mark E; Ko, Clifford Y; Hoyt, David B; Bilimoria, Karl Y

    2017-02-01

    Changes to resident duty hour policies in the Flexibility in Duty Hour Requirements for Surgical Trainees (FIRST) trial could impact hospitalized patients' length of stay (LOS) by altering care coordination. Length of stay can also serve as a reflection of all complications, particularly those not captured in the FIRST trial (eg, pneumothorax from central line placement). Programs were randomized to either maintaining current ACGME duty hour policies (Standard arm) or more flexible policies waiving rules on maximum shift lengths and time off between shifts (Flexible arm). Our objective was to determine whether flexibility in resident duty hours affected LOS in patients undergoing high-risk surgical operations. Patients were identified who underwent hepatectomy, pancreatectomy, laparoscopic colectomy, open colectomy, or ventral hernia repair (2014-2015 academic year) at 154 hospitals participating in the FIRST trial. Two procedure-stratified evaluations of LOS were undertaken: a multivariable negative binomial regression analysis on LOS and a multivariable logistic regression analysis on the likelihood of a prolonged LOS (>75th percentile). Before any adjustments, there was no statistically significant difference in overall mean LOS between study arms (Flexible Policy: mean [SD] LOS 6.03 [5.78] days vs Standard Policy: mean LOS 6.21 [5.82] days; p = 0.74). In adjusted analyses, there was no statistically significant difference in LOS between study arms overall (incidence rate ratio for Flexible vs Standard: 0.982; 95% CI, 0.939-1.026; p = 0.41) or for any individual procedure. In addition, there was no statistically significant difference in the proportion of patients with prolonged LOS between study arms overall (Flexible vs Standard: odds ratio = 1.028; 95% CI, 0.871-1.212) or for any individual procedure. Duty hour flexibility had no statistically significant effect on LOS in patients undergoing complex intra-abdominal operations.
Copyright © 2016 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  3. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data.

    PubMed

    Tekwe, Carmen D; Carroll, Raymond J; Dabney, Alan R

    2012-08-01

    Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed, and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques, including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and parametric survival and accelerated failure time (AFT) models with log-normal, log-logistic and Weibull distributions, were used to detect differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening as the proportion of missing data increases. The testing procedures discussed in this article can all be performed using readily available software such as R. The R codes are provided as supplemental materials. ctekwe@stat.tamu.edu.
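
    The censoring transformation at the heart of this approach can be sketched in a few lines: negating log-intensities turns values below a detection limit into right-censored observations, to which standard survival estimators apply. The data, detection limit, and Kaplan-Meier estimator below are illustrative assumptions, not the paper's AFT models:

```python
import math

# Sketch of the censoring transformation (assumed data, not the paper's):
# negating log-intensities turns left-censoring at a detection limit into
# right-censoring, so standard survival estimators, here Kaplan-Meier, apply.

def kaplan_meier(times, events):
    """Return (time, survival) steps for right-censored data."""
    pairs = sorted(zip(times, events))
    at_risk = len(pairs)
    surv = 1.0
    curve = []
    for t, event in pairs:
        if event:                          # an observed event at time t
            surv *= (at_risk - 1) / at_risk
            curve.append((t, surv))
        at_risk -= 1                       # censored points only leave the risk set
    return curve

limit = 100.0                              # hypothetical detection limit
intensities = [250.0, 180.0, None, 400.0, None, 120.0]   # None = below limit

# Low abundance (left-censored) becomes a large, right-censored "time":
times = [-math.log(x if x is not None else limit) for x in intensities]
events = [x is not None for x in intensities]

for t, s in kaplan_meier(times, events):
    print(round(t, 3), round(s, 3))
```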

  4. [The development of hospital medical supplies information management system].

    PubMed

    Cao, Shaoping; Gu, Hongqing; Zhang, Peng; Wang, Qiang

    2010-05-01

    To manage information on medical materials with computer technology, in order to improve the efficiency of medical supplies consumption and to develop a new technical approach to hospital materials management and logistics support. A hospital material management information system was developed using C#.NET and Java, with individual management modules, generation of various statistical reports, and standard operating procedures. The system is convenient, fully featured, and robust, with fluent statistical functions. It allows staff to grasp, at any time, dynamic information on supplies across the whole hospital, serving as a modern and effective tool for hospital materials management.

  5. Quantitative Comparison of Three Standardization Methods Using a One-Way ANOVA for Multiple Mean Comparisons

    ERIC Educational Resources Information Center

    Barrows, Russell D.

    2007-01-01

    A one-way ANOVA experiment is performed to determine whether or not the three standardization methods are statistically different in determining the concentration of the three paraffin analytes. The laboratory exercise asks students to combine the three methods in a single analytical procedure of their own design to determine the concentration of…
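
    The comparison described above can be sketched with a hand-computed one-way ANOVA. The method names and concentrations below are invented for illustration, not taken from the exercise:

```python
from statistics import mean

# A hand-computed one-way ANOVA sketch with invented paraffin-analyte
# concentrations (mg/L) from three hypothetical standardization methods.

def one_way_anova(groups):
    """Return (F, df_between, df_within) for a list of samples."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = mean(x for g in groups for x in g)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    f = (ss_between / (k - 1)) / (ss_within / (n - k))
    return f, k - 1, n - k

external = [10.1, 10.4, 9.9, 10.2]    # external-standard method
internal = [10.3, 10.6, 10.2, 10.5]   # internal-standard method
addition = [10.0, 10.2, 9.8, 10.1]    # standard-addition method

f_stat, df1, df2 = one_way_anova([external, internal, addition])
print(round(f_stat, 2), df1, df2)     # compare against the F(df1, df2) critical value
```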

  6. Statistics and Ethics in Surgery and Anesthesia

    ERIC Educational Resources Information Center

    Gilbert, John P.; And Others

    1977-01-01

    Analyzes 46 medical research papers on the effects of innovative versus standard surgical procedures on the health of patients. Results reveal that innovations generally reduce complications. The ethics of experimental surgery are also discussed. (CP)

  7. A Hands-On Exercise Improves Understanding of the Standard Error of the Mean

    ERIC Educational Resources Information Center

    Ryan, Robert S.

    2006-01-01

    One of the most difficult concepts for statistics students is the standard error of the mean. To improve understanding of this concept, 1 group of students used a hands-on procedure to sample from small populations representing either a true or false null hypothesis. The distribution of 120 sample means (n = 3) from each population had standard…
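
    The classroom exercise can be simulated directly (the population and seed below are arbitrary): the standard deviation of many sample means should approach sigma divided by the square root of n:

```python
import random
from statistics import mean, pstdev, stdev

# Sketch of the hands-on exercise with an arbitrary small population:
# drawing many samples of n = 3 (with replacement) and taking the SD of
# the 120 sample means should approximate sigma / sqrt(n).

random.seed(1)
population = [2, 4, 4, 4, 5, 5, 7, 9]   # hypothetical small population
n = 3

means = [mean(random.choices(population, k=n)) for _ in range(120)]

sigma = pstdev(population)              # population SD (sigma = 2 here)
print(round(stdev(means), 3), "vs theoretical", round(sigma / n ** 0.5, 3))
```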

  8. Standard methods for sampling North American freshwater fishes

    USGS Publications Warehouse

    Bonar, Scott A.; Hubert, Wayne A.; Willis, David W.

    2009-01-01

    This important reference book provides standard sampling methods recommended by the American Fisheries Society for assessing and monitoring freshwater fish populations in North America. Methods apply to ponds, reservoirs, natural lakes, and streams and rivers containing coldwater and warmwater fishes. Range-wide and eco-regional averages for indices of abundance, population structure, and condition for individual species are supplied to facilitate comparisons of standard data among populations. Provides information on converting nonstandard to standard data, statistical and database procedures for analyzing and storing standard data, and methods to prevent transfer of invasive species while sampling.

  9. Counting Microfiche: The Utilization of the Microform Section of the ANSI Standard Z39.7-1983 "Library Statistics"; Microfiche Curl; and "Poly" or "Cell"?

    ERIC Educational Resources Information Center

    Caldwell-Wood, Naomi; And Others

    1987-01-01

    The first of three articles describes procedures for using ANSI statistical methods for estimating the number of pieces in large homogeneous collections of microfiche. The second discusses causes of curl, its control, and measurement, and the third compares the advantages and disadvantages of cellulose acetate and polyester base for microforms.…

  10. Nonclassical light revealed by the joint statistics of simultaneous measurements.

    PubMed

    Luis, Alfredo

    2016-04-15

    Nonclassicality cannot be a single-observable property, since the statistics of any single quantum observable are compatible with classical physics. We develop a general procedure to reveal nonclassical behavior of light states from the joint statistics arising in the practical measurement of multiple observables. Besides embracing previous approaches, this protocol can disclose nonclassical features for standard examples of classical-like behavior, such as SU(2) and Glauber coherent states. When combined with other criteria, this would imply that every light state is nonclassical.

  11. Enforcing Job Safety: A Managerial View

    ERIC Educational Resources Information Center

    Barnako, Frank R.

    1975-01-01

    The views of management or of employees regarding enforcement of the job safety law range from general satisfaction to calls for repeal of the act. The complexity of standards, statistics and recordkeeping, and enforcement procedures are major areas of concern. (MW)

  12. Statistical and procedural issues in the use of heated taxidermic mounts

    USGS Publications Warehouse

    Bakken, G.S.; Kenow, K.P.; Korschgen, C.E.; Boysen, A.F.

    2000-01-01

    Studies using mounts have an inherently nested error structure; calibration and standardization should use the appropriate procedures and statistics. One example is that individual mount differences are nested within morphological factors related to species, age, or gender; without replication, mount differences may be confused with differences due to morphology. Also, the sensitivity of mounts to orientation to wind or sun is nested within mount; without replication, inadvertent variation in mount positioning may be confused with differences among mounts. Data on heat loss from a 1-day-old mallard duckling mount are used to illustrate orientation sensitivity. (C) 2000 Elsevier Science Ltd. All rights reserved.

  13. Validating Coherence Measurements Using Aligned and Unaligned Coherence Functions

    NASA Technical Reports Server (NTRS)

    Miles, Jeffrey Hilton

    2006-01-01

    This paper describes a novel approach based on coherence functions and statistical theory for sensor validation in a harsh environment. Using aligned and unaligned coherence functions together with statistical theory, one can test for sensor degradation, total sensor failure, or changes in the signal. This advanced diagnostic approach and the novel data processing methodology discussed provide a single number that conveys this information. This number, as calculated with standard statistical procedures for comparing the means of two distributions, is compared with results obtained using Yuen's robust statistical method to create confidence intervals. Examination of experimental data from Kulite pressure transducers mounted in a Pratt & Whitney PW4098 combustor, using spectrum analysis methods on aligned and unaligned time histories, has verified the effectiveness of the proposed method. All the procedures produce good results, demonstrating the robustness of the technique.
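
    Yuen's method referred to above compares trimmed means using a winsorized variance; a minimal sketch with invented coherence values (20% trimming; not the paper's Kulite data):

```python
from statistics import mean, variance

# Sketch of Yuen's robust comparison of two means (20% trimming) with
# invented coherence values, not the paper's Kulite data.

def trimmed_mean(xs, prop=0.2):
    xs = sorted(xs)
    g = int(prop * len(xs))
    return mean(xs[g:len(xs) - g])

def winsorized_variance(xs, prop=0.2):
    xs = sorted(xs)
    g = int(prop * len(xs))
    w = [xs[g]] * g + xs[g:len(xs) - g] + [xs[-g - 1]] * g
    return variance(w)

def yuen_t(x, y, prop=0.2):
    """Yuen's t statistic for two independent samples."""
    hx = len(x) - 2 * int(prop * len(x))    # effective sizes after trimming
    hy = len(y) - 2 * int(prop * len(y))
    dx = (len(x) - 1) * winsorized_variance(x, prop) / (hx * (hx - 1))
    dy = (len(y) - 1) * winsorized_variance(y, prop) / (hy * (hy - 1))
    return (trimmed_mean(x, prop) - trimmed_mean(y, prop)) / (dx + dy) ** 0.5

aligned = [0.91, 0.88, 0.95, 0.90, 0.93, 0.89, 0.92, 0.99, 0.87, 0.94]
unaligned = [0.12, 0.15, 0.09, 0.11, 0.14, 0.10, 0.13, 0.45, 0.08, 0.12]

# A large |t| indicates the two coherence distributions differ, even with
# the outlying values (0.99 and 0.45) present.
print(round(yuen_t(aligned, unaligned), 1))
```

    Trimming makes the statistic insensitive to the outliers that would inflate an ordinary two-sample t-test's variance estimate.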

  14. A comparative study on assessment procedures and metric properties of two scoring systems of the Coma Recovery Scale-Revised items: standard and modified scores.

    PubMed

    Sattin, Davide; Lovaglio, Piergiorgio; Brenna, Greta; Covelli, Venusia; Rossi Sebastiano, Davide; Duran, Dunja; Minati, Ludovico; Giovannetti, Ambra Mara; Rosazza, Cristina; Bersano, Anna; Nigri, Anna; Ferraro, Stefania; Leonardi, Matilde

    2017-09-01

    The study compared the metric characteristics (discriminant capacity and factorial structure) of two different methods for scoring the items of the Coma Recovery Scale-Revised and it analysed scale scores collected using the standard assessment procedure and a new proposed method. Cross sectional design/methodological study. Inpatient, neurological unit. A total of 153 patients with disorders of consciousness were consecutively enrolled between 2011 and 2013. All patients were assessed with the Coma Recovery Scale-Revised using standard (rater 1) and inverted (rater 2) procedures. Coma Recovery Scale-Revised score, number of cognitive and reflex behaviours and diagnosis. Regarding patient assessment, rater 1 using standard and rater 2 using inverted procedures obtained the same best scores for each subscale of the Coma Recovery Scale-Revised for all patients, so no clinical (and statistical) difference was found between the two procedures. In 11 patients (7.7%), rater 2 noted that some Coma Recovery Scale-Revised codified behavioural responses were not found during assessment, although higher response categories were present. A total of 51 (36%) patients presented the same Coma Recovery Scale-Revised scores of 7 or 8 using a standard score, whereas no overlap was found using the modified score. Unidimensionality was confirmed for both score systems. The Coma Recovery Scale Modified Score showed a higher discriminant capacity than the standard score and a monofactorial structure was also supported. The inverted assessment procedure could be a useful evaluation method for the assessment of patients with disorder of consciousness diagnosis.

  15. Effect of music therapy with emotional-approach coping on preprocedural anxiety in cardiac catheterization: a randomized controlled trial.

    PubMed

    Ghetti, Claire M

    2013-01-01

    Individuals undergoing cardiac catheterization are likely to experience elevated anxiety periprocedurally, with highest anxiety levels occurring immediately prior to the procedure. Elevated anxiety has the potential to negatively impact these individuals psychologically and physiologically in ways that may influence the subsequent procedure. This study evaluated the use of music therapy, with a specific emphasis on emotional-approach coping, immediately prior to cardiac catheterization to impact periprocedural outcomes. The randomized, pretest/posttest control group design consisted of two experimental groups--the Music Therapy with Emotional-Approach Coping group [MT/EAC] (n = 13), and a talk-based Emotional-Approach Coping group (n = 14), compared with a standard care Control group (n = 10). MT/EAC led to improved positive affective states in adults awaiting elective cardiac catheterization, whereas a talk-based emphasis on emotional-approach coping or standard care did not. All groups demonstrated a significant overall decrease in negative affect. The MT/EAC group demonstrated a statistically significant, but not clinically significant, increase in systolic blood pressure most likely due to active engagement in music making. The MT/EAC group trended toward shortest procedure length and least amount of anxiolytic required during the procedure, while the EAC group trended toward least amount of analgesic required during the procedure, but these differences were not statistically significant. Actively engaging in a session of music therapy with an emphasis on emotional-approach coping can improve the well-being of adults awaiting cardiac catheterization procedures.

  16. Statistical analysis of radioimmunoassay. In comparison with bioassay (in Japanese)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakano, R.

    1973-01-01

    Using radioimmunoassay (RIA) data, statistical procedures were described for two problems: linearization of the dose-response curve and calculation of relative potency. Three methods are available for linearizing the RIA dose-response curve; in each, the following parameters are plotted on the horizontal and vertical axes: dose x, (B/T)⁻¹; c/(x + c), B/T (c: the dose at which B/T is 50%); log x, logit B/T. Of these, the last appears most practical. The statistical procedures of bioassay were employed to calculate the relative potency of unknown samples against standard samples from the dose-response curves of both, using the regression coefficient. It is desirable that relative potency be calculated from at least 5 points on the standard curve and at least 2 points for the unknown samples. To examine the statistical limits of precision of measurement, LH activity of gonadotropin in urine was measured, and the relative potency, precision coefficient, and upper and lower 95% confidence limits of relative potency were calculated. Bioassay (by the ovarian ascorbic acid reduction method and the anterior prostate lobe weighing method) was performed on the same samples, and its precision was compared with that of RIA. For RIA, the upper and lower 95% confidence limits of relative potency were close to each other, while for bioassay a considerable difference was observed between the upper and lower limits. The necessity of standardizing and systematizing the statistical procedures to increase the precision of RIA was pointed out. (JA)
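
    The logit-log linearization and relative-potency calculation described above can be sketched as follows. The B/T values and doses are invented; a real parallel-line assay would fit a common slope and attach confidence limits, as the abstract describes:

```python
import math

# Sketch of the logit-log linearization and relative-potency estimate with
# invented B/T values (a real parallel-line assay would fit a common slope
# and attach 95% confidence limits).

def linfit(xs, ys):
    """Least-squares fit; returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def logit(p):
    return math.log(p / (1 - p))

doses_std = [1, 2, 4, 8, 16]               # standard curve: 5 points
bt_std = [0.80, 0.65, 0.50, 0.35, 0.22]    # B/T at each dose
doses_unk = [2, 8]                         # unknown sample: 2 points
bt_unk = [0.72, 0.44]

a_s, b_s = linfit([math.log(d) for d in doses_std], [logit(p) for p in bt_std])
a_u, b_u = linfit([math.log(d) for d in doses_unk], [logit(p) for p in bt_unk])

# Horizontal shift between the (roughly parallel) lines on the log-dose axis:
log_potency = (a_u - a_s) / ((b_s + b_u) / 2)
print(round(math.exp(log_potency), 2))     # relative potency vs the standard
```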

  17. Procedural considerations for CPV outdoor power ratings per IEC 62670

    NASA Astrophysics Data System (ADS)

    Muller, Matthew; Kurtz, Sarah; Rodriguez, Jose

    2013-09-01

    The IEC Working Group 7 (WG7) is in the process of developing a draft procedure for an outdoor concentrating photovoltaic (CPV) module power rating at Concentrator Standard Operating Conditions (CSOC). WG7 recently achieved some consensus that using component reference cells to monitor/limit spectral variation is the preferred path for the outdoor power rating. To build on this consensus, the community must quantify these spectral limits and select a procedure for calculating and reporting a power rating. This work focuses on statistically comparing several procedures the community is considering in context with monitoring/limiting spectral variation.

  18. Kappa statistic for the clustered dichotomous responses from physicians and patients

    PubMed Central

    Kang, Chaeryon; Qaqish, Bahjat; Monaco, Jane; Sheridan, Stacey L.; Cai, Jianwen

    2013-01-01

    The bootstrap method for estimating the standard error of the kappa statistic in the presence of clustered data is evaluated. Such data arise, for example, in assessing agreement between physicians and their patients regarding their understanding of the physician-patient interaction and discussions. We propose a computationally efficient procedure for generating correlated dichotomous responses for physicians and assigned patients for simulation studies. The simulation results demonstrate that, with at least a moderately large number of clusters, the proposed bootstrap method produces a better estimate of the standard error and better coverage performance than the asymptotic standard error estimate that ignores dependence among patients within physicians. An example of an application to a coronary heart disease prevention study is presented. PMID:23533082
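
    The cluster bootstrap resamples physicians (clusters), not individual patients, so within-physician dependence is preserved. A minimal sketch with invented dichotomous ratings:

```python
import random

# Sketch of the cluster bootstrap for kappa with invented ratings:
# physicians (clusters) are resampled with replacement, not patients,
# so within-physician dependence is preserved.

def kappa(pairs):
    n = len(pairs)
    po = sum(a == b for a, b in pairs) / n           # observed agreement
    pa = sum(a for a, _ in pairs) / n                # P(rater A says 1)
    pb = sum(b for _, b in pairs) / n                # P(rater B says 1)
    pe = pa * pb + (1 - pa) * (1 - pb)               # chance agreement
    return (po - pe) / (1 - pe)

random.seed(7)
# Each cluster holds one physician's (physician, patient) dichotomous ratings.
clusters = [
    [(1, 1), (1, 0), (0, 0)],
    [(1, 1), (1, 1), (0, 0), (0, 1)],
    [(0, 0), (1, 1), (1, 1)],
    [(1, 0), (0, 0), (0, 0), (1, 1)],
    [(1, 1), (0, 1), (0, 0)],
]

boots = []
for _ in range(1000):
    sample = []
    for _ in clusters:                               # resample whole clusters
        sample.extend(random.choice(clusters))
    boots.append(kappa(sample))

m = sum(boots) / len(boots)
se = (sum((k - m) ** 2 for k in boots) / (len(boots) - 1)) ** 0.5
overall = kappa([p for c in clusters for p in c])
print(round(overall, 3), round(se, 3))
```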

  19. 9 CFR 439.1 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... considered a misidentification. (h) CUSUM—A class of statistical procedures for assessing whether or not a... that process. The initial CUSUM values for each laboratory whose application for accreditation is... samples. (y) Standardizing constant—A number that results from a mathematical adjustment to the...

  20. Hands-on 2.0: improving transfer of training via the Society of American Gastrointestinal and Endoscopic Surgeons (SAGES) Acquisition of Data for Outcomes and Procedure Transfer (ADOPT) program.

    PubMed

    Dort, Jonathan; Trickey, Amber; Paige, John; Schwarz, Erin; Dunkin, Brian

    2017-08-01

    Practicing surgeons commonly learn new procedures and techniques by attending a "hands-on" course, though trainings are often ineffective at promoting subsequent procedure adoption in practice. We describe implementation of a new program with the SAGES All Things Hernia Hands-On Course, Acquisition of Data for Outcomes and Procedure Transfer (ADOPT), which employs standardized, proven teaching techniques, and 1-year mentorship. Attendee confidence and procedure adoption are compared between standard and ADOPT programs. For the pilot ADOPT course implementation, a hands-on course focusing on abdominal wall hernia repair was chosen. ADOPT participants were recruited among enrollees for the standard Hands-On Hernia Course. Enrollment in ADOPT was capped at 10 participants and limited to a 2:1 student-to-faculty ratio, compared to the standard course 22 participants with a 4:1 student-to-faculty ratio. ADOPT mentors interacted with participants through webinars, phone conferences, and continuous email availability throughout the year. All participants were asked to provide pre- and post-course surveys inquiring about the number of targeted hernia procedures performed and related confidence level. Four of 10 ADOPT participants (40%) and six of 22 standard training participants (27%) returned questionnaires. Over the 3 months following the course, ADOPT participants performed more ventral hernia mesh insertion procedures than standard training participants (median 13 vs. 0.5, p = 0.010) and considerably more total combined procedures (median 26 vs. 7, p = 0.054). Compared to standard training, learners who participated in ADOPT reported greater confidence improvements in employing a components separation via an open approach (p = 0.051), and performing an open transversus abdominis release, though the difference did not achieve statistical significance (p = 0.14). 
These results suggest that the ADOPT program, with standardized and structured teaching, telementoring, and a longitudinal educational approach, is effective and leads to better transfer of learned skills and procedures to clinical practice.

  1. Mode-Stirred Method Implementation for HIRF Susceptibility Testing and Results Comparison with Anechoic Method

    NASA Technical Reports Server (NTRS)

    Nguyen, Truong X.; Ely, Jay J.; Koppen, Sandra V.

    2001-01-01

    This paper describes the implementation of the mode-stirred method for susceptibility testing according to the current DO-160D standard. Test results on an Engine Data Processor using the implemented procedure, and comparisons with the standard anechoic test results, are presented. The comparison shows experimentally that the susceptibility thresholds found with the mode-stirred method are consistently higher than those found with the anechoic method. This is consistent with the recent statistical analysis by NIST finding that the current calibration procedure overstates field strength by a fixed amount. Once the test results are adjusted for this value, the comparisons with the anechoic results are excellent. The results also show that the test method has excellent chamber-to-chamber repeatability. Several areas for improvement to the current procedure are also identified and implemented.

  2. Assessment of statistic analysis in non-radioisotopic local lymph node assay (non-RI-LLNA) with alpha-hexylcinnamic aldehyde as an example.

    PubMed

    Takeyoshi, Masahiro; Sawaki, Masakuni; Yamasaki, Kanji; Kimber, Ian

    2003-09-30

    The murine local lymph node assay (LLNA) is used for the identification of chemicals that have the potential to cause skin sensitization. However, it requires specific facilities and handling procedures to accommodate a radioisotopic (RI) endpoint. We have developed a non-radioisotopic (non-RI) endpoint for the LLNA, based on BrdU incorporation, to avoid the use of radioisotopes. Although this alternative method appears viable in principle, it is somewhat less sensitive than the standard assay. In this study, we report investigations into the use of statistical analysis to improve the sensitivity of a non-RI LLNA procedure, with alpha-hexylcinnamic aldehyde (HCA) examined in two separate experiments. The alternative non-RI method required HCA concentrations of greater than 25% to elicit a positive response based on the criterion for classification as a skin sensitizer in the standard LLNA. Nevertheless, dose responses to HCA in the alternative method were consistent in both experiments, and we examined whether an endpoint based upon the statistical significance of induced changes in LNC turnover, rather than a stimulation index (SI) of 3 or greater, might provide additional sensitivity. The results reported here demonstrate that, for HCA at least, significant responses were recorded in each of two experiments following exposure of mice to 25% HCA. These data suggest that this approach may be more satisfactory, at least when BrdU incorporation is measured. However, even when employing a statistical endpoint, this modification of the LLNA remains rather less sensitive than the standard method. Taken together, the data reported here suggest that a modified LLNA in which BrdU is used in place of radioisotope incorporation shows some promise, but that in its present form, even with the use of a statistical endpoint, it lacks some of the sensitivity of the standard method. The challenge is to develop strategies for further refinement of this approach.

  3. Tests for qualitative treatment-by-centre interaction using a 'pushback' procedure.

    PubMed

    Ciminera, J L; Heyse, J F; Nguyen, H H; Tukey, J W

    1993-06-15

    In multicentre clinical trials using a common protocol, the centres are usually regarded as being a fixed factor, thus allowing any treatment-by-centre interaction to be omitted from the error term for the effect of treatment. However, we feel it necessary to use the treatment-by-centre interaction as the error term if there is substantial evidence that the interaction with centres is qualitative instead of quantitative. To make allowance for the estimated uncertainties of the centre means, we propose choosing a reference value (for example, the median of the ordered array of centre means) and converting the individual centre results into standardized deviations from the reference value. The deviations are then reordered, and the results 'pushed back' by amounts appropriate for the corresponding order statistics in a sample from the relevant distribution. The pushed-back standardized deviations are then restored to the original scale. The appearance of opposite signs among the destandardized values for the various centres is then taken as 'substantial evidence' of qualitative interaction. Procedures are presented using, in any combination: (i) Gaussian, or Student's t-distribution; (ii) order-statistic medians or outward 90 per cent points of the corresponding order statistic distributions; (iii) pooling or grouping and pooling the internally estimated standard deviations of the centre means. The use of the least conservative combination--Student's t, outward 90 per cent points, grouping and pooling--is recommended.
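
    A loose sketch of the pushback idea follows; it is simplified from the published procedure, and the centre effects, standard errors, and Filliben-style order-statistic medians are illustrative assumptions:

```python
from statistics import NormalDist

# A loose sketch of the 'pushback' idea (simplified, not the published
# procedure): standardize centre effects against a reference value, order
# them, shrink each toward zero by a normal order-statistic amount, and
# restore the scale; surviving opposite signs suggest qualitative
# interaction.

def pushback(effects, ses):
    n = len(effects)
    ref = sorted(effects)[n // 2]          # reference value: a middle effect
    z = sorted(((e - ref) / s, s) for e, s in zip(effects, ses))
    nd = NormalDist()
    # Filliben-style order-statistic medians for a standard normal sample:
    m = [nd.inv_cdf((i + 1 - 0.3175) / (n + 0.365)) for i in range(n)]
    out = []
    for (zi, s), mi in zip(z, m):
        # push toward zero, but never past it
        pi = max(zi - mi, 0.0) if zi > 0 else min(zi - mi, 0.0)
        out.append(pi * s + ref)           # restore the original scale
    return out

# Hypothetical treatment effects (treatment minus control) and their
# standard errors at six centres; the first centre points the 'wrong' way.
effects = [-2.0, 1.5, 1.8, 2.2, 2.5, 3.0]
ses = [0.8, 0.7, 0.9, 0.8, 0.7, 0.9]

vals = pushback(effects, ses)
qualitative = min(vals) < 0 < max(vals)
print([round(v, 2) for v in vals], qualitative)
```

    Here the first centre's deviation survives the pushback with its sign reversed relative to the others, so the sketch flags a possible qualitative interaction.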

  4. Procedures for determination of detection limits: application to high-performance liquid chromatography analysis of fat-soluble vitamins in human serum.

    PubMed

    Browne, Richard W; Whitcomb, Brian W

    2010-07-01

    Problems in the analysis of laboratory data commonly arise in epidemiologic studies in which biomarkers subject to lower detection thresholds are used. Various thresholds exist including limit of detection (LOD), limit of quantification (LOQ), and limit of blank (LOB). Choosing appropriate strategies for dealing with data affected by such limits relies on proper understanding of the nature of the detection limit and its determination. In this paper, we demonstrate experimental and statistical procedures generally used for estimating different detection limits according to standard procedures in the context of analysis of fat-soluble vitamins and micronutrients in human serum. Fat-soluble vitamins and micronutrients were analyzed by high-performance liquid chromatography with diode array detection. A simulated serum matrix blank was repeatedly analyzed for determination of LOB parametrically by using the observed blank distribution as well as nonparametrically by using ranks. The LOD was determined by combining information regarding the LOB with data from repeated analysis of standard reference materials (SRMs), diluted to low levels; from LOB to 2-3 times LOB. The LOQ was determined experimentally by plotting the observed relative standard deviation (RSD) of SRM replicates compared with the concentration, where the LOQ is the concentration at an RSD of 20%. Experimental approaches and example statistical procedures are given for determination of LOB, LOD, and LOQ. These quantities are reported for each measured analyte. For many analyses, there is considerable information available below the LOQ. Epidemiologic studies must understand the nature of these detection limits and how they have been estimated for appropriate treatment of affected data.
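
    The parametric LOB/LOD estimates and the RSD-based LOQ criterion described above can be sketched with invented replicate data (units and values are arbitrary, not the serum measurements):

```python
from statistics import mean, stdev

# Sketch of the parametric LOB/LOD estimates and the RSD criterion for LOQ,
# using invented blank and low-level replicates (arbitrary units).

blanks = [0.2, 0.5, 0.1, 0.4, 0.3, 0.2, 0.6, 0.3, 0.4, 0.2]      # simulated matrix blank
low_level = [1.1, 1.4, 0.9, 1.3, 1.2, 1.0, 1.5, 1.1, 1.2, 1.3]   # diluted SRM replicates

z95 = 1.645                                 # one-sided 95th normal percentile
lob = mean(blanks) + z95 * stdev(blanks)    # limit of blank
lod = lob + z95 * stdev(low_level)          # limit of detection

# LOQ criterion: the concentration at which replicate RSD falls to 20%
rsd = stdev(low_level) / mean(low_level)
print(round(lob, 3), round(lod, 3), round(rsd, 3))
```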

  5. Statistical segmentation of multidimensional brain datasets

    NASA Astrophysics Data System (ADS)

    Desco, Manuel; Gispert, Juan D.; Reig, Santiago; Santos, Andres; Pascau, Javier; Malpica, Norberto; Garcia-Barreno, Pedro

    2001-07-01

    This paper presents an automatic segmentation procedure for MRI neuroimages that overcomes some of the problems of multidimensional clustering techniques, such as partial volume effects (PVE), processing speed, and the difficulty of incorporating a priori knowledge. The method is a three-stage procedure: 1) background and skull voxels are excluded using threshold-based region growing with fully automated seed selection; 2) Expectation-Maximization algorithms are used to estimate the probability density function (PDF) of the remaining voxels, which are assumed to be mixtures of Gaussians, and these voxels are then classified into cerebrospinal fluid (CSF), white matter, and grey matter; 3) a priori knowledge is added using Markov Random Field techniques. Using this procedure, our method takes advantage of the full covariance matrix (instead of the diagonal) for the joint PDF estimation; in addition, logistic discrimination techniques are more robust against violation of multi-Gaussian assumptions. The algorithm has been tested with a dataset of 30 brain MRI studies (co-registered T1 and T2 MRI). Our method was compared with clustering techniques and with template-based statistical segmentation, using manual segmentation as a gold standard. Our results were more robust and closer to the gold standard.
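
    Stage 2 of the pipeline, EM fitting of a Gaussian mixture, can be sketched in one dimension with synthetic intensities; the real method is multidimensional and uses full covariance matrices, as described above:

```python
import math
import random

# Minimal one-dimensional sketch of stage 2, EM fitting of a two-Gaussian
# mixture followed by classification (synthetic intensities, not real MRI;
# the full method is multidimensional with full covariance matrices).

def normal_pdf(x, mu, sd):
    return math.exp(-((x - mu) ** 2) / (2 * sd * sd)) / (sd * math.sqrt(2 * math.pi))

def em_two_gaussians(xs, iters=50):
    mu1, mu2 = min(xs), max(xs)            # initialize at the extremes
    sd1 = sd2 = (max(xs) - min(xs)) / 4
    w = 0.5                                # mixing weight of component 1
    for _ in range(iters):
        # E-step: responsibility of component 1 for each intensity
        r = [w * normal_pdf(x, mu1, sd1)
             / (w * normal_pdf(x, mu1, sd1) + (1 - w) * normal_pdf(x, mu2, sd2))
             for x in xs]
        # M-step: re-estimate weight, means, and SDs from responsibilities
        n1 = sum(r)
        n2 = len(xs) - n1
        w = n1 / len(xs)
        mu1 = sum(ri * x for ri, x in zip(r, xs)) / n1
        mu2 = sum((1 - ri) * x for ri, x in zip(r, xs)) / n2
        sd1 = math.sqrt(sum(ri * (x - mu1) ** 2 for ri, x in zip(r, xs)) / n1)
        sd2 = math.sqrt(sum((1 - ri) * (x - mu2) ** 2 for ri, x in zip(r, xs)) / n2)
    return (mu1, sd1), (mu2, sd2), w

random.seed(0)
grey = [random.gauss(60, 5) for _ in range(300)]    # synthetic grey matter
white = [random.gauss(100, 8) for _ in range(300)]  # synthetic white matter

(m1, s1), (m2, s2), w = em_two_gaussians(grey + white)
print(round(m1, 1), round(m2, 1))                   # recovered class means
```

    Each voxel would then be assigned to the component with the larger responsibility; the full pipeline refines this labeling with the Markov Random Field stage.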

  6. Percutaneous Tracheostomy under Bronchoscopic Visualization Does Not Affect Short-Term or Long-Term Complications.

    PubMed

    Easterday, Thomas S; Moore, Joshua W; Redden, Meredith H; Feliciano, David V; Henderson, Vernon J; Humphries, Timothy; Kohler, Katherine E; Ramsay, Philip T; Spence, Stanston D; Walker, Mark; Wyrzykowski, Amy D

    2017-07-01

    Percutaneous tracheostomy is a safe and effective bedside procedure. Some advocate the use of bronchoscopy during the procedure to reduce the rate of complications. We evaluated our complication rate in trauma patients undergoing percutaneous tracheostomy with and without bronchoscopic guidance to ascertain whether there was a difference in the rate of complications. A retrospective review of all tracheostomies performed in critically ill trauma patients was performed using the trauma registry from an urban, Level I Trauma Center. Bronchoscopy assistance was used based on surgeon preference. Standard statistical methodology was used to determine if there was a difference in complication rates for procedures performed with and without the bronchoscope. From January 2007 to April 2016, 649 patients underwent modified percutaneous tracheostomy; 289 with the aid of a bronchoscope and 360 without. There were no statistically significant differences in any type of complication regardless of utilization of a bronchoscope. The addition of bronchoscopy provides several theoretical benefits when performing percutaneous tracheostomy. Our findings, however, do not demonstrate a statistically significant difference in complications between procedures performed with and without a bronchoscope. Use of the bronchoscope should, therefore, be left to the discretion of the performing physician.

  7. Kappa statistic for clustered dichotomous responses from physicians and patients.

    PubMed

    Kang, Chaeryon; Qaqish, Bahjat; Monaco, Jane; Sheridan, Stacey L; Cai, Jianwen

    2013-09-20

    The bootstrap method for estimating the standard error of the kappa statistic in the presence of clustered data is evaluated. Such data arise, for example, in assessing agreement between physicians and their patients regarding their understanding of the physician-patient interaction and discussions. We propose a computationally efficient procedure for generating correlated dichotomous responses for physicians and assigned patients for simulation studies. The simulation results demonstrate that the proposed bootstrap method produces a better estimate of the standard error and better coverage performance than the asymptotic standard error estimate that ignores dependence among patients within physicians, provided there is at least a moderately large number of clusters. We present an example of an application to a coronary heart disease prevention study. Copyright © 2013 John Wiley & Sons, Ltd.
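    The cluster-bootstrap idea can be sketched as follows: resample physicians (whole clusters), not individual patients, so the within-cluster correlation survives in every replicate. The data generator below is invented for illustration and is not the authors' proposed simulation procedure.

```python
import numpy as np

rng = np.random.default_rng(2)

def kappa(a, b):
    # Cohen's kappa for two dichotomous raters.
    po = np.mean(a == b)
    pe = a.mean() * b.mean() + (1 - a.mean()) * (1 - b.mean())
    return (po - pe) / (1 - pe)

# Synthetic clustered data: 30 physicians, each with several patients;
# physician and patient ratings are correlated through a shared "truth".
clusters = []
for _ in range(30):
    n = int(rng.integers(3, 8))
    truth = rng.integers(0, 2, n)
    phys = np.where(rng.random(n) < 0.85, truth, 1 - truth)
    pat = np.where(rng.random(n) < 0.80, truth, 1 - truth)
    clusters.append((phys, pat))

def pooled_kappa(cl):
    a = np.concatenate([p for p, _ in cl])
    b = np.concatenate([q for _, q in cl])
    return kappa(a, b)

# Cluster bootstrap: resample PHYSICIANS (whole clusters) with
# replacement; the spread of the replicates estimates the SE.
boot = [pooled_kappa([clusters[i]
                      for i in rng.integers(0, len(clusters), len(clusters))])
        for _ in range(500)]
se_boot = np.std(boot, ddof=1)
```

Resampling at the patient level instead would treat correlated responses as independent and understate the standard error, which is exactly the failure mode the paper's asymptotic comparison exhibits.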

  8. Design of experiments enhanced statistical process control for wind tunnel check standard testing

    NASA Astrophysics Data System (ADS)

    Phillips, Ben D.

    The current wind tunnel check standard testing program at NASA Langley Research Center is focused on increasing data quality, uncertainty quantification and overall control and improvement of wind tunnel measurement processes. The statistical process control (SPC) methodology employed in the check standard testing program allows for the tracking of variations in measurements over time as well as an overall assessment of facility health. While the SPC approach can and does provide researchers with valuable information, it has certain limitations in the areas of process improvement and uncertainty quantification. It is thought that by utilizing design of experiments methodology in conjunction with current SPC practices, one can more efficiently and robustly characterize uncertainties and develop enhanced process improvement procedures. In this research, methodologies were developed to generate regression models for wind tunnel calibration coefficients, balance force coefficients and wind tunnel flow angularities. The coefficients of these regression models were then tracked in statistical process control charts, giving a higher level of understanding of the processes. The methodology outlined is sufficiently generic that it is applicable to any wind tunnel check standard testing program.
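    One way to picture the combined approach: fit a regression per check-standard test, then track a fitted coefficient across tests in a Shewhart individuals chart. Everything below (the lift-curve slope, angle schedule, and noise level) is invented for illustration; it is not the Langley data.

```python
import numpy as np

rng = np.random.default_rng(8)

# For each simulated check-standard test, regress a force coefficient
# on angle of attack and keep the fitted slope.
alpha = np.linspace(-4.0, 8.0, 13)             # angle of attack, deg
slopes = []
for test in range(25):
    cl = 0.11 * alpha + 0.20 + rng.normal(0.0, 0.01, alpha.size)
    slopes.append(np.polyfit(alpha, cl, 1)[0])  # lift-curve slope estimate
slopes = np.asarray(slopes)

# Individuals chart: center line at the mean, sigma estimated from the
# average moving range (divided by d2 = 1.128, the constant for n = 2).
center = slopes.mean()
sigma = np.abs(np.diff(slopes)).mean() / 1.128
ucl, lcl = center + 3 * sigma, center - 3 * sigma
out_of_control = np.flatnonzero((slopes > ucl) | (slopes < lcl))
```

Charting a regression coefficient rather than a raw reading is the point: the coefficient summarizes a whole test, so a drift in the chart signals a change in the measurement process itself rather than in any single data point.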

  9. 75 FR 2488 - Mid-Atlantic Fishery Management Council; Public Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-15

    ... Mid-Atlantic Fishery Management Council's (MAFMC) Scientific and Statistical Committee (SSC) will hold... include new member orientation (overview of Council process and role of the SSC), review and adoption of SSC Standard Operating Practices and Procedures, ABC Control Rule Framework and Council Risk Policy...

  10. Permutation tests for goodness-of-fit testing of mathematical models to experimental data.

    PubMed

    Fişek, M Hamit; Barlas, Zeynep

    2013-03-01

    This paper presents statistical procedures for improving the goodness-of-fit testing of theoretical models to data obtained from laboratory experiments. We use an experimental study in the expectation states research tradition, carried out in the "standardized experimental situation" associated with the program, to illustrate the application of our procedures. We briefly review the expectation states research program and the fundamentals of resampling statistics as we develop our procedures in the resampling context. The first procedure we develop is a modification of the chi-square test, which has been the primary statistical tool for assessing goodness of fit in the expectation states theory (EST) research program but has problems associated with its use. We discuss these problems and suggest a procedure to overcome them. The second procedure we present, the "Average Absolute Deviation" test, is a new test proposed as an alternative to the chi-square test, being simpler and more informative. The third and fourth procedures are permutation versions of Jonckheere's test for ordered alternatives and Kendall's tau(b), a rank order correlation coefficient. The fifth procedure is a new rank order goodness-of-fit test, which we call the "Deviation from Ideal Ranking" index, which we believe may be more useful than other rank order tests for assessing goodness-of-fit of models to experimental data. The application of these procedures to the sample data is illustrated in detail. We then present another laboratory study from an experimental paradigm different from the expectation states paradigm, the "network exchange" paradigm, and describe how our procedures may be applied to this data set. Copyright © 2012 Elsevier Inc. All rights reserved.
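    The "Average Absolute Deviation" idea can be sketched with a parametric-resampling analogue (not the authors' exact permutation procedure): simulate datasets from the fitted model itself and ask how often they deviate at least as much as the observed data. All probabilities and counts below are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical setup: a theoretical model predicts choice probabilities
# for five experimental conditions; observed counts come from 40 trials
# per condition.
p_model = np.array([0.62, 0.58, 0.55, 0.50, 0.45])
n_per = np.array([40, 40, 40, 40, 40])
observed = np.array([27, 25, 20, 18, 16])

def aad(counts):
    # Average absolute deviation of observed proportions from the model.
    return np.mean(np.abs(counts / n_per - p_model))

stat = aad(observed)

# Resampling null: generate datasets from the model and compute the
# fraction that deviate at least as much as the real data.
sims = np.array([aad(rng.binomial(n_per, p_model)) for _ in range(2000)])
p_value = float((sims >= stat).mean())
```

A small p-value here means the data sit in the tail of what the model itself would produce, i.e. poor fit; unlike the chi-square statistic, the AAD is directly interpretable as an average error in proportion units.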

  11. An Extension of the Chi-Square Procedure for Non-NORMAL Statistics, with Application to Solar Neutrino Data

    NASA Astrophysics Data System (ADS)

    Sturrock, P. A.

    2008-01-01

    Using the chi-square statistic, one may conveniently test whether a series of measurements of a variable are consistent with a constant value. However, that test is predicated on the assumption that the appropriate probability distribution function (pdf) is normal in form. This requirement is usually not satisfied by experimental measurements of the solar neutrino flux. This article presents an extension of the chi-square procedure that is valid for any form of the pdf. This procedure is applied to the GALLEX-GNO dataset, and it is shown that the results are in good agreement with the results of Monte Carlo simulations. Whereas application of the standard chi-square test to symmetrized data yields evidence significant at the 1% level for variability of the solar neutrino flux, application of the extended chi-square test to the unsymmetrized data yields only weak evidence (significant at the 4% level) of variability.
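    The core idea, calibrating the chi-square statistic by Monte Carlo under the actual non-normal pdf instead of referring it to the chi-square distribution, can be sketched as follows. The log-normal error model, sample size, and all numbers are invented; this is not the GALLEX-GNO analysis.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical measurement series with a skewed (log-normal) error
# distribution, standing in for flux measurements whose pdf is not
# normal in form.
true_val, sigma_ln, n_obs = 70.0, 0.3, 12
data = rng.lognormal(np.log(true_val), sigma_ln, n_obs)
meas_sd = sigma_ln * true_val          # nominal per-point uncertainty

def chi2(x):
    # Chi-square about the sample mean, with a fixed nominal sigma.
    return np.sum(((x - x.mean()) / meas_sd) ** 2)

stat = chi2(data)

# Calibrate the statistic by Monte Carlo under the assumed sampling
# distribution, rather than comparing against the chi-square
# distribution (which presumes normal errors).
null = np.array([chi2(rng.lognormal(np.log(true_val), sigma_ln, n_obs))
                 for _ in range(3000)])
p_value = float((null >= stat).mean())
```

The abstract's point is visible here: the same statistic can yield quite different significance levels depending on whether the null distribution is taken from theory (normal assumption) or simulated from the actual pdf.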

  12. A scaling procedure for the response of an isolated system with high modal overlap factor

    NASA Astrophysics Data System (ADS)

    De Rosa, S.; Franco, F.

    2008-10-01

    The paper deals with a numerical approach that reduces some physical sizes of the solution domain to compute the dynamic response of an isolated system; it has been named Asymptotical Scaled Modal Analysis (ASMA). The proposed numerical procedure alters the input data needed to obtain the classic modal responses in order to increase the frequency band of validity of the discrete or continuous coordinate model, through the definition of a proper scaling coefficient. It is demonstrated that the computational cost remains acceptable while the frequency range of analysis increases. Moreover, with reference to the flexural vibrations of a rectangular plate, the paper compares ASMA with statistical energy analysis (SEA) and the energy distribution approach. Some insights are also given about the limits of the scaling coefficient. Finally, it is shown that the linear dynamic response predicted with the scaling procedure has the same quality and characteristics as the SEA response, but it can be useful when the system cannot be solved appropriately by standard SEA.

  13. Standardization, evaluation and early-phase method validation of an analytical scheme for batch-consistency N-glycosylation analysis of recombinant produced glycoproteins.

    PubMed

    Zietze, Stefan; Müller, Rainer H; Brecht, René

    2008-03-01

    In order to set up a batch-to-batch consistency analytical scheme for N-glycosylation analysis, several sample preparation steps, including enzyme digestions and fluorophore labelling, and two HPLC methods were established. The whole method scheme was standardized, evaluated and validated according to the requirements on analytical testing in early clinical drug development, using a recombinantly produced reference glycoprotein (RGP). Standardization of the methods was achieved through clearly defined standard operating procedures. During evaluation of the methods, the major interest was the determination of oligosaccharide losses within the analytical scheme. Validation of the methods was performed with respect to specificity, linearity, repeatability, LOD and LOQ. Because reference N-glycan standards were not available, a statistical approach was chosen to derive accuracy from the linearity data. After finishing the validation procedure, defined limits for method variability could be calculated, and differences observed in consistency analysis could be separated into significant and incidental ones.

  14. A simple test of association for contingency tables with multiple column responses.

    PubMed

    Decady, Y J; Thomas, D R

    2000-09-01

    Loughin and Scherer (1998, Biometrics 54, 630-637) investigated tests of association in two-way tables when one of the categorical variables allows for multiple-category responses from individual respondents. Standard chi-squared tests are invalid in this case, and they developed a bootstrap test procedure that provides good control of test levels under the null hypothesis. This procedure and some others that have been proposed are computationally involved and are based on techniques that are relatively unfamiliar to many practitioners. In this paper, the methods introduced by Rao and Scott (1981, Journal of the American Statistical Association 76, 221-230) for analyzing complex survey data are used to develop a simple test based on a corrected chi-squared statistic.

  15. 76 FR 647 - Energy Conservation Program: Test Procedures for Electric Motors and Small Electric Motors

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-05

    ... determination method (AEDM) for small electric motors, including the statistical requirements to substantiate... restriction to a particular application or type of application; or (2) Standard operating characteristics or... application, and which can be used in most general purpose applications. [[Page 652

  16. Standard Operating Procedures for Collecting Data from Local Education Agencies.

    ERIC Educational Resources Information Center

    McElreath, Nancy R., Ed.

    A systematic approach to planning and presenting the data collection activities of a State Department of Education is described. The Information Communication System, a model communication system used by the state of New Jersey, conveys narrative and statistical information relating to a school district's students, teachers, finances, facilities…

  17. Observed-Score Equating with a Heterogeneous Target Population

    ERIC Educational Resources Information Center

    Duong, Minh Q.; von Davier, Alina A.

    2012-01-01

    Test equating is a statistical procedure for adjusting for test form differences in difficulty in a standardized assessment. Equating results are supposed to hold for a specified target population (Kolen & Brennan, 2004; von Davier, Holland, & Thayer, 2004) and to be (relatively) independent of the subpopulations from the target population (see…

  18. Can subepithelial connective tissue grafts be considered the gold standard procedure in the treatment of Miller Class I and II recession-type defects?

    PubMed

    Chambrone, Leandro; Chambrone, Daniela; Pustiglioni, Francisco E; Chambrone, Luiz A; Lima, Luiz A

    2008-09-01

    The objective of this systematic review was to answer the following question: 'Can subepithelial connective tissue grafts (SCTG) be considered the gold standard procedure in the treatment of recession-type defects?' DATA AND SOURCE: An electronic search (MEDLINE, EMBASE and CENTRAL) for randomized controlled clinical trials with at least 6 months' follow-up comparing SCTG with other procedures for the treatment of gingival recession was performed up to December 2007. To be eligible for this review, patients had to present a diagnosis of gingival recession with the following characteristics: (a) recession areas selected for treatment classified as Miller [Miller Jr PD. A classification of marginal tissue recession. International Journal of Periodontics & Restorative Dentistry 1985;5:8-13.] Class I or Class II of at least 2 mm; (b) recession areas containing teeth with no caries or restorations; and (c) at least 10 participants per group at final examination. From a total of 568 references, 23 studies were considered relevant. The results indicated a statistically significant greater reduction in gingival recession for SCTG when compared to acellular dermal matrix grafts and guided tissue regeneration with resorbable membranes (GTR rm). For clinical attachment level changes, differences between all groups were not significant. For changes in the keratinized tissue (KT), the results showed a statistically significant gain in the width of KT for SCTG when compared to GTR rm. The results of this review show that subepithelial connective tissue grafts provided significant root coverage, clinical attachment and keratinized tissue gain. Overall comparisons allow us to consider it the 'gold standard' procedure in the treatment of recession-type defects.

  19. Post-operative shampoo effects in neurosurgical patients: a pilot experimental study.

    PubMed

    Palese, Alvisa; Moreale, Renzo; Noacco, Massimo; Pistrino, Flavia; Mastrolia, Irene; Sartor, Assunta; Scarparo, Claudio; Skrap, Miran

    2015-04-01

    Neurosurgical site infections are an important issue. Among the acknowledged preventive tactics, the non-shaving technique is well established in the neurosurgical setting. However, hair around the surgical site may retain biologic material that emerges during the procedure, or may simply become dirty, which may increase the risk of surgical site infections (SSIs); whether and when shampooing should be offered therefore remains under debate. A pilot experimental study was undertaken from 2011 to 2012. A series of neurosurgical patients not affected by conditions that would increase the risk of post-operative infection were randomly assigned to the exposed group (receiving a shampoo 72 h after the surgical procedure) or the control group (receiving standard dressing surveillance without shampooing). Comfort, surgical site contamination (measured as the number of colony-forming units [CFU]), and SSIs at 30 d after surgery were the main study outcomes. A total of 53 patients were included: 25 (47.2%) received a shampoo after 72 h, whereas 28 (52.8%) received standard care. Patients who received a shampoo reported a similar level of comfort (average 8.04; standard deviation [SD] 1.05) compared with those receiving standard care (average 7.3; SD 3.2), although the difference was not statistically significant (p=0.345). No statistically significant difference emerged in the occurrence of surgical site contamination between the groups, and no SSIs were detected within 30 d. In our pilot study, the results of which are not generalizable because of the limited sample of patients involved, a gentle shampoo offered 72 h after the surgical procedure did not increase the occurrence of SSIs or the contamination of the surgical site, although it may increase patients' perception of comfort. Further studies are strongly recommended, involving a larger sample size and designed to include more diversified neurosurgical patients undergoing surgical procedures in different centers.

  20. Evaluative procedures to detect, characterize, and assess the severity of diabetic neuropathy.

    PubMed

    Dyck, P J

    1991-01-01

    Minimal criteria for diabetic neuropathy need to be defined and universally applied. Standardized evaluative procedures need to be agreed and normal ranges determined from healthy volunteers. Types and stages of neuropathy should be established and assessments performed on representative populations of both Type 1 and Type 2 diabetic patients. Potential minimal criteria include absent ankle reflexes and vibratory sensation, and abnormalities of nerve conduction. However, the preferred criterion is the identification of more than two statistically defined abnormalities among symptoms and deficits, nerve conduction, quantitative sensory examination or quantitative autonomic examination. Various evaluative procedures are available. Symptoms should be assessed and scores can be assigned to neurological deficits. However, assessments of nerve conduction provide the most specific, objective, sensitive, and repeatable procedures, although these may be the least meaningful. Many techniques are available for quantitative sensory examination, but are poorly standardized and normal values are not available. For quantitative autonomic examination, tests are available for the adequacy of cardiovascular and peripheral vascular reflexes and increasingly for other autonomic functions. In any assessment of nerve function the conditions should be optimized and standardized, and stimuli defined. Specific instructions should be given and normal ranges established in healthy volunteers.

  1. Establishing the traceability of a uranyl nitrate solution to a standard reference material

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jackson, C.H.; Clark, J.P.

    1978-01-01

    A uranyl nitrate solution for use as a Working Calibration and Test Material (WCTM) was characterized using a statistically designed procedure to document traceability to National Bureau of Standards Standard Reference Material SRM-960. A Reference Calibration and Test Material (RCTM) was prepared from SRM-960 uranium metal to approximate the acid and uranium concentrations of the WCTM. This solution was used in the characterization procedure. Details of preparing, handling, and packaging these solutions are covered. Two outside laboratories, each having measurement expertise in a different analytical method, were selected to measure both solutions according to the procedure for characterizing the WCTM. Two different methods were also used for the in-house characterization work. All analytical results were tested for statistical agreement before the WCTM concentration and limit-of-error values were calculated. A concentration value was determined with a relative limit of error (RLE) of approximately 0.03%, which was better than the target RLE of 0.08%. The use of this working material eliminates the expense of using SRMs to fulfill traceability requirements for uranium measurements on this type of material. Several years' supply of uranyl nitrate solution with NBS traceability was produced. The cost of this material was less than 10% of that of an equal quantity of SRM-960 uranium metal.

  2. Statistical baseline assessment in cardiotocography.

    PubMed

    Agostinelli, Angela; Braccili, Eleonora; Marchegiani, Enrico; Rosati, Riccardo; Sbrollini, Agnese; Burattini, Luca; Morettini, Micaela; Di Nardo, Francesco; Fioretti, Sandro; Burattini, Laura

    2017-07-01

    Cardiotocography (CTG) is the most common non-invasive diagnostic technique to evaluate fetal well-being. It consists of recording the fetal heart rate (FHR; bpm) and maternal uterine contractions. Among the main parameters characterizing FHR, the baseline (BL) is fundamental to determine fetal hypoxia and distress. In computerized applications, BL is typically computed as mean FHR±ΔFHR, with ΔFHR=8 bpm or ΔFHR=10 bpm, both values being experimentally fixed. In this context, the present work aims: to propose a statistical procedure for ΔFHR assessment; to quantitatively determine the ΔFHR value by applying this procedure to clinical data; and to compare the statistically determined ΔFHR value against the experimentally determined ΔFHR values. To these ends, the 552 recordings of the "CTU-UHB intrapartum CTG database" from Physionet were submitted to an automatic procedure, which consisted of an FHR preprocessing phase and a statistical BL assessment. During preprocessing, FHR time series were divided into 20-min sliding windows, in which missing data were removed by linear interpolation. Only windows with a correction rate lower than 10% were further processed for BL assessment, in which ΔFHR was computed as the FHR standard deviation. The total number of accepted windows was 1192 (38.5%) over 383 recordings (69.4%) with at least one accepted window. The statistically determined ΔFHR value was 9.7 bpm. This value was statistically different from 8 bpm (P < 10^-19) but not from 10 bpm (P = 0.16). Thus, ΔFHR=10 bpm is preferable to 8 bpm because it is both experimentally and statistically validated.
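    The windowing and acceptance logic described above can be sketched as follows; the sampling rate, the gap-handling details, and the synthetic trace are assumptions for illustration, not the authors' exact implementation.

```python
import numpy as np

def baseline_windows(fhr, fs=4.0, win_min=20, max_missing=0.10):
    """Split an FHR trace into 20-min windows, linearly interpolate
    gaps, keep only windows with fewer than 10% corrected samples,
    and return (BL, dFHR) = (mean, SD) for each accepted window."""
    win = int(win_min * 60 * fs)
    out = []
    for start in range(0, len(fhr) - win + 1, win):
        seg = fhr[start:start + win].astype(float)
        missing = ~np.isfinite(seg) | (seg <= 0)       # dropped/invalid samples
        if missing.mean() >= max_missing:
            continue                                   # reject noisy window
        idx = np.arange(win)
        seg[missing] = np.interp(idx[missing], idx[~missing], seg[~missing])
        out.append((seg.mean(), seg.std(ddof=1)))      # (BL, data-driven dFHR)
    return out

# Synthetic 40-minute FHR trace at 4 Hz with sporadic signal loss.
rng = np.random.default_rng(0)
fhr = 140.0 + rng.normal(0.0, 5.0, int(40 * 60 * 4))
fhr[::60] = np.nan                      # ~1.7% missing samples
windows = baseline_windows(fhr)         # list of (BL, dFHR) per window
```

On real recordings the second element of each pair plays the role of the statistically determined ΔFHR, to be pooled across accepted windows as in the paper.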

  3. Changes in Occupational Radiation Exposures after Incorporation of a Real-time Dosimetry System in the Interventional Radiology Suite.

    PubMed

    Poudel, Sashi; Weir, Lori; Dowling, Dawn; Medich, David C

    2016-08-01

    A statistical pilot study was retrospectively performed to analyze potential changes in occupational radiation exposures to Interventional Radiology (IR) staff at Lawrence General Hospital after implementation of the i2 Active Radiation Dosimetry System (Unfors RaySafe Inc, 6045 Cochran Road Cleveland, OH 44139-3302). In this study, the monthly OSL dosimetry records obtained during the eight-month period prior to i2 implementation were normalized to the number of procedures performed during each month and statistically compared to the normalized dosimetry records obtained for the eight-month period after i2 implementation. The resulting statistics included the mean and standard deviation of the dose equivalent per procedure, together with appropriate hypothesis tests to assess for statistically valid differences between the pre- and post-i2 study periods. Hypothesis testing was performed on three groups of staff present during an IR procedure: the first group included all members of the IR staff, the second consisted of the IR radiologists, and the third consisted of the IR technologist staff. After implementing the i2 active dosimetry system, participating members of the Lawrence General IR staff had a reduction in the average dose equivalent per procedure of 43.1% ± 16.7% (p = 0.04). Similarly, Lawrence General IR radiologists had a 65.8% ± 33.6% (p = 0.01) reduction, while the technologists had a 45.0% ± 14.4% (p = 0.03) reduction.
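    The abstract does not name its test statistic; below is a minimal sketch of one reasonable pre/post comparison of normalized monthly records, on invented dose values. Welch's unequal-variance t-test is an assumption here, not necessarily the study's choice.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Hypothetical monthly dose-equivalent-per-procedure values (arbitrary
# dose units per procedure) for the 8 months before and after the
# real-time dosimetry system went live.
pre = rng.normal(1.20, 0.25, 8)
post = rng.normal(0.70, 0.20, 8)

# Welch's t-test on the normalized monthly records, plus the percent
# reduction in the mean dose equivalent per procedure.
t_stat, p_value = stats.ttest_ind(pre, post, equal_var=False)
reduction_pct = 100.0 * (1.0 - post.mean() / pre.mean())
```

Normalizing by procedure count before testing, as the study does, is what makes the monthly records comparable across periods with different caseloads.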

  4. Towards a Standardized Definition of Educational Expenditure. Statistical Reports and Studies, No. 34.

    ERIC Educational Resources Information Center

    Lassibille, Gerard

    This document is an inventory of public and private educational expenditure from an international perspective and sets out the concepts, definitions and procedures useful in accounting for educational expenditure. The monograph defines the field within which expenditure is to be observed; details the necessary expenditure data to be collected from…

  5. Normalization Ridge Regression in Practice II: The Estimation of Multiple Feedback Linkages.

    ERIC Educational Resources Information Center

    Bulcock, J. W.

    The use of the two-stage least squares (2 SLS) procedure for estimating nonrecursive social science models is often impractical when multiple feedback linkages are required. This is because 2 SLS is extremely sensitive to multicollinearity. The standard statistical solution to the multicollinearity problem is a biased, variance reduced procedure…

  6. Meta-analysis of the technical performance of an imaging procedure: guidelines and statistical methodology.

    PubMed

    Huang, Erich P; Wang, Xiao-Feng; Choudhury, Kingshuk Roy; McShane, Lisa M; Gönen, Mithat; Ye, Jingjing; Buckler, Andrew J; Kinahan, Paul E; Reeves, Anthony P; Jackson, Edward F; Guimaraes, Alexander R; Zahlmann, Gudrun

    2015-02-01

    Medical imaging serves many roles in patient care and the drug approval process, including assessing treatment response and guiding treatment decisions. These roles often involve a quantitative imaging biomarker, an objectively measured characteristic of the underlying anatomic structure or biochemical process derived from medical images. Before a quantitative imaging biomarker is accepted for use in such roles, the imaging procedure to acquire it must undergo evaluation of its technical performance, which entails assessment of performance metrics such as repeatability and reproducibility of the quantitative imaging biomarker. Ideally, this evaluation will involve quantitative summaries of results from multiple studies to overcome limitations due to the typically small sample sizes of technical performance studies and/or to include a broader range of clinical settings and patient populations. This paper is a review of meta-analysis procedures for such an evaluation, including identification of suitable studies, statistical methodology to evaluate and summarize the performance metrics, and complete and transparent reporting of the results. This review addresses challenges typical of meta-analyses of technical performance, particularly small study sizes, which often causes violations of assumptions underlying standard meta-analysis techniques. Alternative approaches to address these difficulties are also presented; simulation studies indicate that they outperform standard techniques when some studies are small. The meta-analysis procedures presented are also applied to actual [18F]-fluorodeoxyglucose positron emission tomography (FDG-PET) test-retest repeatability data for illustrative purposes. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.

  7. Meta-analysis of the technical performance of an imaging procedure: Guidelines and statistical methodology

    PubMed Central

    Huang, Erich P; Wang, Xiao-Feng; Choudhury, Kingshuk Roy; McShane, Lisa M; Gönen, Mithat; Ye, Jingjing; Buckler, Andrew J; Kinahan, Paul E; Reeves, Anthony P; Jackson, Edward F; Guimaraes, Alexander R; Zahlmann, Gudrun

    2017-01-01

    Medical imaging serves many roles in patient care and the drug approval process, including assessing treatment response and guiding treatment decisions. These roles often involve a quantitative imaging biomarker, an objectively measured characteristic of the underlying anatomic structure or biochemical process derived from medical images. Before a quantitative imaging biomarker is accepted for use in such roles, the imaging procedure to acquire it must undergo evaluation of its technical performance, which entails assessment of performance metrics such as repeatability and reproducibility of the quantitative imaging biomarker. Ideally, this evaluation will involve quantitative summaries of results from multiple studies to overcome limitations due to the typically small sample sizes of technical performance studies and/or to include a broader range of clinical settings and patient populations. This paper is a review of meta-analysis procedures for such an evaluation, including identification of suitable studies, statistical methodology to evaluate and summarize the performance metrics, and complete and transparent reporting of the results. This review addresses challenges typical of meta-analyses of technical performance, particularly small study sizes, which often causes violations of assumptions underlying standard meta-analysis techniques. Alternative approaches to address these difficulties are also presented; simulation studies indicate that they outperform standard techniques when some studies are small. The meta-analysis procedures presented are also applied to actual [18F]-fluorodeoxyglucose positron emission tomography (FDG-PET) test–retest repeatability data for illustrative purposes. PMID:24872353

  8. Tear cytokine profile as a noninvasive biomarker of inflammation for ocular surface diseases: standard operating procedures.

    PubMed

    Wei, Yi; Gadaria-Rathod, Neha; Epstein, Seth; Asbell, Penny

    2013-12-23

    To provide standard operating procedures (SOPs) for measuring tear inflammatory cytokine concentrations and to validate the resulting profile as a minimally invasive objective metric and biomarker of ocular surface inflammation for use in multicenter clinical trials on dry eye disease (DED). Standard operating procedures were established and then validated with cytokine standards, quality controls, and masked tear samples collected from local and distant clinical sites. The concentrations of the inflammatory cytokines in tears were quantified using a high-sensitivity human cytokine multiplex kit. A panel of inflammatory cytokines was initially investigated, from which four key inflammatory cytokines (IL-1β, IL-6, IFN-γ, and TNF-α) were chosen. Results with cytokine standards statistically satisfied the manufacturer's quality control criteria. Results with pooled tear samples were highly reproducible and reliable with tear volumes ranging from 4 to 10 μL. Incorporation of the SOPs into clinical trials was subsequently validated. Tear samples were collected at a distant clinical site, stored, and shipped to our Biomarker Laboratory, where a masked analysis of the four tear cytokines was successfully performed. Tear samples were also collected from a feasibility study on DED. Inflammatory cytokine concentrations were decreased in tears of subjects who received anti-inflammatory treatment. Standard operating procedures for human tear cytokine assessment suitable for multicenter clinical trials were established. Tear cytokine profiling using these SOPs may provide objective metrics useful for diagnosing, classifying, and analyzing treatment efficacy in inflammatory conditions of the ocular surface, which may further elucidate the mechanisms involved in the pathogenesis of ocular surface disease.

  9. Value assignment and uncertainty evaluation for single-element reference solutions

    NASA Astrophysics Data System (ADS)

    Possolo, Antonio; Bodnar, Olha; Butler, Therese A.; Molloy, John L.; Winchester, Michael R.

    2018-06-01

    A Bayesian statistical procedure is proposed for value assignment and uncertainty evaluation for the mass fraction of the elemental analytes in single-element solutions distributed as NIST standard reference materials. The principal novelty that we describe is the use of information about relative differences observed historically between the measured values obtained via gravimetry and via high-performance inductively coupled plasma optical emission spectrometry, to quantify the uncertainty component attributable to between-method differences. This information is encapsulated in a prior probability distribution for the between-method uncertainty component, and it is then used, together with the information provided by current measurement data, to produce a probability distribution for the value of the measurand from which an estimate and evaluation of uncertainty are extracted using established statistical procedures.
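    A toy Monte Carlo illustration of the key ingredient described above: a prior on the between-method component, informed by historical gravimetry-vs-ICP-OES relative differences, folded into an inverse-variance combination of the two current measurements. This is a loose sketch, not NIST's actual Bayesian procedure, and every number is invented.

```python
import numpy as np

rng = np.random.default_rng(6)

# Invented historical relative differences (%) between gravimetry and
# ICP-OES values; their spread sets the scale of a half-normal prior
# for the between-method uncertainty component tau.
hist_rel_diff = np.array([0.05, -0.03, 0.08, 0.02, -0.06, 0.04])
tau_scale_pct = hist_rel_diff.std(ddof=1)

# Invented current measurements of the mass fraction by the two
# methods, with their standard uncertainties.
grav, u_grav = 10.012, 0.004
icp, u_icp = 10.004, 0.006

# Monte Carlo sketch: draw tau from the prior, inflate each method's
# variance by tau^2, combine by inverse-variance weighting, then draw
# a value of the measurand; the spread of the draws gives a combined
# uncertainty that includes the between-method component.
draws = np.empty(20000)
for i in range(draws.size):
    tau = abs(rng.normal(0.0, tau_scale_pct)) / 100.0 * grav
    w1, w2 = 1.0 / (u_grav**2 + tau**2), 1.0 / (u_icp**2 + tau**2)
    mu = (w1 * grav + w2 * icp) / (w1 + w2)
    draws[i] = rng.normal(mu, np.sqrt(1.0 / (w1 + w2)))

value, u_combined = draws.mean(), draws.std(ddof=1)
```

The design choice mirrored here is the abstract's main point: historical between-method disagreement enters as a prior component rather than being ignored, so the reported uncertainty cannot shrink below what the two methods' track record supports.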

  10. Uncertainty Analysis of Inertial Model Attitude Sensor Calibration and Application with a Recommended New Calibration Method

    NASA Technical Reports Server (NTRS)

    Tripp, John S.; Tcheng, Ping

    1999-01-01

    Statistical tools, previously developed for nonlinear least-squares estimation of multivariate sensor calibration parameters and the associated calibration uncertainty analysis, have been applied to single- and multiple-axis inertial model attitude sensors used in wind tunnel testing to measure angle of attack and roll angle. The analysis provides confidence and prediction intervals of calibrated sensor measurement uncertainty as functions of applied input pitch and roll angles. A comparative performance study of various experimental designs for inertial sensor calibration is presented along with corroborating experimental data. The importance of replicated calibrations over extended time periods has been emphasized; replication provides independent estimates of calibration precision and bias uncertainties, statistical tests for calibration or modeling bias uncertainty, and statistical tests for sensor parameter drift over time. A set of recommendations for a new standardized model attitude sensor calibration method and usage procedures is included. The statistical information provided by these procedures is necessary for the uncertainty analysis of aerospace test results now required by users of industrial wind tunnel test facilities.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simonen, F.A.; Khaleel, M.A.

    This paper describes a statistical evaluation of the through-thickness copper variation for welds in reactor pressure vessels, and reviews the historical basis for the static and arrest fracture toughness (K{sub Ic} and K{sub Ia}) equations used in the VISA-II code. Copper variability in welds is due to fabrication procedures, with copper contents being randomly distributed and variable from one location to another through the thickness of the vessel. The VISA-II procedure of sampling the copper content from a statistical distribution for every 6.35- to 12.7-mm (1/4- to 1/2-in.) layer through the thickness was found to be consistent with the statistical observations. However, the parameters of the VISA-II distribution and statistical limits required further investigation. Copper contents at a few locations through the thickness were found to exceed the 0.4% upper limit of the VISA-II code. The data also suggest that the mean copper content varies systematically through the thickness. While the assumption of normality is not clearly supported by the available data, a statistical evaluation based on all the available data results in mean and standard deviations within the VISA-II code limits.

  12. Test Statistics and Confidence Intervals to Establish Noninferiority between Treatments with Ordinal Categorical Data.

    PubMed

    Zhang, Fanghong; Miyaoka, Etsuo; Huang, Fuping; Tanaka, Yutaka

    2015-01-01

    The problem for establishing noninferiority is discussed between a new treatment and a standard (control) treatment with ordinal categorical data. A measure of treatment effect is used and a method of specifying noninferiority margin for the measure is provided. Two Z-type test statistics are proposed where the estimation of variance is constructed under the shifted null hypothesis using U-statistics. Furthermore, the confidence interval and the sample size formula are given based on the proposed test statistics. The proposed procedure is applied to a dataset from a clinical trial. A simulation study is conducted to compare the performance of the proposed test statistics with that of the existing ones, and the results show that the proposed test statistics are better in terms of the deviation from nominal level and the power.
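    A minimal sketch of a Z-type noninferiority test for ordinal data, using the Mann-Whitney-type effect measure θ = P(X < Y) + ½P(X = Y) with a placement-value (U-statistic) variance estimate. This is an illustrative construction rather than the paper's exact statistics, and the category counts are invented:

    ```python
    import numpy as np

    def mw_theta(x, y):
        """theta = P(X < Y) + 0.5 P(X = Y) and its U-statistic standard error."""
        x, y = np.asarray(x, float), np.asarray(y, float)
        m, n = len(x), len(y)
        # psi[i, j] = 1 if x_i < y_j, 0.5 on a tie, 0 otherwise
        psi = (x[:, None] < y[None, :]) + 0.5 * (x[:, None] == y[None, :])
        theta = psi.mean()
        v10 = psi.mean(axis=1)   # placements of control observations
        v01 = psi.mean(axis=0)   # placements of new-treatment observations
        se = np.sqrt(v10.var(ddof=1) / m + v01.var(ddof=1) / n)
        return theta, se

    # Invented ordinal outcomes (categories 1-4), 100 subjects per arm
    control = np.repeat([1, 2, 3, 4], [10, 20, 40, 30])
    new_trt = np.repeat([1, 2, 3, 4], [12, 18, 38, 32])

    theta, se = mw_theta(control, new_trt)
    delta = 0.10                       # noninferiority margin for theta
    z = (theta - (0.5 - delta)) / se   # reject inferiority if z > 1.645
    ```

    With these counts the two arms are nearly identical (θ ≈ 0.5), so the test comfortably rejects inferiority at the 0.10 margin.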

  13. Analysis of the procedures used to evaluate suicide crime scenes in Brazil: a statistical approach to interpret reports.

    PubMed

    Bruni, Aline Thaís; Velho, Jesus Antonio; Ferreira, Arthur Serra Lopes; Tasso, Maria Júlia; Ferrari, Raíssa Santos; Yoshida, Ricardo Luís; Dias, Marcos Salvador; Leite, Vitor Barbanti Pereira

    2014-08-01

    This study uses statistical techniques to evaluate reports on suicide scenes; it utilizes 80 reports from different locations in Brazil, randomly collected from both federal and state jurisdictions. We aimed to assess a heterogeneous group of cases in order to obtain an overall perspective of the problem. We evaluated variables regarding the characteristics of the crime scene, such as the traces that were found (blood, instruments, and clothes), and we addressed the methodology employed by the experts. A qualitative approach using basic statistics revealed a wide distribution in how the issue was addressed in the documents. We examined a quantitative approach involving an empirical equation, and we used multivariate procedures to validate the quantitative methodology proposed for this empirical equation. The methodology successfully identified the main differences in the information presented in the reports, showing that there is no standardized method of analyzing evidence. Copyright © 2014 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  14. An analytic technique for statistically modeling random atomic clock errors in estimation

    NASA Technical Reports Server (NTRS)

    Fell, P. J.

    1981-01-01

    Minimum variance estimation requires that the statistics of random observation errors be modeled properly. If measurements are derived through the use of atomic frequency standards, then one source of error affecting the observable is random fluctuation in frequency. This is the case, for example, with range and integrated Doppler measurements from satellites of the Global Positioning and baseline determination for geodynamic applications. An analytic method is presented which approximates the statistics of this random process. The procedure starts with a model of the Allan variance for a particular oscillator and develops the statistics of range and integrated Doppler measurements. A series of five first order Markov processes is used to approximate the power spectral density obtained from the Allan variance.
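    The final step, approximating the oscillator's power spectral density with a bank of first-order Markov processes, can be sketched as follows. The correlation times and amplitudes below are invented placeholders; in practice they would be fitted to the Allan-variance model of a specific frequency standard:

    ```python
    import numpy as np

    def gauss_markov(n, dt, tau, sigma, rng):
        """First-order Gauss-Markov process: x[k] = exp(-dt/tau) x[k-1] + w[k],
        with the driving noise scaled so the stationary std dev is sigma."""
        phi = np.exp(-dt / tau)
        q = sigma**2 * (1.0 - phi**2)   # driving-noise variance
        x = np.empty(n)
        x[0] = rng.normal(0.0, sigma)
        for k in range(1, n):
            x[k] = phi * x[k - 1] + rng.normal(0.0, np.sqrt(q))
        return x

    rng = np.random.default_rng(0)
    # Five processes with staggered correlation times approximate a power-law PSD
    taus = [1.0, 10.0, 100.0, 1e3, 1e4]   # correlation times, s (hypothetical)
    sigmas = [1e-13] * 5                  # per-process std devs (hypothetical)
    freq_noise = sum(gauss_markov(20_000, 1.0, t, s, rng)
                     for t, s in zip(taus, sigmas))
    ```

    Summing processes whose correlation times are spaced by decades gives a spectrum that is approximately power-law over the corresponding frequency range, which is the sense in which the series approximates the clock noise.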

  15. Evaluation on the use of cerium in the NBL Titrimetric Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zebrowski, J.P.; Orlowicz, G.J.; Johnson, K.D.

    An alternative to potassium dichromate as titrant in the New Brunswick Laboratory Titrimetric Method for uranium analysis was sought, since chromium in the waste makes disposal difficult. Substitution of a ceric-based titrant was statistically evaluated. Analysis of the data indicated statistically equivalent precisions for the two methods, but a significant overall bias of +0.035% for the ceric titrant procedure. The cause of the bias was investigated, alterations to the procedure were made, and a second statistical study was performed. This second study revealed no statistically significant bias, nor any analyst-to-analyst variation in the ceric titration procedure. A statistically significant day-to-day variation was detected, but this was physically small (0.015%) and was only detected because of the within-day precision of the method. The mean and standard deviation of the %RD for a single measurement was found to be 0.031%. A comparison with quality control blind dichromate titration data again indicated similar overall precision. The effect of ten elements (Co, Ti, Cu, Ni, Na, Mg, Gd, Zn, Cd, and Cr) on the ceric titration's performance was determined; in previous work at NBL these impurities did not interfere with the potassium dichromate titrant. This study indicated similar results for the ceric titrant, with the exception of Ti. All the elements (excluding Ti and Cr) caused no statistically significant bias in uranium measurements at levels of 10 mg impurity per 20-40 mg uranium. The presence of Ti was found to cause a bias of -0.05%; this is attributed to the presence of sulfate ions, resulting in precipitation of titanium sulfate and occlusion of uranium. A negative bias of 0.012% was also statistically observed in the samples containing chromium impurities.

  16. Computing physical properties with quantum Monte Carlo methods with statistical fluctuations independent of system size.

    PubMed

    Assaraf, Roland

    2014-12-01

    We show that the recently proposed correlated sampling without reweighting procedure extends the locality (asymptotic independence of the system size) of a physical property to the statistical fluctuations of its estimator. This makes the approach potentially vastly more efficient for computing space-localized properties in large systems compared with standard correlated methods. A proof is given for a large collection of noninteracting fragments. Calculations on hydrogen chains suggest that this behavior holds not only for systems displaying short-range correlations, but also for systems with long-range correlations.

  17. Are procedures codes in claims data a reliable indicator of intraoperative splenic injury compared with clinical registry data?

    PubMed

    Stey, Anne M; Ko, Clifford Y; Hall, Bruce Lee; Louie, Rachel; Lawson, Elise H; Gibbons, Melinda M; Zingmond, David S; Russell, Marcia M

    2014-08-01

    Identifying iatrogenic injuries using existing data sources is important for improved transparency in the occurrence of intraoperative events. There is evidence that procedure codes are reliably recorded in claims data. The objective of this study was to assess whether concurrent splenic procedure codes in patients undergoing colectomy procedures are reliably coded in claims data as compared with clinical registry data. Patients who underwent colectomy procedures in the absence of neoplastic diagnosis codes were identified from American College of Surgeons (ACS) NSQIP data linked with Medicare inpatient claims data file (2005 to 2008). A κ statistic was used to assess coding concordance between ACS NSQIP and Medicare inpatient claims, with ACS NSQIP serving as the reference standard. A total of 11,367 colectomy patients were identified from 212 hospitals. There were 114 patients (1%) who had a concurrent splenic procedure code recorded in either ACS NSQIP or Medicare inpatient claims. There were 7 patients who had a splenic injury diagnosis code recorded in either data source. Agreement of splenic procedure codes between the data sources was substantial (κ statistic 0.72; 95% CI, 0.64-0.79). Medicare inpatient claims identified 81% of the splenic procedure codes recorded in ACS NSQIP, and 99% of the patients without a splenic procedure code. It is feasible to use Medicare claims data to identify splenic injuries occurring during colectomy procedures, as claims data have moderate sensitivity and excellent specificity for capturing concurrent splenic procedure codes compared with ACS NSQIP. Copyright © 2014 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
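    The κ agreement statistic used here is straightforward to compute from a 2 × 2 cross-classification of the two data sources. The cell counts below are invented for illustration (they are not the study's actual counts), chosen only so the marginals roughly echo the reported 81% sensitivity and 99% specificity:

    ```python
    import numpy as np

    # Rows: ACS NSQIP (reference standard); columns: Medicare claims.
    # Hypothetical counts for concurrent splenic procedure codes.
    table = np.array([[85,      20],    # NSQIP +  (claims +, claims -)
                      [9,    11253]])   # NSQIP -

    n = table.sum()
    po = np.trace(table) / n                   # observed agreement
    row_p = table.sum(axis=1) / n
    col_p = table.sum(axis=0) / n
    pe = np.dot(row_p, col_p)                  # agreement expected by chance
    kappa = (po - pe) / (1 - pe)

    sensitivity = table[0, 0] / table[0].sum() # claims capture NSQIP positives
    specificity = table[1, 1] / table[1].sum()
    ```

    Note how a raw agreement of 99.7% collapses to a much more modest κ once chance agreement on the overwhelmingly common "no splenic procedure" cell is removed; that correction is exactly why κ, not raw agreement, is reported for rare events like these.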

  18. Robust regression for large-scale neuroimaging studies.

    PubMed

    Fritsch, Virgile; Da Mota, Benoit; Loth, Eva; Varoquaux, Gaël; Banaschewski, Tobias; Barker, Gareth J; Bokde, Arun L W; Brühl, Rüdiger; Butzek, Brigitte; Conrod, Patricia; Flor, Herta; Garavan, Hugh; Lemaitre, Hervé; Mann, Karl; Nees, Frauke; Paus, Tomas; Schad, Daniel J; Schümann, Gunter; Frouin, Vincent; Poline, Jean-Baptiste; Thirion, Bertrand

    2015-05-01

    Multi-subject datasets used in neuroimaging group studies have a complex structure, as they exhibit non-stationary statistical properties across regions and display various artifacts. While studies with small sample sizes can rarely be shown to deviate from standard hypotheses (such as the normality of the residuals) due to the poor sensitivity of normality tests with low degrees of freedom, large-scale studies (e.g. >100 subjects) exhibit more obvious deviations from these hypotheses and call for more refined models for statistical inference. Here, we demonstrate the benefits of robust regression as a tool for analyzing large neuroimaging cohorts. First, we use an analytic test based on robust parameter estimates; based on simulations, this procedure is shown to provide an accurate statistical control without resorting to permutations. Second, we show that robust regression yields more detections than standard algorithms using as an example an imaging genetics study with 392 subjects. Third, we show that robust regression can avoid false positives in a large-scale analysis of brain-behavior relationships with over 1500 subjects. Finally we embed robust regression in the Randomized Parcellation Based Inference (RPBI) method and demonstrate that this combination further improves the sensitivity of tests carried out across the whole brain. Altogether, our results show that robust procedures provide important advantages in large-scale neuroimaging group studies. Copyright © 2015 Elsevier Inc. All rights reserved.
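    As a concrete illustration of why robust regression resists gross artifacts, here is a minimal Huber M-estimator fitted by iteratively reweighted least squares on synthetic data with 10% outliers. This is a simplified stand-in for the estimators used in the paper, not their implementation:

    ```python
    import numpy as np

    def huber_irls(X, y, c=1.345, n_iter=50):
        """Huber M-estimate of regression coefficients via IRLS."""
        beta = np.linalg.lstsq(X, y, rcond=None)[0]           # OLS start
        for _ in range(n_iter):
            r = y - X @ beta
            s = 1.4826 * np.median(np.abs(r - np.median(r)))  # robust scale (MAD)
            s = max(s, 1e-12)
            u = r / (c * s)
            w = np.where(np.abs(u) <= 1.0, 1.0, 1.0 / np.abs(u))  # Huber weights
            Xw = X * w[:, None]
            beta = np.linalg.solve(Xw.T @ X, Xw.T @ y)        # weighted LS step
        return beta

    rng = np.random.default_rng(0)
    n = 200
    x = rng.uniform(0.0, 10.0, n)
    X = np.column_stack([np.ones(n), x])
    y = 2.0 + 3.0 * x + rng.normal(0.0, 0.5, n)
    y[:20] += 30.0                     # 10% gross outliers (artifact-like)

    beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
    beta_rob = huber_irls(X, y)
    ```

    OLS is dragged visibly toward the contaminated points, while the Huber fit downweights them and stays near the true coefficients (2, 3), which mirrors the paper's argument for robust procedures in large, artifact-prone cohorts.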

  19. Nonlinear estimation of parameters in biphasic Arrhenius plots.

    PubMed

    Puterman, M L; Hrboticky, N; Innis, S M

    1988-05-01

    This paper presents a formal procedure for the statistical analysis of data on the thermotropic behavior of membrane-bound enzymes generated using the Arrhenius equation and compares the analysis to several alternatives. Data is modeled by a bent hyperbola. Nonlinear regression is used to obtain estimates and standard errors of the intersection of line segments, defined as the transition temperature, and slopes, defined as energies of activation of the enzyme reaction. The methodology allows formal tests of the adequacy of a biphasic model rather than either a single straight line or a curvilinear model. Examples on data concerning the thermotropic behavior of pig brain synaptosomal acetylcholinesterase are given. The data support the biphasic temperature dependence of this enzyme. The methodology represents a formal procedure for statistical validation of any biphasic data and allows for calculation of all line parameters with estimates of precision.
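    In its sharp-corner limit, the bent hyperbola reduces to a broken-stick (two-segment) fit: for any candidate transition point the model is linear in its remaining parameters, so the transition temperature can be profiled out by grid search. A sketch on synthetic Arrhenius-style data, with all numbers invented:

    ```python
    import numpy as np

    def fit_broken_stick(x, y, grid):
        """Continuous two-segment linear fit; profiles the breakpoint over a grid."""
        best = (np.inf, None, None)
        for x0 in grid:
            X = np.column_stack([np.ones_like(x),
                                 np.minimum(x - x0, 0.0),   # slope left of x0
                                 np.maximum(x - x0, 0.0)])  # slope right of x0
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            sse = float(np.sum((y - X @ beta) ** 2))
            if sse < best[0]:
                best = (sse, x0, beta)
        return best

    rng = np.random.default_rng(0)
    x = np.linspace(3.2, 3.6, 60)                  # e.g. 1000/T
    y_true = 1.0 - 5.0 * np.minimum(x - 3.4, 0.0) - 12.0 * np.maximum(x - 3.4, 0.0)
    y = y_true + rng.normal(0.0, 0.02, x.size)     # ln(activity) with noise

    grid = np.arange(3.30, 3.501, 0.005)
    sse, x0_hat, beta = fit_broken_stick(x, y, grid)
    # beta[1], beta[2] are the two slopes (proportional to activation energies);
    # x0_hat estimates the transition temperature (here in 1000/T units).
    ```

    Comparing this model's SSE against a single straight line (one fewer parameter) by an F-test is the kind of formal adequacy check the abstract describes; standard errors for the breakpoint would come from the nonlinear-regression machinery rather than this grid profile.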

  20. Use of fractional factorial design for optimization of digestion procedures followed by multi-element determination of essential and non-essential elements in nuts using ICP-OES technique.

    PubMed

    Momen, Awad A; Zachariadis, George A; Anthemidis, Aristidis N; Stratis, John A

    2007-01-15

    Two digestion procedures have been tested on nut samples for application in the determination of essential (Cr, Cu, Fe, Mg, Mn, Zn) and non-essential (Al, Ba, Cd, Pb) elements by inductively coupled plasma-optical emission spectrometry (ICP-OES). These included wet digestions with HNO(3)/H(2)SO(4) and HNO(3)/H(2)SO(4)/H(2)O(2). The latter is recommended for better analyte recoveries (relative error <11%). Two calibration procedures (aqueous standard and standard addition) were studied, and standard addition proved preferable for all analytes. Experimental designs for seven factors (HNO(3), H(2)SO(4) and H(2)O(2) volumes, digestion time, pre-digestion time, temperature of the hot plate and sample weight) were used for optimization of sample digestion procedures. For this purpose a Plackett-Burman fractional factorial design, which involves eight experiments, was adopted. The factors HNO(3) and H(2)O(2) volume, and the digestion time, were found to be the most important parameters. The instrumental conditions were also optimized (using peanut matrix rather than aqueous standard solutions) considering radio-frequency (rf) incident power, nebulizer argon gas flow rate and sample uptake flow rate. The analytical performance, such as limits of detection (LOD < 0.74 μg g(-1)), precision of the overall procedures (relative standard deviation between 2.0 and 8.2%) and accuracy (relative errors between 0.4 and 11%), was assessed statistically to evaluate the developed analytical procedures. The good agreement between measured and certified values for all analytes (relative error <11%) with respect to IAEA-331 (spinach leaves) and IAEA-359 (cabbage) indicates that the developed analytical method is well suited for further studies on the fate of major elements in nuts and possibly similar matrices.
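    The eight-run Plackett-Burman design for seven two-level factors can be generated from the standard cyclic generator. Each factor column is balanced and mutually orthogonal, which is what allows main effects of all seven digestion factors to be screened in only eight experiments:

    ```python
    import numpy as np

    # Standard 8-run Plackett-Burman generator (+1 = high level, -1 = low level)
    gen = np.array([1, 1, 1, -1, 1, -1, -1])

    # Seven rows are cyclic shifts of the generator; the eighth is all-low
    design = np.vstack([np.roll(gen, i) for i in range(7)] + [-np.ones(7, int)])

    # Rows = experiments, columns = factors (e.g. HNO3 volume, H2SO4 volume,
    # H2O2 volume, digestion time, pre-digestion time, temperature, sample mass).
    # Orthogonality means X^T X = 8 I, so each main effect is estimated
    # independently as the mean response at +1 minus the mean at -1.
    ```

    A main effect is then just `design[:, j] @ responses / 4` for factor j, under the usual screening assumption that interactions are negligible.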

  1. Using the Bootstrap Method to Evaluate the Critical Range of Misfit for Polytomous Rasch Fit Statistics.

    PubMed

    Seol, Hyunsoo

    2016-06-01

    The purpose of this study was to apply the bootstrap procedure to evaluate how the bootstrapped confidence intervals (CIs) for polytomous Rasch fit statistics might differ according to sample sizes and test lengths in comparison with the rule-of-thumb critical value of misfit. A total of 25 simulated data sets were generated to fit the Rasch measurement and then a total of 1,000 replications were conducted to compute the bootstrapped CIs under each of 25 testing conditions. The results showed that rule-of-thumb critical values for assessing the magnitude of misfit were not applicable because the infit and outfit mean square error statistics showed different magnitudes of variability over testing conditions and the standardized fit statistics did not exactly follow the standard normal distribution. Further, they also do not share the same critical range for the item and person misfit. Based on the results of the study, the bootstrapped CIs can be used to identify misfitting items or persons as they offer a reasonable alternative solution, especially when the distributions of the infit and outfit statistics are not well known and depend on sample size. © The Author(s) 2016.
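    The bootstrap logic generalizes readily: resample the data with replacement, recompute the fit statistic each time, and read the CI from the percentiles of the replicates. A generic sketch follows; an actual infit or outfit mean square computed from Rasch residuals would replace the toy statistic used here:

    ```python
    import numpy as np

    def bootstrap_ci(stat, data, n_boot=2000, alpha=0.05, seed=0):
        """Percentile bootstrap confidence interval for an arbitrary statistic."""
        rng = np.random.default_rng(seed)
        n = len(data)
        reps = np.array([stat(data[rng.integers(0, n, n)]) for _ in range(n_boot)])
        return np.quantile(reps, [alpha / 2, 1 - alpha / 2])

    # Toy stand-in for a mean-square fit statistic: the mean of squared
    # standardized residuals, which hovers near 1 when the model fits.
    rng = np.random.default_rng(1)
    sq_resid = rng.normal(0.0, 1.0, 200) ** 2
    lo, hi = bootstrap_ci(np.mean, sq_resid, n_boot=2000)
    # A fit statistic outside (lo, hi) would be flagged as misfit, instead of
    # being compared against a fixed rule-of-thumb cutoff.
    ```

    Because the interval is rebuilt from the data at hand, it automatically reflects the sample-size and test-length dependence that the study shows fixed cutoffs ignore.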

  2. Protocol for monitoring metals in Ozark National Scenic Riverways, Missouri: Version 1.0

    USGS Publications Warehouse

    Schmitt, Christopher J.; Brumbaugh, William G.; Besser, John M.; Hinck, Jo Ellen; Bowles, David E.; Morrison, Lloyd W.; Williams, Michael H.

    2008-01-01

    The National Park Service is developing a monitoring plan for the Ozark National Scenic Riverways in southeastern Missouri. Because of concerns about the release of lead, zinc, and other metals from lead-zinc mining to streams, the monitoring plan will include mining-related metals. After considering a variety of alternatives, the plan will consist of measuring the concentrations of cadmium, cobalt, lead, nickel, and zinc in composite samples of crayfish (Orconectes luteus or alternate species) and Asian clam (Corbicula fluminea) collected periodically from selected sites. This document, which comprises a protocol narrative and supporting standard operating procedures, describes the methods to be employed prior to, during, and after collection of the organisms, along with procedures for their chemical analysis and quality assurance; statistical analysis, interpretation, and reporting of the data; and for modifying the protocol narrative and supporting standard operating procedures. A list of supplies and equipment, data forms, and sample labels are also included. An example based on data from a pilot study is presented.

  3. The use of standard operating procedures in day case anterior cruciate ligament reconstruction.

    PubMed

    Khan, T; Jackson, W F; Beard, D J; Marfin, A; Ahmad, M; Spacie, R; Jones, R; Howes, S; Barker, K; Price, A J

    2012-08-01

    The current rate of day-case anterior cruciate ligament reconstruction (ACLR) in the UK remains low. Although specialised care pathways with standard operating procedures (SOPs) have been effective in reducing length of stay following some surgical procedures, this has not been previously reported for ACLR. We evaluate the effectiveness of SOPs for establishing day-case ACLR in a specialist unit. Fifty patients undergoing ACLR between May and September 2010 were studied prospectively ("study group"). SOPs were designed for pre-operative assessment, anaesthesia, surgical procedure, mobilisation and discharge. We evaluated length of stay, readmission rates, patient satisfaction and compliance to SOPs. A retrospective analysis of 50 patients who underwent ACLR prior to implementation of the day-case pathway was performed ("standard practice group"). Eighty percent of patients in the study group were discharged on the day of surgery (mean length of stay=5.3h) compared to 16% in the standard practice group (mean length of stay=21.6h). This difference was statistically significant (p<0.05, Mann-Whitney U test). All patients were satisfied with the day case pathway. Ninety-two percent of the study group were discharged on the day of surgery when all SOPs were followed and 46% where they were not. High rates of day-case ACLR with excellent patient satisfaction can be achieved with the use of a specialised patient pathway with SOPs. Copyright © 2011 Elsevier B.V. All rights reserved.

  4. Statistical prediction with Kanerva's sparse distributed memory

    NASA Technical Reports Server (NTRS)

    Rogers, David

    1989-01-01

    A new viewpoint of the processing performed by Kanerva's sparse distributed memory (SDM) is presented. In conditions of near- or over-capacity, where the associative-memory behavior of the model breaks down, the processing performed by the model can be interpreted as that of a statistical predictor. Mathematical results are presented which serve as the framework for a new statistical viewpoint of sparse distributed memory and for which the standard formulation of SDM is a special case. This viewpoint suggests possible enhancements to the SDM model, including a procedure for improving the predictiveness of the system based on Holland's work with genetic algorithms, and a method for improving the capacity of SDM even when used as an associative memory.

  5. Similar range of motion and function after resurfacing large-head or standard total hip arthroplasty

    PubMed Central

    2013-01-01

    Background and purpose Large-size hip articulations may improve range of motion (ROM) and function compared to a 28-mm THA, and the low risk of dislocation allows the patients more activity postoperatively. On the other hand, the greater extent of surgery for resurfacing hip arthroplasty (RHA) could impair rehabilitation. We investigated the effect of head size and surgical procedure on postoperative rehabilitation in a randomized clinical trial (RCT). Methods We followed randomized groups of RHAs, large-head THAs and standard THAs at 2 months, 6 months, 1 and 2 years postoperatively, recording clinical rehabilitation parameters. Results Large articulations increased the mean total range of motion by 13° during the first 6 postoperative months. The increase was not statistically significant and was transient. The 2-year total ROM (SD) for RHA, standard THA, and large-head THA was 221° (35), 232° (36), and 225° (30) respectively, but the differences were not statistically significant. The 3 groups were similar regarding Harris hip score, UCLA activity score, step rate, and sick leave. Interpretation Head size had no influence on range of motion. The lack of restriction allowed for large articulations did not improve the clinical and patient-perceived outcomes. The more extensive surgical procedure of RHA did not impair the rehabilitation. This project is registered at ClinicalTrials.gov under # NCT01113762. PMID:23530872

  6. Empirical best linear unbiased prediction method for small areas with restricted maximum likelihood and bootstrap procedure to estimate the average of household expenditure per capita in Banjar Regency

    NASA Astrophysics Data System (ADS)

    Aminah, Agustin Siti; Pawitan, Gandhi; Tantular, Bertho

    2017-03-01

    So far, most of the data published by Statistics Indonesia (BPS) as the provider of national statistics are still limited to the district level. Sample sizes at smaller area levels are insufficient, so direct estimation of poverty indicators produces high standard errors, and analysis based on it is unreliable. To solve this problem, an estimation method which can provide better accuracy by combining survey data and other auxiliary data is required. One method often used for this estimation is Small Area Estimation (SAE). There are many methods used in SAE; one of them is Empirical Best Linear Unbiased Prediction (EBLUP). The EBLUP method with the maximum likelihood (ML) procedure does not consider the loss of degrees of freedom due to estimating β with β̂. This drawback motivates the use of the restricted maximum likelihood (REML) procedure. This paper proposes EBLUP with the REML procedure for estimating poverty indicators by modeling the average of household expenditures per capita, and implements a bootstrap procedure to calculate the MSE (mean square error) in order to compare the accuracy of the EBLUP method with the direct estimation method. Results show that the EBLUP method reduced the MSE in small area estimation.
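    For an area-level (Fay-Herriot-type) model, the EBLUP is a precision-weighted compromise between each area's direct survey estimate and a regression-synthetic estimate. A sketch with invented data, where the variance component that REML would supply is simply plugged in:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    m = 12                                             # number of small areas
    X = np.column_stack([np.ones(m), rng.uniform(0.0, 1.0, m)])  # auxiliary data
    beta_true = np.array([2.0, 1.5])
    theta = X @ beta_true + rng.normal(0.0, 0.3, m)    # true area means
    D = rng.uniform(0.05, 0.4, m)                      # known sampling variances
    y = theta + rng.normal(0.0, np.sqrt(D))            # direct (survey) estimates

    sigma_v2 = 0.09       # plug-in random-effect variance (as if from REML)
    gamma = sigma_v2 / (sigma_v2 + D)                  # shrinkage weights in (0,1)

    # GLS estimate of beta, then EBLUP = gamma*direct + (1-gamma)*synthetic
    W = 1.0 / (sigma_v2 + D)
    beta = np.linalg.solve(X.T @ (X * W[:, None]), X.T @ (W * y))
    synthetic = X @ beta
    eblup = gamma * y + (1.0 - gamma) * synthetic
    ```

    Areas with noisy direct estimates (large D, small γ) borrow more strength from the regression; a parametric bootstrap around this model is one way to approximate the MSE comparison the paper performs.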

  7. Advanced Combat Helmet Technical Assessment

    DTIC Science & Technology

    2013-05-29

    Lastly, we assessed the participation of various stakeholders and industry experts such as active ACH manufacturers and test facilities. Findings... industrially accepted American National Standards Institute (ANSI Z1.4-2008) sampling... statistically principled approach and the lot acceptance test protocol adopts a widely established and industrially accepted sampling procedure. We

  8. Assessment of variations in thermal cycle life data of thermal barrier coated rods

    NASA Astrophysics Data System (ADS)

    Hendricks, R. C.; McDonald, G.

    An analysis of thermal cycle life data for 22 thermal barrier coated (TBC) specimens was conducted. The ZrO2-8Y2O3/NiCrAlY plasma spray coated Rene 41 rods were tested in a Mach 0.3 Jet A/air burner flame. All specimens were subjected to the same coating and subsequent test procedures in an effort to control three parametric groups: material properties, geometry, and heat flux. Statistically, the data sample space had a mean of 1330 cycles with a standard deviation of 520 cycles. The data were described by normal or log-normal distributions, but other models could also apply; the sample size must be increased to clearly delineate a statistical failure model. The statistical methods were also applied to adhesive/cohesive strength data for 20 TBC discs of the same composition, with similar results. The sample space had a mean of 9 MPa with a standard deviation of 4.2 MPa.

  9. Assessment of variations in thermal cycle life data of thermal barrier coated rods

    NASA Technical Reports Server (NTRS)

    Hendricks, R. C.; Mcdonald, G.

    1981-01-01

    An analysis of thermal cycle life data for 22 thermal barrier coated (TBC) specimens was conducted. The ZrO2-8Y2O3/NiCrAlY plasma spray coated Rene 41 rods were tested in a Mach 0.3 Jet A/air burner flame. All specimens were subjected to the same coating and subsequent test procedures in an effort to control three parametric groups: material properties, geometry, and heat flux. Statistically, the data sample space had a mean of 1330 cycles with a standard deviation of 520 cycles. The data were described by normal or log-normal distributions, but other models could also apply; the sample size must be increased to clearly delineate a statistical failure model. The statistical methods were also applied to adhesive/cohesive strength data for 20 TBC discs of the same composition, with similar results. The sample space had a mean of 9 MPa with a standard deviation of 4.2 MPa.

  10. A comparison of linear and nonlinear statistical techniques in performance attribution.

    PubMed

    Chan, N H; Genovese, C R

    2001-01-01

    Performance attribution is usually conducted under the linear framework of multifactor models. Although commonly used by practitioners in finance, linear multifactor models are known to be less than satisfactory in many situations. After a brief survey of nonlinear methods, nonlinear statistical techniques are applied to performance attribution of a portfolio constructed from a fixed universe of stocks using factors derived from some commonly used cross sectional linear multifactor models. By rebalancing this portfolio monthly, the cumulative returns for procedures based on standard linear multifactor model and three nonlinear techniques-model selection, additive models, and neural networks-are calculated and compared. It is found that the first two nonlinear techniques, especially in combination, outperform the standard linear model. The results in the neural-network case are inconclusive because of the great variety of possible models. Although these methods are more complicated and may require some tuning, toolboxes are developed and suggestions on calibration are proposed. This paper demonstrates the usefulness of modern nonlinear statistical techniques in performance attribution.

  11. A statistically robust EEG re-referencing procedure to mitigate reference effect

    PubMed Central

    Lepage, Kyle Q.; Kramer, Mark A.; Chu, Catherine J.

    2014-01-01

    Background The electroencephalogram (EEG) remains the primary tool for diagnosis of abnormal brain activity in clinical neurology and for in vivo recordings of human neurophysiology in neuroscience research. In EEG data acquisition, voltage is measured at positions on the scalp with respect to a reference electrode. When this reference electrode responds to electrical activity or artifact all electrodes are affected. Successful analysis of EEG data often involves re-referencing procedures that modify the recorded traces and seek to minimize the impact of reference electrode activity upon functions of the original EEG recordings. New method We provide a novel, statistically robust procedure that adapts a robust maximum-likelihood type estimator to the problem of reference estimation, reduces the influence of neural activity from the re-referencing operation, and maintains good performance in a wide variety of empirical scenarios. Results The performance of the proposed and existing re-referencing procedures are validated in simulation and with examples of EEG recordings. To facilitate this comparison, channel-to-channel correlations are investigated theoretically and in simulation. Comparison with existing methods The proposed procedure avoids using data contaminated by neural signal and remains unbiased in recording scenarios where physical references, the common average reference (CAR) and the reference estimation standardization technique (REST) are not optimal. Conclusion The proposed procedure is simple, fast, and avoids the potential for substantial bias when analyzing low-density EEG data. PMID:24975291
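    The intuition behind robust reference estimation can be sketched with a median in place of the paper's M-estimator: at each sample, a robust center of the channels estimates the shared reference signal, so a few channels carrying large focal activity do not drag the estimate the way a common average would. This is a simplified illustration, not the published procedure:

    ```python
    import numpy as np

    def robust_rereference(eeg):
        """eeg: (n_channels, n_samples). Subtract a per-sample robust (median)
        estimate of the shared reference signal from every channel."""
        ref = np.median(eeg, axis=0)
        return eeg - ref

    rng = np.random.default_rng(0)
    n_ch, n_t = 16, 2000
    artifact = 5.0 * rng.normal(size=n_t)            # reference contamination
    eeg = rng.normal(size=(n_ch, n_t)) + artifact    # all channels affected
    eeg[:2] += 8.0 * rng.normal(size=(2, n_t))       # two channels, big activity

    clean = robust_rereference(eeg)
    ```

    The median keeps its breakdown robustness when a minority of channels carry large neural signal; the paper's M-estimator refines this idea with smooth downweighting rather than the median's hard rank-based choice.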

  12. Abnormal cortical sources of resting state electroencephalographic rhythms in single treatment-naïve HIV individuals: A statistical z-score index.

    PubMed

    Babiloni, Claudio; Pennica, Alfredo; Del Percio, Claudio; Noce, Giuseppe; Cordone, Susanna; Muratori, Chiara; Ferracuti, Stefano; Donato, Nicole; Di Campli, Francesco; Gianserra, Laura; Teti, Elisabetta; Aceti, Antonio; Soricelli, Andrea; Viscione, Magdalena; Limatola, Cristina; Andreoni, Massimo; Onorati, Paolo

    2016-03-01

    This study tested a simple statistical procedure to recognize single treatment-naïve HIV individuals having abnormal cortical sources of resting state delta (<4 Hz) and alpha (8-13 Hz) electroencephalographic (EEG) rhythms with reference to a control group of sex-, age-, and education-matched healthy individuals. Compared to the HIV individuals with a statistically normal EEG marker, those with abnormal values were expected to show worse cognitive status. Resting state eyes-closed EEG data were recorded in 82 treatment-naïve HIV individuals (39.8 years ± 1.2 standard error of the mean, SE) and 59 age-matched cognitively healthy subjects (39 years ± 2.2 SE). Low-resolution brain electromagnetic tomography (LORETA) estimated delta and alpha sources in frontal, central, temporal, parietal, and occipital cortical regions. The ratio of the activity of parietal delta and high-frequency alpha sources (the EEG marker) showed the maximum difference between the healthy and the treatment-naïve HIV group. The z-score of the EEG marker was statistically abnormal in 47.6% of treatment-naïve HIV individuals with reference to the healthy group (p<0.05). Compared to the HIV individuals with a statistically normal EEG marker, those with abnormal values exhibited a lower mini mental state examination (MMSE) score, higher CD4 count, and lower viral load (p<0.05). This statistical procedure permitted, for the first time, the identification of single treatment-naïve HIV individuals having abnormal EEG activity. This procedure might enrich the detection and monitoring of effects of HIV on brain function in single treatment-naïve HIV individuals. Copyright © 2015 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
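    The single-subject classification rule reduces to a standard z-score against the control distribution. A sketch with simulated numbers (the marker values below are invented, not the study's LORETA data):

    ```python
    import numpy as np

    def z_index(value, controls):
        """Z-score of one subject's EEG marker against the control group."""
        mu, sd = controls.mean(), controls.std(ddof=1)
        return (value - mu) / sd

    rng = np.random.default_rng(7)
    controls = rng.normal(1.0, 0.25, 59)  # marker in 59 healthy subjects (simulated)
    patient_marker = 1.9                  # hypothetical delta/alpha source ratio

    z = z_index(patient_marker, controls)
    abnormal = z > 1.645                  # one-sided threshold, p < 0.05
    ```

    The one-sided cutoff matches the directional hypothesis that disease shifts the delta/alpha ratio upward; a two-sided criterion would use |z| > 1.96 instead.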

  13. Using accelerated life testing procedures to compare the relative sensitivity of rainbow trout and the federally listed threatened bull trout to three commonly used rangeland herbicides (picloram, 2,4-D, and clopyralid).

    PubMed

    Fairchild, James F; Allert, Ann; Sappington, Linda S; Nelson, Karen J; Valle, Janet

    2008-03-01

We conducted 96-h static acute toxicity studies to evaluate the relative sensitivity of juveniles of the threatened bull trout (Salvelinus confluentus) and the standard cold-water surrogate rainbow trout (Oncorhynchus mykiss) to three rangeland herbicides commonly used for controlling invasive weeds in the northwestern United States. Relative species sensitivity was compared using three procedures: standard acute toxicity testing, fractional estimates of lethal concentrations, and accelerated life testing chronic estimation procedures. The acutely lethal concentrations (ALC) resulting in 50% mortality at 96 h (96-h ALC50s) were determined using linear regression and indicated that the three herbicides were toxic in the order picloram acid > 2,4-D acid > clopyralid acid. The 96-h ALC50 values for rainbow trout were as follows: picloram, 41 mg/L; 2,4-D, 707 mg/L; and clopyralid, 700 mg/L. The 96-h ALC50 values for bull trout were as follows: picloram, 24 mg/L; 2,4-D, 398 mg/L; and clopyralid, 802 mg/L. Fractional estimates of safe concentrations, based on 5% of the 96-h ALC50, were conservative relative to regression-derived 96-h ALC5 values by an order of magnitude (i.e., they overestimated toxicity). Accelerated life testing procedures were used to estimate chronic lethal concentrations (CLC) resulting in 1% mortality at 30 d (30-d CLC1) for the three herbicides: picloram (1 mg/L rainbow trout, 5 mg/L bull trout), 2,4-D (56 mg/L rainbow trout, 84 mg/L bull trout), and clopyralid (477 mg/L rainbow trout; 552 mg/L bull trout). Collectively, the results indicated that the standard surrogate rainbow trout is similar in sensitivity to bull trout. Accelerated life testing procedures provided cost-effective, statistically defensible methods for estimating safe chronic concentrations (30-d CLC1s) of herbicides from acute toxicity data because they use statistical models based on the entire mortality:concentration:time data matrix.
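The regression-based lethal-concentration estimation mentioned above can be sketched in simplified form: regress mortality fraction on log10(concentration) and solve for the concentration giving 50% mortality. This is a stand-in for the study's actual regression model, and the dose-response data below are hypothetical.

```python
import numpy as np

def lc50_linear(conc_mg_l, mortality_frac):
    """Estimate the concentration producing 50% mortality by linear
    regression of mortality fraction on log10(concentration).
    Simplified illustration; not the study's exact model."""
    x = np.log10(conc_mg_l)
    slope, intercept = np.polyfit(x, mortality_frac, 1)
    return 10 ** ((0.5 - intercept) / slope)

# Hypothetical 96-h dose-response data (mg/L, fraction dead)
conc = np.array([10.0, 20.0, 40.0, 80.0])
dead = np.array([0.05, 0.25, 0.55, 0.90])
print(f"96-h LC50 ~ {lc50_linear(conc, dead):.0f} mg/L")
```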

  14. Using accelerated life testing procedures to compare the relative sensitivity of rainbow trout and the federally listed threatened bull trout to three commonly used rangeland herbicides (picloram, 2,4-D, and clopyralid)

    USGS Publications Warehouse

    Fairchild, J.F.; Allert, A.; Sappington, L.S.; Nelson, K.J.; Valle, J.

    2008-01-01

We conducted 96-h static acute toxicity studies to evaluate the relative sensitivity of juveniles of the threatened bull trout (Salvelinus confluentus) and the standard cold-water surrogate rainbow trout (Oncorhynchus mykiss) to three rangeland herbicides commonly used for controlling invasive weeds in the northwestern United States. Relative species sensitivity was compared using three procedures: standard acute toxicity testing, fractional estimates of lethal concentrations, and accelerated life testing chronic estimation procedures. The acutely lethal concentrations (ALC) resulting in 50% mortality at 96 h (96-h ALC50s) were determined using linear regression and indicated that the three herbicides were toxic in the order picloram acid > 2,4-D acid > clopyralid acid. The 96-h ALC50 values for rainbow trout were as follows: picloram, 41 mg/L; 2,4-D, 707 mg/L; and clopyralid, 700 mg/L. The 96-h ALC50 values for bull trout were as follows: picloram, 24 mg/L; 2,4-D, 398 mg/L; and clopyralid, 802 mg/L. Fractional estimates of safe concentrations, based on 5% of the 96-h ALC50, were conservative relative to regression-derived 96-h ALC5 values by an order of magnitude (i.e., they overestimated toxicity). Accelerated life testing procedures were used to estimate chronic lethal concentrations (CLC) resulting in 1% mortality at 30 d (30-d CLC1) for the three herbicides: picloram (1 mg/L rainbow trout, 5 mg/L bull trout), 2,4-D (56 mg/L rainbow trout, 84 mg/L bull trout), and clopyralid (477 mg/L rainbow trout; 552 mg/L bull trout). Collectively, the results indicated that the standard surrogate rainbow trout is similar in sensitivity to bull trout. Accelerated life testing procedures provided cost-effective, statistically defensible methods for estimating safe chronic concentrations (30-d CLC1s) of herbicides from acute toxicity data because they use statistical models based on the entire mortality:concentration:time data matrix. © 2008 SETAC.

  15. Impact of operator on determining functional parameters of nuclear medicine procedures.

    PubMed

    Mohammed, A M; Naddaf, S Y; Mahdi, F S; Al-Mutawa, Q I; Al-Dossary, H A; Elgazzar, A H

    2006-01-01

The study was designed to assess the significance of interoperator variability in the estimation of functional parameters for four nuclear medicine procedures. Three nuclear medicine technologists with varying years of experience each processed 20 randomly selected cases with diverse function levels from each of the following study types: renography, renal cortical scans, myocardial perfusion gated single-photon emission computed tomography (MP-GSPECT) and gated blood pool ventriculography (GBPV). The technologists used the same standard processing routines and were blinded to each other's results. The means of the values and the means of the case-by-case differences were statistically analyzed by one-way ANOVA. The values were further analyzed using Pearson correlation. The ranges of the mean values and standard deviations obtained by the three technologists were 50.65 +/- 3.9 to 50.92 +/- 4.4% relative renal function for renography, 51.43 +/- 8.4 to 51.55 +/- 8.8% relative renal function for renal cortical scans, 57.40 +/- 14.3 to 58.30 +/- 14.9% left ventricular ejection fraction from MP-GSPECT and 54.80 +/- 12.8 to 55.10 +/- 13.1% from GBPV. The difference was not statistically significant, p > 0.9. The values showed a high correlation of more than 0.95. Calculated case by case, the mean of differences +/- SD ranged from 0.42 +/- 0.36% in renal cortical scans to 1.35 +/- 0.87% in MP-GSPECT, with a maximum difference of 4.00%. The difference was not statistically significant, p > 0.19. The estimated functional parameters were reproducible and operator independent as long as the standard processing instructions were followed. Copyright 2006 S. Karger AG, Basel.
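The analysis described — one-way ANOVA across the operators' value sets plus Pearson correlation between operators — can be sketched with scipy. The simulated ejection-fraction data below are hypothetical stand-ins for the study's 20 cases.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical LVEF values for 20 cases, as read by 3 operators:
# a common per-case "truth" plus small operator-specific noise.
truth = rng.normal(57.5, 14.5, size=20)
operators = [truth + rng.normal(0, 1.0, size=20) for _ in range(3)]

# One-way ANOVA: do the three operators' mean values differ?
f_stat, p_anova = stats.f_oneway(*operators)
# Pearson correlation between a pair of operators.
r12, _ = stats.pearsonr(operators[0], operators[1])
print(f"ANOVA p = {p_anova:.2f}, r(op1, op2) = {r12:.3f}")
```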

  16. Standard reference water samples for rare earth element determinations

    USGS Publications Warehouse

    Verplanck, P.L.; Antweiler, Ronald C.; Nordstrom, D. Kirk; Taylor, Howard E.

    2001-01-01

Standard reference water samples (SRWS) were collected from two mine sites, one near Ophir, CO, USA and the other near Redding, CA, USA. The samples were filtered, preserved, and analyzed for rare earth element (REE) concentrations (La, Ce, Pr, Nd, Sm, Eu, Gd, Tb, Dy, Ho, Er, Tm, Yb, and Lu) by inductively coupled plasma-mass spectrometry (ICP-MS). These two samples were acid mine waters with elevated concentrations of REEs (0.45-161 µg/L). Seventeen international laboratories participated in a 'round-robin' chemical analysis program, which made it possible to evaluate the data by robust statistical procedures that are insensitive to outliers. The resulting most probable values are reported. Ten to 15 of the participants also reported values for Ba, Y, and Sc. Field parameters, major ion, and other trace element concentrations, not subject to statistical evaluation, are provided.
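One common outlier-insensitive way to derive a consensus value from round-robin data is a median with a MAD-based robust spread. The abstract does not specify the SRWS program's exact robust procedure, so this is only an illustrative sketch, with hypothetical lab results.

```python
import numpy as np

def most_probable_value(lab_results):
    """Robust consensus for interlaboratory data: the median, with a
    MAD-based robust standard deviation. Illustrative only; the SRWS
    program's actual robust procedure is not specified in the abstract."""
    x = np.asarray(lab_results, dtype=float)
    med = np.median(x)
    mad = np.median(np.abs(x - med))
    return med, 1.4826 * mad  # 1.4826 scales MAD to sigma for normal data

# 17 hypothetical lab results for one REE (ug/L), one gross outlier
labs = [12.1, 12.4, 11.9, 12.0, 12.3, 12.2, 11.8, 12.5, 12.1,
        12.0, 12.2, 11.9, 12.4, 12.1, 12.3, 12.0, 45.0]
mpv, robust_sd = most_probable_value(labs)
print(f"most probable value = {mpv:.1f} ug/L (robust s = {robust_sd:.2f})")
```

The outlier (45.0) barely shifts the median, whereas it would drag an ordinary mean upward.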

  17. Species-specific diagnostic assays for Bonamia ostreae and B. exitiosa in European flat oyster Ostrea edulis: conventional, real-time and multiplex PCR.

    PubMed

    Ramilo, Andrea; Navas, J Ignacio; Villalba, Antonio; Abollo, Elvira

    2013-05-27

    Bonamia ostreae and B. exitiosa have caused mass mortalities of various oyster species around the world and co-occur in some European areas. The World Organisation for Animal Health (OIE) has included infections with both species in the list of notifiable diseases. However, official methods for species-specific diagnosis of either parasite have certain limitations. In this study, new species-specific conventional PCR (cPCR) and real-time PCR techniques were developed to diagnose each parasite species. Moreover, a multiplex PCR method was designed to detect both parasites in a single assay. The analytical sensitivity and specificity of each new method were evaluated. These new procedures were compared with 2 OIE-recommended methods, viz. standard histology and PCR-RFLP. The new procedures showed higher sensitivity than the OIE recommended ones for the diagnosis of both species. The sensitivity of tests with the new primers was higher using oyster gills and gonad tissue, rather than gills alone. The lack of a 'gold standard' prevented accurate estimation of sensitivity and specificity of the new methods. The implementation of statistical tools (maximum likelihood method) for the comparison of the diagnostic tests showed the possibility of false positives with the new procedures, although the absence of a gold standard precluded certainty. Nevertheless, all procedures showed negative results when used for the analysis of oysters from a Bonamia-free area.

  18. Comparison of Quadrapolar™ radiofrequency lesions produced by standard versus modified technique: an experimental model.

    PubMed

    Safakish, Ramin

    2017-01-01

Lower back pain (LBP) is a global public health issue and is associated with substantial financial costs and loss of quality of life. Published estimates of the causes of back pain vary; the figure that best matches our patient population is that sacroiliac (SI) joint pain is responsible for LBP in 18%-30% of individuals with LBP. Quadrapolar™ radiofrequency ablation, which involves ablation of the nerves of the SI joint using heat, is a commonly used treatment for SI joint pain. However, the standard Quadrapolar radiofrequency procedure is not always effective at ablating all the sensory nerves that cause the pain in the SI joint. One of its major limitations is that it produces small lesions, ~4 mm in diameter; smaller lesions increase the likelihood of failure to ablate all nociceptive input. In this study, we compare the standard Quadrapolar radiofrequency ablation technique to a modified Quadrapolar ablation technique that has produced improved patient outcomes in our clinic. The methodologies of the two techniques are compared, and we report results from an experimental model comparing the lesion sizes produced by the two techniques. Taken together, the findings from this study suggest that the modified Quadrapolar technique provides longer-lasting relief for back pain caused by SI joint dysfunction. A randomized controlled clinical trial is the next step required to quantify the difference in symptom relief and quality of life produced by the two techniques.

  19. On Statistical Analysis of Neuroimages with Imperfect Registration

    PubMed Central

    Kim, Won Hwa; Ravi, Sathya N.; Johnson, Sterling C.; Okonkwo, Ozioma C.; Singh, Vikas

    2016-01-01

A variety of studies in neuroscience/neuroimaging seek to perform statistical inference on acquired brain image scans for diagnosis as well as for understanding the pathological manifestation of diseases. To do so, an important first step is to register (or co-register) all of the image data into a common coordinate system. This permits meaningful comparison of the intensities at each voxel across groups (e.g., diseased versus healthy) to evaluate the effects of the disease and/or use machine learning algorithms in a subsequent step. But errors in the underlying registration make this problematic: they either decrease the statistical power or make the follow-up inference tasks less effective/accurate. In this paper, we derive a novel algorithm which offers immunity to local errors in the underlying deformation field obtained from registration procedures. By deriving a deformation-invariant representation of the image, the downstream analysis can be made more robust, as if one had access to a (hypothetical) far superior registration procedure. Our algorithm is based on recent work on the scattering transform. Using this as a starting point, we show how results from harmonic analysis (especially non-Euclidean wavelets) yield strategies for designing deformation- and additive-noise-invariant representations of large 3-D brain image volumes. We present a set of results on synthetic and real brain images where we achieve robust statistical analysis even in the presence of substantial deformation errors; here, standard analysis procedures significantly under-perform and fail to identify the true signal. PMID:27042168
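The core idea of a first-order scattering representation — take the modulus of a wavelet convolution, then low-pass average so small translations/deformations barely change the output — can be shown in a toy 1-D form. This is only a simplified analogue of the paper's method, which operates on 3-D volumes with non-Euclidean wavelets.

```python
import numpy as np

def scattering_1d(x, wavelet, pool=16):
    """First-order scattering coefficients of a 1-D signal:
    |x * wavelet| followed by local averaging. The pooling step is
    what buys (approximate) invariance to small deformations."""
    detail = np.abs(np.convolve(x, wavelet, mode="same"))
    n = len(detail) // pool * pool
    return detail[:n].reshape(-1, pool).mean(axis=1)

rng = np.random.default_rng(2)
sig = rng.normal(size=256)
k = np.arange(-8, 9)
morlet = np.real(np.exp(1j * 2 * np.pi * 0.2 * k) * np.exp(-k**2 / 16))

s_a = scattering_1d(sig, morlet)
s_b = scattering_1d(np.roll(sig, 2), morlet)  # slightly shifted copy
print("coefficient change under a 2-sample shift:", np.abs(s_a - s_b).max())
```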

  20. A Pilot Study of Simple Interventions to Improve Informed Consent in Clinical Research: Feasibility, Approach, and Results

    PubMed Central

    Kass, Nancy; Taylor, Holly; Ali, Joseph; Hallez, Kristina; Chaisson, Lelia

    2014-01-01

    Background Informed consent is intended to ensure that individuals understand the purpose, risks, and benefits of research studies, and then can decide, voluntarily, whether to enroll. However, research suggests that consent procedures do not always lead to adequate participant understanding and may be longer and more complex than necessary. Studies also suggest some consent interventions, including enhanced consent forms and extended discussions with patients, increase understanding, yet methodologic challenges have been raised in studying consent in actual trial settings. This study aimed to examine the feasibility of testing two consent interventions in actual studies and also to measure the effectiveness of the interventions in improving understanding of trials. Methods Participants enrolling in any of eight ongoing clinical trials ("collaborating studies") were, for the purposes of this study, sequentially assigned to one of three study arms involving different informed consent procedures (one control and two intervention). Control participants received the standard consent form and processes. Participants in the 1st intervention arm received a bulleted fact sheet providing simple summaries of all study components in addition to the standard consent form. Participants in the 2nd intervention arm received the bulleted fact sheet and standard consent materials and then also engaged with a member of the collaborating study staff in a feedback Q&A session. Following consent procedures, we administered closed- and open-ended questions to assess patient understanding, and we assessed literacy level. Descriptive statistics were generated, and Wilcoxon-Mann-Whitney and Kruskal-Wallis tests were used to assess associations; regression analysis determined predictors of patient understanding. Results 144 participants enrolled. 
In regression analysis, participants receiving the 2nd intervention, which included a standard consent form, bulleted fact sheet, and structured question-and-answer session with a study staff member, had open-ended question scores that were 7.6 percentage points higher (p=.02) than participants in the control arm (standard consent only), although unadjusted comparisons did not reach statistical significance. Eleven clinical trial investigators agreed to participate and 8 trials provided sufficient data to be included, thereby demonstrating the feasibility of consent research in actual settings. Conclusions Our study supports the hypothesis that patients receiving both bulleted fact sheets and a question-and-answer session have higher understanding than patients receiving a standard consent form and procedures alone. Fact sheets and a short structured dialog are quick to administer and easy to replicate across studies, and should be tested in larger samples for effectiveness. PMID:25475879
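The Kruskal-Wallis comparison named in the Methods — a nonparametric test for differences across the three consent arms — can be sketched with scipy. The understanding scores below are hypothetical, not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Hypothetical open-ended understanding scores (%) by consent arm
control = rng.normal(70, 10, size=48)        # standard consent only
fact_sheet = rng.normal(73, 10, size=48)     # + bulleted fact sheet
fact_sheet_qa = rng.normal(78, 10, size=48)  # + fact sheet + Q&A session

# Kruskal-Wallis: do the three arms' score distributions differ?
h_stat, p_value = stats.kruskal(control, fact_sheet, fact_sheet_qa)
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_value:.3f}")
```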

  1. The Effect of General Statistical Fiber Misalignment on Predicted Damage Initiation in Composites

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Aboudi, Jacob; Arnold, Steven M.

    2014-01-01

A micromechanical method is employed to predict the behavior of unidirectional composites in which the fiber orientation can possess various statistical misalignment distributions. The method relies on the probability-weighted averaging of the appropriate concentration tensor, which is established by the micromechanical procedure. This approach provides access to the local field quantities throughout the constituents, from which the initiation of damage in the composite can be predicted. In contrast, a typical macromechanical procedure can determine the effective composite elastic properties in the presence of statistical fiber misalignment, but cannot provide the local fields. Fully random fiber distribution is presented as a special case of the proposed micromechanical method. Results are given that illustrate the effects of various amounts of fiber misalignment in terms of the standard deviations of the in-plane and out-of-plane misalignment angles, where normal distributions have been employed. Damage initiation envelopes, local fields, effective moduli, and strengths are predicted for polymer and ceramic matrix composites with given normal distributions of misalignment angles, as well as fully random fiber orientation.
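The probability-weighted averaging idea can be illustrated with a scalar stand-in for the concentration tensor: average an orientation-dependent property over a normal distribution of misalignment angles. The cos⁴θ modulus dependence and the E0 value are illustrative assumptions, not from the paper.

```python
import numpy as np

def misalignment_average(property_fn, sigma_deg, n=201):
    """Probability-weighted average of an orientation-dependent property
    over a normal distribution of misalignment angles (discrete weights
    over +/- 4 sigma). Scalar stand-in for the tensor averaging."""
    theta = np.linspace(-4 * sigma_deg, 4 * sigma_deg, n)
    w = np.exp(-theta**2 / (2 * sigma_deg**2))
    w /= w.sum()  # normalized discrete normal weights
    return np.sum(w * property_fn(np.radians(theta)))

# Toy axial-modulus dependence E(theta) ~ E0 * cos^4(theta)
E0 = 140.0  # GPa, hypothetical unidirectional composite
for sd in (2.0, 5.0, 10.0):
    e_eff = misalignment_average(lambda t: E0 * np.cos(t) ** 4, sd)
    print(f"sigma = {sd:4.1f} deg -> E_eff = {e_eff:.1f} GPa")
```

As expected, the effective modulus degrades as the misalignment standard deviation grows.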

  2. [Bronchoscopy in Germany. Cross-sectional inquiry with 681 institutions].

    PubMed

    Markus, A; Häussinger, K; Kohlhäufl, M; Hauck, R W

    2000-11-01

Bronchoscopy represents an integral part of the diagnostic tools in pulmonary medicine. Recently, it has also gained considerable attention for its therapeutic properties. To elucidate the equipment, indications and procedural techniques of bronchoscopy units, a retrospective survey of 1232 hospitals and practices was conducted. 687 questionnaires were returned (response rate 56%), 681 of which were statistically evaluated. Two thirds of the physicians in charge were internists, one third were pulmonary care specialists. A total of 200,596 endoscopic procedures was included. Most units operated with an average of 3 bronchoscopists, and over 57% (388) performed an average of 100 or fewer procedures per year. The five main indications were tumor, hemoptysis, infection or pneumonia, drainage of secretions and suspected interstitial disease. The overall complication rate amounted to 2.7%, with an incidence of 4.6% minor and 0.7% major complications and a bronchoscopy-related mortality of 0.02%. The patterns seen in premedication, intra- and post-procedural monitoring, disinfection practices and documentation were quite heterogeneous. The authors suggest establishing revised and updated standards for bronchoscopy that take the collected data into account. Those standards should provide the basis for high-level bronchological care throughout Germany.

  3. Statistical description of large datasets of Cumulated and Duration values related to shallow landslides initiated by rainfalls

    NASA Astrophysics Data System (ADS)

    Pisano, Luca; Vessia, Giovanna; Vennari, Carmela; Parise, Mario

    2015-04-01

Empirical rainfall thresholds are a well-established method to draw information about the Duration (D) and Cumulated height (E) values of rainfalls that are likely to initiate shallow landslides. To this end, rain-gauge records of rainfall heights are commonly used. Several procedures can be applied to calculate the Duration, the Cumulated height and, eventually, the Intensity values of the rainfall events responsible for shallow landslide onset. A large number of procedures are drawn from particular geological settings and climate conditions based on an expert identification of the rainfall event. A few researchers recently devised automated procedures to reconstruct the rainfall events responsible for landslide onset. In this study, 300 D, E pairs, related to shallow landslides that occurred in the ten-year span 2002-2012 across Italy, have been drawn by means of two procedures: the expert method (Brunetti et al., 2010) and the automated method (Vessia et al., 2014). The two procedures start from the same sources of information on shallow landslides that occurred during or soon after a rainfall. Although they share the method used to select the date (up to the hour of the landslide occurrence), the site of the landslide, and the choice of the rain gauge representative of the rainfall, they differ in how they calculate the Duration and Cumulated height of the rainfall event. Moreover, the expert procedure identifies only one D, E pair for each landslide, whereas the automated procedure draws 6 possible D, E pairs for the same landslide event. The automated procedure reproduces about 80% of the E values and about 60% of the D values calculated by the expert procedure. Unfortunately, no standard methods are available for checking the forecasting ability of either the expert or the automated reconstruction of the true D, E pairs that result in shallow landslides. 
Nonetheless, a statistical analysis of the marginal distributions of the seven samples of 300 D and E values is performed in this study. The main objective of this statistical analysis is to highlight similarities and differences between the two sets of Duration and Cumulated values collected by the two procedures. At first, the sample distributions have been investigated: the seven E samples are Lognormal distributed, whereas the D samples are all Weibull distributed. On the E samples, due to their Lognormal distribution, statistical tests can be applied to check two null hypotheses: equal mean values through the Student test, and equal standard deviations through the Fisher test. These two hypotheses are accepted for the seven E samples, meaning that they come from the same population, at a confidence level of 95%. Conversely, the preceding tests cannot be applied to the seven D samples, which are Weibull distributed with shape parameters k ranging from 0.9 to 1.2. Nonetheless, the two procedures calculate the rainfall event through the selection of the E values, after which D is drawn. Thus, the results of this statistical analysis preliminarily confirm the similarity of the two sets of D, E pairs drawn from the two different procedures. References Brunetti, M.T., Peruccacci, S., Rossi, M., Luciani, S., Valigi, D., and Guzzetti, F.: Rainfall thresholds for the possible occurrence of landslides in Italy, Nat. Hazards Earth Syst. Sci., 10, 447-458, doi:10.5194/nhess-10-447-2010, 2010. Vessia, G., Parise, M., Brunetti, M.T., Peruccacci, S., Rossi, M., Vennari, C., and Guzzetti, F.: Automated reconstruction of rainfall events responsible for shallow landslides, Nat. Hazards Earth Syst. Sci., 14, 2399-2408, doi:10.5194/nhess-14-2399-2014, 2014.
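The comparison described above — log-transform the Lognormal E samples, then test equal means (Student) and equal variances (Fisher) — can be sketched with scipy. The two samples below are simulated from the same lognormal distribution, standing in for the expert and automated E samples.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Hypothetical cumulated-rainfall samples (mm): expert vs. automated,
# both drawn from the same lognormal population as in the study's finding
e_expert = stats.lognorm.rvs(s=0.8, scale=60, size=300, random_state=rng)
e_auto = stats.lognorm.rvs(s=0.8, scale=60, size=300, random_state=rng)

# Work in log space, where lognormal data are normal
a, b = np.log(e_expert), np.log(e_auto)
t_stat, p_mean = stats.ttest_ind(a, b)          # Student test: equal means
f_stat = np.var(a, ddof=1) / np.var(b, ddof=1)  # Fisher test: equal variances
p_var = 2 * min(stats.f.sf(f_stat, 299, 299), stats.f.cdf(f_stat, 299, 299))
print(f"equal-means p = {p_mean:.2f}, equal-variances p = {p_var:.2f}")
```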

  4. Conservativeness in Rejection of the Null Hypothesis when Using the Continuity Correction in the MH Chi-Square Test in DIF Applications

    ERIC Educational Resources Information Center

    Paek, Insu

    2010-01-01

    Conservative bias in rejection of a null hypothesis from using the continuity correction in the Mantel-Haenszel (MH) procedure was examined through simulation in a differential item functioning (DIF) investigation context in which statistical testing uses a prespecified level [alpha] for the decision on an item with respect to DIF. The standard MH…

  5. Statistical approaches to the analysis of point count data: a little extra information can go a long way

    Treesearch

    George L. Farnsworth; James D. Nichols; John R. Sauer; Steven G. Fancy; Kenneth H. Pollock; Susan A. Shriner; Theodore R. Simons

    2005-01-01

    Point counts are a standard sampling procedure for many bird species, but lingering concerns still exist about the quality of information produced from the method. It is well known that variation in observer ability and environmental conditions can influence the detection probability of birds in point counts, but many biologists have been reluctant to abandon point...

  6. Statistical considerations for plot design, sampling procedures, analysis, and quality assurance of ozone injury studies

    Treesearch

    Michael Arbaugh; Larry Bednar

    1996-01-01

    The sampling methods used to monitor ozone injury to ponderosa and Jeffrey pines depend on the objectives of the study, geographic and genetic composition of the forest, and the source and composition of air pollutant emissions. By using a standardized sampling methodology, it may be possible to compare conditions within local areas more accurately, and to apply the...

  7. The Search for Better Predictors of Incomes of High School and College Graduates. AIR Forum 1979 Paper.

    ERIC Educational Resources Information Center

    Witmer, David R.

    A search for better predictors of incomes of high school and college graduates is described. The accuracy of the prediction, implicit in the work of John R. Walsh of Harvard University, that the income differences in a given year are good indicators of income differences in future years, was tested by applying standard statistical procedures to…

  8. Music to reduce pain and distress in the pediatric emergency department: a randomized clinical trial.

    PubMed

    Hartling, Lisa; Newton, Amanda S; Liang, Yuanyuan; Jou, Hsing; Hewson, Krista; Klassen, Terry P; Curtis, Sarah

    2013-09-01

Many medical procedures aimed at helping children cause them pain and distress, which can have long-lasting negative effects. Music is a form of distraction that may alleviate some of the pain and distress experienced by children while undergoing medical procedures. To compare music with standard care to manage pain and distress. Randomized clinical trial conducted in a pediatric emergency department with appropriate sequence generation and adequate allocation concealment from January 1, 2009, to March 31, 2010. Individuals assessing the primary outcome were blind to treatment allocation. A total of 42 children aged 3 to 11 years undergoing intravenous placement were included. Music (recordings selected by a music therapist via ambient speakers) vs standard care. The primary outcome was behavioral distress assessed blinded using the Observational Scale of Behavioral Distress-Revised. The secondary outcomes included child-reported pain, heart rate, parent and health care provider satisfaction, ease of performing the procedure, and parental anxiety. With or without controlling for potential confounders, we found no significant difference in the change in behavioral distress from before the procedure to immediately after the procedure. When children who had no distress during the procedure were removed from the analysis, the increase in distress was significantly smaller for the music group (standard care group = 2.2 vs music group = 1.1, P < .05). Pain scores among children in the standard care group increased by 2 points, while they remained the same in the music group (P = .04); the difference was considered clinically important. The pattern of parent satisfaction with the management of children's pain was different between groups, although not statistically significant (P = .07). Health care providers reported that it was easier to perform the procedure for children in the music group (76% very easy) vs the standard care group (38% very easy) (P = .03). 
Health care providers were more satisfied with the intravenous placement in the music group (86% very satisfied) compared with the standard care group (48%) (P = .02). Music may have a positive impact on pain and distress for children undergoing intravenous placement. Benefits were also observed for the parents and health care providers. clinicaltrials.gov Identifier: NCT00761033.

  9. Evaluation and standardization of different purification procedures for fish bile and liver metallothionein quantification by spectrophotometry and SDS-PAGE analyses.

    PubMed

    Tenório-Daussat, Carolina Lyrio; Resende, Marcia Carolina Martinho; Ziolli, Roberta L; Hauser-Davis, Rachel Ann; Schaumloffel, Dirk; Saint'Pierre, Tatiana D

    2014-03-01

Fish bile metallothioneins (MT) have been recently reported as biomarkers for environmental metal contamination; however, no studies regarding standardizations for their purification are available. Therefore, different procedures (varying centrifugation times and heat-treatment temperatures) and reducing agents (DTT, β-mercaptoethanol and TCEP) were applied to purify MT isolated from fish (Oreochromis niloticus) bile and liver. Liver was also analyzed, since these two organs are intrinsically connected and show the same trend regarding MT expression. Spectrophotometric analyses were used to quantify the resulting MT samples, and SDS-PAGE gels were used to qualitatively assess the results of the different procedures. Each procedure was then statistically evaluated, and a multivariate statistical analysis was applied. A response surface methodology was also applied to the bile samples, in order to further evaluate the responses for this matrix. Heat treatment effectively removes most undesired proteins from the samples; however, results indicate that temperatures above 70 °C are not efficient, since they also remove MTs from both bile and liver samples. Our results also indicate that the centrifugation times described in the literature can be decreased in order to analyze more samples in the same timeframe, of importance in environmental monitoring contexts where samples are usually numerous. In an environmental context, biliary MT was lower than liver MT, as expected, since liver accumulates MT with slower detoxification rates than bile, which is released from the gallbladder during feeding, and then diluted by water. Therefore, bile MT seems to be more adequate in environmental monitoring scopes regarding recent exposure to xenobiotics that may affect the proteomic and metalloproteomic expression of this biological matrix. Copyright © 2013 Elsevier B.V. All rights reserved.

  10. Modeling Major Adverse Outcomes of Pediatric and Adult Patients With Congenital Heart Disease Undergoing Cardiac Catheterization: Observations From the NCDR IMPACT Registry (National Cardiovascular Data Registry Improving Pediatric and Adult Congenital Treatment).

    PubMed

    Jayaram, Natalie; Spertus, John A; Kennedy, Kevin F; Vincent, Robert; Martin, Gerard R; Curtis, Jeptha P; Nykanen, David; Moore, Phillip M; Bergersen, Lisa

    2017-11-21

    Risk standardization for adverse events after congenital cardiac catheterization is needed to equitably compare patient outcomes among different hospitals as a foundation for quality improvement. The goal of this project was to develop a risk-standardization methodology to adjust for patient characteristics when comparing major adverse outcomes in the NCDR's (National Cardiovascular Data Registry) IMPACT Registry (Improving Pediatric and Adult Congenital Treatment). Between January 2011 and March 2014, 39 725 consecutive patients within IMPACT undergoing cardiac catheterization were identified. Given the heterogeneity of interventional procedures for congenital heart disease, new procedure-type risk categories were derived with empirical data and expert opinion, as were markers of hemodynamic vulnerability. A multivariable hierarchical logistic regression model to identify patient and procedural characteristics predictive of a major adverse event or death after cardiac catheterization was derived in 70% of the cohort and validated in the remaining 30%. The rate of major adverse event or death was 7.1% and 7.2% in the derivation and validation cohorts, respectively. Six procedure-type risk categories and 6 independent indicators of hemodynamic vulnerability were identified. The final risk adjustment model included procedure-type risk category, number of hemodynamic vulnerability indicators, renal insufficiency, single-ventricle physiology, and coagulation disorder. The model had good discrimination, with a C-statistic of 0.76 and 0.75 in the derivation and validation cohorts, respectively. Model calibration in the validation cohort was excellent, with a slope of 0.97 (standard error, 0.04; P value [for difference from 1] =0.53) and an intercept of 0.007 (standard error, 0.12; P value [for difference from 0] =0.95). 
The creation of a validated risk-standardization model for adverse outcomes after congenital cardiac catheterization can support reporting of risk-adjusted outcomes in the IMPACT Registry as a foundation for quality improvement. © 2017 American Heart Association, Inc.
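    The C-statistic reported for this model is the probability that a randomly chosen patient who experienced the outcome was assigned a higher predicted risk than a randomly chosen patient who did not. A minimal sketch of the computation on hypothetical data (not the registry model itself; ties count as half-concordant):

```python
def c_statistic(risks, outcomes):
    """Concordance (C-statistic): fraction of event/non-event pairs in
    which the event case received the higher predicted risk.
    Ties count as half-concordant."""
    events = [r for r, y in zip(risks, outcomes) if y == 1]
    nonevents = [r for r, y in zip(risks, outcomes) if y == 0]
    pairs = [(e, n) for e in events for n in nonevents]
    score = sum(1.0 if e > n else 0.5 if e == n else 0.0 for e, n in pairs)
    return score / len(pairs)

# Hypothetical predicted risks and observed outcomes (1 = adverse event or death)
risks = [0.9, 0.8, 0.7, 0.3, 0.2, 0.1]
outcomes = [1, 1, 0, 1, 0, 0]
print(round(c_statistic(risks, outcomes), 3))  # 8 of 9 pairs concordant -> 0.889
```

    A value of 0.5 indicates no discrimination and 1.0 perfect discrimination, so the 0.75 to 0.76 reported here indicates good separation of high-risk from low-risk patients.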

  11. Evaluation of a Powered Stapler System with Gripping Surface Technology on Surgical Interventions Required During Laparoscopic Sleeve Gastrectomy.

    PubMed

    Fegelman, Elliott; Knippenberg, Susan; Schwiers, Michael; Stefanidis, Dimitrios; Gersin, Keith S; Scott, John D; Fernandez, Adolfo Z

    2017-05-01

    Transection of gastric tissue during laparoscopic sleeve gastrectomy (LSG) can be challenging. Reinforcing the staple line may decrease the incidence of issues requiring intervention. The objective of this study was to compare the number of intraoperative surgical interventions for a surgical stapler and reload system with Gripping Surface Technology (GST) to standard reloads in patients who underwent LSG. Patients who underwent elective LSG were enrolled. The study was conducted in two stages. For Stage 1, procedures were performed using a powered stapler and standard reloads. For Stage 2, a reload system with GST was used. The primary endpoint was surgical interventions for bleeding and/or staple line issues during transection of the greater curvature of the stomach. Propensity score matching was applied to create two groups similar in baseline characteristics and risk factors. A total of 111 subjects were enrolled across four centers. Propensity-matched procedures were completed with the standard (n = 38) or GST reloads (n = 38). The mean number of interventions in the standard group was 1.9 (1.29) versus 1.1 (1.45) in the GST group. Nonparametric comparisons were statistically significant, indicating a reduction in the distribution of interventions for GST subjects (P = .0036 for matched pair data). Tissue slippage during transection was low for both groups. Intraoperative leak testing was negative in all procedures, and no procedures were converted to open. Use of the GST stapling system reduces the need for staple line interventions in LSG. Both stapling systems had an acceptable safety profile.

  12. Development and Evaluation of the American College of Surgeons NSQIP Pediatric Surgical Risk Calculator.

    PubMed

    Kraemer, Kari; Cohen, Mark E; Liu, Yaoming; Barnhart, Douglas C; Rangel, Shawn J; Saito, Jacqueline M; Bilimoria, Karl Y; Ko, Clifford Y; Hall, Bruce L

    2016-11-01

    There is an increased desire among patients and families to be involved in the surgical decision-making process. A surgeon's ability to provide patients and families with patient-specific estimates of postoperative complications is critical for shared decision making and informed consent. Surgeons can also use patient-specific risk estimates to decide whether or not to operate and what options to offer patients. Our objective was to develop and evaluate a publicly available risk estimation tool that would cover many common pediatric surgical procedures across all specialties. American College of Surgeons NSQIP Pediatric standardized data from 67 hospitals were used to develop a risk estimation tool. Surgeons enter 18 preoperative variables (demographics, comorbidities, procedure) that are used in a logistic regression model to predict 9 postoperative outcomes. A surgeon adjustment score is also incorporated to adjust for any additional risk not accounted for in the 18 risk factors. A pediatric surgical risk calculator was developed based on 181,353 cases covering 382 CPT codes across all specialties. It had excellent discrimination for mortality (c-statistic = 0.98), morbidity (c-statistic = 0.81), and 7 additional complications (c-statistic > 0.77). The Hosmer-Lemeshow statistic and graphic representations also showed excellent calibration. The ACS NSQIP Pediatric Surgical Risk Calculator was developed using standardized and audited multi-institutional data from the ACS NSQIP Pediatric, and it provides empirically derived, patient-specific postoperative risks. It can be used as a tool in the shared decision-making process by providing clinicians, families, and patients with useful information for many of the most common operations performed on pediatric patients in the US. Copyright © 2016 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  13. Statistical analysis of water-quality data containing multiple detection limits: S-language software for regression on order statistics

    USGS Publications Warehouse

    Lee, L.; Helsel, D.

    2005-01-01

    Trace contaminants in water, including metals and organics, often are measured at sufficiently low concentrations to be reported only as values below the instrument detection limit. Interpretation of these "less thans" is complicated when multiple detection limits occur. Statistical methods for multiply censored, or multiple-detection limit, datasets have been developed for medical and industrial statistics, and can be employed to estimate summary statistics or model the distributions of trace-level environmental data. We describe S-language-based software tools that perform robust linear regression on order statistics (ROS). The ROS method has been evaluated as one of the most reliable procedures for developing summary statistics of multiply censored data. It is applicable to any dataset that has 0 to 80% of its values censored. These tools are a part of a software library, or add-on package, for the R environment for statistical computing. This library can be used to generate ROS models and associated summary statistics, plot modeled distributions, and predict exceedance probabilities of water-quality standards. © 2005 Elsevier Ltd. All rights reserved.
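    The ROS idea can be illustrated in a few lines: detected values are regressed on normal scores of their plotting positions, and each censored observation is imputed from the fitted line. Below is a simplified single-detection-limit sketch in Python rather than S, for illustration only; the Helsel-Cohn method implemented in the library generalizes the plotting positions to multiple detection limits:

```python
import numpy as np
from scipy import stats

def simple_ros(values, censored):
    """Simplified regression on order statistics (single detection limit).

    Fits a line to log10(detected values) versus normal scores of their
    plotting positions, then imputes each censored value from the line."""
    values = np.asarray(values, dtype=float)
    censored = np.asarray(censored, dtype=bool)
    n = len(values)
    # Rank the full sample; Weibull plotting positions i/(n+1) -> normal scores
    ranks = np.empty(n, dtype=int)
    ranks[np.argsort(values)] = np.arange(n)
    z = stats.norm.ppf((ranks + 1) / (n + 1.0))
    # Fit log-concentration vs. normal score on detected observations only
    fit = stats.linregress(z[~censored], np.log10(values[~censored]))
    est = values.copy()
    est[censored] = 10 ** (fit.intercept + fit.slope * z[censored])
    return est

obs = [0.5, 0.5, 0.8, 1.2, 2.0, 3.5, 6.0]   # 0.5 = detection limit
cen = [True, True, False, False, False, False, False]
print(simple_ros(obs, cen).round(3))
```

    Summary statistics are then computed from the combined set of detected and imputed values, rather than by substituting an arbitrary fraction of the detection limit.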

  14. Estimating the probability of rare events: addressing zero failure data.

    PubMed

    Quigley, John; Revie, Matthew

    2011-07-01

    Traditional statistical procedures for estimating the probability of an event result in an estimate of zero when no events are realized. Alternative inferential procedures have been proposed for the situation where zero events have been realized, but often these are ad hoc, relying on selecting methods dependent on the data that have been realized. Such data-dependent inference decisions violate fundamental statistical principles, resulting in estimation procedures whose benefits are difficult to assess. In this article, we propose estimating the probability of an event occurring through minimax inference on the probability that future samples of equal size realize no more events than that in the data on which the inference is based. Although motivated by inference on rare events, the method is not restricted to zero-event data and closely approximates the maximum likelihood estimate (MLE) for nonzero data. The use of the minimax procedure provides a risk-averse inferential procedure where no events are realized. A comparison is made with the MLE, and regions of the underlying probability are identified where this approach is superior. Moreover, a comparison is made with three standard approaches to supporting inference where no event data are realized, which we argue are unduly pessimistic. We show that for situations of zero events the estimator can be simply approximated by 1/(2.5n), where n is the number of trials. © 2011 Society for Risk Analysis.

  15. Analyses and assessments of span wise gust gradient data from NASA B-57B aircraft

    NASA Technical Reports Server (NTRS)

    Frost, Walter; Chang, Ho-Pen; Ringnes, Erik A.

    1987-01-01

    Analysis of turbulence measured across the airfoil of a Canberra B-57 aircraft is reported. The aircraft is instrumented with probes for measuring wind at both wing tips and at the nose. Statistical properties of the turbulence are reported. These consist of the standard deviations of turbulence measured by each individual probe; standard deviations and probability distributions of differences in turbulence measured between probes; and auto- and two-point spatial correlations and spectra. Procedures associated with calculating two-point spatial correlations and spectra from the data are addressed. Methods and correction procedures for assuring the accuracy of aircraft-measured winds are also described. Results are found, in general, to agree with correlations existing in the literature. The velocity spatial differences fit a Gaussian/Bessel-type probability distribution. The turbulence agrees with the von Karman turbulence correlation and with two-point spatial correlations developed from the von Karman correlation.
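    The von Kármán longitudinal correlation mentioned above has a standard closed form in terms of a modified Bessel function. A sketch of one common parameterization (the form used in flight-dynamics turbulence specifications, with integral length scale L; by construction f(r) tends to 1 as r tends to 0):

```python
import numpy as np
from scipy.special import gamma, kv

def von_karman_f(r, L):
    """von Karman longitudinal correlation f(r) for separation r and
    integral length scale L:  f = (2^(2/3)/Gamma(1/3)) xi^(1/3) K_{1/3}(xi),
    with xi = r / (1.339 L)."""
    xi = np.asarray(r, dtype=float) / (1.339 * L)
    return (2 ** (2 / 3) / gamma(1 / 3)) * xi ** (1 / 3) * kv(1 / 3, xi)

r = np.linspace(0.01, 5.0, 6)
print(von_karman_f(r, L=1.0).round(4))
```

    Measured two-point correlations between the wing-tip probes can be compared directly against this curve at the known probe separation.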

  16. Identifying fMRI Model Violations with Lagrange Multiplier Tests

    PubMed Central

    Cassidy, Ben; Long, Christopher J; Rae, Caroline; Solo, Victor

    2013-01-01

    The standard modeling framework in Functional Magnetic Resonance Imaging (fMRI) is predicated on assumptions of linearity, time invariance and stationarity. These assumptions are rarely checked because doing so requires specialised software, although failure to do so can lead to bias and mistaken inference. Identifying model violations is an essential but largely neglected step in standard fMRI data analysis. Using Lagrange Multiplier testing methods we have developed simple and efficient procedures for detecting model violations such as non-linearity, non-stationarity and validity of the common Double Gamma specification for hemodynamic response. These procedures are computationally cheap and can easily be added to a conventional analysis. The test statistic is calculated at each voxel and displayed as a spatial anomaly map which shows regions where a model is violated. The methodology is illustrated with a large number of real data examples. PMID:22542665

  17. A New Way for Antihelixplasty in Prominent Ear Surgery: Modified Postauricular Fascial Flap.

    PubMed

    Taş, Süleyman; Benlier, Erol

    2016-06-01

    Otoplasty procedures aim to reduce the concha-mastoid angle and recreate the antihelical fold. Here, we describe the modified postauricular fascial flap, a new way of recreating the antihelical fold, and report the results of patients on whom this flap was used. The technique was used on 24 patients (10 females and 14 males; age, 6-27 years; mean, 16.7 years) between June 2009 and July 2012, 48 procedures in total (bilateral). Follow-up ranged from 1 to 3 years (mean, 1.5 years). At the preoperative and postoperative time points (1 and 12 months after surgery), all patients were measured for upper and middle helix-head distance and were photographed. The records were analyzed statistically using the t test and analysis of variance. The procedure resulted in ears that were natural in appearance without any significant visible evidence of surgery. The operations resulted in no complications except in 1 patient, who developed a small skin ulcer on the left ear because of band pressure. Comparison of the preoperative and postoperative upper and middle helix-head distances showed a highly statistically significant difference. The modified postauricular fascial flap introduced here is a simple and safe procedure to recreate an antihelical fold. This procedure led to several benefits, including a natural-appearing antihelical fold, prevention of suture extrusion and granuloma, and minimized risk of recurrence due to neochondrogenesis. This method may be used as a standard procedure for treating prominent ears surgically.

  18. A hierarchical Bayesian approach to adaptive vision testing: A case study with the contrast sensitivity function.

    PubMed

    Gu, Hairong; Kim, Woojae; Hou, Fang; Lesmes, Luis Andres; Pitt, Mark A; Lu, Zhong-Lin; Myung, Jay I

    2016-01-01

    Measurement efficiency is of concern when a large number of observations are required to obtain reliable estimates for parametric models of vision. The standard entropy-based Bayesian adaptive testing procedures addressed the issue by selecting the most informative stimulus in sequential experimental trials. Noninformative, diffuse priors were commonly used in those tests. Hierarchical adaptive design optimization (HADO; Kim, Pitt, Lu, Steyvers, & Myung, 2014) further improves the efficiency of the standard Bayesian adaptive testing procedures by constructing an informative prior using data from observers who have already participated in the experiment. The present study represents an empirical validation of HADO in estimating the human contrast sensitivity function. The results show that HADO significantly improves the accuracy and precision of parameter estimates, and therefore requires many fewer observations to obtain reliable inference about contrast sensitivity, compared to the method of quick contrast sensitivity function (Lesmes, Lu, Baek, & Albright, 2010), which uses the standard Bayesian procedure. The improvement with HADO was maintained even when the prior was constructed from heterogeneous populations or a relatively small number of observers. The results of this case study support the conclusion that HADO can be used in Bayesian adaptive testing by replacing noninformative, diffuse priors with statistically justified informative priors without introducing unwanted bias.

  19. A hierarchical Bayesian approach to adaptive vision testing: A case study with the contrast sensitivity function

    PubMed Central

    Gu, Hairong; Kim, Woojae; Hou, Fang; Lesmes, Luis Andres; Pitt, Mark A.; Lu, Zhong-Lin; Myung, Jay I.

    2016-01-01

    Measurement efficiency is of concern when a large number of observations are required to obtain reliable estimates for parametric models of vision. The standard entropy-based Bayesian adaptive testing procedures addressed the issue by selecting the most informative stimulus in sequential experimental trials. Noninformative, diffuse priors were commonly used in those tests. Hierarchical adaptive design optimization (HADO; Kim, Pitt, Lu, Steyvers, & Myung, 2014) further improves the efficiency of the standard Bayesian adaptive testing procedures by constructing an informative prior using data from observers who have already participated in the experiment. The present study represents an empirical validation of HADO in estimating the human contrast sensitivity function. The results show that HADO significantly improves the accuracy and precision of parameter estimates, and therefore requires many fewer observations to obtain reliable inference about contrast sensitivity, compared to the method of quick contrast sensitivity function (Lesmes, Lu, Baek, & Albright, 2010), which uses the standard Bayesian procedure. The improvement with HADO was maintained even when the prior was constructed from heterogeneous populations or a relatively small number of observers. The results of this case study support the conclusion that HADO can be used in Bayesian adaptive testing by replacing noninformative, diffuse priors with statistically justified informative priors without introducing unwanted bias. PMID:27105061

  20. Adjustment of geochemical background by robust multivariate statistics

    USGS Publications Warehouse

    Zhou, D.

    1985-01-01

    Conventional analyses of exploration geochemical data assume that the background is a constant or slowly changing value, equivalent to a plane or a smoothly curved surface. However, it is better to regard the geochemical background as a rugged surface, varying with changes in geology and environment. This rugged surface can be estimated from observed geological, geochemical and environmental properties by using multivariate statistics. A method of background adjustment was developed and applied to groundwater and stream sediment reconnaissance data collected from the Hot Springs Quadrangle, South Dakota, as part of the National Uranium Resource Evaluation (NURE) program. Source-rock lithology appears to be a dominant factor controlling the chemical composition of groundwater or stream sediments. The most efficacious adjustment procedure is to regress uranium concentration on selected geochemical and environmental variables for each lithologic unit, and then to delineate anomalies by a common threshold set as a multiple of the standard deviation of the combined residuals. Robust versions of regression and RQ-mode principal components analysis techniques were used rather than ordinary techniques to guard against distortion caused by outliers. Anomalies delineated by this background adjustment procedure correspond with uranium prospects much better than do anomalies delineated by conventional procedures. The procedure should be applicable to geochemical exploration at different scales for other metals. © 1985.
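    The adjustment procedure can be sketched compactly: regress the uranium concentration on covariates within each lithologic unit, pool the residuals, and flag samples whose residual exceeds a common multiple of the pooled standard deviation. The sketch below uses ordinary least squares on synthetic data for brevity, whereas the paper uses robust regression precisely to resist the outlier distortion an OLS fit would suffer:

```python
import numpy as np

rng = np.random.default_rng(0)

def flag_anomalies(X_by_unit, u_by_unit, k=2.0):
    """Per-lithologic-unit regression of concentration on covariates, then a
    common anomaly threshold at k standard deviations of the pooled residuals."""
    residuals = []
    for X, u in zip(X_by_unit, u_by_unit):
        A = np.column_stack([np.ones(len(u)), X])      # intercept + covariates
        beta, *_ = np.linalg.lstsq(A, u, rcond=None)
        residuals.append(u - A @ beta)
    pooled = np.concatenate(residuals)
    thresh = k * pooled.std(ddof=1)
    return [r > thresh for r in residuals], thresh

# Two hypothetical units with different backgrounds; one sample is spiked
X1, X2 = rng.normal(size=(30, 2)), rng.normal(size=(25, 2))
u1 = 5 + X1 @ np.array([1.0, 0.5]) + rng.normal(scale=0.3, size=30)
u2 = 12 + X2 @ np.array([0.8, -0.2]) + rng.normal(scale=0.3, size=25)
u2[0] += 5.0                                           # injected anomaly
flags, thresh = flag_anomalies([X1, X2], [u1, u2])
print(bool(flags[1][0]), round(thresh, 2))
```

    Because each unit gets its own regression, the very different background levels of the two units (5 versus 12) do not mask the injected anomaly, which a single global threshold on raw concentrations would conflate with lithology.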

  1. [Mechanical properties of nickel-titanium files following multiple heat sterilizations].

    PubMed

    Testarelli, L; Gallottini, L; Gambarini, G

    2003-04-01

    The effect of repeated cycles of sterilization procedures on nickel-titanium (NiTi) endodontic instruments is a serious concern for practitioners. There is no agreement in the literature on whether these procedures could adversely affect the mechanical properties of endodontic files and, consequently, increase the risk of intracanal failure. The purpose of this study was to evaluate the mechanical resistance of Hero (MicroMega, Besançon, France) instruments before and after sterilization procedures. Thirty new 02, 04, and 06 tapered Hero size 30 instruments were chosen and divided into 3 groups. Group A (control) was tested according to ANSI/ADA Spec. No. 28 for torsional resistance, angle of torque, and angle at breakage (45°). Group B files were first sterilized with chemiclave for 10 cycles of 20 minutes at 124 °C and then tested as described above. Group C files were first sterilized with glass beads for 10 cycles of 20 seconds at 250 °C and then tested as described above. Data were collected and statistically analyzed (paired t-test). Differences among the 3 groups were not statistically significant for either test. All data were well within the Spec. No. 28 standard values. From the results of the present study, we may conclude that repeated sterilization procedures do not adversely affect the mechanical resistance of Hero files.

  2. Identification of differentially expressed genes and false discovery rate in microarray studies.

    PubMed

    Gusnanto, Arief; Calza, Stefano; Pawitan, Yudi

    2007-04-01

    To highlight the development in microarray data analysis for the identification of differentially expressed genes, particularly via control of false discovery rate. The emergence of high-throughput technology such as microarrays raises two fundamental statistical issues: multiplicity and sensitivity. We focus on the biological problem of identifying differentially expressed genes. First, multiplicity arises due to testing tens of thousands of hypotheses, rendering the standard P value meaningless. Second, known optimal single-test procedures such as the t-test perform poorly in the context of highly multiple tests. The standard approach of dealing with multiplicity is too conservative in the microarray context. The false discovery rate concept is fast becoming the key statistical assessment tool replacing the P value. We review the false discovery rate approach and argue that it is more sensible for microarray data. We also discuss some methods to take into account additional information from the microarrays to improve the false discovery rate. There is growing consensus on how to analyse microarray data using the false discovery rate framework in place of the classical P value. Further research is needed on the preprocessing of the raw data, such as the normalization step and filtering, and on finding the most sensitive test procedure.
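    The false discovery rate framework discussed here is most commonly operationalized with the Benjamini-Hochberg step-up procedure, which rejects the k smallest p-values, where k is the largest index i (out of m hypotheses) with p(i) <= (i/m)q. A minimal sketch:

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """Benjamini-Hochberg step-up procedure: reject the hypotheses whose
    sorted p-values satisfy p_(i) <= (i/m) * q, controlling the FDR at q."""
    p = np.asarray(pvals, dtype=float)
    m = len(p)
    order = np.argsort(p)
    thresholds = (np.arange(1, m + 1) / m) * q
    below = p[order] <= thresholds
    k = np.max(np.nonzero(below)[0]) + 1 if below.any() else 0
    rejected = np.zeros(m, dtype=bool)
    rejected[order[:k]] = True      # reject the k smallest p-values
    return rejected

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.21, 0.5]
print(benjamini_hochberg(pvals, q=0.05))
```

    Unlike a Bonferroni correction at level q/m, which controls the family-wise error rate and becomes hopelessly conservative with tens of thousands of genes, this procedure controls only the expected fraction of false positives among the rejected genes.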

  3. [Construction and analysis of questionnaires on AIDS cough in traditional Chinese medicine diagnosis and treatment procedures].

    PubMed

    Zhang, Ying; Xue, Liu-Hua; Chen, Yu-Xia; Huang, Shi-Jing; Pan, Ju-Hua; Wang, Jie

    2013-08-01

    To standardize traditional Chinese medicine (TCM) diagnosis and treatment of AIDS cough, improve the clinical level of cough treatment for HIV/AIDS, and build a TCM diagnosis and treatment procedure for AIDS cough. Combined with clinical practice, a questionnaire on TCM diagnosis and treatment of AIDS cough was formulated through English- and Chinese-language literature research and expert consultation, and the questionnaire results were statistically verified using the Delphi method. The questionnaire covers an overview, pathogeny, diagnostic standards, syndrome differentiation and medication (phlegm-heat obstructing the lung, lung-kidney yin deficiency, and lung-spleen deficiency), moxibustion treatment, and aftercare (diet and mental care). Item averages (2.93-3.00), full-mark rates (93.10%-100%), and rank averages (9.91-10.67 and 287.50-309.50) reached their highest values; the coefficient of variation was 0.00; Kendall's coefficient of concordance (Kendall's W) was 0.049, which was statistically significant; and the questionnaire reliability (alpha) was 0.788. The concept, etiology and pathogenesis, and diagnosis and syndrome-differentiation treatment of AIDS cough were thus preliminarily standardized and broadly recognized by experts in the field, laying a foundation for developing TCM specifications for the diagnosis and treatment of AIDS cough.

  4. A new one-step procedure for pulmonary valve implantation of the melody valve: Simultaneous prestenting and valve implantation.

    PubMed

    Boudjemline, Younes

    2018-01-01

    To describe a new modification, the one-step procedure, that allows interventionists to pre-stent and implant a Melody valve simultaneously. Percutaneous pulmonary valve implantation (PPVI) is the standard of care for managing patients with dysfunctional right ventricular outflow tract, and the approach is standardized. Patients undergoing PPVI using the one-step procedure were identified in our database. Procedural data and radiation exposure were compared to those in a matched group of patients who underwent PPVI using the conventional two-step procedure. Between January 2016 and January 2017, PPVI was performed in 27 patients (median age/range, 19.1/10-55 years) using the one-step procedure involving manual crimping of one to three bare metal stents over the Melody valve. The stent and Melody valve were delivered successfully using the Ensemble delivery system. No complications occurred. All patients had excellent hemodynamic results (median/range post-PPVI right ventricular to pulmonary artery gradient, 9/0-20 mmHg). Valve function was excellent. Median procedural and fluoroscopic times were 56 and 10.2 min, respectively, which significantly differed from those of the two-step procedure group. Similarly, the dose area product (DAP) and radiation time were statistically lower in the one-step group than in the two-step group (P < 0.001 for all variables). After a median follow-up of 8 months (range, 3-14.7), no patient underwent reintervention, and no device dysfunction was observed. The one-step procedure is a safe modification that allows interventionists to pre-stent and implant the Melody valve simultaneously. It significantly reduces procedural and fluoroscopic times, and radiation exposure. © 2017 Wiley Periodicals, Inc.

  5. Learning intervention-induced deformations for non-rigid MR-CT registration and electrode localization in epilepsy patients

    PubMed Central

    Onofrey, John A.; Staib, Lawrence H.; Papademetris, Xenophon

    2015-01-01

    This paper describes a framework for learning a statistical model of non-rigid deformations induced by interventional procedures. We make use of this learned model to perform constrained non-rigid registration of pre-procedural and post-procedural imaging. We demonstrate results applying this framework to non-rigidly register post-surgical computed tomography (CT) brain images to pre-surgical magnetic resonance images (MRIs) of epilepsy patients who had intra-cranial electroencephalography electrodes surgically implanted. Deformations caused by this surgical procedure, imaging artifacts caused by the electrodes, and the use of multi-modal imaging data make non-rigid registration challenging. Our results show that the use of our proposed framework to constrain the non-rigid registration process results in significantly improved and more robust registration performance compared to using standard rigid and non-rigid registration methods. PMID:26900569

  6. Development of Growth Charts of Pakistani Children Aged 4-15 Years Using Quantile Regression: A Cross-sectional Study

    PubMed Central

    Khan, Nazeer; Siddiqui, Junaid S; Baig-Ansari, Naila

    2018-01-01

    Background Growth charts are essential tools used by pediatricians as well as public health researchers in assessing and monitoring the well-being of pediatric populations. Development of these growth charts, especially for children above five years of age, is challenging and requires current anthropometric data and advanced statistical analysis. These growth charts are generally presented as a series of smooth centile curves. A number of modeling approaches are available for generating growth charts, and applying these to national datasets is important for generating country-specific reference growth charts. Objective To demonstrate that quantile regression (QR) is a viable statistical approach for constructing growth reference charts and to assess the applicability of the World Health Organization (WHO) 2007 growth standards to a large Pakistani population of school-going children. Methodology This is a secondary data analysis using anthropometric data of 9,515 students from a Pakistani survey conducted between 2007 and 2014 in four cities of Pakistan. Growth reference charts were created using QR as well as the LMS (Box-Cox transformation (L), the median (M), and the generalized coefficient of variation (S)) method and then compared with WHO 2007 growth standards. Results Centile values estimated by the LMS method and QR procedure had few differences. The centile values attained from the QR procedure for BMI-for-age, weight-for-age, and height-for-age of Pakistani children were lower than the standard WHO 2007 centiles. Conclusion QR should be considered as an alternative method to develop growth charts for its simplicity and lack of necessity to transform data. WHO 2007 standards are not suitable for Pakistani children. PMID:29632748

  7. Development of Growth Charts of Pakistani Children Aged 4-15 Years Using Quantile Regression: A Cross-sectional Study.

    PubMed

    Iftikhar, Sundus; Khan, Nazeer; Siddiqui, Junaid S; Baig-Ansari, Naila

    2018-02-02

    Background Growth charts are essential tools used by pediatricians as well as public health researchers in assessing and monitoring the well-being of pediatric populations. Development of these growth charts, especially for children above five years of age, is challenging and requires current anthropometric data and advanced statistical analysis. These growth charts are generally presented as a series of smooth centile curves. A number of modeling approaches are available for generating growth charts, and applying these to national datasets is important for generating country-specific reference growth charts. Objective To demonstrate that quantile regression (QR) is a viable statistical approach for constructing growth reference charts and to assess the applicability of the World Health Organization (WHO) 2007 growth standards to a large Pakistani population of school-going children. Methodology This is a secondary data analysis using anthropometric data of 9,515 students from a Pakistani survey conducted between 2007 and 2014 in four cities of Pakistan. Growth reference charts were created using QR as well as the LMS (Box-Cox transformation (L), the median (M), and the generalized coefficient of variation (S)) method and then compared with WHO 2007 growth standards. Results Centile values estimated by the LMS method and QR procedure had few differences. The centile values attained from the QR procedure for BMI-for-age, weight-for-age, and height-for-age of Pakistani children were lower than the standard WHO 2007 centiles. Conclusion QR should be considered as an alternative method to develop growth charts for its simplicity and lack of necessity to transform data. WHO 2007 standards are not suitable for Pakistani children.
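    Quantile regression fits a chosen centile directly by minimizing the asymmetric "pinball" (check) loss, with no distributional transformation of the data, which is the simplicity the authors cite over the LMS method. A minimal sketch on hypothetical height-for-age data, assuming a linear age effect for illustration (actual growth charts use more flexible curves and dedicated solvers such as those in statsmodels or quantreg):

```python
import numpy as np
from scipy.optimize import minimize

def pinball_loss(beta, X, y, tau):
    """Check (pinball) loss; its minimizer is the tau-th conditional quantile."""
    resid = y - X @ beta
    return np.mean(np.where(resid >= 0, tau * resid, (tau - 1) * resid))

def fit_quantile(X, y, tau):
    """Linear quantile regression via direct minimization of the pinball loss."""
    beta0 = np.linalg.lstsq(X, y, rcond=None)[0]   # warm start at the OLS fit
    res = minimize(pinball_loss, beta0, args=(X, y, tau), method="Nelder-Mead",
                   options={"xatol": 1e-8, "fatol": 1e-10, "maxiter": 20000})
    return res.x

# Hypothetical heights (cm), linear in age with noise; fit 5th/50th/95th centiles
rng = np.random.default_rng(1)
age = rng.uniform(4, 15, 400)
height = 85 + 6 * age + rng.normal(scale=4, size=400)
X = np.column_stack([np.ones_like(age), age])
for tau in (0.05, 0.5, 0.95):
    print(tau, fit_quantile(X, height, tau).round(2))
```

    Each centile curve gets its own fit, so the upper and lower centiles are estimated from the data directly rather than derived from a fitted distribution at each age.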

  8. Standardization in gully erosion studies: methodology and interpretation of magnitudes from a global review

    NASA Astrophysics Data System (ADS)

    Castillo, Carlos; Gomez, Jose Alfonso

    2016-04-01

    Standardization is the process of developing common conventions or procedures to facilitate the communication, use, comparison and exchange of products or information among different parties. It has been a useful tool in different fields, from industry to statistics, for technical, economic and social reasons. In science, the need for standardization has been recognised in the definition of methods as well as in publication formats. With respect to gully erosion, a number of initiatives have been carried out to propose common methodologies, for instance, for gully delineation (Castillo et al., 2014) and geometrical measurements (Casalí et al., 2015). The main aims of this work are: 1) to examine previous proposals in the gully erosion literature involving standardization processes; 2) to contribute new approaches to improve the homogeneity of methodologies and presentation of results for better communication within the gully erosion community. For this purpose, we evaluated the basic information provided on environmental factors, discussed the delineation and measurement procedures proposed in previous works and, finally, analysed statistically the severity of degradation levels derived from different indicators at the world scale. As a result, we present suggestions intended to serve as guidance for survey design as well as for the interpretation of vulnerability levels and degradation rates in future gully erosion studies. References Casalí, J., Giménez, R., and Campo-Bescós, M. A.: Gully geometry: what are we measuring?, SOIL, 1, 509-513, doi:10.5194/soil-1-509-2015, 2015. Castillo C., Taguas E. V., Zarco-Tejada P., James M. R., and Gómez J. A. (2014), The normalized topographic method: an automated procedure for gully mapping using GIS, Earth Surf. Process. Landforms, 39, 2002-2015, doi: 10.1002/esp.3595

  9. 78 FR 43002 - Proposed Collection; Comment Request for Revenue Procedure 2004-29

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-18

    ... comments concerning statistical sampling in Sec. 274 Context. DATES: Written comments should be received on... INFORMATION: Title: Statistical Sampling in Sec. 274 Context. OMB Number: 1545-1847. Revenue Procedure Number: Revenue Procedure 2004-29. Abstract: Revenue Procedure 2004-29 prescribes the statistical sampling...

  10. Cleanroom certification model

    NASA Technical Reports Server (NTRS)

    Currit, P. A.

    1983-01-01

    The Cleanroom software development methodology is designed to take the gamble out of product releases for both suppliers and receivers of the software. The ingredients of this procedure are a life cycle of executable product increments, representative statistical testing, and a standard estimate of the MTTF (Mean Time To Failure) of the product at the time of its release. A statistical approach to software product testing using randomly selected samples of test cases is considered. A statistical model is defined for the certification process which uses the timing data recorded during test. A reasonableness argument for this model is provided that uses previously published data on software product execution. Also included is a derivation of the certification model estimators and a comparison of the proposed least squares technique with the more commonly used maximum likelihood estimators.

  11. Introduction of basic obstetrical ultrasound screening in undergraduate medical education.

    PubMed

    Hamza, A; Solomayer, E-F; Takacs, Z; Juhasz-Boes, I; Joukhadar, R; Radosa, J C; Mavrova, R; Marc, W; Volk, T; Meyberg-Solomayer, G

    2016-09-01

    Teaching ultrasound procedures to undergraduates has recently been proposed to improve the quality of medical education. We address the impact of applying standardized ultrasound teaching to our undergraduates. Medical students received an additional theoretical and practical course involving hands-on ultrasound screening during their mandatory practical training week in obstetrics and gynecology. The students' theoretical knowledge and fetal image recognition skills were tested before and after the course. After the course, the students were asked to answer a course evaluation questionnaire. To standardize the teaching procedure, we used Peyton's 4-Step Approach to teach the skills needed for a German Society of Ultrasound in Medicine Level 1 ultrasound examiner. The multiple-choice question scores after the course showed statistically significant improvement (50 vs. 80 %; P < 0.001). The questionnaire revealed that students were satisfied with the course, felt that it increased their ultrasound knowledge, and indicated that they wanted more sonographic hands-on training in both obstetrics and gynecology and other medical fields. Using practical, hands-on medical teaching is an emerging method for undergraduate education that should be further evaluated, standardized, and developed.

  12. Can Particulate Air Sampling Predict Microbial Load in Operating Theatres for Arthroplasty?

    PubMed Central

    Cristina, Maria Luisa; Spagnolo, Anna Maria; Sartini, Marina; Panatto, Donatella; Gasparini, Roberto; Orlando, Paolo; Ottria, Gianluca; Perdelli, Fernanda

    2012-01-01

    Several studies have proposed that the microbiological quality of the air in operating theatres be indirectly evaluated by means of particle counting, a technique derived from industrial clean-room technology standards, using airborne particle concentration as an index of microbial contamination. However, the relationship between particle counting and microbiological sampling has rarely been evaluated and demonstrated in operating theatres. The aim of the present study was to determine whether particle counting could predict microbiological contamination of the air in an operating theatre during 95 surgical arthroplasty procedures. This investigation was carried out over a period of three months in 2010 in an orthopedic operating theatre devoted exclusively to prosthetic surgery. During each procedure, the bacterial contamination of the air was determined by means of active sampling; at the same time, airborne particulate contamination was assessed throughout the entire procedure. On considering the total number of surgical operations, the mean value of the total bacterial load in the center of the operating theatre proved to be 35 CFU/m3; the mean particle count was 4,194,569 no./m3 for particles of diameter ≥0.5 µm and 13,519 no./m3 for particles of diameter ≥5 µm. No significant differences emerged between the median values of the airborne microbial load recorded during the two types of procedure monitored. Particulates with a diameter of ≥0.5 µm were detected in statistically higher concentrations (p<0.001) during knee-replacement procedures. By contrast, particulates with a diameter of ≥5 µm displayed a statistically higher concentration during hip-replacement procedures (p<0.05). The results did not reveal any statistically significant correlation between microbial loads and particle counts for either of the particle diameters considered (≥0.5 µm and ≥5 µm). 
Consequently, microbiological monitoring remains the most suitable method of evaluating the quality of air in operating theatres. PMID:23285189
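
    The study's final step, correlating microbial loads with particle counts, amounts to a Pearson correlation over paired samples. A minimal sketch on hypothetical paired data (not the study's measurements):

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# hypothetical paired measurements: particle counts (no./m3) vs. CFU/m3
particles = [3.9e6, 4.2e6, 4.5e6, 4.0e6, 4.4e6]
cfu = [30, 42, 28, 38, 33]
print(round(pearson_r(particles, cfu), 3))
```

    A value near zero, as found in the study, means particle counts carry little linear information about microbial load.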

  13. Can particulate air sampling predict microbial load in operating theatres for arthroplasty?

    PubMed

    Cristina, Maria Luisa; Spagnolo, Anna Maria; Sartini, Marina; Panatto, Donatella; Gasparini, Roberto; Orlando, Paolo; Ottria, Gianluca; Perdelli, Fernanda

    2012-01-01

    Several studies have proposed that the microbiological quality of the air in operating theatres be indirectly evaluated by means of particle counting, a technique derived from industrial clean-room technology standards, using airborne particle concentration as an index of microbial contamination. However, the relationship between particle counting and microbiological sampling has rarely been evaluated and demonstrated in operating theatres. The aim of the present study was to determine whether particle counting could predict microbiological contamination of the air in an operating theatre during 95 surgical arthroplasty procedures. This investigation was carried out over a period of three months in 2010 in an orthopedic operating theatre devoted exclusively to prosthetic surgery. During each procedure, the bacterial contamination of the air was determined by means of active sampling; at the same time, airborne particulate contamination was assessed throughout the entire procedure. On considering the total number of surgical operations, the mean value of the total bacterial load in the center of the operating theatre proved to be 35 CFU/m3; the mean particle count was 4,194,569 no./m3 for particles of diameter ≥0.5 µm and 13,519 no./m3 for particles of diameter ≥5 µm. No significant differences emerged between the median values of the airborne microbial load recorded during the two types of procedure monitored. Particulates with a diameter of ≥0.5 µm were detected in statistically higher concentrations (p<0.001) during knee-replacement procedures. By contrast, particulates with a diameter of ≥5 µm displayed a statistically higher concentration during hip-replacement procedures (p<0.05). The results did not reveal any statistically significant correlation between microbial loads and particle counts for either of the particle diameters considered (≥0.5 µm and ≥5 µm). 
Consequently, microbiological monitoring remains the most suitable method of evaluating the quality of air in operating theatres.

  14. A survey of office visits for actinic keratosis as reported by NAMCS, 1990-1999. National Ambulatory Medical Care Survey.

    PubMed

    Gupta, Aditya K; Cooper, Elizabeth A; Feldman, Steven R; Fleischer, Alan B

    2002-08-01

    Although actinic keratosis (AK) has been linked to the development of nonmelanoma skin cancer (NMSC), particularly squamous cell carcinoma (SCC), increased awareness regarding diagnosis and treatment may be an important component for reducing morbidity and even mortality from AK and NMSC. We used the National Ambulatory Medical Care Survey (NAMCS) data from 1990 to 1999 to evaluate the diagnosis and treatment of AKs among a wide variety of patients by physicians across the United States. To our knowledge, no widespread surveys of North American populations have been performed recently to determine the epidemiology of AK. AK was diagnosed in more than 47 million visits over the 10-year period surveyed and was found to occur in 14% of patients visiting dermatologists. The diagnosis of AK as determined by NAMCS does not reflect the true prevalence of AK because only patients seeking physician diagnosis were surveyed. This suggests that the actual number of patients in the United States with AK is much higher than 14%. Rates of AK diagnosis in the standard metropolitan statistical areas (SMSAs) and non-standard metropolitan statistical areas (non-SMSAs) of the West states are higher than in other states, but geographic location may not be a direct risk factor for the development of AKs. Procedures were undertaken at 70% of visits where AK was the primary diagnosis. Destruction of lesions was the most frequently performed procedure found in the survey considering only the 1993 and 1994 NAMCS data. Biopsy was the second most frequently performed procedure.

  15. Knowledge, Attitude and Practice of Healthcare Managers to Medical Waste Management and Occupational Safety Practices: Findings from Southeast Nigeria.

    PubMed

    Anozie, Okechukwu Bonaventure; Lawani, Lucky Osaheni; Eze, Justus Ndulue; Mamah, Emmanuel Johnbosco; Onoh, Robinson Chukwudi; Ogah, Emeka Onwe; Umezurike, Daniel Akuma; Anozie, Rita Onyinyechi

    2017-03-01

    Awareness of appropriate waste management procedures and occupational safety measures is fundamental to achieving a safe work environment, and ensuring patient and staff safety. This study was conducted to assess the attitude of healthcare managers to medical waste management and occupational safety practices. This was a cross-sectional study conducted among 54 hospital administrators in Ebonyi state. Semi-structured questionnaires were used for qualitative data collection and analyzed with SPSS Statistics for Windows, version 20.0 (Armonk, NY: IBM Corp, 2011). Two-fifths (40%) of healthcare managers had received training on medical waste management and occupational safety. A standard operating procedure for waste disposal was practiced by only one hospital (1.9%), while 98.1% (53/54) practiced indiscriminate waste disposal. Injection safety boxes were widely available in all health facilities; nevertheless, incineration and waste treatment were practiced by only one facility (1.9%; 1/54). Furthermore, 40.7% (22/54) and 59.3% (32/54) of respondents trained their staff and organized safety orientation courses, respectively. Staff insurance cover was offered by just one hospital (1.9%), while none of the hospitals had a compensation package for victims of occupational hazards. Over half (55.6%; 30/54) of the respondents provided both personal protective equipment and post-exposure prophylaxis for HIV. There was a high level of non-compliance with standard medical waste management procedures, and a lack of training on occupational safety measures. Relevant regulating agencies should step up efforts at monitoring and regulation of healthcare activities and ensure staff training on safe handling and disposal of hospital waste.

  16. Quantification of integrated HIV DNA by repetitive-sampling Alu-HIV PCR on the basis of Poisson statistics.

    PubMed

    De Spiegelaere, Ward; Malatinkova, Eva; Lynch, Lindsay; Van Nieuwerburgh, Filip; Messiaen, Peter; O'Doherty, Una; Vandekerckhove, Linos

    2014-06-01

    Quantification of integrated proviral HIV DNA by repetitive-sampling Alu-HIV PCR is a candidate virological tool to monitor the HIV reservoir in patients. However, the experimental procedures and data analysis of the assay are complex and hinder its widespread use. Here, we provide an improved and simplified data analysis method by adopting binomial and Poisson statistics. A modified analysis method on the basis of Poisson statistics was used to analyze the binomial data of positive and negative reactions from a 42-replicate Alu-HIV PCR by use of dilutions of an integration standard and on samples of 57 HIV-infected patients. Results were compared with the quantitative output of the previously described Alu-HIV PCR method. Poisson-based quantification of the Alu-HIV PCR was linearly correlated with the standard dilution series, indicating that absolute quantification with the Poisson method is a valid alternative for data analysis of repetitive-sampling Alu-HIV PCR data. Quantitative outputs of patient samples assessed by the Poisson method correlated with the previously described Alu-HIV PCR analysis, indicating that this method is a valid alternative for quantifying integrated HIV DNA. Poisson-based analysis of the Alu-HIV PCR data enables absolute quantification without the need of a standard dilution curve. Implementation of the CI estimation permits improved qualitative analysis of the data and provides a statistical basis for the required minimal number of technical replicates. © 2014 The American Association for Clinical Chemistry.
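
    The core of the Poisson-based analysis can be illustrated compactly: in a limiting-dilution style assay, if template copies are Poisson-distributed across reactions, the fraction of negative replicates estimates e^(-λ), so the mean number of templates per reaction is recovered as -ln(fraction negative). A sketch assuming 42 replicates as in the assay described; the counts are hypothetical:

```python
import math

def copies_per_reaction(n_negative, n_total):
    """Poisson (limiting-dilution) estimate of the mean number of
    templates per reaction from the fraction of negative replicates."""
    return -math.log(n_negative / n_total)

# e.g. 14 negative reactions out of 42 replicates
print(round(copies_per_reaction(14, 42), 3))  # ≈ 1.099
```

    Because the estimate comes directly from the Poisson model, no standard dilution curve is needed, which is the simplification the paper emphasizes.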

  17. The efficacy of problem-solving treatments after deliberate self-harm: meta-analysis of randomized controlled trials with respect to depression, hopelessness and improvement in problems.

    PubMed

    Townsend, E; Hawton, K; Altman, D G; Arensman, E; Gunnell, D; Hazell, P; House, A; Van Heeringen, K

    2001-08-01

    Brief problem-solving therapy is regarded as a pragmatic treatment for deliberate self-harm (DSH) patients. A recent meta-analysis of randomized controlled trials (RCTs) evaluating this approach indicated a trend towards reduced repetition of DSH but the pooled odds ratio was not statistically significant. We have now examined other important outcomes using this procedure, namely depression, hopelessness and improvement in problems. Six trials in which problem-solving therapy was compared with control treatment were identified from an extensive literature review of RCTs of treatments for DSH patients. Data concerning depression, hopelessness and improvement in problems were extracted. Where relevant statistical data (e.g. standard deviations) were missing these were imputed using various statistical methods. Results were pooled using meta-analytical procedures. At follow-up, patients who were offered problem-solving therapy had significantly greater improvement in scores for depression (standardized mean difference = -0.36; 95% CI -0.61 to -0.11) and hopelessness (weighted mean difference = -3.2; 95% CI -4.0 to -2.41), and significantly more reported improvement in their problems (odds ratio = 2.31; 95% CI 1.29 to 4.13), than patients who were in the control treatment groups. Problem-solving therapy for DSH patients appears to produce better results than control treatment with regard to improvement in depression, hopelessness and problems. It is desirable that this finding is confirmed in a large trial, which will also allow adequate testing of the impact of this treatment on repetition of DSH.
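
    The pooling step described above is standard inverse-variance meta-analysis. A minimal fixed-effect sketch on hypothetical per-trial standardized mean differences and standard errors (the paper's actual trials and weighting may differ):

```python
import math

def pool_fixed_effect(estimates, std_errors):
    """Inverse-variance (fixed-effect) pooled estimate with a 95% CI."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

# hypothetical per-trial SMDs for depression with their standard errors
smds = [-0.30, -0.45, -0.25]
ses = [0.20, 0.25, 0.30]
est, ci = pool_fixed_effect(smds, ses)
print(round(est, 3), (round(ci[0], 3), round(ci[1], 3)))
```

    Precise trials (small standard errors) dominate the pooled estimate, which is why imputing missing standard deviations, as the authors did, matters for the result.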

  18. Precipitation projections under GCMs perspective and Turkish Water Foundation (TWF) statistical downscaling model procedures

    NASA Astrophysics Data System (ADS)

    Dabanlı, İsmail; Şen, Zekai

    2018-04-01

    The statistical climate downscaling model of the Turkish Water Foundation (TWF) is further developed and applied to a set of monthly precipitation records. The model is structured in two phases, spatial (regional) and temporal downscaling of global circulation model (GCM) scenarios. The TWF model takes into consideration the regional dependence function (RDF) for spatial structure and the Markov whitening process (MWP) for temporal characteristics of the records to set projections. The impact of climate change on monthly precipitation is studied by downscaling Intergovernmental Panel on Climate Change-Special Report on Emission Scenarios (IPCC-SRES) A2 and B2 emission scenarios from the Max Planck Institute (EH40PYC) and the Hadley Centre (HadCM3). The main purposes are to explain the TWF statistical climate downscaling model procedures and to present the validation tests, which are rated as "very good" for all stations except one (Suhut) in the Akarcay basin, in the west-central part of Turkey. Even though the validation score is slightly lower at the Suhut station, the results there are still "satisfactory." It is, therefore, possible to say that the TWF model has reasonably good estimation skill with respect to the standard deviation ratio (SDR), Nash-Sutcliffe efficiency (NSE), and percent bias (PBIAS) criteria. Based on the validated model, precipitation predictions are generated from 2011 to 2100 using a 30-year reference observation period (1981-2010). Precipitation arithmetic average and standard deviation have less than 5% error for the EH40PYC and HadCM3 SRES (A2 and B2) scenarios.
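
    The three validation criteria named above (SDR, NSE, PBIAS) are straightforward to compute. A sketch on hypothetical observed/simulated precipitation series; note that the sign convention for PBIAS varies between authors:

```python
import math

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means no
    better than always predicting the observed mean."""
    mo = sum(obs) / len(obs)
    return 1 - sum((o - s) ** 2 for o, s in zip(obs, sim)) / \
        sum((o - mo) ** 2 for o in obs)

def pbias(obs, sim):
    """Percent bias of simulations relative to observations."""
    return 100.0 * sum(s - o for o, s in zip(obs, sim)) / sum(obs)

def sdr(obs, sim):
    """Ratio of simulated to observed (population) standard deviation."""
    def sd(x):
        m = sum(x) / len(x)
        return math.sqrt(sum((v - m) ** 2 for v in x) / len(x))
    return sd(sim) / sd(obs)

# hypothetical monthly precipitation (mm): observed vs. downscaled
obs = [10.0, 20.0, 30.0, 40.0]
sim = [12.0, 18.0, 33.0, 37.0]
print(round(nse(obs, sim), 3), round(pbias(obs, sim), 3), round(sdr(obs, sim), 3))
```

    NSE near 1, PBIAS near 0, and SDR near 1 together correspond to the "very good" ratings reported for the validated stations.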

  19. When Is Evidence Enough Evidence? A Systematic Review and Meta-Analysis of the Trabectome as a Solo Procedure in Patients with Primary Open-Angle Glaucoma

    PubMed Central

    Chow, Jeffrey T. Y.; Hutnik, Cindy M. L.; Solo, Karla

    2017-01-01

    The purpose of this systematic review and meta-analysis was to examine the availability of evidence for one of the earliest available minimally invasive glaucoma surgery (MIGS) procedures, the Trabectome. Various databases were searched up to December 20, 2016, for any published studies assessing the use of the Trabectome as a solo procedure in patients with primary open-angle glaucoma (POAG). The standardized mean differences (SMD) were calculated for the change in intraocular pressure (IOP) and number of glaucoma medications used at 1-month, 6-month, and 12-month follow-up. After screening, three studies and one abstract with analyzable data were included. The meta-analysis showed statistically significant reductions in IOP and number of glaucoma medications used at all time points. Though the Trabectome as a solo procedure appears to lower IOP and reduces the number of glaucoma medications, more high-quality studies are required to draw definitive conclusions. The difficulty of obtaining evidence may be one of the many obstacles that limit a full understanding of the potential safety and/or efficacy benefits compared to standard treatments. The time has come for a thoughtful and integrated approach with stakeholders to determine optimal access to care strategies for our patients. PMID:28740733

  20. When Is Evidence Enough Evidence? A Systematic Review and Meta-Analysis of the Trabectome as a Solo Procedure in Patients with Primary Open-Angle Glaucoma.

    PubMed

    Chow, Jeffrey T Y; Hutnik, Cindy M L; Solo, Karla; Malvankar-Mehta, Monali S

    2017-01-01

    The purpose of this systematic review and meta-analysis was to examine the availability of evidence for one of the earliest available minimally invasive glaucoma surgery (MIGS) procedures, the Trabectome. Various databases were searched up to December 20, 2016, for any published studies assessing the use of the Trabectome as a solo procedure in patients with primary open-angle glaucoma (POAG). The standardized mean differences (SMD) were calculated for the change in intraocular pressure (IOP) and number of glaucoma medications used at 1-month, 6-month, and 12-month follow-up. After screening, three studies and one abstract with analyzable data were included. The meta-analysis showed statistically significant reductions in IOP and number of glaucoma medications used at all time points. Though the Trabectome as a solo procedure appears to lower IOP and reduces the number of glaucoma medications, more high-quality studies are required to draw definitive conclusions. The difficulty of obtaining evidence may be one of the many obstacles that limit a full understanding of the potential safety and/or efficacy benefits compared to standard treatments. The time has come for a thoughtful and integrated approach with stakeholders to determine optimal access to care strategies for our patients.

  1. Battery Calendar Life Estimator Manual Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jon P. Christophersen; Ira Bloom; Ed Thomas

    2012-10-01

    The Battery Life Estimator (BLE) Manual has been prepared to assist developers in their efforts to estimate the calendar life of advanced batteries for automotive applications. Testing requirements and procedures are defined by the various manuals previously published under the United States Advanced Battery Consortium (USABC). The purpose of this manual is to describe and standardize a method for estimating calendar life based on statistical models and degradation data acquired from typical USABC battery testing.
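
    As an illustration of the kind of statistical degradation modeling the manual standardizes (this is a generic sketch, not the BLE manual's actual model), one can fit a fade-versus-square-root-of-time law by least squares and extrapolate to an end-of-life threshold; the data and the sqrt-t fade assumption are hypothetical:

```python
import math

def fit_linear(x, y):
    """Ordinary least squares fit for y ≈ a + b·x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# hypothetical calendar-life test data: capacity fade (%) vs. weeks on test,
# assuming fade grows with the square root of time
weeks = [4.0, 12.0, 24.0, 48.0]
fade_pct = [2.1, 3.4, 4.9, 7.0]
a, b = fit_linear([math.sqrt(w) for w in weeks], fade_pct)

# extrapolated weeks until fade reaches a 20% end-of-life threshold
t_eol = ((20.0 - a) / b) ** 2
print(round(t_eol, 1))
```

    In practice the manual's procedure also quantifies the uncertainty of such extrapolations, which is the hard part of calendar-life estimation.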

  2. Battery Life Estimator Manual Linear Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jon P. Christophersen; Ira Bloom; Ed Thomas

    2009-08-01

    The Battery Life Estimator (BLE) Manual has been prepared to assist developers in their efforts to estimate the calendar life of advanced batteries for automotive applications. Testing requirements and procedures are defined by the various manuals previously published under the United States Advanced Battery Consortium (USABC). The purpose of this manual is to describe and standardize a method for estimating calendar life based on statistical models and degradation data acquired from typical USABC battery testing.

  3. 28 CFR Appendix D to Part 61 - Office of Justice Assistance, Research, and Statistics Procedures Relating to the Implementation...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., and Statistics Procedures Relating to the Implementation of the National Environmental Policy Act D... Assistance, Research, and Statistics Procedures Relating to the Implementation of the National Environmental... Statistics (OJARS) assists State and local units of government in strengthening and improving law enforcement...

  4. 28 CFR Appendix D to Part 61 - Office of Justice Assistance, Research, and Statistics Procedures Relating to the Implementation...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., and Statistics Procedures Relating to the Implementation of the National Environmental Policy Act D... Assistance, Research, and Statistics Procedures Relating to the Implementation of the National Environmental... Statistics (OJARS) assists State and local units of government in strengthening and improving law enforcement...

  5. Evidence-based orthodontics. Current statistical trends in published articles in one journal.

    PubMed

    Law, Scott V; Chudasama, Dipak N; Rinchuse, Donald J

    2010-09-01

    To ascertain the number, type, and overall usage of statistics in American Journal of Orthodontics and Dentofacial Orthopedics (AJODO) articles for 2008. These data were then compared to data from three previous years: 1975, 1985, and 2003. The original articles in the AJODO for 2008 were dichotomized into those using statistics and those not using statistics. Statistical procedures were then broadly divided into descriptive statistics (mean, standard deviation, range, percentage) and inferential statistics (t-test, analysis of variance). Descriptive statistics were used to make comparisons. In 1975, 1985, 2003, and 2008, the AJODO published 72, 87, 134, and 141 original articles, respectively. The percentage of original articles using statistics was 43.1% in 1975, 75.9% in 1985, 94.0% in 2003, and 92.9% in 2008; the proportion of original articles using statistics stayed relatively constant from 2003 to 2008, with only a small 1.1% decrease. The percentage of articles using inferential statistical analyses was 23.7% in 1975, 74.2% in 1985, 92.9% in 2003, and 84.4% in 2008. Comparing AJODO publications in 2003 and 2008, there was an 8.5% increase in articles using only descriptive statistics (from 7.1% to 15.6%), and a corresponding 8.5% decrease in articles using inferential statistics (from 92.9% to 84.4%).

  6. Achieving new levels of recall in consent to research by combining remedial and motivational techniques.

    PubMed

    Festinger, David S; Dugosh, Karen L; Marlowe, Douglas B; Clements, Nicolle T

    2014-04-01

    Research supports the efficacy of both a remedial consent procedure (corrected feedback (CF)) and a motivational consent procedure (incentives) for improving recall of informed consent to research. Although these strategies were statistically superior to standard consent, effects were modest and not clinically significant. This study examines a combined incentivised consent and CF procedure that simplifies the cognitive task and increases motivation to learn consent information. We randomly assigned 104 individuals consenting to an unrelated host study to a consent as usual (CAU) condition (n=52) or an incentivised CF (ICF) condition (n=52). All participants were told they would be quizzed on their consent recall following their baseline assessment and at 4 monthly follow-ups. ICF participants were also informed that they would earn $5 for each correct answer and receive CF as needed. Quiz scores in the two conditions did not differ at the first administration (p=0.39, d=0.2); however, ICF scores were significantly higher at each subsequent administration (second: p=0.003, Cohen's d=0.6; third: p<0.0001, d=1.4; fourth: p<0.0001, d=1.6; fifth: p<0.0001, d=1.8). The ICF procedure increased consent recall from 72% to 83%, compared with the CAU condition in which recall decreased from 69% to 59%. This supports the statistical and clinical utility of a combined remedial and motivational consent procedure for enhancing recall of study information and human research protections.
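
    The effect sizes reported above are Cohen's d values. A minimal sketch of the pooled-standard-deviation form of Cohen's d on hypothetical quiz scores (not the study's data):

```python
import math

def cohens_d(group1, group2):
    """Cohen's d for two independent groups, using the pooled
    sample standard deviation."""
    n1, n2 = len(group1), len(group2)
    m1, m2 = sum(group1) / n1, sum(group2) / n2
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    sp = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / sp

# hypothetical consent-quiz scores (% correct): ICF vs. CAU condition
icf = [85.0, 80.0, 90.0, 78.0, 88.0]
cau = [62.0, 55.0, 70.0, 58.0, 65.0]
print(round(cohens_d(icf, cau), 2))
```

    By the usual rule of thumb, d ≈ 0.2 is a small effect and d ≥ 0.8 a large one, which puts the study's later follow-ups (d = 1.4 to 1.8) firmly in the large range.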

  7. Correlation of a Novel Noninvasive Tissue Oxygen Saturation Monitor to Serum Central Venous Oxygen Saturation in Pediatric Patients with Postoperative Congenital Cyanotic Heart Disease

    PubMed Central

    Yadlapati, Ajay; Grogan, Tristan; Elashoff, David; Kelly, Robert B.

    2013-01-01

    Using a novel noninvasive, visible-light optical diffusion oximeter (T-Stat VLS Tissue Oximeter; Spectros Corporation, Portola Valley, CA) to measure the tissue oxygen saturation (StO2) of the buccal mucosa, the correlation between StO2 and central venous oxygen saturation (ScvO2) was examined in children with congenital cyanotic heart disease undergoing a cardiac surgical procedure. Paired StO2 and serum ScvO2 measurements were obtained postoperatively and statistically analyzed for agreement and association. Thirteen children (nine male) participated in the study (age range, 4 days to 18 months). Surgeries included Glenn shunt procedures, Norwood procedures, unifocalization procedures with Blalock-Taussig shunt placement, a Kawashima/Glenn shunt procedure, a Blalock-Taussig shunt placement, and a modified Norwood procedure. A total of 45 paired StO2-ScvO2 measurements were obtained. Linear regression demonstrated a Pearson's correlation of .58 (95% confidence interval [CI], .35–.75; p < .0001). The regression slope coefficient estimate was .95 (95% CI, .54–1.36) with an intraclass correlation coefficient of .48 (95% CI, .22–.68). Below a clinically relevant average ScvO2 value, a receiver operator characteristic analysis yielded an area under the curve of .78. Statistical methods to control for repeatedly measuring the same subjects produced similar results. This study shows a moderate relationship and agreement between StO2 and ScvO2 measurements in pediatric patients with a history of congenital cyanotic heart disease undergoing a cardiac surgical procedure. This real-time monitoring device can act as a valuable adjunct to standard noninvasive monitoring in which serum ScvO2 sampling currently assists in the diagnosis of low cardiac output after pediatric cardiac surgery. PMID:23691783

  8. Regression assumptions in clinical psychology research practice-a systematic review of common misconceptions.

    PubMed

    Ernst, Anja F; Albers, Casper J

    2017-01-01

    Misconceptions about the assumptions behind the standard linear regression model are widespread and dangerous. These lead to using linear regression when inappropriate, and to employing alternative procedures with less statistical power when unnecessary. Our systematic literature review investigated the employment and reporting of assumption checks in twelve clinical psychology journals. Findings indicate that normality of the variables themselves, rather than of the errors, was wrongly held to be a necessary assumption in 4% of papers that use regression. Furthermore, 92% of all papers using linear regression were unclear about their assumption checks, violating APA recommendations. This paper appeals for heightened awareness of, and increased transparency in, the reporting of statistical assumption checking.
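
    The misconception described above is easy to demonstrate: the normality assumption concerns the regression errors, so it is the residuals, not the raw variables, that should be inspected. A sketch using simple OLS and a moment-based skewness check on toy data (the data and the skewness cutoff are illustrative only):

```python
import math

def ols_residuals(x, y):
    """Residuals from a simple OLS fit y ≈ a + b·x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return [yi - (a + b * xi) for xi, yi in zip(x, y)]

def skewness(r):
    """Sample skewness; near 0 is consistent with symmetric errors."""
    n = len(r)
    m = sum(r) / n
    s = math.sqrt(sum((v - m) ** 2 for v in r) / n)
    return sum(((v - m) / s) ** 3 for v in r) / n

# toy data: check the residuals, not x or y themselves
resid = ols_residuals([0.0, 1.0, 2.0, 3.0], [0.0, 1.0, 1.0, 2.0])
print(round(skewness(resid), 6))  # ≈ 0: symmetric residuals
```

    A heavily skewed predictor is perfectly fine for OLS as long as the residuals behave; conflating the two is exactly the error the review documents.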

  9. Regression assumptions in clinical psychology research practice—a systematic review of common misconceptions

    PubMed Central

    Ernst, Anja F.

    2017-01-01

    Misconceptions about the assumptions behind the standard linear regression model are widespread and dangerous. These lead to using linear regression when inappropriate, and to employing alternative procedures with less statistical power when unnecessary. Our systematic literature review investigated the employment and reporting of assumption checks in twelve clinical psychology journals. Findings indicate that normality of the variables themselves, rather than of the errors, was wrongly held to be a necessary assumption in 4% of papers that use regression. Furthermore, 92% of all papers using linear regression were unclear about their assumption checks, violating APA recommendations. This paper appeals for heightened awareness of, and increased transparency in, the reporting of statistical assumption checking. PMID:28533971

  10. Automated acid and base number determination of mineral-based lubricants by fourier transform infrared spectroscopy: commercial laboratory evaluation.

    PubMed

    Winterfield, Craig; van de Voort, F R

    2014-12-01

    The Fluid Life Corporation assessed and implemented Fourier transform infrared spectroscopy (FTIR)-based methods using American Society for Testing and Materials (ASTM)-like stoichiometric reactions for determination of acid and base number for in-service mineral-based oils. The basic protocols, quality control procedures, calibration, validation, and performance of these new quantitative methods are assessed. ASTM correspondence is attained using a mixed-mode calibration, using primary reference standards to anchor the calibration, supplemented by representative sample lubricants analyzed by ASTM procedures. A partial least squares calibration is devised by combining primary acid/base reference standards and representative samples, focusing on the main spectral stoichiometric response with chemometrics assisting in accounting for matrix variability. FTIR(AN/BN) methodology is precise, accurate, and free of most interference that affects ASTM D664 and D4739 results. Extensive side-by-side operational runs produced normally distributed differences with mean differences close to zero and standard deviations of 0.18 and 0.26 mg KOH/g, respectively. Statistically, the FTIR methods are a direct match to the ASTM methods, with superior performance in terms of analytical throughput, preparation time, and solvent use. FTIR(AN/BN) analysis is a viable, significant advance for in-service lubricant analysis, providing an economic means of trending samples instead of tedious and expensive conventional ASTM(AN/BN) procedures. © 2014 Society for Laboratory Automation and Screening.
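
    The side-by-side comparison described above (mean difference near zero, with a known SD of the differences) is a standard method-agreement summary. A generic sketch on hypothetical paired acid-number readings (not Fluid Life's data):

```python
import math

def agreement_stats(method_a, method_b):
    """Mean difference and sample SD of paired differences
    between two measurement methods."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    sd_d = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (n - 1))
    return mean_d, sd_d

# hypothetical acid numbers (mg KOH/g): FTIR method vs. ASTM D664
ftir = [1.20, 0.85, 2.10, 1.55]
astm = [1.10, 0.90, 2.25, 1.50]
mean_d, sd_d = agreement_stats(ftir, astm)
print(round(mean_d, 3), round(sd_d, 3))
```

    A mean difference near zero with a small SD, as the laboratory reported (SDs of 0.18 and 0.26 mg KOH/g), is what justifies treating the two methods as interchangeable for trending.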

  11. From plastic to gold: a unified classification scheme for reference standards in medical image processing

    NASA Astrophysics Data System (ADS)

    Lehmann, Thomas M.

    2002-05-01

    Reliable evaluation of medical image processing is of major importance for routine applications. Nonetheless, evaluation is often omitted or methodically defective when novel approaches or algorithms are introduced. Adopted from medical diagnosis, we define the following criteria to classify reference standards: 1. Reliance, if the generation or capturing of test images for evaluation follows an exactly determined and reproducible protocol. 2. Equivalence, if the image material or relationships considered within an algorithmic reference standard equal real-life data with respect to structure, noise, or other parameters of importance. 3. Independence, if any reference standard relies on a different procedure than that to be evaluated, or on other images or image modalities than those used routinely. This criterion bans the simultaneous use of one image for both the training and the test phase. 4. Relevance, if the algorithm to be evaluated is self-reproducible. If random parameters or optimization strategies are applied, the reliability of the algorithm must be shown before the reference standard is applied for evaluation. 5. Significance, if the number of reference standard images used for evaluation is sufficiently large to enable statistically sound analysis. We demand that a true gold standard satisfy Criteria 1 to 3. Any standard satisfying only two criteria, i.e., Criterion 1 and Criterion 2 or Criterion 1 and Criterion 3, is referred to as a silver standard. All other standards are termed plastic. Before exhaustive evaluation based on gold or silver standards is performed, its relevance must be shown (Criterion 4) and sufficient tests must be carried out to ground the statistical analysis (Criterion 5). In this paper, examples are given for each class of reference standards.

  12. 75 FR 38871 - Proposed Collection; Comment Request for Revenue Procedure 2004-29

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-06

    ... comments concerning Revenue Procedure 2004-29, Statistical Sampling in Sec. 274 Context. DATES: Written... Internet, at [email protected] . SUPPLEMENTARY INFORMATION: Title: Statistical Sampling in Sec...: Revenue Procedure 2004-29 prescribes the statistical sampling methodology by which taxpayers under...

  13. Management of sigmoid volvulus: options and prognosis.

    PubMed

    Maddah, Ghodratollah; Kazemzadeh, Gholam Hossein; Abdollahi, Abbas; Bahar, Mostafa Mehrabi; Tavassoli, Alireza; Shabahang, Hossein

    2014-01-01

    To describe the management of sigmoid volvulus with reference to the type of surgical procedures performed and to determine the prognosis of sigmoid volvulus. A case series. Ghaem Hospital, Mashhad University of Medical Sciences, Mashhad, Iran, from 1996 to 2008. A total of 944 cases of colon obstruction were reviewed. Demographic, laboratory and treatment results, mortality and complications were recorded. The data were analyzed using descriptive statistics (frequency and percentage for the qualitative variables; mean and standard deviation for the quantitative variables). Chi-square and Fisher's exact tests were used to assess associations between the qualitative variables. SPSS statistical software (version 18) was used for the data analysis. In all patients except those with symptoms or signs of gangrenous bowel, a long rectal tube was inserted via the rectosigmoidoscope; this was successful in 80 (36.87%) cases. Rectosigmoidoscopic detorsion was unsuccessful in 137 (63.13%) patients, who underwent an emergent laparotomy. The surgical procedures performed in these cases were resection and primary anastomosis in 40 (29.1%), the Mikulicz procedure in 9 (6.6%), laparotomy detorsion in 37 (27.01%), the Hartmann procedure in 47 (34.3%), mesosigmoidoplasty in 3 (2.19%) patients, and total colectomy in one (0.73%) case. The overall mortality was 9.8% (22 patients). In sigmoid volvulus, the most important determinant of patient outcome is bowel viability. The initial treatment of sigmoid colon volvulus is sigmoidoscopy with rectal tube placement.
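    For readers unfamiliar with the tests named above, a minimal sketch of how a chi-square test and Fisher's exact test assess an association between two qualitative variables; the 2x2 counts here are hypothetical, not the study's data:

```python
from scipy.stats import chi2_contingency, fisher_exact

# hypothetical 2x2 table: rows = outcome (success, failure),
# columns = two patient groups
table = [[40, 20],
         [15, 25]]

# chi-square test of independence (Yates-corrected by default for 2x2)
chi2, p_chi2, dof, expected = chi2_contingency(table)
# Fisher's exact test, preferred when expected cell counts are small
odds_ratio, p_fisher = fisher_exact(table)

print(f"chi-square = {chi2:.2f} (dof = {dof}), p = {p_chi2:.4f}")
print(f"Fisher's exact: OR = {odds_ratio:.2f}, p = {p_fisher:.4f}")
```

    In practice Fisher's exact test is reported when any expected cell count falls below about 5, which is the usual reason a study quotes both tests.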

  14. Reporting Practices and Use of Quantitative Methods in Canadian Journal Articles in Psychology.

    PubMed

    Counsell, Alyssa; Harlow, Lisa L

    2017-05-01

    With recent focus on the state of research in psychology, it is essential to assess the nature of the statistical methods and analyses used and reported by psychological researchers. To that end, we investigated the prevalence of different statistical procedures and the nature of statistical reporting practices in recent articles from the four major Canadian psychology journals. The majority of authors evaluated their research hypotheses through analysis of variance (ANOVA), t-tests, and multiple regression. Multivariate approaches were less common. Null hypothesis significance testing remains a popular strategy, but the majority of authors reported a standardized or unstandardized effect size measure alongside their significance test results. Confidence intervals on effect sizes were infrequently employed. Many authors provided minimal details about their statistical analyses, and fewer than a third of the articles reported data complications such as missing data and violations of statistical assumptions. Strengths of, and areas needing improvement in, the reporting of quantitative results are highlighted. The paper concludes with recommendations for how researchers and reviewers can improve comprehension and transparency in statistical reporting.
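    The practice the survey endorses, reporting a standardized effect size alongside the significance test, can be sketched as follows. The two samples are synthetic and Cohen's d with a pooled standard deviation is just one common choice of effect size:

```python
import math
from statistics import mean, stdev
from scipy.stats import ttest_ind

# synthetic scores for two independent groups
group_a = [5.1, 4.8, 5.6, 5.0, 5.3, 4.9]
group_b = [4.2, 4.5, 4.0, 4.4, 4.1, 4.6]

# significance test
t_stat, p_value = ttest_ind(group_a, group_b)

# Cohen's d: mean difference scaled by the pooled standard deviation
n_a, n_b = len(group_a), len(group_b)
sp = math.sqrt(((n_a - 1) * stdev(group_a) ** 2 +
                (n_b - 1) * stdev(group_b) ** 2) / (n_a + n_b - 2))
cohens_d = (mean(group_a) - mean(group_b)) / sp

print(f"t = {t_stat:.2f}, p = {p_value:.4f}, d = {cohens_d:.2f}")
```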

  15. [What is the methodological quality of articles on therapeutic procedures published in Cirugía Española?].

    PubMed

    Manterola, Carlos; Busquets, Juli; Pascual, Marta; Grande, Luis

    2006-02-01

    The aim of this study was to determine the methodological quality of articles on therapeutic procedures published in Cirugía Española and to study its association with publication year, center, and subject matter. A bibliometric study was performed that included all articles on therapeutic procedures published in Cirugía Española between 2001 and 2004. All kinds of clinical designs were considered, excluding editorials, review articles, letters to the editor, and experimental studies. The variables analyzed were: year of publication, center, design, and methodological quality. Methodological quality was determined by a valid and reliable scale. Descriptive statistics (means, standard deviations, and medians) and analytical statistics (Pearson's chi-squared, nonparametric tests, ANOVA, and Bonferroni tests) were used. A total of 244 articles were studied (197 case series [81%], 28 cohort studies [12%], 17 clinical trials [7%], 1 cross-sectional study and 1 case-control study [0.8%]). The studies were performed mainly in Catalonia and Murcia (22% and 16%, respectively). The most frequent subject areas were soft tissue and hepatobiliopancreatic surgery (23% and 19%, respectively). The mean and median of the methodological quality score for the entire series were 10.2 ± 3.9 points and 9.5 points, respectively. Methodological quality increased significantly with publication year (p < 0.001). An association between methodological quality and subject area was observed, but no association was detected with the center performing the study. The methodological quality of articles on therapeutic procedures published in Cirugía Española between 2001 and 2004 was low. However, a statistically significant trend toward improvement was observed.

  16. Obtaining accreditation by the pharmacy compounding accreditation board, part 2: developing essential standard operating procedures.

    PubMed

    Cabaleiro, Joe

    2007-01-01

    A key component of qualifying for accreditation with the Pharmacy Compounding Accreditation Board is having a set of comprehensive standard operating procedures that are being used by the pharmacy staff. The three criteria the Pharmacy Compounding Accreditation Board looks for in standard operating procedures are: (1) written standard operating procedures; (2) standard operating procedures that reflect what the organization actually does; and (3) evidence that the written standard operating procedures are implemented. Following specified steps in the preparation of standard operating procedures will result in procedures that meet Pharmacy Compounding Accreditation Board requirements, thereby placing pharmacies one step closer to qualifying for accreditation.

  17. Effect of Various Finishing Procedures on the Reflectivity (Shine) of Tooth Enamel - An In-vitro Study.

    PubMed

    Patil, Harshal Ashok; Chitko, Shrikant Shrinivas; Kerudi, Veerendra Virupaxappa; Maheshwari, Amit Ratanlal; Patil, Neeraj Suresh; Tekale, Pawankumar Dnyandeo; Gore, Ketan Ashorao; Zope, Amit Ashok

    2016-08-01

    Reflectivity is a good indicator of surface finish. Because patients judge finishing by its gloss/reflectivity/shine, this study used a custom-made reflectometer to evaluate changes in surface finish. The aim was to study the effect of various procedures during orthodontic treatment on the shine of enamel. Sixty-one extracted premolars were collected and each tooth was mounted on an acrylic block. The reflectivity of the teeth was measured against a standard before any procedure; one tooth was kept as the standard throughout the study. Sixty teeth were acid etched, and reflectivity was measured on the custom-made reflectometer and recorded. The same procedure was repeated after debonding. The 60 samples were then divided into three groups according to the finishing method used after debonding (Group 1: tungsten carbide bur; Group 2: Astropol; Group 3: Sof-Lex disc), and reflectivity was measured again. The mean reflectivity relative to the standard was 31.4% after acid etching, 45.5% after debonding, 58.3% after tungsten carbide bur finishing (Group 1), 72.8% after Astropol (Group 2), and 84.4% after Sof-Lex disc finishing (Group 3). The difference in reflectivity restored by the three finishing materials was statistically highly significant (p < 0.001); light reflection was best in Group 3, followed by Group 2 and then Group 1. The primary goal was to restore the enamel to its original state after orthodontic treatment; the methods tested in this study could not restore the original enamel reflectivity.
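    The three-group comparison reported above can be sketched with a one-way ANOVA. The per-tooth reflectivities below are synthetic values centred near the reported group means, not the study's data:

```python
from scipy.stats import f_oneway

# synthetic reflectivity (% of standard) for five teeth per group
tungsten = [57.0, 59.1, 58.2, 57.8, 58.9]   # Group 1, near the reported 58.3%
astropol = [72.1, 73.4, 72.6, 73.0, 72.9]   # Group 2, near the reported 72.8%
soflex   = [84.0, 85.1, 84.3, 84.6, 84.2]   # Group 3, near the reported 84.4%

# one-way ANOVA: do the three finishing methods differ in mean reflectivity?
f_stat, p_value = f_oneway(tungsten, astropol, soflex)

print(f"F = {f_stat:.1f}, p = {p_value:.2e}")
```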

  18. Evaluating flow cytometer performance with weighted quadratic least squares analysis of LED and multi-level bead data

    PubMed Central

    Parks, David R.; Khettabi, Faysal El; Chase, Eric; Hoffman, Robert A.; Perfetto, Stephen P.; Spidlen, Josef; Wood, James C.S.; Moore, Wayne A.; Brinkman, Ryan R.

    2017-01-01

    We developed a fully automated procedure for analyzing data from LED pulses and multi-level bead sets to evaluate backgrounds and photoelectron scales of cytometer fluorescence channels. The method improves on previous formulations by fitting a full quadratic model with appropriate weighting and by providing standard errors and peak residuals as well as the fitted parameters themselves. Here we describe the details of the methods and procedures involved and present a set of illustrations and test cases that demonstrate the consistency and reliability of the results. The automated analysis and fitting procedure is generally quite successful in providing good estimates of the Spe (statistical photoelectron) scales and backgrounds for all of the fluorescence channels on instruments with good linearity. The precision of the results obtained from LED data is almost always better than for multi-level bead data, but the bead procedure is easy to carry out and provides results good enough for most purposes. Including standard errors on the fitted parameters is important for understanding the uncertainty in the values of interest. The weighted residuals give information about how well the data fits the model, and particularly high residuals indicate bad data points. Known photoelectron scales and measurement channel backgrounds make it possible to estimate the precision of measurements at different signal levels and the effects of compensated spectral overlap on measurement quality. Combining this information with measurements of standard samples carrying dyes of biological interest, we can make accurate comparisons of dye sensitivity among different instruments. Our method is freely available through the R/Bioconductor package flowQB. PMID:28160404
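    The core fitting step, a weighted quadratic fit of signal variance against mean intensity, can be sketched as follows. This is an illustrative toy fit with synthetic bead-level statistics, not the flowQB implementation:

```python
import numpy as np

# synthetic multi-level bead summary statistics: mean intensity per level
means = np.array([50.0, 200.0, 800.0, 3200.0, 12800.0])

# generate variances from a known model: background b0 plus slope b1 * mean
# (the slope relates to the photoelectron scale; quadratic term set to zero)
true_b0, true_b1 = 400.0, 2.0
variances = true_b0 + true_b1 * means

# weighted quadratic fit; inverse-variance weights are a common heuristic,
# and np.polyfit applies w to the residuals before squaring
weights = 1.0 / variances
b2, b1, b0 = np.polyfit(means, variances, 2, w=weights)

print(f"fitted: b0 = {b0:.1f}, b1 = {b1:.3f}, b2 = {b2:.2e}")
```

    Because the synthetic data are exactly linear, the fit should recover the background and slope and drive the quadratic coefficient to essentially zero; on real cytometer data the residuals and standard errors the paper describes carry the diagnostic information.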

  19. Premedication with N-acetylcysteine and simethicone improves mucosal visualization during gastroscopy: a randomized, controlled, endoscopist-blinded study.

    PubMed

    Neale, James R; James, Shirley; Callaghan, James; Patel, Praful

    2013-07-01

    Diagnostic gastroscopy provides a unique opportunity to diagnose early oesophagogastric neoplasia; however, intraluminal mucus and bile can obscure mucosal visualization. The aim of this study was to determine whether the use of a premedication solution containing the mucolytic agent N-acetylcysteine and the surfactant simethicone improves mucosal visualization within a UK diagnostic gastroscopy service. A total of 75 consecutive patients were recruited from a single endoscopist's (S.J.) diagnostic gastroscopy list. They were randomized into three treatment groups: (a) standard control=clear fluids only for 6 h, nil by mouth for 2 h; (b) water control=standard control+100 ml sterile water (given 20 min before gastroscopy); and (c) solution=standard control+100 ml investigated solution (20 min before gastroscopy). The endoscopist was blinded to patient preparation. Inadequate mucosal visualization was defined as fluid/mucus during gastroscopy that could not be suctioned and required flushing with water. The volume of flush, the site at which it was used and the total procedure times were recorded. The three groups showed no statistically significant differences in age, sex ratio, procedure priority, or indication. The mean volume of flush required to obtain clear mucosa was significantly less in the solution group than in the other groups, as was the mean overall procedure time. Premedication with N-acetylcysteine and simethicone markedly improves mucosal visibility during gastroscopy and reduces procedure time. This low-cost and well-tolerated intervention may improve detection of early neoplasia.

  20. Methods of analysis by the U.S. Geological Survey National Water Quality Laboratory; preparation procedure for aquatic biological material determined for trace metals

    USGS Publications Warehouse

    Hoffman, Gerald L.

    1996-01-01

    A method for the chemical preparation of tissue samples that are subsequently analyzed for 22 trace metals is described. The tissue-preparation procedure was tested with three National Institute of Standards and Technology biological standard reference materials and two National Water Quality Laboratory homogenized biological materials. A low-temperature (85 degrees Celsius) nitric acid digestion followed by the careful addition of hydrogen peroxide (30-percent solution) is used to decompose the biological material. The solutions are evaporated to incipient dryness, reconstituted with 5 percent nitric acid, and filtered. After filtration the solutions were diluted to a known volume and analyzed by inductively coupled plasma-mass spectrometry (ICP-MS), inductively coupled plasma-atomic emission spectrometry (ICP-AES), and cold vapor-atomic absorption spectrophotometry (CV-AAS). Many of the metals were determined by both ICP-MS and ICP-AES. This report does not provide a detailed description of the instrumental procedures and conditions used with the three types of instrumentation for the quantitation of trace metals determined in this study. Statistical data regarding recovery, accuracy, and precision for individual trace metals determined in the biological material tested are summarized.
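    The recovery and precision statistics summarized in the report can be sketched as follows; the certified reference value and replicate measurements below are hypothetical:

```python
from statistics import mean, stdev

# hypothetical certified value for one trace metal in a reference material
certified_value = 10.0                      # ug/g
# hypothetical replicate digestions of that material
replicates = [9.6, 10.1, 9.8, 10.3, 9.9]   # measured ug/g

# recovery: measured mean as a percentage of the certified value
recovery_pct = 100.0 * mean(replicates) / certified_value
# precision: relative standard deviation across replicates
rsd_pct = 100.0 * stdev(replicates) / mean(replicates)

print(f"recovery = {recovery_pct:.1f}%, RSD = {rsd_pct:.1f}%")
```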

  1. A statistical assessment of seismic models of the U.S. continental crust using Bayesian inversion of ambient noise surface wave dispersion data

    NASA Astrophysics Data System (ADS)

    Olugboji, T. M.; Lekic, V.; McDonough, W.

    2017-07-01

    We present a new approach for evaluating existing crustal models using ambient noise data sets and their associated uncertainties. We use a transdimensional hierarchical Bayesian inversion approach to invert ambient noise surface wave phase dispersion maps for Love and Rayleigh waves using measurements obtained from Ekström (2014). Spatiospectral analysis shows that our results are comparable to a linear least squares inverse approach (except at higher harmonic degrees), but the procedure has additional advantages: (1) it yields an autoadaptive parameterization that follows Earth structure without making restricting assumptions on model resolution (regularization or damping) and data errors; (2) it can recover non-Gaussian phase velocity probability distributions while quantifying the sources of uncertainties in the data measurements and modeling procedure; and (3) it enables statistical assessments of different crustal models (e.g., CRUST1.0, LITHO1.0, and NACr14) using variable-resolution residual and standard deviation maps estimated from the ensemble. These assessments show that in the stable old crust of the Archean, the misfits are statistically negligible, requiring no significant update to crustal models from the ambient noise data set. In other regions of the U.S., significant updates to regionalization and crustal structure are expected, especially in the shallow sedimentary basins and the tectonically active regions, where the differences between model predictions and data are statistically significant.

  2. 75 FR 53738 - Proposed Collection; Comment Request for Rev. Proc. 2007-35

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-01

    ... Revenue Procedure Revenue Procedure 2007-35, Statistical Sampling for purposes of Section 199. DATES... through the Internet, at [email protected] . SUPPLEMENTARY INFORMATION: Title: Statistical Sampling...: This revenue procedure provides for determining when statistical sampling may be used in purposes of...

  3. Assessment of opacimeter calibration according to International Standard Organization 10155.

    PubMed

    Gomes, J F

    2001-01-01

    This paper compares the calibration method for opacimeters issued by the International Organization for Standardization as ISO 10155 with the manual reference method for determination of dust content in stack gases. ISO 10155 requires at least nine operational measurements, corresponding to three operational measurements per dust emission range within the stack. The procedure is assessed by comparison with previous calibration methods for opacimeters using only two operational measurements, drawn from a set of measurements made at pulp mill stacks. The results show that even though the international standard for opacimeter calibration requires that the calibration curve be obtained using 3 x 3 points, a calibration curve derived from 3 points can at times be statistically acceptable, provided that the amplitude of the individual measurements is low.
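    The calibration idea, regressing reference-method dust concentrations on opacimeter readings, can be sketched as below. The nine readings (three per emission range, as ISO 10155 requires) are synthetic:

```python
from scipy.stats import linregress

# synthetic paired measurements: three points in each of three emission ranges
opacity = [5.0, 5.2, 4.8, 20.1, 19.8, 20.3, 41.0, 40.5, 41.2]     # opacimeter readings
dust    = [12.0, 12.5, 11.4, 48.0, 47.2, 48.9, 98.0, 96.8, 98.6]  # reference method, mg/m3

# ordinary least squares calibration line: dust = slope * opacity + intercept
fit = linregress(opacity, dust)

print(f"dust = {fit.slope:.2f} * opacity + {fit.intercept:.2f}, r = {fit.rvalue:.4f}")
```

    The paper's question then reduces to whether a line fitted from only 3 of these points stays within acceptable statistical limits of the 9-point line.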

  4. Open-source platform to benchmark fingerprints for ligand-based virtual screening

    PubMed Central

    2013-01-01

    Similarity-search methods using molecular fingerprints are an important tool for ligand-based virtual screening. A huge variety of fingerprints exist, and their performance, usually assessed in retrospective benchmarking studies using data sets with known actives and known or assumed inactives, depends largely on the validation data sets and similarity measures used. Comparing new methods to existing ones in any systematic way is rather difficult due to the lack of standard data sets and evaluation procedures. Here, we present a standard platform for the benchmarking of 2D fingerprints. The open-source platform contains all source code, structural data for the actives and inactives used (drawn from three publicly available collections of data sets), and lists of randomly selected query molecules to be used for statistically valid comparisons of methods. This allows the exact reproduction and comparison of results in future studies. The results for 12 standard fingerprints together with two simple baseline fingerprints, assessed by seven evaluation methods, are shown together with the correlations between methods. High correlations were found between the 12 fingerprints, and a careful statistical analysis showed that only the two baseline fingerprints differed from the others in a statistically significant way. High correlations were also found between six of the seven evaluation methods, indicating that despite their seeming differences, many of these methods are similar to each other. PMID:23721588
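    The similarity measure most commonly used in such fingerprint benchmarks is the Tanimoto (Jaccard) coefficient over fingerprint bit sets; a minimal sketch with toy bit sets (not real molecular fingerprints):

```python
def tanimoto(fp_a: set, fp_b: set) -> float:
    """Tanimoto (Jaccard) similarity between two sets of 'on' bit positions."""
    if not fp_a and not fp_b:
        return 0.0
    return len(fp_a & fp_b) / len(fp_a | fp_b)

# toy fingerprints: sets of on-bit positions
query  = {1, 4, 9, 17, 23}
active = {1, 4, 9, 23, 31}
decoy  = {2, 5, 11}

sim_active = tanimoto(query, active)  # shares 4 of 6 distinct on bits
sim_decoy  = tanimoto(query, decoy)   # no on bits in common
```

    A retrospective benchmark then ranks actives and inactives by similarity to each query and scores the ranking with metrics such as AUC or enrichment.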

  5. The effect of vasodilatory medications on radial artery spasm in patients undergoing transradial coronary artery procedures: a systematic review.

    PubMed

    Curtis, Elizabeth; Fernandez, Ritin; Lee, Astin

    2017-07-01

    The uptake of percutaneous coronary procedures via the radial artery has increased internationally due to the decreased risk of complications and increased patient satisfaction. The increased susceptibility of the radial artery to spasm however presents a potential risk for procedural failure. Although most experts agree on the need for prophylactic medications to reduce radial artery spasm, currently there is inconsistency in literature regarding the most effective vasodilatory medication or combination of medications. The objective of this study is to identify the effectiveness of vasodilatory medications on radial artery spasm in patients undergoing transradial coronary artery procedures. This review considered studies that included participants aged 18 years and over undergoing non-emergent transradial percutaneous coronary artery procedures. This review considered studies that used vasodilating intravenous and intra-arterial medications or combinations of medications prior to commencing and during transradial coronary approaches to reduce radial artery spasm. The outcomes of interest were the incidence of radial artery spasm during percutaneous coronary procedure using objective and/or subjective measures and its effect on the successful completion of the procedure. Randomized controlled trials published in the English language between 1989 to date were considered for inclusion. The search strategy aimed to find both published and unpublished studies. A three-step search strategy was utilized in this review. An initial search of MEDLINE, CINAHL and Scopus was undertaken, followed by a search for unpublished studies. Papers selected for retrieval were assessed by two independent reviewers for methodological validity prior to inclusion in the review using standardized critical appraisal instruments. Any disagreements that arose between the reviewers were resolved through discussion. 
Quantitative data were extracted from papers included in the review using the standardized data extraction tool from RevMan5 (Copenhagen: The Nordic Cochrane Centre, Cochrane) and, where possible, pooled in statistical meta-analysis using RevMan5. All results were subject to double data entry. Effect sizes were expressed as risk ratios (for categorical data) and weighted mean differences (for continuous data), and their 95% confidence intervals were calculated for analysis. Nine trials involving 3614 patients were included in the final review. Pooled data involving 992 patients on the effect of calcium channel blockers demonstrated a statistically significant reduction in the incidence of vasospasm in patients who received verapamil 5 mg compared with those who received a placebo (OR 0.33; 95% CI 0.19-0.58). Similarly, patients who received verapamil 2.5 mg or 1.25 mg had significantly fewer incidences of vasospasm than those who received a placebo. Nitroglycerine 100 mcg was also associated with a statistically significant reduction in the incidence of vasospasm. The evidence demonstrates a benefit in the use of vasodilatory medications for the reduction of vasospasm in patients having radial coronary procedures. Further large-scale multi-center trials are needed to determine the preferred medication.
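    An odds ratio and its 95% confidence interval, as quoted above for verapamil, are derived from 2x2 event counts via the standard log-OR calculation; a sketch with hypothetical counts, not the review's pooled data:

```python
import math

# hypothetical 2x2 counts: spasm / no spasm in each arm
spasm_treated, no_spasm_treated = 30, 470    # vasodilator arm
spasm_control, no_spasm_control = 75, 417    # placebo arm

# odds ratio: cross-product of the 2x2 table
or_ = (spasm_treated * no_spasm_control) / (no_spasm_treated * spasm_control)

# standard error of log(OR) and the usual Wald 95% CI on the log scale
se_log_or = math.sqrt(1 / spasm_treated + 1 / no_spasm_treated +
                      1 / spasm_control + 1 / no_spasm_control)
lo = math.exp(math.log(or_) - 1.96 * se_log_or)
hi = math.exp(math.log(or_) + 1.96 * se_log_or)

print(f"OR = {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

    Meta-analysis tools such as RevMan5 additionally weight each trial (e.g., by inverse variance) before pooling the per-trial log odds ratios.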

  6. B-737 flight test of curved-path and steep-angle approaches using MLS guidance

    NASA Technical Reports Server (NTRS)

    Branstetter, J. R.; White, W. F.

    1989-01-01

    A series of flight tests was conducted to collect data for jet transport aircraft flying curved-path and steep-angle approaches using Microwave Landing System (MLS) guidance. During the test, 432 approaches comprising seven different curved paths and four glidepath angles varying from 3 to 4 degrees were flown in NASA Langley's Boeing 737 aircraft (Transport Systems Research Vehicle) using an MLS ground station at the NASA Wallops Flight Facility. Subject pilots from Piedmont Airlines flew the approaches using conventional cockpit instrumentation (flight director and Horizontal Situation Indicator (HSI)). The data collected will be used by FAA procedures specialists to develop standards and criteria for designing MLS terminal approach procedures (TERPS). The use of flight simulation techniques greatly aided the preliminary stages of approach development work and saved a significant amount of costly flight time. This report is intended to complement a data report to be issued by the FAA Office of Aviation Standards, which will contain all detailed data analysis and statistics.

  7. Recommendation for the review of biological reference intervals in medical laboratories.

    PubMed

    Henny, Joseph; Vassault, Anne; Boursier, Guilaine; Vukasovic, Ines; Mesko Brguljan, Pika; Lohmander, Maria; Ghita, Irina; Andreu, Francisco A Bernabeu; Kroupis, Christos; Sprongl, Ludek; Thelen, Marc H M; Vanstapel, Florent J L A; Vodnik, Tatjana; Huisman, Willem; Vaubourdolle, Michel

    2016-12-01

    This document is based on the original recommendation of the Expert Panel on the Theory of Reference Values of the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC); updated guidelines were recently published under the auspices of the IFCC and the Clinical and Laboratory Standards Institute (CLSI). This document summarizes proposed recommendations on: (i) the terminology, which is often confusing, notably concerning the terms reference limits and decision limits; (ii) the method for the determination of reference limits according to the original procedure and the conditions under which it should be used; and (iii) a simple procedure allowing medical laboratories to fulfill the requirements of the regulation and standards. The updated document proposes to verify that published reference limits are applicable to the laboratory involved. Finally, the strengths and limits of the revised recommendations (especially the selection of the reference population, the maintenance of analytical quality, the choice of the statistical method used…) are briefly discussed.

  8. Variation in hospital costs and reimbursement for endovascular aneurysm repair: A Vascular Quality Initiative pilot project.

    PubMed

    Lemmon, Gary W; Neal, Dan; DeMartino, Randall R; Schneider, Joseph R; Singh, Tej; Kraiss, Larry; Scali, Salvatore; Tassiopoulos, Apostolos; Hoel, Andrew; Cronenwett, Jack L

    2017-10-01

    Comparing costs between centers is difficult because of the heterogeneity of vascular procedures contained in broad diagnosis-related group (DRG) billing categories. The purpose of this pilot project was to develop a mechanism to merge Vascular Quality Initiative (VQI) clinical data with hospital billing data to allow more accurate cost and reimbursement comparison for endovascular aneurysm repair (EVAR) procedures across centers. Eighteen VQI centers volunteered to submit UB04 billing data for 782 primary, elective infrarenal EVAR procedures performed by 108 surgeons in 2014. Procedures were categorized as standard or complex (with femoral-femoral bypass or additional arterial treatment) and without or with complications (arterial injury or embolectomy; bowel or leg ischemia; wound infection; reoperation; or cardiac, pulmonary, or renal complications), yielding four clinical groups for comparison. MedAssets, Inc, using cost to charge ratios, calculated total hospital costs and cost categories. Cost variation analyzed across centers was compared with DRG 237 (with major complication or comorbidity) and 238 (without major complication or comorbidity) coding. A multivariable model to predict DRG 237 coding was developed using VQI clinical data. Of the 782 EVAR procedures, 56% were standard and 15% had complications, with wide variation between centers. Mean total costs ranged from $31,100 for standard EVAR without complications to $47,400 for complex EVAR with complications and varied twofold to threefold among centers. Implant costs for standard EVAR without complications varied from $8100 to $28,200 across centers. Average Medicare reimbursement was less than total cost except for standard EVAR without complications. Only 9% of all procedures with complications in the VQI were reported in the higher reimbursed DRG 237 category (center range, 0%-21%). There was significant variation in hospitals' coding of DRG 237 compared with their expected rates. 
VQI clinical data accurately predict current DRG coding (C statistic, 0.87). VQI data allow a more precise EVAR cost comparison by identifying comparable clinical groups compared with DRG-based calculations. Total costs exceeded Medicare reimbursement, especially for patients with complications, although this varied by center. Implant costs also varied more than expected between centers for comparable cases. Incorporation of VQI data elements documenting EVAR case complexity into billing data may allow centers to better align respective DRG reimbursement to total costs. Copyright © 2017 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.

  9. Time-variant random interval natural frequency analysis of structures

    NASA Astrophysics Data System (ADS)

    Wu, Binhua; Wu, Di; Gao, Wei; Song, Chongmin

    2018-02-01

    This paper presents a new robust method, the unified interval Chebyshev-based random perturbation method, to tackle the hybrid random-interval structural natural frequency problem. In the proposed approach, the random perturbation method is implemented to furnish the statistical features (i.e., mean and standard deviation), and a Chebyshev surrogate model strategy is incorporated to formulate the statistical information of the natural frequency with regard to the interval inputs. The combined analysis framework retains the strengths of both methods while dramatically reducing computational cost. The presented method is thus capable of investigating the time-variant (day-to-day) natural frequency of structures accurately and efficiently under the intrinsic creep effect of concrete with both probabilistic and interval uncertain variables. The extreme bounds of the mean and standard deviation of the natural frequency are captured through the optimization strategy embedded within the analysis procedure. Three numerical examples, progressing in both structure type and uncertainty variables, are presented to demonstrate the applicability, accuracy, and efficiency of the proposed method.
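    The statistics the method targets (mean and standard deviation of a natural frequency, bounded over interval inputs) can be illustrated with a crude Monte Carlo sketch on a single spring-mass oscillator. This shows only the hybrid random-interval setup, not the Chebyshev-based perturbation method itself:

```python
import math
import random

random.seed(0)
N = 20000
# random (Gaussian) stiffness, in N/m
k_samples = [random.gauss(1.0e4, 5.0e2) for _ in range(N)]

def freq_stats(m):
    """Mean and sample standard deviation of f = sqrt(k/m)/(2*pi) in Hz."""
    f = [math.sqrt(k / m) / (2 * math.pi) for k in k_samples]
    mu = sum(f) / N
    sd = math.sqrt(sum((x - mu) ** 2 for x in f) / (N - 1))
    return mu, sd

# interval mass [1.9, 2.1] kg: since f is monotone in m, bounding the
# statistics over the two endpoints suffices for this toy model
stats = [freq_stats(m) for m in (1.9, 2.1)]
mean_lo = min(s[0] for s in stats)
mean_hi = max(s[0] for s in stats)

print(f"mean natural frequency in [{mean_lo:.2f}, {mean_hi:.2f}] Hz")
```

    The paper's contribution is doing this propagation without brute-force sampling, via perturbation for the random part and a Chebyshev surrogate plus optimization for the interval part.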

  10. Radial access for cerebrovascular procedures: Case report and technical note.

    PubMed

    Satti, Sudhakar R; Vance, Ansar Z; Sivapatham, Thinesh

    2016-04-01

    Advantages of radial access over brachial/axillary or femoral access have been well described for several decades and include decreased cost, patient preference, and decreased major access site complications. Despite these advantages, radial access is rarely employed or even considered for neurointerventional procedures. This attitude should be reconsidered given several recent large, randomized, controlled trials from the cardiovascular literature proving that radial access is associated with statistically lower costs, decreased incidence of myocardial infarctions, strokes, and even decreased mortality. Radial access is now considered the standard of care for percutaneous coronary interventions in most US centers. Although radial access has been described for neurovascular procedures in the past, overall experience is limited. The two major challenges are the unique anatomy required to access the cerebral vasculature given very acute angles between the arm and craniocervical vessels and limitations in available technology. We present a simplified approach to radial access for cerebrovascular procedures and provide a concise step-by-step approach for patient selection, ultrasound-guided single-wall access, recommended catheters/wires, and review of patent hemostasis. Additionally, we present a complex cerebrovascular intervention in which standard femoral access was unsuccessful, while radial access was quickly achieved to highlight the importance of familiarity with the radial approach for all neurointerventionalists. We have found that the learning curve is not too steep and that the radial access approach can be adopted smoothly for a large percentage of diagnostic and interventional neuroradiologic procedures. Radial access should be considered in all patients undergoing a cerebrovascular procedure. © The Author(s) 2015.

  11. A Delphi Consensus of the Crucial Steps in Gastric Bypass and Sleeve Gastrectomy Procedures in the Netherlands.

    PubMed

    Kaijser, Mirjam A; van Ramshorst, Gabrielle H; Emous, Marloes; Veeger, Nic J G M; van Wagensveld, Bart A; Pierie, Jean-Pierre E N

    2018-04-09

Bariatric procedures are technically complex and skill demanding. In order to standardize the procedures for research and training, a Delphi analysis was performed to reach consensus on the practice of the laparoscopic gastric bypass and sleeve gastrectomy in the Netherlands. After a pre-round identifying all possible steps from the literature and expert opinion within our study group, questionnaires were sent to 68 registered Dutch bariatric surgeons, covering 73 steps for bypass surgery and 51 steps for sleeve gastrectomy. Statistical analysis was performed to identify steps with and without consensus. This process was repeated to reach consensus on all necessary steps. Thirty-eight participants (56%) responded in the first round and 32 participants (47%) in the second round. After the first Delphi round, 19 steps for gastric bypass (26%) and 14 for sleeve gastrectomy (27%) gained full consensus. After the second round, an additional 10 and 12 sub-steps, respectively, were confirmed as key steps. Thirteen steps in the gastric bypass and seven in the gastric sleeve were deemed advisable. Our expert panel showed a high level of consensus, expressed in a Cronbach's alpha of 0.82 for the gastric bypass and 0.87 for the sleeve gastrectomy. The Delphi consensus defined 29 steps for gastric bypass and 26 for sleeve gastrectomy as crucial for correct performance of these procedures to the standards of our expert panel. These results offer a clear framework for the technical execution of these procedures.
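The panel-agreement statistic reported above, Cronbach's alpha, has a standard closed form: the ratio of summed item variances to the variance of rater totals. The sketch below is a generic illustration; the score matrix and function name are hypothetical, not data from the study.

```python
import numpy as np

def cronbach_alpha(ratings):
    """Cronbach's alpha for an (n_raters x n_items) score matrix."""
    ratings = np.asarray(ratings, dtype=float)
    n_items = ratings.shape[1]
    item_vars = ratings.var(axis=0, ddof=1)      # variance of each item
    total_var = ratings.sum(axis=1).var(ddof=1)  # variance of rater totals
    return (n_items / (n_items - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical panel: 4 raters scoring 5 procedure steps on a 1-5 scale.
scores = [[5, 4, 5, 4, 5],
          [4, 4, 5, 3, 4],
          [5, 5, 5, 4, 5],
          [4, 3, 4, 3, 4]]
print(round(cronbach_alpha(scores), 2))  # → 0.93
```

Values above roughly 0.8, as in the study, are conventionally read as high internal consistency.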

  12. An Evaluation of the Effects of the Transobturator Tape Procedure on Sexual Satisfaction in Women with Stress Urinary Incontinence Using the Libido Scoring System

    PubMed Central

    Narin, Raziye; Nazik, Hakan; Narin, Mehmet Ali; Aytan, Hakan; Api, Murat

    2013-01-01

Introduction and Hypothesis. Most women experience involuntary urine leakage at some point in their lifetimes, and stress urinary incontinence (SUI) is the most common type in women. Suburethral slings have become a standard surgical procedure for the treatment of stress urinary incontinence when conservative therapy has failed. Treatment of stress urinary incontinence by suburethral sling may improve body image by reducing urinary leakage and may improve sexual satisfaction. Methods. A total of 59 sexually active patients were included in the study and underwent a TOT outside-in procedure. The LSS was applied in all patients by self-completion of questionnaires preoperatively and 6 months after the operation. Overall satisfaction with the operation was measured by visual analogue score (VAS). Pre- and postoperative scores were recorded and analyzed using SPSS 11.5. Results. Two parameters of the LSS, orgasm and initiation of sexual activity, increased at a statistically significant rate. Conclusion. Sexual satisfaction and desire partially improved after the TOT procedure. PMID:24288621

  13. Statistical methodology for the analysis of dye-switch microarray experiments

    PubMed Central

    Mary-Huard, Tristan; Aubert, Julie; Mansouri-Attia, Nadera; Sandra, Olivier; Daudin, Jean-Jacques

    2008-01-01

    Background In individually dye-balanced microarray designs, each biological sample is hybridized on two different slides, once with Cy3 and once with Cy5. While this strategy ensures an automatic correction of the gene-specific labelling bias, it also induces dependencies between log-ratio measurements that must be taken into account in the statistical analysis. Results We present two original statistical procedures for the statistical analysis of individually balanced designs. These procedures are compared with the usual ML and REML mixed model procedures proposed in most statistical toolboxes, on both simulated and real data. Conclusion The UP procedure we propose as an alternative to usual mixed model procedures is more efficient and significantly faster to compute. This result provides some useful guidelines for the analysis of complex designs. PMID:18271965

  14. A gene-signature progression approach to identifying candidate small-molecule cancer therapeutics with connectivity mapping.

    PubMed

    Wen, Qing; Kim, Chang-Sik; Hamilton, Peter W; Zhang, Shu-Dong

    2016-05-11

Gene expression connectivity mapping has gained much popularity recently, with a number of successful applications in biomedical research attesting to its utility and promise. Previous methodological research in connectivity mapping has mainly focused on two of the key components of the framework, namely the reference gene expression profiles and the connectivity mapping algorithms. The other key component, the query gene signature, has been left to users to construct without much consensus on how this should be done, although it is the issue most relevant to end users. As a key input to the connectivity mapping process, the gene signature is crucially important for returning biologically meaningful and relevant results. This paper formulates a standardized procedure for constructing high quality gene signatures from a user's perspective. We describe a two-stage process for making quality gene signatures using gene expression data as initial inputs. First, a differential gene expression analysis compares two distinct biological states; only the genes that pass stringent statistical criteria are considered in the second stage of the process, which involves ranking genes based on statistical as well as biological significance. We introduce a "gene signature progression" method as a standard procedure in connectivity mapping. Starting from the highest ranked gene, we progressively determine the minimum length of the gene signature that allows connections to the reference profiles (drugs) to be established with a preset target false discovery rate. We use a lung cancer dataset and a breast cancer dataset as two case studies to demonstrate how this standardized procedure works, and we show that highly relevant and interesting biological connections are returned. Of particular note is gefitinib, identified as among the candidate therapeutics in our lung cancer case study.
Our gene signature was based on gene expression data from Taiwan female non-smoker lung cancer patients, while there is evidence from independent studies that gefitinib is highly effective in treating women, non-smoker or former light smoker, advanced non-small cell lung cancer patients of Asian origin. In summary, we introduced a gene signature progression method into connectivity mapping, which enables a standardized procedure for constructing high quality gene signatures. This progression method is particularly useful when the number of differentially expressed genes identified is large, and when there is a need to prioritize them to be included in the query signature. The results from two case studies demonstrate that the approach we have developed is capable of obtaining pertinent candidate drugs with high precision.

  15. Nonlinear filtering properties of detrended fluctuation analysis

    NASA Astrophysics Data System (ADS)

    Kiyono, Ken; Tsujimoto, Yutaka

    2016-11-01

    Detrended fluctuation analysis (DFA) has been widely used for quantifying long-range correlation and fractal scaling behavior. In DFA, to avoid spurious detection of scaling behavior caused by a nonstationary trend embedded in the analyzed time series, a detrending procedure using piecewise least-squares fitting has been applied. However, it has been pointed out that the nonlinear filtering properties involved with detrending may induce instabilities in the scaling exponent estimation. To understand this issue, we investigate the adverse effects of the DFA detrending procedure on the statistical estimation. We show that the detrending procedure using piecewise least-squares fitting results in the nonuniformly weighted estimation of the root-mean-square deviation and that this property could induce an increase in the estimation error. In addition, for comparison purposes, we investigate the performance of a centered detrending moving average analysis with a linear detrending filter and sliding window DFA and show that these methods have better performance than the standard DFA.
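The DFA steps discussed in this record (integrate the series, detrend each window by piecewise least-squares fitting, take the root-mean-square fluctuation at each scale) can be sketched minimally as below. This is a generic DFA1 illustration under standard definitions, not the authors' implementation, and the scales and test signal are chosen only for demonstration.

```python
import numpy as np

def dfa(x, scales):
    """Minimal DFA1: integrate the series, detrend each window with a
    piecewise linear least-squares fit, and return the RMS fluctuation F(n)."""
    y = np.cumsum(x - np.mean(x))  # integrated profile
    fluctuations = []
    for n in scales:
        t = np.arange(n)
        ms = []
        for w in range(len(y) // n):
            seg = y[w * n:(w + 1) * n]
            coef = np.polyfit(t, seg, 1)  # local linear trend per window
            ms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        fluctuations.append(np.sqrt(np.mean(ms)))
    return np.array(fluctuations)

# For white noise the scaling exponent alpha should come out near 0.5.
rng = np.random.default_rng(0)
x = rng.standard_normal(4096)
scales = [8, 16, 32, 64, 128]
F = dfa(x, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(round(alpha, 2))
```

The nonuniform weighting the authors analyze arises inside the per-window detrending step: residuals near window edges are systematically smaller than residuals near the center.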

  16. Standards-Based Procedural Phenotyping: The Arden Syntax on i2b2.

    PubMed

    Mate, Sebastian; Castellanos, Ixchel; Ganslandt, Thomas; Prokosch, Hans-Ulrich; Kraus, Stefan

    2017-01-01

Phenotyping, or the identification of patient cohorts, is a recurring challenge in medical informatics. While there are open source tools such as i2b2 that address this problem by providing user-friendly querying interfaces, these platforms lack the semantic expressiveness to model complex phenotyping algorithms. The Arden Syntax provides procedural programming language constructs designed specifically for medical decision support and knowledge transfer. In this work, we investigate how language constructs of the Arden Syntax can be used for generic phenotyping. We implemented a prototypical tool to integrate i2b2 with an open source Arden execution environment. To demonstrate the applicability of our approach, we used the tool together with an Arden-based phenotyping algorithm to derive statistics about ICU-acquired hypernatremia. Finally, we discuss how the combination of i2b2's user-friendly cohort pre-selection and Arden's procedural expressiveness could benefit phenotyping.

  17. Comparison of measurement methods with a mixed effects procedure accounting for replicated evaluations (COM3PARE): method comparison algorithm implementation for head and neck IGRT positional verification.

    PubMed

    Roy, Anuradha; Fuller, Clifton D; Rosenthal, David I; Thomas, Charles R

    2015-08-28

Comparison of imaging measurement devices in the absence of a gold-standard comparator remains a vexing problem, especially in scenarios where multiple, non-paired, replicated measurements occur, as in image-guided radiotherapy (IGRT). As the number of commercially available IGRT systems presents a challenge in determining whether different IGRT methods may be used interchangeably, there is an unmet need for a conceptually parsimonious and statistically robust method to evaluate the agreement between two methods with replicated observations. Consequently, we sought to determine, using a previously reported head and neck positional verification dataset, the feasibility and utility of a Comparison of Measurement Methods with the Mixed Effects Procedure Accounting for Replicated Evaluations (COM3PARE), a unified conceptual schema and analytic algorithm based upon Roy's linear mixed effects (LME) model with a Kronecker product covariance structure in a doubly multivariate set-up, for IGRT method comparison. An anonymized dataset consisting of 100 paired coordinate (X/Y/Z) measurements from a sequential series of head and neck cancer patients imaged near-simultaneously with cone beam CT (CBCT) and kilovoltage X-ray (KVX) imaging was used for model implementation. Software-suggested CBCT and KVX shifts for the lateral (X), vertical (Y), and longitudinal (Z) dimensions were evaluated for bias, inter-method (between-subject) variation, intra-method (within-subject) variation, and overall agreement using a script implementing COM3PARE with the MIXED procedure of the statistical software package SAS (SAS Institute, Cary, NC, USA). COM3PARE showed statistically significant differences in bias and inter-method agreement between CBCT and KVX in the Z-axis (both p < 0.01). Intra-method and overall agreement differences were statistically significant for both the X- and Z-axes (all p < 0.01).
Using pre-specified criteria, based on intra-method agreement, CBCT was deemed preferable for X-axis positional verification, with KVX preferred for superoinferior alignment. The COM3PARE methodology was validated as feasible and useful in this pilot head and neck cancer positional verification dataset. COM3PARE represents a flexible and robust standardized analytic methodology for IGRT comparison. The implemented SAS script is included to encourage other groups to implement COM3PARE in other anatomic sites or IGRT platforms.

  18. 40 CFR 160.81 - Standard operating procedures.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 25 2013-07-01 2013-07-01 false Standard operating procedures. 160.81... GOOD LABORATORY PRACTICE STANDARDS Testing Facilities Operation § 160.81 Standard operating procedures. (a) A testing facility shall have standard operating procedures in writing setting forth study...

  19. 40 CFR 160.81 - Standard operating procedures.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 24 2014-07-01 2014-07-01 false Standard operating procedures. 160.81... GOOD LABORATORY PRACTICE STANDARDS Testing Facilities Operation § 160.81 Standard operating procedures. (a) A testing facility shall have standard operating procedures in writing setting forth study...

  20. 40 CFR 160.81 - Standard operating procedures.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 25 2012-07-01 2012-07-01 false Standard operating procedures. 160.81... GOOD LABORATORY PRACTICE STANDARDS Testing Facilities Operation § 160.81 Standard operating procedures. (a) A testing facility shall have standard operating procedures in writing setting forth study...

  1. Segmentation of bone and soft tissue regions in digital radiographic images of extremities

    NASA Astrophysics Data System (ADS)

    Pakin, S. Kubilay; Gaborski, Roger S.; Barski, Lori L.; Foos, David H.; Parker, Kevin J.

    2001-07-01

This paper presents an algorithm for segmentation of computed radiography (CR) images of extremities into bone and soft tissue regions. The algorithm is region-based: regions are constructed using a growing procedure with two different statistical tests. Following the growing process, a tissue classification procedure is employed. The purpose of the classification is to label each region as either bone or soft tissue. This binary classification goal is achieved by using a voting procedure that clusters the regions in each neighborhood system into two classes. The voting procedure provides a crucial compromise between local and global analysis of the image, which is necessary due to strong exposure variations seen on the imaging plate. Also, the existence of regions large enough that exposure variations can be observed across them makes it necessary to use overlapping blocks during classification. After the classification step, the resulting bone and soft tissue regions are refined by fitting a second-order surface to each tissue and reevaluating the label of each region according to the distance between the region and the surfaces. The performance of the algorithm is tested on a variety of extremity images using manually segmented images as the gold standard. The experiments showed that our algorithm provided a bone boundary with an average area overlap of 90% compared to the gold standard.
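The validation metric cited above, area overlap against a gold-standard mask, can be computed as sketched below. The definition used (intersection area divided by gold-standard area) is an assumption, since the abstract does not spell out its formula, and the masks are toy examples.

```python
import numpy as np

def area_overlap(pred, gold):
    """Overlap of a predicted binary mask with a gold-standard mask,
    defined here (an assumed definition) as |pred AND gold| / |gold|."""
    pred, gold = np.asarray(pred, bool), np.asarray(gold, bool)
    return np.logical_and(pred, gold).sum() / gold.sum()

# Toy 1-D masks standing in for segmented bone regions.
gold = np.array([0, 1, 1, 1, 1, 1, 1, 1, 1, 0])
pred = np.array([0, 0, 1, 1, 1, 1, 1, 1, 1, 1])
print(area_overlap(pred, gold))  # 7 of the 8 gold pixels are covered → 0.875
```

Symmetric alternatives such as the Dice coefficient, 2|A∩B|/(|A|+|B|), are also common for this kind of evaluation.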

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beavin, P. Jr.

A previously published method for determining zirconium in antiperspirant aerosols was collaboratively studied by 7 laboratories. The method consists of 2 procedures: a rapid dilution procedure for soluble zirconium compounds or a lengthier fusion procedure for total zirconium, followed by colorimetric determination. The collaborators were asked to perform the following: Spiking materials representing 4 levels of soluble zirconium were added to weighed portions of a zirconium-free cream base concentrate and the portions were assayed by the dilution procedure. Spiking materials representing 4 levels of zirconium in either the soluble or the insoluble form (or as a mixture) were also added to portions of the same concentrate and these portions were assayed by the fusion procedure. They were also asked to concentrate and assay, by both procedures, 2 cans each of 2 commercial aerosol antiperspirants containing zirconyl hydroxychloride. The average percent recoveries and standard deviations for spiked samples were 99.8-100.2 and 1.69-2.71, respectively, for soluble compounds determined by the dilution procedure, and 93.8-97.4 and 3.09-4.78, respectively, for soluble and/or insoluble compounds determined by the fusion procedure. The average percent zirconium found by the dilution procedure in the 2 commercial aerosol products was 0.751 and 0.792. Insufficient collaborative results were received for the fusion procedure for statistical evaluation. The dilution procedure has been adopted as official first action.
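The summary statistics reported for the spiked samples (average percent recovery and its standard deviation) follow a simple computation, sketched below. The spike levels and measured values are hypothetical, not the collaborative-study data.

```python
import statistics

def recovery_stats(measured, spiked):
    """Percent recovery for each spiked sample, plus mean and sample SD."""
    recoveries = [100.0 * m / s for m, s in zip(measured, spiked)]
    return statistics.mean(recoveries), statistics.stdev(recoveries)

# Hypothetical spike levels (mg) and measured results from a dilution assay.
spiked   = [10.0, 20.0, 40.0, 80.0]
measured = [10.1, 19.8, 40.3, 79.6]
mean_rec, sd_rec = recovery_stats(measured, spiked)
print(f"{mean_rec:.1f}% +/- {sd_rec:.2f}")  # → 100.1% +/- 0.97
```

Recoveries near 100% with a small standard deviation, as reported for the dilution procedure, indicate both accuracy and inter-laboratory precision.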

  3. Engineering Students Designing a Statistical Procedure for Quantifying Variability

    ERIC Educational Resources Information Center

    Hjalmarson, Margret A.

    2007-01-01

    The study examined first-year engineering students' responses to a statistics task that asked them to generate a procedure for quantifying variability in a data set from an engineering context. Teams used technological tools to perform computations, and their final product was a ranking procedure. The students could use any statistical measures,…

  4. A sup-score test for the cure fraction in mixture models for long-term survivors.

    PubMed

    Hsu, Wei-Wen; Todem, David; Kim, KyungMann

    2016-12-01

The evaluation of cure fractions in oncology research under the well-known cure rate model has attracted considerable attention in the literature, but most of the existing testing procedures have relied on restrictive assumptions. A common assumption has been to restrict the cure fraction to a constant under alternatives to homogeneity, thereby neglecting any information from covariates. This article extends the literature by developing a score-based statistic that incorporates covariate information to detect cure fractions, with the existing testing procedure serving as a special case. A complication of this extension, however, is that the implied hypotheses are not typical and standard regularity conditions to conduct the test may not even hold. Using empirical processes arguments, we construct a sup-score test statistic for cure fractions and establish its limiting null distribution as a functional of mixtures of chi-square processes. In practice, we suggest a simple resampling procedure to approximate this limiting distribution. Our simulation results show that the proposed test can greatly improve efficiency over tests that neglect the heterogeneity of the cure fraction under the alternative. The practical utility of the methodology is illustrated using ovarian cancer survival data with long-term follow-up from the Surveillance, Epidemiology, and End Results registry. © 2016, The International Biometric Society.

  5. 40 CFR 792.81 - Standard operating procedures.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 33 2013-07-01 2013-07-01 false Standard operating procedures. 792.81... operating procedures. (a) A testing facility shall have standard operating procedures in writing, setting... data generated in the course of a study. All deviations in a study from standard operating procedures...

  6. 40 CFR 792.81 - Standard operating procedures.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 33 2012-07-01 2012-07-01 false Standard operating procedures. 792.81... operating procedures. (a) A testing facility shall have standard operating procedures in writing, setting... data generated in the course of a study. All deviations in a study from standard operating procedures...

  7. 40 CFR 792.81 - Standard operating procedures.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 32 2014-07-01 2014-07-01 false Standard operating procedures. 792.81... operating procedures. (a) A testing facility shall have standard operating procedures in writing, setting... data generated in the course of a study. All deviations in a study from standard operating procedures...

  8. Status of research into lightning effects on aircraft

    NASA Technical Reports Server (NTRS)

    Plumer, J. A.

    1976-01-01

    Developments in aircraft lightning protection since 1938 are reviewed. Potential lightning problems resulting from present trends toward the use of electronic controls and composite structures are discussed, along with presently available lightning test procedures for problem assessment. The validity of some procedures is being questioned because of pessimistic results and design implications. An in-flight measurement program is needed to provide statistics on lightning severity at flight altitudes and to enable more realistic tests, and operators are urged to supply researchers with more details on electronic components damaged by lightning strikes. A need for review of certain aspects of fuel system vulnerability is indicated by several recent accidents, and specific areas for examination are identified. New educational materials and standardization activities are also noted.

  9. General aviation air traffic pattern safety analysis

    NASA Technical Reports Server (NTRS)

    Parker, L. C.

    1973-01-01

A concept is described for evaluating the general aviation mid-air collision hazard in uncontrolled terminal airspace. Three-dimensional traffic pattern measurements were conducted at uncontrolled and controlled airports. Computer programs for data reduction, storage, retrieval, and statistical analysis have been developed. Initial general aviation air traffic pattern characteristics are presented. These preliminary results indicate that patterns diverge widely from the expected standard pattern, and that the pattern procedures observed can affect the ability of pilots to see and avoid each other.

  10. The Construction and Utility of Three Indexes of Intellectual Achievement: An Intellectual-Development (ID) Index; A Socio-Intellectual-Status (SIS) Index; A Differential-Intellectual-Development (DID) Index. U.S. Children and Youths, 6-17 Years. Vital and Health Statistics. Data Evaluation and Methods Research. Series 2-Number 74.

    ERIC Educational Resources Information Center

    Dupuy, Harold J.; Gruvaeus, Gunnar

    Although the Intellectual Development (ID) index was constructed using standard psychometric procedures, the derivation of the other two indexes, Socio Intellectual Status (SIS) and Differential Intellectual Development (DID), by criterion scaling should have applications in diverse areas of scale or index construction. The ID is basically…

  11. Precision Measurement and Calibration. Volume 1. Statistical Concepts and Procedures

    DTIC Science & Technology

    1969-02-01


  12. An operational definition of a statistically meaningful trend.

    PubMed

    Bryhn, Andreas C; Dimberg, Peter H

    2011-04-28

    Linear trend analysis of time series is standard procedure in many scientific disciplines. If the number of data is large, a trend may be statistically significant even if data are scattered far from the trend line. This study introduces and tests a quality criterion for time trends referred to as statistical meaningfulness, which is a stricter quality criterion for trends than high statistical significance. The time series is divided into intervals and interval mean values are calculated. Thereafter, r(2) and p values are calculated from regressions concerning time and interval mean values. If r(2) ≥ 0.65 at p ≤ 0.05 in any of these regressions, then the trend is regarded as statistically meaningful. Out of ten investigated time series from different scientific disciplines, five displayed statistically meaningful trends. A Microsoft Excel application (add-in) was developed which can perform statistical meaningfulness tests and which may increase the operationality of the test. The presented method for distinguishing statistically meaningful trends should be reasonably uncomplicated for researchers with basic statistics skills and may thus be useful for determining which trends are worth analysing further, for instance with respect to causal factors. The method can also be used for determining which segments of a time trend may be particularly worthwhile to focus on.
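The criterion described above can be sketched directly from the abstract: split the series into intervals, regress interval means on interval mid-times, and require r² ≥ 0.65 at p ≤ 0.05. In this illustrative sketch the interval count and data are made up, and a critical-|r| lookup (0.878 for 5 points at α = 0.05, two-tailed) stands in for an explicit p-value computation.

```python
import numpy as np

def statistically_meaningful(t, x, n_intervals=5, r2_min=0.65, r_crit=0.878):
    """Sketch of the 'statistically meaningful trend' test: regress interval
    means on interval mean times and require r^2 >= 0.65 with a significant
    correlation (|r| >= 0.878 corresponds to p = 0.05 with 5 intervals)."""
    t, x = np.asarray(t, float), np.asarray(x, float)
    t_means = np.array([seg.mean() for seg in np.array_split(t, n_intervals)])
    x_means = np.array([seg.mean() for seg in np.array_split(x, n_intervals)])
    r = np.corrcoef(t_means, x_means)[0, 1]
    return bool(r ** 2 >= r2_min and abs(r) >= r_crit)

# A strong linear trend with moderate noise passes the criterion.
rng = np.random.default_rng(1)
t = np.arange(200.0)
x = 0.05 * t + rng.normal(0, 1, 200)
print(statistically_meaningful(t, x))  # → True
```

Averaging within intervals is what makes the criterion stricter than plain significance: scattered data can produce a significant slope on the raw points yet fail the r² threshold on the interval means.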

  13. Clinical evaluation of selected Yogic procedures in individuals with low back pain

    PubMed Central

    Pushpika Attanayake, A. M.; Somarathna, K. I. W. K.; Vyas, G. H.; Dash, S. C.

    2010-01-01

The present study was conducted to evaluate selected yogic procedures in individuals with low back pain. Back pain is one of the commonest presentations in clinical practice, which motivated the present study. It has been estimated that more than three-quarters of the world's population experience back pain at some time in their lives. Twelve patients were selected and randomly divided into two groups, viz., group A (yogic group) and group B (control group). Advice on lifestyle and diet was given to all patients. The effect of the therapy was assessed subjectively and objectively. Scores for the yogic group and control group were individually analyzed before and after treatment, and the values were compared using standard statistical protocols. Yogic intervention revealed 79% relief in both subjective and objective parameters (i.e., 7 out of 14 parameters showed statistically highly significant results, P < 0.01, while 4 showed significant results, P < 0.05). Comparison of the yogic group and control group showed 79% relief in both subjective and objective parameters (i.e., 6 out of 14 parameters showed statistically highly significant results, P < 0.01, while 5 showed significant results, P < 0.05). PMID:22131719

  14. 18 CFR 725.6 - Principles, standards and procedures.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 2 2010-04-01 2010-04-01 false Principles, standards... Responsibilities § 725.6 Principles, standards and procedures. The Principles, Standards and Procedures established... Orders. These Principles, Standards and Procedures are found in 18 CFR parts 710 through 717. ...

  15. A randomized trial of nature scenery and sounds versus urban scenery and sounds to reduce pain in adults undergoing bone marrow aspirate and biopsy.

    PubMed

    Lechtzin, Noah; Busse, Anne M; Smith, Michael T; Grossman, Stuart; Nesbit, Suzanne; Diette, Gregory B

    2010-09-01

Bone marrow aspiration and biopsy (BMAB) is painful when performed with only local anesthetic. Our objective was to determine whether viewing nature scenes and listening to nature sounds can reduce pain during BMAB. This was a randomized, controlled clinical trial. Adult patients undergoing outpatient BMAB with only local anesthetic were assigned to either a nature scene with accompanying nature sounds, a city scene with city sounds, or standard care. The primary outcome was a visual analog scale (0-10) of pain. Prespecified secondary analyses included categorizing pain as mild or moderate to severe and using multiple logistic regression to adjust for potential confounding variables. One hundred and twenty (120) subjects were enrolled: 44 in the Nature arm, 39 in the City arm, and 37 in the Standard Care arm. The mean pain scores, which were the primary outcome, were not significantly different between the three arms. A higher proportion in the Standard Care arm had moderate-to-severe pain (pain rating ≥4) than in the Nature arm (78.4% versus 60.5%), though this was not statistically significant (p = 0.097). This difference was statistically significant after adjusting for differences in the operators who performed the procedures (odds ratio = 3.71, p = 0.02). We confirmed earlier findings showing that BMAB is poorly tolerated. While mean pain scores were not significantly different between the study arms, secondary analyses suggest that viewing a nature scene while listening to nature sounds is a safe, inexpensive method that may reduce pain during BMAB. This approach should be considered to alleviate pain during invasive procedures.

  16. 20 CFR 634.4 - Statistical standards.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 20 Employees' Benefits 3 2011-04-01 2011-04-01 false Statistical standards. 634.4 Section 634.4... System § 634.4 Statistical standards. Recipients shall agree to provide required data following the statistical standards prescribed by the Bureau of Labor Statistics for cooperative statistical programs. ...

  17. 20 CFR 634.4 - Statistical standards.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Statistical standards. 634.4 Section 634.4... System § 634.4 Statistical standards. Recipients shall agree to provide required data following the statistical standards prescribed by the Bureau of Labor Statistics for cooperative statistical programs. ...

  18. 7 CFR 52.38c - Statistical sampling procedures for lot inspection of processed fruits and vegetables by attributes.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 2 2011-01-01 2011-01-01 false Statistical sampling procedures for lot inspection of processed fruits and vegetables by attributes. 52.38c Section 52.38c Agriculture Regulations of the... Regulations Governing Inspection and Certification Sampling § 52.38c Statistical sampling procedures for lot...

  19. 7 CFR 52.38b - Statistical sampling procedures for on-line inspection by attributes of processed fruits and...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 2 2011-01-01 2011-01-01 false Statistical sampling procedures for on-line inspection by attributes of processed fruits and vegetables. 52.38b Section 52.38b Agriculture Regulations of... Regulations Governing Inspection and Certification Sampling § 52.38b Statistical sampling procedures for on...

  20. 75 FR 79320 - Animal Drugs, Feeds, and Related Products; Regulation of Carcinogenic Compounds in Food-Producing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-20

    ... is calculated from tumor data of the cancer bioassays using a statistical extrapolation procedure... carcinogenic concern currently set forth in Sec. 500.84 utilizes a statistical extrapolation procedure that... procedures did not rely on a statistical extrapolation of the data to a 1 in 1 million risk of cancer to test...

  1. 7 CFR 52.38b - Statistical sampling procedures for on-line inspection by attributes of processed fruits and...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Statistical sampling procedures for on-line inspection by attributes of processed fruits and vegetables. 52.38b Section 52.38b Agriculture Regulations of... Regulations Governing Inspection and Certification Sampling § 52.38b Statistical sampling procedures for on...

  2. 7 CFR 52.38c - Statistical sampling procedures for lot inspection of processed fruits and vegetables by attributes.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Statistical sampling procedures for lot inspection of processed fruits and vegetables by attributes. 52.38c Section 52.38c Agriculture Regulations of the... Regulations Governing Inspection and Certification Sampling § 52.38c Statistical sampling procedures for lot...

  3. 40 CFR 51.357 - Test procedures and standards.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 2 2010-07-01 2010-07-01 false Test procedures and standards. 51.357... Requirements § 51.357 Test procedures and standards. Written test procedures and pass/fail standards shall be established and followed for each model year and vehicle type included in the program. (a) Test procedure...

  4. Rapid label-free identification of Klebsiella pneumoniae antibiotic resistant strains by the drop-coating deposition surface-enhanced Raman scattering method

    NASA Astrophysics Data System (ADS)

    Cheong, Youjin; Kim, Young Jin; Kang, Heeyoon; Choi, Samjin; Lee, Hee Joo

    2017-08-01

Although many methodologies have been developed to identify unknown bacteria, bacterial identification in clinical microbiology remains a complex and time-consuming procedure. To address this problem, we developed a label-free method for rapidly identifying clinically relevant, multilocus sequence typing-verified quinolone-resistant Klebsiella pneumoniae strains. We also applied the method to identify three strains from colony samples, ATCC70063 (control), ST11, and ST15; these are the prevalent quinolone-resistant K. pneumoniae strains in East Asia. The colonies were identified using a drop-coating deposition surface-enhanced Raman scattering (DCD-SERS) procedure coupled with a multivariate statistical method. Our workflow exhibited an enhancement factor of 11.3 × 10⁶ for Raman intensities, high reproducibility (relative standard deviation of 7.4%), and a sensitive limit of detection (100 pM rhodamine 6G), with a correlation coefficient of 0.98. All quinolone-resistant K. pneumoniae strains showed similar spectral Raman shifts (high correlations) regardless of bacterial type, as well as Raman vibrational modes different from those of Escherichia coli strains. Our proposed DCD-SERS procedure coupled with the multivariate statistics-based identification method achieved excellent performance in discriminating similar microbes from one another and in subtyping K. pneumoniae strains. Therefore, our label-free DCD-SERS procedure coupled with the computational decision-support method is a potentially useful method for the rapid identification of clinically relevant K. pneumoniae strains.

  5. Evaluating flow cytometer performance with weighted quadratic least squares analysis of LED and multi-level bead data.

    PubMed

    Parks, David R; El Khettabi, Faysal; Chase, Eric; Hoffman, Robert A; Perfetto, Stephen P; Spidlen, Josef; Wood, James C S; Moore, Wayne A; Brinkman, Ryan R

    2017-03-01

We developed a fully automated procedure for analyzing data from LED pulses and multilevel bead sets to evaluate backgrounds and photoelectron scales of cytometer fluorescence channels. The method improves on previous formulations by fitting a full quadratic model with appropriate weighting and by providing standard errors and peak residuals as well as the fitted parameters themselves. Here we describe the details of the methods and procedures involved and present a set of illustrations and test cases that demonstrate the consistency and reliability of the results. The automated analysis and fitting procedure is generally quite successful in providing good estimates of the Spe (statistical photoelectron) scales and backgrounds for all the fluorescence channels on instruments with good linearity. The precision of the results obtained from LED data is almost always better than that from multilevel bead data, but the bead procedure is easy to carry out and provides results good enough for most purposes. Including standard errors on the fitted parameters is important for understanding the uncertainty in the values of interest. The weighted residuals give information about how well the data fits the model, and particularly high residuals indicate bad data points. Known photoelectron scales and measurement channel backgrounds make it possible to estimate the precision of measurements at different signal levels and the effects of compensated spectral overlap on measurement quality. Combining this information with measurements of standard samples carrying dyes of biological interest, we can make accurate comparisons of dye sensitivity among different instruments. Our method is freely available through the R/Bioconductor package flowQB. © 2017 International Society for Advancement of Cytometry.
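The core of this procedure, a weighted quadratic fit of variance against mean signal, can be sketched as follows. The bead summary statistics and the simple model var = b0 + b1·mean + b2·mean² (background b0, photoelectron scale related to 1/b1) are illustrative assumptions, not the flowQB implementation:

```python
import numpy as np

# Hypothetical multi-level bead statistics (not real instrument data):
# per-peak mean fluorescence and variance for one channel.
means = np.array([50.0, 200.0, 800.0, 3200.0, 12800.0, 51200.0])
b0, b1, b2 = 400.0, 2.5, 1e-5          # assumed "true" model parameters
variances = b0 + b1 * means + b2 * means**2

# Weighted quadratic least squares: the sampling SD of a variance estimate
# is roughly proportional to the variance itself, so weight each point by
# 1/variance (np.polyfit expects 1/sigma-style weights).
coeffs = np.polyfit(means, variances, deg=2, w=1.0 / variances)
c2, c1, c0 = coeffs                     # np.polyfit returns highest degree first

print(f"background ~ {c0:.1f}, linear term ~ {c1:.3f}, quadratic ~ {c2:.2e}")
```

With noise-free synthetic data the fit recovers the assumed parameters; on real data the weighting mainly keeps the bright, high-variance peaks from dominating the fit.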

  6. Pulsed Dose Radiofrequency Before Ablation of Medial Branch of the Lumbar Dorsal Ramus for Zygapophyseal Joint Pain Reduces Post-procedural Pain.

    PubMed

    Arsanious, David; Gage, Emmanuel; Koning, Jonathon; Sarhan, Mazin; Chaiban, Gassan; Almualim, Mohammed; Atallah, Joseph

    2016-01-01

Potential side effects of radiofrequency ablation (RFA) include painful cutaneous dysesthesias and increased pain due to neuritis or neurogenic inflammation. This pain may require the prescription of opioids or non-opioid analgesics to control post-procedural pain and discomfort. The goal of this study is to compare post-procedural pain scores and post-procedural oral analgesic use in patients receiving continuous thermal radiofrequency ablation versus patients receiving pulsed dose radiofrequency immediately followed by continuous thermal radiofrequency ablation for zygapophyseal joint disease. This is a prospective, double-blinded, randomized, controlled trial. Patients who met all the inclusion criteria and were not subject to any of the exclusion criteria were required to have two positive diagnostic medial branch blocks prior to undergoing randomization, intervention, and analysis. University hospital. Eligible patients were randomized in a 1:1 ratio to either receive thermal radiofrequency ablation alone (standard group) or pulsed dose radiofrequency (PDRF) immediately followed by thermal radiofrequency ablation (investigational group), all of which were performed by a single Board Certified Pain Medicine physician. Post-procedural pain levels between the two groups were assessed using the Numerical Pain Scale (NPS), and patients were contacted by phone on post-procedural days 1 and 2 in the morning and afternoon regarding the amount of oral analgesic medications used in the first 48 hours following the procedure. Patients who received pulsed dose radiofrequency followed by continuous radiofrequency neurotomy reported statistically significantly lower post-procedural pain scores in the first 24 hours compared to patients who received thermal radiofrequency neurotomy alone. These patients also used less oral analgesic medication in the post-procedural period. 
These interventions were carried out by one board accredited pain physician at one center. The procedures were exclusively performed using one model of radiofrequency generator, at one setting for the PDRF and RFA. The difference in the number of levels of ablation was not considered in the analysis of the results. Treating patients with pulsed dose radiofrequency prior to continuous thermal radiofrequency ablation can provide patients with less post-procedural pain during the first 24 hours and also reduce analgesic requirements. Furthermore, the addition of PDRF to standard thermal RFA did not prolong the time of standard thermal radiofrequency ablation procedures, as it was performed during the typically allotted time for local anesthetic action. Low back pain, facet joint disease, medial branch block, Radiofrequency ablation, thermal radiofrequency, pulsed dose radiofrequency, PDRF, zygapophyseal joint.

  7. A Statistical Analysis of Brain Morphology Using Wild Bootstrapping

    PubMed Central

    Ibrahim, Joseph G.; Tang, Niansheng; Rowe, Daniel B.; Hao, Xuejun; Bansal, Ravi; Peterson, Bradley S.

    2008-01-01

Methods for the analysis of brain morphology, including voxel-based and surface-based morphometry, have been used to detect associations between brain structure and covariates of interest, such as diagnosis, severity of disease, age, IQ, and genotype. The statistical analysis of morphometric measures usually involves two statistical procedures: 1) invoking a statistical model at each voxel (or point) on the surface of the brain or brain subregion, followed by mapping test statistics (e.g., t test) or their associated p values at each of those voxels; 2) correction for the multiple statistical tests conducted across all voxels on the surface of the brain region under investigation. We propose the use of new statistical methods for each of these procedures. We first use a heteroscedastic linear model to test the associations between the morphological measures at each voxel on the surface of the specified subregion (e.g., cortical or subcortical surfaces) and the covariates of interest. Moreover, we develop a robust test procedure that is based on a resampling method, called wild bootstrapping. This procedure assesses the statistical significance of the associations between a measure of given brain structure and the covariates of interest. The value of this robust test procedure lies in its computational simplicity and in its applicability to a wide range of imaging data, including data from both anatomical and functional magnetic resonance imaging (fMRI). Simulation studies demonstrate that this robust test procedure can accurately control the family-wise error rate. We demonstrate the application of this robust test procedure to the detection of statistically significant differences in the morphology of the hippocampus over time across gender groups in a large sample of healthy subjects. PMID:17649909
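The wild bootstrap idea can be illustrated at a single hypothetical "voxel": fit the null model, perturb its residuals with random Rademacher signs (which preserves heteroscedasticity), and recompute the test statistic. The data, the intercept-only null model, and the slope t statistic below are illustrative choices, not the authors' full heteroscedastic model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical single-voxel data: morphometric measure y against covariate x,
# with heavy-tailed (non-Gaussian) errors.
n = 60
x = rng.normal(size=n)
y = 0.5 * x + rng.standard_t(df=3, size=n)

def slope_t(x, y):
    """Ordinary least-squares t statistic for the regression slope."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (len(y) - 2)
    cov = sigma2 * np.linalg.inv(X.T @ X)
    return beta[1] / np.sqrt(cov[1, 1])

t_obs = slope_t(x, y)

# Wild bootstrap under H0 (slope = 0): keep the null (intercept-only) fit,
# flip each null residual's sign at random, and recompute the statistic.
resid0 = y - y.mean()
t_boot = np.array([
    slope_t(x, y.mean() + resid0 * rng.choice([-1.0, 1.0], size=n))
    for _ in range(999)
])
p_value = (1 + np.sum(np.abs(t_boot) >= abs(t_obs))) / (999 + 1)
print(f"t = {t_obs:.2f}, wild-bootstrap p = {p_value:.3f}")
```

In the imaging setting this resampling is run at every voxel and combined with a correction across voxels, e.g. by taking the maximum statistic over the surface in each bootstrap replicate.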

  8. Low-pressure pneumoperitoneum versus standard pneumoperitoneum in laparoscopic cholecystectomy, a prospective randomized clinical trial.

    PubMed

    Sandhu, Trichak; Yamada, Sirikan; Ariyakachon, Veeravorn; Chakrabandhu, Thiraphat; Chongruksut, Wilaiwan; Ko-iam, Wasana

    2009-05-01

Post-laparoscopic pain syndrome is well recognized and characterized by abdominal and particularly shoulder tip pain; it occurs frequently following laparoscopic cholecystectomy. The etiology of post-laparoscopic pain can be classified into three aspects: visceral, incision, and shoulder. The origin of shoulder pain is only partly understood, but it is commonly assumed that the cause is overstretching of the diaphragmatic muscle fibers owing to a high rate of insufflation. This study aimed to compare the frequency and intensity of shoulder tip pain between low-pressure (7 mmHg) and standard-pressure (14 mmHg) in a prospective randomized clinical trial. One hundred and forty consecutive patients undergoing elective laparoscopic cholecystectomy were randomized prospectively to either high- or low-pressure pneumoperitoneum, with blinding of the research nurses who assessed the patients during the postoperative period. The statistical analysis included sex, mean age, weight, American Society of Anesthesiologists (ASA) grade, operative time, complication rate, duration of surgery, conversion rate, postoperative pain by using visual analogue scale, number of analgesic injections, incidence and severity of shoulder tip pain, and postoperative hospital stay. p < 0.05 was considered indicative of significance. The characteristics of the patients were similar in the two groups except for the predominance of males in the standard-pressure group (controls). The procedure was successful in 68 of 70 patients in the low-pressure group compared with all 70 patients in the standard group. Operative time, number of analgesic injections, visual analogue score, and length of postoperative days were similar in both groups. Incidence of shoulder tip pain was higher in the standard-pressure group, but not statistically significantly so (27.9% versus 44.3%) (p = 0.100). 
Low-pressure pneumoperitoneum tended to be better than standard-pressure pneumoperitoneum in terms of lower incidence of shoulder tip pain, but this difference did not reach statistical significance following elective laparoscopic cholecystectomy.

  9. Bon-EV: an improved multiple testing procedure for controlling false discovery rates.

    PubMed

    Li, Dongmei; Xie, Zidian; Zand, Martin; Fogg, Thomas; Dye, Timothy

    2017-01-03

Stability of multiple testing procedures, defined as the standard deviation of the total number of discoveries, can be used as an indicator of variability of multiple testing procedures. Improving stability of multiple testing procedures can help to increase the consistency of findings from replicated experiments. Benjamini-Hochberg's and Storey's q-value procedures are two commonly used multiple testing procedures for controlling false discoveries in genomic studies. Storey's q-value procedure has higher power and lower stability than Benjamini-Hochberg's procedure. To improve upon the stability of Storey's q-value procedure and maintain its high power in genomic data analysis, we propose a new multiple testing procedure, named Bon-EV, to control the false discovery rate (FDR) based on Bonferroni's approach. Simulation studies show that our proposed Bon-EV procedure can maintain the high power of Storey's q-value procedure and also result in better FDR control and higher stability than Storey's q-value procedure for samples of large size (30 in each group) and medium size (15 in each group) for either independent, somewhat correlated, or highly correlated test statistics. When sample size is small (5 in each group), our proposed Bon-EV procedure has performance between the Benjamini-Hochberg procedure and the Storey's q-value procedure. Examples using RNA-Seq data show that the Bon-EV procedure has higher stability than the Storey's q-value procedure while maintaining equivalent power, and higher power than the Benjamini-Hochberg procedure. For medium or large sample sizes, the Bon-EV procedure has improved FDR control and stability compared with the Storey's q-value procedure and improved power compared with the Benjamini-Hochberg procedure. The Bon-EV multiple testing procedure is available as the BonEV package in R for download at https://CRAN.R-project.org/package=BonEV .
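Bon-EV itself is distributed as the BonEV R package; as background, the classic Benjamini-Hochberg step-up procedure that all of these methods build on can be sketched in a few lines. The p-values below are invented for illustration:

```python
import numpy as np

def bh_reject(pvals, q=0.05):
    """Benjamini-Hochberg step-up FDR procedure.

    Returns a boolean array marking which hypotheses are rejected at FDR level q.
    """
    p = np.asarray(pvals)
    m = p.size
    order = np.argsort(p)
    thresholds = q * np.arange(1, m + 1) / m
    below = p[order] <= thresholds
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])   # largest i with p_(i) <= q*i/m
        reject[order[: k + 1]] = True      # reject all hypotheses up to rank k
    return reject

# Illustrative p-values (hypothetical, not from the paper):
pvals = [0.001, 0.002, 0.009, 0.012, 0.021, 0.030, 0.25, 0.30, 0.40, 0.90]
print(bh_reject(pvals).sum())   # number of discoveries at FDR 0.05
```

Note the step-up logic: once the largest rank k satisfying the threshold is found, every smaller p-value is rejected too, even if it individually missed its own threshold.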

  10. 10 CFR 434.510 - Standard calculation procedure.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

... 10 Energy 3 2013-01-01 2013-01-01 false Standard calculation procedure. 434.510 Section 434.510... HIGH RISE RESIDENTIAL BUILDINGS Building Energy Cost Compliance Alternative § 434.510 Standard calculation procedure. 510.1 The Standard Calculation Procedure consists of methods and assumptions for...

  11. The Effects of Clinical Hypnosis versus Neurolinguistic Programming (NLP) before External Cephalic Version (ECV): A Prospective Off-Centre Randomised, Double-Blind, Controlled Trial

    PubMed Central

    Reinhard, Joscha; Peiffer, Swati; Sänger, Nicole; Herrmann, Eva; Yuan, Juping; Louwen, Frank

    2012-01-01

Objective. To examine the effects of clinical hypnosis versus NLP intervention on the success rate of ECV procedures in comparison to a control group. Methods. A prospective off-centre randomised trial of a clinical hypnosis intervention against NLP of women with a singleton breech fetus at or after 37 0/7 weeks of gestation (259 days) and normal amniotic fluid index. All 80 participants heard a 20-minute recorded intervention via headphones. Main outcome assessed was success rate of ECV. The intervention groups were compared with a control group with standard medical care alone (n = 122). Results. A total of 42 women who received a hypnosis intervention prior to ECV had a 40.5% (n = 17) ECV success rate, whereas 38 women who received NLP had a 44.7% (n = 17) success rate (P > 0.05). The control group had patient characteristics similar to the intervention groups (P > 0.05). The control group's (n = 122) ECV success rate of 27.3% (n = 33) was statistically significantly lower than that of NLP (P = 0.05) and of hypnosis and NLP combined (P = 0.03). Conclusions. These findings suggest that prior clinical hypnosis and NLP have similar ECV success rates and are both superior to standard medical care alone. PMID:22778774

  12. The Effects of Clinical Hypnosis versus Neurolinguistic Programming (NLP) before External Cephalic Version (ECV): A Prospective Off-Centre Randomised, Double-Blind, Controlled Trial.

    PubMed

    Reinhard, Joscha; Peiffer, Swati; Sänger, Nicole; Herrmann, Eva; Yuan, Juping; Louwen, Frank

    2012-01-01

Objective. To examine the effects of clinical hypnosis versus NLP intervention on the success rate of ECV procedures in comparison to a control group. Methods. A prospective off-centre randomised trial of a clinical hypnosis intervention against NLP of women with a singleton breech fetus at or after 37 0/7 weeks of gestation (259 days) and normal amniotic fluid index. All 80 participants heard a 20-minute recorded intervention via headphones. Main outcome assessed was success rate of ECV. The intervention groups were compared with a control group with standard medical care alone (n = 122). Results. A total of 42 women who received a hypnosis intervention prior to ECV had a 40.5% (n = 17) ECV success rate, whereas 38 women who received NLP had a 44.7% (n = 17) success rate (P > 0.05). The control group had patient characteristics similar to the intervention groups (P > 0.05). The control group's (n = 122) ECV success rate of 27.3% (n = 33) was statistically significantly lower than that of NLP (P = 0.05) and of hypnosis and NLP combined (P = 0.03). Conclusions. These findings suggest that prior clinical hypnosis and NLP have similar ECV success rates and are both superior to standard medical care alone.

  13. After-hour Versus Daytime Shifts in Non-Operating Room Anesthesia Environments: National Distribution of Case Volume, Patient Characteristics, and Procedures.

    PubMed

    Gabriel, Rodney A; Burton, Brittany N; Tsai, Mitchell H; Ehrenfeld, Jesse M; Dutton, Richard P; Urman, Richard D

    2017-09-01

    The objective of this study was to characterize workload during all hours of the day in the non-operating room anesthesia (NORA) environment and identify what type of patients and procedures were more likely to occur during after-hours. By investigating data from the National Anesthesia Clinical Outcomes Registry, we characterized the total number of ongoing NORA cases per hour of the day (0 - 23 h). Results were presented as the mean hour and standard error (SE). Multivariable logistic regression was applied to assess the association of various patient, procedural, and facility characteristics with time of day (after-hours = 17:01-06:59 local time versus day-time). Included in this analysis, there were a total of 4,948,634 cases performed on non-holiday weekdays. The mean hour for ongoing cases for gastroenterology, cardiac, radiology and "other" were: 10.8 with standard error (SE) of 0.002, 11.5 (SE of 0.005), 11.2 (SE of 0.005), and 10.8 (SE of 0.002), respectively. Pairwise differences between means for each NORA specialty were all statistically significant (p < 0.0001). During after-hour shifts (4.3% of cases), patients with higher American Society of Anesthesiologists physical status classification scores had increased odds for undergoing a NORA procedure, while procedures that were more physiologically complex had decreased odds. With the increasing demand for NORA services, it is prudent that we fully understand the challenges of providing safe and efficient anesthetic services particularly in locations where fewer resources are available.

  14. Basics of compounding: considerations for implementing United States pharmacopeia chapter 797 pharmaceutical compounding-sterile preparations, part 16: suggested standard operating procedures.

    PubMed

    Okeke, Claudia C; Allen, Loyd V

    2009-01-01

The standard operating procedures suggested in this article are presented to compounding pharmacies to ensure the quality of the environment in which a compounded sterile preparation (CSP) is prepared. Since United States Pharmacopeia Chapter 797 provides minimum standards, each facility should aim for a best-practice gold standard. The standard operating procedures should be tailored to meet the expectations and design of each facility. Compounding personnel are expected to know and understand each standard operating procedure to allow for complete execution of the procedures.

  15. EQUAL-quant: an international external quality assessment scheme for real-time PCR.

    PubMed

    Ramsden, Simon C; Daly, Sarah; Geilenkeuser, Wolf-Jochen; Duncan, Graeme; Hermitte, Fabienne; Marubini, Ettore; Neumaier, Michael; Orlando, Claudio; Palicka, Vladimir; Paradiso, Angelo; Pazzagli, Mario; Pizzamiglio, Sara; Verderio, Paolo

    2006-08-01

Quantitative gene expression analysis by real-time PCR is important in several diagnostic areas, such as the detection of minimal residual disease in leukemia and the prognostic assessment of cancer patients. To address quality assurance in this technically challenging area, the European Union (EU) has funded the EQUAL project to develop methodologic external quality assessment (EQA) relevant to diagnostic and research laboratories among the EU member states. We report here the results of the EQUAL-quant program, which assesses standards in the use of TaqMan probes, one of the most widely used assays in the implementation of real-time PCR. The EQUAL-quant reagent set was developed to assess the technical execution of a standard TaqMan assay, including RNA extraction, reverse transcription, and real-time PCR quantification of target DNA copy number. The multidisciplinary EQA scheme included 137 participating laboratories from 29 countries. We demonstrated significant differences in performance among laboratories, with 20% of laboratories reporting at least one result lacking in precision and/or accuracy according to the statistical procedures described. No differences in performance were observed for the >10 different testing platforms used by the study participants. This EQA scheme demonstrated both the requirement and demand for external assessment of technical standards in real-time PCR. The reagent design and the statistical tools developed within this project will provide a benchmark for defining acceptable working standards in this emerging technology.

  16. The kinetics-based enzyme-linked immunosorbent assay for coronavirus antibodies in cats: calibration to the indirect immunofluorescence assay and computerized standardization of results through normalization to control values.

    PubMed Central

    Barlough, J E; Jacobson, R H; Downing, D R; Lynch, T J; Scott, F W

    1987-01-01

    The computer-assisted, kinetics-based enzyme-linked immunosorbent assay for coronavirus antibodies in cats was calibrated to the conventional indirect immunofluorescence assay by linear regression analysis and computerized interpolation (generation of "immunofluorescence assay-equivalent" titers). Procedures were developed for normalization and standardization of kinetics-based enzyme-linked immunosorbent assay results through incorporation of five different control sera of predetermined ("expected") titer in daily runs. When used with such sera and with computer assistance, the kinetics-based enzyme-linked immunosorbent assay minimized both within-run and between-run variability while allowing also for efficient data reduction and statistical analysis and reporting of results. PMID:3032390

  17. The kinetics-based enzyme-linked immunosorbent assay for coronavirus antibodies in cats: calibration to the indirect immunofluorescence assay and computerized standardization of results through normalization to control values.

    PubMed

    Barlough, J E; Jacobson, R H; Downing, D R; Lynch, T J; Scott, F W

    1987-01-01

    The computer-assisted, kinetics-based enzyme-linked immunosorbent assay for coronavirus antibodies in cats was calibrated to the conventional indirect immunofluorescence assay by linear regression analysis and computerized interpolation (generation of "immunofluorescence assay-equivalent" titers). Procedures were developed for normalization and standardization of kinetics-based enzyme-linked immunosorbent assay results through incorporation of five different control sera of predetermined ("expected") titer in daily runs. When used with such sera and with computer assistance, the kinetics-based enzyme-linked immunosorbent assay minimized both within-run and between-run variability while allowing also for efficient data reduction and statistical analysis and reporting of results.
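The calibration step described in this record, regressing known IFA titers on kinetics-based ELISA readings and then interpolating "immunofluorescence assay-equivalent" titers for unknowns, can be sketched as follows. The calibration pairs and the log-log linear relationship are invented for illustration, not the paper's data:

```python
import numpy as np

# Hypothetical calibration sera: kinetics-based ELISA rate readings for
# standards of known IFA titer (values invented for illustration).
elisa_rate = np.array([0.05, 0.10, 0.20, 0.40, 0.80])
ifa_titer = 25.0 * (elisa_rate / 0.05) ** 2   # assumed log-log linear relation

# Linear regression on log2 scales (titers are naturally twofold dilutions).
slope, intercept = np.polyfit(np.log2(elisa_rate), np.log2(ifa_titer), 1)

def ifa_equivalent(rate):
    """Interpolate an 'IFA-equivalent' titer from an ELISA rate reading."""
    return 2.0 ** (intercept + slope * np.log2(rate))

print(round(ifa_equivalent(0.3)))   # 25 * (0.3/0.05)^2 = 900
```

Normalization of the kind described, rescaling each run against control sera of predetermined titer, would then multiply each reading by the ratio of expected to observed control values before interpolation.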

  18. Development and Validation of Instruments to Measure Learning of Expert-Like Thinking

    NASA Astrophysics Data System (ADS)

    Adams, Wendy K.; Wieman, Carl E.

    2011-06-01

    This paper describes the process for creating and validating an assessment test that measures the effectiveness of instruction by probing how well that instruction causes students in a class to think like experts about specific areas of science. The design principles and process are laid out and it is shown how these align with professional standards that have been established for educational and psychological testing and the elements of assessment called for in a recent National Research Council study on assessment. The importance of student interviews for creating and validating the test is emphasized, and the appropriate interview procedures are presented. The relevance and use of standard psychometric statistical tests are discussed. Additionally, techniques for effective test administration are presented.

  19. The effect of restructuring student writing in the general chemistry laboratory on student understanding of chemistry and on students' approach to the laboratory course

    NASA Astrophysics Data System (ADS)

    Rudd, James Andrew, II

Many students encounter difficulties engaging with laboratory-based instruction, and reviews of research have indicated that the value of such instruction is not clearly evident. Traditional forms of writing associated with laboratory activities are commonly in a style used by professional scientists to communicate developed explanations. Students probably lack the interpretative skills of a professional, and writing in this style may not support students in learning how to develop scientific explanations. The Science Writing Heuristic (SWH) is an inquiry-based approach to laboratory instruction designed in part to promote student ability in developing such explanations. However, there is not a convincing body of evidence for the superiority of inquiry-based laboratory instruction in chemistry. In a series of studies, the performance of students using the SWH student template in place of the standard laboratory report format was compared to the performance of students using the standard format. The standard reports had Title, Purpose, Procedure, Data & Observations, Calculations & Graphs, and Discussion sections. The SWH reports had Beginning Questions & Ideas, Tests & Procedures, Observations, Claims, Evidence, and Reflection sections. The pilot study produced evidence that using the SWH improved the quality of laboratory reports, improved student performance on a laboratory exam, and improved student approach to laboratory work. A main study found that SWH students exhibited a statistically significantly better understanding of physical equilibrium, based on analysis of written explanations and equations on a lecture exam, and performed descriptively better on a physical equilibrium practical exam task. 
In another main study, the activities covering the general equilibrium concept were restructured as an additional change, and it was found that SWH students exhibited a better understanding of chemical equilibrium as shown by statistically greater success in overcoming the common confusion of interpreting equilibrium as equal concentrations and by statistically better performance when explaining aspects of chemical equilibrium. Both main studies found that students and instructors spent less time on the SWH reports and that students preferred the SWH approach because it increased their level of mental engagement. The studies supported the conclusion that inquiry-based laboratory instruction benefits student learning and attitudes.

  20. Avoid lost discoveries, because of violations of standard assumptions, by using modern robust statistical methods.

    PubMed

    Wilcox, Rand; Carlson, Mike; Azen, Stan; Clark, Florence

    2013-03-01

    Recently, there have been major advances in statistical techniques for assessing central tendency and measures of association. The practical utility of modern methods has been documented extensively in the statistics literature, but they remain underused and relatively unknown in clinical trials. Our objective was to address this issue. STUDY DESIGN AND PURPOSE: The first purpose was to review common problems associated with standard methodologies (low power, lack of control over type I errors, and incorrect assessments of the strength of the association). The second purpose was to summarize some modern methods that can be used to circumvent such problems. The third purpose was to illustrate the practical utility of modern robust methods using data from the Well Elderly 2 randomized controlled trial. In multiple instances, robust methods uncovered differences among groups and associations among variables that were not detected by classic techniques. In particular, the results demonstrated that details of the nature and strength of the association were sometimes overlooked when using ordinary least squares regression and Pearson correlation. Modern robust methods can make a practical difference in detecting and describing differences between groups and associations between variables. Such procedures should be applied more frequently when analyzing trial-based data. Copyright © 2013 Elsevier Inc. All rights reserved.
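A minimal illustration of the kind of robust estimator this article advocates is the trimmed mean, which resists the gross outliers that distort the ordinary mean. The data below are made up, and the 20% trimming proportion is an illustrative choice:

```python
import numpy as np

def trimmed_mean(data, proportion=0.2):
    """Mean after dropping the lowest and highest `proportion` of values."""
    x = np.sort(np.asarray(data, dtype=float))
    g = int(proportion * x.size)          # number trimmed from each tail
    return x[g: x.size - g].mean()

# Made-up sample with one gross outlier.
sample = [1, 2, 3, 4, 5, 6, 7, 8, 9, 100]
print(np.mean(sample))          # 14.5 -- dragged upward by the outlier
print(trimmed_mean(sample))     # 5.5  -- resistant to it
```

Hypothesis tests built on trimmed means (with appropriately adjusted standard errors) retain power under heavy-tailed distributions where t tests on ordinary means lose it, which is the "lost discoveries" problem the title refers to.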

  1. Economic and outcomes consequences of TachoSil®: a systematic review.

    PubMed

    Colombo, Giorgio L; Bettoni, Daria; Di Matteo, Sergio; Grumi, Camilla; Molon, Cinzia; Spinelli, Daniela; Mauro, Gaetano; Tarozzo, Alessia; Bruno, Giacomo M

    2014-01-01

TachoSil® is a medicated sponge coated with human fibrinogen and human thrombin. It is indicated as a support treatment in adult surgery to improve hemostasis, promote tissue sealing, and support sutures when standard surgical techniques are insufficient. This review systematically analyses the international scientific literature relating to the use of TachoSil in hemostasis and as a surgical sealant, from the point of view of its economic impact. We carried out a systematic review of the PubMed literature up to November 2013. Based on the selection criteria, papers were grouped according to the following outcomes: reduction of time to hemostasis; decrease in length of hospital stay; and decrease in postoperative complications. Twenty-four scientific papers were screened, 13 (54%) of which were randomized controlled trials and included a total of 2,116 patients, 1,055 of whom were treated with TachoSil. In the clinical studies carried out in patients undergoing hepatic, cardiac, or renal surgery, the time to hemostasis obtained with TachoSil was lower (1-4 minutes) than the time measured with other techniques and hemostatic drugs, with statistically significant differences. Moreover, in 13 of 15 studies, TachoSil showed a statistically significant reduction in postoperative complications in comparison with the standard surgical procedure. The range of the observed decrease in the length of hospital stay for TachoSil patients was 2.01-3.58 days versus standard techniques, with a statistically significant difference in favor of TachoSil in eight of 15 studies. This analysis shows that TachoSil has a role as a supportive treatment in surgery to improve hemostasis and promote tissue sealing when standard techniques are insufficient, with a consequent decrease in postoperative complications and hospital costs.

  2. Implementing informative priors for heterogeneity in meta-analysis using meta-regression and pseudo data.

    PubMed

    Rhodes, Kirsty M; Turner, Rebecca M; White, Ian R; Jackson, Dan; Spiegelhalter, David J; Higgins, Julian P T

    2016-12-20

    Many meta-analyses combine results from only a small number of studies, a situation in which the between-study variance is imprecisely estimated when standard methods are applied. Bayesian meta-analysis allows incorporation of external evidence on heterogeneity, providing the potential for more robust inference on the effect size of interest. We present a method for performing Bayesian meta-analysis using data augmentation, in which we represent an informative conjugate prior for between-study variance by pseudo data and use meta-regression for estimation. To assist in this, we derive predictive inverse-gamma distributions for the between-study variance expected in future meta-analyses. These may serve as priors for heterogeneity in new meta-analyses. In a simulation study, we compare approximate Bayesian methods using meta-regression and pseudo data against fully Bayesian approaches based on importance sampling techniques and Markov chain Monte Carlo (MCMC). We compare the frequentist properties of these Bayesian methods with those of the commonly used frequentist DerSimonian and Laird procedure. The method is implemented in standard statistical software and provides a less complex alternative to standard MCMC approaches. An importance sampling approach produces almost identical results to standard MCMC approaches, and results obtained through meta-regression and pseudo data are very similar. On average, data augmentation provides closer results to MCMC, if implemented using restricted maximum likelihood estimation rather than DerSimonian and Laird or maximum likelihood estimation. The methods are applied to real datasets, and an extension to network meta-analysis is described. The proposed method facilitates Bayesian meta-analysis in a way that is accessible to applied researchers. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
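
    The DerSimonian and Laird procedure used as the frequentist comparator above is a simple moment estimator. A minimal numpy sketch (illustrative only, not the authors' implementation):

```python
import numpy as np

def dersimonian_laird(y, v):
    """DerSimonian-Laird random-effects meta-analysis.

    y : per-study effect estimates
    v : per-study within-study variances
    Returns (pooled effect, tau^2 moment estimate, standard error).
    """
    y, v = np.asarray(y, float), np.asarray(v, float)
    w = 1.0 / v                          # fixed-effect (inverse-variance) weights
    mu_fe = np.sum(w * y) / np.sum(w)    # fixed-effect pooled mean
    Q = np.sum(w * (y - mu_fe) ** 2)     # Cochran's Q heterogeneity statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - (len(y) - 1)) / c)  # truncated moment estimator
    w_re = 1.0 / (v + tau2)              # random-effects weights
    mu_re = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return mu_re, tau2, se
```

    With few studies, tau² is often truncated to zero here, which is exactly the imprecision that motivates the informative priors proposed in the abstract.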

  3. Effects of hypotensive anesthesia on blood transfusion rates in craniosynostosis corrections.

    PubMed

    Fearon, Jeffrey A; Cook, T Kevin; Herbert, Morley

    2014-05-01

    Hypotensive anesthesia is routinely used during craniosynostosis corrections to reduce blood loss. Noting that cerebral oxygenation levels often fell below recommended levels, the authors sought to measure the effects of hypotensive versus standard anesthesia on blood transfusion rates. One hundred children undergoing craniosynostosis corrections were randomized prospectively into two groups: a target mean arterial pressure of either 50 mm Hg or 60 mm Hg. Aside from anesthesiologists, caregivers were blinded and strict transfusion criteria were followed. Multiple variables were analyzed, and appropriate statistical testing was performed. The hypotensive and standard groups appeared similar, with no statistically significant differences in mean age (46.5 months versus 46.5 months), weight (19.25 kg versus 19.49 kg), procedure [anterior remodeling (34 versus 31) versus posterior (19 versus 16)], or preoperative hemoglobin level (13 g/dl versus 12.9 g/dl). Intraoperative mean arterial pressures differed significantly (56 mm Hg versus 66 mm Hg; p < 0.001). The captured cell saver amount was lower in the hypotensive group (163 cc versus 204 cc; p = 0.02), yet no significant differences were noted in postoperative hemoglobin levels (8.8 g/dl versus 9.3 g/dl). Fifteen of 100 patients (15 percent) received allogenic transfusions, but no statistically significant difference was noted in transfusion rates between the hypotensive [nine of 53 (17.0 percent)] and standard anesthesia [six of 47 (13 percent)] groups (p = 0.056). No significant difference in transfusion requirements was found between hypotensive and standard anesthesia during craniosynostosis corrections. Considering potential benefits of improved cerebral blood flow and total body perfusion, surgeons might consider performing craniosynostosis corrections without hypotension. Therapeutic, II.

  4. 40 CFR Appendix XVIII to Part 86 - Statistical Outlier Identification Procedure for Light-Duty Vehicles and Light Light-Duty Trucks...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 19 2011-07-01 2011-07-01 false Statistical Outlier Identification... (CONTINUED) Pt. 86, App. XVIII Appendix XVIII to Part 86—Statistical Outlier Identification Procedure for..., but suffer theoretical deficiencies if statistical significance tests are required. Consequently, the...

  5. 40 CFR Appendix XVIII to Part 86 - Statistical Outlier Identification Procedure for Light-Duty Vehicles and Light Light-Duty Trucks...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Statistical Outlier Identification... (CONTINUED) Pt. 86, App. XVIII Appendix XVIII to Part 86—Statistical Outlier Identification Procedure for..., but suffer theoretical deficiencies if statistical significance tests are required. Consequently, the...

  6. Developing tools to measure quality in congenital catheterization and interventions: the congenital cardiac catheterization project on outcomes (C3PO).

    PubMed

    Chaudhry-Waterman, Nadia; Coombs, Sandra; Porras, Diego; Holzer, Ralf; Bergersen, Lisa

    2014-01-01

    The broad range of relatively rare procedures performed in pediatric cardiac catheterization laboratories has made the standardization of care and risk assessment in the field statistically quite problematic. However, with the growing number of patients who undergo cardiac catheterization, it has become imperative that the cardiology community overcome these challenges to study patient outcomes. The Congenital Cardiac Catheterization Project on Outcomes was able to develop benchmarks, tools for measurement, and risk adjustment methods while exploring procedural efficacy. Based on the success of these efforts, the collaborative is pursuing a follow-up project, the Congenital Cardiac Catheterization Project on Outcomes-Quality Improvement, aimed at improving the outcomes for all patients undergoing catheterization for congenital heart disease by reducing radiation exposure.

  7. [Validation of measurement methods and estimation of uncertainty of measurement of chemical agents in the air at workstations].

    PubMed

    Dobecki, Marek

    2012-01-01

    This paper reviews the requirements for measurement methods of chemical agents in the air at workstations. European standards, which have the status of Polish standards, comprise requirements and information on sampling strategy, measuring techniques, types of samplers, sampling pumps, and methods of occupational exposure evaluation for a given technological process. Measurement methods, including air sampling and the analytical procedure in a laboratory, should be appropriately validated before intended use. In the validation process, selected methods are tested and an uncertainty budget is established. This paper presents the validation procedure that should be implemented in the laboratory, together with suitable statistical tools and the major components of uncertainty to be taken into consideration. Methods of quality control, including sampling and laboratory analyses, are discussed. The relative expanded uncertainty for each measurement, expressed as a percentage, should not exceed limits set according to the type of occupational exposure (short-term or long-term) and the magnitude of exposure to chemical agents in the work environment.
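
    The relative expanded uncertainty discussed above combines standard-uncertainty components in quadrature and scales by a coverage factor. A minimal sketch (function name and example values are illustrative, not from the paper):

```python
import math

def relative_expanded_uncertainty(value, components, k=2.0):
    """Relative expanded uncertainty (%) of a measurement result.

    components : standard-uncertainty components, combined in quadrature
    k          : coverage factor (k = 2 gives ~95% coverage)
    """
    u_combined = math.sqrt(sum(u * u for u in components))  # combined standard uncertainty
    return 100.0 * k * u_combined / value                   # expanded, relative to the value
```

    For example, a result of 10 with components 0.3 and 0.4 gives a combined standard uncertainty of 0.5 and a relative expanded uncertainty of 10% at k = 2.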

  8. Applications of statistics to medical science, II overview of statistical procedures for general use.

    PubMed

    Watanabe, Hiroshi

    2012-01-01

    Procedures of statistical analysis are reviewed to provide an overview of applications of statistics for general use. Topics dealt with are inference on a population, comparison of two populations with respect to means and probabilities, and multiple comparisons. This study is the second part of a series surveying medical statistics. Arguments related to statistical associations and regressions will be made in subsequent papers.
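
    Comparison of two populations with respect to means, as surveyed above, is commonly done with a two-sample t-test. A numpy sketch of Welch's version (which does not assume equal variances):

```python
import numpy as np

def welch_t(a, b):
    """Welch's two-sample t statistic and Satterthwaite degrees of freedom."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    va, vb = a.var(ddof=1) / len(a), b.var(ddof=1) / len(b)  # squared standard errors
    t = (a.mean() - b.mean()) / np.sqrt(va + vb)
    # Welch-Satterthwaite approximation to the degrees of freedom
    df = (va + vb) ** 2 / (va ** 2 / (len(a) - 1) + vb ** 2 / (len(b) - 1))
    return t, df
```

    The p-value is then obtained from the t distribution with df degrees of freedom (e.g., via a statistics package).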

  9. SAP- FORTRAN STATIC SOURCE CODE ANALYZER PROGRAM (IBM VERSION)

    NASA Technical Reports Server (NTRS)

    Manteufel, R.

    1994-01-01

    The FORTRAN Static Source Code Analyzer program, SAP, was developed to automatically gather statistics on the occurrences of statements and structures within a FORTRAN program and to provide for the reporting of those statistics. Provisions have been made for weighting each statistic and to provide an overall figure of complexity. Statistics, as well as figures of complexity, are gathered on a module by module basis. Overall summed statistics are also accumulated for the complete input source file. SAP accepts as input syntactically correct FORTRAN source code written in the FORTRAN 77 standard language. In addition, code written using features in the following languages is also accepted: VAX-11 FORTRAN, IBM S/360 FORTRAN IV Level H Extended; and Structured FORTRAN. The SAP program utilizes two external files in its analysis procedure. A keyword file allows flexibility in classifying statements and in marking a statement as either executable or non-executable. A statistical weight file allows the user to assign weights to all output statistics, thus allowing the user flexibility in defining the figure of complexity. The SAP program is written in FORTRAN IV for batch execution and has been implemented on a DEC VAX series computer under VMS and on an IBM 370 series computer under MVS. The SAP program was developed in 1978 and last updated in 1985.

  10. SAP- FORTRAN STATIC SOURCE CODE ANALYZER PROGRAM (DEC VAX VERSION)

    NASA Technical Reports Server (NTRS)

    Merwarth, P. D.

    1994-01-01

    The FORTRAN Static Source Code Analyzer program, SAP, was developed to automatically gather statistics on the occurrences of statements and structures within a FORTRAN program and to provide for the reporting of those statistics. Provisions have been made for weighting each statistic and to provide an overall figure of complexity. Statistics, as well as figures of complexity, are gathered on a module by module basis. Overall summed statistics are also accumulated for the complete input source file. SAP accepts as input syntactically correct FORTRAN source code written in the FORTRAN 77 standard language. In addition, code written using features in the following languages is also accepted: VAX-11 FORTRAN, IBM S/360 FORTRAN IV Level H Extended; and Structured FORTRAN. The SAP program utilizes two external files in its analysis procedure. A keyword file allows flexibility in classifying statements and in marking a statement as either executable or non-executable. A statistical weight file allows the user to assign weights to all output statistics, thus allowing the user flexibility in defining the figure of complexity. The SAP program is written in FORTRAN IV for batch execution and has been implemented on a DEC VAX series computer under VMS and on an IBM 370 series computer under MVS. The SAP program was developed in 1978 and last updated in 1985.

  11. Limited Impact of Music Therapy on Patient Anxiety with the Large Loop Excision of Transformation Zone Procedure - a Randomized Controlled Trial.

    PubMed

    Kongsawatvorakul, Chompunoot; Charakorn, Chuenkamon; Paiwattananupant, Krissada; Lekskul, Navamol; Rattanasiri, Sasivimol; Lertkhachonsuk, Arb-Aroon

    2016-01-01

    Many studies have pointed to strategies to cope with patient anxiety in colposcopy. Evidence shows that patients experience considerable distress with the large loop excision of transformation zone (LLETZ) procedure, and suitable interventions should be introduced to reduce anxiety. This study aimed to investigate the effects of music therapy in patients undergoing LLETZ. A randomized controlled trial was conducted with patients undergoing LLETZ performed under local anesthesia in an outpatient setting at Ramathibodi Hospital, Bangkok, Thailand, from February 2015 to January 2016. After informed consent and demographic data were obtained, we assessed the anxiety level using the State Anxiety Inventory before and after the procedure. Music group patients listened to classical songs through headphones, while the control group received standard care. Pain score was evaluated with a visual analog scale (VAS). Statistical analysis was conducted using the Pearson chi-square, Fisher's exact test, and t-test, and p-values less than 0.05 were considered statistically significant. A total of 73 patients were enrolled and randomized, resulting in 36 women in the music group and 37 women in the non-music control group. The preoperative mean anxiety score was higher in the music group (46.8 vs. 45.8 points). The postoperative mean anxiety scores in the music and non-music groups were 38.7 and 41.3 points, respectively. VAS was lower in the music group (2.55 vs. 3.33). The percent change of anxiety was greater in the music group, although there was no significant difference between the two groups. Music therapy did not significantly reduce anxiety in patients undergoing the LLETZ procedure. However, different interventions should be developed to ease the patients' apprehension during this procedure.

  12. MTS dye based colorimetric CTLL-2 cell proliferation assay for product release and stability monitoring of interleukin-15: assay qualification, standardization and statistical analysis.

    PubMed

    Soman, Gopalan; Yang, Xiaoyi; Jiang, Hengguang; Giardina, Steve; Vyas, Vinay; Mitra, George; Yovandich, Jason; Creekmore, Stephen P; Waldmann, Thomas A; Quiñones, Octavio; Alvord, W Gregory

    2009-08-31

    A colorimetric cell proliferation assay using soluble tetrazolium salt [(CellTiter 96(R) Aqueous One Solution) cell proliferation reagent, containing the (3-(4,5-dimethylthiazol-2-yl)-5-(3-carboxymethoxyphenyl)-2-(4-sulfophenyl)-2H-tetrazolium, inner salt) and an electron coupling reagent, phenazine ethosulfate] was optimized and qualified for quantitative determination of IL-15 dependent CTLL-2 cell proliferation activity. An in-house recombinant Human (rHu)IL-15 reference lot was standardized (IU/mg) against an international reference standard. Specificity of the assay for IL-15 was documented by illustrating the ability of neutralizing anti-IL-15 antibodies to block the product specific CTLL-2 cell proliferation and the lack of blocking effect with anti-IL-2 antibodies. Under the defined assay conditions, the linear dose-response concentration range was between 0.04 and 0.17 ng/ml of the rHuIL-15 produced in-house and 0.5-3.0 IU/ml for the international standard. Statistical analysis of the data was performed with the use of scripts written in the R Statistical Language and Environment utilizing a four-parameter logistic regression fit analysis procedure. The overall variation in the ED(50) values for the in-house reference standard from 55 independent estimates performed over the period of 1 year was 12.3% of the average. Excellent intra-plate and within-day/inter-plate consistency was observed for all four parameter estimates in the model. Different preparations of rHuIL-15 showed excellent intra-plate consistency in the parameter estimates corresponding to the lower and upper asymptotes as well as to the 'slope' factor at the mid-point. The ED(50) values showed statistically significant differences for different lots and for control versus stressed samples. Three R-scripts improve data analysis capabilities, allowing one to describe assay variations, to draw inferences between data sets from formal statistical tests, and to set up improved assay acceptance criteria based on comparability and consistency in the four parameters of the model. The assay is precise, accurate and robust and can be fully validated. Applications of the assay were established, including process development support, release of the rHuIL-15 product for pre-clinical and clinical studies, and monitoring storage stability.
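
    The four-parameter logistic (4PL) model underlying the fit has a lower asymptote, an upper asymptote, a mid-point (ED50), and a slope factor; by construction the response at the ED50 lies midway between the asymptotes. A Python sketch of the model form (the abstract's analysis used R; this is illustrative only):

```python
import numpy as np

def four_pl(x, lower, upper, ed50, slope):
    """Four-parameter logistic dose-response model.

    lower/upper : asymptotes;  ed50 : mid-point dose;  slope : Hill-type slope factor.
    At x == ed50 the response is exactly (lower + upper) / 2.
    """
    x = np.asarray(x, float)
    return lower + (upper - lower) / (1.0 + (x / ed50) ** slope)
```

    In practice the four parameters are estimated by nonlinear least squares against the measured absorbances, and lot-to-lot comparability is judged on all four estimates, as described above.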

  13. Pediatric Drowning: A Standard Operating Procedure to Aid the Prehospital Management of Pediatric Cardiac Arrest Resulting From Submersion.

    PubMed

    Best, Rebecca R; Harris, Benjamin H L; Walsh, Jason L; Manfield, Timothy

    2017-05-08

    Drowning is one of the leading causes of death in children. Resuscitating a child following submersion is a high-pressure situation, and standard operating procedures can reduce error. Currently, the Resuscitation Council UK guidance does not include a standard operating procedure on pediatric drowning. The objective of this project was to design a standard operating procedure to improve outcomes of drowned children. A literature review on the management of pediatric drowning was conducted. Relevant publications were used to develop a standard operating procedure for management of pediatric drowning. A concise standard operating procedure was developed for resuscitation following pediatric submersion. Specific recommendations include the following: the Heimlich maneuver should not be used in this context; however, prolonged resuscitation and therapeutic hypothermia are recommended. This standard operating procedure is a potentially useful adjunct to the Resuscitation Council UK guidance and should be considered for incorporation into its next iteration.

  14. Accuracy assessment: The statistical approach to performance evaluation in LACIE. [Great Plains corridor, United States

    NASA Technical Reports Server (NTRS)

    Houston, A. G.; Feiveson, A. H.; Chhikara, R. S.; Hsu, E. M. (Principal Investigator)

    1979-01-01

    A statistical methodology was developed to check the accuracy of the products of the experimental operations throughout crop growth and to determine whether the procedures are adequate to accomplish the desired accuracy and reliability goals. It has allowed the identification and isolation of key problems in wheat area yield estimation, some of which have been corrected and some of which remain to be resolved. The major unresolved problem in accuracy assessment is that of precisely estimating the bias of the LACIE production estimator. Topics covered include: (1) evaluation techniques; (2) variance and bias estimation for the wheat production estimate; (3) the 90/90 evaluation; (4) comparison of the LACIE estimate with reference standards; and (5) first and second order error source investigations.

  15. Poisson statistics of PageRank probabilities of Twitter and Wikipedia networks

    NASA Astrophysics Data System (ADS)

    Frahm, Klaus M.; Shepelyansky, Dima L.

    2014-04-01

    We use the methods of quantum chaos and Random Matrix Theory to analyze statistical fluctuations of PageRank probabilities in directed networks. In this approach the effective energy levels are given by the logarithm of the PageRank probability at a given node. After the standard energy level unfolding procedure, we establish that the nearest-spacing distribution of PageRank probabilities is described by the Poisson law typical for integrable quantum systems. Our studies are done for the Twitter network and three networks of Wikipedia editions in English, French, and German. We argue that due to the absence of level repulsion the PageRank order of nearby nodes can be easily interchanged. The obtained Poisson law implies that the nearby PageRank probabilities fluctuate as random independent variables.
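
    The Poisson law for nearest-neighbor spacings can be illustrated with synthetic uncorrelated levels (not the actual PageRank data): after unfolding to unit mean spacing, the spacing distribution is approximately exponential, so about 1 − e⁻¹ ≈ 0.632 of spacings fall below 1.

```python
import numpy as np

rng = np.random.default_rng(42)
levels = np.sort(rng.random(200000))      # uncorrelated "unfolded" levels on [0, 1]
s = np.diff(levels) * (len(levels) - 1)   # nearest-neighbor spacings, rescaled to unit mean
frac_below_1 = np.mean(s < 1.0)           # Poisson law predicts ~1 - exp(-1) ≈ 0.632
```

    Level repulsion (as in chaotic systems) would instead suppress small spacings, pushing this fraction well below the Poisson value.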

  16. 21 CFR 821.25 - Device tracking system and content requirements: manufacturer requirements.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... accordance with its standard operating procedure of the information identified in paragraphs (a)(1), (a)(2... shall establish a written standard operating procedure for the collection, maintenance, and auditing of... standard operating procedure: (1) Data collection and recording procedures, which shall include a procedure...

  17. Enhanced Sanitation Standard Operating Procedures Have Limited Impact on Listeria monocytogenes Prevalence in Retail Delis.

    PubMed

    Etter, Andrea J; Hammons, Susan R; Roof, Sherry; Simmons, Courtenay; Wu, Tongyu; Cook, Peter W; Katubig, Alex; Stasiewicz, Matthew J; Wright, Emily; Warchocki, Steven; Hollingworth, Jill; Thesmar, Hilary S; Ibrahim, Salam A; Wiedmann, Martin; Oliver, Haley F

    2017-10-20

    In a recent longitudinal surveillance study in 30 U.S. retail delicatessens, 9.7% of environmental surfaces were positive for Listeria monocytogenes, and we found substantial evidence of persistence. In this study, we aimed to reduce the prevalence and persistence of L. monocytogenes in the retail deli environment by developing and implementing practical and feasible intervention strategies (i.e., sanitation standard operating procedures; SSOPs). These SSOPs were standardized across the 30 delis enrolled in this study. SSOP implementation was verified by systems inherent to each retailer. Each deli also was equipped with ATP monitoring systems to verify effective sanitation. We evaluated intervention strategy efficacy by testing 28 food and nonfood contact surfaces for L. monocytogenes for 6 months in all 30 retail delis. The intervention did not significantly reduce L. monocytogenes prevalence compared with preintervention levels; the organism could persist despite implementation of enhanced SSOPs. Systematic and accurate use of ATP monitoring systems varied widely among delis. The findings indicate that intervention strategies in the form of enhanced daily SSOPs were not sufficient to eliminate L. monocytogenes from highly prevalent and persistently contaminated delis and that more aggressive strategies (e.g., deep cleaning or capital investment in redesign or equipment) may be necessary to fully mitigate persistent contamination.

  18. An Initial Design of ISO 19152:2012 LADM Based Valuation and Taxation Data Model

    NASA Astrophysics Data System (ADS)

    Çağdaş, V.; Kara, A.; van Oosterom, P.; Lemmen, C.; Işıkdağ, Ü.; Kathmann, R.; Stubkjær, E.

    2016-10-01

    A fiscal registry or database is supposed to record geometric, legal, physical, economic, and environmental characteristics of property units that are subject to immovable property valuation and taxation. Apart from procedural standards, there is no internationally accepted data standard that defines the semantics of fiscal databases. The ISO 19152:2012 Land Administration Domain Model (LADM), as an international land administration standard, focuses on legal requirements but considers specifications of external information systems, including valuation and taxation databases, to be out of scope. However, it provides a formalism that allows for an extension responding to fiscal requirements. This paper introduces an initial version of a LADM - Fiscal Extension Module for the specification of databases used in immovable property valuation and taxation. The extension module is designed to facilitate all stages of immovable property taxation, namely the identification of properties and taxpayers, assessment of properties through single or mass appraisal procedures, automatic generation of sales statistics, and the management of tax collection, dealing with arrears and appeals. It is expected that the initial version will be refined through further activities held by a possible joint working group under FIG Commission 7 (Cadastre and Land Management) and FIG Commission 9 (Valuation and the Management of Real Estate) in collaboration with other relevant international bodies.

  19. KMC 3: counting and manipulating k-mer statistics.

    PubMed

    Kokot, Marek; Dlugosz, Maciej; Deorowicz, Sebastian

    2017-09-01

    Counting all k-mers in a given dataset is a standard procedure in many bioinformatics applications. We introduce KMC3, a significant improvement of the former KMC2 algorithm, together with KMC tools for manipulating k-mer databases. Usefulness of the tools is shown on a few real problems. The program is freely available at http://sun.aei.polsl.pl/REFRESH/kmc . sebastian.deorowicz@polsl.pl. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
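
    The k-mer counting task itself is simple to state: slide a window of length k over the sequence and tally each substring. A naive Python sketch (KMC3 itself is a highly optimized disk-based C++ tool; this only illustrates the operation being counted):

```python
from collections import Counter

def count_kmers(seq, k):
    """Count all overlapping k-mers in a sequence string."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

counts = count_kmers("ACGTACGT", 3)
# 6 overlapping 3-mers; "ACG" and "CGT" each occur twice
```

    Tools like KMC exist because real datasets contain billions of k-mers, far beyond what an in-memory dictionary like this can handle.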

  20. 10 CFR 26.127 - Procedures.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ..., implement, and maintain written standard operating procedures for each assay performed for drug and specimen... facility shall develop, implement, and maintain written standard operating procedures for each test. The...; (2) Preparation of reagents, standards, and controls; (3) Calibration procedures; (4) Derivation of...

  1. 10 CFR 26.127 - Procedures.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ..., implement, and maintain written standard operating procedures for each assay performed for drug and specimen... facility shall develop, implement, and maintain written standard operating procedures for each test. The...; (2) Preparation of reagents, standards, and controls; (3) Calibration procedures; (4) Derivation of...

  2. 10 CFR 26.127 - Procedures.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ..., implement, and maintain written standard operating procedures for each assay performed for drug and specimen... facility shall develop, implement, and maintain written standard operating procedures for each test. The...; (2) Preparation of reagents, standards, and controls; (3) Calibration procedures; (4) Derivation of...

  3. Evaluation of a new very low dose imaging protocol: feasibility and impact on X-ray dose levels in electrophysiology procedures

    PubMed Central

    Bourier, Felix; Reents, Tilko; Ammar-Busch, Sonia; Buiatti, Alessandra; Kottmaier, Marc; Semmler, Verena; Telishevska, Marta; Brkic, Amir; Grebmer, Christian; Lennerz, Carsten; Kolb, Christof; Hessling, Gabriele; Deisenhofer, Isabel

    2016-01-01

    Aims This study presents and evaluates the impact of a new lowest-dose fluoroscopy protocol (Siemens AG), especially designed for electrophysiology (EP) procedures, on X-ray dose levels. Methods and results From October 2014 to March 2015, 140 patients underwent an EP study on an Artis zee angiography system. The standard low-dose protocol was operated at 23 nGy (fluoroscopy) and at 120 nGy (cine-loop), the new lowest-dose protocol was operated at 8 nGy (fluoroscopy) and at 36 nGy (cine-loop). Procedural data, X-ray times, and doses were analysed in 100 complex left atrial and in 40 standard EP procedures. The resulting dose–area products were 877.9 ± 624.7 µGym² (n = 50 complex procedures, standard low dose), 199 ± 159.6 µGym² (n = 50 complex procedures, lowest dose), 387.7 ± 36.0 µGym² (n = 20 standard procedures, standard low dose), and 90.7 ± 62.3 µGym² (n = 20 standard procedures, lowest dose), P < 0.01. In the low-dose and lowest-dose groups, procedure times were 132.6 ± 35.7 vs. 126.7 ± 34.7 min (P = 0.40, complex procedures) and 72.3 ± 20.9 vs. 85.2 ± 44.1 min (P = 0.24, standard procedures), radiofrequency (RF) times were 53.8 ± 26.1 vs. 50.4 ± 29.4 min (P = 0.54, complex procedures) and 10.1 ± 9.9 vs. 12.2 ± 14.7 min (P = 0.60, standard procedures). One complication occurred in the standard low-dose and lowest-dose groups (P = 1.0). Conclusion The new lowest-dose imaging protocol reduces X-ray dose levels by 77% compared with the currently available standard low-dose protocol. From an operator standpoint, lowest X-ray dose levels create a different, reduced image quality. The new image quality did not significantly affect procedure or RF times and did not result in higher complication rates. Regarding radiological protection, operating at lowest-dose settings should become standard in EP procedures. PMID:26589627

  4. Using a Five-Step Procedure for Inferential Statistical Analyses

    ERIC Educational Resources Information Center

    Kamin, Lawrence F.

    2010-01-01

    Many statistics texts pose inferential statistical problems in a disjointed way. By using a simple five-step procedure as a template for statistical inference problems, the student can solve problems in an organized fashion. The problem and its solution will thus be a stand-by-itself organic whole and a single unit of thought and effort. The…

  5. 76 FR 80261 - National Emission Standards for Hazardous Air Pollutants: Area Source Standards for Prepared...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-23

    ... specifications and operating instructions, if available, or standard operating procedures must be developed by... manufacturer's specifications and operating instructions, if available, or standard operating procedures must... operating specifications or standard operating procedures developed by the prepared feeds manufacturer be...

  6. Estimation of selected seasonal streamflow statistics representative of 1930-2002 in West Virginia

    USGS Publications Warehouse

    Wiley, Jeffrey B.; Atkins, John T.

    2010-01-01

    Regional equations and procedures were developed for estimating seasonal 1-day 10-year, 7-day 10-year, and 30-day 5-year hydrologically based low-flow frequency values for unregulated streams in West Virginia. Regional equations and procedures also were developed for estimating the seasonal U.S. Environmental Protection Agency harmonic-mean flows and the 50-percent flow-duration values. The seasons were defined as winter (January 1-March 31), spring (April 1-June 30), summer (July 1-September 30), and fall (October 1-December 31). Regional equations were developed using ordinary least squares regression using statistics from 117 U.S. Geological Survey continuous streamgage stations as dependent variables and basin characteristics as independent variables. Equations for three regions in West Virginia-North, South-Central, and Eastern Panhandle Regions-were determined. Drainage area, average annual precipitation, and longitude of the basin centroid are significant independent variables in one or more of the equations. The average standard error of estimates for the equations ranged from 12.6 to 299 percent. Procedures developed to estimate the selected seasonal streamflow statistics in this study are applicable only to rural, unregulated streams within the boundaries of West Virginia that have independent variables within the limits of the stations used to develop the regional equations: drainage area from 16.3 to 1,516 square miles in the North Region, from 2.78 to 1,619 square miles in the South-Central Region, and from 8.83 to 3,041 square miles in the Eastern Panhandle Region; average annual precipitation from 42.3 to 61.4 inches in the South-Central Region and from 39.8 to 52.9 inches in the Eastern Panhandle Region; and longitude of the basin centroid from 79.618 to 82.023 decimal degrees in the North Region. All estimates of seasonal streamflow statistics are representative of the period from the 1930 to the 2002 climatic year.
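
    Regional equations of this kind are typically fitted by ordinary least squares in log space, regressing the log of a flow statistic on the log of basin characteristics. A minimal numpy sketch with hypothetical numbers (not the report's actual data or coefficients):

```python
import numpy as np

# hypothetical basins: drainage area (mi^2) vs. a seasonal low-flow statistic (ft^3/s)
area = np.array([20.0, 150.0, 800.0, 1500.0])
flow = np.array([1.2, 9.5, 60.0, 130.0])

# fit log10(flow) = b0 + b1 * log10(area) by ordinary least squares
A = np.column_stack([np.ones(area.size), np.log10(area)])
b, *_ = np.linalg.lstsq(A, np.log10(flow), rcond=None)

def predict(a):
    """Back-transform the fitted log-space equation to flow units."""
    return 10 ** (b[0] + b[1] * np.log10(a))
```

    As the abstract notes, such equations are valid only for basins whose characteristics fall within the range of the stations used to fit them.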

  7. Using statistical process control to make data-based clinical decisions.

    PubMed

    Pfadt, A; Wheeler, D J

    1995-01-01

    Applied behavior analysis is based on an investigation of variability due to interrelationships among antecedents, behavior, and consequences. This permits testable hypotheses about the causes of behavior and allows the course of treatment to be evaluated empirically. Such information provides corrective feedback for making data-based clinical decisions. This paper considers how a different approach to the analysis of variability, based on the writings of Walter Shewhart and W. Edwards Deming in the area of industrial quality control, helps to achieve similar objectives. Statistical process control (SPC) was developed to implement a process of continual product improvement while achieving compliance with production standards and other requirements for promoting customer satisfaction. SPC involves the use of simple statistical tools, such as histograms and control charts, as well as problem-solving techniques, such as flow charts, cause-and-effect diagrams, and Pareto charts, to implement Deming's management philosophy. These data-analytic procedures can be incorporated into a human service organization to help to achieve its stated objectives in a manner that leads to continuous improvement in the functioning of the clients who are its customers. Examples are provided to illustrate how SPC procedures can be used to analyze behavioral data. Issues related to the application of these tools for making data-based clinical decisions and for creating an organizational climate that promotes their routine use in applied settings are also considered.
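
    The control charts mentioned above flag observations outside limits set at three estimated standard deviations from the center line. A sketch of the individuals (XmR) chart computation, which estimates sigma from the average moving range using the standard bias-correction constant d2 = 1.128 for spans of two:

```python
import numpy as np

def individuals_chart_limits(x):
    """Lower limit, center line, and upper limit for an individuals (XmR) chart.

    sigma is estimated as (average moving range) / d2, with d2 = 1.128 for n = 2.
    """
    x = np.asarray(x, float)
    moving_range = np.abs(np.diff(x))          # absolute successive differences
    center = x.mean()
    sigma_hat = moving_range.mean() / 1.128    # bias-corrected sigma estimate
    return center - 3 * sigma_hat, center, center + 3 * sigma_hat
```

    Points beyond the limits signal special-cause variation worth investigating, which is the feedback loop the paper proposes for clinical decisions.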

  8. 40 CFR 63.1547 - Monitoring requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... standard operating procedures manual that describes in detail the procedures for inspection, maintenance...) The standard operating procedures manual for baghouses required by paragraph (a) of this section shall... specified in the standard operating procedures manual for inspections and routine maintenance shall, at a...

  9. 40 CFR 63.1547 - Monitoring requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... standard operating procedures manual that describes in detail the procedures for inspection, maintenance...) The standard operating procedures manual for baghouses required by paragraph (a) of this section shall... specified in the standard operating procedures manual for inspections and routine maintenance shall, at a...

  10. Analysis of repeated measurement data in the clinical trials

    PubMed Central

    Singh, Vineeta; Rana, Rakesh Kumar; Singhal, Richa

    2013-01-01

    Statistics is an integral part of clinical trials. Elements of statistics span clinical trial design, data monitoring, analyses, and reporting. A solid understanding of statistical concepts by clinicians improves the comprehension and the resulting quality of clinical trials. In biomedical research, researchers frequently use the t-test and ANOVA to compare means between groups of interest irrespective of the nature of the data. In clinical trials, however, data are recorded on the same patients more than twice. In such a situation the standard ANOVA procedures are not appropriate, because they do not account for dependencies between observations within subjects. To deal with such study data, repeated-measures ANOVA should be used. In this article the application of one-way repeated-measures ANOVA is demonstrated using the software SPSS (Statistical Package for Social Sciences) Version 15.0 on data collected at four time points (0 day, 15th day, 30th day, and 45th day) of a multicentre clinical trial conducted on Pandu Roga (~Iron Deficiency Anemia) with an Ayurvedic formulation, Dhatrilauha. PMID:23930038
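
    The partitioning that distinguishes repeated-measures ANOVA from the standard procedure can be shown directly: subject-to-subject variation is removed from the error term. The hemoglobin-like values below are fabricated for illustration (the article's actual analysis used SPSS on trial data).

```python
from statistics import mean

# Hypothetical values for 5 subjects at 4 time points (day 0, 15, 30, 45);
# illustrative only, not the trial's data.
scores = [
    [8.1, 8.9, 9.6, 10.2],
    [7.5, 8.2, 9.0, 9.8],
    [8.4, 8.8, 9.9, 10.5],
    [7.9, 8.6, 9.4, 10.0],
    [8.0, 8.7, 9.5, 10.1],
]
n, k = len(scores), len(scores[0])

grand = mean(v for row in scores for v in row)
time_means = [mean(row[j] for row in scores) for j in range(k)]
subj_means = [mean(row) for row in scores]

# Partition the total sum of squares; the within-subject design removes
# subject variation (ss_subj) from the error term.
ss_time = n * sum((m - grand) ** 2 for m in time_means)
ss_subj = k * sum((m - grand) ** 2 for m in subj_means)
ss_total = sum((v - grand) ** 2 for row in scores for v in row)
ss_error = ss_total - ss_time - ss_subj

df_time, df_error = k - 1, (n - 1) * (k - 1)
f_stat = (ss_time / df_time) / (ss_error / df_error)  # F for the time effect
```

    A between-groups ANOVA on the same numbers would fold `ss_subj` into the error term, diluting the time effect, which is exactly the inappropriateness the abstract warns about.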

  11. 76 FR 80314 - National Emission Standards for Hazardous Air Pollutants: Area Source Standards for Prepared...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-23

    ... operating instructions, if available, or standard operating procedures must be developed by the facility... operating instructions, if available, or standard operating procedures must be developed by the facility... standard operating procedures developed by the prepared feeds manufacturer be required as part of the...

  12. Embryo transfer techniques: an American Society for Reproductive Medicine survey of current Society for Assisted Reproductive Technology practices.

    PubMed

    Toth, Thomas L; Lee, Malinda S; Bendikson, Kristin A; Reindollar, Richard H

    2017-04-01

    To better understand practice patterns and opportunities for standardization of ET. Cross-sectional survey. Not applicable. Not applicable. An anonymous 82-question survey was emailed to the medical directors of 286 Society for Assisted Reproductive Technology member IVF practices. A follow-up survey composed of three questions specific to ET technique was emailed to the same medical directors. Descriptive statistics of the results were compiled. The survey assessed policies, protocols, restrictions, and specifics pertinent to the technique of ET. There were 117 (41%) responses; 32% practice in academic settings and 68% in private practice. Responders were experienced clinicians, half of whom had performed <10 procedures during training. Ninety-eight percent of practices allowed all practitioners to perform ET; half did not follow a standardized ET technique. Multiple steps in the ET process were identified as "highly conserved;" others demonstrated discordance. ET technique is divided among [1] trial transfer followed immediately with ET (40%); [2] afterload transfer (30%); and [3] direct transfer without prior trial or afterload (27%). Embryos are discharged in the upper (66%) and middle thirds (29%) of the endometrial cavity and not closer than 1-1.5 cm from fundus (87%). Details of each step were reported and allowed the development of a "common" practice ET procedure. ET training and practices vary widely. Improved training and standardization based on outcomes data and best practices are warranted. A common practice procedure is suggested for validation by a systematic literature review. Copyright © 2017 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  13. A method to estimate statistical errors of properties derived from charge-density modelling

    PubMed Central

    Lecomte, Claude

    2018-01-01

    Estimating uncertainties of property values derived from a charge-density model is not straightforward. A methodology, based on calculation of sample standard deviations (SSD) of properties using randomly deviating charge-density models, is proposed with the MoPro software. The parameter shifts applied in the deviating models are generated so as to respect the variance–covariance matrix obtained from the least-squares refinement. This ‘SSD methodology’ procedure can be applied to estimate uncertainties of any property related to a charge-density model obtained by least-squares fitting. This includes topological properties such as critical point coordinates, electron density, Laplacian and ellipticity at critical points, and charges integrated over atomic basins. Errors on electrostatic potentials and interaction energies are also available now through this procedure. The method is exemplified with the charge density of compound (E)-5-phenylpent-1-enylboronic acid, refined at 0.45 Å resolution. The procedure is implemented in the freely available MoPro program dedicated to charge-density refinement and modelling. PMID:29724964
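
    A minimal sketch of the sampling idea behind the SSD methodology follows. The parameters, covariance matrix, and derived property are all made up for illustration; MoPro applies the same idea to full charge-density models with many parameters.

```python
import random
from statistics import stdev

random.seed(42)

# Two 'refined' parameters and their variance-covariance matrix from a
# least-squares fit (all values invented for this sketch).
p = [1.50, 0.80]
cov = [[0.0004, 0.0001],
       [0.0001, 0.0009]]

# Cholesky factor of the 2x2 covariance matrix (cov = L @ L.T), so that
# correlated Gaussian shifts can be generated from independent normals.
l00 = cov[0][0] ** 0.5
l10 = cov[1][0] / l00
l11 = (cov[1][1] - l10 ** 2) ** 0.5

def deviated_model():
    """One randomly deviating parameter set respecting the covariance."""
    z0, z1 = random.gauss(0, 1), random.gauss(0, 1)
    return [p[0] + l00 * z0, p[1] + l10 * z0 + l11 * z1]

def derived_property(q):
    # Hypothetical nonlinear property of the model parameters.
    return q[0] * q[1]

samples = [derived_property(deviated_model()) for _ in range(5000)]
ssd = stdev(samples)  # sample standard deviation = estimated uncertainty
```

    The appeal of the approach is visible even here: no analytic error-propagation formula for `derived_property` is needed, only repeated evaluation on deviating models.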

  14. Analysis of half diallel mating designs I: a practical analysis procedure for ANOVA approximation.

    Treesearch

    G.R. Johnson; J.N. King

    1998-01-01

    Procedures to analyze half-diallel mating designs using the SAS statistical package are presented. The procedure requires two runs of PROC VARCOMP and results in estimates of additive and non-additive genetic variation. The procedures described can be modified to work on most statistical software packages which can compute variance component estimates. The...

  15. Radiation exposure during in-situ pinning of slipped capital femoral epiphysis hips: does the patient positioning matter?

    PubMed

    Mohammed, Riazuddin; Johnson, Karl; Bache, Ed

    2010-07-01

    Multiple radiographic images may be necessary during the standard procedure of in-situ pinning of slipped capital femoral epiphysis (SCFE) hips. This procedure can be performed with the patient positioned on a fracture table or a radiolucent table. Our study aims to identify any differences in the amount and duration of radiation exposure for in-situ pinning of SCFE performed using a traction table or a radiolucent table. Sixteen hips in thirteen patients pinned on a radiolucent table were compared for cumulative radiation exposure with 35 hips in 33 patients pinned on a fracture table during the same time period. Cumulative radiation dose was measured as dose area product in Gy·cm², and the duration of exposure was measured in minutes. Appropriate statistical tests were used to test the significance of any differences. The mean cumulative radiation dose for SCFE hips pinned on the radiolucent table was significantly less than for those pinned on the fracture table (P<0.05). The mean duration of radiation exposure on either table was not significantly different. Lateral projections may increase radiation doses compared with anteroposterior projections because of the higher exposure parameters needed for side imaging. Our results showing decreased exposure doses on the radiolucent table are probably due to the ease of obtaining a frog-leg lateral position and thereby the ease of lateral imaging. In-situ pinning of SCFE hips on a radiolucent table has the additional advantage that the radiation dose during the procedure is significantly less than when the procedure is performed on a fracture table.

  16. Internal Delorme's Procedure for Treating ODS Associated With Impaired Anal Continence.

    PubMed

    Liu, Weicheng; Sturiale, Alessandro; Fabiani, Bernardina; Giani, Iacopo; Menconi, Claudia; Naldini, Gabriele

    2017-12-01

    The aim of this study was to evaluate the medium-term outcomes of internal Delorme's procedure for treating obstructed defecation syndrome (ODS) patients with impaired anal continence. In a retrospective study, 41 ODS patients who underwent internal Delorme's procedure between 2011 and 2015 were divided into 3 subgroups according to their associated symptoms of impaired continence: urgency, passive fecal incontinence, or both. The patients' preoperative statuses, perioperative complications, and postoperative outcomes were investigated and collected from standardized questionnaires, including the Altomare ODS score, Fecal Incontinence Severity Index (FISI), Patient Assessment of Constipation-Quality of Life Questionnaire (PAC-QoL), and Fecal Incontinence Quality of Life Scale (FIQLS). All results with a 2-tailed P < .05 were considered statistically significant. At an average 2.8 years of follow-up, there were significant improvements (P < .01) in the Altomare ODS score, FISI, PAC-QoL, and FIQLS across all patients when comparing scores from before the operation with those at the final follow-up. Similar results were observed in the urgency subgroup and in the passive fecal incontinence subgroup, but there were no statistically significant improvements (P > .05) in the Altomare ODS score, FISI, PAC-QoL, or FIQLS in the subgroup with both urgency and passive fecal incontinence. Anorectal manometry showed that the mean anal resting pressure increased 20%. Additionally, no major complications occurred. Internal Delorme's procedure is effective without major morbidity for treating ODS associated with urgency or passive fecal incontinence, but it may be less effective for treating ODS associated with both urgency and passive fecal incontinence.

  17. Variation in rates of breast cancer surgery: A national analysis based on French Hospital Episode Statistics.

    PubMed

    Rococo, E; Mazouni, C; Or, Z; Mobillion, V; Koon Sun Pat, M; Bonastre, J

    2016-01-01

    Minimum volume thresholds were introduced in France in 2008 to improve the quality of cancer care. We investigated whether/how the quality of treatment decisions in breast cancer surgery had evolved before and after this policy was implemented. We used Hospital Episode Statistics for all women having undergone breast conserving surgery (BCS) or mastectomy in France in 2005 and 2012. Three surgical procedures considered as better treatment options were analyzed: BCS, immediate breast reconstruction (IBR) and sentinel lymph node biopsy (SLNB). We studied the mean rates and variation according to the hospital profile and volume. Between 2005 and 2012, the volume of breast cancer surgery increased by 11% whereas one third of the hospitals no longer performed this type of surgery. In 2012, the mean rate of BCS was 74% and similar in all hospitals whatever the volume. Conversely, IBR and SLNB rates were much higher in cancer centers (CC) and regional teaching hospitals (RTH) [IBR: 19% and 14% versus 8% on average; SLNB: 61% and 47% versus 39% on average]; the greater the hospital volume, the higher the IBR and SLNB rates (p < 0.0001). Overall, whatever the surgical procedure considered, inter-hospital variation in rates declined substantially in CC and RTH. We identified considerable variation in IBR and SLNB rates between French hospitals. Although more complex and less standardized than BCS, most clinical guidelines recommended these procedures. This apparent heterogeneity suggests unequal access to high-quality procedures for women with breast cancer. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. 40 CFR 63.1547 - Monitoring requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... according to, a standard operating procedures manual that describes in detail the procedures for inspection...) The standard operating procedures manual for baghouses required by paragraph (a) of this section must... specified in the standard operating procedures manual for inspections and routine maintenance must, at a...

  19. 40 CFR 63.1547 - Monitoring requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... according to, a standard operating procedures manual that describes in detail the procedures for inspection...) The standard operating procedures manual for baghouses required by paragraph (a) of this section must... specified in the standard operating procedures manual for inspections and routine maintenance must, at a...

  20. 40 CFR 63.1547 - Monitoring requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... according to, a standard operating procedures manual that describes in detail the procedures for inspection...) The standard operating procedures manual for baghouses required by paragraph (a) of this section must... specified in the standard operating procedures manual for inspections and routine maintenance must, at a...

  1. Study Methods to Standardize Thermography NDE

    NASA Technical Reports Server (NTRS)

    Walker, James L.; Workman, Gary L.

    1998-01-01

    The purpose of this work is to develop thermographic inspection methods and standards for use in evaluating structural composites and aerospace hardware. Qualification techniques and calibration methods are investigated to standardize the thermographic method for use in the field. Along with the inspections of test standards and structural hardware, support hardware is designed and fabricated to aid in the thermographic process. Also, a standard operating procedure is developed for performing inspections with the Bales Thermal Image Processor (TIP). Inspections are performed on a broad range of structural composites. These materials include various graphite/epoxies, graphite/cyanide-ester, graphite/silicon-carbide, graphite phenolic and Kevlar/epoxy. Also metal honeycomb (titanium and aluminum faceplates over an aluminum honeycomb core) structures are investigated. Various structural shapes are investigated and the thickness of the structures vary from as few as 3 plies to as many as 80 plies. Special emphasis is placed on characterizing defects in attachment holes and bondlines, in addition to those resulting from impact damage and the inclusion of foreign matter. Image processing through statistical analysis and digital filtering is investigated to enhance the quality of the NDE thermal images and quantify them when necessary.

  3. Statistical Reporting Errors and Collaboration on Statistical Analyses in Psychological Science.

    PubMed

    Veldkamp, Coosje L S; Nuijten, Michèle B; Dominguez-Alvarez, Linda; van Assen, Marcel A L M; Wicherts, Jelte M

    2014-01-01

    Statistical analysis is error prone. A best practice for researchers using statistics would therefore be to share data among co-authors, allowing double-checking of executed tasks just as co-pilots do in aviation. To document the extent to which this 'co-piloting' currently occurs in psychology, we surveyed the authors of 697 articles published in six top psychology journals and asked them whether they had collaborated on four aspects of analyzing data and reporting results, and whether the described data had been shared between the authors. We acquired responses for 49.6% of the articles and found that co-piloting on statistical analysis and reporting results is quite uncommon among psychologists, while data sharing among co-authors seems reasonably but not completely standard. We then used an automated procedure to study the prevalence of statistical reporting errors in the articles in our sample and examined the relationship between reporting errors and co-piloting. Overall, 63% of the articles contained at least one p-value that was inconsistent with the reported test statistic and the accompanying degrees of freedom, and 20% of the articles contained at least one p-value that was inconsistent to such a degree that it may have affected decisions about statistical significance. Overall, the probability that a given p-value was inconsistent was over 10%. Co-piloting was not found to be associated with reporting errors.
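
    An automated consistency check of this kind can be sketched for the simple case of a z statistic, where the standard normal CDF suffices (the study itself used the statcheck-style approach across t, F, and other tests; the reported values below are invented).

```python
import math

def two_sided_p_from_z(z):
    """Two-sided p-value for a standard normal test statistic."""
    return math.erfc(abs(z) / math.sqrt(2))

def consistent(reported_p, z, tol=0.005):
    """Does the reported p agree with the p recomputed from z?"""
    return abs(two_sided_p_from_z(z) - reported_p) <= tol

# Hypothetical reported results as (reported p, z) pairs; the third
# entry is a deliberate gross inconsistency.
reports = [(0.046, 2.0), (0.020, 2.33), (0.300, 2.58)]
flags = [not consistent(p, z) for p, z in reports]
```

    For t statistics the same check needs the t distribution's CDF (and hence the reported degrees of freedom), which is exactly why the automated procedure extracts statistic, df, and p together.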

  4. Statistical Reporting Errors and Collaboration on Statistical Analyses in Psychological Science

    PubMed Central

    Veldkamp, Coosje L. S.; Nuijten, Michèle B.; Dominguez-Alvarez, Linda; van Assen, Marcel A. L. M.; Wicherts, Jelte M.

    2014-01-01

    Statistical analysis is error prone. A best practice for researchers using statistics would therefore be to share data among co-authors, allowing double-checking of executed tasks just as co-pilots do in aviation. To document the extent to which this ‘co-piloting’ currently occurs in psychology, we surveyed the authors of 697 articles published in six top psychology journals and asked them whether they had collaborated on four aspects of analyzing data and reporting results, and whether the described data had been shared between the authors. We acquired responses for 49.6% of the articles and found that co-piloting on statistical analysis and reporting results is quite uncommon among psychologists, while data sharing among co-authors seems reasonably but not completely standard. We then used an automated procedure to study the prevalence of statistical reporting errors in the articles in our sample and examined the relationship between reporting errors and co-piloting. Overall, 63% of the articles contained at least one p-value that was inconsistent with the reported test statistic and the accompanying degrees of freedom, and 20% of the articles contained at least one p-value that was inconsistent to such a degree that it may have affected decisions about statistical significance. Overall, the probability that a given p-value was inconsistent was over 10%. Co-piloting was not found to be associated with reporting errors. PMID:25493918

  5. Interlaboratory evaluation of a standardized inductively coupled plasma mass spectrometry method for the determination of trace beryllium in air filter samples.

    PubMed

    Ashley, Kevin; Brisson, Michael J; Howe, Alan M; Bartley, David L

    2009-12-01

    A collaborative interlaboratory evaluation of a newly standardized inductively coupled plasma mass spectrometry (ICP-MS) method for determining trace beryllium in workplace air samples was carried out toward fulfillment of method validation requirements for ASTM International voluntary consensus standard test methods. The interlaboratory study (ILS) was performed in accordance with an applicable ASTM International standard practice, ASTM E691, which describes statistical procedures for investigating interlaboratory precision. Uncertainty was also estimated in accordance with ASTM D7440, which applies the International Organization for Standardization Guide to the Expression of Uncertainty in Measurement to air quality measurements. Performance evaluation materials (PEMs) used consisted of 37 mm diameter mixed cellulose ester filters that were spiked with beryllium at levels of 0.025 (low loading), 0.5 (medium loading), and 10 (high loading) microg Be/filter; these spiked filters were prepared by a contract laboratory. Participating laboratories were recruited from a pool of over 50 invitees; ultimately, 20 laboratories from Europe, North America, and Asia submitted ILS results. Triplicates of each PEM (blanks plus the three different loading levels) were conveyed to each volunteer laboratory, along with a copy of the draft standard test method that each participant was asked to follow; spiking levels were unknown to the participants. The laboratories were requested to prepare the PEMs by one of three sample preparation procedures (hotplate or microwave digestion or hotblock extraction) that were described in the draft standard. Participants were then asked to analyze aliquots of the prepared samples by ICP-MS and to report their data in units of microg Be/filter sample.
Interlaboratory precision estimates from participating laboratories, computed in accordance with ASTM E691, were 0.165, 0.108, and 0.151 (relative standard deviation) for the PEMs spiked at 0.025, 0.5, and 10 microg Be/filter, respectively. Overall recoveries were 93.2%, 102%, and 80.6% for the low, medium, and high beryllium loadings, respectively. Expanded uncertainty estimates for interlaboratory analysis of low, medium, and high beryllium loadings, calculated in accordance with ASTM D7440, were 18.8%, 19.8%, and 24.4%, respectively. These figures of merit support promulgation of the analytical procedure as an ASTM International standard test method, ASTM D7439.
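
    The recovery and precision figures of merit can be illustrated with a toy calculation. The laboratory results below are invented, and ASTM E691 proper separates repeatability and reproducibility components rather than pooling all replicates as done here; this sketch only shows what a relative standard deviation and a percent recovery are.

```python
from statistics import mean, stdev

# Hypothetical results (microg Be/filter) from five labs, three
# replicates each, for a 0.5 microg medium-loading PEM; illustrative only.
spike = 0.5
labs = [
    [0.48, 0.50, 0.49],
    [0.52, 0.55, 0.53],
    [0.47, 0.46, 0.48],
    [0.51, 0.50, 0.52],
    [0.55, 0.57, 0.54],
]

all_results = [x for lab in labs for x in lab]
recovery = 100 * mean(all_results) / spike    # percent recovery vs spike
rsd = stdev(all_results) / mean(all_results)  # pooled relative std deviation
```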

  6. 76 FR 47518 - Energy Conservation Program: Treatment of “Smart” Appliances in Energy Conservation Standards and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-05

    ... Conservation Program: Treatment of ``Smart'' Appliances in Energy Conservation Standards and Test Procedures... well as in test procedures used to demonstrate compliance with DOE's standards and qualification as an... development of energy conservation standards and test procedures for DOE's Appliance Standards Program and the...

  7. Twice random, once mixed: applying mixed models to simultaneously analyze random effects of language and participants.

    PubMed

    Janssen, Dirk P

    2012-03-01

    Psychologists, psycholinguists, and other researchers using language stimuli have been struggling for more than 30 years with the problem of how to analyze experimental data that contain two crossed random effects (items and participants). The classical analysis of variance does not apply; alternatives have been proposed but have failed to catch on, and a statistically unsatisfactory procedure of using two approximations (known as F(1) and F(2)) has become the standard. A simple and elegant solution using mixed-model analysis has been available for 15 years, and recent improvements in statistical software have made mixed-model analysis widely available. The aim of this article is to increase the use of mixed models by giving a concise practical introduction and by giving clear directions for undertaking the analysis in the most popular statistical packages. The article also introduces DJMIXED, an add-on package for SPSS, which makes entering the models and reporting their results as straightforward as possible.
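
    The crossed-random-effects problem can be made concrete with a small simulation (all effect sizes below are invented). The two aggregations are the ones behind the classical F(1) (by-participant) and F(2) (by-item) approximations; a mixed model with crossed random intercepts for subjects and items replaces both with a single analysis.

```python
import random
from statistics import mean

random.seed(1)

# Simulated reaction times with random subject and item intercepts
# plus a fixed condition effect (all magnitudes invented).
subjects, items = 8, 12
subj_fx = [random.gauss(0, 30) for _ in range(subjects)]
item_fx = [random.gauss(0, 20) for _ in range(items)]
cond_fx = {"easy": 0.0, "hard": 25.0}

data = []  # (subject, item, condition, rt)
for s in range(subjects):
    for i in range(items):
        cond = "easy" if i < items // 2 else "hard"
        rt = 500 + cond_fx[cond] + subj_fx[s] + item_fx[i] + random.gauss(0, 15)
        data.append((s, i, cond, rt))

# F1-style aggregation: average over items within subject x condition.
f1_cells = {(s, c): mean(rt for s2, _, c2, rt in data if s2 == s and c2 == c)
            for s in range(subjects) for c in cond_fx}
# F2-style aggregation: average over subjects within each item.
f2_cells = {i: mean(rt for _, i2, _, rt in data if i2 == i)
            for i in range(items)}

effect_f1 = (mean(f1_cells[s, "hard"] for s in range(subjects)) -
             mean(f1_cells[s, "easy"] for s in range(subjects)))
```

    Each aggregation discards one source of random variation, which is why the two separate F tests are unsatisfactory; a mixed model (e.g. via R's lme4 or statsmodels' MixedLM) estimates both variance components on the raw rows of `data` at once.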

  8. Numerical solutions of ideal quantum gas dynamical flows governed by semiclassical ellipsoidal-statistical distribution.

    PubMed

    Yang, Jaw-Yen; Yan, Chih-Yuan; Diaz, Manuel; Huang, Juan-Chen; Li, Zhihui; Zhang, Hanxin

    2014-01-08

    The ideal quantum gas dynamics as manifested by the semiclassical ellipsoidal-statistical (ES) equilibrium distribution derived in Wu et al. (Wu et al. 2012 Proc. R. Soc. A 468, 1799-1823 (doi:10.1098/rspa.2011.0673)) is numerically studied for particles of all three statistics. This anisotropic ES equilibrium distribution was derived using the maximum entropy principle and conserves the mass, momentum and energy, but differs from the standard Fermi-Dirac or Bose-Einstein distribution. The present numerical method combines the discrete velocity (or momentum) ordinate method in momentum space and the high-resolution shock-capturing method in physical space. A decoding procedure to obtain the necessary parameters for determining the ES distribution is also devised. Computations of two-dimensional Riemann problems are presented, and various contours of the quantities unique to this ES model are illustrated. The main flow features, such as shock waves, expansion waves and slip lines and their complex nonlinear interactions, are depicted and found to be consistent with existing calculations for a classical gas.

  9. Change in quality of malnutrition surveys between 1986 and 2015.

    PubMed

    Grellety, Emmanuel; Golden, Michael H

    2018-01-01

    Representative surveys collecting weight, height, and MUAC are used to estimate the prevalence of acute malnutrition. The results are then used to assess the scale of malnutrition in a population and the type of nutritional intervention required. There have been changes in methodology over recent decades; the objective of this study was to determine whether these have resulted in higher-quality surveys. In order to examine the change in reliability of such surveys, we analysed the statistical distributions of the derived anthropometric parameters from 1843 surveys conducted by 19 agencies between 1986 and 2015. With the introduction of standardised guidelines and software by 2003, and their more general application from 2007, the mean standard deviation, kurtosis, and skewness of the parameters used to assess nutritional status have each moved to approximate the distribution of the WHO standards when the exclusion of outliers is based upon the SMART flagging procedure. Where WHO flags, which only exclude data incompatible with life, are used, the quality of anthropometric surveys has improved and the results now approach those seen with SMART flags and the WHO standards distribution. Agencies vary in their uptake of and adherence to standard guidelines. Those agencies that fully implement the guidelines achieve the most consistently reliable results. Standard methods should be used universally to produce reliable data, and tests of data quality and SMART-type flagging procedures should be applied and reported to ensure that the data are credible and therefore inform appropriate intervention. Use of SMART guidelines has coincided with reliable anthropometric data since 2007.
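
    The two flagging rules can be sketched as follows. The z-scores are invented; SMART flags drop values more than 3 z-scores from the survey's own mean, while a fixed window of -5 to 5 is used here as the WHO-style plausibility limit for weight-for-height (an assumption of this sketch, stated for illustration).

```python
from statistics import mean, stdev

# Hypothetical weight-for-height z-scores from one survey.
whz = [-4.9, -2.1, -1.5, -1.2, -0.8, -0.5, -0.3, 0.0, 0.2, 0.6, 1.1, 6.2]

# WHO-style flags: fixed window excluding only values incompatible
# with life (assumed here to be outside [-5, 5]).
who_kept = [z for z in whz if -5 <= z <= 5]

# SMART flags: exclude values more than 3 z-scores from the survey mean.
m = mean(whz)
smart_kept = [z for z in whz if m - 3 <= z <= m + 3]

# A standard deviation near 1 (the WHO standards distribution) after
# flagging is one indicator of survey quality.
sd_after_smart = stdev(smart_kept)
```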

  10. Detection of Anomalies in Hydrometric Data Using Artificial Intelligence Techniques

    NASA Astrophysics Data System (ADS)

    Lauzon, N.; Lence, B. J.

    2002-12-01

    This work focuses on the detection of anomalies in hydrometric data sequences, such as 1) outliers, which are individual data having statistical properties that differ from those of the overall population; 2) shifts, which are sudden changes over time in the statistical properties of the historical records of data; and 3) trends, which are systematic changes over time in the statistical properties. For the purpose of the design and management of water resources systems, it is important to be aware of these anomalies in hydrometric data, for they can induce a bias in the estimation of water quantity and quality parameters. These anomalies may be viewed as specific patterns affecting the data, and therefore pattern recognition techniques can be used for identifying them. However, the number of possible patterns is very large for each type of anomaly and consequently large computing capacities are required to account for all possibilities using the standard statistical techniques, such as cluster analysis. Artificial intelligence techniques, such as the Kohonen neural network and fuzzy c-means, are clustering techniques commonly used for pattern recognition in several areas of engineering and have recently begun to be used for the analysis of natural systems. They require much less computing capacity than the standard statistical techniques, and therefore are well suited for the identification of outliers, shifts and trends in hydrometric data. This work constitutes a preliminary study, using synthetic data representing hydrometric data that can be found in Canada. The analysis of the results obtained shows that the Kohonen neural network and fuzzy c-means are reasonably successful in identifying anomalies. This work also addresses the problem of uncertainties inherent to the calibration procedures that fit the clusters to the possible patterns for both the Kohonen neural network and fuzzy c-means. 
Indeed, for the same database, different sets of clusters can be established with these calibration procedures. A simple method for analyzing uncertainties associated with the Kohonen neural network and fuzzy c-means is developed here. The method combines the results from several sets of clusters, either from the Kohonen neural network or fuzzy c-means, so as to provide an overall diagnosis as to the identification of outliers, shifts and trends. The results indicate an improvement in the performance for identifying anomalies when the method of combining cluster sets is used, compared with when only one cluster set is used.
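
    A minimal one-dimensional fuzzy c-means sketch (not the authors' implementation) shows how cluster membership over time can expose a shift in a flow series; the synthetic series below is invented for illustration.

```python
import random

random.seed(0)

# Synthetic series with an abrupt shift in mean at index 30.
series = [10 + random.gauss(0, 0.5) for _ in range(30)] + \
         [14 + random.gauss(0, 0.5) for _ in range(30)]

c, m = 2, 2.0                       # number of clusters, fuzziness exponent
centers = [min(series), max(series)]

for _ in range(50):
    # Standard FCM membership update: u_ij depends on relative distances.
    u = []
    for x in series:
        d = [abs(x - v) + 1e-12 for v in centers]
        u.append([1.0 / sum((d[j] / d[k]) ** (2 / (m - 1)) for k in range(c))
                  for j in range(c)])
    # Center update: membership-weighted means.
    centers = [sum(u[i][j] ** m * x for i, x in enumerate(series)) /
               sum(u[i][j] ** m for i in range(len(series)))
               for j in range(c)]

labels = [max(range(c), key=lambda j: u[i][j]) for i in range(len(series))]
shift_point = labels.index(labels[-1])  # first index in the later regime
```

    Running several such cluster fits from different initialisations and combining their diagnoses is, in spirit, the uncertainty-handling method the abstract describes.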

  11. Monitoring Items in Real Time to Enhance CAT Security

    ERIC Educational Resources Information Center

    Zhang, Jinming; Li, Jie

    2016-01-01

    An IRT-based sequential procedure is developed to monitor items for enhancing test security. The procedure uses a series of statistical hypothesis tests to examine whether the statistical characteristics of each item under inspection have changed significantly during CAT administration. This procedure is compared with a previously developed…

  12. Fuzzy Adaptive Control for Intelligent Autonomous Space Exploration Problems

    NASA Technical Reports Server (NTRS)

    Esogbue, Augustine O.

    1998-01-01

    The principal objective of the research reported here is the re-design, analysis and optimization of our newly developed neural network fuzzy adaptive controller model for complex processes capable of learning fuzzy control rules using process data and improving its control through on-line adaption. The learned improvement is according to a performance objective function that provides evaluative feedback; this performance objective is broadly defined to meet long-range goals over time. Although fuzzy control had proven effective for complex, nonlinear, imprecisely-defined processes for which standard models and controls are either inefficient, impractical or cannot be derived, the state of the art prior to our work showed that procedures for deriving fuzzy control, however, were mostly ad hoc heuristics. The learning ability of neural networks was exploited to systematically derive fuzzy control and permit on-line adaption and in the process optimize control. The operation of neural networks integrates very naturally with fuzzy logic. The neural networks which were designed and tested using simulation software and simulated data, followed by realistic industrial data were reconfigured for application on several platforms as well as for the employment of improved algorithms. The statistical procedures of the learning process were investigated and evaluated with standard statistical procedures (such as ANOVA, graphical analysis of residuals, etc.). The computational advantage of dynamic programming-like methods of optimal control was used to permit on-line fuzzy adaptive control. Tests for the consistency, completeness and interaction of the control rules were applied. Comparisons to other methods and controllers were made so as to identify the major advantages of the resulting controller model. Several specific modifications and extensions were made to the original controller. Additional modifications and explorations have been proposed for further study. 
Some of these are in progress in our laboratory while others await additional support. All of these enhancements will improve the attractiveness of the controller as an effective tool for the on-line control of an array of complex process environments.
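
    The rule-based fuzzy control the abstract describes can be illustrated with a minimal sketch; everything below (the triangular membership functions, the three-rule base, and the action values) is a hypothetical toy example, not the authors' neural-network-derived controller:

```python
def tri(x, a, b, c):
    """Triangular membership function: rises from a to b, falls from b to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_control(error):
    # hypothetical rule base: (membership triangle, corrective action)
    rules = [((-2.0, -1.0, 0.0), -1.0),   # error Negative -> action -1
             ((-1.0,  0.0, 1.0),  0.0),   # error Zero     -> action  0
             (( 0.0,  1.0, 2.0),  1.0)]   # error Positive -> action +1
    num = den = 0.0
    for (a, b, c), action in rules:
        w = tri(error, a, b, c)           # rule firing strength
        num += w * action
        den += w
    # weighted-average (centroid-style) defuzzification
    return num / den if den else 0.0
```

Zero error yields zero action, and intermediate errors blend adjacent rules; a learning scheme like the one described would tune the triangle parameters and action values from process data rather than fixing them by hand.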

  13. Development of parallel line analysis criteria for recombinant adenovirus potency assay and definition of a unit of potency.

    PubMed

    Ogawa, Yasushi; Fawaz, Farah; Reyes, Candice; Lai, Julie; Pungor, Erno

    2007-01-01

    Parameter settings of a parallel line analysis procedure were defined by applying statistical analysis procedures to the absorbance data from a cell-based potency bioassay for a recombinant adenovirus, Adenovirus 5 Fibroblast Growth Factor-4 (Ad5FGF-4). The parallel line analysis was performed with commercially available software, PLA 1.2. The software performs a Dixon outlier test on replicates of the absorbance data, performs linear regression analysis to define the linear region of the absorbance data, and tests parallelism between the linear regions of the standard and the sample. The width of the fiducial limit, expressed as a percent of the measured potency, was developed as a criterion for rejection of assay data and significantly improved the reliability of the assay results. With the linear range-finding criteria of the software set to a minimum of 5 consecutive dilutions and best statistical outcome, combined with a fiducial limit width acceptance criterion of <135%, 13% of the assay results were rejected. With these criteria applied, the assay was found to be linear over the range of 0.25 to 4 relative potency units, defined as the potency of the sample normalized to the potency of an Ad5FGF-4 standard containing 6 x 10(6) adenovirus particles/mL. The overall precision of the assay was estimated to be 52%. Without the fiducial limit width criterion, the assay results were not linear over this range, and an overall precision of 76% was calculated from the data. An absolute unit of potency for the assay was defined by using the parallel line analysis procedure as the amount of Ad5FGF-4 that produces an absorbance value that is 121% of the average absorbance of wells containing cells not infected with the adenovirus.
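
    The core parallel-line potency calculation can be sketched under simple assumptions; the dilution/absorbance numbers and the `fit_line` helper are hypothetical, and this omits the Dixon outlier test, linear range finding, and fiducial-limit steps that PLA 1.2 performs:

```python
def fit_line(x, y):
    """Ordinary least-squares fit; returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

log_dose = [0.0, 0.3, 0.6, 0.9, 1.2]         # hypothetical log10 dilutions
std      = [0.20, 0.45, 0.70, 0.95, 1.20]    # absorbances, standard
sample   = [0.35, 0.60, 0.85, 1.10, 1.35]    # absorbances, test sample

a_s, b_s = fit_line(log_dose, std)
a_t, b_t = fit_line(log_dose, sample)
assert abs(b_s - b_t) < 0.1 * abs(b_s)       # crude parallelism check
b = (b_s + b_t) / 2                          # pooled slope
log_rp = (a_t - a_s) / b                     # horizontal shift between lines
relative_potency = 10 ** log_rp              # sample potency vs. standard
```

The key idea is that parallel lines differ only by a horizontal shift on the log-dose axis, and that shift is the log relative potency; a full implementation would also attach a confidence (fiducial) interval to `relative_potency`.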

  14. 40 CFR 63.1549 - Recordkeeping and reporting requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... recordkeeping required as part of the practices described in the standard operating procedures manual for... as part of the practices described in the standard operating procedures manual for baghouses required... procedures outlined in the standard operating procedures manual required by § 63.1544(a) were not followed...

  15. 40 CFR 63.1549 - Recordkeeping and reporting requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... recordkeeping required as part of the practices described in the standard operating procedures manual for... as part of the practices described in the standard operating procedures manual for baghouses required... procedures outlined in the standard operating procedures manual required by § 63.1544(a) were not followed...

  16. 40 CFR 63.1549 - Recordkeeping and reporting requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... recordkeeping required as part of the practices described in the standard operating procedures manual for... as part of the practices described in the standard operating procedures manual for baghouses required... procedures outlined in the standard operating procedures manual required by § 63.1544(a) were not followed...

  17. 40 CFR 75.32 - Determination of monitor data availability for standard missing data procedures.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... availability for standard missing data procedures. 75.32 Section 75.32 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTINUOUS EMISSION MONITORING Missing Data Substitution Procedures § 75.32 Determination of monitor data availability for standard missing data procedures...

  18. 40 CFR 75.32 - Determination of monitor data availability for standard missing data procedures.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... availability for standard missing data procedures. 75.32 Section 75.32 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTINUOUS EMISSION MONITORING Missing Data Substitution Procedures § 75.32 Determination of monitor data availability for standard missing data procedures...

  19. 40 CFR 75.32 - Determination of monitor data availability for standard missing data procedures.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... availability for standard missing data procedures. 75.32 Section 75.32 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTINUOUS EMISSION MONITORING Missing Data Substitution Procedures § 75.32 Determination of monitor data availability for standard missing data procedures...

  20. 40 CFR 75.32 - Determination of monitor data availability for standard missing data procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... availability for standard missing data procedures. 75.32 Section 75.32 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTINUOUS EMISSION MONITORING Missing Data Substitution Procedures § 75.32 Determination of monitor data availability for standard missing data procedures...

  1. 40 CFR 75.32 - Determination of monitor data availability for standard missing data procedures.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... availability for standard missing data procedures. 75.32 Section 75.32 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTINUOUS EMISSION MONITORING Missing Data Substitution Procedures § 75.32 Determination of monitor data availability for standard missing data procedures...

  2. Guenter Tulip Filter Retrieval Experience: Predictors of Successful Retrieval

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turba, Ulku Cenk, E-mail: uct5d@virginia.edu; Arslan, Bulent, E-mail: ba6e@virginia.edu; Meuse, Michael, E-mail: mm5tz@virginia.edu

    We report our experience with Guenter Tulip filter placement indications, retrievals, and procedural problems, with emphasis on alternative retrieval techniques. We identified 92 consecutive patients in whom a Guenter Tulip filter was placed and filter removal was attempted. We recorded patient demographic information, filter placement and retrieval indications, procedures, standard and nonstandard filter retrieval techniques, complications, and clinical outcomes. The mean time to retrieval for those who experienced filter strut penetration was statistically significant [F(1,90) = 8.55, p = 0.004]. Filter strut(s) IVC penetration and successful retrieval were found to be significantly associated (p = 0.043). The filter hook-IVC relationship correlated with successful retrieval. A modified guidewire loop technique was applied in 8 of 10 cases where the hook appeared to penetrate the IVC wall and could not be engaged with a loop snare catheter, providing additional technical success in 6 of 8 (75%). Therefore, the total filter retrieval success increased from 88 to 95%. In conclusion, the Guenter Tulip filter has a high retrieval success rate with a low rate of complications. Additional maneuvers such as the guidewire loop method can be used to improve retrieval success rates when the filter hook is endothelialized.

  3. Assessing the fit of site-occupancy models

    USGS Publications Warehouse

    MacKenzie, D.I.; Bailey, L.L.

    2004-01-01

    Few species are likely to be so evident that they will always be detected at a site when present. Recently a model has been developed that enables estimation of the proportion of area occupied when the target species is not detected with certainty. Here we apply this modeling approach to data collected on terrestrial salamanders in the Plethodon glutinosus complex in the Great Smoky Mountains National Park, USA, and address the question: how accurately does the fitted model represent the data? The goodness-of-fit of the model needs to be assessed in order to make accurate inferences. This article presents a method in which a simple Pearson chi-square statistic is calculated and a parametric bootstrap procedure is used to determine whether the observed statistic is unusually large. We found evidence that the most global model considered provides a poor fit to the data, and hence estimated an overdispersion factor to adjust model selection procedures and inflate standard errors. Two hypothetical datasets with known assumption violations are also analyzed, illustrating that the method may be used to guide researchers toward making appropriate inferences. The results of a simulation study are presented to provide a broader view of the method's properties.
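
    The chi-square/parametric-bootstrap idea generalizes beyond occupancy models. A minimal sketch for a generic multinomial fit follows; the cell probabilities, counts, and replicate number are hypothetical, not the salamander data:

```python
import random
random.seed(1)

def pearson_x2(obs, exp):
    """Pearson chi-square statistic over matched observed/expected cells."""
    return sum((o - e) ** 2 / e for o, e in zip(obs, exp))

probs = [0.5, 0.3, 0.2]          # hypothetical fitted cell probabilities
n = 200                          # sample size
observed = [90, 68, 42]          # hypothetical observed cell counts
expected = [p * n for p in probs]
x2_obs = pearson_x2(observed, expected)

def simulate_counts(probs, n):
    """Draw one dataset of n multinomial trials under the fitted model."""
    counts = [0] * len(probs)
    for _ in range(n):
        u, c = random.random(), 0.0
        for i, p in enumerate(probs):
            c += p
            if u < c:
                counts[i] += 1
                break
    return counts

# bootstrap null distribution of the statistic under the fitted model
boot = [pearson_x2(simulate_counts(probs, n), expected) for _ in range(500)]
p_value = sum(b >= x2_obs for b in boot) / len(boot)
```

A small `p_value` would indicate the observed statistic is unusually large, i.e. lack of fit; the same ratio `x2_obs` divided by the mean of `boot` is one common way to estimate an overdispersion factor.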

  4. A Comparative Study of Microleakage on Dental Surfaces Bonded with Three Self-Etch Adhesive Systems Treated with the Er:YAG Laser and Bur

    PubMed Central

    Sanhadji El Haddar, Youssef; Cetik, Sibel; Bahrami, Babak; Atash, Ramin

    2016-01-01

    Aim. This study sought to compare the microleakage of three adhesive systems in the context of Er:YAG laser and diamond bur cavity preparation. Materials and Methods. Standardized Class V cavities were prepared in 72 extracted human teeth by means of diamond burs or the Er:YAG laser. The samples were randomly divided into six groups of 12, testing three adhesive systems (Clearfil s3 Bond Plus, Xeno® Select, and Futurabond U) for each method used. Cavities were restored with composite resin before thermocycling and dye immersion (methylene blue 2%, 24 h). The slices were prepared using a microtome. Optical microscope photography was employed to measure dye penetration. Results. No statistically significant differences in microleakage were found between bur and laser preparation, nor between adhesive systems. Statistically significant values were observed only when comparing enamel with cervical walls (p < 0.001). Conclusion. It can be concluded that the Er:YAG laser is as efficient as the diamond bur concerning microleakage values in adhesive restoration procedures, thus constituting an alternative tool for tooth preparation. PMID:27419128

  5. Using Statistical Process Control to Make Data-Based Clinical Decisions.

    ERIC Educational Resources Information Center

    Pfadt, Al; Wheeler, Donald J.

    1995-01-01

    Statistical process control (SPC), which employs simple statistical tools and problem-solving techniques such as histograms, control charts, flow charts, and Pareto charts to implement continual product improvement procedures, can be incorporated into human service organizations. Examples illustrate use of SPC procedures to analyze behavioral data…
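
    One SPC tool mentioned above, the control chart, can be sketched as an individuals (XmR) chart with moving-range-based 3-sigma limits; the behavioral counts below are hypothetical:

```python
# hypothetical daily counts of a target behavior
data = [12, 14, 13, 15, 14, 13, 16, 14, 13, 25]

center = sum(data) / len(data)
mr = [abs(b - a) for a, b in zip(data, data[1:])]   # moving ranges
mr_bar = sum(mr) / len(mr)
sigma = mr_bar / 1.128            # d2 bias constant for subgroups of 2
ucl = center + 3 * sigma          # upper control limit
lcl = max(center - 3 * sigma, 0)  # lower control limit (counts >= 0)

# indices of points signaling special-cause variation
signals = [i for i, x in enumerate(data) if x > ucl or x < lcl]
```

Points inside the limits are treated as routine (common-cause) variation; a point outside them, like the final observation here, is a signal worth investigating clinically rather than reacting to every up-and-down wiggle.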

  6. Statistical Cost Estimation in Higher Education: Some Alternatives.

    ERIC Educational Resources Information Center

    Brinkman, Paul T.; Niwa, Shelley

    Recent developments in econometrics that are relevant to the task of estimating costs in higher education are reviewed. The relative effectiveness of alternative statistical procedures for estimating costs is also tested. Statistical cost estimation involves three basic parts: a model, a data set, and an estimation procedure. Actual data are used…

  7. Prevalidation in pharmaceutical analysis. Part I. Fundamentals and critical discussion.

    PubMed

    Grdinić, Vladimir; Vuković, Jadranka

    2004-05-28

    A complete prevalidation, as a basic prevalidation strategy for quality control and standardization of analytical procedures, was inaugurated. The fast and simple prevalidation methodology, based on mathematical/statistical evaluation of a reduced number of experiments (N < or = 24), was elaborated, and guidelines as well as algorithms were given in detail. This strategy was produced for pharmaceutical applications and is dedicated to the preliminary evaluation of analytical methods where a linear calibration model, which very often occurs in practice, could be the most appropriate to fit experimental data. The requirements presented in this paper should therefore help the analyst to design and perform the minimum number of prevalidation experiments needed to obtain all the information required to evaluate and demonstrate the reliability of the analytical procedure. The complete prevalidation process included characterization of analytical groups, checking of two limiting groups, testing of data homogeneity, establishment of analytical functions, recognition of outliers, evaluation of limiting values, and extraction of prevalidation parameters. Moreover, a system of diagnosis for each particular prevalidation step was suggested. As an illustrative example of the feasibility of the prevalidation methodology, a Vis-spectrophotometric procedure for the determination of tannins with Folin-Ciocalteu's phenol reagent was selected from a great number of analytical procedures. The favourable metrological characteristics of this analytical procedure, as prevalidation figures of merit, confirmed prevalidation as a valuable concept in the preliminary evaluation of the quality of analytical procedures.
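
    One prevalidation step, evaluation of a linear calibration function, can be sketched as follows; the calibration data are hypothetical, and the 3.3·s/b and 10·s/b limits are the common ICH-style convention rather than necessarily the authors' prevalidation formulas:

```python
import math

conc   = [0.0, 2.0, 4.0, 6.0, 8.0, 10.0]           # hypothetical standards, ug/mL
absorb = [0.02, 0.21, 0.40, 0.61, 0.79, 1.00]      # hypothetical Vis absorbances

n = len(conc)
mx, my = sum(conc) / n, sum(absorb) / n
b = sum((x - mx) * (y - my) for x, y in zip(conc, absorb)) / \
    sum((x - mx) ** 2 for x in conc)               # calibration slope
a = my - b * mx                                    # calibration intercept

resid = [y - (a + b * x) for x, y in zip(conc, absorb)]
s_res = math.sqrt(sum(r * r for r in resid) / (n - 2))  # residual SD

lod = 3.3 * s_res / b    # detection limit estimate
loq = 10.0 * s_res / b   # quantification limit estimate
```

With only six points this is the kind of reduced-experiment evaluation the abstract describes: the residual standard deviation both screens the linear model for adequacy and propagates directly into the limiting values.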

  8. Evaluation of a new very low dose imaging protocol: feasibility and impact on X-ray dose levels in electrophysiology procedures.

    PubMed

    Bourier, Felix; Reents, Tilko; Ammar-Busch, Sonia; Buiatti, Alessandra; Kottmaier, Marc; Semmler, Verena; Telishevska, Marta; Brkic, Amir; Grebmer, Christian; Lennerz, Carsten; Kolb, Christof; Hessling, Gabriele; Deisenhofer, Isabel

    2016-09-01

    This study presents and evaluates the impact on X-ray dose levels of a new lowest-dose fluoroscopy protocol (Siemens AG) especially designed for electrophysiology (EP) procedures. From October 2014 to March 2015, 140 patients underwent an EP study on an Artis zee angiography system. The standard low-dose protocol was operated at 23 nGy (fluoroscopy) and 120 nGy (cine-loop); the new lowest-dose protocol was operated at 8 nGy (fluoroscopy) and 36 nGy (cine-loop). Procedural data, X-ray times, and doses were analysed in 100 complex left atrial and 40 standard EP procedures. The resulting dose-area products were 877.9 ± 624.7 µGym² (n = 50 complex procedures, standard low dose), 199 ± 159.6 µGym² (n = 50 complex procedures, lowest dose), 387.7 ± 36.0 µGym² (n = 20 standard procedures, standard low dose), and 90.7 ± 62.3 µGym² (n = 20 standard procedures, lowest dose), P < 0.01. In the low-dose and lowest-dose groups, procedure times were 132.6 ± 35.7 vs. 126.7 ± 34.7 min (P = 0.40, complex procedures) and 72.3 ± 20.9 vs. 85.2 ± 44.1 min (P = 0.24, standard procedures); radiofrequency (RF) times were 53.8 ± 26.1 vs. 50.4 ± 29.4 min (P = 0.54, complex procedures) and 10.1 ± 9.9 vs. 12.2 ± 14.7 min (P = 0.60, standard procedures). One complication occurred in each of the standard low-dose and lowest-dose groups (P = 1.0). The new lowest-dose imaging protocol reduces X-ray dose levels by 77% compared with the currently available standard low-dose protocol. From an operator standpoint, the lowest X-ray dose levels produce a different, reduced image quality. The new image quality did not significantly affect procedure or RF times and did not result in higher complication rates. Regarding radiological protection, operating at lowest-dose settings should become standard in EP procedures. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2015. For permissions please email: journals.permissions@oup.com.

  9. Quality Assurance in Breast Health Care and Requirement for Accreditation in Specialized Units

    PubMed Central

    Güler, Sertaç Ata; Güllüoğlu, Bahadır M.

    2014-01-01

    Breast health is a subject of increasing importance. The statistical increase in the frequency of breast cancer and the consequent increase in death rate increase the importance of quality of services to be provided for breast health. For these reasons, the minimum standards and optimum quality metrics of breast care provided to the community are determined. The quality parameters for breast care service include the results, the structure and the operation of services. Within this group, the results of breast health services are determined according to clinical results, patient satisfaction and financial condition. The structure of quality services should include interdisciplinary meetings, written standards for specific procedures and the existence of standardized reporting systems. Establishing breast centers that adopt integrated multidisciplinary working principles and their cost-effective maintenance are important in terms of operation of breast health services. The importance of using a “reviewing/auditing” procedure that checks if all of these functions existing in the health system are carried out at the desired level and an “accreditation” system indicating that the working breast units/centers provide minimum quality adequacy in all aspects, is undeniable. Currently, the accreditation system for breast centers is being used in the European Union and the United States for the last 5–10 years. This system is thought to provide standardization in breast care services, and is accepted as one of the important factors that resulted in reduction in mortality associated with breast cancer. PMID:28331658

  10. Quality Assurance in Breast Health Care and Requirement for Accreditation in Specialized Units.

    PubMed

    Güler, Sertaç Ata; Güllüoğlu, Bahadır M

    2014-07-01

    Breast health is a subject of increasing importance. The statistical increase in the frequency of breast cancer and the consequent increase in death rate increase the importance of quality of services to be provided for breast health. For these reasons, the minimum standards and optimum quality metrics of breast care provided to the community are determined. The quality parameters for breast care service include the results, the structure and the operation of services. Within this group, the results of breast health services are determined according to clinical results, patient satisfaction and financial condition. The structure of quality services should include interdisciplinary meetings, written standards for specific procedures and the existence of standardized reporting systems. Establishing breast centers that adopt integrated multidisciplinary working principles and their cost-effective maintenance are important in terms of operation of breast health services. The importance of using a "reviewing/auditing" procedure that checks if all of these functions existing in the health system are carried out at the desired level and an "accreditation" system indicating that the working breast units/centers provide minimum quality adequacy in all aspects, is undeniable. Currently, the accreditation system for breast centers is being used in the European Union and the United States for the last 5-10 years. This system is thought to provide standardization in breast care services, and is accepted as one of the important factors that resulted in reduction in mortality associated with breast cancer.

  11. Biomechanical Comparison of an Open vs Arthroscopic Approach for Lateral Ankle Instability.

    PubMed

    Drakos, Mark C; Behrens, Steve B; Paller, Dave; Murphy, Conor; DiGiovanni, Christopher W

    2014-08-01

    The current clinical standard for the surgical treatment of ankle instability remains the open modified Broström procedure. Modern advents in arthroscopic technology have allowed physicians to perform certain foot and ankle procedures arthroscopically as opposed to traditional open approaches. Twenty matched lower extremity cadaver specimens were obtained. Steinmann pins were inserted into the tibia and talus with 6 sensors affixed to each pin. Specimens were placed in a Telos ankle stress apparatus in an anteroposterior and then lateral position, while a 1.7 N-m load was applied. For each of these tests, movement of the sensors was measured in 3 planes using the Optotrak Computer Navigation System. Changes in position were calculated and compared with the unloaded state. The anterior talofibular ligament and the calcaneofibular ligament were thereafter sectioned from the fibula. The aforementioned measurements in the loaded and unloaded states were repeated on the specimens. The sectioned ligaments were then repaired using 2 corkscrew anchors. Ten specimens were repaired using a standard open Broström-type repair, while the matched pairs were repaired using an arthroscopic technique. Measurements were repeated and compared using a paired t test. There was a statistically significant difference between the sectioned state and the other 3 states (P < .05). There were no statistically significant differences between the intact state and either the open or arthroscopic state (P > .05). There were no significant differences between the open and arthroscopic repairs with respect to translation and total combined motion during the talar tilt test (P > .05). Statistically significant differences were demonstrated between the 2 methods in 3 specific axes of movement during talar tilt (P = .04). Biomechanically effective ankle stabilization may be amenable to a minimally invasive approach.
A minimally invasive, arthroscopic approach can be considered for treating patients with lateral ankle instability who have failed conservative treatment. © The Author(s) 2014.

  12. Total Ossicular Replacement Prosthesis: A New Fat Interposition Technique.

    PubMed

    Saliba, Issam; Sabbah, Valérie; Poirier, Jackie Bibeau

    2018-01-01

    To compare audiometric results between the standard total ossicular replacement prosthesis (TORP-S) and a new fat interposition total ossicular replacement prosthesis (TORP-F) in pediatric and adult patients, and to assess complications and undesirable outcomes. This is a retrospective study. This study included 104 patients who had undergone titanium implants with TORP-F and 54 patients who had undergone the procedure with TORP-S between 2008 and 2013 in our tertiary care centers. The new technique consists of interposing a fat graft between the 4 legs of the universal titanium prosthesis (Medtronic Xomed Inc, Jacksonville, FL, USA) to provide a more stable TORP in the oval window niche. Normally, this prosthesis is designed to fit on the stapes' head as a partial ossicular replacement prosthesis. The postoperative air-bone gap less than 25 dB for the combined cohort was 69.2% and 41.7% for the TORP-F and the TORP-S groups, respectively. The mean follow-up was 17 months postoperatively. By stratifying data, the pediatric cohort shows 56.5% in the TORP-F group (n = 52) compared with 40% in the TORP-S group (n = 29). However, the adult cohort shows 79.3% in the TORP-F group (n = 52) compared with 43.75% in the TORP-S group (n = 25). These improvements in hearing were statistically significant. There were no statistically significant differences in the speech discrimination scores. The only undesirable outcome that was statistically different between the 2 groups was the prosthesis displacement: 7% in the TORP-F group compared with 19% in the TORP-S group (P = .03). The interposition of a fat graft between the legs of the titanium implants (TORP-F) provides superior hearing results compared with a standard procedure (TORP-S) in pediatric and adult populations because of its better stability in the oval window niche.

  13. Total Ossicular Replacement Prosthesis: A New Fat Interposition Technique

    PubMed Central

    Saliba, Issam; Sabbah, Valérie; Poirier, Jackie Bibeau

    2018-01-01

    Objective: To compare audiometric results between the standard total ossicular replacement prosthesis (TORP-S) and a new fat interposition total ossicular replacement prosthesis (TORP-F) in pediatric and adult patients, and to assess complications and undesirable outcomes. Study design: This is a retrospective study. Methods: This study included 104 patients who had undergone titanium implants with TORP-F and 54 patients who had undergone the procedure with TORP-S between 2008 and 2013 in our tertiary care centers. The new technique consists of interposing a fat graft between the 4 legs of the universal titanium prosthesis (Medtronic Xomed Inc, Jacksonville, FL, USA) to provide a more stable TORP in the oval window niche. Normally, this prosthesis is designed to fit on the stapes’ head as a partial ossicular replacement prosthesis. Results: The postoperative air-bone gap less than 25 dB for the combined cohort was 69.2% and 41.7% for the TORP-F and the TORP-S groups, respectively. The mean follow-up was 17 months postoperatively. By stratifying data, the pediatric cohort shows 56.5% in the TORP-F group (n = 52) compared with 40% in the TORP-S group (n = 29). However, the adult cohort shows 79.3% in the TORP-F group (n = 52) compared with 43.75% in the TORP-S group (n = 25). These improvements in hearing were statistically significant. There were no statistically significant differences in the speech discrimination scores. The only undesirable outcome that was statistically different between the 2 groups was the prosthesis displacement: 7% in the TORP-F group compared with 19% in the TORP-S group (P = .03). Conclusions: The interposition of a fat graft between the legs of the titanium implants (TORP-F) provides superior hearing results compared with a standard procedure (TORP-S) in pediatric and adult populations because of its better stability in the oval window niche. PMID:29326537

  14. Development of partial life-cycle experiments to assess the effects of endocrine disruptors on the freshwater gastropod Lymnaea stagnalis: a case-study with vinclozolin.

    PubMed

    Ducrot, Virginie; Teixeira-Alves, Mickaël; Lopes, Christelle; Delignette-Muller, Marie-Laure; Charles, Sandrine; Lagadic, Laurent

    2010-10-01

    Long-term effects of endocrine disruptors (EDs) on aquatic invertebrates remain difficult to assess, mainly due to the lack of appropriate sensitive toxicity test methods and relevant data analysis procedures. This study aimed at identifying windows of sensitivity to EDs along the life-cycle of the freshwater snail Lymnaea stagnalis, a candidate species for the development of forthcoming test guidelines. Juveniles, sub-adults, young adults and adults were exposed for 21 days to the fungicide vinclozolin (VZ). Survival, growth, onset of reproduction, fertility and fecundity were monitored weekly. Data were analyzed using standard statistical analysis procedures and mixed-effect models. No deleterious effect on survival and growth occurred in snails exposed to VZ at environmentally relevant concentrations. A significant impairment of the male function occurred in young adults, leading to infertility at concentrations exceeding 0.025 μg/L. Furthermore, fecundity was impaired in adults exposed to concentrations exceeding 25 μg/L. Biological responses depended on VZ concentration, exposure duration and on their interaction, leading to complex response patterns. The use of a standard statistical approach to analyze those data led to underestimation of VZ effects on reproduction, whereas effects could reliably be analyzed by mixed-effect models. L. stagnalis may be among the most sensitive invertebrate species to VZ, a 21-day reproduction test allowing the detection of deleterious effects at environmentally relevant concentrations of the fungicide. These results thus reinforce the relevance of L. stagnalis as a good candidate species for the development of guidelines devoted to the risk assessment of EDs.
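
    Why a standard (pooled) analysis can misstate effects when observations are nested, the problem the mixed-effect models address, can be illustrated with a toy example; the tank structure and fecundity counts below are hypothetical, not the L. stagnalis design:

```python
import math

# hypothetical fecundity counts: 3 tanks per treatment, 4 snails per tank
control = [[52, 50, 53, 51], [47, 46, 48, 45], [55, 54, 56, 53]]
treated = [[44, 43, 45, 42], [49, 48, 50, 47], [40, 39, 41, 38]]

def pooled_se(groups):
    """SE treating every snail as independent (ignores tank nesting)."""
    xs = [x for g in groups for x in g]
    m = sum(xs) / len(xs)
    var = sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    return math.sqrt(var / len(xs))

def tank_level_se(groups):
    """SE based on tank means, the truly independent replicates."""
    means = [sum(g) / len(g) for g in groups]
    m = sum(means) / len(means)
    var = sum((x - m) ** 2 for x in means) / (len(means) - 1)
    return math.sqrt(var / len(means))

naive  = (pooled_se(control), pooled_se(treated))
nested = (tank_level_se(control), tank_level_se(treated))
```

In this toy data the pooled standard error is markedly smaller than the tank-level one, so a naive analysis overstates confidence; a mixed-effect model resolves this properly by estimating the tank-level (random-effect) variance alongside the treatment effect.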

  15. Accounting for measurement reliability to improve the quality of inference in dental microhardness research: a worked example.

    PubMed

    Sever, Ivan; Klaric, Eva; Tarle, Zrinka

    2016-07-01

    Dental microhardness experiments are influenced by unobserved factors related to varying tooth characteristics that affect measurement reproducibility. This paper explores appropriate analytical tools for modeling different sources of unobserved variability to reduce the biases encountered and increase the validity of microhardness studies. The enamel microhardness of human third molars was measured with a Vickers diamond. The effects of five bleaching agents (10, 16, and 30% carbamide peroxide, and 25 and 38% hydrogen peroxide) were examined, as well as the effect of artificial saliva and amorphous calcium phosphate. To account for both between- and within-tooth heterogeneity in evaluating treatment effects, the statistical analysis was performed in the mixed-effects framework, which also included an appropriate weighting procedure to adjust for confounding. The results were compared to those of the standard ANOVA model usually applied. The weighted mixed-effects model produced parameter estimates of different magnitude and significance than the standard ANOVA model. The results of the former model were more intuitive, with more precise estimates and better fit. Confounding could seriously bias the study outcomes, highlighting the need for more robust statistical procedures in dental research that account for measurement reliability. The presented framework is more flexible and informative than existing analytical techniques and may improve the quality of inference in dental research. Reported results could be misleading if the underlying heterogeneity of microhardness measurements is not taken into account. Confidence in treatment outcomes could be increased by applying the framework presented.
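
    The between- vs within-tooth heterogeneity that the mixed-effects framework models can be illustrated with a hand-computed one-way random-effects decomposition; the hardness readings below are hypothetical:

```python
# hypothetical Vickers readings: 3 replicate indentations on each of 3 teeth
teeth = {
    "t1": [310, 315, 312],
    "t2": [295, 290, 292],
    "t3": [330, 334, 331],
}
k = len(teeth)                       # number of teeth
n = 3                                # replicates per tooth (balanced design)

grand = sum(sum(v) for v in teeth.values()) / (k * n)
ss_between = n * sum((sum(v) / n - grand) ** 2 for v in teeth.values())
ss_within = sum((x - sum(v) / n) ** 2 for v in teeth.values() for x in v)

ms_between = ss_between / (k - 1)
ms_within = ss_within / (k * (n - 1))

var_within = ms_within                                 # measurement noise
var_between = max((ms_between - ms_within) / n, 0.0)   # tooth-to-tooth variance
icc = var_between / (var_between + var_within)         # intraclass correlation
```

A high intraclass correlation, as in this toy data, means most variability sits between teeth rather than between replicate readings; an ANOVA that lumps both sources together misattributes that tooth-level variance and distorts standard errors, which is the bias the mixed-effects approach corrects.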

  16. MiDAS I (mild Decompression Alternative to Open Surgery): a preliminary report of a prospective, multi-center clinical study.

    PubMed

    Chopko, Bohdan; Caraway, David L

    2010-01-01

    Neurogenic claudication due to lumbar spinal stenosis is a common problem that can be caused by many factors including hypertrophic ligamentum flavum, facet hypertrophy, and disc protrusion. When standard medical therapies such as pain medication, epidural steroid injections, and physical therapy fail, or when the patient is unwilling or unable to advance to more invasive surgical procedures, or the symptoms are not severe enough to warrant them, both physicians and patients are often left with a treatment dilemma. Patients in this study were treated with mild, an ultra-minimally invasive lumbar decompression procedure using a dorsal approach. The mild procedure is performed under fluoroscopic imaging to resect bone adjacent to, and achieve partial resection of, the hypertrophic ligamentum flavum with minimal disruption of surrounding muscular and skeletal structure. To assess the clinical application and patient safety and functional outcomes of the mild lumbar decompression procedure in the treatment of symptomatic central canal spinal stenosis. Multi-center, non-blinded, prospective clinical study. Fourteen US spine specialist practices. Between July 2008 and January 2010, 78 patients were enrolled in the MiDAS I Study and treated with the mild procedure for lumbar decompression. Of these patients, 6-week follow-up was available for 75 patients. Visual Analog Scale (VAS), Oswestry Disability Index (ODI), Zurich Claudication Questionnaire (ZCQ), and SF-12v2 Health Survey. Outcomes were assessed at baseline and 6 weeks post-treatment. There were no major device or procedure-related complications reported in this patient cohort. At 6 weeks, the MiDAS I Study showed statistically and clinically significant reduction of pain as measured by VAS, ZCQ, and SF-12v2. In addition, improvement in physical function and mobility as measured by ODI, ZCQ, and SF-12v2 was statistically and clinically significant in this study. This is a preliminary report encompassing 6-week follow-up.
There was no control group. In this 75-patient series, and in keeping with a previously published 90-patient safety cohort, the mild procedure proved to be safe. Further, based on near-term follow-up, the mild procedure demonstrated efficacy in improving mobility and reducing pain associated with lumbar spinal canal stenosis.

  17. NHEXAS PHASE I MARYLAND STUDY--STANDARD OPERATING PROCEDURE FOR PREPARATION OF STANDARD OPERATING PROCEDURES (G01)

    EPA Science Inventory

    The purpose of this SOP is to develop a consistent method and style for all Emory University/Harvard University/Johns Hopkins University standard operating procedures. SOPs are necessary to document all procedures, methods, and techniques used in the NHEXAS investigations. Deve...

  18. 77 FR 32645 - Revision of Performance Standards for State Medicaid Fraud Control Units

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-01

    ... Procedures A Unit establishes written policies and procedures for its operations and ensures that staff are familiar with, and adhere to, policies and procedures. To determine whether a Unit meets this standard, OIG... contain current policies and procedures, consistent with these performance standards, for the...

  19. 40 CFR 75.38 - Standard missing data procedures for Hg CEMS.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false Standard missing data procedures for...) AIR PROGRAMS (CONTINUED) CONTINUOUS EMISSION MONITORING Missing Data Substitution Procedures § 75.38 Standard missing data procedures for Hg CEMS. (a) Once 720 quality assured monitor operating hours of Hg...

  20. 75 FR 19296 - Energy Conservation Program: Test Procedures and Energy Conservation Standards for Residential...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-14

    ... Procedures and Energy Conservation Standards for Residential Furnaces and Boilers AGENCY: Office of Energy... supplemental notice of proposed rulemaking (SNOPR) to amend the test procedures for furnaces and boilers, and... Procedures for Residential Furnaces and Boilers'' or ``NOPM for Energy Conservation Standards for Residential...

  1. 78 FR 47047 - Proposed Policy for Discontinuance of Certain Instrument Approach Procedures

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-02

    ... procedures. Localizer procedures. TACAN procedures. Standard Instrument Arrivals (STARs). Standard Instrument... Policy for Discontinuance of Certain Instrument Approach Procedures AGENCY: Federal Aviation... facilitates the introduction of area navigation (RNAV) instrument approach procedures over the past decade...

  2. 40 CFR 63.543 - What are my standards for process vents?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... available. You must develop and follow standard operating procedures designed to minimize emissions of total... the manufacturer's recommended procedures, if available, and the standard operating procedures... 40 Protection of Environment 10 2013-07-01 2013-07-01 false What are my standards for process...

  3. 40 CFR 63.543 - What are my standards for process vents?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... available. You must develop and follow standard operating procedures designed to minimize emissions of total... the manufacturer's recommended procedures, if available, and the standard operating procedures... 40 Protection of Environment 10 2012-07-01 2012-07-01 false What are my standards for process...

  4. An improved standardization procedure to remove systematic low frequency variability biases in GCM simulations

    NASA Astrophysics Data System (ADS)

    Mehrotra, Rajeshwar; Sharma, Ashish

    2012-12-01

    The quality of the absolute estimates of general circulation models (GCMs) calls into question the direct use of GCM outputs for climate change impact assessment studies, particularly at regional scales. Statistical correction of GCM output is often necessary when significant systematic biases occur between the modeled output and observations. A common procedure is to correct the GCM output by removing the systematic biases in low-order moments relative to observations or to reanalysis data at daily, monthly, or seasonal timescales. In this paper, we present an extension of a recently published nested bias correction (NBC) technique to correct for low- as well as higher-order moment biases in the GCM-derived variables across multiple selected timescales. The proposed recursive nested bias correction (RNBC) approach offers an improved basis for applying bias correction at multiple timescales over the original NBC procedure. The method ensures that the bias-corrected series exhibits improvements that are consistently spread over all of the timescales considered. Different variations of the approach, from the standard NBC to the more complex recursive alternatives, are tested to assess their impacts on a range of GCM-simulated atmospheric variables of interest in downscaling applications related to hydrology and water resources. Results of the study suggest that three- to five-iteration RNBCs are the most effective in removing distributional and persistence-related biases across the timescales considered.
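    The core of any nested bias correction is a per-timescale moment-matching step: standardize the GCM series with its own mean and standard deviation, then rescale with the observed moments. A minimal single-timescale sketch (an illustration only, not the published RNBC, which nests and iterates this step across daily, monthly, and annual scales):

```python
def moment_bias_correct(gcm, obs):
    """Match the mean and standard deviation of a GCM series to
    observations: z-score with the GCM moments, rescale with the
    observed moments."""
    n, m = len(gcm), len(obs)
    mu_g = sum(gcm) / n
    sd_g = (sum((x - mu_g) ** 2 for x in gcm) / n) ** 0.5
    mu_o = sum(obs) / m
    sd_o = (sum((x - mu_o) ** 2 for x in obs) / m) ** 0.5
    return [(x - mu_g) / sd_g * sd_o + mu_o for x in gcm]
```

    Correcting one timescale this way can re-introduce biases at another, which is exactly the inconsistency the recursive nesting in the RNBC is designed to remove.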

  5. Profix cemented total knee replacement: a 5-year outcome review from Lagos, Nigeria.

    PubMed

    Ugbeye, M E; Odunubi, O O; Dim, E M; Ekundayo, O O

    2012-01-01

    Total knee replacement is a standard treatment for severe osteoarthritis of the knees. It is, however, still a novel procedure in Nigeria, and literature on the procedure and its outcomes there is sparse. This study aimed at describing total knee prosthetic replacement as it is practiced in the National Orthopaedic Hospital, Lagos. Data on patients treated with total knee replacement between 2006 and 2010 were analyzed retrospectively. The standard anterior approach, with a medial parapatellar incision under pneumatic tourniquet, was used in all cases. A total of 59 knees in 48 patients were operated on, with a female-to-male ratio of 5:1. Patients were in the sixth to ninth decades of life. There was a statistically significant relationship between duration of symptoms and severity of angular deformity. The average pre-operative Knee Score (KS) was 27 and the average Function Score (FS) was 43. Average duration of surgery was 126.38 minutes. Tourniquet removal after wound closure was associated with reduced intra-operative blood loss (p < 0.05). Post-operative complications included peri-prosthetic fracture (1.69%), post-operative anaemia (8.47%), superficial wound dehiscence (3.39%), and foot drop (3.39%). The mean post-operative KS and FS increased to 80 and 75, respectively. Total knee replacement, though a novel procedure in our institution, is beneficial to patients with severe osteoarthritis. A long-term outcome study is being planned.

  6. Heart Rate Variability Dynamics for the Prognosis of Cardiovascular Risk

    PubMed Central

    Ramirez-Villegas, Juan F.; Lam-Espinosa, Eric; Ramirez-Moreno, David F.; Calvo-Echeverry, Paulo C.; Agredo-Rodriguez, Wilfredo

    2011-01-01

    Statistical, spectral, multi-resolution and non-linear methods were applied to heart rate variability (HRV) series linked with classification schemes for the prognosis of cardiovascular risk. A total of 90 HRV records were analyzed: 45 from healthy subjects and 45 from cardiovascular risk patients. A total of 52 features from all the analysis methods were evaluated using standard two-sample Kolmogorov-Smirnov test (KS-test). The results of the statistical procedure provided input to multi-layer perceptron (MLP) neural networks, radial basis function (RBF) neural networks and support vector machines (SVM) for data classification. These schemes showed high performances with both training and test sets and many combinations of features (with a maximum accuracy of 96.67%). Additionally, there was a strong consideration for breathing frequency as a relevant feature in the HRV analysis. PMID:21386966
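    The feature screening step rests on the two-sample Kolmogorov-Smirnov statistic: the largest vertical gap between the empirical CDFs of the healthy and at-risk groups for a given feature. A minimal pure-Python sketch (a standard statistics package would normally supply this, along with the p-value used for screening):

```python
def ks_statistic(a, b):
    """Two-sample KS statistic: max |ECDF_a(x) - ECDF_b(x)| over all
    observed values x. Larger values mean the two groups separate
    better on this feature."""
    na, nb = len(a), len(b)
    d = 0.0
    for x in set(a) | set(b):
        fa = sum(1 for v in a if v <= x) / na
        fb = sum(1 for v in b if v <= x) / nb
        d = max(d, abs(fa - fb))
    return d
```

    The statistic is 0 for identical samples and 1 for fully separated ones, which makes it a convenient ranking score for the 52 candidate features.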

  7. A New Method for Estimating the Effective Population Size from Allele Frequency Changes

    PubMed Central

    Pollak, Edward

    1983-01-01

    A new procedure is proposed for estimating the effective population size, given that information is available on changes in frequencies of the alleles at one or more independently segregating loci and the population is observed at two or more separate times. Approximate expressions are obtained for the variances of the new statistic, as well as others, also based on allele frequency changes, that have been discussed in the literature. This analysis indicates that the new statistic will generally have a smaller variance than the others. Estimates of effective population sizes and of the standard errors of the estimates are computed for data on two fly populations that have been discussed in earlier papers. In both cases, there is evidence that the effective population size is very much smaller than the minimum census size of the population. PMID:17246147
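    Temporal methods of this family turn the standardized variance F of allele-frequency change into an estimate of effective size, after subtracting the variance contributed by finite sampling. As a sketch, here is the widely used Nei-Tajima form (an assumption for illustration; Pollak's new statistic differs in detail and is defined in the paper itself), with x and y the allele frequencies at generations 0 and t, and s0, st the numbers of individuals sampled at each time:

```python
def temporal_ne(x, y, t, s0, st):
    """Nei-Tajima temporal estimate of effective population size from
    allele frequencies x (generation 0) and y (generation t) at
    independently segregating loci; s0, st are sample sizes in
    individuals."""
    k = len(x)
    # Standardized variance of allele-frequency change, averaged over loci.
    fc = sum((xi - yi) ** 2 / ((xi + yi) / 2 - xi * yi)
             for xi, yi in zip(x, y)) / k
    # Remove the sampling variance; what remains is attributed to drift.
    drift = fc - 1 / (2 * s0) - 1 / (2 * st)
    return t / (2 * drift)
```

    The 1/(2s) terms subtract the variance produced merely by drawing finite samples, so that only genuine drift is charged to the population; with a single locus moving from frequency 0.5 to 0.6 over 10 generations and 50 individuals sampled at each end, the estimate comes out at Ne = 250.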

  8. On Time Performance Pressure

    NASA Technical Reports Server (NTRS)

    Connell, Linda; Wichner, David; Jakey, Abegael

    2013-01-01

    Within many operations, the pressures for on-time performance are high. Each month, on-time statistics are reported to the Department of Transportation and made public. There is a natural tendency for employees under pressure to do their best to meet these objectives. As a result, pressure to get the job done within the allotted time may cause personnel to deviate from procedures and policies. Additionally, inadequate or unavailable resources may drive employees to work around standard processes that are seen as barriers. However, bypassing practices to enable on-time performance may affect more than the statistics. ASRS reports often highlight on-time performance pressures that can affect every workgroup involved in meeting the schedule. Reporters often provide in-depth insights into their experiences, which industry can use to identify and implement systemic fixes.

  9. 40 CFR 63.545 - Standards for fugitive dust sources.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... according to a standard operating procedures manual that describes in detail the measures that will be put... standard operating procedures manual shall be submitted to the Administrator or delegated authority for review and approval. (c) The controls specified in the standard operating procedures manual shall at a...

  10. 40 CFR 63.545 - Standards for fugitive dust sources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... according to a standard operating procedures manual that describes in detail the measures that will be put... standard operating procedures manual shall be submitted to the Administrator or delegated authority for review and approval. (c) The controls specified in the standard operating procedures manual shall at a...

  11. 40 CFR 63.543 - What are my standards for process vents?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... develop and follow standard operating procedures designed to minimize emissions of total hydrocarbon for... manufacturer's recommended procedures, if available, and the standard operating procedures designed to minimize... 40 Protection of Environment 10 2014-07-01 2014-07-01 false What are my standards for process...

  12. Trace element reference values in tissues from inhabitants of the EU. XII. Development of BioReVa program for statistical treatment.

    PubMed

    Iversen, B S; Sabbioni, E; Fortaner, S; Pietra, R; Nicolotti, A

    2003-01-20

    Statistical data treatment is a key point in the assessment of trace element reference values, being the conclusive stage of a comprehensive and organized process of evaluating metal concentrations in human body fluids. The EURO TERVIHT project (Trace Element Reference Values in Human Tissues) was started to evaluate, check, and suggest harmonized procedures for the establishment of trace element reference intervals in body fluids and tissues. Unfortunately, different statistical approaches are being used in this research field, making data comparison difficult and in some cases impossible. Although international organizations such as the International Federation of Clinical Chemistry (IFCC) and the International Union of Pure and Applied Chemistry (IUPAC) have issued recommended guidelines for reference value assessment, including the statistical data treatment, a unique format and a standardized data layout are still missing. The aim of the present study is to present a software application (BioReVa), running under the Microsoft Windows platform, suitable for calculating the reference intervals of trace elements in body matrices. The main aims in creating an easy-to-use application were to check the data distribution, to establish reference intervals according to the accepted recommendations on the basis of simple statistics, to obtain a standard presentation of experimental data, and to have an application into which further needs could be integrated in the future. BioReVa calculates the IFCC reference intervals as well as the coverage intervals recommended by IUPAC as a supplement to the IFCC intervals. Examples of reference values and reference intervals calculated with the BioReVa software concern Pb and Se in blood; Cd, In, and Cr in urine; and Hg and Mo in hair of different general European populations.
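    At its simplest, the IFCC-recommended reference interval is the central 95% of the reference sample. A hedged sketch of the nonparametric version (plain linear-interpolation percentiles; the exact IFCC rank formulas, and the IUPAC coverage intervals that BioReVa also computes, differ in detail at the margins):

```python
def reference_interval(values, lower=2.5, upper=97.5):
    """Nonparametric reference interval: the 2.5th and 97.5th
    percentiles of the reference sample, by linear interpolation
    between order statistics."""
    v = sorted(values)

    def pct(p):
        k = (len(v) - 1) * p / 100
        f = int(k)
        c = min(f + 1, len(v) - 1)
        return v[f] + (v[c] - v[f]) * (k - f)

    return pct(lower), pct(upper)
```

    Checking the data distribution first, as the abstract stresses, matters because a parametric (mean ± 1.96 SD) interval is only valid for approximately Gaussian reference data, while the rank-based version above is not.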

  13. Efficiency Analysis: Enhancing the Statistical and Evaluative Power of the Regression-Discontinuity Design.

    ERIC Educational Resources Information Center

    Madhere, Serge

    An analytic procedure, efficiency analysis, is proposed for improving the utility of quantitative program evaluation for decision making. The three features of the procedure are explained: (1) for statistical control, it adopts and extends the regression-discontinuity design; (2) for statistical inferences, it de-emphasizes hypothesis testing in…

  14. 40 CFR 75.33 - Standard missing data procedures for SO2, NOX, Hg, and flow rate.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false Standard missing data procedures for... AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTINUOUS EMISSION MONITORING Missing Data Substitution Procedures § 75.33 Standard missing data procedures for SO2, NOX, Hg, and flow rate. (a) Following initial...

  15. 40 CFR 75.33 - Standard missing data procedures for SO2, NOX, and flow rate.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 16 2011-07-01 2011-07-01 false Standard missing data procedures for... (CONTINUED) AIR PROGRAMS (CONTINUED) CONTINUOUS EMISSION MONITORING Missing Data Substitution Procedures § 75.33 Standard missing data procedures for SO2, NOX, and flow rate. (a) Following initial certification...

  16. 40 CFR 75.33 - Standard missing data procedures for SO2, NOX, and flow rate.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 17 2012-07-01 2012-07-01 false Standard missing data procedures for... (CONTINUED) AIR PROGRAMS (CONTINUED) CONTINUOUS EMISSION MONITORING Missing Data Substitution Procedures § 75.33 Standard missing data procedures for SO2, NOX, and flow rate. (a) Following initial certification...

  17. 40 CFR 75.33 - Standard missing data procedures for SO2, NOX, and flow rate.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 17 2013-07-01 2013-07-01 false Standard missing data procedures for... (CONTINUED) AIR PROGRAMS (CONTINUED) CONTINUOUS EMISSION MONITORING Missing Data Substitution Procedures § 75.33 Standard missing data procedures for SO2, NOX, and flow rate. (a) Following initial certification...

  18. 40 CFR 75.33 - Standard missing data procedures for SO2, NOX, and flow rate.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 17 2014-07-01 2014-07-01 false Standard missing data procedures for... (CONTINUED) AIR PROGRAMS (CONTINUED) CONTINUOUS EMISSION MONITORING Missing Data Substitution Procedures § 75.33 Standard missing data procedures for SO2, NOX, and flow rate. (a) Following initial certification...

  19. Clinical efficacy of a spray containing hyaluronic Acid and dexpanthenol after surgery in the nasal cavity (septoplasty, simple ethmoid sinus surgery, and turbinate surgery).

    PubMed

    Gouteva, Ina; Shah-Hosseini, Kija; Meiser, Peter

    2014-01-01

    Background. This prospective, controlled, parallel-group observational study investigated the efficacy of a spray containing hyaluronic acid and dexpanthenol to optimise regular treatment after nasal cavity surgery in 49 patients with chronic rhinosinusitis. Methods. The control group received standard therapy. Mucosal regeneration was determined using rhinoscopy sum score (RSS). Pre- and postoperative nasal patency was tested using anterior rhinomanometry. The participants were questioned about their symptoms. Results. Regarding all RSS parameters (dryness, dried nasal mucus, fibrin deposition, and obstruction), mucosal regeneration achieved good final results in both groups, tending to a better improvement through the spray application, without statistically significant differences during the whole assessment period, the mean values being 7.04, 5.00, 3.66, and 3.00 (intervention group) and 7.09, 5.14, 4.36, and 3.33 (control group). No statistically significant benefit was identified for nasal breathing, foreign body sensation, and average rhinomanometric volume flow, which improved by 12.31% (control group) and 11.24% (nasal spray group). Conclusion. The investigational product may have additional benefit on postoperative mucosal regeneration compared to standard cleaning procedures alone. However, no statistically significant advantage could be observed in this observational study. Double-blind, controlled studies with larger populations will be necessary to evaluate the efficacy of this treatment modality.

  20. Clinical Efficacy of a Spray Containing Hyaluronic Acid and Dexpanthenol after Surgery in the Nasal Cavity (Septoplasty, Simple Ethmoid Sinus Surgery, and Turbinate Surgery)

    PubMed Central

    2014-01-01

    Background. This prospective, controlled, parallel-group observational study investigated the efficacy of a spray containing hyaluronic acid and dexpanthenol to optimise regular treatment after nasal cavity surgery in 49 patients with chronic rhinosinusitis. Methods. The control group received standard therapy. Mucosal regeneration was determined using rhinoscopy sum score (RSS). Pre- and postoperative nasal patency was tested using anterior rhinomanometry. The participants were questioned about their symptoms. Results. Regarding all RSS parameters (dryness, dried nasal mucus, fibrin deposition, and obstruction), mucosal regeneration achieved good final results in both groups, tending to a better improvement through the spray application, without statistically significant differences during the whole assessment period, the mean values being 7.04, 5.00, 3.66, and 3.00 (intervention group) and 7.09, 5.14, 4.36, and 3.33 (control group). No statistically significant benefit was identified for nasal breathing, foreign body sensation, and average rhinomanometric volume flow, which improved by 12.31% (control group) and 11.24% (nasal spray group). Conclusion. The investigational product may have additional benefit on postoperative mucosal regeneration compared to standard cleaning procedures alone. However, no statistically significant advantage could be observed in this observational study. Double-blind, controlled studies with larger populations will be necessary to evaluate the efficacy of this treatment modality. PMID:25104962

  1. 45 CFR 164.308 - Administrative safeguards.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...)(i) Standard: Security management process. Implement policies and procedures to prevent, detect... this subpart for the entity. (3)(i) Standard: Workforce security. Implement policies and procedures to...) Standard: Information access management. Implement policies and procedures for authorizing access to...

  2. 45 CFR 164.308 - Administrative safeguards.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...)(i) Standard: Security management process. Implement policies and procedures to prevent, detect... this subpart for the entity. (3)(i) Standard: Workforce security. Implement policies and procedures to...) Standard: Information access management. Implement policies and procedures for authorizing access to...

  3. Evaluating measurement models in clinical research: covariance structure analysis of latent variable models of self-conception.

    PubMed

    Hoyle, R H

    1991-02-01

    Indirect measures of psychological constructs are vital to clinical research. On occasion, however, the meaning of indirect measures of psychological constructs is obfuscated by statistical procedures that do not account for the complex relations between items and latent variables and among latent variables. Covariance structure analysis (CSA) is a statistical procedure for testing hypotheses about the relations among items that indirectly measure a psychological construct and relations among psychological constructs. This article introduces clinical researchers to the strengths and limitations of CSA as a statistical procedure for conceiving and testing structural hypotheses that are not tested adequately with other statistical procedures. The article is organized around two empirical examples that illustrate the use of CSA for evaluating measurement models with correlated error terms, higher-order factors, and measured and latent variables.

  4. Consequences of common data analysis inaccuracies in CNS trauma injury basic research.

    PubMed

    Burke, Darlene A; Whittemore, Scott R; Magnuson, David S K

    2013-05-15

    The development of successful treatments for humans after traumatic brain or spinal cord injuries (TBI and SCI, respectively) requires animal research. This effort can be hampered when promising experimental results cannot be replicated because of incorrect data analysis procedures. To identify and hopefully avoid these errors in future studies, the articles in seven journals with the highest number of basic science central nervous system TBI and SCI animal research studies published in 2010 (N=125 articles) were reviewed for their data analysis procedures. After identifying the most common statistical errors, the implications of those findings were demonstrated by reanalyzing previously published data from our laboratories using the identified inappropriate statistical procedures, then comparing the two sets of results. Overall, 70% of the articles contained at least one type of inappropriate statistical procedure. The highest percentage involved incorrect post hoc t-tests (56.4%), followed by inappropriate parametric statistics (analysis of variance and t-test; 37.6%). Repeated Measures analysis was inappropriately missing in 52.0% of all articles and, among those with behavioral assessments, 58% were analyzed incorrectly. Reanalysis of our published data using the most common inappropriate statistical procedures resulted in a 14.1% average increase in significant effects compared to the original results. Specifically, an increase of 15.5% occurred with Independent t-tests and 11.1% after incorrect post hoc t-tests. Utilizing proper statistical procedures can allow more-definitive conclusions, facilitate replicability of research results, and enable more accurate translation of those results to the clinic.
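    The most common error found, uncorrected post hoc t-tests, inflates the family-wise error rate: every extra pairwise comparison adds another chance of a false positive, which is consistent with the 14.1% rise in "significant" effects seen on reanalysis. The simplest remedy (one of several; the appropriate correction depends on the design) is a Bonferroni adjustment of the pairwise p-values:

```python
def bonferroni(pvals):
    """Bonferroni-adjust raw p-values from k post hoc comparisons:
    each adjusted p is min(1, p * k), which controls the family-wise
    error rate at the nominal level."""
    k = len(pvals)
    return [min(1.0, p * k) for p in pvals]
```

    Less conservative alternatives (Holm, Tukey HSD) exist, but any of them avoids the inflation that comes from running each pairwise t-test at an unadjusted alpha.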

  5. Effect of 25(OH) vitamin D reference method procedure (RMP) alignment on clinical measurements obtained with the IDS-iSYS chemiluminescent-based automated analyzer.

    PubMed

    Simpson, Christine A; Cusano, Anna Maria; Bihuniak, Jessica; Walker, Joanne; Insogna, Karl L

    2015-04-01

    The Vitamin D Standardization Program (VDSP) has identified ID-LC/MS/MS as the reference method procedure (RMP) for 25(OH) vitamin D and NIST Standard SRM2972 as the standard reference material (SRM). As manufacturers align their products to the RMP and NIST standard, a concern is that results obtained with aligned assays will diverge from those obtained with pre-alignment assays. The Immunodiagnostic Systems Ltd. chemiluminescent 25(OH) vitamin D iSYS platform assay was recently harmonized to the RMP. To determine the impact of standardization on results obtained with iSYS reagents, 119 single-donor serum samples from eight different disease categories were analyzed in four non-standardized and two standardized iSYS assays. There were strong correlations between the four non-standardized and two standardized assays, with Spearman's rank r values between 0.961 and 0.975; four of the eight r values were >0.97. R(2) values for the eight best-fit linear regression equations ranged between 0.916 and 0.947. None of the slopes were found to be significantly different from one another. Bland-Altman plots showed that the bias was comparable when each of the four non-standardized assays was compared to either of the standardized assays. When the data were segregated into values between 6 and 49 ng/mL (15-122 nmol/L) or between 50 and 100 ng/mL (125-250 nmol/L), significant associations remained between results obtained with non-standardized and standardized calibrators regardless of the absolute value. When five recent DEQAS unknowns were analyzed in one non-standardized and one standardized assay, the mean percent difference from the NIST target in values obtained using standardized vs. non-standardized calibrators was not significantly different. Finally, strong and statistically significant associations between the results were obtained using non-standardized and standardized assays for six of eight clinical conditions. The only exceptions were hypocalcemia and breast cancer, which likely reflect the small sample sizes for each of these diseases. These initial data provide confidence that the move to a NIST-standardized assay will have little impact on results obtained with the iSYS platform. This article is part of a Special Issue entitled '17th Vitamin D Workshop'. Copyright © 2014 Elsevier Ltd. All rights reserved.
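    A Bland-Altman comparison like the one reported reduces to the mean of the paired differences between two assays (the bias) and its 95% limits of agreement. A minimal sketch, assuming the conventional mean ± 1.96 SD limits:

```python
def bland_altman(a, b):
    """Bland-Altman agreement between paired measurements from two
    assays: returns (bias, lower limit, upper limit)."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = (sum((d - bias) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return bias, bias - 1.96 * sd, bias + 1.96 * sd
```

    Classically the differences are plotted against the pairwise means; the three numbers above are the horizontal reference lines on that plot.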

  6. 78 FR 44112 - California State Motor Vehicle Pollution Control Standards; Urban Buses; Request for Waiver of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-23

    ... Board (CARB) its request for a waiver of preemption for emission standards and related test procedures... standards and test procedures for heavy-duty urban bus engines and vehicles. The 2000 rulemaking included... to emission standards and test procedures resulting from these five sets of amendments were codified...

  7. 76 FR 56126 - Energy Conservation Program: Treatment of “Smart” Appliances in Energy Conservation Standards and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-12

    ...'s energy conservation standards, as well as in test procedures used to demonstrate compliance with...'' appliances in the development of DOE's energy conservation standards, as well as in test procedures used to... Conservation Program: Treatment of ``Smart'' Appliances in Energy Conservation Standards and Test Procedures...

  8. JSC Design and Procedural Standards, JSC-STD-8080

    NASA Technical Reports Server (NTRS)

    Punch, Danny T.

    2011-01-01

    This document provides design and procedural requirements appropriate for inclusion in specifications for any human spaceflight program, project, spacecraft, system, or end item. The term "spacecraft" as used in the standards includes launch vehicles, orbital vehicles, non-terrestrial surface vehicles, and modules. The standards are developed and maintained as directed by Johnson Space Center (JSC) Policy Directive JPD 8080.2, JSC Design and Procedural Standards for Human Space Flight Equipment. The Design and Procedural Standards contained in this manual represent human spacecraft design and operational knowledge applicable to a wide range of spaceflight activities. These standards are imposed on JSC human spaceflight equipment through JPD 8080.2. Designers shall comply with all design standards applicable to their design effort.

  9. CTEPP STANDARD OPERATING PROCEDURE FOR PREPARATION OF SURROGATE RECOVERY STANDARD AND INTERNAL STANDARD SOLUTIONS FOR NEUTRAL TARGET ANALYTES (SOP-5.25)

    EPA Science Inventory

    This standard operating procedure describes the method used for preparing internal standard, surrogate recovery standard and calibration standard solutions for neutral analytes used for gas chromatography/mass spectrometry analysis.

  10. 21 CFR 58.81 - Standard operating procedures.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... LABORATORY PRACTICE FOR NONCLINICAL LABORATORY STUDIES Testing Facilities Operation § 58.81 Standard operating procedures. (a) A testing facility shall have standard operating procedures in writing setting... following: (1) Animal room preparation. (2) Animal care. (3) Receipt, identification, storage, handling...

  11. 21 CFR 58.81 - Standard operating procedures.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... LABORATORY PRACTICE FOR NONCLINICAL LABORATORY STUDIES Testing Facilities Operation § 58.81 Standard operating procedures. (a) A testing facility shall have standard operating procedures in writing setting... following: (1) Animal room preparation. (2) Animal care. (3) Receipt, identification, storage, handling...

  12. 21 CFR 58.81 - Standard operating procedures.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... LABORATORY PRACTICE FOR NONCLINICAL LABORATORY STUDIES Testing Facilities Operation § 58.81 Standard operating procedures. (a) A testing facility shall have standard operating procedures in writing setting... following: (1) Animal room preparation. (2) Animal care. (3) Receipt, identification, storage, handling...

  13. 21 CFR 58.81 - Standard operating procedures.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... LABORATORY PRACTICE FOR NONCLINICAL LABORATORY STUDIES Testing Facilities Operation § 58.81 Standard operating procedures. (a) A testing facility shall have standard operating procedures in writing setting... following: (1) Animal room preparation. (2) Animal care. (3) Receipt, identification, storage, handling...

  14. Statistical tools for transgene copy number estimation based on real-time PCR.

    PubMed

    Yuan, Joshua S; Burris, Jason; Stewart, Nathan R; Mentewab, Ayalew; Stewart, C Neal

    2007-11-01

    As compared with traditional transgene copy number detection technologies such as Southern blot analysis, real-time PCR provides a fast, inexpensive and high-throughput alternative. However, real-time PCR-based transgene copy number estimation tends to be ambiguous and subjective, owing to the lack of proper statistical analysis and data quality control needed to render a reliable copy number estimate with a prediction value. Despite recent progress in the statistical analysis of real-time PCR, few publications have integrated these advances into real-time PCR-based transgene copy number determination. Three experimental designs and four statistical models with integrated data quality control are presented. In the first design, external calibration curves are established for the transgene from serially diluted templates, and the Ct numbers from a control transgenic event and a putative transgenic event are compared to estimate transgene copy number or zygosity. Simple linear regression and two-group t-test procedures were combined to model the data from this design. In the second design, standard curves are generated for both an internal reference gene and the transgene, and the transgene copy number is compared with that of the internal reference gene. Multiple regression models and ANOVA models can be employed to analyze the data and perform quality control for this approach. In the third design, transgene copy number is compared with the reference gene without a standard curve, based directly on the fluorescence data. Two different multiple regression models were proposed to analyze the data, corresponding to two different approaches to integrating amplification efficiency. Our results highlight the importance of proper statistical treatment and quality control integration in real-time PCR-based transgene copy number determination.
    These statistical methods make real-time PCR-based transgene copy number estimation more reliable and precise. Proper confidence intervals are necessary for unambiguous prediction of transgene copy number. The four statistical methods are compared for their advantages and disadvantages. Moreover, the statistical methods can also be applied to other real-time PCR-based quantification assays, including transfection efficiency analysis and pathogen quantification.
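The comparative-Ct logic of the first design can be sketched as follows. This is a minimal illustration of the general 2^-ΔΔCt calculation against a calibrator of known copy number, not the authors' regression and t-test models; the Ct values and the assumption of perfect (2-fold per cycle) amplification efficiency are hypothetical.

```python
def copy_number_ddct(ct_target_sample, ct_ref_sample,
                     ct_target_calibrator, ct_ref_calibrator,
                     calibrator_copies=1, efficiency=2.0):
    """Estimate transgene copy number by the comparative Ct method.

    Assumes the calibrator is a transgenic event of known copy number
    and that transgene and reference amplify with the same efficiency.
    """
    delta_sample = ct_target_sample - ct_ref_sample
    delta_calibrator = ct_target_calibrator - ct_ref_calibrator
    ddct = delta_sample - delta_calibrator
    return calibrator_copies * efficiency ** (-ddct)

# Hypothetical Ct values: the sample's transgene amplifies one cycle
# earlier relative to the reference than the single-copy calibrator
# does, suggesting roughly two copies.
n = copy_number_ddct(22.0, 20.0, 23.0, 20.0)  # -> 2.0
```

In practice the point estimate would be accompanied by a confidence interval, as the abstract stresses, since Ct replicates vary.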

  15. Global aesthetic surgery statistics: a closer look.

    PubMed

    Heidekrueger, Paul I; Juran, S; Ehrl, D; Aung, T; Tanna, N; Broer, P Niclas

    2017-08-01

    Obtaining quality global statistics about surgical procedures remains an important yet challenging task. The International Society of Aesthetic Plastic Surgery (ISAPS) reports the total number of surgical and non-surgical procedures performed worldwide on a yearly basis. While providing valuable insight, ISAPS' statistics leave two important factors unaccounted for: (1) the underlying base population, and (2) the number of surgeons performing the procedures. Statistics from ISAPS' published 'International Survey on Aesthetic/Cosmetic Surgery' were analysed by country, taking into account the underlying national base population according to the official United Nations population estimates. Further, the number of surgeons per country was used to calculate the number of surgeries performed per surgeon. In 2014, based on ISAPS statistics, national surgical procedure totals ranked in the following order: 1st USA, 2nd Brazil, 3rd South Korea, 4th Mexico, 5th Japan, 6th Germany, 7th Colombia, and 8th France. When the size of the underlying national populations is considered, the demand for surgical procedures per 100,000 people changes the overall ranking substantially. The rate of surgical procedures per surgeon also varies greatly between the responding countries. While the US and Brazil are often quoted as the countries with the highest demand for plastic surgery, according to the presented analysis, other countries surpass them in surgical procedures per capita. While data acquisition and quality should be improved in the future, valuable insight into the demand for surgical procedures can be gained by taking specific demographic and geographic factors into consideration.
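The normalization the authors apply can be sketched with invented figures (the populations and procedure counts below are hypothetical, not ISAPS data): a smaller country can outrank a much larger one once the base population is accounted for.

```python
def procedures_per_100k(total_procedures, population):
    """Per-capita demand: procedures per 100,000 inhabitants."""
    return total_procedures / population * 100_000

def procedures_per_surgeon(total_procedures, n_surgeons):
    """Workload metric: procedures performed per surgeon."""
    return total_procedures / n_surgeons

# Hypothetical country A: many procedures, very large population.
rate_large = procedures_per_100k(1_500_000, 320_000_000)  # ~468.8
# Hypothetical country B: fewer procedures, much smaller population.
rate_small = procedures_per_100k(1_000_000, 50_000_000)   # 2000.0

# Country B leads per capita despite performing fewer procedures.
```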

  16. Usefulness of silicone elastomer sheet as another option of adhesion preventive material during craniectomies.

    PubMed

    Lee, Choon-Hyun; Cho, Do-Sang; Jin, Sung-Chul; Kim, Sung-Hak; Park, Dong-Been

    2007-10-01

    We describe the use of a silicone elastomer sheet (SILASTIC) to prevent peridural fibrosis in patients who underwent a craniectomy and a subsequent cranioplasty. We performed a decompressive craniectomy and a subsequent cranioplasty with an autologous bone flap in 50 patients (mean age, 40 years) between 1996 and 2005 at our institution. Most of the craniectomies were performed as an emergency procedure for relief of brain swelling. The standard decompressive craniectomy technique that we performed included bone removal and a duroplasty in 26 of the 50 patients; in the remaining patients, a SILASTIC sheet was added to the standard decompressive craniectomy in an attempt to prevent dural adhesions. The development of adhesions between the tissue layers was evaluated during the cranioplasty in terms of operative time and blood loss. During the cranioplasty, we observed that the SILASTIC sheet succeeded in creating a controlled dissection plane, which facilitated access to the epidural space, shortened the operative time by approximately 24.8%, and diminished the intraoperative blood loss by 37.9% compared with the group of patients who underwent the standard cranioplasty. These differences were statistically significant (p<0.05). The use of a SILASTIC sheet to prevent peridural scarring and to facilitate cranioplasty in patients who have previously undergone a craniectomy is a good technique, regardless of the procedural indication.

  17. Statistical analysis and digital processing of the Mössbauer spectra

    NASA Astrophysics Data System (ADS)

    Prochazka, Roman; Tucek, Pavel; Tucek, Jiri; Marek, Jaroslav; Mashlan, Miroslav; Pechousek, Jiri

    2010-02-01

    This work focuses on statistical methods and the development of filtration procedures for signal processing in Mössbauer spectroscopy. Statistical tools for noise filtering in measured spectra are used in many scientific areas; here, a purely statistical approach to filtering accumulated Mössbauer spectra is described. In Mössbauer spectroscopy, the noise can be treated as a Poisson process, well approximated by a Gaussian distribution at high count numbers. This noise is a superposition of non-resonant photon counting, electronic noise (from the γ-ray detection and discrimination units), and imperfections of the velocity system, which can be characterized by velocity nonlinearities. The possibility of noise reduction with a newly designed statistical filter procedure is described. This mathematical procedure improves the signal-to-noise ratio and thus makes it easier to determine the hyperfine parameters of a given Mössbauer spectrum. The filter procedure is based on a periodogram method that makes it possible to identify the statistically important components in the spectral domain. The significance level for these components is then feedback-controlled using the results of a correlation coefficient test. The theoretical correlation coefficient level corresponding to the spectrum resolution is estimated, and the test compares the theoretical and experimental correlation coefficients computed by the Spearman method. The correctness of this solution was verified by a series of statistical tests and confirmed on many spectra measured with increasing statistical quality for a given sample (absorber). The effect of the filter procedure depends on the signal-to-noise ratio, and the applicability of the method is subject to limiting conditions.
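A simplified stand-in for the described filter can be sketched as follows: a plain FFT periodogram with a keep-the-strongest-components rule substitutes for the paper's feedback-controlled Spearman correlation test, and the spectrum is synthetic (a Lorentzian absorption dip plus Poisson counting noise), so this illustrates only the general idea of spectral-domain denoising.

```python
import numpy as np

def periodogram_filter(signal, keep_fraction=0.05):
    """Keep only the strongest spectral components (a crude stand-in
    for a statistical significance test) and reconstruct by inverse FFT."""
    spectrum = np.fft.rfft(signal)
    power = np.abs(spectrum) ** 2
    threshold = np.quantile(power, 1.0 - keep_fraction)
    filtered = np.where(power >= threshold, spectrum, 0.0)
    return np.fft.irfft(filtered, n=len(signal))

# Synthetic Mossbauer-like spectrum: baseline counts with a Lorentzian
# absorption dip, degraded by Poisson noise.
rng = np.random.default_rng(0)
v = np.linspace(-10, 10, 1024)                       # velocity axis
ideal = 1e5 * (1 - 0.1 / (1 + (v / 0.5) ** 2))       # noiseless spectrum
noisy = rng.poisson(ideal).astype(float)
smoothed = periodogram_filter(noisy)
```

The smoothed trace reproduces the dip while suppressing most of the counting noise; a real implementation would choose the retained components by a significance criterion rather than a fixed fraction.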

  18. Behavior analytic approaches to problem behavior in intellectual disabilities.

    PubMed

    Hagopian, Louis P; Gregory, Meagan K

    2016-03-01

    The purpose of the current review is to summarize recent behavior analytic research on problem behavior in individuals with intellectual disabilities. We have focused our review on studies published from 2013 to 2015, but also included earlier studies that were relevant. Behavior analytic research on problem behavior continues to focus on the use and refinement of functional behavioral assessment procedures and function-based interventions. During the review period, a number of studies reported on procedures aimed at making functional analysis procedures more time efficient. Behavioral interventions continue to evolve, and there were several larger-scale clinical studies reporting on multiple individuals. There was increased attention on the part of behavioral researchers to developing statistical methods for the analysis of within-subject data, and continued efforts to aggregate findings across studies through evaluative reviews and meta-analyses. Findings support the continued utility of functional analysis for guiding individualized interventions and for classifying problem behavior. Modifications designed to make functional analysis more efficient relative to the standard method were reported; however, these require further validation. Larger-scale studies on behavioral assessment and treatment procedures provided additional empirical support for the effectiveness of these approaches and their sustainability outside controlled clinical settings.

  19. Inversion of ground-motion data from a seismometer array for rotation using a modification of Jaeger's method

    USGS Publications Warehouse

    Chi, Wu-Cheng; Lee, W.H.K.; Aston, J.A.D.; Lin, C.J.; Liu, C.-C.

    2011-01-01

    We develop a new way to invert 2D translational waveforms using Jaeger's (1969) formula to derive rotational ground motions about one axis, and estimate the errors in them using techniques from statistical multivariate analysis. This procedure can be used to derive rotational ground motions and strains from arrayed translational data, thus providing an efficient way to calibrate the performance of rotational sensors. The approach does not require a priori information about the noise level of the translational data or the elastic properties of the media. It also provides estimates of the standard deviations of the derived rotations and strains. In this study, we validated this code using synthetic translational waveforms from a seismic array. The results after inversion of the synthetics for rotations were almost identical to those derived using a well-tested inversion procedure by Spudich and Fletcher (2009). This new 2D procedure can be applied three times to obtain the full, three-component rotations. Additional modifications can be implemented in the code in the future to study different features of the rotational ground motions and strains induced by the passage of seismic waves.

  20. Complications Following Common Inpatient Urological Procedures: Temporal Trend Analysis from 2000 to 2010.

    PubMed

    Meyer, Christian P; Hollis, Michael; Cole, Alexander P; Hanske, Julian; O'Leary, James; Gupta, Soham; Löppenberg, Björn; Zavaski, Mike E; Sun, Maxine; Sammon, Jesse D; Kibel, Adam S; Fisch, Margit; Chun, Felix K H; Trinh, Quoc-Dien

    2016-04-01

    Measuring procedure-specific complication-rate trends allows for benchmarking and improvement in quality of care but must be done in a standardized fashion. Using the Nationwide Inpatient Sample, we identified all instances of eight common inpatient urologic procedures performed in the United States between 2000 and 2010. This yielded 327,218 cases, including both oncologic and benign diseases. Complications were identified by International Classification of Diseases, Ninth Revision codes. Each complication was cross-referenced to the procedure code and graded according to the standardized Clavien system. Mann-Whitney and chi-square tests were used to assess the statistical significance of differences in medians and proportions, respectively. We assessed temporal variability in the rates of overall complications (Clavien grade 1-4), length of hospital stay, and in-hospital mortality using the estimated annual percent change (EAPC) linear regression methodology. We observed an overall reduction in length of stay (EAPC: -1.59; p<0.001), whereas mortality rates remained negligible and unchanged (EAPC: -0.32; p=0.83). Patient comorbidities increased significantly over the study period (EAPC: 2.09; p<0.001), as did the rates of complications. Procedure-specific trends showed a significant increase in complications for inpatient ureterorenoscopy (EAPC: 5.53; p<0.001), percutaneous nephrolithotomy (EAPC: 3.75; p<0.001), radical cystectomy (EAPC: 1.37; p<0.001), radical nephrectomy (EAPC: 1.35; p<0.001), and partial nephrectomy (EAPC: 1.22; p=0.006). Limitations include lack of postdischarge follow-up data, lack of pathologic characteristics, and inability to adjust for secular changes in administrative coding. In the context of urologic care in the United States, our findings suggest a shift toward more complex oncologic procedures in the inpatient setting, with same-day procedures most likely shifted to the outpatient setting.
Consequently, complications have increased for the majority of examined procedures; however, no change in mortality was found. This report evaluated the trends of urologic procedures and their complications. A significant shift toward sicker patients and more complex procedures in the inpatient setting was found, but this did not result in higher mortality. These results are indicators of the high quality of care for urologic procedures in the inpatient setting. Copyright © 2015 European Association of Urology. Published by Elsevier B.V. All rights reserved.
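The EAPC methodology referenced above is commonly computed as the ordinary-least-squares slope of log rates regressed on calendar year, back-transformed to a percentage. A sketch under that assumption, with invented rates (not the study's data):

```python
import math

def eapc(years, rates):
    """Estimated annual percent change: OLS slope of ln(rate) on year,
    back-transformed via (e^beta - 1) * 100."""
    logs = [math.log(r) for r in rates]
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(logs) / n
    beta = (sum((x - mean_x) * (y - mean_y) for x, y in zip(years, logs))
            / sum((x - mean_x) ** 2 for x in years))
    return (math.exp(beta) - 1) * 100

# Hypothetical complication rates growing exactly 5% per year,
# so the EAPC recovers 5.0.
years = list(range(2000, 2011))
rates = [10 * 1.05 ** (y - 2000) for y in years]
trend = eapc(years, rates)  # -> 5.0
```

A full analysis would also report a confidence interval and p-value for the slope, as the abstract does.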

  1. Two Paradoxes in Linear Regression Analysis.

    PubMed

    Feng, Ge; Peng, Jing; Tu, Dongke; Zheng, Julia Z; Feng, Changyong

    2016-12-25

    Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection.
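The kind of selection pitfall the authors warn against can be illustrated with a hypothetical simulation (not taken from the paper): when many pure-noise candidate predictors are screened informally against the outcome, some will pass a p < 0.05 univariable screen by chance alone, so a model "selected" this way looks spuriously good.

```python
import math
import random

random.seed(1)
n, p = 100, 50  # observations, pure-noise candidate predictors

y = [random.gauss(0, 1) for _ in range(n)]
X = [[random.gauss(0, 1) for _ in range(n)] for _ in range(p)]

def abs_corr(x, y):
    """Absolute Pearson correlation between two equal-length sequences."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return abs(sxy / math.sqrt(sxx * syy))

# The critical |r| for two-sided p < 0.05 with n = 100 is about 0.197.
# On average ~2.5 of the 50 noise predictors clear it by chance.
selected = sum(1 for x in X if abs_corr(x, y) > 0.197)
```

Formal procedures (prespecified models, penalization, or selection corrected by resampling) avoid treating such chance survivors as real effects.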

  2. 7 CFR 610.23 - State Technical Committee meetings.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Technical Committee member. (b) NRCS will establish and maintain national standard operating procedures... standard operating procedures will outline items such as: The best practice approach to establishing... standard operating procedures established under paragraph (b) of this section, the State Conservationist...

  3. 7 CFR 610.23 - State Technical Committee meetings.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Technical Committee member. (b) NRCS will establish and maintain national standard operating procedures... standard operating procedures will outline items such as: The best practice approach to establishing... standard operating procedures established under paragraph (b) of this section, the State Conservationist...

  4. 7 CFR 610.23 - State Technical Committee meetings.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Technical Committee member. (b) NRCS will establish and maintain national standard operating procedures... standard operating procedures will outline items such as: The best practice approach to establishing... standard operating procedures established under paragraph (b) of this section, the State Conservationist...

  5. 7 CFR 610.23 - State Technical Committee meetings.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Technical Committee member. (b) NRCS will establish and maintain national standard operating procedures... standard operating procedures will outline items such as: The best practice approach to establishing... standard operating procedures established under paragraph (b) of this section, the State Conservationist...

  6. 7 CFR 610.23 - State Technical Committee meetings.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Technical Committee member. (b) NRCS will establish and maintain national standard operating procedures... standard operating procedures will outline items such as: The best practice approach to establishing... standard operating procedures established under paragraph (b) of this section, the State Conservationist...

  7. A Bayesian statistical analysis of mouse dermal tumor promotion assay data for evaluating cigarette smoke condensate.

    PubMed

    Kathman, Steven J; Potts, Ryan J; Ayres, Paul H; Harp, Paul R; Wilson, Cody L; Garner, Charles D

    2010-10-01

    The mouse dermal assay has long been used to assess the dermal tumorigenicity of cigarette smoke condensate (CSC). This mouse skin model has been developed for use in carcinogenicity testing utilizing the SENCAR mouse as the standard strain. Though the model has limitations, it remains the most relevant method available to study the dermal tumor promoting potential of mainstream cigarette smoke. In the typical SENCAR mouse CSC bioassay, CSC is applied for 29 weeks following the application of a tumor initiator such as 7,12-dimethylbenz[a]anthracene (DMBA). Several endpoints are considered for analysis, including the percentage of animals with at least one mass, latency, and the number of masses per animal. In this paper, a relatively straightforward analytic model and procedure are presented for analyzing the time course of the incidence of masses. The procedure takes advantage of Bayesian statistical techniques, which provide powerful methods for model fitting and simulation. Two datasets are analyzed to illustrate how the model fits the data, how well the model may perform in predicting data from such trials, and how the model may be used as a decision tool when comparing the dermal tumorigenicity of cigarette smoke condensate from multiple cigarette types. The analysis presented here was developed as a statistical decision tool for differentiating between two or more prototype products based on dermal tumorigenicity. Copyright (c) 2010 Elsevier Inc. All rights reserved.

  8. Clinical evaluation of subepithelial connective tissue graft and guided tissue regeneration for treatment of Miller’s class 1 gingival recession (comparative, split mouth, six months study)

    PubMed Central

    Bhavsar, Neeta-V.; Dulani, Kirti; Trivedi, Rahul

    2014-01-01

    Objectives: The present study aims to clinically compare and evaluate subepithelial connective tissue graft and GTR-based root coverage in the treatment of Miller’s Class I gingival recession. Study Design: 30 patients with at least one pair of Miller’s Class I gingival recessions were treated either with a subepithelial connective tissue graft (Group A) or guided tissue regeneration (Group B). Clinical parameters monitored included recession depth (RD), width of keratinized gingiva (KG), probing depth (PD), clinical attachment level (CAL), attached gingiva (AG), residual probing depth (RPD) and percentage of root coverage (%RC). Measurements were taken at baseline, three months and six months. A standard surgical procedure was used for both Group A and Group B. Data were recorded and statistical analysis was performed for both intergroup and intragroup comparisons. Results: At the end of six months the %RC obtained was 84.47% (Group A) and 81.67% (Group B). Both treatments resulted in statistically significant improvement in clinical parameters. When compared, no statistically significant difference was found between the groups except in RPD, which was significantly greater in Group A. Conclusions: The GTR technique has advantages over the subepithelial connective tissue graft for shallow Miller’s Class I defects, and this procedure can be used to avoid patient discomfort and reduce treatment time. Key words: Collagen membrane, comparative split mouth study, gingival recession, subepithelial connective tissue graft, guided tissue regeneration (GTR). PMID:25136420

  9. Quantized correlation coefficient for measuring reproducibility of ChIP-chip data.

    PubMed

    Peng, Shouyong; Kuroda, Mitzi I; Park, Peter J

    2010-07-27

    Chromatin immunoprecipitation followed by microarray hybridization (ChIP-chip) is used to study protein-DNA interactions and histone modifications on a genome scale. To ensure data quality, these experiments are usually performed in replicates, and a correlation coefficient between replicates is often used to assess reproducibility. However, the correlation coefficient can be misleading because it is affected not only by the reproducibility of the signal but also by the amount of binding signal present in the data. We develop the quantized correlation coefficient (QCC), which is much less dependent on the amount of signal. This involves discretization of the data into a set of quantiles (quantization), a merging procedure to group the background probes, and recalculation of the Pearson correlation coefficient. This procedure reduces the influence of background noise on the statistic, so that it focuses on the reproducibility of the signal. The performance of this procedure is tested in both simulated and real ChIP-chip data. For replicates with different levels of enrichment over background and coverage, we find that QCC reflects reproducibility more accurately and is more robust than the standard Pearson or Spearman correlation coefficients. The quantization and merging procedure can also suggest a proper quantile threshold for separating signal from background for further analysis. To measure reproducibility of ChIP-chip data correctly, a correlation coefficient that is robust to the amount of signal present should be used. QCC is one such measure. The QCC statistic can also be applied in a variety of other contexts for measuring reproducibility, including analysis of array CGH data for DNA copy number and gene expression data.
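A rough sketch of the QCC construction as described (quantile binning, merging the background bins into a single level, then Pearson on the quantized values); the bin counts below are illustrative choices, not the published defaults.

```python
import numpy as np

def qcc(x, y, n_quantiles=10, background_quantiles=7):
    """Quantized correlation coefficient sketch.

    Rank each replicate into quantile bins, collapse the lowest
    (background) bins into one level so background noise cannot
    inflate or deflate the statistic, then compute the Pearson
    correlation of the quantized values.
    """
    def quantize(v):
        cuts = np.quantile(v, np.linspace(0, 1, n_quantiles + 1)[1:-1])
        ranks = np.searchsorted(cuts, v)
        # Merge all background bins into a single level.
        return np.where(ranks < background_quantiles,
                        background_quantiles - 1, ranks)

    qx = quantize(np.asarray(x, dtype=float))
    qy = quantize(np.asarray(y, dtype=float))
    return float(np.corrcoef(qx, qy)[0, 1])

# Sanity check: identical replicates quantize identically, so QCC is 1.
q_same = qcc(np.arange(100), np.arange(100))
```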

  10. Measuring temperature rise during orthopaedic surgical procedures.

    PubMed

    Manoogian, Sarah; Lee, Adam K; Widmaier, James C

    2016-09-01

    A reliable means of measuring the temperatures generated during surgical procedures is needed to recommend best practices for inserting fixation devices and minimizing the risk of osteonecrosis. Twenty-four screw tests covering three surgical procedures were conducted using four thermocouples in the bone and one thermocouple in the screw. The maximum temperature rise recorded by the thermocouple in the screw (92.7±8.9°C, 158.7±20.9°C, 204.4±35.2°C) was consistently higher than the average temperature rise recorded in the bone (31.8±9.3°C, 44.9±12.4°C, 77.3±12.7°C). The same overall trend between the temperatures resulting from the three screw insertion procedures was statistically significant whether based on the thermocouple in the screw or on the average of several in-bone thermocouples. Placing a single thermocouple in the bone was determined to have limitations for accurately comparing temperatures from different external fixation screw insertion procedures. Using the preferred measurement techniques, a standard screw with a predrilled hole was found to produce the lowest maximum temperatures for the shortest duration compared with the other two insertion procedures. Future studies evaluating bone temperature increase need to use reliable temperature measurements when recommending best practices to surgeons. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.

  11. Influence of skin peeling procedure in allergic contact dermatitis.

    PubMed

    Kim, Jung Eun; Park, Hyun Jeong; Cho, Baik Kee; Lee, Jun Young

    2008-03-01

    The prevalence of allergic contact dermatitis in patients who have previously undergone skin peeling has rarely been studied. We compared the frequency of positive patch test (PT) reactions in a patient group with a history of peeling to that of a control group with no history of peeling. The Korean standard series and cosmetic series were performed on a total of 262 patients: 62 patients had previously undergone peeling and 200 had not. The frequency of positive PT reactions on the Korean standard series was significantly higher in the peeling group than in the control group (P < 0.05, chi-square test). However, the most commonly identified allergens were mostly cosmetic-unrelated allergens. The frequency of positive PT reactions on the cosmetic series in the peeling group was higher than that of the control group, but lacked statistical significance. Likewise, the frequency of positive PT reactions on the cosmetic series in the high-frequency peel group was higher than that of the low-frequency group, but lacked statistical significance. It appears that peeling may not generally affect the development of contact sensitization. Further work is required, focusing on large-scale prospective studies performing a PT before and after peeling.

  12. Systematic review of general thoracic surgery articles to identify predictors of operating room case durations.

    PubMed

    Dexter, Franklin; Dexter, Elisabeth U; Masursky, Danielle; Nussmeier, Nancy A

    2008-04-01

    Previous studies of operating room (OR) information systems data over the past two decades have shown how to predict case durations using the combination of scheduled procedure(s), individual surgeon and assistant(s), and type of anesthetic(s). We hypothesized that the accuracy of case duration prediction could be improved by the use of other electronic medical record data (e.g., patient weight or surgeon notes using standardized vocabularies). General thoracic surgery was used as a model specialty because much of its workload is elective (scheduled) and many of its cases are long. PubMed was searched for thoracic surgery papers reporting operative time, surgical time, etc. The systematic literature review identified 48 papers reporting statistically significant differences in perioperative times. There were multiple reports of differences in OR times based on the procedure(s), perioperative team including primary surgeon, and type of anesthetic, in that sequence of importance. All such detail may not be known when the case is originally scheduled and thus may require an updated duration the day before surgery. Although the use of these categorical data from OR systems can result in few historical data for estimating each case's duration, bias and imprecision of case duration estimates are unlikely to be affected. There was a report of a difference in case duration based on additional information. However, the incidence of the procedure for the diagnosis was so uncommon as to be unlikely to affect OR management. Matching findings of prior studies using OR information system data, multiple case series show that it is important to rely on the precise procedure(s), surgical team, and type of anesthetic when estimating case durations. OR information systems need to incorporate the statistical methods designed for small numbers of prior surgical cases. 
    Future research should focus on the most effective methods to update the prediction of each case's duration as these data become available. The case series did not reveal additional data that could be cost-effectively integrated with OR information systems data to improve the accuracy of predicted durations for general thoracic surgery cases.

  13. Reusable single-port access device shortens operative time and reduces operative costs.

    PubMed

    Shussman, Noam; Kedar, Asaf; Elazary, Ram; Abu Gazala, Mahmoud; Rivkind, Avraham I; Mintz, Yoav

    2014-06-01

    In recent years, single-port laparoscopy (SPL) has become an attractive approach for performing surgical procedures. The pitfalls of this approach are technical and financial. Financial concerns stem from the increased cost of dedicated devices and prolonged operating room time. Our aim was to calculate the cost of SPL using a reusable port and instruments in order to evaluate the cost difference between this approach, SPL using available disposable ports, and standard laparoscopy. We performed 22 laparoscopic procedures via the SPL approach using a reusable single-port access system and reusable laparoscopic instruments. These included 17 cholecystectomies and five other procedures. Operative time, postoperative length of stay (LOS) and complications were prospectively recorded and compared with similar data from our SPL database. Student's t test was used for statistical analysis. SPL was successfully performed in all cases. Mean operative time for cholecystectomy was 72 min (range 40-116). Postoperative LOS was not changed from our standard protocols and was 1.1 days for cholecystectomy. The postoperative course was within normal limits for all patients, and perioperative morbidity was recorded. Both operative time and length of hospital stay were shorter for the 17 patients who underwent cholecystectomy using a reusable port than for the matched previous 17 SPL cholecystectomies we performed (p < 0.001). Prices of disposable SPL instruments, multiport access devices, and extraction bags from different manufacturers were used to calculate the cost difference. Operating with a reusable port yielded an average cost savings of US$388 compared with disposable ports, and US$240 compared with standard laparoscopy. Single-port laparoscopic surgery is a technically challenging and expensive surgical approach.
Financial concerns among others have been advocated against this approach; however, we demonstrate herein that using a reusable port and instruments reduces operative time and overall operative costs, even beyond the cost of standard laparoscopy.

  14. Efforts to improve international migration statistics: a historical perspective.

    PubMed

    Kraly, E P; Gnanasekaran, K S

    1987-01-01

    During the past decade, the international statistical community has made several efforts to develop standards for the definition, collection and publication of statistics on international migration. This article surveys the history of official initiatives to standardize international migration statistics by reviewing the recommendations of the International Statistical Institute, International Labor Organization, and the UN, and reports a recently proposed agenda for moving toward comparability among national statistical systems. Heightening awareness of the benefits of exchange and creating motivation to implement international standards requires a 3-pronged effort from the international statistical community. 1st, it is essential to continue discussion about the significance of improvement, specifically standardization, of international migration statistics. The move from theory to practice in this area requires ongoing focus by migration statisticians so that conformity to international standards itself becomes a criterion by which national statistical practices are examined and assessed. 2nd, the countries should be provided with technical documentation to support and facilitate the implementation of the recommended statistical systems. Documentation should be developed with an understanding that conformity to international standards for migration and travel statistics must be achieved within existing national statistical programs. 3rd, the call for statistical research in this area requires more efforts by the community of migration statisticians, beginning with the mobilization of bilateral and multilateral resources to undertake the preceding list of activities.

  15. 21 CFR 680.1 - Allergenic Products.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ...) Mold manufacturers shall maintain written standard operating procedures, developed by a qualified... representative species of mold subject to the standard operating procedures. The tests shall be performed at each manufacturing step during and subsequent to harvest, as specified in the standard operating procedures. Before...

  16. 21 CFR 680.1 - Allergenic Products.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ...) Mold manufacturers shall maintain written standard operating procedures, developed by a qualified... representative species of mold subject to the standard operating procedures. The tests shall be performed at each manufacturing step during and subsequent to harvest, as specified in the standard operating procedures. Before...

  17. 21 CFR 680.1 - Allergenic Products.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...) Mold manufacturers shall maintain written standard operating procedures, developed by a qualified... representative species of mold subject to the standard operating procedures. The tests shall be performed at each manufacturing step during and subsequent to harvest, as specified in the standard operating procedures. Before...

  18. 40 CFR 63.548 - Monitoring requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) You must prepare, and at all times operate according to, a standard operating procedures manual that...) You must submit the standard operating procedures manual for baghouses required by paragraph (a) of... that you specify in the standard operating procedures manual for inspections and routine maintenance...

  19. 40 CFR 63.548 - Monitoring requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) You must prepare, and at all times operate according to, a standard operating procedures manual that...) You must submit the standard operating procedures manual for baghouses required by paragraph (a) of... that you specify in the standard operating procedures manual for inspections and routine maintenance...

  20. 21 CFR 680.1 - Allergenic Products.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ...) Mold manufacturers shall maintain written standard operating procedures, developed by a qualified... representative species of mold subject to the standard operating procedures. The tests shall be performed at each manufacturing step during and subsequent to harvest, as specified in the standard operating procedures. Before...

  1. 40 CFR 63.548 - Monitoring requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...) You must prepare, and at all times operate according to, a standard operating procedures manual that...) You must submit the standard operating procedures manual for baghouses required by paragraph (a) of... that you specify in the standard operating procedures manual for inspections and routine maintenance...

  2. 75 FR 37245 - 2010 Standards for Delineating Metropolitan and Micropolitan Statistical Areas

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-28

    ... Micropolitan Statistical Areas; Notice. Federal Register / Vol. 75, No. 123 / Monday, June 28, 2010... and Micropolitan Statistical Areas AGENCY: Office of Information and Regulatory Affairs, Office of... Statistical Areas. The 2010 standards replace and supersede the 2000 Standards for Defining Metropolitan and...

  3. Estimating multilevel logistic regression models when the number of clusters is low: a comparison of different statistical software procedures.

    PubMed

    Austin, Peter C

    2010-04-22

    Multilevel logistic regression models are increasingly being used to analyze clustered data in medical, public health, epidemiological, and educational research. Procedures for estimating the parameters of such models are available in many statistical software packages. There is currently little evidence on the minimum number of clusters necessary to reliably fit multilevel regression models. We conducted a Monte Carlo study to compare the performance of different statistical software procedures for estimating multilevel logistic regression models when the number of clusters was low. We examined procedures available in BUGS, HLM, R, SAS, and Stata. We found that there were qualitative differences in the performance of different software procedures for estimating multilevel logistic models when the number of clusters was low. Among the likelihood-based procedures, estimation methods based on adaptive Gauss-Hermite approximations to the likelihood (glmer in R and xtlogit in Stata) or adaptive Gaussian quadrature (Proc NLMIXED in SAS) tended to have superior performance for estimating variance components when the number of clusters was small, compared to software procedures based on penalized quasi-likelihood. However, only Bayesian estimation with BUGS allowed for accurate estimation of variance components when there were fewer than 10 clusters. For all statistical software procedures, estimation of variance components tended to be poor when there were only five subjects per cluster, regardless of the number of clusters.
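    The study's central finding, that variance components are hard to estimate when clusters are few, can be illustrated without any of the packages compared. The Python sketch below (all values illustrative) simulates random-intercept logistic data and uses a crude method-of-moments estimate of the between-cluster standard deviation as a stand-in for full maximum likelihood:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_clusters(n_clusters, n_per_cluster, sigma_u=1.0, beta0=0.0):
        """Simulate clustered binary outcomes from a random-intercept logistic model."""
        u = rng.normal(0.0, sigma_u, size=n_clusters)      # cluster random effects
        p = 1.0 / (1.0 + np.exp(-(beta0 + u)))             # per-cluster success probability
        return rng.binomial(n_per_cluster, p), n_per_cluster

    def crude_sigma_u(successes, n):
        """Method-of-moments estimate of the random-intercept SD from cluster-level
        log-odds (continuity-corrected); a rough stand-in for likelihood-based fits."""
        p_hat = (successes + 0.5) / (n + 1.0)
        logits = np.log(p_hat / (1.0 - p_hat))
        return logits.std(ddof=1)

    # Repeat the experiment many times at 5 clusters and at 50 clusters.
    est5 = [crude_sigma_u(*simulate_clusters(5, 30)) for _ in range(200)]
    est50 = [crude_sigma_u(*simulate_clusters(50, 30)) for _ in range(200)]
    print(round(float(np.std(est5)), 2), round(float(np.std(est50)), 2))
    ```

    With only 5 clusters the estimate varies far more from experiment to experiment, mirroring the instability the Monte Carlo study reports for all likelihood-based procedures.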

  4. The Effect of Web-Based Education on Patient Satisfaction, Consultation Time and Conversion to Surgery.

    PubMed

    Boudreault, David J; Li, Chin-Shang; Wong, Michael S

    2016-01-01

    To evaluate the effect of web-based education on (1) patient satisfaction, (2) consultation times, and (3) conversion to surgery. A retrospective review of 767 new patient consultations seen by 4 university-based plastic surgeons was conducted between May 2012 and August 2013 to determine the effect a web-based education program has on patient satisfaction and consultation time. A standard 5-point Likert scale survey completed at the end of the consultation was used to assess patients' satisfaction with their experience. Consult times were obtained from the electronic medical record. All analyses were done with Statistical Analysis Software version 9.2 (SAS Inc., Cary, NC). A P value less than 0.05 was considered statistically significant. Those who viewed the program before their consultation were more satisfied with their consultation compared to those who did not (satisfaction scores, mean ± SD: 1.13 ± 0.44 vs 1.36 ± 0.74; P = 0.02) and more likely to rate their experience as excellent (92% vs 75%; P = 0.02). Contrary to the claims of Emmi Solutions, patients who viewed the educational program before consultation trended toward longer visits compared to those who did not (mean time ± SD: 54 ± 26 vs 50 ± 35 minutes; P = 0.10). More patients who completed the program went on to undergo a procedure (44% vs 37%; P = 0.16), but this difference was not statistically significant. Viewing web-based educational programs significantly improved plastic surgery patients' satisfaction with their consultation, but patients who viewed the program also trended toward longer consultation times. Although conversion to surgical procedures increased, the difference did not reach statistical significance.

  5. POWER-ENHANCED MULTIPLE DECISION FUNCTIONS CONTROLLING FAMILY-WISE ERROR AND FALSE DISCOVERY RATES.

    PubMed

    Peña, Edsel A; Habiger, Joshua D; Wu, Wensong

    2011-02-01

    Improved procedures, in terms of smaller missed discovery rates (MDR), for performing multiple hypotheses testing with weak and strong control of the family-wise error rate (FWER) or the false discovery rate (FDR) are developed and studied. The improvement over existing procedures such as the Šidák procedure for FWER control and the Benjamini-Hochberg (BH) procedure for FDR control is achieved by exploiting possible differences in the powers of the individual tests. Results signal the need to take into account the powers of the individual tests and to have multiple hypotheses decision functions which are not limited to simply using the individual p-values, as is the case, for example, with the Šidák, Bonferroni, or BH procedures. They also enhance understanding of the role of the powers of individual tests, or more precisely the receiver operating characteristic (ROC) functions of decision processes, in the search for better multiple hypotheses testing procedures. A decision-theoretic framework is utilized, and through auxiliary randomizers the procedures could be used with discrete or mixed-type data or with rank-based nonparametric tests. This is in contrast to existing p-value based procedures whose theoretical validity is contingent on each of these p-value statistics being stochastically equal to or greater than a standard uniform variable under the null hypothesis. Proposed procedures are relevant in the analysis of high-dimensional "large M, small n" data sets arising in the natural, physical, medical, economic and social sciences, whose generation and creation are accelerated by advances in high-throughput technology, notably, but not limited to, microarray technology.
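    For context, the Benjamini-Hochberg step-up rule that serves as the FDR baseline here takes only a few lines. This sketch (the p-values are illustrative) rejects every hypothesis up to the largest rank k whose sorted p-value satisfies p_(k) ≤ qk/m:

    ```python
    import numpy as np

    def benjamini_hochberg(pvals, q=0.05):
        """Boolean rejection mask for the BH step-up procedure at FDR level q."""
        p = np.asarray(pvals, dtype=float)
        m = p.size
        order = np.argsort(p)
        crit = q * np.arange(1, m + 1) / m          # BH critical values q*k/m
        below = p[order] <= crit
        reject = np.zeros(m, dtype=bool)
        if below.any():
            k = np.max(np.nonzero(below)[0])        # largest rank meeting its critical value
            reject[order[: k + 1]] = True           # reject everything up to that rank
        return reject

    pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.060, 0.074, 0.205, 0.212, 0.216]
    print(benjamini_hochberg(pvals).sum())  # prints 2: only the two smallest survive
    ```

    Note that the rule consumes nothing but the p-values themselves, which is precisely the limitation the paper's power-aware decision functions are designed to overcome.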

  6. A new developmental toxicity test for pelagic fish using anchoveta (Engraulis ringens J.).

    PubMed

    Llanos-Rivera, A; Castro, L R; Silva, J; Bay-Schmith, E

    2009-07-01

    A series of six 96-h static bioassays were performed to validate the use of anchoveta (Engraulis ringens) embryos as test organisms for ecotoxicological studies. The standardization protocol utilized potassium dichromate (K₂Cr₂O₇) as a reference toxicant and egg mortality as the endpoint. The results indicated that the mean sensitivity of anchoveta embryos to potassium dichromate was 156.1 mg L⁻¹ (range: 131-185 mg L⁻¹). The statistical data analysis showed high homogeneity in LC50 values among bioassays (coefficient of variation = 11.02%). These results demonstrated that the protocol and handling procedures implemented for the anchoveta embryo bioassays comply with international standards for intra-laboratory precision. After secondary treatment, an effluent from a modern Kraft pulp mill was tested for E. ringens embryo toxicity, finding no significant differences from the controls.

  7. Liquid chromatographic determination of histamine in fish, sauerkraut, and wine: interlaboratory study.

    PubMed

    Beljaars, P R; Van Dijk, R; Jonker, K M; Schout, L J

    1998-01-01

    An interlaboratory study of the liquid chromatographic (LC) determination of histamine in fish, sauerkraut, and wine was conducted. Diminuted and homogenized samples were suspended in water followed by clarification of extracts with perchloric acid, filtration, and dilution with water. After LC separation on a reversed-phase C18 column with phosphate buffer (pH 3.0)--acetonitrile (875 + 125, v/v) as mobile phase, histamine was measured fluorometrically (excitation, 340 nm; emission, 455 nm) in samples and standards after postcolumn derivatization with o-phthaldialdehyde (OPA). Fourteen samples (including 6 blind duplicates and 1 split level) containing histamine at about 10-400 mg/kg or mg/L were analyzed singly according to the proposed procedure by 11 laboratories. Results from one participant were excluded from statistical analysis. For all samples analyzed, repeatability relative standard deviations varied from 2.1 to 5.6%, and reproducibility relative standard deviations ranged from 2.2 to 7.1%. Averaged recoveries of histamine for this concentration range varied from 94 to 100%.

  8. "Heidelberg standard examination" and "Heidelberg standard procedures" - Development of faculty-wide standards for physical examination techniques and clinical procedures in undergraduate medical education.

    PubMed

    Nikendei, C; Ganschow, P; Groener, J B; Huwendiek, S; Köchel, A; Köhl-Hackert, N; Pjontek, R; Rodrian, J; Scheibe, F; Stadler, A-K; Steiner, T; Stiepak, J; Tabatabai, J; Utz, A; Kadmon, M

    2016-01-01

    The competent physical examination of patients and the safe and professional implementation of clinical procedures constitute essential components of medical practice in nearly all areas of medicine. The central objective of the projects "Heidelberg standard examination" and "Heidelberg standard procedures", which were initiated by students, was to establish uniform interdisciplinary standards for physical examination and clinical procedures, and to distribute them in coordination with all clinical disciplines at the Heidelberg University Hospital. The presented project report illuminates the background of the initiative and its methodological implementation. Moreover, it describes the multimedia documentation in the form of pocketbooks and a multimedia internet-based platform, as well as the integration into the curriculum. The project presentation aims to provide orientation and action guidelines to facilitate similar processes in other faculties.

  9. Implementation of a standardized electronic tool improves compliance, accuracy, and efficiency of trainee-to-trainee patient care handoffs after complex general surgical oncology procedures.

    PubMed

    Clarke, Callisia N; Patel, Sameer H; Day, Ryan W; George, Sobha; Sweeney, Colin; Monetes De Oca, Georgina Avaloa; Aiss, Mohamed Ait; Grubbs, Elizabeth G; Bednarski, Brian K; Lee, Jeffery E; Bodurka, Diane C; Skibber, John M; Aloia, Thomas A

    2017-03-01

    Duty-hour regulations have increased the frequency of trainee-trainee patient handoffs. Each handoff creates a potential source for communication errors that can lead to near-miss and patient-harm events. We investigated the utility, efficacy, and trainee experience associated with implementation of a novel, standardized, electronic handoff system. We conducted a prospective intervention study of trainee-trainee handoffs of inpatients undergoing complex general surgical oncology procedures at a large tertiary institution. Preimplementation data were measured using trainee surveys and direct observation and by tracking delinquencies in charting. A standardized electronic handoff tool was created in a research electronic data capture (REDCap) database using the previously validated I-PASS methodology (illness severity, patient summary, action list, situational awareness and contingency planning, and synthesis). Electronic handoff was augmented by direct communication via phone or face-to-face interaction for inpatients deemed "watcher" or "unstable." Postimplementation handoff compliance, communication errors, and trainee work flow were measured and compared to preimplementation values using standard statistical analysis. A total of 474 handoffs (203 preintervention and 271 postintervention) were observed over the study period; 86 handoffs involved patients admitted to the surgical intensive care unit, 344 patients admitted to the surgical stepdown unit, and 44 patients on the surgery ward. Implementation of the structured electronic tool resulted in an increase in trainee handoff compliance from 73% to 96% (P < .001) and decreased errors in communication by 50% (P = .044) while improving trainee efficiency and workflow. A standardized electronic tool augmented by direct communication for higher acuity patients can improve compliance, accuracy, and efficiency of handoff communication between surgery trainees. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. Evaluation of a standardized procedure for [corrected] microscopic cell counts [corrected] in body fluids.

    PubMed

    Emerson, Jane F; Emerson, Scott S

    2005-01-01

    A standardized urinalysis and manual microscopic cell counting system was evaluated for its potential to reduce intra- and interoperator variability in urine and cerebrospinal fluid (CSF) cell counts. Replicate aliquots of pooled specimens were submitted blindly to technologists who were instructed to use either the Kova system with the disposable Glasstic slide (Hycor Biomedical, Inc., Garden Grove, CA) or the standard operating procedure of the University of California-Irvine (UCI), which uses plain glass slides for urine sediments and hemacytometers for CSF. The Hycor system provides a mechanical means of obtaining a fixed volume of fluid in which to resuspend the sediment, and fixes the volume of specimen to be microscopically examined by using capillary filling of a chamber containing in-plane counting grids. Ninety aliquots of pooled specimens of each type of body fluid were used to assess the inter- and intraoperator reproducibility of the measurements. The variability of replicate Hycor measurements made on a single specimen by the same or different observers was compared with that predicted by a Poisson distribution. The Hycor methods generally resulted in test statistics that were slightly lower than those obtained with the laboratory standard methods, indicating a trend toward decreasing the effects of various sources of variability. For 15 paired aliquots of each body fluid, tests for systematically higher or lower measurements with the Hycor methods were performed using the Wilcoxon signed-rank test. Also examined was the average difference between the Hycor and current laboratory standard measurements, along with a 95% confidence interval (CI) for the true average difference. Without increasing labor or the requirement for attention to detail, the Hycor method provides slightly better interrater comparisons than the current method used at UCI. Copyright 2005 Wiley-Liss, Inc.
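    The Poisson benchmark used in this evaluation is straightforward to reproduce: if replicate counts of the same pooled specimen carry only counting error, their variance should be close to their mean, so the variance-to-mean ratio flags any extra operator or handling variability. A short sketch with simulated replicate counts (the rate and sample size are illustrative):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Hypothetical replicate chamber counts of one pooled specimen.
    counts = rng.poisson(lam=25, size=30)

    # Under pure Poisson counting error the dispersion index is near 1;
    # values well above 1 point to additional sources of variability.
    dispersion = counts.var(ddof=1) / counts.mean()
    print(round(float(dispersion), 2))
    ```

    Comparing this ratio between two counting methods, as the study does for the Hycor and laboratory-standard systems, shows which one comes closer to the pure-counting-error floor.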

  11. Standardization of Clinical Assessment and Sample Collection Across All PERCH Study Sites

    PubMed Central

    Prosperi, Christine; Baggett, Henry C.; Brooks, W. Abdullah; Deloria Knoll, Maria; Hammitt, Laura L.; Howie, Stephen R. C.; Kotloff, Karen L.; Levine, Orin S.; Madhi, Shabir A.; Murdoch, David R.; O’Brien, Katherine L.; Thea, Donald M.; Awori, Juliet O.; Bunthi, Charatdao; DeLuca, Andrea N.; Driscoll, Amanda J.; Ebruke, Bernard E.; Goswami, Doli; Hidgon, Melissa M.; Karron, Ruth A.; Kazungu, Sidi; Kourouma, Nana; Mackenzie, Grant; Moore, David P.; Mudau, Azwifari; Mwale, Magdalene; Nahar, Kamrun; Park, Daniel E.; Piralam, Barameht; Seidenberg, Phil; Sylla, Mamadou; Feikin, Daniel R.; Scott, J. Anthony G.; O’Brien, Katherine L.; Levine, Orin S.; Knoll, Maria Deloria; Feikin, Daniel R.; DeLuca, Andrea N.; Driscoll, Amanda J.; Fancourt, Nicholas; Fu, Wei; Hammitt, Laura L.; Higdon, Melissa M.; Kagucia, E. Wangeci; Karron, Ruth A.; Li, Mengying; Park, Daniel E.; Prosperi, Christine; Wu, Zhenke; Zeger, Scott L.; Watson, Nora L.; Crawley, Jane; Murdoch, David R.; Brooks, W. Abdullah; Endtz, Hubert P.; Zaman, Khalequ; Goswami, Doli; Hossain, Lokman; Jahan, Yasmin; Ashraf, Hasan; Howie, Stephen R. C.; Ebruke, Bernard E.; Antonio, Martin; McLellan, Jessica; Machuka, Eunice; Shamsul, Arifin; Zaman, Syed M.A.; Mackenzie, Grant; Scott, J. Anthony G.; Awori, Juliet O.; Morpeth, Susan C.; Kamau, Alice; Kazungu, Sidi; Kotloff, Karen L.; Tapia, Milagritos D.; Sow, Samba O.; Sylla, Mamadou; Tamboura, Boubou; Onwuchekwa, Uma; Kourouma, Nana; Toure, Aliou; Madhi, Shabir A.; Moore, David P.; Adrian, Peter V.; Baillie, Vicky L.; Kuwanda, Locadiah; Mudau, Azwifarwi; Groome, Michelle J.; Baggett, Henry C.; Thamthitiwat, Somsak; Maloney, Susan A.; Bunthi, Charatdao; Rhodes, Julia; Sawatwong, Pongpun; Akarasewi, Pasakorn; Thea, Donald M.; Mwananyanda, Lawrence; Chipeta, James; Seidenberg, Phil; Mwansa, James; wa Somwe, Somwe; Kwenda, Geoffrey

    2017-01-01

    Abstract Background. Variable adherence to standardized case definitions, clinical procedures, specimen collection techniques, and laboratory methods has complicated the interpretation of previous multicenter pneumonia etiology studies. To circumvent these problems, a program of clinical standardization was embedded in the Pneumonia Etiology Research for Child Health (PERCH) study. Methods. Between March 2011 and August 2013, standardized training on the PERCH case definition, clinical procedures, and collection of laboratory specimens was delivered to 331 clinical staff at 9 study sites in 7 countries (The Gambia, Kenya, Mali, South Africa, Zambia, Thailand, and Bangladesh), through 32 on-site courses and a training website. Staff competency was assessed throughout 24 months of enrollment with multiple-choice question (MCQ) examinations, a video quiz, and checklist evaluations of practical skills. Results. MCQ evaluation was confined to 158 clinical staff members who enrolled PERCH cases and controls, with scores obtained for >86% of eligible staff at each time-point. Median scores after baseline training were ≥80%, and improved by 10 percentage points with refresher training, with no significant intersite differences. Percentage agreement with the clinical trainer on the presence or absence of clinical signs on video clips was high (≥89%), with interobserver concordance being substantial to high (AC1 statistic, 0.62–0.82) for 5 of 6 signs assessed. Staff attained median scores of >90% in checklist evaluations of practical skills. Conclusions. Satisfactory clinical standardization was achieved within and across all PERCH sites, providing reassurance that any etiological or clinical differences observed across the study sites are true differences, and not attributable to differences in application of the clinical case definition, interpretation of clinical signs, or in techniques used for clinical measurements or specimen collection. PMID:28575355

  12. Flow Chamber System for the Statistical Evaluation of Bacterial Colonization on Materials

    PubMed Central

    Menzel, Friederike; Conradi, Bianca; Rodenacker, Karsten; Gorbushina, Anna A.; Schwibbert, Karin

    2016-01-01

    Biofilm formation on materials leads to high costs in industrial processes, as well as in medical applications. This fact has stimulated interest in the development of new materials with improved surfaces to reduce bacterial colonization. Standardized tests relying on statistical evidence are indispensable to evaluate the quality and safety of these new materials. We describe here a flow chamber system for biofilm cultivation under controlled conditions with a total capacity for testing up to 32 samples in parallel. In order to quantify the surface colonization, bacterial cells were DAPI (4′,6-diamidino-2-phenylindole)-stained and examined with epifluorescence microscopy. More than 100 images of each sample were automatically taken and the surface coverage was estimated using the free open source software g’mic, followed by a precise statistical evaluation. Overview images of all gathered pictures were generated to dissect the colonization characteristics of the selected model organism Escherichia coli W3110 on different materials (glass and implant steel). With our approach, differences in bacterial colonization on different materials can be quantified in a statistically validated manner. This reliable test procedure will support the design of improved materials for medical, industrial, and environmental (subaquatic or subaerial) applications. PMID:28773891
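    The coverage statistic at the center of this procedure reduces to thresholding each micrograph and taking the fraction of pixels classified as cells. A self-contained sketch on a synthetic image (the actual g'mic pipeline is not reproduced here; the image and threshold are illustrative):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Synthetic 8-bit field: dim background with one brighter "colonized" patch.
    img = rng.integers(0, 40, size=(100, 100))
    img[20:50, 30:80] = rng.integers(120, 255, size=(30, 50))

    threshold = 100                        # intensity cutoff separating cells from background
    coverage = np.mean(img > threshold)    # fraction of pixels counted as covered
    print(round(float(coverage), 3))       # prints 0.15: the patch is 30x50 of 100x100 pixels
    ```

    Computing this fraction over 100+ images per sample yields the per-sample coverage distributions on which the statistical comparison between materials is then run.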

  13. On representing the prognostic value of continuous gene expression biomarkers with the restricted mean survival curve.

    PubMed

    Eng, Kevin H; Schiller, Emily; Morrell, Kayla

    2015-11-03

    Researchers developing biomarkers for cancer prognosis from quantitative gene expression data are often faced with an odd methodological discrepancy: while Cox's proportional hazards model, the appropriate and popular technique, produces a continuous and relative risk score, it is hard to cast the estimate in clear clinical terms like median months of survival and percent of patients affected. To produce a familiar Kaplan-Meier plot, researchers commonly make the decision to dichotomize a continuous (often unimodal and symmetric) score. It is well known in the statistical literature that this procedure induces significant bias. We illustrate the liabilities of common techniques for categorizing a risk score and discuss alternative approaches. We promote the use of the restricted mean survival (RMS) and the corresponding RMS curve that may be thought of as an analog to the best fit line from simple linear regression. Continuous biomarker workflows should be modified to include the more rigorous statistical techniques and descriptive plots described in this article. All statistics discussed can be computed via standard functions in the Survival package of the R statistical programming language. Example R language code for the RMS curve is presented in the appendix.
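    The RMS statistic the authors recommend is simply the area under the Kaplan-Meier curve up to a cutoff time τ. The article points to R's Survival package; the pure-Python sketch below (toy data, with censoring handled only through the event indicator) shows the same computation:

    ```python
    import numpy as np

    def km_curve(time, event):
        """Kaplan-Meier estimate: event times and the survival value just after each."""
        t, e = np.asarray(time, float), np.asarray(event, int)
        uniq = np.unique(t[e == 1])
        s, surv = 1.0, []
        for u in uniq:
            at_risk = np.sum(t >= u)
            deaths = np.sum((t == u) & (e == 1))
            s *= 1.0 - deaths / at_risk
            surv.append(s)
        return uniq, np.array(surv)

    def rmst(time, event, tau):
        """Restricted mean survival time: area under the KM step function on [0, tau]."""
        times, surv = km_curve(time, event)
        keep = times < tau
        grid = np.concatenate(([0.0], times[keep], [tau]))   # interval endpoints
        s_vals = np.concatenate(([1.0], surv[keep]))         # S(t) on each interval
        return float(np.sum(np.diff(grid) * s_vals))

    # Toy data: three subjects with events at 2, 4, and 6 months, tau = 6 months.
    val = rmst([2, 4, 6], [1, 1, 1], tau=6.0)
    print(round(val, 3))
    ```

    Unlike a dichotomized Kaplan-Meier comparison, the RMS is reported directly in units of time (here, about 4 months of mean survival within the 6-month window), which is the clinical interpretability the article argues for.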

  14. Maximum Likelihood Estimation of Spectra Information from Multiple Independent Astrophysics Data Sets

    NASA Technical Reports Server (NTRS)

    Howell, Leonard W., Jr.; Six, N. Frank (Technical Monitor)

    2002-01-01

    The Maximum Likelihood (ML) statistical theory required to estimate spectra information from an arbitrary number of astrophysics data sets produced by vastly different science instruments is developed in this paper. This theory and its successful implementation will facilitate the interpretation of spectral information from multiple astrophysics missions and thereby permit the derivation of superior spectral information based on the combination of data sets. The procedure is of significant value to both existing data sets and those to be produced by future astrophysics missions consisting of two or more detectors, because it allows instrument developers to optimize each detector's design parameters through simulation studies in order to design and build complementary detectors that will maximize the precision with which the science objectives may be obtained. The benefits of this ML theory and its application are measured in terms of the reduction of the statistical errors (standard deviations) of the spectra information when the multiple data sets are used in concert, as compared to the statistical errors when the data sets are considered separately, as well as any biases resulting from poor statistics in one or more of the individual data sets that might be reduced when the data sets are combined.
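    For the special case of independent Gaussian estimates of the same quantity, the ML combination reduces to inverse-variance weighting, which makes the promised reduction in standard deviation concrete. The numbers below are purely illustrative:

    ```python
    import numpy as np

    # Hypothetical per-instrument estimates of one spectral parameter, with their SDs.
    estimates = np.array([2.70, 2.60, 2.75])
    sds = np.array([0.10, 0.15, 0.20])

    w = 1.0 / sds**2                          # inverse-variance weights (Gaussian ML)
    combined = np.sum(w * estimates) / np.sum(w)
    combined_sd = np.sqrt(1.0 / np.sum(w))    # always smaller than the best single SD
    print(round(float(combined), 3), round(float(combined_sd), 3))  # prints 2.681 0.077
    ```

    The combined standard deviation (0.077) beats even the most precise single instrument (0.10), which is exactly the gain from using the data sets in concert that the paper quantifies in the general, non-Gaussian case.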

  15. BIOREL: the benchmark resource to estimate the relevance of the gene networks.

    PubMed

    Antonov, Alexey V; Mewes, Hans W

    2006-02-06

    The progress of high-throughput methodologies in functional genomics has led to the development of statistical procedures to infer gene networks from various types of high-throughput data. However, due to the lack of common standards, the biological significance of the results of the different studies is hard to compare. To overcome this problem we propose a benchmark procedure and have developed a web resource (BIOREL), which is useful for estimating the biological relevance of any genetic network by integrating different sources of biological information. The associations of each gene from the network are classified as biologically relevant or not. The proportion of genes in the network classified as "relevant" is used as the overall network relevance score. Employing synthetic data, we demonstrated that such a score ranks the networks fairly with respect to the relevance level. Using BIOREL as the benchmark resource, we compared the quality of experimental and theoretically predicted protein interaction data.

  16. Markov chain Monte Carlo techniques applied to parton distribution functions determination: Proof of concept

    NASA Astrophysics Data System (ADS)

    Gbedo, Yémalin Gabin; Mangin-Brinet, Mariane

    2017-07-01

    We present a new procedure to determine parton distribution functions (PDFs), based on Markov chain Monte Carlo (MCMC) methods. The aim of this paper is to show that we can replace the standard χ2 minimization by procedures grounded on statistical methods, and on Bayesian inference in particular, thus offering additional insight into the rich field of PDFs determination. After a basic introduction to these techniques, we introduce the algorithm we have chosen to implement—namely Hybrid (or Hamiltonian) Monte Carlo. This algorithm, initially developed for Lattice QCD, turns out to be very interesting when applied to PDFs determination by global analyses; we show that it allows us to circumvent the difficulties due to the high dimensionality of the problem, in particular concerning the acceptance. A first feasibility study is performed and presented, which indicates that Markov chain Monte Carlo can successfully be applied to the extraction of PDFs and of their uncertainties.
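    A bare-bones version of the Hybrid/Hamiltonian Monte Carlo update is easy to state, even though the paper's application to PDF fitting involves far higher dimensions. The sketch below targets a one-dimensional standard normal (step size and trajectory length are illustrative):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def hmc_step(x, logp, grad, eps=0.1, n_leapfrog=20):
        """One Hamiltonian Monte Carlo update for a 1-D target density."""
        p = rng.normal()                              # resample auxiliary momentum
        x_new, p_new = x, p
        p_new += 0.5 * eps * grad(x_new)              # leapfrog: half momentum step
        for _ in range(n_leapfrog - 1):
            x_new += eps * p_new                      # full position step
            p_new += eps * grad(x_new)                # full momentum step
        x_new += eps * p_new
        p_new += 0.5 * eps * grad(x_new)              # closing half momentum step
        h_old = -logp(x) + 0.5 * p * p                # Hamiltonians before/after
        h_new = -logp(x_new) + 0.5 * p_new * p_new
        # Metropolis accept/reject keeps the target distribution exact.
        return x_new if np.log(rng.uniform()) < h_old - h_new else x

    logp = lambda x: -0.5 * x * x                     # standard normal, up to a constant
    grad = lambda x: -x
    x, samples = 0.0, []
    for _ in range(2000):
        x = hmc_step(x, logp, grad)
        samples.append(x)
    print(round(float(np.mean(samples)), 2), round(float(np.std(samples)), 2))
    ```

    In a real PDF determination, x becomes the full parameter vector and the gradient comes from the global χ²; the gradient-guided trajectories are what let the sampler cope with the high dimensionality and acceptance issues the paper discusses.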

  17. Multiple regression for physiological data analysis: the problem of multicollinearity.

    PubMed

    Slinker, B K; Glantz, S A

    1985-07-01

    Multiple linear regression, in which several predictor variables are related to a response variable, is a powerful statistical tool for gaining quantitative insight into complex in vivo physiological systems. For these insights to be correct, all predictor variables must be uncorrelated. However, in many physiological experiments the predictor variables cannot be precisely controlled and thus change in parallel (i.e., they are highly correlated). There is a redundancy of information about the response, a situation called multicollinearity, that leads to numerical problems in estimating the parameters in regression equations; the parameters are often of incorrect magnitude or sign or have large standard errors. Although multicollinearity can be avoided with good experimental design, not all interesting physiological questions can be studied without encountering multicollinearity. In these cases various ad hoc procedures have been proposed to mitigate multicollinearity. Although many of these procedures are controversial, they can be helpful in applying multiple linear regression to some physiological problems.
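    The symptom the authors describe, inflated standard errors under multicollinearity, can be demonstrated in a few lines. This sketch (simulated data, illustrative coefficients) fits the same response with and without a nearly redundant predictor:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 200
    x1 = rng.normal(size=n)
    x2 = x1 + 0.05 * rng.normal(size=n)            # nearly collinear with x1
    y = 1.0 + 2.0 * x1 - 1.0 * x2 + rng.normal(size=n)

    def ols_se(X, y):
        """OLS coefficient estimates and their standard errors."""
        XtX_inv = np.linalg.inv(X.T @ X)
        beta = XtX_inv @ X.T @ y
        resid = y - X @ beta
        sigma2 = resid @ resid / (len(y) - X.shape[1])
        return beta, np.sqrt(sigma2 * np.diag(XtX_inv))

    _, se_both = ols_se(np.column_stack([np.ones(n), x1, x2]), y)  # both predictors
    _, se_one = ols_se(np.column_stack([np.ones(n), x1]), y)       # x1 alone
    print(round(float(se_one[1]), 3), round(float(se_both[1]), 3))
    ```

    With the correlation between x1 and x2 near 0.999, the variance inflation factor 1/(1 − r²) runs into the hundreds, so the standard error of the x1 coefficient is far larger in the full model, which is the numerical instability the abstract warns about.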

  18. Headspace screening: A novel approach for fast quality assessment of the essential oil from culinary sage.

    PubMed

    Cvetkovikj, Ivana; Stefkov, Gjoshe; Acevska, Jelena; Karapandzova, Marija; Dimitrovska, Aneta; Kulevanova, Svetlana

    2016-07-01

    Quality assessment of essential oil (EO) from culinary sage (Salvia officinalis L., Lamiaceae) is limited by the long pharmacopoeial procedure. The aim of this study was to employ headspace (HS) sampling in the quality assessment of sage EO. Different populations (30) of culinary sage were assessed using GC/FID/MS analysis of the hydrodistilled EO (pharmacopoeial method) and HS sampling directly from leaves. Compound profiles from both procedures were evaluated according to ISO 9909 and GDC standards for sage EO quality, revealing compliance for only 10 populations. Factors to convert HS values, for the target ISO and GDC components, into theoretical EO values were calculated. Statistical analysis revealed a significant relationship between HS and EO values for seven target components. Consequently, HS sampling could be used as a complementary extraction technique for rapid screening in quality assessment of sage EOs. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Two Paradoxes in Linear Regression Analysis

    PubMed Central

    FENG, Ge; PENG, Jing; TU, Dongke; ZHENG, Julia Z.; FENG, Changyong

    2016-01-01

    Summary Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection. PMID:28638214

  20. Shunting outcomes in posthemorrhagic hydrocephalus: results of a Hydrocephalus Clinical Research Network prospective cohort study.

    PubMed

    Wellons, John C; Shannon, Chevis N; Holubkov, Richard; Riva-Cambrin, Jay; Kulkarni, Abhaya V; Limbrick, David D; Whitehead, William; Browd, Samuel; Rozzelle, Curtis; Simon, Tamara D; Tamber, Mandeep S; Oakes, W Jerry; Drake, James; Luerssen, Thomas G; Kestle, John

    2017-07-01

    OBJECTIVE Previous Hydrocephalus Clinical Research Network (HCRN) retrospective studies have shown a 15% difference in rates of conversion to permanent shunts with the use of ventriculosubgaleal shunts (VSGSs) versus ventricular reservoirs (VRs) as temporization procedures in the treatment of hydrocephalus due to high-grade intraventricular hemorrhage (IVH) of prematurity. Further research in the same study line revealed a strong influence of center-specific decision-making on shunt outcomes. The primary goal of this prospective study was to standardize decision-making across centers to determine true procedural superiority, if any, of VSGS versus VR as a temporization procedure in high-grade IVH of prematurity. METHODS The HCRN conducted a prospective cohort study across 6 centers with an approximate 1.5- to 3-year accrual period (depending on center) followed by 6 months of follow-up. Infants with premature birth, who weighed less than 1500 g, had Grade 3 or 4 IVH of prematurity, and had more than 72 hours of life expectancy were included in the study. Based on a priori consensus, decisions were standardized regarding the timing of initial surgical treatment, upfront shunt versus temporization procedure (VR or VSGS), and when to convert a VR or VSGS to a permanent shunt. Physical examination assessment and surgical technique were also standardized. The primary outcome was the proportion of infants who underwent conversion to a permanent shunt. The major secondary outcomes of interest included infection and other complication rates. RESULTS One hundred forty-five premature infants were enrolled and met criteria for analysis. Using the standardized decision rubrics, 28 infants never reached the threshold for treatment, 11 initially received permanent shunts, 4 were initially treated with endoscopic third ventriculostomy (ETV), and 102 underwent a temporization procedure (36 with VSGSs and 66 with VRs). 
The 2 temporization cohorts were similar in terms of sex, race, IVH grade, head (orbitofrontal) circumference, and ventricular size at temporization. There were statistically significant differences noted between groups in gestational age, birth weight, and bilaterality of clot burden that were controlled for in post hoc analysis. By Kaplan-Meier analysis, the 180-day rates of conversion to permanent shunts were 63.5% for VSGS and 74.0% for VR (p = 0.36, log-rank test). The infection rate for VSGS was 14% (5/36) and for VR was 17% (11/66; p = 0.71). The overall compliance rate with the standardized decision rubrics was noted to be 90% for all surgeons. CONCLUSIONS A standardized protocol was instituted across all centers of the HCRN. Compliance was high. Choice of temporization techniques in premature infants with IVH does not appear to influence rates of conversion to permanent ventricular CSF diversion. Once management decisions and surgical techniques are standardized across HCRN sites, thus minimizing center effect, the observed difference in conversion rates between VSGSs and VRs is mitigated.
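The 180-day conversion rates quoted above are Kaplan-Meier estimates compared with a log-rank test. A minimal sketch of the Kaplan-Meier estimator, using invented example data rather than the study's, might look like:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate of the event-free proportion over time.

    `events[i]` is 1 if the event (here, conversion to a permanent shunt)
    occurred at `times[i]`, or 0 if the subject was censored at that time.
    Returns the step curve as (time, survival) pairs at event times.
    """
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    survival = 1.0
    curve = []
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = n_at_t = 0
        # group all subjects sharing this time point
        while i < len(order) and times[order[i]] == t:
            deaths += events[order[i]]
            n_at_t += 1
            i += 1
        if deaths:
            survival *= 1 - deaths / at_risk
            curve.append((t, survival))
        at_risk -= n_at_t
    return curve

# Invented data: events at days 5, 10, 20; censoring at days 10 and 30
print(kaplan_meier([5, 10, 10, 20, 30], [1, 1, 0, 1, 0]))
```

The log-rank comparison between the VSGS and VR arms would then test whether two such curves differ beyond chance.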

  1. 40 CFR 136.7 - Quality assurance and quality control.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... quality control elements, where applicable, into the laboratory's documented standard operating procedure... quality control elements must be clearly documented in the written standard operating procedure for each... Methods contains QA/QC procedures in the Part 1000 section of the Standard Methods Compendium. The...

  2. 40 CFR 63.550 - Recordkeeping and reporting requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... recordkeeping required as part of the practices described in the standard operating procedures manual for... part of the practices described in the standard operating procedures manual for baghouses required... period, including an explanation of the periods when the procedures outlined in the standard operating...

  3. 40 CFR 63.550 - Recordkeeping and reporting requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... recordkeeping required as part of the practices described in the standard operating procedures manual for... part of the practices described in the standard operating procedures manual for baghouses required... period, including an explanation of the periods when the procedures outlined in the standard operating...

  4. 40 CFR 136.7 - Quality assurance and quality control.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... quality control elements, where applicable, into the laboratory's documented standard operating procedure... quality control elements must be clearly documented in the written standard operating procedure for each... Methods contains QA/QC procedures in the Part 1000 section of the Standard Methods Compendium. The...

  5. 40 CFR 136.7 - Quality assurance and quality control.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... quality control elements, where applicable, into the laboratory's documented standard operating procedure... quality control elements must be clearly documented in the written standard operating procedure for each... Methods contains QA/QC procedures in the Part 1000 section of the Standard Methods Compendium. The...

  6. 49 CFR 383.131 - Test procedures.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... provide standardized scoring sheets for the skills tests, as well as standardized driving instructions for... 49 Transportation 5 2010-10-01 2010-10-01 false Test procedures. 383.131 Section 383.131... STANDARDS; REQUIREMENTS AND PENALTIES Tests § 383.131 Test procedures. (a) Driver information manuals...

  7. Uncertainty Analysis for DAM Projects.

    DTIC Science & Technology

    1987-09-01

    overwhelming majority of articles published on the use of statistical methodology for geotechnical engineering focus on performance predictions and design ...Results of the present study do not support the adoption of more esoteric statistical procedures except on a special case basis or in research ...influence that recommended statistical procedures might have had on the Carters Project, had they been applied during planning and design phases

  8. Improving the safety and quality of nursing care through standardized operating procedures in Bosnia and Herzegovina.

    PubMed

    Ausserhofer, Dietmar; Rakic, Severin; Novo, Ahmed; Dropic, Emira; Fisekovic, Eldin; Sredic, Ana; Van Malderen, Greet

    2016-06-01

    We explored how selected 'positive deviant' healthcare facilities in Bosnia and Herzegovina approach the continuous development, adaptation, implementation, monitoring and evaluation of nursing-related standard operating procedures. Standardized nursing care is internationally recognized as a critical element of safe, high-quality health care; yet very little research has examined one of its key instruments: nursing-related standard operating procedures. Despite variability in Bosnia and Herzegovina's healthcare and nursing care quality, we assumed that some healthcare facilities would have developed effective strategies to elevate nursing quality and safety through the use of standard operating procedures. Guided by the 'positive deviance' approach, we used a multiple-case study design to examine a criterion sample of four facilities (two primary healthcare centres and two hospitals), collecting data via focus groups and individual interviews. In each studied facility, certification/accreditation processes were crucial to the initiation of continuous development, adaptation, implementation, monitoring and evaluation of nursing-related SOPs. In one hospital and one primary healthcare centre, nurses working in advanced roles (i.e. quality coordinators) were responsible for developing and implementing nursing-related standard operating procedures. Across the four studied institutions, we identified a consistent approach to standard operating procedures-related processes. The certification/accreditation process is enabling necessary changes in institutions' organizational cultures, empowering nurses to take on advanced roles in improving the safety and quality of nursing care. Standardizing nursing procedures is key to improve the safety and quality of nursing care. 
Nursing and health policy action is needed in Bosnia and Herzegovina to establish a functioning institutional framework, including regulatory bodies, educational systems for developing nurses' capacities, and the inclusion of nursing-related standard operating procedures in certification/accreditation standards. © 2016 International Council of Nurses.

  9. Estimating peak-flow frequency statistics for selected gaged and ungaged sites in naturally flowing streams and rivers in Idaho

    USGS Publications Warehouse

    Wood, Molly S.; Fosness, Ryan L.; Skinner, Kenneth D.; Veilleux, Andrea G.

    2016-06-27

The U.S. Geological Survey, in cooperation with the Idaho Transportation Department, updated regional regression equations to estimate peak-flow statistics at ungaged sites on Idaho streams using recent streamflow (flow) data and new statistical techniques. Peak-flow statistics with 80-, 67-, 50-, 43-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance probabilities (1.25-, 1.50-, 2.00-, 2.33-, 5.00-, 10.0-, 25.0-, 50.0-, 100-, 200-, and 500-year recurrence intervals, respectively) were estimated for 192 streamgages in Idaho and bordering States with at least 10 years of annual peak-flow record through water year 2013. The streamgages were selected from drainage basins with little or no flow diversion or regulation. The peak-flow statistics were estimated by fitting a log-Pearson type III distribution to records of annual peak flows and applying two additional statistical methods: (1) the Expected Moments Algorithm to help describe uncertainty in annual peak flows and to better represent missing and historical record; and (2) the generalized Multiple Grubbs Beck Test to screen out potentially influential low outliers and to better fit the upper end of the peak-flow distribution. Additionally, a new regional skew was estimated for the Pacific Northwest and used to weight at-station skew at most streamgages. The streamgages were grouped into six regions (numbered 1_2, 3, 4, 5, 6_8, and 7, to maintain consistency in region numbering with a previous study), and the estimated peak-flow statistics were related to basin and climatic characteristics to develop regional regression equations using a generalized least squares procedure. Four out of 24 evaluated basin and climatic characteristics were selected for use in the final regional peak-flow regression equations. Overall, the standard error of prediction for the regional peak-flow regression equations ranged from 22 to 132 percent.
Among all regions, regression model fit was best for region 4 in west-central Idaho (average standard error of prediction=46.4 percent; pseudo-R2>92 percent) and region 5 in central Idaho (average standard error of prediction=30.3 percent; pseudo-R2>95 percent). Regression model fit was poor for region 7 in southern Idaho (average standard error of prediction=103 percent; pseudo-R2<78 percent) compared to other regions because few streamgages in region 7 met the criteria for inclusion in the study, and the region’s semi-arid climate and associated variability in precipitation patterns cause substantial variability in peak flows. A drainage area ratio-adjustment method, using ratio exponents estimated by generalized least-squares regression, was presented as an alternative to the regional regression equations if peak-flow estimates are desired at an ungaged site that is close to a streamgage selected for inclusion in this study. The alternative drainage area ratio-adjustment method is appropriate for use when the drainage area ratio between the ungaged and gaged sites is between 0.5 and 1.5. The updated regional peak-flow regression equations had lower total error (standard error of prediction) than all regression equations presented in a 1982 study and in four of six regions presented in 2002 and 2003 studies in Idaho. A more extensive streamgage screening process used in the current study resulted in fewer streamgages than in the 1982, 2002, and 2003 studies. Fewer streamgages and the selection of different explanatory variables were likely causes of increased error in some regions compared to previous studies, but overall, regional peak-flow regression model fit was generally improved for Idaho.
The revised statistical procedures and increased streamgage screening applied in the current study most likely resulted in a more accurate representation of natural peak-flow conditions. The updated regional peak-flow regression equations will be integrated into the U.S. Geological Survey StreamStats program to allow users to estimate basin and climatic characteristics and peak-flow statistics at ungaged locations of interest. StreamStats estimates peak-flow statistics with quantifiable certainty only when used at sites with basin and climatic characteristics within the range of input variables used to develop the regional regression equations. Both the regional regression equations and StreamStats should be used to estimate peak-flow statistics only in naturally flowing, relatively unregulated streams without substantial local influences on flow, such as large seeps, springs, or other groundwater-surface water interactions that are not widespread or characteristic of the respective region.
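The drainage-area ratio adjustment described above transfers a peak-flow statistic from a gaged to a nearby ungaged site as Q_u = Q_g * (A_u / A_g)^b. A minimal sketch, where the exponent value is a placeholder (the study estimates region-specific exponents by generalized least-squares regression):

```python
def area_ratio_adjust(q_gaged, a_gaged, a_ungaged, exponent=0.8):
    """Transfer a peak-flow statistic from a gaged to a nearby ungaged site.

    Implements Q_u = Q_g * (A_u / A_g)**b, the drainage-area ratio method
    described in the abstract.  The default exponent (0.8) is illustrative
    only; the study derives region-specific exponents by GLS regression.
    """
    ratio = a_ungaged / a_gaged
    if not 0.5 <= ratio <= 1.5:
        raise ValueError("method recommended only for area ratios in [0.5, 1.5]")
    return q_gaged * ratio ** exponent

# Hypothetical example: 100-year peak of 500 ft^3/s at a 120 mi^2 gage,
# transferred to an ungaged site with a 150 mi^2 drainage area
print(area_ratio_adjust(500.0, 120.0, 150.0))
```

The range check mirrors the abstract's guidance that the method applies only when the area ratio lies between 0.5 and 1.5.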

  10. The effect of thermochemotherapy with mitomycin C on normal bladder urothelium, an experimental study.

    PubMed

    Uçar, Murat; Altok, Muammer; Umul, Mehmet; Bayram, Dilek; Armağan, İlkay; Güneş, Mustafa; Çapkin, Tahsin; Soyupek, Sedat

    2016-01-01

To investigate the effects of thermochemotherapy with mitomycin C (MMC) on normal rabbit bladder urothelium and to compare it with standard intravesical MMC and hyperthermia with normal saline. Twenty-four male New Zealand rabbits, with a mean weight of 2.7 kg (range 2.1–4.3 kg), were divided into three groups of eight. The first group received thermotherapy with normal saline only, the second standard intravesical MMC, and the third thermochemotherapy with MMC. A week after the primary procedure, total cystectomy was performed and tissue samples were evaluated. Epithelial vacuolar degeneration (p = 0.001), epithelial hyperplasia (p < 0.001), subepithelial fibrosis (p = 0.001) and hemorrhagic areas in the connective tissue (p = 0.002) were observed significantly more often in the standard MMC group than in the thermotherapy-with-normal-saline group. The difference between the standard MMC and normal saline groups in vascular congestion in the connective tissue approached significance (p = 0.08). Epithelial vacuolar degeneration (p = 0.002), epithelial hyperplasia (p = 0.002), subepithelial fibrosis (p = 0.030), hemorrhagic areas (p = 0.011) and vascular congestion (p = 0.36) in the connective tissue were observed significantly more often in the thermochemotherapy with MMC group than in the standard intravesical MMC group. Polymorphonuclear cell infiltration was not considerable in any of the groups, and there was no significant difference among the groups (p = 0.140). Administration of intravesical MMC causes a toxic effect on the normal urothelium of the bladder rather than an inflammatory reaction. Heating MMC significantly increased this effect.

  11. The search for causal inferences: using propensity scores post hoc to reduce estimation error with nonexperimental research.

    PubMed

    Tumlinson, Samuel E; Sass, Daniel A; Cano, Stephanie M

    2014-03-01

    While experimental designs are regarded as the gold standard for establishing causal relationships, such designs are usually impractical owing to common methodological limitations. The objective of this article is to illustrate how propensity score matching (PSM) and using propensity scores (PS) as a covariate are viable alternatives to reduce estimation error when experimental designs cannot be implemented. To mimic common pediatric research practices, data from 140 simulated participants were used to resemble an experimental and nonexperimental design that assessed the effect of treatment status on participant weight loss for diabetes. Pretreatment participant characteristics (age, gender, physical activity, etc.) were then used to generate PS for use in the various statistical approaches. Results demonstrate how PSM and using the PS as a covariate can be used to reduce estimation error and improve statistical inferences. References for issues related to the implementation of these procedures are provided to assist researchers.
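The matching idea in the abstract can be sketched in a few lines. The toy below assumes propensity scores have already been fit (e.g., by logistic regression on the pretreatment characteristics) and performs greedy 1:1 nearest-neighbour matching to estimate the average treatment effect on the treated (ATT); it illustrates the general PSM technique, not the authors' exact procedure, which also considers using the PS as a covariate:

```python
def match_att(treated, control):
    """Greedy 1:1 nearest-neighbour propensity-score matching, no replacement.

    `treated` and `control` are lists of (propensity_score, outcome) pairs.
    Returns the ATT averaged over matched pairs.  A toy sketch of the PSM
    idea in the abstract, not the authors' exact procedure.
    """
    pool = list(control)
    diffs = []
    for ps_t, y_t in treated:
        # pick the not-yet-matched control with the closest propensity score
        j = min(range(len(pool)), key=lambda k: abs(pool[k][0] - ps_t))
        _, y_c = pool.pop(j)
        diffs.append(y_t - y_c)
    return sum(diffs) / len(diffs)

# Hypothetical (score, weight-change) pairs
treated = [(0.6, -4.0), (0.7, -5.0)]
control = [(0.58, -2.0), (0.72, -3.0), (0.2, -0.5)]
print(match_att(treated, control))
```

Matching only on comparable scores discards poorly overlapping controls (here the 0.2-score participant), which is the mechanism by which PSM reduces estimation error relative to a naive group comparison.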

  12. Higher certainty of the laser-induced damage threshold test with a redistributing data treatment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jensen, Lars; Mrohs, Marius; Gyamfi, Mark

    2015-10-15

As a consequence of its statistical nature, the measurement of the laser-induced damage threshold always carries the risk of over- or underestimating the real threshold value. For S-on-1 (and 1-on-1) tests, one of the established measurement procedures outlined in the corresponding ISO standard 21254, the results depend on the number of data points and their distribution over the fluence scale. Given the limited space on a test sample as well as the requirements on test-site separation and beam sizes, the amount of data from one test is restricted. This paper reports on a way to treat damage test data in order to reduce the statistical error and therefore the measurement uncertainty. Three simple assumptions allow for the assignment of one data point to multiple data bins and therefore virtually increase the available data base.

  13. UNCERTAINTY ON RADIATION DOSES ESTIMATED BY BIOLOGICAL AND RETROSPECTIVE PHYSICAL METHODS.

    PubMed

    Ainsbury, Elizabeth A; Samaga, Daniel; Della Monaca, Sara; Marrale, Maurizio; Bassinet, Celine; Burbidge, Christopher I; Correcher, Virgilio; Discher, Michael; Eakins, Jon; Fattibene, Paola; Güçlü, Inci; Higueras, Manuel; Lund, Eva; Maltar-Strmecki, Nadica; McKeever, Stephen; Rääf, Christopher L; Sholom, Sergey; Veronese, Ivan; Wieser, Albrecht; Woda, Clemens; Trompier, Francois

    2018-03-01

    Biological and physical retrospective dosimetry are recognised as key techniques to provide individual estimates of dose following unplanned exposures to ionising radiation. Whilst there has been a relatively large amount of recent development in the biological and physical procedures, development of statistical analysis techniques has failed to keep pace. The aim of this paper is to review the current state of the art in uncertainty analysis techniques across the 'EURADOS Working Group 10-Retrospective dosimetry' members, to give concrete examples of implementation of the techniques recommended in the international standards, and to further promote the use of Monte Carlo techniques to support characterisation of uncertainties. It is concluded that sufficient techniques are available and in use by most laboratories for acute, whole body exposures to highly penetrating radiation, but further work will be required to ensure that statistical analysis is always wholly sufficient for the more complex exposure scenarios.
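The Monte Carlo approach promoted above can be illustrated for the common case of inverting a fitted linear-quadratic dicentric yield curve, Y = alpha*D + beta*D^2, for dose. The coefficient values and their standard errors below are invented for illustration; a real laboratory would use its own calibration fit:

```python
import random
import statistics

def mc_dose(y_obs, alpha, sa, beta, sb, n=20000, seed=1):
    """Monte Carlo propagation of calibration-curve uncertainty into dose.

    Samples the fitted coefficients of Y = alpha*D + beta*D**2 from
    independent normals and inverts each draw for dose D.  Returns the mean
    dose and a 95% percentile interval.  All numeric inputs are illustrative.
    """
    rng = random.Random(seed)
    doses = []
    while len(doses) < n:
        a = rng.gauss(alpha, sa)
        b = rng.gauss(beta, sb)
        if b <= 0:
            continue  # discard non-physical draws
        # positive root of b*D**2 + a*D - y_obs = 0
        d = (-a + (a * a + 4 * b * y_obs) ** 0.5) / (2 * b)
        doses.append(d)
    doses.sort()
    return statistics.mean(doses), doses[int(0.025 * n)], doses[int(0.975 * n)]

mean_d, lo, hi = mc_dose(y_obs=0.3, alpha=0.03, sa=0.005, beta=0.06, sb=0.008)
print(f"dose ~ {mean_d:.2f} Gy, 95% interval [{lo:.2f}, {hi:.2f}] Gy")
```

The percentile interval directly characterises the dose uncertainty without assuming the inverted estimate is normally distributed, which is the advantage of the Monte Carlo route for the more complex scenarios the review discusses.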

  14. A generalized Grubbs-Beck test statistic for detecting multiple potentially influential low outliers in flood series

    USGS Publications Warehouse

    Cohn, T.A.; England, J.F.; Berenbrock, C.E.; Mason, R.R.; Stedinger, J.R.; Lamontagne, J.R.

    2013-01-01

The Grubbs-Beck test is recommended by the federal guidelines for detection of low outliers in flood flow frequency computation in the United States. This paper presents a generalization of the Grubbs-Beck test for normal data (similar to the Rosner (1983) test; see also Spencer and McCuen (1996)) that can provide a consistent standard for identifying multiple potentially influential low flows. In cases where low outliers have been identified, they can be represented as “less-than” values, and a frequency distribution can be developed using censored-data statistical techniques, such as the Expected Moments Algorithm. This approach can improve the fit of the right-hand tail of a frequency distribution and provide protection from lack-of-fit due to unimportant but potentially influential low flows (PILFs) in a flood series, thus making the flood frequency analysis procedure more robust.
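For reference, the classic single-threshold Grubbs-Beck screen that this paper generalizes can be sketched as follows. The 10-percent-level K_N approximation is the one given in Bulletin 17B; the paper's multiple-PILF generalization is not reproduced here:

```python
import math
import statistics

def grubbs_beck_low_outliers(flows):
    """Classic (single) Grubbs-Beck low-outlier screen on annual peak flows.

    Works in log10 space as in Bulletin 17B, using the 10%-level critical
    value approximation K_N = -0.9043 + 3.345*sqrt(log10(N)) - 0.4046*log10(N).
    Returns (flagged low outliers, flow threshold).  The paper generalizes
    this to detect multiple PILFs; only the one-threshold version is shown.
    """
    logs = [math.log10(q) for q in flows]
    n = len(logs)
    k_n = -0.9043 + 3.345 * math.sqrt(math.log10(n)) - 0.4046 * math.log10(n)
    threshold = 10 ** (statistics.mean(logs) - k_n * statistics.stdev(logs))
    return [q for q in flows if q < threshold], threshold

# Invented peak-flow record with one anomalously small annual peak;
# the value 3 is flagged as a potential low outlier
peaks = [120, 150, 90, 200, 175, 130, 160, 110, 140, 3]
low, threshold = grubbs_beck_low_outliers(peaks)
print(low)
```

Flagged values would then be recoded as "less-than" observations and the distribution refit with a censored-data method such as the Expected Moments Algorithm.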

  15. Evaluation of body posture in individuals with internal temporomandibular joint derangement.

    PubMed

    Munhoz, Wagner Cesar; Marques, Amélia Pasqual; de Siqueira, José Tadeu Tesseroli

    2005-10-01

Temporomandibular dysfunctions (TMD) comprise a large number of disorders that may affect the temporomandibular joint (TMJ), the masticatory muscles, or both. TMJ internal derangement is a specific type of TMD whose etiology and physiopathology are largely unknown, but which has been suggested to be linked to head, neck, and body posture factors. This study aimed at verifying possible relationships between body posture and TMJ internal derangements (TMJ-id) by comparing 30 subjects presenting typical TMJ-id signs to 20 healthy subjects. Subjects' clinical evaluations included anamnesis, stomatognathic system evaluation, and plotting analysis on body posture photographs. No statistically significant differences were found between the groups. The results do not support the assertion that body posture plays a role in causing or enhancing TMD; however, they should be cautiously considered because of the small number of subjects evaluated and the many posture variables submitted to statistical procedures, which lead to high standard deviations.

  16. A generalized Grubbs-Beck test statistic for detecting multiple potentially influential low outliers in flood series

    NASA Astrophysics Data System (ADS)

    Cohn, T. A.; England, J. F.; Berenbrock, C. E.; Mason, R. R.; Stedinger, J. R.; Lamontagne, J. R.

    2013-08-01

    The Grubbs-Beck test is recommended by the federal guidelines for detection of low outliers in flood flow frequency computation in the United States. This paper presents a generalization of the Grubbs-Beck test for normal data (similar to the Rosner (1983) test; see also Spencer and McCuen (1996)) that can provide a consistent standard for identifying multiple potentially influential low flows. In cases where low outliers have been identified, they can be represented as "less-than" values, and a frequency distribution can be developed using censored-data statistical techniques, such as the Expected Moments Algorithm. This approach can improve the fit of the right-hand tail of a frequency distribution and provide protection from lack-of-fit due to unimportant but potentially influential low flows (PILFs) in a flood series, thus making the flood frequency analysis procedure more robust.

  17. Harnessing Multivariate Statistics for Ellipsoidal Data in Structural Geology

    NASA Astrophysics Data System (ADS)

    Roberts, N.; Davis, J. R.; Titus, S.; Tikoff, B.

    2015-12-01

Most structural geology articles do not state significance levels, report confidence intervals, or perform regressions to find trends. This is, in part, because structural data tend to include directions, orientations, ellipsoids, and tensors, which are not treatable by elementary statistics. We describe a full procedural methodology for the statistical treatment of ellipsoidal data. We use a reconstructed dataset of deformed ooids in Maryland from Cloos (1947) to illustrate the process. Normalized ellipsoids have five degrees of freedom and can be represented by a second-order tensor. This tensor can be permuted into a five-dimensional vector that belongs to a vector space and can be treated with standard multivariate statistics. Cloos made several claims about the distribution of deformation in the South Mountain fold, Maryland, and we reexamine two particular claims using hypothesis testing: 1) octahedral shear strain increases towards the axial plane of the fold; 2) finite strain orientation varies systematically along the trend of the axial trace as it bends with the Appalachian orogen. We then test the null hypothesis that the southern segment of South Mountain is the same as the northern segment. This test illustrates the application of ellipsoidal statistics, which combine both orientation and shape. We report confidence intervals for each test, and graphically display our results with novel plots. This poster illustrates the importance of statistics in structural geology, especially when working with noisy or small datasets.
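The tensor-to-vector step can be made concrete. One convenient convention (not necessarily the authors' exact basis) maps a symmetric, traceless 3x3 tensor to R^5 isometrically, so that Euclidean distances between vectors equal Frobenius distances between tensors and standard multivariate statistics apply directly:

```python
import math

def ellipsoid_to_vector(E):
    """Map a symmetric, traceless 3x3 tensor to an R^5 vector isometrically.

    E is a 3x3 nested list with E[i][j] == E[j][i] and zero trace (e.g. the
    log of a normalized strain ellipsoid).  The basis below preserves the
    Frobenius norm, so means, covariances, and Hotelling-type tests computed
    on the vectors match tensor-space distances.  This is one convenient
    convention, not necessarily the one used in the poster.
    """
    assert abs(E[0][0] + E[1][1] + E[2][2]) < 1e-9, "tensor must be traceless"
    return [
        (E[0][0] - E[1][1]) / math.sqrt(2),
        math.sqrt(1.5) * E[2][2],
        math.sqrt(2) * E[0][1],
        math.sqrt(2) * E[0][2],
        math.sqrt(2) * E[1][2],
    ]

# Hypothetical normalized strain tensor
E = [[0.4, 0.1, 0.0], [0.1, -0.1, 0.2], [0.0, 0.2, -0.3]]
print(ellipsoid_to_vector(E))
```

Because the map is an isometry, hypothesis tests on the five-dimensional vectors (e.g., comparing the northern and southern segments) are equivalent to tests on the original ellipsoid tensors.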

  18. A simulator study for the development and evaluation of operating procedures on a supersonic cruise research transport to minimize airport-community noise

    NASA Technical Reports Server (NTRS)

    Grantham, W. D.; Smith, P. M.; Deal, P. L.

    1980-01-01

    Piloted-simulator studies were conducted to determine takeoff and landing operating procedures for a supersonic cruise research transport concept that result in predicted noise levels which meet current Federal Aviation Administration (FAA) certification standards. With the use of standard FAA noise certification test procedures, the subject simulated aircraft did not meet the FAA traded-noise-level standards during takeoff and landing. However, with the use of advanced procedures, this aircraft meets the traded-noise-level standards for flight crews with average skills. The advanced takeoff procedures developed involved violating some of the current Federal Aviation Regulations (FAR), but it was not necessary to violate any FAR noise-test conditions during landing approach. Noise contours were also determined for some of the simulated takeoffs and landings in order to indicate the noise-reduction advantages of using operational procedures other than standard.

  19. 42 CFR 431.708 - Procedures for applying standards.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Programs for Licensing Nursing Home Administrators § 431.708 Procedures for applying standards. The agency... 42 Public Health 4 2010-10-01 2010-10-01 false Procedures for applying standards. 431.708 Section 431.708 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN...

  20. 40 CFR 792.63 - Maintenance and calibration of equipment.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... standardized. (b) The written standard operating procedures required under § 792.81(b)(11) shall set forth in... maintenance operations were routine and followed the written standard operating procedures. Written records... operating procedures shall designate the person responsible for the performance of each operation. (c...

  1. 40 CFR 792.63 - Maintenance and calibration of equipment.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... standardized. (b) The written standard operating procedures required under § 792.81(b)(11) shall set forth in... maintenance operations were routine and followed the written standard operating procedures. Written records... operating procedures shall designate the person responsible for the performance of each operation. (c...

  2. 21 CFR 58.63 - Maintenance and calibration of equipment.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... standardized. (b) The written standard operating procedures required under § 58.81(b)(11) shall set forth in... maintenance operations were routine and followed the written standard operating procedures. Written records... operating procedures shall designate the person responsible for the performance of each operation. (c...

  3. 40 CFR 792.63 - Maintenance and calibration of equipment.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... standardized. (b) The written standard operating procedures required under § 792.81(b)(11) shall set forth in... maintenance operations were routine and followed the written standard operating procedures. Written records... operating procedures shall designate the person responsible for the performance of each operation. (c...

  4. 21 CFR 58.63 - Maintenance and calibration of equipment.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... standardized. (b) The written standard operating procedures required under § 58.81(b)(11) shall set forth in... maintenance operations were routine and followed the written standard operating procedures. Written records... operating procedures shall designate the person responsible for the performance of each operation. (c...

  5. 21 CFR 58.63 - Maintenance and calibration of equipment.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... standardized. (b) The written standard operating procedures required under § 58.81(b)(11) shall set forth in... maintenance operations were routine and followed the written standard operating procedures. Written records... operating procedures shall designate the person responsible for the performance of each operation. (c...

  6. 21 CFR 58.63 - Maintenance and calibration of equipment.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... standardized. (b) The written standard operating procedures required under § 58.81(b)(11) shall set forth in... maintenance operations were routine and followed the written standard operating procedures. Written records... operating procedures shall designate the person responsible for the performance of each operation. (c...

  7. 40 CFR 792.63 - Maintenance and calibration of equipment.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... standardized. (b) The written standard operating procedures required under § 792.81(b)(11) shall set forth in... maintenance operations were routine and followed the written standard operating procedures. Written records... operating procedures shall designate the person responsible for the performance of each operation. (c...

  8. 40 CFR 792.63 - Maintenance and calibration of equipment.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... standardized. (b) The written standard operating procedures required under § 792.81(b)(11) shall set forth in... maintenance operations were routine and followed the written standard operating procedures. Written records... operating procedures shall designate the person responsible for the performance of each operation. (c...

  9. 21 CFR 58.63 - Maintenance and calibration of equipment.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... standardized. (b) The written standard operating procedures required under § 58.81(b)(11) shall set forth in... maintenance operations were routine and followed the written standard operating procedures. Written records... operating procedures shall designate the person responsible for the performance of each operation. (c...

  10. 77 FR 49056 - Categorical Exclusion From Further Environmental Review for Standard Terminal Arrival Route...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-15

    ... public of its environmental review of two standard terminal arrival route (STAR) procedures and two... STAR procedures and the two proposed SID procedures to determine the level of environmental review... STAR procedures identified as GIBBZ (RNAV) STAR and DOCCS STAR and the proposed SID procedures...

  11. Statistical detection of EEG synchrony using empirical bayesian inference.

    PubMed

    Singh, Archana K; Asoh, Hideki; Takeda, Yuji; Phillips, Steven

    2015-01-01

There is growing interest in understanding how the brain utilizes synchronized oscillatory activity to integrate information across functionally connected regions. Computing phase-locking values (PLV) between EEG signals is a popular method for quantifying such synchronizations and elucidating their role in cognitive tasks. However, high-dimensionality in PLV data incurs a serious multiple testing problem. Standard multiple testing methods in neuroimaging research (e.g., false discovery rate, FDR) suffer severe loss of power, because they fail to exploit complex dependence structure between hypotheses that vary in spectral, temporal and spatial dimension. Previously, we showed that a hierarchical FDR and optimal discovery procedures could be effectively applied for PLV analysis to provide better power than FDR. In this article, we revisit the multiple comparison problem from a new Empirical Bayes perspective and propose the application of the local FDR method (locFDR; Efron, 2001) for PLV synchrony analysis to compute FDR as a posterior probability that an observed statistic belongs to a null hypothesis. We demonstrate the application of Efron's Empirical Bayes approach for PLV synchrony analysis for the first time. We use simulations to validate the specificity and sensitivity of locFDR and a real EEG dataset from a visual search study for experimental validation. We also compare locFDR with hierarchical FDR and optimal discovery procedures in both simulation and experimental analyses. Our simulation results showed that locFDR can effectively control false positives without compromising the power of PLV synchrony inference. Applying locFDR to the experimental data yielded more significant discoveries than our previously proposed methods, whereas the standard FDR method failed to detect any.

  12. TATES: Efficient Multivariate Genotype-Phenotype Analysis for Genome-Wide Association Studies

    PubMed Central

    van der Sluis, Sophie; Posthuma, Danielle; Dolan, Conor V.

    2013-01-01

    To date, the genome-wide association study (GWAS) is the primary tool to identify genetic variants that cause phenotypic variation. As GWAS analyses are generally univariate in nature, multivariate phenotypic information is usually reduced to a single composite score. This practice often results in loss of statistical power to detect causal variants. Multivariate genotype–phenotype methods do exist but attain maximal power only in special circumstances. Here, we present a new multivariate method that we refer to as TATES (Trait-based Association Test that uses Extended Simes procedure), inspired by the GATES procedure proposed by Li et al. (2011). For each component of a multivariate trait, TATES combines p-values obtained in standard univariate GWAS to acquire one trait-based p-value, while correcting for correlations between components. Extensive simulations, probing a wide variety of genotype–phenotype models, show that TATES's false positive rate is correct, and that TATES's statistical power to detect causal variants explaining 0.5% of the variance can be 2.5–9 times higher than the power of univariate tests based on composite scores and 1.5–2 times higher than the power of the standard MANOVA. Unlike other multivariate methods, TATES detects both genetic variants that are common to multiple phenotypes and genetic variants that are specific to a single phenotype, i.e., TATES provides a more complete view of the genetic architecture of complex traits. As the actual causal genotype–phenotype model is usually unknown and probably phenotypically and genetically complex, TATES, available as an open source program, constitutes a powerful new multivariate strategy that allows researchers to identify novel causal variants, while the complexity of traits is no longer a limiting factor. PMID:23359524
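
    TATES builds on the Simes combination of sorted univariate p-values. Below is a minimal sketch of the classic Simes step only; TATES itself replaces the raw counts with "effective" numbers of tests derived from the phenotype correlation matrix, which this illustration omits:

```python
def simes(pvals):
    """Classic Simes combined p-value: min over j of m * p_(j) / j,
    where p_(1) <= ... <= p_(m) are the sorted component p-values."""
    m = len(pvals)
    ranked = sorted(pvals)
    return min(m * p / (j + 1) for j, p in enumerate(ranked))
```

    For example, `simes([0.01, 0.5, 0.9])` returns min(3*0.01/1, 3*0.5/2, 3*0.9/3) = 0.03, so one strong component can drive the trait-based p-value without requiring all components to be associated.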

  13. Testing homogeneity of proportion ratios for stratified correlated bilateral data in two-arm randomized clinical trials.

    PubMed

    Pei, Yanbo; Tian, Guo-Liang; Tang, Man-Lai

    2014-11-10

    Stratified data analysis is an important research topic in many biomedical studies and clinical trials. In this article, we develop five test statistics for testing the homogeneity of proportion ratios for stratified correlated bilateral binary data based on an equal correlation model assumption. Bootstrap procedures based on these test statistics are also considered. To evaluate the performance of these statistics and procedures, we conduct Monte Carlo simulations to study their empirical sizes and powers under various scenarios. Our results suggest that the procedure based on the score statistic performs well generally and is highly recommended. When the sample size is large, procedures based on the commonly used weighted least-squares estimate and the logarithmic transformation with Mantel-Haenszel estimate are recommended as they do not involve any computation of maximum likelihood estimates requiring iterative algorithms. We also derive approximate sample size formulas based on the recommended test procedures. Finally, we apply the proposed methods to analyze a multi-center randomized clinical trial for scleroderma patients. Copyright © 2014 John Wiley & Sons, Ltd.
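
    As an illustration of the bootstrap testing idea the authors consider, here is a generic two-sample bootstrap p-value sketch for comparing proportions under a pooled null; it deliberately ignores the stratification and bilateral correlation structure that the paper's actual procedures handle, and all counts are made up:

```python
import random

def bootstrap_pvalue(x, y, stat, n_boot=2000, seed=1):
    """Two-sample bootstrap test of stat(x) == stat(y): resample both
    arms from the pooled data (imposing H0) and compare the observed
    difference with its null resampling distribution."""
    rng = random.Random(seed)
    observed = abs(stat(x) - stat(y))
    pooled = list(x) + list(y)
    count = 0
    for _ in range(n_boot):
        bx = [rng.choice(pooled) for _ in x]
        by = [rng.choice(pooled) for _ in y]
        if abs(stat(bx) - stat(by)) >= observed:
            count += 1
    # Add-one correction keeps the p-value strictly positive
    return (count + 1) / (n_boot + 1)

mean = lambda v: sum(v) / len(v)
# Hypothetical binary outcomes in two arms: 30/100 vs 10/100 successes
arm_a = [1] * 30 + [0] * 70
arm_b = [1] * 10 + [0] * 90
p = bootstrap_pvalue(arm_a, arm_b, mean)
```

    With a 20-percentage-point difference in success rates, the resampled null differences rarely reach the observed one, so the bootstrap p-value is small.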

  14. Statistical Reform in School Psychology Research: A Synthesis

    ERIC Educational Resources Information Center

    Swaminathan, Hariharan; Rogers, H. Jane

    2007-01-01

    Statistical reform in school psychology research is discussed in terms of research designs, measurement issues, statistical modeling and analysis procedures, interpretation and reporting of statistical results, and finally statistics education.

  15. Standardization of the capillary electrophoresis procedures Capillarys® CDT and Minicap® CDT in comparison to the IFCC reference measurement procedure.

    PubMed

    Schellenberg, François; Humeau, Camille

    2017-06-01

    CDT is at present the most relevant routinely available biological marker of alcohol use and is widely used for screening and monitoring of patients. The lack of standardization leads to specific reference intervals for each procedure. The IFCC working group devoted to CDT demonstrated that standardization is possible using calibrators assigned to the reference measurement procedure. In this study, we compare the capillary electrophoresis (CE) techniques Capillarys® CDT and Minicap® CDT (Sebia, Lisses, France) to the reference procedure before and after standardization in 126 samples covering the range of CDT measurement. Both capillary electrophoresis procedures show a high correlation (r=0.997) with the reference procedure, and the concordance correlation coefficient evaluated according to McBride is "almost perfect" (>0.997 for both CE procedures). The number of results with a relative difference higher than the acceptable difference limit is only 1 for Capillarys® CDT and 5 for Minicap® CDT. These results demonstrate the efficiency of the standardization of CDT measurements for both CE techniques from Sebia, achieved using calibrators assigned to the reference measurement procedure.

  16. 7 CFR 868.102 - Procedures for establishing and revising grade standards.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 7 2012-01-01 2012-01-01 false Procedures for establishing and revising grade....102 Procedures for establishing and revising grade standards. (a) GIPSA will develop, revise, suspend, or terminate grade standards if it determines that such action is in the public interest. GIPSA...

  17. 7 CFR 868.102 - Procedures for establishing and revising grade standards.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 7 2013-01-01 2013-01-01 false Procedures for establishing and revising grade....102 Procedures for establishing and revising grade standards. (a) GIPSA will develop, revise, suspend, or terminate grade standards if it determines that such action is in the public interest. GIPSA...

  18. 7 CFR 868.102 - Procedures for establishing and revising grade standards.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 7 2010-01-01 2010-01-01 false Procedures for establishing and revising grade....102 Procedures for establishing and revising grade standards. (a) GIPSA will develop, revise, suspend, or terminate grade standards if it determines that such action is in the public interest. GIPSA...

  19. 7 CFR 868.102 - Procedures for establishing and revising grade standards.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 7 2014-01-01 2014-01-01 false Procedures for establishing and revising grade....102 Procedures for establishing and revising grade standards. (a) GIPSA will develop, revise, suspend, or terminate grade standards if it determines that such action is in the public interest. GIPSA...

  20. 7 CFR 868.102 - Procedures for establishing and revising grade standards.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 7 2011-01-01 2011-01-01 false Procedures for establishing and revising grade....102 Procedures for establishing and revising grade standards. (a) GIPSA will develop, revise, suspend, or terminate grade standards if it determines that such action is in the public interest. GIPSA...

  1. Providing a Healthful School Environment: Standards and Procedures.

    ERIC Educational Resources Information Center

    Johannis, Norma; And Others

    This report discusses standards and procedures as applied to mental and physical health and safety as affected by the physical surroundings. A bibliography describing standards and suggested procedures, and a checklist, are provided for voluntary self-appraisal. The checklist covers (1) the school grounds, (2) the school building, (3)…

  2. 10 CFR 440.21 - Weatherization materials standards and energy audit procedures.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 3 2013-01-01 2013-01-01 false Weatherization materials standards and energy audit procedures. 440.21 Section 440.21 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION WEATHERIZATION ASSISTANCE FOR LOW-INCOME PERSONS § 440.21 Weatherization materials standards and energy audit procedures. (a...

  3. 21 CFR 821.25 - Device tracking system and content requirements: manufacturer requirements.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... manufacturer of a tracked device shall keep current records in accordance with its standard operating procedure... this section. A manufacturer shall make this standard operating procedure available to FDA upon request. A manufacturer shall incorporate the following into the standard operating procedure: (1) Data...

  4. 40 CFR 63.1549 - Recordkeeping and reporting requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... the practices described in the standard operating procedures manual required under § 63.1544(a) for... described in the standard operating procedures manual for baghouses required under § 63.1547(a). (6) If an... in the standard operating procedures manual for baghouses required under § 63.1547(a), including an...

  5. 40 CFR 63.1549 - Recordkeeping and reporting requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... the practices described in the standard operating procedures manual required under § 63.1544(a) for... described in the standard operating procedures manual for baghouses required under § 63.1547(a). (6) If an... in the standard operating procedures manual for baghouses required under § 63.1547(a), including an...

  6. 76 FR 18645 - Third Party Testing for Certain Children's Products; Notice of Requirements for Accreditation of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-05

    ... referenced: The existing CPSC Standard Operating Procedure for Determining Lead (Pb) in Paint and Other... explicitly require the use of a particular standard operating procedure. Additionally, the following....'' It is still based on standard test procedures, such as ASTM International (formerly the American...

  7. 21 CFR 821.25 - Device tracking system and content requirements: manufacturer requirements.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... manufacturer of a tracked device shall keep current records in accordance with its standard operating procedure... this section. A manufacturer shall make this standard operating procedure available to FDA upon request. A manufacturer shall incorporate the following into the standard operating procedure: (1) Data...

  8. 40 CFR 63.550 - Recordkeeping and reporting requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., corrective action, report, or record, according to § 63.10(b)(1). (b) The standard operating procedures... standard operating procedures manual for baghouses required under § 63.548(a). (4) Electronic records of... required as part of the practices described in the standard operating procedures manual required under § 63...

  9. 21 CFR 821.25 - Device tracking system and content requirements: manufacturer requirements.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... manufacturer of a tracked device shall keep current records in accordance with its standard operating procedure... this section. A manufacturer shall make this standard operating procedure available to FDA upon request. A manufacturer shall incorporate the following into the standard operating procedure: (1) Data...

  10. 40 CFR 63.550 - Recordkeeping and reporting requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., corrective action, report, or record, according to § 63.10(b)(1). (b) The standard operating procedures... standard operating procedures manual for baghouses required under § 63.548(a). (4) Electronic records of... required as part of the practices described in the standard operating procedures manual required under § 63...

  11. 21 CFR 821.25 - Device tracking system and content requirements: manufacturer requirements.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... manufacturer of a tracked device shall keep current records in accordance with its standard operating procedure... this section. A manufacturer shall make this standard operating procedure available to FDA upon request. A manufacturer shall incorporate the following into the standard operating procedure: (1) Data...

  12. 40 CFR 63.550 - Recordkeeping and reporting requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., corrective action, report, or record, according to § 63.10(b)(1). (b) The standard operating procedures... standard operating procedures manual for baghouses required under § 63.548(a). (4) Electronic records of... required as part of the practices described in the standard operating procedures manual required under § 63...

  13. 76 FR 62074 - Proposed Revision of Performance Standards for State Medicaid Fraud Control Units

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-06

    ... volume of case referrals and workload for each location. Performance Standard 3--Policies and Procedures A Unit establishes written policies and procedures for its operations and ensures that staff are familiar with, and adhere to, policies and procedures. In meeting this standard, the following performance...

  14. 24 CFR 574.510 - Environmental procedures and standards.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 3 2011-04-01 2010-04-01 true Environmental procedures and standards. 574.510 Section 574.510 Housing and Urban Development Regulations Relating to Housing and Urban... Administration § 574.510 Environmental procedures and standards. (a) Activities under this part are subject to...

  15. 76 FR 54528 - Standard Operating Procedures (SOP) of the Aircraft Certification Service (AIR) Process for the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-01

    ...) of the Aircraft Certification Service (AIR) Process for the Sequencing of Certification and... on the Aircraft Certification Service (AIR) standard operating procedure (SOP) describing the process... comments on the SOP : AIR-100-001; Standard Operating Procedure--Aircraft Certification Service Project...

  16. 10 CFR 440.21 - Weatherization materials standards and energy audit procedures.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 3 2010-01-01 2010-01-01 false Weatherization materials standards and energy audit procedures. 440.21 Section 440.21 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION WEATHERIZATION ASSISTANCE FOR LOW-INCOME PERSONS § 440.21 Weatherization materials standards and energy audit procedures. (a...

  17. 10 CFR 440.21 - Weatherization materials standards and energy audit procedures.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 3 2014-01-01 2014-01-01 false Weatherization materials standards and energy audit procedures. 440.21 Section 440.21 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION WEATHERIZATION ASSISTANCE FOR LOW-INCOME PERSONS § 440.21 Weatherization materials standards and energy audit procedures. (a...

  18. 10 CFR 440.21 - Weatherization materials standards and energy audit procedures.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 3 2012-01-01 2012-01-01 false Weatherization materials standards and energy audit procedures. 440.21 Section 440.21 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION WEATHERIZATION ASSISTANCE FOR LOW-INCOME PERSONS § 440.21 Weatherization materials standards and energy audit procedures. (a...

  19. Antibiotic stewardship in the newborn surgical patient: A quality improvement project in the neonatal intensive care unit.

    PubMed

    Walker, Sarah; Datta, Ankur; Massoumi, Roxanne L; Gross, Erica R; Uhing, Michael; Arca, Marjorie J

    2017-12-01

    There is significant diversity in the utilization of antibiotics for neonates undergoing surgical procedures. Our institution standardized antibiotic administration for surgical neonates, in which no empiric antibiotics were given to infants with surgical conditions postnatally, and antibiotics were given no more than 72 hours perioperatively. We compared the time periods before and after implementation of the antibiotic protocol in an institutional review board-approved, retrospective review of neonates with congenital surgical conditions who underwent surgical correction within 30 days after birth. Surgical site infection at 30 days was the primary outcome, and development of hospital-acquired infections or multidrug-resistant organisms were secondary outcomes. One hundred forty-eight infants underwent surgical procedures pre-protocol, and 127 underwent procedures post-protocol implementation. Surgical site infection rates were similar pre- and post-protocol, 14% and 9%, respectively (P = .21). The incidence of hospital-acquired infections (13.7% vs 8.7%, P = .205) and multidrug-resistant organisms (4.7% vs 1.6%, P = .143) was similar between the 2 periods. Elimination of empiric postnatal antibiotics did not statistically change rates of surgical site infection, hospital-acquired infections, or multidrug-resistant organisms. Limiting the duration of perioperative antibiotic prophylaxis to no more than 72 hours after surgery did not increase the rate of surgical site infection, hospital-acquired infections, or multidrug-resistant organisms. Median antibiotic days were decreased with antibiotic standardization for surgical neonates. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. Compendium of Methods for Applying Measured Data to Vibration and Acoustic Problems

    DTIC Science & Technology

    1985-10-01

    Keywords: statistical energy analysis, finite element models, transfer functions. Recoverable contents entries: Procedures for the Modal Analysis Method (8-22); 8.4 Summary of the Procedures for the Statistical Energy Analysis Method. [Remainder of the scanned abstract is illegible.]

  1. To t-Test or Not to t-Test? A p-Values-Based Point of View in the Receiver Operating Characteristic Curve Framework.

    PubMed

    Vexler, Albert; Yu, Jihnhee

    2018-04-13

    A common statistical doctrine supported by many introductory courses and textbooks is that t-test type procedures based on normally distributed data points are anticipated to provide a standard in decision-making. In order to motivate scholars to examine this convention, we introduce a simple approach based on graphical tools of receiver operating characteristic (ROC) curve analysis, a well-established biostatistical methodology. In this context, we propose employing a p-values-based method, taking into account the stochastic nature of p-values. We focus on the modern statistical literature to address the expected p-value (EPV) as a measure of the performance of decision-making rules. During the course of our study, we extend the EPV concept to be considered in terms of the ROC curve technique. This provides expressive evaluations and visualizations of a wide spectrum of testing mechanisms' properties. We show that the conventional power characterization of tests is a partial aspect of the presented EPV/ROC technique. We hope this explanation convinces researchers of the usefulness of the EPV/ROC approach for depicting different characteristics of decision-making procedures, in light of the growing interest in correct p-values-based applications.
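
    The EPV idea can be checked with a Monte Carlo sketch: under the null a valid p-value is uniform on [0, 1], so its expectation is 1/2, while a better-powered situation yields a smaller EPV. For simplicity this sketch uses a one-sided z-test with known unit variance rather than the paper's t-test setting, and all sample sizes and effect sizes are made up:

```python
import math
import random

def norm_sf(z):
    """Upper-tail standard normal probability."""
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def expected_pvalue(effect, n, sims=4000, seed=2):
    """Monte Carlo EPV of a one-sided z-test of H0: mu = 0,
    with samples of size n drawn from N(effect, 1)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(sims):
        xbar = sum(rng.gauss(effect, 1) for _ in range(n)) / n
        total += norm_sf(xbar * math.sqrt(n))  # one-sided p-value
    return total / sims

# Under H0 (effect = 0) the EPV is about 1/2; it shrinks as power grows
epv_null = expected_pvalue(0.0, 25)
epv_alt = expected_pvalue(0.5, 25)
```

    A smaller EPV under the alternative indicates a better-performing decision rule, which is the comparison the EPV/ROC framework visualizes.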

  2. A procedure for classifying textural facies in gravel‐bed rivers

    USGS Publications Warehouse

    Buffington, John M.; Montgomery, David R.

    1999-01-01

    Textural patches (i.e., grain‐size facies) are commonly observed in gravel‐bed channels and are of significance for both physical and biological processes at subreach scales. We present a general framework for classifying textural patches that allows modification for particular study goals, while maintaining a basic degree of standardization. Textures are classified using a two‐tier system of ternary diagrams that identifies the relative abundance of major size classes and subcategories of the dominant size. An iterative procedure of visual identification and quantitative grain‐size measurement is used. A field test of our classification indicates that it affords reasonable statistical discrimination of median grain size and variance of bed‐surface textures. We also explore the compromise between classification simplicity and accuracy. We find that statistically meaningful textural discrimination requires use of both tiers of our classification. Furthermore, we find that simplified variants of the two‐tier scheme are less accurate but may be more practical for field studies which do not require a high level of textural discrimination or detailed description of grain‐size distributions. Facies maps provide a natural template for stratifying other physical and biological measurements and produce a retrievable and versatile database that can be used as a component of channel monitoring efforts.

  3. Virtual reality for acute pain reduction in adolescents undergoing burn wound care: a prospective randomized controlled trial.

    PubMed

    Kipping, Belinda; Rodger, Sylvia; Miller, Kate; Kimble, Roy M

    2012-08-01

    Effective pain management remains a challenge for adolescents during conscious burn wound care procedures. Virtual reality (VR) shows promise as a non-pharmacological adjunct in reducing pain. This study assessed off-the-shelf VR for (1) its effect on reducing acute pain intensity during adolescent burn wound care, and (2) its clinical utility in a busy hospital setting. Forty-one adolescents (11-17 years) participated in this prospective randomized controlled trial. Acute pain outcomes including adolescent self-report, nursing staff behavioral observation, caregiver observation and physiological measures were collected. Length of procedure times and adolescent reactions were also recorded to inform clinical utility. Nursing staff reported a statistically significant reduction in pain scores during dressing removal, and significantly fewer rescue doses of Entonox given to those receiving VR, compared to those receiving standard distraction. For all other pain outcomes and length of treatment, there was a trend toward lower pain scores and treatment times for those receiving VR, but these differences were not statistically significant. Despite only minimal pain reduction achieved using off-the-shelf VR, other results from this trial and previous research on younger children with burns suggest a customized, adolescent and hospital friendly device may be more effective in pain reduction. Copyright © 2012. Published by Elsevier Ltd.

  4. FGWAS: Functional genome wide association analysis.

    PubMed

    Huang, Chao; Thompson, Paul; Wang, Yalin; Yu, Yang; Zhang, Jingwen; Kong, Dehan; Colen, Rivka R; Knickmeyer, Rebecca C; Zhu, Hongtu

    2017-10-01

    Functional phenotypes (e.g., subcortical surface representation), which commonly arise in imaging genetic studies, have been used to detect putative genes for complexly inherited neuropsychiatric and neurodegenerative disorders. However, existing statistical methods largely ignore the functional features (e.g., functional smoothness and correlation). The aim of this paper is to develop a functional genome-wide association analysis (FGWAS) framework to efficiently carry out whole-genome analyses of functional phenotypes. FGWAS consists of three components: a multivariate varying coefficient model, a global sure independence screening procedure, and a test procedure. Compared with the standard multivariate regression model, the multivariate varying coefficient model explicitly models the functional features of functional phenotypes through the integration of smooth coefficient functions and functional principal component analysis. Statistically, compared with existing methods for genome-wide association studies (GWAS), FGWAS can substantially boost the detection power for discovering important genetic variants influencing brain structure and function. Simulation studies show that FGWAS outperforms existing GWAS methods for searching sparse signals in an extremely large search space, while controlling for the family-wise error rate. We have successfully applied FGWAS to large-scale analysis of data from the Alzheimer's Disease Neuroimaging Initiative for 708 subjects, 30,000 vertices on the left and right hippocampal surfaces, and 501,584 SNPs. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. Projector-based virtual reality dome environment for procedural pain and anxiety in young children with burn injuries: a pilot study.

    PubMed

    Khadra, Christelle; Ballard, Ariane; Déry, Johanne; Paquin, David; Fortin, Jean-Simon; Perreault, Isabelle; Labbe, David R; Hoffman, Hunter G; Bouchard, Stéphane; LeMay, Sylvie

    2018-01-01

    Virtual reality (VR) is a non-pharmacological method to distract from pain during painful procedures. However, it was never tested in young children with burn injuries undergoing wound care. We aimed to assess the feasibility and acceptability of the study process and the use of VR for procedural pain management. From June 2016 to January 2017, we recruited children from 2 months to 10 years of age with burn injuries requiring a hydrotherapy session in a pediatric university teaching hospital in Montreal. Each child received the projector-based VR intervention in addition to the standard pharmacological treatment. Data on intervention and study feasibility and acceptability in addition to measures on pain (Face, Legs, Activity, Cry, Consolability scale), baseline (Modified Smith Scale) and procedural (Procedure Behavior Check List) anxiety, comfort (OCCEB-BECCO [behavioral observational scale of comfort level for child burn victims]), and sedation (Ramsay Sedation Scale) were collected before, during, and after the procedure. Data analyses included descriptive and non-parametric inferential statistics. We recruited 15 children with a mean age of 2.2±2.1 years and a mean total body surface area of 5% (±4). Mean pain score during the procedure was low (2.9/10, ±3), as was the discomfort level (2.9/10, ±2.8). Most children were cooperative, oriented, and calm. Assessing anxiety was not feasible with our sample of participants. The prototype did not interfere with the procedure and was considered useful for procedural pain management by most health care professionals. The projector-based VR is a feasible and acceptable intervention for procedural pain management in young children with burn injuries. A larger trial with a control group is required to assess its efficacy.

  6. Management patterns of medicare patients undergoing treatment for upper urinary tract calculi.

    PubMed

    Matlaga, Brian R; Meckley, Lisa M; Kim, Micheline; Byrne, Thomas W

    2014-06-01

    We conducted this study to identify differences in the re-treatment rates and ancillary procedures for the two most commonly utilized stone treatment procedures in the Medicare population: ureteroscopy (URS) and shock wave lithotripsy (SWL). A retrospective claims analysis of the Medicare standard analytical file 5% sample was conducted to identify patients with a new diagnosis of urolithiasis undergoing treatment with URS or SWL from 2009-2010. Outcomes evaluated: (1) repeat stone removal procedures within 120 days post index procedure, (2) stent placement procedures on the index date, 30 days prior to and 120 days post index date, and (3) use of general anesthesia. We identified 3885 eligible patients, of which 2165 (56%) underwent SWL and 1720 (44%) underwent URS. Overall, SWL patients were 1.73 times more likely to undergo at least one repeat procedure than URS patients, and twice as likely to require multiple re-treatments compared to URS. Among those with ureteral stones, SWL patients were 2.27 times more likely to undergo repeat procedures. The difference was not statistically significant in renal stone patients. Overall, SWL patients were 1.41 times more likely than URS patients to have a stent placed prior to index procedure, and 1.33 times more likely to have a stent placed subsequent to the index procedure. The majority of URS patients (77.8%) had a stent placed at the time of index procedure. There was no significant difference in anesthetic approaches between SWL and URS. Patients undergoing SWL are significantly more likely to require re-treatments than URS patients. SWL patients are also significantly more likely to require ureteral stent placement as a separate event. SWL and URS patients have similar rates of general anesthesia.
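
    The "X times more likely" comparisons in claims analyses like this are relative risks. A minimal sketch with a log-scale 95% confidence interval, using made-up event counts (not the study's data), is:

```python
import math

def risk_ratio(events_a, n_a, events_b, n_b):
    """Risk ratio of group A vs group B with a 95% CI
    computed on the log scale (normal approximation)."""
    ra, rb = events_a / n_a, events_b / n_b
    rr = ra / rb
    # Standard error of log(RR)
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Hypothetical counts: 50/100 re-treated in arm A vs 25/100 in arm B
rr, lo, hi = risk_ratio(50, 100, 25, 100)
```

    Here the point estimate is RR = 2.0; a CI excluding 1 corresponds to a statistically significant difference in re-treatment risk.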

  7. Projector-based virtual reality dome environment for procedural pain and anxiety in young children with burn injuries: a pilot study

    PubMed Central

    Khadra, Christelle; Ballard, Ariane; Déry, Johanne; Paquin, David; Fortin, Jean-Simon; Perreault, Isabelle; Labbe, David R; Hoffman, Hunter G; Bouchard, Stéphane

    2018-01-01

    Background Virtual reality (VR) is a non-pharmacological method to distract from pain during painful procedures. However, it was never tested in young children with burn injuries undergoing wound care. Aim We aimed to assess the feasibility and acceptability of the study process and the use of VR for procedural pain management. Methods From June 2016 to January 2017, we recruited children from 2 months to 10 years of age with burn injuries requiring a hydrotherapy session in a pediatric university teaching hospital in Montreal. Each child received the projector-based VR intervention in addition to the standard pharmacological treatment. Data on intervention and study feasibility and acceptability in addition to measures on pain (Face, Legs, Activity, Cry, Consolability scale), baseline (Modified Smith Scale) and procedural (Procedure Behavior Check List) anxiety, comfort (OCCEB-BECCO [behavioral observational scale of comfort level for child burn victims]), and sedation (Ramsay Sedation Scale) were collected before, during, and after the procedure. Data analyses included descriptive and non-parametric inferential statistics. Results We recruited 15 children with a mean age of 2.2±2.1 years and a mean total body surface area of 5% (±4). Mean pain score during the procedure was low (2.9/10, ±3), as was the discomfort level (2.9/10, ±2.8). Most children were cooperative, oriented, and calm. Assessing anxiety was not feasible with our sample of participants. The prototype did not interfere with the procedure and was considered useful for procedural pain management by most health care professionals. Conclusion The projector-based VR is a feasible and acceptable intervention for procedural pain management in young children with burn injuries. A larger trial with a control group is required to assess its efficacy. PMID:29491717

  8. "Knife to skin" time is a poor marker of operating room utilization and efficiency in cardiac surgery.

    PubMed

    Luthra, Suvitesh; Ramady, Omar; Monge, Mary; Fitzsimons, Michael G; Kaleta, Terry R; Sundt, Thoralf M

    2015-06-01

    Markers of operating room (OR) efficiency in cardiac surgery are focused on "knife to skin" and "start time tardiness." These do not evaluate the middle and later parts of the cardiac surgical pathway. The purpose of this analysis was to evaluate knife to skin time as an efficiency marker in cardiac surgery. We looked at knife to skin time, procedure time, and transfer times in the cardiac operational pathway for their correlation with predefined indices of operational efficiency (Index of Operation Efficiency - InOE, Surgical Index of Operational Efficiency - sInOE). A regression analysis was performed to test the goodness of fit of the regression curves estimated for InOE relative to the times on the operational pathway. The mean knife to skin time was 90.6 ± 13 minutes (23% of total OR time). The mean procedure time was 282 ± 123 minutes (71% of total OR time). Utilization efficiencies were highest for aortic valve replacement and coronary artery bypass grafting and lowest for complex aortic procedures. There were no significant procedure-specific or team-specific differences for standard procedures. Procedure times correlated the strongest with InOE (r = -0.98, p < 0.01). Compared to procedure times, knife to skin is not as strong an indicator of efficiency. A statistically significant linear dependence on InOE was observed with procedure times only. Procedure times are a better marker of OR efficiency than knife to skin in cardiac cases. Strategies to increase OR utilization and efficiency should address procedure times in addition to knife to skin times. © 2015 Wiley Periodicals, Inc.
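
    The reported correlations between pathway times and the efficiency indices are sample Pearson coefficients, which can be computed as in this minimal sketch (illustrative data, not the study's):

```python
import math

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient between two sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

    A value near -1, as reported for procedure time vs. InOE, means longer procedure times correspond almost linearly to lower efficiency index values.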

  9. Halogen and LED light curing of composite: temperature increase and Knoop hardness.

    PubMed

    Schneider, L F; Consani, S; Correr-Sobrinho, L; Correr, A B; Sinhoreti, M A

    2006-03-01

    This study assessed the Knoop hardness and temperature increase provided by three light curing units when using (1) the manufacturers' recommended times of photo-activation and (2) standardizing total energy density. One halogen--XL2500 (3M/ESPE)--and two light-emitting diode (LED) curing units--Freelight (3M/ESPE) and Ultrablue IS (DMC)--were used. A type-K thermocouple registered the temperature change produced by the composite photo-activation in a mold. Twenty-four hours after the photo-activation procedures, the composite specimens were submitted to a hardness test. Both temperature increase and hardness data were submitted to ANOVA and Tukey's test (5% significance). Using the first set of photo-activation conditions, the halogen unit produced a statistically higher temperature increase than did both LED units, and the Freelight LED resulted in a lower hardness than did the other curing units. When applying the second set of photo-activation conditions, the two LED units produced statistically greater temperature increase than did the halogen unit, whereas there were no statistical differences in hardness among the curing units.

  10. Causality

    NASA Astrophysics Data System (ADS)

    Pearl, Judea

    2000-03-01

    Written by one of the pre-eminent researchers in the field, this book provides a comprehensive exposition of modern analysis of causation. It shows how causality has grown from a nebulous concept into a mathematical theory with significant applications in the fields of statistics, artificial intelligence, philosophy, cognitive science, and the health and social sciences. Pearl presents a unified account of the probabilistic, manipulative, counterfactual and structural approaches to causation, and devises simple mathematical tools for analyzing the relationships between causal connections, statistical associations, actions and observations. The book will open the way for including causal analysis in the standard curriculum of statistics, artificial intelligence, business, epidemiology, social science and economics. Students in these areas will find natural models, simple identification procedures, and precise mathematical definitions of causal concepts that traditional texts have tended to evade or make unduly complicated. This book will be of interest to professionals and students in a wide variety of fields. Anyone who wishes to elucidate meaningful relationships from data, predict effects of actions and policies, assess explanations of reported events, or form theories of causal understanding and causal speech will find this book stimulating and invaluable.

  11. Inference of median difference based on the Box-Cox model in randomized clinical trials.

    PubMed

    Maruo, K; Isogawa, N; Gosho, M

    2015-05-10

    In randomized clinical trials, many medical and biological measurements are not normally distributed and are often skewed. The Box-Cox transformation is a powerful procedure for comparing two treatment groups for skewed continuous variables in terms of a statistical test. However, it is difficult to directly estimate and interpret the location difference between the two groups on the original scale of the measurement. We propose a helpful method that infers the difference of the treatment effect on the original scale in a more easily interpretable form. We also provide statistical analysis packages that consistently include an estimate of the treatment effect, covariance adjustments, standard errors, and statistical hypothesis tests. The simulation study that focuses on randomized parallel group clinical trials with two treatment groups indicates that the performance of the proposed method is equivalent to or better than that of the existing non-parametric approaches in terms of the type-I error rate and power. We illustrate our method with cluster of differentiation 4 data in an acquired immune deficiency syndrome clinical trial. Copyright © 2015 John Wiley & Sons, Ltd.
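
The general approach described above can be sketched as follows. This is an illustrative outline only, not the authors' exact estimator or their analysis package: a common Box-Cox lambda is estimated from a hypothetical pooled sample, the groups are compared on the transformed scale, and the group means are back-transformed to report a location difference on the original scale.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical skewed outcomes for two treatment groups (log-normal)
control = rng.lognormal(mean=1.0, sigma=0.5, size=100)
treated = rng.lognormal(mean=1.3, sigma=0.5, size=100)

# Estimate a common Box-Cox lambda from the pooled sample
pooled = np.concatenate([control, treated])
_, lam = stats.boxcox(pooled)

# Compare the groups on the transformed (approximately normal) scale
t_control = stats.boxcox(control, lmbda=lam)
t_treated = stats.boxcox(treated, lmbda=lam)
t_stat, p_value = stats.ttest_ind(t_treated, t_control)

# Back-transform the transformed-scale group means to the original
# scale (inverse Box-Cox) and report their difference
def inv_boxcox(y, lam):
    return np.exp(y) if lam == 0 else (lam * y + 1) ** (1 / lam)

diff = inv_boxcox(t_treated.mean(), lam) - inv_boxcox(t_control.mean(), lam)
print(f"lambda={lam:.3f}, p={p_value:.4f}, original-scale difference={diff:.3f}")
```

The back-transformed mean corresponds to a median-type location estimate on the original scale, which is what makes the difference interpretable for skewed data.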

  12. The relationship between knowledge of leadership and knowledge management practices in the food industry in Kurdistan province, Iran.

    PubMed

    Jad, Seyyed Mohammad Moosavi; Geravandi, Sahar; Mohammadi, Mohammad Javad; Alizadeh, Rashin; Sarvarian, Mohammad; Rastegarimehr, Babak; Afkar, Abolhasan; Yari, Ahmad Reza; Momtazan, Mahboobeh; Valipour, Aliasghar; Mahboubi, Mohammad; Karimyan, Azimeh; Mazraehkar, Alireza; Nejad, Ali Soleimani; Mohammadi, Hafez

    2017-12-01

    The aim of this study was to identify the relationship between knowledge-oriented leadership and knowledge management practices. In its strategy, procedures, and methods of data collection, this research is descriptive and correlational. The statistical population consisted of all employees of the food industry in Kurdistan province of Iran who were employed in 2016, totaling about 1800 people. Using the Cochran formula, 316 employees of the Kurdistan food industry (Kurdistan FI) were selected. A non-random sampling method and valid standard questionnaires were used to collect the data; their reliability and validity were confirmed. Statistical analysis was carried out using SPSS 16. The analysis of the collected data showed a relationship between knowledge-oriented leadership and knowledge management activities as mediator variables. The results of the data and hypothesis tests suggest that knowledge management activities play an important role in product innovation performance, and that the knowledge management activities (knowledge transfer, knowledge storage, knowledge application, and knowledge creation) affect product innovation performance.

  13. Numerical solutions of ideal quantum gas dynamical flows governed by semiclassical ellipsoidal-statistical distribution

    PubMed Central

    Yang, Jaw-Yen; Yan, Chih-Yuan; Diaz, Manuel; Huang, Juan-Chen; Li, Zhihui; Zhang, Hanxin

    2014-01-01

    The ideal quantum gas dynamics as manifested by the semiclassical ellipsoidal-statistical (ES) equilibrium distribution derived in Wu et al. (Wu et al. 2012 Proc. R. Soc. A 468, 1799–1823 (doi:10.1098/rspa.2011.0673)) is numerically studied for particles of three statistics. This anisotropic ES equilibrium distribution was derived using the maximum entropy principle and conserves the mass, momentum and energy, but differs from the standard Fermi–Dirac or Bose–Einstein distribution. The present numerical method combines the discrete velocity (or momentum) ordinate method in momentum space and the high-resolution shock-capturing method in physical space. A decoding procedure to obtain the necessary parameters for determining the ES distribution is also devised. Computations of two-dimensional Riemann problems are presented, and various contours of the quantities unique to this ES model are illustrated. The main flow features, such as shock waves, expansion waves and slip lines and their complex nonlinear interactions, are depicted and found to be consistent with existing calculations for a classical gas. PMID:24399919

  14. Impact of gastrectomy procedural complexity on surgical outcomes and hospital comparisons.

    PubMed

    Mohanty, Sanjay; Paruch, Jennifer; Bilimoria, Karl Y; Cohen, Mark; Strong, Vivian E; Weber, Sharon M

    2015-08-01

    Most risk adjustment approaches adjust for patient comorbidities and the primary procedure. However, procedures done at the same time as the index case may increase operative risk and merit inclusion in adjustment models for fair hospital comparisons. Our objectives were to evaluate the impact of surgical complexity on postoperative outcomes and hospital comparisons in gastric cancer surgery. Patients who underwent gastric resection for cancer were identified from a large clinical dataset. Procedure complexity was characterized using secondary procedure CPT codes and work relative value units (RVUs). Regression models were developed to evaluate the association between complexity variables and outcomes. The impact of complexity adjustment on model performance and hospital comparisons was examined. Among 3,467 patients who underwent gastrectomy for adenocarcinoma, 2,171 operations were distal and 1,296 total. A secondary procedure was reported for 33% of distal gastrectomies and 59% of total gastrectomies. Six of 10 secondary procedures were associated with adverse outcomes. For example, patients who underwent a synchronous bowel resection had a higher risk of mortality (odds ratio [OR], 2.14; 95% CI, 1.07-4.29) and reoperation (OR, 2.09; 95% CI, 1.26-3.47). Model performance was slightly better for nearly all outcomes with complexity adjustment (mortality c-statistics: standard model, 0.853; secondary procedure model, 0.858; RVU model, 0.855). Hospital ranking did not change substantially after complexity adjustment. Surgical complexity variables are associated with adverse outcomes in gastrectomy, but complexity adjustment does not affect hospital rankings appreciably. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. Comparison of a minimally invasive procedure versus standard microscopic discotomy: a prospective randomised controlled clinical trial

    PubMed Central

    Greiner-Perth, R.; Boehm, H.; Mahlfeld, K.; Grasshoff, H.; Allam, Y.; Awiszus, F.

    2009-01-01

    A prospective randomised controlled study was done to determine the statistical difference between standard microsurgical discotomy (MC) and a minimally invasive microscopic procedure for disc prolapse surgery by comparing operation duration and clinical outcome. Additionally, the transferability of the results was assessed through a bicentric design. Microscopic assisted percutaneous nucleotomy (MAPN) has been advocated as a minimally invasive tubular technique, and proponents have claimed that minimally invasive procedures reduce postoperative pain and accelerate recovery. However, only a limited number of well-designed studies comparing standard microdiscotomy to a tubular minimally invasive technique support this claim, and there are no well-designed studies examining the transferability of those results and possible learning-curve phenomena. We studied 100 patients who were scheduled for disc prolapse surgery at two centres [50 patients at the developing (index) centre and 50 patients at the less experienced (transfer) centre]. Randomisation was done separately for each centre, employing a block-randomisation procedure with respect to age and preoperative Oswestry score. Operation duration was chosen as the primary outcome parameter, as a distinct shortening had been observed in a preliminary study at the index centre, enabling a sound case-number estimation. The following data were compared between the two groups and the centres over a 12-month follow-up: surgical times (operation duration and approach duration), clinical results, leg and back pain by visual analogue scale, the Oswestry disability index, length of hospital stay, return-to-work time, and complications. The operation duration was statistically identical for MC (57.8 ± 20.2 min) at the index centre and for MAPN (50.3 ± 18.3 min) and MC (54.7 ± 18.1 min) at the transfer centre. 
The operation duration was significantly shorter only for the MAPN technique at the index centre, at 33.3 min (SD 12.1 min). There was a substantial clinical improvement for all patients regardless of centre or method, as revealed by a repeated-measures ANOVA over all follow-up visits. Separate post hoc ANOVAs for each centre revealed a significant time–method (MAPN vs. MC) interaction at the index centre (F = 3.75, P = 0.006), whereas this crucial interaction was not present at the transfer centre (F = 0.5, P = 0.7). These results suggest a slightly faster clinical recovery for the MAPN patients only at the index centre. This was due to a greater reduction in VAS score for back pain at discharge and at the 8-week and 6-month follow-ups (P < 0.002). The Oswestry disability scores showed a significant improvement compared with the initial values over the complete follow-up at both centres for both methods, without revealing any differences between the two methods in either centre. There was no difference regarding complications. The results demonstrate that a shorter operation duration and concomitantly quicker recovery are achievable at a centre experienced in minimally invasive surgery. These advantages could not be found at the transfer centre within 25 minimally invasive procedures. In conclusion, both procedures show equal mid-term clinical results and the same complication rate, even though the suggested advantages of the minimally invasive procedure could not be confirmed for the transfer centre within the framework of this study. PMID:19360440
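
The block-randomisation step mentioned above can be sketched as follows. This is a simplified illustration that balances two arms in shuffled permuted blocks; the trial's stratification by age and preoperative Oswestry score is omitted, and the function name, block size, and arm labels are hypothetical.

```python
import random

def block_randomise(n_patients, block_size=4, arms=("MAPN", "MC"), seed=42):
    """Assign patients to arms in shuffled permuted blocks so that
    group sizes stay balanced throughout recruitment."""
    assert block_size % len(arms) == 0, "block size must be a multiple of arm count"
    rng = random.Random(seed)
    per_arm = block_size // len(arms)
    allocation = []
    while len(allocation) < n_patients:
        block = list(arms) * per_arm   # each arm appears equally in a block
        rng.shuffle(block)             # order within the block is random
        allocation.extend(block)
    return allocation[:n_patients]

# 50 patients per centre, as in the trial; counts stay close to 25/25
alloc = block_randomise(50)
print(alloc.count("MAPN"), alloc.count("MC"))
```

With a block size of 4, the arm counts can differ by at most the size of the final partial block, which is why permuted blocks are preferred over simple coin-flip randomisation in small trials.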

  16. Canary in a coal mine: does the plastic surgery market predict the American economy?

    PubMed

    Wong, Wendy W; Davis, Drew G; Son, Andrew K; Camp, Matthew C; Gupta, Subhas C

    2010-08-01

    Economic tools have been used in the past to predict the trends in plastic surgery procedures. Since 1992, U.S. cosmetic surgery volumes have increased overall, but the exact relationship between economic downturns and procedural volumes remains elusive. If an economic predicting role can be established from plastic surgery indicators, this could prove to be a very powerful tool. A rolling 3-month revenue average of an eight-plastic surgeon practice and various economic indicators were plotted and compared. An investigation of the U.S. procedural volumes was performed from the American Society of Plastic Surgeons statistics between 1996 and 2008. The correlations of different economic variables with plastic surgery volumes were evaluated. Lastly, search term frequencies were examined from 2004 to July of 2009 to study potential patient interest in major plastic surgery procedures. The self-payment revenue of the plastic surgery group consistently proved indicative of the market trends approximately 1 month in advance. The Standard and Poor's 500, Dow Jones Industrial Average, National Association of Securities Dealers Automated Quotations, and Standard and Poor's Retail Index demonstrated a very close relationship with the income of our plastic surgery group. The frequency of Internet search terms showed a constant level of interest in the patient population despite economic downturns. The data demonstrate that examining plastic surgery revenue can be a useful tool to analyze and possibly predict trends, as it is driven by a market and shows a close correlation to many leading economic indicators. The persisting and increasing interest in plastic surgery suggests hope for a recovering and successful market in the near future.

  17. Laparoscopic Colorectal Surgery in the Emergency Setting: Trends in the Province of Ontario.

    PubMed

    Musselman, Reilly P; Gomes, Tara; Chan, Beverley P; Auer, Rebecca C; Moloo, Husein; Mamdani, Muhammad; Al-Omran, Mohammed; Al-Obeed, Omar; Boushey, Robin P

    2015-10-01

    The purpose of this study was to examine the adoption trends of emergency laparoscopic colorectal surgery in the province of Ontario. We conducted a retrospective time-series analysis examining rates of emergency colorectal surgery among 10.5 million adults in Ontario, Canada from April 1, 2002 to December 31, 2009. We linked administrative claims databases and the Ontario Cancer Registry to assess procedure rates over time. Procedure trends were assessed using time-series analysis. Over the 8-year period, 29,676 emergency colorectal procedures were identified. A total of 2582 (8.7%) were performed laparoscopically and 27,094 (91.3%) were open. Open and laparoscopic patients were similar with respect to age, sex, and Charlson Comorbidity Index. The proportion of surgery for benign (63.8% of open cases vs. 65.6% laparoscopic, standardized difference=0.04) and malignant disease (36.2% open vs. 34.4% laparoscopic, standardized difference=0.04) was equal between groups. The percentage of emergency colorectal surgery performed laparoscopically increased from 5.7% in 2002 to 12.0% in 2009 (P<0.01). The use of laparoscopy increased for both benign and malignant disease. Statistically significant upward trends in laparoscopic surgery were seen for inflammatory bowel disease (P<0.01), obstruction (P<0.01), and colon cancer (P<0.01). From 2002 to 2009, annual procedure rates increased at a greater rate in nonacademic centers (P<0.01). Laparoscopic emergency colorectal surgery increased significantly between 2002 and 2009 for both benign and malignant disease and for a wide range of diagnoses. This was driven in part by steadily rising usage of laparoscopy in nonacademic centers.

  18. Analyzing longitudinal data with the linear mixed models procedure in SPSS.

    PubMed

    West, Brady T

    2009-09-01

    Many applied researchers analyzing longitudinal data share a common misconception: that specialized statistical software is necessary to fit hierarchical linear models (also known as linear mixed models [LMMs], or multilevel models) to longitudinal data sets. Although several specialized statistical software programs of high quality are available that allow researchers to fit these models to longitudinal data sets (e.g., HLM), rapid advances in general purpose statistical software packages have recently enabled analysts to fit these same models when using preferred packages that also enable other more common analyses. One of these general purpose statistical packages is SPSS, which includes a very flexible and powerful procedure for fitting LMMs to longitudinal data sets with continuous outcomes. This article aims to present readers with a practical discussion of how to analyze longitudinal data using the LMMs procedure in the SPSS statistical software package.
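
As an analogue of the SPSS MIXED procedure described above, the same class of random-intercept model for longitudinal data can be fitted in a general-purpose package such as Python's statsmodels. This is a sketch, not the article's SPSS syntax; the simulated data set, variable names, and effect sizes are hypothetical, and statsmodels is assumed to be available.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_subjects, n_waves = 40, 4

# Simulate a longitudinal data set: each subject is measured at 4 waves,
# with a random intercept per subject and a common linear time trend
subjects = np.repeat(np.arange(n_subjects), n_waves)
time = np.tile(np.arange(n_waves), n_subjects)
u = rng.normal(0, 2.0, n_subjects)          # subject-level random intercepts
y = 10 + 1.5 * time + u[subjects] + rng.normal(0, 1.0, subjects.size)
df = pd.DataFrame({"subject": subjects, "time": time, "y": y})

# Random-intercept LMM: fixed effect of time, grouping by subject
# (similar in spirit to SPSS MIXED with a SUBJECT=subject specification)
model = smf.mixedlm("y ~ time", df, groups=df["subject"])
result = model.fit()
print(result.params["time"])  # estimated fixed slope for time
```

The key modelling point carried over from the article is the grouping structure: repeated measures are nested within subjects, so the subject-level variance is modelled explicitly rather than ignored.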

  19. A standardised protocol for texture feature analysis of endoscopic images in gynaecological cancer.

    PubMed

    Neofytou, Marios S; Tanos, Vasilis; Pattichis, Marios S; Pattichis, Constantinos S; Kyriacou, Efthyvoulos C; Koutsouris, Dimitris D

    2007-11-29

    In the development of tissue classification methods, classifiers rely on significant differences between texture features extracted from normal and abnormal regions. Yet, significant differences can arise due to variations in the image acquisition method. For endoscopic imaging of the endometrium, we propose a standardized image acquisition protocol to eliminate significant statistical differences due to variations in: (i) the distance from the tissue (panoramic vs close up), (ii) difference in viewing angles and (iii) color correction. We investigate texture feature variability for a variety of targets encountered in clinical endoscopy. All images were captured at clinically optimum illumination and focus using 720 x 576 pixels and 24 bits color for: (i) a variety of testing targets from a color palette with a known color distribution, (ii) different viewing angles, and (iii) two different distances from a calf endometrium and from a chicken cavity. Also, human images from the endometrium were captured and analysed. For texture feature analysis, three different sets were considered: (i) Statistical Features (SF), (ii) Spatial Gray Level Dependence Matrices (SGLDM), and (iii) Gray Level Difference Statistics (GLDS). All images were gamma corrected and the extracted texture feature values were compared against the texture feature values extracted from the uncorrected images. Statistical tests were applied to compare images from different viewing conditions so as to determine any significant differences. For the proposed acquisition procedure, results indicate that there is no significant difference in texture features between the panoramic and close up views and between angles. For a calibrated target image, gamma correction provided an acquired image that was a significantly better approximation to the original target image. In turn, this implies that the texture features extracted from the corrected images provided for better approximations to the original images. 
Within the proposed protocol, for human ROIs, we have found that there is a large number of texture features that showed significant differences between normal and abnormal endometrium. This study provides a standardized protocol for avoiding any significant texture feature differences that may arise due to variability in the acquisition procedure or the lack of color correction. After applying the protocol, we have found that significant differences in texture features will only be due to the fact that the features were extracted from different types of tissue (normal vs abnormal).
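
As an illustration of one of the feature sets named above, Gray Level Difference Statistics (GLDS) are computed from the histogram of absolute gray-level differences between pixel pairs at a fixed displacement. The following is a minimal NumPy sketch; the function name and the particular feature subset are illustrative, not the authors' implementation.

```python
import numpy as np

def glds_features(image, dx=1, dy=0, levels=256):
    """Gray Level Difference Statistics: features of the distribution of
    absolute gray-level differences at pixel displacement (dx, dy)."""
    img = np.asarray(image, dtype=int)
    a = img[: img.shape[0] - dy, : img.shape[1] - dx]
    b = img[dy:, dx:]
    diff = np.abs(a - b).ravel()
    hist = np.bincount(diff, minlength=levels).astype(float)
    p = hist / hist.sum()                 # difference probability distribution
    nz = p[p > 0]
    k = np.arange(levels)
    return {
        "mean": float((k * p).sum()),           # average local difference
        "contrast": float((k ** 2 * p).sum()),  # second moment of differences
        "entropy": float(-(nz * np.log2(nz)).sum()),
    }

# A flat patch has zero difference statistics; a checkerboard is all-contrast
flat = np.full((8, 8), 128)
checker = np.indices((8, 8)).sum(axis=0) % 2 * 255
print(glds_features(flat)["contrast"], glds_features(checker)["contrast"])
```

Because the features depend only on gray-level differences, they are sensitive to gamma and illumination changes, which is exactly why the protocol above controls acquisition conditions and applies gamma correction before extraction.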

  20. Procedures and Standards Handbook. Version 3.0. What Works Clearinghouse

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2014

    2014-01-01

    This "What Works Clearinghouse Procedures and Standards Handbook (Version 3.0)" provides a detailed description of the standards and procedures of the What Works Clearinghouse (WWC). The remaining chapters of this Handbook are organized to take the reader through the basic steps that the WWC uses to develop a review protocol, identify…

  1. 46 CFR 160.055-9 - Procedure for approval-standard and nonstandard life preservers.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 6 2010-10-01 2010-10-01 false Procedure for approval-standard and nonstandard life preservers. 160.055-9 Section 160.055-9 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED..., Unicellular Plastic Foam, Adult and Child, for Merchant Vessels § 160.055-9 Procedure for approval—standard...

  2. 77 FR 4698 - Energy Conservation Program: Test Procedure and Energy Conservation Standard for Set-Top Boxes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-31

    ... Conservation Program: Test Procedure and Energy Conservation Standard for Set-Top Boxes and Network Equipment... comments on the request for information pertaining to the development of test procedures and energy conservation standards for set-top boxes and network equipment. The comment period is extended to March 15...

  3. 49 CFR 231.35 - Procedure for modification of an approved industry safety appliance standard for new railcar...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... industry safety appliance standard for new construction of railroad cars, locomotives, tenders, or other... 49 Transportation 4 2013-10-01 2013-10-01 false Procedure for modification of an approved industry... TRANSPORTATION RAILROAD SAFETY APPLIANCE STANDARDS § 231.35 Procedure for modification of an approved industry...

  4. 49 CFR 231.35 - Procedure for modification of an approved industry safety appliance standard for new railcar...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... industry safety appliance standard for new construction of railroad cars, locomotives, tenders, or other... 49 Transportation 4 2014-10-01 2014-10-01 false Procedure for modification of an approved industry... TRANSPORTATION RAILROAD SAFETY APPLIANCE STANDARDS § 231.35 Procedure for modification of an approved industry...

  5. 49 CFR 231.35 - Procedure for modification of an approved industry safety appliance standard for new railcar...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... industry safety appliance standard for new construction of railroad cars, locomotives, tenders, or other... 49 Transportation 4 2012-10-01 2012-10-01 false Procedure for modification of an approved industry... TRANSPORTATION RAILROAD SAFETY APPLIANCE STANDARDS § 231.35 Procedure for modification of an approved industry...

  6. 49 CFR 231.35 - Procedure for modification of an approved industry safety appliance standard for new railcar...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... industry safety appliance standard for new construction of railroad cars, locomotives, tenders, or other... 49 Transportation 4 2011-10-01 2011-10-01 false Procedure for modification of an approved industry... TRANSPORTATION RAILROAD SAFETY APPLIANCE STANDARDS § 231.35 Procedure for modification of an approved industry...

  7. 21 CFR 861.30 - Development of standards.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Development of standards. 861.30 Section 861.30...) MEDICAL DEVICES PROCEDURES FOR PERFORMANCE STANDARDS DEVELOPMENT Procedures for Performance Standards Development and Publication § 861.30 Development of standards. The Food and Drug Administration (FDA), while...

  8. 21 CFR 861.30 - Development of standards.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Development of standards. 861.30 Section 861.30...) MEDICAL DEVICES PROCEDURES FOR PERFORMANCE STANDARDS DEVELOPMENT Procedures for Performance Standards Development and Publication § 861.30 Development of standards. The Food and Drug Administration (FDA), while...

  9. 15 CFR 10.13 - Withdrawal of a published standard.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... substantial public impact, that it does not duplicate a standard published by a private standards-writing...) Before withdrawing a standard published under these procedures, the Director will review the relative... years. (2) Public notice of intent to withdraw an existing standard published under these procedures...

  10. 15 CFR 10.13 - Withdrawal of a published standard.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... substantial public impact, that it does not duplicate a standard published by a private standards-writing...) Before withdrawing a standard published under these procedures, the Director will review the relative... years. (2) Public notice of intent to withdraw an existing standard published under these procedures...

  11. 15 CFR 10.13 - Withdrawal of a published standard.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... substantial public impact, that it does not duplicate a standard published by a private standards-writing...) Before withdrawing a standard published under these procedures, the Director will review the relative... years. (2) Public notice of intent to withdraw an existing standard published under these procedures...

  12. 15 CFR 10.13 - Withdrawal of a published standard.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... substantial public impact, that it does not duplicate a standard published by a private standards-writing...) Before withdrawing a standard published under these procedures, the Director will review the relative... years. (2) Public notice of intent to withdraw an existing standard published under these procedures...

  13. 7 CFR 42.122 - Applicability of other procedures.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ....122 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE COMMODITY STANDARDS AND STANDARD CONTAINER REGULATIONS STANDARDS FOR CONDITION OF FOOD CONTAINERS Skip Lot Sampling and Inspection Procedures § 42.122...

  14. 7 CFR 42.136 - Applicability of other procedures.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ....136 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE COMMODITY STANDARDS AND STANDARD CONTAINER REGULATIONS STANDARDS FOR CONDITION OF FOOD CONTAINERS On-Line Sampling and Inspection Procedures § 42.136...

  15. 21 CFR 606.60 - Equipment.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... prescribed in the Standard Operating Procedures Manual and shall perform in the manner for which it was... Hemoglobinometer Standardize against cyanmethemoglobin standard ......do Refractometer Standardize against... microorganisms. The effectiveness of the sterilization procedure shall be no less than that achieved by an...

  16. 21 CFR 606.60 - Equipment.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... prescribed in the Standard Operating Procedures Manual and shall perform in the manner for which it was... Hemoglobinometer Standardize against cyanmethemoglobin standard ......do Refractometer Standardize against... microorganisms. The effectiveness of the sterilization procedure shall be no less than that achieved by an...

  17. 21 CFR 606.60 - Equipment.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... prescribed in the Standard Operating Procedures Manual and shall perform in the manner for which it was... Hemoglobinometer Standardize against cyanmethemoglobin standard ......do Refractometer Standardize against... microorganisms. The effectiveness of the sterilization procedure shall be no less than that achieved by an...

  18. 21 CFR 606.60 - Equipment.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... prescribed in the Standard Operating Procedures Manual and shall perform in the manner for which it was... Hemoglobinometer Standardize against cyanmethemoglobin standard ......do Refractometer Standardize against... microorganisms. The effectiveness of the sterilization procedure shall be no less than that achieved by an...

  19. 21 CFR 606.60 - Equipment.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... prescribed in the Standard Operating Procedures Manual and shall perform in the manner for which it was... Hemoglobinometer Standardize against cyanmethemoglobin standard ......do Refractometer Standardize against... microorganisms. The effectiveness of the sterilization procedure shall be no less than that achieved by an...

  20. 40 CFR 1065.10 - Other procedures.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... meet all applicable emission standards. (b) Our testing. These procedures generally apply for testing... importance of pursuing changes to the procedures: (i) Whether supplemental emission standards or other... procedures. For example, this may apply if your engine cannot operate on the specified duty cycle. In this...
