Sample records for the query "statistical analysis failed"

  1. An evaluation of the quality of statistical design and analysis of published medical research: results from a systematic survey of general orthopaedic journals.

    PubMed

    Parsons, Nick R; Price, Charlotte L; Hiskens, Richard; Achten, Juul; Costa, Matthew L

    2012-04-25

    The application of statistics in reported research in trauma and orthopaedic surgery has become ever more important and complex. Despite the extensive use of statistical analysis, it is still a subject which is often not conceptually well understood, resulting in clear methodological flaws and inadequate reporting in many papers. A detailed statistical survey sampled 100 representative orthopaedic papers using a validated questionnaire that assessed the quality of the trial design and statistical analysis methods. The survey found evidence of failings in study design, statistical methodology and presentation of the results. Overall, in 17% (95% confidence interval: 10-26%) of the studies investigated the conclusions were not clearly justified by the results, in 39% (30-49%) of studies a different analysis should have been undertaken and in 17% (10-26%) a different analysis could have made a difference to the overall conclusions. It is only by an improved dialogue between statistician, clinician, reviewer and journal editor that the failings in design methodology and analysis highlighted by this survey can be addressed.
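
    The interval arithmetic above can be checked directly. A minimal sketch, assuming an exact (Clopper-Pearson) binomial interval — the abstract does not state which method the authors used — reproduces the 17% (10-26%) figure from 17 flagged studies out of 100:

    ```python
    # Reproducing the abstract's 17% (10-26%) figure, assuming an exact
    # (Clopper-Pearson) interval on 17 of 100 studies; the abstract does
    # not state which interval method the authors actually used.
    from scipy.stats import beta

    k, n, alpha = 17, 100, 0.05
    lower = beta.ppf(alpha / 2, k, n - k + 1)      # 0 when k == 0
    upper = beta.ppf(1 - alpha / 2, k + 1, n - k)  # 1 when k == n
    print(f"{k/n:.0%} (95% CI {lower:.0%}-{upper:.0%})")  # 17% (95% CI 10%-26%)
    ```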

  2. Why campaigns for local transportation funding initiatives succeed or fail: an analysis of four communities and national data

    DOT National Transportation Integrated Search

    2000-06-01

    This report uses statistical analysis of community-level characteristics and qualitatively focused case studies to explore what determines the success of local transportation-related tax measures. The report contains both a statistical analysis of lo...

  3. The Empirical Review of Meta-Analysis Published in Korea

    ERIC Educational Resources Information Center

    Park, Sunyoung; Hong, Sehee

    2016-01-01

    Meta-analysis is a statistical method that is increasingly utilized to combine and compare the results of previous primary studies. However, because of the lack of comprehensive guidelines for how to use meta-analysis, many meta-analysis studies have failed to consider important aspects, such as statistical programs, power analysis, publication…

  4. Randomization Procedures Applied to Analysis of Ballistic Data

    DTIC Science & Technology

    1991-06-01

    Keywords: data analysis; computationally intensive statistics; randomization tests; permutation tests; nonparametric statistics. Excerpt: "Any reasonable statistical procedure would fail to support the notion of improvement of dynamic over standard indexing based on this data." (AD-A238 389, Technical Report BRL-TR-3245, Malcolm S. Taylor and Barry A. Bodt, June 1991)

  5. Arthrodesis following failed total knee arthroplasty: comprehensive review and meta-analysis of recent literature.

    PubMed

    Damron, T A; McBeath, A A

    1995-04-01

    With the increasing duration of follow up on total knee arthroplasties, more revision arthroplasties are being performed. When revision is not advisable, a salvage procedure such as arthrodesis or resection arthroplasty is indicated. This article provides a comprehensive review of the literature regarding arthrodesis following failed total knee arthroplasty. In addition, a statistical meta-analysis of five studies using modern arthrodesis techniques is presented. A statistically significant greater fusion rate with intramedullary nail arthrodesis compared to external fixation is documented. Gram negative and mixed infections are found to be significant risk factors for failure of arthrodesis.

  6. Analysis and Evaluation of the LANDSAT-4 MSS and TM Sensors and Ground Data Processing Systems: Early Results

    NASA Technical Reports Server (NTRS)

    Bernstein, R.; Lotspiech, J. B.

    1985-01-01

    The MSS and TM sensor performances were evaluated by studying both the sensors and the characteristics of the data. Information content analysis, image statistics, band-to-band registration, the presence of failed or failing detectors, and sensor resolution are discussed. The TM data were explored from the point of view of adequacy of the ground processing and improvements that could be made to compensate for sensor problems and deficiencies. Radiometric correction processing, compensation for a failed detector, and geometric correction processing are also considered.

  7. Publication Bias in Meta-Analysis: Confidence Intervals for Rosenthal's Fail-Safe Number.

    PubMed

    Fragkos, Konstantinos C; Tsagris, Michail; Frangos, Christos C

    2014-01-01

    The purpose of the present paper is to assess the efficacy of confidence intervals for Rosenthal's fail-safe number. Although Rosenthal's estimator is widely used by researchers, its statistical properties are largely unexplored. First of all, we developed statistical theory which allowed us to produce confidence intervals for Rosenthal's fail-safe number. This was produced by discerning whether the number of studies analysed in a meta-analysis is fixed or random. Each case produces different variance estimators. For a given number of studies and a given distribution, we provided five variance estimators. Confidence intervals are examined with a normal approximation and a nonparametric bootstrap. The accuracy of the different confidence interval estimates was then tested by methods of simulation under different distributional assumptions. The half normal distribution variance estimator has the best probability coverage. Finally, we provide a table of lower confidence intervals for Rosenthal's estimator.
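
    For context, Rosenthal's fail-safe number itself — the estimator whose confidence intervals the paper develops — is straightforward to compute: it is the number of unpublished null studies needed to lift the combined one-tailed p above .05. A minimal sketch with hypothetical per-study z values (the paper's variance estimators are not reproduced here):

    ```python
    # Rosenthal's fail-safe N: N = (sum of z)^2 / z_alpha^2 - k,
    # with z_alpha = 1.645 for a one-tailed .05 criterion.
    def rosenthal_fail_safe_n(z_scores, z_alpha=1.645):
        k = len(z_scores)
        return sum(z_scores) ** 2 / z_alpha ** 2 - k

    zs = [2.1, 1.8, 2.5, 1.2, 2.9]           # hypothetical per-study z values
    print(round(rosenthal_fail_safe_n(zs)))  # -> 36 hidden null studies
    ```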

  8. Targeting regional pediatric congenital hearing loss using a spatial scan statistic.

    PubMed

    Bush, Matthew L; Christian, Warren Jay; Bianchi, Kristin; Lester, Cathy; Schoenberg, Nancy

    2015-01-01

    Congenital hearing loss is a common problem, and timely identification and intervention are paramount for language development. Patients from rural regions may have many barriers to timely diagnosis and intervention. The purpose of this study was to examine the spatial and hospital-based distribution of failed infant hearing screening testing and pediatric congenital hearing loss throughout Kentucky. Data on live births and audiological reporting of infant hearing loss results in Kentucky from 2009 to 2011 were analyzed. The authors used spatial scan statistics to identify high-rate clusters of failed newborn screening tests and permanent congenital hearing loss (PCHL), based on the total number of live births per county. The authors conducted further analyses on PCHL and failed newborn hearing screening tests, based on birth hospital data and method of screening. The authors observed four statistically significant (p < 0.05) high-rate clusters with failed newborn hearing screenings in Kentucky, including two in the Appalachian region. Hospitals using two-stage otoacoustic emission testing demonstrated higher rates of failed screening (p = 0.009) than those using two-stage automated auditory brainstem response testing. A significant cluster of high rate of PCHL was observed in Western Kentucky. Five of the 54 birthing hospitals were found to have higher relative risk of PCHL, and two of those hospitals are located in a very rural region of Western Kentucky within the cluster. This spatial analysis in children in Kentucky has identified specific regions throughout the state with high rates of congenital hearing loss and failed newborn hearing screening tests. Further investigation regarding causative factors is warranted. This method of analysis can be useful in the setting of hearing health disparities to focus efforts on regions facing high incidence of congenital hearing loss.

  9. Why Children Fail in First Grade in Rio Grande do Sul: Implications for Policy and Research.

    ERIC Educational Resources Information Center

    Wolff, Laurence

    This study, exploring why first grade children from Rio Grande do Sul, Brazil, fail in school, utilized computerized techniques of statistical analysis to measure the relationships of various school and family characteristics with student achievement. Four types of schools--urban state, rural state, municipal, and private--were used to test the…

  10. Development of a reliable simulation-based test for diagnostic abdominal ultrasound with a pass/fail standard usable for mastery learning.

    PubMed

    Østergaard, Mia L; Nielsen, Kristina R; Albrecht-Beste, Elisabeth; Konge, Lars; Nielsen, Michael B

    2018-01-01

    This study aimed to develop a test with validity evidence for abdominal diagnostic ultrasound with a pass/fail-standard to facilitate mastery learning. The simulator had 150 real-life patient abdominal scans of which 15 cases with 44 findings were selected, representing level 1 from The European Federation of Societies for Ultrasound in Medicine and Biology. Four groups of experience levels were constructed: Novices (medical students), trainees (first-year radiology residents), intermediates (third- to fourth-year radiology residents) and advanced (physicians with ultrasound fellowship). Participants were tested in a standardized setup and scored by two blinded reviewers prior to an item analysis. The item analysis excluded 14 diagnoses. Both internal consistency (Cronbach's alpha 0.96) and inter-rater reliability (0.99) were good and there were statistically significant differences (p < 0.001) between all four groups, except the intermediate and advanced groups (p = 1.0). There was a statistically significant correlation between experience and test scores (Pearson's r = 0.82, p < 0.001). The pass/fail-standard failed all novices (no false positives) and passed all advanced (no false negatives). All intermediate participants and six out of 14 trainees passed. We developed a test for diagnostic abdominal ultrasound with solid validity evidence and a pass/fail-standard without any false-positive or false-negative scores. • Ultrasound training can benefit from competency-based education based on reliable tests. • This simulation-based test can differentiate between competency levels of ultrasound examiners. • This test is suitable for competency-based education, e.g. mastery learning. • We provide a pass/fail standard without false-negative or false-positive scores.
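
    The reliability figures reported above follow standard formulas; Cronbach's alpha, for instance, is computed from an examinees-by-items score matrix. A sketch with synthetic scores standing in for the real test data:

    ```python
    # How a Cronbach's alpha such as the 0.96 above is computed from an
    # examinees-by-items score matrix (the data here are synthetic).
    import numpy as np

    def cronbach_alpha(scores):               # rows: examinees, cols: items
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]
        item_vars = scores.var(axis=0, ddof=1).sum()
        total_var = scores.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars / total_var)

    rng = np.random.default_rng(0)
    ability = rng.normal(size=(50, 1))        # latent skill per examinee
    scores = (ability + rng.normal(scale=0.7, size=(50, 30)) > 0).astype(int)
    print(f"alpha = {cronbach_alpha(scores):.2f}")
    ```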

  11. Vocational Preparation for Women: A Critical Analysis.

    ERIC Educational Resources Information Center

    Steiger, JoAnn

    In this analysis of vocational preparation for women material is presented to substantiate the claim that women are joining the labor force in increasing numbers and their career opportunities are expanding, but that the educational system has failed to respond. Statistical data is cited showing that women have traditionally been employed in just…

  12. Survival of dental implants placed in sites of previously failed implants.

    PubMed

    Chrcanovic, Bruno R; Kisch, Jenö; Albrektsson, Tomas; Wennerberg, Ann

    2017-11-01

    To assess the survival of dental implants placed in sites of previously failed implants and to explore the possible factors that might affect the outcome of this reimplantation procedure. Patients that had failed dental implants, which were replaced with the same implant type at the same site, were included. Descriptive statistics were used to describe the patients and implants; survival analysis was also performed. The effect of systemic, environmental, and local factors on the survival of the reoperated implants was evaluated. 175 of 10,096 implants in 98 patients were replaced by another implant at the same location (159, 14, and 2 implants at second, third, and fourth surgeries, respectively). Newly replaced implants were generally of similar diameter but of shorter length compared to the previously placed fixtures. A statistically significant greater percentage of lost implants were placed in sites with low bone quantity. There was a statistically significant difference (P = 0.032) in the survival rates between implants that were inserted for the first time (94%) and implants that replaced the ones lost (73%). There was a statistically higher failure rate of the reoperated implants for patients taking antidepressants and antithrombotic agents. Dental implants replacing failed implants had lower survival rates than the rates reported for the previous attempts of implant placement. It is suggested that a site-specific negative effect may possibly be associated with this phenomenon, as well as the intake of antidepressants and antithrombotic agents. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  13. Comparing Methods for Assessing Reliability Uncertainty Based on Pass/Fail Data Collected Over Time

    DOE PAGES

    Abes, Jeff I.; Hamada, Michael S.; Hills, Charles R.

    2017-12-20

    In this paper, we compare statistical methods for analyzing pass/fail data collected over time; some methods are traditional and one (the RADAR or Rationale for Assessing Degradation Arriving at Random) was recently developed. These methods are used to provide uncertainty bounds on reliability. We make observations about the methods' assumptions and properties. Finally, we illustrate the differences between two traditional methods, logistic regression and Weibull failure time analysis, and the RADAR method using a numerical example.
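
    Of the traditional methods named, logistic regression is the simplest to illustrate. A minimal sketch on hypothetical pass/fail-versus-age data (the RADAR method itself is not reproduced here):

    ```python
    # Logistic regression of pass/fail outcomes on unit age, one of the
    # traditional methods compared above.  Data are simulated, not the
    # paper's; the RADAR method is not implemented here.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    age = rng.uniform(0, 20, size=200)               # years since build
    p_fail = 1 / (1 + np.exp(-(-4.0 + 0.15 * age)))  # assumed degradation
    passed = (rng.uniform(size=200) > p_fail).astype(int)

    model = sm.Logit(passed, sm.add_constant(age)).fit(disp=False)
    print(model.params)     # intercept and age coefficient
    print(model.conf_int()) # uncertainty bounds on the reliability trend
    ```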

  14. T-tubule disease: Relationship between t-tubule organization and regional contractile performance in human dilated cardiomyopathy.

    PubMed

    Crossman, David J; Young, Alistair A; Ruygrok, Peter N; Nason, Guy P; Baddelely, David; Soeller, Christian; Cannell, Mark B

    2015-07-01

    Evidence from animal models suggests that t-tubule changes may play an important role in the contractile deficit associated with heart failure. However, samples are usually taken at random, with no regard to the regional variability present in failing hearts, which leads to uncertainty in the relationship between contractile performance and possible t-tubule derangement. Regional contraction in human hearts was measured by tagged cine MRI and model fitting. At transplant, failing hearts were biopsy sampled in identified regions and immunocytochemistry was used to label t-tubules and sarcomeric z-lines. Computer image analysis was used to assess 5 different unbiased measures of t-tubule structure/organization. In regions of failing hearts that showed good contractile performance, t-tubule organization was similar to that seen in normal hearts, with worsening structure correlating with the loss of regional contractile performance. Statistical analysis showed that t-tubule direction was most highly correlated with local contractile performance, followed by the amplitude of the sarcomeric peak in the Fourier transform of the t-tubule image. Other area based measures were less well correlated. We conclude that regional contractile performance in failing human hearts is strongly correlated with the local t-tubule organization. Cluster tree analysis with a functional definition of failing contraction strength allowed a pathological definition of 't-tubule disease'. The regional variability in contractile performance and cellular structure is a confounding issue for analysis of samples taken from failing human hearts, although this may be overcome with regional analysis by using tagged cMRI and biopsy mapping. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. U. S. Acquisition Cost Reduction and Avoidance Due to Foreign Military Sales

    DTIC Science & Technology

    2016-05-25

    Excerpt: "…delinquencies resulting in forfeiture, regulatory non-compliance, and possible misunderstandings, misrepresentations that might cause a business deal to fail…" Cited sources include Knoema datasets on ongoing armed conflicts (2014-2015), ease of doing business, and world and regional statistics.

  16. Extreme value statistics analysis of fracture strengths of a sintered silicon nitride failing from pores

    NASA Technical Reports Server (NTRS)

    Chao, Luen-Yuan; Shetty, Dinesh K.

    1992-01-01

    Statistical analysis and correlation between pore-size distribution and fracture strength distribution using the theory of extreme-value statistics is presented for a sintered silicon nitride. The pore-size distribution on a polished surface of this material was characterized, using an automatic optical image analyzer. The distribution measured on the two-dimensional plane surface was transformed to a population (volume) distribution, using the Schwartz-Saltykov diameter method. The population pore-size distribution and the distribution of the pore size at the fracture origin were correlated by extreme-value statistics. Fracture strength distribution was then predicted from the extreme-value pore-size distribution, using a linear elastic fracture mechanics model of an annular crack around a pore and the fracture toughness of the ceramic. The predicted strength distribution was in good agreement with strength measurements in bending. In particular, the extreme-value statistics analysis explained the nonlinear trend in the linearized Weibull plot of measured strengths without postulating a lower-bound strength.
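
    The chain of reasoning — largest pore per specimen, then strength via the fracture mechanics relation sigma_f = K_Ic / (Y * sqrt(pi * a)) — can be sketched by Monte Carlo. All parameter values below are illustrative placeholders, not the paper's measured distributions:

    ```python
    # Extreme-value pore sizes -> predicted strengths.  K_Ic, Y and the
    # pore-size distribution are hypothetical, not the paper's values.
    import numpy as np

    K_IC = 5.0e6    # fracture toughness, Pa*sqrt(m)
    Y = 1.3         # geometry factor for an annular crack around a pore
    PORES_PER_SPECIMEN = 10_000

    rng = np.random.default_rng(2)
    # pore radii in each of 1000 specimens, then the largest per specimen
    radii = rng.lognormal(mean=np.log(5e-6), sigma=0.5,
                          size=(1000, PORES_PER_SPECIMEN))
    critical = radii.max(axis=1)                       # extreme-value sample
    strength = K_IC / (Y * np.sqrt(np.pi * critical))  # sigma_f per specimen
    print(np.percentile(strength, [10, 50, 90]) / 1e6, "MPa")
    ```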

  17. Season of Birth in Autism: A Fiction Revisited.

    ERIC Educational Resources Information Center

    Landau, Edwina C.; Cicchetti, Domenic V.; Klin, Ami; Volkmar, Fred R.

    1999-01-01

    This study attempted to replicate previously reported increases in birth rates in March and August for individuals with autism. Statistical analysis of 904 cases revealed no significant seasonal effect. Samples were subcategorized into verbal and mute groups and again results failed to support the seasonal hypothesis. (Author/DB)

  18. Coupling strength assumption in statistical energy analysis

    PubMed Central

    Lafont, T.; Totaro, N.

    2017-01-01

    This paper is a discussion of the hypothesis of weak coupling in statistical energy analysis (SEA). The examples of coupled oscillators and statistical ensembles of coupled plates excited by broadband random forces are discussed. In each case, a reference calculation is compared with the SEA calculation. First, it is shown that the main SEA relation, the coupling power proportionality, is always valid for two oscillators irrespective of the coupling strength. But the case of three subsystems, consisting of oscillators or ensembles of plates, indicates that the coupling power proportionality fails when the coupling is strong. Strong coupling leads to non-zero indirect coupling loss factors and, sometimes, even to a reversal of the energy flow direction from low to high vibrational temperature. PMID:28484335
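
    The coupling power proportionality under discussion states that the power flowing between two subsystems is proportional to the difference of their modal energies. A minimal two-subsystem power-balance sketch, with hypothetical loss factors and input powers:

    ```python
    # Two-subsystem SEA power balance: the net coupled power is
    # P12 = omega*(eta12*E1 - eta21*E2).  All values are illustrative.
    import numpy as np

    omega = 2 * np.pi * 1000     # rad/s, band centre frequency
    eta1, eta2 = 0.01, 0.02      # damping loss factors
    eta12, eta21 = 0.003, 0.001  # coupling loss factors (reciprocity:
                                 #   eta12 * n1 == eta21 * n2)
    P_in = np.array([1.0, 0.0])  # watts injected into each subsystem

    A = omega * np.array([[eta1 + eta12, -eta21],
                          [-eta12,       eta2 + eta21]])
    E1, E2 = np.linalg.solve(A, P_in)  # steady-state subsystem energies
    print(E1, E2)
    ```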

  19. Repeat Urethroplasty After Failed Urethral Reconstruction: Outcome Analysis of 130 Patients

    PubMed Central

    Blaschko, Sarah D.; McAninch, Jack W.; Myers, Jeremy B.; Schlomer, Bruce J.; Breyer, Benjamin N.

    2013-01-01

    Purpose: Male urethral stricture disease accounts for a significant number of hospital admissions and health care expenditures. Although much research has been completed on treatment for urethral strictures, fewer studies have addressed the treatment of strictures in men with recurrent stricture disease after failed prior urethroplasty. We examined outcome results for repeat urethroplasty. Materials and Methods: A prospectively collected, single surgeon urethroplasty database was queried from 1977 to 2011 for patients treated with repeat urethroplasty after failed prior urethral reconstruction. Stricture length and location, and repeat urethroplasty intervention and failure were evaluated with descriptive statistics, and univariate and multivariate logistic regression. Results: Of 1,156 cases, 168 patients underwent repeat urethroplasty after at least 1 failed prior urethroplasty. Of these patients 130 had a followup of 6 months or more and were included in analysis. Median patient age was 44 years (range 11 to 75). Median followup was 55 months (range 6 months to 20.75 years). Overall, 102 of 130 patients (78%) were successfully treated. For patients with failure, median time to failure was 17 months (range 7 months to 16.8 years). Two or more failed prior urethroplasties and comorbidities associated with urethral stricture disease were associated with an increased risk of repeat urethroplasty failure. Conclusions: Repeat urethroplasty is a successful treatment option. Patients in whom treatment failed had longer strictures and more complex repairs. PMID:23083654

  20. Use of check lists in assessing the statistical content of medical studies.

    PubMed Central

    Gardner, M J; Machin, D; Campbell, M J

    1986-01-01

    Two check lists are used routinely in the statistical assessment of manuscripts submitted to the "BMJ." One is for papers of a general nature and the other specifically for reports on clinical trials. Each check list includes questions on the design, conduct, analysis, and presentation of studies, and answers to these contribute to the overall statistical evaluation. Only a small proportion of submitted papers are assessed statistically, and these are selected at the refereeing or editorial stage. Examination of the use of the check lists showed that most papers contained statistical failings, many of which could easily be remedied. It is recommended that the check lists should be used by statistical referees, editorial staff, and authors and also during the design stage of studies. PMID:3082452

  21. Sister chromatid exchanges and micronuclei analysis in lymphocytes of men exposed to simazine through drinking water.

    PubMed

    Suárez, Susanna; Rubio, Arantxa; Sueiro, Rosa Ana; Garrido, Joaquín

    2003-06-06

    In some cities of the autonomous community of Extremadura (south-west of Spain), levels of simazine from 10 to 30 ppm were detected in tap water. To analyse the possible effect of this herbicide, two biomarkers, sister chromatid exchanges (SCE) and micronuclei (MN), were used in peripheral blood lymphocytes from males exposed to simazine through drinking water. SCE and MN analysis failed to detect any statistically significant increase in the people exposed to simazine when compared with the controls. With respect to high frequency cells (HFC), a statistically significant difference was detected between exposed and control groups.

  22. Use of a statistical model of the whole femur in a large scale, multi-model study of femoral neck fracture risk.

    PubMed

    Bryan, Rebecca; Nair, Prasanth B; Taylor, Mark

    2009-09-18

    Interpatient variability is often overlooked in orthopaedic computational studies due to the substantial challenges involved in sourcing and generating large numbers of bone models. A statistical model of the whole femur incorporating both geometric and material property variation was developed as a potential solution to this problem. The statistical model was constructed using principal component analysis, applied to 21 individual computed tomography scans. To test the ability of the statistical model to generate realistic, unique, finite element (FE) femur models it was used as a source of 1000 femurs to drive a study on femoral neck fracture risk. The study simulated the impact of an oblique fall to the side, a scenario known to account for a large proportion of hip fractures in the elderly and have a lower fracture load than alternative loading approaches. FE model generation, application of subject specific loading and boundary conditions, FE processing and post processing of the solutions were completed automatically. The generated models were within the bounds of the training data used to create the statistical model with a high mesh quality, able to be used directly by the FE solver without remeshing. The results indicated that 28 of the 1000 femurs were at highest risk of fracture. Closer analysis revealed the percentage of cortical bone in the proximal femur to be a crucial differentiator between the failed and non-failed groups. The likely fracture location was indicated to be intertrochanteric. Comparison to previous computational, clinical and experimental work revealed support for these findings.
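
    The generator described — principal component analysis over training geometries, then sampling of mode weights — can be sketched as follows; the training matrix here is a synthetic stand-in for the 21 CT-derived femur models:

    ```python
    # PCA-based statistical shape model: new instances = mean + weighted
    # modes.  Training shapes below are random placeholders.
    import numpy as np

    rng = np.random.default_rng(3)
    n_train, n_dof = 21, 3000              # e.g. 1000 nodes x (x, y, z)
    X = rng.normal(size=(n_train, n_dof))  # placeholder training shapes

    mean = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
    eigvals = s ** 2 / (n_train - 1)       # variance captured by each mode

    def sample_femur(n_modes=10):
        b = rng.normal(size=n_modes) * np.sqrt(eigvals[:n_modes])
        return mean + b @ Vt[:n_modes]     # one new, unique shape vector

    print(sample_femur().shape)            # (3000,)
    ```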

  23. An analysis of the relationship of seven selected variables to State Board Test Pool Examination performance of the University of Tennessee, Knoxville, College of Nursing.

    PubMed

    Sharp, T G

    1984-02-01

    The study was designed to determine whether any one of seven selected variables or a combination of the variables is predictive of performance on the State Board Test Pool Examination. The selected variables studied were: high school grade point average (HSGPA), The University of Tennessee, Knoxville, College of Nursing grade point average (GPA), and American College Test Assessment (ACT) standard scores (English, ENG; mathematics, MA; social studies, SS; natural sciences, NSC; composite, COMP). Data utilized were from graduates of the baccalaureate program of The University of Tennessee, Knoxville, College of Nursing from 1974 through 1979. The sample of 322 was selected from a total population of 572. The Statistical Analysis System (SAS) was used to analyze the predictive relationship of each of the seven selected variables to State Board Test Pool Examination performance (result of pass or fail); a stepwise discriminant analysis was used to determine the predictive relationship of the strongest combination of the independent variables to overall State Board Test Pool Examination performance (result of pass or fail); and a stepwise multiple regression analysis was used to determine the strongest predictive combination of selected variables for each of the five subexams of the State Board Test Pool Examination. The selected variables were each found to be predictive of SBTPE performance (result of pass or fail). The strongest combination for predicting SBTPE performance (result of pass or fail) was found to be GPA, MA, and NSC.

  24. Setting and validating the pass/fail score for the NBDHE.

    PubMed

    Tsai, Tsung-Hsun; Dixon, Barbara Leatherman

    2013-04-01

    This report describes the overall process used for setting the pass/fail score for the National Board Dental Hygiene Examination (NBDHE). The Objective Standard Setting (OSS) method was used for setting the pass/fail score for the NBDHE. The OSS method requires a panel of experts to determine the criterion items and proportion of these items that minimally competent candidates would answer correctly, the percentage of mastery and the confidence level of the error band. A panel of 11 experts was selected by the Joint Commission on National Dental Examinations (Joint Commission). Panel members represented geographic distribution across the U.S. and had the following characteristics: full-time dental hygiene practitioners with experience in areas of preventive, periodontal, geriatric and special needs care, and full-time dental hygiene educators with experience in areas of scientific basis for dental hygiene practice, provision of clinical dental hygiene services and community health/research principles. Utilizing the expert panel's judgments, the pass/fail score was set and then the score scale was established using the Rasch measurement model. Statistical and psychometric analysis shows the actual failure rate and the OSS failure rate are reasonably consistent (2.4% vs. 2.8%). The analysis also showed the lowest error of measurement, an index of the precision at the pass/fail score point and that the highest reliability (0.97) are achieved at the pass/fail score point. The pass/fail score is a valid guide for making decisions about candidates for dental hygiene licensure. This new standard was reviewed and approved by the Joint Commission and was implemented beginning in 2011.
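
    The Rasch model mentioned above relates the probability of a correct answer to the gap between candidate ability and item difficulty. A sketch with hypothetical item difficulties and cut score:

    ```python
    # Rasch model: P(correct) depends only on theta - b.  The item
    # difficulties and cut score below are hypothetical.
    import numpy as np

    def rasch_p(theta, b):
        return 1.0 / (1.0 + np.exp(-(theta - b)))

    difficulties = np.array([-1.0, -0.3, 0.2, 0.8, 1.5])  # item bank
    theta_cut = 0.4                                       # pass/fail point
    expected = rasch_p(theta_cut, difficulties).sum()
    print(f"expected score at the cut: {expected:.2f} / {len(difficulties)}")
    ```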

  25. Compositional Solution Space Quantification for Probabilistic Software Analysis

    NASA Technical Reports Server (NTRS)

    Borges, Mateus; Pasareanu, Corina S.; Filieri, Antonio; d'Amorim, Marcelo; Visser, Willem

    2014-01-01

    Probabilistic software analysis aims at quantifying how likely a target event is to occur during program execution. Current approaches rely on symbolic execution to identify the conditions to reach the target event and try to quantify the fraction of the input domain satisfying these conditions. Precise quantification is usually limited to linear constraints, while only approximate solutions can be provided in general through statistical approaches. However, statistical approaches may fail to converge to an acceptable accuracy within a reasonable time. We present a compositional statistical approach for the efficient quantification of solution spaces for arbitrarily complex constraints over bounded floating-point domains. The approach leverages interval constraint propagation to improve the accuracy of the estimation by focusing the sampling on the regions of the input domain containing the sought solutions. Preliminary experiments show significant improvement on previous approaches both in results accuracy and analysis time.

  26. Physics Education: A Significant Backbone of Sustainable Development in Developing Countries

    NASA Astrophysics Data System (ADS)

    Akintola, R. A.

    2006-08-01

    In the quest for technological self-reliance, many policies, programs and projects have been proposed and implemented in order to procure solutions to the problems of technological inadequacies of developing countries. It has been observed that all these failed. This research identifies the problems and proposes lasting solutions to emancipate physics education in developing nations and highlight possible future gains. The statistical analysis employed was based on questionnaires, interviews and data analysis.

  27. Flexible statistical modelling detects clinical functional magnetic resonance imaging activation in partially compliant subjects.

    PubMed

    Waites, Anthony B; Mannfolk, Peter; Shaw, Marnie E; Olsrud, Johan; Jackson, Graeme D

    2007-02-01

    Clinical functional magnetic resonance imaging (fMRI) occasionally fails to detect significant activation, often due to variability in task performance. The present study seeks to test whether a more flexible statistical analysis can better detect activation, by accounting for variance associated with variable compliance to the task over time. Experimental results and simulated data both confirm that even at 80% compliance to the task, such a flexible model outperforms standard statistical analysis when assessed using the extent of activation (experimental data), goodness of fit (experimental data), and area under the operator characteristic curve (simulated data). Furthermore, retrospective examination of 14 clinical fMRI examinations reveals that in patients where the standard statistical approach yields activation, there is a measurable gain in model performance in adopting the flexible statistical model, with little or no penalty in lost sensitivity. This indicates that a flexible model should be considered, particularly for clinical patients who may have difficulty complying fully with the study task.

  28. Twice random, once mixed: applying mixed models to simultaneously analyze random effects of language and participants.

    PubMed

    Janssen, Dirk P

    2012-03-01

    Psychologists, psycholinguists, and other researchers using language stimuli have been struggling for more than 30 years with the problem of how to analyze experimental data that contain two crossed random effects (items and participants). The classical analysis of variance does not apply; alternatives have been proposed but have failed to catch on, and a statistically unsatisfactory procedure of using two approximations (known as F(1) and F(2)) has become the standard. A simple and elegant solution using mixed model analysis has been available for 15 years, and recent improvements in statistical software have made mixed models analysis widely available. The aim of this article is to increase the use of mixed models by giving a concise practical introduction and by giving clear directions for undertaking the analysis in the most popular statistical packages. The article also introduces the DJMIXED add-on package for SPSS, which makes entering the models and reporting their results as straightforward as possible.
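
    As an illustration of the crossed-random-effects idea outside the article's target software (DJMIXED is an SPSS add-on), statsmodels in Python can fit participants and items as crossed variance components by treating the whole data set as one group. The data frame and its column names (rt, condition, subject, item) are hypothetical:

    ```python
    # Crossed random effects for subjects and items via variance
    # components in statsmodels MixedLM; data are simulated.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(4)
    subjects, items = 30, 20
    df = pd.DataFrame([(s, i) for s in range(subjects) for i in range(items)],
                      columns=["subject", "item"])
    df["condition"] = rng.integers(0, 2, len(df))
    subj_re = rng.normal(0, 30, subjects)[df["subject"]]
    item_re = rng.normal(0, 20, items)[df["item"]]
    df["rt"] = (500 + 25 * df["condition"] + subj_re + item_re
                + rng.normal(0, 50, len(df)))

    # one dummy group => both factors enter as crossed variance components
    model = smf.mixedlm("rt ~ condition", df,
                        groups=np.ones(len(df)),
                        vc_formula={"subject": "0 + C(subject)",
                                    "item": "0 + C(item)"})
    print(model.fit().summary())
    ```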

  29. The other half of the story: effect size analysis in quantitative research.

    PubMed

    Maher, Jessica Middlemis; Markey, Jonathan C; Ebert-May, Diane

    2013-01-01

    Statistical significance testing is the cornerstone of quantitative research, but studies that fail to report measures of effect size are potentially missing a robust part of the analysis. We provide a rationale for why effect size measures should be included in quantitative discipline-based education research. Examples from both biological and educational research demonstrate the utility of effect size for evaluating practical significance. We also provide details about some effect size indices that are paired with common statistical significance tests used in educational research and offer general suggestions for interpreting effect size measures. Finally, we discuss some inherent limitations of effect size measures and provide further recommendations about reporting confidence intervals.
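
    One of the indices commonly paired with a two-sample t test is Cohen's d. A minimal sketch with simulated scores:

    ```python
    # Cohen's d with pooled standard deviation (data are simulated).
    import numpy as np

    def cohens_d(a, b):
        na, nb = len(a), len(b)
        pooled = np.sqrt(((na - 1) * np.var(a, ddof=1) +
                          (nb - 1) * np.var(b, ddof=1)) / (na + nb - 2))
        return (np.mean(a) - np.mean(b)) / pooled

    rng = np.random.default_rng(5)
    treatment = rng.normal(72, 10, 40)  # e.g. post-test scores
    control = rng.normal(65, 10, 40)
    print(f"d = {cohens_d(treatment, control):.2f}")  # ~0.7, medium-large
    ```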

  30. Feminist identity as a predictor of eating disorder diagnostic status.

    PubMed

    Green, Melinda A; Scott, Norman A; Riopel, Cori M; Skaggs, Anna K

    2008-06-01

    Passive Acceptance (PA) and Active Commitment (AC) subscales of the Feminist Identity Development Scale (FIDS) were examined as predictors of eating disorder diagnostic status as assessed by the Questionnaire for Eating Disorder Diagnoses (Q-EDD). Results of a hierarchical regression analysis revealed PA and AC scores were not statistically significant predictors of ED diagnostic status after controlling for diagnostic subtype. Results of a multiple regression analysis revealed FIDS as a statistically significant predictor of ED diagnostic status when failing to control for ED diagnostic subtype. Discrepancies suggest ED diagnostic subtype may serve as a moderator variable in the relationship between ED diagnostic status and FIDS. (c) 2008 Wiley Periodicals, Inc.

  31. Comparison of the statistics of Salmonella testing of chilled broiler chicken carcasses by whole carcass rinse and neck skin excision

    USDA-ARS?s Scientific Manuscript database

    Whether a required Salmonella test series is passed or failed depends not only on the presence of the bacteria, but also on the methods for taking samples, the methods for culturing samples, and the statistics associated with the sampling plan. The pass-fail probabilities of the two-class attribute...
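
    The pass-fail probability of a two-class attributes plan is a binomial tail: a series passes if at most c of n samples test positive. A sketch; the plan parameters and prevalence values below are illustrative, not taken from the manuscript:

    ```python
    # P(pass) for a two-class attributes sampling plan: pass if at most
    # c of n samples are Salmonella-positive.  n, c and the prevalence
    # values are illustrative assumptions.
    from scipy.stats import binom

    n, c = 51, 12                    # samples per series, allowed positives
    for p in (0.10, 0.20, 0.30):     # true positive rate per sample
        print(f"p={p:.2f}  P(pass) = {binom.cdf(c, n, p):.3f}")
    ```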

  32. CMM Data Analysis Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Due to the increase in the use of Coordinate Measuring Machines (CMMs) to measure fine details and complex geometries in manufacturing, many programs have been made to compile and analyze the data. These programs typically require extensive setup to determine the expected results in order to not only track the pass/fail of a dimension, but also to use statistical process control (SPC). These extra steps and setup times have been addressed through the CMM Data Analysis Tool, which only requires the output of the CMM to provide both pass/fail analysis on all parts run to the same inspection program as well as graphs which help visualize where the part measures within the allowed tolerances. This provides feedback not only to the customer for approval of a part during development, but also to machining process engineers to identify when any dimension is drifting towards an out of tolerance condition during production. This program can handle hundreds of parts with complex dimensions and will provide an analysis within minutes.

  33. Statistical, Graphical, and Learning Methods for Sensing, Surveillance, and Navigation Systems

    DTIC Science & Technology

    2016-06-28

    Excerpt: "…harsh propagation environments. Conventional filtering techniques fail to provide satisfactory performance in many important nonlinear or non-Gaussian scenarios. In addition, there is a lack of a unified methodology for the design and analysis of different filtering techniques. To address these problems, we have proposed a new filtering methodology called belief condensation (BC)." (Distribution A: approved for public release)

  34. Clinical and histopathological factors affecting failed sentinel node localization in axillary staging for breast cancer.

    PubMed

    Dordea, Matei; Colvin, Hugh; Cox, Phil; Pujol Nicolas, Andrea; Kanakala, Venkat; Iwuchukwu, Obi

    2013-04-01

    Sentinel lymph node biopsy (SLNB) has become the standard of care in axillary staging of clinically node-negative breast cancer patients. To analyze reasons for failure of SLN localization by means of a multivariate analysis of clinical and histopathological factors. We performed a review of 164 consecutive breast cancer patients who underwent SLNB. A superficial injection technique was used. 9/164 patients failed to show nodes. In 7/9 patients no evidence of radioactivity or blue dye was observed. Age and nodal status were the only statistically significant factors (p < 0.05). For every unit increase in age there was a 9% reduced chance of failed SLN localization. Patients with negative nodal status have 90% reduced risk of failed sentinel node localization than patients with macro or extra capsular nodal invasion. The results suggest that altered lymphatic dynamics secondary to tumour burden may play a role in failed sentinel node localization. We showed that in all failed localizations the radiocolloid persisted around the injection site, showing limited local diffusion only. While clinical and histopathological data may provide some clues as to why sentinel node localization fails, we further hypothesize that integrity of peri-areolar lymphatics is important for successful localization. Copyright © 2012 Royal College of Surgeons of Edinburgh (Scottish charity number SC005317) and Royal College of Surgeons in Ireland. Published by Elsevier Ltd. All rights reserved.

  35. An experimental evaluation of the Sternberg task as a workload metric for helicopter Flight Handling Qualities (FHQ) research

    NASA Technical Reports Server (NTRS)

    Hemingway, J. C.

    1984-01-01

    The objective was to determine whether the Sternberg item-recognition task, employed as a secondary task measure of spare mental capacity for flight handling qualities (FHQ) simulation research, could help to differentiate between different flight-control conditions. FHQ evaluations were conducted on the Vertical Motion Simulator at Ames Research Center to investigate different primary flight-control configurations, and selected stability and control augmentation levels for helicopters engaged in low-level flight regimes. The Sternberg task was superimposed upon the primary flight-control task in a balanced experimental design. The results of parametric statistical analysis of Sternberg secondary task data failed to support the continued use of this task as a measure of pilot workload. In addition to the secondary task, subjects provided Cooper-Harper pilot ratings (CHPR) and responded to a workload questionnaire. The CHPR data also failed to provide reliable statistical discrimination between FHQ treatment conditions; some insight into the behavior of the secondary task was gained from the workload questionnaire data.

  36. The Sternberg Task as a Workload Metric in Flight Handling Qualities Research

    NASA Technical Reports Server (NTRS)

    Hemingway, J. C.

    1984-01-01

    The objective of this research was to determine whether the Sternberg item-recognition task, employed as a secondary task measure of spare mental capacity for flight handling qualities (FHQ) simulation research, could help to differentiate between different flight-control conditions. FHQ evaluations were conducted on the Vertical Motion Simulator at Ames Research Center to investigate different primary flight-control configurations, and selected stability and control augmentation levels for helicopters engaged in low-level flight regimes. The Sternberg task was superimposed upon the primary flight-control task in a balanced experimental design. The results of parametric statistical analysis of Sternberg secondary task data failed to support the continued use of this task as a measure of pilot workload. In addition to the secondary task, subjects provided Cooper-Harper pilot ratings (CHPR) and responded to a workload questionnaire. The CHPR data also failed to provide reliable statistical discrimination between FHQ treatment conditions; some insight into the behavior of the secondary task was gained from the workload questionnaire data.

  37. Prediction of the Electromagnetic Field Distribution in a Typical Aircraft Using the Statistical Energy Analysis

    NASA Astrophysics Data System (ADS)

    Kovalevsky, Louis; Langley, Robin S.; Caro, Stephane

    2016-05-01

    Due to the high cost of experimental EMI measurements, significant attention has been focused on numerical simulation. Classical methods such as the Method of Moments or Finite Difference Time Domain are not well suited for this type of problem, as they require a fine discretisation of space and fail to take uncertainties into account. In this paper, the authors show that the Statistical Energy Analysis (SEA) is well suited for this type of application. The SEA is a statistical approach employed to solve high frequency problems of electromagnetically reverberant cavities at a reduced computational cost. The key aspects of this approach are (i) to consider an ensemble of systems that share the same gross parameters, and (ii) to avoid solving Maxwell's equations inside the cavity, using the power balance principle. The output is an estimate of the field magnitude distribution in each cavity. The method is applied to a typical aircraft structure.

  38. Should English healthcare providers be penalised for failing to collect patient-reported outcome measures? A retrospective analysis

    PubMed Central

    Gutacker, Nils; Street, Andrew; Gomes, Manuel; Bojke, Chris

    2015-01-01

    Objective: The best practice tariff for hip and knee replacement in the English National Health Service (NHS) rewards providers based on improvements in patient-reported outcome measures (PROMs) collected before and after surgery. Providers only receive a bonus if at least 50% of their patients complete the preoperative questionnaire. We determined how many providers failed to meet this threshold prior to the policy introduction and assessed longitudinal stability of participation rates. Design: Retrospective observational study using data from Hospital Episode Statistics and the national PROM programme from April 2009 to March 2012. We calculated participation rates based on either (a) all PROM records or (b) only those that could be linked to inpatient records; constructed confidence intervals around rates to account for sampling variation; applied precision weighting to allow for volume; and applied risk adjustment. Setting: NHS hospitals and private providers in England. Participants: NHS patients undergoing elective unilateral hip and knee replacement surgery. Main outcome measures: Number of providers with participation rates statistically significantly below 50%. Results: Crude rates identified many providers that failed to achieve the 50% threshold but there were substantially fewer after adjusting for uncertainty and precision. While important, risk adjustment required restricting the analysis to linked data. Year-on-year correlation between provider participation rates was moderate. Conclusions: Participation rates have improved over time and only a small number of providers now fall below the threshold, but administering preoperative questionnaires remains problematic in some providers. We recommend that participation rates are based on linked data and take into account sampling variation. PMID:25827906

  39. Arthroscopic lysis of adhesions for the stiff total knee: results after failed manipulation.

    PubMed

    Tjoumakaris, Fotios Paul; Tucker, Bradford Schofield; Post, Zachary; Pepe, Matthew David; Orozco, Fabio; Ong, Alvin C

    2014-05-01

    Arthrofibrosis after total knee arthroplasty (TKA) is a potentially devastating complication, resulting in loss of motion and function and residual pain. For patients in whom aggressive physical therapy and manipulation under anesthesia fail, lysis of adhesions may be the only option to rescue the stiff TKA. The purpose of this study is to report the results of arthroscopic lysis of adhesions after failed manipulation for a stiff, cruciate-substituting TKA. This retrospective study evaluated patients who had undergone arthroscopic lysis of adhesions for arthrofibrosis after TKA between 2007 and 2011. Minimum follow-up was 12 months (average, 31 months). Average total range of motion of patients in this series was 62.3°. Average preoperative flexion contracture was 16° and average flexion was 78.6°. Statistical analysis was performed using Student's t test. Pre- to postoperative increase in range of motion was significant (P<.001) (average, 62° preoperatively to 98° postoperatively). Average preoperative extension deficit was 16°, which was reduced to 4° at final follow-up. This value was also found to be statistically significant (P<.0001). With regard to ultimate flexion attained, average preoperative flexion was 79°, which was improved to 103° at final follow-up. This improvement in flexion was statistically significant (P<.0001). Patients can reliably expect an improvement after arthroscopic lysis of adhesions for a stiff TKA using a standardized arthroscopic approach; however, patients achieved approximately half of the improvement that was obtained at the time of surgery. Copyright 2014, SLACK Incorporated.

  40. Assessment of statistical methods used in library-based approaches to microbial source tracking.

    PubMed

    Ritter, Kerry J; Carruthers, Ethan; Carson, C Andrew; Ellender, R D; Harwood, Valerie J; Kingsley, Kyle; Nakatsu, Cindy; Sadowsky, Michael; Shear, Brian; West, Brian; Whitlock, John E; Wiggins, Bruce A; Wilbur, Jayson D

    2003-12-01

    Several commonly used statistical methods for fingerprint identification in microbial source tracking (MST) were examined to assess the effectiveness of pattern-matching algorithms to correctly identify sources. Although numerous statistical methods have been employed for source identification, no widespread consensus exists as to which is most appropriate. A large-scale comparison of several MST methods, using identical fecal sources, presented a unique opportunity to assess the utility of several popular statistical methods. These included discriminant analysis, nearest neighbour analysis, maximum similarity and average similarity, along with several measures of distance or similarity. Threshold criteria for excluding uncertain or poorly matched isolates from final analysis were also examined for their ability to reduce false positives and increase prediction success. Six independent libraries used in the study were constructed from indicator bacteria isolated from fecal materials of humans, seagulls, cows and dogs. Three of these libraries were constructed using the rep-PCR technique and three relied on antibiotic resistance analysis (ARA). Five of the libraries were constructed using Escherichia coli and one using Enterococcus spp. (ARA). Overall, the outcome of this study suggests a high degree of variability across statistical methods. Despite large differences in correct classification rates among the statistical methods, no single statistical approach emerged as superior. Thresholds failed to consistently increase rates of correct classification and improvement was often associated with substantial effective sample size reduction. Recommendations are provided to aid in selecting appropriate analyses for these types of data.

  41. A Retrospective Analysis of Dental Implants Replacing Failed Implants in Grafted Maxillary Sinus: A Case Series.

    PubMed

    Manor, Yifat; Chaushu, Gavriel; Lorean, Adi; Mijiritzky, Eithan

    2015-01-01

    To evaluate the survival rate of dental implants replacing failed implants in grafted maxillary sinuses using the lateral approach vs nongrafted posterior maxillae. A retrospective analysis was conducted to study the survival of secondary dental implants inserted in the posterior maxilla in previously failed implant sites between the years 2000 and 2010. The study group consisted of patients who had also undergone maxillary sinus augmentation, and the control group consisted of patients in whom implants in the posterior maxilla had failed. Clinical and demographic data were analyzed using a structured form. Seventy-five patients with a total of 75 replaced implants were included in the study. The study group comprised 40 patients and the control group, 35 patients. None of the replaced implants in the study group failed, resulting in an overall survival of 100%; three replaced implants in the control group failed (92% survival). The main reason for the primary implant removal was lack of osseointegration (35 [87.5%] of 40 study group implants and 23 [65.7%] of 35 control group implants [P = .027]). The difference between the groups with regard to the timing of primary implant failure was statistically significant. The study group had more early failures of the primary implant than did the control group (77% vs 62%; P = .038). Dental implants replaced in the posterior maxilla had a high survival rate. A higher rate of survival was found in augmented maxillary sinus sites. Within the limits of the present study, it can be concluded that previous implant failures in the grafted maxillary sinus should not discourage practitioners from a second attempt.

  42. Statistical analysis of field data for aircraft warranties

    NASA Astrophysics Data System (ADS)

    Lakey, Mary J.

    Air Force and Navy maintenance data collection systems were researched to determine their scientific applicability to the warranty process. New and unique algorithms were developed to extract failure distributions which were then used to characterize how selected families of equipment typically fail. Families of similar equipment were identified in terms of function, technology and failure patterns. Statistical analyses and applications such as goodness-of-fit tests, maximum likelihood estimation and derivation of confidence intervals for the probability density function parameters were applied to characterize the distributions and their failure patterns. Statistical and reliability theory, with relevance to equipment design and operational failures, were also determining factors in characterizing the failure patterns of the equipment families.
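
    One concrete instance of the extraction described is a maximum likelihood Weibull fit to failure ages; a shape parameter above 1 indicates wear-out. The failure ages below are simulated, not Air Force or Navy data:

    ```python
    # Maximum likelihood Weibull fit to (simulated) failure ages.
    import numpy as np
    from scipy.stats import weibull_min

    rng = np.random.default_rng(6)
    failure_ages = weibull_min.rvs(1.8, scale=1200, size=80, random_state=rng)

    shape, loc, scale = weibull_min.fit(failure_ages, floc=0)
    print(f"shape={shape:.2f}  scale={scale:.0f} hours")  # shape > 1: wear-out
    ```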

  43. A primer on the study of transitory dynamics in ecological series using the scale-dependent correlation analysis.

    PubMed

    Rodríguez-Arias, Miquel Angel; Rodó, Xavier

    2004-03-01

    Here we describe a practical, step-by-step primer to scale-dependent correlation (SDC) analysis. The analysis of transitory processes is an important but often neglected topic in ecological studies because only a few statistical techniques appear to detect temporary features accurately enough. We introduce here the SDC analysis, a statistical and graphical method to study transitory processes at any temporal or spatial scale. SDC analysis, thanks to the combination of conventional procedures and simple well-known statistical techniques, becomes an improved time-domain analogue of wavelet analysis. We use several simple synthetic series to describe the method, a more complex example, full of transitory features, to compare SDC and wavelet analysis, and finally we analyze some selected ecological series to illustrate the methodology. The SDC analysis of time series of copepod abundances in the North Sea indicates that ENSO primarily is the main climatic driver of short-term changes in population dynamics. SDC also uncovers some long-term, unexpected features in the population. Similarly, the SDC analysis of Nicholson's blowflies data locates where the proposed models fail and provides new insights about the mechanism that drives the apparent vanishing of the population cycle during the second half of the series.
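
    The core computation of an SDC analysis is a correlation evaluated in every window of a chosen scale, so that transitory couplings stand out. A minimal sketch on synthetic series containing one coupled episode:

    ```python
    # Windowed (scale-dependent) correlation of two series; the coupled
    # episode at samples 100-140 is synthetic.
    import numpy as np

    def sdc(x, y, s):
        """Correlation of x and y in every window of length s."""
        return np.array([np.corrcoef(x[i:i + s], y[i:i + s])[0, 1]
                         for i in range(len(x) - s + 1)])

    rng = np.random.default_rng(7)
    x = rng.normal(size=300)
    y = rng.normal(size=300)
    y[100:140] += 2 * x[100:140]        # a transitory coupled episode
    r = sdc(x, y, s=25)
    print(int(r.argmax()), round(float(r.max()), 2))  # peak inside 100-140
    ```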

  8. Statistical Requirements For Pass-Fail Testing Of Contraband Detection Systems

    NASA Astrophysics Data System (ADS)

    Gilliam, David M.

    2011-06-01

    Contraband detection systems for homeland security applications are typically tested for probability of detection (PD) and probability of false alarm (PFA) using pass-fail testing protocols. Test protocols usually require specified values for PD and PFA to be demonstrated at a specified level of statistical confidence CL. Based on a recent more theoretical treatment of this subject [1], this summary reviews the definition of CL and provides formulas and spreadsheet functions for constructing tables of general test requirements and for determining the minimum number of tests required. The formulas and tables in this article may be generally applied to many other applications of pass-fail testing, in addition to testing of contraband detection systems.
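
    For the zero-failure and few-failure protocols this describes, the minimum number of tests can be computed directly from the binomial distribution; the sketch below (with illustrative numbers, not values taken from the cited treatment [1]) finds the smallest n such that passing the test demonstrates the required PD at confidence CL.

        from scipy.stats import binom

        def min_tests(pd_req, cl, allowed_misses=0):
            """Smallest n with P(pass | PD = pd_req) <= 1 - cl, so that a
            pass demonstrates PD above pd_req at confidence level cl."""
            n = allowed_misses + 1
            while binom.cdf(allowed_misses, n, 1.0 - pd_req) > 1.0 - cl:
                n += 1
            return n

        # Demonstrating PD = 0.90 at 95% confidence:
        print(min_tests(0.90, 0.95, allowed_misses=0))  # 29, since 0.9**29 < 0.05
        print(min_tests(0.90, 0.95, allowed_misses=1))  # 46 if one miss is allowed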

  9. UNCERTAINTY ON RADIATION DOSES ESTIMATED BY BIOLOGICAL AND RETROSPECTIVE PHYSICAL METHODS.

    PubMed

    Ainsbury, Elizabeth A; Samaga, Daniel; Della Monaca, Sara; Marrale, Maurizio; Bassinet, Celine; Burbidge, Christopher I; Correcher, Virgilio; Discher, Michael; Eakins, Jon; Fattibene, Paola; Güçlü, Inci; Higueras, Manuel; Lund, Eva; Maltar-Strmecki, Nadica; McKeever, Stephen; Rääf, Christopher L; Sholom, Sergey; Veronese, Ivan; Wieser, Albrecht; Woda, Clemens; Trompier, Francois

    2018-03-01

    Biological and physical retrospective dosimetry are recognised as key techniques to provide individual estimates of dose following unplanned exposures to ionising radiation. Whilst there has been a relatively large amount of recent development in the biological and physical procedures, development of statistical analysis techniques has failed to keep pace. The aim of this paper is to review the current state of the art in uncertainty analysis techniques across the 'EURADOS Working Group 10-Retrospective dosimetry' members, to give concrete examples of implementation of the techniques recommended in the international standards, and to further promote the use of Monte Carlo techniques to support characterisation of uncertainties. It is concluded that sufficient techniques are available and in use by most laboratories for acute, whole body exposures to highly penetrating radiation, but further work will be required to ensure that statistical analysis is always wholly sufficient for the more complex exposure scenarios.

  10. Teaching the Meaning of Statistical Techniques with Microcomputer Simulation.

    ERIC Educational Resources Information Center

    Lee, Motoko Y.; And Others

    Students in an introductory statistics course are often preoccupied with learning the computational routines of specific summary statistics and thereby fail to develop an understanding of the meaning of those statistics or their conceptual basis. To help students develop a better understanding of the meaning of three frequently used statistics,…

  11. Statistical Requirements For Pass-Fail Testing Of Contraband Detection Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilliam, David M.

    2011-06-01

    Contraband detection systems for homeland security applications are typically tested for probability of detection (PD) and probability of false alarm (PFA) using pass-fail testing protocols. Test protocols usually require specified values for PD and PFA to be demonstrated at a specified level of statistical confidence CL. Based on a recent more theoretical treatment of this subject [1], this summary reviews the definition of CL and provides formulas and spreadsheet functions for constructing tables of general test requirements and for determining the minimum number of tests required. The formulas and tables in this article may be generally applied to many other applications of pass-fail testing, in addition to testing of contraband detection systems.

  12. Analysis of longitudinal data from animals with missing values using SPSS.

    PubMed

    Duricki, Denise A; Soleman, Sara; Moon, Lawrence D F

    2016-06-01

    Testing of therapies for disease or injury often involves the analysis of longitudinal data from animals. Modern analytical methods have advantages over conventional methods (particularly when some data are missing), yet they are not used widely by preclinical researchers. Here we provide an easy-to-use protocol for the analysis of longitudinal data from animals, and we present a click-by-click guide for performing suitable analyses using the statistical package IBM SPSS Statistics (SPSS). We guide readers through the analysis of a real-life data set obtained when testing a therapy for brain injury (stroke) in elderly rats. If a few data points are missing, as in this example data set (for example, because of animal dropout), repeated-measures analysis of covariance may fail to detect a treatment effect. An alternative method, such as a linear model (with a suitable covariance structure) estimated by restricted maximum likelihood (to include all available data), can better detect treatment effects. This protocol takes 2 h to carry out.
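
    The protocol itself is written for SPSS, but the same kind of analysis can be sketched in Python with statsmodels: a linear mixed model fitted by restricted maximum likelihood, so animals with missing time points still contribute their observed data. The data below are simulated and the column names are hypothetical.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(2)
        animal = np.repeat(np.arange(20), 4)                 # 20 rats, 4 weeks
        week = np.tile(np.arange(1, 5), 20)
        treat = np.repeat(rng.integers(0, 2, 20), 4)
        score = (10 + 1.5 * week + 2.0 * treat * week
                 + np.repeat(rng.normal(0, 3, 20), 4)        # per-animal effect
                 + rng.normal(0, 2, 80))                     # residual noise
        df = pd.DataFrame(dict(animal=animal, week=week, treatment=treat,
                               score=score))
        df = df.drop(index=rng.choice(df.index, 6, replace=False))  # dropout

        # Random intercept per animal; REML uses all available observations.
        fit = smf.mixedlm("score ~ week * treatment", df,
                          groups=df["animal"]).fit(reml=True)
        print(fit.summary())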

  13. A biological compression model and its applications.

    PubMed

    Cao, Minh Duc; Dix, Trevor I; Allison, Lloyd

    2011-01-01

    A biological compression model, the expert model, is presented that is superior to existing compression algorithms in both compression performance and speed. The model is able to compress whole eukaryotic genomes. Most importantly, the model provides a framework for knowledge discovery from biological data. It can be used for repeat element discovery, sequence alignment and phylogenetic analysis. We demonstrate that the model can handle statistically biased sequences and distantly related sequences where conventional knowledge discovery tools often fail.

  14. The use of Quincke and Whitacre 27-gauge needles in orthopedic patients: incidence of failed spinal anesthesia and postdural puncture headache.

    PubMed

    Lynch, J; Kasper, S M; Strick, K; Topalidis, K; Schaaf, H; Zech, D; Krings-Ernst, I

    1994-07-01

    This study examined the incidence of failed spinal anesthesia and postdural puncture headache using a 27-gauge Whitacre and a 27-gauge Quincke needle in patients undergoing elective inpatient orthopedic procedures. The overall rate of failed spinal anesthesia was 8.5% [95% confidence interval (CI) = 4.6%-12.4%] (n = 17) in the Quincke group (n = 199) and 5.5% [95% CI = 2.3%-8.7%] (n = 11) in the Whitacre group (n = 199); this difference was not statistically significant. The overall incidence of postdural puncture headache (PDPH) was 0.8%: 1.1% [95% CI = 0%-2.4%] (n = 2) in the Quincke group and 0.5% [95% CI = 0%-1.5%] (n = 1) in the Whitacre group, again not statistically significant. All headaches were classified as mild and resolved spontaneously with conservative management. The mean time from withdrawal of the stylet to appearance of cerebrospinal fluid was 10.8 +/- 6.9 s in the Quincke group (n = 31) and 10.7 +/- 6.8 s in the Whitacre group (n = 33); these times did not differ significantly. Our results suggest that both needles are associated with a very low incidence of PDPH and an incidence of failed anesthesia of 5.5%-8.5%.
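
    The confidence intervals quoted here are standard Wald intervals for a binomial proportion and are easy to verify; for the Quincke group (17 failures in 199 blocks), for example:

        from statsmodels.stats.proportion import proportion_confint

        lo, hi = proportion_confint(count=17, nobs=199, alpha=0.05,
                                    method="normal")
        print(f"failure rate {17/199:.1%}, 95% CI {lo:.1%} to {hi:.1%}")
        # -> about 8.5% (4.7% to 12.4%), essentially the reported 4.6%-12.4%.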

  15. Statistical power and optimal design in experiments in which samples of participants respond to samples of stimuli.

    PubMed

    Westfall, Jacob; Kenny, David A; Judd, Charles M

    2014-10-01

    Researchers designing experiments in which a sample of participants responds to a sample of stimuli are faced with difficult questions about optimal study design. The conventional procedures of statistical power analysis fail to provide appropriate answers to these questions because they are based on statistical models in which stimuli are not assumed to be a source of random variation in the data, models that are inappropriate for experiments involving crossed random factors of participants and stimuli. In this article, we present new methods of power analysis for designs with crossed random factors, and we give detailed, practical guidance to psychology researchers planning experiments in which a sample of participants responds to a sample of stimuli. We extensively examine 5 commonly used experimental designs, describe how to estimate statistical power in each, and provide power analysis results based on a reasonable set of default parameter values. We then develop general conclusions and formulate rules of thumb concerning the optimal design of experiments in which a sample of participants responds to a sample of stimuli. We show that in crossed designs, statistical power typically does not approach unity as the number of participants goes to infinity but instead approaches a maximum attainable power value that is possibly small, depending on the stimulus sample. We also consider the statistical merits of designs involving multiple stimulus blocks. Finally, we provide a simple and flexible Web-based power application to aid researchers in planning studies with samples of stimuli.

  16. Living systematic reviews: 3. Statistical methods for updating meta-analyses.

    PubMed

    Simmonds, Mark; Salanti, Georgia; McKenzie, Joanne; Elliott, Julian

    2017-11-01

    A living systematic review (LSR) should keep the review current as new research evidence emerges. Any meta-analyses included in the review will also need updating as new material is identified. If the aim of the review is solely to present the best current evidence, standard meta-analysis may be sufficient, provided reviewers are aware that results may change at later updates. If the review is used in a decision-making context, more caution may be needed. When using standard meta-analysis methods, the chance of incorrectly concluding that any updated meta-analysis is statistically significant when there is no effect (the type I error) increases rapidly as more updates are performed. Inaccurate estimation of any heterogeneity across studies may also lead to inappropriate conclusions. This paper considers four methods to avoid some of these statistical problems when updating meta-analyses: two of them (the law of the iterated logarithm and the Shuster method) control primarily for inflation of the type I error, while the other two (trial sequential analysis and sequential meta-analysis) control for both type I and type II errors (failing to detect a genuine effect) and take account of heterogeneity. This paper compares the methods and considers how they could be applied to LSRs. Copyright © 2017 Elsevier Inc. All rights reserved.
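
    The type I error inflation described above is easy to reproduce by simulation; this minimal sketch (fixed-effect pooling of simulated null studies, with illustrative study sizes) shows how quickly the chance of ever declaring significance grows when the cumulative meta-analysis is re-tested at every update.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(3)
        n_updates, n_sims = 10, 2000
        se = np.sqrt(2.0 / 50)        # SE of a mean difference, n=50/arm, sd=1

        ever_sig = 0
        for _ in range(n_sims):
            effects = rng.normal(0.0, se, n_updates)   # true effect is zero
            for k in range(1, n_updates + 1):
                pooled = effects[:k].mean()            # equal-weight pooling
                z = pooled / (se / np.sqrt(k))
                if abs(z) > norm.ppf(0.975):
                    ever_sig += 1
                    break
        print(f"type I error over {n_updates} updates: {ever_sig / n_sims:.1%}")
        # Typically close to 20%, far above the nominal 5%.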

  17. A segmentation editing framework based on shape change statistics

    NASA Astrophysics Data System (ADS)

    Mostapha, Mahmoud; Vicory, Jared; Styner, Martin; Pizer, Stephen

    2017-02-01

    Segmentation is a key task in medical image analysis because its accuracy significantly affects successive steps. Automatic segmentation methods often produce inadequate segmentations, which require the user to manually edit the produced segmentation slice by slice. Because editing is time-consuming, an editing tool is needed that enables the user to produce accurate segmentations by drawing only a sparse set of contours. This paper describes such a framework as applied to a single object. Constrained by the additional information provided by the manually segmented contours, the proposed framework utilizes object shape statistics to transform the failed automatic segmentation into a more accurate version. Instead of modeling the object shape, the proposed framework utilizes shape change statistics that were generated to capture the object deformation from the failed automatic segmentation to its corresponding correct segmentation. An optimization procedure was used to minimize an energy function that consists of two terms, an external contour match term and an internal shape change regularity term. The high accuracy of the proposed segmentation editing approach was confirmed by testing it on a simulated data set based on 10 in-vivo infant magnetic resonance brain data sets using four similarity metrics. Segmentation results indicated that our method can provide efficient and adequately accurate segmentations (Dice segmentation accuracy increase of 10%) with very sparse contours (only 10%), which is promising for greatly decreasing the work expected from the user.
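
    The optimization step can be sketched generically: coefficients of a shape-change model are chosen to match the user-drawn contour points while penalizing statistically unlikely deformations. The linear "shape modes" below are random placeholders standing in for learned shape-change statistics.

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(4)
        n_pts, n_modes = 50, 5
        base = rng.normal(size=(n_pts, 2))             # failed segmentation boundary
        modes = rng.normal(size=(n_modes, n_pts, 2))   # shape-change modes
        drawn = rng.choice(n_pts, 8, replace=False)    # sparse user contour points
        target = base[drawn] + 0.5                     # user-corrected positions

        def energy(coeffs, lam=1.0):
            shape = base + np.tensordot(coeffs, modes, axes=1)
            match = np.sum((shape[drawn] - target) ** 2)   # contour match term
            regularity = np.sum(coeffs ** 2)               # shape-change prior
            return match + lam * regularity

        res = minimize(energy, np.zeros(n_modes), method="L-BFGS-B")
        print("optimal mode coefficients:", np.round(res.x, 2))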

  18. Analyzing Faculty Salaries When Statistics Fail.

    ERIC Educational Resources Information Center

    Simpson, William A.

    The role played by nonstatistical procedures, in contrast to multivariant statistical approaches, in analyzing faculty salaries is discussed. Multivariant statistical methods are usually used to establish or defend against prima facie cases of gender and ethnic discrimination with respect to faculty salaries. These techniques are not applicable,…

  19. Handling nonnormality and variance heterogeneity for quantitative sublethal toxicity tests.

    PubMed

    Ritz, Christian; Van der Vliet, Leana

    2009-09-01

    The advantages of using regression-based techniques to derive endpoints from environmental toxicity data are clear, and slowly, this superior analytical technique is gaining acceptance. As use of regression-based analysis becomes more widespread, some of the associated nuances and potential problems come into sharper focus. Looking at data sets that cover a broad spectrum of standard test species, we noticed that some model fits to data failed to meet two key assumptions, variance homogeneity and normality, that are necessary for correct statistical analysis via regression-based techniques. Failure to meet these assumptions often is caused by reduced variance at the concentrations showing severe adverse effects. Although commonly used with linear regression analysis, transformation of the response variable alone is not appropriate when fitting data using nonlinear regression techniques. Through analysis of sample data sets, including Lemna minor, Eisenia andrei (terrestrial earthworm), and algae, we show that both the so-called Box-Cox transformation and use of the Poisson distribution can help to correct variance heterogeneity and nonnormality and so allow nonlinear regression analysis to be implemented. Both the Box-Cox transformation and the Poisson distribution can be readily implemented into existing protocols for statistical analysis. By correcting for nonnormality and variance heterogeneity, these two statistical tools can be used to encourage the transition to regression-based analysis and the deprecation of less-desirable and less-flexible analytical techniques, such as linear interpolation.
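
    Both remedies are essentially one-liners in most statistics environments. A minimal sketch of the Box-Cox step with scipy, on synthetic concentration-response data whose variance shrinks at high-effect concentrations as described above:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(5)
        conc = np.repeat([0.0, 1.0, 3.0, 10.0, 30.0], 10)
        mean = 100.0 / (1.0 + (conc / 5.0) ** 2)        # shrinking response
        y = np.clip(mean + rng.normal(0, 0.15 * mean + 0.5), 0.1, None)

        y_bc, lam = stats.boxcox(y)                     # MLE of lambda
        print(f"estimated Box-Cox lambda = {lam:.2f}")

        # Levene's test for variance homogeneity, before vs after transforming:
        raw = [y[conc == c] for c in np.unique(conc)]
        bc = [y_bc[conc == c] for c in np.unique(conc)]
        print("raw p =", round(stats.levene(*raw).pvalue, 4),
              "| transformed p =", round(stats.levene(*bc).pvalue, 4))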

  20. Pass-fail grading: laying the foundation for self-regulated learning.

    PubMed

    White, Casey B; Fantone, Joseph C

    2010-10-01

    Traditionally, medical schools have tended to make assumptions that students will "automatically" engage in self-education effectively after graduation and subsequent training in residency and fellowships. In reality, the majority of medical graduates out in practice feel unprepared for learning on their own. Many medical schools are now adopting strategies and pedagogies to help students become self-regulating learners. Along with these changes in practices and pedagogy, many schools are eliminating a cornerstone of extrinsic motivation: discriminating grades. To study the effects of the switch from discriminating to pass-fail grading in the second year of medical school, we compared internal and external assessments and evaluations for a second-year class with a discriminating grading scale (Honors, High Pass, Pass, Fail) and for a second-year class with a pass-fail grading scale. Of the measures we compared (MCAT scores, GPAs, means on second-year examinations, USMLE Step 1 scores, and residency placement), none showed statistically significant changes; the only statistically significant decreases (lower performance with pass-fail) were found in two of the second-year courses, while performance in one other course improved significantly. Pass-fail grading can meet several important intended outcomes, including "leveling the playing field" for incoming students with different academic backgrounds, reducing competition and fostering collaboration among members of a class, and allowing more time for extracurricular interests and personal activities. It also fosters intrinsic motivation, which is key to self-regulated, lifelong learning.

  1. The Utility of Robust Means in Statistics

    ERIC Educational Resources Information Center

    Goodwyn, Fara

    2012-01-01

    Location estimates calculated from heuristic data were examined using traditional and robust statistical methods. The current paper demonstrates the impact outliers have on the sample mean and proposes robust methods to control for outliers in sample data. Traditional methods fail because they rely on the statistical assumptions of normality and…

  2. Treatment of selected syringomyelias with syringo-pleural shunt: the experience with a consecutive 26 cases.

    PubMed

    Fan, Tao; Zhao, XinGang; Zhao, HaiJun; Liang, Cong; Wang, YinQian; Gai, QiFei; Zhang, Fangyi

    2015-10-01

    It is well established that syringomyelia can cause neurological symptoms and deficits through accumulation of fluid within syrinx cavities, which leads to internal compression within the spinal cord. When other interventions treating the underlying etiology fail to yield any improvement, the next option is a procedure to divert the fluid from the syrinx cavity, such as syringo-subarachnoid, syringo-peritoneal or syringo-pleural shunting. The indications and long-term efficacy of these direct shunting procedures are still questionable and controversial. To investigate the clinical indications, outcomes and complications of the syringo-pleural shunt (SPS) as an alternative treatment of syringomyelia, we report a retrospective series of 26 cases of syringomyelia with an indication for a diversion procedure; SPS was offered. Patients' symptoms, mJOA scores, and MRI findings were collected to evaluate the change in the syringomyelia and the prognosis of the patients. All 26 patients underwent SPS, and the mean follow-up time was 27.4 months. A two-tailed Wilcoxon signed-rank test was used to perform the statistical analysis of the mJOA scores. The key surgical technique, outcomes and complications of SPS are reported in detail. No mortality or severe complications occurred. Postoperative MRIs revealed near-complete resolution of the syrinx in 14 patients, significant shrinkage of the syrinx in 10 patients, and no obvious reduction in the remaining 2 patients. Postoperatively, the symptoms improved in 24 cases (92.3%). Statistical analysis of the mJOA scores showed a statistically significant difference (P<0.001) between the preoperative group and the 2-week postoperative group, with no further significant improvement between 2 weeks and the final follow-up at 27 months. Collapse or remarkable shrinkage of the syrinx by SPS could ameliorate or at least stabilize the symptoms for the patient. We recommend a small laminectomy and a less than 3 mm myelotomy either at the PML or the DREZ. The SPS procedure can be an effective and relatively long-lived treatment for idiopathic syringomyelia and for cases that have failed other options. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. On Statistical Analysis of Neuroimages with Imperfect Registration

    PubMed Central

    Kim, Won Hwa; Ravi, Sathya N.; Johnson, Sterling C.; Okonkwo, Ozioma C.; Singh, Vikas

    2016-01-01

    A variety of studies in neuroscience/neuroimaging seek to perform statistical inference on the acquired brain image scans for diagnosis as well as understanding the pathological manifestation of diseases. To do so, an important first step is to register (or co-register) all of the image data into a common coordinate system. This permits meaningful comparison of the intensities at each voxel across groups (e.g., diseased versus healthy) to evaluate the effects of the disease and/or use machine learning algorithms in a subsequent step. But errors in the underlying registration make this problematic: they either decrease the statistical power or make the follow-up inference tasks less effective/accurate. In this paper, we derive a novel algorithm which offers immunity to local errors in the underlying deformation field obtained from registration procedures. By deriving a deformation invariant representation of the image, the downstream analysis can be made more robust as if one had access to a (hypothetical) far superior registration procedure. Our algorithm is based on recent work on the scattering transform. Using this as a starting point, we show how results from harmonic analysis (especially non-Euclidean wavelets) yield strategies for designing deformation and additive noise invariant representations of large 3-D brain image volumes. We present a set of results on synthetic and real brain images where we achieve robust statistical analysis even in the presence of substantial deformation errors; here, standard analysis procedures significantly under-perform and fail to identify the true signal. PMID:27042168

  4. Effects of preparation relief and flow channels on seating full coverage castings during cementation.

    PubMed

    Webb, E L; Murray, H V; Holland, G A; Taylor, D F

    1983-06-01

    Machined steel dies were used to study the effects of three die modifications on seating full coverage castings during cementation. The die modifications consisted of occlusal channels, occlusal surface relief, and axial channels. Fourteen specimens having one or more forms of die modification were compared with two control specimens having no die modifications. Statistical analysis of the data revealed that the addition of four axial channels to the simulated preparation on the steel die produced a significant reduction in the mean marginal discrepancy during cementation. Occlusal modifications alone failed to produce significant reductions in marginal discrepancies when compared with the control specimens. Occlusal modifications in conjunction with axial channels failed to produce further significant reductions in marginal discrepancies when compared with those reductions observed in specimens having only axial channels.

  5. A Statistical Simulation Approach to Safe Life Fatigue Analysis of Redundant Metallic Components

    NASA Technical Reports Server (NTRS)

    Matthews, William T.; Neal, Donald M.

    1997-01-01

    This paper introduces a dual active load path fail-safe fatigue design concept analyzed by Monte Carlo simulation. The concept utilizes the inherent fatigue life differences between selected pairs of components for an active dual path system, enhanced by a stress level bias in one component. The design is applied to a baseline design: a safe life fatigue problem studied in an American Helicopter Society (AHS) round robin. The dual active path design is compared with a two-element standby fail-safe system and the baseline design for life at specified reliability levels and weight. The sensitivity of life estimates for both the baseline and fail-safe designs was examined by considering normal and Weibull distribution laws and coefficient of variation levels. Results showed that the biased dual path system lifetimes, for both the first element failure and residual life, were much greater than for standby systems. The sensitivity of the residual life-weight relationship was not excessive at reliability levels up to R = 0.9999, and the weight penalty was small. The sensitivity of life estimates increases dramatically at higher reliability levels.
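
    A stripped-down version of such a simulation is easy to write; the sketch below (Weibull lifetimes with illustrative parameters, not those of the AHS round-robin problem) estimates first-element life and residual life at a given reliability level for a biased dual active path system.

        import numpy as np

        rng = np.random.default_rng(6)
        n = 200_000
        shape = 4.0                              # Weibull shape for fatigue life
        scale_a, scale_b = 10_000.0, 14_000.0    # stress bias: element B lasts longer

        life_a = scale_a * rng.weibull(shape, n)
        life_b = scale_b * rng.weibull(shape, n)
        first = np.minimum(life_a, life_b)               # first element failure
        residual = np.maximum(life_a, life_b) - first    # fail-safe margin

        R = 0.9999                               # reliability level of interest
        q = 100.0 * (1.0 - R)                    # lower percentile to read off
        print("first-failure life at R=0.9999:", round(np.percentile(first, q)))
        print("residual life at R=0.9999:", round(np.percentile(residual, q)))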

  6. Employer reasons for failing to report eligible workers’ compensation claims in the BLS survey of occupational injuries and illnesses

    PubMed Central

    Wuellner, Sara E.; Bonauto, David K.

    2016-01-01

    Background: Little research has been done to identify reasons employers fail to report some injuries and illnesses in the Bureau of Labor Statistics Survey of Occupational Injuries and Illnesses (SOII). Methods: We interviewed the 2012 Washington SOII respondents from establishments that had failed to report one or more eligible workers' compensation claims in the SOII about their reasons for not reporting specific claims. Qualitative content analysis methods were used to identify themes and patterns in the responses. Results: Non-compliance with OSHA recordkeeping or SOII reporting instructions and data entry errors led to unreported claims. Some employers refused to include claims because they did not consider the injury to be work-related, despite workers' compensation eligibility. Participant responses brought the SOII eligibility of some claims into question. Conclusion: Systematic and non-systematic errors lead to SOII underreporting. Insufficient recordkeeping systems and limited knowledge of reporting requirements are barriers to accurate workplace injury records. Am. J. Ind. Med. 59:343-356, 2016. © 2016 The Authors. American Journal of Industrial Medicine Published by Wiley Periodicals, Inc. PMID:26970051

  7. Evaluation of the "Angelina Jolie Effect" on Screening Mammography Utilization in an Academic Center.

    PubMed

    Huesch, Marco D; Schetter, Susann; Segel, Joel; Chetlen, Alison

    2017-08-01

    The aim of this study was to understand the impact on screening mammography at our institution, comparing weekly utilization in the 2 years before and the 2 years after Ms Angelina Jolie disclosed in the New York Times on May 13, 2013, that she had had a prophylactic double mastectomy. All 48,110 consecutive screening mammograms conducted at our institution between May 16, 2011, and May 16, 2015, were selected from our electronic medical record system. We used interrupted time series statistical models and graphical methods on utilization data to understand utilization changes before and after Ms Jolie's news. The graphed trend of weekly screening mammogram utilization failed to show changes around the time of interest. Analytical models and statistical tests also failed to show a step change increase or acceleration of utilization around May 2013. However, graphical and time series analyses showed a flattening of utilization in the middle of 2014. In our well-powered analysis in a large regional breast imaging center, we found no support for the hypothesis that this celebrity news drove increased screening. Copyright © 2017 American College of Radiology. Published by Elsevier Inc. All rights reserved.
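
    The interrupted time series models referred to are typically segmented regressions with a step term and a slope-change term at the event date; a minimal sketch on simulated null data (hypothetical weekly counts, not the study's data):

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(7)
        weeks = np.arange(208)                     # ~4 years of weekly volumes
        event = 104                                # the news event at the midpoint
        post = (weeks >= event).astype(float)
        y = 230 + 0.1 * weeks + rng.normal(0, 12, weeks.size)  # no true change

        df = pd.DataFrame({"t": weeks, "post": post,
                           "t_post": post * (weeks - event), "y": y})
        fit = smf.ols("y ~ t + post + t_post", df).fit()
        # 'post' tests a step change in utilization, 't_post' an acceleration.
        print(fit.pvalues[["post", "t_post"]])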

  8. Fatal falls in the US construction industry, 1990 to 1999.

    PubMed

    Derr, J; Forst, L; Chen, H Y; Conroy, L

    2001-10-01

    The Occupational Safety and Health Administration's (OSHA's) Integrated Management Information System (IMIS) database allows for the detailed analysis of risk factors surrounding fatal occupational events. This study used IMIS data to (1) perform a risk factor analysis of fatal construction falls, and (2) assess the impact of the February 1995 29 CFR Part 1926 Subpart M OSHA fall protection regulations for construction by calculating trends in fatal fall rates. In addition, IMIS data on fatal construction falls were compared with data from other occupational fatality surveillance systems. For falls in construction, the study identified several demographic factors that may indicate increased risk. A statistically significant downward trend in fatal falls was evident in all construction and within several construction categories during the decade. Although the study failed to show a statistically significant intervention effect from the new OSHA regulations, it may have lacked the power to do so.

  9. Analysis of Convair 990 rejected-takeoff accident with emphasis on decision making, training and procedures

    NASA Technical Reports Server (NTRS)

    Batthauer, Byron E.

    1987-01-01

    This paper analyzes a NASA Convair 990 (CV-990) accident with emphasis on rejected-takeoff (RTO) decision making, training, procedures, and accident statistics. The NASA Aircraft Accident Investigation Board was somewhat perplexed that an aircraft could be destroyed as a result of blown tires during the takeoff roll. To provide a better understanding of tire failure RTO's, The Board obtained accident reports, Federal Aviation Administration (FAA) studies, and other pertinent information related to the elements of this accident. This material enhanced the analysis process and convinced the Accident Board that high-speed RTO's in transport aircraft should be given more emphasis during pilot training. Pilots should be made aware of various RTO situations and statistics with emphasis on failed-tire RTO's. This background information could enhance the split-second decision-making process that is required prior to initiating an RTO.

  10. A Meta-Analysis of Hypnotherapeutic Techniques in the Treatment of PTSD Symptoms.

    PubMed

    O'Toole, Siobhan K; Solomon, Shelby L; Bergdahl, Stephen A

    2016-02-01

    The efficacy of hypnotherapeutic techniques as treatment for symptoms of posttraumatic stress disorder (PTSD) was explored through meta-analytic methods. Studies were selected through a search of 29 databases. Altogether, 81 studies discussing hypnotherapy and PTSD were reviewed for inclusion criteria. The outcomes of 6 studies representing 391 participants were analyzed using meta-analysis. Evaluation of effect sizes related to avoidance and intrusion, in addition to overall PTSD symptoms after hypnotherapy treatment, revealed that all studies showed that hypnotherapy had a positive effect on PTSD symptoms. The overall Cohen's d was large (-1.18) and statistically significant (p < .001). Effect sizes varied based on study quality; however, they were large and statistically significant. Using the classic fail-safe N to assess for publication bias, it was determined it would take 290 nonsignificant studies to nullify these findings. Copyright © 2016 International Society for Traumatic Stress Studies.
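
    Rosenthal's classic fail-safe N used here has a closed form: the number of unpublished null studies needed to drag the combined (Stouffer) z below one-tailed significance. A sketch with made-up study z-scores, not those of the meta-analysis:

        import numpy as np
        from scipy.stats import norm

        def fail_safe_n(z_values, alpha=0.05):
            z_crit = norm.ppf(1.0 - alpha)          # 1.645 one-tailed
            return max(0.0, (np.sum(z_values) / z_crit) ** 2 - len(z_values))

        z = [2.8, 3.5, 2.1, 4.0, 2.6, 3.2]          # hypothetical study z-scores
        print(f"fail-safe N ~ {fail_safe_n(z):.0f} null studies")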

  11. Nonreplication of an Association of Apolipoprotein E2 With Sinistrality

    PubMed Central

    Piper, Brian J.; Yasen, Alia L.; Taylor, Amy E.; Ruiz, Jonatan R.; Gaynor, J. William; Dayger, Catherine A.; Gonzalez-Gross, Marcela; Kwon, Oh D.; Nilsson, Lars-Göran; Day, Ian N. M.; Raber, Jacob; Miller, Jeremy K.

    2013-01-01

    A recent report found that left-handed adolescents were over three-fold more likely to have an Apolipoprotein (APOE) ε2 allele. This study was unable to replicate this association in young adults (N=166). A meta-analysis of nine other datasets (N = 360 to 7,559, Power > 0.999), including that of the National Alzheimer's Coordinating Center, also failed to find an over-representation of ε2 among left-handers, indicating that this earlier outcome was most likely a statistical artifact. PMID:22721421

  12. The use of iohexol in pediatric urography: a comparative study with meglumine diatrizoate.

    PubMed

    Bani, E; Federighi, F; Ghio, R; Marchitiello, M; Galigani, P; Palla, R

    1985-01-01

    In a prospective study, the nephrotoxicity of iohexol, a new non-ionic contrast medium, was compared with that of meglumine diatrizoate. Plasma creatinine, BUN, creatinine clearance, urinalysis and the urinary excretion of N-acetyl glucosaminidase (NAG), gamma glutamyl transpeptidase (GGT) and muramidase (MU) were determined prior to and following intravenous pyelography. A significant rise in enzyme excretion was observed both in patients who received iohexol and in those who received diatrizoate. Statistical analysis failed to demonstrate any difference in nephrotoxicity between the two iodinated contrast media.

  13. Survival rates of short (6 mm) micro-rough surface implants: a review of literature and meta-analysis.

    PubMed

    Srinivasan, Murali; Vazquez, Lydia; Rieder, Philippe; Moraguez, Osvaldo; Bernard, Jean-Pierre; Belser, Urs C

    2014-05-01

    The aim of this review was to test the hypothesis that 6 mm micro-rough short Straumann(®) implants provide predictable survival rates and verify that most failures occurring are early failures. A PubMed and hand search was performed to identify studies involving micro-rough 6-mm-short implants published between January 1987 and August 2011. Studies were included that (i) involve Straumann(®) 6 mm implants placed in the human jaws, (ii) provide data on the survival rate, (iii) mention the time of failure, and (iv) report a minimum follow-up period of 12 months following placement. A meta-analysis was performed on the extracted data. From a total of 842 publications that were screened, 12 methodologically sound articles qualified to be included for the statistical evaluation based on our inclusion criteria. A total of 690 Straumann(®) 6-mm-short implants were evaluated in the reviewed studies (Total: placed-690, failed-25; maxilla: placed-266, failed-14; mandible: placed-364, failed-5; follow-up period: 1-8 years). A meta-analysis was performed on the calculated early cumulative survival rates (CSR%). The pooled early CSR% calculated in this meta-analysis was 93.7%, whereas the overall survival rates in the maxilla and mandible were 94.7% and 98.6% respectively. Implant failures observed were predominantly early failures (76%). This meta-analysis provides robust evidence that micro-rough 6-mm-short dental implants are a predictable treatment option, providing favorable survival rates. The failures encountered with 6-mm-short implants were predominantly early and their survival in the mandible was slightly superior. © 2013 John Wiley & Sons A/S. Published by Blackwell Publishing Ltd.

  14. Renormalization Group Tutorial

    NASA Technical Reports Server (NTRS)

    Bell, Thomas L.

    2004-01-01

    Complex physical systems sometimes have statistical behavior characterized by power-law dependence on the parameters of the system and spatial variability with no particular characteristic scale as the parameters approach critical values. The renormalization group (RG) approach was developed in the fields of statistical mechanics and quantum field theory to derive quantitative predictions of such behavior in cases where conventional methods of analysis fail. Techniques based on these ideas have since been extended to treat problems in many different fields, and in particular, the behavior of turbulent fluids. This lecture will describe a relatively simple but nontrivial example of the RG approach applied to the diffusion of photons out of a stellar medium when the photons have wavelengths near that of an emission line of atoms in the medium.

  15. Can arthroscopic revision surgery for shoulder instability be a fair option?

    PubMed

    De Giorgi, Silvana; Garofalo, Raffaele; Tafuri, Silvio; Cesari, Eugenio; Rose, Giacomo Delle; Castagna, Alessandro

    2014-04-01

    The aim of this study was to evaluate the role of arthroscopic capsuloplasty in the treatment of failed primary arthroscopic treatment of glenohumeral instability. We retrospectively examined, at a minimum 3-year follow-up, 22 patients with recurrent anterior shoulder instability after a failed primary arthroscopic treatment performed between 1999 and 2007. A statistical analysis was performed to evaluate which variables could influence the definitive result and clinical outcomes at final follow-up. A p value of less than 0.05 was considered significant. After revision surgery we observed an overall failure rate of 8/22 (36.4%), including frank dislocations, subluxations and also apprehension seriously inhibiting the patient's quality of life. No significant differences were observed in the examined parameters. According to our outcomes, we generally do not recommend an arthroscopic revision procedure for failed instability surgery.

  16. An Inferential Confidence Interval Method of Establishing Statistical Equivalence that Corrects Tryon's (2001) Reduction Factor

    ERIC Educational Resources Information Center

    Tryon, Warren W.; Lewis, Charles

    2008-01-01

    Evidence of group matching frequently takes the form of a nonsignificant test of statistical difference. Theoretical hypotheses of no difference are also tested in this way. These practices are flawed in that null hypothesis statistical testing provides evidence against the null hypothesis and failing to reject H[subscript 0] is not evidence…

  17. Preparing and Presenting Effective Research Posters

    PubMed Central

    Miller, Jane E

    2007-01-01

    Objectives Posters are a common way to present results of a statistical analysis, program evaluation, or other project at professional conferences. Often, researchers fail to recognize the unique nature of the format, which is a hybrid of a published paper and an oral presentation. This methods note demonstrates how to design research posters to convey study objectives, methods, findings, and implications effectively to varied professional audiences. Methods A review of existing literature on research communication and poster design is used to identify and demonstrate important considerations for poster content and layout. Guidelines on how to write about statistical methods, results, and statistical significance are illustrated with samples of ineffective writing annotated to point out weaknesses, accompanied by concrete examples and explanations of improved presentation. A comparison of the content and format of papers, speeches, and posters is also provided. Findings Each component of a research poster about a quantitative analysis should be adapted to the audience and format, with complex statistical results translated into simplified charts, tables, and bulleted text to convey findings as part of a clear, focused story line. Conclusions Effective research posters should be designed around two or three key findings with accompanying handouts and narrative description to supply additional technical detail and encourage dialog with poster viewers. PMID:17355594

  18. Statistical detection of EEG synchrony using empirical bayesian inference.

    PubMed

    Singh, Archana K; Asoh, Hideki; Takeda, Yuji; Phillips, Steven

    2015-01-01

    There is growing interest in understanding how the brain utilizes synchronized oscillatory activity to integrate information across functionally connected regions. Computing phase-locking values (PLV) between EEG signals is a popular method for quantifying such synchronizations and elucidating their role in cognitive tasks. However, high dimensionality in PLV data incurs a serious multiple testing problem. Standard multiple testing methods in neuroimaging research (e.g., false discovery rate, FDR) suffer severe loss of power, because they fail to exploit the complex dependence structure between hypotheses that vary in the spectral, temporal and spatial dimensions. Previously, we showed that hierarchical FDR and optimal discovery procedures could be effectively applied for PLV analysis to provide better power than FDR. In this article, we revisit the multiple comparison problem from a new Empirical Bayes perspective and propose the application of the local FDR method (locFDR; Efron, 2001) for PLV synchrony analysis, computing FDR as a posterior probability that an observed statistic belongs to a null hypothesis. We demonstrate the application of Efron's Empirical Bayes approach to PLV synchrony analysis for the first time. We use simulations to validate the specificity and sensitivity of locFDR, and a real EEG dataset from a visual search study for experimental validation. We also compare locFDR with hierarchical FDR and optimal discovery procedures in both simulation and experimental analyses. Our simulation results showed that locFDR can effectively control false positives without compromising the power of PLV synchrony inference. Application of locFDR to the experimental data detected more significant discoveries than our previously proposed methods, whereas the standard FDR method failed to detect any significant discoveries.
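
    The phase-locking value underlying all of this is itself simple to compute; a minimal sketch for one signal pair (the multiple-testing problem arises because this is repeated over every electrode pair, frequency band and time window):

        import numpy as np
        from scipy.signal import hilbert

        rng = np.random.default_rng(8)
        t = np.linspace(0, 2, 500)
        x = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)
        y = np.sin(2 * np.pi * 10 * t + 0.8) + 0.5 * rng.normal(size=t.size)

        # Instantaneous phases via the analytic signal, then the length of
        # the mean phase-difference vector across samples.
        dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))
        plv = np.abs(np.mean(np.exp(1j * dphi)))
        print(f"PLV = {plv:.2f}")   # near 1 when phases lock, near 0 otherwise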

  19. Hypothesis: discrepancy between intra- and interpopulation studies of the relationship between dietary salt and blood pressure: fact or fiction?

    PubMed

    Omvik, P

    1984-01-01

    It is a paradox that intra-population studies fail to show a significant correlation between sodium excretion and blood pressure while a clear relationship exists in cross-cultural studies. Since the daily variation of sodium excretion is high, the discrepancy between the two observations could be due to non-comparable data on sodium excretion. This is a discussion of the hypothesis that whether or not a significant correlation is found between sodium excretion and blood pressure depends on the statistical analysis of the data.

  20. Feature-Based Statistical Analysis of Combustion Simulation Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, J; Krishnamoorthy, V; Liu, S

    2011-11-18

    We present a new framework for feature-based statistical analysis of large-scale scientific data and demonstrate its effectiveness by analyzing features from Direct Numerical Simulations (DNS) of turbulent combustion. Turbulent flows are ubiquitous and account for transport and mixing processes in combustion, astrophysics, fusion, and climate modeling among other disciplines. They are also characterized by coherent structure or organized motion, i.e. nonlocal entities whose geometrical features can directly impact molecular mixing and reactive processes. While traditional multi-point statistics provide correlative information, they lack nonlocal structural information, and hence, fail to provide mechanistic causality information between organized fluid motion and mixing and reactive processes. Hence, it is of great interest to capture and track flow features and their statistics together with their correlation with relevant scalar quantities, e.g. temperature or species concentrations. In our approach we encode the set of all possible flow features by pre-computing merge trees augmented with attributes, such as statistical moments of various scalar fields, e.g. temperature, as well as length-scales computed via spectral analysis. The computation is performed in an efficient streaming manner in a pre-processing step and results in a collection of meta-data that is orders of magnitude smaller than the original simulation data. This meta-data is sufficient to support a fully flexible and interactive analysis of the features, allowing for arbitrary thresholds, providing per-feature statistics, and creating various global diagnostics such as Cumulative Density Functions (CDFs), histograms, or time-series. We combine the analysis with a rendering of the features in a linked-view browser that enables scientists to interactively explore, visualize, and analyze the equivalent of one terabyte of simulation data. We highlight the utility of this new framework for combustion science; however, it is applicable to many other science domains.

  1. Education and Female Labor Force Participation.

    ERIC Educational Resources Information Center

    Psacharopoulos, George; Tzannatos, Zafiris

    Statistics have created an arbitrary, confusing distinction between a labor force participant and a nonparticipant; women were relegated to a second-class employment citizenship that failed to recognize household production and assigned them a lower participation rate relative to males. Despite these shortcomings, such statistics can prove…

  2. Atrial Electrogram Fractionation Distribution before and after Pulmonary Vein Isolation in Human Persistent Atrial Fibrillation-A Retrospective Multivariate Statistical Analysis.

    PubMed

    Almeida, Tiago P; Chu, Gavin S; Li, Xin; Dastagir, Nawshin; Tuan, Jiun H; Stafford, Peter J; Schlindwein, Fernando S; Ng, G André

    2017-01-01

    Purpose: Complex fractionated atrial electrograms (CFAE)-guided ablation after pulmonary vein isolation (PVI) has been used for persistent atrial fibrillation (persAF) therapy. This strategy has shown suboptimal outcomes due to, among other factors, undetected changes in the atrial tissue following PVI. In the present work, we investigate CFAE distribution before and after PVI in patients with persAF using a multivariate statistical model. Methods: 207 pairs of atrial electrograms (AEGs) were collected before and after PVI, respectively, from corresponding LA regions in 18 persAF patients. Twelve attributes were measured from the AEGs, before and after PVI. Statistical models based on multivariate analysis of variance (MANOVA) and linear discriminant analysis (LDA) were used to characterize the atrial regions and AEGs. Results: PVI significantly reduced CFAEs in the LA (70 vs. 40%; P < 0.0001). Four types of LA regions were identified, based on the AEG characteristics: (i) fractionated before PVI that remained fractionated after PVI (31% of the collected points); (ii) fractionated that converted to normal (39%); (iii) normal prior to PVI that became fractionated (9%); and (iv) normal that remained normal (21%). Individually, the attributes failed to distinguish these LA regions, but multivariate statistical models were effective in their discrimination (P < 0.0001). Conclusion: Our results show that there are LA regions resistant to PVI, while others are affected by it. Although traditional methods were unable to identify these different regions, the proposed multivariate statistical model discriminated LA regions resistant to PVI from those affected by it without prior ablation information.
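
    The discrimination step can be sketched with scikit-learn: several per-electrogram attributes that are individually uninformative can still separate region types when combined linearly. The features below are synthetic, with a weak multivariate class signal injected for illustration.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(9)
        n, p = 207, 12                        # AEG pairs x measured attributes
        region = rng.integers(0, 4, n)        # the four region types above
        X = rng.normal(size=(n, p))
        X += 0.6 * np.eye(4)[region].repeat(3, axis=1)   # weak class signal

        acc = cross_val_score(LinearDiscriminantAnalysis(), X, region, cv=5)
        print(f"cross-validated accuracy: {acc.mean():.2f} (chance = 0.25)")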

  3. PCA as a practical indicator of OPLS-DA model reliability.

    PubMed

    Worley, Bradley; Powers, Robert

    Principal Component Analysis (PCA) and Orthogonal Projections to Latent Structures Discriminant Analysis (OPLS-DA) are powerful statistical modeling tools that provide insights into separations between experimental groups based on high-dimensional spectral measurements from NMR, MS or other analytical instrumentation. However, when used without validation, these tools may lead investigators to statistically unreliable conclusions. This danger is especially real for Partial Least Squares (PLS) and OPLS, which aggressively force separations between experimental groups. As a result, OPLS-DA is often used as an alternative method when PCA fails to expose group separation, but this practice is highly dangerous. Without rigorous validation, OPLS-DA can easily yield statistically unreliable group separation. A Monte Carlo analysis of PCA group separations and OPLS-DA cross-validation metrics was performed on NMR datasets with statistically significant separations in scores-space. A linearly increasing amount of Gaussian noise was added to each data matrix followed by the construction and validation of PCA and OPLS-DA models. With increasing added noise, the PCA scores-space distance between groups rapidly decreased and the OPLS-DA cross-validation statistics simultaneously deteriorated. A decrease in correlation between the estimated loadings (added noise) and the true (original) loadings was also observed. While the validity of the OPLS-DA model diminished with increasing added noise, the group separation in scores-space remained basically unaffected. Supported by the results of Monte Carlo analyses of PCA group separations and OPLS-DA cross-validation metrics, we provide practical guidelines and cross-validatory recommendations for reliable inference from PCA and OPLS-DA models.
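
    The Monte Carlo design described can be condensed to a few lines: add increasing Gaussian noise to a two-group dataset and track the PCA scores-space separation, the quantity proposed here as a practical reliability indicator for a companion OPLS-DA model.

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(10)
        n, p = 40, 100
        X = rng.normal(size=(n, p))
        X[:20, :10] += 1.5                    # group 1 shifted on 10 variables
        g1 = np.arange(n) < 20

        for noise_sd in (0.0, 1.0, 2.0, 4.0):
            scores = PCA(n_components=2).fit_transform(
                X + rng.normal(0, noise_sd, X.shape))
            d = np.linalg.norm(scores[g1].mean(0) - scores[~g1].mean(0))
            print(f"noise sd {noise_sd:.1f}: between-group distance = {d:.2f}")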

  4. Selecting the "Best" Factor Structure and Moving Measurement Validation Forward: An Illustration.

    PubMed

    Schmitt, Thomas A; Sass, Daniel A; Chappelle, Wayne; Thompson, William

    2018-04-09

    Despite the broad literature base on factor analysis best practices, research seeking to evaluate a measure's psychometric properties frequently fails to consider or follow these recommendations. This leads to incorrect factor structures, numerous and often overly complex competing factor models and, perhaps most harmful, biased model results. Our goal is to demonstrate a practical and actionable process for factor analysis through (a) an overview of six statistical and psychometric issues and approaches to be aware of, investigate, and report when engaging in factor structure validation, along with a flowchart for recommended procedures to understand latent factor structures; (b) demonstrating these issues to provide a summary of the updated Posttraumatic Stress Disorder Checklist (PCL-5) factor models and a rationale for validation; and (c) conducting a comprehensive statistical and psychometric validation of the PCL-5 factor structure to demonstrate all the issues we described earlier. Considering previous research, the PCL-5 was evaluated using a sample of 1,403 U.S. Air Force remotely piloted aircraft operators with high levels of battlefield exposure. Previously proposed PCL-5 factor structures were not supported by the data, but instead a bifactor model is arguably more statistically appropriate.

  5. Utilization of an Enhanced Canonical Correlation Analysis (ECCA) to Predict Daily Precipitation and Temperature in a Semi-Arid Environment

    NASA Astrophysics Data System (ADS)

    Lopez, S. R.; Hogue, T. S.

    2011-12-01

    Global climate models (GCMs) are primarily used to generate historical and future large-scale circulation patterns at a coarse resolution (typical order of 50,000 km²) and fail to capture climate variability at the ground level due to localized surface influences (i.e., topography, marine layer, land cover, etc.). Their inability to accurately resolve these processes has led to the development of numerous 'downscaling' techniques. The goal of this study is to enhance statistical downscaling of daily precipitation and temperature for regions with heterogeneous land cover and topography. Our analysis was divided into two periods, historical (1961-2000) and contemporary (1980-2000), and tested using sixteen predictand combinations from four GCMs (GFDL CM2.0, GFDL CM2.1, CNRM-CM3 and MRI-CGCM2 3.2a). The Southern California area was separated into five county regions: Santa Barbara, Ventura, Los Angeles, Orange and San Diego. Principal component analysis (PCA) was performed on ground-based observations in order to (1) reduce the number of redundant gauges and minimize dimensionality and (2) cluster gauges that behave statistically similarly for post-analysis. Post-PCA analysis included extensive testing of predictor-predictand relationships using an enhanced canonical correlation analysis (ECCA). The ECCA includes obtaining the optimal predictand sets for all models within each spatial domain (county) as governed by daily and monthly overall statistics. Results show that all models maintain mean annual and monthly behavior within each county and that daily statistics are improved. The level of improvement depends strongly on the vegetation extent within each county and the land-to-ocean ratio within the GCM spatial grid. The utilization of the entire historical period also leads to better statistical representation of observed daily precipitation. The validated ECCA technique is being applied to future climate scenarios distributed by the IPCC in order to provide forcing data for regional hydrologic models and assess future water resources in the Southern California region.
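
    The canonical correlation core of such a downscaling scheme can be sketched with scikit-learn; the GCM predictor fields and gauge-cluster predictands below are random placeholders standing in for the ECCA's screened predictor-predictand sets.

        import numpy as np
        from sklearn.cross_decomposition import CCA

        rng = np.random.default_rng(11)
        n = 400                                    # days
        gcm = rng.normal(size=(n, 8))              # coarse-scale predictor fields
        shared = gcm[:, :2] @ rng.normal(size=(2, 5))
        station = shared + rng.normal(0, 0.8, (n, 5))   # local predictands

        cca = CCA(n_components=2).fit(gcm, station)
        U, V = cca.transform(gcm, station)
        for k in range(2):
            r = np.corrcoef(U[:, k], V[:, k])[0, 1]
            print(f"canonical pair {k + 1}: r = {r:.2f}")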

  6. Statistical testing and power analysis for brain-wide association study.

    PubMed

    Gong, Weikang; Wan, Lin; Lu, Wenlian; Ma, Liang; Cheng, Fan; Cheng, Wei; Grünewald, Stefan; Feng, Jianfeng

    2018-04-05

    The identification of connexel-wise associations, which involves examining functional connectivities between pairwise voxels across the whole brain, is both statistically and computationally challenging. Although such a connexel-wise methodology has recently been adopted by brain-wide association studies (BWAS) to identify connectivity changes in several mental disorders, such as schizophrenia, autism and depression, the multiple correction and power analysis methods designed specifically for connexel-wise analysis are still lacking. Therefore, we herein report the development of a rigorous statistical framework for connexel-wise significance testing based on the Gaussian random field theory. It includes controlling the family-wise error rate (FWER) of multiple hypothesis testings using topological inference methods, and calculating power and sample size for a connexel-wise study. Our theoretical framework can control the false-positive rate accurately, as validated empirically using two resting-state fMRI datasets. Compared with Bonferroni correction and false discovery rate (FDR), it can reduce false-positive rate and increase statistical power by appropriately utilizing the spatial information of fMRI data. Importantly, our method bypasses the need of non-parametric permutation to correct for multiple comparison, thus, it can efficiently tackle large datasets with high resolution fMRI images. The utility of our method is shown in a case-control study. Our approach can identify altered functional connectivities in a major depression disorder dataset, whereas existing methods fail. A software package is available at https://github.com/weikanggong/BWAS. Copyright © 2018 Elsevier B.V. All rights reserved.

  7. Multi-Genetic Marker Approach and Spatio-Temporal Analysis Suggest There Is a Single Panmictic Population of Swordfish Xiphias gladius in the Indian Ocean

    PubMed Central

    Muths, Delphine; Le Couls, Sarah; Evano, Hugues; Grewe, Peter; Bourjea, Jerome

    2013-01-01

    Genetic population structure of swordfish Xiphias gladius was examined based on 2231 individual samples, collected mainly between 2009 and 2010, among three major sampling areas within the Indian Ocean (IO; twelve distinct sites), Atlantic (two sites) and Pacific (one site) Oceans, using analysis of nineteen microsatellite loci (n = 2146) and mitochondrial ND2 sequences (n = 2001). Sample collection was stratified in time and space in order to investigate the stability of the genetic structure observed, with a special focus on the South West Indian Ocean. Significant AMOVA variance was observed for both markers, indicating genetic population subdivision between oceans. The overall F-statistics value for ND2 sequences confirmed that Atlantic and Indian Ocean swordfish represent two distinct genetic stocks. Indo-Pacific differentiation was also significant but lower than that observed between the Atlantic and Indian Oceans. However, microsatellite F-statistics failed to reveal structure even at the inter-oceanic scale, indicating that the resolving power of our microsatellite loci was insufficient for detecting population subdivision. At the scale of the Indian Ocean, results obtained from both markers are consistent with swordfish belonging to a single panmictic population. Analyses partitioned by sampling area, season, or sex also failed to identify any clear structure within this ocean. Such large spatial and temporal homogeneity of genetic structure, observed for such a large highly mobile pelagic species, suggests it is satisfactory to consider swordfish a single panmictic population in the Indian Ocean. PMID:23717447

  8. The Future of Statistics as a Discipline.

    DTIC Science & Technology

    1981-09-01

    University Department of Statistics, Tallahassee, Florida 32306. Lecture delivered at the 141st Annual Meeting of the American Statistical Association. ...from the real world. While the academicians too often fail to enrich their instruction and research with real-life problems, practitioners do not...

  9. Sources of Error and the Statistical Formulation of MS:mb Seismic Event Screening Analysis

    NASA Astrophysics Data System (ADS)

    Anderson, D. N.; Patton, H. J.; Taylor, S. R.; Bonner, J. L.; Selby, N. D.

    2014-03-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT), a global ban on nuclear explosions, is currently in a ratification phase. Under the CTBT, an International Monitoring System (IMS) of seismic, hydroacoustic, infrasonic and radionuclide sensors is operational, and the data from the IMS is analysed by the International Data Centre (IDC). The IDC provides CTBT signatories basic seismic event parameters and a screening analysis indicating whether an event exhibits explosion characteristics (for example, shallow depth). An important component of the screening analysis is a statistical test of the null hypothesis H0: explosion characteristics, using empirical measurements of seismic energy (magnitudes). The established magnitude used for event size is the body-wave magnitude (denoted mb), computed from the initial segment of a seismic waveform. IDC screening analysis is applied to events with mb greater than 3.5. The Rayleigh wave magnitude (denoted MS) is a measure of later-arriving surface wave energy. Magnitudes are measurements of seismic energy that include adjustments (physical correction model) for path and distance effects between event and station. Relative to mb, earthquakes generally have a larger MS magnitude than explosions. This article proposes a hypothesis test (screening analysis) using MS and mb that expressly accounts for physical correction model inadequacy in the standard error of the test statistic. With this hypothesis test formulation, the announced 2009 nuclear weapon test of the Democratic People's Republic of Korea fails to reject the null hypothesis H0: explosion characteristics.
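
    The structure of such a screening test can be sketched as a one-sided z-test on the MS:mb discriminant in which model inadequacy adds an extra variance term to the standard error; every number below is illustrative, not an IDC or published value.

        from math import sqrt
        from scipy.stats import norm

        def screen(ms, mb, a=1.25, b=-2.20, sd_meas=0.15, sd_model=0.20,
                   alpha=0.05):
            """Reject H0 (explosion characteristics) when MS sits significantly
            above the assumed explosion-population line a*mb + b. All parameter
            values here are placeholders, not calibrated constants."""
            z = (ms - (a * mb + b)) / sqrt(sd_meas**2 + sd_model**2)
            return z, z > norm.ppf(1.0 - alpha)

        print(screen(ms=4.4, mb=4.5))   # (z, screened out as earthquake-like?)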

  10. What Is Missing in Counseling Research? Reporting Missing Data

    ERIC Educational Resources Information Center

    Sterner, William R.

    2011-01-01

    Missing data have long been problematic in quantitative research. Despite the statistical and methodological advances made over the past 3 decades, counseling researchers fail to provide adequate information on this phenomenon. Interpreting the complex statistical procedures and esoteric language seems to be a contributing factor. An overview of…

  11. Lack of grading agreement among international hemostasis external quality assessment programs

    PubMed Central

    Olson, John D.; Jennings, Ian; Meijer, Piet; Bon, Chantal; Bonar, Roslyn; Favaloro, Emmanuel J.; Higgins, Russell A.; Keeney, Michael; Mammen, Joy; Marlar, Richard A.; Meley, Roland; Nair, Sukesh C.; Nichols, William L.; Raby, Anne; Reverter, Joan C.; Srivastava, Alok; Walker, Isobel

    2018-01-01

    Laboratory quality programs rely on internal quality control and external quality assessment (EQA). EQA programs provide unknown specimens for the laboratory to test; the laboratory's result is compared with those of other (peer) laboratories performing the same test. EQA programs assign target values using a variety of statistical tools, and a performance assessment of ‘pass’ or ‘fail’ is made. EQA provider members of the international organization External Quality Assurance in Thrombosis and Hemostasis took part in a study to compare the outcomes of performance analysis using the same data set of laboratory results. Eleven EQA organizations using eight different analytical approaches participated. Data for a normal and a prolonged activated partial thromboplastin time (aPTT) and a normal and a reduced factor VIII (FVIII) from 218 laboratories were sent to the EQA providers, who analyzed the data set using their method of evaluation for aPTT and FVIII, determining the performance for each laboratory record in the data set. Providers also summarized their statistical approach to assignment of target values and laboratory performance. Each laboratory record in the data set was graded pass/fail by all EQA providers for each of the four analytes. There was a lack of agreement of pass/fail grading among EQA programs. Discordance in the grading was 17.9% and 11% of normal and prolonged aPTT results, respectively, and 20.2% and 17.4% of normal and reduced FVIII results, respectively. All EQA programs in this study employed statistical methods compliant with International Organization for Standardization standard ISO 13528, yet the evaluation of laboratory results for all four analytes showed remarkable grading discordance. PMID:29232255
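
    As a concrete illustration of how one provider might grade a record, the sketch below applies a z-score against a consensus target, with |z| <= 2 graded as a pass; this is one common ISO 13528-style approach, not the method of any particular provider in the study, and the aPTT values are invented.

        # Minimal sketch of a z-score-based pass/fail grading (one of many
        # possible EQA evaluation schemes); data are hypothetical.
        import numpy as np

        def grade(results, assigned=None, sigma_pt=None):
            results = np.asarray(results, dtype=float)
            if assigned is None:                  # consensus target value
                assigned = np.median(results)
            if sigma_pt is None:                  # robust spread estimate
                sigma_pt = 1.4826 * np.median(np.abs(results - assigned))
            z = (results - assigned) / sigma_pt
            return ["pass" if abs(zi) <= 2 else "fail" for zi in z]

        aptt_seconds = [28.1, 29.4, 27.8, 35.2, 28.6]   # invented aPTT results
        print(grade(aptt_seconds))                      # the outlier fails

    Because each provider chooses its own target value, spread estimate, and cut-off, two defensible schemes can grade the same laboratory result differently, which is exactly the discordance the study documents.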

  12. Statistically Controlling for Confounding Constructs Is Harder than You Think

    PubMed Central

    Westfall, Jacob; Yarkoni, Tal

    2016-01-01

    Social scientists often seek to demonstrate that a construct has incremental validity over and above other related constructs. However, these claims are typically supported by measurement-level models that fail to consider the effects of measurement (un)reliability. We use intuitive examples, Monte Carlo simulations, and a novel analytical framework to demonstrate that common strategies for establishing incremental construct validity using multiple regression analysis exhibit extremely high Type I error rates under parameter regimes common in many psychological domains. Counterintuitively, we find that error rates are highest—in some cases approaching 100%—when sample sizes are large and reliability is moderate. Our findings suggest that a potentially large proportion of incremental validity claims made in the literature are spurious. We present a web application (http://jakewestfall.org/ivy/) that readers can use to explore the statistical properties of these and other incremental validity arguments. We conclude by reviewing SEM-based statistical approaches that appropriately control the Type I error rate when attempting to establish incremental validity. PMID:27031707
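
    The mechanism is easy to reproduce. In the sketch below (all parameter values are illustrative), the outcome depends only on a latent construct; two fallible measures of that construct are entered into a regression, and the "incremental" predictor is declared significant far more often than the nominal 5%.

        # Monte Carlo sketch of spurious incremental validity under
        # measurement unreliability; parameters are illustrative.
        import numpy as np

        rng = np.random.default_rng(1)
        n, reps, rel = 500, 2000, 0.7    # sample size, simulations, reliability
        false_pos = 0
        for _ in range(reps):
            c = rng.standard_normal(n)               # latent construct
            y = c + rng.standard_normal(n)           # outcome driven only by c
            x1 = np.sqrt(rel) * c + np.sqrt(1 - rel) * rng.standard_normal(n)
            x2 = np.sqrt(rel) * c + np.sqrt(1 - rel) * rng.standard_normal(n)
            X = np.column_stack([np.ones(n), x1, x2])
            beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
            sigma2 = res[0] / (n - 3)
            se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[2, 2])
            if abs(beta[2] / se) > 1.96:   # x2 looks "incrementally valid"
                false_pos += 1
        print(false_pos / reps)            # far above the nominal 0.05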

  13. A study of environmental characterization of conventional and advanced aluminum alloys for selection and design. Phase 2: The breaking load test method

    NASA Technical Reports Server (NTRS)

    Sprowls, D. O.; Bucci, R. J.; Ponchel, B. M.; Brazill, R. L.; Bretz, P. E.

    1984-01-01

    A technique is demonstrated for accelerated stress corrosion testing of high strength aluminum alloys. The method offers better precision and shorter exposure times than traditional pass/fail procedures. The approach uses data from tension tests performed on replicate groups of smooth specimens after various lengths of exposure to static stress. The breaking strength measures degradation in the test specimen's load-carrying ability due to environmental attack. Analysis of breaking load data by extreme value statistics enables the calculation of survival probabilities and a statistically defined threshold stress applicable to the specific test conditions. A fracture mechanics model is given which quantifies depth of attack in the stress-corroded specimen by an effective flaw size calculated from the breaking stress and the material's strength and fracture toughness properties. Comparisons are made with experimental results from three tempers of 7075 alloy plate tested by the breaking load method and by traditional tests of statically loaded smooth tension bars and conventional precracked specimens.
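
    A minimal sketch of the statistical step, assuming a Weibull (extreme value) model for breaking strength; the strengths below are invented, not the 7075 plate data from the study.

        # Fit an extreme-value model to breaking strengths of replicate
        # specimens after a fixed exposure, then read off the survival
        # probability at a candidate stress. Data are hypothetical.
        import numpy as np
        from scipy import stats

        breaking_ksi = np.array([62.1, 58.4, 65.0, 55.2, 60.7, 57.9, 63.3, 59.5])
        shape, loc, scale = stats.weibull_min.fit(breaking_ksi, floc=0.0)

        stress = 50.0                      # candidate service stress, ksi
        p_survive = stats.weibull_min.sf(stress, shape, loc=loc, scale=scale)
        print(f"P(breaking strength > {stress} ksi) = {p_survive:.3f}")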

  14. Con: Meta-analysis: some key limitations and potential solutions.

    PubMed

    Esterhuizen, Tonya M; Thabane, Lehana

    2016-06-01

    Meta-analysis, a statistical combination of the results of several trials to produce a summary effect, has been subject to criticism in the past, mainly because of the poor quality of included studies, heterogeneity between the studies meta-analyzed, and failure to address publication bias. These limitations can make the results misleading, which matters when policy and practice decisions are based on systematic reviews and meta-analyses. We elaborate on these limitations and illustrate them with examples from the nephrology literature. Finally, we present some potential solutions, notably education in meta-analysis for evidence producers and consumers, as well as the use of individual patient data for meta-analyses. © The Author 2016. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.

  15. Assessing first year radiology resident competence pre-call: development and implementation of a computer-based exam before and after the 12 month training requirement.

    PubMed

    Khan, Rihan; Krupinski, Elizabeth; Graham, J Allen; Benodin, Les; Lewis, Petra

    2012-06-01

    Whether first-year radiology residents are ready to start call after 6 or 12 months has been a subject of much debate. The purpose of this study was to establish an assessment that would evaluate the call readiness of first-year radiology residents and identify any individual areas of weakness using a comprehensive computerized format. Secondarily, we evaluated for any significant differences in performance before and after the change in pre-call training requirement from 6 to 12 months. A list of >140 potential emergency radiology cases was given to first-year radiology residents at the beginning of the academic year. Over 4 years, three separate versions of a computerized examination were constructed using hyperlinked PowerPoint presentations and given to both first-year and second-year residents. No resident took the same version of the exam twice. Exam score and number of cases failed were assessed. Individual areas of weakness were identified and remediated with the residents. Statistical analysis was used to evaluate exam score and the number of cases failed, considering resident year and the three versions of the exam. Over 4 years, 17 of 19 (89%) first-year radiology residents passed the exam on the first attempt. The two who failed were remediated and passed a different version of the exam 6 weeks later. Using the oral board scoring system, first-year radiology residents scored an average of 70.7 with 13 cases failed, compared with 71.1 with eight cases failed for second-year residents; the difference in favor of second-year residents was statistically significant. No significant difference was found in first-year resident scoring before and after the 12-month training requirement prior to call. An emergency radiology examination was established to aid in the assessment of first-year radiology residents' competency prior to starting call, and it has become a permanent part of the first-year curriculum. Over 4 years, all first-year residents were ultimately judged ready to start call. Of the variables assessed, only resident year showed a significant difference in scoring parameters. In particular, length of training prior to taking call showed no significant difference. Areas of weakness were identified for further study. Copyright © 2012 AUR. Published by Elsevier Inc. All rights reserved.

  16. Trial Sequential Analysis in systematic reviews with meta-analysis.

    PubMed

    Wetterslev, Jørn; Jakobsen, Janus Christian; Gluud, Christian

    2017-03-06

    Most meta-analyses in systematic reviews, including Cochrane ones, do not have sufficient statistical power to detect or refute even large intervention effects. This is why a meta-analysis ought to be regarded as an interim analysis on its way towards a required information size. The results of meta-analyses should relate the total number of randomised participants to the estimated required meta-analytic information size, accounting for statistical diversity. When the number of participants and the corresponding number of trials in a meta-analysis are insufficient, the use of the traditional 95% confidence interval or the 5% statistical significance threshold will lead to too many false positive conclusions (type I errors) and too many false negative conclusions (type II errors). We developed a methodology for interpreting meta-analysis results, using generally accepted, valid evidence on how to adjust thresholds for significance in randomised clinical trials when the required sample size has not been reached. The Lan-DeMets trial sequential monitoring boundaries in Trial Sequential Analysis offer adjusted confidence intervals and restricted thresholds for statistical significance when the diversity-adjusted required information size and the corresponding number of required trials for the meta-analysis have not been reached. Trial Sequential Analysis provides a frequentist approach to control both type I and type II errors. We define the required information size and the corresponding number of required trials in a meta-analysis and the diversity (D^2) measure of heterogeneity. We explain the reasons for using Trial Sequential Analysis of meta-analysis when the actual information size fails to reach the required information size. We present examples drawn from traditional meta-analyses using unadjusted naïve 95% confidence intervals and 5% thresholds for statistical significance. Spurious conclusions in systematic reviews with traditional meta-analyses can be reduced using Trial Sequential Analysis. Several empirical studies have demonstrated that Trial Sequential Analysis provides better control of type I and type II errors than traditional naïve meta-analysis. Trial Sequential Analysis represents analysis of meta-analytic data with transparent assumptions and better control of type I and type II errors than traditional meta-analysis using naïve unadjusted confidence intervals.
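
    For a two-arm comparison of means, one common form of the required information size is RIS = 4 (z_{1-alpha/2} + z_{1-beta})^2 sigma^2 / delta^2, inflated by 1/(1 - D^2) to account for diversity. The sketch below computes this quantity under illustrative inputs; actual Trial Sequential Analysis software additionally supplies the monitoring boundaries.

        # Diversity-adjusted required information size (RIS); inputs are
        # illustrative, not from any particular meta-analysis.
        from scipy.stats import norm

        def required_information_size(delta, sigma, alpha=0.05, beta=0.10,
                                      D2=0.25):
            z_a = norm.ppf(1 - alpha / 2)
            z_b = norm.ppf(1 - beta)
            ris_fixed = 4 * (z_a + z_b) ** 2 * sigma ** 2 / delta ** 2
            return ris_fixed / (1 - D2)    # D^2 (diversity) adjustment

        # About 224 participants are required under these assumptions.
        print(round(required_information_size(delta=0.5, sigma=1.0)))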

  17. Can PC-9 Zhong chong replace K-1 Yong quan for the acupunctural resuscitation of a bilateral double-amputee? Stating the “random criterion problem” in its statistical analysis

    PubMed Central

    Inchauspe, Adrián Angel

    2016-01-01

    AIM: To present an inclusion criterion for patients who have suffered bilateral amputation, so that they can be treated with the supplementary resuscitation treatment proposed here by the author. METHODS: The work is based on a retrospective cohort model, so that exposing a control group to a certainly lethal risk is avoided. RESULTS: This paper presents a hypothesis on the acupunctural PC-9 Zhong chong point, further supported by previous statistical work recorded for the K-1 Yong quan resuscitation point. CONCLUSION: With the application of the resuscitation maneuver proposed here on the aforementioned point, patients with bilateral amputation would have another alternative treatment available should basic and advanced CPR fail. PMID:27152257

  18. Directions for new developments on statistical design and analysis of small population group trials.

    PubMed

    Hilgers, Ralf-Dieter; Roes, Kit; Stallard, Nigel

    2016-06-14

    Most statistical design and analysis methods for clinical trials have been developed and evaluated in settings where at least several hundred patients could be recruited. These methods may not be suitable for evaluating therapies if the sample size is unavoidably small, a setting usually termed small populations. The specific sample size cut-off where the standard methods fail needs to be investigated. In this paper, the authors present their view on new developments for design and analysis of clinical trials in small population groups, where conventional statistical methods may be inappropriate, e.g., because of lack of power or poor adherence to asymptotic approximations due to sample size restrictions. Following the EMA/CHMP guideline on clinical trials in small populations, we consider directions for new developments in the area of statistical methodology for design and analysis of small population clinical trials. We relate the findings to the research activities of three projects, Asterix, IDeAl, and InSPiRe, which have received funding since 2013 within the FP7-HEALTH-2013-INNOVATION-1 framework of the EU. As not all aspects of the wide research area of small population clinical trials can be addressed, we focus on areas where we feel advances are needed and feasible. The general framework of the EMA/CHMP guideline on small population clinical trials stimulates a number of research areas. These serve as the basis for the three projects, Asterix, IDeAl, and InSPiRe, which use various approaches to develop new statistical methodology for design and analysis of small population clinical trials. Small population clinical trials refer to trials with a limited number of patients. Small populations may result from rare diseases or specific subtypes of more common diseases. New statistical methodology needs to be tailored to these specific situations. The main results from the three projects will constitute a useful toolbox for improved design and analysis of small population clinical trials. They address various challenges presented by the EMA/CHMP guideline as well as recent discussions about extrapolation. There is a need for involvement of the patients' perspective in the planning and conduct of small population clinical trials for a successful therapy evaluation.

  19. ANALYSIS OF ENTEROCOCCUS FAECALIS IN SAMPLES FROM TURKISH PATIENTS WITH PRIMARY ENDODONTIC INFECTIONS AND FAILED ENDODONTIC TREATMENT BY REAL-TIME PCR SYBR GREEN METHOD

    PubMed Central

    Ozbek, Selcuk M.; Ozbek, Ahmet; Erdogan, Aziz S.

    2009-01-01

    Objective: The aims of this study were to investigate the presence of Enterococcus faecalis in primary endodontic infections and failed endodontic treatments using real-time PCR and to determine the statistical importance of the presence of E. faecalis in a Turkish population with endodontic infections. Material and Methods: E. faecalis was investigated from 79 microbial samples collected from patients who were treated at the Endodontic Clinic of the Dental School of Atatürk University (Erzurum, Turkey). Microbial samples were taken from 43 patients (Group 1) with failed endodontic treatments and 36 patients (Group 2) with chronic apical periodontitis (primary endodontic infections). DNA was extracted from the samples by using a QIAamp® DNA mini-kit and analyzed with real-time PCR SYBR Green. Results: E. faecalis was detected in 41 out of 79 patients, suggesting that it exists in not less than 61% of all endodontic infections when the proportion test (z= -1.645,

  20. Abstracts of ARI Research Publications, FY 1978

    DTIC Science & Technology

    1980-09-01

    initial item pool, 49 items were identified as having significant item-to-total-score correlations and were statistically determined to address a...failing. Differences among the three groups on main gun performance measures and the previous experience of gunners were not statistically significant...forms of the noncognitive coding speed test; and (d) a second field administration to derive norms and other statistical characteristics of the new

  1. Failed Pavlik harness treatment for DDH as a risk factor for avascular necrosis.

    PubMed

    Tiruveedhula, Madhu; Reading, Isabel C; Clarke, Nicholas M P

    2015-03-01

    Avascular necrosis (AVN) of the femoral head is an irreversible complication seen in the treatment of developmental dysplasia of the hip (DDH) with the Pavlik harness. Its incidence is reported to be low after successful reduction of the hip but high if the hip is not concentrically relocated. We aimed to investigate its incidence after failed Pavlik harness treatment. We prospectively followed a group of children whose Pavlik harness treatment for DDH had failed, treated at our institution by the senior author between 1988 and 2001, and compared their rates of AVN with those of a group of children who presented late and hence were treated surgically. AVN was graded as described by Kalamchi and MacEwen, and only grade 2 to 4 AVN was considered significant and included in the analysis. Thirty-seven hips were included in the failed Pavlik group (group 1) and 86 hips in the no-Pavlik group (group 2). Ten hips in group 1 developed AVN (27%), whereas only 7 hips in group 2 (8%) did; the odds ratio for developing AVN after failed Pavlik treatment was 4.7 (95% confidence interval, 1.3-14.1; P=0.009), with a relative risk of 3.32 (range, 1.37 to 8.05). There was no statistically significant association between duration of splintage and severity of AVN (Spearman's correlation, -0.46; P=0.18). However, a positive correlation was noted between age at presentation and severity of AVN. We therefore advise close monitoring of hips in the Pavlik harness and discontinuing its use if the hips are not reduced within 3 weeks. Level III.

  2. Chlorine dioxide water disinfection: a prospective epidemiology study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michael, G.E.; Miday, R.K.; Bercz, J.P.

    An epidemiologic study of 198 persons exposed for 3 months to drinking water disinfected with chlorine dioxide was conducted in a rural village. A control population of 118 nonexposed persons was also studied. Pre-exposure hematologic and serum chemical parameters were compared with test results after 115 days of exposure. Chlorite ion levels in the water averaged approximately 5 ppm during the study period. Statistical analysis (ANOVA) of the data failed to identify any significant exposure-related effects. This study suggests that future evaluations of chlorine dioxide disinfection should be directed toward populations with potentially increased sensitivity to hemolytic agents.

  3. [Tadalafil combined with behavior therapy for semen collection from infertile males in whom masturbation fails].

    PubMed

    Tang, Wen-Hao; Jiang, Hui; Ma, Lu-Lin; Hong, Kai; Zhao, Lian-Ming; Liu, De-Feng; Mao, Jia-Ming; Yang, Yi; Zhang, Ju; Gao, Ling; Qiao, Jie

    2013-05-01

    To study the effect of Tadalafil combined with behavior therapy in helping obtain semen from infertile men in whom masturbation has failed. Sixty infertile male patients from whom semen could not be obtained by masturbation were equally assigned to receive Tadalafil combined with behavior therapy (combination group) or Tadalafil only (control group). All the patients took Tadalafil 20 mg orally the night before the day of semen collection by masturbation. Before this procedure, the patients of the combination group practiced masturbation 16-24 times at home. The average ages of the patients were (37.0 +/- 5.1) yr and (37.5 +/- 5.2) yr, and their IIEF-5 scores were 16.50 +/- 1.25 and 16.90 +/- 1.09 in the combination and control groups, respectively, with no statistically significant differences between the two groups. Semen was successfully obtained from 9 patients (30.0%) of the combination group and 1 patient (3.33%) of the control group, a statistically significant difference (chi2 = 7.680, P < 0.01). By training the patients and establishing a conditioned response to masturbation, Tadalafil combined with behavior therapy can significantly increase the success rate of semen collection from infertile male patients in whom masturbation fails.

  4. How White Teachers Experience and Think about Race in Professional Development

    ERIC Educational Resources Information Center

    Marcy, Renee

    2010-01-01

    The public educational system in the United States fails to proficiently educate a majority of African American students, Latino/a students, and students from low-income backgrounds. Test score statistics show an average scaled-score gap of twenty-six points between African American and White students (National Center for Education Statistics, 2007). The term…

  5. Incorporating Nonparametric Statistics into Delphi Studies in Library and Information Science

    ERIC Educational Resources Information Center

    Ju, Boryung; Jin, Tao

    2013-01-01

    Introduction: The Delphi technique is widely used in library and information science research. However, many researchers in the field fail to employ standard statistical tests when using this technique. This makes the technique vulnerable to criticisms of its reliability and validity. The general goal of this article is to explore how…

  6. Leasing Into the Sun: A Mixed Method Analysis of Transactions of Homes with Third Party Owned Solar

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoen, Ben; Rand, Joseph; Adomatis, Sandra

    This analysis is the first to examine whether homes with third-party owned (TPO) PV systems are unique in the marketplace as compared to non-PV or non-TPO PV homes. This is of growing importance, as the number of homes with TPO systems in the US is currently nearly half a million and growing. A hedonic pricing model analysis of 20,106 homes that sold in California between 2011 and 2013 is conducted, as well as a paired sales analysis of 18 pairs of TPO PV and non-PV homes in San Diego spanning 2012 and 2013. The hedonic model examined 2,914 non-TPO PV home sales and 113 TPO PV sales and fails to uncover statistically significant premiums for TPO PV homes or for those with pre-paid leases as compared to non-PV homes. Similarly, the paired sales analysis does not find evidence of an impact on value for the TPO homes when compared to non-PV homes. Analyses of non-TPO PV sales, both here and previously, have found larger and statistically significant premiums. Collection of a larger dataset covering the present period is recommended for future analyses so that smaller, more nuanced and recent effects can be discovered.
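
    The hedonic approach itself is a regression of (log) sale price on home characteristics plus indicators for PV ownership type. The sketch below uses simulated data whose variable names and effect sizes are invented; it merely shows the structure of such a model, with the study's qualitative pattern (a premium for host-owned PV, none for TPO PV) built into the simulation.

        # Hedonic pricing model sketch on simulated sales; all names, sizes
        # and effects are hypothetical.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 1000
        sqft = rng.uniform(1200, 2800, n)
        age = rng.uniform(0, 40, n)
        tpo_pv = rng.binomial(1, 0.05, n)                   # third-party owned
        owned_pv = (1 - tpo_pv) * rng.binomial(1, 0.10, n)  # host owned
        log_price = (11.5 + 4e-4 * sqft - 4e-3 * age
                     + 0.04 * owned_pv + 0.0 * tpo_pv       # no TPO premium
                     + rng.normal(0, 0.10, n))

        X = sm.add_constant(np.column_stack([sqft, age, tpo_pv, owned_pv]))
        fit = sm.OLS(log_price, X).fit()
        names = ["const", "sqft", "age", "tpo_pv", "owned_pv"]
        print(dict(zip(names, fit.params.round(4))))
        print(dict(zip(names, fit.pvalues.round(3))))       # tpo_pv: n.s.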

  7. Formalizing the definition of meta-analysis in Molecular Ecology.

    PubMed

    ArchMiller, Althea A; Bauer, Eric F; Koch, Rebecca E; Wijayawardena, Bhagya K; Anil, Ammu; Kottwitz, Jack J; Munsterman, Amelia S; Wilson, Alan E

    2015-08-01

    Meta-analysis, the statistical synthesis of pertinent literature to develop evidence-based conclusions, is relatively new to the field of molecular ecology, with the first meta-analysis published in the journal Molecular Ecology in 2003 (Slate & Phua 2003). The goal of this article is to formalize the definition of meta-analysis for the authors, editors, reviewers and readers of Molecular Ecology by completing a review of the meta-analyses previously published in this journal. We also provide a brief overview of the many components required for meta-analysis with a more specific discussion of the issues related to the field of molecular ecology, including the use and statistical considerations of Wright's F_ST and its related analogues as effect sizes in meta-analysis. We performed a literature review to identify articles published as 'meta-analyses' in Molecular Ecology, which were then evaluated by at least two reviewers. We specifically targeted Molecular Ecology publications because as a flagship journal in this field, meta-analyses published in Molecular Ecology have the potential to set the standard for meta-analyses in other journals. We found that while many of these reviewed articles were strong meta-analyses, others failed to follow standard meta-analytical techniques. One of these unsatisfactory meta-analyses was in fact a secondary analysis. Other studies attempted meta-analyses but lacked the fundamental statistics that are considered necessary for an effective and powerful meta-analysis. By drawing attention to the inconsistency of studies labelled as meta-analyses, we emphasize the importance of understanding the components of traditional meta-analyses to fully embrace the strengths of quantitative data synthesis in the field of molecular ecology. © 2015 John Wiley & Sons Ltd.
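
    For readers unfamiliar with the quantitative core that separates a true meta-analysis from a narrative review, the sketch below computes an inverse-variance weighted summary effect with a DerSimonian-Laird estimate of between-study variance. The effect sizes and variances are invented stand-ins (in molecular ecology they might be Fisher-z transformed correlations or F_ST-based effect sizes).

        # Random-effects (DerSimonian-Laird) meta-analysis sketch on
        # hypothetical effect sizes.
        import numpy as np

        yi = np.array([0.30, 0.10, 0.45, 0.25, 0.05])   # study effect sizes
        vi = np.array([0.02, 0.05, 0.04, 0.03, 0.06])   # within-study variances

        w = 1 / vi
        ybar_fe = np.sum(w * yi) / np.sum(w)            # fixed-effect mean
        Q = np.sum(w * (yi - ybar_fe) ** 2)             # heterogeneity statistic
        C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (Q - (len(yi) - 1)) / C)        # between-study variance

        w_re = 1 / (vi + tau2)
        ybar_re = np.sum(w_re * yi) / np.sum(w_re)      # random-effects mean
        se = np.sqrt(1 / np.sum(w_re))
        print(ybar_re, ybar_re - 1.96 * se, ybar_re + 1.96 * se)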

  8. The effect of filtered speech feedback on the frequency of stuttering

    NASA Astrophysics Data System (ADS)

    Rami, Manish Krishnakant

    2000-10-01

    This study investigated the effects of filtered components of speech and whispered speech on the frequency of stuttering. Choral speech, shadowing, and altered auditory feedback are the only known conditions that induce fluency in people who stutter without any effort beyond that normally required to speak. All these conditions use speech as a second signal. This experiment examined the role of components of the speech signal as delineated by the source-filter theory of speech production. Three filtered speech signals, a whispered speech signal, and a choral speech signal formed the stimuli. It was postulated that if the speech signal as a whole was necessary for producing fluency in people who stutter, then all conditions except choral speech should fail to produce fluency enhancement. If the glottal source alone was adequate for restoring fluency, then only the conditions of NAF and whispered speech should fail to promote fluency. In the event that full filter characteristics are necessary for the fluency-creating effects, then all conditions except choral speech and whispered speech should fail to produce fluency. If any part of the filter characteristics is sufficient to yield fluency, then only NAF and the approximate glottal source should fail to demonstrate an increase in the amount of fluency. Twelve adults who stuttered read passages while receiving auditory feedback consisting of one of six experimental conditions: (a) NAF; (b) approximate glottal source; (c) glottal source and first formant; (d) glottal source and first two formants; (e) whispered speech; and (f) choral speech. Frequencies of stuttering were obtained for each condition and submitted to descriptive and inferential statistical analysis. Statistically significant differences in means were found across the feedback conditions. Specifically, the choral speech, the source and first formant, the source and first two formants, and the whispered speech conditions all decreased the frequency of stuttering, while the approximate glottal source did not. It is suggested that articulatory events, chiefly the encoded speech output of vocal tract origin, afford effective cues and induce fluent speech in people who stutter.

  9. Author Self-disclosure Compared with Pharmaceutical Company Reporting of Physician Payments.

    PubMed

    Alhamoud, Hani A; Dudum, Ramzi; Young, Heather A; Choi, Brian G

    2016-01-01

    Industry manufacturers are required by the Sunshine Act to disclose payments to physicians. These data recently became publicly available, but some manufacturers pre-released their data as early as 2009. We tested the hypothesis that there would be discrepancies between manufacturers' and physicians' disclosures. The financial disclosures by authors of all 39 American College of Cardiology and American Heart Association guidelines between 2009 and 2012 were matched to the public disclosures of 15 pharmaceutical companies during that same period. Duplicate authors across guidelines were assessed independently. Per the guidelines, payments <$10,000 are modest and ≥$10,000 are significant. Agreement was determined using a κ statistic; Fisher's exact and Mann-Whitney tests were used to detect statistical significance. The overall agreement between author and company disclosures was poor (κ = 0.238). There was a significant difference in error rates of disclosure among companies and authors (P = .019). Companies failed to match 71.6% of disclosures made by authors, and authors failed to match 54.7% of disclosures made by companies. Our analysis shows a concerning level of disagreement between guideline authors' and pharmaceutical companies' disclosures. Without the ability for physicians to challenge reports, it is unclear whether these discrepancies reflect undisclosed relationships with industry or errors in reporting, and caution is advised in interpreting data from the Sunshine Act. Copyright © 2016 Elsevier Inc. All rights reserved.
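
    The agreement statistic used here is Cohen's kappa on paired disclosure indicators. A minimal sketch, with made-up vectors (1 = relationship disclosed, 0 = not):

        # Cohen's kappa on hypothetical author/company disclosure pairs.
        from sklearn.metrics import cohen_kappa_score

        author_says  = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
        company_says = [1, 0, 0, 1, 0, 1, 0, 0, 1, 0]
        print(cohen_kappa_score(author_says, company_says))  # ~0.23, poor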

  10. Models for inference in dynamic metacommunity systems

    USGS Publications Warehouse

    Dorazio, Robert M.; Kery, Marc; Royle, J. Andrew; Plattner, Matthias

    2010-01-01

    A variety of processes are thought to be involved in the formation and dynamics of species assemblages. For example, various metacommunity theories are based on differences in the relative contributions of dispersal of species among local communities and interactions of species within local communities. Interestingly, metacommunity theories continue to be advanced without much empirical validation. Part of the problem is that statistical models used to analyze typical survey data either fail to specify ecological processes with sufficient complexity or they fail to account for errors in detection of species during sampling. In this paper, we describe a statistical modeling framework for the analysis of metacommunity dynamics that is based on the idea of adopting a unified approach, multispecies occupancy modeling, for computing inferences about individual species, local communities of species, or the entire metacommunity of species. This approach accounts for errors in detection of species during sampling and also allows different metacommunity paradigms to be specified in terms of species- and location-specific probabilities of occurrence, extinction, and colonization: all of which are estimable. In addition, this approach can be used to address inference problems that arise in conservation ecology, such as predicting temporal and spatial changes in biodiversity for use in making conservation decisions. To illustrate, we estimate changes in species composition associated with the species-specific phenologies of flight patterns of butterflies in Switzerland for the purpose of estimating regional differences in biodiversity.
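
    The detection-error idea at the core of the framework can be shown with the single-species, single-season special case: each site is occupied with probability psi, and an occupied site yields a detection on each of J visits with probability p. A sketch under simulated data (all values illustrative; the binomial coefficient is omitted because it is constant in the parameters):

        # Maximum-likelihood occupancy model separating occurrence (psi)
        # from detection (p); data are simulated.
        import numpy as np
        from scipy.optimize import minimize
        from scipy.special import expit

        rng = np.random.default_rng(7)
        S, J, psi_true, p_true = 300, 4, 0.6, 0.3
        z = rng.binomial(1, psi_true, S)          # latent occupancy state
        y = rng.binomial(J, p_true * z)           # detections per site

        def nll(theta):
            psi, p = expit(theta)                 # keep probabilities in (0, 1)
            lik_occ = psi * p**y * (1 - p)**(J - y)
            lik_unocc = (1 - psi) * (y == 0)      # never-detected sites
            return -np.sum(np.log(lik_occ + lik_unocc))

        fit = minimize(nll, x0=[0.0, 0.0], method="Nelder-Mead")
        print(expit(fit.x))                       # estimates of (psi, p)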

  11. The glass ceiling is not fragile: A response to Odum (2000).

    PubMed

    McSweeney, F K; Swindell, S

    2001-01-01

    Odum (2000) criticized our recent conclusions about the participation of women in the experimental analysis of behavior (McSweeney & Swindell, 1998). We address her criticisms here. We argue against the need for statistical tests. We show that our conclusions still apply to all journals except the Journal of the Experimental Analysis of Behavior even when we include the senior editorial staff along with members of the editorial board. We argue that the data that Odum provides to show gender equity are limited, inconsistent with past findings, and hard to interpret in the absence of other data. Finally, we argue that Odum failed to address our most convincing argument for gender inequity and misinterpreted our suggestions for improvements.

  12. Failed biliary cannulation: Clinical and technical outcomes after tertiary referral endoscopic retrograde cholangiopancreatography

    PubMed Central

    Swan, Michael P; Bourke, Michael J; Williams, Stephen J; Alexander, Sina; Moss, Alan; Hope, Rick; Ruppin, David

    2011-01-01

    AIM: Prospective evaluation of repeat endoscopic retrograde cholangiopancreatography (ERCP) for failed Schutz grade 1 biliary cannulation in a high-volume center. METHODS: Prospective intention-to-treat analysis of patients referred for biliary cannulation following recent unsuccessful ERCP. RESULTS: Fifty-one patients (35 female; mean age: 62.5 years; age range: 40-87 years) with previous failed biliary cannulation were referred for repeat ERCP. The indication for ERCP was primarily choledocholithiasis (45%) or pancreatic malignancy (18%). Biliary cannulation was successful in 100% of cases. The precut needle knife sphincterotomy (NKS) rate was 27.4%. Complications occurred in 3.9% (post-ERCP pancreatitis). An identifiable reason for the initial unsuccessful biliary cannulation was present in 55% of cases. Compared to a cohort of 940 naïve papilla patients (female 61%; mean age: 59.9 years; age range: 18-94 years) who required sphincterotomy over the same time period, there was no statistical difference in the cannulation success rate (100% vs 98%) or post-ERCP pancreatitis (3.1% vs 3.9%). Precut NKS use was more frequent (27.4% vs 12.7%) (P = 0.017). CONCLUSION: Referral to a high-volume center following unsuccessful ERCP is associated with high technical success and a favorable complication rate compared to routine ERCP procedures. PMID:22174549

  13. Violations of the ceiling principle: exact conditions and statistical evidence.

    PubMed Central

    Slimowitz, J R; Cohen, J E

    1993-01-01

    The National Research Council recommended the use of the ceiling principle in forensic applications of DNA testing on the grounds that the ceiling principle was believed to be "conservative," giving estimates greater than or equal to the actual genotype frequencies in the appropriate reference population. We show here that the ceiling principle can fail to be conservative in a population with two subpopulations and two loci, each with two alleles at Hardy-Weinberg equilibrium, if there is some linkage disequilibrium between loci. We also show that the ceiling principle can fail in a population with two subpopulations and a single locus with two alleles if Hardy-Weinberg equilibrium does not hold. We give explicit analytical formulas to describe when the ceiling principle fails. By showing that the ceiling principle is not always mathematically reliable, this analysis gives users of the ceiling principle the responsibility of demonstrating that it is conservative for the particular data with which it is used. Our reanalysis of VNTR data bases of the FBI provides compelling evidence of two-locus associations within three major ethnic groups (Caucasian, black, and Hispanic) in the United States, even though the loci tested are located on different chromosomes. Before the ceiling principle is implemented, more research should be done to determine whether it may be violated in practice. PMID:8328450

  14. Retrieval and clinical analysis of distraction-based dual growing rod constructs for early-onset scoliosis.

    PubMed

    Hill, Genevieve; Nagaraja, Srinidhi; Akbarnia, Behrooz A; Pawelek, Jeff; Sponseller, Paul; Sturm, Peter; Emans, John; Bonangelino, Pablo; Cockrum, Joshua; Kane, William; Dreher, Maureen

    2017-10-01

    Growing rod constructs are an important contribution for treating patients with early-onset scoliosis. These devices experience high failure rates, including rod fractures. The objective of this study was to identify the failure mechanism of retrieved growing rods, and to identify differences between patients with failed and intact constructs. Growing rod patients who had implant removal and were previously enrolled in a multicenter registry were eligible for this study. Forty dual-rod constructs were retrieved from 36 patients across four centers, and 34 of those constructs met the inclusion criteria. Eighteen constructs failed due to rod fracture. Sixteen intact constructs were removed due to final fusion (n=7), implant exchange (n=5), infection (n=2), or implant prominence (n=2). Analyses of clinical registry data, radiographs, and retrievals were the outcome measures. Retrievals were analyzed with microscopic imaging (optical and scanning electron microscopy) for areas of mechanical failure, damage, and corrosion. Failure analyses were conducted on the fracture surfaces to identify failure mechanism(s). Statistical analyses were performed to determine significant differences between the failed and intact groups. The failed rods fractured due to bending fatigue under flexion motion. Construct configuration and loading dictate high bending stresses at three distinct locations along the construct: (1) mid-construct, (2) adjacent to the tandem connector, or (3) adjacent to the distal anchor foundation. In addition, high torques used to insert set screws may create an initiation point for fatigue. Syndromic scoliosis, prior rod fractures, increase in patient weight, and rigid constructs consisting of tandem connectors and multiple crosslinks were associated with failure. This is the first study to examine retrieved, failed growing rod implants across multiple centers. Our analysis found that rod fractures are due to bending fatigue, and that stress concentrations play an important role in rod fractures. Recommendations are made on surgical techniques, such as the use of torque-limiting wrenches or not exceeding the prescribed torques. Additional recommendations include frequent rod replacement in select patients during scheduled surgeries. Published by Elsevier Inc.

  15. What's Trust Got to Do with It? A Communications and Engagement Guide for School Leaders Tackling the Problem of Persistently Failing Schools

    ERIC Educational Resources Information Center

    Johnson, Jean

    2011-01-01

    The rationale for taking bold action on the nation's persistently failing schools can be summed up in one dramatic and disturbing statistic: half of the young Americans who drop out of high school attend just 12 percent of the nation's schools. Ending the cycle of failure at schools is a daunting challenge and a surprisingly controversial one.…

  16. The Circulation Analysis of Serial Use: Numbers Game or Key to Service?

    PubMed Central

    Raisig, L. Miles

    1967-01-01

    The conventionally erected and reported circulation analysis of serial use in the individual and the feeder library is found to be statistically invalid and misleading, since it measures neither the intellectual use of the serial's contents nor the physical handlings of serial units, and is nonrepresentative of the in-depth library use of serials. It fails utterly to report or even to suggest the relation of intralibrary and interlibrary serial resources. The actual mechanics of the serial use analysis, and the active variables in the library situation which affect serial use, are demonstrated in a simulated analysis and are explained at length. A positive design is offered for the objective gathering and reporting of data on the local intellectual use and physical handling of serials and the relating of resources. Data gathering in the feeder library, and implications for the extension of the feeder library's resources, are discussed. PMID:6055863

  17. A complementation assay for in vivo protein structure/function analysis in Physcomitrella patens (Funariaceae)

    DOE PAGES

    Scavuzzo-Duggan, Tess R.; Chaves, Arielle M.; Roberts, Alison W.

    2015-07-14

    Here, a method for rapid in vivo functional analysis of engineered proteins was developed using Physcomitrella patens. A complementation assay was designed for testing structure/function relationships in cellulose synthase (CESA) proteins. The components of the assay include (1) construction of test vectors that drive expression of epitope-tagged PpCESA5 carrying engineered mutations, (2) transformation of a ppcesa5 knockout line that fails to produce gametophores with test and control vectors, (3) scoring the stable transformants for gametophore production, (4) statistical analysis comparing complementation rates for test vectors to positive and negative control vectors, and (5) analysis of transgenic protein expression by Western blotting. The assay distinguished mutations that generate fully functional, nonfunctional, and partially functional proteins. In conclusion, compared with existing methods for in vivo testing of protein function, this complementation assay provides a rapid method for investigating protein structure/function relationships in plants.

  18. Examining the effectiveness of discriminant function analysis and cluster analysis in species identification of male field crickets based on their calling songs.

    PubMed

    Jaiswara, Ranjana; Nandi, Diptarup; Balakrishnan, Rohini

    2013-01-01

    Traditional taxonomy based on morphology has often failed in accurate species identification owing to the occurrence of cryptic species, which are reproductively isolated but morphologically identical. Molecular data have thus been used to complement morphology in species identification. The sexual advertisement calls in several groups of acoustically communicating animals are species-specific and can thus complement molecular data as non-invasive tools for identification. Several statistical tools and automated identifier algorithms have been used to investigate the efficiency of acoustic signals in species identification. Despite a plethora of such methods, there is a general lack of knowledge regarding the appropriate usage of these methods in specific taxa. In this study, we investigated the performance of two commonly used statistical methods, discriminant function analysis (DFA) and cluster analysis, in identification and classification based on acoustic signals of field cricket species belonging to the subfamily Gryllinae. Using a comparative approach we evaluated the optimal number of species and calling song characteristics for both the methods that lead to most accurate classification and identification. The accuracy of classification using DFA was high and was not affected by the number of taxa used. However, a constraint in using discriminant function analysis is the need for a priori classification of songs. Accuracy of classification using cluster analysis, which does not require a priori knowledge, was maximum for 6-7 taxa and decreased significantly when more than ten taxa were analysed together. We also investigated the efficacy of two novel derived acoustic features in improving the accuracy of identification. Our results show that DFA is a reliable statistical tool for species identification using acoustic signals. Our results also show that cluster analysis of acoustic signals in crickets works effectively for species classification and identification.
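
    The contrast between the two methods is easy to demonstrate: discriminant analysis needs species labels a priori, while clustering does not, and its recovery of the true groups degrades as taxa are added. The sketch below uses simulated stand-ins for call features (e.g., syllable period, carrier frequency); nothing in it is the study's data or code.

        # Supervised DFA versus unsupervised clustering on simulated
        # acoustic features for six hypothetical species.
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.cluster import AgglomerativeClustering
        from sklearn.model_selection import cross_val_score
        from sklearn.metrics import adjusted_rand_score

        rng = np.random.default_rng(3)
        n_species, per_sp = 6, 40
        centers = rng.uniform(0, 10, size=(n_species, 3))
        X = np.vstack([c + rng.normal(0, 0.8, (per_sp, 3)) for c in centers])
        labels = np.repeat(np.arange(n_species), per_sp)

        # DFA: cross-validated classification accuracy (labels required).
        print(cross_val_score(LinearDiscriminantAnalysis(), X, labels, cv=5).mean())

        # Clustering: no labels used; compare recovered clusters to the truth.
        clusters = AgglomerativeClustering(n_clusters=n_species).fit_predict(X)
        print(adjusted_rand_score(labels, clusters))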

  19. Teaching Introductory Business Statistics Using the DCOVA Framework

    ERIC Educational Resources Information Center

    Levine, David M.; Stephan, David F.

    2011-01-01

    Introductory business statistics students often receive little guidance on how to apply the methods they learn to further business objectives they may one day face. And those students may fail to see the continuity among the topics taught in an introductory course if they learn those methods outside a context that provides a unifying framework.…

  20. Schools and Classes for the Blind 1926-27. Bulletin, 1928, No. 9

    ERIC Educational Resources Information Center

    Phillips, Frank M.

    1928-01-01

    This report contains statistics concerning schools and classes for blind pupils for the year 1926-27. Reports are included for 80 schools and institutions. Data concerning sight-saving classes are not included where it is possible to separate them from data concerning classes for the blind. For schools that failed to report, statistics for a…

  1. What is a good index? Problems with statistically based indicators and the Malmquist index as alternative

    USDA-ARS?s Scientific Manuscript database

    Conventional multivariate statistical methods have been used for decades to calculate environmental indicators. These methods generally work fine if they are used in a situation where the method can be tailored to the data. But there is some skepticism that the methods might fail in the context of s...

  2. Why Does a Method That Fails Continue To Be Used: The Answer

    PubMed Central

    Templeton, Alan R.

    2009-01-01

    It has been claimed that hundreds of researchers use nested clade phylogeographic analysis (NCPA) based on what the method promises rather than requiring objective validation of the method. The supposed failure of NCPA is based upon the argument that validating it by using positive controls ignored type I error, and that computer simulations have shown a high type I error. The first argument is factually incorrect: the previously published validation analysis fully accounted for both type I and type II errors. The simulations that indicate a 75% type I error rate have serious flaws and only evaluate outdated versions of NCPA. These outdated type I error rates fall precipitously when the 2003 version of single locus NCPA is used or when the 2002 multi-locus version of NCPA is used. It is shown that the treewise type I errors in single-locus NCPA can be corrected to the desired nominal level by a simple statistical procedure, and that multilocus NCPA reconstructs a simulated scenario used to discredit NCPA with 100% accuracy. Hence, NCPA is not a failed method at all, but rather has been validated both by actual data and by simulated data in a manner that satisfies the published criteria given by its critics. The critics have come to different conclusions because they have focused on the pre-2002 versions of NCPA and have failed to take into account the extensive developments in NCPA since 2002. Hence, researchers can choose to use NCPA based upon objective critical validation that shows that NCPA delivers what it promises. PMID:19335340

  3. Statistical Performance Evaluation Of Soft Seat Pressure Relief Valves

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, Stephen P.; Gross, Robert E.

    2013-03-26

    Risk-based inspection methods enable estimation of the probability of failure on demand for spring-operated pressure relief valves at the United States Department of Energy's Savannah River Site in Aiken, South Carolina. This paper presents a statistical performance evaluation of soft seat spring-operated pressure relief valves. These pressure relief valves are typically smaller and of lower cost than hard seat (metal-to-metal) pressure relief valves and can provide substantial cost savings in fluid service applications (air, gas, liquid, and steam), provided that the probability of failure on demand (the probability that the pressure relief valve fails to perform its intended safety function during a potentially dangerous overpressurization) is at least as good as that for hard seat valves. The research in this paper shows that the proportion of soft seat spring-operated pressure relief valves failing is the same as or less than that of hard seat valves, and that among failed valves, soft seat valves typically have ratios of proof test pressure to set pressure lower than those of hard seat valves.
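
    The comparison implied above is a test of two failure proportions. A sketch with invented counts (the site data are not reproduced in the abstract):

        # One-sided Fisher exact test: do soft seat valves fail proof tests
        # at a lower rate than hard seat valves? Counts are hypothetical.
        from scipy.stats import fisher_exact

        soft_failed, soft_total = 4, 120
        hard_failed, hard_total = 11, 200
        table = [[soft_failed, soft_total - soft_failed],
                 [hard_failed, hard_total - hard_failed]]
        odds_ratio, p = fisher_exact(table, alternative="less")
        print(odds_ratio, p)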

  4. Forecasts of non-Gaussian parameter spaces using Box-Cox transformations

    NASA Astrophysics Data System (ADS)

    Joachimi, B.; Taylor, A. N.

    2011-09-01

    Forecasts of statistical constraints on model parameters using the Fisher matrix abound in many fields of astrophysics. The Fisher matrix formalism involves the assumption of Gaussianity in parameter space and hence fails to predict complex features of posterior probability distributions. Combining the standard Fisher matrix with Box-Cox transformations, we propose a novel method that accurately predicts arbitrary posterior shapes. The Box-Cox transformations are applied to parameter space to render it approximately multivariate Gaussian, performing the Fisher matrix calculation on the transformed parameters. We demonstrate that, after the Box-Cox parameters have been determined from an initial likelihood evaluation, the method correctly predicts changes in the posterior when varying various parameters of the experimental setup and the data analysis, with marginally higher computational cost than a standard Fisher matrix calculation. We apply the Box-Cox-Fisher formalism to forecast cosmological parameter constraints by future weak gravitational lensing surveys. The characteristic non-linear degeneracy between matter density parameter and normalization of matter density fluctuations is reproduced for several cases, and the capabilities of breaking this degeneracy by weak-lensing three-point statistics is investigated. Possible applications of Box-Cox transformations of posterior distributions are discussed, including the prospects for performing statistical data analysis steps in the transformed Gaussianized parameter space.
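
    The Gaussianization step at the heart of the method can be illustrated in one dimension: transform a skewed sample so that a Gaussian (Fisher-like) summary becomes accurate in the transformed coordinates. The sketch below is only a toy stand-in for the paper's multivariate formalism; the lognormal "posterior" and the small shift to positive support are assumptions of the example.

        # Box-Cox Gaussianization of a skewed 1-D "posterior" sample.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(11)
        samples = rng.lognormal(mean=0.0, sigma=0.6, size=5000)

        a = 1e-3 - samples.min()            # Box-Cox needs positive support
        transformed, lam = stats.boxcox(samples + a)

        # A Gaussian (mean, variance) summary is adequate after transforming.
        print(f"lambda = {lam:.3f}")
        print(f"skewness before/after: {stats.skew(samples):.2f} / "
              f"{stats.skew(transformed):.2f}")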

  5. The definition of mental disorder: evolving but dysfunctional?

    PubMed

    Bingham, Rachel; Banner, Natalie

    2014-08-01

    Extensive and diverse conceptual work towards developing a definition of 'mental disorder' was motivated by the declassification of homosexuality from the Diagnostic and Statistical Manual in 1973. This highly politicised event was understood as a call for psychiatry to provide assurances against further misclassification on the basis of discrimination or socio-political deviance. Today, if a definition of mental disorder fails to exclude homosexuality, then it fails to provide this safeguard against potential abuses and therefore fails to do an important part of the work it was intended to do. We argue that fact-based definitions of mental disorder, relying on scientific theory, fail to offer a robust definition of mental disorder that excludes homosexuality. Definitions of mental disorder based on values do not fare better: these definitions are silent on questions about the diagnostic status of individuals in oppressive societies and over-inclusive of mental or behavioural states that happen to be negatively valued in the individual's social context. We consider the latest definition proposed for the Diagnostic and Statistical Manual-5 (DSM-5) in light of these observations. We argue that this definition fails to improve on these earlier deficiencies. Its inclusion in the manual may offer false reassurance against repetition of past misclassifications. We conclude with a provocation that if candidate definitions of mental disorder are unable to exclude homosexuality, it might perhaps be preferable not to attempt a definition at all. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  6. Prevalence of upper airway obstruction in patients with apparently asymptomatic euthyroid multinodular goitre

    PubMed Central

    Menon, Sunil K.; Jagtap, Varsha S.; Sarathi, Vijaya; Lila, Anurag R.; Bandgar, Tushar R.; Menon, Padmavathy S; Shah, Nalini S.

    2011-01-01

    Aims: To study the prevalence of upper airway obstruction (UAO) in “apparently asymptomatic” patients with euthyroid multinodular goitre (MNG) and to find correlations among clinical features, UAO on pulmonary function testing (PFT), and tracheal narrowing on computerised tomography (CT). Materials and Methods: Consecutive patients with apparently asymptomatic euthyroid MNG attending the thyroid clinic of a tertiary centre underwent clinical examination to elicit features of UAO, PFT, and CT of the neck and chest. Statistical Analysis Used: Statistical analysis was done with SPSS version 11.5 using the paired t-test, Chi square test, and Fisher's exact test. A P value of <0.05 was considered significant. Results: Fifty-six patients (52 females and four males) were studied. The prevalence of UAO (PFT) and of significant tracheal narrowing (CT) was 14.3% and 9.3%, respectively. Clinical features failed to predict UAO or significant tracheal narrowing. Tracheal narrowing (CT) did not correlate with UAO (PFT). Volume of goitre correlated significantly with degree of tracheal narrowing. Conclusions: Clinical features do not predict UAO on PFT or tracheal narrowing on CT in apparently asymptomatic patients with euthyroid MNG. PMID:21966649

  7. Boston type 1 keratoprosthesis for failed keratoplasty.

    PubMed

    Hager, Jonathan L; Phillips, David L; Goins, Kenneth M; Kitzmann, Anna S; Greiner, Mark A; Cohen, Alex W; Welder, Jeffrey D; Wagoner, Michael D

    2016-02-01

    The purpose of this study was to evaluate the outcomes of the Boston type 1 keratoprosthesis (Kpro-1) in eyes with failed keratoplasty. A retrospective review was performed of every patient treated with a Kpro-1 at a tertiary eye care center between January 1, 2008 and July 1, 2013. Eyes with a failed keratoplasty originally performed for corneal edema, trauma, or keratoconus were included in the statistical analysis. The main outcome measures were visual outcome, prosthesis retention, and postoperative complications. Twenty-four eyes met the inclusion criteria, including 13 eyes with corneal edema, 8 eyes with trauma, and 3 eyes with keratoconus. After a mean follow-up period of 28.9 months (range 7-63 months), the median best corrected visual acuity (BCVA) was 20/125. The BCVA was ≥ 20/40 in 4 (16.7 %) eyes, ≥ 20/70 in 9 (37.5 %) eyes, and ≥ 20/200 in 14 (58.3 %) eyes. Overall, the postoperative BCVA improved in 17 (70.9 %) eyes, was unchanged in 3 (12.5 %) eyes, and was worse in 4 (16.7 %) eyes. The initial Kpro-1 was retained in 22 (91.7 %) eyes, and was successfully repeated in the other 2 eyes. One or more serious prosthesis- or sight-threatening complications occurred in 8 (33.3 %) eyes. These included 1 case of wound dehiscence leading to prosthesis extrusion, 1 case of fungal keratitis leading to prosthesis extrusion, 4 cases of endophthalmitis, and 5 retinal detachments. The Boston Kpro-1 is associated with an excellent prognosis for prosthesis retention and satisfactory visual improvement in eyes with previous failed keratoplasty.

  8. Clinical trials, epidemiology, and public confidence.

    PubMed

    Seigel, Daniel

    2003-11-15

    Critics in the media have become wary of exaggerated research claims from clinical trials and epidemiological studies. Closer to home, reviews of published studies find a high frequency of poor quality in research methods, including those used for statistical analysis. The statistical literature has long recognized that questionable research findings can occur when investigators fail to set aside their own outcome preferences as they analyse and interpret data. These preferences can be related to financial interests, a concern for patients, peer recognition, and commitment to a hypothesis. Several analyses of published papers provide evidence of an association between financial conflicts of interest and reported results. If we are to regain professional and lay confidence in research findings some changes are required. Clinical journals need to develop more competence in the review of analytic methods and provide space for thorough discussion of published papers whose results are challenged. Graduate schools need to prepare students for the conflicting interests that surround the practice of statistics. Above all, each of us must recognize our responsibility to use analytic procedures that illuminate the research issues rather than those serving special interests. Copyright 2003 John Wiley & Sons, Ltd.

  9. Statistical mechanical analysis of linear programming relaxation for combinatorial optimization problems

    NASA Astrophysics Data System (ADS)

    Takabe, Satoshi; Hukushima, Koji

    2016-05-01

    Typical behavior of the linear programming (LP) problem is studied as a relaxation of the minimum vertex cover (min-VC), a type of integer programming (IP) problem. A lattice-gas model on the Erdös-Rényi random graphs of α-uniform hyperedges is proposed to express both the LP and IP problems of the min-VC in the common statistical mechanical model with a one-parameter family. Statistical mechanical analyses reveal for α=2 that the LP optimal solution is typically equal to that given by the IP below the critical average degree c=e in the thermodynamic limit. The critical threshold for good accuracy of the relaxation extends the mathematical result c=1 and coincides with the replica symmetry-breaking threshold of the IP. The LP relaxation for the minimum hitting sets with α≥3, minimum vertex covers on α-uniform random graphs, is also studied. Analytic and numerical results strongly suggest that the LP relaxation fails to estimate optimal values above the critical average degree c=e/(α-1) where the replica symmetry is broken.

  10. Statistical mechanical analysis of linear programming relaxation for combinatorial optimization problems.

    PubMed

    Takabe, Satoshi; Hukushima, Koji

    2016-05-01

    Typical behavior of the linear programming (LP) problem is studied as a relaxation of the minimum vertex cover (min-VC), a type of integer programming (IP) problem. A lattice-gas model on the Erdös-Rényi random graphs of α-uniform hyperedges is proposed to express both the LP and IP problems of the min-VC in the common statistical mechanical model with a one-parameter family. Statistical mechanical analyses reveal for α=2 that the LP optimal solution is typically equal to that given by the IP below the critical average degree c=e in the thermodynamic limit. The critical threshold for good accuracy of the relaxation extends the mathematical result c=1 and coincides with the replica symmetry-breaking threshold of the IP. The LP relaxation for the minimum hitting sets with α≥3, minimum vertex covers on α-uniform random graphs, is also studied. Analytic and numerical results strongly suggest that the LP relaxation fails to estimate optimal values above the critical average degree c=e/(α-1) where the replica symmetry is broken.
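
    Since both records above describe the same LP relaxation, a single sketch suffices: the relaxation of min-VC on a toy graph, solved with scipy. The graph and solver choice are illustrative assumptions, not the paper's lattice-gas analysis.

```python
# A minimal sketch (toy graph) of the LP relaxation of minimum vertex cover:
# minimise sum_i x_i subject to x_u + x_v >= 1 per edge, with 0 <= x_i <= 1.
# The IP restricts x_i to {0, 1}; the LP relaxes that constraint.
import numpy as np
from scipy.optimize import linprog

edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4)]   # small stand-in graph
n = 5

c = np.ones(n)
A_ub = np.zeros((len(edges), n))
for row, (u, v) in enumerate(edges):
    A_ub[row, u] = -1.0    # -x_u - x_v <= -1  is  x_u + x_v >= 1
    A_ub[row, v] = -1.0
b_ub = -np.ones(len(edges))

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 1)] * n)
print(res.x, res.fun)      # half-integral x_i = 1/2 values appear where the
                           # relaxation falls short of the integer optimum
```

    On this graph the LP optimum is 2.5 (with half-integral values on the triangle) while the integer optimum is 3, which is exactly the kind of relaxation gap the paper characterises.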

  11. Ensembles of radial basis function networks for spectroscopic detection of cervical precancer

    NASA Technical Reports Server (NTRS)

    Tumer, K.; Ramanujam, N.; Ghosh, J.; Richards-Kortum, R.

    1998-01-01

    The mortality related to cervical cancer can be substantially reduced through early detection and treatment. However, current detection techniques, such as Pap smear and colposcopy, fail to achieve a concurrently high sensitivity and specificity. In vivo fluorescence spectroscopy is a technique which quickly, noninvasively and quantitatively probes the biochemical and morphological changes that occur in precancerous tissue. A multivariate statistical algorithm was used to extract clinically useful information from tissue spectra acquired from 361 cervical sites from 95 patients at 337-, 380-, and 460-nm excitation wavelengths. The multivariate statistical analysis was also employed to reduce the number of fluorescence excitation-emission wavelength pairs required to discriminate healthy tissue samples from precancerous tissue samples. The use of connectionist methods such as multilayered perceptrons, radial basis function (RBF) networks, and ensembles of such networks was investigated. RBF ensemble algorithms based on fluorescence spectra potentially provide automated and near real-time implementation of precancer detection in the hands of nonexperts. The results are more reliable, direct, and accurate than those achieved by either human experts or multivariate statistical algorithms.
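
    The connectionist pipeline described here can be sketched compactly. The following is an illustrative RBF network (k-means centres, Gaussian hidden layer, linear read-out) on synthetic stand-ins for the spectral features, not the authors' implementation.

```python
# A minimal sketch (synthetic data) of a radial basis function network:
# k-means picks the basis centres, Gaussian activations form the hidden
# layer, and a linear classifier is fit on top.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                   # stand-in spectral features
y = (X[:, 0] + X[:, 1] ** 2 > 1).astype(int)    # stand-in tissue labels

centers = KMeans(n_clusters=10, n_init=10, random_state=0).fit(X).cluster_centers_
width = 1.0                                     # shared Gaussian width

def rbf_features(data):
    # Gaussian activation of each sample with respect to each centre.
    d2 = ((data[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * width ** 2))

clf = LogisticRegression(max_iter=1000).fit(rbf_features(X), y)
print("training accuracy:", clf.score(rbf_features(X), y))
# An ensemble, as in the paper, would average the outputs of several such
# networks trained from different initialisations or feature subsets.
```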

  12. Design optimization and probabilistic analysis of a hydrodynamic journal bearing

    NASA Technical Reports Server (NTRS)

    Liniecki, Alexander G.

    1990-01-01

    A nonlinear constrained optimization of a hydrodynamic bearing was performed, yielding three main variables: radial clearance, bearing length-to-diameter ratio, and lubricating oil viscosity. As an objective function, a combined model of temperature rise and oil supply was adopted. The optimized model of the bearing was simulated for a population of 1000 cases using the Monte Carlo statistical method. It appeared that the so-called 'optimal solution' generated more than 50 percent failed bearings, because their minimum oil film thickness violated the stipulated minimum constraint value. As a remedy, a change of oil viscosity is suggested, after the sensitivities of several variables were investigated.
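
    The Monte Carlo check described above can be sketched as follows; the design-variable means, tolerances, film-thickness model, and constraint value are all assumed for illustration, not taken from the paper.

```python
# A minimal sketch (assumed parameters) of a Monte Carlo feasibility check:
# perturb the optimised design variables and count how often the
# minimum-oil-film-thickness constraint is violated.
import numpy as np

rng = np.random.default_rng(1)
n = 1000                                     # population size used in the paper

# Hypothetical optimal means and tolerances for the three design variables.
clearance = rng.normal(50e-6, 5e-6, n)       # radial clearance [m]
ld_ratio  = rng.normal(0.8, 0.05, n)         # length-to-diameter ratio
viscosity = rng.normal(0.02, 0.002, n)       # oil viscosity [Pa*s]

# Stand-in film-thickness model (illustrative, not the paper's model).
h_min = 0.3 * clearance * ld_ratio * (viscosity / 0.02) ** 0.5
failed = h_min < 12e-6                       # assumed minimum-constraint value

print(f"failed bearings: {failed.mean():.1%}")
```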

  13. The classification of anxiety and hysterical states. Part I. Historical review and empirical delineation.

    PubMed

    Sheehan, D V; Sheehan, K H

    1982-08-01

    The history of the classification of anxiety, hysterical, and hypochondriacal disorders is reviewed. Problems in the ability of current classification schemes to predict, control, and describe the relationship between the symptoms and other phenomena are outlined. Existing classification schemes failed the first test of a good classification model--that of providing categories that are mutually exclusive. The independence of these diagnostic categories from each other does not appear to hold up on empirical testing. In the absence of inherently mutually exclusive categories, further empirical investigation of these classes is obstructed since statistically valid analysis of the nominal data and any useful multivariate analysis would be difficult if not impossible. It is concluded that the existing classifications are unsatisfactory and require some fundamental reconceptualization.

  14. Observations on Three Endpoint Properties and Their Relationship to Regulatory Outcomes of European Oncology Marketing Applications

    PubMed Central

    Stolk, Pieter; McAuslane, James Neil; Schellens, Jan; Breckenridge, Alasdair M.; Leufkens, Hubert

    2015-01-01

    Background. Guidance and exploratory evidence indicate that the type of endpoints and the magnitude of their outcome can define a therapy’s clinical activity; however, little empirical evidence relates specific endpoint properties with regulatory outcomes. Materials and Methods. We explored the relationship of 3 endpoint properties to regulatory outcomes by assessing 50 oncology marketing authorization applications (MAAs; reviewed from 2009 to 2013). Results. Overall, 16 (32%) had a negative outcome. The most commonly used hard endpoints were overall survival (OS) and the duration of response or stable disease. OS was a component of 91% approved and 63% failed MAAs. The most commonly used surrogate endpoints were progression-free survival (PFS), response rate, and health-related quality of life assessments. There was no difference (p = .3801) between the approved and failed MAA cohorts in the proportion of hard endpoints used. A mean of slightly more than four surrogate endpoints were used per approved MAA compared with slightly more than two for failed MAAs. Longer OS and PFS duration outcomes were generally associated with approvals, often when not statistically significant. The approved cohort was associated with a preponderance of statistically significant (p < .05) improvements in primary endpoints (p < .0001 difference between the approved and failed groups). Conclusion. Three key endpoint properties (type of endpoint [hard/surrogate], magnitude of an endpoint outcome, and its statistical significance) are consistent with the European Medicines Agency guidance and, notwithstanding the contribution of unique disease-specific circumstances, are associated with a predictable positive outcome for oncology MAAs. Implications for Practice: Regulatory decisions made by the European Medicines Agency determine which new medicines will be available to European prescribers and for which therapeutic indications. Regulatory success or failure can be influenced by many factors. This study assessed three key properties of endpoints used in preauthorization trials (type of endpoint [hard/surrogate], magnitude of endpoint outcome, and its statistical significance) and whether they are associated with a positive regulatory outcome. Clinicians can use these properties, which are described in the publicly available European public assessment reports, to help guide their understanding of the clinical effect of new oncologic therapies. PMID:25948678

  15. Observations on Three Endpoint Properties and Their Relationship to Regulatory Outcomes of European Oncology Marketing Applications.

    PubMed

    Liberti, Lawrence; Stolk, Pieter; McAuslane, James Neil; Schellens, Jan; Breckenridge, Alasdair M; Leufkens, Hubert

    2015-06-01

    Guidance and exploratory evidence indicate that the type of endpoints and the magnitude of their outcome can define a therapy's clinical activity; however, little empirical evidence relates specific endpoint properties with regulatory outcomes. We explored the relationship of 3 endpoint properties to regulatory outcomes by assessing 50 oncology marketing authorization applications (MAAs; reviewed from 2009 to 2013). Overall, 16 (32%) had a negative outcome. The most commonly used hard endpoints were overall survival (OS) and the duration of response or stable disease. OS was a component of 91% approved and 63% failed MAAs. The most commonly used surrogate endpoints were progression-free survival (PFS), response rate, and health-related quality of life assessments. There was no difference (p = .3801) between the approved and failed MAA cohorts in the proportion of hard endpoints used. A mean of slightly more than four surrogate endpoints were used per approved MAA compared with slightly more than two for failed MAAs. Longer OS and PFS duration outcomes were generally associated with approvals, often when not statistically significant. The approved cohort was associated with a preponderance of statistically significant (p < .05) improvements in primary endpoints (p < .0001 difference between the approved and failed groups). Three key endpoint properties (type of endpoint [hard/surrogate], magnitude of an endpoint outcome, and its statistical significance) are consistent with the European Medicines Agency guidance and, notwithstanding the contribution of unique disease-specific circumstances, are associated with a predictable positive outcome for oncology MAAs. Regulatory decisions made by the European Medicines Agency determine which new medicines will be available to European prescribers and for which therapeutic indications. Regulatory success or failure can be influenced by many factors. This study assessed three key properties of endpoints used in preauthorization trials (type of endpoint [hard/surrogate], magnitude of endpoint outcome, and its statistical significance) and whether they are associated with a positive regulatory outcome. Clinicians can use these properties, which are described in the publicly available European public assessment reports, to help guide their understanding of the clinical effect of new oncologic therapies. ©AlphaMed Press.
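
    The proportion comparisons reported in both copies of this record can be reproduced in outline with a contingency-table test; the counts below are reconstructed from the reported percentages (34 approved, 16 failed MAAs) and are illustrative only.

```python
# A minimal sketch (counts derived from the reported percentages, so
# approximate) of comparing endpoint use between approved and failed MAAs
# with Fisher's exact test.
from scipy.stats import fisher_exact

#            OS used   OS not used
# approved      31          3        (about 91% of 34)
# failed        10          6        (about 63% of 16)
table = [[31, 3],
         [10, 6]]
odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")
```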

  16. Validation of time to task performance assessment method in simulation: A comparative design study.

    PubMed

    Shinnick, Mary Ann; Woo, Mary A

    2018-05-01

    There is a lack of objective and valid measures for assessing nursing clinical competence, which could adversely impact patient safety. Therefore, we evaluated an objective assessment of clinical competence, Time to Task (ability to perform specific, critical nursing care activities within 5 min), and compared it to two subjective measures (the Lasater Clinical Judgement Rubric [LCJR] and a common "pass/fail" assessment). Using a prospective, "Known Groups" (Expert vs. Novice nurses) comparative design, Expert nurses (ICU nurses with >5 years of ICU experience) and Novice nurses (senior prelicensure nursing students) participated individually in a simulation of a patient in decompensated heart failure. Fourteen nursing instructors or preceptors, blinded to group assignment, reviewed 28 simulation videos (15 Expert and 13 Novice) and scored them using the LCJR and pass/fail assessments. Time to Task assessment was scored based on time thresholds for specific nursing actions prospectively set by an expert clinical panel. Statistical analysis consisted of the Medians Test and sensitivity and specificity analyses. The LCJR total score was significantly different between Experts and Novices (p < 0.01) and revealed adequate sensitivity (ability to correctly identify "Expert" nurses; 0.72) but low specificity (ability to correctly identify "Novice" nurses; 0.40). For the subjective measure 'pass/fail', sensitivity was high (0.90) but specificity was low (0.47). The Time to Task measure showed statistical significance between Expert and Novice groups (p < 0.01), and its sensitivity (0.80) and specificity (0.85) were good. Commonly used subjective measures of clinical nursing competence have difficulty achieving acceptable specificity. However, an objective measure, Time to Task, had good sensitivity and specificity in differentiating between groups. While more than one assessment instrument should be used to determine nurse competency, an objective measure, such as Time to Task, warrants further study. Copyright © 2018 Elsevier Ltd. All rights reserved.
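
    The sensitivity/specificity arithmetic used here is simple to make explicit; the labels below are hypothetical ratings, not the study's data.

```python
# A minimal sketch (hypothetical labels) of the sensitivity/specificity
# computation used to compare the assessment methods above.
import numpy as np

# 1 = Expert, 0 = Novice; truth from group assignment, pred from an assessor.
truth = np.array([1] * 15 + [0] * 13)
pred  = np.array([1] * 12 + [0] * 3 + [1] * 8 + [0] * 5)  # hypothetical ratings

tp = ((pred == 1) & (truth == 1)).sum()
fn = ((pred == 0) & (truth == 1)).sum()
tn = ((pred == 0) & (truth == 0)).sum()
fp = ((pred == 1) & (truth == 0)).sum()

print("sensitivity:", tp / (tp + fn))   # correctly identified Experts
print("specificity:", tn / (tn + fp))   # correctly identified Novices
```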

  17. Topological signatures of interstellar magnetic fields - I. Betti numbers and persistence diagrams

    NASA Astrophysics Data System (ADS)

    Makarenko, Irina; Shukurov, Anvar; Henderson, Robin; Rodrigues, Luiz F. S.; Bushby, Paul; Fletcher, Andrew

    2018-04-01

    The interstellar medium (ISM) is a magnetized system in which transonic or supersonic turbulence is driven by supernova explosions. This leads to the production of intermittent, filamentary structures in the ISM gas density, whilst the associated dynamo action also produces intermittent magnetic fields. The traditional theory of random functions, restricted to second-order statistical moments (or power spectra), does not adequately describe such systems. We apply topological data analysis (TDA), sensitive to all statistical moments and independent of the assumption of Gaussian statistics, to the gas density fluctuations in a magnetohydrodynamic simulation of the multiphase ISM. This simulation admits dynamo action, so produces physically realistic magnetic fields. The topology of the gas distribution, with and without magnetic fields, is quantified in terms of Betti numbers and persistence diagrams. Like the more standard correlation analysis, TDA shows that the ISM gas density is sensitive to the presence of magnetic fields. However, TDA gives us important additional information that cannot be obtained from correlation functions. In particular, the Betti numbers per correlation cell are shown to be physically informative. Magnetic fields make the ISM more homogeneous, reducing the abundance of both isolated gas clouds and cavities, with a stronger effect on the cavities. Remarkably, the modification of the gas distribution by magnetic fields is captured by the Betti numbers even in regions more than 300 pc from the mid-plane, where the magnetic field is weaker and correlation analysis fails to detect any signatures of magnetic effects.
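
    A flavour of the counting behind Betti numbers can be given with a thresholded synthetic field; the sketch below is a deliberately simplified stand-in (fixed thresholds, 2-D field), not the paper's persistence analysis.

```python
# A minimal sketch (synthetic field) of counting Betti-0-type features in a
# density field: connected components of over-dense regions ("clouds") and
# of under-dense regions ("cavities"), via scipy.ndimage.label.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)
density = ndimage.gaussian_filter(rng.normal(size=(64, 64)), sigma=3)

clouds, n_clouds = ndimage.label(density > 0.5 * density.std())
cavities, n_cavities = ndimage.label(density < -0.5 * density.std())
print("isolated clouds:", n_clouds, "| cavities:", n_cavities)
# Persistence diagrams track how these counts change as the threshold
# sweeps through all density levels, which is what TDA formalises.
```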

  18. Association of bladder sensation measures and bladder diary in patients with urinary incontinence.

    PubMed

    King, Ashley B; Wolters, Jeff P; Klausner, Adam P; Rapp, David E

    2012-04-01

    Investigation suggests the involvement of afferent actions in the pathophysiology of urinary incontinence. Current diagnostic modalities do not allow for the accurate identification of sensory dysfunction. We previously reported urodynamic derivatives that may be useful in assessing bladder sensation. We sought to further investigate these derivatives by assessing for a relationship with a 3-day bladder diary. Subset analysis was performed in patients without stress urinary incontinence (SUI), attempting to isolate patients with urgency symptoms. No association was demonstrated between bladder diary parameters and urodynamic derivatives (r coefficient range −0.06 to 0.08; p > 0.05). However, subset analysis demonstrated an association between detrusor overactivity (DO) and bladder urgency velocity (BUV), with a lower BUV identified in patients without DO. Subset analysis of patients with isolated urgency/urge incontinence identified weak associations between voiding frequency and FSR (r = 0.39) and between daily incontinence episodes and BUV (r = 0.35). However, these associations failed to demonstrate statistical significance. No statistical association was seen between bladder diary and urodynamic derivatives. This is not unexpected, given that bladder diary parameters may reflect numerous pathologies including not only sensory dysfunction but also SUI and DO. However, weak associations were identified in patients without SUI and, further, a statistical relationship between DO and BUV was seen. Additional research is needed to assess the utility of FSR/BUV in characterizing sensory dysfunction, especially in patients without concurrent pathology (e.g. SUI, DO).
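
    The correlation checks reported here amount to computing r and its p-value for pairs of variables; a minimal scipy sketch with synthetic values and hypothetical variable names follows.

```python
# A minimal sketch (synthetic values, hypothetical variable names) of the
# correlation analysis described above: Spearman's r between a bladder-diary
# parameter and a urodynamic derivative, with its p-value.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(7)
daily_voids = rng.integers(4, 15, size=30)       # diary parameter
buv = rng.normal(1.0, 0.3, size=30)              # urodynamic derivative

r, p = spearmanr(daily_voids, buv)
print(f"r = {r:.2f}, p = {p:.3f}")   # a weak r with p > 0.05 fails significance
```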

  19. Vitamin D and depression: a systematic review and meta-analysis comparing studies with and without biological flaws.

    PubMed

    Spedding, Simon

    2014-04-11

    Efficacy of Vitamin D supplements in depression is controversial, awaiting further literature analysis. Biological flaws in primary studies are a possible reason meta-analyses of Vitamin D have failed to demonstrate efficacy. This systematic review and meta-analysis of Vitamin D and depression compared studies with and without biological flaws. The systematic review followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. The literature search was undertaken through four databases for randomized controlled trials (RCTs). Studies were critically appraised for methodological quality and biological flaws in relation to the hypothesis and study design. Meta-analyses were performed for studies according to the presence of biological flaws. The 15 RCTs identified provide a more comprehensive evidence base than previous systematic reviews; methodological quality of the studies was generally good and methodology was diverse. A meta-analysis of all studies without flaws demonstrated a statistically significant improvement in depression with Vitamin D supplements (+0.78; CI +0.24 to +1.27). Studies with biological flaws were mainly inconclusive, with the meta-analysis demonstrating a statistically significant worsening in depression with Vitamin D supplements (-1.1; CI -0.7 to -1.5). Vitamin D supplementation (≥800 I.U. daily) was somewhat favorable in the management of depression in studies that demonstrated a change in vitamin levels, and the effect size was comparable to that of anti-depressant medication.
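
    The pooling step behind such a meta-analysis is short to write out; the sketch below shows fixed-effect inverse-variance weighting on made-up per-trial effects, not the review's extracted data.

```python
# A minimal sketch (made-up effect sizes) of the inverse-variance pooling
# behind a meta-analysis like the one above: each trial's effect is weighted
# by the reciprocal of its variance to give the pooled estimate and CI.
import numpy as np

effects = np.array([0.9, 0.4, 1.2, 0.6])     # hypothetical per-trial effects
se = np.array([0.35, 0.25, 0.50, 0.30])      # their standard errors

w = 1.0 / se ** 2
pooled = (w * effects).sum() / w.sum()
pooled_se = np.sqrt(1.0 / w.sum())
lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled effect = {pooled:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```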

  20. Randomized clinical study comparing metallic and glass fiber post in restoration of endodontically treated teeth.

    PubMed

    Gbadebo, Olaide S; Ajayi, Deborah M; Oyekunle, Oyekunle O Dosumu; Shaba, Peter O

    2014-01-01

    Post-retained crowns are indicated for endodontically treated teeth (ETT) with severely damaged coronal tissue. Metallic custom and prefabricated posts have been used over the years; however, owing to their unacceptable color, extreme rigidity and corrosion, fiber posts, which are flexible, aesthetically pleasing and have a modulus of elasticity comparable to that of dentin, were introduced. To compare the clinical performance of metallic and glass fiber posts in the restoration of ETT, 40 ETT requiring post-retained restorations were included. These teeth were randomly allocated into two groups: 20 teeth were restored using a glass fiber-reinforced post (FRP) and 20 others received a stainless steel ParaPost (PP), each in combination with composite core buildups. Patients were observed at 1 and 6 months after post placement and cementation of a porcelain-fused-to-metal (PFM) crown. Marginal gap consideration, post retention, post fracture, root fracture, crown fracture, crown decementation and loss of restoration were part of the data recorded. All teeth were assessed clinically and radiographically. Fisher's exact test was used for categorical values, while the log-rank test was used for survival analysis. One tooth in the PP group failed, secondary to decementation of the PFM crown, giving a 2.5% overall failure rate, while none in the FRP group failed. The survival rate of FRP was thus 100%, while it was 97.5% in the PP group. This difference, however, was not statistically significant (log-rank test, P = 0.32). Glass FRPs performed better than the metallic posts based on short-term clinical performance.
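
    A log-rank comparison of this shape can be sketched in Python, assuming the lifelines package is available; the follow-up times and event flags below are hypothetical, constructed only to mirror the one-failure outcome reported.

```python
# A minimal sketch (hypothetical follow-up times) of the log-rank comparison
# reported above, assuming the lifelines package.
from lifelines.statistics import logrank_test

# Months to failure (or censoring) per post type; 1 = failed, 0 = intact.
frp_months = [6] * 10
frp_events = [0] * 10                       # no FRP failures observed
pp_months  = [6] * 9 + [3]
pp_events  = [0] * 9 + [1]                  # one PP crown decemented

result = logrank_test(frp_months, pp_months,
                      event_observed_A=frp_events,
                      event_observed_B=pp_events)
print(f"log-rank p = {result.p_value:.2f}")  # non-significant, as in the study
```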

  1. Models for inference in dynamic metacommunity systems

    USGS Publications Warehouse

    Dorazio, R.M.; Kery, M.; Royle, J. Andrew; Plattner, M.

    2010-01-01

    A variety of processes are thought to be involved in the formation and dynamics of species assemblages. For example, various metacommunity theories are based on differences in the relative contributions of dispersal of species among local communities and interactions of species within local communities. Interestingly, metacommunity theories continue to be advanced without much empirical validation. Part of the problem is that statistical models used to analyze typical survey data either fail to specify ecological processes with sufficient complexity or they fail to account for errors in detection of species during sampling. In this paper, we describe a statistical modeling framework for the analysis of metacommunity dynamics that is based on the idea of adopting a unified approach, multispecies occupancy modeling, for computing inferences about individual species, local communities of species, or the entire metacommunity of species. This approach accounts for errors in detection of species during sampling and also allows different metacommunity paradigms to be specified in terms of species- and location-specific probabilities of occurrence, extinction, and colonization, all of which are estimable. In addition, this approach can be used to address inference problems that arise in conservation ecology, such as predicting temporal and spatial changes in biodiversity for use in making conservation decisions. To illustrate, we estimate changes in species composition associated with the species-specific phenologies of flight patterns of butterflies in Switzerland for the purpose of estimating regional differences in biodiversity. © 2010 by the Ecological Society of America.
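
    The core of occupancy modeling is a mixture likelihood that separates occupancy from detection. A single-species, constant-probability sketch with hypothetical detection histories is below; the paper's multispecies dynamic model layers species-specific and colonization/extinction structure on top of this same idea.

```python
# A minimal sketch (hypothetical detection counts) of the single-season
# occupancy likelihood: a site is occupied with probability psi, and an
# occupied site yields a detection on each of K surveys with probability p.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import binom

K = 4
d = np.array([0, 0, 1, 3, 0, 2, 0, 4, 1, 0])   # detections per site (of K)

def nll(params):
    psi, p = 1 / (1 + np.exp(-params))         # logit scale -> (0, 1)
    # Sites with zero detections may be unoccupied or occupied-but-missed.
    like = psi * binom.pmf(d, K, p) + (1 - psi) * (d == 0)
    return -np.log(like).sum()

fit = minimize(nll, x0=[0.0, 0.0])
psi_hat, p_hat = 1 / (1 + np.exp(-fit.x))
print(f"occupancy psi = {psi_hat:.2f}, detection p = {p_hat:.2f}")
```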

  2. Analysis of longitudinal data from animals where some data are missing in SPSS

    PubMed Central

    Duricki, DA; Soleman, S; Moon, LDF

    2017-01-01

    Testing of therapies for disease or injury often involves analysis of longitudinal data from animals. Modern analytical methods have advantages over conventional methods (particularly where some data are missing), yet are not used widely by pre-clinical researchers. We provide here an easy-to-use protocol for analysing longitudinal data from animals and present a click-by-click guide for performing suitable analyses using the statistical package SPSS. We guide readers through analysis of a real-life data set obtained when testing a therapy for brain injury (stroke) in elderly rats. We show that repeated measures analysis of covariance failed to detect a treatment effect when a few data points were missing (due to animal drop-out), whereas analysis using an alternative method detected a beneficial effect of treatment; specifically, we demonstrate the superiority of linear models (with various covariance structures) analysed using Restricted Maximum Likelihood estimation (to include all available data). This protocol takes two hours to follow. PMID:27196723
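
    Outside SPSS, the same REML-based linear mixed model can be fit in Python; the sketch below uses statsmodels on synthetic animal data with deliberately deleted observations to mimic drop-out.

```python
# A minimal sketch (synthetic data) of the recommended approach: a linear
# mixed model fit by REML, which uses all available observations even when
# some animals drop out, unlike repeated-measures ANCOVA.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
animals, weeks = 12, 6
df = pd.DataFrame({
    "animal": np.repeat(np.arange(animals), weeks),
    "week": np.tile(np.arange(weeks), animals),
    "treated": np.repeat(rng.integers(0, 2, animals), weeks),
})
df["score"] = (5 + 0.8 * df.week * df.treated
               + rng.normal(0, 1, len(df))
               + np.repeat(rng.normal(0, 1, animals), weeks))  # animal effect
df = df.drop(df.sample(6, random_state=0).index)   # simulate missing data

model = smf.mixedlm("score ~ week * treated", df, groups=df["animal"])
print(model.fit(reml=True).summary())
```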

  3. SU-F-T-227: A Comprehensive Patient Specific, Structure Specific, Pre-Treatment 3D QA Protocol for IMRT, SBRT and VMAT - Clinical Experience

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gueorguiev, G; Cotter, C; Young, M

    2016-06-15

    Purpose: To present a 3D QA method and clinical results for 550 patients. Methods: Five hundred and fifty patient treatment deliveries (400 IMRT, 75 SBRT and 75 VMAT) from various treatment sites, planned on the Raystation treatment planning system (TPS), were measured on three beam-matched Elekta linear accelerators using IBA’s COMPASS system. The difference between TPS-computed and delivered dose was evaluated in 3D by applying three statistical parameters to each structure of interest: absolute average dose difference (AADD, 6% allowed difference), absolute dose difference greater than 6% (ADD6, 4% structure volume allowed to fail), and 3D gamma test (3%/3 mm DTA, 4% structure volume allowed to fail). If the allowed value was not met for a given structure, manual review was performed. The review consisted of overlaying dose difference or gamma results with the patient CT and scrolling through the slices. For QA to pass, areas of high dose difference or gamma must be small and not on consecutive slices. For AADD to manually pass QA, the average dose difference in cGy must be less than 50 cGy. The QA protocol also includes DVH analysis based on QUANTEC and TG-101 recommended dose constraints. Results: Figures 1–3 show the results for the three parameters per treatment modality. Manual review was performed on 67 deliveries (27 IMRT, 22 SBRT and 18 VMAT), all of which passed QA. Results show that the statistical parameter AADD may be overly sensitive for structures receiving low dose, especially for the SBRT deliveries (Fig. 1). The TPS-computed and measured DVH values were in excellent agreement, with minimal differences. Conclusion: Applying DVH analysis and different statistical parameters to any structure of interest, as part of the 3D QA protocol, provides a comprehensive treatment plan evaluation. Author G. Gueorguiev discloses receiving travel and research funding from IBA for work unrelated to this project. Author B. Crawford discloses receiving travel funding from IBA for work unrelated to this project.
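
    The gamma test named here combines a dose-difference criterion with a distance-to-agreement criterion. A simplified 1-D, globally normalised version is sketched below; the clinical test is 3-D and evaluated per structure.

```python
# A minimal sketch (1-D, global normalisation) of gamma analysis: a measured
# point passes if some reference point lies within the combined
# dose-difference / distance-to-agreement ellipsoid (gamma <= 1).
import numpy as np

def gamma_pass_rate(dose_ref, dose_meas, spacing_mm, dd=0.03, dta_mm=3.0):
    x = np.arange(len(dose_ref)) * spacing_mm
    norm = dose_ref.max()                     # global dose normalisation
    # Gamma for each measured point: minimum over all reference points.
    dose_term = (dose_meas[:, None] - dose_ref[None, :]) / (dd * norm)
    dist_term = (x[:, None] - x[None, :]) / dta_mm
    gamma = np.sqrt(dose_term ** 2 + dist_term ** 2).min(axis=1)
    return (gamma <= 1).mean()

ref = np.exp(-((np.arange(100) - 50) / 15.0) ** 2)   # stand-in dose profile
meas = ref * 1.02                                    # 2% systematic dose error
print(f"pass rate: {gamma_pass_rate(ref, meas, spacing_mm=1.0):.1%}")
```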

  4. Rotary Ni-Ti profile systems for preparing curved canals in resin blocks: influence of operator on instrument breakage.

    PubMed

    Mandel, E; Adib-Yazdi, M; Benhamou, L M; Lachkar, T; Mesgouez, C; Sobel, M

    1999-11-01

    The aim of this study was to determine the incidence of fracture of ProFile 0.4 and 0.6 taper Series 29 nickel-titanium instruments with respect to operator experience. A total of 125 simulated root canals in resin blocks with the same geometrical shape in terms of angle and radius of curvature and coronal and apical orifice diameter were used. Five operators prepared all the specimens using an identical step-down instrument sequence, each one preparing 25 canals. The operators included two endodontists and three general practitioners. Statistical data concerning the incidence of instrument failure was compiled using Statlab and Fisher's partial least square difference analysis of variance. A total of 21 (16.8%) instruments fractured, all had 0.04 tapers. Nine size 25 instruments failed, 9 size 20 instruments failed and 3 size 15. During the study, the Binary Tree analysis of instrument failure revealed two operator populations belonging to two different study periods. The first period, which represented the first 13 root canal preparations, was called the 'learning period', and the second period, which represented the next 12 sample preparations, was called the 'application period'. A greater number of instruments failed during the first period than during the second. In the 'learning period', both groups of operators learned the same way. In the 'application period', two groups could be distinguished; the first group consisted of a general practitioner who produced worse results, and the second group consisted of the other four operators. The results indicate the necessity of mastering this rotary canal preparation technique, and the importance of improving competence through learning and experience.

  5. In vivo analysis of conjunctiva in gold micro shunt implantation for glaucoma.

    PubMed

    Mastropasqua, Leonardo; Agnifili, Luca; Ciancaglini, Marco; Nubile, Mario; Carpineto, Paolo; Fasanella, Vincenzo; Figus, Michele; Lazzeri, Stefano; Nardi, Marco

    2010-12-01

    To describe the conjunctival epithelial features seen with in vivo confocal microscopy (IVCM) after gold micro shunt (GMS) implantation in the suprachoroidal space, in patients with uncontrolled glaucoma. This was an observational case series study. Fourteen eyes of 14 consecutive glaucomatous patients with a history of multiple failed incisional surgeries followed by GMS implantation were evaluated with a digital confocal laser-scanning microscope (HRT II Rostock Cornea Module). Patients were divided into two groups: successful implantations (Group 1: eight patients, eight eyes), defined as a one-third reduction in preoperative intraocular pressure (IOP) with or without antiglaucoma medications, and failed implantations (Group 2: six patients, six eyes), defined as a less than one-third reduction in preoperative IOP with maximal tolerated medical therapy. The examination was performed from 3 to 20 months (mean 15.4±5.4) postoperatively. Conjunctival mean microcyst density (MMD: cysts/mm²) and mean microcyst area (MMA: μm²) were the main outcome measurements. The mean postoperative IOP was statistically different between the two groups (p<0.05), with values of 14.3±2.77 and 32.3±8.01 mm Hg in Groups 1 and 2, respectively. When comparing successful with failed implantation, the IVCM analysis showed a greater MMD (p<0.01) and MMA (p<0.01). Clinical evidence of a filtering bleb was not found in any of the patients. Successful GMS implantation significantly increased conjunctival microcyst density and surface area at the site of the device insertion. These findings suggest that the enhancement of aqueous filtration across the sclera may be one of the possible outflow pathways exploited by the shunt.

  6. Sleep quality, internet addiction and depressive symptoms among undergraduate students in Nepal.

    PubMed

    Bhandari, Parash Mani; Neupane, Dipika; Rijal, Shristi; Thapa, Kiran; Mishra, Shiva Raj; Poudyal, Amod Kumar

    2017-03-21

    Evidence on the burden of depression, internet addiction and poor sleep quality in undergraduate students from Nepal is virtually non-existent. While the interaction between sleep quality, internet addiction and depressive symptoms is frequently assessed in studies, it is not well explored whether sleep quality or internet addiction statistically mediates the association between the other two variables. We enrolled 984 students from 27 undergraduate campuses of Chitwan and Kathmandu, Nepal. We assessed sleep quality, internet addiction and depressive symptoms in these students using the Pittsburgh Sleep Quality Index, Young's Internet Addiction Test and the Patient Health Questionnaire-9, respectively. We included responses from 937 students in the data analysis after removing questionnaires with five percent or more fields missing. Via a bootstrap approach, we assessed the mediating role of internet addiction in the association between sleep quality and depressive symptoms, and that of sleep quality in the association between internet addiction and depressive symptoms. Overall, 35.4%, 35.4% and 21.2% of students scored above validated cutoff scores for poor sleep quality, internet addiction and depression, respectively. Poorer sleep quality was associated with lower age, not being an alcohol user, being a Hindu, being sexually active and having failed the previous year's board examination. Higher internet addiction was associated with lower age, being sexually inactive and having failed the previous year's board examination. Depressive symptoms were higher for students having higher age, being sexually inactive, having failed the previous year's board examination and having fewer years of study. Internet addiction statistically mediated 16.5% of the indirect effect of sleep quality on depressive symptoms. Sleep quality, on the other hand, statistically mediated 30.9% of the indirect effect of internet addiction on depressive symptoms. In the current study, a great proportion of students met criteria for poor sleep quality, internet addiction and depression. Internet addiction and sleep quality both mediated a significant proportion of the indirect effect on depressive symptoms. However, the cross-sectional nature of this study limits causal interpretation of the findings. Future longitudinal studies, where the measurement of internet addiction or sleep quality precedes that of depressive symptoms, are necessary to build upon our understanding of the development of depressive symptoms in students.

  7. The journals are full of great studies but can we believe the statistics? Revisiting the mass privatisation - mortality debate.

    PubMed

    Gerry, Christopher J

    2012-07-01

    Cross-national statistical analyses based on country-level panel data are increasingly popular in social epidemiology. To provide reliable results on the societal determinants of health, analysts must give very careful consideration to conceptual and methodological issues: aggregate (historical) data are typically compatible with multiple alternative stories of the data-generating process. Studies in this field which fail to relate their empirical approach to the true underlying data-generating process are likely to produce misleading results if, for example, they misspecify their models by failing to explore the statistical properties of the longitudinal aspect of their data or by ignoring endogeneity issues. We illustrate the importance of this extra need for care with reference to a recent debate on whether rapid mass privatisation can explain post-communist mortality fluctuations. We demonstrate that the finding that rapid mass privatisation was a "crucial determinant" of male mortality fluctuations in the post-communist world is rejected once better consideration is given to the way in which the data are generated. Copyright © 2012 Elsevier Ltd. All rights reserved.

  8. WE-H-BRC-08: Examining Credentialing Criteria and Poor Performance Indicators for IROC Houston’s Anthropomorphic Head and Neck Phantom

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carson, M; Molineu, A; Taylor, P

    Purpose: To analyze the most recent results of IROC Houston’s anthropomorphic H&N phantom to determine the nature of failing irradiations and the feasibility of altering pass/fail credentialing criteria. Methods: IROC Houston’s H&N phantom, used for IMRT credentialing for NCI-sponsored clinical trials, requires that an institution’s treatment plan agree with measurement within 7% (TLD doses) and that ≥85% of pixels pass 7%/4 mm gamma analysis. 156 phantom irradiations (November 2014 – October 2015) were re-evaluated using tighter criteria: 1) 5% TLD and 5%/4 mm, 2) 5% TLD and 5%/3 mm, 3) 4% TLD and 4%/4 mm, and 4) 3% TLD and 3%/3 mm. Failure/poor performance rates were evaluated with respect to individual film and TLD performance by location in the phantom. Overall poor phantom results were characterized qualitatively as systematic (dosimetric) errors, setup errors/positional shifts, global but non-systematic errors, and errors affecting only a local region. Results: The pass rate for these phantoms using current criteria is 90%. Substituting criteria 1-4 reduces the overall pass rate to 77%, 70%, 63%, and 37%, respectively. Statistical analyses indicated the probability of noise-induced TLD failure at the 5% criterion was <0.5%. Using criteria 1, TLD results were most often the cause of failure (86% failed TLD while 61% failed film), with most failures identified in the primary PTV (77% of cases). Other criteria posed similar results. Irradiations that failed from film only were overwhelmingly associated with phantom shifts/setup errors (≥80% of cases). Results failing criteria 1 were primarily diagnosed as systematic: 58% of cases. 11% were setup/positioning errors, 8% were global non-systematic errors, and 22% were local errors. Conclusion: This study demonstrates that 5% TLD and 5%/4 mm gamma criteria may be both practically and theoretically achievable. Further work is necessary to diagnose and resolve dosimetric inaccuracy in these trials, particularly for systematic dose errors. This work is funded by NCI Grant CA180803.

  9. A Comprehensive Reliability Methodology for Assessing Risk of Reusing Failed Hardware Without Corrective Actions with and Without Redundancy

    NASA Technical Reports Server (NTRS)

    Putcha, Chandra S.; Mikula, D. F. Kip; Dueease, Robert A.; Dang, Lan; Peercy, Robert L.

    1997-01-01

    This paper deals with the development of a reliability methodology to assess the consequences of using hardware, without failure analysis or corrective action, that has previously demonstrated that it did not perform per specification. The subject of this paper arose from the need to provide a detailed probabilistic analysis to calculate the change in probability of failure with respect to the base, or non-failed, hardware. The methodology used for the analysis is primarily based on principles of Monte Carlo simulation. The random variables in the analysis are the Maximum Time of Operation (MTO) and the Operation Time of each Unit (OTU). The failure of a unit is considered to happen if its OTU is less than the MTO for the Normal Operational Period (NOP) in which the unit is used. The NOP as a whole uses a total of 4 units. Two cases are considered. In the first specialized scenario, the failure of any operation, or system failure, is considered to happen if any of the units used during the NOP fails. In the second specialized scenario, system failure is considered to happen only if any two of the units used during the NOP fail together. The probability of failure of the units and of the system as a whole is determined for 3 kinds of systems: a Perfect System, Imperfect System 1, and Imperfect System 2. In a Perfect System, the operation time of the failed unit is the same as the MTO. In Imperfect System 1, the operation time of the failed unit is assumed to be 1 percent of the MTO. In Imperfect System 2, the operation time of the failed unit is assumed to be zero. In addition, the simulated operation time of failed units is assumed to be 10 percent of that of the corresponding units before the zero value. Monte Carlo simulation analysis is used for this study. The necessary software was developed as part of this study to perform the reliability calculations. The results of the analysis showed that the predicted change in failure probability (P_F) for the previously failed units is as high as 49 percent above the baseline (perfect system) for the worst case. The predicted change in system P_F for the previously failed units is as high as 36% for single unit failure without any redundancy. For redundant systems, with dual unit failure, the predicted change in P_F for the previously failed units is as high as 16%. These results will help management to make decisions regarding the consequences of using previously failed units without adequate failure analysis or corrective action.

  10. Exact goodness-of-fit tests for Markov chains.

    PubMed

    Besag, J; Mondal, D

    2013-06-01

    Goodness-of-fit tests are useful in assessing whether a statistical model is consistent with available data. However, the usual χ² asymptotics often fail, either because of the paucity of the data or because a nonstandard test statistic is of interest. In this article, we describe exact goodness-of-fit tests for first- and higher order Markov chains, with particular attention given to time-reversible ones. The tests are obtained by conditioning on the sufficient statistics for the transition probabilities and are implemented by simple Monte Carlo sampling or by Markov chain Monte Carlo. They apply both to single and to multiple sequences and allow a free choice of test statistic. Three examples are given. The first concerns multiple sequences of dry and wet January days for the years 1948-1983 at Snoqualmie Falls, Washington State, and suggests that standard analysis may be misleading. The second one is for a four-state DNA sequence and lends support to the original conclusion that a second-order Markov chain provides an adequate fit to the data. The last one is six-state atomistic data arising in molecular conformational dynamics simulation of solvated alanine dipeptide and points to strong evidence against a first-order reversible Markov chain at 6 picosecond time steps. © 2013, The International Biometric Society.
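
    A simplified version of this idea is easy to sketch. The code below runs a parametric Monte Carlo goodness-of-fit test for a first-order binary chain with a freely chosen test statistic; the paper's exact tests instead condition on the sufficient statistics, which this sketch does not attempt.

```python
# A minimal sketch (simplified: parametric Monte Carlo rather than the exact
# conditional test of the paper) of goodness-of-fit for a first-order binary
# Markov chain, with a test statistic sensitive to second-order structure.
import numpy as np

rng = np.random.default_rng(4)
observed = rng.integers(0, 2, size=365)       # stand-in dry/wet day sequence

def stat(seq):
    # Number of identical consecutive triples (a free choice of statistic).
    return int(((seq[:-2] == seq[1:-1]) & (seq[1:-1] == seq[2:])).sum())

# Fit first-order transition probabilities from the observed sequence.
counts = np.zeros((2, 2))
for a, b in zip(observed[:-1], observed[1:]):
    counts[a, b] += 1
P = counts / counts.sum(axis=1, keepdims=True)

sims = []
for _ in range(999):
    s = np.empty(observed.size, dtype=int)
    s[0] = observed[0]
    for t in range(1, s.size):
        s[t] = rng.choice(2, p=P[s[t - 1]])   # simulate under the fitted chain
    sims.append(stat(s))

obs = stat(observed)
p_value = (1 + sum(v >= obs for v in sims)) / (1 + len(sims))
print("Monte Carlo p-value:", round(p_value, 3))
```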

  11. SU-E-J-261: Statistical Analysis and Chaotic Dynamics of Respiratory Signal of Patients in BodyFix

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michalski, D; Huq, M; Bednarz, G

    Purpose: To quantify the respiratory signal of patients in BodyFix undergoing 4DCT scans with and without the immobilization cover. Methods: 20 pairs of respiratory tracks recorded with the RPM system during 4DCT scans were analyzed. Descriptive statistics were applied to selected parameters of the exhale-inhale decomposition. Standardized signals were used with the delay method to build orbits in embedded space. Nonlinear behavior was tested with surrogate data. Sample entropy (SE), Lempel-Ziv complexity (LZC) and the largest Lyapunov exponents (LLE) were compared. Results: Statistical tests show a difference between scans for inspiration time and its variability, which is greater for scans without the cover. The same holds for the variability of the end of exhalation and inhalation. Other parameters fail to show a difference. For both scans, respiratory signals show determinism and nonlinear stationarity. Statistical tests on surrogate data reveal their nonlinearity. LLEs show the signals' chaotic nature and its correlation with the breathing period and its embedding delay time. SE, LZC and LLE measure respiratory signal complexity. Nonlinear characteristics do not differ between scans. Conclusion: Contrary to expectation, the cover applied to patients in BodyFix appears to have a limited effect on signal parameters. Analysis based on trajectories of delay vectors shows the respiratory system's nonlinear character and its sensitive dependence on initial conditions. Reproducibility of the respiratory signal can be evaluated with measures of signal complexity and its predictability window. A longer respiratory period is conducive to signal reproducibility, as shown by these gauges. Statistical independence of the exhale and inhale times is also supported by the magnitude of the LLE. Nonlinear parameters seem more appropriate for gauging respiratory signal complexity, given its deterministic chaotic nature; this contrasts with measures based on harmonic analysis, which are blind to nonlinear features. The dynamics of breathing, so crucial for 4D-based clinical technologies, can be better controlled if a nonlinear-based methodology, which reflects respiration characteristics, is applied. Funding provided by Varian Medical Systems via an Investigator Initiated Research Project.
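
    Of the complexity measures named (SE, LZC, LLE), sample entropy is the simplest to sketch. The implementation below follows the usual SampEn(m, r) recipe on a synthetic breathing-like trace; published variants differ in small conventions (template counts, tolerance scaling), so treat it as illustrative.

```python
# A minimal sketch of sample entropy SampEn(m, r): -ln(A/B), where B counts
# template pairs of length m within Chebyshev tolerance r and A does the
# same for length m + 1. Low values indicate a more regular signal.
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    x = (x - x.mean()) / x.std()               # r is in units of the SD
    def pairs(length):
        templates = np.array([x[i:i + length] for i in range(len(x) - length)])
        d = np.abs(templates[:, None, :] - templates[None, :, :]).max(axis=2)
        n = len(templates)
        return ((d <= r).sum() - n) / 2        # exclude self-matches
    return -np.log(pairs(m + 1) / pairs(m))

rng = np.random.default_rng(5)
t = np.arange(600) * 0.1
breathing = np.sin(t) + 0.1 * rng.normal(size=t.size)   # stand-in RPM trace
print("SampEn:", sample_entropy(breathing))
```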

  12. The influence of the compression interface on the failure behavior and size effect of concrete

    NASA Astrophysics Data System (ADS)

    Kampmann, Raphael

    The failure behavior of concrete materials is not completely understood because conventional test methods fail to assess the material response independent of sample size and shape. To study the influence of strength- and strain-affecting test conditions, four typical concrete sample types were experimentally evaluated in uniaxial compression and analyzed for strength, deformational behavior, crack initiation/propagation, and fracture patterns under varying boundary conditions. Both low-friction and conventional compression interfaces were assessed. High-speed video technology was used to monitor macrocracking. Inferential data analysis showed reliably lower strength results for reduced surface friction at the compression interfaces, regardless of sample shape. Reciprocal comparisons revealed statistically significant strength differences between most sample shapes. Crack initiation and propagation were found to differ for dissimilar compression interfaces. The principal stress and strain distributions were analyzed; the strain domain was found to resemble the experimental results, whereas the stress analysis failed to explain failure for reduced end confinement. Neither stresses nor strains indicated strength reductions due to reduced friction, and therefore buckling effects were considered. The high-speed video analysis revealed localized buckling phenomena, regardless of end confinement. Slender elements were the result of low friction, and stocky fragments developed under conventional confinement. The critical buckling load increased accordingly. The research showed that current test methods do not reflect the "true" compressive strength and that concrete failure is strain driven. Ultimate collapse results from buckling preceded by unstable cracking.

  13. The eyes of Tullimonstrum reveal a vertebrate affinity.

    PubMed

    Clements, Thomas; Dolocan, Andrei; Martin, Peter; Purnell, Mark A; Vinther, Jakob; Gabbott, Sarah E

    2016-04-28

    Tullimonstrum gregarium is an iconic soft-bodied fossil from the Carboniferous Mazon Creek Lagerstätte (Illinois, USA). Despite a large number of specimens and distinct anatomy, various analyses over the past five decades have failed to determine the phylogenetic affinities of the 'Tully monster', and although it has been allied to such disparate phyla as the Mollusca, Annelida or Chordata, it remains enigmatic. The nature and phylogenetic affinities of Tullimonstrum have defied confident systematic placement because none of its preserved anatomy provides unequivocal evidence of homology, without which comparative analysis fails. Here we show that the eyes of Tullimonstrum possess ultrastructural details indicating homology with vertebrate eyes. Anatomical analysis using scanning electron microscopy reveals that the eyes of Tullimonstrum preserve a retina defined by a thick sheet comprising distinct layers of spheroidal and cylindrical melanosomes. Time-of-flight secondary ion mass spectrometry and multivariate statistics provide further evidence that these microbodies are melanosomes. A range of animals have melanin in their eyes, but the possession of melanosomes of two distinct morphologies arranged in layers, forming retinal pigment epithelium, is a synapomorphy of vertebrates. Our analysis indicates that in addition to evidence of colour patterning, ecology and thermoregulation, fossil melanosomes can also carry a phylogenetic signal. Identification in Tullimonstrum of spheroidal and cylindrical melanosomes forming the remains of retinal pigment epithelium indicates that it is a vertebrate; considering its body parts in this new light suggests it was an anatomically unusual member of total group Vertebrata.

  14. Revealing Companions to Nearby Stars with Astrometric Acceleration

    DTIC Science & Technology

    2012-07-01

    objects, such as stellar-mass black holes or failed supernovae (Gould & Salim 2002). Table 4 includes a sample of some of the most interesting dis… Knowledge of binary and multiple star statistics is needed for the study of star formation, for stellar population synthesis, and for predicting the… frequency of supernovae, blue stragglers, X-ray binaries, etc. The statistical properties of binaries strongly depend on stellar mass. Only for nearby solar…

  15. Breastfeeding is positively associated with child intelligence even net of parental IQ.

    PubMed

    Kanazawa, Satoshi

    2015-12-01

    Some previous reviews conclude that breastfeeding is not significantly associated with increased intelligence in children once the mother's IQ is statistically controlled. This conclusion may have both theoretical and methodological problems. The National Child Development Study allows the examination of the effect of breastfeeding on intelligence in two consecutive generations of British children. The analysis of the first generation shows that the effect of breastfeeding on intelligence increases from Age 7 to 16. The analysis of the second generation shows that each month of breastfeeding, net of parental IQ and other potential confounds, is associated with an increase of .16 IQ points. Further analyses suggest that some previous studies may have failed to uncover the effect of breastfeeding on child intelligence because of their reliance on one IQ test. (c) 2015 APA, all rights reserved.

  16. Replication and Pedagogy in the History of Psychology VI: Egon Brunswik on Perception and Explicit Reasoning

    NASA Astrophysics Data System (ADS)

    Athy, Jeremy; Friedrich, Jeff; Delany, Eileen

    2008-05-01

    Egon Brunswik (1903–1955) first made an interesting distinction between perception and explicit reasoning, arguing that perception involves quick estimates of an object’s size that nearly always result in good approximations in uncertain environments, whereas explicit reasoning, while better at achieving exact estimates, can often fail by wide margins. An experiment conducted by Brunswik to investigate these ideas was never published, and the only available information is a figure of the results presented in a posthumous book in 1956. We replicated and extended his study to gain insight into the procedures Brunswik used in obtaining his results. Explicit reasoning resulted in fewer errors, yet more extreme ones, than perception. Brunswik’s graphical analysis of the results, however, led to different conclusions than did a modern statistically based analysis.

  17. Mapping of epistatic quantitative trait loci in four-way crosses.

    PubMed

    He, Xiao-Hong; Qin, Hongde; Hu, Zhongli; Zhang, Tianzhen; Zhang, Yuan-Ming

    2011-01-01

    Four-way crosses (4WC) involving four different inbred lines often appear in plant and animal commercial breeding programs. Direct mapping of quantitative trait loci (QTL) in these commercial populations is both economical and practical. However, the existing statistical methods for mapping QTL in a 4WC population are built on the single-QTL genetic model. This simple genetic model fails to take into account QTL interactions, which play an important role in the genetic architecture of complex traits. In this paper, therefore, we attempted to develop a statistical method to detect epistatic QTL in a 4WC population. Conditional probabilities of QTL genotypes, computed by the multi-point single-locus method, were used to sample the genotypes of all putative QTL in the entire genome. The sampled genotypes were used to construct the design matrix for QTL effects. All QTL effects, including main and epistatic effects, were simultaneously estimated by the penalized maximum likelihood method. The proposed method was confirmed by a series of Monte Carlo simulation studies and a real-data analysis of cotton. The new method will provide novel tools for the genetic dissection of complex traits, construction of QTL networks, and analysis of heterosis.

  18. Interpreting the gamma statistic in phylogenetic diversification rate studies: a rate decrease does not necessarily indicate an early burst.

    PubMed

    Fordyce, James A

    2010-07-23

    Phylogenetic hypotheses are increasingly being used to elucidate historical patterns of diversification rate-variation. Hypothesis testing is often conducted by comparing the observed vector of branching times to a null, pure-birth expectation. A popular method for inferring a decrease in speciation rate, which might suggest an early burst of diversification followed by a decrease in diversification rate, is the gamma statistic. Using simulations under varying conditions, I examine the sensitivity of gamma to the distribution of the most recent branching times. Using an exploratory data analysis tool for lineages-through-time plots, tree deviation, I identified trees with a significant gamma statistic that do not appear to have the characteristic early accumulation of lineages consistent with an early, rapid rate of cladogenesis. I further investigated the sensitivity of the gamma statistic to recent diversification by examining the consequences of failing to simulate the full time interval following the most recent cladogenic event. The power of gamma to detect a rate decrease at varying times was assessed for simulated trees with an initial high rate of diversification followed by a relatively low rate. The gamma statistic is extraordinarily sensitive to recent diversification rates and does not necessarily detect early bursts of diversification. This was true for trees of various sizes and completeness of taxon sampling. The gamma statistic had greater power to detect recent diversification rate decreases compared with early bursts of diversification. Caution should be exercised when interpreting the gamma statistic as an indication of early, rapid diversification.
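
    The gamma statistic itself is short to compute. The sketch below implements the Pybus & Harvey (2000) formula from internode intervals of an ultrametric tree, with the interval layout stated in the comments; under a pure-birth model gamma is approximately standard normal.

```python
# A minimal sketch of the gamma statistic of Pybus & Harvey (2000); a
# strongly negative gamma suggests branching times concentrated near the
# root (an early burst), a positive one suggests recent acceleration.
import numpy as np

def gamma_stat(intervals):
    # intervals[k] = internode interval during which (k + 2) lineages exist,
    # for k = 0 .. n-2, where n is the number of tips.
    g = np.asarray(intervals, dtype=float)
    n = g.size + 1
    j = np.arange(2, n + 1)
    weighted = j * g
    T = weighted.sum()
    T_i = np.cumsum(weighted)[:-1]            # partial sums T_2 .. T_{n-1}
    return (T_i.mean() - T / 2) / (T * np.sqrt(1.0 / (12 * (n - 2))))

# Short early intervals (rapid early branching) give a negative gamma.
print(gamma_stat([0.1, 0.25, 0.5, 1.0, 2.0, 3.0, 5.0]))
```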

  19. The German Registry of immune tolerance treatment in hemophilia--1999 update.

    PubMed

    Lenk, H

    2000-10-01

    As of 1999, the German registry of immune tolerance treatment in hemophilia has received reports on 146 patients who have undergone this therapy from 25 hemophilia centers. In 16 of the reported patients treatment is ongoing. Therapy has been completed in 126 patients of all groups with hemophilia A; most of them are children. In 78.6% of hemophilia A patients full success was achieved, 8.7% finished with partial success, and in 12.7% ITT failed. Statistical analysis demonstrates that interruptions of therapy have a negative influence on success. The inhibitor titer has the highest predictive value for success or failure of therapy. A high maximum titer as well as a high titer at start of treatment were related to a low success rate. Other variables such as exposure days and time interval between inhibitor detection and start of ITT were not statistically significant. Four patients with hemophilia B have also completed therapy, only one of them with success.

  20. Detecting the contagion effect in mass killings; a constructive example of the statistical advantages of unbinned likelihood methods.

    PubMed

    Towers, Sherry; Mubayi, Anuj; Castillo-Chavez, Carlos

    2018-01-01

    When attempting to statistically distinguish between a null and an alternative hypothesis, many researchers in the life and social sciences turn to binned statistical analysis methods, or methods that are simply based on the moments of a distribution (such as the mean, and variance). These methods have the advantage of simplicity of implementation, and simplicity of explanation. However, when null and alternative hypotheses manifest themselves in subtle differences in patterns in the data, binned analysis methods may be insensitive to these differences, and researchers may erroneously fail to reject the null hypothesis when in fact more sensitive statistical analysis methods might produce a different result when the null hypothesis is actually false. Here, with a focus on two recent conflicting studies of contagion in mass killings as instructive examples, we discuss how the use of unbinned likelihood methods makes optimal use of the information in the data; a fact that has been long known in statistical theory, but perhaps is not as widely appreciated amongst general researchers in the life and social sciences. In 2015, Towers et al published a paper that quantified the long-suspected contagion effect in mass killings. However, in 2017, Lankford & Tomek subsequently published a paper, based upon the same data, that claimed to contradict the results of the earlier study. The former used unbinned likelihood methods, and the latter used binned methods, and comparison of distribution moments. Using these analyses, we also discuss how visualization of the data can aid in determination of the most appropriate statistical analysis methods to distinguish between a null and alternate hypothesis. We also discuss the importance of assessment of the robustness of analysis results to methodological assumptions made (for example, arbitrary choices of number of bins and bin widths when using binned methods); an issue that is widely overlooked in the literature, but is critical to analysis reproducibility and robustness. When an analysis cannot distinguish between a null and alternate hypothesis, care must be taken to ensure that the analysis methodology itself maximizes the use of information in the data that can distinguish between the two hypotheses. The use of binned methods by Lankford & Tomek (2017), that examined how many mass killings fell within a 14 day window from a previous mass killing, substantially reduced the sensitivity of their analysis to contagion effects. The unbinned likelihood methods used by Towers et al (2015) did not suffer from this problem. While a binned analysis might be favorable for simplicity and clarity of presentation, unbinned likelihood methods are preferable when effects might be somewhat subtle.

  1. Detecting the contagion effect in mass killings; a constructive example of the statistical advantages of unbinned likelihood methods

    PubMed Central

    Mubayi, Anuj; Castillo-Chavez, Carlos

    2018-01-01

    Background When attempting to statistically distinguish between a null and an alternative hypothesis, many researchers in the life and social sciences turn to binned statistical analysis methods, or methods that are simply based on the moments of a distribution (such as the mean, and variance). These methods have the advantage of simplicity of implementation, and simplicity of explanation. However, when null and alternative hypotheses manifest themselves in subtle differences in patterns in the data, binned analysis methods may be insensitive to these differences, and researchers may erroneously fail to reject the null hypothesis when in fact more sensitive statistical analysis methods might produce a different result when the null hypothesis is actually false. Here, with a focus on two recent conflicting studies of contagion in mass killings as instructive examples, we discuss how the use of unbinned likelihood methods makes optimal use of the information in the data; a fact that has been long known in statistical theory, but perhaps is not as widely appreciated amongst general researchers in the life and social sciences. Methods In 2015, Towers et al published a paper that quantified the long-suspected contagion effect in mass killings. However, in 2017, Lankford & Tomek subsequently published a paper, based upon the same data, that claimed to contradict the results of the earlier study. The former used unbinned likelihood methods, and the latter used binned methods, and comparison of distribution moments. Using these analyses, we also discuss how visualization of the data can aid in determination of the most appropriate statistical analysis methods to distinguish between a null and alternate hypothesis. We also discuss the importance of assessment of the robustness of analysis results to methodological assumptions made (for example, arbitrary choices of number of bins and bin widths when using binned methods); an issue that is widely overlooked in the literature, but is critical to analysis reproducibility and robustness. Conclusions When an analysis cannot distinguish between a null and alternate hypothesis, care must be taken to ensure that the analysis methodology itself maximizes the use of information in the data that can distinguish between the two hypotheses. The use of binned methods by Lankford & Tomek (2017), that examined how many mass killings fell within a 14 day window from a previous mass killing, substantially reduced the sensitivity of their analysis to contagion effects. The unbinned likelihood methods used by Towers et al (2015) did not suffer from this problem. While a binned analysis might be favorable for simplicity and clarity of presentation, unbinned likelihood methods are preferable when effects might be somewhat subtle. PMID:29742115

  2. Failure to Replicate a Genetic Association May Provide Important Clues About Genetic Architecture

    PubMed Central

    Greene, Casey S.; Penrod, Nadia M.; Williams, Scott M.; Moore, Jason H.

    2009-01-01

    Replication has become the gold standard for assessing statistical results from genome-wide association studies. Unfortunately this replication requirement may cause real genetic effects to be missed. A real result can fail to replicate for numerous reasons including inadequate sample size or variability in phenotype definitions across independent samples. In genome-wide association studies the allele frequencies of polymorphisms may differ due to sampling error or population differences. We hypothesize that some statistically significant independent genetic effects may fail to replicate in an independent dataset when allele frequencies differ and the functional polymorphism interacts with one or more other functional polymorphisms. To test this hypothesis, we designed a simulation study in which case-control status was determined by two interacting polymorphisms with heritabilities ranging from 0.025 to 0.4 with replication sample sizes ranging from 400 to 1600 individuals. We show that the power to replicate the statistically significant independent main effect of one polymorphism can drop dramatically with a change of allele frequency of less than 0.1 at a second interacting polymorphism. We also show that differences in allele frequency can result in a reversal of allelic effects where a protective allele becomes a risk factor in replication studies. These results suggest that failure to replicate an independent genetic effect may provide important clues about the complexity of the underlying genetic architecture. We recommend that polymorphisms that fail to replicate be checked for interactions with other polymorphisms, particularly when samples are collected from groups with distinct ethnic backgrounds or different geographic regions. PMID:19503614
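
    A minimal simulation in the spirit of the study design described above (the penetrance values, allele frequencies, and sample size are arbitrary choices made here, not the authors' settings): two interacting loci determine case status, and the marginal odds ratio for locus A is re-estimated as the carrier frequency of locus B shifts, illustrating both attenuation and reversal of the apparent allelic effect.

        import numpy as np

        rng = np.random.default_rng(0)

        def marginal_or_for_A(freq_B, n=5000):
            # Dominant-coded carrier status at two independent, interacting loci.
            A = rng.random(n) < 0.30
            B = rng.random(n) < freq_B
            # Pure interaction penetrance: risk is elevated only when the two
            # carrier statuses differ (an XOR-style model; values are arbitrary).
            case = rng.random(n) < np.where(A ^ B, 0.30, 0.10)
            a, b = np.sum(case & A), np.sum(case & ~A)
            c, d = np.sum(~case & A), np.sum(~case & ~A)
            return (a * d) / (b * c)   # marginal (single-locus) odds ratio for A

        for f in (0.30, 0.50, 0.70):
            print(f"carrier freq of B = {f:.2f}: "
                  f"marginal OR for A = {marginal_or_for_A(f):.2f}")

    With these settings the marginal effect of A is a risk factor at low B frequency, vanishes near 0.5, and reverses to apparent protection at high B frequency, exactly the kind of non-replication pattern the abstract describes.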

  3. Examining the Effectiveness of Discriminant Function Analysis and Cluster Analysis in Species Identification of Male Field Crickets Based on Their Calling Songs

    PubMed Central

    Jaiswara, Ranjana; Nandi, Diptarup; Balakrishnan, Rohini

    2013-01-01

    Traditional taxonomy based on morphology has often failed in accurate species identification owing to the occurrence of cryptic species, which are reproductively isolated but morphologically identical. Molecular data have thus been used to complement morphology in species identification. The sexual advertisement calls in several groups of acoustically communicating animals are species-specific and can thus complement molecular data as non-invasive tools for identification. Several statistical tools and automated identifier algorithms have been used to investigate the efficiency of acoustic signals in species identification. Despite a plethora of such methods, there is a general lack of knowledge regarding the appropriate usage of these methods in specific taxa. In this study, we investigated the performance of two commonly used statistical methods, discriminant function analysis (DFA) and cluster analysis, in identification and classification based on acoustic signals of field cricket species belonging to the subfamily Gryllinae. Using a comparative approach, we evaluated, for both methods, the optimal number of species and the calling song characteristics that lead to the most accurate classification and identification. The accuracy of classification using DFA was high and was not affected by the number of taxa used. However, a constraint in using discriminant function analysis is the need for a priori classification of songs. Accuracy of classification using cluster analysis, which does not require a priori knowledge, was maximum for 6-7 taxa and decreased significantly when more than ten taxa were analysed together. We also investigated the efficacy of two novel derived acoustic features in improving the accuracy of identification. Our results show that DFA is a reliable statistical tool for species identification using acoustic signals, and that cluster analysis of acoustic signals in crickets works effectively for species classification and identification. PMID:24086666
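
    The following sketch, using synthetic feature vectors rather than real cricket songs, contrasts the two routes discussed above: supervised discriminant analysis (here scikit-learn's LDA, which requires a priori species labels) and unsupervised clustering (KMeans, scored against the true labels with the adjusted Rand index). The seven "species" and five acoustic features are placeholders.

        import numpy as np
        from sklearn.datasets import make_blobs
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.cluster import KMeans
        from sklearn.model_selection import cross_val_score
        from sklearn.metrics import adjusted_rand_score

        # Synthetic "calling song" features (e.g., syllable period, carrier
        # frequency, chirp duration) for 7 hypothetical species, 30 songs each.
        X, y = make_blobs(n_samples=7 * 30, centers=7, n_features=5,
                          cluster_std=2.0, random_state=1)

        # Supervised route (DFA analogue): needs a priori species labels.
        lda_acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5).mean()

        # Unsupervised route (cluster analysis): no labels needed; agreement
        # with the true species is scored by the adjusted Rand index.
        labels = KMeans(n_clusters=7, n_init=10, random_state=1).fit_predict(X)
        ari = adjusted_rand_score(y, labels)

        print(f"LDA 5-fold accuracy: {lda_acc:.2f}   KMeans ARI: {ari:.2f}")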

  4. Counts-in-cylinders in the Sloan Digital Sky Survey with Comparisons to N-body Simulations

    NASA Astrophysics Data System (ADS)

    Berrier, Heather D.; Barton, Elizabeth J.; Berrier, Joel C.; Bullock, James S.; Zentner, Andrew R.; Wechsler, Risa H.

    2011-01-01

    Environmental statistics provide a necessary means of comparing the properties of galaxies in different environments, and a vital test of models of galaxy formation within the prevailing hierarchical cosmological model. We explore counts-in-cylinders, a common statistic defined as the number of companions of a particular galaxy found within a given projected radius and redshift interval. Galaxy distributions with the same two-point correlation functions do not necessarily have the same companion count distributions. We use this statistic to examine the environments of galaxies in the Sloan Digital Sky Survey Data Release 4 (SDSS DR4). We also make preliminary comparisons to four models for the spatial distributions of galaxies, based on N-body simulations and data from SDSS DR4, to study the utility of the counts-in-cylinders statistic. There is a very large scatter between the number of companions a galaxy has and the mass of its parent dark matter halo and the halo occupation, limiting the utility of this statistic for certain kinds of environmental studies. We also show that prevalent empirical models of galaxy clustering, which match observed two- and three-point clustering statistics well, fail to reproduce some aspects of the observed distribution of counts-in-cylinders on 1, 3, and 6 h⁻¹ Mpc scales. All models that we explore underpredict the fraction of galaxies with few or no companions in 3 and 6 h⁻¹ Mpc cylinders. Roughly 7% of galaxies in the real universe are significantly more isolated within a 6 h⁻¹ Mpc cylinder than the galaxies in any of the models we use. Simple phenomenological models that map galaxies to dark matter halos fail to reproduce high-order clustering statistics in low-density environments.

  5. Efficient summary statistical representation when change localization fails.

    PubMed

    Haberman, Jason; Whitney, David

    2011-10-01

    People are sensitive to the summary statistics of the visual world (e.g., average orientation/speed/facial expression). We readily derive this information from complex scenes, often without explicit awareness. Given the fundamental and ubiquitous nature of summary statistical representation, we tested whether this kind of information is subject to the attentional constraints imposed by change blindness. We show that information regarding the summary statistics of a scene is available despite limited conscious access. In a novel experiment, we found that while observers can suffer from change blindness (i.e., not localize where change occurred between two views of the same scene), observers could nevertheless accurately report changes in the summary statistics (or "gist") about the very same scene. In the experiment, observers saw two successively presented sets of 16 faces that varied in expression. Four of the faces in the first set changed from one emotional extreme (e.g., happy) to another (e.g., sad) in the second set. Observers performed poorly when asked to locate any of the faces that changed (change blindness). However, when asked about the ensemble (which set was happier, on average), observer performance remained high. Observers were sensitive to the average expression even when they failed to localize any specific object change. That is, even when observers could not locate the very faces driving the change in average expression between the two sets, they nonetheless derived a precise ensemble representation. Thus, the visual system may be optimized to process summary statistics in an efficient manner, allowing it to operate despite minimal conscious access to the information presented.

  6. The effect of antimicrobial agents on bond strength of orthodontic adhesives: a meta-analysis of in vitro studies.

    PubMed

    Altmann, A S P; Collares, F M; Leitune, V C B; Samuel, S M W

    2016-02-01

    Antimicrobial orthodontic adhesives aim to reduce the incidence of white spot lesions in orthodontic patients, but they should not jeopardize the adhesive's properties. A systematic review and meta-analysis were performed to answer the questions of whether associating antimicrobial agents with orthodontic adhesives compromises their mechanical properties and whether any antimicrobial agent is superior. PubMed and Scopus databases were searched. In vitro studies comparing the shear bond strength of conventional photo-activated orthodontic adhesives to antimicrobial photo-activated orthodontic adhesives were considered eligible. Search terms included the following: orthodontics, orthodontic, antimicrobial, antibacterial, bactericidal, adhesive, resin, resin composite, bonding agent, bonding system, and bond strength. The searches yielded 494 citations, reduced to 467 after duplicates were discarded. Titles and abstracts were read, and 13 publications were selected for full-text reading. Twelve studies were included in the meta-analysis. The global analysis showed no statistically significant difference between control and experimental groups. In the subgroup analysis, only the chlorhexidine subgroup showed a statistically significant difference, with the control groups having higher bond strength than the experimental groups. Many studies on in vitro orthodontic bond strength fail to report test conditions that could affect their outcomes. The pooled in vitro data suggest that adding an antimicrobial agent to an orthodontic adhesive system does not influence bond strength to enamel. It is not possible to state which antimicrobial agent is best to associate. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  7. Vitamin D and Depression: A Systematic Review and Meta-Analysis Comparing Studies with and without Biological Flaws

    PubMed Central

    Spedding, Simon

    2014-01-01

    Efficacy of Vitamin D supplements in depression is controversial, awaiting further literature analysis. Biological flaws in primary studies are a possible reason why meta-analyses of Vitamin D have failed to demonstrate efficacy. This systematic review and meta-analysis of Vitamin D and depression compared studies with and without biological flaws. The systematic review followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. The literature search was undertaken through four databases for randomized controlled trials (RCTs). Studies were critically appraised for methodological quality and biological flaws, in relation to the hypothesis and study design. Meta-analyses were performed for studies according to the presence of biological flaws. The 15 RCTs identified provide a more comprehensive evidence base than previous systematic reviews; the methodological quality of the studies was generally good and the methodology was diverse. A meta-analysis of all studies without flaws demonstrated a statistically significant improvement in depression with Vitamin D supplements (+0.78 CI +0.24, +1.27). Studies with biological flaws were mainly inconclusive, with the meta-analysis demonstrating a statistically significant worsening in depression with Vitamin D supplements (−1.1 CI −0.7, −1.5). Vitamin D supplementation (≥800 I.U. daily) was somewhat favorable in the management of depression in studies that demonstrated a change in vitamin levels, and the effect size was comparable to that of anti-depressant medication. PMID:24732019

  8. Internal construct validity of the Shirom-Melamed Burnout Questionnaire (SMBQ)

    PubMed Central

    2012-01-01

    Background Burnout is a mental condition defined as a result of continuous and long-term stress exposure, particularly related to psychosocial factors at work. This paper seeks to examine the psychometric properties of the Shirom-Melamed Burnout Questionnaire (SMBQ) for validation of its use in a clinical setting. Methods Data from both a clinical sample (n = 319) and a general population sample (n = 319) of health care and social insurance workers were included in the study. Data were analysed using both classical and modern test theory approaches, including Confirmatory Factor Analysis (CFA) and Rasch analysis. Results Of the 638 people recruited into the study, 416 (65%) were working full or part time. Data from the SMBQ failed a CFA, and initially failed to satisfy Rasch model expectations. After the removal of 4 of the original items measuring tension, and after accommodating local dependency in the data, model expectations were met. As such, the total score from the revised scale is a sufficient statistic for ascertaining burnout, and an interval scale transformation is available. The scale as a whole was perfectly targeted to the joint sample. A cut point of 4.4 for severe burnout was chosen at the intersection of the distributions of the clinical and general population samples. Conclusion A revised 18-item version of the SMBQ satisfies modern measurement standards. Using its cut point, it offers the opportunity to identify potential clinical cases of burnout. PMID:22214479

  9. [Non-operative treatment for severe forms of infantile idiopathic scoliosis].

    PubMed

    Trobisch, P D; Samdani, A; O'Neil, C; Betz, R; Cahill, P

    2012-02-01

    Infantile idiopathic scoliosis (IIS) is a rare orthopaedic condition. Braces and casts are popular options in the treatment of IIS, but there is a paucity of studies commenting on the outcome of non-operative treatment. The purpose of this study was to analyse failure and success after non-operative treatment for severe forms of IIS. We retrospectively reviewed the data of all children who had been treated for IIS between 2003 and 2009 at a single institution. After calculating the failure and success rates, we additionally performed a risk factor analysis for patients who failed non-operative treatment. Chi-squared and t tests were used for statistical analysis, with significance set at p < 0.05. 25 children with an average age of 11 months and a Cobb angle of 46 degrees at presentation were analysed. Seven (28%) patients were considered as having failed non-operative treatment after an average follow-up of 28 months. The pretreatment Cobb angle was identified as the single significant risk factor for failure (55 versus 42 degrees), while neither age, gender, nor RVAD seemed to influence the outcome. In children who were considered successfully treated, the Cobb angle decreased from 42 to 18 degrees. Non-operative treatment for IIS is successful in 3 out of 4 patients. © Georg Thieme Verlag KG Stuttgart · New York.

  10. The Development of Bayesian Theory and Its Applications in Business and Bioinformatics

    NASA Astrophysics Data System (ADS)

    Zhang, Yifei

    2018-03-01

    Bayesian Theory originated in an essay by the British mathematician Thomas Bayes, published in 1763, and after its development in the 20th century, Bayesian Statistics has come to play a significant part in statistical study across all fields. Due to recent breakthroughs in high-dimensional integration, Bayesian Statistics has been improved and refined, and it can now be used to solve problems that Classical Statistics failed to solve. This paper summarizes the history, concepts, and applications of Bayesian Statistics in five parts: the history of Bayesian Statistics, the weaknesses of Classical Statistics, Bayesian Theory, and its development and applications. The first two parts compare Bayesian Statistics and Classical Statistics from a macroscopic viewpoint, and the last three parts focus on Bayesian Theory specifically, from introducing particular Bayesian concepts to outlining their development and, finally, their applications.

  11. Development of a GIS-based failure investigation system for highway soil slopes

    NASA Astrophysics Data System (ADS)

    Ramanathan, Raghav; Aydilek, Ahmet H.; Tanyu, Burak F.

    2015-06-01

    A framework for an early warning system was developed for Maryland, using a GIS database and a collective overlay of maps that highlights, through spatial and statistical analysis, highway slopes susceptible to soil slides or slope failures. Data for existing soil slope failures were collected from geotechnical reports and field visits. A total of 48 slope failures were recorded and analyzed. Six factors, including event precipitation, geological formation, land cover, slope history, slope angle, and elevation, were considered to affect highway soil slope stability. The observed trends indicate that precipitation and poor surface or subsurface drainage conditions are the principal factors causing slope failures. 96% of the failed slopes have an open drainage section. A majority of the failed slopes lie in regions with relatively high event precipitation (P > 200 mm). 90% of the existing failures are surficial erosion-type failures, and only 1 of the 42 analyzed slope failures is a deep rotational failure. More than half of the analyzed slope failures occurred in regions with low-density land cover. 46% of failures are on slopes with slope angles between 20° and 30°. An influx of more data on failed slopes should reveal further trends, and the developed slope management system will thus aid state highway engineers in prudent budget allocation and in prioritizing remediation projects, drawing on the reviewed literature on the principles, concepts, techniques, and methodology for slope instability evaluation (Leshchinsky et al., 2015).
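
    A minimal sketch of the collective-overlay idea on toy rasters; the factor layers, normalization, and weights below are assumptions chosen for illustration, not the study's fitted values (in practice the weights would come from statistical analysis of the failure inventory).

        import numpy as np

        # Toy factor rasters on a common grid, each scaled to 0-1 susceptibility.
        rng = np.random.default_rng(7)
        shape = (100, 100)
        precip   = rng.random(shape)   # normalized event precipitation
        slope    = rng.random(shape)   # normalized slope angle
        landcov  = rng.random(shape)   # low-density land cover score
        drainage = rng.random(shape)   # 1 = open/poor drainage, 0 = closed/good

        # Hypothetical weights echoing the trends reported above
        # (precipitation and drainage dominate).
        weights = {"precip": 0.35, "drainage": 0.30, "slope": 0.20, "landcov": 0.15}
        susceptibility = (weights["precip"] * precip
                          + weights["drainage"] * drainage
                          + weights["slope"] * slope
                          + weights["landcov"] * landcov)

        # Flag the top decile as candidate early-warning cells.
        warning = susceptibility > np.quantile(susceptibility, 0.9)
        print(f"{warning.sum()} of {warning.size} cells flagged")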

  12. SDN solutions for switching dedicated long-haul connections: Measurements and comparative analysis

    DOE PAGES

    Rao, Nageswara S. V.

    2016-01-01

    We consider a scenario of two sites connected over a dedicated, long-haul connection that must quickly fail over in response to degradations in host-to-host application performance. Traditional layer-2/3 hot-standby fail-over solutions do not adequately address the variety of application degradations, and more recent single-controller Software Defined Network (SDN) solutions are not effective for long-haul connections. We present two methods for such a path fail-over using OpenFlow-enabled switches: (a) a light-weight method that utilizes host scripts to monitor application performance and the dpctl API for switching, and (b) a generic method that uses two OpenDaylight (ODL) controllers and REST interfaces. For both methods, the restoration dynamics of applications contain significant statistical variations due to the complexities of controllers, northbound interfaces, and switches; these, together with the wide variety of vendor implementations, complicate the choice among such solutions. We develop the impulse-response method, based on regression functions of performance parameters, to provide a rigorous and objective comparison of different solutions. We describe testing results of the two proposed methods, using TCP throughput and connection rtt as main parameters, over a testbed consisting of HP and Cisco switches connected over long-haul connections emulated in hardware by ANUE devices. Lastly, the combination of analytical and experimental results demonstrates that the dpctl method responds seconds faster than the ODL method on average, even though both methods eventually restore the original TCP throughput.
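
    To make the impulse-response comparison concrete, the sketch below simulates repeated post-failover throughput traces for two hypothetical methods and summarizes each by its regression function (the pointwise mean response across trials, with a confidence band). The recovery times, throughput level, and noise parameters are invented, not the paper's measurements.

        import numpy as np

        rng = np.random.default_rng(3)
        t = np.arange(0.0, 60.0, 1.0)   # seconds since the disturbance

        def trials(recovery_time, n=30):
            # Step-like recovery with jitter in the recovery instant plus noise.
            onset = recovery_time + rng.normal(0.0, 2.0, size=(n, 1))
            return 900.0 * (t >= onset) + rng.normal(0.0, 30.0, size=(n, t.size))

        dpctl_runs = trials(5.0)    # faster hypothetical method
        odl_runs = trials(12.0)     # slower hypothetical method

        for name, runs in (("dpctl", dpctl_runs), ("ODL", odl_runs)):
            mean = runs.mean(axis=0)                               # regression function
            band = 1.96 * runs.std(axis=0, ddof=1) / np.sqrt(len(runs))
            i90 = int(np.argmax(mean > 0.9 * 900.0))               # first time at 90%
            print(f"{name}: ~{t[i90]:.0f} s to 90% throughput "
                  f"(band ±{band[i90]:.0f} Mb/s)")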

  13. Improved amputation-free survival in unreconstructable critical limb ischemia and its implications for clinical trial design and quality measurement.

    PubMed

    Benoit, Eric; O'Donnell, Thomas F; Kitsios, Georgios D; Iafrati, Mark D

    2012-03-01

    Amputation-free survival (AFS), a composite endpoint of mortality and amputation, is the preferred outcome measure in critical limb ischemia (CLI). Given the improvements in systemic management of atherosclerosis and interventional management of limb ischemia over the past 2 decades, we examined whether these outcomes have changed in patients with CLI without revascularization options (no option-critical limb ischemia [NO-CLI]). We reviewed the literature for published 1-year AFS, mortality, and amputation rates from control groups in NO-CLI trials. Summary proportions of events were estimated by conducting a random effects meta-analysis of proportions. To determine whether there had been any change in event rates over time, we performed a random effects meta-regression and a mixed effects logistic regression, both regressed against the variable "final year of recruitment." Eleven trials consisting of 886 patients satisfied the search criteria, 7 of which presented AFS data. Summary proportions of events (95% confidence interval) were 0.551 (0.399 to 0.693) for AFS, 0.198 (0.116 to 0.317) for death, and 0.341 (0.209 to 0.487) for amputation. Regression analyses demonstrated that AFS has risen over time as mortality rates have fallen, and these improvements are statistically significant. The decrease in amputation rates failed to reach statistical significance. The lack of published data precluded a quantitative evaluation of any change in the clinical severity or comorbidities in the NO-CLI population. AFS and mortality rates in NO-CLI have improved over the past 2 decades. Due to declining event rates, clinical trials may underestimate treatment effects and thus fail to reach statistical significance unless sample sizes are increased or unless a subgroup with a higher event rate can be identified. Alternatively, comparing outcomes to historical values for quality measurement may overestimate treatment effects. Benchmark values of AFS and mortality require periodic review and updating. Copyright © 2012 Society for Vascular Surgery. Published by Mosby, Inc. All rights reserved.

  14. A hint of Poincaré dodecahedral topology in the WMAP first year sky map

    NASA Astrophysics Data System (ADS)

    Roukema, B. F.; Lew, B.; Cechowska, M.; Marecki, A.; Bajtlik, S.

    2004-09-01

    It has recently been suggested by Luminet et al. (2003) that the WMAP data are better matched by a geometry in which the topology is that of a Poincaré dodecahedral model and the curvature is "slightly" spherical, rather than by an (effectively) infinite flat model. A general back-to-back matched circles analysis by Cornish et al. (2003) for angular radii in the range 25°-90°, using a correlation statistic for signal detection, failed to support this. In this paper, a matched circles analysis specifically designed to detect dodecahedral patterns of matched circles is performed over angular radii in the range 1°-40° on the one-year WMAP data. Signal detection is attempted via a correlation statistic and an rms difference statistic. Extreme value distributions of these statistics are calculated for one orientation of the 36° "screw motion" (Clifford translation) when matching circles, for the opposite screw motion, and for a zero (unphysical) rotation. The most correlated circles appear for circle radii of α = 11° ± 1°, for the left-handed screw motion, but not for the right-handed one, nor for the zero rotation. The favoured six dodecahedral face centres in galactic coordinates are (l, b) ≈ (252°, +65°), (51°, +51°), (144°, +38°), (207°, +10°), (271°, +3°), (332°, +25°) and their opposites. The six pairs of circles independently each favour a circle angular radius of 11° ± 1°. The temperature fluctuations along the matched circles are plotted and are clearly highly correlated. Whether these six circle pairs centred on dodecahedral faces match via a 36° rotation only because of unexpected statistical properties of the WMAP ILC map, or because of global geometry, it is clear that the WMAP ILC map has some unusual statistical properties which mimic a potentially interesting cosmological signal.

  15. Extended Ponseti method for failed tenotomy in idiopathic clubfeet: a pilot study.

    PubMed

    Agarwal, Anil; Agrawal, Nargesh; Barik, Sitanshu; Gupta, Neeraj

    2018-01-29

    We evaluated the outcome of a new protocol, an extended Ponseti method, in the management of idiopathic club foot with residual equinus following failed Achilles tenotomy. We also compared the failed with a successful tenotomy group to analyse the parameters associated with failure. Idiopathic club foot patients treated with the Ponseti technique in whom percutaneous Achilles tenotomy had failed (failure to achieve 15° of dorsiflexion) were treated by continued stretching casts, changed weekly, for a further 3 weeks. Final dorsiflexion of more than 15°, if achieved with the above protocol, was recorded as a success. Twenty-six (16%) patients with failed Achilles tenotomy and residual equinus, out of a total of 161 patients with primary idiopathic club foot, were tested with the protocol. Ten (38.5%) of the failed patients had bilateral foot involvement and 16 (61.5%) had unilateral foot involvement. A total of seven (26.9%) patients achieved the end point dorsiflexion of more than 15° with one further cast, 10 (38.5%) with two casts, and four (15.4%) with three casts, respectively. Overall success of the extended Ponseti protocol was achieved in 21/26 (80.8%) patients. The patient's age, precasting initial Pirani score, number of Ponseti casts, pretenotomy Pirani score, and pretenotomy ankle joint dorsiflexion were statistically different in the failed compared with the successful tenotomy group. The tested extended Ponseti protocol showed a success rate of 80.8% in salvaging failed tenotomy cases. The failed tenotomy group was relatively older at presentation, had higher precasting and pretenotomy Pirani scores, received a greater number of Ponseti casts, and had less pretenotomy ankle joint dorsiflexion compared with successfully treated feet.

  16. Variability of chemical analysis of reinforcing bar produced in Saudi Arabia

    NASA Astrophysics Data System (ADS)

    Salman, A.; Djavanroodi, F.

    2018-04-01

    In view of the importance and demanding roles of steel rebars in reinforced concrete structures, accurate information on the properties of the steel is important at the design stage. In the steelmaking process, production variations in chemical composition are unavoidable. The aim of this work is to study the variability of the chemical composition of reinforcing steel produced throughout Saudi Arabia and to assess the quality of steel rebars according to ASTM A615. 68 samples of ASTM A615 Grade 60 rebar from different manufacturers were collected and tested using a spectrometer to obtain their chemical compositions. EasyFit (5.6) software was used to conduct the statistical analysis. Distributions and control charts were generated for the chemical compositions. The results showed that some compositions lie above the upper line of the control chart. Finally, the analyses show that less than 3% of the steel failed to meet the minimum ASTM standards for chemical composition.
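
    A minimal sketch of the control-chart step on invented carbon measurements (the paper's data are not reproduced here); it builds a Shewhart individuals chart with 3-sigma limits estimated from the moving range, the usual individuals-chart practice.

        import numpy as np

        # Illustrative carbon contents (wt%) from 20 heats; values are made up.
        rng = np.random.default_rng(11)
        carbon = rng.normal(loc=0.26, scale=0.02, size=20)

        # Individuals chart: sigma estimated from the mean moving range,
        # using the d2 constant for subgroups of size 2.
        mr = np.abs(np.diff(carbon))
        sigma_hat = mr.mean() / 1.128
        centre = carbon.mean()
        ucl, lcl = centre + 3 * sigma_hat, centre - 3 * sigma_hat

        out = np.flatnonzero((carbon > ucl) | (carbon < lcl))
        print(f"CL={centre:.3f}  UCL={ucl:.3f}  LCL={lcl:.3f}  out-of-control: {out}")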

  17. Information flow to assess cardiorespiratory interactions in patients on weaning trials.

    PubMed

    Vallverdú, M; Tibaduisa, O; Clariá, F; Hoyer, D; Giraldo, B; Benito, S; Caminal, P

    2006-01-01

    Nonlinear processes of the autonomic nervous system (ANS) can produce breath-to-breath variability in the pattern of breathing. In order to assess these nonlinear processes, nonlinear statistical dependencies between heart rate variability and respiratory pattern variability are analyzed. To this end, auto-mutual information and cross-mutual information concepts are applied. This information flow analysis is presented as a short-term nonlinear analysis method to investigate the information flow interactions in patients on weaning trials. 78 patients being weaned from mechanical ventilation were studied: group A, 28 patients who failed to maintain spontaneous breathing and were reconnected; group B, 50 patients with successful trials. The results show lower complexity, with an increase of information flow, in group A than in group B. Furthermore, a more (weakly) coupled nonlinear oscillator behavior is observed in the series of group A than in those of group B.
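
    The histogram-based estimator below is one simple way to compute the kind of cross-mutual information used above; the surrogate RR-interval and breath-duration series, the coupling strength, and the bin count are all illustrative assumptions, not the study's data or settings.

        import numpy as np

        def mutual_information(x, y, bins=16):
            """Histogram estimate of I(X;Y) in bits for two equal-length series."""
            pxy, _, _ = np.histogram2d(x, y, bins=bins)
            pxy /= pxy.sum()
            px = pxy.sum(axis=1, keepdims=True)
            py = pxy.sum(axis=0, keepdims=True)
            nz = pxy > 0
            return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

        # Toy surrogate series standing in for RR intervals (s) and breath
        # durations (s); the coupling strength (0.6) is arbitrary.
        rng = np.random.default_rng(5)
        rr = rng.normal(0.8, 0.05, 1000)
        ttot = 3.0 + 0.6 * (rr - 0.8) + rng.normal(0, 0.02, 1000)

        print(f"cross-MI(RR, Ttot) = {mutual_information(rr, ttot):.3f} bits")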

  18. Change Detection of High-Resolution Remote Sensing Images Based on Adaptive Fusion of Multiple Features

    NASA Astrophysics Data System (ADS)

    Wang, G. H.; Wang, H. B.; Fan, W. F.; Liu, Y.; Chen, C.

    2018-04-01

    Because traditional change detection algorithms depend mainly on the spectral information of image patches and fail to effectively mine and fuse the complementary strengths of multiple image features, this article borrows ideas from object-oriented analysis and proposes a change detection algorithm for remote sensing images based on the fusion of multiple features. First, image objects are obtained by multi-scale segmentation; then a color histogram and a linear-gradient histogram are calculated for each object. The Earth Mover's Distance (EMD) statistical operator is used to measure the color distance and the edge-line feature distance between corresponding objects from different periods, and the two distances are combined by adaptive weighting to construct the object heterogeneity. Finally, change detection results for the image objects are obtained by curvature analysis of the heterogeneity histogram. The experimental results show that the method can fully fuse the color and edge-line features, thus improving the accuracy of change detection.
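
    As a sketch of the EMD step (on invented grey-level samples rather than segmented image objects), scipy's one-dimensional Wasserstein distance can compare the feature histograms of an object at two dates:

        import numpy as np
        from scipy.stats import wasserstein_distance

        # Grey-level histograms of the "same" image object at two dates; real
        # use would take per-band color histograms from segmented objects.
        rng = np.random.default_rng(2)
        obj_t1 = rng.normal(120, 15, 2000).clip(0, 255)
        obj_t2 = rng.normal(150, 15, 2000).clip(0, 255)   # brighter: candidate change

        bins = np.arange(0, 257, 8)
        h1, _ = np.histogram(obj_t1, bins=bins, density=True)
        h2, _ = np.histogram(obj_t2, bins=bins, density=True)
        centers = 0.5 * (bins[:-1] + bins[1:])

        # 1-D Earth Mover's Distance between the two histograms.
        emd = wasserstein_distance(centers, centers, u_weights=h1, v_weights=h2)
        print(f"EMD between the two dates: {emd:.1f} grey levels")

    In the algorithm above, one such distance would be computed for the color histograms and another for the edge-line histograms, with the two fused by adaptive weighting into the object heterogeneity measure.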

  19. Downside Risk analysis applied to the Hedge Funds universe

    NASA Astrophysics Data System (ADS)

    Perelló, Josep

    2007-09-01

    Hedge Funds are considered one of the fastest-growing portfolio management sectors of the past decade. Optimal Hedge Fund management requires an appropriate risk metric. The classic CAPM theory and its Sharpe ratio fail to capture some crucial aspects due to the strongly non-Gaussian character of Hedge Fund statistics. A possible way out of this problem, while keeping the CAPM simplicity, is the so-called Downside Risk analysis. One important benefit lies in distinguishing between good and bad returns, that is, returns greater or lower than the investor's goal. We revisit the most popular Downside Risk indicators and provide new analytical results on them. We compute these measures on the Credit Suisse/Tremont Investable Hedge Fund Index data, with the Gaussian case as a benchmark. In this way, an unusual transversal reading of the existing Downside Risk measures is provided.
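
    A minimal sketch of two standard Downside Risk quantities, downside deviation and the Sortino ratio, evaluated on invented, deliberately skewed returns; the target return and the return-generating process are arbitrary choices, not the index data analyzed above.

        import numpy as np

        def downside_deviation(returns, target=0.0):
            """Root mean square of shortfalls below the investor's target return."""
            shortfall = np.minimum(returns - target, 0.0)
            return np.sqrt(np.mean(shortfall ** 2))

        def sortino_ratio(returns, target=0.0):
            """Sharpe-like ratio that penalizes only 'bad' (below-target) returns."""
            return (returns.mean() - target) / downside_deviation(returns, target)

        # Skewed toy returns mimicking the non-Gaussian character of hedge funds.
        rng = np.random.default_rng(9)
        monthly = (0.008 + 0.02 * rng.standard_t(df=3, size=240)
                   - 0.01 * rng.exponential(0.5, size=240))

        print(f"downside deviation: {downside_deviation(monthly):.4f}")
        print(f"Sortino ratio:      {sortino_ratio(monthly):.2f}")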

  20. GATA: A graphic alignment tool for comparative sequence analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nix, David A.; Eisen, Michael B.

    2005-01-01

    Several problems exist with current methods used to align DNA sequences for comparative sequence analysis. Most dynamic programming algorithms assume that conserved sequence elements are collinear. This assumption appears valid when comparing orthologous protein coding sequences. Functional constraints on proteins provide strong selective pressure against sequence inversions, and minimize sequence duplications and feature shuffling. For non-coding sequences this collinearity assumption is often invalid. For example, enhancers contain clusters of transcription factor binding sites that change in number, orientation, and spacing during evolution yet the enhancer retains its activity. Dot plot analysis is often used to estimate non-coding sequence relatedness. Yet dot plots do not actually align sequences and thus cannot account well for base insertions or deletions. Moreover, they lack an adequate statistical framework for comparing sequence relatedness and are limited to pairwise comparisons. Lastly, dot plots and dynamic programming text outputs fail to provide an intuitive means for visualizing DNA alignments.

  1. Differentiating clinical groups using the serial color-word test (S-CWT).

    PubMed

    Hentschel, Uwe; Rubino, I Alex; Bijleveld, Catrien

    2011-04-01

    The present study attempted to differentiate 11 diagnostic groups by means of the Serial Color-Word Test (S-CWT), using multivariate discriminant analysis. Two alternative scoring systems of the S-CWT were outlined. A sample of 514 individuals who had clinical diagnoses of various types and 397 controls who had no diagnostic findings comprised the study population. The first discriminant analysis failed to differentiate the groups adequately. The groups were consequently reduced to four (schizophrenia, bipolar disorders, temporo-mandibular joint pain dysfunction syndrome, and eating disturbances), which gave better reclassification findings for a clinical application of the test. This classification gave over 55% correct assignments. The final four groups had a statistically significant discrimination on the test, which remained stable in a bootstrap procedure. Implications for treatment indications and outcomes, as well as strategies for further studies using the S-CWT, are discussed.

  2. SPONTANEOUS CLOSURE OF A MACULAR HOLE AFTER FOUR FAILED VITRECTOMIES IN THE SETTING OF NF-1.

    PubMed

    Wannamaker, Kendall W; Sharpe, Robert A; Kylstra, Jan A

    2018-01-01

    To present the case of a patient who developed spontaneous closure of an idiopathic macular hole after four failed attempts at surgical closure. This is a retrospective case review of the medical record of a single patient. No statistical analysis was performed. The patient is a 71-year-old white woman with neurofibromatosis Type 1 who presented to the retina clinic of one of the authors. The patient underwent four vitrectomies with long-acting gas by two surgeons over the course of 2 years. After each surgery, the hole either did not close or it closed and then reopened within 1 year. Five months after the last surgery (1 year after the hole last reopened), the patient presented with improved vision and spontaneous closure of the macular hole. The hole has remained closed since then. This case demonstrates that spontaneous closure of a macular hole, associated with excellent visual recovery, can occur after multiple surgical failures. We propose that enhanced scar formation due to neurofibromatosis Type 1 was responsible both for the numerous failures following initially successful surgery (centrifugal traction) and for the spontaneous closure (centripetal traction).

  3. Significant Factors Related to Failed Pediatric Dental General Anesthesia Appointments at a Hospital-based Residency Program.

    PubMed

    Emhardt, John R; Yepes, Juan F; Vinson, LaQuia A; Jones, James E; Emhardt, John D; Kozlowski, Diana C; Eckert, George J; Maupome, Gerardo

    2017-05-15

    The purposes of this study were to: (1) evaluate the relationship between appointment failure and the factors of age, gender, race, insurance type, day of week, scheduled time of surgery, distance traveled, and weather; (2) investigate reasons for failure; and (3) explore the relationships between the factors and reasons for failure. Electronic medical records were accessed to obtain data for patients scheduled for dental care under general anesthesia from May 2012 to May 2015. Factors were analyzed for relation to appointment failure. Data from 3,513 appointments for 2,874 children were analyzed. Bivariate associations showed statistically significant (P<0.05) relationships between failed appointment and race, insurance type, scheduled time of surgery, distance traveled, snowfall, and temperature. Multinomial regression analysis showed the following associations between factors and the reason for failure (P<0.05): (1) decreased temperature and increased snowfall were associated with weather as reason for failure; (2) the African American population showed an association with family barriers; (3) Hispanic families were less likely to give advanced notice; and (4) the "additional races" category showed an association with fasting violation. Patients who have treatment under general anesthesia face specific barriers to care.

  4. The Need for Anticoagulation Following Inferior Vena Cava Filter Placement: Systematic Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ray, Charles E.; Prochazka, Allan

    Purpose. To perform a systematic review to determine the effect of anticoagulation on the rates of venous thromboembolism (pulmonary embolus, deep venous thrombosis, inferior vena cava (IVC) filter thrombosis) following placement of an IVC filter. Methods. A comprehensive computerized literature search was performed to identify relevant articles. Data were abstracted by two reviewers. Studies were included if it could be determined whether or not subjects received anticoagulation following filter placement, and if follow-up data were presented. A meta-analysis of patients from all included studies was performed. A total of 14 articles were included in the final analysis, but the data from only nine articles could be used in the meta-analysis; five studies were excluded because they did not present raw data which could be analyzed in the meta-analysis. A total of 1,369 subjects were included in the final meta-analysis. Results. The summary odds ratio for the effect of anticoagulation on venous thromboembolism rates following filter deployment was 0.639 (95% CI 0.351 to 1.159, p = 0.141). There was significant heterogeneity in the results from different studies [Q statistic of 15.95 (p = 0.043)]. Following the meta-analysis, there was a trend toward decreased venous thromboembolism rates in patients with post-filter anticoagulation (12.3% vs. 15.8%), but the result failed to reach statistical significance. Conclusion. Inferior vena cava filters can be placed in patients who cannot receive concomitant anticoagulation without placing them at significantly higher risk of development of venous thromboembolism.
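
    For readers who want the mechanics, the sketch below pools hypothetical 2x2 tables (made-up counts, not the review's data) into a summary odds ratio with Cochran's Q and a DerSimonian-Laird random-effects model, the same ingredients reported above.

        import numpy as np

        # Per-study counts: (events on anticoagulation, total on anticoagulation,
        #                    events without,             total without)
        studies = [(5, 60, 9, 55), (3, 40, 6, 42), (8, 90, 10, 80), (2, 30, 5, 33)]

        y, v = [], []   # per-study log odds ratios and their variances
        for a, n1, c, n2 in studies:
            b, d = n1 - a, n2 - c
            y.append(np.log((a * d) / (b * c)))
            v.append(1/a + 1/b + 1/c + 1/d)
        y, v = np.array(y), np.array(v)

        # Cochran's Q and the DerSimonian-Laird between-study variance tau^2.
        w = 1 / v
        mu_fixed = np.sum(w * y) / np.sum(w)
        Q = np.sum(w * (y - mu_fixed) ** 2)
        tau2 = max(0.0, (Q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

        # Random-effects summary odds ratio with a 95% confidence interval.
        w_re = 1 / (v + tau2)
        mu = np.sum(w_re * y) / np.sum(w_re)
        se = np.sqrt(1 / np.sum(w_re))
        print(f"Q = {Q:.2f}, tau^2 = {tau2:.3f}, summary OR = {np.exp(mu):.3f} "
              f"(95% CI {np.exp(mu - 1.96*se):.3f} to {np.exp(mu + 1.96*se):.3f})")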

  5. Comment on the asymptotics of a distribution-free goodness of fit test statistic.

    PubMed

    Browne, Michael W; Shapiro, Alexander

    2015-03-01

    In a recent article Jennrich and Satorra (Psychometrika 78: 545-552, 2013) showed that a proof by Browne (British Journal of Mathematical and Statistical Psychology 37: 62-83, 1984) of the asymptotic distribution of a goodness of fit test statistic is incomplete because it fails to prove that the orthogonal component function employed is continuous. Jennrich and Satorra (Psychometrika 78: 545-552, 2013) showed how Browne's proof can be completed satisfactorily but this required the development of an extensive and mathematically sophisticated framework for continuous orthogonal component functions. This short note provides a simple proof of the asymptotic distribution of Browne's (British Journal of Mathematical and Statistical Psychology 37: 62-83, 1984) test statistic by using an equivalent form of the statistic that does not involve orthogonal component functions and consequently avoids all complicating issues associated with them.

  6. Recurrence quantification analysis of heart rate variability and respiratory flow series in patients on weaning trials.

    PubMed

    Arcentales, Andrés; Giraldo, Beatriz F; Caminal, Pere; Benito, Salvador; Voss, Andreas

    2011-01-01

    The autonomic nervous system regulates the behavior of the cardiac and respiratory systems. Its assessment during ventilator weaning can provide information about physio-pathological imbalances. This work proposes a nonlinear analysis of the complexity of heart rate variability (HRV) and breathing duration (T(Tot)), applying recurrence plots (RP) and, for their interaction, joint recurrence plots (JRP). A total of 131 patients on weaning trials from mechanical ventilation were analyzed: 92 patients with successful weaning (group S) and 39 patients who failed to maintain spontaneous breathing (group F). The results show that parameters such as determinism (DET), average diagonal line length (L), and entropy (ENTR) are statistically significant with RP for the T(Tot) series, but not for HRV. When comparing the groups with JRP, all parameters were relevant. In all cases, mean values of the recurrence quantification analysis are higher in group S than in group F. The main differences between groups were found in the diagonal and vertical structures of the joint recurrence plot.
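
    A compact recurrence-plot sketch illustrating DET, the determinism measure used above. It works on one-dimensional series without delay embedding and uses an arbitrary recurrence threshold, so it is a didactic simplification of a full RQA pipeline, not the study's implementation.

        import numpy as np

        def recurrence_matrix(x, eps):
            """Thresholded distance matrix of a (here unembedded) series."""
            d = np.abs(x[:, None] - x[None, :])
            return (d < eps).astype(int)

        def determinism(R, lmin=2):
            """DET: fraction of recurrent points on diagonals of length >= lmin."""
            n = R.shape[0]
            diag_points = 0
            for k in range(-(n - 1), n):
                if k == 0:
                    continue              # skip the trivial main diagonal
                run = 0
                for p in np.diagonal(R, k):
                    run = run + 1 if p else 0
                    if run == lmin:
                        diag_points += lmin   # count the run once it qualifies...
                    elif run > lmin:
                        diag_points += 1      # ...then each further point
            return diag_points / max(R.sum() - n, 1)  # exclude main diagonal

        rng = np.random.default_rng(4)
        periodic = np.sin(np.linspace(0, 20 * np.pi, 400)) + 0.1 * rng.normal(size=400)
        noise = rng.normal(size=400)
        for name, series in (("periodic", periodic), ("noise", noise)):
            print(f"{name}: DET = {determinism(recurrence_matrix(series, 0.3)):.2f}")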

  7. Transconjunctival penetration of mitomycin C

    PubMed Central

    Velpandian, T; Sihota, Ramanjit; Gupta, Viney

    2008-01-01

    Aims: The study was performed to estimate the transconjunctival penetration of mitomycin C (MMC) to Tenon's tissue following application over the intact conjunctiva before routine trabeculectomy. Settings and Design: Institution-based case series. Materials and Methods: In 41 eyes of 41 patients, MMC (0.4 mg/ml for 3 min) was applied over the intact conjunctiva before beginning trabeculectomy. Tenon's capsule directly beneath the site of application was excised during trabeculectomy and was homogenized and centrifuged, and MMC concentrations were analyzed using high-performance liquid chromatography (HPLC). Statistical Analysis Used: Statistical analysis was performed using Stata 8.0 software (Stata Corporation, Houston, TX, USA). In this study, P-values less than 0.05 were considered statistically significant. Results: The average weight of the sample of Tenon's tissue excised was 5.51 ± 4.42 mg (range: 0.9-17.1) and the average MMC concentration found in Tenon's tissue using HPLC was 18.67 ± 32.36 × 10⁻⁶ moles/kg of tissue (range: 0.38-197.05 × 10⁻⁶). In 36 of the 41 patients (87.80%), the MMC concentration reached above 2 × 10⁻⁶ moles/kg of tissue, the concentration required to inhibit human conjunctival fibroblasts. Conclusions: Mitomycin C does permeate into the subconjunctival tissue after supraconjunctival application for 3 min. Application of MMC over the conjunctiva may be a useful alternative to subconjunctival or subscleral application during routine trabeculectomy and as an adjunct for failing blebs. PMID:18417819

  8. Detecting Multiple Model Components with the Likelihood Ratio Test

    NASA Astrophysics Data System (ADS)

    Protassov, R. S.; van Dyk, D. A.

    2000-05-01

    The likelihood ratio test (LRT) and F-test, popularized in astrophysics by Bevington (Data Reduction and Error Analysis in the Physical Sciences) and Cash (1977, ApJ 228, 939), do not (even asymptotically) adhere to their nominal χ² and F distributions in many statistical tests commonly used in astrophysics. The many legitimate uses of the LRT (see, e.g., the examples given in Cash (1977)) notwithstanding, it can be impossible to compute the false positive rate of the LRT or related tests such as the F-test. For example, although Cash (1977) did not suggest the LRT for detecting a line profile in a spectral model, it has become common practice despite the lack of certain required mathematical regularity conditions. Contrary to common practice, the nominal distribution of the LRT statistic should not be used in these situations. In this paper, we characterize an important class of problems where the LRT fails, show the non-standard behavior of the test in this setting, and provide a Bayesian alternative to the LRT, i.e., posterior predictive p-values. We emphasize that there are many legitimate uses of the LRT in astrophysics, and even when the LRT is inappropriate, there remain several statistical alternatives (e.g., judicious use of error bars and Bayes factors). We illustrate this point in our analysis of GRB 970508 that was studied by Piro et al. in ApJ, 514:L73-L77, 1999.
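
    The boundary problem described above can be seen in a few lines. For a parameter constrained to be non-negative, the LRT statistic under the null is a 50:50 mixture of a point mass at zero and a χ² with one degree of freedom, not the nominal χ². The Gaussian-mean example below is a standard textbook stand-in for the line-strength case, not the paper's model.

        import numpy as np
        from scipy import stats

        # Test H0: theta = 0 against H1: theta > 0 for X_i ~ N(theta, 1), with
        # the parameter space restricted to theta >= 0 (a boundary, as when
        # adding a non-negative line strength to a spectral model).
        rng = np.random.default_rng(8)
        n, reps = 50, 20000
        xbar = rng.normal(0.0, 1.0 / np.sqrt(n), size=reps)  # mean under H0
        lrt = n * np.maximum(xbar, 0.0) ** 2                 # 2*(max loglik - null loglik)

        # Using the nominal chi2_1 5% cut gives a far smaller false-positive
        # rate (about 2.5%), because half the null mass sits exactly at zero.
        crit = stats.chi2.ppf(0.95, df=1)
        print(f"actual false-positive rate at the chi2_1 5% cut: {np.mean(lrt > crit):.3f}")
        print(f"P(LRT = 0) = {np.mean(lrt == 0.0):.3f}   (theory: 0.5)")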

  9. Logistic Regression with Multiple Random Effects: A Simulation Study of Estimation Methods and Statistical Packages.

    PubMed

    Kim, Yoonsang; Choi, Young-Ku; Emery, Sherry

    2013-08-01

    Several statistical packages are capable of estimating generalized linear mixed models and these packages provide one or more of three estimation methods: penalized quasi-likelihood, Laplace, and Gauss-Hermite. Many studies have investigated these methods' performance for the mixed-effects logistic regression model. However, the authors focused on models with one or two random effects and assumed a simple covariance structure between them, which may not be realistic. When there are multiple correlated random effects in a model, the computation becomes intensive, and often an algorithm fails to converge. Moreover, in our analysis of smoking status and exposure to anti-tobacco advertisements, we have observed that when a model included multiple random effects, parameter estimates varied considerably from one statistical package to another even when using the same estimation method. This article presents a comprehensive review of the advantages and disadvantages of each estimation method. In addition, we compare the performances of the three methods across statistical packages via simulation, which involves two- and three-level logistic regression models with at least three correlated random effects. We apply our findings to a real dataset. Our results suggest that two packages-SAS GLIMMIX Laplace and SuperMix Gaussian quadrature-perform well in terms of accuracy, precision, convergence rates, and computing speed. We also discuss the strengths and weaknesses of the two packages in regard to sample sizes.
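
    A minimal data-generating sketch of a two-level random-intercept logistic model, together with a naive pooled fit, to show why the random effect matters; the coefficients, cluster counts, and the attenuation illustration are choices made here, not the article's simulation design (which involves three or more correlated random effects).

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # logit P(y=1) = b0 + b1*x + u_j, with u_j ~ N(0, sigma_u^2) per cluster.
        rng = np.random.default_rng(6)
        n_clusters, per_cluster = 100, 20
        b0, b1, sigma_u = -1.0, 0.8, 1.2

        cluster = np.repeat(np.arange(n_clusters), per_cluster)
        u = rng.normal(0.0, sigma_u, n_clusters)[cluster]   # random intercepts
        x = rng.normal(size=cluster.size)
        p = 1.0 / (1.0 + np.exp(-(b0 + b1 * x + u)))
        y = rng.random(cluster.size) < p

        # Naive pooled estimate ignoring clustering: attenuated toward zero,
        # a standard consequence of omitting the random intercept.
        b1_naive = LogisticRegression(C=1e6).fit(x[:, None], y).coef_[0, 0]
        print(f"true conditional b1 = {b1}, naive pooled estimate = {b1_naive:.2f}")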

  10. Logistic Regression with Multiple Random Effects: A Simulation Study of Estimation Methods and Statistical Packages

    PubMed Central

    Kim, Yoonsang; Emery, Sherry

    2013-01-01

    Several statistical packages are capable of estimating generalized linear mixed models and these packages provide one or more of three estimation methods: penalized quasi-likelihood, Laplace, and Gauss-Hermite. Many studies have investigated these methods’ performance for the mixed-effects logistic regression model. However, the authors focused on models with one or two random effects and assumed a simple covariance structure between them, which may not be realistic. When there are multiple correlated random effects in a model, the computation becomes intensive, and often an algorithm fails to converge. Moreover, in our analysis of smoking status and exposure to anti-tobacco advertisements, we have observed that when a model included multiple random effects, parameter estimates varied considerably from one statistical package to another even when using the same estimation method. This article presents a comprehensive review of the advantages and disadvantages of each estimation method. In addition, we compare the performances of the three methods across statistical packages via simulation, which involves two- and three-level logistic regression models with at least three correlated random effects. We apply our findings to a real dataset. Our results suggest that two packages—SAS GLIMMIX Laplace and SuperMix Gaussian quadrature—perform well in terms of accuracy, precision, convergence rates, and computing speed. We also discuss the strengths and weaknesses of the two packages in regard to sample sizes. PMID:24288415

  11. Bayesian change-point analysis reveals developmental change in a classic theory of mind task.

    PubMed

    Baker, Sara T; Leslie, Alan M; Gallistel, C R; Hood, Bruce M

    2016-12-01

    Although learning and development reflect changes situated in an individual brain, most discussions of behavioral change are based on the evidence of group averages. Our reliance on group-averaged data creates a dilemma. On the one hand, we need to use traditional inferential statistics. On the other hand, group averages are highly ambiguous when we need to understand change in the individual; the average pattern of change may characterize all, some, or none of the individuals in the group. Here we present a new method for statistically characterizing developmental change in each individual child we study. Using false-belief tasks, fifty-two children in two cohorts were repeatedly tested for varying lengths of time between 3 and 5 years of age. Using a novel Bayesian change point analysis, we determined both the presence and, just as importantly, the absence of change in individual longitudinal cumulative records. Whenever the analysis supports a change conclusion, it identifies in that child's record the most likely point at which change occurred. Results show striking variability in patterns of change and stability across individual children. We then group the individuals by their various patterns of change or no change. The resulting patterns provide scarce support for sudden changes in competence and shed new light on the concepts of "passing" and "failing" in developmental studies. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
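
    A toy version of the idea on an invented pass/fail record: with Beta(1,1) priors on the success probabilities before and after the change and a uniform prior over change points, the posterior over the change location has a closed form that can be enumerated directly. This is a simplification of the authors' analysis (which also weighs a no-change model), shown only to make the mechanics concrete.

        import numpy as np
        from scipy.special import betaln

        def changepoint_posterior(y):
            """Posterior over change points for Bernoulli data, Beta(1,1) priors
            on the success probability before and after the change."""
            n = len(y)
            logpost = np.full(n, -np.inf)
            for c in range(1, n):          # change between trials c-1 and c
                s1, s2 = y[:c].sum(), y[c:].sum()
                logpost[c] = (betaln(1 + s1, 1 + c - s1)
                              + betaln(1 + s2, 1 + (n - c) - s2))
            logpost -= logpost.max()
            post = np.exp(logpost)
            return post / post.sum()

        # One child's longitudinal pass(1)/fail(0) record on a false-belief
        # task (made-up data): mostly failing, then mostly passing.
        y = np.array([0, 0, 1, 0, 0, 0, 1, 1, 0, 1, 1, 1, 1, 1, 1])
        post = changepoint_posterior(y)
        print(f"most likely change point: trial {post.argmax()}, P = {post.max():.2f}")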

  12. Validation of Bayesian analysis of compartmental kinetic models in medical imaging.

    PubMed

    Sitek, Arkadiusz; Li, Quanzheng; El Fakhri, Georges; Alpert, Nathaniel M

    2016-10-01

    Kinetic compartmental analysis is frequently used to compute physiologically relevant quantitative values from time series of images. In this paper, a new approach based on Bayesian analysis to obtain information about these parameters is presented and validated. The closed form of the posterior distribution of kinetic parameters is derived with a hierarchical prior to model the standard deviation of normally distributed noise. Markov chain Monte Carlo methods are used for numerical estimation of the posterior distribution. Computer simulations of the kinetics of F18-fluorodeoxyglucose (FDG) are used to demonstrate drawing statistical inferences about kinetic parameters and to validate the theory and implementation. Additionally, point estimates of kinetic parameters and the covariance of those estimates are determined using the classical non-linear least squares approach. Posteriors obtained using the methods proposed in this work are accurate, as no significant deviation from the expected shape of the posterior was found (one-sided P>0.08). It is demonstrated that the results obtained by the standard non-linear least-squares methods fail to provide accurate estimation of uncertainty for the same data set (P<0.0001). The results of this work validate the new methods using computer simulations of FDG kinetics. They show that, in situations where the classical approach fails in accurate estimation of uncertainty, Bayesian estimation provides accurate information about the uncertainties in the parameters. Although a particular example of FDG kinetics was used in the paper, the methods can be extended to different pharmaceuticals and imaging modalities. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  13. Comparison of the anaesthetic efficacy of different volumes of 4% articaine (1.8 and 3.6 mL) as supplemental buccal infiltration after failed inferior alveolar nerve block.

    PubMed

    Singla, M; Subbiya, A; Aggarwal, V; Vivekanandhan, P; Yadav, S; Yadav, H; Venkatesh, A; Geethapriya, N; Sharma, V

    2015-01-01

    To compare the anaesthetic efficacy of different volumes (1.8 mL vs. 3.6 mL) of 4% articaine with 1 : 100 000 epinephrine injected as buccal infiltrations after a failed inferior alveolar nerve block (IANB) in patients with symptomatic irreversible pulpitis. Two hundred and thirty-four adult patients, diagnosed with irreversible pulpitis in a mandibular tooth, participated in this multicentre, randomized double-blinded trial. Patients received IANB with 1.8 mL of 4% articaine with 1 : 100 000 epinephrine. Pain during treatment was recorded using the Heft-Parker visual analogue scale (HP VAS). The primary outcome measure, and the definition of 'success', was the ability to undertake pulp chamber access and canal instrumentation with no or mild pain (HP VAS score <55 mm). Patients who experienced 'moderate-to-severe' pain (HP VAS score ≥ 55 mm) were randomly allocated into two groups and received buccal infiltrations with either 1.8 mL or 3.6 mL of 4% articaine with 1 : 100 000 epinephrine. Root canal treatment was re-initiated after 10 min. Success was again defined as no pain or weak/mild pain during endodontic access preparation and instrumentation. Statistical analysis was performed using Mann-Whitney U and chi-square tests. The initial IANB of 4% articaine gave an overall success rate of 37%. The success rate of supplementary buccal infiltration with 1.8 and 3.6 mL volumes was 62% and 64%, respectively. The difference between the success rates of the two volumes was not statistically significant. Increasing the volume of 4% articaine with 1 : 100 000 epinephrine from 1.8 to 3.6 mL, given as supplementary buccal infiltrations after a failed primary IANB with 1.8 mL of 4% articaine with 1 : 100 000, did not improve the anaesthetic success rates in patients with symptomatic irreversible pulpitis. © 2014 International Endodontic Journal. Published by John Wiley & Sons Ltd.

  14. Immediate Loading of Tapered Implants Placed in Postextraction Sockets and Healed Sites.

    PubMed

    Han, Chang-Hun; Mangano, Francesco; Mortellaro, Carmen; Park, Kwang-Bum

    2016-07-01

    The aim of the present study was to compare the survival, stability, and complications of immediately loaded implants placed in postextraction sockets and healed sites. Over a 2-year period, all patients presenting with partial or complete edentulism of the maxilla and/or mandible (healed site group, at least 4 months of healing after tooth extraction) or in need of replacement of nonrecoverable failing teeth (postextraction group) were considered for inclusion in this study. Tapered implants featuring a nanostructured calcium-incorporated surface were placed and loaded immediately. The prosthetic restorations comprised single crowns, fixed partial dentures, and fixed full arches. Primary outcomes were implant survival, stability, and complications. Implant stability was assessed at placement and at each follow-up evaluation (1 week, 3 months, and 1 year after placement): implants with an insertion torque (IT) <45 N·cm and/or with an implant stability quotient (ISQ) <70 were considered failed for immediate loading. A statistical analysis was performed. Thirty implants were placed in postextraction sockets of 17 patients, and 32 implants were placed in healed sites of 22 patients. There were no statistically significant differences in ISQ values between the 2 groups, at each assessment. In total, 60 implants (96.8%) had an IT ≥45 and an ISQ ≥70 at placement and at each follow-up control: all these implants were successfully loaded. Only 2 implants (1 in a postextraction socket and 1 in a healed site, 3.2%) could not achieve an IT ≥45 N·cm and/or an ISQ ≥70 at placement or over time: accordingly, these were considered failed for stability, as they could not be subjected to immediate loading. One of these 2 implants, in a healed site of a posterior maxilla, had to be removed, yielding an overall 1-year implant survival rate of 98.4%. No complications were reported. No significant differences were reported between the 2 groups with respect to implant failures and complications. Immediately loaded implants placed in postextraction sockets and healed sites had similar high survival and stability, with no reported complications. Further long-term studies on larger samples of patients are needed to confirm these results.

  15. Using Concurrent Cardiovascular Information to Augment Survival Time Data from Orthostatic Tilt Tests

    NASA Technical Reports Server (NTRS)

    Feiveson, Alan H.; Fiedler, James; Lee, Stuart M. M.; Westby, Christian M.; Stenger, Michael B.; Platts, Steven H.

    2014-01-01

    Orthostatic Intolerance (OI) is the propensity to develop symptoms of fainting during upright standing. OI is associated with changes in heart rate, blood pressure and other measures of cardiac function. Problem: NASA astronauts have shown increased susceptibility to OI on return from space missions. Current methods for counteracting OI in astronauts include fluid loading and the use of compression garments. Multivariate trajectory spread is greater as OI increases, and pairwise comparisons at the same time point within subjects allow incorporation of pass/fail outcomes. Path length, convex hull area, and covariance matrix determinant do well as statistics to summarize this spread. Open issues include missing data, time series analysis (which would require many more time points per orthostatic tilt test session), the treatment of trend, and how to incorporate the survival information.

  16. A single-blinded randomised controlled study to determine the efficacy of Omnilux Revive facial treatment in skin rejuvenation.

    PubMed

    Bhat, Jaideep; Birch, Jan; Whitehurst, Colin; Lanigan, Sean W

    2005-01-01

    To determine the efficacy of Omnilux Revive facial treatment in skin rejuvenation, twenty-three volunteers received randomised 20 min treatments three times a week for three weeks to one half of their face, with the untreated side acting as control. Regular assessments were carried out, focusing on parameters of subject satisfaction, photographic assessments, skin elasticity (Cutometer) and skin hydration (Corneometer CM825). Ninety-one percent of the volunteers reported visible changes to their skin. Blinded photographic evaluation reported a clinical response in 59% of the subjects. Objective analysis failed to show statistically significant changes in skin hydration or elasticity. The Omnilux Revive LED lamp is a safe alternative non-ablative skin rejuvenation treatment.

  17. Predictors of long-term compliance in attending a worksite hypertension programme.

    PubMed

    Landers, R; Riccobene, A; Beyreuther, M; Neusy, A J

    1993-12-01

    Variables such as patient's anxiety, knowledge, number of medication changes, medication-induced side-effects and programme-derived benefits and conveniences have been reported or theorised to be important determinants of patients' attendance at worksite hypertension programmes. This study investigates whether these variables have predictive value in differentiating compliers from noncompliers attending a union-sponsored worksite hypertension programme for at least five years. Scores were created from a questionnaire distributed to 243 patients, with a response rate of 98%. Compliance was defined as missing ≤25% of scheduled clinic appointments. By discriminant analysis, scores for patient's anxiety, knowledge, number of medication changes, medication side-effects, perceived benefits and conveniences failed to show any predictive value for patients' compliance with appointment keeping.

  18. A comparison of certified and noncertified pet foods.

    PubMed

    Brown, R G

    1997-11-01

    The market presents the buyer with a wide array of pet food choices. Marketing pet foods has changed in the last decade and today foods may be bought at a variety of outlets. The present study compares nutrient composition, digestibility, and effect on urine pH (cat foods only) of selected certified and noncertified pet foods from different outlets. The selected foods were considered analogous in terms of declared ingredients and macronutrient profiles. The analytical methods used were those of the Association of Official Analytical Chemists as described in the Pet Food Certification Protocol of the Canadian Veterinary Medical Association. The test foods were sampled 4 times from August 1994 to July 1995. Both certified and noncertified products met the nutritional requirements on a consistent basis, although 1 of the noncertified dog foods consistently failed to meet the zinc requirements. This same product also failed to meet the Canadian Veterinary Medical Association's standards for concentrations of protein, calcium, and phosphorus. One of the noncertified cat foods failed to meet the recommended calcium level. With the exception of fat digestion in 1 noncertified food, there were no statistically significant differences in major nutrient digestibility between certified and noncertified pet foods. There were some statistically significant differences in digestibility within both the certified and noncertified groups of foods. The practical significance of any of the statistical differences in digestibility is uncertain. Urine pH observed in cats fed noncertified test diets was variable, with some values greater than 7.0 as a maximum or 6.5 as an average. The general conclusion of this study was that the commonly available certified products were the nutritional equal of those foods that position themselves as "premium."

  19. A comparison of certified and noncertified pet foods.

    PubMed Central

    Brown, R G

    1997-01-01

    The market presents the buyer with a wide array of pet food choices. Marketing pet foods has changed in the last decade and today foods may be bought at a variety of outlets. The present study compares nutrient composition, digestibility, and effect on urine pH (cat foods only) of selected certified and noncertified pet foods from different outlets. The selected foods were considered analogous in terms of declared ingredients and macronutrient profiles. The analytical methods used were those of the Association of Official Analytical Chemists as described in the Pet Food Certification Protocol of the Canadian Veterinary Medical Association. The test foods were sampled 4 times from August 1994 to July 1995. Both certified and noncertified products met the nutritional requirements on a consistent basis, although 1 of the noncertified dog foods consistently failed to meet the zinc requirements. This same product also failed to meet the Canadian Veterinary Medical Association's standards for concentrations of protein, calcium, and phosphorus. One of the noncertified cat foods failed to meet the recommended calcium level. With the exception of fat digestion in 1 noncertified food, there were no statistically significant differences in major nutrient digestibility between certified and noncertified pet foods. There were some statistically significant differences in digestibility within both the certified and noncertified groups of foods. The practical significance of any of the statistical differences in digestibility is uncertain. Urine pH observed in cats fed noncertified test diets was variable, with some values greater than 7.0 as a maximum or 6.5 as an average. The general conclusion of this study was that the commonly available certified products were the nutritional equal of those foods that position themselves as "premium." PMID:9360790

  20. Effectiveness of interventions to screen and manage infections during pregnancy on reducing stillbirths: a review

    PubMed Central

    2011-01-01

    Background Infection is a well acknowledged cause of stillbirths and may account for about half of all perinatal deaths today, especially in developing countries. This review presents the impact of interventions targeting various important infections during pregnancy on stillbirth or perinatal mortality. Methods We undertook a systematic review including all relevant literature on interventions dealing with infections during pregnancy for assessment of effects on stillbirths or perinatal mortality. The quality of the evidence was assessed using the adapted Grading of Recommendations, Assessment, Development and Evaluation (GRADE) approach by the Child Health Epidemiology Reference Group (CHERG). For the outcome of interest, namely stillbirth, we applied the rules developed by CHERG to recommend a final estimate for reduction in stillbirth for input to the Lives Saved Tool (LiST) model. Results A total of 25 studies were included in the review. A random-effects meta-analysis of observational studies of detection and treatment of syphilis during pregnancy showed a significant 80% reduction in stillbirths [relative risk (RR) = 0.20; 95% confidence interval (CI): 0.12–0.34] that is recommended for inclusion in the LiST model. Our meta-analysis showed that the malaria prevention interventions, i.e. intermittent preventive treatment (IPTp) and insecticide-treated mosquito nets (ITNs), can reduce stillbirths by 22%; however, the result was not statistically significant (RR = 0.78; 95% CI: 0.59–1.03). For human immunodeficiency virus infection, a pooled analysis of 6 randomized controlled trials (RCTs) failed to show a statistically significant reduction in stillbirth with the use of antiretrovirals in pregnancy compared to placebo (RR = 0.93; 95% CI: 0.45–1.92). Similarly, a pooled analysis combining four studies for the treatment of bacterial vaginosis (3 for oral and 1 for vaginal antibiotic) failed to yield a significant impact on perinatal mortality (OR = 0.88; 95% CI: 0.50–1.55). Conclusions The clearest evidence of impact on stillbirth reduction was found for adequate prevention and treatment of syphilis infection and possibly malaria. At present, large gaps exist in the growing list of stillbirth risk factors, especially those that are infection related. Potential causes of stillbirths including HIV and TORCH infections need to be investigated further to help establish the role of prevention/treatment and its subsequent impact on stillbirth reduction. PMID:21501448

  1. Examining Impulse-Variability Theory and the Speed-Accuracy Trade-Off in Children's Overarm Throwing Performance.

    PubMed

    Molina, Sergio L; Stodden, David F

    2018-04-01

    This study examined variability in throwing speed and spatial error to test the prediction of an inverted-U function (i.e., impulse-variability [IV] theory) and the speed-accuracy trade-off. Forty-five 9- to 11-year-old children were instructed to throw at a specified percentage of maximum speed (45%, 65%, 85%, and 100%) and hit the wall target. Results indicated no statistically significant differences in variable error across the target conditions (p = .72), failing to support the inverted-U hypothesis. Spatial accuracy results indicated no statistically significant differences with mean radial error (p = .18), centroid radial error (p = .13), and bivariate variable error (p = .08) also failing to support the speed-accuracy trade-off in overarm throwing. As neither throwing performance variability nor accuracy changed across percentages of maximum speed in this sample of children as well as in a previous adult sample, current policy and practices of practitioners may need to be reevaluated.

  2. Analysis of Non-Pivotal Bioequivalence Studies Submitted in Abbreviated New Drug Submissions for Delayed-Release Drug Products.

    PubMed

    Kaur, Paramjeet; Jiang, Xiaojian; Stier, Ethan

    2017-01-01

    The US FDA's rule on "Requirements for Submission of Bioequivalence Data", requiring submission of all bioequivalence (BE) studies conducted on the same formulation of the drug product submitted for approval, was published in the Federal Register in January 2009. With the publication of this rule, we evaluated the impact of data from non-pivotal BE studies in assessing BE and identified the reasons for failed in vivo BE studies for generic oral delayed-release (DR) drug products only. We searched the Agency databases from January 2009 to December 2016 to identify Abbreviated New Drug Applications (ANDAs) submitted for DR drug products containing non-pivotal BE studies. Out of 202 ANDAs, 43 ANDAs contained 102 non-pivotal BE studies. Forty-nine non-pivotal BE studies were conducted on the to-be-marketed (TBM) formulation and 53 were conducted on formulations different from the TBM formulation. These experimental formulations primarily differed in the ratio of components of the enteric coating layer and/or the amount (i.e., %w/w) of the enteric coating layer. Of the 49 non-pivotal BE studies conducted on the TBM formulation, 41 failed to meet the BE acceptance criteria. The majority of failed non-pivotal BE studies on the TBM DR generic products had insufficient power, which was expected as these studies are exploratory in nature and not designed to have adequate power to pass the BE statistical criteria. In addition, among the failed non-pivotal BE studies on the TBM DR generic products, the most commonly failing pharmacokinetic parameter was Cmax. The data from these non-pivotal BE studies indicate that an inadequate BE study design can lead to failure of BE even on the same formulation. Also, the non-pivotal BE studies on formulations different from the TBM formulation help us link the formulation design to the product performance in vivo.

  3. A Network-Based Method to Assess the Statistical Significance of Mild Co-Regulation Effects

    PubMed Central

    Horvát, Emőke-Ágnes; Zhang, Jitao David; Uhlmann, Stefan; Sahin, Özgür; Zweig, Katharina Anna

    2013-01-01

    Recent development of high-throughput, multiplexing technology has initiated projects that systematically investigate interactions between two types of components in biological networks, for instance transcription factors and promoter sequences, or microRNAs (miRNAs) and mRNAs. In terms of network biology, such screening approaches primarily attempt to elucidate relations between biological components of two distinct types, which can be represented as edges between nodes in a bipartite graph. However, it is often desirable not only to determine regulatory relationships between nodes of different types, but also to understand the connection patterns of nodes of the same type. Especially interesting is the co-occurrence of two nodes of the same type, i.e., the number of their common neighbours, which current high-throughput screening analysis fails to address. The co-occurrence gives the number of circumstances under which both of the biological components are influenced in the same way. Here we present SICORE, a novel network-based method to detect pairs of nodes with a statistically significant co-occurrence. We first show the stability of the proposed method on artificial data sets: when randomly adding and deleting observations we obtain reliable results even with noise exceeding the expected level in large-scale experiments. Subsequently, we illustrate the viability of the method based on the analysis of a proteomic screening data set to reveal regulatory patterns of human microRNAs targeting proteins in the EGFR-driven cell cycle signalling system. Since statistically significant co-occurrence may indicate functional synergy and the mechanisms underlying canalization, and thus hold promise in drug target identification and therapeutic development, we provide a platform-independent implementation of SICORE with a graphical user interface as a novel tool in the arsenal of high-throughput screening analysis. PMID:24039936
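
    A minimal sketch of the co-occurrence idea underlying SICORE, not the published implementation: the co-occurrence of two same-type nodes in a bipartite graph is their number of common neighbours, and its significance can be judged against a randomized null model. All node names below are hypothetical.

        import random
        from itertools import combinations

        # Bipartite edges (left node, right node), e.g. miRNA -> protein.
        edges = [("miR1", "pA"), ("miR1", "pB"), ("miR2", "pA"),
                 ("miR2", "pB"), ("miR3", "pC")]

        def cooccurrence(edge_list):
            nbrs = {}
            for u, v in edge_list:
                nbrs.setdefault(u, set()).add(v)
            # common right-hand neighbours for each pair of left nodes
            return {(a, b): len(nbrs[a] & nbrs[b])
                    for a, b in combinations(sorted(nbrs), 2)}

        observed = cooccurrence(edges)

        def shuffled(edge_list):
            # Simple null: permute right endpoints, preserving left degrees.
            rights = [v for _, v in edge_list]
            random.shuffle(rights)
            return [(u, v) for (u, _), v in zip(edge_list, rights)]

        n_iter = 1000
        exceed = {pair: 0 for pair in observed}
        for _ in range(n_iter):
            null = cooccurrence(shuffled(edges))
            for pair in exceed:
                if null[pair] >= observed[pair]:
                    exceed[pair] += 1

        for pair, k in exceed.items():
            print(pair, "co-occurrence:", observed[pair],
                  "empirical p ~", round((k + 1) / (n_iter + 1), 3))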

  4. Statistical Analysis of Large Scale Structure by the Discrete Wavelet Transform

    NASA Astrophysics Data System (ADS)

    Pando, Jesus

    1997-10-01

    The discrete wavelet transform (DWT) is developed as a general statistical tool for the study of large scale structures (LSS) in astrophysics. The DWT is used in all aspects of structure identification including cluster analysis, spectrum and two-point correlation studies, scale-scale correlation analysis and to measure deviations from Gaussian behavior. The techniques developed are demonstrated on 'academic' signals, on simulated models of the Lyman-α (Ly-α) forests, and on observational data of the Ly-α forests. This technique can detect clustering in the Ly-α clouds where traditional techniques such as the two-point correlation function have failed. The position and strength of these clusters in both real and simulated data is determined and it is shown that clusters exist on scales as large as at least 20 h⁻¹ Mpc at significance levels of 2-4 σ. Furthermore, it is found that the strength distribution of the clusters can be used to distinguish between real data and simulated samples even where other traditional methods have failed to detect differences. Second, a method for measuring the power spectrum of a density field using the DWT is developed. All common features determined by the usual Fourier power spectrum can be calculated by the DWT. These features, such as the index of a power law or typical scales, can be detected even when the samples are geometrically complex, the samples are incomplete, or the mean density on larger scales is not known (the infrared uncertainty). Using this method the spectra of Ly-α forests in both simulated and real samples are calculated. Third, a method for measuring hierarchical clustering is introduced. Because hierarchical evolution is characterized by a set of rules of how larger dark matter halos are formed by the merging of smaller halos, scale-scale correlations of the density field should be one of the most sensitive quantities in determining the merging history. We show that these correlations can be completely determined by the correlations between discrete wavelet coefficients on adjacent scales and at nearly the same spatial position, C_{j,j+1}^{2·2}. Scale-scale correlations on two samples of the QSO Ly-α forest absorption spectra are computed. Lastly, higher order statistics are developed to detect deviations from Gaussian behavior. These higher order statistics are necessary to fully characterize the Ly-α forests because the usual 2nd order statistics, such as the two-point correlation function or power spectrum, give inconclusive results. It is shown how this technique takes advantage of the locality of the DWT to circumvent the central limit theorem. A non-Gaussian spectrum is defined and this spectrum reveals not only the magnitude, but the scales of non-Gaussianity. When applied to simulated and observational samples of the Ly-α clouds, it is found that different popular models of structure formation have different spectra while two independent observational data sets have the same spectra. Moreover, the non-Gaussian spectra of real data sets are significantly different from the spectra of various possible random samples. (Abstract shortened by UMI.)

  5. Nature's style: Naturally trendy

    USGS Publications Warehouse

    Cohn, T.A.; Lins, H.F.

    2005-01-01

    Hydroclimatological time series often exhibit trends. While trend magnitude can be determined with little ambiguity, the corresponding statistical significance, sometimes cited to bolster scientific and political argument, is less certain because significance depends critically on the null hypothesis which in turn reflects subjective notions about what one expects to see. We consider statistical trend tests of hydroclimatological data in the presence of long-term persistence (LTP). Monte Carlo experiments employing FARIMA models indicate that trend tests which fail to consider LTP greatly overstate the statistical significance of observed trends when LTP is present. A new test is presented that avoids this problem. From a practical standpoint, however, it may be preferable to acknowledge that the concept of statistical significance is meaningless when discussing poorly understood systems.

  6. Nature's style: Naturally trendy

    NASA Astrophysics Data System (ADS)

    Cohn, Timothy A.; Lins, Harry F.

    2005-12-01

    Hydroclimatological time series often exhibit trends. While trend magnitude can be determined with little ambiguity, the corresponding statistical significance, sometimes cited to bolster scientific and political argument, is less certain because significance depends critically on the null hypothesis which in turn reflects subjective notions about what one expects to see. We consider statistical trend tests of hydroclimatological data in the presence of long-term persistence (LTP). Monte Carlo experiments employing FARIMA models indicate that trend tests which fail to consider LTP greatly overstate the statistical significance of observed trends when LTP is present. A new test is presented that avoids this problem. From a practical standpoint, however, it may be preferable to acknowledge that the concept of statistical significance is meaningless when discussing poorly understood systems.

  7. Failure Analysis of Space Shuttle Orbiter Valve Poppet

    NASA Technical Reports Server (NTRS)

    Russell, Rick

    2010-01-01

    The poppet failed during STS-126 due to fatigue cracking that most likely was initiated during MDC ground-testing. This failure ultimately led to the discovery that the cracking problem was a generic issue affecting numerous poppets throughout the Shuttle program's history. This presentation has focused on the laboratory analysis of the failed hardware, but this analysis was only one aspect of a comprehensive failure investigation. One critical aspect of the overall investigation was modeling of the fluid flow through this valve to determine the possible sources of cyclic loading. This work has led to the conclusion that the poppets are failing due to flow-induced vibration.

  8. In-Depth Characterization and Validation of Human Urine Metabolomes Reveal Novel Metabolic Signatures of Lower Urinary Tract Symptoms

    NASA Astrophysics Data System (ADS)

    Hao, Ling; Greer, Tyler; Page, David; Shi, Yatao; Vezina, Chad M.; Macoska, Jill A.; Marker, Paul C.; Bjorling, Dale E.; Bushman, Wade; Ricke, William A.; Li, Lingjun

    2016-08-01

    Lower urinary tract symptoms (LUTS) are a range of irritative or obstructive symptoms that commonly afflict the aging population. The diagnosis is mostly based on patient-reported symptoms, and current medication often fails to completely eliminate these symptoms. There is a pressing need for objective non-invasive approaches to measure symptoms and understand disease mechanisms. We developed an in-depth workflow combining urine metabolomics analysis and machine learning bioinformatics to characterize metabolic alterations and support objective diagnosis of LUTS. Machine learning feature selection and statistical tests were combined to identify candidate biomarkers, which were statistically validated with leave-one-patient-out cross-validation and absolutely quantified by a selected reaction monitoring assay. Receiver operating characteristic analysis showed the highly accurate predictive power of the candidate biomarkers to stratify patients into diseased or non-diseased categories. The key metabolites and pathways may be correlated with smooth muscle tone changes, increased collagen content, and inflammation, which have been identified as potential contributors to urinary dysfunction in humans and rodents. Periurethral tissue staining revealed a significant increase in collagen content and tissue stiffness in men with LUTS. Together, our study provides the first characterization and validation of LUTS urinary metabolites and pathways to support the future development of a urine-based diagnostic test for LUTS.
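
    A minimal sketch of the leave-one-patient-out validation step, assuming a scikit-learn workflow; the features, labels, classifier, and sample sizes below are illustrative stand-ins rather than the study's actual pipeline.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(40, 5))            # 40 urine samples, 5 metabolite features
        y = rng.integers(0, 2, size=40)         # LUTS vs. non-LUTS labels (toy)
        patients = np.repeat(np.arange(20), 2)  # two samples per patient

        # Every sample from the held-out patient is excluded from training,
        # so within-patient correlation cannot leak into the estimate.
        scores = cross_val_score(LogisticRegression(), X, y,
                                 cv=LeaveOneGroupOut(), groups=patients)
        print("mean accuracy:", scores.mean())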

  9. Patient-reported outcomes before and after treatment of major depressive disorder

    PubMed Central

    IsHak, Waguih William; Mirocha, James; Pi, Sarah; Tobia, Gabriel; Becker, Bret; Peselow, Eric D.; Cohen, Robert M.

    2014-01-01

    Patient-reported outcomes (PROs) of quality of life (QoL), functioning, and depressive symptom severity are important in assessing the burden of illness of major depressive disorder (MDD) and in evaluating the impact of treatment. We sought to provide a detailed analysis of PROs before and after treatment of MDD from the large Sequenced Treatment Alternatives to Relieve Depression (STAR*D) study. This analysis examines PROs before and after treatment in the second level of STAR*D. The complete data on QoL, functioning, and depressive symptom severity were analyzed for each STAR*D level 2 treatment. PROs of QoL, functioning, and depressive symptom severity showed substantial impairments after failing a selective serotonin reuptake inhibitor trial using citalopram (level 1). The seven therapeutic options in level 2 had a statistically (P values) and clinically (Cohen's standardized differences [Cohen's d]) significant positive impact on QoL, functioning, and depressive symptom severity, and reduced the calculated burden of illness. There were no statistically significant differences between the interventions. However, a substantial proportion of patients still suffered from patient-reported QoL and functioning impairment after treatment, an effect that was more pronounced in nonremitters. PROs are crucial in understanding the impact of MDD and in examining the effects of treatment interventions, both in research and clinical settings. PMID:25152656

  10. Applying Monte Carlo Simulation to Launch Vehicle Design and Requirements Analysis

    NASA Technical Reports Server (NTRS)

    Hanson, J. M.; Beard, B. B.

    2010-01-01

    This Technical Publication (TP) is meant to address a number of topics related to the application of Monte Carlo simulation to launch vehicle design and requirements analysis. Although the focus is on a launch vehicle application, the methods may be applied to other complex systems as well. The TP is organized so that all the important topics are covered in the main text, and detailed derivations are in the appendices. The TP first introduces Monte Carlo simulation and the major topics to be discussed, including discussion of the input distributions for Monte Carlo runs, testing the simulation, how many runs are necessary for verification of requirements, what to do if results are desired for events that happen only rarely, and postprocessing, including analyzing any failed runs, examples of useful output products, and statistical information for generating desired results from the output data. Topics in the appendices include some tables for requirements verification, derivation of the number of runs required and generation of output probabilistic data with consumer risk included, derivation of launch vehicle models to include possible variations of assembled vehicles, minimization of a consumable to achieve a two-dimensional statistical result, recontact probability during staging, ensuring duplicated Monte Carlo random variations, and importance sampling.
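
    As a worked example of one of the topics above (how many runs are necessary for verification of requirements): under a simple binomial model, if all n Monte Carlo runs succeed, a requirement of success probability p_req is verified at consumer risk β when n ≥ ln(β)/ln(p_req). This is an illustrative calculation, not the TP's full derivation, which also covers nonzero failure counts.

        import math

        def runs_required(p_req: float, beta: float) -> int:
            """Smallest n such that n successes out of n runs verifies
            P(success) >= p_req with consumer risk beta (all runs must pass)."""
            return math.ceil(math.log(beta) / math.log(p_req))

        print(runs_required(0.99, 0.10))   # -> 230 runs, all succeeding
        print(runs_required(0.997, 0.10))  # -> 767 runs for a tighter requirement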

  11. Exploring the issue of failure to fail in a nursing program.

    PubMed

    Larocque, Sylvie; Luhanga, Florence Loyce

    2013-05-18

    A study using a qualitative descriptive design was undertaken to explore the issue of "failure to fail" in a nursing program. Individual in-depth interviews were conducted with nursing university faculty members, preceptors, and faculty advisors (n=13). Content analysis was used to analyze the data. Results indicate that: (a) failing a student is a difficult process; (b) both academic and emotional support are required for students and preceptors and faculty advisors; (c) there are consequences for programs, faculty, and students when a student has failed a placement; (d) at times, personal, professional, and structural reasons exist for failing to fail a student; and (e) the reputation of the professional program can be diminished as a result of failing to fail a student. Recommendations for improving assessment, evaluation, and intervention with a failing student include documentation, communication, and support. These findings have implications for improving the quality of clinical experiences.

  12. User-perceived reliability of unrepairable shared protection systems with functionally identical units

    NASA Astrophysics Data System (ADS)

    Ozaki, Hirokazu; Kara, Atsushi; Cheng, Zixue

    2012-05-01

    In this article, we investigate the reliability of M-for-N (M:N) shared protection systems. We focus on the reliability that is perceived by an end user of one of N units. We assume that any failed unit is instantly replaced by one of the M units (if available). We describe the effectiveness of such a protection system in a quantitative manner under the condition that the failed units are not repairable. Mathematical analysis gives the closed-form solution of the reliability and mean time to failure (MTTF). We also analyse several numerical examples of the reliability and MTTF. This result can be applied, for example, to the analysis and design of an integrated circuit consisting of redundant backup components. In such a device, repairing a failed component is unrealistic. The analysis provides useful information for the design for general shared protection systems in which the failed units are not repaired.
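
    The closed-form expressions themselves are given in the article; as an illustration of the setting, here is a small Monte Carlo sketch of the user-perceived time to failure under the stated assumptions (instant replacement, no repair), with exponential unit lifetimes and illustrative parameter values.

        import random

        def user_mttf(N=4, M=2, lam=1.0, trials=50_000):
            """Estimate the MTTF perceived by one tagged user of an M:N
            shared-protection system with per-unit failure rate lam."""
            total = 0.0
            for _ in range(trials):
                spares, active, t = M, N, 0.0
                while True:
                    t += random.expovariate(active * lam)  # next failure among active units
                    mine = random.random() < 1.0 / active  # was it the tagged user's unit?
                    if spares > 0:
                        spares -= 1   # instant replacement; everyone keeps running
                    elif mine:
                        break         # no spare left and the user's own unit failed
                    else:
                        active -= 1   # another user's unit is lost for good
                total += t
            return total / trials

        print(user_mttf())  # increases with M, decreases with lam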

  13. Preventing distal pullout of posterior spine instrumentation in thoracic hyperkyphosis: a biomechanical analysis.

    PubMed

    Sun, Edward; Alkalay, Ron; Vader, David; Snyder, Brian D

    2009-06-01

    An in vitro biomechanical study. To compare the mechanical behavior of 5 different constructs used to terminate dual-rod posterior spinal instrumentation in resisting forward flexion moment. Failure of the distal fixation construct can be a significant problem for patients undergoing surgical treatment for thoracic hyperkyphosis. We hypothesize that augmenting distal pedicle screws with infralaminar hooks or sublaminar cables significantly increases the strength and stiffness of these constructs. Thirty-seven thoracolumbar (T12 to L2) calf spines were implanted with 5 configurations of distal constructs: (1) infralaminar hooks, (2) sublaminar cables, (3) pedicle screws, (4) pedicle screws + infralaminar hooks, and (5) pedicle screws + sublaminar cables. A progressive bending moment was applied to each construct until failure. The mode of failure was noted and each construct's stiffness and failure load were determined from the load-displacement curves. Bone density and vertebral dimensions were equivalent among the groups (F = 0.1 to 0.9, P > 0.05). One-way analysis of covariance (adjusted for differences in density and vertebral dimension) demonstrated that all of the screw constructs (screw, screw + hook, and screw + cable) exhibited significantly higher stiffness and ultimate failure loads compared with either sublaminar hook or cable alone (P < 0.05). The screw + hook constructs (109 ± 11 Nm/mm) were significantly stiffer than either screws alone (88 ± 17 Nm/mm) or screw + cable (98 ± 13 Nm/mm) constructs, P < 0.05. The screw + cable construct exhibited a significantly higher failure load (1336 ± 328 N) compared with the screw constructs (1102 ± 256 N, P < 0.05), while not statistically different from the screw + hook construct (1220 ± 75 N). The cable and hook constructs failed by laminar fracture, the screw construct failed in uniaxial shear (pullout), whereas the screws + (hooks or wires) failed by fracture of the caudal vertebral body. Posterior dual-rod constructs fixed distally using pedicle screws were stiffer and stronger in resisting forward flexion compared with cables or hooks alone. Augmenting these screws with either infralaminar hooks or sublaminar cables provided additional resistance to failure.

  14. 40 CFR 1054.315 - How do I know when my engine family fails the production-line testing requirements?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... and CO emissions: C_i = max[0, C_(i-1) + X_i − (STD + 0.25 × σ)], where: C_i = the current CumSum statistic (...). X_i = the current emission test result for an individual engine. STD = emission standard (or family...
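
    A minimal sketch of the excerpted CumSum recursion in Python; the emission standard, sigma, and test results below are illustrative, and the regulation's surrounding sampling rules and action limits are not reproduced here.

        def cumsum_statistic(results, std, sigma):
            """C_i = max(0, C_(i-1) + X_i - (std + 0.25 * sigma))."""
            c, history = 0.0, []
            for x in results:  # X_i: emission test result for each engine
                c = max(0.0, c + x - (std + 0.25 * sigma))
                history.append(c)
            return history

        # Example: illustrative standard 10.0, sigma 2.0, five engine test results.
        print(cumsum_statistic([9.5, 10.2, 11.0, 12.5, 9.8], std=10.0, sigma=2.0))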

  15. Schools, Society, and "Teen" Pregnancy.

    ERIC Educational Resources Information Center

    Males, Mike

    1993-01-01

    The reality of widespread adult/teen sex, as revealed through age-specific pregnancy, birth, and sexually transmitted disease (STD) statistics, has profound implications for public school sex education and efforts to reduce the incidence of teen pregnancy and STDs. Many public school "prevention" measures have failed because male half of "teen" pregnancy…

  16. Mental Disorder or "Normal Life Variation"? Why It Matters

    ERIC Educational Resources Information Center

    Jacobs, David H.

    2014-01-01

    "Diagnostic and Statistical Manual of Mental Disorders, fifth edition" ("DSM-5") promises a refined definition of mental disorder, which is tantamount to acknowledging that prior "DSM" definitions have failed to clarify what mental disorder is and why a person should be considered mentally disordered. Since the…

  17. Balancing the books - a statistical theory of prospective budgets in Earth System science

    NASA Astrophysics Data System (ADS)

    O'Kane, J. Philip

    An honest declaration of the error in a mass, momentum or energy balance, ɛ, simply raises the question of its acceptability: "At what value of ɛ is the attempted balance to be rejected?" Answering this question requires a reference quantity against which to compare ɛ. This quantity must be a mathematical function of all the data used in making the balance. To deliver this function, a theory grounded in a workable definition of acceptability is essential. A distinction must be drawn between a retrospective balance and a prospective budget in relation to any natural space-filling body. Balances look to the past; budgets look to the future. The theory is built on the application of classical sampling theory to the measurement and closure of a prospective budget. It satisfies R.A. Fisher's "vital requirement that the actual and physical conduct of experiments should govern the statistical procedure of their interpretation". It provides a test, which rejects, or fails to reject, the hypothesis that the closing error on the budget, when realised, was due to sampling error only. By increasing the number of measurements, the discrimination of the test can be improved, controlling both the precision and accuracy of the budget and its components. The cost-effective design of such measurement campaigns is discussed briefly. This analysis may also show when campaigns to close a budget on a particular space-filling body are not worth the effort for either scientific or economic reasons. Other approaches, such as those based on stochastic processes, lack this finality, because they fail to distinguish between different types of error in the mismatch between a set of realisations of the process and the measured data.
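
    As a simplified illustration of the kind of test such a theory delivers (not the paper's own construction): if the budget components are estimated from independent samples, the closing error can be standardized by its sampling variance.

        % Simplified closure test, assuming k independently estimated
        % budget components \hat{B}_i with sampling variances \sigma_i^2.
        \[
          \varepsilon = \sum_{i=1}^{k} \hat{B}_i, \qquad
          z = \frac{\varepsilon}{\sqrt{\sum_{i=1}^{k} \sigma_i^2}} .
        \]
        % Under H_0 (the true budget closes, so the mismatch is sampling
        % error only) z is approximately standard normal, and the budget is
        % rejected when |z| > z_{1-\alpha/2}. Adding measurements shrinks
        % each \sigma_i^2 and so sharpens the test's discrimination.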

  18. Good Models Gone Bad: Quantifying and Predicting Parameter-Induced Climate Model Simulation Failures

    NASA Astrophysics Data System (ADS)

    Lucas, D. D.; Klein, R.; Tannahill, J.; Brandon, S.; Covey, C. C.; Domyancic, D.; Ivanova, D. P.

    2012-12-01

    Simulations using IPCC-class climate models can fail or crash for a variety of reasons. Statistical analysis of the failures can yield useful insights to better understand and improve the models. During the course of uncertainty quantification (UQ) ensemble simulations to assess the effects of ocean model parameter uncertainties on climate simulations, we experienced a series of simulation failures of the Parallel Ocean Program (POP2). About 8.5% of our POP2 runs failed for numerical reasons at certain combinations of parameter values. We apply support vector machine (SVM) classification from the fields of pattern recognition and machine learning to quantify and predict the probability of failure as a function of the values of 18 POP2 parameters. The SVM classifiers readily predict POP2 failures in an independent validation ensemble, and are subsequently used to determine the causes of the failures via a global sensitivity analysis. Four parameters related to ocean mixing and viscosity are identified as the major sources of POP2 failures. Our method can be used to improve the robustness of complex scientific models to parameter perturbations and to better steer UQ ensembles. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and was funded by the Uncertainty Quantification Strategic Initiative Laboratory Directed Research and Development Project at LLNL under project tracking code 10-SI-013 (UCRL LLNL-ABS-569112).
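
    A minimal sketch of the SVM-classification step described above; the features, labels, kernel, and failure rule are illustrative stand-ins (the study's 18-parameter ensemble data and tuning are not reproduced).

        import numpy as np
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(1)
        X = rng.uniform(size=(500, 18))            # 18 model parameters per run
        y = (X[:, 0] + X[:, 1] > 1.6).astype(int)  # toy failure rule, ~8% of runs fail

        # Scale the parameters, then fit an RBF-kernel SVM classifier.
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
        print("cv accuracy:", cross_val_score(clf, X, y, cv=5).mean())

        clf.fit(X, y)
        print("failure probabilities:", clf.predict_proba(X[:3])[:, 1])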

  19. Indirect zirconia-reinforced lithium silicate ceramic CAD/CAM restorations: Preliminary clinical results after 12 months.

    PubMed

    Zimmermann, Moritz; Koller, Christina; Mehl, Albert; Hickel, Reinhard

    2017-01-01

    No clinical data are available for the new computer-aided design/computer-assisted manufacture (CAD/CAM) material zirconia-reinforced lithium silicate (ZLS) ceramic. This study describes preliminary clinical results for indirect ZLS CAD/CAM restorations after 12 months. Indirect restorations were fabricated using the CEREC method and intraoral scanning (CEREC Omnicam, CEREC MCXL). Sixty-seven restorations were seated adhesively (baseline). Sixty restorations were evaluated after 12 months (follow-up), using modified FDI criteria. Two groups were established according to the ZLS restorations' post-processing procedure prior to adhesive seating: group I (three-step polishing, n = 32) and group II (fire glazing, n = 28). Statistical analysis was performed with the Mann-Whitney U test and Wilcoxon test (P < .05). The success rate of indirect ZLS CAD/CAM restorations after 12 months was 96.7%. Two restorations failed clinically as a result of bulk fracture (failure rate 3.3%). No statistically significant differences were found between baseline and follow-up criteria (Wilcoxon test, P > .05). A statistically significant difference was found for the criterion 'surface gloss' between group I and group II (Mann-Whitney U test, P < .05). This study demonstrates that ZLS CAD/CAM restorations have a high clinical success rate after 12 months. A longer clinical evaluation period is necessary to draw further conclusions.

  20. Time lapse photography as an approach to understanding glide avalanche activity

    USGS Publications Warehouse

    Hendrikx, Jordy; Peitzsch, Erich H.; Fagre, Daniel B.

    2012-01-01

    Avalanches resulting from glide cracks are notoriously difficult to forecast, but are a recurring problem for numerous avalanche forecasting programs. In some cases glide cracks are observed to open and then melt away in situ. In other cases, they open and then fail catastrophically as large, full-depth avalanches. Our understanding and management of these phenomena are currently limited. It is thought that an increase in the rate of snow gliding occurs prior to full-depth avalanche activity, so frequent observation of glide crack movement can provide an index of instability. During spring 2011 in Glacier National Park, Montana, USA, we began tracking glide crack avalanche activity using a time-lapse camera focused on a southwest-facing glide crack. This crack melted in situ without failing as a glide avalanche, while other nearby glide cracks on north through southeast aspects failed. In spring 2012, a camera was aimed at a large and productive glide crack adjacent to the Going to the Sun Road. We captured three unique glide events in the field of view. Unfortunately, all of them failed either very quickly or during periods of obscured view, so measurements of glide rate could not be obtained. However, we compared the hourly meteorological variables during the period of glide activity to the same variables prior to glide activity. Air temperature, relative humidity, air pressure, incoming and reflected long-wave radiation, SWE, total precipitation, and snow depth were found to be statistically different between the two periods for the cases examined. We propose that these are some of the potential precursors of glide avalanche activity, but urge caution in their use, due to the simple approach and small data set size. It is hoped that by introducing a workable method to easily record glide crack movement, combined with ongoing analysis of the associated meteorological data, we will improve our understanding of when, or if, glide avalanche activity will ensue.

  1. The optimal power puzzle: scrutiny of the monotone likelihood ratio assumption in multiple testing.

    PubMed

    Cao, Hongyuan; Sun, Wenguang; Kosorok, Michael R

    2013-01-01

    In single hypothesis testing, power is a non-decreasing function of type I error rate; hence it is desirable to test at the nominal level exactly to achieve optimal power. The puzzle lies in the fact that for multiple testing, under the false discovery rate paradigm, such a monotonic relationship may not hold. In particular, exact false discovery rate control may lead to a less powerful testing procedure if a test statistic fails to fulfil the monotone likelihood ratio condition. In this article, we identify different scenarios wherein the condition fails and give caveats for conducting multiple testing in practical settings.
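
    For context, a minimal sketch of the Benjamini-Hochberg step-up procedure, the standard FDR-controlling method whose power properties the article's monotone likelihood ratio condition concerns; the p-values below are illustrative.

        import numpy as np

        def benjamini_hochberg(pvals, alpha=0.05):
            """Boolean rejection mask controlling the FDR at level alpha."""
            p = np.asarray(pvals)
            m = len(p)
            order = np.argsort(p)
            thresholds = alpha * np.arange(1, m + 1) / m
            below = p[order] <= thresholds
            reject = np.zeros(m, dtype=bool)
            if below.any():
                k = np.nonzero(below)[0].max()   # largest i with p_(i) <= i*alpha/m
                reject[order[:k + 1]] = True     # reject the k+1 smallest p-values
            return reject

        print(benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.57]))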

  2. Socioeconomic Status Is Not Related with Facial Fluctuating Asymmetry: Evidence from Latin-American Populations

    PubMed Central

    Quinto-Sánchez, Mirsha; Cintas, Celia; Silva de Cerqueira, Caio Cesar; Ramallo, Virginia; Acuña-Alonzo, Victor; Adhikari, Kaustubh; Castillo, Lucía; Gomez-Valdés, Jorge; Everardo, Paola; De Avila, Francisco; Hünemeier, Tábita; Jaramillo, Claudia; Arias, Williams; Fuentes, Macarena; Gallo, Carla; Poletti, Giovani; Schuler-Faccini, Lavinia; Bortolini, Maria Cátira; Canizales-Quinteros, Samuel; Rothhammer, Francisco; Bedoya, Gabriel; Rosique, Javier; Ruiz-Linares, Andrés; González-José, Rolando

    2017-01-01

    The expression of facial asymmetries has been recurrently related to poverty and/or disadvantaged socioeconomic status. Departing from developmental instability theory, previous approaches attempted to test the statistical relationship between the stress experienced by individuals who grew up in poor conditions and an increase in facial and corporal asymmetry. Here we further evaluate this hypothesis on a large sample of admixed Latin American individuals by exploring whether low socioeconomic status individuals tend to exhibit greater facial fluctuating asymmetry values. To do so, we implement Procrustes analysis of variance and Hierarchical Linear Modelling (HLM) to estimate potential associations between facial fluctuating asymmetry values and socioeconomic status. We report significant relationships between facial fluctuating asymmetry values and age, sex, and genetic ancestry, while socioeconomic status failed to exhibit any strong statistical relationship with facial asymmetry. These results persist after the effect of heterozygosity (a proxy for genetic ancestry) is controlled for in the model. Our results indicate that, at least in the studied sample, there is no relationship between socioeconomic stress (understood here as low socioeconomic status) and facial asymmetries. PMID:28060876

  3. Reuse, Recycle, Reweigh: Combating Influenza through Efficient Sequential Bayesian Computation for Massive Data.

    PubMed

    Tom, Jennifer A; Sinsheimer, Janet S; Suchard, Marc A

    Massive datasets in the gigabyte and terabyte range combined with the availability of increasingly sophisticated statistical tools yield analyses at the boundary of what is computationally feasible. Compromising in the face of this computational burden by partitioning the dataset into more tractable sizes results in stratified analyses, removed from the context that justified the initial data collection. In a Bayesian framework, these stratified analyses generate intermediate realizations, often compared using point estimates that fail to account for the variability within and correlation between the distributions these realizations approximate. However, although the initial concession to stratify generally precludes the more sensible analysis using a single joint hierarchical model, we can circumvent this outcome and capitalize on the intermediate realizations by extending the dynamic iterative reweighting MCMC algorithm. In doing so, we reuse the available realizations by reweighting them with importance weights, recycling them into a now tractable joint hierarchical model. We apply this technique to intermediate realizations generated from stratified analyses of 687 influenza A genomes spanning 13 years allowing us to revisit hypotheses regarding the evolutionary history of influenza within a hierarchical statistical framework.

  4. Reuse, Recycle, Reweigh: Combating Influenza through Efficient Sequential Bayesian Computation for Massive Data

    PubMed Central

    Tom, Jennifer A.; Sinsheimer, Janet S.; Suchard, Marc A.

    2015-01-01

    Massive datasets in the gigabyte and terabyte range combined with the availability of increasingly sophisticated statistical tools yield analyses at the boundary of what is computationally feasible. Compromising in the face of this computational burden by partitioning the dataset into more tractable sizes results in stratified analyses, removed from the context that justified the initial data collection. In a Bayesian framework, these stratified analyses generate intermediate realizations, often compared using point estimates that fail to account for the variability within and correlation between the distributions these realizations approximate. However, although the initial concession to stratify generally precludes the more sensible analysis using a single joint hierarchical model, we can circumvent this outcome and capitalize on the intermediate realizations by extending the dynamic iterative reweighting MCMC algorithm. In doing so, we reuse the available realizations by reweighting them with importance weights, recycling them into a now tractable joint hierarchical model. We apply this technique to intermediate realizations generated from stratified analyses of 687 influenza A genomes spanning 13 years allowing us to revisit hypotheses regarding the evolutionary history of influenza within a hierarchical statistical framework. PMID:26681992

  5. Socioeconomic Status Is Not Related with Facial Fluctuating Asymmetry: Evidence from Latin-American Populations.

    PubMed

    Quinto-Sánchez, Mirsha; Cintas, Celia; Silva de Cerqueira, Caio Cesar; Ramallo, Virginia; Acuña-Alonzo, Victor; Adhikari, Kaustubh; Castillo, Lucía; Gomez-Valdés, Jorge; Everardo, Paola; De Avila, Francisco; Hünemeier, Tábita; Jaramillo, Claudia; Arias, Williams; Fuentes, Macarena; Gallo, Carla; Poletti, Giovani; Schuler-Faccini, Lavinia; Bortolini, Maria Cátira; Canizales-Quinteros, Samuel; Rothhammer, Francisco; Bedoya, Gabriel; Rosique, Javier; Ruiz-Linares, Andrés; González-José, Rolando

    2017-01-01

    The expression of facial asymmetries has been recurrently related to poverty and/or disadvantaged socioeconomic status. Departing from developmental instability theory, previous approaches attempted to test the statistical relationship between the stress experienced by individuals who grew up in poor conditions and an increase in facial and corporal asymmetry. Here we further evaluate this hypothesis on a large sample of admixed Latin American individuals by exploring whether low socioeconomic status individuals tend to exhibit greater facial fluctuating asymmetry values. To do so, we implement Procrustes analysis of variance and Hierarchical Linear Modelling (HLM) to estimate potential associations between facial fluctuating asymmetry values and socioeconomic status. We report significant relationships between facial fluctuating asymmetry values and age, sex, and genetic ancestry, while socioeconomic status failed to exhibit any strong statistical relationship with facial asymmetry. These results persist after the effect of heterozygosity (a proxy for genetic ancestry) is controlled for in the model. Our results indicate that, at least in the studied sample, there is no relationship between socioeconomic stress (understood here as low socioeconomic status) and facial asymmetries.

  6. DNA viewed as an out-of-equilibrium structure

    NASA Astrophysics Data System (ADS)

    Provata, A.; Nicolis, C.; Nicolis, G.

    2014-05-01

    The complexity of the primary structure of human DNA is explored using methods from nonequilibrium statistical mechanics, dynamical systems theory, and information theory. A collection of statistical analyses is performed on the DNA data and the results are compared with sequences derived from different stochastic processes. The use of χ² tests shows that DNA cannot be described as a low order Markov chain of order up to r = 6. Although detailed balance seems to hold at the level of a binary alphabet, it fails when all four base pairs are considered, suggesting spatial asymmetry and irreversibility. Furthermore, the block entropy does not increase linearly with the block size, reflecting the long-range nature of the correlations in the human genomic sequences. To probe the spatial structure of the chain locally, we study the exit distances from a specific symbol, the distribution of recurrence distances, and the Hurst exponent, all of which show power law tails and long-range characteristics. These results suggest that human DNA can be viewed as a nonequilibrium structure maintained in its state through interactions with a constantly changing environment. Based solely on the exit distance distribution accounting for the nonequilibrium statistics and using the Monte Carlo rejection sampling method, we construct a model DNA sequence. This method allows us to keep both long- and short-range statistical characteristics of the native DNA data. The model sequence presents the same characteristic exponents as the natural DNA but fails to capture spatial correlations and point-to-point details.
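
    A minimal sketch of the block-entropy measurement mentioned above: the Shannon entropy H(n) of length-n blocks, whose sub-linear growth in n reflects long-range correlations. The input string is a toy stand-in for a genomic sequence.

        from collections import Counter
        from math import log2

        def block_entropy(seq, n):
            """Shannon entropy (bits) of overlapping length-n blocks of seq."""
            blocks = [seq[i:i + n] for i in range(len(seq) - n + 1)]
            counts = Counter(blocks)
            total = len(blocks)
            return -sum((c / total) * log2(c / total) for c in counts.values())

        seq = "ACGTACGGTTACGATCGATCGGATACCGTAGC" * 100  # toy sequence
        for n in range(1, 7):
            print(n, round(block_entropy(seq, n), 3))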

  7. DNA viewed as an out-of-equilibrium structure.

    PubMed

    Provata, A; Nicolis, C; Nicolis, G

    2014-05-01

    The complexity of the primary structure of human DNA is explored using methods from nonequilibrium statistical mechanics, dynamical systems theory, and information theory. A collection of statistical analyses is performed on the DNA data and the results are compared with sequences derived from different stochastic processes. The use of χ² tests shows that DNA cannot be described as a low order Markov chain of order up to r = 6. Although detailed balance seems to hold at the level of a binary alphabet, it fails when all four base pairs are considered, suggesting spatial asymmetry and irreversibility. Furthermore, the block entropy does not increase linearly with the block size, reflecting the long-range nature of the correlations in the human genomic sequences. To probe the spatial structure of the chain locally, we study the exit distances from a specific symbol, the distribution of recurrence distances, and the Hurst exponent, all of which show power law tails and long-range characteristics. These results suggest that human DNA can be viewed as a nonequilibrium structure maintained in its state through interactions with a constantly changing environment. Based solely on the exit distance distribution accounting for the nonequilibrium statistics and using the Monte Carlo rejection sampling method, we construct a model DNA sequence. This method allows us to keep both long- and short-range statistical characteristics of the native DNA data. The model sequence presents the same characteristic exponents as the natural DNA but fails to capture spatial correlations and point-to-point details.

  8. Oral 5-aminosalicylic acid for induction of remission in ulcerative colitis.

    PubMed

    Feagan, Brian G; Macdonald, John K

    2012-10-17

    Oral 5-aminosalicylic acid (5-ASA) preparations were intended to avoid the adverse effects of sulfasalazine (SASP) while maintaining its therapeutic benefits. Previously, it was found that 5-ASA drugs in doses of at least 2 g/day, were more effective than placebo but no more effective than SASP for inducing remission in ulcerative colitis. This updated review includes more recent studies and evaluates the efficacy and safety of 5-ASA preparations used for the treatment of mild to moderately active ulcerative colitis. The primary objectives were to assess the efficacy, dose-responsiveness and safety of oral 5-ASA compared to placebo, SASP, or 5-ASA comparators for induction of remission in active ulcerative colitis. A secondary objective of this systematic review was to compare the efficacy and safety of once daily dosing of oral 5-ASA with conventional (two or three times daily) dosing regimens. A computer-assisted literature search for relevant studies (inception to January 20, 2012) was performed using MEDLINE, EMBASE and the Cochrane Library. Review articles and conference proceedings were also searched to identify additional studies. Studies were accepted for analysis if they were randomized controlled clinical trials of parallel design, with a minimum treatment duration of four weeks. Studies of oral 5-ASA therapy for treatment of patients with active ulcerative colitis compared with placebo, SASP or other formulations of 5-ASA were considered for inclusion. Studies that compared once daily 5-ASA treatment with conventional dosing of 5-ASA (two or three times daily) and 5-ASA dose ranging studies were also considered for inclusion. The outcomes of interest were the failure to induce global/clinical remission, global/clinical improvement, endoscopic remission, endoscopic improvement, adherence, adverse events, withdrawals due to adverse events, and withdrawals or exclusions after entry. Trials were separated into five comparison groups: 5-ASA versus placebo, 5-ASA versus sulfasalazine, once daily dosing versus conventional dosing, 5-ASA versus comparator 5-ASA, and 5-ASA dose-ranging. Placebo-controlled trials were subgrouped by dosage. SASP-controlled trials were subgrouped by 5-ASA/SASP mass ratios. Once daily versus conventional dosing studies were subgrouped by formulation. 5-ASA-controlled trials were subgrouped by common 5-ASA comparators (e.g. Asacol, Claversal, Salofalk and Pentasa). Dose-ranging studies were subgrouped by 5-ASA formulation. We calculated the relative risk (RR) and 95% confidence intervals (95% CI) for each outcome. Data were analyzed on an intention to treat basis. Forty-eight studies (7776 patients) were included. The majority of included studies were rated as low risk of bias. 5-ASA was significantly superior to placebo with regard to all measured outcome variables. Seventy-two per cent of 5-ASA patients failed to enter clinical remission compared to 85% of placebo patients (RR 0.86, 95% CI 0.81 to 0.91). A dose-response trend for 5-ASA was also observed. No statistically significant differences in efficacy were found between 5-ASA and SASP. Fifty-four per cent of 5-ASA patients failed to enter remission compared to 58% of SASP patients (RR 0.90, 95% CI 0.77 to 1.04). No statistically significant differences in efficacy or adherence were found between once daily and conventionally dosed 5-ASA. Forty-two per cent of once daily patients failed to enter clinical remission compared to 44% of conventionally dosed patients (RR 0.95, 95% CI 0.82 to 1.10). 
Eight per cent of patients dosed once daily failed to adhere to their medication regimen compared to 6% of conventionally dosed patients (RR 1.36, 95% CI 0.64 to 2.86). There does not appear to be any difference in efficacy among the various 5-ASA formulations. Forty-eight per cent of patients in the 5-ASA group failed to enter remission compared to 50% of patients in the 5-ASA comparator group (RR 0.94, 95% CI 0.86 to 1.03). A pooled analysis of the ASCEND (I, II and III, n = 1459 patients) studies found no statistically significant difference in clinical improvement between Asacol 4.8 g/day and 2.4 g/day used for the treatment of moderately active ulcerative colitis. Thirty-seven per cent of patients in the 4.8 g/day group failed to improve clinically compared to 41% of patients in the 2.4 g/day group (RR 0.89; 95% CI 0.78 to 1.01). Subgroup analysis indicated that patients with moderate disease may benefit from the higher dose of 4.8 g/day. One study compared (n = 123 patients) Pentasa 4 g/day to 2.25 g/day in patients with moderate disease. Twenty-five per cent of patients in the 4 g/day group failed to improve clinically compared to 57% of patients in the 2.25 g/day group (RR 0.44; 95% CI 0.27 to 0.71). A pooled analysis of two studies comparing MMX mesalamine 4.8 g/day to 2.4 g/day found no statistically significant difference in efficacy (RR 1.03, 95% CI 0.82 to 1.29). 5-ASA was generally safe and common adverse events included flatulence, abdominal pain, nausea, diarrhea, headache and worsening ulcerative colitis. There were no statistically significant differences in the incidence of adverse events between 5-ASA and placebo, once daily and conventionally dosed 5-ASA, 5-ASA and comparator 5-ASA formulation and 5-ASA dose ranging (high dose versus low dose) studies. SASP was not as well tolerated as 5-ASA. Twenty-nine percent of SASP patients experienced an adverse event compared to 15% of 5-ASA patients (RR 0.48, 95% CI 0.37 to 0.63). 5-ASA was superior to placebo and no more effective than SASP. Considering their relative costs, a clinical advantage to using oral 5-ASA in place of SASP appears unlikely. 5-ASA dosed once daily appears to be as efficacious and safe as conventionally dosed 5-ASA. Adherence does not appear to be enhanced by once daily dosing in the clinical trial setting. It is unknown if once daily dosing of 5-ASA improves adherence in a community-based setting. There do not appear to be any differences in efficacy or safety among the various 5-ASA formulations. A daily dosage of 2.4 g appears to be a safe and effective induction therapy for patients with mild to moderately active ulcerative colitis. Patients with moderate disease may benefit from an initial dose of 4.8 g/day.
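
    A minimal sketch of the relative-risk arithmetic behind figures such as "72% vs. 85% failed to enter remission (RR 0.86, 95% CI 0.81 to 0.91)". The counts below are illustrative single-table numbers, not the review's pooled data, which come from a meta-analysis across trials.

        from math import exp, log, sqrt

        def relative_risk(a, n1, c, n2, z=1.96):
            """RR of an event in group 1 vs. group 2, with a 95% Wald CI."""
            rr = (a / n1) / (c / n2)
            se = sqrt(1 / a - 1 / n1 + 1 / c - 1 / n2)  # SE of log(RR)
            lo, hi = exp(log(rr) - z * se), exp(log(rr) + z * se)
            return rr, lo, hi

        # e.g. 720/1000 failures on 5-ASA vs. 850/1000 on placebo
        print(relative_risk(720, 1000, 850, 1000))  # ~ (0.847, 0.809, 0.887)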

  9. Oral 5-aminosalicylic acid for induction of remission in ulcerative colitis.

    PubMed

    Wang, Yongjun; Parker, Claire E; Bhanji, Tania; Feagan, Brian G; MacDonald, John K

    2016-04-21

    Oral 5-aminosalicylic acid (5-ASA) preparations were intended to avoid the adverse effects of sulfasalazine (SASP) while maintaining its therapeutic benefits. Previously, it was found that 5-ASA drugs in doses of at least 2 g/day were more effective than placebo but no more effective than SASP for inducing remission in ulcerative colitis. This updated review includes more recent studies and evaluates the efficacy and safety of 5-ASA preparations used for the treatment of mild to moderately active ulcerative colitis. The primary objectives were to assess the efficacy, dose-responsiveness and safety of oral 5-ASA compared to placebo, SASP, or 5-ASA comparators for induction of remission in active ulcerative colitis. A secondary objective of this systematic review was to compare the efficacy and safety of once daily dosing of oral 5-ASA with conventional (two or three times daily) dosing regimens. A computer-assisted literature search for relevant studies (inception to July 9, 2015) was performed using MEDLINE, EMBASE and the Cochrane Library. Review articles and conference proceedings were also searched to identify additional studies. Studies were accepted for analysis if they were randomized controlled clinical trials of parallel design, with a minimum treatment duration of four weeks. Studies of oral 5-ASA therapy for treatment of patients with active ulcerative colitis compared with placebo, SASP or other formulations of 5-ASA were considered for inclusion. Studies that compared once daily 5-ASA treatment with conventional dosing of 5-ASA (two or three times daily) and 5-ASA dose ranging studies were also considered for inclusion. The outcomes of interest were the failure to induce global/clinical remission, global/clinical improvement, endoscopic remission, endoscopic improvement, adherence, adverse events, withdrawals due to adverse events, and withdrawals or exclusions after entry. Trials were separated into five comparison groups: 5-ASA versus placebo, 5-ASA versus sulfasalazine, once daily dosing versus conventional dosing, 5-ASA versus comparator 5-ASA, and 5-ASA dose-ranging. Placebo-controlled trials were subgrouped by dosage. SASP-controlled trials were subgrouped by 5-ASA/SASP mass ratios. Once daily versus conventional dosing studies were subgrouped by formulation. 5-ASA-controlled trials were subgrouped by common 5-ASA comparators (e.g. Asacol, Claversal, Salofalk and Pentasa). Dose-ranging studies were subgrouped by 5-ASA formulation. We calculated the relative risk (RR) and 95% confidence intervals (95% CI) for each outcome. Data were analyzed on an intention-to-treat basis. Fifty-three studies (8548 patients) were included. The majority of included studies were rated as low risk of bias. 5-ASA was significantly superior to placebo with regard to all measured outcome variables. Seventy-one per cent of 5-ASA patients failed to enter clinical remission compared to 83% of placebo patients (RR 0.86, 95% CI 0.82 to 0.89). A dose-response trend for 5-ASA was also observed. No statistically significant differences in efficacy were found between 5-ASA and SASP. Fifty-four per cent of 5-ASA patients failed to enter remission compared to 58% of SASP patients (RR 0.90, 95% CI 0.77 to 1.04). No statistically significant differences in efficacy or adherence were found between once daily and conventionally dosed 5-ASA. Forty-five per cent of once daily patients failed to enter clinical remission compared to 48% of conventionally dosed patients (RR 0.94, 95% CI 0.83 to 1.07). 
Eight per cent of patients dosed once daily failed to adhere to their medication regimen compared to 6% of conventionally dosed patients (RR 1.36, 95% CI 0.64 to 2.86). There does not appear to be any difference in efficacy among the various 5-ASA formulations. Fifty per cent of patients in the 5-ASA group failed to enter remission compared to 52% of patients in the 5-ASA comparator group (RR 0.94, 95% CI 0.86 to 1.02). A pooled analysis of 3 studies (n = 1459 patients) found no statistically significant difference in clinical improvement between Asacol 4.8 g/day and 2.4 g/day used for the treatment of moderately active ulcerative colitis. Thirty-seven per cent of patients in the 4.8 g/day group failed to improve clinically compared to 41% of patients in the 2.4 g/day group (RR 0.89; 95% CI 0.78 to 1.01). Subgroup analysis indicated that patients with moderate disease may benefit from the higher dose of 4.8 g/day. One study (n = 123 patients) compared Pentasa 4 g/day to 2.25 g/day in patients with moderate disease. Twenty-five per cent of patients in the 4 g/day group failed to improve clinically compared to 57% of patients in the 2.25 g/day group (RR 0.44; 95% CI 0.27 to 0.71). A pooled analysis of two studies comparing MMX mesalamine 4.8 g/day to 2.4 g/day found no statistically significant difference in efficacy (RR 1.03, 95% CI 0.82 to 1.29). There were no statistically significant differences in the incidence of adverse events between 5-ASA and placebo, once daily and conventionally dosed 5-ASA, 5-ASA and comparator 5-ASA formulations, and 5-ASA dose ranging (high dose versus low dose) studies. Common adverse events included flatulence, abdominal pain, nausea, diarrhea, headache and worsening ulcerative colitis. SASP was not as well tolerated as 5-ASA. Twenty-nine per cent of SASP patients experienced an adverse event compared to 15% of 5-ASA patients (RR 0.48, 95% CI 0.37 to 0.63). 5-ASA was superior to placebo and no more effective than SASP. Considering their relative costs, a clinical advantage to using oral 5-ASA in place of SASP appears unlikely. 5-ASA dosed once daily appears to be as efficacious and safe as conventionally dosed 5-ASA. Adherence does not appear to be enhanced by once daily dosing in the clinical trial setting. It is unknown if once daily dosing of 5-ASA improves adherence in a community-based setting. There do not appear to be any differences in efficacy or safety among the various 5-ASA formulations. A daily dosage of 2.4 g appears to be a safe and effective induction therapy for patients with mild to moderately active ulcerative colitis. Patients with moderate disease may benefit from an initial dose of 4.8 g/day.

  10. Examining Minor and Major Depression in Adolescents

    ERIC Educational Resources Information Center

    Gonzalez-Tejera, Gloria; Canino, Glorisa; Ramirez, Rafael; Chavez, Ligia; Shrout, Patrick; Bird, Hector; Bravo, Milagros; Martinez-Taboas, Alfonso; Ribera, Julio; Bauermeister, Jose

    2005-01-01

    Background: Research has shown that a large proportion of adolescents with symptoms of depression and substantial distress or impairment fail to meet the diagnostic criteria for a major depressive disorder (MDD). However, many of these undiagnosed adolescents may meet criteria for a residual category of the "Diagnostic and Statistical Manual of…

  11. Design Research Using Game Design as an Instructional Strategy

    ERIC Educational Resources Information Center

    Siko, Jason; Barbour, Michael

    2014-01-01

    Using Homemade PowerPoint games as an instructional strategy incorporates elements of game design and constructionism in the classroom using "Microsoft PowerPoint," which is ubiquitous in schools today. However, previous research examining the use of these games has failed to show statistical differences in performance. In the second…

  12. Money Matters: The Influence of Financial Factors on Graduate Student Persistence

    ERIC Educational Resources Information Center

    Strayhorn, Terrell L.

    2010-01-01

    National statistics indicate that approximately 50 percent of all graduate students fail to complete their degree; thus, understanding the factors that influence their persistence is an important research objective. Using data from a nationally representative sample of bachelor's degree recipients, the study aimed to answer three questions: What…

  13. Gender-Friendly Schools

    ERIC Educational Resources Information Center

    King, Kelley; Gurian, Michael; Stevens, Kathy

    2010-01-01

    The authors, who have worked with more than 2,000 schools across the United States in efforts to close gender gaps, describe how gender-related issues consistently intersect and interfere with school improvement efforts. They present statistics showing that schools are now failing boys in more areas than girls, and describe how "the…

  14. Sound texture perception via statistics of the auditory periphery: Evidence from sound synthesis

    PubMed Central

    McDermott, Josh H.; Simoncelli, Eero P.

    2014-01-01

    Rainstorms, insect swarms, and galloping horses produce “sound textures” – the collective result of many similar acoustic events. Sound textures are distinguished by temporal homogeneity, suggesting they could be recognized with time-averaged statistics. To test this hypothesis, we processed real-world textures with an auditory model containing filters tuned for sound frequencies and their modulations, and measured statistics of the resulting decomposition. We then assessed the realism and recognizability of novel sounds synthesized to have matching statistics. Statistics of individual frequency channels, capturing spectral power and sparsity, generally failed to produce compelling synthetic textures. However, combining them with correlations between channels produced identifiable and natural-sounding textures. Synthesis quality declined if statistics were computed from biologically implausible auditory models. The results suggest that sound texture perception is mediated by relatively simple statistics of early auditory representations, presumably computed by downstream neural populations. The synthesis methodology offers a powerful tool for their further investigation. PMID:21903084
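
    The statistics in question can be sketched in a few lines of Python: split a signal into bandpass channels, take envelopes, and measure time-averaged moments plus cross-channel correlations. This toy filterbank is a crude stand-in for the paper's auditory model, not a reimplementation of it:

      import numpy as np
      from scipy.signal import butter, filtfilt, hilbert

      def channel_stats(x, fs, bands):
          """Time-averaged envelope statistics per band plus cross-channel correlations."""
          envs = []
          for lo, hi in bands:
              b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
              envs.append(np.abs(hilbert(filtfilt(b, a, x))))   # subband envelope
          E = np.array(envs)
          mean, var = E.mean(axis=1), E.var(axis=1)             # power per channel
          skew = ((E - mean[:, None]) ** 3).mean(axis=1) / var ** 1.5  # sparsity proxy
          return mean, var, skew, np.corrcoef(E)                # + channel correlations

      fs = 16000
      x = np.random.randn(2 * fs)                               # 2 s of noise as test input
      stats = channel_stats(x, fs, [(100, 300), (300, 900), (900, 2700)])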

  15. Failure Analysis of Sapphire Refractive Secondary Concentrators

    NASA Technical Reports Server (NTRS)

    Salem, Jonathan A.; Quinn, George D.

    2009-01-01

    Failure analysis was performed on two sapphire, refractive secondary concentrators (RSC) that failed during elevated temperature testing. Both concentrators failed from machining/handling damage on the lens face. The first concentrator, which failed during testing to 1300 C, exhibited a large r-plane twin extending from the lens through much of the cone. The second concentrator, which was an attempt to reduce temperature gradients and failed during testing to 649 C, exhibited a few small twins on the lens face. The twins were not located at the origin, but represent another mode of failure that needs to be considered in the design of sapphire components. In order to estimate the fracture stress from fractographic evidence, branching constants were measured on sapphire strength specimens. The fractographic analysis indicated radial tensile stresses of 44 to 65 MPa on the lens faces near the origins. Finite element analysis indicated similar stresses for the first RSC, but lower stresses for the second RSC. Better machining and handling might have prevented the fractures; however, temperature gradients and the resultant thermal stresses need to be reduced to prevent twinning.
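
    The branching-constant step relies on the standard fractographic relation sigma = A_B / sqrt(R_B), where A_B is the measured branching constant and R_B is the distance from the fracture origin to crack branching. A sketch with hypothetical values (not the paper's measurements):

      import math

      def stress_at_origin(a_b_mpa_sqrt_m, r_b_m):
          """Fracture stress from the crack-branching relation sigma = A_B / sqrt(R_B)."""
          return a_b_mpa_sqrt_m / math.sqrt(r_b_m)

      # Hypothetical inputs: A_B = 9 MPa*sqrt(m), branching radius 25 mm.
      print(stress_at_origin(9.0, 0.025))   # ~57 MPa, within the 44-65 MPa range reported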

  16. Extraversion and taste sensitivity.

    PubMed

    Zverev, Yuriy; Mipando, Mwapatsa

    2008-03-01

    The rationale for investigating gustatory reactivity as influenced by personality dimensions was suggested by prior findings of an association between extraversion and acuity in other sensory systems. Detection thresholds for the sweet, salty, and bitter qualities of taste were measured in 60 young healthy male and female volunteers using a two-alternative forced-choice technique. The personality of the responders was assessed using the Eysenck Personality Inventory. Multivariate analysis of variance failed to demonstrate a statistically significant interaction between extraversion-introversion score, neuroticism score, smoking, gender, and age. The only reliable negative association found was between body mass index (BMI) and taste sensitivity (Roy's largest root = 0.05, F(7, 436.5) = 8.34, P = 0.003). Possible reasons for the lack of differences between introverts and extraverts in taste detection thresholds are discussed.

  17. The Correlation between Insertion Depth of Prodisc-C Artificial Disc and Postoperative Kyphotic Deformity: Clinical Importance of Insertion Depth of Artificial Disc.

    PubMed

    Lee, Do-Youl; Kim, Se-Hoon; Suh, Jung-Keun; Cho, Tai-Hyoung; Chung, Yong-Gu

    2012-09-01

    This study was designed to investigate the correlation between the insertion depth of the artificial disc and postoperative kyphotic deformity after Prodisc-C total disc replacement surgery, and the range of artificial disc insertion depth that is effective in preventing postoperative whole cervical or segmental kyphotic deformity. A retrospective radiological analysis was performed in 50 patients who had undergone single-level total disc replacement surgery. Records were reviewed to obtain demographic data. Preoperative and postoperative radiographs were assessed to determine the C2-7 Cobb's angle and segmental angle and to investigate postoperative kyphotic deformity. A formula was introduced to calculate the insertion depth of the Prodisc-C artificial disc. Statistical analysis was performed to examine the correlation between insertion depth of the Prodisc-C artificial disc and postoperative kyphotic deformity, and to estimate the insertion depth of the Prodisc-C artificial disc that prevents postoperative kyphotic deformity. No statistically significant correlation was observed between insertion depth of the Prodisc-C artificial disc and postoperative kyphotic deformity with regard to the C2-7 Cobb's angle. A statistical correlation between insertion depth of the Prodisc-C artificial disc and postoperative kyphotic deformity was observed with regard to the segmental angle (p<0.05). The analysis failed to estimate a proper insertion depth of the Prodisc-C artificial disc effective in preventing postoperative kyphotic deformity. Postoperative segmental kyphotic deformity is associated with the insertion depth of the Prodisc-C artificial disc. An anteriorly located artificial disc leads to a lordotic segmental angle and a posteriorly located artificial disc leads to a kyphotic segmental angle postoperatively, but the C2-7 Cobb's angle is not affected by artificial disc location after surgery.

  18. Selection of nontarget arthropod taxa for field research on transgenic insecticidal crops: using empirical data and statistical power.

    PubMed

    Prasifka, J R; Hellmich, R L; Dively, G P; Higgins, L S; Dixon, P M; Duan, J J

    2008-02-01

    One of the possible adverse effects of transgenic insecticidal crops is the unintended decline in the abundance of nontarget arthropods. Field trials designed to evaluate potential nontarget effects can be more complex than expected because decisions to conduct field trials and the selection of taxa to include are not always guided by the results of laboratory tests. Also, recent studies emphasize the potential for indirect effects (adverse impacts to nontarget arthropods without feeding directly on plant tissues), which are difficult to predict because of interactions among nontarget arthropods, target pests, and transgenic crops. As a consequence, field studies may attempt to monitor expansive lists of arthropod taxa, making the design of such broad studies more difficult and reducing the likelihood of detecting any negative effects that might be present. To improve the taxonomic focus and statistical rigor of future studies, existing field data and corresponding power analysis may provide useful guidance. Analysis of control data from several nontarget field trials using repeated-measures designs suggests that while detection of small effects may require considerable increases in replication, there are taxa from different ecological roles that are sampled effectively using standard methods. The use of statistical power to guide selection of taxa for nontarget trials reflects scientists' inability to predict the complex interactions among arthropod taxa, particularly when laboratory trials fail to provide guidance on which groups are more likely to be affected. However, scientists still may exercise judgment, including taxa that are not included in or supported by power analyses.
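
    The replication question the authors raise can be explored with an off-the-shelf power calculation. A sketch for a two-sample t-test in Python, with illustrative effect sizes (not values from the study):

      from statsmodels.stats.power import TTestIndPower

      # Replicates per treatment needed to detect a standardized effect d
      # with a two-sided, two-sample t-test at alpha = 0.05 and 80% power.
      power = TTestIndPower()
      for d in (0.2, 0.5, 0.8):
          n = power.solve_power(effect_size=d, alpha=0.05, power=0.8)
          print(f"d = {d}: ~{n:.0f} replicates per group")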

  19. [Analysis on influencing factor of the complications of percutaneous dilational tracheotomy].

    PubMed

    Zhai, Xiang; Zhang, Jinling; Hang, Wei; Wang, Ming; Shi, Zhan; Mi, Yue; Hu, Yunlei; Liu, Gang

    2015-01-01

    To analyze the factors influencing the complications of percutaneous dilational tracheotomy. Between August 2008 and February 2014, 3 450 patients with indications for tracheotomy underwent percutaneous dilational tracheostomy, performed mainly with the percutaneous dilational and percutaneous guide-wire forceps techniques. Statistical analysis of postoperative complications was performed with SPSS 19.0; the possible influencing factors included age, gender, etiology, preoperative hypoxia, obesity, preoperative pulmonary infection, state of consciousness, operation method, operating doctor, and presence of tracheal intubation. Among the 3 450 patients, 164 had intraoperative or postoperative complications, including postoperative bleeding in 74 cases (2.14%), subcutaneous emphysema in 54 cases (1.57%), wound infection in 16 cases (0.46%), pneumothorax in 6 cases (0.17%), mediastinal emphysema in 5 cases (0.14%), failed operation converted to conventional incision in 4 cases (0.12%), tracheoesophageal fistula in 2 cases (0.06%), and death in 3 cases (0.09%). Obesity, etiology, preoperative hypoxia, preoperative pulmonary infection, state of consciousness and operation method were the main influencing factors, with statistically significant differences (χ² values 0.010, 0.000, 0.002, 0.000, 0.000, 0.000; all P < 0.05). Gender, age, operating doctor and presence of endotracheal intubation were not main influencing factors (P > 0.05). Although percutaneous dilational tracheostomy is safe, complications can still occur. To reduce them, attention should be paid to obesity, etiology, preoperative hypoxia, preoperative pulmonary infection, state of consciousness and operation method.
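
    The factor screening described here amounts to chi-square tests on contingency tables of complication counts. A minimal sketch with a hypothetical 2x2 table (not the study's data):

      from scipy.stats import chi2_contingency

      # Hypothetical 2x2 table: rows = obese / non-obese,
      # columns = complication / no complication.
      table = [[40, 460], [124, 2826]]
      chi2, p, dof, expected = chi2_contingency(table)
      print(f"chi2 = {chi2:.2f}, p = {p:.4f}")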

  20. Systems and methods for circuit lifetime evaluation

    NASA Technical Reports Server (NTRS)

    Heaps, Timothy L. (Inventor); Sheldon, Douglas J. (Inventor); Bowerman, Paul N. (Inventor); Everline, Chester J. (Inventor); Shalom, Eddy (Inventor); Rasmussen, Robert D. (Inventor)

    2013-01-01

    Systems and methods for estimating the lifetime of an electrical system in accordance with embodiments of the invention are disclosed. One embodiment of the invention includes iteratively performing Worst Case Analysis (WCA) on a system design with respect to different system lifetimes using a computer to determine the lifetime at which the worst case performance of the system indicates the system will pass with zero margin or fail within a predetermined margin for error given the environment experienced by the system during its lifetime. In addition, performing WCA on a system with respect to a specific system lifetime includes identifying subcircuits within the system, performing Extreme Value Analysis (EVA) with respect to each subcircuit to determine whether the subcircuit fails EVA for the specific system lifetime, when the subcircuit passes EVA, determining that the subcircuit does not fail WCA for the specified system lifetime, when a subcircuit fails EVA performing at least one additional WCA process that provides a tighter bound on the WCA than EVA to determine whether the subcircuit fails WCA for the specified system lifetime, determining that the system passes WCA with respect to the specific system lifetime when all subcircuits pass WCA, and determining that the system fails WCA when at least one subcircuit fails WCA.
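
    The claimed procedure is essentially a guarded two-stage check inside a search over candidate lifetimes. A sketch of that control flow, in which run_eva and the margin functions are hypothetical stand-ins for the Extreme Value Analysis and tighter worst-case analyses the patent describes:

      def run_eva(sub, lifetime):
          # Hypothetical stand-in: EVA passes if the extreme-value margin is positive.
          return sub["eva_margin"](lifetime) > 0

      def run_tighter_wca(sub, lifetime):
          # Hypothetical stand-in for an analysis with a tighter bound than EVA.
          return sub["wca_margin"](lifetime) > 0

      def system_passes_wca(subcircuits, lifetime):
          for sub in subcircuits:
              if run_eva(sub, lifetime):
                  continue                      # EVA pass => subcircuit passes WCA
              if not run_tighter_wca(sub, lifetime):
                  return False                  # one failing subcircuit fails the system
          return True

      def estimated_lifetime(subcircuits, candidates):
          """Largest candidate lifetime at which the whole system still passes WCA."""
          passing = [t for t in sorted(candidates) if system_passes_wca(subcircuits, t)]
          return passing[-1] if passing else None

      # Toy example: margins decay linearly with lifetime (years).
      subs = [{"eva_margin": lambda t: 10 - t, "wca_margin": lambda t: 12 - t},
              {"eva_margin": lambda t: 8 - t,  "wca_margin": lambda t: 11 - t}]
      print(estimated_lifetime(subs, range(1, 20)))   # -> 10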

  1. More than 100 Colleges Fail Education Department's Test of Financial Strength

    ERIC Educational Resources Information Center

    Blumenstyk, Goldie

    2009-01-01

    A newly compiled analysis by the U.S. Department of Education and obtained by "The Chronicle" shows that 114 private nonprofit degree-granting colleges were in such fragile financial condition at the end of their last fiscal year that they failed the department's financial-responsibility test. Colleges that fail the test are subject to extra…

  2. Failure factors in non-life insurance companies in United Kingdom

    NASA Astrophysics Data System (ADS)

    Samsudin, Humaida Banu

    2013-04-01

    Failure in an insurance company is a condition of financial distress in which the company has difficulty paying off its financial obligations to its creditors. This study continues earlier research on identifying the determinants of run-off for non-life insurance companies in the United Kingdom. The analysis goes on to identify other variables that could lead companies to financial distress, namely macroeconomic factors (GDP rates, inflation rates and interest rates), the total number of companies that failed a year before, and the average size of failed companies. The results indicate that inflation rates, interest rates, the total number of companies failed a year before and the average size of failed companies are the best predictors. Early detection of failure can prevent companies from bankruptcy and allow management to take action to reduce failure costs.

  3. Soil properties of crocker formation and its influence on slope instability along the Ranau-Tambunan highway, Sabah

    NASA Astrophysics Data System (ADS)

    Azlan, Noran Nabilla Nor; Simon, Norbert; Hussin, Azimah; Roslee, Rodeano

    2016-11-01

    The Crocker Formation in the study area consists of inter-bedded shale and sandstone. The intense deformation and discontinuities in the sandstone and shale beds of the arenaceous Crocker Formation make them easily exposed to weathering and instability. In this study, a total of 15 selected slopes representing highly weathered material in stable and unstable conditions were studied to identify the characteristics of the soil material in both conditions and how these characteristics lead to instability. Physical properties analyses of the soil material were conducted on 5 samples from stable slopes and 10 samples from failed slopes collected along the Ranau-Tambunan (RTM) highway, Sabah. The analysis shows that the Crocker Formation consists mainly of poorly graded sandy SILT with low plasticity (MLS) and PI values ranging from 1% to 14%. The failed materials largely exhibit low water content (0.94%-2.03%), a higher proportion of fine-textured material (11%-71%), intermediate liquid limits (21%-44%) and low plastic limits (20%-30%), while the stable materials exhibit low water content (1.25%-1.80%), a higher proportion of coarse-textured material (43%-78%), low liquid limits (25%-28%) and low plastic limits (22%-25%). Specific gravity ranges from 2.24 to 2.60 for both slope conditions. The clay content in failed-slope samples is slightly higher, indicating a higher plasticity than in stable slopes. Statistical analysis was carried out to examine the association between landslide occurrence and soil physical properties in both stable and unstable slopes; the significance of the association was determined from mean rank differences. The study reveals that grain size and soil plasticity contribute largely to slope instability in the study area.
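
    A mean-rank comparison of this kind is what a rank-based test formalizes. A sketch using a Mann-Whitney U test on hypothetical clay-content values for failed versus stable slopes (illustrative numbers, not the study's data):

      from scipy.stats import mannwhitneyu

      # Hypothetical clay-content percentages for failed vs. stable slopes.
      failed = [18, 22, 25, 19, 27, 24, 21, 26, 23, 20]
      stable = [12, 15, 11, 16, 14]
      u, p = mannwhitneyu(failed, stable, alternative="two-sided")
      print(f"U = {u}, p = {p:.4f}")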

  4. Anti-A2 and anti-A1 domain antibodies are potential predictors of immune tolerance induction outcome in children with hemophilia A.

    PubMed

    Lapalud, P; Rothschild, C; Mathieu-Dupas, E; Balicchi, J; Gruel, Y; Laune, D; Molina, F; Schved, J F; Granier, C; Lavigne-Lissalde, G

    2015-04-01

    Hemophilia A (HA) is a congenital bleeding disorder resulting from factor VIII deficiency. The most serious complication of HA management is the appearance of inhibitory antibodies (Abs) against injected FVIII concentrates. To eradicate inhibitors, immune tolerance induction (ITI) is usually attempted, but it fails in up to 30% of cases. Currently, no undisputed predictive marker of ITI outcome is available to facilitate the clinical decision. To identify predictive markers of ITI efficacy. The isotypic and epitopic repertoires of inhibitory Abs were analyzed in plasma samples collected before ITI initiation from 15 children with severe HA and high-titer inhibitors, and their levels were compared in the two outcome groups (ITI success [n = 7] and ITI failure [n = 8]). The predictive value of these candidate biomarkers and of the currently used indicators (inhibitor titer and age at ITI initiation, highest inhibitor titer before ITI, and interval between inhibitor diagnosis and ITI initiation) was then compared by statistical analysis (Wilcoxon test and receiver receiver operating characteristic [ROC] curve analysis). Whereas current indicators seemed to fail in discriminating patients in the two outcome groups (ITI success or failure), anti-A1 and anti-A2 Ab levels before ITI initiation appeared to be good potential predictive markers of ITI outcome (P < 0.018). ROC analysis showed that anti-A1 and anti-A2 Abs were the best at discriminating between outcome groups (area under the ROC curve of > 0.875). Anti-A1 and anti-A2 Abs could represent new promising tools for the development of ITI outcome prediction tests for children with severe HA. © 2015 International Society on Thrombosis and Haemostasis.

  5. Anti-N-Methyl-d-Aspartate receptor (NMDAR) encephalitis during pregnancy: Clinical analysis of reported cases.

    PubMed

    Shi, Yan-Chao; Chen, Xiu-Ju; Zhang, Hong-Mei; Wang, Zhen; Du, Da-Yong

    2017-06-01

    To analyze the clinical features of 13 pregnant patients with anti-N-Methyl-d-Aspartate receptor (NMDAR) encephalitis. A retrospective review of thirteen reported cases of anti-NMDAR encephalitis during pregnancy was conducted. The clinical data were collected from papers published in PubMed prior to 16 February 2016. Statistical analysis of the data was performed, encompassing the patients' age, past medical history, onset of symptoms, concomitant ovarian teratomas, immunotherapy, and outcomes of mothers and newborns. Thirteen cases were reported in 11 articles, with a median age of 23 (interquartile range, 19-27) years. In eight cases onset occurred in the first trimester of gestation and in five cases in the second trimester. Among the 13 cases, five patients had a relevant past medical history: one had autoimmune Graves' hyperthyroidism, one had a history of bilateral ovarian teratoma removal, one had had anti-NMDAR encephalitis five years before pregnancy, and two had psychiatric symptoms. Five patients were found to have ovarian teratomas. Seven patients responded to first-line immunotherapy, and both of the patients given second-line immunotherapy after first-line failure responded to it. On follow-up of all 13 patients, most experienced a substantial recovery, except one who had spasticity and dystonia in one hand and one who died of a superimposed infection. Three fetuses were miscarried or aborted in total. Most newborns were healthy, except two cases (2/10) with abnormal neurologic signs. Clinical analysis of the data indicates that most patients respond to first-line immunotherapy, and that second-line immunotherapy is effective when first-line immunotherapy fails. Most mothers and newborns can have good outcomes. Copyright © 2017. Published by Elsevier B.V.

  6. Clinical effectiveness and safety of leflunomide in inflammatory arthritis: a report from the RAPPORT database with supporting patient survey.

    PubMed

    Schultz, Morgan; Keeling, Stephanie O; Katz, Steven J; Maksymowych, Walter P; Eurich, Dean T; Hall, Jill J

    2017-07-01

    Leflunomide is indicated for the treatment of adults with rheumatoid arthritis, yet is underutilized. Given the cost of biologic therapy, understanding the real-life effectiveness, safety, and sustainability of leflunomide, particularly in patients who have failed methotrexate, would be of value. The primary objective was to assess the proportion of patients achieving clinically meaningful benefit following an adequate trial of leflunomide. The study was a retrospective analysis of a cohort, supplemented with patient self-reported data from a standardized questionnaire. Data were analyzed using descriptive statistics, with a multivariate logistic regression analysis of the database to determine predictors of leflunomide response. Of the cohort available (N = 2591), 1671 patients with confirmed leflunomide use were included in the retrospective analysis, of whom 249 were incident users. Low disease activity (DAS-28 < 3.2) was achieved or maintained by 20% of incident users, with 19% achieving a clinical response (DAS-28 decrease ≥1.2) at 3 months. Adverse effects (AE) were reported by 29% of incident users, and after 1 year, 45% remained on leflunomide. Achievement of "minimal or no joint symptoms" was reported by 34% of the 661 analyzable survey responses (39% response rate). AE were reported by 55%, with nuisance AE (hair loss, nausea, stomach pain) and diarrhea being most common. Leflunomide was discontinued by 67% of responders within 1 year. An important proportion of patients, the majority of whom had previously failed methotrexate, achieved disease response with leflunomide with a low risk of serious adverse effects, suggesting that a trial of leflunomide may be a reasonable and cost-effective strategy prior to biologic therapy.
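
    A multivariable logistic regression of response on candidate predictors, as used for the database analysis, can be sketched with statsmodels. The variables and data below are simulated placeholders, not the RAPPORT cohort:

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      n = 200
      X = np.column_stack([rng.normal(55, 12, n),     # age (hypothetical predictor)
                           rng.integers(0, 2, n),     # prior methotrexate failure flag
                           rng.normal(4.5, 1.0, n)])  # baseline DAS-28
      y = rng.integers(0, 2, n)                       # responder yes/no (simulated)
      fit = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
      print(fit.params)                               # log-odds per predictor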

  7. Role of spatial inhomogenity in GPCR dimerisation predicted by receptor association-diffusion models

    NASA Astrophysics Data System (ADS)

    Deshpande, Sneha A.; Pawar, Aiswarya B.; Dighe, Anish; Athale, Chaitanya A.; Sengupta, Durba

    2017-06-01

    G protein-coupled receptor (GPCR) association is an emerging paradigm with far reaching implications in the regulation of signalling pathways and therapeutic interventions. Recent super resolution microscopy studies have revealed that receptor dimer steady state exhibits sub-second dynamics. In particular the GPCRs, muscarinic acetylcholine receptor M1 (M1MR) and formyl peptide receptor (FPR), have been demonstrated to exhibit a fast association/dissociation kinetics, independent of ligand binding. In this work, we have developed a spatial kinetic Monte Carlo model to investigate receptor homo-dimerisation at a single receptor resolution. Experimentally measured association/dissociation kinetic parameters and diffusion coefficients were used as inputs to the model. To test the effect of membrane spatial heterogeneity on the simulated steady state, simulations were compared to experimental statistics of dimerisation. In the simplest case the receptors are assumed to be diffusing in a spatially homogeneous environment, while spatial heterogeneity is modelled to result from crowding, membrane micro-domains and cytoskeletal compartmentalisation or ‘corrals’. We show that a simple association-diffusion model is sufficient to reproduce M1MR association statistics, but fails to reproduce FPR statistics despite comparable kinetic constants. A parameter sensitivity analysis is required to reproduce the association statistics of FPR. The model reveals the complex interplay between cytoskeletal components and their influence on receptor association kinetics within the features of the membrane landscape. These results constitute an important step towards understanding the factors modulating GPCR organisation.
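
    A spatially homogeneous, single-compartment analogue of such an association/dissociation model is a two-reaction Gillespie simulation (2M <-> D). This sketch ignores diffusion, crowding and corrals, which are the paper's actual subject, but shows the kinetic core; rate constants are illustrative:

      import numpy as np

      def gillespie_dimers(n_mono, k_on, k_off, t_end, seed=0):
          """Stochastic simulation of 2M <-> D (well-mixed, non-spatial)."""
          rng = np.random.default_rng(seed)
          t, m, d, trace = 0.0, n_mono, 0, []
          while t < t_end:
              a1 = k_on * m * (m - 1) / 2        # association propensity
              a2 = k_off * d                     # dissociation propensity
              if a1 + a2 == 0:
                  break
              t += rng.exponential(1 / (a1 + a2))
              if rng.random() < a1 / (a1 + a2):
                  m, d = m - 2, d + 1
              else:
                  m, d = m + 2, d - 1
              trace.append((t, d))
          return trace

      trace = gillespie_dimers(n_mono=200, k_on=1e-4, k_off=0.1, t_end=10.0)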

  8. Feeling the future: A meta-analysis of 90 experiments on the anomalous anticipation of random future events.

    PubMed

    Bem, Daryl; Tressoldi, Patrizio; Rabeyron, Thomas; Duggan, Michael

    2015-01-01

    In 2011, one of the authors (DJB) published a report of nine experiments in the Journal of Personality and Social Psychology purporting to demonstrate that an individual's cognitive and affective responses can be influenced by randomly selected stimulus events that do not occur until after his or her responses have already been made and recorded, a generalized variant of the phenomenon traditionally denoted by the term precognition. To encourage replications, all materials needed to conduct them were made available on request. We here report a meta-analysis of 90 experiments from 33 laboratories in 14 countries which yielded an overall effect greater than 6 sigma, z = 6.40, p = 1.2 × 10⁻¹⁰, with an effect size (Hedges' g) of 0.09. A Bayesian analysis yielded a Bayes Factor of 5.1 × 10⁹, greatly exceeding the criterion value of 100 for "decisive evidence" in support of the experimental hypothesis. When DJB's original experiments are excluded, the combined effect size for replications by independent investigators is 0.06, z = 4.16, p = 1.1 × 10⁻⁵, and the BF value is 3,853, again exceeding the criterion for "decisive evidence." The number of potentially unretrieved experiments required to reduce the overall effect size of the complete database to a trivial value of 0.01 is 544, and seven of eight additional statistical tests support the conclusion that the database is not significantly compromised by either selection bias or by intense "p-hacking" (the selective suppression of findings or analyses that failed to yield statistical significance). P-curve analysis, a recently introduced statistical technique, estimates the true effect size of the experiments to be 0.20 for the complete database and 0.24 for the independent replications, virtually identical to the effect size of DJB's original experiments (0.22) and the closely related "presentiment" experiments (0.21). We discuss the controversial status of precognition and other anomalous effects collectively known as psi.
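
    Two of the quantities reported, a combined z across studies and the file-drawer (fail-safe) N, are simple to compute. A sketch using Stouffer's method and Rosenthal's formula, on hypothetical per-study z scores:

      import math
      from scipy.stats import norm

      def stouffer_z(zs):
          """Combined z across k studies (Stouffer's method, equal weights)."""
          return sum(zs) / math.sqrt(len(zs))

      def failsafe_n(zs, alpha=0.05):
          """Rosenthal's file-drawer N: null studies needed to erase significance."""
          z_crit = norm.ppf(1 - alpha / 2)
          return sum(zs) ** 2 / z_crit ** 2 - len(zs)

      zs = [1.2, 0.8, 2.1, 1.5, 0.3, 1.9]   # hypothetical per-study z scores
      print(stouffer_z(zs), failsafe_n(zs))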

  9. Flight Deck Refuelling Hose Failure HMCS Preserver

    DTIC Science & Technology

    2000-01-01

    Samples of the failed hose were used as a basis for the investigation. Chemical analysis (Py-GC/MS) indicated that the inner tube of the hose was poly(butadiene-acrylonitrile) rubber and the outer cover was poly(chloroprene). Proof tests were performed on the failed Titan hose, a new Titan hose, and a German hose.

  10. Time-resolved versus time-integrated portal dosimetry: the role of an object’s position with respect to the isocenter in volumetric modulated arc therapy

    NASA Astrophysics Data System (ADS)

    Schyns, Lotte E. J. R.; Persoon, Lucas C. G. G.; Podesta, Mark; van Elmpt, Wouter J. C.; Verhaegen, Frank

    2016-05-01

    The aim of this work is to compare time-resolved (TR) and time-integrated (TI) portal dosimetry, focussing on the role of an object’s position with respect to the isocenter in volumetric modulated arc therapy (VMAT). Portal dose images (PDIs) are simulated and measured for different cases: a sphere (1), a bovine bone (2) and a patient geometry (3). For the simulated case (1) and the experimental case (2), several transformations are applied at different off-axis positions. In the patient case (3), three simple plans with different isocenters are created and pleural effusion is simulated in the patient. The PDIs before and after the sphere transformations, as well as the PDIs with and without simulated pleural effusion, are compared using a TI and TR gamma analysis. In addition, the performance of the TI and TR gamma analyses for the detection of real geometric changes in patients treated with clinical plans is investigated and a correlation analysis is performed between gamma fail rates and differences in dose volume histogram (DVH) metrics. The TI gamma analysis can show large differences in gamma fail rates for the same transformation at different off-axis positions (or for different plan isocenters). The TR gamma analysis, however, shows consistent gamma fail rates. For the detection of real geometric changes in patients treated with clinical plans, the TR gamma analysis has a higher sensitivity than the TI gamma analysis. However, the specificity for the TR gamma analysis is lower than for the TI gamma analysis. Both the TI and TR gamma fail rates show no correlation with changes in DVH metrics. This work shows that TR portal dosimetry is fundamentally superior to TI portal dosimetry, because it removes the strong dependence of the gamma fail rate on the off-axis position/plan isocenter. However, for 2D TR portal dosimetry, it is still difficult to interpret gamma fail rates in terms of changes in DVH metrics for patients treated with VMAT.
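
    The gamma analysis underlying both the TI and TR comparisons combines a dose-difference criterion with a distance-to-agreement criterion. A minimal 1D global-gamma sketch (clinical tools operate on 2D/3D data and time-resolved frames, which this omits):

      import numpy as np

      def gamma_1d(ref, ev, x, dd=0.03, dta=3.0):
          """Global 1D gamma: min over evaluated points of the combined criterion."""
          g = np.empty_like(ref)
          for i in range(ref.size):
              dose_term = (ev - ref[i]) / (dd * ref.max())   # 3% global dose diff
              dist_term = (x - x[i]) / dta                   # 3 mm DTA
              g[i] = np.sqrt(dose_term ** 2 + dist_term ** 2).min()
          return g

      x = np.linspace(0, 100, 101)                 # positions in mm
      ref = np.exp(-((x - 50) / 20) ** 2)          # toy reference profile
      ev = np.exp(-((x - 51) / 20) ** 2)           # evaluated profile, shifted 1 mm
      fail_rate = (gamma_1d(ref, ev, x) > 1).mean()   # gamma fail rate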

  11. Computing Inter-Rater Reliability for Observational Data: An Overview and Tutorial

    PubMed Central

    Hallgren, Kevin A.

    2012-01-01

    Many research designs require the assessment of inter-rater reliability (IRR) to demonstrate consistency among observational ratings provided by multiple coders. However, many studies use incorrect statistical procedures, fail to fully report the information necessary to interpret their results, or do not address how IRR affects the power of their subsequent analyses for hypothesis testing. This paper provides an overview of methodological issues related to the assessment of IRR with a focus on study design, selection of appropriate statistics, and the computation, interpretation, and reporting of some commonly-used IRR statistics. Computational examples include SPSS and R syntax for computing Cohen’s kappa and intra-class correlations to assess IRR. PMID:22833776
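
    The paper's computational examples are in SPSS and R; the same Cohen's kappa computation can be sketched in Python with scikit-learn, using hypothetical ratings:

      from sklearn.metrics import cohen_kappa_score

      # Two coders' ratings of the same 12 observations (hypothetical).
      coder1 = [1, 0, 2, 1, 1, 0, 2, 2, 1, 0, 1, 2]
      coder2 = [1, 0, 2, 1, 0, 0, 2, 1, 1, 0, 1, 2]
      print(cohen_kappa_score(coder1, coder2))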

  12. Statistical physics of interacting neural networks

    NASA Astrophysics Data System (ADS)

    Kinzel, Wolfgang; Metzler, Richard; Kanter, Ido

    2001-12-01

    Recent results on the statistical physics of time series generation and prediction are presented. A neural network is trained on quasi-periodic and chaotic sequences, and its overlaps with the sequence generator as well as its prediction errors are calculated numerically. For each network there exists a sequence for which it completely fails to make predictions. Two interacting networks show a transition to perfect synchronization. A pool of interacting networks shows good coordination in the minority game, a model of competition in a closed market. Finally, as a demonstration, a perceptron predicts bit sequences produced by human beings.

  13. The use of postoperative slit-lamp optical coherence tomography to predict primary failure in descemet stripping automated endothelial keratoplasty.

    PubMed

    Shih, Carolyn Y; Ritterband, David C; Palmiero, Pat-Michael; Seedor, John A; Papachristou, George; Harizman, Noga; Liebmann, Jeffrey M; Ritch, Robert

    2009-05-01

    To determine if central donor lenticle thickness as measured by slit-lamp optical coherence tomography (SL OCT; Heidelberg Engineering, Heidelberg, Germany) is predictive of primary donor failure in patients undergoing Descemet stripping automated endothelial keratoplasty (DSAEK). Retrospective cross-sectional study. Eighty-four patients who underwent DSAEK surgery by 2 surgeons (D.C.R. and J.A.S.) were enrolled. At each postoperative visit (postoperative day 1, week 1, month 1, and month 2), an SL OCT scan was obtained. Statistical differences in SL OCT measurements between successful and failed DSAEK procedures were assessed using the Student t test. A successful DSAEK surgery was defined as having an anatomically attached, clear recipient corneal stroma and donor lenticle compatible with good vision 2 months after surgery. A failed DSAEK surgery was defined as an attached donor lenticle with SL evidence of corneal edema and thickening visible at 2 months or more. Ninety-three eyes of 84 consecutive patients who underwent DSAEK surgery also underwent postoperative SL OCT. After 2 months of follow-up, 82 (88%) procedures were successful and 11 (12%) were failures. The average donor lenticle thickness on postoperative day 1 was 314 ± 128 μm in successful DSAEK eyes, compared with 532 ± 259 μm in failed DSAEK eyes (P = .0013); this difference was independent of whether the lenticle was attached at the first postoperative visit. Seventy-nine (98%) successful DSAEK eyes had a lenticle thickness of ≤ 350 μm at the 1-week visit. All of the failed DSAEK eyes (11 eyes) had a lenticle thickness of ≥ 350 μm at the 1-week postoperative visit. Statistically significant differences in SL OCT thickness measurements were seen between successful and failed DSAEK cases at all examinations after postoperative week 1. Corneal thickness measurements made with SL OCT are an important predictor of DSAEK failure in both attached and detached lenticles within the first week of surgery. A DSAEK lenticle thickness of 350 μm or less at 1 week had a predictability of success of more than 98%.

  14. Statistical Deconvolution for Superresolution Fluorescence Microscopy

    PubMed Central

    Mukamel, Eran A.; Babcock, Hazen; Zhuang, Xiaowei

    2012-01-01

    Superresolution microscopy techniques based on the sequential activation of fluorophores can achieve image resolution of ∼10 nm but require a sparse distribution of simultaneously activated fluorophores in the field of view. Image analysis procedures for this approach typically discard data from crowded molecules with overlapping images, wasting valuable image information that is only partly degraded by overlap. A data analysis method that exploits all available fluorescence data, regardless of overlap, could increase the number of molecules processed per frame and thereby accelerate superresolution imaging speed, enabling the study of fast, dynamic biological processes. Here, we present a computational method, referred to as deconvolution-STORM (deconSTORM), which uses iterative image deconvolution in place of single- or multiemitter localization to estimate the sample. DeconSTORM approximates the maximum likelihood sample estimate under a realistic statistical model of fluorescence microscopy movies comprising numerous frames. The model incorporates Poisson-distributed photon-detection noise, the sparse spatial distribution of activated fluorophores, and temporal correlations between consecutive movie frames arising from intermittent fluorophore activation. We first quantitatively validated this approach with simulated fluorescence data and showed that deconSTORM accurately estimates superresolution images even at high densities of activated fluorophores where analysis by single- or multiemitter localization methods fails. We then applied the method to experimental data of cellular structures and demonstrated that deconSTORM enables an approximately fivefold or greater increase in imaging speed by allowing a higher density of activated fluorophores/frame. PMID:22677393
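
    The maximum-likelihood estimate under Poisson noise that deconSTORM approximates is closely related to Richardson-Lucy deconvolution. A bare-bones sketch of that iteration (deconSTORM additionally exploits sparsity and inter-frame correlations, which are omitted here):

      import numpy as np
      from scipy.signal import fftconvolve

      def richardson_lucy(observed, psf, n_iter=50):
          """ML deconvolution under Poisson noise (Richardson-Lucy iteration)."""
          est = np.full_like(observed, observed.mean())
          psf_flip = psf[::-1, ::-1]
          for _ in range(n_iter):
              blurred = fftconvolve(est, psf, mode="same")
              est *= fftconvolve(observed / np.maximum(blurred, 1e-12),
                                 psf_flip, mode="same")
          return est

      psf = np.outer(*2 * [np.exp(-np.linspace(-2, 2, 9) ** 2)])
      psf /= psf.sum()
      frame = np.random.poisson(5.0, (64, 64)).astype(float)   # toy noisy frame
      restored = richardson_lucy(frame, psf)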

  15. Methamphetamine Use among Homeless Former Foster Youth: The Mediating Role of Social Networks

    PubMed Central

    Yoshioka-Maxwell, Amanda; Rice, Eric; Rhoades, Harmony; Winetrobe, Hailey

    2015-01-01

    Objectives Social network analysis can provide added causal insight into otherwise confusing epidemiologic findings in public health research. Although foster care and homelessness are risk factors for methamphetamine use, current research has failed to explicate why homeless youth with foster care experience engage in methamphetamine use at higher rates than other homeless young adults. This study examined the mediating effect of network engagement and time spent homeless on the relationship between foster care experience and recent methamphetamine use among homeless youth in Los Angeles. Methods Egocentric network data from a cross-sectional community-based sample (n = 652) of homeless youth aged 13–25 were collected from drop-in centers in Los Angeles. Questions addressed foster care experience, time spent homeless, methamphetamine use, and perceived drug use in social networks. Path analysis was performed in SAS to examine mediation. Results Controlling for all other variables, results of path analysis regarding recent methamphetamine use indicated a direct effect between foster care experience and recent methamphetamine use (B = .269, t = 2.73, p < .01). However, this direct effect became statistically nonsignificant when time spent homeless and network methamphetamine use were added to the model, and indirect paths from time spent homeless and network methamphetamine use became statistically significant. Conclusions Foster care experience influenced recent methamphetamine use indirectly through time spent homeless and methamphetamine use by network members. Efforts to reduce methamphetamine use should focus on securing stable housing and addressing network interactions among homeless former foster youth. PMID:26146647
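
    The mediation logic described, a direct effect that attenuates once mediators enter the model, is commonly checked with path coefficients and a Sobel test. A sketch on simulated data with hypothetical variable names (not the study's dataset):

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      n = 500
      x = rng.integers(0, 2, n)                    # foster care (0/1), hypothetical
      m = 0.5 * x + rng.normal(size=n)             # mediator: network meth use
      y = (0.6 * m + rng.normal(size=n) > 0.5).astype(int)   # recent meth use

      a_fit = sm.OLS(m, sm.add_constant(x)).fit()                        # X -> M
      b_fit = sm.Logit(y, sm.add_constant(np.column_stack([x, m]))).fit(disp=0)
      a, sa = a_fit.params[1], a_fit.bse[1]
      b, sb = b_fit.params[2], b_fit.bse[2]        # M -> Y controlling for X
      sobel_z = a * b / np.sqrt(b**2 * sa**2 + a**2 * sb**2)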

  16. Gap-free segmentation of vascular networks with automatic image processing pipeline.

    PubMed

    Hsu, Chih-Yang; Ghaffari, Mahsa; Alaraj, Ali; Flannery, Michael; Zhou, Xiaohong Joe; Linninger, Andreas

    2017-03-01

    Current image processing techniques capture large vessels reliably but often fail to preserve connectivity in bifurcations and small vessels. Imaging artifacts and noise can create gaps and discontinuities of intensity that hinder segmentation of vascular trees. However, topological analysis of vascular trees requires proper connectivity, without gaps, loops or dangling segments. Proper tree connectivity is also important for high-quality rendering of surface meshes for scientific visualization or 3D printing. We present a fully automated vessel enhancement pipeline with automated parameter settings for vessel enhancement of tree-like structures from customary imaging sources, including 3D rotational angiography, magnetic resonance angiography, magnetic resonance venography, and computed tomography angiography. The output of the filter pipeline is a vessel-enhanced image which is ideal for generating anatomically consistent network representations of the cerebral angioarchitecture for further topological or statistical analysis. The filter pipeline combined with computational modeling can potentially improve computer-aided diagnosis of cerebrovascular diseases by delivering biometrics and anatomy of the vasculature. It may serve as the first step in fully automatic epidemiological analysis of large clinical datasets. The automatic analysis would enable rigorous statistical comparison of biometrics in subject-specific vascular trees. The robust and accurate image segmentation using a validated filter pipeline would also eliminate the operator dependency that has been observed in manual segmentation. Moreover, manual segmentation is prohibitively time-consuming given that vascular trees have thousands of segments and bifurcations, so interactive segmentation consumes excessive human resources. Subject-specific trees are a first step toward patient-specific hemodynamic simulations for assessing treatment outcomes. Copyright © 2017 Elsevier Ltd. All rights reserved.
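
    One common building block of such vessel-enhancement pipelines is a multiscale Hessian "vesselness" filter. A sketch using scikit-image's Frangi filter on a toy image (the paper's actual filter chain and its automated parameter settings are not reproduced here):

      import numpy as np
      from skimage.filters import frangi

      img = np.zeros((128, 128))
      img[60:66, :] = 1.0                                # toy bright "vessel"
      vesselness = frangi(img, sigmas=range(1, 6), black_ridges=False)
      mask = vesselness > 0.2 * vesselness.max()         # crude segmentation threshold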

  17. Steep discounting of delayed monetary and food rewards in obesity: a meta-analysis.

    PubMed

    Amlung, M; Petker, T; Jackson, J; Balodis, I; MacKillop, J

    2016-08-01

    An increasing number of studies have investigated delay discounting (DD) in relation to obesity, but with mixed findings. This meta-analysis synthesized the literature on the relationship between monetary and food DD and obesity, with three objectives: (1) to characterize the relationship between DD and obesity in both case-control comparisons and continuous designs; (2) to examine potential moderators, including case-control v. continuous design, money v. food rewards, sample sex distribution, and sample age (< 18 v. ≥ 18 years); and (3) to evaluate publication bias. From 134 candidate articles, 39 independent investigations yielded 29 case-control and 30 continuous comparisons (total n = 10 278). Random-effects meta-analysis was conducted using Cohen's d as the effect size. Publication bias was evaluated using fail-safe N, Begg-Mazumdar and Egger tests, meta-regression of publication year and effect size, and imputation of missing studies. The primary analysis revealed a medium effect size across studies that was highly statistically significant (d = 0.43, p < 10⁻¹⁴). None of the moderators examined yielded statistically significant differences, although notably larger effect sizes were found for studies with case-control designs, food rewards and child/adolescent samples. Limited evidence of publication bias was present, although the Begg-Mazumdar test and meta-regression suggested a slightly diminishing effect size over time. Steep DD of food and money appears to be a robust feature of obesity that is relatively consistent across the DD assessment methodologies and study designs examined. These findings are discussed in the context of research on DD in drug addiction, the neural bases of DD in obesity, and potential clinical applications.
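
    The random-effects pooling used here (Cohen's d under a DerSimonian-Laird model) is compact enough to sketch directly; the effect sizes and variances below are illustrative, not the meta-analysis data:

      import numpy as np

      def dersimonian_laird(d, v):
          """Random-effects pooled effect from per-study effects d and variances v."""
          w = 1 / v
          d_fixed = (w * d).sum() / w.sum()
          q = (w * (d - d_fixed) ** 2).sum()
          c = w.sum() - (w ** 2).sum() / w.sum()
          tau2 = max(0.0, (q - (len(d) - 1)) / c)        # between-study variance
          w_star = 1 / (v + tau2)
          pooled = (w_star * d).sum() / w_star.sum()
          se = np.sqrt(1 / w_star.sum())
          return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

      d = np.array([0.55, 0.30, 0.48, 0.25, 0.60])   # hypothetical Cohen's d values
      v = np.array([0.04, 0.02, 0.05, 0.03, 0.06])   # hypothetical sampling variances
      print(dersimonian_laird(d, v))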

  18. Failure Analysis and Magnetic Evaluation of Tertiary Superheater Tube Used in Gas-Fired Boiler

    NASA Astrophysics Data System (ADS)

    Mohapatra, J. N.; Patil, Sujay; Sah, Rameshwar; Krishna, P. C.; Eswarappa, B.

    2018-02-01

    Failure analysis was carried out on a prematurely failed tertiary superheater tube used in a gas-fired boiler. The analysis includes a comparative study of visual examination, chemical composition, hardness and microstructure at the failed region, adjacent to and far from the failure, as well as on a fresh tube. The chemistry was found to match the standard specification, whereas the hardness of the failed tube was low in comparison with the fish-mouth opening region and the fresh tube. Microscopic examination of the failed sample revealed the presence of spheroidal carbides of Cr and Mo predominantly along the grain boundaries. The primary cause of failure was found to be localized heating. Magnetic hysteresis loop (MHL) measurements were carried out to correlate the magnetic parameters with microstructure and mechanical properties, to establish a possible non-destructive evaluation (NDE) method for health monitoring of the tubes. The coercivity of the MHL showed a very good correlation with the deterioration of microstructure and mechanical properties, enabling a possible NDE technique for the health monitoring of the tubes.

  19. Innovations in bonding to zirconia based ceramics: Part III. Phosphate monomer resin cements.

    PubMed

    Mirmohammadi, Hesam; Aboushelib, Moustafa N M; Salameh, Ziad; Feilzer, Albert J; Kleverlaan, Cornelis J

    2010-08-01

    To compare the bond strength values and the ranking order of three phosphate monomer-containing resin cements using microtensile (microTBS) and microshear (microSBS) bond strength tests. Zirconia discs (Procera Zirconia) were bonded to resin composite discs (Filtek Z250) using three different cements (Panavia F 2.0, RelyX UniCem, and Multilink). Two bond strength tests were used to determine the zirconia-resin bond strength: the microtensile bond strength test (microTBS) and the microshear bond strength test (microSBS). Ten specimens were tested for each group (n=10). Two-way analysis of variance (ANOVA) was used to analyze the data (alpha=0.05). There were statistically significant differences in the bond strength values and in the ranking order obtained using the two test methods. microTBS detected significant differences in bond strength values, whereas microSBS failed to detect such an effect. Both Multilink and Panavia demonstrated predominantly cohesive failure in the resin cement, while RelyX UniCem demonstrated interfacial failure. Based on the findings of this study, the data obtained using microTBS and microSBS could not be directly compared. microTBS was more sensitive to material differences than microSBS, which failed to detect such differences. Copyright 2010 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

  20. Guiding the Development and Use of Cost-Effectiveness Analysis in Education

    ERIC Educational Resources Information Center

    Levin, Henry M.; Belfield, Clive

    2015-01-01

    Cost-effectiveness analysis is rarely used in education. When it is used, it often fails to meet methodological standards, especially with regard to cost measurement. Although there are occasional criticisms of these failings, we believe that it is useful to provide a listing of the more common concerns and how they might be addressed. Based upon…

  1. Analysis of Emergency Diesel Generators Failure Incidents in Nuclear Power Plants

    NASA Astrophysics Data System (ADS)

    Hunt, Ronderio LaDavis

    In the early years of operation, emergency diesel generators had a minimal rate of demand failures. Emergency diesel generators are designed to operate as a backup when the main source of electricity has been disrupted. Of late, EDGs (emergency diesel generators) have been failing at NPPs (nuclear power plants) around the United States, causing either station blackouts or loss of onsite and offsite power. These failures were of a specific type called demand failures. This thesis evaluated a problem of current concern in the nuclear industry: the rate rose from an average of 1 EDG demand failure per year in 1997 to an excessive event of 4 EDG demand failures in a single year in 2011. To determine the next occurrence of such an extreme event and its possible causes, two analyses were conducted: a statistical analysis and a root cause analysis. In the statistical analysis, an extreme event probability approach was applied to determine the next year in which an excessive event is likely to occur, as well as the probability of that excessive event occurring. In the root cause analysis, the potential causes of the excessive event were investigated by evaluating the EDG manufacturers, aging, policy changes/maintenance practices and failure components, and by examining the correlation between demand failure data and historical data. Final results from the statistical analysis showed expectations of an excessive event occurring within a fixed range of probability, and a wider range of probability from the extreme event probability approach. The root cause analysis of the demand failure data followed historical statistics for the EDG manufacturers, aging and policy changes/maintenance practices, but indicated a possible cause of the excessive event among the failure components. The conclusions showed that predicting the next excessive demand failure year, its probability and the next occurrence year of such failures with an acceptable confidence level was difficult, but that this type of failure will likely not be a 100-year event. It was noticeable that, as of 2005, the majority of EDG demand failures occurred within the main components. The overall percentages from this study indicate that it would be appropriate to state that the excessive event was caused by the overall age (wear and tear) of the emergency diesel generators in nuclear power plants. Future work will be to better determine the return period of the excessive event once it has occurred a second time, by implementing the extreme event probability approach.
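
    The "100-year event" question can be framed as a simple Poisson calculation: given a historical rate of about one demand failure per year, how rare is a year with four or more? A sketch (the thesis's extreme-event method is more elaborate than this):

      from scipy.stats import poisson

      rate = 1.0                            # ~1 demand failure per year historically
      p_extreme = poisson.sf(3, rate)       # P(4 or more failures in a year)
      return_period = 1 / p_extreme         # expected years between such years
      print(f"P = {p_extreme:.4f}, return period ~ {return_period:.0f} years")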

  2. Reliability estimation of an N-M-cold-standby redundancy system in a multicomponent stress-strength model with generalized half-logistic distribution

    NASA Astrophysics Data System (ADS)

    Liu, Yiming; Shi, Yimin; Bai, Xuchao; Zhan, Pei

    2018-01-01

    In this paper, we study the estimation of the reliability of a multicomponent system, named the N-M-cold-standby redundancy system, based on a progressive Type-II censoring sample. In the system, there are N subsystems, each consisting of M statistically independently distributed strength components; only one of these subsystems works under the impact of stresses at a time while the others remain as standbys. Whenever the working subsystem fails, one of the standbys takes its place. The system fails when all of the subsystems have failed. It is supposed that the underlying distributions of random strength and stress both belong to the generalized half-logistic distribution, with different shape parameters. The reliability of the system is estimated using both classical and Bayesian statistical inference. The uniformly minimum variance unbiased estimator and the maximum likelihood estimator for the reliability of the system are derived. Under a squared error loss function, the exact expression of the Bayes estimator for the reliability of the system is developed by using the Gauss hypergeometric function. The asymptotic confidence interval and corresponding coverage probabilities are derived based on both the Fisher and the observed information matrices. The approximate highest probability density credible interval is constructed by using the Monte Carlo method. Monte Carlo simulations are performed to compare the performances of the proposed reliability estimators. A real data set is also analyzed for an illustration of the findings.
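
    The reliability being estimated can also be approximated by simulation. A Monte Carlo sketch, assuming the exponentiated form of the generalized half-logistic CDF, F(x) = ((1 - e^-x)/(1 + e^-x))^k, and a simplified rule that a subsystem survives only if all M strengths exceed its stress draw (both assumptions, not the paper's exact model):

      import numpy as np

      rng = np.random.default_rng(0)

      def rghl(k, size):
          """Inverse-transform sampling, assuming F(x) = ((1-e^-x)/(1+e^-x))^k."""
          v = rng.random(size) ** (1 / k)
          return np.log((1 + v) / (1 - v))

      def system_reliability(N, M, k_strength, k_stress, trials=100_000):
          strengths = rghl(k_strength, (trials, N, M))
          stresses = rghl(k_stress, (trials, N, 1))
          sub_ok = (strengths > stresses).all(axis=2)   # assumed subsystem rule
          return sub_ok.any(axis=1).mean()              # system works if any subsystem does

      print(system_reliability(N=3, M=4, k_strength=2.0, k_stress=1.0))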

  3. Statistical Inference and Spatial Patterns in Correlates of IQ

    ERIC Educational Resources Information Center

    Hassall, Christopher; Sherratt, Thomas N.

    2011-01-01

    Cross-national comparisons of IQ have become common since the release of a large dataset of international IQ scores. However, these studies have consistently failed to consider the potential lack of independence of these scores based on spatial proximity. To demonstrate the importance of this omission, we present a re-evaluation of several…

  4. Entrepreneurial Education at the Postsecondary Level.

    ERIC Educational Resources Information Center

    Clayton, Graham

    Since 1984, Canada has experienced roughly 150,000 new business start-ups each year. Most of these businesses start small and remain small, yet they have had a significant impact on Canada's economic well-being. Unfortunately, somewhere between 70% and 80% of the new businesses fail to survive beyond 5 years. These statistics underscore the need…

  5. Female Representation in the Higher Education of Geography in Hungary. Symposium

    ERIC Educational Resources Information Center

    Timar, Judit; Jelenszkyne, Ildiko Fabian

    2004-01-01

    This paper charts the changing female representation in the higher education of geography, connecting it with the faltering development of feminist geography in Hungary. The transition from socialism to capitalism has compounded gender inequalities while many of the relevant statistical data display gender blindness. Gender issues fail to form a…

  6. The Ethics of Using Learning Analytics to Categorize Students on Risk

    ERIC Educational Resources Information Center

    Scholes, Vanessa

    2016-01-01

    There are good reasons for higher education institutions to use learning analytics to risk-screen students. Institutions can use learning analytics to better predict which students are at greater risk of dropping out or failing, and use the statistics to treat "risky" students differently. This paper analyses this practice using…

  7. Presidential Address: How to Improve Poverty Measurement in the United States

    ERIC Educational Resources Information Center

    Blank, Rebecca M.

    2008-01-01

    This paper discusses the reasons why the current official U.S. poverty measure is outdated and nonresponsive to many anti-poverty initiatives. A variety of efforts to update and improve the statistic have failed, for political, technical, and institutional reasons. Meanwhile, the European Union is taking a very different approach to poverty…

  8. A meta-analysis of math performance in Turner syndrome.

    PubMed

    Baker, Joseph M; Reiss, Allan L

    2016-02-01

    Studies investigating the relationship between Turner syndrome and math learning disability have used a wide variation of tasks designed to test various aspects of mathematical competencies. Although these studies have revealed much about the math deficits common to Turner syndrome, their diversity makes comparisons between individual studies difficult. As a result, the consistency of outcomes among these diverse measures remains unknown. The overarching aim of this review is to provide a systematic meta-analysis of the differences in math and number performance between females with Turner syndrome and age-matched neurotypical peers. We provide a meta-analysis of behavioral performance in Turner syndrome relative to age-matched neurotypical populations on assessments of math and number aptitude. In total, 112 comparisons collected across 17 studies were included. Although 54% of all statistical comparisons in our analyses failed to reject the null hypothesis, our results indicate that meaningful group differences exist on all comparisons except those that do not require explicit calculation. Taken together, these results help elucidate our current understanding of math and number weaknesses in Turner syndrome, while highlighting specific topics that require further investigation. © 2015 Mac Keith Press.
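
    For readers unfamiliar with the pooling step behind such results, here is a minimal DerSimonian-Laird random-effects sketch; the effect sizes and variances are made up, not taken from the 112 comparisons in the review.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Pool study-level effect sizes with the DerSimonian-Laird
    random-effects model; returns pooled effect, its SE, and tau^2."""
    effects = np.asarray(effects, float)
    v = np.asarray(variances, float)
    w = 1.0 / v                                    # fixed-effect weights
    mu_fe = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - mu_fe) ** 2)         # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)  # between-study variance
    w_re = 1.0 / (v + tau2)                        # random-effects weights
    mu_re = np.sum(w_re * effects) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return mu_re, se, tau2

# Hypothetical standardized mean differences (TS vs. controls) and variances.
g = [0.65, 0.40, 0.90, 0.55, 0.30]
v = [0.04, 0.09, 0.06, 0.05, 0.12]
mu, se, tau2 = dersimonian_laird(g, v)
print(f"pooled g = {mu:.2f} +/- {1.96 * se:.2f} (tau^2 = {tau2:.3f})")
```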

  9. Job stress: an in-depth investigation based on the HSE questionnaire and a multistep approach in order to identify the most appropriate corrective actions.

    PubMed

    De Sio, S; Cedrone, F; Greco, E; Di Traglia, M; Sanità, D; Mandolesi, D; Stansfeld, S A

    2016-01-01

    Psychosocial hazards and work-related stress have reached epidemic proportions in Europe. Italian law introduced in 2008 the obligation for Italian companies to assess work-related stress risk in order to protect their workers' safety and health. The purpose of our study was to propose an accurate measurement tool, using the HSE indicator tool, for more appropriate and significant work-related stress prevention measures. The study was conducted on 204 visual display unit (VDU) operators: 106 male and 98 female. All subjects were administered the HSE questionnaire. The sample was studied through a 4-step process, using the HSE analysis tool and a statistical analysis based on the odds ratio calculation. The assessment model used demonstrated the presence of work-related stress in VDU operators, along with additional "critical" aspects which had failed to emerge through the classical use of the HSE analysis tool. The approach we propose allows a complete picture of the perception of work-related stress to be obtained and can point out the most appropriate corrective actions.
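
    The odds ratio calculation underlying such an analysis is straightforward; a minimal sketch with hypothetical 2x2 counts:

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table (exposed: a cases / b non-cases;
    unexposed: c cases / d non-cases) with a 95% Wald confidence interval."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, lo, hi

# Hypothetical counts: stressed vs. non-stressed operators reporting low control.
or_, lo, hi = odds_ratio(a=34, b=72, c=15, d=83)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```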

  10. Gender-Related and Age-Related Differences in Implantable Defibrillator Recipients: Results From the Pacemaker and Implantable Defibrillator Leads Survival Study ("PAIDLESS").

    PubMed

    Feldman, Alyssa M; Kersten, Daniel J; Chung, Jessica A; Asheld, Wilbur J; Germano, Joseph; Islam, Shahidul; Cohen, Todd J

    2015-12-01

    The purpose of this study was to investigate the influences of gender and age on defibrillator lead failure and patient mortality. The specific influences of gender and age on defibrillator lead failure have not previously been investigated. This study analyzed the differences in gender and age in relation to defibrillator lead failure and mortality of patients in the Pacemaker and Implantable Defibrillator Leads Survival Study ("PAIDLESS"). PAIDLESS includes all patients at Winthrop University Hospital who underwent defibrillator lead implantation between February 1, 1996 and December 31, 2011. Male and female patients were compared within each age decile, beginning at 15 years old, to analyze lead failure and patient mortality. Statistical analyses were performed using Wilcoxon rank-sum test, Fisher's exact test, Kaplan-Meier analysis, and multivariable Cox regression models. P<.05 was considered statistically significant. No correction for multiple comparisons was performed for the subgroup analyses. A total of 3802 patients (2812 men and 990 women) were included in the analysis. The mean age was 70 ± 13 years (range, 15-94 years). Kaplan-Meier analysis found that between 45 and 54 years of age, leads implanted in women failed significantly faster than in men (P=.03). Multivariable Cox regression models were built to validate this finding, and they confirmed that male gender was an independent protective factor of lead failure in the 45 to 54 years group (for male gender: HR, 0.37; 95% confidence interval, 0.14-0.96; P=.04). Lead survival time for women in this age group was 13.4 years (standard error, 0.6), while leads implanted in men of this age group survived 14.7 years (standard error, 0.3). Although there were significant differences in lead failure, no differences in mortality between the genders were found for any ages or within each decile. This study is the first to compare defibrillator lead failure and patient mortality in relation to gender and age deciles at a single large implanting center. Within the 45 to 54 years group, leads implanted in women failed faster than in men. Male gender was found to be an independent protective factor in lead survival. This study emphasizes the complex interplay between gender and age with respect to implantable defibrillator lead failure and mortality.
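
    The survival methods named in the abstract are standard; a minimal sketch using the lifelines package on a small, entirely hypothetical set of lead records (column names and values invented for illustration):

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Hypothetical lead-survival records: years to failure/censoring,
# failure indicator, and covariates (deliberately tiny for illustration).
df = pd.DataFrame({
    "years":  [13.4, 14.7, 9.8, 15.1, 6.2, 12.0, 14.2, 8.9],
    "failed": [1, 0, 1, 1, 1, 0, 1, 0],
    "male":   [0, 1, 0, 1, 0, 1, 1, 0],
    "age":    [48, 52, 50, 47, 53, 49, 51, 46],
})

kmf = KaplanMeierFitter()
kmf.fit(df["years"], event_observed=df["failed"], label="all leads")
print(kmf.median_survival_time_)

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="failed")
cph.print_summary()   # hazard ratios; HR < 1 for 'male' would mirror the study
```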

  11. Statistics at a glance.

    PubMed

    Ector, Hugo

    2010-12-01

    I still remember my first book on statistics: "Elementary statistics with applications in medicine and the biological sciences" by Frederick E. Croxton. For me, it was the start of a pursuit of understanding statistics in daily life and in medical practice. It was the first volume in a long row of books. In his introduction, Croxton claims that "nearly everyone involved in any aspect of medicine needs to have some knowledge of statistics". The reality is that for many clinicians, statistics are limited to "P < 0.05 = ok". I do not blame my colleagues who omit the paragraph on statistical methods. They have never had the opportunity to learn concise and clear descriptions of the key features. I have experienced how some authors can describe difficult methods in well understandable language. Others fail completely. As a teacher, I tell my students that life is impossible without a basic knowledge of statistics. This feeling has resulted in an annual seminar of 90 minutes. This tutorial is a summary of that seminar and a transcription of the best pages I have encountered.

  12. The crossing statistic: dealing with unknown errors in the dispersion of Type Ia supernovae

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shafieloo, Arman; Clifton, Timothy; Ferreira, Pedro, E-mail: arman@ewha.ac.kr, E-mail: tclifton@astro.ox.ac.uk, E-mail: p.ferreira1@physics.ox.ac.uk

    2011-08-01

    We propose a new statistic that has been designed to be used in situations where the intrinsic dispersion of a data set is not well known: the Crossing Statistic. This statistic is in general less sensitive than χ² to the intrinsic dispersion of the data, and hence allows us to make progress in distinguishing between different models using goodness of fit to the data even when the errors involved are poorly understood. The proposed statistic makes use of the shape and trends of a model's predictions in a quantifiable manner. It is applicable to a variety of circumstances, although we consider it to be especially well suited to the task of distinguishing between different cosmological models using type Ia supernovae. We show that this statistic can easily distinguish between different models in cases where the χ² statistic fails. We also show that the last mode of the Crossing Statistic is identical to χ², so that it can be considered as a generalization of χ².

  13. Big data to smart data in Alzheimer's disease: Real-world examples of advanced modeling and simulation.

    PubMed

    Haas, Magali; Stephenson, Diane; Romero, Klaus; Gordon, Mark Forrest; Zach, Neta; Geerts, Hugo

    2016-09-01

    Many disease-modifying clinical development programs in Alzheimer's disease (AD) have failed to date, and development of new and advanced preclinical models that generate actionable knowledge is desperately needed. This review reports on computer-based modeling and simulation approaches as powerful tools in AD research. Statistical data-analysis techniques can identify associations between certain data and phenotypes, such as diagnosis or disease progression. Other approaches integrate domain expertise in a formalized mathematical way to understand how specific components of pathology integrate into complex brain networks. Private-public partnerships focused on data sharing, causal inference and pathway-based analysis, crowdsourcing, and mechanism-based quantitative systems modeling represent successful real-world modeling examples with substantial impact on CNS diseases. As in other disease indications, successful real-world examples of advanced simulation can generate actionable support for drug discovery and development in AD, illustrating the value that can be generated for different stakeholders. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  14. Fine-grained dengue forecasting using telephone triage services

    PubMed Central

    Abdur Rehman, Nabeel; Kalyanaraman, Shankar; Ahmad, Talal; Pervaiz, Fahad; Saif, Umar; Subramanian, Lakshminarayanan

    2016-01-01

    Thousands of lives are lost every year in developing countries for failing to detect epidemics early because of the lack of real-time disease surveillance data. We present results from a large-scale deployment of a telephone triage service as a basis for dengue forecasting in Pakistan. Our system uses statistical analysis of dengue-related phone calls to accurately forecast suspected dengue cases 2 to 3 weeks ahead of time at a subcity level (correlation of up to 0.93). Our system has been operational at scale in Pakistan for the past 3 years and has received more than 300,000 phone calls. The predictions from our system are widely disseminated to public health officials and form a critical part of active government strategies for dengue containment. Our work is the first to demonstrate, with significant empirical evidence, that an accurate, location-specific disease forecasting system can be built using analysis of call volume data from a public health hotline. PMID:27419226
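
    The core signal behind such a system, call volume leading case counts by a fixed lag, can be checked with a simple lagged correlation; a synthetic sketch (all numbers invented, with a 2-week lead built in):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)

# Hypothetical weekly series: triage calls lead confirmed cases by ~2 weeks.
weeks = pd.date_range("2014-06-01", periods=40, freq="W")
calls = pd.Series(rng.poisson(200, 40).astype(float), index=weeks)
cases = calls.shift(2) * 0.15 + rng.normal(0, 3, 40)   # built-in 2-week lead

# Correlate calls at week t with cases at week t + lead.
for lead in range(5):
    r = calls.corr(cases.shift(-lead))
    print(f"lead {lead} weeks: r = {r:.2f}")
```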

  15. Demographic inference under the coalescent in a spatial continuum.

    PubMed

    Guindon, Stéphane; Guo, Hongbin; Welch, David

    2016-10-01

    Understanding population dynamics from the analysis of molecular and spatial data requires sound statistical modeling. Current approaches assume that populations are naturally partitioned into discrete demes, thereby failing to be relevant in cases where individuals are scattered on a spatial continuum. Other models predict the formation of increasingly tight clusters of individuals in space, which, again, conflicts with biological evidence. Building on recent theoretical work, we introduce a new genealogy-based inference framework that alleviates these issues. This approach effectively implements a stochastic model in which the distribution of individuals is homogeneous and stationary, thereby providing a relevant null model for the fluctuation of genetic diversity in time and space. Importantly, the spatial density of individuals in a population and their range of dispersal during the course of evolution are two parameters that can be inferred separately with this method. The validity of the new inference framework is confirmed with extensive simulations and the analysis of influenza sequences collected over five seasons in the USA. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Parameterization of phosphine ligands demonstrates enhancement of nickel catalysis via remote steric effects.

    PubMed

    Wu, Kevin; Doyle, Abigail G

    2017-08-01

    The field of Ni-catalysed cross-coupling has seen rapid recent growth because of the low cost of Ni, its earth abundance, and its ability to promote unique cross-coupling reactions. Whereas advances in the related field of Pd-catalysed cross-coupling have been driven by ligand design, the development of ligands specifically for Ni has received minimal attention. Here, we disclose a class of phosphines that enable the Ni-catalysed Csp3 Suzuki coupling of acetals with boronic acids to generate benzylic ethers, a reaction that failed with known ligands for Ni and designer phosphines for Pd. Using parameters to quantify phosphine steric and electronic properties together with regression statistical analysis, we identify a model for ligand success. The study suggests that effective phosphines feature remote steric hindrance, a concept that could guide future ligand design tailored to Ni. Our analysis also reveals that two classic descriptors for ligand steric environment, cone angle and % buried volume, are not equivalent, despite their treatment in the literature.
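
    The regression step described here, relating parameterized ligand descriptors to an outcome, can be sketched with standardized least squares; the descriptor values and yields below are invented for illustration, not data from the paper:

```python
import numpy as np

# Hypothetical ligand table: [cone angle (deg), % buried volume, remote steric
# parameter] and observed yield; all values are made up for illustration.
X_raw = np.array([
    [145, 29.0, 0.8],
    [160, 33.5, 2.1],
    [170, 36.0, 3.0],
    [132, 26.5, 0.4],
    [155, 31.0, 2.6],
    [165, 34.0, 1.2],
])
y = np.array([22., 61., 78., 15., 70., 38.])   # hypothetical yields (%)

# Standardize so coefficients are comparable across descriptors.
X = (X_raw - X_raw.mean(0)) / X_raw.std(0)
X = np.column_stack([np.ones(len(y)), X])      # add intercept
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
for name, b in zip(["intercept", "cone angle", "%Vbur", "remote sterics"], beta):
    print(f"{name:>15}: {b:+.1f}")
```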

  17. Image quality in real-time teleultrasound of infant hip exam over low-bandwidth internet links: a transatlantic feasibility study.

    PubMed

    Martinov, Dobrivoje; Popov, Veljko; Ignjatov, Zoran; Harris, Robert D

    2013-04-01

    The evolution of communication systems, especially internet-based technologies, has probably affected radiology more than any other medical specialty. The tremendous increase in internet bandwidth has enabled a true revolution in image transmission and easy remote viewing of static images and real-time video streams. Previous reports of real-time telesonography, such as systems developed for emergency situations and humanitarian work, rely on highly compressed images used by a remote sonologist to guide and supervise an inexperienced examiner. We believe that remote sonology could also be utilized in teleultrasound examination of the infant hip. We tested the feasibility of a low-cost teleultrasound system for the infant hip and performed data analysis on the transmitted and original images. Transmission of data was accomplished with Remote Ultrasound (RU), a software package specifically designed for teleultrasound transmission over limited internet bandwidth. While image analysis of image pairs revealed a statistically significant loss of information, panel evaluation failed to recognize any clinical difference between the original saved and transmitted still images.

  18. The problem of natural funnel asymmetries: a simulation analysis of meta-analysis in macroeconomics.

    PubMed

    Callot, Laurent; Paldam, Martin

    2011-06-01

    Effect sizes in macroeconomics are estimated by regressions on data published by statistical agencies. Funnel plots are a representation of the distribution of the resulting regression coefficients. They are normally much wider than predicted by the t-ratios of the coefficients and often asymmetric. The standard method of meta-analysts in economics assumes that the asymmetries are due to publication bias causing censoring, and adjusts the average accordingly. The paper shows that some funnel asymmetries may be 'natural', in that they occur without censoring. We investigate such asymmetries by simulating funnels from pairs of data generating processes (DGPs) and estimating models (EMs), in which the EM has the problem that it disregards a property of the DGP. The problems are data dependency, structural breaks, non-normal residuals, non-linearity, and omitted variables. We show that some of these problems generate funnel asymmetries. When they do, the standard method often fails. Copyright © 2011 John Wiley & Sons, Ltd.
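
    For contrast with the paper's 'natural asymmetry' argument, the sketch below simulates the mechanism the standard method assumes: a funnel made asymmetric purely by one-sided significance censoring of a true null effect. Study counts and standard-error ranges are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(7)

# 2000 simulated 'studies' of a true zero effect, with varying precision.
se = rng.uniform(0.02, 0.5, 2000)
est = rng.normal(0.0, se)

# Censoring: only results clearing a one-sided significance hurdle appear.
published = est / se > 1.64
print("mean of all estimates:        ", round(float(est.mean()), 3))
print("mean of 'published' estimates:", round(float(est[published].mean()), 3))
print("share published:              ", round(float(published.mean()), 3))
```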

  19. Parameterization of phosphine ligands demonstrates enhancement of nickel catalysis via remote steric effects

    NASA Astrophysics Data System (ADS)

    Wu, Kevin; Doyle, Abigail G.

    2017-08-01

    The field of Ni-catalysed cross-coupling has seen rapid recent growth because of the low cost of Ni, its earth abundance, and its ability to promote unique cross-coupling reactions. Whereas advances in the related field of Pd-catalysed cross-coupling have been driven by ligand design, the development of ligands specifically for Ni has received minimal attention. Here, we disclose a class of phosphines that enable the Ni-catalysed Csp3 Suzuki coupling of acetals with boronic acids to generate benzylic ethers, a reaction that failed with known ligands for Ni and designer phosphines for Pd. Using parameters to quantify phosphine steric and electronic properties together with regression statistical analysis, we identify a model for ligand success. The study suggests that effective phosphines feature remote steric hindrance, a concept that could guide future ligand design tailored to Ni. Our analysis also reveals that two classic descriptors for ligand steric environment—cone angle and % buried volume—are not equivalent, despite their treatment in the literature.

  20. Cognitive Behavioral Therapy: A Meta-Analysis of Race and Substance Use Outcomes

    PubMed Central

    Windsor, Liliane Cambraia; Jemal, Alexis; Alessi, Edward

    2015-01-01

    Cognitive behavioral therapy (CBT) is an effective intervention for reducing substance use. However, because CBT trials have included predominantly White samples caution must be used when generalizing these effects to Blacks and Hispanics. This meta-analysis compared the impact of CBT in reducing substance use between studies with a predominantly non-Hispanic White sample (hereafter NHW studies) and studies with a predominantly Black and/or Hispanic sample (hereafter BH studies). From 322 manuscripts identified in the literature, 17 met criteria for inclusion. Effect sizes between CBT and comparison group at posttest had similar effects on substance abuse across NHW and BH studies. However, when comparing pre-posttest effect sizes from groups receiving CBT between NHW and BH studies, CBT’s impact was significantly stronger in NHW studies. T-test comparisons indicated reduced retention/engagement in BH studies, albeit failing to reach statistical significance. Results highlight the need for further research testing CBT’s impact on substance use among Blacks and Hispanics. PMID:25285527

  1. Predictive validity of the UKCAT for medical school undergraduate performance: a national prospective cohort study.

    PubMed

    Tiffin, Paul A; Mwandigha, Lazaro M; Paton, Lewis W; Hesselgreaves, H; McLachlan, John C; Finn, Gabrielle M; Kasim, Adetayo S

    2016-09-26

    The UK Clinical Aptitude Test (UKCAT) has been shown to have a modest but statistically significant ability to predict aspects of academic performance throughout medical school. Previously, this ability has been shown to be incremental to conventional measures of educational performance for the first year of medical school. This study evaluates whether this predictive ability extends throughout the whole of undergraduate medical study and explores the potential impact of using the test as a selection screening tool. This was an observational prospective study, linking UKCAT scores, prior educational attainment and sociodemographic variables with subsequent academic outcomes during the 5 years of UK medical undergraduate training. The participants were 6812 entrants to UK medical schools in 2007-8 using the UKCAT. The main outcome was academic performance at each year of medical school. A receiver operating characteristic (ROC) curve analysis was also conducted, treating the UKCAT as a screening test for a negative academic outcome (failing at least 1 year at first attempt). All four of the UKCAT scale scores significantly predicted performance in theory- and skills-based exams. After adjustment for prior educational achievement, the UKCAT scale scores remained significantly predictive for most years. Findings from the ROC analysis suggested that, if used as a sole screening test, with the mean applicant UKCAT score as the cut-off, the test could be used to reject candidates at high risk of failing at least 1 year at first attempt. However, the 'number needed to reject' value would be high (at 1.18), with roughly one candidate who would have been likely to pass all years at first sitting being rejected for every higher risk candidate potentially declined entry on this basis. The UKCAT scores demonstrate a statistically significant but modest degree of incremental predictive validity throughout undergraduate training. Whilst the UKCAT could be considered a fairly crude screening tool for future academic performance, it may offer added value when used in conjunction with other selection measures. Future work should focus on the optimum role of such tests within the selection process and the prediction of post-graduate performance.
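
    The screening-test framing can be reproduced on synthetic data: compute the AUC and, at the mean-score cut-off, a 'number needed to reject' as the ratio of would-be passers to higher-risk candidates among those declined. All numbers below are invented and only loosely shaped like the study's setting.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)

# Hypothetical cohort: UKCAT-like scores, with 10% failing at least one year.
n = 6000
fail = rng.uniform(size=n) < 0.10
score = rng.normal(loc=np.where(fail, 2450, 2550), scale=150)

print("AUC:", round(roc_auc_score(fail, -score), 3))   # lower score -> higher risk

cut = score.mean()                      # mean applicant score as the cut-off
rejected = score < cut
tp = np.sum(rejected & fail)            # higher-risk candidates declined
fp = np.sum(rejected & ~fail)           # would-be passers declined
print("number needed to reject:", round(fp / tp, 2))
```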

  2. Many-Body Localization and Thermalization in Quantum Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Nandkishore, Rahul; Huse, David A.

    2015-03-01

    We review some recent developments in the statistical mechanics of isolated quantum systems. We provide a brief introduction to quantum thermalization, paying particular attention to the eigenstate thermalization hypothesis (ETH) and the resulting single-eigenstate statistical mechanics. We then focus on a class of systems that fail to quantum thermalize and whose eigenstates violate the ETH: These are the many-body Anderson-localized systems; their long-time properties are not captured by the conventional ensembles of quantum statistical mechanics. These systems can forever locally remember information about their local initial conditions and are thus of interest for possibilities of storing quantum information. We discuss key features of many-body localization (MBL) and review a phenomenology of the MBL phase. Single-eigenstate statistical mechanics within the MBL phase reveal dynamically stable ordered phases, and phase transitions among them, that are invisible to equilibrium statistical mechanics and can occur at high energy and low spatial dimensionality, where equilibrium ordering is forbidden.

  3. Transmural heterogeneity of cellular level power output is reduced in human heart failure.

    PubMed

    Haynes, Premi; Nava, Kristofer E; Lawson, Benjamin A; Chung, Charles S; Mitov, Mihail I; Campbell, Stuart G; Stromberg, Arnold J; Sadayappan, Sakthivel; Bonnell, Mark R; Hoopes, Charles W; Campbell, Kenneth S

    2014-07-01

    Heart failure is associated with pump dysfunction and remodeling, but it is not yet known if the condition affects different transmural regions of the heart in the same way. We tested the hypotheses that the left ventricles of non-failing human hearts exhibit transmural heterogeneity of cellular level contractile properties, and that heart failure produces transmural region-specific changes in contractile function. Permeabilized samples were prepared from the sub-epicardial, mid-myocardial, and sub-endocardial regions of the left ventricular free wall of non-failing (n=6) and failing (n=10) human hearts. Power, an in vitro index of systolic function, was higher in non-failing mid-myocardial samples (0.59 ± 0.06 μW/mg) than in samples from the sub-epicardium (p=0.021) and the sub-endocardium (p=0.015). Non-failing mid-myocardial samples also produced more isometric force (14.3 ± 1.33 kN/m²) than samples from the sub-epicardium (p=0.008) and the sub-endocardium (p=0.026). Heart failure reduced power (p=0.009) and force (p=0.042) but affected the mid-myocardium more than the other transmural regions. Fibrosis increased with heart failure (p=0.021) and mid-myocardial tissue from failing hearts contained more collagen than matched sub-epicardial (p<0.001) and sub-endocardial (p=0.043) samples. Power output was correlated with the relative content of actin and troponin I, and was also statistically linked to the relative content and phosphorylation of desmin and myosin light chain-1. Non-failing human hearts exhibit transmural heterogeneity of contractile properties. In failing organs, region-specific fibrosis produces the greatest contractile deficits in the mid-myocardium. Targeting fibrosis and sarcomeric proteins in the mid-myocardium may be particularly effective therapies for heart failure. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. Patient safety: numerical skills and drug calculation abilities of nursing students and registered nurses.

    PubMed

    McMullan, Miriam; Jones, Ray; Lea, Susan

    2010-04-01

    This paper is a report of a correlational study of the relations of age, status, experience and drug calculation ability to the numerical ability of nursing students and Registered Nurses. Competent numerical and drug calculation skills are essential for nurses, as mistakes can put patients' lives at risk. A cross-sectional study was carried out in 2006 in one United Kingdom university. Validated numerical and drug calculation tests were given to 229 second year nursing students and 44 Registered Nurses attending a non-medical prescribing programme. The numeracy test was failed by 55% of students and 45% of Registered Nurses, while 92% of students and 89% of nurses failed the drug calculation test. Independent of status or experience, older participants (≥35 years) were statistically significantly more able to perform numerical calculations. There was no statistically significant difference between nursing students and Registered Nurses in their overall drug calculation ability, but nurses were statistically significantly more able than students to perform basic numerical calculations and calculations for solids, oral liquids and injections. Both nursing students and Registered Nurses were statistically significantly more able to perform calculations for solids, oral liquids and injections than calculations for drug percentages, drip and infusion rates. To prevent deskilling, Registered Nurses should continue to practise and refresh all the different types of drug calculations as often as possible, with regular (self-)testing of their ability. Time should be set aside in curricula for nursing students to learn how to perform basic numerical and drug calculations. This learning should be reinforced through regular practice and assessment.

  5. Statistical Exploration of Electronic Structure of Molecules from Quantum Monte-Carlo Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prabhat, Mr; Zubarev, Dmitry; Lester, Jr., William A.

    In this report, we present results from analysis of Quantum Monte Carlo (QMC) simulation data with the goal of determining the internal structure of the 3N-dimensional phase space of an N-electron molecule. We are interested in mining the simulation data for patterns that might be indicative of bond rearrangement as molecules change electronic states. We examined simulation output that tracks the positions of two coupled electrons in the singlet and triplet states of an H2 molecule. The electrons trace out a trajectory, which was analyzed with a number of statistical techniques. This project was intended to address the following scientific questions: (1) Do high-dimensional phase spaces characterizing the electronic structure of molecules tend to cluster in any natural way? Do we see a change in clustering patterns as we explore different electronic states of the same molecule? (2) Since it is hard to understand the high-dimensional space of trajectories, can we project these trajectories to a lower dimensional subspace to gain a better understanding of patterns? (3) Do trajectories inherently lie in a lower-dimensional manifold? Can we recover that manifold? After extensive statistical analysis, we are now in a better position to respond to these questions. (1) We definitely see clustering patterns, and differences between the H2 and H2tri datasets. These are revealed by the pamk method in a fairly reliable manner and can potentially be used to distinguish bonded and non-bonded systems and gain insight into the nature of bonding. (2) Projecting to a lower dimensional subspace (~4-5 components) using PCA or kernel PCA reveals interesting patterns in the distribution of scalar values, which can be related to existing descriptors of the electronic structure of molecules. These results can also be used immediately to develop robust tools for analysis of noisy data obtained during QMC simulations. (3) All dimensionality reduction and estimation techniques that we tried seem to indicate that one needs 4 or 5 components to account for most of the variance in the data, hence this 5D dataset does not necessarily lie on a well-defined, low dimensional manifold. In terms of specific clustering techniques, K-means was generally useful in exploring the dataset. The partition around medoids (PAM) technique produced the most definitive results for our data, showing distinctive patterns for both a sample of the complete data and time-series. The gap statistic with the Tibshirani criterion did not provide any distinction across the two datasets. The gap statistic with the DandF criterion, model-based clustering and hierarchical modeling simply failed to run on our datasets. Thankfully, the vanilla PCA technique was successful in handling our entire dataset. PCA revealed some interesting patterns in the scalar value distribution. Kernel PCA techniques (vanilladot, RBF, polynomial) and MDS failed to run on the entire dataset, or even a significant fraction of the dataset, and we resorted to creating an explicit feature map followed by conventional PCA. Clustering using K-means and PAM in the new basis set seems to produce promising results. Understanding the new basis set in the scientific context of the problem is challenging, and we are currently working to further examine and interpret the results.
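
    A compact analogue of the PCA-plus-clustering pipeline described above (synthetic 6-D stand-in data, scikit-learn in place of the R routines the report names):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(42)

# Hypothetical stand-in for QMC walker output: 6-D configurations
# (x, y, z for two electrons) drawn from two overlapping 'states'.
a = rng.normal(0.0, 1.0, size=(2000, 6))
b = rng.normal(1.5, 1.0, size=(2000, 6))
X = np.vstack([a, b])

X5 = PCA(n_components=5).fit_transform(X)   # project to ~5 components
for k in (2, 3, 4):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X5)
    print(f"k={k}: silhouette = {silhouette_score(X5, labels):.3f}")
```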

  6. Poor-quality antimalarial drugs in southeast Asia and sub-Saharan Africa.

    PubMed

    Nayyar, Gaurvika M L; Breman, Joel G; Newton, Paul N; Herrington, James

    2012-06-01

    Poor-quality antimalarial drugs lead to drug resistance and inadequate treatment, which pose an urgent threat to vulnerable populations and jeopardise progress and investments in combating malaria. Emergence of artemisinin resistance or tolerance in Plasmodium falciparum on the Thailand-Cambodia border makes protection of the effectiveness of the drug supply imperative. We reviewed published and unpublished studies reporting chemical analyses and assessments of packaging of antimalarial drugs. Of 1437 samples of drugs in five classes from seven countries in southeast Asia, 497 (35%) failed chemical analysis, 423 (46%) of 919 failed packaging analysis, and 450 (36%) of 1260 were classified as falsified. In 21 surveys of drugs from six classes from 21 countries in sub-Saharan Africa, 796 (35%) of 2297 failed chemical analysis, 28 (36%) of 77 failed packaging analysis, and 79 (20%) of 389 were classified as falsified. Data were insufficient to identify the frequency of substandard (products resulting from poor manufacturing) antimalarial drugs, and packaging analysis data were scarce. Concurrent interventions and a multifaceted approach are needed to define and eliminate criminal production, distribution, and poor manufacturing of antimalarial drugs. Empowering of national medicine regulatory authorities to protect the global drug supply is more important than ever. Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. Automating approximate Bayesian computation by local linear regression.

    PubMed

    Thornton, Kevin R

    2009-07-07

    In several biological contexts, parameter inference often relies on computationally intensive techniques. "Approximate Bayesian Computation", or ABC, methods based on summary statistics have become increasingly popular. A particular flavor of ABC, based on using a linear regression to approximate the posterior distribution of the parameters conditional on the summary statistics, is computationally appealing, yet no standalone tool exists to automate the procedure. Here, I describe a program to implement the method. The software package ABCreg implements the local linear-regression approach to ABC. The advantages are: 1. The code is standalone, and fully documented. 2. The program will automatically process multiple data sets, and create unique output files for each (which may be processed immediately in R), facilitating the testing of inference procedures on simulated data, or the analysis of multiple data sets. 3. The program implements two different transformation methods for the regression step. 4. Analysis options are controlled on the command line by the user, and the program is designed to output warnings for cases where the regression fails. 5. The program does not depend on any particular simulation machinery (coalescent, forward-time, etc.), and therefore is a general tool for processing the results from any simulation. 6. The code is open-source, and modular. Examples of applying the software to empirical data from Drosophila melanogaster, and of testing the procedure on simulated data, are shown. In practice, ABCreg simplifies implementing ABC based on local-linear regression.
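
    The regression-adjustment step that ABCreg automates is short enough to sketch directly; below is a Beaumont-style local linear adjustment with uniform weights (the published method uses an Epanechnikov kernel) on a toy normal-mean problem:

```python
import numpy as np

rng = np.random.default_rng(5)

def abc_reg(theta_sim, s_sim, s_obs, accept_frac=0.01):
    """Rejection ABC with a local linear-regression adjustment: keep the
    closest simulations, regress theta on the summaries among the accepted,
    then shift each accepted draw to the observed summaries."""
    d = np.linalg.norm(s_sim - s_obs, axis=1)
    keep = d <= np.quantile(d, accept_frac)
    s_k, t_k = s_sim[keep], theta_sim[keep]
    design = np.column_stack([np.ones(keep.sum()), s_k - s_obs])
    coef, *_ = np.linalg.lstsq(design, t_k, rcond=None)
    return t_k - (s_k - s_obs) @ coef[1:]

# Toy problem: infer a normal mean from (sample mean, sample sd) summaries.
obs = rng.normal(3.0, 1.0, size=50)
s_obs = np.array([obs.mean(), obs.std()])
theta = rng.uniform(-10, 10, size=100_000)                  # prior draws
sims = rng.normal(theta[:, None], 1.0, size=(100_000, 50))  # simulated data
s_sim = np.column_stack([sims.mean(1), sims.std(1)])
post = abc_reg(theta, s_sim, s_obs)
print(f"posterior mean ~ {post.mean():.2f}, sd ~ {post.std():.2f}")
```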

  8. Psychometric considerations in the measurement of event-related brain potentials: Guidelines for measurement and reporting.

    PubMed

    Clayson, Peter E; Miller, Gregory A

    2017-01-01

    Failing to consider psychometric issues related to reliability and validity, differential deficits, and statistical power potentially undermines the conclusions of a study. In research using event-related brain potentials (ERPs), numerous contextual factors (population sampled, task, data recording, analysis pipeline, etc.) can impact the reliability of ERP scores. The present review considers the contextual factors that influence ERP score reliability and the downstream effects that reliability has on statistical analyses. Given the context-dependent nature of ERPs, it is recommended that ERP score reliability be formally assessed on a study-by-study basis. Recommended guidelines for ERP studies include 1) reporting the threshold of acceptable reliability and reliability estimates for observed scores, 2) specifying the approach used to estimate reliability, and 3) justifying how trial-count minima were chosen. A reliability threshold for internal consistency of at least 0.70 is recommended, and a threshold of 0.80 is preferred. The review also advocates the use of generalizability theory for estimating score dependability (the generalizability theory analog to reliability) as an improvement on classical test theory reliability estimates, suggesting that the latter is less well suited to ERP research. To facilitate the calculation and reporting of dependability estimates, an open-source Matlab program, the ERP Reliability Analysis Toolbox, is presented. Copyright © 2016 Elsevier B.V. All rights reserved.
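
    As a point of reference, the classical-test-theory estimate that the review argues is less well suited to ERPs, a permutation-based split-half reliability with the Spearman-Brown correction, can be computed as follows on hypothetical subject-by-trial amplitudes:

```python
import numpy as np

rng = np.random.default_rng(11)

def split_half_reliability(trials, n_splits=1000):
    """Repeatedly split trials in half, correlate the half means across
    subjects, and apply the Spearman-Brown step-up correction."""
    n_subj, n_trials = trials.shape
    rs = []
    for _ in range(n_splits):
        idx = rng.permutation(n_trials)
        h1 = trials[:, idx[: n_trials // 2]].mean(axis=1)
        h2 = trials[:, idx[n_trials // 2:]].mean(axis=1)
        r = np.corrcoef(h1, h2)[0, 1]
        rs.append(2 * r / (1 + r))   # Spearman-Brown correction
    return float(np.mean(rs))

# Hypothetical amplitudes: 40 subjects x 30 trials, true score + trial noise.
true = rng.normal(5, 2, size=(40, 1))
data = true + rng.normal(0, 4, size=(40, 30))
print(f"split-half reliability ~ {split_half_reliability(data):.2f}")
```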

  9. Comparison of antimicrobial efficacy of propolis, Morinda citrifolia, Azadirachta indica (Neem) and 5% sodium hypochlorite on Candida albicans biofilm formed on tooth substrate: An in-vitro study

    PubMed Central

    Tyagi, Shashi Prabha; Sinha, Dakshita Joy; Garg, Paridhi; Singh, Udai Pratap; Mishra, Chandrakar Chaman; Nagpal, Rajni

    2013-01-01

    Introduction: Endodontic infections are polymicrobial in nature. Candida albicans is the most common fungus isolated from failed endodontic cases. The constant increase in antibiotic-resistant strains and the side-effects caused by synthetic drugs have prompted researchers to look for herbal alternatives such as propolis, Morinda citrifolia and Azadirachta indica (Neem), since the gold standard for irrigation, sodium hypochlorite, has many disadvantages. Materials and Methods: Extracted human mandibular premolars were biomechanically prepared, vertically sectioned, and placed in tissue culture wells exposing the root canal surface to C. albicans grown on Sabouraud Dextrose Agar to form a biofilm. At the end of 2 days, all groups were treated with the test solutions and control for 10 min and evaluated for Candida growth and number of colony forming units. The readings were subjected to statistical analysis using analysis of variance and post hoc Tukey tests. Results: The sodium hypochlorite and propolis groups exhibited the highest antimicrobial efficacy against C. albicans, with no statistically significant difference between them. They were followed by the A. indica (Neem) group. M. citrifolia had limited antifungal action, followed by the negative control group of saline. Conclusion: According to the results of this study, propolis can be used as an antifungal agent with effectiveness similar to that of sodium hypochlorite, although long-term in vivo studies are warranted. PMID:24347888

  10. TOLERANCE TIME OF EXPERIMENTAL THERMAL PAIN (COLD INDUCED) IN VOLUNTEERS.

    PubMed

    Vaid, V N; Wilkhoo, N S; Jain, A K

    1998-10-01

    Perception of thermal pain (cold induced) was studied in 106 volunteers from troops and civilians deployed in J & K. The thermal stimulus devised was "holding ice", and the tolerance time of holding ice was taken as a measure of thermal sensitivity. Volunteers were classified based on their native areas, addiction habits and socio-economic status. Of the 106 volunteers, 81 could and 25 could not hold ice over 10 min. Sixteen out of 40 from coastline states and 9 out of 66 from non-coastline states failed to hold ice over 10 min. In the "below average", "average" and "high average" socio-economic groups, 3 out of 27, 19 out of 73 and 3 out of 6, respectively, failed to hold ice over 10 min. Fifteen out of 64 from the "addiction habit" group and 10 out of 42 from the "no addiction habit" group failed to hold ice over 10 min. Statistically, no classification used in the study revealed a significant difference in tolerance times except the one based on coastline and non-coastline states.

  11. To Leave or Not to Leave? A Regression Discontinuity Analysis of the Impact of Failing High School Exit Exam. CEE DP 107

    ERIC Educational Resources Information Center

    Ou, Dongshu

    2009-01-01

    This paper presents new empirical evidence on whether failing the high school exit exam increases the chance of exiting from high school "prior to high school completion". More importantly, the author discusses the potentially different impacts of failing the High School Exit Exams (HSEE) on students with limited English proficiency,…

  12. [Biomechanical investigation of the tensile strength of tendon sutures - locking sutures increase stability].

    PubMed

    Betz, C; Schleicher, P; Winkel, R; Hoffmann, R

    2013-02-01

    In this study we examined the tensile strength of core sutures of tendons, in particular the effect of having 2 or 4 strands in the core suture, as well as the effect of additional locking sutures, on the tensile strength. 60 flexor tendons from the forepaws of freshly slaughtered pigs were harvested for biomechanical testing. They were divided into 4 groups (A, B, C and D) of 15 sutures each. Group A: core suture after Zechner with 2 strands; group B: modified core suture with 4 strands; group C: modified core suture with 2 strands and 4 locking sutures; group D: modified core suture with 4 strands and 4 locking sutures. The primary tensile strength of the sutures was measured in Newton using a testing machine with a traction speed of 0.1 mm/s. Simultaneously, the increasing gap forming at the suture was filmed against graph paper. Our outcome measures were the force, measured in Newton, at which a 2-mm gap formed, as well as the force at which the suture failed. Statistical analysis was carried out with the software SPSS to produce a multivariate analysis with a statistical significance of p<0.05. Results are presented as averages including the 1st and 3rd quartiles (1Q/3Q). Under traction to form a 2-mm gap, the force measured in group A was 14.2 N (12.9/15.1 N). In group B the force of 22.5 N (20.0/24.7 N) was significantly higher (p<0.05). Group C required a traction force of 28.7 N (23.5/35.8 N), which was significantly higher than for groups A and B. Group D required the highest traction force, 42.0 N (39.5/46.0 N), to produce a 2-mm gap; this was significantly higher than in all other groups. The force required for the suture to fail was 19.9 N (17.9/22.8 N) in group A, 26.2 N (24.5/29.7 N) in group B, 32.0 N (27.1/40.1 N) in group C and 46.5 N (41.5/50.0 N) in group D; the differences between the groups were all statistically significant. The primary tensile strength of core sutures after Zechner on flexor tendons from the forepaws of pigs was significantly increased by doubling the number of strands and also by the use of 4 additional locking sutures. © Georg Thieme Verlag KG Stuttgart · New York.

  13. EventThread: Visual Summarization and Stage Analysis of Event Sequence Data.

    PubMed

    Guo, Shunan; Xu, Ke; Zhao, Rongwen; Gotz, David; Zha, Hongyuan; Cao, Nan

    2018-01-01

    Event sequence data such as electronic health records, a person's academic records, or car service records, are ordered series of events which have occurred over a period of time. Analyzing collections of event sequences can reveal common or semantically important sequential patterns. For example, event sequence analysis might reveal frequently used care plans for treating a disease, typical publishing patterns of professors, and the patterns of service that result in a well-maintained car. It is challenging, however, to visually explore large numbers of event sequences, or sequences with large numbers of event types. Existing methods focus on extracting explicitly matching patterns of events using statistical analysis to create stages of event progression over time. However, these methods fail to capture latent clusters of similar but not identical evolutions of event sequences. In this paper, we introduce a novel visualization system named EventThread which clusters event sequences into threads based on tensor analysis and visualizes the latent stage categories and evolution patterns by interactively grouping the threads by similarity into time-specific clusters. We demonstrate the effectiveness of EventThread through usage scenarios in three different application domains and via interviews with an expert user.

  14. Statistical downscaling of general-circulation-model- simulated average monthly air temperature to the beginning of flowering of the dandelion (Taraxacum officinale) in Slovenia

    NASA Astrophysics Data System (ADS)

    Bergant, Klemen; Kajfež-Bogataj, Lučka; Črepinšek, Zalika

    2002-02-01

    Phenological observations are a valuable source of information for investigating the relationship between climate variation and plant development. Potential climate change in the future will shift the occurrence of phenological phases. Information about future climate conditions is needed in order to estimate this shift. General circulation models (GCMs) provide the best information about future climate change. They are able to simulate reliably the most important mean features on a large scale, but they fail on a regional scale because of their low spatial resolution. A common approach to bridging the scale gap is statistical downscaling, which was used to relate the beginning of flowering of Taraxacum officinale in Slovenia with the monthly mean near-surface air temperature for January, February and March in Central Europe. Statistical models were developed and tested with NCAR/NCEP Reanalysis predictor data and EARS predictand data for the period 1960-1999. Prior to developing statistical models, empirical orthogonal function (EOF) analysis was employed on the predictor data. Multiple linear regression was used to relate the beginning of flowering with the expansion coefficients of the first three EOFs for the January, February and March air temperatures, and a strong correlation was found between them. The developed statistical models were applied to the results of two GCMs (HadCM3 and ECHAM4/OPYC3) to estimate the potential shifts in the beginning of flowering for the periods 1990-2019 and 2020-2049 in comparison with the period 1960-1989. The HadCM3 model predicts, on average, a 4-day earlier occurrence of flowering in the period 1990-2019, and ECHAM4/OPYC3 a 5-day earlier occurrence. The analogous results for the period 2020-2049 are a 10- and 11-day earlier occurrence.
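
    The downscaling chain described here, EOF decomposition of the temperature field followed by multiple linear regression on the leading expansion coefficients, can be sketched end to end on synthetic data (all fields and the flowering response are invented):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical 40-year record: gridded Jan-Mar temperature anomalies
# (40 years x 60 grid points) and an observed flowering day-of-year.
years, grid = 40, 60
temp = rng.normal(0, 1, size=(years, grid))

# EOF analysis = PCA via SVD of the anomaly matrix.
anom = temp - temp.mean(axis=0)
u, sv, vt = np.linalg.svd(anom, full_matrices=False)
pcs = u[:, :3] * sv[:3]                     # expansion coefficients of EOF 1-3

# Flowering responds to the leading pattern plus noise (synthetic truth).
flowering = 110 - 3.0 * pcs[:, 0] + rng.normal(0, 2, years)

X = np.column_stack([np.ones(years), pcs])  # multiple linear regression
beta, *_ = np.linalg.lstsq(X, flowering, rcond=None)
pred = X @ beta
print("regression coefficients:", np.round(beta, 2))
print("correlation(obs, fit):", round(np.corrcoef(flowering, pred)[0, 1], 2))
```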

  15. Influence of bisphosphonates on alveolar bone loss around osseointegrated implants.

    PubMed

    Zahid, Talal M; Wang, Bing-Yan; Cohen, Robert E

    2011-06-01

    The relationship between bisphosphonates (BP) and dental implant failure has not been fully elucidated. The purpose of this retrospective radiographic study was to examine whether patients who take BP are at greater risk of implant failure than patients not using those agents. Treatment records of 362 consecutively treated patients receiving endosseous dental implants were reviewed. The patient population consisted of 227 women and 135 men with a mean age of 56 years (range: 17-87 years), treated in the University at Buffalo Postgraduate Clinic from 1997-2008. Demographic information collected included age, gender and smoking status, as well as systemic conditions and medication use. Implant characteristics reviewed included system, date of placement, date of follow-up radiographs, surgical complications, number of exposed threads, and implant failure. The relationship between BP and implant failure was analyzed using generalized estimating equation (GEE) analysis. Twenty-six patients using BP received a total of 51 dental implants. Three implants failed, yielding success rates of 94.11% and 88.46% for the implant-based and subject-based analyses, respectively. Using the GEE statistical method, we found a statistically significant (P = .001; OR = 3.25) association between the use of BP and implant thread exposure. None of the other variables studied were statistically associated with implant failure or thread exposure. In conclusion, patients taking BP may be at higher risk for implant thread exposure.
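
    A minimal sketch of a GEE of this shape, implants clustered within patients with a binary thread-exposure outcome, using statsmodels; the data are simulated, not the study's:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(9)

# Hypothetical implant-level data: several implants per patient, so
# observations are clustered and GEE with exchangeable correlation is used.
n_impl = 600
df = pd.DataFrame({
    "patient": rng.integers(0, 300, n_impl),   # clustering variable
    "bp_use": rng.integers(0, 2, n_impl),
    "age": rng.normal(56, 12, n_impl),
})
logit = -2.5 + 1.2 * df["bp_use"] + 0.01 * (df["age"] - 56)
df["thread_exposure"] = (rng.uniform(size=n_impl)
                         < 1 / (1 + np.exp(-logit))).astype(int)

model = smf.gee("thread_exposure ~ bp_use + age", groups="patient", data=df,
                family=sm.families.Binomial(),
                cov_struct=sm.cov_struct.Exchangeable())
res = model.fit()
print(np.exp(res.params["bp_use"]))   # odds ratio for BP use
```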

  16. Defining failed induction of labor.

    PubMed

    Grobman, William A; Bailit, Jennifer; Lai, Yinglei; Reddy, Uma M; Wapner, Ronald J; Varner, Michael W; Thorp, John M; Leveno, Kenneth J; Caritis, Steve N; Prasad, Mona; Tita, Alan T N; Saade, George; Sorokin, Yoram; Rouse, Dwight J; Blackwell, Sean C; Tolosa, Jorge E

    2018-01-01

    While there are well-accepted standards for the diagnosis of arrested active-phase labor, the definition of a "failed" induction of labor remains less certain. One approach to diagnosing a failed induction is based on the duration of the latent phase. However, a standard for the minimum duration that the latent phase of a labor induction should continue, absent acute maternal or fetal indications for cesarean delivery, remains lacking. The objective of this study was to determine the frequency of adverse maternal and perinatal outcomes as a function of the duration of the latent phase among nulliparous women undergoing labor induction. This study is based on data from an obstetric cohort of women delivering at 25 US hospitals from 2008 through 2011. Nulliparous women who had a term singleton gestation in the cephalic presentation were eligible for this analysis if they underwent a labor induction. Consistent with prior studies, the latent phase was determined to begin once cervical ripening had ended, oxytocin was initiated, and rupture of membranes had occurred, and was determined to end once 5-cm dilation was achieved. The frequencies of cesarean delivery, as well as of adverse maternal (eg, postpartum hemorrhage, chorioamnionitis) and perinatal (eg, a composite frequency of seizures, sepsis, bone or nerve injury, encephalopathy, or death) outcomes, were compared as a function of the duration of the latent phase (analyzed with time both as a continuous measure and categorized in 3-hour increments). A total of 10,677 women were available for analysis. In the vast majority (96.4%) of women, the active phase had been reached by 15 hours. The longer the duration of a woman's latent phase, the greater her chance of ultimately undergoing a cesarean delivery (P < .001, for time both as a continuous and categorical independent variable), although >40% of women whose latent phase lasted ≥18 hours still had a vaginal delivery. Several maternal morbidities, such as postpartum hemorrhage (P < .001) and chorioamnionitis (P < .001), increased in frequency as the length of latent phase increased. Conversely, the frequencies of most adverse perinatal outcomes were statistically stable over time. The large majority of women undergoing labor induction will have entered the active phase by 15 hours after oxytocin has started and rupture of membranes has occurred. Maternal adverse outcomes become statistically more frequent with greater time in the latent phase, although the absolute increase in frequency is relatively small. These data suggest that cesarean delivery should not be undertaken during the latent phase prior to at least 15 hours after oxytocin and rupture of membranes have occurred. The decision to continue labor beyond this point should be individualized, and may take into account factors such as other evidence of labor progress. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. [Air-conditioner disease. Results of an industrial medicine survey (author's transl)].

    PubMed

    Molina, C; Aiache, J M; Bedu, M; Menaut, P; Wahl, D; Brestowski, J; Grall, Y

    1982-07-03

    The results of a survey conducted in a company employing 1850 persons working in air-conditioned premises are reported. Eight hundred and ten persons were examined, including 790 who mostly complained of respiratory disorders and 20 controls. Regular check-ups during the last two years had failed to reveal any serious disease. The most frequent complaints were rhinitis and tracheitis, especially among female employees. No alveolitis was observed. The finding of Bacillus subtilis in samples of ambient air and air-conditioner filters, in conjunction with the presence of precipitating antibodies against crude extracts from these samples, suggested that the respiratory disorders might have been due to this microorganism. A multifactorial analysis demonstrated a statistically significant correlation between clinical symptoms and immunological disorders. Air-conditioner disease, therefore, may present as a benign condition.

  18. The generalization of the Mermin-Wagner theorem and the possibility of long-range order in the isotropic discrete one-dimensional quantum Heisenberg model

    NASA Astrophysics Data System (ADS)

    Rudoy, Yu. G.; Kotelnikova, O. A.

    2012-10-01

    The problem of the existence of long-range order in the isotropic quantum Heisenberg model on the D=1 lattice is reconsidered in view of the possibility of a sufficiently slowly decaying exchange interaction with infinite effective radius. It is shown that the macroscopic arguments given by Landau and Lifshitz, and then supported microscopically by Mermin and Wagner, fail for this case, so that a non-zero spontaneous magnetization may yet exist. This result was anticipated by Thouless on the grounds of phenomenological analysis, and we give its microscopic foundation, which amounts to a generalization of the Mermin-Wagner theorem to the case of an infinite second moment of the exchange interaction. Two models well known in lattice statistics, Kac-I and Kac-II, illustrate our results.

  19. Diagnosing and dealing with multicollinearity.

    PubMed

    Schroeder, M A

    1990-04-01

    The purpose of this article was to increase nurse researchers' awareness of the effects of collinear data in developing theoretical models for nursing practice. Collinear data distort the true value of the estimates generated from ordinary least-squares analysis. Theoretical models developed to provide the underpinnings of nursing practice need not be abandoned, however, because they fail to produce consistent estimates over repeated applications. It is also important to realize that multicollinearity is a data problem, not a problem associated with misspecification of a theoretical model. An investigator must first be aware of the problem; then it is possible to develop an educated solution based on the degree of multicollinearity, theoretical considerations, and the sources of error associated with alternative, biased, least-squares regression techniques. Decisions based on theoretical and statistical considerations will further the development of theory-based nursing practice.
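
    The usual first diagnostic for the problem described here is the variance inflation factor; a minimal sketch with one deliberately near-collinear predictor:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(4)

# Hypothetical predictors: x2 is nearly a linear function of x1, so both
# carry inflated coefficient variance in an OLS fit.
n = 200
x1 = rng.normal(size=n)
x2 = 0.95 * x1 + rng.normal(scale=0.1, size=n)
x3 = rng.normal(size=n)
X = sm.add_constant(pd.DataFrame({"x1": x1, "x2": x2, "x3": x3}))

for i, name in enumerate(X.columns[1:], start=1):   # skip the constant
    print(f"VIF({name}) = {variance_inflation_factor(X.values, i):.1f}")
```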

  20. [Fractographic analysis of clinically failed anterior all ceramic crowns].

    PubMed

    DU, Qian; Zhou, Min-bo; Zhang, Xin-ping; Zhao, Ke

    2012-04-01

    To identify the site of crack initiation and propagation path of clinically failed all ceramic crowns by fractographic analysis. Three clinically failed anterior IPS Empress II crowns and two anterior In-Ceram alumina crowns were retrieved. Fracture surfaces were examined using both optical stereo and scanning electron microscopy. Fractographic theory and fracture mechanics principles were applied to disclose the damage characteristics and fracture mode. All the crowns failed by cohesive failure within the veneer on the labial surface. Critical crack originated at the incisal contact area and propagated gingivally. Porosity was found within the veneer because of slurry preparation and the sintering of veneer powder. Cohesive failure within the veneer is the main failure mode of all ceramic crown. Veneer becomes vulnerable when flaws are present. To reduce the chances of chipping, multi-point occlusal contacts are recommended, and layering and sintering technique of veneering layer should also be improved.

  1. A Comparison of Multiscale Permutation Entropy Measures in On-Line Depth of Anesthesia Monitoring.

    PubMed

    Su, Cui; Liang, Zhenhu; Li, Xiaoli; Li, Duan; Li, Yongwang; Ursino, Mauro

    2016-01-01

    Multiscale permutation entropy (MSPE) has become an interesting tool for exploring neurophysiological mechanisms in recent years. In this study, six MSPE measures were proposed for on-line depth of anesthesia (DoA) monitoring to quantify the anesthetic effect on real-time EEG recordings. The performance of these measures in describing the transient characteristics of simulated neural populations and clinical anesthesia EEG was evaluated and compared. Six MSPE algorithms-derived from Shannon permutation entropy (SPE), Renyi permutation entropy (RPE) and Tsallis permutation entropy (TPE) combined with the decomposition procedures of the coarse-graining (CG) method and moving average (MA) analysis-were studied. A thalamo-cortical neural mass model (TCNMM) was used to generate noise-free EEG under anesthesia to quantitatively assess the robustness of each MSPE measure against noise. Then, the clinical anesthesia EEG recordings from 20 patients were analyzed with these measures. To validate their effectiveness, the abilities of the six measures were compared in terms of tracking the dynamical changes in EEG data and their performance in state discrimination. The Pearson correlation coefficient (R) was used to assess the relationship among MSPE measures. CG-based MSPEs failed in on-line DoA monitoring under multiscale analysis. In on-line EEG analysis, the MA-based MSPE measures at 5 decomposed scales could track the transient changes of EEG recordings and statistically distinguish the awake state, unconsciousness, and the recovery of consciousness (RoC) state. Compared to single-scale SPE and RPE, MSPEs had better anti-noise ability, and MA-RPE at scale 5 performed best in this respect. MA-TPE outperformed the other measures with a faster tracking speed of the loss of consciousness. MA-based multiscale permutation entropies have potential for on-line anesthesia EEG analysis given their simple computation and sensitivity to drug effect changes. CG-based multiscale permutation entropies may fail to describe the characteristics of EEG at high decomposition scales.
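
    For readers unfamiliar with the underlying quantities, the following Python sketch shows how a normalized Shannon permutation entropy is computed at several coarse-graining scales. It uses the standard Bandt-Pompe ordinal-pattern definition; the parameter values and the random test signal are illustrative assumptions, not the study's actual pipeline.

        import numpy as np
        from math import factorial

        def permutation_entropy(x, m=3, tau=1):
            """Normalized Shannon permutation entropy (Bandt-Pompe)."""
            x = np.asarray(x)
            n = len(x) - (m - 1) * tau
            patterns = {}
            for i in range(n):
                window = x[i:i + m * tau:tau]
                pat = tuple(np.argsort(window))       # ordinal pattern
                patterns[pat] = patterns.get(pat, 0) + 1
            p = np.array(list(patterns.values()), dtype=float) / n
            return -np.sum(p * np.log(p)) / np.log(factorial(m))

        def coarse_grain(x, scale):
            """Non-overlapping averages of length `scale` (the CG step)."""
            n = len(x) // scale
            return np.asarray(x[:n * scale]).reshape(n, scale).mean(axis=1)

        rng = np.random.default_rng(1)
        sig = rng.normal(size=3000)   # stand-in for one EEG epoch
        for s in (1, 2, 5):
            print(s, permutation_entropy(coarse_grain(sig, s), m=3))

    The MA decomposition mentioned in the record replaces the non-overlapping averages with a moving (rolling) average, which keeps more samples at large scales.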

  2. Evaluation of Evidence of Statistical Support and Corroboration of Subgroup Claims in Randomized Clinical Trials.

    PubMed

    Wallach, Joshua D; Sullivan, Patrick G; Trepanowski, John F; Sainani, Kristin L; Steyerberg, Ewout W; Ioannidis, John P A

    2017-04-01

    Many published randomized clinical trials (RCTs) make claims for subgroup differences. To evaluate how often subgroup claims reported in the abstracts of RCTs are actually supported by statistical evidence (P < .05 from an interaction test) and corroborated by subsequent RCTs and meta-analyses. This meta-epidemiological survey examines data sets of trials with at least 1 subgroup claim, including Subgroup Analysis of Trials Is Rarely Easy (SATIRE) articles and Discontinuation of Randomized Trials (DISCO) articles. We used Scopus (updated July 2016) to search for English-language articles citing each of the eligible index articles with at least 1 subgroup finding in the abstract. Articles with a subgroup claim in the abstract with or without evidence of statistical heterogeneity (P < .05 from an interaction test) in the text and articles attempting to corroborate the subgroup findings. Study characteristics of trials with at least 1 subgroup claim in the abstract were recorded. Two reviewers extracted the data necessary to calculate subgroup-level effect sizes, standard errors, and the P values for interaction. For individual RCTs and meta-analyses that attempted to corroborate the subgroup findings from the index articles, trial characteristics were extracted. Cochran Q test was used to reevaluate heterogeneity with the data from all available trials. The number of subgroup claims in the abstracts of RCTs, the number of subgroup claims in the abstracts of RCTs with statistical support (subgroup findings), and the number of subgroup findings corroborated by subsequent RCTs and meta-analyses. Sixty-four eligible RCTs made a total of 117 subgroup claims in their abstracts. Of these 117 claims, only 46 (39.3%) in 33 articles had evidence of statistically significant heterogeneity from a test for interaction. In addition, out of these 46 subgroup findings, only 16 (34.8%) ensured balance between randomization groups within the subgroups (eg, through stratified randomization), 13 (28.3%) entailed a prespecified subgroup analysis, and 1 (2.2%) was adjusted for multiple testing. Only 5 (10.9%) of the 46 subgroup findings had at least 1 subsequent pure corroboration attempt by a meta-analysis or an RCT. In all 5 cases, the corroboration attempts found no evidence of a statistically significant subgroup effect. In addition, all effect sizes from meta-analyses were attenuated toward the null. A minority of subgroup claims made in the abstracts of RCTs are supported by their own data (ie, a significant interaction effect). For those that have statistical support (P < .05 from an interaction test), most fail to meet other best practices for subgroup tests, including prespecification, stratified randomization, and adjustment for multiple testing. Attempts to corroborate statistically significant subgroup differences are rare; when done, the initially observed subgroup differences are not reproduced.
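
    The statistical support criterion used above (P < .05 from an interaction test) can be illustrated with a minimal sketch: a z-test comparing two subgroup effect estimates on the log scale. The numbers below are hypothetical and are not drawn from any of the reviewed trials.

        from math import sqrt, log
        from scipy.stats import norm

        # Hypothetical subgroup estimates: log hazard ratios and standard errors.
        est_a, se_a = log(0.70), 0.12   # subgroup A
        est_b, se_b = log(0.95), 0.15   # subgroup B

        # z-test for the difference of the two subgroup estimates.
        z = (est_a - est_b) / sqrt(se_a**2 + se_b**2)
        p_interaction = 2 * norm.sf(abs(z))
        print(f"z = {z:.2f}, interaction P = {p_interaction:.3f}")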

  3. 75 FR 59234 - Agency Information Collection Activities; Submission for Office of Management and Budget Review...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-27

    ... 1508 fail to comply in a manner to warrant a recall, the required records can be used by the... the recall. The Commission will consider all comments received in response to this notice before...) multiplied by a cost of $27.78 per hour (Bureau of Labor Statistics, Total Compensation, All workers, goods...

  4. THE RAPID GROWTH OF COMMUNITY COLLEGES AND THEIR ACCESSIBILITY IN RURAL AREAS.

    ERIC Educational Resources Information Center

    ELDRIDGE, DONALD A.

    THE COURSE OFFERINGS IN SOME JUNIOR COLLEGES FAIL TO MEET ADEQUATELY THE UNIQUE NEEDS OF RURAL YOUTH. A STUDY IN 1964 REVEALED THAT ONLY TWENTY OF THE SEVENTY JUNIOR COLLEGES IN CALIFORNIA OFFERED TRAINING IN AGRICULTURE, ALTHOUGH THE RECENTLY PUBLISHED "DIRECTORY OF JUNIOR COLLEGES" SHOWS AN INCREASE TO SIXTY. FURTHER STATISTICS REVEAL THAT 253…

  5. Faculty and Student Relationships: Context Matters

    ERIC Educational Resources Information Center

    Hoffman, Elin Meyers

    2014-01-01

    As many as 42% of first and second year students at post-secondary institutions fail to complete their degrees, and of those students, only 15-25% of them drop out due to poor academic performance or for financial reasons. The remainder of them leave college for reasons that are less clear (National Center for Education Statistics 2012). However,…

  6. Accuracy vs. Validity, Consistency vs. Reliability, and Fairness vs. Absence of Bias: A Call for Quality

    ERIC Educational Resources Information Center

    Lang, W. Steve; Wilkerson, Judy R.

    2008-01-01

    The National Council for Accreditation of Teacher Education (NCATE, 2002) requires teacher education units to develop assessment systems and evaluate both the success of candidates and unit operations. Because of a stated, but misguided, fear of statistics, NCATE fails to use accepted terminology to assure the quality of institutional evaluative…

  7. A Statistical Ontology-Based Approach to Ranking for Multiword Search

    ERIC Educational Resources Information Center

    Kim, Jinwoo

    2013-01-01

    Keyword search is a prominent data retrieval method for the Web, largely because the simple and efficient nature of keyword processing allows a large amount of information to be searched with fast response. However, keyword search approaches do not formally capture the clear meaning of a keyword query and fail to address the semantic relationships…

  8. Successful Boys and Literacy: Are "Literate Boys" Challenging or Repackaging Hegemonic Masculinity?

    ERIC Educational Resources Information Center

    Skelton, Christine; Francis, Becky

    2011-01-01

    The National Assessment of Educational Progress statistics show that boys are underachieving in literacy compared to girls. Attempts to redress the problem in various Global North countries and particularly Australia and the United Kingdom have failed to make any impact. However, there are boys who are doing well in literacy. The aim of this…

  9. If Life Happened but a Degree Didn't: Examining Factors That Impact Adult Student Persistence

    ERIC Educational Resources Information Center

    Bergman, Mathew; Gross, Jacob P. K.; Berry, Matt; Shuck, Brad

    2014-01-01

    Roughly half of all undergraduate students in the United States fail to persist to degree completion (American College Testing [ACT], 2010; Tinto, 1993; U.S. Department of Education, National Center for Education Statistics, 2013). Adult students often have higher levels of attrition than traditional-age students (Justice & Dornan, 2001;…

  10. To Leave or Not to Leave? A Regression Discontinuity Analysis of the Impact of Failing the High School Exit Exam

    ERIC Educational Resources Information Center

    Ou, Dongshu

    2010-01-01

    The high school exit exam (HSEE) is rapidly becoming a standardized assessment procedure for educational accountability in the United States. I use a unique, state-specific dataset to identify the effects of failing the HSEE on the likelihood of dropping out of high school based on a regression discontinuity design. The analysis shows that…

  11. Classifying Failing States

    DTIC Science & Technology

    2007-03-01

    ...state failure, and Discriminant Analysis to classify states as Stable, Borderline, or Failing based on these indicators. Furthermore, each nation's discriminant function scores are used to determine their degree of instability. The methodology is applied to 200 countries for which open source...
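
    As a sketch of the classification step described in this record, the Python example below fits a linear discriminant model to synthetic "instability indicator" data and extracts discriminant scores. The data, class labels, and indicator counts are illustrative assumptions, not the report's inputs.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(2)
        # Synthetic two-indicator data for three latent classes of states.
        means = {"Stable": [0.0, 0.0], "Borderline": [1.5, 1.0], "Failing": [3.0, 2.5]}
        X = np.vstack([rng.normal(m, 1.0, size=(66, 2)) for m in means.values()])
        y = np.repeat(list(means.keys()), 66)

        lda = LinearDiscriminantAnalysis()
        lda.fit(X, y)
        # Discriminant scores (projections) can rank units by instability.
        scores = lda.transform(X)
        print(lda.predict(X[:5]), scores[:5, 0])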

  12. Effects of advanced treatment of municipal wastewater on the White River near Indianapolis, Indiana; trends in water quality, 1978-86

    USGS Publications Warehouse

    Crawford, Charles G.; Wangsness, David J.

    1993-01-01

    The City of Indianapolis has constructed state-of-the-art advanced municipal wastewater-treatment systems to enlarge and upgrade the existing secondary-treatment processes at its Belmont and Southport treatment plants. These new advanced-wastewater-treatment plants became operational in 1983. A nonparametric statistical procedure--a modified form of the Wilcoxon-Mann-Whitney rank-sum test--was used to test for trends in time-series water-quality data from four sites on the White River and from the Belmont and Southport wastewater-treatment plants. Time-series data representative of pre-advanced- (1978-80) and post-advanced- (1983-86) wastewater-treatment conditions were tested for trends, and the results indicate substantial changes in water quality of treated effluent and of the White River downstream from Indianapolis after implementation of advanced wastewater treatment. Water quality from 1981 through 1982 was highly variable due to plant construction. Therefore, this time period was excluded from the analysis. Water quality at sample sites located upstream from the wastewater-treatment plants was relatively constant during the period of study (1978-86). Analysis of data from the two plants and downstream from the plants indicates statistically significant decreasing trends in effluent concentrations of total ammonia, 5-day biochemical-oxygen demand, fecal-coliform bacteria, total phosphate, and total solids at all sites where sufficient data were available for testing. Because of in-plant nitrification, increases in nitrate concentration were statistically significant in the two plants and in the White River. The decrease in ammonia concentrations and 5-day biochemical-oxygen demand in the White River resulted in a statistically significant increasing trend in dissolved-oxygen concentration in the river because of reduced oxygen demand for nitrification and biochemical oxidation processes. Following implementation of advanced wastewater treatment, the number of river-quality samples that failed to meet the water-quality standards for ammonia and dissolved oxygen that apply to the White River decreased substantially.
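
    The study's trend test is a modified rank-sum procedure adapted to time series. As a simplified stand-in, the plain Wilcoxon-Mann-Whitney comparison below (Python, with hypothetical concentration data) contrasts pre- and post-treatment periods without the modifications for seasonality and serial correlation that the actual procedure applies.

        import numpy as np
        from scipy.stats import mannwhitneyu

        rng = np.random.default_rng(3)
        # Hypothetical monthly total-ammonia concentrations (mg/L).
        pre = rng.lognormal(mean=0.4, sigma=0.3, size=36)    # 1978-80
        post = rng.lognormal(mean=-0.4, sigma=0.3, size=48)  # 1983-86

        # One-sided test: pre-treatment concentrations tend to exceed post.
        stat, p = mannwhitneyu(pre, post, alternative="greater")
        print(f"U = {stat:.0f}, one-sided P = {p:.4f}")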

  13. Human cytomegalovirus and Epstein-Barr virus in etiopathogenesis of apical periodontitis: a systematic review.

    PubMed

    Jakovljevic, Aleksandar; Andric, Miroslav

    2014-01-01

    During the last decade, a hypothesis has been established that human cytomegalovirus (HCMV) and Epstein-Barr virus (EBV) may be implicated in the pathogenesis of apical periodontitis. The aim of this review was to analyze the available evidence that indicates that HCMV and EBV can actually contribute to the pathogenesis of periapical lesions and to answer the following focused question: is there a relationship between HCMV and EBV DNA and/or RNA detection and the clinical features of human periapical lesions? The literature search covered MEDLINE, Science Citation Index Expanded (SCIexpanded), Scopus, and The Cochrane Library database. Quantitative statistical analysis was performed on the pooled data of HCMV and EBV messenger RNA transcripts in tissues of symptomatic and asymptomatic periapical lesions. The electronic database search yielded 48 hits from PubMed, 197 hits from Scopus, 40 hits from Web of Science, and 1 from the Cochrane Library. Seventeen cross-sectional studies were included in the final review. The pooled results from the quantitative analysis showed no statistically significant relationship between the presence of HCMV and EBV messenger RNA transcripts (P = .083 and P = .306, respectively) and the clinical features of apical periodontitis. The findings of HCMV and EBV transcripts in apical periodontitis were controversial among the included studies. Herpesviruses were common in symptomatic and large-size periapical lesions, but such results failed to reach statistical significance. Further studies, including those based on an experimental animal model, should provide more data on herpesviruses as a factor in the pathogenesis of periapical inflammation. Copyright © 2014 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  14. Regression Rates Following the Treatment of Aggressive Posterior Retinopathy of Prematurity with Bevacizumab Versus Laser: 8-Year Retrospective Analysis

    PubMed Central

    Nicoară, Simona D.; Ştefănuţ, Anne C.; Nascutzy, Constanta; Zaharie, Gabriela C.; Toader, Laura E.; Drugan, Tudor C.

    2016-01-01

    Background: Retinopathy is a serious complication related to prematurity and a leading cause of childhood blindness. The aggressive posterior form of retinopathy of prematurity (APROP) has a worse anatomical and functional outcome following laser therapy, as compared with the classic form of the disease. The main outcome measures are the APROP regression rate, structural outcomes, and complications associated with intravitreal bevacizumab (IVB) versus laser photocoagulation in APROP. Material/Methods: This is a retrospective case series that includes infants with APROP who received either IVB or laser photocoagulation and had a follow-up of at least 60 weeks (for the laser photocoagulation group) and 80 weeks (for the IVB group). In the first group, laser photocoagulation of the retina was carried out, and in the second group, 1 bevacizumab injection was administered intravitreally. The following parameters were analyzed in each group: sex, gestational age, birth weight, postnatal age and postmenstrual age at treatment, APROP regression, sequelae, and complications. Statistical analysis was performed using Microsoft Excel and IBM SPSS (version 23.0). Results: The laser photocoagulation group consisted of 6 premature infants (12 eyes) and the IVB group consisted of 17 premature infants (34 eyes). Within the laser photocoagulation group, the evolution was favorable in 9 eyes (75%) and unfavorable in 3 eyes (25%). Within the IVB group, APROP regressed in 29 eyes (85.29%) and failed to regress in 5 eyes (14.71%). These differences are statistically significant, as shown by the McNemar test (P<0.001). Conclusions: The IVB group had a statistically significantly better outcome than the laser photocoagulation group for APROP in our series. PMID:27062023

  15. The power and promise of RNA-seq in ecology and evolution.

    PubMed

    Todd, Erica V; Black, Michael A; Gemmell, Neil J

    2016-03-01

    Reference is regularly made to the power of new genomic sequencing approaches. Using powerful technology, however, is not the same as having the necessary power to address a research question with statistical robustness. In the rush to adopt new and improved genomic research methods, limitations of technology and experimental design may be initially neglected. Here, we review these issues with regard to RNA sequencing (RNA-seq). RNA-seq adds large-scale transcriptomics to the toolkit of ecological and evolutionary biologists, enabling differential gene expression (DE) studies in nonmodel species without the need for prior genomic resources. High biological variance is typical of field-based gene expression studies and means that larger sample sizes are often needed to achieve the same degree of statistical power as clinical studies based on data from cell lines or inbred animal models. Sequencing costs have plummeted, yet RNA-seq studies still underutilize biological replication. Finite research budgets force a trade-off between sequencing effort and replication in RNA-seq experimental design. However, clear guidelines for negotiating this trade-off, while taking into account study-specific factors affecting power, are currently lacking. Study designs that prioritize sequencing depth over replication fail to capitalize on the power of RNA-seq technology for DE inference. Significant recent research effort has gone into developing statistical frameworks and software tools for power analysis and sample size calculation in the context of RNA-seq DE analysis. We synthesize progress in this area and derive an accessible rule-of-thumb guide for designing powerful RNA-seq experiments relevant in eco-evolutionary and clinical settings alike. © 2016 John Wiley & Sons Ltd.
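
    As one concrete way to negotiate the replication trade-off discussed above, the sketch below approximates the biological replication needed for a two-group DE comparison using a two-sample t-test on log2 expression (Python with statsmodels). The fold change, log-scale standard deviation, and power target are illustrative assumptions; dedicated RNA-seq power tools model count data more faithfully.

        import numpy as np
        from statsmodels.stats.power import TTestIndPower

        # Detect a 2-fold change given assumed biological variability
        # (SD of log2 expression ~ 0.8 across replicates).
        effect_size = np.log2(2.0) / 0.8
        n = TTestIndPower().solve_power(effect_size=effect_size,
                                        power=0.8, alpha=0.05)
        print(f"~{np.ceil(n):.0f} biological replicates per group")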

  16. Economic Fluctuations and Statistical Physics: Quantifying Extremely Rare and Much Less Rare Events

    NASA Astrophysics Data System (ADS)

    Stanley, H. Eugene

    2008-03-01

    Recent analysis of truly huge quantities of empirical data suggests that classic economic theories not only fail for a few outliers, but that there occur similar outliers of every possible size. In fact, if one analyzes only a small data set (say 10^4 data points), then outliers appear to occur as "rare events." However, when we analyze orders of magnitude more data (10^8 data points!), we find orders of magnitude more outliers---so ignoring them is not a responsible option, and studying their properties becomes a realistic goal. We find that the statistical properties of these "outliers" are identical to the statistical properties of everyday fluctuations. For example, a histogram giving the number of fluctuations of a given magnitude x, for fluctuations ranging in magnitude from everyday fluctuations to extremely rare fluctuations that occur with a probability of only 10^-8, is a perfect straight line in a double-log plot. Quantitative analogies between financial fluctuations and earthquakes will be discussed. Two unifying principles that underlie much of the finance analysis we will present are scale invariance and universality [R. N. Mantegna and H. E. Stanley, Introduction to Econophysics: Correlations & Complexity in Finance (Cambridge U. Press, 2000)]. Scale invariance is a property not about algebraic equations but rather about functional equations, which have as their solutions not numbers but rather functional forms. The key idea of universality is that the identical set of laws holds across diverse markets and over diverse time periods. This work was carried out in collaboration with a number of students and colleagues, chief among whom are X. Gabaix (MIT and Princeton) and V. Plerou (Boston University).
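
    The straight-line diagnostic described above can be reproduced in a few lines of Python. A synthetic Pareto sample stands in for empirical fluctuation magnitudes, and the bin choices are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(4)
        x = rng.pareto(3.0, size=10**6) + 1   # heavy-tailed stand-in data

        bins = np.logspace(0, 3, 40)          # logarithmic magnitude bins
        counts, edges = np.histogram(x, bins=bins)
        centers = np.sqrt(edges[:-1] * edges[1:])
        mask = counts > 0
        # A straight line in log-log coordinates indicates a power-law tail.
        slope = np.polyfit(np.log(centers[mask]), np.log(counts[mask]), 1)[0]
        print(f"log-log slope ~ {slope:.2f}")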

  17. Neural net controlled tag gas sampling system for nuclear reactors

    DOEpatents

    Gross, Kenneth C.; Laug, Matthew T.; Lambert, John D. B.; Herzog, James P.

    1997-01-01

    A method and system for providing a tag gas identifier to a nuclear fuel rod and analyzing escaped tag gas to identify a particular failed nuclear fuel rod. The method and system include disposing a unique tag gas composition into a plenum of a nuclear fuel rod, monitoring gamma ray activity, analyzing gamma ray signals to assess whether a nuclear fuel rod has failed and is emitting tag gas, activating a tag gas sampling and analysis system upon sensing tag gas emission from a failed nuclear rod, and evaluating the escaped tag gas to identify the particular failed nuclear fuel rod.

  18. Systematic literature review shows that appetite rating does not predict energy intake.

    PubMed

    Holt, Guy M; Owen, Lauren J; Till, Sophie; Cheng, Yanying; Grant, Vicky A; Harden, Charlotte J; Corfe, Bernard M

    2017-11-02

    Ratings of appetite are commonly used to assess appetite modification following an intervention. Subjectively rated appetite is a widely employed proxy measure for energy intake (EI), measurement of which requires greater time and resources. However, the validity of appetite as a reliable predictor of EI has not yet been reviewed systematically. This literature search identified studies that quantified both appetite ratings and EI. Outcomes were predefined as: (1) agreement between self-reported appetite scores and EI; (2) no agreement between self-reported appetite scores and EI. The presence of a direct statistical comparison between the endpoints, the intervention type, and the study population were also recorded. 462 papers were included in this review. Appetite scores failed to correspond with EI in 51.3% of the total studies. Only 6% of all studies evaluated here reported a direct statistical comparison between appetite scores and EI. χ2 analysis demonstrated that any relationship between EI and appetite was independent of study type and of stratification by age, gender, or sample size. The very substantial corpus reviewed allows us to conclude that self-reported ratings of appetite do not reliably predict EI. Caution should be exercised when drawing conclusions based on self-reported appetite scores in relation to prospective EI.

  19. The efficacy and safety of alprazolam versus other benzodiazepines in the treatment of panic disorder.

    PubMed

    Moylan, Steven; Staples, John; Ward, Stephanie Alison; Rogerson, Jan; Stein, Dan J; Berk, Michael

    2011-10-01

    We performed a meta-analysis of all single- or double-blind, randomized controlled trials comparing alprazolam to another benzodiazepine in the treatment of adult patients meeting the Diagnostic and Statistical Manual of Mental Disorders, Third or Fourth Edition, criteria for panic disorder or agoraphobia with panic attacks. Eight studies met inclusion criteria, describing a total of at least 631 randomized patients. In the pooled results, there were no significant differences in efficacy between alprazolam and the comparator benzodiazepines on any of the prespecified outcomes: improvement in mean panic attack frequency (between-arm weighted mean difference of 0.6 panic attacks per week; 95% confidence interval [CI], -0.3 to 1.6), improvement in Hamilton Anxiety Rating Scale score (weighted mean difference of 0.8 points; 95% CI, -0.5 to 2.1), and proportion of patients free of panic attacks at the final evaluation (pooled relative risk, 1.1; 95% CI, 0.9-1.4). Statistical heterogeneity on prespecified outcomes was not eliminated by stratification on baseline anxiety level. The available evidence fails to demonstrate alprazolam as superior to other benzodiazepines for the treatment of panic disorder.
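
    The pooled between-arm weighted mean differences reported above come from inverse-variance weighting. A minimal fixed-effect version is sketched below in Python; the per-study numbers are hypothetical, not the trial data.

        import numpy as np

        # Hypothetical per-study mean differences and standard errors.
        md = np.array([0.9, 0.2, 1.1, 0.4])
        se = np.array([0.6, 0.5, 0.8, 0.7])

        w = 1.0 / se**2                        # inverse-variance weights
        pooled = np.sum(w * md) / np.sum(w)
        pooled_se = np.sqrt(1.0 / np.sum(w))
        lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
        print(f"WMD = {pooled:.2f} (95% CI {lo:.2f} to {hi:.2f})")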

  20. SIG-VISA: Signal-based Vertically Integrated Seismic Monitoring

    NASA Astrophysics Data System (ADS)

    Moore, D.; Mayeda, K. M.; Myers, S. C.; Russell, S.

    2013-12-01

    Traditional seismic monitoring systems rely on discrete detections produced by station processing software; however, while such detections may constitute a useful summary of station activity, they discard large amounts of information present in the original recorded signal. We present SIG-VISA (Signal-based Vertically Integrated Seismic Analysis), a system for seismic monitoring through Bayesian inference on seismic signals. By directly modeling the recorded signal, our approach incorporates additional information unavailable to detection-based methods, enabling higher sensitivity and more accurate localization using techniques such as waveform matching. SIG-VISA's Bayesian forward model of seismic signal envelopes includes physically-derived models of travel times and source characteristics as well as Gaussian process (kriging) statistical models of signal properties that combine interpolation of historical data with extrapolation of learned physical trends. Applying Bayesian inference, we evaluate the model on earthquakes as well as the 2009 DPRK test event, demonstrating a waveform matching effect as part of the probabilistic inference, along with results on event localization and sensitivity. In particular, we demonstrate increased sensitivity from signal-based modeling, in which the SIG-VISA signal model finds statistical evidence for arrivals even at stations for which the IMS station processing failed to register any detection.

  1. Confidence intervals for expected moments algorithm flood quantile estimates

    USGS Publications Warehouse

    Cohn, Timothy A.; Lane, William L.; Stedinger, Jery R.

    2001-01-01

    Historical and paleoflood information can substantially improve flood frequency estimates if appropriate statistical procedures are properly applied. However, the Federal guidelines for flood frequency analysis, set forth in Bulletin 17B, rely on an inefficient “weighting” procedure that fails to take advantage of historical and paleoflood information. This has led researchers to propose several more efficient alternatives including the Expected Moments Algorithm (EMA), which is attractive because it retains Bulletin 17B's statistical structure (method of moments with the Log Pearson Type 3 distribution) and thus can be easily integrated into flood analyses employing the rest of the Bulletin 17B approach. The practical utility of EMA, however, has been limited because no closed-form method has been available for quantifying the uncertainty of EMA-based flood quantile estimates. This paper addresses that concern by providing analytical expressions for the asymptotic variance of EMA flood-quantile estimators and confidence intervals for flood quantile estimates. Monte Carlo simulations demonstrate the properties of such confidence intervals for sites where a 25- to 100-year streamgage record is augmented by 50 to 150 years of historical information. The experiments show that the confidence intervals, though not exact, should be acceptable for most purposes.
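
    The paper derives analytical variances for EMA estimators. As a simplified illustration of the quantity being bounded, the Python sketch below fits a Log-Pearson Type 3 distribution by the method of moments on log flows and bootstraps a confidence interval for the 100-year quantile; the synthetic record and the bootstrap (rather than EMA's closed-form variance) are illustrative assumptions.

        import numpy as np
        from scipy.stats import pearson3

        def lp3_quantile(flows, aep=0.01):
            """Method-of-moments Log-Pearson III quantile at exceedance prob aep."""
            y = np.log10(flows)
            skew = ((y - y.mean())**3).mean() / y.std()**3
            return 10 ** pearson3.ppf(1 - aep, skew, loc=y.mean(), scale=y.std())

        rng = np.random.default_rng(5)
        flows = rng.lognormal(mean=6.0, sigma=0.5, size=60)  # synthetic annual peaks

        q100 = lp3_quantile(flows)
        boot = [lp3_quantile(rng.choice(flows, size=len(flows)))
                for _ in range(2000)]
        lo, hi = np.percentile(boot, [2.5, 97.5])
        print(f"Q100 = {q100:.0f}, 95% bootstrap CI ({lo:.0f}, {hi:.0f})")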

  2. Community Design Impacts on Health Habits in Low-income Southern Nevadans.

    PubMed

    Coughenour, Courtney; Burns, Mackenzie S

    2016-07-01

    The purposes of this exploratory study were to: (1) characterize selected community design features; and (2) determine the relationship between select features and physical activity (PA) levels and nutrition habits for a small sample of low-income southern Nevadans. Secondary analysis was conducted on data from selected participants of the Nevada Healthy Homes Partnership program; self-report data on PA and diet habits were compared to national guidelines. Community design features were identified via GIS within a one-mile radius of participants' homes. Descriptive statistics characterized these features, and chi-square analyses were conducted to determine the relationship between select features and habits. Data from 71 participants were analyzed; the majority failed to reach either PA or fruit and vegetable guidelines (81.7% and 93.0%, respectively). Many neighborhoods were absent of parks (71.8%), trailheads (36.6%), or pay-for-use PA facilities (47.9%). The mean number of grocery stores was 3.4 ± 2.3 per neighborhood. Chi-square analyses were not statistically significant. Findings were insufficient to support meaningful conclusions but underscore the need for health promotion to meet guidelines. More research is needed to assess the impact of health-promoting community design on healthy behaviors, particularly in vulnerable populations.

  3. Got power? A systematic review of sample size adequacy in health professions education research.

    PubMed

    Cook, David A; Hatala, Rose

    2015-03-01

    Many education research studies employ small samples, which in turn lowers statistical power. We re-analyzed the results of a meta-analysis of simulation-based education to determine study power across a range of effect sizes, and the smallest effect that could be plausibly excluded. We systematically searched multiple databases through May 2011, and included all studies evaluating simulation-based education for health professionals in comparison with no intervention or another simulation intervention. Reviewers working in duplicate abstracted information to calculate standardized mean differences (SMDs). We included 897 original research studies. Among the 627 no-intervention-comparison studies the median sample size was 25. Only two studies (0.3%) had ≥80% power to detect a small difference (SMD > 0.2 standard deviations) and 136 (22%) had power to detect a large difference (SMD > 0.8). 110 no-intervention-comparison studies failed to find a statistically significant difference, but none excluded a small difference and only 47 (43%) excluded a large difference. Among 297 studies comparing alternate simulation approaches the median sample size was 30. Only one study (0.3%) had ≥80% power to detect a small difference and 79 (27%) had power to detect a large difference. Of the 128 studies that did not detect a statistically significant effect, 4 (3%) excluded a small difference and 91 (71%) excluded a large difference. In conclusion, most education research studies are powered only to detect effects of large magnitude. For most studies that do not reach statistical significance, the possibility of large and important differences still exists.
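
    The power figures discussed above can be checked with a standard two-sample calculation. The Python/statsmodels sketch below computes power for small and large standardized mean differences at 25 participants per group, one hedged reading of the reported median sample size.

        from statsmodels.stats.power import TTestIndPower

        power = TTestIndPower()
        for smd in (0.2, 0.8):
            p = power.power(effect_size=smd, nobs1=25, alpha=0.05)
            print(f"SMD {smd}: power = {p:.2f} with 25 per group")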

  4. The failure to fail underperforming trainees in health professions education: A BEME systematic review: BEME Guide No. 42.

    PubMed

    Yepes-Rios, Monica; Dudek, Nancy; Duboyce, Rita; Curtis, Jerri; Allard, Rhonda J; Varpio, Lara

    2016-11-01

    Many clinical educators feel unprepared and/or unwilling to report unsatisfactory trainee performance. This systematic review consolidates knowledge from medical, nursing, and dental literature on the experiences and perceptions of evaluators or assessors with this failure to fail phenomenon. We searched the English language literature in CINAHL, EMBASE, and MEDLINE from January 2005 to January 2015. Qualitative and quantitative studies were included. Following our review protocol, registered with BEME, reviewers worked in pairs to identify relevant articles. The investigators participated in thematic analysis of the qualitative data reported in these studies. Through several cycles of analysis, discussion and reflection, the team identified the barriers and enablers to failing a trainee. From 5330 articles, we included 28 publications in the review. The barriers identified were (1) assessor's professional considerations, (2) assessor's personal considerations, (3) trainee related considerations, (4) unsatisfactory evaluator development and evaluation tools, (5) institutional culture and (6) consideration of available remediation for the trainee. The enablers identified were: (1) duty to patients, to society, and to the profession, (2) institutional support such as backing a failing evaluation, support from colleagues, evaluator development, and strong assessment systems, and (3) opportunities for students after failing. The inhibiting and enabling factors to failing an underperforming trainee were common across the professions included in this study, across the 10 years of data, and across the educational continuum. We suggest that these results can inform efforts aimed at addressing the failure to fail problem.

  5. Defining the molecular signatures of human right heart failure.

    PubMed

    Williams, Jordan L; Cavus, Omer; Loccoh, Emefah C; Adelman, Sara; Daugherty, John C; Smith, Sakima A; Canan, Benjamin; Janssen, Paul M L; Koenig, Sara; Kline, Crystal F; Mohler, Peter J; Bradley, Elisa A

    2018-03-01

    Right ventricular failure (RVF) varies significantly from the more common left ventricular failure (LVF). This study was undertaken to determine potential molecular pathways that are important in human right ventricular (RV) function and may mediate RVF. We analyzed mRNA of human non-failing LV and RV samples and RVF samples from patients with pulmonary arterial hypertension (PAH), and post-LVAD implantation. We then performed transcript analysis to determine differential expression of genes in the human heart samples. Immunoblot quantification was performed followed by analysis of non-failing and failing phenotypes. Inflammatory pathways were more commonly dysregulated in RV tissue (both non-failing and failing phenotypes). In non-failing human RV tissue we found important differences in expression of FIGF, TRAPPAC, and CTGF suggesting that regulation of normal RV and LV function are not the same. In failing RV tissue, FBN2, CTGF, SMOC2, and TRAPP6AC were differentially expressed, and are potential targets for further study. This work provides some of the first analyses of the molecular heterogeneity between human RV and LV tissue, as well as key differences in human disease (RVF secondary to pulmonary hypertension and LVAD mediated RVF). Our transcriptional data indicated that inflammatory pathways may be more important in RV tissue, and changes in FIGF and CTGF supported this hypothesis. In PAH RV failure samples, upregulation of FBN2 and CTGF further reinforced the potential significance that altered remodeling and inflammation play in normal RV function and failure. Copyright © 2018 Elsevier Inc. All rights reserved.

  6. Sensitivity of super-efficient data envelopment analysis results to individual decision-making units: an example of surgical workload by specialty.

    PubMed

    Dexter, Franklin; O'Neill, Liam; Xin, Lei; Ledolter, Johannes

    2008-12-01

    We use resampling of data to explore the basic statistical properties of super-efficient data envelopment analysis (DEA) when used as a benchmarking tool by the manager of a single decision-making unit. Our focus is the gaps in the outputs (i.e., slacks adjusted for upward bias), as they reveal which outputs can be increased. The numerical experiments show that the estimates of the gaps fail to exhibit asymptotic consistency, a property expected for standard statistical inference. Specifically, increased sample sizes were not always associated with more accurate forecasts of the output gaps. The baseline DEA's gaps equaled the mode of the jackknife and the mode of resampling with/without replacement from any subset of the population; usually, the baseline DEA's gaps also equaled the median. The quartile deviations of gaps were close to zero when few decision-making units were excluded from the sample and the study unit happened to have few other units contributing to its benchmark. The results for the quartile deviations can be explained in terms of the effective combinations of decision-making units that contribute to the DEA solution. The jackknife can provide all the combinations contributing to the quartile deviation and only needs to be performed for those units that are part of the benchmark set. These results show that there is a strong rationale for examining DEA results with a sensitivity analysis that excludes one benchmark hospital at a time. This analysis enhances the quality of decision support using DEA estimates for the potential of a decision-making unit to grow one or more of its outputs.
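
    The leave-one-out sensitivity analysis recommended above can be wrapped generically. In the Python sketch below, dea_output_gaps is a hypothetical placeholder for any super-efficient DEA solver that returns the study unit's output gaps; the dummy version and the toy data exist only to make the sketch runnable.

        import numpy as np

        def jackknife_gaps(units, study_idx, dea_output_gaps):
            """Leave-one-out sensitivity of a DEA gap estimate."""
            results = []
            for drop in range(len(units)):
                if drop == study_idx:
                    continue              # never drop the study unit itself
                subset = [u for i, u in enumerate(units) if i != drop]
                new_idx = study_idx - (drop < study_idx)
                results.append(dea_output_gaps(subset, new_idx))
            results = np.array(results)
            # Spread across leave-one-out runs: quartile deviation per output.
            q1, q3 = np.percentile(results, [25, 75], axis=0)
            return (q3 - q1) / 2.0

        # Dummy stand-in solver: gap = peer maximum output minus own output.
        def dea_output_gaps(units, study_idx):
            U = np.array(units, dtype=float)
            return np.maximum(U.max(axis=0) - U[study_idx], 0.0)

        units = [[5.0, 2.0], [3.0, 4.0], [4.0, 3.0], [6.0, 1.0]]
        print(jackknife_gaps(units, study_idx=1, dea_output_gaps=dea_output_gaps))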

  7. A better way to evaluate remote monitoring programs in chronic disease care: receiver operating characteristic analysis.

    PubMed

    Brown Connolly, Nancy E

    2014-12-01

    This foundational study applies the process of receiver operating characteristic (ROC) analysis to evaluate the utility and predictive value of a disease management (DM) model that uses remote monitoring (RM) devices for chronic obstructive pulmonary disease (COPD). The literature identifies a need for a more rigorous method to validate and quantify evidence-based value for RM systems being used to monitor persons with a chronic disease. ROC analysis is an engineering approach widely applied in medical testing, but one that has not been evaluated for its utility in RM. Classifiers (saturated peripheral oxygen [SPO2], blood pressure [BP], and pulse), optimum threshold, and predictive accuracy are evaluated based on patient outcomes. Parametric and nonparametric methods were used. Event-based patient outcomes included inpatient hospitalization, accident and emergency, and home health visits. Statistical analysis tools included Microsoft (Redmond, WA) Excel(®) and MedCalc(®) (MedCalc Software, Ostend, Belgium) version 12 © 1993-2013 to generate ROC curves and statistics. Persons with COPD were monitored a minimum of 183 days, with at least one inpatient hospitalization within 12 months prior to monitoring. Retrospective, de-identified patient data from a United Kingdom National Health System COPD program were used. Datasets included biometric readings, alerts, and resource utilization. SPO2 was identified as a predictive classifier, with an optimal average threshold setting of 85-86%. BP and pulse were failed classifiers, and areas of design were identified that may improve utility and predictive capacity. A cost avoidance methodology was developed. Results can be applied to health services planning decisions. Methods can be applied to system design and evaluation based on patient outcomes. This study validated the use of ROC in RM program evaluation.
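
    The kind of threshold selection described here can be sketched with sklearn: build an ROC curve for SPO2 readings against an event outcome and pick the threshold maximizing Youden's J. The data below are synthetic and the event rates are illustrative assumptions, not the study's dataset.

        import numpy as np
        from sklearn.metrics import roc_curve, roc_auc_score

        rng = np.random.default_rng(6)
        # Synthetic SPO2 (%) for non-event vs. event patients.
        spo2 = np.concatenate([rng.normal(93, 3, 300), rng.normal(86, 3, 60)])
        event = np.concatenate([np.zeros(300), np.ones(60)])

        # Lower SPO2 predicts events, so use -spo2 as the score.
        fpr, tpr, thr = roc_curve(event, -spo2)
        j = tpr - fpr                      # Youden's J statistic
        best = np.argmax(j)
        print(f"AUC = {roc_auc_score(event, -spo2):.2f}, "
              f"optimal SPO2 threshold ~ {-thr[best]:.0f}%")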

  8. Martial arts as a mental health intervention for children? Evidence from the ECLS-K

    PubMed Central

    Strayhorn, Joseph M; Strayhorn, Jillian C

    2009-01-01

    Background: Martial arts studios for children market their services as providing mental health outcomes such as self-esteem, self-confidence, concentration, and self-discipline. It appears that many parents enroll their children in martial arts in hopes of obtaining such outcomes. The current study used the data from the Early Childhood Longitudinal Study, Kindergarten class of 1998-1999, to assess the effects of martial arts upon such outcomes as rated by classroom teachers. Methods: The Early Childhood Longitudinal Study used a multistage probability sampling design to gather a sample representative of U.S. children attending kindergarten beginning 1998. We made use of data collected in the kindergarten, 3rd grade, and 5th grade years. Classroom behavior was measured by a rating scale completed by teachers; participation in martial arts was assessed as part of a parent interview. The four possible combinations of participation and nonparticipation in martial arts at time 1 and time 2 for each analysis were coded into three dichotomous variables; the set of three variables constituted the measure of participation studied through regression. Multiple regression was used to estimate the association between martial arts participation and change in classroom behavior from one measurement occasion to the next. The change from kindergarten to third grade was studied as a function of martial arts participation, and the analysis was replicated studying behavior change from third grade to fifth grade. Cohen's f2 effect sizes were derived from these regressions. Results: The martial arts variable failed to show a statistically significant effect on behavior, in either of the regression analyses; in fact, the f2 effect size for martial arts was 0.000 for both analyses. The 95% confidence intervals for regression coefficients for martial arts variables have upper and lower bounds that are all close to zero. The analyses not only fail to reject the null hypothesis, but also render unlikely a population effect size that differs greatly from zero. Conclusion: The data from the ECLS-K fail to support enrolling children in martial arts to improve mental health outcomes as measured by classroom teachers. PMID:19828027

  9. Martial arts as a mental health intervention for children? Evidence from the ECLS-K.

    PubMed

    Strayhorn, Joseph M; Strayhorn, Jillian C

    2009-10-14

    Martial arts studios for children market their services as providing mental health outcomes such as self-esteem, self-confidence, concentration, and self-discipline. It appears that many parents enroll their children in martial arts in hopes of obtaining such outcomes. The current study used the data from the Early Childhood Longitudinal Study, Kindergarten class of 1998-1999, to assess the effects of martial arts upon such outcomes as rated by classroom teachers. The Early Childhood Longitudinal Study used a multistage probability sampling design to gather a sample representative of U.S. children attending kindergarten beginning 1998. We made use of data collected in the kindergarten, 3rd grade, and 5th grade years. Classroom behavior was measured by a rating scale completed by teachers; participation in martial arts was assessed as part of a parent interview. The four possible combinations of participation and nonparticipation in martial arts at time 1 and time 2 for each analysis were coded into three dichotomous variables; the set of three variables constituted the measure of participation studied through regression. Multiple regression was used to estimate the association between martial arts participation and change in classroom behavior from one measurement occasion to the next. The change from kindergarten to third grade was studied as a function of martial arts participation, and the analysis was replicated studying behavior change from third grade to fifth grade. Cohen's f2 effect sizes were derived from these regressions. The martial arts variable failed to show a statistically significant effect on behavior, in either of the regression analyses; in fact, the f2 effect size for martial arts was 0.000 for both analyses. The 95% confidence intervals for regression coefficients for martial arts variables have upper and lower bounds that are all close to zero. The analyses not only fail to reject the null hypothesis, but also render unlikely a population effect size that differs greatly from zero. The data from the ECLS-K fail to support enrolling children in martial arts to improve mental health outcomes as measured by classroom teachers.

  10. Failures to further developing orphan medicinal products after designation granted in Europe: an analysis of marketing authorisation failures and abandoned drugs.

    PubMed

    Giannuzzi, Viviana; Landi, Annalisa; Bosone, Enrico; Giannuzzi, Floriana; Nicotri, Stefano; Torrent-Farnell, Josep; Bonifazi, Fedele; Felisi, Mariagrazia; Bonifazi, Donato; Ceci, Adriana

    2017-09-11

    The research and development process in the field of rare diseases is characterised by many well-known difficulties, and a large percentage of orphan medicinal products do not reach marketing approval. This work aims at identifying orphan medicinal products that failed the developmental process and investigating reasons for, and possible factors influencing, failures. Drugs designated in Europe under Regulation (European Commission) 141/2000 in the period 2000-2012 were investigated in terms of the following failures: (1) marketing authorisation failures (refused or withdrawn) and (2) drugs abandoned by sponsors during development. Possible risk factors for failure were analysed using statistically validated methods. This study points out that 437 out of 788 designations are still under development, while 219 failed the developmental process. Among the latter, 34 failed the marketing authorisation process and 185 were abandoned during the developmental process. In the first group of drugs (marketing authorisation failures), 50% reached phase II, 47% reached phase III and 3% reached phase I, while in the second group (abandoned drugs), the majority of orphan medicinal products apparently never started the development process, since no data on 48.1% of them were published and 3.2% did not progress beyond the non-clinical stage. The reasons for failures of marketing authorisation were: efficacy/safety issues (26), insufficient data (12), quality issues (7), regulatory issues on trials (4) and commercial reasons (1). The main causes for abandoned drugs were efficacy/safety issues (reported in 54 cases), inactive companies (25.4%), change of company strategy (8.1%) and drug competition (10.8%). No information concerning reasons for failure was available for 23.2% of the analysed products. This analysis shows that failures occurred in 27.8% of all designations granted in Europe, the main reasons being safety and efficacy issues. Moreover, the stage of development reached by drugs represents a specific risk factor for failures. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  11. Identifying barriers to follow-up eye care for children after failed vision screening in a primary care setting.

    PubMed

    Su, Zhuo; Marvin, Elizabeth K; Wang, Bing Q; van Zyl, Tavé; Elia, Maxwell D; Garza, Esteban N; Salchow, Daniel J; Forster, Susan H

    2013-08-01

    To identify barriers to follow-up eye care in children who failed a visual acuity screening conducted by their primary care provider. Children aged 3-14 years who failed a visual acuity screening were identified. A phone survey with the parent of every child was conducted 4 months after the screening. Family demographics, parental awareness of childhood eye diseases and eye care for children, and barriers to follow-up eye care were assessed. Of 971 children sampled, 199 (20.5%) failed a visual acuity screening. The survey was completed by the parents of 58 children (29.1%), of whom 27 (46.6%) presented for follow-up examination. The most common reason for failure to follow up was parental unawareness of screening results (29.3%). Follow-up rates were higher in children with previous eye examinations than in those without (81% versus 17%; P = 0.005) and in children who waited <2 months for a follow-up appointment than in those who had to wait longer (100% versus 63%; P = 0.024). Child's sex, ethnicity, and health insurance status, parent's marital, education, and employment status, household income, and transportation access were not associated with statistically significant differences in follow-up rates. Parental unawareness of a failed visual acuity screening is an important barrier to obtaining follow-up. Strategies to improve follow-up rates after a failed visual acuity screening may include communicating the results clearly and consistently, providing education about the importance of timely follow-up, and offering logistic support for accessing eye appointments to families. Copyright © 2013 American Association for Pediatric Ophthalmology and Strabismus. Published by Mosby, Inc. All rights reserved.

  12. Comparison of Onsite Versus Online Chart Reviews as Part of the American College of Radiation Oncology Accreditation Program.

    PubMed

    Hepel, Jaroslaw T; Heron, Dwight E; Mundt, Arno J; Yashar, Catheryn; Feigenberg, Steven; Koltis, Gordon; Regine, William F; Prasad, Dheerendra; Patel, Shilpen; Sharma, Navesh; Hebert, Mary; Wallis, Norman; Kuettel, Michael

    2017-05-01

    Accreditation based on peer review of professional standards of care is essential in ensuring quality and safety in administration of radiation therapy. Traditionally, medical chart reviews have been performed by a physical onsite visit. The American College of Radiation Oncology Accreditation Program has remodeled its process whereby electronic charts are reviewed remotely. Twenty-eight radiation oncology practices undergoing accreditation had three charts per practice undergo both onsite and online review. Onsite review was performed by a single reviewer for each practice. Online review consisted of one or more disease site-specific reviewers for each practice. Onsite and online reviews were blinded and scored on a 100-point scale on the basis of 20 categories. A score of less than 75 was failing, and a score of 75 to 79 was marginal. Any failed charts underwent rereview by a disease site team leader. Eighty-four charts underwent both onsite and online review. The mean scores were 86.0 and 86.9 points for charts reviewed onsite and online, respectively. Comparison of onsite and online reviews revealed no statistical difference in chart scores (P = .43). Of charts reviewed, 21% had a marginal (n = 8) or failing (n = 10) score. There was no difference in failing charts (P = .48) or combined marginal and failing charts (P = .13) comparing onsite and online reviews. The American College of Radiation Oncology accreditation process of online chart review results in comparable review scores and rate of failing scores compared with traditional onsite review. However, the modern online process holds less potential for bias by using multiple reviewers per practice and allows for greater oversight via disease site team leader rereview.

  13. Analysis of Naval Facilities Engineering Command’s (NAVFAC) Contracting Processes Using the Contract Management Maturity Model (CMMM)

    DTIC Science & Technology

    2006-12-01

    Antecedents and Consequences of Failed Governance: the Enron Example. Corporate Governance, 5: 5. Garrett, G., & Rendon, R. 2005(a). Contract Management... applicable to NAVFAC even though NAVFAC is a government organization because NAVFAC faces competition analogous to the corporate world. If NAVFAC fails to provide adequate services, the...

  14. Statistical learning of action: the role of conditional probability.

    PubMed

    Meyer, Meredith; Baldwin, Dare

    2011-12-01

    Identification of distinct units within a continuous flow of human action is fundamental to action processing. Such segmentation may rest in part on statistical learning. In a series of four experiments, we examined what types of statistics people can use to segment a continuous stream involving many brief, goal-directed action elements. The results of Experiment 1 showed no evidence for sensitivity to conditional probability, whereas Experiment 2 displayed learning based on joint probability. In Experiment 3, we demonstrated that additional exposure to the input failed to engender sensitivity to conditional probability. However, the results of Experiment 4 showed that a subset of adults-namely, those more successful at identifying actions that had been seen more frequently than comparison sequences-were also successful at learning conditional-probability statistics. These experiments help to clarify the mechanisms subserving processing of intentional action, and they highlight important differences from, as well as similarities to, prior studies of statistical learning in other domains, including language.
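
    The distinction driving these experiments can be made concrete with a toy stream of action elements. The Python sketch below contrasts joint and conditional pair probabilities; the stream and the pair choices are illustrative assumptions, not the experimental stimuli.

        from collections import Counter

        stream = list("abxabyabzab") * 50   # toy action-element stream

        pairs = list(zip(stream, stream[1:]))
        joint = Counter(pairs)
        first = Counter(p[0] for p in pairs)

        n = len(pairs)
        for pair in [("a", "b"), ("b", "x")]:
            jp = joint[pair] / n                 # joint probability P(x1, x2)
            cp = joint[pair] / first[pair[0]]    # conditional P(x2 | x1)
            print(pair, f"joint={jp:.3f} conditional={cp:.3f}")

    Here ("a", "b") has a conditional probability of 1.0 but only a moderate joint probability, the kind of contrast the two statistics tease apart.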

  15. Factors affecting the inter-annual to centennial timescale variability of Indian summer monsoon rainfall

    NASA Astrophysics Data System (ADS)

    Malik, Abdul; Brönnimann, Stefan

    2017-09-01

    The Modes of Ocean Variability (MOV), namely the Atlantic Multidecadal Oscillation (AMO), Pacific Decadal Oscillation (PDO), and El Niño Southern Oscillation (ENSO), can have significant impacts on Indian Summer Monsoon Rainfall (ISMR) on different timescales. The timescales at which these MOV interact with ISMR, and the factors which may perturb their relationship with ISMR, need to be investigated. We employ De-trended Cross-Correlation Analysis (DCCA) and De-trended Partial-Cross-Correlation Analysis (DPCCA) to study the timescales of interaction of ISMR with AMO, PDO, and ENSO using an observational dataset (AD 1854-1999) and atmosphere-ocean-chemistry climate model simulations with SOCOL-MPIOM (AD 1600-1999). Further, this study uses De-trended Semi-Partial Cross-Correlation Analysis (DSPCCA) to address the relation between solar variability and the ISMR. We find statistically significant evidence of intrinsic correlations of ISMR with AMO, PDO, and ENSO on different timescales, consistent between model simulations and observations. However, the model fails to capture the modulation in the intrinsic relationship between ISMR and MOV due to external signals. Our analysis indicates that AMO is a potential source of the non-stationary relationship between ISMR and ENSO. Furthermore, the pattern of correlation between ISMR and Total Solar Irradiance (TSI) is inconsistent between observations and model simulations. The observational dataset indicates a statistically insignificant negative intrinsic correlation between ISMR and TSI on decadal-to-centennial timescales. This statistically insignificant negative intrinsic correlation is transformed to a statistically significant positive extrinsic one by AMO on the 61-86-year timescale. We propose a new mechanism for the Sun-monsoon connection which operates through AMO by changes in the summer (June-September; JJAS) meridional gradient of tropospheric temperatures (ΔTTJJAS). There is a negative (positive) intrinsic correlation between ΔTTJJAS (AMO) and TSI. The negative intrinsic correlation between ΔTTJJAS and TSI indicates that high (low) solar activity weakens (strengthens) the meridional gradient of tropospheric temperature during the summer monsoon season; subsequently, the weak (strong) ΔTTJJAS decreases (increases) the ISMR. However, the presence of AMO transforms the negative intrinsic relation between ΔTTJJAS and TSI into a positive extrinsic one and strengthens the ISMR. We conclude that the positive relation between ISMR and solar activity, as found by other authors, is mainly due to the effect of AMO on ISMR.

  16. Factors affecting the inter-annual to centennial timescale variability of Indian summer monsoon rainfall

    NASA Astrophysics Data System (ADS)

    Malik, Abdul; Brönnimann, Stefan

    2018-06-01

    The Modes of Ocean Variability (MOV), namely the Atlantic Multidecadal Oscillation (AMO), Pacific Decadal Oscillation (PDO), and El Niño Southern Oscillation (ENSO), can have significant impacts on Indian Summer Monsoon Rainfall (ISMR) on different timescales. The timescales at which these MOV interact with ISMR, and the factors which may perturb their relationship with ISMR, need to be investigated. We employ De-trended Cross-Correlation Analysis (DCCA) and De-trended Partial-Cross-Correlation Analysis (DPCCA) to study the timescales of interaction of ISMR with AMO, PDO, and ENSO using an observational dataset (AD 1854-1999) and atmosphere-ocean-chemistry climate model simulations with SOCOL-MPIOM (AD 1600-1999). Further, this study uses De-trended Semi-Partial Cross-Correlation Analysis (DSPCCA) to address the relation between solar variability and the ISMR. We find statistically significant evidence of intrinsic correlations of ISMR with AMO, PDO, and ENSO on different timescales, consistent between model simulations and observations. However, the model fails to capture the modulation in the intrinsic relationship between ISMR and MOV due to external signals. Our analysis indicates that AMO is a potential source of the non-stationary relationship between ISMR and ENSO. Furthermore, the pattern of correlation between ISMR and Total Solar Irradiance (TSI) is inconsistent between observations and model simulations. The observational dataset indicates a statistically insignificant negative intrinsic correlation between ISMR and TSI on decadal-to-centennial timescales. This statistically insignificant negative intrinsic correlation is transformed to a statistically significant positive extrinsic one by AMO on the 61-86-year timescale. We propose a new mechanism for the Sun-monsoon connection which operates through AMO by changes in the summer (June-September; JJAS) meridional gradient of tropospheric temperatures (ΔTTJJAS). There is a negative (positive) intrinsic correlation between ΔTTJJAS (AMO) and TSI. The negative intrinsic correlation between ΔTTJJAS and TSI indicates that high (low) solar activity weakens (strengthens) the meridional gradient of tropospheric temperature during the summer monsoon season; subsequently, the weak (strong) ΔTTJJAS decreases (increases) the ISMR. However, the presence of AMO transforms the negative intrinsic relation between ΔTTJJAS and TSI into a positive extrinsic one and strengthens the ISMR. We conclude that the positive relation between ISMR and solar activity, as found by other authors, is mainly due to the effect of AMO on ISMR.
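
    The core DCCA quantity used in these records can be sketched compactly. The Python function below computes the detrended cross-correlation coefficient at a single window size; the test series with a shared component are illustrative assumptions, and the published analyses add partial-correlation machinery on top of this.

        import numpy as np

        def dcca_rho(x, y, scale):
            """Detrended cross-correlation coefficient at one window size."""
            X, Y = np.cumsum(x - np.mean(x)), np.cumsum(y - np.mean(y))
            n = len(X) // scale
            f_xx = f_yy = f_xy = 0.0
            t = np.arange(scale)
            for k in range(n):
                xs = X[k * scale:(k + 1) * scale]
                ys = Y[k * scale:(k + 1) * scale]
                # Remove the local linear trend within each window.
                rx = xs - np.polyval(np.polyfit(t, xs, 1), t)
                ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
                f_xx += (rx**2).mean()
                f_yy += (ry**2).mean()
                f_xy += (rx * ry).mean()
            return f_xy / np.sqrt(f_xx * f_yy)

        rng = np.random.default_rng(7)
        common = rng.normal(size=1200)
        a = common + rng.normal(size=1200)
        b = common + rng.normal(size=1200)
        print(dcca_rho(a, b, scale=50))   # positive for the shared component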

  17. Predictive value of CHA2DS2-VASc and CHA2DS2-VASc-HS scores for failed reperfusion after thrombolytic therapy in patients with ST-elevation myocardial ınfarction.

    PubMed

    Kilic, Salih; Kocabas, Umut; Can, Levent Hurkan; Yavuzgil, Oğuz; Çetin, Mustafa; Zoghi, Mehdi

    2018-03-07

    Thrombolytic therapy is recommended for patients with acute ST-segment elevation myocardial infarction (STEMI) who cannot undergo primary percutaneous coronary intervention within the first 120 min. The aim of this study was to demonstrate the value of CHA₂DS₂-VASc and CHA₂DS₂-VASc-HS scores in predicting failed reperfusion in STEMI patients treated with thrombolytic therapy. A total of 537 consecutive patients were enrolled in the study; 139 had failed thrombolysis while the remaining 398 fulfilled the criteria for successful thrombolysis. Thrombolysis failure was defined as lack of symptom relief, < 50% ST-segment resolution on electrocardiography within 90 min of initiation of thrombolytic therapy, presence of hemodynamic or electrical instability, or in-hospital mortality. The CHA₂DS₂-VASc and CHA₂DS₂-VASc-HS scores (the latter incorporating hyperlipidemia and smoking and substituting male for female gender) were previously shown to be markers of the severity of coronary artery disease (CAD). A history of hypertension, diabetes mellitus, hyperlipidemia, heart failure, smoking, and CAD was significantly more common in failed-reperfusion patients (p < 0.05 for all). For prediction of failed reperfusion, the cut-off value of the CHA₂DS₂-VASc score was ≥ 2 with a sensitivity of 80.90% and a specificity of 41.01% (area under curve [AUC] 0.660; 95% confidence interval [CI] 0.618-0.700; p < 0.001) and the cut-off value of the CHA₂DS₂-VASc-HS score was ≥ 3 with a sensitivity of 76.13% and a specificity of 67.63% (AUC 0.764; 95% CI 0.725-0.799; p < 0.001). The CHA₂DS₂-VASc-HS score was statistically significantly better than the CHA₂DS₂-VASc score at predicting failed reperfusion (p < 0.001). The findings suggest that the CHA₂DS₂-VASc and especially the CHA₂DS₂-VASc-HS scores could be considered predictors of risk of failed reperfusion in STEMI patients.
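
    The reported cut-off analysis follows a standard ROC workflow. A hedged sketch of that workflow in Python, using synthetic score distributions rather than the study's data, with Youden's J picking the threshold that balances sensitivity and specificity:

      import numpy as np
      from sklearn.metrics import roc_curve, roc_auc_score

      rng = np.random.default_rng(0)
      # Hypothetical integer risk scores: 139 failed, 398 successful reperfusions
      scores = np.concatenate([rng.poisson(3, 139), rng.poisson(2, 398)])
      labels = np.concatenate([np.ones(139), np.zeros(398)])  # 1 = failed

      fpr, tpr, thresholds = roc_curve(labels, scores)
      best = np.argmax(tpr - fpr)  # Youden's J = sensitivity + specificity - 1
      print(f"AUC = {roc_auc_score(labels, scores):.3f}, "
            f"cutoff >= {thresholds[best]:.0f}, "
            f"sens = {tpr[best]:.2f}, spec = {1 - fpr[best]:.2f}")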

  18. Resident performance on the in-training and board examinations in obstetrics and gynecology: implications for the ACGME Outcome Project.

    PubMed

    Withiam-Leitch, Matthew; Olawaiye, Alexander

    2008-01-01

    The Accreditation Council for Graduate Medical Education (ACGME) Outcome Project has endorsed the in-training examination (ITE) as an example of a multiple-choice question examination that is a valid measure of a resident's attainment of medical knowledge. An outcome measure for performance on the ITE would be subsequent performance on the board certification examination. However, there are few reports that attempt to correlate a resident's performance on the ITE with subsequent performance on the board certification examination. The Council on Resident Education in Obstetrics and Gynecology (CREOG) has administered the ITE annually since 1970. This study tested the hypothesis that the CREOG-ITE score is a valid assessment tool to predict performance on the American Board of Obstetrics and Gynecology (ABOG) written examination. CREOG-ITE and ABOG written board examination results were collected for 69 resident graduates between the years 1998 and 2005. Logistic regression and receiver operating characteristic analyses were used to estimate the relationship between a resident's score on the CREOG-ITE and subsequent performance on the ABOG written examination. Fifty-seven resident graduates passed (82.6%) and 12 graduates failed (17.4%) the ABOG written examination. The correlation between the CREOG-ITE overall score and performance on the ABOG examination was weak (correlation coefficient = .38, p = .001). Receiver operating characteristic analysis for the CREOG-ITE overall scores and the probability of passing or failing the ABOG examination revealed moderate accuracy (area under the curve = 0.77, 95% CI = 0.62-0.92), with a CREOG-ITE score of 187.5 yielding the best trade-off between specificity (0.79) and sensitivity (0.75). At the cutoff value of 187.5 there was a weak positive predictive value of 43% (i.e., 43% of residents with a score less than 187.5 will fail the ABOG exam) and a strong negative predictive value of 94% (i.e., 94% of the residents with a score above 187.5 will pass the ABOG exam). Logistic regression analysis also revealed a statistically significant relationship between the CREOG-ITE overall score and performance on the ABOG written examination (p = .003). As in other specialties, resident performance on the CREOG-ITE is a weak assessment tool for predicting the probability of a resident failing the ABOG written examination. Our study highlights the need, in the spirit of the ACGME Outcome Project, for residency and board specialty organizations to coordinate efforts to develop more reliable and correlative measures of a resident's medical knowledge and ability to pass the boards.
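
    The positive and negative predictive values quoted above follow from a 2x2 table at the chosen cutoff. A short check of that arithmetic, reconstructing approximate counts from the reported sensitivity, specificity, and failure prevalence (the counts are reconstructed, not the original data):

      n, n_fail = 69, 12                  # graduates; 12 failed the ABOG exam
      sens, spec = 0.75, 0.79             # at the 187.5 CREOG-ITE cutoff
      tp = sens * n_fail                  # failures correctly flagged (score < 187.5)
      fn = n_fail - tp
      tn = spec * (n - n_fail)            # passes correctly cleared (score >= 187.5)
      fp = (n - n_fail) - tn
      ppv = tp / (tp + fp)                # P(fail | score < 187.5)
      npv = tn / (tn + fn)                # P(pass | score >= 187.5)
      print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}")  # ~0.43 and ~0.94, as reported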

  19. Gas emissions from failed and actual eruptions from Cook Inlet Volcanoes, Alaska, 1989-2006

    USGS Publications Warehouse

    Werner, C.A.; Doukas, M.P.; Kelly, P.J.

    2011-01-01

    Cook Inlet volcanoes that experienced an eruption between 1989 and 2006 had mean gas emission rates that were roughly an order of magnitude higher than at volcanoes where unrest stalled. For the six events studied, mean emission rates for eruptions were ~13,000 t/d CO2 and 5200 t/d SO2, but only ~1200 t/d CO2 and 500 t/d SO2 for non-eruptive events (‘failed eruptions’). Statistical analysis suggests degassing thresholds for eruption on the order of 1500 and 1000 t/d for CO2 and SO2, respectively. Emission rates greater than 4000 and 2000 t/d for CO2 and SO2, respectively, almost exclusively resulted during eruptive events (the only exception being two measurements at Fourpeaked). While this analysis could suggest that unerupted magmas have lower pre-eruptive volatile contents, we favor the explanations that either the amount of magma feeding actual eruptions is larger than that driving failed eruptions, or that magmas from failed eruptions experience less decompression such that the majority of H2O remains dissolved and thus insufficient permeability is produced to release the trapped volatile phase (or both). In the majority of unrest and eruption sequences, increases in CO2 emission relative to SO2 emission were observed early in the sequence. With time, all events converged to a common molar value of C/S between 0.5 and 2. These geochemical trends argue for roughly similar decompression histories until shallow levels are reached beneath the edifice (i.e., from 20–35 to ~4–6 km) and perhaps roughly similar initial volatile contents in all cases. Early elevated CO2 levels that we find at these high-latitude, andesitic arc volcanoes have also been observed at mid-latitude, relatively snow-free, basaltic volcanoes such as Stromboli and Etna. Typically such patterns are attributed to injection and decompression of deep (CO2-rich) magma into a shallower chamber and open system degassing prior to eruption. Here we argue that the C/S trends probably represent tapping of vapor-saturated regions with high C/S, and then gradual degassing of remaining dissolved volatiles as the magma progresses toward the surface. At these volcanoes, however, C/S is often accentuated due to early preferential scrubbing of sulfur gases. The range of equilibrium degassing is consistent with the bulk degassing of a magma with initial CO2 and S of 0.6 and 0.2 wt.%, respectively, similar to what has been suggested for primitive Redoubt magmas.
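
    The molar C/S values discussed above follow from the mass emission rates by dividing each by its molar mass (44 g/mol for CO2, 64 g/mol for SO2). A quick worked check against the paper's mean rates (the function name is ours):

      def molar_cs(co2_tpd, so2_tpd):
          """Molar C/S ratio from mass emission rates in tonnes per day."""
          return (co2_tpd / 44.0) / (so2_tpd / 64.0)

      print(molar_cs(13000, 5200))  # mean eruptive rates -> C/S ~ 3.6
      print(molar_cs(1200, 500))    # mean failed-eruption rates -> C/S ~ 3.5

    Both means sit above the 0.5-2 range the events converge to with time, consistent with the early CO2 enrichment described in the abstract.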

  20. Daily rainfall statistics of TRMM and CMORPH: A case for trans-boundary Gandak River basin

    NASA Astrophysics Data System (ADS)

    Kumar, Brijesh; Patra, Kanhu Charan; Lakshmi, Venkat

    2016-07-01

    Satellite precipitation products offer an opportunity to evaluate extreme events (flood and drought) for areas where rainfall data are not available or rain gauge stations are sparse. In this study, the daily precipitation amount and frequency of the TRMM 3B42V.7 and CMORPH products have been validated against daily rain gauge precipitation for the monsoon months (June-September, or JJAS) from 2005 to 2010 in the trans-boundary Gandak River basin. The analysis shows that both TRMM and CMORPH can detect rain and no-rain events, but they fail to capture the intensity of rainfall. The detection of precipitation amount is strongly dependent on the topography. In the plain areas, the TRMM product is capable of capturing high-intensity rain events, but in the hilly regions it underestimates the amount of high-intensity rain events. On the other hand, CMORPH entirely fails to capture high-intensity rain events but does well with low-intensity rain events in both the hilly and plain regions. The continuous-variable verification method shows better agreement of TRMM rainfall products with rain gauge data. TRMM fares better in predicting the probability of occurrence of high-intensity rainfall events, but it underestimates intensity at high altitudes. This implies that TRMM precipitation estimates can be used for flood-related studies only after bias adjustment for the topography.
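
    Rain/no-rain detection skill of the kind evaluated here is typically summarized with categorical scores from a 2x2 contingency table. A generic sketch, assuming paired daily satellite and gauge series and an arbitrary rain threshold (none of this is the authors' code):

      import numpy as np

      def categorical_scores(sat, gauge, threshold=1.0):
          """POD, FAR, and frequency bias for rain/no-rain detection."""
          sat_rain, gauge_rain = sat >= threshold, gauge >= threshold
          hits = np.sum(sat_rain & gauge_rain)
          misses = np.sum(~sat_rain & gauge_rain)
          false_alarms = np.sum(sat_rain & ~gauge_rain)
          pod = hits / (hits + misses)                 # probability of detection
          far = false_alarms / (hits + false_alarms)   # false alarm ratio
          bias = (hits + false_alarms) / (hits + misses)
          return pod, far, bias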

  1. Medical students' personal experience of high-stakes failure: case studies using interpretative phenomenological analysis.

    PubMed

    Patel, R S; Tarrant, C; Bonas, S; Shaw, R L

    2015-05-12

    Failing a high-stakes assessment at medical school is a major event for those who go through the experience. Students who fail at medical school may be more likely to struggle in professional practice, so helping individuals overcome problems and respond appropriately is important. There is little understanding about what factors influence how individuals experience failure or make sense of the failing experience in remediation. The aim of this study was to investigate the complexity surrounding the failure experience from the student's perspective using interpretative phenomenological analysis (IPA). The accounts of three medical students who had failed final re-sit exams were subjected to in-depth analysis using IPA methodology. IPA was used to analyse each transcript case-by-case, allowing the researcher to make sense of the participant's subjective world. The analysis process allowed the complexity surrounding the failure to be highlighted, alongside a narrative describing how students made sense of the experience. The circumstances surrounding students as they approached assessment and experienced failure at finals were a complex interaction between academic problems, personal problems (specifically finance and relationships), strained relationships with friends, family or faculty, and various mental health problems. Each student experienced multi-dimensional issues, each with their own individual combination of problems, but experienced remediation as a one-dimensional intervention focused only on improving performance in written exams. What these students needed included help with clinical skills, plus social and emotional support. Fear of termination of their course was a barrier to open communication with staff. These students' experience of failure was complex. The experience of remediation is influenced by the way in which students make sense of failing. Generic remediation programmes may fail to meet the needs of students for whom personal, social and mental health issues are part of the picture.

  2. 75 FR 36637 - Agency Information Collection Activities; Proposed Collection; Comment Request; Requirements for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-28

    ... provisions of 16 CFR 1500.18(a)(14) and part 1509 fail to comply in a manner to warrant a recall... identify those persons and firms who should be notified of the recall. The Commission will consider all... cost of $27.78 per hour (Bureau of Labor Statistics, Total Compensation, All workers, goods-producing...

  3. Oil and Gas on Indian Reservations: Statistical Methods Help to Establish Value for Royalty Purposes

    ERIC Educational Resources Information Center

    Fowler, Mary S.; Kadane, Joseph B.

    2006-01-01

    Part of the history of oil and gas development on Indian reservations concerns potential underpayment of royalties due to under-valuation of production by oil companies. This paper discusses a model used by the Shoshone and Arapaho tribes in a lawsuit against the Federal government, claiming the Government failed to collect adequate royalties.…

  4. Toward a Successful Plan for Educational Technology for Low-Income Communities: A Formative Evaluation of One Laptop Per Child (OLPC) Projects in Nigeria and Ghana

    ERIC Educational Resources Information Center

    Ezumah, Bellarmine Anthonia

    2010-01-01

    Copious educational technology projects have been implemented in several low-income communities by multilateral institutions, individuals, and governmental agencies. Statistics show that the majority of these initiatives fail to accomplish their objectives, thereby wasting colossal amounts of money, talent, and resources. Scholars aver that poor…

  5. White Working Class Achievement: An Ethnographic Study of Barriers to Learning in Schools

    ERIC Educational Resources Information Center

    Demie, Feyisa; Lewis, Kirstin

    2011-01-01

    This study aims to examine the key barriers to learning to raise achievement of White British pupils with low-income backgrounds. The main findings suggest that the worryingly low-achievement levels of many White working class pupils have been masked by the middle class success in the English school system and government statistics that fail to…

  6. Addressing Achievement Gaps: Positioning Young Black Boys for Educational Success. Policy Notes. Volume 19, Number 3, Fall 2011

    ERIC Educational Resources Information Center

    Prager, Karen

    2011-01-01

    America is failing its young Black boys. In metropolitan ghettos, rural villages and midsized townships across the country, schools have become holding tanks for populations of Black boys who have a statistically higher probability of walking the corridors of prison than the halls of college. Across America, the problem of Black male achievement…

  7. Research and Teaching: Online Collaborative Misconception Mapping Strategy Enhanced Health Science Students' Discussion and Knowledge of Basic Statistical Concepts

    ERIC Educational Resources Information Center

    Sas, Magdalena; Bendixen, Lisa D.; Crippen, Kent J.; Saddler, Sterling

    2017-01-01

    Online discussions have become inherent components of both face-to-face and distance education college courses, yet they often fail to provide much benefit to students' learning outcomes. One reason behind this phenomenon is the lack of or inadequate scaffolding or guidance provided to students when participating on asynchronous discussion boards.…

  8. Design of Critical Components

    NASA Technical Reports Server (NTRS)

    Hendricks, Robert C.; Zaretsky, Erwin V.

    2001-01-01

    Critical component design is based on minimizing product failures that result in loss of life. Potential catastrophic failures are reduced to secondary failures when components are removed for cause or after a set operating time in the system. Issues of liability and cost of component removal become of paramount importance. Deterministic design with factors of safety and probabilistic design address the problem but lack the essential characteristics for the design of critical components. In deterministic design and fabrication there are heuristic rules and safety factors developed over time for large sets of structural/material components. These factors did not come without cost. Many designs failed, and many rules (codes) have standing committees to oversee their proper usage and enforcement. In probabilistic design, not only are failures a given, the failures are calculated; an element of risk is assumed based on empirical failure data for large classes of component operations. Failure of a class of components can be predicted, yet one cannot predict when a specific component will fail. The analogy is to the life insurance industry, where very careful statistics are bookkept on classes of individuals. For a specific class, life span can be predicted within statistical limits, yet the life span of a specific element of that class cannot be predicted.
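
    The class-versus-individual distinction drawn above is commonly formalized with a Weibull life model: the distribution fixes the life of a population percentile, while any individual component's failure time remains unpredictable. A minimal sketch with hypothetical parameters:

      import numpy as np

      # Two-parameter Weibull reliability model: R(t) = exp(-(t/eta)**beta)
      beta, eta = 1.5, 10_000.0   # hypothetical shape and characteristic life, hours

      def life_at_reliability(r, beta, eta):
          """Time by which a fraction (1 - r) of the population has failed."""
          return eta * (-np.log(r)) ** (1.0 / beta)

      l10 = life_at_reliability(0.90, beta, eta)  # 10% of the class failed by L10
      print(f"L10 life ~ {l10:.0f} h")  # a class statistic, not an individual's fate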

  9. [Synovial fluid from aseptically failed total hip or knee arthroplasty is not toxic to osteoblasts].

    PubMed

    Gallo, J; Zdařilová, A; Rajnochová Svobodová, A; Ulrichová, J; Radová, L; Smižanský, M

    2010-10-01

    A failure of total hip or knee arthroplasty is associated with an increased production of joint fluid. This fluid contains wear particles, host cells and proteins, and is assumed to be involved in the pathogenesis of aseptic loosening and periprosthetic osteolysis. This study investigated the effect of synovial fluid from patients with aseptically failed joint prostheses on osteoblast cultures. Synovial fluid samples were obtained from patients with failed total joint prostheses (TJP; n = 36) and from control patient groups (n = 16) involving cases without TJP and osteoarthritis, without TJP but with osteoarthritis, and with stable TJP. The samples were treated in the standard manner and then cultured with the SaOS-2 cell line, which shows the characteristics and behaviour of osteoblasts. Each fluid sample was also examined for the content of proteins, cells and selected cytokines (IL-1β, TNF-α, IL-6, RANKL and OPG detected by ELISA). We tested the hypothesis that the fluids from failed joints would show higher cytotoxicity to osteoblast culture, and we also expected higher levels of IL-1β, TNF-α, IL-6, and RANKL in patients with TJP failure and/or with more severe bone loss. The statistical methods used included the Kruskal-Wallis ANOVA and Mann-Whitney U test. The fluids from failed TJPs showed the highest RANKL and the lowest OPG levels, resulting in the highest RANKL/OPG ratio. However, there was no evidence suggesting that the joint fluids from failed TJPs were more toxic to osteoblast culture than the fluids from control groups. In addition, no correlation was found between the fluid levels of molecules promoting inflammation and osteoclastic activity and the extent of bone loss in the hip (in terms of Saleh's classification) or the knee (AORI classification). In fact, the fluids from failed TJPs had higher protein levels in comparison with the controls, but the difference was not significant. The finding of high RANKL levels and low OPG concentrations is in agreement with the theory of aseptic loosening and periprosthetic osteolysis. The other cytokines, particularly TNF-α and IL-1β, were found at low levels. This can be explained by the stage of particle disease at which the samples were taken for ELISA analysis. It is probable that the level of signal molecules reflects osteolytic process activity and is therefore not constant. The reason no correlation was found between cytokine levels and the extent of bone loss may also lie in the use of therapeutic classifications of bone defects, which are apparently less sensitive to the biological activity of aseptic loosening and/or periprosthetic osteolysis. Synovial fluids from failed total hip or knee joint prostheses are not toxic to osteoblast cultures. Cytotoxicity indicators and levels of pro-inflammatory and pro-osteoclastic cytokines (IL-1β, TNF-α, IL-6, RANKL and OPG) do not correlate well with the extent of periprosthetic bone loss. Key words: total joint replacement, arthroplasty, aseptic loosening, periprosthetic osteolysis, joint fluid, SaOS-2 cell line, cytotoxicity, cytokines, RANKL, OPG.
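
    The nonparametric tests named here are available in SciPy; a generic sketch with placeholder RANKL/OPG ratios for the four patient groups (values invented for illustration):

      from scipy.stats import kruskal, mannwhitneyu

      failed_tjp   = [4.1, 3.8, 5.0, 4.6]   # hypothetical RANKL/OPG ratios
      no_tjp_no_oa = [1.2, 0.9, 1.4]
      no_tjp_oa    = [1.5, 1.8, 1.1]
      stable_tjp   = [2.0, 1.7, 2.3]

      h, p_kw = kruskal(failed_tjp, no_tjp_no_oa, no_tjp_oa, stable_tjp)
      u, p_mw = mannwhitneyu(failed_tjp, stable_tjp, alternative="two-sided")
      print(f"Kruskal-Wallis p = {p_kw:.3f}; Mann-Whitney p = {p_mw:.3f}")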

  10. Why fish oil fails: a comprehensive 21st century lipids-based physiologic analysis.

    PubMed

    Peskin, B S

    2014-01-01

    The medical community suffered three significant fish oil failures/setbacks in 2013. Claims that fish oil's EPA/DHA would stop the progression of heart disease were crushed when The Risk and Prevention Study Collaborative Group (Italy) released a conclusive negative finding regarding fish oil for those patients with high risk factors but no previous myocardial infarction. Fish oil failed in all measures of CVD prevention-both primary and secondary. Another major 2013 setback occurred when fish oil's DHA was shown to significantly increase prostate cancer in men, in particular, high-grade prostate cancer, in the Selenium and Vitamin E Cancer Prevention Trial (SELECT) analysis by Brasky et al. Another monumental failure occurred in 2013 whereby fish oil's EPA/DHA failed to improve macular degeneration. In 2010, fish oil's EPA/DHA failed to help Alzheimer's victims, even those with low DHA levels. These are by no means isolated failures. The promise of fish oil and its so-called active ingredients EPA/DHA fails time and time again in clinical trials. This lipids-based physiologic review will explain precisely why there should have never been expectation for success. This review will focus on underpublicized lipid science with a focus on physiology.

  11. Why Fish Oil Fails: A Comprehensive 21st Century Lipids-Based Physiologic Analysis

    PubMed Central

    Peskin, B. S.

    2014-01-01

    The medical community suffered three significant fish oil failures/setbacks in 2013. Claims that fish oil's EPA/DHA would stop the progression of heart disease were crushed when The Risk and Prevention Study Collaborative Group (Italy) released a conclusive negative finding regarding fish oil for those patients with high risk factors but no previous myocardial infarction. Fish oil failed in all measures of CVD prevention—both primary and secondary. Another major 2013 setback occurred when fish oil's DHA was shown to significantly increase prostate cancer in men, in particular, high-grade prostate cancer, in the Selenium and Vitamin E Cancer Prevention Trial (SELECT) analysis by Brasky et al. Another monumental failure occurred in 2013 whereby fish oil's EPA/DHA failed to improve macular degeneration. In 2010, fish oil's EPA/DHA failed to help Alzheimer's victims, even those with low DHA levels. These are by no means isolated failures. The promise of fish oil and its so-called active ingredients EPA/DHA fails time and time again in clinical trials. This lipids-based physiologic review will explain precisely why there should have never been expectation for success. This review will focus on underpublicized lipid science with a focus on physiology. PMID:24551453

  12. Neural net controlled tag gas sampling system for nuclear reactors

    DOEpatents

    Gross, K.C.; Laug, M.T.; Lambert, J.B.; Herzog, J.P.

    1997-02-11

    A method and system are disclosed for providing a tag gas identifier to a nuclear fuel rod and analyzing escaped tag gas to identify a particular failed nuclear fuel rod. The method and system include disposing a unique tag gas composition into a plenum of a nuclear fuel rod, monitoring gamma ray activity, analyzing gamma ray signals to assess whether a nuclear fuel rod has failed and is emitting tag gas, activating a tag gas sampling and analysis system upon sensing tag gas emission from a failed rod, and evaluating the escaped tag gas to identify the particular failed nuclear fuel rod. 12 figs.

  13. Risk factors for failed conversion of labor epidural analgesia to cesarean delivery anesthesia: a systematic review and meta-analysis of observational trials.

    PubMed

    Bauer, M E; Kountanis, J A; Tsen, L C; Greenfield, M L; Mhyre, J M

    2012-10-01

    This systematic review and meta-analysis evaluates evidence for seven risk factors associated with failed conversion of labor epidural analgesia to cesarean delivery anesthesia. Online scientific literature databases were searched using a strategy which identified observational trials, published between January 1979 and May 2011, which evaluated risk factors for failed conversion of epidural analgesia to anesthesia or documented a failure rate resulting in general anesthesia. 1450 trials were screened, and 13 trials were included for review (n=8628). Three factors increase the risk for failed conversion: an increasing number of clinician-administered boluses during labor (OR=3.2, 95% CI 1.8-5.5), greater urgency for cesarean delivery (OR=40.4, 95% CI 8.8-186), and a non-obstetric anesthesiologist providing care (OR=4.6, 95% CI 1.8-11.5). Insufficient evidence is available to support combined spinal-epidural versus standard epidural techniques, duration of epidural analgesia, cervical dilation at the time of epidural placement, and body mass index or weight as risk factors for failed epidural conversion. The risk of failed conversion of labor epidural analgesia to anesthesia is increased with an increasing number of boluses administered during labor, an enhanced urgency for cesarean delivery, and care being provided by a non-obstetric anesthesiologist. Further high-quality studies are needed to evaluate the many potential risk factors associated with failed conversion of labor epidural analgesia to anesthesia for cesarean delivery. Copyright © 2012 Elsevier Ltd. All rights reserved.
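
    Pooled odds ratios with 95% CIs, like those quoted above, are conventionally combined on the log scale with inverse-variance weights. A minimal fixed-effect sketch (the two input studies are hypothetical; a random-effects model would add a between-study variance term):

      import numpy as np

      def pool_fixed_effect(ors, cis):
          """Inverse-variance pooling of odds ratios on the log scale."""
          log_or = np.log(ors)
          # Back out each study's SE from its 95% CI width on the log scale
          se = (np.log([hi for _, hi in cis]) -
                np.log([lo for lo, _ in cis])) / (2 * 1.96)
          w = 1.0 / se ** 2
          pooled = np.sum(w * log_or) / np.sum(w)
          pooled_se = np.sqrt(1.0 / np.sum(w))
          return tuple(np.exp([pooled, pooled - 1.96 * pooled_se,
                               pooled + 1.96 * pooled_se]))

      # e.g. two hypothetical studies of clinician-administered boluses
      print(pool_fixed_effect([2.8, 3.6], [(1.4, 5.6), (1.7, 7.6)]))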

  14. Pesticides, Neurodevelopmental Disagreement, and Bradford Hill's Guidelines.

    PubMed

    Shrader-Frechette, Kristin; ChoGlueck, Christopher

    2016-06-27

    Neurodevelopmental disorders such as autism affect one-eighth of all U.S. newborns. Yet scientists, accessing the same data and using Bradford-Hill guidelines, draw different conclusions about the causes of these disorders. They disagree about the pesticide-harm hypothesis, that typical United States prenatal pesticide exposure can cause neurodevelopmental damage. This article aims to discover whether apparent scientific disagreement about this hypothesis might be partly attributable to questionable interpretations of the Bradford-Hill causal guidelines. Key scientists, who claim to employ Bradford-Hill causal guidelines, yet fail to accept the pesticide-harm hypothesis, fall into errors of trimming the guidelines, requiring statistically-significant data, and ignoring semi-experimental evidence. However, the main scientists who accept the hypothesis appear to commit none of these errors. Although settling disagreement over the pesticide-harm hypothesis requires extensive analysis, this article suggests that at least some conflicts may arise because of questionable interpretations of the guidelines.

  15. Research on the impacts of past and future hurricanes on the endangered Florida manatee: Chapter 6J in Science and the storms-the USGS response to the hurricanes of 2005

    USGS Publications Warehouse

    Langtimm, Catherine A.; Krohn, M. Dennis; Stith, Bradley M.; Reid, James P.; Beck, C.A.; Butler, Susan M.

    2007-01-01

    U.S. Geological Survey (USGS) research on Florida manatees (Trichechus manatus latirostris) from 1982 through 1998 identified lower apparent survival rates for adult manatees during years when Hurricane Elena (1985), the March "Storm of the Century" (1993), and Hurricane Opal (1995) hit the northern coast of the Gulf of Mexico. Although our analysis showed that a significant number of our monitored individual manatees failed to return to their winter homes after these storms, their actual fate remains unknown. With the aid of new satellite technology to track manatees during storms and new statistical techniques to determine survival and emigration rates, researchers are working to understand how hurricanes impact the endangered species by studying manatees caught in the path of the destructive hurricanes of 2004 and 2005.

  16. [Laparoscopic sterilization with electrocautery: complications and reliability (author's transl)].

    PubMed

    Bänninger, U; Kunz, J; Schreiner, W E

    1979-05-01

    1084 laparoscopic sterilizations were evaluated in a retrospective study at the Universitäts-Frauenklinik Zürich. The operative and early postoperative complications and the reliability of the method were analysed and compared to the results in the literature. Based on a cumulative statistical analysis, intraoperative complications requiring laparotomy occurred in 0.5% of cases, the main indications being haemorrhages and bowel injuries. Failed attempts were encountered in one of 150 patients, the main causes being adhesions and difficulties in establishing pneumoperitoneum. The failure rate of laparoscopic electrocoagulation of the fallopian tube after long-term follow-up was about 0.5%; 20-25% of these failures were ectopic pregnancies. Transection of the fallopian tubes did not diminish the pregnancy rate, but the risk of bleeding was considerably higher with this technique. Concurrently performed therapeutic abortion or preceding laparotomy did not increase the operative complication rate.

  17. Mathematical Model of Cardiovascular and Metabolic Responses to Umbilical Cord Occlusions in Fetal Sheep.

    PubMed

    Wang, Qiming; Gold, Nathan; Frasch, Martin G; Huang, Huaxiong; Thiriet, Marc; Wang, Xiaogang

    2015-12-01

    Fetal acidemia during labor is associated with an increased risk of brain injury and lasting neurological deficits. This is in part due to the repetitive occlusions of the umbilical cord (UCO) induced by uterine contractions. Whereas fetal heart rate (FHR) monitoring is widely used clinically, it fails to detect fetal acidemia. Hence, new approaches are needed for early detection of fetal acidemia during labor. We built a mathematical model of the UCO effects on FHR, mean arterial blood pressure (MABP), oxygenation and metabolism. Mimicking fetal experiments, our in silico model reproduces salient features of experimentally observed fetal cardiovascular and metabolic behavior including FHR overshoot, gradual MABP decrease and mixed metabolic and respiratory acidemia during UCO. Combined with statistical analysis, our model provides valuable insight into the labor-like fetal distress and guidance for refining FHR monitoring algorithms to improve detection of fetal acidemia and cardiovascular decompensation.

  18. Improving nutrition and physical activity in the workplace: a meta-analysis of intervention studies.

    PubMed

    Hutchinson, Amanda D; Wilson, Carlene

    2012-06-01

    A comprehensive search of the literature for studies examining physical activity or nutrition interventions in the workplace, published between 1999 and March 2009, was conducted. This search identified 29 relevant studies. Interventions were grouped according to the theoretical framework on which the interventions were based (e.g. education, cognitive-behavioural, motivation enhancement, social influence, exercise). Weighted Cohen's d effect sizes, percentage overlap statistics, confidence intervals and fail-safe Ns were calculated. Most theoretical approaches were associated with small effects. However, large effects were found for some measures of interventions using motivation enhancement. Effect sizes were larger for studies focusing on one health behaviour and for randomized controlled trials. The workplace is a suitable environment for making modest changes in the physical activity, nutrition and health of employees. Further research is necessary to determine whether these changes can be maintained in the long term.
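
    Two of the summary statistics named here, the weighted Cohen's d and the fail-safe N, are short computations. A hedged sketch with invented per-study inputs (Rosenthal's one-tailed convention for the fail-safe N):

      import numpy as np
      from scipy.stats import norm

      d = np.array([0.25, 0.40, 0.10])   # hypothetical per-study effect sizes
      n = np.array([120, 80, 200])       # per-study sample sizes
      z = np.array([2.1, 2.6, 1.3])      # per-study z-values

      d_weighted = np.sum(n * d) / np.sum(n)      # sample-size-weighted Cohen's d
      z_crit = norm.ppf(0.95)                     # 1.645, one-tailed alpha = .05
      fail_safe_n = np.sum(z) ** 2 / z_crit ** 2 - len(z)  # Rosenthal's file-drawer N
      print(f"weighted d = {d_weighted:.2f}, fail-safe N = {fail_safe_n:.1f}")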

  19. The effectiveness of arthroscopic stabilisation for failed open shoulder instability surgery.

    PubMed

    Millar, N L; Murrell, G A C

    2008-06-01

    We identified ten patients who underwent arthroscopic revision of anterior shoulder stabilisation between 1999 and 2005. Their results were compared with 15 patients, matched for age and gender, who had a primary arthroscopic stabilisation during the same period. At a mean follow-up of 37 and 36 months, respectively, the scores for pain and shoulder function improved significantly between the pre-operative and follow-up visits in both groups (p = 0.002), with no significant difference between them (p = 0.4). The UCLA and Rowe shoulder scores improved significantly (p = 0.004 and p = 0.002, respectively), with no statistically significant differences between groups (p = 0.6). Kaplan-Meier analysis for time to recurrent instability showed no differences between the groups (p = 0.2). These results suggest that arthroscopic revision anterior shoulder stabilisation is as reliable as primary arthroscopic stabilisation for patients who have had previous open surgery for recurrent anterior instability.

  20. Outcome, transport times, and costs of patients evacuated by helicopter versus fixed-wing aircraft.

    PubMed Central

    Thomas, F.; Wisham, J.; Clemmer, T. P.; Orme, J. F.; Larsen, K. G.

    1990-01-01

    We determined the differences in transport times and costs for patients transported by fixed-wing aircraft versus helicopter at ranges of 101 to 150 radial miles, where fixed-wing and helicopter in-hospital transports commonly overlap. Statistical analysis failed to show a significant difference between the trauma-care patients transported by helicopter (n = 109) and those transported by fixed-wing (n = 86) for age, injury severity score, hospital length of stay, hospital mortality, or discharge disability score. The times for returning patients to the receiving hospital by helicopter (n = 104) versus fixed-wing (n = 509) did not differ significantly. Helicopter transport costs per mile ($24), however, were four times those of fixed-wing aircraft with its associated ground ambulance transport costs ($6). Thus, helicopter transport is economically unjustified for interhospital transports exceeding 100 radial miles when an efficient fixed-wing service exists. PMID:2389575

  1. Rate, causes and reporting of medication errors in Jordan: nurses' perspectives.

    PubMed

    Mrayyan, Majd T; Shishani, Kawkab; Al-Faouri, Ibrahim

    2007-09-01

    The aim of the study was to describe Jordanian nurses' perceptions about various issues related to medication errors. This is the first nursing study about medication errors in Jordan. This was a descriptive study. A convenience sample of 799 nurses from 24 hospitals was obtained. Descriptive and inferential statistics were used for data analysis. Over the course of their nursing careers, the average number of medication errors each nurse recalled committing was 2.2. Using incident reports, the rate of medication errors reported to nurse managers was 42.1%. Medication errors occurred mainly when medication labels/packaging were of poor quality or damaged. Nurses failed to report medication errors because they were afraid that they might be subjected to disciplinary actions or even lose their jobs. In the stepwise regression model, gender was the only predictor of medication errors in Jordan. Strategies to reduce or eliminate medication errors are required.

  2. How to limit false positives in environmental DNA and metabarcoding?

    PubMed

    Ficetola, Gentile Francesco; Taberlet, Pierre; Coissac, Eric

    2016-05-01

    Environmental DNA (eDNA) and metabarcoding are boosting our ability to acquire data on species distribution in a variety of ecosystems. Nevertheless, like most sampling approaches, eDNA is not perfect. It can fail to detect species that are actually present, and even false positives are possible: a species may be apparently detected in areas where it is actually absent. Controlling false positives remains a main challenge for eDNA analyses: in this issue of Molecular Ecology Resources, Lahoz-Monfort et al. test the performance of multiple statistical modelling approaches to estimate the rate of detection and false positives from eDNA data. Here, we discuss the importance of controlling for false detection from the early steps of eDNA analyses (laboratory, bioinformatics) to improve the quality of results and allow an efficient use of the site occupancy-detection modelling (SODM) framework for limiting false presences in eDNA analysis. © 2016 John Wiley & Sons Ltd.

  3. A short-term clinical evaluation of IPS Empress 2 crowns.

    PubMed

    Toksavul, Suna; Toman, Muhittin

    2007-01-01

    The aim of this study was to evaluate the clinical performance of all-ceramic crowns made with the IPS Empress 2 system after an observation period of 12 to 60 months. Seventy-nine IPS Empress 2 crowns were placed in 21 patients. The all-ceramic crowns were evaluated clinically, radiographically, and using clinical photographs. The evaluations took place at baseline (2 days after cementation) and at 6-month intervals for 12 to 60 months. Survival rate of the crowns was determined using Kaplan-Meier statistical analysis. Based on the US Public Health Service criteria, 95.24% of the crowns were rated satisfactory after a mean follow-up period of 58 months. Fracture was registered in only 1 crown. One endodontically treated tooth failed as a result of fracture at the cervical margin area. In this in vivo study, IPS Empress 2 crowns exhibited a satisfactory clinical performance during an observation period ranging from 12 to 60 months.
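
    Crown survival of this kind is what the Kaplan-Meier estimator summarizes. A generic sketch with the lifelines package (follow-up times and events are invented; 1 marks a failure, 0 a crown still in service at last follow-up):

      from lifelines import KaplanMeierFitter

      months = [58, 60, 24, 12, 36, 60, 48, 60]   # follow-up per crown
      failed = [0, 0, 0, 0, 1, 0, 0, 0]           # one fracture observed

      kmf = KaplanMeierFitter()
      kmf.fit(months, event_observed=failed)
      print(kmf.survival_function_)               # stepwise survival estimate
      print(f"survival at 36 months: {kmf.predict(36):.3f}")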

  4. Domestic violence in consanguineous marriages - findings from Pakistan Demographic and Health Survey 2012-13.

    PubMed

    Shaikh, Masood Ali

    2016-10-01

    Domestic violence is pandemic and is estimated to affect one in three women globally in their lifetime. Marriages within blood relations are common in Pakistan. In this study, a secondary analysis of the Pakistan Demographic and Health Survey 2012-13 was done to study the prevalence and profile of domestic violence in the context of consanguineous marriages in Pakistan. Almost 65% of women had some kind of blood relationship with their husbands. Women having a blood relationship with their husbands were more likely to report having ever been subjected to marital control behaviours and to emotional and physical violence by their husbands, compared to those without such a relationship. However, these associations failed to reach statistical significance, underscoring the ubiquitous nature of marital control and violence. More effective public health education campaigns for just and equal treatment of wives by their husbands are needed to speedily curb the scourge of domestic violence in the country.

  5. Effect of first-encounter pretest on pass/fail rates of a clinical skills medical licensure examination.

    PubMed

    Roberts, William L; McKinley, Danette W; Boulet, John R

    2010-05-01

    Due to the high-stakes nature of medical exams it is prudent for test agencies to critically evaluate test data and control for potential threats to validity. For the typical multiple station performance assessments used in medicine, it may take time for examinees to become comfortable with the test format and administrative protocol. Since each examinee in the rotational sequence starts with a different task (e.g., simulated clinical encounter), those who are administered non-scored pretest material on their first station may have an advantage compared to those who are not. The purpose of this study is to investigate whether pass/fail rates are different across the sequence of pretest encounters administered during the testing day. First-time takers were grouped by the sequential order in which they were administered the pretest encounter. No statistically significant difference in fail rates was found between examinees who started with the pretest encounter and those who encountered the pretest encounter later in the sequence. Results indicate that current examination administration protocols do not present a threat to the validity of test score interpretations.

  6. A Life Study of Ausforged, Standard Forged and Standard Machined AISI M-50 Spur Gears

    NASA Technical Reports Server (NTRS)

    Townsend, D. P.; Bamberger, E. N.; Zaretsky, E. V.

    1975-01-01

    Tests were conducted at 350 K (170 F) with three groups of 8.9 cm (3.5 in.) pitch diameter spur gears made of vacuum induction melted (VIM), consumable-electrode vacuum-arc melted (VAR) AISI M-50 steel and one group of vacuum-arc remelted (VAR) AISI 9310 steel. The pitting fatigue life of the standard forged and ausforged gears was approximately five times that of the VAR AISI 9310 gears and ten times the bending fatigue life of the standard machined VIM-VAR AISI M-50 gears run under identical conditions. There was a slight decrease in the 10-percent life of the ausforged gears from that of the standard forged gears, but the difference is not statistically significant. The standard machined gears failed primarily by gear tooth fracture while the forged and ausforged VIM-VAR AISI M-50 and the VAR AISI 9310 gears failed primarily by surface pitting fatigue. The ausforged gears had a slightly greater tendency to fail by tooth fracture than the standard forged gears.

  7. Invariance in the recurrence of large returns and the validation of models of price dynamics

    NASA Astrophysics Data System (ADS)

    Chang, Lo-Bin; Geman, Stuart; Hsieh, Fushing; Hwang, Chii-Ruey

    2013-08-01

    Starting from a robust, nonparametric definition of large returns (“excursions”), we study the statistics of their occurrences, focusing on the recurrence process. The empirical waiting-time distribution between excursions is remarkably invariant to year, stock, and scale (return interval). This invariance is related to self-similarity of the marginal distributions of returns, but the excursion waiting-time distribution is a function of the entire return process and not just its univariate probabilities. Generalized autoregressive conditional heteroskedasticity (GARCH) models, market-time transformations based on volume or trades, and generalized (Lévy) random-walk models all fail to fit the statistical structure of excursions.
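
    In its simplest form, the recurrence statistic studied here is the empirical distribution of waiting times between threshold-exceeding returns. A minimal sketch (the threshold rule and toy data are placeholders; the paper's excursion definition is nonparametric and more involved):

      import numpy as np

      rng = np.random.default_rng(1)
      returns = rng.standard_t(df=4, size=10_000)     # heavy-tailed toy returns

      threshold = np.quantile(np.abs(returns), 0.99)  # flag the largest 1%
      excursions = np.flatnonzero(np.abs(returns) >= threshold)
      waiting_times = np.diff(excursions)             # gaps between excursions

      # Comparing this distribution across years, stocks, and return intervals
      # is the invariance test; here we just report summary quantiles.
      print(np.quantile(waiting_times, [0.25, 0.5, 0.75]))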

  8. Latent Dirichlet Allocation (LDA) for Sentiment Analysis Toward Tourism Review in Indonesia

    NASA Astrophysics Data System (ADS)

    Putri, IR; Kusumaningrum, R.

    2017-01-01

    The tourism industry is a foreign exchange sector with considerable development potential in Indonesia. Compared to other Southeast Asian countries such as Malaysia, with 18 million tourists, and Singapore, with 20 million tourists, Indonesia, the largest country in Southeast Asia, has failed to attract comparable tourist numbers: it managed to attract only 8.8 million foreign tourists in 2013, a figure likely to decrease each year. Apart from infrastructure problems, marketing and management are also obstacles to tourism growth. Stakeholders should carry out evaluation and self-analysis to respond to this problem and capture opportunities related to tourist satisfaction in reviews. To date, the main technology for addressing this problem relies on subjective statistical data collected by random user voting or grading, so the results are not accountable. Thus, we propose sentiment analysis with a probabilistic topic model, using the Latent Dirichlet Allocation (LDA) method, to read the general tendency of tourist reviews into topics that can be classified as positive or negative sentiment.
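
    As a hedged illustration of the first stage of such a pipeline, here is a minimal gensim LDA sketch over toy review snippets; mapping the resulting topics to positive or negative sentiment would follow as a separate labeling step (corpus, topic count, and pass count are placeholders):

      from gensim import corpora, models

      reviews = [
          "beautiful beach clean water friendly staff",
          "dirty toilet expensive ticket long queue",
          "amazing temple rich culture friendly guide",
          "traffic jam poor road expensive taxi",
      ]
      tokens = [r.split() for r in reviews]
      dictionary = corpora.Dictionary(tokens)
      corpus = [dictionary.doc2bow(t) for t in tokens]

      lda = models.LdaModel(corpus, num_topics=2, id2word=dictionary,
                            passes=20, random_state=0)
      for topic_id, words in lda.print_topics():
          print(topic_id, words)   # inspect topics, then label +/- sentiment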

  9. Failures to replicate blocking are surprising and informative-Reply to Soto (2018).

    PubMed

    Maes, Elisa; Krypotos, Angelos-Miltiadis; Boddez, Yannick; Alfei Palloni, Joaquín Matías; D'Hooge, Rudi; De Houwer, Jan; Beckers, Tom

    2018-04-01

    The blocking effect has inspired numerous associative learning theories and is widely cited in the literature. We recently reported a series of 15 experiments that failed to obtain a blocking effect in rodents. On the basis of those consistent failures, we claimed that there is a lack of insight into the boundary conditions for blocking. In his commentary, Soto (2018) argued that contemporary associative learning theory does provide a specific boundary condition for the occurrence of blocking, namely the use of same- versus different-modality stimuli. Given that in 10 of our 15 experiments same-modality stimuli were used, he claims that our failure to observe a blocking effect is unsurprising. We disagree with that claim, because of theoretical, empirical, and statistical problems with his analysis. We also address 2 other possible reasons for a lack of blocking that are referred to in Soto's (2018) analysis, related to generalization and salience, and dissect the potential importance of both. Although Soto's (2018) analyses raise a number of interesting points, we see more merit in an empirically guided analysis and call for empirical testing of boundary conditions on blocking. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  10. Metabolomic Modularity Analysis (MMA) to Quantify Human Liver Perfusion Dynamics.

    PubMed

    Sridharan, Gautham Vivek; Bruinsma, Bote Gosse; Bale, Shyam Sundhar; Swaminathan, Anandh; Saeidi, Nima; Yarmush, Martin L; Uygun, Korkut

    2017-11-13

    Large-scale -omics data are now ubiquitously utilized to capture and interpret global responses to perturbations in biological systems, such as the impact of disease states on cells, tissues, and whole organs. Metabolomics data, in particular, are difficult to interpret for providing physiological insight because predefined biochemical pathways used for analysis are inherently biased and fail to capture more complex network interactions that span multiple canonical pathways. In this study, we introduce a novel approach coined Metabolomic Modularity Analysis (MMA) as a graph-based algorithm to systematically identify metabolic modules of reactions enriched with metabolites flagged to be statistically significant. A defining feature of the algorithm is its ability to determine modularity that highlights interactions between reactions mediated by the production and consumption of cofactors and other hub metabolites. As a case study, we evaluated the metabolic dynamics of discarded human livers using time-course metabolomics data and MMA to identify modules that explain the observed physiological changes leading to liver recovery during subnormothermic machine perfusion (SNMP). MMA was performed on a large-scale liver-specific human metabolic network that was weighted based on metabolomics data and identified cofactor-mediated modules that would not have been discovered by traditional metabolic pathway analyses.

  11. Risk factors for poor multidrug-resistant tuberculosis treatment outcomes in Kyiv Oblast, Ukraine.

    PubMed

    Aibana, Omowunmi; Bachmaha, Mariya; Krasiuk, Viatcheslav; Rybak, Natasha; Flanigan, Timothy P; Petrenko, Vasyl; Murray, Megan B

    2017-02-07

    Ukraine is among the ten countries with the highest burden of multidrug-resistant TB (MDR-TB) worldwide. Treatment success rates for MDR-TB in Ukraine remain below global success rates as reported by the World Health Organization. Few studies have evaluated predictors of poor MDR-TB outcomes in Ukraine. We conducted a retrospective analysis of patients initiated on MDR-TB treatment in the Kyiv Oblast of Ukraine between January 01, 2012 and March 31, 2015. We defined good treatment outcomes as cure or completion and categorized poor outcomes among those who died, failed treatment or defaulted. We used logistic regression analyses to identify baseline patient characteristics associated with poor MDR-TB treatment outcomes. Among 360 patients, 65 (18.1%) achieved treatment cure or completion while 131 (36.4%) died, 115 (31.9%) defaulted, and 37 (10.3%) failed treatment. In the multivariate analysis, the strongest baseline predictors of poor outcomes were HIV infection without anti-retroviral therapy (ART) initiation (aOR 10.07; 95% CI 1.20-84.45; p = 0.03) and presence of extensively drug-resistant TB (aOR 9.19; 95% CI 1.17-72.06; p = 0.03). HIV-positive patients initiated on ART were not at increased risk of poor outcomes (aOR 1.43; 95% CI 0.58-3.54; p = 0.44). There was no statistically significant difference in risk of poor outcomes among patients who received baseline molecular testing with Gene Xpert compared to those who were not tested (aOR 1.31; 95% CI 0.63-2.73). Rigorous compliance with national guidelines recommending prompt initiation of ART among HIV/TB co-infected patients and use of drug susceptibility testing results to construct treatment regimens can have a major impact on improving MDR-TB treatment outcomes in Ukraine.

  12. Feeling the future: A meta-analysis of 90 experiments on the anomalous anticipation of random future events

    PubMed Central

    Bem, Daryl; Tressoldi, Patrizio; Rabeyron, Thomas; Duggan, Michael

    2016-01-01

    In 2011, one of the authors (DJB) published a report of nine experiments in the Journal of Personality and Social Psychology purporting to demonstrate that an individual’s cognitive and affective responses can be influenced by randomly selected stimulus events that do not occur until after his or her responses have already been made and recorded, a generalized variant of the phenomenon traditionally denoted by the term precognition. To encourage replications, all materials needed to conduct them were made available on request. We here report a meta-analysis of 90 experiments from 33 laboratories in 14 countries which yielded an overall effect greater than 6 sigma, z = 6.40, p = 1.2 × 10⁻¹⁰, with an effect size (Hedges’ g) of 0.09. A Bayesian analysis yielded a Bayes Factor of 5.1 × 10⁹, greatly exceeding the criterion value of 100 for “decisive evidence” in support of the experimental hypothesis. When DJB’s original experiments are excluded, the combined effect size for replications by independent investigators is 0.06, z = 4.16, p = 1.1 × 10⁻⁵, and the BF value is 3,853, again exceeding the criterion for “decisive evidence.” The number of potentially unretrieved experiments required to reduce the overall effect size of the complete database to a trivial value of 0.01 is 544, and seven of eight additional statistical tests support the conclusion that the database is not significantly compromised by either selection bias or by intense “p-hacking”—the selective suppression of findings or analyses that failed to yield statistical significance. P-curve analysis, a recently introduced statistical technique, estimates the true effect size of the experiments to be 0.20 for the complete database and 0.24 for the independent replications, virtually identical to the effect size of DJB’s original experiments (0.22) and the closely related “presentiment” experiments (0.21). We discuss the controversial status of precognition and other anomalous effects collectively known as psi. PMID:26834996
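
    The combined z reported above is of the kind produced by Stouffer's method for pooling independent per-experiment z-scores. A generic, unweighted sketch (inputs hypothetical):

      import numpy as np
      from scipy.stats import norm

      def stouffer_z(z_scores):
          """Unweighted Stouffer combination of independent z-scores."""
          z = np.asarray(z_scores, dtype=float)
          z_comb = z.sum() / np.sqrt(len(z))
          return z_comb, norm.sf(z_comb)   # combined z and one-tailed p

      print(stouffer_z([1.2, 0.8, 2.1, -0.3, 1.5]))  # placeholder z-values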

  13. Predicting In Vivo Anti-Hepatofibrotic Drug Efficacy Based on In Vitro High-Content Analysis

    PubMed Central

    Zheng, Baixue; Tan, Looling; Mo, Xuejun; Yu, Weimiao; Wang, Yan; Tucker-Kellogg, Lisa; Welsch, Roy E.; So, Peter T. C.; Yu, Hanry

    2011-01-01

    Background/Aims Many anti-fibrotic drugs with high in vitro efficacies fail to produce significant effects in vivo. The aim of this work is to use a statistical approach to design a numerical predictor that correlates better with in vivo outcomes. Methods High-content analysis (HCA) was performed with 49 drugs on hepatic stellate cells (HSCs) LX-2 stained with 10 fibrotic markers. ∼0.3 billion feature values from all cells in >150,000 images were quantified to reflect the drug effects. A systematic literature search on the in vivo effects of all 49 drugs on hepatofibrotic rats yields 28 papers with histological scores. The in vivo and in vitro datasets were used to compute a single efficacy predictor (Epredict). Results We used in vivo data from one context (CCl4 rats with drug treatments) to optimize the computation of Epredict. This optimized relationship was independently validated using in vivo data from two different contexts (treatment of DMN rats and prevention of CCl4 induction). A linear in vitro-in vivo correlation was consistently observed in all the three contexts. We used Epredict values to cluster drugs according to efficacy; and found that high-efficacy drugs tended to target proliferation, apoptosis and contractility of HSCs. Conclusions The Epredict statistic, based on a prioritized combination of in vitro features, provides a better correlation between in vitro and in vivo drug response than any of the traditional in vitro markers considered. PMID:22073152

  14. Performance evaluation of mobile downflow booths for reducing airborne particles in the workplace.

    PubMed

    Lo, Li-Ming; Hocker, Braden; Steltz, Austin E; Kremer, John; Feng, H Amy

    2017-11-01

    Compared to other common control measures, the downflow booth is a costly engineering control used to contain airborne dust or particles. The downflow booth provides unidirectional filtered airflow from the ceiling, entraining released particles away from the workers' breathing zone, and delivers contained airflow to a lower level exhaust for removing particulates by filtering media. In this study, we designed and built a mobile downflow booth that is capable of quick assembly and easy size change to provide greater flexibility and particle control for various manufacturing processes or tasks. An experimental study was conducted to thoroughly evaluate the control performance of downflow booths used for removing airborne particles generated by the transfer of powdered lactose between two containers. Statistical analysis compared particle reduction ratios obtained from various test conditions including booth size (short, regular, or extended), supply air velocity (0.41 and 0.51 m/s or 80 and 100 feet per minute, fpm), powder transfer location (near or far from the booth exhaust), and inclusion or exclusion of curtains at the booth entrance. Our study results show that only short-depth downflow booths failed to protect the worker performing powder transfer far from the booth exhausts. Statistical analysis shows that better control performance can be obtained with supply air velocity of 0.51 m/s (100 fpm) than with 0.41 m/s (80 fpm) and that use of curtains for downflow booths did not improve their control performance.

  15. Comparative evaluation of spectroscopic models using different multivariate statistical tools in a multicancer scenario

    NASA Astrophysics Data System (ADS)

    Ghanate, A. D.; Kothiwale, S.; Singh, S. P.; Bertrand, Dominique; Krishna, C. Murali

    2011-02-01

    Cancer is now recognized as one of the major causes of morbidity and mortality. Histopathological diagnosis, the gold standard, has been shown to be subjective, time consuming, prone to interobserver disagreement, and often fails to predict prognosis. Optical spectroscopic methods are being contemplated as adjuncts or alternatives to conventional cancer diagnostics. The most important aspect of these approaches is their objectivity, and multivariate statistical tools play a major role in realizing it. However, rigorous evaluation of the robustness of spectral models is a prerequisite. The utility of Raman spectroscopy in the diagnosis of cancers has been well established. Until now, the specificity and applicability of spectral models have been evaluated for specific cancer types. In this study, we have evaluated the utility of spectroscopic models representing normal and malignant tissues of the breast, cervix, colon, larynx, and oral cavity in a broader perspective, using different multivariate tests. The limit test, which was used in our earlier study, gave high sensitivity but suffered from poor specificity. The performance of other methods such as factorial discriminant analysis and partial least squares discriminant analysis is on par with that of more complex nonlinear methods such as decision trees, but they provide very little information about the classification model. This comparative study thus demonstrates not just the efficacy of Raman spectroscopic models but also the applicability and limitations of different multivariate tools for discrimination under complex conditions such as the multicancer scenario.

  16. Canadian firearms legislation and effects on homicide 1974 to 2008.

    PubMed

    Langmann, Caillin

    2012-08-01

    Canada has implemented legislation covering all firearms since 1977 and presents a model for examining incremental firearms control. The effect of legislation on homicide by firearm and the subcategory, spousal homicide, is controversial and has not been well studied to date. Legislative effects on homicide and spousal homicide were analyzed using data obtained from Statistics Canada from 1974 to 2008. Three statistical methods were applied to search for any associated effects of firearms legislation: interrupted time series regression, ARIMA, and Joinpoint analysis. No significant beneficial associations between firearms legislation and homicide or spousal homicide rates were found after the passage of three Acts by the Canadian Parliament--Bill C-51 (1977), C-17 (1991), and C-68 (1995)--nor after the implementation of licensing in 2001 and the registration of rifles and shotguns in 2003. After the passage of C-68, a decrease in the rate of the decline of homicide by firearm was found by interrupted regression. Joinpoint analysis also found an increasing trend in the homicide-by-firearm rate after the enactment of the licensing portion of C-68. Other factors found to be associated with homicide rates were median age, unemployment, immigration rates, percentage of population in a low-income bracket, Gini index of income equality, population per police officer, and incarceration rate. This study failed to demonstrate a beneficial association between legislation and firearm homicide rates between 1974 and 2008.
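
    Interrupted time series regression of the kind applied here can be sketched as an OLS model with level-change and trend-change terms at a legislative date. A generic sketch with statsmodels and toy annual data (one intervention shown; the study fit several):

      import numpy as np
      import statsmodels.api as sm

      years = np.arange(1974, 2009)
      rng = np.random.default_rng(2)
      rate = 2.5 - 0.02 * (years - 1974) + rng.normal(0, 0.1, years.size)

      post = (years >= 1995).astype(float)      # e.g. Bill C-68 (1995)
      X = np.column_stack([
          years - 1974,                         # pre-existing trend
          post,                                 # level change at the Act
          post * (years - 1995),                # trend change after the Act
      ])
      fit = sm.OLS(rate, sm.add_constant(X)).fit()
      print(fit.params)  # intercept, baseline trend, level shift, trend shift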

  17. On polarimetric radar signatures of deep convection for model evaluation: columns of specific differential phase observed during MC3E

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    van Lier-Walqui, Marcus; Fridlind, Ann; Ackerman, Andrew S

    2016-02-01

    The representation of deep convection in general circulation models is in part informed by cloud-resolving models (CRMs) that function at higher spatial and temporal resolution; however, recent studies have shown that CRMs often fail at capturing the details of deep convection updrafts. With the goal of providing constraints on CRM simulation of deep convection updrafts, ground-based remote sensing observations are analyzed and statistically correlated for four deep convection events observed during the Midlatitude Continental Convective Clouds Experiment (MC3E). Since positive values of specific differential phase observed above the melting level are associated with deep convection updraft cells, these so-called columns are analyzed using two scanning polarimetric radars in Oklahoma: the National Weather Service Vance WSR-88D (KVNX) and the Department of Energy C-band Scanning Atmospheric Radiation Measurement (ARM) Precipitation Radar (C-SAPR). KVNX and C-SAPR column volumes are then statistically correlated with vertical winds retrieved via multi-Doppler wind analysis, lightning flash activity derived from the Oklahoma Lightning Mapping Array, and KVNX differential reflectivity. Results indicate strong correlations of column volume above the melting level with updraft mass flux, lightning flash activity, and intense rainfall. Analysis of the columns reveals signatures of changing updraft properties from one storm event to another as well as during event evolution. Comparison of specific differential phase to differential reflectivity shows commonalities in the information content of each, as well as potential problems associated with observational artifacts.

  18. On damage detection in wind turbine gearboxes using outlier analysis

    NASA Astrophysics Data System (ADS)

    Antoniadou, Ifigeneia; Manson, Graeme; Dervilis, Nikolaos; Staszewski, Wieslaw J.; Worden, Keith

    2012-04-01

    The proportion of worldwide installed wind power in power systems increases over the years as a result of the steadily growing interest in renewable energy sources. Still, the advantages offered by the use of wind power are overshadowed by the high operational and maintenance costs, resulting in the low competitiveness of wind power in the energy market. In order to reduce the costs of corrective maintenance, the application of condition monitoring to gearboxes becomes highly important, since gearboxes are among the wind turbine components with the most frequent failure observations. While condition monitoring of gearboxes in general is common practice, with various methods having been developed over the last few decades, wind turbine gearbox condition monitoring faces a major challenge: the detection of faults under the time-varying load conditions prevailing in wind turbine systems. Classical time and frequency domain methods fail to detect faults under variable load conditions, due to the temporary effect that these faults have on vibration signals. This paper uses the statistical discipline of outlier analysis for the damage detection of gearbox tooth faults. A simplified two-degree-of-freedom gearbox model considering nonlinear backlash, time-periodic mesh stiffness and static transmission error, simulates the vibration signals to be analysed. Local stiffness reduction is used for the simulation of tooth faults and statistical processes determine the existence of intermittencies. The lowest level of fault detection, the threshold value, is considered and the Mahalanobis squared-distance is calculated for the novelty detection problem.
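
    A minimal sketch of the novelty-detection step described above: the Mahalanobis squared distance of condition-indicator features against a healthy-gear baseline, with the detection threshold taken from the baseline distances themselves. The feature dimension, threshold percentile, and all data are invented for illustration.

    ```python
    # Outlier-analysis sketch: Mahalanobis squared distance against a baseline.
    import numpy as np

    rng = np.random.default_rng(2)
    baseline = rng.normal(size=(500, 4))             # features from healthy-gear signals
    mu = baseline.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(baseline, rowvar=False))

    def mahalanobis_sq(x):
        d = x - mu
        return d @ cov_inv @ d

    # Threshold: e.g. the 99th percentile of baseline distances.
    thr = np.percentile([mahalanobis_sq(x) for x in baseline], 99)
    test = rng.normal(size=4) + np.array([0, 0, 3.0, 0])   # simulated tooth fault
    print(mahalanobis_sq(test) > thr)                      # True flags novelty
    ```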

  19. Vulnerability and cosusceptibility determine the size of network cascades

    DOE PAGES

    Yang, Yang; Nishikawa, Takashi; Motter, Adilson E.

    2017-01-27

    In a network, a local disturbance can propagate and eventually cause a substantial part of the system to fail in cascade events that are easy to conceptualize but extraordinarily difficult to predict. Here, we develop a statistical framework that can predict cascade size distributions by incorporating only two ingredients: the vulnerability of individual components and the cosusceptibility of groups of components (i.e., their tendency to fail together). Using cascades in power grids as a representative example, we show that correlations between component failures define structured and often surprisingly large groups of cosusceptible components. Aside from their implications for blackout studies, these results provide insights and a new modeling framework for understanding cascades in financial systems, food webs, and complex networks in general.

  20. SIGPI. Fault Tree Cut Set System Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patenaude, C.J.

    1992-01-13

    SIGPI computes the probabilistic performance of complex systems by combining cut set or other binary product data with probability information on each basic event. SIGPI is designed to work with either coherent systems, where the system fails when certain combinations of components fail, or noncoherent systems, where at least one cut set occurs only if at least one component of the system is operating properly. The program can handle conditionally independent components, dependent components, or a combination of component types and has been used to evaluate responses to environmental threats and seismic events. The three data types that can be input are cut set data in disjoint normal form, basic component probabilities for independent basic components, and mean and covariance data for statistically dependent basic components.
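
    The core cut-set calculation such a tool automates can be illustrated for the simple case of a coherent system with independent components, using inclusion-exclusion over minimal cut sets. The component names and probabilities below are made up; SIGPI itself works on disjoint normal form and handles dependent components, which this toy does not.

    ```python
    # Toy cut-set calculation: failure probability of a coherent system from
    # minimal cut sets with independent components, via inclusion-exclusion.
    from itertools import combinations
    from math import prod

    p = {"A": 0.01, "B": 0.02, "C": 0.05}        # basic-event failure probabilities
    cut_sets = [{"A", "B"}, {"B", "C"}]          # system fails if any cut set fails

    def system_failure_prob(cut_sets, p):
        total = 0.0
        for k in range(1, len(cut_sets) + 1):
            for combo in combinations(cut_sets, k):
                union = set().union(*combo)      # events in this intersection of cut sets
                term = prod(p[e] for e in union)
                total += term if k % 2 == 1 else -term
        return total

    print(system_failure_prob(cut_sets, p))      # 0.0002 + 0.001 - 0.00001 = 0.00119
    ```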

  2. National intelligence estimates and the Failed State Index.

    PubMed

    Voracek, Martin

    2013-10-01

    Across 177 countries around the world, the Failed State Index, a measure of state vulnerability, was reliably negatively associated with estimates of national intelligence. Psychometric analysis of the Failed State Index, a composite of 12 social, economic, and political indicators, suggested factorial unidimensionality of this index. The observed correspondence of higher national intelligence figures to lower state vulnerability might arise through these two macro-level variables being proxies of even more pervasive historical and societal background variables that affect both.

  3. 'I Beg Your Pardon?': The Preverbal Negotiation of Failed Messages.

    ERIC Educational Resources Information Center

    Golinkoff, Roberta Michnick

    1986-01-01

    Analysis of videotapes recorded of three preverbal infants' communication attempts with their mothers revealed three behaviors: Negotiations occurred when mothers helped infants make their intents clear; Immediate Successes occurred when mothers readily comprehended the infants' intents; and Missed Attempts occurred when the mother failed to…

  4. Skewness and kurtosis analysis for non-Gaussian distributions

    NASA Astrophysics Data System (ADS)

    Celikoglu, Ahmet; Tirnakli, Ugur

    2018-06-01

    In this paper we address a number of pitfalls regarding the use of kurtosis as a measure of deviations from the Gaussian. We treat kurtosis in both its standard definition and that which arises in q-statistics, namely q-kurtosis. We have recently shown that the relation proposed by Cristelli et al. (2012) between skewness and kurtosis can only be verified for relatively small data sets, independently of the type of statistics chosen; however, it fails for sufficiently large data sets if the fourth moment of the distribution is finite. For infinite fourth moments, kurtosis is not defined as the size of the data set tends to infinity. For distributions with finite fourth moments, the size N of the data set for which the standard kurtosis saturates to a fixed value depends on the deviation of the original distribution from the Gaussian. Nevertheless, using kurtosis as a criterion for deciding which distribution deviates further from the Gaussian can be misleading for small data sets, even for distributions with finite fourth moments. Going over to q-statistics, we find that although the value of q-kurtosis is finite in the range 0 < q < 3, this quantity is not useful for comparing different non-Gaussian distributed data sets unless the appropriate q value, which truly characterizes the data set of interest, is chosen. Finally, we propose a method to determine the correct q value and thereby to compute the q-kurtosis of q-Gaussian distributed data sets.
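
    The slow saturation of sample kurtosis with data-set size N is easy to demonstrate numerically for a finite-fourth-moment, non-Gaussian distribution. The sketch below uses Student's t with 5 degrees of freedom, whose true standard (Pearson) kurtosis is 9; the seed and sample sizes are arbitrary.

    ```python
    # Sample kurtosis versus N for Student's t (df = 5, true standard kurtosis 9).
    import numpy as np
    from scipy.stats import kurtosis

    rng = np.random.default_rng(3)
    for n in (10**3, 10**4, 10**5, 10**6):
        x = rng.standard_t(df=5, size=n)
        print(n, round(kurtosis(x, fisher=False), 2))  # approaches 9 only for large N
    ```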

  5. [The effect of air pollutants on birth weight in medium-sized towns in the state of São Paulo].

    PubMed

    Santos, Veridiana de Paula; de Medeiros, Andréa Paula Peneluppi; de Lima, Thaiza Agostini Córdoba; Nascimento, Luiz Fernando Costa

    2014-12-01

    To investigate the effect of air pollution on birth weight in a medium-sized town in the State of São Paulo, Southeast Brazil. Cross-sectional study using data on live births to mothers residing in São José dos Campos from 2005 to 2009. Data were obtained from the Department of Information and Computing of the Brazilian Unified Health System. Air pollutant data (particulate matter [PM], SO2 and O3) and daily averages of their concentrations were obtained from the Environmental Sanitation & Technology Company. Statistical analysis was performed by linear and logistic regressions using the Excel and STATA v.7 software programs. Maternal exposure to air pollutants was not associated with low birth weight, with the exception of exposure to SO2 within the last month of pregnancy (OR = 1.25; 95% CI 1.00-1.56). Maternal exposure to PM and SO2 during the last month of pregnancy led to lower weight at birth (0.28 g and 3.15 g, respectively) for each 1 mg/m³ increase in the concentration of these pollutants, but without statistical significance. This study failed to identify a statistically significant association between the levels of air pollutants and birth weight, with the exception of exposure to SO2 within the last month of pregnancy. Copyright © 2014 Associação de Pediatria de São Paulo. Published by Elsevier Editora Ltda. All rights reserved.

  6. Whole vertebral bone segmentation method with a statistical intensity-shape model based approach

    NASA Astrophysics Data System (ADS)

    Hanaoka, Shouhei; Fritscher, Karl; Schuler, Benedikt; Masutani, Yoshitaka; Hayashi, Naoto; Ohtomo, Kuni; Schubert, Rainer

    2011-03-01

    An automatic segmentation algorithm for the vertebrae in human body CT images is presented. In particular, we focused on constructing and utilizing four different statistical intensity-shape combined models for the cervical, upper/lower thoracic, and lumbar vertebrae, respectively. For this purpose, two previously reported methods were combined: a deformable model-based initial segmentation method and a statistical shape-intensity model-based precise segmentation method. The former is used as pre-processing to detect the position and orientation of each vertebra, which determines the initial condition for the latter precise segmentation method. The precise segmentation method needs prior knowledge of both the intensities and the shapes of the objects. After PCA of such shape-intensity expressions obtained from training image sets, vertebrae were parametrically modeled as a linear combination of the principal component vectors. The segmentation of each target vertebra was performed by fitting this parametric model to the target image by maximum a posteriori estimation, combined with the geodesic active contour method. In an experiment using 10 cases, the initial segmentation was successful in 6 cases and only partially failed in 4 (2 in the cervical area and 2 in the lumbo-sacral). In the precise segmentation, the mean error distances were 2.078, 1.416, 0.777, and 0.939 mm for the cervical, upper thoracic, lower thoracic, and lumbar spines, respectively. In conclusion, our automatic segmentation algorithm for the vertebrae in human body CT images showed fair performance for cervical, thoracic, and lumbar vertebrae.
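
    The statistical-model step can be sketched compactly: PCA over concatenated shape-intensity vectors from training cases, then an instance parameterized as the mean plus a linear combination of principal modes. The training data, feature length, and number of retained modes below are stand-ins.

    ```python
    # PCA-based statistical shape-intensity model (synthetic training vectors).
    import numpy as np

    rng = np.random.default_rng(4)
    train = rng.normal(size=(10, 3000))          # 10 cases x shape+intensity features
    mean = train.mean(axis=0)
    U, s, Vt = np.linalg.svd(train - mean, full_matrices=False)
    modes = Vt[:5]                               # keep the first 5 principal modes

    b = np.array([1.0, -0.5, 0.0, 0.2, 0.0])     # model parameters (fitted by MAP in the paper)
    instance = mean + b @ modes                  # one parametric vertebra instance
    ```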

  7. A Biosequence-based Approach to Software Characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oehmen, Christopher S.; Peterson, Elena S.; Phillips, Aaron R.

    For many applications, it is desirable to have some process for recognizing when software binaries are closely related without relying on them to be identical or have identical segments. Some examples include monitoring utilization of high performance computing centers or service clouds, detecting freeware in licensed code, and enforcing application whitelists. But doing so in a dynamic environment is a nontrivial task because most approaches to software similarity require extensive and time-consuming analysis of a binary, or they fail to recognize executables that are similar but nonidentical. Presented herein is a novel biosequence-based method for quantifying similarity of executable binaries. Using this method, it is shown in an example application on large-scale multi-author codes that 1) the biosequence-based method has a statistical performance in recognizing and distinguishing between a collection of real-world high performance computing applications better than 90% of ideal; and 2) an example of using family tree analysis to tune identification for a code subfamily can achieve better than 99% of ideal performance.

  8. Education Industry

    DTIC Science & Technology

    2007-01-01

    The US higher education system is considered the “gold standard” worldwide. For example, according to National Science Foundation data, the number of... data has been to highlight a school as failing when in fact it isn’t. This is known in the education community as the “diversity penalty” (Darling...address these challenges at the local level. Background The National Center for Education Statistics estimates there will be a six percent

  9. A Comparison of Multiscale Permutation Entropy Measures in On-Line Depth of Anesthesia Monitoring

    PubMed Central

    Li, Xiaoli; Li, Duan; Li, Yongwang; Ursino, Mauro

    2016-01-01

    Objective: Multiscale permutation entropy (MSPE) has become an interesting tool for exploring neurophysiological mechanisms in recent years. In this study, six MSPE measures were proposed for on-line depth of anesthesia (DoA) monitoring to quantify the anesthetic effect on real-time EEG recordings. The performance of these measures in describing the transient characteristics of simulated neural populations and clinical anesthesia EEG was evaluated and compared. Methods: Six MSPE algorithms, derived from Shannon permutation entropy (SPE), Renyi permutation entropy (RPE), and Tsallis permutation entropy (TPE) combined with the decomposition procedures of the coarse-graining (CG) method and moving average (MA) analysis, were studied. A thalamo-cortical neural mass model (TCNMM) was used to generate noise-free EEG under anesthesia to quantitatively assess the robustness of each MSPE measure against noise. Then, clinical anesthesia EEG recordings from 20 patients were analyzed with these measures. To validate their effectiveness, the six measures were compared in terms of their ability to track dynamical changes in the EEG data and their performance in state discrimination. The Pearson correlation coefficient (R) was used to assess the relationship among MSPE measures. Results: CG-based MSPEs failed in on-line DoA monitoring at multiscale analysis. In on-line EEG analysis, the MA-based MSPE measures at 5 decomposed scales could track the transient changes of EEG recordings and statistically distinguish the awake, unconscious, and recovery of consciousness (RoC) states. Compared to single-scale SPE and RPE, MSPEs had better anti-noise ability, and MA-RPE at scale 5 performed best in this respect. MA-TPE outperformed the other measures with a faster tracking speed of the loss of consciousness. Conclusions: MA-based multiscale permutation entropies have potential for on-line anesthesia EEG analysis given their simple computation and sensitivity to drug effect changes. CG-based multiscale permutation entropies may fail to describe the characteristics of EEG at high decomposition scales. PMID:27723803
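
    One common construction of Shannon permutation entropy with coarse-graining (the CG-based MSPE variant discussed above) is sketched below, using order-3 ordinal patterns; the "EEG" here is plain noise and the parameters are illustrative only.

    ```python
    # Normalized Shannon permutation entropy with coarse-graining decomposition.
    import numpy as np
    from collections import Counter
    from math import log, factorial

    def perm_entropy(x, order=3):
        """Normalized Shannon permutation entropy of a 1-D signal."""
        counts = Counter(tuple(np.argsort(x[i:i + order]))
                         for i in range(len(x) - order + 1))
        n = sum(counts.values())
        h = -sum((c / n) * log(c / n) for c in counts.values())
        return h / log(factorial(order))    # normalize to [0, 1]

    def coarse_grain(x, scale):
        """Non-overlapping averages of length `scale` (the CG decomposition)."""
        m = len(x) // scale
        return x[:m * scale].reshape(m, scale).mean(axis=1)

    rng = np.random.default_rng(5)
    eeg = rng.normal(size=4000)             # stand-in for an EEG epoch
    print([perm_entropy(coarse_grain(eeg, s)) for s in (1, 2, 5)])
    ```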

  10. Modified irinotecan and infusional 5-fluorouracil (mFOLFIRI) in patients with refractory advanced pancreas cancer (APC): a single-institution experience.

    PubMed

    Bupathi, M; Ahn, D H; Wu, C; Ciombor, K K; Stephens, J A; Reardon, J; Goldstein, D A; Bekaii-Saab, T

    2016-04-01

    Pancreatic adenocarcinoma is the fourth leading cause of cancer death. Recently, MM-398 (nanoliposomal irinotecan) was shown to be associated with significant improvement in outcome measures, with acceptable toxicities, when combined with 5-fluorouracil (5-FU)/leucovorin (LV) compared to 5-FU/LV alone in patients failing one line of gemcitabine-based therapy. There is a paucity of data evaluating the role of irinotecan in combination with 5-FU in advanced pancreas cancer (APC). We performed a retrospective analysis of all patients who received mFOLFIRI (FOLFIRI minus bolus 5-FU and LV). All patients with metastatic disease who had failed at least one line of gemcitabine-based therapy prior to receiving mFOLFIRI were included in this study. Descriptive statistics were used to assess the continuous variables and adverse events (AEs), and Kaplan-Meier methods were used to calculate the median progression-free survival (PFS) and overall survival (OS). Forty patients were included in this analysis. Patients had received 1-5 lines of prior therapy (25% with more than 3 lines of prior therapy). The mean age at diagnosis was 60 years, and 98% had an ECOG performance status of 1. The mean CA 19-9 at the start of therapy was 33,169 U/ml. The median PFS was 2.59 months [95% confidence interval (CI) 1.90-3.54], and OS was 4.75 months [95% CI 3.14-8.98]. The most common AEs included fatigue (98%), neuropathy (83%), anorexia (68%), nausea (60%), and constipation (55%). Grade 3 toxicities included fatigue (13%) and rash (3%). There were no observed grade 4 toxicities. In this single-institution retrospective analysis, mFOLFIRI was found to be both tolerable and relatively effective in a heavily pretreated patient population with APC. Future prospective studies should consider evaluating the role of mFOLFIRI in refractory APC.
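
    The survival summary used above is a standard Kaplan-Meier estimate from (time, event) pairs. A minimal sketch with the `lifelines` package follows; the durations and censoring flags are illustrative, not the study's data.

    ```python
    # Kaplan-Meier median progression-free survival from (time, event) pairs.
    from lifelines import KaplanMeierFitter

    months = [1.2, 2.0, 2.6, 3.1, 3.5, 4.0, 5.2, 6.0]   # time to progression
    observed = [1, 1, 1, 1, 0, 1, 1, 0]                  # 0 = censored
    kmf = KaplanMeierFitter().fit(months, event_observed=observed)
    print(kmf.median_survival_time_)
    ```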

  11. Identifying factors that predict the choice and success rate of radial artery catheterisation in contemporary real world cardiology practice: a sub-analysis of the PREVAIL study data.

    PubMed

    Pristipino, Christian; Roncella, Adriana; Trani, Carlo; Nazzaro, Marco S; Berni, Andrea; Di Sciascio, Germano; Sciahbasi, Alessandro; Musarò, Salvatore Donato; Mazzarotto, Pietro; Gioffrè, Gaetano; Speciale, Giulio

    2010-06-01

    To assess the reasons behind an operator choosing to perform radial artery catheterisation (RAC) rather than femoral arterial catheterisation, and to explore why RAC may fail in the real world, a pre-determined analysis of the PREVAIL study database was performed. Relevant data were collected in a prospective, observational survey of 1,052 consecutive patients undergoing invasive cardiovascular procedures at nine Italian hospitals over a one-month observation period. By multivariate analysis, the independent predictors of RAC choice were having the procedure performed (1) at a high procedural volume centre and (2) by an operator who performs a high volume of radial procedures; clinical variables played no statistically significant role. RAC failure was predicted independently by (1) a lower operator propensity to use RAC and (2) the presence of obstructive peripheral artery disease. A 10-fold lower rate of RAC failure was observed among operators who perform RAC for more than 85% of their personal caseload than among those who use RAC less than 25% of the time (3.8% vs. 33.0%, respectively); by receiver operating characteristic (ROC) analysis, no threshold value for operator RAC volume predicted RAC failure. A routine RAC strategy in all-comers is superior to a selective strategy in terms of feasibility and success rate.

  12. Determining the Publication Impact of a Digital Library

    NASA Technical Reports Server (NTRS)

    Kaplan, Nancy R.; Nelson, Michael L.

    2000-01-01

    We attempt to assess the publication impact of a digital library (DL) of aerospace scientific and technical information (STI). The Langley Technical Report Server (LTRS) is a digital library of over 1,400 electronic publications authored by NASA Langley Research Center personnel or contractors and has been available in its current World Wide Web (WWW) form since 1994. In this study, we examine calendar year 1997 usage statistics of LTRS and the Center for AeroSpace Information (CASI), a facility that archives and distributes hard copies of NASA and aerospace information. We also perform a citation analysis on some of the top publications distributed by LTRS. We find that although LTRS distributes over 71,000 copies of publications (compared with an estimated 24,000 copies from CASI), citation analysis indicates that LTRS has almost no measurable publication impact. We discuss the caveats of our investigation, speculate on possible different models of usage facilitated by DLs, and suggest retrieval analysis as a complementary metric to citation analysis. While our investigation failed to establish a relationship between LTRS and increased citations, and raises at least as many questions as it answers, we hope it will serve as an invitation to, and guide for, further research in the use of DLs.

  13. A Meta-Analysis of Reliability Coefficients in Second Language Research

    ERIC Educational Resources Information Center

    Plonsky, Luke; Derrick, Deirdre J.

    2016-01-01

    Ensuring internal validity in quantitative research requires, among other conditions, reliable instrumentation. Unfortunately, however, second language (L2) researchers often fail to report and even more often fail to interpret reliability estimates beyond generic benchmarks for acceptability. As a means to guide interpretations of such estimates,…

  14. An analysis of highway condemnation cases under the provisions of Senate Bill 724 : comparisons of jury and commission awards : final report.

    DOT National Transportation Integrated Search

    1997-01-01

    The Virginia Department of Transportation (VDOT) may legally condemn land for road improvements if all purchase negotiations with the landowner fail. If all further negotiations fail, just compensation for the landowner is decided in court by a five ...

  15. Patching compliance with full-time vs. part-time occlusion therapy.

    PubMed

    Kane, Jessica; Biernacki, Ron; Fraine, Lisa; Fukuda, Neva; Haskins, Kelsie; Morrison, David G

    2013-01-01

    Amblyopia is commonly treated with part-time occlusion (PTO) therapy. We have made two anecdotal observations regarding this therapy. First, children undergoing full-time occlusion seem to have better success and compliance rates. Second, a subset of children exists that fail PTO but can improve with more aggressive therapy. A retrospective review was performed in which treatment, visual outcome, and compliance scores were recorded. Compliance was graded on the percent adherence reported by the family. Patients scored “1” (no compliance), “2” (1–25% of prescribed treatment performed), “3” (26–50%), “4” (51–75%), or “5” (76–100%). Seventy-six children were enrolled in the study: forty-five were treated with part-time occlusion, twenty-two were treated with full-time occlusion (FTO), and nine had a history of failed PTO and were subsequently treated with FTO. Differences in visual outcomes between FTO and PTO were not statistically significant (P = 0.82). However, compliance rates with FTO were significantly better (P = 0.02). Of the nine patients that failed PTO, four improved an average of three lines with full-time occlusion, and five had no change with more aggressive patching. This study confirms previous reports of similar visual outcomes between PTO and FTO. However, compliance rates for FTO seem to be higher, and some children who have failed PTO may improve with FTO.

  16. A Statistical Project Control Tool for Engineering Managers

    NASA Technical Reports Server (NTRS)

    Bauch, Garland T.

    2001-01-01

    This slide presentation reviews the use of a Statistical Project Control Tool (SPCT) for managing engineering projects. A literature review pointed to a definition of project success (i.e., a project is successful when the cost, schedule, technical performance, and quality satisfy the customer), as well as to project success factors, traditional project control tools, and performance measures, which are detailed in the report. The essential problem is that, with resources becoming more limited and the number of projects increasing, project failure is increasing; existing methods are limited, and systematic methods are required. The objective of the work is to provide a new statistical project control tool for project managers. Graphs produced with the SPCT method, plotting the results of 3 successful projects and 3 failed projects, are reviewed, with success and failure defined by the owner.

  17. Results of celiac trunk stenting during fenestrated or branched aortic endografting.

    PubMed

    Wattez, Hélène; Martin-Gonzalez, Teresa; Lopez, Benjamin; Spear, Rafaëlle; Clough, Rachel E; Hertault, Adrien; Sobocinski, Jonathan; Haulon, Stéphan

    2016-12-01

    Endovascular repair of aortic aneurysms involving the visceral segment of the aorta often requires placement of a covered bridging stent in the celiac axis (CA). The median arcuate ligament (MAL) is a fibrous arch that unites the diaphragmatic crura on either side of the aortic hiatus. The ligament may compress and distort the celiac artery and result in difficult cannulation, or in stenosis and occlusion of the vessel. This study evaluated the influence of MAL compression on the technical success and the patency of the celiac artery after branched and fenestrated endovascular aortic repair. We retrospectively analyzed a cohort of consecutive patients treated electively for complex aneurysms with branched and fenestrated endovascular aortic repair between January 2007 and April 2014. All data were collected prospectively. Analysis of preoperative computed tomography angiography on a three-dimensional workstation determined the presence of MAL compression. Patency of the CA bridging stent was assessed during follow-up by computed tomography angiography and duplex ultrasound evaluation. Statistical analysis was performed to compare the outcomes of patients with MAL (MAL+) and without MAL (MAL-) compression. Of 315 patients treated for aortic disease involving the visceral segment during the study period, 113 had endografts designed with a branch (n = 57) or fenestration (n = 56) for the CA. In 45 patients (39.8%), asymptomatic compression of the CA by the MAL was depicted (MAL+). Complex endovascular techniques were required in this group to access the CA in 16 patients (14.2%) (vs. none in the MAL- group; P = .003), which led to failed bridging stent implantation in seven patients (6.2%). Increased operative time and dose area product were observed in the MAL+ group, but this did not reach statistical significance. In the MAL+ group, no thrombosis of the CA bridging stents was observed during follow-up; external compression of the CA bridging stent was depicted in six patients but without hemodynamic effect on duplex ultrasound imaging. In the MAL- group, one CA bridging stent occlusion occurred owing to an embolus from a cardiac source. MAL compression is associated with good celiac trunk bridging stent patency during follow-up, but with a higher rate of technical difficulties and failed bridging stent implantation during the procedure. Copyright © 2016 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.

  18. U.S. Food safety and Inspection Service testing for Salmonella in selected raw meat and poultry products in the United States, 1998 through 2003: an establishment-level analysis.

    PubMed

    Eblen, Denise R; Barlow, Kristina E; Naugle, Alecia Larew

    2006-11-01

    The U.S. Food Safety and Inspection Service (FSIS) pathogen reduction-hazard analysis critical control point systems final rule, published in 1996, established Salmonella performance standards for broiler chicken, cow and bull, market hog, and steer and heifer carcasses and for ground beef, chicken, and turkey meat. In 1998, the FSIS began testing to verify that establishments are meeting performance standards. Samples are collected in sets in which the number of samples is defined but varies according to product class. A sample set fails when the number of positive Salmonella samples exceeds the maximum number of positive samples allowed under the performance standard. Salmonella sample sets collected at 1,584 establishments from 1998 through 2003 were examined to identify factors associated with failure of one or more sets. Overall, 1,282 establishments (80.9%) never had a failed set. In establishments that did experience set failure(s), the failed sets were generally collected early in the establishment's testing history, with the exception of broiler establishments, where failures occurred both early and late in the course of testing. Small establishments were more likely to have experienced a set failure than were large or very small establishments, and broiler establishments were more likely to have failed than were ground beef, market hog, or steer-heifer establishments. Agency response to failed Salmonella sample sets, in the form of in-depth verification reviews and related establishment-initiated corrective actions, has likely contributed to declines in the number of establishments that failed sets. A focus on food safety measures in small establishments and broiler processing establishments should further reduce the number of sample sets that fail to meet the Salmonella performance standard.

  19. KPS/LDH index: a simple tool for identifying patients with metastatic melanoma who are unlikely to benefit from palliative whole brain radiotherapy.

    PubMed

    Partl, Richard; Fastner, Gerd; Kaiser, Julia; Kronhuber, Elisabeth; Cetin-Strohmer, Klaudia; Steffal, Claudia; Böhmer-Breitfelder, Barbara; Mayer, Johannes; Avian, Alexander; Berghold, Andrea

    2016-02-01

    Low Karnofsky performance status (KPS) and elevated lactate dehydrogenase (LDH), a surrogate marker for tumor load and cell turnover, may identify patients with a very short life expectancy. To validate this finding and compare it to other indices, namely the recursive partitioning analysis (RPA) and the diagnosis-specific graded prognostic assessment (DS-GPA), a multicenter analysis was undertaken. A retrospective analysis of 234 metastatic melanoma patients uniformly treated with palliative whole brain radiotherapy (WBRT) was done. Univariate and multivariate analyses were used to determine the impact of patient-, tumor-, and treatment-related parameters on overall survival (OS). KPS and LDH emerged as independent factors predicting OS. By combining KPS and LDH values (KPS/LDH index), groups of patients with statistically significant differences in median OS (days; 95% CI) after onset of WBRT were identified: group 1 (KPS ≥ 70/normal LDH) 234 (96-372), group 2 (KPS ≥ 70/elevated LDH) 112 (69-155), group 3 (KPS < 70/normal LDH) 43 (12-74), and group 4 (KPS < 70/elevated LDH) 29 (17-41). Statistically significant differences were observed between all four groups. The RPA and DS-GPA indices failed to distinguish significantly between good and moderate prognosis and were inferior in predicting a very unfavorable prognosis. The parameters KPS and LDH independently affected OS. The combination of both (the KPS/LDH index) identified patients with a very short life expectancy, who might be better served by recommending best supportive care instead of WBRT. The KPS/LDH index is simple and effective in terms of time and cost as compared to other prognostic indices.

  20. 5,10-Methylenetetrahydrofolate reductase polymorphisms and acute lymphoblastic leukemia risk: a meta-analysis.

    PubMed

    Pereira, Tiago Veiga; Rudnicki, Martina; Pereira, Alexandre Costa; Pombo-de-Oliveira, Maria S; Franco, Rendrik França

    2006-10-01

    There is evidence supporting a role for 5,10-methylenetetrahydrofolate reductase (MTHFR) gene variants in acute lymphoblastic leukemia (ALL). To provide a more robust estimate of the effect of MTHFR polymorphisms on the risk of ALL, we performed a meta-analysis to reevaluate the association between the two most commonly studied MTHFR polymorphisms (C677T and A1298C) and ALL risk. All case-control studies investigating an association between the C677T or A1298C polymorphisms and risk of ALL were included. We applied both fixed-effects and random-effects models to combine odds ratios (OR) and 95% confidence intervals (95% CI). The Q statistic was used to evaluate homogeneity, and both the Egger and Begg-Mazumdar tests were used to assess publication bias. The meta-analysis of the C677T polymorphism and risk of childhood ALL included 13 studies with a total of 4,894 individuals. Under a fixed-effects model, the TT genotype was not associated with a statistically significant reduction in childhood ALL risk (TT versus CT + CC: OR, 0.88; 95% CI, 0.73-1.06; P = 0.18). However, individuals homozygous for the 677T allele exhibited a 2.2-fold decrease in risk of adult ALL (TT versus CT + CC: OR, 0.45; 95% CI, 0.26-0.77; P = 0.004). In both cases, no evidence of heterogeneity was observed. No association between the A1298C variant and susceptibility to either adult or childhood ALL was disclosed. Our findings support the proposal that the common C677T polymorphism in MTHFR contributes to the risk of adult ALL, but not to childhood ALL susceptibility.
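
    The fixed-effects pooling step behind combined ORs like these is inverse-variance weighting on the log-odds scale. A minimal sketch follows; the per-study odds ratios and standard errors are invented, not the studies in this meta-analysis.

    ```python
    # Inverse-variance fixed-effects pooling of odds ratios on the log scale.
    import numpy as np
    from scipy.stats import norm

    or_i = np.array([0.80, 0.95, 0.70, 1.10])     # per-study odds ratios
    se_i = np.array([0.20, 0.15, 0.25, 0.30])     # standard errors of log(OR)

    w = 1.0 / se_i**2                             # inverse-variance weights
    log_pooled = np.sum(w * np.log(or_i)) / np.sum(w)
    se_pooled = np.sqrt(1.0 / np.sum(w))
    ci = np.exp(log_pooled + np.array([-1, 1]) * norm.ppf(0.975) * se_pooled)
    print(np.exp(log_pooled), ci)                 # pooled OR and 95% CI
    ```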

  1. How reliable are gray matter disruptions in specific reading disability across multiple countries and languages? Insights from a large-scale voxel-based morphometry study.

    PubMed

    Jednoróg, Katarzyna; Marchewka, Artur; Altarelli, Irene; Monzalvo Lopez, Ana Karla; van Ermingen-Marbach, Muna; Grande, Marion; Grabowska, Anna; Heim, Stefan; Ramus, Franck

    2015-05-01

    The neural basis of specific reading disability (SRD) remains only partly understood. A dozen studies have used voxel-based morphometry (VBM) to investigate gray matter volume (GMV) differences between SRD and control children; however, recent meta-analyses suggest that few regions are consistent across studies. We used data collected across three countries (France, Poland, and Germany) with the aim both of increasing sample size (236 SRD and control children) to obtain a clearer picture of group differences, and of further assessing the consistency of the findings across languages. VBM analysis reveals a significant group difference in a single cluster in the left thalamus. Furthermore, we observe correlations between reading accuracy and GMV in the left supramarginal gyrus and in the left cerebellum, in controls only. Most strikingly, we fail to replicate all the group differences in GMV reported in previous studies, despite the superior statistical power. The main limitation of this study is the heterogeneity of the sample, drawn from different countries (i.e., speaking languages with varying orthographic transparencies) and selected based on different assessment batteries. Nevertheless, analyses within each country support the conclusions of the cross-linguistic analysis. Explanations for the discrepancy between the present and previous studies may include: (1) the limited suitability of VBM to reveal the subtle brain disruptions underlying SRD; (2) insufficient correction for multiple statistical tests and flexibility in data analysis; and (3) publication bias in favor of positive results. The study thus echoes widespread concerns about the risk of false-positive results inherent to small-scale VBM studies. © 2015 Wiley Periodicals, Inc.

  2. Analysis of reliability of professor recommendation letters based on concordance with self-introduction letter.

    PubMed

    Kim, Sang Hyun

    2013-12-01

    The purpose of this study was to examine the concordance between the categories of a professor recommendation letter checklist and characteristics of the self-introduction letter. Checklists of professor recommendation letters were analyzed and classified into cognitive, social, and affective domains. Simple correlation was performed to determine whether the characteristics of the checklists were concordant with those of the self-introduction letter. Differences in checklist ratings by pass or fail grade were analyzed by independent-sample t-test. Logistic regression analysis was performed to determine whether a pass or fail grade was influenced by ratings on the checklists. The Cronbach alpha value of the checklists was 0.854. Initiative, an affective-domain category, was ranked highest among the six checklist categories in the professor recommendation letters. By logistic regression analysis, a pass or fail grade was influenced by self-directed learning in the self-introduction letter (p < 0.05). Successful applicants received higher ratings than those who failed in every checklist category, particularly in problem-solving ability, communication skills, initiative, and morality (p < 0.05). There was a strong correlation between cognitive and affective characteristics in the professor recommendation letters and the sum of all characteristics in the self-introduction letter.

  3. The Effects of Vortioxetine on Cognitive Function in Patients with Major Depressive Disorder: A Meta-Analysis of Three Randomized Controlled Trials

    PubMed Central

    Harrison, J; Loft, H; Jacobson, W; Olsen, CK

    2016-01-01

    Background: Management of cognitive deficits in Major Depressive Disorder (MDD) remains an important unmet need. This meta-analysis evaluated the effects of vortioxetine on cognition in patients with MDD. Methods: Random effects meta-analysis was applied to three randomized, double-blind, placebo-controlled 8-week trials of vortioxetine (5–20 mg/day) in MDD, and separately to two duloxetine-referenced trials. The primary outcome measure was change in Digit Symbol Substitution Test (DSST) score. Standardized effect sizes (SES) versus placebo (Cohen’s d) were used as input. Path analysis was employed to determine the extent to which changes in DSST were mediated independently of a change in Montgomery-Åsberg Depression Rating Scale (MADRS) score. Meta-analysis was applied to MADRS-adjusted and -unadjusted SES values. Changes on additional cognitive tests were evaluated (source studies only). Results: Before adjustment for MADRS, vortioxetine separated from placebo on DSST score (SES 0.25–0.48; nominal p < 0.05) in all individual trials, and statistically improved DSST performance versus placebo in meta-analyses of the three trials (SES = 0.35; p < 0.0001) and two duloxetine-referenced trials (SES = 0.26; p = 0.001). After adjustment for MADRS, vortioxetine maintained DSST improvement in one individual trial (p = 0.001) and separation from placebo was maintained in meta-analyses of all three trials (SES = 0.24; p < 0.0001) and both duloxetine-referenced trials (SES = 0.19; p = 0.01). Change in DSST with duloxetine failed to separate from placebo in individual trials and both meta-analyses. Change in DSST statistically favored vortioxetine versus duloxetine after MADRS adjustment (SES = 0.16; p = 0.04). Conclusions: Vortioxetine, but not duloxetine, significantly improved cognition, independent of depressive symptoms. Vortioxetine represents an important treatment for MDD-related cognitive dysfunction. PMID:27312740

  4. Challenges Associated with Estimating Utility in Wet Age-Related Macular Degeneration: A Novel Regression Analysis to Capture the Bilateral Nature of the Disease.

    PubMed

    Hodgson, Robert; Reason, Timothy; Trueman, David; Wickstead, Rose; Kusel, Jeanette; Jasilek, Adam; Claxton, Lindsay; Taylor, Matthew; Pulikottil-Jacob, Ruth

    2017-10-01

    The estimation of utility values for the economic evaluation of therapies for wet age-related macular degeneration (AMD) is a particular challenge. Previous economic models in wet AMD have been criticized for failing to capture the bilateral nature of wet AMD by modelling visual acuity (VA) and utility values associated with the better-seeing eye only. Here we present a de novo regression analysis using generalized estimating equations (GEE) applied to a previous dataset of time trade-off (TTO)-derived utility values from a sample of the UK population that wore contact lenses to simulate visual deterioration in wet AMD. This analysis allows utility values to be estimated as a function of VA in both the better-seeing eye (BSE) and worse-seeing eye (WSE). VAs in both the BSE and WSE were found to be statistically significant (p < 0.05) when regressed separately. When included without an interaction term, only the coefficient for VA in the BSE was significant (p = 0.04), but when an interaction term between VA in the BSE and WSE was included, only the constant term (mean TTO utility value) was significant, potentially a result of the collinearity between the VA of the two eyes. The lack of both formal model fit statistics from the GEE approach and theoretical knowledge to support the superiority of one model over another make it difficult to select the best model. Limitations of this analysis arise from the potential influence of collinearity between the VA of both eyes, and the use of contact lenses to reflect VA states to obtain the original dataset. Whilst further research is required to elicit more accurate utility values for wet AMD, this novel regression analysis provides a possible source of utility values to allow future economic models to capture the quality of life impact of changes in VA in both eyes. Novartis Pharmaceuticals UK Limited.
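
    A GEE regression of this general shape, with repeated utility observations clustered by participant, can be sketched with statsmodels. All variable names, the correlation structure, and the simulated data below are made up for illustration, not the study's dataset.

    ```python
    # GEE sketch: utility regressed on VA in both eyes, clustered by subject.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(6)
    n_subj, n_obs = 30, 4
    df = pd.DataFrame({
        "subject": np.repeat(np.arange(n_subj), n_obs),
        "va_bse": rng.uniform(0.0, 1.0, n_subj * n_obs),   # VA, better-seeing eye
        "va_wse": rng.uniform(0.2, 1.4, n_subj * n_obs),   # VA, worse-seeing eye
    })
    df["utility"] = (0.9 - 0.25 * df["va_bse"] - 0.05 * df["va_wse"]
                     + rng.normal(0, 0.05, len(df)))

    model = smf.gee("utility ~ va_bse + va_wse", groups="subject", data=df,
                    cov_struct=sm.cov_struct.Exchangeable(),
                    family=sm.families.Gaussian())
    print(model.fit().summary())
    ```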

  5. Differences in Looking at Own- and Other-Race Faces Are Subtle and Analysis-Dependent: An Account of Discrepant Reports.

    PubMed

    Arizpe, Joseph; Kravitz, Dwight J; Walsh, Vincent; Yovel, Galit; Baker, Chris I

    2016-01-01

    The Other-Race Effect (ORE) is the robust and well-established finding that people are generally poorer at facial recognition of individuals of another race than of their own race. Over the past four decades, much research has focused on the ORE because understanding this phenomenon is expected to elucidate fundamental face processing mechanisms and the influence of experience on such mechanisms. Several recent studies of the ORE in which the eye-movements of participants viewing own- and other-race faces were tracked have, however, reported highly conflicting results regarding the presence or absence of differential patterns of eye-movements to own- versus other-race faces. This discrepancy, of course, leads to conflicting theoretical interpretations of the perceptual basis for the ORE. Here we investigate fixation patterns to own- versus other-race (African and Chinese) faces for Caucasian participants using different analysis methods. While we detect statistically significant, though subtle, differences in fixation pattern using an Area of Interest (AOI) approach, we fail to detect significant differences when applying a spatial density map approach. Though there were no significant differences in the spatial density maps, the qualitative patterns matched the results from the AOI analyses reflecting how, in certain contexts, Area of Interest (AOI) analyses can be more sensitive in detecting the differential fixation patterns than spatial density analyses, due to spatial pooling of data with AOIs. AOI analyses, however, also come with the limitation of requiring a priori specification. These findings provide evidence that the conflicting reports in the prior literature may be at least partially accounted for by the differences in the statistical sensitivity associated with the different analysis methods employed across studies. Overall, our results suggest that detection of differences in eye-movement patterns can be analysis-dependent and rests on the assumptions inherent in the given analysis.

  7. Test and analysis of a stitched RFI graphite-epoxy panel with a fuel access door

    NASA Technical Reports Server (NTRS)

    Jegley, Dawn C.; Waters, W. Allen, Jr.

    1994-01-01

    A stitched RFI graphite-epoxy panel with a fuel access door was analyzed using a finite element analysis and loaded to failure in compression. The panel was initially 56 inches long and 36.75 inches wide, and the oval access door was 18 inches long and 15 inches wide. The panel was impact damaged with an impact energy of 100 ft-lb prior to compressive loading; however, no impact damage was detectable visually or by A-scan. The panel carried a failure load of 695,000 lb at a global failure strain of 0.00494 in/in. Analysis indicated the panel would fail due to collapse at a load of 688,100 lb. The test data indicate that the maximum strain in a region near the access door was 0.0096 in/in, and analysis indicates a local surface strain of 0.010 in/in at the panel's failure load. The panel did not fail through the impact damage, but instead failed through bolt holes for attachment of the access door in a region of high strain.

  8. The Love of Large Numbers: A Popularity Bias in Consumer Choice.

    PubMed

    Powell, Derek; Yu, Jingqi; DeWolf, Melissa; Holyoak, Keith J

    2017-10-01

    Social learning, the ability to learn from observing the decisions of other people and the outcomes of those decisions, is fundamental to human evolutionary and cultural success. The Internet now provides social evidence on an unprecedented scale. However, properly utilizing this evidence requires a capacity for statistical inference. We examined how people's interpretation of online review scores is influenced by the numbers of reviews, a potential indicator both of an item's popularity and of the precision of the average review score. Our task was designed to pit statistical information against social information. We modeled the behavior of an "intuitive statistician" using empirical prior information from millions of reviews posted on Amazon.com and then compared the model's predictions with the behavior of experimental participants. Under certain conditions, people preferred a product with more reviews to one with fewer reviews even though the statistical model indicated that the latter was likely to be of higher quality than the former. Overall, participants' judgments suggested that they failed to make meaningful statistical inferences.
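
    The intuition behind such an "intuitive statistician" can be illustrated with a simple shrinkage estimate: an item's mean score is pulled toward a prior mean, more strongly when there are few reviews. The prior mean and prior weight below are arbitrary stand-ins for the empirical Amazon prior the authors used.

    ```python
    # Toy shrinkage estimate of item quality from a review average and count.
    def shrunk_score(mean_score, n_reviews, prior_mean=4.2, m=20):
        """Posterior-style estimate: few reviews pull the score toward the prior."""
        return (m * prior_mean + n_reviews * mean_score) / (m + n_reviews)

    print(shrunk_score(5.0, 3))     # 3 perfect reviews: ~4.30
    print(shrunk_score(4.5, 300))   # 300 reviews at 4.5: ~4.48, likely the better bet
    ```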

  9. Controlling false-negative errors in microarray differential expression analysis: a PRIM approach.

    PubMed

    Cole, Steve W; Galic, Zoran; Zack, Jerome A

    2003-09-22

    Theoretical considerations suggest that current microarray screening algorithms may fail to detect many true differences in gene expression (Type II analytic errors). We assessed 'false negative' error rates in differential expression analyses by conventional linear statistical models (e.g. t-test), microarray-adapted variants (e.g. SAM, Cyber-T), and a novel strategy based on hold-out cross-validation. The latter approach employs the machine-learning algorithm Patient Rule Induction Method (PRIM) to infer minimum thresholds for reliable change in gene expression from Boolean conjunctions of fold-induction and raw fluorescence measurements. Monte Carlo analyses based on four empirical data sets show that conventional statistical models and their microarray-adapted variants overlook more than 50% of genes showing significant up-regulation. Conjoint PRIM prediction rules recover approximately twice as many differentially expressed transcripts while maintaining strong control over false-positive (Type I) errors. As a result, experimental replication rates increase and total analytic error rates decline. RT-PCR studies confirm that gene inductions detected by PRIM but overlooked by other methods represent true changes in mRNA levels. PRIM-based conjoint inference rules thus represent an improved strategy for high-sensitivity screening of DNA microarrays. Freestanding JAVA application at http://microarray.crump.ucla.edu/focus
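
    The conjunctive call described above (a Boolean conjunction of fold-induction and raw-fluorescence thresholds) is easy to sketch. The thresholds below are placeholders; in the paper, PRIM infers them by hold-out cross-validation rather than fixing them by hand.

    ```python
    # Sketch of a conjunctive differential-expression call: flag a transcript only
    # when fold-change AND raw fluorescence both clear their thresholds.
    import numpy as np

    rng = np.random.default_rng(7)
    raw = rng.lognormal(mean=6, sigma=1, size=1000)      # baseline fluorescence
    fold = rng.lognormal(mean=0, sigma=0.5, size=1000)   # treated / baseline ratio

    fold_thr, raw_thr = 1.5, 400.0                       # placeholder thresholds
    called = (fold >= fold_thr) & (raw >= raw_thr)
    print(called.sum(), "transcripts called up-regulated")
    ```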

  10. Incidence of post-operative pain after single visit and multiple visit root canal treatment: A randomized controlled trial

    PubMed Central

    Singh, Smita; Garg, Aniket

    2012-01-01

    Aim: To compare the incidence and intensity of post-obturation pain after single or multi visit root canal treatment on single rooted teeth in a randomized controlled trial. Materials and Methods: Two hundred patients requiring root canal treatment on permanent single rooted teeth (both vital and non vital) were included. The patients were assigned randomly into two groups of 100 patients each. The teeth in Group1 (n = 100) were obturated at the first visit, whilst those in Group 2 (n = 100) were obturated in a second visit 7 days later. A modified Heft Parker visual analog scale was used to measure pre-operative pain and post-obturation pain at 6, 12, 24 and 48 hours after obturation. Independent-sample T-tests was used for statistical analysis. Results: Twelve patients were excluded from the study as they failed to follow the scheduled revisit. Data were obtained from the remaining 188 patients. There was no statistically significant difference in the incidence and intensity of post-obturation pain experienced by two groups. Conclusions: The incidence and intensity of post-obturation pain experience following one- or two-visit root canal treatment on teeth with a single canal were not significantly different. PMID:23112477

  11. Late response to patient-reported outcome questionnaires after surgery was associated with worse outcome.

    PubMed

    Hutchings, Andrew; Grosse Frie, Kirstin; Neuburger, Jenny; van der Meulen, Jan; Black, Nick

    2013-02-01

    Nonresponse to patient-reported outcome (PRO) questionnaires after surgery might bias the results. Our aim was to gauge the potential impact of nonresponse bias by comparing the outcomes of early and late responders. This study compares 59,565 early and 20,735 late responders who underwent a hip or knee replacement, hernia repair, or varicose vein (VV) surgery. The association between timeliness of response and three outcomes (the mean postoperative disease-specific PRO and generic PRO scores and the proportion reporting a fair or poor result) was examined by regression analysis. Late responders were slightly more likely to be young, nonwhite, and deprived, and to have a more severe preoperative condition with poorer quality of life. Late response was associated with a slightly poorer outcome in all four procedures, although the association did not reach statistical significance (P < 0.05) for VV surgery. Adjusting for preoperative characteristics reduced the strength of the associations, but they remained statistically significant. As nonresponse to PRO questionnaires introduces slight bias, differences in response rates between hospitals should be taken into account when making comparisons, so as to avoid overestimating the performance of those with lower response rates and failing to detect poorly performing hospitals. Copyright © 2013 Elsevier Inc. All rights reserved.

  12. Regional instability following cervicothoracic junction surgery.

    PubMed

    Steinmetz, Michael P; Miller, Jared; Warbel, Ann; Krishnaney, Ajit A; Bingaman, William; Benzel, Edward C

    2006-04-01

    The cervicothoracic junction (CTJ) is the transitional region between the cervical and thoracic sections of the spinal axis. Because it is a transitional zone between the mobile lordotic cervical and rigid kyphotic thoracic spines, the CTJ is a region of potential instability. This potential for instability may be exaggerated by surgical intervention. A retrospective review of all patients who underwent surgery involving the CTJ in the Department of Neurosurgery at the Cleveland Clinic Foundation during a 5-year period was performed. The CTJ was strictly defined as encompassing the C-7 vertebra and C7-T1 disc interspace. Patients were examined after surgery to determine if treatment had failed. Failure was defined as construct failure, deformity (progression or de novo), or instability. Variables possibly associated with treatment failure were analyzed. Statistical comparisons were performed using the Fisher exact test. Between January 1998 and November 2003, 593 CTJ operations were performed. Treatment failed in 14 patients. Of all variables studied, failure was statistically associated with laminectomy and multilevel ventral corpectomies with fusion across the CTJ. Other factors statistically associated with treatment failure included histories of cervical surgery, tobacco use, and surgery for the correction of deformity. The CTJ is a vulnerable region, and this vulnerability is exacerbated by surgery. Results of the present study indicate that laminectomy across the CTJ should be supplemented with instrumentation (and fusion). Multilevel ventral corpectomies across the CTJ should also be supplemented with dorsal instrumentation. Supplemental instrumentation should be considered for patients who have undergone prior cervical surgery, have a history of tobacco use, or are undergoing surgery for deformity correction.
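
    The 2x2 comparisons reported here use Fisher's exact test, which is one call in scipy; the counts below are hypothetical (e.g., failures with versus without laminectomy across the CTJ), not the study's data.

    ```python
    # Fisher exact test on a hypothetical 2x2 failure table.
    from scipy.stats import fisher_exact

    table = [[6, 30],     # failures / non-failures with laminectomy (hypothetical)
             [8, 549]]    # failures / non-failures without (hypothetical)
    odds_ratio, p_value = fisher_exact(table)
    print(odds_ratio, p_value)
    ```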

  13. Identifying the Source of Misfit in Item Response Theory Models.

    PubMed

    Liu, Yang; Maydeu-Olivares, Alberto

    2014-01-01

    When an item response theory model fails to fit adequately, the items for which the model provides a good fit and those for which it does not must be determined. To this end, we compare the performance of several fit statistics for item pairs with known asymptotic distributions under maximum likelihood estimation of the item parameters: (a) a mean and variance adjustment to bivariate Pearson's X², (b) a bivariate subtable analog to Reiser's (1996) overall goodness-of-fit test, (c) a z statistic for the bivariate residual cross product, and (d) Maydeu-Olivares and Joe's (2006) M2 statistic applied to bivariate subtables. The unadjusted Pearson's X² with heuristically determined degrees of freedom is also included in the comparison. For binary and ordinal data, our simulation results suggest that the z statistic has the best Type I error and power behavior among all the statistics under investigation when the observed information matrix is used in its computation. However, if one has to use the cross-product information, the mean and variance adjusted X² is recommended. We illustrate the use of pairwise fit statistics in 2 real-data examples and discuss possible extensions of the current research in various directions.
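
    As a hedged illustration of the raw ingredient behind these pairwise statistics, the sketch below computes an unadjusted bivariate Pearson X² for one item pair from observed response-pattern counts and model-implied probabilities; both sets of numbers are invented, and (as the abstract notes) the raw statistic's reference distribution is not a simple chi-square, which motivates the adjusted versions.

    ```python
    # Bivariate Pearson X^2 for one pair of binary items:
    # sum over the four response patterns of (O - E)^2 / E.
    import numpy as np

    observed = np.array([[420, 180],    # counts of patterns (0,0), (0,1)
                         [160, 240]])   # counts of patterns (1,0), (1,1)
    n = observed.sum()
    # Model-implied pattern probabilities (assumed, from a fitted IRT model).
    expected_probs = np.array([[0.40, 0.19],
                               [0.17, 0.24]])
    expected = n * expected_probs
    x2 = ((observed - expected) ** 2 / expected).sum()
    print(f"bivariate Pearson X^2 = {x2:.2f}")
    ```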

  14. Changing viewer perspectives reveals constraints to implicit visual statistical learning.

    PubMed

    Jiang, Yuhong V; Swallow, Khena M

    2014-10-07

    Statistical learning, the learning of environmental regularities to guide behavior, likely plays an important role in natural human behavior. One potential use is in search for valuable items. Because visual statistical learning can be acquired quickly and without intention or awareness, it could optimize search and thereby conserve energy. For this to be true, however, visual statistical learning needs to be viewpoint invariant, facilitating search even when people walk around. To test whether implicit visual statistical learning of spatial information is viewpoint independent, we asked participants to perform a visual search task from variable locations around a monitor placed flat on a stand. Unbeknownst to participants, the target was more often in some locations than others. In contrast to previous research on stationary observers, visual statistical learning failed to produce a search advantage for targets in high-probability regions that were stable within the environment but variable relative to the viewer. This failure was observed even when conditions for spatial updating were optimized. However, learning was successful when the rich locations were referenced relative to the viewer. We conclude that changing viewer perspective disrupts implicit learning of the target's location probability. This form of learning shows limited integration with spatial updating or spatiotopic representations. © 2014 ARVO.

  15. 40 CFR 82.180 - Agency review of SNAP submissions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... until EPA has received data it judges adequate to support analysis of the submission. (4) Letter of... time the Agency perceives a lack of information necessary to reach a SNAP determination, it will... expires even if the Agency fails to reach a decision within the 90-day review period or fails to...

  16. Decision-Tree Analysis for Predicting First-Time Pass/Fail Rates for the NCLEX-RN® in Associate Degree Nursing Students.

    PubMed

    Chen, Hsiu-Chin; Bennett, Sean

    2016-08-01

    Little evidence shows the use of decision-tree algorithms in identifying predictors and analyzing their associations with pass rates for the NCLEX-RN® in associate degree nursing students. This longitudinal and retrospective cohort study investigated whether a decision-tree algorithm could be used to develop an accurate prediction model for the students' passing or failing the NCLEX-RN. This study used archived data from 453 associate degree nursing students in a selected program. The chi-squared automatic interaction detection (CHAID) analysis of the decision trees module was used to examine the effect of the collected predictors on passing/failing the NCLEX-RN. The actual percentage scores of Assessment Technologies Institute®'s RN Comprehensive Predictor® accurately identified students at risk of failing. The classification model correctly classified 92.7% of the students for passing. This study applied the decision-tree model to analyze a sequence database for developing a prediction model for early remediation in preparation for the NCLEX-RN. [J Nurs Educ. 2016;55(8):454-457.]. Copyright 2016, SLACK Incorporated.
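
    The study used CHAID, which scikit-learn does not implement; the sketch below substitutes a CART decision tree as a rough analogue, with invented predictors and simulated pass/fail labels rather than the program's archived data.

    ```python
    # Hypothetical decision-tree pass/fail classifier in the spirit of the
    # study's prediction model (CART standing in for CHAID).
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(1)
    n = 453
    predictor_score = rng.normal(70, 8, n)   # e.g., comprehensive predictor % (invented)
    gpa = rng.normal(3.2, 0.4, n)            # invented second predictor
    passed = (predictor_score + 5 * gpa + rng.normal(0, 6, n)) > 83

    X = np.column_stack([predictor_score, gpa])
    X_train, X_test, y_train, y_test = train_test_split(X, passed, random_state=0)
    tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
    print(f"classification accuracy: {tree.score(X_test, y_test):.3f}")
    ```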

  17. Aid and Advocacy: Why Community College Transfer Students Do Not Apply for Financial Aid and How Counselors Can Help Them Get in the Game

    ERIC Educational Resources Information Center

    Handel, Stephen J.

    2008-01-01

    According to the American Council on Education, more than 1 million community college students failed to receive financial aid for which they were likely eligible. This is a startling statistic. Given that financial aid is such a critical determinant of college-going and that academic counselors are among students' most important advocates,…

  18. Identification of different nutritional status groups in institutionalized elderly people by cluster analysis.

    PubMed

    López-Contreras, María José; López, Maria Ángeles; Canteras, Manuel; Candela, María Emilia; Zamora, Salvador; Pérez-Llamas, Francisca

    2014-03-01

    To apply a cluster analysis to groups of individuals of similar characteristics in an attempt to identify undernutrition or the risk of undernutrition in this population. A cross-sectional study. Seven public nursing homes in the province of Murcia, on the Mediterranean coast of Spain. 205 subjects aged 65 and older (131 women and 74 men). Measurements included dietary intake (energy and nutrients), anthropometric parameters (body mass index, skinfold thickness, mid-arm muscle circumference, mid-arm muscle area, corrected arm muscle area, waist-to-hip ratio), and biochemical and haematological parameters (serum albumin, transferrin, total cholesterol, total lymphocyte count). These variables were analyzed by cluster analysis. The results of the cluster analysis, including intake, anthropometric and analytical data, showed that, of the 205 elderly subjects, 66 (32.2%) were overweight/obese, 72 (35.1%) had an adequate nutritional status and 67 (32.7%) were undernourished or at risk of undernutrition. The undernourished or at-risk group showed the lowest values for dietary intake and the anthropometric and analytical parameters measured. Our study shows that cluster analysis is a useful statistical method for assessing the nutritional status of institutionalized elderly populations. In contrast, use of the specific reference values frequently described in the literature might fail to detect real cases of undernourishment or those at risk of undernutrition. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
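
    A minimal sketch of this kind of cluster analysis, assuming invented variables and simulated values: standardize the mixed dietary, anthropometric and biochemical measurements, then partition the sample into three groups, as in the study.

    ```python
    # K-means clustering of standardized nutritional variables into three
    # groups (overweight/obese, adequate, undernourished-or-at-risk).
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(2)
    n = 205
    X = np.column_stack([
        rng.normal(1800, 350, n),   # energy intake, kcal/day (invented)
        rng.normal(26, 4, n),       # body mass index, kg/m^2 (invented)
        rng.normal(3.9, 0.5, n),    # serum albumin, g/dL (invented)
    ])
    X_std = StandardScaler().fit_transform(X)
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_std)
    print(np.bincount(labels))  # sizes of the three nutritional-status groups
    ```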

  19. Magnetoencephalography and ictal SPECT in patients with failed epilepsy surgery.

    PubMed

    El Tahry, Riëm; Wang, Z Irene; Thandar, Aung; Podkorytova, Irina; Krishnan, Balu; Tousseyn, Simon; Guiyun, Wu; Burgess, Richard C; Alexopoulos, Andreas V

    2018-06-06

    Selected patients with intractable focal epilepsy who have failed a previous epilepsy surgery can become seizure-free with reoperation. Preoperative evaluation is exceedingly challenging in this cohort. We aim to investigate the diagnostic value of two noninvasive approaches, magnetoencephalography (MEG) and ictal single-photon emission computed tomography (SPECT), in patients with failed epilepsy surgery. We retrospectively included a consecutive cohort of patients who failed prior resective epilepsy surgery, underwent re-evaluation including MEG and ictal SPECT, and had another surgery after the re-evaluation. The relationship between resection and localization from each test was determined, and their association with seizure outcomes was analyzed. A total of 46 patients were included; 21 (46%) were seizure-free at 1-year follow-up after reoperation. Twenty-seven (58%) had a positive MEG and 31 (67%) had a positive ictal SPECT. The resection of MEG foci was significantly associated with seizure-free outcome (p = 0.002). Overlap of ictal SPECT hyperperfusion zones with resection was significantly associated with seizure-free outcome in the subgroup of patients with injection time ≤20 seconds (p = 0.03), but did not show a significant association in the overall cohort (p = 0.46), although all injections were ictal. Patients whose MEG and ictal SPECT were concordant at a sublobar level had a significantly higher chance of seizure freedom (p = 0.05). MEG alone achieved statistically significant localization in patients with failed epilepsy surgery. Only ictal SPECT with early injection (≤20 seconds) had good localization value. Sublobar concordance between both tests was significantly associated with seizure freedom. SPECT can provide essential information in MEG-negative cases and vice versa. Our results emphasize the importance of considering a multimodal presurgical evaluation including MEG and SPECT in all patients with a previous failed epilepsy surgery. Copyright © 2018 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.

  20. Missing data treatments matter: an analysis of multiple imputation for anterior cervical discectomy and fusion procedures.

    PubMed

    Ondeck, Nathaniel T; Fu, Michael C; Skrip, Laura A; McLynn, Ryan P; Cui, Jonathan J; Basques, Bryce A; Albert, Todd J; Grauer, Jonathan N

    2018-04-09

    The presence of missing data is a limitation of large datasets, including the National Surgical Quality Improvement Program (NSQIP). In addressing this issue, most studies use complete case analysis, which excludes cases with missing data, thus potentially introducing selection bias. Multiple imputation, a statistically rigorous approach that approximates missing data and preserves sample size, may be an improvement over complete case analysis. The present study aims to evaluate the impact of using multiple imputation in comparison with complete case analysis for assessing the associations between preoperative laboratory values and adverse outcomes following anterior cervical discectomy and fusion (ACDF) procedures. This is a retrospective review of prospectively collected data. Patients undergoing one-level ACDF were identified in NSQIP 2012-2015. Perioperative adverse outcome variables assessed included the occurrence of any adverse event, severe adverse events, and hospital readmission. Missing preoperative albumin and hematocrit values were handled using complete case analysis and multiple imputation. These preoperative laboratory levels were then tested for associations with 30-day postoperative outcomes using logistic regression. A total of 11,999 patients were included. Of this cohort, 63.5% of patients had missing preoperative albumin and 9.9% had missing preoperative hematocrit. When using complete case analysis, only 4,311 patients were studied. The removed patients were significantly younger, healthier, of a common body mass index, and male. Logistic regression analysis failed to identify either preoperative hypoalbuminemia or preoperative anemia as significantly associated with adverse outcomes. When employing multiple imputation, all 11,999 patients were included. Preoperative hypoalbuminemia was significantly associated with the occurrence of any adverse event and severe adverse events. Preoperative anemia was significantly associated with the occurrence of any adverse event, severe adverse events, and hospital readmission. Multiple imputation is a rigorous statistical procedure that is being increasingly used to address missing values in large datasets. Using this technique for ACDF avoided the loss of cases that may have affected the representativeness and power of the study and led to different results than complete case analysis. Multiple imputation should be considered for future spine studies. Copyright © 2018 Elsevier Inc. All rights reserved.
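
    A hedged sketch of multiple imputation with chained equations using statsmodels follows. The variable names echo the abstract (albumin, hematocrit), but the data are simulated, a continuous toy outcome stands in for the study's logistic-regression outcomes, and the missingness fractions merely mimic the reported 63.5% and 9.9%.

    ```python
    # Multiple imputation (MICE) versus dropping incomplete cases: fit the
    # analysis model across several imputed datasets and pool the results.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.imputation import mice

    rng = np.random.default_rng(3)
    n = 500
    df = pd.DataFrame({
        "albumin": rng.normal(4.0, 0.5, n),
        "hematocrit": rng.normal(41, 4, n),
    })
    df["outcome"] = (2 - 0.4 * df["albumin"] - 0.02 * df["hematocrit"]
                     + rng.normal(0, 0.5, n))
    # Introduce missingness similar in spirit to the NSQIP labs.
    df.loc[rng.random(n) < 0.63, "albumin"] = np.nan
    df.loc[rng.random(n) < 0.10, "hematocrit"] = np.nan

    imp = mice.MICEData(df)
    fit = mice.MICE("outcome ~ albumin + hematocrit", sm.OLS, imp).fit(3, 8)
    print(fit.summary())  # estimates pooled across 8 imputed datasets
    ```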

  1. Sleeve Gastrectomy: Correlation of Long-Term Results with Remnant Morphology and Eating Disorders.

    PubMed

    Tassinari, Daniele; Berta, Rossana D; Nannipieri, Monica; Giusti, Patrizia; Di Paolo, Luca; Guarino, Daniela; Anselmino, Marco

    2017-11-01

    Remnant dimension is considered one of the crucial elements determining the success of sleeve gastrectomy (SG), and dilation of the gastric fundus is often believed to be the main cause of failure. The primary aim of this study was to identify correlations between remnant morphology in the immediate post-operative stage, its dilation over the years, and the long-term results. The secondary aim was to correlate preoperative eating disorders, taste alteration, hunger perception, and early satiety with post-SG results. Remnant morphology was evaluated, in the immediate post-operative stage and over the years (≥2 years), through X-ray of the oesophagus-stomach-duodenum, calculating the surface in anteroposterior (AP) and right anterior oblique (RAO) projections. Presurgery diagnosis of eating disorders and their evaluation through the "Eating Disorder Inventory-3" (EDI3) during follow-up were performed. Changes in taste perception, sense of appetite, and early satiety were evaluated. Patients were divided into two groups: "failed SGs" (EWL < 50%) and "efficient SGs" (EWL > 50%). There were a total of 50 patients (37 F, 13 M), with mean age 52 years, preoperative weight 131 ± 21.8 kg, and BMI 47.4 ± 6.8 kg/m². Post-operative remnant mean dimensions overlapped between the two groups. On a long-term basis, an increase of 57.2 and 48.4% was documented in the AP and RAO areas, respectively. In failed SGs, dilation was significantly greater than in efficient SGs (AP area 70.2 vs 46.1%; RAO area 59.3 vs 39%; body width 102 vs 41.7%). Preoperative eating disorders were more common in efficient SGs than in failed SGs, with the exception of sweet eating. There were no significant changes in taste perception during follow-up. Fifty-two percent of efficient SGs vs 26% of failed SGs reported a persistent lack of sense of hunger; similarly, 92.5 vs 78% declared the persistence of a sense of early satiety. The two groups did not statistically differ on any of the EDI3 variables. On a long-term basis, the remnant mean dilation is around 50% compared with the immediate post-operative stage, but failed SGs showed larger remnant dilation than efficient SGs and, in percentage terms, the most dilated portion is the body of the stomach. Of all eating disorders, sweet eating seems to be weakly connected to SG failure.

  2. Impact of Chronic Condition Status and Severity on the Time to First Dental Visit for Newly Medicaid-Enrolled Children in Iowa

    PubMed Central

    Chi, Donald L; Momany, Elizabeth T; Neff, John; Jones, Michael P; Warren, John J; Slayton, Rebecca L; Weber-Gasparoni, Karin; Damiano, Peter C

    2011-01-01

    Objective To assess the extent to which chronic condition (CC) status and severity affected how soon children had a dental visit after enrolling in Medicaid. Data Source Enrollment and claims data (2003–2008) for newly Medicaid-enrolled children ages 3–14 in Iowa. Study Design 3M Clinical Risk Grouping methods were used to identify CC status (no/yes) and CC severity (less severe/more severe). Survival analysis was used to identify the factors associated with earlier first dental visits after initially enrolling in Medicaid. Principal Findings Children with a CC were 17 percent more likely to have earlier first dental visits after enrolling in Medicaid (p<.0001). There was no significant difference by CC severity. Children who lived in a dental health professional shortage area and those who did not utilize primary medical care had significantly later first Medicaid dental visits, whereas these factors failed to reach statistical significance for children with a CC. Conclusion While newly Medicaid-enrolled children with a CC were significantly more likely to have earlier first dental visits, we failed to detect a relationship between CC severity and the time to first Medicaid dental visit. The determinants of first Medicaid dental visits were heterogeneous across subgroups of newly Medicaid-enrolled children. Future studies should identify the sociobehavioral factors associated with CCs that are potential barriers to earlier first Medicaid dental visits for newly Medicaid-enrolled children. PMID:20849559

  3. Clinical outcome of head and neck cancer patients: a comparison between ENT patients referred via the 2 weeks wait pathway and alternative routes in the UK health system.

    PubMed

    Wong, B Y Winson; Fischer, S; Cruickshank, H E

    2017-01-01

    The 2-week wait (2ww) referral pathway was intended to improve cancer outcomes in the UK. However, a previous study found that 2ww referral failed to detect early-stage head and neck cancer. No current study examines the survival outcome of head and neck cancer patients diagnosed via the 2ww and non-2ww pathways. The aim of this study is to compare the outcomes of cancer patients diagnosed on these pathways. We performed a retrospective review of head and neck cancer patients diagnosed between 2009 and 2013 in the ENT Department at Mid-Yorkshire NHS Hospitals Trust. Gender, age, disease staging, treatment modalities, and route of referral, along with survival data, were documented. Survival analysis was performed for 2ww and non-2ww cancer patients. There were 4123 patients referred on the 2ww pathway during the study period. 147 patients were diagnosed with cancer on the 2ww pathway and 89 patients were diagnosed on non-2ww routes. There were no statistical differences in clinical staging (p = 0.416) or overall survival (p = 0.376) between 2ww and non-2ww patients. This study failed to demonstrate a better overall survival in head and neck cancer patients diagnosed on the 2ww pathway within the ENT cohort. The current referral system needs to be refined to improve the survival outcome in head and neck cancer patients.
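
    The survival comparison reported above can be sketched with the lifelines library. The durations below are simulated, not the study's data, and the 60-month censoring horizon is an assumption.

    ```python
    # Kaplan-Meier curves and a log-rank test for two diagnostic-pathway
    # groups; all durations are simulated.
    import numpy as np
    from lifelines import KaplanMeierFitter
    from lifelines.statistics import logrank_test

    rng = np.random.default_rng(4)
    t_2ww = rng.exponential(40, 147)   # months to event, 2ww group (simulated)
    t_non = rng.exponential(40, 89)    # months to event, non-2ww group (simulated)
    e_2ww, e_non = t_2ww < 60, t_non < 60                        # events within 60 mo
    t_2ww, t_non = np.minimum(t_2ww, 60), np.minimum(t_non, 60)  # censor at 60 mo

    km_2ww = KaplanMeierFitter().fit(t_2ww, e_2ww, label="2ww")
    km_non = KaplanMeierFitter().fit(t_non, e_non, label="non-2ww")
    print(km_2ww.median_survival_time_, km_non.median_survival_time_)
    result = logrank_test(t_2ww, t_non, event_observed_A=e_2ww, event_observed_B=e_non)
    print(f"log-rank p = {result.p_value:.3f}")   # cf. the reported p = 0.376
    ```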

  4. Influence of enamel preservation on failure rates of porcelain laminate veneers.

    PubMed

    Gurel, Galip; Sesma, Newton; Calamita, Marcelo A; Coachman, Christian; Morimoto, Susana

    2013-01-01

    The purpose of this study was to evaluate the failure rates of porcelain laminate veneers (PLVs) and the influence of clinical parameters on these rates in a retrospective survey of up to 12 years. Five hundred eighty laminate veneers were bonded in 66 patients. The following parameters were analyzed: type of preparation (depth and margin), crown lengthening, presence of restoration, diastema, crowding, discoloration, abrasion, and attrition. Survival was analyzed using the Kaplan-Meier method. Cox regression modeling was used to determine which factors would predict PLV failure. Forty-two veneers (7.2%) failed in 23 patients, and an overall cumulative survival rate of 86% was observed. A statistically significant association was noted between failure and the limits of the prepared tooth surface (margin and depth). The most frequent failure type was fracture (n = 20). The results revealed no significant influence of crown lengthening apically, presence of restoration, diastema, discoloration, abrasion, or attrition on failure rates. Multivariable analysis (Cox regression model) also showed that PLVs bonded to dentin and teeth with preparation margins in dentin were approximately 10 times more likely to fail than PLVs bonded to enamel. Moreover, coronal crown lengthening increased the risk of PLV failure by 2.3 times. A survival rate of 99% was observed for veneers with preparations confined to enamel and 94% for veneers with enamel only at the margins. Laminate veneers have high survival rates when bonded to enamel and provide a safe and predictable treatment option that preserves tooth structure.
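
    A minimal Cox regression sketch mirroring the veneer-failure analysis follows; the covariates are invented stand-ins for the study's clinical parameters, the failure times are simulated, and the 12-year censoring horizon is an assumption.

    ```python
    # Cox proportional-hazards model for time to veneer failure, with
    # hazard ratios analogous to the ~10x (dentin margin) and 2.3x
    # (crown lengthening) risks reported above.
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(5)
    n = 580
    df = pd.DataFrame({
        "margin_in_dentin": rng.integers(0, 2, n),    # invented covariate
        "crown_lengthening": rng.integers(0, 2, n),   # invented covariate
    })
    # Simulated years to failure; dentin margins fail sooner (assumed effect).
    baseline = rng.exponential(40, n)
    df["years"] = baseline / (1 + 9 * df["margin_in_dentin"]
                              + 1.3 * df["crown_lengthening"])
    df["failed"] = df["years"] < 12
    df["years"] = df["years"].clip(upper=12)  # administrative censoring at 12 y

    cph = CoxPHFitter().fit(df, duration_col="years", event_col="failed")
    print(cph.hazard_ratios_)
    ```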

  5. Rocket launcher: A novel reduction technique for posterior hip dislocations and review of current literature.

    PubMed

    Dan, Michael; Phillips, Alfred; Simonian, Marcus; Flannagan, Scott

    2015-06-01

    We provide a review of the literature on reduction techniques for posterior hip dislocations and present our experience with a novel technique for the reduction of acute posterior hip dislocations in the ED, the 'rocket launcher' technique. We present our results with six patients with prosthetic posterior hip dislocation treated in our rural ED, and we recorded patient demographics. The technique involves placing the patient's knee over the physician's shoulder and holding the lower leg like a 'rocket launcher', allowing the physician's shoulder to work as a fulcrum in an ergonomically friendly manner for the reducer. We used Fisher's exact test for cohort analysis between reduction techniques. The mean patient age was 74 years (range 66 to 85 years). We had an 83% success rate. The one patient in whom the 'rocket launcher' failed was a hemi-arthroplasty patient in whom all other closed techniques also failed; open reduction was required. When compared with the Allis (62% success rate), Whistler (60% success rate) and Captain Morgan (92% success rate) techniques, there was no statistically significant difference in the success of the reduction techniques. There were no neurovascular or periprosthetic complications. We have described a reduction technique for posterior hip dislocations in which placing the patient's knee over the shoulder and holding the lower leg like a 'rocket launcher' allows the physician's shoulder to work as a fulcrum, making the manoeuvre mechanically and ergonomically superior to standard techniques. © 2015 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.

  6. Developing a shale heterogeneity index to predict fracture response in the Mancos Shale

    NASA Astrophysics Data System (ADS)

    DeReuil, Aubry; Birgenheier, Lauren; McLennan, John

    2017-04-01

    The interplay between sedimentary heterogeneity and fracture propagation in mudstone is crucial to assess the potential of low permeability rocks as unconventional reservoirs. Previous experimental research has demonstrated a relationship between heterogeneity and fracture of brittle rocks, as discontinuities in a rock mass influence micromechanical processes such as microcracking and strain localization, which evolve into macroscopic fractures. Though numerous studies have observed heterogeneity influencing fracture development, fundamental understanding of the entire fracture process and the physical controls on this process is still lacking. This is partly due to difficulties in quantifying heterogeneity in fine-grained rocks. Our study tests the hypothesis that there is a correlation between sedimentary heterogeneity and the manner in which mudstone is fractured. An extensive range of heterogeneity related to complex sedimentology is represented by various samples from cored intervals of the Mancos Shale. Samples were categorized via facies analysis consisting of visual core description, XRF and XRD analysis, SEM and thin section microscopy, and reservoir quality analysis that tested porosity, permeability, water saturation, and TOC. Systematic indirect tensile testing on a broad variety of facies has been performed, and uniaxial and triaxial compression testing is underway. The ultimate result is a novel tool, referred to as the shale heterogeneity index, based on analytically derived and statistically supported relationships between sedimentary geologic and geomechanical heterogeneity. Preliminary conclusions from development of the shale heterogeneity index reveal that samples with compositionally distinct bedding withstand loading at higher stress values, while texturally and compositionally homogeneous, bedded samples fail at lower stress values. The highest tensile strength results from cemented Ca-enriched samples; medium- to high-strength samples have approximately equivalent proportions of Al-Ca-Si compositions, while Al-rich samples have consistently low strength. Moisture-preserved samples fail, on average, at stresses approximately 5 MPa lower than dry samples of similar facies. Additionally, moisture-preserved samples fail in a step-like pattern when tested perpendicular to bedding. Tensile fractures are halted at heterogeneities and propagate parallel to bedding planes before developing a through-going failure plane, as opposed to the discrete, continuous fractures that crosscut dry samples. This result suggests that sedimentary heterogeneity plays a greater role in fracture propagation in moisture-preserved samples, which are more indicative of in-situ reservoir conditions. Stress-strain curves will be further analyzed, including estimation of an energy-released term based on post-failure response, and an estimate of the volume of cracking measured on the physical fracture surface.

  7. Why conventional detection methods fail in identifying the existence of contamination events.

    PubMed

    Liu, Shuming; Li, Ruonan; Smith, Kate; Che, Han

    2016-04-15

    Early warning systems are widely used to safeguard water security, but their effectiveness has raised many questions. To understand why conventional detection methods fail to identify contamination events, this study evaluates the performance of three contamination detection methods using data from a real contamination accident and two artificial datasets constructed using a widely applied contamination data construction approach. Results show that the Pearson correlation Euclidean distance (PE) based detection method performs better for real contamination incidents, while the Euclidean distance method (MED) and linear prediction filter (LPF) method are more suitable for detecting sudden spike-like variation. This analysis revealed why the conventional MED and LPF methods failed to identify the existence of contamination events. It also revealed that the widely used contamination data construction approach is misleading. Copyright © 2016 Elsevier Ltd. All rights reserved.
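
    A hedged sketch of an MED-style detector follows: flag a time step when the multivariate water-quality reading lies far from a moving baseline in (scaled) Euclidean distance. The signals, window length, and threshold are all invented for illustration, not taken from the paper.

    ```python
    # Euclidean-distance anomaly detection on simulated water-quality data
    # with an injected spike-like contamination event at t = 300.
    import numpy as np

    rng = np.random.default_rng(6)
    T = 500
    signal = np.column_stack([
        rng.normal(0.8, 0.02, T),   # chlorine residual, mg/L (invented)
        rng.normal(7.2, 0.05, T),   # pH (invented)
    ])
    signal[300:320] += [0.15, -0.4]  # injected contamination event

    window, threshold = 48, 4.0      # assumed baseline window and alarm level
    alarms = []
    for t in range(window, T):
        baseline = signal[t - window:t].mean(axis=0)
        scale = signal[t - window:t].std(axis=0)
        distance = np.linalg.norm((signal[t] - baseline) / scale)
        if distance > threshold:
            alarms.append(t)
    print(alarms[:5])  # first alarmed time steps, expected near t = 300
    ```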

  8. [Arthrodesis following revision of a knee endoprosthesis. Literature review 1984-1994].

    PubMed

    Kohn, D; Schmolke, S

    1996-04-01

    Two percent of primary and 8% of revision total knee replacements are followed by arthrodesis. Today knee arthrodesis is the most important salvage procedure after failed total knee arthroplasty, resection arthroplasty and above-the-knee amputation being the only alternatives. Analysis of the literature between 1984 and 1994 revealed 533 cases treated with arthrodesis of the knee; 403 were done after failed total knee arthroplasty. The fusion rate was 74%. External fixation, intramedullary nail, plates and combinations of these are currently used for fixation. The literature and an analysis of our own patients from 1988 to 1994 showed that arthrodesis after failed arthroplasty is a difficult procedure, and complications often occur. Bone loss of the distal femur and proximal tibia is the one most important prognostic factor. A new classification system for bone loss is presented.

  9. Role of oral antibiotics in treatment of breastfeeding women with chronic breast pain who fail conservative therapy.

    PubMed

    Witt, Ann M; Burgess, Kelly; Hawn, Thomas R; Zyzanski, Steven

    2014-03-01

    Although breast pain remains a common cause of weaning, controversy exists regarding the etiology of chronic pain. Prospective studies are needed to define optimal treatment regimens. We evaluated patient history, exam, and bacterial cultures in breastfeeding women with chronic breast pain. We compared pain resolution and breastfeeding complications in patients responding to conservative therapy (CTX) (n=38) versus those in patients failing CTX and receiving oral antibiotic treatment (OTX) (n=48). We prospectively enrolled 86 breastfeeding women with breast pain lasting greater than 1 week and followed up patients through 12 weeks. Higher initial breast (p=0.012) and nipple pain severity (p=0.004), less response to latch correction (p=0.015) at baseline visit, and breastmilk Staphylococcus aureus growth (p=0.001) were associated with failing CTX. Pain type was not associated with failure of CTX. When culture results were available at 5 days, breast pain remained higher (p<0.001) in patients failing CTX and starting antibiotics. OTX patients then had more rapid breast pain reduction between 5 and 14 days (score of 3.1 vs. 1.3; p<0.001). By 4 weeks there was no difference (1.8/10 vs. 1.4/10; p=0.088) in breast pain level between groups. Median length of OTX was 14 days. At 12 weeks, weaning frequency (17% vs. 8%; p=0.331) was not statistically different. Initial pain severity and limited improvement to latch correction predicts failure of CTX. S. aureus growth is more common in women failing CTX. For those women not responding to CTX, OTX matched to breastmilk culture may significantly decrease their pain and is not associated with increased complications.

  10. Collagen Type IV and Laminin Expressions during Cartilage Repair and in Late Clinically Failed Repair Tissues from Human Subjects

    PubMed Central

    Foldager, Casper Bindzus; Toh, Wei Seong; Christensen, Bjørn Borsøe; Lind, Martin; Gomoll, Andreas H.; Spector, Myron

    2016-01-01

    Objective To identify the collagen type IV (Col4) isoform in articular cartilage and to evaluate the expression of Col4 and laminin in the pericellular matrix (PCM) in damaged cartilage and during cartilage repair. Design The Col4 isoform was determined in chondrocytes isolated from 6 patients, cultured up to 6 days in 21% O2 or 1% O2, and the gene expression of Col4 α-chains was investigated. The distribution of Col4 and laminin in traumatically damaged cartilage (n = 7) and clinically failed cartilage repair (microfracture, TruFit, autologous chondrocyte implantation; n = 11) was investigated using immunohistochemistry. Normal human cartilage was used as control (n = 8). The distribution during clinical cartilage repair procedures was investigated in a minipig model with 6-month follow-up (untreated chondral, untreated osteochondral, microfracture, autologous chondrocyte implantation; n = 10). Results The Col4 isoform in articular cartilage was characterized as α1α1α2, an isoform containing antiangiogenic domains in the NC1 terminals (arresten and canstatin). In normal cartilage, laminin and Col4 were exclusively found in the PCM. High amounts (>50%) of Col4 in the PCM significantly decreased in damaged cartilage (P = 0.004) and clinically failed repair tissue (P < 0.001). Laminin was only found with high expression (>50%) in 4/8 of the normal samples, which was not statistically significantly different from damaged cartilage (P = 0.15) or failed cartilage repair (P = 0.054). Conclusions Col4 in cartilage contains antiangiogenic domains and may play a role in the hypoxic environment of articular cartilage. Col4 and laminin were not found in the PCM of damaged and clinically failed repair tissue. PMID:26958317

  11. Analysis of model development strategies: predicting ventral hernia recurrence.

    PubMed

    Holihan, Julie L; Li, Linda T; Askenasy, Erik P; Greenberg, Jacob A; Keith, Jerrod N; Martindale, Robert G; Roth, J Scott; Liang, Mike K

    2016-11-01

    There have been many attempts to identify variables associated with ventral hernia recurrence; however, it is unclear which statistical modeling approach results in models with greatest internal and external validity. We aim to assess the predictive accuracy of models developed using five common variable selection strategies to determine variables associated with hernia recurrence. Two multicenter ventral hernia databases were used. Database 1 was randomly split into "development" and "internal validation" cohorts. Database 2 was designated "external validation". The dependent variable for model development was hernia recurrence. Five variable selection strategies were used: (1) "clinical"-variables considered clinically relevant, (2) "selective stepwise"-all variables with a P value <0.20 were assessed in a step-backward model, (3) "liberal stepwise"-all variables were included and step-backward regression was performed, (4) "restrictive internal resampling," and (5) "liberal internal resampling." Variables were included with P < 0.05 for the Restrictive model and P < 0.10 for the Liberal model. A time-to-event analysis using Cox regression was performed using these strategies. The predictive accuracy of the developed models was tested on the internal and external validation cohorts using Harrell's C-statistic where C > 0.70 was considered "reasonable". The recurrence rate was 32.9% (n = 173/526; median/range follow-up, 20/1-58 mo) for the development cohort, 36.0% (n = 95/264, median/range follow-up 20/1-61 mo) for the internal validation cohort, and 12.7% (n = 155/1224, median/range follow-up 9/1-50 mo) for the external validation cohort. Internal validation demonstrated reasonable predictive accuracy (C-statistics = 0.772, 0.760, 0.767, 0.757, 0.763), while on external validation, predictive accuracy dipped precipitously (C-statistic = 0.561, 0.557, 0.562, 0.553, 0.560). Predictive accuracy was equally adequate on internal validation among models; however, on external validation, all five models failed to demonstrate utility. Future studies should report multiple variable selection techniques and demonstrate predictive accuracy on external data sets for model validation. Copyright © 2016 Elsevier Inc. All rights reserved.
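
    The validation step described above can be sketched with lifelines: fit a Cox model on a development cohort, then compute Harrell's C-statistic on internal and external data. The cohorts, covariate, and effect sizes below are simulated; the weaker external effect is an assumption chosen to reproduce the reported drop in external C.

    ```python
    # Internal vs. external discrimination of a Cox model, measured with
    # Harrell's concordance index (C > 0.70 considered "reasonable" above).
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter
    from lifelines.utils import concordance_index

    rng = np.random.default_rng(7)

    def cohort(n, effect):
        x = rng.integers(0, 2, n)                  # invented binary predictor
        t = rng.exponential(30, n) / (1 + effect * x)
        return pd.DataFrame({"x": x, "months": t.clip(max=60), "recur": t < 60})

    development = cohort(526, 1.0)
    external = cohort(1224, 0.2)   # assumed weaker effect => lower external C

    cph = CoxPHFitter().fit(development, "months", "recur")
    for name, df in [("internal", development), ("external", external)]:
        score = -np.ravel(cph.predict_partial_hazard(df))  # higher = longer survival
        print(name, round(concordance_index(df["months"], score, df["recur"]), 3))
    ```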

  12. The Objective Borderline method (OBM): a probability-based model for setting up an objective pass/fail cut-off score in medical programme assessments.

    PubMed

    Shulruf, Boaz; Turner, Rolf; Poole, Phillippa; Wilkinson, Tim

    2013-05-01

    The decision to pass or fail a medical student is a 'high stakes' one. The aim of this study is to introduce and demonstrate the feasibility and practicality of a new objective standard-setting method for determining the pass/fail cut-off score from borderline grades. Three methods for setting up pass/fail cut-off scores were compared: the Regression Method, the Borderline Group Method, and the new Objective Borderline Method (OBM). Using Year 5 students' OSCE results from one medical school we established the pass/fail cut-off scores by the abovementioned three methods. The comparison indicated that the pass/fail cut-off scores generated by the OBM were similar to those generated by the more established methods (0.840 ≤ r ≤ 0.998; p < .0001). Based on theoretical and empirical analysis, we suggest that the OBM has advantages over existing methods in that it combines objectivity, realism, robust empirical basis and, no less importantly, is simple to use.
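
    This record names the OBM but does not specify its algorithm, so the sketch below instead implements the comparator it is benchmarked against, the Borderline Group Method: the cut-off is the mean checklist score of candidates whose global grade was "borderline". All grades and scores are invented.

    ```python
    # Borderline Group Method: derive an OSCE pass/fail cut-off from the
    # checklist scores of examiner-rated "borderline" candidates.
    import numpy as np

    rng = np.random.default_rng(8)
    grades = rng.choice(["fail", "borderline", "pass"], size=200,
                        p=[0.15, 0.20, 0.65])          # invented global grades
    means = {"fail": 45, "borderline": 58, "pass": 72}  # invented score levels
    station_scores = np.array([rng.normal(means[g], 6) for g in grades])

    cutoff = station_scores[grades == "borderline"].mean()
    print(f"pass/fail cut-off = {cutoff:.1f}")
    ```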

  13. Dancing in a Minefield: An Analysis of Turnaround Specialists in Arizona Schools

    ERIC Educational Resources Information Center

    McMillie, Kyann L.

    2010-01-01

    In 2008, educational leaders from the Arizona Department of Education (ADE) assigned a group of turnaround specialists to work in four failing public schools in a large, urban school district in Phoenix, Arizona in hopes of improving those schools. The utilization of turnaround specialists in failing schools was Arizona's method of enacting…

  14. An Unfulfilled Dream of an Urban Community School for Girls: A Failed Experiment in Educational Reform

    ERIC Educational Resources Information Center

    Doyle, Kerri

    2013-01-01

    This research presents the qualitative case study of an urban community school initiative that began as an educational reform effort and that ultimately failed. The process of emergence for this school and factors leading to its collapse are described through participant interviews and document analysis. Nationally, policy reformers,…

  15. Knee Contact Force Asymmetries in Patients Who Failed Return-to-Sport Readiness Criteria 6 Months After Anterior Cruciate Ligament Reconstruction

    PubMed Central

    Gardinier, Emily S.; Di Stasi, Stephanie; Manal, Kurt; Buchanan, Thomas S.; Snyder-Mackler, Lynn

    2015-01-01

    Background After anterior cruciate ligament (ACL) injury, contact forces are decreased in the injured knee when compared with the uninjured knee. The persistence of contact force asymmetries after ACL reconstruction may increase the risk of reinjury and may play an important role in the development of knee osteoarthritis in these patients. Functional performance may also be useful in identifying patients who demonstrate potentially harmful joint contact force asymmetries after ACL reconstruction. Hypothesis Knee joint contact force asymmetries would be present during gait after ACL reconstruction, and performance on a specific set of validated return-to-sport (RTS) readiness criteria would discriminate between those who demonstrated contact force asymmetries and those who did not. Study Design Descriptive laboratory study. Methods A total of 29 patients with ACL ruptures participated in gait analysis and RTS readiness testing 6 months after reconstruction. Muscle and joint contact forces were estimated using an electromyography (EMG)–driven musculoskeletal model of the knee. The magnitude of typical limb asymmetry in uninjured controls was used to define limits of meaningful limb asymmetry in patients after ACL reconstruction. The RTS testing included isometric quadriceps strength testing, 4 unilateral hop tests, and 2 self-report questionnaires. Paired t tests were used to assess limb symmetry for peak medial and tibiofemoral contact forces in all patients, and a mixed-design analysis of variance was used to analyze the effect of passing or failing RTS testing on contact force asymmetry. Results Among all patients, neither statistically significant nor meaningful contact force asymmetries were identified. However, patients who failed RTS testing exhibited meaningful contact force asymmetries, with tibiofemoral contact force being significantly lower for the involved knee. Conversely, patients who passed RTS testing exhibited neither significant nor meaningful contact force asymmetries. Conclusion Joint contact force asymmetries during gait are present in some patients 6 months after ACL reconstruction. Patients who demonstrated poor functional performance on RTS readiness testing exhibited significant and meaningful contact force asymmetries. Clinical Relevance When assessing all patients together, variability in the functional status obscured significant and meaningful differences in contact force asymmetry in patients 6 months after ACL reconstruction. These specific RTS readiness criteria appear to differentiate between those who demonstrate joint contact force symmetry after ACL reconstruction and those who do not. PMID:25318940

  16. Compared efficacy of preservation solutions on the outcome of liver transplantation: Meta-analysis.

    PubMed

    Szilágyi, Ágnes Lilla; Mátrai, Péter; Hegyi, Péter; Tuboly, Eszter; Pécz, Daniella; Garami, András; Solymár, Margit; Pétervári, Erika; Balaskó, Márta; Veres, Gábor; Czopf, László; Wobbe, Bastian; Szabó, Dorottya; Wagner, Juliane; Hartmann, Petra

    2018-04-28

    To compare the effects of the four most commonly used preservation solutions on the outcome of liver transplantations. A systematic literature search was performed using the MEDLINE, Scopus, EMBASE and Cochrane Library databases up to January 31st, 2017. The inclusion criteria were comparative, randomized controlled trials (RCTs) of deceased donor liver (DDL) allografts with adult and pediatric donors using the gold-standard University of Wisconsin (UW) solution or the histidine-tryptophan-ketoglutarate (HTK), Celsior (CS) and Institut Georges Lopez (IGL-1) solutions. Fifteen RCTs (1830 livers) were included; the primary outcomes were primary non-function (PNF) and one-year post-transplant graft survival (OGS-1). All trials were homogeneous with respect to donor and recipient characteristics. There was no statistical difference in the incidence of PNF with the use of UW, HTK, CS and IGL-1 (RR = 0.02, 95%CI: 0.01-0.03, P = 0.356). Comparing OGS-1 also failed to reveal any difference between UW, HTK, CS and IGL-1 (RR = 0.80, 95%CI: 0.80-0.80, P = 0.369). Two trials demonstrated higher PNF levels for UW in comparison with the HTK group, and individual studies described higher rates of biliary complications where HTK and CS were used compared with the UW and IGL-1 solutions. However, the meta-analysis of the data did not show a statistically significant difference: the UW, CS, HTK and IGL-1 solutions were associated with nearly equivalent outcomes. Alternatives to UW yield the same degree of safety and effectiveness for the preservation of DDLs, but further well-designed clinical trials are warranted.

  17. Adolescent Sexual Health Communication and Condom Use: A Meta-Analysis

    PubMed Central

    Widman, Laura; Noar, Seth M.; Choukas-Bradley, Sophia; Francis, Diane

    2014-01-01

    Objective Condom use is critical for the health of sexually active adolescents, and yet many adolescents fail to use condoms consistently. One interpersonal factor that may be key to condom use is sexual communication between sexual partners; however, the association between communication and condom use has varied considerably in prior studies of youth. The purpose of this meta-analysis was to synthesize the growing body of research linking adolescents' sexual communication to condom use, and to examine several moderators of this association. Methods A total of 41 independent effect sizes from 34 studies with 15,046 adolescent participants (mean age = 16.8 years, age range = 12-23) were meta-analyzed. Results Results revealed a weighted mean effect size of the sexual communication-condom use relationship of r = .24, which was statistically heterogeneous (Q = 618.86, p < .001, I² = 93.54). Effect sizes did not differ significantly by gender, age, recruitment setting, country of study, or condom measurement timeframe; however, communication topic and communication format were statistically significant moderators (p < .001). Larger effect sizes were found for communication about condom use (r = .34) than communication about sexual history (r = .15) or general safer sex topics (r = .14). Effect sizes were also larger for communication behavior formats (r = .27) and self-efficacy formats (r = .28) than for fear/concern (r = .18), future intention (r = .15), or communication comfort (r = -.15) formats. Conclusions Results highlight the urgency of emphasizing communication skills, particularly about condom use, in HIV/STI prevention work for youth. Implications for the future study of sexual communication are discussed. PMID:25133828
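
    The pooled r, Q, and I² statistics named above can be sketched via the Fisher z transformation. The five effect sizes below loosely echo the subgroup means reported in the abstract, and the sample sizes are invented; this is a fixed-effect illustration, not the meta-analysis itself.

    ```python
    # Inverse-variance pooling of correlations with heterogeneity statistics.
    import numpy as np

    r = np.array([0.34, 0.15, 0.14, 0.27, 0.28])   # per-study correlations
    n = np.array([800, 450, 620, 390, 510])        # per-study sample sizes (invented)

    z = np.arctanh(r)            # Fisher z transform
    w = n - 3                    # inverse-variance weights, since var(z) = 1/(n - 3)
    z_bar = (w * z).sum() / w.sum()
    r_bar = np.tanh(z_bar)       # pooled correlation

    Q = (w * (z - z_bar) ** 2).sum()
    df = len(r) - 1
    I2 = max(0.0, (Q - df) / Q) * 100
    print(f"pooled r = {r_bar:.2f}, Q = {Q:.1f}, I^2 = {I2:.1f}%")
    ```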

  18. On polarimetric radar signatures of deep convection for model evaluation: columns of specific differential phase observed during MC3E

    PubMed Central

    van Lier-Walqui, Marcus; Fridlind, Ann M.; Ackerman, Andrew S.; Collis, Scott; Helmus, Jonathan; MacGorman, Donald R.; North, Kirk; Kollias, Pavlos; Posselt, Derek J.

    2017-01-01

    The representation of deep convection in general circulation models is in part informed by cloud-resolving models (CRMs) that function at higher spatial and temporal resolution; however, recent studies have shown that CRMs often fail at capturing the details of deep convection updrafts. With the goal of providing constraint on CRM simulation of deep convection updrafts, ground-based remote-sensing observations are analyzed and statistically correlated for four deep convection events observed during the Midlatitude Continental Convective Clouds Experiment (MC3E). Since positive values of specific differential phase (KDP) observed above the melting level are associated with deep convection updraft cells, so-called “KDP columns” are analyzed using two scanning polarimetric radars in Oklahoma: the National Weather Service Vance WSR-88D (KVNX) and the Department of Energy C-band Scanning Atmospheric Radiation Measurement (ARM) Precipitation Radar (C-SAPR). KVNX and C-SAPR KDP volumes and columns are then statistically correlated with vertical winds retrieved via multi-Doppler wind analysis, lightning flash activity derived from the Oklahoma Lightning Mapping Array, and KVNX differential reflectivity (ZDR). Results indicate strong correlations of KDP volume above the melting level with updraft mass flux, lightning flash activity, and intense rainfall. Analysis of KDP columns reveals signatures of changing updraft properties from one storm event to another as well as during event evolution. Comparison of ZDR to KDP shows commonalities in information content of each, as well as potential problems with ZDR associated with observational artifacts. PMID:29503466

  19. Bootstrap Signal-to-Noise Confidence Intervals: An Objective Method for Subject Exclusion and Quality Control in ERP Studies

    PubMed Central

    Parks, Nathan A.; Gannon, Matthew A.; Long, Stephanie M.; Young, Madeleine E.

    2016-01-01

    Analysis of event-related potential (ERP) data includes several steps to ensure that ERPs meet an appropriate level of signal quality. One such step, subject exclusion, rejects subject data if ERP waveforms fail to meet an appropriate level of signal quality. Subject exclusion is an important quality control step in the ERP analysis pipeline as it ensures that statistical inference is based only upon those subjects exhibiting clear evoked brain responses. This critical quality control step is most often performed simply through visual inspection of subject-level ERPs by investigators. Such an approach is qualitative, subjective, and susceptible to investigator bias, as there are no standards as to what constitutes an ERP of sufficient signal quality. Here, we describe a standardized and objective method for quantifying waveform quality in individual subjects and establishing criteria for subject exclusion. The approach uses bootstrap resampling of ERP waveforms (from a pool of all available trials) to compute a signal-to-noise ratio confidence interval (SNR-CI) for individual subject waveforms. The lower bound of this SNR-CI (SNRLB) yields an effective and objective measure of signal quality as it ensures that ERP waveforms statistically exceed a desired signal-to-noise criterion. SNRLB provides a quantifiable metric of individual subject ERP quality and eliminates the need for subjective evaluation of waveform quality by the investigator. We detail the SNR-CI methodology, establish the efficacy of employing this approach with Monte Carlo simulations, and demonstrate its utility in practice when applied to ERP datasets. PMID:26903849
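
    A hedged sketch of the bootstrap SNR-CI idea follows: resample trials with replacement, average each resample into a bootstrap ERP, compute an SNR, and take the lower percentile as SNR_LB. The trial matrix is simulated, and the particular SNR definition (signal-window mean amplitude over noise-window standard deviation) and window positions are simplified assumptions, not the paper's exact formulation.

    ```python
    # Bootstrap lower-bound SNR for subject-exclusion screening.
    import numpy as np

    rng = np.random.default_rng(9)
    n_trials, n_samples = 60, 300
    erp = np.zeros(n_samples)
    erp[100:140] = 2.0                                      # true evoked response
    trials = erp + rng.normal(0, 8, (n_trials, n_samples))  # simulated single trials

    def snr(waveform, signal_win=slice(100, 140), noise_win=slice(0, 80)):
        return np.abs(waveform[signal_win]).mean() / waveform[noise_win].std()

    boot = []
    for _ in range(2000):
        idx = rng.integers(0, n_trials, n_trials)   # resample trials w/ replacement
        boot.append(snr(trials[idx].mean(axis=0)))
    snr_lb = np.percentile(boot, 2.5)               # lower bound of the 95% CI
    print(f"SNR_LB = {snr_lb:.2f}; exclude subject if below a chosen criterion")
    ```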

  20. Bench to bedside: the quest for quality in experimental stroke research.

    PubMed

    Dirnagl, Ulrich

    2006-12-01

    Over the past decades, great progress has been made in clinical as well as experimental stroke research. Disappointingly, however, hundreds of clinical trials testing neuroprotective agents have failed despite efficacy in experimental models. Recently, several systematic reviews have exposed a number of important deficits in the quality of preclinical stroke research. Many of the issues raised in these reviews are not specific to experimental stroke research, but apply to studies of animal models of disease in general. It is the aim of this article to review some quality-related sources of bias with a particular focus on experimental stroke research. Weaknesses discussed include, among others, low statistical power and hence reproducibility, defects in statistical analysis, lack of blinding and randomization, lack of quality-control mechanisms, deficiencies in reporting, and negative publication bias. Although quantitative evidence for quality problems at present is restricted to preclinical stroke research, to spur discussion and in the hope that they will be exposed to meta-analysis in the near future, I have also included some quality-related sources of bias, which have not been systematically studied. Importantly, these may be also relevant to mechanism-driven basic stroke research. I propose that by a number of rather simple measures reproducibility of experimental results, as well as the step from bench to bedside in stroke research may be made more successful. However, the ultimate proof for this has to await successful phase III stroke trials, which were built on basic research conforming to the criteria as put forward in this article.

  1. On polarimetric radar signatures of deep convection for model evaluation: columns of specific differential phase observed during MC3E.

    PubMed

    van Lier-Walqui, Marcus; Fridlind, Ann M; Ackerman, Andrew S; Collis, Scott; Helmus, Jonathan; MacGorman, Donald R; North, Kirk; Kollias, Pavlos; Posselt, Derek J

    2016-02-01

    The representation of deep convection in general circulation models is in part informed by cloud-resolving models (CRMs) that function at higher spatial and temporal resolution; however, recent studies have shown that CRMs often fail at capturing the details of deep convection updrafts. With the goal of providing constraint on CRM simulation of deep convection updrafts, ground-based remote-sensing observations are analyzed and statistically correlated for four deep convection events observed during the Midlatitude Continental Convective Clouds Experiment (MC3E). Since positive values of specific differential phase (KDP) observed above the melting level are associated with deep convection updraft cells, so-called "KDP columns" are analyzed using two scanning polarimetric radars in Oklahoma: the National Weather Service Vance WSR-88D (KVNX) and the Department of Energy C-band Scanning Atmospheric Radiation Measurement (ARM) Precipitation Radar (C-SAPR). KVNX and C-SAPR KDP volumes and columns are then statistically correlated with vertical winds retrieved via multi-Doppler wind analysis, lightning flash activity derived from the Oklahoma Lightning Mapping Array, and KVNX differential reflectivity (ZDR). Results indicate strong correlations of KDP volume above the melting level with updraft mass flux, lightning flash activity, and intense rainfall. Analysis of KDP columns reveals signatures of changing updraft properties from one storm event to another as well as during event evolution. Comparison of ZDR to KDP shows commonalities in information content of each, as well as potential problems with ZDR associated with observational artifacts.

  2. [Prevalence and associated factors with depressive symptoms in Health Sciences students from a private university in Lima, Peru 2010].

    PubMed

    Pereyra-Elías, Reneé; Ocampo-Mascaró, Javier; Silva-Salazar, Vera; Vélez-Segovia, Eduardo; Costa-Bullón, A Daniel da; Toro-Polo, Luis Miguel; Vicuña-Ortega, Joanna

    2010-01-01

    Depressive symptoms in health sciences students are common and potentially detrimental. To determine the prevalence of depressive symptoms and its associated factors in students from the Health Sciences Faculty of the Universidad Peruana de Ciencias Aplicadas in Lima (Peru), June 2010. Cross-sectional analytic study; a survey, with prior consent, was administered to 590 of the 869 students in the population. Zung's abbreviated scale was used to measure depressive symptoms. To evaluate the associated factors, logistic regression was used; p < 0.05 was considered statistically significant. The mean age was 18.97 ± 2.45 years and 71.1% were women; 19.6% were migrants and 62.5% were medical students. The prevalence of depressive symptoms was 31.2% in the whole population and 33.6% among medical students. In bivariate analysis, depressive symptoms were not associated with sex, career, having failed a course, living alone or being a migrant (p > 0.05). In the multivariate analysis, a statistically significant association was found between depressive symptoms and dissatisfaction with one's own academic performance (OR = 2.13, CI95% 1.47-3.08), dissatisfaction with current economic status (OR = 1.93, CI95% 1.24-2.99) and living with a relative outside the nuclear family (OR = 1.62, CI95% 1.07-2.45). A high prevalence of depressive symptoms was found, especially in medical students; dissatisfaction with academic performance, dissatisfaction with economic status, and living with a relative outside the nuclear family were associated factors that could be taken into account when building preventive programs.
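
    The multivariate logistic regression reported above can be sketched with statsmodels; the data are simulated and the variable names are invented stand-ins for the study's factors, with effect sizes chosen to roughly echo the reported odds ratios.

    ```python
    # Logistic regression with odds ratios and 95% confidence intervals.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(10)
    n = 590
    df = pd.DataFrame({
        "acad_dissat": rng.integers(0, 2, n),          # invented indicator
        "econ_dissat": rng.integers(0, 2, n),          # invented indicator
        "lives_with_relative": rng.integers(0, 2, n),  # invented indicator
    })
    logit_p = (-1.3 + 0.76 * df["acad_dissat"] + 0.66 * df["econ_dissat"]
               + 0.48 * df["lives_with_relative"])
    df["depressive"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

    fit = smf.logit("depressive ~ acad_dissat + econ_dissat + lives_with_relative",
                    data=df).fit(disp=0)
    print(np.exp(fit.params))        # odds ratios
    print(np.exp(fit.conf_int()))    # 95% confidence intervals
    ```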

  3. GRAPHIC REANALYSIS OF THE TWO NINDS-TPA TRIALS CONFIRMS SUBSTANTIAL TREATMENT BENEFIT

    PubMed Central

    Saver, Jeffrey L.; Gornbein, Jeffrey; Starkman, Sidney

    2010-01-01

    Background of Comment/Review Multiple statistical analyses of the two NINDS-TPA Trials have confirmed the study findings of benefit from fibrinolytic therapy. A recent graphic analysis departed from best practices in the visual display of quantitative information by failing to take into account the skewed functional importance of NIH Stroke Scale raw scores and by scaling change axes at up to twenty times the range achievable by individual patients. Methods Using the publicly available datasets of the 2 NINDS-TPA Trials, we generated a variety of figures appropriate to the characteristics of acute stroke trial data. Results A diverse array of figures all visually delineated substantial benefits of fibrinolytic therapy, including: bar charts of normalized gain and loss; stacked bar, bar, and matrix plots of clinically relevant ordinal ranks; a time series stacked line plot of continuous scale disability weights; and line plot, bubble chart, and person icon array graphs of joint outcome table analysis. The achievable-change figure showed substantially greater improvement among TPA than placebo patients, median 66.7% (IQR 0-92.0) vs 50.0% (IQR -7.1 to 80.0), p = 0.003. Conclusions On average, patients treated with TPA under 3 hours recovered two-thirds of the way toward fully normal, while placebo patients improved only half of the way. Graphical analysis of the two NINDS-TPA trials, when performed according to best practices, is a useful means of conveying details about patient response to therapy not fully delineated by summary statistics, and confirms a valuable treatment benefit of under-3-hour fibrinolytic therapy in acute stroke. PMID:20829518

  4. Reliability and sensitivity analysis of a system with multiple unreliable service stations and standby switching failures

    NASA Astrophysics Data System (ADS)

    Ke, Jyh-Bin; Lee, Wen-Chiung; Wang, Kuo-Hsiung

    2007-07-01

    This paper presents the reliability and sensitivity analysis of a system with M primary units, W warm standby units, and R unreliable service stations, in which warm standby units switching to the primary state might fail. Failure times of primary and warm standby units are assumed to have exponential distributions, and service times of the failed units are exponentially distributed. In addition, breakdown times and repair times of the service stations also follow exponential distributions. Expressions for the system reliability, R_Y(t), and the mean time to system failure, MTTF, are derived. Sensitivity and relative sensitivity analyses of the system reliability and the mean time to failure with respect to the system parameters are also investigated.
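
    A Monte Carlo sketch of the MTTF for the simplest instance of such a system (one primary unit, one warm standby, perfect service stations) is shown below; the failure rates and the switching-failure probability q are invented, and the memorylessness of the exponential distribution is used when the surviving standby takes over.

    ```python
    # Monte Carlo MTTF estimate for 1 primary + 1 warm standby with
    # imperfect switching; all rates are assumed for illustration.
    import numpy as np

    rng = np.random.default_rng(11)
    lam, lam_w, q = 1.0, 0.3, 0.1   # primary rate, warm-standby rate, switch-fail prob

    def one_run():
        t_primary = rng.exponential(1 / lam)
        t_standby_idle = rng.exponential(1 / lam_w)  # standby may fail while warm
        if t_standby_idle < t_primary or rng.random() < q:
            return t_primary                          # no standby, or switch fails
        # By memorylessness, the surviving standby lives exp(lam) as new primary.
        return t_primary + rng.exponential(1 / lam)

    mttf = np.mean([one_run() for _ in range(100_000)])
    print(f"estimated MTTF = {mttf:.3f}")
    ```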

  5. Incorporating covariates into fisheries stock assessment models with application to Pacific herring.

    PubMed

    Deriso, Richard B; Maunder, Mark N; Pearson, Walter H

    2008-07-01

    We present a framework for evaluating the cause of fishery declines by integrating covariates into a fisheries stock assessment model. This allows the evaluation of fisheries' effects vs. natural and other human impacts. The analyses presented are based on integrating ecological science and statistics and form the basis for environmental decision-making advice. Hypothesis tests are described to rank hypotheses and determine the size of a multiple covariate model. We extend recent developments in integrated analysis and use novel methods to produce effect size estimates that are relevant to policy makers and include estimates of uncertainty. Results can be directly applied to evaluate trade-offs among alternative management decisions. The methods and results are also broadly applicable outside fisheries stock assessment. We show that multiple factors influence populations and that analysis of factors in isolation can be misleading. We illustrate the framework by applying it to Pacific herring of Prince William Sound, Alaska (USA). The Pacific herring stock that spawns in Prince William Sound is a stock that has collapsed, but there are several competing or alternative hypotheses to account for the initial collapse and subsequent lack of recovery. Factors failing the initial screening tests for statistical significance included indicators of the 1989 Exxon Valdez oil spill, coho salmon predation, sea lion predation, Pacific Decadal Oscillation, Northern Oscillation Index, and effects of containment in the herring egg-on-kelp pound fishery. The overall results indicate that the most statistically significant factors related to the lack of recovery of the herring stock involve competition or predation by juvenile hatchery pink salmon on herring juveniles. Secondary factors identified in the analysis were poor nutrition in the winter, ocean (Gulf of Alaska) temperature in the winter, the viral hemorrhagic septicemia virus, and the pathogen Ichthyophonus hoferi. The implication of this result to fisheries management in Prince William Sound is that it may well be difficult to simultaneously increase the production of pink salmon and maintain a viable Pacific herring fishery. The impact can be extended to other commercially important fisheries, and a whole ecosystem approach may be needed to evaluate the costs and benefits of salmon hatcheries.

  6. Cost-utility analysis of competing treatment strategies for drug-resistant epilepsy in children with Tuberous Sclerosis Complex.

    PubMed

    Fallah, Aria; Weil, Alexander G; Wang, Shelly; Lewis, Evan; Baca, Christine B; Mathern, Gary W

    2016-10-01

    The management of drug-resistant epilepsy in children with Tuberous Sclerosis Complex (TSC) is challenging because of the multitude of treatment options, wide range of associated costs, and uncertainty of seizure outcomes. The most cost-effective approach for children whose epilepsy has failed to improve with first-line medical therapy is uncertain. A review of MEDLINE from 1990 to 2015 was conducted. A cost-utility analysis, from a third-party payer perspective, was performed for children with drug-resistant epilepsy that had failed to improve with 2 antiseizure drugs (ASDs) and that was amenable to resective epilepsy surgery, across a time horizon of 5 years. Four strategies were included: (1) resective epilepsy surgery, (2) vagus nerve stimulator (VNS) implantation, (3) ketogenic diet, and (4) addition of a third ASD (specifically, carbamazepine). The incremental cost per quality-adjusted life year (QALY) gained was analyzed. Given a willingness-to-pay (WTP) of $100,000 per QALY, the addition of a third ASD ($6600 for a gain of 4.14 QALYs) was the most cost-effective treatment strategy. In a secondary analysis of children whose epilepsy had failed to improve with 3 ASDs, the ketogenic diet, addition of a fourth ASD, and resective epilepsy surgery were incrementally cost-effective treatment strategies. Vagus nerve stimulator implantation was more expensive yet less effective than alternative strategies and should not be prioritized. The addition of a third ASD is a universally cost-effective treatment option in the management of children with drug-resistant epilepsy that has failed to improve with 2 ASDs. For children whose epilepsy has failed to improve with 3 ASDs, the most cost-effective treatment depends on the health-care resources available, reflected by the WTP. Copyright © 2016 Elsevier Inc. All rights reserved.
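
    The arithmetic behind such rankings is the incremental cost-effectiveness ratio (ICER) checked against the WTP threshold, sketched below. Only the third-ASD cost and QALY figures are taken from the abstract; the other costs and QALYs are invented for illustration.

      # Sketch of an incremental cost-effectiveness comparison: sort
      # strategies by cost, skip dominated options, and compare successive
      # ICERs with willingness-to-pay (WTP).
      strategies = {                 # name: (cost in $, QALYs over 5 years)
          "third ASD":         (6600,  4.14),   # from the abstract
          "ketogenic diet":    (20000, 4.20),   # invented
          "resective surgery": (77000, 4.40),   # invented
      }
      WTP = 100_000  # $ per QALY

      ordered = sorted(strategies.items(), key=lambda kv: kv[1][0])
      base_name, (base_cost, base_qaly) = ordered[0]
      print(f"reference strategy: {base_name}")
      for name, (cost, qaly) in ordered[1:]:
          if qaly <= base_qaly:
              print(f"{name}: dominated (costlier, no QALY gain)")
              continue
          icer = (cost - base_cost) / (qaly - base_qaly)
          verdict = "cost-effective" if icer <= WTP else "not cost-effective"
          print(f"{name}: ICER = ${icer:,.0f}/QALY -> {verdict} at WTP ${WTP:,}")
          base_name, base_cost, base_qaly = name, cost, qaly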

  7. Blunt splenic injuries: have we watched long enough?

    PubMed

    Smith, Jason; Armen, Scott; Cook, Charles H; Martin, Larry C

    2008-03-01

    Nonoperative management (NOM) of blunt splenic injuries (BSIs) has been used with increasing frequency in adult patients. There are currently no definitive guidelines established for how long BSI patients should be monitored for failure of NOM after injury. This study was performed to ascertain the length of inpatient observation needed to capture most failures, and to identify factors associated with failure of NOM. We utilized the National Trauma Data Bank to determine time to failure after BSI. During the 5-year study period, 23,532 patients were identified with BSI, of which 2,366 (10% overall) were taken directly to surgery (within 2 hours of arrival). Of 21,166 patients initially managed nonoperatively, 18,506 were successful (79% of all-comers). Patients with isolated BSI are currently monitored approximately 5 days as inpatients. Of patients failing NOM, 95% failed during the first 72 hours, and monitoring 2 additional days saw only 1.5% more failures. Factors influencing success of NOM included computed tomographic injury grade, severity of patient injury, and American College of Surgeons designation of trauma center. Importantly, patients who failed NOM did not seem to have detrimental outcomes when compared with patients with successful NOM. No statistically significant predictive variables could be identified that would help predict patients who would go on to fail NOM. We conclude that at least 80% of BSI can be managed successfully with NOM, and that patients should be monitored as inpatients for failure after BSI for 3 to 5 days.

  8. Study of a fail-safe abort system for an actively cooled hypersonic aircraft: Computer program documentation

    NASA Technical Reports Server (NTRS)

    Haas, L. A., Sr.

    1976-01-01

    The user's manual for the Fail-Safe Abort System TEMPerature analysis Program (FASTEMP) is presented. The program was used to analyze fail-safe abort systems for an actively cooled hypersonic aircraft. FASTEMP analyzes the steady-state or transient temperature response of a thermal model defined in rectangular, cylindrical, conical, and/or spherical coordinate systems. FASTEMP provides the user with a large selection of subroutines for heat transfer calculations. The modes of heat transfer available from these subroutines are: heat storage, conduction, radiation, heat addition or generation, convection, and fluid flow.
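
    FASTEMP's source is not reproduced in the abstract; the sketch below merely illustrates the kind of transient conduction calculation such thermal analyzers perform, as an explicit finite-difference march on a one-dimensional rod with assumed material properties and boundary temperatures.

      # Explicit time-marching of 1D transient conduction (illustrative only).
      import numpy as np

      alpha = 1e-5           # thermal diffusivity, m^2/s (assumed)
      dx, dt = 0.01, 1.0     # grid spacing (m) and time step (s)
      r = alpha * dt / dx**2
      assert r <= 0.5, "explicit-scheme stability limit"

      T = np.full(50, 300.0)        # initial temperature, K
      T[0], T[-1] = 500.0, 300.0    # boundary nodes held fixed

      for _ in range(600):          # march 600 s
          T[1:-1] += r * (T[2:] - 2 * T[1:-1] + T[:-2])

      print(f"mid-rod temperature after 600 s: {T[len(T) // 2]:.1f} K")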

  9. The reliability of the pass/fail decision for assessments comprised of multiple components.

    PubMed

    Möltner, Andreas; Tımbıl, Sevgi; Jünger, Jana

    2015-01-01

    The decision having the most serious consequences for a student taking an assessment is the one to pass or fail that student. For this reason, the reliability of the pass/fail decision must be determined for high-quality assessments, just as the measurement reliability of the scores themselves. Assessments in a particular subject (graded course credit) are often composed of multiple components that must be passed independently of each other. When "conjunctively" combining separate pass/fail decisions, as with other complex decision rules for passing, adequate methods of analysis are necessary for estimating the accuracy and consistency of these classifications. To date, very few papers have addressed this issue; a generally applicable procedure was published by Douglas and Mislevy in 2010. Using the example of an assessment comprised of several parts that must be passed separately, this study analyzes the reliability underlying the decision to pass or fail students and discusses the impact of an improved method for identifying those who do not fulfill the minimum requirements. The accuracy and consistency of the decision to pass or fail an examinee in the subject cluster Internal Medicine/General Medicine/Clinical Chemistry at the University of Heidelberg's Faculty of Medicine was investigated. This cluster requires students to separately pass three components (two written exams and an OSCE), whereby students may reattempt each component twice. Our analysis was carried out using the method described by Douglas and Mislevy. When complex logical connections exist between the individual pass/fail decisions and failure rates are low, frequently only a very low reliability can be achieved for the overall decision to grant graded course credit, even if high reliabilities exist for the various components. For the example analyzed here, the classification accuracy and consistency when conjunctively combining the three individual parts are relatively low, with κ=0.49 and κ=0.47 respectively, despite the good reliability of over 0.75 for each of the three components. The option to repeat each component twice leads to a situation in which only about half of the candidates who do not satisfy the minimum requirements would fail the overall assessment, while the other half is able to continue their studies despite having deficient knowledge and skills. The method put forth by Douglas and Mislevy allows the analysis of decision accuracy and consistency for complex combinations of scores from different components. Even with highly reliable components, a reliable pass/fail decision is not guaranteed - for instance in the case of low failure rates. Assessments must be administered with the explicit goal of identifying examinees who do not fulfill the minimum requirements.
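
    The Douglas and Mislevy procedure itself is model-based, but the mechanism it quantifies is easy to see by simulation. The sketch below (ability scale, cutoff, and error scale all assumed) estimates how often examinees below the minimum standard still pass three conjunctively combined components when each component may be attempted three times.

      # Monte Carlo sketch of a conjunctive pass rule with two retakes.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 100_000
      ability = rng.normal(size=n)       # true standing; cutoff at 0
      deficient = ability < 0
      sem = 0.6                          # measurement error per attempt (assumed)

      def passes_component(theta):
          # up to three attempts; one passing score suffices
          attempts = theta[:, None] + rng.normal(scale=sem, size=(theta.size, 3))
          return (attempts > 0).any(axis=1)

      overall_pass = np.ones(n, dtype=bool)
      for _ in range(3):                 # three components, all must be passed
          overall_pass &= passes_component(ability)

      frac = overall_pass[deficient].mean()
      print(f"deficient examinees who nevertheless pass overall: {frac:.0%}")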

  11. EEG analysis of the brain activity during the observation of commercial, political, or public service announcements.

    PubMed

    Vecchiato, Giovanni; Astolfi, Laura; Tabarrini, Alessandro; Salinari, Serenella; Mattia, Donatella; Cincotti, Febo; Bianchi, Luigi; Sorrentino, Domenica; Aloise, Fabio; Soranzo, Ramon; Babiloni, Fabio

    2010-01-01

    The use of modern brain imaging techniques could be useful to understand which brain areas are involved in the observation of video clips related to commercial advertising, the support of political campaigns, and Public Service Announcements (PSAs). In this paper we describe the capability of tracking brain activity during the observation of commercials, political spots, and PSAs with advanced high-resolution EEG statistical techniques in the time and frequency domains in a group of normal subjects. We analyzed the statistically significant cortical spectral power activity in different frequency bands during the observation of a commercial video clip related to the use of a beer in a group of 13 normal subjects. In addition, a TV speech of the Prime Minister of Italy was analyzed in two groups of swing and "supporter" voters. Results suggested that the cortical activity during the observation of commercial spots could vary considerably across the spot. This suggests the possibility of removing the parts of the spot that are not particularly attractive by using such cerebral indexes. The cortical activity during the observation of the political speech indicated a major cortical activity in the supporters group when compared to the swing voters. In this case, it is possible to conclude that the proposed communication failed to raise attention or interest among swing voters. In conclusion, high-resolution EEG statistical techniques have proved able to generate useful insights about the particular fruition of TV messages, in both the commercial and political fields.

  12. Smooth quantile normalization.

    PubMed

    Hicks, Stephanie C; Okrah, Kwame; Paulson, Joseph N; Quackenbush, John; Irizarry, Rafael A; Bravo, Héctor Corrada

    2018-04-01

    Between-sample normalization is a critical step in genomic data analysis to remove systematic bias and unwanted technical variation in high-throughput data. Global normalization methods are based on the assumption that observed variability in global properties is due to technical reasons and are unrelated to the biology of interest. For example, some methods correct for differences in sequencing read counts by scaling features to have similar median values across samples, but these fail to reduce other forms of unwanted technical variation. Methods such as quantile normalization transform the statistical distributions across samples to be the same and assume global differences in the distribution are induced by only technical variation. However, it remains unclear how to proceed with normalization if these assumptions are violated, for example, if there are global differences in the statistical distributions between biological conditions or groups, and external information, such as negative or control features, is not available. Here, we introduce a generalization of quantile normalization, referred to as smooth quantile normalization (qsmooth), which is based on the assumption that the statistical distribution of each sample should be the same (or have the same distributional shape) within biological groups or conditions, but allowing that they may differ between groups. We illustrate the advantages of our method on several high-throughput datasets with global differences in distributions corresponding to different biological conditions. We also perform a Monte Carlo simulation study to illustrate the bias-variance tradeoff and root mean squared error of qsmooth compared to other global normalization methods. A software implementation is available from https://github.com/stephaniehicks/qsmooth.
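
    For orientation, classical (full) quantile normalization, the method qsmooth generalizes, fits in a few lines; this sketch is not the authors' implementation (see their repository for that) and breaks ties by order rather than averaging them.

      # Force every sample (column) to share the same empirical distribution.
      import numpy as np

      def quantile_normalize(X):
          """X: features x samples; returns the quantile-normalized matrix."""
          ranks = np.argsort(np.argsort(X, axis=0), axis=0)   # per-column ranks
          reference = np.sort(X, axis=0).mean(axis=1)         # mean quantile curve
          return reference[ranks]

      X = np.array([[5.0, 4.0, 3.0],
                    [2.0, 1.0, 4.0],
                    [3.0, 4.0, 6.0],
                    [4.0, 2.0, 8.0]])
      print(quantile_normalize(X))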

  14. Comparison of the compressive strength of 3 different implant design systems.

    PubMed

    Pedroza, Jose E; Torrealba, Ysidora; Elias, Augusto; Psoter, Walter

    2007-01-01

    The aims of this study were twofold: to compare the static compressive strength at the implant-abutment interface of 3 design systems and to describe the implant-abutment connection failure mode. A stainless steel holding device was designed to align the implants at 30 degrees with respect to the y-axis. Sixty-nine specimens were used, 23 for each system. A computer-controlled universal testing machine (MTS 810) applied static compression loading by a unidirectional vertical piston until failure. Specimens were evaluated macroscopically for longitudinal displacement, abutment looseness, and screw and implant fracture. Data were analyzed by analysis of variance (ANOVA). The mean compressive strength for the Unipost system was 392.5 psi (SD ±40.9), for the Spline system 342.8 psi (SD ±25.8), and for the Screw-Vent system 269.1 psi (SD ±30.7). The Unipost implant-abutment connection demonstrated statistically significantly superior mechanical stability (P ≤ .009) compared with the Spline implant system. The Spline implant system showed a statistically significantly higher compressive strength than the Screw-Vent implant system (P ≤ .009). Regarding failure mode, the Unipost system consistently broke at the same site, while the other systems failed at different points of the connection. The Unipost system demonstrated excellent fracture resistance to compressive forces; this resistance may be attributed primarily to the diameter of the abutment screw and the 2.5 mm counterbore, which together form a single piece with the implant. The Unipost implant system demonstrated a statistically significantly superior compressive strength value compared with the Spline and Screw-Vent systems at a 30-degree angulation.
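
    The group comparison reported above is a standard one-way ANOVA; the sketch below re-creates that analysis on simulated samples that merely echo the reported group means and standard deviations.

      # One-way ANOVA on simulated compressive-strength data (illustrative).
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      unipost    = rng.normal(392.5, 40.9, size=23)   # psi; moments from abstract
      spline     = rng.normal(342.8, 25.8, size=23)
      screw_vent = rng.normal(269.1, 30.7, size=23)

      F, p = stats.f_oneway(unipost, spline, screw_vent)
      print(f"F = {F:.1f}, p = {p:.3g}")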

  15. Φ⁴ kinks: Statistical mechanics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Habib, S.

    1995-12-31

    Some recent investigations of the thermal equilibrium properties of kinks in a 1+1-dimensional, classical φ⁴ field theory are reviewed. The distribution function, kink density, correlation function, and certain thermodynamic quantities were studied both theoretically and via large-scale simulations. A simple double-Gaussian variational approach within the transfer operator formalism was shown to give good results in the intermediate temperature range where the dilute gas theory is known to fail.

  16. Implications of "Too Good to Be True" for Replication, Theoretical Claims, and Experimental Design: An Example Using Prominent Studies of Racial Bias.

    PubMed

    Francis, Gregory

    2016-01-01

    In response to concerns about the validity of empirical findings in psychology, some scientists use replication studies as a way to validate good science and to identify poor science. Such efforts are resource intensive and are sometimes controversial (with accusations of researcher incompetence) when a replication fails to show a previous result. An alternative approach is to examine the statistical properties of the reported literature to identify some cases of poor science. This review discusses some details of this process for prominent findings about racial bias, where a set of studies seems "too good to be true." This kind of analysis is based on the original studies, so it avoids criticism from the original authors about the validity of replication studies. The analysis is also much easier to perform than a new empirical study. A variation of the analysis can also be used to explore whether it makes sense to run a replication study. As demonstrated here, there are situations where the existing data suggest that a direct replication of a set of studies is not worth the effort. Such a conclusion should motivate scientists to generate alternative experimental designs that better test theoretical ideas.
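
    One common form of this analysis multiplies the estimated power of each experiment to bound the probability that every study in a set would reach significance; if that product is very small, uniform success looks "too good to be true". The effect sizes and sample sizes below are invented for illustration.

      # Excess-success sketch: P(all k studies significant) <= product of powers.
      import numpy as np
      from statsmodels.stats.power import TTestIndPower

      solver = TTestIndPower()
      # (Cohen's d, per-group n) for each reported experiment -- hypothetical.
      experiments = [(0.45, 30), (0.50, 25), (0.40, 28), (0.55, 22), (0.42, 35)]

      powers = [solver.power(effect_size=d, nobs1=n, alpha=0.05)
                for d, n in experiments]
      p_all = np.prod(powers)
      print(f"estimated powers: {np.round(powers, 2)}")
      print(f"P(all {len(experiments)} studies significant) ~ {p_all:.3f}")
      if p_all < 0.1:
          print("below the conventional 0.1 criterion: 'too good to be true'")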

  18. Python for Information Theoretic Analysis of Neural Data

    PubMed Central

    Ince, Robin A. A.; Petersen, Rasmus S.; Swan, Daniel C.; Panzeri, Stefano

    2008-01-01

    Information theory, the mathematical theory of communication in the presence of noise, is playing an increasingly important role in modern quantitative neuroscience. It makes it possible to treat neural systems as stochastic communication channels and gain valuable, quantitative insights into their sensory coding function. These techniques provide results on how neurons encode stimuli in a way which is independent of any specific assumptions on which part of the neuronal response is signal and which is noise, and they can be usefully applied even to highly non-linear systems where traditional techniques fail. In this article, we describe our work and experiences using Python for information theoretic analysis. We outline some of the algorithmic, statistical and numerical challenges in the computation of information theoretic quantities from neural data. In particular, we consider the problems arising from limited sampling bias and from calculation of maximum entropy distributions in the presence of constraints representing the effects of different orders of interaction in the system. We explain how and why using Python has allowed us to significantly improve the speed and domain of applicability of the information theoretic algorithms, allowing analysis of data sets characterized by larger numbers of variables. We also discuss how our use of Python is facilitating integration with collaborative databases and centralised computational resources. PMID:19242557
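
    As a small taste of the limited-sampling problem mentioned above, the sketch below computes a plug-in entropy estimate with the first-order Miller-Madow bias correction; the toolbox described in the paper implements more refined estimators.

      # Plug-in entropy (bits) with Miller-Madow correction (k-1)/(2N ln 2).
      import numpy as np

      def entropy_mm(counts):
          n = counts.sum()
          p = counts[counts > 0] / n
          h_plugin = -np.sum(p * np.log2(p))
          k = (counts > 0).sum()                  # occupied bins
          return h_plugin + (k - 1) / (2 * n * np.log(2))

      rng = np.random.default_rng(3)
      responses = rng.integers(0, 8, size=200)    # simulated discrete responses
      counts = np.bincount(responses, minlength=8)
      print(f"H_MM = {entropy_mm(counts):.3f} bits")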

  19. MrBayes tgMC3++: A High Performance and Resource-Efficient GPU-Oriented Phylogenetic Analysis Method.

    PubMed

    Ling, Cheng; Hamada, Tsuyoshi; Gao, Jingyang; Zhao, Guoguang; Sun, Donghong; Shi, Weifeng

    2016-01-01

    MrBayes is a widespread phylogenetic inference tool harnessing empirical evolutionary models and Bayesian statistics. However, the computational cost of the likelihood estimation is very expensive, resulting in undesirably long execution times. Although a number of multi-threaded optimizations have been proposed to speed up MrBayes, there are bottlenecks that severely limit the GPU thread-level parallelism of likelihood estimations. This study proposes a high-performance and resource-efficient method for GPU-oriented parallelization of likelihood estimations. Instead of having to rely on empirical programming, the proposed novel decomposition storage model implements high-performance data transfers implicitly. In terms of performance improvement, a speedup factor of up to 178 can be achieved on the analysis of simulated datasets by four Tesla K40 cards. In comparison to the other publicly available GPU-oriented versions of MrBayes, the tgMC3++ method (proposed herein) outperforms the tgMC3 (v1.0), nMC3 (v2.1.1) and oMC3 (v1.00) methods by speedup factors of up to 1.6, 1.9 and 2.9, respectively. Moreover, tgMC3++ supports more evolutionary models and gamma categories, which previous GPU-oriented methods fail to include in their analyses.

  20. Hydroxychloroquine for the prevention of fetal growth restriction and prematurity in lupus pregnancy: A systematic review and meta-analysis.

    PubMed

    Vivien, Guillotin; Alice, Bouhet; Thomas, Barnetche; Christophe, Richez; Marie-Elise, Truchetet; Julien, Seneschal; Pierre, Duffau; Estibaliz, Lazaro

    2018-04-06

    Systemic lupus erythematosus (SLE) is a chronic autoimmune disease that primarily affects women of childbearing age. While the impact of hydroxychloroquine (HCQ) on SLE activity and neonatal lupus occurrence has been evaluated in several studies, its role in prematurity and intrauterine growth restriction (IUGR) remains uncertain. The aim of this study was to assess the impact of HCQ exposure on prematurity and IUGR during pregnancy in women with SLE. We conducted a systematic review and a meta-analysis comparing prematurity and IUGR in SLE pregnancies exposed or not exposed to HCQ. The odds ratios of IUGR and prematurity were calculated and compared between groups according to HCQ treatment. Six studies were included (3 descriptive cohort studies and 3 case series), totalling 870 pregnancies. Of the SLE pregnancies, 308 were exposed to HCQ and were compared to 562 not exposed to HCQ. There was no statistically significant difference in prematurity or IUGR between groups. This meta-analysis failed to prove the efficacy of HCQ in the prevention of prematurity or IUGR during SLE pregnancies. Due to the heterogeneity of the studies, these results should be interpreted cautiously. Copyright © 2018 Société française de rhumatologie. Published by Elsevier SAS. All rights reserved.
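
    The basic machinery behind such a comparison is inverse-variance pooling of log odds ratios; the sketch below shows the fixed-effect version on fabricated 2x2 tables (the study's own data are not reproduced here).

      # Fixed-effect meta-analysis of odds ratios on the log scale.
      import numpy as np

      # (events_exposed, n_exposed, events_unexposed, n_unexposed) per study
      studies = [(12, 100, 18, 120), (8, 90, 10, 85), (20, 118, 30, 160)]

      log_ors, weights = [], []
      for a, n1, c, n0 in studies:
          b, d = n1 - a, n0 - c
          log_ors.append(np.log((a * d) / (b * c)))
          weights.append(1 / (1/a + 1/b + 1/c + 1/d))   # inverse Woolf variance

      pooled = np.average(log_ors, weights=weights)
      se = 1 / np.sqrt(np.sum(weights))
      lo, hi = np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se)
      print(f"pooled OR = {np.exp(pooled):.2f} (95% CI {lo:.2f}-{hi:.2f})")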

  1. A cluster pattern algorithm for the analysis of multiparametric cell assays.

    PubMed

    Kaufman, Menachem; Bloch, David; Zurgil, Naomi; Shafran, Yana; Deutsch, Mordechai

    2005-09-01

    The issue of multiparametric analysis of complex single-cell assays in both static and flow cytometry (SC and FC, respectively) has become common in recent years. In such assays, the analysis of changes applying common statistical parameters and tests often fails to detect significant differences between the investigated samples. The cluster pattern similarity (CPS) measure between two sets of gated clusters is based on computing the difference between the set points of their density distribution functions. The CPS was applied to the discrimination between two observations in a four-dimensional parameter space. The similarity coefficient (r) ranges from 0 (perfect similarity) to 1 (dissimilar). Three CPS validation tests were carried out: on the same stock samples of fluorescent beads, yielding very low r's (0, 0.066), and on two cell models: mitogenic stimulation of peripheral blood mononuclear cells (PBMC), and apoptosis induction in the Jurkat T cell line by H2O2. In both cell models, r indicated similarity (r < 0.23) within the same group and dissimilarity (r > 0.48) otherwise. This classification and algorithm approach offers a measure of similarity between samples that relies on the multidimensional pattern of the sample parameters. The algorithm compensates for environmental drifts in the apparatus and assay; it may also be applied to more than four dimensions.

  2. Hierarchical cluster analysis of technical replicates to identify interferents in untargeted mass spectrometry metabolomics.

    PubMed

    Caesar, Lindsay K; Kvalheim, Olav M; Cech, Nadja B

    2018-08-27

    Mass spectral data sets often contain experimental artefacts, and data filtering prior to statistical analysis is crucial to extract reliable information. This is particularly true in untargeted metabolomics analyses, where the analyte(s) of interest are not known a priori. It is often assumed that chemical interferents (i.e. solvent contaminants such as plasticizers) are consistent across samples, and can be removed by background subtraction from blank injections. On the contrary, it is shown here that chemical contaminants may vary in abundance across each injection, potentially leading to their misidentification as relevant sample components. With this metabolomics study, we demonstrate the effectiveness of hierarchical cluster analysis (HCA) of replicate injections (technical replicates) as a methodology to identify chemical interferents and reduce their contaminating contribution to metabolomics models. Pools of metabolites with varying complexity were prepared from the botanical Angelica keiskei Koidzumi and spiked with known metabolites. Each set of pools was analyzed in triplicate and at multiple concentrations using ultraperformance liquid chromatography coupled to mass spectrometry (UPLC-MS). Before filtering, HCA failed to cluster replicates in the data sets. To identify contaminant peaks, we developed a filtering process that evaluated the relative peak area variance of each variable within triplicate injections. These interferent peaks were found across all samples, but did not show consistent peak area from injection to injection, even when evaluating the same chemical sample. This filtering process identified 128 ions that appear to originate from the UPLC-MS system. Data sets collected for a high number of pools with comparatively simple chemical composition were highly influenced by these chemical interferents, as were samples that were analyzed at a low concentration. When chemical interferent masses were removed, technical replicates clustered in all data sets. This work highlights the importance of technical replication in mass spectrometry-based studies, and presents a new application of HCA as a tool for evaluating the effectiveness of data filtering prior to statistical analysis. Copyright © 2018 Elsevier B.V. All rights reserved.
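
    A minimal version of the replicate-variance filter described above can be sketched as follows, with synthetic peak areas, an invented variance threshold, and SciPy's hierarchical clustering standing in for the authors' pipeline.

      # Flag features whose peak areas vary too much across technical
      # replicates, then cluster the filtered profiles.
      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.spatial.distance import pdist

      rng = np.random.default_rng(4)
      n_features = 200
      # Two samples x 3 technical replicates; first 20 features are erratic
      # "interferents" with inconsistent intensities across injections.
      base = np.vstack([rng.normal(10, 1, n_features)] * 3 +
                       [rng.normal(14, 1, n_features)] * 3)
      base[:, :20] = rng.normal(10, 4, (6, 20))
      X = base + rng.normal(0, 0.1, base.shape)

      # Worst-case relative variation within each replicate triplet.
      rel_var = np.maximum(X[:3].std(axis=0) / np.abs(X[:3].mean(axis=0)),
                           X[3:].std(axis=0) / np.abs(X[3:].mean(axis=0)))
      keep = rel_var < 0.15                      # threshold assumed
      Z = linkage(pdist(X[:, keep]), method="average")
      print(f"kept {keep.sum()}/{n_features} features;",
            "replicate clusters:", fcluster(Z, t=2, criterion="maxclust"))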

  3. Binomial outcomes in dataset with some clusters of size two: can the dependence of twins be accounted for? A simulation study comparing the reliability of statistical methods based on a dataset of preterm infants.

    PubMed

    Sauzet, Odile; Peacock, Janet L

    2017-07-20

    The analysis of perinatal outcomes often involves datasets with some multiple births. These are datasets mostly formed of independent observations and a limited number of clusters of size two (twins), and maybe of size three or more. This non-independence needs to be accounted for in the statistical analysis. Using simulated data based on a dataset of preterm infants, we have previously investigated the performance of several approaches to the analysis of continuous outcomes in the presence of some clusters of size two. Mixed models have been developed for binomial outcomes, but very little is known about their reliability when only a limited number of small clusters are present. Using simulated data based on a dataset of preterm infants, we investigated the performance of several approaches to the analysis of binomial outcomes in the presence of some clusters of size two. Logistic models, several methods of estimation for the logistic random intercept model, and generalised estimating equations were compared. The presence of even a small percentage of twins means that a logistic regression model will underestimate all parameters, while a logistic random intercept model fails to estimate the correlation between siblings if the percentage of twins is too small and will then provide estimates similar to logistic regression. The method that seems to provide the best balance between estimation of the standard error and the parameter for any percentage of twins is generalised estimating equations. This study has shown that the number of covariates and the level-two variance do not necessarily affect the performance of the various methods used to analyse datasets containing twins; however, when the percentage of small clusters is too small, mixed models cannot capture the dependence between siblings.
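
    A GEE analysis of the kind recommended above runs directly in statsmodels; the sketch below simulates a dataset with roughly 10% twin pairs (all effect sizes assumed) and fits a logistic GEE with an exchangeable working correlation.

      # GEE for a binomial outcome with some clusters of size two.
      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(5)
      rows = []
      for cluster in range(500):
          size = 2 if rng.random() < 0.10 else 1    # ~10% twin pairs
          u = rng.normal(scale=0.8)                 # shared cluster effect
          for _ in range(size):
              x = rng.normal()
              p = 1 / (1 + np.exp(-(-0.5 + 0.7 * x + u)))
              rows.append({"cluster": cluster, "x": x, "y": rng.binomial(1, p)})
      df = pd.DataFrame(rows)

      model = smf.gee("y ~ x", groups="cluster", data=df,
                      family=sm.families.Binomial(),
                      cov_struct=sm.cov_struct.Exchangeable())
      print(model.fit().summary().tables[1])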

  4. Subcontinuum mass transport of condensed hydrocarbons in nanoporous media

    PubMed Central

    Falk, Kerstin; Coasne, Benoit; Pellenq, Roland; Ulm, Franz-Josef; Bocquet, Lydéric

    2015-01-01

    Although hydrocarbon production from unconventional reservoirs, the so-called shale gas, has exploded recently, reliable predictions of resource availability and extraction are missing because conventional tools fail to account for their ultra-low permeability and complexity. Here, we use molecular simulation and statistical mechanics to show that the continuum description—Darcy's law—fails to predict transport in the shale's nanoporous matrix (kerogen). The non-Darcy behaviour arises from strong adsorption in kerogen and the breakdown of hydrodynamics at the nanoscale, which contradict the assumption of viscous flow. Despite this complexity, all permeances collapse on a master curve with an unexpected dependence on alkane length. We rationalize this non-hydrodynamic behaviour using a molecular description capturing the scaling of permeance with alkane length and density. These results, which stress the need for a change of paradigm from classical descriptions to nanofluidic transport, have implications for shale gas but more generally for transport in nanoporous media. PMID:25901931

  5. Energy Efficiency Potential in the U.S. Single-Family Housing Stock

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, Eric J.; Christensen, Craig B.; Horowitz, Scott G.

    Typical approaches for assessing energy efficiency potential in buildings use a limited number of prototypes, and therefore suffer from inadequate resolution when pass-fail cost-effectiveness tests are applied, which can significantly underestimate or overestimate the economic potential of energy efficiency technologies. This analysis applies a new approach to large-scale residential energy analysis, combining the use of large public and private data sources, statistical sampling, detailed building simulations, and high-performance computing to achieve unprecedented granularity - and therefore accuracy - in modeling the diversity of the single-family housing stock. The result is a comprehensive set of maps, tables, and figures showing the technical and economic potential of 50-plus residential energy efficiency upgrades and packages for each state. Policymakers, program designers, and manufacturers can use these results to identify upgrades with the highest potential for cost-effective savings in a particular state or region, as well as help identify customer segments for targeted marketing and deployment. The primary finding of this analysis is that there is significant technical and economic potential to save electricity and on-site fuel use in the single-family housing stock. However, the economic potential is very sensitive to the cost-effectiveness criteria used for analysis. Additionally, the savings of particular energy efficiency upgrades is situation-specific within the housing stock (depending on climate, building vintage, heating fuel type, building physical characteristics, etc.).

  6. Non-extensive quantum statistics with particle-hole symmetry

    NASA Astrophysics Data System (ADS)

    Biró, T. S.; Shen, K. M.; Zhang, B. W.

    2015-06-01

    Based on the Tsallis entropy (1988) and the corresponding deformed exponential function, generalized distribution functions for bosons and fermions have been in use for a while (Teweldeberhan et al., 2003; Silva et al., 2010). However, aiming at a non-extensive quantum statistics, further requirements arise from the symmetric handling of particles and holes (excitations above and below the Fermi level). Naive replacements of the exponential function or "cut and paste" solutions fail to satisfy this symmetry and to be smooth at the Fermi level at the same time. We solve this problem by a general ansatz dividing the deformed exponential into odd and even terms, and we demonstrate how earlier suggestions, like the κ- and q-exponentials, behave in this respect.
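
    For reference, the Tsallis deformed exponential underlying these generalized distributions, and the naive occupation numbers obtained by substituting it into the Bose/Fermi forms, can be written as follows (the second expression is exactly the kind of direct replacement the paper argues breaks particle-hole symmetry and smoothness at the Fermi level):

      \[
        e_q(x) = \bigl[\,1 + (1-q)\,x\,\bigr]_{+}^{1/(1-q)},
        \qquad \lim_{q \to 1} e_q(x) = e^{x},
      \]
      \[
        n_{\pm}(\varepsilon) = \frac{1}{e_q\!\bigl(\beta(\varepsilon-\mu)\bigr) \pm 1}
        \quad (+\ \text{fermions},\ -\ \text{bosons}).
      \]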

  7. Balance failure in single limb stance due to ankle sprain injury: an analysis of center of pressure using the fractal dimension method.

    PubMed

    Doherty, Cailbhe; Bleakley, Chris; Hertel, Jay; Caulfield, Brian; Ryan, John; Delahunt, Eamonn

    2014-01-01

    Instrumented postural control analysis plays an important role in evaluating the effects of injury on dynamic stability during balance tasks, and is often conveyed with measures based on the displacement of the center of pressure (COP) assessed with a force platform. However, the desired outcome of the task is frequently characterized by a loss of dynamic stability secondary to injury. Typically, these failed trials are discarded during research investigations, with the potential loss of informative data pertaining to task success. The novelty of the present study is that COP characteristics of failed trials in injured participants are compared to successful-trial data in another injured group, and in a control group of participants, using the fractal dimension (FD) method. Three groups of participants attempted a task of eyes-closed single-limb stance (SLS): twenty-nine participants with acute ankle sprain successfully completed the task on their non-injured limb (successful injury group); twenty-eight participants with acute ankle sprain failed their attempt on their injured limb (failed injury group); sixteen participants with no current injury successfully completed the task on their non-dominant limb (successful non-injured group). Between-trial analyses of these groups revealed significant differences in COP trajectory FD (successful injury group: 1.58±0.06; failed injury group: 1.54±0.07; successful non-injured group: 1.64±0.06) with a large effect size (0.27). These findings demonstrate that successful eyes-closed SLS is characterized by a larger FD of the COP path when compared to failed trials, and that injury causes a decrease in COP path FD. Copyright © 2014 Elsevier B.V. All rights reserved.
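
    The abstract does not spell out the exact FD estimator used, but a generic box-counting estimate of a planar COP path's fractal dimension can be sketched as follows, with a random walk standing in for real COP recordings.

      # Box-counting FD: slope of log N(s) versus log(1/s) over dyadic scales.
      import numpy as np

      def box_counting_fd(xy, n_scales=8):
          """xy: (n, 2) trajectory; returns the box-counting dimension."""
          xy = (xy - xy.min(axis=0)) / np.ptp(xy, axis=0)     # map into unit square
          sizes = 1 / 2 ** np.arange(1, n_scales + 1)
          counts = [len({tuple(c) for c in np.floor(xy / s).astype(int)})
                    for s in sizes]
          slope, _ = np.polyfit(np.log(1 / sizes), np.log(counts), 1)
          return slope

      rng = np.random.default_rng(6)
      path = np.cumsum(rng.normal(size=(5000, 2)), axis=0)    # random-walk stand-in
      print(f"estimated FD: {box_counting_fd(path):.2f}")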

  8. Influence of diabetes mellitus on postoperative complications and failure in head and neck free flap reconstruction: a systematic review and meta-analysis.

    PubMed

    Rosado, Pablo; Cheng, Hsu-Tang; Wu, Chao-Min; Wei, Fu-Chan

    2015-04-01

    We performed a systematic review and meta-analysis to determine whether diabetic patients have an increased rate of postoperative complications compared to nondiabetic patients after head and neck free flap reconstruction. A systematic review of the PubMed database between 1966 and 2012 was performed. RevMan 5.0 was used for the meta-analysis. A retrospective medical chart review of 7890 patients to identify those who had a failed microsurgical reconstruction of the head and neck region at Chang Gung Memorial Hospital was also carried out. The results revealed that patients with diabetes mellitus have a 1.76-fold increased risk of complications (odds ratio [OR] = 1.76; 95% confidence interval [CI] = 1.11-2.79) with minimal heterogeneity (I² = 22%; p = .28). The prevalence of diabetes mellitus in patients with failed free flaps for head and neck reconstruction is 15%. The incidence of diabetes mellitus in these patients with failed free flaps is 2.3 times higher than in the general population. © 2014 Wiley Periodicals, Inc.

  9. Reasons for revision of failed hemiarthroplasty: Are there any differences between unipolar and bipolar?

    PubMed

    Iamthanaporn, Khanin; Chareancholvanich, Keerati; Pornrattanamaneewong, Chaturong

    2018-03-16

    Hemiarthroplasty (HA) is an effective procedure for treatment of femoral neck fracture. However, it is debatable whether unipolar or bipolar HA is the more suitable implant. The purpose of this study was to compare the causes of failure and longevity of both types of HA. We retrospectively reviewed 133 cases that underwent revision surgery of HA between 2002 and 2012. The causes of revision surgery were identified and stratified into early (≤ 5 years) and late (> 5 years) failure. Survival analyses were performed for each implant type. The common causes for revision were aseptic loosening (49.6%), infection (22.6%) and acetabular erosion (15.0%). Unipolar and bipolar HA did not differ in causes for revision, but the unipolar group had a statistically significantly higher number of acetabular erosion events compared with the bipolar group (p = 0.002). In the early period, 24 unipolar HA (52.9%) and 28 bipolar HA (34.1%) failed. There were no statistically significant differences in the numbers of revised HA in each period between the two groups (p = 0.138). The median survival times in the unipolar and bipolar groups were 84.0 ± 24.5 and 120.0 ± 5.5 months, respectively. However, the survival times of the two implants were not statistically significantly different. Aseptic loosening was the most common reason for revision surgery after hemiarthroplasty in both early and late failures. Unipolar and bipolar hemiarthroplasty did not differ in causes of failure or survivorship, except that bipolar hemiarthroplasty had far fewer acetabular erosion events.

  10. Statistics of acoustic emissions and stress drops during granular shearing using a stick-slip fiber bundle model

    NASA Astrophysics Data System (ADS)

    Cohen, D.; Michlmayr, G.; Or, D.

    2012-04-01

    Shearing of dense granular materials appears in many engineering and Earth science applications. Under a constant strain rate, the shear stress at steady state oscillates, with slow rises followed by rapid drops that are linked to the build-up and failure of force chains. Experiments indicate that these drops display exponential statistics. Measurements of acoustic emissions during shearing indicate that the energy liberated by the failure of these force chains has power-law statistics. Representing force chains as fibers, we use a stick-slip fiber bundle model to obtain analytical solutions for the statistical distributions of stress drops and failure energy. In the model, fibers stretch, fail, and regain strength during deformation. Fibers have Weibull-distributed threshold strengths with either quenched or annealed disorder. The shapes of the distributions of drops and energy obtained from the model are similar to those measured during shearing experiments. This simple model may be useful for identifying failure events linked to force-chain failures. Future generalizations of the model that include different types of fiber failure may also allow identification of different types of granular failures that have distinct statistical acoustic emission signatures.
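
    A minimal stick-slip, equal-load-sharing fiber bundle along these lines can be simulated directly; the thresholds, driving rate, and re-sticking rule below are illustrative assumptions rather than the authors' exact model.

      # Fibers stretch under constant driving, slip when overloaded, and
      # re-stick with a fresh Weibull threshold; stress drops are recorded.
      import numpy as np

      rng = np.random.default_rng(7)
      N, steps, dx = 10_000, 4000, 0.001
      k = 2.0                                   # Weibull shape (assumed)
      thresholds = rng.weibull(k, N)
      stretch = np.zeros(N)                     # per-fiber elongation
      drops, stress_prev = [], 0.0

      for _ in range(steps):
          stretch += dx                         # constant strain-rate driving
          broken = stretch > thresholds
          stretch[broken] = 0.0                 # slip: fiber releases its load
          thresholds[broken] = rng.weibull(k, broken.sum())   # re-stick
          stress = stretch.mean()               # equal load sharing
          if stress < stress_prev:
              drops.append(stress_prev - stress)
          stress_prev = stress

      drops = np.array(drops)
      print(f"{drops.size} stress drops; mean {drops.mean():.2e}, "
            f"max {drops.max():.2e}")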

  11. Why Students Fail at Volumetric Analysis.

    ERIC Educational Resources Information Center

    Pickering, Miles

    1979-01-01

    Investigates the reasons for students' failure in an introductory volumetric analysis course by analyzing test papers and judging them against a hypothetical ideal method of grading laboratory techniques. (GA)

  12. Healing X-ray scattering images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Jiliang; Lhermitte, Julien; Tian, Ye

    X-ray scattering images contain numerous gaps and defects arising from detector limitations and experimental configuration. Here, we present a method to heal X-ray scattering images, filling gaps in the data and removing defects in a physically meaningful manner. Unlike generic inpainting methods, this method is closely tuned to the expected structure of reciprocal-space data. In particular, we exploit statistical tests and symmetry analysis to identify the structure of an image; we then copy, average and interpolate measured data into gaps in a way that respects the identified structure and symmetry. Importantly, the underlying analysis methods provide useful characterization of structures present in the image, including the identification of diffuse versus sharp features, anisotropy and symmetry. The presented method leverages known characteristics of reciprocal space, enabling physically reasonable reconstruction even with large image gaps. The method will correspondingly fail for images that violate these underlying assumptions. The method assumes point symmetry and is thus applicable to small-angle X-ray scattering (SAXS) data, but only to a subset of wide-angle data. Our method succeeds in filling gaps and healing defects in experimental images, including extending data beyond the original detector borders.

  13. Measurement of negativity bias in personal narratives using corpus-based emotion dictionaries.

    PubMed

    Cohen, Shuki J

    2011-04-01

    This study presents a novel methodology for the measurement of negativity bias using positive and negative dictionaries of emotion words applied to autobiographical narratives. At odds with the cognitive theory of mood dysregulation, previous text-analytical studies have failed to find significant correlations between emotion dictionaries and negative affectivity or dysphoria. In the present study, an a priori dictionary of emotion words was refined based on the actual use of these words in personal narratives collected from close to 500 college students. Half of the corpus was used to construct, via concordance analysis, the grammatical structures associated with the words in their emotional sense. The second half of the corpus served as a validation corpus. The resulting dictionary ignores words that are not used in their intended emotional sense, including negated emotions, homophones, frozen idioms, etc. Correlations of the resulting corpus-based negative and positive emotion dictionaries with self-report measures of negative affectivity were in the expected direction and statistically significant, with a medium effect size. The potential use of these dictionaries as implicit measures of negativity bias and in the analysis of psychotherapy transcripts is discussed.

  14. Cross-study projections of genomic biomarkers: an evaluation in cancer genomics.

    PubMed

    Lucas, Joseph E; Carvalho, Carlos M; Chen, Julia Ling-Yu; Chi, Jen-Tsan; West, Mike

    2009-01-01

    Human disease studies using DNA microarrays in both clinical/observational and experimental/controlled studies are having increasing impact on our understanding of the complexity of human diseases. A fundamental concept is the use of gene expression as a "common currency" that links the results of in vitro controlled experiments to in vivo observational human studies. Many studies--in cancer and other diseases--have shown promise in using in vitro cell manipulations to improve understanding of in vivo biology, but experiments often simply fail to reflect the enormous phenotypic variation seen in human diseases. We address this with a framework and methods to dissect, enhance and extend the in vivo utility of in vitro derived gene expression signatures. From an experimentally defined gene expression signature we use statistical factor analysis to generate multiple quantitative factors in human cancer gene expression data. These factors retain their relationship to the original, one-dimensional in vitro signature but better describe the diversity of in vivo biology. In a breast cancer analysis, we show that factors can reflect fundamentally different biological processes linked to molecular and clinical features of human cancers, and that in combination they can improve prediction of clinical outcomes.

  15. Sociodemographic predictors of elderly's psychological well-being in Malaysia.

    PubMed

    Momtaz, Yadollah A; Ibrahim, Rahimah; Hamid, Tengku A; Yahaya, Nurizan

    2011-05-01

    Psychological well-being, as one of the most important indicators of successful aging, has received substantial attention in the gerontological literature. Prior studies show that the sociodemographic factors influencing elderly people's psychological well-being are multiple and differ across cultures. The aim of this study was to identify significant sociodemographic predictors of psychological well-being among Malay elders. The study included 1415 older Malays (60-100 years, 722 women), randomly selected through a multistage stratified random method from Peninsular Malaysia. The WHO-Five well-being index was used to measure psychological well-being. Data analysis was conducted using the Statistical Package for the Social Sciences (SPSS) version 13.0. Using multiple regression analysis, a significant model emerged (F(7, 1407) = 20.14, p ≤ 0.001), in which age, sex, marital status, and household income were significant predictors of psychological well-being among Malay elders. However, level of education, employment status, and place of residence failed to predict psychological well-being. This study showed that the oldest old, elderly women, the unmarried, and poor elderly people are at risk of experiencing low psychological well-being. Therefore, they need special attention from family, policy makers, and those who work with elderly people.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miyasaka, Hiromasa; Harrison, Fiona A.; Fürst, Felix

    The Nuclear Spectroscopic Telescope Array hard X-ray telescope observed the transient Be/X-ray binary GS 0834–430 during its 2012 outburst—the first active state of this system observed in the past 19 yr. We performed timing and spectral analysis and measured the X-ray spectrum between 3 and 79 keV with high statistical significance. We find the phase-averaged spectrum to be consistent with that observed in many other magnetized, accreting pulsars. We fail to detect cyclotron resonance scattering features that would allow us to constrain the pulsar's magnetic field in either phase-averaged or phase-resolved spectra. Timing analysis shows a clearly detected pulse period of ∼12.29 s in all energy bands. The pulse profiles show a strong, energy-dependent hard phase lag of up to 0.3 cycles in phase, or about 4 s. Such dramatic energy-dependent lags in the pulse profile have never before been reported in high-mass X-ray binary pulsars. Previously reported lags have been significantly smaller in phase and restricted to low energies (E < 10 keV). We investigate the possible mechanisms that might produce this energy-dependent pulse phase shift. We find the most likely explanation for this effect is a complex beam geometry.

  18. Testing for independence in J×K contingency tables with complex sample survey data.

    PubMed

    Lipsitz, Stuart R; Fitzmaurice, Garrett M; Sinha, Debajyoti; Hevelone, Nathanael; Giovannucci, Edward; Hu, Jim C

    2015-09-01

    The test of independence of row and column variables in a (J×K) contingency table is a widely used statistical test in many areas of application. For complex survey samples, use of the standard Pearson chi-squared test is inappropriate due to correlation among units within the same cluster. Rao and Scott (1981, Journal of the American Statistical Association 76, 221-230) proposed an approach in which the standard Pearson chi-squared statistic is multiplied by a design effect to adjust for the complex survey design. Unfortunately, this test fails to exist when one of the observed cell counts equals zero. Even with the large samples typical of many complex surveys, zero cell counts can occur for rare events, small domains, or contingency tables with a large number of cells. Here, we propose Wald and score test statistics for independence based on weighted least squares estimating equations. In contrast to the Rao-Scott test statistic, the proposed Wald and score test statistics always exist. In simulations, the score test is found to perform best with respect to type I error. The proposed method is motivated by, and applied to, post surgical complications data from the United States' Nationwide Inpatient Sample (NIS) complex survey of hospitals in 2008. © 2015, The International Biometric Society.
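
    For contrast with the proposed Wald and score tests, the first-order Rao-Scott-style adjustment referred to above simply deflates the Pearson statistic by a design effect before the chi-squared comparison; the table and design effect below are invented.

      # Design-effect-adjusted Pearson test for a J x K survey table.
      import numpy as np
      from scipy import stats

      table = np.array([[30, 45], [25, 20], [15, 40]])   # observed counts
      chi2, p_srs, dof, _ = stats.chi2_contingency(table, correction=False)

      deff = 1.8                                # assumed mean design effect
      chi2_adj = chi2 / deff
      p_adj = stats.chi2.sf(chi2_adj, dof)
      print(f"Pearson X2 = {chi2:.2f} (SRS p = {p_srs:.3f})")
      print(f"design-adjusted X2 = {chi2_adj:.2f}, p = {p_adj:.3f}")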

  19. Statistical fluctuations in pedestrian evacuation times and the effect of social contagion

    NASA Astrophysics Data System (ADS)

    Nicolas, Alexandre; Bouzat, Sebastián; Kuperman, Marcelo N.

    2016-08-01

    Mathematical models of pedestrian evacuation and the associated simulation software have become essential tools for the assessment of the safety of public facilities and buildings. While a variety of models is now available, their calibration and test against empirical data are generally restricted to global averaged quantities; the statistics compiled from the time series of individual escapes ("microscopic" statistics) measured in recent experiments are thus overlooked. In the same spirit, much research has primarily focused on the average global evacuation time, whereas the whole distribution of evacuation times over some set of realizations should matter. In the present paper we propose and discuss the validity of a simple relation between this distribution and the microscopic statistics, which is theoretically valid in the absence of correlations. To this purpose, we develop a minimal cellular automaton, with features that afford a semiquantitative reproduction of the experimental microscopic statistics. We then introduce a process of social contagion of impatient behavior in the model and show that the simple relation under test may dramatically fail at high contagion strengths, the latter being responsible for the emergence of strong correlations in the system. We conclude with comments on the potential practical relevance for safety science of calculations based on microscopic statistics.

  20. Effectiveness of mandatory license testing for older drivers in reducing crash risk among urban older Australian drivers.

    PubMed

    Langford, Jim; Fitzharris, Michael; Koppel, Sjaanie; Newstead, Stuart

    2004-12-01

    Most licensing jurisdictions in Australia maintain mandatory assessment programs targeting older drivers, whereby a driver reaching a specified age is required to prove his or her fitness to drive through medical assessment and/or on-road testing. Previous studies both in Australia and elsewhere have consistently failed to demonstrate that age-based mandatory assessment results in reduced crash involvement for older drivers. However, studies that have based their results upon either per-population or per-driver crash rates fail to take into account possible differences in driving activity. Because some older people maintain their driving licenses but rarely if ever drive, the proportion of inactive license-holders might be higher in jurisdictions without mandatory assessment relative to jurisdictions with periodic license assessment, where inactive drivers may more readily either surrender or lose their licenses. The failure to control for possible differences in driving activity across jurisdictions may be disguising possible safety benefits associated with mandatory assessment. The current study compared the crash rates of drivers in Melbourne, Australia, where there is no mandatory assessment, and Sydney, Australia, where there is regular mandatory assessment from 80 years of age onward. The crash rate comparisons were based on four exposure measures: per population, per licensed driver, per distance driven, and per time spent driving. Poisson regression analysis incorporating an offset to control for inter-jurisdictional road safety differences indicated that there was no difference in crash risk for older drivers based on population. However, drivers aged 80 years and older in the Sydney region had statistically higher rates of casualty crash involvement than their Melbourne counterparts on a per-license-issued basis (RR: 1.15, 1.02-1.29, p=0.02) and a time-spent-driving basis (RR: 1.19, 1.06-1.34, p=0.03). A similar trend was apparent based on distance travelled but was of borderline statistical significance (RR: 1.11, 0.99-1.25, p=0.07). Collectively, it can be inferred from these findings that mandatory license re-testing schemes of the type evaluated have no demonstrable road safety benefits overall. Further research to resolve this ongoing policy debate is discussed and recommended.
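
    A Poisson rate comparison with an exposure offset, the style of analysis described above, can be written in statsmodels as follows; the crash counts and exposures are invented stand-ins for the Melbourne/Sydney data.

      # Poisson regression of crash counts with log-exposure offset.
      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      df = pd.DataFrame({
          "crashes":  [120, 95],             # hypothetical casualty crash counts
          "exposure": [1.00e6, 0.62e6],      # e.g. driver-hours per jurisdiction
          "mandatory_testing": [1, 0],       # Sydney-style regime vs none
      })
      fit = smf.glm("crashes ~ mandatory_testing", data=df,
                    family=sm.families.Poisson(),
                    offset=np.log(df["exposure"])).fit()
      print(f"rate ratio (testing vs none): "
            f"{np.exp(fit.params['mandatory_testing']):.2f}")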

  1. Implications of Bioremediation of Polycyclic Aromatic Hydrocarbon-Contaminated Soils for Human Health and Cancer Risk

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davie-Martin, Cleo L.; Stratton, Kelly G.; Teeguarden, Justin G.

    Background: Bioremediation uses microorganisms to degrade polycyclic aromatic hydrocarbons (PAHs) in contaminated soils. Its success is largely evaluated through targeted analysis of PAH concentrations in soil and cancer risk (exposure) estimates. However, bioremediation often fails to significantly degrade the most carcinogenic PAHs and can initiate the formation of more polar metabolites, some of which may be more toxic. Objectives: We aimed to investigate whether the cancer risk associated with PAH-contaminated soils was reduced post-bioremediation and to identify the most effective bioremediation strategies for degrading the carcinogenic and high molecular weight (≥MW302) PAHs. Methods: Pre- and post-bioremediation concentrations of eight B2 group carcinogenic PAHs in soils were collated from the literature and used to calculate excess lifetime cancer risks (ELCR) for adult populations exposed via non-dietary ingestion, per current U.S. Environmental Protection Agency (USEPA) recommendations. Because the collated data were reported as mean concentrations ± standard deviations pre- and post-bioremediation, we used simulation methods to reconstruct the datasets and enable statistical comparison of ELCR values pre- and post-bioremediation. Additionally, we measured MW302 PAHs in a contaminated soil prior to and following treatment in an aerobic bioreactor and examined their contributions to cancer risk. Results: 120 of 158 treated soils (76%) exhibited a statistically significant reduction in cancer risk following bioremediation; however, 67% (106/158) of soils had post-bioremediation ELCR values over 10-fold higher than the USEPA health-based ‘acceptable’ risk level. Composting treatments were most effective at biodegrading PAHs in soils and reducing the ELCR. MW302 PAHs were not significantly degraded during bioremediation, and dibenzo(a,l)pyrene alone contributed an additional 35% to the cancer risk associated with the eight B2 group PAHs in the same bioremediated soil. Conclusions: Bioremediation strategies often fail to reduce carcinogenic PAH concentrations in contaminated soils below USEPA acceptable cancer risk levels. Additionally, MW302 PAHs and ‘unknown’ metabolites (compounds not routinely measured) are not included in current cancer risk assessments and could contribute significantly to soil carcinogenicity.
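
    The simulation-based reconstruction described in the Methods can be sketched like this; the summary numbers, sample size, and choice of a Welch t-test are illustrative assumptions rather than the paper's actual procedure or data.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)

        # Hypothetical reported summaries (mean, SD) of a carcinogenic PAH
        # concentration (mg/kg) pre- and post-bioremediation, sample size n.
        pre_mean, pre_sd, post_mean, post_sd, n = 12.0, 3.0, 8.5, 2.5, 10

        # Reconstruct plausible datasets consistent with the summaries and
        # test the pre/post difference across many simulated reconstructions.
        pvals = [stats.ttest_ind(rng.normal(pre_mean, pre_sd, n),
                                 rng.normal(post_mean, post_sd, n),
                                 equal_var=False).pvalue
                 for _ in range(1000)]
        print(np.median(pvals))  # typical significance across reconstructions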

  2. [Effectiveness of low-intensity extracorporeal shock wave therapy on patients with Erectile Dysfunction (ED) who have failed to respond to PDE5i therapy. A pilot study].

    PubMed

    Bechara, Amado; Casabé, Adolfo; De Bonis, Walter; Nazar, Julián

    2015-03-01

    Low-intensity extracorporeal shock wave therapy (LIESWT) of the penis has recently emerged as a promising modality in the treatment of ED. The objective of this paper is to assess the effectiveness and safety of LIESWT in patients with ED who have failed to respond to PDE5i treatment. Open-label, prospective, longitudinal observational study. The study involved an uncontrolled population of 25 patients. The treatment consisted of applying 20,000 shock waves over a period of four weeks. In each session the patient received 5000 shock waves of 0.09 mJ/mm2: 1800 were applied on the penis (900 on each corpus cavernosum), and 3200 were applied on the perineum (1600 on each crus). During the active treatment and follow-up phases, all patients remained on their regular high-dose on-demand or once-a-day PDE5i schedules. Effectiveness was assessed by IIEF-6, SEP2, SEP3, and GAQ. Patients were considered responders if they improved on all three erection assessment parameters and responded positively to the GAQ at three months post-treatment. Adverse events were recorded. Findings were considered statistically significant whenever the P value was <0.05. Eighty percent of the patients (20/25; median age 63) completed the study. Five patients were lost to follow-up and were excluded from the analysis. Sixty percent (60%) of the patients responded to the treatment, improving on all three efficacy parameters and responding positively to the GAQ. The increase in mean IIEF-6 score was 9 points after the third post-treatment month. No patients reported treatment-related adverse events. LIESWT was safe and effective for men with ED who are PDE5i non-responders, restoring the PDE5i response in more than 50% of patients. A large-scale multicenter study is required to determine the benefits of this treatment for ED.

  3. SVAw - a web-based application tool for automated surrogate variable analysis of gene expression studies

    PubMed Central

    2013-01-01

    Background: Surrogate variable analysis (SVA) is a powerful method to identify, estimate, and utilize the components of gene expression heterogeneity due to unknown and/or unmeasured technical, genetic, environmental, or demographic factors. These sources of heterogeneity are common in gene expression studies, and failing to incorporate them into the analysis can obscure results. Using SVA increases the biological accuracy and reproducibility of gene expression studies by identifying these sources of heterogeneity and correctly accounting for them in the analysis. Results: Here we have developed a web application called SVAw (Surrogate Variable Analysis Web app) that provides a user-friendly interface for SVA analyses of genome-wide expression studies. The software is built on the open-source Bioconductor SVA package. In our software, we have extended the SVA program's functionality in three respects: (i) SVAw performs a fully automated and user-friendly analysis workflow; (ii) it calculates probe/gene statistics both pre- and post-SVA and provides a table of results for the regression of gene expression on the primary variable of interest before and after correcting for surrogate variables; and (iii) it generates a comprehensive report file, including a graphical comparison of the outcomes, for the user. Conclusions: SVAw is a freely accessible web-based solution for the surrogate variable analysis of high-throughput datasets that facilitates removing unwanted and unknown sources of variation. It is freely available for use at http://psychiatry.igm.jhmi.edu/sva. The executable packages for both the web and standalone applications and instructions for installation can be downloaded from our web site. PMID:23497726
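
    The core idea behind surrogate variable analysis can be sketched in a few lines of Python; this is a bare-bones caricature (SVD of the residual matrix), not the Bioconductor SVA algorithm, and all data here are simulated.

        import numpy as np

        rng = np.random.default_rng(0)
        n_samples, n_genes = 20, 1000
        expr = rng.normal(size=(n_samples, n_genes))   # samples x genes
        group = np.repeat([0.0, 1.0], n_samples // 2)  # primary variable

        # Regress out the primary variable, then take leading singular
        # vectors of the residuals as crude surrogate variables; the real
        # package adds permutation tests to choose how many to keep.
        X = np.column_stack([np.ones(n_samples), group])
        beta = np.linalg.lstsq(X, expr, rcond=None)[0]
        U, s, Vt = np.linalg.svd(expr - X @ beta, full_matrices=False)
        surrogates = U[:, :2]

        # Downstream per-gene regressions would use this extended design.
        design = np.column_stack([X, surrogates])
        print(design.shape)  # (20, 4)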

  4. Longitudinal outcomes after tibioperoneal angioplasty alone compared to tibial stenting and atherectomy for critical limb ischemia.

    PubMed

    Reynolds, Shaun; Galiñanes, Edgar Luis; Dombrovskiy, Viktor Y; Vogel, Todd R

    2013-10-01

    There are limited data available evaluating longitudinal outcomes after tibioperoneal angioplasty (TA) alone compared to adjunctive tibial procedures, including stenting and atherectomy. Using Centers for Medicare & Medicaid Services inpatient claims (2005-2007), patients undergoing TA only, TA plus stent placement (TA + S), or TA plus atherectomy (TA + A) were evaluated. A total of 2080 patients with critical limb ischemia underwent percutaneous tibioperoneal intervention for the indication of ulceration. Procedures included TA (56.3%), TA + S (16.2%), and TA + A (27.5%). Rates of amputation were not statistically different between the groups at 30, 90, and 365 days after the intervention. Mean total hospital charges were TA ($35,867), TA + A ($41,698; P = .0004), and TA + S ($51,040; P < .0001). Concomitant stenting or atherectomy demonstrated no improvement in limb salvage compared with TA alone for ulceration. Future analysis of adjunctive tibioperoneal interventions is essential to temper cost, as these adjuncts fail to improve long-term limb salvage.

  5. Earth Mover's Distance (EMD): A True Metric for Comparing Biomarker Expression Levels in Cell Populations.

    PubMed

    Orlova, Darya Y; Zimmerman, Noah; Meehan, Stephen; Meehan, Connor; Waters, Jeffrey; Ghosn, Eliver E B; Filatenkov, Alexander; Kolyagin, Gleb A; Gernez, Yael; Tsuda, Shanel; Moore, Wayne; Moss, Richard B; Herzenberg, Leonore A; Walther, Guenther

    2016-01-01

    Changes in the frequencies of cell subsets that (co)express characteristic biomarkers, or in the levels of the biomarkers on the subsets, are widely used as indices of drug response, disease prognosis, stem cell reconstitution, etc. However, although the currently available computational "gating" tools accurately reveal subset frequencies and marker expression levels, they fail to enable statistically reliable judgements as to whether these frequencies and expression levels differ significantly between/among subject groups. Here we introduce a flow cytometry data analysis pipeline that includes the Earth Mover's Distance (EMD) metric as a solution to this problem. EMD is well known as an informative quantitative measure of differences between distributions. We present three exemplary studies showing that EMD 1) reveals clinically relevant shifts in two markers on blood basophils responding to an offending allergen; 2) shows that ablative tumor radiation induces significant changes in the murine colon cancer tumor microenvironment; and 3) ranks immunological differences in mouse peritoneal cavity cells harvested from three genetically distinct mouse strains.
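
    For one-dimensional marker distributions, the EMD reduces to the first Wasserstein distance, which SciPy computes directly; the lognormal intensities below are invented for illustration.

        import numpy as np
        from scipy.stats import wasserstein_distance

        rng = np.random.default_rng(0)
        # Hypothetical per-cell fluorescence intensities for one marker in
        # control and allergen-stimulated samples.
        control = rng.lognormal(mean=1.0, sigma=0.4, size=5000)
        stimulated = rng.lognormal(mean=1.3, sigma=0.4, size=5000)

        # Unlike a comparison of medians or subset frequencies, EMD reflects
        # shifts anywhere in the expression distribution.
        print(wasserstein_distance(control, stimulated))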

  6. Epidemiologic survey of bladder cancer in greater New Orleans.

    PubMed

    Sullivan, J W

    1982-08-01

    Primary ancestry of the patients and controls in this study was not statistically different, but the Jewish population had a significantly increased incidence of bladder cancer. Overall, a significantly greater number of patients smoked filtered cigarettes, began drinking artificially sweetened beverages at an earlier age, drank artificially sweetened beverages for a greater number of years, consumed a greater number of glasses of artificially sweetened beverages weekly, and reported a history of urinary tract infections. A significantly increased incidence of bladder cancer was noted in individuals employed by certain types of companies, holding certain job titles, and handling certain job materials. Analysis of the data failed to show any significant difference in years of coffee consumption, amount of various types of coffee or tea consumed, consumption of various nonalcoholic and alcoholic beverages (including source of drinking water), use of hair dye, incidence of diabetes mellitus, family history of urinary cancer, or history of pelvic irradiation or bladder stones.

  7. Association of mitochondrial DNA variants with myalgic encephalomyelitis/chronic fatigue syndrome (ME/CFS) symptoms.

    PubMed

    Hanson, Maureen R; Gu, Zhenglong; Keinan, Alon; Ye, Kaixiong; Germain, Arnaud; Billing-Ross, Paul

    2016-12-20

    Earlier this year, we described an analysis of mitochondrial DNA (mtDNA) variants in myalgic encephalomyelitis (ME)/chronic fatigue syndrome (CFS) patients and healthy controls. We reported that there was no significant association of haplogroups or single nucleotide polymorphisms (SNPs) with disease status. Nevertheless, a commentary about our paper appeared (Finsterer and Zarrouk-Mahjoub, J Transl Med 14:182, 2016) that criticized the association of mtDNA haplogroups with ME/CFS, a conclusion that was absent from our paper. The aforementioned commentary also demanded experiments that were outside the scope of our study, ones that we had suggested as follow-up studies. Because they failed to consult a published and cited report describing the cohorts we studied, the authors also cast aspersions on the method of selection of cases for inclusion. We reiterate that we observed statistically significant associations of mtDNA variants with particular symptoms and their severity, though we observed no association with disease status.

  8. Analysis of Parent, Teacher, and Consultant Speech Exchanges and Educational Outcomes of Students With Autism During COMPASS Consultation.

    PubMed

    Ruble, Lisa; Birdwhistell, Jessie; Toland, Michael D; McGrew, John H

    2011-01-01

    The significant increase in the number of students with autism, combined with the need for better trained teachers (National Research Council, 2001), calls for research on the effectiveness of alternative methods, such as consultation, that have the potential to improve service delivery. Data from 2 randomized controlled single-blind trials indicate that an autism-specific consultation planning framework known as the collaborative model for promoting competence and success (COMPASS) is effective in improving child Individualized Education Program (IEP) outcomes (Ruble, Dalrymple, & McGrew, 2010; Ruble, McGrew, & Toland, 2011). In this study, we describe the verbal interactions, defined as speech acts and speech act exchanges, that take place during COMPASS consultation, and examine the associations between speech exchanges and child outcomes. We applied the Psychosocial Processes Coding Scheme (Leaper, 1991) to code speech acts. Speech act exchanges were overwhelmingly affiliative and failed to show statistically significant relationships with child IEP outcomes and teacher adherence, but did correlate positively with IEP quality.

  10. A statistical analysis of product prices in online markets

    NASA Astrophysics Data System (ADS)

    Mizuno, T.; Watanabe, T.

    2010-08-01

    We empirically investigate fluctuations in product prices in online markets using tick-by-tick price data collected from a Japanese price comparison site, and find some similarities and differences between product and asset prices. The average price of a product across e-retailers behaves almost like a random walk, although the probability of a price increase/decrease is higher conditional on preceding runs of price increases/decreases. This is quite similar to the property reported by previous studies of asset prices. However, we fail to find a long-memory property in the volatility of product price changes. Also, we find that the distribution of product price changes is close to an exponential distribution rather than a power-law distribution. These two findings are in sharp contrast with previous results regarding asset prices. We propose the interpretation that these differences may stem from the absence of speculative activity in product markets; namely, e-retailers seldom repeatedly buy and sell a product, unlike traders in asset markets.
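
    A quick way to see the exponential-versus-power-law distinction is through survival functions: an exponential tail is linear on semi-log axes, a power law on log-log axes. The sketch below uses synthetic exponential "price changes", not the comparison-site data.

        import numpy as np

        rng = np.random.default_rng(0)
        changes = rng.exponential(scale=0.02, size=10_000)  # synthetic data

        # Empirical survival function P(X > x); for exponential data,
        # log-survival is linear in x with slope -1/scale.
        x = np.sort(changes)
        survival = 1.0 - np.arange(1, x.size + 1) / x.size
        slope = np.polyfit(x[:-1], np.log(survival[:-1]), 1)[0]
        print(slope, -1 / 0.02)  # slope close to -50 for this scale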

  11. Seasonal rationalization of river water quality sampling locations: a comparative study of the modified Sanders and multivariate statistical approaches.

    PubMed

    Varekar, Vikas; Karmakar, Subhankar; Jha, Ramakar

    2016-02-01

    The design of surface water quality sampling locations is a crucial decision-making process in the rationalization of a monitoring network. The quantity, quality, and types of available data (watershed characteristics and water quality data) may affect the selection of an appropriate design methodology. The modified Sanders approach and multivariate statistical techniques [particularly factor analysis (FA)/principal component analysis (PCA)] are well-accepted and widely used techniques for the design of sampling locations. However, their performance may vary significantly with the quantity, quality, and types of available data. In this paper, an attempt has been made to evaluate the performance of these techniques while accounting for the effect of seasonal variation, under a situation of limited water quality data but extensive watershed characteristics information, as continuous and consistent river water quality data are usually difficult to obtain, whereas watershed information can be made available through the application of geospatial techniques. A case study of the Kali River, Western Uttar Pradesh, India, was selected for the analysis. Monitoring was carried out at 16 sampling locations. The discrete and diffuse pollution loads at the different sampling sites were estimated and accounted for using the modified Sanders approach, whereas the monitored physical and chemical water quality parameters were utilized as inputs for FA/PCA. The optimum numbers of sampling locations designed by the modified Sanders approach are eight for the monsoon season and seven for the non-monsoon season, while those for FA/PCA are eleven and nine, respectively. Both techniques produced little variation in the number and locations of the designed sampling sites, which shows the stability of the results. A geospatial analysis was also carried out to check the significance of the designed sampling locations with respect to the river basin characteristics and land use of the study area. Both methods are equally efficient; however, the modified Sanders approach outperforms FA/PCA when limited water quality but extensive watershed information is available. The available water quality dataset is limited, and the FA/PCA-based approach fails to identify monitoring locations with higher variation, as these multivariate statistical approaches are data-driven. The priority/hierarchy and number of sampling sites designed by the modified Sanders approach are well justified by the land use practices and observed river basin characteristics of the study area.
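
    A hedged sketch of the FA/PCA side of such a design: standardize the locations-by-parameters matrix, extract principal components, and rank locations by how strongly they load on the retained components. The 16 x 8 data matrix here is simulated, and the 90% variance cutoff is an arbitrary illustrative choice.

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)
        data = rng.normal(size=(16, 8))  # 16 locations x 8 quality parameters

        z = (data - data.mean(axis=0)) / data.std(axis=0, ddof=1)
        pca = PCA()
        scores = pca.fit_transform(z)

        n_keep = int(np.searchsorted(
            pca.explained_variance_ratio_.cumsum(), 0.90)) + 1
        # Locations with the largest scores on the retained components are
        # candidates for the rationalized monitoring network.
        importance = np.abs(scores[:, :n_keep]).sum(axis=1)
        print(np.argsort(importance)[::-1])  # locations, most informative first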

  12. Disaster Metrics: Evaluation of de Boer's Disaster Severity Scale (DSS) Applied to Earthquakes.

    PubMed

    Bayram, Jamil D; Zuabi, Shawki; McCord, Caitlin M; Sherak, Raphael A G; Hsu, Edberdt B; Kelen, Gabor D

    2015-02-01

    Quantitative measurement of the medical severity following multiple-casualty events (MCEs) is an important goal in disaster medicine. In 1990, de Boer proposed a 13-point, 7-parameter scale called the Disaster Severity Scale (DSS). Parameters include cause, duration, radius, number of casualties, nature of injuries, rescue time, and effect on the surrounding community. Hypothesis: This study aimed to examine the reliability and dimensionality (number of salient themes) of de Boer's DSS scale through its application to 144 discrete earthquake events. A search for earthquake events was conducted via National Oceanic and Atmospheric Administration (NOAA) and US Geological Survey (USGS) databases. Two experts in the field of disaster medicine independently reviewed and assigned scores for parameters that had no data readily available (nature of injuries, rescue time, and effect on surrounding community), and differences were reconciled via consensus. Principal Component Analysis was performed using SPSS Statistics for Windows Version 22.0 (IBM Corp; Armonk, New York USA) to evaluate the reliability and dimensionality of the DSS. A total of 144 individual earthquakes from 2003 through 2013 were identified and scored. Of 13 points possible, the mean score was 6.04, the mode = 5, minimum = 4, maximum = 11, and standard deviation = 2.23. Three parameters in the DSS had zero variance (ie, the parameter received the same score in all 144 earthquakes). Because of their zero contribution to variance, these three parameters (cause, duration, and radius) were removed before running the statistical analysis. Cronbach's alpha, a coefficient of internal consistency, for the remaining four parameters was found to be robust at 0.89. Principal Component Analysis showed uni-dimensional characteristics, with only one component having an eigenvalue greater than one, at 3.17. The 4-parameter DSS, however, suffered from restriction of scoring range at both the parameter and scale levels. Jan de Boer's DSS in its 7-parameter format fails to hold statistically in a dataset of 144 earthquakes subjected to analysis. A modified 4-parameter scale was found to assess medical severity more directly, but remains flawed due to range restriction at both the individual parameter and scale levels. Further research is needed in the field of disaster metrics to develop a scale that is reliable across its complete set of parameters, capable of finer discrimination, and uni-dimensional in its measurement of the medical severity of MCEs.
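
    Cronbach's alpha for the four retained parameters can be computed directly from the events-by-parameters score matrix; the scores below are simulated stand-ins, not the 144-earthquake dataset.

        import numpy as np

        def cronbach_alpha(scores):
            """scores: events x parameters matrix of item scores."""
            k = scores.shape[1]
            item_var = scores.var(axis=0, ddof=1).sum()
            total_var = scores.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_var / total_var)

        rng = np.random.default_rng(0)
        latent = rng.normal(size=(144, 1))  # one underlying severity factor
        scores = np.rint(latent + rng.normal(scale=0.5, size=(144, 4)))
        print(cronbach_alpha(scores))  # high alpha => internal consistency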

  13. Estimation of the geochemical threshold and its statistical significance

    USGS Publications Warehouse

    Miesch, A.T.

    1981-01-01

    A statistic is proposed for estimating the geochemical threshold and its statistical significance, or it may be used to identify a group of extreme values that can be tested for significance by other means. The statistic is the maximum gap between adjacent values in an ordered array after each gap has been adjusted for the expected frequency. The values in the ordered array are geochemical values transformed by either ln(?? - ??) or ln(?? - ??) and then standardized so that the mean is zero and the variance is unity. The expected frequency is taken from a fitted normal curve with unit area. The midpoint of an adjusted gap that exceeds the corresponding critical value may be taken as an estimate of the geochemical threshold, and the associated probability indicates the likelihood that the threshold separates two geochemical populations. The adjusted gap test may fail to identify threshold values if the variation tends to be continuous from background values to the higher values that reflect mineralized ground. However, the test will serve to identify other anomalies that may be too subtle to have been noted by other means. © 1981.
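
    A rough rendering of the adjusted-gap idea in Python; the data are simulated, and weighting each gap by the fitted normal density at its midpoint is a simplification of the paper's frequency adjustment, so treat this as a sketch only.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(0)
        # Background lognormal population plus a few anomalous high values.
        vals = np.concatenate([rng.lognormal(0.0, 0.3, 200), [4.0, 4.5, 5.0]])

        z = np.log(vals)                    # log transform
        z = (z - z.mean()) / z.std(ddof=1)  # standardize: mean 0, variance 1
        z.sort()

        mids = (z[1:] + z[:-1]) / 2
        adjusted = np.diff(z) * norm.pdf(mids)  # gap weighted by expected density
        i = adjusted.argmax()
        print("threshold (standardized midpoint):", mids[i])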

  14. Statistical classification approach to discrimination between weak earthquakes and quarry blasts recorded by the Israel Seismic Network

    NASA Astrophysics Data System (ADS)

    Kushnir, A. F.; Troitsky, E. V.; Haikin, L. M.; Dainty, A.

    1999-06-01

    A semi-automatic procedure has been developed to achieve statistically optimum discrimination between earthquakes and explosions at local or regional distances based on a learning set specific to a given region. The method is used for step-by-step testing of candidate discrimination features to find the optimum (combination) subset of features, with the decision taken on a rigorous statistical basis. Linear (LDF) and Quadratic (QDF) Discriminant Functions based on Gaussian distributions of the discrimination features are implemented and statistically grounded; the features may be transformed by the Box-Cox transformation z = (y^α − 1)/α to make them more Gaussian. Tests of the method were successfully conducted on seismograms from the Israel Seismic Network using features consisting of spectral ratios between and within phases. Results showed that the QDF was more effective than the LDF and required five features out of 18 candidates for the optimum set. It was found that discrimination improved with increasing distance within the local range, and that eliminating transformation of the features and failing to correct for noise led to degradation of discrimination.
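
    This pipeline maps naturally onto SciPy/scikit-learn; the spectral-ratio features below are simulated, so this is a sketch of the approach, not a reimplementation of the ISN study.

        import numpy as np
        from scipy.stats import boxcox
        from sklearn.discriminant_analysis import (
            LinearDiscriminantAnalysis, QuadraticDiscriminantAnalysis)

        rng = np.random.default_rng(0)
        # Simulated positive-valued spectral-ratio features:
        # class 0 = earthquakes, class 1 = quarry blasts.
        X = np.vstack([rng.lognormal(0.0, 0.3, (100, 3)),
                       rng.lognormal(0.5, 0.3, (100, 3))])
        y = np.repeat([0, 1], 100)

        # Box-Cox transform z = (y**alpha - 1) / alpha toward Gaussianity,
        # then compare the LDF and QDF classifiers.
        Xt = np.column_stack([boxcox(X[:, j])[0] for j in range(X.shape[1])])
        for clf in (LinearDiscriminantAnalysis(),
                    QuadraticDiscriminantAnalysis()):
            print(type(clf).__name__, clf.fit(Xt, y).score(Xt, y))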

  15. Failure Investigation of a Cage Suspension Gear Chain used in Coal Mines

    NASA Astrophysics Data System (ADS)

    Ghosh, Debashis; Dutta, Shamik; Shukla, Awdhesh Kumar; Roy, Himadri

    2016-10-01

    This investigation is primarily aimed at examining the probable causes of the in-service failure of a cage suspension gear chain used in coal mines. Preliminary visual examination, dimensional measurement, chemical analysis, magnetic particle inspection, and estimation of mechanical properties are necessary supplements to this investigation. Optical microscopic analysis along with scanning electron microscopy examinations were carried out to understand the metallurgical reasons for the failure. The visual examination and magnetic particle investigations reveal the presence of fissure cracks at the weld joint for both the un-failed and failed end-link chains. The average hardness value was found to increase gradually with distance from the weld interface. The macro- and microstructural examinations of samples prepared from both failed and un-failed specimens reveal continuous as well as aligned linear inclusions distributed randomly, along with a decarburized layer at the weld interface/fusion zone. Fractographic examination shows a flat fracture covering the major portion of the cross-section, surrounded by a narrow annular metallic fracture surface having a texture different from that of the remaining surface. Fracture mechanics principles have been used to study the fatigue crack growth rate in both the weld region and the base region of the un-failed gear chain material. Detailed stress analyses were also carried out to evaluate the stress generated along the chain periphery. Finally, it is concluded that the presence of serious weld defects, arising from the use of improper welding parameters/procedures, caused the failure of the end links of the investigated chain.

  16. Investigating the Mechanism of MenaINV-Driven Metastasis

    DTIC Science & Technology

    2016-02-01

    bearing control tumor or expressing Mena or MenaINV 12 weeks after injection. Data presented as mean ± SEM for 3 mice per group. Statistics determined by...proteins, 12 are known PTP1B substrates; as a group, these exhibited significantly higher phosphorylation in MenaINV-expressing cells than in controls...

  17. Information Foraging Theory in Software Maintenance

    DTIC Science & Technology

    2012-09-30

    ...time: for example, a time series plot of model reaction times to many (simulated) stimuli presented to it in a run • "Statistical" abstractions summed...

  18. Automatic centring and bonding of lenses

    NASA Astrophysics Data System (ADS)

    Krey, Stefan; Heinisch, J.; Dumitrescu, E.

    2007-05-01

    We present an automatic bonding station that is able to centre and bond individual lenses or doublets to a barrel with sub-micron centring accuracy. The complete manufacturing cycle includes glue dispensing and UV curing. During the process, the state of centring is continuously monitored by the vision software, and the final result is recorded to a file for process statistics. Simple pass or fail results are displayed to the operator at the end of the process.

  19. Efficient Algorithms for Bayesian Network Parameter Learning from Incomplete Data

    DTIC Science & Technology

    2015-07-01

    Efficient Algorithms for Bayesian Network Parameter Learning from Incomplete Data. Guy Van den Broeck, Karthika Mohan, Arthur Choi, and Adnan ...

  20. Mathematical Capture of Human Data for Computer Model Building and Validation

    DTIC Science & Technology

    2014-04-03

    weapon. The Projectile, the VDE, and the IDE weapons had effects of financial loss for the targeted participant, while the MRAD yielded its own...for LE, Centroid, and TE for the baseline and the VDE weapon conditions, since p-values exceeded α. All other conditions rejected the null...hypothesis except the LE for the VDE weapon. The K-S statistics were correspondingly lower for the measures that failed to reject the null hypothesis. The CDF
