Sample records for complete statistical analysis

  1. Teaching statistics in biology: using inquiry-based learning to strengthen understanding of statistical analysis in biology laboratory courses.

    PubMed

    Metz, Anneke M

    2008-01-01

    There is an increasing need for students in the biological sciences to build a strong foundation in quantitative approaches to data analyses. Although most science, engineering, and math field majors are required to take at least one statistics course, statistical analysis is poorly integrated into undergraduate biology course work, particularly at the lower-division level. Elements of statistics were incorporated into an introductory biology course, including a review of statistics concepts and opportunity for students to perform statistical analysis in a biological context. Learning gains were measured with an 11-item statistics learning survey instrument developed for the course. Students showed a statistically significant 25% (p < 0.005) increase in statistics knowledge after completing introductory biology. Students improved their scores on the survey after completing introductory biology, even if they had previously completed an introductory statistics course (9% improvement, p < 0.005). Students retested 1 yr after completing introductory biology showed no loss of their statistics knowledge as measured by this instrument, suggesting that the use of statistics in biology course work may aid long-term retention of statistics knowledge. No statistically significant differences in learning were detected between male and female students in the study.
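
    As a worked illustration of the kind of pre/post comparison reported above, the minimal Python sketch below runs a paired t-test and computes a relative gain on hypothetical pre- and post-course survey scores; the numbers are invented for the example and are not the study's 11-item instrument data.

      # Sketch: paired comparison of pre- and post-course statistics survey scores.
      # The scores below are hypothetical placeholders.
      import numpy as np
      from scipy import stats

      pre_scores = np.array([5, 6, 4, 7, 5, 6, 5, 4, 6, 5], dtype=float)    # items correct (of 11)
      post_scores = np.array([7, 8, 6, 9, 6, 8, 7, 5, 8, 7], dtype=float)

      t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)   # paired t-test
      gain = (post_scores.mean() - pre_scores.mean()) / pre_scores.mean() * 100
      print(f"Mean relative gain: {gain:.1f}% (t = {t_stat:.2f}, p = {p_value:.4f})")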

  2. Teaching Statistics in Biology: Using Inquiry-based Learning to Strengthen Understanding of Statistical Analysis in Biology Laboratory Courses

    PubMed Central

    2008-01-01

    There is an increasing need for students in the biological sciences to build a strong foundation in quantitative approaches to data analyses. Although most science, engineering, and math field majors are required to take at least one statistics course, statistical analysis is poorly integrated into undergraduate biology course work, particularly at the lower-division level. Elements of statistics were incorporated into an introductory biology course, including a review of statistics concepts and opportunity for students to perform statistical analysis in a biological context. Learning gains were measured with an 11-item statistics learning survey instrument developed for the course. Students showed a statistically significant 25% (p < 0.005) increase in statistics knowledge after completing introductory biology. Students improved their scores on the survey after completing introductory biology, even if they had previously completed an introductory statistics course (9% improvement, p < 0.005). Students retested 1 yr after completing introductory biology showed no loss of their statistics knowledge as measured by this instrument, suggesting that the use of statistics in biology course work may aid long-term retention of statistics knowledge. No statistically significant differences in learning were detected between male and female students in the study. PMID:18765754

  3. Simulation Study of Evacuation Control Center Operations Analysis

    DTIC Science & Technology

    2011-06-01

    [Indexed excerpt from the report's table of contents and list of tables: baseline manning (Runs 1, 2, and 3) and baseline statistics interpretation; Appendix B, Key Statistic Matrix for Runs 1-12; Appendix C, Blue Dart; paired t-test results for ECC completion time (Run 5 vs. Run 6); key statistics for Run 3 vs. Run 9.]

  4. The Effect of Using Case Studies in Business Statistics

    ERIC Educational Resources Information Center

    Pariseau, Susan E.; Kezim, Boualem

    2007-01-01

    The authors evaluated the effect on learning of using case studies in business statistics courses. The authors divided students into 3 groups: a control group, a group that completed 1 case study, and a group that completed 3 case studies. Results evidenced that, on average, students whom the authors required to complete a case analysis received…

  5. Funding source and primary outcome changes in clinical trials registered on ClinicalTrials.gov are associated with the reporting of a statistically significant primary outcome: a cross-sectional study.

    PubMed

    Ramagopalan, Sreeram V; Skingsley, Andrew P; Handunnetthi, Lahiru; Magnus, Daniel; Klingel, Michelle; Pakpoor, Julia; Goldacre, Ben

    2015-01-01

    We and others have shown a significant proportion of interventional trials registered on ClinicalTrials.gov have their primary outcomes altered after the listed study start and completion dates. The objectives of this study were to investigate whether changes made to primary outcomes are associated with the likelihood of reporting a statistically significant primary outcome on ClinicalTrials.gov. A cross-sectional analysis of all interventional clinical trials registered on ClinicalTrials.gov as of 20 November 2014 was performed. The main outcome was any change made to the initially listed primary outcome and the time of the change in relation to the trial start and end date. 13,238 completed interventional trials were registered with ClinicalTrials.gov that also had study results posted on the website. 2555 (19.3%) had one or more statistically significant primary outcomes. Statistical analysis showed that registration year, funding source and primary outcome change after trial completion were associated with reporting a statistically significant primary outcome. Funding source and primary outcome change after trial completion are associated with a statistically significant primary outcome report on ClinicalTrials.gov.
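
    As a hedged sketch of the kind of association examined in this cross-sectional analysis, the Python snippet below applies a chi-square test to a 2x2 table of trials cross-classified by post-completion primary outcome change and reporting of a significant primary outcome; the counts are hypothetical, not the study's actual figures.

      # Sketch: chi-square test of association between a post-completion primary
      # outcome change and reporting a statistically significant primary outcome.
      # The counts are hypothetical placeholders.
      import numpy as np
      from scipy.stats import chi2_contingency

      #                     significant  not significant
      table = np.array([[120,  380],     # primary outcome changed after completion
                        [300, 1700]])    # primary outcome unchanged

      chi2, p, dof, expected = chi2_contingency(table)
      print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4g}")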

  6. A Study of Women Engineering Students and Time to Completion of First-Year Required Courses at Texas A&M University

    NASA Astrophysics Data System (ADS)

    Kimball, Jorja; Cole, Bryan; Hobson, Margaret; Watson, Karan; Stanley, Christine

    This paper reports findings on gender that were part of a larger study reviewing time to completion of course work that includes the first two semesters of calculus, chemistry, and physics, which are often considered the stumbling points or "barrier courses" to an engineering baccalaureate degree. Texas A&M University terms these courses core body of knowledge (CBK), and statistical analysis was conducted on two cohorts of first-year enrolling engineering students at the institution. Findings indicate that gender is statistically significantly related to completion of CBK with female engineering students completing required courses faster than males at the .01 level (p = 0.008). Statistical significance for gender and ethnicity was found between white male and white female students at the .01 level (p = 0.008). Descriptive analysis indicated that of the five majors studied (chemical, civil, computer, electrical, and mechanical engineering), women completed CBK faster than men, and African American and Hispanic women completed CBK faster than males of the same ethnicity.

  7. Statistical energy analysis computer program, user's guide

    NASA Technical Reports Server (NTRS)

    Trudell, R. W.; Yano, L. I.

    1981-01-01

    A high-frequency random vibration analysis method, statistical energy analysis (SEA), is examined. The SEA method accomplishes high-frequency prediction for arbitrary structural configurations. A general SEA computer program is described. A summary of SEA theory, example problems of SEA program application, and a complete program listing are presented.

  8. Risk Factors for Sexual Violence in the Military: An Analysis of Sexual Assault and Sexual Harassment Incidents and Reporting

    DTIC Science & Technology

    2017-03-01

    [Indexed excerpt from the report's list of tables and chapter outline: descriptive statistics for control variables by gender (random subsample with complete survey); Chapter IV presents the empirical analysis, summary statistics, and results; Chapter V offers concluding thoughts and study limitations.]

  9. Antiviral treatment of Bell's palsy based on baseline severity: a systematic review and meta-analysis.

    PubMed

    Turgeon, Ricky D; Wilby, Kyle J; Ensom, Mary H H

    2015-06-01

    We conducted a systematic review with meta-analysis to evaluate the efficacy of antiviral agents on complete recovery of Bell's palsy. We searched CENTRAL, Embase, MEDLINE, International Pharmaceutical Abstracts, and sources of unpublished literature to November 1, 2014. Primary and secondary outcomes were complete and satisfactory recovery, respectively. To evaluate statistical heterogeneity, we performed subgroup analysis of baseline severity of Bell's palsy and between-study sensitivity analyses based on risk of allocation and detection bias. The 10 included randomized controlled trials (2419 patients; 807 with severe Bell's palsy at onset) had variable risk of bias, with 9 trials having a high risk of bias in at least 1 domain. Complete recovery was not statistically significantly greater with antiviral use versus no antiviral use in the random-effects meta-analysis of 6 trials (relative risk, 1.06; 95% confidence interval, 0.97-1.16; I(2) = 65%). Conversely, random-effects meta-analysis of 9 trials showed a statistically significant difference in satisfactory recovery (relative risk, 1.10; 95% confidence interval, 1.02-1.18; I(2) = 63%). Response to antiviral agents did not differ visually or statistically between patients with severe symptoms at baseline and those with milder disease (test for interaction, P = .11). Sensitivity analyses did not show a clear effect of bias on outcomes. Antiviral agents are not efficacious in increasing the proportion of patients with Bell's palsy who achieved complete recovery, regardless of baseline symptom severity. Copyright © 2015 Elsevier Inc. All rights reserved.
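
    The pooled relative risks and I(2) values quoted above come from random-effects meta-analysis; the sketch below shows a generic DerSimonian-Laird pooling of log relative risks in Python. The per-trial estimates and standard errors are hypothetical and do not reproduce the review's data.

      # Sketch: DerSimonian-Laird random-effects pooling of log relative risks.
      # Per-trial values are hypothetical placeholders.
      import numpy as np

      log_rr = np.array([0.05, 0.15, -0.02, 0.20, 0.08, 0.12])   # per-trial log relative risks
      se = np.array([0.08, 0.10, 0.07, 0.12, 0.09, 0.11])        # their standard errors

      w = 1.0 / se**2
      fixed_mean = np.sum(w * log_rr) / np.sum(w)
      q = np.sum(w * (log_rr - fixed_mean)**2)                   # Cochran's Q
      df = len(log_rr) - 1
      c = np.sum(w) - np.sum(w**2) / np.sum(w)
      tau2 = max(0.0, (q - df) / c)                              # between-trial variance
      i2 = max(0.0, (q - df) / q) * 100                          # I^2 heterogeneity

      w_re = 1.0 / (se**2 + tau2)
      pooled = np.sum(w_re * log_rr) / np.sum(w_re)
      pooled_se = np.sqrt(1.0 / np.sum(w_re))
      lo, hi = np.exp(pooled - 1.96 * pooled_se), np.exp(pooled + 1.96 * pooled_se)
      print(f"Pooled RR = {np.exp(pooled):.2f} (95% CI {lo:.2f}-{hi:.2f}), I^2 = {i2:.0f}%")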

  10. A novel complete-case analysis to determine statistical significance between treatments in an intention-to-treat population of randomized clinical trials involving missing data.

    PubMed

    Liu, Wei; Ding, Jinhui

    2018-04-01

    The application of the principle of the intention-to-treat (ITT) to the analysis of clinical trials is challenged in the presence of missing outcome data. The consequences of stopping an assigned treatment in a withdrawn subject are unknown. It is difficult to make a single assumption about missing mechanisms for all clinical trials because there are complicated reactions in the human body to drugs due to the presence of complex biological networks, leading to data missing randomly or non-randomly. Currently there is no statistical method that can tell whether a difference between two treatments in the ITT population of a randomized clinical trial with missing data is significant at a pre-specified level. Making no assumptions about the missing mechanisms, we propose a generalized complete-case (GCC) analysis based on the data of completers. An evaluation of the impact of missing data on the ITT analysis reveals that a statistically significant GCC result implies a significant treatment effect in the ITT population at a pre-specified significance level unless, relative to the comparator, the test drug is poisonous to the non-completers as documented in their medical records. Applications of the GCC analysis are illustrated using literature data, and its properties and limits are discussed.

  11. The "S.W.A.T." Concept.

    ERIC Educational Resources Information Center

    Williamson, Bob; And Others

    1985-01-01

    Describes Statistical Work Analysis Teams (S.W.A.T.), which marry the two factors necessary for successful statistical analysis with the personal nature of attribute data into a single effort. Discusses S.W.A.T. project guidelines, implementation of the first S.W.A.T. projects, team training, and project completion. (CT)

  12. An Analysis of Attitudes toward Statistics: Gender Differences among Advertising Majors.

    ERIC Educational Resources Information Center

    Fullerton, Jami A.; Umphrey, Don

    This study measures advertising students' attitudes toward statistics. Subjects, 275 undergraduate advertising students from two southwestern United States universities, completed a questionnaire used to gauge students' attitudes toward statistics by measuring 6 underlying factors: (1) students' interest and future applicability; (2) relationship…

  13. Pre-installation customer satisfaction survey

    DOT National Transportation Integrated Search

    1996-10-01

    The National Center for Statistics and Analysis (NCSA) Information Services Branch (ISB) required a more effective method of receiving, tracking, and completing requests for data, statistics, and information. To enhance ISB's services, a new cus...

  14. Nonoccupant Fatalities Associated with Backing Crashes

    DOT National Transportation Integrated Search

    1997-02-01

    National Highway Traffic Safety Administration's National Center for Statistics and Analysis (NCSA) recently completed a study of data from the National Center for Health Statistics (NCHS) to obtain an estimate of the number of nonoccupant fataliti...

  15. Limitations of Using Microsoft Excel Version 2016 (MS Excel 2016) for Statistical Analysis for Medical Research.

    PubMed

    Tanavalee, Chotetawan; Luksanapruksa, Panya; Singhatanadgige, Weerasak

    2016-06-01

    Microsoft Excel (MS Excel) is a commonly used program for data collection and statistical analysis in biomedical research. However, this program has many limitations, including fewer functions that can be used for analysis and a limited number of total cells compared with dedicated statistical programs. MS Excel cannot complete analyses with blank cells, and cells must be selected manually for analysis. In addition, it requires multiple steps of data transformation and formulas to plot survival analysis graphs, among others. The Megastat add-on program, which will be supported by MS Excel 2016 soon, would eliminate some limitations of using statistical formulas within MS Excel.
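
    To make the blank-cell limitation concrete, the hedged sketch below shows how a scripted analysis (Python with pandas/SciPy, used here purely for illustration) drops missing cells automatically before a t-test; the tiny CSV content is hypothetical.

      # Sketch: a t-test on a column containing blank (missing) cells,
      # one of the situations the abstract notes MS Excel handles poorly.
      import io
      import pandas as pd
      from scipy import stats

      csv_text = "group,value\nA,10.2\nA,\nA,9.8\nB,12.1\nB,11.5\nB,\n"
      df = pd.read_csv(io.StringIO(csv_text))        # blank cells are read as NaN

      a = df.loc[df.group == "A", "value"].dropna()
      b = df.loc[df.group == "B", "value"].dropna()
      t, p = stats.ttest_ind(a, b)
      print(f"t = {t:.2f}, p = {p:.3f} (missing cells dropped before testing)")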

  16. 7 CFR 4279.43 - Certified Lender Program.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... complete financial statements; and completion by the Agency of the environmental analysis. The Agency may... lender will provide a written certification to this effect along with a statistical analysis of its... 80 percent. (4) If the lender is a bank or savings and loan, it must have a financial strength rating...

  17. 7 CFR 4279.43 - Certified Lender Program.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... complete financial statements; and completion by the Agency of the environmental analysis. The Agency may... lender will provide a written certification to this effect along with a statistical analysis of its... 80 percent. (4) If the lender is a bank or savings and loan, it must have a financial strength rating...

  18. 7 CFR 4279.43 - Certified Lender Program.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... complete financial statements; and completion by the Agency of the environmental analysis. The Agency may... lender will provide a written certification to this effect along with a statistical analysis of its... 80 percent. (4) If the lender is a bank or savings and loan, it must have a financial strength rating...

  19. 7 CFR 4279.43 - Certified Lender Program.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... complete financial statements; and completion by the Agency of the environmental analysis. The Agency may... lender will provide a written certification to this effect along with a statistical analysis of its... 80 percent. (4) If the lender is a bank or savings and loan, it must have a financial strength rating...

  20. Highly Efficient Design-of-Experiments Methods for Combining CFD Analysis and Experimental Data

    NASA Technical Reports Server (NTRS)

    Anderson, Bernhard H.; Haller, Harold S.

    2009-01-01

    It is the purpose of this study to examine the impact of "highly efficient" Design-of-Experiments (DOE) methods for combining sets of CFD-generated analysis data with smaller sets of experimental test data in order to accurately predict performance results where experimental test data were not obtained. The study examines the impact of micro-ramp flow control on the shock wave boundary layer (SWBL) interaction, where a complete paired set of data exists from both CFD analysis and experimental measurements. By combining the complete set of CFD analysis data, composed of fifteen (15) cases, with a smaller subset of experimental test data containing four or five (4/5) cases, compound data sets (CFD/EXP) were generated which allow the prediction of the complete set of experimental results. No statistical differences were found to exist between the combined (CFD/EXP) generated data sets and the complete experimental data set composed of fifteen (15) cases. The same optimal micro-ramp configuration was obtained using the (CFD/EXP) generated data as with the complete set of experimental data, and the DOE response surfaces generated by the two data sets were also not statistically different.
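
    The DOE response surfaces mentioned above are typically low-order polynomial fits over the design variables; the sketch below fits a generic second-order surface by least squares. The micro-ramp parameters and response values are hypothetical, not the study's CFD or experimental data.

      # Sketch: fit y = b0 + b1*h + b2*s + b3*h^2 + b4*s^2 + b5*h*s by least squares.
      # Design points and responses are hypothetical placeholders.
      import numpy as np

      # Columns: ramp height h, ramp spacing s (both normalized), response y.
      data = np.array([
          [-1.0, -1.0, 0.82], [-1.0, 0.0, 0.78], [-1.0, 1.0, 0.80],
          [ 0.0, -1.0, 0.75], [ 0.0, 0.0, 0.70], [ 0.0, 1.0, 0.74],
          [ 1.0, -1.0, 0.79], [ 1.0, 0.0, 0.76], [ 1.0, 1.0, 0.81],
      ])
      h, s, y = data[:, 0], data[:, 1], data[:, 2]

      X = np.column_stack([np.ones_like(h), h, s, h**2, s**2, h * s])
      coef, *_ = np.linalg.lstsq(X, y, rcond=None)
      print("Response-surface coefficients:", np.round(coef, 3))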

  1. A PERT/CPM of the Computer Assisted Completion of The Ministry September Report. Research Report.

    ERIC Educational Resources Information Center

    Feeney, J. D.

    Using two statistical analysis techniques (the Program Evaluation and Review Technique and the Critical Path Method), this study analyzed procedures for compiling the required yearly report of the Metropolitan Separate School Board (Catholic) of Toronto, Canada. The computer-assisted analysis organized the process of completing the report more…

  2. The Analysis of Completely Randomized Factorial Experiments When Observations Are Lost at Random.

    ERIC Educational Resources Information Center

    Hummel, Thomas J.

    An investigation was conducted of the characteristics of two estimation procedures and corresponding test statistics used in the analysis of completely randomized factorial experiments when observations are lost at random. For one estimator, contrast coefficients for cell means did not involve the cell frequencies. For the other, contrast…

  3. The Australasian Resuscitation in Sepsis Evaluation (ARISE) trial statistical analysis plan.

    PubMed

    Delaney, Anthony P; Peake, Sandra L; Bellomo, Rinaldo; Cameron, Peter; Holdgate, Anna; Howe, Belinda; Higgins, Alisa; Presneill, Jeffrey; Webb, Steve

    2013-09-01

    The Australasian Resuscitation in Sepsis Evaluation (ARISE) study is an international, multicentre, randomised, controlled trial designed to evaluate the effectiveness of early goal-directed therapy compared with standard care for patients presenting to the emergency department with severe sepsis. In keeping with current practice, and considering aspects of trial design and reporting specific to non-pharmacological interventions, our plan outlines the principles and methods for analysing and reporting the trial results. The document is prepared before completion of recruitment into the ARISE study, without knowledge of the results of the interim analysis conducted by the data safety and monitoring committee and before completion of the two related international studies. Our statistical analysis plan was designed by the ARISE chief investigators, and reviewed and approved by the ARISE steering committee. We reviewed the data collected by the research team as specified in the study protocol and detailed in the study case report form. We describe information related to baseline characteristics, characteristics of delivery of the trial interventions, details of resuscitation, other related therapies and other relevant data with appropriate comparisons between groups. We define the primary, secondary and tertiary outcomes for the study, with description of the planned statistical analyses. We have developed a statistical analysis plan with a trial profile, mock-up tables and figures. We describe a plan for presenting baseline characteristics, microbiological and antibiotic therapy, details of the interventions, processes of care and concomitant therapies and adverse events. We describe the primary, secondary and tertiary outcomes with identification of subgroups to be analysed. We have developed a statistical analysis plan for the ARISE study, available in the public domain, before the completion of recruitment into the study. This will minimise analytical bias and conforms to current best practice in conducting clinical trials.

  4. Uncertainty Analysis of Seebeck Coefficient and Electrical Resistivity Characterization

    NASA Technical Reports Server (NTRS)

    Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred

    2014-01-01

    In order to provide a complete description of a material's thermoelectric power factor, in addition to the measured nominal value, an uncertainty interval is required. The uncertainty may contain sources of measurement error including systematic bias error and precision error of a statistical nature. The work focuses specifically on the popular ZEM-3 (Ulvac Technologies) measurement system, but the methods apply to any measurement system. The analysis accounts for sources of systematic error including sample preparation tolerance, measurement probe placement, thermocouple cold-finger effect, and measurement parameters, in addition to including uncertainty of a statistical nature. Complete uncertainty analysis of a measurement system allows for more reliable comparison of measurement data between laboratories.
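
    As a minimal sketch of the idea, assuming independent error sources combined in quadrature (the paper's actual error budget is not reproduced here), the snippet below propagates combined systematic and statistical uncertainties of the Seebeck coefficient S and resistivity rho into the power factor S^2/rho; all numbers are hypothetical.

      # Sketch: first-order uncertainty propagation into the power factor PF = S^2 / rho.
      # All numerical values are hypothetical placeholders.
      import numpy as np

      S, rho = 200e-6, 1.0e-5                 # Seebeck coefficient (V/K), resistivity (ohm*m)
      u_S_sys, u_S_stat = 4e-6, 2e-6          # systematic and statistical uncertainty of S
      u_rho_sys, u_rho_stat = 3e-7, 1e-7      # systematic and statistical uncertainty of rho

      u_S = np.hypot(u_S_sys, u_S_stat)       # combine in quadrature (independence assumed)
      u_rho = np.hypot(u_rho_sys, u_rho_stat)

      pf = S**2 / rho
      # (u_PF/PF)^2 = (2*u_S/S)^2 + (u_rho/rho)^2
      u_pf = pf * np.sqrt((2 * u_S / S)**2 + (u_rho / rho)**2)
      print(f"PF = {pf*1e3:.2f} +/- {u_pf*1e3:.2f} mW/(m*K^2)")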

  5. Missing Data and Multiple Imputation: An Unbiased Approach

    NASA Technical Reports Server (NTRS)

    Foy, M.; VanBaalen, M.; Wear, M.; Mendez, C.; Mason, S.; Meyers, V.; Alexander, D.; Law, J.

    2014-01-01

    The default method of dealing with missing data in statistical analyses is to only use the complete observations (complete case analysis), which can lead to unexpected bias when data do not meet the assumption of missing completely at random (MCAR). For the assumption of MCAR to be met, missingness cannot be related to either the observed or unobserved variables. A less stringent assumption, missing at random (MAR), requires that missingness not be associated with the value of the missing variable itself, but can be associated with the other observed variables. When data are truly MAR as opposed to MCAR, the default complete case analysis method can lead to biased results. There are statistical options available to adjust for data that are MAR, including multiple imputation (MI) which is consistent and efficient at estimating effects. Multiple imputation uses informing variables to determine statistical distributions for each piece of missing data. Then multiple datasets are created by randomly drawing on the distributions for each piece of missing data. Since MI is efficient, only a limited number, usually less than 20, of imputed datasets are required to get stable estimates. Each imputed dataset is analyzed using standard statistical techniques, and then results are combined to get overall estimates of effect. A simulation study will be demonstrated to show the results of using the default complete case analysis, and MI in a linear regression of MCAR and MAR simulated data. Further, MI was successfully applied to the association study of CO2 levels and headaches when initial analysis showed there may be an underlying association between missing CO2 levels and reported headaches. Through MI, we were able to show that there is a strong association between average CO2 levels and the risk of headaches. Each unit increase in CO2 (mmHg) resulted in a doubling in the odds of reported headaches.
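
    A minimal sketch of the contrast described above, using simulated missing-at-random data and the chained-equations imputation available in statsmodels (the simulation and model are illustrative assumptions, not the study's analysis):

      # Sketch: multiple imputation (MICE) vs. complete-case analysis on simulated
      # missing-at-random data. All data are simulated placeholders.
      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      from statsmodels.imputation import mice

      rng = np.random.default_rng(0)
      n = 500
      x = rng.normal(size=n)
      y = 2.0 * x + rng.normal(size=n)
      df = pd.DataFrame({"x": x, "y": y})
      df.loc[df["y"] > 1.0, "x"] = np.nan      # missingness depends on observed y (MAR)

      imp = mice.MICEData(df)                  # chained-equations imputation
      fit = mice.MICE("y ~ x", sm.OLS, imp).fit(10, 10)   # 10 burn-in cycles, 10 imputations
      print(fit.summary())

      cc = sm.OLS.from_formula("y ~ x", data=df.dropna()).fit()   # complete-case comparison
      print(cc.params)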

  6. Fatalities Associated with Carbon Monoxide Poisoning from Motor Vehicles in 1993

    DOT National Transportation Integrated Search

    1996-12-01

    National Highway Traffic Safety Administration's National Center for Statistics and Analysis (NCSA) recently completed a study of data from the National Center for Health Statistics (NCHS) to obtain an estimate of the number of persons killed a...

  7. A New Paradigm to Analyze Data Completeness of Patient Data.

    PubMed

    Nasir, Ayan; Gurupur, Varadraj; Liu, Xinliang

    2016-08-03

    There is a need to develop a tool that will measure data completeness of patient records using sophisticated statistical metrics. Patient data integrity is important in providing timely and appropriate care. Completeness is an important step, with an emphasis on understanding the complex relationships between data fields and their relative importance in delivering care. This tool will not only help understand where data problems are but also help uncover the underlying issues behind them. Develop a tool that can be used alongside a variety of health care database software packages to determine the completeness of individual patient records as well as aggregate patient records across health care centers and subpopulations. The methodology of this project is encapsulated within the Data Completeness Analysis Package (DCAP) tool, with the major components including concept mapping, CSV parsing, and statistical analysis. The results from testing DCAP with Healthcare Cost and Utilization Project (HCUP) State Inpatient Database (SID) data show that this tool is successful in identifying relative data completeness at the patient, subpopulation, and database levels. These results also solidify a need for further analysis and call for hypothesis driven research to find underlying causes for data incompleteness. DCAP examines patient records and generates statistics that can be used to determine the completeness of individual patient data as well as the general thoroughness of record keeping in a medical database. DCAP uses a component that is customized to the settings of the software package used for storing patient data as well as a Comma Separated Values (CSV) file parser to determine the appropriate measurements. DCAP itself is assessed through a proof of concept exercise using hypothetical data as well as available HCUP SID patient data.
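
    DCAP's internal weighting of field importance is not given in this record; as a generic illustration only, the sketch below computes per-field and per-record completeness percentages from a small CSV with hypothetical field names.

      # Sketch: per-field and per-record completeness from a CSV of patient records.
      # Field names and values are hypothetical placeholders.
      import io
      import pandas as pd

      csv_text = (
          "patient_id,age,sex,diagnosis,discharge_status\n"
          "1,54,F,I10,\n"
          "2,,M,E11,home\n"
          "3,61,,I10,home\n"
      )
      records = pd.read_csv(io.StringIO(csv_text))

      field_completeness = records.notna().mean() * 100          # % complete per field
      record_completeness = records.notna().mean(axis=1) * 100   # % complete per patient row
      print(field_completeness.round(1))
      print(record_completeness.round(1))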

  8. A New Paradigm to Analyze Data Completeness of Patient Data

    PubMed Central

    Nasir, Ayan; Liu, Xinliang

    2016-01-01

    Background: There is a need to develop a tool that will measure data completeness of patient records using sophisticated statistical metrics. Patient data integrity is important in providing timely and appropriate care. Completeness is an important step, with an emphasis on understanding the complex relationships between data fields and their relative importance in delivering care. This tool will not only help understand where data problems are but also help uncover the underlying issues behind them. Objectives: Develop a tool that can be used alongside a variety of health care database software packages to determine the completeness of individual patient records as well as aggregate patient records across health care centers and subpopulations. Methods: The methodology of this project is encapsulated within the Data Completeness Analysis Package (DCAP) tool, with the major components including concept mapping, CSV parsing, and statistical analysis. Results: The results from testing DCAP with Healthcare Cost and Utilization Project (HCUP) State Inpatient Database (SID) data show that this tool is successful in identifying relative data completeness at the patient, subpopulation, and database levels. These results also solidify a need for further analysis and call for hypothesis driven research to find underlying causes for data incompleteness. Conclusion: DCAP examines patient records and generates statistics that can be used to determine the completeness of individual patient data as well as the general thoroughness of record keeping in a medical database. DCAP uses a component that is customized to the settings of the software package used for storing patient data as well as a Comma Separated Values (CSV) file parser to determine the appropriate measurements. DCAP itself is assessed through a proof of concept exercise using hypothetical data as well as available HCUP SID patient data. PMID:27484918

  9. Gaia DR2 documentation Chapter 1: Introduction

    NASA Astrophysics Data System (ADS)

    de Bruijne, J. H. J.; Abreu, A.; Brown, A. G. A.; Castañeda, J.; Cheek, N.; Crowley, C.; De Angeli, F.; Drimmel, R.; Fabricius, C.; Fleitas, J.; Gracia-Abril, G.; Guerra, R.; Hutton, A.; Messineo, R.; Mora, A.; Nienartowicz, K.; Panem, C.; Siddiqui, H.

    2018-04-01

    This chapter of the Gaia DR2 documentation describes the Gaia mission, the Gaia spacecraft, and the organisation of the Gaia Data Processing and Analysis Consortium (DPAC), which is responsible for the processing and analysis of the Gaia data. Furthermore, various properties of the data release are summarised, including statistical properties, object statistics, completeness, selection and filtering criteria, and limitations of the data.

  10. An Automated System for Chromosome Analysis

    NASA Technical Reports Server (NTRS)

    Castleman, K. R.; Melnyk, J. H.

    1976-01-01

    The design, construction, and testing of a complete system to produce karyotypes and chromosome measurement data from human blood samples, and to provide a basis for statistical analysis of quantitative chromosome measurement data are described.

  11. Rural and Urban Crashes: A Comparative Analysis

    DOT National Transportation Integrated Search

    1996-08-01

    National Highway Traffic Safety Administration's National Center for Statistics and Analysis (NCSA) recently completed a study comparing the characteristics of crashes occurring in rural areas to the characteristics of crashes occurring in urba...

  12. Statistical correlation of structural mode shapes from test measurements and NASTRAN analytical values

    NASA Technical Reports Server (NTRS)

    Purves, L.; Strang, R. F.; Dube, M. P.; Alea, P.; Ferragut, N.; Hershfeld, D.

    1983-01-01

    The software and procedures of a system of programs used to generate a report of the statistical correlation between NASTRAN modal analysis results and physical tests results from modal surveys are described. Topics discussed include: a mathematical description of statistical correlation, a user's guide for generating a statistical correlation report, a programmer's guide describing the organization and functions of individual programs leading to a statistical correlation report, and a set of examples including complete listings of programs, and input and output data.

  13. Australasian Resuscitation In Sepsis Evaluation trial statistical analysis plan.

    PubMed

    Delaney, Anthony; Peake, Sandra L; Bellomo, Rinaldo; Cameron, Peter; Holdgate, Anna; Howe, Belinda; Higgins, Alisa; Presneill, Jeffrey; Webb, Steve

    2013-10-01

    The Australasian Resuscitation In Sepsis Evaluation (ARISE) study is an international, multicentre, randomised, controlled trial designed to evaluate the effectiveness of early goal-directed therapy compared with standard care for patients presenting to the ED with severe sepsis. In keeping with current practice, and taking into considerations aspects of trial design and reporting specific to non-pharmacologic interventions, this document outlines the principles and methods for analysing and reporting the trial results. The document is prepared prior to completion of recruitment into the ARISE study, without knowledge of the results of the interim analysis conducted by the data safety and monitoring committee and prior to completion of the two related international studies. The statistical analysis plan was designed by the ARISE chief investigators, and reviewed and approved by the ARISE steering committee. The data collected by the research team as specified in the study protocol, and detailed in the study case report form were reviewed. Information related to baseline characteristics, characteristics of delivery of the trial interventions, details of resuscitation and other related therapies, and other relevant data are described with appropriate comparisons between groups. The primary, secondary and tertiary outcomes for the study are defined, with description of the planned statistical analyses. A statistical analysis plan was developed, along with a trial profile, mock-up tables and figures. A plan for presenting baseline characteristics, microbiological and antibiotic therapy, details of the interventions, processes of care and concomitant therapies, along with adverse events are described. The primary, secondary and tertiary outcomes are described along with identification of subgroups to be analysed. A statistical analysis plan for the ARISE study has been developed, and is available in the public domain, prior to the completion of recruitment into the study. This will minimise analytic bias and conforms to current best practice in conducting clinical trials. © 2013 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.

  14. Randomized clinical trials in orthodontics are rarely registered a priori and often published late or not at all.

    PubMed

    Papageorgiou, Spyridon N; Antonoglou, Georgios N; Sándor, George K; Eliades, Theodore

    2017-01-01

    A priori registration of randomized clinical trials is crucial to the transparency and credibility of their findings. Aim of this study was to assess the frequency with which registered and completed randomized trials in orthodontics are published. We searched ClinicalTrials.gov and ISRCTN for registered randomized clinical trials in orthodontics that had been completed up to January 2017 and judged the publication status and date of registered trials using a systematic protocol. Statistical analysis included descriptive statistics, chi-square or Fisher exact tests, and Kaplan-Meier survival estimates. From the 266 orthodontic trials registered up to January 2017, 80 trials had been completed and included in the present study. Among these 80 included trials, the majority (76%) were registered retrospectively, while only 33 (41%) were published at the time. The median time from completion to publication was 20.1 months (interquartile range: 9.1 to 31.6 months), while survival analysis indicated that less than 10% of the trials were published after 5 years from their completion. Finally, 22 (28%) of completed trials remain unpublished even after 5 years from their completion. Publication rates of registered randomized trials in orthodontics remained low, even 5 years after their completion date.
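
    The Kaplan-Meier estimate used above treats still-unpublished trials as censored observations; a minimal sketch with the lifelines package follows, using hypothetical durations rather than the study's registry data.

      # Sketch: Kaplan-Meier estimate of time from trial completion to publication.
      # Durations (months) and publication indicators are hypothetical placeholders.
      from lifelines import KaplanMeierFitter

      months_to_publication = [9, 14, 20, 26, 31, 40, 48, 60, 60, 60]
      published = [1, 1, 1, 1, 1, 1, 0, 0, 0, 0]   # 0 = still unpublished (censored)

      kmf = KaplanMeierFitter()
      kmf.fit(months_to_publication, event_observed=published)
      print(kmf.median_survival_time_)   # median months from completion to publication
      print(kmf.survival_function_)      # estimated proportion still unpublished over time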

  15. Randomized clinical trials in orthodontics are rarely registered a priori and often published late or not at all

    PubMed Central

    Antonoglou, Georgios N.; Sándor, George K.; Eliades, Theodore

    2017-01-01

    A priori registration of randomized clinical trials is crucial to the transparency and credibility of their findings. Aim of this study was to assess the frequency with which registered and completed randomized trials in orthodontics are published. We searched ClinicalTrials.gov and ISRCTN for registered randomized clinical trials in orthodontics that had been completed up to January 2017 and judged the publication status and date of registered trials using a systematic protocol. Statistical analysis included descriptive statistics, chi-square or Fisher exact tests, and Kaplan-Meier survival estimates. From the 266 orthodontic trials registered up to January 2017, 80 trials had been completed and included in the present study. Among these 80 included trials, the majority (76%) were registered retrospectively, while only 33 (41%) were published at the time. The median time from completion to publication was 20.1 months (interquartile range: 9.1 to 31.6 months), while survival analysis indicated that less than 10% of the trials were published after 5 years from their completion. Finally, 22 (28%) of completed trials remain unpublished even after 5 years from their completion. Publication rates of registered randomized trials in orthodontics remained low, even 5 years after their completion date. PMID:28777820

  16. An Analysis of the Crash Experience of Vehicles Equipped with Antilock Braking System

    DOT National Transportation Integrated Search

    1995-06-01

    National Center for Statistics and Analysis has recently completed an initial analysis of the crash experience of passenger cars (PCs) and light trucks and vans (LTVs) equipped with antilock braking systems (ABS). Four types of crashes were ide...

  17. FT. Sam 91 Whiskey Combat Medic Medical Simulation Training Quantitative Integration Enhancement Program

    DTIC Science & Technology

    2011-07-01

    [Indexed excerpt from the project report: Dr. Collin, an employee of the University of Pittsburgh, joined the project team in the statistical and research coordination role; milestones include submission to Ft. Detrick and statistical analysis planning (review of planned data metrics and data-gathering tools); the project takes an approach to performance assessment for continuous quality improvement, analyzing data with modern statistical techniques.]

  18. World Population: Facts in Focus. World Population Data Sheet Workbook. Population Learning Series.

    ERIC Educational Resources Information Center

    Crews, Kimberly A.

    This workbook teaches population analysis using world population statistics. To complete the four student activity sheets, the students refer to the included "1988 World Population Data Sheet" which lists nations' statistical data that includes population totals, projected population, birth and death rates, fertility levels, and the…

  19. SOCR: Statistics Online Computational Resource

    PubMed Central

    Dinov, Ivo D.

    2011-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students' intuition and enhance their learning. PMID:21451741

  20. Nearfield Summary and Statistical Analysis of the Second AIAA Sonic Boom Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Park, Michael A.; Nemec, Marian

    2017-01-01

    A summary is provided for the Second AIAA Sonic Boom Workshop held 8-9 January 2017 in conjunction with AIAA SciTech 2017. The workshop used three required models of increasing complexity: an axisymmetric body, a wing body, and a complete configuration with flow-through nacelle. An optional complete configuration with propulsion boundary conditions is also provided. These models are designed with similar nearfield signatures to isolate geometry and shock/expansion interaction effects. Eleven international participant groups submitted nearfield signatures with forces, pitching moment, and iterative convergence norms. Statistics and grid convergence of these nearfield signatures are presented. These submissions are propagated to the ground, and noise levels are computed. This allows the grid convergence and the statistical distribution of a noise level to be computed. While progress is documented since the first workshop, improvements to the analysis methods for a possible subsequent workshop are provided. The complete configuration with flow-through nacelle showed the most dramatic improvement between the two workshops. The current workshop cases are more relevant to vehicles with lower loudness and have the potential for lower annoyance than the first workshop cases. The models for this workshop, with quieter ground noise levels than the first workshop, exposed weaknesses in analysis, particularly in convective discretization.

  1. Revised Estimates of Child Restraint Effectiveness

    DOT National Transportation Integrated Search

    1996-12-01

    NHTSA's National Center for Statistics and Analysis (NCSA) recently completed an analysis of data from the Fatal Accident Reporting System (FARS) to reexamine the effectiveness of restraints in saving the lives of children, ages 0 - 4. This ana...

  2. Four modes of optical parametric operation for squeezed state generation

    NASA Astrophysics Data System (ADS)

    Andersen, U. L.; Buchler, B. C.; Lam, P. K.; Wu, J. W.; Gao, J. R.; Bachor, H.-A.

    2003-11-01

    We report a versatile instrument, based on a monolithic optical parametric amplifier, which reliably generates four different types of squeezed light. We obtained vacuum squeezing, low power amplitude squeezing, phase squeezing and bright amplitude squeezing. We show a complete analysis of this light, including a full quantum state tomography. In addition we demonstrate the direct detection of the squeezed state statistics without the aid of a spectrum analyser. This technique makes the nonclassical properties directly visible and allows complete measurement of the statistical moments of the squeezed quadrature.

  3. Analysis of Antarctic Remote-Site Automatic Weather Station Data for Period January 1979 - February 1980.

    DTIC Science & Technology

    1982-06-01

    [Indexed excerpt from the report: the data's usefulness to the United States Antarctic mission as managed by the National Science Foundation; various statistical measures and statistical procedures were applied to the reported observations to evolve a general meteorological picture of each remote site; data were processed by station for monthly, seasonal, and annual statistics, as appropriate.]

  4. Nebraska's forests, 2005: statistics, methods, and quality assurance

    Treesearch

    Patrick D. Miles; Dacia M. Meneguzzo; Charles J. Barnett

    2011-01-01

    The first full annual inventory of Nebraska's forests was completed in 2005 after 8,335 plots were selected and 274 forested plots were visited and measured. This report includes detailed information on forest inventory methods, and data quality estimates. Tables of various important resource statistics are presented. Detailed analysis of the inventory data are...

  5. Kansas's forests, 2005: statistics, methods, and quality assurance

    Treesearch

    Patrick D. Miles; W. Keith Moser; Charles J. Barnett

    2011-01-01

    The first full annual inventory of Kansas's forests was completed in 2005 after 8,868 plots were selected and 468 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of Kansas inventory is presented...

  6. [Application of Stata software to test heterogeneity in meta-analysis method].

    PubMed

    Wang, Dan; Mou, Zhen-yun; Zhai, Jun-xia; Zong, Hong-xia; Zhao, Xiao-dong

    2008-07-01

    To introduce the application of Stata software to heterogeneity test in meta-analysis. A data set was set up according to the example in the study, and the corresponding commands of the methods in Stata 9 software were applied to test the example. The methods used were Q-test and I2 statistic attached to the fixed effect model forest plot, H statistic and Galbraith plot. The existence of the heterogeneity among studies could be detected by Q-test and H statistic and the degree of the heterogeneity could be detected by I2 statistic. The outliers which were the sources of the heterogeneity could be spotted from the Galbraith plot. Heterogeneity test in meta-analysis can be completed by the four methods in Stata software simply and quickly. H and I2 statistics are more robust, and the outliers of the heterogeneity can be clearly seen in the Galbraith plot among the four methods.
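
    The heterogeneity quantities named above (Q, H, I2) and the Galbraith-plot coordinates can be computed directly from study effect sizes and standard errors; the Python sketch below does so on hypothetical inputs rather than the paper's example data set.

      # Sketch: Cochran's Q, H, I^2, and Galbraith-plot coordinates.
      # Study effects and standard errors are hypothetical placeholders.
      import numpy as np

      effect = np.array([0.30, 0.10, 0.45, 0.05, 0.90])   # per-study effect estimates
      se = np.array([0.12, 0.15, 0.10, 0.20, 0.14])       # their standard errors

      w = 1.0 / se**2
      pooled_fixed = np.sum(w * effect) / np.sum(w)
      q = np.sum(w * (effect - pooled_fixed)**2)
      df = len(effect) - 1
      h = np.sqrt(q / df)
      i2 = max(0.0, (q - df) / q) * 100

      # Galbraith plot: standardized effect vs. precision; outliers sit far from
      # the regression line through the origin.
      precision, standardized = 1.0 / se, effect / se

      print(f"Q = {q:.2f} (df = {df}), H = {h:.2f}, I^2 = {i2:.0f}%")
      print(np.column_stack([precision, standardized]))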

  7. Implementation and evaluation of an efficient secure computation system using ‘R’ for healthcare statistics

    PubMed Central

    Chida, Koji; Morohashi, Gembu; Fuji, Hitoshi; Magata, Fumihiko; Fujimura, Akiko; Hamada, Koki; Ikarashi, Dai; Yamamoto, Ryuichi

    2014-01-01

    Background and objective: While the secondary use of medical data has gained attention, its adoption has been constrained due to protection of patient privacy. Making medical data secure by de-identification can be problematic, especially when the data concerns rare diseases. We require rigorous security management measures. Materials and methods: Using secure computation, an approach from cryptography, our system can compute various statistics over encrypted medical records without decrypting them. An issue of secure computation is that the amount of processing time required is immense. We implemented a system that securely computes healthcare statistics from the statistical computing software ‘R’ by effectively combining secret-sharing-based secure computation with original computation. Results: Testing confirmed that our system could correctly complete computation of average and unbiased variance of approximately 50 000 records of dummy insurance claim data in a little over a second. Computation including conditional expressions and/or comparison of values, for example, t test and median, could also be correctly completed in several tens of seconds to a few minutes. Discussion: If medical records are simply encrypted, the risk of leaks exists because decryption is usually required during statistical analysis. Our system possesses high-level security because medical records remain in encrypted state even during statistical analysis. Also, our system can securely compute some basic statistics with conditional expressions using ‘R’ that works interactively while secure computation protocols generally require a significant amount of processing time. Conclusions: We propose a secure statistical analysis system using ‘R’ for medical data that effectively integrates secret-sharing-based secure computation and original computation. PMID:24763677
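
    The system's actual protocol is not described in this record; purely as a toy illustration of the secret-sharing idea, the sketch below lets three parties compute a sum of claim amounts without any single party holding the raw values. The prime modulus, party count, and claim amounts are arbitrary assumptions.

      # Toy sketch of additive secret sharing: three parties jointly compute a sum
      # without any one of them seeing the raw values. Illustrative only; this is
      # not the protocol used by the system described above.
      import secrets

      PRIME = 2**61 - 1                      # arithmetic is done modulo a large prime

      def share(value, n_parties=3):
          """Split an integer into n additive shares that sum to it mod PRIME."""
          shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
          shares.append((value - sum(shares)) % PRIME)
          return shares

      claims = [1200, 560, 3400, 80]                       # hypothetical claim amounts
      per_party = list(zip(*(share(v) for v in claims)))   # each party holds one share per record

      # Each party sums its own shares locally; only these partial sums are exchanged.
      partial_sums = [sum(p) % PRIME for p in per_party]
      total = sum(partial_sums) % PRIME
      print(total == sum(claims))                          # True: the sum is recovered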

  8. Implementation and evaluation of an efficient secure computation system using 'R' for healthcare statistics.

    PubMed

    Chida, Koji; Morohashi, Gembu; Fuji, Hitoshi; Magata, Fumihiko; Fujimura, Akiko; Hamada, Koki; Ikarashi, Dai; Yamamoto, Ryuichi

    2014-10-01

    While the secondary use of medical data has gained attention, its adoption has been constrained due to protection of patient privacy. Making medical data secure by de-identification can be problematic, especially when the data concerns rare diseases. We require rigorous security management measures. Using secure computation, an approach from cryptography, our system can compute various statistics over encrypted medical records without decrypting them. An issue of secure computation is that the amount of processing time required is immense. We implemented a system that securely computes healthcare statistics from the statistical computing software 'R' by effectively combining secret-sharing-based secure computation with original computation. Testing confirmed that our system could correctly complete computation of average and unbiased variance of approximately 50,000 records of dummy insurance claim data in a little over a second. Computation including conditional expressions and/or comparison of values, for example, t test and median, could also be correctly completed in several tens of seconds to a few minutes. If medical records are simply encrypted, the risk of leaks exists because decryption is usually required during statistical analysis. Our system possesses high-level security because medical records remain in encrypted state even during statistical analysis. Also, our system can securely compute some basic statistics with conditional expressions using 'R' that works interactively while secure computation protocols generally require a significant amount of processing time. We propose a secure statistical analysis system using 'R' for medical data that effectively integrates secret-sharing-based secure computation and original computation. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  9. Treatments of Missing Values in Large National Data Affect Conclusions: The Impact of Multiple Imputation on Arthroplasty Research.

    PubMed

    Ondeck, Nathaniel T; Fu, Michael C; Skrip, Laura A; McLynn, Ryan P; Su, Edwin P; Grauer, Jonathan N

    2018-03-01

    Despite the advantages of large, national datasets, one continuing concern is missing data values. Complete case analysis, where only cases with complete data are analyzed, is commonly used rather than more statistically rigorous approaches such as multiple imputation. This study characterizes the potential selection bias introduced using complete case analysis and compares the results of common regressions using both techniques following unicompartmental knee arthroplasty. Patients undergoing unicompartmental knee arthroplasty were extracted from the 2005 to 2015 National Surgical Quality Improvement Program. As examples, the demographics of patients with and without missing preoperative albumin and hematocrit values were compared. Missing data were then treated with both complete case analysis and multiple imputation (an approach that reproduces the variation and associations that would have been present in a full dataset) and the conclusions of common regressions for adverse outcomes were compared. A total of 6117 patients were included, of which 56.7% were missing at least one value. Younger, female, and healthier patients were more likely to have missing preoperative albumin and hematocrit values. The use of complete case analysis removed 3467 patients from the study in comparison with multiple imputation which included all 6117 patients. The 2 methods of handling missing values led to differing associations of low preoperative laboratory values with commonly studied adverse outcomes. The use of complete case analysis can introduce selection bias and may lead to different conclusions in comparison with the statistically rigorous multiple imputation approach. Joint surgeons should consider the methods of handling missing values when interpreting arthroplasty research. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Treatment of control data in lunar phototriangulation. [application of statistical procedures and development of mathematical and computer techniques

    NASA Technical Reports Server (NTRS)

    Wong, K. W.

    1974-01-01

    In lunar phototriangulation, there is a complete lack of accurate ground control points. The accuracy analysis of the results of lunar phototriangulation must, therefore, be completely dependent on statistical procedures. It was the objective of this investigation to examine the validity of the commonly used statistical procedures, and to develop both mathematical techniques and computer software for evaluating (1) the accuracy of lunar phototriangulation; (2) the contribution of the different types of photo support data on the accuracy of lunar phototriangulation; (3) the accuracy of absolute orientation as a function of the accuracy and distribution of both the ground and model points; and (4) the relative slope accuracy between any triangulated pass points.

  11. SparRec: An effective matrix completion framework of missing data imputation for GWAS

    NASA Astrophysics Data System (ADS)

    Jiang, Bo; Ma, Shiqian; Causey, Jason; Qiao, Linbo; Hardin, Matthew Price; Bitts, Ian; Johnson, Daniel; Zhang, Shuzhong; Huang, Xiuzhen

    2016-10-01

    Genome-wide association studies present computational challenges for missing data imputation, while the advances of genotype technologies are generating datasets of large sample sizes with sample sets genotyped on multiple SNP chips. We present a new framework SparRec (Sparse Recovery) for imputation, with the following properties: (1) The optimization models of SparRec, based on low-rank and low number of co-clusters of matrices, are different from current statistics methods. While our low-rank matrix completion (LRMC) model is similar to Mendel-Impute, our matrix co-clustering factorization (MCCF) model is completely new. (2) SparRec, as other matrix completion methods, is flexible to be applied to missing data imputation for large meta-analysis with different cohorts genotyped on different sets of SNPs, even when there is no reference panel. This kind of meta-analysis is very challenging for current statistics based methods. (3) SparRec has consistent performance and achieves high recovery accuracy even when the missing data rate is as high as 90%. Compared with Mendel-Impute, our low-rank based method achieves similar accuracy and efficiency, while the co-clustering based method has advantages in running time. The testing results show that SparRec has significant advantages and competitive performance over other state-of-the-art existing statistics methods including Beagle and fastPhase.
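
    SparRec's own optimization models are not reproduced here; as a generic sketch of the low-rank matrix completion idea it builds on, the snippet below runs a SoftImpute-style iteration (SVD soft-thresholding with observed entries held fixed) on a simulated low-rank matrix with roughly 40% of entries missing.

      # Sketch: generic low-rank matrix completion by iterative SVD soft-thresholding.
      # The low-rank "genotype-like" matrix below is simulated.
      import numpy as np

      rng = np.random.default_rng(1)
      true = rng.integers(0, 3, size=(60, 5)) @ rng.integers(0, 2, size=(5, 40))  # rank <= 5
      mask = rng.random(true.shape) > 0.4            # ~60% of entries observed
      x = np.where(mask, true, 0.0).astype(float)    # start with missing entries at 0

      for _ in range(200):
          u, s, vt = np.linalg.svd(x, full_matrices=False)
          s = np.maximum(s - 1.0, 0.0)               # soft-threshold the singular values
          x = (u * s) @ vt                           # low-rank reconstruction
          x[mask] = true[mask]                       # keep observed entries fixed

      rmse = np.sqrt(np.mean((x[~mask] - true[~mask])**2))
      print(f"RMSE on the missing entries: {rmse:.3f}")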

  12. North Dakota's forests, 2005: statistics, methods, and quality assurance

    Treesearch

    Patrick D. Miles; David E. Haugen; Charles J. Barnett

    2011-01-01

    The first full annual inventory of North Dakota's forests was completed in 2005 after 7,622 plots were selected and 164 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of the North Dakota...

  13. Illinois' Forests, 2005: Statistics, Methods, and Quality Assurance

    Treesearch

    Susan J. Crocker; Charles J. Barnett; Mark A. Hatfield

    2013-01-01

    The first full annual inventory of Illinois' forests was completed in 2005. This report contains 1) descriptive information on methods, statistics, and quality assurance of data collection, 2) a glossary of terms, 3) tables that summarize quality assurance, and 4) a core set of tabular estimates for a variety of forest resources. A detailed analysis of inventory...

  14. South Dakota's forests, 2005: statistics, methods, and quality assurance

    Treesearch

    Patrick D. Miles; Ronald J. Piva; Charles J. Barnett

    2011-01-01

    The first full annual inventory of South Dakota's forests was completed in 2005 after 8,302 plots were selected and 325 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of the South Dakota...

  15. The Development of Career Naval Officers from the U.S. Naval Academy: A Statistical Analysis of the Effects of Selectivity and Human Capital

    DTIC Science & Technology

    1997-06-01

    [Indexed excerpts from the thesis: career success for academy graduates relative to officers commissioned from other sources; favoritism occurs if high-ranking officers who are service...; the thesis investigates several databases in an effort to paint a complete statistical picture of naval officer career success; a study including both public and private sector career success was conducted by the Standard & Poor's Corporation, with a related analysis by Professor Michael Useem.]

  16. [Statistical analysis of German radiologic periodicals: developmental trends in the last 10 years].

    PubMed

    Golder, W

    1999-09-01

    To identify which statistical tests are applied in German radiological publications, to what extent their use has changed during the last decade, and which factors might be responsible for this development. The major articles published in "ROFO" and "DER RADIOLOGE" during 1988, 1993 and 1998 were reviewed for statistical content. The contributions were classified by principal focus and radiological subspecialty. The methods used were assigned to descriptive, basal and advanced statistics. Sample size, significance level and power were established. The use of experts' assistance was monitored. Finally, we calculated the so-called cumulative accessibility of the publications. 525 contributions were found to be eligible. In 1988, 87% used descriptive statistics only, 12.5% basal, and 0.5% advanced statistics. The corresponding figures in 1993 and 1998 are 62 and 49%, 32 and 41%, and 6 and 10%, respectively. Statistical techniques were most likely to be used in research on musculoskeletal imaging and articles dedicated to MRI. Six basic categories of statistical methods account for the complete statistical analysis appearing in 90% of the articles. ROC analysis is the single most common advanced technique. Authors make increasing use of statistical experts' opinion and programs. During the last decade, the use of statistical methods in German radiological journals has fundamentally improved, both quantitatively and qualitatively. Presently, advanced techniques account for 20% of the pertinent statistical tests. This development seems to be promoted by the increasing availability of statistical analysis software.

  17. A study of engineering student attributes and time to completion of first-year required courses at Texas A&M University

    NASA Astrophysics Data System (ADS)

    Kimball, Jorja Lay

    For many years, colleges of engineering across the nation have required that a foundational set of courses be completed for entry into upper division coursework or into a specific engineering major. Since 1998, The Dwight Look College of Engineering at Texas A&M University (TAMU) has required that incoming first-time enrolling students complete a Core Body of Knowledge (CBK) with specific cumulative grade points required for specific majors. However, like most institutions, TAMU has not taken into consideration the time to completion of coursework or other student characteristics and academic factors. The purpose of this study is to determine, for first-year engineering students at TAMU, the relationship of gender, ethnicity, engineering major, unmet financial need, cumulative grade point average, and total transfer hours to time to completion of CBK courses. The results of the analysis showed that cumulative grade point average (CGPA) had the strongest relationship to completion of CBK of any independent variable in this study. Statistical significance was found for the following variables in this study: CGPA, gender, ethnicity, and unmet financial need. For the study's variable of major, statistical significance was found for Chemical, Electrical, and Computer Engineering majors. The one variable in this study that did not show statistical significance in relation to time to completion of CBK was transfer credit. A finding with implications for the recruitment and retention of groups underrepresented in engineering is the statistically significant result that, on average, females take less time than males to complete the CBK. The conclusion from the study is that efforts to attract more women into engineering have merit, as do programs to support underrepresented students so that they may complete the CBK at a faster pace. Further study to determine profiles of those majors where statistical significance was found for students taking a greater or lesser amount of time for CBK completion than the mean is recommended, as is ongoing data collection and comparison for current cohorts of engineering majors at TAMU.

  18. Neoadjuvant Long-Course Chemoradiotherapy for Rectal Cancer: Does Time to Surgery Matter?

    PubMed Central

    Panagiotopoulou, Ioanna G.; Parashar, Deepak; Qasem, Eyas; Mezher-Sikafi, Rasha; Parmar, Jitesh; Wells, Alan D.; Bajwa, Farrukh M.; Menon, Madhav; Jephcott, Catherine R.

    2015-01-01

    The objective of this paper was to evaluate whether delaying surgery following long-course chemoradiotherapy for rectal cancer correlates with pathologic complete response. Pre-operative chemoradiotherapy (CRT) is standard practice in the UK for the management of locally advanced rectal cancer. Optimal timing of surgery following CRT is still not clearly defined. All patients with a diagnosis of rectal cancer who had undergone long-course CRT prior to surgery between January 2008 and December 2011 were included. Statistical analysis was performed using Stata 11. Fifty-nine patients received long-course CRT prior to surgery in the selected period. Twenty-seven percent (16/59) of patients showed a complete histopathologic response and 59.3% (35/59) of patients had tumor down-staging from radiologically-assessed node positive to histologically-proven node negative disease. There was no statistically significant delay to surgery after completion of CRT in the 16 patients with complete response (CR) compared with the rest of the group [IR: incomplete response; CR group median: 74.5 days (IQR: 70–87.5) and IR group median: 72 days (IQR: 57–83), P = 0.470]. Although no statistically significant predictors of either complete response or tumor nodal status down-staging were identified in logistic regression analyses, a trend toward complete response was seen with longer delay to surgery following completion of long-course CRT. PMID:26414816

  19. User's manual for the Simulated Life Analysis of Vehicle Elements (SLAVE) model

    NASA Technical Reports Server (NTRS)

    Paul, D. D., Jr.

    1972-01-01

    The simulated life analysis of vehicle elements model was designed to perform statistical simulation studies for any constant loss rate. The outputs of the model consist of the total number of stages required, stages successfully completing their lifetime, and average stage flight life. This report contains a complete description of the model. Users' instructions and interpretation of input and output data are presented such that a user with little or no prior programming knowledge can successfully implement the program.

  20. ANALYSIS TO ACCOUNT FOR SMALL AGE RANGE CATEGORIES IN DISTRIBUTIONS OF WATER CONSUMPTION AND BODY WEIGHT IN THE U.S. USING CSFII DATA

    EPA Science Inventory

    Statistical population based estimates of water ingestion play a vital role in many types of exposure and risk analysis. A significant large scale analysis of water ingestion by the population of the United States was recently completed and is documented in the report titled ...

  1. Injuries Associated With Hazards Involving Motor Vehicle Power Windows

    DOT National Transportation Integrated Search

    1997-05-01

    National Highway Traffic Safety Administration's (NHTSA) National Center for : Statistics and Analysis (NCSA) recently completed a study of data from the : Consumer Product Safety Commission's (CPSC) National Electronic Injury : Surveillance System (...

  2. An automated system for chromosome analysis. Volume 1: Goals, system design, and performance

    NASA Technical Reports Server (NTRS)

    Castleman, K. R.; Melnyk, J. H.

    1975-01-01

    The design, construction, and testing of a complete system to produce karyotypes and chromosome measurement data from human blood samples, and a basis for statistical analysis of quantitative chromosome measurement data, are described. The prototype was assembled, tested, and evaluated on clinical material and thoroughly documented.

  3. Earth science research

    NASA Technical Reports Server (NTRS)

    Botkin, Daniel B.

    1987-01-01

    The analysis of ground-truth data from the boreal forest plots in the Superior National Forest, Minnesota, was completed. Development of statistical methods was completed for dimension analysis (equations to estimate the biomass of trees from measurements of diameter and height). The dimension-analysis equations were applied to the data obtained from ground-truth plots, to estimate the biomass. Classification and analyses of remote sensing images of the Superior National Forest were done as a test of the technique to determine forest biomass and ecological state by remote sensing. Data was archived on diskette and tape and transferred to UCSB to be used in subsequent research.

  4. Missing data treatments matter: an analysis of multiple imputation for anterior cervical discectomy and fusion procedures.

    PubMed

    Ondeck, Nathaniel T; Fu, Michael C; Skrip, Laura A; McLynn, Ryan P; Cui, Jonathan J; Basques, Bryce A; Albert, Todd J; Grauer, Jonathan N

    2018-04-09

    The presence of missing data is a limitation of large datasets, including the National Surgical Quality Improvement Program (NSQIP). In addressing this issue, most studies use complete case analysis, which excludes cases with missing data, thus potentially introducing selection bias. Multiple imputation, a statistically rigorous approach that approximates missing data and preserves sample size, may be an improvement over complete case analysis. The present study aims to evaluate the impact of using multiple imputation in comparison with complete case analysis for assessing the associations between preoperative laboratory values and adverse outcomes following anterior cervical discectomy and fusion (ACDF) procedures. This is a retrospective review of prospectively collected data. Patients undergoing one-level ACDF were identified in NSQIP 2012-2015. Perioperative adverse outcome variables assessed included the occurrence of any adverse event, severe adverse events, and hospital readmission. Missing preoperative albumin and hematocrit values were handled using complete case analysis and multiple imputation. These preoperative laboratory levels were then tested for associations with 30-day postoperative outcomes using logistic regression. A total of 11,999 patients were included. Of this cohort, 63.5% of patients had missing preoperative albumin and 9.9% had missing preoperative hematocrit. When using complete case analysis, only 4,311 patients were studied. The removed patients were significantly younger, healthier, of a common body mass index, and male. Logistic regression analysis failed to identify either preoperative hypoalbuminemia or preoperative anemia as significantly associated with adverse outcomes. When employing multiple imputation, all 11,999 patients were included. Preoperative hypoalbuminemia was significantly associated with the occurrence of any adverse event and severe adverse events. Preoperative anemia was significantly associated with the occurrence of any adverse event, severe adverse events, and hospital readmission. Multiple imputation is a rigorous statistical procedure that is being increasingly used to address missing values in large datasets. Using this technique for ACDF avoided the loss of cases that may have affected the representativeness and power of the study and led to different results than complete case analysis. Multiple imputation should be considered for future spine studies. Copyright © 2018 Elsevier Inc. All rights reserved.
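
    As an editorial illustration of the contrast described in this record, the following minimal Python sketch compares complete case analysis with a simple multiple-imputation analysis. It is not the study's actual pipeline: the data are simulated, the variable names (albumin, hematocrit, event) are hypothetical, and the imputer and pooling shown are one reasonable choice among many.

      # Hedged sketch: complete case analysis vs. multiple imputation (simulated data).
      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      from sklearn.experimental import enable_iterative_imputer  # noqa: F401
      from sklearn.impute import IterativeImputer

      rng = np.random.default_rng(0)
      n = 2000
      albumin = rng.normal(4.0, 0.5, n)
      hematocrit = rng.normal(41, 4, n)
      lin = -3 + 1.2 * (albumin < 3.5) + 0.8 * (hematocrit < 36)
      event = rng.binomial(1, 1 / (1 + np.exp(-lin)))
      df = pd.DataFrame({"albumin": albumin, "hematocrit": hematocrit, "event": event})
      # Introduce missingness in the laboratory values, as in registry-style data.
      df.loc[rng.random(n) < 0.6, "albumin"] = np.nan
      df.loc[rng.random(n) < 0.1, "hematocrit"] = np.nan

      # Complete case analysis: drop every row with a missing laboratory value.
      cc = df.dropna()
      cc_fit = sm.Logit(cc["event"], sm.add_constant(cc[["albumin", "hematocrit"]])).fit(disp=0)

      # Multiple imputation: draw several imputed datasets, refit the model on each,
      # and average the coefficients (full Rubin's rules would also pool the variances).
      coefs = []
      for seed in range(5):
          imp = IterativeImputer(sample_posterior=True, random_state=seed)
          filled = pd.DataFrame(imp.fit_transform(df), columns=df.columns)
          fit = sm.Logit(df["event"], sm.add_constant(filled[["albumin", "hematocrit"]])).fit(disp=0)
          coefs.append(fit.params)
      pooled = pd.concat(coefs, axis=1).mean(axis=1)
      print("complete case n =", len(cc))
      print(cc_fit.params)
      print("pooled MI estimates:")
      print(pooled)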

  5. Diagnosis checking of statistical analysis in RCTs indexed in PubMed.

    PubMed

    Lee, Paul H; Tse, Andy C Y

    2017-11-01

    Statistical analysis is essential for reporting of the results of randomized controlled trials (RCTs), as well as evaluating their effectiveness. However, the validity of a statistical analysis also depends on whether the assumptions of that analysis are valid. To review all RCTs published in journals indexed in PubMed during December 2014 to provide a complete picture of how RCTs handle assumptions of statistical analysis. We reviewed all RCTs published in December 2014 that appeared in journals indexed in PubMed using the Cochrane highly sensitive search strategy. The 2014 impact factors of the journals were used as proxies for their quality. The type of statistical analysis used and whether the assumptions of the analysis were tested were reviewed. In total, 451 papers were included. Of the 278 papers that reported a crude analysis for the primary outcomes, 31 (27·2%) reported whether the outcome was normally distributed. Of the 172 papers that reported an adjusted analysis for the primary outcomes, diagnosis checking was rarely conducted, with only 20%, 8·6% and 7% checked for generalized linear model, Cox proportional hazard model and multilevel model, respectively. Study characteristics (study type, drug trial, funding sources, journal type and endorsement of CONSORT guidelines) were not associated with the reporting of diagnosis checking. The diagnosis of statistical analyses in RCTs published in PubMed-indexed journals was usually absent. Journals should provide guidelines about the reporting of a diagnosis of assumptions. © 2017 Stichting European Society for Clinical Investigation Journal Foundation.
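
    The kind of diagnosis checking this review looked for can be carried out in a few lines; the sketch below (simulated two-arm data, illustrative only) tests normality of the outcome in each arm and equality of variances before choosing between Student's and Welch's t-test.

      # Hedged sketch of assumption checking before a crude two-group comparison.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      control = rng.normal(50, 10, 120)   # simulated outcome, control arm
      treated = rng.normal(54, 10, 120)   # simulated outcome, treatment arm

      # Shapiro-Wilk test of normality in each arm (report it, don't just assume it).
      for name, arm in [("control", control), ("treated", treated)]:
          w, p = stats.shapiro(arm)
          print(f"{name}: Shapiro-Wilk W={w:.3f}, p={p:.3f}")

      # Levene's test for equal variances decides between Student's and Welch's t-test.
      lev_stat, lev_p = stats.levene(control, treated)
      equal_var = lev_p > 0.05
      t, p = stats.ttest_ind(control, treated, equal_var=equal_var)
      print(f"Levene p={lev_p:.3f} -> equal_var={equal_var}; t={t:.2f}, p={p:.4f}")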

  6. Wheelchair User Injuries and Deaths Associated with Motor Vehicle Related Incidents

    DOT National Transportation Integrated Search

    1997-09-01

    National Highway Traffic Safety Administration's National Center for Statistics : and Analysis (NCSA) recently completed a study of data from the Consumer Product : Safety Commission's (CPSC) National Electronic Injury Surveillance System : (NEISS) o...

  7. Progress of statistical analysis in biomedical research through the historical review of the development of the Framingham score.

    PubMed

    Ignjatović, Aleksandra; Stojanović, Miodrag; Milošević, Zoran; Anđelković Apostolović, Marija

    2017-12-02

    The interest in developing risk models in medicine is not only appealing but also associated with many obstacles in different aspects of predictive model development. Initially, the association of one or more biomarkers with a specific outcome was established by statistical significance, but novel and demanding questions required the development of new and more complex statistical techniques. Progress of statistical analysis in biomedical research can best be observed through the history of the Framingham study and the development of the Framingham score. Evaluation of predictive models rests on a combination of several metrics. When logistic regression or Cox proportional hazards regression is used, the calibration test and ROC curve analysis should be mandatory and eliminatory, and the central place should be taken by some newer statistical techniques. In order to obtain complete information about a new marker in the model, there is now a recommendation to use reclassification tables, calculating the net reclassification index and the integrated discrimination improvement. Decision curve analysis is a novel method for evaluating the clinical usefulness of a predictive model. It may be noted that customizing and fine-tuning the Framingham risk score spurred the development of statistical analysis. A clinically applicable predictive model should be a trade-off between all of the abovementioned statistical metrics: a trade-off between calibration and discrimination, accuracy and decision-making, costs and benefits, and quality and quantity of the patient's life.
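
    As a worked illustration of the discrimination and calibration metrics mentioned above, the sketch below fits a logistic risk model to simulated data (not the Framingham cohort) and reports the C-statistic plus a decile-based calibration comparison.

      # Hedged sketch: discrimination (AUC) and calibration of a simple risk model.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score
      from sklearn.calibration import calibration_curve

      rng = np.random.default_rng(2)
      n = 5000
      age = rng.normal(55, 10, n)
      chol = rng.normal(220, 40, n)
      lin = -9 + 0.08 * age + 0.01 * chol
      y = rng.binomial(1, 1 / (1 + np.exp(-lin)))
      X = np.column_stack([age, chol])

      model = LogisticRegression(max_iter=1000).fit(X, y)
      risk = model.predict_proba(X)[:, 1]

      print("C-statistic (AUC):", round(roc_auc_score(y, risk), 3))
      # Calibration: observed event rate vs. mean predicted risk within risk deciles.
      obs, pred = calibration_curve(y, risk, n_bins=10, strategy="quantile")
      for o, p in zip(obs, pred):
          print(f"mean predicted {p:.3f}  observed {o:.3f}")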

  8. A Comparison of Imputation Methods for Bayesian Factor Analysis Models

    ERIC Educational Resources Information Center

    Merkle, Edgar C.

    2011-01-01

    Imputation methods are popular for the handling of missing data in psychology. The methods generally consist of predicting missing data based on observed data, yielding a complete data set that is amiable to standard statistical analyses. In the context of Bayesian factor analysis, this article compares imputation under an unrestricted…

  9. Measuring outcome from vestibular rehabilitation, part II: refinement and validation of a new self-report measure.

    PubMed

    Morris, Anna E; Lutman, Mark E; Yardley, Lucy

    2009-01-01

    A prototype self-report measure of vestibular rehabilitation outcome is described in a previous paper. The objectives of the present work were to identify the most useful items and assess their psychometric properties. Stage 1: One hundred fifty-five participants completed a prototype 36-item Vestibular Rehabilitation Benefit Questionnaire (VRBQ). Statistical analysis demonstrated its subscale structure and identified redundant items. Stage 2: One hundred twenty-four participants completed a refined 22-item VRBQ and three established questionnaires (Dizziness Handicap Inventory, DHI; Vertigo Symptom Scale short form, VSS-sf; Medical Outcomes Study short form 36, SF-36) in a longitudinal study. Statistical analysis revealed four internally consistent subscales of the VRBQ: Dizziness, Anxiety, Motion-Provoked Dizziness, and Quality of Life. Correlations with the DHI, VSS-sf, and SF-36 support the validity of the VRBQ, and effect size estimates suggest that the VRBQ is more responsive than comparable questionnaires. Twenty participants completed the VRBQ twice in a 24-hour period, indicating excellent test-retest reliability. The VRBQ appears to be a concise and psychometrically robust questionnaire that addresses the main aspects of dizziness impact.

  10. Crash analysis, statistics & information notebook 2008

    DOT National Transportation Integrated Search

    2008-01-01

    Traditionally crash data is often presented as single fact sheets highlighting a single factor such as Vehicle Type or Road Type. This document will try to show how the risk factors interrelate to produce a crash. Complete detailed analys...

  11. Assessing the Robustness of Graph Statistics for Network Analysis Under Incomplete Information

    DTIC Science & Technology

    strategy for dismantling these networks based on their network structure. However, these strategies typically assume complete information about the... combat them with missing information. This thesis analyzes the performance of a variety of network statistics in the context of incomplete information by... leveraging simulation to remove nodes and edges from networks and evaluating the effect this missing information has on our ability to accurately

  12. Persistence in STEM: An investigation of the relationship between high school experiences in science and mathematics and college degree completion in STEM fields

    NASA Astrophysics Data System (ADS)

    Maltese, Adam V.

    While the number of Bachelor's degrees awarded annually has nearly tripled over the past 40 years (NSF, 2008), the same cannot be said for degrees in the STEM (science, technology, engineering and mathematics) fields. The Bureau of Labor Statistics projects that by the year 2014 the combination of new positions and retirements will lead to 2 million job openings in STEM (BLS, 2005). Thus, the research questions I sought to answer with this study were: (1)What are the most common enrollment patterns for students who enter into and exit from the STEM pipeline during high school and college? (2) Controlling for differences in student background and early interest in STEM careers, what are the high school science and mathematics classroom experiences that characterize student completion of a college major in STEM? Using data from NELS:88 I analyzed descriptive statistics and completed logistic regressions to gain an understanding of factors related to student persistence in STEM. Approximately 4700 students with transcript records and who participated in all survey rounds were included in the analyses. The results of the descriptive analysis demonstrated that most students who went on to complete majors in STEM completed at least three or four years of STEM courses during high school, and enrolled in advanced high school mathematics and science courses at higher rates. At almost every pipeline checkpoint indicators of the level of coursework and achievement were significant in predicting student completion of a STEM degree. The results also support previous research that showed demographic variables have little effect on persistence once the sample is limited to those who have the intrinsic ability and desire to complete a college degree. The most significant finding is that measures of student interest and engagement in science and mathematics were significant in predicting completion of a STEM degree, above and beyond the effects of course enrollment and performance. A final analysis, which involved the comparison of descriptive statistics for students who switched into and out of the STEM pipeline during high school, suggested that attitudes toward mathematics and science play a major role in choices regarding pipeline persistence.
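
    A minimal sketch of the kind of logistic regression described in this record is shown below; the variable names are invented stand-ins for the NELS:88 measures, the data are simulated, and the coefficients are reported as odds ratios.

      # Hedged sketch: logistic regression predicting STEM degree completion.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(3)
      n = 4700
      df = pd.DataFrame({
          "adv_math": rng.integers(0, 2, n),    # took advanced HS mathematics (0/1)
          "sci_years": rng.integers(1, 5, n),   # years of HS science
          "interest": rng.normal(0, 1, n),      # standardized interest/engagement score
          "female": rng.integers(0, 2, n),
      })
      lin = -2.5 + 0.9 * df.adv_math + 0.3 * df.sci_years + 0.7 * df.interest
      df["stem_degree"] = rng.binomial(1, 1 / (1 + np.exp(-lin)))

      fit = smf.logit("stem_degree ~ adv_math + sci_years + interest + female", data=df).fit(disp=0)
      print(np.exp(fit.params).round(2))   # odds ratios for each predictor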

  13. 76 FR 77768 - Information Collection; Flathead and McKenzie Rivers and McKenzie National Recreational Trail...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-14

    ... completed and validated, the hardcopy questionnaires will be discarded. Data will be imported into SPSS (Statistical Package for the Social Sciences) for analysis. The database will be maintained at the respective...

  14. Statistical strategies for averaging EC50 from multiple dose-response experiments.

    PubMed

    Jiang, Xiaoqi; Kopp-Schneider, Annette

    2015-11-01

    In most dose-response studies, repeated experiments are conducted to determine the EC50 value for a chemical, requiring averaging EC50 estimates from a series of experiments. Two statistical strategies, the mixed-effect modeling and the meta-analysis approach, can be applied to estimate average behavior of EC50 values over all experiments by considering the variabilities within and among experiments. We investigated these two strategies in two common cases of multiple dose-response experiments, in which complete and explicit dose-response relationships are observed (a) in all experiments or (b) only in a subset of experiments. In case (a), the meta-analysis strategy is a simple and robust method to average EC50 estimates. In case (b), all experimental data sets can be first screened using the dose-response screening plot, which allows visualization and comparison of multiple dose-response experimental results. As long as more than three experiments provide information about complete dose-response relationships, the experiments that cover incomplete relationships can be excluded from the meta-analysis strategy of averaging EC50 estimates. If there are only two experiments containing complete dose-response information, the mixed-effects model approach is suggested. We subsequently provided a web application for non-statisticians to implement the proposed meta-analysis strategy of averaging EC50 estimates from multiple dose-response experiments.
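
    The meta-analysis strategy recommended for case (a) reduces, in its simplest fixed-effect form, to inverse-variance weighting of per-experiment estimates on the log scale; the sketch below uses invented EC50 values and standard errors.

      # Hedged sketch: fixed-effect (inverse-variance) averaging of EC50 estimates.
      import numpy as np

      ec50 = np.array([12.1, 9.8, 14.3, 11.0])     # EC50 estimate from each experiment
      se_log = np.array([0.10, 0.15, 0.08, 0.20])  # standard error of log(EC50) per experiment

      w = 1.0 / se_log**2                          # inverse-variance weights
      log_pooled = np.sum(w * np.log(ec50)) / np.sum(w)
      se_pooled = np.sqrt(1.0 / np.sum(w))
      ci = np.exp(log_pooled + np.array([-1.96, 1.96]) * se_pooled)
      print(f"pooled EC50 = {np.exp(log_pooled):.2f}, 95% CI [{ci[0]:.2f}, {ci[1]:.2f}]")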

  15. Effect of completion-time windows in the analysis of health-related quality of life outcomes in cancer patients

    PubMed Central

    Ediebah, D. E.; Coens, C.; Maringwa, J. T.; Quinten, C.; Zikos, E.; Ringash, J.; King, M.; Gotay, C.; Flechtner, H.-H.; Schmucker von Koch, J.; Weis, J.; Smit, E. F.; Köhne, C.-H.; Bottomley, A.

    2013-01-01

    Background We examined if cancer patients' health-related quality of life (HRQoL) scores on the European Organisation for Research and Treatment of Cancer (EORTC) QLQ-C30 are affected by the specific time point, before or during treatment, at which the questionnaire is completed, and whether this could bias the overall treatment comparison analyses. Patients and methods A ‘completion-time window’ variable was created for three closed EORTC randomised controlled trials in lung (non-small cell lung cancer, NSCLC) and colorectal cancer (CRC) to indicate when the QLQ-C30 was completed relative to chemotherapy cycle dates, defined as ‘before’, ‘on’ and ‘after’. HRQoL mean scores were calculated using a linear mixed model. Results Statistically significant differences (P < 0.05) were observed on 6 and 5 scales for ‘on’ and ‘after’ comparisons in the NSCLC and two-group CRC trial, respectively. As for the three-group CRC trial, several statistically significant differences were observed in the ‘before’ to ‘on’ and the ‘on’ to ‘after’ comparisons. For all three trials, including the ‘completion-time window’ variable in the model resulted in a better fit, but no substantial changes in the treatment effects were noted. Conclusions We showed that considering the exact timing of completion within specified windows resulted in statistical and potentially clinically significant differences, but it did not alter the conclusions of treatment comparison in these studies. PMID:22935549
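
    A minimal sketch of the modelling approach described above follows, using simulated scores, a treatment-arm indicator, and a completion-time window factor with a random intercept per patient; all names and effect sizes are invented.

      # Hedged sketch: linear mixed model with a completion-time window covariate.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(4)
      n_pat, n_obs = 200, 4
      pid = np.repeat(np.arange(n_pat), n_obs)
      arm = np.repeat(rng.integers(0, 2, n_pat), n_obs)
      window = rng.choice(["before", "on", "after"], size=n_pat * n_obs)
      patient_effect = np.repeat(rng.normal(0, 8, n_pat), n_obs)
      score = 60 + 5 * arm - 4 * (window == "on") + patient_effect + rng.normal(0, 10, n_pat * n_obs)

      df = pd.DataFrame({"score": score, "arm": arm, "window": window, "pid": pid})
      fit = smf.mixedlm("score ~ arm + C(window, Treatment(reference='before'))",
                        df, groups=df["pid"]).fit()
      print(fit.summary())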

  16. An exploratory analysis of treatment completion and client and organizational factors using hierarchical linear modeling.

    PubMed

    Woodward, Albert; Das, Abhik; Raskin, Ira E; Morgan-Lopez, Antonio A

    2006-11-01

    Data from the Alcohol and Drug Services Study (ADSS) are used to analyze the structure and operation of the substance abuse treatment industry in the United States. Published literature contains little systematic empirical analysis of the interaction between organizational characteristics and treatment outcomes. This paper addresses that deficit. It develops and tests a hierarchical linear model (HLM) to address questions about the empirical relationship between treatment inputs (industry costs, types and use of counseling and medical personnel, diagnosis mix, patient demographics, and the nature and level of services used in substance abuse treatment), and patient outcomes (retention and treatment completion rates). The paper adds to the literature by demonstrating a direct and statistically significant link between treatment completion and the organizational and staffing structure of the treatment setting. Related reimbursement issues, questions for future analysis, and limitations of the ADSS for this analysis are discussed.

  17. Global, Local, and Graphical Person-Fit Analysis Using Person-Response Functions

    ERIC Educational Resources Information Center

    Emons, Wilco H. M.; Sijtsma, Klaas; Meijer, Rob R.

    2005-01-01

    Person-fit statistics test whether the likelihood of a respondent's complete vector of item scores on a test is low given the hypothesized item response theory model. This binary information may be insufficient for diagnosing the cause of a misfitting item-score vector. The authors propose a comprehensive methodology for person-fit analysis in the…
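
    One common person-fit statistic of the kind described here is the standardized log-likelihood lz; the sketch below computes it for a single respondent under a Rasch model, with the item difficulties and the ability treated as known purely for illustration.

      # Hedged sketch: the lz person-fit statistic for one item-score vector.
      import numpy as np

      theta = 0.5                                      # respondent ability (assumed known)
      b = np.array([-1.5, -0.5, 0.0, 0.5, 1.0, 2.0])   # item difficulties
      u = np.array([1, 1, 0, 1, 0, 1])                 # observed item scores (0/1)

      p = 1 / (1 + np.exp(-(theta - b)))               # Rasch probabilities of a correct response
      l0 = np.sum(u * np.log(p) + (1 - u) * np.log(1 - p))      # observed log-likelihood
      e_l0 = np.sum(p * np.log(p) + (1 - p) * np.log(1 - p))    # its expectation
      var_l0 = np.sum(p * (1 - p) * np.log(p / (1 - p)) ** 2)   # and variance
      lz = (l0 - e_l0) / np.sqrt(var_l0)
      print(f"lz = {lz:.2f}")   # large negative values flag a potentially misfitting vector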

  18. Regression Analysis as a Cost Estimation Model for Unexploded Ordnance Cleanup at Former Military Installations

    DTIC Science & Technology

    2002-06-01

    fits our actual data. To determine the goodness of fit, statisticians typically use the following four measures: R2 Statistic. The R2 statistic... reviewing instruction, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of... mathematical model is developed to better estimate cleanup costs using historical cost data that could be used by the Defense Department prior to placing

  19. Women at Sea: Welcome Aboard.

    DTIC Science & Technology

    1983-03-01

    should be familiar with typical military/navy terms, and elementary statistical tests (T-test, Chi Square, and One-Way Analysis of Variance). The... and the media. One theory is that the gradual internalization or acceptance of values and ideals (which is influenced by the individual's class, family... completed with a comparison of the two. A similar format is followed for the commanding officer's data. Three sets of statistical tests were done on the

  20. A retrospective analysis of hyperthermic intraperitoneal chemotherapy for gastric cancer with peritoneal metastasis

    PubMed Central

    Yuan, Meiqin; Wang, Zeng; Hu, Guinv; Yang, Yunshan; Lv, Wangxia; Lu, Fangxiao; Zhong, Haijun

    2016-01-01

    Peritoneal metastasis (PM) is a poor prognostic factor in patients with gastric cancer. The aim of this study was to evaluate the efficacy and safety of hyperthermic intraperitoneal chemotherapy (HIPEC) in patients with advanced gastric cancer with PM by retrospective analysis. A total of 54 gastric cancer patients with positive ascitic fluid cytology were included in this study: 23 patients were treated with systemic chemotherapy combined with HIPEC (HIPEC+ group) and 31 received systemic chemotherapy alone (HIPEC- group). The patients were divided into 4 categories according to the changes of ascites, namely disappear, decrease, stable and increase. The disappear + decrease rate in the HIPEC+ group was 82.60%, which was statistically significantly superior to that of the HIPEC- group (54.80%). The disappear + decrease + stable rate was 95.70% in the HIPEC+ group and 74.20% in the HIPEC- group, but the difference was not statistically significant. In 33 patients with complete survival data, including 12 from the HIPEC+ and 21 from the HIPEC- group, the median progression-free survival was 164 and 129 days, respectively, and the median overall survival (OS) was 494 and 223 days, respectively. In patients with ascites disappear/decrease/stable, the OS appeared to be better compared with that in patients with ascites increase, but the difference was not statistically significant. Further analysis revealed that patients with controlled disease (complete response + partial response + stable disease) may have a better OS compared with patients with progressive disease, with a statistically significant difference. The toxicities were well tolerated in both groups. Therefore, HIPEC was found to improve survival in advanced gastric cancer patients with PM, but the difference was not statistically significant, which may be attributed to the small number of cases. Further studies with larger samples are required to confirm our data. PMID:27446587

  1. Completeness of birth and death registration in a rural area of South Africa: the Agincourt health and demographic surveillance, 1992–2014

    PubMed Central

    Garenne, Michel; Collinson, Mark A.; Kabudula, Chodziwadziwa W.; Gómez-Olivé, F. Xavier; Kahn, Kathleen; Tollman, Stephen

    2016-01-01

    Background Completeness of vital registration remains very low in sub-Saharan Africa, especially in rural areas. Objectives To investigate trends and factors in completeness of birth and death registration in Agincourt, a rural area of South Africa covering a population of about 110,000 persons, under demographic surveillance since 1992. The population belongs to the Shangaan ethnic group and hosts a sizeable community of Mozambican refugees. Design Statistical analysis of birth and death registration over time in a 22-year perspective (1992–2014). Over this period, major efforts were made by the government of South Africa to improve vital registration. Factors associated with completeness of registration were investigated using univariate and multivariate analysis. Results Birth registration was very incomplete at onset (7.8% in 1992) and reached high values at end point (90.5% in 2014). Likewise, death registration was low at onset (51.4% in 1992), also reaching high values at end point (97.1% in 2014). For births, the main factors were mother's age (much lower completeness among births to adolescent mothers), refugee status, and household wealth. For deaths, the major factors were age at death (lower completeness among under-five children), refugee status, and household wealth. Completeness increased for all demographic and socioeconomic categories studied and is likely to approach 100% in the future if trends continue at this speed. Conclusion Reaching high values in the completeness of birth and death registration was achieved by excellent organization of the civil registration and vital statistics, a variety of financial incentives, strong involvement of health personnel, and wide-scale information and advocacy campaigns by the South African government. PMID:27782873

  2. Analysis of spectrum utilization and message length statistics for the railroad land mobile radio service

    DOT National Transportation Integrated Search

    1997-11-01

    In support of the Federal Railroad Administration of the United States Department of Transportation, the Institute for Telecommunication Sciences (ITS) has completed a field spectrum utilization survey designed to examine individual channel utilizat...

  3. Guidelines for the design and statistical analysis of experiments in papers submitted to ATLA.

    PubMed

    Festing, M F

    2001-01-01

    In vitro experiments need to be well designed and correctly analysed if they are to achieve their full potential to replace the use of animals in research. An "experiment" is a procedure for collecting scientific data in order to answer a hypothesis, or to provide material for generating new hypotheses, and differs from a survey because the scientist has control over the treatments that can be applied. Most experiments can be classified into one of a few formal designs, the most common being completely randomised, and randomised block designs. These are quite common with in vitro experiments, which are often replicated in time. Some experiments involve a single independent (treatment) variable, while other "factorial" designs simultaneously vary two or more independent variables, such as drug treatment and cell line. Factorial designs often provide additional information at little extra cost. Experiments need to be carefully planned to avoid bias, be powerful yet simple, provide for a valid statistical analysis and, in some cases, have a wide range of applicability. Virtually all experiments need some sort of statistical analysis in order to take account of biological variation among the experimental subjects. Parametric methods using the t test or analysis of variance are usually more powerful than non-parametric methods, provided the underlying assumptions of normality of the residuals and equal variances are approximately valid. The statistical analyses of data from a completely randomised design, and from a randomised-block design are demonstrated in Appendices 1 and 2, and methods of determining sample size are discussed in Appendix 3. Appendix 4 gives a checklist for authors submitting papers to ATLA.
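
    As an illustration of the analysis of a completely randomised in vitro design mentioned in these guidelines, the sketch below runs a one-way ANOVA on simulated viability data and shows a non-parametric fallback; the group sizes and values are invented.

      # Hedged sketch: one-way ANOVA for a completely randomised design.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      control = rng.normal(100, 8, 6)     # % viability, 6 replicate wells per group
      low_dose = rng.normal(92, 8, 6)
      high_dose = rng.normal(75, 8, 6)

      f, p = stats.f_oneway(control, low_dose, high_dose)
      print(f"one-way ANOVA: F = {f:.2f}, p = {p:.4f}")

      # If the normality/equal-variance assumptions look doubtful, a non-parametric
      # alternative is the Kruskal-Wallis test.
      h, p_kw = stats.kruskal(control, low_dose, high_dose)
      print(f"Kruskal-Wallis: H = {h:.2f}, p = {p_kw:.4f}")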

  4. A Monte Carlo investigation of thrust imbalance of solid rocket motor pairs

    NASA Technical Reports Server (NTRS)

    Sforzini, R. H.; Foster, W. A., Jr.; Johnson, J. S., Jr.

    1974-01-01

    A technique is described for theoretical, statistical evaluation of the thrust imbalance of pairs of solid-propellant rocket motors (SRMs) firing in parallel. Sets of the significant variables, determined as a part of the research, are selected using a random sampling technique and the imbalance calculated for a large number of motor pairs. The performance model is upgraded to include the effects of statistical variations in the ovality and alignment of the motor case and mandrel. Effects of cross-correlations of variables are minimized by selecting for the most part completely independent input variables, over forty in number. The imbalance is evaluated in terms of six time-varying parameters as well as eleven single-valued ones which themselves are subject to statistical analysis. A sample study of the thrust imbalance of 50 pairs of 146 in. dia. SRMs of the type to be used on the space shuttle is presented. The FORTRAN IV computer program of the analysis and complete instructions for its use are included. Performance computation time for one pair of SRMs is approximately 35 seconds on the IBM 370/155 using the FORTRAN H compiler.

  5. Complete scanpaths analysis toolbox.

    PubMed

    Augustyniak, Piotr; Mikrut, Zbigniew

    2006-01-01

    This paper presents a complete open software environment for control, data processing and assessment of visual experiments. Visual experiments are widely used in research on human perception physiology and the results are applicable to various visual information-based man-machine interfacing, human-emulated automatic visual systems or scanpath-based learning of perceptual habits. The toolbox is designed for the Matlab platform and supports an infra-red reflection-based eyetracker in calibration and scanpath analysis modes. Toolbox procedures are organized in three layers: the lower one, communicating with the eyetracker output file; the middle one, detecting scanpath events on a physiological background; and the upper one, consisting of experiment schedule scripts, statistics and summaries. Several examples of visual experiments carried out using the presented toolbox complete the paper.

  6. Job Stress among Hispanic Professionals

    ERIC Educational Resources Information Center

    Rodriguez-Calcagno, Maria; Brewer, Ernest W.

    2005-01-01

    This study explores job stress among a random sample of 219 Hispanic professionals. Participants complete the Job Stress Survey by Spielberger and Vagg and a demographic questionnaire. Responses are analyzed using descriptive statistics, a factorial analysis of variance, and coefficients of determination. Results indicate that Hispanic…

  7. Completely automated modal analysis procedure based on the combination of different OMA methods

    NASA Astrophysics Data System (ADS)

    Ripamonti, Francesco; Bussini, Alberto; Resta, Ferruccio

    2018-03-01

    In this work a completely automated output-only Modal Analysis procedure is presented and all its benefits are listed. Based on the merging of different Operational Modal Analysis methods and a statistical approach, the identification process has been improved becoming more robust and giving as results only the real natural frequencies, damping ratios and mode shapes of the system. The effect of the temperature can be taken into account as well, leading to the creation of a better tool for automated Structural Health Monitoring. The algorithm has been developed and tested on a numerical model of a scaled three-story steel building present in the laboratories of Politecnico di Milano.

  8. Statistical properties of the radiation from SASE FEL operating in the linear regime

    NASA Astrophysics Data System (ADS)

    Saldin, E. L.; Schneidmiller, E. A.; Yurkov, M. V.

    1998-02-01

    The paper presents a comprehensive analysis of the statistical properties of the radiation from a self-amplified spontaneous emission (SASE) free electron laser operating in the linear regime. The investigation has been performed in a one-dimensional approximation, assuming the electron pulse length to be much larger than the coherence length of the radiation. The following statistical properties of the SASE FEL radiation have been studied: field correlations, the distribution of the radiation energy after a monochromator installed at the FEL amplifier exit, and the photoelectric counting statistics of SASE FEL radiation. It is shown that the radiation from a SASE FEL operating in the linear regime possesses all the features corresponding to completely chaotic polarized radiation.

  9. Expanding the enablement framework and testing an evaluative instrument for diabetes patient education.

    PubMed

    Leeseberg Stamler, L; Cole, M M; Patrick, L J

    2001-08-01

    Strategies to delay or prevent complications from diabetes include diabetes patient education. Diabetes educators seek to provide education that meets the needs of clients and influences positive health outcomes. (1) To expand prior research exploring an enablement framework for patient education by examining perceptions of patient education by persons with diabetes and (2) to test the mastery of stress instrument (MSI) as a potential evaluative instrument for patient education. Triangulated data collection with a convenience sample of adults taking diabetes education classes. Half the sample completed audio-taped semi-structured interviews pre-, during and post-education, and all completed the MSI post-education. Qualitative data were analysed using latent content analysis, and descriptive statistics were completed. Qualitative analysis revealed content categories similar to previous work with prenatal participants, supporting the enablement framework. Statistical analyses noted congruence with psychometric findings from the development of the MSI; secondary qualitative analyses revealed congruency between MSI scores and patient perceptions. Mastery is an outcome congruent with the enablement framework for patient education across content areas. The mastery of stress instrument may be an instrument for the identification of patients who are coping well with diabetes self-management, as well as those who are not and who require further nursing interventions.

  10. Prevalence, cause, and location of palatal fistula in operated complete unilateral cleft lip and palate: retrospective study.

    PubMed

    de Agostino Biella Passos, Vivian; de Carvalho Carrara, Cleide Felício; da Silva Dalben, Gisele; Costa, Beatriz; Gomide, Marcia Ribeiro

    2014-03-01

    To evaluate the prevalence of fistulas after palate repair and analyze their location and association with possible causal factors. Retrospective analysis of patient records and evaluation of preoperative initial photographs. Tertiary craniofacial center. Five hundred eighty-nine individuals with complete unilateral cleft lip and palate who underwent palate repair at the age of 12 to 36 months by the von Langenbeck technique, in a single stage, by the plastic surgery team of the hospital, from January 2003 to July 2007. The cleft width was visually classified by a single examiner as narrow, regular, or wide. The following regions of the palate were considered for the location: anterior, medium, transition (between hard and soft palate), and soft palate. Descriptive statistics and analysis of association between the occurrence of fistula and the different parameters were evaluated. Palatal fistulas were observed in 27% of the sample, with a greater proportion in the anterior region (37.11%). The chi-square statistical test revealed a statistically significant association (P ≤ .05) between the fistulas and initial cleft width (P = .0003), intraoperative problems (P = .0037), and postoperative problems (P = .00002). The prevalence of palatal fistula was similar to mean values reported in the literature. Analysis of causal factors showed a positive association of palatal fistulas with wide and regular initial cleft widths and with intraoperative and postoperative problems. The anterior region presented the greatest occurrence of fistulas.
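
    The chi-square test of association used in this record can be reproduced on a contingency table in a few lines; the counts below are invented for illustration, not the study's data.

      # Hedged sketch: chi-square test of association for fistula by initial cleft width.
      import numpy as np
      from scipy.stats import chi2_contingency

      #                 narrow  regular  wide
      table = np.array([[  10,     60,    89],    # fistula present
                        [  90,    200,   140]])   # fistula absent

      chi2, p, dof, expected = chi2_contingency(table)
      print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.4f}")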

  11. A powerful approach for association analysis incorporating imprinting effects

    PubMed Central

    Xia, Fan; Zhou, Ji-Yuan; Fung, Wing Kam

    2011-01-01

    Motivation: For a diallelic marker locus, the transmission disequilibrium test (TDT) is a simple and powerful design for genetic studies. The TDT was originally proposed for use in families with both parents available (complete nuclear families) and has further been extended to 1-TDT for use in families with only one of the parents available (incomplete nuclear families). Currently, the increasing interest of the influence of parental imprinting on heritability indicates the importance of incorporating imprinting effects into the mapping of association variants. Results: In this article, we extend the TDT-type statistics to incorporate imprinting effects and develop a series of new test statistics in a general two-stage framework for association studies. Our test statistics enjoy the nature of family-based designs that need no assumption of Hardy–Weinberg equilibrium. Also, the proposed methods accommodate complete and incomplete nuclear families with one or more affected children. In the simulation study, we verify the validity of the proposed test statistics under various scenarios, and compare the powers of the proposed statistics with some existing test statistics. It is shown that our methods greatly improve the power for detecting association in the presence of imprinting effects. We further demonstrate the advantage of our methods by the application of the proposed test statistics to a rheumatoid arthritis dataset. Contact: wingfung@hku.hk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:21798962

  12. A powerful approach for association analysis incorporating imprinting effects.

    PubMed

    Xia, Fan; Zhou, Ji-Yuan; Fung, Wing Kam

    2011-09-15

    For a diallelic marker locus, the transmission disequilibrium test (TDT) is a simple and powerful design for genetic studies. The TDT was originally proposed for use in families with both parents available (complete nuclear families) and has further been extended to 1-TDT for use in families with only one of the parents available (incomplete nuclear families). Currently, the increasing interest of the influence of parental imprinting on heritability indicates the importance of incorporating imprinting effects into the mapping of association variants. In this article, we extend the TDT-type statistics to incorporate imprinting effects and develop a series of new test statistics in a general two-stage framework for association studies. Our test statistics enjoy the nature of family-based designs that need no assumption of Hardy-Weinberg equilibrium. Also, the proposed methods accommodate complete and incomplete nuclear families with one or more affected children. In the simulation study, we verify the validity of the proposed test statistics under various scenarios, and compare the powers of the proposed statistics with some existing test statistics. It is shown that our methods greatly improve the power for detecting association in the presence of imprinting effects. We further demonstrate the advantage of our methods by the application of the proposed test statistics to a rheumatoid arthritis dataset. wingfung@hku.hk Supplementary data are available at Bioinformatics online.
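
    For reference, the classic TDT on which these extended statistics build compares transmissions and non-transmissions of the candidate allele from heterozygous parents of affected children; the sketch below uses illustrative counts, not the imprinting-aware statistics proposed in this record.

      # Hedged sketch: the classic transmission disequilibrium test (McNemar-type).
      from scipy.stats import chi2

      b, c = 78, 46                     # transmitted vs. untransmitted counts of the allele
      tdt = (b - c) ** 2 / (b + c)      # compared with chi-square on 1 degree of freedom
      p = chi2.sf(tdt, df=1)
      print(f"TDT chi-square = {tdt:.2f}, p = {p:.4f}")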

  13. A Longitudinal Analysis of the Influence of a Peer Run Warm Line Phone Service on Psychiatric Recovery.

    PubMed

    Dalgin, Rebecca Spirito; Dalgin, M Halim; Metzger, Scott J

    2018-05-01

    This article focuses on the impact of a peer run warm line as part of the psychiatric recovery process. It utilized data including the Recovery Assessment Scale, community integration measures and crisis service usage. Longitudinal statistical analysis was completed on 48 sets of data from 2011, 2012, and 2013. Although no statistically significant differences were observed for the RAS score, community integration data showed increases in visits to primary care doctors, leisure/recreation activities and socialization with others. This study highlights the complexity of psychiatric recovery and that nonclinical peer services like peer run warm lines may be critical to the process.

  14. Knowledge and utilization of computer-software for statistics among Nigerian dentists.

    PubMed

    Chukwuneke, F N; Anyanechi, C E; Obiakor, A O; Amobi, O; Onyejiaka, N; Alamba, I

    2013-01-01

    The use of computer software for statistical analysis has transformed health information and data into their simplest form in the areas of access, storage, retrieval and analysis in the field of research. This survey therefore was carried out to assess the level of knowledge and utilization of computer software for statistical analysis among dental researchers in eastern Nigeria. Questionnaires on the use of computer software for statistical analysis were randomly distributed to 65 practicing dental surgeons with more than 5 years' experience in the tertiary academic hospitals in eastern Nigeria. The focus was on: years of clinical experience; research work experience; and knowledge and application of computer software for data processing and statistical analysis. Sixty-two (62/65; 95.4%) of these questionnaires were returned anonymously and were used in our data analysis. Twenty-nine (29/62; 46.8%) respondents had 5-10 years of clinical experience, and none of them had completed the specialist training programme. Thirty-three (33/62; 53.2%) practitioners had more than 10 years of clinical experience, of whom 15 (15/33; 45.5%) were specialists, representing 24.2% (15/62) of the total number of respondents. All 15 specialists were actively involved in research activities, and only five (5/15; 33.3%) could carry out software-based statistical analysis unaided. This study has identified poor utilization of computer software for statistical analysis among dental researchers in eastern Nigeria. This is strongly associated with a lack of early exposure to such software, especially during undergraduate training. This calls for the introduction of a computer training programme into the dental curriculum so that practitioners develop the habit of using computer software in their research.

  15. mapDIA: Preprocessing and statistical analysis of quantitative proteomics data from data independent acquisition mass spectrometry.

    PubMed

    Teo, Guoshou; Kim, Sinae; Tsou, Chih-Chiang; Collins, Ben; Gingras, Anne-Claude; Nesvizhskii, Alexey I; Choi, Hyungwon

    2015-11-03

    Data independent acquisition (DIA) mass spectrometry is an emerging technique that offers more complete detection and quantification of peptides and proteins across multiple samples. DIA allows fragment-level quantification, which can be considered as repeated measurements of the abundance of the corresponding peptides and proteins in the downstream statistical analysis. However, few statistical approaches are available for aggregating these complex fragment-level data into peptide- or protein-level statistical summaries. In this work, we describe a software package, mapDIA, for statistical analysis of differential protein expression using DIA fragment-level intensities. The workflow consists of three major steps: intensity normalization, peptide/fragment selection, and statistical analysis. First, mapDIA offers normalization of fragment-level intensities by total intensity sums as well as a novel alternative normalization by local intensity sums in retention time space. Second, mapDIA removes outlier observations and selects peptides/fragments that preserve the major quantitative patterns across all samples for each protein. Last, using the selected fragments and peptides, mapDIA performs model-based statistical significance analysis of protein-level differential expression between specified groups of samples. Using a comprehensive set of simulation datasets, we show that mapDIA detects differentially expressed proteins with accurate control of the false discovery rates. We also describe the analysis procedure in detail using two recently published DIA datasets generated for the 14-3-3β dynamic interaction network and the prostate cancer glycoproteome. The software was written in C++ and the source code is available for free through the SourceForge website http://sourceforge.net/projects/mapdia/. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.
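
    The first mapDIA step described above, normalization by total intensity sums, amounts to rescaling each sample to a common total intensity; a minimal sketch with invented intensities follows (mapDIA itself is a C++ package and also offers retention-time-local normalization, which is not shown here).

      # Hedged sketch: total-intensity-sum normalization across samples.
      import numpy as np

      # rows = fragment ions, columns = samples (invented intensities)
      x = np.array([[1.0e5, 1.6e5, 0.9e5],
                    [4.0e4, 6.5e4, 3.8e4],
                    [2.2e5, 3.0e5, 2.1e5]])

      totals = x.sum(axis=0)
      target = totals.mean()              # common scale shared by all samples
      x_norm = x * (target / totals)      # each column now sums to `target`
      print(x_norm.sum(axis=0))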

  16. Motor Vehicle Crashes as a Leading Cause of Death in 1994

    DOT National Transportation Integrated Search

    1998-03-01

    National Highway Traffic Safety Administration's National Center for Statistics : and Analysis (NCSA) recently completed a study of data on the causes of death : for all persons, by age and sex, which occurred in the U. S. in 1994. The : purpose of t...

  17. Graduate Programs in Education: Impact on Teachers' Careers

    ERIC Educational Resources Information Center

    Tucker, Janice; Fushell, Marian

    2013-01-01

    This paper examined teachers' decisions to pursue graduate programs and their career choices following completion of their studies. Based on document analysis and statistical examination of teacher questionnaire responses, this study determined that teachers choose graduate studies for different reasons, their program choice influences future…

  18. Wrestling with Philosophy: Improving Scholarship in Higher Education

    ERIC Educational Resources Information Center

    Kezar, Adrianna

    2004-01-01

    Method is usually viewed as completely separate from philosophy or theory, focusing instead on techniques and procedures of interviewing, focus groups, observation, or statistical analysis. Several texts on methodology published recently have added significant sections on philosophy, such as Creswell's (1998) Qualitative inquiry and research…

  19. Evaluation of psychiatric and genetic risk factors among primary relatives of suicide completers in Delhi NCR region, India.

    PubMed

    Pasi, Shivani; Singh, Piyoosh Kumar; Pandey, Rajeev Kumar; Dikshit, P C; Jiloha, R C; Rao, V R

    2015-10-30

    Suicide as a public health problem is studied worldwide, and the association of psychiatric and genetic risk factors with suicidal behavior is a point of discussion in studies across different ethnic groups. The present study is aimed at evaluating psychiatric and genetic traits among primary relatives of suicide completer families in an urban Indian population. Bi-variate analysis shows a significant increase in major depression (PHQ and Hamilton), stress, panic disorder, somatoform disorder and suicide attempt among primary compared with other relatives. Sib pair correlations also reveal significant results for major depression (Hamilton), stress, suicide attempt, intensity of suicide ideation and other anxiety syndrome. 5-HTTLPR, 5-HTT (Stin2) and COMT risk alleles are higher among primary relatives, though statistically insignificant. Backward conditional logistic regression analysis showed that only one independent variable, Depression (Hamilton), made a unique statistically significant contribution to the model in primary relatives. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  20. Review of Statistical Learning Methods in Integrated Omics Studies (An Integrated Information Science).

    PubMed

    Zeng, Irene Sui Lan; Lumley, Thomas

    2018-01-01

    Integrated omics is becoming a new channel for investigating the complex molecular system in modern biological science and sets a foundation for systematic learning for precision medicine. The statistical/machine learning methods that have emerged in the past decade for integrated omics are not only innovative but also multidisciplinary with integrated knowledge in biology, medicine, statistics, machine learning, and artificial intelligence. Here, we review the nontrivial classes of learning methods from the statistical aspects and streamline these learning methods within the statistical learning framework. The intriguing findings from the review are that the methods used are generalizable to other disciplines with complex systematic structure, and the integrated omics is part of an integrated information science which has collated and integrated different types of information for inferences and decision making. We review the statistical learning methods of exploratory and supervised learning from 42 publications. We also discuss the strengths and limitations of the extended principal component analysis, cluster analysis, network analysis, and regression methods. Statistical techniques such as penalization for sparsity induction when there are fewer observations than the number of features and using a Bayesian approach when there is prior knowledge to be integrated are also included in the commentary. For the completeness of the review, a table of currently available software and packages from 23 publications for omics is summarized in the appendix.

  1. Bayesian Sensitivity Analysis of Statistical Models with Missing Data

    PubMed Central

    ZHU, HONGTU; IBRAHIM, JOSEPH G.; TANG, NIANSHENG

    2013-01-01

    Methods for handling missing data depend strongly on the mechanism that generated the missing values, such as missing completely at random (MCAR) or missing at random (MAR), as well as other distributional and modeling assumptions at various stages. It is well known that the resulting estimates and tests may be sensitive to these assumptions as well as to outlying observations. In this paper, we introduce various perturbations to modeling assumptions and individual observations, and then develop a formal sensitivity analysis to assess these perturbations in the Bayesian analysis of statistical models with missing data. We develop a geometric framework, called the Bayesian perturbation manifold, to characterize the intrinsic structure of these perturbations. We propose several intrinsic influence measures to perform sensitivity analysis and quantify the effect of various perturbations to statistical models. We use the proposed sensitivity analysis procedure to systematically investigate the tenability of the non-ignorable missing at random (NMAR) assumption. Simulation studies are conducted to evaluate our methods, and a dataset is analyzed to illustrate the use of our diagnostic measures. PMID:24753718

  2. Archival Legacy Investigations of Circumstellar Environments (ALICE): Statistical assessment of point source detections

    NASA Astrophysics Data System (ADS)

    Choquet, Élodie; Pueyo, Laurent; Soummer, Rémi; Perrin, Marshall D.; Hagan, J. Brendan; Gofas-Salas, Elena; Rajan, Abhijith; Aguilar, Jonathan

    2015-09-01

    The ALICE program, for Archival Legacy Investigation of Circumstellar Environment, is currently conducting a virtual survey of about 400 stars, by re-analyzing the HST-NICMOS coronagraphic archive with advanced post-processing techniques. We present here the strategy that we adopted to identify detections and potential candidates for follow-up observations, and we give a preliminary overview of our detections. We present a statistical analysis conducted to evaluate the confidence level of these detections and the completeness of our candidate search.

  3. Interferon for the treatment of genital warts: a systematic review

    PubMed Central

    2009-01-01

    Background Interferon has been widely used in the treatment of genital warts for its immunomodulatory, antiproliferative and antiviral properties. Currently, no evidence that interferon improves the complete response rate or reduces the recurrence rate of genital warts has been generally provided. The aim of this review is to assess, from randomized control trials (RCTs), the efficacy and safety of interferon in curing genital warts. Methods We searched Cochrane Sexually Transmitted Diseases Group's Trials Register (January, 2009), Cochrane Central Register of Controlled Trials (2009, issue 1), PubMed (1950-2009), EMBASE (1974-2009), Chinese Biomedical Literature Database (CBM) (1975-2009), China National Knowledge Infrastructure (CNKI) (1979-2009), VIP database (1989-2009), as well as reference lists of relevant studies. Two reviewers independently screened searched studies, extracted data and evaluated their methodological qualities. RevMan 4.2.8 software was used for meta-analysis. Results 12 RCTs involving 1445 people were included. Among them, 7 studies demonstrated the complete response rate of locally-used interferon as compared to placebo for treating genital warts. Based on meta-analysis, the rate of Complete response of the two interventions differed significantly (locally-used interferon:44.4%; placebo:16.1%). The difference between the two groups had statistical significance (RR 2.68, 95% CI 1.79 to 4.02, P < 0.00001). 5 studies demonstrated the complete response rate of systemically-used interferon as compared to placebo for treating genital warts. Based on meta-analysis, the rate of Complete response of the two interventions had no perceivable discrepancy (systemically-used interferon:27.4%; placebo:26.4%). The difference between the two groups had no statistical significance (RR 1.25, 95% CI 0.80 to 1.95, P > 0.05). 7 studies demonstrated the recurrence rate of interferon as compared to placebo for treating genital warts. Based on meta-analysis, the recurrence rate of the two interventions had no perceivable discrepancy (interferon 21.1%; placebo: 34.2%). The difference between the two groups had no statistical significance (RR 0.56, 95% CI 0.27 to 1.18, P > 0.05). However, subgroup analysis showed that HPV-infected patients with locally administered interferon were less likely than those given placebo to relapse, but that no significant difference in relapse rates was observed between systemic and placebo. The reported adverse events of interferon were mostly mild and transient, which could be well tolerated. Conclusion Interferon tends to be a fairly well-tolerated form of therapy. According to different routes of administration, locally-used interferon appears to be much more effective than both systemically-used interferon and placebo in either improving the complete response rate or reducing the recurrence rate for the treatment of genital warts. PMID:19772554

  4. Interferon for the treatment of genital warts: a systematic review.

    PubMed

    Yang, Jin; Pu, Yu-Guo; Zeng, Zhong-Ming; Yu, Zhi-Jian; Huang, Na; Deng, Qi-Wen

    2009-09-21

    Interferon has been widely used in the treatment of genital warts for its immunomodulatory, antiproliferative and antiviral properties. To date, however, no conclusive evidence has been provided that interferon improves the complete response rate or reduces the recurrence rate of genital warts. The aim of this review is to assess, from randomized controlled trials (RCTs), the efficacy and safety of interferon in curing genital warts. We searched the Cochrane Sexually Transmitted Diseases Group's Trials Register (January 2009), the Cochrane Central Register of Controlled Trials (2009, issue 1), PubMed (1950-2009), EMBASE (1974-2009), the Chinese Biomedical Literature Database (CBM) (1975-2009), the China National Knowledge Infrastructure (CNKI) (1979-2009), the VIP database (1989-2009), as well as reference lists of relevant studies. Two reviewers independently screened the retrieved studies, extracted data and evaluated their methodological quality. RevMan 4.2.8 software was used for meta-analysis. 12 RCTs involving 1445 people were included. Among them, 7 studies compared the complete response rate of locally used interferon with placebo for treating genital warts. Based on meta-analysis, the complete response rates of the two interventions differed significantly (locally used interferon: 44.4%; placebo: 16.1%), and the difference was statistically significant (RR 2.68, 95% CI 1.79 to 4.02, P < 0.00001). 5 studies compared the complete response rate of systemically used interferon with placebo. Based on meta-analysis, the complete response rates of the two interventions showed no perceivable discrepancy (systemically used interferon: 27.4%; placebo: 26.4%), and the difference was not statistically significant (RR 1.25, 95% CI 0.80 to 1.95, P > 0.05). 7 studies compared the recurrence rate of interferon with placebo. Based on meta-analysis, the recurrence rates of the two interventions showed no perceivable discrepancy (interferon: 21.1%; placebo: 34.2%), and the difference was not statistically significant (RR 0.56, 95% CI 0.27 to 1.18, P > 0.05). However, subgroup analysis showed that HPV-infected patients given locally administered interferon were less likely than those given placebo to relapse, whereas no significant difference in relapse rates was observed between systemic interferon and placebo. The reported adverse events of interferon were mostly mild and transient and were well tolerated. Interferon tends to be a fairly well-tolerated form of therapy. With respect to route of administration, locally used interferon appears to be much more effective than both systemically used interferon and placebo in improving the complete response rate and reducing the recurrence rate in the treatment of genital warts.

  5. Research Analysis on MOOC Course Dropout and Retention Rates

    ERIC Educational Resources Information Center

    Gomez-Zermeno, Marcela Gerogina; Aleman de La Garza, Lorena

    2016-01-01

    This research's objective was to identify the terminal efficiency of the Massive Online Open Course "Educational Innovation with Open Resources" offered by a Mexican private university. A quantitative methodology was used, combining descriptive statistics and probabilistic models to analyze the levels of retention, completion, and…

  6. Network Analysis Tools: from biological networks to clusters and pathways.

    PubMed

    Brohée, Sylvain; Faust, Karoline; Lima-Mendez, Gipsi; Vanderstocken, Gilles; van Helden, Jacques

    2008-01-01

    Network Analysis Tools (NeAT) is a suite of computer tools that integrate various algorithms for the analysis of biological networks: comparison between graphs, between clusters, or between graphs and clusters; network randomization; analysis of degree distribution; network-based clustering and path finding. The tools are interconnected to enable a stepwise analysis of the network through a complete analytical workflow. In this protocol, we present a typical case of utilization, where the tasks above are combined to decipher a protein-protein interaction network retrieved from the STRING database. The results returned by NeAT are typically subnetworks, networks enriched with additional information (i.e., clusters or paths) or tables displaying statistics. Typical networks comprising several thousands of nodes and arcs can be analyzed within a few minutes. The complete protocol can be read and executed in approximately 1 h.
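
    NeAT itself is a web-based suite, so the following is only a loose Python analogue (using the networkx library) of two steps the protocol describes, a degree distribution and path finding, on a toy interaction graph with made-up node names.

        import networkx as nx
        from collections import Counter

        # Toy interaction graph with made-up node names (not STRING data)
        g = nx.Graph()
        g.add_edges_from([("A", "B"), ("B", "C"), ("C", "D"), ("B", "D"), ("D", "E")])

        # Degree distribution of the network
        print("degree distribution:", dict(Counter(dict(g.degree()).values())))

        # Path finding between two nodes of interest
        print("shortest path A->E:", nx.shortest_path(g, "A", "E"))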

  7. A guide to missing data for the pediatric nephrologist.

    PubMed

    Larkins, Nicholas G; Craig, Jonathan C; Teixeira-Pinto, Armando

    2018-03-13

    Missing data is an important and common source of bias in clinical research. Readers should be alert to and consider the impact of missing data when reading studies. Beyond preventing missing data in the first place, through good study design and conduct, there are different strategies available to handle data containing missing observations. Complete case analysis is often biased unless data are missing completely at random. Better methods of handling missing data include multiple imputation and models using likelihood-based estimation. With advancing computing power and modern statistical software, these methods are within the reach of clinician-researchers under guidance of a biostatistician. As clinicians reading papers, we need to continue to update our understanding of statistical methods, so that we understand the limitations of these techniques and can critically interpret literature.

  8. Gene Identification Algorithms Using Exploratory Statistical Analysis of Periodicity

    NASA Astrophysics Data System (ADS)

    Mukherjee, Shashi Bajaj; Sen, Pradip Kumar

    2010-10-01

    Studying periodic patterns is a natural line of attack for recognizing DNA sequences in gene identification and similar problems, yet surprisingly little significant work has been done in this direction. This paper studies statistical properties of DNA sequences of a complete genome using a new technique. A DNA sequence is converted to a numeric sequence using various types of mappings, and the standard Fourier technique is applied to study its periodicity. Distinct statistical behaviour of the periodicity parameters is found in coding and non-coding sequences, which can be used to distinguish between these parts. Here, DNA sequences of Drosophila melanogaster were analyzed with significant accuracy.
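
    The general approach described, mapping bases to numbers and examining the Fourier spectrum for a period-3 signal, can be sketched as follows. The binary indicator mapping and the toy sequence are illustrative assumptions, not the authors' exact procedure.

        import numpy as np

        def period3_power(seq):
            # Summed Fourier power near period 3 over four binary indicator mappings
            n = len(seq)
            total = 0.0
            for base in "ACGT":
                x = np.array([1.0 if b == base else 0.0 for b in seq])
                spectrum = np.abs(np.fft.fft(x)) ** 2
                total += spectrum[n // 3]   # component at frequency ~1/3
            return total / n

        toy_seq = "ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGA"   # illustrative sequence only
        print(period3_power(toy_seq))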

  9. Missing data imputation: focusing on single imputation.

    PubMed

    Zhang, Zhongheng

    2016-01-01

    Complete case analysis is widely used for handling missing data, and it is the default method in many statistical packages. However, this method may introduce bias, and some useful information will be omitted from the analysis. Therefore, many imputation methods have been developed to fill this gap. The present article focuses on single imputation. Imputation with the mean, median or mode is simple but, like complete case analysis, can bias estimates of the mean and standard deviation. Furthermore, these methods ignore the relationships with other variables. Regression imputation can preserve the relationship between the variable with missing values and other variables. Many more sophisticated methods exist to handle missing values in longitudinal data. This article focuses primarily on how to implement R code to perform single imputation, while avoiding complex mathematical calculations.
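
    The article itself presents R code; as a rough Python analogue under assumed column names, mean imputation and a simple regression imputation might look like the following sketch.

        import numpy as np
        import pandas as pd
        from sklearn.linear_model import LinearRegression

        # Hypothetical data with missing values in column 'y'
        df = pd.DataFrame({"x": [1.0, 2.0, 3.0, 4.0, 5.0],
                           "y": [2.1, np.nan, 6.3, np.nan, 10.2]})

        # Mean imputation: simple, but shrinks the variance of 'y'
        df["y_mean_imp"] = df["y"].fillna(df["y"].mean())

        # Regression imputation: predict the missing 'y' values from 'x'
        observed = df["y"].notna()
        model = LinearRegression().fit(df.loc[observed, ["x"]], df.loc[observed, "y"])
        df["y_reg_imp"] = df["y"]
        df.loc[~observed, "y_reg_imp"] = model.predict(df.loc[~observed, ["x"]])
        print(df)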

  10. Missing data imputation: focusing on single imputation

    PubMed Central

    2016-01-01

    Complete case analysis is widely used for handling missing data, and it is the default method in many statistical packages. However, this method may introduce bias, and some useful information will be omitted from the analysis. Therefore, many imputation methods have been developed to fill this gap. The present article focuses on single imputation. Imputation with the mean, median or mode is simple but, like complete case analysis, can bias estimates of the mean and standard deviation. Furthermore, these methods ignore the relationships with other variables. Regression imputation can preserve the relationship between the variable with missing values and other variables. Many more sophisticated methods exist to handle missing values in longitudinal data. This article focuses primarily on how to implement R code to perform single imputation, while avoiding complex mathematical calculations. PMID:26855945

  11. The impact of mother's literacy on child dental caries: Individual data or aggregate data analysis?

    PubMed

    Haghdoost, Ali-Akbar; Hessari, Hossein; Baneshi, Mohammad Reza; Rad, Maryam; Shahravan, Arash

    2017-01-01

    To evaluate the impact of mothers' literacy on child dental caries based on a national oral health survey in Iran and to investigate the possibility of an ecological fallacy in aggregate data analysis. Data came from the second national oral health survey, carried out in 2004, which included 8725 six-year-old participants. The association of mother's literacy with caries occurrence in her child (DMF (Decayed, Missing, Filled) total score >0) was assessed on individual data using a logistic regression model. The association between the percentage of literate mothers and the percentage of decayed teeth in each of the 30 provinces of Iran was then assessed on aggregated data, retrieved from the second national oral health survey and alternatively from the census of the Statistical Center of Iran, using a linear regression model. The significance level was set at 0.05 for all analyses. Individual data analysis showed a statistically significant association between mother's literacy and decayed teeth in children (P = 0.02, odds ratio = 0.83). There was no statistically significant association between mother's literacy and child dental caries in the aggregate data analysis of either the oral health survey (P = 0.79, B = 0.03) or the Statistical Center of Iran census (P = 0.60, B = 0.14). Mothers' literacy thus appears to have a preventive effect on the occurrence of dental caries in children. Given the high percentage of illiterate parents in Iran, it is reasonable to consider methods of oral health education that do not require reading or writing. Aggregate data analysis and individual data analysis yielded completely different results in this study.
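
    The contrast between the two analyses described, an individual-level logistic regression versus a province-level linear regression on aggregated proportions, can be sketched as below. The data are simulated placeholders, not the survey data.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n, n_prov = 3000, 30
        province = rng.integers(0, n_prov, n)
        literate = rng.binomial(1, 0.6, n)
        caries = rng.binomial(1, 0.5 - 0.08 * literate)   # assumed protective effect

        # Individual-level analysis: logistic regression of caries on literacy
        X = sm.add_constant(literate.astype(float))
        print(sm.Logit(caries, X).fit(disp=0).params)

        # Aggregate analysis: province-level linear regression on proportions
        lit_rate = np.array([literate[province == p].mean() for p in range(n_prov)])
        car_rate = np.array([caries[province == p].mean() for p in range(n_prov)])
        print(sm.OLS(car_rate, sm.add_constant(lit_rate)).fit().params)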

  12. Determinants of Student Wastage in Higher Education.

    ERIC Educational Resources Information Center

    Johnes, Jill

    1990-01-01

    Statistical analysis of a sample of the 1979 entry cohort to Lancaster University indicates that the likelihood of non-completion is determined by various characteristics including the student's academic ability, gender, marital status, work experience prior to university, school background, and location of home in relation to university.…

  13. A Statistical Analysis of Variables Related to Officer Retention

    DTIC Science & Technology

    1996-09-01

    officers’ career decisions (21: 28). In 1986, Marchewka completed an unpublished study investigating significant differences in the job attitudes of...Staff College, Maxwell AFB AL, 1981. 13. Herzberg, F. and others. The Motivation to Work. New York: Wiley, 1959. 14. Marchewka , Peter S. "Job Attitudes

  14. The mediating effect of calling on the relationship between medical school students' academic burnout and empathy.

    PubMed

    Chae, Su Jin; Jeong, So Mi; Chung, Yoon-Sok

    2017-09-01

    This study is aimed at identifying the relationships between medical school students' academic burnout, empathy, and calling, and determining whether their calling has a mediating effect on the relationship between academic burnout and empathy. A mixed method study was conducted. One hundred twenty-seven medical students completed a survey. Scales measuring academic burnout, medical students' empathy, and calling were utilized. For statistical analysis, correlation analysis, descriptive statistics analysis, and hierarchical multiple regression analyses were conducted. For qualitative approach, eight medical students participated in a focus group interview. The study found that empathy has a statistically significant, negative correlation with academic burnout, while having a significant, positive correlation with calling. Sense of calling proved to be an effective mediator of the relationship between academic burnout and empathy. This result demonstrates that calling is a key variable that mediates the relationship between medical students' academic burnout and empathy. As such, this study provides baseline data for an education that could improve medical students' empathy skills.

  15. Statistical and Detailed Analysis on Fiber Reinforced Self-Compacting Concrete Containing Admixtures- A State of Art of Review

    NASA Astrophysics Data System (ADS)

    Athiyamaan, V.; Mohan Ganesh, G.

    2017-11-01

    Self-compacting concrete is a special concrete that is able to flow and consolidate under its own weight and completely fill the formwork, even in the presence of dense reinforcement, whilst maintaining its homogeneity throughout the formwork without any need for vibration. Researchers all over the world are developing high-performance concrete by adding various fibers and admixtures in different proportions. Different kinds of fibers, such as glass, steel, carbon, polypropylene and aramid fibers, improve concrete properties such as tensile strength, fatigue behaviour, durability, shrinkage, impact and erosion resistance, and serviceability [6]. This work comprises a fundamental study of fiber-reinforced self-compacting concrete with admixtures, its rheological and mechanical properties, and an overview of design methodology and statistical approaches for optimizing concrete performance. The study is organised into seven basic chapters: introduction; phenomenal study of material properties and review of self-compacting concrete; overview of fiber-reinforced self-compacting concrete containing admixtures; review of design and analysis of experiments as a statistical approach; summary of existing works on FRSCC and statistical modeling; literature review; and conclusion. Knowledge of recent studies on polymer-based binder materials (fly ash, metakaolin, GGBS, etc.), fiber-reinforced concrete and SCC is essential for effective research on fiber-reinforced self-compacting concrete containing admixtures. The key aim of the study is to identify the research gap and to gain complete knowledge of polymer-based self-compacting fiber-reinforced concrete.

  16. The fracture load and failure types of veneered anterior zirconia crowns: an analysis of normal and Weibull distribution of complete and censored data.

    PubMed

    Stawarczyk, Bogna; Ozcan, Mutlu; Hämmerle, Christoph H F; Roos, Malgorzata

    2012-05-01

    The aim of this study was to compare the fracture load of veneered anterior zirconia crowns using normal and Weibull distribution of complete and censored data. Standardized zirconia frameworks for maxillary canines were milled using a CAD/CAM system and randomly divided into 3 groups (N=90, n=30 per group). They were veneered with three veneering ceramics, namely GC Initial ZR, Vita VM9 and IPS e.max Ceram, using the layering technique. The crowns were cemented with glass ionomer cement on metal abutments. The specimens were then loaded to fracture (1 mm/min) in a Universal Testing Machine. The data were analyzed using the classical method (normal data distribution (μ, σ); Levene test and one-way ANOVA) and according to Weibull statistics (s, m). In addition, fracture load results were analyzed depending on complete and censored failure types (only chipping vs. total fracture together with chipping). When computed with complete data, significantly higher mean fracture loads (N) were observed for GC Initial ZR (μ=978, σ=157; s=1043, m=7.2) and VITA VM9 (μ=1074, σ=179; s=1139, m=7.8) than for IPS e.max Ceram (μ=798, σ=174; s=859, m=5.8) (p<0.05) by classical and Weibull statistics, respectively. When the data were censored for only total fracture, IPS e.max Ceram presented the lowest fracture load for chipping with both the classical distribution (μ=790, σ=160) and Weibull statistics (s=836, m=6.5). When total fracture with chipping (classical distribution) was considered as failure, IPS e.max Ceram did not show a significantly different fracture load for total fracture (μ=1054, σ=110) compared to the other groups (GC Initial ZR: μ=1039, σ=152, VITA VM9: μ=1170, σ=166). According to Weibull distributed data, VITA VM9 showed significantly higher fracture load (s=1228, m=9.4) than those of the other groups. Both classical distribution and Weibull statistics for complete data yielded similar outcomes. Censored data analysis of all ceramic systems based on failure types is essential and brings additional information regarding the susceptibility to chipping or total fracture. Copyright © 2011 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
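
    A minimal illustration of the two summaries the study contrasts, a classical mean/standard deviation and a two-parameter Weibull fit (scale s, shape m), is sketched below on simulated fracture loads; scipy's parameterization is assumed, with the location parameter fixed at zero.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        loads = rng.weibull(7.0, 30) * 1000.0            # simulated fracture loads in N

        mu, sigma = loads.mean(), loads.std(ddof=1)      # classical (normal) summary
        m, _, s = stats.weibull_min.fit(loads, floc=0)   # Weibull shape m and scale s
        print(f"normal: mu={mu:.0f}, sigma={sigma:.0f}; Weibull: s={s:.0f}, m={m:.1f}")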

  17. Treatment of missing data in follow-up studies of randomised controlled trials: A systematic review of the literature.

    PubMed

    Sullivan, Thomas R; Yelland, Lisa N; Lee, Katherine J; Ryan, Philip; Salter, Amy B

    2017-08-01

    After completion of a randomised controlled trial, an extended follow-up period may be initiated to learn about longer term impacts of the intervention. Since extended follow-up studies often involve additional eligibility restrictions and consent processes for participation, and a longer duration of follow-up entails a greater risk of participant attrition, missing data can be a considerable threat in this setting. As a potential source of bias, it is critical that missing data are appropriately handled in the statistical analysis, yet little is known about the treatment of missing data in extended follow-up studies. The aims of this review were to summarise the extent of missing data in extended follow-up studies and the use of statistical approaches to address this potentially serious problem. We performed a systematic literature search in PubMed to identify extended follow-up studies published from January to June 2015. Studies were eligible for inclusion if the original randomised controlled trial results were also published and if the main objective of extended follow-up was to compare the original randomised groups. We recorded information on the extent of missing data and the approach used to treat missing data in the statistical analysis of the primary outcome of the extended follow-up study. Of the 81 studies included in the review, 36 (44%) reported additional eligibility restrictions and 24 (30%) consent processes for entry into extended follow-up. Data were collected at a median of 7 years after randomisation. Excluding 28 studies with a time to event primary outcome, 51/53 studies (96%) reported missing data on the primary outcome. The median percentage of randomised participants with complete data on the primary outcome was just 66% in these studies. The most common statistical approach to address missing data was complete case analysis (51% of studies), while likelihood-based analyses were also well represented (25%). Sensitivity analyses around the missing data mechanism were rarely performed (25% of studies), and when they were, they often involved unrealistic assumptions about the mechanism. Despite missing data being a serious problem in extended follow-up studies, statistical approaches to addressing missing data were often inadequate. We recommend researchers clearly specify all sources of missing data in follow-up studies and use statistical methods that are valid under a plausible assumption about the missing data mechanism. Sensitivity analyses should also be undertaken to assess the robustness of findings to assumptions about the missing data mechanism.

  18. A note about high blood pressure in childhood

    NASA Astrophysics Data System (ADS)

    Teodoro, M. Filomena; Simão, Carla

    2017-06-01

    In the medical, behavioral and social sciences it is common for the outcome of interest to be binary. In the present work, information was collected in which some of the outcomes are binary variables (1 = 'yes', 0 = 'no'). In [14] a preliminary study of caregivers' perception of pediatric hypertension was introduced. An experimental questionnaire was designed to be answered by the caregivers of attendees of routine pediatric consultations at Santa Maria Hospital (HSM). The collected data were statistically analyzed: a descriptive analysis was performed and a predictive model was built. Significant relations between some socio-demographic variables and the assessed knowledge were obtained. A statistical analysis using part of the questionnaire's information can be found in [14]. The present article completes that statistical approach by estimating a model for the relevant remaining questions of the questionnaire using Generalized Linear Models (GLM). Exploiting the binary nature of the outcome, we intend to extend this approach using Generalized Linear Mixed Models (GLMM), but that work is still ongoing.
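
    A binomial GLM with a logit link of the kind described can be sketched as follows; the predictor names and coefficients are placeholders, not the questionnaire variables.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(2)
        age = rng.normal(40.0, 8.0, 200)                 # placeholder caregiver age
        educ = rng.integers(0, 2, 200)                   # placeholder education indicator
        prob = 1.0 / (1.0 + np.exp(-(-3.0 + 0.05 * age + 0.8 * educ)))
        knows = rng.binomial(1, prob)                    # simulated binary outcome

        X = sm.add_constant(np.column_stack([age, educ]).astype(float))
        fit = sm.GLM(knows, X, family=sm.families.Binomial()).fit()
        print(fit.params)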

  19. Psychology, Science, and Knowledge Construction: Broadening Perspectives from the Replication Crisis.

    PubMed

    Shrout, Patrick E; Rodgers, Joseph L

    2018-01-04

    Psychology advances knowledge by testing statistical hypotheses using empirical observations and data. The expectation is that most statistically significant findings can be replicated in new data and in new laboratories, but in practice many findings have replicated less often than expected, leading to claims of a replication crisis. We review recent methodological literature on questionable research practices, meta-analysis, and power analysis to explain the apparently high rates of failure to replicate. Psychologists can improve research practices to advance knowledge in ways that improve replicability. We recommend that researchers adopt open science conventions of preregistration and full disclosure and that replication efforts be based on multiple studies rather than on a single replication attempt. We call for more sophisticated power analyses, careful consideration of the various influences on effect sizes, and more complete disclosure of nonsignificant as well as statistically significant findings.
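
    One of the recommendations above is more sophisticated power analysis. A minimal example of a two-sample power calculation with statsmodels is sketched below; the effect size, power and alpha are illustrative choices.

        from statsmodels.stats.power import TTestIndPower

        # Sample size per group needed to detect d = 0.4 with 90% power at alpha = .05
        n_per_group = TTestIndPower().solve_power(effect_size=0.4, power=0.9, alpha=0.05)
        print(round(n_per_group))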

  20. The Impact of Measurement Noise in GPA Diagnostic Analysis of a Gas Turbine Engine

    NASA Astrophysics Data System (ADS)

    Ntantis, Efstratios L.; Li, Y. G.

    2013-12-01

    The performance diagnostic analysis of a gas turbine is accomplished by estimating a set of internal engine health parameters from available sensor measurements. No physical measuring instrument, however, can ever completely eliminate measurement uncertainties. Sensor measurements are often distorted by noise and bias, leading to inaccurate estimation results. This paper explores the impact of measurement noise on gas turbine GPA (Gas Path Analysis). The analysis is demonstrated with a test case in which the gas turbine performance simulation and diagnostics code TURBOMATCH is used to build a performance model of an engine similar to the Rolls-Royce Trent 500 turbofan, and to carry out the diagnostic analysis in the presence of different levels of measurement noise. Finally, to improve the reliability of the diagnostic results, a statistical analysis of the data scattering caused by sensor uncertainties is made. The diagnostic tool used for the statistical analysis of the measurement noise impact is a model-based method utilizing a non-linear GPA.

  1. Rare variant association analysis in case-parents studies by allowing for missing parental genotypes.

    PubMed

    Li, Yumei; Xiang, Yang; Xu, Chao; Shen, Hui; Deng, Hongwen

    2018-01-15

    The development of next-generation sequencing technologies has facilitated the identification of rare variants. Family-based design is commonly used to effectively control for population admixture and substructure, which is more prominent for rare variants. Case-parents studies, as typical strategies in family-based design, are widely used in rare variant-disease association analysis. Current methods in case-parents studies are based on complete case-parents data; however, parental genotypes may be missing in case-parents trios, and removing these data may lead to a loss in statistical power. The present study focuses on testing for rare variant-disease association in case-parents studies by allowing for missing parental genotypes. In this report, we extended the collapsing method for rare variant association analysis in case-parents studies to allow for missing parental genotypes, and investigated the performance of two methods by using the difference of genotypes between affected offspring and their corresponding "complements" in case-parent trios and the TDT framework. Using simulations, we showed that, compared with methods that use only complete case-parents data, the proposed strategy allowing for missing parental genotypes, or even adding unrelated affected individuals, can greatly improve statistical power while remaining unaffected by population stratification. We conclude that adding case-parents data with missing parental genotypes to the complete case-parents data set can greatly improve the power of our strategy for rare variant-disease association.

  2. Multiple imputation methods for bivariate outcomes in cluster randomised trials.

    PubMed

    DiazOrdaz, K; Kenward, M G; Gomes, M; Grieve, R

    2016-09-10

    Missing observations are common in cluster randomised trials. The problem is exacerbated when modelling bivariate outcomes jointly, as the proportion of complete cases is often considerably smaller than the proportion having either of the outcomes fully observed. Approaches taken to handling such missing data include the following: complete case analysis, single-level multiple imputation that ignores the clustering, multiple imputation with a fixed effect for each cluster and multilevel multiple imputation. We contrasted the alternative approaches to handling missing data in a cost-effectiveness analysis that uses data from a cluster randomised trial to evaluate an exercise intervention for care home residents. We then conducted a simulation study to assess the performance of these approaches on bivariate continuous outcomes, in terms of confidence interval coverage and empirical bias in the estimated treatment effects. Missing-at-random clustered data scenarios were simulated following a full-factorial design. Across all the missing data mechanisms considered, the multiple imputation methods provided estimators with negligible bias, while complete case analysis resulted in biased treatment effect estimates in scenarios where the randomised treatment arm was associated with missingness. Confidence interval coverage was generally in excess of nominal levels (up to 99.8%) following fixed-effects multiple imputation and too low following single-level multiple imputation. Multilevel multiple imputation led to coverage levels of approximately 95% throughout. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.

  3. Analysis tools for discovering strong parity violation at hadron colliders

    NASA Astrophysics Data System (ADS)

    Backović, Mihailo; Ralston, John P.

    2011-07-01

    Several arguments suggest parity violation may be observable in high energy strong interactions. We introduce new analysis tools to describe the azimuthal dependence of multiparticle distributions, or “azimuthal flow.” Analysis uses the representations of the orthogonal group O(2) and dihedral groups DN necessary to define parity completely in two dimensions. Classification finds that collective angles used in event-by-event statistics represent inequivalent tensor observables that cannot generally be represented by a single “reaction plane.” Many new parity-violating observables exist that have never been measured, while many parity-conserving observables formerly lumped together are now distinguished. We use the concept of “event-shape sorting” to suggest separating right- and left-handed events, and we discuss the effects of transverse and longitudinal spin. The analysis tools are statistically robust, and can be applied equally to low or high multiplicity events at the Tevatron, RHIC or RHIC Spin, and the LHC.

  4. Teaching Integrity in Empirical Research: A Protocol for Documenting Data Management and Analysis

    ERIC Educational Resources Information Center

    Ball, Richard; Medeiros, Norm

    2012-01-01

    This article describes a protocol the authors developed for teaching undergraduates to document their statistical analyses for empirical research projects so that their results are completely reproducible and verifiable. The protocol is guided by the principle that the documentation prepared to accompany an empirical research project should be…

  5. The Relationship between Adequate Yearly Progress and the Quality of Professional Development

    ERIC Educational Resources Information Center

    Wolff, Lori A.; McClelland, Susan S.; Stewart, Stephanie E.

    2010-01-01

    Based on publicly available data, the study examined the relationship between adequate yearly progress status and teachers' perceptions of the quality of their professional development. The sample included responses of 5,558 teachers who completed the questionnaire in the 2005-2006 school year. Results of the statistical analysis show a…

  6. PUNCHED CARD SYSTEM NEEDN'T BE COMPLEX TO GIVE COMPLETE CONTROL.

    ERIC Educational Resources Information Center

    BEMIS, HAZEL T.

    AT WORCESTER JUNIOR COLLEGE, MASSACHUSETTS, USE OF A MANUALLY OPERATED PUNCHED CARD SYSTEM HAS RESULTED IN (1) SIMPLIFIED REGISTRATION PROCEDURES, (2) QUICK ANALYSIS OF CONFLICTS AND PROBLEMS IN CLASS SCHEDULING, (3) READY ACCESS TO STATISTICAL INFORMATION, (4) DIRECTORY INFORMATION IN A WIDE RANGE OF CLASSIFICATIONS, (5) EASY VERIFICATION OF…

  7. An Examination of the Relationship between Job Satisfaction, Organizational Norms, and Communication Climate among Employees in an Organization.

    ERIC Educational Resources Information Center

    Applbaum, Ronald L.; Anatol, Kark W. E.

    Three separate instruments were used to measure and assess the interrelationships of organizational norms, communication climate, and job satisfaction. Of the 155 top administrators and managers at California State University, Long Beach, 101 subjects completed all three measurement instruments. Statistical analysis showed that a significant…

  8. Transferability of Postsecondary Credit Following Student Transfer or Coenrollment. Statistical Analysis Report. NCES 2014-163

    ERIC Educational Resources Information Center

    Simone, Sean Anthony

    2014-01-01

    The federal government invests billions of dollars in grants and loans to help students access and complete postsecondary education. Federal policymakers, therefore, have had a continuing interest in understanding the ability of students to transfer credits between postsecondary institutions. In 2005, the Senate Health, Education, Labor, and…

  9. Relationship of Class-Size to Classroom Processes, Teacher Satisfaction and Pupil Affect: A Meta-Analysis.

    ERIC Educational Resources Information Center

    Smith, Mary Lee; Glass, Gene V.

    Using data from previously completed research, the authors of this report attempted to examine the relationship between class size and measures of outcomes such as student attitudes and behavior, classroom processes and learning environment, and teacher satisfaction. The authors report that statistical integration of the existing research…

  10. AG Channel Measurement and Modeling Results for Over-Water and Hilly Terrain Conditions

    NASA Technical Reports Server (NTRS)

    Matolak, David W.; Sun, Ruoyu

    2015-01-01

    This report describes work completed over the past year on our project, entitled "Unmanned Aircraft Systems (UAS) Research: The AG Channel, Robust Waveforms, and Aeronautical Network Simulations." This project is funded under the NASA project "Unmanned Aircraft Systems (UAS) in the National Airspace System (NAS)." In this report we provide the following: an update on project progress; a description of the over-freshwater and hilly terrain initial results on path loss, delay spread, small-scale fading, and correlations; complete path loss models for the over-water AG channels; analysis for obtaining parameter statistics required for development of accurate wideband AG channel models; and analysis of an atypical AG channel in which the aircraft flies out of the ground site antenna main beam. We have modeled the small-scale fading of these channels with Ricean statistics, and have quantified the behavior of the Ricean K-factor. We also provide some results for correlations of signal components, both intra-band and inter-band. An updated literature review, and a summary that also describes future work, are also included.

  11. Statistical Reporting Errors and Collaboration on Statistical Analyses in Psychological Science.

    PubMed

    Veldkamp, Coosje L S; Nuijten, Michèle B; Dominguez-Alvarez, Linda; van Assen, Marcel A L M; Wicherts, Jelte M

    2014-01-01

    Statistical analysis is error prone. A best practice for researchers using statistics would therefore be to share data among co-authors, allowing double-checking of executed tasks just as co-pilots do in aviation. To document the extent to which this 'co-piloting' currently occurs in psychology, we surveyed the authors of 697 articles published in six top psychology journals and asked them whether they had collaborated on four aspects of analyzing data and reporting results, and whether the described data had been shared between the authors. We acquired responses for 49.6% of the articles and found that co-piloting on statistical analysis and reporting results is quite uncommon among psychologists, while data sharing among co-authors seems reasonably but not completely standard. We then used an automated procedure to study the prevalence of statistical reporting errors in the articles in our sample and examined the relationship between reporting errors and co-piloting. Overall, 63% of the articles contained at least one p-value that was inconsistent with the reported test statistic and the accompanying degrees of freedom, and 20% of the articles contained at least one p-value that was inconsistent to such a degree that it may have affected decisions about statistical significance. Overall, the probability that a given p-value was inconsistent was over 10%. Co-piloting was not found to be associated with reporting errors.
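
    The automated consistency check described is in the spirit of tools such as statcheck. A minimal sketch that recomputes a two-sided p-value from a reported t statistic and its degrees of freedom, and flags mismatches beyond an assumed tolerance, follows.

        from scipy import stats

        def check_t_report(t, df, reported_p, tol=0.005):
            # Recompute the two-sided p-value from t(df) and compare with the report
            recomputed = 2 * stats.t.sf(abs(t), df)
            return recomputed, abs(recomputed - reported_p) <= tol

        # Example: a reported 't(28) = 2.10, p = .04' recomputes to p ~= .045
        print(check_t_report(t=2.10, df=28, reported_p=0.04))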

  12. Statistical Reporting Errors and Collaboration on Statistical Analyses in Psychological Science

    PubMed Central

    Veldkamp, Coosje L. S.; Nuijten, Michèle B.; Dominguez-Alvarez, Linda; van Assen, Marcel A. L. M.; Wicherts, Jelte M.

    2014-01-01

    Statistical analysis is error prone. A best practice for researchers using statistics would therefore be to share data among co-authors, allowing double-checking of executed tasks just as co-pilots do in aviation. To document the extent to which this ‘co-piloting’ currently occurs in psychology, we surveyed the authors of 697 articles published in six top psychology journals and asked them whether they had collaborated on four aspects of analyzing data and reporting results, and whether the described data had been shared between the authors. We acquired responses for 49.6% of the articles and found that co-piloting on statistical analysis and reporting results is quite uncommon among psychologists, while data sharing among co-authors seems reasonably but not completely standard. We then used an automated procedure to study the prevalence of statistical reporting errors in the articles in our sample and examined the relationship between reporting errors and co-piloting. Overall, 63% of the articles contained at least one p-value that was inconsistent with the reported test statistic and the accompanying degrees of freedom, and 20% of the articles contained at least one p-value that was inconsistent to such a degree that it may have affected decisions about statistical significance. Overall, the probability that a given p-value was inconsistent was over 10%. Co-piloting was not found to be associated with reporting errors. PMID:25493918

  13. Statistical approaches in published ophthalmic clinical science papers: a comparison to statistical practice two decades ago.

    PubMed

    Zhang, Harrison G; Ying, Gui-Shuang

    2018-02-09

    The aim of this study is to evaluate the current practice of statistical analysis of eye data in clinical science papers published in the British Journal of Ophthalmology (BJO) and to determine whether the practice of statistical analysis has improved in the past two decades. All clinical science papers (n=125) published in BJO in January-June 2017 were reviewed for their statistical analysis approaches for analysing the primary ocular measure. We compared our findings to the results from a previous paper that reviewed BJO papers in 1995. Of 112 papers eligible for analysis, half of the studies analysed the data at an individual level because of the nature of observation, 16 (14%) studies analysed data from one eye only, 36 (32%) studies analysed data from both eyes at the ocular level, one study (1%) analysed an overall summary of ocular findings per individual and three (3%) studies used the paired comparison. Among studies with data available from both eyes, 50 (89%) of 56 papers in 2017 did not analyse data from both eyes or ignored the intereye correlation, as compared with 60 (90%) of 67 papers in 1995 (P=0.96). Among studies that analysed data from both eyes at an ocular level, 33 (92%) of 36 studies completely ignored the intereye correlation in 2017, as compared with 16 (89%) of 18 studies in 1995 (P=0.40). A majority of studies did not analyse the data properly when data from both eyes were available. The practice of statistical analysis did not improve in the past two decades. Collaborative efforts should be made in the vision research community to improve the practice of statistical analysis for ocular data. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  14. Detection of Buried Targets via Active Selection of Labeled Data: Application to Sensing Subsurface UXO

    DTIC Science & Technology

    2007-06-01

    images,” IEEE Trans. Pattern Analysis Machine Intelligence, vol. 13, no. 2, pp. 99–113, 1991. [15] C. Bouman and M. Shapiro, “A multiscale random...including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing...this project was on developing new statistical algorithms for analysis of electromagnetic induction (EMI) and magnetometer data measured at actual

  15. Holo-analysis.

    PubMed

    Rosen, G D

    2006-06-01

    Meta-analysis is a vague descriptor used to encompass very diverse methods of data collection and analysis, ranging from simple averages to more complex statistical methods. Holo-analysis is a fully comprehensive statistical analysis of all available data and all available variables in a specified topic, with results expressed in a holistic factual empirical model. The objectives and applications of holo-analysis include software production for prediction of responses with confidence limits, translation of research conditions to praxis (field) circumstances, exposure of key missing variables, discovery of theoretically unpredictable variables and interactions, and planning future research. Holo-analyses are cited as examples of the effects on broiler feed intake and live weight gain of exogenous phytases, which account for 70% of variation in responses in terms of 20 highly significant chronological, dietary, environmental, genetic, managemental, and nutrient variables. Even better future accountancy of variation will be facilitated if and when authors of papers routinely provide key data for currently neglected variables, such as temperatures, complete feed formulations, and mortalities.

  16. Tackling missing radiographic progression data: multiple imputation technique compared with inverse probability weights and complete case analysis.

    PubMed

    Descalzo, Miguel Á; Garcia, Virginia Villaverde; González-Alvaro, Isidoro; Carbonell, Jordi; Balsa, Alejandro; Sanmartí, Raimon; Lisbona, Pilar; Hernandez-Barrera, Valentín; Jiménez-Garcia, Rodrigo; Carmona, Loreto

    2013-02-01

    To describe the results of different statistical ways of addressing radiographic outcome affected by missing data--multiple imputation technique, inverse probability weights and complete case analysis--using data from an observational study. A random sample of 96 RA patients was selected for a follow-up study in which radiographs of hands and feet were scored. Radiographic progression was tested by comparing the change in the total Sharp-van der Heijde radiographic score (TSS) and the joint erosion score (JES) from baseline to the end of the second year of follow-up. MI technique, inverse probability weights in weighted estimating equation (WEE) and CC analysis were used to fit a negative binomial regression. Major predictors of radiographic progression were JES and joint space narrowing (JSN) at baseline, together with baseline disease activity measured by DAS28 for TSS and MTX use for JES. Results from CC analysis show larger coefficients and s.e.s compared with MI and weighted techniques. The results from the WEE model were quite in line with those of MI. If it seems plausible that CC or MI analysis may be valid, then MI should be preferred because of its greater efficiency. CC analysis resulted in inefficient estimates or, translated into non-statistical terminology, could guide us into inaccurate results and unwise conclusions. The methods discussed here will contribute to the use of alternative approaches for tackling missing data in observational studies.

  17. Counting Better? An Examination of the Impact of Quantitative Method Teaching on Statistical Anxiety and Confidence

    ERIC Educational Resources Information Center

    Chamberlain, John Martyn; Hillier, John; Signoretta, Paola

    2015-01-01

    This article reports the results of research concerned with students' statistical anxiety and confidence to both complete and learn to complete statistical tasks. Data were collected at the beginning and end of a quantitative method statistics module. Students recognised the value of numeracy skills but felt they were not necessarily relevant for…

  18. Distortion Representation of Forecast Errors for Model Skill Assessment and Objective Analysis

    NASA Technical Reports Server (NTRS)

    Hoffman, Ross N.

    2001-01-01

    We completed the formulation of the smoothness penalty functional this past quarter. We used a simplified procedure for estimating the statistics of the FCA solution spectral coefficients from the results of the unconstrained, low-truncation FCA (stopping criterion) solutions. During the current reporting period we have completed the calculation of GEOS-2 model-equivalent brightness temperatures for the 6.7 micron and 11 micron window channels used in the GOES imagery for all 10 cases from August 1999. These were simulated using the AER-developed Optimal Spectral Sampling (OSS) model.

  19. Recovering incomplete data using Statistical Multiple Imputations (SMI): a case study in environmental chemistry.

    PubMed

    Mercer, Theresa G; Frostick, Lynne E; Walmsley, Anthony D

    2011-10-15

    This paper presents a statistical technique that can be applied to environmental chemistry data where missing values and limit of detection levels prevent the application of statistics. A working example is taken from an environmental leaching study that was set up to determine if there were significant differences in levels of leached arsenic (As), chromium (Cr) and copper (Cu) between lysimeters containing preservative treated wood waste and those containing untreated wood. Fourteen lysimeters were setup and left in natural conditions for 21 weeks. The resultant leachate was analysed by ICP-OES to determine the As, Cr and Cu concentrations. However, due to the variation inherent in each lysimeter combined with the limits of detection offered by ICP-OES, the collected quantitative data was somewhat incomplete. Initial data analysis was hampered by the number of 'missing values' in the data. To recover the dataset, the statistical tool of Statistical Multiple Imputation (SMI) was applied, and the data was re-analysed successfully. It was demonstrated that using SMI did not affect the variance in the data, but facilitated analysis of the complete dataset. Copyright © 2011 Elsevier B.V. All rights reserved.

  20. Evaluation of Nursing Documentation Completion of Stroke Patients in the Emergency Department: A Pre-Post Analysis Using Flowsheet Templates and Clinical Decision Support.

    PubMed

    Richardson, Karen J; Sengstack, Patricia; Doucette, Jeffrey N; Hammond, William E; Schertz, Matthew; Thompson, Julie; Johnson, Constance

    2016-02-01

    The primary aim of this performance improvement project was to determine whether the electronic health record implementation of stroke-specific nursing documentation flowsheet templates and clinical decision support alerts improved the nursing documentation of eligible stroke patients in seven stroke-certified emergency departments. Two system enhancements were introduced into the electronic record in an effort to improve nursing documentation: disease-specific documentation flowsheets and clinical decision support alerts. Using a pre-post design, project measures included six stroke management goals as defined by the National Institute of Neurological Disorders and Stroke and three clinical decision support measures based on entry of orders used to trigger documentation reminders for nursing: (1) the National Institutes of Health's Stroke Scale, (2) neurological checks, and (3) dysphagia screening. Data were reviewed 6 months prior (n = 2293) and 6 months following the intervention (n = 2588). Fisher exact test was used for statistical analysis. Statistical significance was found for documentation of five of the six stroke management goals, although effect sizes were small. Customizing flowsheets to meet the needs of nursing workflow showed improvement in the completion of documentation. The effects of the decision support alerts on the completeness of nursing documentation were not statistically significant (likely due to lack of order entry). For example, an order for the National Institutes of Health Stroke Scale was entered only 10.7% of the time, which meant no alert would fire for nursing in the postintervention group. Future work should focus on decision support alerts that trigger reminders for clinicians to place relevant orders for this population.
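
    A minimal sketch of the kind of pre/post comparison reported (Fisher's exact test on counts of complete versus incomplete documentation) is given below; the counts are placeholders, not the study data.

        from scipy.stats import fisher_exact

        # Rows: pre vs post intervention; columns: documentation complete vs incomplete
        # (hypothetical counts, not the study's data)
        table = [[1500, 793],
                 [1900, 688]]
        odds_ratio, p_value = fisher_exact(table)
        print(f"OR = {odds_ratio:.2f}, p = {p_value:.4f}")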

  1. Meta-analysis of the efficacy and safety of combined surgery in the management of eyes with coexisting cataract and open angle glaucoma.

    PubMed

    Jiang, Nan; Zhao, Gui-Qiu; Lin, Jing; Hu, Li-Ting; Che, Cheng-Ye; Wang, Qian; Xu, Qiang; Li, Cui; Zhang, Jie

    2018-01-01

    To conduct a systematic review and quantitative meta-analysis of the efficacy and safety of combined surgery for eyes with coexisting cataract and open angle glaucoma. We performed a systematic search of the related literature in the Cochrane Library, PubMed, EMBASE, Web of Science, CNKI, CBM and Wan Fang databases, with no limitations on language or publication date. The primary efficacy estimates were the weighted mean difference in the percentage of intraocular pressure reduction (IOPR%) from baseline to end-point and in the percentage reduction in the number of glaucoma medications from pre- to post-operation; the secondary efficacy evaluations used the odds ratio (OR) and 95% confidence interval (CI) for the complete and qualified success rates. In addition, ORs were applied to assess the tolerability of adverse events. Meta-analyses under fixed- or random-effect models were performed using RevMan software 5.2 to pool the results. Heterogeneity was evaluated by the Chi-square test and the I2 measure. Ten studies enrolling 3108 patients were included. The pooled results indicated that both glaucoma surgery alone and combined cataract and glaucoma surgery significantly decreased IOP. For deep sclerectomy vs deep sclerectomy plus phacoemulsification and for canaloplasty vs phaco-canaloplasty, the differences in IOPR% were not all statistically significant, while trabeculectomy achieved a quantitatively greater IOPR% compared with trabeculectomy plus phacoemulsification. Furthermore, there were no statistically significant differences in the complete and qualified success rates or in the rates of adverse events for trabeculectomy vs trabeculectomy plus phacoemulsification. Compared with trabeculectomy plus phacoemulsification, trabeculectomy alone is more effective in lowering IOP and the number of glaucoma medications, while the two surgeries did not demonstrate statistically significant differences in the complete success rate, qualified success rate, or incidence of adverse events.
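
    The I2 heterogeneity measure reported above can be computed from Cochran's Q under fixed-effect inverse-variance weighting, as in the following sketch; the effect sizes and standard errors are illustrative.

        import numpy as np

        def cochran_q_i2(effects, ses):
            # Cochran's Q and I^2 from study effect sizes (e.g. log ORs) and their SEs
            effects, w = np.asarray(effects, float), 1.0 / np.asarray(ses, float) ** 2
            pooled = np.sum(w * effects) / np.sum(w)
            q = float(np.sum(w * (effects - pooled) ** 2))
            df = len(effects) - 1
            i2 = max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0
            return q, i2

        # Illustrative log odds ratios and standard errors, not the review's data
        print(cochran_q_i2(effects=[0.2, 0.5, 0.1, 0.8], ses=[0.15, 0.20, 0.25, 0.30]))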

  2. Is using multiple imputation better than complete case analysis for estimating a prevalence (risk) difference in randomized controlled trials when binary outcome observations are missing?

    PubMed

    Mukaka, Mavuto; White, Sarah A; Terlouw, Dianne J; Mwapasa, Victor; Kalilani-Phiri, Linda; Faragher, E Brian

    2016-07-22

    Missing outcomes can seriously impair the ability to make correct inferences from randomized controlled trials (RCTs). Complete case (CC) analysis is commonly used, but it reduces sample size and is perceived to lead to reduced statistical efficiency of estimates while increasing the potential for bias. As multiple imputation (MI) methods preserve sample size, they are generally viewed as the preferred analytical approach. We examined this assumption, comparing the performance of CC and MI methods to determine risk difference (RD) estimates in the presence of missing binary outcomes. We conducted simulation studies of 5000 simulated data sets with 50 imputations of RCTs with one primary follow-up endpoint at different underlying levels of RD (3-25 %) and missing outcomes (5-30 %). For missing at random (MAR) or missing completely at random (MCAR) outcomes, CC method estimates generally remained unbiased and achieved precision similar to or better than MI methods, and high statistical coverage. Missing not at random (MNAR) scenarios yielded invalid inferences with both methods. Effect size estimate bias was reduced in MI methods by always including group membership even if this was unrelated to missingness. Surprisingly, under MAR and MCAR conditions in the assessed scenarios, MI offered no statistical advantage over CC methods. While MI must inherently accompany CC methods for intention-to-treat analyses, these findings endorse CC methods for per protocol risk difference analyses in these conditions. These findings provide an argument for the use of the CC approach to always complement MI analyses, with the usual caveat that the validity of the mechanism for missingness be thoroughly discussed. More importantly, researchers should strive to collect as much data as possible.

  3. Income-related inequality in completed suicide across the provinces of Iran.

    PubMed

    Kazemi-Galougahi, Mohammad Hassan; Mansouri, Asieh; Akbarpour, Samaneh; Bakhtiyari, Mahmood; Sartipi, Majid; Moradzadeh, Rahmatollah

    2018-01-01

    The aim of this study was to measure income-related inequality in completed suicide across the provinces of Iran. This ecological study was performed using data from the Urban and Rural Household Income and Expenditure Survey-2010 conducted by the Iranian Center of Statistics, along with data on completed suicide from the Iranian Legal Medicine Organization in 2012. We calculated the Gini coefficient of per capita income and the completed suicide rate, as well as the concentration index for per capita income inequality in completed suicide, across the provinces of Iran. The Gini coefficients of per capita income and the completed suicide rate in the provinces of Iran were 0.10 (95% confidence interval [CI], 0.06 to 0.13) and 0.34 (95% CI, 0.21 to 0.46), respectively. We found a trivial decreasing trend in the completed suicide incidence rate according to income quintile. The poorest-to-richest ratio in the completed suicide rate was 2.01 (95% CI, 1.26 to 3.22). The concentration index of completed suicide in the provinces of Iran was -0.12 (95% CI, -0.30 to 0.06). This study found that lower income might be considered as a risk factor for completed suicide. Nonetheless, further individual studies incorporating multivariable analysis and repeated cross-sectional data would allow a more fine-grained analysis of this phenomenon.
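
    The Gini coefficient reported above can be computed from a vector of province-level values with the rank-based form of the mean-absolute-difference formula, as in this sketch; the income vector is illustrative.

        import numpy as np

        def gini(values):
            # Gini coefficient via the rank-based mean-absolute-difference formula
            x = np.sort(np.asarray(values, dtype=float))
            n = x.size
            ranks = np.arange(1, n + 1)
            return float(np.sum((2 * ranks - n - 1) * x) / (n * n * x.mean()))

        # Illustrative per capita incomes for a handful of provinces
        print(round(gini([120, 150, 180, 200, 260, 400]), 3))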

  4. Missing CD4+ cell response in randomized clinical trials of maraviroc and dolutegravir.

    PubMed

    Cuffe, Robert; Barnett, Carly; Granier, Catherine; Machida, Mitsuaki; Wang, Cunshan; Roger, James

    2015-10-01

    Missing data can compromise inferences from clinical trials, yet the topic has received little attention in the clinical trial community. Shortcomings in commonly used methods used to analyze studies with missing data (complete case, last- or baseline-observation carried forward) have been highlighted in a recent Food and Drug Administration-sponsored report. This report recommends how to mitigate the issues associated with missing data. We present an example of the proposed concepts using data from recent clinical trials. CD4+ cell count data from the previously reported SINGLE and MOTIVATE studies of dolutegravir and maraviroc were analyzed using a variety of statistical methods to explore the impact of missing data. Four methodologies were used: complete case analysis, simple imputation, mixed models for repeated measures, and multiple imputation. We compared the sensitivity of conclusions to the volume of missing data and to the assumptions underpinning each method. Rates of missing data were greater in the MOTIVATE studies (35%-68% premature withdrawal) than in SINGLE (12%-20%). The sensitivity of results to assumptions about missing data was related to volume of missing data. Estimates of treatment differences by various analysis methods ranged across a 61 cells/mm3 window in MOTIVATE and a 22 cells/mm3 window in SINGLE. Where missing data are anticipated, analyses require robust statistical and clinical debate of the necessary but unverifiable underlying statistical assumptions. Multiple imputation makes these assumptions transparent, can accommodate a broad range of scenarios, and is a natural analysis for clinical trials in HIV with missing data.

  5. The Influence of Cognitive Reserve on Recovery from Traumatic Brain Injury.

    PubMed

    Donders, Jacobus; Stout, Jacob

    2018-04-12

    We sought to determine the degree to which cognitive reserve, as assessed by the Test of Premorbid Functioning in combination with demographic variables, could act as a buffer against the effect of traumatic brain injury (TBI) on cognitive test performance. Retrospective analysis of a cohort of 121 persons with TBI who completed the Wechsler Adult Intelligence Scale-Fourth Edition (WAIS-IV) within 1-12 months after injury. Regression analyses indicated that cognitive reserve was a statistically significant predictor of all postinjury WAIS-IV factor index scores, after controlling for various premorbid and comorbid confounding variables. Only for Processing Speed did injury severity make an additional statistically significant contribution to the prediction model. Cognitive reserve has a protective effect with regard to the impact of TBI on cognitive test performance, but this effect is imperfect and does not completely negate the effect of injury severity.

  6. EVALUATION OF THE EXTRACELLULAR MATRIX OF INJURED SUPRASPINATUS IN RATS

    PubMed Central

    Almeida, Luiz Henrique Oliveira; Ikemoto, Roberto; Mader, Ana Maria; Pinhal, Maria Aparecida Silva; Munhoz, Bruna; Murachovsky, Joel

    2016-01-01

    ABSTRACT Objective: To evaluate the evolution of injuries of the supraspinatus muscle by immunohistochemistry (IHC) and anatomopathological analysis in an animal model (Wistar rats). Methods: Twenty-five Wistar rats were submitted to complete injury of the supraspinatus tendon and subsequently sacrificed in groups of five animals at the following time points: immediately after the injury, and 24 h, 48 h, 30 days, and three months after the injury. All groups underwent histological and IHC analysis. Results: Regarding vascular proliferation and inflammatory infiltrate, we found a statistically significant difference between groups 1 (control group) and 2 (24 h after injury). IHC analysis showed that the expression of vascular endothelial growth factor (VEGF) differed significantly between groups 1 and 2, and the collagen type 1 (Col-1) evaluation presented a statistically significant difference between groups 1 and 4. Conclusion: We observed changes in the extracellular matrix components compatible with remodeling and healing. Remodeling is more intense 24 h after injury. However, VEGF and Col-1 are substantially increased at 24 h and 30 days after the injury, respectively. Level of Evidence I, Experimental Study. PMID:26997907

  7. The mediating effect of calling on the relationship between medical school students’ academic burnout and empathy

    PubMed Central

    2017-01-01

    Purpose This study aimed to identify the relationships between medical school students’ academic burnout, empathy, and calling, and to determine whether calling has a mediating effect on the relationship between academic burnout and empathy. Methods A mixed-methods study was conducted. One hundred twenty-seven medical students completed a survey. Scales measuring academic burnout, medical students’ empathy, and calling were utilized. For statistical analysis, correlation analysis, descriptive statistics, and hierarchical multiple regression analyses were conducted. For the qualitative approach, eight medical students participated in a focus group interview. Results The study found that empathy has a statistically significant, negative correlation with academic burnout, while having a significant, positive correlation with calling. Sense of calling proved to be an effective mediator of the relationship between academic burnout and empathy. Conclusion This result demonstrates that calling is a key variable that mediates the relationship between medical students’ academic burnout and empathy. As such, this study provides baseline data for educational programs that could improve medical students’ empathy skills. PMID:28870019

  8. Video Games as a Context for Numeracy Development

    ERIC Educational Resources Information Center

    Thomas, Troy A.; Wiest, Lynda R.

    2013-01-01

    Troy Thomas and Lynda Wiest share an engaging lesson on statistics involving analysis of real-world data on the top ten video game sales in the United States during a one-week period. Three upper-primary classes completed the lesson, providing insight into the lesson's effectiveness. The lesson description includes attention to the manner in which…

  9. Categories of Computer Use and Their Relationships with Attitudes toward Computers.

    ERIC Educational Resources Information Center

    Mitra, Anandra

    1998-01-01

    Analysis of attitude and use questionnaires completed by undergraduates (n = 1,444) at Wake Forest University determined that computers were used most frequently for word processing. Other uses were e-mail for task and non-task activities and mathematical and statistical computation. Results suggest that the level of computer use was related to…

  10. 78 FR 792 - Self-Regulatory Organizations; National Securities Clearing Corporation; Notice of Filing of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-04

    ... this formula and is designed to calculate the amount of money that may be lost on a portfolio over a... to statistical analysis, such as OTC Bulletin Board or Pink Sheet issues or issues trading below a... considered matched and institutional delivery details are sent to DTC for settlement. Completion of the money...

  11. 78 FR 3960 - Self-Regulatory Organizations; National Securities Clearing Corporation; Notice of Filing of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-17

    ... ``VaR,'' is a core component of this formula and is designed to calculate the amount of money that may... volatility is (x) less amendable to statistical analysis, such as OTC Bulletin Board or Pink Sheet issues or... to DTC for settlement. Completion of the money and securities settlement of institutional trades...

  12. Data Analysis and Statistics across the Curriculum. Curriculum and Evaluation Standards for School Mathematics Addenda Series. Grades 9-12.

    ERIC Educational Resources Information Center

    Burrill, Gail; And Others

    The 1989 document, "Curriculum and Evaluation Standards for School Mathematics" (the "Standards"), provides a vision and a framework for revising and strengthening the K-12 mathematics curriculum in North American schools and for evaluating both the mathematics curriculum and students' progress. When completed, it is expected…

  13. Symmetric and asymmetric ternary fission of hot nuclei

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Siwek-Wilczynska, K.; Wilczynski, J.; Leegte, H.K.W.

    1993-07-01

    Emission of α particles accompanying fusion-fission processes in the ⁴⁰Ar + ²³²Th reaction at E(⁴⁰Ar) = 365 MeV was studied in a wide range of in-fission-plane and out-of-plane angles. The exact determination of the emission angles of both fission fragments combined with the time-of-flight measurements allowed us to reconstruct the complete kinematics of each ternary event. The coincident energy spectra of α particles were analyzed by using predictions of the energy spectra of the statistical code CASCADE. The analysis clearly demonstrates emission from the composite system prior to fission, emission from fully accelerated fragments after fission, and also emission during scission. The analysis is presented for both symmetric and asymmetric fission. The results have been analyzed using a time-dependent statistical decay code and confronted with dynamical calculations based on a classical one-body dissipation model. The observed near-scission emission is consistent with evaporation from a dinuclear system just before scission and evaporation from separated fragments just after scission. The analysis suggests that the time scale of fission of the hot composite systems is long (about 7×10⁻²⁰ s) and the motion during the descent to scission almost completely damped.

  14. Marginalized zero-inflated Poisson models with missing covariates.

    PubMed

    Benecha, Habtamu K; Preisser, John S; Divaris, Kimon; Herring, Amy H; Das, Kalyan

    2018-05-11

    Unlike zero-inflated Poisson regression, marginalized zero-inflated Poisson (MZIP) models for counts with excess zeros provide estimates with direct interpretations for the overall effects of covariates on the marginal mean. In the presence of missing covariates, MZIP and many other count data models are ordinarily fitted using complete case analysis methods due to lack of appropriate statistical methods and software. This article presents an estimation method for MZIP models with missing covariates. The method, which is applicable to other missing data problems, is illustrated and compared with complete case analysis by using simulations and dental data on the caries preventive effects of a school-based fluoride mouthrinse program. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
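
    The marginalized ZIP model described here is not available in standard Python libraries, so the sketch below shows only the ordinary zero-inflated Poisson fit on complete cases, i.e. the kind of baseline the article contrasts with its proposed method. The simulated caries-like counts and covariate are hypothetical; statsmodels' ZeroInflatedPoisson is assumed to be available.

    ```python
    import numpy as np
    import statsmodels.api as sm
    from statsmodels.discrete.count_model import ZeroInflatedPoisson

    rng = np.random.default_rng(1)
    n = 500
    x = rng.binomial(1, 0.5, n)                 # hypothetical exposure (e.g. rinse program)
    latent_zero = rng.random(n) < 0.3           # structural zeros
    mu = np.exp(0.8 - 0.4 * x)
    y = np.where(latent_zero, 0, rng.poisson(mu))

    X = sm.add_constant(x)
    # Constant-only inflation component; 'logit' link for the zero-inflation part
    model = ZeroInflatedPoisson(y, X, exog_infl=np.ones((n, 1)), inflation='logit')
    result = model.fit(disp=0)
    print(result.summary())
    ```

    In a plain ZIP fit the count-model coefficients describe the non-inflated subpopulation, whereas the marginalized formulation in the article targets the overall marginal mean directly.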

  15. Automated system for the on-line monitoring of powder blending processes using near-infrared spectroscopy. Part I. System development and control.

    PubMed

    Hailey, P A; Doherty, P; Tapsell, P; Oliver, T; Aldridge, P K

    1996-03-01

    An automated system for the on-line monitoring of powder blending processes is described. The system employs near-infrared (NIR) spectroscopy using fibre-optics and a graphical user interface (GUI) developed in the LabVIEW environment. The complete supervisory control and data analysis (SCADA) software controls blender and spectrophotometer operation and performs statistical spectral data analysis in real time. A data analysis routine using standard deviation is described to demonstrate an approach to the real-time determination of blend homogeneity.
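
    One common way to operationalize the standard-deviation criterion described above is to track a moving standard deviation of consecutive NIR scans and declare the blend homogeneous once it stays below a threshold. The sketch below is a hypothetical illustration with simulated spectra, not the authors' SCADA code; the window size and threshold are arbitrary.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_scans, n_wavelengths = 60, 100

    # Simulated spectra that converge toward a stable (homogeneous) blend
    target = rng.random(n_wavelengths)
    spectra = np.array([target + rng.normal(0, max(0.2 - 0.004 * t, 0.01), n_wavelengths)
                        for t in range(n_scans)])

    window, threshold = 5, 0.02
    for t in range(window, n_scans):
        block = spectra[t - window:t]
        # Mean of the per-wavelength standard deviations over the moving window
        sd = block.std(axis=0, ddof=1).mean()
        if sd < threshold:
            print(f"Blend judged homogeneous at scan {t} (moving SD = {sd:.4f})")
            break
    ```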

  16. An improved multiple linear regression and data analysis computer program package

    NASA Technical Reports Server (NTRS)

    Sidik, S. M.

    1972-01-01

    NEWRAP, an improved version of a previous multiple linear regression program called RAPIER, CREDUC, and CRSPLT, allows for a complete regression analysis including cross plots of the independent and dependent variables, correlation coefficients, regression coefficients, analysis of variance tables, t-statistics and their probability levels, rejection of independent variables, plots of residuals against the independent and dependent variables, and a canonical reduction of quadratic response functions useful in optimum seeking experimentation. A major improvement over RAPIER is that all regression calculations are done in double precision arithmetic.

  17. Statistical models of lunar rocks and regolith

    NASA Technical Reports Server (NTRS)

    Marcus, A. H.

    1973-01-01

    The mathematical, statistical, and computational approaches used in the investigation of the interrelationship of lunar fragmental material, regolith, lunar rocks, and lunar craters are described. The first two phases of the work explored the sensitivity of the production model of fragmental material to mathematical assumptions, and then completed earlier studies on the survival of lunar surface rocks with respect to competing processes. The third phase combined earlier work into a detailed statistical analysis and probabilistic model of regolith formation by lithologically distinct layers, interpreted as modified crater ejecta blankets. The fourth phase of the work dealt with problems encountered in combining the results of the entire project into a comprehensive, multipurpose computer simulation model for the craters and regolith. Highlights of each phase of research are given.

  18. Mars Pathfinder Near-Field Rock Distribution Re-Evaluation

    NASA Technical Reports Server (NTRS)

    Haldemann, A. F. C.; Golombek, M. P.

    2003-01-01

    We have completed analysis of a new near-field rock count at the Mars Pathfinder landing site and determined that the previously published rock count suggesting 16% cumulative fractional area (CFA) covered by rocks is incorrect. The earlier value is not so much wrong (our new CFA is 20%), as right for the wrong reason: both the old and the new CFA's are consistent with remote sensing data, however the earlier determination incorrectly calculated rock coverage using apparent width rather than average diameter. Here we present details of the new rock database and the new statistics, as well as the importance of using rock average diameter for rock population statistics. The changes to the near-field data do not affect the far-field rock statistics.

  19. Analysis of censored data.

    PubMed

    Lucijanic, Marko; Petrovecki, Mladen

    2012-01-01

    Analyzing events over time is often complicated by incomplete, or censored, observations. Special non-parametric statistical methods were developed to overcome difficulties in summarizing and comparing censored data. The life-table (actuarial) method and the Kaplan-Meier method are described with an explanation of survival curves. For didactic purposes, the authors prepared a workbook based on the most widely used Kaplan-Meier method. It should help the reader understand how the Kaplan-Meier method is conceptualized and how it can be used to obtain the statistics and survival curves needed to completely describe a sample of patients. The log-rank test and hazard ratio are also discussed.
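
    A minimal sketch of the Kaplan-Meier and log-rank methods discussed above, using the lifelines package (assumed available) on small hypothetical samples with censored observations; it is an illustration, not the article's workbook.

    ```python
    from lifelines import KaplanMeierFitter
    from lifelines.statistics import logrank_test

    # Hypothetical follow-up times (months) and event indicators (1 = event, 0 = censored)
    times_a = [5, 8, 12, 12, 20, 24, 30]
    events_a = [1, 1, 1, 0, 1, 0, 0]
    times_b = [3, 6, 7, 10, 15, 18, 22]
    events_b = [1, 1, 1, 1, 1, 0, 1]

    kmf = KaplanMeierFitter()
    kmf.fit(times_a, event_observed=events_a, label="group A")
    print(kmf.survival_function_)          # step-function estimate of S(t)
    print(kmf.median_survival_time_)

    # Log-rank test comparing the two groups
    result = logrank_test(times_a, times_b,
                          event_observed_A=events_a, event_observed_B=events_b)
    print(result.p_value)
    ```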

  20. MetaGenyo: a web tool for meta-analysis of genetic association studies.

    PubMed

    Martorell-Marugan, Jordi; Toro-Dominguez, Daniel; Alarcon-Riquelme, Marta E; Carmona-Saez, Pedro

    2017-12-16

    Genetic association studies (GAS) aim to evaluate the association between genetic variants and phenotypes. In the last few years, the number of such studies has increased exponentially, but the results are not always reproducible due to flaws in experimental design, low sample sizes and other methodological errors. In this field, meta-analysis techniques are becoming very popular tools to combine results across studies, increase statistical power and resolve discrepancies in genetic association studies. A meta-analysis summarizes research findings, increases statistical power and enables the identification of genuine associations between genotypes and phenotypes. Meta-analysis techniques are increasingly used in GAS, but the number of published meta-analyses containing errors is also increasing. Although several software packages implement meta-analysis, none of them is specifically designed for genetic association studies, and in most cases their use requires advanced programming or scripting expertise. We have developed MetaGenyo, a web tool for meta-analysis in GAS. MetaGenyo implements a complete and comprehensive workflow that can be executed in an easy-to-use environment without programming knowledge. MetaGenyo has been developed to guide users through the main steps of a GAS meta-analysis, covering the Hardy-Weinberg test, statistical association for different genetic models, analysis of heterogeneity, testing for publication bias, subgroup analysis and robustness testing of the results. MetaGenyo is a useful tool to conduct comprehensive genetic association meta-analyses. The application is freely available at http://bioinfo.genyo.es/metagenyo/.
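
    The core computation behind such a meta-analysis is an inverse-variance pooling of per-study odds ratios. The sketch below is a generic fixed-effect example with hypothetical 2x2 allele counts, not the MetaGenyo implementation, which additionally handles genetic models, Hardy-Weinberg testing and publication bias.

    ```python
    import numpy as np
    from scipy import stats

    # Each row: cases with allele, cases without, controls with allele, controls without
    studies = np.array([
        [120,  80, 100, 100],
        [ 60,  40,  55,  45],
        [200, 150, 170, 180],
    ], dtype=float)

    a, b, c, d = studies.T
    log_or = np.log((a * d) / (b * c))
    var_log_or = 1 / a + 1 / b + 1 / c + 1 / d      # Woolf's variance of log OR
    w = 1 / var_log_or

    pooled = np.sum(w * log_or) / np.sum(w)          # fixed-effect pooled log OR
    se = np.sqrt(1 / np.sum(w))
    p = 2 * stats.norm.sf(abs(pooled / se))

    q = np.sum(w * (log_or - pooled) ** 2)           # Cochran's Q heterogeneity statistic
    print(f"Pooled OR = {np.exp(pooled):.2f} (p = {p:.3g}), Q = {q:.2f}")
    ```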

  1. Comparison of immediate complete denture, tooth and implant-supported overdenture on vertical dimension and muscle activity

    PubMed Central

    Shah, Farhan Khalid; Gebreel, Ashraf; Elshokouki, Ali hamed; Habib, Ahmed Ali

    2012-01-01

    PURPOSE To compare the changes in occlusal vertical dimension, masseter muscle activity and biting force after insertion of immediate mandibular complete dentures constructed as conventional, tooth-supported and implant-supported prostheses. MATERIALS AND METHODS Patients were selected and treated according to the three different concepts, i.e., immediate mandibular complete dentures constructed as conventional (Group A), tooth-supported (Group B) and implant-supported (Group C) prostheses. The parameters for evaluation and comparison were occlusal vertical dimension measured by radiograph (at three different time intervals), masseter muscle electromyographic (EMG) measurement by EMG analysis (at three different jaw positions) and bite force measured by a force transducer (at two different time intervals). The obtained data were statistically analyzed using the ANOVA F test at the 5% level of significance. If the F test was significant, the Least Significant Difference test was performed to test further significant differences between variables. RESULTS Comparison of the mean differences in occlusal vertical dimension among the tested groups was statistically significant only at 1 year after immediate denture insertion. Comparison of the mean differences in the wavelet packet coefficients of the masseter electromyographic signals among the tested groups was not significant at the rest position, but was significant at the initial contact position and the maximum voluntary clench position. Comparison of the mean differences in maximum biting force among the tested groups was not statistically significant at the 5% level of significance. CONCLUSION Immediate complete overdentures, whether tooth- or implant-supported, are recommended over totally mucosa-supported prostheses. PMID:22737309

  2. Materials of acoustic analysis: sustained vowel versus sentence.

    PubMed

    Moon, Kyung Ray; Chung, Sung Min; Park, Hae Sang; Kim, Han Su

    2012-09-01

    Sustained vowel phonation is a widely used material for acoustic analysis. However, vowel phonation does not sufficiently represent sentence-based real-life phonation, and biases may occur depending on the test subject's intent during pronunciation. The purpose of this study was to investigate the differences between the results of acoustic analysis using each material. An individual prospective study. Two hundred two individuals (87 men and 115 women) with normal findings on videostroboscopy were enrolled. Acoustic analysis was done using the speech pattern element acquisition and display program. Fundamental frequency (Fx), amplitude (Ax), contact quotient (Qx), jitter, and shimmer were measured with sustained vowel-based acoustic analysis. Average fundamental frequency (FxM), average amplitude (AxM), average contact quotient (QxM), Fx perturbation (CFx), and amplitude perturbation (CAx) were measured with sentence-based acoustic analysis. Corresponding data from the two methods were compared with each other. SPSS (Statistical Package for the Social Sciences, Version 12.0; SPSS, Inc., Chicago, IL) software was used for statistical analysis. FxM was higher than Fx in men (Fx, 124.45 Hz; FxM, 133.09 Hz; P=0.000). In women, FxM seemed to be lower than Fx, but the results were not statistically significant (Fx, 210.58 Hz; FxM, 208.34 Hz; P=0.065). There was no statistical significance between Ax and AxM in either group. QxM was higher than Qx in both men and women. Jitter was lower in men, but CFx was lower in women. Both shimmer and CAx were higher in men. Sustained vowel phonation could not be a complete substitute for real-time phonation in acoustic analysis. The characteristics of acoustic materials should be considered when choosing the material for acoustic analysis and interpreting the results. Copyright © 2012 The Voice Foundation. Published by Mosby, Inc. All rights reserved.

  3. GWAR: robust analysis and meta-analysis of genome-wide association studies.

    PubMed

    Dimou, Niki L; Tsirigos, Konstantinos D; Elofsson, Arne; Bagos, Pantelis G

    2017-05-15

    In the context of genome-wide association studies (GWAS), a variety of statistical techniques are available for conducting the analysis, but, in most cases, the underlying genetic model is unknown. Under these circumstances, the classical Cochran-Armitage trend test (CATT) is suboptimal. Robust procedures that maximize power and preserve the nominal type I error rate are preferable. Moreover, performing a meta-analysis using robust procedures is of great interest and has never been addressed in the past. The primary goal of this work is to implement several robust methods for analysis and meta-analysis in the statistical package Stata and subsequently to make the software available to the scientific community. The CATT under a recessive, additive and dominant model of inheritance, as well as robust methods based on the Maximum Efficiency Robust Test statistic, the MAX statistic and the MIN2, were implemented in Stata. Concerning MAX and MIN2, we calculated their asymptotic null distributions relying on numerical integration, resulting in a great gain in computational time without losing accuracy. All the aforementioned approaches were employed in a fixed- or random-effects meta-analysis setting using summary data with weights equal to the reciprocal of the combined cases and controls. Overall, this is the first complete effort to implement procedures for analysis and meta-analysis in GWAS using Stata. A Stata program and a web-server are freely available for academic users at http://www.compgen.org/tools/GWAR. pbagos@compgen.org. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
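
    For reference, a minimal sketch of the classical Cochran-Armitage trend test that the robust procedures above build on, under an additive score assignment. The genotype counts are hypothetical and this is not the GWAR Stata code.

    ```python
    import numpy as np
    from scipy import stats

    def catt(cases, controls, scores=(0, 1, 2)):
        """Cochran-Armitage trend test for a 2 x k table of genotype counts."""
        cases = np.asarray(cases, float)
        controls = np.asarray(controls, float)
        w = np.asarray(scores, float)
        r, s = cases.sum(), controls.sum()
        n = cases + controls
        total = r + s
        t_stat = np.sum(w * (cases * s - controls * r))
        var = (r * s / total) * (np.sum(w ** 2 * n * (total - n))
                                 - 2 * np.sum(np.triu(np.outer(w, w) * np.outer(n, n), k=1)))
        z = t_stat / np.sqrt(var)
        return z, 2 * stats.norm.sf(abs(z))

    # Hypothetical genotype counts (aa, aA, AA) in cases and controls
    z, p = catt([40, 90, 70], [60, 100, 40])
    print(f"CATT z = {z:.2f}, two-sided p = {p:.4f}")
    ```

    Robust alternatives such as MAX effectively evaluate this statistic under several score assignments (recessive, additive, dominant) and take the most extreme value, which is why their null distributions require the numerical work described in the abstract.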

  4. Multivariate analysis, mass balance techniques, and statistical tests as tools in igneous petrology: application to the Sierra de las Cruces volcanic range (Mexican Volcanic Belt).

    PubMed

    Velasco-Tapia, Fernando

    2014-01-01

    Magmatic processes have usually been identified and evaluated using qualitative or semiquantitative geochemical or isotopic tools based on a restricted number of variables. However, a more complete and quantitative view can be reached by applying multivariate analysis, mass balance techniques, and statistical tests. As an example, in this work a statistical and quantitative scheme is applied to analyze the geochemical features of the Sierra de las Cruces (SC) volcanic range (Mexican Volcanic Belt). In this locality, the volcanic activity (3.7 to 0.5 Ma) was dominantly dacitic, but the presence of spheroidal andesitic enclaves and/or diverse disequilibrium features in the majority of lavas confirms the operation of magma mixing/mingling. New discriminant-function-based multidimensional diagrams were used to discriminate tectonic setting. Statistical tests of discordancy and significance were applied to evaluate the influence of the subducting Cocos plate, which seems to be rather negligible for the SC magmas in relation to several major and trace elements. A cluster analysis following Ward's linkage rule was carried out to classify the SC volcanic rocks into geochemical groups. Finally, two mass-balance schemes were applied for the quantitative evaluation of the proportion of the end-member components (dacitic and andesitic magmas) in the comingled lavas (binary mixtures).
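
    A minimal sketch of hierarchical clustering with Ward's linkage, the technique named above, applied to a hypothetical matrix of standardized major-element concentrations (rows = samples, columns = oxides). The data and group compositions are simulated, not the SC dataset.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.stats import zscore

    rng = np.random.default_rng(42)
    # Two simulated geochemical groups (e.g. dacite-like vs andesite-like compositions)
    group1 = rng.normal([65, 16, 4, 3], 0.5, size=(10, 4))
    group2 = rng.normal([58, 17, 7, 6], 0.5, size=(10, 4))
    X = zscore(np.vstack([group1, group2]), axis=0)   # standardize each variable

    Z = linkage(X, method="ward")                     # Ward's minimum-variance linkage
    labels = fcluster(Z, t=2, criterion="maxclust")   # cut the dendrogram into 2 groups
    print(labels)
    ```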

  5. Practicality of Elementary Statistics Module Based on CTL Completed by Instructions on Using Software R

    NASA Astrophysics Data System (ADS)

    Delyana, H.; Rismen, S.; Handayani, S.

    2018-04-01

    This research is a development study using the 4-D design model (define, design, develop, and disseminate). In the define stage, the following needs analyses were carried out: syllabus analysis, textbook analysis, student characteristics analysis and literature analysis. The textbook analysis indicated that students still had difficulty understanding the two required textbooks, that their form of presentation did not help students learn independently and discover concepts, and that the textbooks were not equipped with instructions for data processing using the software R. The developed module was judged valid by the experts. Field trials were then conducted to determine its practicality and effectiveness. The trial was conducted with four randomly selected students of the Mathematics Education Study Program of STKIP PGRI who had not yet taken the Basic Statistics course. The practicality aspects considered were ease of use, time efficiency, ease of interpretation, and equivalence. The practicality scores for these aspects were 3.7, 3.79, 3.7 and 3.78, respectively. Based on the trial results, students considered the module very practical for use in learning. This means that the developed module can be used by students in Elementary Statistics learning.

  6. Status Report on Female Completers in New Jersey Vocational Education 1990.

    ERIC Educational Resources Information Center

    Montclair State Coll., Upper Montclair, NJ. Life Skills Center.

    The New Jersey Occupational Information Coordinating Committee's statistics for average annual predicted job openings for program year 1989 are given in this report, along with the New Jersey Division of Vocational Education completers' statistics for the 1988-89 school year. The numbers of male and female completers of secondary programs for each…

  7. Student Performance in an Introductory Business Statistics Course: Does Delivery Mode Matter?

    ERIC Educational Resources Information Center

    Haughton, Jonathan; Kelly, Alison

    2015-01-01

    Approximately 600 undergraduates completed an introductory business statistics course in 2013 in one of two learning environments at Suffolk University, a mid-sized private university in Boston, Massachusetts. The comparison group completed the course in a traditional classroom-based environment, whereas the treatment group completed the course in…

  8. Calculation of Weibull strength parameters, Batdorf flaw density constants and related statistical quantities using PC-CARES

    NASA Technical Reports Server (NTRS)

    Szatmary, Steven A.; Gyekenyesi, John P.; Nemeth, Noel N.

    1990-01-01

    This manual describes the operation and theory of the PC-CARES (Personal Computer-Ceramic Analysis and Reliability Evaluation of Structures) computer program for the IBM PC and compatibles running the PC-DOS/MS-DOS or IBM/MS OS/2 (version 1.1 or higher) operating systems. The primary purpose of this code is to estimate Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities. Included in the manual is the description of the calculation of the shape and scale parameters of the two-parameter Weibull distribution using the least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics with complete and censored samples. The methods for detecting outliers and for calculating the Kolmogorov-Smirnov and the Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull line, as well as the techniques for calculating the Batdorf flaw-density constants, are also described.
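
    A minimal sketch of the maximum-likelihood step for a complete (uncensored) strength sample, using scipy rather than PC-CARES; the strength values are simulated and censored-sample handling, goodness-of-fit bands and Batdorf constants are not covered here.

    ```python
    from scipy import stats

    # Simulated fracture strengths (MPa) drawn from a two-parameter Weibull distribution
    strengths = stats.weibull_min.rvs(10, scale=400, size=30, random_state=7)

    # floc=0 fixes the location parameter so only shape and scale are estimated
    shape, loc, scale = stats.weibull_min.fit(strengths, floc=0)
    print(f"Weibull modulus (shape) = {shape:.2f}, characteristic strength = {scale:.1f} MPa")
    ```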

  9. Analysis of pediatric blood lead levels in New York City for 1970-1976.

    PubMed Central

    Billick, I H; Curran, A S; Shier, D R

    1979-01-01

    A study was completed of more than 170,000 records of pediatric venous blood lead levels and supporting demographic information collected in New York City during 1970-1976. The geometric mean (GM) blood lead level shows a consistent cyclical variation superimposed on an overall decreasing trend with time for all ages and ethnic groups studied. The GM blood lead levels for blacks are significantly greater than those for either Hispanics or whites. Regression analysis indicates a significant statistical association between GM blood lead level and ambient air lead level, after appropriate adjustments are made for age and ethnic group. These highly significant statistical relationships provide extremely strong incentives and directions for research into causal factors related to blood lead levels in children. PMID:499123

  10. Therapies for acute myeloid leukemia: vosaroxin

    PubMed Central

    Sayar, Hamid; Bashardoust, Parvaneh

    2017-01-01

    Vosaroxin, a quinolone-derivative chemotherapeutic agent, was considered a promising drug for the treatment of acute myeloid leukemia (AML). Early-stage clinical trials with this agent led to a large randomized double-blind placebo-controlled study of vosaroxin in combination with intermediate-dose cytarabine for the treatment of relapsed or refractory AML. The study demonstrated better complete remission rates with vosaroxin, but there was no statistically significant overall survival benefit in the whole cohort. A subset analysis censoring patients who had undergone allogeneic stem cell transplantation, however, revealed a modest but statistically significant improvement in overall survival particularly among older patients. This article reviews the data available on vosaroxin including clinical trials in AML and offers an analysis of findings of these studies as well as the current status of vosaroxin. PMID:28860803

  11. Therapies for acute myeloid leukemia: vosaroxin.

    PubMed

    Sayar, Hamid; Bashardoust, Parvaneh

    2017-01-01

    Vosaroxin, a quinolone-derivative chemotherapeutic agent, was considered a promising drug for the treatment of acute myeloid leukemia (AML). Early-stage clinical trials with this agent led to a large randomized double-blind placebo-controlled study of vosaroxin in combination with intermediate-dose cytarabine for the treatment of relapsed or refractory AML. The study demonstrated better complete remission rates with vosaroxin, but there was no statistically significant overall survival benefit in the whole cohort. A subset analysis censoring patients who had undergone allogeneic stem cell transplantation, however, revealed a modest but statistically significant improvement in overall survival particularly among older patients. This article reviews the data available on vosaroxin including clinical trials in AML and offers an analysis of findings of these studies as well as the current status of vosaroxin.

  12. Are general practice characteristics predictors of good glycaemic control in patients with diabetes? A cross-sectional study.

    PubMed

    Esterman, Adrian J; Fountaine, Tim; McDermott, Robyn

    2016-01-18

    To determine whether certain characteristics of general practices are associated with good glycaemic control in patients with diabetes and with completing an annual cycle of care (ACC). Our cross-sectional analysis used baseline data from the Australian Diabetes Care Project conducted between 2011 and 2014. Practice characteristics were self-reported. Characteristics of the patients that were assessed included glycaemic control (HbA1c level ≤ 53 mmol/mol), age, sex, duration of diabetes, socio-economic disadvantage (SEIFA) score, the complexity of the patient's condition, and whether the patient had completed an ACC for diabetes in the past 18 months. Clustered logistic regression was used to establish predictors of glycaemic control and of a completed ACC. Data were available from 147 general practices and 5455 patients with established type 1 or type 2 diabetes in three Australian states. After adjustment for other patient characteristics, only completion of an ACC was a statistically significant predictor of glycaemic control (P = 0.011). In a multivariate model, the practice having a chronic disease-focused practice nurse (P = 0.036) and running educational events for patients with diabetes (P = 0.004) were statistically significant predictors of the patient having completed an ACC. Patient characteristics are moderately good predictors of whether the patient is in glycaemic control, whereas practice characteristics appear to predict only the likelihood of patients completing an ACC. The ACC is an established indicator of good diabetes management. This is the first study to report a positive association between having completed an ACC and the patient being in glycaemic control.
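
    One common way to fit a logistic model while accounting for the clustering of patients within practices is a GEE with an exchangeable working correlation; the article reports clustered logistic regression, so treat the sketch below as an analogous, not identical, approach. The variable names and simulated data are hypothetical.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(3)
    n_practices, per_practice = 30, 20
    practice = np.repeat(np.arange(n_practices), per_practice)
    practice_effect = rng.normal(0, 0.5, n_practices)[practice]
    completed_acc = rng.binomial(1, 0.6, practice.size)
    age = rng.normal(60, 10, practice.size)

    # Simulated probability of being in glycaemic control
    logit_p = -0.5 + 0.6 * completed_acc + 0.01 * (age - 60) + practice_effect
    in_control = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

    df = pd.DataFrame({"in_control": in_control, "completed_acc": completed_acc,
                       "age": age, "practice": practice})
    model = smf.gee("in_control ~ completed_acc + age", groups="practice", data=df,
                    family=sm.families.Binomial(),
                    cov_struct=sm.cov_struct.Exchangeable())
    print(model.fit().summary())
    ```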

  13. Difficult Decisions: A Qualitative Exploration of the Statistical Decision Making Process from the Perspectives of Psychology Students and Academics

    PubMed Central

    Allen, Peter J.; Dorozenko, Kate P.; Roberts, Lynne D.

    2016-01-01

    Quantitative research methods are essential to the development of professional competence in psychology. They are also an area of weakness for many students. In particular, students are known to struggle with the skill of selecting quantitative analytical strategies appropriate for common research questions, hypotheses and data types. To begin understanding this apparent deficit, we presented nine psychology undergraduates (who had all completed at least one quantitative methods course) with brief research vignettes, and asked them to explicate the process they would follow to identify an appropriate statistical technique for each. Thematic analysis revealed that all participants found this task challenging, and even those who had completed several research methods courses struggled to articulate how they would approach the vignettes on more than a very superficial and intuitive level. While some students recognized that there is a systematic decision making process that can be followed, none could describe it clearly or completely. We then presented the same vignettes to 10 psychology academics with particular expertise in conducting research and/or research methods instruction. Predictably, these “experts” were able to describe a far more systematic, comprehensive, flexible, and nuanced approach to statistical decision making, which begins early in the research process, and pays consideration to multiple contextual factors. They were sensitive to the challenges that students experience when making statistical decisions, which they attributed partially to how research methods and statistics are commonly taught. This sensitivity was reflected in their pedagogic practices. When asked to consider the format and features of an aid that could facilitate the statistical decision making process, both groups expressed a preference for an accessible, comprehensive and reputable resource that follows a basic decision tree logic. For the academics in particular, this aid should function as a teaching tool, which engages the user with each choice-point in the decision making process, rather than simply providing an “answer.” Based on these findings, we offer suggestions for tools and strategies that could be deployed in the research methods classroom to facilitate and strengthen students' statistical decision making abilities. PMID:26909064

  14. Difficult Decisions: A Qualitative Exploration of the Statistical Decision Making Process from the Perspectives of Psychology Students and Academics.

    PubMed

    Allen, Peter J; Dorozenko, Kate P; Roberts, Lynne D

    2016-01-01

    Quantitative research methods are essential to the development of professional competence in psychology. They are also an area of weakness for many students. In particular, students are known to struggle with the skill of selecting quantitative analytical strategies appropriate for common research questions, hypotheses and data types. To begin understanding this apparent deficit, we presented nine psychology undergraduates (who had all completed at least one quantitative methods course) with brief research vignettes, and asked them to explicate the process they would follow to identify an appropriate statistical technique for each. Thematic analysis revealed that all participants found this task challenging, and even those who had completed several research methods courses struggled to articulate how they would approach the vignettes on more than a very superficial and intuitive level. While some students recognized that there is a systematic decision making process that can be followed, none could describe it clearly or completely. We then presented the same vignettes to 10 psychology academics with particular expertise in conducting research and/or research methods instruction. Predictably, these "experts" were able to describe a far more systematic, comprehensive, flexible, and nuanced approach to statistical decision making, which begins early in the research process, and pays consideration to multiple contextual factors. They were sensitive to the challenges that students experience when making statistical decisions, which they attributed partially to how research methods and statistics are commonly taught. This sensitivity was reflected in their pedagogic practices. When asked to consider the format and features of an aid that could facilitate the statistical decision making process, both groups expressed a preference for an accessible, comprehensive and reputable resource that follows a basic decision tree logic. For the academics in particular, this aid should function as a teaching tool, which engages the user with each choice-point in the decision making process, rather than simply providing an "answer." Based on these findings, we offer suggestions for tools and strategies that could be deployed in the research methods classroom to facilitate and strengthen students' statistical decision making abilities.

  15. Evaluation of Sensibility Threshold for Interocclusal Thickness of Patients Wearing Complete Dentures

    PubMed Central

    Shala, Kujtim Sh.; Ahmedi, Enis F.; Tmava-Dragusha, Arlinda

    2017-01-01

    Objective The aim of this study was to evaluate the sensibility threshold for interocclusal thickness in experienced and nonexperienced denture wearers after the insertion of new complete dentures. Materials and Methods A total of 88 patients with complete dentures participated in this study. The patients were divided into two experimental groups according to their previous experience of prosthetic dental treatment. The sensibility threshold for interocclusal thickness was measured with metal foil of 8 μm thickness and 8 mm width, placed between the upper and lower incisors. Statistical analysis was performed using the standard software package BMDP (biomedical statistical package). Results The results suggest that the time of measurement affects the average values of the sensibility threshold for interocclusal thickness (F = 242.68, p = 0.0000). Gender appeared to be a significant factor when it interacted with time of measurement, resulting in differences in the sensibility threshold for interocclusal thickness (gender: F = 9.84, p = 0.018; F = 4.83, p = 0.0003). Conclusion The sensibility threshold for interocclusal thickness was the most important indicator of functional adaptation in patients with complete dentures. A unique trait of this indicator is the progressive reduction of initial values and a tendency to reestablish the stationary state in the fifteenth week after the dentures are taken off. PMID:28702055

  16. Working Performance Analysis of Rolling Bearings Used in Mining Electric Excavator Crowd Reducer

    NASA Astrophysics Data System (ADS)

    Zhang, Y. H.; Hou, G.; Chen, G.; Liang, J. F.; Zheng, Y. M.

    2017-12-01

    With reference to statistical load data from the digging process, and on the basis of a dynamics simulation of the crowd reducer system, a simulation analysis of the working performance of the rolling bearings used in the crowd reducer of a large mining electric excavator was completed. The simulation analysis covers the internal load distribution, the contact stresses on the rolling elements and the fatigue life of the rolling bearings. The internal load characteristics of the rolling elements in the cylindrical roller bearings are obtained. The results of this study show that all rolling bearings satisfy the requirements for contact strength and fatigue life. The rationality of the bearing selection and arrangement is also verified.

  17. Analysis of satellite data on energetic particles of ionospheric origin

    NASA Technical Reports Server (NTRS)

    Sharp, R. D.; Johnson, R. G.; Shelley, E. G.

    1976-01-01

    The principal result of this program has been the completion of a detailed statistical study of the properties of precipitating O(+) and H(+) ions during two principal magnetic storms. The results of the analysis of selected data from the ion mass spectrometer experiment on the satellites are given, with emphasis on the morphology of the O(+) ions of ionospheric origin with energies in the 0.7 ≤ E ≤ 12 keV range that were discovered with this experiment.

  18. Small Sample Statistics for Incomplete Nonnormal Data: Extensions of Complete Data Formulae and a Monte Carlo Comparison

    ERIC Educational Resources Information Center

    Savalei, Victoria

    2010-01-01

    Incomplete nonnormal data are common occurrences in applied research. Although these 2 problems are often dealt with separately by methodologists, they often cooccur. Very little has been written about statistics appropriate for evaluating models with such data. This article extends several existing statistics for complete nonnormal data to…

  19. [The evaluation of costs: standards of medical care and clinical statistic groups].

    PubMed

    Semenov, V Iu; Samorodskaia, I V

    2014-01-01

    The article presents a comparative analysis of techniques for evaluating the costs of hospital treatment using medical economic standards of medical care and clinical statistical groups. The technique of cost evaluation on the basis of clinical statistical groups was developed almost fifty years ago and is widely applied in a number of countries. Nowadays, in Russia, payment per completed case of treatment on the basis of medical economic standards is the main mode of payment for medical care in hospital. It is only very loosely a Russian analogue of the internationally prevalent system of diagnosis-related groups. The tariffs for these cases of treatment, as opposed to clinical statistical groups, are calculated on the basis of the standards of provision of medical care approved by the Minzdrav of Russia. Information derived from generalizing the cases of treatment of real patients is not used.

  20. Nursing EDGE: evaluating delegation guidelines in education.

    PubMed

    Henderson, Deborah; Sealover, Pamela; Sharrer, Vicki; Fusner, Sally; Jones, Sandy; Sweet, Stacie; Blake, Tim

    2006-01-01

    Delegation, an important concept for nursing students to learn and practice, is central to registered nurse (RN) performance and important on the NCLEX-RN examination. Nursing faculty members from an ADN program designed a descriptive study to evaluate planned versus actual delegation in the curriculum, and a second study to evaluate an intervention on delegation. Study One assessed the presence of delegation in each nursing course. Statistical analysis compared the planned implementation with the results for student definitions of delegation and identification of the five rights of delegation based on the National Council of State Boards of Nursing (NCSBN) definition and five rights. Results of Study One are shared. Study Two utilized a comparison of pre- to post-intervention measures. Students were asked to complete eight steps of a delegation exercise and determine what could be delegated to an unlicensed assistant and what should be completed by the RN. Answers were coded and entered into SPSS. Statistical analysis compared each student's ability to correctly identify the five rights of delegation prior to the exercise against the ability to correctly answer five questions two weeks post exercise. Significant improvement (p < 0.05) occurred on each measure. Recommendations are discussed.

  1. A prospective study on transplantation of third molars with complete root formation.

    PubMed

    Mejàre, Bertil; Wannfors, Karin; Jansson, Leif

    2004-02-01

    The study objective was to evaluate the prognosis for autotransplantation of third molar teeth with fully developed roots followed by endodontic treatment, on the basis of a life-table analysis. A total of 50 third molars with completely developed roots were autotransplanted to replace a lost first or second molar in the same number of admitted patients. Root canal treatment was started 3 to 4 weeks later. Clinical and radiographic checkup of the transplanted and root-filled third molars was done annually according to a predesigned record form. Descriptive statistics including a life table and statistical analysis were performed. The cumulative survival rate during 4 years' follow-up was 81.4%. In all, 7 transplants were lost during the follow-up time, 4 of them due to marginal periodontal pathosis and the other 3 due to root resorption. None of the root resorptions was observed before the second postoperative year. The radiographic periapical status was considered normal in 96% of the transplants at the latest follow-up visit. Autotransplantation of mature third molar teeth is a reasonable treatment alternative to conventional prosthetic rehabilitation or implant treatment in cases of partial edentulism, from both a therapeutic and an economic point of view.

  2. On the self-preservation of turbulent jet flows with variable viscosity

    NASA Astrophysics Data System (ADS)

    Danaila, Luminita; Gauding, Michael; Varea, Emilien; Turbulence; mixing Team

    2017-11-01

    The concept of self-preservation has played an important role in shaping the understanding of turbulent flows. The assumption of complete self-preservation imposes certain constraints on the dynamics of the flow, allowing one-point or two-point statistics to be expressed by choosing an appropriate unique length scale. Determining this length scale and its scaling is of high relevance for modeling. In this work, we study turbulent jet flows with variable viscosity from the self-preservation perspective. Turbulent flows encountered in engineering and environmental applications are often characterized by fluctuations of viscosity resulting, for instance, from variations of temperature or species composition. Starting from the transport equation for the moments of the mixture fraction increment, constraints for self-preservation are derived. The analysis is based on direct numerical simulations of turbulent jet flows in which the viscosity of the host and jet fluids differs. It is shown that fluctuations of viscosity do not affect the decay exponents of the turbulent energy or the dissipation but do modify the scaling of two-point statistics in the dissipative range. Moreover, the analysis reveals that complete self-preservation in turbulent flows with variable viscosity cannot be achieved. Financial support from Labex EMC3 and FEDER is gratefully acknowledged.

  3. Factors influencing health professions students' use of computers for data analysis at three Ugandan public medical schools: a cross-sectional survey.

    PubMed

    Munabi, Ian G; Buwembo, William; Bajunirwe, Francis; Kitara, David Lagoro; Joseph, Ruberwa; Peter, Kawungezi; Obua, Celestino; Quinn, John; Mwaka, Erisa S

    2015-02-25

    Effective utilization of computers and their applications in medical education and research is of paramount importance to students. The objective of this study was to determine the association between owning a computer and use of computers for research data analysis, and the other factors influencing health professions students' computer use for data analysis. We conducted a cross-sectional study among undergraduate health professions students at three public universities in Uganda using a self-administered questionnaire. The questionnaire was composed of questions on participant demographics, students' participation in research, computer ownership, and use of computers for data analysis. Descriptive and inferential statistics (univariable and multi-level logistic regression analysis) were used to analyse data. The level of significance was set at 0.05. Six hundred (600) of 668 questionnaires were completed and returned (response rate 89.8%). A majority of respondents were male (68.8%) and 75.3% reported owning computers. Overall, 63.7% of respondents reported that they had ever done computer-based data analysis. The following factors were significant predictors of having ever done computer-based data analysis: ownership of a computer (adj. OR 1.80, p = 0.02), recently completed course in statistics (adj. OR 1.48, p = 0.04), and participation in research (adj. OR 2.64, p < 0.01). Owning a computer, participation in research and undertaking courses in research methods influence undergraduate students' use of computers for research data analysis. Students are increasingly participating in research, and thus need to have competencies for the successful conduct of research. Medical training institutions should encourage both curricular and extra-curricular efforts to enhance research capacity in line with the modern theories of adult learning.

  4. Sulfur in Cometary Dust

    NASA Technical Reports Server (NTRS)

    Fomenkova, M. N.

    1997-01-01

    The computer-intensive project consisted of the analysis and synthesis of existing data on composition of comet Halley dust particles. The main objective was to obtain a complete inventory of sulfur containing compounds in the comet Halley dust by building upon the existing classification of organic and inorganic compounds and applying a variety of statistical techniques for cluster and cross-correlational analyses. A student hired for this project wrote and tested the software to perform cluster analysis. The following tasks were carried out: (1) selecting the data from existing database for the proposed project; (2) finding access to a standard library of statistical routines for cluster analysis; (3) reformatting the data as necessary for input into the library routines; (4) performing cluster analysis and constructing hierarchical cluster trees using three methods to define the proximity of clusters; (5) presenting the output results in different formats to facilitate the interpretation of the obtained cluster trees; (6) selecting groups of data points common for all three trees as stable clusters. We have also considered the chemistry of sulfur in inorganic compounds.

  5. Content, Affective, and Behavioral Challenges to Learning: Students' Experiences Learning Statistics

    ERIC Educational Resources Information Center

    McGrath, April L.

    2014-01-01

    This study examined the experiences of and challenges faced by students when completing a statistics course. As part of the requirement for this course, students completed a learning check-in, which consisted of an individual meeting with the instructor to discuss questions and the completion of a learning reflection and study plan. Forty…

  6. Parental consanguineous marriages and clinical response to chemotherapy in locally advanced breast cancer patients.

    PubMed

    Saadat, Mostafa; Khalili, Maryam; Omidvari, Shahpour; Ansari-Lari, Maryam

    2011-03-28

    The main aim of the present study was to investigate the association between parental consanguinity and clinical response to chemotherapy in females affected with locally advanced breast cancer. A consecutive series of 92 patients were prospectively included in this study. Clinical assessment of treatment was accomplished by comparing initial tumor size with preoperative tumor size using the revised RECIST guideline (version 1.1). Clinical response was defined as complete response, partial response or no response. Kaplan-Meier survival analysis was used to evaluate the association between parental marriages (first cousin vs unrelated marriages) and clinical response to chemotherapy (complete and partial response vs no response). The number of courses of chemotherapy was considered as the time variable in the analysis. Kaplan-Meier analysis revealed that offspring of unrelated marriages had a poorer response to chemotherapy (log rank statistic = 5.10, df = 1, P = 0.023). Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  7. Evaluation of the marginal fit of metal copings fabricated on three different marginal designs using conventional and accelerated casting techniques: an in vitro study.

    PubMed

    Vaidya, Sharad; Parkash, Hari; Bhargava, Akshay; Gupta, Sharad

    2014-01-01

    Abundant resources and techniques have been used for complete coverage crown fabrication. Conventional investing and casting procedures for phosphate-bonded investments require a 2- to 4-h procedure before completion. Accelerated casting techniques have been used, but may not result in castings with matching marginal accuracy. This study measured the marginal gap and determined the clinical acceptability of single cast copings invested in a phosphate-bonded investment with the use of conventional and accelerated methods. One hundred and twenty cast coping samples were fabricated using conventional and accelerated methods, with three finish lines: chamfer, shoulder, and shoulder with bevel. Sixty copings were prepared with each technique. Each coping was examined with a stereomicroscope at four predetermined sites and the measurements of marginal gaps were documented for each. A master chart was prepared for all the data, which were analyzed using the Statistical Package for the Social Sciences (SPSS). The marginal gap was then evaluated by t-test. Analysis of variance and post-hoc analysis were used to compare the two groups as well as to make comparisons between the three subgroups. The measurements recorded showed no statistically significant difference between the conventional and accelerated groups. Among the three marginal designs studied, shoulder with bevel showed the best marginal fit with both conventional and accelerated casting techniques. The accelerated casting technique could be a viable alternative to the time-consuming conventional casting technique. The marginal fit between the two casting techniques showed no statistical difference.

  8. FabricS: A user-friendly, complete and robust software for particle shape-fabric analysis

    NASA Astrophysics Data System (ADS)

    Moreno Chávez, G.; Castillo Rivera, F.; Sarocchi, D.; Borselli, L.; Rodríguez-Sedano, L. A.

    2018-06-01

    Shape-fabric is a textural parameter related to the spatial arrangement of elongated particles in geological samples. Its usefulness spans a range from sedimentary petrology to igneous and metamorphic petrology. Independently of the process being studied, when a material flows, the elongated particles are oriented with the major axis in the direction of flow. In sedimentary petrology this information has been used for studies of paleo-flow direction of turbidites, the origin of quartz sediments, and locating ignimbrite vents, among others. In addition to flow direction and its polarity, the method enables flow rheology to be inferred. The use of shape-fabric has been limited due to the difficulties of automatically measuring particles and analyzing them with reliable circular statistics programs. This has dampened interest in the method for a long time. Shape-fabric measurement has increased in popularity since the 1980s thanks to the development of new image analysis techniques and circular statistics software. However, the programs currently available are unreliable, old and are incompatible with newer operating systems, or require programming skills. The goal of our work is to develop a user-friendly program, in the MATLAB environment, with a graphical user interface, that can process images and includes editing functions, and thresholds (elongation and size) for selecting a particle population and analyzing it with reliable circular statistics algorithms. Moreover, the method also has to produce rose diagrams, orientation vectors, and a complete series of statistical parameters. All these requirements are met by our new software. In this paper, we briefly explain the methodology from collection of oriented samples in the field to the minimum number of particles needed to obtain reliable fabric data. We obtained the data using specific statistical tests and taking into account the degree of iso-orientation of the samples and the required degree of reliability. The program has been verified by means of several simulations performed using appropriately designed features and by analyzing real samples.
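
    The core circular-statistics step behind shape-fabric analysis can be illustrated briefly: particle long-axis orientations are axial data (defined modulo 180°), so the angles are doubled, the mean resultant vector is computed, and the mean orientation and vector strength are recovered. The sketch below uses hypothetical azimuths and is not the FabricS implementation.

    ```python
    import numpy as np

    # Hypothetical particle long-axis azimuths in degrees (axial data, modulo 180)
    angles_deg = np.array([12, 18, 15, 170, 8, 22, 160, 14, 19, 11])
    theta = np.deg2rad(2 * angles_deg)           # double the angles for axial data

    C, S = np.cos(theta).mean(), np.sin(theta).mean()
    R = np.hypot(C, S)                           # mean resultant length (0 = uniform, 1 = aligned)
    mean_orientation = np.rad2deg(np.arctan2(S, C)) / 2 % 180

    print(f"Mean fabric orientation = {mean_orientation:.1f} deg, vector strength R = {R:.2f}")
    ```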

  9. Comparison of Housing Construction Development in Selected Regions of Central Europe

    NASA Astrophysics Data System (ADS)

    Dvorský, Ján; Petráková, Zora; Hollý, Ján

    2017-12-01

    In fast-growing countries, the economic growth that followed the global financial crisis ought to be reflected in the development of housing policy. The development of a region is directly related to the increase in the quality of living of its inhabitants. Housing construction and its relation to the availability of housing is a key issue for the population overall. Comparison of its development in selected regions is important for experts in the field of construction, for the mayors of the regions and the state, but especially for the inhabitants themselves. The aim of the article is to compare the number of new dwellings with building permits and completed dwellings with final building approval between selected regions, using a mathematical statistics method, analysis of variance. The article also uses the tools of descriptive statistics such as a point graph, a graph of deviations from the average, and basic statistical characteristics of mean and variability. Qualitative factors influencing the construction of flats, as well as the causes of quantitative differences in the numbers of started and completed apartments in selected regions of Central Europe, are the subjects of the article’s conclusions.

  10. Simultaneous ocular and muscle artifact removal from EEG data by exploiting diverse statistics.

    PubMed

    Chen, Xun; Liu, Aiping; Chen, Qiang; Liu, Yu; Zou, Liang; McKeown, Martin J

    2017-09-01

    Electroencephalography (EEG) recordings are frequently contaminated by both ocular and muscle artifacts. These are normally dealt with separately, by employing blind source separation (BSS) techniques relying on either second-order or higher-order statistics (SOS and HOS, respectively). When HOS-based methods are used, it is usually in the setting of assuming that artifacts are statistically independent of the EEG. When SOS-based methods are used, it is assumed that artifacts have autocorrelation characteristics distinct from the EEG. In reality, ocular and muscle artifacts neither follow the assumption of strict statistical independence from the EEG nor have completely unique autocorrelation characteristics, suggesting that exploiting HOS or SOS alone may be insufficient to remove these artifacts. Here we employ a novel BSS technique, independent vector analysis (IVA), to exploit HOS and SOS simultaneously to remove ocular and muscle artifacts. Numerical simulations and application to real EEG recordings were used to explore the utility of the IVA approach. IVA was superior in isolating both ocular and muscle artifacts, especially for raw EEG data with a low signal-to-noise ratio, and also integrated the usually separate SOS and HOS steps into a single unified step. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. The Effects of Pre-Lecture Quizzes on Test Anxiety and Performance in a Statistics Course

    ERIC Educational Resources Information Center

    Brown, Michael J.; Tallon, Jennifer

    2015-01-01

    The purpose of our study was to examine the effects of pre-lecture quizzes in a statistics course. Students (N = 70) from 2 sections of an introductory statistics course served as participants in this study. One section completed pre-lecture quizzes whereas the other section did not. Completing pre-lecture quizzes was associated with improved exam…

  12. Left spermatic vein retrograde sclerosis: comparison between sclerosant agent injection through a diagnostic catheter versus through an occluding balloon catheter.

    PubMed

    Basile, Antonio; Failla, Giovanni; La Vignera, Sandro; Condorelli, Rosita Angela; Calogero, Aldo; Vicari, Enzo; Granata, Antonio; Mundo, Elena; Caltabiano, Giuseppe; Pizzarelli, Marco; Messina, Martina; Scavone, Giovanni; Lanzafame, Franz; Iezzi, Roberto; Tsetis, Dimitrios

    2015-05-01

    The aim of this study was to compare the technical success between left spermatic vein (LSV) scleroembolisation achieved with the injection of sclerosant through a diagnostic catheter and through an occluding balloon (OB), in the treatment of male varicocele. From January 2012 to September 2013, we prospectively enrolled 100 patients with left varicocele and an indication for LSV scleroembolisation related to symptoms or spermiogram anomalies; patients were randomised to two groups (we wrote a list of 100 lines randomly assigned A or B and each patient was consecutively allocated to group A or B on the basis of this list). Patients in group A underwent injection of the sclerosing agent through an angiographic diagnostic catheter (free catheter technique) and patients in group B through an OB catheter (OB technique). In cases of incomplete occlusion of the LSV, the procedure was completed with coils. Total occlusion of the LSV at post-treatment phlebography during a Valsalva manoeuvre before any coil embolisation was considered a technical success. The rate of complications was also evaluated. Fisher's exact test was used for statistical analysis. We evaluated a total of 90 patients because five patients in each group were not included in the statistical analysis owing to technical problems or complications. In group A the technical success rate was 75.6 % versus 93.4 % in group B, and the difference was statistically significant (P = 0.003); in particular, we had to complete the embolisation with insertion of coils in 11 cases (24.4 %) in group A, and in three cases in group B (6.6 %). In group A, LSV rupture occurred in four cases (8 %) so the procedure was completed by sclerosant injection through the OB located distally to the lesion. These patients were not considered for evaluation. In another case, a high flow shunt towards the inferior vena cava was detected, so the patient underwent OB injection to stop the flow to the shunt, and was not included for statistical evaluation. In group B, vein rupture with contrast leakage was noted in six cases (12 %); nonetheless, all the procedures were completed because the OB was positioned distally to the vessel tear, obviating any retrograde leakage of sclerosant. In group B, in five cases (10 %), we were unable to advance the OB through the LSV ostium so the procedures were completed with the diagnostic catheter and not considered for statistical evaluation. On the basis of our data, the embolisation of the LSV obtained by injecting the sclerosant through an OB rather than through a diagnostic catheter seems to be more effective in achieving total vein embolisation, as well as allowing a controlled injection of sclerosant even in cases of vein rupture.
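
    The group comparison described above can be reproduced in outline with Fisher's exact test on a 2x2 table; the counts below are placeholders chosen to approximate the reported percentages and are not the study's raw data.

```python
# Hedged sketch of the Fisher's exact test comparison named above, applied to a
# 2x2 table of technical success vs. need for coil completion; the counts are
# placeholders, not the study data.
from scipy.stats import fisher_exact

#          success  coils needed
table = [[34, 11],   # free-catheter group (placeholder counts)
         [42,  3]]   # occluding-balloon group (placeholder counts)

odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p_value:.4f}")
```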

  13. Do flexible acrylic resin lingual flanges improve retention of mandibular complete dentures?

    PubMed Central

    Ahmed Elmorsy, Ayman Elmorsy; Ahmed Ibraheem, Eman Mostafa; Ela, Alaa Aboul; Fahmy, Ahmed; Nassani, Mohammad Zakaria

    2015-01-01

    Objectives: The aim of this study was to compare the retention of conventional mandibular complete dentures with that of mandibular complete dentures having lingual flanges constructed with flexible acrylic resin “Versacryl.” Materials and Methods: The study sample comprised 10 completely edentulous patients. Each patient received one maxillary complete denture and two mandibular complete dentures. One mandibular denture was made of conventional heat-cured acrylic resin and the other had its lingual flanges made of flexible acrylic resin Versacryl. Digital force-meter was used to measure retention of mandibular dentures at delivery and at 2 weeks and 45 days following denture insertion. Results: The statistical analysis showed that at baseline and follow-up appointments, retention of mandibular complete dentures with flexible lingual flanges was significantly greater than retention of conventional mandibular dentures (P < 0.05). In both types of mandibular dentures, retention of dentures increased significantly over the follow-up period (P < 0.05). Conclusions: The use of flexible acrylic resin lingual flanges in the construction of mandibular complete dentures improved denture retention. PMID:26539387

  14. Statistical analysis plan for evaluating low- vs. standard-dose alteplase in the ENhanced Control of Hypertension and Thrombolysis strokE stuDy (ENCHANTED).

    PubMed

    Anderson, Craig S; Woodward, Mark; Arima, Hisatomi; Chen, Xiaoying; Lindley, Richard I; Wang, Xia; Chalmers, John

    2015-12-01

    The ENhanced Control of Hypertension And Thrombolysis strokE stuDy trial is a 2 × 2 quasi-factorial active-comparison, prospective, randomized, open, blinded endpoint clinical trial that is evaluating in thrombolysis-eligible acute ischemic stroke patients whether: (1) low-dose (0·6 mg/kg body weight) intravenous alteplase has noninferior efficacy and lower risk of symptomatic intracerebral hemorrhage compared with standard-dose (0·9 mg/kg body weight) intravenous alteplase; and (2) early intensive blood pressure lowering (systolic target 130-140 mmHg) has superior efficacy and lower risk of any intracerebral hemorrhage compared with guideline-recommended blood pressure control (systolic target <180 mmHg). To outline in detail the predetermined statistical analysis plan for the 'alteplase dose arm' of the study. All data collected by participating researchers will be reviewed and formally assessed. Information pertaining to the baseline characteristics of patients, their process of care, and the delivery of treatments will be classified, and for each item, appropriate descriptive statistical analyses are planned with appropriate comparisons made between randomized groups. For the trial outcomes, the most appropriate statistical comparisons to be made between groups are planned and described. A statistical analysis plan was developed for the results of the alteplase dose arm of the study that is transparent, available to the public, verifiable, and predetermined before completion of data collection. We have developed a predetermined statistical analysis plan for the ENhanced Control of Hypertension And Thrombolysis strokE stuDy alteplase dose arm which is to be followed to avoid analysis bias arising from prior knowledge of the study findings. © 2015 The Authors. International Journal of Stroke published by John Wiley & Sons Ltd on behalf of World Stroke Organization.

  15. Redshift data and statistical inference

    NASA Technical Reports Server (NTRS)

    Newman, William I.; Haynes, Martha P.; Terzian, Yervant

    1994-01-01

    Frequency histograms and the 'power spectrum analysis' (PSA) method, the latter developed by Yu & Peebles (1969), have been widely employed as techniques for establishing the existence of periodicities. We provide a formal analysis of these two classes of methods, including controlled numerical experiments, to better understand their proper use and application. In particular, we note that typical published applications of frequency histograms commonly employ far greater numbers of class intervals or bins than is advisable by statistical theory, sometimes giving rise to the appearance of spurious patterns. The PSA method generates a sequence of random numbers from observational data which, it is claimed, is exponentially distributed with unit mean and variance, essentially independent of the distribution of the original data. We show that the derived random process is nonstationary and produces a small but systematic bias in the usual estimate of the mean and variance. Although the derived variable may be reasonably described by an exponential distribution, the tail of the distribution is far removed from that of an exponential, thereby rendering statistical inference and confidence testing based on the tail of the distribution completely unreliable. Finally, we examine a number of astronomical examples wherein these methods have been used, giving rise to widespread acceptance of statistically unconfirmed conclusions.
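
    The binning issue raised above can be illustrated with a small sketch: the same featureless sample is histogrammed with a statistically advisable bin count and with far too many bins. The data and bin choices are arbitrary placeholders, not the redshift catalogues discussed in the paper.

```python
# Sketch of the binning issue discussed above: a synthetic, featureless sample
# histogrammed with a statistically advisable bin count (Sturges' rule) and
# with far too many bins, which tends to manufacture apparent "patterns".
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(loc=10000.0, scale=1500.0, size=200)    # placeholder sample

counts_ok, edges_ok = np.histogram(data, bins="sturges")
counts_over, edges_over = np.histogram(data, bins=150)     # grossly over-binned

print(len(counts_ok), "bins vs", len(counts_over), "bins")
print("empty bins when over-binned:", np.sum(counts_over == 0))
```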

  16. Utility of Gram stain for the microbiological analysis of burn wound surfaces.

    PubMed

    Elsayed, Sameer; Gregson, Daniel B; Lloyd, Tracie; Crichton, Marilyn; Church, Deirdre L

    2003-11-01

    Surface swab cultures have attracted attention as a potential alternative to biopsy histology or quantitative culture methods for microbiological burn wound monitoring. To our knowledge, the utility of adding a Gram-stained slide in this context has not been evaluated previously. To determine the degree of correlation of Gram stain with culture for the microbiological analysis of burn wound surfaces. Prospective laboratory analysis. Urban health region/centralized diagnostic microbiology laboratory. Burn patients hospitalized in any Calgary Health Region burn center from November 2000 to September 2001. Gram stain plus culture of burn wound surface swab specimens obtained during routine dressing changes or based on clinical signs of infection. Degree of correlation (complete, high, partial, none), including weighted kappa statistic (kappa(w)), of Gram stain with culture based on quantitative microscopy and degree of culture growth. A total of 375 specimens from 50 burn patients were evaluated. Of these, 239 were negative by culture and Gram stain, 7 were positive by Gram stain only, 89 were positive by culture only, and 40 were positive by both methods. The degree of complete, high, partial, and no correlation of Gram stain with culture was 70.9% (266/375), 1.1% (4/375), 2.4% (9/375), and 25.6% (96/375), respectively. The degree of correlation for all 375 specimens, as expressed by the weighted kappa statistic, was found to be fair (kappa(w) = 0.32). Conclusion: The Gram stain is not suitable for the microbiological analysis of burn wound surfaces.
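
    A weighted kappa of the kind reported above can be computed as in the sketch below; the ordinal Gram stain and culture gradings are invented placeholders, and the linear weighting scheme is an assumption rather than a detail taken from the study.

```python
# Hedged illustration of the weighted kappa statistic used above, computed with
# scikit-learn on made-up ordinal Gram stain vs. culture gradings (0 = none,
# 1 = light, 2 = moderate, 3 = heavy); these labels are placeholders.
from sklearn.metrics import cohen_kappa_score

gram_stain = [0, 0, 1, 2, 0, 3, 2, 0, 1, 0, 0, 2]
culture    = [0, 1, 1, 2, 0, 2, 3, 0, 0, 1, 0, 2]

kappa_w = cohen_kappa_score(gram_stain, culture, weights="linear")
print(f"weighted kappa = {kappa_w:.2f}")
```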

  17. Learning curves for single incision and conventional laparoscopic right hemicolectomy: a multidimensional analysis.

    PubMed

    Park, Yoonah; Yong, Yuen Geng; Yun, Seong Hyeon; Jung, Kyung Uk; Huh, Jung Wook; Cho, Yong Beom; Kim, Hee Cheol; Lee, Woo Yong; Chun, Ho-Kyung

    2015-05-01

    This study aimed to compare the learning curves and early postoperative outcomes for conventional laparoscopic (CL) and single incision laparoscopic (SIL) right hemicolectomy (RHC). This retrospective study included the initial 35 cases in each group. Learning curves were evaluated by the moving average of operative time, the mean operative time of every five consecutive cases, and cumulative sum (CUSUM) analysis. The learning phase was considered overcome when the moving average of operative times reached a plateau, and when the mean operative time of every five consecutive cases reached a low point and subsequently did not vary by more than 30 minutes. Six patients with missing data in the CL RHC group were excluded from the analyses. According to the mean operative time of every five consecutive cases, the learning phase of SIL and CL RHC was completed between 26 and 30 cases and between 16 and 20 cases, respectively. Moving average analysis revealed that approximately 31 (SIL) and 25 (CL) cases were needed to complete the learning phase. CUSUM analysis demonstrated that 10 (SIL) and two (CL) cases were required to reach a steady state of complication-free performance. The postoperative complication rate was higher in the SIL group than in the CL group, but the difference was not statistically significant (17.1% vs. 3.4%). The learning phase of SIL RHC is longer than that of CL RHC. Early oncological outcomes of both techniques were comparable. However, SIL RHC had a statistically insignificant higher complication rate than CL RHC during the learning phase.
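
    The learning-curve metrics named above (moving average of operative time and CUSUM) can be sketched as follows; the operative times are simulated placeholders, and the plateau criterion itself is not implemented here.

```python
# Sketch of the learning-curve metrics described above (moving average and
# CUSUM of operative time); the operative times are simulated placeholders.
import numpy as np

rng = np.random.default_rng(2)
# Simulated operative times (minutes) that shorten as experience accumulates
op_time = 180 - 1.5 * np.arange(35) + rng.normal(0, 15, 35)

window = 5
moving_avg = np.convolve(op_time, np.ones(window) / window, mode="valid")

# CUSUM of deviations from the overall mean operative time
cusum = np.cumsum(op_time - op_time.mean())

print(moving_avg.round(1))
print(cusum.round(1))
```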

  18. Attitudes toward statistics in medical postgraduates: measuring, evaluating and monitoring.

    PubMed

    Zhang, Yuhai; Shang, Lei; Wang, Rui; Zhao, Qinbo; Li, Chanjuan; Xu, Yongyong; Su, Haixia

    2012-11-23

    In medical training, statistics is considered a very difficult course to learn and teach. Current studies have found that students' attitudes toward statistics can influence their learning process. Measuring, evaluating and monitoring the changes of students' attitudes toward statistics are important. Few studies have focused on the attitudes of postgraduates, especially medical postgraduates. Our purpose was to understand current attitudes regarding statistics held by medical postgraduates and explore their effects on students' achievement. We also wanted to explore the influencing factors and the sources of these attitudes and monitor their changes after a systematic statistics course. A total of 539 medical postgraduates enrolled in a systematic statistics course completed the pre-form of the Survey of Attitudes Toward Statistics -28 scale, and 83 postgraduates were selected randomly from among them to complete the post-form scale after the course. Most medical postgraduates held positive attitudes toward statistics, but they thought statistics was a very difficult subject. The attitudes mainly came from experiences in a former statistical or mathematical class. Age, level of statistical education, research experience, specialty and mathematics basis may influence postgraduate attitudes toward statistics. There were significant positive correlations between course achievement and attitudes toward statistics. In general, student attitudes showed negative changes after completing a statistics course. The importance of student attitudes toward statistics must be recognized in medical postgraduate training. To make sure all students have a positive learning environment, statistics teachers should measure their students' attitudes and monitor their change of status during a course. Some necessary assistance should be offered for those students who develop negative attitudes.

  19. Post-operative diffusion weighted imaging as a predictor of posterior fossa syndrome permanence in paediatric medulloblastoma.

    PubMed

    Chua, Felicia H Z; Thien, Ady; Ng, Lee Ping; Seow, Wan Tew; Low, David C Y; Chang, Kenneth T E; Lian, Derrick W Q; Loh, Eva; Low, Sharon Y Y

    2017-03-01

    Posterior fossa syndrome (PFS) is a serious complication faced by neurosurgeons and their patients, especially in paediatric medulloblastoma patients. The uncertain aetiology of PFS, myriad of cited risk factors and therapeutic challenges make this phenomenon an elusive entity. The primary objective of this study was to identify associative factors related to the development of PFS in medulloblastoma patients post-tumour resection. This is a retrospective study based at a single institution. Patient data and all related information were collected from the hospital records, in accordance with a list of possible risk factors associated with PFS. These included pre-operative tumour volume, hydrocephalus, age, gender, extent of resection, metastasis, ventriculoperitoneal shunt insertion, post-operative meningitis and radiological changes in MRI. Additional variables included the molecular and histological subtypes of each patient's medulloblastoma tumour. Statistical analysis was employed to determine evidence of each variable's significance in PFS permanence. A total of 19 patients with appropriately complete data were identified. Initial univariate analysis did not show any statistical significance. However, multivariate analysis of MRI-specific changes showed that bilateral DWI restricted-diffusion changes involving both the right and left sides of the surgical cavity were statistically significant for PFS permanence. The authors performed a clinical study that evaluated possible risk factors for permanent PFS in paediatric medulloblastoma patients. Analysis of the collated results found that post-operative DWI restriction in bilateral regions within the surgical cavity demonstrated statistical significance as a predictor of PFS permanence, a novel finding in the current literature.

  20. Prediction of the space adaptation syndrome

    NASA Technical Reports Server (NTRS)

    Reschke, M. F.; Homick, J. L.; Ryan, P.; Moseley, E. C.

    1984-01-01

    The univariate and multivariate relationships of provocative measures used to produce motion sickness symptoms were described. Normative subjects were used to develop and cross-validate sets of linear equations that optimally predict motion sickness in parabolic flights. The possibility of reducing the number of measurements required for prediction was assessed. After describing the variables verbally and statistically for 159 subjects, a factor analysis of 27 variables was completed to improve understanding of the relationships between variables and to reduce the number of measures for prediction purposes. The results of this analysis show that none of the variables were significantly related to the responses to parabolic flights. A set of variables was selected to predict responses to KC-135 flights. A series of discriminant analyses was completed. Results indicate that low, moderate, or severe susceptibility could be correctly predicted 64 percent and 53 percent of the time in the original and cross-validation samples, respectively. Both the factor analysis and the discriminant analysis provided no basis for reducing the number of tests.
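
    A discriminant analysis of the kind described above can be outlined as below, with cross-validation standing in for the original/cross-validation split; the provocative-test measures and susceptibility classes are random placeholders, so the printed accuracy is not meaningful.

```python
# Hedged sketch of a discriminant analysis of the kind described above:
# predicting a three-level susceptibility class (low/moderate/severe) from a
# handful of provocative-test scores. The data are random placeholders.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
X = rng.normal(size=(159, 6))                  # provocative-test measures
y = rng.integers(0, 3, size=159)               # low / moderate / severe

lda = LinearDiscriminantAnalysis()
acc = cross_val_score(lda, X, y, cv=5).mean()  # cross-validated hit rate
print(f"cross-validated classification accuracy: {acc:.2f}")
```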

  1. Early prediction of olanzapine-induced weight gain for schizophrenia patients.

    PubMed

    Lin, Ching-Hua; Lin, Shih-Chi; Huang, Yu-Hui; Wang, Fu-Chiang; Huang, Chun-Jen

    2018-05-01

    The aim of this study was to determine whether weight changes at week 2 or other factors predicted weight gain at week 6 for schizophrenia patients receiving olanzapine. This study was the secondary analysis of a six-week trial for 94 patients receiving olanzapine (5 mg/d) plus trifluoperazine (5 mg/d), or olanzapine (10 mg/d) alone. Patients were included in analysis only if they had completed the 6-week trial (per protocol analysis). Weight gain was defined as a 7% or greater increase of the patient's baseline weight. The receiver operating characteristic curve was employed to determine the optimal cutoff points of statistically significant predictors. Eleven of the 67 patients completing the 6-week trial were classified as weight gainers. Weight change at week 2 was the statistically significant predictor for ultimate weight gain at week 6. A weight change of 1.0 kg at week 2 appeared to be the optimal cutoff point, with a sensitivity of 0.92, a specificity of 0.75, and an AUC of 0.85. Using weight change at week 2 to predict weight gain at week 6 is favorable in terms of both specificity and sensitivity. Weight change of 1.0 kg or more at 2 weeks is a reliable predictor. Copyright © 2018 Elsevier B.V. All rights reserved.
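
    Deriving a cutoff from a receiver operating characteristic curve can be sketched as follows. Youden's J statistic is assumed as the cutoff criterion, since the abstract does not state which rule was used, and the week-2 weight changes are simulated placeholders.

```python
# Sketch of deriving an optimal cutoff from a ROC curve. Youden's J statistic
# is assumed here as the cutoff criterion; the week-2 weight changes below are
# simulated placeholders, not the trial data.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(4)
gainer = rng.integers(0, 2, size=67)                        # 1 = >=7% weight gain at week 6
week2_change = 0.4 + 1.2 * gainer + rng.normal(0, 0.6, 67)  # kg change at week 2

fpr, tpr, thresholds = roc_curve(gainer, week2_change)
j = tpr - fpr                                               # Youden's J at each threshold
best = thresholds[np.argmax(j)]
print(f"AUC = {roc_auc_score(gainer, week2_change):.2f}, cutoff ~ {best:.2f} kg")
```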

  2. The effect of online collaborative learning on middle school student science literacy and sense of community

    NASA Astrophysics Data System (ADS)

    Wendt, Jillian Leigh

    This study examines the effects of online collaborative learning on middle school students' science literacy and sense of community. A quantitative, quasi-experimental pretest/posttest control group design was used. Following IRB approval and district superintendent approval, students at a public middle school in central Virginia completed a pretest consisting of the Misconceptions-Oriented Standards-Based Assessment Resources for Teachers (MOSART) Physical Science assessment and the Classroom Community Scale. Students in the control group received in-class assignments that were completed collaboratively in a face-to-face manner. Students in the experimental group received in-class assignments that were completed online collaboratively through the Edmodo educational platform. Both groups were members of intact, traditional face-to-face classrooms. The students were then post-tested. Results pertaining to the MOSART assessment were analyzed statistically through ANCOVA, while results pertaining to the Classroom Community Scale were analyzed through MANOVA. Results are reported and suggestions for future research are provided.

  3. Non-Gaussian Distributions Affect Identification of Expression Patterns, Functional Annotation, and Prospective Classification in Human Cancer Genomes

    PubMed Central

    Marko, Nicholas F.; Weil, Robert J.

    2012-01-01

    Introduction Gene expression data is often assumed to be normally-distributed, but this assumption has not been tested rigorously. We investigate the distribution of expression data in human cancer genomes and study the implications of deviations from the normal distribution for translational molecular oncology research. Methods We conducted a central moments analysis of five cancer genomes and performed empiric distribution fitting to examine the true distribution of expression data both on the complete-experiment and on the individual-gene levels. We used a variety of parametric and nonparametric methods to test the effects of deviations from normality on gene calling, functional annotation, and prospective molecular classification using a sixth cancer genome. Results Central moments analyses reveal statistically-significant deviations from normality in all of the analyzed cancer genomes. We observe as much as 37% variability in gene calling, 39% variability in functional annotation, and 30% variability in prospective, molecular tumor subclassification associated with this effect. Conclusions Cancer gene expression profiles are not normally-distributed, either on the complete-experiment or on the individual-gene level. Instead, they exhibit complex, heavy-tailed distributions characterized by statistically-significant skewness and kurtosis. The non-Gaussian distribution of this data affects identification of differentially-expressed genes, functional annotation, and prospective molecular classification. These effects may be reduced in some circumstances, although not completely eliminated, by using nonparametric analytics. This analysis highlights two unreliable assumptions of translational cancer gene expression analysis: that “small” departures from normality in the expression data distributions are analytically-insignificant and that “robust” gene-calling algorithms can fully compensate for these effects. PMID:23118863
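
    A central-moments check of the kind described above can be sketched as follows on a simulated heavy-tailed vector; scipy's D'Agostino-Pearson test combines skewness and kurtosis into a single normality statistic. This is an illustration, not the authors' pipeline.

```python
# Sketch of a central-moments / normality check like the one described above,
# applied to a simulated heavy-tailed "expression" vector (placeholder data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
expression = rng.standard_t(df=3, size=5000)   # heavy-tailed placeholder data

print("skewness:", stats.skew(expression))
print("excess kurtosis:", stats.kurtosis(expression))
stat, p = stats.normaltest(expression)          # D'Agostino-Pearson test
print(f"normality test: stat = {stat:.1f}, p = {p:.2e}")
```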

  4. Approach to addressing missing data for electronic medical records and pharmacy claims data research.

    PubMed

    Bounthavong, Mark; Watanabe, Jonathan H; Sullivan, Kevin M

    2015-04-01

    The complete capture of all values for each variable of interest in pharmacy research studies remains aspirational. The absence of these possibly influential values is a common problem for pharmacist investigators. Failure to account for missing data may translate to biased study findings and conclusions. Our goal in this analysis was to apply validated statistical methods for missing data to a previously analyzed data set and compare results when missing data methods were implemented versus standard analytics that ignore missing data effects. Using data from a retrospective cohort study, the statistical method of multiple imputation was used to provide regression-based estimates of the missing values to improve available data usable for study outcomes measurement. These findings were then contrasted with a complete-case analysis that restricted estimation to subjects in the cohort that had no missing values. Odds ratios were compared to assess differences in findings of the analyses. A nonadjusted regression analysis ("crude analysis") was also performed as a reference for potential bias. Veterans Integrated Systems Network that includes VA facilities in the Southern California and Nevada regions. New statin users between November 30, 2006, and December 2, 2007, with a diagnosis of dyslipidemia. We compared the odds ratios (ORs) and 95% confidence intervals (CIs) for the crude, complete-case, and multiple imputation analyses for the end points of a 25% or greater reduction in atherogenic lipids. Data were missing for 21.5% of identified patients (1665 subjects of 7739). Regression model results were similar for the crude, complete-case, and multiple imputation analyses with overlap of 95% confidence limits at each end point. The crude, complete-case, and multiple imputation ORs (95% CIs) for a 25% or greater reduction in low-density lipoprotein cholesterol were 3.5 (95% CI 3.1-3.9), 4.3 (95% CI 3.8-4.9), and 4.1 (95% CI 3.7-4.6), respectively. The crude, complete-case, and multiple imputation ORs (95% CIs) for a 25% or greater reduction in non-high-density lipoprotein cholesterol were 3.5 (95% CI 3.1-3.9), 4.5 (95% CI 4.0-5.2), and 4.4 (95% CI 3.9-4.9), respectively. The crude, complete-case, and multiple imputation ORs (95% CIs) for 25% or greater reduction in TGs were 3.1 (95% CI 2.8-3.6), 4.0 (95% CI 3.5-4.6), and 4.1 (95% CI 3.6-4.6), respectively. The use of the multiple imputation method to account for missing data did not alter conclusions based on a complete-case analysis. Given the frequency of missing data in research using electronic health records and pharmacy claims data, multiple imputation may play an important role in the validation of study findings. © 2015 Pharmacotherapy Publications, Inc.
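
    The contrast between complete-case analysis and imputation-based analysis can be sketched as below. scikit-learn's IterativeImputer is used here as a stand-in for the validated multiple-imputation procedure in the study (it performs a single regression-based imputation, not true multiple imputation), and all data are random placeholders.

```python
# Hedged sketch contrasting complete-case analysis with a regression-based
# imputation fill-in; all data below are random placeholders, and this does
# not reproduce the study's validated multiple-imputation method.
import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(6)
n = 500
df = pd.DataFrame({
    "age": rng.normal(65, 10, n),
    "baseline_ldl": rng.normal(140, 30, n),
    "adherent": rng.integers(0, 2, n),
})
outcome = (rng.random(n) < 0.4 + 0.2 * df["adherent"]).astype(int)  # >=25% LDL drop
df.loc[rng.random(n) < 0.2, "baseline_ldl"] = np.nan                # ~20% missing

# Complete-case analysis: drop rows with any missing value
cc = df.dropna()
fit_cc = LogisticRegression().fit(cc, outcome[cc.index])

# Imputation-based analysis: fill missing values, then refit
imputed = pd.DataFrame(IterativeImputer(random_state=0).fit_transform(df),
                       columns=df.columns)
fit_mi = LogisticRegression().fit(imputed, outcome)

print("complete-case ORs:", np.exp(fit_cc.coef_).round(2))
print("imputed-data ORs: ", np.exp(fit_mi.coef_).round(2))
```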

  5. Allocation of Academic Workloads in the Faculty of Human and Social Sciences at a South African University

    ERIC Educational Resources Information Center

    Botha, P. A.; Swanepoel, S.

    2015-01-01

    This article reports on the results of a statistical analysis of the weekly working hours of academics in a Faculty of Human and Social Sciences at a South African university. The aim was to quantify, analyse and compare the workload of academic staff. Seventy-five academics self-reported on their workload by completing the workload measuring…

  6. Defense Safety Oversight Council (DSOC) Reducing Vehicular Vibration and Impact

    DTIC Science & Technology

    2013-10-10

    Data Collection, Analysis/Writing, Total Funding, Planned Completion - Medical Research and Materiel Command, U.S. Army Aeromedical Research... newly introduced to the UK in 2000-2001. Little was known about long-term health effects of monocular helmet-mounted displays. Purpose: analyze data... collate data, then analyze questionnaires and examinations for statistical differences. Product/Payoff: increased knowledge of risks to Apache

  7. Supporting Vocational Students' Development of Preventive Behaviour at Work: A Phenomenological Analysis of Teachers' Experiences

    ERIC Educational Resources Information Center

    Lecours, Alexandra; Therriault, Pierre-Yves

    2017-01-01

    Statistics indicate that even if young workers complete vocational training, as a group they are at risk of sustaining injury. It appears that a lack of training in the area of injury prevention may explain some of this effect. Teachers are considered to be key actors in injury-prevention training and in the process of developing students'…

  8. Multivariate Analysis, Mass Balance Techniques, and Statistical Tests as Tools in Igneous Petrology: Application to the Sierra de las Cruces Volcanic Range (Mexican Volcanic Belt)

    PubMed Central

    Velasco-Tapia, Fernando

    2014-01-01

    Magmatic processes have usually been identified and evaluated using qualitative or semiquantitative geochemical or isotopic tools based on a restricted number of variables. However, a more complete and quantitative view could be reached by applying multivariate analysis, mass balance techniques, and statistical tests. As an example, in this work a statistical and quantitative scheme is applied to analyze the geochemical features of the Sierra de las Cruces (SC) volcanic range (Mexican Volcanic Belt). In this locality, the volcanic activity (3.7 to 0.5 Ma) was dominantly dacitic, but the presence of spheroidal andesitic enclaves and/or diverse disequilibrium features in the majority of lavas confirms the operation of magma mixing/mingling. New discriminant-function-based multidimensional diagrams were used to discriminate the tectonic setting. Statistical tests of discordancy and significance were applied to evaluate the influence of the subducting Cocos plate, which seems to be rather negligible for the SC magmas in relation to several major and trace elements. A cluster analysis following Ward's linkage rule was carried out to classify the SC volcanic rocks into geochemical groups. Finally, two mass-balance schemes were applied for the quantitative evaluation of the proportions of the end-member components (dacitic and andesitic magmas) in the comingled lavas (binary mixtures). PMID:24737994
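
    The clustering step named above can be outlined with Ward's linkage on standardized compositions; the major-element values below are random placeholders, not the Sierra de las Cruces data.

```python
# Sketch of a cluster analysis with Ward's linkage rule as described above,
# applied to standardized major-element concentrations; the compositions are
# random placeholders, not the Sierra de las Cruces data.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

rng = np.random.default_rng(7)
# columns: SiO2, MgO, CaO, Fe2O3 (wt%) for 40 hypothetical samples
composition = np.column_stack([
    rng.normal(63, 3, 40), rng.normal(2.5, 0.8, 40),
    rng.normal(4.5, 1.0, 40), rng.normal(5.0, 1.2, 40),
])

Z = linkage(zscore(composition, axis=0), method="ward")
groups = fcluster(Z, t=2, criterion="maxclust")   # cut the tree into 2 groups
print(groups)
```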

  9. The German Registry of immune tolerance treatment in hemophilia--1999 update.

    PubMed

    Lenk, H

    2000-10-01

    As of 1999, the German registry of immune tolerance treatment in hemophilia has received reports on 146 patients who have undergone this therapy from 25 hemophilia centers. In 16 of the reported patients treatment is ongoing. Therapy has been completed in 126 patients of all groups with hemophilia A; most of them are children. In 78.6% of hemophilia A patients full success was achieved, 8.7% finished with partial success, and in 12.7% ITT failed. Statistical analysis demonstrates that interruptions of therapy have a negative influence on success. The inhibitor titer has the highest predictive value for success or failure of therapy. A high maximum titer as well as a high titer at start of treatment were related to a low success rate. Other variables such as exposure days and time interval between inhibitor detection and start of ITT were not statistically significant. Four patients with hemophilia B have also completed therapy, only one of them with success.

  10. Water levels and groundwater and surface-water exchanges in lakes of the northeast Twin Cities Metropolitan Area, Minnesota, 2002 through 2015

    USGS Publications Warehouse

    Jones, Perry M.; Trost, Jared J.; Erickson, Melinda L.

    2016-10-19

    Overview: This study assessed lake-water levels and regional and local groundwater and surface-water exchanges near northeast Twin Cities Metropolitan Area lakes by applying three approaches: statistical analysis, field study, and groundwater-flow modeling. Statistical analyses of lake levels were completed to assess the effect of physical setting and climate on lake-level fluctuations of selected lakes. A field study of groundwater and surface-water interactions in selected lakes was completed to (1) estimate potential percentages of surface-water contributions to well water across the northeast Twin Cities Metropolitan Area, (2) estimate general ages for waters extracted from the wells, and (3) assess groundwater inflow to lakes and lake-water outflow to aquifers downgradient from White Bear Lake. Groundwater flow was simulated using a steady-state, groundwater-flow model to assess regional groundwater and surface-water exchanges and the effects of groundwater withdrawals, climate, and other factors on water levels of northeast Twin Cities Metropolitan Area lakes.

  11. Feeling the future: A meta-analysis of 90 experiments on the anomalous anticipation of random future events.

    PubMed

    Bem, Daryl; Tressoldi, Patrizio; Rabeyron, Thomas; Duggan, Michael

    2015-01-01

    In 2011, one of the authors (DJB) published a report of nine experiments in the Journal of Personality and Social Psychology purporting to demonstrate that an individual's cognitive and affective responses can be influenced by randomly selected stimulus events that do not occur until after his or her responses have already been made and recorded, a generalized variant of the phenomenon traditionally denoted by the term precognition. To encourage replications, all materials needed to conduct them were made available on request. We here report a meta-analysis of 90 experiments from 33 laboratories in 14 countries which yielded an overall effect greater than 6 sigma, z = 6.40, p = 1.2 × 10^(-10), with an effect size (Hedges' g) of 0.09. A Bayesian analysis yielded a Bayes Factor of 5.1 × 10^9, greatly exceeding the criterion value of 100 for "decisive evidence" in support of the experimental hypothesis. When DJB's original experiments are excluded, the combined effect size for replications by independent investigators is 0.06, z = 4.16, p = 1.1 × 10^(-5), and the BF value is 3,853, again exceeding the criterion for "decisive evidence." The number of potentially unretrieved experiments required to reduce the overall effect size of the complete database to a trivial value of 0.01 is 544, and seven of eight additional statistical tests support the conclusion that the database is not significantly compromised by either selection bias or by intense "p-hacking" - the selective suppression of findings or analyses that failed to yield statistical significance. P-curve analysis, a recently introduced statistical technique, estimates the true effect size of the experiments to be 0.20 for the complete database and 0.24 for the independent replications, virtually identical to the effect size of DJB's original experiments (0.22) and the closely related "presentiment" experiments (0.21). We discuss the controversial status of precognition and other anomalous effects collectively known as psi.

  12. Medical School Attrition-Beyond the Statistics A Ten Year Retrospective Study

    PubMed Central

    2013-01-01

    Background Medical school attrition is important - securing a place in medical school is difficult and a high attrition rate can affect the academic reputation of a medical school and staff morale. More important, however, are the personal consequences of dropout for the student. The aims of our study were to examine factors associated with attrition over a ten-year period (2001–2011) and to study the personal effects of dropout on individual students. Methods The study included quantitative analysis of completed cohorts and qualitative analysis of ten-year data. Data were collected from individual student files, examination and admission records, exit interviews and staff interviews. Statistical analysis was carried out on five successive completed cohorts. Qualitative data from student files was transcribed and independently analysed by three authors. Data was coded and categorized and key themes were identified. Results Overall attrition rate was 5.7% (45/779) in 6 completed cohorts when students who transferred to other medical courses were excluded. Students from Kuwait and United Arab Emirates had the highest dropout rate (RR = 5.70, 95% Confidence Intervals 2.65 to 12.27;p < 0.0001) compared to Irish and EU students combined. North American students had a higher dropout rate than Irish and EU students; RR = 2.68 (1.09 to 6.58;p = 0.027) but this was not significant when transfers were excluded (RR = 1.32(0.38, 4.62);p = 0.75). Male students were more likely to dropout than females (RR 1.70, .93 to 3.11) but this was not significant (p = 0.079). Absenteeism was documented in 30% of students, academic difficulty in 55.7%, social isolation in 20%, and psychological morbidity in 40% (higher than other studies). Qualitative analysis revealed recurrent themes of isolation, failure, and despair. Student Welfare services were only accessed by one-third of dropout students. Conclusions While dropout is often multifactorial, certain red flag signals may alert us to risk of dropout including non-EU origin, academic struggling, absenteeism, social isolation, depression and leave of absence. Psychological morbidity amongst dropout students is high and Student Welfare services should be actively promoted. Absenteeism should prompt early intervention. Behind every dropout statistic lies a personal story. All medical schools have a duty of care to support students who leave the medical programme. PMID:23363547

  13. Methods of learning in statistical education: Design and analysis of a randomized trial

    NASA Astrophysics Data System (ADS)

    Boyd, Felicity Turner

    Background. Recent psychological and technological advances suggest that active learning may enhance understanding and retention of statistical principles. A randomized trial was designed to evaluate the addition of innovative instructional methods within didactic biostatistics courses for public health professionals. Aims. The primary objectives were to evaluate and compare the addition of two active learning methods (cooperative and internet) on students' performance; assess their impact on performance after adjusting for differences in students' learning style; and examine the influence of learning style on trial participation. Methods. Consenting students enrolled in a graduate introductory biostatistics course were randomized to cooperative learning, internet learning, or control after completing a pretest survey. The cooperative learning group participated in eight small group active learning sessions on key statistical concepts, while the internet learning group accessed interactive mini-applications on the same concepts. Controls received no intervention. Students completed evaluations after each session and a post-test survey. Study outcome was performance quantified by examination scores. Intervention effects were analyzed by generalized linear models using intent-to-treat analysis and marginal structural models accounting for reported participation. Results. Of 376 enrolled students, 265 (70%) consented to randomization; 69, 100, and 96 students were randomized to the cooperative, internet, and control groups, respectively. Intent-to-treat analysis showed no differences between study groups; however, 51% of students in the intervention groups had dropped out after the second session. After accounting for reported participation, expected examination scores were 2.6 points higher (of 100 points) after completing one cooperative learning session (95% CI: 0.3, 4.9) and 2.4 points higher after one internet learning session (95% CI: 0.0, 4.7), versus nonparticipants or controls, adjusting for other performance predictors. Students who preferred learning by reflective observation and active experimentation experienced improved performance through internet learning (5.9 points, 95% CI: 1.2, 10.6) and cooperative learning (2.9 points, 95% CI: 0.6, 5.2), respectively. Learning style did not influence study participation. Conclusions. No performance differences by group were observed by intent-to-treat analysis. Participation in active learning appears to improve student performance in an introductory biostatistics course and provides opportunities for enhancing understanding beyond that attained in traditional didactic classrooms.

  14. Medical school attrition-beyond the statistics a ten year retrospective study.

    PubMed

    Maher, Bridget M; Hynes, Helen; Sweeney, Catherine; Khashan, Ali S; O'Rourke, Margaret; Doran, Kieran; Harris, Anne; Flynn, Siun O'

    2013-01-31

    Medical school attrition is important--securing a place in medical school is difficult and a high attrition rate can affect the academic reputation of a medical school and staff morale. More important, however, are the personal consequences of dropout for the student. The aims of our study were to examine factors associated with attrition over a ten-year period (2001-2011) and to study the personal effects of dropout on individual students. The study included quantitative analysis of completed cohorts and qualitative analysis of ten-year data. Data were collected from individual student files, examination and admission records, exit interviews and staff interviews. Statistical analysis was carried out on five successive completed cohorts. Qualitative data from student files was transcribed and independently analysed by three authors. Data was coded and categorized and key themes were identified. Overall attrition rate was 5.7% (45/779) in 6 completed cohorts when students who transferred to other medical courses were excluded. Students from Kuwait and United Arab Emirates had the highest dropout rate (RR = 5.70, 95% Confidence Intervals 2.65 to 12.27; p < 0.0001) compared to Irish and EU students combined. North American students had a higher dropout rate than Irish and EU students; RR = 2.68 (1.09 to 6.58; p = 0.027) but this was not significant when transfers were excluded (RR = 1.32 (0.38 to 4.62); p = 0.75). Male students were more likely to drop out than females (RR = 1.70, 0.93 to 3.11) but this was not significant (p = 0.079). Absenteeism was documented in 30% of students, academic difficulty in 55.7%, social isolation in 20%, and psychological morbidity in 40% (higher than other studies). Qualitative analysis revealed recurrent themes of isolation, failure, and despair. Student Welfare services were only accessed by one-third of dropout students. While dropout is often multifactorial, certain red flag signals may alert us to risk of dropout including non-EU origin, academic struggling, absenteeism, social isolation, depression and leave of absence. Psychological morbidity amongst dropout students is high and Student Welfare services should be actively promoted. Absenteeism should prompt early intervention. Behind every dropout statistic lies a personal story. All medical schools have a duty of care to support students who leave the medical programme.

  15. Influence of psychological factors on the acceptance of complete dentures.

    PubMed

    al Quran, F; Clifford, T; Cooper, C; Lamey, P J

    2001-07-01

    To assess the influence of psychological factors on the acceptance of complete dentures in a population wearing dentures judged to be clinically satisfactory. Subjects were asked to complete personality profiles and also to rate their dentures using a denture satisfaction questionnaire. The survey was conducted in the prosthetics clinic of a teaching hospital. Patients were selected from those who had new complete dentures constructed in the department within the previous two years. The personality inventory was a self-administered questionnaire comprising 240 items covering the five domains of personality. Denture satisfaction was scored on a nine-item scale with four Likert-type responses to each. A group of 16% consistently complained about their dentures. Statistical analysis showed that personality factors, especially Neuroticism, had a significant relationship with denture satisfaction. Psychological factors significantly influence denture satisfaction, and personality profiles may prove useful in predicting potentially difficult denture wearers.

  16. Spectral Discrete Probability Density Function of Measured Wind Turbine Noise in the Far Field

    PubMed Central

    Ashtiani, Payam; Denison, Adelaide

    2015-01-01

    Of interest is the spectral character of wind turbine noise at typical residential set-back distances. In this paper, a spectral statistical analysis has been applied to immission measurements conducted at three locations. This method provides discrete probability density functions for the Turbine ONLY component of the measured noise. This analysis is completed for one-third octave sound levels, at integer wind speeds, and is compared to existing metrics for measuring acoustic comfort as well as previous discussions on low-frequency noise sources. PMID:25905097

  17. Identifying environmental features for land management decisions

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The benefits of changes in management organization and facilities for the Center for Remote Sensing and Cartography in Utah are reported, as well as interactions with and outreach to state and local agencies. Completed projects are described which studied (1) Uinta Basin wetland/land use; (2) Davis County foothill development; (3) Farmington Bay shoreline fluctuation; (4) irrigation detection; and (5) satellite investigation of snow cover/mule deer relationships. Techniques developed for composite computer mapping, contrast enhancement, U-2 CIR/LANDSAT digital interface, factor analysis, and multivariate statistical analysis are described.

  18. Status and results from the OPERA experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ariga, Tomoko

    2011-10-06

    The OPERA experiment is aiming at the first direct detection of neutrino oscillations in appearance mode through the study of the ν_μ → ν_τ channel. The OPERA detector is placed in the CNGS long-baseline ν_μ beam 730 km away from the neutrino source. The analysis of a sub-sample of the data taken in the 2008-2009 runs was completed. After a brief description of the beam and the experimental setup, we report on event analysis and on a first candidate event, its background estimation and statistical significance.

  19. Statistical significance of trace evidence matches using independent physicochemical measurements

    NASA Astrophysics Data System (ADS)

    Almirall, Jose R.; Cole, Michael; Furton, Kenneth G.; Gettinby, George

    1997-02-01

    A statistical approach to the significance of glass evidence is proposed using independent physicochemical measurements and chemometrics. Traditional interpretation of the significance of trace evidence matches or exclusions relies on qualitative descriptors such as 'indistinguishable from,' 'consistent with,' 'similar to' etc. By performing physical and chemical measurements which are independent of one another, the significance of object exclusions or matches can be evaluated statistically. One of the problems with this approach is that the human brain is excellent at recognizing and classifying patterns and shapes but performs less well when an object is represented by a numerical list of attributes. Chemometrics can be employed to group similar objects using clustering algorithms and provide statistical significance in a quantitative manner. This approach is enhanced when population databases exist or can be created and the data in question can be evaluated against these databases. Since the selection of the variables used and their pre-processing can greatly influence the outcome, several different methods could be employed in order to obtain a more complete picture of the information contained in the data. Presently, we report on the analysis of glass samples using refractive index measurements and the quantitative analysis of the concentrations of the metals: Mg, Al, Ca, Fe, Mn, Ba, Sr, Ti and Zr. The extension of this general approach to fiber and paint comparisons is also discussed. This statistical approach should not replace the current interpretative approaches to trace evidence matches or exclusions but rather yields an additional quantitative measure. The lack of sufficient general population databases containing the needed physicochemical measurements and the potential for confusion arising from statistical analysis currently hamper this approach, and ways of overcoming these obstacles are presented.
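
    The grouping step described above can be sketched as follows: fragments characterized by refractive index and elemental concentrations are clustered in standardized variable space. The measurements are invented placeholders, and hierarchical clustering with Ward linkage is one reasonable choice rather than the specific chemometric method used by the authors.

```python
# Illustrative sketch of the clustering step described above: glass fragments
# characterized by refractive index and element concentrations are grouped in
# standardized variable space. Values are invented placeholders.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import AgglomerativeClustering

# columns: RI, Mg, Al, Ca, Fe (arbitrary units)
fragments = np.array([
    [1.5185, 2.1, 0.55, 8.2, 0.30],
    [1.5187, 2.0, 0.57, 8.1, 0.31],
    [1.5221, 3.4, 1.10, 6.9, 0.62],
    [1.5223, 3.5, 1.05, 7.0, 0.60],
])

X = StandardScaler().fit_transform(fragments)
labels = AgglomerativeClustering(n_clusters=2, linkage="ward").fit_predict(X)
print(labels)   # fragments sharing a label group together in standardized space
```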

  20. NONPARAMETRIC MANOVA APPROACHES FOR NON-NORMAL MULTIVARIATE OUTCOMES WITH MISSING VALUES

    PubMed Central

    He, Fanyin; Mazumdar, Sati; Tang, Gong; Bhatia, Triptish; Anderson, Stewart J.; Dew, Mary Amanda; Krafty, Robert; Nimgaonkar, Vishwajit; Deshpande, Smita; Hall, Martica; Reynolds, Charles F.

    2017-01-01

    Between-group comparisons often entail many correlated response variables. The multivariate linear model, with its assumption of multivariate normality, is the accepted standard tool for these tests. When this assumption is violated, the nonparametric multivariate Kruskal-Wallis (MKW) test is frequently used. However, this test requires complete cases with no missing values in response variables. Deletion of cases with missing values likely leads to inefficient statistical inference. Here we extend the MKW test to retain information from partially-observed cases. Results of simulated studies and analysis of real data show that the proposed method provides adequate coverage and superior power to complete-case analyses. PMID:29416225

  1. Attitudes toward statistics in medical postgraduates: measuring, evaluating and monitoring

    PubMed Central

    2012-01-01

    Background In medical training, statistics is considered a very difficult course to learn and teach. Current studies have found that students’ attitudes toward statistics can influence their learning process. Measuring, evaluating and monitoring the changes of students’ attitudes toward statistics are important. Few studies have focused on the attitudes of postgraduates, especially medical postgraduates. Our purpose was to understand current attitudes regarding statistics held by medical postgraduates and explore their effects on students’ achievement. We also wanted to explore the influencing factors and the sources of these attitudes and monitor their changes after a systematic statistics course. Methods A total of 539 medical postgraduates enrolled in a systematic statistics course completed the pre-form of the Survey of Attitudes Toward Statistics −28 scale, and 83 postgraduates were selected randomly from among them to complete the post-form scale after the course. Results Most medical postgraduates held positive attitudes toward statistics, but they thought statistics was a very difficult subject. The attitudes mainly came from experiences in a former statistical or mathematical class. Age, level of statistical education, research experience, specialty and mathematics basis may influence postgraduate attitudes toward statistics. There were significant positive correlations between course achievement and attitudes toward statistics. In general, student attitudes showed negative changes after completing a statistics course. Conclusions The importance of student attitudes toward statistics must be recognized in medical postgraduate training. To make sure all students have a positive learning environment, statistics teachers should measure their students’ attitudes and monitor their change of status during a course. Some necessary assistance should be offered for those students who develop negative attitudes. PMID:23173770

  2. Statistical primer: how to deal with missing data in scientific research?

    PubMed

    Papageorgiou, Grigorios; Grant, Stuart W; Takkenberg, Johanna J M; Mokhles, Mostafa M

    2018-05-10

    Missing data are a common challenge encountered in research which can compromise the results of statistical inference when not handled appropriately. This paper aims to introduce basic concepts of missing data to a non-statistical audience, list and compare some of the most popular approaches for handling missing data in practice and provide guidelines and recommendations for dealing with and reporting missing data in scientific research. Complete case analysis and single imputation are simple approaches for handling missing data and are popular in practice, however, in most cases they are not guaranteed to provide valid inferences. Multiple imputation is a robust and general alternative which is appropriate for data missing at random, surpassing the disadvantages of the simpler approaches, but should always be conducted with care. The aforementioned approaches are illustrated and compared in an example application using Cox regression.

  3. On the Helicity in 3D-Periodic Navier-Stokes Equations II: The Statistical Case

    NASA Astrophysics Data System (ADS)

    Foias, Ciprian; Hoang, Luan; Nicolaenko, Basil

    2009-09-01

    We study the asymptotic behavior of the statistical solutions to the Navier-Stokes equations using the normalization map [9]. It is then applied to the study of mean energy, mean dissipation rate of energy, and mean helicity of the spatial periodic flows driven by potential body forces. The statistical distribution of the asymptotic Beltrami flows are also investigated. We connect our mathematical analysis with the empirical theory of decaying turbulence. With appropriate mathematically defined ensemble averages, the Kolmogorov universal features are shown to be transient in time. We provide an estimate for the time interval in which those features may still be present. Our collaborator and friend Basil Nicolaenko passed away in September of 2007, after this work was completed. Honoring his contribution and friendship, we dedicate this article to him.

  4. FY2017 Report on NISC Measurements and Detector Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrews, Madison Theresa; Meierbachtol, Krista Cruse; Jordan, Tyler Alexander

    FY17 work focused on automation, both of the measurement analysis and of the comparison of simulations. The experimental apparatus was relocated and weeks of continuous measurements of the spontaneous fission source 252Cf were performed. Programs were developed to automate the conversion of measurements into ROOT data framework files with a simple terminal input. The complete analysis of the measurement (which includes energy calibration and the identification of correlated counts) can now be completed with a documented process which involves one simple execution line as well. Finally, the hurdles of slow MCNP simulations resulting in low simulation statistics have been overcome with the generation of multi-run suites which make use of the high-performance computing resources at LANL. Preliminary comparisons of measurements and simulations have been performed and will be the focus of FY18 work.

  5. Statistical sensitivity analysis of a simple nuclear waste repository model

    NASA Astrophysics Data System (ADS)

    Ronen, Y.; Lucius, J. L.; Blow, E. M.

    1980-06-01

    This work is a preliminary step in a comprehensive sensitivity analysis of the modeling of a nuclear waste repository. The purpose of the complete analysis is to determine which modeling parameters and physical data are most important in determining key design performance criteria and then to obtain the uncertainty in the design for safety considerations. The theory for a statistical screening design methodology is developed for later use in the overall program. The theory was applied to the test case of determining the relative importance of the sensitivity of the near-field temperature distribution in a single-level salt repository to modeling parameters. The exact values of the sensitivities to these physical and modeling parameters were then obtained using direct methods of recalculation. The sensitivity coefficients found to be important for the sample problem were the thermal loading, the distance between the spent fuel canisters, and their radius. Other important parameters were those related to salt properties at a point of interest in the repository.

  6. Pedagogical monitoring as a tool to reduce dropout in distance learning in family health.

    PubMed

    de Castro E Lima Baesse, Deborah; Grisolia, Alexandra Monteiro; de Oliveira, Ana Emilia Figueiredo

    2016-08-22

    This paper presents the results of a study of the Monsys monitoring system, an educational support tool designed to prevent and control the dropout rate in a distance learning course in family health. Developed by UNA-SUS/UFMA, Monsys was created to enable data mining in the virtual learning environment known as Moodle. This is an exploratory study using documentary and bibliographic research and analysis of the Monsys database. Two classes (2010 and 2011) were selected as research subjects, one with Monsys intervention and the other without. The samples were matched (using a ratio of 1:1) by gender, age, marital status, graduation year, previous graduation status, location and profession. Statistical analysis was performed using the chi-square test and a multivariate logistic regression model with a 5 % significance level. The findings show that the dropout rate in the class in which Monsys was not employed (2010) was 43.2 %. However, the dropout rate in the class of 2011, in which the tool was employed as a pedagogical team aid, was 30.6 %. After statistical adjustment, the Monsys monitoring system remained associated with the course completion variable (adjusted OR = 1.74, 95% CI = 1.17-2.59; p = 0.005), suggesting that the use of the Monsys tool, independently of the adjusted variables, can enhance the likelihood that students will complete the course. Using the chi-square test, a profile analysis of students revealed a higher completion rate among women (67.7 %) than men (52.2 %). Analysis of age demonstrated that students between 40 and 49 years dropped out the least (32.1 %), and, with regard to professional training, nurses had the lowest dropout rate (36.3 %). The use of Monsys significantly reduced the dropout rate, with the strongest associations observed for the presence of the monitoring system and female gender.
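
    The analysis described above pairs an unadjusted chi-square test with an adjusted multivariate logistic regression at a 5% significance level. The sketch below shows a generic form of that workflow; the data frame, column names and covariates are hypothetical and are not taken from the Monsys study.

        import numpy as np
        import pandas as pd
        from scipy.stats import chi2_contingency
        import statsmodels.formula.api as smf

        # Hypothetical student-level data: 1 = completed the course, 0 = dropped out.
        rng = np.random.default_rng(1)
        df = pd.DataFrame({
            "completed": rng.integers(0, 2, 400),
            "monitored": rng.integers(0, 2, 400),   # exposed to the monitoring tool
            "female":    rng.integers(0, 2, 400),
            "age":       rng.integers(22, 60, 400),
        })

        # Unadjusted association: chi-square test on the 2x2 table.
        table = pd.crosstab(df["monitored"], df["completed"])
        chi2, p, dof, expected = chi2_contingency(table)
        print(f"chi2 = {chi2:.2f}, p = {p:.3f}")

        # Adjusted association: multivariate logistic regression (alpha = 0.05).
        fit = smf.logit("completed ~ monitored + female + age", data=df).fit(disp=0)
        print(pd.concat([np.exp(fit.params), fit.pvalues], axis=1, keys=["OR", "p"]))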

  7. Motor vehicle traffic crashes as a leading cause of death in the U.S. : summary of the 1997 mortality experience and traffic crash fatality trend from 1992 to 1997

    DOT National Transportation Integrated Search

    2000-06-01

    The National Center for Statistics and Analysis (NCSA) recently completed a study of data on the causes of death for all persons, by age and sex, which occurred in the U.S. in 1997. The purpose of this study was to examine the status of motor vehicle...

  8. A Preliminary Analysis of the Theoretical Parameters of Organizational Learning.

    DTIC Science & Technology

    1995-09-01

    [Indexed excerpt; the record text consists of title-page and table-of-contents fragments rather than an abstract. Recoverable content: a thesis presented to the Faculty of the Graduate School of Logistics and Acquisition Management, covering organizational learning parameters in the knowledge acquisition and information distribution categories, refined measurement scales, and Cronbach's alpha statistics for the complete knowledge flow scale.]

  9. [Establishment of diagnostic model to monitor minimal residual disease of acute promyelocytic leukemia by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry].

    PubMed

    Zhang, Lin-lin; Xu, Zhi-fang; Tan, Yan-hong; Chen, Xiu-hua; Xu, Ai-ning; Ren, Fang-gang; Wang, Hong-wei

    2013-01-01

    To screen for potential protein biomarkers of minimal residual disease (MRD) in acute promyelocytic leukemia (APL) by comparing differentially expressed serum proteins between APL patients at diagnosis, APL patients after complete remission (CR) and healthy controls, and to establish and verify a diagnostic model. Serum proteins from 36 cases of primary APL, 29 cases of APL in complete remission and 32 healthy controls were purified by magnetic beads and then analyzed by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). The spectra were analyzed statistically using FlexAnalysis(TM) and ClinProt(TM) software. Two prediction models, primary APL/healthy control and primary APL/APL CR, were developed. Thirty-four statistically significant peptide peaks with m/z values ranging from 1000 to 10 000 were obtained in the primary APL/healthy control model (P < 0.001). Seven statistically significant peptide peaks were obtained in the primary APL/APL CR model (P < 0.001). On comparison of the protein profiles between the two models, three peptides with m/z 4642, 7764 and 9289 were considered protein biomarkers of APL MRD. A diagnostic pattern for APL CR using m/z 4642 and 9289 was established. Blind validation yielded correct classification of 6 out of 8 cases. MALDI-TOF MS analysis of APL patients' serum proteins can be used as a promising dynamic method for MRD detection, and the two peptides with m/z 4642 and 9289 may be better biomarkers.

  10. Humans make efficient use of natural image statistics when performing spatial interpolation.

    PubMed

    D'Antona, Anthony D; Perry, Jeffrey S; Geisler, Wilson S

    2013-12-16

    Visual systems learn through evolution and experience over the lifespan to exploit the statistical structure of natural images when performing visual tasks. Understanding which aspects of this statistical structure are incorporated into the human nervous system is a fundamental goal in vision science. To address this goal, we measured human ability to estimate the intensity of missing image pixels in natural images. Human estimation accuracy is compared with various simple heuristics (e.g., local mean) and with optimal observers that have nearly complete knowledge of the local statistical structure of natural images. Human estimates are more accurate than those of simple heuristics, and they match the performance of an optimal observer that knows the local statistical structure of relative intensities (contrasts). This optimal observer predicts the detailed pattern of human estimation errors and hence the results place strong constraints on the underlying neural mechanisms. However, humans do not reach the performance of an optimal observer that knows the local statistical structure of the absolute intensities, which reflect both local relative intensities and local mean intensity. As predicted from a statistical analysis of natural images, human estimation accuracy is negligibly improved by expanding the context from a local patch to the whole image. Our results demonstrate that the human visual system exploits efficiently the statistical structure of natural images.
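
    A minimal sketch of the "local mean" heuristic that the optimal-observer comparison above uses as a baseline is shown below: the missing pixel is estimated as the average of the surrounding patch, excluding the pixel itself. The toy gradient-plus-noise image stands in for a natural image and is an assumption for illustration.

        import numpy as np

        def local_mean_estimate(img, row, col, radius=2):
            """Estimate a missing pixel as the mean of its surrounding patch (simple heuristic)."""
            patch = img[max(row - radius, 0):row + radius + 1,
                        max(col - radius, 0):col + radius + 1].astype(float)
            total = patch.sum() - img[row, col]    # exclude the pixel being estimated
            return total / (patch.size - 1)

        # Toy "natural image": a smooth gradient plus noise, in place of a real photograph.
        rng = np.random.default_rng(0)
        img = np.clip(np.add.outer(np.arange(64), np.arange(64)) * 2.0
                      + rng.normal(0, 5, (64, 64)), 0, 255)

        r, c = 32, 32
        print(f"true = {img[r, c]:.1f}, local-mean estimate = {local_mean_estimate(img, r, c):.1f}")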

  11. Healthy work environments and staff nurse retention: the relationship between communication, collaboration, and leadership in the pediatric intensive care unit.

    PubMed

    Blake, Nancy; Leach, Linda Searle; Robbins, Wendy; Pike, Nancy; Needleman, Jack

    2013-01-01

    A healthy work environment can improve patient outcomes and registered nurse (RN) turnover. Creating cultures of retention and fostering healthy work environments are 2 major challenges facing nurse leaders today. Examine the effects of the healthy work environment (communication, collaboration, and leadership) on RN turnover from data collected from a research study. Descriptive, cross-sectional, correlational design. Pediatric critical care RNs from 10 pediatric intensive care units (PICU) completed the Practice Environment Scale of the Nursing Work Index Revised and a subscale of the Intensive Care Unit Nurse-Physician Communication Questionnaire. These staff nurses were asked whether they intend to leave their current job in the next 6 months. Statistical analysis included correlations, multiple linear regression, t tests (2-tailed), and 1-way analysis of variance. A total of 415 RNs completed the survey. There was a statistically significant relationship between leadership and the intent to leave (P < .05). There was also an inverse relationship between years of experience and intent to leave. None of the communication variables between RNs and among RNs and MDs or collaboration were significantly associated with PICU nurses' intention to leave. Effective leadership in the PICU is important to PICU RNs and significantly influences their decisions about staying in their current job.

  12. The effect of partially exposed connective tissue graft on root-coverage outcomes: a systematic review and meta-analysis.

    PubMed

    Dodge, Austin; Garcia, Jeffrey; Luepke, Paul; Lai, Yu-Lin; Kassab, Moawia; Lin, Guo-Hao

    2018-04-01

    The aim of this systematic review was to compare the root-coverage outcomes of a partially exposed connective tissue graft (CTG) technique with those of a fully covered CTG technique. An electronic search up to February 28th, 2017, was performed to identify human clinical studies with data comparing outcomes of root coverage using CTG, with and without a partially exposed graft. Five clinical studies were selected for inclusion in this review. For each study, the gain of keratinized gingiva, reduction of recession depth, number of surgical sites achieving complete root coverage, percentage of root coverage, gain of tissue thickness, and changes of probing depth and clinical attachment level were recorded. Meta-analysis for the comparison of complete root coverage between the two techniques presented no statistically significant differences. A statistically significant gain of keratinized tissue was seen in favor of the sites with an exposed CTG, and a tendency toward a greater reduction in recession depth was seen at the sites with a fully covered CTG. Based on the results, the use of a partially exposed CTG in root-coverage procedures could achieve greater gain in keratinized gingiva, while a fully covered CTG might be indicated for procedures aiming to reduce recession depth. © 2018 Eur J Oral Sci.

  13. Data on education: from population statistics to epidemiological research.

    PubMed

    Pallesen, Palle Bo; Tverborgvik, Torill; Rasmussen, Hanna Barbara; Lynge, Elsebeth

    2010-03-01

    Level of education is in many fields of research used as an indicator of social status. Using Statistics Denmark's register for education and employment of the population, we examined highest completed education with a birth-cohort perspective focusing on people born between 1930 and 1974. Irregularities in the educational data were found for both men and women born from 1951 to 1957. For the birth cohorts born from 1951 to 1954, a sudden increase in the proportion of persons with basic school education only was seen, and a following decrease in this proportion was seen for the birth cohorts born from 1955 to 1957. For the same birth cohorts, a reverse curve was found for the proportion with vocational training as highest completed education. Using proportion of women with at least one child at the age of 30, our analysis illustrated that spurious patterns may emerge when other social phenomena are analysed by partly misclassified educational groups. Our findings showed that register data are not always to be taken at face value and that thorough analysis may unravel unexpected irregularities. Although such data errors may be remedied in analyses of population trends by use of extrapolated values, solutions are less obvious in epidemiological research using individual level data.

  14. Effect of complete dentures on dynamic measurement of changing head position: A pilot study.

    PubMed

    Usumez, Aslihan; Usumez, Serdar; Orhan, Metin

    2003-10-01

    Complete dentures contribute significantly to the facial esthetics of edentulous patients. However, information as to the effect of complete dentures on the natural position of the head is limited. The purpose of this pilot study was to evaluate the immediate and 30-day effect of wearing complete dentures on the dynamic natural head position measured during walking. The sample consisted of a volunteer group of 16 patients, 8 women and 8 men, who received new complete dentures. The ages of the subjects ranged from 45 to 64 years (mean=52 years). Dynamic measurement of head posture was carried out by a specially constructed inclinometer device. Each subject in turn was fitted with the inclinometer system and instructed to walk in a relaxed manner for 5 minutes. The data, measured as degrees, were stored in a pocket data logger. This procedure was repeated before insertion of dentures (T1), immediately after insertion of dentures (T2), and 30 days after insertion of dentures (T3). Stored dynamic head posture data were transferred to a computer for analysis. The means of the measurements were statistically compared with the Friedman test followed by Wilcoxon tests (alpha = .05). Twelve of 16 (75%) subjects showed an average of 4.6 degrees of cranial extension immediately after insertion of dentures. Six (37.5%) subjects showed an average of 6.4 degrees of cranial flexion, and 8 (50%) subjects showed an average of 5.2 degrees of cranial extension at T3 relative to the T1 measurement. Dynamic head posture measurements of the other 2 subjects remained unchanged. There were significant differences among the measurements of dynamic head posture (P<.025). However, only the T1 and T2 measurements were significantly different (P<.015). The findings indicate that the statistically significant average extension of 4.6 degrees observed immediately after insertion of complete dentures was not stable over the 30-day evaluation period and did not result in a statistically significant long-term change. The overall effect of wearing dentures was an irregular flexion or extension pattern on dynamic head posture.
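
    The comparison above uses a Friedman test across the three time points followed by pairwise Wilcoxon signed-rank tests. The sketch below reproduces that sequence on made-up head-posture angles; the numbers are synthetic and only illustrate the procedure.

        import numpy as np
        from scipy.stats import friedmanchisquare, wilcoxon

        # Hypothetical head-posture angles (degrees) for 16 subjects at T1, T2 and T3.
        rng = np.random.default_rng(3)
        t1 = rng.normal(0.0, 3.0, 16)
        t2 = t1 + rng.normal(4.6, 3.0, 16)   # extension right after denture insertion
        t3 = t1 + rng.normal(1.0, 4.0, 16)   # partial return after 30 days

        stat, p = friedmanchisquare(t1, t2, t3)
        print(f"Friedman: chi2 = {stat:.2f}, p = {p:.4f}")

        # Follow-up pairwise Wilcoxon signed-rank tests (alpha = .05, as in the study design).
        pairs = {"T1 vs T2": (t1, t2), "T1 vs T3": (t1, t3), "T2 vs T3": (t2, t3)}
        for label, (a, b) in pairs.items():
            w, pw = wilcoxon(a, b)
            print(f"{label}: W = {w:.1f}, p = {pw:.4f}")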

  15. Changing Trends in the Clinical Presentation and Management of Complete Hydatidiform Mole Among Brazilian Women.

    PubMed

    Braga, Antonio; Moraes, Valéria; Maestá, Izildinha; Amim Júnior, Joffre; Rezende-Filho, Jorge de; Elias, Kevin; Berkowitz, Ross

    2016-06-01

    The aim of the study was to evaluate potential changes in the clinical, diagnostic, and therapeutic parameters of complete hydatidiform mole in the last 25 years in Brazil. A retrospective cohort study was conducted involving the analysis of 2163 medical records of patients diagnosed with complete hydatidiform mole who received treatment at the Rio de Janeiro Reference Center for Gestational Trophoblastic Disease between January 1988 and December 2012. For the statistical analysis of the natural history of the patients with complete molar pregnancies, time series were evaluated using the Cox-Stuart test and adjusted by linear regression models. A downward linear temporal trend was observed for gestational age of complete hydatidiform mole at diagnosis, which is also reflected in the reduced occurrence of vaginal bleeding, hyperemesis and pre-eclampsia. We also observed an increase in the use of uterine vacuum aspiration to treat molar pregnancy. Although the duration of postmolar follow-up was found to decline, this was not accompanied by any alteration in the time to remission of the disease or its progression to gestational trophoblastic neoplasia. Early diagnosis of complete hydatidiform mole has altered the natural history of molar pregnancy, especially with a reduction in classical clinical symptoms. However, early diagnosis has not resulted in a reduction in the development of gestational trophoblastic neoplasia, a dilemma that still challenges professionals working with gestational trophoblastic disease.
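
    The Cox-Stuart test used above for the temporal trends is a simple sign test that pairs early values with late values. Below is a textbook-style sketch (not the authors' code) applied to a hypothetical yearly series; it assumes SciPy 1.7+ for scipy.stats.binomtest.

        import numpy as np
        from scipy.stats import binomtest

        def cox_stuart(series):
            """Cox-Stuart sign test: pair each early value with its late counterpart."""
            x = np.asarray(series, dtype=float)
            half = len(x) // 2
            diffs = x[-half:] - x[:half]          # middle value is dropped if the length is odd
            diffs = diffs[diffs != 0]             # ties carry no sign information
            n_pos = int((diffs > 0).sum())
            # Two-sided binomial test of H0: upward and downward changes are equally likely.
            return binomtest(n_pos, len(diffs), 0.5).pvalue

        # Hypothetical yearly mean gestational age at diagnosis (weeks), 1988-2012.
        years = np.arange(1988, 2013)
        ga = 16.0 - 0.25 * (years - 1988) + np.random.default_rng(7).normal(0, 0.8, len(years))
        print(f"Cox-Stuart p-value for a trend: {cox_stuart(ga):.4f}")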

  16. Development of a funding, cost, and spending model for satellite projects

    NASA Technical Reports Server (NTRS)

    Johnson, Jesse P.

    1989-01-01

    The need for a predictive budget/funding model is obvious. The current models used by the Resource Analysis Office (RAO) are used to predict the total costs of satellite projects. An effort was conducted to extend the modeling capabilities from total budget analysis to analysis of both total budget and budget outlays over time. A statistically based, data-driven methodology was used to derive and develop the model. The budget data for the last 18 GSFC-sponsored satellite projects were analyzed and used to build a funding model which would describe the historical spending patterns. The raw data consisted of dollars spent in each specific year and their 1989 dollar equivalent. These data were converted to the standard format used by the RAO group and placed in a database. A simple statistical analysis was performed to calculate the gross statistics associated with project length and project cost and the conditional statistics on project length and project cost. The modeling approach used is derived from the theory of embedded statistics, which states that properly analyzed data will produce the underlying generating function. The process of funding large scale projects over extended periods of time is described by Life Cycle Cost Models (LCCM). The data were analyzed to find a model in the generic form of an LCCM. The model developed is based on a Weibull function whose parameters are found by both nonlinear optimization and nonlinear regression. In order to use this model it is necessary to transform the problem from a dollar/time space to a percentage of total budget/time space. This transformation is equivalent to moving to a probability space. By using the basic rules of probability, the validity of both the optimization and the regression steps is ensured. This statistically significant model is then integrated and inverted. The resulting output represents a project schedule which relates the amount of money spent to the percentage of project completion.
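
    The transformation described above, from dollars over time to fraction of total budget over normalized time, makes the spending profile behave like a cumulative distribution function, which is why a Weibull form can be fitted and then inverted into a schedule. The sketch below fits a two-parameter Weibull CDF by nonlinear least squares; the profile values are invented for illustration and are not the RAO data.

        import numpy as np
        from scipy.optimize import curve_fit

        def weibull_cdf(t, scale, shape):
            """Fraction of the total budget spent by normalized time t."""
            return 1.0 - np.exp(-(t / scale) ** shape)

        # Hypothetical project: fraction of schedule elapsed vs. fraction of budget spent.
        t_frac = np.linspace(0.1, 1.0, 10)
        spend_frac = np.array([0.02, 0.07, 0.16, 0.30, 0.46, 0.62, 0.76, 0.87, 0.95, 1.00])

        (scale, shape), _ = curve_fit(weibull_cdf, t_frac, spend_frac, p0=[0.5, 2.0])
        print(f"fitted scale = {scale:.3f}, shape = {shape:.3f}")

        # Inverting the fitted curve gives the schedule point at which a budget fraction is reached.
        budget_done = 0.5
        t_at_half = scale * (-np.log(1.0 - budget_done)) ** (1.0 / shape)
        print(f"50% of the budget is spent at {t_at_half:.2f} of the schedule")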

  17. Quality Assessments of Long-Term Quantitative Proteomic Analysis of Breast Cancer Xenograft Tissues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Jian-Ying; Chen, Lijun; Zhang, Bai

    The identification of protein biomarkers requires large-scale analysis of human specimens to achieve statistical significance. In this study, we evaluated the long-term reproducibility of an iTRAQ (isobaric tags for relative and absolute quantification) based quantitative proteomics strategy using one channel for universal normalization across all samples. A total of 307 liquid chromatography tandem mass spectrometric (LC-MS/MS) analyses were completed, generating 107 one-dimensional (1D) LC-MS/MS datasets and 8 offline two-dimensional (2D) LC-MS/MS datasets (25 fractions for each set) for human-in-mouse breast cancer xenograft tissues representative of basal and luminal subtypes. Such large-scale studies require the implementation of robust metrics to assess the contributions of technical and biological variability in the qualitative and quantitative data. Accordingly, we developed a quantification confidence score based on the quality of each peptide-spectrum match (PSM) to remove quantification outliers from each analysis. After combining confidence score filtering and statistical analysis, reproducible protein identification and quantitative results were achieved from LC-MS/MS datasets collected over a 16 month period.

  18. The new statistics: why and how.

    PubMed

    Cumming, Geoff

    2014-01-01

    We need to make substantial changes to how we conduct research. First, in response to heightened concern that our published research literature is incomplete and untrustworthy, we need new requirements to ensure research integrity. These include prespecification of studies whenever possible, avoidance of selection and other inappropriate data-analytic practices, complete reporting, and encouragement of replication. Second, in response to renewed recognition of the severe flaws of null-hypothesis significance testing (NHST), we need to shift from reliance on NHST to estimation and other preferred techniques. The new statistics refers to recommended practices, including estimation based on effect sizes, confidence intervals, and meta-analysis. The techniques are not new, but adopting them widely would be new for many researchers, as well as highly beneficial. This article explains why the new statistics are important and offers guidance for their use. It describes an eight-step new-statistics strategy for research with integrity, which starts with formulation of research questions in estimation terms, has no place for NHST, and is aimed at building a cumulative quantitative discipline.
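
    As one concrete instance of the estimation-oriented practices recommended above, the sketch below reports a standardized effect size (Cohen's d) with an approximate 95% confidence interval instead of a p value. The large-sample standard-error formula is a common approximation and an assumption here, not a prescription taken from the article.

        import numpy as np
        from scipy.stats import norm

        def cohens_d_ci(group1, group2, level=0.95):
            """Cohen's d with an approximate (large-sample) confidence interval."""
            g1, g2 = np.asarray(group1, float), np.asarray(group2, float)
            n1, n2 = len(g1), len(g2)
            pooled_sd = np.sqrt(((n1 - 1) * g1.var(ddof=1) + (n2 - 1) * g2.var(ddof=1))
                                / (n1 + n2 - 2))
            d = (g1.mean() - g2.mean()) / pooled_sd
            se = np.sqrt((n1 + n2) / (n1 * n2) + d ** 2 / (2 * (n1 + n2)))  # common approximation
            z = norm.ppf(0.5 + level / 2)
            return d, d - z * se, d + z * se

        rng = np.random.default_rng(5)
        a = rng.normal(10.0, 2.0, 40)
        b = rng.normal(9.0, 2.0, 40)
        d, lo, hi = cohens_d_ci(a, b)
        print(f"d = {d:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")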

  19. Langmuir waveforms at interplanetary shocks: STEREO statistical analysis

    NASA Astrophysics Data System (ADS)

    Briand, C.

    2016-12-01

    Wave-particle interactions and particle acceleration are the two main processes allowing energy dissipation at non-collisional shocks. Ion acceleration has been studied in depth for many years, notably for its central role in shock front reformation. Electron dynamics is also important to the shock dynamics through the instabilities electrons can generate, which may in turn affect the ion dynamics. Particle measurements can be efficiently complemented by wave measurements to determine the characteristics of the electron beams and to study the turbulence of the medium. Electric waveforms obtained from the S/WAVES instrument of the STEREO mission between 2007 and 2014 are analyzed. Clear signatures of Langmuir waves are observed at 41 interplanetary shocks. These data enable a statistical analysis and allow some characteristics of the electron dynamics to be deduced for different shock sources (SIR or ICME) and types (quasi-perpendicular or quasi-parallel). The conversion process between electrostatic and electromagnetic waves has also been tested in several cases.

  20. Hydrotherapy after total hip arthroplasty: a follow-up study.

    PubMed

    Giaquinto, S; Ciotola, E; Dall'armi, V; Margutti, F

    2010-01-01

    The aim of the study was to evaluate the subjective functional outcome of total hip arthroplasty (THA) in patients who underwent hydrotherapy (HT) 6 months after discharge. A prospective randomized study was performed on 70 elderly inpatients with recent THA, who completed a rehabilitation program. After randomization, 33 of them were treated in conventional gyms (no-hydrotherapy group=NHTG) and 31 received HT (hydrotherapy group=HTG). Interviews with the Western-Ontario MacMasters Universities Osteoarthritis Index (WOMAC) were performed at admission, at discharge and 6 months later. Kruskal-Wallis, Mann-Whitney and Wilcoxon tests were applied for statistical analysis. Both groups improved. Pain, stiffness and function were all positively affected. Statistical analysis indicated that WOMAC sub-scales were significantly lower for all patients treated with HT. The benefits at discharge still remained after 6 months. We conclude that HT is recommended after THA in a geriatric population.

  1. Ultra-trace analysis of 41Ca in urine by accelerator mass spectrometry: an inter-laboratory comparison

    PubMed Central

    Jackson, George S.; Hillegonds, Darren J.; Muzikar, Paul; Goehring, Brent

    2013-01-01

    A 41Ca interlaboratory comparison between Lawrence Livermore National Laboratory (LLNL) and the Purdue Rare Isotope Laboratory (PRIME Lab) has been completed. Analysis of the ratios assayed by accelerator mass spectrometry (AMS) shows that there is no statistically significant difference between the ratios. Further, Bayesian analysis shows that the uncertainties reported by both facilities are correct, with the possibility of a slight under-estimation by one laboratory. Finally, the chemistry procedures used by the two facilities to produce CaF2 for the cesium sputter ion source are robust and do not yield any significant differences in the final result. PMID:24179312

  2. Diagnostic studies of the HxOy-NzOy-O3 photochemical system using data from NASA GTE field expeditions

    NASA Technical Reports Server (NTRS)

    Chameides, William L.

    1988-01-01

    Spring 1984 GTE CITE-1 flight data from the field exercise were obtained from a GTE Data Archive Tape. Chemical and supporting meteorological data taken over the Pacific Ocean were statistically and diagnostically analyzed to identify the key processes affecting the concentrations of ozone and its chemical precursors in the region. This analysis was completed. The analysis of the GTE CITE-2 data is being performed in collaboration with Dr. D.D. Davis and other GTE scientists. Initial results of that analysis were presented, and work was begun on a paper describing the results.

  3. Texture analysis of pulmonary parenchyma in normal and emphysematous lung

    NASA Astrophysics Data System (ADS)

    Uppaluri, Renuka; Mitsa, Theophano; Hoffman, Eric A.; McLennan, Geoffrey; Sonka, Milan

    1996-04-01

    Tissue characterization using texture analysis is gaining increasing importance in medical imaging. We present a completely automated method for discriminating between normal and emphysematous regions in CT images. The method involves extracting seventeen features based on statistical, hybrid and fractal texture models. The best subset of features is derived from the training set using the divergence technique. A minimum distance classifier is used to classify the samples into one of the two classes, normal or emphysema. Sensitivity, specificity and accuracy values achieved were 80% or greater in most cases, showing that texture analysis holds great promise for identifying emphysema.

  4. Controlling reactivity of nanoporous catalyst materials by tuning reaction product-pore interior interactions: Statistical mechanical modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jing; Ackerman, David M.; Lin, Victor S.-Y.

    2013-04-02

    Statistical mechanical modeling is performed of a catalytic conversion reaction within a functionalized nanoporous material to assess the effect of varying the reaction product-pore interior interaction from attractive to repulsive. A strong enhancement in reactivity is observed not just due to the shift in reaction equilibrium towards completion but also due to enhanced transport within the pore resulting from reduced loading. The latter effect is strongest for highly restricted transport (single-file diffusion), and applies even for irreversible reactions. The analysis is performed utilizing a generalized hydrodynamic formulation of the reaction-diffusion equations which can reliably capture the complex interplay between reaction and restricted transport.

  5. TOMS and SBUV Data: Comparison to 3D Chemical-Transport Model Results

    NASA Technical Reports Server (NTRS)

    Stolarski, Richard S.; Douglass, Anne R.; Steenrod, Steve; Frith, Stacey

    2003-01-01

    We have updated our merged ozone data (MOD) set using the TOMS data from the new version 8 algorithm. We then analyzed these data for contributions from solar cycle, volcanoes, QBO, and halogens using a standard statistical time series model. We have recently completed a hindcast run of our 3D chemical-transport model for the same years. This model uses off-line winds from the finite-volume GCM, a full stratospheric photochemistry package, and time-varying forcing due to halogens, solar uv, and volcanic aerosols. We will report on a parallel analysis of these model results using the same statistical time series technique as used for the MOD data.

  6. LADES: a software for constructing and analyzing longitudinal designs in biomedical research.

    PubMed

    Vázquez-Alcocer, Alan; Garzón-Cortes, Daniel Ladislao; Sánchez-Casas, Rosa María

    2014-01-01

    One of the most important steps in biomedical longitudinal studies is choosing a good experimental design that can provide high accuracy in the analysis of results with a minimum sample size. Several methods for constructing efficient longitudinal designs have been developed based on power analysis and the statistical model used for analyzing the final results. However, development of this technology is not available to practitioners through user-friendly software. In this paper we introduce LADES (Longitudinal Analysis and Design of Experiments Software) as an alternative and easy-to-use tool for conducting longitudinal analysis and constructing efficient longitudinal designs. LADES incorporates methods for creating cost-efficient longitudinal designs, unequal longitudinal designs, and simple longitudinal designs. In addition, LADES includes different methods for analyzing longitudinal data such as linear mixed models, generalized estimating equations, among others. A study of European eels is reanalyzed in order to show LADES capabilities. Three treatments contained in three aquariums with five eels each were analyzed. Data were collected from 0 up to the 12th week post treatment for all the eels (complete design). The response under evaluation is sperm volume. A linear mixed model was fitted to the results using LADES. The complete design had a power of 88.7% using 15 eels. With LADES we propose the use of an unequal design with only 14 eels and 89.5% efficiency. LADES was developed as a powerful and simple tool to promote the use of statistical methods for analyzing and creating longitudinal experiments in biomedical research.
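
    A linear mixed model of the kind mentioned above, with a random intercept per animal and fixed effects for time and treatment, can be sketched with statsmodels as shown below. The data layout, column names and formula are illustrative assumptions and do not reproduce the LADES analysis of the eel study.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        # Hypothetical longitudinal layout: 15 subjects, 3 treatments, weeks 0-12.
        rng = np.random.default_rng(4)
        rows = []
        for subject in range(15):
            treatment = subject % 3
            baseline = rng.normal(10, 2)            # subject-specific level (random intercept)
            for week in range(13):
                volume = baseline + 0.5 * week + 0.3 * treatment * week + rng.normal(0, 1)
                rows.append({"subject": subject, "treatment": treatment,
                             "week": week, "volume": volume})
        df = pd.DataFrame(rows)

        # Fixed effects for week x treatment, random intercept per subject.
        model = smf.mixedlm("volume ~ week * C(treatment)", data=df, groups=df["subject"])
        print(model.fit().summary())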

  7. Learning curves for single incision and conventional laparoscopic right hemicolectomy: a multidimensional analysis

    PubMed Central

    Park, Yoonah; Yong, Yuen Geng; Jung, Kyung Uk; Huh, Jung Wook; Cho, Yong Beom; Kim, Hee Cheol; Lee, Woo Yong; Chun, Ho-Kyung

    2015-01-01

    Purpose This study aimed to compare the learning curves and early postoperative outcomes for conventional laparoscopic (CL) and single incision laparoscopic (SIL) right hemicolectomy (RHC). Methods This retrospective study included the initial 35 cases in each group. Learning curves were evaluated by the moving average of operative time, mean operative time of every five consecutive cases, and cumulative sum (CUSUM) analysis. The learning phase was considered overcome when the moving average of operative times reached a plateau, and when the mean operative time of every five consecutive cases reached a low point and subsequently did not vary by more than 30 minutes. Results Six patients with missing data in the CL RHC group were excluded from the analyses. According to the mean operative time of every five consecutive cases, learning phase of SIL and CL RHC was completed between 26 and 30 cases, and 16 and 20 cases, respectively. Moving average analysis revealed that approximately 31 (SIL) and 25 (CL) cases were needed to complete the learning phase, respectively. CUSUM analysis demonstrated that 10 (SIL) and two (CL) cases were required to reach a steady state of complication-free performance, respectively. Postoperative complications rate was higher in SIL than in CL group, but the difference was not statistically significant (17.1% vs. 3.4%). Conclusion The learning phase of SIL RHC is longer than that of CL RHC. Early oncological outcomes of both techniques were comparable. However, SIL RHC had a statistically insignificant higher complication rate than CL RHC during the learning phase. PMID:25960990
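
    The two learning-curve summaries used above, a moving average of operative time and a CUSUM of deviations from the overall mean, are simple to compute; the sketch below does so for a synthetic series of 35 cases (the times are made up, not the study data).

        import numpy as np

        # Hypothetical operative times (minutes) for 35 consecutive cases, gradually improving.
        rng = np.random.default_rng(2)
        times = 220 - 2.0 * np.arange(35) + rng.normal(0, 15, 35)

        def moving_average(x, window=5):
            """Trailing moving average; the first window-1 values are left undefined."""
            out = np.full(len(x), np.nan)
            for i in range(window - 1, len(x)):
                out[i] = x[i - window + 1:i + 1].mean()
            return out

        ma = moving_average(times)

        # CUSUM of deviations from the overall mean: a peak followed by decline suggests
        # the point at which performance starts beating the average (end of the learning phase).
        cusum = np.cumsum(times - times.mean())

        for case in (4, 14, 24, 34):
            print(f"case {case + 1:2d}: moving avg = {ma[case]:6.1f} min, CUSUM = {cusum[case]:7.1f}")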

  8. Results of the NaCo Large Program: probing the occurrence of exoplanets and brown dwarfs at wide orbit

    NASA Astrophysics Data System (ADS)

    Vigan, A.; Chauvin, G.; Bonavita, M.; Desidera, S.; Bonnefoy, M.; Mesa, D.; Beuzit, J.-L.; Augereau, J.-C.; Biller, B.; Boccaletti, A.; Brugaletta, E.; Buenzli, E.; Carson, J.; Covino, E.; Delorme, P.; Eggenberger, A.; Feldt, M.; Hagelberg, J.; Henning, T.; Lagrange, A.-M.; Lanzafame, A.; Ménard, F.; Messina, S.; Meyer, M.; Montagnier, G.; Mordasini, C.; Mouillet, D.; Moutou, C.; Mugnier, L.; Quanz, S. P.; Reggiani, M.; Ségransan, D.; Thalmann, C.; Waters, R.; Zurlo, A.

    2014-01-01

    Over the past decade, a growing number of deep imaging surveys have started to provide meaningful constraints on the population of extrasolar giant planets at large orbital separation. Primary targets for these surveys have been carefully selected based on their age, distance and spectral type, and often on their membership to young nearby associations where all stars share common kinematics, photometric and spectroscopic properties. The next step is a wider statistical analysis of the frequency and properties of low mass companions as a function of stellar mass and orbital separation. In late 2009, we initiated a coordinated European Large Program using angular differential imaging in the H band (1.66 μm) with NaCo at the VLT. Our aim is to provide a comprehensive and statistically significant study of the occurrence of extrasolar giant planets and brown dwarfs at large (5-500 AU) orbital separation around ~150 young, nearby stars, a large fraction of which have never been observed at very deep contrast. The survey has now been completed and we present the data analysis and detection limits for the observed sample, for which we reach the planetary-mass domain at separations of >~50 AU on average. We also present the results of the statistical analysis that has been performed over the 75 targets newly observed at high-contrast. We discuss the details of the statistical analysis and the physical constraints that our survey provides for the frequency and formation scenario of planetary mass companions at large separation.

  9. ELECTROMYOGRAPHIC EVALUATION OF MASTICATION AND SWALLOWING IN ELDERLY INDIVIDUALS WITH MANDIBULAR FIXED IMPLANT-SUPPORTED PROSTHESES

    PubMed Central

    Berretin-Felix, Giédre; Nary, Hugo; Padovani, Carlos Roberto; Trindade, Alceu Sergio; Machado, Wellington Monteiro

    2008-01-01

    This study evaluated the effect of implant-supported oral rehabilitation in the mandible on electromyographic activity during mastication and swallowing in edentulous elderly individuals. Fifteen patients aged more than 60 years (10 females and 5 males) were evaluated. All patients were edentulous, wore removable complete dentures on both dental arches, and had the mandibular dentures replaced by implant-supported prostheses. All patients underwent electromyographic evaluation of the masseter, superior orbicularis oris and submental muscles before surgery and 3, 6 and 18 months postoperatively, using foods of different textures. The results obtained at the different periods were analyzed statistically by the Kruskal-Wallis non-parametric test. Statistical analysis showed that only the masseter muscle had a significant loss in electromyographic activity (p<0.001), with a tendency towards a similar response for the submental muscles. Moreover, there was an increase in the activity of the orbicularis oris muscle during rubber chewing after treatment, though without a statistically significant difference. Mandibular fixed implant-supported prostheses in elderly individuals revealed a decrease in electromyographic amplitude for the masseter muscles during swallowing, which may indicate adaptation to the new conditions of stability provided by fixation of the complete denture in the mandibular arch. PMID:19089202

  10. Likelihoods for fixed rank nomination networks

    PubMed Central

    HOFF, PETER; FOSDICK, BAILEY; VOLFOVSKY, ALEX; STOVEL, KATHERINE

    2014-01-01

    Many studies that gather social network data use survey methods that lead to censored, missing, or otherwise incomplete information. For example, the popular fixed rank nomination (FRN) scheme, often used in studies of schools and businesses, asks study participants to nominate and rank at most a small number of contacts or friends, leaving the existence of other relations uncertain. However, most statistical models are formulated in terms of completely observed binary networks. Statistical analyses of FRN data with such models ignore the censored and ranked nature of the data and could potentially result in misleading statistical inference. To investigate this possibility, we compare Bayesian parameter estimates obtained from a likelihood for complete binary networks with those obtained from likelihoods that are derived from the FRN scheme, and therefore accommodate the ranked and censored nature of the data. We show analytically and via simulation that the binary likelihood can provide misleading inference, particularly for certain model parameters that relate network ties to characteristics of individuals and pairs of individuals. We also compare these different likelihoods in a data analysis of several adolescent social networks. For some of these networks, the parameter estimates from the binary and FRN likelihoods lead to different conclusions, indicating the importance of analyzing FRN data with a method that accounts for the FRN survey design. PMID:25110586

  11. Polish Adaptation of Wrist Evaluation Questionnaires.

    PubMed

    Czarnecki, Piotr; Wawrzyniak-Bielęda, Anna; Romanowski, Leszek

    2015-01-01

    Questionnaires evaluating hand and wrist function are a very useful tool allowing for objective and systematic recording of symptoms reported by patients. Most questionnaires generally accepted in clinical practice are available in English and need to be appropriately adapted in translation and undergo subsequent validation before they can be used in another culture and language. The process of translating the questionnaires was based on the generally accepted guidelines of the International Quality of Life Assessment Project (IQOLA). First, the questionnaires were translated from English into Polish by two independent translators. Then, a joint version of the translation was prepared collectively and translated back into English. Each stage was followed by a written report. The translated questionnaires were then evaluated by a group of patients. We selected 31 patients with wrist problems and asked them to complete the PRWE, Mayo, Michigan and DASH questionnaires twice, at intervals of 3-10 days. The results were submitted for statistical analysis. We found a statistically significant (p<0.05) correlation between the two completions of the questionnaires. A comparison of the PRWE and Mayo questionnaires with the DASH questionnaire also showed a statistically significant correlation (p<0.05). Our results indicate that the cultural adaptation of the translated questionnaires was successful and that the questionnaires may be used in clinical practice.

  12. Sampling and Data Analysis for Environmental Microbiology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murray, Christopher J.

    2001-06-01

    A brief review of the literature indicates the importance of statistical analysis in applied and environmental microbiology. Sampling designs are particularly important for successful studies, and it is highly recommended that researchers review their sampling design before heading to the laboratory or the field. Most statisticians have numerous stories of scientists who approached them after their study was complete only to have to tell them that the data they gathered could not be used to test the hypothesis they wanted to address. Once the data are gathered, a large and complex body of statistical techniques is available for analysis of the data. Those methods include both numerical and graphical techniques for exploratory characterization of the data. Hypothesis testing and analysis of variance (ANOVA) are techniques that can be used to compare the mean and variance of two or more groups of samples. Regression can be used to examine the relationships between sets of variables and is often used to examine the dependence of microbiological populations on microbiological parameters. Multivariate statistics provides several methods that can be used for interpretation of datasets with a large number of variables and to partition samples into similar groups, a task that is very common in taxonomy but also has applications in other fields of microbiology. Geostatistics and other techniques have been used to examine the spatial distribution of microorganisms. The objectives of this chapter are to provide a brief survey of some of the statistical techniques that can be used for sample design and data analysis of microbiological data in environmental studies, and to provide some examples of their use from the literature.

  13. Comparison of Recanalization and In-Stent Stenosis Between the Low-Profile Visualized Intraluminal Support Stent and Enterprise Stent-Assisted Coiling for 254 Intracranial Aneurysms.

    PubMed

    Feng, Xin; Qian, Zenghui; Liu, Peng; Zhang, Baorui; Wang, Luyao; Guo, Erkang; Wen, Xiaolong; Xu, Wenjuan; Jiang, Chuhan; Wu, Zhongxue; Li, Youxiang; Liu, Aihua

    2018-01-01

    To compare the rates of recanalization and in-stent stenosis between the Enterprise (EP) and low-profile visualized intraluminal support (LVIS) stent deployments for intracranial aneurysms (IAs), and the factors associated therewith. Between June 2014 and July 2016, 142 patients with a total of 161 IAs were treated by LVIS stent-assisted coiling and 111 patients with a total of 142 IAs were treated by EP stent-assisted coiling at our institution. Procedure-related complications, angiographic follow-up results, and clinical outcomes were analyzed statistically. The rates of initially complete and near-complete IA occlusion immediately after the procedure were similar in the LVIS and EP groups (94.3% vs. 89.9%; P = 0.275). On follow-up, complete and near-complete occlusion rates and recanalization rates were also similar in the 2 groups (96.6% vs. 92.1%; P = 0.330 and 8.0% vs. 13.5%; P = 0.245, respectively). On logistic regression analysis, a higher size ratio (SR) was significantly associated with the recanalization of aneurysms in the EP group, but not in the LVIS group. The rate of moderate to severe in-stent stenosis was lower in the LVIS group (10.2%) than in the EP group (16.8%), but the difference was not statistically significant (P = 0.198). Our data show acceptable rates of complete and near-complete occlusion with both the LVIS and EP stents. LVIS stents were associated with lower rates of recanalization and in-stent stenosis, but the difference was not significant. Higher SR (≥2) was a significant predictor of recanalization in IAs treated with EP stents, but not in those treated with LVIS stents. Copyright © 2017. Published by Elsevier Inc.

  14. Use of the Wii Gaming System for Balance Rehabilitation: Establishing Parameters for Healthy Individuals.

    PubMed

    Burns, Melissa K; Andeway, Kathleen; Eppenstein, Paula; Ruroede, Kathleen

    2014-06-01

    This study was designed to establish balance parameters for the Nintendo(®) (Redmond, WA) "Wii Fit™" Balance Board system with three common games, in a sample of healthy adults, and to evaluate the balance measurement reproducibility with separation by age. This was a prospective, multivariate analysis of variance, cohort study design. Seventy-five participants who satisfied all inclusion criteria and completed an informed consent were enrolled. Participants were grouped into age ranges: 21-35 years (n=24), 36-50 years (n=24), and 51-65 years (n=27). Each participant completed the following games three consecutive times, in a randomized order, during one session: "Balance Bubble" (BB) for distance and duration, "Tight Rope" (TR) for distance and duration, and "Center of Balance" (COB) on the left and right sides. COB distributed weight was fairly symmetrical across all subjects and trials; therefore, it was assumed to have no influence on, or interaction with, the other "Wii Fit" measurements. Homogeneity of variance statistics indicated that the assumption of distributional normality of the dependent variables (rates) was tenable. The multivariate analysis of variance included the dependent variables BB and TR rates (distance divided by duration to complete) with age group and trials as the independent variables. The BB rate was statistically significant (F=4.725, P<0.005), but not the TR rate. The youngest group's BB rate was significantly larger than those of the other two groups. "Wii Fit" can discriminate among age groups across trials. The results show promise as a viable tool to measure balance and distance across time (speed) and center of balance distribution.

  15. Relationship between Graduate Students' Statistics Self-Efficacy, Statistics Anxiety, Attitude toward Statistics, and Social Support

    ERIC Educational Resources Information Center

    Perepiczka, Michelle; Chandler, Nichelle; Becerra, Michael

    2011-01-01

    Statistics plays an integral role in graduate programs. However, numerous intra- and interpersonal factors may lead to successful completion of needed coursework in this area. The authors examined the extent of the relationship between self-efficacy to learn statistics and statistics anxiety, attitude towards statistics, and social support of 166…

  16. In vitro cavity and crown preparations and direct restorations: A comparison of performance at the start and end of the FD programme.

    PubMed

    Burke, F J T; Ravaghi, V; Mackenzie, L; Priest, N; Falcon, H C

    2017-04-21

    Aim To assess the performance and thereby the progress of the FDs when they carried out a number of simulated clinical exercises at the start and at the end of their FD year. Methods A standardised simulated clinical restorative dentistry training exercise was carried out by a group of 61 recently qualified dental graduates undertaking a 12-month foundation training programme in England, at both the start and end of the programme. Participants completed a Class II cavity preparation and amalgam restoration, a Class IV composite resin restoration and two preparations for a porcelain-metal full crown. The completed preparations and restorations were independently assessed by an experienced consultant in restorative dentistry, using a scoring system based on previously validated criteria. The data were subjected to statistical analysis. Results There was wide variation in individual performance. Overall, there was a small but not statistically significant improvement in performance by the end of the programme. A statistically significant improvement was observed for the amalgam preparation and restoration, and, overall, for one of the five geographical sub-groups in the study. Possible reasons for the variable performance and improvement are discussed. Conclusions There was variability in the performance of the FDs. The operative performance of FDs at the commencement and end of their FD year indicated an overall moderately improved performance over the year and a statistically significant improvement in their performance with regard to amalgam restoration.

  17. Should palonosetron be a preferred 5-HT3 receptor antagonist for chemotherapy-induced nausea and vomiting? An updated systematic review and meta-analysis.

    PubMed

    Chow, Ronald; Warr, David G; Navari, Rudolph M; Tsao, May; Popovic, Marko; Chiu, Leonard; Milakovic, Milica; Lam, Henry; DeAngelis, Carlo

    2018-05-23

    Chemotherapy-induced nausea and vomiting (CINV) continues to be a common side effect of systemic anticancer therapy, decreasing quality of life and increasing resource utilization. The aim of this meta-analysis was to investigate the comparative efficacy and safety of palonosetron relative to other 5-HT3 RAs. A literature search was carried out in Ovid MEDLINE, Embase, and Cochrane Central Register of Controlled Trials. Full-text references were then screened and included in this meta-analysis if they were an RCT and had adequate data regarding one of the five primary endpoints: complete response (CR), complete control (CC), no emesis, no nausea, or no rescue medications. A total of 24 RCTs were included in this review. Palonosetron was statistically superior to other 5-HT3 RAs for 10 of the 19 assessed endpoints. Only one endpoint (emesis in the overall phase) had noticeably more favorable data for palonosetron, to the point that it approached the 10% risk difference (RD) threshold specified by the MASCC/ESMO antiemetic panel; another two endpoints (CR in the overall phase and nausea in the delayed phase) approached the 10% threshold. Palonosetron seems to be more efficacious and safe than other 5-HT3 RAs, being statistically superior for 10 of 19 endpoints. It is, however, clinically significant in only one endpoint and approached a clinically significant difference in another two endpoints. Within the limits of this meta-analysis, our results indicate that palonosetron may not be as superior in efficacy and safety as reported in a previous meta-analysis, and they support the recent MASCC/ESMO, ASCO, and NCCN guidelines in not generally indicating palonosetron as the 5-HT3 RA of choice.
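
    The 10% risk-difference threshold discussed above presupposes pooling a risk difference across trials. The sketch below shows one standard way to do that, inverse-variance weighting with a DerSimonian-Laird random-effects adjustment; the per-trial counts are invented and are not the review's data.

        import numpy as np

        # Hypothetical per-trial counts: events/total for palonosetron vs. a comparator 5-HT3 RA.
        e1 = np.array([120, 95, 60, 150]); n1 = np.array([200, 150, 100, 250])
        e2 = np.array([100, 85, 50, 130]); n2 = np.array([200, 150, 100, 250])

        rd = e1 / n1 - e2 / n2
        var = (e1 / n1) * (1 - e1 / n1) / n1 + (e2 / n2) * (1 - e2 / n2) / n2
        w = 1.0 / var

        # DerSimonian-Laird between-trial variance, then random-effects pooling.
        rd_fixed = np.sum(w * rd) / np.sum(w)
        q = np.sum(w * (rd - rd_fixed) ** 2)
        tau2 = max(0.0, (q - (len(rd) - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
        w_re = 1.0 / (var + tau2)
        rd_re = np.sum(w_re * rd) / np.sum(w_re)
        se_re = np.sqrt(1.0 / np.sum(w_re))
        print(f"pooled RD = {rd_re:.3f}, "
              f"95% CI [{rd_re - 1.96 * se_re:.3f}, {rd_re + 1.96 * se_re:.3f}]")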

  18. Using Network Analysis to Characterize Biogeographic Data in a Community Archive

    NASA Astrophysics Data System (ADS)

    Wellman, T. P.; Bristol, S.

    2017-12-01

    Informative measures are needed to evaluate and compare data from multiple providers in a community-driven data archive. This study explores insights from network theory and other descriptive and inferential statistics to examine data content and application across an assemblage of publicly available biogeographic data sets. The data are archived in ScienceBase, a collaborative catalog of scientific data supported by the U.S. Geological Survey to enhance scientific inquiry and acuity. By gaining understanding through this investigation and other scientific venues, our goal is to improve scientific insight and data use across a spectrum of scientific applications. Network analysis is a tool to reveal patterns of non-trivial topological features in the data that do not exhibit complete regularity or randomness. In this work, network analyses are used to explore shared events and dependencies between measures of data content and application derived from metadata and catalog information and measures relevant to biogeographic study. Descriptive statistical tools are used to explore relations between network analysis properties, while inferential statistics are used to evaluate the degree of confidence in these assessments. Network analyses have been used successfully in related fields to examine social awareness of scientific issues, taxonomic structures of biological organisms, and ecosystem resilience to environmental change. Use of network analysis also shows promising potential to identify relationships in biogeographic data that inform programmatic goals and scientific interests.
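
    As a small, concrete illustration of the network measures discussed above, the sketch below builds a toy bipartite graph linking data sets to the taxa they contain and computes degree centrality with networkx; the node names are hypothetical and do not come from ScienceBase.

        import networkx as nx

        # Hypothetical bipartite graph: data sets linked to the taxa they contain.
        edges = [
            ("dataset_A", "bison"), ("dataset_A", "elk"),
            ("dataset_B", "elk"), ("dataset_B", "wolf"),
            ("dataset_C", "wolf"), ("dataset_C", "bison"), ("dataset_C", "elk"),
        ]
        G = nx.Graph()
        G.add_edges_from(edges)

        # Degree centrality highlights data sets (or taxa) that connect much of the archive.
        for node, c in sorted(nx.degree_centrality(G).items(), key=lambda kv: -kv[1]):
            print(f"{node:10s} {c:.2f}")

        # Connected components give a quick sense of overall structure vs. fragmentation.
        print("connected components:", nx.number_connected_components(G))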

  19. [Evaluation of using statistical methods in selected national medical journals].

    PubMed

    Sych, Z

    1996-01-01

    The paper covers an evaluation of the frequency with which statistical methods were applied in works published in six selected national medical journals in the years 1988-1992. The following journals were chosen for analysis: Klinika Oczna, Medycyna Pracy, Pediatria Polska, Polski Tygodnik Lekarski, Roczniki Państwowego Zakładu Higieny, and Zdrowie Publiczne. A number of works comparable to the average in the remaining medical journals was randomly selected from the respective volumes of Pol. Tyg. Lek. The studies did not include works in which no statistical analysis was carried out; this applied to both national and international publications. The exemption was also extended to review papers, case reports, reviews of books, handbooks, monographs, reports from scientific congresses, and papers on historical topics. The number of works was determined in each volume. Next, an analysis was performed to establish how the sample was selected in the respective studies, distinguishing two categories: random and purposive selection. Attention was also paid to the presence of a control sample in the individual works. The analysis also considered whether sample characteristics were reported, using three categories: complete, partial and lacking. In evaluating the analyzed works, an effort was made to present the results of the studies in tables and figures (Tab. 1, 3). The analysis established the rate at which statistical methods were employed in the analyzed works in the relevant volumes of the six selected national medical journals for the years 1988-1992, while also determining the number of works in which no statistical methods were used. Concurrently, the frequency with which individual statistical methods were applied was analyzed in the scrutinized works. Prominence was given to fundamental methods of descriptive statistics (measures of position, measures of dispersion) as well as the most important methods of mathematical statistics, such as parametric tests of significance, analysis of variance (in single and dual classifications), non-parametric tests of significance, correlation and regression. Works that used multiple correlation, multiple regression, or more complex methods of studying the relationship between two or more variables were counted among the works whose statistical methods comprised correlation and regression, as well as other methods, e.g. statistical methods used in epidemiology (incidence and morbidity coefficients, standardization of coefficients, survival tables), factor analysis conducted by the Jacobi-Hotelling method, taxonomic methods and others. On the basis of the performed studies it was established that statistical methods were employed in 61.1-66.0% of the analyzed works in the six selected national medical journals in the years 1988-1992 (Tab. 3), which is broadly similar to the frequencies reported for English-language medical journals. On the whole, no significant differences were found in the frequency of the applied statistical methods (Tab. 4) or in the frequency of random sample selection (Tab. 3) in the analyzed works appearing in the medical journals in the respective years 1988-1992.
    The most frequently used statistical methods in the analyzed works for 1988-1992 were measures of position (44.2-55.6%), measures of dispersion (32.5-38.5%) and parametric tests of significance (26.3-33.1% of the works analyzed) (Tab. 4). To increase the frequency and reliability of the statistical methods used, the teaching of biostatistics should be expanded in medical studies and in postgraduate training for physicians and for scientific and teaching staff.

  20. Effect of endodontic irrigation with 1% sodium hypochlorite and 17% EDTA on primary teeth: a scanning electron microscope analysis.

    PubMed

    Ximenes, Marcos; Triches, Thaisa C; Beltrame, Ana Paula C A; Hilgert, Leandro A; Cardoso, Mariane

    2013-01-01

    This study evaluated the efficacy of 2 final irrigation solutions for removal of the smear layer (SL) from root canals of primary teeth, using scanning electron microscope (SEM) analysis. Thirty primary molars were selected and a single operator instrumented the canals. The initial irrigation was done with a 1% sodium hypochlorite (NaOCl) solution. After the preparation, the roots were randomly divided into 3 groups for final irrigation: Group 1, 1% NaOCl (n = 10); Group 2, 17% EDTA + 1% NaOCl (n = 10); and Group 3, 17% EDTA + saline solution (n = 10). The roots were prepared for SEM analysis (magnification 1000X). The photomicrographs were independently analyzed by 2 investigators with SEM experience, who attributed scores to each root third in terms of SL removal. Kruskal-Wallis and Mann-Whitney tests revealed that there was no statistical difference between the groups (P = 0.489). However, a statistical difference was found (P < 0.05) in a comparison of root thirds, with the apical third having the worst results. Comparing the thirds within the same group, all canals showed statistical differences between the cervical and apical thirds (P < 0.05). The authors determined that no substance or combination of substances was able to completely remove the SL.

  1. Validation of the World Health Organization tool for situational analysis to assess emergency and essential surgical care at district hospitals in Ghana.

    PubMed

    Osen, Hayley; Chang, David; Choo, Shelly; Perry, Henry; Hesse, Afua; Abantanga, Francis; McCord, Colin; Chrouser, Kristin; Abdullah, Fizan

    2011-03-01

    The World Health Organization (WHO) Tool for Situational Analysis to Assess Emergency and Essential Surgical Care (hereafter called the WHO Tool) has been used in more than 25 countries and is the largest effort to assess surgical care in the world. However, it has not yet been independently validated. Test-retest reliability is one way to validate the degree to which test instruments are free from random error. The aim of the present field study was to determine the test-retest reliability of the WHO Tool. The WHO Tool was mailed to 10 district hospitals in Ghana. Written instructions were provided along with a letter from the Ghana Health Services requesting the hospital administrator to complete the survey tool. After ensuring delivery and completion of the forms, the study team readministered the WHO Tool at the time of an on-site visit less than 1 month later. The results of the two tests were compared to calculate kappa statistics for each of the 152 questions in the WHO Tool. The kappa statistic is a statistical measure of the degree of agreement above what would be expected based on chance alone. Ten hospitals were surveyed twice over a short interval (i.e., less than 1 month). Weighted and unweighted kappa statistics were calculated for 152 questions. The median unweighted kappa for the entire survey was 0.43 (interquartile range 0-0.84). The infrastructure section (24 questions) had a median kappa of 0.81; the human resources section (13 questions) had a median kappa of 0.77; the surgical procedures section (67 questions) had a median kappa of 0.00; and the emergency surgical equipment section (48 questions) had a median kappa of 0.81. Hospital capacity survey questions related to infrastructure characteristics had high reliability. However, questions related to process of care had poor reliability and may benefit from supplemental data gathered by direct observation. Limitations to the study include the small sample size: 10 district hospitals in a single country. Consistent and high correlations calculated from the field testing within the present analysis suggest that the WHO Tool for Situational Analysis is a reliable tool where it measures structure and setting, but it should be revised for measuring process of care.
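
    The kappa computation described above can be reproduced for a single repeated survey item as sketched below, using scikit-learn's cohen_kappa_score; the hospital answers are made up, and the weighted variant is shown only because the abstract mentions both weighted and unweighted kappa.

        from sklearn.metrics import cohen_kappa_score

        # Hypothetical answers from 10 hospitals to one survey item, asked twice.
        first  = ["yes", "no", "yes", "yes", "no", "yes", "no", "yes", "yes", "no"]
        second = ["yes", "no", "yes", "no",  "no", "yes", "no", "yes", "yes", "yes"]
        print(f"unweighted kappa = {cohen_kappa_score(first, second):.2f}")

        # For ordered categories, a weighted kappa penalizes large disagreements more heavily.
        first_ordinal  = [3, 2, 3, 1, 2, 3, 1, 2, 3, 2]
        second_ordinal = [3, 2, 2, 1, 2, 3, 1, 3, 3, 2]
        print(f"quadratic-weighted kappa = "
              f"{cohen_kappa_score(first_ordinal, second_ordinal, weights='quadratic'):.2f}")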

  2. 28 CFR 22.25 - Final disposition of identifiable materials.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... RESEARCH AND STATISTICAL INFORMATION § 22.25 Final disposition of identifiable materials. Upon completion of a research or statistical project the security of identifiable research or statistical information...

  3. 28 CFR 22.25 - Final disposition of identifiable materials.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... RESEARCH AND STATISTICAL INFORMATION § 22.25 Final disposition of identifiable materials. Upon completion of a research or statistical project the security of identifiable research or statistical information...

  4. The Importance of Medical Students' Attitudes Regarding Cognitive Competence for Teaching Applied Statistics: Multi-Site Study and Meta-Analysis

    PubMed Central

    Milic, Natasa M.; Masic, Srdjan; Milin-Lazovic, Jelena; Trajkovic, Goran; Bukumiric, Zoran; Savic, Marko; Milic, Nikola V.; Cirkovic, Andja; Gajic, Milan; Kostic, Mirjana; Ilic, Aleksandra; Stanisavljevic, Dejana

    2016-01-01

    Background The scientific community increasingly is recognizing the need to bolster standards of data analysis given the widespread concern that basic mistakes in data analysis are contributing to the irreproducibility of many published research findings. The aim of this study was to investigate students’ attitudes towards statistics within a multi-site medical educational context, monitor their changes and impact on student achievement. In addition, we performed a systematic review to better support our future pedagogical decisions in teaching applied statistics to medical students. Methods A validated Serbian Survey of Attitudes Towards Statistics (SATS-36) questionnaire was administered to medical students attending obligatory introductory courses in biostatistics from three medical universities in the Western Balkans. A systematic review of peer-reviewed publications was performed through searches of Scopus, Web of Science, Science Direct, Medline, and APA databases through 1994. A meta-analysis was performed for the correlation coefficients between SATS component scores and statistics achievement. Pooled estimates were calculated using random effects models. Results SATS-36 was completed by 461 medical students. Most of the students held positive attitudes towards statistics. Ability in mathematics and grade point average were associated in a multivariate regression model with the Cognitive Competence score, after adjusting for age, gender and computer ability. The results of 90 paired data showed that Affect, Cognitive Competence, and Effort scores demonstrated significant positive changes. The Cognitive Competence score showed the largest increase (M = 0.48, SD = 0.95). The positive correlation found between the Cognitive Competence score and students’ achievement (r = 0.41; p<0.001), was also shown in the meta-analysis (r = 0.37; 95% CI 0.32–0.41). Conclusion Students' subjective attitudes regarding Cognitive Competence at the beginning of the biostatistics course, which were directly linked to mathematical knowledge, affected their attitudes at the end of the course that, in turn, influenced students' performance. This indicates the importance of positively changing not only students’ cognitive competency, but also their perceptions of gained competency during the biostatistics course. PMID:27764123

  5. The Importance of Medical Students' Attitudes Regarding Cognitive Competence for Teaching Applied Statistics: Multi-Site Study and Meta-Analysis.

    PubMed

    Milic, Natasa M; Masic, Srdjan; Milin-Lazovic, Jelena; Trajkovic, Goran; Bukumiric, Zoran; Savic, Marko; Milic, Nikola V; Cirkovic, Andja; Gajic, Milan; Kostic, Mirjana; Ilic, Aleksandra; Stanisavljevic, Dejana

    2016-01-01

    The scientific community increasingly is recognizing the need to bolster standards of data analysis given the widespread concern that basic mistakes in data analysis are contributing to the irreproducibility of many published research findings. The aim of this study was to investigate students' attitudes towards statistics within a multi-site medical educational context, monitor their changes and impact on student achievement. In addition, we performed a systematic review to better support our future pedagogical decisions in teaching applied statistics to medical students. A validated Serbian Survey of Attitudes Towards Statistics (SATS-36) questionnaire was administered to medical students attending obligatory introductory courses in biostatistics from three medical universities in the Western Balkans. A systematic review of peer-reviewed publications was performed through searches of Scopus, Web of Science, Science Direct, Medline, and APA databases through 1994. A meta-analysis was performed for the correlation coefficients between SATS component scores and statistics achievement. Pooled estimates were calculated using random effects models. SATS-36 was completed by 461 medical students. Most of the students held positive attitudes towards statistics. Ability in mathematics and grade point average were associated in a multivariate regression model with the Cognitive Competence score, after adjusting for age, gender and computer ability. The results of 90 paired data showed that Affect, Cognitive Competence, and Effort scores demonstrated significant positive changes. The Cognitive Competence score showed the largest increase (M = 0.48, SD = 0.95). The positive correlation found between the Cognitive Competence score and students' achievement (r = 0.41; p<0.001), was also shown in the meta-analysis (r = 0.37; 95% CI 0.32-0.41). Students' subjective attitudes regarding Cognitive Competence at the beginning of the biostatistics course, which were directly linked to mathematical knowledge, affected their attitudes at the end of the course that, in turn, influenced students' performance. This indicates the importance of positively changing not only students' cognitive competency, but also their perceptions of gained competency during the biostatistics course.
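
    The pooled correlation quoted above comes from a random-effects model. A minimal sketch of that kind of pooling, using Fisher's z transform and DerSimonian-Laird weights, is given below; the per-study correlations and sample sizes are assumed for illustration and are not the studies from this meta-analysis.

```python
# Minimal sketch of random-effects pooling of correlation coefficients via
# Fisher's z transform (DerSimonian-Laird). The study r's and sample sizes
# below are hypothetical.
import numpy as np

r = np.array([0.41, 0.30, 0.38, 0.45, 0.33])   # per-study correlations (assumed)
n = np.array([461, 120, 200, 90, 150])          # per-study sample sizes (assumed)

z = np.arctanh(r)                 # Fisher z transform
v = 1.0 / (n - 3)                 # within-study variance of z
w = 1.0 / v                       # fixed-effect weights

z_fixed = np.sum(w * z) / np.sum(w)
Q = np.sum(w * (z - z_fixed) ** 2)                         # heterogeneity statistic
tau2 = max(0.0, (Q - (len(r) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_star = 1.0 / (v + tau2)                                  # random-effects weights
z_pooled = np.sum(w_star * z) / np.sum(w_star)
se = 1.0 / np.sqrt(np.sum(w_star))

pooled_r = np.tanh(z_pooled)
ci = np.tanh([z_pooled - 1.96 * se, z_pooled + 1.96 * se])
print(f"pooled r = {pooled_r:.2f}, 95% CI [{ci[0]:.2f}, {ci[1]:.2f}]")
```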

  6. Influence of Deployment on the Use of E-Cigarettes in the United States Army and Air Force

    DTIC Science & Technology

    2018-03-22

    the "Tobacco Use Among Service Members" survey sponsored by the Murtha Cancer Center and the Postgraduate Dental School of the Uniformed Services...the study period, and were willing to complete the survey . The survey was voluntary and anonymous; no personally identifiable information was...collected about participants. Statistical analysis of the data obtained from this survey database was performed using SAS. The independent variables were

  7. Statistical Analysis of Japanese Structural Damage Data

    DTIC Science & Technology

    1977-01-01

    buildings and no ready correlation between I-beam and lattice work columns could be established. The complete listing of the buildings contained in the final...subclassification efforts in this structure class. Of the 90 buildings in the data base, two have such light lattice work steel columns that they would...more properly be clas- sified as Very Light Steel Frame Buildings; six have concrete panel walls; two have lattice steel columns that are filled with

  8. Forest statistics for Southeast Georgia, 1996

    Treesearch

    Michael T. Thompson; Raymond M. Sheffield

    1997-01-01

    This report highlights the principal findings of the seventh forest survey of Southeast Georgia. Field work began in November 1995 and was completed in November 1996. Six previous surveys, completed in 1934, 1952, 1960, 1971, 1981, and 1988 provide statistics for measuring changes and trends over the past 62 years. This report primarily emphasizes the changes and...

  9. A brief history of numbers and statistics with cytometric applications.

    PubMed

    Watson, J V

    2001-02-15

    A brief history of numbers and statistics traces the development of numbers from prehistory to completion of our current system of numeration with the introduction of the decimal fraction by Viete, Stevin, Burgi, and Galileo at the turn of the 16th century. This was followed by the development of what we now know as probability theory by Pascal, Fermat, and Huygens in the mid-17th century which arose in connection with questions in gambling with dice and can be regarded as the origin of statistics. The three main probability distributions on which statistics depend were introduced and/or formalized between the mid-17th and early 19th centuries: the binomial distribution by Pascal; the normal distribution by de Moivre, Gauss, and Laplace, and the Poisson distribution by Poisson. The formal discipline of statistics commenced with the works of Pearson, Yule, and Gosset at the turn of the 19th century when the first statistical tests were introduced. Elementary descriptions of the statistical tests most likely to be used in conjunction with cytometric data are given and it is shown how these can be applied to the analysis of difficult immunofluorescence distributions when there is overlap between the labeled and unlabeled cell populations. Copyright 2001 Wiley-Liss, Inc.

  10. SAP- FORTRAN STATIC SOURCE CODE ANALYZER PROGRAM (IBM VERSION)

    NASA Technical Reports Server (NTRS)

    Manteufel, R.

    1994-01-01

    The FORTRAN Static Source Code Analyzer program, SAP, was developed to automatically gather statistics on the occurrences of statements and structures within a FORTRAN program and to provide for the reporting of those statistics. Provisions have been made for weighting each statistic and to provide an overall figure of complexity. Statistics, as well as figures of complexity, are gathered on a module by module basis. Overall summed statistics are also accumulated for the complete input source file. SAP accepts as input syntactically correct FORTRAN source code written in the FORTRAN 77 standard language. In addition, code written using features in the following languages is also accepted: VAX-11 FORTRAN, IBM S/360 FORTRAN IV Level H Extended; and Structured FORTRAN. The SAP program utilizes two external files in its analysis procedure. A keyword file allows flexibility in classifying statements and in marking a statement as either executable or non-executable. A statistical weight file allows the user to assign weights to all output statistics, thus allowing the user flexibility in defining the figure of complexity. The SAP program is written in FORTRAN IV for batch execution and has been implemented on a DEC VAX series computer under VMS and on an IBM 370 series computer under MVS. The SAP program was developed in 1978 and last updated in 1985.

  11. SAP- FORTRAN STATIC SOURCE CODE ANALYZER PROGRAM (DEC VAX VERSION)

    NASA Technical Reports Server (NTRS)

    Merwarth, P. D.

    1994-01-01

    The FORTRAN Static Source Code Analyzer program, SAP, was developed to automatically gather statistics on the occurrences of statements and structures within a FORTRAN program and to provide for the reporting of those statistics. Provisions have been made for weighting each statistic and to provide an overall figure of complexity. Statistics, as well as figures of complexity, are gathered on a module by module basis. Overall summed statistics are also accumulated for the complete input source file. SAP accepts as input syntactically correct FORTRAN source code written in the FORTRAN 77 standard language. In addition, code written using features in the following languages is also accepted: VAX-11 FORTRAN, IBM S/360 FORTRAN IV Level H Extended; and Structured FORTRAN. The SAP program utilizes two external files in its analysis procedure. A keyword file allows flexibility in classifying statements and in marking a statement as either executable or non-executable. A statistical weight file allows the user to assign weights to all output statistics, thus allowing the user flexibility in defining the figure of complexity. The SAP program is written in FORTRAN IV for batch execution and has been implemented on a DEC VAX series computer under VMS and on an IBM 370 series computer under MVS. The SAP program was developed in 1978 and last updated in 1985.
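
    The two records above describe the same statistics-gathering idea: classify each statement with a keyword table, count occurrences per module, and combine weighted counts into a figure of complexity. The sketch below is a loose Python analogue of that idea, not the SAP implementation; the keyword table and weights are invented for illustration.

```python
# Rough analogue of SAP's statistics gathering: classify each statement by its
# leading keyword, count occurrences, and sum user-supplied weights into a
# single figure of complexity. Keyword set and weights are examples only.
import re

KEYWORDS = {"IF", "DO", "CALL", "GOTO", "SUBROUTINE", "DIMENSION", "END"}
WEIGHTS = {"IF": 2.0, "DO": 2.0, "CALL": 1.5, "GOTO": 3.0}   # example weights

def analyze(source: str):
    """Count statement keywords and return (counts, weighted complexity)."""
    counts = {}
    for raw in source.upper().splitlines():
        if raw[:1] == "C":                        # fixed-form comment (column 1)
            continue
        line = raw.strip()
        if not line:
            continue
        first = re.split(r"[\s(]", line, maxsplit=1)[0]
        key = first if first in KEYWORDS else "OTHER"
        counts[key] = counts.get(key, 0) + 1
    complexity = sum(WEIGHTS.get(k, 1.0) * c for k, c in counts.items())
    return counts, complexity

demo = """      SUBROUTINE DEMO(N)
C     toy module
      DO 10 I = 1, N
      IF (I .GT. 3) CALL WORK(I)
   10 CONTINUE
      END"""
print(analyze(demo))
```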

  12. A case report of low intensity laser therapy (LILT) in the management of venous ulceration: potential effects of wound debridement upon efficacy.

    PubMed

    Lagan, K M; Mc Donough, S M; Clements, B A; Baxter, G D

    2000-02-01

    This single case report (ABA design) was undertaken as a preliminary investigation into the clinical effects of low intensity laser upon venous ulceration, applied to wound margins only, and the potential relevance of wound debridement and wound measurement techniques to any effects observed. Ethical approval was granted by the University of Ulster's Research Ethical Committee and the patient recruited was required to attend 3 times per week for a total of 8 weeks. Treatments were carried out using single source irradiation (830 nm; 9 J/cm2, CB Medico, Copenhagen, Denmark) in conjunction with dry dressings during each visit. Assessment of wound surface area, wound appearance, and current pain were completed by an independent investigator. Planimetry and digitizing were completed for wound tracings and for photographs to quantify surface areas. Video image analysis was also performed on photographs of wounds. The primary findings were changes in wound appearance, and a decrease in wound surface area (range 33.3-46.3%), dependent on the choice of measurement method. Video image analysis was used, but rejected as an accurate method of wound measurement. Treatment intervention produced a statistically significant reduction in wound area using the C statistic on digitizing data for photographs (at Phase one only; Z = 2.412; p < 0.05). Wound debridement emerged as an important procedure to be carried out prior to measuring wounds. Despite fluctuating pain levels recorded throughout the duration of the study, VAS scores showed a decrease of 15% at the end of the study. This hypoalgesic effect was, however, statistically significant (using the C statistic) at Phase one only (Z = 2.554; p < 0.05). Low intensity laser therapy at this dosage, and using single source irradiation would seem to be an effective treatment for patients suffering venous ulceration. Further group studies are indicated to establish the most effective therapeutic dosage for this and other types of ulceration.

  13. A new statistical methodology predicting chip failure probability considering electromigration

    NASA Astrophysics Data System (ADS)

    Sun, Ted

    In this research thesis, we present a new approach to analyzing chip reliability subject to electromigration (EM); the fundamental causes of EM and how it manifests in different materials are presented in this thesis. This new approach utilizes the statistical nature of EM failure in order to assess overall EM risk. It includes within-die temperature variations from the chip's temperature map, extracted by an Electronic Design Automation (EDA) tool, to estimate the failure probability of a design. Both the power estimation and the thermal analysis are performed in the EDA flow. We first used the traditional EM approach to analyze the design, which involves 6 metal and 5 via layers, with a single temperature across the entire chip. Next, we used the same traditional approach but with a realistic temperature map. The traditional EM analysis approach, the same approach coupled with a temperature map, and a comparison of their results are presented in this research. The comparison confirms that using a temperature map yields a less pessimistic estimation of the chip's EM risk. Finally, we employed the statistical methodology we developed, considering a temperature map and different use-condition voltages and frequencies, to estimate the overall failure probability of the chip. The statistical model considers lifetime scaling using the traditional Black equation together with four major use conditions. The statistical result comparisons are within our expectations. The results of this statistical analysis confirm that the chip-level failure probability is higher i) at higher use-condition frequencies for all use-condition voltages, and ii) when a single temperature instead of a temperature map across the chip is considered. In this thesis, I start with an overall review of current design types, common flows, and the verification and reliability-checking steps used in the IC design industry. Furthermore, the important concepts of "Scripting Automation", which is used to integrate the diversified EDA tools in this research work, are described in detail with several examples, and my completed code is included in the appendix for reference. Hopefully, this structure will give readers a thorough understanding of my research work, from the automation of EDA tools to statistical data generation, from the nature of EM to the construction of the statistical model, and the comparison between the traditional and statistical EM analysis approaches.
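
    The thesis abstract above rests on two standard ingredients: Black's equation for the median time to failure of a wire segment, and a weakest-link combination of per-segment failure probabilities across the chip. The sketch below illustrates how a temperature map enters that calculation; all parameter values are assumed, and the lognormal lifetime model is a common convention rather than necessarily the model used in the thesis.

```python
# Hedged sketch of the weakest-link idea behind statistical EM analysis: each
# segment gets a Black's-equation median lifetime from its local temperature
# and current density, lifetimes are taken as lognormal, and the chip fails if
# any segment fails. All parameter values below are assumed for illustration.
import numpy as np
from scipy.stats import lognorm

K_BOLTZ = 8.617e-5                                 # Boltzmann constant, eV/K
A, N_EXP, EA, SIGMA = 3e-7, 2.0, 0.9, 0.4          # assumed Black / lognormal parameters

def median_ttf(j, temp_k):
    """Black's equation: MTTF = A * J**-n * exp(Ea / kT)."""
    return A * j ** (-N_EXP) * np.exp(EA / (K_BOLTZ * temp_k))

def chip_failure_prob(j_map, temp_map_k, t_hours):
    """Weakest-link (series) combination over all wire segments."""
    p_seg = lognorm.cdf(t_hours, s=SIGMA, scale=median_ttf(j_map, temp_map_k))
    return 1.0 - np.prod(1.0 - p_seg)

j_map = np.array([1.0, 1.5, 0.8, 2.0])             # current densities (arbitrary units)
temp_map = np.array([358.0, 368.0, 353.0, 373.0])  # per-segment temperatures, K
uniform = np.full_like(temp_map, temp_map.max())   # single worst-case temperature

print(chip_failure_prob(j_map, temp_map, t_hours=1e5))   # with temperature map
print(chip_failure_prob(j_map, uniform, t_hours=1e5))    # more pessimistic
```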

  14. Applying the LANL Statistical Pattern Recognition Paradigm for Structural Health Monitoring to Data from a Surface-Effect Fast Patrol Boat

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoon Sohn; Charles Farrar; Norman Hunter

    2001-01-01

    This report summarizes the analysis of fiber-optic strain gauge data obtained from a surface-effect fast patrol boat being studied by the staff at the Norwegian Defense Research Establishment (NDRE) in Norway and the Naval Research Laboratory (NRL) in Washington D.C. Data from two different structural conditions were provided to the staff at Los Alamos National Laboratory. The problem was then approached from a statistical pattern recognition paradigm. This paradigm can be described as a four-part process: (1) operational evaluation, (2) data acquisition & cleansing, (3) feature extraction and data reduction, and (4) statistical model development for feature discrimination. Given that the first two portions of this paradigm were mostly completed by the NDRE and NRL staff, this study focused on data normalization, feature extraction, and statistical modeling for feature discrimination. The feature extraction process began by looking at relatively simple statistics of the signals and progressed to using the residual errors from auto-regressive (AR) models fit to the measured data as the damage-sensitive features. Data normalization proved to be the most challenging portion of this investigation. A novel approach to data normalization, where the residual errors in the AR model are considered to be an unmeasured input and an auto-regressive model with exogenous inputs (ARX) is then fit to portions of the data exhibiting similar waveforms, was successfully applied to this problem. With this normalization procedure, a clear distinction between the two different structural conditions was obtained. A false-positive study was also run, and the procedure developed herein did not yield any false-positive indications of damage. Finally, the results must be qualified by the fact that this procedure has only been applied to very limited data samples. A more complete analysis of additional data taken under various operational and environmental conditions as well as other structural conditions is necessary before one can definitively state that the procedure is robust enough to be used in practice.
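
    A minimal sketch of the AR-residual feature described above is given below: an auto-regressive model is fit to a reference signal by least squares, and the residual standard deviation of new signals serves as the damage-sensitive feature. The signals are synthetic, not the patrol-boat strain data, and the ARX normalization step is omitted.

```python
# Sketch: fit an AR(p) model to a reference (undamaged) signal and use the
# residual standard deviation of new signals as a damage-sensitive feature.
import numpy as np

def fit_ar(signal, order):
    """Least-squares AR(p) fit; returns the coefficient vector."""
    X = np.column_stack([signal[order - k - 1:len(signal) - k - 1]
                         for k in range(order)])
    y = signal[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def residual_std(signal, coeffs):
    order = len(coeffs)
    X = np.column_stack([signal[order - k - 1:len(signal) - k - 1]
                         for k in range(order)])
    return np.std(signal[order:] - X @ coeffs)

rng = np.random.default_rng(0)
t = np.arange(2000)
baseline = np.sin(0.2 * t) + 0.1 * rng.standard_normal(t.size)
damaged  = np.sin(0.25 * t) + 0.1 * rng.standard_normal(t.size)  # shifted dynamics

ar = fit_ar(baseline, order=10)
# Residuals typically grow when the underlying dynamics change.
print(residual_std(baseline, ar), residual_std(damaged, ar))
```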

  15. Land use statistics for West Virginia, Part I

    USGS Publications Warehouse

    Erwin, Robert B.; ,; ,

    1979-01-01

    The West Virginia Geological and Economic Survey and the United States Geological Survey have completed a cooperative program to provide land-use and land-cover maps and data for the State. This program begins to satisfy a longstanding need for a consistent level of detail, standardization in categorization, and scale of compilation for land-use and land-cover maps and data. The statistical information contained in this Bulletin provides land-use acreage tabulations for the first 20 counties that have been completed. Statistics are being compiled for the remaining counties and will be published shortly. This information has been derived from the recently completed Land-Use Map of West Virginia (on open file at the West Virginia Geological and Economic Survey - Environmental Section). In addition to land-use acreage, we have also included land-use percent. All statistics throughout this Bulletin are in the same format for ease of comparison.

  16. The modified Memorial Symptom Assessment Scale Short Form: a modified response format and rational scoring rules.

    PubMed

    Sharp, J L; Gough, K; Pascoe, M C; Drosdowsky, A; Chang, V T; Schofield, P

    2018-07-01

    The Memorial Symptom Assessment Scale Short Form (MSAS-SF) is a widely used symptom assessment instrument. Patients who self-complete the MSAS-SF have difficulty following the two-part response format, resulting in incorrectly completed responses. We describe modifications to the response format to improve useability, and rational scoring rules for incorrectly completed items. The modified MSAS-SF was completed by 311 women in our Peer and Nurse support Trial to Assist women in Gynaecological Oncology; the PeNTAGOn study. Descriptive statistics were used to summarise completion of the modified MSAS-SF, and provide symptom statistics before and after applying the rational scoring rules. Spearman's correlations with the Functional Assessment for Cancer Therapy-General (FACT-G) and Hospital Anxiety and Depression Scale (HADS) were assessed. Correct completion of the modified MSAS-SF items ranged from 91.5 to 98.7%. The rational scoring rules increased the percentage of useable responses on average 4% across all symptoms. MSAS-SF item statistics were similar with and without the scoring rules. The pattern of correlations with FACT-G and HADS was compatible with prior research. The modified MSAS-SF was useable for self-completion and responses demonstrated validity. The rational scoring rules can minimise loss of data from incorrectly completed responses. Further investigation is recommended.

  17. REPORT FOR COMMERCIAL GRADE NICKEL CHARACTERIZATION AND BENCHMARKING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2012-12-20

    Oak Ridge Associated Universities (ORAU), under the Oak Ridge Institute for Science and Education (ORISE) contract, has completed the collection, sample analysis, and review of analytical results to benchmark the concentrations of gross alpha-emitting radionuclides, gross beta-emitting radionuclides, and technetium-99 in commercial grade nickel. This report presents methods, change management, observations, and statistical analysis of materials procured from sellers representing nine countries on four continents. The data suggest there is a low probability of detecting alpha- and beta-emitting radionuclides in commercial nickel. Technetium-99 was not detected in any samples, thus suggesting it is not present in commercial nickel.

  18. ODM Data Analysis-A tool for the automatic validation, monitoring and generation of generic descriptive statistics of patient data.

    PubMed

    Brix, Tobias Johannes; Bruland, Philipp; Sarfraz, Saad; Ernsting, Jan; Neuhaus, Philipp; Storck, Michael; Doods, Justin; Ständer, Sonja; Dugas, Martin

    2018-01-01

    A required step for presenting results of clinical studies is the declaration of participants' demographic and baseline characteristics as required by FDAAA 801. The common workflow to accomplish this task is to export the clinical data from the electronic data capture system in use and import it into statistical software such as SAS or IBM SPSS. This software requires trained users, who have to implement the analysis individually for each item. These expenditures may become an obstacle for small studies. The objective of this work is to design, implement and evaluate an open source application, called ODM Data Analysis, for the semi-automatic analysis of clinical study data. The system requires clinical data in the CDISC Operational Data Model format. After a file is uploaded, its syntax and the data type conformity of the collected data are validated. The completeness of the study data is determined and basic statistics, including illustrative charts for each item, are generated. Datasets from four clinical studies have been used to evaluate the application's performance and functionality. The system is implemented as an open source web application (available at https://odmanalysis.uni-muenster.de) and also provided as a Docker image, which enables easy distribution and installation on local systems. Study data are only stored in the application while the calculations are performed, which is compliant with data protection efforts. Analysis times are below half an hour, even for larger studies with over 6000 subjects. Medical experts have confirmed the usefulness of this application for gaining an overview of their collected study data for monitoring purposes and for generating descriptive statistics without further user interaction. The semi-automatic analysis has its limitations and cannot replace the complex analysis of statisticians, but it can be used as a starting point for their examination and reporting.
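
    The core of the semi-automatic analysis described above is generating descriptive statistics for every item without user interaction. The sketch below shows that step in generic form on an already flattened table; it is not the ODM Data Analysis code, the CDISC ODM XML parsing and validation steps are omitted, and the columns are hypothetical.

```python
# Generic per-item descriptive statistics: one row per subject, one summary
# per item, with missing counts and type-dependent summaries.
import pandas as pd

study = pd.DataFrame({
    "age":    [54, 61, 47, None, 70],
    "sex":    ["f", "m", "f", "f", "m"],
    "weight": [72.0, 88.5, 64.2, 70.1, None],
})

def describe_item(series: pd.Series) -> dict:
    stats = {"n": int(series.notna().sum()),
             "missing": int(series.isna().sum())}
    if pd.api.types.is_numeric_dtype(series):
        stats.update(mean=round(series.mean(), 2), sd=round(series.std(), 2),
                     min=series.min(), max=series.max())
    else:
        stats.update(levels=series.dropna().value_counts().to_dict())
    return stats

report = {item: describe_item(study[item]) for item in study.columns}
for item, stats in report.items():
    print(item, stats)
```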

  19. Standardized synoptic cancer pathology reports - so what and who cares? A population-based satisfaction survey of 970 pathologists, surgeons, and oncologists.

    PubMed

    Lankshear, Sara; Srigley, John; McGowan, Thomas; Yurcan, Marta; Sawka, Carol

    2013-11-01

    Cancer Care Ontario implemented synoptic pathology reporting across Ontario, impacting the practice of pathologists, surgeons, and medical and radiation oncologists. The benefits of standardized synoptic pathology reporting include enhanced completeness and improved consistency in comparison with narrative reports, with reported challenges including increased workload and report turnaround time. To determine the impact of synoptic pathology reporting on physician satisfaction specific to practice and process. A descriptive, cross-sectional design was utilized involving 970 clinicians across 27 hospitals. An 11-item survey was developed to obtain information regarding timeliness, completeness, clarity, and usability. Open-ended questions were also employed to obtain qualitative comments. A 51% response rate was obtained, with descriptive statistics reporting that physicians perceive synoptic reports as significantly better than narrative reports. Correlation analysis revealed a moderately strong, positive relationship between respondents' perceptions of overall satisfaction with the level of information provided and perceptions of completeness for clinical decision making (r = 0.750, P < .001) and ease of finding information for clinical decision making (r = 0.663, P < .001). Dependent t tests showed a statistically significant difference in the satisfaction scores of pathologists and oncologists (t(169) = 3.044, P = .003). Qualitative comments revealed technology-related issues as the most frequently cited factor impacting timeliness of report completion. This study provides evidence of strong physician satisfaction with synoptic cancer pathology reporting as a clinical decision support tool in the diagnosis, prognosis, and treatment of cancer patients.

  20. A statistical simulation model for field testing of non-target organisms in environmental risk assessment of genetically modified plants.

    PubMed

    Goedhart, Paul W; van der Voet, Hilko; Baldacchino, Ferdinando; Arpaia, Salvatore

    2014-04-01

    Genetic modification of plants may result in unintended effects with potentially adverse consequences for the environment. A comparative safety assessment is therefore required by authorities, such as the European Food Safety Authority, in which the genetically modified plant is compared with its conventional counterpart. Part of the environmental risk assessment is a comparative field experiment in which the effect on non-target organisms is compared. Statistical analysis of such trials comes in two flavors: difference testing and equivalence testing. It is important to know the statistical properties of these, for example, the power to detect environmental change of a given magnitude, before the start of an experiment. Such prospective power analysis can best be studied by means of a statistical simulation model. This paper describes a general framework for simulating data typically encountered in environmental risk assessment of genetically modified plants. The simulation model, available as Supplementary Material, can be used to generate count data having different statistical distributions, possibly with excess zeros. In addition, the model employs completely randomized or randomized block experiments, can be used to simulate single or multiple trials across environments, enables genotype by environment interaction by adding random variety effects, and finally includes repeated measures in time following a constant, linear or quadratic pattern in time, possibly with some form of autocorrelation. The model also allows a set of reference varieties to be added to the GM plant and its comparator to assess the natural variation, which can then be used to set limits of concern for equivalence testing. The different count distributions are described in some detail and some examples of how to use the simulation model to study various aspects, including a prospective power analysis, are provided.

  1. A statistical simulation model for field testing of non-target organisms in environmental risk assessment of genetically modified plants

    PubMed Central

    Goedhart, Paul W; van der Voet, Hilko; Baldacchino, Ferdinando; Arpaia, Salvatore

    2014-01-01

    Genetic modification of plants may result in unintended effects with potentially adverse consequences for the environment. A comparative safety assessment is therefore required by authorities, such as the European Food Safety Authority, in which the genetically modified plant is compared with its conventional counterpart. Part of the environmental risk assessment is a comparative field experiment in which the effect on non-target organisms is compared. Statistical analysis of such trials comes in two flavors: difference testing and equivalence testing. It is important to know the statistical properties of these, for example, the power to detect environmental change of a given magnitude, before the start of an experiment. Such prospective power analysis can best be studied by means of a statistical simulation model. This paper describes a general framework for simulating data typically encountered in environmental risk assessment of genetically modified plants. The simulation model, available as Supplementary Material, can be used to generate count data having different statistical distributions, possibly with excess zeros. In addition, the model employs completely randomized or randomized block experiments, can be used to simulate single or multiple trials across environments, enables genotype by environment interaction by adding random variety effects, and finally includes repeated measures in time following a constant, linear or quadratic pattern in time, possibly with some form of autocorrelation. The model also allows a set of reference varieties to be added to the GM plant and its comparator to assess the natural variation, which can then be used to set limits of concern for equivalence testing. The different count distributions are described in some detail and some examples of how to use the simulation model to study various aspects, including a prospective power analysis, are provided. PMID:24834325
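
    A minimal sketch of the kind of data the framework generates is given below: zero-inflated Poisson counts for a GM plant and its comparator in a randomized complete block design, followed by a simple paired difference test. It is not the authors' Supplementary Material, and every parameter value is assumed for illustration.

```python
# Simulate zero-inflated Poisson counts of a non-target organism in a
# randomized complete block design and run a simple blocked difference test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_blocks, mean_count, zero_prob = 8, 12.0, 0.2
effect = 0.8                        # multiplicative GM effect on the mean (assumed)

def zip_counts(mu, pi):
    """Zero-inflated Poisson: structural zeros occur with probability pi."""
    counts = rng.poisson(mu)
    return np.where(rng.random(mu.shape) < pi, 0, counts)

block_effect = rng.normal(0.0, 0.3, n_blocks)             # block-to-block variation
comparator = zip_counts(mean_count * np.exp(block_effect), zero_prob)
gm_plant   = zip_counts(effect * mean_count * np.exp(block_effect), zero_prob)

# Paired (blocked) difference test on log-transformed counts
t, p = stats.ttest_rel(np.log1p(gm_plant), np.log1p(comparator))
print(f"t = {t:.2f}, p = {p:.3f}")
```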

  2. Formalizing the definition of meta-analysis in Molecular Ecology.

    PubMed

    ArchMiller, Althea A; Bauer, Eric F; Koch, Rebecca E; Wijayawardena, Bhagya K; Anil, Ammu; Kottwitz, Jack J; Munsterman, Amelia S; Wilson, Alan E

    2015-08-01

    Meta-analysis, the statistical synthesis of pertinent literature to develop evidence-based conclusions, is relatively new to the field of molecular ecology, with the first meta-analysis published in the journal Molecular Ecology in 2003 (Slate & Phua 2003). The goal of this article is to formalize the definition of meta-analysis for the authors, editors, reviewers and readers of Molecular Ecology by completing a review of the meta-analyses previously published in this journal. We also provide a brief overview of the many components required for meta-analysis with a more specific discussion of the issues related to the field of molecular ecology, including the use and statistical considerations of Wright's FST and its related analogues as effect sizes in meta-analysis. We performed a literature review to identify articles published as 'meta-analyses' in Molecular Ecology, which were then evaluated by at least two reviewers. We specifically targeted Molecular Ecology publications because as a flagship journal in this field, meta-analyses published in Molecular Ecology have the potential to set the standard for meta-analyses in other journals. We found that while many of these reviewed articles were strong meta-analyses, others failed to follow standard meta-analytical techniques. One of these unsatisfactory meta-analyses was in fact a secondary analysis. Other studies attempted meta-analyses but lacked the fundamental statistics that are considered necessary for an effective and powerful meta-analysis. By drawing attention to the inconsistency of studies labelled as meta-analyses, we emphasize the importance of understanding the components of traditional meta-analyses to fully embrace the strengths of quantitative data synthesis in the field of molecular ecology. © 2015 John Wiley & Sons Ltd.

  3. A statistical analysis of flank eruptions on Etna volcano

    NASA Astrophysics Data System (ADS)

    Mulargia, Francesco; Tinti, Stefano; Boschi, Enzo

    1985-02-01

    A singularly complete record exists for the eruptive activity of Etna volcano. The time series of occurrence of flank eruptions in the period 1600-1980, in which the record is presumably complete, is found to follow a stationary Poisson process. A revision of the available data shows that eruption durations are rather well correlated with the estimates of the volume of lava flows. This implies that the magnitude of an eruption can be defined directly by its duration. Extreme value statistics are then applied to the time series, using duration as a dependent variable. The probability of occurrence of a very long (300 days) eruption is greater than 50% only in time intervals of the order of 50 years. The correlation found between duration and total output also allows estimation of the probability of occurrence of a major event which exceeds a given duration and total flow of lava. The composite probabilities do not differ considerably from the pure ones. Paralleling a well established application to seismic events, extreme value theory can be profitably used in volcanic risk estimates, provided that appropriate account is also taken of all other variables.
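
    The probability statements above combine a Poisson occurrence model with a duration (magnitude) distribution: eruptions longer than a threshold form a thinned Poisson process. The sketch below illustrates that calculation with assumed values for the eruption rate and mean duration; these are not the paper's fitted parameters, and an exponential duration model stands in for the extreme value treatment.

```python
# Illustrative calculation: probability of at least one "long" eruption in a
# time window, given a Poisson occurrence rate and an exponential duration
# model (parameter values assumed, not the paper's fits).
import math

lam = 0.35            # flank eruptions per year (assumed)
mean_duration = 90    # mean eruption duration in days (assumed)

def prob_long_eruption(window_years, d_days):
    p_exc = math.exp(-d_days / mean_duration)        # P(duration > d), exponential
    return 1.0 - math.exp(-lam * window_years * p_exc)

for window in (10, 50, 100):
    print(window, round(prob_long_eruption(window, d_days=300), 2))
```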

  4. Forest statistics for Central Georgia, 1982

    Treesearch

    Raymond M. Sheffield; John B. Tansey

    1982-01-01

    This report highlights the principal findings of the fifth forest survey of Central Georgia. Fieldwork began in October 1981 and was completed in June 1982. Four previous surveys, completed in 1936, 1952, 1961, and 1972, provide statistics for measuring changes and trends over the past 46 years. The primary emphasis in this report is on the changes and trends since...

  5. Forest statistics for North Central Georgia, 1998

    Treesearch

    Michael T. Thompson

    1998-01-01

    This report highlights the principal findings of the seventh forest survey of North Central Georgia. Field work began in June 1997 and was completed in November 1997. Six previous surveys, completed in 1936, 1953, 1961, 1972, 1983, and 1989 provide statistics for measuring changes and trends over the past 61 years. This report primarily emphasizes the changes and...

  6. Forest statistics for South Florida, 1995

    Treesearch

    Michael T. Thompson

    1996-01-01

    This report highlights the principal findings of the seventh forest survey of South Florida. Field work began in September 1994 and was completed in November 1994. Six previous surveys, completed in 1936, 1949, 1959, 1970, 1980, and 1988 provide statistics for measuring changes and trends over the past 59 years. This report primarily emphasizes the changes and trends...

  7. Forest statistics for Central Georgia, 1997

    Treesearch

    Michael T. Thompson

    1998-01-01

    This report highlights the principal findings of the seventh forest survey of Central Georgia. Field work began in November 1996 and was completed in August 1997. Six previous surveys, completed in 1936, 1952, 1961, 1972, 1982, and 1989 provide statistics for measuring changes and trends over the past 61 years. This report primarily emphasizes the changes and trends...

  8. Forest statistics for Central Florida - 1995

    Treesearch

    Mark J. Brown

    1996-01-01

    This report highlights the principal findings of the seventh forest survey of Central Florida. Field work began in February 1995 and was completed in May 1995. Six previous surveys, completed in 1936, 1949, 1959, 1970, 1980, and 1988 provide statistics for measuring changes and trends over the past 59 years. This report primarily emphasizes the changes and trends since...

  9. Forest statistics for Southwest Georgia, 1996

    Treesearch

    Raymond M. Sheffield; Michael T. Thompson

    1997-01-01

    This report highlights the principal findings of the seventh forest survey of Southwest Georgia. Field work began in June 1995 and was completed in November 1995. Six previous surveys, completed in 1934, 1951, 1960, 1971, 1981, and 1988 provide statistics for measuring changes and trends over the past 62 years. This report primarily emphasizes the changes and trends...

  10. Forest statistics for Northeast Florida, 1980

    Treesearch

    Raymond M. Sheffield

    1981-01-01

    This report highlights the principal findings of the fifth forest survey of Northeast Florida. Fieldwork began in June 1979 and was completed in December 1979. Four previous surveys, completed in 1934, 1949, 1959, and 1970, provide statistics for measuring changes and trends over the past 46 years. The primary emphasis in this report is on the changes and trends since...

  11. Forest statistics for Virginia, 1992

    Treesearch

    Tony G. Johnson

    1992-01-01

    This report highlights the principal findings of the sixth forest survey of Virginia. Field work began in October 1990 and was completed in January 1992. Five previous surveys, completed in 1940, 1957, 1966, 1977, and 1986, provide statistics for measuring changes and trends over the past 52 years. The primary emphasis in this report is on the changes and trends since...

  12. Forest statistics for the Northern Piedmont of Virginia 1976

    Treesearch

    Raymond M. Sheffield

    1976-01-01

    This report highlights the principal findings of the fourth inventory of the timber resource in the Northern Piedmont of Virginia. The inventory was started in March 1976 and completed in August 1976. Three previous inventories, completed in 1940, 1957, and 1965, provide statistics for measuring changes and trends over the past 36 years. In this report, the primary...

  13. Forest statistics for the Coastal Plain of Virginia, 1976

    Treesearch

    Noel D. Cost

    1976-01-01

    This report highlights the principal findings of the fourth inventory of the timber resource in the coastal Plain of Virginia. The inventory was started in February 1975 and completed in November 1975. Three previous inventories, completed in 1940, 1956, and 1966, provide statistics for measuring changes and trends over the past 36 years. In this report, the primary...

  14. Forest statistics for Northeast Florida, 1987

    Treesearch

    Mark J. Brown

    1987-01-01

    This report highlights the principal findings of the sixth forest survey of Northeast Florida. Field work began in January 1987 and was completed in July 1987. Five previous surveys, completed in 1934, 1949, 1959, 1970, and 1980, provide statistics for measuring changes and trends over the past 53 years. The primary emphasis in this report is on the changes and trends...

  15. Forest statistics for Northwest Florida, 1979

    Treesearch

    Raymond M. Sheffield

    1979-01-01

    This report highlights the principal findings of the fifth forest survey of Northwest Florida. Fieldwork began in September 1978 and was completed in June 1979. Four previous surveys, completed in 1934, 1949, 1959, and 1969, provide statistics for measuring changes and trends over the past 45 years. The primary emphasis in this report is on the changes and trends since...

  16. Forest statistics for Northeast Florida, 1995

    Treesearch

    Raymond M. Sheffield

    1995-01-01

    This report highlights the principal findings of the seventh forest survey of Northeast Florida. Field work began in April 1994 and was completed in May 1995. Six previous surveys, completed in 1934, 1949, 1959, 1970, 1980, and 1987 provide statistics for measuring changes and trends over the past 61 years. The primary emphasis in this report is on the changes and...

  17. Forest statistics for North Georgia, 1983

    Treesearch

    John B. Tansey

    1983-01-01

    This report highlights the principal findings of the fifth forest survey of North Georgia. Fieldwork began in September 1982 and was completed in January 1983. Four previous surveys, completed in 1936, 1953, 1961, and 1972, provide statistics for measuring changes and trends over the past 47 years. The primary emphasis in this report is on the changes and trends since...

  18. Forest statistics for North Carolina, 1984

    Treesearch

    William A. Bechtold

    1984-01-01

    This report highlights the principal findings of the fifth forest survey of North Carolina. Fieldwork began in November 1982 and was completed in September 1984. Four previous surveys, completed in 1938, 1956, 1964, and 1974, provide statistics for measuring changes and trends over the past 46 years. The primary emphasis in this report is on the changes and trends...

  19. Forest statistics for North Central Georgia, 1983

    Treesearch

    John B. Tansey

    1983-01-01

    This report highlights the principal findings of the fifth forest survey of North Central Georgia. Fieldwork began in May 1982 and was completed in September 1982. Four previous surveys, completed in 1936, 1953, 1961, and 1972, provide statistics for measuring changes and trends over the past 47 years. The primary emphasis in this report is on the changes and trends...

  20. Forest statistics for South Carolina, 1978

    Treesearch

    Raymond M. Sheffield

    1978-01-01

    This report highlights the principal findings of the fifth inventory of South Carolina's forests. Fieldwork began in April 1977 and was completed in August 1978. Four previous statewide inventories, completed in 1936, 1947, 1958, and 1968, provide statistics for measuring changes and trends over the past 42 years. The primary emphasis in this report is on the...

  1. Forest statistics for Southeast Georgia, 1981

    Treesearch

    Raymond M. Sheffield

    1982-01-01

    This report highlights the principal findings of the fifth forest survey of Southeast Georgia. Fieldwork began in November 1980 and was completed in October 1981. Four previous surveys, completed in 1934, 1952, 1960, and 1971, provide statistics for measuring changes and trends over the past 47 years. The primary emphasis in this report is on the changes and trends...

  2. Forest statistics for the Southern Piedmont of Virginia 1976

    Treesearch

    Raymond M. Sheffield

    1976-01-01

    This report highlights the principal findings of the fourth inventory of the timber resource in the Southern Piedmont of Virginia. The inventory was started in February 1975 and completed in November 1975. Three previous inventories, completed in 1940, 1956, and 1966, provide statistics for measuring changes and trends over the past 36 years. In this report, the...

  3. Air Force Officials did not Consistently Comply with Requirements for Assessing Contractor Performance

    DTIC Science & Technology

    2016-01-29

    31 Appendix B. Improvement in PAR Completion Statistics _________________________________ 33 vi...agencies must perform frequent evaluation of compliance with reporting requirements so they can readily identify delinquent past performance efforts...Reporting Program,” August 13, 2011 Appendixes DODIG-2016-043 │ 33 Appendix B Improvement in PAR Completion Statistics The Senate Armed Services Committee

  4. Forest statistics for Virginia, 1986

    Treesearch

    Mark J. Brown

    1986-01-01

    This report highlights the principal findings of the fifth forest survey of Virginia. Fieldwork began in September 1984 and was completed in November 1985. Four previous surveys, completed in 1940, 1957, 1966, and 1977, provide statistics for measuring changes and trends over the past 46 years. The primary emphasis in this report is on the changes and trends since 1977...

  5. Forest statistics for Southwest Georgia, 1981

    Treesearch

    Raymond M. Sheffield

    1981-01-01

    This report highlights the principal findings of the fifth forest survey of Southwest Georgia. Fieldwork began in May 1980 and was completed in November 1980. Four previous surveys, completed in 1934, 1951, 1960, and 1971, provide statistics for measuring changes and trends over the past 47 years. The primary emphasis in this report is on the changes and trends since 1971...

  6. Forest statistics for the Northern Mountain region of Virginia 1977

    Treesearch

    Raymond M. Sheffield

    1977-01-01

    This report highlights the principal findings of the fourth inventory of timber resources in the Northern Mountain Region of Virginia. The inventory was started in August 1976 and completed in December 1976. Three previous inventories, completed in 1940, 1957 and 1966, provide statistics for measuring changes and trends over the past 37 years. In this report, the...

  7. Forest statistics for Central Florida - 1980

    Treesearch

    Raymond M. Sheffield

    1981-01-01

    This report highlights the principal findings of the fifth forest survey of Central Florida. Fieldwork began in December 1979 and was completed in March 1980. Four previous surveys, completed in 1936, 1949, 1959, and 1970, provide statistics for measuring changes and trends over the past 44 years. The primary emphasis in this report is on the changes and trends since...

  8. Forest statistics for South Carolina, 1986

    Treesearch

    John B. Tansey

    1986-01-01

    This report highlights the principal findings of the sixth forest survey in South Carolina. Fieldwork began in November 1985 and was completed in September 1986. Five previous surveys, completed in 1936, 1947, 1958, 1968, and 1978, provide statistics for measuring changes and trends over the past 50 years. The primary emphasis in this report is on the changes and...

  9. Forest statistics for Southwest Georgia, 1988

    Treesearch

    Michael T. Thompson

    1988-01-01

    This report highlights the principal findings of the sixth forest survey in southwest Georgia. Field work began in October 1987 and was completed in January 1988. Five previous surveys, completed in 1934, 1951, 1960, 1971, and 1981, provide statistics for measuring changes and trends over the past 54 years. The primary emphasis in this report is on the changes and...

  10. Forest statistics for Central Florida - 1988

    Treesearch

    Mark J. Brown

    1988-01-01

    This report highlights the principal findings of the sixth forest survey of Central Florida. Field work began in July 1987 and was completed in September 1987. Five previous surveys, completed in 1936, 1949, 1959, 1970, and 1980, provide statistics for measuring changes and trends over the past 52 years. The primary emphasis in this report is on the changes and trends...

  11. Effectiveness of community-based rehabilitation after traumatic brain injury for 489 program completers compared with those precipitously discharged.

    PubMed

    Altman, Irwin M; Swick, Shannon; Parrot, Devan; Malec, James F

    2010-11-01

    To evaluate outcomes of home- and community-based postacute brain injury rehabilitation (PABIR). Retrospective analysis of program evaluation data for treatment completers and noncompleters. Home- and community-based PABIR conducted in 7 geographically distinct U.S. cities. Patients (N=489) with traumatic brain injury who completed the prescribed course of rehabilitation (completed-course-of-treatment [CCT] group) compared with 114 who were discharged precipitously before program completion (precipitous-discharge [PD] group). PABIR delivered in home and community settings by certified professional staff on an individualized basis. Mayo-Portland Adaptability Inventory (MPAI-4) completed by means of professional consensus on admission and at discharge; MPAI-4 Participation Index at 3- and 12-month follow-up through telephone contact. Analysis of covariance (CCT vs PD group as between-subjects variable, admission MPAI-4 score as covariate) showed significant differences between groups at discharge on the full MPAI-4 (F=82.25; P<.001), Ability Index (F=50.24; P<.001), Adjustment Index (F=81.20; P<.001), and Participation Index (F=59.48; P<.001). A large portion of the sample was lost to follow-up; however, available data showed that group differences remained statistically significant at follow-up. Results provided evidence of the effectiveness of home- and community-based PABIR and that treatment effects were maintained at follow-up. Copyright © 2010 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
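
    The group comparison above is an analysis of covariance with the admission MPAI-4 score as covariate. A minimal sketch of that kind of model on simulated data is shown below; the variable names and effect sizes are illustrative, not the study data.

```python
# ANCOVA sketch: discharge score modeled by group, adjusting for the admission
# score as a covariate. Data are simulated; column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(1)
n = 200
group = rng.choice(["CCT", "PD"], size=n)
admission = rng.normal(60, 10, n)
discharge = admission - 10 - 8 * (group == "CCT") + rng.normal(0, 6, n)

df = pd.DataFrame({"group": group, "admission": admission, "discharge": discharge})
model = smf.ols("discharge ~ C(group) + admission", data=df).fit()
print(anova_lm(model, typ=2))   # F test for the group effect, adjusted for admission
```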

  12. Constraining sterile neutrinos with AMANDA and IceCube atmospheric neutrino data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Esmaili, Arman; Peres, O.L.G.; Halzen, Francis, E-mail: aesmaili@ifi.unicamp.br, E-mail: halzen@icecube.wisc.edu, E-mail: orlando@ifi.unicamp.br

    2012-11-01

    We demonstrate that atmospheric neutrino data accumulated with the AMANDA and the partially deployed IceCube experiments constrain the allowed parameter space for a hypothesized fourth sterile neutrino beyond the reach of a combined analysis of all other experiments, for Δm²₄₁ ≲ 1 eV². Although the IceCube data dominates the statistics in the analysis, the advantage of a combined analysis of AMANDA and IceCube data is the partial remedy of yet unknown instrumental systematic uncertainties. We also illustrate the sensitivity of the completed IceCube detector, that is now taking data, to the parameter space of the 3+1 model.

  13. Jamie's Ministry of Food: quasi-experimental evaluation of immediate and sustained impacts of a cooking skills program in Australia.

    PubMed

    Flego, Anna; Herbert, Jessica; Waters, Elizabeth; Gibbs, Lisa; Swinburn, Boyd; Reynolds, John; Moodie, Marj

    2014-01-01

    To evaluate the immediate and sustained effectiveness of the first Jamie's Ministry of Food Program in Australia on individuals' cooking confidence and positive cooking/eating behaviours. A quasi-experimental repeated measures design was used incorporating a wait-list control group. A questionnaire was developed and administered at baseline (T1), immediately post program (T2) and 6 months post completion (T3) for participants allocated to the intervention group, while wait-list controls completed it 10 weeks prior to program commencement (T1) and just before program commencement (T2). The questionnaire measured: participants' confidence to cook, the frequency of cooking from basic ingredients, and consumption of vegetables, vegetables with the main meal, fruit, ready-made meals and takeaway. Analysis used a linear mixed model approach for repeated measures using all available data to determine mean differences within and between groups over time. All adult participants (≥18 years) who registered and subsequently participated in the program in Ipswich, Queensland, between late November 2011 and December 2013, were invited to participate. In the intervention group: 694 completed T1, 383 completed T1 and T2 and 214 completed T1, T2 and T3 assessments. In the wait-list group: 237 completed T1 and 149 completed T1 and T2 assessments. Statistically significant increases within the intervention group (P<0.001) and significant group*time interaction effects (P<0.001) were found in all cooking confidence measures between T1 and T2 as well as cooking from basic ingredients, frequency of eating vegetables with the main meal and daily vegetable intake (0.52 serves/day increase). Statistically significant increases at T2 were sustained at 6 months post program in the intervention group. Jamie's Ministry of Food Program, Australia improved individuals' cooking confidence and cooking/eating behaviours contributing to a healthier diet and is a promising community-based strategy to influence diet quality.

  14. Jamie's Ministry of Food: Quasi-Experimental Evaluation of Immediate and Sustained Impacts of a Cooking Skills Program in Australia

    PubMed Central

    Flego, Anna; Herbert, Jessica; Waters, Elizabeth; Gibbs, Lisa; Swinburn, Boyd; Reynolds, John; Moodie, Marj

    2014-01-01

    Objective To evaluate the immediate and sustained effectiveness of the first Jamie's Ministry of Food Program in Australia on individuals' cooking confidence and positive cooking/eating behaviours. Methods A quasi-experimental repeated measures design was used incorporating a wait-list control group. A questionnaire was developed and administered at baseline (T1), immediately post program (T2) and 6 months post completion (T3) for participants allocated to the intervention group, while wait-list controls completed it 10 weeks prior to program commencement (T1) and just before program commencement (T2). The questionnaire measured: participants' confidence to cook, the frequency of cooking from basic ingredients, and consumption of vegetables, vegetables with the main meal, fruit, ready-made meals and takeaway. Analysis used a linear mixed model approach for repeated measures using all available data to determine mean differences within and between groups over time. Subjects All adult participants (≥18 years) who registered and subsequently participated in the program in Ipswich, Queensland, between late November 2011 and December 2013, were invited to participate. Results In the intervention group: 694 completed T1, 383 completed T1 and T2 and 214 completed T1, T2 and T3 assessments. In the wait-list group: 237 completed T1 and 149 completed T1 and T2 assessments. Statistically significant increases within the intervention group (P<0.001) and significant group*time interaction effects (P<0.001) were found in all cooking confidence measures between T1 and T2 as well as cooking from basic ingredients, frequency of eating vegetables with the main meal and daily vegetable intake (0.52 serves/day increase). Statistically significant increases at T2 were sustained at 6 months post program in the intervention group. Conclusions Jamie's Ministry of Food Program, Australia improved individuals' cooking confidence and cooking/eating behaviours contributing to a healthier diet and is a promising community-based strategy to influence diet quality. PMID:25514531
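
    The analysis described above is a linear mixed model for repeated measures with a group-by-time interaction. The sketch below fits such a model to simulated data with a random intercept per participant; the variable names and effect sizes are illustrative, not the trial data.

```python
# Linear mixed model sketch: random intercept per participant, fixed
# group-by-time interaction. Data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n_per_group = 60
rows = []
for pid in range(2 * n_per_group):
    group = "intervention" if pid < n_per_group else "waitlist"
    subj = rng.normal(0, 0.5)                     # participant-level random intercept
    for time in ("T1", "T2"):
        gain = 0.6 if (group == "intervention" and time == "T2") else 0.0
        rows.append({"id": pid, "group": group, "time": time,
                     "confidence": 3.0 + subj + gain + rng.normal(0, 0.4)})

df = pd.DataFrame(rows)
mixed = smf.mixedlm("confidence ~ group * time", df, groups=df["id"]).fit()
print(mixed.summary())   # the group:time term carries the intervention effect
```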

  15. Psychometric evaluation of the Persian version of the Templer's Death Anxiety Scale in cancer patients.

    PubMed

    Soleimani, Mohammad Ali; Yaghoobzadeh, Ameneh; Bahrami, Nasim; Sharif, Saeed Pahlevan; Sharif Nia, Hamid

    2016-10-01

    In this study, 398 Iranian cancer patients completed the 15-item Templer's Death Anxiety Scale (TDAS). Tests of internal consistency, principal components analysis, and confirmatory factor analysis were conducted to assess the internal consistency and factorial validity of the Persian TDAS. The construct reliability statistic and average variance extracted were also calculated to measure construct reliability, convergent validity, and discriminant validity. Principal components analysis indicated a 3-component solution, which was generally supported in the confirmatory analysis. However, acceptable cutoffs for construct reliability, convergent validity, and discriminant validity were not fulfilled for the three subscales that were derived from the principal component analysis. This study demonstrated both the advantages and potential limitations of using the TDAS with Persian-speaking cancer patients.
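
    The construct reliability and average variance extracted mentioned above are usually computed from standardized factor loadings with the Fornell-Larcker formulas, sketched below on hypothetical loadings (not the TDAS loadings).

```python
# Fornell-Larcker construct reliability (CR) and average variance extracted
# (AVE) from standardized loadings; the loadings below are hypothetical.
def construct_reliability(loadings):
    num = sum(loadings) ** 2
    error = sum(1 - l ** 2 for l in loadings)
    return num / (num + error)

def average_variance_extracted(loadings):
    return sum(l ** 2 for l in loadings) / len(loadings)

loadings = [0.72, 0.65, 0.58, 0.70, 0.61]   # hypothetical standardized loadings
print(round(construct_reliability(loadings), 2))        # commonly judged against 0.7
print(round(average_variance_extracted(loadings), 2))   # commonly judged against 0.5
```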

  16. Impact of Integrated Science and English Language Arts Literacy Supplemental Instructional Intervention on Science Academic Achievement of Elementary Students

    NASA Astrophysics Data System (ADS)

    Marks, Jamar Terry

    The purpose of this quasi-experimental, nonequivalent pretest-posttest control group design study was to determine if any differences existed in upper elementary school students' science academic achievement when instructed using an 8-week integrated science and English language arts literacy supplemental instructional intervention in conjunction with traditional science classroom instruction as compared to when instructed using solely traditional science classroom instruction. The targeted sample population consisted of fourth-grade students enrolled in a public elementary school located in the southeastern region of the United States. The convenience sample consisted of 115 fourth-grade students enrolled in science classes. The pretest and posttest academic achievement data consisted of the science segments of the Spring 2015 and Spring 2016 state standardized assessments. Pretest and posttest academic achievement data were analyzed using an ANCOVA statistical procedure to test for differences, and the researcher reported the results of the statistical analysis. The results of the study showed no significant difference in science academic achievement between the treatment and control groups. An interpretation of the results and recommendations for future research were provided by the researcher upon completion of the statistical analysis.
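
    A brief sketch of the kind of pretest-posttest ANCOVA reported above, using Python and statsmodels; the data file and column names (group, pretest, posttest) are assumptions for illustration:

      import pandas as pd
      import statsmodels.formula.api as smf
      from statsmodels.stats.anova import anova_lm

      df = pd.read_csv("science_scores.csv")  # assumed columns: student, group, pretest, posttest

      # ANCOVA: posttest scores modelled by group, adjusting for the pretest covariate.
      model = smf.ols("posttest ~ pretest + C(group)", data=df).fit()
      print(anova_lm(model, typ=2))  # the C(group) row tests the treatment effect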

  17. Choroidal Thickness Analysis in Patients with Usher Syndrome Type 2 Using EDI OCT.

    PubMed

    Colombo, L; Sala, B; Montesano, G; Pierrottet, C; De Cillà, S; Maltese, P; Bertelli, M; Rossetti, L

    2015-01-01

    To characterize Usher Syndrome type 2 by analyzing choroidal thickness and comparing the data with those reported in published literature on RP and healthy subjects. Methods. 20 eyes of 10 patients with clinical signs and genetic diagnosis of Usher Syndrome type 2 were included. Each patient underwent a complete ophthalmologic examination including Best Corrected Visual Acuity (BCVA), intraocular pressure (IOP), axial length (AL), automated visual field (VF), and EDI OCT. Both retinal thickness (RT) and choroidal thickness were measured. Statistical analysis was performed to correlate choroidal thickness with age, BCVA, IOP, AL, VF, and RT. Comparison with data on healthy people and nonsyndromic RP patients was performed. Results. Mean subfoveal choroidal thickness (SFCT) was 248.21 ± 79.88 microns. SFCT was statistically significantly correlated with age (correlation coefficient -0.7248179, p < 0.01). No statistically significant correlation was found between SFCT and BCVA, IOP, AL, VF, or RT. SFCT was reduced compared with healthy subjects (p < 0.01). No difference was found when compared to choroidal thickness from nonsyndromic RP patients (p = 0.2138). Conclusions. Our study demonstrated in vivo choroidal thickness reduction in patients with Usher Syndrome type 2. These data are important for the comprehension of mechanisms of disease and for the evaluation of therapeutic approaches.
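
    The age correlation reported above (coefficient about -0.72) is a simple bivariate test; a hedged sketch in Python, with the file and column names assumed for illustration:

      import pandas as pd
      from scipy import stats

      df = pd.read_csv("usher_choroid.csv")  # assumed columns: age, sfct (subfoveal choroidal thickness, microns)

      # Pearson correlation between age and subfoveal choroidal thickness.
      r, p = stats.pearsonr(df["age"], df["sfct"])
      print(f"r = {r:.2f}, p = {p:.4f}")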

  18. Statistical analysis of the factors that influenced the mechanical properties improvement of cassava starch films

    NASA Astrophysics Data System (ADS)

    Monteiro, Mayra; Oliveira, Victor; Santos, Francisco; Barros Neto, Eduardo; Silva, Karyn; Silva, Rayane; Henrique, João; Chibério, Abimaelle

    2017-08-01

    To obtain cassava starch films with improved mechanical properties relative to the synthetic polymers used in packaging production, a complete 2³ factorial design was carried out to investigate which factors significantly influence the tensile strength of the biofilm. The factors investigated were the cassava starch, glycerol and modified clay contents. Modified bentonite clay was used as the filler material of the biofilm. Glycerol was the plasticizer used to thermoplasticize the cassava starch. The factorial analysis suggested a regression model capable of predicting the optimal mechanical property of the cassava starch film from the maximization of the tensile strength. The reliability of the regression model was tested against the experimental data using a Pareto chart. Modified clay was the factor with the greatest statistical significance for the observed response variable, being the factor that contributed most to the improvement of the mechanical property of the starch film. The factorial experiments showed that the interaction of glycerol with both modified clay and cassava starch was significant for the reduction of biofilm ductility. Modified clay and cassava starch contributed to the maximization of biofilm ductility, while glycerol contributed to its minimization.
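
    A small sketch of how main and interaction effects of a complete 2³ factorial design can be estimated and ranked for a Pareto-style comparison; the coded factor levels follow the usual convention and the tensile strength values are placeholders, not the experimental data:

      import itertools
      import pandas as pd
      import statsmodels.formula.api as smf

      # Coded (-1/+1) levels for starch (A), glycerol (B) and modified clay (C).
      runs = pd.DataFrame(list(itertools.product([-1, 1], repeat=3)), columns=["A", "B", "C"])
      runs["tensile"] = [4.1, 3.8, 5.0, 4.6, 4.3, 3.9, 6.2, 5.8]  # hypothetical MPa values

      # Fit the full model with all interactions and rank effects by absolute size (Pareto ordering).
      model = smf.ols("tensile ~ A * B * C", data=runs).fit()
      effects = model.params.drop("Intercept").abs().sort_values(ascending=False)
      print(effects)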

  19. [The main directions of reforming the service of medical statistics in Ukraine].

    PubMed

    Golubchykov, Mykhailo V; Orlova, Nataliia M; Bielikova, Inna V

    2018-01-01

    Introduction: The implementation of new methods of information support for managerial decision-making should ensure effective health system reform and create conditions for improving the quality of operational management, for sound planning of medical care, and for more efficient use of system resources. Reform of the Medical Statistics Service of Ukraine should be considered only in the context of reform of the entire health system. The aim: To analyse the current situation and to justify the main directions for reforming the Medical Statistics Service of Ukraine. Material and methods: A range of methods was used: content analysis, bibliosemantic analysis, and a systematic approach. The information base of the research comprised WHO strategic and programme documents and data from the Medical Statistics Center of the Ministry of Health of Ukraine. Review: The Medical Statistics Service of Ukraine has a complete and effective structure, headed by the State Institution "Medical Statistics Center of the Ministry of Health of Ukraine." This institution reports on behalf of the Ministry of Health of Ukraine to the State Statistical Service of Ukraine, the WHO European Office and other international organizations. An analysis of the current situation showed that achieving this goal requires: improving the system of statistical indicators for an adequate assessment of the performance of health institutions, including in economic terms; creating a well-developed medical-statistical base for the administrative territories; changing the existing technologies for the formation of information resources; strengthening the material and technical base of the structural units of the Medical Statistics Service; improving the system of training and retraining of personnel for the medical statistics service; developing international cooperation in the methodology and practice of medical statistics, including the implementation of internationally accepted methods for collecting, processing, analyzing and disseminating medical-statistical information; and creating a medical-statistical service that is adapted to the specifics of market relations in health care and is flexible and responsive to changes in international methodologies and standards. Conclusions: Medical statistics are the basis for managerial decisions by managers at all levels of health care. Reform of the Medical Statistics Service of Ukraine should be considered only in the context of reform of the entire health system. The main directions of the reform of the medical statistics service in Ukraine are: the introduction of information technologies, better training of personnel for the service, improved material and technical equipment, and the maximum reuse of the data obtained, which requires the unification of primary data and of the system of indicators. The most difficult area is the formation of information funds and the introduction of modern information technologies.

  1. Identifying the Types of Support Needed by Interprofessional Teams Providing Pediatric End-of-Life Care: A Thematic Analysis.

    PubMed

    Riotte, Clare O; Kukora, Stephanie K; Keefer, Patricia M; Firn, Janice I

    2018-04-01

    Despite the number of interprofessional team members caring for children at the end of life, little evidence exists on how institutions can support their staff in providing care in these situations. We sought to evaluate which aspects of the hospital work environment were most helpful for multidisciplinary team members who care for patients at the end of life and to identify areas for improvement to better address staff needs. Qualitative thematic analysis was performed on free-text comments from a survey distributed to interprofessional staff members involved in the care of a recently deceased pediatric patient. A total of 2701 surveys were sent and 890 were completed. Free-text responses were provided by 306 interprofessional team members. Participants were interprofessional team members involved in the care of a child who died at a 348-bed academic children's hospital in the Midwestern United States. Realist thematic analysis of free-text responses was completed in Dedoose using a deductive and inductive approach with line-by-line coding. Descriptive statistics of demographic information were computed using Excel. Thematic analysis of the 306 free-text responses identified three main support-related themes. Interprofessional team members desire to have (1) support through educational efforts such as workshops, (2) support from colleagues, and (3) support through institutional practices. Providers who participate in end-of-life work benefit from ongoing support through education, interpersonal relationships, and institutional practices. Addressing these areas from an interprofessional perspective enables staff to provide the optimal care for patients, patients' families, and themselves.

  2. Novel Mechanism for Reducing Acute and Chronic Neurodegeneration After Traumatic Brain Injury

    DTIC Science & Technology

    2017-07-01

    glutamate from the brain. Scope: We will test this novel and powerful neuroprotective treatment in a rat model of repetitive mild (concussive) TBIs...variability. 2. Completed statistical analysis of behavioral experiments examining effects of rGOT and rGOT + OxAc on outcome on rotarod and Morris water ...neuroprotective treatment in a rat model of a single moderate TBI and in a rat model of repetitive mild (concussive) TBIs. Outcome measures include blood and

  3. Impact of Embedded Military Metal Alloys on Skeletal Physiology in an Animal Model

    DTIC Science & Technology

    2017-04-04

    turnover were completed and statistical comparison performed for each time point. Each ELISA was performed according to the instructions within each kit...expectations for controls. Results of osteocalcin ELISA were evaluated and any results with a coefficient of variation greater than 25% were omitted...Results of TRAP5b ELISA were evaluated and any results with a coefficient of variation greater than 25% were omitted from analysis. Measures of TRAP5b

  4. The Roles of Experience, Gender, and Individual Differences in Statistical Reasoning

    ERIC Educational Resources Information Center

    Martin, Nadia; Hughes, Jeffrey; Fugelsang, Jonathan

    2017-01-01

    We examine the joint effects of gender and experience on statistical reasoning. Participants with various levels of experience in statistics completed the Statistical Reasoning Assessment (Garfield, 2003), along with individual difference measures assessing cognitive ability and thinking dispositions. Although the performance of both genders…

  5. On the age and mass function of the globular cluster M 4: A different interpretation of recent deep HST observations

    NASA Astrophysics Data System (ADS)

    De Marchi, G.; Paresce, F.; Straniero, O.; Prada Moroni, P. G.

    2004-03-01

    Very deep images of the Galactic globular cluster M 4 (NGC 6121) through the F606W and F814W filters were taken in 2001 with the WFPC2 on board the HST. A first published analysis of this data set (Richer et al. 2002) produced the result that the age of M 4 is 12.7 ± 0.7 Gyr (Hansen et al. 2002), thus setting a robust lower limit to the age of the universe. In view of the great astronomical importance of getting this number right, we have subjected the same data set to the simplest possible photometric analysis that completely avoids uncertain assumptions about the origin of the detected sources. This analysis clearly reveals both a thin main sequence, from which can be deduced the deepest statistically complete mass function yet determined for a globular cluster, and a white dwarf (WD) sequence extending all the way down to the 5σ detection limit at I ≃ 27. The WD sequence is abruptly terminated at exactly this limit as expected by detection statistics. Using our most recent theoretical WD models (Prada Moroni & Straniero 2002) to obtain the expected WD sequence for different ages in the observed bandpasses, we find that the data so far obtained do not reach the peak of the WD luminosity function, thus only allowing one to set a lower limit to the age of M 4 of ~9 Gyr. Thus, the problem of determining the absolute age of a globular cluster and, therefore, the onset of GC formation with cosmologically significant accuracy remains completely open. Only observations several magnitudes deeper than the limit obtained so far would allow one to approach this objective. Based on observations with the NASA/ESA Hubble Space Telescope, obtained at the Space Telescope Science Institute, which is operated by AURA for NASA under contract NAS5-26555.

  6. HICOSMO: cosmology with a complete sample of galaxy clusters - II. Cosmological results

    NASA Astrophysics Data System (ADS)

    Schellenberger, G.; Reiprich, T. H.

    2017-10-01

    The X-ray bright, hot gas in the potential well of a galaxy cluster enables systematic X-ray studies of samples of galaxy clusters to constrain cosmological parameters. HIFLUGCS consists of the 64 X-ray brightest galaxy clusters in the Universe, building up a local sample. Here, we utilize this sample to determine, for the first time, individual hydrostatic mass estimates for all the clusters of the sample and, by making use of the completeness of the sample, we quantify constraints on the two interesting cosmological parameters, Ωm and σ8. We apply our total hydrostatic and gas mass estimates from the X-ray analysis to a Bayesian cosmological likelihood analysis and leave several parameters free to be constrained. We find Ωm = 0.30 ± 0.01 and σ8 = 0.79 ± 0.03 (statistical uncertainties, 68 per cent credibility level) using our default analysis strategy combining both a mass function analysis and the gas mass fraction results. The main sources of biases that we correct here are (1) the influence of galaxy groups (incompleteness in parent samples and differing behaviour of the Lx-M relation), (2) the hydrostatic mass bias, (3) the extrapolation of the total mass (comparing various methods), (4) the theoretical halo mass function and (5) other physical effects (non-negligible neutrino mass). We find that galaxy groups introduce a strong bias, since their number density seems to be over predicted by the halo mass function. On the other hand, incorporating baryonic effects does not result in a significant change in the constraints. The total (uncorrected) systematic uncertainties (∼20 per cent) clearly dominate the statistical uncertainties on cosmological parameters for our sample.

  7. Chordee and Penile Shortening Rather Than Voiding Function Are Associated With Patient Dissatisfaction After Urethroplasty.

    PubMed

    Maciejewski, Conrad C; Haines, Trevor; Rourke, Keith F

    2017-05-01

    To identify factors that predict patient satisfaction after urethroplasty by prospectively examining patient-reported quality of life scores using 3 validated instruments. A 3-part prospective survey consisting of the International Prostate Symptom Score (IPSS), the International Index of Erectile Function (IIEF) score, and a urethroplasty quality of life survey was completed by patients who underwent urethroplasty preoperatively and at 6 months postoperatively. The quality of life score included questions on genitourinary pain, urinary tract infection (UTI), postvoid dribbling, chordee, shortening, overall satisfaction, and overall health. Data were analyzed using descriptive statistics, paired t test, univariate and multivariate logistic regression analyses, and Wilcoxon signed-rank analysis. Patients were enrolled in the study from February 2011 to December 2014, and a total of 94 patients who underwent a total of 102 urethroplasties completed the study. Patients reported statistically significant improvements in IPSS (P < .001). Ordinal linear regression analysis revealed no association between age, IPSS, or IIEF score and patient satisfaction. Wilcoxon signed-rank analysis revealed significant improvements in pain scores (P = .02), UTI (P < .001), perceived overall health (P = .01), and satisfaction (P < .001). Univariate logistic regression identified a length >4 cm and the absence of UTI, pain, shortening, and chordee as predictors of patient satisfaction. Multivariate analysis of quality of life domain scores identified absence of shortening and absence of chordee as independent predictors of patient satisfaction following urethroplasty (P < .01). Patient voiding function and quality of life improve significantly following urethroplasty, but improvement in voiding function is not associated with patient satisfaction. Chordee status and perceived penile shortening impact patient satisfaction, and should be included in patient-reported outcome measures.
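
    The paired t test and Wilcoxon signed-rank analysis named above are standard before/after comparisons; a minimal sketch in Python, with the file and column names assumed:

      import pandas as pd
      from scipy import stats

      df = pd.read_csv("urethroplasty_scores.csv")  # assumed columns: ipss_pre, ipss_post, pain_pre, pain_post

      # Paired t test for the change in IPSS from baseline to 6 months post-operatively.
      t, p_t = stats.ttest_rel(df["ipss_pre"], df["ipss_post"])

      # Wilcoxon signed-rank test for the ordinal pain domain, as used for the quality of life scores.
      w, p_w = stats.wilcoxon(df["pain_pre"], df["pain_post"])
      print(f"paired t: t = {t:.2f}, p = {p_t:.4f}; Wilcoxon: W = {w:.1f}, p = {p_w:.4f}")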

  8. Evaluation of centrifuged bone marrow on bone regeneration around implants in rabbit tibia.

    PubMed

    Betoni, Walter; Queiroz, Thallita P; Luvizuto, Eloá R; Valentini-Neto, Rodolpho; Garcia-Júnior, Idelmo R; Bernabé, Pedro F E

    2012-12-01

    To evaluate the bone regeneration of cervical defects produced around titanium implants filled with blood clot and filled with centrifuged bone marrow (CBM) by means of histomorphometric analysis. Twelve rabbits received 2 titanium implants in each right tibia, with the upper cortex prepared with a 5-mm-diameter drill and the lower cortex with a 3-mm-diameter drill. Euthanasia was performed to allow analysis at 7, 21, and 60 days after the operation. The samples were embedded in light-curing resin, cut, and stained with alizarin red and Stevenel blue for a histomorphometric analysis of the bone-to-implant contact (BIC) and the bone area around the implant (BA). The values obtained were statistically analyzed using the nonparametric Kruskal-Wallis test (P = 0.05). At 60 days postoperation, the groups had their cervical defects completely filled by neoformed bone tissue. There was no statistically significant difference between the groups regarding BIC and BA during the analyzed periods. There was no difference in the bone repair of periimplant cervical defects with or without the use of CBM.
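
    The nonparametric Kruskal-Wallis comparison used above can be sketched as follows; the bone-to-implant contact values are placeholders, not the study data:

      from scipy import stats

      # Hypothetical bone-to-implant contact (%) values for the blood clot and CBM groups at one time point.
      clot = [42.1, 38.5, 45.0, 40.2, 43.6, 39.9]
      cbm = [44.3, 39.8, 46.1, 41.7, 42.9, 40.5]

      h, p = stats.kruskal(clot, cbm)  # alpha = 0.05, as in the study
      print(f"H = {h:.2f}, p = {p:.3f}")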

  9. Employment status, inflation and suicidal behaviour: an analysis of a stratified sample in Italy.

    PubMed

    Solano, Paola; Pizzorno, Enrico; Gallina, Anna M; Mattei, Chiara; Gabrielli, Filippo; Kayman, Joshua

    2012-09-01

    There is abundant empirical evidence of a surplus risk of suicide among the unemployed, although few studies have investigated the influence of economic downturns on suicidal behaviours in an employment status-stratified sample. We investigated how economic inflation affected suicidal behaviours according to employment status in Italy from 2001 to 2008. Data concerning economically active people were provided by the Italian Institute for Statistical Analysis and by the International Monetary Fund. The association between inflation and completed versus attempted suicide with respect to employment status was investigated in every year and quarter-year of the study time frame. We considered three occupational categories: the employed, the unemployed who were previously employed (PE), and the unemployed who had never worked (NE). The unemployed are at higher suicide risk than the employed. Among the PE, a significant association between inflation and suicide attempt was found, whereas no association was reported concerning completed suicides. No association with inflation was found for completed or attempted suicides among the employed or the NE. Completed suicide in females is significantly associated with unemployment in every quarter-year. The reported vulnerability to suicidal behaviours among the PE as inflation rises underlines the need for effective support strategies for both genders in times of economic downturns.

  10. Single-row, double-row, and transosseous equivalent techniques for isolated supraspinatus tendon tears with minimal atrophy: A retrospective comparative outcome and radiographic analysis at minimum 2-year followup

    PubMed Central

    McCormick, Frank; Gupta, Anil; Bruce, Ben; Harris, Josh; Abrams, Geoff; Wilson, Hillary; Hussey, Kristen; Cole, Brian J.

    2014-01-01

    Purpose: The purpose of this study was to measure and compare the subjective, objective, and radiographic healing outcomes of single-row (SR), double-row (DR), and transosseous equivalent (TOE) suture techniques for arthroscopic rotator cuff repair. Materials and Methods: A retrospective comparative analysis of arthroscopic rotator cuff repairs by one surgeon from 2004 to 2010 at minimum 2-year followup was performed. Cohorts were matched for age, sex, and tear size. Subjective outcome variables included ASES, Constant, SST, UCLA, and SF-12 scores. Objective outcome variables included strength and active range of motion (ROM). Radiographic healing was assessed by magnetic resonance imaging (MRI). Statistical analysis was performed using analysis of variance (ANOVA), Mann-Whitney and Kruskal-Wallis tests, and the Fisher exact probability test, with significance set at P < 0.05. Results: Sixty-three patients completed the study requirements (20 SR, 21 DR, 22 TOE). There was a clinically and statistically significant improvement in outcomes with all repair techniques (ASES mean improvement, P < 0.0001). The mean final ASES scores were: SR 83 (SD 21.4), DR 87 (SD 18.2), TOE 87 (SD 13.2) (P = 0.73). There was a statistically significant improvement in strength for each repair technique (P < 0.001). There was no significant difference between techniques across all secondary outcome assessments: ASES improvement, Constant, SST, UCLA, SF-12, ROM, Strength, and MRI re-tear rates. There was a decrease in re-tear rates from single-row (22%) to double-row (18%) to transosseous equivalent (11%); however, this difference was not statistically significant (P = 0.6). Conclusions: Compared to preoperatively, arthroscopic rotator cuff repair, using SR, DR, or TOE techniques, yielded a clinically and statistically significant improvement in subjective and objective outcomes at a minimum 2-year follow-up. Level of Evidence: Therapeutic level 3. PMID:24926159
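
    The Fisher exact probability test mentioned above is the natural check on the re-tear rate comparison; a short sketch with illustrative counts (not the study's actual 2 x 2 table):

      from scipy import stats

      # Hypothetical re-tear vs. intact counts for single-row and transosseous-equivalent repairs.
      table = [[4, 16],   # single-row: re-tear, intact
               [2, 20]]   # transosseous equivalent: re-tear, intact
      odds_ratio, p = stats.fisher_exact(table)
      print(f"odds ratio = {odds_ratio:.2f}, p = {p:.3f}")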

  11. Initial assessment on the use of cocoa pulp in complete feed formulation: in vitro dry matter and organic matter digestibility

    NASA Astrophysics Data System (ADS)

    Natsir, A.; Mujnisa, A.; Mide, M. Z.; Purnomo, N.; Saade, M. F.

    2018-05-01

    Cocoa pulp is a by-product of the cocoa industry that is produced in large quantities, but very few studies have examined its use as an energy source in animal feed. The purpose of this study was to assess the in vitro dry matter digestibility (IVDMD) and in vitro organic matter digestibility (IVOMD) of complete feed containing different levels of cocoa pulp. The experiment was carried out according to a completely randomised design consisting of four treatments and three replications. The treatments were P0 = complete feed containing 0% cocoa pulp, P1 = complete feed containing 5% cocoa pulp, P2 = complete feed containing 10% cocoa pulp, and P3 = complete feed containing 15% cocoa pulp on a dry matter basis. The results of the study indicated that the average IVDMD was 567, 538, 566, and 526 g/kg DM, while the average IVOMD was 522, 491, 502, and 461 g/kg DM, for treatments P0, P1, P2, and P3, respectively. Statistical analysis indicated that increasing levels of cocoa pulp in the feed significantly affected (P<0.05) the IVDMD and IVOMD of the feed. In conclusion, cocoa pulp can potentially be included at up to 10% in complete feed with corn cobs as the fibre source.
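
    For a completely randomised design such as this, a one-way ANOVA on the digestibility values is the usual test; a minimal sketch with placeholder replicate values, not the measured data:

      from scipy import stats

      # Hypothetical IVDMD replicates (g/kg DM) for the four cocoa pulp inclusion levels.
      p0 = [570, 565, 566]
      p1 = [540, 536, 538]
      p2 = [568, 563, 567]
      p3 = [528, 524, 526]

      f, p = stats.f_oneway(p0, p1, p2, p3)  # significance judged at P < 0.05
      print(f"F = {f:.2f}, p = {p:.4f}")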

  12. History of water quality parameters - a study on the Sinos River/Brazil.

    PubMed

    Konzen, G B; Figueiredo, J A S; Quevedo, D M

    2015-05-01

    Water is increasingly becoming a valuable resource, constituting one of the central themes of environmental, economic and social discussions. The Sinos River, located in southern Brazil, is the main river of the Sinos River Basin, representing a source of drinking water supply for a highly populated region. Considering its size and importance, a study following the water quality of this river, which is considered by some experts to be one of the most polluted rivers in Brazil, is necessary. The importance of this study lies in its historical analysis of indicators. In this sense, we sought to develop aspects related to the management of water resources by performing a historical analysis of the Water Quality Index (WQI) of the Sinos River, using statistical methods. With regard to the methodological procedures, this study performs a temporal analysis, using statistical tools, of monitoring data on parameters measured at fixed points and varying over time. The data used refer to analyses of the water quality of the Sinos River (WQI) from the State Environmental Protection Agency Henrique Luiz Roessler (Fundação Estadual de Proteção Ambiental Henrique Luiz Roessler, FEPAM) covering the period between 2000 and 2008, as well as to a theoretical analysis focusing on the management of water resources. The study of the WQI and its parameters by statistical analysis proved effective, confirming its usefulness as a tool for the management of water resources. The descriptive analysis of the WQI and its parameters showed that the water quality of the Sinos River is concerningly low, which reaffirms that it is one of the most polluted rivers in Brazil. It should be highlighted that there was an overall difficulty in obtaining data with the appropriate periodicity, as well as a long complete series, which limited statistical studies such as the present one.

  13. Weather related continuity and completeness on Deep Space Ka-band links: statistics and forecasting

    NASA Technical Reports Server (NTRS)

    Shambayati, Shervin

    2006-01-01

    In this paper, the concept of link 'stability' as a means of measuring the continuity of the link is introduced; through it, along with the distributions of 'good' and 'bad' periods, the performance of the proposed Ka-band link design method using both forecasting and long-term statistics is analyzed. The results indicate that the proposed link design method has relatively good continuity and completeness characteristics even when only long-term statistics are used, and that the continuity performance further improves when forecasting is employed.

  14. Compilation and Analysis of 20 and 30 GHz Rain Fade Events at the ACTS NASA Ground Station: Statistics and Model Assessment

    NASA Technical Reports Server (NTRS)

    Manning, Robert M.

    1996-01-01

    The purpose of the propagation studies within the ACTS Project Office is to acquire 20 and 30 GHz rain fade statistics using the ACTS beacon links received at the NGS (NASA Ground Station) in Cleveland. Other than the raw, statistically unprocessed rain fade events that occur in real time, relevant rain fade statistics derived from such events are the cumulative rain fade statistics as well as fade duration statistics (beyond given fade thresholds) over monthly and yearly time intervals. Concurrent with the data logging exercise, monthly maximum rainfall levels recorded at the US Weather Service at Hopkins Airport are appended to the database to facilitate comparison of observed fade statistics with those predicted by the ACTS Rain Attenuation Model. Also, the raw fade data will be in a format, complete with documentation, for use by other investigators who require realistic fade event evolution in time for simulation purposes or further analysis for comparisons with other rain fade prediction models, etc. The raw time series data from the 20 and 30 GHz beacon signals is purged of non relevant data intervals where no rain fading has occurred. All other data intervals which contain rain fade events are archived with the accompanying time stamps. The definition of just what constitutes a rain fade event will be discussed later. The archived data serves two purposes. First, all rain fade event data is recombined into a contiguous data series every month and every year; this will represent an uninterrupted record of the actual (i.e., not statistically processed) temporal evolution of rain fade at 20 and 30 GHz at the location of the NGS. The second purpose of the data in such a format is to enable a statistical analysis of prevailing propagation parameters such as cumulative distributions of attenuation on a monthly and yearly basis as well as fade duration probabilities below given fade thresholds, also on a monthly and yearly basis. In addition, various subsidiary statistics such as attenuation rate probabilities are derived. The purged raw rain fade data as well as the results of the analyzed data will be made available for use by parties in the private sector upon their request. The process which will be followed in this dissemination is outlined in this paper.
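
    A compact sketch of how cumulative attenuation statistics and fade durations beyond a threshold can be derived from a beacon time series of the kind described above; the synthetic series, 1-s sampling and 3 dB threshold are assumptions for illustration:

      import numpy as np

      # Synthetic stand-in for a day of 20 GHz attenuation samples (dB), one per second.
      rng = np.random.default_rng(0)
      atten = rng.gamma(shape=1.5, scale=1.0, size=86_400)

      # Cumulative statistic: fraction of time the attenuation exceeds each threshold.
      thresholds = np.arange(1, 11)
      exceedance = [(atten > t).mean() for t in thresholds]

      # Fade durations: lengths of contiguous runs above a given threshold (here 3 dB).
      above = atten > 3.0
      edges = np.flatnonzero(np.diff(above.astype(int)))
      bounds = np.r_[0, edges + 1, above.size]
      durations = [e - s for s, e in zip(bounds[:-1], bounds[1:]) if above[s]]

      print(dict(zip(thresholds.tolist(), np.round(exceedance, 4))))
      print("number of fades deeper than 3 dB:", len(durations))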

  15. Evaluation of SOVAT: an OLAP-GIS decision support system for community health assessment data analysis.

    PubMed

    Scotch, Matthew; Parmanto, Bambang; Monaco, Valerie

    2008-06-09

    Data analysis in community health assessment (CHA) involves the collection, integration, and analysis of large numerical and spatial data sets in order to identify health priorities. Geographic Information Systems (GIS) enable the management and analysis of spatial data, but have limitations in analyzing numerical data because of their traditional database architecture. On-Line Analytical Processing (OLAP) is a multidimensional data warehouse designed to facilitate querying of large numerical data sets. Coupling the spatial capabilities of GIS with the numerical analysis of OLAP might enhance CHA data analysis. OLAP-GIS systems have been developed by university researchers and corporations, yet their potential for CHA data analysis is not well understood. To evaluate the potential of an OLAP-GIS decision support system for CHA problem solving, we compared OLAP-GIS to the standard information technology (IT) currently used by many public health professionals. SOVAT, an OLAP-GIS decision support system developed at the University of Pittsburgh, was compared against current IT for data analysis for CHA. For this study, current IT was considered the combined use of SPSS and GIS ("SPSS-GIS"). Graduate students, researchers, and faculty in the health sciences at the University of Pittsburgh were recruited. Each round consisted of: an instructional video of the system being evaluated, two practice tasks, five assessment tasks, and one post-study questionnaire. Objective and subjective measurement included: task completion time, success in answering the tasks, and system satisfaction. Thirteen individuals participated. Inferential statistics were analyzed using linear mixed model analysis. SOVAT differed statistically significantly (alpha = .01) from SPSS-GIS for satisfaction and time (p < .002). Descriptive results indicated that participants had greater success in answering the tasks when using SOVAT as compared to SPSS-GIS. Using SOVAT, tasks were completed more efficiently, with a higher rate of success, and with greater satisfaction, than the combined use of SPSS and GIS. The results from this study indicate a potential for OLAP-GIS decision support systems as a valuable tool for CHA data analysis.

  16. Evaluation of SOVAT: An OLAP-GIS decision support system for community health assessment data analysis

    PubMed Central

    Scotch, Matthew; Parmanto, Bambang; Monaco, Valerie

    2008-01-01

    Background Data analysis in community health assessment (CHA) involves the collection, integration, and analysis of large numerical and spatial data sets in order to identify health priorities. Geographic Information Systems (GIS) enable the management and analysis of spatial data, but have limitations in analyzing numerical data because of their traditional database architecture. On-Line Analytical Processing (OLAP) is a multidimensional data warehouse designed to facilitate querying of large numerical data sets. Coupling the spatial capabilities of GIS with the numerical analysis of OLAP might enhance CHA data analysis. OLAP-GIS systems have been developed by university researchers and corporations, yet their potential for CHA data analysis is not well understood. To evaluate the potential of an OLAP-GIS decision support system for CHA problem solving, we compared OLAP-GIS to the standard information technology (IT) currently used by many public health professionals. Methods SOVAT, an OLAP-GIS decision support system developed at the University of Pittsburgh, was compared against current IT for data analysis for CHA. For this study, current IT was considered the combined use of SPSS and GIS ("SPSS-GIS"). Graduate students, researchers, and faculty in the health sciences at the University of Pittsburgh were recruited. Each round consisted of: an instructional video of the system being evaluated, two practice tasks, five assessment tasks, and one post-study questionnaire. Objective and subjective measurement included: task completion time, success in answering the tasks, and system satisfaction. Results Thirteen individuals participated. Inferential statistics were analyzed using linear mixed model analysis. SOVAT differed statistically significantly (α = .01) from SPSS-GIS for satisfaction and time (p < .002). Descriptive results indicated that participants had greater success in answering the tasks when using SOVAT as compared to SPSS-GIS. Conclusion Using SOVAT, tasks were completed more efficiently, with a higher rate of success, and with greater satisfaction, than the combined use of SPSS and GIS. The results from this study indicate a potential for OLAP-GIS decision support systems as a valuable tool for CHA data analysis. PMID:18541037

  17. Digit Span as a measure of everyday attention: a study of ecological validity.

    PubMed

    Groth-Marnat, Gary; Baker, Sonya

    2003-12-01

    This study investigated the effectiveness of the WAIS-III Digit Span subtest to predict the everyday attention of 75 participants with heterogeneous neurological conditions who were administered the Digit Span subtest as well as the ecologically valid Test of Everyday Attention. In addition, the more visually oriented Picture Completion subtest along with the verbally loaded National Adult Reading Test were administered. Analysis indicated that, although Digit Span was a weak but statistically significant predictor of attentional ability (accounting for 12.7% of the unique variance), Picture Completion was a somewhat stronger predictor (accounting for 19% of the unique variance). The weak association of Digit Span with the Test of Everyday Attention, along with the finding that Picture Completion was a better predictor of performance on the Test of Everyday Attention, questions the clinical utility of using Digit Span as a measure of everyday attention.

  18. Differences in Game Statistics Between Winning and Losing Rugby Teams in the Six Nations Tournament

    PubMed Central

    Ortega, Enrique; Villarejo, Diego; Palao, José M.

    2009-01-01

    The objective of the present study was to analyze the differences in rugby game statistics between winning and losing teams. The data from 58 games of round robin play from the Six Nations tournament from the 2003-2006 seasons were analyzed. The groups of variables studied were: number of points scored, way in which the points were scored; way teams obtained the ball and how the team used it; and technical and tactical aspects of the game. A univariate (t-test) and multivariate (discriminant) analysis of the data was done. Winning teams had average values that were significantly higher in points scored, conversions, successful drops, mauls won, line breaks, possessions kicked, tackles completed, and turnovers won. Losing teams had significantly higher averages for the variables scrums lost and line-outs lost. The results showed that: a) in the phases of obtaining the ball, and more specifically in the scrummage and line-out, winning teams lose fewer balls than losing teams (winning teams have an efficacy of 90% in both actions); b) the winning team tends to play more with their feet when they obtain the ball, to utilize the maul as a way of attacking, and to break the defensive line more often than the losing team does; and c) on defence, winning teams recovered more balls and completed more tackles than losing teams, and the percentage of tackles completed by winning teams was 94%. The values presented could be used as a reference for practice and competition in peak-performance teams. Key points: This paper increases knowledge about rugby match analysis; gives normative values to establish practice and match goals; and gives application ideas connecting research with coaching practice. PMID:24149592
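
    A hedged sketch of the univariate (t-test) and multivariate (discriminant) steps described above, in Python; the data file, column names and feature list are assumptions, not the authors' variables:

      import pandas as pd
      from scipy import stats
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      df = pd.read_csv("six_nations_games.csv")  # assumed: one row per team per game, 'won' flag plus game statistics

      # Univariate step: t-test on one game statistic between winning and losing teams.
      t, p = stats.ttest_ind(df.loc[df["won"] == 1, "tackles_completed"],
                             df.loc[df["won"] == 0, "tackles_completed"])

      # Multivariate step: discriminant analysis of winners vs. losers over several game statistics.
      features = ["points", "mauls_won", "line_breaks", "tackles_completed", "turnovers_won"]
      lda = LinearDiscriminantAnalysis().fit(df[features], df["won"])
      print(f"t = {t:.2f}, p = {p:.3f}; discriminant accuracy = {lda.score(df[features], df['won']):.2f}")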

  19. Feeling the future: A meta-analysis of 90 experiments on the anomalous anticipation of random future events

    PubMed Central

    Bem, Daryl; Tressoldi, Patrizio; Rabeyron, Thomas; Duggan, Michael

    2016-01-01

    In 2011, one of the authors (DJB) published a report of nine experiments in the Journal of Personality and Social Psychology purporting to demonstrate that an individual’s cognitive and affective responses can be influenced by randomly selected stimulus events that do not occur until after his or her responses have already been made and recorded, a generalized variant of the phenomenon traditionally denoted by the term precognition. To encourage replications, all materials needed to conduct them were made available on request. We here report a meta-analysis of 90 experiments from 33 laboratories in 14 countries which yielded an overall effect greater than 6 sigma, z = 6.40, p = 1.2 × 10⁻¹⁰, with an effect size (Hedges’ g) of 0.09. A Bayesian analysis yielded a Bayes Factor of 5.1 × 10⁹, greatly exceeding the criterion value of 100 for “decisive evidence” in support of the experimental hypothesis. When DJB’s original experiments are excluded, the combined effect size for replications by independent investigators is 0.06, z = 4.16, p = 1.1 × 10⁻⁵, and the BF value is 3,853, again exceeding the criterion for “decisive evidence.” The number of potentially unretrieved experiments required to reduce the overall effect size of the complete database to a trivial value of 0.01 is 544, and seven of eight additional statistical tests support the conclusion that the database is not significantly compromised by either selection bias or by intense “p-hacking” (the selective suppression of findings or analyses that failed to yield statistical significance). P-curve analysis, a recently introduced statistical technique, estimates the true effect size of the experiments to be 0.20 for the complete database and 0.24 for the independent replications, virtually identical to the effect size of DJB’s original experiments (0.22) and the closely related “presentiment” experiments (0.21). We discuss the controversial status of precognition and other anomalous effects collectively known as psi. PMID:26834996
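
    A compact sketch of an inverse-variance, fixed-effect combination of per-experiment effect sizes, the basic operation behind a meta-analysis of this kind; the effect sizes and standard errors below are placeholders, not the actual database:

      import numpy as np
      from scipy import stats

      # Placeholder per-experiment effect sizes (Hedges' g) and their standard errors.
      g = np.array([0.12, 0.05, 0.20, 0.08, 0.10])
      se = np.array([0.07, 0.05, 0.09, 0.06, 0.08])

      # Inverse-variance weights, combined effect, and its z and one-tailed p value.
      w = 1.0 / se**2
      g_combined = np.sum(w * g) / np.sum(w)
      se_combined = 1.0 / np.sqrt(np.sum(w))
      z = g_combined / se_combined
      p = stats.norm.sf(z)  # one-tailed, matching a directional hypothesis
      print(f"combined g = {g_combined:.3f}, z = {z:.2f}, p = {p:.2e}")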

  20. What does the multiple mini interview have to offer over the panel interview?

    PubMed

    Pau, Allan; Chen, Yu Sui; Lee, Verna Kar Mun; Sow, Chew Fei; De Alwis, Ranjit

    2016-01-01

    This paper compares the panel interview (PI) performance with the multiple mini interview (MMI) performance and indication of behavioural concerns of a sample of medical school applicants. The acceptability of the MMI was also assessed. All applicants shortlisted for a PI were invited to an MMI. Applicants attended a 30-min PI with two faculty interviewers followed by an MMI consisting of ten 8-min stations. Applicants were assessed on their performance at each MMI station by one faculty. The interviewer also indicated if they perceived the applicant to be a concern. Finally, applicants completed an acceptability questionnaire. From the analysis of 133 (75.1%) completed MMI scoresheets, the MMI scores correlated statistically significantly with the PI scores (r=0.438, p=0.001). Both were not statistically associated with sex, age, race, or pre-university academic ability to any significance. Applicants assessed as a concern at two or more stations performed statistically significantly less well at the MMI when compared with those who were assessed as a concern at one station or none at all. However, there was no association with PI performance. Acceptability scores were generally high, and comparison of mean scores for each of the acceptability questionnaire items did not show statistically significant differences between sex and race categories. Although PI and MMI performances are correlated, the MMI may have the added advantage of more objectively generating multiple impressions of the applicant's interpersonal skill, thoughtfulness, and general demeanour. Results of the present study indicated that the MMI is acceptable in a multicultural context.

  1. What does the multiple mini interview have to offer over the panel interview?

    PubMed Central

    Pau, Allan; Chen, Yu Sui; Lee, Verna Kar Mun; Sow, Chew Fei; Alwis, Ranjit De

    2016-01-01

    Introduction This paper compares the panel interview (PI) performance with the multiple mini interview (MMI) performance and indication of behavioural concerns of a sample of medical school applicants. The acceptability of the MMI was also assessed. Materials and methods All applicants shortlisted for a PI were invited to an MMI. Applicants attended a 30-min PI with two faculty interviewers followed by an MMI consisting of ten 8-min stations. Applicants were assessed on their performance at each MMI station by one faculty. The interviewer also indicated if they perceived the applicant to be a concern. Finally, applicants completed an acceptability questionnaire. Results From the analysis of 133 (75.1%) completed MMI scoresheets, the MMI scores correlated statistically significantly with the PI scores (r=0.438, p=0.001). Both were not statistically associated with sex, age, race, or pre-university academic ability to any significance. Applicants assessed as a concern at two or more stations performed statistically significantly less well at the MMI when compared with those who were assessed as a concern at one station or none at all. However, there was no association with PI performance. Acceptability scores were generally high, and comparison of mean scores for each of the acceptability questionnaire items did not show statistically significant differences between sex and race categories. Conclusions Although PI and MMI performances are correlated, the MMI may have the added advantage of more objectively generating multiple impressions of the applicant's interpersonal skill, thoughtfulness, and general demeanour. Results of the present study indicated that the MMI is acceptable in a multicultural context. PMID:26873337

  2. Mortality and cause-of-death reporting and analysis systems in seven Pacific Island countries.

    PubMed

    Carter, Karen L; Rao, Chalapati; Lopez, Alan D; Taylor, Richard

    2012-06-13

    Mortality statistics are essential for population health assessment. Despite limitations in data availability, Pacific Island Countries are considered to be in epidemiological transition, with non-communicable diseases increasingly contributing to premature adult mortality. To address rapidly changing health profiles, countries would require mortality statistics from routine death registration given their relatively small population sizes. This paper uses a standard analytical framework to examine death registration systems in Fiji, Kiribati, Nauru, Palau, Solomon Islands, Tonga and Vanuatu. In all countries, legislation on death registration exists but does not necessarily reflect current practices. Health departments carry the bulk of responsibility for civil registration functions. Medical cause-of-death certificates are completed for at least hospital deaths in all countries. Overall, significantly more information is available than perceived or used. Use is primarily limited by poor understanding, lack of coordination, limited analytical skills, and insufficient technical resources. Across the region, both registration and statistics systems need strengthening to improve the availability, completeness, and quality of data. Close interaction between health staff and local communities provides a good foundation for further improvements in death reporting. System strengthening activities must include a focus on clear assignment of responsibility, provision of appropriate authority to perform assigned tasks, and fostering ownership of processes and data to ensure sustained improvements. These human elements need to be embedded in a culture of data sharing and use. Lessons from this multi-country exercise would be applicable in other regions afflicted with similar issues of availability and quality of vital statistics.

  3. What does the multiple mini interview have to offer over the panel interview?

    PubMed

    Pau, Allan; Chen, Yu Sui; Lee, Verna Kar Mun; Sow, Chew Fei; Alwis, Ranjit De

    2016-01-01

    Introduction This paper compares the panel interview (PI) performance with the multiple mini interview (MMI) performance and indication of behavioural concerns of a sample of medical school applicants. The acceptability of the MMI was also assessed. Materials and methods All applicants shortlisted for a PI were invited to an MMI. Applicants attended a 30-min PI with two faculty interviewers followed by an MMI consisting of ten 8-min stations. Applicants were assessed on their performance at each MMI station by one faculty. The interviewer also indicated if they perceived the applicant to be a concern. Finally, applicants completed an acceptability questionnaire. Results From the analysis of 133 (75.1%) completed MMI scoresheets, the MMI scores correlated statistically significantly with the PI scores (r=0.438, p=0.001). Both were not statistically associated with sex, age, race, or pre-university academic ability to any significance. Applicants assessed as a concern at two or more stations performed statistically significantly less well at the MMI when compared with those who were assessed as a concern at one station or none at all. However, there was no association with PI performance. Acceptability scores were generally high, and comparison of mean scores for each of the acceptability questionnaire items did not show statistically significant differences between sex and race categories. Conclusions Although PI and MMI performances are correlated, the MMI may have the added advantage of more objectively generating multiple impressions of the applicant's interpersonal skill, thoughtfulness, and general demeanour. Results of the present study indicated that the MMI is acceptable in a multicultural context.

  4. Forest statistics for the Southern Piedmont of Virginia, 1991

    Treesearch

    Tony G. Johnson

    1991-01-01

    This report highlights the principal findings of the sixth forest survey of the Southern Piedmont of Virginia. Field work began in March 1991 and was completed in June 1991. Five previous surveys, completed in 1940, 1957, 1965, 1976, and 1985, provide statistics for measuring changes and trends over the past 51 years. The primary emphasis in this report is on the...

  5. Forest statistics for the Northern Mountains of Virginia, 1992

    Treesearch

    Tony G. Johnson

    1992-01-01

    This report highlights the principal findings of the sixth forest survey of the Northern Mountains of Virginia. Field work began in September 1991 and was completed in November 1991. Five previous surveys, completed in 1940, 1957, 1966, 1977, and 1986, provide statistics for measuring changes and trends over the past 52 years. The primary emphasis in this report is on...

  6. Forest statistics for the Southern Coastal Plain of North Carolina, 1990

    Treesearch

    Tony G. Johnson

    1990-01-01

    This report highlights the principal findings of the sixth forest survey of the Southern Coastal Plain of North Carolina. Field work began in April 1989 and was completed in September 1989. Five previous surveys, completed in 1937, 1952, 1962, 1973, and 1983, provide statistics for measuring changes and trends over the past 53 years. The primary emphasis in this report...

  7. Forest statistics for the Northern Mountains of Virginia, 1986

    Treesearch

    Mark J. Brown

    1986-01-01

    This report highlights the findings of the fifth forest survey in the Northern Mountains of Virginia. Fieldwork began in August 1985 and was completed in October 1985. Four previous surveys, completed in 1940, 1957, 1966, and 1977, provide statistics for measuring changes and trends over the past 46 years. The primary emphasis in this report is on the changes and...

  8. Forest statistics for the mountains of North Carolina, 1984

    Treesearch

    Gerald C. Craver

    1985-01-01

    This report highlights the principal findings of the fifth forest survey in the Mountains of North Carolina. Fieldwork began in April 1984 and was completed in September 1984. Four previous surveys, completed in 1938, 1955, 1964, and 1974, provide statistics for measuring changes and trends over the past 46 years. The primary emphasis in this report is on the changes...

  9. Forest statistics for North Central Georgia, 1989

    Treesearch

    Tony G. Johnson

    1989-01-01

    This report highlights the principal findings of the sixth forest survey in North Central Georgia. Field work began in February 1989 and was completed in April 1989. Five previous surveys, completed in 1936, 1953, 1961, 1972, and 1983, provide statistics for measuring changes and trends over the past 53 years. The primary emphasis in this report is on the changes and...

  10. Forest statistics for the Piedmont of North Carolina 1975

    Treesearch

    Richard L. Welch

    1975-01-01

    This report highlights the principal findings of the fourth inventory of the timber resource in the Piedmont of North Carolina. The inventory was started in May 1964 and completed in January 1975. Three previous inventories, completed in 1937, 1956, and 1964, provide statistics for measuring changes and trends over the past 38 years. In this report, the primary...

  11. Forest statistics for the Northern Coastal Plain of North Carolina, 1984

    Treesearch

    Edgar L. Davenport

    1984-01-01

    This report highlights the principal findings of the fifth forest inventory in the Northern Coastal Plain of North Carolina. Fieldwork began in June 1983 and was completed in December 1983. Four previous surveys, completed in 1937, 1955, 1963, and 1974, provide statistics for measuring changes and trends over the past 46 years. The primary emphasis in this report is on...

  12. Forest statistics for the Southern Coastal Plain of North Carolina, 1983

    Treesearch

    John B. Tansey

    1984-01-01

    This report highlights the principal findings of the fifth forest survey in the southern Coastal Plain of North Carolina. Fieldwork began in November 1982 and was completed in June 1983. Four previous surveys, completed in 1938, 1952, 1962, and 1973, provide statistics for measuring changes and trends over the past 46 years. The primary emphasis in this report is on...

  13. Forest statistics for Florida, 1980

    Treesearch

    William A. Bechtold; Raymond M. Sheffield

    1981-01-01

    This report highlights the principal findings of the fifth inventory of Florida’s forests. Fieldwork began in September 1978 and was completed in May 1980. Four previous surveys, completed in 1936, 1949, 1959, and 1970, provide statistics for measuring changes and trends over the past 44 years. The primary emphasis in this report is on the changes and trends since 1970...

  14. Forest statistics for the Piedmont of South Carolina 1977

    Treesearch

    Nolan L. Snyder

    1977-01-01

    This report highlights the principal findings of the fifth inventory of the timber resource in the Piedmont of South Carolina. The inventory was started in April 1977 and completed in September 1977. Four previous inventories, completed in 1936, 1947, 1958, and 1967, provide statistics for measuring changes and trends over the past 41 years. In this report, the primary...

  15. Forest statistics for the Southern Coastal Plain of South Carolina 1978

    Treesearch

    Raymond M. Sheffield; Joanne Hutchison

    1978-01-01

    This report highlights the principal findings of the fifth forest inventory of the Southern Coastal Plain of South Carolina. Fieldwork began in April 1978 and was completed in August 1978. Four previous inventories, completed in 1934, 1947, 1958, and 1968, provide statistics for measuring changes and trends over the past 44 years. The primary emphasis in this report is...

  16. Forest statistics for the Northern Piedmont of Virginia, 1986

    Treesearch

    Mark J. Brown

    1986-01-01

    This report highlights the principal findings of the fifth forest survey in the Northern Piedmont of Virginia. Fieldwork began in July 1985 and was completed in September 1985. Four previous surveys, completed in 1940, 1957, 1965, and 1976, provide statistics for measuring changes and trends over the past 46 years. The primary emphasis in this report is on the changes...

  17. Forest statistics for the Piedmont of North Carolina, 1984

    Treesearch

    Cecil C. Hutchins

    1984-01-01

    This report highlights the principal findings of the fifth forest survey in the Piedmont of North Carolina. Fieldwork began in December 1983 and was completed in August 1984. Four previous surveys, completed in 1937, 1956, 1964, and 1975, provide statistics for measuring changes and trends over the past 47 years. The primary emphasis in this report is on the changes...

  18. Forest statistics for the Northern Piedmont of Virginia, 1992

    Treesearch

    Michael T. Thompson

    1992-01-01

    This report highlights the principal findings of the sixth forest survey of the Northern Piedmont of Virginia. Field work began in June 1991 and was completed in September 1991. Five previous surveys, completed in 1940, 1957, 1965, 1976, and 1986, provide statistics for measuring changes and trends over the past 52 years. The primary emphasis in this report is on the...

  19. Forest statistics for the Southern Coastal Plain of South Carolina, 1987

    Treesearch

    John B. Tansey

    1987-01-01

    This report highlights the principal findings of the sixth forest survey in the Southern Coastal plain of South Carolina. Fieldwork began in June 1986 and was completed in September 1986. Five previous surveys, completed in 1934, 1947, 1958, 1968, and 1978, provide statistics for measuring changes and trends over the past 53 years. The primary emphasis in this report...

  20. Forest statistics for South Florida, 1980

    Treesearch

    Raymond M. Sheffield; William A. Bechtold

    1981-01-01

    This report highlights the principal findings of the fifth inventory of Florida’s forests. Fieldwork began in September 1978 and was completed in May 1980. Four previous surveys, completed in 1936, 1949, 1959, and 1970, provide statistics for measuring changes and trends over the past 44 years. The primary emphasis in this report is on the changes and trends since 1970...

  1. Forest statistics for the Northern Coastal plain of North Carolina 1974

    Treesearch

    Richard L. Welch; Herbert A. Knight

    1974-01-01

    This report highlights the principal findings of the fourth inventory of the timber resource in the Northern Coastal Plain of North Carolina. The inventory was started in July 1973 and completed in May 1974. Three previous inventories, completed in 1937, 1955, and 1963, provide statistics for measuring changes and trends over the past 37 years. In this report, the...

  2. Forest statistics for the Coastal Plain of Virginia, 1991

    Treesearch

    Michael T. Thompson

    1991-01-01

    This report highlights the principal findings of the sixth forest survey of the Coastal Plain of Virginia. Field work began in October 1990 and was completed in March 1991. Five previous surveys, completed in 1940, 1956, 1966, 1976, and 1985, provide statistics for measuring changes and trends over the past 51 years. The primary emphasis in this report is on the...

  3. Forest statistics for the mountain region of North Carolina 1974

    Treesearch

    Noel D. Cost

    1974-01-01

    This report highlights the principal findings of the fourth inventory of the timber resource in the Mountain Region of North Carolina. The inventory was started in May 1974 and completed in September 1974. Three previous inventories, completed in 1938, 1955, and 1964, provide statistics for measuring changes and trends over the past 36 years. In this report, the primary...

  4. Forest statistics for the Southern Piedmont of Virginia, 1985

    Treesearch

    Mark J. Brown

    1985-01-01

    This report highlights the principal findings of the fifth forest survey in the Southern Piedmont of Virginia. Fieldwork began in March 1985 and was completed in July 1985. Four previous surveys, completed in 1940, 1957, 1965, and 1976, provide statistics for measuring changes and trends over the past 45 years. The primary emphasis in this report is on the changes and...

  5. Forest statistics for the mountains of North Carolina, 1990

    Treesearch

    Tony G. Johnson

    1991-01-01

    This report highlights the principal findings of the sixth forest survey of the Mountains of North Carolina. Field work began in August 1990 and was completed in November 1990. Five previous surveys, completed in 1938, 1955, 1964, 1974, and 1984, provide statistics for measuring changes and trends over the past 52 years. The primary emphasis in this report is on the...

  6. Forest statistics for the Southern Mountain region of Virginia, 1977

    Treesearch

    Raymond M. Sheffield

    1977-01-01

    This report highlights the principal findings of the fourth inventory of the timber resource in the Southern Mountain Region of Virginia. The inventory was started in December 1976 and completed in March 1977. Three previous inventories, completed in 1940, 1957, and 1966, provide statistics for measuring changes and trends over the past 37 years. In this report, the...

  7. Forest statistics for the Coastal Plain of Virginia, 1985

    Treesearch

    Mark J. Brown; Gerald C. Craver

    1985-01-01

    This report highlights the principal findings of the fifth forest survey in the Coastal Plain of Virginia. Fieldwork began in September 1984 and was completed in February 1985. Four previous surveys, completed in 1940, 1956, 1966, and 1976, provide statistics for measuring changes and trends over the past 45 years. The primary emphasis in this report is on the changes...

  8. Atomic-scale phase composition through multivariate statistical analysis of atom probe tomography data.

    PubMed

    Keenan, Michael R; Smentkowski, Vincent S; Ulfig, Robert M; Oltman, Edward; Larson, David J; Kelly, Thomas F

    2011-06-01

    We demonstrate for the first time that multivariate statistical analysis techniques can be applied to atom probe tomography data to estimate the chemical composition of a sample at the full spatial resolution of the atom probe in three dimensions. Whereas the raw atom probe data provide the specific identity of an atom at a precise location, the multivariate results can be interpreted in terms of the probabilities that an atom representing a particular chemical phase is situated there. When aggregated to the size scale of a single atom (∼0.2 nm), atom probe spectral-image datasets are huge and extremely sparse. In fact, the average spectrum will have somewhat less than one total count per spectrum due to imperfect detection efficiency. These conditions, under which the variance in the data is completely dominated by counting noise, test the limits of multivariate analysis, and an extensive discussion of how to extract the chemical information is presented. Efficient numerical approaches to performing principal component analysis (PCA) on these datasets, which may number hundreds of millions of individual spectra, are put forward, and it is shown that PCA can be computed in a few seconds on a typical laptop computer.
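
    As an illustration of the kind of computation involved, the sketch below runs a PCA-like factorization directly on a large, sparse count matrix. The data are synthetic and TruncatedSVD is used as a stand-in for the specialized PCA described in the abstract, so this is a sketch of the general approach, not the authors' method.

      # Minimal sketch: truncated SVD of a huge, sparse spectral-image matrix
      # (synthetic Poisson-like counts, far less than one count per channel on average).
      import numpy as np
      from scipy import sparse
      from sklearn.decomposition import TruncatedSVD

      rng = np.random.default_rng(0)
      n_spectra, n_channels = 100_000, 200     # rows = single-atom voxels, cols = spectral bins
      counts = sparse.random(n_spectra, n_channels, density=0.003, random_state=0,
                             data_rvs=lambda k: rng.poisson(1.0, k).astype(float) + 1.0).tocsr()

      # TruncatedSVD accepts sparse input, so the matrix is never densified.
      svd = TruncatedSVD(n_components=5, random_state=0)
      scores = svd.fit_transform(counts)       # per-spectrum component scores
      loadings = svd.components_               # spectral signature of each component
      print(scores.shape, loadings.shape, svd.explained_variance_ratio_.round(3))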

  9. Application Exercises Improve Transfer of Statistical Knowledge in Real-World Situations

    ERIC Educational Resources Information Center

    Daniel, Frances; Braasch, Jason L. G.

    2013-01-01

    The present research investigated whether real-world application exercises promoted students' abilities to spontaneously transfer statistical knowledge and to recognize the use of statistics in real-world contexts. Over the course of a semester of psychological statistics, two classes completed multiple application exercises designed to mimic…

  10. CIDR

    Science.gov Websites

    [Only website navigation text was captured for this record: CIDR site sections covering general information, software, posters, NIH program projects and statistics, QC statistics, completed projects, publications, and contact information.]

  11. Surgery residency curriculum examination scores predict future American Board of Surgery in-training examination performance.

    PubMed

    Webb, Travis P; Paul, Jasmeet; Treat, Robert; Codner, Panna; Anderson, Rebecca; Redlich, Philip

    2014-01-01

    A protected block curriculum (PBC) with postcurriculum examinations for all surgical residents has been provided to assure coverage of core curricular topics. Biannual assessment of resident competency will soon be required by the Next Accreditation System. To identify opportunities for early medical knowledge assessment and interventions, we examined whether performance in postcurriculum multiple-choice examinations (PCEs) is predictive of performance in the American Board of Surgery In-Training Examination (ABSITE) and clinical service competency assessments. Retrospective single-institutional education research study. Academic general surgery residency program. A total of 49 surgical residents. Data for PGY1 and PGY2 residents participating in the 2008 to 2012 PBC are included. Each resident completed 6 PCEs during each year. The results of the 6 examinations were correlated with percentage-correct ABSITE scores and clinical assessments based on the 6 Accreditation Council for Graduate Medical Education core competencies. Individual ABSITE performance was compared between PGY1 and PGY2. Statistical analysis included multivariate linear regression and bivariate Pearson correlations. A total of 49 residents completed the PGY1 PBC and 36 completed the PGY2 curriculum. Linear regression analysis of percentage-correct ABSITE and PCE scores demonstrated a statistically significant correlation between the PGY1 PCE 1 score and the subsequent PGY1 ABSITE score (p = 0.037, β = 0.299). Similarly, the PGY2 PCE 1 score predicted performance in the PGY2 ABSITE (p = 0.015, β = 0.383). The ABSITE scores correlated between PGY1 and PGY2 with statistical significance, r = 0.675, p = 0.001. Performance on the 6 Accreditation Council for Graduate Medical Education core competencies correlated between PGY1 and PGY2, r = 0.729, p = 0.001, but did not correlate with PCE scores during either year. Within a mature PBC, early performance in the PGY1 and PGY2 PCEs is predictive of performance in the respective ABSITE. This information can be used for formative assessment and early remediation of residents who are predicted to be at risk for poor performance in the ABSITE. Copyright © 2014 Association of Program Directors in Surgery. All rights reserved.
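
    A minimal sketch of the bivariate part of such an analysis, using synthetic scores rather than the study's data (the sample size and variable names are illustrative):

      # Correlating an early curriculum exam with a later in-training exam.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      pce1 = rng.normal(70, 10, size=49)                 # post-curriculum exam, percent correct
      absite = 0.3 * pce1 + rng.normal(50, 8, size=49)   # in-training exam, percent correct

      r, p = stats.pearsonr(pce1, absite)                # bivariate Pearson correlation
      fit = stats.linregress(pce1, absite)               # simple linear regression
      print(f"Pearson r = {r:.2f} (p = {p:.3f}); slope = {fit.slope:.2f}")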

  12. Factors that Affected Functional Outcome After a Delayed Excision and Split-Thickness Skin Graft on the Dorsal Side of Burned Hands.

    PubMed

    Shichinohe, Ryuji; Yamamoto, Yuhei; Kawashima, Kunihiro; Kimura, Chu; Ono, Kentaro; Horiuchi, Katsumi; Yoshida, Tetsunori; Murao, Naoki; Hayashi, Toshihiko; Funayama, Emi; Oyama, Akihiko; Furukawa, Hiroshi

    Early excision and skin grafting is the principal treatment for a burned hand, although there are occasions when it cannot be done, such as a severe general condition, delayed consultation, or the lack of a definitive assessment of burn depth. This study analyzes the factors that affected function after a delayed excision and skin graft for hands with a deep dermal burn. This study retrospectively evaluated 43 burned hands that required a delayed excision and split-thickness skin graft on the dorsal side. Cases were required to only have split-thickness skin grafting from the dorsum of the hand and fingers distally to at least the proximal interphalangeal joint at least 8 days after the injury. The hands were divided into two functional categories: Functional category A, normal or nearly normal joint movements, and functional category B, abnormal joint movements. Demographic data were assessed statistically by univariate analysis followed by multiple regression analysis with stepwise selection. A significant difference was observed between the groups in the number of days from grafting to complete wound healing of the graft site and in whether an escharotomy had been performed. These parameters were statistically significant predictors of functional category B. The functional outcome of a burned hand after a delayed excision and split-thickness skin graft on the dorsal side became degraded depending on the number of days from grafting to complete wound healing. Cases that underwent an escharotomy also showed deterioration in function.

  13. Cigarette smoking and health-promoting behaviours among tuberculosis patients in rural areas.

    PubMed

    Tsai, Shu-Lan; Lai, Chun-Liang; Chi, Miao-Ching; Chen, Mei-Yen

    2016-09-01

    To explore cigarette smoking and health-promoting behaviours among disadvantaged adults before their tuberculosis diagnosis and after their tuberculosis treatment. Although tuberculosis infection is associated with impaired immune function, healthy lifestyle habits can play a role in improving the immune system. However, limited research has explored the health-promoting behaviours and cigarette smoking habits among tuberculosis patients in Taiwan. A cross-sectional retrospective study with a convenience sample. This study was conducted between May 2013 and June 2014 with 123 patients at a rural district hospital in Chiayi County, Taiwan. Statistical analyses included descriptive statistics, univariate analysis and stepwise regression analysis. Tuberculosis tended to be associated with less education, male sex, malnutrition, cigarette smoking and unhealthy lifestyle habits before the tuberculosis diagnosis. The percentage of smokers decreased from 46.9% before to 30.2% after the tuberculosis diagnosis. Body mass index and health-promoting behaviours also significantly improved after tuberculosis treatment. After controlling for potential confounding factors, multivariate analysis identified chronic disease and completed treatment as significant factors that were associated with current health-promoting behaviours. A high prevalence of cigarette smoking and low levels of health-promoting behaviours were observed before the diagnosis and during or after completing tuberculosis treatment. This study's findings indicate the importance of promoting healthy lifestyle changes among tuberculosis patients; aggressive measures should be implemented immediately after the first diagnosis of tuberculosis. Furthermore, health promotion and smoking cessation programmes should be initiated in the general population to prevent activation of latent tuberculosis infection, and these programmes should specifically target men and rural residents. © 2016 John Wiley & Sons Ltd.

  14. Retrospective analysis of imaging techniques for treatment planning and monitoring of obliteration for gamma knife treatment of cerebral arteriovenous malformation.

    PubMed

    Amponsah, Kwame; Ellis, Thomas L; Chan, Michael D; Lovato, James F; Bourland, J Daniel; deGuzman, Allan F; Ekstrand, Kenneth E; Munley, Michael T; McMullen, Kevin P; Shaw, Edward G; Tatter, Stephen B

    2012-10-01

    It has been well established that Gamma Knife radiosurgery (GKS) is an effective treatment for brain arteriovenous malformations (AVMs). To evaluate complete obliteration rates for magnetic resonance imaging (MRI)-based GKS treatment planning performed with and without angiography and to conduct a preliminary assessment of the utility of using pulsed arterial spin labeling (PASL) magnetic resonance (MR) perfusion imaging to confirm complete obliteration. Forty-six patients were identified who had undergone GKS without embolization with a minimum follow-up of 2 years. One group was planned with integrated stereotactic angiography and MR (spoiled gradient recalled) images obtained on the day of GKS. A second technique avoided the risk of arteriography by using only axial MR images. Beginning in 2007, PASL MR perfusion imaging was routinely performed as a portion of the follow-up MRI to assess the restoration of normal blood flow of the nidus and surrounding area. The overall obliteration rate for the angiography/MRI group was 88.0% (29 of 33). Patients in the MRI-only group had an obliteration rate of 61.5% (8 of 13), with P=.092 with the Fisher exact test, which is not statistically significant. A Kaplan-Meier analysis was also not statistically significant (log rank test, P=.474). Four of 9 patients with incomplete obliteration on angiography also had shown residual abnormal blood flow on PASL imaging. This retrospective analysis shows that treatment planning technique used in GKS does not play a role in the eventual obliteration of treated AVMs. PASL may have potential in the evaluation of AVM obliteration.
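
    For illustration, the 2x2 comparison of obliteration rates quoted above (29/33 versus 8/13) can be run as a Fisher exact test; this is only a sketch of the test on the published counts, not the authors' code, and the exact p-value depends on software conventions:

      from scipy import stats

      table = [[29, 33 - 29],   # angiography + MRI planning: obliterated, not obliterated
               [8, 13 - 8]]     # MRI-only planning
      odds_ratio, p = stats.fisher_exact(table)
      print(f"odds ratio = {odds_ratio:.2f}, two-sided p = {p:.3f}")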

  15. Is there a relationship between periodontal disease and causes of death? A cross sectional study.

    PubMed

    Natto, Zuhair S; Aladmawy, Majdi; Alasqah, Mohammed; Papas, Athena

    2015-01-01

    The aim of this study was to evaluate whether there is any correlation between periodontal disease and mortality contributing factors, such as cardiovascular disease and diabetes mellitus, in the elderly population. A dental evaluation was performed by a single examiner at Tufts University dental clinics for 284 patients. Periodontal assessments were performed by probing with a manual UNC-15 periodontal probe to measure pocket depth and clinical attachment level (CAL) at 6 sites. Causes of death were abstracted from death certificates. Statistical analysis involved ANOVA, chi-square and multivariate logistic regression analysis. The demographics of the population sample indicated that most were female (except for diabetes mellitus), white, and married, had completed 13 years of education, and were 83 years old on average. CAL (continuous or dichotomous) and marital status attained statistical significance (p<0.05) in contingency table analysis (Chi-square for independence). Individuals with increased CAL were 2.16 times more likely (OR=2.16, 95% CI=1.47-3.17) to die due to CVD, and this effect persisted even after controlling for age, marital status, gender, race, and years of education (OR=2.03, 95% CI=1.35-3.03). CAL (continuous or dichotomous) was much higher among those who died due to diabetes mellitus or out of the state of Massachusetts. However, these results were not statistically significant. The same pattern was observed with pocket depth (continuous or dichotomous), but these results were not statistically significant either. CAL seems to be more sensitive to chronic diseases than pocket depth. Among those conditions, cardiovascular disease has the strongest effect.
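
    A minimal sketch of an adjusted logistic model reported as an odds ratio, using synthetic data generated so that the true odds ratio for CAL is roughly 2; this is not the study's dataset or code:

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(2)
      n = 284
      cal = rng.binomial(1, 0.5, n)                       # dichotomised clinical attachment level
      age = rng.normal(83, 5, n)
      logit_p = -1.0 + 0.77 * cal + 0.02 * (age - 83)     # exp(0.77) is roughly 2.2
      died_cvd = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

      X = sm.add_constant(np.column_stack([cal, age]))
      fit = sm.Logit(died_cvd, X).fit(disp=0)
      or_cal = np.exp(fit.params[1])                      # adjusted odds ratio for CAL
      ci_low, ci_high = np.exp(fit.conf_int()[1])
      print(f"adjusted OR = {or_cal:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")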

  16. Neurological Outcomes Following Suicidal Hanging: A Prospective Study of 101 Patients

    PubMed Central

    Jawaid, Mohammed Turab; Amalnath, S. Deepak; Subrahmanyam, D. K. S.

    2017-01-01

    Context: Survivors of suicidal hanging can have variable neurological outcomes – from complete recovery to irreversible brain damage. Literature on the neurological outcomes in these patients is confined to retrospective studies and case series. Hence, this prospective study was carried out. Aims: The aim is to study the neurological outcomes in suicidal hanging. Settings and Design: This was a prospective observational study carried out from July 2014 to July 2016. Subjects and Methods: Consecutive patients admitted to the emergency and medicine wards were included in the study. Details of the clinical and radiological findings, course in hospital and at 1 month postdischarge were analyzed. Statistical Analysis Used: Statistical analysis was performed using IBM SPSS advanced statistics 20.0 (SPSS Inc., Chicago, USA). Univariate analysis was performed using the Chi-square test for significance, and odds ratios were calculated. Results: Of the 101 patients, 6 died and 4 had residual neurological deficits. Cervical spine injury was seen in 3 patients. Interestingly, 39 patients could not remember the act of hanging (retrograde amnesia). Hypotension, pulmonary edema, Glasgow coma scale (GCS) score <8 at admission, need for mechanical ventilation, and cerebral edema on plain computed tomography were more frequent in those with amnesia than in those with normal memory, and these findings were statistically significant. Conclusions: The majority of patients recovered without any sequelae. Routine imaging of the cervical spine may not be warranted in all patients, even in those with poor GCS. Retrograde amnesia might be more common than previously believed, and further studies are needed to analyze this peculiar feature. PMID:28584409

  17. [Completeness of mortality statistics in Navarra, Spain].

    PubMed

    Moreno-Iribas, Conchi; Guevara, Marcela; Díaz-González, Jorge; Álvarez-Arruti, Nerea; Casado, Itziar; Delfrade, Josu; Larumbe, Emilia; Aguirre, Jesús; Floristán, Yugo

    2013-01-01

    Women in the region of Navarra, Spain, have one of the highest life expectancies at birth in Europe. The aim of this study is to assess the completeness of the official mortality statistics of Navarra in 2009 and the impact of the under-registration of deaths on life expectancy estimates. Comparison of the number of deaths in Navarra using the official statistics from the Instituto Nacional de Estadística (INE) and the data derived from a multiple-source case-finding: the electronic health record, Instituto Navarro de Medicina Legal and INE including data that they received late. A total of 5,249 deaths were identified, of which 103 were not included in the official mortality statistics. Taking into account only deaths that occurred in Spain, which are the only ones considered for the official statistics, the completeness was 98.4%. Estimated life expectancy at birth in 2009 decreased from 86.6 to 86.4 years in women and from 80.0 to 79.6 years in men, after correcting for the undercount. The results of this study ruled out the existence of significant under-registration in the official mortality statistics, confirming the exceptional longevity of women in Navarra, who are in the top position in Europe with a life expectancy at birth of 86.4 years.

  18. A Statistical Analysis of IrisCode and Its Security Implications.

    PubMed

    Kong, Adams Wai-Kin

    2015-03-01

    IrisCode has been used to gather iris data for 430 million people. Because of the huge impact of IrisCode, it is vital that it is completely understood. This paper first studies the relationship between bit probabilities and a mean of iris images (The mean of iris images is defined as the average of independent iris images.) and then uses the Chi-square statistic, the correlation coefficient and a resampling algorithm to detect statistical dependence between bits. The results show that the statistical dependence forms a graph with a sparse and structural adjacency matrix. A comparison of this graph with a graph whose edges are defined by the inner product of the Gabor filters that produce IrisCodes shows that partial statistical dependence is induced by the filters and propagates through the graph. Using this statistical information, the security risk associated with two patented template protection schemes that have been deployed in commercial systems for producing application-specific IrisCodes is analyzed. To retain high identification speed, they use the same key to lock all IrisCodes in a database. The belief has been that if the key is not compromised, the IrisCodes are secure. This study shows that even without the key, application-specific IrisCodes can be unlocked and that the key can be obtained through the statistical dependence detected.
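
    A minimal sketch of detecting dependence between two binary bit streams with a chi-square test plus a simple permutation (resampling) check; the bits are synthetic, not IrisCode data, and the test is only a generic stand-in for the paper's procedure:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      n = 10_000
      bit_a = rng.integers(0, 2, n)
      bit_b = np.where(rng.random(n) < 0.8, bit_a, rng.integers(0, 2, n))  # correlated bit

      table = np.array([[np.sum((bit_a == i) & (bit_b == j)) for j in (0, 1)] for i in (0, 1)])
      chi2, p, dof, _ = stats.chi2_contingency(table)

      # Permutation null: shuffling one stream should destroy any dependence.
      obs_r = np.corrcoef(bit_a, bit_b)[0, 1]
      null_r = np.array([np.corrcoef(rng.permutation(bit_a), bit_b)[0, 1] for _ in range(1000)])
      p_perm = np.mean(np.abs(null_r) >= abs(obs_r))
      print(f"chi-square = {chi2:.1f} (p = {p:.2g}); permutation p = {p_perm:.3f}")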

  19. Quantum signature of chaos and thermalization in the kicked Dicke model

    NASA Astrophysics Data System (ADS)

    Ray, S.; Ghosh, A.; Sinha, S.

    2016-09-01

    We study the quantum dynamics of the kicked Dicke model (KDM) in terms of the Floquet operator, and we analyze the connection between chaos and thermalization in this context. The Hamiltonian map is constructed by suitably taking the classical limit of the Heisenberg equation of motion to study the corresponding phase-space dynamics, which shows a crossover from regular to chaotic motion by tuning the kicking strength. The fixed-point analysis and calculation of the Lyapunov exponent (LE) provide us with a complete picture of the onset of chaos in phase-space dynamics. We carry out a spectral analysis of the Floquet operator, which includes a calculation of the quasienergy spacing distribution and structural entropy to show the correspondence to the random matrix theory in the chaotic regime. Finally, we analyze the thermodynamics and statistical properties of the bosonic sector as well as the spin sector, and we discuss how such a periodically kicked system relaxes to a thermalized state in accordance with the laws of statistical mechanics.
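
    As a toy illustration of the random-matrix benchmark mentioned above, the sketch below computes the spacing-ratio statistic for a Gaussian orthogonal ensemble matrix; it is a generic GOE example, not a simulation of the kicked Dicke model, and the reference values quoted in the comment are approximate:

      import numpy as np

      rng = np.random.default_rng(4)
      n = 2000
      a = rng.normal(size=(n, n))
      goe = (a + a.T) / np.sqrt(2)            # Gaussian orthogonal ensemble matrix
      levels = np.linalg.eigvalsh(goe)        # real eigenvalues, sorted ascending

      # Spacing-ratio statistic <r>, an unfolding-free proxy for the nearest-neighbour
      # spacing distribution: chaotic (GOE-like) spectra give <r> near 0.54, while
      # uncorrelated (Poisson-like) spectra give <r> near 0.39.
      s = np.diff(levels)
      r = np.minimum(s[:-1], s[1:]) / np.maximum(s[:-1], s[1:])
      print(f"mean spacing ratio <r> = {r.mean():.3f}")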

  20. Statistical and clustering analysis for disturbances: A case study of voltage dips in wind farms

    DOE PAGES

    Garcia-Sanchez, Tania; Gomez-Lazaro, Emilio; Muljadi, Eduard; ...

    2016-01-28

    This study proposes and evaluates an alternative statistical methodology to analyze a large number of voltage dips. For a given voltage dip, a set of lengths is first identified to characterize the root mean square (rms) voltage evolution along the disturbance, deduced from partial linearized time intervals and trajectories. Principal component analysis and K-means clustering processes are then applied to identify rms-voltage patterns and propose a reduced number of representative rms-voltage profiles from the linearized trajectories. This reduced group of averaged rms-voltage profiles enables the representation of a large number of disturbances, which offers a visual and graphical representation of their evolution along the events, aspects that were not previously considered in other contributions. The complete process is evaluated on real voltage dips collected in intense field-measurement campaigns carried out in a wind farm in Spain across different years. The results are included in this paper.
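
    A minimal sketch of the PCA-plus-K-means step on synthetic rms-voltage profiles; the dip shapes and cluster count are illustrative, not the wind farm measurements:

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(5)
      t = np.linspace(0, 1, 50)
      shallow = 1 - 0.2 * np.exp(-((t - 0.5) / 0.1) ** 2)          # shallow dip template
      deep = 1 - 0.6 * np.exp(-((t - 0.5) / 0.2) ** 2)             # deep dip template
      profiles = np.vstack([shallow + rng.normal(0, 0.02, (100, t.size)),
                            deep + rng.normal(0, 0.02, (100, t.size))])

      scores = PCA(n_components=3).fit_transform(profiles)         # low-dimensional patterns
      labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
      for k in range(2):
          centroid = profiles[labels == k].mean(axis=0)            # representative rms profile
          print(f"cluster {k}: {np.sum(labels == k)} dips, minimum rms = {centroid.min():.2f}")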

  1. Statistical Analysis of Microarray Data with Replicated Spots: A Case Study with Synechococcus WH8102

    PubMed Central

    Thomas, E. V.; Phillippy, K. H.; Brahamsha, B.; Haaland, D. M.; Timlin, J. A.; Elbourne, L. D. H.; Palenik, B.; Paulsen, I. T.

    2009-01-01

    Until recently microarray experiments often involved relatively few arrays with only a single representation of each gene on each array. A complete genome microarray with multiple spots per gene (spread out spatially across the array) was developed in order to compare the gene expression of a marine cyanobacterium and a knockout mutant strain in a defined artificial seawater medium. Statistical methods were developed for analysis in the special situation of this case study where there is gene replication within an array and where relatively few arrays are used, which can be the case with current array technology. Due in part to the replication within an array, it was possible to detect very small changes in the levels of expression between the wild type and mutant strains. One interesting biological outcome of this experiment is the indication of the extent to which the phosphorus regulatory system of this cyanobacterium affects the expression of multiple genes beyond those strictly involved in phosphorus acquisition. PMID:19404483

  2. Safety Assessment of Food and Feed from GM Crops in Europe: Evaluating EFSA's Alternative Framework for the Rat 90-day Feeding Study.

    PubMed

    Hong, Bonnie; Du, Yingzhou; Mukerji, Pushkor; Roper, Jason M; Appenzeller, Laura M

    2017-07-12

    Regulatory-compliant rodent subchronic feeding studies are compulsory regardless of a hypothesis to test, according to recent EU legislation for the safety assessment of whole food/feed produced from genetically modified (GM) crops containing a single genetic transformation event (European Union Commission Implementing Regulation No. 503/2013). The Implementing Regulation refers to guidelines set forth by the European Food Safety Authority (EFSA) for the design, conduct, and analysis of rodent subchronic feeding studies. The set of EFSA recommendations was rigorously applied to a 90-day feeding study in Sprague-Dawley rats. After study completion, the appropriateness and applicability of these recommendations were assessed using a battery of statistical analysis approaches including both retrospective and prospective statistical power analyses as well as variance-covariance decomposition. In the interest of animal welfare considerations, alternative experimental designs were investigated and evaluated in the context of informing the health risk assessment of food/feed from GM crops.

  3. Walking execution is not affected by divided attention in patients with multiple sclerosis with no disability, but there is a motor planning impairment.

    PubMed

    Nogueira, Leandro Alberto Calazans; Santos, Luciano Teixeira Dos; Sabino, Pollyane Galinari; Alvarenga, Regina Maria Papais; Thuler, Luiz Claudio Santos

    2013-08-01

    We analysed the cognitive influence on walking in multiple sclerosis (MS) patients, in the absence of clinical disability. A case-control study was conducted with 12 MS patients with no disability and 12 matched healthy controls. Subjects were referred for completion of a timed 10-m walk test and a 3D kinematic analysis. Participants were instructed to walk at a comfortable speed in a dual-task (arithmetic task) condition, and motor planning was measured by mental chronometry. Scores of walking speed and cadence showed no statistically significant differences between the groups in the three conditions. The dual-task condition showed an increase in the double support duration in both groups. Motor imagery analysis showed statistically significant differences between real and imagined walking in patients. MS patients with no disability did not show any influence of divided attention on walking execution. However, motor planning was overestimated as compared with real walking.

  4. Quantum signature of chaos and thermalization in the kicked Dicke model.

    PubMed

    Ray, S; Ghosh, A; Sinha, S

    2016-09-01

    We study the quantum dynamics of the kicked Dicke model (KDM) in terms of the Floquet operator, and we analyze the connection between chaos and thermalization in this context. The Hamiltonian map is constructed by suitably taking the classical limit of the Heisenberg equation of motion to study the corresponding phase-space dynamics, which shows a crossover from regular to chaotic motion by tuning the kicking strength. The fixed-point analysis and calculation of the Lyapunov exponent (LE) provide us with a complete picture of the onset of chaos in phase-space dynamics. We carry out a spectral analysis of the Floquet operator, which includes a calculation of the quasienergy spacing distribution and structural entropy to show the correspondence to the random matrix theory in the chaotic regime. Finally, we analyze the thermodynamics and statistical properties of the bosonic sector as well as the spin sector, and we discuss how such a periodically kicked system relaxes to a thermalized state in accordance with the laws of statistical mechanics.

  5. Comparing statistical and machine learning classifiers: alternatives for predictive modeling in human factors research.

    PubMed

    Carnahan, Brian; Meyer, Gérard; Kuntz, Lois-Ann

    2003-01-01

    Multivariate classification models play an increasingly important role in human factors research. In the past, these models have been based primarily on discriminant analysis and logistic regression. Models developed from machine learning research offer the human factors professional a viable alternative to these traditional statistical classification methods. To illustrate this point, two machine learning approaches--genetic programming and decision tree induction--were used to construct classification models designed to predict whether or not a student truck driver would pass his or her commercial driver license (CDL) examination. The models were developed and validated using the curriculum scores and CDL exam performances of 37 student truck drivers who had completed a 320-hr driver training course. Results indicated that the machine learning classification models were superior to discriminant analysis and logistic regression in terms of predictive accuracy. Actual or potential applications of this research include the creation of models that more accurately predict human performance outcomes.
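
    A minimal sketch comparing a decision tree with logistic regression on a synthetic pass/fail task; scikit-learn models stand in for the classifiers discussed, and the data are generated, not the CDL dataset:

      from sklearn.datasets import make_classification
      from sklearn.model_selection import cross_val_score
      from sklearn.tree import DecisionTreeClassifier
      from sklearn.linear_model import LogisticRegression

      X, y = make_classification(n_samples=200, n_features=6, n_informative=3,
                                 random_state=0)         # stand-in for curriculum scores
      tree_acc = cross_val_score(DecisionTreeClassifier(max_depth=3, random_state=0),
                                 X, y, cv=5).mean()
      logit_acc = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean()
      print(f"decision tree accuracy = {tree_acc:.2f}, logistic regression = {logit_acc:.2f}")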

  6. Bioconductor Workflow for Microbiome Data Analysis: from raw reads to community analyses

    PubMed Central

    Callahan, Ben J.; Sankaran, Kris; Fukuyama, Julia A.; McMurdie, Paul J.; Holmes, Susan P.

    2016-01-01

    High-throughput sequencing of PCR-amplified taxonomic markers (like the 16S rRNA gene) has enabled a new level of analysis of complex bacterial communities known as microbiomes. Many tools exist to quantify and compare abundance levels or OTU composition of communities in different conditions. The sequencing reads have to be denoised and assigned to the closest taxa from a reference database. Common approaches use a notion of 97% similarity and normalize the data by subsampling to equalize library sizes. In this paper, we show that statistical models allow more accurate abundance estimates. By providing a complete workflow in R, we enable the user to do sophisticated downstream statistical analyses, whether parametric or nonparametric. We provide examples of using the R packages dada2, phyloseq, DESeq2, ggplot2 and vegan to filter, visualize and test microbiome data. We also provide examples of supervised analyses using random forests and nonparametric testing using community networks and the ggnetwork package. PMID:27508062

  7. Statistical Analysis of Microarray Data with Replicated Spots: A Case Study with Synechococcus WH8102

    DOE PAGES

    Thomas, E. V.; Phillippy, K. H.; Brahamsha, B.; ...

    2009-01-01

    Until recently microarray experiments often involved relatively few arrays with only a single representation of each gene on each array. A complete genome microarray with multiple spots per gene (spread out spatially across the array) was developed in order to compare the gene expression of a marine cyanobacterium and a knockout mutant strain in a defined artificial seawater medium. Statistical methods were developed for analysis in the special situation of this case study where there is gene replication within an array and where relatively few arrays are used, which can be the case with current array technology. Due in part to the replication within an array, it was possible to detect very small changes in the levels of expression between the wild type and mutant strains. One interesting biological outcome of this experiment is the indication of the extent to which the phosphorus regulatory system of this cyanobacterium affects the expression of multiple genes beyond those strictly involved in phosphorus acquisition.

  8. Calculation of Weibull strength parameters and Batdorf flaw-density constants for volume- and surface-flaw-induced fracture in ceramics

    NASA Technical Reports Server (NTRS)

    Shantaram, S. Pai; Gyekenyesi, John P.

    1989-01-01

    The calculation of shape and scale parameters of the two-parameter Weibull distribution is described using the least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics with complete and censored samples. Detailed procedures are given for evaluating 90 percent confidence intervals for maximum likelihood estimates of shape and scale parameters, the unbiased estimates of the shape parameters, and the Weibull mean values and corresponding standard deviations. Furthermore, the necessary steps are described for detecting outliers and for calculating the Kolmogorov-Smirnov and the Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull distribution. The report also shows how to calculate the Batdorf flaw-density constants by using the Weibull distribution statistical parameters. The techniques described were verified with several example problems from the open literature, and were coded in the Structural Ceramics Analysis and Reliability Evaluation (SCARE) design program.
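
    A minimal sketch of the maximum-likelihood step with a Kolmogorov-Smirnov goodness-of-fit check, using synthetic strength data and scipy's Weibull parameterization rather than the SCARE program itself:

      from scipy import stats

      strengths = stats.weibull_min.rvs(c=10, scale=300, size=30, random_state=6)  # MPa, synthetic

      # floc=0 fixes the location parameter at zero, i.e. the usual two-parameter form.
      shape, loc, scale = stats.weibull_min.fit(strengths, floc=0)
      ks_stat, ks_p = stats.kstest(strengths, 'weibull_min', args=(shape, loc, scale))
      print(f"shape m = {shape:.1f}, scale = {scale:.0f} MPa, "
            f"KS statistic = {ks_stat:.3f} (p = {ks_p:.2f})")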

  9. Effective Thermal Inactivation of the Spores of Bacillus cereus Biofilms Using Microwave.

    PubMed

    Park, Hyong Seok; Yang, Jungwoo; Choi, Hee Jung; Kim, Kyoung Heon

    2017-07-28

    Microwave sterilization was performed to inactivate the spores of biofilms of Bacillus cereus involved in foodborne illness. The sterilization conditions, such as the amount of water and the operating temperature and treatment time, were optimized using statistical analysis based on 15 runs of experimental results designed by the Box-Behnken method. Statistical analysis showed that the optimal conditions for the inactivation of B. cereus biofilms were 14 ml of water, 108°C of temperature, and 15 min of treatment time. Interestingly, response surface plots showed that the amount of water is the most important factor for microwave sterilization under the present conditions. Complete inactivation by microwaves was achieved in 5 min, and the inactivation efficiency by microwave was obviously higher than that by conventional steam autoclave. Finally, confocal laser scanning microscopy images showed that the principal effect of microwave treatment was cell membrane disruption. Thus, this study can contribute to the development of a process to control food-associated pathogens.

  10. Using a portable sulfide monitor as a motivational tool: a clinical study.

    PubMed

    Uppal, Ranjit Singh; Malhotra, Ranjan; Grover, Vishakha; Grover, Deepak

    2012-01-01

    Bad breath has a significant impact on the daily life of those who suffer from it. Oral malodor may rank only behind dental caries and periodontal disease as a cause of patients' visits to the dentist. The aim of this study was to use a portable sulfide monitor as a motivational tool for encouraging patients towards better oral hygiene by correlating plaque scores with sulfide monitor scores, and by comparing the sulfide monitor scores before and after complete prophylaxis and 3 months after patient motivation. Thirty patients with chronic periodontitis, with a chief complaint of oral malodor, participated in this study. At the first visit, the plaque scores (P1) and sulfide monitor scores before (BCR1) and after complete oral prophylaxis (BCR2) were taken. The patients were then motivated towards better oral hygiene. After 3 months, plaque scores (P2) and sulfide monitor scores (BCR3) were recorded again. Statistical analysis was done using SPSS (Statistical Package for the Social Sciences), and a paired-sample t test was performed. A statistically significant reduction in sulfide monitor scores was reported after the complete oral prophylaxis and 3 months after patient motivation. Plaque scores were significantly reduced after a period of 3 months. Plaque scores and breathchecker scores were positively correlated. The intensity of oral malodor was positively correlated with the plaque scores. The portable sulfide monitor was efficacious in motivating patients towards better oral hygiene.

  11. Motivational interviewing and intimate partner violence: a randomized trial.

    PubMed

    Saftlas, Audrey F; Harland, Karisa K; Wallis, Anne B; Cavanaugh, Joseph; Dickey, Penny; Peek-Asa, Corinne

    2014-02-01

    To determine if motivational interviewing (MI) improves self-efficacy (primary outcome), depressive symptoms (secondary outcome), and stage-of-readiness-to-change (secondary outcome) among women in abusive relationships. Randomized controlled trial among women who experienced intimate partner violence in a current relationship over the past 12 months. Subjects were recruited from two family planning clinics (December 2007 to May 2010). The intervention included an initial face-to-face session and three telephone sessions administered 1-, 2-, and 4-months postenrollment, each using MI to identify personal goals. Controls were referred to community-based resources. Outcomes were measured by self-administered questionnaires before randomization and 6 months later. Modified intent-to-treat analyses of completed participants were conducted using multivariate analysis of variance for continuous outcomes and polytomous logistic regression for categorical outcomes. Three hundred six eligible women were enrolled (recruitment rate = 64%); 204 completed the 6-month follow-up (completion rate = 67%). Depressive symptoms decreased to a greater extent in MI than referral women (P = .07). Self-efficacy and stage-of-readiness-to-change increased more in MI than referral women, but these differences were not statistically significant. With a lower than projected sample size, our findings did not achieve statistical significance at the 5% level but suggest a beneficial effect of the MI intervention on reducing depressive symptoms. Copyright © 2014 Elsevier Inc. All rights reserved.

  12. Robot-assisted walking training for individuals with Parkinson's disease: a pilot randomized controlled trial.

    PubMed

    Sale, Patrizio; De Pandis, Maria Francesca; Le Pera, Domenica; Sova, Ivan; Cimolin, Veronica; Ancillao, Andrea; Albertini, Giorgio; Galli, Manuela; Stocchi, Fabrizio; Franceschini, Marco

    2013-05-24

    Over recent years, the introduction of robotic technologies into Parkinson's disease rehabilitation settings has progressed from concept to reality. However, the benefit of robotic training remains elusive. This pilot randomized controlled observer trial is aimed at investigating the feasibility, the effectiveness and the efficacy of new end-effector robot training in people with mild Parkinson's disease. Design. Pilot randomized controlled trial. Robot training was feasible, acceptable, safe, and the participants completed 100% of the prescribed training sessions. A statistically significant improvement in gait index was found in favour of the EG (T0 versus T1). In particular, the statistical analysis of the primary outcome (gait speed) using the Friedman test showed statistically significant improvements for the EG (p = 0.0195). Friedman tests of step length, left (p = 0.0195) and right (p = 0.0195), and stride length, left (p = 0.0078) and right (p = 0.0195), also showed statistically significant gains. No statistically significant improvements in the CG were found. Robot training is a feasible and safe form of rehabilitative exercise for cognitively intact people with mild PD. This original approach can contribute to increasing short-term lower limb motor recovery in idiopathic PD patients. The focus on gait recovery is a further characteristic that makes this research relevant to clinical practice. On the whole, the simplicity of treatment, the lack of side effects, and the positive results from patients support the recommendation to extend the use of this treatment. Further investigation regarding the long-term effectiveness of robot training is warranted. ClinicalTrials.gov NCT01668407.
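
    A minimal sketch of a Friedman test across three repeated assessments, using synthetic gait-speed values rather than trial data:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      t0 = rng.normal(0.8, 0.1, 10)           # gait speed (m/s) at baseline
      t1 = t0 + rng.normal(0.1, 0.05, 10)     # after training
      t2 = t1 + rng.normal(0.0, 0.05, 10)     # follow-up
      chi2, p = stats.friedmanchisquare(t0, t1, t2)
      print(f"Friedman chi-square = {chi2:.2f}, p = {p:.4f}")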

  13. An Evaluation of CPRA (Cost Performance Report Analysis) Estimate at Completion Techniques Based Upon AFWAL (Air Force Wright Aeronautical Laboratories) Cost/Schedule Control System Criteria Data

    DTIC Science & Technology

    1985-09-01

    [Only extraction fragments were captured for this record: table-of-contents entries (C/SCSC terms and definitions; the Cost Performance Report Analysis (CPRA) program; descriptions of CPRA terms and formulas), part of an F-test comparing regression error sums of squares between estimate-at-completion models, and rows of Table B.4, a general linear test for EAC1 and EAC5.]

  14. Contribution of artificial intelligence to the knowledge of prognostic factors in Hodgkin's lymphoma.

    PubMed

    Buciński, Adam; Marszałł, Michał Piotr; Krysiński, Jerzy; Lemieszek, Andrzej; Załuski, Jerzy

    2010-07-01

    Hodgkin's lymphoma is one of the most curable malignancies and most patients achieve a lasting complete remission. In this study, artificial neural network (ANN) analysis was shown to provide significant factors with regard to 5-year recurrence after lymphoma treatment. Data from 114 patients treated for Hodgkin's disease were available for evaluation and comparison. A total of 31 variables were subjected to ANN analysis. The ANN approach as an advanced multivariate data processing method was shown to provide objective prognostic data. Some of these prognostic factors are consistent or even identical to the factors evaluated earlier by other statistical methods.

  15. Completion and Attrition Rates for Apprentices and Trainees, 2016. Australian Vocational Education and Training Statistics

    ERIC Educational Resources Information Center

    National Centre for Vocational Education Research (NCVER), 2017

    2017-01-01

    This publication presents completion and attrition rates for apprentices and trainees using three different methodologies: (1) contract completion and attrition rates: based on the outcomes of contracts of training; (2) individual completion rates: based on contract completion rates and adjusted for factors representing average recommencements by…

  16. Completion and Attrition Rates for Apprentices and Trainees 2014. Australian Vocational Education and Training Statistics

    ERIC Educational Resources Information Center

    National Centre for Vocational Education Research (NCVER), 2015

    2015-01-01

    This publication presents completion and attrition rates for apprentices and trainees using three different methodologies: (1) contract completion and attrition rates: based on the outcomes of contracts of training; (2) individual completion rates: based on contract completion rates and adjusted for factors representing average recommencements by…

  17. Statistical analysis and handling of missing data in cluster randomized trials: a systematic review.

    PubMed

    Fiero, Mallorie H; Huang, Shuang; Oren, Eyal; Bell, Melanie L

    2016-02-09

    Cluster randomized trials (CRTs) randomize participants in groups, rather than as individuals and are key tools used to assess interventions in health research where treatment contamination is likely or if individual randomization is not feasible. Two potential major pitfalls exist regarding CRTs, namely handling missing data and not accounting for clustering in the primary analysis. The aim of this review was to evaluate approaches for handling missing data and statistical analysis with respect to the primary outcome in CRTs. We systematically searched for CRTs published between August 2013 and July 2014 using PubMed, Web of Science, and PsycINFO. For each trial, two independent reviewers assessed the extent of the missing data and method(s) used for handling missing data in the primary and sensitivity analyses. We evaluated the primary analysis and determined whether it was at the cluster or individual level. Of the 86 included CRTs, 80 (93%) trials reported some missing outcome data. Of those reporting missing data, the median percent of individuals with a missing outcome was 19% (range 0.5 to 90%). The most common way to handle missing data in the primary analysis was complete case analysis (44, 55%), whereas 18 (22%) used mixed models, six (8%) used single imputation, four (5%) used unweighted generalized estimating equations, and two (2%) used multiple imputation. Fourteen (16%) trials reported a sensitivity analysis for missing data, but most assumed the same missing data mechanism as in the primary analysis. Overall, 67 (78%) trials accounted for clustering in the primary analysis. High rates of missing outcome data are present in the majority of CRTs, yet handling missing data in practice remains suboptimal. Researchers and applied statisticians should carry out appropriate missing data methods, which are valid under plausible assumptions in order to increase statistical power in trials and reduce the possibility of bias. Sensitivity analysis should be performed, with weakened assumptions regarding the missing data mechanism to explore the robustness of results reported in the primary analysis.
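
    A minimal sketch contrasting complete-case analysis with model-based imputation of a missing-at-random outcome; the data are synthetic, and scikit-learn's IterativeImputer stands in for the imputation methods discussed in the review:

      import numpy as np
      from sklearn.experimental import enable_iterative_imputer  # noqa: F401
      from sklearn.impute import IterativeImputer

      rng = np.random.default_rng(8)
      n = 500
      covariate = rng.normal(size=n)
      outcome = 2.0 + 0.5 * covariate + rng.normal(scale=0.5, size=n)

      # Make roughly 20% of outcomes missing, depending on the covariate (missing at random).
      outcome_obs = outcome.copy()
      outcome_obs[(covariate > 0) & (rng.random(n) < 0.4)] = np.nan

      complete_case_mean = np.nanmean(outcome_obs)       # biased under this missingness mechanism
      imputed = IterativeImputer(random_state=0).fit_transform(
          np.column_stack([covariate, outcome_obs]))
      print(f"true mean = {outcome.mean():.2f}, complete-case = {complete_case_mean:.2f}, "
            f"after imputation = {imputed[:, 1].mean():.2f}")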

  18. Forest statistics for the Southern Coastal Plain of North Carolina 1973

    Treesearch

    Noel D. Cost

    1973-01-01

    This report highlights the principal findings of the fourth inventory of the timber resource in the Southern Coastal Plain of North Carolina. The inventory was started in November 1972 and completed in August 1973. Three previous inventories, completed in 1937, 1952, and 1962, provide statistics for measuring changes and trends over the past 36 years. In this...

  19. Forest statistics for the Northern Coastal Plain of South Carolina, 1986

    Treesearch

    John B. Tansey

    1987-01-01

    This report highlights the principal findings of the sixth forest survey in the Northern Coastal Plain of South Carolina. Fieldwork began in April 1986 and was completed in July 1986. Five previous surveys, completed in 1936, 1947, 1958, 1968, and 1978, provide statistics for measuring changes and trends over the past 50 years. The primary emphasis in this report is on...

  20. The Effects of Tank Crew Turbulence on Tank Gunnery Performance

    DTIC Science & Technology

    1978-09-01

    complete. Crewmen’s responses were converted to mouths for all itqms and tabulated for analysis. Because data was tabulated to two digits a maximum...two- digit data tabulation, mean and standard deviation statistics are somewhat conserva- tive for items 8, 9, and 10. There were 14-1S% of the TCs who...Benjamin Harrison. ATTN Libary I HQOA (DAMA-ARI I USAPACDC. Ft SBenjamuun Haritson, ATTN; ATCP-IHR I HOCIA OAPE HRE PO) IUSA Comrm- Elect Sch

  1. FORTRAN 4 programs for the extraction of potential well parameters from the energy dependence of total elastic scattering cross sections

    NASA Technical Reports Server (NTRS)

    Labudde, R. A.

    1972-01-01

    An attempt has been made to keep the programs as subroutine oriented as possible. Usually only the main programs are directly concerned with the problem of total cross sections. In particular the subroutines POLFIT, BILINR, GASS59/MAXLIK, SYMOR, MATIN, STUDNT, DNTERP, DIFTAB, FORDIF, EPSALG, REGFAL and ADSIMP are completely general, and are concerned only with the problems of numerical analysis and statistics. Each subroutine is independently documented.

  2. Effects of remedial grouting on the ground-water flow system at Red Rock Dam near Pella, Iowa

    USGS Publications Warehouse

    Linhart, S. Mike; Schaap, Bryan D.

    2001-01-01

    Hydrographs, statistical analysis of waterlevel data, and water-chemistry data suggest that underseepage on the northeast side of the dam has been reduced but not completely eliminated. Some areas appear to have been affected to a greater degree and for a longer period of time than other areas. Future monitoring of water levels, water chemistry, and stable isotopes can aid in the evaluation of the long-term effectiveness of remedial grouting.

  3. Exploring Factors Related to Completion of an Online Undergraduate-Level Introductory Statistics Course

    ERIC Educational Resources Information Center

    Zimmerman, Whitney Alicia; Johnson, Glenn

    2017-01-01

    Data were collected from 353 online undergraduate introductory statistics students at the beginning of a semester using the Goals and Outcomes Associated with Learning Statistics (GOALS) instrument and an abbreviated form of the Statistics Anxiety Rating Scale (STARS). Data included a survey of expected grade, expected time commitment, and the…

  4. AGR-1 Thermocouple Data Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeff Einerson

    2012-05-01

    This report documents an effort to analyze measured and simulated data obtained in the Advanced Gas Reactor (AGR) fuel irradiation test program conducted in the INL's Advanced Test Reactor (ATR) to support the Next Generation Nuclear Plant (NGNP) R&D program. The work follows up on a previous study (Pham and Einerson, 2010), in which statistical analysis methods were applied for AGR-1 thermocouple data qualification. The present work exercises the idea that, while recognizing uncertainties inherent in physics and thermal simulations of the AGR-1 test, results of the numerical simulations can be used in combination with the statistical analysis methods to further improve qualification of measured data. Additionally, the combined analysis of measured and simulation data can generate insights about simulation model uncertainty that can be useful for model improvement. This report also describes an experimental control procedure to maintain fuel target temperature in the future AGR tests using regression relationships that include simulation results. The report is organized into four chapters. Chapter 1 introduces the AGR Fuel Development and Qualification program, the AGR-1 test configuration and test procedure, an overview of AGR-1 measured data, and an overview of the physics and thermal simulation, including modeling assumptions and uncertainties. A brief summary of the statistical analysis methods developed in Pham and Einerson (2010) for AGR-1 measured data qualification within the NGNP Data Management and Analysis System (NDMAS) is also included for completeness. Chapters 2-3 describe and discuss cases in which the combined use of experimental and simulation data is realized. A set of issues associated with measurement and modeling uncertainties resulting from the combined analysis is identified. This includes demonstration that such a combined analysis led to important insights for reducing uncertainty in the presentation of AGR-1 measured data (Chapter 2) and the interpretation of simulation results (Chapter 3). The statistics-based, simulation-aided experimental control procedure described for the future AGR tests is developed and demonstrated in Chapter 4. The procedure for controlling the target fuel temperature (capsule peak or average) is based on regression functions of thermocouple readings and other relevant parameters, accounting for possible changes in both physical and thermal conditions and in instrument performance.

  5. Simultaneous Analysis and Quality Assurance for Diffusion Tensor Imaging

    PubMed Central

    Lauzon, Carolyn B.; Asman, Andrew J.; Esparza, Michael L.; Burns, Scott S.; Fan, Qiuyun; Gao, Yurui; Anderson, Adam W.; Davis, Nicole; Cutting, Laurie E.; Landman, Bennett A.

    2013-01-01

    Diffusion tensor imaging (DTI) enables non-invasive, cyto-architectural mapping of in vivo tissue microarchitecture through voxel-wise mathematical modeling of multiple magnetic resonance imaging (MRI) acquisitions, each differently sensitized to water diffusion. DTI computations are fundamentally estimation processes and are sensitive to noise and artifacts. Despite widespread adoption in the neuroimaging community, maintaining consistent DTI data quality remains challenging given the propensity for patient motion, artifacts associated with fast imaging techniques, and the possibility of hardware changes/failures. Furthermore, the quantity of data acquired per voxel, the non-linear estimation process, and numerous potential use cases complicate traditional visual data inspection approaches. Currently, quality inspection of DTI data has relied on visual inspection and individual processing in DTI analysis software programs (e.g. DTIPrep, DTI-studio). However, recent advances in applied statistical methods have yielded several different metrics to assess noise level, artifact propensity, quality of tensor fit, variance of estimated measures, and bias in estimated measures. To date, these metrics have been largely studied in isolation. Herein, we select complementary metrics for integration into an automatic DTI analysis and quality assurance pipeline. The pipeline completes in 24 hours, stores statistical outputs, and produces a graphical summary quality analysis (QA) report. We assess the utility of this streamlined approach for empirical quality assessment on 608 DTI datasets from pediatric neuroimaging studies. The efficiency and accuracy of quality analysis using the proposed pipeline is compared with quality analysis based on visual inspection. The unified pipeline is found to save a statistically significant amount of time (over 70%) while improving the consistency of QA between a DTI expert and a pool of research associates. Projection of QA metrics to a low dimensional manifold reveal qualitative, but clear, QA-study associations and suggest that automated outlier/anomaly detection would be feasible. PMID:23637895

  6. Application of computer-aided diagnosis (CAD) in MR-mammography (MRM): do we really need whole lesion time curve distribution analysis?

    PubMed

    Baltzer, Pascal Andreas Thomas; Renz, Diane M; Kullnig, Petra E; Gajda, Mieczyslaw; Camara, Oumar; Kaiser, Werner A

    2009-04-01

    The identification of the most suspect enhancing part of a lesion is regarded as a major diagnostic criterion in dynamic magnetic resonance mammography. Computer-aided diagnosis (CAD) software allows the semi-automatic analysis of the kinetic characteristics of complete enhancing lesions, providing additional information about lesion vasculature. The diagnostic value of this information has not yet been quantified. Consecutive patients from routine diagnostic studies (1.5 T, 0.1 mmol gadopentetate dimeglumine, dynamic gradient-echo sequences at 1-minute intervals) were analyzed prospectively using CAD. Dynamic sequences were processed and reduced to a parametric map. Curve types were classified by initial signal increase (not significant, intermediate, and strong) and the delayed time course of signal intensity (continuous, plateau, and washout). Lesion enhancement was measured using CAD. The most suspect curve, the curve-type distribution percentage, and combined dynamic data were compared. Statistical analysis included logistic regression analysis and receiver-operating characteristic analysis. Fifty-one patients with 46 malignant and 44 benign lesions were enrolled. On receiver-operating characteristic analysis, the most suspect curve showed diagnostic accuracy of 76.7 +/- 5%. In comparison, the curve-type distribution percentage demonstrated accuracy of 80.2 +/- 4.9%. Combined dynamic data had the highest diagnostic accuracy (84.3 +/- 4.2%). These differences did not achieve statistical significance. With appropriate cutoff values, sensitivity and specificity, respectively, were found to be 80.4% and 72.7% for the most suspect curve, 76.1% and 83.6% for the curve-type distribution percentage, and 78.3% and 84.5% for both parameters. The integration of whole-lesion dynamic data tends to improve specificity; however, this trend did not reach statistical significance.
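
    The sketch below illustrates the curve-typing scheme described in the abstract (an initial-rise class plus a delayed-phase class); it is not the CAD vendor's algorithm, and the thresholds and the example signal are assumptions chosen only for illustration.

      import numpy as np

      def classify_curve(signal, rise_strong=1.0, rise_weak=0.5, plateau_band=0.10):
          """Classify a dynamic enhancement curve (baseline first, one value per time point)."""
          baseline, early = signal[0], signal[1]
          initial_rise = (early - baseline) / baseline
          if initial_rise < rise_weak:
              initial = "not significant"
          elif initial_rise < rise_strong:
              initial = "intermediate"
          else:
              initial = "strong"
          late_change = (signal[-1] - early) / early        # delayed-phase behaviour
          if late_change > plateau_band:
              delayed = "continuous"
          elif late_change < -plateau_band:
              delayed = "washout"
          else:
              delayed = "plateau"
          return initial, delayed

      print(classify_curve(np.array([100.0, 210.0, 205.0, 185.0, 170.0])))   # -> ('strong', 'washout')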

  7. A Mokken scale analysis of the peer physical examination questionnaire.

    PubMed

    Vaughan, Brett; Grace, Sandra

    2018-01-01

    Peer physical examination (PPE) is a teaching and learning strategy utilised in most health profession education programs. Perceptions of participating in PPE have been described in the literature, focusing on areas of the body students are willing, or unwilling, to examine. A small number of questionnaires exist to evaluate these perceptions; however, none has described the measurement properties that may allow them to be used longitudinally. The present study undertook a Mokken scale analysis of the Peer Physical Examination Questionnaire (PPEQ) to evaluate its dimensionality and structure when used with Australian osteopathy students. Students enrolled in Year 1 of the osteopathy programs at Victoria University (Melbourne, Australia) and Southern Cross University (Lismore, Australia) were invited to complete the PPEQ prior to their first practical skills examination class. R, an open-source statistics program, was used to generate the descriptive statistics and perform a Mokken scale analysis. Mokken scale analysis is a non-parametric item response theory approach that is used to cluster items measuring a latent construct. Initial analysis suggested the PPEQ did not form a single scale. Further analysis identified three subscales: 'comfort', 'concern', and 'professionalism and education'. The properties of each subscale suggested they were unidimensional with variable internal structures. The 'comfort' subscale was the strongest of the three identified. All subscales demonstrated acceptable reliability estimation statistics (McDonald's omega > 0.75), supporting the calculation of a sum score for each subscale. The subscales identified are consistent with the literature. The 'comfort' subscale may be useful to longitudinally evaluate student perceptions of PPE. Further research is required to evaluate changes with PPE and the utility of the questionnaire with other health profession education programs.

  8. Simultaneous analysis and quality assurance for diffusion tensor imaging.

    PubMed

    Lauzon, Carolyn B; Asman, Andrew J; Esparza, Michael L; Burns, Scott S; Fan, Qiuyun; Gao, Yurui; Anderson, Adam W; Davis, Nicole; Cutting, Laurie E; Landman, Bennett A

    2013-01-01

    Diffusion tensor imaging (DTI) enables non-invasive, cyto-architectural mapping of in vivo tissue microarchitecture through voxel-wise mathematical modeling of multiple magnetic resonance imaging (MRI) acquisitions, each differently sensitized to water diffusion. DTI computations are fundamentally estimation processes and are sensitive to noise and artifacts. Despite widespread adoption in the neuroimaging community, maintaining consistent DTI data quality remains challenging given the propensity for patient motion, artifacts associated with fast imaging techniques, and the possibility of hardware changes/failures. Furthermore, the quantity of data acquired per voxel, the non-linear estimation process, and numerous potential use cases complicate traditional visual data inspection approaches. Currently, quality inspection of DTI data has relied on visual inspection and individual processing in DTI analysis software programs (e.g. DTIPrep, DTI-studio). However, recent advances in applied statistical methods have yielded several different metrics to assess noise level, artifact propensity, quality of tensor fit, variance of estimated measures, and bias in estimated measures. To date, these metrics have been largely studied in isolation. Herein, we select complementary metrics for integration into an automatic DTI analysis and quality assurance pipeline. The pipeline completes in 24 hours, stores statistical outputs, and produces a graphical summary quality analysis (QA) report. We assess the utility of this streamlined approach for empirical quality assessment on 608 DTI datasets from pediatric neuroimaging studies. The efficiency and accuracy of quality analysis using the proposed pipeline are compared with quality analysis based on visual inspection. The unified pipeline is found to save a statistically significant amount of time (over 70%) while improving the consistency of QA between a DTI expert and a pool of research associates. Projection of QA metrics to a low dimensional manifold reveals qualitative, but clear, QA-study associations and suggests that automated outlier/anomaly detection would be feasible.

  9. A national streamflow network gap analysis

    USGS Publications Warehouse

    Kiang, Julie E.; Stewart, David W.; Archfield, Stacey A.; Osborne, Emily B.; Eng, Ken

    2013-01-01

    The U.S. Geological Survey (USGS) conducted a gap analysis to evaluate how well the USGS streamgage network meets a variety of needs, focusing on the ability to calculate various statistics at locations that have streamgages (gaged) and that do not have streamgages (ungaged). This report presents the results of analysis to determine where there are gaps in the network of gaged locations, how accurately desired statistics can be calculated with a given length of record, and whether the current network allows for estimation of these statistics at ungaged locations. The analysis indicated that there is variability across the Nation’s streamflow data-collection network in terms of the spatial and temporal coverage of streamgages. In general, the Eastern United States has better coverage than the Western United States. The arid Southwestern United States, Alaska, and Hawaii were observed to have the poorest spatial coverage, using the dataset assembled for this study. Except in Hawaii, these areas also tended to have short streamflow records. Differences in hydrology lead to differences in the uncertainty of statistics calculated in different regions of the country. Arid and semiarid areas of the Central and Southwestern United States generally exhibited the highest levels of interannual variability in flow, leading to larger uncertainty in flow statistics. At ungaged locations, information can be transferred from nearby streamgages if there is sufficient similarity between the gaged watersheds and the ungaged watersheds of interest. Areas where streamgages exhibit high correlation are most likely to be suitable for this type of information transfer. The areas with the most highly correlated streamgages appear to coincide with mountainous areas of the United States. Lower correlations are found in the Central United States and coastal areas of the Southeastern United States. Information transfer from gaged basins to ungaged basins is also most likely to be successful when basin attributes show high similarity. At the scale of the analysis completed in this study, the attributes of basins upstream of USGS streamgages cover the full range of basin attributes observed at potential locations of interest fairly well. Some exceptions included very high or very low elevation areas and very arid areas.
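
    An illustrative sketch of two quantities such a gap analysis relies on: the interannual variability (coefficient of variation) of annual flows at a gaged site, and the correlation between a gaged site and a nearby site to which information might be transferred. The data are simulated and stand in for no actual USGS records.

      import numpy as np

      rng = np.random.default_rng(1)
      years = 30
      gaged = rng.lognormal(mean=3.0, sigma=0.6, size=years)         # annual mean flows at a gaged site
      neighbour = gaged * np.exp(0.2 * rng.standard_normal(years))   # nearby site, partially correlated

      cv = gaged.std(ddof=1) / gaged.mean()                          # interannual coefficient of variation
      r = np.corrcoef(np.log(gaged), np.log(neighbour))[0, 1]        # between-site correlation (log space)
      print(f"interannual CV = {cv:.2f}, between-site correlation = {r:.2f}")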

  10. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses.

    PubMed

    Dinov, Ivo D; Sanchez, Juana; Christou, Nicolas

    2008-01-01

    Technology-based instruction represents a recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools for communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present results on the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We have used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment for individual classes; however, pooling the results across all courses and sections, SOCR effects on the treatment groups were exceptionally robust and significant. Coupling these findings with a clear decrease in the variance of the quantitative examination measures in the treatment groups indicates that employing technology, like SOCR, in a sound pedagogical and scientific manner enhances the students' overall understanding and suggests better long-term knowledge retention.

  11. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses

    PubMed Central

    Dinov, Ivo D.; Sanchez, Juana; Christou, Nicolas

    2009-01-01

    Technology-based instruction represents a recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools for communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present results on the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We have used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment for individual classes; however, pooling the results across all courses and sections, SOCR effects on the treatment groups were exceptionally robust and significant. Coupling these findings with a clear decrease in the variance of the quantitative examination measures in the treatment groups indicates that employing technology, like SOCR, in a sound pedagogical and scientific manner enhances the students’ overall understanding and suggests better long-term knowledge retention. PMID:19750185

  12. Identifying reprioritization response shift in a stroke caregiver population: a comparison of missing data methods.

    PubMed

    Sajobi, Tolulope T; Lix, Lisa M; Singh, Gurbakhshash; Lowerison, Mark; Engbers, Jordan; Mayo, Nancy E

    2015-03-01

    Response shift (RS) is an important phenomenon that influences the assessment of longitudinal changes in health-related quality of life (HRQOL) studies. Given that RS effects are often small, missing data due to attrition or item non-response can contribute to failure to detect RS effects. Since missing data are often encountered in longitudinal HRQOL data, effective strategies to deal with missing data are important to consider. This study aims to compare the effects of different imputation methods on the detection of reprioritization RS in the HRQOL of caregivers of stroke survivors. Data were from a Canadian multi-center longitudinal study of caregivers of stroke survivors over a one-year period. The Stroke Impact Scale physical function score at baseline, with a cutoff of 75, was used to measure patient stroke severity for the reprioritization RS analysis. Mean imputation, likelihood-based expectation-maximization imputation, and multiple imputation methods were compared in test procedures based on changes in relative importance weights to detect RS in SF-36 domains over a 6-month period. Monte Carlo simulation methods were used to compare the statistical powers of relative importance test procedures for detecting RS in incomplete longitudinal data under different missing data mechanisms and imputation methods. Of the 409 caregivers, 15.9% and 31.3% had missing data at baseline and 6 months, respectively. There were no statistically significant changes in relative importance weights on any of the domains when complete-case analysis was adopted. However, statistically significant changes were detected on the physical functioning and/or vitality domains when mean imputation or EM imputation was adopted. There were also statistically significant changes in relative importance weights for the physical functioning, mental health, and vitality domains when the multiple imputation method was adopted. Our simulations revealed that relative importance test procedures were least powerful under the complete-case analysis method and most powerful when a mean imputation or multiple imputation method was adopted for missing data, regardless of the missing data mechanism and proportion of missing data. Test procedures based on relative importance measures are sensitive to the type and amount of missing data and imputation method. Relative importance test procedures based on mean imputation and multiple imputation are recommended for detecting RS in incomplete data.
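
    A toy sketch of why the handling of missing follow-up scores matters when testing for change over time; it uses only complete-case analysis and simple mean imputation, not the study's relative-importance test procedures, and all data are simulated.

      import numpy as np

      rng = np.random.default_rng(2)
      n = 400
      baseline = rng.normal(50, 10, n)
      followup = baseline + rng.normal(1.5, 8, n)        # true mean change of 1.5 points
      missing = rng.random(n) < 0.3                      # roughly 30% lost to follow-up
      followup_obs = np.where(missing, np.nan, followup)

      # complete-case analysis: drop anyone with a missing follow-up score
      cc_change = followup_obs[~missing] - baseline[~missing]

      # mean imputation: replace missing follow-ups with the observed follow-up mean
      imputed = np.where(missing, np.nanmean(followup_obs), followup_obs)
      mi_change = imputed - baseline

      print(len(cc_change), round(cc_change.mean(), 2))  # smaller sample, less power
      print(len(mi_change), round(mi_change.mean(), 2))  # full sample, but imputation can bias the change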

  13. Common statistical and research design problems in manuscripts submitted to high-impact psychiatry journals: what editors and reviewers want authors to know.

    PubMed

    Harris, Alex H S; Reeder, Rachelle; Hyun, Jenny K

    2009-10-01

    Journal editors and statistical reviewers are often in the difficult position of catching serious problems in submitted manuscripts after the research is conducted and data have been analyzed. We sought to learn from editors and reviewers of major psychiatry journals what common statistical and design problems they most often find in submitted manuscripts and what they wished to communicate to authors regarding these issues. Our primary goal was to facilitate communication between journal editors/reviewers and researchers/authors and thereby improve the scientific and statistical quality of research and submitted manuscripts. Editors and statistical reviewers of 54 high-impact psychiatry journals were surveyed to learn what statistical or design problems they encounter most often in submitted manuscripts. Respondents completed the survey online. The authors analyzed survey text responses using content analysis procedures to identify major themes related to commonly encountered statistical or research design problems. Editors and reviewers (n=15) who handle manuscripts from 39 different high-impact psychiatry journals responded to the survey. The most commonly cited problems regarded failure to map statistical models onto research questions, improper handling of missing data, not controlling for multiple comparisons, not understanding the difference between equivalence and difference trials, and poor controls in quasi-experimental designs. The scientific quality of psychiatry research and submitted reports could be greatly improved if researchers became sensitive to, or sought consultation on frequently encountered methodological and analytic issues.

  14. The five elements and Chinese-American mortality.

    PubMed

    Smith, Gary

    2006-01-01

    D. P. Phillips, T. E. Ruth, and L. M. Wagner (1993) reported that 1969-1990 California mortality data show that Chinese Americans are particularly vulnerable to diseases that Chinese astrology and traditional Chinese medicine associate with their birth years. For example, because fire is associated with the heart, a Chinese person born in a fire year (such as 1937) is more likely to die of heart disease than is a Chinese person born in a nonfire year. However, many diseases were excluded from this study, some diseases that were included have ambiguous links to birth years, and the statistical tests were indirect. A more complete statistical analysis and independent California mortality data for the years 1960-1968 and 1991-2002 did not replicate the original results. Copyright 2006 APA, all rights reserved.

  15. Fatigue in Arthritis: A Multidimensional Phenomenon with Impact on Quality of Life : Fatigue and Quality of Life in Arthritis.

    PubMed

    Alikari, Victoria; Sachlas, Athanasios; Giatrakou, Stavroula; Stathoulis, John; Fradelos, Evagelos; Theofilou, Paraskevi; Lavdaniti, Maria; Zyga, Sofia

    2017-01-01

    An important factor which influences the quality of life of patients with arthritis is the fatigue they experience. The purpose of this study was to assess the relationship between fatigue and quality of life among patients with osteoarthritis and rheumatoid arthritis. Between January 2015 and March 2015, 179 patients with osteoarthritis and rheumatoid arthritis completed the Fatigue Assessment Scale and the Missoula-VITAS Quality of Life Index-15 (MVQoLI-15). The study was conducted in Rehabilitation Centers located in the area of Peloponnese, Greece. Data related to sociodemographic characteristics and individual medical histories were recorded. Statistical analysis was performed using IBM SPSS Statistics version 19. The analysis revealed no statistically significant correlation between fatigue and quality of life, either in the total sample or among patients with osteoarthritis (r = -0.159; p = 0.126) or rheumatoid arthritis. However, there was a statistically significant relationship between some aspects of fatigue and dimensions of quality of life. Osteoarthritis patients had a statistically significantly lower MVQoLI-15 score than rheumatoid arthritis patients (13.73 ± 1.811 vs 14.61 ± 1.734) and a lower FAS score than rheumatoid patients (26.14 ± 3.668 vs 29.94 ± 3.377) (p-value < 0.001). The finding that different aspects of fatigue may affect dimensions of quality of life may help health care professionals by supporting early treatment of fatigue in order to benefit quality of life.

  16. Use of a statistical model of the whole femur in a large scale, multi-model study of femoral neck fracture risk.

    PubMed

    Bryan, Rebecca; Nair, Prasanth B; Taylor, Mark

    2009-09-18

    Interpatient variability is often overlooked in orthopaedic computational studies due to the substantial challenges involved in sourcing and generating large numbers of bone models. A statistical model of the whole femur incorporating both geometric and material property variation was developed as a potential solution to this problem. The statistical model was constructed using principal component analysis, applied to 21 individual computed tomography scans. To test the ability of the statistical model to generate realistic, unique, finite element (FE) femur models, it was used as a source of 1000 femurs to drive a study on femoral neck fracture risk. The study simulated the impact of an oblique fall to the side, a scenario known to account for a large proportion of hip fractures in the elderly and to have a lower fracture load than alternative loading approaches. FE model generation, application of subject-specific loading and boundary conditions, FE processing and post-processing of the solutions were completed automatically. The generated models were within the bounds of the training data used to create the statistical model, with a high mesh quality, and were able to be used directly by the FE solver without remeshing. The results indicated that 28 of the 1000 femurs were at highest risk of fracture. Closer analysis revealed the percentage of cortical bone in the proximal femur to be a crucial differentiator between the failed and non-failed groups. The likely fracture location was indicated to be intertrochanteric. Comparison to previous computational, clinical and experimental work revealed support for these findings.
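
    A hedged sketch of the statistical-model construction named above: principal component analysis of stacked training vectors, followed by sampling of mode weights to generate a new instance. The toy data merely stand in for the 21 CT-derived geometry/material vectors; nothing here reproduces the authors' meshes or material mapping.

      import numpy as np

      rng = np.random.default_rng(3)
      n_train, n_dof = 21, 300                       # 21 training femurs, 300 combined geometry/material DOFs
      training = rng.normal(size=(n_train, n_dof))   # stand-in for stacked training vectors

      mean_shape = training.mean(axis=0)
      centered = training - mean_shape
      U, s, Vt = np.linalg.svd(centered, full_matrices=False)   # principal components via SVD
      eigvals = s**2 / (n_train - 1)                             # variance captured by each mode

      k = 5                                                      # retained modes (assumption)
      weights = rng.standard_normal(k) * np.sqrt(eigvals[:k])    # sample within the training variability
      new_instance = mean_shape + weights @ Vt[:k]               # one generated femur vector
      print(new_instance.shape)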

  17. FieldTrip: Open source software for advanced analysis of MEG, EEG, and invasive electrophysiological data.

    PubMed

    Oostenveld, Robert; Fries, Pascal; Maris, Eric; Schoffelen, Jan-Mathijs

    2011-01-01

    This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow experimental neuroscientists to analyze experimental data. It includes algorithms for simple and advanced analysis, such as time-frequency analysis using multitapers, source reconstruction using dipoles, distributed sources and beamformers, connectivity analysis, and nonparametric statistical permutation tests at the channel and source level. The implementation as a toolbox allows the user to perform elaborate and structured analyses of large data sets using the MATLAB command line and batch scripting. Furthermore, users and developers can easily extend the functionality and implement new algorithms. The modular design facilitates reuse in other software packages.
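
    A tiny sketch of the nonparametric permutation-test idea mentioned above, reduced to a single comparison between two conditions; this is not FieldTrip's MATLAB implementation (which is cluster-based and multichannel), and the trial data are invented.

      import numpy as np

      rng = np.random.default_rng(7)
      cond_a = rng.normal(1.0, 1.0, 20)        # e.g. mean amplitude per trial, condition A (invented)
      cond_b = rng.normal(0.4, 1.0, 20)        # condition B (invented)
      observed = cond_a.mean() - cond_b.mean()

      pooled = np.concatenate([cond_a, cond_b])
      n_perm, count = 5000, 0
      for _ in range(n_perm):
          rng.shuffle(pooled)                  # exchange condition labels
          diff = pooled[:20].mean() - pooled[20:].mean()
          if abs(diff) >= abs(observed):
              count += 1
      print("permutation p =", (count + 1) / (n_perm + 1))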

  18. CTTITEM: SAS macro and SPSS syntax for classical item analysis.

    PubMed

    Lei, Pui-Wa; Wu, Qiong

    2007-08-01

    This article describes the functions of a SAS macro and an SPSS syntax that produce common statistics for conventional item analysis including Cronbach's alpha, item difficulty index (p-value or item mean), and item discrimination indices (D-index, point biserial and biserial correlations for dichotomous items and item-total correlation for polytomous items). These programs represent an improvement over the existing SAS and SPSS item analysis routines in terms of completeness and user-friendliness. To promote routine evaluations of item qualities in instrument development of any scale, the programs are available at no charge for interested users. The program codes along with a brief user's manual that contains instructions and examples are downloadable from suen.ed.psu.edu/-pwlei/plei.htm.
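
    The same classical item statistics can be sketched outside SAS/SPSS; the snippet below computes Cronbach's alpha, item difficulty (p-values), and a corrected item-total (point-biserial) correlation on hypothetical dichotomous responses. It is illustrative only and is not the authors' macro or syntax.

      import numpy as np

      rng = np.random.default_rng(4)
      scores = rng.integers(0, 2, size=(100, 10)).astype(float)   # 100 examinees x 10 dichotomous items

      def cronbach_alpha(x):
          k = x.shape[1]
          return k / (k - 1) * (1 - x.var(axis=0, ddof=1).sum() / x.sum(axis=1).var(ddof=1))

      difficulty = scores.mean(axis=0)                 # item p-values (proportion answering correctly)

      def item_rest_correlation(x, item):
          rest = x.sum(axis=1) - x[:, item]            # corrected total with the item removed
          return np.corrcoef(x[:, item], rest)[0, 1]   # point-biserial for a dichotomous item

      print(cronbach_alpha(scores), difficulty[0], item_rest_correlation(scores, 0))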

  19. A three-step approach for the derivation and validation of high-performing predictive models using an operational dataset: congestive heart failure readmission case study.

    PubMed

    AbdelRahman, Samir E; Zhang, Mingyuan; Bray, Bruce E; Kawamoto, Kensaku

    2014-05-27

    The aim of this study was to propose an analytical approach to develop high-performing predictive models for congestive heart failure (CHF) readmission using an operational dataset with incomplete records and changing data over time. Our analytical approach involves three steps: pre-processing, systematic model development, and risk factor analysis. For pre-processing, variables that were absent in >50% of records were removed. Moreover, the dataset was divided into a validation dataset and derivation datasets, which were separated into three temporal subsets based on changes to the data over time. For systematic model development, using the different temporal datasets and the remaining explanatory variables, the models were developed by combining the use of various (i) statistical analyses to explore the relationships between the validation and the derivation datasets; (ii) adjustment methods for handling missing values; (iii) classifiers; (iv) feature selection methods; and (v) discretization methods. We then selected the best derivation dataset and the models with the highest predictive performance. For risk factor analysis, factors in the highest-performing predictive models were analyzed and ranked using (i) statistical analyses of the best derivation dataset, (ii) feature rankers, and (iii) a newly developed algorithm to categorize risk factors as being strong, regular, or weak. The analysis dataset consisted of 2,787 CHF hospitalizations at University of Utah Health Care from January 2003 to June 2013. In this study, we used the complete-case analysis and mean-based imputation adjustment methods; the wrapper subset feature selection method; and four ranking strategies based on information gain, gain ratio, symmetrical uncertainty, and wrapper subset feature evaluators. The best-performing models resulted from the use of a complete-case analysis derivation dataset combined with the Class-Attribute Contingency Coefficient discretization method and a voting classifier which averaged the results of multinomial logistic regression and voting feature intervals classifiers. Of 42 final model risk factors, discharge disposition, discretized age, and indicators of anemia were the most significant. This model achieved a c-statistic of 86.8%. The proposed three-step analytical approach enhanced predictive model performance for CHF readmissions. It could potentially be leveraged to improve predictive model performance in other areas of clinical medicine.

  20. A Primer on Receiver Operating Characteristic Analysis and Diagnostic Efficiency Statistics for Pediatric Psychology: We Are Ready to ROC

    PubMed Central

    2014-01-01

    Objective To offer a practical demonstration of receiver operating characteristic (ROC) analyses, diagnostic efficiency statistics, and their application to clinical decision making using a popular parent checklist to assess for potential mood disorder. Method Secondary analyses of data from 589 families seeking outpatient mental health services, completing the Child Behavior Checklist and semi-structured diagnostic interviews. Results Internalizing Problems raw scores discriminated mood disorders significantly better than did age- and gender-normed T scores, or an Affective Problems score. Internalizing scores <8 had a diagnostic likelihood ratio <0.3, and scores >30 had a diagnostic likelihood ratio of 7.4. Conclusions This study illustrates a series of steps in defining a clinical problem, operationalizing it, selecting a valid study design, and using ROC analyses to generate statistics that support clinical decisions. The ROC framework offers important advantages for clinical interpretation. Appendices include sample scripts using SPSS and R to check assumptions and conduct ROC analyses. PMID:23965298

  1. A primer on receiver operating characteristic analysis and diagnostic efficiency statistics for pediatric psychology: we are ready to ROC.

    PubMed

    Youngstrom, Eric A

    2014-03-01

    To offer a practical demonstration of receiver operating characteristic (ROC) analyses, diagnostic efficiency statistics, and their application to clinical decision making using a popular parent checklist to assess for potential mood disorder. Secondary analyses of data from 589 families seeking outpatient mental health services, completing the Child Behavior Checklist and semi-structured diagnostic interviews. Internalizing Problems raw scores discriminated mood disorders significantly better than did age- and gender-normed T scores, or an Affective Problems score. Internalizing scores <8 had a diagnostic likelihood ratio <0.3, and scores >30 had a diagnostic likelihood ratio of 7.4. This study illustrates a series of steps in defining a clinical problem, operationalizing it, selecting a valid study design, and using ROC analyses to generate statistics that support clinical decisions. The ROC framework offers important advantages for clinical interpretation. Appendices include sample scripts using SPSS and R to check assumptions and conduct ROC analyses.
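
    A brief sketch of the diagnostic-efficiency quantities discussed above: sensitivity and specificity at a cutoff, and interval diagnostic likelihood ratios for extreme score bands. The score distributions are invented and only loosely mimic the reported sample sizes; this is not the paper's analysis.

      import numpy as np

      rng = np.random.default_rng(5)
      scores_pos = rng.normal(22, 9, 120)    # checklist scores, children with a mood disorder (hypothetical)
      scores_neg = rng.normal(12, 8, 469)    # checklist scores, children without (hypothetical)

      def sens_spec(cutoff):
          return np.mean(scores_pos >= cutoff), np.mean(scores_neg < cutoff)

      def interval_lr(lo, hi):
          """Diagnostic likelihood ratio for scores falling in [lo, hi)."""
          p_pos = np.mean((scores_pos >= lo) & (scores_pos < hi))
          p_neg = np.mean((scores_neg >= lo) & (scores_neg < hi))
          return p_pos / p_neg

      print(sens_spec(18), interval_lr(30, np.inf), interval_lr(-np.inf, 8))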

  2. P-Value Club: Teaching Significance Level on the Dance Floor

    ERIC Educational Resources Information Center

    Gray, Jennifer

    2010-01-01

    Courses: Beginning research methods and statistics courses, as well as advanced communication courses that require reading research articles and completing research projects involving statistics. Objective: Students will understand the difference between significant and nonsignificant statistical results based on p-value.

  3. Phylogenomic Analysis and Dynamic Evolution of Chloroplast Genomes in Salicaceae

    PubMed Central

    Huang, Yuan; Wang, Jun; Yang, Yongping; Fan, Chuanzhu; Chen, Jiahui

    2017-01-01

    Chloroplast genomes of plants are highly conserved in both gene order and gene content. Analysis of the whole chloroplast genome is known to provide much more informative DNA sites and thus generates high resolution for plant phylogenies. Here, we report the complete chloroplast genomes of three Salix species in the family Salicaceae. The phylogeny of Salicaceae inferred from complete chloroplast genomes is generally consistent with previous studies but resolved with higher statistical support. Incongruences of phylogeny, however, are observed in the genus Populus, which most likely result from homoplasy. By comparing three Salix chloroplast genomes with the published chloroplast genomes of other Salicaceae species, we demonstrate that the synteny and length of chloroplast genomes in Salicaceae are highly conserved but experienced dynamic evolution among species. We identify seven positively selected chloroplast genes in Salicaceae, which might be related to the adaptive evolution of Salicaceae species. Comparative chloroplast genome analysis within the family also indicates that some chloroplast genes have been lost or have become pseudogenes, suggesting that these chloroplast genes were horizontally transferred to the nuclear genome. Based on the complete nuclear genome sequences of two Salicaceae species, we identify that the entire chloroplast genome has been transferred and integrated into the nuclear genome of the P. trichocarpa reference individual at least once. This observation, along with the presence of large nuclear plastid DNAs (NUPTs) and of NUPTs containing multiple chloroplast genes in their original chloroplast order, favors the DNA-mediated hypothesis of organelle-to-nucleus DNA transfer. Overall, the phylogenomic analysis using complete chloroplast genomes clearly elucidates the phylogeny of Salicaceae. The identification of positively selected chloroplast genes and dynamic chloroplast-to-nucleus gene transfers in Salicaceae provides resources to better understand the successful adaptation of Salicaceae species. PMID:28676809

  4. Estimation Model of Spacecraft Parameters and Cost Based on a Statistical Analysis of COMPASS Designs

    NASA Technical Reports Server (NTRS)

    Gerberich, Matthew W.; Oleson, Steven R.

    2013-01-01

    The Collaborative Modeling for Parametric Assessment of Space Systems (COMPASS) team at Glenn Research Center has performed integrated system analysis of conceptual spacecraft mission designs since 2006 using a multidisciplinary concurrent engineering process. The set of completed designs was archived in a database, to allow for the study of relationships between design parameters. Although COMPASS uses a parametric spacecraft costing model, this research investigated the possibility of using a top-down approach to rapidly estimate the overall vehicle costs. This paper presents the relationships between significant design variables, including breakdowns of dry mass, wet mass, and cost. It also develops a model for a broad estimate of these parameters through basic mission characteristics, including the target location distance, the payload mass, the duration, the delta-v requirement, and the type of mission, propulsion, and electrical power. Finally, this paper examines the accuracy of this model in regards to past COMPASS designs, with an assessment of outlying spacecraft, and compares the results to historical data of completed NASA missions.

  5. Histological study of the effect of some irrigating solutions on bacterial endotoxin in dogs.

    PubMed

    Silva, Léa Assed Bezerra da; Leonardo, Mario Roberto; Assed, Sada; Tanomaru Filho, Mário

    2004-01-01

    The aim of this study was to evaluate, histopathologically, the effectiveness of mechanical preparation of root canals using different irrigating solutions in dog teeth filled with LPS after pulpectomy. A total of 120 root canals of 6 mongrel dogs were filled with a solution of LPS after pulpectomy. The irrigating solutions used were saline, 1, 2.5, and 5% sodium hypochlorite, and 2% chlorhexidine. No irrigation was used in the control group. The animals were sacrificed after 60 days and the teeth were fixed and demineralized. Subsequently, serial 6-microm sections were stained with hematoxylin and eosin and Mallory's trichrome for histopathological analysis and Brown-Brenn for verification of bacterial contamination. Analysis showed that the inflammatory infiltrate was statistically less intense in the groups in which the root canals were irrigated with 5% sodium hypochlorite and 2% chlorhexidine. However, none of the irrigating solutions completely inactivated the harmful effects of LPS. Mechanical preparation associated with different irrigating solutions did not completely inactivate LPS.

  6. 29 CFR 1904.42 - Requests from the Bureau of Labor Statistics for data.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    (a) Basic requirement. If you receive a Survey of Occupational Injuries and Illnesses Form from the Bureau of Labor Statistics (BLS), or a BLS designee, you must promptly complete the form...

  7. Influence of different base thicknesses on maxillary complete denture processing: linear and angular graphic analysis on the movement of artificial teeth.

    PubMed

    Mazaro, José Vitor Quinelli; Gennari Filho, Humberto; Vedovatto, Eduardo; Amoroso, Andressa Paschoal; Pellizzer, Eduardo Piza; Zavanelli, Adriana Cristina

    2011-09-01

    The purpose of this study was to compare the dental movement that occurs during the processing of maxillary complete dentures with 3 different base thicknesses, using 2 investment methods and microwave polymerization. A sample of 42 denture models was randomly divided into 6 groups (n = 7), with base thicknesses of 1.25, 2.50, and 3.75 mm and gypsum or silicone flask investment. Points were demarcated on the distal surface of the second molars and on the back of the gypsum cast at the alveolar ridge level to allow linear and angular measurement using AutoCAD software. The data were subjected to two-way analysis of variance, with Tukey and Fisher post hoc tests. Angular analysis of the methods and their interactions showed a statistically significant difference (P = 0.023) when the magnitudes of molar inclination were compared. Tooth movement was greater for thin-based prostheses (1.25 mm; -0.234) than for thick ones (3.75 mm; 0.2395), with the two moving in opposite directions. Prosthesis investment with silicone (0.053) showed greater vertical change compared with the gypsum investment (0.032). There were differences between the points of analysis, demonstrating that the changes were not symmetric. All groups evaluated showed change in the position of artificial teeth after processing. The complete denture with a thin base (1.25 mm) and silicone investment showed the worst results, whereas an intermediate thickness (2.50 mm) was demonstrated to be ideal for the denture base.

  8. Completion of the Circle of Willis Varies by Gender, Age, and Indication for Computed Tomography Angiography.

    PubMed

    Zaninovich, Orel A; Ramey, Wyatt L; Walter, Christina M; Dumont, Travis M

    2017-10-01

    The circle of Willis (CoW) is the foremost anastomosis and blood distribution center of the brain. Its effectiveness depends on its completion and the size and patency of its vessels. Gender-related and age-related anatomic variations in the CoW may play an important role in the pathogenesis of cerebrovascular diseases. In this study, we analyzed computed tomography angiograms (CTAs) to assess for differences in CoW completion related to gender, age, and indication for CTA. A total of 834 CTAs were retrospectively analyzed for all CoW vessels to compare the incidence of complete CoW and variation frequency based on gender, age, and indication. The incidence of complete CoW was 37.1% overall. CoW completion showed a statistically significant decrease with increasing age for all age groups in both men (47.0%, 29.4%, 18.8%) and women (59.1%, 44.2%, 30.9%). Completion was greater in women (43.8%) than in men (31.2%) overall and for all age groups. These gender differences were all statistically significant except for the 18-39 years age group. The most frequent of the 28 CoW variations were absent posterior communicating artery (PCOM) bilaterally (17.1%), right PCOM (15.3%), and left PCOM (10.9%). Ischemic stroke and the 18-39 years age group of hemorrhagic stroke showed a statistically significant reduction in completion relative to trauma. The incidence of complete CoW is likely greater in women for all age groups and likely decreases with age in both genders. The most frequently absent vessel is likely the PCOM, either unilaterally or bilaterally. Completion may play a role in ischemic stroke and a subset of patients with hemorrhagic stroke. Copyright © 2017 Elsevier Inc. All rights reserved.
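
    A comparison of completion proportions between groups of this kind is typically tested with a chi-square test of independence; the sketch below shows the mechanics on hypothetical counts chosen only to resemble the reported percentages, not the study's data.

      from scipy.stats import chi2_contingency

      # rows: women, men; columns: complete CoW, incomplete CoW (hypothetical counts)
      table = [[180, 231],
               [132, 291]]
      chi2, p, dof, expected = chi2_contingency(table)
      print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")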

  9. Machines that go "ping" may improve balance but may not improve mobility or reduce risk of falls: a systematic review.

    PubMed

    Dennett, Amy M; Taylor, Nicholas F

    2015-01-01

    To determine the effectiveness of computer-based electronic devices that provide feedback in improving mobility and balance and reducing falls. Randomized controlled trials were searched from the earliest available date to August 2013. Standardized mean differences were used to complete meta-analyses, with statistical heterogeneity being described with the I-squared statistic. The GRADE approach was used to summarize the level of evidence for each completed meta-analysis. Risk of bias for individual trials was assessed with the (Physiotherapy Evidence Database) PEDro scale. Thirty trials were included. There was high-quality evidence that computerized devices can improve dynamic balance in people with a neurological condition compared with no therapy. There was low-to-moderate-quality evidence that computerized devices have no significant effect on mobility, falls efficacy and falls risk in community-dwelling older adults, and people with a neurological condition compared with physiotherapy. There is high-quality evidence that computerized devices that provide feedback may be useful in improving balance in people with neurological conditions compared with no therapy, but there is a lack of evidence supporting more meaningful changes in mobility and falls risk.
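
    A short sketch of the two meta-analytic quantities named above: a standardized mean difference per trial and the I-squared heterogeneity statistic across trials. Trial means, standard deviations, and sample sizes are invented for illustration.

      import numpy as np

      def smd(m1, sd1, n1, m2, sd2, n2):
          """Standardized mean difference (Cohen's d) and its approximate variance."""
          pooled_sd = np.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
          d = (m1 - m2) / pooled_sd
          var_d = (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))
          return d, var_d

      trials = np.array([smd(5.1, 2.0, 30, 4.0, 2.1, 30),
                         smd(4.4, 1.8, 25, 4.1, 1.9, 26),
                         smd(6.0, 2.4, 40, 4.2, 2.3, 41)])
      d, v = trials[:, 0], trials[:, 1]

      w = 1 / v                                        # fixed-effect inverse-variance weights
      pooled = np.sum(w * d) / np.sum(w)
      Q = np.sum(w * (d - pooled) ** 2)                # Cochran's Q
      I2 = max(0.0, (Q - (len(d) - 1)) / Q) * 100      # I-squared, in percent
      print(round(pooled, 2), round(I2, 1))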

  10. The Effect of Web-Based Education on Patient Satisfaction, Consultation Time and Conversion to Surgery.

    PubMed

    Boudreault, David J; Li, Chin-Shang; Wong, Michael S

    2016-01-01

    To evaluate the effect of web-based education on (1) patient satisfaction, (2) consultation times, and (3) conversion to surgery. A retrospective review of 767 new patient consultations seen by 4 university-based plastic surgeons was conducted between May 2012 and August 2013 to determine the effect a web-based education program has on patient satisfaction and consultation time. A standard 5-point Likert scale survey completed at the end of the consultation was used to assess satisfaction with their experience. Consult times were obtained from the electronic medical record. All analyses were done with Statistical Analysis Software version 9.2 (SAS Inc., Cary, NC). A P value less than 0.05 was considered statistically significant. Those who viewed the program before their consultation were more satisfied with their consultation compared to those who did not (satisfaction scores, mean ± SD: 1.13 ± 0.44 vs 1.36 ± 0.74; P = 0.02) and more likely to rate their experience as excellent (92% vs 75%; P = 0.02). Contrary to the claims of Emmi Solutions, patients who viewed the educational program before consultation trended toward longer visits compared to those who did not (mean time ± SD: 54 ± 26 vs 50 ± 35 minutes; P = 0.10). More patients who completed the program went on to undergo a procedure (44% vs 37%; P = 0.16), but this difference was not statistically significant. Viewing web-based educational programs significantly improved plastic surgery patients' satisfaction with their consultation, but patients who viewed the program also trended toward longer consultation times. Although there was an increase in converting to surgical procedures, this did not reach statistical significance.

  11. Lesion registration for longitudinal disease tracking in an imaging informatics-based multiple sclerosis eFolder

    NASA Astrophysics Data System (ADS)

    Ma, Kevin; Liu, Joseph; Zhang, Xuejun; Lerner, Alex; Shiroishi, Mark; Amezcua, Lilyana; Liu, Brent

    2016-03-01

    We have designed and developed a multiple sclerosis eFolder system for patient data storage, image viewing, and automatic lesion quantification results stored in DICOM-SR format. The web-based system aims to be integrated in DICOM-compliant clinical and research environments to aid clinicians in patient treatments and data analysis. The system needs to quantify lesion volumes, identify and register lesion locations to track shifts in volume and quantity of lesions in a longitudinal study. In order to perform lesion registration, we have developed a brain warping and normalizing methodology using Statistical Parametric Mapping (SPM) MATLAB toolkit for brain MRI. Patients' brain MR images are processed via SPM's normalization processes, and the brain images are analyzed and warped according to the tissue probability map. Lesion identification and contouring are completed by neuroradiologists, and lesion volume quantification is completed by the eFolder's CAD program. Lesion comparison results in longitudinal studies show key growth and active regions. The results display successful lesion registration and tracking over a longitudinal study. Lesion change results are graphically represented in the web-based user interface, and users are able to correlate patient progress and changes in the MRI images. The completed lesion and disease tracking tool would enable the eFolder to provide complete patient profiles, improve the efficiency of patient care, and perform comprehensive data analysis through an integrated imaging informatics system.

  12. Teacher Efficacy of Secondary Special Education Science Teachers

    NASA Astrophysics Data System (ADS)

    Bonton, Celeste

    Students with disabilities are a specific group of the student population that are guaranteed rights that allow them to receive a free and unbiased education in an environment with their non-disabled peers. The importance of this study relates to providing students with disabilities with the opportunity to receive instruction from the most efficient and prepared educators. The purpose of this study is to determine how specific factors influence special education belief systems. In particular, educators who provide science instruction in whole group or small group classrooms in a large metropolitan area in Georgia possess specific beliefs about their ability to provide meaningful instruction. Data were collected through a correlational study completed by educators via an online survey website. The SEBEST quantitative survey instrument was used on a medium sample size (approximately 120 teachers) in a large metropolitan school district. Shapiro-Wilk and Mann-Whitney tests were selected to determine whether any relationship exists between preservice training and the perceived self-efficacy of secondary special education teachers in the content area of science. The results of this study showed that special education teachers in the content area of science have a higher perceived self-efficacy if they have completed an alternative certification program. Other variables tested did not show any statistical significance. Further research can be centered on the analysis of actual teacher efficacy, year-end teacher efficacy measurements, teacher stipends, increased recruitment, and special education teachers of multiple content areas.

  13. A Complete Color Normalization Approach to Histopathology Images Using Color Cues Computed From Saturation-Weighted Statistics.

    PubMed

    Li, Xingyu; Plataniotis, Konstantinos N

    2015-07-01

    In digital histopathology, tasks of segmentation and disease diagnosis are achieved by quantitative analysis of image content. However, color variation in image samples makes it challenging to produce reliable results. This paper introduces a complete normalization scheme to address the problem of color variation in histopathology images jointly caused by inconsistent biopsy staining and nonstandard imaging conditions. Method: Unlike existing normalization methods that either address only a partial cause of color variation or lump the causes together, our method identifies causes of color variation based on a microscopic imaging model and addresses inconsistencies in biopsy imaging and staining with an illuminant normalization module and a spectral normalization module, respectively. In evaluation, we use two public datasets that are representative of histopathology images commonly received in clinics to examine the proposed method from the aspects of robustness to system settings, performance consistency against achromatic pixels, and normalization effectiveness in terms of histological information preservation. As the saturation-weighted statistics proposed in this study generate stable and reliable color cues for stain normalization, our scheme is robust to system parameters and insensitive to image content and achromatic colors. Extensive experimentation suggests that our approach outperforms state-of-the-art normalization methods, as the proposed method is the only approach that succeeds in preserving histological information after normalization. The proposed color normalization solution would be useful for mitigating the effects of color variation in pathology images on subsequent quantitative analysis.

  14. Anthropometric Analysis of the Face.

    PubMed

    Zacharopoulos, Georgios V; Manios, Andreas; Kau, Chung H; Velagrakis, George; Tzanakakis, George N; de Bree, Eelco

    2016-01-01

    Facial anthropometric analysis is essential for planning cosmetic and reconstructive facial surgery, but has not been available in detail for modern Greeks. In this study, multiple measurements of the face were performed on young Greek males and females to provide a complete facial anthropometric profile of this population and to compare its facial morphology with that of North American Caucasians. Thirty-one direct facial anthropometric measurements were obtained from 152 Greek students. Moreover, the prevalence of the various face types was determined. The resulting data were compared with those published regarding North American Caucasians. A complete set of average anthropometric data was obtained for each sex. Greek males, when compared to Greek females, were found to have statistically significantly longer foreheads as well as greater values in morphologic face height, mandible width, maxillary surface arc distance, and mandibular surface arc distance. In both sexes, the most common face types were mesoprosop, leptoprosop, and hyperleptoprosop. Greek males had significantly wider faces and mandibles than the North American Caucasian males, whereas Greek females had only significantly wider mandibles than their North American counterparts. Differences of statistical significance were noted in the head and face regions among sexes as well as among Greek and North American Caucasians. With the establishment of facial norms for Greek adults, this study contributes to the preoperative planning as well as postoperative evaluation of Greek patients that are, respectively, scheduled for or are to be subjected to facial reconstructive and aesthetic surgery.

  15. Speech outcome in unilateral complete cleft lip and palate patients: a descriptive study.

    PubMed

    Rullo, R; Di Maggio, D; Addabbo, F; Rullo, F; Festa, V M; Perillo, L

    2014-09-01

    In this study, resonance and articulation disorders were examined in a group of patients surgically treated for cleft lip and palate, considering family social background and children's ability to self-monitor their speech output while speaking. Fifty children (32 males and 18 females), mean age 6.5 ± 1.6 years, affected by non-syndromic complete unilateral cleft of the lip and palate, underwent the same surgical protocol. The speech level was evaluated using the Accordi's speech assessment protocol that focuses on intelligibility, nasality, nasal air escape, pharyngeal friction, and glottal stop. Pearson product-moment correlation analysis was used to detect significant associations between analysed parameters. A total of 16% (8 children) of the sample had a severe to moderate degree of nasality and nasal air escape, presence of pharyngeal friction and glottal stop, which obviously compromise speech intelligibility. Ten children (20%) showed a barely acceptable phonological outcome: nasality and nasal air escape were mild to moderate, but the intelligibility remained poor. Thirty-two children (64%) had normal speech. Statistical analysis revealed a significant correlation between the severity of nasal resonance and nasal air escape (p ≤ 0.05). No statistically significant correlation was found between final intelligibility and the patient's social background, nor between final intelligibility and the age of the patients. The differences in speech outcome could be explained by a specific, subjective, and inborn ability, different for each child, to self-monitor speech output.

  16. Large-scale gene function analysis with the PANTHER classification system.

    PubMed

    Mi, Huaiyu; Muruganujan, Anushya; Casagrande, John T; Thomas, Paul D

    2013-08-01

    The PANTHER (protein annotation through evolutionary relationship) classification system (http://www.pantherdb.org/) is a comprehensive system that combines gene function, ontology, pathways and statistical analysis tools that enable biologists to analyze large-scale, genome-wide data from sequencing, proteomics or gene expression experiments. The system is built with 82 complete genomes organized into gene families and subfamilies, and their evolutionary relationships are captured in phylogenetic trees, multiple sequence alignments and statistical models (hidden Markov models or HMMs). Genes are classified according to their function in several different ways: families and subfamilies are annotated with ontology terms (Gene Ontology (GO) and PANTHER protein class), and sequences are assigned to PANTHER pathways. The PANTHER website includes a suite of tools that enable users to browse and query gene functions, and to analyze large-scale experimental data with a number of statistical tests. It is widely used by bench scientists, bioinformaticians, computer scientists and systems biologists. In the 2013 release of PANTHER (v.8.0), in addition to an update of the data content, we redesigned the website interface to improve both user experience and the system's analytical capability. This protocol provides a detailed description of how to analyze genome-wide experimental data with the PANTHER classification system.

  17. A comparative evaluation of microleakage of three different newer direct composite resins using a self etching primer in class V cavities: An in vitro study

    PubMed Central

    Hegde, Mithra N; Vyapaka, Pallavi; Shetty, Shishir

    2009-01-01

    Aims/Objectives: The aim of this in vitro study was to measure and compare microleakage in three different newer direct composite resins used with a self-etch adhesive bonding system in class V cavities, using a fluorescent dye penetration technique. Materials and Methods: Class V cavities were prepared on 45 human maxillary premolar teeth. On all specimens, one coat of G-Bond (GC Japan) was applied and light cured. The teeth were then equally divided into 3 groups of 15 samples each. Filtek Z350 (3M ESPE), Ceram X duo (Dentsply Asia) and Synergy D6 (Coltene/Whaledent) resin composites were placed on samples of Groups I, II and III, respectively, in increments and light cured. After polishing the restorations, the specimens were suspended in Rhodamine 6G fluorescent dye for 48 h. The teeth were then sectioned longitudinally and observed for the extent of microleakage under the fluorescent microscope. Statistical Analysis Used: The results were subjected to statistical analysis using the Kruskal-Wallis and Mann–Whitney U tests. Results: Results showed no statistically significant difference among the three groups tested. Conclusions: None of the materials tested was able to completely eliminate the microleakage in class V cavities. PMID:20543926

  18. Addressing issues associated with evaluating prediction models for survival endpoints based on the concordance statistic.

    PubMed

    Wang, Ming; Long, Qi

    2016-09-01

    Prediction models for disease risk and prognosis play an important role in biomedical research, and evaluating their predictive accuracy in the presence of censored data is of substantial interest. The standard concordance (c) statistic has been extended to provide a summary measure of predictive accuracy for survival models. Motivated by a prostate cancer study, we address several issues associated with evaluating survival prediction models based on the c-statistic, with a focus on estimators using the technique of inverse probability of censoring weighting (IPCW). Compared to the existing work, we provide complete results on the asymptotic properties of the IPCW estimators under the assumption of coarsening at random (CAR), and propose a sensitivity analysis under the mechanism of noncoarsening at random (NCAR). In addition, we extend the IPCW approach as well as the sensitivity analysis to high-dimensional settings. The predictive accuracy of prediction models for cancer recurrence after prostatectomy is assessed by applying the proposed approaches. We find that the estimated predictive accuracy for the models under consideration is sensitive to the NCAR assumption, and we thus identify the best predictive model. Finally, we further evaluate the performance of the proposed methods in both low-dimensional and high-dimensional settings under CAR and NCAR through simulations. © 2016, The International Biometric Society.
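
    For illustration, the sketch below computes an IPCW-weighted concordance statistic in the spirit of Uno-type estimators referred to in this line of work; it is not the authors' implementation, the tie handling and censoring-weight evaluation are simplified, and the survival data are hypothetical.

      # Hedged sketch of an IPCW-weighted c-statistic on toy data.
      import numpy as np

      def km_censoring_survival(times, events):
          """Kaplan-Meier estimate of the censoring survival function G(t),
          treating censoring (event == 0) as the 'event' of interest."""
          order = np.argsort(times)
          t_sorted, e_sorted = times[order], events[order]
          n = len(times)
          at_risk = n - np.arange(n)
          factors = np.where(e_sorted == 0, 1.0 - 1.0 / at_risk, 1.0)  # steps at censoring times
          surv = np.cumprod(factors)
          def G(t):
              idx = np.searchsorted(t_sorted, t, side="right") - 1
              return surv[idx] if idx >= 0 else 1.0
          return G

      def ipcw_c_statistic(times, events, risk_scores, tau):
          G = km_censoring_survival(times, events)
          num = den = 0.0
          for i in range(len(times)):
              if events[i] == 1 and times[i] < tau:
                  w = 1.0 / max(G(times[i]), 1e-8) ** 2       # IPCW weight
                  comparable = times > times[i]
                  den += w * comparable.sum()
                  num += w * (comparable & (risk_scores[i] > risk_scores)).sum()
          return num / den

      # Hypothetical survival data: follow-up time, event indicator, model risk score.
      times  = np.array([2.0, 3.5, 1.2, 4.8, 2.9, 5.0, 0.8, 3.1])
      events = np.array([1, 0, 1, 0, 1, 0, 1, 1])
      risk   = np.array([0.9, 0.3, 0.8, 0.2, 0.6, 0.1, 0.95, 0.5])
      print(ipcw_c_statistic(times, events, risk, tau=4.0))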

  19. Statistical evaluation of metal fill widths for emulated metal fill in parasitic extraction methodology

    NASA Astrophysics Data System (ADS)

    J-Me, Teh; Noh, Norlaili Mohd.; Aziz, Zalina Abdul

    2015-05-01

    In the chip industry today, the key goal of a chip development organization is to develop and market chips within a short time frame to gain a foothold on market share. This paper proposes a design flow around the area of parasitic extraction to improve design cycle time. The proposed design flow uses metal fill emulation, as opposed to the current flow, which performs metal fill insertion directly. By replacing metal fill structures with an emulation methodology in earlier iterations of the design flow, runtime in the fill insertion stage is expected to be reduced. A statistical design of experiments methodology based on the randomized complete block design was used to select an appropriate emulated metal fill width to improve emulation accuracy. The experiment was conducted on test cases of different sizes, ranging from 1000 gates to 21000 gates. The metal width was varied from 1 x minimum metal width to 6 x minimum metal width. Two-way analysis of variance and Fisher's least significant difference test were used to analyze the interconnect net capacitance values of the different test cases. This paper presents the results of the statistical analysis for the 45 nm process technology. The recommended emulated metal fill width was found to be 4 x the minimum metal width.
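
    A minimal sketch of the randomized complete block analysis described above: a two-way ANOVA of net capacitance with emulated fill width as the treatment factor and test-case size as the block, followed by a Fisher's LSD-style pairwise contrast. The data frame is a hypothetical placeholder, not the paper's measurements.

      # Hedged sketch of an RCBD two-way ANOVA with a Fisher's LSD-style contrast.
      import pandas as pd
      import statsmodels.api as sm
      from statsmodels.formula.api import ols

      # Hypothetical capacitance values (fF) for widths 1x-6x minimum metal width
      # across test-case blocks of different gate counts.
      data = pd.DataFrame({
          "width":       ["1x", "2x", "3x", "4x", "5x", "6x"] * 3,
          "block":       ["1k"] * 6 + ["10k"] * 6 + ["21k"] * 6,
          "capacitance": [10.2, 10.8, 11.1, 11.4, 11.5, 11.9,
                          20.5, 21.0, 21.6, 21.9, 22.1, 22.4,
                          30.1, 30.9, 31.4, 31.8, 32.0, 32.3],
      })

      model = ols("capacitance ~ C(width) + C(block)", data=data).fit()
      print(sm.stats.anova_lm(model, typ=2))   # two-way ANOVA table

      # Fisher's LSD-style comparison of two width levels: a pairwise t-test on the
      # model's residual variance, with no multiplicity adjustment.
      print(model.t_test("C(width)[T.4x] - C(width)[T.2x] = 0"))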

  20. Hydrostatic paradox: experimental verification of pressure equilibrium

    NASA Astrophysics Data System (ADS)

    Kodejška, Č.; Ganci, S.; Říha, J.; Sedláčková, H.

    2017-11-01

    This work is focused on the experimental verification of the balance between the atmospheric pressure acting on a sheet of paper, which encloses from below a cylinder completely or partially filled with water, and the hydrostatic pressure of the water column acting against the atmospheric pressure. The paper first presents a theoretical analysis of the problem, based on the equation for an isothermal process and on the equality of pressures inside and outside the cylinder. The measured values confirm the theoretical quadratic dependence of the air pressure inside the cylinder on the level of the liquid; the maximum change in the volume of air within the cylinder occurs when the height of the water column L is one half of the total height of the vessel H. The measurements were made for different diameters of the cylinder and with plates made of different materials placed at the bottom of the cylinder to prevent the liquid from flowing out. The measured values were subjected to statistical analysis, which demonstrated the validity of the null hypothesis, i.e. that the measured values are not statistically significantly different from the theoretically calculated ones at the statistical significance level α = 0.05.
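
    A minimal derivation consistent with the description above, assuming a vessel of height H closed at the top, a trapped air column of initial length H - L above a water column of depth L, isothermal expansion of the trapped air, and a light, rigid sheet at the bottom (the notation is ours, not the paper's):

        p_0 (H - L) = p\,(H - L + \Delta h)        (isothermal law for the trapped air)
        p + \rho g L \approx p_0                   (pressure balance at the sheet)

    Eliminating p gives

        \Delta h = \frac{\rho g L (H - L)}{p_0 - \rho g L} \approx \frac{\rho g}{p_0}\, L (H - L),

    a quadratic function of L that is maximal at L = H/2, consistent with the reported maximum change of the trapped-air volume at half the vessel height.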

  1. A Powerful Approach to Estimating Annotation-Stratified Genetic Covariance via GWAS Summary Statistics.

    PubMed

    Lu, Qiongshi; Li, Boyang; Ou, Derek; Erlendsdottir, Margret; Powles, Ryan L; Jiang, Tony; Hu, Yiming; Chang, David; Jin, Chentian; Dai, Wei; He, Qidu; Liu, Zefeng; Mukherjee, Shubhabrata; Crane, Paul K; Zhao, Hongyu

    2017-12-07

    Despite the success of large-scale genome-wide association studies (GWASs) on complex traits, our understanding of their genetic architecture is far from complete. Jointly modeling multiple traits' genetic profiles has provided insights into the shared genetic basis of many complex traits. However, large-scale inference sets a high bar for both statistical power and biological interpretability. Here we introduce a principled framework to estimate annotation-stratified genetic covariance between traits using GWAS summary statistics. Through theoretical and numerical analyses, we demonstrate that our method provides accurate covariance estimates, thereby enabling researchers to dissect both the shared and distinct genetic architecture across traits to better understand their etiologies. Among 50 complex traits with publicly accessible GWAS summary statistics (N total ≈ 4.5 million), we identified more than 170 pairs with statistically significant genetic covariance. In particular, we found strong genetic covariance between late-onset Alzheimer disease (LOAD) and amyotrophic lateral sclerosis (ALS), two major neurodegenerative diseases, in single-nucleotide polymorphisms (SNPs) with high minor allele frequencies and in SNPs located in the predicted functional genome. Joint analysis of LOAD, ALS, and other traits highlights LOAD's correlation with cognitive traits and hints at an autoimmune component for ALS. Copyright © 2017 American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.

  2. gsSKAT: Rapid gene set analysis and multiple testing correction for rare-variant association studies using weighted linear kernels.

    PubMed

    Larson, Nicholas B; McDonnell, Shannon; Cannon Albright, Lisa; Teerlink, Craig; Stanford, Janet; Ostrander, Elaine A; Isaacs, William B; Xu, Jianfeng; Cooney, Kathleen A; Lange, Ethan; Schleutker, Johanna; Carpten, John D; Powell, Isaac; Bailey-Wilson, Joan E; Cussenot, Olivier; Cancel-Tassin, Geraldine; Giles, Graham G; MacInnis, Robert J; Maier, Christiane; Whittemore, Alice S; Hsieh, Chih-Lin; Wiklund, Fredrik; Catalona, William J; Foulkes, William; Mandal, Diptasri; Eeles, Rosalind; Kote-Jarai, Zsofia; Ackerman, Michael J; Olson, Timothy M; Klein, Christopher J; Thibodeau, Stephen N; Schaid, Daniel J

    2017-05-01

    Next-generation sequencing technologies have afforded unprecedented characterization of low-frequency and rare genetic variation. Due to low power for single-variant testing, aggregative methods are commonly used to combine observed rare variation within a single gene. Causal variation may also aggregate across multiple genes within relevant biomolecular pathways. Kernel-machine regression and adaptive testing methods for aggregative rare-variant association testing have been demonstrated to be powerful approaches for pathway-level analysis, although these methods tend to be computationally intensive at high-variant dimensionality and require access to complete data. An additional analytical issue in scans of large pathway definition sets is multiple testing correction. Gene set definitions may exhibit substantial genic overlap, and the impact of the resultant correlation in test statistics on Type I error rate control for large agnostic gene set scans has not been fully explored. Herein, we first outline a statistical strategy for aggregative rare-variant analysis using component gene-level linear kernel score test summary statistics as well as derive simple estimators of the effective number of tests for family-wise error rate control. We then conduct extensive simulation studies to characterize the behavior of our approach relative to direct application of kernel and adaptive methods under a variety of conditions. We also apply our method to two case-control studies, respectively, evaluating rare variation in hereditary prostate cancer and schizophrenia. Finally, we provide open-source R code for public use to facilitate easy application of our methods to existing rare-variant analysis results. © 2017 WILEY PERIODICALS, INC.
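
    As an illustration of the multiple-testing idea above, the sketch below computes an effective number of tests from the eigenvalues of the correlation matrix among gene-set statistics using a Galwey-style estimator; this is not necessarily the paper's own estimator, and the correlation matrix is a hypothetical placeholder.

      # Hedged sketch: effective number of tests from a test-statistic correlation matrix.
      import numpy as np

      def effective_number_of_tests(corr):
          """Galwey-style estimator: Meff = (sum of sqrt of positive eigenvalues)^2 / sum of eigenvalues."""
          eigvals = np.linalg.eigvalsh(corr)
          pos = eigvals[eigvals > 0]
          return (np.sqrt(pos).sum() ** 2) / pos.sum()

      # Hypothetical correlation among four overlapping gene-set statistics.
      corr = np.array([
          [1.0, 0.6, 0.3, 0.1],
          [0.6, 1.0, 0.4, 0.2],
          [0.3, 0.4, 1.0, 0.5],
          [0.1, 0.2, 0.5, 1.0],
      ])
      m_eff = effective_number_of_tests(corr)
      alpha_fwer = 0.05 / m_eff          # Bonferroni-style threshold using Meff instead of m
      print(m_eff, alpha_fwer)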

  3. Statistical methods for meta-analyses including information from studies without any events-add nothing to nothing and succeed nevertheless.

    PubMed

    Kuss, O

    2015-03-30

    Meta-analyses with rare events, especially those that include studies with no event in one ('single-zero') or even both ('double-zero') treatment arms, are still a statistical challenge. In the case of double-zero studies, researchers generally delete these studies or use continuity corrections to avoid them. A number of arguments against both options have been given, and statistical methods that use the information from double-zero studies without using continuity corrections have been proposed. In this paper, we collect them and compare them by simulation. This simulation study tries to mirror real-life situations as completely as possible by deriving true underlying parameters from empirical data on actually performed meta-analyses. It is shown that for each of the commonly encountered effect estimators, valid statistical methods are available that use the information from double-zero studies without using continuity corrections. Interestingly, all of them are truly random effects models, and so even the current standard method for very sparse data recommended by the Cochrane Collaboration, the Yusuf-Peto odds ratio, can be improved on. For actual analysis, we recommend using beta-binomial regression methods to arrive at summary estimates for the odds ratio, the relative risk, or the risk difference. Methods that ignore information from double-zero studies or use continuity corrections should no longer be used. We illustrate the situation with an example where the original analysis ignores 35 double-zero studies, and a superior analysis discovers a clinically relevant advantage of off-pump surgery in coronary artery bypass grafting. Copyright © 2014 John Wiley & Sons, Ltd.
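
    A toy sketch of a beta-binomial regression of the general kind recommended above: arm-level event counts with a logit link for the treatment effect and a common overdispersion parameter, fitted by maximum likelihood. The data are hypothetical (including single-zero and double-zero studies), this is not the authors' software, and a full analysis would also account for study-level structure.

      # Hedged sketch: beta-binomial likelihood with a logit link for the treatment effect.
      import numpy as np
      from scipy.special import betaln, expit, gammaln
      from scipy.optimize import minimize

      def log_choose(n, k):
          return gammaln(n + 1) - gammaln(k + 1) - gammaln(n - k + 1)

      def neg_loglik(params, events, n, treat):
          beta0, beta1, log_phi = params
          mu = expit(beta0 + beta1 * treat)          # arm-level event probability
          phi = np.exp(log_phi)                      # precision; larger means less overdispersion
          a, b = mu * phi, (1.0 - mu) * phi
          ll = log_choose(n, events) + betaln(events + a, n - events + b) - betaln(a, b)
          return -ll.sum()

      # Hypothetical arm-level data: events / sample size / treatment indicator,
      # including single-zero and double-zero studies.
      events = np.array([0, 1, 0, 0, 2, 0, 1, 0])
      n      = np.array([25, 30, 20, 22, 40, 35, 28, 26])
      treat  = np.array([1, 0, 1, 0, 1, 0, 1, 0])

      fit = minimize(neg_loglik, x0=[-3.0, 0.0, 1.0], args=(events, n, treat),
                     method="Nelder-Mead")
      beta0, beta1, _ = fit.x
      print("log odds ratio:", beta1, "odds ratio:", np.exp(beta1))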

  4. Symptom Clusters in Advanced Cancer Patients: An Empirical Comparison of Statistical Methods and the Impact on Quality of Life.

    PubMed

    Dong, Skye T; Costa, Daniel S J; Butow, Phyllis N; Lovell, Melanie R; Agar, Meera; Velikova, Galina; Teckle, Paulos; Tong, Allison; Tebbutt, Niall C; Clarke, Stephen J; van der Hoek, Kim; King, Madeleine T; Fayers, Peter M

    2016-01-01

    Symptom clusters in advanced cancer can influence patient outcomes. There is large heterogeneity in the methods used to identify symptom clusters. The aims were to investigate the consistency of symptom cluster composition in advanced cancer patients using different statistical methodologies for all patients across five primary cancer sites, and to examine which clusters predict functional status, a global assessment of health, and global quality of life. Principal component analysis and exploratory factor analysis (with different rotation and factor selection methods) and hierarchical cluster analysis (with different linkage and similarity measures) were used on a data set of 1562 advanced cancer patients who completed the European Organization for Research and Treatment of Cancer Quality of Life Questionnaire-Core 30. Four clusters consistently formed for many of the methods and cancer sites: tense-worry-irritable-depressed (emotional cluster), fatigue-pain, nausea-vomiting, and concentration-memory (cognitive cluster). The emotional cluster was a stronger predictor of overall quality of life than the other clusters. Fatigue-pain was a stronger predictor of overall health than the other clusters. The cognitive cluster and fatigue-pain predicted physical functioning, role functioning, and social functioning. The four identified symptom clusters were consistent across statistical methods and cancer types, although there were some noteworthy differences. Statistical derivation of symptom clusters is in need of greater methodological guidance. A psychosocial pathway in the management of symptom clusters may improve quality of life. Biological mechanisms underpinning symptom clusters need to be delineated by future research. A framework for evidence-based screening, assessment, treatment, and follow-up of symptom clusters in advanced cancer is essential. Copyright © 2016 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.
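
    For illustration, the sketch below applies two of the strategies compared above, principal component analysis of symptom items and hierarchical clustering of the item-item correlation structure; the symptom matrix is a random placeholder standing in for QLQ-C30 item scores, not the study's data.

      # Hedged sketch: PCA of symptom items and hierarchical clustering of items.
      import numpy as np
      from sklearn.decomposition import PCA
      from scipy.cluster.hierarchy import linkage, fcluster

      rng = np.random.default_rng(0)
      items = ["tense", "worry", "irritable", "depressed",
               "fatigue", "pain", "nausea", "vomiting", "concentration", "memory"]
      X = rng.normal(size=(200, len(items)))          # patients x symptom items (placeholder)

      # Strategy 1: principal components; rotation and interpretation left to the analyst.
      pca = PCA(n_components=4).fit(X)
      loadings = pca.components_.T                    # items x components

      # Strategy 2: agglomerative clustering of items using 1 - |correlation| as distance.
      corr = np.corrcoef(X, rowvar=False)
      dist = 1.0 - np.abs(corr[np.triu_indices_from(corr, k=1)])
      tree = linkage(dist, method="average")
      labels = fcluster(tree, t=4, criterion="maxclust")
      print(dict(zip(items, labels)))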

  5. 19 CFR 141.61 - Completion of entry and entry summary documentation.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... on CBP Form 7501. (e) Statistical information—(1) Information required on entry summary or withdrawal... a separate statistical reporting number, the applicable information required by the General Statistical Notes, Harmonized Tariff Schedule of the United States (HTSUS), must be shown on the entry summary...

  6. A comparative evaluation of efficacy of protaper universal rotary retreatment system for gutta-percha removal with or without a solvent.

    PubMed

    Kumar, M Sita Ram; Sajjan, Girija S; Satish, Kalyan; Varma, K Madhu

    2012-09-01

    The aim was to evaluate and compare the efficacy of the ProTaper Universal rotary retreatment system, with or without solvent, and stainless steel hand files for removal of endodontic filling material from root canals, and also to compare retreatment time for each system. Thirty extracted mandibular premolars with single straight canals were endodontically treated. Teeth were divided into three major groups of 10 specimens each. Obturating material was removed in group 1 with stainless steel hand files and RC Solve, in group 2 with ProTaper Universal retreatment instruments, and in group 3 with ProTaper Universal retreatment instruments along with RC Solve. Retreatment was considered complete for all groups when no filling material was observed on the instruments. The retreatment time was recorded for each tooth. All specimens were grooved longitudinally in a buccolingual direction. The split halves were examined under a stereomicroscope, and images were captured and analyzed. The remaining filling debris area ratios were used for statistical analysis. ANOVA showed no statistically significant difference in the amount of filling remnants between the groups (P > 0.05). The differences between the group means for retreatment time were statistically significant. Irrespective of the technique used, all specimens had some remnants on the root canal wall. ProTaper Universal retreatment system files alone proved to be faster than the other experimental groups.

  7. Experimental design matters for statistical analysis: how to handle blocking.

    PubMed

    Jensen, Signe M; Schaarschmidt, Frank; Onofri, Andrea; Ritz, Christian

    2018-03-01

    Nowadays, evaluation of the effects of pesticides often relies on experimental designs that involve multiple concentrations of the pesticide of interest or multiple pesticides at specific comparable concentrations and, possibly, secondary factors of interest. Unfortunately, the experimental design is often more or less neglected when analysing data. Two data examples were analysed using different modelling strategies. First, in a randomized complete block design, mean heights of maize treated with a herbicide and one of several adjuvants were compared. Second, translocation of an insecticide applied to maize as a seed treatment was evaluated using incomplete data from an unbalanced design with several layers of hierarchical sampling. Extensive simulations were carried out to further substantiate the effects of different modelling strategies. It was shown that results from suboptimal approaches (two-sample t-tests and ordinary ANOVA assuming independent observations) may be both quantitatively and qualitatively different from the results obtained using an appropriate linear mixed model. The simulations demonstrated that the different approaches may lead to differences in coverage percentages of confidence intervals and type 1 error rates, confirming that misleading conclusions can easily happen when an inappropriate statistical approach is chosen. To ensure that experimental data are summarized appropriately, avoiding misleading conclusions, the experimental design should duly be reflected in the choice of statistical approaches and models. We recommend that author guidelines should explicitly point out that authors need to indicate how the statistical analysis reflects the experimental design. © 2017 Society of Chemical Industry.
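
    A minimal sketch of the modelling contrast discussed above: an ordinary ANOVA that ignores blocking versus a linear mixed model with the block as a random effect. The plant-height data frame is a hypothetical placeholder, not the study's data.

      # Hedged sketch: blocking ignored (ordinary ANOVA) versus a linear mixed model.
      import pandas as pd
      import statsmodels.api as sm
      from statsmodels.formula.api import ols, mixedlm

      data = pd.DataFrame({
          "height":   [52.1, 49.8, 55.3, 50.2, 48.7, 53.9, 51.5, 47.9, 54.4, 50.8, 49.1, 55.0],
          "adjuvant": ["A", "B", "C"] * 4,
          "block":    ["1", "1", "1", "2", "2", "2", "3", "3", "3", "4", "4", "4"],
      })

      # Suboptimal: ANOVA assuming independent observations (blocking ignored).
      print(sm.stats.anova_lm(ols("height ~ C(adjuvant)", data=data).fit(), typ=2))

      # Appropriate: mixed model with adjuvant as a fixed effect and block as a random effect.
      mixed = mixedlm("height ~ C(adjuvant)", data=data, groups=data["block"]).fit()
      print(mixed.summary())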

  8. A complete sample of double-lobed radio quasars for VLBI tests of source models - Definition and statistics

    NASA Technical Reports Server (NTRS)

    Hough, D. H.; Readhead, A. C. S.

    1989-01-01

    A complete, flux-density-limited sample of double-lobed radio quasars is defined, with nuclei bright enough to be mapped with the Mark III VLBI system. It is shown that the statistics of linear size, nuclear strength, and curvature are consistent with the assumption of random source orientations and simple relativistic beaming in the nuclei. However, these statistics are also consistent with the effects of interaction between the beams and the surrounding medium. The distribution of jet velocities in the nuclei, as measured with VLBI, will provide a powerful test of physical theories of extragalactic radio sources.

  9. A complete solution classification and unified algorithmic treatment for the one- and two-step asymmetric S-transverse mass event scale statistic

    NASA Astrophysics Data System (ADS)

    Walker, Joel W.

    2014-08-01

    The MT2, or "s-transverse mass", statistic was developed to associate a parent mass scale to a missing transverse energy signature, given that escaping particles are generally expected in pairs, while collider experiments are sensitive to just a single transverse momentum vector sum. This document focuses on the generalized extension of that statistic to asymmetric one- and two-step decay chains, with arbitrary child particle masses and upstream missing transverse momentum. It provides a unified theoretical formulation, complete solution classification, taxonomy of critical points, and technical algorithmic prescription for treatment of the event scale. An implementation of the described algorithm is available for download, and is also a deployable component of the author's selection cut software package AEACuS (Algorithmic Event Arbiter and Cut Selector). Appendices address combinatoric event assembly, algorithm validation, and complete pseudocode.
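
    For reference, the standard definition underlying the statistic, written here in our own notation (the paper's asymmetric generalization allows the two child masses to differ):

        M_{T2}^2 = \min_{\vec{p}_T^{\,c_1} + \vec{p}_T^{\,c_2} = \vec{p}_T^{\,\mathrm{miss}}}
                   \max\left\{ m_T^2\big(\vec{p}_T^{\,v_1}, \vec{p}_T^{\,c_1}\big),\;
                               m_T^2\big(\vec{p}_T^{\,v_2}, \vec{p}_T^{\,c_2}\big) \right\},

        m_T^2\big(\vec{p}_T^{\,v}, \vec{p}_T^{\,c}\big) = m_v^2 + m_c^2
            + 2\left(E_T^{v} E_T^{c} - \vec{p}_T^{\,v} \cdot \vec{p}_T^{\,c}\right),
        \qquad E_T = \sqrt{m^2 + |\vec{p}_T|^2}.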

  10. Use of statistical tools to evaluate the reductive dechlorination of high levels of TCE in microcosm studies.

    PubMed

    Harkness, Mark; Fisher, Angela; Lee, Michael D; Mack, E Erin; Payne, Jo Ann; Dworatzek, Sandra; Roberts, Jeff; Acheson, Carolyn; Herrmann, Ronald; Possolo, Antonio

    2012-04-01

    A large, multi-laboratory microcosm study was performed to select amendments for supporting reductive dechlorination of high levels of trichloroethylene (TCE) found at an industrial site in the United Kingdom (UK) containing dense non-aqueous phase liquid (DNAPL) TCE. The study was designed as a fractional factorial experiment involving 177 bottles distributed between four industrial laboratories and was used to assess the impact of six electron donors, bioaugmentation, addition of supplemental nutrients, and two TCE levels (0.57 and 1.90 mM or 75 and 250 mg/L in the aqueous phase) on TCE dechlorination. Performance was assessed based on the concentration changes of TCE and reductive dechlorination degradation products. The chemical data was evaluated using analysis of variance (ANOVA) and survival analysis techniques to determine both main effects and important interactions for all the experimental variables during the 203-day study. The statistically based design and analysis provided powerful tools that aided decision-making for field application of this technology. The analysis showed that emulsified vegetable oil (EVO), lactate, and methanol were the most effective electron donors, promoting rapid and complete dechlorination of TCE to ethene. Bioaugmentation and nutrient addition also had a statistically significant positive impact on TCE dechlorination. In addition, the microbial community was measured using phospholipid fatty acid analysis (PLFA) for quantification of total biomass and characterization of the community structure and quantitative polymerase chain reaction (qPCR) for enumeration of Dehalococcoides organisms (Dhc) and the vinyl chloride reductase (vcrA) gene. The highest increase in levels of total biomass and Dhc was observed in the EVO microcosms, which correlated well with the dechlorination results. Copyright © 2012 Elsevier B.V. All rights reserved.

  11. Dietary Soy Supplement on Fibromyalgia Symptoms: A Randomized, Double-Blind, Placebo-Controlled, Early Phase Trial

    PubMed Central

    Wahner-Roedler, Dietlind L.; Thompson, Jeffrey M.; Luedtke, Connie A.; King, Susan M.; Cha, Stephen S.; Elkin, Peter L.; Bruce, Barbara K.; Townsend, Cynthia O.; Bergeson, Jody R.; Eickhoff, Andrea L.; Loehrer, Laura L.; Sood, Amit; Bauer, Brent A.

    2011-01-01

    Most patients with fibromyalgia use complementary and alternative medicine (CAM). Properly designed controlled trials are necessary to assess the effectiveness of these practices. This study was a randomized, double-blind, placebo-controlled, early phase trial. Fifty patients seen at a fibromyalgia outpatient treatment program were randomly assigned to a daily soy or placebo (casein) shake. Outcome measures were scores of the Fibromyalgia Impact Questionnaire (FIQ) and the Center for Epidemiologic Studies Depression Scale (CES-D) at baseline and after 6 weeks of intervention. Analysis was with standard statistics based on the null hypothesis, and separation test for early phase CAM comparative trials. Twenty-eight patients completed the study. Use of standard statistics with intent-to-treat analysis showed that total FIQ scores decreased by 14% in the soy group (P = .02) and by 18% in the placebo group (P < .001). The difference in change in scores between the groups was not significant (P = .16). With the same analysis, CES-D scores decreased in the soy group by 16% (P = .004) and in the placebo group by 15% (P = .05). The change in scores was similar in the groups (P = .83). Results of statistical analysis using the separation test and intent-to-treat analysis revealed no benefit of soy compared with placebo. Shakes that contain soy and shakes that contain casein, when combined with a multidisciplinary fibromyalgia treatment program, provide a decrease in fibromyalgia symptoms. Separation between the effects of soy and casein (control) shakes did not favor the intervention. Therefore, large-sample studies using soy for patients with fibromyalgia are probably not indicated. PMID:18990724

  12. Drivers Motivating Community Health Improvement Plan Completion by Local Public Health Agencies and Community Partners in the Rocky Mountain Region and Western Plains.

    PubMed

    Hill, Anne; Wolf, Holly J; Scallan, Elaine; Case, Jenny; Kellar-Guenther, Yvonne

    There are numerous drivers that motivate completion of community health improvement plans (CHIPs). Some are more obvious and include voluntary public health accreditation, state requirements, federal and state funding, and nonprofit hospital requirements through IRS regulations. Less is known about other drivers, including involvement of diverse partners and belief in best practices, that may motivate CHIP completion. This research investigated the drivers that motivated CHIP completion based on experiences of 51 local public health agencies (LPHAs). An explanatory mixed-methods design, including closed- and open-ended survey questions and key informant interviews, was used to understand the drivers that motivated CHIP completion. Analysis of survey data involved descriptive statistics. Classical content analysis was used for qualitative data to clarify survey findings. The surveys and key informant interviews were conducted in the Rocky Mountain Region and Western Plains among 51 medium and large LPHAs in Colorado, Kansas, Montana, Nebraska, North Dakota, South Dakota, Utah, and Wyoming. More than 50% of respondents were public health directors; the balance of the respondents were division/program directors, accreditation coordinators, and public health planners. CHIP completion. Most LPHAs in the Rocky Mountains and Western Plains have embraced developing and publishing a CHIP, with 80% having completed their plan and another 13% working on it. CHIP completion is motivated by a belief in best practices, with LPHAs and partners seeing the benefit of quality improvement activities linked to the CHIP and the investment of nonprofit hospitals in the process. Completing a CHIP is strengthened through engagement of diverse partners and a well-functioning partnership. The future of CHIP creation depends on LPHAs and partners investing in the CHIP as a best practice, dedicating personnel to CHIP activities, and enhancing leadership skills to contribute to a synergistic partnership by effectively working and communicating with diverse partners and developing and achieving common goals.

  13. Is the Maxillary Sinus Really Suitable in Sex Determination? A Three-Dimensional Analysis of Maxillary Sinus Volume and Surface Depending on Sex and Dentition.

    PubMed

    Möhlhenrich, Stephan Christian; Heussen, Nicole; Peters, Florian; Steiner, Timm; Hölzle, Frank; Modabber, Ali

    2015-11-01

    The morphometric analysis of the maxillary sinus was recently presented as a helpful instrument for sex determination. The aim of the present study was to examine the volume and surface of the fully dentate, partially edentulous, and completely edentulous maxillary sinus depending on sex. Computed tomography data from 276 patients were imported in DICOM format via special virtual planning software, and surfaces (mm²) and volumes (mm³) of maxillary sinuses were measured. In sex-specific comparisons (women vs men), statistically significant differences in mean maxillary sinus volume and surface were found for fully dentate (volume, 13,267.77 mm³ vs 16,623.17 mm³, P < 0.0001; surface, 3480.05 mm² vs 4100.83 mm², P < 0.0001), partially edentulous (volume, 10,577.35 mm³ vs 14,608.10 mm³, P = 0.0002; surface, 2980.11 mm² vs 3797.42 mm², P < 0.0001), and completely edentulous sinuses (volume, 11,200.99 mm³ vs 15,382.29 mm³, P < 0.0001; surface, 3118.32 mm² vs 3877.25 mm², P < 0.0001). Within males, statistically different mean values were found between fully dentate and partially edentulous maxillary sinuses (volume, P = 0.0022; surface, P = 0.0048). The only comparisons showing no difference between the sexes were for partially dentate versus fully edentulous sinuses (2 teeth missing) and for partially edentulous sinuses in women and men (1 tooth vs 2 teeth missing). With a corresponding software program, it is possible to analyze the maxillary sinus precisely. The dentition influences the volume and surface of the pneumatized maxillary sinus. Therefore, sex determination is possible by analysis of the maxillary sinus even through the increase in pneumatization.

  14. Structural Model of the Effects of Cognitive and Affective Factors on the Achievement of Arabic-Speaking Pre-Service Teachers in Introductory Statistics

    ERIC Educational Resources Information Center

    Nasser, Fadia M.

    2004-01-01

    This study examined the extent to which statistics and mathematics anxiety, attitudes toward mathematics and statistics, motivation and mathematical aptitude can explain the achievement of Arabic speaking pre-service teachers in introductory statistics. Complete data were collected from 162 pre-service teachers enrolled in an academic…

  15. Analysis of promoter polymorphism in monoamine oxidase A (MAOA) gene in completed suicide on Slovenian population.

    PubMed

    Uršič, Katarina; Zupanc, Tomaž; Paska, Alja Videtič

    2018-04-23

    Suicide is a well-defined public health problem and a complex phenomenon influenced by a number of different risk factors, including genetic ones. Numerous studies have examined serotonin system genes. Monoamine oxidase A (MAO-A) is an outer mitochondrial membrane enzyme involved in the metabolic pathway of serotonin degradation. An upstream variable number of tandem repeats (uVNTR) in the promoter region of the MAOA gene affects the activity of transcription. In the present study we genotyped the MAOA-uVNTR polymorphism in 266 suicide victims and 191 control subjects from the Slovenian population, which ranks among the European and world populations with the highest suicide rates. Genotyping was performed with polymerase chain reaction and agarose gel electrophoresis. Using separate statistical analyses for female and male subjects, we determined the differences in genotype distributions of the MAOA-uVNTR polymorphism between the studied groups. Statistical analysis showed a trend toward an association between the 3R allele and suicide, and associated the 3R allele with a non-violent suicide method in the stratified data (20 suicide victims). This is the first study associating the MAOA-uVNTR polymorphism with the highly suicidal Slovenian population. Copyright © 2018 Elsevier B.V. All rights reserved.

  16. STATISTICAL ANALYSIS OF TANK 18F FLOOR SAMPLE RESULTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, S.

    2010-09-02

    Representative sampling has been completed for characterization of the residual material on the floor of Tank 18F as per the statistical sampling plan developed by Shine [1]. Samples from eight locations have been obtained from the tank floor and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL [2]. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis sample results [3] to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL95%) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 18F. The uncertainty is quantified in this report by an upper 95% confidence limit (UCL95%) on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL95% was based entirely on the six current scrape sample results (each averaged across three analytical determinations).
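
    A minimal sketch of the one-sided upper 95% confidence limit described in the last sentences above, computed from the number of samples, their average, and their standard deviation; the concentration values are hypothetical placeholders, not the Tank 18F results.

      # Hedged sketch: one-sided upper 95% confidence limit from n, mean, and std.
      import numpy as np
      from scipy.stats import t

      def ucl95(values):
          values = np.asarray(values, dtype=float)
          n = values.size
          return values.mean() + t.ppf(0.95, df=n - 1) * values.std(ddof=1) / np.sqrt(n)

      # Six hypothetical scrape-sample concentrations (each already averaged across
      # three analytical determinations), in arbitrary units.
      print(ucl95([1.12, 0.98, 1.05, 1.21, 0.93, 1.08]))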

  17. STATISTICAL ANALYSIS OF TANK 19F FLOOR SAMPLE RESULTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, S.

    2010-09-02

    Representative sampling has been completed for characterization of the residual material on the floor of Tank 19F as per the statistical sampling plan developed by Harris and Shine. Samples from eight locations have been obtained from the tank floor and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis sample results to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL95%) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current scrape sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 19F. The uncertainty is quantified in this report by an UCL95% on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL95% was based entirely on the six current scrape sample results (each averaged across three analytical determinations).

  18. Results of Li-Tho trial: a prospective randomized study on effectiveness of LigaSure® in lung resections.

    PubMed

    Bertolaccini, Luca; Viti, Andrea; Cavallo, Antonio; Terzi, Alberto

    2014-04-01

    The role of the electro-thermal bipolar tissue sealing system (LigaSure®, LS; Covidien, Inc., CO, USA) in thoracic surgery is still undefined, and reports of its use are limited. The objective of the trial was to evaluate the cost and benefits of LS in major lung resection surgery. A randomized blinded study of a consecutive series of 100 patients undergoing lobectomy was undertaken. After muscle-sparing thoracotomy and classification of lung fissures according to Craig-Walker, patients with fissure Grade 2-4 were randomized to Stapler group or LS group fissure completion. Recorded parameters were analysed for differences in selected intraoperative and postoperative outcomes. Statistical analysis was performed with the bootstrap method. Pearson's χ² test and Fisher's exact test were used to calculate probability values for comparisons of dichotomous variables. Cost-benefit evaluation was performed using Pareto optimal analysis. There were no significant differences between groups regarding demographic and baseline characteristics. No patient was withdrawn from the study and no adverse effect was recorded. There was no mortality and there were no major complications in either group. There were no statistically significant differences in operative time or morbidity between patients in the LS group and the Stapler group. In the LS group, there was a statistically non-significant increase in postoperative air leaks in the first 24 postoperative hours, while a statistically significant increase in drainage amount was observed. No statistically significant difference in hospital length of stay was observed. Overall, the LS group had a favourable multi-criteria cost/benefit analysis with a good 'Pareto optimum'. LS is a safe device for thoracic surgery and can be a valid alternative to Staplers. In this setting, LS allows functional lung tissue preservation. As to costs, LS seems equivalent to Staplers.

  19. Acoustic Emission Analysis Applet (AEAA) Software

    NASA Technical Reports Server (NTRS)

    Nichols, Charles T.; Roth, Don J.

    2013-01-01

    NASA Glenn Research and NASA White Sands Test Facility have developed software supporting an automated pressure vessel structural health monitoring (SHM) system based on acoustic emissions (AE). The software, referred to as the Acoustic Emission Analysis Applet (AEAA), provides analysts with a tool that can interrogate data collected on Digital Wave Corp. and Physical Acoustics Corp. software using a wide spectrum of powerful filters and charts. This software can be made to work with any data once the data format is known. The applet will compute basic AE statistics, and statistics as a function of time and pressure (see figure). AEAA provides value added beyond the analysis provided by the respective vendors' analysis software. The software can handle data sets of unlimited size. A wide variety of government and commercial applications could benefit from this technology, notably requalification and usage tests for compressed gas and hydrogen-fueled vehicles. Future enhancements will add features similar to a "check engine" light on a vehicle. Once installed, the system will ultimately be used to alert International Space Station crewmembers to critical structural instabilities, but will have little impact to missions otherwise. Diagnostic information could then be transmitted to experienced technicians on the ground in a timely manner to determine whether pressure vessels have been impacted, are structurally unsound, or can be safely used to complete the mission.

  20. Analysis of the chronic lower limb injuries occurrence in step aerobic instructors in relation to their working step class profile: a three year longitudinal prospective study.

    PubMed

    Malliou, P; Rokka, S; Beneka, A; Gioftsidou, A; Mavromoustakos, S; Godolias, G

    2014-01-01

    There is limited information on injury patterns in Step Aerobic Instructors (SAI) who exclusively teach "step" aerobic classes. The aims were to record the type and anatomical position, in relation to diagnosis, of musculoskeletal injuries in step aerobic instructors, and to analyse the days of absence due to chronic injury in relation to weekly working hours, height of the step platform, working experience, and working surface and footwear during the step class. The Step Aerobic Instructors Injuries Questionnaire was developed, and validity and reliability indices were calculated. 63 SAI completed the questionnaire. For the statistical analysis of the data, frequency analysis, the non-parametric χ² test, correlation, and linear and logistic regression analyses from the SPSS statistical package were used. The 63 SAI reported 115 injuries that required more than 2 days of absence from step aerobic classes. Chronic lower extremity injuries accounted for 73.5%, with leg pain, anterior knee pain, plantar tendinopathy and Achilles tendinopathy being the most common overuse syndromes. The working hours, the platform height and the years of aerobic dance seem to affect the days of absence due to chronic lower limb injury in SAI.

  1. More efficient parameter estimates for factor analysis of ordinal variables by ridge generalized least squares.

    PubMed

    Yuan, Ke-Hai; Jiang, Ge; Cheng, Ying

    2017-11-01

    Data in psychology are often collected using Likert-type scales, and it has been shown that factor analysis of Likert-type data is better performed on the polychoric correlation matrix than on the product-moment covariance matrix, especially when the distributions of the observed variables are skewed. In theory, factor analysis of the polychoric correlation matrix is best conducted using generalized least squares with an asymptotically correct weight matrix (AGLS). However, simulation studies showed that both least squares (LS) and diagonally weighted least squares (DWLS) perform better than AGLS, and thus LS or DWLS is routinely used in practice. In either LS or DWLS, the associations among the polychoric correlation coefficients are completely ignored. To mend such a gap between statistical theory and empirical work, this paper proposes new methods, called ridge GLS, for factor analysis of ordinal data. Monte Carlo results show that, for a wide range of sample sizes, ridge GLS methods yield uniformly more accurate parameter estimates than existing methods (LS, DWLS, AGLS). A real-data example indicates that estimates by ridge GLS are 9-20% more efficient than those by existing methods. Rescaled and adjusted test statistics as well as sandwich-type standard errors following the ridge GLS methods also perform reasonably well. © 2017 The British Psychological Society.
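
    One way to write the general idea behind a ridge-regularized GLS fit function for ordinal factor analysis, in our own notation (the paper's exact weighting and tuning may differ):

        F_a(\theta) = \big(r - \rho(\theta)\big)^{\top} \big(\hat{\Gamma} + a I\big)^{-1} \big(r - \rho(\theta)\big),

    where r stacks the sample polychoric correlations, \rho(\theta) the model-implied correlations, \hat{\Gamma} is the estimated asymptotic covariance matrix of r (the AGLS weight matrix), and a >= 0 is a ridge constant: a = 0 recovers AGLS, while a large a pushes the weight matrix toward a scaled identity, in the direction of ordinary LS.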

  2. Statistics Anxiety among Postgraduate Students

    ERIC Educational Resources Information Center

    Koh, Denise; Zawi, Mohd Khairi

    2014-01-01

    Most postgraduate programmes, that have research components, require students to take at least one course of research statistics. Not all postgraduate programmes are science based, there are a significant number of postgraduate students who are from the social sciences that will be taking statistics courses, as they try to complete their…

  3. The Fact Book: Report for the Florida College System, 2014

    ERIC Educational Resources Information Center

    Florida Department of Education, 2014

    2014-01-01

    This 2014 fact book for the Florida College System is divided into the following categories: (1) Student Information, which includes fall, annual, FTE, and program enrollment statistics, as well as credit program completion statistics; (2) Employee Information, which includes statistics regarding employee headcount by occupational activity, and…

  4. The Fact Book: Report for the Florida College System, 2015

    ERIC Educational Resources Information Center

    Florida Department of Education, 2015

    2015-01-01

    This 2015 fact book for the Florida College System is divided into the following categories: (1) Student Information, which includes fall, annual, FTE, and program enrollment statistics, as well as credit program completion statistics; (2) Employee Information, which includes statistics regarding employee headcount by occupational activity, and…

  5. The Fact Book: Report for the Florida College System, 2016

    ERIC Educational Resources Information Center

    Florida Department of Education, 2016

    2016-01-01

    This 2016 fact book for the Florida College System is divided into the following categories: (1) Student Information, which includes fall, annual, FTE, and program enrollment statistics, as well as credit program completion statistics; (2) Employee Information, which includes statistics regarding employee headcount by occupational activity and…

  6. Statistical Literacy Social Media Project for the Masses

    ERIC Educational Resources Information Center

    Gundlach, Ellen; Maybee, Clarence; O'Shea, Kevin

    2015-01-01

    This article examines a social media assignment used to teach and practice statistical literacy with over 400 students each semester in large-lecture traditional, fully online, and flipped sections of an introductory-level statistics course. Following the social media assignment, students completed a survey on how they approached the assignment.…

  7. A Pilot Study Teaching Metrology in an Introductory Statistics Course

    ERIC Educational Resources Information Center

    Casleton, Emily; Beyler, Amy; Genschel, Ulrike; Wilson, Alyson

    2014-01-01

    Undergraduate students who have just completed an introductory statistics course often lack deep understanding of variability and enthusiasm for the field of statistics. This paper argues that by introducing the commonly underemphasized concept of measurement error, students will have a better chance of attaining both. We further present lecture…

  8. Statistical properties of radiation from VUV and X-ray free electron laser

    NASA Astrophysics Data System (ADS)

    Saldin, E. L.; Schneidmiller, E. A.; Yurkov, M. V.

    1998-03-01

    The paper presents a comprehensive analysis of the statistical properties of the radiation from a self-amplified spontaneous emission (SASE) free electron laser operating in linear and nonlinear mode. The investigation has been performed in a one-dimensional approximation assuming the electron pulse length to be much larger than a coherence length of the radiation. The following statistical properties of the SASE FEL radiation have been studied in detail: time and spectral field correlations, distribution of the fluctuations of the instantaneous radiation power, distribution of the energy in the electron bunch, distribution of the radiation energy after the monochromator installed at the FEL amplifier exit and radiation spectrum. The linear high gain limit is studied analytically. It is shown that the radiation from a SASE FEL operating in the linear regime possesses all the features corresponding to completely chaotic polarized radiation. A detailed study of statistical properties of the radiation from a SASE FEL operating in linear and nonlinear regime has been performed by means of time-dependent simulation codes. All numerical results presented in the paper have been calculated for the 70 nm SASE FEL at the TESLA Test Facility being under construction at DESY.
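
    For reference, the textbook relations for completely chaotic polarized radiation that the abstract invokes (standard results for chaotic light, not values derived from this paper): the instantaneous power follows a negative exponential distribution, and the radiation energy W measured after a narrow-band monochromator follows a gamma distribution,

        p(P) = \frac{1}{\langle P \rangle} \exp\!\left(-\frac{P}{\langle P \rangle}\right),
        \qquad
        p(W) = \frac{M^{M}}{\Gamma(M)} \left(\frac{W}{\langle W \rangle}\right)^{M-1}
               \frac{1}{\langle W \rangle} \exp\!\left(-M \frac{W}{\langle W \rangle}\right),

    where M = \langle W \rangle^2 / \sigma_W^2 plays the role of the number of modes.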

  9. Exploratory Analysis of Survey Data for Understanding Adoption of Novel Aerospace Systems

    NASA Astrophysics Data System (ADS)

    Reddy, Lauren M.

    In order to meet the increasing demand for manned and unmanned flight, the air transportation system must constantly evolve. As new technologies or operational procedures are conceived, we must determine their effect on humans in the system. In this research, we introduce a strategy to assess how individuals or organizations would respond to a novel aerospace system. We employ the most appropriate and sophisticated exploratory analysis techniques on the survey data to generate insight and identify significant variables. We employ three different methods for eliciting views from individuals or organizations who are affected by a system: an opinion survey, a stated preference survey, and structured interviews. We conduct an opinion survey of both the general public and stakeholders in the unmanned aircraft industry to assess their knowledge, attitude, and practices regarding unmanned aircraft. We complete a statistical analysis of the multiple-choice questions using multinomial logit and multivariate probit models and conduct qualitative analysis on free-text questions. We next present a stated preference survey of the general public on the use of an unmanned aircraft package delivery service. We complete a statistical analysis of the questions using multinomial logit, ordered probit, linear regression, and negative binomial models. Finally, we discuss structured interviews conducted on stakeholders from ANSPs and airlines operating in the North Atlantic. We describe how these groups may choose to adopt a new technology (space-based ADS-B) or operational procedure (in-trail procedures). We discuss similarities and differences between the stakeholders groups, the benefits and costs of in-trail procedures and space-based ADS-B as reported by the stakeholders, and interdependencies between the groups interviewed. To demonstrate the value of the data we generated, we explore how the findings from the surveys can be used to better characterize uncertainty in the cost-benefit analysis of aerospace systems. We demonstrate how the findings from the opinion and stated preference surveys can be infused into the cost-benefit analysis of an unmanned aircraft delivery system. We also demonstrate how to apply the findings from the interviews to characterize uncertainty in the estimation of the benefits of space-based ADS-B.

  10. Data on evolutionary relationships between hearing reduction with history of disease and injuries among workers in Abadan Petroleum Refinery, Iran.

    PubMed

    Mohammadi, Mohammad Javad; Ghazlavi, Ebtesam; Gamizji, Samira Rashidi; Sharifi, Hajar; Gamizji, Fereshteh Rashidi; Zahedi, Atefeh; Geravandi, Sahar; Tahery, Noorollah; Yari, Ahmad Reza; Momtazan, Mahboobeh

    2018-02-01

    The present work examined data obtained during the analysis of Hearing Reduction (HR) among workers at the Abadan Petroleum Refinery (Abadan PR) in Iran with a history of disease and injuries. To this end, all workers in the refinery were included. In this research, the effects of history of disease and injury, including trauma, electric shock, meningitis-typhoid disease and genetic illness, as well as contact with lead, mercury, CO2 and alcohol consumption, were evaluated (Lie, et al., 2016) [1]. After the questionnaires were completed by the workers, the coded data were entered into Excel. Statistical analysis of the data was carried out using SPSS 16.

  11. A Documentary Analysis of Abstracts Presented in European Congresses on Adapted Physical Activity.

    PubMed

    Sklenarikova, Jana; Kudlacek, Martin; Baloun, Ladislav; Causgrove Dunn, Janice

    2016-07-01

    The purpose of the study was to identify trends in research abstracts published in the books of abstracts of the European Congress of Adapted Physical Activity from 2004 to 2012. A documentary analysis of the contents of 459 abstracts was completed. Data were coded based on subcategories used in a previous study by Zhang, deLisle, and Chen (2006) and by Porretta and Sherrill (2005): number of authors, data source, sample size, type of disability, data analyses, type of study, and focus of study. Descriptive statistics calculated for each subcategory revealed an overall picture of the state and trends of scientific inquiry in adapted physical activity research in Europe.

  12. When human walking becomes random walking: fractal analysis and modeling of gait rhythm fluctuations

    NASA Astrophysics Data System (ADS)

    Hausdorff, Jeffrey M.; Ashkenazy, Yosef; Peng, Chang-K.; Ivanov, Plamen Ch.; Stanley, H. Eugene; Goldberger, Ary L.

    2001-12-01

    We present a random walk, fractal analysis of the stride-to-stride fluctuations in the human gait rhythm. The gait of healthy young adults is scale-free with long-range correlations extending over hundreds of strides. This fractal scaling changes characteristically with maturation in children and older adults and becomes almost completely uncorrelated with certain neurologic diseases. Stochastic modeling of the gait rhythm dynamics, based on transitions between different “neural centers”, reproduces distinctive statistical properties of the gait pattern. By tuning one model parameter, the hopping (transition) range, the model can describe alterations in gait dynamics from childhood to adulthood - including a decrease in the correlation and volatility exponents with maturation.
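
    A minimal sketch of a detrended fluctuation analysis of a stride-interval series, the kind of random-walk/fractal analysis described above; the input series here is synthetic uncorrelated noise (expected scaling exponent near 0.5), not recorded gait data.

      # Hedged sketch: detrended fluctuation analysis (DFA) of a stride-interval series.
      import numpy as np

      def dfa_exponent(x, scales):
          y = np.cumsum(x - np.mean(x))              # integrated (random-walk) profile
          fluctuations = []
          for s in scales:
              n_windows = len(y) // s
              f2 = []
              for w in range(n_windows):
                  seg = y[w * s:(w + 1) * s]
                  t = np.arange(s)
                  trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrend
                  f2.append(np.mean((seg - trend) ** 2))
              fluctuations.append(np.sqrt(np.mean(f2)))
          # scaling exponent alpha: slope of log F(s) versus log s
          return np.polyfit(np.log(scales), np.log(fluctuations), 1)[0]

      rng = np.random.default_rng(1)
      strides = 1.1 + 0.05 * rng.standard_normal(1000)   # synthetic stride times (s)
      print(dfa_exponent(strides, scales=[4, 8, 16, 32, 64, 128]))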

  13. A simple white noise analysis of neuronal light responses.

    PubMed

    Chichilnisky, E J

    2001-05-01

    A white noise technique is presented for estimating the response properties of spiking visual system neurons. The technique is simple, robust, efficient and well suited to simultaneous recordings from multiple neurons. It provides a complete and easily interpretable model of light responses even for neurons that display a common form of response nonlinearity that precludes classical linear systems analysis. A theoretical justification of the technique is presented that relies only on elementary linear algebra and statistics. Implementation is described with examples. The technique and the underlying model of neural responses are validated using recordings from retinal ganglion cells, and in principle are applicable to other neurons. Advantages and disadvantages of the technique relative to classical approaches are discussed.
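
    A minimal sketch of the spike-triggered averaging at the core of this style of white-noise analysis: the linear filter of a linear-nonlinear model is estimated by averaging the stimulus segments that precede each spike. The stimulus, filter, and spike train below are synthetic, not recordings.

      # Hedged sketch: spike-triggered average under a synthetic linear-nonlinear-Poisson model.
      import numpy as np

      rng = np.random.default_rng(2)
      n_steps, filt_len = 20000, 20
      stimulus = rng.standard_normal(n_steps)            # white-noise stimulus

      true_filter = np.exp(-np.arange(filt_len) / 5.0)   # hypothetical ground-truth filter
      drive = np.convolve(stimulus, true_filter, mode="full")[:n_steps]
      rate = np.exp(drive - 3.0)                         # pointwise (exponential) nonlinearity
      spikes = rng.poisson(rate)                         # Poisson spike counts per bin

      # Spike-triggered average: weight each preceding stimulus window by its spike count.
      sta = np.zeros(filt_len)
      for t in range(filt_len, n_steps):
          if spikes[t]:
              sta += spikes[t] * stimulus[t - filt_len + 1:t + 1][::-1]
      sta /= spikes[filt_len:].sum()
      print(np.round(sta[:5], 3))    # estimate of the filter's most recent taps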

  14. SQC: secure quality control for meta-analysis of genome-wide association studies.

    PubMed

    Huang, Zhicong; Lin, Huang; Fellay, Jacques; Kutalik, Zoltán; Hubaux, Jean-Pierre

    2017-08-01

    Due to the limited power of small-scale genome-wide association studies (GWAS), researchers tend to collaborate and establish a larger consortium in order to perform large-scale GWAS. Genome-wide association meta-analysis (GWAMA) is a statistical tool that aims to synthesize results from multiple independent studies to increase the statistical power and reduce false-positive findings of GWAS. However, it has been demonstrated that the aggregate data of individual studies are subject to inference attacks, hence privacy concerns arise when researchers share study data in GWAMA. In this article, we propose a secure quality control (SQC) protocol, which enables checking the quality of data in a privacy-preserving way without revealing sensitive information to a potential adversary. SQC employs state-of-the-art cryptographic and statistical techniques for privacy protection. We implement the solution in a meta-analysis pipeline with real data to demonstrate the efficiency and scalability on commodity machines. The distributed execution of SQC on a cluster of 128 cores for one million genetic variants takes less than one hour, which is a modest cost considering the 10-month time span usually observed for the completion of the QC procedure that includes timing of logistics. SQC is implemented in Java and is publicly available at https://github.com/acs6610987/secureqc. jean-pierre.hubaux@epfl.ch. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  15. Using Genome-Wide Expression Profiling to Define Gene Networks Relevant to the Study of Complex Traits: From RNA Integrity to Network Topology

    PubMed Central

    O'Brien, M.A.; Costin, B.N.; Miles, M.F.

    2014-01-01

    Postgenomic studies of the function of genes and their role in disease have now become an area of intense study since efforts to define the raw sequence material of the genome have largely been completed. The use of whole-genome approaches such as microarray expression profiling and, more recently, RNA-sequence analysis of transcript abundance has allowed an unprecedented look at the workings of the genome. However, the accurate derivation of such high-throughput data and their analysis in terms of biological function has been critical to truly leveraging the postgenomic revolution. This chapter will describe an approach that focuses on the use of gene networks to both organize and interpret genomic expression data. Such networks, derived from statistical analysis of large genomic datasets and the application of multiple bioinformatics data resources, potentially allow the identification of key control elements for networks associated with human disease, and thus may lead to derivation of novel therapeutic approaches. However, as discussed in this chapter, the leveraging of such networks cannot occur without a thorough understanding of the technical and statistical factors influencing the derivation of genomic expression data. Thus, while the catch phrase may be "it's the network … stupid," an understanding of factors extending from RNA isolation to genomic profiling technique, multivariate statistics, and bioinformatics is critical to defining fully useful gene networks for the study of complex biology. PMID:23195313

  16. Use of a Self-Instructional Radiographic Anatomy Module for Dental Hygiene Faculty Calibration.

    PubMed

    Brame, Jennifer L; AlGheithy, Demah Salem; Platin, Enrique; Mitchell, Shannon H

    2017-06-01

    Purpose: Dental hygiene educators often provide inconsistent instruction in clinical settings, and various attempts to address the lack of consistency have been reported in the literature. The purpose of this pilot study was to determine whether the use of a self-instructional, radiographic anatomy (SIRA) module improved DH faculty calibration regarding the identification of normal intraoral and extraoral radiographic anatomy and whether its effect could be sustained over a period of four months. Methods: A convenience sample consisting of all dental hygiene faculty members involved in clinical instruction (N=23) at the University of North Carolina (UNC) was invited to complete the four parts of this online pilot study: a pre-test, review of the SIRA module, an immediate post-test, and a four-month follow-up post-test. Descriptive analyses, Friedman's ANOVA, and the exact form of the Wilcoxon signed-rank test were used to analyze the data. The level of significance was set at 0.05. Participants who did not complete all parts of the study were omitted from data analysis comparing pre- to post-test performance. Results: The pre-test response rate was 73.9% (N=17), and 88.2% (N=15) of those initial participants completed both the immediate and follow-up post-tests. Faculty completing all parts of the study consisted of 5 full-time faculty, 5 part-time faculty, and 5 graduate teaching assistants. Friedman's ANOVA revealed no statistically significant difference (P=0.179) in percentages of correct responses between the three tests (pre, post and follow-up). The exact form of the Wilcoxon signed-rank test revealed marginal significance when comparing percent of correct responses at pre-test and immediate post-test (P=0.054), and no statistically significant difference when comparing percent of correct responses at immediate post-test and the follow-up post-test four months later (P=0.106). Conclusions: Use of a SIRA module did not significantly affect DH faculty test performance. The lack of statistical significance in the percentages of correct responses between the three tests may have been affected by the small number of participants completing all four parts of the study (N=15). Additional research is needed to identify and improve methods for faculty calibration. Copyright © 2017 The American Dental Hygienists' Association.

  17. Employee and family assistance video counseling program: a post launch retrospective comparison with in-person counseling outcomes.

    PubMed

    Veder, Barbara; Pope, Stan; Mani, Michèle; Beaudoin, Kelly; Ritchie, Janice

    2014-01-01

    According to a recent study, access to technologically mediated information and services under the umbrella of mental and physical health has become increasingly available to clients via Internet modalities. In May 2010, video counseling was added to the counseling services offered through the Employee and Family Assistance Program at Shepell·fgi as a pilot project, with a full operational launch in September 2011. The objective of this study was to conduct a retrospective post-launch examination of the video counseling service through an analysis of the reported clinical outcomes of video and in-person counseling modalities. A chronological sample of 68 video counseling (VC) cases and 68 in-person (IP) cases was collected from a pool of client clinical files closed in 2012. To minimize the variables impacting the study and maintain as much clinical continuity as possible, the IP and VC clients had to have attended clinical sessions with one of six counselors who provided both the VC and IP services. The study compared the two counseling modalities along the following data points (see glossary of terms): (1) client demographic profiles (eg, age, gender, whether the sessions involved individuals or conjoint sessions with couples or families, etc), (2) presenting issue, (3) average session hours, (4) client rating of session helpfulness, (5) rates of goal completion, (6) client withdrawal rates, (7) no-show and late cancellation rates, and (8) pre/post client self-assessment. Specific to VC, we examined client geographic location. Data analysis demonstrates that VC and IP showed a similar representation of presenting issues, with nearly identical outcomes for client ratings of session helpfulness, rates of goal completion, pre/post client self-assessment, average session duration, and client geographic location. There were no statistically significant differences in the rates of withdrawal from counseling, no-shows, and late cancellations between VC and IP counseling. The statistical analysis of the data was done on SPSS statistical software using 2-sample and pairwise comparison t tests at a 95% level of significance. Based on the study, VC and IP show similar outcomes in terms of client rating of sessions and goal attainment.
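
    As a small illustration of the kind of comparison described in this record, the sketch below runs an independent-samples t test between two modalities and a paired t test on pre/post scores with scipy.stats. All numbers are invented; the study's own analysis was done in SPSS.

      # Toy sketch of the two-sample and pairwise t tests named in this record.
      from scipy.stats import ttest_ind, ttest_rel

      vc_helpfulness = [8, 7, 9, 6, 8, 7, 9, 8, 7, 8]   # invented VC session ratings
      ip_helpfulness = [8, 8, 9, 7, 7, 8, 9, 8, 6, 8]   # invented IP session ratings
      t, p = ttest_ind(vc_helpfulness, ip_helpfulness)
      print(f"VC vs IP helpfulness: t = {t:.2f}, p = {p:.3f}")

      pre_self  = [5, 4, 6, 3, 5, 4, 6, 5]              # invented pre self-assessments
      post_self = [7, 6, 7, 5, 6, 6, 8, 6]              # invented post self-assessments
      t, p = ttest_rel(pre_self, post_self)
      print(f"Pre vs post self-assessment: t = {t:.2f}, p = {p:.3f}")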

  18. Efficacy and Safety of Complete RAAS Blockade with ALISKIREN in Patients with Refractory Proteinuria Who were already on Combined ACE Inhibitor, ARB, and Aldosterone Antagonist.

    PubMed

    Panattil, Prabitha; Sreelatha, M

    2016-09-01

    Proteinuria is always associated with intrinsic kidney disease and is a strong predictor of later development of End Stage Renal Disease (ESRD). As the Renin Angiotensin Aldosterone System (RAAS) has a role in mediating proteinuria, inhibitors of this system are renoprotective, and patients with refractory proteinuria are put on a combination of these agents. The routinely employed triple blockade of the RAAS with an Angiotensin Converting Enzyme (ACE) inhibitor, ARB and Aldosterone antagonist has many limitations. Addition of Aliskiren to this combination suppresses the RAAS at the earliest stage and can offset many of these limitations. This study was conducted to assess the safety and efficacy of complete RAAS blockade achieved by adding Aliskiren in patients with refractory proteinuria who were already on triple blockade with an ACE inhibitor, ARB and Aldosterone antagonist. The study was conducted in the Nephrology Department, Calicut Medical College. A total of 36 patients with refractory proteinuria who were already on an ACE inhibitor, ARB and Aldosterone antagonist were divided into two groups, A and B. Group A received Aliskiren in addition to the above combination, whereas Group B continued the same treatment for 12 weeks. Efficacy was assessed by recording 24-hour urine protein, and safety by serum creatinine and serum potassium, every 2 weeks during the treatment period. Statistical analysis of the laboratory values was done using SPSS software; the unpaired t-test, paired t-test and Chi-square test were used for data analysis. Statistical analysis revealed that the addition of Aliskiren to combination therapy with ACE inhibitor + ARB + Aldosterone antagonist offered no advantage, although the mean reduction in proteinuria was greater in Group A than in Group B. There was no statistically significant change in serum creatinine or serum potassium at the end of treatment. As proteinuria is a strong risk factor for progression to ESRD, even a mild treatment-related decrease in proteinuria is renoprotective; hence the Group A regimen may be considered clinically superior to that of Group B, with no alteration in safety and tolerability. However, further multicentre studies with larger sample sizes and dose escalation are required for confirmation.

  19. Assessment of Oral Status in Pediatric Patients with Special Health Care Needs receiving Dental Rehabilitation Procedures under General Anesthesia: A Retrospective Analysis.

    PubMed

    Solanki, Neeraj; Kumar, Anuj; Awasthi, Neha; Kundu, Anjali; Mathur, Suveet; Bidhumadhav, Suresh

    2016-06-01

    Dental problems are an additional burden on children with special health care needs (CSHCN) because of the additional hospitalization pressure these children face for the treatment of various serious medical problems. These patients have a higher incidence of dental caries due to the increased quantity of sugar in their drug therapies and lower salivary flow in the oral cavity. Such patients are difficult to treat with local anesthesia or inhaled sedatives, but single-sitting dental treatment is possible under general anesthesia. Therefore, we conducted this retrospective analysis of the oral health status of CSHCN receiving various dental treatments in a given population. A total of 200 CSHCN aged 14 years or less who reported to the pediatric wing of the general hospital from 2005 to 2014 and underwent comprehensive dental treatment under general anesthesia were included in the study. Patients with a history of any additional systemic illness, malignancy, known drug allergy, or previous dental treatment were excluded. Complete mouth rehabilitation was performed in these patients under general anesthesia following standard protocols. Data regarding the type, duration, and severity of each patient's disability were collected and analyzed. All results were analyzed with Statistical Package for the Social Sciences (SPSS) software; the Chi-square test, Student's t-test, and one-way analysis of variance were used to assess the level of significance. Statistically significant results were obtained when the subjects' decayed-missing-filled/decayed-extracted-filled teeth indices were grouped by age. A significant difference was observed only for patients who underwent complete crown placement, even when grouped by type of disability. In the prevalence analysis, statistically significant results were observed when patients were grouped by age. In CSHCN, dental pathologies and caries indices are increased regardless of the type or extent of disability. Children with special health care needs should be given special oral health care, and regular dental checkups should be conducted, as these children are more prone to dental problems.
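
    To make the statistical tests named above concrete, the sketch below runs a one-way ANOVA across three age groups and a chi-square test on a 2 x 2 contingency table with scipy.stats; all counts and index values are invented for illustration, not taken from the study.

      # Toy sketch of the tests named in this record: one-way ANOVA on caries
      # indices grouped by age, and a chi-square test on a 2 x 2 table of counts.
      from scipy.stats import f_oneway, chi2_contingency

      dmft_age_0_5   = [4, 5, 6, 3, 7, 5]
      dmft_age_6_10  = [6, 7, 8, 5, 9, 7]
      dmft_age_11_14 = [8, 9, 7, 10, 8, 9]
      f, p = f_oneway(dmft_age_0_5, dmft_age_6_10, dmft_age_11_14)
      print(f"ANOVA across age groups: F = {f:.2f}, p = {p:.4f}")

      # Hypothetical counts of crown placement (yes/no) by disability type.
      table = [[30, 70],
               [55, 45]]
      chi2, p, dof, _ = chi2_contingency(table)
      print(f"Chi-square = {chi2:.2f}, dof = {dof}, p = {p:.4f}")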

  20. FieldTrip: Open Source Software for Advanced Analysis of MEG, EEG, and Invasive Electrophysiological Data

    PubMed Central

    Oostenveld, Robert; Fries, Pascal; Maris, Eric; Schoffelen, Jan-Mathijs

    2011-01-01

    This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow experimental neuroscientists to analyze experimental data. It includes algorithms for simple and advanced analysis, such as time-frequency analysis using multitapers, source reconstruction using dipoles, distributed sources and beamformers, connectivity analysis, and nonparametric statistical permutation tests at the channel and source level. The implementation as a toolbox allows the user to perform elaborate and structured analyses of large data sets using the MATLAB command line and batch scripting. Furthermore, users and developers can easily extend the functionality and implement new algorithms. The modular design facilitates reuse in other software packages. PMID:21253357
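
    FieldTrip itself is a MATLAB toolbox; purely to illustrate the idea behind the nonparametric permutation statistics it offers, here is a language-agnostic Python sketch of a label-permutation test for one channel. It is not FieldTrip code and omits the cluster-based correction FieldTrip applies across channels and time.

      # Illustrative permutation test (not FieldTrip code): shuffle condition
      # labels to build a null distribution for a difference in mean amplitude.
      import numpy as np

      rng = np.random.default_rng(1)
      cond_a = rng.normal(1.0, 1.0, size=30)    # toy trial amplitudes, condition A
      cond_b = rng.normal(0.4, 1.0, size=30)    # toy trial amplitudes, condition B

      observed = cond_a.mean() - cond_b.mean()
      pooled = np.concatenate([cond_a, cond_b])

      n_perm = 5000
      null = np.empty(n_perm)
      for k in range(n_perm):
          perm = rng.permutation(pooled)
          null[k] = perm[:30].mean() - perm[30:].mean()

      p_value = np.mean(np.abs(null) >= abs(observed))
      print(f"observed difference = {observed:.3f}, permutation p = {p_value:.4f}")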

  1. Face recognition using an enhanced independent component analysis approach.

    PubMed

    Kwak, Keun-Chang; Pedrycz, Witold

    2007-03-01

    This paper is concerned with an enhanced independent component analysis (ICA) and its application to face recognition. Typically, face representations obtained by ICA involve unsupervised learning and high-order statistics. In this paper, we develop an enhancement of the generic ICA by augmenting this method by the Fisher linear discriminant analysis (LDA); hence, its abbreviation, FICA. The FICA is systematically developed and presented along with its underlying architecture. A comparative analysis explores four distance metrics, as well as classification with support vector machines (SVMs). We demonstrate that the FICA approach leads to the formation of well-separated classes in low-dimension subspace and is endowed with a great deal of insensitivity to large variation in illumination and facial expression. The comprehensive experiments are completed for the facial-recognition technology (FERET) face database; a comparative analysis demonstrates that FICA comes with improved classification rates when compared with some other conventional approaches such as eigenface, fisherface, and the ICA itself.
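
    The general ICA-then-LDA recipe can be sketched with off-the-shelf tools; the pipeline below uses scikit-learn's FastICA, Fisher LDA, and a linear SVM on synthetic data. It is a rough stand-in for the idea only: the authors' FICA algorithm, its architecture, and the FERET data are not reproduced here.

      # Sketch of an ICA-then-LDA pipeline (not the authors' FICA implementation).
      import numpy as np
      from sklearn.decomposition import FastICA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.svm import SVC
      from sklearn.pipeline import make_pipeline
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 400))        # 200 toy "face" vectors of 400 pixels
      y = rng.integers(0, 10, size=200)      # 10 toy identity labels

      X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

      model = make_pipeline(
          FastICA(n_components=40, random_state=0, max_iter=1000),
          LinearDiscriminantAnalysis(),      # Fisher projection of the ICA features
          SVC(kernel="linear"),
      )
      model.fit(X_train, y_train)
      print("toy accuracy:", model.score(X_test, y_test))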

  2. Impact of a variational objective analysis scheme on a regional area numerical model: The Italian Air Force Weather Service experience

    NASA Astrophysics Data System (ADS)

    Bonavita, M.; Torrisi, L.

    2005-03-01

    A new data assimilation system has been designed and implemented at the National Center for Aeronautic Meteorology and Climatology of the Italian Air Force (CNMCA) in order to improve its operational numerical weather prediction capabilities and provide more accurate guidance to operational forecasters. The system, which is undergoing testing before operational use, is based on an “observation space” version of the 3D-VAR method for the objective analysis component, and on the High Resolution Regional Model (HRM) of the Deutscher Wetterdienst (DWD) for the prognostic component. Notable features of the system include a completely parallel (MPI+OMP) implementation of the solution of analysis equations by a preconditioned conjugate gradient descent method; correlation functions in spherical geometry with thermal wind constraint between mass and wind field; derivation of the objective analysis parameters from a statistical analysis of the innovation increments.
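
    In an observation-space formulation of 3D-VAR, the analysis step amounts to solving a symmetric positive-definite system (HBH^T + R) w = d for the innovation vector d and mapping the weights back through BH^T. The sketch below solves a tiny stand-in for that system with scipy's conjugate gradient and a diagonal preconditioner; the matrices are random toys, and the MPI+OMP parallelism, covariance modelling, and thermal wind constraint of the CNMCA system are not represented.

      # Toy observation-space 3D-VAR analysis step (not the CNMCA code):
      # solve (H B H^T + R) w = d by preconditioned conjugate gradient, then
      # form the model-space increment B H^T w.
      import numpy as np
      from scipy.sparse.linalg import cg, LinearOperator

      rng = np.random.default_rng(0)
      n_model, n_obs = 100, 30

      H = rng.normal(size=(n_obs, n_model)) / np.sqrt(n_model)  # toy observation operator
      B = np.eye(n_model)                                       # toy background covariance
      R = 0.1 * np.eye(n_obs)                                   # toy observation covariance

      d = rng.normal(size=n_obs)                                # innovations y - H(x_b)
      A = H @ B @ H.T + R                                       # SPD system matrix

      M = LinearOperator((n_obs, n_obs), matvec=lambda v: v / np.diag(A))  # Jacobi preconditioner

      w, info = cg(A, d, M=M)
      assert info == 0, "CG did not converge"
      increment = B @ H.T @ w                                   # analysis increment in model space
      print(increment[:5])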

  3. Monitoring and Evaluation: Statistical Support for Life-cycle Studies, Annual Report 2003.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skalski, John

    2003-11-01

    The ongoing mission of this project is the development of statistical tools for analyzing fisheries tagging data in the most precise and appropriate manner possible. This mission also includes providing statistical guidance on the best ways to design large-scale tagging studies. This mission continues because the technologies for conducting fish tagging studies continuously evolve. In just the last decade, fisheries biologists have seen the evolution from freeze-brands and coded wire tags (CWT) to passive integrated transponder (PIT) tags, balloon-tags, radiotelemetry, and now, acoustic-tags. With each advance, the technology holds the promise of more detailed and precise information. However, the technology for analyzing and interpreting the data also becomes more complex as the tagging techniques become more sophisticated. The goal of the project is to develop the analytical tools in parallel with the technical advances in tagging studies, so that maximum information can be extracted on a timely basis. Associated with this mission is the transfer of these analytical capabilities to the field investigators to assure consistency and the highest levels of design and analysis throughout the fisheries community. Consequently, this project provides detailed technical assistance on the design and analysis of tagging studies to groups requesting assistance throughout the fisheries community. Ideally, each project and each investigator would invest in the statistical support needed for the successful completion of their study. However, this is an ideal that is rarely if ever attained. Furthermore, there is only a small pool of highly trained scientists in this specialized area of tag analysis here in the Northwest. Project 198910700 provides the financial support to sustain this local expertise on the statistical theory of tag analysis at the University of Washington and make it available to the fisheries community. Piecemeal and fragmented support from various agencies and organizations would be incapable of maintaining a center of expertise. The mission of the project is to help assure tagging studies are designed and analyzed from the outset to extract the best available information using state-of-the-art statistical methods. The overarching goal of the project is to assure statistically sound survival studies so that fish managers can focus on the management implications of their findings and not be distracted by concerns about whether the studies are statistically reliable. Specific goals and objectives of the study include the following: (1) Provide consistent application of statistical methodologies for survival estimation across all salmon life cycle stages to assure comparable performance measures and assessment of results through time, to maximize learning and adaptive management opportunities, and to improve and maintain the ability to responsibly evaluate the success of implemented Columbia River FWP salmonid mitigation programs and identify future mitigation options. (2) Improve analytical capabilities to conduct research on survival processes of wild and hatchery chinook and steelhead during smolt outmigration, to improve monitoring and evaluation capabilities and assist in-season river management to optimize operational and fish passage strategies to maximize survival. (3) Extend statistical support to estimate ocean survival and in-river survival of returning adults. Provide statistical guidance in implementing a river-wide adult PIT-tag detection capability.
    (4) Develop statistical methods for survival estimation for all potential users and make this information available through peer-reviewed publications, statistical software, and technology transfers to organizations such as NOAA Fisheries, the Fish Passage Center, US Fish and Wildlife Service, US Geological Survey (USGS), US Army Corps of Engineers (USACE), Public Utility Districts (PUDs), the Independent Scientific Advisory Board (ISAB), and other members of the Northwest fisheries community. (5) Provide and maintain statistical software for tag analysis and user support. (6) Provide improvements in statistical theory and software as requested by user groups. These improvements include extending software capabilities to address new research issues, adapting tagging techniques to new study designs, and extending the analysis capabilities to new technologies such as radio-tags and acoustic-tags.

  4. The Impact of Recreational Marijuana Legislation in Washington, DC on Marijuana Use Cognitions.

    PubMed

    Clarke, Paige; Dodge, Tonya; Stock, Michelle L

    2018-04-13

    There is little published research that tests the effect of recreational marijuana legislation on risk-related cognitions and how individuals respond immediately after legislative approval. The objective was to test whether learning about the passage of Initiative 71, a voter referendum that legalized recreational use of marijuana in the District of Columbia, would lead individuals to adopt more favorable marijuana cognitions than they had before the Initiative was passed. Undergraduate students (N = 402) completed two web-based questionnaires in 2014. The first questionnaire was completed prior to the referendum vote and the follow-up questionnaire was completed after voters approved Initiative 71. Attitudes, perceived norms, intentions, prototypes, and willingness were measured at time 1 and time 2. Study hypotheses were tested using repeated-measures analysis of covariance. Results showed that attitudes, intentions, perceived norms, and willingness to use marijuana were more favorable after Initiative 71 was passed. However, the increase in attitudes and willingness was moderated by past experience with marijuana whereby the increases were statistically significant only among those with the least experience. The increase in perceived norms was also moderated by past experience whereby increases were statistically significant among those who were moderate or heavy users. The passage of Initiative 71 had no effect on favorable prototypes. Conclusion/Importance: Legalization may have the unintended outcome of leading to more favorable intentions to use marijuana and might lead abstainers or experimental users to become more frequent users of marijuana via more positive attitudes and willingness towards marijuana use.

  5. A study of the effects of an experimental spiral physics curriculum taught to sixth grade girls and boys

    NASA Astrophysics Data System (ADS)

    Davis, Edith G.

    The pilot study compared the effectiveness of using an experimental spiral physics curriculum to a traditional linear physics curriculum for sixth through eighth grades. The study also surveyed students' parents and principals about students' academic history and background as well as identified resilient children's attributes for academic success. The pilot study was used to help validate the testing instrument as well as help refine the complete study. The purpose of the complete study was to compare the effectiveness of using an experimental spiral physics curriculum and a traditional linear curriculum with sixth graders only; seventh and eighth graders were dropped in the complete study. The study also surveyed students' parents, teachers, and principals about students' academic history and background as well as identified resilient children's attributes for academic success. Both the experimental spiral physics curriculum and the traditional linear physics curriculum increased physics achievement; however, there was no statistically significant difference in effectiveness of teaching experimental spiral physics curriculum in the aggregated sixth grade group compared to the traditional linear physics curriculum. It is important to note that the majority of the subgroups studied did show statistically significant differences in effectiveness for the experimental spiral physics curriculum compared to the traditional linear physics curriculum. The Grounded Theory analysis of resilient student characteristics resulted in categories for future studies including the empathy factor ("E" factor), the tenacity factor ("T" factor), the relational factor ("R" factor), and the spiritual factor ("S" factor).

  6. Prognostic impact of number of resected and involved lymph nodes at complete resection on survival in non-small cell lung cancer.

    PubMed

    Saji, Hisashi; Tsuboi, Masahiro; Yoshida, Koichi; Kato, Yasufumi; Nomura, Masaharu; Matsubayashi, Jun; Nagao, Toshitaka; Kakihana, Masatoshi; Usuda, Jitsuo; Kajiwara, Naohiro; Ohira, Tatsuo; Ikeda, Norihiko

    2011-11-01

    Lymph node (LN) status is a major determinant of stage and survival in patients with lung cancer. In the 7th edition of the TNM Classification of Malignant Tumors, the number of involved LNs is included in the definition of pN factors in breast, stomach, esophageal, and colorectal cancer, and the pN status significantly correlates with prognosis. We retrospectively investigated the prognostic impact of the number of resected LNs (RLNs) and involved LNs in the context of other established clinical prognostic factors, in a series of 928 consecutive patients with non-small cell lung cancer (NSCLC) who underwent complete resection at our institution between 2000 and 2007. The mean number of RLNs was 15. There was a significant difference between patients with fewer than 10 and those with ≥10 total RLNs (p = 0.0129). Although the incidence of LN involvement was statistically associated with poor prognosis, the largest statistically significant difference in overall survival was observed between 0 to 3 and ≥4 involved LNs (hazard ratio = 7.680; 95% confidence interval = 5.051-11.655, p < 0.0001). On multivariate analysis, we used the ratio between the number of involved LNs and RLNs. The number of RLNs was found to be a strong independent prognostic factor for NSCLC (hazard ratio = 6.803; 95% confidence interval = 4.137-11.186, p < 0.0001). Complete resection including 10 or more LNs influenced survival after complete NSCLC resection. Four involved LNs seemed to be a benchmark for NSCLC prognosis. The number of involved LNs is a strong independent prognostic factor in NSCLC, and the results of this study may provide new information for determining the N category in the next tumor, node, metastasis classification.
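
    The multivariate analysis described above is of the kind usually done with a Cox proportional hazards model; the sketch below fits such a model with the lifelines package on invented data, using indicator covariates for ≥4 involved nodes and ≥10 resected nodes. It illustrates the modelling approach only, not the authors' dataset or exact model.

      # Sketch of a multivariate survival model with lymph-node covariates,
      # using the lifelines package on invented data.
      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter

      rng = np.random.default_rng(0)
      n = 300
      df = pd.DataFrame({
          "months_followup":  rng.exponential(40, size=n),
          "death_observed":   rng.integers(0, 2, size=n),
          "involved_ln_ge4":  rng.integers(0, 2, size=n),   # >= 4 involved lymph nodes
          "resected_ln_ge10": rng.integers(0, 2, size=n),   # >= 10 resected lymph nodes
          "age":              rng.normal(65, 8, size=n),
      })

      cph = CoxPHFitter()
      cph.fit(df, duration_col="months_followup", event_col="death_observed")
      cph.print_summary()   # hazard ratios with 95% confidence intervals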

  7. [The statistical analysis for the use of the 55,787 finished resin teeth].

    PubMed

    Wu, Shu-hong; Yu, Hai-yang; Wang, Lu; Xu, Ling; Xiao, Zhi-li

    2010-08-01

    To analyze the usage patterns of finished resin teeth by tooth position and to provide a reference for manufacturers and buyers of finished resin teeth. The use of finished resin teeth in the Dental Laboratory of the Affiliated Hospital of Stomatology of Chongqing Medical University from January 2006 to December 2008 was analyzed with statistical methods. Analysis of the 55,787 finished resin teeth used revealed several patterns. (1) The most frequently used finished resin tooth was D6 (5.31% of the total), and the least frequently used was D3 (1.94%). (2) Except for the maxillary canines and the mandibular lateral incisors, there was no significant difference in usage among finished resin teeth of the same name (P > 0.05). (3) Among all finished resin teeth, the usage of section B exceeded that of section A, and maxillary finished resin teeth were used more than mandibular ones (P < 0.05). (4) Complete dentures and single complete dentures accounted for about one third of the total usage of finished resin teeth. (5) Excluding complete dentures and single complete dentures, mandibular left and right central incisors were the teeth most frequently used simultaneously (81.46%), whereas maxillary left and right canines were the least frequently used simultaneously (43.26%). There is a significant difference in the frequency of use of finished resin teeth by tooth position; manufacturers should therefore produce, and buyers purchase, finished resin teeth in proportion to these usage rates.

  8. Difference in real-time magnetic image analysis of colonic looping patterns between males and females undergoing diagnostic colonoscopy.

    PubMed

    Lam, Jacob; Wilkinson, James; Brassett, Cecilia; Brown, Jonathan

    2018-05-01

    Background and study aim  Magnetic imaging technology is of proven benefit to trainees in colonoscopy, but few studies have examined its benefits in experienced hands. There is evidence that colonoscopy is more difficult in women. We set out to investigate (i) associations between the looping configurations in the proximal and distal colon and (ii) differences in the looping prevalence between the sexes. We have examined their significance in terms of segmental intubation times and position changes required for the completion of colonoscopy. Patients and methods  We analyzed 103 consecutive synchronized luminal and magnetic image videos of diagnostic colonoscopies with normal anatomy undertaken by a single experienced operator. Results  Deep transverse loops and sigmoid N-loops were more common in females. A deep transverse loop was more likely to be present if a sigmoid alpha-loop or N-loop had formed previously. Patients with sigmoid N-loops were turned more frequently from left lateral to supine before the sigmoid-descending junction was reached, but there was no statistical correlation between completion time and looping pattern. Conclusions  This study has reexamined the prevalence of the common looping patterns encountered during colonoscopy and has identified differences between the sexes. This finding may offer an explanation as to why colonoscopy has been shown to be more difficult in females. Although a deep transverse loop following a resolved sigmoid alpha-loop was the most commonly encountered pattern, no statistical correlation between completion time and looping pattern could be shown. It is the first study to examine segmental completion times using a magnetic imager in expert hands.

  9. A Morbidity Screening Tool for identifying fatigue, pain, upper limb dysfunction and lymphedema after breast cancer treatment: a validity study.

    PubMed

    Bulley, Catherine; Coutts, Fiona; Blyth, Christine; Jack, Wilma; Chetty, Udi; Barber, Matthew; Tan, Chee Wee

    2014-04-01

    This study aimed to investigate validity of a newly developed Morbidity Screening Tool (MST) to screen for fatigue, pain, swelling (lymphedema) and arm function after breast cancer treatment. A cross-sectional study included women attending reviews after completing treatment (surgery, chemotherapy and radiotherapy), without recurrence, who could read English. They completed the MST and comparator questionnaires: Disability of the Arm, Shoulder and Hand questionnaire (DASH), Chronic Pain Grade Questionnaire (CPGQ), Lymphedema and Breast Cancer Questionnaire (LBCQ) and Functional Assessment of Cancer Therapy questionnaire with subscales for fatigue (FACT F) and breast cancer (FACT B + 4). Bilateral combined shoulder ranges of motion were compared (upward reach; hand behind back) and percentage upper limb volume difference (%LVD =/>10% diagnosed as lymphedema) measured with the vertical perometer (400T). 613 of 617 participants completed questionnaires (mean age 62.3 years, SD 10.0; mean time since treatment 63.0 months, SD 46.6) and 417 completed objective testing. Morbidity prevalence was estimated as 35.8%, 21.9%, 19.8% and 34.4% for fatigue, impaired upper limb function, lymphedema and pain respectively. Comparing those self-reporting the presence or absence of each type of morbidity, statistically significant differences in comparator variables supported validity of the MST. Statistically significant correlations resulted between MST scores focussing on impact of morbidity, and comparator variables that reflect function and quality of life. Analysis supports the validity of all four short-forms of the MST as providing indications of both presence of morbidity and impacts on participants' lives. This may facilitate early and appropriate referral for intervention. Copyright © 2013 Elsevier Ltd. All rights reserved.

  10. Usefulness and limitations of various guinea-pig test methods in detecting human skin sensitizers-validation of guinea-pig tests for skin hypersensitivity.

    PubMed

    Marzulli, F; Maguire, H C

    1982-02-01

    Several guinea-pig predictive test methods were evaluated by comparison of results with those obtained with human predictive tests, using ten compounds that have been used in cosmetics. The method involves the statistical analysis of the frequency with which guinea-pig tests agree with the findings of tests in humans. In addition, the frequencies of false positive and false negative predictive findings are considered and statistically analysed. The results clearly demonstrate the superiority of adjuvant tests (complete Freund's adjuvant) in determining skin sensitizers and the overall superiority of the guinea-pig maximization test in providing results similar to those obtained by human testing. A procedure is suggested for utilizing adjuvant and non-adjuvant test methods for characterizing compounds as of weak, moderate or strong sensitizing potential.

  11. How weak values emerge in joint measurements on cloned quantum systems.

    PubMed

    Hofmann, Holger F

    2012-07-13

    A statistical analysis of optimal universal cloning shows that it is possible to identify an ideal (but nonpositive) copying process that faithfully maps all properties of the original Hilbert space onto two separate quantum systems, resulting in perfect correlations for all observables. The joint probabilities for noncommuting measurements on separate clones then correspond to the real parts of the complex joint probabilities observed in weak measurements on a single system, where the measurements on the two clones replace the corresponding sequence of weak measurement and postselection. The imaginary parts of weak measurement statistics can be obtained by replacing the cloning process with a partial swap operation. A controlled-swap operation combines both processes, making the complete weak measurement statistics accessible as a well-defined contribution to the joint probabilities of fully resolved projective measurements on the two output systems.
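
    The real and imaginary parts referred to in this record are those of the standard complex weak value A_w = ⟨f|A|i⟩ / ⟨f|i⟩. The short numerical check below evaluates that expression for a single qubit; the cloning and partial-swap constructions themselves are not simulated, only the quantity whose parts they make accessible.

      # Numerical illustration of a complex weak value A_w = <f|A|i> / <f|i>
      # for one qubit (the cloning / swap circuits of the paper are not simulated).
      import numpy as np

      sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)

      theta = 0.3 * np.pi
      psi_i = np.array([np.cos(theta), np.sin(theta)], dtype=complex)   # preselected state
      psi_f = np.array([1, 1j], dtype=complex) / np.sqrt(2)             # postselected state

      weak_value = (psi_f.conj() @ sigma_z @ psi_i) / (psi_f.conj() @ psi_i)
      print("Re A_w =", weak_value.real, " Im A_w =", weak_value.imag)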

  12. Size of the Dynamic Bead in Polymers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agapov, Alexander L; Sokolov, Alexei P

    2010-01-01

    The presented analysis of neutron, mechanical, and MD simulation data available in the literature demonstrates that the dynamic bead size (the smallest subchain that still exhibits the Rouse-like dynamics) in most of the polymers is significantly larger than the traditionally defined Kuhn segment. Moreover, our analysis emphasizes that even the static bead size (e.g., chain statistics) disagrees with the Kuhn segment length. We demonstrate that the deficiency of the Kuhn segment definition is based on the assumption of a chain being completely extended inside a single bead. The analysis suggests that representation of a real polymer chain by the bead-and-spring model with a single parameter C cannot be correct. One needs more parameters to reflect correctly the details of the chain structure in the bead-and-spring model.

  13. Structure and Content Analysis for Vocational High School Website in Indonesia

    NASA Astrophysics Data System (ADS)

    Subagja, H.; Abdullah, A. G.; Trisno, B.; Nandiyanto, A. B. D.

    2017-03-01

    Statistics on the condition of school websites in Indonesia are still difficult to obtain. This study aims to determine website quality in terms of content-completeness criteria for Vocational High Schools (VHS) in West Java, Indonesia. The methods used are content analysis and a survey. The content analysis reviews documents comprising the general categories, while the survey is an observation process to gather facts from 272 school websites. Aspects of school website structure and content include institutional information, educators and education personnel, curriculum, students, infrastructure, school achievement, and public access. The results of this study show that the average quality of VHS websites in West Java is still low. Recommendations are needed to improve the quality of these school websites.

  14. Intrabolus pressure on high-resolution manometry distinguishes fibrostenotic and inflammatory phenotypes of eosinophilic esophagitis.

    PubMed

    Colizzo, J M; Clayton, S B; Richter, J E

    2016-08-01

    The aim of this investigation was to determine the motility patterns of inflammatory and fibrostenotic phenotypes of eosinophilic esophagitis (EoE) utilizing high-resolution manometry (HRM). Twenty-nine patients with a confirmed diagnosis of EoE according to clinicopathological criteria currently being managed at the Joy McCann Culverhouse Swallowing Center at the University of South Florida were included in the retrospective analysis. Only patients who completed HRM studies were included in the analysis. Patients were classified into inflammatory or fibrostenotic subtypes based on baseline endoscopic evidence. Their baseline HRM studies prior to therapy were analyzed. Manometric data including distal contractile integral, integrated relaxation pressure, and intrabolus pressure (IBP) values were recorded. HRM results were interpreted according to the Chicago Classification system. Statistical analysis was performed with SPSS software (Version 22, IBM Co., Armonk, NY, USA). Data were compared utilizing Student's t-test, χ2 test, Pearson correlation, and Spearman correlation tests. Statistical significance was set at P < 0.05. A total of 29 patients with EoE were included in the retrospective analysis. The overall average age among patients was 40 years. Male patients comprised 62% of the overall population. Both groups were similar in age, gender, and overall clinical presentation. Seventeen patients (58%) had fibrostenotic disease, and 12 (42%) displayed inflammatory disease. The average IBP values for the fibrostenotic and inflammatory groups were 18.6 ± 6.0 mmHg and 12.6 ± 3.5 mmHg, respectively (P < 0.05). Strictures were only seen in the fibrostenotic group. Of the fibrostenotic group, 6 (35%) demonstrated proximal esophageal strictures, 7 (41%) had distal strictures, 3 (18%) had mid-esophageal strictures, and 1 (6%) patient had pan-esophageal strictures. There was no statistically significant correlation between the level of esophageal stricture and degree of IBP. Integrated relaxation pressure, distal contractile integral, and other HRM metrics did not demonstrate statistical significance between the two subtypes. There was also no statistically significant correlation between patient demographics and esophageal metrics. Patients with the fibrostenotic phenotype of EoE demonstrated an IBP that was significantly higher than that of the inflammatory group. © 2015 International Society for Diseases of the Esophagus.

  15. Analysis of water levels in the Frenchman Flat area, Nevada Test Site

    USGS Publications Warehouse

    Bright, D.J.; Watkins, S.A.; Lisle, B.A.

    2001-01-01

    Analysis of water levels in 21 wells in the Frenchman Flat area, Nevada Test Site, provides information on the accuracy of hydraulic-head calculations, temporal water-level trends, and potential causes of water-level fluctuations. Accurate hydraulic heads are particularly important in Frenchman Flat where the hydraulic gradients are relatively flat (less than 1 foot per mile) in the alluvial aquifer. Temporal water-level trends with magnitudes near or exceeding the regional hydraulic gradient may have a substantial effect on ground-water flow directions. Water-level measurements can be adjusted for the effects of barometric pressure, formation water density (from water-temperature measurements), borehole deviation, and land-surface altitude in selected wells in the Frenchman Flat area. Water levels in one well were adjusted for the effect of density; this adjustment was significantly greater (about 17 feet) than the adjustment of water levels for barometric pressure, borehole deviation, or land-surface altitude (less than about 4 feet). Water-level measurements from five wells exhibited trends that were statistically and hydrologically significant. Statistically significant water-level trends were observed for three wells completed in the alluvial aquifer (WW-5a, UE-5n, and PW-3), for one well completed in the carbonate aquifer (SM-23), and for one well completed in the quartzite confining unit (Army-6a). Potential causes of water-level fluctuations in wells in the Frenchman Flat area include changes in atmospheric conditions (precipitation and barometric pressure), Earth tides, seismic activity, past underground nuclear testing, and nearby pumping. Periodic water-level measurements in some wells completed in the carbonate aquifer indicate cyclic-type water-level fluctuations that generally correlate with longer term changes (more than 5 years) in precipitation. Ground-water pumping from the alluvial aquifer at well WW-5c and pumping and discharge from well RNM-2s appear to cause water-level fluctuations in nearby observation wells. The remaining known sources of water-level fluctuations do not appear to substantially affect water-level changes (seismic activity and underground nuclear testing) or do not affect changes over a period of more than 1 year (barometric pressure and Earth tides) in wells in the Frenchman Flat area.

  16. A Randomized Comparative Trial of the Knowledge Retention and Usage Conditions in Undergraduate Medical Students Using Podcasts and Blog Posts

    PubMed Central

    Chin, Alvin; Helman, Anton; Chan, Teresa M

    2018-01-01

    Introduction Podcasts and blog posts have gained popularity in Free Open Access Medical education (FOAM). Previous work suggests that podcasts may be useful for knowledge acquisition in undergraduate medical education. However, there remains a paucity of research comparing the two mediums. This study aims to investigate if there are differences in knowledge acquisition and usage conditions by medical students using podcasts and blog posts. Methods Medical students were randomized to either the podcast or blog post group. They completed an initial online assessment of their baseline knowledge on the subject matter. Participants then received access to learning materials and were given four weeks to complete the follow-up assessment on their own time. Independent t-test, paired samples t-test, and a mixed ANOVA (analysis of variance) were conducted to assess knowledge acquisition. An intention-to-teach analysis was used to impute missing data from students lost to follow-up. Simple descriptive statistical data was used to describe media usage conditions. Results Completion of at least one follow-up assessment was comparable (68% podcasts (n = 21/31), 73% blog posts (n = 22/30)). Both groups showed significant improvements in their test scores, with an average 22% improvement for the podcast group and 29% for the blog post group. There was no significant statistical difference in knowledge acquisition between educational modalities overall. Students in the blog post group that completed both post-intervention quizzes showed a larger improvement than the podcast group in the toxicology topic, with similar improvements in the asthma topic. The podcast group tended to engage in multiple activities while using the learning materials (e.g. at least two to three of the following: driving, eating, chores, taking notes, exercising/walking), while the blog readers tended to do fewer activities (e.g. only one of the following: note taking, eating). Conclusion This study suggests that podcasts and blog posts are useful for extracurricular knowledge acquisition by undergraduate medical students with no significant difference between the two modalities. The usage conditions for each type of media differ. PMID:29552428

  17. Statistical Association Criteria in Forensic Psychiatry–A criminological evaluation of casuistry

    PubMed Central

    Gheorghiu, V; Buda, O; Popescu, I; Trandafir, MS

    2011-01-01

    Purpose. The potential for shared primary psychoprophylaxis and crime prevention is assessed by analyzing the rate of commitments among patients who are subjects of forensic examination. Material and method. This is a retrospective, document-based statistical study. The statistical lot consists of 770 initial examination reports performed and completed during 2007 and primarily analyzed to summarize data within the National Institute of Forensic Medicine, Bucharest, Romania (INML); one of the group variables was ‘particularities of the psychiatric patient history’, containing the items ‘forensic onset’, ‘commitments within the last year prior to the examination’ and ‘absence of commitments within the last year prior to the examination’. The method used was the Kendall bivariate correlation. For this study, the authors separately analyze only the two items regarding commitments, using other correlation alternatives and more elaborate statistical analyses, i.e., recording of the standard case study variables, Kendall bivariate correlation, cross tabulation, factor analysis and hierarchical cluster analysis. Results. The results are varied, ranging from theoretically presumed clinical nosography (such as schizophrenia or manic depression) to non-presumed (conduct disorders) or unexpected behavioral acts, and are therefore difficult to interpret. Conclusions. The features of the batch were considered, as well as the results of the previous standard correlation of the whole statistical lot. The authors emphasize the role of the medical security measures actually applied in therapeutic management in general and in risk and second-offence management in particular, as well as the role of forensic psychiatric examinations in detecting aspects relevant to the monitoring of mental patients. PMID:21505571
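
    A Kendall bivariate correlation of the kind named above can be computed directly with scipy.stats.kendalltau; the ten-case example below uses invented indicator variables, not the INML casework data.

      # Toy illustration of the Kendall bivariate correlation used in the study.
      from scipy.stats import kendalltau

      committed_last_year = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]   # invented case indicators
      forensic_onset      = [1, 0, 1, 0, 0, 1, 1, 0, 1, 0]

      tau, p = kendalltau(committed_last_year, forensic_onset)
      print(f"Kendall tau = {tau:.2f}, p = {p:.3f}")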

  18. Assessment and statistics of surgically induced astigmatism.

    PubMed

    Naeser, Kristian

    2008-05-01

    The aim of the thesis was to develop methods for assessment of surgically induced astigmatism (SIA) in individual eyes, and in groups of eyes. The thesis is based on 12 peer-reviewed publications, published over a period of 16 years. In these publications, older and contemporary literature was reviewed(1). A new method (the polar system) for analysis of SIA was developed. Multivariate statistical analysis of refractive data was described(2-4). Clinical validation studies were performed. Descriptions of a cylinder surface with polar values and with differential geometry were compared. The main results were: refractive data in the form of sphere, cylinder and axis may define an individual patient or data set, but are unsuited for mathematical and statistical analyses(1). The polar value system converts net astigmatisms to orthonormal components in dioptric space. A polar value is the difference in meridional power between two orthogonal meridians(5,6). Any pair of polar values, separated by an arc of 45 degrees, characterizes a net astigmatism completely(7). The two polar values represent the net curvital and net torsional power over the chosen meridian(8). The spherical component is described by the spherical equivalent power. Several clinical studies demonstrated the efficiency of multivariate statistical analysis of refractive data(4,9-11). Polar values and formal differential geometry describe astigmatic surfaces with similar concepts and mathematical functions(8). Other contemporary methods, such as Long's power matrix, Holladay's and Alpins' methods, Zernike(12) and Fourier analyses(8), are correlated to the polar value system. In conclusion, analysis of SIA should be performed with polar values or other contemporary component systems. The study was supported by Statens Sundhedsvidenskabeligt Forskningsråd, Cykelhandler P. Th. Rasmussen og Hustrus Mindelegat, Hotelejer Carl Larsen og Hustru Nicoline Larsens Mindelegat, Landsforeningen til Vaern om Synet, Forskningsinitiativet for Arhus Amt, Alcon Denmark, and Desirée and Niels Ydes Fond.
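
    The conversion of a sphere/cylinder/axis refraction into orthonormal dioptric components can be sketched with the widely used power-vector convention (M, J0, J45), which is closely related to, but not identical in sign and scale convention with, the polar values of the thesis; treat the code below as a generic illustration rather than Naeser's exact formulas.

      # Generic power-vector decomposition (M, J0, J45) of a refraction; a related,
      # but not identical, convention to the thesis' polar values.
      import math

      def power_vector(sphere, cylinder, axis_deg):
          """Return (M, J0, J45) for a refraction written in minus-cylinder form."""
          a = math.radians(axis_deg)
          m   = sphere + cylinder / 2.0            # spherical equivalent
          j0  = -(cylinder / 2.0) * math.cos(2 * a)
          j45 = -(cylinder / 2.0) * math.sin(2 * a)
          return m, j0, j45

      # Example: surgically induced change expressed as component-wise differences.
      pre  = power_vector(-2.00, -1.50, 10)
      post = power_vector(-1.75, -0.75, 170)
      sia  = tuple(b - a for a, b in zip(pre, post))
      print("change in (M, J0, J45):", sia)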

  19. Mortality Trends After a Voluntary Checklist-based Surgical Safety Collaborative.

    PubMed

    Haynes, Alex B; Edmondson, Lizabeth; Lipsitz, Stuart R; Molina, George; Neville, Bridget A; Singer, Sara J; Moonan, Aunyika T; Childers, Ashley Kay; Foster, Richard; Gibbons, Lorri R; Gawande, Atul A; Berry, William R

    2017-12-01

    To determine whether completion of a voluntary, checklist-based surgical quality improvement program is associated with reduced 30-day postoperative mortality. Despite evidence of efficacy of team-based surgical safety checklists in improving perioperative outcomes in research trials, effective methods of population-based implementation have been lacking. The Safe Surgery 2015 South Carolina program was designed to foster state-wide engagement of hospitals in a voluntary, collaborative implementation of a checklist program. We compared postoperative mortality rates after inpatient surgery in South Carolina utilizing state-wide all-payer discharge claims from 2008 to 2013, linked with state vital statistics, stratifying hospitals on the basis of completion of the checklist program. Changes in risk-adjusted 30-day mortality were compared between hospitals, using propensity score-adjusted difference-in-differences analysis. Fourteen hospitals completed the program by December 2013. Before program launch, there was no difference in mortality trends between the completion cohort and all others (P = 0.33), but postoperative mortality diverged thereafter (P = 0.021). Risk-adjusted 30-day mortality among completers was 3.38% in 2010 and 2.84% in 2013 (P < 0.00001), whereas mortality among other hospitals (n = 44) was 3.50% in 2010 and 3.71% in 2013 (P = 0.3281), reflecting a 22% difference between the groups on difference-in-differences analysis (P = 0.0021). Despite similar pre-existing rates and trends of postoperative mortality, hospitals in South Carolina completing a voluntary checklist-based surgical quality improvement program had a reduction in deaths after inpatient surgery over the first 3 years of the collaborative compared with other hospitals in the state. This may indicate that effective large-scale implementation of a team-based surgical safety checklist is feasible.
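
    The core of a difference-in-differences comparison like the one described above is a regression with group, period, and group-by-period interaction terms; the bare-bones sketch below fits that model with statsmodels on invented data and deliberately omits the propensity-score adjustment and risk adjustment used in the study.

      # Bare-bones difference-in-differences sketch (unadjusted, invented data):
      # the completer:post coefficient is the difference-in-differences estimate.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(0)
      n = 400
      df = pd.DataFrame({
          "completer": rng.integers(0, 2, size=n),   # checklist-program completion
          "post":      rng.integers(0, 2, size=n),   # after program launch
      })
      df["mortality"] = (0.035
                         - 0.006 * df["completer"] * df["post"]
                         + rng.normal(0, 0.005, size=n))

      model = smf.ols("mortality ~ completer * post", data=df).fit(cov_type="HC1")
      print(model.summary().tables[1])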

  20. Development of immunity following financial incentives for hepatitis B vaccination among people who inject drugs: A randomized controlled trial.

    PubMed

    Day, Carolyn A; Shanahan, Marian; Wand, Handan; Topp, Libby; Haber, Paul S; Rodgers, Craig; Deacon, Rachel; Walsh, Nick; Kaldor, John; van Beek, Ingrid; Maher, Lisa

    2016-01-01

    People who inject drugs (PWID) are at risk of hepatitis B virus (HBV) but have low rates of vaccination completion. The provision of modest financial incentives increases vaccination schedule completion, but their association with serological protection has yet to be determined. To investigate factors associated with vaccine-induced immunity among a sample of PWID randomly allocated to receive AUD$30 cash following receipt of doses two and three ('incentive condition') or standard care ('control condition') using an accelerated 3-dose (0,7,21 days) HBV vaccination schedule. A randomised controlled trial among PWID attending two inner-city health services and a field site in Sydney, Australia, assessing vaccine-induced immunity measured by hepatitis B surface antibodies (HBsAb ≥ 10 mIU/ml) at 12 weeks. The cost of the financial incentives and the provision of the vaccine program are also reported. Just over three-quarters of participants (107/139, 77%) completed the vaccination schedule and 79/139 (57%) were HBsAb ≥ 10 mIU/ml at 12 weeks. Vaccine series completion was the only variable significantly associated with vaccine-induced immunity in univariate analysis (62% vs 41%, p<0.035) but was not significant in multivariate analysis. There was no statistically discernible association between group allocation and series completion (62% vs 53%). The mean costs were AUD$150.5 (95% confidence interval [CI]: 142.7-158.3) and AUD$76.9 (95% CI: 72.6-81.3) for the intervention and control groups respectively. Despite increasing HBV vaccination completion, provision of financial incentives was not associated with enhanced serological protection. Further research into factors which affect response rates and into the optimal vaccination regimen and incentive schemes for this population is needed. Copyright © 2015 Elsevier B.V. All rights reserved.
