Sample records for basic statistical analysis

  1. Statistics for nuclear engineers and scientists. Part 1. Basic statistical inference

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beggs, W.J.

    1981-02-01

    This report is intended for the use of engineers and scientists working in the nuclear industry, especially at the Bettis Atomic Power Laboratory. It serves as the basis for several Bettis in-house statistics courses. The objectives of the report are to introduce the reader to the language and concepts of statistics and to provide a basic set of techniques to apply to problems of the collection and analysis of data. Part 1 covers subjects of basic inference. The subjects include: descriptive statistics; probability; simple inference for normally distributed populations, and for non-normal populations as well; comparison of two populations; the analysis of variance; quality control procedures; and linear regression analysis.

  2. Descriptive data analysis.

    PubMed

    Thompson, Cheryl Bagley

    2009-01-01

    This 13th article of the Basics of Research series is the first in a short series on statistical analysis. These articles will discuss creating your statistical analysis plan, levels of measurement, descriptive statistics, probability theory, inferential statistics, and general considerations for interpretation of the results of a statistical analysis.

  3. CADDIS Volume 4. Data Analysis: Basic Principles & Issues

    EPA Pesticide Factsheets

    Use of inferential statistics in causal analysis, introduction to data independence and autocorrelation, methods to identify and control for confounding variables, and references for the Basic Principles section of Data Analysis.

  4. A crash course on data analysis in asteroseismology

    NASA Astrophysics Data System (ADS)

    Appourchaux, Thierry

    2014-02-01

    In this course, I try to provide a few basics required for performing data analysis in asteroseismology. First, I address how one can properly treat time series: the sampling, the filtering effect, the use of the Fourier transform, and the associated statistics. Second, I address how one can apply statistics for decision making and for parameter estimation, in either a frequentist or a Bayesian framework. Last, I review how these basic principles have been applied (or not) in asteroseismology.

  5. Applications of statistics to medical science (1) Fundamental concepts.

    PubMed

    Watanabe, Hiroshi

    2011-01-01

    The conceptual framework of statistical tests and statistical inference is discussed, and the epidemiological background of statistics is briefly reviewed. This study is one of a series in which we survey the basics of statistics and the practical methods used in medical statistics. Arguments related to actual statistical analysis procedures will be made in subsequent papers.

  6. The Social Profile of Students in Basic General Education in Ecuador: A Data Analysis

    ERIC Educational Resources Information Center

    Buri, Olga Elizabeth Minchala; Stefos, Efstathios

    2017-01-01

    The objective of this study is to examine the social profile of students who are enrolled in Basic General Education in Ecuador. Both a descriptive and multidimensional statistical analysis was carried out based on the data provided by the National Survey of Employment, Unemployment and Underemployment in 2015. The descriptive analysis shows the…

  7. 78 FR 23158 - Organization and Delegation of Duties

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-18

    ... management actions of major significance, such as those relating to changes in basic organization pattern... regard to rulemaking, enforcement, vehicle safety research and statistics and data analysis, provides... Administrator for the National Center for Statistics and Analysis, and the Associate Administrator for Vehicle...

  8. Proceedings of the NASTRAN (Tradename) Users’ Colloquium (15th) Held in Kansas City, Missouri on 4-8 May 1987

    DTIC Science & Technology

    1987-08-01

    HVAC duct hanger system over an extensive frequency range. The finite element, component mode synthesis, and statistical energy analysis methods are...800-5,000 Hz) analysis was conducted with Statistical Energy Analysis (SEA) coupled with a closed-form harmonic beam analysis program. These...resonances may be obtained by using a finer frequency increment. Statistical Energy Analysis The basic assumption used in SEA analysis is that within each band

  9. Cognition of and Demand for Education and Teaching in Medical Statistics in China: A Systematic Review and Meta-Analysis

    PubMed Central

    Li, Gaoming; Yi, Dali; Wu, Xiaojiao; Liu, Xiaoyu; Zhang, Yanqi; Liu, Ling; Yi, Dong

    2015-01-01

    Background: Although a substantial number of studies focus on the teaching and application of medical statistics in China, few studies comprehensively evaluate the recognition of and demand for medical statistics. In addition, the results of these various studies differ and are insufficiently comprehensive and systematic. Objectives: This investigation aimed to evaluate the general cognition of and demand for medical statistics by undergraduates, graduates, and medical staff in China. Methods: We performed a comprehensive database search related to the cognition of and demand for medical statistics from January 2007 to July 2014 and conducted a meta-analysis of non-controlled studies with sub-group analysis for undergraduates, graduates, and medical staff. Results: There are substantial differences with respect to the cognition of theory in medical statistics among undergraduates (73.5%), graduates (60.7%), and medical staff (39.6%). The demand for theory in medical statistics is high among graduates (94.6%), undergraduates (86.1%), and medical staff (88.3%). Regarding specific statistical methods, the cognition of basic statistical methods is higher than that of advanced statistical methods. The demand for certain advanced statistical methods, including (but not limited to) multiple analysis of variance (ANOVA), multiple linear regression, and logistic regression, is higher than that for basic statistical methods. The use rates of the Statistical Package for the Social Sciences (SPSS) software and statistical analysis software (SAS) are only 55% and 15%, respectively. Conclusion: The overall statistical competence of undergraduates, graduates, and medical staff is insufficient, and their ability to practically apply their statistical knowledge is limited, which constitutes an unsatisfactory state of affairs for medical statistics education.
Because the demand for skills in this area is increasing, the need to reform medical statistics education in China has become urgent. PMID:26053876

  10. Cognition of and Demand for Education and Teaching in Medical Statistics in China: A Systematic Review and Meta-Analysis.

    PubMed

    Wu, Yazhou; Zhou, Liang; Li, Gaoming; Yi, Dali; Wu, Xiaojiao; Liu, Xiaoyu; Zhang, Yanqi; Liu, Ling; Yi, Dong

    2015-01-01

    Although a substantial number of studies focus on the teaching and application of medical statistics in China, few studies comprehensively evaluate the recognition of and demand for medical statistics. In addition, the results of these various studies differ and are insufficiently comprehensive and systematic. This investigation aimed to evaluate the general cognition of and demand for medical statistics by undergraduates, graduates, and medical staff in China. We performed a comprehensive database search related to the cognition of and demand for medical statistics from January 2007 to July 2014 and conducted a meta-analysis of non-controlled studies with sub-group analysis for undergraduates, graduates, and medical staff. There are substantial differences with respect to the cognition of theory in medical statistics among undergraduates (73.5%), graduates (60.7%), and medical staff (39.6%). The demand for theory in medical statistics is high among graduates (94.6%), undergraduates (86.1%), and medical staff (88.3%). Regarding specific statistical methods, the cognition of basic statistical methods is higher than that of advanced statistical methods. The demand for certain advanced statistical methods, including (but not limited to) multiple analysis of variance (ANOVA), multiple linear regression, and logistic regression, is higher than that for basic statistical methods. The use rates of the Statistical Package for the Social Sciences (SPSS) software and statistical analysis software (SAS) are only 55% and 15%, respectively. The overall statistical competence of undergraduates, graduates, and medical staff is insufficient, and their ability to practically apply their statistical knowledge is limited, which constitutes an unsatisfactory state of affairs for medical statistics education. Because the demand for skills in this area is increasing, the need to reform medical statistics education in China has become urgent.

  11. Development of computer-assisted instruction application for statistical data analysis android platform as learning resource

    NASA Astrophysics Data System (ADS)

    Hendikawati, P.; Arifudin, R.; Zahid, M. Z.

    2018-03-01

    This study aims to design an Android Statistics Data Analysis application that can be accessed through mobile devices, making it easier for users to access statistical tools. The Statistics Data Analysis application covers various topics in basic statistics along with a parametric statistical data analysis component. The output of this application is a parametric statistical data analysis that can be used by students, lecturers, and other users who need the results of statistical calculations quickly and in an easily understood form. The Android application is developed using the Java programming language. The server-side programming language is PHP with the CodeIgniter framework, and the database used is MySQL. The system development methodology is the Waterfall methodology, with stages of analysis, design, coding, testing, implementation, and system maintenance. This statistical data analysis application is expected to support statistics lecturing activities and make it easier for students to understand statistical analysis on mobile devices.

  12. 78 FR 34101 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-06

    ... and basic descriptive statistics on the quantity and type of consumer-reported patient safety events... conduct correlations, cross tabulations of responses and other statistical analysis. Estimated Annual...

  13. Ten Ways to Improve the Use of Statistical Mediation Analysis in the Practice of Child and Adolescent Treatment Research

    ERIC Educational Resources Information Center

    Maric, Marija; Wiers, Reinout W.; Prins, Pier J. M.

    2012-01-01

    Despite guidelines and repeated calls from the literature, statistical mediation analysis in youth treatment outcome research is rare. Even more concerning is that many studies that "have" reported mediation analyses do not fulfill basic requirements for mediation analysis, providing inconclusive data and clinical implications. As a result, after…

  14. [Design and implementation of online statistical analysis function in information system of air pollution and health impact monitoring].

    PubMed

    Lü, Yiran; Hao, Shuxin; Zhang, Guoqing; Liu, Jie; Liu, Yue; Xu, Dongqun

    2018-01-01

    To implement an online statistical analysis function in the information system for air pollution and health impact monitoring, and to obtain data analysis information in real time. Descriptive statistics, time-series analysis, and multivariate regression analysis were implemented online on top of the database software, using SQL and visual tools. The system generates basic statistical tables and summary tables of air pollution exposure and health impact data online; generates trend charts for each part of the data online, with interactive connections to the database; and generates interface tables that can be exported directly to R, SAS, and SPSS. The information system for air pollution and health impact monitoring thus implements the statistical analysis function online and can provide real-time analysis results to its users.

  15. The Statistical Power of Planned Comparisons.

    ERIC Educational Resources Information Center

    Benton, Roberta L.

    Basic principles underlying statistical power are examined, and issues pertaining to effect size, sample size, error variance, and significance level are highlighted via the use of specific hypothetical examples. Analysis of variance (ANOVA) and related methods remain popular, although other procedures sometimes have more statistical power against…
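
    The interplay of effect size, sample size, and significance level described in this abstract can be made concrete with a short calculation. The sketch below (function names are our own, not from the paper) uses the standard normal approximation for the power of a two-sided, two-sample comparison of means:

```python
import math

def normal_cdf(z):
    """Standard normal CDF, via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def two_sample_power(d, n_per_group):
    """Approximate power of a two-sided two-sample comparison of means at
    alpha = 0.05, using the normal approximation (it ignores the negligible
    lower rejection region and the small-sample t-correction)."""
    z_crit = 1.959964                       # two-sided 5% critical value
    noncentrality = d * math.sqrt(n_per_group / 2.0)
    return normal_cdf(noncentrality - z_crit)

power = two_sample_power(0.5, 64)           # a "medium" effect, 64 per group
```

    With Cohen's d = 0.5 and 64 subjects per group, the approximation returns roughly 0.80, the conventional power target.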

  16. Resilience Among Students at the Basic Enlisted Submarine School

    DTIC Science & Technology

    2016-12-01

    reported resilience. The Hayes’ Macro in the Statistical Package for the Social Sciences (SPSS) was used to uncover factors relevant to mediation analysis... Statistical Package for the Social Sciences (SPSS) was used to uncover factors relevant to mediation analysis. Findings suggest that the encouragement of...to Stressful Experiences Scale RTC Recruit Training Command SPSS Statistical Package for the Social Sciences SS Social Support SWB Subjective Well

  17. CADDIS Volume 4. Data Analysis: Basic Analyses

    EPA Pesticide Factsheets

    Use of statistical tests to determine if an observation is outside the normal range of expected values. Details of CART, regression analysis, use of quantile regression analysis, CART in causal analysis, simplifying or pruning resulting trees.

  18. Background Information and User’s Guide for MIL-F-9490

    DTIC Science & Technology

    1975-01-01

    requirements, although different analysis results will apply to each requirement. Basic differences between the two reliability requirements are: MIL-F-8785B...provides the rationale for establishing such limits. The specific risk analysis comprises the same data which formed the average risk analysis, except...statistical analysis will be based on statistical data taken using limited exposure times of components and equipment. The exposure times and resulting

  19. Strategy for Promoting the Equitable Development of Basic Education in Underdeveloped Counties as Seen from Cili County

    ERIC Educational Resources Information Center

    Shihua, Peng; Rihui, Tan

    2009-01-01

    Employing statistical analysis, this study has made a preliminary exploration of promoting the equitable development of basic education in underdeveloped counties through the case study of Cili county. The unequally developed basic education in the county has been made clear, the reasons for the inequitable education have been analyzed, and,…

  20. Reinventing Biostatistics Education for Basic Scientists

    PubMed Central

    Weissgerber, Tracey L.; Garovic, Vesna D.; Milin-Lazovic, Jelena S.; Winham, Stacey J.; Obradovic, Zoran; Trzeciakowski, Jerome P.; Milic, Natasa M.

    2016-01-01

    Numerous studies demonstrating that statistical errors are common in basic science publications have led to calls to improve statistical training for basic scientists. In this article, we sought to evaluate statistical requirements for PhD training and to identify opportunities for improving biostatistics education in the basic sciences. We provide recommendations for improving statistics training for basic biomedical scientists, including: 1. Encouraging departments to require statistics training, 2. Tailoring coursework to the students’ fields of research, and 3. Developing tools and strategies to promote education and dissemination of statistical knowledge. We also provide a list of statistical considerations that should be addressed in statistics education for basic scientists. PMID:27058055

  1. Developing Competency of Teachers in Basic Education Schools

    ERIC Educational Resources Information Center

    Yuayai, Rerngrit; Chansirisira, Pacharawit; Numnaphol, Kochaporn

    2015-01-01

    This study aims to develop competency of teachers in basic education schools. The research instruments included the semi-structured in-depth interview form, questionnaire, program developing competency, and evaluation competency form. The statistics used for data analysis were percentage, mean, and standard deviation. The research found that…

  2. Consequences of common data analysis inaccuracies in CNS trauma injury basic research.

    PubMed

    Burke, Darlene A; Whittemore, Scott R; Magnuson, David S K

    2013-05-15

    The development of successful treatments for humans after traumatic brain or spinal cord injuries (TBI and SCI, respectively) requires animal research. This effort can be hampered when promising experimental results cannot be replicated because of incorrect data analysis procedures. To identify and hopefully avoid these errors in future studies, the articles in seven journals with the highest number of basic science central nervous system TBI and SCI animal research studies published in 2010 (N=125 articles) were reviewed for their data analysis procedures. After identifying the most common statistical errors, the implications of those findings were demonstrated by reanalyzing previously published data from our laboratories using the identified inappropriate statistical procedures, then comparing the two sets of results. Overall, 70% of the articles contained at least one type of inappropriate statistical procedure. The highest percentage involved incorrect post hoc t-tests (56.4%), followed by inappropriate parametric statistics (analysis of variance and t-test; 37.6%). Repeated Measures analysis was inappropriately missing in 52.0% of all articles and, among those with behavioral assessments, 58% were analyzed incorrectly. Reanalysis of our published data using the most common inappropriate statistical procedures resulted in a 14.1% average increase in significant effects compared to the original results. Specifically, an increase of 15.5% occurred with Independent t-tests and 11.1% after incorrect post hoc t-tests. Utilizing proper statistical procedures can allow more-definitive conclusions, facilitate replicability of research results, and enable more accurate translation of those results to the clinic.
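
    The inflation the authors measured can be demonstrated with a small simulation (our own illustration, not a reanalysis of their data): running uncorrected pairwise tests on three groups drawn from one and the same population rejects far more often than the nominal 5%.

```python
import math
import random

def z_test_p(a, b):
    """Two-sided p-value for a difference in means (normal approximation;
    reasonable here, since each group holds 50 observations)."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    var_a = sum((x - mean_a) ** 2 for x in a) / (n - 1)
    var_b = sum((x - mean_b) ** 2 for x in b) / (n - 1)
    z = (mean_a - mean_b) / math.sqrt(var_a / n + var_b / n)
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))

random.seed(1)
n_sims, false_alarms = 2000, 0
for _ in range(n_sims):
    # three groups drawn from the SAME population: every rejection is spurious
    groups = [[random.gauss(0.0, 1.0) for _ in range(50)] for _ in range(3)]
    if any(z_test_p(groups[i], groups[j]) < 0.05
           for i, j in [(0, 1), (0, 2), (1, 2)]):
        false_alarms += 1

familywise_rate = false_alarms / n_sims     # well above the nominal 0.05
```

    An omnibus test (ANOVA) followed by corrected post hoc comparisons is the standard remedy the article recommends.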

  3. Introduction to Statistics. Learning Packages in the Policy Sciences Series, PS-26. Revised Edition.

    ERIC Educational Resources Information Center

    Policy Studies Associates, Croton-on-Hudson, NY.

    The primary objective of this booklet is to introduce students to basic statistical skills that are useful in the analysis of public policy data. A few, selected statistical methods are presented, and theory is not emphasized. Chapter 1 provides instruction for using tables, bar graphs, bar graphs with grouped data, trend lines, pie diagrams,…

  4. Senior Computational Scientist | Center for Cancer Research

    Cancer.gov

    The Basic Science Program (BSP) pursues independent, multidisciplinary research in basic and applied molecular biology, immunology, retrovirology, cancer biology, and human genetics. Research efforts and support are an integral part of the Center for Cancer Research (CCR) at the Frederick National Laboratory for Cancer Research (FNLCR). The Cancer & Inflammation Program (CIP), Basic Science Program, HLA Immunogenetics Section, under the leadership of Dr. Mary Carrington, studies the influence of human leukocyte antigens (HLA) and specific KIR/HLA genotypes on risk of and outcomes to infection, cancer, autoimmune disease, and maternal-fetal disease. Recent studies have focused on the impact of HLA gene expression in disease, the molecular mechanism regulating expression levels, and the functional basis for the effect of differential expression on disease outcome. The lab’s further focus is on the genetic basis for resistance/susceptibility to disease conferred by immunogenetic variation. KEY ROLES/RESPONSIBILITIES: The Senior Computational Scientist will provide research support to the CIP-BSP-HLA Immunogenetics Section, performing biostatistical design, analysis, and reporting of research projects conducted in the lab. This individual will be involved in the implementation of statistical models and data preparation. The successful candidate should have: 5 or more years of competent, innovative biostatistics/bioinformatics research experience beyond doctoral training; considerable experience with statistical software such as SAS, R, and S-Plus; sound knowledge and demonstrated experience of theoretical and applied statistics; the ability to write program code to analyze data using statistical analysis software; and the ability to contribute to the interpretation and publication of research results.

  5. Statistical Match of the VA 1979-1980 Recipient File against the 1979-1980 Basic Grant Recipient File. Revised.

    ERIC Educational Resources Information Center

    Applied Management Sciences, Inc., Silver Spring, MD.

    The amount of misreporting of Veterans Administration (VA) benefits was assessed, along with the impact of misreporting on the Basic Educational Opportunity Grant (BEOG) program. Accurate financial information is needed to determine appropriate awards. The analysis revealed: over 97% of VA beneficiaries misreported benefits; the total net loss to…

  6. The Ontology of Biological and Clinical Statistics (OBCS) for standardized and reproducible statistical analysis.

    PubMed

    Zheng, Jie; Harris, Marcelline R; Masci, Anna Maria; Lin, Yu; Hero, Alfred; Smith, Barry; He, Yongqun

    2016-09-14

    Statistics play a critical role in biological and clinical research. However, most reports of scientific results in the published literature make it difficult for the reader to reproduce the statistical analyses performed in achieving those results because they provide inadequate documentation of the statistical tests and algorithms applied. The Ontology of Biological and Clinical Statistics (OBCS) is put forward here as a step towards solving this problem. The terms in OBCS, including 'data collection', 'data transformation in statistics', 'data visualization', 'statistical data analysis', and 'drawing a conclusion based on data', cover the major types of statistical processes used in basic biological research and clinical outcome studies. OBCS is aligned with the Basic Formal Ontology (BFO) and extends the Ontology of Biomedical Investigations (OBI), an OBO (Open Biological and Biomedical Ontologies) Foundry ontology supported by over 20 research communities. Currently, OBCS comprises 878 terms, representing 20 BFO classes, 403 OBI classes, 229 OBCS-specific classes, and 122 classes imported from ten other OBO ontologies. We discuss two examples illustrating how the ontology is being applied. In the first (biological) use case, we describe how OBCS was applied to represent the high-throughput microarray data analysis of immunological transcriptional profiles in human subjects vaccinated with an influenza vaccine. In the second (clinical outcomes) use case, we applied OBCS to represent the processing of electronic health care data to determine the associations between hospital staffing levels and patient mortality. Our case studies were designed to show how OBCS can be used for the consistent representation of statistical analysis pipelines under two different research paradigms. Other ongoing projects using OBCS for statistical data processing are also discussed. The OBCS source code and documentation are available at: https://github.com/obcs/obcs .
The Ontology of Biological and Clinical Statistics (OBCS) is a community-based open source ontology in the domain of biological and clinical statistics. OBCS is a timely ontology that represents statistics-related terms and their relations in a rigorous fashion, facilitates standard data analysis and integration, and supports reproducible biological and clinical research.

  7. System analysis for the Huntsville Operational Support Center distributed computer system

    NASA Technical Reports Server (NTRS)

    Ingels, E. M.

    1983-01-01

    A simulation model was developed and programmed in three languages: BASIC, PASCAL, and SLAM. Two of the programs are included in this report: the BASIC and PASCAL language programs. SLAM is not supported by NASA/MSFC facilities and hence was not included. The statistical comparisons of simulations of the same HOSC system configurations are in good agreement, and agree with the operational statistics of HOSC that were obtained. Three variations of the most recent HOSC configuration were run, and some conclusions were drawn as to the system performance under these variations.

  8. Quality control analysis : part II : soil and aggregate base course.

    DOT National Transportation Integrated Search

    1966-07-01

    This is the second of the three reports on the quality control analysis of highway construction materials. : It deals with the statistical evaluation of results from several construction projects to determine the basic pattern of variability with res...

  9. Quality control analysis : part III : concrete and concrete aggregates.

    DOT National Transportation Integrated Search

    1966-11-01

    This is the third and last report on the Quality Control Analysis of highway construction materials. : It deals with the statistical evaluation of data from several construction projects to determine the basic pattern of variability with respect to s...

  10. Bayesian models: A statistical primer for ecologists

    USGS Publications Warehouse

    Hobbs, N. Thompson; Hooten, Mevin B.

    2015-01-01

    Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods, in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probability and develops a step-by-step sequence of connected ideas, including basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and inference from single and multiple models. This unique book places less emphasis on computer coding, favoring instead a concise presentation of the mathematical statistics needed to understand how and why Bayesian analysis works. It also explains how to write out properly formulated hierarchical Bayesian models and use them in computing, research papers, and proposals. This primer enables ecologists to understand the statistical principles behind Bayesian modeling and apply them to research, teaching, policy, and management. It presents the mathematical and statistical foundations of Bayesian modeling in language accessible to non-statisticians; covers basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and more; deemphasizes computer coding in favor of basic principles; and explains how to write out properly factored statistical expressions representing Bayesian models.
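
    The Markov chain Monte Carlo machinery such primers introduce can be illustrated with a minimal Metropolis sampler (a toy example of our own, not taken from the book): drawing from the posterior of a normal mean under a flat prior, where the answer is known analytically.

```python
import math
import random

random.seed(42)
data = [random.gauss(5.0, 2.0) for _ in range(100)]
sigma = 2.0                                  # observation s.d., treated as known
sample_mean = sum(data) / len(data)

def log_post(mu):
    """Log posterior of mu under a flat prior: the Gaussian log-likelihood
    up to an additive constant."""
    return -sum((x - mu) ** 2 for x in data) / (2.0 * sigma ** 2)

draws, mu = [], 0.0
lp = log_post(mu)
for step in range(6000):
    proposal = mu + random.gauss(0.0, 0.5)       # symmetric random-walk step
    lp_prop = log_post(proposal)
    if math.log(random.random()) < lp_prop - lp: # Metropolis accept rule
        mu, lp = proposal, lp_prop
    if step >= 1000:                             # discard burn-in
        draws.append(mu)

posterior_mean = sum(draws) / len(draws)     # concentrates near the sample mean
```

    With a flat prior the posterior mean coincides with the sample mean, which gives a built-in check that the chain is mixing.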

  11. Fundamentals of Counting Statistics in Digital PCR: I Just Measured Two Target Copies-What Does It Mean?

    PubMed

    Tzonev, Svilen

    2018-01-01

    Current commercially available digital PCR (dPCR) systems and assays are capable of detecting individual target molecules with considerable reliability. As tests are developed and validated for use on clinical samples, the need to understand and develop robust statistical analysis routines increases. This chapter covers the fundamental processes and limitations of detecting and reporting on single molecule detection. We cover the basics of quantification of targets and sources of imprecision. We describe the basic test concepts: sensitivity, specificity, limit of blank, limit of detection, and limit of quantification in the context of dPCR. We provide basic guidelines how to determine those, how to choose and interpret the operating point, and what factors may influence overall test performance in practice.
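
    The quantification step described in this chapter rests on Poisson statistics: if target molecules distribute randomly across partitions, the mean copies per partition follows from the fraction of negative partitions. A minimal sketch (the partition counts below are made up for illustration):

```python
import math

def copies_per_partition(n_positive, n_total):
    """Poisson-corrected mean number of target copies per partition.

    If copies land in partitions at random, each partition holds a
    Poisson(lam) count, so the chance of a negative partition is exp(-lam)
    and lam = -ln(fraction negative). This corrects for partitions that
    contain more than one copy, which raw positive counting would miss.
    """
    fraction_negative = (n_total - n_positive) / n_total
    return -math.log(fraction_negative)

lam = copies_per_partition(4000, 20000)      # 20% of partitions positive
total_copies = lam * 20000                   # estimated copies loaded, ~4463
```

    Note that the Poisson correction matters most at high occupancy: with 20% positive partitions, the estimate already exceeds the raw positive count by about 12%.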

  12. Powerlaw: a Python package for analysis of heavy-tailed distributions.

    PubMed

    Alstott, Jeff; Bullmore, Ed; Plenz, Dietmar

    2014-01-01

    Power laws are theoretically interesting probability distributions that are also frequently used to describe empirical data. In recent years, effective statistical methods for fitting power laws have been developed, but appropriate use of these techniques requires significant programming and statistical insight. In order to greatly decrease the barriers to using good statistical methods for fitting power law distributions, we developed the powerlaw Python package. This software package provides easy commands for basic fitting and statistical analysis of distributions. Notably, it also seeks to support a variety of user needs by being exhaustive in the options available to the user. The source code is publicly available and easily extensible.
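
    The core fit that such packages perform can be sketched in a few lines for the continuous case with a known lower cutoff; the real powerlaw package additionally estimates the cutoff and compares candidate distributions, so treat this only as an illustration of the underlying maximum-likelihood step.

```python
import math
import random

def fit_alpha(data, xmin):
    """Maximum-likelihood exponent for a continuous power law
    p(x) ~ x**(-alpha) for x >= xmin (the Hill estimator):
    alpha = 1 + n / sum(ln(x_i / xmin))."""
    tail = [x for x in data if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

# sanity check on synthetic data, sampled by inverting the power-law CDF
random.seed(0)
alpha_true, xmin = 2.5, 1.0
sample = [xmin * (1.0 - random.random()) ** (-1.0 / (alpha_true - 1.0))
          for _ in range(20000)]
alpha_hat = fit_alpha(sample, xmin)          # lands close to the true 2.5
```

    The maximum-likelihood estimator avoids the well-known bias of fitting a straight line to a log-log histogram, which is the main pitfall the abstract alludes to.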

  13. Image analysis library software development

    NASA Technical Reports Server (NTRS)

    Guseman, L. F., Jr.; Bryant, J.

    1977-01-01

    The Image Analysis Library consists of a collection of general purpose mathematical/statistical routines and special purpose data analysis/pattern recognition routines basic to the development of image analysis techniques for support of current and future Earth Resources Programs. Work was done to provide a collection of computer routines and associated documentation which form a part of the Image Analysis Library.

  14. Have Basic Mathematical Skills Grown Obsolete in the Computer Age: Assessing Basic Mathematical Skills and Forecasting Performance in a Business Statistics Course

    ERIC Educational Resources Information Center

    Noser, Thomas C.; Tanner, John R.; Shah, Situl

    2008-01-01

    The purpose of this study was to measure the comprehension of basic mathematical skills of students enrolled in statistics classes at a large regional university, to determine if the scores earned on a basic math skills test are useful in forecasting student performance in these statistics classes, and to determine if students' basic math…

  15. [Bayesian statistics in medicine -- part II: main applications and inference].

    PubMed

    Montomoli, C; Nichelatti, M

    2008-01-01

    Bayesian statistics is not only used when one is dealing with 2-way tables; it can also be used for inferential purposes. Using the basic concepts presented in the first part, this paper aims to give a simple overview of Bayesian methods by introducing their foundation (Bayes' theorem) and then applying this rule to a very simple practical example; whenever possible, the elementary processes at the basis of the analysis are compared to those of frequentist (classical) statistical analysis. Bayesian reasoning is naturally connected to medical activity, since it appears to be quite similar to a diagnostic process.
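
    The diagnostic analogy the authors draw is the standard first example of Bayes' theorem. Here is a minimal worked version; the prevalence and accuracy figures are invented for illustration:

```python
def posterior_probability(prevalence, sensitivity, specificity):
    """P(disease | positive test) by Bayes' theorem:
    P(D|+) = P(+|D)P(D) / [P(+|D)P(D) + P(+|not D)P(not D)]."""
    true_pos = sensitivity * prevalence
    false_pos = (1.0 - specificity) * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)

# a rare condition (1% prevalence) and a reasonably accurate test
post = posterior_probability(0.01, 0.95, 0.90)
# the posterior stays below 10%: most positives come from the healthy majority
```

    This is exactly the clinician's reasoning the paper points to: the pre-test probability (prevalence) is the prior, and the test result updates it.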

  16. General Nature of Multicollinearity in Multiple Regression Analysis.

    ERIC Educational Resources Information Center

    Liu, Richard

    1981-01-01

    Discusses multiple regression, a very popular statistical technique in the field of education. One of the basic assumptions in regression analysis requires that independent variables in the equation should not be highly correlated. The problem of multicollinearity and some of the solutions to it are discussed. (Author)

  17. A κ-generalized statistical mechanics approach to income analysis

    NASA Astrophysics Data System (ADS)

    Clementi, F.; Gallegati, M.; Kaniadakis, G.

    2009-02-01

    This paper proposes a statistical mechanics approach to the analysis of income distribution and inequality. A new distribution function, having its roots in the framework of κ-generalized statistics, is derived that is particularly suitable for describing the whole spectrum of incomes, from the low-middle income region up to the high income Pareto power-law regime. Analytical expressions for the shape, moments and some other basic statistical properties are given. Furthermore, several well-known econometric tools for measuring inequality, all of which exist in closed form, are considered. A method for parameter estimation is also discussed. The model is shown to fit the data on personal income for the United States remarkably well, and the analysis of inequality performed in terms of its parameters proves very powerful.
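For reference, the κ-exponential on which this family is built (a sketch from the general κ-statistics literature, not reproduced from the abstract) is

```latex
\exp_\kappa(x) = \left(\sqrt{1+\kappa^2 x^2} + \kappa x\right)^{1/\kappa},
\qquad
\lim_{\kappa \to 0} \exp_\kappa(x) = e^{x},
```

so that the model's survival function, commonly written $P(X>x)=\exp_\kappa(-\beta x^\alpha)$, interpolates between an exponential-like bulk and a Pareto power-law tail.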

  18. [Statistical analysis using freely-available "EZR (Easy R)" software].

    PubMed

    Kanda, Yoshinobu

    2015-10-01

    Clinicians must often perform statistical analyses for purposes such as evaluating preexisting evidence and designing or executing clinical studies. R is a free software environment for statistical computing. R supports many statistical analysis functions, but does not incorporate a statistical graphical user interface (GUI). The R commander provides an easy-to-use basic-statistics GUI for R. However, the statistical functionality of the R commander is limited, especially in the field of biostatistics. Therefore, the author added several important statistical functions to the R commander and named the result "EZR (Easy R)", which is now distributed on the following website: http://www.jichi.ac.jp/saitama-sct/. EZR provides point-and-click access to statistical functions that are frequently used in clinical studies, such as survival analyses, including competing-risk analyses and analyses with time-dependent covariates. In addition, by saving the script automatically created by EZR, users can learn R script writing, maintain the traceability of the analysis, and ensure that the statistical process is overseen by a supervisor.

  19. A statistical mechanics approach to autopoietic immune networks

    NASA Astrophysics Data System (ADS)

    Barra, Adriano; Agliari, Elena

    2010-07-01

    In this work we aim to bridge theoretical immunology and disordered statistical mechanics. We introduce a model for the behavior of B-cells which naturally merges the clonal selection theory and the autopoietic network theory as a whole. From the analysis of its features we recover several basic phenomena such as low-dose tolerance, dynamical memory of antigens and self/non-self discrimination.

  20. Interpreting Bivariate Regression Coefficients: Going beyond the Average

    ERIC Educational Resources Information Center

    Halcoussis, Dennis; Phillips, G. Michael

    2010-01-01

    Statistics, econometrics, investment analysis, and data analysis classes often review the calculation of several types of averages, including the arithmetic mean, geometric mean, harmonic mean, and various weighted averages. This note shows how each of these can be computed using a basic regression framework. By recognizing when a regression model…
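The note's point can be sketched as follows (illustrative Python, not the authors' code): an intercept-only least-squares fit recovers the arithmetic mean, and the same fit on transformed data yields the geometric and harmonic means.

```python
import math

# Intercept-only least squares: the constant b that minimizes
# sum((y_i - b)^2) is exactly the arithmetic mean of y.
def ols_intercept(y):
    return sum(y) / len(y)

y = [2.0, 4.0, 8.0]

arithmetic = ols_intercept(y)                                  # regress y on a constant
geometric = math.exp(ols_intercept([math.log(v) for v in y]))  # regress log(y), back-transform
harmonic = 1.0 / ols_intercept([1.0 / v for v in y])           # regress 1/y, back-transform

print(arithmetic, geometric, harmonic)  # 4.666..., 4.0, 3.4285...
```

Weighted averages follow the same pattern with weighted least squares.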

  1. All individuals are not created equal; accounting for interindividual variation in fitting life-history responses to toxicants.

    PubMed

    Jager, Tjalling

    2013-02-05

    The individuals of a species are not equal. These differences frustrate experimental biologists and ecotoxicologists who wish to study the response of a species (in general) to a treatment. In the analysis of data, differences between model predictions and observations on individual animals are usually treated as random measurement error around the true response. These deviations, however, are mainly caused by real differences between the individuals (e.g., differences in physiology and in initial conditions). Understanding these intraspecies differences, and accounting for them in the data analysis, will improve our understanding of the response to the treatment we are investigating and allow for a more powerful, less biased, statistical analysis. Here, I explore a basic scheme for statistical inference to estimate parameters governing stress that allows individuals to differ in their basic physiology. This scheme is illustrated using a simple toxicokinetic-toxicodynamic model and a data set for growth of the springtail Folsomia candida exposed to cadmium in food. This article should be seen as proof of concept; a first step in bringing more realism into the statistical inference for process-based models in ecotoxicology.

  2. Forest fires in Pennsylvania.

    Treesearch

    Donald A. Haines; William A. Main; Eugene F. McNamara

    1978-01-01

    Describes factors that contribute to forest fires in Pennsylvania. Includes an analysis of basic statistics; the distribution of fires during normal, drought, and wet years; fire causes; fire activity by day of week; multiple-fire days; and fire climatology.

  3. Investigation of energy management strategies for photovoltaic systems - A predictive control algorithm

    NASA Technical Reports Server (NTRS)

    Cull, R. C.; Eltimsahy, A. H.

    1983-01-01

    The present investigation is concerned with the formulation of energy management strategies for stand-alone photovoltaic (PV) systems, taking into account a basic control algorithm for a possible predictive (and adaptive) controller. The control system controls the flow of energy in the system according to the amount of energy available, and predicts the appropriate control set-points based on the energy (insolation) available by using an appropriate system model. Aspects of adaptation to the conditions of the system are also considered. Attention is given to a statistical analysis technique, the analysis inputs, the analysis procedure, and details regarding the basic control algorithm.

  4. Tools for Basic Statistical Analysis

    NASA Technical Reports Server (NTRS)

    Luz, Paul L.

    2005-01-01

    Statistical Analysis Toolset is a collection of eight Microsoft Excel spreadsheet programs, each of which performs calculations pertaining to an aspect of statistical analysis. These programs present input and output data in user-friendly, menu-driven formats, with automatic execution. The following types of calculations are performed: Descriptive statistics are computed for a set of data x(i) (i = 1, 2, 3 . . . ) entered by the user. Normal Distribution Estimates will calculate the statistical value that corresponds to cumulative probability values, given a sample mean and standard deviation of the normal distribution. Normal Distribution from Two Data Points will extend and generate a cumulative normal distribution for the user, given two data points and their associated probability values. Two programs perform two-way analysis of variance (ANOVA) with no replication or generalized ANOVA for two factors with four levels and three repetitions. Linear Regression-ANOVA will curve-fit data to the linear equation y = f(x) and will perform an ANOVA to check its significance.
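The first two calculations can be sketched as follows (illustrative Python using the standard-library statistics module, not the Excel toolset itself; the data values are made up):

```python
from statistics import NormalDist, mean, stdev

data = [12.1, 11.8, 12.5, 12.0, 11.9, 12.3]

# Descriptive statistics, as in the toolset's first program.
m, s = mean(data), stdev(data)

# "Normal Distribution Estimates": the value corresponding to a given
# cumulative probability, for a normal with the sample mean and std. dev.
x95 = NormalDist(mu=m, sigma=s).inv_cdf(0.95)

print(round(m, 3), round(s, 3), round(x95, 3))
```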

  5. Meta- and statistical analysis of single-case intervention research data: quantitative gifts and a wish list.

    PubMed

    Kratochwill, Thomas R; Levin, Joel R

    2014-04-01

    In this commentary, we add to the spirit of the articles appearing in the special series devoted to meta- and statistical analysis of single-case intervention-design data. Following a brief discussion of historical factors leading to our initial involvement in statistical analysis of such data, we discuss: (a) the value added by including statistical-analysis recommendations in the What Works Clearinghouse Standards for single-case intervention designs; (b) the importance of visual analysis in single-case intervention research, along with the distinctive role that could be played by single-case effect-size measures; and (c) the elevated internal validity and statistical-conclusion validity afforded by the incorporation of various forms of randomization into basic single-case design structures. For the future, we envision more widespread application of quantitative analyses, as critical adjuncts to visual analysis, in both primary single-case intervention research studies and literature reviews in the behavioral, educational, and health sciences.

  6. SNP_tools: A compact tool package for analysis and conversion of genotype data for MS-Excel

    PubMed Central

    Chen, Bowang; Wilkening, Stefan; Drechsel, Marion; Hemminki, Kari

    2009-01-01

    Background Single nucleotide polymorphism (SNP) genotyping is a major activity in biomedical research. Scientists prefer to have facile access to the results, which may require conversions between data formats. First-hand SNP data are often entered or saved in the MS-Excel format, but this software lacks genetic and epidemiological functions. A general tool to do basic genetic and epidemiological analysis and data conversion for MS-Excel is needed. Findings The SNP_tools package is prepared as an add-in for MS-Excel. The code is written in Visual Basic for Applications, embedded in the Microsoft Office package. This add-in is an easy-to-use tool for users with basic computer knowledge (and requirements for basic statistical analysis). Conclusion Our implementation for Microsoft Excel 2000-2007 in Microsoft Windows 2000, XP, Vista and Windows 7 beta can handle files in different formats and convert them into other formats. It is free software. PMID:19852806

  7. SNP_tools: A compact tool package for analysis and conversion of genotype data for MS-Excel.

    PubMed

    Chen, Bowang; Wilkening, Stefan; Drechsel, Marion; Hemminki, Kari

    2009-10-23

    Single nucleotide polymorphism (SNP) genotyping is a major activity in biomedical research. Scientists prefer to have facile access to the results, which may require conversions between data formats. First-hand SNP data are often entered or saved in the MS-Excel format, but this software lacks genetic and epidemiological functions. A general tool to do basic genetic and epidemiological analysis and data conversion for MS-Excel is needed. The SNP_tools package is prepared as an add-in for MS-Excel. The code is written in Visual Basic for Applications, embedded in the Microsoft Office package. This add-in is an easy-to-use tool for users with basic computer knowledge (and requirements for basic statistical analysis). Our implementation for Microsoft Excel 2000-2007 in Microsoft Windows 2000, XP, Vista and Windows 7 beta can handle files in different formats and convert them into other formats. It is free software.

  8. CORSSA: Community Online Resource for Statistical Seismicity Analysis

    NASA Astrophysics Data System (ADS)

    Zechar, J. D.; Hardebeck, J. L.; Michael, A. J.; Naylor, M.; Steacy, S.; Wiemer, S.; Zhuang, J.

    2011-12-01

    Statistical seismology is critical to the understanding of seismicity, the evaluation of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology, especially to those aspects with great impact on public policy, statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal to enhance the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA, www.corssa.org). We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined; a governing structure was organized; and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each of which will contain between four and eight articles. CORSSA now includes seven articles, with an additional six in draft form, along with forums for discussion, a glossary, and news about upcoming meetings, special issues, and recent papers. Each article is peer-reviewed and presents a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles include: introductions to both CORSSA and statistical seismology, basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. We have also begun curating a collection of statistical seismology software packages.

  9. Conceptual versus Algorithmic Learning in High School Chemistry: The Case of Basic Quantum Chemical Concepts--Part 1. Statistical Analysis of a Quantitative Study

    ERIC Educational Resources Information Center

    Papaphotis, Georgios; Tsaparlis, Georgios

    2008-01-01

    Part 1 of the findings are presented of a quantitative study (n = 125) on basic quantum chemical concepts taught in the twelfth grade (age 17-18 years) in Greece. A paper-and-pencil test of fourteen questions was used. The study compared performance in five questions that tested recall of knowledge or application of algorithmic procedures (type-A…

  10. Meta-Analytic Derivation

    ERIC Educational Resources Information Center

    Snell, Joel C.; Marsh, Mitchell

    2011-01-01

    The authors have over the years tried to revise meta-analysis because its basic premise is to add apples and oranges together and analyze them. In other words, various data on the same subject are chosen using different samples, research strategies, and number properties. The findings are then homogenized and a statistical analysis is used (Snell, J.…

  11. Combining statistical inference and decisions in ecology

    USGS Publications Warehouse

    Williams, Perry J.; Hooten, Mevin B.

    2016-01-01

    Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation, and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem.
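The central role of loss functions can be sketched as follows (illustrative Python with a simulated posterior sample, not the authors' example): under squared-error loss the posterior mean is the optimal point estimate, while under absolute-error loss it is the posterior median.

```python
import random
import statistics

random.seed(1)
# Hypothetical draws standing in for a posterior distribution of a parameter.
posterior = [random.gammavariate(2.0, 1.5) for _ in range(20000)]

def expected_loss(action, samples, loss):
    """Monte Carlo estimate of posterior expected loss for a point estimate."""
    return sum(loss(action, s) for s in samples) / len(samples)

def squared(a, s):
    return (a - s) ** 2

def absolute(a, s):
    return abs(a - s)

post_mean = statistics.mean(posterior)
post_median = statistics.median(posterior)

# The mean wins under squared loss; the median wins under absolute loss.
print(expected_loss(post_mean, posterior, squared),
      expected_loss(post_median, posterior, squared))
print(expected_loss(post_median, posterior, absolute),
      expected_loss(post_mean, posterior, absolute))
```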

  12. Predicting Success in Psychological Statistics Courses.

    PubMed

    Lester, David

    2016-06-01

    Many students perform poorly in courses on psychological statistics, and it is useful to be able to predict which students will have difficulties. In a study of 93 undergraduates enrolled in Statistical Methods (18 men, 75 women; M age = 22.0 years, SD = 5.1), performance was significantly associated with sex (female students performed better) and with proficiency in algebra in a linear regression analysis. Anxiety about statistics was not associated with course performance, indicating that basic mathematical skills are the best correlate of performance in statistics courses and can be used to stream students into classes by ability.

  13. Interpretation of correlations in clinical research.

    PubMed

    Hung, Man; Bounsanga, Jerry; Voss, Maren Wright

    2017-11-01

    Critically analyzing research is a key skill in evidence-based practice and requires knowledge of research methods, results interpretation, and applications, all of which rely on a foundation based in statistics. Evidence-based practice makes high demands on trained medical professionals to interpret an ever-expanding array of research evidence. As clinical training emphasizes medical care rather than statistics, it is useful to review the basics of statistical methods and what they mean for interpreting clinical studies. We reviewed the basic concepts of correlational associations, violations of normality, unobserved variable bias, sample size, and alpha inflation. The foundations of causal inference were discussed and sound statistical analyses were examined. We discuss four ways in which correlational analysis is misused, including causal inference overreach, over-reliance on significance, alpha inflation, and sample size bias. Recent published studies in the medical field provide evidence of causal assertion overreach drawn from correlational findings. The findings present a primer on the assumptions and nature of correlational methods of analysis and urge clinicians to exercise appropriate caution as they critically analyze the evidence before them and evaluate evidence that supports practice. Critically analyzing new evidence requires statistical knowledge in addition to clinical knowledge. Studies can overstate relationships, expressing causal assertions when only correlational evidence is available. Failure to account for the effect of sample size in the analyses tends to overstate the importance of predictive variables. It is important not to overemphasize the statistical significance without consideration of effect size and whether differences could be considered clinically meaningful.
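The sample-size caution in this abstract can be made concrete (illustrative Python; the standard t statistic for a correlation coefficient is t = r·sqrt((n-2)/(1-r²))):

```python
import math

# With the correlation r held fixed, the t statistic grows with sample size,
# so a clinically trivial r = 0.05 becomes "statistically significant"
# (|t| > ~1.96) once n is large enough: significance without meaningfulness.
def t_stat(r, n):
    return r * math.sqrt((n - 2) / (1 - r * r))

r = 0.05
for n in (100, 1000, 10000):
    print(n, round(t_stat(r, n), 2))
```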

  14. Experimental analysis of computer system dependability

    NASA Technical Reports Server (NTRS)

    Iyer, Ravishankar K.; Tang, Dong

    1993-01-01

    This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: design phase, prototype phase, and operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERRARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.
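Importance sampling, mentioned above, can be sketched as follows (illustrative Python estimating the rare-event probability P(Z > 4) for a standard normal by sampling from a shifted proposal; the function name and parameters are mine, not from the paper):

```python
import math
import random

random.seed(0)

# Plain Monte Carlo almost never sees the rare event Z > 4 (true probability
# ~3.17e-5). Sampling from a proposal centered on the rare region and
# reweighting by the density ratio recovers the probability cheaply.
def importance_sample(n=100_000, threshold=4.0, shift=4.0):
    total = 0.0
    for _ in range(n):
        z = random.gauss(shift, 1.0)  # proposal N(shift, 1)
        if z > threshold:
            # weight = target density / proposal density
            # (both are unit-variance normals, so normalizations cancel)
            total += math.exp(-0.5 * z * z) / math.exp(-0.5 * (z - shift) ** 2)
    return total / n

est = importance_sample()
print(est)  # close to 1 - Phi(4), i.e. about 3.17e-5
```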

  15. [Adequate application of quantitative and qualitative statistic analytic methods in acupuncture clinical trials].

    PubMed

    Tan, Ming T; Liu, Jian-ping; Lao, Lixing

    2012-08-01

    Recently, proper use of the statistical methods in traditional Chinese medicine (TCM) randomized controlled trials (RCTs) has received increased attention. Statistical inference based on hypothesis testing is the foundation of clinical trials and evidence-based medicine. In this article, the authors described the methodological differences between literature published in Chinese and Western journals in the design and analysis of acupuncture RCTs and the application of basic statistical principles. In China, qualitative analysis method has been widely used in acupuncture and TCM clinical trials, while the between-group quantitative analysis methods on clinical symptom scores are commonly used in the West. The evidence for and against these analytical differences were discussed based on the data of RCTs assessing acupuncture for pain relief. The authors concluded that although both methods have their unique advantages, quantitative analysis should be used as the primary analysis while qualitative analysis can be a secondary criterion for analysis. The purpose of this paper is to inspire further discussion of such special issues in clinical research design and thus contribute to the increased scientific rigor of TCM research.

  16. Theory of Financial Risk and Derivative Pricing

    NASA Astrophysics Data System (ADS)

    Bouchaud, Jean-Philippe; Potters, Marc

    2009-01-01

    Foreword; Preface; 1. Probability theory: basic notions; 2. Maximum and addition of random variables; 3. Continuous time limit, Ito calculus and path integrals; 4. Analysis of empirical data; 5. Financial products and financial markets; 6. Statistics of real prices: basic results; 7. Non-linear correlations and volatility fluctuations; 8. Skewness and price-volatility correlations; 9. Cross-correlations; 10. Risk measures; 11. Extreme correlations and variety; 12. Optimal portfolios; 13. Futures and options: fundamental concepts; 14. Options: hedging and residual risk; 15. Options: the role of drift and correlations; 16. Options: the Black and Scholes model; 17. Options: some more specific problems; 18. Options: minimum variance Monte-Carlo; 19. The yield curve; 20. Simple mechanisms for anomalous price statistics; Index of most important symbols; Index.

  17. Theory of Financial Risk and Derivative Pricing - 2nd Edition

    NASA Astrophysics Data System (ADS)

    Bouchaud, Jean-Philippe; Potters, Marc

    2003-12-01

    Foreword; Preface; 1. Probability theory: basic notions; 2. Maximum and addition of random variables; 3. Continuous time limit, Ito calculus and path integrals; 4. Analysis of empirical data; 5. Financial products and financial markets; 6. Statistics of real prices: basic results; 7. Non-linear correlations and volatility fluctuations; 8. Skewness and price-volatility correlations; 9. Cross-correlations; 10. Risk measures; 11. Extreme correlations and variety; 12. Optimal portfolios; 13. Futures and options: fundamental concepts; 14. Options: hedging and residual risk; 15. Options: the role of drift and correlations; 16. Options: the Black and Scholes model; 17. Options: some more specific problems; 18. Options: minimum variance Monte-Carlo; 19. The yield curve; 20. Simple mechanisms for anomalous price statistics; Index of most important symbols; Index.

  18. The Data from Aeromechanics Test and Analytics -- Management and Analysis Package (DATAMAP). Volume I. User’s Manual.

    DTIC Science & Technology

    1980-12-01

    The scanned excerpt preserves only fragments of the report's abstract and table of contents: perceived-noisiness values derived from a formula assuming a frequency of 1000 Hz; sections on Derived Analyses, Perceived Noise Level Analysis, and Acoustic Weighting Networks; and band analyses (octave and third-octave) paired with basic statistical analyses (mean, variance, standard deviation calculation).

  19. Analysis of Time-Series Quasi-Experiments. Final Report.

    ERIC Educational Resources Information Center

    Glass, Gene V.; Maguire, Thomas O.

    The objective of this project was to investigate the adequacy of statistical models developed by G. E. P. Box and G. C. Tiao for the analysis of time-series quasi-experiments: (1) The basic model developed by Box and Tiao is applied to actual time-series experiment data from two separate experiments, one in psychology and one in educational…

  20. An exploratory investigation of weight estimation techniques for hypersonic flight vehicles

    NASA Technical Reports Server (NTRS)

    Cook, E. L.

    1981-01-01

    The three basic methods of weight prediction (fixed-fraction, statistical correlation, and point stress analysis) and some of the computer programs that have been developed to implement them are discussed. A modified version of the WAATS (Weights Analysis of Advanced Transportation Systems) program is presented, along with input data forms and an example problem.

  1. Prerequisites for Systems Analysts: Analytic and Management Demands of a New Approach to Educational Administration.

    ERIC Educational Resources Information Center

    Ammentorp, William

    There is much to be gained by using systems analysis in educational administration. Most administrators, presently relying on classical statistical techniques restricted to problems having few variables, should be trained to use more sophisticated tools such as systems analysis. The systems analyst, interested in the basic processes of a group or…

  2. Software for Data Analysis with Graphical Models

    NASA Technical Reports Server (NTRS)

    Buntine, Wray L.; Roy, H. Scott

    1994-01-01

    Probabilistic graphical models are being used widely in artificial intelligence and statistics, for instance, in diagnosis and expert systems, as a framework for representing and reasoning with probabilities and independencies. They come with corresponding algorithms for performing statistical inference. This offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper illustrates the framework with an example and then presents some basic techniques for the task: problem decomposition and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.

  3. Building the Community Online Resource for Statistical Seismicity Analysis (CORSSA)

    NASA Astrophysics Data System (ADS)

    Michael, A. J.; Wiemer, S.; Zechar, J. D.; Hardebeck, J. L.; Naylor, M.; Zhuang, J.; Steacy, S.; Corssa Executive Committee

    2010-12-01

    Statistical seismology is critical to the understanding of seismicity, the testing of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology, especially to those aspects with great impact on public policy, statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal to enhance the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA). CORSSA is a web-based educational platform that is authoritative, up-to-date, prominent, and user-friendly. We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined; a governing structure was organized; and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each containing between four and eight articles. The CORSSA web page, www.corssa.org, officially unveiled on September 6, 2010, debuts with an initial set of approximately 10 to 15 articles available online for viewing and commenting, with additional articles to be added over the coming months. Each article will be peer-reviewed and will present a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles will include: introductions to both CORSSA and statistical seismology, basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. A special article will compare and review available statistical seismology software packages.

  4. Statistics and Discoveries at the LHC (1/4)

    ScienceCinema

    Cowan, Glen

    2018-02-09

    The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit setting procedures, and treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect and treatment of systematic uncertainties.
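The p-value/significance conversion central to such searches can be sketched as follows (illustrative Python using the one-sided Gaussian convention common in particle physics; not taken from the lectures):

```python
from statistics import NormalDist

# One-sided conversion between a p-value and a Gaussian significance Z.
# The conventional discovery threshold is Z = 5 ("five sigma").
def significance(p):
    return NormalDist().inv_cdf(1.0 - p)

def p_value(z):
    return 1.0 - NormalDist().cdf(z)

print(p_value(5.0))        # about 2.87e-7, the 5-sigma discovery threshold
print(significance(0.05))  # about 1.64
```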

  5. Statistics and Discoveries at the LHC (3/4)

    ScienceCinema

    Cowan, Glen

    2018-02-19

    The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit setting procedures, and treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect and treatment of systematic uncertainties.

  6. Statistics and Discoveries at the LHC (4/4)

    ScienceCinema

    Cowan, Glen

    2018-05-22

    The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit setting procedures, and treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect and treatment of systematic uncertainties.

  7. Statistics and Discoveries at the LHC (2/4)

    ScienceCinema

    Cowan, Glen

    2018-04-26

    The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit setting procedures, and treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect and treatment of systematic uncertainties.

  8. Urological research in sub-Saharan Africa: a retrospective cohort study of abstracts presented at the Nigerian Association of Urological Surgeons conferences.

    PubMed

    Bello, Jibril Oyekunle

    2013-11-14

    Nigeria is one of the top three countries in Africa in terms of science research output, and Nigerian urologists' biomedical research contributes to this. Each year, urologists in Nigeria gather to present their recent research at the conference of the Nigerian Association of Urological Surgeons (NAUS). These abstracts are not vetted as thoroughly as full-length manuscripts published in peer-reviewed journals, but the information they disseminate may affect the clinical practice of attendees. This study aims to describe the characteristics of abstracts presented at the annual NAUS conferences, the quality of the abstracts as judged by the subsequent publication of full-length manuscripts in peer-reviewed indexed journals, and the factors that influence such successful publication. Abstracts presented at the 2007 to 2010 NAUS conferences were identified through conference abstract books. Using a strict search protocol, publication in peer-reviewed journals was determined. The abstracts' characteristics were analyzed, and their quality was judged by the subsequent successful publication of full-length manuscripts. Statistical analysis was performed using SPSS 16.0 software to determine factors predictive of successful publication. Only 75 abstracts were presented at the 2007 to 2010 NAUS conferences; a quarter (24%) of the presented abstracts were subsequently published as full-length manuscripts. Median time to publication was 15 months (range 2-40 months). Manuscripts whose result data were analyzed with 'beyond basic' statistics, rather than only frequencies and averages, were more likely to be published than those with basic or no statistics. Quality of the abstracts, and thus subsequent publication success, is influenced by the use of 'beyond basic' statistics in the analysis of the result data presented. There is a need for improvement in the quality of urological research from Nigeria.

  9. Counting Penguins.

    ERIC Educational Resources Information Center

    Perry, Mike; Kader, Gary

    1998-01-01

    Presents an activity on the simplification of penguin counting by employing the basic ideas and principles of sampling to teach students to understand and recognize its role in statistical claims. Emphasizes estimation, data analysis and interpretation, and central limit theorem. Includes a list of items for classroom discussion. (ASK)

  10. DESIGN AND ANALYSIS FOR THEMATIC MAP ACCURACY ASSESSMENT: FUNDAMENTAL PRINCIPLES

    EPA Science Inventory

    Before being used in scientific investigations and policy decisions, thematic maps constructed from remotely sensed data should be subjected to a statistically rigorous accuracy assessment. The three basic components of an accuracy assessment are: 1) the sampling design used to s...

  11. Statistical Characteristics of Single Sort of Grape Bulgarian Wines

    NASA Astrophysics Data System (ADS)

    Boyadzhiev, D.

    2008-10-01

    The aim of this paper is to evaluate the differences in the values of the 8 basic physicochemical indices of single sort of grape Bulgarian wines (white and red ones), obligatory for the standardization of ready production in the winery. Statistically significant differences in the values of various sorts and vintages are established and possibilities for identifying the sort and the vintage on the base of these indices by applying discriminant analysis are discussed.

  12. S.P.S.S. User's Manual #1-#4. Basic Program Construction in S.P.S.S.; S.P.S.S. Non-Procedural Statements and Procedural Commands; System Control Language and S.P.S.S.; Quick File Equate Statement Reference.

    ERIC Educational Resources Information Center

    Earl, Lorna L.

    This series of manuals describing and illustrating the Statistical Package for the Social Sciences (SPSS) was planned as a self-teaching instrument, beginning with the basics and progressing to an advanced level. Information on what the searcher must know to define the data and write a program for preliminary analysis is contained in manual 1,…

  13. [Statistical analysis of German radiologic periodicals: developmental trends in the last 10 years].

    PubMed

    Golder, W

    1999-09-01

    To identify which statistical tests are applied in German radiological publications, to what extent their use has changed during the last decade, and which factors might be responsible for this development. The major articles published in "ROFO" and "DER RADIOLOGE" during 1988, 1993 and 1998 were reviewed for statistical content. The contributions were classified by principal focus and radiological subspecialty. The methods used were assigned to descriptive, basal and advanced statistics. Sample size, significance level and power were established. The use of experts' assistance was monitored. Finally, we calculated the so-called cumulative accessibility of the publications. 525 contributions were found to be eligible. In 1988, 87% used descriptive statistics only, 12.5% basal, and 0.5% advanced statistics. The corresponding figures in 1993 and 1998 are 62 and 49%, 32 and 41%, and 6 and 10%, respectively. Statistical techniques were most likely to be used in research on musculoskeletal imaging and in articles dedicated to MRI. Six basic categories of statistical methods account for the complete statistical analysis appearing in 90% of the articles. ROC analysis is the single most common advanced technique. Authors increasingly make use of statistical experts' advice and of statistical software. During the last decade, the use of statistical methods in German radiological journals has fundamentally improved, both quantitatively and qualitatively. Presently, advanced techniques account for 20% of the pertinent statistical tests. This development seems to be promoted by the increasing availability of statistical analysis software.

  14. Analysis of basic clustering algorithms for numerical estimation of statistical averages in biomolecules.

    PubMed

    Anandakrishnan, Ramu; Onufriev, Alexey

    2008-03-01

    In statistical mechanics, the equilibrium properties of a physical system of particles can be calculated as the statistical average over accessible microstates of the system. In general, these calculations are computationally intractable since they involve summations over an exponentially large number of microstates. Clustering algorithms are one of the methods used to numerically approximate these sums. The most basic clustering algorithms first sub-divide the system into a set of smaller subsets (clusters). Then, interactions between particles within each cluster are treated exactly, while all interactions between different clusters are ignored. These smaller clusters have far fewer microstates, making the summation over these microstates tractable. These algorithms have been previously used for biomolecular computations, but remain relatively unexplored in this context. Presented here is a theoretical analysis of the error and computational complexity for the two most basic clustering algorithms that were previously applied in the context of biomolecular electrostatics. We derive a tight, computationally inexpensive, error bound for the equilibrium state of a particle computed via these clustering algorithms. For some practical applications, it is the root mean square error, which can be significantly lower than the error bound, that may be more important. We show that there is a strong empirical relationship between error bound and root mean square error, suggesting that the error bound could be used as a computationally inexpensive metric for predicting the accuracy of clustering algorithms for practical applications. An example of error analysis for such an application (computation of the average charge of ionizable amino acids in proteins) is given, demonstrating that the clustering algorithm can be accurate enough for practical purposes.
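The basic clustering idea described in this record can be sketched under simplifying assumptions (binary sites, pairwise couplings, toy system sizes; this is an illustration, not the authors' code): interactions within a cluster are summed exactly over that cluster's microstates, while couplings between clusters are dropped.

```python
import itertools
import math

def exact_averages(n, J, beta=1.0):
    """Exact Boltzmann average occupancy of n binary sites,
    with pairwise couplings J = {(i, j): J_ij}. Cost is O(2^n)."""
    Z, avg = 0.0, [0.0] * n
    for state in itertools.product((0, 1), repeat=n):
        E = sum(Jij * state[i] * state[j] for (i, j), Jij in J.items())
        w = math.exp(-beta * E)
        Z += w
        for k in range(n):
            avg[k] += w * state[k]
    return [a / Z for a in avg]

def clustered_averages(n, J, clusters, beta=1.0):
    """Basic clustering approximation: interactions inside each cluster
    are treated exactly; couplings between clusters are ignored."""
    avg = [0.0] * n
    for cluster in clusters:
        local = {(i, j): Jij for (i, j), Jij in J.items()
                 if i in cluster and j in cluster}
        Z, acc = 0.0, {i: 0.0 for i in cluster}
        for state in itertools.product((0, 1), repeat=len(cluster)):
            s = dict(zip(cluster, state))
            E = sum(Jij * s[i] * s[j] for (i, j), Jij in local.items())
            w = math.exp(-beta * E)
            Z += w
            for i in cluster:
                acc[i] += w * s[i]
        for i in cluster:
            avg[i] = acc[i] / Z
    return avg
```

When no couplings cross cluster boundaries the approximation is exact, which is a useful sanity check; the error analyzed in the paper comes entirely from the ignored inter-cluster terms.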

  15. Combining statistical inference and decisions in ecology.

    PubMed

    Williams, Perry J; Hooten, Mevin B

    2016-09-01

    Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods, including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem. © 2016 by the Ecological Society of America.
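The central role of loss functions in SDT can be sketched with a discrete posterior (illustrative numbers, not data from the paper): the Bayes estimate minimizes posterior expected loss, yielding the posterior mean under squared loss and the posterior median under absolute loss.

```python
# Illustrative discrete posterior over a parameter theta (made-up numbers).
theta = [0.0, 0.25, 0.5, 0.75, 1.0]
post = [0.05, 0.20, 0.40, 0.25, 0.10]

post_mean = sum(t * p for t, p in zip(theta, post))

def expected_loss(action, loss):
    """Posterior expected loss of taking `action`."""
    return sum(p * loss(action, t) for t, p in zip(theta, post))

def bayes_estimate(loss, candidates):
    """Action minimizing posterior expected loss over a candidate grid."""
    return min(candidates, key=lambda a: expected_loss(a, loss))

def squared(a, t):
    return (a - t) ** 2

def absolute(a, t):
    return abs(a - t)

grid = [i / 100 for i in range(101)]
est_sq = bayes_estimate(squared, grid)    # close to the posterior mean
est_abs = bayes_estimate(absolute, grid)  # the posterior median
```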

  16. PopSc: Computing Toolkit for Basic Statistics of Molecular Population Genetics Simultaneously Implemented in Web-Based Calculator, Python and R

    PubMed Central

    Huang, Ying; Li, Cao; Liu, Linhai; Jia, Xianbo; Lai, Song-Jia

    2016-01-01

    Although various computer tools have been elaborately developed to calculate a series of statistics in molecular population genetics for both small- and large-scale DNA data, there is no efficient and easy-to-use toolkit available yet for exclusively focusing on the steps of mathematical calculation. Here, we present PopSc, a bioinformatic toolkit for calculating 45 basic statistics in molecular population genetics, which could be categorized into three classes, including (i) genetic diversity of DNA sequences, (ii) statistical tests for neutral evolution, and (iii) measures of genetic differentiation among populations. In contrast to the existing computer tools, PopSc was designed to directly accept the intermediate metadata, such as allele frequencies, rather than the raw DNA sequences or genotyping results. PopSc is first implemented as a web-based calculator with a user-friendly interface, which greatly facilitates the teaching of population genetics in class and also promotes the convenient and straightforward calculation of statistics in research. Additionally, we also provide the Python library and R package of PopSc, which can be flexibly integrated into other advanced bioinformatic packages of population genetics analysis. PMID:27792763
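Two of the basic statistics a toolkit like PopSc computes from allele frequencies can be sketched directly (generic population-genetics formulas, not PopSc's own code): expected heterozygosity and Wright's Fst for a biallelic locus.

```python
def expected_heterozygosity(freqs):
    """Expected heterozygosity He = 1 - sum(p_i^2), from the
    frequencies of all alleles at one locus."""
    return 1.0 - sum(p * p for p in freqs)

def wright_fst(subpop_freqs):
    """Wright's Fst for one biallelic locus, from the frequency of
    one allele in each subpopulation: Fst = (Ht - Hs) / Ht."""
    n = len(subpop_freqs)
    hs = sum(2 * p * (1 - p) for p in subpop_freqs) / n  # mean within-pop He
    p_bar = sum(subpop_freqs) / n                        # pooled frequency
    ht = 2 * p_bar * (1 - p_bar)                         # total He
    return (ht - hs) / ht
```

This also illustrates the record's design point: such statistics need only intermediate metadata (allele frequencies), not raw sequences.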

  17. PopSc: Computing Toolkit for Basic Statistics of Molecular Population Genetics Simultaneously Implemented in Web-Based Calculator, Python and R.

    PubMed

    Chen, Shi-Yi; Deng, Feilong; Huang, Ying; Li, Cao; Liu, Linhai; Jia, Xianbo; Lai, Song-Jia

    2016-01-01

    Although various computer tools have been elaborately developed to calculate a series of statistics in molecular population genetics for both small- and large-scale DNA data, there is no efficient and easy-to-use toolkit available yet for exclusively focusing on the steps of mathematical calculation. Here, we present PopSc, a bioinformatic toolkit for calculating 45 basic statistics in molecular population genetics, which could be categorized into three classes, including (i) genetic diversity of DNA sequences, (ii) statistical tests for neutral evolution, and (iii) measures of genetic differentiation among populations. In contrast to the existing computer tools, PopSc was designed to directly accept the intermediate metadata, such as allele frequencies, rather than the raw DNA sequences or genotyping results. PopSc is first implemented as a web-based calculator with a user-friendly interface, which greatly facilitates the teaching of population genetics in class and also promotes the convenient and straightforward calculation of statistics in research. Additionally, we also provide the Python library and R package of PopSc, which can be flexibly integrated into other advanced bioinformatic packages of population genetics analysis.

  18. Awareness, Attitude, and Knowledge of Basic Life Support among Medical, Dental, and Nursing Faculties and Students in the University Hospital.

    PubMed

    Sangamesh, N C; Vidya, K C; Pathi, Jugajyoti; Singh, Arpita

    2017-01-01

    To assess the awareness, attitude, and knowledge about basic life support (BLS) among medical, dental, and nursing students and faculties, and to propose including BLS skills in the academic curriculum of the undergraduate (UG) course. Recognition, prevention, and effective management of life-threatening emergencies are the responsibility of health-care professionals. These situations can be successfully managed with proper knowledge and training in BLS skills. These life-saving maneuvers can be taught through structured resuscitation programs, which are lacking in the academic curriculum. A questionnaire study consisting of 20 questions was conducted among 659 participants at the Kalinga Institute of Dental Sciences and the Kalinga Institute of Medical Sciences, KIIT University. Medical junior residents, BDS faculties, interns, nursing faculties, and 3rd-year and final-year UG students from both medical and dental colleges were chosen. The statistical analysis was carried out using SPSS software version 20.0 (Armonk, NY: IBM Corp.). After collecting the data, the values were statistically analyzed and tabulated. Statistical analysis was performed using the Mann-Whitney U-test. Results with P < 0.05 were considered statistically significant. Our participants were aware of BLS and showed a positive attitude toward it, whereas knowledge about BLS was lacking, with a statistically significant P value. By introducing BLS regularly into the academic curriculum and through routine hands-on workshops, all health-care providers should become well versed in BLS skills for effectively managing life-threatening emergencies.

  19. Macro-Econophysics

    NASA Astrophysics Data System (ADS)

    Aoyama, Hideaki; Fujiwara, Yoshi; Ikeda, Yuichi; Iyetomi, Hiroshi; Souma, Wataru; Yoshikawa, Hiroshi

    2017-07-01

    Preface; Foreword; Acknowledgements; List of tables; List of figures; Prologue; 1. Introduction: reconstructing macroeconomics; 2. Basic concepts in statistical physics and stochastic models; 3. Income and firm-size distributions; 4. Productivity distribution and related topics; 5. Multivariate time-series analysis; 6. Business cycles; 7. Price dynamics and inflation/deflation; 8. Complex network, community analysis, visualization; 9. Systemic risks; Appendix A: computer program for beginners; Epilogue; Bibliography; Index.

  20. The use of quizStar application for online examination in basic physics course

    NASA Astrophysics Data System (ADS)

    Kustijono, R.; Budiningarti, H.

    2018-03-01

    The purpose of the study is to produce an online Basic Physics exam system using the QuizStar application. This is a research and development study using the ADDIE model. The steps are: 1) analysis; 2) design; 3) development; 4) implementation; 5) evaluation. System feasibility is reviewed for validity, practicality, and effectiveness. The subjects of the research are 60 Physics Department students of Universitas Negeri Surabaya. The data analysis used is descriptive statistics. The validity, practicality, and effectiveness scores are measured using a Likert scale. The system is judged feasible if the total score across all aspects is ≥ 61%. The results obtained for the online test system developed with QuizStar are: 1) it is conceptually feasible to use; 2) the system can be implemented in the Basic Physics assessment process, and the existing constraints can be overcome; 3) students' response to system usage is in the good category. The results lead to the conclusion that the QuizStar application is eligible to be used for an online Basic Physics exam system.

  1. Development of polytoxicomania in function of defence from psychoticism.

    PubMed

    Nenadović, Milutin M; Sapić, Rosa

    2011-01-01

    The proportion of polytoxicomania in subpopulations of youth has been growing steadily in recent decades, and this trend is pan-continental. Psychoticism is a psychological construct that assumes special basic dimensions of personality disintegration and of cognitive functions. Psychoticism may, in general, underlie the pathological functioning of youth and influence the patterns of thought, feeling and action that cause dysfunction. The aim of this study was to determine the distribution of the basic dimensions of psychoticism in youths' commitment to abusing psychoactive substances (PAS) in order to reduce disturbing intrapsychic experiences or the manifestation of psychotic symptoms. For the purpose of this study, two groups of respondents were formed, balanced by age, gender and structure of family of origin (at least one parent alive). The study applied the DELTA-9 instrument for the assessment of cognitive disintegration in order to establish psychoticism and operationalize it. The obtained results were statistically analyzed. Among the parameters of descriptive statistics, the arithmetic mean was calculated together with measures of dispersion. A cross-tabular analysis of the variables tested was performed, as well as tests of statistical significance with Pearson's chi2-test and analysis of variance. Age structure and gender were approximately equally represented in the polytoxicomaniac group and the control group; testing did not confirm a statistically significant difference (p > 0.5). Statistical analysis established that polytoxicomaniacs differed significantly from the control group of respondents on most variables of psychoticism. Testing confirmed a high statistical significance of the differences in the variables of psychoticism between the groups of respondents, from p < 0.001 to p < 0.01. A statistically significant representation of the dimension of psychoticism in the polytoxicomaniac group was established. The presence of factors concerning common executive dysfunction was emphasized.

  2. Finite Element Analysis of Reverberation Chambers

    NASA Technical Reports Server (NTRS)

    Bunting, Charles F.; Nguyen, Duc T.

    2000-01-01

    The primary motivating factor behind the initiation of this work was to provide a deterministic means of establishing the validity of the statistical methods that are recommended for the determination of fields that interact in an avionics system. The application of finite element analysis to reverberation chambers is the initial step required to establish a reasonable course of inquiry in this particularly data-intensive study. The use of computational electromagnetics provides a high degree of control over the "experimental" parameters that can be utilized in a simulation of reverberating structures. As the work evolved, there were four primary focus areas: 1. The eigenvalue problem for the source-free problem. 2. The development of an efficient complex eigensolver. 3. The application of a source for the TE and TM fields for statistical characterization. 4. The examination of shielding effectiveness in a reverberating environment. One early purpose of this work was to establish the utility of finite element techniques in the development of an extended low-frequency statistical model for reverberation phenomena. By employing finite element techniques, structures of arbitrary complexity can be analyzed due to the use of triangular shape functions in the spatial discretization. The effects of both frequency stirring and mechanical stirring are presented. It is suggested that for low-frequency operation the typical tuner size is inadequate to provide a sufficiently random field and that frequency stirring should be used. The results of the finite element analysis of the reverberation chamber illustrate the potential utility of a 2D representation for enhancing the basic statistical characteristics of the chamber when operating in a low-frequency regime. The basic field statistics are verified for frequency stirring over a wide range of frequencies. Mechanical stirring is shown to provide an effective frequency deviation.

  3. Specialized data analysis of SSME and advanced propulsion system vibration measurements

    NASA Technical Reports Server (NTRS)

    Coffin, Thomas; Swanson, Wayne L.; Jong, Yen-Yi

    1993-01-01

    The basic objectives of this contract were to perform detailed analysis and evaluation of dynamic data obtained during Space Shuttle Main Engine (SSME) test and flight operations, including analytical/statistical assessment of component dynamic performance, and to continue the development and implementation of analytical/statistical models to effectively define nominal component dynamic characteristics, detect anomalous behavior, and assess machinery operational conditions. This study was to provide timely assessment of engine component operational status, identify probable causes of malfunction, and define feasible engineering solutions. The work was performed under three broad tasks: (1) Analysis, Evaluation, and Documentation of SSME Dynamic Test Results; (2) Data Base and Analytical Model Development and Application; and (3) Development and Application of Vibration Signature Analysis Techniques.

  4. Survival analysis in hematologic malignancies: recommendations for clinicians

    PubMed Central

    Delgado, Julio; Pereira, Arturo; Villamor, Neus; López-Guillermo, Armando; Rozman, Ciril

    2014-01-01

    The widespread availability of statistical packages has undoubtedly helped hematologists worldwide in the analysis of their data, but has also led to the inappropriate use of statistical methods. In this article, we review some basic concepts of survival analysis and also make recommendations about how and when to perform each particular test using SPSS, Stata and R. In particular, we describe a simple way of defining cut-off points for continuous variables and the appropriate and inappropriate uses of the Kaplan-Meier method and Cox proportional hazard regression models. We also provide practical advice on how to check the proportional hazards assumption and briefly review the role of relative survival and multiple imputation. PMID:25176982
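The Kaplan-Meier method the authors discuss can be sketched in a few lines of plain Python (an illustrative implementation, not the SPSS/Stata/R recipes from the article): at each distinct event time, the survival estimate is multiplied by (1 - deaths / number at risk), and censored subjects leave the risk set without triggering a step.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.
    times: follow-up times; events: 1 = event observed, 0 = censored.
    Returns a list of (event_time, S(t)) pairs."""
    data = sorted(zip(times, events))
    n = len(data)
    s, curve, i = 1.0, [], 0
    while i < n:
        t = data[i][0]
        d = sum(1 for tt, e in data[i:] if tt == t and e == 1)
        at_risk = n - i  # everyone with follow-up time >= t
        if d > 0:
            s *= 1.0 - d / at_risk
            curve.append((t, s))
        i += sum(1 for tt, _ in data if tt == t)  # skip past this time
    return curve
```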

  5. SEDIDAT: A BASIC program for the collection and statistical analysis of particle settling velocity data

    NASA Astrophysics Data System (ADS)

    Wright, Robyn; Thornberg, Steven M.

    SEDIDAT is a series of compiled IBM-BASIC (version 2.0) programs that direct the collection, statistical calculation, and graphic presentation of particle settling velocity and equivalent spherical diameter for samples analyzed using the settling tube technique. The programs follow a menu-driven format that is easily understood by students and scientists with little previous computer experience. Settling velocity is measured directly (cm/sec) and also converted into Chi units. Equivalent spherical diameter (reported in Phi units) is calculated using a modified Gibbs equation for different particle densities. Input parameters, such as water temperature, settling distance, particle density, run time, and Phi/Chi interval, are easily changed at operator discretion. Optional output to a dot-matrix printer includes a summary of moment and graphic statistical parameters, a tabulation of individual and cumulative weight percents, a listing of major distribution modes, and cumulative and histogram plots of raw time, settling velocity, Chi, and Phi data.
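The unit conversions mentioned in this record can be sketched as follows. Phi = -log2(d / 1 mm) is the standard grain-size scale; the Chi definition below (an analogous log2 transform of settling velocity) is an assumption for illustration, and the modified Gibbs equation itself is not reproduced here.

```python
import math

def phi_units(diameter_mm):
    """Grain size in Phi units: Phi = -log2(d / 1 mm)."""
    return -math.log2(diameter_mm)

def chi_units(velocity_cm_s):
    """Settling velocity on a log2 scale, by analogy with Phi
    (assumed definition: Chi = -log2(w / 1 cm/s))."""
    return -math.log2(velocity_cm_s)
```

So a 1 mm grain is 0 Phi, a 0.25 mm grain is 2 Phi, and each halving of diameter adds one Phi unit.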

  6. 10 CFR 431.17 - Determination of efficiency.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... characteristics of that basic model, and (ii) Based on engineering or statistical analysis, computer simulation or... simulation or modeling, and other analytic evaluation of performance data on which the AEDM is based... applied. (iii) If requested by the Department, the manufacturer shall conduct simulations to predict the...

  7. 10 CFR 431.17 - Determination of efficiency.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... characteristics of that basic model, and (ii) Based on engineering or statistical analysis, computer simulation or... simulation or modeling, and other analytic evaluation of performance data on which the AEDM is based... applied. (iii) If requested by the Department, the manufacturer shall conduct simulations to predict the...

  8. Basic Research in Information Science in France.

    ERIC Educational Resources Information Center

    Chambaud, S.; Le Coadic, Y. F.

    1987-01-01

    Discusses the goals of French academic research policy in the field of information science, emphasizing the interdisciplinary nature of the field. Areas of research highlighted include communication, telecommunications, co-word analysis in scientific and technical documents, media, and statistical methods for the study of social sciences. (LRW)

  9. The modern Japanese color lexicon.

    PubMed

    Kuriki, Ichiro; Lange, Ryan; Muto, Yumiko; Brown, Angela M; Fukuda, Kazuho; Tokunaga, Rumi; Lindsey, Delwin T; Uchikawa, Keiji; Shioiri, Satoshi

    2017-03-01

    Despite numerous prior studies, important questions about the Japanese color lexicon persist, particularly about the number of Japanese basic color terms and their deployment across color space. Here, 57 native Japanese speakers provided monolexemic terms for 320 chromatic and 10 achromatic Munsell color samples. Through k-means cluster analysis we revealed 16 statistically distinct Japanese chromatic categories. These included eight chromatic basic color terms (aka/red, ki/yellow, midori/green, ao/blue, pink, orange, cha/brown, and murasaki/purple) plus eight additional terms: mizu ("water")/light blue, hada ("skin tone")/peach, kon ("indigo")/dark blue, matcha ("green tea")/yellow-green, enji/maroon, oudo ("sand or mud")/mustard, yamabuki ("globeflower")/gold, and cream. Of these additional terms, mizu was used by 98% of informants, and emerged as a strong candidate for a 12th Japanese basic color term. Japanese and American English color-naming systems were broadly similar, except for color categories in one language (mizu, kon, teal, lavender, magenta, lime) that had no equivalent in the other. Our analysis revealed two statistically distinct Japanese motifs (or color-naming systems), which differed mainly in the extension of mizu across our color palette. Comparison of the present data with an earlier study by Uchikawa & Boynton (1987) suggests that some changes in the Japanese color lexicon have occurred over the last 30 years.
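The k-means clustering used in this study can be sketched in one dimension (a toy illustration with deterministic initialization, not the authors' analysis of Munsell samples): points are repeatedly assigned to their nearest center, and each center is moved to the mean of its assigned points.

```python
def kmeans_1d(points, k, iters=20):
    """Tiny 1-D k-means with deterministic initialization spread
    evenly over the value range (assumes k >= 2)."""
    lo, hi = min(points), max(points)
    centers = [lo + (hi - lo) * j / (k - 1) for j in range(k)]
    for _ in range(iters):
        # Assign each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for x in points:
            nearest = min(range(k), key=lambda j: abs(x - centers[j]))
            clusters[nearest].append(x)
        # Move each center to the mean of its cluster (keep it if empty).
        centers = [sum(c) / len(c) if c else centers[j]
                   for j, c in enumerate(clusters)]
    return sorted(centers)
```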

  10. Teaching Basic Probability in Undergraduate Statistics or Management Science Courses

    ERIC Educational Resources Information Center

    Naidu, Jaideep T.; Sanford, John F.

    2017-01-01

    Standard textbooks in core Statistics and Management Science classes present various examples to introduce basic probability concepts to undergraduate business students. These include tossing of a coin, throwing a die, and examples of that nature. While these are good examples to introduce basic probability, we use improvised versions of Russian…

  11. Systems and methods for knowledge discovery in spatial data

    DOEpatents

    Obradovic, Zoran; Fiez, Timothy E.; Vucetic, Slobodan; Lazarevic, Aleksandar; Pokrajac, Dragoljub; Hoskinson, Reed L.

    2005-03-08

    Systems and methods are provided for knowledge discovery in spatial data, as well as for optimizing recipes used in spatial environments such as may be found in precision agriculture. A spatial data analysis and modeling module is provided which allows users to interactively and flexibly analyze and mine spatial data. The spatial data analysis and modeling module applies spatial data mining algorithms through a number of steps. The data loading and generation module obtains or generates spatial data and allows for basic partitioning. The inspection module provides basic statistical analysis. The preprocessing module smoothes and cleans the data and allows for basic manipulation of the data. The partitioning module provides for more advanced data partitioning. The prediction module applies regression and classification algorithms on the spatial data. The integration module enhances prediction methods by combining and integrating models. The recommendation module provides the user with site-specific recommendations as to how to optimize a recipe for a spatial environment, such as a fertilizer recipe for an agricultural field.

  12. OSPAR standard method and software for statistical analysis of beach litter data.

    PubMed

    Schulz, Marcus; van Loon, Willem; Fleet, David M; Baggelaar, Paul; van der Meulen, Eit

    2017-09-15

    The aim of this study is to develop standard statistical methods and software for the analysis of beach litter data. The optimal ensemble of statistical methods comprises the Mann-Kendall trend test, the Theil-Sen slope estimation, the Wilcoxon step trend test and basic descriptive statistics. The application of Litter Analyst, a tailor-made software for analysing the results of beach litter surveys, to OSPAR beach litter data from seven beaches bordering on the south-eastern North Sea, revealed 23 significant trends in the abundances of beach litter types for the period 2009-2014. Litter Analyst revealed a large variation in the abundance of litter types between beaches. To reduce the effects of spatial variation, trend analysis of beach litter data can most effectively be performed at the beach or national level. Spatial aggregation of beach litter data within a region is possible, but resulted in a considerable reduction in the number of significant trends. Copyright © 2017 Elsevier Ltd. All rights reserved.
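Two pieces of the statistical ensemble named here, the Mann-Kendall S statistic and the Theil-Sen slope, can be sketched in plain Python (illustrative implementations; Litter Analyst's actual code is not shown in the record):

```python
from itertools import combinations

def mann_kendall_s(y):
    """Mann-Kendall S statistic: number of increasing pairs minus
    number of decreasing pairs, over all ordered pairs (i < j)."""
    def sgn(d):
        return (d > 0) - (d < 0)
    return sum(sgn(y[j] - y[i]) for i, j in combinations(range(len(y)), 2))

def theil_sen_slope(t, y):
    """Theil-Sen estimator: the median of all pairwise slopes."""
    slopes = sorted((y[j] - y[i]) / (t[j] - t[i])
                    for i, j in combinations(range(len(t)), 2))
    m = len(slopes)
    mid = m // 2
    return slopes[mid] if m % 2 else 0.5 * (slopes[mid - 1] + slopes[mid])
```

Both are rank- and median-based, which is why they tolerate the skewed, outlier-prone counts typical of beach litter surveys better than least-squares trend fitting.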

  13. Peers versus professional training of basic life support in Syria: a randomized controlled trial.

    PubMed

    Abbas, Fatima; Sawaf, Bisher; Hanafi, Ibrahem; Hajeer, Mohammad Younis; Zakaria, Mhd Ismael; Abbas, Wafaa; Alabdeh, Fadi; Ibrahim, Nazir

    2018-06-18

    Peer training has been identified as a useful tool for delivering undergraduate training in basic life support (BLS), which is fundamental as an initial response in cases of emergency. This study aimed to (1) evaluate the efficacy of a peer-led model of basic life support training among medical students in their first three years of study, compared to professional-led training, and (2) assess the efficacy of the course program and students' satisfaction with peer-led training. A randomized controlled trial with blinded assessors was conducted on 72 medical students from the pre-clinical years (1st to 3rd years in Syria) at the Syrian Private University. Students were randomly assigned to a peer-led or a professional-led training group for a one-day course in basic life support skills. Sixty-four students who underwent checklist-based assessment using an objective structured clinical examination design (OSCE) (practical assessment of BLS skills) and answered a BLS knowledge checkpoint questionnaire were included in the analysis. There was no statistically significant difference between the two groups in delivering BLS skills to medical students in the practical (P = 0.850) or BLS knowledge questionnaire outcomes (P = 0.900). Both groups showed statistically significant improvement from pre- to post-course assessment in both practical skills and theoretical knowledge (P-value < 0.001). Students were satisfied with the peer model of training. Peer-led training of basic life support for medical students was beneficial, and it provided a quality of education as effective as training conducted by professionals. This method is applicable and desirable, especially in poorly resourced countries and in crisis situations.

  14. The microcomputer scientific software series 3: general linear model--analysis of variance.

    Treesearch

    Harold M. Rauscher

    1985-01-01

    A BASIC language set of programs, designed for use on microcomputers, is presented. This set of programs will perform the analysis of variance for any statistical model describing either balanced or unbalanced designs. The program computes and displays the degrees of freedom, Type I sum of squares, and the mean square for the overall model, the error, and each factor...

  15. [Introduction to Exploratory Factor Analysis (EFA)].

    PubMed

    Martínez, Carolina Méndez; Sepúlveda, Martín Alonso Rondón

    2012-03-01

    Exploratory Factor Analysis (EFA) has become one of the most frequently used statistical techniques, especially in the medical and social sciences. Given its popularity, it is essential to understand the basic concepts necessary for its proper application and to take into consideration the main strengths and weaknesses of this technique. To present in a clear and concise manner the main applications of this technique, to determine the basic requirements for its use by providing a step-by-step description of its methodology, and to establish the elements that must be taken into account during its preparation in order to avoid erroneous results and interpretations. Narrative review. This review identifies the basic concepts and briefly describes the objectives, design, assumptions, and methodology needed to achieve factor derivation, global fit evaluation, and adequate interpretation of results. Copyright © 2012 Asociación Colombiana de Psiquiatría. Published by Elsevier España. All rights reserved.

  16. The Mathematics of Garlic

    ERIC Educational Resources Information Center

    Moore, Nathan T.; Deming, John C.

    2010-01-01

    The garlic problem presented in this article develops several themes related to dimensional analysis and also introduces students to a few basic statistical ideas. This garlic problem was used in a university preparatory chemistry class, designed for students with no chemistry background. However, this course is unique because one of the primary…

  17. Analysis and Interpretation of Findings Using Multiple Regression Techniques

    ERIC Educational Resources Information Center

    Hoyt, William T.; Leierer, Stephen; Millington, Michael J.

    2006-01-01

    Multiple regression and correlation (MRC) methods form a flexible family of statistical techniques that can address a wide variety of different types of research questions of interest to rehabilitation professionals. In this article, we review basic concepts and terms, with an emphasis on interpretation of findings relevant to research questions…

  18. How to Engage Medical Students in Chronobiology: An Example on Autorhythmometry

    ERIC Educational Resources Information Center

    Rol de Lama, M. A.; Lozano, J. P.; Ortiz, V.; Sanchez-Vazquez, F. J.; Madrid, J. A.

    2005-01-01

    This contribution describes a new laboratory experience that improves medical students' learning of chronobiology by introducing them to basic chronobiology concepts as well as to methods and statistical analysis tools specific for circadian rhythms. We designed an autorhythmometry laboratory session where students simultaneously played the role…

  19. UNITY: Confronting Supernova Cosmology's Statistical and Systematic Uncertainties in a Unified Bayesian Framework

    NASA Astrophysics Data System (ADS)

    Rubin, D.; Aldering, G.; Barbary, K.; Boone, K.; Chappell, G.; Currie, M.; Deustua, S.; Fagrelius, P.; Fruchter, A.; Hayden, B.; Lidman, C.; Nordin, J.; Perlmutter, S.; Saunders, C.; Sofiatti, C.; Supernova Cosmology Project, The

    2015-11-01

    While recent supernova (SN) cosmology research has benefited from improved measurements, current analysis approaches are not statistically optimal and will prove insufficient for future surveys. This paper discusses the limitations of current SN cosmological analyses in treating outliers, selection effects, shape- and color-standardization relations, unexplained dispersion, and heterogeneous observations. We present a new Bayesian framework, called UNITY (Unified Nonlinear Inference for Type-Ia cosmologY), that incorporates significant improvements in our ability to confront these effects. We apply the framework to real SN observations and demonstrate smaller statistical and systematic uncertainties. We verify earlier results that SNe Ia require nonlinear shape and color standardizations, but we now include these nonlinear relations in a statistically well-justified way. This analysis was primarily performed blinded, in that the basic framework was first validated on simulated data before transitioning to real data. We also discuss possible extensions of the method.

  20. Space Shuttle Missions Summary

    NASA Technical Reports Server (NTRS)

    Bennett, Floyd V.; Legler, Robert D.

    2011-01-01

    This document has been produced and updated over a 21-year period. It is intended to be a handy reference document, basically one page per flight, and care has been exercised to make it as error-free as possible. This document is basically "as flown" data and has been compiled from many sources including flight logs, flight rules, flight anomaly logs, mod flight descent summary, post-flight analysis of MPS propellants, FDRD, FRD, SODB, and the MER shuttle flight data and inflight anomaly list. Orbit distance traveled is taken from the PAO mission statistics.

  1. Data Analysis Techniques for Physical Scientists

    NASA Astrophysics Data System (ADS)

    Pruneau, Claude A.

    2017-10-01

    Preface; How to read this book; 1. The scientific method; Part I. Foundation in Probability and Statistics: 2. Probability; 3. Probability models; 4. Classical inference I: estimators; 5. Classical inference II: optimization; 6. Classical inference III: confidence intervals and statistical tests; 7. Bayesian inference; Part II. Measurement Techniques: 8. Basic measurements; 9. Event reconstruction; 10. Correlation functions; 11. The multiple facets of correlation functions; 12. Data correction methods; Part III. Simulation Techniques: 13. Monte Carlo methods; 14. Collision and detector modeling; List of references; Index.

  2. Analysis of Variance in Statistical Image Processing

    NASA Astrophysics Data System (ADS)

    Kurz, Ludwik; Hafed Benteftifa, M.

    1997-04-01

    A key problem in practical image processing is the detection of specific features in a noisy image. Analysis of variance (ANOVA) techniques can be very effective in such situations, and this book gives a detailed account of the use of ANOVA in statistical image processing. The book begins by describing the statistical representation of images in the various ANOVA models. The authors present a number of computationally efficient algorithms and techniques to deal with such problems as line, edge, and object detection, as well as image restoration and enhancement. By describing the basic principles of these techniques, and showing their use in specific situations, the book will facilitate the design of new algorithms for particular applications. It will be of great interest to graduate students and engineers in the field of image processing and pattern recognition.

  3. Cluster analysis of cognitive performance in elderly and demented subjects.

    PubMed

    Giaquinto, S; Nolfe, G; Calvani, M

    1985-06-01

    Forty-eight elderly normal subjects, 14 demented subjects and 76 young controls were tested for basic cognitive functions. All the tests were quantified and could therefore be subjected to statistical analysis. The results show a difference in the speed of information processing and in memory load between the young controls and the elderly normals, but the age groups differed in quantitative terms only. Cluster analysis showed that the elderly and the demented formed two distinctly separate groups at the qualitative level, the basic cognitive processes being damaged in the demented group. Age thus appears to be only a risk factor for dementia and not its cause. It is concluded that batteries based on precise and measurable tasks are the most appropriate not only for the study of dementia but for rehabilitation purposes too.

  4. Basic biostatistics for post-graduate students

    PubMed Central

    Dakhale, Ganesh N.; Hiware, Sachin K.; Shinde, Abhijit T.; Mahatme, Mohini S.

    2012-01-01

    Statistical methods are important to draw valid conclusions from the obtained data. This article provides background information on fundamental methods and techniques in biostatistics for the use of postgraduate students. The main focus is on types of data, measures of central tendency and variation, and basic tests useful for the analysis of different types of observations. Topics such as the normal distribution, sample size calculation, level of significance, the null hypothesis, indices of variability, and various tests are explained in detail with suitable examples. Using these guidelines, we are confident that postgraduate students will be able to classify distributions of data and apply the proper test. Information is also given regarding free software programs and websites useful for statistical calculations. Thus, postgraduate students will benefit whether they opt for academia or industry. PMID:23087501

  5. Descriptive and inferential statistical methods used in burns research.

    PubMed

    Al-Benna, Sammy; Al-Ajam, Yazan; Way, Benjamin; Steinstraesser, Lars

    2010-05-01

    Burns research articles utilise a variety of descriptive and inferential methods to present and analyse data. The aim of this study was to determine the descriptive methods (e.g. mean, median, SD, range, etc.) and survey the use of inferential methods (statistical tests) used in articles in the journal Burns. This study defined its population as all original articles published in the journal Burns in 2007. Letters to the editor, brief reports, reviews, and case reports were excluded. Study characteristics, use of descriptive statistics and the number and types of statistical methods employed were evaluated. Of the 51 articles analysed, 11 (22%) were randomised controlled trials, 18 (35%) were cohort studies, 11 (22%) were case-control studies and 11 (22%) were case series. The study design and objectives were defined in all articles. All articles made use of continuous and descriptive data. Inferential statistics were used in 49 (96%) articles. Data dispersion was calculated by standard deviation in 30 (59%). Standard error of the mean was quoted in 19 (37%). The statistical software product was named in 33 (65%). Of the 49 articles that used inferential statistics, the tests were named in 47 (96%). The six most common tests used (Student's t-test (53%), analysis of variance/covariance (33%), chi-squared test (27%), Wilcoxon and Mann-Whitney tests (22%), Fisher's exact test (12%)) accounted for the majority (72%) of statistical methods employed. A specified significance level was named in 43 (88%) and exact significance levels were reported in 28 (57%). Descriptive analysis and basic statistical techniques account for most of the statistical tests reported. This information should prove useful in deciding which tests should be emphasised in educating burn care professionals. These results highlight the need for burn care professionals to have a sound understanding of basic statistics, which is crucial in interpreting and reporting data. Advice should be sought from professionals in the fields of biostatistics and epidemiology when using more advanced statistical techniques. Copyright 2009 Elsevier Ltd and ISBI. All rights reserved.

  6. Linnorm: improved statistical analysis for single cell RNA-seq expression data

    PubMed Central

    Yip, Shun H.; Wang, Panwen; Kocher, Jean-Pierre A.; Sham, Pak Chung

    2017-01-01

    Abstract Linnorm is a novel normalization and transformation method for the analysis of single cell RNA sequencing (scRNA-seq) data. Linnorm is developed to remove technical noises and simultaneously preserve biological variations in scRNA-seq data, such that existing statistical methods can be improved. Using real scRNA-seq data, we compared Linnorm with existing normalization methods, including NODES, SAMstrt, SCnorm, scran, DESeq and TMM. Linnorm shows advantages in speed, technical noise removal and preservation of cell heterogeneity, which can improve existing methods in the discovery of novel subtypes, pseudo-temporal ordering of cells, clustering analysis, etc. Linnorm also performs better than existing DEG analysis methods, including BASiCS, NODES, SAMstrt, Seurat and DESeq2, in false positive rate control and accuracy. PMID:28981748

  7. Which statistics should tropical biologists learn?

    PubMed

    Loaiza Velásquez, Natalia; González Lutz, María Isabel; Monge-Nájera, Julián

    2011-09-01

    Tropical biologists study the richest and most endangered biodiversity on the planet, and in these times of climate change and mega-extinctions, the need for efficient, good-quality research is more pressing than in the past. However, the statistical component of research published by tropical authors sometimes suffers from poor-quality data collection, mediocre or flawed experimental design, and a rigid and outdated view of data analysis. To suggest improvements in their statistical education, we listed all the statistical tests and other quantitative analyses used in two leading tropical journals, the Revista de Biología Tropical and Biotropica, during one year. The 12 most frequent tests in the articles were: Analysis of Variance (ANOVA), Chi-Square Test, Student's T Test, Linear Regression, Pearson's Correlation Coefficient, Mann-Whitney U Test, Kruskal-Wallis Test, Shannon's Diversity Index, Tukey's Test, Cluster Analysis, Spearman's Rank Correlation Test and Principal Component Analysis. We conclude that statistical education for tropical biologists must abandon the old syllabus based on the mathematical side of statistics and concentrate on the correct selection of these and other procedures and tests, on their biological interpretation, and on the use of reliable and friendly freeware. We think that their time will be better spent understanding and protecting tropical ecosystems than trying to learn the mathematical foundations of statistics: in most cases, a well-designed one-semester course should be enough for their basic requirements.

  8. A Mediation Model to Explain the Role of Mathematics Skills and Probabilistic Reasoning on Statistics Achievement

    ERIC Educational Resources Information Center

    Primi, Caterina; Donati, Maria Anna; Chiesi, Francesca

    2016-01-01

    Among the wide range of factors related to the acquisition of statistical knowledge, competence in basic mathematics, including basic probability, has received much attention. In this study, a mediation model was estimated to derive the total, direct, and indirect effects of mathematical competence on statistics achievement taking into account…

  9. Narrative Review of Statistical Reporting Checklists, Mandatory Statistical Editing, and Rectifying Common Problems in the Reporting of Scientific Articles.

    PubMed

    Dexter, Franklin; Shafer, Steven L

    2017-03-01

    Considerable attention has been drawn to poor reproducibility in the biomedical literature. One explanation is inadequate reporting of statistical methods by authors and inadequate assessment of statistical reporting and methods during peer review. In this narrative review, we examine scientific studies of several well-publicized efforts to improve statistical reporting, along with several retrospective assessments of the impact of these efforts. These studies show, first, that instructions to authors and statistical checklists are not sufficient; no findings suggested that either improves the quality of statistical methods and reporting. Second, even basic statistics, such as power analyses, are frequently missing or incorrectly performed. Third, statistical review is needed for all papers that involve data analysis. A consistent finding in the studies was that nonstatistical reviewers (eg, "scientific reviewers") and journal editors generally assess statistical quality poorly. We finish by discussing our experience with statistical review at Anesthesia & Analgesia from 2006 to 2016.

  10. Detector noise statistics in the non-linear regime

    NASA Technical Reports Server (NTRS)

    Shopbell, P. L.; Bland-Hawthorn, J.

    1992-01-01

    The statistical behavior of an idealized linear detector in the presence of threshold and saturation levels is examined. It is assumed that the noise is governed by the statistical fluctuations in the number of photons emitted by the source during an exposure. Since physical detectors cannot have infinite dynamic range, our model illustrates that all devices have non-linear regimes, particularly at high count rates. The primary effect is a decrease in the statistical variance about the mean signal due to a portion of the expected noise distribution being removed via clipping. Higher order statistical moments are also examined, in particular, skewness and kurtosis. In principle, the expected distortion in the detector noise characteristics can be calibrated using flatfield observations with count rates matched to the observations. For this purpose, some basic statistical methods that utilize Fourier analysis techniques are described.
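The variance suppression described in this abstract is easy to reproduce numerically. The sketch below is a hedged illustration, not the authors' calibration procedure; the photon rate and saturation level are hypothetical. It draws Poisson-distributed photon counts, clips them at a saturation threshold, and shows that the recorded variance drops below the Poisson value:

```python
import math
import random
import statistics

def poisson_sample(lam, rng):
    """Draw one Poisson variate via Knuth's product method (fine for modest lam)."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

rng = random.Random(42)
lam = 100.0        # hypothetical mean photon count per pixel
saturation = 110   # hypothetical full-well (saturation) level

raw = [poisson_sample(lam, rng) for _ in range(5000)]
clipped = [min(c, saturation) for c in raw]

# Clipping removes the upper tail of the noise distribution, so the
# variance of the recorded signal falls below the Poisson value (= lam).
var_raw = statistics.pvariance(raw)
var_clipped = statistics.pvariance(clipped)
```

With the mean at 100 counts and saturation one standard deviation above it, roughly the top sixth of the distribution is clipped, so `var_clipped` comes out noticeably below `var_raw` (which sits near the Poisson variance, equal to the mean).
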

  11. Characterization and recognition of mixed emotional expressions in thermal face image

    NASA Astrophysics Data System (ADS)

    Saha, Priya; Bhattacharjee, Debotosh; De, Barin K.; Nasipuri, Mita

    2016-05-01

    Facial expressions in infrared imaging have been introduced to solve the problem of illumination, which is an integral constituent of visual imagery. This paper investigates facial skin temperature distribution over mixed thermal facial expressions in our created face database, where six are basic expressions and the remaining 12 are mixtures of those basic expressions. Temperature analysis has been performed on three facial regions of interest (ROIs): periorbital, supraorbital and mouth. Temperature variability of the ROIs across different expressions has been measured using statistical parameters. The temperature variation measurements in the ROIs of a particular expression form a vector, which is later used in recognition of mixed facial expressions. Investigations show that facial features in mixed facial expressions can be characterized by positive-emotion-induced and negative-emotion-induced facial features. The supraorbital region is useful for differentiating basic expressions from mixed expressions. Analysis and interpretation of mixed expressions have been conducted with the help of box-and-whisker plots. A facial region containing a mixture of two expressions generally induces less temperature variation than the corresponding facial region in a basic expression.

  12. 10 CFR 431.445 - Determination of small electric motor efficiency.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... determined either by testing in accordance with § 431.444 of this subpart, or by application of an... method. An AEDM applied to a basic model must be: (i) Derived from a mathematical model that represents... statistical analysis, computer simulation or modeling, or other analytic evaluation of performance data. (3...

  13. A data storage, retrieval and analysis system for endocrine research. [for Skylab

    NASA Technical Reports Server (NTRS)

    Newton, L. E.; Johnston, D. A.

    1975-01-01

    This retrieval system builds, updates, retrieves, and performs basic statistical analyses on blood, urine, and diet parameters for the M071 and M073 Skylab and Apollo experiments. This system permits data entry from cards to build an indexed sequential file. Programs are easily modified for specialized analyses.

  14. Taxometric Analysis as a General Strategy for Distinguishing Categorical from Dimensional Latent Structure

    ERIC Educational Resources Information Center

    McGrath, Robert E.; Walters, Glenn D.

    2012-01-01

    Statistical analyses investigating latent structure can be divided into those that estimate structural model parameters and those that detect the structural model type. The most basic distinction among structure types is between categorical (discrete) and dimensional (continuous) models. It is a common, and potentially misleading, practice to…

  15. Cognition, comprehension and application of biostatistics in research by Indian postgraduate students in periodontics.

    PubMed

    Swetha, Jonnalagadda Laxmi; Arpita, Ramisetti; Srikanth, Chintalapani; Nutalapati, Rajasekhar

    2014-01-01

    Biostatistics is an integral part of research protocols. In any field of inquiry or investigation, the data obtained are subsequently classified, analyzed and tested for accuracy by statistical methods. Statistical analysis of collected data thus forms the basis for all evidence-based conclusions. The aim of this study is to evaluate the cognition, comprehension and application of biostatistics in research among postgraduate students in periodontics in India. A total of 391 postgraduate students registered for a master's course in periodontics at various dental colleges across India were included in the survey. Data regarding the level of knowledge, understanding and its application in the design and conduct of research protocols were collected using a dichotomous questionnaire. Descriptive statistics were used for data analysis. Overall, 79.2% of students were aware of the importance of biostatistics in research, 55-65% were familiar with MS-EXCEL spreadsheets for graphical representation of data and with the statistical software available on the internet, 26.0% had biostatistics as a mandatory subject in their curriculum, 9.5% tried to perform statistical analysis on their own, and 3.0% were successful in performing the statistical analysis of their studies on their own. Biostatistics should play a central role in the planning, conduct, interim analysis, final analysis and reporting of periodontal research, especially by postgraduate students. Indian postgraduate students in periodontics are aware of the importance of biostatistics in research, but their level of understanding and application is still basic and needs to be addressed.

  16. Are emergency medical technician-basics able to use a selective immobilization of the cervical spine protocol?: a preliminary report.

    PubMed

    Dunn, Thomas M; Dalton, Alice; Dorfman, Todd; Dunn, William W

    2004-01-01

    To be a first step in determining whether emergency medical technician (EMT)-Basics are capable of using a protocol that allows for selective immobilization of the cervical spine. Such protocols are coming into use at the advanced life support level and could be beneficial when used by basic life support providers. A convenience sample of participants (n=95) from 11 emergency medical services agencies and one college class participated in the study. All participants evaluated six patients in written scenarios and decided which should be placed into spinal precautions according to a selective spinal immobilization protocol. Systems without an existing selective spinal immobilization protocol received a one-hour continuing education lecture on the topic. College students received a similar lecture written so that laypersons could understand the protocol. All participants showed proficiency when applying a selective immobilization protocol to patients in paper-based scenarios. Furthermore, EMT-Basics performed at the same level as paramedics when following the protocol: statistical analysis revealed no significant differences between EMT-Basics and paramedics. A follow-up group of college students (added to provide a non-EMS comparison group) also performed as well as paramedics when making decisions to use spinal precautions; differences between college students and paramedics were also not statistically significant. The results suggest that EMT-Basics are as accurate as paramedics when making decisions regarding selective immobilization of the cervical spine in paper-based scenarios. That laypersons are also proficient when using the protocol could indicate that it is extremely simple to follow. This study is a first step toward the additional studies needed to evaluate the efficacy of EMT-Basics using selective immobilization as a regular practice.

  17. Student failures on first-year medical basic science courses and the USMLE step 1: a retrospective study over a 20-year period.

    PubMed

    Burns, E Robert; Garrett, Judy

    2015-01-01

    Correlates of achievement in the basic science years in medical school and on the Step 1 of the United States Medical Licensing Examination® (USMLE®), (Step 1) in relation to preadmission variables have been the subject of considerable study. Preadmissions variables such as the undergraduate grade point average (uGPA) and Medical College Admission Test® (MCAT®) scores, solely or in combination, have previously been found to be predictors of achievement in the basic science years and/or on the Step 1. The purposes of this retrospective study were to: (1) determine if our statistical analysis confirmed previously published relationships between preadmission variables (MCAT, uGPA, and applicant pool size), and (2) study correlates of the number of failures in five M1 courses with those preadmission variables and failures on Step 1. Statistical analysis confirmed previously published relationships between all preadmission variables. Only one course, Microscopic Anatomy, demonstrated significant correlations with all variables studied including the Step 1 failures. Physiology correlated with three of the four variables studied, but not with the Step 1 failures. Analyses such as these provide a tool by which administrators will be able to identify what courses are or are not responding in appropriate ways to changes in the preadmissions variables that signal student performance on the Step 1. © 2014 American Association of Anatomists.

  18. Interpretation of statistical results.

    PubMed

    García Garmendia, J L; Maroto Monserrat, F

    2018-02-21

    The appropriate interpretation of statistical results is crucial to understanding advances in medical science. Statistical tools allow us to transform the uncertainty and apparent chaos of nature into measurable parameters that are applicable to our clinical practice. Understanding the meaning and actual scope of these instruments is essential for researchers, for the funders of research, and for professionals who require continual updating based on good evidence to support decision making. Various aspects of study designs, results and statistical analysis are reviewed, aiming to facilitate their comprehension from the basics through to concepts that are common but poorly understood, and offering a constructive, non-exhaustive but realistic view. Copyright © 2018 Elsevier España, S.L.U. y SEMICYUC. All rights reserved.

  19. Using basic statistics on the individual patient's own numeric data.

    PubMed

    Hart, John

    2012-12-01

    This theoretical report gives an example of how the coefficient of variation (CV) and quartile analysis (QA) for assessing outliers might be used to analyze numeric data for an individual patient in practice. A patient was examined over 8 visits using infrared instrumentation to measure mastoid fossa temperature differential (MFTD) readings. CV and QA were applied to the readings. The participant also completed the Short Form-12 health perception survey on each visit, and these findings were correlated with CV to determine whether CV had outcomes support (clinical significance). An outlier MFTD reading was observed on the eighth visit according to QA, coinciding with the largest CV value for the MFTDs. Correlations between the Short Form-12 and CV were low to negligible, positive, and statistically nonsignificant. This case provides an example of how basic statistical analyses could be applied to numerical data for an individual patient in chiropractic practice. This might add objectivity to analyzing an individual patient's data, particularly when the clinical significance of a numerical finding is unknown.
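The two statistics named in this abstract are simple to compute. The sketch below calculates the CV and Tukey-fence quartile outliers on invented MFTD-style readings (the values are hypothetical, not the report's data):

```python
import statistics

def coefficient_of_variation(xs):
    """CV = sample standard deviation / mean (a unitless spread measure)."""
    return statistics.stdev(xs) / statistics.fmean(xs)

def iqr_outliers(xs, k=1.5):
    """Flag values outside [Q1 - k*IQR, Q3 + k*IQR] (Tukey's fences)."""
    q1, _, q3 = statistics.quantiles(xs, n=4)
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [x for x in xs if x < lo or x > hi]

# Hypothetical MFTD readings (degrees C) over eight visits.
readings = [0.3, 0.4, 0.2, 0.35, 0.3, 0.25, 0.4, 1.2]
cv = coefficient_of_variation(readings)
outliers = iqr_outliers(readings)  # the 8th reading falls above the upper fence
```

On these invented readings the eighth value is flagged as an outlier, mirroring the eighth-visit outlier the report describes.
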

  20. Mapping Quantitative Traits in Unselected Families: Algorithms and Examples

    PubMed Central

    Dupuis, Josée; Shi, Jianxin; Manning, Alisa K.; Benjamin, Emelia J.; Meigs, James B.; Cupples, L. Adrienne; Siegmund, David

    2009-01-01

    Linkage analysis has been widely used to identify from family data genetic variants influencing quantitative traits. Common approaches have both strengths and limitations. Likelihood ratio tests typically computed in variance component analysis can accommodate large families but are highly sensitive to departure from normality assumptions. Regression-based approaches are more robust but their use has primarily been restricted to nuclear families. In this paper, we develop methods for mapping quantitative traits in moderately large pedigrees. Our methods are based on the score statistic which in contrast to the likelihood ratio statistic, can use nonparametric estimators of variability to achieve robustness of the false positive rate against departures from the hypothesized phenotypic model. Because the score statistic is easier to calculate than the likelihood ratio statistic, our basic mapping methods utilize relatively simple computer code that performs statistical analysis on output from any program that computes estimates of identity-by-descent. This simplicity also permits development and evaluation of methods to deal with multivariate and ordinal phenotypes, and with gene-gene and gene-environment interaction. We demonstrate our methods on simulated data and on fasting insulin, a quantitative trait measured in the Framingham Heart Study. PMID:19278016

  1. [Nurse supervision in health basic units].

    PubMed

    Correia, Valesca Silveira; Servo, Maria Lúcia Silva

    2006-01-01

    This qualitative study evaluates the pattern of nurse supervision in basic health units in the city of Feira de Santana (Bahia, Brazil) between August 2001 and June 2002. The objective was to describe supervision and whether the nurses practice it systematically. A questionnaire was used to collect information from a group of sixteen (16) nurses in active professional practice. Descriptive statistical procedures were used for data analysis. It can be concluded that systematic supervision is practiced by 64% of the nurses, while in 36% of cases it does not occur.

  2. Photon Limited Images and Their Restoration

    DTIC Science & Technology

    1976-03-01

    arises from noise inherent in the detected image data. In the first part of this report a model is developed which can be used to mathematically and statistically describe an image detected at low light levels. This model serves to clarify some basic properties of photon noise, and provides a basis for the analysis of image restoration. In the second part the problem of linear least-square restoration of imagery limited by photon noise is

  3. Statistical analysis and interpolation of compositional data in materials science.

    PubMed

    Pesenson, Misha Z; Suram, Santosh K; Gregoire, John M

    2015-02-09

    Compositional data are ubiquitous in chemistry and materials science: analysis of elements in multicomponent systems, combinatorial problems, etc., lead to data that are non-negative and sum to a constant (for example, atomic concentrations). The constant sum constraint restricts the sampling space to a simplex instead of the usual Euclidean space. Since statistical measures such as mean and standard deviation are defined for the Euclidean space, traditional correlation studies, multivariate analysis, and hypothesis testing may lead to erroneous dependencies and incorrect inferences when applied to compositional data. Furthermore, composition measurements that are used for data analytics may not include all of the elements contained in the material; that is, the measurements may be subcompositions of a higher-dimensional parent composition. Physically meaningful statistical analysis must yield results that are invariant under the number of composition elements, requiring the application of specialized statistical tools. We present specifics and subtleties of compositional data processing through discussion of illustrative examples. We introduce basic concepts, terminology, and methods required for the analysis of compositional data and utilize them for the spatial interpolation of composition in a sputtered thin film. The results demonstrate the importance of this mathematical framework for compositional data analysis (CDA) in the fields of materials science and chemistry.
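A standard tool from the compositional-data framework described above is the centered log-ratio (clr) transform, which maps compositions off the simplex so that Euclidean statistics become meaningful. A minimal sketch (the three-part composition is hypothetical, not the paper's thin-film data):

```python
import math

def clr(composition):
    """Centered log-ratio transform: maps a composition (positive parts
    summing to a constant) from the simplex into Euclidean space, where
    ordinary statistics (mean, covariance, interpolation) are valid."""
    # Geometric mean of the parts.
    g = math.exp(sum(math.log(x) for x in composition) / len(composition))
    return [math.log(x / g) for x in composition]

# Hypothetical atomic concentrations of a three-element material (sum to 1).
sample = [0.2, 0.3, 0.5]
z = clr(sample)
# clr coordinates always sum to zero, one reflection of the constant-sum
# constraint that naive Euclidean statistics would ignore.
```

Averaging or interpolating in clr coordinates and mapping back (the inverse is a normalized exponential) keeps results on the simplex, which is the kind of invariance the abstract argues for.
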

  4. Generation of future potential scenarios in an Alpine Catchment by applying bias-correction techniques, delta-change approaches and stochastic Weather Generators at different spatial scale. Analysis of their influence on basic and drought statistics.

    NASA Astrophysics Data System (ADS)

    Collados-Lara, Antonio-Juan; Pulido-Velazquez, David; Pardo-Iguzquiza, Eulogio

    2017-04-01

    Assessing impacts of potential future climate change scenarios in precipitation and temperature is essential to design adaptive strategies in water resources systems. The objective of this work is to analyze the possibilities of different statistical downscaling methods to generate future potential scenarios in an Alpine catchment from historical data and the available climate model simulations performed in the frame of the EU CORDEX project. The initial information employed to define these downscaling approaches comprises the historical climatic data (taken from the Spain02 project for the period 1971-2000, with a spatial resolution of 12.5 km) and the future series provided by climate models for the horizon period 2071-2100. We have used information coming from nine climate model simulations (obtained from five different Regional Climate Models (RCMs) nested within four different Global Climate Models (GCMs)) from the European CORDEX project. In our application we have focused on the Representative Concentration Pathway (RCP) 8.5 emissions scenario, which is the most unfavorable scenario considered in the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change (IPCC). For each RCM we have generated future climate series for the period 2071-2100 by applying two different approaches, bias correction and delta change, and five different transformation techniques (first moment correction, first and second moment correction, regression functions, quantile mapping using distribution-derived transformation and quantile mapping using empirical quantiles) for both of them. Ensembles of the obtained series were proposed to obtain more representative potential future climate scenarios to be employed to study potential impacts. 
In this work we propose a non-equifeasible combination of the future series, giving more weight to those coming from models (delta-change approaches), or from combinations of models and techniques, that better approximate the basic and drought statistics of the historical data. A multi-objective analysis using basic statistics (mean, standard deviation and asymmetry coefficient) and drought statistics (duration, magnitude and intensity) has been performed to identify which models are better in terms of goodness of fit to the historical series. The drought statistics have been obtained from the Standardized Precipitation Index (SPI) series using the theory of runs. This analysis allows us to discriminate the best RCM and the best combination of model and correction technique in the bias-correction method. We have also analyzed the possibilities of using different stochastic weather generators to approximate the basic and drought statistics of the historical series. These analyses have been performed in our case study in both a lumped and a distributed way in order to assess their sensitivity to the spatial scale. The statistics of the future temperature series obtained with the different ensemble options are quite homogeneous, but precipitation shows a higher sensitivity to the adopted method and spatial scale. The global increments in the mean temperature values are 31.79 %, 31.79 %, 31.03 % and 31.74 % for the distributed bias-correction, distributed delta-change, lumped bias-correction and lumped delta-change ensembles, respectively, and for precipitation they are -25.48 %, -28.49 %, -26.42 % and -27.35 %, respectively. Acknowledgments: This research work has been partially supported by the GESINHIMPADAPT project (CGL2013-48424-C2-2-R) with Spanish MINECO funds. We would also like to thank the Spain02 and CORDEX projects for the data provided for this study and the R package qmap.
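The delta-change approach and empirical quantile mapping mentioned above can be sketched roughly as follows (hypothetical toy series; this is a schematic of the general techniques, not the authors' implementation or the qmap package):

```python
import bisect

def delta_change_additive(obs, ctrl_mean, fut_mean):
    """Delta change for temperature: shift the observed series by the
    model-projected change in the mean (first-moment correction)."""
    delta = fut_mean - ctrl_mean
    return [x + delta for x in obs]

def delta_change_multiplicative(obs, ctrl_mean, fut_mean):
    """Delta change for precipitation: scale by the relative change in the mean."""
    factor = fut_mean / ctrl_mean
    return [x * factor for x in obs]

def quantile_map(value, model_sorted, obs_sorted):
    """Empirical quantile mapping: replace a model value with the observed
    value at the same empirical quantile (both series pre-sorted)."""
    rank = bisect.bisect_left(model_sorted, value)
    return obs_sorted[min(rank, len(obs_sorted) - 1)]

# Hypothetical monthly series
obs_temp = [1.0, 2.0, 3.0]
print(delta_change_additive(obs_temp, ctrl_mean=2.0, fut_mean=4.5))  # shifted by +2.5

model = sorted([0.0, 1.0, 4.0, 9.0])      # control-run values
observed = sorted([0.0, 2.0, 5.0, 8.0])   # observed values
print(quantile_map(4.0, model, observed))  # same rank in the observed series
```

Delta change perturbs the observed series with the modeled change signal, whereas bias correction (e.g. quantile mapping) corrects the modeled future series toward the observed distribution; the paper ensembles both families.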

  5. Linnorm: improved statistical analysis for single cell RNA-seq expression data.

    PubMed

    Yip, Shun H; Wang, Panwen; Kocher, Jean-Pierre A; Sham, Pak Chung; Wang, Junwen

    2017-12-15

    Linnorm is a novel normalization and transformation method for the analysis of single cell RNA sequencing (scRNA-seq) data. Linnorm is designed to remove technical noise while preserving biological variation in scRNA-seq data, so that existing statistical methods can be improved. Using real scRNA-seq data, we compared Linnorm with existing normalization methods, including NODES, SAMstrt, SCnorm, scran, DESeq and TMM. Linnorm shows advantages in speed, technical noise removal and preservation of cell heterogeneity, which can improve existing methods in the discovery of novel subtypes, pseudo-temporal ordering of cells, clustering analysis, etc. Linnorm also performs better than existing DEG analysis methods, including BASiCS, NODES, SAMstrt, Seurat and DESeq2, in false positive rate control and accuracy. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  6. Modeling and replicating statistical topology and evidence for CMB nonhomogeneity

    PubMed Central

    Agami, Sarit

    2017-01-01

    Under the banner of “big data,” the detection and classification of structure in extremely large, high-dimensional, data sets are two of the central statistical challenges of our times. Among the most intriguing new approaches to this challenge is “TDA,” or “topological data analysis,” one of the primary aims of which is providing nonmetric, but topologically informative, preanalyses of data which make later, more quantitative, analyses feasible. While TDA rests on strong mathematical foundations from topology, in applications, it has faced challenges due to difficulties in handling issues of statistical reliability and robustness, often leading to an inability to make scientific claims with verifiable levels of statistical confidence. We propose a methodology for the parametric representation, estimation, and replication of persistence diagrams, the main diagnostic tool of TDA. The power of the methodology lies in the fact that even if only one persistence diagram is available for analysis—the typical case for big data applications—the replications permit conventional statistical hypothesis testing. The methodology is conceptually simple and computationally practical, and provides a broadly effective statistical framework for persistence diagram TDA analysis. We demonstrate the basic ideas on a toy example, and the power of the parametric approach to TDA modeling in an analysis of cosmic microwave background (CMB) nonhomogeneity. PMID:29078301

  7. MATHEMATICS PANEL PROGRESS REPORT FOR PERIOD MARCH 1, 1957 TO AUGUST 31, 1958

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Householder, A.S.

    1959-03-24

    ORACLE operation and programming are summarized, and progress is indicated on various current problems. Work is reviewed on numerical analysis, programming, basic mathematics, biometrics and statistics, ORACLE operations and special codes, and training. Publications and lectures for the report period are listed. (For preceding period see ORNL-2283.) (W.D.M.)

  8. Survey of basic medical researchers on the awareness of animal experimental designs and reporting standards in China.

    PubMed

    Ma, Bin; Xu, Jia-Ke; Wu, Wen-Jing; Liu, Hong-Yan; Kou, Cheng-Kun; Liu, Na; Zhao, Lulu

    2017-01-01

    To investigate the awareness and use of the Systematic Review Center for Laboratory Animal Experimentation's (SYRCLE) risk-of-bias tool, the Animal Research: Reporting of In Vivo Experiments (ARRIVE) reporting guidelines, and the Gold Standard Publication Checklist (GSPC) among basic medical researchers conducting animal experimental studies in China. A national questionnaire-based survey targeting basic medical researchers was carried out in China to collect basic information and assess awareness of SYRCLE's risk-of-bias tool, the ARRIVE guidelines, the GSPC, and animal experimental bias risk control factors. The EpiData 3.1 software was used for data entry, and Microsoft Excel 2013 was used for statistical analysis in this study. The number of cases (n) and percentage (%) of classified information were statistically described, and comparisons between groups (i.e., current students vs. research staff) were performed using the chi-square test. A total of 298 questionnaires were distributed, and 272 responses were received, which included 266 valid questionnaires (from 118 current students and 148 research staff). Among the 266 survey participants, only 15.8% were aware of SYRCLE's risk-of-bias tool, with a significant difference between the two groups (P = 0.003), and the awareness rates of the ARRIVE guidelines and the GSPC were only 9.4% and 9.0%, respectively; 58.6% of survey participants believed that the reporting of animal experimental studies in the Chinese literature was inadequate, with a significant difference between the two groups (P = 0.004). In addition, only approximately 1/3 of the survey participants had read systematic reviews and meta-analysis reports of animal experimental studies; only 16/266 (6.0%) had carried out/participated in and 11/266 (4.1%) had published systematic reviews/meta-analyses of animal experimental studies. The awareness and use rates of SYRCLE's risk-of-bias tool, the ARRIVE guidelines, and the GSPC were low among Chinese basic medical researchers. 
Therefore, specific measures are necessary to promote and popularize these standards and specifications and to introduce these standards into guidelines of Chinese domestic journals as soon as possible to raise awareness and increase use rates of researchers and journal editors, thereby improving the quality of animal experimental methods and reports.

  9. Probability of identification: a statistical model for the validation of qualitative botanical identification methods.

    PubMed

    LaBudde, Robert A; Harnly, James M

    2012-01-01

    A qualitative botanical identification method (BIM) is an analytical procedure that returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer, or regulator to determine whether a botanical material being tested is the same as the target (desired) material, or whether it contains excessive nontarget (undesirable) material. The report describes the development and validation of studies for a BIM based on the proportion of replicates identified, or probability of identification (POI), as the basic observed statistic. The statistical procedures proposed for data analysis follow closely those of the probability of detection, and harmonize the statistical concepts and parameters between quantitative and qualitative method validation. Use of POI statistics also harmonizes statistical concepts for botanical, microbiological, toxin, and other analyte identification methods that produce binary results. The POI statistical model provides a tool for graphical representation of response curves for qualitative methods, reporting of descriptive statistics, and application of performance requirements. Single collaborator and multicollaborative study examples are given.
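The basic observed statistic, POI = (replicates identified) / (total replicates), together with a confidence interval, might be computed as follows (a sketch using a Wilson score interval and made-up counts; the function name is our own, not from the AOAC procedure):

```python
import math

def poi_with_wilson_ci(identified, replicates, z=1.96):
    """Probability of identification (POI) = proportion of replicates that
    return 'Identified', with a Wilson score ~95% confidence interval."""
    p = identified / replicates
    n = replicates
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return p, (center - half, center + half)

# Hypothetical validation run: 9 of 10 replicates of the target material identified
p, (lo, hi) = poi_with_wilson_ci(9, 10)
print(p, lo, hi)
```

A POI curve is then the set of such estimates across test materials or concentrations, which is what the graphical representation in the paper plots.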

  10. A brief introduction to probability.

    PubMed

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in 1600, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the most relevant distribution in statistical analysis.
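Probabilities under the normal distribution can be computed directly from the error function; a minimal sketch (standard library only, illustrative of the distribution the paper focuses on):

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    """Cumulative probability P(X <= x) for a normal distribution,
    expressed via the error function."""
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

# Probability that a standard normal variable falls within 1.96 sd of the mean
p = normal_cdf(1.96) - normal_cdf(-1.96)
print(round(p, 3))  # ~0.95, the familiar 95% interval
```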

  11. Current genetic methodologies in the identification of disaster victims and in forensic analysis.

    PubMed

    Ziętkiewicz, Ewa; Witt, Magdalena; Daca, Patrycja; Zebracka-Gala, Jadwiga; Goniewicz, Mariusz; Jarząb, Barbara; Witt, Michał

    2012-02-01

    This review presents the basic problems and currently available molecular techniques used for genetic profiling in disaster victim identification (DVI). The environmental conditions of a mass disaster often result in severe fragmentation, decomposition and intermixing of the remains of victims. In such cases, traditional identification based on the anthropological and physical characteristics of the victims is frequently inconclusive. This is the reason why DNA profiling became the gold standard for victim identification in mass-casualty incidents (MCIs) or any forensic cases where human remains are highly fragmented and/or degraded beyond recognition. The review provides general information about the sources of genetic material for DNA profiling, the genetic markers routinely used during genetic profiling (STR markers, mtDNA and single-nucleotide polymorphisms [SNP]) and the basic statistical approaches used in DNA-based disaster victim identification. Automated technological platforms that allow the simultaneous analysis of a multitude of genetic markers used in genetic identification (oligonucleotide microarray techniques and next-generation sequencing) are also presented. Forensic and population databases containing information on human variability, routinely used for statistical analyses, are discussed. The final part of this review is focused on recent developments, which offer particularly promising tools for forensic applications (mRNA analysis, transcriptome variation in individuals/populations and genetic profiling of specific cells separated from mixtures).

  12. Interpretation of the results of statistical measurements. [search for basic probability model

    NASA Technical Reports Server (NTRS)

    Olshevskiy, V. V.

    1973-01-01

    For random processes, the calculated probability characteristic and the measured statistical estimate are used in a quality functional, which defines the difference between the two functions. Based on the assumption that the statistical measurement procedure is organized so that the parameters of a selected model are optimized, it is shown that the interpretation of experimental research is a search for a basic probability model.

  13. Principal component regression analysis with SPSS.

    PubMed

    Liu, R X; Kuang, J; Gong, Q; Hou, X L

    2003-06-01

    The paper introduces the indices used for multicollinearity diagnosis, the basic principle of principal component regression, and the method for determining the 'best' equation. An example illustrates how to perform principal component regression analysis with SPSS 10.0, covering all calculation steps of the principal component regression and the operation of the linear regression, factor analysis, descriptives, compute variable and bivariate correlations procedures in SPSS 10.0. Principal component regression analysis can be used to overcome the disturbance of multicollinearity; with SPSS it yields a simplified, faster and accurate statistical analysis.
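The core idea, regressing on principal component scores instead of the collinear predictors themselves, can be sketched without SPSS for the two-predictor case (hypothetical data; a schematic illustration of the technique, not the paper's SPSS procedure):

```python
import math

def standardize(col):
    """Center and scale a variable to mean 0, sd 1."""
    n = len(col)
    mu = sum(col) / n
    sd = math.sqrt(sum((v - mu) ** 2 for v in col) / (n - 1))
    return [(v - mu) / sd for v in col]

def pcr_one_component(x1, x2, y):
    """Principal component regression with two collinear predictors:
    regress y on the score of the first principal component only."""
    z1, z2 = standardize(x1), standardize(x2)
    n = len(y)
    r = sum(a * b for a, b in zip(z1, z2)) / (n - 1)  # predictor correlation
    # The 2x2 correlation matrix [[1, r], [r, 1]] has eigenvalues 1 +/- |r|;
    # the leading eigenvector is (1, sign(r)) / sqrt(2).
    lam = 1 + abs(r)
    w = (1 / math.sqrt(2), math.copysign(1 / math.sqrt(2), r))
    scores = [w[0] * a + w[1] * b for a, b in zip(z1, z2)]
    # Ordinary least squares of centered y on the component score
    ybar = sum(y) / n
    yc = [yi - ybar for yi in y]
    beta = sum(s * t for s, t in zip(scores, yc)) / sum(s * s for s in scores)
    return beta, lam

# Hypothetical, nearly collinear predictors
x1 = [1, 2, 3, 4, 5]
x2 = [1.1, 1.9, 3.2, 3.9, 5.1]
y = [2.0, 4.1, 6.2, 7.9, 10.1]
beta, lam = pcr_one_component(x1, x2, y)
print(beta, lam)
```

Because the near-collinear pair collapses onto one well-conditioned component (leading eigenvalue close to 2), the slope estimate is stable where a two-variable OLS fit would have inflated variance.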

  14. Receiver operating characteristic (ROC) curves: review of methods with applications in diagnostic medicine

    NASA Astrophysics Data System (ADS)

    Obuchowski, Nancy A.; Bullen, Jennifer A.

    2018-04-01

    Receiver operating characteristic (ROC) analysis is a tool used to describe the discrimination accuracy of a diagnostic test or prediction model. While sensitivity and specificity are the basic metrics of accuracy, they have many limitations when characterizing test accuracy, particularly when comparing the accuracies of competing tests. In this article we review the basic study design features of ROC studies, illustrate sample size calculations, present statistical methods for measuring and comparing accuracy, and highlight commonly used ROC software. We include descriptions of multi-reader ROC study design and analysis, address frequently seen problems of verification and location bias, discuss clustered data, and provide strategies for testing endpoints in ROC studies. The methods are illustrated with a study of transmission ultrasound for diagnosing breast lesions.
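The basic ROC quantities can be computed from raw scores in a few lines (hypothetical scores; a sketch of the standard empirical estimators, not the ROC software reviewed in the article):

```python
def auc(scores_pos, scores_neg):
    """Empirical AUC = probability a positive case scores higher than a
    negative one (Mann-Whitney U / (n_pos * n_neg)), ties counted half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

def roc_points(scores_pos, scores_neg):
    """(1 - specificity, sensitivity) pairs at every distinct threshold."""
    thresholds = sorted(set(scores_pos + scores_neg), reverse=True)
    pts = [(0.0, 0.0)]
    for t in thresholds:
        tpr = sum(1 for s in scores_pos if s >= t) / len(scores_pos)
        fpr = sum(1 for s in scores_neg if s >= t) / len(scores_neg)
        pts.append((fpr, tpr))
    return pts

pos = [0.9, 0.8, 0.7, 0.6]   # diseased cases (hypothetical test scores)
neg = [0.5, 0.4, 0.8, 0.3]   # healthy cases
print(auc(pos, neg))
```

Sweeping the threshold traces the whole curve, which is why ROC analysis avoids the single-operating-point limitation of a lone sensitivity/specificity pair.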

  15. [How reliable is the monitoring for doping?].

    PubMed

    Hüsler, J

    1990-12-01

    The reliability of doping control, i.e. of the chemical analysis of urine samples in the accredited laboratories and their decisions, is discussed using probabilistic and statistical methods. Basically, we evaluated and estimated the positive predictive value, which is the probability that a urine sample contains prohibited doping substances given a positive test decision. Since statistical data and evidence are lacking for some important quantities related to the predictive value, an exact evaluation is not possible; only conservative lower bounds can be given. We found that the predictive value is at least 90% or 95% with respect to the analysis and decision based on the A-sample only, and at least 99% with respect to both A- and B-samples. A more realistic observation, but without sufficient statistical confidence, suggests that the true predictive value is significantly larger than these lower estimates.
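The positive predictive value discussed here follows from Bayes' rule; a minimal sketch with purely illustrative sensitivity, specificity and prevalence figures (not the paper's actual inputs):

```python
def positive_predictive_value(sensitivity, specificity, prevalence):
    """P(prohibited substance present | positive test) by Bayes' rule."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# Illustrative figures: a highly specific assay, low prevalence of doping
ppv = positive_predictive_value(0.99, 0.999, 0.05)
print(round(ppv, 3))
```

The example shows why specificity dominates: even at 5% prevalence, a false-positive rate of 0.1% keeps the predictive value high, while a laxer specificity would drag it down sharply.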

  16. Cognition, comprehension and application of biostatistics in research by Indian postgraduate students in periodontics

    PubMed Central

    Swetha, Jonnalagadda Laxmi; Arpita, Ramisetti; Srikanth, Chintalapani; Nutalapati, Rajasekhar

    2014-01-01

    Background: Biostatistics is an integral part of research protocols. In any field of inquiry or investigation, the data obtained are subsequently classified, analyzed and tested for accuracy by statistical methods. Statistical analysis of collected data thus forms the basis for all evidence-based conclusions. Aim: The aim of this study is to evaluate the cognition, comprehension and application of biostatistics in research among postgraduate students in periodontics in India. Materials and Methods: A total of 391 postgraduate students registered for a master's course in periodontics at various dental colleges across India were included in the survey. Data regarding the level of knowledge, understanding and its application in the design and conduct of research protocols were collected using a dichotomous questionnaire. Descriptive statistics were used for data analysis. Results: Nearly 79.2% of students were aware of the importance of biostatistics in research, 55-65% were familiar with the MS-EXCEL spreadsheet for graphical representation of data and with the statistical software available on the internet, 26.0% had biostatistics as a mandatory subject in their curriculum, 9.5% had tried to perform statistical analysis on their own, while 3.0% were successful in performing the statistical analysis of their studies on their own. Conclusion: Biostatistics should play a central role in the planning, conduct, interim analysis, final analysis and reporting of periodontal research, especially by postgraduate students. Indian postgraduate students in periodontics are aware of the importance of biostatistics in research, but the level of understanding and application is still basic and needs to be addressed. PMID:24744547

  17. An uncertainty analysis of the flood-stage upstream from a bridge.

    PubMed

    Sowiński, M

    2006-01-01

    The paper begins with the formulation of the problem in the form of a general performance function. Next, the Latin hypercube sampling (LHS) technique--a modified version of the Monte Carlo method--is briefly described. The uncertainty analysis of the flood stage upstream from a bridge starts with a description of the hydraulic model. This model concept is based on the HEC-RAS model developed for subcritical flow under a bridge without piers, in which the energy equation is applied. The next section characterizes the basic variables, including a specification of their statistics (means and variances). Next, the problem of correlated variables is discussed and assumptions concerning correlation among the basic variables are formulated. The analysis of results is based on LHS ranking lists obtained from the computer package UNCSAM. Results for two examples are given: one for independent and the other for correlated variables.
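A basic Latin hypercube sample on the unit hypercube can be generated by stratifying each variable's range and shuffling the strata (a generic sketch of the technique, not the UNCSAM implementation):

```python
import random

def latin_hypercube(n_samples, n_vars, seed=42):
    """Latin hypercube sample on [0, 1)^n_vars: each variable's range is
    split into n_samples equal strata, and each stratum is used exactly once."""
    rng = random.Random(seed)
    design = []
    for _ in range(n_vars):
        # one random point inside each stratum, in shuffled order
        column = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(column)
        design.append(column)
    # transpose: rows are sample points
    return list(zip(*design))

pts = latin_hypercube(5, 2)
for p in pts:
    print(p)
```

Compared with plain Monte Carlo, this stratification guarantees coverage of each variable's full range with far fewer model runs, which is why LHS suits expensive hydraulic models.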

  18. Statistical Analysis of Small-Scale Magnetic Flux Emergence Patterns: A Useful Subsurface Diagnostic?

    NASA Astrophysics Data System (ADS)

    Lamb, Derek A.

    2016-10-01

    While sunspots follow a well-defined pattern of emergence in space and time, small-scale flux emergence is assumed to occur randomly at all times in the quiet Sun. HMI's full-disk coverage, high cadence, spatial resolution, and duty cycle allow us to probe that basic assumption. Some case studies of emergence suggest that temporal clustering on spatial scales of 50-150 Mm may occur. If clustering is present, it could serve as a diagnostic of large-scale subsurface magnetic field structures. We present the results of a manual survey of small-scale flux emergence events over a short time period, and a statistical analysis addressing the question of whether these events show spatio-temporal behavior that is anything other than random.

  19. Wavelet analysis of birefringence images of myocardium tissue

    NASA Astrophysics Data System (ADS)

    Sakhnovskiy, M. Yu.; Ushenko, Yu. O.; Kushnerik, L.; Soltys, I. V.; Pavlyukovich, N.; Pavlyukovich, O.

    2018-01-01

    The paper consists of two parts. The first part presents the short theoretical basics of the method of azimuthally invariant Mueller-matrix description of the optical anisotropy of biological tissues. Experimentally measured coordinate distributions of the Mueller-matrix invariants (MMI) of the linear and circular birefringence of skeletal muscle tissue are provided, and the values of the statistical moments characterizing the distributions of the amplitudes of the wavelet coefficients of the MMI at different scanning scales are defined. The second part presents a statistical analysis of the distributions of the amplitudes of the wavelet coefficients of the linear-birefringence distributions of myocardium tissue from subjects who died after infarction and from ischemic heart disease, and defines objective criteria for differentiating the cause of death.

  20. Treated cabin acoustic prediction using statistical energy analysis

    NASA Technical Reports Server (NTRS)

    Yoerkie, Charles A.; Ingraham, Steven T.; Moore, James A.

    1987-01-01

    The application of statistical energy analysis (SEA) to the modeling and design of helicopter cabin interior noise control treatment is demonstrated. The information presented here is obtained from work sponsored at NASA Langley for the development of analytic modeling techniques and the basic understanding of cabin noise. Utility and executive interior models are developed directly from existing S-76 aircraft designs. The relative importance of panel transmission loss (TL), acoustic leakage, and absorption to the control of cabin noise is shown using the SEA modeling parameters. It is shown that the major cabin noise improvement below 1000 Hz comes from increased panel TL, while above 1000 Hz it comes from reduced acoustic leakage and increased absorption in the cabin and overhead cavities.

  1. Applied Problems and Use of Technology in an Aligned Way in Basic Courses in Probability and Statistics for Engineering Students--A Way to Enhance Understanding and Increase Motivation

    ERIC Educational Resources Information Center

    Zetterqvist, Lena

    2017-01-01

    Researchers and teachers often recommend motivating exercises and use of mathematics or statistics software for the teaching of basic courses in probability and statistics. Our courses are given to large groups of engineering students at Lund Institute of Technology. We found that the mere existence of real-life data and technology in a course…

  2. A basic introduction to statistics for the orthopaedic surgeon.

    PubMed

    Bertrand, Catherine; Van Riet, Roger; Verstreken, Frederik; Michielsen, Jef

    2012-02-01

    Orthopaedic surgeons should review the orthopaedic literature in order to keep pace with the latest insights and practices. A good understanding of basic statistical principles is of crucial importance to the ability to read articles critically, to interpret results and to arrive at correct conclusions. This paper explains some of the key concepts in statistics, including hypothesis testing, Type I and Type II errors, testing of normality, sample size and p values.
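The link between the significance level and the Type I error rate described above can be illustrated by simulation: under a true null hypothesis, a test that rejects at p < 0.05 should err about 5% of the time (a generic z-test sketch, not from the paper):

```python
import math
import random

def z_test_p_value(sample, mu0, sigma):
    """Two-sided z-test p-value for H0: population mean == mu0, sigma known."""
    n = len(sample)
    z = (sum(sample) / n - mu0) / (sigma / math.sqrt(n))
    phi = 0.5 * (1 + math.erf(abs(z) / math.sqrt(2)))  # normal CDF at |z|
    return 2 * (1 - phi)

# Simulate many studies where the null is actually true (mean really is 0):
# the fraction rejected at p < 0.05 is the empirical Type I error rate.
rng = random.Random(0)
trials = 2000
rejections = 0
for _ in range(trials):
    sample = [rng.gauss(0, 1) for _ in range(30)]
    if z_test_p_value(sample, mu0=0, sigma=1) < 0.05:
        rejections += 1
print(rejections / trials)  # close to 0.05
```

A Type II error is the mirror image: failing to reject when the true mean differs from mu0, and its rate shrinks as the sample size grows.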

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tessore, Nicolas; Metcalf, R. Benton; Winther, Hans A.

    A number of alternatives to general relativity exhibit gravitational screening in the non-linear regime of structure formation. We describe a set of algorithms that can produce weak lensing maps of large scale structure in such theories and can be used to generate mock surveys for cosmological analysis. By analysing a few basic statistics we indicate how these alternatives can be distinguished from general relativity with future weak lensing surveys.

  4. Characteristics of the 100 Largest Public Elementary and Secondary School Districts in the United States: 2000-01. Statistical Analysis Report.

    ERIC Educational Resources Information Center

    Young, Beth Aronstamm

    This publication provides basic descriptive information about the 100 largest school districts (ranked by student membership) in the United States and jurisdictions (Bureau of Indian Affairs, Department of Defense Schools, American Samoa, Guam, the Northern Marianas, Puerto Rico, and the Virgin Islands). Almost one in every four public-school…

  5. Remedial Coursetaking at U.S. Public 2- and 4-Year Institutions: Scope, Experiences, and Outcomes. Statistical Analysis Report. NCES 2016-405

    ERIC Educational Resources Information Center

    Chen, Xianglei

    2016-01-01

    Every year, millions of new college students arrive on campus lacking the necessary academic skills to perform at the college level. Postsecondary institutions address this problem with extensive remedial programs designed to strengthen students' basic skills. While much research on the effectiveness of remedial education has been conducted,…

  6. Aircraft noise effects: An interdisciplinary study of the effects of aircraft noise on man. Part 1: Basic report

    NASA Technical Reports Server (NTRS)

    1980-01-01

    An area around the Munich-Riem airport was divided into 32 clusters of different noise exposure and subjects were drawn from each cluster for a social survey and for psychological, medical, and physiological testing. Extensive acoustical measurements were also carried out in each cluster. The results were then subjected to detailed statistical analysis.

  7. A preliminary analysis of library holdings as compared to the basic resources for pharmacy education list.

    PubMed

    Vaughan, K T L V; Lerner, Rachel C

    2013-01-01

    The catalogs of 11 university libraries were analyzed against the Basic Resources for Pharmaceutical Education (BRPE) to measure the percent coverage of the core total list as well as the core sublist. There is no clear trend in this data to link school age, size, or rank with percentage of coverage of the total list or the "First Purchase" core list when treated as independent variables. Approximately half of the schools have significantly higher percentages of core titles than statistically expected. Based on this data, it is difficult to predict what percentage of titles on the BRPE a library will contain.

  8. Health Literacy Impact on National Healthcare Utilization and Expenditure.

    PubMed

    Rasu, Rafia S; Bawa, Walter Agbor; Suminski, Richard; Snella, Kathleen; Warady, Bradley

    2015-08-17

    Health literacy presents an enormous challenge in the delivery of effective healthcare and quality outcomes. We evaluated the impact of low health literacy (LHL) on healthcare utilization and healthcare expenditure. The database analysis used the Medical Expenditure Panel Survey (MEPS) from 2005-2008, which provides nationally representative estimates of healthcare utilization and expenditure. Health literacy scores (HLSs) were calculated based on a validated, predictive model and were scored according to the National Assessment of Adult Literacy (NAAL). HLSs ranged from 0-500. Health literacy levels (HLLs) were categorized into 2 groups: below basic or basic (HLS <226) and above basic (HLS ≥226). Healthcare utilization was expressed as physician, nonphysician, or emergency room (ER) visits and healthcare spending. Expenditures were adjusted to 2010 rates using the Consumer Price Index (CPI). A P value of 0.05 or less was the criterion for statistical significance in all analyses. Multivariate regression models assessed the impact of the predicted HLLs on outpatient healthcare utilization and expenditures. All analyses were performed with SAS and STATA® 11.0 statistical software. The study evaluated 22 599 samples representing 503 374 648 weighted individuals nationally from 2005-2008. The cohort had an average age of 49 years and included more females (57%). Caucasians were the predominant racial/ethnic group (83%), and 37% of the cohort were from the South region of the United States of America. The proportion of the cohort with basic or below basic health literacy was 22.4%. Annual predicted values of physician visits, nonphysician visits, and ER visits were 6.6, 4.8, and 0.2, respectively, for basic or below basic compared to 4.4, 2.6, and 0.1 for above basic. Predicted values of office and ER visit expenditures were $1284 and $151, respectively, for basic or below basic and $719 and $100 for above basic (P < .05). 
The extrapolated national estimates show that the annual prescription costs alone for adults with basic or below basic health literacy could potentially reach about $172 billion. Health literacy is inversely associated with healthcare utilization and expenditure. Individuals with below basic or basic HLL have greater healthcare utilization and expenditures, spending more on prescriptions, compared to individuals with above basic HLL. Public health strategies promoting appropriate education among individuals with LHL may help to improve health outcomes and reduce unnecessary healthcare visits and costs. © 2015 by Kerman University of Medical Sciences.

  9. Application of microarray analysis on computer cluster and cloud platforms.

    PubMed

    Bernau, C; Boulesteix, A-L; Knaus, J

    2013-01-01

    Analysis of recent high-dimensional biological data tends to be computationally intensive as many common approaches such as resampling or permutation tests require the basic statistical analysis to be repeated many times. A crucial advantage of these methods is that they can be easily parallelized due to the computational independence of the resampling or permutation iterations, which has induced many statistics departments to establish their own computer clusters. An alternative is to rent computing resources in the cloud, e.g. at Amazon Web Services. In this article we analyze whether a selection of statistical projects, recently implemented at our department, can be efficiently realized on these cloud resources. Moreover, we illustrate an opportunity to combine computer cluster and cloud resources. In order to compare the efficiency of computer cluster and cloud implementations and their respective parallelizations we use microarray analysis procedures and compare their runtimes on the different platforms. Amazon Web Services provide various instance types which meet the particular needs of the different statistical projects we analyzed in this paper. Moreover, the network capacity is sufficient and the parallelization is comparable in efficiency to standard computer cluster implementations. Our results suggest that many statistical projects can be efficiently realized on cloud resources. It is important to mention, however, that workflows can change substantially as a result of a shift from computer cluster to cloud computing.
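The computational independence of resampling iterations that makes these analyses easy to parallelize can be illustrated with a small bootstrap (a thread pool is used here for brevity; process pools, cluster nodes or cloud instances exploit the same independence; data and names are illustrative):

```python
import random
from concurrent.futures import ThreadPoolExecutor

def bootstrap_mean(args):
    """One bootstrap replicate: resample with replacement, return the mean.
    Each replicate depends only on its own seed, so replicates are independent."""
    data, seed = args
    rng = random.Random(seed)
    resample = [rng.choice(data) for _ in data]
    return sum(resample) / len(resample)

data = [2.1, 3.4, 1.8, 2.9, 3.1, 2.5, 2.2, 3.0]
tasks = [(data, seed) for seed in range(200)]

# The 200 replicates never communicate, so they can be farmed out to any
# number of workers with no coordination beyond collecting results.
with ThreadPoolExecutor(max_workers=4) as pool:
    replicates = list(pool.map(bootstrap_mean, tasks))

ordered = sorted(replicates)
lo, hi = ordered[4], ordered[194]   # ~95% percentile interval
print(lo, hi)
```

Swapping `ThreadPoolExecutor` for `ProcessPoolExecutor` (or distributing `tasks` across cloud instances) changes only where the replicates run, not the statistics: this is the property the article exploits when comparing cluster and cloud runtimes.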

  10. Polarization-interference Jones-matrix mapping of biological crystal networks

    NASA Astrophysics Data System (ADS)

    Ushenko, O. G.; Dubolazov, O. V.; Pidkamin, L. Y.; Sidor, M. I.; Pavlyukovich, N.; Pavlyukovich, O.

    2018-01-01

The paper consists of two parts. The first part presents the short theoretical basics of the method of Jones-matrix mapping with a reference wave, together with experimentally measured coordinate distributions of the moduli of the Jones-matrix elements of a polycrystalline bile film; the values and ranges of change of the statistical moments that characterize such distributions are determined. The second part presents a statistical analysis of the distributions of the matrix elements of polycrystalline urine films from donors and from patients with albuminuria, yielding objective criteria for the differentiation of albuminuria.
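    The statistical moments used to characterize such distributions are, in the usual convention, the mean, variance, skewness, and (excess) kurtosis. A minimal sketch of how they can be estimated from a sample follows; it is illustrative only, not the authors' exact estimators.

```python
import numpy as np

def first_four_moments(values):
    """Mean, variance, skewness and excess kurtosis of a 1-D sample --
    the four statistics commonly quoted as 'moments of the 1st-4th orders'."""
    v = np.asarray(values, dtype=float)
    mean = v.mean()
    centered = v - mean
    var = (centered ** 2).mean()            # second central moment
    std = np.sqrt(var)
    skew = (centered ** 3).mean() / std ** 3  # normalised third moment
    kurt = (centered ** 4).mean() / std ** 4 - 3.0  # 0 for a Gaussian
    return mean, var, skew, kurt

# sanity check on a synthetic Gaussian sample: skew and excess kurtosis
# should both be near zero
mean, var, skew, kurt = first_four_moments(
    np.random.default_rng(1).normal(0.0, 2.0, 100_000))
```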

  11. Using Data Mining to Teach Applied Statistics and Correlation

    ERIC Educational Resources Information Center

    Hartnett, Jessica L.

    2016-01-01

    This article describes two class activities that introduce the concept of data mining and very basic data mining analyses. Assessment data suggest that students learned some of the conceptual basics of data mining, understood some of the ethical concerns related to the practice, and were able to perform correlations via the Statistical Package for…

  12. Simple Data Sets for Distinct Basic Summary Statistics

    ERIC Educational Resources Information Center

    Lesser, Lawrence M.

    2011-01-01

    It is important to avoid ambiguity with numbers because unfortunate choices of numbers can inadvertently make it possible for students to form misconceptions or make it difficult for teachers to tell if students obtained the right answer for the right reason. Therefore, it is important to make sure when introducing basic summary statistics that…

  13. Comparing early signs and basic symptoms as methods for predicting psychotic relapse in clinical practice.

    PubMed

    Eisner, Emily; Drake, Richard; Lobban, Fiona; Bucci, Sandra; Emsley, Richard; Barrowclough, Christine

    2018-02-01

    Early signs interventions show promise but could be further developed. A recent review suggested that 'basic symptoms' should be added to conventional early signs to improve relapse prediction. This study builds on preliminary evidence that basic symptoms predict relapse and aimed to: 1. examine which phenomena participants report prior to relapse and how they describe them; 2. determine the best way of identifying pre-relapse basic symptoms; 3. assess current practice by comparing self- and casenote-reported pre-relapse experiences. Participants with non-affective psychosis were recruited from UK mental health services. In-depth interviews (n=23), verbal checklists of basic symptoms (n=23) and casenote extracts (n=208) were analysed using directed content analysis and non-parametric statistical tests. Three-quarters of interviewees reported basic symptoms and all reported conventional early signs and 'other' pre-relapse experiences. Interviewees provided rich descriptions of basic symptoms. Verbal checklist interviews asking specifically about basic symptoms identified these experiences more readily than open questions during in-depth interviews. Only 5% of casenotes recorded basic symptoms; interviewees were 16 times more likely to report basic symptoms than their casenotes did. The majority of interviewees self-reported pre-relapse basic symptoms when asked specifically about these experiences but very few casenotes reported these symptoms. Basic symptoms may be potent predictors of relapse that clinicians miss. A self-report measure would aid monitoring of basic symptoms in routine clinical practice and would facilitate a prospective investigation comparing basic symptoms and conventional early signs as predictors of relapse. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  14. Volcano plots in analyzing differential expressions with mRNA microarrays.

    PubMed

    Li, Wentian

    2012-12-01

A volcano plot displays unstandardized signal (e.g. log-fold-change) against noise-adjusted/standardized signal (e.g. t-statistic or -log10(p-value) from the t-test). We review the basic and interactive use of the volcano plot and its crucial role in understanding the regularized t-statistic. The joint filtering gene selection criterion based on regularized statistics has a curved discriminant line in the volcano plot, as compared to the two perpendicular lines for the "double filtering" criterion. This review attempts to provide a unifying framework for discussions on alternative measures of differential expression, improved methods for estimating variance, and visual display of a microarray analysis result. We also discuss the possibility of applying volcano plots to other fields beyond microarrays.

  15. Variations in intensity statistics for representational and abstract art, and for art from the Eastern and Western hemispheres.

    PubMed

    Graham, Daniel J; Field, David J

    2008-01-01

Two recent studies suggest that natural scenes and paintings show similar statistical properties. But does the content or region of origin of an artwork affect its statistical properties? We addressed this question by having judges place paintings from a large, diverse collection of paintings into one of three subject-matter categories using a forced-choice paradigm. Basic statistics for images whose categorization was agreed upon by all judges showed no significant differences between those judged to be 'landscape' and 'portrait/still-life', but these two classes differed from paintings judged to be 'abstract'. All categories showed basic spatial statistical regularities similar to those typical of natural scenes. A test of the full painting collection (140 images) with respect to the works' place of origin (provenance) showed significant differences between Eastern works and Western ones, differences which we find are likely related to the materials and the choice of background color. Although artists deviate slightly from reproducing natural statistics in abstract art (compared to representational art), the great majority of human art likely shares basic statistical limitations. We argue that statistical regularities in art are rooted in the need to make art visible to the eye, not in the inherent aesthetic value of natural-scene statistics, and we suggest that variability in spatial statistics may be generally imposed by manufacture.

  16. The Lake Tahoe Basin Land Use Simulation Model

    USGS Publications Warehouse

    Forney, William M.; Oldham, I. Benson

    2011-01-01

    This U.S. Geological Survey Open-File Report describes the final modeling product for the Tahoe Decision Support System project for the Lake Tahoe Basin funded by the Southern Nevada Public Land Management Act and the U.S. Geological Survey's Geographic Analysis and Monitoring Program. This research was conducted by the U.S. Geological Survey Western Geographic Science Center. The purpose of this report is to describe the basic elements of the novel Lake Tahoe Basin Land Use Simulation Model, publish samples of the data inputs, basic outputs of the model, and the details of the Python code. The results of this report include a basic description of the Land Use Simulation Model, descriptions and summary statistics of model inputs, two figures showing the graphical user interface from the web-based tool, samples of the two input files, seven tables of basic output results from the web-based tool and descriptions of their parameters, and the fully functional Python code.

  17. Basic statistics (the fundamental concepts).

    PubMed

    Lim, Eric

    2014-12-01

An appreciation and understanding of statistics is important to all practising clinicians, not simply researchers, because mathematics is the fundamental basis on which we base clinical decisions, usually with reference to benefit in relation to risk. Unless a clinician has a basic understanding of statistics, he or she will never be in a position to question healthcare management decisions that have been handed down from generation to generation, and will not be able to conduct research effectively or evaluate the validity of published evidence (usually making an assumption that most published work is either all good or all bad). This article provides a brief introduction to basic statistical methods, illustrates their use in common clinical scenarios, and highlights pitfalls of incorrect usage. However, it is not meant to be a substitute for formal training or consultation with a qualified and experienced medical statistician prior to starting any research project.

  18. Fieldcrest Cannon, Inc. Advanced Technical Preparation. Statistical Process Control (SPC). PRE-SPC I. Instructor Book.

    ERIC Educational Resources Information Center

    Averitt, Sallie D.

    This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills and instructing them in basic calculator…

  19. From Research to Practice: Basic Mathematics Skills and Success in Introductory Statistics

    ERIC Educational Resources Information Center

    Lunsford, M. Leigh; Poplin, Phillip

    2011-01-01

    Based on previous research of Johnson and Kuennen (2006), we conducted a study to determine factors that would possibly predict student success in an introductory statistics course. Our results were similar to Johnson and Kuennen in that we found students' basic mathematical skills, as measured on a test created by Johnson and Kuennen, were a…

  20. An Econometric Model for Estimating IQ Scores and Environmental Influences on the Pattern of IQ Scores Over Time.

    ERIC Educational Resources Information Center

    Kadane, Joseph B.; And Others

    This paper offers a preliminary analysis of the effects of a semi-segregated school system on the IQ's of its students. The basic data consist of IQ scores for fourth, sixth, and eighth grades and associated environmental data obtained from their school records. A statistical model is developed to analyze longitudinal data when both process error…

  1. Decision Process to Identify Lessons for Transition to a Distributed (or Blended) Learning Instructional Format

    DTIC Science & Technology

    2009-09-01

instructional format. Using a mixed-method coding and analysis approach, the sample of POIs were categorized, coded, statistically analyzed, and a... ...transition to a distributed (or blended) learning format. Procedure: A mixed-methods approach, combining qualitative coding procedures with basic

  2. A Tutorial for SPSS/PC+ Studentware. Study Guide for the Doctor of Arts in Computer-Based Learning.

    ERIC Educational Resources Information Center

    MacFarland, Thomas W.; Hou, Cheng-I

    The purpose of this tutorial is to provide the basic information needed for success with SPSS/PC+ Studentware, a student version of the statistical analysis software offered by SPSS, Inc., for the IBM PC+ and compatible computers. It is intended as a convenient summary of how to organize and conduct the most common computer-based statistical…

  3. An Assessment of the Impact of the Department of Defense Very-High-Speed Integrated Circuit Program.

    DTIC Science & Technology

    1982-01-01

analysis, statistical inference, device physics and other such products of basic research. Examples of such information would be: analyses of properties of... τ_B: for an n-p-n silicon transistor with 10^18 cm^-3 base doping, the base transit time τ_B = W_b^2/(2D_n) becomes 0.4 ps in this limit, so that the base contributes little to delay

  4. The Design and Development of a Context-Rich, Photo-Based Online Testing to Assess Students' Science Learning

    ERIC Educational Resources Information Center

    Lin, Min-Jin; Guo, Chorng-Jee; Hsu, Chia-Er

    2011-01-01

This study designed and developed a CP-MCT (context-rich, photo-based multiple choice online test) to assess whether college students can apply basic light concepts to interpret daily light phenomena. One hundred college students volunteered to take the CP-MCT, and the results were statistically analyzed by applying a t-test or ANOVA (Analysis of…

  5. [Prosthodontic research design from the standpoint of statistical analysis: learning and knowing the research design].

    PubMed

    Tanoue, Naomi

    2007-10-01

For any kind of research, the research design is the most important element. The design is used to structure the research and to show how all of the major parts of the research project fit together. Researchers should begin only after planning the research design: what is the main theme, what are the background and references, what kind of data are needed, and what kind of analysis is needed. This may seem a roundabout route, but in fact it is a shortcut. The research methods must be appropriate to the objectives of the study. For hypothesis-testing research, the traditional style of research, a research design based on statistics is undoubtedly necessary, considering that such research basically proves a hypothesis with data and statistical theory. For a clinical trial, the clinical version of hypothesis-testing research, the statistical method must be specified in the trial plan. This report describes the basics of research design for a prosthodontics study.

  6. Developing a suitable model for supplier selection based on supply chain risks: an empirical study from Iranian pharmaceutical companies.

    PubMed

    Mehralian, Gholamhossein; Rajabzadeh Gatari, Ali; Morakabati, Mohadese; Vatanpour, Hossein

    2012-01-01

The supply chain represents the critical link between the development of a new product and the market in the pharmaceutical industry. Over the years, improvements made in supply chain operations have focused largely on ways to reduce cost and gain efficiencies of scale. In addition, powerful regulatory and market forces have provided new incentives for pharmaceutical firms to fundamentally rethink the way they produce and distribute products, and also to re-imagine the role of the supply chain in driving strategic growth, brand differentiation and economic value in the health continuum. The purpose of this paper is to formulate the basic factors involved in risk analysis of the pharmaceutical industry, and also to determine the effective factors involved in supplier selection and their priorities. This paper is based on the results of a literature review, the acquisition of experts' opinions, statistical analysis, and the use of MADM models on data gathered from distributed questionnaires. The model consists of the following steps and components: first, the factors involved in supply chain risks are determined and a framework is constructed from them; then, according to the results of the statistical analysis and the MADM models, the risk factors are formulated. The paper determines the main components and influential factors involved in supply chain risks. Results showed that delivery risk makes an important contribution to mitigating the risk of the pharmaceutical industry.

  7. The contribution of collective attack tactics in differentiating handball score efficiency.

    PubMed

    Rogulj, Nenad; Srhoj, Vatromir; Srhoj, Ljerka

    2004-12-01

The prevalence of 19 elements of collective tactics in score-efficient and score-inefficient teams was analyzed in 90 First Croatian Handball League (Men) games during the 1998-1999 season. Prediction variables were used to describe the duration, continuity, system, organization and spatial direction of attacks. Analysis of the basic descriptive and distributional statistical parameters revealed normal distribution of all variables and the possibility of using multivariate methods. Canonical discriminant analysis and analysis of variance showed that the use of collective tactics elements in attacks differed statistically significantly between the winning and losing teams. Counter-attacks and uninterrupted attacks predominated in winning teams. Other types of attacks, such as the long position attack, multiply interrupted attack, attack with one circle runner (pivot), attack based on basic principles, attack based on group cooperation, attack based on independent action, attack based on group maneuvering, rightward-directed attack and leftward-directed attack, predominated in losing teams. Winning teams were found to be clearly characterized by quick attacks against unorganized defense, whereas prolonged, interrupted position attacks against organized defense, along with frequent and diverse tactical actions, were characteristic of losing teams. The choice and frequency of a particular tactical activity in position attack do not guarantee score efficiency but are usually consequences of the limited anthropologic potential and low level of individual technical-tactical skills of the players in low-quality teams.

  8. Developing a Suitable Model for Supplier Selection Based on Supply Chain Risks: An Empirical Study from Iranian Pharmaceutical Companies

    PubMed Central

    Mehralian, Gholamhossein; Rajabzadeh Gatari, Ali; Morakabati, Mohadese; Vatanpour, Hossein

    2012-01-01

The supply chain represents the critical link between the development of a new product and the market in the pharmaceutical industry. Over the years, improvements made in supply chain operations have focused largely on ways to reduce cost and gain efficiencies of scale. In addition, powerful regulatory and market forces have provided new incentives for pharmaceutical firms to fundamentally rethink the way they produce and distribute products, and also to re-imagine the role of the supply chain in driving strategic growth, brand differentiation and economic value in the health continuum. The purpose of this paper is to formulate the basic factors involved in risk analysis of the pharmaceutical industry, and also to determine the effective factors involved in supplier selection and their priorities. This paper is based on the results of a literature review, the acquisition of experts' opinions, statistical analysis, and the use of MADM models on data gathered from distributed questionnaires. The model consists of the following steps and components: first, the factors involved in supply chain risks are determined and a framework is constructed from them; then, according to the results of the statistical analysis and the MADM models, the risk factors are formulated. The paper determines the main components and influential factors involved in supply chain risks. Results showed that delivery risk makes an important contribution to mitigating the risk of the pharmaceutical industry. PMID:24250442

  9. Radiation therapy and internet - what can patients expect? homepage analysis of german radiotherapy institutions.

    PubMed

    Janssen, Stefan; Meyer, Andreas; Vordermark, Dirk; Steinmann, Diana

    2010-12-01

The internet as a source of medical information has emerged during the last years. There is a confusing number of medical websites with a great diversity of quality. Websites of radiotherapy institutions could offer a safe and easy-to-control way to address patients' requests. 205 internet presences of German radiotherapy institutions were analyzed in June 2009 (nonuniversity hospitals n = 108, medical practices n = 62, university hospitals n = 35). For the evaluation of each homepage, verifiable criteria concerning basic information, service and medical issues were used. The quality of information published via the internet by different radiotherapy institutions showed a large variety. Basic information such as telephone numbers, operating hours, and directions was provided by 96.7%, 40%, and 50.7% of websites, respectively. 85% of the websites introduced the staff, 50.2% supplied photos and 14% further information on the attending physicians. The mean number of continuative links to other websites was 5.4, and the mean number of articles supplying medical information for patients was 4.6. Medical practices and university hospitals had statistically significantly more informative articles and links to other websites than nonuniversity hospitals. No statistically significant differences could be found in most other categories, such as service issues and basic information. Internet presences of radiotherapy institutions hold the chance to supply patients with professional and individualized medical information. While some websites are already using this opportunity, others show a lack of basic information or of user-friendliness.

  10. A persuasive concept of research-oriented teaching in Soil Biochemistry

    NASA Astrophysics Data System (ADS)

    Blagodatskaya, Evgenia; Kuzyakova, Irina

    2013-04-01

One of the main problems of existing bachelor programs is the disconnection of basic and experimental education: even during practical training, the methods learned are not related to the characterization of soil field experiments and observed soil processes. We introduce a multi-level research-oriented teaching system that involves Bachelor students in four semesters of active study by integrating basic knowledge, experimental techniques, statistical approaches, and project design and realization. The novelty of the research-oriented teaching system is based 1) on linking an ongoing experiment to the study of statistical methods and 2) on students' own responsibility for interpreting the soil chemical and biochemical characteristics obtained at the very beginning of their study by analysing a set of soil samples that allows full-factorial data treatment. This experimental data set is related to a specific soil stand and is used as the backbone of the teaching system, stimulating the students' interest in soil studies and motivating them to apply basic knowledge from lecture courses. The multi-level system includes: 1) a basic lecture course on soil biochemistry with analysis of research questions; 2) a practical training course on laboratory analytics, where small groups of students are responsible for the analysis of soil samples related to a specific land use, forest type and forest age; 3) a training course on biotic (e.g. respiration) - abiotic (e.g. temperature, moisture, fire, etc.) interactions in the same soil samples; 4) theoretical seminars where students present and make a first attempt to explain the soil characteristics of various soil stands as affected by abiotic factors (first semester); 5) a lecture and seminar course on soil statistics, where students apply newly learned statistical methods to support their conclusions and to find relationships between the soil characteristics obtained during the first semester; 6) a seminar course on project design, where students develop scientific projects to study the uncertainties revealed in soil responses to abiotic factors (second and third semesters); 7) lecture, seminar and training courses on the estimation of active microbial biomass in soil, where students realize their projects, applying new knowledge to the soils from the stands they are responsible for (fourth semester). Thus, during four semesters the students continuously combine theoretical knowledge from the lectures with their own experimental experience, compare and discuss the results of the various groups during seminars, and acquire skills in project design. The successful application of this research-oriented teaching system at the University of Göttingen allowed each student to reveal knowledge gaps at an early stage, accelerated their involvement in ongoing research projects, and motivated them to begin their own scientific careers.

  11. Using Self-Reflection To Increase Science Process Skills in the General Chemistry Laboratory

    NASA Astrophysics Data System (ADS)

    Veal, William R.; Taylor, Dawne; Rogers, Amy L.

    2009-03-01

Self-reflection is a tool of instruction that has been used in the science classroom, and research has shown great promise in using video as a learning tool in the classroom. However, the integration of self-reflective practice using video in the general chemistry laboratory to help students develop process skills had not been done. Immediate video feedback and direct instruction were employed in a general chemistry laboratory course to improve students' mastery and understanding of basic and advanced process skills. Qualitative results and statistical analysis of quantitative data showed that self-reflection significantly helped students develop basic and advanced process skills, yet did not seem to influence their general understanding of the science content.

  12. Counselling by primary care physicians may help patients with heartburn-predominant uninvestigated dyspepsia.

    PubMed

    Paré, Pierre; Lee, Joanna; Hawes, Ian A

    2010-03-01

    To determine whether strategies to counsel and empower patients with heartburn-predominant dyspepsia could improve health-related quality of life. Using a cluster randomized, parallel group, multicentre design, nine centres were assigned to provide either basic or comprehensive counselling to patients (age range 18 to 50 years) presenting with heartburn-predominant upper gastrointestinal symptoms, who would be considered for drug therapy without further investigation. Patients were treated for four weeks with esomeprazole 40 mg once daily, followed by six months of treatment that was at the physician's discretion. The primary end point was the baseline change in Quality of Life in Reflux and Dyspepsia (QOLRAD) questionnaire score. A total of 135 patients from nine centres were included in the intention-to-treat analysis. There was a statistically significant baseline improvement in all domains of the QOLRAD questionnaire in both study arms at four and seven months (P<0.0001). After four months, the overall mean change in QOLRAD score appeared greater in the comprehensive counselling group than in the basic counselling group (1.77 versus 1.47, respectively); however, this difference was not statistically significant (P=0.07). After seven months, the overall mean baseline change in QOLRAD score between the comprehensive and basic counselling groups was not statistically significant (1.69 versus 1.56, respectively; P=0.63). A standardized, comprehensive counselling intervention showed a positive initial trend in improving quality of life in patients with heartburn-predominant uninvestigated dyspepsia. Further investigation is needed to confirm the potential benefits of providing patients with comprehensive counselling regarding disease management.

  13. Counselling by primary care physicians may help patients with heartburn-predominant uninvestigated dyspepsia

    PubMed Central

    Paré, Pierre; Math, Joanna Lee M; Hawes, Ian A

    2010-01-01

    OBJECTIVE: To determine whether strategies to counsel and empower patients with heartburn-predominant dyspepsia could improve health-related quality of life. METHODS: Using a cluster randomized, parallel group, multicentre design, nine centres were assigned to provide either basic or comprehensive counselling to patients (age range 18 to 50 years) presenting with heartburn-predominant upper gastrointestinal symptoms, who would be considered for drug therapy without further investigation. Patients were treated for four weeks with esomeprazole 40 mg once daily, followed by six months of treatment that was at the physician’s discretion. The primary end point was the baseline change in Quality of Life in Reflux and Dyspepsia (QOLRAD) questionnaire score. RESULTS: A total of 135 patients from nine centres were included in the intention-to-treat analysis. There was a statistically significant baseline improvement in all domains of the QOLRAD questionnaire in both study arms at four and seven months (P<0.0001). After four months, the overall mean change in QOLRAD score appeared greater in the comprehensive counselling group than in the basic counselling group (1.77 versus 1.47, respectively); however, this difference was not statistically significant (P=0.07). After seven months, the overall mean baseline change in QOLRAD score between the comprehensive and basic counselling groups was not statistically significant (1.69 versus 1.56, respectively; P=0.63). CONCLUSIONS: A standardized, comprehensive counselling intervention showed a positive initial trend in improving quality of life in patients with heartburn-predominant uninvestigated dyspepsia. Further investigation is needed to confirm the potential benefits of providing patients with comprehensive counselling regarding disease management. PMID:20352148

  14. The decade 1989-1998 in Spanish psychology: an analysis of research in basic psychological processes, history of psychology, and other related topics.

    PubMed

    Igoa, J M

    2001-11-01

This article presents a review of research published by Spanish faculty in the area of basic psychology in the decade 1989-1998. It provides information about research on basic psychological processes commonly studied under the labels of experimental and cognitive psychology, plus a number of topics from other research areas, including some applied psychology issues. The review analyzes the work of 241 faculty members from 27 different Spanish universities, as reflected in 1,882 published papers, book chapters, and books. The analyses carried out in this report include a description of the main research trends found in each area, with some representative references to the published materials, and statistics showing the distribution of this research work across various relevant publications (both Spanish and foreign), with figures that reveal the impact of this work at both national and international scales.

  15. Development of LACIE CCEA-1 weather/wheat yield models. [regression analysis

    NASA Technical Reports Server (NTRS)

    Strommen, N. D.; Sakamoto, C. M.; Leduc, S. K.; Umberger, D. E. (Principal Investigator)

    1979-01-01

The advantages and disadvantages of the causal (phenological, dynamic, physiological), statistical regression, and analog approaches to modeling grain yield are examined. Given LACIE's primary goal of estimating wheat production for the large areas of eight major wheat-growing regions, the statistical regression approach of correlating historical yield and climate data offered the Center for Climatic and Environmental Assessment the greatest potential return within the constraints of time and data sources. The basic equation for the first-generation wheat-yield model is given. Topics discussed include truncation, the trend variable, selection of weather variables, episodic events, strata selection, operational data flow, weighting, and model results.
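    The statistical regression approach described above amounts to ordinary least squares on historical data, with a technology-trend term and weather variables as predictors. A sketch on synthetic data follows; the variable names and coefficient values are invented for illustration and are not LACIE values.

```python
import numpy as np

rng = np.random.default_rng(3)
years = np.arange(1950, 1980)
trend = years - years[0]                      # technology trend variable
precip = rng.normal(300.0, 40.0, years.size)  # growing-season precipitation (mm)
# synthetic "historical" yields: trend plus a weather response plus noise
yield_obs = 10.0 + 0.25 * trend + 0.02 * precip + rng.normal(0.0, 0.5, years.size)

# ordinary least squares: design matrix columns are intercept, trend, weather
X = np.column_stack([np.ones_like(trend, dtype=float), trend, precip])
coef, *_ = np.linalg.lstsq(X, yield_obs, rcond=None)
```

    Fitting on historical years and then evaluating the equation with current-season weather values is the basic operational use of such a model; the fitted `coef` recovers the trend and weather responses from the data.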

  16. Stokes-correlometry of polarization-inhomogeneous objects

    NASA Astrophysics Data System (ADS)

    Ushenko, O. G.; Dubolazov, A.; Bodnar, G. B.; Bachynskiy, V. T.; Vanchulyak, O.

    2018-01-01

The paper consists of two parts. The first part presents the short theoretical basics of the Stokes-correlometry description of the optical anisotropy of biological tissues, together with experimentally measured coordinate distributions of the modulus (MSV) and phase (PhSV) of the complex Stokes vector of skeletal muscle tissue; the values and ranges of change of the statistical moments of the 1st-4th orders, which characterize the distributions of MSV and PhSV values, are determined. The second part presents a statistical analysis of the distributions of the MSV modulus and PhSV, yielding objective criteria for the differentiation of samples with urinary incontinence.

  17. A basic analysis toolkit for biological sequences

    PubMed Central

    Giancarlo, Raffaele; Siragusa, Alessandro; Siragusa, Enrico; Utro, Filippo

    2007-01-01

    This paper presents a software library, nicknamed BATS, for some basic sequence analysis tasks. Namely, local alignments, via approximate string matching, and global alignments, via longest common subsequence and alignments with affine and concave gap cost functions. Moreover, it also supports filtering operations to select strings from a set and establish their statistical significance, via z-score computation. None of the algorithms is new, but although they are generally regarded as fundamental for sequence analysis, they have not been implemented in a single and consistent software package, as we do here. Therefore, our main contribution is to fill this gap between algorithmic theory and practice by providing an extensible and easy to use software library that includes algorithms for the mentioned string matching and alignment problems. The library consists of C/C++ library functions as well as Perl library functions. It can be interfaced with Bioperl and can also be used as a stand-alone system with a GUI. The software is available at under the GNU GPL. PMID:17877802

  18. Applying Bayesian statistics to the study of psychological trauma: A suggestion for future research.

    PubMed

    Yalch, Matthew M

    2016-03-01

    Several contemporary researchers have noted the virtues of Bayesian methods of data analysis. Although debates continue about whether conventional or Bayesian statistics is the "better" approach for researchers in general, there are reasons why Bayesian methods may be well suited to the study of psychological trauma in particular. This article describes how Bayesian statistics offers practical solutions to the problems of data non-normality, small sample size, and missing data common in research on psychological trauma. After a discussion of these problems and the effects they have on trauma research, this article explains the basic philosophical and statistical foundations of Bayesian statistics and how it provides solutions to these problems using an applied example. Results of the literature review and the accompanying example indicate the utility of Bayesian statistics in addressing problems common in trauma research. Bayesian statistics provides a set of methodological tools and a broader philosophical framework that is useful for trauma researchers. Methodological resources are also provided so that interested readers can learn more. (c) 2016 APA, all rights reserved.

  19. HyphArea--automated analysis of spatiotemporal fungal patterns.

    PubMed

    Baum, Tobias; Navarro-Quezada, Aura; Knogge, Wolfgang; Douchkov, Dimitar; Schweizer, Patrick; Seiffert, Udo

    2011-01-01

    In phytopathology, quantitative measurements are rarely used to assess crop plant disease symptoms. Instead, a qualitative valuation by eye is often the method of choice. In order to close the gap between subjective human inspection and objective quantitative results, an automated analysis system capable of recognizing and characterizing the growth patterns of fungal hyphae in micrograph images was developed. This system should enable the efficient screening of different host-pathogen combinations (e.g., barley-Blumeria graminis, barley-Rhynchosporium secalis) using different microscopy technologies (e.g., bright field, fluorescence). An image segmentation algorithm was developed for gray-scale image data that achieved good results with several microscope imaging protocols. Furthermore, adaptability towards different host-pathogen systems was obtained by using a classification that is based on a genetic algorithm. The developed software system was named HyphArea, since the quantification of the area covered by a hyphal colony is the basic task and prerequisite for all further morphological and statistical analyses in this context. By means of a typical use case the utilization and basic properties of HyphArea could be demonstrated. It was possible to detect statistically significant differences between the growth of an R. secalis wild-type strain and a virulence mutant. Copyright © 2010 Elsevier GmbH. All rights reserved.
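    HyphArea's basic task, quantifying the image area covered by a hyphal colony, reduces to segmenting a gray-scale micrograph and counting foreground pixels. A minimal threshold-based sketch on a synthetic image (this is not HyphArea's actual algorithm, which uses a genetic-algorithm-tuned classifier):

```python
import numpy as np

def colony_area_fraction(gray: np.ndarray, threshold: float) -> float:
    """Fraction of pixels classified as hyphae (dark structures below threshold)."""
    mask = gray < threshold
    return float(mask.mean())

# Synthetic micrograph: bright background with a dark circular "colony"
h, w = 100, 100
yy, xx = np.mgrid[0:h, 0:w]
img = np.full((h, w), 200.0)
img[(yy - 50) ** 2 + (xx - 50) ** 2 <= 20 ** 2] = 60.0  # colony of radius 20 px

frac = colony_area_fraction(img, threshold=128.0)
expected = np.pi * 20 ** 2 / (h * w)  # analytic area fraction of the disc
```

On this toy image the measured fraction agrees with the analytic disc area up to pixel discretization.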

  20. Genetics and epidemiology, congenital anomalies and cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedman, J.M.

    1997-03-01

    Many of the basic statistical methods used in epidemiology - regression, analysis of variance, and estimation of relative risk, for example - were originally developed for the genetic analysis of biometric data. The familiarity that many geneticists have with this methodology has helped geneticists to understand and accept genetic epidemiology as a scientific discipline. It is worth noting, however, that most of the work in genetic epidemiology during the past decade has been devoted to linkage and other family studies, rather than to population-based investigations of the type that characterize much of mainstream epidemiology. 30 refs., 2 tabs.

  1. Preliminary analyses of SIR-B radar data for recent Hawaii lava flows

    NASA Technical Reports Server (NTRS)

    Kaupp, V. H.; Derryberry, B. A.; Macdonald, H. C.; Gaddis, L. R.; Mouginis-Mark, P. J.

    1986-01-01

    The Shuttle Imaging Radar (SIR-B) experiment acquired two L-band (23 cm wavelength) radar images (at about 28 and 48 deg incidence angles) over the Kilauea Volcano area of southeastern Hawaii. Geologic analysis of these data indicates that, although aa lava flows and pyroclastic deposits can be discriminated, pahoehoe lava flows are not readily distinguished from surrounding low return materials. Preliminary analysis of data extracted from isolated flows indicates that flow type (i.e., aa or pahoehoe) and relative age can be determined from their basic statistics and illumination angle.

  2. The relevance of basic sciences in undergraduate medical education.

    PubMed

    Lynch, C; Grant, T; McLoughlin, P; Last, J

    2016-02-01

    Evolving and changing undergraduate medical curricula raise concerns that there will no longer be a place for basic sciences. National and international trends show that 5-year programmes with a pre-requisite for school chemistry are growing more prevalent. National reports in Ireland show a decline in the availability of school chemistry and physics. This observational cohort study considers whether the basic sciences of physics, chemistry and biology should be a prerequisite to entering medical school, be part of the core medical curriculum or if they have a place in the practice of medicine. Comparisons of means, correlation and linear regression analysis assessed the degree of association between predictors (school and university basic sciences) and outcomes (year and degree GPA) for entrants to a 6-year Irish medical programme between 2006 and 2009 (n = 352). We found no statistically significant difference in medical programme performance between students with/without prior basic science knowledge. The Irish school exit exam and its components were mainly weak predictors of performance (-0.043 ≤ r ≤ 0.396). Success in year one of medicine, which includes a basic science curriculum, was indicative of later success (0.194 ≤ r² ≤ 0.534). University basic sciences were found to be more predictive than school sciences of undergraduate medical performance in our institution. The increasing emphasis on basic sciences in medical practice and the declining availability of school sciences should mandate medical schools in Ireland to consider how removing basic sciences from the curriculum might impact on future applicants.

  3. A Multidisciplinary Approach for Teaching Statistics and Probability

    ERIC Educational Resources Information Center

    Rao, C. Radhakrishna

    1971-01-01

    The author presents a syllabus for an introductory (first year after high school) course in statistics and probability and some methods of teaching statistical techniques. The description comes basically from the procedures used at the Indian Statistical Institute, Calcutta. (JG)

  4. ICAP - An Interactive Cluster Analysis Procedure for analyzing remotely sensed data

    NASA Technical Reports Server (NTRS)

    Wharton, S. W.; Turner, B. J.

    1981-01-01

    An Interactive Cluster Analysis Procedure (ICAP) was developed to derive classifier training statistics from remotely sensed data. ICAP differs from conventional clustering algorithms by allowing the analyst to optimize the cluster configuration by inspection, rather than by manipulating process parameters. Control of the clustering process alternates between the algorithm, which creates new centroids and forms clusters, and the analyst, who can evaluate and elect to modify the cluster structure. Clusters can be deleted, or lumped together pairwise, or new centroids can be added. A summary of the cluster statistics can be requested to facilitate cluster manipulation. The principal advantage of this approach is that it allows prior information (when available) to be used directly in the analysis, since the analyst interacts with ICAP in a straightforward manner, using basic terms with which he is more likely to be familiar. Results from testing ICAP showed that an informed use of ICAP can improve classification, as compared to an existing cluster analysis procedure.
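    The alternation ICAP describes, between algorithm steps (form clusters around centroids) and analyst operations (delete, lump pairwise, or add centroids), can be sketched on top of a plain nearest-centroid clustering pass. The function names and data below are illustrative, not ICAP's actual interface:

```python
import numpy as np

def assign(points, centroids):
    """Algorithm step: assign each point to its nearest centroid."""
    d = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

def update(points, centroids):
    """Algorithm step: recompute each centroid as the mean of its points."""
    labels = assign(points, centroids)
    return np.array([points[labels == k].mean(axis=0) if np.any(labels == k)
                     else centroids[k] for k in range(len(centroids))])

def lump(centroids, i, j):
    """Analyst operation: merge clusters i and j into their midpoint."""
    merged = (centroids[i] + centroids[j]) / 2.0
    keep = [k for k in range(len(centroids)) if k not in (i, j)]
    return np.vstack([centroids[keep], merged])

rng = np.random.default_rng(1)
pts = np.vstack([rng.normal(0.0, 0.3, (50, 2)),   # two well-separated groups
                 rng.normal(5.0, 0.3, (50, 2))])
cents = np.array([[0.0, 0.0], [5.0, 5.0], [5.1, 5.1]])
cents = update(pts, cents)   # algorithm pass
cents = lump(cents, 1, 2)    # analyst lumps two near-duplicate clusters
labels = assign(pts, cents)
```

After the analyst lumps the redundant pair, the two remaining clusters match the two underlying groups.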

  5. A Comparison of Computer-Assisted Instruction and the Traditional Method of Teaching Basic Statistics

    ERIC Educational Resources Information Center

    Ragasa, Carmelita Y.

    2008-01-01

    The objective of the study is to determine if there is a significant difference in the effects of the treatment and control groups on achievement as well as on attitude as measured by the posttest. A class of 38 sophomore college students in the basic statistics taught with the use of computer-assisted instruction and another class of 15 students…

  6. Handbook Of X-ray Astronomy

    NASA Astrophysics Data System (ADS)

    Arnaud, Keith A.; Smith, R. K.; Siemiginowska, A.; Edgar, R. J.; Grant, C. E.; Kuntz, K. D.; Schwartz, D. A.

    2011-09-01

    This poster advertises a book to be published in September 2011 by Cambridge University Press. Written for graduate students, professional astronomers and researchers who want to start working in this field, this book is a practical guide to x-ray astronomy. The handbook begins with x-ray optics, basic detector physics and CCDs, before focussing on data analysis. It introduces the reduction and calibration of x-ray data, scientific analysis, archives, statistical issues and the particular problems of highly extended sources. The book describes the main hardware used in x-ray astronomy, emphasizing the implications for data analysis. The concepts behind common x-ray astronomy data analysis software are explained. The appendices present reference material often required during data analysis.

  7. Evidence of nonextensive statistical physics behavior in the watershed distribution in active tectonic areas: examples from Greece

    NASA Astrophysics Data System (ADS)

    Vallianatos, Filippos; Kouli, Maria

    2013-08-01

    The Digital Elevation Model (DEM) of Crete Island, with a resolution of approximately 20 meters, was used to delineate watersheds by computing the flow direction and using it in the Watershed function. The Watershed function uses a raster of flow direction to determine contributing area. The Geographic Information Systems routine procedure was applied, and the watersheds as well as the stream network (using a threshold of 2000 cells, i.e. the minimum number of cells that constitute a stream) were extracted from the hydrologically corrected (free of sinks) DEM. A few thousand watersheds were delineated and their areal extent calculated. Of these, 300 were finally selected for further analysis; watersheds of extremely small area were excluded in order to avoid possible artifacts. Our analysis approach is based on the basic principles of complexity theory and the Tsallis entropy introduced in the framework of non-extensive statistical physics. This concept has been successfully used for the analysis of a variety of complex dynamic systems, including natural hazards, where fractality and long-range interactions are important. The analysis indicates that the statistical distribution of watersheds can be successfully described with the theoretical estimations of non-extensive statistical physics, implying the complexity that characterizes their occurrence.
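    The Tsallis entropy underlying the non-extensive analysis is S_q = (1 - Σ p_i^q)/(q - 1), which recovers the Boltzmann-Gibbs (Shannon) entropy as q → 1. A minimal numeric sketch (illustrative probabilities, not the watershed data):

```python
import numpy as np

def tsallis_entropy(p: np.ndarray, q: float) -> float:
    """S_q = (1 - sum p_i^q) / (q - 1); reduces to Shannon entropy as q -> 1."""
    p = p[p > 0]
    if abs(q - 1.0) < 1e-12:
        return float(-np.sum(p * np.log(p)))  # Shannon limit
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

p = np.array([0.5, 0.25, 0.125, 0.125])
s_shannon = tsallis_entropy(p, 1.0)      # Shannon entropy (nats)
s_near1 = tsallis_entropy(p, 1.0001)     # approaches the Shannon value
s_2 = tsallis_entropy(p, 2.0)            # q=2: 1 - sum p_i^2
```

The q=2 case reduces to one minus the sum of squared probabilities, and values of q far from 1 weight rare versus common events differently, which is the non-extensivity exploited in the paper.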

  8. Back to basics: an introduction to statistics.

    PubMed

    Halfens, R J G; Meijers, J M M

    2013-05-01

    In the second in the series, Professor Ruud Halfens and Dr Judith Meijers give an overview of statistics, both descriptive and inferential. They describe the first principles of statistics, including some relevant inferential tests.

  9. Practicality of Elementary Statistics Module Based on CTL Completed by Instructions on Using Software R

    NASA Astrophysics Data System (ADS)

    Delyana, H.; Rismen, S.; Handayani, S.

    2018-04-01

    This research is a development study using the 4-D design model (define, design, develop, and disseminate). The results of the define stage are analyzed for the following needs: syllabus analysis, textbook analysis, student-characteristics analysis, and literature analysis. The textbook analysis indicated that students still have difficulty understanding the two required textbooks; their form of presentation has not helped students to learn independently and discover concepts, and they are not equipped with data-processing guidance using the software R. The developed module is considered valid by the experts. Field trials were then conducted to determine practicality and effectiveness. The trial was conducted with randomly selected students of the Mathematics Education Study Program of STKIP PGRI who had not yet taken the Basic Statistics course, namely 4 people. The practicality aspects considered are ease of use, time efficiency, ease of interpretation, and equivalence; the practicality scores in these aspects are 3.70, 3.79, 3.70, and 3.78, respectively. Based on the trial results, students considered the module very practical for use in learning. This means that the developed module can be used by students in Elementary Statistics learning.

  10. Study design and statistical analysis of data in human population studies with the micronucleus assay.

    PubMed

    Ceppi, Marcello; Gallo, Fabio; Bonassi, Stefano

    2011-01-01

    The most common study design performed in population studies based on the micronucleus (MN) assay, is the cross-sectional study, which is largely performed to evaluate the DNA damaging effects of exposure to genotoxic agents in the workplace, in the environment, as well as from diet or lifestyle factors. Sample size is still a critical issue in the design of MN studies since most recent studies considering gene-environment interaction, often require a sample size of several hundred subjects, which is in many cases difficult to achieve. The control of confounding is another major threat to the validity of causal inference. The most popular confounders considered in population studies using MN are age, gender and smoking habit. Extensive attention is given to the assessment of effect modification, given the increasing inclusion of biomarkers of genetic susceptibility in the study design. Selected issues concerning the statistical treatment of data have been addressed in this mini-review, starting from data description, which is a critical step of statistical analysis, since it allows one to detect possible errors in the dataset to be analysed and to check the validity of assumptions required for more complex analyses. Basic issues dealing with statistical analysis of biomarkers are extensively evaluated, including methods to explore the dose-response relationship between two continuous variables and inferential analysis. A critical approach to the use of parametric and non-parametric methods is presented, before addressing the issue of the most suitable multivariate models to fit MN data. In the last decade, the quality of statistical analysis of MN data has certainly evolved, although even nowadays only a small number of studies apply the Poisson model, which is the most suitable method for the analysis of MN data.
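    The review's point that the Poisson model suits MN count data can be illustrated with a likelihood-ratio comparison of Poisson rates between an exposed and a control group. This is a generic numpy/scipy sketch on synthetic counts, not the authors' analysis:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
control = rng.poisson(lam=3.0, size=200)  # synthetic MN counts, control group
exposed = rng.poisson(lam=4.0, size=200)  # synthetic MN counts, exposed group

def pois_loglik(counts, lam):
    # Poisson log-likelihood up to the constant -sum(log k!), which cancels
    # in the likelihood ratio below
    return float(np.sum(counts * np.log(lam) - lam))

pooled = np.concatenate([control, exposed])
lam0 = pooled.mean()  # null hypothesis: a single common rate

# Likelihood-ratio statistic: separate rates vs one common rate (df = 1)
lr = 2.0 * (pois_loglik(control, control.mean())
            + pois_loglik(exposed, exposed.mean())
            - pois_loglik(pooled, lam0))
p_value = float(stats.chi2.sf(lr, df=1))
```

With a genuine rate difference the statistic is large and the chi-squared tail probability small; the same machinery underlies Poisson regression with covariates such as age, gender, and smoking habit.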

  11. Verification of relationship model between Korean new elderly class's recovery resilience and productive aging.

    PubMed

    Cho, Gun-Sang; Kim, Dae-Sung; Yi, Eun-Surk

    2015-12-01

    The purpose of this study was to verify a relationship model between the Korean new elderly class's recovery resilience and productive aging. As of 2013, this study sampled preliminary elderly people in Gyeonggi-do and other provinces nationwide. Data from a total of 484 effective subjects were analyzed. The collected data were processed using IBM SPSS 20.0 and AMOS 20.0, and underwent descriptive statistical analysis, confirmatory factor analysis, and structural model verification. The path coefficient associated with model fitness was examined. The standardized path coefficient between recovery resilience and productive aging was β=0.975 (t=14.790), revealing a statistically significant positive effect. Thus, the proposed basic model on the direct path between recovery resilience and productive aging was found to fit.

  12. Verification of relationship model between Korean new elderly class’s recovery resilience and productive aging

    PubMed Central

    Cho, Gun-Sang; Kim, Dae-Sung; Yi, Eun-Surk

    2015-01-01

    The purpose of this study was to verify a relationship model between the Korean new elderly class's recovery resilience and productive aging. As of 2013, this study sampled preliminary elderly people in Gyeonggi-do and other provinces nationwide. Data from a total of 484 effective subjects were analyzed. The collected data were processed using IBM SPSS 20.0 and AMOS 20.0, and underwent descriptive statistical analysis, confirmatory factor analysis, and structural model verification. The path coefficient associated with model fitness was examined. The standardized path coefficient between recovery resilience and productive aging was β=0.975 (t=14.790), revealing a statistically significant positive effect. Thus, the proposed basic model on the direct path between recovery resilience and productive aging was found to fit. PMID:26730383

  13. Analysis of molecular variance inferred from metric distances among DNA haplotypes: application to human mitochondrial DNA restriction data.

    PubMed

    Excoffier, L; Smouse, P E; Quattro, J M

    1992-06-01

    We present here a framework for the study of molecular variation within a single species. Information on DNA haplotype divergence is incorporated into an analysis of variance format, derived from a matrix of squared-distances among all pairs of haplotypes. This analysis of molecular variance (AMOVA) produces estimates of variance components and F-statistic analogs, designated here as phi-statistics, reflecting the correlation of haplotypic diversity at different levels of hierarchical subdivision. The method is flexible enough to accommodate several alternative input matrices, corresponding to different types of molecular data, as well as different types of evolutionary assumptions, without modifying the basic structure of the analysis. The significance of the variance components and phi-statistics is tested using a permutational approach, eliminating the normality assumption that is conventional for analysis of variance but inappropriate for molecular data. Application of AMOVA to human mitochondrial DNA haplotype data shows that population subdivisions are better resolved when some measure of molecular differences among haplotypes is introduced into the analysis. At the intraspecific level, however, the additional information provided by knowing the exact phylogenetic relations among haplotypes or by a nonlinear translation of restriction-site change into nucleotide diversity does not significantly modify the inferred population genetic structure. Monte Carlo studies show that site sampling does not fundamentally affect the significance of the molecular variance components. The AMOVA treatment is easily extended in several different directions and it constitutes a coherent and flexible framework for the statistical analysis of molecular data.
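    AMOVA's permutational approach to significance testing, used in place of the normality assumption, can be sketched in miniature: permute group labels, recompute the test statistic, and compare the permuted values to the observed one. The sketch below uses a simple mean-difference statistic as a generic illustration, not the full AMOVA variance decomposition:

```python
import numpy as np

def perm_pvalue(x, y, n_perm=2000, seed=0):
    """One-sided permutation p-value for the difference in group means."""
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([x, y])
    observed = x.mean() - y.mean()
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)       # shuffle group labels
        stat = perm[:len(x)].mean() - perm[len(x):].mean()
        if stat >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)        # add-one rule avoids p = 0

rng = np.random.default_rng(3)
group_a = rng.normal(1.0, 1.0, 40)  # synthetic diversity values, group A
group_b = rng.normal(0.0, 1.0, 40)  # synthetic diversity values, group B
p = perm_pvalue(group_a, group_b)
```

No distributional assumption is needed: the null reference distribution is built entirely from the data, exactly the rationale given for permutation testing of the variance components and phi-statistics.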

  14. Implementation and evaluation of an efficient secure computation system using ‘R’ for healthcare statistics

    PubMed Central

    Chida, Koji; Morohashi, Gembu; Fuji, Hitoshi; Magata, Fumihiko; Fujimura, Akiko; Hamada, Koki; Ikarashi, Dai; Yamamoto, Ryuichi

    2014-01-01

    Background and objective: While the secondary use of medical data has gained attention, its adoption has been constrained due to protection of patient privacy. Making medical data secure by de-identification can be problematic, especially when the data concerns rare diseases. We require rigorous security management measures. Materials and methods: Using secure computation, an approach from cryptography, our system can compute various statistics over encrypted medical records without decrypting them. An issue of secure computation is that the amount of processing time required is immense. We implemented a system that securely computes healthcare statistics from the statistical computing software ‘R’ by effectively combining secret-sharing-based secure computation with original computation. Results: Testing confirmed that our system could correctly complete computation of average and unbiased variance of approximately 50 000 records of dummy insurance claim data in a little over a second. Computation including conditional expressions and/or comparison of values, for example, t test and median, could also be correctly completed in several tens of seconds to a few minutes. Discussion: If medical records are simply encrypted, the risk of leaks exists because decryption is usually required during statistical analysis. Our system possesses high-level security because medical records remain in encrypted state even during statistical analysis. Also, our system can securely compute some basic statistics with conditional expressions using ‘R’ that works interactively while secure computation protocols generally require a significant amount of processing time. Conclusions: We propose a secure statistical analysis system using ‘R’ for medical data that effectively integrates secret-sharing-based secure computation and original computation. PMID:24763677

  15. Implementation and evaluation of an efficient secure computation system using 'R' for healthcare statistics.

    PubMed

    Chida, Koji; Morohashi, Gembu; Fuji, Hitoshi; Magata, Fumihiko; Fujimura, Akiko; Hamada, Koki; Ikarashi, Dai; Yamamoto, Ryuichi

    2014-10-01

    While the secondary use of medical data has gained attention, its adoption has been constrained due to protection of patient privacy. Making medical data secure by de-identification can be problematic, especially when the data concerns rare diseases. We require rigorous security management measures. Using secure computation, an approach from cryptography, our system can compute various statistics over encrypted medical records without decrypting them. An issue of secure computation is that the amount of processing time required is immense. We implemented a system that securely computes healthcare statistics from the statistical computing software 'R' by effectively combining secret-sharing-based secure computation with original computation. Testing confirmed that our system could correctly complete computation of average and unbiased variance of approximately 50,000 records of dummy insurance claim data in a little over a second. Computation including conditional expressions and/or comparison of values, for example, t test and median, could also be correctly completed in several tens of seconds to a few minutes. If medical records are simply encrypted, the risk of leaks exists because decryption is usually required during statistical analysis. Our system possesses high-level security because medical records remain in encrypted state even during statistical analysis. Also, our system can securely compute some basic statistics with conditional expressions using 'R' that works interactively while secure computation protocols generally require a significant amount of processing time. We propose a secure statistical analysis system using 'R' for medical data that effectively integrates secret-sharing-based secure computation and original computation. Published by the BMJ Publishing Group Limited.
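    The core primitive the system builds on, secret-sharing-based secure computation, can be sketched with additive secret sharing: each record value is split into random shares that individually reveal nothing, yet share-wise sums reconstruct the true total. This is a toy single-process sketch of the idea, not the authors' system or its ‘R’ interface:

```python
import secrets

P = 2 ** 61 - 1  # a large (Mersenne) prime modulus

def share(value: int, n_parties: int = 3):
    """Split value into n additive shares modulo P."""
    shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % P)  # last share makes the sum work out
    return shares

def reconstruct(shares):
    return sum(shares) % P

records = [120, 95, 130, 110]           # e.g. values from four medical records
shared = [share(v) for v in records]

# Each party locally sums the shares it holds; no party ever sees a raw record.
party_sums = [sum(s[i] for s in shared) % P for i in range(3)]
total = reconstruct(party_sums)         # equals sum(records) mod P
mean = total / len(records)
```

Because addition commutes with sharing, aggregate statistics such as the average come out correctly while individual records stay hidden, which is the property the paper exploits for healthcare statistics.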

  16. Multivariate assessment of event-related potentials with the t-CWT method.

    PubMed

    Bostanov, Vladimir

    2015-11-05

    Event-related brain potentials (ERPs) are usually assessed with univariate statistical tests although they are essentially multivariate objects. Brain-computer interface applications are a notable exception to this practice, because they are based on multivariate classification of single-trial ERPs. Multivariate ERP assessment can be facilitated by feature extraction methods. One such method is t-CWT, a mathematical-statistical algorithm based on the continuous wavelet transform (CWT) and Student's t-test. This article begins with a geometric primer on some basic concepts of multivariate statistics as applied to ERP assessment in general and to the t-CWT method in particular. Further, it presents for the first time a detailed, step-by-step, formal mathematical description of the t-CWT algorithm. A new multivariate outlier rejection procedure based on principal component analysis in the frequency domain is presented as an important pre-processing step. The MATLAB and GNU Octave implementation of t-CWT is also made publicly available for the first time as free and open source code. The method is demonstrated on some example ERP data obtained in a passive oddball paradigm. Finally, some conceptually novel applications of the multivariate approach in general and of the t-CWT method in particular are suggested and discussed. Hopefully, the publication of both the t-CWT source code and its underlying mathematical algorithm along with a didactic geometric introduction to some basic concepts of multivariate statistics would make t-CWT more accessible to both users and developers in the field of neuroscience research.
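    The two ingredients named in the abstract, the continuous wavelet transform and Student's t-test, can be combined in a minimal numpy/scipy sketch: convolve each single trial with a Mexican-hat wavelet at one scale, then compute pointwise two-sample t-statistics between conditions. This illustrates the principle only; it is not Bostanov's t-CWT algorithm:

```python
import numpy as np
from scipy import stats

def mexican_hat(width, scale):
    """Mexican-hat (Ricker-like) wavelet sampled on [-width, width]."""
    t = np.arange(-width, width + 1) / scale
    return (1.0 - t ** 2) * np.exp(-t ** 2 / 2.0)

def cwt_row(signal, scale):
    """Wavelet coefficients of one trial at a single scale."""
    return np.convolve(signal, mexican_hat(40, scale), mode="same")

rng = np.random.default_rng(4)
n_trials, n_samples = 30, 200
peak = np.exp(-((np.arange(n_samples) - 100) ** 2) / (2 * 10.0 ** 2))

cond_a = rng.normal(0, 1, (n_trials, n_samples)) + 2.0 * peak  # ERP component
cond_b = rng.normal(0, 1, (n_trials, n_samples))               # no component

coef_a = np.array([cwt_row(trial, 10.0) for trial in cond_a])
coef_b = np.array([cwt_row(trial, 10.0) for trial in cond_b])
t_vals, _ = stats.ttest_ind(coef_a, coef_b, axis=0)
```

The largest |t| lands on the latency of the simulated component, showing how t-statistics over wavelet coefficients localize a condition difference.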

  17. Understanding Statistical Concepts and Terms in Context: The GovStat Ontology and the Statistical Interactive Glossary.

    ERIC Educational Resources Information Center

    Haas, Stephanie W.; Pattuelli, Maria Cristina; Brown, Ron T.

    2003-01-01

    Describes the Statistical Interactive Glossary (SIG), an enhanced glossary of statistical terms supported by the GovStat ontology of statistical concepts. Presents a conceptual framework whose components articulate different aspects of a term's basic explanation that can be manipulated to produce a variety of presentations. The overarching…

  18. MethVisual - visualization and exploratory statistical analysis of DNA methylation profiles from bisulfite sequencing.

    PubMed

    Zackay, Arie; Steinhoff, Christine

    2010-12-15

    Exploration of DNA methylation and its impact on various regulatory mechanisms has become a very active field of research. Simultaneously there is an arising need for tools to process and analyse the data together with statistical investigation and visualisation. MethVisual is a new application that enables exploratory analysis and intuitive visualization of DNA methylation data as typically generated by bisulfite sequencing. The package allows the import of DNA methylation sequences, aligns them and performs quality control comparison. It comprises basic analysis steps such as lollipop visualization, co-occurrence display of methylation of neighbouring and distant CpG sites, summary statistics on methylation status, clustering and correspondence analysis. The package has been developed for methylation data but can also be used for other data types for which binary coding can be inferred. The application of the package, a comparison to existing DNA methylation analysis tools, and its workflow based on two datasets are presented in this paper. The R package MethVisual offers various analysis procedures for data that can be binarized, in particular for bisulfite-sequenced methylation data. R/Bioconductor has become one of the most important environments for statistical analysis of various types of biological and medical data. Therefore, any data analysis within R that allows the integration of various data types as provided from different technological platforms is convenient. It is the first and so far the only specific package for DNA methylation analysis, in particular for bisulfite-sequenced data, available in the R/Bioconductor environment. The package is available for free at http://methvisual.molgen.mpg.de/ and from the Bioconductor Consortium http://www.bioconductor.org.
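    The summary statistics MethVisual reports on binarized methylation calls (per-site methylation frequency, co-occurrence of CpG sites) reduce to simple operations on a 0/1 matrix of reads by sites. A minimal numpy sketch on toy data (MethVisual itself is an R/Bioconductor package):

```python
import numpy as np

# Rows = bisulfite reads (clones), columns = CpG sites; 1 = methylated
calls = np.array([
    [1, 1, 0, 0, 1],
    [1, 1, 0, 1, 1],
    [0, 0, 0, 0, 1],
    [1, 1, 1, 0, 1],
])

site_freq = calls.mean(axis=0)    # methylation frequency per CpG site
read_level = calls.mean(axis=1)   # overall methylation level per read

# Co-occurrence: fraction of reads methylated at both sites i and j
cooc = (calls.T @ calls) / calls.shape[0]
```

The co-occurrence matrix is what a lollipop-style co-methylation display visualizes: entry (i, j) is the fraction of reads methylated at both site i and site j.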

  19. MethVisual - visualization and exploratory statistical analysis of DNA methylation profiles from bisulfite sequencing

    PubMed Central

    2010-01-01

    Background: Exploration of DNA methylation and its impact on various regulatory mechanisms has become a very active field of research. Simultaneously there is an arising need for tools to process and analyse the data together with statistical investigation and visualisation. Findings: MethVisual is a new application that enables exploratory analysis and intuitive visualization of DNA methylation data as typically generated by bisulfite sequencing. The package allows the import of DNA methylation sequences, aligns them and performs quality control comparison. It comprises basic analysis steps such as lollipop visualization, co-occurrence display of methylation of neighbouring and distant CpG sites, summary statistics on methylation status, clustering and correspondence analysis. The package has been developed for methylation data but can also be used for other data types for which binary coding can be inferred. The application of the package, a comparison to existing DNA methylation analysis tools, and its workflow based on two datasets are presented in this paper. Conclusions: The R package MethVisual offers various analysis procedures for data that can be binarized, in particular for bisulfite-sequenced methylation data. R/Bioconductor has become one of the most important environments for statistical analysis of various types of biological and medical data. Therefore, any data analysis within R that allows the integration of various data types as provided from different technological platforms is convenient. It is the first and so far the only specific package for DNA methylation analysis, in particular for bisulfite-sequenced data, available in the R/Bioconductor environment. The package is available for free at http://methvisual.molgen.mpg.de/ and from the Bioconductor Consortium http://www.bioconductor.org. PMID:21159174

  20. Characteristics of the 100 Largest Public Elementary and Secondary School Districts in the United States: 2003-04. Statistical Analysis Report. NCES 2006-329

    ERIC Educational Resources Information Center

    Dalton, Ben; Sable, Jennifer; Hoffman, Lee

    2006-01-01

    The purpose of this publication is to provide basic descriptive information about the 100 largest school districts (ranked by student membership, that is, the number of students enrolled at the beginning of the school year) for the 2003-04 school year in the 50 states, the District of Columbia, Puerto Rico, the Bureau of Indian Affairs, the…

  1. Premilitary Trauma Symptomatology Among Female U.S. Navy Basic Trainees

    DTIC Science & Technology

    1996-12-01

    of abuse and child abuse screening. Child Abuse & Neglect, 16, 647-659. Cohen, J. (1988). Statistical power analysis for the behavioral sciences...psychosocial adjustment: A review of the research. Child Abuse & Neglect, 9(2), 251-263. Malinosky-Rummell, R. R., & Hansen, D. J. (1993). Long-term consequences...abusive behaviors. NHRC Report No. 95-26. San Diego, CA: Naval Health Research Center. Muller, R. T. (1991). Victim blame and child abuse. Unpublished

  2. The Effectiveness of Mao’s Influence Operations at the Beginning of the Chinese Civil War

    DTIC Science & Technology

    2014-05-22

    the merchants. This class comprised not only the local shopkeepers and trade merchants but also many wealthy monopolist traders...relied on the village-and-market center community.14 These communities would be the basis from which the farmer classes received many of their basic and...1-7. 20 public relations and communications, social marketing, statistics, and trend analysis

  3. Probability, statistics, and computational science.

    PubMed

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.
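
    The maximum-likelihood principle mentioned above can be made concrete with a toy example: for a two-state Markov chain, the ML estimate of the transition matrix is simply the row-normalized table of observed transition counts. A minimal sketch (a hypothetical helper, not code from the chapter):

```python
# ML estimation of a Markov chain transition matrix from a state sequence.
# The estimate for P[a][b] is count(a -> b) / count(a -> anything).
def estimate_transition_matrix(states, n_states=2):
    counts = [[0.0] * n_states for _ in range(n_states)]
    for a, b in zip(states, states[1:]):
        counts[a][b] += 1.0
    probs = []
    for row in counts:
        total = sum(row)
        if total > 0:
            probs.append([c / total for c in row])
        else:
            probs.append([1.0 / n_states] * n_states)  # state never left: uniform
    return probs
```

    For the sequence 0,0,1,0,1,1,1,0 this gives P(0→1) = 2/3 and P(1→0) = 1/2, the intuitive count ratios.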

  4. Statistical Quality Control of Moisture Data in GEOS DAS

    NASA Technical Reports Server (NTRS)

    Dee, D. P.; Rukhovets, L.; Todling, R.

    1999-01-01

    A new statistical quality control algorithm was recently implemented in the Goddard Earth Observing System Data Assimilation System (GEOS DAS). The final step in the algorithm consists of an adaptive buddy check that either accepts or rejects outlier observations based on a local statistical analysis of nearby data. A basic assumption in any such test is that the observed field is spatially coherent, in the sense that nearby data can be expected to confirm each other. However, the buddy check resulted in excessive rejection of moisture data, especially during the Northern Hemisphere summer. The analysis moisture variable in GEOS DAS is water vapor mixing ratio. Observational evidence shows that the distribution of mixing ratio errors is far from normal. Furthermore, spatial correlations among mixing ratio errors are highly anisotropic and difficult to identify. Both factors contribute to the poor performance of the statistical quality control algorithm. To alleviate the problem, we applied the buddy check to relative humidity data instead. This variable explicitly depends on temperature and therefore exhibits a much greater spatial coherence. As a result, reject rates of moisture data are much more reasonable and homogeneous in time and space.
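
    As a rough illustration of the buddy-check idea (a hypothetical, simplified 1-D version, not the GEOS DAS implementation), each observation can be compared with the mean of its neighbours and flagged when it departs from them by more than a tolerance times their spread:

```python
# Minimal 1-D buddy check: flag an observation when it disagrees with
# nearby data by more than `tol` standard deviations of those neighbours.
import statistics

def buddy_check(obs, window=2, tol=3.0):
    flags = []
    for i, x in enumerate(obs):
        buddies = [obs[j] for j in range(max(0, i - window),
                                         min(len(obs), i + window + 1))
                   if j != i]
        mu = statistics.mean(buddies)
        sd = statistics.pstdev(buddies) or 1e-12  # guard against zero spread
        flags.append(abs(x - mu) > tol * sd)
    return flags
```

    Note how this encodes the spatial-coherence assumption discussed in the abstract: the test only works if nearby data really do confirm each other.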

  5. Fish: A New Computer Program for Friendly Introductory Statistics Help

    ERIC Educational Resources Information Center

    Brooks, Gordon P.; Raffle, Holly

    2005-01-01

    All introductory statistics students must master certain basic descriptive statistics, including means, standard deviations and correlations. Students must also gain insight into such complex concepts as the central limit theorem and standard error. This article introduces and describes the Friendly Introductory Statistics Help (FISH) computer…

  6. Quality of community basic medical service utilization in urban and suburban areas in Shanghai from 2009 to 2014.

    PubMed

    Guo, Lijun; Bao, Yong; Ma, Jun; Li, Shujun; Cai, Yuyang; Sun, Wei; Liu, Qiaohong

    2018-01-01

    Urban areas usually display better health care services than rural areas, but data about suburban areas in China are lacking. Hence, this cross-sectional study compared the utilization of community basic medical services in Shanghai urban and suburban areas between 2009 and 2014. These data were used to improve the efficiency of community health service utilization and to provide a reference for solving the main health problems of the residents in urban and suburban areas of Shanghai. Using a two-stage random sampling method, questionnaires were completed by 73 community health service centers that were randomly selected from six districts that were in turn randomly selected from 17 counties in Shanghai. Descriptive statistics, principal component analysis, and forecast analysis were used to complete a gap analysis of basic health services utilization quality between urban and suburban areas. During the 6-year study period, there was an increasing trend toward greater efficiency of basic medical service provision, benefits of basic medical service provision, effectiveness of common chronic disease management, overall satisfaction of community residents, and two-way referral effects. Apart from the implementation effect of hypertension management and two-way referral, the remaining indicators showed a superior effect in urban areas compared with the suburbs (P<0.001). In addition, among the seven principal components, four principal component scores were better in urban areas than in suburban areas (P<0.001, P = 0.004, P = 0.036, and P = 0.022). The urban comprehensive score also exceeded that of the suburbs (P<0.001). In summary, over the 6-year period, there was a rapidly increasing trend in basic medical service utilization. Comprehensive satisfaction clearly improved as well. Nevertheless, there was an imbalance in health service utilization between urban and suburban areas. There is a need for the health administrative department to address this imbalance between urban and suburban institutions and to provide the required support to underdeveloped areas to improve resident satisfaction.

  7. Quality of community basic medical service utilization in urban and suburban areas in Shanghai from 2009 to 2014

    PubMed Central

    Ma, Jun; Li, Shujun; Cai, Yuyang; Sun, Wei; Liu, Qiaohong

    2018-01-01

    Urban areas usually display better health care services than rural areas, but data about suburban areas in China are lacking. Hence, this cross-sectional study compared the utilization of community basic medical services in Shanghai urban and suburban areas between 2009 and 2014. These data were used to improve the efficiency of community health service utilization and to provide a reference for solving the main health problems of the residents in urban and suburban areas of Shanghai. Using a two-stage random sampling method, questionnaires were completed by 73 community health service centers that were randomly selected from six districts that were in turn randomly selected from 17 counties in Shanghai. Descriptive statistics, principal component analysis, and forecast analysis were used to complete a gap analysis of basic health services utilization quality between urban and suburban areas. During the 6-year study period, there was an increasing trend toward greater efficiency of basic medical service provision, benefits of basic medical service provision, effectiveness of common chronic disease management, overall satisfaction of community residents, and two-way referral effects. Apart from the implementation effect of hypertension management and two-way referral, the remaining indicators showed a superior effect in urban areas compared with the suburbs (P<0.001). In addition, among the seven principal components, four principal component scores were better in urban areas than in suburban areas (P<0.001, P = 0.004, P = 0.036, and P = 0.022). The urban comprehensive score also exceeded that of the suburbs (P<0.001). In summary, over the 6-year period, there was a rapidly increasing trend in basic medical service utilization. Comprehensive satisfaction clearly improved as well. Nevertheless, there was an imbalance in health service utilization between urban and suburban areas. There is a need for the health administrative department to address this imbalance between urban and suburban institutions and to provide the required support to underdeveloped areas to improve resident satisfaction. PMID:29791470

  8. Experimental design, power and sample size for animal reproduction experiments.

    PubMed

    Chapman, Phillip L; Seidel, George E

    2008-01-01

    The present paper concerns statistical issues in the design of animal reproduction experiments, with emphasis on the problems of sample size determination and power calculations. We include examples and non-technical discussions aimed at helping researchers avoid serious errors that may invalidate conclusions or seriously impair their validity. Screen shots from interactive power calculation programs and basic SAS power calculation programs are presented to aid in understanding statistical power and in computing power for some common experimental situations. Practical issues that are common to most statistical design problems are briefly discussed. These include one-sided hypothesis tests, power level criteria, equality of within-group variances, transformations of response variables to achieve variance equality, optimal specification of treatment group sizes, 'post hoc' power analysis and arguments for the increased use of confidence intervals in place of hypothesis tests.
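
    The kind of power calculation discussed above can be sketched with the usual normal approximation (an illustrative stand-in for the interactive and SAS programs the paper presents, not a reproduction of them): for two groups of n animals each, common standard deviation sigma and true mean difference delta,

```python
# Approximate power of a two-sided two-sample comparison of means,
# normal approximation: power = Phi(ncp - z) + Phi(-ncp - z),
# with noncentrality ncp = delta / (sigma * sqrt(2/n)).
from statistics import NormalDist

def power_two_sample(n, delta, sigma, alpha=0.05):
    nd = NormalDist()
    z = nd.inv_cdf(1.0 - alpha / 2.0)
    ncp = delta / (sigma * (2.0 / n) ** 0.5)
    return nd.cdf(ncp - z) + nd.cdf(-ncp - z)
```

    With n = 17 per group and delta = sigma, this gives roughly 0.83, close to the textbook rule that 17 per group detects a one-SD difference with about 80% power.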

  9. Statistical primer: how to deal with missing data in scientific research?

    PubMed

    Papageorgiou, Grigorios; Grant, Stuart W; Takkenberg, Johanna J M; Mokhles, Mostafa M

    2018-05-10

    Missing data are a common challenge encountered in research, and they can compromise the results of statistical inference when not handled appropriately. This paper aims to introduce basic concepts of missing data to a non-statistical audience, to list and compare some of the most popular approaches for handling missing data in practice, and to provide guidelines and recommendations for dealing with and reporting missing data in scientific research. Complete case analysis and single imputation are simple approaches for handling missing data and are popular in practice; however, in most cases they are not guaranteed to provide valid inferences. Multiple imputation is a robust and general alternative which is appropriate for data missing at random, surpassing the disadvantages of the simpler approaches, but it should always be conducted with care. The aforementioned approaches are illustrated and compared in an example application using Cox regression.
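
    A toy simulation (not the paper's Cox regression example; all numbers are hypothetical) illustrates why complete case analysis fails under missingness at random while imputation that exploits the observed x-y relation largely removes the bias:

```python
# y depends on x; y is "missing" whenever x >= 0.5, so missingness
# depends only on the observed x (missing at random).
import random

random.seed(0)
n = 20000
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [2.0 * x + random.gauss(0, 1) for x in xs]
observed = [(x, y) for x, y in zip(xs, ys) if x < 0.5]

true_mean = sum(ys) / n
cc_mean = sum(y for _, y in observed) / len(observed)  # complete-case: biased

# Single regression imputation from the observed (x, y) pairs:
mx = sum(x for x, _ in observed) / len(observed)
beta = (sum((x - mx) * (y - cc_mean) for x, y in observed)
        / sum((x - mx) ** 2 for x, _ in observed))
filled = [y if x < 0.5 else cc_mean + beta * (x - mx)
          for x, y in zip(xs, ys)]
imp_mean = sum(filled) / n
```

    Here the complete-case mean is pulled well below the true mean (the cases with large x, and hence large y, are exactly the ones dropped), while the regression-imputed mean recovers it.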

  10. Determination of quality parameters from statistical analysis of routine TLD dosimetry data.

    PubMed

    German, U; Weinstein, M; Pelled, O

    2006-01-01

    Following the as low as reasonably achievable (ALARA) practice, there is a need to measure very low doses, of the same order of magnitude as the natural background, and the limits of detection of the dosimetry systems. The different contributions of the background signals to the total zero dose reading of thermoluminescence dosemeter (TLD) cards were analysed by using the common basic definitions of statistical indicators: the critical level (L(C)), the detection limit (L(D)) and the determination limit (L(Q)). These key statistical parameters for the system operated at NRC-Negev were quantified, based on the history of readings of the calibration cards in use. The electronic noise seems to play a minor role, but the reading of the Teflon coating (without the presence of a TLD crystal) gave a significant contribution.
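
    The three indicators can be computed directly from repeated zero-dose readings using Currie's classical factors (shown here with the familiar choices based on k = 1.645, giving L_C ≈ 2.33s, L_D ≈ 4.65s and L_Q = 10s; a generic sketch, not the NRC-Negev procedure):

```python
# Currie limits from background (zero-dose) readings; s is the sample
# standard deviation of the background signal.
import statistics

def currie_limits(background_readings):
    s = statistics.stdev(background_readings)
    l_c = 2.33 * s   # critical level: decision threshold above background
    l_d = 4.65 * s   # detection limit: true signal reliably detected
    l_q = 10.0 * s   # determination limit: ~10% relative precision
    return l_c, l_d, l_q
```

    The ordering L_C < L_D < L_Q always holds, reflecting the increasingly strict requirements of detection decision, reliable detection, and quantification.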

  11. Social Physique Anxiety and Intention to Be Physically Active: A Self-Determination Theory Approach.

    PubMed

    Sicilia, Álvaro; Sáenz-Alvarez, Piedad; González-Cutre, David; Ferriz, Roberto

    2016-12-01

    Based on self-determination theory, the purpose of this study was to analyze the relationship between social physique anxiety and intention to be physically active, while taking into account the mediating effects of the basic psychological needs and behavioral regulations in exercise. Having obtained parents' prior consent, 390 students in secondary school (218 boys, 172 girls; M age  = 15.10 years, SD = 1.94 years) completed a self-administered questionnaire during physical education class that assessed the target variables. Preliminary analyses included means, standard deviations, and bivariate correlations among the target variables. Next, a path analysis was performed using the maximum likelihood estimation method with the bootstrapping procedure in the statistical package AMOS 19. Analysis revealed that social physique anxiety negatively predicted intention to be physically active through mediation of the basic psychological needs and the 3 autonomous forms of motivation (i.e., intrinsic motivation, integrated regulation, and identified regulation). The results suggest that social physique anxiety is an internal source of controlling influence that hinders basic psychological need satisfaction and autonomous motivation in exercise, and interventions aimed at reducing social physique anxiety could promote future exercise.

  12. [Practical aspects regarding sample size in clinical research].

    PubMed

    Vega Ramos, B; Peraza Yanes, O; Herrera Correa, G; Saldívar Toraya, S

    1996-01-01

    Knowledge of the right sample size lets us judge whether the results published in medical papers rest on a suitable design and whether their conclusions are properly supported by the statistical analysis. To estimate the sample size we must consider the type I error, type II error, variance, the size of the effect, and the significance and power of the test. To decide which formula to use, we must define the type of study: a prevalence study, a study of mean values, or a comparative study. In this paper we explain some basic topics of statistics and describe four simple examples of sample size estimation.
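
    For the comparative (two-means) case, the standard normal-approximation formula n = 2(z_alpha + z_beta)^2 sigma^2 / delta^2 per group can be sketched as follows (an illustration of the textbook formula, not one of the paper's four examples):

```python
# Sample size per group for comparing two means with a two-sided test:
# z_a covers the type I error, z_b the power (1 - type II error).
from math import ceil
from statistics import NormalDist

def n_per_group(delta, sigma, alpha=0.05, power=0.80):
    z = NormalDist().inv_cdf
    z_a = z(1.0 - alpha / 2.0)
    z_b = z(power)
    return ceil(2.0 * (z_a + z_b) ** 2 * (sigma / delta) ** 2)
```

    A one-SD difference at 80% power and alpha = 0.05 needs 16 subjects per group under this approximation; halving the detectable difference quadruples the requirement.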

  13. Mueller matrix mapping of biological polycrystalline layers using reference wave

    NASA Astrophysics Data System (ADS)

    Dubolazov, A.; Ushenko, O. G.; Ushenko, Yu. O.; Pidkamin, L. Y.; Sidor, M. I.; Grytsyuk, M.; Prysyazhnyuk, P. V.

    2018-01-01

    The paper consists of two parts. The first part is devoted to the short theoretical basics of the method of differential Mueller-matrix description of the properties of partially depolarizing layers. Experimentally measured maps of the first-order differential matrix of the polycrystalline structure of a histological section of brain tissue are provided. The statistical moments of the 1st-4th orders, which characterize the distribution of the matrix elements, are defined. The second part of the paper provides the data of a statistical analysis of birefringence and dichroism of histological sections of mice liver tissue (normal and with diabetes). Objective criteria for the differential diagnostics of diabetes are defined.
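
    The 1st-4th statistical moments used to characterize such distributions are the familiar mean, variance, skewness and kurtosis; a minimal generic sketch (not the paper's optical processing chain):

```python
# First four moments of a sample: mean, (population) variance,
# skewness m3/m2^1.5, and Pearson kurtosis m4/m2^2 (normal ~ 3).
def four_moments(values):
    n = len(values)
    mean = sum(values) / n
    devs = [v - mean for v in values]
    m2 = sum(d ** 2 for d in devs) / n
    m3 = sum(d ** 3 for d in devs) / n
    m4 = sum(d ** 4 for d in devs) / n
    skew = m3 / m2 ** 1.5 if m2 > 0 else 0.0
    kurt = m4 / m2 ** 2 if m2 > 0 else 0.0
    return mean, m2, skew, kurt
```

    For the symmetric sample 1..5 the skewness is exactly zero, which is the kind of distributional signature the diagnostic criteria rely on.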

  14. An overview of meta-analysis for clinicians.

    PubMed

    Lee, Young Ho

    2018-03-01

    The number of medical studies being published is increasing exponentially, and clinicians must routinely process large amounts of new information. Moreover, the results of individual studies are often insufficient to provide confident answers, as their results are not consistently reproducible. A meta-analysis is a statistical method for combining the results of different studies on the same topic and it may resolve conflicts among studies. Meta-analysis is being used increasingly and plays an important role in medical research. This review introduces the basic concepts, steps, advantages, and caveats of meta-analysis, to help clinicians understand it in clinical practice and research. A major advantage of a meta-analysis is that it produces a precise estimate of the effect size, with considerably increased statistical power, which is important when the power of the primary study is limited because of a small sample size. A meta-analysis may yield conclusive results when individual studies are inconclusive. Furthermore, meta-analyses investigate the source of variation and different effects among subgroups. In summary, a meta-analysis is an objective, quantitative method that provides less biased estimates on a specific topic. Understanding how to conduct a meta-analysis aids clinicians in the process of making clinical decisions.
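
    The basic pooling step behind a fixed-effect meta-analysis can be sketched with inverse-variance weighting (a generic illustration, not taken from the review): each study contributes an effect estimate weighted by the reciprocal of its squared standard error, and the pooled standard error shrinks accordingly, which is where the increased statistical power comes from.

```python
# Fixed-effect (inverse-variance) pooling of study effect estimates.
def fixed_effect_pool(effects, std_errors):
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * y for w, y in zip(weights, effects)) / sum(weights)
    pooled_se = (1.0 / sum(weights)) ** 0.5
    return pooled, pooled_se
```

    Note that the pooled standard error is always smaller than that of the most precise single study, mirroring the abstract's point about combining inconclusive studies.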

  15. Are We Able to Pass the Mission of Statistics to Students?

    ERIC Educational Resources Information Center

    Hindls, Richard; Hronová, Stanislava

    2015-01-01

    The article illustrates our long-term experience in teaching statistics to non-statisticians, especially students of economics and the humanities. The article focuses on some problems of the basic course that can weaken interest in statistics or lead to misuse of statistical methods.

  16. Rigorous Statistical Bounds in Uncertainty Quantification for One-Layer Turbulent Geophysical Flows

    NASA Astrophysics Data System (ADS)

    Qi, Di; Majda, Andrew J.

    2018-04-01

    Statistical bounds controlling the total fluctuations in mean and variance about a basic steady-state solution are developed for the truncated barotropic flow over topography. Statistical ensemble prediction is an important topic in weather and climate research. Here, the evolution of an ensemble of trajectories is considered using statistical instability analysis and is compared and contrasted with the classical deterministic instability for the growth of perturbations in one pointwise trajectory. The maximum growth of the total statistics in fluctuations is derived relying on the statistical conservation principle of the pseudo-energy. The saturation bound of the statistical mean fluctuation and variance in the unstable regimes with non-positive-definite pseudo-energy is achieved by linking with a class of stable reference states and minimizing the stable statistical energy. Two cases with dependence on initial statistical uncertainty and on external forcing and dissipation are compared and unified under a consistent statistical stability framework. The flow structures and statistical stability bounds are illustrated and verified by numerical simulations among a wide range of dynamical regimes, where subtle transient statistical instability exists in general with positive short-time exponential growth in the covariance even when the pseudo-energy is positive-definite. Among the various scenarios in this paper, there exist strong forward and backward energy exchanges between different scales which are estimated by the rigorous statistical bounds.

  17. Revealing representational content with pattern-information fMRI--an introductory guide.

    PubMed

    Mur, Marieke; Bandettini, Peter A; Kriegeskorte, Nikolaus

    2009-03-01

    Conventional statistical analysis methods for functional magnetic resonance imaging (fMRI) data are very successful at detecting brain regions that are activated as a whole during specific mental activities. The overall activation of a region is usually taken to indicate involvement of the region in the task. However, such activation analysis does not consider the multivoxel patterns of activity within a brain region. These patterns of activity, which are thought to reflect neuronal population codes, can be investigated by pattern-information analysis. In this framework, a region's multivariate pattern information is taken to indicate representational content. This tutorial introduction motivates pattern-information analysis, explains its underlying assumptions, introduces the most widespread methods in an intuitive way, and outlines the basic sequence of analysis steps.

  18. [Effects of Self-directed Feedback Practice using Smartphone Videos on Basic Nursing Skills, Confidence in Performance and Learning Satisfaction].

    PubMed

    Lee, Seul Gi; Shin, Yun Hee

    2016-04-01

    This study was done to verify the effects of self-directed feedback practice using smartphone videos on nursing students' basic nursing skills, confidence in performance and learning satisfaction. An experimental post-test-only control group design was used. Twenty-nine students were assigned to the experimental group and 29 to the control group. The experimental treatment consisted of exchanging feedback on deficiencies through smartphone-recorded videos of the nursing practice process, taken by peers during self-directed practice. Basic nursing skills scores were higher for all items in the experimental group compared to the control group, and the differences were statistically significant ["Measuring vital signs" (t=-2.10, p=.039); "Wearing protective equipment when entering and exiting the quarantine room and the management of waste materials" (t=-4.74, p<.001); "Gavage tube feeding" (t=-2.70, p=.009)]. Confidence in performance was higher in the experimental group compared to the control group, but the differences were not statistically significant. However, after the complete practice, there was a statistically significant difference in overall performance confidence (t=-3.07, p=.003). Learning satisfaction was higher in the experimental group compared to the control group, but the difference was not statistically significant (t=-1.67, p=.100). Results of this study indicate that self-directed feedback practice using smartphone videos can improve basic nursing skills. The significance is that it can help nursing students gain confidence in their nursing skills for the future through improvement of basic nursing skills and performance of quality care, thus providing patients with safer care.

  19. A microdestructive capillary electrophoresis method for the analysis of blue-pen-ink strokes on office paper.

    PubMed

    Calcerrada, Matías; González-Herráez, Miguel; Garcia-Ruiz, Carmen

    2015-06-26

    This manuscript describes the development of a capillary electrophoresis (CE) method for the detection of acid and basic dyes and its application to real samples, blue-pen-ink strokes on office paper. First, a capillary zone electrophoresis (CZE) method was developed for the separation of basic and acid dyes, by studying the separation medium (buffer nature, pH and relative amount of additive) and instrumental parameters (temperature, voltage and capillary dimensions). The method performance was evaluated in terms of selectivity, resolution (above 5 and 2 for acid dyes and basic dyes, respectively, except for two basic dye standards), LOD (lower than 0.4 mg/L) and precision, expressed as intraday and interday RSD values of peak migration times (lower than 0.6%). The developed method was then applied to 34 blue pens from different technologies (rollerball, ballpoint, markers) and with different ink compositions (gel, water-based, oil-based). A microdestructive sample treatment, using a scalpel to scrape 0.3 mg of ink stroke, was performed. The entire electropherogram profile allowed visual discrimination between different types of ink and brands, without the need for statistical treatment. Complete (100%) discrimination was achieved between pen technologies, brands, and models, although non-reproducible zones in the electropherograms were found for blue gel pen samples. The two different batches of blue oil-based pens were also differentiated. Thus, this method provides a simple, microdestructive, and rapid analysis of different blue pen technologies, which may complement the current analysis of questioned documents performed by forensic laboratories. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. Provision of Pre-Primary Education as a Basic Right in Tanzania: Reflections from Policy Documents

    ERIC Educational Resources Information Center

    Mtahabwa, Lyabwene

    2010-01-01

    This study sought to assess provision of pre-primary education in Tanzania as a basic right through analyses of relevant policy documents. Documents which were published over the past decade were considered, including educational policies, action plans, national papers, the "Basic Education Statistics in Tanzania" documents, strategy…

  1. [New design of the Health Survey of Catalonia (Spain, 2010-2014): a step forward in health planning and evaluation].

    PubMed

    Alcañiz-Zanón, Manuela; Mompart-Penina, Anna; Guillén-Estany, Montserrat; Medina-Bustos, Antonia; Aragay-Barbany, Josep M; Brugulat-Guiteras, Pilar; Tresserras-Gaju, Ricard

    2014-01-01

    This article presents the genesis of the Health Survey of Catalonia (Spain, 2010-2014) with its semiannual subsamples and explains the basic characteristics of its multistage sampling design. In comparison with previous surveys, the organizational advantages of this new statistical operation include rapid data availability and the ability to continuously monitor the population. The main benefits are timeliness in the production of indicators and the possibility of introducing new topics through the supplemental questionnaire as a function of needs. Limitations consist of the complexity of the sample design and the lack of longitudinal follow-up of the sample. Suitable sampling weights for each specific subsample are necessary for any statistical analysis of micro-data. Accuracy in the analysis of territorial disaggregation or population subgroups increases if annual samples are accumulated. Copyright © 2013 SESPAS. Published by Elsevier Espana. All rights reserved.

  2. Quantitative analysis of spatial variability of geotechnical parameters

    NASA Astrophysics Data System (ADS)

    Fang, Xing

    2018-04-01

    Geotechnical parameters are the basic parameters of geotechnical engineering design, and they have strong regional characteristics. At the same time, the spatial variability of geotechnical parameters has been recognized and is gradually being introduced into the reliability analysis of geotechnical engineering. Based on the statistical theory of geostatistical spatial information, the spatial variability of geotechnical parameters is quantitatively analyzed, the parameters are evaluated, and the correlation coefficients between them are calculated. A residential district of the Tianjin Survey Institute was selected as the research object. There are 68 boreholes in this area and 9 layers of mechanical stratification. The parameters are water content, natural gravity, void ratio, liquid limit, plasticity index, liquidity index, compressibility coefficient, compressive modulus, internal friction angle, cohesion and SP index. According to the principle of statistical correlation, the correlation coefficients of the geotechnical parameters are calculated, and from these coefficients the governing relations among the parameters are obtained.

  3. Singular Spectrum Analysis for Astronomical Time Series: Constructing a Parsimonious Hypothesis Test

    NASA Astrophysics Data System (ADS)

    Greco, G.; Kondrashov, D.; Kobayashi, S.; Ghil, M.; Branchesi, M.; Guidorzi, C.; Stratta, G.; Ciszak, M.; Marino, F.; Ortolan, A.

    We present a data-adaptive spectral method - Monte Carlo Singular Spectrum Analysis (MC-SSA) - and its modification to tackle astrophysical problems. Through numerical simulations we show the ability of MC-SSA to deal with 1/f β power-law noise affected by photon counting statistics. Such a noise process is simulated by a first-order autoregressive AR(1) process corrupted by intrinsic Poisson noise. In doing so, we statistically estimate a basic stochastic variation of the source and the corresponding fluctuations due to the quantum nature of light. In addition, the MC-SSA test retains its effectiveness even when a significant percentage of the signal falls below a certain level of detection, e.g., because of limited instrument sensitivity. The parsimonious approach presented here may be broadly applied, from the search for extrasolar planets to the extraction of low-intensity coherent phenomena possibly hidden in high-energy transients.
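
    The noise model described above can be imitated with a short simulation (hypothetical parameter values; Knuth's textbook Poisson sampler stands in for a library routine): an AR(1) red-noise source rate whose photon counts are drawn from a Poisson distribution.

```python
# AR(1) source-rate fluctuations corrupted by Poisson photon-counting noise.
import math
import random

def sample_poisson(lam):
    # Knuth's multiplication method; adequate for the modest rates used here
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def simulate_counts(n, phi=0.8, base_rate=50.0, sigma=5.0):
    rate, counts = base_rate, []
    for _ in range(n):
        # AR(1) fluctuation of the underlying source rate
        rate = base_rate + phi * (rate - base_rate) + random.gauss(0.0, sigma)
        counts.append(sample_poisson(max(rate, 0.0)))
    return counts
```

    Series generated this way exhibit both the red-noise autocorrelation and the count-level scatter that the MC-SSA significance test has to distinguish from genuine oscillations.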

  4. An operational definition of a statistically meaningful trend.

    PubMed

    Bryhn, Andreas C; Dimberg, Peter H

    2011-04-28

    Linear trend analysis of time series is standard procedure in many scientific disciplines. If the number of data points is large, a trend may be statistically significant even if the data are scattered far from the trend line. This study introduces and tests a quality criterion for time trends referred to as statistical meaningfulness, which is a stricter quality criterion for trends than high statistical significance. The time series is divided into intervals and interval mean values are calculated. Thereafter, r² and p values are calculated from regressions of the interval mean values on time. If r² ≥ 0.65 at p ≤ 0.05 in any of these regressions, then the trend is regarded as statistically meaningful. Out of ten investigated time series from different scientific disciplines, five displayed statistically meaningful trends. A Microsoft Excel application (add-in) was developed which can perform statistical meaningfulness tests and which may increase the operationality of the test. The presented method for distinguishing statistically meaningful trends should be reasonably uncomplicated for researchers with basic statistics skills and may thus be useful for determining which trends are worth analysing further, for instance with respect to causal factors. The method can also be used for determining which segments of a time trend may be particularly worthwhile to focus on.
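
    The criterion can be condensed into a short sketch (the p ≤ 0.05 regression check of the full method is omitted here for brevity; interval count and threshold follow the description above): split the series into intervals, average each interval, regress the interval means on time, and compare r² with 0.65.

```python
# "Statistically meaningful" trend check via interval means (r^2 part only).
def meaningful_trend(series, n_intervals=5, r2_min=0.65):
    size = len(series) // n_intervals          # trailing remainder is ignored
    means = [sum(series[i * size:(i + 1) * size]) / size
             for i in range(n_intervals)]
    t = list(range(n_intervals))
    mt, mm = sum(t) / n_intervals, sum(means) / n_intervals
    sxy = sum((a - mt) * (b - mm) for a, b in zip(t, means))
    sxx = sum((a - mt) ** 2 for a in t)
    syy = sum((b - mm) ** 2 for b in means)
    r2 = sxy ** 2 / (sxx * syy) if sxx * syy > 0 else 0.0
    return r2, r2 >= r2_min
```

    A steadily rising series passes (r² near 1), whereas pure alternating noise fails even though a long enough noisy series with a slight drift could still be "significant" in the ordinary sense.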

  5. On Learning Cluster Coefficient of Private Networks

    PubMed Central

    Wang, Yue; Wu, Xintao; Zhu, Jun; Xiang, Yang

    2013-01-01

    Enabling accurate analysis of social network data while preserving differential privacy has been challenging since graph features such as clustering coefficient or modularity often have high sensitivity, which is different from traditional aggregate functions (e.g., count and sum) on tabular data. In this paper, we treat a graph statistic as a function f and develop a divide and conquer approach to enforce differential privacy. The basic procedure of this approach is to first decompose the target computation f into several less complex unit computations f1, …, fm connected by basic mathematical operations (e.g., addition, subtraction, multiplication, division), then perturb the output of each fi with Laplace noise derived from its own sensitivity value and the distributed privacy threshold εi, and finally combine those perturbed fi as the perturbed output of computation f. We examine how various operations affect the accuracy of complex computations. When unit computations have large global sensitivity values, we enforce the differential privacy by calibrating noise based on the smooth sensitivity, rather than the global sensitivity. By doing this, we achieve the strict differential privacy guarantee with smaller magnitude noise. We illustrate our approach by using the clustering coefficient, which is a popular statistic used in social network analysis. Empirical evaluations on five real social networks and various synthetic graphs generated from three random graph models show the developed divide and conquer approach outperforms the direct approach. PMID:24429843
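
    The basic perturbation step, adding Laplace noise calibrated to a unit computation's sensitivity and privacy budget, can be sketched as follows (a generic illustration of the Laplace mechanism, not the authors' divide-and-conquer code):

```python
# Laplace mechanism: release value + Laplace(0, sensitivity / eps).
import math
import random

def laplace(scale):
    # inverse-CDF sampling; the measure-zero case u == -0.5 is ignored here
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_release(true_value, sensitivity, eps):
    return true_value + laplace(sensitivity / eps)
```

    The noise is unbiased with mean absolute deviation sensitivity/eps, so a smaller sensitivity (the point of the smooth-sensitivity calibration above) directly buys a more accurate release at the same privacy level.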

  6. Error Analysis for RADAR Neighbor Matching Localization in Linear Logarithmic Strength Varying Wi-Fi Environment

    PubMed Central

    Tian, Zengshan; Xu, Kunjie; Yu, Xiang

    2014-01-01

    This paper studies the statistical errors for the fingerprint-based RADAR neighbor matching localization with the linearly calibrated reference points (RPs) in logarithmic received signal strength (RSS) varying Wi-Fi environment. To the best of our knowledge, little comprehensive analysis work has appeared on the error performance of neighbor matching localization with respect to the deployment of RPs. However, in order to achieve the efficient and reliable location-based services (LBSs) as well as the ubiquitous context-awareness in Wi-Fi environment, much attention has to be paid to the highly accurate and cost-efficient localization systems. To this end, the statistical errors by the widely used neighbor matching localization are discussed in detail in this paper to examine the inherent mathematical relations between the localization errors and the locations of RPs by using a basic linear logarithmic strength varying model. Furthermore, based on the mathematical demonstrations and some testing results, the closed-form solutions to the statistical errors by RADAR neighbor matching localization can be an effective tool to explore alternative deployment of fingerprint-based neighbor matching localization systems in the future. PMID:24683349

  7. Error analysis for RADAR neighbor matching localization in linear logarithmic strength varying Wi-Fi environment.

    PubMed

    Zhou, Mu; Tian, Zengshan; Xu, Kunjie; Yu, Xiang; Wu, Haibo

    2014-01-01

    This paper studies the statistical errors of fingerprint-based RADAR neighbor matching localization with linearly calibrated reference points (RPs) in a logarithmic received signal strength (RSS) varying Wi-Fi environment. To the best of our knowledge, little comprehensive analysis has appeared on the error performance of neighbor matching localization with respect to the deployment of RPs. However, in order to achieve efficient and reliable location-based services (LBSs) as well as ubiquitous context-awareness in Wi-Fi environments, much attention has to be paid to highly accurate and cost-efficient localization systems. To this end, the statistical errors of the widely used neighbor matching localization are discussed in detail in this paper to examine the inherent mathematical relations between the localization errors and the locations of RPs, using a basic linear logarithmic strength varying model. Furthermore, based on the mathematical demonstrations and some testing results, the closed-form solutions to the statistical errors of RADAR neighbor matching localization can be an effective tool for exploring alternative deployments of fingerprint-based neighbor matching localization systems in the future.

  8. Hybrid Evidence Theory-based Finite Element/Statistical Energy Analysis method for mid-frequency analysis of built-up systems with epistemic uncertainties

    NASA Astrophysics Data System (ADS)

    Yin, Shengwen; Yu, Dejie; Yin, Hui; Lü, Hui; Xia, Baizhan

    2017-09-01

    Considering the epistemic uncertainties within the hybrid Finite Element/Statistical Energy Analysis (FE/SEA) model when it is used for the response analysis of built-up systems in the mid-frequency range, the hybrid Evidence Theory-based Finite Element/Statistical Energy Analysis (ETFE/SEA) model is established by introducing evidence theory. Based on the hybrid ETFE/SEA model and the sub-interval perturbation technique, the hybrid Sub-interval Perturbation and Evidence Theory-based Finite Element/Statistical Energy Analysis (SIP-ETFE/SEA) approach is proposed. In the hybrid ETFE/SEA model, the uncertainty in the SEA subsystem is modeled by a non-parametric ensemble, while the uncertainty in the FE subsystem is described by focal elements and basic probability assignments (BPA) and handled with evidence theory. Within the hybrid SIP-ETFE/SEA approach, the mid-frequency responses of interest, such as the ensemble average of the energy response and the cross-spectrum response, are calculated analytically by using the conventional hybrid FE/SEA method. Inspired by probability theory, the intervals of the mean value, variance and cumulative distribution are used to describe the distribution characteristics of mid-frequency responses of built-up systems with epistemic uncertainties. In order to alleviate the computational burden of the extreme value analysis, the sub-interval perturbation technique based on the first-order Taylor series expansion is used in the ETFE/SEA model to acquire the lower and upper bounds of the mid-frequency responses over each focal element. Three numerical examples are given to illustrate the feasibility and effectiveness of the proposed method.

  9. Recruitment and Retention of New Emergency Medical Technician (EMT)-Basics and Paramedics.

    PubMed

    Chapman, Susan A; Crowe, Remle P; Bentley, Melissa A

    2016-12-01

    The purpose of this paper is to describe factors important for the recruitment and retention of Emergency Medical Technician (EMT)-Basics and EMT-Paramedics new to the Emergency Medical Services (EMS) field (defined as two years or less of EMS employment) through an analysis of 10 years of Longitudinal EMT Attributes and Demographic Study (LEADS) data. Data were obtained from 10 years of LEADS surveys (1999-2008). Individuals new to the profession were identified through responses to a survey item. Their responses were analyzed using weights reflecting each individual's probability of selection. Means, proportions, and 95% confidence intervals (CIs) were determined and used to identify statistically significant differences. There were few changes in the demographic characteristics of new EMT-Basics and Paramedics across survey years. New EMT-Basics tended to be older and less likely to have a college degree than new EMT-Paramedics. More new EMT-Basics than EMT-Paramedics worked in rural areas and small towns and reported that they were working as a volunteer. There were differences between new EMT-Basics and EMT-Paramedics in several of the reasons for entering the profession and in facets of job satisfaction. The findings provide guidance for recruiters, educators, employers, and governmental EMS policy organizations and will provide better insight into how to attract and retain new entrants to the field. Chapman SA , Crowe RP , Bentley MA . Recruitment and retention of new Emergency Medical Technician (EMT)-Basics and Paramedics. Prehosp Disaster Med. 2016;31(Suppl. 1):s70-s86.

  10. Statistical analysis of the electric energy production from photovoltaic conversion using mobile and fixed constructions

    NASA Astrophysics Data System (ADS)

    Bugała, Artur; Bednarek, Karol; Kasprzyk, Leszek; Tomczewski, Andrzej

    2017-10-01

    The paper presents the most representative characteristics, drawn from a three-year measurement period, of daily and monthly electricity production from photovoltaic conversion using modules installed in a fixed and a 2-axis tracking construction. Results are presented for selected summer, autumn, spring and winter days. The analyzed measuring stand is located on the roof of the Faculty of Electrical Engineering building at Poznan University of Technology. Basic statistical parameters such as the mean value, standard deviation, skewness, kurtosis, median, range, and coefficient of variation were used. It was found that the asymmetry factor can be useful in the analysis of daily electricity production from photovoltaic conversion. In order to determine the repeatability of monthly electricity production between the summer months, and between the summer and winter months, the non-parametric Mann-Whitney U test was used. In order to analyze the repeatability of daily peak hours, describing the largest hourly electricity production, the non-parametric Kruskal-Wallis test was applied as an extension of the Mann-Whitney U test. Based on the analysis of the electric energy distribution from the prepared monitoring system, it was found that traditional methods of forecasting electricity production from photovoltaic conversion, such as multiple regression models, should not be the preferred methods of analysis.
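    The Mann-Whitney U statistic used above for comparing monthly production can be computed from midranks with the standard library alone. This is a minimal sketch: the daily-production figures are invented for illustration, and the significance (critical-value) lookup is omitted.

```python
def midranks(values):
    # Ranks 1..n, with tied values receiving the average of their rank positions.
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def mann_whitney_u(x, y):
    # U1 = R1 - n1(n1+1)/2; report min(U1, U2) as in the two-sided test.
    r = midranks(list(x) + list(y))
    u1 = sum(r[:len(x)]) - len(x) * (len(x) + 1) / 2.0
    return min(u1, len(x) * len(y) - u1)

# Invented daily-production figures (kWh) for two months:
june = [5.1, 6.3, 4.8, 7.0, 6.6]
july = [5.0, 6.1, 5.9, 6.8, 6.4]
u = mann_whitney_u(june, july)
```

    The Kruskal-Wallis test applied in the paper generalizes the same ranking idea to more than two groups.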

  11. On Ruch's Principle of Decreasing Mixing Distance in classical statistical physics

    NASA Astrophysics Data System (ADS)

    Busch, Paul; Quadt, Ralf

    1990-10-01

    Ruch's Principle of Decreasing Mixing Distance is reviewed as a statistical physical principle, and its basic support and geometric interpretation, the Ruch-Schranner-Seligman theorem, is generalized to be applicable to a large representative class of classical statistical systems.

  12. Analysis of the HLA population data (AHPD) submitted to the 15th International Histocompatibility/Immunogenetics Workshop by using the Gene[rate] computer tools accommodating ambiguous data (AHPD project report).

    PubMed

    Nunes, J M; Riccio, M E; Buhler, S; Di, D; Currat, M; Ries, F; Almada, A J; Benhamamouch, S; Benitez, O; Canossi, A; Fadhlaoui-Zid, K; Fischer, G; Kervaire, B; Loiseau, P; de Oliveira, D C M; Papasteriades, C; Piancatelli, D; Rahal, M; Richard, L; Romero, M; Rousseau, J; Spiroski, M; Sulcebe, G; Middleton, D; Tiercy, J-M; Sanchez-Mazas, A

    2010-07-01

    During the 15th International Histocompatibility and Immunogenetics Workshop (IHIWS), 14 human leukocyte antigen (HLA) laboratories participated in the Analysis of HLA Population Data (AHPD) project where 18 new population samples were analyzed statistically and compared with data available from previous workshops. To that aim, an original methodology was developed and used (i) to estimate frequencies by taking into account ambiguous genotypic data, (ii) to test for Hardy-Weinberg equilibrium (HWE) by using a nested likelihood ratio test involving a parameter accounting for HWE deviations, (iii) to test for selective neutrality by using a resampling algorithm, and (iv) to provide explicit graphical representations including allele frequencies and basic statistics for each series of data. A total of 66 data series (1-7 loci per population) were analyzed with this standard approach. Frequency estimates were compliant with HWE in all but one population of mixed stem cell donors. Neutrality testing confirmed the observation of heterozygote excess at all HLA loci, although a significant deviation was established in only a few cases. Population comparisons showed that HLA genetic patterns were mostly shaped by geographic and/or linguistic differentiations in Africa and Europe, but not in America where both genetic drift in isolated populations and gene flow in admixed populations led to a more complex genetic structure. Overall, a fruitful collaboration between HLA typing laboratories and population geneticists allowed finding useful solutions to the problem of estimating gene frequencies and testing basic population diversity statistics on highly complex HLA data (high numbers of alleles and ambiguities), with promising applications in either anthropological, epidemiological, or transplantation studies.

  13. Prognostic Indexes for Brain Metastases: Which Is the Most Powerful?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arruda Viani, Gustavo, E-mail: gusviani@gmail.com; Bernardes da Silva, Lucas Godoi; Stefano, Eduardo Jose

    Purpose: The purpose of the present study was to compare the prognostic indexes (PIs) of patients with brain metastases (BMs) treated with whole brain radiotherapy (WBRT) using an artificial neural network. This analysis is important, because it evaluates the prognostic power of each PI to guide clinical decision-making and outcomes research. Methods and Materials: A retrospective prognostic study was conducted of 412 patients with BMs who underwent WBRT between April 1998 and March 2010. The eligibility criteria for patients included having undergone WBRT or WBRT plus neurosurgery. The data were analyzed using the artificial neural network. The input neural data consisted of all prognostic factors included in the 5 PIs (recursive partitioning analysis, graded prognostic assessment [GPA], basic score for BMs, Rotterdam score, and Germany score). The data set was randomly divided into 300 training and 112 testing examples for survival prediction. All 5 PIs were compared using our database of 412 patients with BMs. The sensitivity of the 5 indexes in predicting survival according to their input variables was determined statistically using receiver operating characteristic curves. The importance of each variable from each PI was subsequently evaluated. Results: The overall 1-, 2-, and 3-year survival rate was 22%, 10.2%, and 5.1%, respectively. All classes of PIs were significantly associated with survival (recursive partitioning analysis, P < .0001; GPA, P < .0001; basic score for BMs, P = .002; Rotterdam score, P = .001; and Germany score, P < .0001). Comparing the areas under the curves, the GPA was statistically the most sensitive in predicting survival (GPA, 86%; recursive partitioning analysis, 81%; basic score for BMs, 79%; Rotterdam, 73%; and Germany score, 77%; P < .001). Among the variables included in each PI, the performance status and presence of extracranial metastases were the most important factors. Conclusion: A variety of prognostic models describe the survival of patients with BMs to a more or less satisfactory degree. Among the 5 PIs evaluated in the present study, the GPA was the most powerful in predicting survival. Additional studies should include emerging biologic prognostic factors to improve the sensitivity of these PIs.

  14. Lower back pain and absenteeism among professional public transport drivers.

    PubMed

    Kresal, Friderika; Roblek, Vasja; Jerman, Andrej; Meško, Maja

    2015-01-01

    Drivers in public transport are subject to lower back pain. The pain is associated with the physical position imposed on the worker while performing the job, and lower back pain is the main cause of absenteeism among drivers. The present study includes 145 public transport drivers employed as professional drivers for an average of 14.14 years. Analysis of the data obtained in the study includes basic descriptive statistics, the χ² test, and multiple regression analysis. Analysis of the incidence of lower back pain showed that the majority of our sample population suffered from pain in the lower back. We found no statistically significant differences in the incidence of lower back pain between groups formed by length of service as a professional driver. We were also interested in whether the risk factors for lower back pain affect the absenteeism of city bus drivers; analysis of the data showed that the risk factors for pain in the lower part of the spine do affect the absenteeism of city bus drivers.
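    The χ² test mentioned above compares observed counts in a contingency table with the counts expected under independence. A minimal standard-library sketch of the test statistic (the critical-value lookup is omitted, and the counts are invented for illustration):

```python
def chi_square_stat(table):
    # Pearson chi-square statistic for a contingency table of observed counts.
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    n = sum(row_tot)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            expected = row_tot[i] * col_tot[j] / n  # under independence
            stat += (obs - expected) ** 2 / expected
    return stat

# Invented 2x2 table: length of service (short/long) vs. lower back pain (yes/no).
observed = [[40, 25], [45, 35]]
stat = chi_square_stat(observed)
```

    The resulting statistic would then be compared against the χ² distribution with (rows − 1)(columns − 1) degrees of freedom.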

  15. [Basic concepts for network meta-analysis].

    PubMed

    Catalá-López, Ferrán; Tobías, Aurelio; Roqué, Marta

    2014-12-01

    Systematic reviews and meta-analyses have long been fundamental tools for evidence-based clinical practice. Initially, meta-analyses were proposed as a technique that could improve the accuracy and the statistical power of previous research from individual studies with small sample size. However, one of its main limitations has been the fact of being able to compare no more than two treatments in an analysis, even when the clinical research question necessitates that we compare multiple interventions. Network meta-analysis (NMA) uses novel statistical methods that incorporate information from both direct and indirect treatment comparisons in a network of studies examining the effects of various competing treatments, estimating comparisons between many treatments in a single analysis. Despite its potential limitations, NMA applications in clinical epidemiology can be of great value in situations where there are several treatments that have been compared against a common comparator. Also, NMA can be relevant to a research or clinical question when many treatments must be considered or when there is a mix of both direct and indirect information in the body of evidence. Copyright © 2013 Elsevier España, S.L.U. All rights reserved.

  16. Center for Prostate Disease Research

    MedlinePlus


  17. Basic Aerospace Education Library

    ERIC Educational Resources Information Center

    Journal of Aerospace Education, 1975

    1975-01-01

    Lists the most significant resource items on aerospace education which are presently available. Includes source books, bibliographies, directories, encyclopedias, dictionaries, audiovisuals, curriculum/planning guides, aerospace statistics, aerospace education statistics and newsletters. (BR)

  18. Multiple-solution problems in a statistics classroom: an example

    NASA Astrophysics Data System (ADS)

    Chu, Chi Wing; Chan, Kevin L. T.; Chan, Wai-Sum; Kwong, Koon-Shing

    2017-11-01

    The mathematics education literature shows that encouraging students to develop multiple solutions for given problems has a positive effect on students' understanding and creativity. In this paper, we present an example of a multiple-solution problem in statistics involving a set of non-traditional dice. In particular, we consider the exact probability mass function for the sum of face values. Four different ways of solving the problem are discussed. The solutions span various basic concepts in different mathematical disciplines (the sample space in probability theory, the probability generating function in statistics, integer partitions in basic combinatorics, and the individual risk model in actuarial science) and thus promote upper undergraduate students' awareness of knowledge connections between their courses. All solutions of the example are implemented using the R statistical software package.
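    The sample-space solution described above can be sketched briefly in Python. The paper's actual dice are not reproduced here, so a Sicherman-style pair is assumed purely for illustration:

```python
from collections import Counter
from fractions import Fraction
from itertools import product

# Two non-traditional dice; this Sicherman-style pair reproduces the
# sum distribution of two standard dice.
dice = [[1, 2, 2, 3, 3, 4], [1, 3, 4, 5, 6, 8]]

# Enumerate the full sample space and count each sum of face values.
counts = Counter(sum(faces) for faces in product(*dice))
total = 1
for d in dice:
    total *= len(d)

# Exact probability mass function as rational numbers.
pmf = {s: Fraction(c, total) for s, c in sorted(counts.items())}
```

    The generating-function solution corresponds to multiplying the polynomials whose exponents are the face values; the brute-force enumeration above computes the same coefficients.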

  19. Emergent irreversibility and entanglement spectrum statistics

    NASA Astrophysics Data System (ADS)

    Mucciolo, Eduardo; Chamon, Claudio; Hamma, Alioscia

    2014-03-01

    We study the problem of irreversibility when the dynamical evolution of a many-body system is described by a stochastic quantum circuit. Such evolution is more general than Hamiltonian evolution, and since energy levels are not well defined, the well-established connection between the statistical fluctuations of the energy spectrum and irreversibility cannot be made. We show that the entanglement spectrum provides a more general connection. Irreversibility is marked by a failure of a disentangling algorithm and is preceded by the appearance of Wigner-Dyson statistical fluctuations in the entanglement spectrum. This analysis can be done at the wavefunction level and offers a new route to studying quantum chaos and quantum integrability. We acknowledge financial support from the U.S. National Science Foundation through grants CCF 1116590 and CCF 1117241, from the National Basic Research Program of China through grants 2011CBA00300 and 2011CBA00301, and from the National Natural Science Foundation of China.

  20. Global Statistics of Bolides in the Terrestrial Atmosphere

    NASA Astrophysics Data System (ADS)

    Chernogor, L. F.; Shevelyov, M. B.

    2017-06-01

    Purpose: Evaluation and analysis of the distribution of the number of meteoroid (mini-asteroid) falls as a function of glow energy, velocity, altitude of maximum glow, and geographic coordinates. Design/methodology/approach: The satellite database on the glow of 693 mini-asteroids decelerated in the terrestrial atmosphere has been used to evaluate basic meteoroid statistics. Findings: A rapid decrease in the number of asteroids with increasing glow energy is confirmed. The average speed of the celestial bodies is about 17.9 km/s. The altitude of maximum glow is most often 30-40 km. The distribution of the number of meteoroids entering the terrestrial atmosphere in longitude and latitude (after excluding the component of the latitudinal dependence due to geometry) is approximately uniform. Conclusions: Using a sufficiently large database of measurements, the meteoroid (mini-asteroid) statistics have been evaluated.

  1. 'Chain pooling' model selection as developed for the statistical analysis of a rotor burst protection experiment

    NASA Technical Reports Server (NTRS)

    Holms, A. G.

    1977-01-01

    A statistical decision procedure called chain pooling had been developed for model selection in fitting the results of a two-level fixed-effects full or fractional factorial experiment without replication. The basic strategy included the use of one nominal level of significance for a preliminary test and a second nominal level of significance for the final test. The subject has been reexamined from the point of view of using as many as three successive statistical model deletion procedures in fitting the results of a single experiment. The investigation consisted of random-number studies intended to simulate the results of a proposed aircraft turbine-engine rotor-burst-protection experiment. As a conservative approach, population model coefficients were chosen to represent a saturated 2^4 experiment with a distribution of parameter values unfavorable to the decision procedures. Three model selection strategies were developed.

  2. Imperial College near infrared spectroscopy neuroimaging analysis framework.

    PubMed

    Orihuela-Espina, Felipe; Leff, Daniel R; James, David R C; Darzi, Ara W; Yang, Guang-Zhong

    2018-01-01

    This paper describes the Imperial College near infrared spectroscopy neuroimaging analysis (ICNNA) software tool for functional near infrared spectroscopy neuroimaging data. ICNNA is a MATLAB-based object-oriented framework encompassing an application programming interface and a graphical user interface. ICNNA incorporates reconstruction based on the modified Beer-Lambert law and basic processing and data validation capabilities. Emphasis is placed on the full experiment rather than individual neuroimages as the central element of analysis. The software offers three types of analyses including classical statistical methods based on comparison of changes in relative concentrations of hemoglobin between the task and baseline periods, graph theory-based metrics of connectivity and, distinctively, an analysis approach based on manifold embedding. This paper presents the different capabilities of ICNNA in its current version.

  3. The development and discussion of computerized visual perception assessment tool for Chinese characters structures - Concurrent estimation of the overall ability and the domain ability in item response theory approach.

    PubMed

    Wu, Huey-Min; Lin, Chin-Kai; Yang, Yu-Mao; Kuo, Bor-Chen

    2014-11-12

    Visual perception is the fundamental skill required for a child to recognize words, and to read and write. There was no visual perception assessment tool developed for preschool children based on Chinese characters in Taiwan. The purposes were to develop the computerized visual perception assessment tool for Chinese Characters Structures and to explore the psychometrical characteristic of assessment tool. This study adopted purposive sampling. The study evaluated 551 kindergarten-age children (293 boys, 258 girls) ranging from 46 to 81 months of age. The test instrument used in this study consisted of three subtests and 58 items, including tests of basic strokes, single-component characters, and compound characters. Based on the results of model fit analysis, the higher-order item response theory was used to estimate the performance in visual perception, basic strokes, single-component characters, and compound characters simultaneously. Analyses of variance were used to detect significant difference in age groups and gender groups. The difficulty of identifying items in a visual perception test ranged from -2 to 1. The visual perception ability of 4- to 6-year-old children ranged from -1.66 to 2.19. Gender did not have significant effects on performance. However, there were significant differences among the different age groups. The performance of 6-year-olds was better than that of 5-year-olds, which was better than that of 4-year-olds. This study obtained detailed diagnostic scores by using a higher-order item response theory model to understand the visual perception of basic strokes, single-component characters, and compound characters. Further statistical analysis showed that, for basic strokes and compound characters, girls performed better than did boys; there also were differences within each age group. For single-component characters, there was no difference in performance between boys and girls. 
However, again the performance of 6-year-olds was better than that of 4-year-olds, and there were no statistical differences between the performance of 5-year-olds and 6-year-olds. The basic strokes, single-component characters, and compound characters tests had good reliability and validity; therefore, the tool can be applied to diagnose visual perception problems at preschool age. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. Descriptive statistics: the specification of statistical measures and their presentation in tables and graphs. Part 7 of a series on evaluation of scientific publications.

    PubMed

    Spriestersbach, Albert; Röhrig, Bernd; du Prel, Jean-Baptist; Gerhold-Ay, Aslihan; Blettner, Maria

    2009-09-01

    Descriptive statistics are an essential part of biometric analysis and a prerequisite for the understanding of further statistical evaluations, including the drawing of inferences. When data are well presented, it is usually obvious whether the author has collected and evaluated them correctly and in keeping with accepted practice in the field. Statistical variables in medicine may be of either the metric (continuous, quantitative) or categorical (nominal, ordinal) type. Easily understandable examples are given. Basic techniques for the statistical description of collected data are presented and illustrated with examples. The goal of a scientific study must always be clearly defined. The definition of the target value or clinical endpoint determines the level of measurement of the variables in question. Nearly all variables, whatever their level of measurement, can be usefully presented graphically and numerically. The level of measurement determines what types of diagrams and statistical values are appropriate. There are also different ways of presenting combinations of two independent variables graphically and numerically. The description of collected data is indispensable. If the data are of good quality, valid and important conclusions can already be drawn when they are properly described. Furthermore, data description provides a basis for inferential statistics.
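    The distinction drawn above between metric and categorical variables determines which descriptive measures are appropriate; a brief Python illustration with invented values:

```python
import statistics
from collections import Counter

# Metric (continuous) variable: mean, median and standard deviation are meaningful.
systolic_bp = [118, 121, 125, 130, 135, 142, 150]  # mmHg, invented values
mean_bp = statistics.mean(systolic_bp)
median_bp = statistics.median(systolic_bp)    # robust to outliers
sd_bp = statistics.stdev(systolic_bp)         # sample standard deviation

# Categorical (nominal) variable: only counts and the mode are meaningful;
# a "mean blood group" would be nonsense.
blood_groups = ["A", "0", "B", "A", "A", "AB", "0"]
mode_group = statistics.mode(blood_groups)
frequencies = Counter(blood_groups)
```

    Ordinal variables sit in between: the median and percentiles apply, but the mean generally does not.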

  5. 3D Mueller-matrix mapping of biological optically anisotropic networks

    NASA Astrophysics Data System (ADS)

    Ushenko, O. G.; Ushenko, V. O.; Bodnar, G. B.; Zhytaryuk, V. G.; Prydiy, O. G.; Koval, G.; Lukashevich, I.; Vanchuliak, O.

    2018-01-01

    The paper consists of two parts. The first part presents the theoretical basics of the method of azimuthally invariant Mueller-matrix description of the optical anisotropy of biological tissues. Experimentally measured coordinate distributions of the Mueller-matrix invariants (MMI) of linear and circular birefringence of skeletal muscle tissue are provided, and the values of the statistical moments characterizing the distributions of the amplitudes of wavelet coefficients of the MMI at different scanning scales are defined. The second part presents a statistical analysis of the distributions of the amplitudes of wavelet coefficients of the linear birefringence distributions of myocardium tissue from subjects who died of infarction and of ischemic heart disease, and defines objective criteria for differentiating the cause of death.

  6. Multiscale polarization diagnostics of birefringent networks in problems of necrotic changes diagnostics

    NASA Astrophysics Data System (ADS)

    Sakhnovskiy, M. Yu.; Ushenko, Yu. O.; Ushenko, V. O.; Besaha, R. N.; Pavlyukovich, N.; Pavlyukovich, O.

    2018-01-01

    The paper consists of two parts. The first part presents the theoretical basics of the method of azimuthally invariant Mueller-matrix description of the optical anisotropy of biological tissues. Experimentally measured coordinate distributions of the Mueller-matrix invariants (MMI) of linear and circular birefringence of skeletal muscle tissue are provided, and the values of the statistical moments characterizing the distributions of the amplitudes of wavelet coefficients of the MMI at different scanning scales are defined. The second part presents a statistical analysis of the distributions of the amplitudes of wavelet coefficients of the linear birefringence distributions of myocardium tissue from subjects who died of infarction and of ischemic heart disease, and defines objective criteria for differentiating the cause of death.

  7. Comments on statistical issues in numerical modeling for underground nuclear test monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nicholson, W.L.; Anderson, K.K.

    1993-03-01

    The Symposium concluded with prepared summaries by four experts in the disciplines involved. These experts made no mention of statistics or the statistical content of issues. The first author contributed an extemporaneous statement at the Symposium because there are important issues associated with conducting and evaluating numerical modeling that are familiar to statisticians and often treated successfully by them. This note expands upon these extemporaneous remarks. Statistical ideas may be helpful in resolving some numerical modeling issues. Specifically, we comment first on the role of statistical design/analysis in the quantification process to answer the question "what do we know about the numerical modeling of underground nuclear tests?" and second on the peculiar nature of uncertainty analysis for situations involving numerical modeling. The simulations described in the workshop, though associated with topic areas, were basically sets of examples. Each simulation was tuned toward agreeing with either empirical evidence or an expert's opinion of what empirical evidence would be. While the discussions were reasonable, whether the embellishments were correct or a forced fitting of reality is unclear, which illustrates that "simulation is easy." We also suggest that these examples of simulation are typical, and that the questions concerning legitimacy and the role of knowing the reality are fair, in general, with respect to simulation. The answers will help us understand why "prediction is difficult."

  8. 29 CFR 1904.42 - Requests from the Bureau of Labor Statistics for data.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 5 2010-07-01 2010-07-01 false Requests from the Bureau of Labor Statistics for data. 1904... Statistics for data. (a) Basic requirement. If you receive a Survey of Occupational Injuries and Illnesses Form from the Bureau of Labor Statistics (BLS), or a BLS designee, you must promptly complete the form...

  9. The horse-collar aurora - A frequent pattern of the aurora in quiet times

    NASA Technical Reports Server (NTRS)

    Hones, E. W., Jr.; Craven, J. D.; Frank, L. A.; Evans, D. S.; Newell, P. T.

    1989-01-01

    The frequent appearance of the 'horse-collar aurora' pattern in quiet-time DE 1 images is reported, presenting a two-hour image sequence that displays the basic features and shows that it sometimes evolves toward the theta configuration. There is some evidence for interplanetary magnetic field B(y) influence on the temporal development of the pattern. A preliminary statistical analysis finds the pattern appearing in one-third or more of the image sequences recorded during quiet times.

  10. Design and Analysis of A Multi-Backend Database System for Performance Improvement, Functionality Expansion and Capacity Growth. Part II.

    DTIC Science & Technology

    1981-08-01

    From the table of contents: …of Transactions; 5.5.2 Attached Execution of Transactions; 5.5.3 The Choice of Transaction Execution for Access Control. Text fragments describe the basic access control mechanism for statistical security and value-dependent security; Section 5.5 describes the process of executing transactions, and the process of request execution with access control for insert and non-insert requests in MDBS (see Chapter 4).

  11. Mechanics, Waves and Thermodynamics

    NASA Astrophysics Data System (ADS)

    Ranjan Jain, Sudhir

    2016-05-01

    Figures; Preface; Acknowledgement; 1. Energy, mass, momentum; 2. Kinematics, Newton's laws of motion; 3. Circular motion; 4. The principle of least action; 5. Work and energy; 6. Mechanics of a system of particles; 7. Friction; 8. Impulse and collisions; 9. Central forces; 10. Dimensional analysis; 11. Oscillations; 12. Waves; 13. Sound of music; 14. Fluid mechanics; 15. Water waves; 16. The kinetic theory of gases; 17. Concepts and laws of thermodynamics; 18. Some applications of thermodynamics; 19. Basic ideas of statistical mechanics; Bibliography; Index.

  12. Basic Student Charges at Postsecondary Institutions: Academic Year 1994-95. Tuition and Required Fees and Room and Board Charges at 4-Year, 2-Year, and Public Less-Than-2-Year Institutions. Statistical Analysis Report.

    ERIC Educational Resources Information Center

    Barbett, Samuel F.; And Others

    This document lists the typical tuition and required fees and room and board charges assessed to college students in 1994-95 based on a national "Institutional Characteristics" survey which is part of the Integrated Postsecondary Education Data System. The data were collected from over 5,000 of the 5,775 4-year, 2-year, and public…

  13. Understanding Summary Statistics and Graphical Techniques to Compare Michael Jordan versus LeBron James

    ERIC Educational Resources Information Center

    Williams, Immanuel James; Williams, Kelley Kim

    2016-01-01

    Understanding summary statistics and graphical techniques is a building block for comprehending concepts beyond basic statistics. Motivated students are known to perform better in school. Using examples that students find engaging allows them to understand the concepts at a deeper level.

  14. Financial statistics for public health dispensary decisions in Nigeria: insights on standard presentation typologies.

    PubMed

    Agundu, Prince Umor C

    2003-01-01

    Public health dispensaries in Nigeria have in recent times demonstrated the poise to boost corporate productivity in the new millennium and to drive the nation closer to concretising the lofty goal of health-for-all. This is very pronounced considering the face-lift given to the physical environment, the increase in the recruitment and development of professionals, and the upward review of financial subventions. However, there is little or no emphasis on basic statistical appreciation/application, which enhances the decision-making ability of corporate executives. This study used the responses of 120 senior public health officials in Nigeria and analyzed them with the chi-square statistical technique. The results established low statistical aptitude, inadequate statistical training programmes, and little/no emphasis on statistical literacy compared to computer literacy, amongst others. Consequently, it was recommended that these lapses be promptly addressed to enhance executive performance in the establishments. Basic statistical data presentation typologies have been articulated in this study to serve as first-aid instructions to the target group, as they represent the contributions of eminent scholars in this area of intellectualism.
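    The chi-square technique mentioned above can be illustrated with a small sketch. The 2x2 counts below are invented for demonstration (they are not the study's data), cross-tabulating whether officials received statistical training against their measured aptitude:

```python
# Hypothetical 2x2 chi-square test of independence, illustrating the kind of
# analysis the study describes. All counts are invented for demonstration.

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 contingency table
    [[a, b], [c, d]] (no continuity correction)."""
    n = a + b + c + d
    denom = (a + b) * (c + d) * (a + c) * (b + d)
    return n * (a * d - b * c) ** 2 / denom

# rows: statistical training received (yes/no); cols: aptitude (high/low)
chi2 = chi_square_2x2(12, 28, 52, 28)
print(f"chi2 = {chi2:.3f}")
# With 1 degree of freedom the 5% critical value is 3.841, so this
# hypothetical association would be judged statistically significant.
```

A value above the critical threshold would support the study's conclusion that training and aptitude are associated; the real analysis would of course use the surveyed counts.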

  15. Grouping patients for masseter muscle genotype-phenotype studies.

    PubMed

    Moawad, Hadwah Abdelmatloub; Sinanan, Andrea C M; Lewis, Mark P; Hunt, Nigel P

    2012-03-01

    The aim was to use various facial classifications, based on vertical and/or horizontal facial criteria, to assess their effects on the interpretation of masseter muscle (MM) gene expression. Fresh MM biopsies were obtained from 29 patients (age, 16-36 years) with various facial phenotypes. Based on clinical and cephalometric analysis, patients were grouped using three different classifications: (1) basic vertical, (2) basic horizontal, and (3) combined vertical and horizontal. Gene expression levels of the myosin heavy chain genes MYH1, MYH2, MYH3, MYH6, MYH7, and MYH8 were recorded using quantitative reverse transcriptase polymerase chain reaction (RT-PCR) and were related to the various classifications. The significance level for statistical analysis was set at P ≤ .05. Using classification 1, none of the MYH genes were found to be significantly different between long face (LF) patients and the average vertical group. Using classification 2, the MYH3, MYH6, and MYH7 genes were found to be significantly upregulated in retrognathic patients compared with the prognathic and average horizontal groups. Using classification 3, only the MYH7 gene was found to be significantly upregulated in retrognathic LF compared with prognathic LF, prognathic average vertical faces, and average vertical and horizontal groups. The use of basic vertical or basic horizontal facial classifications alone may not be sufficient for genetics-based studies of facial phenotypes. Prognathic and retrognathic facial phenotypes have different MM gene expression; therefore, it is not recommended to combine them into one single group, even though they may have a similar vertical facial phenotype.

  16. Assessment and prediction of inter-joint upper limb movement correlations based on kinematic analysis and statistical regression

    NASA Astrophysics Data System (ADS)

    Toth-Tascau, Mirela; Balanean, Flavia; Krepelka, Mircea

    2013-10-01

    Musculoskeletal impairment of the upper limb can cause difficulties in performing basic daily activities. Three-dimensional motion analysis can provide valuable data for precisely determining arm movement and inter-joint coordination. The purpose of this study was to develop a method to evaluate the degree of impairment based on the influence of shoulder movements on the amplitude of elbow flexion and extension, under the assumption that a lack of motion of the elbow joint will be compensated by increased shoulder activity. In order to develop and validate a statistical model, one healthy young volunteer was involved in the study. The activity of choice simulated blowing the nose: starting from a slight flexion of the elbow, raising the hand until the middle finger touches the tip of the nose, and returning to the start position. Inter-joint coordination between the elbow and shoulder movements showed significant correlation. Statistical regression was used to fit an equation model describing the influence of shoulder movements on elbow mobility. The study provides a brief description of the kinematic analysis protocol and statistical models that may be useful in describing the relation between inter-joint movements during daily activities.

  17. An Integrated Analysis of the Physiological Effects of Space Flight: Executive Summary

    NASA Technical Reports Server (NTRS)

    Leonard, J. I.

    1985-01-01

    A large array of models was applied in a unified manner to solve problems in space flight physiology. Mathematical simulation was used as an alternative way of looking at physiological systems and maximizing the yield from previous space flight experiments. A medical data analysis system was created which consists of an automated data base, a computerized biostatistical and data analysis system, and a set of simulation models of physiological systems. Five basic models were employed: (1) a pulsatile cardiovascular model; (2) a respiratory model; (3) a thermoregulatory model; (4) a circulatory, fluid, and electrolyte balance model; and (5) an erythropoiesis regulatory model. Algorithms were provided to perform routine statistical tests, multivariate analysis, nonlinear regression analysis, and autocorrelation analysis. Special-purpose programs were prepared for rank correlation, factor analysis, and the integration of the metabolic balance data.

  18. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Descriptive and analytical techniques for NASA trend analysis applications are presented in this standard. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. This document should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend analysis is neither a precise term nor a circumscribed methodology: it generally connotes quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this document. The basic ideas needed for qualitative and quantitative assessment of trends along with relevant examples are presented.
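    The model fitting the standard calls for (linear, quadratic, and exponential trends) can be sketched in a few lines. The time series below is invented; the exponential model is fitted as a straight line in log-space, a common simplification:

```python
# A minimal sketch of trend fitting with linear, quadratic, and exponential
# models, as described in the standard. The time series is invented.
import numpy as np

t = np.arange(1, 9, dtype=float)                             # time index
y = np.array([2.1, 2.9, 4.2, 6.0, 8.4, 12.1, 17.0, 24.3])   # invented data

lin = np.polyfit(t, y, 1)        # y ~ a*t + b
quad = np.polyfit(t, y, 2)       # y ~ a*t^2 + b*t + c
# Exponential y ~ A*exp(k*t), fitted by ordinary least squares in log-space.
k, logA = np.polyfit(t, np.log(y), 1)

sse = {}
for name, pred in [("linear", np.polyval(lin, t)),
                   ("quadratic", np.polyval(quad, t)),
                   ("exponential", np.exp(logA) * np.exp(k * t))]:
    sse[name] = float(np.sum((y - pred) ** 2))
    print(f"{name:12s} SSE = {sse[name]:.2f}")
```

Comparing the residual sums of squares indicates which of the three candidate trend models best describes the series; here the data were constructed to grow roughly geometrically, so the exponential model wins.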

  19. PCA as a practical indicator of OPLS-DA model reliability.

    PubMed

    Worley, Bradley; Powers, Robert

    Principal Component Analysis (PCA) and Orthogonal Projections to Latent Structures Discriminant Analysis (OPLS-DA) are powerful statistical modeling tools that provide insights into separations between experimental groups based on high-dimensional spectral measurements from NMR, MS or other analytical instrumentation. However, when used without validation, these tools may lead investigators to statistically unreliable conclusions. This danger is especially real for Partial Least Squares (PLS) and OPLS, which aggressively force separations between experimental groups. As a result, OPLS-DA is often used as an alternative method when PCA fails to expose group separation, but this practice is highly dangerous. Without rigorous validation, OPLS-DA can easily yield statistically unreliable group separation. A Monte Carlo analysis of PCA group separations and OPLS-DA cross-validation metrics was performed on NMR datasets with statistically significant separations in scores-space. A linearly increasing amount of Gaussian noise was added to each data matrix followed by the construction and validation of PCA and OPLS-DA models. With increasing added noise, the PCA scores-space distance between groups rapidly decreased and the OPLS-DA cross-validation statistics simultaneously deteriorated. A decrease in correlation between the estimated loadings (added noise) and the true (original) loadings was also observed. While the validity of the OPLS-DA model diminished with increasing added noise, the group separation in scores-space remained basically unaffected. Supported by the results of Monte Carlo analyses of PCA group separations and OPLS-DA cross-validation metrics, we provide practical guidelines and cross-validatory recommendations for reliable inference from PCA and OPLS-DA models.
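    The Monte Carlo idea described above can be sketched numerically (this is not the paper's code): add increasing Gaussian noise to a two-group data matrix and watch the between-group distance in PCA scores-space shrink relative to the overall spread. The group sizes, dimensionality, and noise levels below are arbitrary choices for illustration:

```python
# Toy Monte Carlo sketch: PCA group separation degrades as Gaussian noise
# is added to a two-group "spectral" data matrix. All parameters invented.
import numpy as np

rng = np.random.default_rng(0)

def pca_scores(X, n_components=2):
    """Project rows of X onto the top principal components (via SVD)."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

# Two well-separated groups in a 20-dimensional feature space.
base = np.vstack([rng.normal(0.0, 1.0, size=(30, 20)),
                  rng.normal(4.0, 1.0, size=(30, 20))])
labels = np.array([0] * 30 + [1] * 30)

separations = []
for noise_sd in (0.0, 2.0, 8.0):
    X = base + rng.normal(0.0, noise_sd, size=base.shape)
    scores = pca_scores(X)
    d = np.linalg.norm(scores[labels == 0].mean(0) - scores[labels == 1].mean(0))
    separations.append(d / scores.std())   # distance scaled by scores spread
    print(f"noise sd {noise_sd:4.1f}: scaled group separation = {separations[-1]:.2f}")
```

The scaled separation shrinks as noise grows, mirroring the paper's observation that PCA scores-space distance deteriorates together with OPLS-DA cross-validation statistics.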

  20. Outcome analysis of hemoglobin A1c, weight, and blood pressure in a VA diabetes education program.

    PubMed

    North, Susan L; Palmer, Glen A

    2015-01-01

    To determine the effect of a specific diabetes education class (Basics) on hemoglobin A1c values, weight, and systolic blood pressure. In this retrospective study, the researchers compared 2 groups of male veterans with a recent diagnosis of type 2 diabetes. One group received diabetes group education (n = 175) over a 4-month period, and the other received standard diabetes management follow-up (n = 184). Outpatient clinic setting in the Midwest. Basics class compared with standard level of care. Pre- and post-laboratory values for hemoglobin A1c, weight, and systolic blood pressure. Multivariate analysis of covariance and follow-up univariate statistics for significant differences. Findings revealed significant differences in hemoglobin A1c (P < .001) and weight (P < .001) in the treatment group compared with the control group. No significant difference was found in systolic blood pressure readings between the 2 groups. There was a significant difference in weight change between groups, with the treatment group demonstrating greater weight loss. There was an association between participation in the Basics diabetes education curriculum and reduction of hemoglobin A1c values. Some participants also had added benefit of significant weight loss. Published by Elsevier Inc.

  1. Educating the Educator: U.S. Government Statistical Sources for Geographic Research and Teaching.

    ERIC Educational Resources Information Center

    Fryman, James F.; Wilkinson, Patrick J.

    Appropriate for college geography students and researchers, this paper briefly introduces basic federal statistical publications and corresponding finding aids. General references include "Statistical Abstract of the United States," and three complementary publications: "County and City Data Book,""State and Metropolitan Area Data Book," and…

  2. Statistical Cost Estimation in Higher Education: Some Alternatives.

    ERIC Educational Resources Information Center

    Brinkman, Paul T.; Niwa, Shelley

    Recent developments in econometrics that are relevant to the task of estimating costs in higher education are reviewed. The relative effectiveness of alternative statistical procedures for estimating costs are also tested. Statistical cost estimation involves three basic parts: a model, a data set, and an estimation procedure. Actual data are used…

  3. Statistical Significance Testing in Second Language Research: Basic Problems and Suggestions for Reform

    ERIC Educational Resources Information Center

    Norris, John M.

    2015-01-01

    Traditions of statistical significance testing in second language (L2) quantitative research are strongly entrenched in how researchers design studies, select analyses, and interpret results. However, statistical significance tests using "p" values are commonly misinterpreted by researchers, reviewers, readers, and others, leading to…

  4. Ethical Statistics and Statistical Ethics: Making an Interdisciplinary Module

    ERIC Educational Resources Information Center

    Lesser, Lawrence M.; Nordenhaug, Erik

    2004-01-01

    This article describes an innovative curriculum module the first author created on the two-way exchange between statistics and applied ethics. The module, having no particular mathematical prerequisites beyond high school algebra, is part of an undergraduate interdisciplinary ethics course which begins with a 3-week introduction to basic applied…

  5. Using SERVQUAL and Kano research techniques in a patient service quality survey.

    PubMed

    Christoglou, Konstantinos; Vassiliadis, Chris; Sigalas, Ioakim

    2006-01-01

    This article presents the results of a service quality study. After an introduction to the SERVQUAL and Kano research techniques, a Kano analysis of 75 patients from the General Hospital of Katerini in Greece is presented. Service quality was assessed using satisfaction and dissatisfaction indices. The results of the Kano statistical analysis strengthened the hypothesis of previous research regarding the importance of personal knowledge, the courtesy of the hospital employees and their ability to convey trust and confidence (the assurance dimension). Managerial suggestions are made regarding the best way of acting toward and approaching hospital patients, based on the basic SERVQUAL model.

  6. Evaluation of SLAR and thematic mapper MSS data for forest cover mapping using computer-aided analysis techniques

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M. (Principal Investigator)

    1980-01-01

    Several possibilities were considered for defining the data set in which the same test areas could be used for each of the four different spatial resolutions being evaluated. The LARSYS CLUSTER was used to sort the vectors into spectral classes to reduce the within-spectral class variability in an effort to develop training statistics. A data quality test was written to determine the basic signal to noise characteristics within the data set being used. Because preliminary analysis of the LANDSAT MSS data revealed the presence of high cirrus clouds, other data sets are being sought.

  7. Quantitative research.

    PubMed

    Watson, Roger

    2015-04-01

    This article describes the basic tenets of quantitative research. The concepts of dependent and independent variables are addressed and the concept of measurement and its associated issues, such as error, reliability and validity, are explored. Experiments and surveys – the principal research designs in quantitative research – are described and key features explained. The importance of the double-blind randomised controlled trial is emphasised, alongside the importance of longitudinal surveys, as opposed to cross-sectional surveys. Essential features of data storage are covered, with an emphasis on safe, anonymous storage. Finally, the article explores the analysis of quantitative data, considering what may be analysed and the main uses of statistics in analysis.

  8. Statistics Canada's Definition and Classification of Postsecondary and Adult Education Providers in Canada. Culture, Tourism and the Centre for Education Statistics. Research Paper. Catalogue no. 81-595-M No. 071

    ERIC Educational Resources Information Center

    Orton, Larry

    2009-01-01

    This document outlines the definitions and the typology now used by Statistics Canada's Centre for Education Statistics to identify, classify and delineate the universities, colleges and other providers of postsecondary and adult education in Canada for which basic enrollments, graduates, professors and finance statistics are produced. These new…

  9. Building Capacity for Developing Statistical Literacy in a Developing Country: Lessons Learned from an Intervention

    ERIC Educational Resources Information Center

    North, Delia; Gal, Iddo; Zewotir, Temesgen

    2014-01-01

    This paper aims to contribute to the emerging literature on capacity-building in statistics education by examining issues pertaining to the readiness of teachers in a developing country to teach basic statistical topics. The paper reflects on challenges and barriers to building statistics capacity at grass-roots level in a developing country,…

  10. A model to characterize psychopathological features in adults with Prader-Willi syndrome.

    PubMed

    Thuilleaux, Denise; Laurier, Virginie; Copet, Pierre; Tricot, Julie; Demeer, Geneviève; Mourre, Fabien; Tauber, Maithé; Jauregi, Joseba

    2018-01-01

    A high prevalence of behavioral and psychiatric disorders in adults with Prader-Willi Syndrome (PWS) has been reported in the last few years. However, data are confusing and often contradictory. In this article, we propose a model to achieve a better understanding of the psychopathological features of adults with PWS. The study is based on clinical observations of 150 adult inpatients, males and females. Non-parametric statistics were performed to analyse the association of psychopathological profiles with genotype, gender and age. We propose a model of psychiatric disorders in adults with PWS based on cognitive, emotional and behavioural issues. This model defines four psychopathological profiles: Basic, Impulsive, Compulsive, and Psychotic. The Basic profile is defined by traits and symptoms that are present in varying degrees in all persons with PWS. In our cohort, this Basic profile corresponds to 55% of the patients. The rest show, in addition to these characteristics, salient features of impulsivity (Impulsive profile, 19%), compulsivity (Compulsive profile, 7%), or psychosis (Psychotic profile, 19%). The analysis of factors associated with the different profiles reveals an effect of genotype on the Basic and Psychotic profiles (deletion: 70% Basic, 9% Psychotic; non-deletion: 23% Basic, 43% Psychotic) and a positive correlation between male sex and impulsivity, unmediated by sex hormone treatment. This is a clinical study based on observation, proposing an original model for understanding the psychiatric and behavioural disorders of adults with PWS. Further studies are needed to test the validity of this model. © 2017 Wiley Periodicals, Inc.

  11. County-by-County Financial and Staffing I-M-P-A-C-T. FY 1994-95 Basic Education Program.

    ERIC Educational Resources Information Center

    North Carolina State Dept. of Public Instruction, Raleigh.

    This publication provides the basic statistics needed to illustrate the impact of North Carolina's Basic Education Program (BEP), an educational reform effort begun in 1985. Over 85% of the positions in the BEP are directly related to teaching and student-related activities. The new BEP programs result in smaller class sizes in kindergartens and…

  12. Development of a funding, cost, and spending model for satellite projects

    NASA Technical Reports Server (NTRS)

    Johnson, Jesse P.

    1989-01-01

    The need for a predictive budget/funding model is obvious. The current models used by the Resource Analysis Office (RAO) are used to predict the total costs of satellite projects. An effort was conducted to extend the modeling capabilities from total budget analysis to analysis of total budget and budget outlays over time. A statistically based, data-driven methodology was used to derive and develop the model. The budget data for the last 18 GSFC-sponsored satellite projects were analyzed and used to build a funding model describing the historical spending patterns. The raw data consisted of dollars spent in each specific year and their 1989-dollar equivalents. These data were converted to the standard format used by the RAO group and placed in a database. A simple statistical analysis was performed to calculate the gross statistics associated with project length and project cost and the conditional statistics on project length and project cost. The modeling approach used is derived from the theory of embedded statistics, which states that properly analyzed data will produce the underlying generating function. The process of funding large-scale projects over extended periods of time is described by Life Cycle Cost Models (LCCM). The data were analyzed to find a model in the generic form of an LCCM. The model developed is based on a Weibull function whose parameters are found by both nonlinear optimization and nonlinear regression. In order to use this model it is necessary to transform the problem from a dollar/time space to a percentage-of-total-budget/time space. This transformation is equivalent to moving to a probability space. By using the basic rules of probability, the validity of both the optimization and the regression steps is ensured. This statistically significant model is then integrated and inverted. The resulting output represents a project schedule relating the amount of money spent to the percentage of project completion.
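    A Weibull spending profile of the kind described can be sketched as follows. This is not RAO's model: the yearly cumulative-spend fractions are invented, and instead of nonlinear optimization the sketch uses the standard log-log linearization of the Weibull CDF F(t) = 1 - exp(-(t/lam)**k), so plain least squares suffices:

```python
# Sketch: fit a Weibull cumulative-spending profile to invented outlay data
# via the log-log linearization ln(-ln(1-F)) = k*ln(t) - k*ln(lam).
import numpy as np

# Invented data: fraction of total budget spent by the end of each year.
t = np.array([1, 2, 3, 4, 5, 6], dtype=float)
F = np.array([0.05, 0.20, 0.45, 0.70, 0.88, 0.97])

y = np.log(-np.log(1.0 - F))          # linear in ln(t)
k, intercept = np.polyfit(np.log(t), y, 1)
lam = np.exp(-intercept / k)          # recover the scale parameter

fitted = 1.0 - np.exp(-(t / lam) ** k)
max_err = float(np.max(np.abs(fitted - F)))
print(f"shape k = {k:.2f}, scale lam = {lam:.2f}, max abs error = {max_err:.3f}")
```

Inverting the fitted curve, as the abstract describes, would then map a spending fraction back to elapsed project time, i.e. a schedule relating money spent to percentage of completion.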

  13. [Training programs for staff at local Infectious Disease Surveillance Centers: the needs and usefulness].

    PubMed

    Suzuki, Tomoyuki; Kamiya, Nobuyuki; Yahata, Yuichiro; Ozeki, Yukie; Kishimoto, Tsuyoshi; Nadaoka, Yoko; Nakanishi, Yoshiko; Yoshimura, Takesumi; Shimada, Tomoe; Tada, Yuki; Shirabe, Komei; Kozawa, Kunihisa

    2013-03-01

    The objective of this study was to assess the need for and usefulness of training programs for Local Infectious Disease Surveillance Center (LIDSC) staff. A structured questionnaire survey was conducted to assess the needs and usefulness of training programs. The subjects of the survey were participants of a workshop held after an annual conference for LIDSC staff. Demographic information, the necessity of training programs for LIDSC staff, the themes and contents of the training program, and self-assessment of knowledge of epidemiology and statistics were covered by the questionnaire. A total of 55 local government officials responded to the questionnaire (response rate: 100%). Among these, 95% of participants believed that a training program for LIDSC staff was necessary. Basic statistical analysis (85%), descriptive epidemiology (65%), outline of epidemiology (60%), interpretation of surveillance data (65%), background and objectives of national infectious disease surveillance in Japan (60%), methods of field epidemiology (60%), and methods of data analysis (51%) were selected by over half of the respondents as suitable themes for training programs. A total of 34 LIDSC staff answered the self-assessment question on knowledge of epidemiology. A majority of respondents selected "a little" or "none" for all questions about knowledge. Only a few respondents had received education in epidemiology. The results of this study indicate that LIDSC staff have a basic demand for fundamental and specialized education to improve their work. Considering the current situation regarding the capacity of LIDSC staff, these training programs should be started immediately.

  14. Sources of Error and the Statistical Formulation of M_S : m_b Seismic Event Screening Analysis

    NASA Astrophysics Data System (ADS)

    Anderson, D. N.; Patton, H. J.; Taylor, S. R.; Bonner, J. L.; Selby, N. D.

    2014-03-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT), a global ban on nuclear explosions, is currently in a ratification phase. Under the CTBT, an International Monitoring System (IMS) of seismic, hydroacoustic, infrasonic and radionuclide sensors is operational, and the data from the IMS are analysed by the International Data Centre (IDC). The IDC provides CTBT signatories basic seismic event parameters and a screening analysis indicating whether an event exhibits explosion characteristics (for example, shallow depth). An important component of the screening analysis is a statistical test of the null hypothesis H0: explosion characteristics, using empirical measurements of seismic energy (magnitudes). The established magnitude used for event size is the body-wave magnitude (denoted m_b) computed from the initial segment of a seismic waveform. IDC screening analysis is applied to events with m_b greater than 3.5. The Rayleigh-wave magnitude (denoted M_S) is a measure of later-arriving surface-wave energy. Magnitudes are measurements of seismic energy that include adjustments (a physical correction model) for path and distance effects between event and station. Relative to m_b, earthquakes generally have a larger M_S magnitude than explosions. This article proposes a hypothesis test (screening analysis) using M_S and m_b that expressly accounts for physical-correction-model inadequacy in the standard error of the test statistic. With this hypothesis test formulation, the 2009 Democratic People's Republic of Korea announced nuclear weapon test fails to reject the null hypothesis H0: explosion characteristics.
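    The shape of such a screening test can be sketched schematically. This is not the paper's formulation: the null mean mu0 and both standard-error components below are invented placeholders, chosen only to show how a model-inadequacy term inflates the standard error of an M_S minus m_b test statistic:

```python
# Schematic one-sided screening test on M_S - m_b. Under H0 ("explosion
# characteristics") the difference has mean mu0; the standard error is
# inflated by a model-inadequacy term. All numeric values are illustrative.
import math

def screening_p_value(ms, mb, mu0=-1.2, sigma_meas=0.2, sigma_model=0.15):
    """One-sided p-value for H0: E[M_S - m_b] <= mu0 (explosion-like)."""
    se = math.sqrt(sigma_meas**2 + sigma_model**2)   # inadequacy inflates SE
    z = ((ms - mb) - mu0) / se
    # Standard-normal survival function via the complementary error function.
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# Earthquake-like event: M_S close to m_b, small p-value, H0 rejected.
print(screening_p_value(ms=4.4, mb=4.5))
# Explosion-like event: M_S well below m_b, large p-value, H0 not rejected.
print(screening_p_value(ms=3.2, mb=4.5))
```

Without the sigma_model term the test would overstate its confidence, which is the inadequacy the paper's formulation is designed to account for.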

  15. Monte Carlo investigation of thrust imbalance of solid rocket motor pairs

    NASA Technical Reports Server (NTRS)

    Sforzini, R. H.; Foster, W. A., Jr.

    1976-01-01

    The Monte Carlo method of statistical analysis is used to investigate the theoretical thrust imbalance of pairs of solid rocket motors (SRMs) firing in parallel. Sets of the significant variables are selected using a random sampling technique and the imbalance calculated for a large number of motor pairs using a simplified, but comprehensive, model of the internal ballistics. The treatment of burning surface geometry allows for the variations in the ovality and alignment of the motor case and mandrel as well as those arising from differences in the basic size dimensions and propellant properties. The analysis is used to predict the thrust-time characteristics of 130 randomly selected pairs of Titan IIIC SRMs. A statistical comparison of the results with test data for 20 pairs shows the theory underpredicts the standard deviation in maximum thrust imbalance by 20% with variability in burning times matched within 2%. The range in thrust imbalance of Space Shuttle type SRM pairs is also estimated using applicable tolerances and variabilities and a correction factor based on the Titan IIIC analysis.
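    The Monte Carlo procedure can be illustrated with a toy sketch: sample ballistic parameters for each motor of a pair from assumed tolerances, evaluate a thrust model, and collect statistics on the imbalance. The ballistic "model" and tolerances below are invented placeholders, far simpler than the paper's comprehensive internal-ballistics model:

```python
# Toy Monte Carlo study of thrust imbalance between paired motors.
# The thrust function and parameter tolerances are invented stand-ins.
import random
import statistics

random.seed(1)

def motor_thrust(burn_rate, throat_area, nominal=1.0e7):
    """Crude stand-in for an internal-ballistics model (thrust in N)."""
    return nominal * (burn_rate / 0.01) ** 1.2 * (throat_area / 0.5)

def sample_pair_imbalance():
    def one_motor():
        return motor_thrust(burn_rate=random.gauss(0.01, 0.0001),
                            throat_area=random.gauss(0.5, 0.005))
    return abs(one_motor() - one_motor())

# 130 randomly selected pairs, echoing the number used in the paper.
imbalances = [sample_pair_imbalance() for _ in range(130)]
print(f"mean imbalance  = {statistics.mean(imbalances):.3e} N")
print(f"stdev imbalance = {statistics.stdev(imbalances):.3e} N")
```

The real analysis draws many more correlated geometric and propellant variables (ovality, alignment, size dimensions) per motor, but the sampling-then-summarize structure is the same.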

  16. Phase Transitions in Combinatorial Optimization Problems: Basics, Algorithms and Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Hartmann, Alexander K.; Weigt, Martin

    2005-10-01

    A concise, comprehensive introduction to the topic of statistical physics of combinatorial optimization, bringing together theoretical concepts and algorithms from computer science with analytical methods from physics. The result bridges the gap between statistical physics and combinatorial optimization, investigating problems taken from theoretical computing, such as the vertex-cover problem, with the concepts and methods of theoretical physics. The authors cover rapid developments and analytical methods that are both extremely complex and spread by word-of-mouth, providing all the necessary basics in required detail. Throughout, the algorithms are shown with examples and calculations, while the proofs are given in a way suitable for graduate students, post-docs, and researchers. Ideal for newcomers to this young, multidisciplinary field.

  17. SPSS and SAS programs for addressing interdependence and basic levels-of-analysis issues in psychological data.

    PubMed

    O'Connor, Brian P

    2004-02-01

    Levels-of-analysis issues arise whenever individual-level data are collected from more than one person from the same dyad, family, classroom, work group, or other interaction unit. Interdependence in data from individuals in the same interaction units also violates the independence-of-observations assumption that underlies commonly used statistical tests. This article describes the data analysis challenges that are presented by these issues and presents SPSS and SAS programs for conducting appropriate analyses. The programs conduct the within-and-between-analyses described by Dansereau, Alutto, and Yammarino (1984) and the dyad-level analyses described by Gonzalez and Griffin (1999) and Griffin and Gonzalez (1995). Contrasts with general multilevel modeling procedures are then discussed.

  18. Financial Statistics. Higher Education General Information Survey (HEGIS) [machine-readable data file].

    ERIC Educational Resources Information Center

    Center for Education Statistics (ED/OERI), Washington, DC.

    The Financial Statistics machine-readable data file (MRDF) is a subfile of the larger Higher Education General Information Survey (HEGIS). It contains basic financial statistics for over 3,000 institutions of higher education in the United States and its territories. The data are arranged sequentially by institution, with institutional…

  19. The Greyhound Strike: Using a Labor Dispute to Teach Descriptive Statistics.

    ERIC Educational Resources Information Center

    Shatz, Mark A.

    1985-01-01

    A simulation exercise of a labor-management dispute is used to teach psychology students some of the basics of descriptive statistics. Using comparable data sets generated by the instructor, students work in small groups to develop a statistical presentation that supports their particular position in the dispute. (Author/RM)
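    The exercise's central lesson is easy to demonstrate. With skewed data, each side of a dispute can cite a different, technically correct summary statistic; the invented wage list below shows the mean and median diverging:

```python
# Toy illustration: with one highly paid outlier, management might cite the
# mean wage while labor cites the median. The wages are invented.
import statistics

wages = [9.5, 9.8, 10.0, 10.2, 10.4, 10.5, 24.0]   # one outlier at the top

print(f"mean   = {statistics.mean(wages):.2f}")
print(f"median = {statistics.median(wages):.2f}")
```

Having student groups defend opposite positions from the same numbers, as the exercise does, makes the sensitivity of the mean to outliers concrete.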

  20. Differential 3D Mueller-matrix mapping of optically anisotropic depolarizing biological layers

    NASA Astrophysics Data System (ADS)

    Ushenko, O. G.; Grytsyuk, M.; Ushenko, V. O.; Bodnar, G. B.; Vanchulyak, O.; Meglinskiy, I.

    2018-01-01

    The paper consists of two parts. The first part is devoted to the short theoretical basics of the method of differential Mueller-matrix description of the properties of partially depolarizing layers. Experimentally measured maps of the second-order differential matrix of the polycrystalline structure of a histological section of rectum wall tissue are provided, and the values of the statistical moments of the 1st-4th orders, which characterize the distribution of matrix elements, are determined. The second part of the paper provides data from a statistical analysis of the birefringence and dichroism of histological sections of the connective tissue component of vagina wall tissue (normal and with prolapse), and defines objective criteria for the differential diagnosis of pathologies of the vagina wall.

  1. Evaluation of standardized and applied variables in predicting treatment outcomes of polytrauma patients.

    PubMed

    Aksamija, Goran; Mulabdic, Adi; Rasic, Ismar; Muhovic, Samir; Gavric, Igor

    2011-01-01

    Polytrauma is defined as an injury in which at least two different organ systems or body regions are affected, with at least one injury being life-threatening. Given the multilevel model of care for polytrauma patients within KCUS, weaknesses in the management of this category of patients are inevitable. The aims were to determine the dynamics of existing procedures in the treatment of polytrauma patients on admission to KCUS and, based on statistical analysis of the applied variables, to determine and define the factors that influence the final outcome of treatment and their mutual relationships, which may help eliminate the flaws in the approach to the problem. The study was based on 263 polytrauma patients. Parametric and non-parametric statistical methods were used. Basic statistics were calculated; based on the calculated parameters, multicorrelation analysis, image analysis, discriminant analysis and multifactorial analysis were used to achieve the research objectives. From the universe of variables we selected a sample of n = 25 variables, of which the first two are modular; the others belong to the common measurement space (n = 23) and are defined in this paper as a system of variables for the methods, procedures and assessment of polytrauma patients. After the multicorrelation analysis, since the image analysis gave reliable measurement results, we proceeded to the analysis of eigenvalues, that is, to defining the factors that provide information about how the existing model addresses the problem and about its correlation with treatment outcome. The study singled out the essential factors that determine the current organizational model of care, which may affect treatment and improve the outcome of polytrauma patients. The analysis revealed the maximum correlative relationships between these practices and contributed to the development of guidelines defined by the isolated factors.

  2. Nurses' foot care activities in home health care.

    PubMed

    Stolt, Minna; Suhonen, Riitta; Puukka, Pauli; Viitanen, Matti; Voutilainen, Päivi; Leino-Kilpi, Helena

    2013-01-01

    This study described the basic foot care activities performed by nurses and the factors associated with these activities in the home care of older people. Data were collected from nurses (n=322) working in nine public home care agencies in Finland using the Nurses' Foot Care Activities Questionnaire (NFAQ). Data were analyzed statistically using descriptive statistics and multivariate linear models. Although some of the basic foot care activities nurses reported using were outdated, the majority were consistent with recommendations in the foot care literature. Longer working experience, referring patients with foot problems to a podiatrist or physiotherapist, and patient education in wart and nail care were associated with a high score for adequate foot care activities. Continuing education should focus on updating basic foot care activities and increasing the use of evidence-based foot care methods. Geriatric nursing research should also focus on intervention research to improve the use of evidence-based basic foot care activities. Copyright © 2013 Mosby, Inc. All rights reserved.

  3. General solution of the chemical master equation and modality of marginal distributions for hierarchic first-order reaction networks.

    PubMed

    Reis, Matthias; Kromer, Justus A; Klipp, Edda

    2018-01-20

    Multimodality is a phenomenon that complicates the analysis of statistical data based exclusively on mean and variance. Here, we present criteria for multimodality in hierarchic first-order reaction networks, consisting of catalytic and splitting reactions. Such networks are characterized by independent and dependent subnetworks. First, we prove the general solvability of the Chemical Master Equation (CME) for this type of reaction network and thereby extend the class of solvable CMEs. Our general solution is analytical in the sense that it allows for a detailed analysis of its statistical properties. Given Poisson/deterministic initial conditions, we then prove that the independent species are Poisson/binomially distributed, while the dependent species exhibit generalized Poisson/Khatri Type B distributions. Generalized Poisson/Khatri Type B distributions are multimodal for an appropriate choice of parameters. We illustrate our criteria for multimodality with several basic models, as well as the well-known two-stage transcription-translation network and Bateman's model from nuclear physics. For both examples, multimodality had not previously been reported.

  4. Progress in Turbulence Detection via GNSS Occultation Data

    NASA Technical Reports Server (NTRS)

    Cornman, L. B.; Goodrich, R. K.; Axelrad, P.; Barlow, E.

    2012-01-01

    The increased availability of radio occultation (RO) data offers the ability to detect and study turbulence in the Earth's atmosphere. An analysis of how RO data can be used to determine the strength and location of turbulent regions is presented. This includes the derivation of a model for the power spectrum of the log-amplitude and phase fluctuations of the permittivity (or index of refraction) field. The bulk of the paper is then concerned with the estimation of the model parameters. Parameter estimators are introduced and some of their statistical properties are studied. These estimators are then applied to simulated log-amplitude RO signals. This includes the analysis of global statistics derived from a large number of realizations, as well as case studies that illustrate various specific aspects of the problem. Improvements to the basic estimation methods are discussed, and their beneficial properties are illustrated. The estimation techniques are then applied to real occultation data. Only two cases are presented, but they illustrate some of the salient features inherent in real data.

  5. Cracking the Neural Code for Sensory Perception by Combining Statistics, Intervention, and Behavior.

    PubMed

    Panzeri, Stefano; Harvey, Christopher D; Piasini, Eugenio; Latham, Peter E; Fellin, Tommaso

    2017-02-08

    The two basic processes underlying perceptual decisions-how neural responses encode stimuli, and how they inform behavioral choices-have mainly been studied separately. Thus, although many spatiotemporal features of neural population activity, or "neural codes," have been shown to carry sensory information, it is often unknown whether the brain uses these features for perception. To address this issue, we propose a new framework centered on redefining the neural code as the neural features that carry sensory information used by the animal to drive appropriate behavior; that is, the features that have an intersection between sensory and choice information. We show how this framework leads to a new statistical analysis of neural activity recorded during behavior that can identify such neural codes, and we discuss how to combine intersection-based analysis of neural recordings with intervention on neural activity to determine definitively whether specific neural activity features are involved in a task. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Grain size analysis and depositional environment of shallow marine to basin floor, Kelantan River Delta

    NASA Astrophysics Data System (ADS)

    Afifah, M. R. Nurul; Aziz, A. Che; Roslan, M. Kamal

    2015-09-01

    Sediment samples consisting of Quaternary bottom sediments were collected from the shallow marine zone off Kuala Besar, Kelantan, outwards to the basin floor of the South China Sea. Sixty-five samples were analysed for their grain size distribution and statistical relationships. Basic statistics such as mean, standard deviation, skewness and kurtosis were calculated and used to differentiate the depositional environment of the sediments and to establish the uniformity of the depositional environment, whether beach or river. The sediments varied in sorting from very well sorted to poorly sorted, from strongly negatively skewed to strongly positively skewed, and from extremely leptokurtic to very platykurtic. Bivariate plots between the grain-size parameters were then interpreted, and the Coarsest-Median (CM) pattern showed a trend suggesting that the sediments were influenced by three ongoing hydrodynamic factors, namely turbidity currents, littoral drift and wave dynamics, which control the sediment distribution pattern in various ways.
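    The four grain-size parameters named in this record (mean, standard deviation or sorting, skewness, kurtosis) are conventionally computed as moment measures on the logarithmic phi scale. A minimal sketch, with hypothetical sample values and population-moment definitions (the record does not state which formulae the authors used):

    ```python
    from math import sqrt

    def moment_statistics(phi_values):
        """Four moment measures used in grain-size studies:
        mean, standard deviation (sorting), skewness and kurtosis.
        Sizes are assumed to be on the logarithmic phi scale."""
        n = len(phi_values)
        mean = sum(phi_values) / n
        sd = sqrt(sum((x - mean) ** 2 for x in phi_values) / n)
        skew = sum(((x - mean) / sd) ** 3 for x in phi_values) / n
        kurt = sum(((x - mean) / sd) ** 4 for x in phi_values) / n
        return mean, sd, skew, kurt
    ```

    For a symmetric sample the skewness is near zero; positive skewness indicates a tail of fine (high-phi) material.
    
    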

  7. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend Analysis is neither a precise term nor a circumscribed methodology, but rather connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.

  8. Environmental analysis using integrated GIS and remotely sensed data - Some research needs and priorities

    NASA Technical Reports Server (NTRS)

    Davis, Frank W.; Quattrochi, Dale A.; Ridd, Merrill K.; Lam, Nina S.-N.; Walsh, Stephen J.

    1991-01-01

    This paper discusses some basic scientific issues and research needs in the joint processing of remotely sensed and GIS data for environmental analysis. Two general topics are treated in detail: (1) scale dependence of geographic data and the analysis of multiscale remotely sensed and GIS data, and (2) data transformations and information flow during data processing. The discussion of scale dependence focuses on the theory and applications of spatial autocorrelation, geostatistics, and fractals for characterizing and modeling spatial variation. Data transformations during processing are described within the larger framework of geographical analysis, encompassing sampling, cartography, remote sensing, and GIS. Development of better user interfaces between image processing, GIS, database management, and statistical software is needed to expedite research on these and other impediments to integrated analysis of remotely sensed and GIS data.

  9. Prediction model to estimate presence of coronary artery disease: retrospective pooled analysis of existing cohorts

    PubMed Central

    Genders, Tessa S S; Steyerberg, Ewout W; Nieman, Koen; Galema, Tjebbe W; Mollet, Nico R; de Feyter, Pim J; Krestin, Gabriel P; Alkadhi, Hatem; Leschka, Sebastian; Desbiolles, Lotus; Meijs, Matthijs F L; Cramer, Maarten J; Knuuti, Juhani; Kajander, Sami; Bogaert, Jan; Goetschalckx, Kaatje; Cademartiri, Filippo; Maffei, Erica; Martini, Chiara; Seitun, Sara; Aldrovandi, Annachiara; Wildermuth, Simon; Stinn, Björn; Fornaro, Jürgen; Feuchtner, Gudrun; De Zordo, Tobias; Auer, Thomas; Plank, Fabian; Friedrich, Guy; Pugliese, Francesca; Petersen, Steffen E; Davies, L Ceri; Schoepf, U Joseph; Rowe, Garrett W; van Mieghem, Carlos A G; van Driessche, Luc; Sinitsyn, Valentin; Gopalan, Deepa; Nikolaou, Konstantin; Bamberg, Fabian; Cury, Ricardo C; Battle, Juan; Maurovich-Horvat, Pál; Bartykowszki, Andrea; Merkely, Bela; Becker, Dávid; Hadamitzky, Martin; Hausleiter, Jörg; Dewey, Marc; Zimmermann, Elke; Laule, Michael

    2012-01-01

    Objectives To develop prediction models that better estimate the pretest probability of coronary artery disease in low prevalence populations. Design Retrospective pooled analysis of individual patient data. Setting 18 hospitals in Europe and the United States. Participants Patients with stable chest pain without evidence for previous coronary artery disease, if they were referred for computed tomography (CT) based coronary angiography or catheter based coronary angiography (indicated as low and high prevalence settings, respectively). Main outcome measures Obstructive coronary artery disease (≥50% diameter stenosis in at least one vessel found on catheter based coronary angiography). Multiple imputation accounted for missing predictors and outcomes, exploiting strong correlation between the two angiography procedures. Predictive models included a basic model (age, sex, symptoms, and setting), clinical model (basic model factors and diabetes, hypertension, dyslipidaemia, and smoking), and extended model (clinical model factors and use of the CT based coronary calcium score). We assessed discrimination (c statistic), calibration, and continuous net reclassification improvement by cross validation for the four largest low prevalence datasets separately and the smaller remaining low prevalence datasets combined. Results We included 5677 patients (3283 men, 2394 women), of whom 1634 had obstructive coronary artery disease found on catheter based coronary angiography. All potential predictors were significantly associated with the presence of disease in univariable and multivariable analyses. The clinical model improved the prediction, compared with the basic model (cross validated c statistic improvement from 0.77 to 0.79, net reclassification improvement 35%); the coronary calcium score in the extended model was a major predictor (0.79 to 0.88, 102%). Calibration for low prevalence datasets was satisfactory. 
Conclusions Updated prediction models including age, sex, symptoms, and cardiovascular risk factors allow for accurate estimation of the pretest probability of coronary artery disease in low prevalence populations. Addition of coronary calcium scores to the prediction models improves the estimates. PMID:22692650

  10. Simulation and statistical analysis for the optimization of nitrogen liquefaction plant with cryogenic Claude cycle using process modeling tool: ASPEN HYSYS

    NASA Astrophysics Data System (ADS)

    Joshi, D. M.

    2017-09-01

    Cryogenic technology is used for the liquefaction of many gases and has several applications in food process engineering. Temperatures below 123 K are considered to be in the field of cryogenics. Extremely low temperatures are a basic need for many industrial processes and have several applications, such as superconducting magnets and the space, medical and gas industries. Several methods can be used to obtain the low temperatures required for the liquefaction of gases. The basic liquefaction process consists of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Different cryogenic cycle configurations are designed to obtain liquefied gases at different temperatures. Each cryogenic cycle, such as the Linde cycle, Claude cycle, Kapitza cycle or modified Claude cycle, has its own advantages and disadvantages. The placement of the heat exchangers, Joule-Thomson valve and turboexpander determines the configuration of a cryogenic cycle, and each configuration has its own efficiency according to the application. Here, a nitrogen liquefaction plant is used for the analysis. The process modeling tool ASPEN HYSYS provides a software simulation approach before the actual implementation of the plant in the field. This paper presents the simulation and statistical analysis of the Claude cycle with ASPEN HYSYS and covers the technique used to optimize the liquefaction of the plant. The simulation results can be used as a reference for the design and optimization of a nitrogen liquefaction plant; efficient liquefaction will give the plant the best performance and productivity.

  11. Demographic and health situation of children in conditions of economic destabilization in the Ukraine.

    PubMed

    Pantyley, Viktoriya

    2014-01-01

    Under the new conditions of socio-economic development in Ukraine, the health of the child population is considered the most reliable indicator of the socio-economic development of the country. The primary goal of the study was to analyse the effect of contemporary socio-economic transformations, their scope, and the strength of their effect on the demographic and social situation of children in various regions of Ukraine. The methodological objectives of the study were as follows: development of a synthetic measure of the state of health of the child population, based on Hellwig's method, and classification of districts in Ukraine according to the present health-demographic situation of children. The study was based on statistical data from the State Statistics Service of Ukraine, the Centre of Medical Statistics in Kiev, the Ukrainian Ministry of Defence, and the Ministry of Education and Science, Youth and Sports of Ukraine. The following research methods were used: analysis of literature and Internet sources, selection and analysis of statistical materials, and cartographic and statistical methods. Basic indices of the demographic and health situation of the child population were analyzed, as well as socio-economic factors which affect this situation. A set of variables was developed for the synthetic evaluation of the state of health of the child population, and a typology of Ukrainian districts was constructed according to that state of health, based on Hellwig's taxonomic method. Deterioration of selected quality parameters was observed, as well as a change in the strength and direction of the effects of organizational-institutional, socio-economic, historical and cultural factors on the potential of the child population.

  12. Improvements to an earth observing statistical performance model with applications to LWIR spectral variability

    NASA Astrophysics Data System (ADS)

    Zhao, Runchen; Ientilucci, Emmett J.

    2017-05-01

    Hyperspectral remote sensing systems provide spectral data composed of hundreds of narrow spectral bands and can be used, for example, to identify targets without physical interaction. Often it is of interest to characterize the spectral variability of targets or objects. The purpose of this paper is to identify and characterize the LWIR spectral variability of targets based on an improved earth observing statistical performance model, known as the Forecasting and Analysis of Spectroradiometric System Performance (FASSP) model. FASSP contains three basic modules: a scene model, a sensor model and a processing model. Instead of using only the mean surface reflectance as input, FASSP propagates user-defined statistical characteristics of a scene through the image chain (i.e., from source to sensor). The radiative transfer model MODTRAN is used to simulate the radiative transfer based on user-defined atmospheric parameters. To retrieve class emissivity and temperature statistics, i.e., temperature/emissivity separation (TES), a LWIR atmospheric compensation method is necessary. The FASSP model has a method to transform statistics in the visible (e.g., ELM) but currently has no LWIR TES algorithm in place. This paper addresses the implementation of such a TES algorithm and its associated transformation of statistics.

  13. Stratification of complexity in congenital heart surgery: comparative study of the Risk Adjustment for Congenital Heart Surgery (RACHS-1) method, Aristotle basic score and Society of Thoracic Surgeons-European Association for Cardio- Thoracic Surgery (STS-EACTS) mortality score.

    PubMed

    Cavalcanti, Paulo Ernando Ferraz; Sá, Michel Pompeu Barros de Oliveira; Santos, Cecília Andrade dos; Esmeraldo, Isaac Melo; Chaves, Mariana Leal; Lins, Ricardo Felipe de Albuquerque; Lima, Ricardo de Carvalho

    2015-01-01

    To determine whether the stratification-of-complexity models in congenital heart surgery (RACHS-1, Aristotle basic score and STS-EACTS mortality score) fit our center, and to determine which method best discriminates hospital mortality. Surgical procedures for congenital heart disease in patients under 18 years of age were allocated to the categories proposed by the three currently available stratification-of-complexity methods. Hospital mortality was calculated for each category of the three models, and statistical analysis was performed to verify whether the categories presented different mortalities. The discriminatory ability of the models was determined by calculating the area under the ROC curve, and the curves of the three models were compared. 360 patients were allocated according to the three methods. There was a statistically significant difference in mortality between categories: RACHS-1, (1) 1.3%, (2) 11.4%, (3) 27.3%, (4) 50% (P<0.001); Aristotle basic score, (1) 1.1%, (2) 12.2%, (3) 34%, (4) 64.7% (P<0.001); and STS-EACTS mortality score, (1) 5.5%, (2) 13.6%, (3) 18.7%, (4) 35.8% (P<0.001). The three models had similar accuracy by area under the ROC curve: RACHS-1, 0.738; STS-EACTS, 0.739; Aristotle, 0.766. All three stratification-of-complexity models currently available in the literature are useful, showing different mortalities between the proposed categories and similar discriminatory capacity for hospital mortality.
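    The area under the ROC curve used in this record (and the c statistic in the pooled coronary-disease analysis above) equals the concordance probability: the chance that a randomly chosen patient who died was placed in a higher risk category than a randomly chosen survivor, with ties counted as one half. A minimal sketch with hypothetical scores, not the study's data:

    ```python
    def c_statistic(scores_events, scores_nonevents):
        """Concordance probability P(score_event > score_nonevent),
        counting ties as 1/2. Equivalent to the area under the ROC
        curve for a discrete risk score."""
        pairs = concordant = 0.0
        for e in scores_events:          # patients with the outcome
            for n in scores_nonevents:   # patients without it
                pairs += 1
                if e > n:
                    concordant += 1
                elif e == n:
                    concordant += 0.5
        return concordant / pairs
    ```

    A value of 0.5 means no discrimination; 1.0 means the score ranks every event above every non-event. This O(n*m) form is fine for small samples; rank-based formulas scale better.
    
    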

  14. Personal Docente del Nivel Primario. Series Estadisticas Basicas, Nivel Educativo: Cordoba (Teaching Personnel in Primary Schools. Basic Statistics Series , Level of Education: Cordoba).

    ERIC Educational Resources Information Center

    Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.

    This document provides statistical data on the distribution and education of teaching personnel working the elementary schools of Cordoba, Colombia, between 1958 and 1967. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of the teachers. For overall statistics in…

  15. Personal Docente des Nivel Primario. Series Estadisticas Basicas, Nivel Educativo: Narino (Teaching Personnel in Primary Schools. Basic Statistics Series, Level of Education: Narino).

    ERIC Educational Resources Information Center

    Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.

    This document provides statistical data on the distribution and education of teaching personnel working in the elementary schools of Narino, Colombia, between 1958 and 1967. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of the teachers. For overall statistics in…

  16. Personal Docente del Nivel Primario. Series Estadisticas Basicas, Nivel Educativo: Cauca (Teaching Personnel in Primary Schools. Basic Statistics Series, Level of Education: Cauca).

    ERIC Educational Resources Information Center

    Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.

    This document provides statistical data on the distribution and education of teaching personnel working in the elementary schools of Cauca, Colombia, between 1958 and 1967. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of the teachers. For overall statistics in…

  17. Personal Docente del Nivel Primario. Series Estadisticas Basicas, Nivel Educativo: Caldas (Teaching Personnel in Primary Schools. Basic Statistics Series, Level of Education: Caldas).

    ERIC Educational Resources Information Center

    Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.

    This document provides statistical data on the distribution and education of teaching personnel working in the elementary schools of Caldas, Colombia, between 1958 and 1967. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of the teachers. For overall statistics in…

  18. Personal Docente del Nivel Primario. Series Estadisticas Basicas, Nivel Educativo: Boyaca (Teaching Personnel in Primary Schools. Basic Statistics Series, Level of Education: Boyaca).

    ERIC Educational Resources Information Center

    Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.

    This document provides statistical data on the distribution and education of teaching personnel working in the elementary schools of Boyaca, Colombia, between 1958 and 1967. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of the teachers. For overall statistics in…

  19. Personal Docente del Nivel Primario. Series Estadisticas Basicas, Nivel Educativo: Huila (Teaching Personnel in Primary Schools. Basic Statistics Series, Level of Education: Huila).

    ERIC Educational Resources Information Center

    Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.

    This document provides statistical data on the distribution and education of teaching personnel working in the elementary schools of Huila, Colombia, between 1958 and 1967. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of the teachers. For overall statistics in…

  20. Health Resources Statistics; Health Manpower and Health Facilities, 1968. Public Health Service Publication No. 1509.

    ERIC Educational Resources Information Center

    National Center for Health Statistics (DHEW/PHS), Hyattsville, MD.

    This report is a part of the program of the National Center for Health Statistics to provide current statistics as baseline data for the evaluation, planning, and administration of health programs. Part I presents data concerning the occupational fields: (1) administration, (2) anthropology and sociology, (3) data processing, (4) basic sciences,…

  1. Personal Docente del Nivel Primario. Series Estadisticas Basicas: Colombia (Teaching Personnel in Primary Schools. Basic Statistics Series: Colombia).

    ERIC Educational Resources Information Center

    Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.

    This document provides statistical data on the distribution and education of teaching personnel working in Colombian elementary schools between 1940 and 1968. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of teachers. (VM)

  2. Explorations in Statistics: Standard Deviations and Standard Errors

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2008-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This series in "Advances in Physiology Education" provides an opportunity to do just that: we will investigate basic concepts in statistics using the free software package R. Because this series uses R solely as a vehicle…

  3. The Precision-Power-Gradient Theory for Teaching Basic Research Statistical Tools to Graduate Students.

    ERIC Educational Resources Information Center

    Cassel, Russell N.

    This paper relates educational and psychological statistics to certain "Research Statistical Tools" (RSTs) necessary to accomplish and understand general research in the behavioral sciences. Emphasis is placed on acquiring an effective understanding of the RSTs, and to this end they are ordered along a continuum scale in terms of individual…

  4. Estimates of School Statistics, 1971-72.

    ERIC Educational Resources Information Center

    Flanigan, Jean M.

    This report presents public school statistics for the 50 States, the District of Columbia, and the regions and outlying areas of the United States. The text presents national data for each of the past 10 years and defines the basic series of statistics. Tables present the revised estimates by State and region for 1970-71 and the preliminary…

  5. Basic Student Charges at Postsecondary Institutions: Academic Year 1992-93. Tuition and Required Fees and Room and Board Charges at 4-year, 2-year, and Public Less-than-2-year Institutions. Statistical Analysis Report.

    ERIC Educational Resources Information Center

    Broyles, Susan G.; Morgan, Frank B.

    This report lists the typical tuition and required fees and room and board charges for academic year 1992-93 at nearly 5,000 4-year, 2-year, and public less-than-2-year postsecondary institutions in the United States and its outlying areas. Included are tuition and fee charges to in-state and out-of-state students at the undergraduate and graduate…

  6. Water-resources investigations in Wisconsin, 1993

    USGS Publications Warehouse

    Maertz, D.E.

    1993-01-01

    OBJECTIVE: The objectives of this study are to provide continuous discharge records for selected rivers at specific sites to meet needs in regulation, analytical studies, definition of statistical properties, trend analysis, and determination of the occurrence and distribution of water in streams for planning. The project is also designed to determine lake levels and to provide discharge data for floods, low-flow conditions, and water-quality investigations. Requests for streamflow data and information relating to streamflow in Wisconsin are answered. Basic data are published annually in "Water Resources Data Wisconsin."

  7. Analyzing the Commitment of College Students Using a Brief, Contextualized Measure of Need Satisfaction From the Perspective of Self-Determination Theory.

    PubMed

    Davidson, William; Beck, Hall P

    2018-01-01

    This study empirically confirmed the relationships between the degree to which students satisfied three basic needs (competence, relatedness, and autonomy) and the strength of their commitments to the university they attended and to obtaining a baccalaureate degree. A questionnaire was administered online to 1257 students at two 4-year universities. Regression analysis yielded statistically significant associations between the three needs and Institutional Commitment and Degree Commitment, explaining more than 20% of the variance in the latter two variables.

  8. Lift truck safety review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cadwallader, L.C.

    1997-03-01

    This report presents safety information about powered industrial trucks. The basic lift truck, the counterbalanced sit-down rider truck, is the primary focus of the report. Lift truck engineering is briefly described, and a hazard analysis of the lift truck is then performed. Case histories and accident statistics are also given. Rules and regulations governing lift trucks, such as the US Occupational Safety and Health Administration laws and the Underwriters Laboratories standards, are discussed. Safety issues with lift trucks are reviewed, and lift truck safety and reliability are discussed. Some quantitative reliability values are given.

  9. ODM Data Analysis-A tool for the automatic validation, monitoring and generation of generic descriptive statistics of patient data.

    PubMed

    Brix, Tobias Johannes; Bruland, Philipp; Sarfraz, Saad; Ernsting, Jan; Neuhaus, Philipp; Storck, Michael; Doods, Justin; Ständer, Sonja; Dugas, Martin

    2018-01-01

    A required step in presenting the results of clinical studies is the declaration of participants' demographic and baseline characteristics, as required by FDAAA 801. The common workflow for this task is to export the clinical data from the electronic data capture system in use and import it into statistical software such as SAS or IBM SPSS. Such software requires trained users, who have to implement the analysis individually for each item; this expense can become an obstacle for small studies. The objective of this work is to design, implement and evaluate an open source application, called ODM Data Analysis, for the semi-automatic analysis of clinical study data. The system requires clinical data in the CDISC Operational Data Model format. After a file is uploaded, its syntax and the data-type conformity of the collected data are validated. The completeness of the study data is determined, and basic statistics, including illustrative charts for each item, are generated. Datasets from four clinical studies were used to evaluate the application's performance and functionality. The system is implemented as an open source web application (available at https://odmanalysis.uni-muenster.de) and is also provided as a Docker image, which enables easy distribution and installation on local systems. Study data are stored in the application only while the calculations are performed, which is compliant with data protection requirements. Analysis times are below half an hour, even for larger studies with over 6000 subjects. Medical experts have confirmed the usefulness of this application for gaining an overview of collected study data for monitoring purposes and for generating descriptive statistics without further user interaction. The semi-automatic analysis has its limitations and cannot replace the complex analyses of statisticians, but it can serve as a starting point for their examination and reporting.
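    The per-item descriptive statistics such a tool produces can be sketched with the standard library alone. This is an illustrative reimplementation on hypothetical item values (the actual application parses the CDISC ODM format, which is not shown here); `None` stands for a missing entry:

    ```python
    import statistics

    def describe(values):
        """Basic descriptive statistics for one numeric study item:
        count, number missing, mean, standard deviation, min, max."""
        present = [v for v in values if v is not None]
        return {
            "n": len(present),
            "missing": len(values) - len(present),
            "mean": statistics.fmean(present),
            "sd": statistics.stdev(present) if len(present) > 1 else 0.0,
            "min": min(present),
            "max": max(present),
        }
    ```

    Running `describe` over every column of an exported dataset yields the baseline-characteristics table without per-item scripting in SAS or SPSS.
    
    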

  10. The Importance of Medical Students' Attitudes Regarding Cognitive Competence for Teaching Applied Statistics: Multi-Site Study and Meta-Analysis

    PubMed Central

    Milic, Natasa M.; Masic, Srdjan; Milin-Lazovic, Jelena; Trajkovic, Goran; Bukumiric, Zoran; Savic, Marko; Milic, Nikola V.; Cirkovic, Andja; Gajic, Milan; Kostic, Mirjana; Ilic, Aleksandra; Stanisavljevic, Dejana

    2016-01-01

    Background The scientific community is increasingly recognizing the need to bolster standards of data analysis, given the widespread concern that basic mistakes in data analysis are contributing to the irreproducibility of many published research findings. The aim of this study was to investigate students’ attitudes towards statistics within a multi-site medical educational context, and to monitor changes in those attitudes and their impact on student achievement. In addition, we performed a systematic review to better support our future pedagogical decisions in teaching applied statistics to medical students. Methods A validated Serbian Survey of Attitudes Towards Statistics (SATS-36) questionnaire was administered to medical students attending obligatory introductory courses in biostatistics at three medical universities in the Western Balkans. A systematic review of peer-reviewed publications was performed through searches of the Scopus, Web of Science, Science Direct, Medline, and APA databases dating back to 1994. A meta-analysis was performed for the correlation coefficients between SATS component scores and statistics achievement. Pooled estimates were calculated using random effects models. Results SATS-36 was completed by 461 medical students. Most of the students held positive attitudes towards statistics. In a multivariate regression model, ability in mathematics and grade point average were associated with the Cognitive Competence score, after adjusting for age, gender and computer ability. The results of 90 paired data sets showed that the Affect, Cognitive Competence, and Effort scores demonstrated significant positive changes. The Cognitive Competence score showed the largest increase (M = 0.48, SD = 0.95). The positive correlation found between the Cognitive Competence score and students’ achievement (r = 0.41; p<0.001) was also shown in the meta-analysis (r = 0.37; 95% CI 0.32–0.41).
Conclusion Students' subjective attitudes regarding Cognitive Competence at the beginning of the biostatistics course, which were directly linked to mathematical knowledge, affected their attitudes at the end of the course, which, in turn, influenced students' performance. This indicates the importance of positively changing not only students’ cognitive competency but also their perceptions of gained competency during the biostatistics course. PMID:27764123
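
    The pooling step described above, combining per-study correlation coefficients under a random-effects model, is commonly done on Fisher z-transformed correlations. A minimal sketch of that calculation (not the authors' code; the DerSimonian-Laird estimator used here is one common choice of between-study variance estimator):

```python
import math

def pool_correlations(rs, ns):
    """Random-effects pooled correlation from per-study r's and sample sizes."""
    # Fisher z-transform; the sampling variance of z is 1/(n - 3)
    zs = [0.5 * math.log((1 + r) / (1 - r)) for r in rs]
    vs = [1.0 / (n - 3) for n in ns]
    w = [1.0 / v for v in vs]
    z_fixed = sum(wi * zi for wi, zi in zip(w, zs)) / sum(w)
    # DerSimonian-Laird estimate of between-study variance tau^2
    q = sum(wi * (zi - z_fixed) ** 2 for wi, zi in zip(w, zs))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(rs) - 1)) / c)
    # Random-effects weights, pooled z, back-transformed to r
    w_re = [1.0 / (v + tau2) for v in vs]
    z_re = sum(wi * zi for wi, zi in zip(w_re, zs)) / sum(w_re)
    return math.tanh(z_re)
```

    With homogeneous studies tau^2 collapses to zero and the result equals the fixed-effect estimate; heterogeneity widens the weights toward equality.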

  12. Evolution of massive stars in very young clusters and associations

    NASA Technical Reports Server (NTRS)

    Stothers, R. B.

    1985-01-01

    Statistics concerning the stellar content of young galactic clusters and associations that show well defined main-sequence turnups have been analyzed in order to derive information about the evolution of high-mass stars. The analytical approach is semiempirical and uses natural spectroscopic groups of stars on the H-R diagram together with the stars' apparent magnitudes. The new approach does not depend on absolute luminosities and requires only the most basic elements of stellar evolution theory. The following conclusions are offered on the basis of the statistical analysis: (1) O-type main-sequence stars evolve to a spectral type of B1 during core hydrogen burning; (2) most O-type blue stragglers are newly formed massive stars burning core hydrogen; (3) supergiants lying redward of the main-sequence turnup are burning core helium; and (4) most Wolf-Rayet stars are burning core helium and originally had masses greater than 30-40 solar masses. The statistics of the natural spectroscopic groups in young galactic clusters and associations are given in a table.

  13. Analysis of the procedures used to evaluate suicide crime scenes in Brazil: a statistical approach to interpret reports.

    PubMed

    Bruni, Aline Thaís; Velho, Jesus Antonio; Ferreira, Arthur Serra Lopes; Tasso, Maria Júlia; Ferrari, Raíssa Santos; Yoshida, Ricardo Luís; Dias, Marcos Salvador; Leite, Vitor Barbanti Pereira

    2014-08-01

    This study uses statistical techniques to evaluate reports on suicide scenes; it utilizes 80 reports from different locations in Brazil, randomly collected from both federal and state jurisdictions. We aimed to assess a heterogeneous group of cases in order to obtain an overall perspective of the problem. We evaluated variables regarding the characteristics of the crime scene, such as the traces that were detected (blood, instruments and clothes), and we addressed the methodology employed by the experts. A qualitative approach using basic statistics revealed a wide variation in how the issue was addressed in the documents. We then examined a quantitative approach involving an empirical equation, and we used multivariate procedures to validate the quantitative methodology proposed for this empirical equation. The methodology successfully identified the main differences in the information presented in the reports, showing that there is no standardized method of analyzing evidence. Copyright © 2014 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  14. Analyzing Dyadic Sequence Data—Research Questions and Implied Statistical Models

    PubMed Central

    Fuchs, Peter; Nussbeck, Fridtjof W.; Meuwly, Nathalie; Bodenmann, Guy

    2017-01-01

    The analysis of observational data is often seen as a key approach to understanding dynamics in romantic relationships, and in dyadic systems in general. However, statistical models for the analysis of dyadic observational data are not widely known or applied. In this contribution, selected approaches to dyadic sequence data are presented, with a focus on models that can be applied when sample sizes are of medium size (N = 100 couples or less). Each statistical model is motivated by an underlying potential research question, and the most important model results are presented and linked to that question. The following research questions and models are compared with respect to their applicability, using a hands-on approach: (I) Is there an association between a particular behavior by one partner and the reaction by the other? (Pearson correlation); (II) Does the behavior of one member trigger an immediate reaction by the other? (aggregated logit models; multi-level approach; basic Markov model); (III) Is there an underlying dyadic process which might account for the observed behavior? (hidden Markov model); and (IV) Are there latent groups of dyads which might account for different observed reaction patterns? (mixture Markov; optimal matching). Finally, recommendations to help researchers choose among the different models, notes on data handling, and advice on properly applying the statistical models in empirical research are given (e.g., in a new R package, “DySeq”). PMID:28443037
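
    The basic Markov model in (II) boils down to estimating a transition matrix from coded behavior sequences. A minimal sketch of that estimation step (illustrative only; it assumes behaviors are coded as symbols in a string or list):

```python
from collections import Counter, defaultdict

def transition_matrix(seq):
    """Estimate first-order Markov transition probabilities from one sequence.

    Returns a nested dict: probs[prev][cur] = P(cur | prev), estimated as
    the relative frequency of each observed transition.
    """
    counts = defaultdict(Counter)
    for prev, cur in zip(seq, seq[1:]):  # consecutive pairs = transitions
        counts[prev][cur] += 1
    return {state: {nxt: n / sum(row.values()) for nxt, n in row.items()}
            for state, row in counts.items()}
```

    For dyadic data the sequence would interleave both partners' coded behaviors, or the two streams would be combined into joint states before estimation.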

  15. The Research of Spatial-Temporal Analysis and Decision-Making Assistant System for Disabled Person Affairs Based on Mapworld

    NASA Astrophysics Data System (ADS)

    Zhang, J. H.; Yang, J.; Sun, Y. S.

    2015-06-01

    This system combines the Mapworld platform with the informatization of disabled persons' affairs, using basic information about disabled persons as its central framework. Based on the disabled person population database, the affairs management system and the statistical accounting system, the data were effectively integrated and a unified information resource database was built. Through data analysis and mining, the system provides powerful data support for decision making, affairs management and public service. It finally realizes the rationalization, normalization and scientization of disabled person affairs management. It also makes significant contributions to the great-leap-forward development of the informatization of the China Disabled Persons' Federation.

  16. LFSTAT - Low-Flow Analysis in R

    NASA Astrophysics Data System (ADS)

    Koffler, Daniel; Laaha, Gregor

    2013-04-01

    The calculation of characteristic stream flow during dry conditions is a basic requirement for many problems in hydrology, ecohydrology and water resources management. As opposed to floods, a number of different indices are used to characterise low flows and streamflow droughts. Although these indices and methods of calculation have been well documented in the WMO Manual on Low-flow Estimation and Prediction [1], comprehensive software enabling fast and standardized calculation of low-flow statistics was missing. We present the new software package lfstat to fill this gap. Our software package is based on the statistical open source software R, and extends it to analyse daily stream flow records with a focus on low flows. As command-line based programs are not everyone's preference, we also offer a plug-in for the R Commander, an easy-to-use graphical user interface (GUI) for R based on tcl/tk. The functionality of lfstat includes estimation methods for low-flow indices, extreme value statistics, deficit characteristics, and additional graphical methods to control the computation of complex indices and to illustrate the data. Besides the basic low-flow indices, the baseflow index and recession constants can be computed. For extreme value statistics, state-of-the-art methods for L-moment based local and regional frequency analysis (RFA) are available. The tools for deficit characteristics include various pooling and threshold selection methods to support the calculation of drought duration and deficit indices. The most common graphics for low-flow analysis are available, and the plots can be modified according to the user's preferences. Graphics include hydrographs for different periods, flexible streamflow deficit plots, baseflow visualisation, recession diagnostics, flow duration curves as well as double mass curves, and many more. From a technical point of view, the package uses an S3 class called lfobj (low-flow objects).
These objects are ordinary R data frames containing date, flow, hydrological year and, optionally, baseflow information. Once these objects are created, analyses can be performed by mouse click, and a script can be saved to make the analysis easily reproducible. At the moment we offer implementations of all major methods proposed in the WMO Manual on Low-flow Estimation and Prediction [1]. Future plans include a dynamic low-flow report in odt file format using odf-weave, which allows automatic updates if the data or analysis change. We hope to offer a tool that eases and structures the analysis of stream flow data focusing on low flows, and that makes analyses transparent and communicable. The package can also be used in teaching students the first steps in low-flow hydrology. The software package can be installed from CRAN (latest stable) and R-Forge: http://r-forge.r-project.org (development version). References: [1] Gustard, Alan; Demuth, Siegfried (eds.) Manual on Low-flow Estimation and Prediction. Geneva, Switzerland, World Meteorological Organization (Operational Hydrology Report No. 50, WMO-No. 1029).
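
    lfstat itself is an R package, but two of the basic indices it computes are simple to state. The sketch below (Python, for illustration only; lfstat's own definitions follow the WMO manual and handle hydrological years and missing data more carefully) computes the baseflow index and the n-day mean annual minimum from one year of daily flows and a precomputed baseflow series:

```python
def baseflow_index(flow, baseflow):
    """BFI: ratio of baseflow volume to total streamflow volume."""
    return sum(baseflow) / sum(flow)

def mam(flow, window=7):
    """MAM(n): minimum of the n-day moving average of daily flows."""
    means = [sum(flow[i:i + window]) / window
             for i in range(len(flow) - window + 1)]
    return min(means)
```

    In lfstat the baseflow series itself is derived from the flow record by a smoothed-minima procedure, which is the part that requires the more careful implementation.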

  17. Stata companion.

    PubMed

    Brennan, Jennifer Sousa

    2010-01-01

    This chapter is an introductory reference guide highlighting some of the most common statistical topics, broken down into both command-line syntax and graphical interface point-and-click commands. This chapter serves to supplement more formal statistics lessons and expedite using Stata to compute basic analyses.

  18. Quantum Social Science

    NASA Astrophysics Data System (ADS)

    Haven, Emmanuel; Khrennikov, Andrei

    2013-01-01

    Preface; Part I. Physics Concepts in Social Science? A Discussion: 1. Classical, statistical and quantum mechanics: all in one; 2. Econophysics: statistical physics and social science; 3. Quantum social science: a non-mathematical motivation; Part II. Mathematics and Physics Preliminaries: 4. Vector calculus and other mathematical preliminaries; 5. Basic elements of quantum mechanics; 6. Basic elements of Bohmian mechanics; Part III. Quantum Probabilistic Effects in Psychology: Basic Questions and Answers: 7. A brief overview; 8. Interference effects in psychology - an introduction; 9. A quantum-like model of decision making; Part IV. Other Quantum Probabilistic Effects in Economics, Finance and Brain Sciences: 10. Financial/economic theory in crisis; 11. Bohmian mechanics in finance and economics; 12. The Bohm-Vigier Model and path simulation; 13. Other applications to economic/financial theory; 14. The neurophysiological sources of quantum-like processing in the brain; Conclusion; Glossary; Index.

  19. GENASIS Basics: Object-oriented utilitarian functionality for large-scale physics simulations (Version 2)

    NASA Astrophysics Data System (ADS)

    Cardall, Christian Y.; Budiardja, Reuben D.

    2017-05-01

    GenASiS Basics provides Fortran 2003 classes furnishing extensible object-oriented utilitarian functionality for large-scale physics simulations on distributed memory supercomputers. This functionality includes physical units and constants; display to the screen or standard output device; message passing; I/O to disk; and runtime parameter management and usage statistics. This revision -Version 2 of Basics - makes mostly minor additions to functionality and includes some simplifying name changes.

  20. Basic Facts and Figures about the Educational System in Japan.

    ERIC Educational Resources Information Center

    National Inst. for Educational Research, Tokyo (Japan).

    Tables, charts, and graphs convey supporting data that accompany text on various aspects of the Japanese educational system presented in this booklet. There are seven chapters: (1) Fundamental principles of education; (2) Organization of the educational system; (3) Basic statistics of education; (4) Curricula, textbooks, and instructional aids;…

  1. Classifying the Basic Parameters of Ultraviolet Copper Bromide Laser

    NASA Astrophysics Data System (ADS)

    Gocheva-Ilieva, S. G.; Iliev, I. P.; Temelkov, K. A.; Vuchkov, N. K.; Sabotinov, N. V.

    2009-10-01

    The performance of deep-ultraviolet copper bromide lasers is of great importance because of their applications in medicine, microbiology, high-precision processing of new materials, high-resolution laser lithography in microelectronics, high-density optical recording of information, laser-induced fluorescence in plasmas and wide-gap semiconductors, and more. In this paper we present a statistical study on the classification of 12 basic lasing parameters, using different agglomerative methods of cluster analysis. The results are based on a large amount of experimental data for a UV Cu+ Ne-CuBr laser with wavelengths of 248.6 nm, 252.9 nm, 260.0 nm and 270.3 nm, obtained at the Georgi Nadjakov Institute of Solid State Physics, Bulgarian Academy of Sciences. The relative influence of the parameters on laser generation is also evaluated. The results are applicable to computer modeling, to planning experiments, and to further laser development with improved output characteristics.

  2. Recent advances in parametric neuroreceptor mapping with dynamic PET: basic concepts and graphical analyses.

    PubMed

    Seo, Seongho; Kim, Su Jin; Lee, Dong Soo; Lee, Jae Sung

    2014-10-01

    Tracer kinetic modeling in dynamic positron emission tomography (PET) has been widely used to investigate the characteristic distribution patterns or dysfunctions of neuroreceptors in brain diseases. Its practical goal has progressed from regional data quantification to parametric mapping, which produces images of kinetic-model parameters by fully exploiting the spatiotemporal information in dynamic PET data. Graphical analysis (GA) is a major parametric mapping technique that is independent of any compartmental model configuration, robust to noise, and computationally efficient. In this paper, we provide an overview of recent advances in the parametric mapping of neuroreceptor binding based on GA methods. The associated basic concepts in tracer kinetic modeling are presented, including commonly used compartment models and the major parameters of interest. Technical details of GA approaches for reversible and irreversible radioligands are described, considering both plasma input and reference tissue input models. Their statistical properties are discussed in view of parametric imaging.

  3. How to design and write a clinical research protocol in Cosmetic Dermatology*

    PubMed Central

    Bagatin, Ediléia; Miot, Helio A.

    2013-01-01

    Cosmetic Dermatology is a growing subspecialty. High-quality basic science studies have been published; however, few double-blind, randomized controlled clinical trials, which are the major instrument for evidence-based medicine, have been conducted in this area. Clinical research is essential for the discovery of new knowledge, improvement of the scientific basis, resolution of challenges, and good clinical practice. Some basic principles for a successful researcher include interest, availability, persistence, and honesty. It is essential to learn how to write a research protocol and to know the international and national regulatory rules. A complete clinical trial protocol should include the question, background, objectives, methodology (design, variable description, sample size, randomization, inclusion and exclusion criteria, intervention, efficacy and safety measures, and statistical analysis), consent form, clinical research form, and references. Institutional ethical review board approval and financial support disclosure are necessary. Publication of positive or negative results should be the authors' commitment. PMID:23539006

  4. An undergraduate course, and new textbook, on ``Physical Models of Living Systems''

    NASA Astrophysics Data System (ADS)

    Nelson, Philip

    2015-03-01

    I'll describe an intermediate-level course on ``Physical Models of Living Systems.'' The only prerequisite is first-year university physics and calculus. The course is a response to rapidly growing interest among undergraduates in several science and engineering departments. Students acquire several research skills that are often not addressed in traditional courses, including: basic modeling skills, probabilistic modeling skills, data analysis methods, computer programming using a general-purpose platform like MATLAB or Python, dynamical systems, particularly feedback control. These basic skills, which are relevant to nearly any field of science or engineering, are presented in the context of case studies from living systems, including: virus dynamics; bacterial genetics and evolution of drug resistance; statistical inference; superresolution microscopy; synthetic biology; naturally evolved cellular circuits. Publication of a new textbook by WH Freeman and Co. is scheduled for December 2014. Supported in part by EF-0928048 and DMR-0832802.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Apte, A; Veeraraghavan, H; Oh, J

    Purpose: To present an open source and free platform to facilitate radiomics research: the “Radiomics toolbox” in CERR. Method: There is a scarcity of open source tools that support end-to-end modeling of image features to predict patient outcomes. The “Radiomics toolbox” strives to fill the need for such a software platform. The platform supports (1) import of various image modalities such as CT, PET, MR, SPECT and US; (2) contouring tools to delineate structures of interest; (3) extraction and storage of image-based features such as first-order statistics, gray-level co-occurrence and zone-size matrix based texture features, and shape features; and (4) statistical analysis. Statistical analysis of the extracted features is supported with basic functionality that includes univariate correlations and Kaplan-Meier curves, and advanced functionality that includes feature reduction and multivariate modeling. The graphical user interface and the data management are implemented in Matlab for ease of development and readability of the code by a wide audience. Open-source software developed in other programming languages is integrated to enhance various components of this toolbox, for example, the Java-based DCM4CHE for DICOM import and R for statistical analysis. Results: The Radiomics toolbox will be distributed as open source software under the GNU license. The toolbox was prototyped for modeling an oropharyngeal PET dataset at MSKCC; that analysis will be presented in a separate paper. Conclusion: The Radiomics Toolbox provides an extensible platform for extracting and modeling image features. To emphasize new uses of CERR for radiomics and image-based research, we have changed the name from the “Computational Environment for Radiotherapy Research” to the “Computational Environment for Radiological Research”.
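
    The gray-level co-occurrence features mentioned in (3) start from a co-occurrence matrix: counts of how often gray levels i and j appear at a fixed pixel offset, normalised to probabilities. A minimal sketch of that first step and one derived feature (contrast); this is illustrative only, not the toolbox's actual implementation:

```python
import numpy as np

def glcm(img, levels, dx=1, dy=0):
    """Normalised gray-level co-occurrence matrix for one pixel offset.

    `img` is a 2D array of integer gray levels in [0, levels).
    Counts pairs (img[r, c], img[r + dy, c + dx]).
    """
    img = np.asarray(img)
    m = np.zeros((levels, levels))
    rows, cols = img.shape
    for r in range(rows - dy):
        for c in range(cols - dx):
            m[img[r, c], img[r + dy, c + dx]] += 1
    return m / m.sum()

def contrast(p):
    """GLCM contrast: sum over (i - j)^2 * p(i, j)."""
    i, j = np.indices(p.shape)
    return float(((i - j) ** 2 * p).sum())
```

    Real radiomics pipelines additionally quantise intensities to a fixed number of levels, accumulate over several offsets and directions, and often symmetrise the matrix before computing features.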

  6. Trends in basic mathematical competencies of beginning undergraduates in Ireland, 2003-2013

    NASA Astrophysics Data System (ADS)

    Treacy, Páraic; Faulkner, Fiona

    2015-11-01

    Deficiencies in beginning undergraduate students' basic mathematical skills have been an issue of concern in higher education, particularly over the past 15 years. This issue has been tracked and analysed in a number of universities in Ireland and internationally through student scores recorded in mathematics diagnostic tests. Students beginning their science-based and technology-based undergraduate courses at the University of Limerick have had their basic mathematics skills tested, without prior warning, through a 40-question diagnostic test during their initial service mathematics lecture since 1998. Data gathered through this diagnostic test have been recorded in a database kept at the university and explored to track trends in the mathematical competency of these beginning undergraduates. This paper details findings from an analysis of the database between 2003 and 2013, outlining changes in the mathematical competencies of these beginning undergraduates in an attempt to determine reasons for such changes. The analysis found that the proportion of students tested through this diagnostic test who are predicted to be at risk of failing their service mathematics end-of-semester examinations increased significantly between 2003 and 2013. Furthermore, when students' performance in secondary-level mathematics was controlled for, the performance of beginning undergraduates in 2013 was statistically significantly below that of beginning undergraduates recorded ten years earlier.

  7. The Future of Basic Science in Academic Surgery: Identifying Barriers to Success for Surgeon-scientists.

    PubMed

    Keswani, Sundeep G; Moles, Chad M; Morowitz, Michael; Zeh, Herbert; Kuo, John S; Levine, Matthew H; Cheng, Lily S; Hackam, David J; Ahuja, Nita; Goldstein, Allan M

    2017-06-01

    The aim of this study was to examine the challenges confronting surgeons performing basic science research in today's academic surgery environment. Multiple studies have identified challenges confronting surgeon-scientists that impact their ability to be successful. Although these threats have been known for decades, the downward trend in the number of successful surgeon-scientists continues. Clinical demands, funding challenges, and other factors play important roles, but a rigorous analysis of academic surgeons and their experiences regarding these issues has not previously been performed. An online survey was distributed to 2504 members of the Association for Academic Surgery and the Society of University Surgeons to determine factors impacting success. Survey results were subjected to statistical analyses. We also reviewed publicly available data regarding funding from the National Institutes of Health (NIH). NIH data revealed a 27% decline in the proportion of NIH funding to surgical departments relative to total NIH funding from 2007 to 2014. A total of 1033 (41%) members responded to our survey, making this the largest survey of academic surgeons to date. Surgeons most often cited the following factors as major impediments to pursuing basic investigation: pressure to be clinically productive, excessive administrative responsibilities, difficulty obtaining extramural funding, and desire for work-life balance. Surprisingly, a majority (68%) of respondents, including those in departmental leadership, did not believe surgeons can be successful basic scientists in today's environment. We have identified important barriers that confront academic surgeons pursuing basic research, and a perception that success in basic science may no longer be achievable. These barriers need to be addressed to ensure the continued development of future surgeon-scientists.

  8. WINPEPI updated: computer programs for epidemiologists, and their teaching potential

    PubMed Central

    2011-01-01

    Background The WINPEPI computer programs for epidemiologists are designed for use in practice and research in the health field and as learning or teaching aids. The programs are free, and can be downloaded from the Internet. Numerous additions have been made in recent years. Implementation There are now seven WINPEPI programs: DESCRIBE, for use in descriptive epidemiology; COMPARE2, for use in comparisons of two independent groups or samples; PAIRSetc, for use in comparisons of paired and other matched observations; LOGISTIC, for logistic regression analysis; POISSON, for Poisson regression analysis; WHATIS, a "ready reckoner" utility program; and ETCETERA, for miscellaneous other procedures. The programs now contain 122 modules, each of which provides a number, sometimes a large number, of statistical procedures. The programs are accompanied by a Finder that indicates which modules are appropriate for different purposes. The manuals explain the uses, limitations and applicability of the procedures, and furnish formulae and references. Conclusions WINPEPI is a handy resource for a wide variety of statistical routines used by epidemiologists. Because of its ready availability, portability, ease of use, and versatility, WINPEPI has a considerable potential as a learning and teaching aid, both with respect to practical procedures in the planning and analysis of epidemiological studies, and with respect to important epidemiological concepts. It can also be used as an aid in the teaching of general basic statistics. PMID:21288353

  9. Dermatological and respiratory problems in migrant construction workers of Udupi, Karnataka.

    PubMed

    Banerjee, Mayuri; Kamath, Ramachandra; Tiwari, Rajnarayan R; Nair, Narayana Pillai Sreekumaran

    2015-01-01

    India, being a developing country, has a tremendous demand for physical infrastructure and construction work; as a result, there is a rising demand for construction workers. Workers in the construction industry are mainly migratory and employed on a contract or subcontract basis. These workers face a temporary relationship between employer and employee, uncertainty in working hours, the contracting and subcontracting system, lack of continuous employment, lack of basic amenities, and inadequacy of welfare schemes. The objective was to estimate the prevalence of respiratory and dermatological symptoms among migratory construction workers. This cross-sectional study was conducted in Manipal, Karnataka, among 340 male migratory construction workers. A standard modified questionnaire was administered by the interviewer, and a physical examination of the workers was performed by a physician. The statistical analysis was done using Statistical Package for the Social Sciences (SPSS) version 15.0. Eighty percent of the workers belonged to the age group of 18-30 years. The mean age of the workers was 26 ± 8.2 years. Most (43.8%) of the workers were from West Bengal, followed by those from Bihar and Jharkhand. The prevalence rates of respiratory and dermatological symptoms were 33.2% and 36.2%, respectively. Migrant construction workers suffer from a high proportion of respiratory and dermatological problems.

  10. [Sanitation and racial inequality conditions in urban Brazil: an analysis focused on the indigenous population based on the 2010 Population Census].

    PubMed

    Raupp, Ludimila; Fávaro, Thatiana Regina; Cunha, Geraldo Marcelo; Santos, Ricardo Ventura

    2017-01-01

    The aims of this study were to analyze and describe the presence and infrastructure of basic sanitation in the urban areas of Brazil, contrasting indigenous with non-indigenous households. Methods: A cross-sectional study based on microdata from the 2010 Census was conducted. The analyses were based on descriptive statistics (prevalences) and on multiple logistic regression models (adjusted for socioeconomic and demographic covariates). Odds ratios were estimated for the association between the explanatory variables (covariates) and the outcome variables (water supply, sewage, garbage collection, and adequate sanitation). The statistical significance level was set at 5%. Among the services analyzed, sewage proved to be the most precarious. Regarding race or color, indigenous households presented the lowest rates of sanitary infrastructure in urban Brazil. The adjusted regression showed that, in general, indigenous households were at a disadvantage compared with other categories of race or color, especially in terms of the presence of garbage collection services. These inequalities were much more pronounced in the South and Southeast regions. The analyses of this study not only confirm the profile of poor conditions and infrastructure in the basic sanitation of indigenous households in urban areas, but also demonstrate the persistence of inequalities associated with race or color in the country.

  11. A team public health research project for first-year pharmacy students to apply content from didactic courses.

    PubMed

    Fuentes, David; Deguire, Nancy; Patel, Rajul; Boyce, Eric

    2010-08-10

    To implement and assess a first-year pharmacy student group research project that provided practical hands-on application and reinforced the curricula of concurrent didactic courses. Groups of 6 to 7 students chose a public health topic based on the Healthy People 2010 Priority Areas and created a survey instrument. Faculty facilitated mock institutional review board (IRB) review sessions, which provided teams with ongoing feedback and refinement recommendations before each team administered its survey instrument to a predefined population. Teams presented their data analyses, formal written reports, and oral presentations to peers and project faculty members. Teams complied with the requirements of the mock IRB, effectively applied basic research principles learned in class, collected survey data, performed inferential statistical analyses on the data, and presented their project findings. Two hundred six of 210 students (98%) reported feeling satisfied with both the results of their project and the accomplishments of their team. Teams applied a varied skill set, including primary literature evaluation, basic research principles, statistics, public speaking, and peer collaboration, in conducting a public health research project. First-year pharmacy students may benefit from participation in a collaborative research project that provides hands-on application of material being taught in didactic courses.

  12. The influence of a ten-week Nordic walking training-rehabilitation program on the level of lipids in blood in overweight and obese postmenopausal women

    PubMed Central

    Hagner-Derengowska, Magdalena; Kałużny, Krystian; Hagner, Wojciech; Kochański, Bartosz; Plaskiewicz, Anna; Borkowska, Alina; Bronisz, Agata; Budzyński, Jacek

    2015-01-01

    [Purpose] The aim of this study was to evaluate the effect of a ten-week Nordic Walking (NW) rehabilitation program on chosen anthropometric parameters and the level of basic lipids in overweight and obese postmenopausal women’s blood. [Subjects and Methods] The subjects were 32 women aged 50–68 (average: 59.7 ± 5.9 years). The study was carried out following a non-randomized model and entailed NW rehabilitation 5 times a week for 10 weeks, as well as a low-calorie 1,500 kcal diet. The therapeutic results of the study were measured through changes in anthropometric and biochemical parameters. The results were subjected to statistical analysis. [Results] After 10 weeks of NW rehabilitation, participants lost weight and their body mass index dropped. Additionally, levels of total cholesterol, LDL and triglycerides dropped, while the level of HDL increased. [Conclusion] Rehabilitation carried out according to the NW model resulted in statistically significant changes in basic blood lipids, which considerably increased the percentage of persons who achieved the recommended level of blood lipids. Obese persons were characterised by a smaller rehabilitation weight loss. More intense workouts and cooperation with a dietician are required. PMID:26644639

  13. National policies for technical change: Where are the increasing returns to economic research?

    PubMed Central

    Pavitt, Keith

    1996-01-01

    Improvements over the past 30 years in statistical data, analysis, and related theory have strengthened the basis for science and technology policy by confirming the importance of technical change in national economic performance. But two important features of scientific and technological activities in the Organization for Economic Cooperation and Development countries are still not addressed adequately in mainstream economics: (i) the justification of public funding for basic research and (ii) persistent international differences in investment in research and development and related activities. In addition, one major gap is now emerging in our systems of empirical measurement—the development of software technology, especially in the service sector. There are therefore dangers of diminishing returns to the usefulness of economic research, which continues to rely completely on established theory and established statistical sources. Alternative propositions that deserve serious consideration are: (i) the economic usefulness of basic research is in the provision of (mainly tacit) skills rather than codified and applicable information; (ii) in developing and exploiting technological opportunities, institutional competencies are just as important as the incentive structures that they face; and (iii) software technology developed in traditional service sectors may now be a more important locus of technical change than software technology developed in “high-tech” manufacturing. PMID:8917481

  14. Key statistical and analytical issues for evaluating treatment effects in periodontal research.

    PubMed

    Tu, Yu-Kang; Gilthorpe, Mark S

    2012-06-01

    Statistics is an indispensable tool for evaluating treatment effects in clinical research. Due to the complexities of periodontal disease progression and data collection, statistical analyses for periodontal research have been a great challenge for both clinicians and statisticians. The aim of this article is to provide an overview of several basic, but important, statistical issues related to the evaluation of treatment effects and to clarify some common statistical misconceptions. Some of these issues are general, concerning many disciplines, and some are unique to periodontal research. We first discuss several statistical concepts that have sometimes been overlooked or misunderstood by periodontal researchers. For instance, decisions about whether to use the t-test or analysis of covariance, or whether to use parametric tests such as the t-test or its non-parametric counterpart, the Mann-Whitney U-test, have perplexed many periodontal researchers. We also describe more advanced methodological issues that have sometimes been overlooked by researchers. For instance, the phenomenon of regression to the mean is a fundamental issue to be considered when evaluating treatment effects, and collinearity amongst covariates is a conundrum that must be resolved when explaining and predicting treatment effects. Quick and easy solutions to these methodological and analytical issues are not always available in the literature, and careful statistical thinking is paramount when conducting useful and meaningful research. © 2012 John Wiley & Sons A/S.

  15. Network meta-analysis: a technique to gather evidence from direct and indirect comparisons

    PubMed Central

    2017-01-01

    Systematic reviews and pairwise meta-analyses of randomized controlled trials, at the intersection of clinical medicine, epidemiology and statistics, are positioned at the top of the evidence-based practice hierarchy. These are important tools for drug approval, the formulation of clinical protocols and guidelines, and decision-making. However, this traditional technique yields only part of the information that clinicians, patients and policy-makers need to make informed decisions, since it usually compares only two interventions at a time. On the market, regardless of the clinical condition under evaluation, many interventions are usually available, and few of them have been studied in head-to-head trials. This scenario precludes conclusions from being drawn about the full profile of all interventions (e.g. efficacy and safety). The recent development and introduction of a new technique, usually referred to as network meta-analysis, indirect meta-analysis, or multiple or mixed treatment comparisons, has allowed the estimation of metrics for all possible comparisons in the same model, simultaneously gathering direct and indirect evidence. Over the last years this statistical tool has matured as a technique, with models available for all types of raw data, producing different pooled effect measures, using both frequentist and Bayesian frameworks, with different software packages. However, the conduct, reporting and interpretation of network meta-analysis still pose multiple challenges that should be carefully considered, especially because this technique inherits all the assumptions of pairwise meta-analysis but with increased complexity. Thus, we aim to provide a basic explanation of how a network meta-analysis is conducted, highlighting its risks and benefits for evidence-based practice, including information on the evolution of statistical methods, assumptions and steps for performing the analysis. PMID:28503228
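    The simplest building block of the indirect evidence described above is the Bucher adjusted indirect comparison, which network meta-analysis generalizes to whole networks: given direct estimates of A vs. B and C vs. B on the log-odds-ratio scale, the A vs. C contrast is obtained by subtraction, with the variances adding. A sketch with hypothetical numbers (not from any real trial network):

    ```python
    import math

    # Hypothetical direct estimates on the log-odds-ratio scale
    # (illustrative values only).
    d_AB, se_AB = -0.40, 0.15   # treatment A vs. common comparator B
    d_CB, se_CB = -0.10, 0.20   # treatment C vs. common comparator B

    # Bucher adjusted indirect comparison: A vs. C through the common comparator B.
    d_AC = d_AB - d_CB
    se_AC = math.sqrt(se_AB**2 + se_CB**2)   # variances add for indirect links
    lo = d_AC - 1.96 * se_AC
    hi = d_AC + 1.96 * se_AC
    print(f"indirect log-OR A vs C = {d_AC:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
    ```

    Note how the indirect standard error is larger than either direct one; this loss of precision is one reason the consistency assumptions of network meta-analysis matter.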

  16. Annual statistical report 2008 : based on data from CARE/EC

    DOT National Transportation Integrated Search

    2008-10-31

    This Annual Statistical Report provides the basic characteristics of road accidents in 19 member states of the European Union for the period 1997-2006, on the basis of data collected and processed in the CARE database, the Community Road Accident...

  17. Country Education Profiles: Algeria.

    ERIC Educational Resources Information Center

    International Bureau of Education, Geneva (Switzerland).

    One of a series of profiles prepared by the Cooperative Educational Abstracting Service, this brief outline provides basic background information on educational principles, system of administration, structure and organization, curricula, and teacher training in Algeria. Statistics provided by the Unesco Office of Statistics show enrollment at all…

  18. Analysis of the economic structure of the eating-out sector: The case of Spain.

    PubMed

    Cabiedes-Miragaya, Laura

    2017-12-01

    The objective of this article is to analyse the structure of the Spanish eating-out sector from an economic point of view, and more specifically, from the supply perspective. This aspect has been studied less than the demand side, almost certainly due to the gaps which exist in available official statistics in Spain, and which have been filled basically with consumer surveys. For this reason, focus is also placed on the economic relevance of the sector and attention is drawn to the serious shortcomings regarding official statistics in this domain, in contrast to the priority that hotel industry statistics have traditionally received in Spain. Based on official statistics, a descriptive analysis was carried out, focused mainly, though not exclusively, on diverse structural aspects of the sector. Special emphasis was placed on issues such as business demography (for instance, number and types of enterprises, survival rates, size distribution, and age structure), market concentration and structure of costs. Among other conclusions, the analysis allowed us to conclude that: part of the sector is more concentrated than it may at first appear to be; the dual structure of the sector described by the literature in relation to other countries is also present in the Spanish case; and the impact of ICTs (Information and Communication Technologies) on the sector is, and will foreseeably continue to be, particularly relevant. The main conclusion of this study refers to the fact that consumers have gained prominence in their contribution to shaping the structure of the sector. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Upper extremity disorders in heavy industry workers in Greece.

    PubMed

    Tsouvaltzidou, Thomaella; Alexopoulos, Evangelos; Fragkakis, Ioannis; Jelastopulu, Eleni

    2017-06-18

    To investigate disability due to musculoskeletal disorders of the upper extremities in heavy industry workers. The population under study consisted of 802 employees, both white- and blue-collar, working in a shipyard industry in Athens, Greece. Data were collected through the distribution of questionnaires and the recording of individual and job-related characteristics during the period 2006-2009. The questionnaires used were the Quick Disabilities of the Arm, Shoulder and Hand (QD) Outcome Measure, the Work Ability Index (WAI) and the Short-Form-36 (SF-36) Health Survey. The QD was divided into three parameters (movement restrictions in everyday activities, work, and sports/music activities) and the SF-36 into two items, physical and emotional. Multiple linear regression analysis was performed with the SPSS v.22 for Windows statistical package. The answers given by the participants to the QD did not reveal great discomfort in the execution of manual tasks, with the majority of participants scoring under 5%, meaning no disability. In the multiple linear regression, age showed a positive association with restrictions in everyday activities (b = 0.64, P < 0.001). Basic education showed a statistically significant association with restrictions during leisure activities, with b = 2.140 (P = 0.029) for compulsory education graduates. The WAI's final score showed negative associations in the regression analysis of all three parameters, with b = -0.142 (P < 0.001), b = -0.099 (P = 0.055) and b = -0.376 (P = 0.001), respectively, while the physical and emotional components of the SF-36 were associated with movement restrictions only in daily activities and work. The participants' specialty showed no statistically significant association with any of the three parameters of the QD. Increased musculoskeletal disorders of the upper extremity are associated with older age, lower basic education, poorer physical and mental/emotional health, and reduced working ability.

  20. When Statistical Literacy Really Matters: Understanding Published Information about the HIV/AIDS Epidemic in South Africa

    ERIC Educational Resources Information Center

    Hobden, Sally

    2014-01-01

    Information on the HIV/AIDS epidemic in Southern Africa is often interpreted through a veil of secrecy and shame and, I argue, with flawed understanding of basic statistics. This research determined the levels of statistical literacy evident in 316 future Mathematical Literacy teachers' explanations of the median in the context of HIV/AIDS…

  1. Fracture mechanics concepts in reliability analysis of monolithic ceramics

    NASA Technical Reports Server (NTRS)

    Manderscheid, Jane M.; Gyekenyesi, John P.

    1987-01-01

    Basic design concepts for high-performance, monolithic ceramic structural components are addressed. The design of brittle ceramics differs from that of ductile metals because of the inability of ceramic materials to redistribute high local stresses caused by inherent flaws. Random flaw size and orientation require that a probabilistic analysis be performed in order to determine component reliability. The current trend in probabilistic analysis is to combine linear elastic fracture mechanics concepts with the two-parameter Weibull distribution function to predict component reliability under multiaxial stress states. Nondestructive evaluation supports this analytical effort by supplying data during verification testing. It can also help to determine statistical parameters which describe the material strength variation, in particular the material threshold strength (the third Weibull parameter), which in the past was often taken as zero for simplicity.
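    The failure-probability model the record refers to can be sketched directly. Assuming a three-parameter Weibull distribution with shape m, scale sigma0 and threshold sigma_u (illustrative values only; setting sigma_u = 0 recovers the two-parameter form mentioned above):

    ```python
    import math

    # Hypothetical Weibull parameters for a ceramic strength distribution
    # (illustrative values, not from any real material dataset):
    # shape m, scale sigma0 (MPa), threshold sigma_u (MPa).
    m, sigma0, sigma_u = 10.0, 400.0, 0.0   # sigma_u = 0: two-parameter Weibull

    def failure_probability(sigma):
        """Three-parameter Weibull probability of failure at applied stress sigma."""
        if sigma <= sigma_u:
            return 0.0   # below the threshold strength, no failure is predicted
        return 1.0 - math.exp(-(((sigma - sigma_u) / sigma0) ** m))

    def reliability(sigma):
        return 1.0 - failure_probability(sigma)

    for s in (200.0, 400.0, 500.0):
        print(f"stress {s:5.0f} MPa -> reliability {reliability(s):.4f}")
    ```

    At the scale stress (here 400 MPa) the reliability is exp(-1), about 0.368, by definition of the Weibull scale parameter; a nonzero threshold sigma_u would shift the whole curve to the right.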

  2. ARTiiFACT: a tool for heart rate artifact processing and heart rate variability analysis.

    PubMed

    Kaufmann, Tobias; Sütterlin, Stefan; Schulz, Stefan M; Vögele, Claus

    2011-12-01

    The importance of appropriate handling of artifacts in interbeat interval (IBI) data must not be underestimated. Even a single artifact may cause unreliable heart rate variability (HRV) results. Thus, a robust artifact detection algorithm and the option for manual intervention by the researcher form key components for confident HRV analysis. Here, we present ARTiiFACT, a software tool for processing electrocardiogram and IBI data. Both automated and manual artifact detection and correction are available in a graphical user interface. In addition, ARTiiFACT includes time- and frequency-based HRV analyses and descriptive statistics, thus offering the basic tools for HRV analysis. Notably, all program steps can be executed separately and allow for data export, thus offering high flexibility and interoperability with a whole range of applications.
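    ARTiiFACT itself is a GUI tool and its internals are not shown here, but the time-domain HRV statistics it reports include standard measures such as SDNN and RMSSD. As a hedged sketch of what such measures compute from an artifact-free interbeat-interval series (hypothetical IBI values in milliseconds):

    ```python
    import math

    # Hypothetical artifact-free interbeat intervals (IBI) in milliseconds.
    ibi = [812, 798, 805, 821, 809, 795, 802, 818]

    # SDNN: sample standard deviation of the IBI series.
    mean_ibi = sum(ibi) / len(ibi)
    sdnn = math.sqrt(sum((x - mean_ibi) ** 2 for x in ibi) / (len(ibi) - 1))

    # RMSSD: root mean square of successive differences.
    diffs = [b - a for a, b in zip(ibi, ibi[1:])]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))

    print(f"mean IBI = {mean_ibi:.1f} ms, SDNN = {sdnn:.2f} ms, RMSSD = {rmssd:.2f} ms")
    ```

    This also illustrates why a single artifact matters: one spurious 1600 ms interval in the list above would dominate both the successive differences and the variance, inflating RMSSD and SDNN far beyond their true values.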

  3. Can PC-9 Zhong chong replace K-1 Yong quan for the acupunctural resuscitation of a bilateral double-amputee? Stating the “random criterion problem” in its statistical analysis

    PubMed Central

    Inchauspe, Adrián Angel

    2016-01-01

    AIM: To present an inclusion criterion for patients who have suffered bilateral amputation in order to be treated with the supplementary resuscitation treatment which is hereby proposed by the author. METHODS: This work is based on a Retrospective Cohort model so that a certainly lethal risk to the control group is avoided. RESULTS: This paper presents a hypothesis on acupunctural PC-9 Zhong chong point, further supported by previous statistical work recorded for the K-1 Yong quan resuscitation point. CONCLUSION: Thanks to the application of the resuscitation maneuver herein proposed on the previously mentioned point, patients with bilateral amputation would have another alternative treatment available in case basic and advanced CPR should fail. PMID:27152257

  4. INFORMATION: THEORY, BRAIN, AND BEHAVIOR

    PubMed Central

    Jensen, Greg; Ward, Ryan D.; Balsam, Peter D.

    2016-01-01

    In the 65 years since its formal specification, information theory has become an established statistical paradigm, providing powerful tools for quantifying probabilistic relationships. Behavior analysis has begun to adopt these tools as a novel means of measuring the interrelations between behavior, stimuli, and contingent outcomes. This approach holds great promise for making more precise determinations about the causes of behavior and the forms in which conditioning may be encoded by organisms. In addition to providing an introduction to the basics of information theory, we review some of the ways that information theory has informed the studies of Pavlovian conditioning, operant conditioning, and behavioral neuroscience. In addition to enriching each of these empirical domains, information theory has the potential to act as a common statistical framework by which results from different domains may be integrated, compared, and ultimately unified. PMID:24122456

  5. Energy transfer mechanism and probability analysis of submarine pipe laterally impacted by dropped objects

    NASA Astrophysics Data System (ADS)

    Liang, Jing; Yu, Jian-xing; Yu, Yang; Lam, W.; Zhao, Yi-yu; Duan, Jing-hui

    2016-06-01

    The energy transfer ratio is the basic factor affecting the level of pipe damage during an impact between a dropped object and a submarine pipe. For the purpose of studying the energy transfer and damage mechanism of submarine pipes impacted by dropped objects, a series of experiments was designed and carried out. The effective yield strength is deduced to make the quasi-static analysis more reliable, and the normal distribution of the energy transfer ratio caused by lateral impact on pipes is presented by statistical analysis of the experimental results based on the effective yield strength, which provides an experimental and theoretical basis for the risk analysis of submarine pipe systems impacted by dropped objects. Failure strains of the pipe material are confirmed by comparing experimental results with finite element simulation. In addition, sensitivity analysis of the finite element simulation shows that impact contact area and impact time are the major factors influencing energy transfer.

  6. Bench to bedside: the quest for quality in experimental stroke research.

    PubMed

    Dirnagl, Ulrich

    2006-12-01

    Over the past decades, great progress has been made in clinical as well as experimental stroke research. Disappointingly, however, hundreds of clinical trials testing neuroprotective agents have failed despite efficacy in experimental models. Recently, several systematic reviews have exposed a number of important deficits in the quality of preclinical stroke research. Many of the issues raised in these reviews are not specific to experimental stroke research, but apply to studies of animal models of disease in general. It is the aim of this article to review some quality-related sources of bias with a particular focus on experimental stroke research. Weaknesses discussed include, among others, low statistical power and hence reproducibility, defects in statistical analysis, lack of blinding and randomization, lack of quality-control mechanisms, deficiencies in reporting, and negative publication bias. Although quantitative evidence for quality problems at present is restricted to preclinical stroke research, to spur discussion and in the hope that they will be exposed to meta-analysis in the near future, I have also included some quality-related sources of bias, which have not been systematically studied. Importantly, these may be also relevant to mechanism-driven basic stroke research. I propose that by a number of rather simple measures reproducibility of experimental results, as well as the step from bench to bedside in stroke research may be made more successful. However, the ultimate proof for this has to await successful phase III stroke trials, which were built on basic research conforming to the criteria as put forward in this article.

  7. Little science, big science: strategies for research portfolio selection in academic surgery departments.

    PubMed

    Shah, Anand; Pietrobon, Ricardo; Cook, Chad; Sheth, Neil P; Nguyen, Lam; Guo, Lucie; Jacobs, Danny O; Kuo, Paul C

    2007-12-01

    To evaluate National Institutes of Health (NIH) funding for academic surgery departments and to determine whether optimal portfolio strategies exist to maximize this funding. The NIH budget is expected to be relatively stable in the foreseeable future, with a modest 0.7% increase from 2005 to 2006. Funding for basic and clinical science research in surgery is also not expected to increase. NIH funding award data for US surgery departments from 2002 to 2004 was collected using publicly available data abstracted from the NIH Information for Management, Planning, Analysis, and Coordination (IMPAC) II database. Additional information was collected from the Computer Retrieval of Information on Scientific Projects (CRISP) database regarding research area (basic vs. clinical, animal vs. human, classification of clinical and basic sciences). The primary outcome measures were total NIH award amount, number of awards, and type of grant. Statistical analysis was based on binomial proportional tests and multiple linear regression models. The smallest total NIH funding award in 2004 to an individual surgery department was a single $26,970 grant, whereas the largest was more than $35 million comprising 68 grants. From 2002 to 2004, one department experienced a 336% increase (greatest increase) in funding, whereas another experienced a 73% decrease (greatest decrease). No statistically significant differences were found between departments with decreasing or increasing funding and the subspecialty of basic science or clinical research funded. Departments (n = 5) experiencing the most drastic decrease (total dollars) in funding had a significantly higher proportion of type K (P = 0.03) grants compared with departments (n = 5) with the largest increases in total funding; the latter group had a significantly increased proportion of type U grants (P = 0.01). A linear association between the amount of decrease/increase was found with the average amount of funding per grant and per investigator (P < 0.01), suggesting that departments that increased their total funding relied on investigators with large amounts of funding per grant. Although incentives to junior investigators and clinicians with secondary participation in research are important, our findings suggest that the best strategy for increasing NIH funding for surgery departments is to invest in individuals with focused research commitments and established track records of garnering large and multiple research grants.

  8. Diversity and association of phenotypic and metabolomic traits in the close model grasses Brachypodium distachyon, B. stacei and B. hybridum

    PubMed Central

    López-Álvarez, Diana; Zubair, Hassan; Beckmann, Manfred; Draper, John

    2017-01-01

    Background and Aims: Morphological traits in combination with metabolite fingerprinting were used to investigate inter- and intraspecies diversity within the model annual grasses Brachypodium distachyon, Brachypodium stacei and Brachypodium hybridum. Methods: Phenotypic variation of 15 morphological characters and 2219 nominal mass (m/z) signals generated using flow infusion electrospray ionization–mass spectrometry (FIE–MS) were evaluated in individuals from a total of 174 wild populations and six inbred lines, and 12 lines, of the three species, respectively. Basic statistics and multivariate principal component analysis and discriminant analysis were used to differentiate inter- and intraspecific variability of the two types of variable, and their association was assayed with the rcorr function. Key Results: Basic statistics and analysis of variance detected eight phenotypic characters [(stomata) leaf guard cell length, pollen grain length, (plant) height, second leaf width, inflorescence length, number of spikelets per inflorescence, lemma length, awn length] and 434 tentatively annotated metabolite signals that significantly discriminated the three species. Three phenotypic traits (pollen grain length, spikelet length, number of flowers per inflorescence) might be genetically fixed. The three species showed different metabolomic profiles. Discriminant analysis significantly discriminated the three taxa with both morphometric and metabolome traits, as well as the intraspecific phenotypic diversity within B. distachyon and B. stacei. The populations of B. hybridum were considerably less differentiated. Conclusions: Highly explanatory metabolite signals together with morphological characters revealed concordant patterns of differentiation of the three taxa. Intraspecific phenotypic diversity was observed between northern and southern Iberian populations of B. distachyon and between eastern Mediterranean/south-western Asian and western Mediterranean populations of B. stacei. Significant association was found for pollen grain length and lemma length and ten and six metabolomic signals, respectively. These results will guide the selection of new germplasm lines of the three model grasses in ongoing genome-wide association studies. PMID:28040672
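    The multivariate step mentioned in the record (principal component analysis on a trait matrix) can be sketched as follows, using a tiny made-up matrix of morphological measurements rather than the study's actual data:

    ```python
    import numpy as np

    # Hypothetical trait matrix: rows = individuals, columns = three
    # morphological measurements (illustrative values only).
    X = np.array([
        [5.1, 3.5, 1.4],
        [4.9, 3.0, 1.4],
        [6.2, 2.9, 4.3],
        [6.7, 3.1, 4.7],
        [5.8, 2.7, 4.1],
    ])

    Xc = X - X.mean(axis=0)                 # center each trait
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt.T                      # principal component scores
    explained = S**2 / np.sum(S**2)         # fraction of variance per component

    print("variance explained per component:", np.round(explained, 3))
    ```

    Individuals that cluster separately in the leading score columns correspond to the inter-species separation the record describes; a discriminant analysis would then use species labels explicitly rather than maximizing variance alone.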

  9. Design of point-of-care (POC) microfluidic medical diagnostic devices

    NASA Astrophysics Data System (ADS)

    Leary, James F.

    2018-02-01

    Design of inexpensive and portable hand-held microfluidic flow/image cytometry devices for initial medical diagnostics at the point of initial patient contact by emergency medical personnel in the field requires careful design in terms of power/weight requirements to allow for realistic portability as a hand-held, point-of-care medical diagnostics device. True portability also requires small micro-pumps for high-throughput capability. Weight/power requirements dictate use of super-bright LEDs and very small silicon photodiodes or nanophotonic sensors that can be powered by batteries. Signal-to-noise characteristics can be greatly improved by appropriately pulsing the LED excitation sources and sampling and subtracting noise in between excitation pulses. The requirements for basic computing, imaging, GPS and basic telecommunications can be simultaneously met by use of smartphone technologies, which become part of the overall device. Software for a user-interface system, limited real-time computing, real-time imaging, and offline data analysis can be accomplished through multi-platform software development systems that are well-suited to a variety of currently available cellphone technologies which already contain all of these capabilities. Microfluidic cytometry requires judicious use of small sample volumes and appropriate statistical sampling by microfluidic cytometry or imaging for adequate statistical significance to permit real-time (typically < 15 minutes) medical decisions for patients at the physician's office or real-time decision making in the field. One or two drops of blood obtained by pin-prick should be able to provide statistically meaningful results for use in making real-time medical decisions without the need for blood fractionation, which is not realistic in the field.

  10. 75 FR 33203 - Funding Formula for Grants to States

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-11

    ... as Social Security numbers, birth dates, and medical data. Docket: To read or download submissions or... Local Area Unemployment Statistics (LAUS), both of which are compiled by DOL's Bureau of Labor Statistics. Specifies how each State's basic JVSG allocation is calculated. Identifies the procedures...

  11. Statistical Considerations for Establishing CBTE Cut-Off Scores.

    ERIC Educational Resources Information Center

    Trzasko, Joseph A.

    This report gives the basic definition and purpose of competency-based teacher education (CBTE) cut-off scores. It describes the basic characteristics of CBTE as a yes-no dichotomous decision regarding the presence of a specific ability or knowledge, which necessitates the establishment of a cut-off point to designate competency vs. incompetency on…

  12. ADULT BASIC EDUCATION. PROGRAM SUMMARY.

    ERIC Educational Resources Information Center

    Office of Education (DHEW), Washington, DC.

    A brief description is given of the federal Adult Basic Education program, under the Adult Education Act of 1966, at the national and state levels (including Puerto Rico, Guam, American Samoa, and the Virgin Islands) as provided by state education agencies. Statistics for fiscal years 1965 and 1966, and estimates for fiscal year 1967, indicate…

  13. Action Research of Computer-Assisted-Remediation of Basic Research Concepts.

    ERIC Educational Resources Information Center

    Packard, Abbot L.; And Others

    This study investigated the possibility of creating a computer-assisted remediation program to assist students having difficulties in basic college research and statistics courses. A team approach involving instructors and students drove the research into and creation of the computer program. The effect of student use was reviewed by looking at…

  14. Introduction to Probability, Part 1 - Basic Concepts. Student Text. Revised Edition.

    ERIC Educational Resources Information Center

    Blakeslee, David W.; And Others

    This book is designed to introduce the reader to some fundamental ideas about probability. The mathematical theory of probability plays an increasingly important role in science, government, industry, business, and economics. An understanding of the basic concepts of probability is essential for the study of statistical methods that are widely…

  15. 77 FR 37059 - Draft Guidance for Industry on Active Controls in Studies To Demonstrate Effectiveness of a New...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-20

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2012-D-0419... who conduct studies using active controls and have a basic understanding of statistical principles... clinical investigators who conduct studies using active controls and have a basic understanding of...

  16. Perspectives and realities of teaching statistics at a superior school of business administration

    NASA Astrophysics Data System (ADS)

    Nunes, Sandra

    2016-06-01

    This paper aims to describe the reality of teaching statistics at a superior school of business administration in Portugal. It is supported by twenty years of experience teaching several disciplines belonging to the scientific area of Mathematics, such as Statistics and Probability, Data Analysis, Calculus, Algebra and Numerical Analysis. This experience is not limited to the school of business administration but also includes engineering and health courses, and in all these schools there has been a substantial increase in failure in these disciplines. I intend to present the main difficulties that teachers encounter. These difficulties are due to a diversity of problems. A leading cause is undoubtedly the huge heterogeneity in the level of knowledge that students have. The large number of students in each class is also a massive problem. I must point out that, in my opinion, the introduction of the Bologna process has aggravated this situation. The assumption of reduced classroom hours and an increase in self-study is extremely penalizing for such students. There are many challenges that teachers have to face: How to teach statistics to a class where more than half the students cannot interpret the basic concepts of mathematics? Is the approach of teaching statistics through software beneficial? Should the teaching of statistics be addressed in a more practical way? How can we instill critical thinking in students, enabling them to use the knowledge acquired to solve problems? How can we deal with and prevent the failure that is increasing each year? These are only a few of the questions to which all teachers need an answer.

  17. a Discussion about Effective Ways of Basic Resident Register on GIS

    NASA Astrophysics Data System (ADS)

    Oku, Naoya; Nonaka, Yasuaki; Ito, Yutaka

    2016-06-01

    In Japan, each municipality keeps a database of every resident's name, address, gender and date of birth called the Basic Resident Register. If the address information in the register is converted into coordinates by geocoding, it can be plotted as point data on a map. This would enable prompt evacuation in disasters, analysis of the distribution of residents, integration of statistics, and so on. Further, it can be used not only for analysis of the current situation but also for future planning. However, geographic information systems (GIS) incorporating the Basic Resident Register are not widely used in Japan because of the following problems:
    - Geocoding: In order to plot address point data, it is necessary to match the Basic Resident Register and the address dictionary using the address as a key. The information in the Basic Resident Register does not always match the actual addresses. As the register is based on applications made by residents, the information is prone to errors, such as incorrect Kanji characters.
    - Security policy on personal information: In the register, the address of a resident is linked with his/her name and date of birth. If the information in the Basic Resident Register were to be leaked, it could be used for malicious purposes.
    This paper proposes solutions to the above problems. The suitable solution for each problem depends on the purpose of use; thus it is important that the purpose be defined and a suitable way of applying the register be chosen for each purpose. In this paper, we mainly focus on one specific purpose of use: analysing the distribution of residents. We provide two solutions to improve the matching rate in geocoding. First, regarding errors in Kanji characters, a correction list of possible errors should be compiled in advance. Second, some analyses, such as the distribution of residents, may not require the exactly correct position for the address point. Therefore we ordered the matching levels as prefecture, city, town, city-block, house-code and house, and decided to accept matches down to the city-block level. Moreover, in terms of the security policy on personal information, some of the information may not be needed for the distribution analysis. For example, personal information such as the resident's name should be excluded from the attributes of the address point in order to ensure the safe operation of the system.
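    The ordered matching levels described above can be sketched as follows (an illustrative sketch only: the level ordering follows the paper, but the address representation and function names are assumptions):

```python
# Hypothetical sketch of ordered geocoding match levels; names assumed.
LEVELS = ["prefecture", "city", "town", "city-block", "house-code", "house"]

def match_level(register_addr, dictionary_addr):
    """Deepest level at which two parsed addresses agree.
    Addresses are dicts mapping level name -> component string."""
    deepest = None
    for level in LEVELS:
        if register_addr.get(level) and register_addr.get(level) == dictionary_addr.get(level):
            deepest = level
        else:
            break
    return deepest

def accept(level, minimum="city-block"):
    """Accept a geocoding match if it reaches at least the minimum level."""
    return level is not None and LEVELS.index(level) >= LEVELS.index(minimum)
```

    With this fallback rule, a record that matches only down to the city level would be rejected for distribution analysis, while a city-block match is accepted even without an exact house position.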

  18. An adaptive approach to the dynamic allocation of buffer storage. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Crooke, S. C.

    1970-01-01

    Several strategies for the dynamic allocation of buffer storage are simulated and compared. The basic algorithms investigated, using actual statistics observed in the Univac 1108 EXEC 8 System, include the buddy method and the first-fit method. Modifications are made to the basic methods in an effort to improve and to measure allocation performance. A simulation model of an adaptive strategy is developed which permits interchanging the two different methods, the buddy and the first-fit methods with some modifications. Using an adaptive strategy, each method may be employed in the statistical environment in which its performance is superior to the other method.
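    The first-fit strategy compared in this thesis can be sketched as follows (an illustrative free-list model, not the Univac 1108 EXEC 8 implementation; the buddy method, by contrast, rounds requests up to powers of two and splits blocks in halves):

```python
def first_fit(free_list, request):
    """Allocate `request` words using the first-fit rule.
    `free_list` is a list of (address, size) holes sorted by address.
    Returns (allocated address, updated free list), or (None, free_list)
    if no hole is large enough."""
    for i, (addr, size) in enumerate(free_list):
        if size >= request:
            remainder = size - request
            # Split the hole: the unused tail stays on the free list.
            tail = [(addr + request, remainder)] if remainder else []
            return addr, free_list[:i] + tail + free_list[i + 1:]
    return None, free_list
```

    A request of 6 against holes [(0, 4), (10, 8), (30, 16)] skips the too-small first hole and splits the second, leaving a 2-word remainder at address 16.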

  19. Ultrasound Dopplerography of abdomen pathology using statistical computer programs

    NASA Astrophysics Data System (ADS)

    Dmitrieva, Irina V.; Arakelian, Sergei M.; Wapota, Alberto R. W.

    1998-04-01

    Modern ultrasound Dopplerography offers great possibilities for investigating hemodynamic changes at all stages of abdominal pathology. Many studies have been devoted to the use of noninvasive methods in practical medicine, and ultrasound Dopplerography is now one of the basic ones. We investigated 250 patients aged 30 to 77, including 149 men and 101 women. The basic diagnosis of all patients was ischaemic pancreatitis. Secondary diagnoses included ischaemic heart disease, hypertension, atherosclerosis, diabetes, and vascular disease of the extremities. We examined the abdominal aorta and its branches: arteria mesenterica superior (AMS), truncus coeliacus (TC), arteria hepatica communis (AHC), and arteria lienalis (AL). For the investigation we used the following equipment: ACUSON 128 XP/10c, BIOMEDIC, GENERAL ELECTRIC (USA, Japan). We analyzed the following components of hemodynamic change in the abdominal vessels: pulsation index, resistance index, systole-diastole ratio, and speed of blood circulation. The statistical programs included 'basic statistics' and an 'analytic program.' In conclusion, we determined that all hemodynamic components of the abdominal vessels showed considerably greater changes in abdominal ischaemia than in the normal situation. Using the computer program to determine the degree of hemodynamic change, we can recommend an individual plan of diagnostics and treatment.

  20. Statistical methods for the analysis of climate extremes

    NASA Astrophysics Data System (ADS)

    Naveau, Philippe; Nogaj, Marta; Ammann, Caspar; Yiou, Pascal; Cooley, Daniel; Jomelli, Vincent

    2005-08-01

    Currently there is increasing research activity in the area of climate extremes because they represent a key manifestation of non-linear systems and have an enormous impact on economic and social human activities. Our understanding of the mean behavior of climate and its 'normal' variability has improved significantly during the last decades. In comparison, climate extreme events have been hard to study and even harder to predict because they are, by definition, rare and obey different statistical laws than averages. In this context, the motivation for this paper is twofold. Firstly, we recall the basic principles of Extreme Value Theory, which is used on a regular basis in finance and hydrology but still does not enjoy the same success in climate studies. More precisely, the theoretical distributions of maxima and large peaks are recalled. The parameters of such distributions are estimated with the maximum likelihood estimation procedure, which offers the flexibility to take explanatory variables into account in our analysis. Secondly, we detail three case studies to show that this theory can provide a solid statistical foundation, especially when assessing the uncertainty associated with extreme events in a wide range of applications linked to the study of our climate. To cite this article: P. Naveau et al., C. R. Geoscience 337 (2005).
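    The "theoretical distributions of maxima and large peaks" referred to here are, under standard Extreme Value Theory assumptions, the generalized extreme value (GEV) and generalized Pareto families (standard results, not specific to this paper):

```latex
% GEV distribution of block maxima, with location \mu, scale \sigma > 0, shape \xi:
G(x) = \exp\left\{ -\left[ 1 + \xi \,\frac{x - \mu}{\sigma} \right]^{-1/\xi} \right\},
\qquad 1 + \xi \,\frac{x - \mu}{\sigma} > 0 .

% Excesses y over a high threshold follow the generalized Pareto distribution,
% with scale \tilde{\sigma} and the same shape \xi:
H(y) = 1 - \left( 1 + \frac{\xi\, y}{\tilde{\sigma}} \right)^{-1/\xi},
\qquad y > 0 .
```

    The sign of the shape parameter ξ separates the bounded (ξ < 0), Gumbel-like (ξ → 0), and heavy-tailed (ξ > 0) regimes, which is what the maximum likelihood fit on climate data must resolve.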

  1. Replicability of time-varying connectivity patterns in large resting state fMRI samples.

    PubMed

    Abrol, Anees; Damaraju, Eswar; Miller, Robyn L; Stephen, Julia M; Claus, Eric D; Mayer, Andrew R; Calhoun, Vince D

    2017-12-01

    The past few years have seen an emergence of approaches that leverage temporal changes in whole-brain patterns of functional connectivity (the chronnectome). In this chronnectome study, we investigate the replicability of the human brain's inter-regional coupling dynamics during rest by evaluating two different dynamic functional network connectivity (dFNC) analysis frameworks using 7 500 functional magnetic resonance imaging (fMRI) datasets. To quantify the extent to which the emergent functional connectivity (FC) patterns are reproducible, we characterize the temporal dynamics by deriving several summary measures across multiple large, independent age-matched samples. Reproducibility was demonstrated through the existence of basic connectivity patterns (FC states) amidst an ensemble of inter-regional connections. Furthermore, application of the methods to conservatively configured (statistically stationary, linear and Gaussian) surrogate datasets revealed that some of the studied state summary measures were indeed statistically significant and also suggested that this class of null model did not explain the fMRI data fully. This extensive testing of reproducibility of similarity statistics also suggests that the estimated FC states are robust against variation in data quality, analysis, grouping, and decomposition methods. We conclude that future investigations probing the functional and neurophysiological relevance of time-varying connectivity assume critical importance. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  2. Replicability of time-varying connectivity patterns in large resting state fMRI samples

    PubMed Central

    Abrol, Anees; Damaraju, Eswar; Miller, Robyn L.; Stephen, Julia M.; Claus, Eric D.; Mayer, Andrew R.; Calhoun, Vince D.

    2018-01-01

    The past few years have seen an emergence of approaches that leverage temporal changes in whole-brain patterns of functional connectivity (the chronnectome). In this chronnectome study, we investigate the replicability of the human brain’s inter-regional coupling dynamics during rest by evaluating two different dynamic functional network connectivity (dFNC) analysis frameworks using 7 500 functional magnetic resonance imaging (fMRI) datasets. To quantify the extent to which the emergent functional connectivity (FC) patterns are reproducible, we characterize the temporal dynamics by deriving several summary measures across multiple large, independent age-matched samples. Reproducibility was demonstrated through the existence of basic connectivity patterns (FC states) amidst an ensemble of inter-regional connections. Furthermore, application of the methods to conservatively configured (statistically stationary, linear and Gaussian) surrogate datasets revealed that some of the studied state summary measures were indeed statistically significant and also suggested that this class of null model did not explain the fMRI data fully. This extensive testing of reproducibility of similarity statistics also suggests that the estimated FC states are robust against variation in data quality, analysis, grouping, and decomposition methods. We conclude that future investigations probing the functional and neurophysiological relevance of time-varying connectivity assume critical importance. PMID:28916181

  3. Analysis of statistical misconception in terms of statistical reasoning

    NASA Astrophysics Data System (ADS)

    Maryati, I.; Priatna, N.

    2018-05-01

    Reasoning skill is needed by everyone in the globalization era, because every person has to be able to manage and use information from all over the world, which can be obtained easily. Statistical reasoning skill is the ability to collect, group, process, interpret, and draw conclusions from information. This skill can be developed at various levels of education. However, the skill remains low because many people, students included, assume that statistics is just the ability to count and use formulas. Students still have a negative attitude toward courses related to research. The purpose of this research is to analyze students' misconceptions in a descriptive statistics course with respect to statistical reasoning skill. The observation was done by analyzing the results of a misconception test and a statistical reasoning skill test, and by observing the effect of students' misconceptions on statistical reasoning skill. The sample of this research was 32 students of a mathematics education department who had taken the descriptive statistics course. The mean value of the misconception test was 49.7 with a standard deviation of 10.6, whereas the mean value of the statistical reasoning skill test was 51.8 with a standard deviation of 8.5. If a minimal value of 65 is taken as the standard achievement of course competence, the students' mean values are lower than the standard. The results of the misconception study emphasize which sub-topics should be considered. Based on the assessment results, it was found that students' misconceptions occur in: 1) writing mathematical sentences and symbols well, 2) understanding basic definitions, 3) determining the concept to be used in solving a problem. In statistical reasoning skill, the assessment measured reasoning about: 1) data, 2) representation, 3) statistical format, 4) probability, 5) samples, and 6) association.

  4. Bayesian statistics as a new tool for spectral analysis - I. Application for the determination of basic parameters of massive stars

    NASA Astrophysics Data System (ADS)

    Mugnes, J.-M.; Robert, C.

    2015-11-01

    Spectral analysis is a powerful tool to investigate stellar properties and it has been widely used for decades now. However, the methods considered to perform this kind of analysis are mostly based on iteration among a few diagnostic lines to determine the stellar parameters. While these methods are often simple and fast, they can lead to errors and large uncertainties due to the required assumptions. Here, we present a method based on Bayesian statistics to find simultaneously the best combination of effective temperature, surface gravity, projected rotational velocity, and microturbulence velocity, using all the available spectral lines. Different tests are discussed to demonstrate the strength of our method, which we apply to 54 mid-resolution spectra of field and cluster B stars obtained at the Observatoire du Mont-Mégantic. We compare our results with those found in the literature. Differences are seen which are well explained by the different methods used. We conclude that the B-star microturbulence velocities are often underestimated. We also confirm the trend that B stars in clusters are on average faster rotators than field B stars.
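    The Bayesian search over parameter combinations that the authors describe can be illustrated with a minimal grid sketch (illustrative only: a one-parameter Gaussian model with a flat prior, not the authors' multi-parameter stellar code):

```python
import math

def grid_posterior(data, grid, sigma=1.0):
    """Posterior over a parameter grid for a Gaussian model with known
    noise `sigma` and a flat prior: posterior ∝ likelihood.
    Returns normalized posterior weights, one per grid point."""
    logp = []
    for mu in grid:
        # Gaussian log-likelihood of the data given candidate parameter mu.
        ll = sum(-0.5 * ((x - mu) / sigma) ** 2 for x in data)
        logp.append(ll)
    m = max(logp)                       # subtract max for numerical stability
    w = [math.exp(l - m) for l in logp]
    s = sum(w)
    return [x / s for x in w]
```

    In the full problem the grid is multi-dimensional (effective temperature, surface gravity, rotational and microturbulence velocities) and the likelihood compares observed and synthetic line profiles, but the normalize-and-compare structure is the same.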

  5. [Analysis of variance of repeated data measured by water maze with SPSS].

    PubMed

    Qiu, Hong; Jin, Guo-qin; Jin, Ru-feng; Zhao, Wei-kang

    2007-01-01

    To introduce a method for analyzing repeated-measures data from the water maze with SPSS 11.0, and to offer a reference statistical method to clinical and basic medicine researchers who use repeated-measures designs. The repeated measures and multivariate analysis of variance (ANOVA) procedures of the general linear model in SPSS were used, with pairwise comparisons among different groups and different measurement times. First, Mauchly's test of sphericity should be used to judge whether there are relations among the repeatedly measured data. If any (P
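    The within-subjects F statistic underlying the SPSS procedure can be computed by hand; below is a minimal pure-Python sketch of the textbook one-way repeated-measures ANOVA (illustrative only, not the SPSS GLM implementation, and without the sphericity correction the article discusses):

```python
def rm_anova_F(scores):
    """One-way repeated-measures ANOVA F statistic.
    `scores` is a list of subjects, each a list of k repeated measurements."""
    n = len(scores)          # number of subjects
    k = len(scores[0])       # number of repeated conditions (e.g. test days)
    grand = sum(sum(s) for s in scores) / (n * k)
    subj_means = [sum(s) / k for s in scores]
    cond_means = [sum(s[j] for s in scores) / n for j in range(k)]
    ss_total = sum((x - grand) ** 2 for s in scores for x in s)
    ss_subj = k * sum((m - grand) ** 2 for m in subj_means)   # between subjects
    ss_cond = n * sum((m - grand) ** 2 for m in cond_means)   # between conditions
    ss_err = ss_total - ss_subj - ss_cond                     # residual
    df_cond, df_err = k - 1, (n - 1) * (k - 1)
    return (ss_cond / df_cond) / (ss_err / df_err)
```

    Removing the between-subjects sum of squares from the error term is what distinguishes this design from an ordinary one-way ANOVA on the same numbers.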

  6. A Simple Statistical Thermodynamics Experiment

    ERIC Educational Resources Information Center

    LoPresto, Michael C.

    2010-01-01

    Comparing the predicted and actual rolls of combinations of both two and three dice can help to introduce many of the basic concepts of statistical thermodynamics, including multiplicity, probability, microstates, and macrostates, and demonstrate that entropy is indeed a measure of randomness, that disordered states (those of higher entropy) are…
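    The dice demonstration can be reproduced numerically: each total is a macrostate whose multiplicity is the number of ordered rolls (microstates) producing it, so with two dice a total of 7 has multiplicity 6 out of 36. A short sketch:

```python
from itertools import product

def multiplicities(n_dice=2, sides=6):
    """Count microstates (ordered rolls) for each macrostate (total)."""
    counts = {}
    for roll in product(range(1, sides + 1), repeat=n_dice):
        counts[sum(roll)] = counts.get(sum(roll), 0) + 1
    return counts

two = multiplicities(2)    # e.g. two[7] == 6: the most probable macrostate
```

    Moving from two to three dice sharpens the peak around the middle totals, which is the entropy-as-randomness point the demonstration makes.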

  7. 76 FR 41756 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-15

    ... materials and supplies used in production. The economic census will produce basic statistics by kind of business on number of establishments, sales, payroll, employment, inventories, and operating expenses. It also will yield a variety of subject statistics, including sales by product line; sales by class of...

  8. Descriptive Statistics: Reporting the Answers to the 5 Basic Questions of Who, What, Why, When, Where, and a Sixth, So What?

    PubMed

    Vetter, Thomas R

    2017-11-01

    Descriptive statistics are specific methods basically used to calculate, describe, and summarize collected research data in a logical, meaningful, and efficient way. Descriptive statistics are reported numerically in the manuscript text and/or in its tables, or graphically in its figures. This basic statistical tutorial discusses a series of fundamental concepts about descriptive statistics and their reporting. The mean, median, and mode are 3 measures of the center or central tendency of a set of data. In addition to a measure of its central tendency (mean, median, or mode), another important characteristic of a research data set is its variability or dispersion (ie, spread). In simplest terms, variability is how much the individual recorded scores or observed values differ from one another. The range, standard deviation, and interquartile range are 3 measures of variability or dispersion. The standard deviation is typically reported for a mean, and the interquartile range for a median. Testing for statistical significance, along with calculating the observed treatment effect (or the strength of the association between an exposure and an outcome), and generating a corresponding confidence interval are 3 tools commonly used by researchers (and their collaborating biostatistician or epidemiologist) to validly make inferences and more generalized conclusions from their collected data and descriptive statistics. A number of journals, including Anesthesia & Analgesia, strongly encourage or require the reporting of pertinent confidence intervals. A confidence interval can be calculated for virtually any variable or outcome measure in an experimental, quasi-experimental, or observational research study design. Generally speaking, in a clinical trial, the confidence interval is the range of values within which the true treatment effect in the population likely resides. 
In an observational study, the confidence interval is the range of values within which the true strength of the association between the exposure and the outcome (eg, the risk ratio or odds ratio) in the population likely resides. There are many possible ways to graphically display or illustrate different types of data. While there is often latitude as to the choice of format, ultimately, the simplest and most comprehensible format is preferred. Common examples include a histogram, bar chart, line chart or line graph, pie chart, scatterplot, and box-and-whisker plot. Valid and reliable descriptive statistics can answer basic yet important questions about a research data set, namely: "Who, What, Why, When, Where, How, How Much?"
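    The measures discussed in this tutorial (mean, median, standard deviation, interquartile range, and a confidence interval for a mean) can be sketched with Python's standard library; the 1.96 multiplier assumes a normal approximation for a roughly 95% interval:

```python
import statistics

def describe(data, z=1.96):
    """Basic descriptive statistics plus a normal-approximation CI for the mean."""
    n = len(data)
    mean = statistics.mean(data)
    sd = statistics.stdev(data)              # sample standard deviation
    q = statistics.quantiles(data, n=4)      # quartiles -> interquartile range
    half = z * sd / n ** 0.5                 # margin of error for the mean
    return {
        "mean": mean,
        "median": statistics.median(data),
        "sd": sd,
        "iqr": q[2] - q[0],
        "range": max(data) - min(data),
        "ci": (mean - half, mean + half),
    }
```

    As the tutorial recommends, the standard deviation would accompany a reported mean and the interquartile range a reported median.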

  9. Assessment of knowledge and perceptions toward generic medicines among basic science undergraduate medical students at Aruba.

    PubMed

    Shankar, P Ravi; Herz, Burton L; Dubey, Arun K; Hassali, Mohamed A

    2016-10-01

    Use of generic medicines is important to reduce rising health-care costs, so proper knowledge and perception of generic medicines among medical students and doctors are important. Xavier University School of Medicine in Aruba admits students from the United States, Canada, and other countries to the undergraduate medical (MD) program. The present study was conducted to assess the knowledge and perception of generic medicines among basic science MD students. The cross-sectional study was conducted among first to fifth semester students during February 2015. A previously developed instrument was used. Basic demographic information was collected. Respondents' agreement with a set of statements was noted using a Likert-type scale. The calculated total score was compared among subgroups of respondents. The one-sample Kolmogorov-Smirnov test was used to assess normality of the distribution, the independent-samples t-test to compare the total score for dichotomous variables, and analysis of variance for the others. Fifty-six of the 85 students (65.8%) participated. Around 55% of respondents were between 20 and 25 years of age and of American nationality. Only three respondents (5.3%) provided the correct value of the regulatory bioequivalence limits. The mean total score was 43.41 (maximum 60). There was no significant difference in scores among subgroups. There was a significant knowledge gap with regard to the regulatory bioequivalence limits for generic medicines. Respondents' level of knowledge about other aspects of generic medicines was good but could be improved. Studies among clinical students in the institution and in other Caribbean medical schools are required. Deficiencies were noted and we have strengthened learning about generic medicines during the basic science years.

  10. ON THE DYNAMICAL DERIVATION OF EQUILIBRIUM STATISTICAL MECHANICS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prigogine, I.; Balescu, R.; Henin, F.

    1960-12-01

    Work on nonequilibrium statistical mechanics, which allows an extension of the kinetic proof to all results of equilibrium statistical mechanics involving a finite number of degrees of freedom, is summarized. As an introduction to the general N-body problem, the scattering theory in classical mechanics is considered. The general N-body problem is considered for the case of classical mechanics, quantum mechanics with Boltzmann statistics, and quantum mechanics including quantum statistics. Six basic diagrams, which describe the elementary processes of the dynamics of correlations, were obtained. (M.C.G.)

  11. 'New insight into statistical hydrology' preface to the special issue

    NASA Astrophysics Data System (ADS)

    Kochanek, Krzysztof

    2018-04-01

    Statistical methods are still the basic tool for investigating random, extreme events occurring in the hydrosphere. On 21-22 September 2017, the international workshop Statistical Hydrology (StaHy) 2017 took place in Warsaw (Poland) under the auspices of the International Association of Hydrological Sciences. The authors of the presentations were invited to publish their research results in this Special Issue of Acta Geophysica, 'New Insight into Statistical Hydrology'. Five papers were selected for publication, touching on the most crucial issues of statistical methodology in hydrology.

  12. Measuring the impact of air pollution on respiratory infection risk in China.

    PubMed

    Tang, Sanyi; Yan, Qinling; Shi, Wei; Wang, Xia; Sun, Xiaodan; Yu, Pengbo; Wu, Jianhong; Xiao, Yanni

    2018-01-01

    China is now experiencing major public health challenges caused by air pollution. Few studies have quantified the dynamics of air pollution and its impact on the risk of respiratory infection. We conducted an integrated data analysis to quantify the association among the air quality index (AQI), meteorological variables and respiratory infection risk in Shaanxi province of China from November 15th, 2010 to November 14th, 2016. Our analysis showed a statistically significant positive correlation between the number of influenza-like illness (ILI) cases and AQI, and the respiratory infection risk increased progressively with increasing AQI with a time lag of 0-3 days. We also developed mathematical models for the AQI trend and respiratory infection dynamics, incorporating AQI-dependent incidence and AQI-based behaviour change interventions. Our combined data and modelling analysis estimated the basic reproduction number for the respiratory infection during the study period to be 2.4076, higher than the basic reproduction number of the 2009 pandemic influenza in the same province. Our modelling-based simulations concluded that, in terms of respiratory infection risk reduction, persistent control of emissions in China's blue-sky programme is much more effective than substantial socio-economic interventions implemented only during smog days. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. The Brazilian Portuguese Lexicon: An Instrument for Psycholinguistic Research

    PubMed Central

    Estivalet, Gustavo L.; Meunier, Fanny

    2015-01-01

    In this article, we present the Brazilian Portuguese Lexicon, a new word-based corpus for psycholinguistic and computational linguistic research in Brazilian Portuguese. We describe the corpus development and the specific characteristics of the internet site and database for user access. We also perform distributional analyses of the corpus and comparisons to other current databases. Our main objective was to provide a large, reliable, and useful word-based corpus with a dynamic, easy-to-use, and intuitive interface with free internet access for word and word-criteria searches. We used the Núcleo Interinstitucional de Linguística Computacional’s corpus as the basic data source and developed the Brazilian Portuguese Lexicon by deriving and adding metalinguistic and psycholinguistic information about Brazilian Portuguese words. We obtained a final corpus with more than 30 million word tokens, 215 thousand word types, and 25 categories of information about each word. This corpus was made available on the internet via a free-access site with two search engines: a simple search and a complex search. The simple engine basically searches for a list of words, while the complex engine accepts all types of criteria in the corpus categories. The output presents all entries found in the corpus matching the criteria specified in the input search and can be downloaded as a .csv file. We created a module in the results that delivers basic statistics about each search. The Brazilian Portuguese Lexicon also provides a pseudoword engine and specific tools for linguistic and statistical analysis. Therefore, the Brazilian Portuguese Lexicon is a convenient instrument for stimulus search, selection, control, and manipulation in psycholinguistic experiments, as it is also a powerful database for computational linguistics research and language modeling related to lexicon distribution, functioning, and behavior. PMID:26630138

  14. The Brazilian Portuguese Lexicon: An Instrument for Psycholinguistic Research.

    PubMed

    Estivalet, Gustavo L; Meunier, Fanny

    2015-01-01

    In this article, we present the Brazilian Portuguese Lexicon, a new word-based corpus for psycholinguistic and computational linguistic research in Brazilian Portuguese. We describe the corpus development and the specific characteristics of the internet site and database for user access. We also perform distributional analyses of the corpus and comparisons to other current databases. Our main objective was to provide a large, reliable, and useful word-based corpus with a dynamic, easy-to-use, and intuitive interface with free internet access for word and word-criteria searches. We used the Núcleo Interinstitucional de Linguística Computacional's corpus as the basic data source and developed the Brazilian Portuguese Lexicon by deriving and adding metalinguistic and psycholinguistic information about Brazilian Portuguese words. We obtained a final corpus with more than 30 million word tokens, 215 thousand word types, and 25 categories of information about each word. This corpus was made available on the internet via a free-access site with two search engines: a simple search and a complex search. The simple engine basically searches for a list of words, while the complex engine accepts all types of criteria in the corpus categories. The output presents all entries found in the corpus matching the criteria specified in the input search and can be downloaded as a .csv file. We created a module in the results that delivers basic statistics about each search. The Brazilian Portuguese Lexicon also provides a pseudoword engine and specific tools for linguistic and statistical analysis. Therefore, the Brazilian Portuguese Lexicon is a convenient instrument for stimulus search, selection, control, and manipulation in psycholinguistic experiments, as it is also a powerful database for computational linguistics research and language modeling related to lexicon distribution, functioning, and behavior.

  15. Stratification of complexity in congenital heart surgery: comparative study of the Risk Adjustment for Congenital Heart Surgery (RACHS-1) method, Aristotle basic score and Society of Thoracic Surgeons-European Association for Cardio-Thoracic Surgery (STS-EACTS) mortality score

    PubMed Central

    Cavalcanti, Paulo Ernando Ferraz; Sá, Michel Pompeu Barros de Oliveira; dos Santos, Cecília Andrade; Esmeraldo, Isaac Melo; Chaves, Mariana Leal; Lins, Ricardo Felipe de Albuquerque; Lima, Ricardo de Carvalho

    2015-01-01

    Objective To determine whether stratification of complexity models in congenital heart surgery (RACHS-1, Aristotle basic score and STS-EACTS mortality score) fit our center and to determine the best method of discriminating hospital mortality. Methods Surgical procedures for congenital heart diseases in patients under 18 years of age were allocated to the categories proposed by the stratification of complexity methods currently available. The outcome hospital mortality was calculated for each category from the three models. Statistical analysis was performed to verify whether the categories presented different mortalities. The discriminatory ability of the models was determined by calculating the area under the ROC curve, and a comparison between the curves of the three models was performed. Results 360 patients were allocated according to the three methods. There was a statistically significant difference in mortality between the categories: RACHS-1 (1) - 1.3%, (2) - 11.4%, (3) - 27.3%, (4) - 50%, (P<0.001); Aristotle basic score (1) - 1.1%, (2) - 12.2%, (3) - 34%, (4) - 64.7%, (P<0.001); and STS-EACTS mortality score (1) - 5.5%, (2) - 13.6%, (3) - 18.7%, (4) - 35.8%, (P<0.001). The three models showed similar accuracy as measured by the area under the ROC curve: RACHS-1 - 0.738; STS-EACTS - 0.739; Aristotle - 0.766. Conclusion The three models of stratification of complexity currently available in the literature are useful, showing different mortalities between the proposed categories and similar discriminatory capacity for hospital mortality. PMID:26107445
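    The area under the ROC curve used to compare the three scores has a simple probabilistic reading: it equals the probability that a randomly chosen death is assigned a higher risk score than a randomly chosen survivor, ties counted half (the Mann-Whitney formulation). A minimal sketch with hypothetical scores, not the study's data:

```python
def roc_auc(scores_pos, scores_neg):
    """AUC via the Mann-Whitney statistic: the fraction of
    (positive, negative) pairs where the positive case scores
    higher, with ties counted as half a win."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))
```

    An AUC near 0.74, as reported for all three models, means a deceased patient outranks a survivor in risk category about 74% of the time.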

  16. A phenome database (NEAUHLFPD) designed and constructed for broiler lines divergently selected for abdominal fat content.

    PubMed

    Li, Min; Dong, Xiang-yu; Liang, Hao; Leng, Li; Zhang, Hui; Wang, Shou-zhi; Li, Hui; Du, Zhi-Qiang

    2017-05-20

    Effective management and analysis of precisely recorded phenotypic traits are important components of the selection and breeding of superior livestock. Over two decades, we divergently selected chicken lines for abdominal fat content at Northeast Agricultural University (Northeast Agricultural University High and Low Fat, NEAUHLF), and collected a large volume of phenotypic data related to the investigation of the molecular genetic basis of adipose tissue deposition in broilers. To effectively and systematically store, manage and analyze these phenotypic data, we built the NEAUHLF Phenome Database (NEAUHLFPD). NEAUHLFPD includes the following phenotypic records: pedigree (generations 1-19) and 29 phenotypes, such as body sizes and weights, carcass traits and their corresponding rates. The design and construction strategy of NEAUHLFPD was executed as follows: (1) Framework design. We used Apache as our web server, MySQL and Navicat as database management tools, and PHP as the HTML-embedded language to create a dynamic interactive website. (2) Structural components. The main interface provides a detailed introduction to the composition, function, and index buttons of the basic structure of the database. The functional modules of NEAUHLFPD have two main components: the first module is the physical storage space for phenotypic data, in which functional manipulations of the data can be performed, such as data indexing, filtering, range-setting, and searching; the second module handles the calculation of basic descriptive statistics, where data filtered from the database can be used for the computation of basic statistical parameters and simultaneous conditional sorting. NEAUHLFPD can be used to effectively store and manage not only phenotypic but also genotypic and genomic data, which can facilitate further investigation of the molecular genetic basis of chicken adipose tissue growth and development, and expedite the selection and breeding of broilers with low fat content.

  17. Acoustic Emission Analysis Applet (AEAA) Software

    NASA Technical Reports Server (NTRS)

    Nichols, Charles T.; Roth, Don J.

    2013-01-01

    NASA Glenn Research and NASA White Sands Test Facility have developed software supporting an automated pressure vessel structural health monitoring (SHM) system based on acoustic emissions (AE). The software, referred to as the Acoustic Emission Analysis Applet (AEAA), provides analysts with a tool that can interrogate data collected on Digital Wave Corp. and Physical Acoustics Corp. software using a wide spectrum of powerful filters and charts. This software can be made to work with any data once the data format is known. The applet will compute basic AE statistics, and statistics as a function of time and pressure (see figure). AEAA provides added value beyond the analysis provided by the respective vendors' analysis software. The software can handle data sets of unlimited size. A wide variety of government and commercial applications could benefit from this technology, notably requalification and usage tests for compressed gas and hydrogen-fueled vehicles. Future enhancements will add features similar to a "check engine" light on a vehicle. Once installed, the system will ultimately be used to alert International Space Station crewmembers to critical structural instabilities, but will have little impact on missions otherwise. Diagnostic information could then be transmitted to experienced technicians on the ground in a timely manner to determine whether pressure vessels have been impacted, are structurally unsound, or can be safely used to complete the mission.

  18. The multiple decrement life table: a unifying framework for cause-of-death analysis in ecology.

    PubMed

    Carey, James R

    1989-01-01

    The multiple decrement life table is used widely in the human actuarial literature and provides statistical expressions for mortality in three different forms: i) the life table from all causes-of-death combined; ii) the life table disaggregated into selected cause-of-death categories; and iii) the life table with particular causes and combinations of causes eliminated. The purpose of this paper is to introduce the multiple decrement life table to the ecological literature by applying the methods to published death-by-cause information on Rhagoletis pomonella. Interrelations between the current approach and conventional tools used in basic and applied ecology are discussed, including the conventional life table, Key Factor Analysis and Abbott's Correction used in toxicological bioassay.
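
    The first two statistical expressions the abstract lists can be sketched directly from death-by-cause counts. A minimal illustration with an invented cohort (the paper's Rhagoletis figures are not reproduced here); lx is survivorship from all causes combined, qx the all-cause decrement, and the cause-specific qx values give the disaggregated decrements:

```python
# Invented death-by-cause counts for a hypothetical insect cohort;
# age interval -> {cause: number of deaths in that interval}
deaths = {
    0: {"predation": 30, "disease": 10},
    1: {"predation": 20, "disease": 15},
    2: {"predation": 5,  "disease": 20},
}

def multiple_decrement_table(cohort, deaths):
    """Life table from all causes combined, disaggregated by cause."""
    rows, alive = [], cohort
    for age in sorted(deaths):
        d_all = sum(deaths[age].values())   # decrement from all causes
        rows.append({
            "age": age,
            "lx": alive / cohort,           # survivorship entering the interval
            "qx": d_all / alive,            # prob. of dying during the interval
            "qx_cause": {c: d / alive for c, d in deaths[age].items()},
        })
        alive -= d_all
    return rows

for row in multiple_decrement_table(100, deaths):
    print(row["age"], row["lx"], round(row["qx"], 3), row["qx_cause"])
```

    The third form (causes eliminated) would follow by recomputing the table after dropping a cause's deaths, under the usual independence assumption.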

  19. M.S.L.A.P. Modular Spectral Line Analysis Program documentation

    NASA Technical Reports Server (NTRS)

    Joseph, Charles L.; Jenkins, Edward B.

    1991-01-01

    MSLAP is a software package for analyzing spectra, providing the basic structure to identify spectral features, make quantitative measurements of these features, and store the measurements for convenient access. MSLAP can be used to measure not only the zeroth moment (equivalent width) of a profile, but also the first and second moments. Optical depths and the corresponding column densities across the profile can be measured as well for sufficiently high resolution data. The software was developed for an interactive, graphical analysis where the computer carries most of the computational and data-organizational burden and the investigator is responsible only for judgment decisions. It employs sophisticated statistical techniques for determining the best polynomial fit to the continuum and for calculating the uncertainties.

  20. Statistical Energy Analysis for Designers. Part 1. Basic Theory

    DTIC Science & Technology

    1974-09-01

    deterministic system. That is a possible answer, but it may not be the most useful one. The most glaring deficiency of SEA is its inability to deal with...present whether this represents the "next logical step" in the chain that we spoke of, but it bears examination. A second deficiency of SEA is its...undamped string, ρ = lineal density, r = 0, and Λ = -T(∂/∂x)². Thus, Eq. (2.3.2) becomes Tk² = ρω², or k = ±ω/c, (2.3.3) where c = √(T/ρ) is the speed of

  1. Water-resources investigations in Wisconsin: Programs and activities of the U.S. Geological Survey, 1991-92

    USGS Publications Warehouse

    Maertz, D.E.

    1992-01-01

    OBJECTIVE: The objectives of this study are to provide continuous discharge records for selected rivers at specific sites to supply the needs for regulation, analytical studies, definition of statistical properties, trend analysis, and determination of the occurrence and distribution of water in streams for planning. The project is also designed to determine lake levels and to provide discharge for floods, low-flow conditions, and water-quality investigations. Requests for streamflow data and information relating to streamflow in Wisconsin are answered. Basic data are published annually in "Water Resources Data Wisconsin."

  2. Water-resources investigations in Wisconsin

    USGS Publications Warehouse

    Maertz, D.E.

    1996-01-01

    OBJECTIVE: The objectives of this study are to provide continuous discharge records for selected rivers at specific sites to supply the needs for regulation, analytical studies, definition of statistical properties, trend analysis, and determination of the occurrence and distribution of water in streams for planning. The project is also designed to determine lake levels and to provide discharge for floods, low-flow conditions, and for water-quality investigations. Requests for streamflow data and information relating to streamflow in Wisconsin are answered. Basic data are published annually in the report "Water Resources Data-Wisconsin." LOCATION: Statewide. PROJECT CHIEF: Barry K. Holmstrom. PERIOD OF PROJECT: July 1913-Continuing.

  3. feets: feATURE eXTRACTOR for tIME sERIES

    NASA Astrophysics Data System (ADS)

    Cabral, Juan; Sanchez, Bruno; Ramos, Felipe; Gurovich, Sebastián; Granitto, Pablo; VanderPlas, Jake

    2018-06-01

    feets characterizes and analyzes light-curves from astronomical photometric databases for modelling, classification, data cleaning, outlier detection and data analysis. It uses machine learning algorithms to determine the numerical descriptors that characterize and distinguish the different variability classes of light-curves; these range from basic statistical measures such as the mean or standard deviation to complex time-series characteristics such as the autocorrelation function. The library is not restricted to the astronomical field and could also be applied to any kind of time series. This project is a derivative work of FATS (ascl:1711.017).
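
    The "basic statistical measures" end of the descriptor range mentioned above is easy to sketch without the library itself; a pure-Python illustration with invented magnitudes (not the feets API):

```python
import math

def basic_features(mag):
    """Mean, standard deviation, and lag-1 autocorrelation of a light-curve:
    three of the simplest descriptors in a feets-style feature vector."""
    n = len(mag)
    mean = sum(mag) / n
    var = sum((m - mean) ** 2 for m in mag) / n
    acf1 = sum((mag[i] - mean) * (mag[i + 1] - mean)
               for i in range(n - 1)) / (n * var)
    return {"mean": mean, "std": math.sqrt(var), "acf_lag1": acf1}

lc = [10.0, 10.2, 10.4, 10.2, 10.0, 9.8, 9.6, 9.8]   # invented magnitudes
print({k: round(v, 3) for k, v in basic_features(lc).items()})
```

    Real feature extractors add many more descriptors (periodogram peaks, skewness, Stetson indices, etc.) on top of these.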

  4. Improving Attendance and Punctuality of FE Basic Skill Students through an Innovative Scheme

    ERIC Educational Resources Information Center

    Ade-Ojo, Gordon O.

    2005-01-01

    This paper reports the findings of a study set up to establish the impact of a particular scheme on the attendance and punctuality performance of a group of Basic Skills learners against the backdrop of various theoretical postulations on managing undesirable behavior. Data collected on learners' performance was subjected to statistical analysis…

  5. An Inspection on the Gini Coefficient of the Budget Educational Public Expenditure per Student for China's Basic Education

    ERIC Educational Resources Information Center

    Yingxiu, Yang

    2006-01-01

    Using statistical data on the implementing conditions of China's educational expenditure published by the state, this paper studies the Gini coefficient of the budget educational public expenditure per student in order to examine the concentration degree of the educational expenditure for China's basic education and analyze its balanced…
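
    The Gini coefficient referred to above can be computed as the mean absolute difference between all pairs of regional expenditures, divided by twice the mean. A sketch with invented per-student figures (not the paper's data):

```python
def gini(values):
    """Gini coefficient via the mean absolute difference: 0 means every
    region spends the same per student; values near 1 mean extreme inequality."""
    n = len(values)
    mean = sum(values) / n
    mad = sum(abs(a - b) for a in values for b in values) / (n * n)
    return mad / (2 * mean)

# Invented budget expenditure per student for five regions (same currency unit)
spend = [800, 900, 1000, 1500, 3000]
print(round(gini(spend), 3))
```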

  6. Trees for Ohio

    Treesearch

    Ernest J. Gebhart

    1980-01-01

    Other members of this panel are going to reveal the basic statistics about the coal strip mining industry in Ohio, so I will confine my remarks to the revegetation of the spoil banks. So that it doesn't appear Ohio confined its tree-planting efforts to spoil banks alone, I will cite a few statistics.

  7. Idaho State University Statistical Portrait, Academic Year 1998-1999.

    ERIC Educational Resources Information Center

    Idaho State Univ., Pocatello. Office of Institutional Research.

    This report provides basic statistical data for Idaho State University, and includes both point-of-time data as well as trend data. The information is divided into sections emphasizing students, programs, faculty and staff, finances, and physical facilities. Student data includes enrollment, geographical distribution, student/faculty ratios,…

  8. Statistical Report. Fiscal Year 1995: September 1, 1994 - August 31, 1995.

    ERIC Educational Resources Information Center

    Texas Higher Education Coordinating Board, Austin.

    This report provides statistical data on Texas public and independent higher education institutions for fiscal year 1995. An introductory section provides basic information on Texas higher education institutions, while nine major sections cover: (1) student enrollment, including 1990-94 headcount data; headcount by classification, ethnic origin,…

  9. Statistical Report. Fiscal Year 1994: September 1, 1993 - August 31, 1994.

    ERIC Educational Resources Information Center

    Texas Higher Education Coordinating Board, Austin.

    This report provides statistical data on Texas public and independent higher education institutions for fiscal year 1994. An introductory section provides basic information on Texas higher education institutions, while nine major sections cover: (1) student enrollment, including 1989-93 headcount data; headcount by classification, ethnic origin,…

  10. 29 CFR 1904.42 - Requests from the Bureau of Labor Statistics for data.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... ADMINISTRATION, DEPARTMENT OF LABOR RECORDING AND REPORTING OCCUPATIONAL INJURIES AND ILLNESSES Reporting Fatality, Injury and Illness Information to the Government § 1904.42 Requests from the Bureau of Labor Statistics for data. (a) Basic requirement. If you receive a Survey of Occupational Injuries and Illnesses...

  11. 29 CFR 1904.42 - Requests from the Bureau of Labor Statistics for data.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... ADMINISTRATION, DEPARTMENT OF LABOR RECORDING AND REPORTING OCCUPATIONAL INJURIES AND ILLNESSES Reporting Fatality, Injury and Illness Information to the Government § 1904.42 Requests from the Bureau of Labor Statistics for data. (a) Basic requirement. If you receive a Survey of Occupational Injuries and Illnesses...

  12. 29 CFR 1904.42 - Requests from the Bureau of Labor Statistics for data.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... ADMINISTRATION, DEPARTMENT OF LABOR RECORDING AND REPORTING OCCUPATIONAL INJURIES AND ILLNESSES Reporting Fatality, Injury and Illness Information to the Government § 1904.42 Requests from the Bureau of Labor Statistics for data. (a) Basic requirement. If you receive a Survey of Occupational Injuries and Illnesses...

  13. 29 CFR 1904.42 - Requests from the Bureau of Labor Statistics for data.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... ADMINISTRATION, DEPARTMENT OF LABOR RECORDING AND REPORTING OCCUPATIONAL INJURIES AND ILLNESSES Reporting Fatality, Injury and Illness Information to the Government § 1904.42 Requests from the Bureau of Labor Statistics for data. (a) Basic requirement. If you receive a Survey of Occupational Injuries and Illnesses...

  14. Theoretical Frameworks for Math Fact Fluency

    ERIC Educational Resources Information Center

    Arnold, Katherine

    2012-01-01

    Recent education statistics indicate persistent low math scores for our nation's students. This drop in math proficiency includes deficits in basic number sense and automaticity of math facts. The decrease has been recorded across all grade levels with the elementary levels showing the greatest loss (National Center for Education Statistics,…

  15. Basic Statistical Concepts and Methods for Earth Scientists

    USGS Publications Warehouse

    Olea, Ricardo A.

    2008-01-01

    INTRODUCTION Statistics is the science of collecting, analyzing, interpreting, modeling, and displaying masses of numerical data primarily for the characterization and understanding of incompletely known systems. Over the years, these objectives have led to a fair amount of analytical work to achieve, substantiate, and guide descriptions and inferences.

  16. The Future of Basic Science in Academic Surgery

    PubMed Central

    Keswani, Sundeep G.; Moles, Chad M.; Morowitz, Michael; Zeh, Herbert; Kuo, John S.; Levine, Matthew H.; Cheng, Lily S.; Hackam, David J.; Ahuja, Nita; Goldstein, Allan M.

    2017-01-01

    Objective The aim of this study was to examine the challenges confronting surgeons performing basic science research in today’s academic surgery environment. Summary of Background Data Multiple studies have identified challenges confronting surgeon-scientists and impacting their ability to be successful. Although these threats have been known for decades, the downward trend in the number of successful surgeon-scientists continues. Clinical demands, funding challenges, and other factors play important roles, but a rigorous analysis of academic surgeons and their experiences regarding these issues has not previously been performed. Methods An online survey was distributed to 2504 members of the Association for Academic Surgery and Society of University Surgeons to determine factors impacting success. Survey results were subjected to statistical analyses. We also reviewed publicly available data regarding funding from the National Institutes of Health (NIH). Results NIH data revealed a 27% decline in the proportion of NIH funding to surgical departments relative to total NIH funding from 2007 to 2014. A total of 1033 (41%) members responded to our survey, making this the largest survey of academic surgeons to date. Surgeons most often cited the following factors as major impediments to pursuing basic investigation: pressure to be clinically productive, excessive administrative responsibilities, difficulty obtaining extramural funding, and desire for work-life balance. Surprisingly, a majority (68%), including departmental leadership, did not believe surgeons can be successful basic scientists in today’s environment. Conclusions We have identified important barriers that confront academic surgeons pursuing basic research and a perception that success in basic science may no longer be achievable. These barriers need to be addressed to ensure the continued development of future surgeon-scientists. PMID:27643928

  17. Comprehensive machine learning analysis of Hydra behavior reveals a stable basal behavioral repertoire

    PubMed Central

    Taralova, Ekaterina; Dupre, Christophe; Yuste, Rafael

    2018-01-01

    Animal behavior has been studied for centuries, but few efficient methods are available to automatically identify and classify it. Quantitative behavioral studies have been hindered by the subjective and imprecise nature of human observation, and the slow speed of annotating behavioral data. Here, we developed an automatic behavior analysis pipeline for the cnidarian Hydra vulgaris using machine learning. We imaged freely behaving Hydra, extracted motion and shape features from the videos, and constructed a dictionary of visual features to classify pre-defined behaviors. We also identified unannotated behaviors with unsupervised methods. Using this analysis pipeline, we quantified 6 basic behaviors and found surprisingly similar behavior statistics across animals within the same species, regardless of experimental conditions. Our analysis indicates that the fundamental behavioral repertoire of Hydra is stable. This robustness could reflect a homeostatic neural control of "housekeeping" behaviors which could have been already present in the earliest nervous systems. PMID:29589829

  18. Assessment of statistical education in Indonesia: Preliminary results and initiation to simulation-based inference

    NASA Astrophysics Data System (ADS)

    Saputra, K. V. I.; Cahyadi, L.; Sembiring, U. A.

    2018-01-01

    In this paper, we assess our traditional elementary statistics education and introduce elementary statistics with simulation-based inference. To assess our statistics class, we adapt the well-known CAOS (Comprehensive Assessment of Outcomes in Statistics) test, which serves as an external measure of students’ basic statistical literacy and is generally accepted as a measure of statistical literacy. We also introduce a new teaching method in the elementary statistics class. Unlike the traditional elementary statistics course, we introduce a simulation-based inference method to conduct hypothesis testing. The literature has shown that this new teaching method works very well in increasing students’ understanding of statistics.
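
    Simulation-based inference of the kind described can be illustrated with a permutation test: instead of consulting a t-distribution, the null distribution is built by repeatedly shuffling the pooled scores. A sketch with invented CAOS-style scores (not the study's data):

```python
import random

def permutation_test(a, b, trials=10000, seed=0):
    """Two-sided simulation-based test of the difference in group means:
    shuffle the pooled scores and count how often a random split yields a
    gap at least as large as the one observed."""
    rng = random.Random(seed)
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = a + b
    hits = 0
    for _ in range(trials):
        rng.shuffle(pooled)
        pa, pb = pooled[:len(a)], pooled[len(a):]
        if abs(sum(pa) / len(pa) - sum(pb) / len(pb)) >= observed:
            hits += 1
    return hits / trials             # simulated p-value

# Invented CAOS-style scores for a traditional and a simulation-based section
traditional = [52, 55, 48, 60, 50, 53]
simulation  = [63, 58, 66, 61, 59, 64]
print(permutation_test(traditional, simulation))
```

    The appeal for teaching is that the procedure is the definition of the p-value made concrete, with no distributional formulas required.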

  19. Systematic survey of the design, statistical analysis, and reporting of studies published in the 2008 volume of the Journal of Cerebral Blood Flow and Metabolism.

    PubMed

    Vesterinen, Hanna V; Egan, Kieren; Deister, Amelie; Schlattmann, Peter; Macleod, Malcolm R; Dirnagl, Ulrich

    2011-04-01

    Translating experimental findings into clinically effective therapies is one of the major bottlenecks of modern medicine. As this has been particularly true for cerebrovascular research, attention has turned to the quality and validity of experimental cerebrovascular studies. We set out to assess the study design, statistical analyses, and reporting of cerebrovascular research. We assessed all original articles published in the Journal of Cerebral Blood Flow and Metabolism during the year 2008 against a checklist designed to capture the key attributes relating to study design, statistical analyses, and reporting. A total of 156 original publications were included (animal, in vitro, human). Few studies reported a primary research hypothesis, statement of purpose, or measures to safeguard internal validity (such as randomization, blinding, exclusion or inclusion criteria). Many studies lacked sufficient information regarding methods and results to form a reasonable judgment about their validity. In nearly 20% of studies, statistical tests were either not appropriate or information to allow assessment of appropriateness was lacking. This study identifies a number of factors that should be addressed if the quality of research in basic and translational biomedicine is to be improved. We support the widespread implementation of the ARRIVE (Animal Research Reporting In Vivo Experiments) statement for the reporting of experimental studies in biomedicine, for improving training in proper study design and analysis, and that reviewers and editors adopt a more constructively critical approach in the assessment of manuscripts for publication.

  20. Statistical and Detailed Analysis on Fiber Reinforced Self-Compacting Concrete Containing Admixtures- A State of Art of Review

    NASA Astrophysics Data System (ADS)

    Athiyamaan, V.; Mohan Ganesh, G.

    2017-11-01

    Self-compacting concrete is one of the special concretes that has the ability to flow and consolidate under its own weight and completely fill the formwork even in the presence of dense reinforcement, whilst maintaining its homogeneity throughout the formwork without any requirement for vibration. Researchers all over the world are developing high-performance concrete by adding various fibers and admixtures in different proportions. Different kinds of fibers, such as glass, steel, carbon, polypropylene and aramid, improve concrete properties such as tensile strength, fatigue characteristics, durability, shrinkage, impact and erosion resistance, and serviceability [6]. This review includes a fundamental study of fiber-reinforced self-compacting concrete with admixtures, its rheological and mechanical properties, and an overview of design methodology and statistical approaches to optimizing concrete performance. The study is organized into seven basic chapters: introduction; study of material properties and review of self-compacting concrete; overview of fiber-reinforced self-compacting concrete containing admixtures; review of design and analysis of experiments as a statistical approach; summary of existing work on FRSCC and statistical modeling; literature review; and conclusion. Knowledge of recent studies on polymer-based binder materials (fly ash, metakaolin, GGBS, etc.), fiber-reinforced concrete and SCC is essential for effective research on fiber-reinforced self-compacting concrete containing admixtures. The key aim of the study is to identify the research gap and to gain complete knowledge of polymer-based self-compacting fiber-reinforced concrete.

  1. Systematic survey of the design, statistical analysis, and reporting of studies published in the 2008 volume of the Journal of Cerebral Blood Flow and Metabolism

    PubMed Central

    Vesterinen, Hanna V; Egan, Kieren; Deister, Amelie; Schlattmann, Peter; Macleod, Malcolm R; Dirnagl, Ulrich

    2011-01-01

    Translating experimental findings into clinically effective therapies is one of the major bottlenecks of modern medicine. As this has been particularly true for cerebrovascular research, attention has turned to the quality and validity of experimental cerebrovascular studies. We set out to assess the study design, statistical analyses, and reporting of cerebrovascular research. We assessed all original articles published in the Journal of Cerebral Blood Flow and Metabolism during the year 2008 against a checklist designed to capture the key attributes relating to study design, statistical analyses, and reporting. A total of 156 original publications were included (animal, in vitro, human). Few studies reported a primary research hypothesis, statement of purpose, or measures to safeguard internal validity (such as randomization, blinding, exclusion or inclusion criteria). Many studies lacked sufficient information regarding methods and results to form a reasonable judgment about their validity. In nearly 20% of studies, statistical tests were either not appropriate or information to allow assessment of appropriateness was lacking. This study identifies a number of factors that should be addressed if the quality of research in basic and translational biomedicine is to be improved. We support the widespread implementation of the ARRIVE (Animal Research Reporting In Vivo Experiments) statement for the reporting of experimental studies in biomedicine, for improving training in proper study design and analysis, and that reviewers and editors adopt a more constructively critical approach in the assessment of manuscripts for publication. PMID:21157472

  2. Breath Analysis in Disease Diagnosis: Methodological Considerations and Applications

    PubMed Central

    Lourenço, Célia; Turner, Claire

    2014-01-01

    Breath analysis is a promising field with great potential for non-invasive diagnosis of a number of disease states. Concentrations of volatile organic compounds (VOCs) in breath are assessed with acceptable accuracy by means of analytical techniques with high sensitivity, accuracy, precision, low response time, and low detection limit, which are desirable characteristics for the detection of VOCs in human breath. “Breath fingerprinting”, indicative of a specific clinical status, relies on the use of multivariate statistics methods with powerful in-built algorithms. The need for standardisation of sample collection and analysis is the main issue concerning breath analysis, blocking the introduction of breath tests into clinical practice. This review describes recent scientific developments in basic research and clinical applications, namely issues concerning sampling and biochemistry, highlighting the diagnostic potential of breath analysis for disease diagnosis. Several considerations that need to be taken into account in breath analysis are documented here, including the growing need for metabolomics to deal with breath profiles. PMID:24957037

  3. Breath analysis in disease diagnosis: methodological considerations and applications.

    PubMed

    Lourenço, Célia; Turner, Claire

    2014-06-20

    Breath analysis is a promising field with great potential for non-invasive diagnosis of a number of disease states. Concentrations of volatile organic compounds (VOCs) in breath are assessed with acceptable accuracy by means of analytical techniques with high sensitivity, accuracy, precision, low response time, and low detection limit, which are desirable characteristics for the detection of VOCs in human breath. "Breath fingerprinting", indicative of a specific clinical status, relies on the use of multivariate statistics methods with powerful in-built algorithms. The need for standardisation of sample collection and analysis is the main issue concerning breath analysis, blocking the introduction of breath tests into clinical practice. This review describes recent scientific developments in basic research and clinical applications, namely issues concerning sampling and biochemistry, highlighting the diagnostic potential of breath analysis for disease diagnosis. Several considerations that need to be taken into account in breath analysis are documented here, including the growing need for metabolomics to deal with breath profiles.

  4. Statistical analysis of the ambiguities in the asteroid period determinations

    NASA Astrophysics Data System (ADS)

    Butkiewicz-Bąk, M.; Kwiatkowski, T.; Bartczak, P.; Dudziński, G.; Marciniak, A.

    2017-09-01

    Among asteroids there exist ambiguities in their rotation period determinations. They are due to incomplete coverage of the rotation, noise and/or aliases resulting from gaps between separate lightcurves. To help remove such uncertainties, basic characteristics of the lightcurves resulting from constraints imposed by the asteroid shapes and geometries of observations should be identified. We simulated light variations of asteroids whose shapes were modelled as Gaussian random spheres, with random orientations of spin vectors and phase angles changed every 5° from 0° to 65°. This produced 1.4 million lightcurves. For each simulated lightcurve, Fourier analysis was made and the harmonic with the highest amplitude was recorded. From the statistical point of view, all lightcurves observed at phase angles α < 30°, with peak-to-peak amplitudes A > 0.2 mag, are bimodal. The second most frequently dominant harmonic is the first, with the 3rd harmonic following right after. For 1 per cent of lightcurves with amplitudes A < 0.1 mag and phase angles α < 40°, the 4th harmonic dominates.
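
    The harmonic bookkeeping described above can be sketched as follows: fit low-order Fourier harmonics of the rotation phase and record the one with the largest amplitude. For an evenly sampled synthetic double-peaked (bimodal) lightcurve, the second harmonic should dominate (illustrative code, not the authors' pipeline):

```python
import math

def dominant_harmonic(phase, mag, kmax=4):
    """Index k of the Fourier harmonic with the largest amplitude,
    for phases evenly sampled over one rotation."""
    n = len(mag)
    best_k, best_amp = 0, 0.0
    for k in range(1, kmax + 1):
        a = 2.0 / n * sum(m * math.cos(2 * math.pi * k * p)
                          for p, m in zip(phase, mag))
        b = 2.0 / n * sum(m * math.sin(2 * math.pi * k * p)
                          for p, m in zip(phase, mag))
        amp = math.hypot(a, b)
        if amp > best_amp:
            best_k, best_amp = k, amp
    return best_k

# Synthetic lightcurve: two maxima per rotation (bimodal), plus a weak 1st harmonic
phases = [i / 50 for i in range(50)]
mags = [0.3 * math.sin(2 * math.pi * 2 * p) + 0.05 * math.sin(2 * math.pi * p)
        for p in phases]
print(dominant_harmonic(phases, mags))
```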

  5. Statistical Analysis of Bus Networks in India

    PubMed Central

    2016-01-01

    In this paper, we model the bus networks of six major Indian cities as graphs in L-space, and evaluate their various statistical properties. While airline and railway networks have been extensively studied, a comprehensive study on the structure and growth of bus networks is lacking. In India, where bus transport plays an important role in day-to-day commutation, it is of significant interest to analyze its topological structure and answer basic questions on its evolution, growth, robustness and resiliency. Although the common feature of small-world property is observed, our analysis reveals a wide spectrum of network topologies arising due to significant variation in the degree-distribution patterns in the networks. We also observe that these networks, although robust and resilient to random attacks, are particularly degree-sensitive. Unlike networks such as the Internet, the WWW and airline networks, which are virtual, bus networks are physically constrained. Our findings therefore throw light on the evolution of such geographically constrained networks, which will help us design more efficient bus networks in the future. PMID:27992590
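
    In L-space each node is a bus stop and an edge joins stops that are consecutive on some route, so the degree-distribution questions above reduce to simple counting. A toy sketch (invented network, not the six Indian cities):

```python
from collections import Counter

def degree_stats(edges):
    """Node degrees and the degree histogram of an undirected graph."""
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    return dict(deg), dict(Counter(deg.values()))

# Two invented routes sharing the stop "C"; consecutive stops are linked
route1 = [("A", "B"), ("B", "C"), ("C", "D")]
route2 = [("E", "C"), ("C", "F")]
degrees, histogram = degree_stats(route1 + route2)
print(degrees, histogram)
```

    The "degree-sensitive" finding corresponds to removing the highest-degree stop (here "C"), which disconnects the toy network, whereas random stop removals usually do not.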

  6. Analysis of Acoustic Emission Parameters from Corrosion of AST Bottom Plate in Field Testing

    NASA Astrophysics Data System (ADS)

    Jomdecha, C.; Jirarungsatian, C.; Suwansin, W.

    Field testing of aboveground storage tanks (ASTs) to monitor corrosion of the bottom plate is presented in this chapter. AE testing data from ten ASTs of different sizes, materials, and products were employed to monitor bottom plate condition. AE sensors of 30 and 150 kHz were used to monitor corrosion activity over up to 24 channels, including guard sensors. Acoustic emission (AE) parameters were analyzed to explore the AE parameter patterns of occurring corrosion compared to the laboratory results. Amplitude, count, duration, and energy were the main parameters of the analysis. A pattern recognition technique combined with statistics was implemented to eliminate electrical and environmental noise. The results showed specific AE patterns of corrosion activities related to the empirical results. In addition, a planar location algorithm was utilized to locate the significant AE events from corrosion. Both the parameter patterns and the AE event locations can be used to interpret and locate corrosion activities. Finally, a basic statistical grading technique was used to evaluate the bottom plate condition of the ASTs.

  7. Rebuilding Government Legitimacy in Post-conflict Societies: Case Studies of Nepal and Afghanistan

    DTIC Science & Technology

    2015-09-09

    administered via the verbal scales due to reduced time spent explaining the visual show cards. Statistical results corresponded with observations from...a three-step strategy for dealing with item non-response. First, basic descriptive statistics are calculated to determine the extent of item...descriptive statistics for all items in the survey), however this section of the report highlights just some of the findings. Thus, the results

  8. Biostatistical and medical statistics graduate education

    PubMed Central

    2014-01-01

    The development of graduate education in biostatistics and medical statistics is discussed in the context of training within a medical center setting. The need for medical researchers to employ a wide variety of statistical designs in clinical, genetic, basic science and translational settings justifies the ongoing integration of biostatistical training into medical center educational settings and informs its content. The integration of large-data issues is a challenge. PMID:24472088

  9. A simple program to measure and analyse tree rings using Excel, R and SigmaScan

    PubMed Central

    Hietz, Peter

    2011-01-01

    I present new software that links a program for image analysis (SigmaScan), one for spreadsheets (Excel) and one for statistical analysis (R) for applications of tree-ring analysis. The first macro measures ring width marked by the user on scanned images, stores raw and detrended data in Excel and calculates the distance to the pith and inter-series correlations. A second macro measures darkness along a defined path to identify latewood–earlywood transition in conifers, and a third shows the potential for automatic detection of boundaries. Written in Visual Basic for Applications, the code makes use of the advantages of existing programs and is consequently very economical and relatively simple to adjust to the requirements of specific projects or to expand, making use of already available code. PMID:26109835
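
    The inter-series correlations mentioned above are ordinary Pearson correlations between detrended ring-width series. A pure-Python sketch with invented indices (the actual macros are written in Visual Basic for Applications):

```python
import math

def pearson(x, y):
    """Pearson correlation, e.g. between two detrended ring-width indices."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented ring-width indices for two cores from the same stand
core1 = [1.1, 0.9, 1.3, 0.7, 1.0, 1.2]
core2 = [1.0, 0.8, 1.2, 0.8, 0.9, 1.1]
print(round(pearson(core1, core2), 3))
```

    A high inter-series correlation is what justifies cross-dating the cores against a common chronology.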

  11. Molecular Phylogenetics: Concepts for a Newcomer.

    PubMed

    Ajawatanawong, Pravech

    Molecular phylogenetics is the study of evolutionary relationships among organisms using molecular sequence data. The aim of this review is to introduce the important terminology and general concepts of tree reconstruction to biologists who lack a strong background in the field of molecular evolution. Some modern phylogenetic programs are easy to use because of their user-friendly interfaces, but understanding the phylogenetic algorithms and substitution models, which are based on advanced statistics, remains important for carrying out analysis and interpretation without expert guidance. Briefly, there are five general steps in carrying out a phylogenetic analysis: (1) sequence data preparation, (2) sequence alignment, (3) choosing a phylogenetic reconstruction method, (4) identification of the best tree, and (5) evaluating the tree. The concepts in this review enable biologists to grasp the basic ideas behind phylogenetic analysis and also help provide a sound basis for discussions with expert phylogeneticists.

  12. Automation method to identify the geological structure of seabed using spatial statistic analysis of echo sounding data

    NASA Astrophysics Data System (ADS)

    Kwon, O.; Kim, W.; Kim, J.

    2017-12-01

    Recently, construction of subsea tunnels has increased globally. For the safe construction of a subsea tunnel, identifying geological structures, including faults, at the design and construction stages is critically important. Unlike tunnels on land, however, data on geological structure are very difficult to obtain because of the limits of geological surveys at sea. This study addresses that difficulty by developing technology to identify the geological structure of the seabed automatically using echo sounding data. When investigating a potential site for a deep subsea tunnel, borehole and geophysical investigations face technical and economic limits. Echo sounding data, by contrast, are easily obtainable, and their reliability is high compared with the above approaches. This study aims to develop an algorithm that identifies large-scale geological structures of the seabed using a geostatistical approach, building on the structural-geology principle that topographic features reflect geological structure. The basic concept of the algorithm is as follows: (1) convert the seabed topography to grid data using echo sounding data, (2) apply a moving window of optimal size to the grid data, (3) estimate the spatial statistics of the grid data in the window area, (4) set a percentile standard for the spatial statistics, (5) display the values satisfying the standard on the map, and (6) visualize the geological structure on the map. The important elements in this study include the optimal size of the moving window, the choice of optimal spatial statistics, and the determination of the optimal percentile standard. To determine these optimal elements, numerous simulations were run. Finally, a user program based on R was developed using the optimal analysis algorithm. The program was designed to show the variation of various spatial statistics, making it easy to analyze geological structure as the spatial statistics vary, by letting the user easily designate the type of spatial statistic and the percentile standard. This research was supported by the Korea Agency for Infrastructure Technology Advancement under the Ministry of Land, Infrastructure and Transport of the Korean government. (Project Number: 13 Construction Research T01)
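    Steps (1)-(5) of the algorithm described above can be sketched in a few lines. The sketch below is a minimal illustration only (in Python rather than the authors' R): the synthetic depth grid with an artificial scarp stands in for real echo sounding data, and the window size, statistic, and percentile are assumed values, not those selected by the study.

    ```python
    import numpy as np

    def moving_window_stat(grid, win=5, stat=np.std):
        """Compute a spatial statistic over a square moving window (steps 2-3).

        `grid` is a 2-D array of seabed depths; `win` must be odd.  Edge cells
        where the window does not fit are left as NaN.
        """
        half = win // 2
        out = np.full(grid.shape, np.nan)
        for i in range(half, grid.shape[0] - half):
            for j in range(half, grid.shape[1] - half):
                out[i, j] = stat(grid[i - half:i + half + 1, j - half:j + half + 1])
        return out

    def flag_percentile(stat_grid, pct=95):
        """Flag cells whose statistic exceeds the given percentile (steps 4-5)."""
        threshold = np.nanpercentile(stat_grid, pct)
        return stat_grid >= threshold

    # Synthetic stand-in for echo sounding data: a flat seabed with a linear scarp.
    rng = np.random.default_rng(0)
    depth = rng.normal(-50.0, 0.1, size=(60, 60))
    depth[:, 30:] -= 5.0  # abrupt step mimicking a fault trace

    roughness = moving_window_stat(depth, win=5, stat=np.std)
    flags = flag_percentile(roughness, pct=95)
    print(sorted(set(np.where(flags)[1])))  # flagged columns cluster around the scarp
    ```

    Swapping `stat` for another window statistic (range, slope, etc.) reproduces the study's idea of comparing different spatial statistics against the same percentile standard.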

  13. Lognormal Approximations of Fault Tree Uncertainty Distributions.

    PubMed

    El-Shanawany, Ashraf Ben; Ardron, Keith H; Walker, Simon P

    2018-01-26

    Fault trees are used in reliability modeling to create logical models of fault combinations that can lead to undesirable events. The output of a fault tree analysis (the top event probability) is expressed in terms of the failure probabilities of basic events that are input to the model. Typically, the basic event probabilities are not known exactly, but are modeled as probability distributions; therefore, the top event probability is also represented as an uncertainty distribution. Monte Carlo methods are generally used for evaluating the uncertainty distribution, but such calculations are computationally intensive and do not readily reveal the dominant contributors to the uncertainty. In this article, a closed-form approximation for the fault tree top event uncertainty distribution is developed, which is applicable when the uncertainties in the basic events of the model are lognormally distributed. The results of the approximate method are compared with results from two sampling-based methods: namely, the Monte Carlo method and the Wilks method based on order statistics. It is shown that the closed-form expression can provide a reasonable approximation to results obtained by Monte Carlo sampling, without incurring the computational expense. The Wilks method is found to be a useful means of providing an upper bound for the percentiles of the uncertainty distribution while being computationally inexpensive compared with full Monte Carlo sampling. The lognormal approximation method and Wilks's method appear to be attractive, practical alternatives for the evaluation of uncertainty in the output of fault trees and similar multilinear models. © 2018 Society for Risk Analysis.
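    The Monte Carlo side of this comparison can be sketched as follows. The three-event tree, the median probabilities, and the error factors below are hypothetical, and the rare-event approximation for the OR gate is a common simplification rather than the paper's closed-form method.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000  # Monte Carlo samples

    # Hypothetical tree: TOP = A OR (B AND C).  Each basic event probability is
    # lognormal, specified by a median and an error factor (EF = ratio of the
    # 95th percentile to the median, a common reliability convention).
    def sample_lognormal(median, ef, size):
        sigma = np.log(ef) / 1.645  # puts the 95th percentile at median * EF
        return np.exp(rng.normal(np.log(median), sigma, size))

    pA = sample_lognormal(1e-3, 3.0, n)
    pB = sample_lognormal(5e-3, 5.0, n)
    pC = sample_lognormal(2e-2, 3.0, n)

    # Rare-event approximation for the OR gate: P(top) ~ pA + pB*pC
    p_top = pA + pB * pC

    print("median:", np.median(p_top))
    print("95th percentile:", np.percentile(p_top, 95))
    ```

    Percentiles of `p_top` are the quantities the paper's lognormal approximation and the Wilks bound aim to reproduce without the sampling cost.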

  14. A statistical evaluation and comparison of VISSR Atmospheric Sounder (VAS) data

    NASA Technical Reports Server (NTRS)

    Jedlovec, G. J.

    1984-01-01

    In order to account for the temporal and spatial discrepancies between the VAS and rawinsonde soundings, the rawinsonde data were adjusted to a common hour of release where the new observation time corresponded to the satellite scan time. Both the satellite and rawinsonde observations of the basic atmospheric parameters (T, Td, and Z) were objectively analyzed to a uniform grid maintaining the same mesoscale structure in each data set. The performance of each retrieval algorithm in producing accurate and representative soundings was evaluated using statistical parameters such as the mean, standard deviation, and root mean square of the difference fields for each parameter and grid level. Horizontal structure was also qualitatively evaluated by examining atmospheric features on constant pressure surfaces. An analysis of the vertical structure of the atmosphere was also performed by looking at colocated and grid-mean vertical profiles of both the satellite and rawinsonde data sets. Highlights of these results are presented.

  15. A Monte Carlo Simulation Study of the Reliability of Intraindividual Variability

    PubMed Central

    Estabrook, Ryne; Grimm, Kevin J.; Bowles, Ryan P.

    2012-01-01

    Recent research has seen intraindividual variability (IIV) become a useful technique to incorporate trial-to-trial variability into many types of psychological studies. IIV as measured by individual standard deviations (ISDs) has shown unique predictive value for several types of positive and negative outcomes (Ram, Rabbit, Stollery, & Nesselroade, 2005). One unanswered question regarding measuring intraindividual variability is its reliability and the conditions under which optimal reliability is achieved. Monte Carlo simulation studies were conducted to determine the reliability of the ISD compared to the intraindividual mean. The results indicate that ISDs generally have poor reliability and are sensitive to insufficient measurement occasions, poor test reliability, and unfavorable amounts and distributions of variability in the population. Secondary analysis of psychological data shows that use of individual standard deviations in unfavorable conditions leads to a marked reduction in statistical power, although careful adherence to underlying statistical assumptions allows their use as a basic research tool. PMID:22268793
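    The simulation logic can be sketched as a parallel-forms check: generate two independent sets of trials per subject, compute the mean and the ISD on each set, and take the cross-set correlation as the reliability. The population distributions below are illustrative assumptions, not those of the paper, but they reproduce the qualitative finding that ISD reliability falls well below that of the intraindividual mean.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def parallel_forms_reliability(n_subj=200, n_trials=10, reps=200):
        """Estimate reliability of the intraindividual mean and SD (ISD)
        as the correlation between two parallel sets of trials."""
        r_mean, r_isd = [], []
        for _ in range(reps):
            true_mu = rng.normal(0.0, 1.0, n_subj)    # between-person means
            true_sd = rng.uniform(0.5, 1.5, n_subj)   # between-person variability
            x1 = rng.normal(true_mu[:, None], true_sd[:, None], (n_subj, n_trials))
            x2 = rng.normal(true_mu[:, None], true_sd[:, None], (n_subj, n_trials))
            r_mean.append(np.corrcoef(x1.mean(1), x2.mean(1))[0, 1])
            r_isd.append(np.corrcoef(x1.std(1, ddof=1), x2.std(1, ddof=1))[0, 1])
        return np.mean(r_mean), np.mean(r_isd)

    r_mean, r_isd = parallel_forms_reliability()
    print(f"mean reliability: {r_mean:.2f}, ISD reliability: {r_isd:.2f}")
    ```

    Reducing `n_trials` degrades the ISD reliability much faster than the mean's, which mirrors the paper's sensitivity to insufficient measurement occasions.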

  16. Keywords and Co-Occurrence Patterns in the Voynich Manuscript: An Information-Theoretic Analysis

    PubMed Central

    Montemurro, Marcelo A.; Zanette, Damián H.

    2013-01-01

    The Voynich manuscript has so far remained a mystery for linguists and cryptologists. While the text, written on medieval parchment using an unknown script system, shows basic statistical patterns that bear resemblance to those of real languages, there are features that suggested to some researchers that the manuscript was a forgery intended as a hoax. Here we analyse the long-range structure of the manuscript using methods from information theory. We show that the Voynich manuscript presents a complex organization in the distribution of words that is compatible with that found in real language sequences. We are also able to extract some of the most significant semantic word networks in the text. These results, together with some previously known statistical features of the Voynich manuscript, support the presence of a genuine message inside the book. PMID:23805215

  17. Multi-classification of cell deformation based on object alignment and run length statistic.

    PubMed

    Li, Heng; Liu, Zhiwen; An, Xing; Shi, Yonggang

    2014-01-01

    Cellular morphology is widely applied in digital pathology and is essential for improving our understanding of the basic physiological processes of organisms. One of the main practical issues is developing efficient methods for cell deformation measurement. We propose an innovative indirect approach to analyze dynamic cell morphology in image sequences. The proposed approach considers both the cellular shape change and cytoplasm variation, and takes each frame in the image sequence into account. The cell deformation is measured by the minimum energy function of object alignment, which is invariant to object pose. An indirect analysis strategy based on run-length statistics is then employed to overcome the limitation of gradual deformation. We demonstrate the power of the proposed approach with one application: multi-classification of cell deformation. Experimental results show that the proposed method is sensitive to morphology variation and performs better than standard shape representation methods.
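    As a minimal illustration of the run-length idea, the sketch below thresholds a hypothetical per-frame deformation score (the scores and threshold are made up, not taken from the paper) and summarizes the lengths of consecutive above-threshold runs, the kind of feature a run-length statistic builds on.

    ```python
    import numpy as np

    def run_lengths(binary_seq):
        """Lengths of consecutive runs of 1s in a binary sequence."""
        padded = np.concatenate([[0], binary_seq, [0]])
        changes = np.flatnonzero(np.diff(padded))  # run boundaries
        starts, ends = changes[::2], changes[1::2]
        return ends - starts

    # Hypothetical per-frame deformation energies for one image sequence.
    energy = np.array([0.1, 0.9, 0.8, 0.2, 0.7, 0.9, 0.95, 0.1, 0.1, 0.8])
    active = (energy > 0.5).astype(int)   # frames exceeding a deformation threshold
    runs = run_lengths(active)
    print(runs, runs.mean(), runs.max())  # [2 3 1] 2.0 3
    ```

    Summary statistics of `runs` (mean, maximum, count) distinguish gradual sustained deformation from brief isolated events, which is what the indirect strategy exploits.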

  18. The Shape of a Ponytail and the Statistical Physics of Hair Fiber Bundles

    NASA Astrophysics Data System (ADS)

    Goldstein, Raymond E.; Warren, Patrick B.; Ball, Robin C.

    2012-02-01

    From Leonardo to the Brothers Grimm our fascination with hair has endured in art and science. Yet, a quantitative understanding of the shapes of hair bundles has been lacking. Here we combine experiment and theory to propose an answer to the most basic question: What is the shape of a ponytail? A model for the shape of hair bundles is developed from the perspective of statistical physics, treating individual fibers as elastic filaments with random intrinsic curvatures. The combined effects of bending elasticity, gravity, and bundle compressibility are recast as a differential equation for the envelope of a bundle, in which the compressibility enters through an ``equation of state.'' From this, we identify the balance of forces in various regions of the ponytail, extract the equation of state from analysis of ponytail shapes, and relate the observed pressure to the measured random curvatures of individual hairs.

  19. An analysis of the relationship between bodily injury severity and fall height in victims of fatal falls from height.

    PubMed

    Teresiński, Grzegorz; Milaszkiewicz, Anna; Cywka, Tomasz

    2016-01-01

    Aim of the study: One of the basic issues discussed in the forensic literature regarding falls from a height is the determination of fall height and the differentiation between suicidal and accidental falls. The aim of the study was to verify the usefulness of the available methods for the purposes of forensic expert opinions. Material and methods: The study encompassed fatal falls from a height whose autopsies were performed in the Department of Forensic Medicine in Lublin. Results: As in other studies, the severity of injuries was assessed using the Abbreviated Injury Scale (AIS) and the Injury Severity Score (ISS). The study findings demonstrated a statistically significant correlation between fall height and injury severity according to ISS, and a statistically significant difference in fall heights between the groups of accidents and suicides.

  20. Conceptual developments of non-equilibrium statistical mechanics in the early days of Japan

    NASA Astrophysics Data System (ADS)

    Ichiyanagi, Masakazu

    1995-11-01

    This paper reviews the research in nonequilibrium statistical mechanics made in Japan in the period between 1930 and 1960. Nearly thirty years have passed since the discovery of the exact formula for the electrical conductivity. With the rise of the linear response theory, whose methods and results are quickly grasped by anyone, its rationale was pushed aside, and even while the formulation was still incomplete some authors hurried to make physical applications. Such an attitude robbed it of most of its interest for the average physicist, who would approach an understanding of some basic concept not through abstract and logical analysis but by simply increasing his technical experience with the concept. The purpose of this review is to rescue the linear response theory from being labeled a mathematical tool and to show that it has considerable physical content. Many key papers, originally written in Japanese, are reproduced.

  1. Asymmetric statistical features of the Chinese domestic and international gold price fluctuation

    NASA Astrophysics Data System (ADS)

    Cao, Guangxi; Zhao, Yingchao; Han, Yan

    2015-05-01

    Analyzing the statistical features of fluctuation is remarkably significant for financial risk identification and measurement. In this study, the asymmetric detrended fluctuation analysis (A-DFA) method was applied to evaluate asymmetric multifractal scaling behaviors in the Shanghai and New York gold markets. Our findings showed that the multifractal features of the Chinese and international gold spot markets were asymmetric. The gold return series persisted longer in an increasing trend than in a decreasing trend. Moreover, the asymmetric degree of multifractals in the Chinese and international gold markets decreased with the increase in fluctuation range. In addition, the empirical analysis using sliding window technology indicated that multifractal asymmetry in the Chinese and international gold markets was characterized by its time-varying feature. However, the Shanghai and international gold markets basically shared a similar asymmetric degree evolution pattern. The American subprime mortgage crisis (2008) and the European debt crisis (2010) enhanced the asymmetric degree of the multifractal features of the Chinese and international gold markets. Furthermore, we also perform statistical tests on the multifractality and asymmetry results and discuss their origins. Finally, results of the empirical analysis using the threshold autoregressive conditional heteroskedasticity (TARCH) and exponential generalized autoregressive conditional heteroskedasticity (EGARCH) models showed that good news had a more significant effect on the cyclical fluctuation of the gold market than bad news. Moreover, good news exerted a more significant effect on the Chinese gold market than on the international gold market.
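    The core machinery behind A-DFA is ordinary detrended fluctuation analysis. The sketch below computes the plain symmetric fluctuation function F(s) and a scaling exponent on synthetic white noise; it is not the authors' A-DFA, which additionally splits windows by the sign of the local trend before averaging.

    ```python
    import numpy as np

    def dfa(x, scales):
        """Detrended fluctuation analysis: return F(s) for each scale s."""
        y = np.cumsum(x - np.mean(x))          # integrated profile
        F = []
        for s in scales:
            n_win = len(y) // s
            sq = []
            for k in range(n_win):
                seg = y[k * s:(k + 1) * s]
                t = np.arange(s)
                coef = np.polyfit(t, seg, 1)   # linear detrend per window
                sq.append(np.mean((seg - np.polyval(coef, t)) ** 2))
            F.append(np.sqrt(np.mean(sq)))
        return np.array(F)

    # Example: white noise should give a scaling exponent alpha near 0.5;
    # a persistent (trend-following) series would give alpha > 0.5.
    rng = np.random.default_rng(7)
    x = rng.normal(size=4000)
    scales = np.array([16, 32, 64, 128, 256])
    F = dfa(x, scales)
    alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
    print(f"estimated alpha: {alpha:.2f}")
    ```

    In the asymmetric variant, fitting separate exponents to windows with rising versus falling local trends quantifies the up/down persistence difference the paper reports for gold returns.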

  2. Predicting Cortical Dark/Bright Asymmetries from Natural Image Statistics and Early Visual Transforms

    PubMed Central

    Cooper, Emily A.; Norcia, Anthony M.

    2015-01-01

    The nervous system has evolved in an environment with structure and predictability. One of the ubiquitous principles of sensory systems is the creation of circuits that capitalize on this predictability. Previous work has identified predictable non-uniformities in the distributions of basic visual features in natural images that are relevant to the encoding tasks of the visual system. Here, we report that the well-established statistical distributions of visual features -- such as visual contrast, spatial scale, and depth -- differ between bright and dark image components. Following this analysis, we go on to trace how these differences in natural images translate into different patterns of cortical input that arise from the separate bright (ON) and dark (OFF) pathways originating in the retina. We use models of these early visual pathways to transform natural images into statistical patterns of cortical input. The models include the receptive fields and non-linear response properties of the magnocellular (M) and parvocellular (P) pathways, with their ON and OFF pathway divisions. The results indicate that there are regularities in visual cortical input beyond those that have previously been appreciated from the direct analysis of natural images. In particular, several dark/bright asymmetries provide a potential account for recently discovered asymmetries in how the brain processes visual features, such as violations of classic energy-type models. On the basis of our analysis, we expect that the dark/bright dichotomy in natural images plays a key role in the generation of both cortical and perceptual asymmetries. PMID:26020624

  3. Investigating the limitations of tree species classification using the Combined Cluster and Discriminant Analysis method for low density ALS data from a dense forest region in Aggtelek (Hungary)

    NASA Astrophysics Data System (ADS)

    Koma, Zsófia; Deák, Márton; Kovács, József; Székely, Balázs; Kelemen, Kristóf; Standovár, Tibor

    2016-04-01

    Airborne Laser Scanning (ALS) is a widely used technology for forestry classification applications. However, single-tree detection and species classification from low-density ALS point clouds are limited in dense forest regions. In this study we investigate the division of a forest into homogeneous groups at the stand level. The study area is located in the Aggtelek karst region (Northeast Hungary), with complex relief topography. The ALS dataset contained only 4 discrete echoes (at 2-4 pt/m2 density) from the study area during leaf-on season. Ground-truth measurements of canopy closure and proportion of tree species cover are available every 70 meters in 500-square-meter circular plots. In the first step, ALS data were processed and geometrical and intensity-based features were calculated on a 5×5 meter raster grid. The derived features included: basic statistics of relative height, canopy RMS, echo ratio, openness, pulse penetration ratio, and basic statistics of radiometric features. In the second step the data were investigated using Combined Cluster and Discriminant Analysis (CCDA, Kovács et al., 2014). The CCDA method first determines a basic grouping for the multiple circle-shaped sampling locations using hierarchical clustering, and then for the resulting candidate groupings a core cycle is executed comparing the goodness of the investigated groupings with random ones. These comparisons yield difference values, providing information about the optimal grouping among those investigated. If sub-groups are then further investigated, one might even find homogeneous groups. We found that classification of low-density ALS data into homogeneous groups is highly dependent on canopy closure and the proportion of the dominant tree species. The presented results show high potential for using CCDA to determine homogeneous separable groups in LiDAR-based tree species classification.
Aggtelek Karst/Slovakian Karst Caves" (HUSK/1101/221/0180, Aggtelek NP), data evaluation: 'Multipurpose assessment serving forest biodiversity conservation in the Carpathian region of Hungary', Swiss-Hungarian Cooperation Programme (SH/4/13 Project). BS contributed as an Alexander von Humboldt Research Fellow. J. Kovács, S. Kovács, N. Magyar, P. Tanos, I. G. Hatvani, and A. Anda (2014), Classification into homogeneous groups using combined cluster and discriminant analysis, Environmental Modelling & Software, 57, 52-59.

  4. The influence of essential oils on human attention. I: alertness.

    PubMed

    Ilmberger, J; Heuberger, E; Mahrhofer, C; Dessovic, H; Kowarik, D; Buchbauer, G

    2001-03-01

    Scientific research on the effects of essential oils on human behavior lags behind the promises made by popular aromatherapy. Nearly all aspects of human behavior are closely linked to processes of attention, the basic level being that of alertness, which ranges from sleep to wakefulness. In our study we measured the influence of essential oils and components of essential oils [peppermint, jasmine, ylang-ylang, 1,8-cineole (in two different dosages) and menthol] on this core attentional function, which can be experimentally defined as speed of information processing. Substances were administered by inhalation; levels of alertness were assessed by measuring motor and reaction times in a reaction time paradigm. The performances of the six experimental groups receiving substances (n = 20 in four groups, n = 30 in two groups) were compared with those of corresponding control groups receiving water. Between-group analysis, i.e. comparisons between experimental groups and their respective control groups, mainly did not reach statistical significance. However, within-group analysis showed complex correlations between subjective evaluations of substances and objective performance, indicating that effects of essential oils or their components on basic forms of attentional behavior are mainly psychological.

  5. Regression: The Apple Does Not Fall Far From the Tree.

    PubMed

    Vetter, Thomas R; Schober, Patrick

    2018-05-15

    Researchers and clinicians are frequently interested in either: (1) assessing whether there is a relationship or association between 2 or more variables and quantifying this association; or (2) determining whether 1 or more variables can predict another variable. The strength of such an association is mainly described by the correlation. However, regression analysis and regression models can be used not only to identify whether there is a significant relationship or association between variables but also to generate estimations of such a predictive relationship between variables. This basic statistical tutorial discusses the fundamental concepts and techniques related to the most common types of regression analysis and modeling, including simple linear regression, multiple regression, logistic regression, ordinal regression, and Poisson regression, as well as the common yet often underrecognized phenomenon of regression toward the mean. The various types of regression analysis are powerful statistical techniques, which when appropriately applied, can allow for the valid interpretation of complex, multifactorial data. Regression analysis and models can assess whether there is a relationship or association between 2 or more observed variables and estimate the strength of this association, as well as determine whether 1 or more variables can predict another variable. Regression is thus being applied more commonly in anesthesia, perioperative, critical care, and pain research. However, it is crucial to note that regression can identify plausible risk factors; it does not prove causation (a definitive cause and effect relationship). The results of a regression analysis instead identify independent (predictor) variable(s) associated with the dependent (outcome) variable. As with other statistical methods, applying regression requires that certain assumptions be met, which can be tested with specific diagnostics.
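    The simplest of the models discussed, simple linear regression, can be sketched with ordinary least squares on simulated data. The coefficients and noise level below are invented for illustration; real anesthesia or perioperative data would of course require the assumption checks the tutorial describes.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Simulated data: outcome = 2.0 + 0.5 * predictor + noise
    x = rng.normal(10.0, 2.0, 100)
    y = 2.0 + 0.5 * x + rng.normal(0.0, 1.0, 100)

    # Ordinary least squares via the design matrix (intercept column + predictor)
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    intercept, slope = beta

    residuals = y - X @ beta
    r_squared = 1 - residuals.var() / y.var()
    print(f"intercept={intercept:.2f}, slope={slope:.2f}, R^2={r_squared:.2f}")
    ```

    The fitted slope estimates the association between predictor and outcome; as the tutorial stresses, a significant slope identifies a plausible risk factor but does not by itself establish causation.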

  6. Polymorphisms in the FGF2 gene and risk of serous ovarian cancer: results from the Ovarian Cancer Association Consortium

    PubMed Central

    Johnatty, Sharon E.; Beesley, Jonathan; Chen, Xiaoqing; Spurdle, Amanda B.; deFazio, Anna; Webb, Penelope M; Goode, Ellen L.; Rider, David N.; Vierkant, Robert A.; Anderson, Stephanie; Wu, Anna H.; Pike, Malcolm; Van Den Berg, David; Moysich, Kirsten; Ness, Roberta; Doherty, Jennifer; Rossing, Mary-Anne; Pearce, Celeste Leigh; Chenevix-Trench, Georgia

    2009-01-01

    Fibroblast growth factor (FGF)-2 (basic) is a potent angiogenic molecule involved in tumour progression, and is one of several growth factors with a central role in ovarian carcinogenesis. We hypothesised that common single nucleotide polymorphisms (SNPs) in the FGF2 gene may alter angiogenic potential and thereby susceptibility to ovarian cancer. We analysed 25 FGF2 tgSNPs using five independent study populations from the United States and Australia. Analysis was restricted to non-Hispanic White women with serous ovarian carcinoma (1269 cases and 2829 controls). There were no statistically significant associations between any FGF2 SNPs and ovarian cancer risk. There were two nominally statistically significant associations between heterozygosity for two FGF2 SNPs (rs308379 and rs308447; p<0.05) and serous ovarian cancer risk in the combined dataset, but rare homozygous estimates did not achieve statistical significance, nor were they consistent with the log additive model of inheritance. Overall, genetic variation in FGF2 does not appear to play a role in susceptibility to ovarian cancer. PMID:19456219

  7. RipleyGUI: software for analyzing spatial patterns in 3D cell distributions

    PubMed Central

    Hansson, Kristin; Jafari-Mamaghani, Mehrdad; Krieger, Patrik

    2013-01-01

    The true revolution in the age of digital neuroanatomy is the ability to extensively quantify anatomical structures and thus investigate structure-function relationships in great detail. To facilitate the quantification of neuronal cell patterns we have developed RipleyGUI, a MATLAB-based software that can be used to detect patterns in the 3D distribution of cells. RipleyGUI uses Ripley's K-function to analyze spatial distributions. In addition, the software contains statistical tools to determine quantitative statistical differences, and tools for spatial transformations that are useful for analyzing non-stationary point patterns. The software has a graphical user interface making it easy to use without programming experience, and an extensive user manual explaining the basic concepts underlying the different statistical tools used to analyze spatial point patterns. The described analysis tool can be used for determining the spatial organization of neurons that is important for a detailed study of structure-function relationships. For example, neocortex that can be subdivided into six layers based on cell density and cell types can also be analyzed in terms of organizational principles distinguishing the layers. PMID:23658544
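    The statistic at the heart of RipleyGUI can be sketched with a naive estimator (this is an illustration in Python, not the MATLAB tool's implementation, and it omits the edge correction a proper analysis applies). Under complete spatial randomness, K(r) for a 3D pattern should approach the volume of a ball of radius r, (4/3)πr³; clustering pushes K above that curve, regularity below it.

    ```python
    import numpy as np

    def ripley_k_3d(points, radii, volume):
        """Naive Ripley's K estimator for a 3-D point pattern.

        K(r) = (V / (n*(n-1))) * number of ordered pairs closer than r.
        Edge correction is omitted for brevity, so K is slightly biased low.
        """
        n = len(points)
        d = np.sqrt(((points[:, None, :] - points[None, :, :]) ** 2).sum(-1))
        pair_d = d[np.triu_indices(n, k=1)]        # unordered pair distances
        return np.array([volume * 2 * np.count_nonzero(pair_d < r) / (n * (n - 1))
                         for r in radii])

    # 500 uniformly random points in a unit cube (complete spatial randomness).
    rng = np.random.default_rng(5)
    pts = rng.uniform(0.0, 1.0, size=(500, 3))
    radii = np.array([0.05, 0.1, 0.15])
    K = ripley_k_3d(pts, radii, volume=1.0)
    expected = 4.0 / 3.0 * np.pi * radii ** 3
    print(K / expected)  # ratios approach 1 from below (no edge correction)
    ```

    Replacing the uniform points with, say, cell coordinates from one cortical layer and comparing K against the CSR curve is the basic test the software automates.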

  8. Qualitative and quantitative evaluation of some vocal function parameters following fitting of a prosthesis.

    PubMed

    Cavalot, A L; Palonta, F; Preti, G; Nazionale, G; Ricci, E; Vione, N; Albera, R; Cortesina, G

    2001-12-01

    The insertion of a prosthesis and restoration with pectoralis major myocutaneous flaps for patients subjected to total pharyngolaryngectomy is a now universally accepted technique; however, the literature on the subject is lacking. Our study considers 10 patients subjected to total pharyngolaryngectomy and restoration with pectoralis major myocutaneous flaps who were fitted with vocal function prostheses, and a control group of 50 subjects treated with a total laryngectomy without pectoralis major myocutaneous flaps who were fitted with vocal function prostheses. Specific qualitative and quantitative parameters were compared. Differences in the quantitative measurements of voice intensity levels and the harmonics-to-noise ratio were not statistically significant (p > 0.05) between the two study groups at either high- or low-volume speech. On the contrary, statistically significant differences were found (p < 0.05) for the basic frequency of both the low- and the high-volume voice. For the qualitative analysis, seven parameters were established for evaluation by trained and untrained listeners; on the basis of these parameters, the control group had statistically better voices.

  9. Heterogeneous Structure of Stem Cells Dynamics: Statistical Models and Quantitative Predictions

    PubMed Central

    Bogdan, Paul; Deasy, Bridget M.; Gharaibeh, Burhan; Roehrs, Timo; Marculescu, Radu

    2014-01-01

    Understanding stem cell (SC) population dynamics is essential for developing models that can be used in basic science and medicine, to aid in predicting cell fate. These models can be used as tools e.g. in studying patho-physiological events at the cellular and tissue level, predicting (mal)functions along the developmental course, and personalized regenerative medicine. Using time-lapsed imaging and statistical tools, we show that the dynamics of SC populations involve a heterogeneous structure consisting of multiple sub-population behaviors. Using non-Gaussian statistical approaches, we identify the co-existence of fast and slow dividing subpopulations, and quiescent cells, in stem cells from three species. The mathematical analysis also shows that, instead of developing independently, SCs exhibit time-dependent fractal behavior as they interact with each other through molecular and tactile signals. These findings suggest that more sophisticated models of SC dynamics should view SC populations as a collective and avoid the simplifying homogeneity assumption by accounting for the presence of more than one dividing sub-population, and their multi-fractal characteristics. PMID:24769917

  10. Fluctuations and Noise in Stochastic Spread of Respiratory Infection Epidemics in Social Networks

    NASA Astrophysics Data System (ADS)

    Yulmetyev, Renat; Emelyanova, Natalya; Demin, Sergey; Gafarov, Fail; Hänggi, Peter; Yulmetyeva, Dinara

    2003-05-01

    For the analysis of the complexity of epidemic and disease dynamics, it is necessary to understand the basic principles and notions of spreading in long-time-memory media. Here we consider the problem from a theoretical and practical viewpoint, presenting quantitative evidence confirming the existence of stochastic long-range memory and robust chaos in a real time series of infections of the human upper respiratory tract. In this work we present a new statistical method for analyzing the epidemic spread of grippe and acute upper respiratory tract infections by means of the theory of discrete non-Markov stochastic processes. We use the results of our recent theory (Phys. Rev. E 65, 046107 (2002)) to study statistical effects of memory in real data series describing the epidemic dynamics of human acute respiratory tract infections and grippe. The results obtained testify to the possibility of a strict quantitative description of the regular and stochastic components of the epidemic dynamics of social networks, taking into account time discreteness and effects of statistical memory.

  11. Regression modeling of ground-water flow

    USGS Publications Warehouse

    Cooley, R.L.; Naff, R.L.

    1985-01-01

    Nonlinear multiple regression methods are developed to model and analyze groundwater flow systems. Complete descriptions of regression methodology as applied to groundwater flow models allow scientists and engineers engaged in flow modeling to apply the methods to a wide range of problems. Organization of the text proceeds from an introduction that discusses the general topic of groundwater flow modeling, to a review of basic statistics necessary to properly apply regression techniques, and then to the main topic: exposition and use of linear and nonlinear regression to model groundwater flow. Statistical procedures are given to analyze and use the regression models. A number of exercises and answers are included to exercise the student on nearly all the methods that are presented for modeling and statistical analysis. Three computer programs implement the more complex methods. These three are a general two-dimensional, steady-state regression model for flow in an anisotropic, heterogeneous porous medium, a program to calculate a measure of model nonlinearity with respect to the regression parameters, and a program to analyze model errors in computed dependent variables such as hydraulic head. (USGS)

  12. Digital recovery, modification, and analysis of Tetra Tech seismic horizon mapping, National Petroleum Reserve Alaska (NPRA), northern Alaska

    USGS Publications Warehouse

    Saltus, R.W.; Kulander, Christopher S.; Potter, Christopher J.

    2002-01-01

    We have digitized, modified, and analyzed seismic interpretation maps of 12 subsurface stratigraphic horizons spanning portions of the National Petroleum Reserve in Alaska (NPRA). These original maps were prepared by Tetra Tech, Inc., based on about 15,000 miles of seismic data collected from 1974 to 1981. We have also digitized interpreted faults and seismic velocities from Tetra Tech maps. The seismic surfaces were digitized as two-way travel time horizons and converted to depth using Tetra Tech seismic velocities. The depth surfaces were then modified by long-wavelength corrections based on recent USGS seismic re-interpretation along regional seismic lines. We have developed and executed an algorithm to identify and calculate statistics on the area, volume, height, and depth of closed structures based on these seismic horizons. These closure statistics are tabulated and have been used as input to oil and gas assessment calculations for the region. Directories accompanying this report contain basic digitized data, processed data, maps, tabulations of closure statistics, and software relating to this project.

  13. Views of medical students: what, when and how do they want statistics taught?

    PubMed

    Fielding, S; Poobalan, A; Prescott, G J; Marais, D; Aucott, L

    2015-11-01

    A key skill for a practising clinician is being able to do research, understand the statistical analyses and interpret results in the medical literature. Basic statistics has become essential within medical education, but when, what and in which format to teach it is uncertain. To inform curriculum design/development we undertook a quantitative survey of fifth-year medical students and followed up with a series of focus groups to obtain their opinions as to what statistics teaching they want, when and how. A total of 145 students undertook the survey and five focus groups were held with between 3 and 9 participants each. Previous statistical training varied; students recognised that their knowledge was inadequate and were keen to see additional training implemented. Students were aware of the importance of statistics to their future careers, but apprehensive about learning it. Face-to-face teaching supported by online resources was popular. Focus groups indicated the need for statistical training early in their degree and highlighted their lack of confidence and inconsistencies in support. The study found that the students see the importance of statistics training in the medical curriculum but that timing and mode of delivery are key. The findings have informed the design of a new course to be implemented in the third undergraduate year. Teaching will be based around published studies aiming to equip students with the basics required, with additional resources available through a virtual learning environment. © The Author(s) 2015.

  14. A generalized concept for cost-effective structural design. [Statistical Decision Theory applied to aerospace systems

    NASA Technical Reports Server (NTRS)

    Thomas, J. M.; Hawk, J. D.

    1975-01-01

    A generalized concept for cost-effective structural design is introduced. It is assumed that decisions affecting the cost effectiveness of aerospace structures fall into three basic categories: design, verification, and operation. Within these basic categories, certain decisions concerning items such as design configuration, safety factors, testing methods, and operational constraints are to be made. All or some of the variables affecting these decisions may be treated probabilistically. Bayesian statistical decision theory is used as the tool for determining the cost optimum decisions. A special case of the general problem is derived herein, and some very useful parametric curves are developed and applied to several sample structures.

  15. On a Quantum Model of Brain Activities

    NASA Astrophysics Data System (ADS)

    Fichtner, K.-H.; Fichtner, L.; Freudenberg, W.; Ohya, M.

    2010-01-01

    One of the main activities of the brain is the recognition of signals. A first attempt to explain the process of recognition in terms of quantum statistics was given in [6]. Subsequently, details of the mathematical model were presented in a (still incomplete) series of papers (cf. [7, 2, 5, 10]). In the present note we want to give a general view of the principal ideas of this approach. We will introduce the basic spaces and justify the choice of spaces and operations. Further, we bring the model face to face with basic postulates any statistical model of the recognition process should fulfill. These postulates are in accordance with the opinion widely accepted in psychology and neurology.

  16. Innovations in curriculum design: A multi-disciplinary approach to teaching statistics to undergraduate medical students

    PubMed Central

    Freeman, Jenny V; Collier, Steve; Staniforth, David; Smith, Kevin J

    2008-01-01

    Background Statistics is relevant to students and practitioners in medicine and health sciences and is increasingly taught as part of the medical curriculum. However, it is common for students to dislike and under-perform in statistics. We sought to address these issues by redesigning the way that statistics is taught. Methods The project brought together a statistician, clinician and educational experts to re-conceptualize the syllabus, and focused on developing different methods of delivery. New teaching materials, including videos, animations and contextualized workbooks were designed and produced, placing greater emphasis on applying statistics and interpreting data. Results Two cohorts of students were evaluated, one with old style and one with new style teaching. Both were similar with respect to age, gender and previous level of statistics. Students who were taught using the new approach could better define the key concepts of p-value and confidence interval (p < 0.001 for both). They were more likely to regard statistics as integral to medical practice (p = 0.03), and to expect to use it in their medical career (p = 0.003). There was no significant difference in the numbers who thought that statistics was essential to understand the literature (p = 0.28) and those who felt comfortable with the basics of statistics (p = 0.06). More than half the students in both cohorts felt that they were comfortable with the basics of medical statistics. Conclusion Using a variety of media, and placing emphasis on interpretation can help make teaching, learning and understanding of statistics more people-centred and relevant, resulting in better outcomes for students. PMID:18452599
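The two concepts students were asked to define, the p-value and the confidence interval, lend themselves to short computational illustrations. As a minimal, hypothetical sketch (the exam scores are invented, and a fixed z value of 1.96 is used as an approximation to the t critical value), a 95% confidence interval for a mean can be computed like this:

```python
import math

def mean_ci(xs, z=1.96):
    """Approximate 95% confidence interval for a population mean,
    using the normal (z) critical value for simplicity."""
    n = len(xs)
    m = sum(xs) / n
    # Sample standard deviation (n - 1 denominator).
    s = math.sqrt(sum((x - m) ** 2 for x in xs) / (n - 1))
    half = z * s / math.sqrt(n)
    return m - half, m + half

exam_scores = [62, 71, 55, 68, 74, 60, 66, 59, 70, 65]  # invented data
lo, hi = mean_ci(exam_scores)
```

The interval is centered on the sample mean; a wider spread in the scores or a smaller sample widens it.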

  17. A Discussion of the Measurement and Statistical Manipulation of Selected Key Variables in an Adult Basic Education Program.

    ERIC Educational Resources Information Center

    Cunningham, Phyllis M.

    Intending to explore the interaction effects of self-esteem level and perceived program utility on the retention and cognitive achievement of adult basic education students, a self-esteem instrument, to be administered verbally, was constructed with content relevant items developed from and tested on a working class, undereducated, black, adult…

  18. A Quantile Regression Approach to Understanding the Relations among Morphological Awareness, Vocabulary, and Reading Comprehension in Adult Basic Education Students

    ERIC Educational Resources Information Center

    Tighe, Elizabeth L.; Schatschneider, Christopher

    2016-01-01

    The purpose of this study was to investigate the joint and unique contributions of morphological awareness and vocabulary knowledge at five reading comprehension levels in adult basic education (ABE) students. We introduce the statistical technique of multiple quantile regression, which enabled us to assess the predictive utility of morphological…

  19. Use of communication techniques by Maryland dentists.

    PubMed

    Maybury, Catherine; Horowitz, Alice M; Wang, Min Qi; Kleinman, Dushanka V

    2013-12-01

    Health care providers' use of recommended communication techniques can increase patients' adherence to prevention and treatment regimens and improve patient health outcomes. The authors conducted a survey of Maryland dentists to determine the number and type of communication techniques they use on a routine basis. The authors mailed a 30-item questionnaire to a random sample of 1,393 general practice dentists and all 169 members of the Maryland chapter of the American Academy of Pediatric Dentistry. The overall response rate was 38.4 percent. Analysis included descriptive statistics, analysis of variance and ordinary least squares regression analysis to examine the association of dentists' characteristics with the number of communication techniques used. The authors set the significance level at P < .05. General dentists reported routinely using a mean of 7.9 of the 18 communication techniques and 3.6 of the seven basic techniques, whereas pediatric dentists reported using a mean of 8.4 and 3.8 of those techniques, respectively. General dentists who had taken a communication course outside of dental school were more likely than those who had not to use the 18 techniques (P < .01) but not the seven basic techniques (P < .05). Pediatric dentists who had taken a communication course outside of dental school were more likely than those who had not to use the 18 techniques (P < .05) and the seven basic techniques (P < .01). The number of communication techniques that dentists used routinely varied across the 18 techniques and was low for most techniques. Practical implications: professional education is needed both in dental school curricula and continuing education courses to increase use of recommended communication techniques. Specifically, dentists and their team members should consider taking communication skills courses and conducting an overall evaluation of their practices for user friendliness.

  20. Summary Statistics of CPB-Qualified Public Radio Stations: Fiscal Year 1971.

    ERIC Educational Resources Information Center

    Lee, S. Young; Pedone, Ronald J.

    Basic statistics on finance, employment, and broadcast and production activities of 103 Corporation for Public Broadcasting (CPB)--qualified radio stations in the United States and Puerto Rico for Fiscal Year 1971 are collected. The first section of the report deals with total funds, income, direct operating costs, capital expenditures, and other…

  1. Using Statistics to Lie, Distort, and Abuse Data

    ERIC Educational Resources Information Center

    Bintz, William; Moore, Sara; Adams, Cheryll; Pierce, Rebecca

    2009-01-01

    Statistics is a branch of mathematics that involves organization, presentation, and interpretation of data, both quantitative and qualitative. Data do not lie, but people do. On the surface, quantitative data are basically inanimate objects, nothing more than lifeless and meaningless symbols that appear on a page, calculator, computer, or in one's…

  2. What Software to Use in the Teaching of Mathematical Subjects?

    ERIC Educational Resources Information Center

    Berežný, Štefan

    2015-01-01

    We can consider two basic views, when using mathematical software in the teaching of mathematical subjects. First: How to learn to use specific software for the specific tasks, e. g., software Statistica for the subjects of Applied statistics, probability and mathematical statistics, or financial mathematics. Second: How to learn to use the…

  3. Intrex Subject/Title Inverted-File Characteristics.

    ERIC Educational Resources Information Center

    Uemura, Syunsuke

    The characteristics of the Intrex subject/title inverted file are analyzed. Basic statistics of the inverted file are presented including various distributions of the index words and terms from which the file was derived, and statistics on stems, the file growth process, and redundancy measurements. A study of stems both with extremely high and…

  4. The Robustness of the Studentized Range Statistic to Violations of the Normality and Homogeneity of Variance Assumptions.

    ERIC Educational Resources Information Center

    Ramseyer, Gary C.; Tcheng, Tse-Kia

    The present study was directed at determining the extent to which the Type I Error rate is affected by violations in the basic assumptions of the q statistic. Monte Carlo methods were employed, and a variety of departures from the assumptions were examined. (Author)
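The general Monte Carlo approach to checking Type I error rates under assumption violations can be sketched briefly. The snippet below is a simplified illustration, not the studentized range statistic itself: it estimates the empirical rejection rate of a two-sided one-sample test (with a fixed 1.96 normal cutoff as an approximation to the exact critical value) for a normal and for a skewed population, showing how departures from normality shift the rate. All sample sizes and distributions are illustrative choices.

```python
import math
import random

random.seed(1)

def type_one_error(sample_gen, n=15, cutoff=1.96, trials=20000):
    """Estimate the empirical Type I error rate of a two-sided one-sample
    t-type test with a fixed normal cutoff, when the true mean is 0."""
    rejections = 0
    for _ in range(trials):
        xs = [sample_gen() for _ in range(n)]
        m = sum(xs) / n
        s2 = sum((x - m) ** 2 for x in xs) / (n - 1)
        t = m / math.sqrt(s2 / n)
        if abs(t) > cutoff:
            rejections += 1
    return rejections / trials

# Normal population: the rate should sit near the nominal 5% level.
norm_rate = type_one_error(lambda: random.gauss(0.0, 1.0))

# Skewed population (centered exponential): the rate can drift away.
skew_rate = type_one_error(lambda: random.expovariate(1.0) - 1.0)
```

The same scheme extends to any test statistic: simulate under the null with the assumption violated, and count how often the nominal-level test rejects.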

  5. Application of an Online Reference for Reviewing Basic Statistical Principles of Operating Room Management

    ERIC Educational Resources Information Center

    Dexter, Franklin; Masursky, Danielle; Wachtel, Ruth E.; Nussmeier, Nancy A.

    2010-01-01

    Operating room (OR) management differs from clinical anesthesia in that statistical literacy is needed daily to make good decisions. Two of the authors teach a course in operations research for surgical services to anesthesiologists, anesthesia residents, OR nursing directors, hospital administration students, and analysts to provide them with the…

  6. Statistics and Data Interpretation for Social Work

    ERIC Educational Resources Information Center

    Rosenthal, James A.

    2011-01-01

    Written by a social worker for social work students, this is a nuts and bolts guide to statistics that presents complex calculations and concepts in clear, easy-to-understand language. It includes numerous examples, data sets, and issues that students will encounter in social work practice. The first section introduces basic concepts and terms to…

  7. Using Excel in Teacher Education for Sustainability

    ERIC Educational Resources Information Center

    Aydin, Serhat

    2016-01-01

    In this study, the feasibility of using Excel software in teaching whole Basic Statistics Course and its influence on the attitudes of pre-service science teachers towards statistics were investigated. One hundred and two pre-service science teachers in their second year participated in the study. The data were collected from the prospective…

  8. Basic Math Skills and Performance in an Introductory Statistics Course

    ERIC Educational Resources Information Center

    Johnson, Marianne; Kuennen, Eric

    2006-01-01

    We identify the student characteristics most associated with success in an introductory business statistics class, placing special focus on the relationship between student math skills and course performance, as measured by student grade in the course. To determine which math skills are important for student success, we examine (1) whether the…

  9. An Online Course of Business Statistics: The Proportion of Successful Students

    ERIC Educational Resources Information Center

    Pena-Sanchez, Rolando

    2009-01-01

    This article describes the students' academic progress in an online course of business statistics through interactive software assignments and diverse educational homework, which helps these students to build their own e-learning through basic competences; i.e. interpreting results and solving problems. Cross-tables were built for the categorical…

  10. Pediatric outcomes data collection instrument scores in ambulatory children with cerebral palsy: an analysis by age groups and severity level.

    PubMed

    Barnes, Douglas; Linton, Judith L; Sullivan, Elroy; Bagley, Anita; Oeffinger, Donna; Abel, Mark; Damiano, Diane; Gorton, George; Nicholson, Diane; Romness, Mark; Rogers, Sarah; Tylkowski, Chester

    2008-01-01

    The Pediatric Outcomes Data Collection Instrument (PODCI) was developed in 1994 as a patient-based tool for use across a broad age range and wide array of musculoskeletal disorders, including children with cerebral palsy (CP). The purpose of this study was to establish means and SDs of the Parent PODCI measures by age group and Gross Motor Function Classification System (GMFCS) level for ambulatory children with CP. This instrument was one of several studied in a prospective, multicenter project of ambulatory patients with CP between the ages of 4 and 18 years and GMFCS levels I through III. Participants included 338 boys and 221 girls at a mean age of 11.1 years; 370 were diplegic, 162 hemiplegic, and 27 quadriplegic. Both baseline and follow-up data sets of the completed Parent PODCI responses were statistically analyzed. Age was identified as a significant predictor of the PODCI measures of Upper Extremity Function, Transfers and Basic Mobility, Global Function, and Happiness With Physical Condition. GMFCS level was a significant predictor of Transfers and Basic Mobility, Sports and Physical Function, and Global Function. Pattern of involvement, sex, and prior orthopaedic surgery were not statistically significant predictors for any of the Parent PODCI measures. Mean and SD scores were calculated for age groups stratified by GMFCS level. Analysis of the follow-up data set validated the findings derived from the baseline data. Linear regression equations were derived, with age as a continuous variable and GMFCS level as a categorical variable, to be used for Parent PODCI predicted scores. The results of this study provide clinicians and researchers with a set of Parent PODCI values for comparison to age- and severity-matched populations of ambulatory patients with CP.

  11. Knowledge, attitude and anxiety pertaining to basic life support and medical emergencies among dental interns in Mangalore City, India.

    PubMed

    Somaraj, Vinej; Shenoy, Rekha P; Panchmal, Ganesh Shenoy; Jodalli, Praveen S; Sonde, Laxminarayan; Karkal, Ravichandra

    2017-01-01

    This cross-sectional study aimed to assess the knowledge, attitude and anxiety pertaining to basic life support (BLS) and medical emergencies among interns in dental colleges of Mangalore city, Karnataka, India. The study subjects comprised interns who volunteered from the four dental colleges. The knowledge and attitude of interns were assessed using a 30-item questionnaire prepared based on the Basic Life Support Manual of the American Heart Association, and the anxiety of interns pertaining to BLS and medical emergencies was assessed using the State-Trait Anxiety Inventory (STAI) questionnaire. The chi-square test was performed in SPSS 21.0 (IBM Statistics, 2012) to determine statistically significant differences (P < 0.05) between assessed knowledge and anxiety. Out of 183 interns, 39.89% had below-average knowledge. A total of 123 (67.21%) reported unavailability of professional training. The majority (180, 98.36%) felt an urgent need for training in basic life support procedures. Assessment of stress showed that 27.1% of participants were above the high-stress level. The association between assessed knowledge and stress was not statistically significant (P = 0.983). There was an evident lack of knowledge pertaining to the management of medical emergencies among the interns. Since dental interns will move out into society as oral health care providers, a focus should be placed on training them in Basic Life Support procedures.

  12. A new course and textbook on Physical Models of Living Systems, for science and engineering undergraduates

    NASA Astrophysics Data System (ADS)

    Nelson, Philip

    2015-03-01

    I'll describe an intermediate-level course on ``Physical Models of Living Systems.'' The only prerequisite is first-year university physics and calculus. The course is a response to rapidly growing interest among undergraduates in a broad range of science and engineering majors. Students acquire several research skills that are often not addressed in traditional courses: basic modeling skills; probabilistic modeling skills; data analysis methods; computer programming using a general-purpose platform like MATLAB or Python; and dynamical systems, particularly feedback control. These basic skills, which are relevant to nearly any field of science or engineering, are presented in the context of case studies from living systems, including virus dynamics, bacterial genetics and the evolution of drug resistance, statistical inference, superresolution microscopy, synthetic biology, and naturally evolved cellular circuits. Work supported by NSF Grants EF-0928048 and DMR-0832802.

  13. Density Functionals of Chemical Bonding

    PubMed Central

    Putz, Mihai V.

    2008-01-01

    The behavior of electrons in general many-electronic systems throughout the density functionals of energy is reviewed. The basic physico-chemical concepts of density functional theory are employed to highlight the role of energy in chemical structure, while its extended influence on the electronic localization function helps in understanding chemical bonding. In this context the energy functionals, accompanied by electronic localization functions, may provide a comprehensive description of electronic structure at global and local levels in general, and of chemical bonds in particular. Becke-Edgecombe and the author’s Markovian electronic localization functions are discussed at the atomic, molecular and solid-state levels. Then, an analytical survey of the main workable kinetic, exchange, and correlation density functionals within local and gradient density approximations is undertaken. The hierarchy of various energy functionals is formulated by employing both their parabolic and statistical degrees of correlation with the electronegativity and chemical hardness indices, by means of quantitative structure-property relationship (QSPR) analysis for basic atomic and molecular systems. PMID:19325846

  14. The Prediction Properties of Inverse and Reverse Regression for the Simple Linear Calibration Problem

    NASA Technical Reports Server (NTRS)

    Parker, Peter A.; Geoffrey, Vining G.; Wilson, Sara R.; Szarka, John L., III; Johnson, Nels G.

    2010-01-01

    The calibration of measurement systems is a fundamental but under-studied problem within industrial statistics. The origins of this problem go back to basic chemical analysis based on NIST standards. In today's world these issues extend to mechanical, electrical, and materials engineering. Often, these new scenarios do not provide "gold standards" such as the standard weights provided by NIST. This paper considers the classic "forward regression followed by inverse regression" approach. In this approach the initial experiment treats the "standards" as the regressor and the observed values as the response to calibrate the instrument. The analyst then must invert the resulting regression model in order to use the instrument to make actual measurements in practice. This paper compares this classical approach to "reverse regression," which treats the standards as the response and the observed measurements as the regressor in the calibration experiment. Such an approach is intuitively appealing because it avoids the need for the inverse regression. However, it also violates some of the basic regression assumptions.

  15. Inferring epidemiological dynamics of infectious diseases using Tajima's D statistic on nucleotide sequences of pathogens.

    PubMed

    Kim, Kiyeon; Omori, Ryosuke; Ito, Kimihito

    2017-12-01

    The estimation of the basic reproduction number is essential to understanding epidemic dynamics, and time series data of infected individuals are usually used for the estimation. However, such data are not always available. Methods to estimate the basic reproduction number using genealogies constructed from nucleotide sequences of pathogens have been proposed. Here, we propose a new method to estimate epidemiological parameters of outbreaks using the time series change of Tajima's D statistic on the nucleotide sequences of pathogens. To relate the time evolution of Tajima's D to the number of infected individuals, we constructed a parsimonious mathematical model describing both the transmission process of pathogens among hosts and the evolutionary process of the pathogens. As a case study we applied this method to field data of nucleotide sequences of pandemic influenza A (H1N1) 2009 viruses collected in Argentina. The Tajima's D-based method estimated the basic reproduction number to be 1.55 with a 95% highest posterior density (HPD) interval between 1.31 and 2.05, and the date of the epidemic peak to be 10th July with a 95% HPD interval between 22nd June and 9th August. The estimated basic reproduction number was consistent with estimation by birth-death skyline plot and estimation using the time series of the number of infected individuals. These results suggest that Tajima's D statistic on nucleotide sequences of pathogens can be useful for estimating epidemiological parameters of outbreaks. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
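Tajima's D itself is computed from two estimates of genetic diversity in a sample of aligned sequences: the mean number of pairwise differences and the number of segregating sites. A minimal sketch, using the standard normalizing constants from Tajima (1989) and a tiny invented alignment:

```python
import itertools
import math

def tajimas_d(seqs):
    """Tajima's D from a list of equal-length aligned sequences."""
    n = len(seqs)
    # Segregating sites: positions carrying more than one allele.
    S = sum(1 for site in zip(*seqs) if len(set(site)) > 1)
    if S == 0:
        return 0.0
    # Mean number of pairwise differences (pi).
    pairs = list(itertools.combinations(seqs, 2))
    pi = sum(sum(a != b for a, b in zip(s1, s2)) for s1, s2 in pairs) / len(pairs)
    # Normalizing constants from Tajima (1989).
    a1 = sum(1.0 / i for i in range(1, n))
    a2 = sum(1.0 / i ** 2 for i in range(1, n))
    b1 = (n + 1) / (3.0 * (n - 1))
    b2 = 2.0 * (n ** 2 + n + 3) / (9.0 * n * (n - 1))
    c1 = b1 - 1.0 / a1
    c2 = b2 - (n + 2) / (a1 * n) + a2 / a1 ** 2
    e1 = c1 / a1
    e2 = c2 / (a1 ** 2 + a2)
    # D contrasts pi with Watterson's estimator S/a1.
    return (pi - S / a1) / math.sqrt(e1 * S + e2 * S * (S - 1))

demo = ["AATG", "AATG", "ACTG", "ACTA"]  # invented 4-sequence alignment
d = tajimas_d(demo)
```

In the method described in the record, this statistic is tracked over time windows and related to the number of infected individuals through the transmission model.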

  16. Linear combination reading program for capture gamma rays

    USGS Publications Warehouse

    Tanner, Allan B.

    1971-01-01

    This program computes a weighting function, Qj, which gives a scalar output value of unity when applied to the spectrum of a desired element and a minimum value (considering statistics) when applied to spectra of materials not containing the desired element. Intermediate values are obtained for materials containing the desired element, in proportion to the amount of the element they contain. The program is written in the BASIC language in a format specific to the Hewlett-Packard 2000A Time-Sharing System, and is an adaptation of an earlier program for linear combination reading for X-ray fluorescence analysis (Tanner and Brinkerhoff, 1971). Following the program is a sample run from a study of the application of the linear combination technique to capture-gamma-ray analysis for calcium (report in preparation).

  17. Importance of nasal clipping in screening investigations of flow volume curve.

    PubMed

    Yanev, I

    1992-01-01

    Comparative analysis of some basic lung indices obtained from a screening investigation of the flow volume curve by using two techniques, with a nose clip and without a nose clip, was made on a cohort of 86 workers in a factory shop for the production of bearings. We found no statistically significant differences between the indices obtained by the two techniques. Our study showed that the FVC and FEV1 obtained in workers without using nose clips were equal to or better than those obtained using nose clips in 60% of the workers. The reproducibility of the two methods was similar. The analysis of the data has shown that the flow volume curve investigation gives better results when performed without a nose clip, especially in industrial conditions.

  18. Factor Analysis of Traffic Safety in Urban Roads Based on FTA-LEC

    NASA Astrophysics Data System (ADS)

    Shuicheng, TIAN; Xingbo, YANG; Xiaoqing, SHEN; Detao, ZHANG

    2018-05-01

    In order to reduce the number and losses of urban road traffic accidents in our country and improve the safety of road traffic, a statistical analysis of the research report on major road traffic accidents in 2016 was conducted. First, the risk factors affecting urban road traffic in China were analyzed using FTA to find the basic hidden-danger events. Secondly, the risk values of the identified hidden-danger events were calculated and classified into four levels, I, II, III and IV, using the LEC evaluation method. Finally, the graded results of the risk factors were verified through a case of specific accidents in Beijing. The results show that the case verified the scientific soundness and effectiveness of the hazard classification and provided guidance for urban road traffic management.

  19. Risk management of key issues of FPSO

    NASA Astrophysics Data System (ADS)

    Sun, Liping; Sun, Hai

    2012-12-01

    Risk analysis of key systems has become a growing topic of late because of the development of offshore structures. Equipment failures of the offloading system and fire accidents were analyzed based on the features of floating production, storage and offloading (FPSO) units. Fault tree analysis (FTA) and failure modes and effects analysis (FMEA) methods were examined based on information already researched on modules of Relex Reliability Studio (RRS). Given the shortage of failure cases and statistical data, equipment failures were also analyzed qualitatively by establishing a fault tree and a Boolean structure function, and risk control measures were examined. Failure modes of fire accidents were classified according to the different areas of fire occurrence during the FMEA process, using risk priority number (RPN) methods to evaluate their severity rank. The qualitative analysis of FTA gave basic insight into the formation of the failure modes of FPSO offloading, and the fire FMEA gave the priorities and suggested processes. The research has practical importance for the security analysis problems of FPSO units.

  20. Competing risk models in reliability systems, an exponential distribution model with Bayesian analysis approach

    NASA Astrophysics Data System (ADS)

    Iskandar, I.

    2018-03-01

    The exponential distribution is the most widely used distribution in reliability analysis. It is very suitable for representing the lengths of life in many cases and is available in a simple statistical form. The characteristic of this distribution is a constant hazard rate. The exponential distribution is a special case of the Weibull distribution. In this paper we introduce the basic notions that constitute an exponential competing risks model in reliability analysis using a Bayesian approach and present the analytic methods. The cases are limited to models with independent causes of failure. A non-informative prior distribution is used in our analysis. We describe the likelihood function, followed by the posterior distribution and the point, interval, hazard function, and reliability estimates. The net probability of failure if only one specific risk is present, the crude probability of failure due to a specific risk in the presence of other causes, and partial crude probabilities are also included.
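For the single-risk exponential case, the non-informative prior p(λ) ∝ 1/λ yields a Gamma posterior for the hazard rate: with n failures and total observed time T, λ | data ~ Gamma(shape = n, scale = 1/T). A minimal sketch under that assumption, with invented failure times, summarizing the posterior by Monte Carlo sampling:

```python
import math
import random

random.seed(0)

def exponential_posterior(failure_times, t, draws=50000):
    """Posterior summary for a single-risk exponential lifetime model
    with the non-informative prior p(lambda) ∝ 1/lambda."""
    n = len(failure_times)
    T = sum(failure_times)
    # Draw from the Gamma(n, 1/T) posterior of the hazard rate.
    lams = sorted(random.gammavariate(n, 1.0 / T) for _ in range(draws))
    post_mean = sum(lams) / draws  # posterior mean hazard rate (≈ n/T)
    # Equal-tailed 95% credible interval from the sorted draws.
    ci = (lams[int(0.025 * draws)], lams[int(0.975 * draws)])
    # Posterior mean reliability at mission time t, R(t) = exp(-lambda * t).
    rel = sum(math.exp(-lam * t) for lam in lams) / draws
    return post_mean, ci, rel

failure_hours = [12.0, 7.5, 20.1, 3.3, 15.8, 9.4]  # invented failure times
mean_lam, ci, rel = exponential_posterior(failure_hours, t=5.0)
```

The competing-risks model in the record extends this by giving each independent cause its own exponential rate, so the same Gamma-posterior machinery applies cause by cause.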

  1. An intelligent system based on fuzzy probabilities for medical diagnosis– a study in aphasia diagnosis*

    PubMed Central

    Moshtagh-Khorasani, Majid; Akbarzadeh-T, Mohammad-R; Jahangiri, Nader; Khoobdel, Mehdi

    2009-01-01

    BACKGROUND: Aphasia diagnosis is particularly challenging due to the linguistic uncertainty and vagueness, inconsistencies in the definition of aphasic syndromes, large number of measurements with imprecision, natural diversity and subjectivity in test objects as well as in opinions of experts who diagnose the disease. METHODS: Fuzzy probability is proposed here as the basic framework for handling the uncertainties in medical diagnosis and particularly aphasia diagnosis. To efficiently construct this fuzzy probabilistic mapping, statistical analysis is performed that constructs input membership functions as well as determines an effective set of input features. RESULTS: Considering the high sensitivity of performance measures to different distribution of testing/training sets, a statistical t-test of significance is applied to compare fuzzy approach results with NN results as well as author's earlier work using fuzzy logic. The proposed fuzzy probability estimator approach clearly provides better diagnosis for both classes of data sets. Specifically, for the first and second type of fuzzy probability classifiers, i.e. spontaneous speech and comprehensive model, P-values are 2.24E-08 and 0.0059, respectively, strongly rejecting the null hypothesis. CONCLUSIONS: The technique is applied and compared on both comprehensive and spontaneous speech test data for diagnosis of four Aphasia types: Anomic, Broca, Global and Wernicke. Statistical analysis confirms that the proposed approach can significantly improve accuracy using fewer Aphasia features. PMID:21772867

  2. Data Mining CMMSs: How to Convert Data into Knowledge.

    PubMed

    Fennigkoh, Larry; Nanney, D Courtney

    2018-01-01

    Although the healthcare technology management (HTM) community has decades of accumulated medical device-related maintenance data, little knowledge has been gleaned from these data. Finding and extracting such knowledge requires the use of the well-established, but admittedly somewhat foreign to HTM, application of inferential statistics. This article sought to provide a basic background on inferential statistics and describe a case study of their application, limitations, and proper interpretation. The research question associated with this case study involved examining the effects of ventilator preventive maintenance (PM) labor hours, age, and manufacturer on needed unscheduled corrective maintenance (CM) labor hours. The study sample included more than 21,000 combined PM inspections and CM work orders on 2,045 ventilators from 26 manufacturers during a five-year period (2012-16). A multiple regression analysis revealed that device age, manufacturer, and accumulated PM inspection labor hours all influenced the amount of CM labor significantly (P < 0.001). In essence, CM labor hours increased with increasing PM labor. However, and despite the statistical significance of these predictors, the regression analysis also indicated that ventilator age, manufacturer, and PM labor hours only explained approximately 16% of all variability in CM labor, with the remainder (84%) caused by other factors that were not included in the study. As such, the regression model obtained here is not suitable for predicting ventilator CM labor hours.
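
The explained-variance figure at the heart of this study is the coefficient of determination. A minimal sketch simplified to a single predictor (the study fitted a multiple regression; the age and labour values below are invented):

```python
import statistics as st

def r_squared(x, y):
    """Coefficient of determination for a one-predictor linear fit:
    the share of variance in y explained by x."""
    mx, my = st.mean(x), st.mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

# invented example: ventilator age (years) vs corrective-maintenance labour (hours)
age = [2, 4, 5, 7, 9, 11, 12, 14]
cm  = [3.1, 3.8, 4.0, 5.2, 5.1, 6.9, 6.4, 7.8]
print(r_squared(age, cm))  # near 1 for this toy data; the study reported only ~0.16
```

An R² of 0.16, as reported, means 84% of the variability is unexplained, which is exactly why the authors caution against using the model for prediction.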

  3. A Correlational Study of the Relationships between Music Aptitude and Phonemic Awareness of Kindergarten Children

    ERIC Educational Resources Information Center

    Rubinson, Laura E.

    2010-01-01

    More than one third of American children cannot read at a basic level by fourth grade (Lee, Grigg, & Donahue, 2007) and those numbers are even higher for African American, Hispanic and poor White students (Boorman et al., 2007). These are alarming statistics given that the ability to read is the most basic and fundamental skill for academic…

  4. Availability of Instructional Materials at the Basic Education Level in Enugu Educational Zone of Enugu State, Nigeria

    ERIC Educational Resources Information Center

    Chukwu, Leo C.; Eze, Thecla A. Y.; Agada, Fidelia Chinyelugo

    2016-01-01

    The study examined the availability of instructional materials at the basic education level in Enugu Education Zone of Enugu State, Nigeria. One research question and one hypothesis guided the study. The research question was answered using mean and grand mean ratings, while the hypothesis was tested using t-test statistics at 0.05 level of…

  5. Basic statistics with Microsoft Excel: a review.

    PubMed

    Divisi, Duilio; Di Leonardo, Gabriella; Zaccagna, Gino; Crisci, Roberto

    2017-06-01

    The scientific world is enriched daily with new knowledge, due to new technologies and continuous discoveries. Mathematical functions underpin the statistical concepts, particularly the mean, median, and mode, along with frequency and frequency distributions associated with histograms and graphical representations, driving the elaborative processes built on spreadsheet operations. The aim of the study is to highlight the mathematical basis of the statistical models that govern the operation of spreadsheets in Microsoft Excel.
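
The spreadsheet functions the review covers have direct counterparts in any language. A minimal sketch of the mean, median, mode, and frequency distribution on a toy data set:

```python
import statistics as st
from collections import Counter

data = [2, 3, 3, 5, 7, 7, 7, 9]
print(st.mean(data))    # 5.375, as Excel's AVERAGE would return
print(st.median(data))  # 6.0   (average of the two middle values)
print(st.mode(data))    # 7     (most frequent value)
print(Counter(data))    # frequency distribution, as a histogram would tabulate
```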

  6. Basic statistics with Microsoft Excel: a review

    PubMed Central

    Di Leonardo, Gabriella; Zaccagna, Gino; Crisci, Roberto

    2017-01-01

    The scientific world is enriched daily with new knowledge, due to new technologies and continuous discoveries. Mathematical functions underpin the statistical concepts, particularly the mean, median, and mode, along with frequency and frequency distributions associated with histograms and graphical representations, driving the elaborative processes built on spreadsheet operations. The aim of the study is to highlight the mathematical basis of the statistical models that govern the operation of spreadsheets in Microsoft Excel. PMID:28740690

  7. Spatial variability and long-term analysis of groundwater quality of Faisalabad industrial zone

    NASA Astrophysics Data System (ADS)

    Nasir, Muhammad Salman; Nasir, Abdul; Rashid, Haroon; Shah, Syed Hamid Hussain

    2017-10-01

    Water is a basic necessity of life and is essential for a healthy society. In this study, a groundwater quality analysis was carried out for the industrial zone of Faisalabad city. Sixty samples of groundwater were collected from the study area, and quality maps of the analyzed results were prepared in GIS. The collected samples were analyzed for chemical parameters and heavy metals, such as total hardness, alkalinity, cadmium, arsenic, nickel, lead, and fluoride, and the results were compared with the WHO guidelines. The results were represented by mapping the quality parameters in ArcView GIS v9.3, with IDW used for raster interpolation. A long-term analysis of these parameters was carried out using the `R Statistical' software. It was concluded that the groundwater is partly unfit for drinking, and its direct use may cause health issues.
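
The IDW raster interpolation used in the study estimates each grid cell as a distance-weighted average of nearby samples. A minimal stand-in for ArcView's implementation (the well coordinates and hardness values below are invented):

```python
def idw(samples, target, power=2):
    """Inverse-distance-weighted estimate at `target` from (x, y, value) samples."""
    num = den = 0.0
    for x, y, v in samples:
        d2 = (x - target[0]) ** 2 + (y - target[1]) ** 2
        if d2 == 0.0:
            return v  # target coincides with a sample point
        w = 1.0 / d2 ** (power / 2)
        num += w * v
        den += w
    return num / den

# invented total-hardness readings (mg/L) at well coordinates
wells = [(0.0, 0.0, 300.0), (2.0, 0.0, 500.0), (1.0, 2.0, 400.0)]
print(idw(wells, (1.0, 0.0)))
```

Raising `power` makes the estimate more local, which is the main tuning choice IDW offers.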

  8. Higher Order Analysis of Turbulent Changes Found in the ELF Range Electric Field Plasma Before Major Earthquakes

    NASA Astrophysics Data System (ADS)

    Kosciesza, M.; Blecki, J. S.; Parrot, M.

    2014-12-01

    We report a structure-function analysis of changes found in ELF-range electric-field plasma turbulence registered in the ionosphere over the epicenter regions of major earthquakes (depth less than 40 km) that took place during the 6.5 years of the scientific mission of the DEMETER satellite. We compare the data for earthquakes for which we found turbulence with events without any turbulent changes. The structure functions were also calculated for the polar cusp region and the equatorial spread-F region. Basic studies of the turbulent processes were conducted using higher-order spectra and higher-order statistics. The structure-function analysis was performed to locate and check for intermittent behavior in the ionospheric plasma over the epicenter regions of the earthquakes. These registrations are correlated with the plasma parameters measured onboard the DEMETER satellite and with geomagnetic indices.
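
The structure function at the core of this analysis is the average q-th power of increments at lag tau. A minimal sketch (the toy signal below stands in for the actual ELF electric-field series):

```python
def structure_function(x, tau, q):
    """q-th order structure function S_q(tau) = <|x(t + tau) - x(t)|**q>."""
    diffs = [abs(x[i + tau] - x[i]) ** q for i in range(len(x) - tau)]
    return sum(diffs) / len(diffs)

# toy signal; intermittency shows up as anomalous scaling of S_q with tau and q
signal = [0.0, 1.0, 0.5, 2.0, 1.5, 3.0]
print([structure_function(signal, 1, q) for q in (1, 2, 3)])
```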

  9. Applications of non-parametric statistics and analysis of variance on sample variances

    NASA Technical Reports Server (NTRS)

    Myers, R. H.

    1981-01-01

    Nonparametric methods that are available for NASA-type applications are discussed. An attempt is made here to survey what can be used, to offer recommendations as to when each method is applicable, and to compare the methods, when possible, with the usual normal-theory procedures available for the Gaussian analog. It is important to point out the hypotheses being tested, the assumptions being made, and the limitations of the nonparametric procedures. The appropriateness of performing analysis of variance on sample variances is also discussed and studied. This procedure is followed in several NASA simulation projects, and on the surface it would appear to be a reasonably sound procedure. However, the difficulties involved center around the normality problem and the basic homogeneous-variance assumption that is made in usual analysis-of-variance problems. These difficulties are discussed and guidelines are given for using the methods.
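
A classic nonparametric alternative to the two-sample t procedure is the Mann-Whitney U statistic, which compares ranks rather than means. A minimal sketch of the statistic itself (significance would still require a null distribution or normal approximation, omitted here):

```python
def mann_whitney_u(a, b):
    """U statistic for sample a versus b; ties contribute 1/2.
    No normality assumption is needed, unlike the usual normal-theory procedures."""
    u = 0.0
    for x in a:
        for y in b:
            if x > y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u

print(mann_whitney_u([3.1, 4.2, 5.0], [1.0, 2.2, 2.9]))  # 9.0: complete separation
```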

  10. A. C. C. Fact Book: A Statistical Profile of Allegany Community College and the Community It Serves.

    ERIC Educational Resources Information Center

    Andersen, Roger C.

    This document is intended to be an authoritative compilation of frequently referenced basic facts concerning Allegany Community College (ACC) in Maryland. It is a statistical profile of ACC and the community it serves, divided into six sections: enrollment, students, faculty, community, support services, and general college related information.…

  11. Basic Mathematics Test Predicts Statistics Achievement and Overall First Year Academic Success

    ERIC Educational Resources Information Center

    Fonteyne, Lot; De Fruyt, Filip; Dewulf, Nele; Duyck, Wouter; Erauw, Kris; Goeminne, Katy; Lammertyn, Jan; Marchant, Thierry; Moerkerke, Beatrijs; Oosterlinck, Tom; Rosseel, Yves

    2015-01-01

    In the psychology and educational science programs at Ghent University, only 36.1% of the new incoming students in 2011 and 2012 passed all exams. Despite availability of information, many students underestimate the scientific character of social science programs. Statistics courses are a major obstacle in this matter. Not all enrolling students…

  12. The Structure of Research Methodology Competency in Higher Education and the Role of Teaching Teams and Course Temporal Distance

    ERIC Educational Resources Information Center

    Schweizer, Karl; Steinwascher, Merle; Moosbrugger, Helfried; Reiss, Siegbert

    2011-01-01

    The development of research methodology competency is a major aim of the psychology curriculum at universities. Usually, three courses concentrating on basic statistics, advanced statistics and experimental methods, respectively, serve the achievement of this aim. However, this traditional curriculum-based course structure gives rise to the…

  13. Statistical estimators for monitoring spotted owls in Oregon and Washington in 1987.

    Treesearch

    Timothy A. Max; Ray A. Souter; Kathleen A. O'Halloran

    1990-01-01

    Spotted owls (Strix occidentalis) were monitored on 11 National Forests in the Pacific Northwest Region of the USDA Forest Service between March and August of 1987. The basic intent of monitoring was to provide estimates of occupancy and reproduction rates for pairs of spotted owls. This paper documents the technical details of the statistical...

  14. Statistical techniques for sampling and monitoring natural resources

    Treesearch

    Hans T. Schreuder; Richard Ernst; Hugo Ramirez-Maldonado

    2004-01-01

    We present the statistical theory of inventory and monitoring from a probabilistic point of view. We start with the basics and show the interrelationships between designs and estimators illustrating the methods with a small artificial population as well as with a mapped realistic population. For such applications, useful open source software is given in Appendix 4....

  15. Peer-Assisted Learning in Research Methods and Statistics

    ERIC Educational Resources Information Center

    Stone, Anna; Meade, Claire; Watling, Rosamond

    2012-01-01

    Feedback from students on a Level 1 Research Methods and Statistics module, studied as a core part of a BSc Psychology programme, highlighted demand for additional tutorials to help them to understand basic concepts. Students in their final year of study commonly request work experience to enhance their employability. All students on the Level 1…

  16. Adult Basic and Secondary Education Program Statistics. Fiscal Year 1976.

    ERIC Educational Resources Information Center

    Cain, Sylvester H.; Whalen, Barbara A.

    Reports submitted to the National Center for Education Statistics provided data for this compilation and tabulation of data on adult participants in U.S. educational programs in fiscal year 1976. In the summary section introducing the charts, it is noted that adult education programs funded under P.L. 91-230 served over 1.6 million persons--an…

  17. The Education Almanac, 1987-1988. Facts and Figures about Our Nation's System of Education. Third Edition.

    ERIC Educational Resources Information Center

    Goodman, Leroy V., Ed.

    This is the third edition of the Education Almanac, an assemblage of statistics, facts, commentary, and basic background information about the conduct of schools in the United States. Features of this variegated volume include an introductory section on "Education's Newsiest Developments," followed by some vital educational statistics, a set of…

  18. Study nonlinear dynamics of stratospheric ozone concentration at Pakistan Terrestrial region

    NASA Astrophysics Data System (ADS)

    Jan, Bulbul; Zai, Muhammad Ayub Khan Yousuf; Afradi, Faisal Khan; Aziz, Zohaib

    2018-03-01

    This study investigates the nonlinear dynamics of the stratospheric ozone layer over the Pakistan atmospheric region. Ozone is now considered one of the most important issues in the world because of its diverse effects on the earth's biosphere, including human health, ecosystems, marine life, agricultural yield, and climate change. This paper therefore deals with monthly total-ozone time series data over the Pakistan atmospheric region from 1970 to 2013. Two approaches, basic statistical analysis and fractal dimension (D), were adopted to study the nature of the nonlinear dynamics of the stratospheric ozone level. The results show that the Hurst exponent values from both fractal-dimension methods reveal anti-persistent behavior (negative correlation), i.e., a decreasing trend for all lags, and that rescaled range analysis is more appropriate than detrended fluctuation analysis. For the seasonal time series, all months follow anti-persistent behavior except November, which shows persistent behavior, i.e., an independent series with an increasing trend. The normality test statistics also confirm the nonlinear behavior of ozone, and rejection of the hypothesis provides strong evidence of the complexity of the data. This study will be useful to researchers working in the same field who wish to verify the complex nature of stratospheric ozone.
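
The rescaled range (R/S) statistic behind the Hurst exponent is straightforward to compute for one segment. A minimal sketch (estimating H itself would repeat this over many segment lengths and fit a log-log slope, which is omitted here):

```python
import math

def rescaled_range(x):
    """R/S statistic of a series segment; the Hurst exponent is the slope of
    log(R/S) against log(n) over a range of segment lengths n."""
    n = len(x)
    mean = sum(x) / n
    devs = [xi - mean for xi in x]
    cum, z = 0.0, []
    for d in devs:          # cumulative departure from the mean
        cum += d
        z.append(cum)
    r = max(z) - min(z)     # range of the cumulative departures
    s = math.sqrt(sum(d * d for d in devs) / n)  # standard deviation
    return r / s if s > 0 else 0.0

print(rescaled_range([1.0, 2.0, 3.0, 4.0]))
```

H < 0.5 indicates the anti-persistent behavior the study reports; H > 0.5 indicates persistence, as found for November.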

  19. Variational Bayesian Parameter Estimation Techniques for the General Linear Model

    PubMed Central

    Starke, Ludger; Ostwald, Dirk

    2017-01-01

    Variational Bayes (VB), variational maximum likelihood (VML), restricted maximum likelihood (ReML), and maximum likelihood (ML) are cornerstone parametric statistical estimation techniques in the analysis of functional neuroimaging data. However, the theoretical underpinnings of these model parameter estimation techniques are rarely covered in introductory statistical texts. Because of the widespread practical use of VB, VML, ReML, and ML in the neuroimaging community, we reasoned that a theoretical treatment of their relationships and their application in a basic modeling scenario may be helpful for neuroimaging novices and practitioners alike. In this technical study, we thus revisit the conceptual and formal underpinnings of VB, VML, ReML, and ML and provide a detailed account of their mathematical relationships and implementational details. We further apply VB, VML, ReML, and ML to the general linear model (GLM) with non-spherical error covariance as commonly encountered in the first-level analysis of fMRI data. To this end, we explicitly derive the corresponding free energy objective functions and ensuing iterative algorithms. Finally, in the applied part of our study, we evaluate the parameter and model recovery properties of VB, VML, ReML, and ML, first in an exemplary setting and then in the analysis of experimental fMRI data acquired from a single participant under visual stimulation. PMID:28966572
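
A diagonal (heteroscedastic) error covariance is the simplest non-spherical case, and there ML estimation of the GLM reduces to weighted least squares. A minimal one-regressor sketch (a stand-in illustration, not the paper's VB/VML/ReML machinery):

```python
def wls(x, y, w):
    """Weighted least squares for y = b0 + b1 * x with weights w_i = 1 / var_i.
    With a diagonal (non-spherical) error covariance, ML for the GLM reduces
    to this weighting."""
    sw = sum(w)
    mx = sum(wi * xi for wi, xi in zip(w, x)) / sw
    my = sum(wi * yi for wi, yi in zip(w, y)) / sw
    sxx = sum(wi * (xi - mx) ** 2 for wi, xi in zip(w, x))
    sxy = sum(wi * (xi - mx) * (yi - my) for wi, xi, yi in zip(w, x, y))
    b1 = sxy / sxx
    return my - b1 * mx, b1

print(wls([0.0, 1.0, 2.0], [1.0, 3.0, 5.0], [1.0, 2.0, 3.0]))  # exact fit: (1.0, 2.0)
```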

  20. Analysis of potential errors in real-time streamflow data and methods of data verification by digital computer

    USGS Publications Warehouse

    Lystrom, David J.

    1972-01-01

    Various methods of verifying real-time streamflow data are outlined in part II. Relatively large errors (those greater than 20-30 percent) can be detected readily by use of well-designed verification programs for a digital computer, and smaller errors can be detected only by discharge measurements and field observations. The capability to substitute a simulated discharge value for missing or erroneous data is incorporated in some of the verification routines described. The routines represent concepts ranging from basic statistical comparisons to complex watershed modeling and provide a selection from which real-time data users can choose a suitable level of verification.
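
The substitution capability described can be sketched as a simple tolerance check against a simulated discharge series (the 25% threshold and the values below are illustrative, not taken from the report):

```python
def verify_streamflow(observed, simulated, tol=0.25):
    """Flag discharge readings that deviate from the simulated value by more
    than `tol` (as a fraction) and substitute the simulated value, as the
    verification routines described allow. None marks a missing reading."""
    checked = []
    for obs, sim in zip(observed, simulated):
        if obs is None or abs(obs - sim) > tol * sim:
            checked.append((sim, "substituted"))
        else:
            checked.append((obs, "accepted"))
    return checked

print(verify_streamflow([102.0, None, 210.0], [100.0, 150.0, 100.0]))
```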

  1. Nitroxide stable radicals interacting as Lewis bases in hydrogen bonds: A search in the Cambridge structural data base for intermolecular contacts

    NASA Astrophysics Data System (ADS)

    Alkorta, Ibon; Elguero, José; Elguero, Eric

    2017-11-01

    A total of 1125 X-ray structures of nitroxide free radicals presenting intermolecular hydrogen bonds have been reported in the Cambridge Structural Database. We report in this paper a qualitative and quantitative analysis of these bonds. The observation in some plots of an excluded region was statistically analyzed using convex-hull and kernel-smoothing methodologies. A theoretical study at the MP2 level with different basis sets has been carried out, indicating that the nitronyl nitroxide radicals (five electrons) lie just in between nitroso compounds (four electrons) and amine N-oxides (six electrons) as far as hydrogen-bond basicity is concerned.

  2. Assessment of cognitive safety in clinical drug development

    PubMed Central

    Roiser, Jonathan P.; Nathan, Pradeep J.; Mander, Adrian P.; Adusei, Gabriel; Zavitz, Kenton H.; Blackwell, Andrew D.

    2016-01-01

    Cognitive impairment is increasingly recognised as an important potential adverse effect of medication. However, many drug development programmes do not incorporate sensitive cognitive measurements. Here, we review the rationale for cognitive safety assessment, and explain several basic methodological principles for measuring cognition during clinical drug development, including study design and statistical analysis, from Phase I through to postmarketing. The crucial issue of how cognition should be assessed is emphasized, especially the sensitivity of measurement. We also consider how best to interpret the magnitude of any identified effects, including comparison with benchmarks. We conclude by discussing strategies for the effective communication of cognitive risks. PMID:26610416

  3. Biclustering of gene expression data using reactive greedy randomized adaptive search procedure.

    PubMed

    Dharan, Smitha; Nair, Achuthsankar S

    2009-01-30

    Biclustering algorithms belong to a distinct class of clustering algorithms that perform simultaneous clustering of both rows and columns of the gene expression matrix, and can be a very useful analysis tool when some genes have multiple functions and experimental conditions are diverse. Cheng and Church introduced a measure called the mean squared residue score to evaluate the quality of a bicluster, which has become one of the most popular measures used to search for biclusters. In this paper, we review basic concepts of the metaheuristic Greedy Randomized Adaptive Search Procedure (GRASP), its construction and local-search phases, and propose a new method, a variant of GRASP called the Reactive Greedy Randomized Adaptive Search Procedure (Reactive GRASP), to detect significant biclusters from large microarray datasets. The method has two major steps. First, high-quality bicluster seeds are generated by means of k-means clustering. In the second step, these seeds are grown using Reactive GRASP, in which the basic parameter that defines the restrictiveness of the candidate list is self-adjusted, depending on the quality of the solutions found previously. We performed statistical and biological validations of the biclusters obtained and evaluated the method against basic GRASP as well as the classic work of Cheng and Church. The experimental results indicate that the Reactive GRASP approach outperforms both the basic GRASP algorithm and the Cheng and Church approach. The Reactive GRASP approach for the detection of significant biclusters is robust and does not require calibration efforts.
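
The Cheng and Church mean squared residue the abstract refers to measures how far a submatrix departs from a perfect additive (row-effect plus column-effect) pattern. A minimal sketch:

```python
def mean_squared_residue(m):
    """Cheng & Church mean squared residue of a bicluster (list of equal-length
    rows): H = mean of (a_ij - rowmean_i - colmean_j + overall_mean)**2.
    H = 0 for a perfect additive pattern; lower is better."""
    rows, cols = len(m), len(m[0])
    rmean = [sum(r) / cols for r in m]
    cmean = [sum(m[i][j] for i in range(rows)) / rows for j in range(cols)]
    total = sum(rmean) / rows
    h = 0.0
    for i in range(rows):
        for j in range(cols):
            h += (m[i][j] - rmean[i] - cmean[j] + total) ** 2
    return h / (rows * cols)

print(mean_squared_residue([[1.0, 2.0], [3.0, 4.0]]))  # 0.0: additive shift pattern
```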

  4. [Tracking study to improve basic academic ability in chemistry for freshmen].

    PubMed

    Sato, Atsuko; Morone, Mieko; Azuma, Yutaka

    2010-08-01

    The aims of this study were to assess the basic academic ability of freshmen with regard to chemistry and implement suitable educational guidance measures. At Tohoku Pharmaceutical University, basic academic ability examinations are conducted in chemistry for freshmen immediately after entrance into the college. From 2003 to 2009, the examination was conducted using the same questions, and the secular changes in the mean percentage of correct response were statistically analyzed. An experience survey was also conducted on 2007 and 2009 freshmen regarding chemical experiments at senior high school. Analysis of the basic academic ability examinations revealed a significant decrease in the mean percentage of correct responses after 2007. With regard to the answers for each question, there was a significant decrease in the percentage of correct answers for approximately 80% of questions. In particular, a marked decrease was observed for calculation questions involving percentages. A significant decrease was also observed in the number of students who had experiences with chemical experiments in high school. However, notable results have been achieved through the implementation of practice incorporating calculation problems in order to improve calculation ability. Learning of chemistry and a lack of experimental experience in high school may be contributory factors in the decrease in chemistry academic ability. In consideration of the professional ability demanded of pharmacists, the decrease in calculation ability should be regarded as a serious issue and suitable measures for improving calculation ability are urgently required.

  5. Application of statistical mechanical methods to the modeling of social networks

    NASA Astrophysics Data System (ADS)

    Strathman, Anthony Robert

    With the recent availability of large-scale social data sets, social networks have become open to quantitative analysis via the methods of statistical physics. We examine the statistical properties of a real large-scale social network, generated from cellular phone call-trace logs. We find this network, like many other social networks, to be assortative (r = 0.31) and clustered (i.e., strongly transitive, C = 0.21). We measure fluctuation scaling to identify the presence of internal structure in the network and find that structural inhomogeneity effectively disappears at the scale of a few hundred nodes, though there is no sharp cutoff. We introduce an agent-based model of social behavior, designed to model the formation and dissolution of social ties. The model is a modified Metropolis algorithm containing agents operating under the basic sociological constraints of reciprocity, communication need and transitivity. The model introduces the concept of a social temperature. We go on to show that this simple model reproduces the global statistical network features (incl. assortativity, connected fraction, mean degree, clustering, and mean shortest path length) of the real network data and undergoes two phase transitions, one being from a "gas" to a "liquid" state and the second from a liquid to a glassy state as a function of this social temperature.
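
Degree assortativity, one of the reported features (r = 0.31), is the Pearson correlation of the degrees at the two ends of each edge. A minimal sketch on a toy graph:

```python
def degree_assortativity(edges):
    """Degree assortativity r: Pearson correlation of the degrees at the two
    ends of each edge, counting every edge in both orientations."""
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    xs, ys = [], []
    for u, v in edges:
        xs += [deg[u], deg[v]]
        ys += [deg[v], deg[u]]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

# a star graph is maximally disassortative: the hub links only to leaves
print(degree_assortativity([(0, 1), (0, 2), (0, 3)]))  # -1.0
```

A positive r, as in the call-trace network, means well-connected people tend to link to other well-connected people.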

  6. Epidemiological Characteristics and Space-Time Analysis of the 2015 Dengue Outbreak in the Metropolitan Region of Tainan City, Taiwan

    PubMed Central

    Ng, Ka-Chon; Nguyen, Thi Luong

    2018-01-01

    The metropolitan region of Tainan City in southern Taiwan experienced a dengue outbreak in 2015. This manuscript describes basic epidemiological features of this outbreak and uses spatial and temporal analysis tools to understand the spread of dengue during the outbreak. The analysis found that, independently of gender, dengue incidence rate increased with age, and proportionally affected more males below the age of 40 years but females above the age of 40 years. A spatial scan statistic was applied to detect clusters of disease transmission. The scan statistic found that dengue spread in a north-south diffusion direction, which is across the North, West-Central and South districts of Tainan City. Spatial regression models were used to quantify factors associated with transmission. This analysis indicated that neighborhoods with high proportions of residential area (or low wetland cover) were associated with dengue transmission. However, these association patterns were non-linear. The findings presented here can help Taiwanese public health agencies to understand the fundamental epidemiological characteristics and diffusion patterns of the 2015 dengue outbreak in Tainan City. This type of information is fundamental for policy making to prevent future uncontrolled dengue outbreaks, given that results from this study suggest that control interventions should be emphasized in the North and West-Central districts of Tainan city, in areas with a moderate percentage of residential land cover. PMID:29495351

  7. Epidemiological Characteristics and Space-Time Analysis of the 2015 Dengue Outbreak in the Metropolitan Region of Tainan City, Taiwan.

    PubMed

    Chuang, Ting-Wu; Ng, Ka-Chon; Nguyen, Thi Luong; Chaves, Luis Fernando

    2018-02-26

    The metropolitan region of Tainan City in southern Taiwan experienced a dengue outbreak in 2015. This manuscript describes basic epidemiological features of this outbreak and uses spatial and temporal analysis tools to understand the spread of dengue during the outbreak. The analysis found that, independently of gender, dengue incidence rate increased with age, and proportionally affected more males below the age of 40 years but females above the age of 40 years. A spatial scan statistic was applied to detect clusters of disease transmission. The scan statistic found that dengue spread in a north-south diffusion direction, which is across the North, West-Central and South districts of Tainan City. Spatial regression models were used to quantify factors associated with transmission. This analysis indicated that neighborhoods with high proportions of residential area (or low wetland cover) were associated with dengue transmission. However, these association patterns were non-linear. The findings presented here can help Taiwanese public health agencies to understand the fundamental epidemiological characteristics and diffusion patterns of the 2015 dengue outbreak in Tainan City. This type of information is fundamental for policy making to prevent future uncontrolled dengue outbreaks, given that results from this study suggest that control interventions should be emphasized in the North and West-Central districts of Tainan city, in areas with a moderate percentage of residential land cover.

  8. Health inequalities among rural and urban population of Eastern Poland in the context of sustainable development.

    PubMed

    Pantyley, Viktoriya

    2017-09-21

    The primary goals of the study were a critical analysis of the concepts associated with health from the perspective of sustainable development, and an empirical analysis of health and health-related issues among the rural and urban residents of Eastern Poland in the context of the sustainable development of the region. The study was based on the following research methods: a systemic approach, selection and analysis of the literature and statistical data, development of a special questionnaire concerning socio-economic and health inequalities among the population in the studied area, and field research with an interview questionnaire conducted on randomly selected respondents (N=1,103) in randomly selected areas of the Lubelskie, Podkarpackie, Podlaskie and eastern Mazowieckie Provinces (divided into provincial capital cities - county capital cities - other cities - rural areas). Statistical analysis of the survey results using the chi-square test and contingency coefficients indicated a correlation between the state of health and the following independent variables: age, quality of life, social position and financial situation (Pearson's C coefficient over 0.300); a statistically significant yet weak correlation was recorded for gender, household size, place of residence and amount of free time. The analysis demonstrated a large gap between the state of health of the populations in urban and rural areas. In order to eliminate unfavourable differences in the state of health among the residents of Eastern Poland, and to provide equal, sustainable development in the urban and rural areas examined, special preventive programmes aimed at the residents of peripheral, marginalized rural areas should be implemented. In these programmes, attention should be paid to preventive measures, early diagnosis of basic civilization-related and social diseases, and better accessibility of medical services for the residents.

  9. Assessing population exposure for landslide risk analysis using dasymetric cartography

    NASA Astrophysics Data System (ADS)

    Garcia, Ricardo A. C.; Oliveira, Sergio C.; Zezere, Jose L.

    2015-04-01

    Exposed population is a major topic that needs to be taken into account in a full landslide risk analysis. Usually, risk analysis is based on a count of inhabitants or on inhabitant density, applied over statistical or administrative terrain units such as NUTS or parishes. However, this kind of approach may skew the obtained results, underestimating the importance of population, mainly in territorial units with predominantly rural occupation. Furthermore, the landslide susceptibility scores calculated for each terrain unit are frequently more detailed and accurate than the location of the exposed population inside each territorial unit based on census data. These drawbacks are not ideal when landslide risk analysis is performed for urban management and emergency planning. Dasymetric cartography, which uses a parameter or set of parameters to restrict the spatial distribution of a particular phenomenon, is a methodology that may help to enhance the resolution of census data and therefore give a more realistic representation of the population distribution. This work therefore aims to map and compare the population distribution based on a traditional approach (population per administrative terrain unit) and on dasymetric cartography (population per building). The study is developed in the Region North of Lisbon using 2011 population data and follows three main steps: i) landslide susceptibility assessment based on independently validated statistical models; ii) evaluation of the population distribution (absolute and density) for different administrative territorial units (parishes and BGRI, the basic statistical unit in the Portuguese census); and iii) dasymetric population cartography based on building areal weighting. Preliminary results show that in sparsely populated administrative units, population density differs by more than a factor of two depending on whether the traditional approach or dasymetric cartography is applied. This work was supported by the FCT - Portuguese Foundation for Science and Technology.
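
Building areal weighting, the dasymetric step described, redistributes a census unit's population to buildings in proportion to their area. A minimal sketch (the unit population and footprints below are invented):

```python
def dasymetric_population(unit_pop, building_areas):
    """Redistribute a census unit's population to its buildings by areal
    weighting: each building receives a share proportional to its area."""
    total = float(sum(building_areas))
    return [unit_pop * a / total for a in building_areas]

# invented BGRI unit with 120 inhabitants and three building footprints (m^2)
print(dasymetric_population(120, [100.0, 300.0, 200.0]))  # [20.0, 60.0, 40.0]
```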

  10. Facts about Folic Acid

    MedlinePlus


  11. Impact of structural and economic factors on hospitalization costs, inpatient mortality, and treatment type of traumatic hip fractures in Switzerland.

    PubMed

    Mehra, Tarun; Moos, Rudolf M; Seifert, Burkhardt; Bopp, Matthias; Senn, Oliver; Simmen, Hans-Peter; Neuhaus, Valentin; Ciritsis, Bernhard

    2017-12-01

    The assessment of structural and potentially economic factors determining cost, treatment type, and inpatient mortality of traumatic hip fractures is an important health policy issue. We showed that insurance status and treatment in university hospitals were significantly associated with treatment type (i.e., primary hip replacement), cost, and lower inpatient mortality, respectively. The purpose of this study was to determine the influence of the structural level of hospital care and patient insurance type on treatment, hospitalization cost, and inpatient mortality in cases of traumatic hip fracture in Switzerland. The Swiss national medical statistic for 2011-2012 was screened for adults with hip fracture as the primary diagnosis. Gender, age, insurance type, year of discharge, hospital infrastructure level, length of stay, case weight, reason for discharge, and all coded diagnoses and procedures were extracted. Descriptive statistics and multivariate logistic regression, with treatment by primary hip replacement and inpatient mortality as dependent variables, were performed. We obtained 24,678 inpatient case records from the medical statistic. Hospitalization costs were calculated from a second dataset, the Swiss national cost statistic (7528 cases with hip fractures, discharged in 2012). Average inpatient costs per case were highest for discharges from university hospitals (US$21,471, SD US$17,015) and lowest in basic coverage hospitals (US$18,291, SD US$12,635). Controlling for other variables, the higher costs for hip fracture treatment at university hospitals were significant in multivariate regression (p < 0.001). University hospitals had a lower inpatient mortality rate than full and basic care providers (2.8% vs. 4.0% for both), a result confirmed in our multivariate logistic regression analysis (odds ratio (OR) 1.434, 95% confidence interval (CI) 1.127-1.824, and OR 1.459, 95% CI 1.139-1.870, for full and basic coverage hospitals vs. university hospitals, respectively). The proportion of privately insured patients varied between 16.0% in university hospitals and 38.9% in specialized hospitals. Private insurance had an OR of 1.419 (95% CI 1.306-1.542) in predicting treatment of a hip fracture with primary hip replacement. The apparent influence of insurance type on hip fracture treatment and the large inequity in the distribution of privately insured patients between provider types would be worth a closer look by the regulatory authorities. Better outcomes, i.e., lower mortality rates, for hip fracture treatment in hospitals with a higher structural care level advocate centralization of care.
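The odds ratios with 95% confidence intervals reported in the abstract come from logistic regression coefficients. A minimal sketch of that conversion follows; the coefficient and standard error below are hypothetical, chosen only so that the odds ratio lands near the reported 1.419.

```python
import math

def odds_ratio_ci(beta, se, z=1.959964):
    """Odds ratio and 95% CI from a logistic regression coefficient and its SE."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Hypothetical coefficient for "private insurance" predicting primary hip replacement.
or_, lo, hi = odds_ratio_ci(beta=0.35, se=0.042)
print(f"OR {or_:.3f} (95% CI {lo:.3f}-{hi:.3f})")
```

The interval is computed on the log-odds scale and exponentiated, which is why reported CIs around an OR are asymmetric.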

  12. [Comment on] Statistical discrimination

    NASA Astrophysics Data System (ADS)

    Chinn, Douglas

    In the December 8, 1981, issue of Eos, a news item reported the conclusion of a National Research Council study that sexual discrimination against women with Ph.D.'s exists in the field of geophysics. Basically, the item reported that even when allowances are made for motherhood, the percentage of female Ph.D.'s holding high university and corporate positions is significantly lower than the percentage of male Ph.D.'s holding the same types of positions. The sexual discrimination conclusion, based only on these statistics, assumes that there are no basic psychological differences between men and women that might produce different populations in the employment group studied. Therefore, the reasoning goes, after taking into account possible effects of anatomy-related differences, such as women interrupting their careers to bear and raise children, the statistical distributions of positions held by male and female Ph.D.'s ought to be very similar to one another. Any significant differences between the distributions must then be caused primarily by sexual discrimination.

  13. Field-Scale Soil Moisture Observations in Irrigated Agriculture Fields Using the Cosmic-ray Neutron Rover

    NASA Astrophysics Data System (ADS)

    Franz, T. E.; Avery, W. A.; Finkenbiner, C. E.; Wang, T.; Brocca, L.

    2014-12-01

    Approximately 40% of global food production comes from irrigated agriculture. With the increasing demand for food, even greater pressure will be placed on water resources within these systems. In this work we aimed to characterize the spatial and temporal patterns of soil moisture at the field scale (~500 m) using the newly developed cosmic-ray neutron rover near Waco, NE. We mapped soil moisture over 144 quarter-section fields (a mix of maize, soybean, and natural areas) each week during the 2014 growing season (May to September). The 11 × 11 km study domain also contained 3 stationary cosmic-ray neutron probes for independent validation of the rover surveys. Basic statistical analysis of the domain indicated a strong inverted parabolic relationship between the mean and variance of soil moisture. The relationships between the mean and higher-order moments were not as strong. Geostatistical analysis indicated that the range of the soil moisture semi-variogram was significantly shorter during periods of heavy irrigation than during non-irrigated periods. Scaling analysis indicated strong power-law behavior between the variance of soil moisture and averaging area, with the slope of the power-law function depending only minimally on mean soil moisture. Statistical relationships derived from the rover dataset offer a novel set of observations that will be useful for: 1) calibrating and validating land surface models, 2) calibrating and validating crop models, 3) estimating soil moisture covariance for statistical downscaling of remote sensing products such as SMOS and SMAP, and 4) providing center-pivot-scale mean soil moisture data for optimal irrigation timing and volumes.
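The semi-variogram range mentioned above is read off an empirical semivariogram. A minimal sketch for a 1-D transect of evenly spaced soil moisture samples follows; the values are invented, not rover data.

```python
import numpy as np

def empirical_semivariogram(values, lags):
    """Half the mean squared difference between sample pairs separated by each lag."""
    values = np.asarray(values, dtype=float)
    return [0.5 * np.mean((values[h:] - values[:-h]) ** 2) for h in lags]

# Invented volumetric soil moisture readings along a transect.
theta = [0.20, 0.22, 0.25, 0.24, 0.30, 0.33, 0.31, 0.35]
print(empirical_semivariogram(theta, lags=[1, 2, 3]))
```

Fitting a model (e.g., spherical) to these lag estimates yields the range: the separation distance beyond which samples are effectively uncorrelated, which the study found to shorten under heavy irrigation.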

  14. The Variscan calc-alkalic plutonism of western Corsica: mineralogy and major and trace element geochemistry

    NASA Astrophysics Data System (ADS)

    Cocherie, A.; Rossi, Ph.; Le Bel, L.

    1984-10-01

    Petrographic and structural observations on the calc-alkalic plutonism of western Corsica revealed the existence of several successively emplaced units associated with large basic bodies. The present mineralogical and geochemical study deals with the genesis, evolution, and relationships of these different units. Basic plutonism is represented by three genetically linked types of rock: norites and troctolites with cumulate textures, characterized by low REE contents and either no Eu anomaly or a positive Eu anomaly; gabbros with patterns enriched in LREE relative to HREE, probably close to an initial basaltic liquid; and diorites ranging up to charnockites, which represent liquids evolved to varying degrees, mainly by fractional crystallization. Trace element data and studies of the evolution of pyroxene pairs demonstrate the consanguinity of these calc-alkaline basic rocks, which are derived from a high-alumina basaltic melt. The various granitoids (granodiorites, monzogranites and leucocratic monzogranites, i.e., adamellites) have distinct evolution trends, as shown by the composition of their mafic minerals and by trace element distributions. They cannot be considered derivatives of the basic suite, and they cannot be related by a common fractionation sequence. Rather, they represent distinct batches of crustal anatexis. In addition, hybridization with the basic melt is observed in the granodiorites. The particular problem of the leucocratic monzogranites, with low La/Yb and Eu/Eu∗ and high U, Th, and Cs, is discussed in detail. In addition to more conventional trace element diagrams, the simultaneous statistical treatment of all the geochemical data by correspondence factor analysis is shown to be a very useful tool for distinguishing between the different units and for classifying the elements according to their geochemical properties.

  15. A trial-based economic evaluation of 2 nurse-led disease management programs in heart failure.

    PubMed

    Postmus, Douwe; Pari, Anees A Abdul; Jaarsma, Tiny; Luttik, Marie Louise; van Veldhuisen, Dirk J; Hillege, Hans L; Buskens, Erik

    2011-12-01

    Although previously conducted meta-analyses suggest that nurse-led disease management programs in heart failure (HF) can improve patient outcomes, uncertainty regarding the cost-effectiveness of such programs remains. To compare the relative merits of 2 variants of a nurse-led disease management program (basic or intensive support by a nurse specialized in the management of patients with HF) against care as usual (routine follow-up by a cardiologist), a trial-based economic evaluation was conducted alongside the COACH study. In terms of costs per life-year, basic support dominated care as usual, whereas the incremental cost-effectiveness ratio between intensive support and basic support was €532,762 per life-year; in terms of costs per quality-adjusted life-year (QALY), basic support dominated both care as usual and intensive support. An assessment of the uncertainty surrounding these findings showed that, at a threshold of €20,000 per life-year/€20,000 per QALY, basic support had a 69%/62% probability of being optimal, against 17%/30% for care as usual and 14%/8% for intensive support. The results of our subgroup analysis suggest that a stratified approach, offering basic support to patients with mild to moderate HF and intensive support to patients with severe HF, would be optimal if the willingness-to-pay threshold exceeds €45,345 per life-year/€59,289 per QALY. Although the differences in costs and effects among the 3 study groups were not statistically significant, from a decision-making perspective basic support still had a relatively large probability of generating the highest health outcomes at the lowest costs. Our results also substantiated that a stratified approach offering basic support to patients with mild to moderate HF and intensive support to patients with severe HF could further improve health outcomes at slightly higher costs. 
Copyright © 2011 Mosby, Inc. All rights reserved.
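The dominance and incremental cost-effectiveness ratio (ICER) logic used in such trial-based evaluations can be sketched as follows; the costs and effects below are illustrative placeholders, not COACH data.

```python
def compare(cost_a, effect_a, cost_b, effect_b):
    """Compare strategy B against strategy A on cost and effect (e.g., life-years)."""
    d_cost, d_effect = cost_b - cost_a, effect_b - effect_a
    if d_cost <= 0 and d_effect >= 0:
        return "B dominates A"          # cheaper and at least as effective
    if d_cost >= 0 and d_effect <= 0:
        return "A dominates B"
    return f"ICER = {d_cost / d_effect:.0f} per unit of effect"

print(compare(10000, 2.0, 9500, 2.5))  # cheaper and more effective: dominance
print(compare(9500, 2.0, 15000, 2.5))  # a gain, but at a price: report the ICER
```

When neither strategy dominates, the ICER is compared against a willingness-to-pay threshold (€20,000 per QALY in the abstract) to decide which option is optimal.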

  16. Rumination in posttraumatic stress and growth after a natural disaster: a model from northern Chile 2014 earthquakes

    PubMed Central

    Leal-Soto, Francisco; Carmona-Halty, Marcos; Ferrer-Urbina, Rodrigo

    2016-01-01

    Background: Traumatic experiences, such as natural disasters, have multiple and serious impacts on people. Despite the traditional focus on negative consequences, in many cases there are also positive consequences, such as posttraumatic growth. Tedeschi and Calhoun proposed a model of posttraumatic growth that emphasizes the role of rumination after the breakdown of basic beliefs caused by a traumatic experience. Method: A total of 238 volunteers affected by two major earthquakes and tsunami alerts in northern Chile on April 1 and 2, 2014, responded to an online survey measuring subjective severity, basic beliefs change, social sharing of emotion, rumination, posttraumatic stress, and posttraumatic growth. Results: Path analyses reveal that posttraumatic stress develops through a negative change in basic beliefs, intrusive rumination, and deliberate rumination, whereas posttraumatic growth is achieved directly from a positive change in basic beliefs and deliberate rumination. Discussion: The model is consistent with the empirical model obtained for Chilean people affected by the earthquake and tsunami of 27 February 2010, but it differs slightly, in a way that is more consistent with Tedeschi and Calhoun's theoretical model. Both models highlight the role of deliberate rumination in posttraumatic growth, and of the failure to progress from intrusive to deliberate rumination in posttraumatic stress, but the proposed model is more parsimonious and treats subjective severity as an antecedent of basic belief changes. These conclusions must be considered in light of the limitations imposed by the cross-sectional design and the correlational nature of the statistical analysis. 
Highlights of the article: The roles of subjective severity, change of basic beliefs, social sharing of emotion, and rumination in posttraumatic stress and growth were modeled from responses of people affected by the April 1–2, 2014, northern Chilean earthquakes. Posttraumatic stress develops through negative changes in basic beliefs, intrusive rumination, and deliberate rumination. Posttraumatic growth is achieved from positive changes in basic beliefs and deliberate rumination. Deliberate rumination, and the move from intrusive to deliberate rumination, appear as cornerstones of posttraumatic processing. PMID:27900935

  17. Cognitive and attitudinal predictors related to graphing achievement among pre-service elementary teachers

    NASA Astrophysics Data System (ADS)

    Szyjka, Sebastian P.

    The purpose of this study was to determine the extent to which six cognitive and attitudinal variables predicted pre-service elementary teachers' performance on line graphing. Predictors included Illinois teacher education basic skills sub-component scores in reading comprehension and mathematics, logical thinking performance scores, as well as measures of attitudes toward science, mathematics and graphing. This study also determined the strength of the relationship between each prospective predictor variable and the line graphing performance variable, as well as the extent to which measures of attitude towards science, mathematics and graphing mediated relationships between scores on mathematics, reading, logical thinking and line graphing. Ninety-four pre-service elementary education teachers enrolled in two different elementary science methods courses during the spring 2009 semester at Southern Illinois University Carbondale participated in this study. Each subject completed five different instruments designed to assess science, mathematics and graphing attitudes as well as logical thinking and graphing ability. Sixty subjects provided copies of primary basic skills score reports that listed subset scores for both reading comprehension and mathematics. The remaining scores were supplied by a faculty member who had access to a database from which the scores were drawn. Seven subjects, whose scores could not be found, were eliminated from final data analysis. Confirmatory factor analysis (CFA) was conducted in order to establish validity and reliability of the Questionnaire of Attitude Toward Line Graphs in Science (QALGS) instrument. CFA tested the statistical hypothesis that the five main factor structures within the Questionnaire of Attitude Toward Statistical Graphs (QASG) would be maintained in the revised QALGS. Stepwise Regression Analysis with backward elimination was conducted in order to generate a parsimonious and precise predictive model. 
This procedure allowed the researcher to explore the relationships among the affective and cognitive variables that were included in the regression analysis. The results for CFA indicated that the revised QALGS measure was sound in its psychometric properties when tested against the QASG. Reliability statistics indicated that the overall reliability for the 32 items in the QALGS was .90. The learning preferences construct had the lowest reliability (.67), while enjoyment (.89), confidence (.86) and usefulness (.77) constructs had moderate to high reliabilities. The first four measurement models fit the data well as indicated by the appropriate descriptive and statistical indices. However, the fifth measurement model did not fit the data well statistically, and only fit well with two descriptive indices. The results addressing the research question indicated that mathematical and logical thinking ability were significant predictors of line graph performance among the remaining group of variables. These predictors accounted for 41% of the total variability on the line graph performance variable. Partial correlation coefficients indicated that mathematics ability accounted for 20.5% of the variance on the line graphing performance variable when removing the effect of logical thinking. The logical thinking variable accounted for 4.7% of the variance on the line graphing performance variable when removing the effect of mathematics ability.
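The partial correlation coefficients reported above ("removing the effect of" one predictor) can be computed by correlating residuals. A sketch on synthetic data follows; the variables only loosely mimic the study's mathematics, logical thinking, and graphing scores.

```python
import numpy as np

def residuals(target, control):
    """Residuals of an ordinary least-squares fit of target on control."""
    X = np.column_stack([np.ones_like(control), control])
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    return target - X @ beta

def partial_corr(y, x, z):
    """Correlation of y and x after removing the linear effect of z from each."""
    return np.corrcoef(residuals(y, z), residuals(x, z))[0, 1]

rng = np.random.default_rng(0)
z = rng.normal(size=200)              # stand-in for logical thinking scores
x = z + rng.normal(size=200)          # stand-in for mathematics scores
y = 2 * x + z + rng.normal(size=200)  # stand-in for graphing scores
print(round(partial_corr(y, x, z), 3))
```

Squaring this coefficient gives the share of variance in the outcome uniquely attributable to one predictor, which is how the 20.5% and 4.7% figures above are interpreted.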

  18. Columbia/Willamette Skill Builders Consortium. Final Performance Report. Appendix 5B Anodizing Inc. (Aluminum Extrusion Manufacturing). Basic Measurement Math. Instructors' Reports and Sample Curriculum Materials.

    ERIC Educational Resources Information Center

    Taylor, Marjorie; And Others

    Anodizing, Inc., Teamsters Local 162, and Mt. Hood Community College (Oregon) developed a workplace literacy program for workers at Anodizing. These workers did not have the basic skill competencies to benefit from company training efforts in statistical process control and quality assurance and were not able to advance to lead and supervisory…

  19. Opportunities Unlimited: Minnesota Indians Adult Basic Education; Narrative and Statistical Evaluation Third Year 1971-72, with a Review of the First and Second Years.

    ERIC Educational Resources Information Center

    Vizenor, Gerald

    Opportunities Unlimited is a State-wide program to provide adult basic education (ABE) and training for Indians on Minnesota reservations and in Indian communities. An administrative center in Bemidji serves communities on the Red Lake, White Earth, and Leech Lake Reservations, and a Duluth center provides ABE and training for communities on the…

  20. A quantitative comparison of corrective and perfective maintenance

    NASA Technical Reports Server (NTRS)

    Henry, Joel; Cain, James

    1994-01-01

    This paper presents a quantitative comparison of corrective and perfective software maintenance activities. The comparison utilizes basic data collected throughout the maintenance process. The data collected are extensive and allow the impact of both types of maintenance to be quantitatively evaluated and compared. Basic statistical techniques test relationships between and among process and product data. The results show interesting similarities and important differences in both process and product characteristics.
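A minimal example of the kind of basic statistical technique the abstract alludes to for testing relationships between process data: a two-sample Welch t statistic comparing effort on corrective vs. perfective tasks. The data are invented, not from the study.

```python
import math
import statistics

def welch_t(a, b):
    """Two-sample Welch t statistic (unequal variances allowed)."""
    va, vb = statistics.variance(a), statistics.variance(b)
    return (statistics.fmean(a) - statistics.fmean(b)) / math.sqrt(va / len(a) + vb / len(b))

corrective = [4.0, 6.5, 5.2, 7.1, 4.8]   # hours per corrective task (invented)
perfective = [9.3, 8.1, 10.4, 7.9, 9.8]  # hours per perfective task (invented)
print(round(welch_t(corrective, perfective), 2))
```

A large-magnitude statistic (compared against the t distribution with Welch-Satterthwaite degrees of freedom) would indicate a real difference in effort between the two maintenance types.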
