Sample records for the query "present relevant statistical"

  1. Clinical relevance vs. statistical significance: Using neck outcomes in patients with temporomandibular disorders as an example.

    PubMed

    Armijo-Olivo, Susan; Warren, Sharon; Fuentes, Jorge; Magee, David J

    2011-12-01

    Statistical significance has been used extensively to evaluate the results of research studies. Nevertheless, it offers only limited information to clinicians. The assessment of clinical relevance can facilitate the translation of research results into clinical practice. The objective of this study was to explore different methods to evaluate the clinical relevance of results, using as an example a cross-sectional study comparing different neck outcomes between subjects with temporomandibular disorders and healthy controls. Subjects were compared for head and cervical posture, maximal cervical muscle strength, endurance of the cervical flexor and extensor muscles, and electromyographic activity of the cervical flexor muscles during the Craniocervical Flexion Test (CCFT). Clinical relevance was evaluated based on the effect size (ES), minimal important difference (MID), and clinical judgement. The results of this study show that it is possible to have statistical significance without clinical relevance, to have both statistical significance and clinical relevance, to have clinical relevance without statistical significance, or to have neither. The evaluation of clinical relevance in clinical research is crucial to facilitating the transfer of knowledge from research into practice. Clinical researchers should present the clinical relevance of their results. Copyright © 2011 Elsevier Ltd. All rights reserved.
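
    As a sketch of how the two relevance criteria above can be combined, the snippet below computes a pooled-SD Cohen's d and checks the group difference against a minimal important difference. All numbers, and the choice of Cohen's d as the effect size, are illustrative assumptions, not the study's data or exact procedure.

    ```python
    import math

    def cohens_d(mean_a, sd_a, n_a, mean_b, sd_b, n_b):
        """Cohen's d using a pooled standard deviation."""
        pooled_sd = math.sqrt(((n_a - 1) * sd_a**2 + (n_b - 1) * sd_b**2)
                              / (n_a + n_b - 2))
        return (mean_a - mean_b) / pooled_sd

    # Hypothetical cervical flexor endurance scores (seconds); values invented.
    d = cohens_d(mean_a=34.0, sd_a=12.0, n_a=50, mean_b=41.0, sd_b=13.0, n_b=50)
    mid = 8.0                       # assumed minimal important difference
    diff = abs(34.0 - 41.0)
    print(f"Cohen's d = {d:.2f}; difference = {diff:.1f} s vs. MID = {mid} s")
    print("clinically relevant (difference >= MID):", diff >= mid)
    ```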

  2. Passage relevance models for genomics search.

    PubMed

    Urbain, Jay; Frieder, Ophir; Goharian, Nazli

    2009-03-19

    We present a passage relevance model for integrating syntactic and semantic evidence of biomedical concepts and topics using a probabilistic graphical model. Component models of topics, concepts, terms, and documents are represented as potential functions within a Markov Random Field. The probability of a passage being relevant to a biologist's information need is represented as the joint distribution across all potential functions. Relevance model feedback of top-ranked passages is used to improve distributional estimates of query concepts and topics in context, and a dimensional indexing strategy is used for efficient aggregation of concept and term statistics. By integrating multiple sources of evidence, including dependencies between topics, concepts, and terms, we seek to improve genomics literature passage retrieval precision. Using this model, we are able to demonstrate statistically significant improvements in retrieval precision on a large genomics literature corpus.
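
    The MRF formulation above implies a joint distribution that factorizes over potential functions. A generic log-linear form consistent with that description (the paper's specific potentials are not reproduced here) is:

    ```latex
    P(r \mid q, d) \;=\; \frac{1}{Z} \prod_{c \in \mathcal{C}} \psi_c(x_c)
    \;=\; \frac{1}{Z} \exp\!\Big( \sum_{c \in \mathcal{C}} \lambda_c f_c(x_c) \Big)
    ```

    where the \psi_c are potentials over cliques of term, concept, topic, and document nodes, f_c = \log \psi_c, and Z is the partition function that normalizes the joint distribution.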

  3. Technical Note: Higher-order statistical moments and a procedure that detects potentially anomalous years as two alternative methods describing alterations in continuous environmental data

    Treesearch

    I. Arismendi; S. L. Johnson; J. B. Dunham

    2015-01-01

    Statistics of central tendency and dispersion may not capture relevant or desired characteristics of the distribution of continuous phenomena and, thus, they may not adequately describe temporal patterns of change. Here, we present two methodological approaches that can help to identify temporal changes in environmental regimes. First, we use higher-order statistical...
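
    Higher-order moments beyond mean and variance are exactly what standard statistics libraries expose. A minimal sketch on a synthetic right-skewed series (the gamma-distributed data are an invented stand-in for a continuous environmental record):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    x = rng.gamma(shape=4.0, scale=2.5, size=365)   # e.g., daily discharge values

    print("mean     :", np.mean(x))                 # central tendency
    print("variance :", np.var(x, ddof=1))          # dispersion
    print("skewness :", stats.skew(x))              # 3rd standardized moment: asymmetry
    print("kurtosis :", stats.kurtosis(x))          # 4th (excess): tail weight
    ```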

  4. Single-molecule photon emission statistics for systems with explicit time dependence: Generating function approach

    NASA Astrophysics Data System (ADS)

    Peng, Yonggang; Xie, Shijie; Zheng, Yujun; Brown, Frank L. H.

    2009-12-01

    Generating function calculations are extended to allow for laser pulse envelopes of arbitrary shape in numerical applications. We investigate photon emission statistics for two-level and V- and Λ-type three-level systems under time-dependent excitation. Applications relevant to electromagnetically induced transparency and photon emission from single quantum dots are presented.
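
    For orientation, the generating-function approach encodes the photon-counting distribution P_n(t) so that its moments follow by differentiation; a standard form (not this paper's specific derivation for pulsed excitation) is:

    ```latex
    G(s,t) = \sum_{n=0}^{\infty} P_n(t)\, s^{n}, \qquad
    \langle n \rangle = \left. \frac{\partial G}{\partial s} \right|_{s=1}, \qquad
    \langle n(n-1) \rangle = \left. \frac{\partial^2 G}{\partial s^2} \right|_{s=1}
    ```

    from which, for example, Mandel's Q parameter, Q = (\langle n^2 \rangle - \langle n \rangle^2 - \langle n \rangle) / \langle n \rangle, distinguishes sub-Poissonian from super-Poissonian emission.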

  5. Assessment of statistical significance and clinical relevance.

    PubMed

    Kieser, Meinhard; Friede, Tim; Gondan, Matthias

    2013-05-10

    In drug development, it is well accepted that a successful study will demonstrate not only a statistically significant result but also a clinically relevant effect size. Whereas standard hypothesis tests are used to demonstrate the former, it is less clear how the latter should be established. In the first part of this paper, we consider the responder analysis approach and study the performance of locally optimal rank tests when the outcome distribution is a mixture of responder and non-responder distributions. We find that these tests are quite sensitive to their planning assumptions and therefore have no real advantage over standard tests such as the t-test and the Wilcoxon-Mann-Whitney test, which perform well overall and can be recommended for applications. In the second part, we present a new approach to the assessment of clinical relevance based on the so-called relative effect (or probabilistic index) and derive appropriate sample size formulae for the design of studies aiming to demonstrate both a statistically significant and clinically relevant effect. Referring to recent studies in multiple sclerosis, we discuss potential issues in the application of this approach. Copyright © 2012 John Wiley & Sons, Ltd.
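
    The relative effect (probabilistic index) mentioned above is the probability that a randomly chosen treated subject scores higher than a randomly chosen control; it can be estimated directly from the Mann-Whitney U statistic. A sketch on synthetic data (the data are invented for illustration):

    ```python
    import numpy as np
    from scipy.stats import mannwhitneyu

    rng = np.random.default_rng(1)
    treatment = rng.normal(1.0, 1.0, size=60)   # hypothetical outcome scores
    control = rng.normal(0.5, 1.0, size=60)

    u, p = mannwhitneyu(treatment, control, alternative="two-sided")
    # U/(n*m) estimates P(X > Y) + 0.5 * P(X = Y); 0.5 means no effect.
    relative_effect = u / (len(treatment) * len(control))
    print(f"relative effect = {relative_effect:.3f}, p = {p:.4f}")
    ```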

  6. Organic Laboratory Experiments: Micro vs. Conventional.

    ERIC Educational Resources Information Center

    Chloupek-McGough, Marge

    1989-01-01

    Presents relevant statistics accumulated in a fall organic laboratory course. Discusses laboratory equipment setup to lower the amount of waste. Notes decreased solid wastes were produced compared to the previous semester. (MVL)

  7. Quantifying uncertainty in climate change science through empirical information theory.

    PubMed

    Majda, Andrew J; Gershgorin, Boris

    2010-08-24

    Quantifying the uncertainty for the present climate and the predictions of climate change in the suite of imperfect Atmosphere Ocean Science (AOS) computer models is a central issue in climate change science. Here, a systematic approach to these issues with firm mathematical underpinning is developed through empirical information theory. An information metric to quantify AOS model errors in the climate is proposed here which incorporates both coarse-grained mean model errors as well as covariance ratios in a transformation invariant fashion. The subtle behavior of model errors with this information metric is quantified in an instructive statistically exactly solvable test model with direct relevance to climate change science, including the prototype behavior of tracer gases such as CO2. Formulas for identifying the most sensitive climate change directions using statistics of the present climate or an AOS model approximation are developed here; these formulas just involve finding the eigenvector associated with the largest eigenvalue of a quadratic form computed through suitable unperturbed climate statistics. These climate change concepts are illustrated on a statistically exactly solvable one-dimensional stochastic model with relevance for low frequency variability of the atmosphere. Viable algorithms for implementation of these concepts are discussed throughout the paper.
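
    For Gaussian statistics, an information metric combining mean errors and covariance ratios, in the spirit described above, is the relative entropy written in its "signal plus dispersion" form (a standard Gaussian identity; the notation here is assumed for illustration):

    ```latex
    \mathcal{P}(\pi, \pi^{M}) =
    \underbrace{\tfrac{1}{2} (\bar{u} - \bar{u}_{M})^{\mathsf{T}} R_{M}^{-1}
                (\bar{u} - \bar{u}_{M})}_{\text{signal (mean errors)}}
    + \underbrace{\tfrac{1}{2} \big[ \operatorname{tr}(R R_{M}^{-1}) - N
                - \ln \det(R R_{M}^{-1}) \big]}_{\text{dispersion (covariance ratios)}}
    ```

    where \bar{u}, R are the mean and covariance of the true climate, \bar{u}_M, R_M those of the model, and N the dimension. The most sensitive climate change direction is then the leading eigenvector of the associated quadratic form.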

  8. Statistical downscaling of GCM simulations to streamflow using relevance vector machine

    NASA Astrophysics Data System (ADS)

    Ghosh, Subimal; Mujumdar, P. P.

    2008-01-01

    General circulation models (GCMs), the climate models often used in assessing the impact of climate change, operate on a coarse scale, and thus the simulation results obtained from GCMs are not particularly useful for hydrology at the comparatively smaller scale of a river basin. The article presents a methodology of statistical downscaling based on sparse Bayesian learning and the Relevance Vector Machine (RVM) to model streamflow at the river basin scale for the monsoon period (June, July, August, September) using GCM-simulated climatic variables. NCEP/NCAR reanalysis data have been used for training the model to establish a statistical relationship between streamflow and climatic variables. The relationship thus obtained is used to project future streamflow from GCM simulations. The statistical methodology involves principal component analysis, fuzzy clustering, and RVM. Different kernel functions are used for comparison purposes. The model is applied to the Mahanadi river basin in India. The results obtained using RVM are compared with those of the state-of-the-art Support Vector Machine (SVM) to present the advantages of RVMs over SVMs. A decreasing trend is observed for the monsoon streamflow of the Mahanadi, due to high surface warming in the future, under the CCSR/NIES GCM and B2 scenario.
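
    RVM is sparse Bayesian learning applied to kernel basis functions. scikit-learn ships no RVM, so the sketch below uses ARDRegression on an RBF kernel expansion as a close stand-in; the data, kernel width, and pruning threshold are all invented for illustration:

    ```python
    import numpy as np
    from sklearn.linear_model import ARDRegression
    from sklearn.metrics.pairwise import rbf_kernel

    rng = np.random.default_rng(2)
    X = rng.normal(size=(200, 5))        # stand-in for GCM-derived predictors
    y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + 0.1 * rng.normal(size=200)  # "streamflow"

    Phi = rbf_kernel(X, X, gamma=0.5)    # one kernel basis per training point
    model = ARDRegression().fit(Phi, y)  # sparse Bayesian learning prunes weights

    kept = int(np.sum(np.abs(model.coef_) > 1e-3))
    print(f"retained basis functions: {kept} of {Phi.shape[1]}")
    ```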

  9. A bootstrap based Neyman-Pearson test for identifying variable importance.

    PubMed

    Ditzler, Gregory; Polikar, Robi; Rosen, Gail

    2015-04-01

    The selection of the most informative features, those that lead to a small loss on future data, is arguably one of the most important steps in classification, data analysis, and model selection. Several feature selection (FS) algorithms are available; however, due to noise present in any data set, FS algorithms are typically accompanied by an appropriate cross-validation scheme. In this brief, we propose a statistical hypothesis test derived from the Neyman-Pearson lemma for determining if a feature is statistically relevant. The proposed approach can be applied as a wrapper to any FS algorithm, regardless of the FS criteria used by that algorithm, to determine whether a feature belongs in the relevant set. Perhaps more importantly, this procedure efficiently determines the number of relevant features given an initial starting point. We provide freely available software implementations of the proposed methodology.
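
    A minimal sketch of the wrapper idea (bootstrap how often a feature is selected and compare against a chance baseline); the counting rule below is a simplification for illustration, not the paper's exact Neyman-Pearson construction:

    ```python
    import numpy as np
    from sklearn.feature_selection import SelectKBest, f_classif

    def selection_frequency(X, y, feature, k=5, n_boot=200, seed=0):
        """Fraction of bootstrap replicates in which `feature` ranks in the top k."""
        rng = np.random.default_rng(seed)
        hits = 0
        for _ in range(n_boot):
            idx = rng.integers(0, len(y), size=len(y))   # resample with replacement
            sel = SelectKBest(f_classif, k=k).fit(X[idx], y[idx])
            hits += feature in np.argsort(sel.scores_)[-k:]
        return hits / n_boot

    # Illustrative data: only the first 3 of 20 features carry signal.
    rng = np.random.default_rng(3)
    X = rng.normal(size=(300, 20))
    y = (X[:, :3].sum(axis=1) + rng.normal(size=300) > 0).astype(int)
    freq = selection_frequency(X, y, feature=0)
    print(f"feature 0 selected in {freq:.0%} of replicates (chance = k/20 = 25%)")
    ```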

  10. Probabilistic reasoning under time pressure: an assessment in Italian, Spanish and English psychology undergraduates

    NASA Astrophysics Data System (ADS)

    Agus, M.; Hitchcott, P. K.; Penna, M. P.; Peró-Cebollero, M.; Guàrdia-Olmos, J.

    2016-11-01

    Many studies have investigated the features of probabilistic reasoning developed in relation to different formats of problem presentation, showing that it is affected by various individual and contextual factors. Incomplete understanding of the identity and role of these factors may explain the inconsistent evidence concerning the effect of problem presentation format. Thus, superior performance has sometimes been observed for graphically, rather than verbally, presented problems. The present study was undertaken to address this issue. Psychology undergraduates without any statistical expertise (N = 173 in Italy; N = 118 in Spain; N = 55 in England) were administered statistical problems in two formats (verbal-numerical and graphical-pictorial) under a condition of time pressure. Students also completed additional measures indexing several potentially relevant individual dimensions (statistical ability, statistical anxiety, attitudes towards statistics and confidence). Interestingly, a facilitatory effect of graphical presentation was observed in the Italian and Spanish samples but not in the English one. Significantly, the individual dimensions predicting statistical performance also differed between the samples, highlighting a different role of confidence. Hence, these findings confirm previous observations concerning problem presentation format while simultaneously highlighting the importance of individual dimensions.

  11. Geomatic Methods for the Analysis of Data in the Earth Sciences: Lecture Notes in Earth Sciences, Vol. 95

    NASA Astrophysics Data System (ADS)

    Pavlis, Nikolaos K.

    Geomatics is a trendy term that has been used in recent years to describe academic departments that teach and research theories, methods, algorithms, and practices used in processing and analyzing data related to the Earth and other planets. Naming trends aside, geomatics could be considered as the mathematical and statistical “toolbox” that allows Earth scientists to extract information about physically relevant parameters from the available data and accompany such information with some measure of its reliability. This book is an attempt to present the mathematical-statistical methods used in data analysis within various disciplines—geodesy, geophysics, photogrammetry and remote sensing—from a unifying perspective that inverse problem formalism permits. At the same time, it allows us to stress the relevance of statistical methods in achieving an optimal solution.

  12. A guide to statistical analysis in microbial ecology: a community-focused, living review of multivariate data analyses.

    PubMed

    Buttigieg, Pier Luigi; Ramette, Alban

    2014-12-01

    The application of multivariate statistical analyses has become a consistent feature in microbial ecology. However, many microbial ecologists are still in the process of developing a deep understanding of these methods and appreciating their limitations. As a consequence, staying abreast of progress and debate in this arena poses an additional challenge to many microbial ecologists. To address these issues, we present the GUide to STatistical Analysis in Microbial Ecology (GUSTA ME): a dynamic, web-based resource providing accessible descriptions of numerous multivariate techniques relevant to microbial ecologists. A combination of interactive elements allows users to discover and navigate between methods relevant to their needs and examine how they have been used by others in the field. We have designed GUSTA ME to become a community-led and -curated service, which we hope will provide a common reference and forum to discuss and disseminate analytical techniques relevant to the microbial ecology community. © 2014 The Authors. FEMS Microbiology Ecology published by John Wiley & Sons Ltd on behalf of Federation of European Microbiological Societies.

  13. Second language experience facilitates statistical learning of novel linguistic materials

    PubMed Central

    Potter, Christine E.; Wang, Tianlin; Saffran, Jenny R.

    2016-01-01

    Recent research has begun to explore individual differences in statistical learning, and how those differences may be related to other cognitive abilities, particularly their effects on language learning. In the present research, we explored a different type of relationship between language learning and statistical learning: the possibility that learning a new language may also influence statistical learning by changing the regularities to which learners are sensitive. We tested two groups of participants, Mandarin Learners and Naïve Controls, at two time points, six months apart. At each time point, participants performed two different statistical learning tasks: an artificial tonal language statistical learning task and a visual statistical learning task. Only the Mandarin-learning group showed significant improvement on the linguistic task, while both groups improved equally on the visual task. These results support the view that there are multiple influences on statistical learning. Domain-relevant experiences may affect the regularities that learners can discover when presented with novel stimuli. PMID:27988939

  14. 3 CFR - Enhanced Collection of Relevant Data and Statistics Relating to Women

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    Title 3 (The President), Presidential Documents: Memorandum of March 4, 2011, Enhanced Collection of Relevant Data and Statistics Relating to Women. Memorandum for the Heads of Executive Departments and Agencies. I am proud to work...

  15. A demonstration of the application of the new paradigm for the evaluation of forensic evidence under conditions reflecting those of a real forensic-voice-comparison case.

    PubMed

    Enzinger, Ewald; Morrison, Geoffrey Stewart; Ochoa, Felipe

    2016-01-01

    The new paradigm for the evaluation of the strength of forensic evidence includes: The use of the likelihood-ratio framework. The use of relevant data, quantitative measurements, and statistical models. Empirical testing of validity and reliability under conditions reflecting those of the case under investigation. Transparency as to decisions made and procedures employed. The present paper illustrates the use of the new paradigm to evaluate strength of evidence under conditions reflecting those of a real forensic-voice-comparison case. The offender recording was from a landline telephone system, had background office noise, and was saved in a compressed format. The suspect recording included substantial reverberation and ventilation system noise, and was saved in a different compressed format. The present paper includes descriptions of the selection of the relevant hypotheses, sampling of data from the relevant population, simulation of suspect and offender recording conditions, and acoustic measurement and statistical modelling procedures. The present paper also explores the use of different techniques to compensate for the mismatch in recording conditions. It also examines how system performance would have differed had the suspect recording been of better quality. Copyright © 2015 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.
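
    At the core of the likelihood-ratio framework named above, the strength of the evidence E (here, the acoustic measurements) is the ratio of its probability under the competing same-speaker and different-speaker hypotheses:

    ```latex
    \mathrm{LR} = \frac{p(E \mid H_{\mathrm{ss}})}{p(E \mid H_{\mathrm{ds}})},
    \qquad
    \underbrace{\frac{p(H_{\mathrm{ss}} \mid E)}{p(H_{\mathrm{ds}} \mid E)}}_{\text{posterior odds}}
    = \mathrm{LR} \times
    \underbrace{\frac{p(H_{\mathrm{ss}})}{p(H_{\mathrm{ds}})}}_{\text{prior odds}}
    ```

    so the forensic scientist reports only the LR; combining it with prior odds is left to the trier of fact.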

  16. Teaching statistics to nursing students: an expert panel consensus.

    PubMed

    Hayat, Matthew J; Eckardt, Patricia; Higgins, Melinda; Kim, MyoungJin; Schmiege, Sarah J

    2013-06-01

    Statistics education is a necessary element of nursing education, and its inclusion is recommended in the American Association of Colleges of Nursing guidelines for nurse training at all levels. This article presents a cohesive summary of an expert panel discussion, "Teaching Statistics to Nursing Students," held at the 2012 Joint Statistical Meetings. All panelists were statistics experts, had extensive teaching and consulting experience, and held faculty appointments in a U.S.-based nursing college or school. The panel discussed degree-specific curriculum requirements, course content, how to ensure nursing students understand the relevance of statistics, approaches to integrating statistics consulting knowledge, experience with classroom instruction, use of knowledge from the statistics education research field to make improvements in statistics education for nursing students, and classroom pedagogy and instruction on the use of statistical software. Panelists also discussed the need for evidence to make data-informed decisions about statistics education and training for nurses. Copyright 2013, SLACK Incorporated.

  17. A robust bayesian estimate of the concordance correlation coefficient.

    PubMed

    Feng, Dai; Baumgartner, Richard; Svetnik, Vladimir

    2015-01-01

    A need for assessment of agreement arises in many situations including statistical biomarker qualification or assay or method validation. Concordance correlation coefficient (CCC) is one of the most popular scaled indices reported in evaluation of agreement. Robust methods for CCC estimation currently present an important statistical challenge. Here, we propose a novel Bayesian method of robust estimation of CCC based on multivariate Student's t-distribution and compare it with its alternatives. Furthermore, we extend the method to practically relevant settings, enabling incorporation of confounding covariates and replications. The superiority of the new approach is demonstrated using simulation as well as real datasets from biomarker application in electroencephalography (EEG). This biomarker is relevant in neuroscience for development of treatments for insomnia.
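
    For reference, Lin's concordance correlation coefficient between two measurement methods X and Y, the quantity whose robust Bayesian estimation this paper develops, is:

    ```latex
    \rho_c = \frac{2 \sigma_{XY}}{\sigma_X^{2} + \sigma_Y^{2} + (\mu_X - \mu_Y)^{2}}
    ```

    which penalizes the Pearson correlation for shifts in location and scale, so that \rho_c = 1 only under perfect agreement.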

  18. Exchanging the liquidity hypothesis: Delay discounting of money and self-relevant non-money rewards

    PubMed Central

    Stuppy-Sullivan, Allison M.; Tormohlen, Kayla N.; Yi, Richard

    2015-01-01

    Evidence that primary rewards (e.g., food and drugs of abuse) are discounted more than money is frequently attributed to money's high degree of liquidity, or exchangeability for many commodities. The present study provides some evidence against this liquidity hypothesis by contrasting delay discounting of monetary rewards (liquid) and non-monetary commodities (non-liquid) that are self-relevant and utility-matched. Ninety-seven (97) undergraduate students initially completed a conventional binary-choice delay discounting of money task. Participants returned one week later and completed a self-relevant commodity delay discounting task. Both conventional hypothesis testing and more-conservative tests of statistical equivalence revealed correspondence in rate of delay discounting of money and self-relevant commodities, and in one magnitude condition, less discounting for the latter. The present results indicate that liquidity of money cannot fully account for the lower rate of delay discounting compared to non-money rewards. PMID:26556504
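
    Rates of delay discounting such as those compared above are conventionally summarized with Mazur's hyperbolic model (a common convention in this literature; the abstract does not state which model was fit here):

    ```latex
    V = \frac{A}{1 + kD}
    ```

    where V is the present subjective value of an amount A delayed by D, and a larger k means steeper discounting; equivalence testing then asks whether the k values for money and for self-relevant commodities differ by less than a preset margin.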

  19. Spacecraft software training needs assessment research, appendices

    NASA Technical Reports Server (NTRS)

    Ratcliff, Shirley; Golas, Katharine

    1990-01-01

    The appendices to the previously reported study are presented: statistical data from task rating worksheets; SSD references; survey forms; fourth generation language, a powerful, long-term solution to maintenance cost; task list; methodology; SwRI's instructional systems development model; relevant research; and references.

  20. Randomized clinical trials in implant therapy: relationships among methodological, statistical, clinical, paratextual features and number of citations.

    PubMed

    Nieri, Michele; Clauser, Carlo; Franceschi, Debora; Pagliaro, Umberto; Saletta, Daniele; Pini-Prato, Giovanpaolo

    2007-08-01

    The aim of the present study was to investigate the relationships among reported methodological, statistical, clinical and paratextual variables of randomized clinical trials (RCTs) in implant therapy, and their influence on subsequent research. The material consisted of the RCTs in implant therapy published through the end of the year 2000. Methodological, statistical, clinical and paratextual features of the articles were assessed and recorded. The perceived clinical relevance was subjectively evaluated by an experienced clinician on anonymous abstracts. The impact on research was measured by the number of citations found in the Science Citation Index. A new statistical technique (Structural learning of Bayesian Networks) was used to assess the relationships among the considered variables. Descriptive statistics revealed that the reported methodology and statistics of RCTs in implant therapy were defective. Follow-up of the studies was generally short. The perceived clinical relevance appeared to be associated with the objectives of the studies and with the number of published images in the original articles. The impact on research was related to the nationality of the involved institutions and to the number of published images. RCTs in implant therapy (until 2000) show important methodological and statistical flaws and may not be appropriate for guiding clinicians in their practice. The methodological and statistical quality of the studies did not appear to affect their impact on practice and research. Bayesian Networks suggest new and unexpected relationships among the methodological, statistical, clinical and paratextual features of RCTs.

  1. Models of dyadic social interaction.

    PubMed Central

    Griffin, Dale; Gonzalez, Richard

    2003-01-01

    We discuss the logic of research designs for dyadic interaction and present statistical models with parameters that are tied to psychologically relevant constructs. Building on Karl Pearson's classic nineteenth-century statistical analysis of within-organism similarity, we describe several approaches to indexing dyadic interdependence and provide graphical methods for visualizing dyadic data. We also describe several statistical and conceptual solutions to the 'levels of analysis' problem in analysing dyadic data. These analytic strategies allow the researcher to examine and measure psychological questions of interdependence and social influence. We provide illustrative data from casually interacting and romantic dyads. PMID:12689382

  2. Divorce and Single-Parent Family Counseling. Searchlight Plus: Relevant Resources in High Interest Areas. No 26+.

    ERIC Educational Resources Information Center

    Hale, Lynelle C.

    This document, based on a computer search of the ERIC database, presents a review of the literature on divorce and single parent families. Statistics from the 1980 census are presented which show that 19.7 percent of children under 18 live with a single parent, who in the overwhelming number of cases is the mother. The document presents data on…

  3. Do physicians understand cancer screening statistics? A national survey of primary care physicians in the United States.

    PubMed

    Wegwarth, Odette; Schwartz, Lisa M; Woloshin, Steven; Gaissmaier, Wolfgang; Gigerenzer, Gerd

    2012-03-06

    Unlike reduced mortality rates, improved survival rates and increased early detection do not prove that cancer screening tests save lives. Nevertheless, these 2 statistics are often used to promote screening. To learn whether primary care physicians understand which statistics provide evidence about whether screening saves lives. Parallel-group, randomized trial (randomization controlled for order effect only), conducted by Internet survey. (ClinicalTrials.gov registration number: NCT00981019) National sample of U.S. primary care physicians from a research panel maintained by Harris Interactive (79% cooperation rate). 297 physicians who practiced both inpatient and outpatient medicine were surveyed in 2010, and 115 physicians who practiced exclusively outpatient medicine were surveyed in 2011. Physicians received scenarios about the effect of 2 hypothetical screening tests: The effect was described as improved 5-year survival and increased early detection in one scenario and as decreased cancer mortality and increased incidence in the other. Physicians' recommendation of screening and perception of its benefit in the scenarios and general knowledge of screening statistics. Primary care physicians were more enthusiastic about the screening test supported by irrelevant evidence (5-year survival increased from 68% to 99%) than about the test supported by relevant evidence (cancer mortality reduced from 2 to 1.6 in 1000 persons). When presented with irrelevant evidence, 69% of physicians recommended the test, compared with 23% when presented with relevant evidence (P < 0.001). When asked general knowledge questions about screening statistics, many physicians did not distinguish between irrelevant and relevant screening evidence; 76% versus 81%, respectively, stated that each of these statistics proves that screening saves lives (P = 0.39). About one half (47%) of the physicians incorrectly said that finding more cases of cancer in screened as opposed to unscreened populations "proves that screening saves lives." Physicians' recommendations for screening were based on hypothetical scenarios, not actual practice. Most primary care physicians mistakenly interpreted improved survival and increased detection with screening as evidence that screening saves lives. Few correctly recognized that only reduced mortality in a randomized trial constitutes evidence of the benefit of screening. Harding Center for Risk Literacy, Max Planck Institute for Human Development.
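
    The asymmetry between the two kinds of evidence is easy to make concrete with the abstract's own numbers; the short computation below is illustrative only:

    ```python
    # Mortality scenario (relevant evidence): deaths per 1000 persons.
    deaths_unscreened = 2.0
    deaths_screened = 1.6
    arr = (deaths_unscreened - deaths_screened) / 1000    # absolute risk reduction
    rrr = (deaths_unscreened - deaths_screened) / deaths_unscreened
    print(f"absolute risk reduction: {arr:.4%} ({arr * 1000:.1f} per 1000)")
    print(f"relative risk reduction: {rrr:.0%}")
    # Survival scenario (irrelevant evidence): a jump from 68% to 99% five-year
    # survival can arise purely from lead-time and overdiagnosis biases, with
    # zero deaths prevented.
    ```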

  4. Statistics for Radiology Research.

    PubMed

    Obuchowski, Nancy A; Subhas, Naveen; Polster, Joshua

    2017-02-01

    Biostatistics is an essential component in most original research studies in imaging. In this article we discuss five key statistical concepts for study design and analyses in modern imaging research: statistical hypothesis testing, particularly focusing on noninferiority studies; imaging outcomes especially when there is no reference standard; dealing with the multiplicity problem without spending all your study power; relevance of confidence intervals in reporting and interpreting study results; and finally tools for assessing quantitative imaging biomarkers. These concepts are presented first as examples of conversations between investigator and biostatistician, and then more detailed discussions of the statistical concepts follow. Three skeletal radiology examples are used to illustrate the concepts.

  5. 76 FR 12823 - Enhanced Collection of Relevant Data and Statistics Relating to Women

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-09

    ... greater understanding of policies and programs. Preparation of this report revealed the vast data... Collection of Relevant Data and Statistics Relating to Women Memorandum for the Heads of Executive... accompanying website collection of relevant data, will assist Government officials in crafting policies in...

  6. A Bayesian Approach to Interactive Retrieval

    ERIC Educational Resources Information Center

    Tague, Jean M.

    1973-01-01

    A probabilistic model for interactive retrieval is presented. Bayesian statistical decision theory principles (the use of prior and sample information about the relationship of document descriptions to query relevance, and the maximization of the expected value of a utility function) are applied to the problem of optimally restructuring search strategies in an…
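
    Schematically, the Bayesian decision-theoretic recipe described above combines a posterior probability of relevance with a utility function; a generic form (the notation is assumed, since the abstract is truncated) is:

    ```latex
    P(\mathrm{rel} \mid d, q) =
    \frac{P(d, q \mid \mathrm{rel})\, P(\mathrm{rel})}{P(d, q)},
    \qquad
    \text{retrieve } d \iff
    \mathbb{E}[U \mid \text{retrieve}] > \mathbb{E}[U \mid \text{skip}]
    ```

    with both expectations taken over the posterior probability of relevance.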

  7. Automated detection of diagnostically relevant regions in H&E stained digital pathology slides

    NASA Astrophysics Data System (ADS)

    Bahlmann, Claus; Patel, Amar; Johnson, Jeffrey; Ni, Jie; Chekkoury, Andrei; Khurd, Parmeshwar; Kamen, Ali; Grady, Leo; Krupinski, Elizabeth; Graham, Anna; Weinstein, Ronald

    2012-03-01

    We present a computationally efficient method for analyzing H&E stained digital pathology slides with the objective of discriminating diagnostically relevant vs. irrelevant regions. Such technology is useful for several applications: (1) It can speed up computer aided diagnosis (CAD) for histopathology based cancer detection and grading by an order of magnitude through a triage-like preprocessing and pruning. (2) It can improve the response time for an interactive digital pathology workstation (which typically deals with digital pathology slides of several gigabytes), e.g., through controlling adaptive compression or prioritization algorithms. (3) It can support the detection and grading workflow for expert pathologists in a semi-automated diagnosis, thereby increasing throughput and accuracy. At the core of the presented method is the statistical characterization of tissue components that are indicative for the pathologist's decision about malignancy vs. benignity, such as nuclei, tubules, cytoplasm, etc. In order to allow for effective yet computationally efficient processing, we propose visual descriptors that capture the distribution of color intensities observed for nuclei and cytoplasm. Discrimination between statistics of relevant vs. irrelevant regions is learned from annotated data, and inference is performed via linear classification. We validate the proposed method both qualitatively and quantitatively. Experiments show a cross validation error rate of 1.4%. We further show that the proposed method can prune ~90% of the area of pathological slides while maintaining 100% of all relevant information, which allows for a speedup of a factor of 10 for CAD systems.
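
    A minimal sketch of the pipeline described above (color-intensity histograms as visual descriptors, then a linear classifier); the synthetic tiles and every parameter are invented stand-ins for annotated H&E data:

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def color_histogram(tile, bins=8):
        """Concatenated per-channel intensity histograms of an RGB tile (H, W, 3)."""
        feats = [np.histogram(tile[..., c], bins=bins, range=(0, 255),
                              density=True)[0] for c in range(3)]
        return np.concatenate(feats)

    rng = np.random.default_rng(4)
    relevant = [rng.integers(40, 160, size=(64, 64, 3)) for _ in range(50)]     # darker, nuclei-rich
    irrelevant = [rng.integers(150, 255, size=(64, 64, 3)) for _ in range(50)]  # pale background
    X = np.array([color_histogram(t) for t in relevant + irrelevant])
    y = np.array([1] * 50 + [0] * 50)

    clf = LogisticRegression(max_iter=1000).fit(X, y)   # linear classification
    print("training accuracy:", clf.score(X, y))
    ```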

  8. Exchanging the liquidity hypothesis: Delay discounting of money and self-relevant non-money rewards.

    PubMed

    Stuppy-Sullivan, Allison M; Tormohlen, Kayla N; Yi, Richard

    2016-01-01

    Evidence that primary rewards (e.g., food and drugs of abuse) are discounted more than money is frequently attributed to money's high degree of liquidity, or exchangeability for many commodities. The present study provides some evidence against this liquidity hypothesis by contrasting delay discounting of monetary rewards (liquid) and non-monetary commodities (non-liquid) that are self-relevant and utility-matched. Ninety-seven (97) undergraduate students initially completed a conventional binary-choice delay discounting of money task. Participants returned one week later and completed a self-relevant commodity delay discounting task. Both conventional hypothesis testing and more-conservative tests of statistical equivalence revealed correspondence in rate of delay discounting of money and self-relevant commodities, and in one magnitude condition, less discounting for the latter. The present results indicate that liquidity of money cannot fully account for the lower rate of delay discounting compared to non-money rewards. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. Index of A/T's in Science.

    ERIC Educational Resources Information Center

    Bonney, Catherine

    This monograph presents an annotated index of auto-tutorial materials in science education available to middle and secondary schools in the Newark School District. Materials relevant to the study of the biological sciences enable the students to become more familiar with Biology Statistics, Cytology, Marine Field Trips, Use of Microscopes,…

  10. ESEA Title I Migrant. Final Technical Report. Publication 80.40.

    ERIC Educational Resources Information Center

    Austin Independent School District, TX. Office of Research and Evaluation.

    Data from 24 instruments used to evaluate the 1980-81 ESEA Title I Migrant program in the Austin (Texas) Independent School District are presented. A separate section for each instrument includes a description of purpose; procedures and results; and, where appropriate, relevant communications, instructions and statistical data. Summaries describe…

  11. Emotional Relationships between Mothers and Infants: Knowns, Unknowns, and Unknown Unknowns

    PubMed Central

    Bornstein, Marc H.; Suwalsky, Joan T. D.; Breakstone, Dana A.

    2012-01-01

    An overview of the literature pertaining to the construct of emotional availability is presented, illustrated by a sampling of relevant studies. Methodological, statistical, and conceptual problems in the existing corpus of research are discussed, and suggestions for improving future investigations of this important construct are offered. PMID:22292998

  12. Statistical science: a grammar for research.

    PubMed

    Cox, David R

    2017-06-01

    I greatly appreciate the invitation to give this lecture with its century-long history. The title is a warning that the lecture is rather discursive and not highly focused and technical. The theme is simple: statistical thinking provides a unifying set of general ideas and specific methods relevant whenever appreciable natural variation is present. To be most fruitful, these ideas should merge seamlessly with subject-matter considerations. By contrast, there is sometimes a temptation to regard formal statistical analysis as a ritual to be added after the serious work has been done, a ritual to satisfy convention, referees, and regulatory agencies. I want implicitly to refute that idea.

  13. Scientific progress - wireless phones and brain cancer: current state of the science.

    PubMed

    Carlo, G L; Jenrow, R S

    2000-07-11

    The current science is not definitive about health risks from wireless phones; however, the legitimate questions about safety that have arisen from recent studies make claims of absolute safety no longer supportable. The objective of this paper is to outline for primary care providers the results of the most current research on the possible impact of wireless phone use on human health. Presented are study results from the Wireless Technology Research (WTR) program, the 7-year, $27 million effort funded by the wireless industry in the United States that represents the world's most comprehensive research effort addressing this issue to date. Science-based recommendations for consumer interventions and future research are presented. Sources include original studies performed under the WTR program as well as other relevant research from around the world. This article presents a synopsis of the peer-reviewed in vitro and in vivo laboratory research, and the peer-reviewed epidemiology studies supported by the WTR, as well as a summary of other relevant work. Only peer-reviewed scientific studies are presented, primarily WTR-sponsored research. In addition, results of the WTR literature surveillance program, which identified other relevant toxicology and epidemiology studies on an ongoing basis, are presented. These studies are presented in the context of their usefulness in providing intervention recommendations for consumers. Following a qualitative synthesis of specific relevant non-WTR research and a critical assessment of the WTR results, the following represents the current state of scientific understanding relevant to the public health impact of wireless phones: laboratory studies appear to have confirmed that radio frequency radiation from wireless phone antennas is insufficient to cause DNA breakage; however, this same radiation appears to cause genetic damage in human blood as measured through the formation of micronuclei. An increase in the rate of brain cancer mortality among hand-held cellular phone users as compared to car phone users, though not statistically significant, was observed in the WTR cohort study. A statistically significant increase in the risk of neuro-epithelial brain tumors was observed among cellular phone users in another case-control study. As new data emerge, our understanding of this complex problem will improve; however, at present there is a critical need for ongoing and open evaluation of the public health impact of new science, and communication of this science and derivative intervention options to those who are potentially affected.

  14. Broken Ergodicity in Ideal, Homogeneous, Incompressible Turbulence

    NASA Technical Reports Server (NTRS)

    Morin, Lee; Shebalin, John; Fu, Terry; Nguyen, Phu; Shum, Victor

    2010-01-01

    We discuss the statistical mechanics of numerical models of ideal, homogeneous, incompressible turbulence and their relevance for dissipative fluids and magnetofluids. These numerical models are based on Fourier series, and the relevant statistical theory predicts that Fourier coefficients of fluid velocity and magnetic fields (if present) are zero-mean random variables. However, numerical simulations clearly show that certain coefficients have a non-zero mean value that can be very large compared to the associated standard deviation. We explain this phenomenon in terms of 'broken ergodicity', which is defined to occur when dynamical behavior does not match ensemble predictions on very long time-scales. We review the theoretical basis of broken ergodicity, apply it to 2-D and 3-D fluid and magnetohydrodynamic simulations of homogeneous turbulence, and show new results from simulations using GPU (graphical processing unit) computers.

  15. A toolbox for determining subdiffusive mechanisms

    NASA Astrophysics Data System (ADS)

    Meroz, Yasmine; Sokolov, Igor M.

    2015-04-01

    Subdiffusive processes have become a field of great interest in the last decades, due to mounting experimental evidence of subdiffusive behavior in complex systems, and especially in biological systems. Different physical scenarios leading to subdiffusion differ in the details of the dynamics. These differences are what allow the underlying physics to be reconstructed theoretically from the results of observations, and they are the topic of this review. We review the main statistical analyses available today to distinguish between these scenarios, categorizing them according to the relevant characteristics. We collect the available tools and statistical tests, presenting them within a broader perspective. We also consider possible complications such as the subordination of subdiffusive mechanisms. Due to the advances in single-particle tracking experiments in recent years, we focus on the relevant case where the available experimental data are scant, at the level of single trajectories.
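
    One staple of such toolboxes is the time-averaged mean squared displacement (TAMSD) of a single trajectory, whose log-log slope flags subdiffusion when it falls below 1. A minimal sketch on a synthetic trajectory (ordinary Brownian motion, so the exponent should come out near 1):

    ```python
    import numpy as np

    def tamsd(x, lags):
        """Time-averaged MSD of a 1-D trajectory x at the given lags."""
        return np.array([np.mean((x[lag:] - x[:-lag]) ** 2) for lag in lags])

    rng = np.random.default_rng(5)
    x = np.cumsum(rng.normal(size=10_000))     # Brownian trajectory
    lags = np.arange(1, 100)
    msd = tamsd(x, lags)

    alpha = np.polyfit(np.log(lags), np.log(msd), 1)[0]   # anomalous exponent
    print(f"estimated exponent alpha = {alpha:.2f}  (alpha < 1 => subdiffusion)")
    ```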

  16. Statistics in Japanese universities.

    PubMed Central

    Ito, P K

    1979-01-01

    The teaching of statistics in U.S. and Japanese universities is briefly reviewed. It is found that H. Hotelling's articles and subsequent relevant publications on the teaching of statistics have contributed to a considerable extent to the establishment of excellent departments of statistics in U.S. universities and colleges. Today the U.S. may be proud of many well-staffed and well-organized departments of theoretical and applied statistics with excellent undergraduate and graduate programs. By contrast, no Japanese university has an independent department of statistics at present, and the teaching of statistics has been spread among a heterogeneous group of departments of application. This was mainly due to the Japanese government regulation concerning the establishment of a university. However, the regulation has recently been revised so that an independent department of statistics may be started in a Japanese university with undergraduate and graduate programs. It is hoped that discussions will be started among those concerned on the question of the organization of the teaching of statistics in Japanese universities as soon as possible. PMID:396154

  17. Statistical Evaluation of CRM-Simulated Cloud and Precipitation Structures Using Multi- sensor TRMM Measurements and Retrievals

    NASA Astrophysics Data System (ADS)

    Posselt, D.; L'Ecuyer, T.; Matsui, T.

    2009-05-01

    Cloud resolving models are typically used to examine the characteristics of clouds and precipitation and their relationship to radiation and the large-scale circulation. As such, they are not required to reproduce the exact location of each observed convective system, much less each individual cloud. Some of the most relevant information about clouds and precipitation is provided by instruments located on polar-orbiting satellite platforms, but these observations are intermittent "snapshots" in time, making assessment of model performance challenging. In contrast to direct comparison, model results can be evaluated statistically. This avoids the requirement for the model to reproduce the observed systems, while returning valuable information on the performance of the model in a climate-relevant sense. The focus of this talk is a model evaluation study, in which updates to the microphysics scheme used in a three-dimensional version of the Goddard Cumulus Ensemble (GCE) model are evaluated using statistics of observed clouds, precipitation, and radiation. We present the results of multiday (non-equilibrium) simulations of organized deep convection using single- and double-moment versions of the model's cloud microphysical scheme. Statistics of TRMM multi-sensor derived clouds, precipitation, and radiative fluxes are used to evaluate the GCE results, as are simulated TRMM measurements obtained using a sophisticated instrument simulator suite. We present advantages and disadvantages of performing model comparisons in retrieval and measurement space, and conclude by motivating the use of data assimilation techniques for analyzing and improving model parameterizations.

  18. Second Language Experience Facilitates Statistical Learning of Novel Linguistic Materials.

    PubMed

    Potter, Christine E; Wang, Tianlin; Saffran, Jenny R

    2017-04-01

    Recent research has begun to explore individual differences in statistical learning, and how those differences may be related to other cognitive abilities, particularly their effects on language learning. In this research, we explored a different type of relationship between language learning and statistical learning: the possibility that learning a new language may also influence statistical learning by changing the regularities to which learners are sensitive. We tested two groups of participants, Mandarin Learners and Naïve Controls, at two time points, 6 months apart. At each time point, participants performed two different statistical learning tasks: an artificial tonal language statistical learning task and a visual statistical learning task. Only the Mandarin-learning group showed significant improvement on the linguistic task, whereas both groups improved equally on the visual task. These results support the view that there are multiple influences on statistical learning. Domain-relevant experiences may affect the regularities that learners can discover when presented with novel stimuli. Copyright © 2016 Cognitive Science Society, Inc.

  19. Evaluating Video Self-Modeling Treatment Outcomes: Differentiating between Statistically and Clinically Significant Change

    ERIC Educational Resources Information Center

    La Spata, Michelle G.; Carter, Christopher W.; Johnson, Wendi L.; McGill, Ryan J.

    2016-01-01

    The present study examined the utility of video self-modeling (VSM) for reducing externalizing behaviors (e.g., aggression, conduct problems, hyperactivity, and impulsivity) observed within the classroom environment. After identification of relevant target behaviors, VSM interventions were developed for first and second grade students (N = 4),…

  20. Developing Data Graph Comprehension. Third Edition

    ERIC Educational Resources Information Center

    Curcio, Frances

    2010-01-01

    Since the dawn of civilization, pictorial representations and symbols have been used to communicate simple statistics. Efficient and effective, they are still used today in the form of pictures and graphs to record and present data. Who can tie their shoes? How many calories are in your favorite food? Make data and graphs relevant and interesting…

  1. Investigation of Primary Mathematics Student Teachers' Concept Images: Cylinder and Cone

    ERIC Educational Resources Information Center

    Ertekin, Erhan; Yazici, Ersen; Delice, Ali

    2014-01-01

    The aim of the present study is to determine the influence of concept definitions of cylinder and cone on primary mathematics student teachers' construction of relevant concept images. The study had a relational survey design and the participants were 238 primary mathematics student teachers. Statistical analyses implied the following: mathematics…

  2. Clinical relevance of IL-6 gene polymorphism in severely injured patients

    PubMed Central

    Jeremić, Vasilije; Alempijević, Tamara; Mijatović, Srđan; Šijački, Ana; Dragašević, Sanja; Pavlović, Sonja; Miličić, Biljana; Krstić, Slobodan

    2014-01-01

    In polytrauma, injuries that could be surgically treated under regular circumstances become life-threatening due to a systemic inflammatory response. The inflammatory response involves a complex pattern of humoral and cellular responses, and the expression of related factors is thought to be governed by genetic variations. The aim of this paper is to examine the influence of interleukin (IL) 6 single nucleotide polymorphisms (SNPs) -174C/G and -596G/A on the treatment outcome in severely injured patients. Forty-seven severely injured patients were included in this study. Patients were assigned an Injury Severity Score. Blood samples were drawn within 24 h after admission (designated day 1) and on subsequent days (24, 48, and 72 hours and 7 days) of hospitalization. IL-6 levels were determined by ELISA. Polymorphisms were analyzed by Polymerase Chain Reaction-Restriction Fragment Length Polymorphism (PCR-RFLP). Among subjects with different outcomes, no statistically significant difference was found with regard to the IL-6 SNP-174G/C polymorphism. More than half of the subjects who died had the SNP-174G/C polymorphism, while it was represented at a slightly lower frequency in survivors. The incidence of subjects without polymorphism and those with heterozygous and homozygous IL-6 SNP-596G/A polymorphisms did not present statistically significant variations between survivors and those who died. The levels of IL-6 over the observation period did not present any statistically significant difference among subjects without the IL-6 SNP-174 or IL-6 SNP-596 polymorphism and those who had either a heterozygous or a homozygous polymorphism. PMID:24856384
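
    Genotype-by-outcome comparisons like those reported above are commonly made with a contingency-table test; a sketch with invented counts (not the study's data, although the totals are chosen to sum to its 47 patients):

    ```python
    from scipy.stats import chi2_contingency

    # Hypothetical counts: rows = genotype (GG, GC, CC), columns = (survived, died).
    table = [[14, 6],
             [12, 8],
             [4, 3]]
    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
    # p >= 0.05 would match the reported absence of a significant association;
    # with expected cell counts this small, Fisher's exact test is preferable.
    ```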

  3. Brain fingerprinting classification concealed information test detects US Navy military medical information with P300

    PubMed Central

    Farwell, Lawrence A.; Richardson, Drew C.; Richardson, Graham M.; Furedy, John J.

    2014-01-01

    A classification concealed information test (CIT) used the “brain fingerprinting” method of applying P300 event-related potential (ERP) in detecting information that is (1) acquired in real life and (2) unique to US Navy experts in military medicine. Military medicine experts and non-experts were asked to push buttons in response to three types of text stimuli. Targets contain known information relevant to military medicine, are identified to subjects as relevant, and require pushing one button. Subjects are told to push another button to all other stimuli. Probes contain concealed information relevant to military medicine, and are not identified to subjects. Irrelevants contain equally plausible, but incorrect/irrelevant information. Error rate was 0%. Median and mean statistical confidences for individual determinations were 99.9% with no indeterminates (results lacking sufficiently high statistical confidence to be classified). We compared error rate and statistical confidence for determinations of both information present and information absent produced by classification CIT (Is a probe ERP more similar to a target or to an irrelevant ERP?) vs. comparison CIT (Does a probe produce a larger ERP than an irrelevant?) using P300 plus the late negative component (LNP; together, P300-MERMER). Comparison CIT produced a significantly higher error rate (20%) and lower statistical confidences: mean 67%; information-absent mean was 28.9%, less than chance (50%). We compared analysis using P300 alone with the P300 + LNP. P300 alone produced the same 0% error rate but significantly lower statistical confidences. These findings add to the evidence that the brain fingerprinting methods as described here provide sufficient conditions to produce less than 1% error rate and greater than 95% median statistical confidence in a CIT on information obtained in the course of real life that is characteristic of individuals with specific training, expertise, or organizational affiliation. PMID:25565941
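
    The classification question quoted above ("Is a probe ERP more similar to a target or to an irrelevant ERP?") reduces to a template comparison. The sketch below uses plain correlations on synthetic waveforms; the actual method uses bootstrapped correlations across single trials to attach statistical confidences:

    ```python
    import numpy as np

    def classify_cit(probe, target, irrelevant):
        """Classification CIT: is the probe ERP more target- or irrelevant-like?"""
        r_t = np.corrcoef(probe, target)[0, 1]
        r_i = np.corrcoef(probe, irrelevant)[0, 1]
        return "information present" if r_t > r_i else "information absent"

    t = np.linspace(0, 1, 256)
    target = np.exp(-((t - 0.3) / 0.05) ** 2)       # P300-like bump (synthetic)
    irrelevant = 0.1 * np.sin(2 * np.pi * 3 * t)    # no P300 (synthetic)
    probe = 0.9 * target + 0.05 * np.sin(2 * np.pi * 3 * t)

    print(classify_cit(probe, target, irrelevant))  # -> "information present"
    ```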

  4. Hadoop and friends - first experience at CERN with a new platform for high throughput analysis steps

    NASA Astrophysics Data System (ADS)

    Duellmann, D.; Surdy, K.; Menichetti, L.; Toebbicke, R.

    2017-10-01

    The statistical analysis of infrastructure metrics comes with several specific challenges, including the fairly large volume of unstructured metrics from a large set of independent data sources. Hadoop and Spark provide an ideal environment in particular for the first steps of skimming rapidly through hundreds of TB of low relevance data to find and extract the much smaller data volume that is relevant for statistical analysis and modelling. This presentation will describe the new Hadoop service at CERN and the use of several of its components for high throughput data aggregation and ad-hoc pattern searches. We will describe the hardware setup used, the service structure with a small set of decoupled clusters and the first experience with co-hosting different applications and performing software upgrades. We will further detail the common infrastructure used for data extraction and preparation from continuous monitoring and database input sources.
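
    The skim-then-aggregate pattern described above maps naturally onto a Spark job; a hedged PySpark sketch in which the paths, field names, and threshold are all invented for illustration:

    ```python
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("metrics-skim").getOrCreate()

    # Hypothetical monitoring dump: one JSON record per metric sample.
    metrics = (spark.read.json("hdfs:///monitoring/metrics/2017/*.json")
               .withColumn("ts", F.to_timestamp("timestamp")))

    # Skim: keep the small relevant slice of a large low-relevance volume.
    relevant = metrics.filter((F.col("service") == "eos") &
                              (F.col("latency_ms") > 500))

    # Aggregate for downstream statistical modelling.
    hourly = (relevant.groupBy(F.window("ts", "1 hour"), "host")
              .agg(F.count("*").alias("n_slow"),
                   F.avg("latency_ms").alias("mean_latency")))
    hourly.write.parquet("hdfs:///monitoring/skimmed/slow_requests")
    ```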

  5. Relationship between perceptual learning in speech and statistical learning in younger and older adults

    PubMed Central

    Neger, Thordis M.; Rietveld, Toni; Janse, Esther

    2014-01-01

    Within a few sentences, listeners learn to understand severely degraded speech such as noise-vocoded speech. However, individuals vary in the amount of such perceptual learning and it is unclear what underlies these differences. The present study investigates whether perceptual learning in speech relates to statistical learning, as sensitivity to probabilistic information may aid identification of relevant cues in novel speech input. If statistical learning and perceptual learning (partly) draw on the same general mechanisms, then statistical learning in a non-auditory modality using non-linguistic sequences should predict adaptation to degraded speech. In the present study, 73 older adults (aged over 60 years) and 60 younger adults (aged between 18 and 30 years) performed a visual artificial grammar learning task and were presented with 60 meaningful noise-vocoded sentences in an auditory recall task. Within age groups, sentence recognition performance over exposure was analyzed as a function of statistical learning performance, and other variables that may predict learning (i.e., hearing, vocabulary, attention switching control, working memory, and processing speed). Younger and older adults showed similar amounts of perceptual learning, but only younger adults showed significant statistical learning. In older adults, improvement in understanding noise-vocoded speech was constrained by age. In younger adults, amount of adaptation was associated with lexical knowledge and with statistical learning ability. Thus, individual differences in general cognitive abilities explain listeners' variability in adapting to noise-vocoded speech. Results suggest that perceptual and statistical learning share mechanisms of implicit regularity detection, but that the ability to detect statistical regularities is impaired in older adults if visual sequences are presented quickly. PMID:25225475

  6. Relationship between perceptual learning in speech and statistical learning in younger and older adults.

    PubMed

    Neger, Thordis M; Rietveld, Toni; Janse, Esther

    2014-01-01

    Within a few sentences, listeners learn to understand severely degraded speech such as noise-vocoded speech. However, individuals vary in the amount of such perceptual learning and it is unclear what underlies these differences. The present study investigates whether perceptual learning in speech relates to statistical learning, as sensitivity to probabilistic information may aid identification of relevant cues in novel speech input. If statistical learning and perceptual learning (partly) draw on the same general mechanisms, then statistical learning in a non-auditory modality using non-linguistic sequences should predict adaptation to degraded speech. In the present study, 73 older adults (aged over 60 years) and 60 younger adults (aged between 18 and 30 years) performed a visual artificial grammar learning task and were presented with 60 meaningful noise-vocoded sentences in an auditory recall task. Within age groups, sentence recognition performance over exposure was analyzed as a function of statistical learning performance, and other variables that may predict learning (i.e., hearing, vocabulary, attention switching control, working memory, and processing speed). Younger and older adults showed similar amounts of perceptual learning, but only younger adults showed significant statistical learning. In older adults, improvement in understanding noise-vocoded speech was constrained by age. In younger adults, amount of adaptation was associated with lexical knowledge and with statistical learning ability. Thus, individual differences in general cognitive abilities explain listeners' variability in adapting to noise-vocoded speech. Results suggest that perceptual and statistical learning share mechanisms of implicit regularity detection, but that the ability to detect statistical regularities is impaired in older adults if visual sequences are presented quickly.

  7. Resilience Among Students at the Basic Enlisted Submarine School

    DTIC Science & Technology

    2016-12-01

    …reported resilience. The Hayes macro in the Statistical Package for the Social Sciences (SPSS) was used to uncover factors relevant to mediation analysis. Findings suggest that the encouragement of…

  8. A note about high blood pressure in childhood

    NASA Astrophysics Data System (ADS)

    Teodoro, M. Filomena; Simão, Carla

    2017-06-01

    In medical, behavioral and social sciences it is usual to obtain binary outcomes. The present work collects information in which some of the outcomes are binary variables (1 = 'yes' / 0 = 'no'). In [14], a preliminary study about caregivers' perception of pediatric hypertension was introduced. An experimental questionnaire was designed to be answered by the caregivers of routine pediatric consultation attendees at Santa Maria Hospital (HSM). The collected data were analyzed statistically: a descriptive analysis and a predictive model were performed, and significant relations between some socio-demographic variables and the assessed knowledge were obtained. A statistical analysis using part of the questionnaire information can be found in [14]. The present article completes that statistical approach by estimating a model for the relevant remaining questions of the questionnaire using Generalized Linear Models (GLM). Exploring the binary-outcome issue, we intend to extend this approach using Generalized Linear Mixed Models (GLMM), but that work is still ongoing.
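
    For a binary outcome of this kind, the GLM in question is typically a logistic regression, i.e., a binomial family with a logit link. The sketch below, using statsmodels, is a minimal illustration of such a fit; the predictors (caregiver age, education level) are hypothetical placeholders, not the actual questionnaire variables.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)

    # Hypothetical data: 200 caregivers, binary knowledge outcome (1 = 'yes', 0 = 'no')
    age = rng.uniform(20, 60, 200)        # placeholder predictor
    education = rng.integers(0, 3, 200)   # placeholder predictor (levels 0-2)
    logit = -2.0 + 0.03 * age + 0.8 * education
    y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    # A binomial-family GLM with the default logit link is logistic regression
    X = sm.add_constant(np.column_stack([age, education]))
    model = sm.GLM(y, X, family=sm.families.Binomial()).fit()
    print(model.summary())
    ```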

  9. Statistical Learning is Related to Early Literacy-Related Skills

    PubMed Central

    Spencer, Mercedes; Kaschak, Michael P.; Jones, John L.; Lonigan, Christopher J.

    2015-01-01

    It has been demonstrated that statistical learning, or the ability to use statistical information to learn the structure of one’s environment, plays a role in young children’s acquisition of linguistic knowledge. Although most research on statistical learning has focused on language acquisition processes, such as the segmentation of words from fluent speech and the learning of syntactic structure, some recent studies have explored the extent to which individual differences in statistical learning are related to literacy-relevant knowledge and skills. The present study extends on this literature by investigating the relations between two measures of statistical learning and multiple measures of skills that are critical to the development of literacy—oral language, vocabulary knowledge, and phonological processing—within a single model. Our sample included a total of 553 typically developing children from prekindergarten through second grade. Structural equation modeling revealed that statistical learning accounted for a unique portion of the variance in these literacy-related skills. Practical implications for instruction and assessment are discussed. PMID:26478658

  10. The Galactic interstellar medium: foregrounds and star formation

    NASA Astrophysics Data System (ADS)

    Miville-Deschênes, Marc-Antoine

    2018-05-01

    This review briefly presents two aspects of Galactic interstellar medium science that seem relevant to studies of the Epoch of Reionization (EoR). First, we give some statistical properties of the Galactic foreground emission in the diffuse regions of the sky. The properties of the emission observed in projection on the plane of the sky are then related to how matter is organised along the line of sight. The diffuse atomic gas is multi-phase, with dense filamentary structures occupying only about 1% of the volume but contributing about 50% of the emission. The second part of the review presents aspects of structure formation in the Galactic interstellar medium that could be relevant for the subgrid physics used to model the formation of the first stars.

  11. The Content of Statistical Requirements for Authors in Biomedical Research Journals

    PubMed Central

    Liu, Tian-Yi; Cai, Si-Yu; Nie, Xiao-Lu; Lyu, Ya-Qi; Peng, Xiao-Xia; Feng, Guo-Shuang

    2016-01-01

    Background: Robust statistical design, sound statistical analysis, and standardized presentation are important to enhance the quality and transparency of biomedical research. This systematic review was conducted to summarize the statistical reporting requirements introduced by biomedical research journals with an impact factor of 10 or above, so that researchers give statistical issues serious consideration not only at the stage of data analysis but also at the stage of methodological design. Methods: Detailed statistical instructions for authors were downloaded from the homepage of each of the included journals or obtained directly from the editors via email. We then described the types and numbers of statistical guidelines introduced by different press groups. Items of statistical reporting guidelines, as well as particular requirements, were summarized by frequency and grouped into design, method of analysis, and presentation, respectively. Finally, updated statistical guidelines and particular requirements for improvement were summed up. Results: In total, 21 of 23 press groups introduced at least one statistical guideline. More than half of the press groups update their statistical instructions for authors as new statistical reporting guidelines are issued. In addition, 16 press groups, covering 44 journals, address particular statistical requirements. Most of the particular requirements focused on the performance of statistical analysis and transparency in statistical reporting, including "address issues relevant to research design, including participant flow diagram, eligibility criteria, and sample size estimation" and "statistical methods and the reasons." Conclusions: Statistical requirements for authors are becoming increasingly comprehensive. They remind researchers to give sufficient consideration not only to statistical methods during research design but also to standardized statistical reporting, which would be beneficial in providing stronger evidence and making critical appraisal of evidence more accessible. PMID:27748343

  12. The Content of Statistical Requirements for Authors in Biomedical Research Journals.

    PubMed

    Liu, Tian-Yi; Cai, Si-Yu; Nie, Xiao-Lu; Lyu, Ya-Qi; Peng, Xiao-Xia; Feng, Guo-Shuang

    2016-10-20

    Robust statistical design, sound statistical analysis, and standardized presentation are important to enhance the quality and transparency of biomedical research. This systematic review was conducted to summarize the statistical reporting requirements introduced by biomedical research journals with an impact factor of 10 or above, so that researchers give statistical issues serious consideration not only at the stage of data analysis but also at the stage of methodological design. Detailed statistical instructions for authors were downloaded from the homepage of each of the included journals or obtained directly from the editors via email. We then described the types and numbers of statistical guidelines introduced by different press groups. Items of statistical reporting guidelines, as well as particular requirements, were summarized by frequency and grouped into design, method of analysis, and presentation, respectively. Finally, updated statistical guidelines and particular requirements for improvement were summed up. In total, 21 of 23 press groups introduced at least one statistical guideline. More than half of the press groups update their statistical instructions for authors as new statistical reporting guidelines are issued. In addition, 16 press groups, covering 44 journals, address particular statistical requirements. Most of the particular requirements focused on the performance of statistical analysis and transparency in statistical reporting, including "address issues relevant to research design, including participant flow diagram, eligibility criteria, and sample size estimation" and "statistical methods and the reasons." Statistical requirements for authors are becoming increasingly comprehensive. They remind researchers to give sufficient consideration not only to statistical methods during research design but also to standardized statistical reporting, which would be beneficial in providing stronger evidence and making critical appraisal of evidence more accessible.

  13. Optical properties of mice skin for optical therapy relevant wavelengths: influence of gender and pigmentation

    NASA Astrophysics Data System (ADS)

    Sabino, C. P.; Deana, A. M.; Silva, D. F. T.; França, C. M.; Yoshimura, T. M.; Ribeiro, M. S.

    2015-03-01

    Red and near-infrared light have been widely employed in optical therapies. Skin is the most common optical barrier in non-invasive techniques, and in many cases it is the target tissue itself. Consequently, to optimize the outcomes of light-based therapies, the optical properties of skin tissue must be well elucidated. In the present study, we evaluated the dorsal skin optical properties of albino (BALB/c) and pigmented (C57BL/6) mice using the Kubelka-Munk photon transport model. We evaluated samples from male and female young mice of both strains. Analysis was performed for wavelengths at 630, 660, 780, 810 and 905 nm due to their prevalent use in optical therapies, such as low-level light (or laser) and photodynamic therapies. Spectrophotometric measurements of diffuse transmittance and reflectance were performed using a single integrating sphere coupled to a spectrophotometer. Statistical analysis was performed by two-way ANOVA, with Tukey as the post hoc test and Levene and Shapiro-Wilk as pre-tests. Statistical significance was considered when p < 0.05. Our results show only a slight transmittance increment (<10%) as wavelengths increase from 630 to 905 nm, without statistical significance. Albino male mice present reduced transmittance levels for all wavelengths. The organization and abundance of the tissues composing skin significantly influence its scattering optical properties, although absorption remains constant. We conclude that factors such as subcutaneous adiposity and connective tissue structure can have a statistically significant influence on mice skin optical properties, and that these factors vary considerably between genders and strains.
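
    For reference, the Kubelka-Munk two-flux model recovers the scattering coefficient S and absorption coefficient K of a layer of thickness d from its measured diffuse reflectance R and transmittance T. One commonly quoted form is the following (an illustration of the model; the abstract does not state which variant was applied):

    ```latex
    a = \frac{1 + R^2 - T^2}{2R}, \qquad b = \sqrt{a^2 - 1},
    \qquad S = \frac{1}{b\,d}\,\coth^{-1}\!\left(\frac{1 - aR}{bR}\right),
    \qquad K = S\,(a - 1)
    ```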

  14. Person-Fit and the Rasch Model, with an Application to Knowledge of Logical Quantors.

    ERIC Educational Resources Information Center

    Molenaar, Ivo W.; Hoijtink, Herbert

    1996-01-01

    Some specific person-fit results for the Rasch model are presented, followed by an application to a test measuring knowledge of reasoning with logical quantors. Some issues are relevant to all attempts to use person-fit statistics in research, but the special role of the Rasch model is highlighted. (SLD)
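
    For context, the Rasch model underlying these person-fit statistics gives the probability of a correct response as a logistic function of the difference between person ability and item difficulty; person-fit statistics then flag response patterns that are improbable under this model. In standard notation (our addition, not part of the record):

    ```latex
    P(X_{vi} = 1 \mid \theta_v, b_i) = \frac{e^{\theta_v - b_i}}{1 + e^{\theta_v - b_i}}
    ```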

  15. Women and Science: Issues and Resources [and] Women and Information Technology: A Selective Bibliography.

    ERIC Educational Resources Information Center

    Searing, Susan, Comp.; Shult, Linda, Comp.

    Two bibliographies list over 120 books, journal articles, reference materials, statistical sources, organizations, and media relevant to women's roles in science and in information technology. The first bibliography emphasizes books, most of which were published in the late 1970's and the 1980's, that present a feminist critique of scientific…

  16. Accuracy of Orthognathic Surgical Outcomes Using 2- and 3-Dimensional Landmarks-The Case for Apples and Oranges?

    PubMed

    Borba, Alexandre Meireles; José da Silva, Everton; Fernandes da Silva, André Luis; Han, Michael D; da Graça Naclério-Homem, Maria; Miloro, Michael

    2018-01-12

    To verify predicted versus obtained surgical movements in 2-dimensional (2D) and 3-dimensional (3D) measurements and to compare the equivalence between these methods, a retrospective observational study of bimaxillary orthognathic surgeries was performed. Postoperative cone-beam computed tomographic (CBCT) scans were superimposed on preoperative scans, and a lateral cephalometric radiograph was generated from each CBCT scan. After identification of the sella, nasion, and upper central incisor tip landmarks on 2D and 3D images, actual and planned movements were compared by cephalometric measurements. A one-sample t test was used to evaluate results statistically, with expected mean discrepancy values ranging from 0 to 2 mm. Equivalence of 2D and 3D values was compared using a paired t test. For the final sample of 46 cases, 2D cephalometry showed that differences between actual and planned movements in the horizontal axis were statistically relevant for expected means of 0, 0.5, and 2 mm but not for expected means of 1 and 1.5 mm; vertical movements were statistically relevant for expected means of 0 and 0.5 mm but not for expected means of 1, 1.5, and 2 mm. For 3D cephalometry in the horizontal axis, there were statistically relevant differences for expected means of 0, 1.5, and 2 mm but not for expected means of 0.5 and 1 mm; vertical movements showed statistically relevant differences for expected means of 0, 0.5, 1.5 and 2 mm but not for the expected mean of 1 mm. Comparison of 2D and 3D values displayed statistical differences for the horizontal and vertical axes. Comparison of 2D and 3D surgical outcome assessments should be performed with caution because there seems to be a difference in acceptable levels of accuracy between these 2 methods of evaluation. Moreover, 3D accuracy studies should no longer rely on a 2-mm level of discrepancy but on a 1-mm level. Copyright © 2018 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
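
    The analysis described amounts to a series of one-sample t tests of the planned-versus-actual discrepancies against fixed expected means. A minimal sketch of that test with SciPy follows, on illustrative data only (not the study's measurements):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    # Illustrative planned-minus-actual discrepancies (mm) for 46 cases
    discrepancies = rng.normal(loc=0.9, scale=1.2, size=46)

    # Test the mean discrepancy against each expected value from 0 to 2 mm
    for expected in (0.0, 0.5, 1.0, 1.5, 2.0):
        t, p = stats.ttest_1samp(discrepancies, popmean=expected)
        print(f"H0: mean = {expected} mm -> t = {t:.2f}, p = {p:.3f}")
    ```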

  17. Level statistics of words: Finding keywords in literary texts and symbolic sequences

    NASA Astrophysics Data System (ADS)

    Carpena, P.; Bernaola-Galván, P.; Hackenberg, M.; Coronado, A. V.; Oliver, J. L.

    2009-03-01

    Using a generalization of the level statistics analysis of quantum disordered systems, we present an approach able to extract automatically keywords in literary texts. Our approach takes into account not only the frequencies of the words present in the text but also their spatial distribution along the text, and is based on the fact that relevant words are significantly clustered (i.e., they self-attract each other), while irrelevant words are distributed randomly in the text. Since a reference corpus is not needed, our approach is especially suitable for single documents for which no a priori information is available. In addition, we show that our method works also in generic symbolic sequences (continuous texts without spaces), thus suggesting its general applicability.
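
    The clustering idea can be approximated by the coefficient of variation of the gaps between consecutive occurrences of a word: a randomly placed word has roughly exponential gaps (coefficient of variation near 1), while a clustered, relevant word scores higher. The sketch below is a simplification of the paper's level-statistics analysis, not its exact estimator:

    ```python
    import re
    from collections import defaultdict

    def keyword_scores(text, min_count=10):
        """Rank words by the coefficient of variation of their occurrence gaps."""
        words = re.findall(r"[a-z']+", text.lower())
        positions = defaultdict(list)
        for i, w in enumerate(words):
            positions[w].append(i)

        scores = {}
        for w, pos in positions.items():
            if len(pos) < min_count:
                continue  # too few occurrences for a stable estimate
            gaps = [b - a for a, b in zip(pos, pos[1:])]
            mean = sum(gaps) / len(gaps)
            var = sum((g - mean) ** 2 for g in gaps) / len(gaps)
            scores[w] = (var ** 0.5) / mean  # > 1 suggests clustering (relevance)
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    ```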

  18. Is there room in the DSM for consideration of deaf people?

    PubMed

    Lala, F J

    1998-10-01

    Recent changes to the Diagnostic and Statistical Manual of Mental Disorders, or DSM (4th ed., American Psychiatric Association, 1994), show recognition that cultural factors are relevant to assessment; thus, including specific information relevant to Deaf culture should help DSM users understand their deaf clients. For the present article, literature was surveyed on the psychological needs of the Deaf, and specifically on how the Deaf view those needs. The review focused on four articles (Carver, 1995, 1997; Chapman, 1994; Dolnick, 1993). These articles suggest consensus on the thesis that the Deaf, as a minority culture, should provide information on Deaf culture to members of the helping professions. In addition to enhancing care providers' understanding, this would help society do a better job of including the Deaf in planning relevant to their needs. In particular, culturally deaf people should urge inclusion of relevant information about the Deaf in the next DSM revision.

  19. Does experimental pain affect auditory processing of speech-relevant signals? A study in healthy young adults.

    PubMed

    Sapir, Shimon; Pud, Dorit

    2008-01-01

    To assess the effect of tonic pain stimulation on auditory processing of speech-relevant acoustic signals in healthy pain-free volunteers. Sixty university students, randomly assigned to either a thermal pain stimulation (46 degrees C/6 min) group (PS) or no pain stimulation group (NPS), performed a rate change detection task (RCDT) involving sinusoidally frequency-modulated vowel-like signals. Task difficulty was manipulated by changing the rate of the modulated signals (henceforth rate). Perceived pain intensity was evaluated using a visual analog scale (VAS) (0-100). Mean pain rating was approximately 33 in the PS group and approximately 3 in the NPS group. Pain stimulation was associated with poorer performance on the RCDT, but this trend was not statistically significant. Performance worsened with increasing rate of signal modulation in both groups (p < 0.0001), with no pain by rate interaction. The present findings indicate a trend whereby mild or moderate pain appears to affect auditory processing of speech-relevant acoustic signals. This trend, however, was not statistically significant. It is possible that more intense pain would yield more pronounced (deleterious) effects on auditory processing, but this needs to be verified empirically.

  20. Electrical Conductivity of Charged Particle Systems and Zubarev's Nonequilibrium Statistical Operator Method

    NASA Astrophysics Data System (ADS)

    Röpke, G.

    2018-01-01

    One of the fundamental problems in physics that are not yet rigorously solved is the statistical mechanics of nonequilibrium processes. An important contribution to describing irreversible behavior starting from reversible Hamiltonian dynamics was given by D. N. Zubarev, who invented the method of the nonequilibrium statistical operator. We discuss this approach, in particular, the extended von Neumann equation, and as an example consider the electrical conductivity of a system of charged particles. We consider the selection of the set of relevant observables. We show the relation between kinetic theory and linear response theory. Using thermodynamic Green's functions, we present a systematic treatment of correlation functions, but the convergence needs investigation. We compare different expressions for the conductivity and list open questions.
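
    In the linear-response limit that the review connects to kinetic theory, the static electrical conductivity reduces to a Green-Kubo (current-current correlation) expression; schematically, in standard notation rather than Zubarev's (our addition):

    ```latex
    \sigma = \frac{1}{k_B T\, V} \int_0^{\infty} dt\; e^{-\varepsilon t}\,
    \langle J_x(0)\, J_x(t) \rangle, \qquad \varepsilon \to 0^{+}
    ```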

  1. Statistics on continuous IBD data: Exact distribution evaluation for a pair of full(half)-sibs and a pair of a (great-) grandchild with a (great-) grandparent

    PubMed Central

    Stefanov, Valeri T

    2002-01-01

    Background Pairs of related individuals are widely used in linkage analysis. Most of the tests for linkage analysis are based on statistics associated with identity by descent (IBD) data. The current biotechnology provides data on very densely packed loci, and therefore, it may provide almost continuous IBD data for pairs of closely related individuals. Therefore, the distribution theory for statistics on continuous IBD data is of interest. In particular, distributional results which allow the evaluation of p-values for relevant tests are of importance. Results A technology is provided for numerical evaluation, with any given accuracy, of the cumulative probabilities of some statistics on continuous genome data for pairs of closely related individuals. In the case of a pair of full-sibs, the following statistics are considered: (i) the proportion of genome with 2 (at least 1) haplotypes shared identical-by-descent (IBD) on a chromosomal segment, (ii) the number of distinct pieces (subsegments) of a chromosomal segment, on each of which exactly 2 (at least 1) haplotypes are shared IBD. The natural counterparts of these statistics for the other relationships are also considered. Relevant Maple codes are provided for a rapid evaluation of the cumulative probabilities of such statistics. The genomic continuum model, with Haldane's model for the crossover process, is assumed. Conclusions A technology, together with relevant software codes for its automated implementation, are provided for exact evaluation of the distributions of relevant statistics associated with continuous genome data on closely related individuals. PMID:11996673

  2. Schools and Data: The Educator's Guide for Using Data To Improve Decision Making.

    ERIC Educational Resources Information Center

    Creighton, Theodore B.

    This book focuses on the relevance of statistics in the day-to-day lives of principals and teachers. The step-by-step guide to using existing school data can help school leaders make more appropriate and effective decisions. The information is presented with easy-to-follow instructions, illustrations, and pertinent examples. The chapters are: (1)…

  3. Reproducible and Verifiable Equations of State Using Microfabricated Materials

    NASA Astrophysics Data System (ADS)

    Martin, J. F.; Pigott, J. S.; Panero, W. R.

    2017-12-01

    Accurate interpretation of observable geophysical data, relevant to the structure, composition, and evolution of planetary interiors, requires precise determination of appropriate equations of state. We present the synthesis of controlled-geometry nanofabricated samples and insulation layers for the laser-heated diamond anvil cell. We present electron-gun evaporation, sputter deposition, and photolithography methods to mass-produce Pt/SiO2/Fe/SiO2 stacks and MgO insulating disks to be used in LHDAC experiments to reduce uncertainties in equation of state measurements due to large temperature gradients. We present a reanalysis of published iron PVT data to establish a statistically-valid extrapolation of the equation of state to inner core conditions with quantified uncertainties, addressing the complication of covariance in equation of state parameters. We use this reanalysis, together with the synthesized samples, to propose a scheme for measurement and validation of high-precision equations of state relevant to the Earth and super-Earth exoplanets.
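
    The covariance complication arises because the fitted parameters (V0, K0, K0') of an equation of state trade off against one another. A common parameterization for such P-V fits, shown here purely as an example (the abstract does not state which form was used), is the third-order Birch-Murnaghan equation:

    ```latex
    P(V) = \frac{3K_0}{2}\left[\left(\frac{V_0}{V}\right)^{7/3} -
    \left(\frac{V_0}{V}\right)^{5/3}\right]
    \left\{1 + \frac{3}{4}\left(K_0' - 4\right)
    \left[\left(\frac{V_0}{V}\right)^{2/3} - 1\right]\right\}
    ```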

  4. Promoting Statistical Thinking in Schools with Road Injury Data

    ERIC Educational Resources Information Center

    Woltman, Marie

    2017-01-01

    Road injury is an immediately relevant topic for 9-19 year olds. Current availability of Open Data makes it increasingly possible to find locally relevant data. Statistical lessons developed from these data can mutually reinforce life lessons about minimizing risk on the road. Devon County Council demonstrate how a wide array of statistical…

  5. Utilisation of Local Inputs in the Funding and Administration of Education in Nigeria

    ERIC Educational Resources Information Center

    Akiri, Agharuwhe A.

    2014-01-01

    The article discussed how, why and who is in charge of administering and funding schools in Nigeria. The author utilised the relevant statistical approach which examined and discussed various political and historical trends affecting education. Besides this, relevant documented statistical data were used to both buttress and substantiate related…

  6. Statistical approach for selection of biologically informative genes.

    PubMed

    Das, Samarendra; Rai, Anil; Mishra, D C; Rai, Shesh N

    2018-05-20

    Selection of informative genes from high-dimensional gene expression data has emerged as an important research area in genomics. Most gene selection techniques proposed so far are based on either a relevancy or a redundancy measure, and their performance has typically been judged through post-selection classification accuracy computed by a classifier using the selected genes. This performance metric may be statistically sound but may not be biologically relevant. A statistical approach, Boot-MRMR, was proposed based on a composite measure of maximum relevance and minimum redundancy, which is both statistically sound and biologically relevant for informative gene selection. For comparative evaluation of the proposed approach, we developed two biologically sufficient criteria, i.e., Gene Set Enrichment with QTL (GSEQ) and a biological similarity score based on Gene Ontology (GO). Further, a systematic and rigorous evaluation of the proposed technique against 12 existing gene selection techniques was carried out using five gene expression datasets. This evaluation was based on a broad spectrum of statistically sound (e.g., subject classification) and biologically relevant (based on QTL and GO) criteria under a multiple-criteria decision-making framework. The performance analysis showed that the proposed technique selects informative genes that are more biologically relevant. The proposed technique is also quite competitive with the existing techniques with respect to subject classification and computational time. Our results also showed that under the multiple-criteria decision-making setup, the proposed technique is best for informative gene selection over the available alternatives. Based on the proposed approach, an R package, BootMRMR, has been developed and is available at https://cran.r-project.org/web/packages/BootMRMR. This study will provide a practical guide for selecting statistical techniques to identify informative genes from high-dimensional expression data for breeding and systems biology studies. Published by Elsevier B.V.
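
    The max-relevance-min-redundancy (MRMR) principle scores each candidate gene by its association with the class label minus its average association with the genes already selected; Boot-MRMR adds bootstrap resampling on top of this. The greedy sketch below illustrates the underlying principle only and is not the API of the BootMRMR R package:

    ```python
    import numpy as np

    def mrmr_select(X, y, k):
        """Greedy max-relevance-min-redundancy gene selection.

        X: (n_samples, n_genes) expression matrix; y: numeric class labels;
        k: number of genes to select. Absolute Pearson correlation serves as
        both the relevance and the redundancy measure.
        """
        n_genes = X.shape[1]
        k = min(k, n_genes)
        relevance = np.array([abs(np.corrcoef(X[:, j], y)[0, 1])
                              for j in range(n_genes)])
        selected = [int(np.argmax(relevance))]
        while len(selected) < k:
            best, best_score = None, -np.inf
            for j in range(n_genes):
                if j in selected:
                    continue
                redundancy = np.mean([abs(np.corrcoef(X[:, j], X[:, s])[0, 1])
                                      for s in selected])
                score = relevance[j] - redundancy  # max relevance, min redundancy
                if score > best_score:
                    best, best_score = j, score
            selected.append(best)
        return selected
    ```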

  7. Analytics for Cyber Network Defense

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plantenga, Todd.; Kolda, Tamara Gibson

    2011-06-01

    This report provides a brief survey of analytics tools considered relevant to cyber network defense (CND). Ideas and tools come from fields such as statistics, data mining, and knowledge discovery. Some analytics are considered standard mathematical or statistical techniques, while others reflect current research directions. In all cases the report attempts to explain the relevance to CND with brief examples.

  8. The Effects of Clinically Relevant Multiple-Choice Items on the Statistical Discrimination of Physician Clinical Competence.

    ERIC Educational Resources Information Center

    Downing, Steven M.; Maatsch, Jack L.

    To test the effect of clinically relevant multiple-choice item content on the validity of statistical discriminations of physicians' clinical competence, data were collected from a field test of the Emergency Medicine Examination, test items for the certification of specialists in emergency medicine. Two 91-item multiple-choice subscales were…

  9. Statistical functions and relevant correlation coefficients of clearness index

    NASA Astrophysics Data System (ADS)

    Pavanello, Diego; Zaaiman, Willem; Colli, Alessandra; Heiser, John; Smith, Scott

    2015-08-01

    This article presents a statistical analysis of the sky conditions, during the years 2010 to 2012, for three different locations: the Joint Research Centre site in Ispra (Italy, European Solar Test Installation - ESTI laboratories), the site of the National Renewable Energy Laboratory in Golden (Colorado, USA) and the site of Brookhaven National Laboratory in Upton (New York, USA). The key parameter is the clearness index kT, a dimensionless expression of the global irradiance impinging upon a horizontal surface at a given instant of time. In the first part, the sky conditions are characterized using daily averages, giving a general overview of the three sites. In the second part, the analysis is performed using data sets with a short-term resolution of 1 sample per minute, demonstrating remarkable properties of the statistical distributions of the clearness index, reinforced by a proof using fuzzy logic methods. Subsequently, some time-dependent correlations between different meteorological variables are presented in terms of the Pearson and Spearman correlation coefficients, and a new coefficient is introduced.
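
    The clearness index is the ratio of measured global horizontal irradiance to the corresponding extraterrestrial irradiance on a horizontal plane; once a kT series is in hand, the two correlation statistics used in the paper are one-liners in SciPy. A minimal sketch (illustrative series and a deliberately simplified extraterrestrial model):

    ```python
    import numpy as np
    from scipy.stats import pearsonr, spearmanr

    SOLAR_CONSTANT = 1361.0  # W/m^2

    def clearness_index(ghi, cos_zenith):
        """kT = measured GHI / extraterrestrial horizontal irradiance (simplified)."""
        extraterrestrial = SOLAR_CONSTANT * np.clip(cos_zenith, 1e-6, None)
        return np.clip(ghi / extraterrestrial, 0.0, 1.0)

    # Illustrative one-minute data over a synthetic day
    rng = np.random.default_rng(2)
    cosz = np.clip(np.sin(np.linspace(0, np.pi, 600)), 0.05, None)
    ghi = SOLAR_CONSTANT * cosz * rng.uniform(0.2, 0.8, 600)
    kt = clearness_index(ghi, cosz)
    temperature = 20 + 8 * cosz + rng.normal(0, 1, 600)  # a second variable

    print("Pearson: ", pearsonr(kt, temperature)[0])
    print("Spearman:", spearmanr(kt, temperature)[0])
    ```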

  10. Who Needs Statistics? | Poster

    Cancer.gov

    You may know the feeling. You have collected a lot of new data on an important experiment. Now you are faced with multiple groups of data, a sea of numbers, and a deadline for submitting your paper to a peer-reviewed journal. And you are not sure which data are relevant, or even the best way to present them. The statisticians at Data Management Services (DMS) know how to help.

  11. A MULTIPLE TESTING OF THE ABC METHOD AND THE DEVELOPMENT OF A SECOND-GENERATION MODEL. PART II, TEST RESULTS AND AN ANALYSIS OF RECALL RATIO.

    ERIC Educational Resources Information Center

    ALTMANN, BERTHOLD

    After a brief summary of the test program (described more fully in LI 000 318), the statistical results, tabulated as overall "ABC (Approach by Concept)-relevance ratios" and "ABC-recall figures," are presented and reviewed. An abstract model developed in accordance with Max Weber's "Idealtypus" ("Die Objektivitaet…

  12. [Recent developments in intra-European migration since 1974].

    PubMed

    Lebon, A; Falchi, G

    1980-01-01

    This article represents the text of a paper presented at a conference on European migration organized by the Council of Europe in Strasbourg, May 6-8, 1979. The authors examine changes in European migration since the oil crisis of 1974 and include a review of the relevant statistical data, a review of the main problems, and a summary of some possible future trends in European migration.

  13. UXO Burial Prediction Fidelity: A Summary

    DTIC Science & Technology

    2017-07-01

    …equilibrium. Any complete picture of munition evolution in sediment would need to account for these effects. More relevant to the present topic: these… adds uncertainty to predictions of munition fate, and assessments of risk probabilities would need to account for the statistical distribution of…

  14. Hazardous Communication and Tools for Quality: Basic Statistics. Responsive Text. Educational Materials for the Workplace.

    ERIC Educational Resources Information Center

    Vermont Inst. for Self-Reliance, Rutland.

    This guide provides a description of Responsive Text (RT), a method for presenting job-relevant information within a computer-based support system. A summary of what RT is and why it is important is provided first. The first section of the guide provides a brief overview of what research tells about the reading process and how the general design…

  15. Body mass index is not associated with reoperation rates in patients with a surgically treated perforated peptic ulcer.

    PubMed

    Duch, Patricia; Møller, Morten Hylander

    2015-03-01

    The aim of the present nationwide Danish cohort study was to examine the association between body mass index (BMI) and reoperation in patients surgically treated for perforated peptic ulcer (PPU). This was a nationwide cohort study of all Danish patients surgically treated for benign gastric or duodenal PPU between 2011 and 2013. The outcomes were reoperation within 30 days of the primary surgical procedure and 90-day survival. The association between BMI and reoperation is presented as crude and adjusted odds ratios (OR) with 95% confidence intervals (CI). A total of 726 patients were included. The median age was 69.5 years (range: 18.2-101.7 years), 51.4% were women (n = 373), 78.4% (n = 569) of the patients had at least one co-existing disease, and 47.5% (n = 345) were categorised as American Society of Anesthesiologists (ASA) class ≥ 3. Reoperative surgery was done in 124 patients (17.1%). No statistically significant adjusted association between underweight, overweight or obesity and reoperation was found (adjusted OR (95% CI): 0.456 (0.181-1.148), 1.468 (0.857-2.517), and 1.314 (0.663-2.601), respectively). Patients undergoing reoperative surgery had a statistically significantly lower crude 90-day survival than patients without need of reoperative surgery; 63.9% (83/124) versus 75.9% (457/602), p = 0.037. In the present nationwide cohort study of PPU patients, no statistically significant adjusted association between BMI and reoperation rates was found. Patients undergoing reoperative surgery had decreased 90-day survival.

  16. Text mining by Tsallis entropy

    NASA Astrophysics Data System (ADS)

    Jamaati, Maryam; Mehri, Ali

    2018-01-01

    Long-range correlations between the elements of natural languages enable them to convey very complex information. The complex structure of human language, as a manifestation of natural languages, motivates us to apply nonextensive statistical mechanics in text mining. Tsallis entropy appropriately ranks terms' relevance to the document subject, taking advantage of their spatial correlation length. We apply this statistical concept as a new powerful word-ranking metric in order to extract keywords from a single document. We carry out an experimental evaluation, which shows the capability of the presented method in keyword extraction. We find that Tsallis entropy has reliable word-ranking performance, at the same level as the best previous ranking methods.
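
    For reference, the Tsallis entropy of a discrete distribution p_i generalizes the Shannon entropy through the nonextensivity parameter q and recovers it as q → 1:

    ```latex
    S_q = \frac{1 - \sum_i p_i^{\,q}}{q - 1}, \qquad
    \lim_{q \to 1} S_q = -\sum_i p_i \ln p_i
    ```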

  17. Statistical hadronization with exclusive channels in e+e- annihilation

    DOE PAGES

    Ferroni, L.; Becattini, F.

    2012-01-01

    We present a systematic analysis of exclusive hadronic channels in e+e- collisions at centre-of-mass energies between 2.1 and 2.6 GeV within the statistical hadronization model. Because of the low multiplicities involved, calculations have been carried out in the full microcanonical ensemble, including conservation of energy-momentum, angular momentum, parity, isospin, and all relevant charges. We show that the data are in overall good agreement with the model for an energy density of about 0.5 GeV/fm³ and an extra strangeness suppression parameter γS ≈ 0.7, essentially the same values found with fits to inclusive multiplicities at higher energy.

  18. Statistical evaluation of GLONASS amplitude scintillation over low latitudes in the Brazilian territory

    NASA Astrophysics Data System (ADS)

    de Oliveira Moraes, Alison; Muella, Marcio T. A. H.; de Paula, Eurico R.; de Oliveira, César B. A.; Terra, William P.; Perrella, Waldecir J.; Meibach-Rosa, Pâmela R. P.

    2018-04-01

    The ionospheric scintillation, generated by ionospheric plasma irregularities, affects the radio signals that pass through it. Its effects are widely studied in the literature with two different approaches. The first deals with the use of radio signals to study and understand the morphology of this phenomenon, while the second seeks to understand and model how much this phenomenon interferes with radio signals and consequently with the services these systems provide. Interest from several areas, particularly those that are life-critical, has increased in the concept of satellite multi-constellation, which consists of receiving, processing and using data from different navigation and positioning systems. Although there is a vast literature analyzing the effects of ionospheric scintillation on satellite navigation systems, studies using signals received from the Russian satellite positioning system (GLONASS) are still very rare. This work presents, for the first time in the Brazilian low-latitude sector, a statistical analysis of ionospheric scintillation data for all levels of magnetic activity obtained by a set of scintillation monitors that receive signals from the GLONASS system. Data collected from four stations were used in the analysis: Fortaleza, Presidente Prudente, São José dos Campos and Porto Alegre. The GLONASS L-band signals were analyzed for the period from December 21, 2012 to June 20, 2016, which includes the peak of solar cycle 24 in 2014. The main characteristics of scintillation presented in this study include: (1) a statistical evaluation of seasonal and solar-activity dependence, showing the chances that a user under similar geophysical conditions may be susceptible to the effects of ionospheric scintillation; (2) a temporal analysis based on the local-time distribution of scintillation at different seasons and intensity levels; and (3) an evaluation of the number of simultaneously affected channels and its effect on the dilution of precision (DOP) for GNSS users, indicating the times at which navigation will be most susceptible to such effects. Relevant results on these statistical characteristics of scintillation are presented and analyzed, providing useful information about the availability of the navigation system.
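
    Amplitude scintillation in studies of this kind is usually quantified by the S4 index, the normalized standard deviation of the detrended signal intensity I over a short averaging window (an assumption on our part; the abstract does not name its exact metric):

    ```latex
    S_4 = \sqrt{\frac{\langle I^2 \rangle - \langle I \rangle^2}{\langle I \rangle^2}}
    ```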

  19. Information on 'Overdiagnosis' in Breast Cancer Screening on Prominent United Kingdom- and Australia-Oriented Health Websites.

    PubMed

    Ghanouni, Alex; Meisel, Susanne F; Hersch, Jolyn; Waller, Jo; Wardle, Jane; Renzi, Cristina

    2016-01-01

    Health-related websites are an important source of information for the public. Increasing public awareness of overdiagnosis and ductal carcinoma in situ (DCIS) in breast cancer screening may facilitate more informed decision-making. This study assessed the extent to which such information was included on prominent health websites oriented towards the general public, and evaluated how it was explained. Cross-sectional study. Websites identified through Google searches in England (United Kingdom) and New South Wales (Australia) for "breast cancer screening" and further websites included based on our prior knowledge of relevant organisations. Content analysis was used to determine whether information on overdiagnosis or DCIS existed on each site, how the concepts were described, and what statistics were used to quantify overdiagnosis. After exclusions, ten UK websites and eight Australian websites were considered relevant and evaluated. They originated from charities, health service providers, government agencies, and an independent health organisation. Most contained some information on overdiagnosis (and/or DCIS). Descriptive information was similar across websites. Among UK websites, statistical information was often based on estimates from the Independent UK Panel on Breast Cancer Screening; the most commonly provided statistic was the ratio of breast cancer deaths prevented to overdiagnosed cases (1:3). A range of other statistics was included, such as the yearly number of overdiagnosed cases and the proportion of women screened who would be overdiagnosed. Information on DCIS and statistical information was less common on the Australian websites. Online information about overdiagnosis has become more widely available in 2015-16 compared with the limited accessibility indicated by older research. However, there may be scope to offer more information on DCIS and overdiagnosis statistics on Australian websites. Moreover, the variability in how estimates are presented across UK websites may be confusing for the general public.

  20. Comparison of the perceived relevance of oral biology reported by students and interns of a Pakistani dental college.

    PubMed

    Farooq, I; Ali, S

    2014-11-01

    The purpose of this study was to analyse and compare the perceived relevance of oral biology with dentistry as reported by dental students and interns and to investigate the most popular teaching approach and learning resource. A questionnaire aiming to ask about the relevance of oral biology to dentistry, most popular teaching method and learning resource was utilised in this study. Study groups encompassed second-year dental students who had completed their course and dental interns. The data were obtained and analysed statistically. The overall response rate for both groups was 60%. Both groups reported high relevance of oral biology to dentistry. Perception of dental interns regarding the relevance of oral biology to dentistry was higher than that of students. Both groups identified student presentations as the most important teaching method. Amongst the most important learning resources, textbooks were considered most imperative by interns, whereas lecture handouts received the highest importance score by students. Dental students and interns considered oral biology to be relevant to dentistry, although greater relevance was reported by interns. Year-wise advancement in dental education and training improves the perception of the students about the relevance of oral biology to dentistry. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  1. Affirmative Action Plans, January 1, 1994--December 31, 1994. Revision

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1994-02-16

    This document is the Affirmative Action Plan for January 1, 1994 through December 31, 1994 for the Lawrence Berkeley Laboratory, University of California ("LBL" or "the Laboratory"). This is an official document that will be presented upon request to the Office of Federal Contract Compliance Programs, US Department of Labor. The plan is prepared in accordance with Executive Order 11246 and 41 CFR Section 60-1 et seq. covering equal employment opportunity and will be updated during the year, if appropriate. Analyses included in this volume as required by government regulations are based on statistical comparisons. All statistical comparisons involve the use of geographic areas and various sources of statistics. The geographic areas and sources of statistics used here are in compliance with the government regulations, as interpreted. The use of any geographic area or statistic does not indicate agreement that the geographic area is the most appropriate or that the statistic is the most relevant. The use of such geographic areas and statistics is intended to have no significance outside the context of this Affirmative Action Plan, although, of course, such statistics and geographic areas will be used in good faith with respect to this Affirmative Action Plan.

  2. Climate Change Education in Earth System Science

    NASA Astrophysics Data System (ADS)

    Hänsel, Stephanie; Matschullat, Jörg

    2013-04-01

    The course "Atmospheric Research - Climate Change" is offered to master Earth System Science students within the specialisation "Climate and Environment" at the Technical University Bergakademie Freiberg. This module takes a comprehensive approach to climate sciences, reaching from the natural sciences background of climate change via the social components of the issue to the statistical analysis of changes in climate parameters. The course aims at qualifying the students to structure the physical and chemical basics of the climate system including relevant feedbacks. The students can evaluate relevant drivers of climate variability and change on various temporal and spatial scales and can transform knowledge from climate history to the present and the future. Special focus is given to the assessment of uncertainties related to climate observations and projections as well as the specific challenges of extreme weather and climate events. At the end of the course the students are able to critically reflect and evaluate climate change related results of scientific studies and related issues in media. The course is divided into two parts - "Climate Change" and "Climate Data Analysis" and encompasses two lectures, one seminar and one exercise. The weekly "Climate change" lecture transmits the physical and chemical background for climate variation and change. (Pre)historical, observed and projected climate changes and their effects on various sectors are being introduced and discussed regarding their implications for society, economics, ecology and politics. The related seminar presents and discusses the multiple reasons for controversy in climate change issues, based on various texts. Students train the presentation of scientific content and the discussion of climate change aspects. The biweekly lecture on "Climate data analysis" introduces the most relevant statistical tools and methods in climate science. Starting with checking data quality via tools of exploratory data analysis the approaches on climate time series, trend analysis and extreme events analysis are explained. Tools to describe relations within the data sets and significance tests further corroborate this. Within the weekly exercises that have to be prepared at home, the students work with self-selected climate data sets and apply the learned methods. The presentation and discussion of intermediate results by the students is as much part of the exercises as the illustration of possible methodological procedures by the teacher using exemplary data sets. The total time expenditure of the course is 270 hours with 90 attendance hours. The remainder consists of individual studies, e.g., preparation of discussions and presentations, statistical data analysis, and scientific writing. Different forms of examination are applied including written or oral examination, scientific report, presentation and portfolio work.

  3. Back to the basics: Identifying and addressing underlying challenges in achieving high quality and relevant health statistics for indigenous populations in Canada

    PubMed Central

    Smylie, Janet; Firestone, Michelle

    2015-01-01

    Canada is known internationally for excellence in both the quality and public policy relevance of its health and social statistics. There is a double standard however with respect to the relevance and quality of statistics for Indigenous populations in Canada. Indigenous specific health and social statistics gathering is informed by unique ethical, rights-based, policy and practice imperatives regarding the need for Indigenous participation and leadership in Indigenous data processes throughout the spectrum of indicator development, data collection, management, analysis and use. We demonstrate how current Indigenous data quality challenges including misclassification errors and non-response bias systematically contribute to a significant underestimate of inequities in health determinants, health status, and health care access between Indigenous and non-Indigenous people in Canada. The major quality challenge underlying these errors and biases is the lack of Indigenous specific identifiers that are consistent and relevant in major health and social data sources. The recent removal of an Indigenous identity question from the Canadian census has resulted in further deterioration of an already suboptimal system. A revision of core health data sources to include relevant, consistent, and inclusive Indigenous self-identification is urgently required. These changes need to be carried out in partnership with Indigenous peoples and their representative and governing organizations. PMID:26793283

  4. Back to basics: an introduction to statistics.

    PubMed

    Halfens, R J G; Meijers, J M M

    2013-05-01

    In the second in the series, Professor Ruud Halfens and Dr Judith Meijers give an overview of statistics, both descriptive and inferential. They describe the first principles of statistics, including some relevant inferential tests.

  5. Relevant Scatterers Characterization in SAR Images

    NASA Astrophysics Data System (ADS)

    Chaabouni, Houda; Datcu, Mihai

    2006-11-01

    Recognizing scenes in single-look, meter-resolution Synthetic Aperture Radar (SAR) images requires the capability to identify relevant signal signatures under conditions of variable image acquisition geometry and arbitrary object poses and configurations. Among the methods to detect relevant scatterers in SAR images, we can mention internal coherence. The SAR spectrum, split in azimuth, generates a series of images that preserve high coherence only for particular object scattering. The detection of relevant scatterers can be done by correlation study or Independent Component Analysis (ICA) methods. The present article reviews the state of the art in SAR internal correlation analysis and proposes further extensions using elements of inference based on information theory applied to complex-valued signals. The set of azimuth-look images is analyzed using mutual information measures, and an equivalent channel capacity is derived. The localization of the "target" requires analysis in a small image window, thus resulting in imprecise estimation of the second-order statistics of the signal. For better precision, a Hausdorff measure is introduced. The method is applied to detect and characterize relevant objects in urban areas.
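
    Mutual information between two azimuth sub-look images can be estimated from their joint histogram; the compact sketch below is a generic histogram-based estimate, not the article's exact complex-valued formulation:

    ```python
    import numpy as np

    def mutual_information(img_a, img_b, bins=64):
        """Histogram-based mutual information between two co-registered images."""
        joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
        pxy = joint / joint.sum()                 # joint probability estimate
        px = pxy.sum(axis=1, keepdims=True)       # marginal of image a
        py = pxy.sum(axis=0, keepdims=True)       # marginal of image b
        mask = pxy > 0
        return float(np.sum(pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])))
    ```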

  6. Active browsing using similarity pyramids

    NASA Astrophysics Data System (ADS)

    Chen, Jau-Yuen; Bouman, Charles A.; Dalton, John C.

    1998-12-01

    In this paper, we describe a new approach to managing large image databases, which we call active browsing. Active browsing integrates relevance feedback into the browsing environment, so that users can modify the database's organization to suit the desired task. Our method is based on a similarity pyramid data structure, which hierarchically organizes the database, so that it can be efficiently browsed. At coarse levels, the similarity pyramid allows users to view the database as large clusters of similar images. Alternatively, users can 'zoom into' finer levels to view individual images. We discuss relevance feedback for the browsing process, and argue that it is fundamentally different from relevance feedback for more traditional search-by-query tasks. We propose two fundamental operations for active browsing: pruning and reorganization. Both of these operations depend on a user-defined relevance set, which represents the image or set of images desired by the user. We present statistical methods for accurately pruning the database, and we propose a new 'worm hole' distance metric for reorganizing the database, so that members of the relevance set are grouped together.

  7. Selecting relevant 3D image features of margin sharpness and texture for lung nodule retrieval.

    PubMed

    Ferreira, José Raniery; de Azevedo-Marques, Paulo Mazzoncini; Oliveira, Marcelo Costa

    2017-03-01

    Lung cancer is the leading cause of cancer-related deaths in the world. Its diagnosis is a challenging task for specialists due to several aspects of the classification of lung nodules. It is therefore important to integrate content-based image retrieval methods into the lung nodule classification process, since they are capable of retrieving similar cases from databases that were previously diagnosed. However, this mechanism depends on extracting relevant image features in order to obtain high efficiency. The goal of this paper is to perform the selection of 3D image features of margin sharpness and texture that can be relevant in the retrieval of similar cancerous and benign lung nodules. A total of 48 3D image attributes were extracted from the nodule volume. Border sharpness features were extracted from perpendicular lines drawn over the lesion boundary. Second-order texture features were extracted from a co-occurrence matrix. Relevant features were selected by a correlation-based method and a statistical significance analysis. Retrieval performance was assessed according to the nodule's potential malignancy on the 10 most similar cases and by the parameters of precision and recall. Statistically significant features reduced retrieval performance. The correlation-based method selected 2 margin sharpness attributes and 6 texture attributes and obtained higher precision compared to all 48 extracted features on similar nodule retrieval. Feature-space dimensionality reduction of 83% yielded higher retrieval performance and proved to be a computationally low-cost method of retrieving similar nodules for the diagnosis of lung cancer.
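
    Retrieval was scored by precision and recall over the 10 most similar cases; both metrics follow directly from the retrieved labels. A small sketch (generic definitions, not the paper's evaluation code):

    ```python
    def precision_recall_at_k(retrieved_labels, query_label, total_relevant, k=10):
        """Precision and recall for the top-k retrieved cases.

        retrieved_labels: labels of retrieved nodules, most similar first.
        query_label: malignancy label of the query nodule.
        total_relevant: number of database nodules sharing the query label.
        """
        top_k = retrieved_labels[:k]
        hits = sum(1 for lbl in top_k if lbl == query_label)
        precision = hits / k
        recall = hits / total_relevant if total_relevant else 0.0
        return precision, recall

    # Example: 7 of the 10 nearest nodules share the query's label, 25 relevant overall
    print(precision_recall_at_k(["m"] * 7 + ["b"] * 3, "m", total_relevant=25))
    ```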

  8. Establishment of a Uniform Format for Data Reporting of Structural Material Properties for Reliability Analysis

    DTIC Science & Technology

    1994-06-30

    …tip Opening Displacement (CTOD) Fracture Toughness Measurement". The method has found application in elastic-plastic fracture mechanics (EPFM)… 6.1 Proposed Material Property Database Format and Hierarchy; 6.2 Sample Application of the Material Property Database… the E 49.05 sub-committee. The relevant quality indicators applicable to the present program are: source of data, statistical basis of data…

  9. Overview and Results of ISS Space Medicine Operations Team (SMOT) Activities

    NASA Technical Reports Server (NTRS)

    Johnson, H. Magee; Sargsyan, Ashot E.; Armstrong, Cheryl; McDonald, P. Vernon; Duncan, James M.; Bogomolov, V. V.

    2007-01-01

    The Space Medicine Operations Team (SMOT) was created to integrate International Space Station (ISS) Medical Operations, promote awareness of all Partners, provide emergency response capability and management, provide operational input from all Partners for medically relevant concerns, and provide a source of medical input to ISS Mission Management. The viewgraph presentation provides an overview of educational objectives, purpose, operations, products, statistics, and its use in off-nominal situations.

  10. Fluid-particle characteristics in fully-developed cluster-induced turbulence

    NASA Astrophysics Data System (ADS)

    Capecelatro, Jesse; Desjardins, Olivier; Fox, Rodney

    2014-11-01

    In this study, we present a theoretical framework for collisional fluid-particle turbulence. To identify the key mechanisms responsible for energy exchange between the two phases, an Eulerian-Lagrangian strategy is used to simulate fully-developed cluster-induced turbulence (CIT) under a range of Reynolds numbers, where fluctuations in particle concentration generate and sustain the carrier-phase turbulence. Using a novel filtering approach, a length-scale separation between the correlated particle velocity and uncorrelated granular temperature (GT) is achieved. This separation allows us to extract the instantaneous Eulerian volume fraction, velocity and GT fields from the Lagrangian data. Direct comparisons can thus be made with the relevant terms that appear in the multiphase turbulence model. It is shown that the granular pressure is highly anisotropic, and thus additional transport equations (as opposed to a single equation for GT) are necessary in formulating a predictive multiphase turbulence model. In addition to reporting the relevant contributions to the Reynolds stresses of each phase, two-point statistics, integral length/timescales, averages conditioned on the local volume fraction, and PDFs of the key multiphase statistics are presented and discussed. The research reported in this paper is partially supported by the HPC equipment purchased through U.S. National Science Foundation MRI Grant Number CNS 1229081 and CRI Grant Number 1205413.

  11. Statistical Study in the mid-altitude cusp region: wave and particle data comparison using a normalized cusp crossing duration

    NASA Astrophysics Data System (ADS)

    Grison, B.; Escoubet, C. P.; Pitout, F.; Cornilleau-Wehrlin, N.; Dandouras, I.; Lucek, E.

    2009-04-01

    In the mid-altitude cusp region, the DC magnetic field presents a diamagnetic cavity due to the intense earthward ion flux coming from the magnetosheath. Strong ultra-low-frequency (ULF) magnetic activity is also commonly observed in this region. Most mid-altitude cusp statistical studies have focused on the location of the cusp and its dependence on, and response to, solar wind, interplanetary magnetic field, and dipole tilt angle parameters. In our study we use the database built by Pitout et al. (2006) to study the link between the wave power in the ULF range (0.35-10 Hz) measured by the STAFF SC instrument, the ion plasma properties as measured by the CIS (and CODIF) instrument, and the diamagnetic cavity in the mid-altitude cusp region as seen in FGM data. To compare the different crossings we do not use the cusp position and dynamics; instead we use a normalized cusp crossing duration, which permits easy averaging of the properties over a large number of crossings. As usual in the cusp, it is particularly relevant to sort the crossings by the corresponding interplanetary magnetic field (IMF) orientation when analysing the results. In particular, we try to find out which parameter the strong wave activity is most closely linked with. The global statistics confirm previous single-case observations that noted simultaneity between ion injections and wave-activity enhancements. We will also present results concerning other ion parameters and the diamagnetic cavity observed in the mid-altitude cusp region.

  12. Features of statistical dynamics in a finite system

    NASA Astrophysics Data System (ADS)

    Yan, Shiwei; Sakata, Fumihiko; Zhuo, Yizhong

    2002-03-01

    We study features of statistical dynamics in a finite Hamiltonian system composed of a relevant one-degree-of-freedom system coupled to an irrelevant multidegree-of-freedom system through a weak interaction. Special attention is paid to how the statistical dynamics changes depending on the number of degrees of freedom in the irrelevant system. It is found that the macrolevel statistical aspects are strongly related to the appearance of microlevel chaotic motion, and that dissipation of the relevant motion proceeds through three distinct stages: dephasing, statistical relaxation, and equilibrium regimes. It is clarified that the dynamical description and the conventional transport approach provide almost the same macrolevel and microlevel mechanisms only for a system with a very large number of irrelevant degrees of freedom. It is also shown that the statistical relaxation in the finite system is an anomalous diffusion process and that the fluctuation effects have a finite correlation time.

  13. Features of statistical dynamics in a finite system.

    PubMed

    Yan, Shiwei; Sakata, Fumihiko; Zhuo, Yizhong

    2002-03-01

    We study features of statistical dynamics in a finite Hamiltonian system composed of a relevant one-degree-of-freedom system coupled to an irrelevant multidegree-of-freedom system through a weak interaction. Special attention is paid to how the statistical dynamics changes depending on the number of degrees of freedom in the irrelevant system. It is found that the macrolevel statistical aspects are strongly related to the appearance of microlevel chaotic motion, and that dissipation of the relevant motion proceeds through three distinct stages: dephasing, statistical relaxation, and equilibrium regimes. It is clarified that the dynamical description and the conventional transport approach provide almost the same macrolevel and microlevel mechanisms only for a system with a very large number of irrelevant degrees of freedom. It is also shown that the statistical relaxation in the finite system is an anomalous diffusion process and that the fluctuation effects have a finite correlation time.

  14. Self-control and frequency of model presentation: effects on learning a ballet passé relevé.

    PubMed

    Fagundes, Julie; Chen, David D; Laguna, Patricia

    2013-08-01

    The purpose of this experiment was to examine the combined effects of self-control and frequency of model presentation on learning a complex motor skill, i.e., the ballet passé relevé. Before practice started, self-control participants were asked to choose two viewings or six viewings (before practice and then every five trials), and the externally controlled groups were yoked to their self-control counterparts. All participants completed 15 acquisition trials followed by 5 trials for the immediate retention test and 5 trials for the delayed retention test 48 hours later. Dependent variables included cognitive representation scores, physical reproduction rankings, and balance time. Statistical analyses indicated that, under limited physical practice conditions, self-control and a higher frequency of model presentation facilitated the development of cognitive representation but did not produce further benefits in movement reproduction and balance time. The results are discussed with respect to social cognitive theory. Published by Elsevier B.V.

  15. IEA Bioenergy Countries' Report: Bioenergy policies and status of implementation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bacovsky, Dina; Ludwiczek, Nikolaus; Pointner, Christian

    2016-08-05

    This report was prepared from IEA statistical data, information from IRENA, and IEA Bioenergy Tasks’ country reports, combined with data provided by the IEA Bioenergy Executive Committee. All individual country reports were reviewed by the national delegates to the IEA Bioenergy Executive Committee, who have approved the content. In the first section of each country report, national renewable energy targets are presented (first table in each country report), and the main pieces of national legislation are discussed. In the second section of each country report, the total primary energy supply (TPES) by resources and the contribution of bioenergy are presented. All data is taken from IEA statistics for the year 2014. Where 2014 data was not available, 2013 data was used. It is worth noting that data reported in national statistics can differ from the IEA data presented, as the reporting categories and definitions are different. In the third section of each country report, the research focus related to bioenergy is discussed. Relevant funding programs, major research institutes and projects are described. In the fourth section, recent major bioenergy developments are described. Finally, in the fifth section, links to sources of information are provided.

  16. Transportation energy data book

    NASA Astrophysics Data System (ADS)

    Davis, S. C.; Hu, P. S.

    1991-01-01

    The Transportation Energy Data Book: Edition 11 is a statistical compendium prepared and published by Oak Ridge National Laboratory (ORNL) under contract with the Office of Transportation Technologies in the Department of Energy (DOE). Designed for use as a desk-top reference, the data book represents an assembly and display of statistics and information that characterize transportation activity, and presents data on other factors that influence transportation energy use. The purpose of this document is to present relevant statistical data in the form of tables and graphs. Each of the major transportation modes - highway, air, water, rail, pipeline - is treated in separate chapters or sections. Chapter 1 compares U.S. transportation data with data from seven other countries. Aggregate energy use and energy supply data for all modes are presented in Chapter 2. The highway mode, which accounts for over three-fourths of total transportation energy consumption, is dealt with in Chapter 3. Topics in this chapter include automobiles, trucks, buses, fleet automobiles, Federal standards, fuel economies, and household data. Chapter 4 is a new addition to the data book series, containing information on alternative fuels and alternatively-fueled vehicles. The last chapter, Chapter 5, covers each of the nonhighway modes: air, water, pipeline, and rail, respectively.

  17. Surveys Assessing Students' Attitudes toward Statistics: A Systematic Review of Validity and Reliability

    ERIC Educational Resources Information Center

    Nolan, Meaghan M.; Beran, Tanya; Hecker, Kent G.

    2012-01-01

    Students with positive attitudes toward statistics are likely to show strong academic performance in statistics courses. Multiple surveys measuring students' attitudes toward statistics exist; however, a comparison of the validity and reliability of interpretations based on their scores is needed. A systematic review of relevant electronic…

  18. Assessing the prevalence and clinical relevance of positive abdominal and pelvic CT findings in senior patients presenting to the emergency department.

    PubMed

    Alabousi, Abdullah; Patlas, Michael N; Meshki, Malek; Monteiro, Sandra; Katz, Douglas S

    2016-04-01

    The purpose of our study was to retrospectively evaluate the prevalence and clinical relevance of positive abdominal and pelvic CT findings for patients 65 years of age and older, when compared with all other scanned adult Emergency Department (ED) patients, at a single tertiary care hospital. Our hypothesis was that there is an increased prevalence and clinical relevance of positive abdominal/pelvic CT findings in senior patients. A research ethics board-approved retrospective review of all adult patients who underwent an emergency CT of the abdomen and pelvis for acute nontraumatic abdominal and/or pelvic signs and symptoms was performed. Two thousand one hundred two patients between October 1, 2011, and September 30, 2013, were reviewed. Six hundred thirty-one patients were included in the <65 group (298 men and 333 women; mean age 46, age range 18-64), and 462 were included in the >65 group (209 men and 253 women; mean age 77.6, age range 65-99). Overall, there were more positive CT findings for patients <65 (389 positive cases, 61.6 %) compared with the >65 group (257 positive cases, 55.6 %), which was a statistically significant difference (p < 0.03). Moreover, with the exception of complicated appendicitis cases, which were more common in the >65 group, there were no statistically significant differences in the clinical/surgical relevance of the positive CT findings between the two groups. The findings of our retrospective study therefore refute our hypothesis that there is an increased prevalence of positive abdominal CT findings in patients >65. This may be related to ED physicians at our institution being more hesitant to order CT examinations for the younger population, presumably due to radiation concerns. However, older patients in our series were more likely to present with complicated appendicitis, and a lower threshold for ordering CT examinations of the abdomen and pelvis in this patient population should therefore be considered.

  19. Statistical interpretation of machine learning-based feature importance scores for biomarker discovery.

    PubMed

    Huynh-Thu, Vân Anh; Saeys, Yvan; Wehenkel, Louis; Geurts, Pierre

    2012-07-01

    Univariate statistical tests are widely used for biomarker discovery in bioinformatics. These procedures are simple and fast, and their output is easily interpretable by biologists, but they can only identify variables that provide a significant amount of information in isolation from the other variables. As biological processes are expected to involve complex interactions between variables, univariate methods thus potentially miss some informative biomarkers. Variable relevance scores provided by machine learning techniques, however, are potentially able to highlight multivariate interacting effects, but unlike the p-values returned by univariate tests, these relevance scores are usually not statistically interpretable. This lack of interpretability hampers the determination of a relevance threshold for extracting a feature subset from the rankings and also prevents the wide adoption of these methods by practitioners. We evaluated several existing and novel procedures that extract relevant features from rankings derived from machine learning approaches. These procedures replace the relevance scores with measures that can be interpreted in a statistical way, such as p-values, false discovery rates, or family-wise error rates, for which it is easier to determine a significance level. Experiments were performed on several artificial problems as well as on real microarray datasets. Although the methods differ in terms of computing times and the tradeoff they achieve between false positives and false negatives, some of them greatly help in the extraction of truly relevant biomarkers and should thus be of great practical interest for biologists and physicians. As a side conclusion, our experiments also clearly highlight that using model performance as a criterion for feature selection is often counter-productive. Python source codes of all tested methods, as well as the MATLAB scripts used for data simulation, can be found in the Supplementary Material.

  20. Hybrid statistics-simulations based method for atom-counting from ADF STEM images.

    PubMed

    De Wael, Annelies; De Backer, Annick; Jones, Lewys; Nellist, Peter D; Van Aert, Sandra

    2017-06-01

    A hybrid statistics-simulations based method for atom-counting from annular dark field scanning transmission electron microscopy (ADF STEM) images of monotype crystalline nanostructures is presented. Different atom-counting methods already exist for model-like systems. However, the increasing relevance of radiation damage in the study of nanostructures demands a method that allows atom-counting from low dose images with a low signal-to-noise ratio. Therefore, the hybrid method directly includes prior knowledge from image simulations into the existing statistics-based method for atom-counting, and accounts in this manner for possible discrepancies between actual and simulated experimental conditions. It is shown by means of simulations and experiments that this hybrid method outperforms the statistics-based method, especially for low electron doses and small nanoparticles. The analysis of a simulated low dose image of a small nanoparticle suggests that this method allows for far more reliable quantitative analysis of beam-sensitive materials. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Evaluation of a Partial Genome Screening of Two Asthma Susceptibility Regions Using Bayesian Network Based Bayesian Multilevel Analysis of Relevance

    PubMed Central

    Antal, Péter; Kiszel, Petra Sz.; Gézsi, András; Hadadi, Éva; Virág, Viktor; Hajós, Gergely; Millinghoffer, András; Nagy, Adrienne; Kiss, András; Semsei, Ágnes F.; Temesi, Gergely; Melegh, Béla; Kisfali, Péter; Széll, Márta; Bikov, András; Gálffy, Gabriella; Tamási, Lilla; Falus, András; Szalai, Csaba

    2012-01-01

    Genetic studies indicate a high number of potential factors related to asthma. Based on earlier linkage analyses, we selected the 11q13 and 14q22 asthma susceptibility regions, for which we designed a partial genome screening study using 145 SNPs in 1201 individuals (436 asthmatic children and 765 controls). The results were evaluated with traditional frequentist methods, and we applied a new statistical method, called Bayesian network based Bayesian multilevel analysis of relevance (BN-BMLA). This method uses Bayesian network representation to provide detailed characterization of the relevance of factors, such as joint significance, the type of dependency, and multi-target aspects. We estimated posteriors for these relations within the Bayesian statistical framework, in order to assess whether a variable is directly relevant or whether its association is only mediated. With frequentist methods one SNP (rs3751464 in the FRMD6 gene) provided evidence for an association with asthma (OR = 1.43(1.2–1.8); p = 3×10^−4). The possible role of the FRMD6 gene in asthma was also confirmed in an animal model and human asthmatics. In the BN-BMLA analysis altogether 5 SNPs in 4 genes were found relevant in connection with the asthma phenotype: PRPF19 on chromosome 11, and FRMD6, PTGER2 and PTGDR on chromosome 14. In a subsequent step a partial dataset containing rhinitis and further clinical parameters was used, which allowed the analysis of the relevance of SNPs for asthma and multiple targets. These analyses suggested that SNPs in the AHNAK and MS4A2 genes were indirectly associated with asthma. This paper indicates that BN-BMLA explores the relevant factors more comprehensively than traditional statistical methods and extends the scope of strong relevance based methods to include partial relevance, global characterization of relevance, and multi-target relevance. PMID:22432035

  2. Conducting Simulation Studies in the R Programming Environment.

    PubMed

    Hallgren, Kevin A

    2013-10-12

    Simulation studies allow researchers to answer specific questions about data analysis, statistical power, and best practices for obtaining accurate results in empirical research. Despite the benefits that simulation research can provide, many researchers are unfamiliar with available tools for conducting their own simulation studies. The use of simulation studies need not be restricted to researchers with advanced skills in statistics and computer programming, and such methods can be implemented by researchers with a variety of abilities and interests. The present paper provides an introduction to methods used for running simulation studies using the R statistical programming environment and is written for individuals with minimal experience running simulation studies or using R. The paper describes the rationale and benefits of using simulations and introduces R functions relevant for many simulation studies. Three examples illustrate different applications for simulation studies, including (a) the use of simulations to answer a novel question about statistical analysis, (b) the use of simulations to estimate statistical power, and (c) the use of simulations to obtain confidence intervals of parameter estimates through bootstrapping. Results and fully annotated syntax from these examples are provided.
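
    The bootstrapping example described above (item (c)) is easy to sketch. The paper works in R; the following minimal Python analogue (all data and names are invented here, purely for illustration) shows a percentile bootstrap for the mean:

    ```python
    import numpy as np

    rng = np.random.default_rng(seed=42)

    # Hypothetical sample: 50 observations from a skewed distribution.
    sample = rng.exponential(scale=2.0, size=50)

    # Percentile bootstrap: resample with replacement many times and take
    # the 2.5th and 97.5th percentiles of the resampled means as a 95% CI.
    n_boot = 10_000
    boot_means = np.array([
        rng.choice(sample, size=sample.size, replace=True).mean()
        for _ in range(n_boot)
    ])
    ci_low, ci_high = np.percentile(boot_means, [2.5, 97.5])
    print(f"mean = {sample.mean():.3f}, 95% bootstrap CI = [{ci_low:.3f}, {ci_high:.3f}]")
    ```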

  3. Understanding common statistical methods, Part I: descriptive methods, probability, and continuous data.

    PubMed

    Skinner, Carl G; Patel, Manish M; Thomas, Jerry D; Miller, Michael A

    2011-01-01

    Statistical methods are pervasive in medical research and general medical literature. Understanding general statistical concepts will enhance our ability to critically appraise the current literature and ultimately improve the delivery of patient care. This article intends to provide an overview of the common statistical methods relevant to medicine.

  4. MSW Students' Perceptions of Relevance and Application of Statistics: Implications for Field Education

    ERIC Educational Resources Information Center

    Davis, Ashley; Mirick, Rebecca G.

    2015-01-01

    Many social work students feel anxious when taking a statistics course. Their attitudes, beliefs, and behaviors after learning statistics are less known. However, such information could help instructors support students' ongoing development of statistical knowledge. With a sample of MSW students (N = 101) in one program, this study examined…

  5. Critical Bursts in Filtration

    NASA Astrophysics Data System (ADS)

    Bianchi, Filippo; Thielmann, Marcel; de Arcangelis, Lucilla; Herrmann, Hans Jürgen

    2018-01-01

    Particle detachment bursts during the flow of suspensions through porous media are a phenomenon that can severely affect the efficiency of deep bed filters. Despite its relevance in several industrial fields, little is known about the statistical properties and the temporal organization of these events. We present experiments in which suspensions of deionized water carrying quartz particles are pushed with a peristaltic pump through a filter of glass beads, while the pressure drop, flux, and suspension solid fraction are measured simultaneously. We find that the burst size distribution scales consistently with a power law, suggesting that we are in the presence of a novel experimental realization of a self-organized critical system. Temporal correlations are present in the time series, as in other phenomena such as earthquakes or neuronal activity bursts, and an analog of Omori's law can also be shown. The understanding of burst statistics could provide novel insights in different fields, e.g., in the filter and petroleum industries.
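
    For readers who want to reproduce this kind of power-law analysis, a common approach (not necessarily the authors' own pipeline) is the continuous maximum-likelihood exponent estimator of Clauset, Shalizi, and Newman. A minimal Python sketch on synthetic "burst sizes":

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic burst sizes: a continuous power law with exponent alpha = 2.5,
    # generated by inverse-transform sampling above x_min.
    x_min, alpha_true, n = 1.0, 2.5, 5000
    bursts = x_min * (1.0 - rng.random(n)) ** (-1.0 / (alpha_true - 1.0))

    # Continuous MLE (Clauset, Shalizi & Newman, 2009):
    # alpha_hat = 1 + n / sum(ln(x_i / x_min)), for x_i >= x_min.
    tail = bursts[bursts >= x_min]
    alpha_hat = 1.0 + tail.size / np.log(tail / x_min).sum()
    print(f"estimated exponent: {alpha_hat:.3f} (true: {alpha_true})")
    ```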

  6. Computer assisted outcomes research in orthopedics: total joint replacement.

    PubMed

    Arslanian, C; Bond, M

    1999-06-01

    Long-term studies are needed to determine clinically relevant outcomes within the practice of orthopedic surgery. Historically, the patient's subjective feelings of quality of life have been largely ignored. However, there has been a strong movement toward measuring perceived quality of life through such instruments as the SF-36. Results from a large database from an orthopedic practice are presented. First, computerized data entry using touch screen technology is not only cost effective but also user friendly. Second, patients undergoing hip or knee arthroplasty surgeries make statistically significant improvements in seven of the eight domains of the SF-36 in the first 3 months after surgery. Additional statistically significant improvements over the next 6 to 12 months are also seen. The data are presented here in detail to demonstrate the benefits of a patient outcomes program, to enhance the understanding and use of outcomes data, and to encourage further work in outcomes measurement in orthopedics.

  7. Nuclear magnetic resonance (NMR) study of the effect of cisplatin on the metabolic profile of MG-63 osteosarcoma cells.

    PubMed

    Duarte, Iola F; Lamego, Ines; Marques, Joana; Marques, M Paula M; Blaise, Benjamin J; Gil, Ana M

    2010-11-05

    In the present study, (1)H HRMAS NMR spectroscopy was used to assess the changes in the intracellular metabolic profile of MG-63 human osteosarcoma (OS) cells induced by the chemotherapy agent cisplatin (CDDP) at different times of exposure. Multivariate analysis was applied to the cells spectra, enabling consistent variation patterns to be detected and drug-specific metabolic effects to be identified. Statistical recoupling of variables (SRV) analysis and spectral integration enabled the most relevant spectral changes to be evaluated, revealing significant time-dependent alterations in lipids, choline-containing compounds, some amino acids, polyalcohols, and nitrogenated bases. The metabolic relevance of these compounds in the response of MG-63 cells to CDDP treatment is discussed.

  8. Health-Related Quality of Life up to Six Years After {sup 125}I Brachytherapy for Early-Stage Prostate Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roeloffzen, Ellen M.A., E-mail: E.M.A.Roeloffzen@UMCUtrecht.n; Lips, Irene M.; Gellekom, Marion P.R. van

    2010-03-15

    Purpose: Health-related quality of life (HRQOL) after prostate brachytherapy has been extensively described in published reports, but hardly any long-term data are available. The aim of the present study was to prospectively assess long-term HRQOL 6 years after {sup 125}I prostate brachytherapy. Methods and Materials: A total of 127 patients treated with {sup 125}I brachytherapy for early-stage prostate cancer between December 2000 and June 2003 completed a HRQOL questionnaire at five time-points: before treatment and 1 month, 6 months, 1 year, and 6 years after treatment. The questionnaire included the RAND-36 generic health survey, the cancer-specific European Organization for Research and Treatment of Cancer core questionnaire (EORTC QLQ-C30), and the tumor-specific EORTC prostate cancer module (EORTC-PR25). A change in a score of >=10 points was considered clinically relevant. Results: Overall, the HRQOL at 6 years after {sup 125}I prostate brachytherapy did not significantly differ from baseline. Although a statistically significant deterioration in HRQOL at 6 years was seen for urinary symptoms, bowel symptoms, pain, physical functioning, and sexual activity (p <.01), most changes were not clinically relevant. A statistically significant improvement at 6 years was seen for mental health, emotional functioning, and insomnia (p <.01). The only clinically relevant changes were seen for emotional functioning and sexual activity. Conclusion: This is the first study presenting prospective HRQOL data up to 6 years after {sup 125}I prostate brachytherapy. HRQOL scores returned to approximately baseline values at 1 year and remained stable up to 6 years after treatment. {sup 125}I prostate brachytherapy did not adversely affect patients' long-term HRQOL.

  9. The large sample size fallacy.

    PubMed

    Lantz, Björn

    2013-06-01

    Significance in the statistical sense has little to do with significance in the common practical sense. Statistical significance is a necessary but not a sufficient condition for practical significance. Hence, results that are extremely statistically significant may be highly nonsignificant in practice. The degree of practical significance is generally determined by the size of the observed effect, not the p-value. The results of studies based on large samples are often characterized by extreme statistical significance despite small or even trivial effect sizes. Interpreting such results as significant in practice without further analysis is referred to as the large sample size fallacy in this article. The aim of this article is to explore the relevance of the large sample size fallacy in contemporary nursing research. Relatively few nursing articles display explicit measures of observed effect sizes or include a qualitative discussion of observed effect sizes. Statistical significance is often treated as an end in itself. Effect sizes should generally be calculated and presented along with p-values for statistically significant results, and observed effect sizes should be discussed qualitatively through direct and explicit comparisons with the effects in related literature. © 2012 Nordic College of Caring Science.
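
    The fallacy described above is easy to demonstrate numerically: with a large enough sample, a trivial effect produces an extreme p-value while the standardized effect size stays negligible. A minimal Python sketch on synthetic data (all numbers are illustrative):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Two groups with a trivially small true difference (0.05 SD), n = 50,000 each.
    a = rng.normal(loc=0.00, scale=1.0, size=50_000)
    b = rng.normal(loc=0.05, scale=1.0, size=50_000)

    t, p = stats.ttest_ind(a, b)

    # Cohen's d: standardized effect size using the pooled standard deviation.
    pooled_sd = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)
    d = (b.mean() - a.mean()) / pooled_sd

    print(f"p = {p:.2e} (extremely 'significant'), Cohen's d = {d:.3f} (trivial effect)")
    ```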

  10. Use of observational and model-derived fields and regime model output statistics in mesoscale forecasting

    NASA Technical Reports Server (NTRS)

    Forbes, G. S.; Pielke, R. A.

    1985-01-01

    Various empirical and statistical weather-forecasting studies which utilize stratification by weather regime are described. Objective classification was used to determine the weather regime in some studies. In other cases the weather pattern was determined on the basis of a parameter representing the physical and dynamical processes relevant to the anticipated mesoscale phenomena, such as low-level moisture convergence and convective precipitation, or the Froude number and the occurrence of cold-air damming. For mesoscale phenomena already in existence, new forecasting techniques were developed. The use of cloud models in operational forecasting is discussed. Models to calculate the spatial scales of forcings and the resultant response for mesoscale systems are presented. The use of these models to represent the climatologically most prevalent systems, and to perform case-by-case simulations, is reviewed. Operational implementation of mesoscale data into weather forecasts, using both actual simulation output and model-output statistics, is discussed.

  11. Annual Research Briefs, 1987

    NASA Technical Reports Server (NTRS)

    Moin, Parviz; Reynolds, William C.

    1988-01-01

    Lagrangian techniques have found widespread application to the prediction and understanding of turbulent transport phenomena and have yielded satisfactory results for different cases of shear flow problems. However, it must be kept in mind that in most experiments what is really available are Eulerian statistics, and it is far from obvious how to extract from them the information relevant to the Lagrangian behavior of the flow; in consequence, Lagrangian models still include some hypotheses for which no adequate supporting evidence has until now been available. Direct numerical simulation of turbulence offers a new way to obtain Lagrangian statistics and so verify the validity of the current predictive models and the accuracy of their results. After the pioneering work of Riley (Riley and Patterson, 1974) in the 1970s, some such results have just appeared in the literature (Lee et al.; Yeung and Pope). The present contribution follows in part similar lines, but focuses on two-particle statistics and comparison with existing models.

  12. Fundamentals of nuclear medicine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alazraki, N.P.; Mishkin, F.S.

    1988-01-01

    The book begins with basic science and statistics relevant to nuclear medicine, and specific organ systems are addressed in separate chapters. A section of the text also covers imaging of groups of disease processes (eg, trauma, cancer). The authors present a comparison between nuclear medicine techniques and other diagnostic imaging studies. A table is given which comments on sensitivities and specificities of common nuclear medicine studies. The sensitivities and specificities are categorized as very high, high, moderate, and so forth.

  13. Analysis of the economic structure of the eating-out sector: The case of Spain.

    PubMed

    Cabiedes-Miragaya, Laura

    2017-12-01

    The objective of this article is to analyse the structure of the Spanish eating-out sector from an economic point of view, and more specifically, from the supply perspective. This aspect has been studied less than the demand side, almost certainly due to the gaps which exist in available official statistics in Spain, and which have been filled basically with consumer surveys. For this reason, focus is also placed on the economic relevance of the sector and attention is drawn to the serious shortcomings regarding official statistics in this domain, in contrast to the priority that hotel industry statistics have traditionally received in Spain. Based on official statistics, a descriptive analysis was carried out, focused mainly, though not exclusively, on diverse structural aspects of the sector. Special emphasis was placed on issues such as business demography (for instance, number and types of enterprises, survival rates, size distribution, and age structure), market concentration and structure of costs. Among other findings, the analysis allowed us to conclude that: part of the sector is more concentrated than it may at first appear to be; the dual structure of the sector described by the literature in relation to other countries is also present in the Spanish case; and the impact of ICTs (Information and Communication Technologies) on the sector is, and will foreseeably continue to be, particularly relevant. The main conclusion of this study refers to the fact that consumers have gained prominence in their contribution to shaping the structure of the sector. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Statistical Study of Aircraft Icing Probabilities at the 700- and 500- Millibar Levels over Ocean Areas in the Northern Hemisphere

    NASA Technical Reports Server (NTRS)

    Perkins, Porter J.; Lewis, William; Mulholland, Donald R.

    1957-01-01

    A statistical study is made of icing data reported from weather reconnaissance aircraft flown by Air Weather Service (USAF). The weather missions studied were flown at fixed flight levels of 500 millibars (18,000 ft) and 700 millibars (10,000 ft) over wide areas of the Pacific, Atlantic, and Arctic Oceans. This report is presented as part of a program conducted by the NACA to obtain extensive icing statistics relevant to aircraft design and operation. The thousands of in-flight observations recorded over a 2- to 4-year period provide reliable statistics on icing encounters for the specific areas, altitudes, and seasons included in the data. The relative frequencies of icing occurrence are presented, together with the estimated icing probabilities and the relation of these probabilities to the frequencies of flight in clouds and cloud temperatures. The results show that aircraft operators can expect icing probabilities to vary widely throughout the year from near zero in the cold Arctic areas in winter up to 7 percent in areas where greater cloudiness and warmer temperatures prevail. The data also reveal a general tendency of colder cloud temperatures to reduce the probability of icing in equally cloudy conditions.

  15. A transdisciplinary focus on drug abuse prevention: an introduction.

    PubMed

    Sussman, Steve; Stacy, Alan W; Johnson, C Anderson; Pentz, Mary Ann; Robertson, Elizabeth

    2004-01-01

    This article introduces the scope of the Special Issue. A variety of scientific disciplines are brought together to establish theoretical integration of the arenas of drug use, misuse, "abuse," and drug misuse prevention. Transdisciplinary scientific collaboration (TDSC) is utilized as a process of integration. Introductory comments regarding the strengths and limitations of TDSC are presented. Then, the relevance of genetics to substance misuse and substance misuse prevention is presented. Next, the relevance of cognition for prevention is discussed. Specifically, neurologically plausible distinctions in cognition and implicit cognition and their relevance for prevention are discussed. At a relatively molar social-level of analysis, social network theory, systems dynamic models, geographic information systems models, cultural psychology, and political science approaches to drug misuse and its prevention are introduced. The uses of both quantitative and qualitative statistical approaches to prevention are mentioned next. Finally, targeted prevention, bridging the efficacy-effectiveness gap, and a statement on overcoming disbalance round out the Special Issue. The bridges created will serve to propel drug misuse "prevention science" forward in the years to come. Advances in understanding etiological issues, translation to programs, and ecological fit of programming are desired results.

  16. Statistical inference and Aristotle's Rhetoric.

    PubMed

    Macdonald, Ranald R

    2004-11-01

    Formal logic operates in a closed system where all the information relevant to any conclusion is present, whereas this is not the case when one reasons about events and states of the world. Pollard and Richardson drew attention to the fact that the reasoning behind statistical tests does not lead to logically justifiable conclusions. In this paper statistical inferences are defended not by logic but by the standards of everyday reasoning. Aristotle invented formal logic, but argued that people mostly get at the truth with the aid of enthymemes--incomplete syllogisms which include arguing from examples, analogies and signs. It is proposed that statistical tests work in the same way--in that they are based on examples, invoke the analogy of a model and use the size of the effect under test as a sign that the chance hypothesis is unlikely. Of existing theories of statistical inference only a weak version of Fisher's takes this into account. Aristotle anticipated Fisher by producing an argument of the form that there were too many cases in which an outcome went in a particular direction for that direction to be plausibly attributed to chance. We can therefore conclude that Aristotle would have approved of statistical inference and there is a good reason for calling this form of statistical inference classical.

  17. Citation analytics: Data exploration and comparative analyses of CiteScores of Open Access and Subscription-Based publications indexed in Scopus (2014-2016).

    PubMed

    Atayero, Aderemi A; Popoola, Segun I; Egeonu, Jesse; Oludayo, Olumuyiwa

    2018-08-01

    Citation is one of the important metrics used in measuring the relevance and the impact of research publications. The potentials of citation analytics may be exploited to understand the gains of publishing scholarly peer-reviewed research outputs in either Open Access (OA) sources or Subscription-Based (SB) sources in the bid to increase citation impact. However, relevant data required for such comparative analysis must be freely accessible for evidence-based findings and conclusions. In this data article, citation scores (CiteScores) of 2542 OA sources and 15,040 SB sources indexed in Scopus from 2014 to 2016 were presented and analyzed based on a set of five inclusion criteria. A robust dataset, which contains the CiteScores of the OA and SB publication sources included, is attached as supplementary material to this data article to facilitate further reuse. Descriptive statistics and frequency distributions of OA CiteScores and SB CiteScores are presented in tables. Boxplot representations and scatter plots are provided to show the statistical distributions of OA CiteScores and SB CiteScores across the three sub-categories (Book Series, Journal, and Trade Journal). Correlation coefficient and p-value matrices are made available within the data article. In addition, Probability Density Functions (PDFs) and Cumulative Distribution Functions (CDFs) of OA CiteScores and SB CiteScores are computed and the results are presented using tables and graphs. Furthermore, Analysis of Variance (ANOVA) and multiple comparison post-hoc tests are conducted to understand the statistical difference (and its significance, if any) in the citation impact of OA publication sources and SB publication sources based on CiteScore. In the long run, the data provided in this article will help policy makers and researchers in Higher Education Institutions (HEIs) to identify the appropriate publication source type and category for dissemination of scholarly research findings with maximum citation impact.
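
    The ANOVA step described above can be sketched generically. The following Python example runs a one-way ANOVA across three hypothetical sub-category samples; the real CiteScore values live in the article's supplementary dataset, so the distributions below are stand-ins:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)

    # Hypothetical CiteScore samples for the three sub-categories named in the
    # article (Book Series, Journal, Trade Journal); values are invented.
    book_series = rng.gamma(shape=2.0, scale=0.6, size=200)
    journals = rng.gamma(shape=2.5, scale=0.8, size=2000)
    trade_journals = rng.gamma(shape=1.5, scale=0.4, size=100)

    # One-way ANOVA: does mean CiteScore differ across the three groups?
    f_stat, p_value = stats.f_oneway(book_series, journals, trade_journals)
    print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_value:.3g}")
    ```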

  18. Information on 'Overdiagnosis' in Breast Cancer Screening on Prominent United Kingdom- and Australia-Oriented Health Websites

    PubMed Central

    Ghanouni, Alex; Meisel, Susanne F.; Hersch, Jolyn; Waller, Jo; Renzi, Cristina

    2016-01-01

    Objectives: Health-related websites are an important source of information for the public. Increasing public awareness of overdiagnosis and ductal carcinoma in situ (DCIS) in breast cancer screening may facilitate more informed decision-making. This study assessed the extent to which such information was included on prominent health websites oriented towards the general public, and evaluated how it was explained. Design: Cross-sectional study. Setting: Websites identified through Google searches in England (United Kingdom) and New South Wales (Australia) for “breast cancer screening” and further websites included based on our prior knowledge of relevant organisations. Main Outcomes: Content analysis was used to determine whether information on overdiagnosis or DCIS existed on each site, how the concepts were described, and what statistics were used to quantify overdiagnosis. Results: After exclusions, ten UK websites and eight Australian websites were considered relevant and evaluated. They originated from charities, health service providers, government agencies, and an independent health organisation. Most contained some information on overdiagnosis (and/or DCIS). Descriptive information was similar across websites. Among UK websites, statistical information was often based on estimates from the Independent UK Panel on Breast Cancer Screening; the most commonly provided statistic was the ratio of breast cancer deaths prevented to overdiagnosed cases (1:3). A range of other statistics was included, such as the yearly number of overdiagnosed cases and the proportion of women screened who would be overdiagnosed. Information on DCIS and statistical information was less common on the Australian websites. Conclusions: Online information about overdiagnosis has become more widely available in 2015–16 compared with the limited accessibility indicated by older research. However, there may be scope to offer more information on DCIS and overdiagnosis statistics on Australian websites. Moreover, the variability in how estimates are presented across UK websites may be confusing for the general public. PMID:27010593

  19. Prediction of biomechanical parameters of the proximal femur using statistical appearance models and support vector regression.

    PubMed

    Fritscher, Karl; Schuler, Benedikt; Link, Thomas; Eckstein, Felix; Suhm, Norbert; Hänni, Markus; Hengg, Clemens; Schubert, Rainer

    2008-01-01

    Fractures of the proximal femur are one of the principal causes of mortality among elderly persons. Traditional methods for determining femoral fracture risk rely on measurements of bone mineral density (BMD). However, BMD alone is not sufficient to predict bone failure load for an individual patient, and additional parameters have to be determined for this purpose. In this work, an approach that uses statistical models of appearance to identify relevant regions and parameters for the prediction of biomechanical properties of the proximal femur is presented. Using Support Vector Regression, the proposed model-based approach is capable of predicting two different biomechanical parameters accurately and fully automatically in two different testing scenarios.
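
    The pipeline described, appearance-model parameters fed into Support Vector Regression, can be sketched generically. The following Python example is not the authors' implementation: PCA stands in for the statistical appearance model, and all data are synthetic stand-ins for image samples and a measured biomechanical parameter:

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVR

    rng = np.random.default_rng(3)

    # Stand-ins: rows are flattened appearance samples of the proximal femur,
    # y is a biomechanical parameter (e.g., a hypothetical failure load).
    X = rng.normal(size=(100, 500))
    y = X[:, :5].sum(axis=1) + 0.1 * rng.normal(size=100)

    # PCA plays the role of the appearance model (shape/intensity modes);
    # SVR then regresses the biomechanical parameter on the mode weights.
    model = make_pipeline(StandardScaler(), PCA(n_components=10),
                          SVR(kernel="rbf", C=10.0))
    model.fit(X[:80], y[:80])
    print("held-out R^2:", model.score(X[80:], y[80:]))
    ```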

  20. Assessment of NDE reliability data

    NASA Technical Reports Server (NTRS)

    Yee, B. G. W.; Couchman, J. C.; Chang, F. H.; Packman, D. F.

    1975-01-01

    Twenty sets of relevant nondestructive test (NDT) reliability data were identified, collected, compiled, and categorized. A criterion for the selection of data for statistical analysis considerations was formulated, and a model to grade the quality and validity of the data sets was developed. Data input formats, which record the pertinent parameters of the defect/specimen and inspection procedures, were formulated for each NDE method. A comprehensive computer program was written and debugged to calculate the probability of flaw detection at several confidence limits by the binomial distribution. This program also selects the desired data sets for pooling and tests the statistical pooling criteria before calculating the composite detection reliability. An example of the calculated reliability of crack detection in bolt holes by an automatic eddy current method is presented.
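
    The binomial detection-probability calculation mentioned above can be illustrated with a one-sided Clopper-Pearson lower bound, a standard way of stating probability of detection (POD) at a confidence limit. This is a sketch of the general method, not the original program:

    ```python
    from scipy import stats

    def pod_lower_bound(detections: int, trials: int, confidence: float = 0.95) -> float:
        """One-sided lower confidence bound on the probability of flaw
        detection, from the binomial distribution (Clopper-Pearson)."""
        if detections == 0:
            return 0.0
        return stats.beta.ppf(1.0 - confidence, detections, trials - detections + 1)

    # Classic NDE benchmark: 29 detections in 29 trials supports POD >= 0.90
    # at 95% confidence (the "90/95" criterion).
    print(f"{pod_lower_bound(29, 29):.3f}")   # ~0.902
    print(f"{pod_lower_bound(45, 50):.3f}")
    ```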

  1. A Study of the Effectiveness of the Contextual Lab Activity in the Teaching and Learning Statistics at the UTHM (Universiti Tun Hussein Onn Malaysia)

    ERIC Educational Resources Information Center

    Kamaruddin, Nafisah Kamariah Md; Jaafar, Norzilaila bt; Amin, Zulkarnain Md

    2012-01-01

    Inaccurate concepts in statistics contribute to the assumption by students that statistics does not relate to the real world and is not relevant to the engineering field. There are universities which have introduced the learning of statistics through statistics lab activities. However, the learning is more about learning how to use software and not to…

  2. [Donepezil in patients with Alzheimer's disease--a critical appraisal of the AD2000 study].

    PubMed

    Kaiser, Thomas; Florack, Christiane; Franz, Heinrich; Sawicki, Peter T

    2005-03-15

    The AD2000 study was a randomized placebo-controlled trial investigating the effects of donepezil, a cholinesterase inhibitor, in patients with Alzheimer's disease. It was the first long-term RCT not sponsored by the pharmaceutical industry. The study did not show any significant effect on patient-relevant outcomes. However, donepezil had a significant effect on cognitive scores. More patients taking donepezil stopped treatment due to adverse events, even when taking only 5 mg once daily. There are major concerns regarding the conduct of the AD2000 study as well as the presentation of the results. Far fewer patients than originally planned were recruited, resulting in a low statistical power to detect a significant difference between the two treatments. In addition, no true intention-to-treat analysis based on the first randomization is presented. The validity of the AD2000 trial has to be questioned. However, there is still insufficient evidence to support the claim that cholinesterase inhibitors have beneficial effects on patient-relevant outcomes in patients with Alzheimer's disease. The change of cognitive performance as measured by different scales does not necessarily correspond to substantial changes in patient-relevant outcomes. In conclusion, the widespread use of cholinesterase inhibitors in patients with Alzheimer's disease is not supported by current evidence. Long-term randomized controlled trials focusing on patient-relevant outcomes instead of cognitive scores are urgently needed.

  3. Salaried and Professional Women: Relevant Statistics. Publication #92-3.

    ERIC Educational Resources Information Center

    Wilson, Pamela, Ed.

    This document contains 29 statistical tables grouped into five sections: "General Statistics,""Occupations and Earnings,""Earnings of Selected Professional Occupations,""Women and Higher Education," and "Family Income and Composition." Among the tables are those that show the following: (1) 1991 annual average U.S. civilian work force by…

  4. 34 CFR Appendix A to Subpart N of... - Sample Default Prevention Plan

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... relevant default prevention statistics, including a statistical analysis of the borrowers who default on...'s delinquency status by obtaining reports from data managers and FFEL Program lenders. 5. Enhance... academic study. III. Statistics for Measuring Progress 1. The number of students enrolled at your...

  5. Statistical Cost Estimation in Higher Education: Some Alternatives.

    ERIC Educational Resources Information Center

    Brinkman, Paul T.; Niwa, Shelley

    Recent developments in econometrics that are relevant to the task of estimating costs in higher education are reviewed. The relative effectiveness of alternative statistical procedures for estimating costs is also tested. Statistical cost estimation involves three basic parts: a model, a data set, and an estimation procedure. Actual data are used…

  6. Counting Better? An Examination of the Impact of Quantitative Method Teaching on Statistical Anxiety and Confidence

    ERIC Educational Resources Information Center

    Chamberlain, John Martyn; Hillier, John; Signoretta, Paola

    2015-01-01

    This article reports the results of research concerned with students' statistical anxiety and confidence to both complete and learn to complete statistical tasks. Data were collected at the beginning and end of a quantitative method statistics module. Students recognised the value of numeracy skills but felt they were not necessarily relevant for…

  7. Use of antiepileptic drugs and risk of falls in old age: A systematic review.

    PubMed

    Haasum, Ylva; Johnell, Kristina

    2017-12-01

    The aim of this study is to systematically review the scientific literature to investigate whether use of antiepileptic drugs (AEDs) is associated with falls and/or recurrent falls in old age. We searched the literature for relevant articles in PubMed and Embase published up until 3rd December 2015. Observational studies of people aged 60 years and over that assessed the risk of falls in people exposed to AEDs compared with people not exposed to AEDs were included. We found 744 studies by searching Medline and Embase and an additional 9 studies by reviewing relevant reference lists. Of these studies, 13 fulfilled our predefined criteria. The articles were of various study designs, sizes, and follow-up times, and presented their results in different ways. Also, confounder adjustment varied considerably between the studies. Ten studies presented results for the association between use of any AED and any fall/injurious fall. Of these studies, 6 presented adjusted estimates, of which all but one showed statistically significant associations between use of any AED and any fall/injurious fall. Six studies investigated the association between use of any AED and recurrent falls. Of these, only 3 studies presented adjusted effect estimates, of which 2 reached statistical significance for the association between use of AEDs and recurrent falls in elderly people. Our results indicate an association between use of AEDs and risk of falls and recurrent falls in older people. This finding may be clinically important given that a substantial number of older people use these drugs. However, further research is needed to increase the knowledge about the actual risk of falls when using these drugs in old age. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Cardiovascular services and human resources in Puerto Rico - 2008.

    PubMed

    García-Palmieri, Mario R

    2009-01-01

    Available information (2004-2008) concerning population statistics, the occurrence of cardiovascular disease, and cardiovascular services and human resources in Puerto Rico is presented. Relevant information concerning life expectancy at birth, deaths by specific causes in a recent four-year period, the commonest causes of death, and the available data on the prevalence of related cardiovascular risk factors is included. The surgical and medical interventional services rendered to cardiovascular patients in different institutions, and their locations in Puerto Rico in the year 2008, are presented. Some remarks concerning the productivity of physicians by our Schools of Medicine are included. Information about ACGME-accredited postgraduate cardiovascular training programs conducted in Puerto Rico is presented. Data concerning the prevalence of hypertension, diabetes mellitus, overweight, and obesity obtained by the BRFSS is presented.

  9. Inventory of DOT Statistical Information Systems

    DOT National Transportation Integrated Search

    1983-01-01

    The inventory represents an update of relevant systems described in the Transportation Statistical Reference File (TSRF), coordinated with the GAO update of Congressional Sources and Systems, and the Information Collection Budget. The inventory compi...

  10. Health significance and statistical uncertainty. The value of P-value.

    PubMed

    Consonni, Dario; Bertazzi, Pier Alberto

    2017-10-27

    The P-value is widely used as a summary statistic of scientific results. Unfortunately, there is a widespread tendency to dichotomize its value into "P<0.05" (defined as "statistically significant") and "P>0.05" ("statistically not significant"), with the former implying a "positive" result and the latter a "negative" one. Our aim is to show the unsuitability of such an approach when evaluating the effects of environmental and occupational risk factors. We provide examples of distorted use of the P-value and of the negative consequences for science and public health of such a black-and-white vision. The rigid interpretation of the P-value as a dichotomy favors the confusion between health relevance and statistical significance, discourages thoughtful thinking, and diverts attention from what really matters, the health significance. A much better way to express and communicate scientific results involves reporting effect estimates (e.g., risks, risk ratios, or risk differences) and their confidence intervals (CI), which summarize and convey both health significance and statistical uncertainty. Unfortunately, many researchers do not usually consider the whole interval of the CI but only examine whether it includes the null value, thereby degrading this procedure to the same P-value dichotomy (statistically significant or not). When reporting statistical results of scientific research, present effect estimates with their confidence intervals, and do not qualify the P-value as "significant" or "not significant".
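
    A worked example of the recommended reporting style: compute a risk ratio and its 95% confidence interval from a 2x2 table using the standard log-scale normal approximation. The counts below are hypothetical:

    ```python
    import math

    # Hypothetical 2x2 table: exposed (a cases out of n1), unexposed (c out of n0).
    a, n1 = 30, 200   # exposed group
    c, n0 = 18, 220   # unexposed group

    rr = (a / n1) / (c / n0)
    # Standard error of log(RR) for a cohort-style risk ratio.
    se_log_rr = math.sqrt(1/a - 1/n1 + 1/c - 1/n0)
    lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
    hi = math.exp(math.log(rr) + 1.96 * se_log_rr)

    # Report the estimate with its CI, not a significant/not-significant verdict.
    print(f"RR = {rr:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
    ```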

  11. Considerations in the design of large space structures

    NASA Technical Reports Server (NTRS)

    Hedgepeth, J. M.; Macneal, R. H.; Knapp, K.; Macgillivray, C. S.

    1981-01-01

    Several analytical studies of topics relevant to the design of large space structures are presented. Topics covered are: the types and quantitative evaluation of the disturbances to which large Earth-oriented microwave reflectors would be subjected, and the resulting attitude errors of such spacecraft; the influence of errors in the structural geometry on the performance of radiofrequency antennas; the effect of creasing on the flatness of tensioned reflector membrane surfaces; and an analysis of the statistics of damage to truss-type structures due to meteoroids.

  12. A state-of-the-art review of transportation systems evaluation techniques relevant to air transportation, volume 1. [urban planning and urban transportation using decision theory

    NASA Technical Reports Server (NTRS)

    Haefner, L. E.

    1975-01-01

    Mathematical and philosophical approaches are presented for the evaluation and implementation of ground and air transportation systems. Basic decision processes used for cost analyses and planning (e.g., statistical decision theory, linear and dynamic programming, optimization, game theory) are examined. The effects that a transportation system may have on the environment and the community are discussed and modelled. Algorithmic structures are examined, and selected bibliographic annotations are included. Transportation dynamic models were developed. Citizen participation in transportation projects (e.g., in Maryland and Massachusetts) is discussed. The relevance of the modelling and evaluation approaches to air transportation (e.g., airport planning) is examined in a case study in St. Louis, Missouri.

  13. Analysis of model output and science data in the Virtual Model Repository (VMR).

    NASA Astrophysics Data System (ADS)

    De Zeeuw, D.; Ridley, A. J.

    2014-12-01

    Big scientific data include not only large repositories of data from scientific platforms like satellites and ground observation, but also the vast output of numerical models. The Virtual Model Repository (VMR) provides scientific analysis and visualization tools for many numerical models of the Earth-Sun system. Individual runs can be analyzed in the VMR and compared to relevant data through relevant metadata, but larger collections of runs can also now be studied and statistics generated on the accuracy and tendencies of model output. The vast model repository at the CCMC, with over 1000 simulations of the Earth's magnetosphere, was used to look at overall trends in accuracy when compared to satellites such as GOES, Geotail, and Cluster. Methodology for this analysis as well as case studies will be presented.

  14. High-speed data search

    NASA Technical Reports Server (NTRS)

    Driscoll, James N.

    1994-01-01

    The high-speed data search system developed for KSC incorporates existing and emerging information retrieval technology to help a user intelligently and rapidly locate information found in large textual databases. This technology includes: natural language input; statistical ranking of retrieved information; an artificial intelligence concept called semantics, where 'surface level' knowledge found in text is used to improve the ranking of retrieved information; and relevance feedback, where user judgements about viewed information are used to automatically modify the search for further information. Semantics and relevance feedback are features of the system which are not available commercially. The system further demonstrates a focus on paragraphs of information to decide relevance; and it can be used (without modification) to intelligently search all kinds of document collections, such as collections of legal documents, medical documents, news stories, patents, and so forth. The purpose of this paper is to demonstrate the usefulness of statistical ranking, our semantic improvement, and relevance feedback.
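
    Statistical ranking combined with relevance feedback, as described above, can be sketched with standard components: TF-IDF weighting, cosine-similarity ranking, and a Rocchio-style query update. The following Python example is a generic illustration, not the KSC system; documents and parameters are invented:

    ```python
    import numpy as np
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    docs = [
        "shuttle launch pad safety procedures",
        "cryogenic fuel loading checklist",
        "launch weather constraints and lightning",
        "ground support equipment maintenance",
    ]

    vec = TfidfVectorizer()
    D = vec.fit_transform(docs).toarray()

    # Statistical ranking: cosine similarity between the query and each document.
    q = vec.transform(["launch safety"]).toarray()
    ranking = np.argsort(cosine_similarity(q, D).ravel())[::-1]
    print("initial ranking:", ranking)

    # Rocchio-style relevance feedback: shift the query vector toward documents
    # the user judged relevant and away from one judged non-relevant.
    alpha, beta, gamma = 1.0, 0.75, 0.15
    q_new = (alpha * q
             + beta * D[ranking[:1]].mean(axis=0)
             - gamma * D[ranking[-1:]].mean(axis=0))
    print("reranked:", np.argsort(cosine_similarity(q_new, D).ravel())[::-1])
    ```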

  15. Improvement of Biocatalysts for Industrial and Environmental Purposes by Saturation Mutagenesis

    PubMed Central

    Valetti, Francesca; Gilardi, Gianfranco

    2013-01-01

    Laboratory evolution techniques are becoming increasingly widespread among protein engineers for the development of novel and designed biocatalysts. The palette of different approaches ranges from completely randomized strategies to rational and structure-guided mutagenesis, with a wide variety of costs, impacts, drawbacks, and relevance to biotechnology. A technique that strikes a convincing compromise between the extremes of fully randomized and rational mutagenesis, with a high benefit/cost ratio, is saturation mutagenesis. Here we present and discuss this approach in its many facets, also tackling the issues of randomization, statistical evaluation of library completeness, and throughput efficiency of screening methods. Successful recent applications covering different classes of enzymes are presented, referring to the literature and to research lines pursued in our group. The focus is put on saturation mutagenesis as a tool for designing novel biocatalysts specifically relevant to the production of fine chemicals, for improving bulk enzymes for industry, and for engineering technical enzymes involved in treatment of waste, detoxification, and production of clean energy from renewable sources. PMID:24970191
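
    On the statistical evaluation of library completeness mentioned above: under the usual assumption of V equally likely variants, the expected completeness after T transformants is F = 1 - exp(-T/V), so T = -V * ln(1 - F) clones are needed for a target completeness F (roughly 3V for 95%). A minimal Python sketch under that standard assumption:

    ```python
    import math

    def clones_for_completeness(variants: int, completeness: float) -> int:
        """Transformants needed so that a library of `variants` equally likely
        sequences reaches the given expected completeness: T = -V * ln(1 - F)."""
        return math.ceil(-variants * math.log(1.0 - completeness))

    # NNK saturation mutagenesis: 32 codons per randomized position.
    for positions in (1, 2, 3):
        v = 32 ** positions
        print(f"{positions} position(s): {v} variants, "
              f"{clones_for_completeness(v, 0.95)} clones for 95% completeness")
    ```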

  16. Computational domain length and Reynolds number effects on large-scale coherent motions in turbulent pipe flow

    NASA Astrophysics Data System (ADS)

    Feldmann, Daniel; Bauer, Christian; Wagner, Claus

    2018-03-01

    We present results from direct numerical simulations (DNS) of turbulent pipe flow at shear Reynolds numbers up to Reτ = 1500 using different computational domains with lengths up to ?. The objectives are to analyse the effect of the finite size of the periodic pipe domain on large flow structures in dependence on Reτ and to assess the minimum ? required for relevant turbulent scales to be captured and the minimum Reτ for very large-scale motions (VLSM) to be analysed. Analysing one-point statistics revealed that the mean velocity profile is invariant for ?. The wall-normal location at which deviations occur in shorter domains changes strongly with increasing Reτ, from the near-wall region to the outer layer, where VLSM are believed to live. The root mean square velocity profiles exhibit domain length dependencies for pipes shorter than 14R or 7R, depending on Reτ. For all Reτ, the higher-order statistical moments show only weak dependencies, and only for the shortest domain considered here. However, the analysis of one- and two-dimensional pre-multiplied energy spectra revealed that even for larger ?, not all physically relevant scales are fully captured, even though the aforementioned statistics are in good agreement with the literature. We found ? to be sufficiently large to capture VLSM-relevant turbulent scales in the considered range of Reτ, based on our definition of an integral energy threshold of 10%. The requirement to capture at least 1/10 of the global maximum energy level is justified by a 14% increase of the streamwise turbulence intensity in the outer region between Reτ = 720 and 1500, which can be related to VLSM-relevant length scales. Based on this scaling anomaly, we found Reτ⪆1500 to be a necessary minimum requirement to investigate VLSM-related effects in pipe flow, even though the streamwise energy spectra do not yet indicate sufficient scale separation between the most energetic and the very long motions.

  17. Intensity invariance properties of auditory neurons compared to the statistics of relevant natural signals in grasshoppers.

    PubMed

    Clemens, Jan; Weschke, Gerroth; Vogel, Astrid; Ronacher, Bernhard

    2010-04-01

    The temporal pattern of amplitude modulations (AM) is often used to recognize acoustic objects. To identify objects reliably, intensity invariant representations have to be formed. We approached this problem within the auditory pathway of grasshoppers. We presented AM patterns modulated at different time scales and intensities. Metric space analysis of neuronal responses allowed us to determine how well, how invariantly, and at which time scales AM frequency is encoded. We find that in some neurons spike-count cues contribute substantially (20-60%) to the decoding of AM frequency at a single intensity. However, such cues are not robust when intensity varies. The general intensity invariance of the system is poor. However, there exists a range of AM frequencies around 83 Hz where intensity invariance of local interneurons is relatively high. In this range, natural communication signals exhibit much variation between species, suggesting an important behavioral role for this frequency band. We hypothesize, just as has been proposed for human speech, that the communication signals might have evolved to match the processing properties of the receivers. This contrasts with optimal coding theory, which postulates that neuronal systems are adapted to the statistics of the relevant signals.
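
    Metric space analysis of spike trains is frequently based on the Victor-Purpura edit distance; the sketch below implements that standard metric as one plausible reading of the method (the study may use a variant, and the parameter choices here are ours):

    ```python
    import numpy as np

    def victor_purpura(t1, t2, q):
        """Victor-Purpura spike-train distance: minimal cost of transforming one
        spike train into the other, with cost 1 per insertion/deletion and
        q*|dt| per spike shift (q sets the temporal precision, in 1/s)."""
        n, m = len(t1), len(t2)
        D = np.zeros((n + 1, m + 1))
        D[:, 0] = np.arange(n + 1)          # deleting all spikes of train 1
        D[0, :] = np.arange(m + 1)          # inserting all spikes of train 2
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                D[i, j] = min(D[i - 1, j] + 1,                                   # delete
                              D[i, j - 1] + 1,                                   # insert
                              D[i - 1, j - 1] + q * abs(t1[i - 1] - t2[j - 1]))  # shift
        return D[n, m]

    a = [0.010, 0.022, 0.035]    # spike times in seconds
    b = [0.011, 0.036]
    print(victor_purpura(a, b, q=100.0))  # 1.2: two spikes shifted slightly, one deleted
    ```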

  18. Quantitatively measured tremor in hand-arm vibration-exposed workers.

    PubMed

    Edlund, Maria; Burström, Lage; Hagberg, Mats; Lundström, Ronnie; Nilsson, Tohr; Sandén, Helena; Wastensson, Gunilla

    2015-04-01

    The aim of the present study was to investigate the possible increase in hand tremor in relation to hand-arm vibration (HAV) exposure in a cohort of exposed and unexposed workers. Participants were 178 male workers with or without exposure to HAV. The study is cross-sectional regarding the outcome of tremor and has a longitudinal design with respect to exposure. The dose of HAV exposure was collected via questionnaires and measurements at several follow-ups. The CATSYS Tremor Pen® was used for measuring postural tremor. Multiple linear regression methods were used to analyze associations between different tremor variables and HAV exposure, along with predictor variables with biological relevance. There were no statistically significant associations between the different tremor variables and cumulative or current HAV exposure. Age was a statistically significant predictor of variation in tremor outcomes for three of the four tremor variables, whereas nicotine use was a statistically significant predictor, in the left hand, the right hand, or both, for all four tremor variables. In the present study, there was no evidence of an exposure-response association between HAV exposure and measured postural tremor. Increasing age and nicotine use appeared to be the strongest predictors of tremor.

  19. Relevance of the c-statistic when evaluating risk-adjustment models in surgery.

    PubMed

    Merkow, Ryan P; Hall, Bruce L; Cohen, Mark E; Dimick, Justin B; Wang, Edward; Chow, Warren B; Ko, Clifford Y; Bilimoria, Karl Y

    2012-05-01

    The measurement of hospital quality based on outcomes requires risk adjustment. The c-statistic is a popular tool used to judge model performance, but can be limited, particularly when evaluating specific operations in focused populations. Our objectives were to examine the interpretation and relevance of the c-statistic when used in models with increasingly similar case mix and to consider an alternative perspective on model calibration based on a graphical depiction of model fit. From the American College of Surgeons National Surgical Quality Improvement Program (2008-2009), patients were identified who underwent a general surgery procedure, and procedure groups were increasingly restricted: colorectal-all, colorectal-elective cases only, and colorectal-elective cancer cases only. Mortality and serious morbidity outcomes were evaluated using logistic regression-based risk adjustment, and model c-statistics and calibration curves were used to compare model performance. During the study period, 323,427 general, 47,605 colorectal-all, 39,860 colorectal-elective, and 21,680 colorectal cancer patients were studied. Mortality ranged from 1.0% in general surgery to 4.1% in the colorectal-all group, and serious morbidity ranged from 3.9% in general surgery to 12.4% in the colorectal-all procedural group. As case mix was restricted, c-statistics progressively declined from the general to the colorectal cancer surgery cohorts for both mortality and serious morbidity (mortality: 0.949 to 0.866; serious morbidity: 0.861 to 0.668). Calibration was evaluated graphically by examining predicted vs observed number of events over risk deciles. For both mortality and serious morbidity, there was no qualitative difference in calibration identified between the procedure groups. In the present study, we demonstrate how the c-statistic can become less informative and, in certain circumstances, can lead to incorrect model-based conclusions, as case mix is restricted and patients become more homogeneous. Although it remains an important tool, caution is advised when the c-statistic is advanced as the sole measure of model performance. Copyright © 2012 American College of Surgeons. All rights reserved.
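
    The two tools contrasted above, discrimination via the c-statistic and calibration over risk deciles, can be illustrated on synthetic data; a minimal sketch (data and variable names are ours, not the study's):

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(1)
    X = rng.standard_normal((5000, 3))                     # synthetic risk factors
    y = rng.random(5000) < 1 / (1 + np.exp(-(X @ [1.2, 0.8, -0.5] - 3)))

    model = LogisticRegression().fit(X, y)
    p = model.predict_proba(X)[:, 1]
    print(f"c-statistic: {roc_auc_score(y, p):.3f}")       # discrimination

    # Calibration over risk deciles: predicted vs observed event counts.
    deciles = np.digitize(p, np.quantile(p, np.linspace(0.1, 0.9, 9)))
    for d in range(10):
        mask = deciles == d
        print(d, f"predicted={p[mask].sum():6.1f}", f"observed={y[mask].sum():4d}")
    ```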

  20. Application of the "threshold of toxicological concern" to derive tolerable concentrations of "non-relevant metabolites" formed from plant protection products in ground and drinking water.

    PubMed

    Melching-Kollmuss, Stephanie; Dekant, Wolfgang; Kalberlah, Fritz

    2010-03-01

    Limits for tolerable concentrations of ground water metabolites ("non-relevant metabolites" without targeted toxicities and without specific classification and labeling) derived from active ingredients (AI) of plant protection products (PPPs) are discussed in the European Union. Risk assessments for "non-relevant metabolites" need to be performed when concentrations are above 0.75 microg/L. Since oral uptake is the only relevant exposure pathway for "non-relevant metabolites", risk assessment approaches as used for other chemicals with predominantly oral exposure in humans are applicable. The concept of "thresholds of toxicological concern" (TTC) defines tolerable dietary intakes for chemicals without toxicity data and is widely applied to chemicals present in food in low concentrations, such as flavorings. Based on a statistical evaluation of the results of many toxicity studies and on considerations of chemical structure, the TTC concept derives a maximum daily oral intake without concern of 90 microg/person/day for non-genotoxic chemicals, even for those with appreciable toxicity. When using the typical exposure assessment for drinking water contaminants (consumption of 2 L of drinking water/person/day, allocation of 10% of the tolerable daily intake to drinking water), a TTC-based upper concentration limit of 4.5 microg/L for "non-relevant metabolites" in ground/drinking water is delineated. The present publication evaluates whether this value covers all relevant toxicities (repeated dose, reproductive and developmental, and immune effects). Taking into account that, after evaluation of specific reproduction toxicity data on chemicals and pharmaceuticals, a value of 1 microg/kg bw/day was assessed to cover developmental and reproductive toxicity, a TTC value of 60 microg/person/day was considered to represent a safe value. Based on these reasonable worst-case assumptions, a TTC-derived threshold of 3 microg/L in drinking water is derived. When a non-relevant metabolite is present in concentrations below 3 microg/L, animal testing for toxicity is not considered necessary for a compound-specific risk assessment, since the application of the TTC covers all relevant toxicities to be considered in such an assessment and any health risk resulting from these exposures is very low. © 2009 Elsevier Inc. All rights reserved.
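
    The two concentration limits quoted above follow from simple allocation arithmetic, reproduced here for transparency (the function name is ours):

    ```python
    # Drinking-water limit from a TTC value: allocate a fraction of the
    # tolerable daily intake to drinking water and divide by daily consumption.
    def limit_microg_per_L(ttc_microg_per_day, allocation=0.10, water_L_per_day=2.0):
        return ttc_microg_per_day * allocation / water_L_per_day

    print(limit_microg_per_L(90.0))  # 4.5 microg/L (general non-genotoxic TTC)
    print(limit_microg_per_L(60.0))  # 3.0 microg/L (TTC also covering reproductive toxicity)
    ```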

  1. Statistical Literacy in the Data Science Workplace

    ERIC Educational Resources Information Center

    Grant, Robert

    2017-01-01

    Statistical literacy, the ability to understand and make use of statistical information including methods, has particular relevance in the age of data science, when complex analyses are undertaken by teams from diverse backgrounds. Not only is it essential to communicate to the consumers of information but also within the team. Writing from the…

  2. Which Type of Risk Information to Use for Whom? Moderating Role of Outcome-Relevant Involvement in the Effects of Statistical and Exemplified Risk Information on Risk Perceptions.

    PubMed

    So, Jiyeon; Jeong, Se-Hoon; Hwang, Yoori

    2017-04-01

    The extant empirical research examining the effectiveness of statistical and exemplar-based health information is largely inconsistent. Under the premise that the inconsistency may be due to an unacknowledged moderator (O'Keefe, 2002), this study examined a moderating role of outcome-relevant involvement (Johnson & Eagly, 1989) in the effects of statistical and exemplified risk information on risk perception. Consistent with predictions based on elaboration likelihood model (Petty & Cacioppo, 1984), findings from an experiment (N = 237) concerning alcohol consumption risks showed that statistical risk information predicted risk perceptions of individuals with high, rather than low, involvement, while exemplified risk information predicted risk perceptions of those with low, rather than high, involvement. Moreover, statistical risk information contributed to negative attitude toward drinking via increased risk perception only for highly involved individuals, while exemplified risk information influenced the attitude through the same mechanism only for individuals with low involvement. Theoretical and practical implications for health risk communication are discussed.

  3. Statistical fluctuations in pedestrian evacuation times and the effect of social contagion

    NASA Astrophysics Data System (ADS)

    Nicolas, Alexandre; Bouzat, Sebastián; Kuperman, Marcelo N.

    2016-08-01

    Mathematical models of pedestrian evacuation and the associated simulation software have become essential tools for the assessment of the safety of public facilities and buildings. While a variety of models is now available, their calibration and test against empirical data are generally restricted to global averaged quantities; the statistics compiled from the time series of individual escapes ("microscopic" statistics) measured in recent experiments are thus overlooked. In the same spirit, much research has primarily focused on the average global evacuation time, whereas the whole distribution of evacuation times over some set of realizations should matter. In the present paper we propose and discuss the validity of a simple relation between this distribution and the microscopic statistics, which is theoretically valid in the absence of correlations. To this purpose, we develop a minimal cellular automaton, with features that afford a semiquantitative reproduction of the experimental microscopic statistics. We then introduce a process of social contagion of impatient behavior in the model and show that the simple relation under test may dramatically fail at high contagion strengths, the latter being responsible for the emergence of strong correlations in the system. We conclude with comments on the potential practical relevance for safety science of calculations based on microscopic statistics.

  4. Identifying climate analogues for precipitation extremes for Denmark based on RCM simulations from the ENSEMBLES database.

    PubMed

    Arnbjerg-Nielsen, K; Funder, S G; Madsen, H

    2015-01-01

    Climate analogues, also denoted Space-For-Time, may be used to identify regions where the present climatic conditions resemble the conditions of a past or future state of another location or region, based on robust climate variable statistics in combination with projections of how these statistics change over time. The study focuses on assessing climate analogues for Denmark based on a current climate data set (E-OBS) as well as the ENSEMBLES database of future climates, with the aim of projecting future precipitation extremes. The local present precipitation extremes are assessed by means of intensity-duration-frequency curves for urban drainage design for the relevant locations: France, the Netherlands, Belgium, Germany, the United Kingdom, and Denmark. Based on this approach, extreme precipitation is projected to increase by 2100 by 9% and 21% for 2- and 10-year return periods, respectively. The results should be interpreted with caution, as the region that best represents future conditions for Denmark is the coastal area of Northern France, for which only little information is available with respect to present precipitation extremes.

  5. Evaluating the Relevance, Reliability, and Applicability of CMIP5 Climate Projections for Water Resources and Environmental Planning

    NASA Astrophysics Data System (ADS)

    Brekke, L. D.; Scott, J.; Ferguson, I. M.; Arnold, J.; Raff, D. A.; Webb, R. S.

    2012-12-01

    Water managers need to understand the applicability of climate projection information available for decision-support at the scale of their applications. Applicability depends on information reliability and relevance. This need to understand applicability stems from expectations that entities rationalize adaptation investments or decisions to delay investment. It is also occurring at a time when new global climate projections are being released through the World Climate Research Programme Coupled Model Intercomparison Project phase 5 (CMIP5), which introduces new information opportunities and interpretation challenges. This project involves an interagency collaboration to evaluate the applicability of CMIP5 projections for use in water and environmental resources planning. The overarching goal is to develop and demonstrate a framework that involves dual evaluations of relevance and reliability informing an ultimate discussion and judgment of applicability, which is expected to vary with decision-making context. The framework is being developed and demonstrated within the context of reservoir systems management in California's Sacramento and San Joaquin River basins. The relevance evaluation focuses on identifying the climate variables and statistical measures relevant to long-term management questions, which may depend on satisfying multiple objectives. Past studies' results are being considered in this evaluation, along with new results from system sensitivity analyses conducted through this effort. The reliability evaluation focuses on the CMIP5 climate models' ability to simulate past conditions relative to observed references. The evaluation is being conducted across the global domain using a large menu of climate variables and statistical measures, leveraging lessons learned from similar evaluations of CMIP3 climate models. The global focus addresses a broader project goal of producing a web resource that can serve reliability information to applicability discussions around the world, with evaluation results being served through a web-portal similar to that developed by NOAA/CIRES to serve CMIP3 information on future climate extremes (http://www.esrl.noaa.gov/psd/ipcc/extremes/). The framework concludes with an applicability discussion informed by relevance and reliability results. The goal is to observe the discussion process and identify features, choice points, and challenges that might be summarized and shared with other resource management groups facing applicability questions. This presentation will discuss the project framework and preliminary results. In addition to considering CMIP5 21st century projection information, the framework is being developed to support evaluation of CMIP5 decadal predictability experiment simulations and reconcile those simulations with 21st century projections. The presentation will also discuss implications of considering the applicability of bias-corrected and downscaled information within this framework.

  6. EvolQG - An R package for evolutionary quantitative genetics

    PubMed Central

    Melo, Diogo; Garcia, Guilherme; Hubbe, Alex; Assis, Ana Paula; Marroig, Gabriel

    2016-01-01

    We present an open source package for performing evolutionary quantitative genetics analyses in the R environment for statistical computing. Evolutionary theory shows that evolution depends critically on the available variation in a given population. When dealing with many quantitative traits this variation is expressed in the form of a covariance matrix, particularly the additive genetic covariance matrix or sometimes the phenotypic matrix, when the genetic matrix is unavailable and there is evidence the phenotypic matrix is sufficiently similar to the genetic matrix. Given this mathematical representation of available variation, the EvolQG package provides functions for calculation of relevant evolutionary statistics; estimation of sampling error; corrections for this error; matrix comparison via correlations, distances and matrix decomposition; analysis of modularity patterns; and functions for testing evolutionary hypotheses on taxa diversification. PMID:27785352
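
    One of the matrix-comparison statistics EvolQG provides is the random-skewers correlation; a minimal numpy re-implementation of that idea (a sketch under our own naming, not the package's code):

    ```python
    import numpy as np

    def random_skewers(G1, G2, n_skewers=1000, rng=None):
        """Compare two covariance matrices by applying random unit-length
        selection gradients (beta) to both and correlating the predicted
        responses dz = G @ beta (multivariate breeder's equation)."""
        rng = rng or np.random.default_rng()
        p = G1.shape[0]
        betas = rng.standard_normal((n_skewers, p))
        betas /= np.linalg.norm(betas, axis=1, keepdims=True)
        corrs = []
        for b in betas:
            r1, r2 = G1 @ b, G2 @ b
            corrs.append(r1 @ r2 / (np.linalg.norm(r1) * np.linalg.norm(r2)))
        return float(np.mean(corrs))

    A = np.array([[1.0, 0.6], [0.6, 1.0]])
    B = np.array([[1.0, 0.5], [0.5, 1.2]])
    print(random_skewers(A, B))   # near 1: the two matrices respond very similarly
    ```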

  7. A statistical view of protein chemical synthesis using NCL and extended methodologies.

    PubMed

    Agouridas, Vangelis; El Mahdi, Ouafâa; Cargoët, Marine; Melnyk, Oleg

    2017-09-15

    Native chemical ligation and extended methodologies are the most popular chemoselective reactions for protein chemical synthesis. Their combination with desulfurization techniques can give access to small or challenging proteins that are exploited in a large variety of research areas. In this report, we have conducted a statistical review of their use for protein chemical synthesis in order to provide a flavor of the recent trends and identify the most popular chemical tools used by protein chemists. To this end, a protein chemical synthesis (PCS) database (http://pcs-db.fr) was created by collecting a set of relevant data from more than 450 publications covering the period 1994-2017. A preliminary account of what this database tells us is presented in this report. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Statistical analysis of field data for aircraft warranties

    NASA Astrophysics Data System (ADS)

    Lakey, Mary J.

    Air Force and Navy maintenance data collection systems were researched to determine their scientific applicability to the warranty process. New and unique algorithms were developed to extract failure distributions, which were then used to characterize how selected families of equipment typically fail. Families of similar equipment were identified in terms of function, technology and failure patterns. Statistical analyses and applications such as goodness-of-fit tests, maximum likelihood estimation and derivation of confidence intervals for the probability density function parameters were applied to characterize the distributions and their failure patterns. Statistical and reliability theory, with relevance to equipment design and operational failures, were also determining factors in characterizing the failure patterns of the equipment families. Inferences about the families with relevance to warranty needs were then made.
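
    A typical building block of such an analysis is fitting a failure distribution by maximum likelihood and checking the fit; a minimal sketch with a Weibull model on synthetic time-to-failure data (the study's algorithms are more elaborate):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    ttf = stats.weibull_min.rvs(1.8, scale=900.0, size=200, random_state=rng)  # hours

    # Maximum likelihood fit with the location pinned at zero, as is usual for
    # time-to-failure data, followed by a goodness-of-fit check. (Note: KS-testing
    # the same data used for fitting is optimistic; it is only a rough screen.)
    shape, loc, scale = stats.weibull_min.fit(ttf, floc=0)
    ks = stats.kstest(ttf, 'weibull_min', args=(shape, loc, scale))
    print(f"shape={shape:.2f} scale={scale:.0f} h, KS p-value={ks.pvalue:.2f}")
    ```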

  9. Constructing and Modifying Sequence Statistics for relevent Using informR in R

    PubMed Central

    Marcum, Christopher Steven; Butts, Carter T.

    2015-01-01

    The informR package greatly simplifies the analysis of complex event histories in R by providing user-friendly tools to build sufficient statistics for the relevent package. Historically, building sufficient statistics to model event sequences (of the form a→b) using the egocentric generalization of Butts' (2008) relational event framework for modeling social action has been cumbersome. The informR package simplifies the construction of the complex list of arrays needed by the rem() model-fitting function for a variety of cases involving egocentric event data, multiple event types, and/or support constraints. This paper introduces these tools using examples from real data extracted from the American Time Use Survey. PMID:26185488

  10. A methodology for the design of experiments in computational intelligence with multiple regression models.

    PubMed

    Fernandez-Lozano, Carlos; Gestal, Marcos; Munteanu, Cristian R; Dorado, Julian; Pazos, Alejandro

    2016-01-01

    The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence, and especially on a correct comparison between the different results provided by different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational Intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results are different for three out of five state-of-the-art simple datasets, and it can be stated that the selection of the best model according to our proposal is statistically significant and relevant. It is of relevance to use a statistical approach to indicate whether the differences are statistically significant when using this kind of algorithm. Furthermore, our results with three real complex datasets report different best models than the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as in other fields, such as bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable.

  11. A methodology for the design of experiments in computational intelligence with multiple regression models

    PubMed Central

    Gestal, Marcos; Munteanu, Cristian R.; Dorado, Julian; Pazos, Alejandro

    2016-01-01

    The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence, and especially on a correct comparison between the different results provided by different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational Intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results are different for three out of five state-of-the-art simple datasets, and it can be stated that the selection of the best model according to our proposal is statistically significant and relevant. It is of relevance to use a statistical approach to indicate whether the differences are statistically significant when using this kind of algorithm. Furthermore, our results with three real complex datasets report different best models than the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as in other fields, such as bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable. PMID:27920952

  12. A statistical assessment of differences and equivalences between genetically modified and reference plant varieties

    PubMed Central

    2011-01-01

    Background: Safety assessment of genetically modified organisms is currently often performed by comparative evaluation. However, natural variation of plant characteristics between commercial varieties is usually not considered explicitly in the statistical computations underlying the assessment. Results: Statistical methods are described for the assessment of the difference between a genetically modified (GM) plant variety and a conventional non-GM counterpart, and for the assessment of the equivalence between the GM variety and a group of reference plant varieties which have a history of safe use. It is proposed to present the results of both difference and equivalence testing for all relevant plant characteristics simultaneously in one or a few graphs, as an aid for further interpretation in safety assessment. A procedure is suggested to derive equivalence limits from the observed results for the reference plant varieties using a specific implementation of the linear mixed model. Three different equivalence tests are defined to classify any result in one of four equivalence classes. The performance of the proposed methods is investigated by a simulation study, and the methods are illustrated on compositional data from a field study on maize grain. Conclusions: A clear distinction of practical relevance is shown between difference and equivalence testing. The proposed tests are shown to have appropriate performance characteristics by simulation, and the proposed simultaneous graphical representation of results was found to be helpful for the interpretation of results from a practical field trial data set. PMID:21324199
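
    Equivalence between a GM variety and the reference set is commonly tested with two one-sided tests (TOST); the sketch below shows that generic procedure on synthetic data, as a simplified stand-in for the paper's linear-mixed-model implementation:

    ```python
    import numpy as np
    from scipy import stats

    def tost(x, y, low, high, alpha=0.05):
        """Two one-sided t-tests: the mean difference x-y is declared equivalent
        if both one-sided tests reject at level alpha, i.e. the (1 - 2*alpha)
        confidence interval for the difference lies inside [low, high]."""
        d = np.mean(x) - np.mean(y)
        se = np.sqrt(np.var(x, ddof=1) / len(x) + np.var(y, ddof=1) / len(y))
        df = len(x) + len(y) - 2                        # simple pooled-df approximation
        p_lower = 1 - stats.t.cdf((d - low) / se, df)   # H0: difference <= low
        p_upper = stats.t.cdf((d - high) / se, df)      # H0: difference >= high
        return d, max(p_lower, p_upper) < alpha

    rng = np.random.default_rng(3)
    gm = rng.normal(10.0, 1.0, 40)    # e.g. a compositional trait of the GM variety
    ref = rng.normal(10.2, 1.0, 40)   # pooled reference varieties
    print(tost(gm, ref, low=-1.0, high=1.0))   # (difference, equivalent?)
    ```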

  13. Statistical characterization of fluctuations of a laser beam transmitted through a random air-water interface: new results from a laboratory experiment

    NASA Astrophysics Data System (ADS)

    Majumdar, Arun K.; Land, Phillip; Siegenthaler, John

    2014-10-01

    New results for characterizing laser intensity fluctuation statistics of a laser beam transmitted through a random air-water interface, relevant to underwater communications, are presented. A laboratory water-tank experiment is described to investigate the beam wandering effects of the transmitted beam. Preliminary results from the experiment provide information about histograms of the probability density functions of intensity fluctuations for different wind speeds, measured by a CMOS camera for the transmitted beam. Angular displacements of the centroids of the fluctuating laser beam generate the beam wander effects. This research develops a probabilistic model for optical propagation at the random air-water interface for a transmission case under different wind speed conditions. Preliminary results for bit-error-rate (BER) estimates as a function of fade margin for an on-off keying (OOK) optical communication through the air-water interface are presented for a communication system where a random air-water interface is a part of the communication channel.

  14. Higher-order statistical moments and a procedure that detects potentially anomalous years as two alternative methods describing alterations in continuous environmental data

    USGS Publications Warehouse

    Arismendi, Ivan; Johnson, Sherri L.; Dunham, Jason B.

    2015-01-01

    Statistics of central tendency and dispersion may not capture relevant or desired characteristics of the distribution of continuous phenomena and, thus, they may not adequately describe temporal patterns of change. Here, we present two methodological approaches that can help to identify temporal changes in environmental regimes. First, we use higher-order statistical moments (skewness and kurtosis) to examine potential changes of empirical distributions at decadal extents. Second, we adapt a statistical procedure combining a non-metric multidimensional scaling technique and higher density region plots to detect potentially anomalous years. We illustrate the use of these approaches by examining long-term stream temperature data from minimally and highly human-influenced streams. In particular, we contrast predictions about thermal regime responses to changing climates and human-related water uses. Using these methods, we effectively diagnose years with unusual thermal variability and patterns in variability through time, as well as spatial variability linked to regional and local factors that influence stream temperature. Our findings highlight the complexity of responses of thermal regimes of streams and reveal their differential vulnerability to climate warming and human-related water uses. The two approaches presented here can be applied with a variety of other continuous phenomena to address historical changes, extreme events, and their associated ecological responses.
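
    A minimal sketch of the first approach, comparing higher-order moments of two decades of (synthetic) daily stream temperatures:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    # Daily stream temperatures for two decades (synthetic stand-ins).
    decade_1 = 10 + 4 * np.sin(np.linspace(0, 20 * np.pi, 3650)) + rng.normal(0, 1, 3650)
    decade_2 = decade_1 + rng.gamma(2.0, 0.5, 3650)   # warmer, right-skewed change

    for name, x in [("decade 1", decade_1), ("decade 2", decade_2)]:
        print(name,
              f"mean={x.mean():5.2f}",
              f"skew={stats.skew(x):+.2f}",
              f"kurtosis={stats.kurtosis(x):+.2f}")    # excess kurtosis (normal = 0)
    ```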

  15. BTS guide to good statistical practice

    DOT National Transportation Integrated Search

    2002-09-01

    Quality of data has many faces. Primarily, it has to be relevant to its users. Relevance is : an outcome that is achieved through a series of steps starting with a planning process that : link user needs to data requirements. It continues through acq...

  16. Prevalence of peri-implantitis in patients with implant-supported fixed prostheses.

    PubMed

    Schuldt Filho, Guenther; Dalago, Haline Renata; Oliveira de Souza, João Gustavo; Stanley, Kyle; Jovanovic, Sascha; Bianchini, Marco Aurélio

    2014-01-01

    The purpose of this study was to evaluate peri-implantitis prevalence in patients using implant-supported fixed prostheses that did not have any routine maintenance care. A total of 161 implants (27 patients) were evaluated in patients using implant-supported fixed prostheses. Collected data included information related to patient general health and local factors such as characteristics of implants, time in function, type of loading, positioning, Modified Bleeding Index, bacterial plaque, bleeding on probing (BOP), marginal recession, probing depth (PD), keratinized mucosa, and radiographic bone loss (BL). Factors related to the prostheses were also evaluated. The exclusion criteria were patients that have had any follow-up visit for plaque control of the prosthesis and/or the implants. From a total of 161 implants, 116 (72%) presented without peri-implantitis (PD > 4 mm + BOP + BL > 2 mm) while 45 (28%) had some sign of the disease. Implants placed in the maxilla were 2.98 times more likely to develop the disease (P < .05). Moreover, patients aged ≤ 60 years old were 3.24 times more likely to develop peri-implantitis (P < .05). Another analysis with statistical relevance (P < .05) was that implants with less than 3 mm interimplant distance were three times more likely to have peri-implantitis. There was no statistical relevance considering other analyses. It can be concluded that patients aged ≤ 60 years have a greater chance of presenting peri-implantitis, as do implants positioned in the maxilla and those placed with an interimplant distance < 3 mm.

  17. On the Way to 2020: Data for Vocational Education and Training Policies. Country Statistical Overviews. Update 2013

    ERIC Educational Resources Information Center

    Cedefop - European Centre for the Development of Vocational Training, 2014

    2014-01-01

    This report provides an updated statistical overview of vocational education and training (VET) and lifelong learning in European countries. These country statistical snapshots illustrate progress on indicators selected for their policy relevance and contribution to Europe 2020 objectives. The indicators take 2010 as the baseline year and present…

  18. Analysis of covariance as a remedy for demographic mismatch of research subject groups: some sobering simulations.

    PubMed

    Adams, K M; Brown, G G; Grant, I

    1985-08-01

    Analysis of Covariance (ANCOVA) is often used in neuropsychological studies to effect ex-post-facto adjustment of performance variables amongst groups of subjects mismatched on some relevant demographic variable. This paper reviews some of the statistical assumptions underlying this usage. In an attempt to illustrate the complexities of this statistical technique, three sham studies using actual patient data are presented. These staged simulations have varying relationships between group test performance differences and levels of covariate discrepancy. The results were robust and consistent in their nature, and were held to support the wisdom of previous cautions by statisticians concerning the employment of ANCOVA to justify comparisons between incomparable groups. ANCOVA should not be used in neuropsychological research to equate groups unequal on variables such as age and education or to exert statistical control whose objective is to eliminate consideration of the covariate as an explanation for results. Finally, the report advocates by example the use of simulation to further our understanding of neuropsychological variables.

  19. Frequent statistics of link-layer bit stream data based on AC-IM algorithm

    NASA Astrophysics Data System (ADS)

    Cao, Chenghong; Lei, Yingke; Xu, Yiming

    2017-08-01

    At present, there is much research on data processing using classical pattern matching and its improved algorithms, but little on frequent statistics over link-layer bit-stream data. This paper adopts a frequent-statistics method for link-layer bit-stream data based on the AC-IM algorithm, since classical multi-pattern matching algorithms such as the AC algorithm have high computational complexity and low efficiency and cannot be applied directly to binary bit streams. The method's maximum jump distance in the pattern tree is the length of the shortest pattern string plus 3, without missing any matches. The paper first gives a theoretical analysis of the algorithm's construction; the experimental results then show that the algorithm adapts to the binary bit-stream environment and extracts frequent sequences more accurately, with a clear effect. Meanwhile, compared with the classical AC algorithm and other improved algorithms, the AC-IM algorithm has a greater maximum jump distance and is less time-consuming.
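
    AC-IM is the paper's improved variant; for orientation, here is a minimal implementation of the classical Aho-Corasick automaton it builds on, counting pattern occurrences in a bit string in one pass (a sketch of the baseline algorithm, not of AC-IM itself):

    ```python
    from collections import deque

    class AhoCorasick:
        """Classical Aho-Corasick automaton for counting occurrences of several
        patterns in one pass over a bit string ('0'/'1' characters)."""
        def __init__(self, patterns):
            self.goto, self.fail, self.out = [{}], [0], [[]]
            for p in patterns:                       # build the pattern trie
                s = 0
                for ch in p:
                    if ch not in self.goto[s]:
                        self.goto.append({}); self.fail.append(0); self.out.append([])
                        self.goto[s][ch] = len(self.goto) - 1
                    s = self.goto[s][ch]
                self.out[s].append(p)
            q = deque(self.goto[0].values())         # BFS to set failure links
            while q:
                s = q.popleft()
                self.out[s] += self.out[self.fail[s]]  # inherit shorter suffix matches
                for ch, t in self.goto[s].items():
                    q.append(t)
                    f = self.fail[s]
                    while f and ch not in self.goto[f]:
                        f = self.fail[f]
                    self.fail[t] = self.goto[f].get(ch, 0)

        def count(self, text):
            counts, s = {}, 0
            for ch in text:
                while s and ch not in self.goto[s]:
                    s = self.fail[s]
                s = self.goto[s].get(ch, 0)
                for p in self.out[s]:
                    counts[p] = counts.get(p, 0) + 1
            return counts

    ac = AhoCorasick(["1011", "011", "10"])
    print(ac.count("10110101011"))   # {'10': 4, '1011': 2, '011': 2}
    ```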

  20. A benchmark for statistical microarray data analysis that preserves actual biological and technical variance.

    PubMed

    De Hertogh, Benoît; De Meulder, Bertrand; Berger, Fabrice; Pierre, Michael; Bareke, Eric; Gaigneaux, Anthoula; Depiereux, Eric

    2010-01-11

    Recent reanalysis of spike-in datasets underscored the need for new and more accurate benchmark datasets for statistical microarray analysis. We present here a fresh method using biologically-relevant data to evaluate the performance of statistical methods. Our novel method ranks the probesets from a dataset composed of publicly-available biological microarray data and extracts subset matrices with precise information/noise ratios. Our method can be used to determine the capability of different methods to better estimate variance for a given number of replicates. The mean-variance and mean-fold change relationships of the matrices revealed a closer approximation of biological reality. Performance analysis refined the results from benchmarks published previously. We show that the Shrinkage t test (close to Limma) was the best of the methods tested, except when two replicates were examined, where the Regularized t test and the Window t test performed slightly better. The R scripts used for the analysis are available at http://urbm-cluster.urbm.fundp.ac.be/~bdemeulder/.
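
    The Regularized t family referred to above stabilizes per-gene variance estimates with a small offset in the denominator; a minimal sketch of that idea on synthetic two-condition data (the choice of s0 here is ours):

    ```python
    import numpy as np

    def regularized_t(x1, x2, s0=0.05):
        """Per-gene two-sample t statistic with a small constant s0 added to the
        standard error to stabilise low-variance genes (the idea behind the
        Regularized/Shrinkage t family compared in the benchmark)."""
        m = x1.mean(axis=1) - x2.mean(axis=1)
        se = np.sqrt(x1.var(axis=1, ddof=1) / x1.shape[1]
                     + x2.var(axis=1, ddof=1) / x2.shape[1])
        return m / (se + s0)

    rng = np.random.default_rng(0)
    genes, reps = 1000, 3
    x1 = rng.normal(0, 0.5, (genes, reps))
    x2 = rng.normal(0, 0.5, (genes, reps))
    x2[:50] += 1.0                              # 50 truly changed genes
    t = regularized_t(x1, x2)
    # Fraction of the 50 most extreme (negative) scores that are truly changed:
    print((np.argsort(t)[:50] < 50).mean())
    ```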

  1. [The principal components analysis--method to classify the statistical variables with applications in medicine].

    PubMed

    Dascălu, Cristina Gena; Antohe, Magda Ecaterina

    2009-01-01

    Based on the analysis of eigenvalues and eigenvectors, principal component analysis aims to identify, from a set of parameters, the subspace of main components that is sufficient to characterize the whole set. Interpreting the data as a cloud of points, we find through geometrical transformations the directions along which the cloud's dispersion is maximal--the lines that pass through the cloud's center of gravity and have a maximal density of points around them (by defining an appropriate criterion function and minimizing it). This method can be used successfully to simplify the statistical analysis of questionnaires, because it helps us select from a set of items only the most relevant ones, those that cover the variation of the whole data set. For instance, in the presented sample we started from a questionnaire with 28 items and, applying principal component analysis, we identified 7 principal components--or main items--which significantly simplifies the subsequent statistical analysis of the data.
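
    A minimal sketch of the procedure described above, selecting principal components of a synthetic 28-item questionnaire by explained variance (the 95% cutoff is our illustrative choice):

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    # Synthetic questionnaire: 200 respondents x 28 items driven by 7 latent factors.
    latent = rng.normal(size=(200, 7))
    loadings = rng.normal(size=(7, 28))
    items = latent @ loadings + 0.3 * rng.normal(size=(200, 28))

    X = items - items.mean(axis=0)                  # center the variables
    cov = np.cov(X, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)          # eigen-decomposition
    order = np.argsort(eigvals)[::-1]
    explained = eigvals[order] / eigvals.sum()

    k = int(np.searchsorted(np.cumsum(explained), 0.95) + 1)
    print(f"{k} components explain 95% of the variance")   # close to 7 for these data
    scores = X @ eigvecs[:, order[:k]]              # reduced representation
    ```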

  2. Statistics Poster Challenge for Schools

    ERIC Educational Resources Information Center

    Payne, Brad; Freeman, Jenny; Stillman, Eleanor

    2013-01-01

    The analysis and interpretation of data are important life skills. A poster challenge for schoolchildren provides an innovative outlet for these skills and demonstrates their relevance to daily life. We discuss our Statistics Poster Challenge and the lessons we have learned.

  3. Research methodology in dentistry: Part II — The relevance of statistics in research

    PubMed Central

    Krithikadatta, Jogikalmat; Valarmathi, Srinivasan

    2012-01-01

    The lifeline of original research depends on adept statistical analysis. However, there have been reports of statistical misconduct in studies that could arise from an inadequate understanding of the fundamentals of statistics. There have been several reports on this across the medical and dental literature. This article aims at encouraging the reader to approach statistics from its logic rather than its theoretical perspective. The article also provides information on statistical misuse in the Journal of Conservative Dentistry between the years 2008 and 2011. PMID:22876003

  4. Estimation of internal organ motion-induced variance in radiation dose in non-gated radiotherapy

    NASA Astrophysics Data System (ADS)

    Zhou, Sumin; Zhu, Xiaofeng; Zhang, Mutian; Zheng, Dandan; Lei, Yu; Li, Sicong; Bennion, Nathan; Verma, Vivek; Zhen, Weining; Enke, Charles

    2016-12-01

    In the delivery of non-gated radiotherapy (RT), owing to intra-fraction organ motion, a certain degree of RT dose uncertainty is present. Herein, we propose a novel mathematical algorithm to estimate the mean and variance of RT dose that is delivered without gating. These parameters are specific to individual internal organ motion, dependent on individual treatment plans, and relevant to the RT delivery process. This algorithm uses images from a patient’s 4D simulation study to model the actual patient internal organ motion during RT delivery. All necessary dose rate calculations are performed in fixed patient internal organ motion states. The analytical and deterministic formulae of mean and variance in dose from non-gated RT were derived directly via statistical averaging of the calculated dose rate over possible random internal organ motion initial phases, and did not require constructing relevant histograms. All results are expressed in dose rate Fourier transform coefficients for computational efficiency. Exact solutions are provided to simplified, yet still clinically relevant, cases. Results from a volumetric-modulated arc therapy (VMAT) patient case are also presented. The results obtained from our mathematical algorithm can aid clinical decisions by providing information regarding both mean and variance of radiation dose to non-gated patients prior to RT delivery.
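
    The paper's formulas are analytic; as a purely numerical illustration of the quantity they describe, the sketch below Monte Carlo-averages the dose delivered over a window when the periodic dose rate starts at a random phase (synthetic dose-rate trace, our variable names):

    ```python
    import numpy as np

    T = 4.0                                          # breathing period (s)
    t = np.linspace(0.0, T, 512, endpoint=False)
    rate = 1.0 + 0.4 * np.sin(2 * np.pi * t / T)     # periodic dose rate (Gy/s)

    def delivered_dose(phase, duration=10.0):
        """Integrate the periodic dose rate over the delivery window,
        starting at the given motion phase (in seconds)."""
        tt = phase + np.arange(0.0, duration, T / 512)
        return np.interp(tt, t, rate, period=T).sum() * (T / 512)

    phases = np.random.default_rng(0).uniform(0, T, 2000)   # random initial phases
    doses = np.array([delivered_dose(ph) for ph in phases])
    print(f"mean={doses.mean():.3f} Gy, std={doses.std():.4f} Gy")
    ```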

  5. Prolonged grief symptoms related to loss of physical functioning: examining unique associations with medical service utilization.

    PubMed

    Holland, Jason M; Graves, Stacy; Klingspon, Kara L; Rozalski, Vincent

    2016-01-01

    Prolonged grief, a severe and chronic form of grieving most commonly studied in the context of bereavement, may have relevance to losses associated with chronic illness (e.g. grief related to loss of functioning or loss of a planned future). The purpose of the present study is to examine the unique associations between prolonged grief symptoms and service utilization patterns. An online self-report assessment battery was administered among a sample of 275 older adults with at least one chronic illness that caused significant physical impairment. Even after statistically controlling for relevant physical health (e.g. severity of physical limitations, somatic symptoms, number of chronic illnesses) and psychosocial variables (e.g. social support, depression/anxiety), more severe prolonged grief symptoms were associated with a greater number of emergency room visits, overnight stays in the hospital and total nights in the hospital. These findings highlight the importance of screening for prolonged grief symptomatology with older individuals with a debilitating chronic illness. Recent evidence suggests that prolonged grief may have relevance for losses associated with physical illness. The present study shows that prolonged grief reactions related to physical illness (e.g. grieving the loss of functioning) are uniquely associated with increased hospital-based service utilization. Given the relevance of prolonged grief reactions in this population, practitioners may wish to assess for these symptoms. Future clinical research should focus on developing interventions to target prolonged grief symptoms associated with these losses.

  6. Twitter: A Novel Tool for Studying the Health and Social Needs of Transgender Communities

    PubMed Central

    Young, Sean D

    2015-01-01

    Background: Limited research has examined the health and social needs of transgender and gender nonconforming populations. Due to high levels of stigma, transgender individuals may avoid disclosing their identities to researchers, hindering this type of work. Further, researchers have traditionally relied on clinic-based sampling methods, which may mask the true heterogeneity of transgender and gender nonconforming communities. Online social networking websites present a novel platform for studying this diverse, difficult-to-reach population. Objective: The objective of this study was to attempt to examine the perceived health and social needs of transgender and gender nonconforming communities by examining messages posted to the popular microblogging platform, Twitter. Methods: Tweets were collected from 13 transgender-related hashtags on July 11, 2014. They were read and coded according to general themes addressed, and a content analysis was performed. Qualitative and descriptive statistics are presented. Results: A total of 1135 tweets were collected. Both "positive" and "negative" events were discussed, in both personal and social contexts. Violence, discrimination, suicide, and sexual risk behavior were discussed. There were 34.36% (390/1135) of tweets that addressed transgender-relevant current events, and 60.79% (690/1135) provided a link to a relevant news article or resource. Conclusions: This study found that transgender individuals and allies use Twitter to discuss health and social needs relevant to the population. Real-time social media sites like Twitter can be used to study issues relevant to transgender communities. PMID:26082941

  7. Twitter: A Novel Tool for Studying the Health and Social Needs of Transgender Communities.

    PubMed

    Krueger, Evan A; Young, Sean D

    2015-01-01

    Limited research has examined the health and social needs of transgender and gender nonconforming populations. Due to high levels of stigma, transgender individuals may avoid disclosing their identities to researchers, hindering this type of work. Further, researchers have traditionally relied on clinic-based sampling methods, which may mask the true heterogeneity of transgender and gender nonconforming communities. Online social networking websites present a novel platform for studying this diverse, difficult-to-reach population. The objective of this study was to attempt to examine the perceived health and social needs of transgender and gender nonconforming communities by examining messages posted to the popular microblogging platform, Twitter. Tweets were collected from 13 transgender-related hashtags on July 11, 2014. They were read and coded according to general themes addressed, and a content analysis was performed. Qualitative and descriptive statistics are presented. A total of 1135 tweets were collected. Both "positive" and "negative" events were discussed, in both personal and social contexts. Violence, discrimination, suicide, and sexual risk behavior were discussed. There were 34.36% (390/1135) of tweets that addressed transgender-relevant current events, and 60.79% (690/1135) provided a link to a relevant news article or resource. This study found that transgender individuals and allies use Twitter to discuss health and social needs relevant to the population. Real-time social media sites like Twitter can be used to study issues relevant to transgender communities.

  8. Guide to good statistical practice in the transportation field

    DOT National Transportation Integrated Search

    2003-05-01

    Quality of data has many faces. Primarily, it has to be relevant (i.e., useful) to its users. Relevance is achieved through a series of steps starting with a planning process that links user needs to data requirements. It continues through acquisitio...

  9. 12 CFR 348.2 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... means a natural person, corporation, or other business entity. (m) Relevant metropolitan statistical... median family income for the metropolitan statistical area (MSA), if a depository organization is located... exclusively to the business of retail merchandising or manufacturing; (ii) A person whose management functions...

  10. Introduction to the topical issue: Nonadditive entropy and nonextensive statistical mechanics

    NASA Astrophysics Data System (ADS)

    Sugiyama, Masaru

    Dear CMT readers, it is my pleasure to introduce you to this topical issue dealing with a new research field of great interest, nonextensive statistical mechanics. This theory was initiated by Constantino Tsallis' work in 1988, as a possible generalization of Boltzmann-Gibbs thermostatistics. It is based on a nonadditive entropy, nowadays referred to as the Tsallis entropy. Nonextensive statistical mechanics is expected to be a consistent and unified theoretical framework for describing the macroscopic properties of complex systems that are anomalous in view of ordinary thermostatistics. In such systems, the long-standing problem regarding the relationship between statistical and dynamical laws becomes highlighted, since ergodicity and mixing may not be well realized in situations such as the edge of chaos. The phase space appears to self-organize in a structure that is not simply Euclidean but (multi)fractal. Due to this nontrivial structure, the concept of homogeneity of the system, which is the basic premise in ordinary thermodynamics, is violated and accordingly the additivity postulate for the thermodynamic quantities such as the internal energy and entropy may not be justified, in general. (Physically, nonadditivity is deeply relevant to nonextensivity of a system, in which the thermodynamic quantities do not scale with size in a simple way. Typical examples are systems with long-range interactions like self-gravitating systems as well as nonneutral charged ones.) A point of crucial importance here is that, phenomenologically, such an exotic phase-space structure has a fairly long lifetime. Therefore, this state, referred to as a metaequilibrium state or a nonequilibrium stationary state, appears to be described by a generalized entropic principle different from the traditional Boltzmann-Gibbs form, even though it may eventually approach the Boltzmann-Gibbs equilibrium state. The limits t → ∞ and N → ∞ do not commute, where t and N are time and the number of particles, respectively. The present topical issue is devoted to summarizing the current status of nonextensive statistical mechanics from various perspectives. It is my hope that this issue can inform the reader of one of the foremost research areas in thermostatistics. This issue consists of eight articles. The first one by Tsallis and Brigatti presents a general introduction and an overview of nonextensive statistical mechanics. At first glance, generalization of the ordinary Boltzmann-Gibbs-Shannon entropy might be completely arbitrary. But Abe's article explains how Tsallis' generalization of the statistical entropy can uniquely be characterized by both physical and mathematical principles. Then, the article by Pluchino, Latora, and Rapisarda presents strong evidence that nonextensive statistical mechanics is in fact relevant to nonextensive systems with long-range interactions. The articles by Rajagopal, by Wada, and by Plastino, Miller, and Plastino are concerned with the macroscopic thermodynamic properties of nonextensive statistical mechanics. Rajagopal discusses the first and second laws of thermodynamics. Wada develops a discussion about the condition under which the nonextensive statistical-mechanical formalism is thermodynamically stable. The work of Plastino, Miller, and Plastino addresses the thermodynamic Legendre-transform structure and its robustness for generalizations of entropy. After these fundamental investigations, Sakagami and Taruya examine the theory for self-gravitating systems.
Finally, Beck presents a novel idea of the so-called superstatistics, which provides nonextensive statistical mechanics with a physical interpretation based on nonequilibrium concepts including temperature fluctuations. Its applications to hydrodynamic turbulence and pattern formation in thermal convection states are also discussed. Nonextensive statistical mechanics is already a well-studied field, and a number of works are available in the literature. It is recommended that the interested reader visit the URL http://tsallis.cat.cbpf.br/TEMUCO.pdf. There, one can find a comprehensive list of references to more than one thousand papers including important results that, due to lack of space, have not been mentioned in the present issue. Though there are so many published works, nonextensive statistical mechanics is still a developing field. This can naturally be understood, since the program that has been undertaken is an extremely ambitious one that makes a serious attempt to enlarge the horizons of the realm of statistical mechanics. The possible influence of nonextensive statistical mechanics on continuum mechanics and thermodynamics seems to be wide and deep. I will therefore be happy if this issue contributes to attracting the interest of researchers and stimulates research activities not only in the very field of nonextensive statistical mechanics but also in the field of continuum mechanics and thermodynamics in a wider context. As the editor of the present topical issue, I would like to express my sincere thanks to all those who joined up to make this issue. I cordially thank Professor S. Abe for advising me on the editorial policy. Without his help, the present topical issue would never have been brought out.

  11. Rare Event Simulation for T-cell Activation

    NASA Astrophysics Data System (ADS)

    Lipsmeier, Florian; Baake, Ellen

    2009-02-01

    The problem of statistical recognition is considered, as it arises in immunobiology, namely, the discrimination of foreign antigens against a background of the body's own molecules. The precise mechanism of this foreign-self-distinction, though one of the major tasks of the immune system, continues to be a fundamental puzzle. Recent progress has been made by van den Berg, Rand, and Burroughs (J. Theor. Biol. 209:465-486, 2001), who modelled the probabilistic nature of the interaction between the relevant cell types, namely, T-cells and antigen-presenting cells (APCs). Here, the stochasticity is due to the random sample of antigens present on the surface of every APC, and to the random receptor type that characterises individual T-cells. It has been shown previously (van den Berg et al. in J. Theor. Biol. 209:465-486, 2001; Zint et al. in J. Math. Biol. 57:841-861, 2008) that this model, though highly idealised, is capable of reproducing important aspects of the recognition phenomenon, and of explaining them on the basis of stochastic rare events. These results were obtained with the help of a refined large deviation theorem and were thus asymptotic in nature. Simulations have, so far, been restricted to the straightforward simple sampling approach, which does not allow for sample sizes large enough to address more detailed questions. Building on the available large deviation results, we develop an importance sampling technique that allows for a convenient exploration of the relevant tail events by means of simulation. With its help, we investigate the mechanism of statistical recognition in some depth. In particular, we illustrate how a foreign antigen can stand out against the self background if it is present in sufficiently many copies, although no a priori difference between self and nonself is built into the model.
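
    The importance-sampling idea can be shown in miniature on a textbook rare event, estimating a Gaussian tail probability with an exponentially tilted (shifted) proposal; this is the generic technique, not the paper's T-cell model:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    a, n = 5.0, 100_000

    # Naive simple sampling: P(X > 5) for X ~ N(0,1) is ~2.9e-7, so 1e5 draws
    # will almost surely see zero hits.
    print((rng.standard_normal(n) > a).mean())

    # Importance sampling: draw from the shifted proposal N(a, 1) and reweight
    # by the likelihood ratio phi(x) / phi(x - a) = exp(a^2/2 - a*x).
    x = rng.normal(a, 1.0, n)
    w = np.exp(0.5 * a * a - a * x)
    print(((x > a) * w).mean())       # ~2.87e-7, with usable relative error
    ```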

  12. Constructing Noise-Invariant Representations of Sound in the Auditory Pathway

    PubMed Central

    Rabinowitz, Neil C.; Willmore, Ben D. B.; King, Andrew J.; Schnupp, Jan W. H.

    2013-01-01

    Identifying behaviorally relevant sounds in the presence of background noise is one of the most important and poorly understood challenges faced by the auditory system. An elegant solution to this problem would be for the auditory system to represent sounds in a noise-invariant fashion. Since a major effect of background noise is to alter the statistics of the sounds reaching the ear, noise-invariant representations could be promoted by neurons adapting to stimulus statistics. Here we investigated the extent of neuronal adaptation to the mean and contrast of auditory stimulation as one ascends the auditory pathway. We measured these forms of adaptation by presenting complex synthetic and natural sounds, recording neuronal responses in the inferior colliculus and primary fields of the auditory cortex of anaesthetized ferrets, and comparing these responses with a sophisticated model of the auditory nerve. We find that the strength of both forms of adaptation increases as one ascends the auditory pathway. To investigate whether this adaptation to stimulus statistics contributes to the construction of noise-invariant sound representations, we also presented complex, natural sounds embedded in stationary noise, and used a decoding approach to assess the noise tolerance of the neuronal population code. We find that the code for complex sounds in the periphery is affected more by the addition of noise than the cortical code. We also find that noise tolerance is correlated with adaptation to stimulus statistics, so that populations that show the strongest adaptation to stimulus statistics are also the most noise-tolerant. This suggests that the increase in adaptation to sound statistics from auditory nerve to midbrain to cortex is an important stage in the construction of noise-invariant sound representations in the higher auditory brain. PMID:24265596

  13. The importance of coherence in inverse problems in optics

    NASA Astrophysics Data System (ADS)

    Ferwerda, H. A.; Baltes, H. P.; Glass, A. S.; Steinle, B.

    1981-12-01

    Current inverse problems of statistical optics are presented with a guide to relevant literature. The inverse problems are categorized into four groups, and the Van Cittert-Zernike theorem and its generalization are discussed. The retrieval of structural information from the far-zone degree of coherence and the time-averaged intensity distribution of radiation scattered by a superposition of random and periodic scatterers are also discussed. In addition, formulas for the calculation of far-zone properties are derived within the framework of scalar optics, and results are applied to two examples.

  14. Stochastic Growth of Ion Cyclotron And Mirror Waves In Earth's Magnetosheath

    NASA Technical Reports Server (NTRS)

    Cairns, Iver H.; Grubits, K. A.

    2001-01-01

    Electromagnetic ion cyclotron and mirror waves in Earth's magnetosheath are bursty, have widely variable fields, and are unexpectedly persistent, properties difficult to reconcile with uniform secular growth. Here it is shown for specific periods that stochastic growth theory (SGT) quantitatively accounts for the functional form of the wave statistics and qualitatively explains the wave properties. The wave statistics are inconsistent with uniform secular growth or self-organized criticality, but nonlinear processes sometimes play a role at high fields. The results show SGT's relevance near marginal stability and suggest that it is widely relevant to space and astrophysical plasmas.

  15. Identifying natural flow regimes using fish communities

    NASA Astrophysics Data System (ADS)

    Chang, Fi-John; Tsai, Wen-Ping; Wu, Tzu-Ching; Chen, Hung-kwai; Herricks, Edwin E.

    2011-10-01

    Modern water resources management has adopted natural flow regimes as reasonable targets for river restoration and conservation. The characterization of a natural flow regime begins with the development of hydrologic statistics from flow records. However, little guidance exists for defining the period of record needed for regime determination. In Taiwan, the Taiwan Eco-hydrological Indicator System (TEIS), a group of hydrologic statistics selected for fisheries relevance, is being used to evaluate ecological flows. The TEIS consists of a group of hydrologic statistics selected to characterize the relationships between flow and the life history of indigenous species. Using the TEIS and biosurvey data for Taiwan, this paper identifies the length of hydrologic record sufficient for natural flow regime characterization. To define the ecological hydrology of fish communities, this study connected hydrologic statistics to fish communities by using methods to define antecedent conditions that influence existing community composition. A moving average method was applied to TEIS statistics to reflect the effects of antecedent flow condition and a point-biserial correlation method was used to relate fisheries collections with TEIS statistics. The resulting fish species-TEIS (FISH-TEIS) hydrologic statistics matrix takes full advantage of historical flows and fisheries data. The analysis indicates that, in the watersheds analyzed, averaging TEIS statistics for the present year and 3 years prior to the sampling date, termed MA(4), is sufficient to develop a natural flow regime. This result suggests that flow regimes based on hydrologic statistics for the period of record can be replaced by regimes developed for sampled fish communities.
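
    A hedged sketch of the MA(4) smoothing and the point-biserial step, with hypothetical yearly data standing in for a TEIS statistic and a species presence/absence record:

        import numpy as np
        from scipy.stats import pointbiserialr

        rng = np.random.default_rng(1)

        # Hypothetical 30-year record of one TEIS hydrologic statistic and a
        # binary presence/absence record from yearly fish surveys.
        teis_stat = rng.gamma(shape=2.0, scale=10.0, size=30)
        present = rng.integers(0, 2, size=30)

        # MA(4): average the statistic over the sampling year plus the three
        # prior years (only years with a full window are usable).
        ma4 = np.convolve(teis_stat, np.ones(4) / 4.0, mode="valid")

        # Point-biserial correlation between presence and the antecedent flow
        # statistic, aligned to the years that have a complete MA(4).
        r, p = pointbiserialr(present[3:], ma4)
        print(f"r = {r:.2f}, p = {p:.3f}")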

  16. SPATIO-TEMPORAL MODELING OF AGRICULTURAL YIELD DATA WITH AN APPLICATION TO PRICING CROP INSURANCE CONTRACTS

    PubMed Central

    Ozaki, Vitor A.; Ghosh, Sujit K.; Goodwin, Barry K.; Shirota, Ricardo

    2009-01-01

    This article presents a statistical model of agricultural yield data based on a set of hierarchical Bayesian models that allows joint modeling of temporal and spatial autocorrelation. This method captures a comprehensive range of the various uncertainties involved in predicting crop insurance premium rates as opposed to the more traditional ad hoc, two-stage methods that are typically based on independent estimation and prediction. A panel data set of county-average yield data was analyzed for 290 counties in the State of Paraná (Brazil) for the period of 1990 through 2002. Posterior predictive criteria are used to evaluate different model specifications. This article provides substantial improvements in the statistical and actuarial methods often applied to the calculation of insurance premium rates. These improvements are especially relevant to situations where data are limited. PMID:19890450
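
    The pricing step such a model feeds into can be sketched as follows; the posterior predictive draws and the coverage level here are stand-ins, not the fitted hierarchical model of the article:

        import numpy as np

        rng = np.random.default_rng(2)

        # Stand-in for posterior predictive draws of next season's county
        # yield (kg/ha); in the article these come from the Bayesian model.
        yield_draws = rng.normal(loc=2500.0, scale=400.0, size=20_000)

        coverage = 0.70                               # assumed coverage level
        guarantee = coverage * yield_draws.mean()     # yield guarantee

        # Actuarially fair premium rate = E[max(guarantee - yield, 0)] / guarantee.
        shortfall = np.clip(guarantee - yield_draws, 0.0, None)
        print(f"premium rate = {shortfall.mean() / guarantee:.3%}")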

  17. Entropy production in mesoscopic stochastic thermodynamics: nonequilibrium kinetic cycles driven by chemical potentials, temperatures, and mechanical forces

    NASA Astrophysics Data System (ADS)

    Qian, Hong; Kjelstrup, Signe; Kolomeisky, Anatoly B.; Bedeaux, Dick

    2016-04-01

    Nonequilibrium thermodynamics (NET) investigates processes in systems out of global equilibrium. On a mesoscopic level, it provides a statistical dynamic description of various complex phenomena such as chemical reactions, ion transport, diffusion, thermochemical, thermomechanical and mechanochemical fluxes. In the present review, we introduce a mesoscopic stochastic formulation of NET by analyzing entropy production in several simple examples. The fundamental role of nonequilibrium steady-state cycle kinetics is emphasized. The statistical mechanics of Onsager’s reciprocal relations in this context is elucidated. Chemomechanical, thermomechanical, and enzyme-catalyzed thermochemical energy transduction processes are discussed. It is argued that mesoscopic stochastic NET in phase space provides a rigorous mathematical basis of fundamental concepts needed for understanding complex processes in chemistry, physics and biology. This theory is also relevant for nanoscale technological advances.
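
    For a single nonequilibrium kinetic cycle, the steady-state entropy production rate is the cycle flux times the cycle affinity; a small sketch with hypothetical rate constants (result in units of kB per unit time):

        import numpy as np

        # Hypothetical rates for a three-state cycle 1 -> 2 -> 3 -> 1.
        k12, k23, k31 = 2.0, 1.5, 3.0     # forward rates
        k21, k32, k13 = 0.5, 1.0, 0.8     # backward rates

        # Master-equation generator, dP/dt = K @ P (columns sum to zero).
        K = np.array([
            [-(k12 + k13), k21,          k31         ],
            [k12,          -(k21 + k23), k32         ],
            [k13,          k23,          -(k31 + k32)],
        ])

        # Steady state: K @ p = 0 with probabilities summing to one.
        A = np.vstack([K[:2], np.ones(3)])
        p = np.linalg.solve(A, np.array([0.0, 0.0, 1.0]))

        # Net cycle flux (equal on every edge in steady state) and affinity.
        J = p[0] * k12 - p[1] * k21
        affinity = np.log((k12 * k23 * k31) / (k21 * k32 * k13))
        print(f"entropy production rate = {J * affinity:.3f}")   # non-negative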

  18. Low-dose ionizing radiation increases the mortality risk of solid cancers in nuclear industry workers: A meta-analysis.

    PubMed

    Qu, Shu-Gen; Gao, Jin; Tang, Bo; Yu, Bo; Shen, Yue-Ping; Tu, Yu

    2018-05-01

    Low-dose ionizing radiation (LDIR) may increase the mortality of solid cancers in nuclear industry workers, but only a few individual cohort studies exist, and the available reports have low statistical power. The aim of the present study was to focus on solid cancer mortality risk from LDIR in the nuclear industry using standardized mortality ratios (SMRs) and 95% confidence intervals. A systematic literature search through the PubMed and Embase databases identified 27 studies relevant to this meta-analysis. There was statistical significance for total, solid and lung cancers, with meta-SMR values of 0.88, 0.80, and 0.89, respectively. There was evidence of stochastic effects of ionizing radiation, but more definitive conclusions require additional analyses using standardized protocols to determine whether LDIR increases the risk of solid cancer-related mortality.
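
    The pooling behind a meta-SMR can be sketched with inverse-variance weighting of log SMRs, using the large-sample approximation SE(log SMR) ≈ 1/√(observed deaths); the study counts below are hypothetical:

        import numpy as np

        # Hypothetical observed and expected solid-cancer deaths per study.
        observed = np.array([120, 45, 260, 80, 33])
        expected = np.array([150, 50, 310, 95, 40])

        log_smr = np.log(observed / expected)
        se = 1.0 / np.sqrt(observed)          # large-sample approximation

        # Fixed-effect pooling (a random-effects analysis would add a
        # between-study variance term, e.g. DerSimonian-Laird).
        w = 1.0 / se**2
        pooled = np.sum(w * log_smr) / np.sum(w)
        pooled_se = 1.0 / np.sqrt(np.sum(w))

        lo, hi = np.exp(pooled - 1.96 * pooled_se), np.exp(pooled + 1.96 * pooled_se)
        print(f"meta-SMR = {np.exp(pooled):.2f} (95% CI {lo:.2f}-{hi:.2f})")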

  19. On the statistical assessment of classifiers using DNA microarray data

    PubMed Central

    Ancona, N; Maglietta, R; Piepoli, A; D'Addabbo, A; Cotugno, R; Savino, M; Liuni, S; Carella, M; Pesole, G; Perri, F

    2006-01-01

    Background In this paper we present a method for the statistical assessment of cancer predictors which make use of gene expression profiles. The methodology is applied to a new data set of microarray gene expression data collected in Casa Sollievo della Sofferenza Hospital, Foggia – Italy. The data set is made up of normal (22) and tumor (25) specimens extracted from 25 patients affected by colon cancer. We propose to give answers to some questions which are relevant for the automatic diagnosis of cancer such as: Is the size of the available data set sufficient to build accurate classifiers? What is the statistical significance of the associated error rates? In what ways can accuracy be considered dependent on the adopted classification scheme? How many genes are correlated with the pathology and how many are sufficient for an accurate colon cancer classification? The method we propose answers these questions whilst avoiding the potential pitfalls hidden in the analysis and interpretation of microarray data. Results We estimate the generalization error, evaluated through the Leave-K-Out Cross Validation error, for three different classification schemes by varying the number of training examples and the number of genes used. The statistical significance of the error rate is measured by using a permutation test. We provide a statistical analysis in terms of the frequencies of the genes involved in the classification. Using the whole set of genes, we found that the Weighted Voting Algorithm (WVA) classifier learns the distinction between normal and tumor specimens with 25 training examples, providing e = 21% (p = 0.045) as an error rate. This remains constant even when the number of examples increases. Moreover, Regularized Least Squares (RLS) and Support Vector Machines (SVM) classifiers can learn with only 15 training examples, with an error rate of e = 19% (p = 0.035) and e = 18% (p = 0.037) respectively. Moreover, the error rate decreases as the training set size increases, reaching its best performance with 35 training examples. In this case, RLS and SVM have error rates of e = 14% (p = 0.027) and e = 11% (p = 0.019). Concerning the number of genes, we found about 6000 genes (p < 0.05) correlated with the pathology, as determined by the signal-to-noise statistic. Moreover, the performances of RLS and SVM classifiers do not change when 74% of the genes are used, and degrade only gradually, reaching e = 16% (p < 0.05) when only 2 genes are employed. The biological relevance of a set of genes determined by our statistical analysis and the major roles they play in colorectal tumorigenesis are discussed. Conclusions The method proposed provides statistically significant answers to precise questions relevant for the diagnosis and prognosis of cancer. We found that, with as few as 15 examples, it is possible to train statistically significant classifiers for colon cancer diagnosis. As for the definition of the number of genes sufficient for a reliable classification of colon cancer, our results suggest that it depends on the accuracy required. PMID:16919171
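
    A compact sketch of the permutation-test idea for classifier error rates, on synthetic data (not the Casa Sollievo della Sofferenza set) and with a linear SVM standing in for the classifiers above:

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.model_selection import StratifiedKFold, permutation_test_score

        rng = np.random.default_rng(3)

        # Synthetic stand-in: 47 specimens x 2000 genes, with a weak
        # class-dependent shift in the first 50 genes.
        X = rng.normal(size=(47, 2000))
        y = np.array([0] * 22 + [1] * 25)
        X[y == 1, :50] += 0.6

        # Accuracy on the true labels is compared against accuracies obtained
        # after repeatedly permuting the labels, yielding a p-value.
        score, _, p_value = permutation_test_score(
            SVC(kernel="linear"), X, y,
            cv=StratifiedKFold(5), n_permutations=200, scoring="accuracy",
        )
        print(f"accuracy = {score:.2f}, permutation p = {p_value:.3f}")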

  20. Cognitive function in early clinical phase huntington disease after rivastigmine treatment.

    PubMed

    Sešok, Sanja; Bolle, Nika; Kobal, Jan; Bucik, Valentin; Vodušek, David B

    2014-09-01

    In Huntington disease (HD) patients receiving rivastigmine treatment, improvement of behavioral symptoms and of cognitive function (assessed with screening diagnostic instruments) has been reported. The aim of the present study was to verify such improvement by assessing cognitive function with a detailed neuropsychological battery covering all relevant cognitive systems expected to be impaired in early-phase HD. Eighteen (18) HD patients entered the study and were randomly allocated to the rivastigmine or placebo group. All subjects underwent neuropsychological assessment at baseline. Follow-up neuropsychological assessment was applied after 6 months of rivastigmine or placebo treatment. Eighteen (18) healthy controls entered the study to control for practice effects and underwent neuropsychological assessment at baseline and after 6 months, without treatment. The neuropsychological battery consisted of assessment tools that are sensitive to the cognitive impairment seen in early-phase HD: CTMT, SDMT, Stroop (attention and information control), RFFT, TOL, Verbal fluency (executive functioning), CVLT-II, RCFT (learning and memory). The effects of rivastigmine and of possible practice were assessed using a mixed ANOVA model. No statistically significant effect of rivastigmine treatment on cognitive function in HD patients was detected. There was no evidence of a practice or placebo effect. Detailed neuropsychological assessment thus did not confirm the previously reported effect of rivastigmine treatment on cognitive function in HD patients. The limitations of our study are, in particular, the small sample size and the lack of a single measure of relevant cognitive functioning in HD patients. Instead of focusing solely on statistical significance, a clinical relevance study is proposed to clarify the issue of rivastigmine effects in HD.

  1. An open source Bayesian Monte Carlo isotope mixing model with applications in Earth surface processes

    NASA Astrophysics Data System (ADS)

    Arendt, Carli A.; Aciego, Sarah M.; Hetland, Eric A.

    2015-05-01

    The implementation of isotopic tracers as constraints on source contributions has become increasingly relevant to understanding Earth surface processes. Interpretation of these isotopic tracers has become more accessible with the development of Bayesian Monte Carlo (BMC) mixing models, which allow for uncertainty in mixing end-members and provide methodology for systems with multicomponent mixing. This study presents an open source multiple isotope BMC mixing model that is applicable to Earth surface environments with sources exhibiting distinct end-member isotopic signatures. Our model is first applied to new δ18O and δD measurements from the Athabasca Glacier, which showed expected seasonal melt evolution trends and rigorously assessed the statistical relevance of the resulting fraction estimates. To highlight the broad applicability of our model to a variety of Earth surface environments and relevant isotopic systems, we expand our model to two additional case studies: deriving melt sources from δ18O, δD, and 222Rn measurements of Greenland Ice Sheet bulk water samples and assessing nutrient sources from ɛNd and 87Sr/86Sr measurements of Hawaiian soil cores. The model produces results for the Greenland Ice Sheet and Hawaiian soil data sets that are consistent with the originally published fractional contribution estimates. The advantage of this method is that it quantifies the error induced by variability in the end-member compositions, which was not captured by the models previously applied to the above case studies. Results from all three case studies demonstrate the broad applicability of this statistical BMC isotopic mixing model for estimating source contribution fractions in a variety of Earth surface systems.
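
    A stripped-down sketch of the BMC mixing idea for two end-members and two tracers; every number below (end-member means, uncertainties, mixture values) is an illustrative assumption, not data from the study:

        import numpy as np

        rng = np.random.default_rng(4)
        n = 200_000

        # Hypothetical end-member means and 1-sigma uncertainties (d18O, dD).
        snow = {"d18O": (-20.0, 0.8), "dD": (-155.0, 5.0)}
        ice = {"d18O": (-14.0, 0.8), "dD": (-105.0, 5.0)}
        mix = {"d18O": (-16.5, 0.3), "dD": (-125.0, 2.0)}   # measured mixture

        # Flat prior on the snow fraction; end-members drawn from their
        # uncertainty distributions; samples weighted by the likelihood of
        # reproducing the measured mixture values.
        f = rng.uniform(size=n)
        log_w = np.zeros(n)
        for tracer in ("d18O", "dD"):
            s = rng.normal(*snow[tracer], size=n)
            i = rng.normal(*ice[tracer], size=n)
            mu, sd = mix[tracer]
            log_w += -0.5 * ((f * s + (1 - f) * i - mu) / sd) ** 2

        w = np.exp(log_w - log_w.max())
        print(f"posterior mean snow fraction = {np.sum(w * f) / np.sum(w):.2f}")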

  2. Arthroscopic Debridement for Primary Degenerative Osteoarthritis of the Elbow Leads to Significant Improvement in Range of Motion and Clinical Outcomes: A Systematic Review.

    PubMed

    Sochacki, Kyle R; Jack, Robert A; Hirase, Takashi; McCulloch, Patrick C; Lintner, David M; Liberman, Shari R; Harris, Joshua D

    2017-12-01

    The purpose of this investigation was to determine whether arthroscopic debridement of primary elbow osteoarthritis results in statistically significant and clinically relevant improvement in (1) elbow range of motion and (2) clinical outcomes with (3) low complication and reoperation rates. A systematic review was registered with PROSPERO and performed using PRISMA guidelines. Databases were searched for studies that investigated the outcomes of arthroscopic debridement for the treatment of primary osteoarthritis of the elbow in adult human patients. Study methodological quality was analyzed. Studies that included post-traumatic arthritis were excluded. Elbow motion and all elbow-specific patient-reported outcome scores were eligible for analysis. Comparisons between preoperative and postoperative values from each study were made using 2-sample Z-tests (http://in-silico.net/tools/statistics/ztest), with P < .05 considered significant. Nine articles (209 subjects, 213 elbows, 187 males, 22 females, mean age 45.7 ± 7.1 years, mean follow-up 41.7 ± 16.3 months; 75% right, 25% left; 79% dominant elbow, 21% nondominant) were analyzed. Elbow extension (23.4°-10.7°, Δ 12.7°), flexion (115.9°-128.7°, Δ 12.8°), and global arc of motion (94.5°-117.6°, Δ 23.1°) had statistically significant and clinically relevant improvement following arthroscopic debridement (P < .0001 for all). There was also a statistically significant (P < .0001) and clinically relevant improvement in the Mayo Elbow Performance Score (60.7-84.6, Δ 23.9) postoperatively. Six patients (2.8%) had postoperative complications. Nine (4.2%) underwent reoperation. Elbow arthroscopic debridement for primary degenerative osteoarthritis results in statistically significant and clinically relevant improvement in elbow range of motion and clinical outcomes with low complication and reoperation rates. Systematic review of level IV studies. Copyright © 2017 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.
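
    The form of Z-test used for these comparisons can be reproduced from summary statistics; the means below are those reported for the Mayo Elbow Performance Score, while the standard deviations and group sizes are assumptions for illustration:

        from math import sqrt
        from scipy.stats import norm

        def two_sample_z(m1, sd1, n1, m2, sd2, n2):
            """Two-sample Z-test for means from summary statistics."""
            z = (m1 - m2) / sqrt(sd1**2 / n1 + sd2**2 / n2)
            return z, 2.0 * norm.sf(abs(z))

        # Pre- vs. postoperative Mayo Elbow Performance Score (means 60.7 and
        # 84.6 as reported; SDs of 14 and 12 and n = 213 are assumed here).
        z, p = two_sample_z(60.7, 14.0, 213, 84.6, 12.0, 213)
        print(f"z = {z:.2f}, p = {p:.2g}")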

  3. Discovering Conformational Sub-States Relevant to Protein Function

    PubMed Central

    Ramanathan, Arvind; Savol, Andrej J.; Langmead, Christopher J.; Agarwal, Pratul K.; Chennubhotla, Chakra S.

    2011-01-01

    Background Internal motions enable proteins to explore a range of conformations, even in the vicinity of the native state. The role of conformational fluctuations in the designated function of a protein is widely debated. Emerging evidence suggests that sub-groups within the range of conformations (or sub-states) contain properties that may be functionally relevant. However, low populations in these sub-states and the transient nature of conformational transitions between these sub-states present significant challenges for their identification and characterization. Methods and Findings To overcome these challenges we have developed a new computational technique, quasi-anharmonic analysis (QAA). QAA utilizes higher-order statistics of protein motions to identify sub-states in the conformational landscape. Further, the focus on anharmonicity allows identification of conformational fluctuations that enable transitions between sub-states. QAA applied to equilibrium simulations of human ubiquitin and T4 lysozyme reveals functionally relevant sub-states and protein motions involved in molecular recognition. In combination with a reaction pathway sampling method, QAA characterizes conformational sub-states associated with cis/trans peptidyl-prolyl isomerization catalyzed by the enzyme cyclophilin A. In these three proteins, QAA allows identification of conformational sub-states, with critical structural and dynamical features relevant to protein function. Conclusions Overall, QAA provides a novel framework to intuitively understand the biophysical basis of conformational diversity and its relevance to protein function. PMID:21297978

  4. Prognostic relevance of motor talent predictors in early adolescence: A group- and individual-based evaluation considering different levels of achievement in youth football.

    PubMed

    Höner, Oliver; Votteler, Andreas

    2016-12-01

    In the debate about the usefulness of motor diagnostics in the talent identification process, the prognostic validity for tests conducted in early adolescence is of critical interest. Using a group- and individual-based statistical approach, this prospective cohort study evaluated a nationwide assessment of speed abilities and technical skills regarding its relevance for future achievement levels. The sample consisted of 22,843 U12-players belonging to the top 4% in German football. The U12-results in five tests served as predictors for players' selection levels in U16-U19 (youth national team, regional association, youth academy, not selected). Group-mean differences proved the prognostic relevance for all predictors. Low individual selection probabilities demonstrated limited predictive values, while excellent test results proved their particular prognostic relevance. Players scoring percentile ranks (PRs) ≥ 99 had a 12 times higher chance to become youth national team players than players scoring PR < 99. Simulating increasing score cut-off values not only enhanced specificity (correctly identified non-talents) but also led to lower sensitivity (loss of talents). Extending the current research, these different approaches revealed the ambiguity of the diagnostics' prognostic relevance, representing both the usefulness and several pitfalls of nationwide diagnostics. Therefore, the present diagnostics can support but not substitute for coaches' subjective decisions for talent identification, and multidisciplinary designs are required.
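
    The cut-off trade-off reported here is the usual sensitivity/specificity sweep; a sketch on simulated percentile ranks (all numbers are hypothetical, with later selection made rare and linked to high PRs):

        import numpy as np

        rng = np.random.default_rng(5)

        # 10,000 hypothetical players: motor-test percentile rank and a rare
        # later-selection outcome whose probability rises steeply with PR.
        pr = rng.uniform(0, 100, size=10_000)
        selected = rng.random(10_000) < 0.001 + 0.02 * (pr / 100) ** 12

        for cutoff in (90, 95, 99):
            flagged = pr >= cutoff
            sens = np.mean(flagged[selected])      # talents retained
            spec = np.mean(~flagged[~selected])    # non-talents excluded
            print(f"PR >= {cutoff}: sensitivity {sens:.2f}, specificity {spec:.2f}")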

  5. First results of a national initiative to enable quality improvement of cardiovascular care by transparently reporting on patient-relevant outcomes.

    PubMed

    van Veghel, Dennis; Marteijn, Marloes; de Mol, Bas

    2016-06-01

    The aims of this study were to assess patient-relevant outcomes of delivered cardiovascular care by focusing on disease management as determined by a multidisciplinary Heart Team, to establish and share best practices by comparing outcomes and to embed value-based decision-making to improve quality and efficiency in Dutch heart centres. In 2014, 12 Dutch heart centres pooled patient-relevant outcome data, which resulted in transparent publication of the outcomes, including long-term follow-up up to 5 years, of approximately 86 000 heart patients. This study presents the results of both disease- and treatment-related patient-relevant outcome measures for coronary artery disease and aortic valve disease. The patients included were presented to a Heart Team and underwent invasive or operative treatment. In-hospital and out-of-hospital patient-relevant outcome measures were collected as well as initial conditions. Quality of life was assessed using the Short Form (SF)-36 or SF-12 health survey. In the Netherlands, patient-relevant and risk-adjusted outcomes of cardiovascular care in participating heart centres are published annually. Data were sufficiently reliable to enable comparisons and to extract best practices. The statistically lower risk-adjusted mortality rate after coronary artery bypass grafting resulted in a voluntary roll-out of a perioperative safety check. The in-depth analysis of outcomes after percutaneous coronary intervention resulted in process improvements in several heart centres, such as pre-hydration for patients with renal insufficiency and closer attention to the need for target vessel revascularization within a year. Annual data collection on follow-up of patient-relevant outcomes of cardiovascular care, initiated and organized by physicians, appears feasible. Transparent publication of outcomes drives the improvement of quality within heart centres. The system of using a limited set of patient-relevant outcome measures enables reliable comparisons and exposes the quality of decision-making and the operational process. Transparent communication on outcomes is feasible, safe and cost-effective, and stimulates professional decision-making and disease management. © The Author 2016. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.

  6. Do two machine-learning based prognostic signatures for breast cancer capture the same biological processes?

    PubMed

    Drier, Yotam; Domany, Eytan

    2011-03-14

    The fact that there is very little if any overlap between the genes of different prognostic signatures for early-discovery breast cancer is well documented. The reasons for this apparent discrepancy have been explained by the limits of simple machine-learning identification and ranking techniques, and the biological relevance and meaning of the prognostic gene lists were questioned. Subsequently, proponents of the prognostic gene lists claimed that different lists do capture similar underlying biological processes and pathways. The present study places the validity of this claim under scrutiny for two important gene lists that are at the focus of current large-scale validation efforts. We performed careful enrichment analysis, controlling the effects of multiple testing in a manner that takes into account the nested dependent structure of gene ontologies. In contradiction to several previous publications, we find that the only biological process or pathway for which statistically significant concordance can be claimed is cell proliferation, a process whose relevance and prognostic value were well known long before gene expression profiling. We found that the claims reported by others, of wider concordance between the biological processes captured by the two prognostic signatures studied, either lacked statistical rigor or in fact addressed some other question.
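
    The elementary computation underlying such an enrichment analysis is a hypergeometric tail test per category (the multiple-testing correction over nested GO terms is the hard part and is not shown); the counts below are hypothetical:

        from scipy.stats import hypergeom

        # Of N genes on the array, K are annotated "cell proliferation";
        # a signature of n genes contains k of them.
        N, K, n, k = 20_000, 600, 70, 12

        # One-sided enrichment p-value: P(X >= k) under the hypergeometric null.
        p = hypergeom.sf(k - 1, N, K, n)
        print(f"enrichment p = {p:.2e}")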

  7. Identifying Minefields and Verifying Clearance: Adapting Statistical Methods for UXO Target Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, Richard O.; O'Brien, Robert F.; Wilson, John E.

    2003-09-01

    It may not be feasible to completely survey large tracts of land suspected of containing minefields. It is desirable to develop a characterization protocol that will confidently identify minefields within these large land tracts if they exist. Naturally, surveying areas of greatest concern and most likely locations would be necessary, but this alone would not provide the needed confidence that an unknown minefield has not eluded detection. Once minefields are detected, methods are needed to bound the area that will require detailed mine detection surveys. The US Department of Defense Strategic Environmental Research and Development Program (SERDP) is sponsoring the development of statistical survey methods and tools for detecting potential UXO targets. These methods may be directly applicable to demining efforts. Statistical methods are employed to determine the optimal geophysical survey transect spacing to have confidence of detecting target areas of a critical size, shape, and anomaly density. Other methods under development determine the proportion of a land area that must be surveyed to confidently conclude that there are no UXO present. Adaptive sampling schemes are also being developed as an approach for bounding the target areas. These methods and tools are presented, and the status of relevant research in this area is discussed.
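
    The basic transect-spacing calculation can be sketched geometrically: at least one of a set of parallel transects crosses a circular target area whenever the target diameter exceeds the spacing. The sketch below assumes certain detection on traversal, which real designs relax with anomaly-density and sensor-performance models:

        def traverse_probability(target_diameter_m, spacing_m):
            """Chance that parallel transects cross a circular target whose
            center is uniformly located between adjacent transects."""
            return min(1.0, target_diameter_m / spacing_m)

        # Hypothetical design question: spacing needed to cross a 50 m
        # target area with high confidence.
        for s in (40.0, 55.0, 80.0):
            p = traverse_probability(50.0, s)
            print(f"spacing {s:>4.0f} m -> P(traverse) = {p:.2f}")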

  8. Statistical considerations on prognostic models for glioma

    PubMed Central

    Molinaro, Annette M.; Wrensch, Margaret R.; Jenkins, Robert B.; Eckel-Passow, Jeanette E.

    2016-01-01

    Given the lack of beneficial treatments in glioma, there is a need for prognostic models for therapeutic decision making and life planning. Recently several studies defining subtypes of glioma have been published. Here, we review the statistical considerations of how to build and validate prognostic models, explain the models presented in the current glioma literature, and discuss advantages and disadvantages of each model. The 3 statistical considerations to establishing clinically useful prognostic models are: study design, model building, and validation. Careful study design helps to ensure that the model is unbiased and generalizable to the population of interest. During model building, a discovery cohort of patients can be used to choose variables, construct models, and estimate prediction performance via internal validation. Via external validation, an independent dataset can assess how well the model performs. It is imperative that published models properly detail the study design and methods for both model building and validation. This provides readers the information necessary to assess the bias in a study, compare other published models, and determine the model's clinical usefulness. As editors, reviewers, and readers of the relevant literature, we should be cognizant of the needed statistical considerations and insist on their use. PMID:26657835

  9. Robust hypothesis tests for detecting statistical evidence of two-dimensional and three-dimensional interactions in single-molecule measurements

    NASA Astrophysics Data System (ADS)

    Calderon, Christopher P.; Weiss, Lucien E.; Moerner, W. E.

    2014-05-01

    Experimental advances have improved the two- (2D) and three-dimensional (3D) spatial resolution that can be extracted from in vivo single-molecule measurements. This enables researchers to quantitatively infer the magnitude and directionality of forces experienced by biomolecules in their native environment. Situations where such force information is relevant range from mitosis to directed transport of protein cargo along cytoskeletal structures. Models commonly applied to quantify single-molecule dynamics assume that effective forces and velocity in the x, y (or x, y, z) directions are statistically independent, but this assumption is physically unrealistic in many situations. We present a hypothesis testing approach capable of determining if there is evidence of statistical dependence between positional coordinates in experimentally measured trajectories; if the hypothesis of independence between spatial coordinates is rejected, then a new model accounting for 2D (3D) interactions can and should be considered. Our hypothesis testing technique is robust, meaning it can detect interactions, even if the noise statistics are not well captured by the model. The approach is demonstrated on control simulations and on experimental data (directed transport of intraflagellar transport protein 88 homolog in the primary cilium).

  10. Some Tests of Randomness with Applications

    DTIC Science & Technology

    1981-02-01

    freedom. For further details, the reader is referred to Gnanadesikan (1977, p. 169), wherein other relevant tests are also given. Graphical tests, as...sample from a gamma distribution. J. Am. Statist. Assoc. 71, 480-7. Gnanadesikan, R. (1977). Methods for Statistical Data Analysis of Multivariate

  11. Computer-aided auditing of prescription drug claims.

    PubMed

    Iyengar, Vijay S; Hermiz, Keith B; Natarajan, Ramesh

    2014-09-01

    We describe a methodology for identifying and ranking candidate audit targets from a database of prescription drug claims. The relevant audit targets may include various entities such as prescribers, patients and pharmacies, who exhibit certain statistical behavior indicative of potential fraud and abuse over the prescription claims during a specified period of interest. Our overall approach is consistent with related work in statistical methods for detection of fraud and abuse, but has a relative emphasis on three specific aspects: first, based on the assessment of domain experts, certain focus areas are selected and data elements pertinent to the audit analysis in each focus area are identified; second, specialized statistical models are developed to characterize the normalized baseline behavior in each focus area; and third, statistical hypothesis testing is used to identify entities that diverge significantly from their expected behavior according to the relevant baseline model. The application of this overall methodology to a prescription claims database from a large health plan is considered in detail.
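
    A toy version of the divergence-flagging step, with a Poisson baseline standing in for the specialized per-focus-area models described above; all counts below are simulated:

        import numpy as np
        from scipy.stats import poisson

        rng = np.random.default_rng(6)

        # Expected monthly claim counts per prescriber from a (hypothetical)
        # baseline model adjusted for specialty and panel size.
        expected = rng.gamma(shape=5.0, scale=4.0, size=1_000)
        observed = rng.poisson(expected)
        observed[:3] *= 4            # plant a few aberrant prescribers

        # Upper-tail p-value of each observed count under its own baseline;
        # small values (after multiple-testing control) rank audit targets.
        p = poisson.sf(observed - 1, expected)
        ranked = np.argsort(p)
        print("top candidates:", ranked[:5], "p:", np.round(p[ranked[:5]], 6))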

  12. Determinants of Judgments of Explanatory Power: Credibility, Generality, and Statistical Relevance.

    PubMed

    Colombo, Matteo; Bucher, Leandra; Sprenger, Jan

    2017-01-01

    Explanation is a central concept in human psychology. Drawing upon philosophical theories of explanation, psychologists have recently begun to examine the relationship between explanation, probability and causality. Our study advances this growing literature at the intersection of psychology and philosophy of science by systematically investigating how judgments of explanatory power are affected by (i) the prior credibility of an explanatory hypothesis, (ii) the causal framing of the hypothesis, (iii) the perceived generalizability of the explanation, and (iv) the relation of statistical relevance between hypothesis and evidence. Collectively, the results of our five experiments support the hypothesis that the prior credibility of a causal explanation plays a central role in explanatory reasoning: first, because of the presence of strong main effects on judgments of explanatory power, and second, because of the gate-keeping role it has for other factors. Highly credible explanations are not susceptible to causal framing effects, but they are sensitive to the effects of normatively relevant factors: the generalizability of an explanation, and its statistical relevance for the evidence. These results advance current literature in the philosophy and psychology of explanation in three ways. First, they yield a more nuanced understanding of the determinants of judgments of explanatory power, and the interaction between these factors. Second, they show the close relationship between prior beliefs and explanatory power. Third, they elucidate the nature of abductive reasoning.

  13. A Response to White and Gorard: Against Inferential Statistics: How and Why Current Statistics Teaching Gets It Wrong

    ERIC Educational Resources Information Center

    Nicholson, James; Ridgway, Jim

    2017-01-01

    White and Gorard make important and relevant criticisms of some of the methods commonly used in social science research, but go further by criticising the logical basis for inferential statistical tests. This paper comments briefly on matters on which we broadly agree with them, and more fully on matters where we disagree. We agree that too little…

  14. Estimation of gene induction enables a relevance-based ranking of gene sets.

    PubMed

    Bartholomé, Kilian; Kreutz, Clemens; Timmer, Jens

    2009-07-01

    In order to handle and interpret the vast amounts of data produced by microarray experiments, the analysis of sets of genes with a common biological functionality has been shown to be advantageous compared to single gene analyses. Some statistical methods have been proposed to analyse the differential gene expression of gene sets in microarray experiments. However, most of these methods either require threshold values to be chosen for the analysis, or they need some reference set for the determination of significance. We present a method that estimates the number of differentially expressed genes in a gene set without requiring a threshold value for significance of genes. The method is self-contained (i.e., it does not require a reference set for comparison). In contrast to other methods which are focused on significance, our approach emphasizes the relevance of the regulation of gene sets. The presented method measures the degree of regulation of a gene set and is a useful tool to compare the induction of different gene sets and place the results of microarray experiments into the biological context. An R-package is available.
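
    The flavor of a threshold-free estimate can be conveyed with a standard pi0-style calculation (a stand-in illustration, not the authors' estimator): the p-value density above a cut-point lambda is attributed to unregulated genes, and the regulated count follows from the excess below it:

        import numpy as np

        rng = np.random.default_rng(7)

        # Hypothetical gene set: 30 regulated genes (small p-values) and 70
        # unregulated genes (uniform p-values).
        pvals = np.concatenate([rng.beta(0.5, 12.0, size=30), rng.uniform(size=70)])
        m, lam = pvals.size, 0.5

        pi0 = min(np.mean(pvals > lam) / (1.0 - lam), 1.0)   # unregulated fraction
        print(f"estimated regulated genes: {m * (1.0 - pi0):.0f} of {m}")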

  15. Dissociation: Defining the Concept in Criminal Forensic Psychiatry.

    PubMed

    Bourget, Dominique; Gagné, Pierre; Wood, Stephen Floyd

    2017-06-01

    Claims of amnesia and dissociative experiences in association with a violent crime are not uncommon. Research has shown that dissociation is a risk factor for violence and is seen most often in crimes of extreme violence. The subject matter is most relevant to forensic psychiatry. Peritraumatic dissociation, for instance, with or without a history of dissociative disorder, is quite frequently reported by offenders presenting for a forensic psychiatric examination. Dissociation or dissociative amnesia for serious offenses can have legal repercussions stemming from its relevance to the legal constructs of fitness to stand trial, criminal responsibility, and diminished capacity. The complexity in forensic psychiatric assessments often lies in the difficulty of connecting clinical symptomatology reported by violent offenders to a specific condition included in the Diagnostic and Statistical Manual of Mental Disorders (DSM). This article provides a review of diagnostic considerations with regard to dissociation across the DSM nomenclature, with a focus on the main clinical constructs related to dissociation. Forensic implications are discussed, along with some guides for the forensic evaluator of offenders presenting with dissociation. © 2017 American Academy of Psychiatry and the Law.

  16. Event Detection and Sub-state Discovery from Bio-molecular Simulations Using Higher-Order Statistics: Application To Enzyme Adenylate Kinase

    PubMed Central

    Ramanathan, Arvind; Savol, Andrej J.; Agarwal, Pratul K.; Chennubhotla, Chakra S.

    2012-01-01

    Biomolecular simulations at milli-second and longer timescales can provide vital insights into functional mechanisms. Since post-simulation analyses of such large trajectory data-sets can be a limiting factor in obtaining biological insights, there is an emerging need to identify key dynamical events and relate these events to biological function online, that is, as simulations are progressing. Recently, we have introduced a novel computational technique, quasi-anharmonic analysis (QAA) (PLoS One 6(1): e15827), for partitioning the conformational landscape into a hierarchy of functionally relevant sub-states. The unique capabilities of QAA are enabled by exploiting anharmonicity in the form of fourth-order statistics for characterizing atomic fluctuations. In this paper, we extend QAA for analyzing long time-scale simulations online. In particular, we present HOST4MD - a higher-order statistical toolbox for molecular dynamics simulations, which (1) identifies key dynamical events as simulations are in progress, (2) explores potential sub-states and (3) identifies conformational transitions that enable the protein to access those sub-states. We demonstrate HOST4MD on micro-second time-scale simulations of the enzyme adenylate kinase in its apo state. HOST4MD identifies several conformational events in these simulations, revealing how the intrinsic coupling between the three sub-domains (LID, CORE and NMP) changes during the simulations. Further, it also identifies an inherent asymmetry in the opening/closing of the two binding sites. We anticipate HOST4MD will provide a powerful and extensible framework for detecting biophysically relevant conformational coordinates from long time-scale simulations. PMID:22733562
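
    A minimal sketch of using a fourth-order statistic online: sliding-window excess kurtosis over one trajectory coordinate, which stays near zero for harmonic (Gaussian) segments and departs strongly where the coordinate hops between sub-states. The trajectory here is synthetic, not adenylate kinase data:

        import numpy as np
        from scipy.stats import kurtosis

        rng = np.random.default_rng(8)

        # Synthetic coordinate: Gaussian fluctuations with a transient
        # two-state (bimodal) episode between frames 20,000 and 22,000.
        x = rng.normal(size=50_000)
        x[20_000:22_000] += np.where(rng.random(2_000) < 0.5, -3.0, 3.0)

        win, step = 2_000, 500
        flagged = []
        for start in range(0, x.size - win + 1, step):
            k = kurtosis(x[start:start + win])   # Fisher (excess) kurtosis
            if abs(k) > 1.0:                     # flag anharmonic windows
                flagged.append(start + win // 2)
        print("anharmonic windows centered at frames:", flagged)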

  17. Creating Near-Term Climate Scenarios for AgMIP

    NASA Astrophysics Data System (ADS)

    Goddard, L.; Greene, A. M.; Baethgen, W.

    2012-12-01

    For the next assessment report of the IPCC (AR5), attention is being given to the development of climate information that is appropriate for adaptation, such as decadal-scale and near-term predictions intended to capture the combined effects of natural climate variability and the emerging climate change signal. While the science and practice evolve for the production and use of dynamic decadal prediction, information relevant to agricultural decision-makers can be gained from analysis of past decadal-scale trends and variability. Statistical approaches that mimic the characteristics of observed year-to-year variability can indicate the range of possibilities and their likelihood. In this talk we present work towards the development of near-term climate scenarios, which are needed to engage decision-makers and stakeholders in the regions in current decision-making. The work includes analyses of decadal-scale variability and trends in the AgMIP regions, and statistical approaches that capture year-to-year variability and the associated persistence of wet and dry years. We will outline the general methodology and some of the specific considerations in the regional application of the methodology for different AgMIP regions, such as those for Western Africa versus southern Africa. We will also show some examples of quality checks and informational summaries of the generated data, including (1) metrics of information quality such as probabilistic reliability for a suite of relevant climate variables and indices important for agriculture; (2) quality checks relative to the use of this climate data in crop models; and (3) summary statistics (e.g., for 5-10-year periods or across given spatial scales).
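
    The persistence component of such a generator can be sketched as a two-state (wet/dry) Markov chain over annual classes; the transition probability below is an assumption chosen to give runs of about three years:

        import numpy as np

        rng = np.random.default_rng(9)
        p_stay = 0.67          # hypothetical chance a wet/dry year repeats

        def simulate_years(n_years, state="wet"):
            """Annual wet/dry sequence from a two-state Markov chain."""
            seq = []
            for _ in range(n_years):
                seq.append(state)
                if rng.random() > p_stay:
                    state = "dry" if state == "wet" else "wet"
            return seq

        years = simulate_years(200)

        # Check the persistence: mean run length should be ~1/(1 - p_stay) = 3.
        runs, cur = [], 1
        for prev, nxt in zip(years, years[1:]):
            if prev == nxt:
                cur += 1
            else:
                runs.append(cur)
                cur = 1
        runs.append(cur)
        print(f"mean run length = {np.mean(runs):.2f} years")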

  18. Generalized theory of semiflexible polymers.

    PubMed

    Wiggins, Paul A; Nelson, Philip C

    2006-03-01

    DNA bending on length scales shorter than a persistence length plays an integral role in the translation of genetic information from DNA to cellular function. Quantitative experimental studies of these biological systems have led to a renewed interest in the polymer mechanics relevant for describing the conformational free energy of DNA bending induced by protein-DNA complexes. Recent experimental results from DNA cyclization studies have cast doubt on the applicability of the canonical semiflexible polymer theory, the wormlike chain (WLC) model, to DNA bending on biologically relevant length scales. This paper develops a theory of the chain statistics of a class of generalized semiflexible polymer models. Our focus is on the theoretical development of these models and the calculation of experimental observables. To illustrate our methods, we focus on a specific, illustrative model of DNA bending. We show that the WLC model generically describes the long-length-scale chain statistics of semiflexible polymers, as predicted by renormalization group arguments. In particular, we show that either the WLC or our present model adequately describes force-extension, solution scattering, and long-contour-length cyclization experiments, regardless of the details of DNA bend elasticity. In contrast, experiments sensitive to short-length-scale chain behavior can in principle reveal dramatic departures from the linear elastic behavior assumed in the WLC model. We demonstrate this explicitly by showing that our toy model can reproduce the anomalously large short-contour-length cyclization factors recently measured by Cloutier and Widom. Finally, we discuss the applicability of these models to DNA chain statistics in the context of future experiments.

  19. Comparative evaluation of topographical data of dental implant surfaces applying optical interferometry and scanning electron microscopy.

    PubMed

    Kournetas, N; Spintzyk, S; Schweizer, E; Sawada, T; Said, F; Schmid, P; Geis-Gerstorfer, J; Eliades, G; Rupp, F

    2017-08-01

    Comparability of topographical data of implant surfaces in the literature is low and their clinical relevance is often equivocal. The aim of this study was to investigate the ability of scanning electron microscopy and optical interferometry to assess statistically similar 3-dimensional roughness parameter results and to evaluate these data based on predefined criteria regarded as relevant for a favorable biological response. Four different commercial dental screw-type implants (NanoTite Certain Prevail, TiUnite Brånemark Mk III, XiVE S Plus and SLA Standard Plus) were analyzed by stereo scanning electron microscopy and white light interferometry. Surface height, spatial and hybrid roughness parameters (Sa, Sz, Ssk, Sku, Sal, Str, Sdr) were assessed from raw and filtered data (Gaussian 50μm and 5μm cut-off filters), respectively. Data were statistically compared by one-way ANOVA and the Tukey-Kramer post-hoc test. For a clinically relevant interpretation, a categorizing evaluation approach was used based on predefined threshold criteria for each roughness parameter. The two methods exhibited predominantly statistical differences. Dependent on roughness parameters and filter settings, both methods showed variations in rankings of the implant surfaces and differed in their ability to discriminate the different topographies. Overall, the analyses revealed scale-dependent roughness data. Compared to the pure statistical approach, the categorizing evaluation resulted in many more similarities between the two methods. This study suggests reconsidering current approaches to the topographical evaluation of implant surfaces and further pursuing appropriate experimental settings. Furthermore, the specific role of different roughness parameters in the bioresponse has to be studied in detail in order to better define clinically relevant, scale-dependent and parameter-specific thresholds and ranges. Copyright © 2017 The Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

  1. Long-term variability of global statistical properties of epileptic brain networks

    NASA Astrophysics Data System (ADS)

    Kuhnert, Marie-Therese; Elger, Christian E.; Lehnertz, Klaus

    2010-12-01

    We investigate the influence of various pathophysiologic and physiologic processes on global statistical properties of epileptic brain networks. We construct binary functional networks from long-term, multichannel electroencephalographic data recorded from 13 epilepsy patients, and the average shortest path length and the clustering coefficient serve as global statistical network characteristics. For time-resolved estimates of these characteristics we observe large fluctuations over time, however, with some periodic temporal structure. These fluctuations can—to a large extent—be attributed to daily rhythms while relevant aspects of the epileptic process contribute only marginally. Particularly, we could not observe clear cut changes in network states that can be regarded as predictive of an impending seizure. Our findings are of particular relevance for studies aiming at an improved understanding of the epileptic process with graph-theoretical approaches.
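
    The two global characteristics are standard graph statistics; a minimal sketch on a stand-in network (a small-world graph in place of an EEG-derived functional network):

        import networkx as nx

        # Stand-in for one binary functional network from a window of
        # multichannel EEG (nodes = recording sites, edges = strong
        # interdependence between their signals).
        G = nx.connected_watts_strogatz_graph(n=60, k=6, p=0.1, seed=0)

        L = nx.average_shortest_path_length(G)   # average shortest path length
        C = nx.average_clustering(G)             # clustering coefficient
        print(f"L = {L:.2f}, C = {C:.2f}")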

  2. Differentiation of five body fluids from forensic samples by expression analysis of four microRNAs using quantitative PCR.

    PubMed

    Sauer, Eva; Reinke, Ann-Kathrin; Courts, Cornelius

    2016-05-01

    Applying molecular genetic approaches for the identification of forensically relevant body fluids, which often yield crucial information for the reconstruction of a potential crime, is a current topic of forensic research. Due to their body fluid specific expression patterns and stability against degradation, microRNAs (miRNA) emerged as a promising molecular species, with a range of candidate markers published. The analysis of miRNA via quantitative real-time PCR, however, should be based on a sound strategy for normalizing non-biological variance in order to deliver reliable and biologically meaningful results. The work presented here is the most comprehensive study to date of forensic body fluid identification via miRNA expression analysis, based on a thoroughly validated qPCR procedure and unbiased statistical decision making to identify single-source samples. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  3. GetReal in network meta-analysis: a review of the methodology.

    PubMed

    Efthimiou, Orestis; Debray, Thomas P A; van Valkenhoef, Gert; Trelle, Sven; Panayidou, Klea; Moons, Karel G M; Reitsma, Johannes B; Shang, Aijing; Salanti, Georgia

    2016-09-01

    Pairwise meta-analysis is an established statistical tool for synthesizing evidence from multiple trials, but it is informative only about the relative efficacy of two specific interventions. The usefulness of pairwise meta-analysis is thus limited in real-life medical practice, where many competing interventions may be available for a certain condition and studies informing some of the pairwise comparisons may be lacking. This commonly encountered scenario has led to the development of network meta-analysis (NMA). In the last decade, several applications, methodological developments, and empirical studies in NMA have been published, and the area is thriving as its relevance to public health is increasingly recognized. This article presents a review of the relevant literature on NMA methodology aiming to pinpoint the developments that have appeared in the field. Copyright © 2016 John Wiley & Sons, Ltd.

  4. Statistical learning algorithms for identifying contrasting tillage practices with landsat thematic mapper data

    USDA-ARS?s Scientific Manuscript database

    Tillage management practices have direct impact on water holding capacity, evaporation, carbon sequestration, and water quality. This study examines the feasibility of two statistical learning algorithms, such as Least Square Support Vector Machine (LSSVM) and Relevance Vector Machine (RVM), for cla...

  5. 50 CFR 600.133 - Scientific and Statistical Committee (SSC).

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... information as is relevant to such Council's development and amendment of any fishery management plan. (b...). 600.133 Section 600.133 Wildlife and Fisheries FISHERY CONSERVATION AND MANAGEMENT, NATIONAL OCEANIC... Fishery Management Councils § 600.133 Scientific and Statistical Committee (SSC). (a) Each Council shall...

  6. 50 CFR 600.133 - Scientific and Statistical Committee (SSC).

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... information as is relevant to such Council's development and amendment of any fishery management plan. (b...). 600.133 Section 600.133 Wildlife and Fisheries FISHERY CONSERVATION AND MANAGEMENT, NATIONAL OCEANIC... Fishery Management Councils § 600.133 Scientific and Statistical Committee (SSC). (a) Each Council shall...

  7. Clarifying changes in student empathy throughout medical school: a scoping review.

    PubMed

    Ferreira-Valente, Alexandra; Monteiro, Joana S; Barbosa, Rita M; Salgueira, Ana; Costa, Patrício; Costa, Manuel J

    2017-12-01

    Despite the increasing awareness of the relevance of empathy in patient care, some findings suggest that medical schools may be contributing to the deterioration of students' empathy. Therefore, it is important to clarify the magnitude and direction of changes in empathy during medical school. We employed a scoping review to elucidate trends in students' empathy changes/differences throughout medical school and examine potential bias associated with research design. The literature published in English, Spanish, Portuguese and French from 2009 to 2016 was searched. Two-hundred and nine potentially relevant citations were identified. Twenty articles met the inclusion criteria. Effect sizes of empathy scores variations were calculated to assess the practical significance of results. Our results demonstrate that scoped studies differed considerably in their design, measures used, sample sizes and results. Most studies (12 out of 20 studies) reported either positive or non-statistically significant changes/differences in empathy regardless of the measure used. The predominant trend in cross-sectional studies (ten out of 13 studies) was of significantly higher empathy scores in later years or of similar empathy scores across years, while most longitudinal studies presented either mixed-results or empathy declines. There was not a generalized international trend in changes in students' empathy throughout medical school. Although statistically significant changes/differences were detected in 13 out of 20 studies, the calculated effect sizes were small in all but two studies, suggesting little practical significance. At the present moment, the literature does not offer clear conclusions relative to changes in student empathy throughout medical school.

  8. Australian oral health case notes: assessment of forensic relevance and adherence to recording guidelines.

    PubMed

    Stow, L; James, H; Richards, L

    2016-06-01

    Dental case notes record clinical diagnoses and treatments, as well as providing continuity of patient care. They are also used for dento-legal litigation and forensic purposes. Maintaining accurate and comprehensive dental patient records is a dental worker's ethical and legal obligation. Australian registered specialist forensic odontologists were surveyed to determine the relevance of recorded case note items for dental identification. A dental case notes sample was assessed for adherence with odontologist-nominated forensic value and compiled professional record keeping guidelines of forensic relevance. Frequency of item recording, confidence interval, examiner agreement and statistical significance were determined. Broad agreement existed between forensic odontologists as to which recorded dental items have most forensic relevance. Inclusion frequency of these items in sampled case notes varied widely (e.g. single area radiographic view present in 75%, CI = 65.65-82.50; completed odontogram in 56%, CI = 46.23-65.33). Recording of information specified by professional record keeping guidelines also varied, although overall inclusion was higher than for forensically desired items (e.g. patient's full name in 99%, CI = 94.01 to >99.99; named treating practitioner in 23%, CI = 15.78-32.31). Many sampled dental case notes lacked details identified as being valuable by forensic specialists and as specified by professional record keeping guidelines. © 2016 Australian Dental Association.

  9. Successful classification of cocaine dependence using brain imaging: a generalizable machine learning approach.

    PubMed

    Mete, Mutlu; Sakoglu, Unal; Spence, Jeffrey S; Devous, Michael D; Harris, Thomas S; Adinoff, Bryon

    2016-10-06

    Neuroimaging studies have yielded significant advances in the understanding of neural processes relevant to the development and persistence of addiction. However, these advances have not been explored extensively for diagnostic accuracy in human subjects. The aim of this study was to develop a statistical approach, using a machine learning framework, to correctly classify brain images of cocaine-dependent participants and healthy controls. In this study, a framework suitable for educing potential brain regions that differed between the two groups was developed and implemented. Single Photon Emission Computerized Tomography (SPECT) images obtained during rest or a saline infusion in three cohorts of 2-4 week abstinent cocaine-dependent participants (n = 93) and healthy controls (n = 69) were used to develop a classification model. An information theoretic-based feature selection algorithm was first conducted to reduce the number of voxels. A density-based clustering algorithm was then used to form spatially connected voxel clouds in three-dimensional space. A statistical classifier, the Support Vector Machine (SVM), was then used for participant classification. Statistically insignificant voxels of spatially connected brain regions were removed iteratively and classification accuracy was reported through the iterations. The voxel-based analysis identified 1,500 spatially connected voxels in 30 distinct clusters after a grid search in SVM parameters. Participants were successfully classified with 0.88 and 0.89 F-measure accuracies in 10-fold cross validation (10xCV) and leave-one-out (LOO) approaches, respectively. Sensitivity and specificity were 0.90 and 0.89 for LOO; 0.83 and 0.83 for 10xCV. Many of the 30 selected clusters are highly relevant to the addictive process, including regions relevant to cognitive control, default mode network related self-referential thought, behavioral inhibition, and contextual memories. Relative hyperactivity and hypoactivity of regional cerebral blood flow in brain regions in cocaine-dependent participants are presented with corresponding levels of significance. The SVM-based approach successfully classified cocaine-dependent and healthy control participants using voxels selected with information theoretic-based and statistical methods from participants' SPECT data. The regions found in this study align with brain regions reported in the literature. These findings support the future use of brain imaging and SVM-based classifiers in the diagnosis of substance use disorders and furthering an understanding of their underlying pathology.
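
    The classification loop can be sketched with scikit-learn; the data below are synthetic stand-ins for selected SPECT voxel features, and in a real analysis the voxel selection itself must be nested inside the cross-validation to avoid optimistic bias:

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.model_selection import LeaveOneOut, cross_val_predict
        from sklearn.metrics import f1_score

        rng = np.random.default_rng(10)

        # 162 participants (93 patients, 69 controls) x 1500 voxel features,
        # with a weak group difference planted in a subset of features.
        X = rng.normal(size=(162, 1500))
        y = np.array([1] * 93 + [0] * 69)
        X[y == 1, :100] += 0.35

        # Leave-one-out cross-validated predictions from a linear SVM.
        pred = cross_val_predict(SVC(kernel="linear"), X, y, cv=LeaveOneOut())
        print(f"LOO F-measure = {f1_score(y, pred):.2f}")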

  10. Determining Primary Care Physician Information Needs to Inform Ambulatory Visit Note Display

    PubMed Central

    Clarke, M.A.; Steege, L.M.; Moore, J.L.; Koopman, R.J.; Belden, J.L.; Kim, M.S.

    2014-01-01

    Background With the increase in the adoption of electronic health records (EHR) across the US, primary care physicians are experiencing information overload. The purpose of this pilot study was to determine the information needs of primary care physicians (PCPs) as they review clinic visit notes, to inform EHR display. Method Data collection was conducted with 15 primary care physicians during semi-structured interviews, including a third-party observer to control bias. Physicians reviewed major sections of an artificial but typical acute and chronic care visit note to identify the note sections that were relevant to their information needs. The statistical methods used were McNemar-Mosteller's test and Cochran's Q. Results Physicians identified History of Present Illness (HPI), Assessment, and Plan (A&P) as the most important sections of a visit note. In contrast, they largely judged the Review of Systems (ROS) to be superfluous. There was also a statistical difference in physicians' highlighting among all seven major note sections in acute (p = 0.00) and chronic (p = 0.00) care visit notes. Conclusion A&P and HPI sections were most frequently identified as important, which suggests that physicians may have to identify a few key sections out of a long, unnecessarily verbose visit note. ROS is viewed by doctors as mostly "not needed," but it can contain relevant information. The ROS can contain information needed for patient care when other sections of the visit note, such as the HPI, lack the relevant information. Future studies should include producing a display that provides only relevant information, to increase physician efficiency at the point of care. Research is also needed on moving A&P to the top of visit notes instead of having it at the bottom of the page, since those are usually the first sections physicians refer to, and reviewing from top to bottom adds cognitive load. PMID:24734131
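
    Both tests named in the Method section are available in statsmodels; a minimal sketch on an invented 0/1 matrix of section-highlighting decisions (the real study data are not reproduced here):

      # Minimal sketch of the two tests named above (McNemar, Cochran's Q),
      # using statsmodels. The 0/1 matrix below is invented for illustration:
      # rows = physicians, columns = note sections marked as relevant.
      import numpy as np
      from statsmodels.stats.contingency_tables import cochrans_q, mcnemar

      rng = np.random.default_rng(1)
      marks = rng.integers(0, 2, size=(15, 7))   # 15 physicians x 7 note sections

      # Cochran's Q: do highlighting rates differ across all seven sections?
      print(cochrans_q(marks))

      # McNemar: paired comparison of two sections, e.g. HPI vs. ROS.
      table = np.zeros((2, 2), dtype=int)
      for a, b in zip(marks[:, 0], marks[:, 1]):
          table[a, b] += 1
      print(mcnemar(table, exact=True))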

  11. Learning predictive statistics from temporal sequences: Dynamics and strategies.

    PubMed

    Wang, Rui; Shen, Yuan; Tino, Peter; Welchman, Andrew E; Kourtzi, Zoe

    2017-10-01

    Human behavior is guided by our expectations about the future. Often, we make predictions by monitoring how event sequences unfold, even though such sequences may appear incomprehensible. Event structures in the natural environment typically vary in complexity, from simple repetition to complex probabilistic combinations. How do we learn these structures? Here we investigate the dynamics of structure learning by tracking human responses to temporal sequences that change in structure unbeknownst to the participants. Participants were asked to predict the upcoming item following a probabilistic sequence of symbols. Using a Markov process, we created a family of sequences, from simple frequency statistics (e.g., some symbols are more probable than others) to context-based statistics (e.g., symbol probability is contingent on preceding symbols). We demonstrate the dynamics with which individuals adapt to changes in the environment's statistics; that is, they extract the behaviorally relevant structures to make predictions about upcoming events. Further, we show that this structure learning relates to individual decision strategy; faster learning of complex structures relates to selection of the most probable outcome in a given context (maximizing) rather than matching of the exact sequence statistics. Our findings provide evidence for alternate routes to learning of behaviorally relevant statistics that facilitate our ability to predict future events in variable environments.
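
    A small sketch of the stimulus structure described above: a first-order Markov chain over symbols, with the two response strategies (maximizing vs. probability matching) contrasted; the transition matrix is invented:

      # Sketch of context-based (first-order Markov) sequence generation and
      # the two response strategies contrasted above. Transition values are
      # invented for illustration.
      import numpy as np

      rng = np.random.default_rng(2)
      symbols = np.array(["A", "B", "C"])
      T = np.array([[0.1, 0.8, 0.1],    # row = current symbol, column = next
                    [0.6, 0.1, 0.3],
                    [0.3, 0.3, 0.4]])

      seq = [0]
      for _ in range(999):
          seq.append(rng.choice(3, p=T[seq[-1]]))

      context = seq[-1]
      # Maximizing: always predict the most probable successor of the context.
      pred_max = T[context].argmax()
      # Probability matching: sample predictions in proportion to the conditionals.
      pred_match = rng.choice(3, p=T[context])
      print(symbols[pred_max], symbols[pred_match])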

  12. Natural variability of biochemical biomarkers in the macro-zoobenthos: Dependence on life stage and environmental factors.

    PubMed

    Scarduelli, Lucia; Giacchini, Roberto; Parenti, Paolo; Migliorati, Sonia; Di Brisco, Agnese Maria; Vighi, Marco

    2017-11-01

    Biomarkers are widely used in ecotoxicology as indicators of exposure to toxicants. However, their ability to provide ecologically relevant information remains controversial. One of the major problems is understanding whether the measured responses are determined by stress factors or lie within the natural variability range. In a previous work, the natural variability of enzymatic levels in invertebrates sampled in pristine rivers was shown to be relevant across both space and time. In the present study, the experimental design was improved by considering different life stages of the selected taxa and by measuring more environmental parameters. The experimental design considered sampling sites in 2 different rivers, 8 sampling dates covering the whole seasonal cycle, 4 species from 3 different taxonomic groups (Plecoptera, Perla grandis; Ephemeroptera, Baetis alpinus and Epeorus alpicula; Trichoptera, Hydropsyche pellucidula), different life stages for each species, and 4 enzymes (acetylcholinesterase, glutathione S-transferase, alkaline phosphatase, and catalase). Biomarker levels were related to environmental (physicochemical) parameters to verify any kind of dependence. Data were statistically analyzed using hierarchical multilevel Bayesian models. Natural variability was found to be relevant across both space and time. The results of the present study showed that care should be taken when interpreting biomarker results. Further research is needed to better understand the dependence of the natural variability on environmental parameters. Environ Toxicol Chem 2017;36:3158-3167. © 2017 SETAC.

  13. Library Statistical Data Base Formats and Definitions.

    ERIC Educational Resources Information Center

    Jones, Dennis; And Others

    Presented is a detailed set of data structures relevant to the categorization of information, along with the terminology and definitions employed in the design of the library statistical data base. The data base, or management information system, provides administrators with a framework of information and standardized data for library management, planning,…

  14. Understanding Broadscale Wildfire Risks in a Human-Dominated Landscape

    Treesearch

    Jeffrey P. Prestemon; John M. Pye; David T. Butry; Thomas P. Holmes; D. Evan Mercer

    2002-01-01

    Broadscale statistical evaluations of wildfire incidence can answer policy relevant questions about the effectiveness of microlevel vegetation management and can identify subjects needing further study. A dynamic time series cross-sectional model was used to evaluate the statistical links between forest wildfire and vegetation management, human land use, and climatic...

  15. A Bifactor Approach to Model Multifaceted Constructs in Statistical Mediation Analysis

    ERIC Educational Resources Information Center

    Gonzalez, Oscar; MacKinnon, David P.

    2018-01-01

    Statistical mediation analysis allows researchers to identify the most important mediating constructs in the causal process studied. Identifying specific mediators is especially relevant when the hypothesized mediating construct consists of multiple related facets. The general definition of the construct and its facets might relate differently to…

  16. A Statistical Decision Model for Periodical Selection for a Specialized Information Center

    ERIC Educational Resources Information Center

    Dym, Eleanor D.; Shirey, Donald L.

    1973-01-01

    An experiment is described which attempts to define a quantitative methodology for the identification and evaluation of all possibly relevant periodical titles containing toxicological-biological information. A statistical decision model was designed and employed, along with yes/no criteria questions, a training technique and a quality control…

  17. Introductory Statistics and Fish Management.

    ERIC Educational Resources Information Center

    Jardine, Dick

    2002-01-01

    Describes how fisheries research and management data (available on a website) have been incorporated into an Introductory Statistics course. In addition to the motivation gained from seeing the practical relevance of the course, some students have participated in the data collection and analysis for the New Hampshire Fish and Game Department. (MM)

  18. The Importance of Introductory Statistics Students Understanding Appropriate Sampling Techniques

    ERIC Educational Resources Information Center

    Menil, Violeta C.

    2005-01-01

    In this paper the author discusses the meaning of sampling, the reasons for sampling, the Central Limit Theorem, and the different techniques of sampling. Practical and relevant examples are given to make the appropriate sampling techniques understandable to students of Introductory Statistics courses. With a thorough knowledge of sampling…

  19. Teaching Primary School Mathematics and Statistics: Evidence-Based Practice

    ERIC Educational Resources Information Center

    Averill, Robin; Harvey, Roger

    2010-01-01

    Here is the only reference book you will ever need for teaching primary school mathematics and statistics. It is full of exciting and engaging snapshots of excellent classroom practice relevant to "The New Zealand Curriculum" and national mathematics standards. There are many fascinating examples of investigative learning experiences,…

  20. Forecast in foreign exchange markets

    NASA Astrophysics Data System (ADS)

    Baviera, R.; Pasquini, M.; Serva, M.; Vergni, D.; Vulpiani, A.

    2001-04-01

    We perform a statistical study of weak efficiency in Deutschemark/US dollar exchange rates using high frequency data. The presence of correlations in the returns sequence implies the possibility of a statistical forecast of market behavior. We show the existence of correlations and how information theory can be relevant in this context.
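
    The weak-efficiency test described amounts to asking whether return autocorrelations exceed what an uncorrelated sequence would show; a minimal sketch on synthetic returns standing in for the high-frequency DEM/USD series:

      # Minimal sketch of a weak-efficiency check: lagged autocorrelation of
      # returns. Synthetic i.i.d. data stand in for high-frequency DEM/USD
      # returns, so no memory should be detected here.
      import numpy as np

      rng = np.random.default_rng(3)
      returns = rng.normal(0, 1e-4, size=10_000)

      def autocorr(x, lag):
          x = x - x.mean()
          return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

      for lag in (1, 2, 5, 10):
          # Under the i.i.d. null, autocorrelations are roughly N(0, 1/n),
          # so values beyond ~2/sqrt(n) hint at exploitable correlations.
          print(lag, autocorr(returns, lag), 2 / np.sqrt(returns.size))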

  1. Establishment and Assessment of Plasma Disruption and Warning Databases from EAST

    NASA Astrophysics Data System (ADS)

    Wang, Bo; Robert, Granetz; Xiao, Bingjia; Li, Jiangang; Yang, Fei; Li, Junjun; Chen, Dalong

    2016-12-01

    A disruption database and a disruption warning database for the EAST tokamak have been established by a disruption research group. The disruption database, based on Structured Query Language (SQL), comprises 41 disruption parameters, which include current quench characteristics, EFIT equilibrium characteristics, kinetic parameters, halo currents, and vertical motion. Presently most disruption databases are based on plasma experiments at non-superconducting tokamak devices. The purposes of the EAST database are to find disruption characteristics and disruption statistics for the fully superconducting tokamak EAST, to elucidate the physics underlying tokamak disruptions, to explore the influence of disruption on superconducting magnets and to extrapolate toward future burning plasma devices. In order to quantitatively assess the usefulness of various plasma parameters for predicting disruptions, an SQL database similar to that of Alcator C-Mod has been created for EAST by compiling values for a number of proposed disruption-relevant parameters sampled from all plasma discharges in the 2015 campaign. The detailed statistical results and analysis of the two databases on the EAST tokamak are presented. Supported by the National Magnetic Confinement Fusion Science Program of China (No. 2014GB103000).
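
    The EAST schema itself is not given in this abstract, so the sketch below is hypothetical: a small disruption table with a handful of invented columns standing in for the 41 parameters, via Python's sqlite3:

      # Hypothetical sketch of a disruption-parameter table like the one
      # described above. Column names are invented; the real EAST schema
      # holds 41 parameters per disruption.
      import sqlite3

      con = sqlite3.connect(":memory:")
      con.execute("""
          CREATE TABLE disruptions (
              shot            INTEGER PRIMARY KEY,  -- discharge number
              t_disrupt_s     REAL,                 -- disruption time
              ip_quench_ms    REAL,                 -- current quench duration
              halo_current_ka REAL,                 -- peak halo current
              z_velocity_m_s  REAL                  -- vertical motion
          )
      """)
      con.execute("INSERT INTO disruptions VALUES (54321, 3.2, 4.5, 120.0, 1.8)")
      for row in con.execute("SELECT * FROM disruptions WHERE halo_current_ka > 100"):
          print(row)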

  2. Protecting High Energy Barriers: A New Equation to Regulate Boost Energy in Accelerated Molecular Dynamics Simulations.

    PubMed

    Sinko, William; de Oliveira, César Augusto F; Pierce, Levi C T; McCammon, J Andrew

    2012-01-10

    Molecular dynamics (MD) is one of the most common tools in computational chemistry. Recently, our group has employed accelerated molecular dynamics (aMD) to improve the conformational sampling over conventional molecular dynamics techniques. In the original aMD implementation, sampling is greatly improved by raising energy wells below a predefined energy level. Recently, our group presented an alternative aMD implementation where simulations are accelerated by lowering energy barriers of the potential energy surface. When coupled with thermodynamic integration simulations, this implementation showed very promising results. However, when applied to large systems, such as proteins, the simulation tends to be biased to high energy regions of the potential landscape. The reason for this behavior lies in the boost equation used since the highest energy barriers are dramatically more affected than the lower ones. To address this issue, in this work, we present a new boost equation that prevents oversampling of unfavorable high energy conformational states. The new boost potential provides not only better recovery of statistics throughout the simulation but also enhanced sampling of statistically relevant regions in explicit solvent MD simulations.
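
    For context, the original aMD boost of Hamelberg et al., which the work above modifies, raises the potential V(r) toward a threshold E wherever it dips below that level; in LaTeX form:

      \Delta V(\mathbf{r}) =
      \begin{cases}
        0, & V(\mathbf{r}) \ge E, \\
        \dfrac{\left(E - V(\mathbf{r})\right)^{2}}{\alpha + E - V(\mathbf{r})}, & V(\mathbf{r}) < E,
      \end{cases}

    where the tuning parameter alpha sets how aggressively wells are flattened. Because the quadratic numerator grows with the depth below E, the deepest regions receive the largest boost; the new equation proposed in the paper reshapes this behavior so that the highest barriers are no longer disproportionately affected (its exact expression is given in the article).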

  3. Dynamic scaling in natural swarms

    NASA Astrophysics Data System (ADS)

    Cavagna, Andrea; Conti, Daniele; Creato, Chiara; Del Castello, Lorenzo; Giardina, Irene; Grigera, Tomas S.; Melillo, Stefania; Parisi, Leonardo; Viale, Massimiliano

    2017-09-01

    Collective behaviour in biological systems presents theoretical challenges beyond the borders of classical statistical physics. The lack of concepts such as scaling and renormalization is particularly problematic, as it forces us to negotiate details whose relevance is often hard to assess. In an attempt to improve this situation, we present here experimental evidence of the emergence of dynamic scaling laws in natural swarms of midges. We find that spatio-temporal correlation functions in different swarms can be rescaled by using a single characteristic time, which grows with the correlation length with a dynamical critical exponent z ~ 1, a value not found in any other standard statistical model. To check whether out-of-equilibrium effects may be responsible for this anomalous exponent, we run simulations of the simplest model of self-propelled particles and find z ~ 2, suggesting that natural swarms belong to a novel dynamic universality class. This conclusion is strengthened by experimental evidence of the presence of non-dissipative modes in the relaxation, indicating that previously overlooked inertial effects are needed to describe swarm dynamics. The absence of a purely dissipative regime suggests that natural swarms undergo a near-critical censorship of hydrodynamics.
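
    In standard dynamic-scaling notation, the claim is that space-time correlation functions collapse once time is measured in units of a single relaxation time tied to the correlation length:

      C(r, t) = \mathcal{F}\!\left( \frac{r}{\xi}, \frac{t}{\tau_{\xi}} \right), \qquad \tau_{\xi} \sim \xi^{z},

    so the reported z ~ 1 means the characteristic time grows only linearly with the correlation length, whereas the simulated self-propelled particles give the more conventional z ~ 2.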

  4. A unifying perspective on personality pathology across the life span: Developmental considerations for the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders

    PubMed Central

    TACKETT, JENNIFER L.; BALSIS, STEVE; OLTMANNS, THOMAS F.; KRUEGER, ROBERT F.

    2010-01-01

    Proposed changes in the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-V) include replacing current personality disorder (PD) categories on Axis II with a taxonomy of dimensional maladaptive personality traits. Most of the work on dimensional models of personality pathology, and on personality disorders per se, has been conducted on young and middle-aged adult populations. Numerous questions remain regarding the applicability and limitations of applying various PD models to early and later life. In the present paper, we provide an overview of such dimensional models and review current proposals for conceptualizing PDs in DSM-V. Next, we extensively review existing evidence on the development, measurement, and manifestation of personality pathology in early and later life focusing on those issues deemed most relevant for informing DSM-V. Finally, we present overall conclusions regarding the need to incorporate developmental issues in conceptualizing PDs in DSM-V and highlight the advantages of a dimensional model in unifying PD perspectives across the life span. PMID:19583880

  5. Innovations in curriculum design: A multi-disciplinary approach to teaching statistics to undergraduate medical students

    PubMed Central

    Freeman, Jenny V; Collier, Steve; Staniforth, David; Smith, Kevin J

    2008-01-01

    Background Statistics is relevant to students and practitioners in medicine and health sciences and is increasingly taught as part of the medical curriculum. However, it is common for students to dislike and under-perform in statistics. We sought to address these issues by redesigning the way that statistics is taught. Methods The project brought together a statistician, clinician and educational experts to re-conceptualize the syllabus, and focused on developing different methods of delivery. New teaching materials, including videos, animations and contextualized workbooks were designed and produced, placing greater emphasis on applying statistics and interpreting data. Results Two cohorts of students were evaluated, one with old style and one with new style teaching. Both were similar with respect to age, gender and previous level of statistics. Students who were taught using the new approach could better define the key concepts of p-value and confidence interval (p < 0.001 for both). They were more likely to regard statistics as integral to medical practice (p = 0.03), and to expect to use it in their medical career (p = 0.003). There was no significant difference in the numbers who thought that statistics was essential to understand the literature (p = 0.28) and those who felt comfortable with the basics of statistics (p = 0.06). More than half the students in both cohorts felt that they were comfortable with the basics of medical statistics. Conclusion Using a variety of media, and placing emphasis on interpretation can help make teaching, learning and understanding of statistics more people-centred and relevant, resulting in better outcomes for students. PMID:18452599

  6. [Clinical research IV. Relevancy of the statistical test chosen].

    PubMed

    Talavera, Juan O; Rivas-Ruiz, Rodolfo

    2011-01-01

    When we look at the difference between two therapies or the association of a risk factor or prognostic indicator with its outcome, we need to evaluate the accuracy of the result. This assessment is based on a judgment that uses information about the study design and statistical management of the information. This paper specifically addresses the relevance of the statistical test selected. Statistical tests are chosen mainly on two characteristics: the objective of the study and the type of variables. The objective can be divided into three test groups: a) those in which you want to show differences between groups or within a group before and after a maneuver, b) those that seek to show the relationship (correlation) between variables, and c) those that aim to predict an outcome. The types of variables are divided into two: quantitative (continuous and discontinuous) and qualitative (ordinal and dichotomous). For example, if we seek to demonstrate differences in age (quantitative variable) among patients with systemic lupus erythematosus (SLE) with and without neurological disease (two groups), the appropriate test is the "Student t test for independent samples." But if the comparison is about the frequency of females (binomial variable), then the appropriate statistical test is the χ² test.
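
    A minimal sketch of the two example choices in the text, with invented data: Student's t test for a quantitative variable between two groups, and the χ² test for a binomial one:

      # Sketch of the two example choices above: Student's t test for age
      # between two groups, chi-squared for sex frequency. Data are invented.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(4)
      age_neuro = rng.normal(38, 9, size=40)      # SLE with neurological disease
      age_no_neuro = rng.normal(34, 9, size=55)   # SLE without

      t, p_t = stats.ttest_ind(age_neuro, age_no_neuro)  # independent samples

      # 2x2 table: rows = group, columns = female / male counts (invented).
      table = np.array([[28, 12],
                        [40, 15]])
      chi2, p_chi2, dof, _ = stats.chi2_contingency(table)
      print(p_t, p_chi2)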

  7. Using Algal Metrics and Biomass to Evaluate Multiple Ways of Defining Concentration-Based Nutrient Criteria in Streams and their Ecological Relevance

    EPA Science Inventory

    We examined the utility of nutrient criteria derived solely from total phosphorus (TP) concentrations in streams (regression models and percentile distributions) and evaluated their ecological relevance to diatom and algal biomass responses. We used a variety of statistics to cha...

  8. Biometrics in the Medical School Curriculum: Making the Necessary Relevant.

    ERIC Educational Resources Information Center

    Murphy, James R.

    1980-01-01

    Because a student is more likely to learn and retain course content perceived as relevant, an attempt was made to change medical students' perceptions of a biometrics course by introducing statistical methods as a means of solving problems in the interpretation of clinical lab data. Retrospective analysis of student course evaluations indicates a…

  9. On the limits of statistical learning: Intertrial contextual cueing is confined to temporally close contingencies.

    PubMed

    Thomas, Cyril; Didierjean, André; Maquestiaux, François; Goujon, Annabelle

    2018-04-12

    Since the seminal study by Chun and Jiang (Cognitive Psychology, 36, 28-71, 1998), a large body of research based on the contextual-cueing paradigm has shown that the cognitive system is capable of extracting statistical contingencies from visual environments. Most of these studies have focused on how individuals learn regularities found within an intratrial temporal window: A context predicts the target position within a given trial. However, Ono, Jiang, and Kawahara (Journal of Experimental Psychology, 31, 703-712, 2005) provided evidence of an intertrial implicit-learning effect when a distractor configuration in preceding trials N - 1 predicted the target location in trials N. The aim of the present study was to gain further insight into this effect by examining whether it occurs when predictive relationships are impeded by interfering task-relevant noise (Experiments 2 and 3) or by a long delay (Experiments 1, 4, and 5). Our results replicated the intertrial contextual-cueing effect, which occurred in the condition of temporally close contingencies. However, there was no evidence of integration across long-range spatiotemporal contingencies, suggesting a temporal limitation of statistical learning.

  10. A Complementary Note to 'A Lag-1 Smoother Approach to System-Error Estimation': The Intrinsic Limitations of Residual Diagnostics

    NASA Technical Reports Server (NTRS)

    Todling, Ricardo

    2015-01-01

    Recently, this author studied an approach to the estimation of system error based on combining observation residuals derived from a sequential filter and a fixed lag-1 smoother. While extending the methodology to a variational formulation, experimenting with simple models, and verifying consistency between the sequential and variational formulations, the limitations of the residual-based approach came clearly to the surface. This note uses the sequential assimilation application to simple nonlinear dynamics to highlight the issue. Only when some of the underlying error statistics are assumed known is it possible to estimate the unknown component. In general, when considerable uncertainties exist in the underlying statistics as a whole, attempts to obtain separate estimates of the various error covariances are bound to lead to misrepresentation of errors. The conclusions are particularly relevant to present-day attempts to estimate observation-error correlations from observation residual statistics. A brief illustration of the issue is also provided by comparing estimates of error correlations derived from a quasi-operational assimilation system and a corresponding Observing System Simulation Experiments framework.
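
    The identifiability problem the note highlights can be stated in one line. For a filter with linear observation operator H, forecast error covariance P, and observation error covariance R, the innovation (observation-minus-forecast residual) d satisfies the standard identity

      \mathbb{E}\left[ \mathbf{d}\, \mathbf{d}^{\mathsf{T}} \right] = \mathbf{H} \mathbf{P} \mathbf{H}^{\mathsf{T}} + \mathbf{R},

    so a single residual covariance constrains only the sum of the two terms: one of them must be assumed known before the other can be estimated, exactly as the abstract argues for system error and observation-error correlations.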

  11. Bayesian hypothesis testing for human threat conditioning research: an introduction and the condir R package

    PubMed Central

    Krypotos, Angelos-Miltiadis; Klugkist, Irene; Engelhard, Iris M.

    2017-01-01

    Threat conditioning procedures have allowed the experimental investigation of the pathogenesis of Post-Traumatic Stress Disorder. The findings of these procedures have also provided stable foundations for the development of relevant intervention programs (e.g. exposure therapy). Statistical inference of threat conditioning procedures is commonly based on p-values and Null Hypothesis Significance Testing (NHST). Nowadays, however, there is a growing concern about this statistical approach, as many scientists point to the various limitations of p-values and NHST. As an alternative, the use of Bayes factors and Bayesian hypothesis testing has been suggested. In this article, we apply this statistical approach to threat conditioning data. In order to enable the easy computation of Bayes factors for threat conditioning data we present a new R package named condir, which can be used either via the R console or via a Shiny application. This article provides both a non-technical introduction to Bayesian analysis for researchers using the threat conditioning paradigm, and the necessary tools for computing Bayes factors easily. PMID:29038683
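
    For readers new to the approach, the Bayes factor that replaces the p-value is the ratio of marginal likelihoods of the data D under the two hypotheses:

      \mathrm{BF}_{10} = \frac{p(D \mid H_{1})}{p(D \mid H_{0})} = \frac{\int p(D \mid \theta_{1}, H_{1})\, p(\theta_{1} \mid H_{1})\, d\theta_{1}}{\int p(D \mid \theta_{0}, H_{0})\, p(\theta_{0} \mid H_{0})\, d\theta_{0}},

    so BF10 = 5 reads as "the data are five times more likely under H1 than under H0", and evidence for the null can be quantified as well; per the abstract, the condir package automates such computations for standard threat-conditioning contrasts.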

  12. Bootstrapping under constraint for the assessment of group behavior in human contact networks

    NASA Astrophysics Data System (ADS)

    Tremblay, Nicolas; Barrat, Alain; Forest, Cary; Nornberg, Mark; Pinton, Jean-François; Borgnat, Pierre

    2013-11-01

    The increasing availability of time- and space-resolved data describing human activities and interactions gives insights into both static and dynamic properties of human behavior. In practice, nevertheless, real-world data sets can often be considered as only one realization of a particular event. This highlights a key issue in social network analysis: the statistical significance of estimated properties. In this context, we focus here on the assessment of quantitative features of specific subsets of nodes in empirical networks. We present a method of statistical resampling based on bootstrapping groups of nodes under constraints within the empirical network. The method enables us to define acceptance intervals for various null hypotheses concerning relevant properties of the subset of nodes under consideration, in order to characterize by a statistical test its behavior as “normal” or not. We apply this method to a high-resolution data set describing the face-to-face proximity of individuals during two colocated scientific conferences. As a case study, we show how to probe whether colocating the two conferences succeeded in bringing together the two corresponding groups of scientists.
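
    A minimal sketch of the resampling idea, assuming a networkx graph: compare a group's internal edge density with an acceptance interval built from randomly resampled groups of equal size (the paper's constraints can be stricter, e.g. degree matching):

      # Sketch of bootstrapping node groups: the observed group's internal
      # edge density is compared with groups of equal size resampled at
      # random. A synthetic random graph stands in for the contact network.
      import numpy as np
      import networkx as nx

      rng = np.random.default_rng(5)
      G = nx.erdos_renyi_graph(200, 0.05, seed=5)
      group = list(range(30))                       # the subset under test

      def internal_density(G, nodes):
          return nx.density(G.subgraph(nodes))

      obs = internal_density(G, group)
      null = [internal_density(G, rng.choice(G.number_of_nodes(), size=30, replace=False))
              for _ in range(2000)]
      lo, hi = np.percentile(null, [2.5, 97.5])     # acceptance interval
      print(obs, (lo, hi), "normal" if lo <= obs <= hi else "atypical")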

  13. Deep data mining in a real space: Separation of intertwined electronic responses in a lightly doped BaFe₂As₂

    DOE PAGES

    Ziatdinov, Maxim; Maksov, Artem; Li, Li; ...

    2016-10-25

    Electronic interactions present in material compositions close to the superconducting dome play a key role in the manifestation of high-Tc superconductivity. In many correlated electron systems, however, the parent or underdoped states exhibit a strongly inhomogeneous electronic landscape at the nanoscale that may be associated with competing, coexisting, or intertwined chemical disorder, strain, magnetic, and structural order parameters. Here we demonstrate an approach based on a combination of scanning tunneling microscopy/spectroscopy and advanced statistical learning for an automatic separation and extraction of statistically significant electronic behaviors in the spin density wave regime of a lightly (~1%) gold-doped BaFe₂As₂. Lastly, we show that the decomposed STS spectral features have a direct relevance to fundamental physical properties of the system, such as the SDW-induced gap, a pseudogap-like state, and impurity resonance states.

  15. Low-dose ionizing radiation increases the mortality risk of solid cancers in nuclear industry workers: A meta-analysis

    PubMed Central

    Qu, Shu-Gen; Gao, Jin; Tang, Bo; Yu, Bo; Shen, Yue-Ping; Tu, Yu

    2018-01-01

    Low-dose ionizing radiation (LDIR) may increase the mortality of solid cancers in nuclear industry workers, but only a few individual cohort studies exist, and the available reports have low statistical power. The aim of the present study was to focus on solid cancer mortality risk from LDIR in the nuclear industry using standardized mortality ratios (SMRs) and 95% confidence intervals. A systematic literature search through the PubMed and Embase databases identified 27 studies relevant to this meta-analysis. There was statistical significance for total, solid and lung cancers, with meta-SMR values of 0.88, 0.80, and 0.89, respectively. There was evidence of stochastic effects of IR, but more definitive conclusions require additional analyses using standardized protocols to determine whether LDIR increases the risk of solid cancer-related mortality. PMID:29725540
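
    The pooling step behind a meta-SMR is usually inverse-variance weighting on the log scale; a minimal sketch with invented study values (the actual data of the 27 studies are not reproduced here):

      # Sketch of fixed-effect inverse-variance pooling of SMRs on the log
      # scale, the usual way a meta-SMR is obtained. The per-study SMRs and
      # standard errors below are invented for illustration.
      import numpy as np

      smr = np.array([0.85, 0.92, 0.78, 1.01])      # per-study SMRs (invented)
      se_log = np.array([0.08, 0.05, 0.11, 0.07])   # SEs of log(SMR) (invented)

      w = 1.0 / se_log**2                           # inverse-variance weights
      log_pooled = np.sum(w * np.log(smr)) / np.sum(w)
      se_pooled = 1.0 / np.sqrt(np.sum(w))
      ci = np.exp(log_pooled + np.array([-1.96, 1.96]) * se_pooled)
      print(np.exp(log_pooled), ci)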

  16. Descriptive epidemiology of breast cancer in China: incidence, mortality, survival and prevalence.

    PubMed

    Li, Tong; Mello-Thoms, Claudia; Brennan, Patrick C

    2016-10-01

    Breast cancer is the most common neoplasm diagnosed amongst women worldwide and is the leading cause of female cancer death. However, breast cancer in China is not comprehensively understood compared with Westernised countries, although the 5-year prevalence statistics indicate that approximately 11 % of worldwide breast cancer occurs in China and that the incidence has increased rapidly in recent decades. This paper reviews the descriptive epidemiology of Chinese breast cancer in terms of incidence, mortality, survival and prevalence, and explores relevant factors such as age of manifestation and geographic locations. The statistics are compared with data from the Westernised world with particular emphasis on the United States and Australia. Potential causal agents responsible for differences in breast cancer epidemiology between Chinese and other populations are also explored. The need to minimise variability and discrepancies in methods of data acquisition, analysis and presentation is highlighted.

  17. The Australasian Resuscitation in Sepsis Evaluation (ARISE) trial statistical analysis plan.

    PubMed

    Delaney, Anthony P; Peake, Sandra L; Bellomo, Rinaldo; Cameron, Peter; Holdgate, Anna; Howe, Belinda; Higgins, Alisa; Presneill, Jeffrey; Webb, Steve

    2013-09-01

    The Australasian Resuscitation in Sepsis Evaluation (ARISE) study is an international, multicentre, randomised, controlled trial designed to evaluate the effectiveness of early goal-directed therapy compared with standard care for patients presenting to the emergency department with severe sepsis. In keeping with current practice, and considering aspects of trial design and reporting specific to non-pharmacological interventions, our plan outlines the principles and methods for analysing and reporting the trial results. The document is prepared before completion of recruitment into the ARISE study, without knowledge of the results of the interim analysis conducted by the data safety and monitoring committee and before completion of the two related international studies. Our statistical analysis plan was designed by the ARISE chief investigators, and reviewed and approved by the ARISE steering committee. We reviewed the data collected by the research team as specified in the study protocol and detailed in the study case report form. We describe information related to baseline characteristics, characteristics of delivery of the trial interventions, details of resuscitation, other related therapies and other relevant data with appropriate comparisons between groups. We define the primary, secondary and tertiary outcomes for the study, with description of the planned statistical analyses. We have developed a statistical analysis plan with a trial profile, mock-up tables and figures. We describe a plan for presenting baseline characteristics, microbiological and antibiotic therapy, details of the interventions, processes of care and concomitant therapies and adverse events. We describe the primary, secondary and tertiary outcomes with identification of subgroups to be analysed. We have developed a statistical analysis plan for the ARISE study, available in the public domain, before the completion of recruitment into the study. This will minimise analytical bias and conforms to current best practice in conducting clinical trials.

  18. Analysis of respiratory events in obstructive sleep apnea syndrome: Inter-relations and association to simple nocturnal features.

    PubMed

    Ghandeharioun, H; Rezaeitalab, F; Lotfi, R

    2016-01-01

    This study carefully evaluates the association of different respiration-related events with each other and with simple nocturnal features in obstructive sleep apnea-hypopnea syndrome (OSAS). The events include apneas, hypopneas, respiratory event-related arousals (RERAs) and snores. We conducted a statistical study on 158 adults who underwent polysomnography between July 2012 and May 2014. To assess relevance, alongside linear statistical strategies such as analysis of variance and bootstrapping the standard error of a correlation coefficient, the non-linear method of mutual information was also applied to clarify ambiguous results of the linear techniques. Based on normalized mutual information weights (NMIW), indices of apnea are 1.3 times more relevant to AHI values than those of hypopnea. NMIW for the number of blood oxygen desaturations below 95% is considerable (0.531). The next most relevant feature is the respiratory arousal index, with an NMIW of 0.501. Snore indices (0.314) and BMI (0.203) take the next places. Based on NMIW values, snoring events are nearly one-third (29.9%) more dependent on hypopneas than on RERAs. 1. The more severe the OSAS, the more frequently the apneic events happen. 2. An association of snore with hypopnea/RERA was revealed, which is routinely ignored in regression-based OSAS modeling. 3. The statistical dependencies of oximetry features can potentially lead to home-based screening of OSAS. 4. The poor ESS-AHI relevance in the database under study indicates its limited utility for OSA diagnosis compared to oximetry. 5. Given the poor RERA-snore/ESS relevance, a detailed history of the symptoms plus polysomnography is suggested for accurate diagnosis of RERAs. Copyright © 2015 Sociedade Portuguesa de Pneumologia. Published by Elsevier España, S.L.U. All rights reserved.
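
    The nonlinear relevance measure used above can be approximated with scikit-learn's mutual information estimator; a sketch on synthetic data, where normalizing by the largest weight is one plausible reading of "normalized":

      # Sketch of scoring feature relevance to AHI with mutual information,
      # the nonlinear measure discussed above. Data are synthetic; the
      # normalization (divide by the maximum weight) is one plausible choice.
      import numpy as np
      from sklearn.feature_selection import mutual_info_regression

      rng = np.random.default_rng(6)
      n = 158
      ahi = rng.gamma(2.0, 10.0, size=n)               # apnea-hypopnea index
      X = np.column_stack([
          ahi * 0.8 + rng.normal(0, 5, n),             # apnea index (strongly related)
          ahi * 0.3 + rng.normal(0, 5, n),             # hypopnea index (weaker)
          rng.normal(0, 1, n),                         # unrelated feature
      ])

      mi = mutual_info_regression(X, ahi, random_state=0)
      nmiw = mi / mi.max()                             # normalized weights
      print(dict(zip(["apnea", "hypopnea", "noise"], nmiw.round(2))))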

  19. Development of ecological indicator guilds for land management

    USGS Publications Warehouse

    Krzysik, A.J.; Balbach, H.E.; Duda, J.J.; Emlen, J.M.; Freeman, D.C.; Graham, J.H.; Kovacic, D.A.; Smith, L.M.; Zak, J.C.

    2005-01-01

    Agency land-use must be efficiently and cost-effectively monitored to assess conditions and trends in ecosystem processes and natural resources relevant to mission requirements and legal mandates. Ecological Indicators represent important land management tools for tracking ecological changes and preventing irreversible environmental damage in disturbed landscapes. The overall objective of the research was to develop both individual and integrated sets (i.e., statistically derived guilds) of Ecological Indicators to: quantify habitat conditions and trends, track and monitor ecological changes, provide early warning or threshold detection, and provide guidance for land managers. The derivation of Ecological Indicators was based on statistical criteria, ecosystem relevance, reliability and robustness, economy and ease of use for land managers, multi-scale performance, and stress response criteria. The basis for the development of statistically based Ecological Indicators was the identification of ecosystem metrics that analytically tracked a landscape disturbance gradient.

  20. Development and Validation of Instruments to Measure Learning of Expert-Like Thinking

    NASA Astrophysics Data System (ADS)

    Adams, Wendy K.; Wieman, Carl E.

    2011-06-01

    This paper describes the process for creating and validating an assessment test that measures the effectiveness of instruction by probing how well that instruction causes students in a class to think like experts about specific areas of science. The design principles and process are laid out and it is shown how these align with professional standards that have been established for educational and psychological testing and the elements of assessment called for in a recent National Research Council study on assessment. The importance of student interviews for creating and validating the test is emphasized, and the appropriate interview procedures are presented. The relevance and use of standard psychometric statistical tests are discussed. Additionally, techniques for effective test administration are presented.

  1. Operationalization Of The Professional Risks Assessment Activity

    NASA Astrophysics Data System (ADS)

    Ivascu, Victoria Larisa; Cirjaliu, Bianca; Draghici, Anca

    2015-07-01

    The professional risk assessment approach (integrating analysis and evaluation processes) is linked with the general concern of today's companies for their employees' health and safety, in the context of organizations' sustainable development. The paper presents an approach for the operationalization of the professional risk assessment activity in companies through the implementation and use of the OnRisk platform, which has been tested in several industrial companies. A short presentation of the relevant technical reports and statistics on OSH management at the European Union level underlines the need for the development of professional risk assessment. Finally, the designed and developed OnRisk web platform is described, together with some case studies that have validated the created tool.

  2. Retrieval of diagnostic and treatment studies for clinical use through PubMed and PubMed's Clinical Queries filters.

    PubMed

    Lokker, Cynthia; Haynes, R Brian; Wilczynski, Nancy L; McKibbon, K Ann; Walter, Stephen D

    2011-01-01

    Clinical Queries filters were developed to improve the retrieval of high-quality studies in searches on clinical matters. The study objective was to determine the yield of relevant citations and physician satisfaction while searching for diagnostic and treatment studies using the Clinical Queries page of PubMed compared with searching PubMed without these filters. Forty practicing physicians, presented with standardized treatment and diagnosis questions and one question of their choosing, entered search terms which were processed in a random, blinded fashion through PubMed alone and PubMed Clinical Queries. Participants rated search retrievals for applicability to the question at hand and satisfaction. For treatment, the primary outcome of retrieval of relevant articles was not significantly different between the groups, but a higher proportion of articles from the Clinical Queries searches met methodologic criteria (p=0.049), and more articles were published in core internal medicine journals (p=0.056). For diagnosis, the filtered results returned more relevant articles (p=0.031) and fewer irrelevant articles (overall retrieval less, p=0.023); participants needed to screen fewer articles before arriving at the first relevant citation (p<0.05). Relevance was also influenced by content terms used by participants in searching. Participants varied greatly in their search performance. Clinical Queries filtered searches returned more high-quality studies, though the retrieval of relevant articles was only statistically different between the groups for diagnosis questions. Retrieving clinically important research studies from Medline is a challenging task for physicians. Methodological search filters can improve search retrieval.
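
    The same filtered search can be reproduced programmatically against NCBI's E-utilities; a sketch in which the appended clause is the Haynes therapy/narrow filter as commonly published (quoted from memory, so verify it against PubMed's Clinical Queries documentation before relying on it):

      # Sketch of a filtered PubMed search via the NCBI E-utilities esearch
      # endpoint. The therapy/narrow clause is reproduced from memory and
      # should be verified against the published Clinical Queries filters.
      import requests

      THERAPY_NARROW = ('(randomized controlled trial[Publication Type] OR '
                        '(randomized[Title/Abstract] AND controlled[Title/Abstract] '
                        'AND trial[Title/Abstract]))')

      def clinical_query(question_terms: str, retmax: int = 20) -> list[str]:
          """Return PubMed IDs for a clinical question, restricted by the filter."""
          r = requests.get(
              "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi",
              params={
                  "db": "pubmed",
                  "term": f"({question_terms}) AND {THERAPY_NARROW}",
                  "retmode": "json",
                  "retmax": retmax,
              },
              timeout=30,
          )
          return r.json()["esearchresult"]["idlist"]

      print(clinical_query("warfarin atrial fibrillation stroke prevention"))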

  3. Joint principal trend analysis for longitudinal high-dimensional data.

    PubMed

    Zhang, Yuping; Ouyang, Zhengqing

    2018-06-01

    We consider a research scenario motivated by integrating multiple sources of information for better knowledge discovery in diverse dynamic biological processes. Given two longitudinal high-dimensional datasets for a group of subjects, we want to extract shared latent trends and identify relevant features. To solve this problem, we present a new statistical method named joint principal trend analysis (JPTA). We demonstrate the utility of JPTA through simulations and applications to gene expression data of the mammalian cell cycle and longitudinal transcriptional profiling data in response to influenza viral infections. © 2017, The International Biometric Society.

  4. Reconnection Dynamics and Mutual Friction in Quantum Turbulence

    NASA Astrophysics Data System (ADS)

    Laurie, Jason; Baggaley, Andrew W.

    2015-07-01

    We investigate the behaviour of the mutual friction force in finite-temperature quantum turbulence in superfluid ⁴He, paying particular attention to the role of quantized vortex reconnections. Through the use of the vortex filament model, we produce three experimentally relevant types of vortex tangles in steady-state conditions, and examine through statistical analysis how local properties of the tangle influence the mutual friction force. Finally, by monitoring reconnection events, we present evidence to indicate that vortex reconnections are the dominant mechanism for producing areas of high curvature and velocity, leading to regions of high mutual friction, particularly for homogeneous and isotropic vortex tangles.

  5. Development of a method to extend by boron neutron capture process the therapeutic possibilities of a liver autograft

    NASA Astrophysics Data System (ADS)

    Pinelli, Tazio; Altieri, Saverio; Fossati, F.; Zonta, Aris; Prati, U.; Roveda, L.; Nano, Rosanna

    1997-02-01

    We present results on the surgical technique, neutron field and irradiation facility for an original treatment of diffuse liver metastases. Our method plans to irradiate the isolated organ in a thermal neutron field soon after it has been explanted and boron-enriched, and before it is grafted back into the same donor. In particular, the crucial point of boron uptake was investigated in a rat model with a relevant number of procedures. We give, for the first time, statistically significant results on the selective boron absorption by tumor tissues.

  6. Visualizing biological reaction intermediates with DNA curtains

    NASA Astrophysics Data System (ADS)

    Zhao, Yiling; Jiang, Yanzhou; Qi, Zhi

    2017-04-01

    Single-molecule approaches have tremendous potential for analyzing dynamic biological reactions with a heterogeneity that cannot be effectively accessed via traditional ensemble-level biochemical approaches. The approach of deoxyribonucleic acid (DNA) curtains, developed by Dr Eric Greene and his research team at Columbia University, is a high-throughput single-molecule technique that utilizes fluorescent imaging to visualize protein-DNA interactions directly and allows the acquisition of statistically relevant information from hundreds or even thousands of individual reactions. This review aims to summarize the past, present, and future of DNA curtains, with an emphasis on its applications to solve important biological questions.

  7. Physics at the FQMT’08 conference

    NASA Astrophysics Data System (ADS)

    Špička, V.; Nieuwenhuizen, Th. M.; Keefe, P. D.

    2010-01-01

    This paper summarizes the recent state of the art of the following topics presented at the FQMT’08 conference: Foundations of quantum physics; Quantum measurement; Quantum noise, decoherence and dephasing; Cold atoms and Bose-Einstein condensation; Physics of quantum computing and information; Nonequilibrium quantum statistical mechanics; Quantum, mesoscopic and partly classical thermodynamics; Mesoscopic, nano-electro-mechanical and optomechanical systems; Spin systems and their dynamics; Brownian motion and molecular motors; Physics of biological systems; and relevant experiments from the nanoscale to the macroscale. An introduction is given to all these subjects and the recent literature is reviewed. The paper contains some 680 references in total.

  8. Evaluation of Cepstrum Algorithm with Impact Seeded Fault Data of Helicopter Oil Cooler Fan Bearings and Machine Fault Simulator Data

    DTIC Science & Technology

    2013-02-01

    of a bearing must be put into practice. There are many potential methods, the most traditional being the use of statistical time-domain features... accelerate degradation to test multiple bearings to gain statistical relevance and extrapolate results to scale for field conditions. Temperature... as time statistics, frequency estimation to improve the fault frequency detection. For future investigations, one can further explore the...

  9. An Evidence-Based Medicine Curriculum Improves General Surgery Residents' Standardized Test Scores in Research and Statistics.

    PubMed

    Trickey, Amber W; Crosby, Moira E; Singh, Monika; Dort, Jonathan M

    2014-12-01

    The application of evidence-based medicine to patient care requires unique skills of the physician. Advancing residents' abilities to accurately evaluate the quality of evidence is built on an understanding of fundamental research concepts. The American Board of Surgery In-Training Examination (ABSITE) provides a relevant measure of surgical residents' knowledge of research design and statistics. We implemented a research education curriculum in an independent academic medical center's general surgery residency program, and assessed the effect on ABSITE scores. The curriculum consisted of five 1-hour monthly research and statistics lectures. The lectures were presented before the 2012 and 2013 examinations. Forty residents completing ABSITE examinations from 2007 to 2013 were included in the study. Two investigators independently identified research-related item topics from examination summary reports. Correct and incorrect responses were compared precurriculum and postcurriculum. Regression models were calculated to estimate improvement in postcurriculum scores, adjusted for individuals' scores over time and postgraduate year level. Residents demonstrated significant improvement in postcurriculum examination scores for research and statistics items. Correct responses increased 27% (P < .001). Residents were 5 times more likely to achieve a perfect score on research and statistics items postcurriculum (P < .001). Residents at all levels demonstrated improved research and statistics scores after receiving the curriculum. Because the ABSITE includes a wide spectrum of research topics, sustained improvements suggest a genuine level of understanding that will promote lifelong evaluation and clinical application of the surgical literature.

  10. Use of a Self-Reflection Tool to Enhance Resident Learning on an Adolescent Medicine Rotation.

    PubMed

    Greenberg, Katherine Blumoff; Baldwin, Constance

    2016-08-01

    Adolescent Medicine (AM) educators in pediatric residency programs are seeking new ways to engage learners in adolescent health. This mixed-methods study presents a novel self-reflection tool and addresses whether self-reflection enhanced residents' perception of the value of an adolescent rotation, in particular, its relevance to their future practice. The self-reflection tool included 17 Likert scale items on residents' comfort with the essential tasks of adolescent care and open-ended questions that promoted self-reflection and goal setting. Semi-structured, postrotation interviews encouraged residents to discuss their experiences. Likert scale data were analyzed using descriptive statistics, and interview notes and written comments on the self-reflection tool were combined for qualitative data analysis. Residents' pre- to post-self-evaluations showed statistically significant increases in comfort with most of the adolescent health care tasks. Four major themes emerged from our qualitative analysis: (1) the value of observing skilled attendings as role models; (2) the comfort gained through broad and frequent adolescent care experiences; (3) the career relevance of AM; and (4) the ability to set personally meaningful goals for the rotation. Residents used the self-reflection tool to mindfully set goals and found their AM education valuable and relevant to their future careers. Our tool helped make explicit to residents the norms, values, and beliefs of the hidden curriculum applied to the care of adolescents and helped them to improve the self-assessed quality of their rapport and communications with adolescents. We conclude that a structured self-reflection exercise can enhance residents' experiences on an AM rotation. Copyright © 2016 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.

  11. Perspectives on statistics education: observations from statistical consulting in an academic nursing environment.

    PubMed

    Hayat, Matthew J; Schmiege, Sarah J; Cook, Paul F

    2014-04-01

    Statistics knowledge is essential for understanding the nursing and health care literature, as well as for applying rigorous science in nursing research. Statistical consultants providing services to faculty and students in an academic nursing program have the opportunity to identify gaps and challenges in statistics education for nursing students. This information may be useful to curriculum committees and statistics educators. This article aims to provide perspective on statistics education stemming from the experiences of three experienced statistics educators who regularly collaborate and consult with nurse investigators. The authors share their knowledge and express their views about data management, data screening and manipulation, statistical software, types of scientific investigation, and advanced statistical topics not covered in the usual coursework. The suggestions provided include a call for data to study these topics. Relevant data about statistics education can assist educators in developing comprehensive statistics coursework for nursing students. Copyright 2014, SLACK Incorporated.

  12. Bayesian Statistics in Educational Research: A Look at the Current State of Affairs

    ERIC Educational Resources Information Center

    König, Christoph; van de Schoot, Rens

    2018-01-01

    The ability of a scientific discipline to build cumulative knowledge depends on its predominant method of data analysis. A steady accumulation of knowledge requires approaches which allow researchers to consider results from comparable prior research. Bayesian statistics is especially relevant for establishing a cumulative scientific discipline,…

  13. Teaching Biology through Statistics: Application of Statistical Methods in Genetics and Zoology Courses

    ERIC Educational Resources Information Center

    Colon-Berlingeri, Migdalisel; Burrowes, Patricia A.

    2011-01-01

    Incorporation of mathematics into biology curricula is critical to underscore for undergraduate students the relevance of mathematics to most fields of biology and the usefulness of developing quantitative process skills demanded in modern biology. At our institution, we have made significant changes to better integrate mathematics into the…

  14. Addressing the statistical mechanics of planet orbits in the solar system

    NASA Astrophysics Data System (ADS)

    Mogavero, Federico

    2017-10-01

    The chaotic nature of planet dynamics in the solar system suggests the relevance of a statistical approach to planetary orbits. In such a statistical description, the time-dependent position and velocity of the planets are replaced by the probability density function (PDF) of their orbital elements. It is natural to set up this kind of approach in the framework of statistical mechanics. In the present paper, I focus on the collisionless excitation of eccentricities and inclinations via gravitational interactions in a planetary system. The future planet trajectories in the solar system constitute the prototype of this kind of dynamics. I thus address the statistical mechanics of the solar system planet orbits and try to reproduce the PDFs numerically constructed by Laskar (2008, Icarus, 196, 1). I show that the microcanonical ensemble of the Laplace-Lagrange theory accurately reproduces the statistics of the giant planet orbits. To model the inner planets I then investigate the ansatz of equiprobability in the phase space constrained by the secular integrals of motion. The eccentricity and inclination PDFs of Earth and Venus are reproduced with no free parameters. Within the limitations of a stationary model, the predictions also show a reasonable agreement with Mars PDFs and that of Mercury inclination. The eccentricity of Mercury demands in contrast a deeper analysis. I finally revisit the random walk approach of Laskar to the time dependence of the inner planet PDFs. Such a statistical theory could be combined with direct numerical simulations of planet trajectories in the context of planet formation, which is likely to be a chaotic process.

  15. Household hazardous waste management: a review.

    PubMed

    Inglezakis, Vassilis J; Moustakas, Konstantinos

    2015-03-01

    This paper deals with the waste stream of household hazardous waste (HHW), presenting existing management systems, an overview of legislation, and other relevant quantitative and qualitative information. European Union legislation and international management schemes are summarized and presented in a concise manner by the use of diagrams in order to provide crucial information on HHW. Furthermore, sources and types, numerical figures about generation, collection and relevant management costs are within the scope of the present paper. The review shows that the term used to refer to hazardous waste generated in households is not clearly defined in legislation, and specific acts regulating the management of HHW are absent. The lack of any obligation to segregate HHW from household waste and the different terminology used make it difficult to determine the quantities and composition of this waste stream, while its generation amount is relatively small and, therefore, is commonly overlooked in waste statistics. The paper aims to cover the gap in the related literature on a subject that is among the crucial waste management challenges at the world level, considering that HHW can also have an impact on other waste streams by altering redox conditions or causing direct reactions with other non-hazardous waste substances. Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. Process evaluation to explore internal and external validity of the "Act in Case of Depression" care program in nursing homes.

    PubMed

    Leontjevas, Ruslan; Gerritsen, Debby L; Koopmans, Raymond T C M; Smalbrugge, Martin; Vernooij-Dassen, Myrra J F J

    2012-06-01

    A multidisciplinary, evidence-based care program to improve the management of depression in nursing home residents, "Act in case of Depression" (AiD), was implemented and tested using a stepped-wedge design in 23 nursing homes (NHs). Before the effect analyses, the aim was to evaluate AiD process data on sampling quality (recruitment and randomization, reach) and intervention quality (relevance and feasibility, extent to which AiD was performed), which can be used for understanding internal and external validity. In this article, a model is presented that divides process evaluation data into first- and second-order process data. Qualitative and quantitative data based on personal files of residents, interviews of nursing home professionals, and a research database were analyzed according to the following process evaluation components: sampling quality and intervention quality. The pattern of residents' informed consent rates differed for dementia special care units and somatic units during the study. The nursing home staff was satisfied with the AiD program and reported that the program was feasible and relevant. With the exception of the first screening step (nursing staff members using a short observer-based depression scale), AiD components were not performed fully by NH staff as prescribed in the AiD protocol. Although NH staff found the program relevant and feasible and was satisfied with the program content, individual AiD components may differ in feasibility. The results on sampling quality implied that statistical analyses of AiD effectiveness should account for the type of unit, whereas the findings on intervention quality implied that, next to the type of unit, analyses should account for the extent to which individual AiD program components were performed. In general, our first-order process data evaluation confirmed the internal and external validity of the AiD trial, and this evaluation enabled further statistical fine-tuning. The importance of evaluating first-order process data before executing statistical effect analyses is thus underlined. Copyright © 2012 American Medical Directors Association, Inc. Published by Elsevier Inc. All rights reserved.

  17. Analysis of nasopharyngeal carcinoma risk factors with Bayesian networks.

    PubMed

    Aussem, Alex; de Morais, Sérgio Rodrigues; Corbex, Marilys

    2012-01-01

    We propose a new graphical framework for extracting the relevant dietary, social and environmental risk factors that are associated with an increased risk of nasopharyngeal carcinoma (NPC), based on a case-control epidemiologic study comprising 1289 subjects and 150 risk factors. This framework builds on the use of Bayesian networks (BNs) for representing statistical dependencies between random variables. We discuss a novel constraint-based procedure, called Hybrid Parents and Children (HPC), that recursively builds a local graph including all the relevant features statistically associated with NPC, without having to learn the whole BN first. The local graph is afterwards directed by the domain expert according to expert knowledge. It provides a statistical profile of the recruited population and, at the same time, helps identify the risk factors associated with NPC. Extensive experiments on synthetic data sampled from known BNs show that HPC outperforms state-of-the-art algorithms that have appeared in the recent literature. From a biological perspective, the present study confirms that chemical products, pesticides and domestic fume intake from incomplete combustion of coal and wood are significantly associated with NPC risk. These results suggest that industrial workers are often exposed to noxious chemicals and poisonous substances used in the course of manufacturing. This study also supports previous findings that the consumption of a number of preserved food items, such as house-made proteins and sheep fat, is a major risk factor for NPC. BNs are valuable data mining tools for the analysis of epidemiologic data. They can explicitly combine both expert knowledge from the field and information inferred from the data. These techniques therefore merit consideration as valuable alternatives to traditional multivariate regression techniques in epidemiologic studies. Copyright © 2011 Elsevier B.V. All rights reserved.
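
    To make the constraint-based idea above concrete, the following sketch runs a naive local parents-and-children search around a target variable using chi-square conditional independence tests. It is an illustration only, not the authors' HPC procedure: the synthetic data, variable names and order-1 conditioning strategy are simplifying assumptions.

    ```python
    # Naive local structure discovery via conditional independence tests.
    # NOT the HPC algorithm from the record above -- just the general idea.
    import numpy as np
    import pandas as pd
    from scipy.stats import chi2_contingency

    def dependent(df, x, y, mask=None, alpha=0.05):
        """Chi-square dependence test between columns x and y, optionally
        restricted to the rows selected by `mask`."""
        sub = df if mask is None else df[mask]
        table = pd.crosstab(sub[x], sub[y])
        if table.shape[0] < 2 or table.shape[1] < 2:
            return False                      # degenerate stratum: uninformative
        return chi2_contingency(table)[1] < alpha

    def local_candidates(df, target, alpha=0.05):
        """Keep variables that stay associated with the target after
        conditioning on any single other candidate (order-1 tests only;
        HPC applies a more careful recursive strategy)."""
        cands = [c for c in df.columns
                 if c != target and dependent(df, c, target, alpha=alpha)]
        kept = []
        for v in cands:
            separated = False
            for z in (c for c in cands if c != v):
                strata = [dependent(df, v, target, df[z] == lvl, alpha)
                          for lvl in df[z].unique()]
                if strata and not any(strata):   # independent in every stratum
                    separated = True
                    break
            if not separated:
                kept.append(v)
        return kept

    # Toy example: 'fume' raises the outcome risk; 'tea' matters only
    # through its correlation with 'fume'.
    rng = np.random.default_rng(1)
    fume = rng.integers(0, 2, 2000)
    tea = fume ^ (rng.random(2000) < 0.2)
    npc = (fume & (rng.random(2000) < 0.8)).astype(int)
    df = pd.DataFrame({"fume": fume, "tea": tea, "npc": npc})
    print(local_candidates(df, "npc"))        # expected: ['fume']
    ```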

  18. Event detection and sub-state discovery from biomolecular simulations using higher-order statistics: application to enzyme adenylate kinase.

    PubMed

    Ramanathan, Arvind; Savol, Andrej J; Agarwal, Pratul K; Chennubhotla, Chakra S

    2012-11-01

    Biomolecular simulations at millisecond and longer time-scales can provide vital insights into functional mechanisms. Because post-simulation analyses of such large trajectory datasets can be a limiting factor in obtaining biological insights, there is an emerging need to identify key dynamical events and relate these events to the biological function online, that is, as simulations are progressing. Recently, we introduced a novel computational technique, quasi-anharmonic analysis (QAA) (Ramanathan et al., PLoS One 2011;6:e15827), for partitioning the conformational landscape into a hierarchy of functionally relevant sub-states. The unique capabilities of QAA are enabled by exploiting anharmonicity in the form of fourth-order statistics for characterizing atomic fluctuations. In this article, we extend QAA for analyzing long time-scale simulations online. In particular, we present HOST4MD, a higher-order statistical toolbox for molecular dynamics simulations, which (1) identifies key dynamical events as simulations are in progress, (2) explores potential sub-states, and (3) identifies conformational transitions that enable the protein to access those sub-states. We demonstrate HOST4MD on microsecond-timescale simulations of the enzyme adenylate kinase in its apo state. HOST4MD identifies several conformational events in these simulations, revealing how the intrinsic coupling between the three subdomains (LID, CORE, and NMP) changes during the simulations. Further, it also identifies an inherent asymmetry in the opening/closing of the two binding sites. We anticipate that HOST4MD will provide a powerful and extensible framework for detecting biophysically relevant conformational coordinates from long time-scale simulations. Copyright © 2012 Wiley Periodicals, Inc.
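
    The fourth-order statistics mentioned above can be illustrated in a few lines of code: a sliding-window excess-kurtosis score flags segments of a synthetic trajectory where fluctuations stop looking Gaussian. The window size, threshold and toy trajectory are arbitrary choices for the demonstration, not parameters of HOST4MD.

    ```python
    # Sliding-window fourth-order statistics (excess kurtosis) as an event
    # flag on a synthetic trajectory -- an illustration, not HOST4MD itself.
    import numpy as np
    from scipy.stats import kurtosis

    def windowed_kurtosis(traj, window=200):
        """Mean excess kurtosis across coordinates for each sliding window;
        traj has shape (n_frames, n_coords)."""
        n = traj.shape[0] - window + 1
        scores = np.empty(n)
        for i in range(n):
            chunk = traj[i:i + window]
            scores[i] = kurtosis(chunk - chunk.mean(axis=0), axis=0).mean()
        return scores

    # Gaussian fluctuations with a burst of rare large jumps near frame 1500
    rng = np.random.default_rng(7)
    traj = rng.normal(size=(3000, 30))
    traj[1450:1550] += rng.choice([-4.0, 0.0, 4.0], size=(100, 30),
                                  p=[0.05, 0.9, 0.05])

    scores = windowed_kurtosis(traj)
    events = np.where(scores > 1.0)[0]        # arbitrary threshold for the demo
    if events.size:
        print("flagged windows span", events.min(), "to", events.max())
    ```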

  19. Transportation Energy Data Book: Edition 34

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Stacy Cagle; Williams, Susan E; Boundy, Robert Gary

    2015-08-01

    The Transportation Energy Data Book: Edition 34 is a statistical compendium prepared and published by Oak Ridge National Laboratory (ORNL) under contract with the U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy, Vehicle Technologies Office. Designed for use as a desk-top reference, the Data Book represents an assembly and display of statistics and information that characterize transportation activity, and presents data on other factors that influence transportation energy use. The purpose of this document is to present relevant statistical data in the form of tables and graphs. The latest edition of the Data Book is available to a larger audience via the Internet (cta.ornl.gov/data). This edition of the Data Book has 12 chapters which focus on various aspects of the transportation industry. Chapter 1 focuses on petroleum; Chapter 2 energy; Chapter 3 highway vehicles; Chapter 4 light vehicles; Chapter 5 heavy vehicles; Chapter 6 alternative fuel vehicles; Chapter 7 fleet vehicles; Chapter 8 household vehicles; Chapter 9 nonhighway modes; Chapter 10 transportation and the economy; Chapter 11 greenhouse gas emissions; and Chapter 12 criteria pollutant emissions. The sources used represent the latest available data. There are also three appendices which include detailed source information for some tables, measures of conversion, and the definition of Census divisions and regions. A glossary of terms and a title index are also included for the reader's convenience.

  20. Transportation Energy Data Book: Edition 35

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Stacy Cagle; Williams, Susan E.; Boundy, Robert Gary

    2016-10-01

    The Transportation Energy Data Book: Edition 35 is a statistical compendium prepared and published by Oak Ridge National Laboratory (ORNL) under contract with the U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy, Vehicle Technologies Office. Designed for use as a desk-top reference, the Data Book represents an assembly and display of statistics and information that characterize transportation activity, and presents data on other factors that influence transportation energy use. The purpose of this document is to present relevant statistical data in the form of tables and graphs. The latest edition of the Data Book is available to a larger audience via the Internet (cta.ornl.gov/data). This edition of the Data Book has 12 chapters which focus on various aspects of the transportation industry. Chapter 1 focuses on petroleum; Chapter 2 energy; Chapter 3 highway vehicles; Chapter 4 light vehicles; Chapter 5 heavy vehicles; Chapter 6 alternative fuel vehicles; Chapter 7 fleet vehicles; Chapter 8 household vehicles; Chapter 9 nonhighway modes; Chapter 10 transportation and the economy; Chapter 11 greenhouse gas emissions; and Chapter 12 criteria pollutant emissions. The sources used represent the latest available data. There are also three appendices which include detailed source information for some tables, measures of conversion, and the definition of Census divisions and regions. A glossary of terms and a title index are also included for the reader's convenience.

  1. Transportation Energy Data Book: Edition 30

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Stacy Cagle; Diegel, Susan W; Boundy, Robert Gary

    2011-07-01

    The Transportation Energy Data Book: Edition 30 is a statistical compendium prepared and published by Oak Ridge National Laboratory (ORNL) under contract with the U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy, Vehicle Technologies Program. Designed for use as a desk-top reference, the Data Book represents an assembly and display of statistics and information that characterize transportation activity, and presents data on other factors that influence transportation energy use. The purpose of this document is to present relevant statistical data in the form of tables and graphs. The latest edition of the Data Book is available to a larger audience via the Internet (cta.ornl.gov/data). This edition of the Data Book has 12 chapters which focus on various aspects of the transportation industry. Chapter 1 focuses on petroleum; Chapter 2 energy; Chapter 3 highway vehicles; Chapter 4 light vehicles; Chapter 5 heavy vehicles; Chapter 6 alternative fuel vehicles; Chapter 7 fleet vehicles; Chapter 8 household vehicles; Chapter 9 nonhighway modes; Chapter 10 transportation and the economy; Chapter 11 greenhouse gas emissions; and Chapter 12 criteria pollutant emissions. The sources used represent the latest available data. There are also three appendices which include detailed source information for some tables, measures of conversion, and the definition of Census divisions and regions. A glossary of terms and a title index are also included for the reader's convenience.

  2. A Geostatistical Scaling Approach for the Generation of Non Gaussian Random Variables and Increments

    NASA Astrophysics Data System (ADS)

    Guadagnini, Alberto; Neuman, Shlomo P.; Riva, Monica; Panzeri, Marco

    2016-04-01

    We address manifestations of non-Gaussian statistical scaling displayed by many variables, Y, and their (spatial or temporal) increments. Evidence of such behavior includes symmetry of increment distributions at all separation distances (or lags), with sharp peaks and heavy tails that tend to decay as the lag increases. Variables reported to exhibit such distributions include quantities of direct relevance to the hydrogeological sciences, e.g., porosity, log permeability, electrical resistivity, soil and sediment texture, sediment transport rate, rainfall, and measured and simulated turbulent fluid velocity, among others. No model known to us captures all of the documented statistical scaling behaviors in a unique and consistent manner. We recently proposed a generalized sub-Gaussian model (GSG), which reconciles within a unique theoretical framework the probability distributions of a target variable and its increments. We presented an algorithm to generate unconditional random realizations of statistically isotropic or anisotropic GSG functions and illustrated it in two dimensions. In this context, we demonstrated the feasibility of estimating all key parameters of a GSG model underlying a single realization of Y by jointly analyzing spatial moments of Y data and corresponding increments. Here, we extend our GSG model to account for noisy measurements of Y at a discrete set of points in space (or time), present an algorithm to generate conditional realizations of the corresponding isotropic or anisotropic random fields, and explore them on one- and two-dimensional synthetic test cases.
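
    As a rough illustration of the sub-Gaussian idea (my construction, not the authors' GSG generator), one can modulate a correlated Gaussian process by positive random factors; the increments of the modulated process then show the sharp-peaked, heavy-tailed behavior described above, which excess kurtosis makes visible.

    ```python
    # Heavy-tailed increments from a pointwise-modulated Gaussian process.
    # The lognormal modulation and smoothing kernel are demo assumptions.
    import numpy as np
    from scipy.stats import kurtosis

    rng = np.random.default_rng(42)
    n = 20000

    # Correlated zero-mean Gaussian process via moving-average smoothing
    white = rng.normal(size=n + 24)
    g = np.convolve(white, np.ones(25) / 5.0, mode="valid")

    # Positive "subordinator" values modulating the process pointwise
    u = rng.lognormal(mean=0.0, sigma=0.8, size=n)
    y = u * g

    lag = 5
    print("excess kurtosis of increments, Gaussian    :",
          round(float(kurtosis(g[lag:] - g[:-lag])), 2))
    print("excess kurtosis of increments, sub-Gaussian:",
          round(float(kurtosis(y[lag:] - y[:-lag])), 2))
    ```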

  3. Transportation Energy Data Book. Edition 33

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Stacy Cagle; Williams, Susan E.; Boundy, Robert Gary

    2014-07-01

    The Transportation Energy Data Book: Edition 33 is a statistical compendium prepared and published by Oak Ridge National Laboratory (ORNL) under contract with the U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy, Vehicle Technologies Office. Designed for use as a desk-top reference, the Data Book represents an assembly and display of statistics and information that characterize transportation activity, and presents data on other factors that influence transportation energy use. The purpose of this document is to present relevant statistical data in the form of tables and graphs. The latest edition of the Data Book is available to a larger audience via the Internet (cta.ornl.gov/data). This edition of the Data Book has 12 chapters which focus on various aspects of the transportation industry. Chapter 1 focuses on petroleum; Chapter 2 energy; Chapter 3 highway vehicles; Chapter 4 light vehicles; Chapter 5 heavy vehicles; Chapter 6 alternative fuel vehicles; Chapter 7 fleet vehicles; Chapter 8 household vehicles; Chapter 9 nonhighway modes; Chapter 10 transportation and the economy; Chapter 11 greenhouse gas emissions; and Chapter 12 criteria pollutant emissions. The sources used represent the latest available data. There are also three appendices which include detailed source information for some tables, measures of conversion, and the definition of Census divisions and regions. A glossary of terms and a title index are also included for the reader's convenience.

  4. Transportation Energy Data Book: Edition 32

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Stacy Cagle; Diegel, Susan W; Boundy, Robert Gary

    2013-08-01

    The Transportation Energy Data Book: Edition 32 is a statistical compendium prepared and published by Oak Ridge National Laboratory (ORNL) under contract with the U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy, Vehicle Technologies Office. Designed for use as a desk-top reference, the Data Book represents an assembly and display of statistics and information that characterize transportation activity, and presents data on other factors that influence transportation energy use. The purpose of this document is to present relevant statistical data in the form of tables and graphs. The latest edition of the Data Book is available to a larger audience via the Internet (cta.ornl.gov/data). This edition of the Data Book has 12 chapters which focus on various aspects of the transportation industry. Chapter 1 focuses on petroleum; Chapter 2 energy; Chapter 3 highway vehicles; Chapter 4 light vehicles; Chapter 5 heavy vehicles; Chapter 6 alternative fuel vehicles; Chapter 7 fleet vehicles; Chapter 8 household vehicles; Chapter 9 nonhighway modes; Chapter 10 transportation and the economy; Chapter 11 greenhouse gas emissions; and Chapter 12 criteria pollutant emissions. The sources used represent the latest available data. There are also three appendices which include detailed source information for some tables, measures of conversion, and the definition of Census divisions and regions. A glossary of terms and a title index are also included for the reader's convenience.

  5. Transportation Energy Data Book: Edition 31

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Stacy Cagle; Diegel, Susan W; Boundy, Robert Gary

    2012-08-01

    The Transportation Energy Data Book: Edition 31 is a statistical compendium prepared and published by Oak Ridge National Laboratory (ORNL) under contract with the U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy, Vehicle Technologies Program. Designed for use as a desk-top reference, the Data Book represents an assembly and display of statistics and information that characterize transportation activity, and presents data on other factors that influence transportation energy use. The purpose of this document is to present relevant statistical data in the form of tables and graphs. The latest edition of the Data Book is available to a larger audience via the Internet (cta.ornl.gov/data). This edition of the Data Book has 12 chapters which focus on various aspects of the transportation industry. Chapter 1 focuses on petroleum; Chapter 2 energy; Chapter 3 highway vehicles; Chapter 4 light vehicles; Chapter 5 heavy vehicles; Chapter 6 alternative fuel vehicles; Chapter 7 fleet vehicles; Chapter 8 household vehicles; Chapter 9 nonhighway modes; Chapter 10 transportation and the economy; Chapter 11 greenhouse gas emissions; and Chapter 12 criteria pollutant emissions. The sources used represent the latest available data. There are also three appendices which include detailed source information for some tables, measures of conversion, and the definition of Census divisions and regions. A glossary of terms and a title index are also included for the reader's convenience.

  6. The Performance Analysis Based on SAR Sample Covariance Matrix

    PubMed Central

    Erten, Esra

    2012-01-01

    Multi-channel systems appear in several fields of application in science. In the Synthetic Aperture Radar (SAR) context, multi-channel systems may refer to different domains, such as multi-polarization, multi-interferometric or multi-temporal data, or even a combination of them. Due to the inherent speckle phenomenon present in SAR images, a statistical description of the data is almost mandatory for their utilization. The complex images acquired over natural media generally present zero-mean circular Gaussian characteristics. In this case, second-order statistics such as the multi-channel covariance matrix fully describe the data. In practical situations, however, the covariance matrix has to be estimated from a limited number of samples, and this sample covariance matrix follows the complex Wishart distribution. In this context, the eigendecomposition of the multi-channel covariance matrix has been shown, in different areas, to be highly relevant to the physical properties of the imaged scene. Specifically, the maximum eigenvalue of the covariance matrix has frequently been used in applications such as target or change detection, estimation of the dominant scattering mechanism in polarimetric data, and moving target indication. In this paper, the statistical behavior of the maximum eigenvalue derived from the eigendecomposition of the sample multi-channel covariance matrix of multi-channel SAR images is presented in a simplified form for the SAR community. Validation is performed against simulated data, and examples of estimation and detection problems using the analytical expressions are given as well. PMID:22736976
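
    A small simulation in the spirit of the paper (the covariances, look numbers and threshold rule are invented for the demo): estimate sample covariance matrices from L complex Gaussian looks and use the maximum eigenvalue as a detection statistic, with the threshold set empirically from the background distribution.

    ```python
    # Max-eigenvalue detection from sample covariance matrices of complex
    # multi-channel looks; all numerical settings are demo choices.
    import numpy as np

    rng = np.random.default_rng(3)
    p, L = 3, 25                                 # channels, looks per estimate

    def looks(cov, L):
        """L zero-mean circular complex Gaussian looks with covariance `cov`."""
        A = np.linalg.cholesky(cov)
        w = (rng.normal(size=(cov.shape[0], L))
             + 1j * rng.normal(size=(cov.shape[0], L))) / np.sqrt(2)
        return A @ w

    def max_eig(z):
        """Largest eigenvalue of the sample covariance of z (shape (p, L))."""
        return np.linalg.eigvalsh(z @ z.conj().T / z.shape[1])[-1]

    background = np.eye(p)
    changed = np.eye(p) + 0.9 * np.ones((p, p))  # raised correlation = "change"

    lam_bg = np.array([max_eig(looks(background, L)) for _ in range(2000)])
    lam_ch = np.array([max_eig(looks(changed, L)) for _ in range(2000)])

    thr = np.quantile(lam_bg, 0.99)              # empirical 1% false-alarm level
    print(f"threshold {thr:.2f}, detection rate {np.mean(lam_ch > thr):.2f}")
    ```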

  7. Transportation Energy Data Book: Edition 29

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Stacy Cagle; Diegel, Susan W; Boundy, Robert Gary

    2010-07-01

    The Transportation Energy Data Book: Edition 29 is a statistical compendium prepared and published by Oak Ridge National Laboratory (ORNL) under contract with the U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy, Vehicle Technologies Program. Designed for use as a desk-top reference, the Data Book represents an assembly and display of statistics and information that characterize transportation activity, and presents data on other factors that influence transportation energy use. The purpose of this document is to present relevant statistical data in the form of tables and graphs. The latest edition of the Data Book is available to a larger audience via the Internet (cta.ornl.gov/data). This edition of the Data Book has 12 chapters which focus on various aspects of the transportation industry. Chapter 1 focuses on petroleum; Chapter 2 energy; Chapter 3 highway vehicles; Chapter 4 light vehicles; Chapter 5 heavy vehicles; Chapter 6 alternative fuel vehicles; Chapter 7 fleet vehicles; Chapter 8 household vehicles; Chapter 9 nonhighway modes; Chapter 10 transportation and the economy; Chapter 11 greenhouse gas emissions; and Chapter 12 criteria pollutant emissions. The sources used represent the latest available data. There are also three appendices which include detailed source information for some tables, measures of conversion, and the definition of Census divisions and regions. A glossary of terms and a title index are also included for the reader's convenience.

  8. Transportation Energy Data Book: Edition 36

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Susan E.; Davis, Stacy Cagle; Boundy, Robert Gary

    The Transportation Energy Data Book: Edition 36 is a statistical compendium prepared and published by Oak Ridge National Laboratory (ORNL) under contract with the U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy, Vehicle Technologies Office. Designed for use as a desk-top reference, the Data Book represents an assembly and display of statistics and information that characterize transportation activity, and presents data on other factors that influence transportation energy use. The purpose of this document is to present relevant statistical data in the form of tables and graphs. The latest edition of the Data Book is available via the Internet (cta.ornl.gov/data). This edition of the Data Book has 12 chapters which focus on various aspects of the transportation industry. Chapter 1 focuses on petroleum; Chapter 2 – energy; Chapter 3 – highway vehicles; Chapter 4 – light vehicles; Chapter 5 – heavy vehicles; Chapter 6 – alternative fuel vehicles; Chapter 7 – fleet vehicles; Chapter 8 – household vehicles; Chapter 9 – nonhighway modes; Chapter 10 – transportation and the economy; Chapter 11 – greenhouse gas emissions; and Chapter 12 – criteria pollutant emissions. The sources used represent the latest available data. There are also three appendices which include detailed source information for some tables, measures of conversion, and the definition of Census divisions and regions. A glossary of terms is also included for the reader's convenience.

  9. Rainfall runoff modelling of the Upper Ganga and Brahmaputra basins using PERSiST.

    PubMed

    Futter, M N; Whitehead, P G; Sarkar, S; Rodda, H; Crossman, J

    2015-06-01

    There are ongoing discussions about the appropriate level of complexity and the sources of uncertainty in rainfall-runoff models. Simulations for operational hydrology, flood forecasting or nutrient transport all warrant different levels of complexity in the modelling approach. More complex model structures are appropriate for simulations of land-cover-dependent nutrient transport, while more parsimonious model structures may be adequate for runoff simulation. The appropriate level of complexity also depends on data availability. Here, we use PERSiST, a simple, semi-distributed dynamic rainfall-runoff modelling toolkit, to simulate flows in the Upper Ganges and Brahmaputra rivers. We present two sets of simulations driven by single time series of daily precipitation and temperature using simple (A) and complex (B) model structures, based on uniform and hydrochemically relevant land covers, respectively. Models were compared based on ensembles of Bayesian Information Criterion (BIC) statistics. Equifinality was observed for parameters but not for model structures. Model performance was better for the more complex (B) structural representations than for the parsimonious model structures. The results show that structural uncertainty is more important than parameter uncertainty. The ensembles of BIC statistics suggested that neither structural representation was preferable in a statistical sense. The simulations presented here confirm that relatively simple models with limited data requirements can be used to credibly simulate flows and the water balance components needed for nutrient flux modelling in large, data-poor basins.
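
    A hedged sketch of the model-scoring step described above: under a Gaussian error assumption, BIC can be computed from the residual sum of squares and the parameter count, penalizing the more complex structure. The synthetic flow series and parameter counts below are placeholders, not PERSiST output.

    ```python
    # Gaussian-error BIC for comparing model structures of different complexity.
    # The flow series and the "models" are synthetic stand-ins, not PERSiST.
    import numpy as np

    def bic(observed, simulated, n_params):
        """n*ln(RSS/n) + k*ln(n); lower values indicate the preferred model."""
        resid = observed - simulated
        n = observed.size
        return n * np.log(np.sum(resid**2) / n) + n_params * np.log(n)

    rng = np.random.default_rng(0)
    t = np.arange(365)
    flow = 5 + 3 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 0.5, t.size)

    sim_simple = np.full(t.size, flow.mean())           # one effective parameter
    sim_complex = 5 + 3 * np.sin(2 * np.pi * t / 365)   # say, three parameters

    print("BIC simple :", round(bic(flow, sim_simple, 1), 1))
    print("BIC complex:", round(bic(flow, sim_complex, 3), 1))
    ```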

  10. A statistical approach to bioclimatic trend detection in the airborne pollen records of Catalonia (NE Spain)

    NASA Astrophysics Data System (ADS)

    Fernández-Llamazares, Álvaro; Belmonte, Jordina; Delgado, Rosario; De Linares, Concepción

    2014-04-01

    Airborne pollen records are a suitable indicator for the study of climate change. The present work focuses on the role of annual pollen indices in the detection of bioclimatic trends through the analysis of the aerobiological spectra of 11 taxa of great biogeographical relevance in Catalonia over an 18-year period (1994-2011), by means of different parametric and non-parametric statistical methods. Among others, two non-parametric rank-based statistical tests were performed to detect monotonic trends in the time series data of the selected airborne pollen types, and we observed that they have similar power in detecting trends. Except for those cases in which the pollen data can be well modeled by a normal distribution, it is better to apply non-parametric statistical methods in aerobiological studies. Our results provide a reliable representation of the pollen trends in the region and suggest that greater pollen quantities have been released into the atmosphere in recent years, especially by Mediterranean taxa such as Pinus, Total Quercus and Evergreen Quercus, although the trends may differ geographically. Longer aerobiological monitoring periods are required to corroborate these results and to survey the increasing levels of certain pollen types that could have an impact on public health.
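
    The abstract does not name its rank-based tests; one representative choice is the Mann-Kendall test for monotonic trends, sketched below for a hypothetical 18-year series of annual pollen indices (no tie correction, for brevity).

    ```python
    # Mann-Kendall test for a monotonic trend in an annual series (no tie
    # correction); the 18-year pollen index series below is hypothetical.
    import numpy as np
    from scipy.stats import norm

    def mann_kendall(x):
        """Return (S, two-sided p) for the Mann-Kendall trend test."""
        x = np.asarray(x)
        n = x.size
        s = sum(np.sign(x[j] - x[i]) for i in range(n) for j in range(i + 1, n))
        var_s = n * (n - 1) * (2 * n + 5) / 18.0
        z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
        return s, 2 * (1 - norm.cdf(abs(z)))

    rng = np.random.default_rng(5)
    pollen = 100 + 4 * np.arange(18) + rng.normal(0, 15, 18)  # upward drift
    s, p = mann_kendall(pollen)
    print(f"S={s:.0f}, p={p:.3f}")
    ```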

  11. Comparing posteroanterior with lateral and anteroposterior chest radiography in the initial detection of parapneumonic effusions.

    PubMed

    Moffett, Bryan K; Panchabhai, Tanmay S; Nakamatsu, Raul; Arnold, Forest W; Peyrani, Paula; Wiemken, Timothy; Guardiola, Juan; Ramirez, Julio A

    2016-12-01

    It is unclear whether anteroposterior (AP) or posteroanterior with lateral (PA/Lat) chest radiographs are superior for the early detection of clinically relevant parapneumonic effusions (CR-PPEs). The objective of this study was to identify which technique is preferred for the detection of PPEs, using chest computed tomography (CCT) as the reference standard. A secondary analysis of a pneumonia database was conducted to identify patients who received a CCT within 24 hours of presentation and also received AP or PA/Lat chest radiographs within 24 hours of CCT. Sensitivity and specificity were then calculated by comparing the radiographic diagnosis of PPEs for both types of radiographs against CCT, using the existing attending radiologist interpretation. Clinical relevance of effusions was determined by a CCT effusion measurement of >2.5 cm or the presence of loculation. There was a statistically significant difference between the sensitivity of AP (67.3%) and PA/Lat (83.9%) chest radiography for the initial detection of CR-PPEs. Of 16 CR-PPEs initially missed by AP radiography, 7 either required drainage initially or developed empyema within 30 days, whereas no complicated PPE or empyema was found among those missed by PA/Lat radiography. PA/Lat chest radiography should be the initial imaging of choice in pneumonia patients for the detection of PPEs because it appears to be statistically superior to AP chest radiography. Published by Elsevier Inc.
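
    The headline figures reduce to simple proportions; the sketch below recomputes the sensitivities from hypothetical 2x2 counts chosen to reproduce the quoted percentages (the record reports rates, not the underlying counts).

    ```python
    # Sensitivity against the CCT reference, from hypothetical counts chosen
    # to match the quoted percentages -- not the study's actual data.
    def sensitivity(true_pos, false_neg):
        return true_pos / (true_pos + false_neg)

    print("AP    :", round(sensitivity(33, 16), 3))   # ~0.673
    print("PA/Lat:", round(sensitivity(47, 9), 3))    # ~0.839
    ```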

  12. Occupational noise exposure, psychosocial working conditions and the risk of tinnitus.

    PubMed

    Frederiksen, Thomas Winther; Ramlau-Hansen, Cecilia Høst; Stokholm, Zara Ann; Grynderup, Matias Brødsgaard; Hansen, Åse Marie; Lund, Søren Peter; Kristiansen, Jesper; Vestergaard, Jesper Medom; Bonde, Jens Peter; Kolstad, Henrik Albert

    2017-02-01

    The purpose of this study was to evaluate the influence of occupational noise (current and cumulative doses) and psychosocial work factors (psychological demands and decision latitude) on tinnitus occurrence among workers, using objective, non-self-reported exposure measures to prevent reporting bias. In a cross-sectional study, we analyzed data from a Danish survey from 2009 to 2010 that included 534 workers from children's day care units and 10 manufacturing trades. Associations between risk factors (current noise exposure, cumulative noise exposure and psychosocial working conditions) and tinnitus were analyzed with logistic regression. We found no statistically significant associations between either current [OR 0.95 (95% CI 0.89; 1.01)] or cumulative [OR 0.93 (95% CI 0.81; 1.06)] occupational noise exposure and tinnitus. Likewise, results for psychosocial working conditions showed no statistically significant association between workplace decision latitude [OR 1.06 (95% CI 0.94; 1.13)] or psychological demands [OR 1.07 (95% CI 0.90; 1.26)] and tinnitus. Our results suggest that current Danish occupational noise levels (in combination with relevant noise protection) are not associated with tinnitus. The results also indicate that the psychosocial working conditions observed in this cohort of mainly industrial workers were not associated with tinnitus. Therefore, psychosocial working conditions comparable to those observed in this study are probably not relevant to take into account when evaluating workers presenting with tinnitus.
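
    For readers unfamiliar with the reporting format, the snippet below fits a logistic regression on synthetic data and extracts an odds ratio with its 95% confidence interval, mirroring the OR (95% CI) style used above; the data and variable names are placeholders.

    ```python
    # Logistic regression reported as OR (95% CI), as in the abstract above.
    # Synthetic data; exposure and outcome are placeholders with no real link.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(11)
    n = 534
    noise_dose = rng.normal(80, 5, n)        # dB-like exposure values
    tinnitus = rng.binomial(1, 0.2, n)       # outcome unrelated to exposure here

    X = sm.add_constant(noise_dose)
    fit = sm.Logit(tinnitus, X).fit(disp=0)
    odds_ratio = np.exp(fit.params[1])
    lo, hi = np.exp(fit.conf_int()[1])
    print(f"OR per dB = {odds_ratio:.2f} (95% CI {lo:.2f}; {hi:.2f})")
    ```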

  13. Exploratory Visual Analysis of Statistical Results from Microarray Experiments Comparing High and Low Grade Glioma

    PubMed Central

    Reif, David M.; Israel, Mark A.; Moore, Jason H.

    2007-01-01

    The biological interpretation of gene expression microarray results is a daunting challenge. For complex diseases such as cancer, wherein the body of published research is extensive, the incorporation of expert knowledge provides a useful analytical framework. We have previously developed the Exploratory Visual Analysis (EVA) software for exploring data analysis results in the context of annotation information about each gene, as well as biologically relevant groups of genes. We present EVA as a flexible combination of statistics and biological annotation that provides a straightforward visual interface for the interpretation of microarray analyses of gene expression in the most commonly occurring class of brain tumors, glioma. We demonstrate the utility of EVA for the biological interpretation of statistical results by analyzing publicly available gene expression profiles of two important glial tumors. The results of a statistical comparison between 21 malignant, high-grade glioblastoma multiforme (GBM) tumors and 19 indolent, low-grade pilocytic astrocytomas were analyzed using EVA. By using EVA to examine the results of a relatively simple statistical analysis, we were able to identify tumor class-specific gene expression patterns having both statistical and biological significance. Our interactive analysis highlighted the potential importance of genes involved in cell cycle progression, proliferation, signaling, adhesion, migration, motility, and structure, as well as candidate gene loci on a region of Chromosome 7 that has been implicated in glioma. Because EVA does not require statistical or computational expertise and has the flexibility to accommodate any type of statistical analysis, we anticipate that EVA will prove a useful addition to the repertoire of computational methods used for microarray data analysis. EVA is available at no charge to academic users and can be found at http://www.epistasis.org. PMID:19390666

  14. Research on the development efficiency of regional high-end talent in China: A complex network approach

    PubMed Central

    Zhang, Wenbin

    2017-01-01

    In this paper, based on panel data from 31 provinces and cities in China from 1991 to 2016, the regional development efficiency matrix of high-end talent is obtained by the DEA method, and the matrix is converted into a continuously evolving sequence of complex networks through the construction of a sliding window. Using the resulting series of complex network topology statistics, the characteristics of the regional high-end talent development efficiency system are analyzed. The results show that the average development efficiency of high-end talent in the western region is at a low level. After 2005, the national regional high-end talent development efficiency network exhibits both short-range and long-range relevance in its evolution. The central region plays an important intermediary role in the national regional high-end talent development system, and the western region shows strong clustering characteristics. With the implementation of high-end talent policies with regional characteristics by different provinces and cities, the relevance of high-end talent development efficiency across provinces and cities presents a weakening trend, and the geographical characteristics of high-end talent become more and more evident. PMID:29272286

  15. Functional relevance of interindividual differences in temporal lobe callosal pathways: a DTI tractography study.

    PubMed

    Westerhausen, René; Grüner, Renate; Specht, Karsten; Hugdahl, Kenneth

    2009-06-01

    The midsagittal corpus callosum is topographically organized; that is, with regard to their cortical origin, several sub-tracts can be distinguished within the corpus callosum that belong to specific functional brain networks. Recent diffusion tensor tractography studies have also revealed remarkable interindividual differences in the size and exact localization of these tracts. To examine the functional relevance of interindividual variability in callosal tracts, 17 right-handed male participants underwent structural and diffusion tensor magnetic resonance imaging. Probabilistic tractography was carried out to identify the callosal subregions that interconnect left and right temporal lobe auditory processing areas, and the midsagittal size of this tract was taken as an indicator of the (anatomical) strength of this connection. Auditory information transfer was assessed using an auditory speech perception task with dichotic presentations of consonant-vowel syllables (e.g., /ba-ga/). The frequency of correct left-ear reports in this task served as a functional measure of interhemispheric transfer. Statistical analysis showed that a stronger anatomical connection between the superior temporal lobe areas supports better information transfer. This specific structure-function association in the auditory modality supports the general notion that interindividual differences in callosal topography possess functional relevance.

  16. Toward resolution of the debate regarding purported crypto-Jews in a Spanish-American population: evidence from the Y chromosome.

    PubMed

    Sutton, Wesley K; Knight, Alec; Underhill, Peter A; Neulander, Judith S; Disotell, Todd R; Mountain, Joanna L

    2006-01-01

    The ethnic heritage of northernmost New Spain, including present-day northern New Mexico and southernmost Colorado, USA, is intensely debated. Local Spanish-American folkways and anecdotal narratives led to claims that the region was colonized primarily by secret- or crypto-Jews. Despite ethnographic criticisms, the notion of substantial crypto-Jewish ancestry among Spanish-Americans persists. We tested the null hypothesis that Spanish-Americans of northern New Mexico carry essentially the same profile of paternally inherited DNA variation as the peoples of Iberia, and the relevant alternative hypothesis that the sampled Spanish-Americans possess inherited DNA variation that reflects Jewish ancestry significantly greater than that in present-day Iberia. We report frequencies of 19 Y-chromosome unique event polymorphism (UEP) biallelic markers for 139 men from across northern New Mexico and southern Colorado, USA, who self-identify as 'Spanish-American'. We used three different statistical tests of differentiation to compare frequencies of major UEP-defined clades or haplogroups with published data for Iberians, Jews, and other Mediterranean populations. We also report frequencies of derived UEP markers within each major haplogroup, compared with published data for relevant populations. All tests of differentiation showed that, for frequencies of the major UEP-defined clades, Spanish-Americans and Iberians are statistically indistinguishable. All other pairwise comparisons, including between Spanish-Americans and Jews, and Iberians and Jews, revealed highly significant differences in UEP frequencies. Our results indicate that paternal genetic inheritance of Spanish-Americans is indistinguishable from that of Iberians and refute the popular and widely publicized scenario of significant crypto-Jewish ancestry of the Spanish-American population.
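
    A minimal version of one such differentiation test is sketched below; the counts are invented for illustration (the study used three different tests and real haplogroup frequencies).

    ```python
    # Chi-square comparison of haplogroup counts between two samples -- one
    # simple differentiation test; all counts are invented placeholders.
    import numpy as np
    from scipy.stats import chi2_contingency

    #                 hg R1b  hg J  hg E  other
    spanish_american = [ 80,    15,   10,   34]
    iberian          = [ 85,    12,    9,   30]

    chi2, pval, dof, _ = chi2_contingency(np.array([spanish_american, iberian]))
    print(f"chi2={chi2:.2f}, df={dof}, p={pval:.3f}")  # high p -> indistinguishable
    ```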

  17. Toxicity of analytically cleaned pentabromodiphenylether after prolonged exposure in estuarine European flounder (Platichthys flesus), and partial life-cycle exposure in fresh water zebrafish (Danio rerio).

    PubMed

    Kuiper, Raoul V; Vethaak, A D; Cantón, Roćio F; Anselmo, Henrique; Dubbeldam, Marco; van den Brandhof, Evert-Jan; Leonards, Pim E G; Wester, Piet W; van den Berg, Martin

    2008-09-01

    Residues of polybrominated diphenylethers (PBDEs), extensively applied as flame retardants, are widely spread in the aquatic environment and biota. The present study investigates the effects of the environmentally relevant lower-brominated diphenylethers in two fish species in vivo under controlled laboratory conditions. Euryhaline flounder (Platichthys flesus) and freshwater zebrafish (Danio rerio) were exposed to a range of concentrations of a commercial pentabromodiphenylether mixture, DE-71. Chemical analysis of exposed fish showed a pattern of PBDE congeners very similar to that in wild fish. The resulting range included environmentally relevant as well as higher levels. Animals were investigated histopathologically with emphasis on endocrine and reproductive organs. In zebrafish, hatching of embryos and larval development were assessed. Biochemical parameters were investigated in flounder as markers for suggested dioxin-like activity (ethoxyresorufin-O-deethylase, EROD) and activation of endogenous estrogen synthesis (gonad aromatase activity). Thyroid hormones were analyzed in plasma in both species. Benchmark analysis using internal PBDE concentrations showed a mild dose-dependent decrease of hepatic EROD and ovarian aromatase activities and plasma thyroxine levels in flounder, and an increase of plasma thyroid hormone levels in zebrafish. These trends did not result in statistically significant differences from control fish, and major histopathological changes were not observed. Reproduction in zebrafish appeared to be the most sensitive parameter, with statistically significantly reduced larval survival and non-significant indications of decreased egg production at internal levels more than 55 times the highest environmental recordings. The present results indicate limited risk of endocrine or reproductive effects of current environmental PBDE contamination in fish.

  18. Problematic smartphone use: A conceptual overview and systematic review of relations with anxiety and depression psychopathology.

    PubMed

    Elhai, Jon D; Dvorak, Robert D; Levine, Jason C; Hall, Brian J

    2017-01-01

    Research literature on problematic smartphone use, or smartphone addiction, has proliferated. However, relationships with existing categories of psychopathology are not well defined. We discuss the concept of problematic smartphone use, including possible causal pathways to such use. We conducted a systematic review of the relationship between problematic use and psychopathology. Using scholarly bibliographic databases, we screened 117 total citations, resulting in 23 peer-reviewed papers examining statistical relations between standardized measures of problematic smartphone use/use severity and the severity of psychopathology. Most papers examined problematic use in relation to depression, anxiety, chronic stress and/or low self-esteem. Across this literature, without statistically adjusting for other relevant variables, depression severity was consistently related to problematic smartphone use, demonstrating at least medium effect sizes. Anxiety was also consistently related to problematic use, but with small effect sizes. Stress was somewhat consistently related, with small to medium effects. Self-esteem was inconsistently related, with small to medium effects when found. Statistically adjusting for other relevant variables yielded similar but somewhat smaller effects. We only included correlational studies in our systematic review, but also address the few relevant experimental studies. We discuss causal explanations for the relationships between problematic smartphone use and psychopathology. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Statistics teaching in medical school: opinions of practising doctors.

    PubMed

    Miles, Susan; Price, Gill M; Swift, Louise; Shepstone, Lee; Leinster, Sam J

    2010-11-04

    The General Medical Council expects UK medical graduates to gain some statistical knowledge during their undergraduate education, but provides no specific guidance as to amount, content or teaching method. Published work on statistics teaching for medical undergraduates has been dominated by medical statisticians, with little input from the doctors who will actually be using this knowledge and these skills after graduation. Furthermore, doctors' statistical training needs may have changed due to advances in information technology and the increasing importance of evidence-based medicine. Thus there is a need to investigate the views of practising medical doctors as to the statistical training required for undergraduate medical students, based on their own use of these skills in daily practice. A questionnaire was designed to investigate doctors' views about undergraduate training in statistics and the need for these skills in daily practice, with a view to informing future teaching. The questionnaire was emailed to all clinicians with a link to the University of East Anglia Medical School. Open-ended questions were included to elicit doctors' opinions about both their own undergraduate training in statistics and recommendations for the training of current medical students. Content analysis was performed by two of the authors to systematically categorize and describe all the responses provided by participants. 130 doctors responded, including both hospital consultants and general practitioners. The findings indicated that most had not recognised the value of their undergraduate teaching in statistics and probability at the time, but had subsequently found the skills relevant to their career. Suggestions for improving undergraduate teaching in these areas included referring to actual research and ensuring relevance to, and integration with, clinical practice. Grounding the teaching of statistics in the context of real research studies and including examples of typical clinical work may better prepare medical students for their subsequent careers.

  20. Making Online Instruction Count: Statistical Reporting of Web-Based Library Instruction Activities

    ERIC Educational Resources Information Center

    Bottorff, Tim; Todd, Andrew

    2012-01-01

    Statistical reporting of library instruction (LI) activities has historically focused on measures relevant to face-to-face (F2F) settings. However, newer forms of LI conducted in the online realm may be difficult to count in traditional ways, leading to inaccurate reporting to both internal and external stakeholders. A thorough literature review…

  1. Orangutans (Pongo spp.) may prefer tools with rigid properties to flimsy tools.

    PubMed

    Walkup, Kristina R; Shumaker, Robert W; Pruetz, Jill D

    2010-11-01

    Preference for tools with either rigid or flexible properties was explored in orangutans (Pongo spp.) through an extension of D. J. Povinelli, J. E. Reaux, and L. A. Theall's (2000) flimsy-tool problem. Three captive orangutans were presented with three unfamiliar pairs of tools to solve a novel problem. Although each orangutan has spontaneously used tools in the past, the tools presented in this study were novel to the apes. Each pair of tools contained one tool with rigid properties (functional) and one tool with flimsy properties (nonfunctional). Solving the problem required selection of a rigid tool to retrieve a food reward. The functional tool was selected in nearly all trials. Moreover, two of the orangutans demonstrated this within the first test trials with each of the three tool types. Although further research is required to test this statistically, it suggests either a preexisting preference for rigid tools or comprehension of the relevant features required in a tool to solve the task. The results of this study demonstrate that orangutans can recognize, or learn to recognize, relevant tool properties and can choose an appropriate tool to solve a problem. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  2. Learning predictive statistics from temporal sequences: Dynamics and strategies

    PubMed Central

    Wang, Rui; Shen, Yuan; Tino, Peter; Welchman, Andrew E.; Kourtzi, Zoe

    2017-01-01

    Human behavior is guided by our expectations about the future. Often, we make predictions by monitoring how event sequences unfold, even though such sequences may appear incomprehensible. Event structures in the natural environment typically vary in complexity, from simple repetition to complex probabilistic combinations. How do we learn these structures? Here we investigate the dynamics of structure learning by tracking human responses to temporal sequences that change in structure unbeknownst to the participants. Participants were asked to predict the upcoming item following a probabilistic sequence of symbols. Using a Markov process, we created a family of sequences, from simple frequency statistics (e.g., some symbols are more probable than others) to context-based statistics (e.g., symbol probability is contingent on preceding symbols). We demonstrate the dynamics with which individuals adapt to changes in the environment's statistics—that is, they extract the behaviorally relevant structures to make predictions about upcoming events. Further, we show that this structure learning relates to individual decision strategy; faster learning of complex structures relates to selection of the most probable outcome in a given context (maximizing) rather than matching of the exact sequence statistics. Our findings provide evidence for alternate routes to learning of behaviorally relevant statistics that facilitate our ability to predict future events in variable environments. PMID:28973111
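
    The two regimes of sequence statistics described above can be reproduced in a few lines; the alphabet, base rates and transition matrix below are arbitrary demonstration choices.

    ```python
    # Generating the two kinds of sequence statistics described above:
    # frequency-based (i.i.d., unequal base rates) vs context-based
    # (first-order Markov). All probabilities are demo choices.
    import numpy as np

    rng = np.random.default_rng(2)
    symbols = np.array(list("ABCD"))

    # Frequency-based statistics: some symbols are simply more probable
    freq_seq = rng.choice(symbols, size=20, p=[0.5, 0.25, 0.15, 0.10])

    # Context-based statistics: P(next symbol) depends on the current one
    T = np.array([[0.1, 0.7, 0.1, 0.1],
                  [0.1, 0.1, 0.7, 0.1],
                  [0.1, 0.1, 0.1, 0.7],
                  [0.7, 0.1, 0.1, 0.1]])
    state, ctx_seq = 0, []
    for _ in range(20):
        state = rng.choice(4, p=T[state])
        ctx_seq.append(symbols[state])

    print("frequency-based:", "".join(freq_seq))
    print("context-based  :", "".join(ctx_seq))
    # A "maximizing" learner predicts argmax P(next | context); a "matching"
    # learner samples predictions in proportion to those probabilities.
    ```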

  3. Comparison of traditional methods with 3D computer models in the instruction of hepatobiliary anatomy.

    PubMed

    Keedy, Alexander W; Durack, Jeremy C; Sandhu, Parmbir; Chen, Eric M; O'Sullivan, Patricia S; Breiman, Richard S

    2011-01-01

    This study was designed to determine whether an interactive three-dimensional presentation depicting liver and biliary anatomy is more effective for teaching medical students than a traditional textbook-format presentation of the same material. Forty-six medical students volunteered for participation in this study. Baseline demographic information, spatial ability, and knowledge of relevant anatomy were measured. Participants were randomized into two groups and presented with a computer-based interactive learning module comprised of animations and still images highlighting various anatomical structures (3D group), or a computer-based text document containing the same images and text without animation or interactive features (2D group). Following each teaching module, students completed a satisfaction survey and a nine-item anatomic knowledge post-test. The 3D group scored higher on the post-test than the 2D group, with mean scores of 74% and 64%, respectively; however, when baseline differences in pretest scores were accounted for, this difference was not statistically significant (P = 0.33). Spatial ability did not statistically significantly correlate with post-test scores for either the 3D group or the 2D group. In the post-test satisfaction survey, the 3D group expressed a statistically significantly higher overall satisfaction rating compared to students in the 2D control group (4.5 versus 3.7 out of 5, P = 0.02). While the interactive 3D multimedia module received higher satisfaction ratings from students, it neither enhanced nor inhibited learning of complex hepatobiliary anatomy compared to an informationally equivalent traditional textbook-style approach. Copyright © 2011 American Association of Anatomists.

  4. Statistical methods applied to the study of opinion formation models: a brief overview and results of a numerical study of a model based on the social impact theory

    NASA Astrophysics Data System (ADS)

    Bordogna, Clelia María; Albano, Ezequiel V.

    2007-02-01

    The aim of this paper is twofold. On the one hand, we present a brief overview of the application of statistical physics methods to the modelling of social phenomena, focusing our attention on models of opinion formation. On the other hand, we discuss and present original results for a model of opinion formation based on the social impact theory developed by Latané. The model accounts for the interaction among the members of a social group under the competitive influence of a strong leader and the mass media, each supporting a different state of opinion. Extensive simulations of the model are presented, showing that they lead to a rich scenario of complex behaviour including, among others, critical behaviour and phase transitions between a state of opinion dominated by the leader and another dominated by the mass media. The occurrence of interesting finite-size effects reveals that, in small communities, the opinion of the leader may prevail over that of the mass media. This observation is relevant for the understanding of social phenomena involving a finite number of individuals, in contrast to actual physical phase transitions, which take place in the thermodynamic limit. Finally, we give a brief outlook of open questions and lines for future work.

  5. Pathway-GPS and SIGORA: identifying relevant pathways based on the over-representation of their gene-pair signatures

    PubMed Central

    Foroushani, Amir B.K.; Brinkman, Fiona S.L.

    2013-01-01

    Motivation. Predominant pathway analysis approaches treat pathways as collections of individual genes and consider all pathway members as equally informative. As a result, at times spurious and misleading pathways are inappropriately identified as statistically significant, solely due to components that they share with the more relevant pathways. Results. We introduce the concept of Pathway Gene-Pair Signatures (Pathway-GPS) as pairs of genes that, as a combination, are specific to a single pathway. We devised and implemented a novel approach to pathway analysis, Signature Over-representation Analysis (SIGORA), which focuses on the statistically significant enrichment of Pathway-GPS in a user-specified gene list of interest. In a comparative evaluation of several published datasets, SIGORA outperformed traditional methods by delivering biologically more plausible and relevant results. Availability. An efficient implementation of SIGORA, as an R package with precompiled GPS data for several human and mouse pathway repositories is available for download from http://sigora.googlecode.com/svn/. PMID:24432194
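
    At its core, over-representation of signatures in a gene list of interest can be scored with a hypergeometric tail probability. The sketch below uses invented counts and treats gene-pair signatures simply as countable units, which glosses over SIGORA's weighting scheme.

    ```python
    # Hypergeometric over-representation of pathway signatures in a hit list.
    # All counts are invented; SIGORA's pair weighting is not reproduced here.
    from scipy.stats import hypergeom

    N = 10000    # size of the signature universe
    K = 40       # signatures belonging to the pathway of interest
    n = 300      # signatures hit by the user's gene list
    k = 8        # pathway signatures among those hits

    p = hypergeom.sf(k - 1, N, K, n)   # P(X >= k) under the null
    print(f"over-representation p-value: {p:.2e}")
    ```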

  6. Group Influences on Young Adult Warfighters’ Risk Taking

    DTIC Science & Technology

    2016-12-01

    Statistical Analysis: Latent linear growth models were fitted using the maximum likelihood estimation method in Mplus (version 7.0; Muthen & Muthen)... condition had a higher net score than those in the alone condition (b = 20.53, SE = 6.29, p < .001). Results of the relevant statistical analyses are... [model fit statistics (BIC and chi-square values) appear only as a garbled table fragment in this excerpt and are omitted]

  7. Optimal periodic proof test based on cost-effective and reliability criteria

    NASA Technical Reports Server (NTRS)

    Yang, J.-N.

    1976-01-01

    An exploratory study for the optimization of periodic proof tests for fatigue-critical structures is presented. The optimal proof load level and the optimal number of periodic proof tests are determined by minimizing the total expected (statistical average) cost, while the constraint on the allowable level of structural reliability is satisfied. The total expected cost consists of the expected cost of proof tests, the expected cost of structures destroyed by proof tests, and the expected cost of structural failure in service. It is demonstrated by numerical examples that significant cost saving and reliability improvement for fatigue-critical structures can be achieved by the application of the optimal periodic proof test. The present study is relevant to the establishment of optimal maintenance procedures for fatigue-critical structures.
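
    A toy version of the optimization described above (all cost figures and probability models are invented placeholders): grid-search the proof-load level and number of periodic tests, minimizing total expected cost subject to a reliability constraint.

    ```python
    # Grid search for an "optimal" periodic proof test: minimize total
    # expected cost subject to a reliability floor. Models are placeholders.
    import numpy as np

    C_TEST, C_DESTROYED, C_FAILURE = 1.0, 50.0, 500.0

    def p_destroy(load):                # chance a proof test destroys the structure
        return 0.002 * load**2

    def p_fail_service(load, n_tests):  # in-service failure prob., reduced by screening
        return 0.05 * np.exp(-0.8 * load * n_tests)

    best = None
    for load in np.linspace(0.5, 2.0, 31):
        for n in range(1, 11):
            if 1 - p_fail_service(load, n) < 0.999:   # reliability constraint
                continue
            cost = (n * C_TEST + n * p_destroy(load) * C_DESTROYED
                    + p_fail_service(load, n) * C_FAILURE)
            if best is None or cost < best[0]:
                best = (cost, load, n)
    print("min expected cost %.3f at load=%.2f, n=%d" % best)
    ```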

  8. A re-examination of the effects of biased lineup instructions in eyewitness identification.

    PubMed

    Clark, Steven E

    2005-10-01

    A meta-analytic review of research comparing biased and unbiased instructions in eyewitness identification experiments showed an asymmetry: biased instructions led to a large and consistent decrease in accuracy in target-absent lineups, but produced inconsistent results for target-present lineups, with an average effect size near zero (Steblay, 1997). The results for target-present lineups are surprising, and are inconsistent with statistical decision theories (i.e., Green & Swets, 1966). A re-examination of the relevant studies and the meta-analysis of those studies shows clear evidence that correct identification rates do increase with biased lineup instructions, and that biased witnesses make correct identifications at a rate considerably above chance. Implications for theory, as well as police procedure and policy, are discussed.

  10. The transfer of analytical procedures.

    PubMed

    Ermer, J; Limberger, M; Lis, K; Wätzig, H

    2013-11-01

    Analytical method transfers are certainly among the most discussed topics in the GMP-regulated sector. However, they are surprisingly little regulated in detail. General information is provided by USP, WHO, and ISPE in particular. Most recently, the EU emphasized the importance of analytical transfer by including it in their draft of the revised GMP Guideline. In this article, an overview and comparison of these guidelines is provided. The key to success for method transfers is excellent communication between the sending and receiving units. In order to facilitate this communication, procedures, flow charts and checklists for responsibilities, success factors, transfer categories, the transfer plan and report, strategies in case of failed transfers, and tables with acceptance limits are provided here, together with a comprehensive glossary. Potential pitfalls are described so that they can be avoided. In order to assure an efficient and sustainable transfer of analytical procedures, a practically relevant and scientifically sound evaluation with corresponding acceptance criteria is crucial. Various strategies and statistical tools such as significance tests, absolute acceptance criteria, and equivalence tests are thoroughly described and compared in detail, with examples. Significance tests should be avoided: the success criterion is not statistical significance, but rather analytical relevance. Depending on a risk assessment of the analytical procedure in question, statistical equivalence tests are recommended, because they include both a practically relevant acceptance limit and a direct control of the statistical risks. However, for lower-risk procedures, a simple comparison of the transfer performance parameters to absolute limits is also regarded as sufficient. Copyright © 2013 Elsevier B.V. All rights reserved.
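
    To illustrate the recommended equivalence-testing approach, the sketch below applies a two one-sided tests (TOST) procedure to invented assay results from a sending and a receiving site, against an assumed ±2 acceptance limit; the degrees of freedom use a simple pooled approximation.

    ```python
    # TOST equivalence test on invented sending/receiving assay results with
    # an assumed +/-2 acceptance limit -- a sketch, not a validated procedure.
    import numpy as np
    from scipy import stats

    sending = np.array([99.8, 100.1, 99.6, 100.3, 99.9, 100.0])
    receiving = np.array([100.4, 100.9, 100.2, 100.7, 100.5, 100.6])
    theta = 2.0                                  # acceptance limit

    diff = receiving.mean() - sending.mean()
    se = np.sqrt(sending.var(ddof=1) / sending.size
                 + receiving.var(ddof=1) / receiving.size)
    df = sending.size + receiving.size - 2       # simple approximation

    t_lower = (diff + theta) / se                # H0: diff <= -theta
    t_upper = (diff - theta) / se                # H0: diff >= +theta
    p = max(stats.t.sf(t_lower, df), stats.t.cdf(t_upper, df))
    verdict = "equivalent" if p < 0.05 else "not demonstrably equivalent"
    print(f"difference {diff:.2f}; TOST p = {p:.4f} -> {verdict} within +/-{theta}")
    ```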

  11. A systematic review of motivational interviewing in healthcare: the potential of motivational interviewing to address the lifestyle factors relevant to multimorbidity

    PubMed Central

    McKenzie, Kylie J.; Pierce, David; Gunn, Jane M.

    2015-01-01

    Internationally, health systems face an increasing demand for services from people living with multimorbidity. Multimorbidity is often associated with high levels of treatment burden. Targeting lifestyle factors that impact across multiple conditions may promote quality of life and better health outcomes for people with multimorbidity. Motivational interviewing (MI) has been studied as one approach to supporting lifestyle behaviour change. A systematic review was conducted to assess the effectiveness of MI in healthcare settings and to consider its relevance for multimorbidity. Twelve meta-analyses pertinent to multimorbidity lifestyle factors were identified. As an intervention, MI has been found to have a small-to-medium statistically significant effect across a wide variety of single diseases and for a range of behavioural outcomes. This review highlights the need for specific research into the application of MI to determine if the benefits of MI seen with single diseases are also present in the context of multimorbidity. PMID:29090164

  12. Origin of the spike-timing-dependent plasticity rule

    NASA Astrophysics Data System (ADS)

    Cho, Myoung Won; Choi, M. Y.

    2016-08-01

    A biological synapse changes its efficacy depending on the difference between pre- and post-synaptic spike timings. Formulating spike-timing-dependent interactions in terms of the path integral, we establish a neural-network model, which makes it possible to predict relevant quantities rigorously by means of standard methods in statistical mechanics and field theory. In particular, the biological synaptic plasticity rule is shown to emerge as the optimal form for minimizing the free energy. It is further revealed that maximization of the entropy of neural activities gives rise to the competitive behavior of biological learning. This demonstrates that statistical mechanics helps to understand rigorously key characteristic behaviors of a neural network, thus providing the possibility of physics serving as a useful and relevant framework for probing life.
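
    For reference, the textbook exponential form of the biological STDP window that such free-energy arguments aim to recover (a standard form from the wider literature, not quoted from this paper), with \Delta t = t_{\mathrm{post}} - t_{\mathrm{pre}}:

        \Delta w(\Delta t) =
          \begin{cases}
            A_{+}\, e^{-\Delta t/\tau_{+}}, & \Delta t > 0 \quad \text{(potentiation)} \\
            -A_{-}\, e^{\Delta t/\tau_{-}}, & \Delta t < 0 \quad \text{(depression)}
          \end{cases}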

  13. The intercrater plains of Mercury and the Moon: Their nature, origin and role in terrestrial planet evolution. Geologic mapping of Mercury and the Moon. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Leake, M. A.

    1982-01-01

    The geologic framework of the intercrater plains on Mercury and the Moon as determined through geologic mapping is presented. The strategies used in such mapping are discussed first. Then, because the degree of crater degradation is applied to both mapping and crater statistics, the correlation of degradation classification of lunar and Mercurian craters is thoroughly addressed. Different imaging systems can potentially affect this classification, and are therefore also discussed. The techniques used in mapping Mercury are discussed in Section 2, followed by presentation of the Geologic Map of Mercury in Section 3. Material units, structures, and relevant albedo and color data are discussed therein. Preliminary conclusions regarding plains' origins are given there. The last section presents the mapping analyses of the lunar intercrater plains, including tentative conclusions of their origin.

  14. An Update on Statistical Boosting in Biomedicine.

    PubMed

    Mayr, Andreas; Hofner, Benjamin; Waldmann, Elisabeth; Hepp, Tobias; Meyer, Sebastian; Gefeller, Olaf

    2017-01-01

    Statistical boosting algorithms have triggered a lot of research during the last decade. They combine a powerful machine learning approach with classical statistical modelling, offering various practical advantages like automated variable selection and implicit regularization of effect estimates. They are extremely flexible, as the underlying base-learners (regression functions defining the type of effect for the explanatory variables) can be combined with any kind of loss function (target function to be optimized, defining the type of regression setting). In this review article, we highlight the most recent methodological developments on statistical boosting regarding variable selection, functional regression, and advanced time-to-event modelling. Additionally, we provide a short overview on relevant applications of statistical boosting in biomedicine.
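
    The advertised automatic variable selection follows directly from the update rule, which a few lines of Python can illustrate (a toy componentwise L2 boosting sketch with simple linear base-learners, not a library implementation): each iteration fits every candidate base-learner to the current residuals and updates only the best-fitting one.

        # Toy componentwise L2 boosting: implicit variable selection.
        import numpy as np

        rng = np.random.default_rng(6)
        X = rng.normal(size=(200, 10))
        y = 3 * X[:, 2] - 2 * X[:, 7] + rng.normal(size=200)  # 2 informative vars

        beta, nu = np.zeros(10), 0.1          # coefficients and step length
        resid = y.copy()
        for _ in range(300):
            slopes = X.T @ resid / (X**2).sum(axis=0)   # per-variable LS fit
            sse = ((resid[:, None] - X * slopes)**2).sum(axis=0)
            j = int(np.argmin(sse))                     # best base-learner this round
            beta[j] += nu * slopes[j]
            resid -= nu * slopes[j] * X[:, j]
        print(beta.round(2))   # weight concentrates on indices 2 and 7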

  15. Physical Regulation of the Self-Assembly of Tobacco Mosaic Virus Coat Protein

    PubMed Central

    Kegel, Willem K.; van der Schoot, Paul

    2006-01-01

    We present a statistical mechanical model based on the principle of mass action that explains the main features of the in vitro aggregation behavior of the coat protein of tobacco mosaic virus (TMV). By comparing our model to experimentally obtained stability diagrams, titration experiments, and calorimetric data, we pin down three competing factors that regulate the transitions between the different kinds of aggregated state of the coat protein. These are hydrophobic interactions, electrostatic interactions, and the formation of so-called “Caspar” carboxylate pairs. We suggest that these factors could be universal and relevant to a large class of virus coat proteins. PMID:16731551

  16. Plan execution monitoring with distributed intelligent agents for battle command

    NASA Astrophysics Data System (ADS)

    Allen, James P.; Barry, Kevin P.; McCormick, John M.; Paul, Ross A.

    2004-07-01

    As military tactics evolve toward execution-centric operations, the ability to analyze vast amounts of mission-relevant data is essential to command and control decision making. To maintain operational tempo and achieve information superiority we have developed Vigilant Advisor, a mobile agent-based distributed Plan Execution Monitoring system. It provides military commanders with continuous contingency monitoring tailored to their preferences while overcoming the network bandwidth problem often associated with traditional remote data querying. This paper presents an overview of Plan Execution Monitoring as well as a detailed view of the Vigilant Advisor system, including key features and statistical analysis of the resource savings provided by its mobile agent-based approach.

  17. Lognormal Uncertainty Estimation for Failure Rates

    NASA Technical Reports Server (NTRS)

    Britton, Paul T.; Al Hassan, Mohammad; Ring, Robert W.

    2017-01-01

    "Uncertainty analysis itself is uncertain, therefore, you cannot evaluate it exactly," Source Uncertain. Quantitative results for aerospace engineering problems are influenced by many sources of uncertainty. Uncertainty analysis aims to make a technical contribution to decision-making through the quantification of uncertainties in the relevant variables as well as through the propagation of these uncertainties up to the result. Uncertainty can be thought of as a measure of the 'goodness' of a result and is typically represented as statistical dispersion. This presentation will explain common measures of centrality and dispersion; and-with examples-will provide guidelines for how they may be estimated to ensure effective technical contributions to decision-making.

  18. Field reliability of Ricor microcoolers

    NASA Astrophysics Data System (ADS)

    Pundak, N.; Porat, Z.; Barak, M.; Zur, Y.; Pasternak, G.

    2009-05-01

    Over the past 25 years, Ricor has fielded in excess of 50,000 Stirling cryocoolers, among which approximately 30,000 units are of the micro integral rotary driven type. The statistical population of the fielded units is counted in the thousands or hundreds per application category. In contrast to MTTF values gathered and presented on the basis of standard reliability demonstration tests, where the failure of the weakest component dictates the end of product life, field reliability values, where design and workmanship failures are counted and considered, are usually reported as the number of failures per million hours of operation. These values are important and relevant to the prediction of service capabilities and planning.
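
    The bookkeeping behind such field figures is simple to sketch. Assuming hypothetical totals, the conversion from observed failures and accumulated operating hours to a rate per million hours, with a one-sided chi-square upper confidence bound:

        # Field failure rate per million hours (counts and hours are invented).
        from scipy.stats import chi2

        failures, hours = 18, 2_400_000     # assumed fleet totals
        rate = failures / hours * 1e6       # point estimate per 10^6 hours
        upper = chi2.ppf(0.95, 2 * failures + 2) / (2 * hours) * 1e6
        print(f"point = {rate:.1f}, 95% upper bound = {upper:.1f} per 10^6 h")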

  19. Influence of leadership on quality nursing care.

    PubMed

    Mendes, Luis; Fradique, Maria de Jesus José Gil

    2014-01-01

    The purpose of this paper is to investigate the extent to which nursing leadership, as perceived by nursing staff, influences nursing quality. Data were collected between August and October 2011 in a Portuguese health center via a questionnaire completed by nurses. Our original sample included 283 employees; 184 questionnaires were received (65% response). The theoretical model presents reasonably satisfactory fit indices (values above literature reference thresholds). Path analysis between latent constructs clearly suggests that nursing leadership has a direct (beta = 0.724) and statistically significant (p = 0.007) effect on nursing quality. Results reinforce several ideas propagated throughout the literature that suggest the relationship's relevance but have lacked empirical support, which this study provides.

  20. Pan-European household and industrial water demand: regional relevant estimations

    NASA Astrophysics Data System (ADS)

    Bernhard, Jeroen; Reynaud, Arnaud; de Roo, Ad

    2016-04-01

    Sustainable water management is of high importance for providing adequate quality and quantity of water to European households, industries and agriculture, especially since demographic, economic and climate changes are expected to increase competition for water between these sectors in the future. A shortage of water implies a reduction in the welfare of households or damage to economic sectors. This socio-economic component should be incorporated into the decision-making process when developing water allocation schemes, requiring detailed water use information and cost/benefit functions. We present the results of our study, which focused on providing regionally relevant pan-European water demand and cost-benefit estimations for the household and industry sectors. We gathered consistent data on water consumption, water prices and other relevant variables at the highest spatial detail available from national statistical offices and other organizational bodies. This database provides the most detailed, up-to-date picture of present water use and water prices across Europe. The use of homogeneous data allowed us to compare regions and analyze spatial patterns. We applied econometric methods to determine the main determinants of water demand and to make a monetary valuation of water for both the domestic and industry sectors. This monetary valuation is important to allow water allocation based on economic damage estimates. We also attempted to estimate how population growth, as well as socio-economic and climatic changes, will impact future water demand up to 2050, using a homogeneous method for all countries. European projections for the identified major drivers of water demand were used to simulate future conditions. Subsequently, water demand functions were applied to estimate future water use and the potential economic damage caused by water shortages. We present our results together with an estimation of the uncertainty of our predictions.
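
    As a sketch of the econometric step, a log-log demand regression in Python on synthetic data (variable names and coefficients are invented); in this specification the coefficient on log price is the constant price elasticity used for monetary valuation:

        # Hypothetical log-log water demand regression on synthetic data.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 200
        log_price = rng.normal(1.0, 0.3, n)      # invented regional covariates
        log_income = rng.normal(10.0, 0.5, n)
        log_use = 2.0 - 0.4 * log_price + 0.3 * log_income + rng.normal(0, 0.2, n)

        X = sm.add_constant(np.column_stack([log_price, log_income]))
        fit = sm.OLS(log_use, X).fit()
        print(fit.params)   # coefficient on log_price (~ -0.4) is the elasticity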

  1. The Promises and Pitfalls of Genoeconomics*

    PubMed Central

    Benjamin, Daniel J.; Cesarini, David; Chabris, Christopher F.; Glaeser, Edward L.; Laibson, David I.; Guðnason, Vilmundur; Harris, Tamara B.; Launer, Lenore J.; Purcell, Shaun; Smith, Albert Vernon; Johannesson, Magnus; Magnusson, Patrik K.E.; Beauchamp, Jonathan P.; Christakis, Nicholas A.; Atwood, Craig S.; Hebert, Benjamin; Freese, Jeremy; Hauser, Robert M.; Hauser, Taissa S.; Grankvist, Alexander; Hultman, Christina M.; Lichtenstein, Paul

    2012-01-01

    This article reviews existing research at the intersection of genetics and economics, presents some new findings that illustrate the state of genoeconomics research, and surveys the prospects of this emerging field. Twin studies suggest that economic outcomes and preferences, once corrected for measurement error, appear to be about as heritable as many medical conditions and personality traits. Consistent with this pattern, we present new evidence on the heritability of permanent income and wealth. Turning to genetic association studies, we survey the main ways that the direct measurement of genetic variation across individuals is likely to contribute to economics, and we outline the challenges that have slowed progress in making these contributions. The most urgent problem facing researchers in this field is that most existing efforts to find associations between genetic variation and economic behavior are based on samples that are too small to ensure adequate statistical power. This has led to many false positives in the literature. We suggest a number of possible strategies to remedy this problem: (a) pooling data sets, (b) using statistical techniques that exploit the greater information content of many genes considered jointly, and (c) focusing on economically relevant traits that are most proximate to known biological mechanisms. PMID:23482589
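
    The power problem can be quantified with the standard Fisher-z sample-size formula (the effect size below is invented for illustration): per-variant effects of the magnitude typically reported for behavioral traits require very large samples, especially at a genome-wide significance threshold.

        # Required sample size to detect a correlation r (Fisher-z approximation).
        import numpy as np
        from scipy.stats import norm

        def required_n(r, alpha, power):
            z = (norm.ppf(1 - alpha / 2) + norm.ppf(power)) / np.arctanh(r)
            return int(np.ceil(z**2 + 3))

        print(required_n(r=0.02, alpha=5e-8, power=0.8))  # genome-wide threshold
        print(required_n(r=0.02, alpha=0.05, power=0.8))  # nominal threshold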

  2. Visualization of Spatio-Temporal Relations in Movement Event Using Multi-View

    NASA Astrophysics Data System (ADS)

    Zheng, K.; Gu, D.; Fang, F.; Wang, Y.; Liu, H.; Zhao, W.; Zhang, M.; Li, Q.

    2017-09-01

    Spatio-temporal relations among movement events extracted from temporally varying trajectory data can provide useful information about the evolution of individual or collective movers, as well as their interactions with their spatial and temporal contexts. However, the pure statistical tools commonly used by analysts pose many difficulties, due to the large number of attributes embedded in multi-scale and multi-semantic trajectory data. The need for models that operate at multiple scales to search for relations at different locations within time and space, as well as to intuitively interpret what these relations mean, also presents challenges. Since analysts do not know where or when these relevant spatio-temporal relations might emerge, these models must compute statistical summaries of multiple attributes at different granularities. In this paper, we propose a multi-view approach to visualize the spatio-temporal relations among movement events. We describe a method for visualizing movement events and spatio-temporal relations that uses multiple displays. A visual interface is presented, and the user can interactively select or filter spatial and temporal extents to guide the knowledge discovery process. We also demonstrate how this approach can help analysts to derive and explain the spatio-temporal relations of movement events from taxi trajectory data.

  3. Effects of Cognitive Load on Trust

    DTIC Science & Technology

    2013-10-01

    that may be affected by load; build a parsing tool to extract relevant features; statistical analysis of results (by load components). Achieved... for a business application. Participants assessed potential job candidates and reviewed the applicants' virtual resume, which included standard... substantially different from each other that would make any confounding problems or other issues. Some statistics of the Australian data collection are...

  4. Actitudes de Estudiantes Universitarios que Tomaron Cursos Introductorios de Estadistica y su Relacion con el Exito Academico en La Disciplina

    ERIC Educational Resources Information Center

    Colon-Rosa, Hector Wm.

    2012-01-01

    Considering the range of changes in the instruction and learning of statistics, several questions emerge regarding how those changes influence students' attitudes. Equally, other questions emerge reflecting that statistics is a fundamental course in university academic programs because of its relevance to the professional development of the…

  5. Arab oil and gas directory 1985

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1985-01-01

    The directory provides detailed statistics and information on aspects of oil and gas production, exploration and development in the 24 Arab countries of the Middle East and North Africa and in Iran. It includes the texts of relevant new laws and official documents, official surveys, current projects and developments, and up-to-date statistics covering OPEC and OAPEC member countries, and it contains 26 maps.

  6. Condensation of an ideal gas obeying non-Abelian statistics.

    PubMed

    Mirza, Behrouz; Mohammadzadeh, Hosein

    2011-09-01

    We consider the thermodynamic geometry of an ideal non-Abelian gas. We show that, for a certain value of the fractional parameter and at the relevant maximum value of fugacity, the thermodynamic curvature has a singular point. This indicates a condensation such as Bose-Einstein condensation for non-Abelian statistics and we work out the phase transition temperature in various dimensions.

  7. No-reference image quality assessment based on natural scene statistics and gradient magnitude similarity

    NASA Astrophysics Data System (ADS)

    Jia, Huizhen; Sun, Quansen; Ji, Zexuan; Wang, Tonghan; Chen, Qiang

    2014-11-01

    The goal of no-reference/blind image quality assessment (NR-IQA) is to devise a perceptual model that can accurately predict the quality of a distorted image in agreement with human opinion; feature extraction is an important issue in this task. However, the features used in the state-of-the-art "general purpose" NR-IQA algorithms are usually either natural scene statistics (NSS) based or perceptually relevant; therefore, the performance of these models is limited. To further improve the performance of NR-IQA, we propose a general purpose NR-IQA algorithm which combines NSS-based features with perceptually relevant features. The new method extracts features in both the spatial and gradient domains. In the spatial domain, we extract the point-wise statistics of single pixel values, which are characterized by a generalized Gaussian distribution model to form the underlying features. In the gradient domain, statistical features based on neighboring gradient magnitude similarity are extracted. Then a mapping is learned to predict quality scores using support vector regression. The experimental results on the benchmark image databases demonstrate that the proposed algorithm correlates highly with human judgments of quality and leads to significant performance improvements over state-of-the-art methods.
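
    The spatial-domain feature family can be sketched concretely: a moment-matching estimate of the generalized Gaussian shape parameter in Python (synthetic input rather than a real image; the grid-search estimator below is one common choice, not necessarily the authors' exact procedure).

        # Moment-matching (rho-ratio) estimate of a GGD shape parameter.
        import numpy as np
        from scipy.special import gamma as G

        def estimate_ggd_shape(x, grid=np.arange(0.2, 10.0, 0.001)):
            r_hat = np.mean(np.abs(x))**2 / np.mean(x**2)
            r_grid = G(2.0 / grid)**2 / (G(1.0 / grid) * G(3.0 / grid))
            return grid[np.argmin((r_grid - r_hat)**2)]

        x = np.random.default_rng(1).normal(size=100_000)  # true shape = 2
        print(estimate_ggd_shape(x))                       # close to 2.0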

  8. Decrease of Fisher information and the information geometry of evolution equations for quantum mechanical probability amplitudes.

    PubMed

    Cafaro, Carlo; Alsing, Paul M

    2018-04-01

    The relevance of the concept of Fisher information is increasing in both statistical physics and quantum computing. From a statistical mechanical standpoint, the application of Fisher information in the kinetic theory of gases is characterized by its decrease along the solutions of the Boltzmann equation for Maxwellian molecules in the two-dimensional case. From a quantum mechanical standpoint, the output state in Grover's quantum search algorithm follows a geodesic path obtained from the Fubini-Study metric on the manifold of Hilbert-space rays. Additionally, Grover's algorithm is specified by constant Fisher information. In this paper, we present an information geometric characterization of the oscillatory or monotonic behavior of statistically parametrized squared probability amplitudes originating from special functional forms of the Fisher information function: constant, exponential decay, and power-law decay. Furthermore, for each case, we compute both the computational speed and the availability loss of the corresponding physical processes by exploiting a convenient Riemannian geometrization of useful thermodynamical concepts. Finally, we briefly comment on the possibility of using the proposed methods of information geometry to help identify a suitable trade-off between speed and thermodynamic efficiency in quantum search algorithms.
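
    For orientation, the standard identity linking Fisher information to the squared probability amplitudes p_\theta(x) = |\psi_\theta(x)|^2 (a textbook relation, not quoted from the paper):

        \mathcal{F}(\theta)
          = \int p_\theta(x)\left(\frac{\partial \ln p_\theta(x)}{\partial \theta}\right)^{2} dx
          = 4 \int \left(\frac{\partial \sqrt{p_\theta(x)}}{\partial \theta}\right)^{2} dx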

  9. Decrease of Fisher information and the information geometry of evolution equations for quantum mechanical probability amplitudes

    NASA Astrophysics Data System (ADS)

    Cafaro, Carlo; Alsing, Paul M.

    2018-04-01

    The relevance of the concept of Fisher information is increasing in both statistical physics and quantum computing. From a statistical mechanical standpoint, the application of Fisher information in the kinetic theory of gases is characterized by its decrease along the solutions of the Boltzmann equation for Maxwellian molecules in the two-dimensional case. From a quantum mechanical standpoint, the output state in Grover's quantum search algorithm follows a geodesic path obtained from the Fubini-Study metric on the manifold of Hilbert-space rays. Additionally, Grover's algorithm is specified by constant Fisher information. In this paper, we present an information geometric characterization of the oscillatory or monotonic behavior of statistically parametrized squared probability amplitudes originating from special functional forms of the Fisher information function: constant, exponential decay, and power-law decay. Furthermore, for each case, we compute both the computational speed and the availability loss of the corresponding physical processes by exploiting a convenient Riemannian geometrization of useful thermodynamical concepts. Finally, we briefly comment on the possibility of using the proposed methods of information geometry to help identify a suitable trade-off between speed and thermodynamic efficiency in quantum search algorithms.

  10. Effectiveness of propolis on oral health: a meta-analysis.

    PubMed

    Hwu, Yueh-Juen; Lin, Feng-Yu

    2014-12-01

    The use of propolis mouth rinse or gel as a supplementary intervention has increased during the last decade in Taiwan. However, the effect of propolis on oral health is not well understood. The purpose of this meta-analysis was to present the best available evidence regarding the effects of propolis use on oral health, including oral infection, dental plaque, and stomatitis. Researchers searched seven electronic databases for relevant articles published between 1969 and 2012. Data were collected using inclusion and exclusion criteria. The Joanna Briggs Institute Meta Analysis of Statistics Assessment and Review Instrument was used to evaluate the quality of the identified articles. Eight trials published from 1997 to 2011 with 194 participants had extractable data. The result of the meta-analysis indicated that, although propolis had an effect on reducing dental plaque, this effect was not statistically significant. The results were not statistically significant for oral infection or stomatitis. Although there are a number of promising indications, in view of the limited number and quality of studies and the variation in results among studies, this review highlights the need for additional well-designed trials to draw conclusions that are more robust.

  11. Synthetic Earthquake Statistics From Physical Fault Models for the Lower Rhine Embayment

    NASA Astrophysics Data System (ADS)

    Brietzke, G. B.; Hainzl, S.; Zöller, G.

    2012-04-01

    As of today, seismic risk and hazard estimates mostly use purely empirical, stochastic models of earthquake fault systems, tuned specifically to the vulnerable areas of interest. Although such models allow for reasonable risk estimates, they fail to provide a link between the observed seismicity and the underlying physical processes. Solving a state-of-the-art, fully dynamic description of all relevant physical processes related to earthquake fault systems is likely not useful, since it comes with a large number of degrees of freedom, poor constraints on its model parameters and a huge computational effort. Here, quasi-static and quasi-dynamic physical fault simulators provide a compromise between physical completeness and computational affordability, and aim at providing a link between basic physical concepts and the statistics of seismicity. Within the framework of quasi-static and quasi-dynamic earthquake simulators we investigate a model of the Lower Rhine Embayment (LRE) that is based upon seismological and geological data. We present and discuss statistics of the spatio-temporal behavior of generated synthetic earthquake catalogs with respect to simplification (e.g. simple two-fault cases) as well as to complication (e.g. hidden faults, geometric complexity, heterogeneities of constitutive parameters).

  12. Networking—a statistical physics perspective

    NASA Astrophysics Data System (ADS)

    Yeung, Chi Ho; Saad, David

    2013-03-01

    Networking encompasses a variety of tasks related to the communication of information on networks; it has a substantial economic and societal impact on a broad range of areas including transportation systems, wired and wireless communications and a range of Internet applications. As transportation and communication networks become increasingly more complex, the ever increasing demand for congestion control, higher traffic capacity, quality of service, robustness and reduced energy consumption requires new tools and methods to meet these conflicting requirements. The new methodology should serve for gaining better understanding of the properties of networking systems at the macroscopic level, as well as for the development of new principled optimization and management algorithms at the microscopic level. Methods of statistical physics seem best placed to provide new approaches as they have been developed specifically to deal with nonlinear large-scale systems. This review aims at presenting an overview of tools and methods that have been developed within the statistical physics community and that can be readily applied to address the emerging problems in networking. These include diffusion processes, methods from disordered systems and polymer physics, probabilistic inference, which have direct relevance to network routing, file and frequency distribution, the exploration of network structures and vulnerability, and various other practical networking applications.

  13. Temperature in and out of equilibrium: A review of concepts, tools and attempts

    NASA Astrophysics Data System (ADS)

    Puglisi, A.; Sarracino, A.; Vulpiani, A.

    2017-11-01

    We review the general aspects of the concept of temperature in equilibrium and non-equilibrium statistical mechanics. Although temperature is an old and well-established notion, it still presents controversial facets. After a short historical survey of the key role of temperature in thermodynamics and statistical mechanics, we tackle a series of issues which have been recently reconsidered. In particular, we discuss different definitions and their relevance for energy fluctuations. The interest in such a topic has been triggered by the recent observation of negative temperatures in condensed matter experiments. Moreover, the ability to manipulate systems at the micro and nano-scale urges us to understand and clarify some aspects related to the statistical properties of small systems (such as the issue of temperature "fluctuations"). We also discuss the notion of temperature in a dynamical context, within the theory of linear response for Hamiltonian systems at equilibrium and stochastic models with detailed balance, and the generalized fluctuation-response relations, which provide a hint for an extension of the definition of temperature in far-from-equilibrium systems. To conclude, we consider non-Hamiltonian systems, such as granular materials, turbulence and active matter, where a general theoretical framework is still lacking.

  14. 50 CFR 648.260 - Specifications.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Scientific and Statistical Committee (SSC), and any other relevant information, the Red Crab PDT shall... appropriate, concerning the environmental, economic, and social impacts of the recommendations. The Regional...

  15. GeneLab

    NASA Technical Reports Server (NTRS)

    Berrios, Daniel C.; Thompson, Terri G.

    2015-01-01

    NASA GeneLab is expected to capture and distribute omics data, together with the experimental and process conditions most relevant to the research community for the statistical and theoretical analysis of NASA's omics data.

  16. Information Selection in Intelligence Processing

    DTIC Science & Technology

    2011-12-01

    given. Edges connecting nodes representing irrelevant persons with either relevant or irrelevant persons are added randomly, as in an Erdos-Renyi graph (Erdos and Renyi, 1959): for each irrelevant node i and another node j (either relevant or irrelevant) there is a predetermined probability that... statistics for engineering and the sciences (7th ed.). Boston: Duxbury Press. Erdos, P., & Renyi, A. (1959). “On Random Graphs,” Publicationes

  17. What Are the Odds? Modern Relevance and Bayes Factor Solutions for MacAlister's Problem from the 1881 "Educational Times"

    ERIC Educational Resources Information Center

    Jamil, Tahira; Marsman, Maarten; Ly, Alexander; Morey, Richard D.; Wagenmakers, Eric-Jan

    2017-01-01

    In 1881, Donald MacAlister posed a problem in the "Educational Times" that remains relevant today. The problem centers on the statistical evidence for the effectiveness of a treatment based on a comparison between two proportions. A brief historical sketch is followed by a discussion of two default Bayesian solutions, one based on a…

  18. A note on generalized Genome Scan Meta-Analysis statistics

    PubMed Central

    Koziol, James A; Feng, Anne C

    2005-01-01

    Background Wise et al. introduced a rank-based statistical technique for meta-analysis of genome scans, the Genome Scan Meta-Analysis (GSMA) method. Levinson et al. recently described two generalizations of the GSMA statistic: (i) a weighted version of the GSMA statistic, so that different studies could be ascribed different weights for analysis; and (ii) an order statistic approach, reflecting the fact that a GSMA statistic can be computed for each chromosomal region or bin width across the various genome scan studies. Results We provide an Edgeworth approximation to the null distribution of the weighted GSMA statistic, examine the limiting distribution of the GSMA statistics under the order statistic formulation, and quantify the relevance of the pairwise correlations of the GSMA statistics across different bins on this limiting distribution. We also remark on aggregate criteria and multiple testing for determining significance of GSMA results. Conclusion Theoretical considerations detailed herein can lead to clarification and simplification of testing criteria for generalizations of the GSMA statistic. PMID:15717930
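
    A minimal sketch of the weighted GSMA computation with a within-study permutation null (toy scores; the weights are invented, and the aggregate and multiple-testing criteria discussed above are not reproduced):

        # Toy weighted GSMA: rank bins within studies, sum weighted ranks per bin,
        # and compare against a within-study permutation null.
        import numpy as np

        rng = np.random.default_rng(2)
        n_studies, n_bins, n_perm = 5, 120, 2000
        linkage = rng.normal(size=(n_studies, n_bins))   # per-study linkage scores
        weights = np.array([1.0, 0.8, 1.2, 1.0, 0.6])    # study weights (invented)

        ranks = linkage.argsort(axis=1).argsort(axis=1) + 1
        observed = (weights[:, None] * ranks).sum(axis=0)

        null = np.empty((n_perm, n_bins))
        for b in range(n_perm):
            perm = np.array([rng.permutation(row) for row in ranks])
            null[b] = (weights[:, None] * perm).sum(axis=0)
        p = (null >= observed).mean(axis=0)   # pointwise permutation p-values
        print(p.min())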

  19. Statistical genetics concepts and approaches in schizophrenia and related neuropsychiatric research.

    PubMed

    Schork, Nicholas J; Greenwood, Tiffany A; Braff, David L

    2007-01-01

    Statistical genetics is a research field that focuses on mathematical models and statistical inference methodologies that relate genetic variations (ie, naturally occurring human DNA sequence variations or "polymorphisms") to particular traits or diseases (phenotypes) usually from data collected on large samples of families or individuals. The ultimate goal of such analysis is the identification of genes and genetic variations that influence disease susceptibility. Although of extreme interest and importance, the fact that many genes and environmental factors contribute to neuropsychiatric diseases of public health importance (eg, schizophrenia, bipolar disorder, and depression) complicates relevant studies and suggests that very sophisticated mathematical and statistical modeling may be required. In addition, large-scale contemporary human DNA sequencing and related projects, such as the Human Genome Project and the International HapMap Project, as well as the development of high-throughput DNA sequencing and genotyping technologies have provided statistical geneticists with a great deal of very relevant and appropriate information and resources. Unfortunately, the use of these resources and their interpretation are not straightforward when applied to complex, multifactorial diseases such as schizophrenia. In this brief and largely nonmathematical review of the field of statistical genetics, we describe many of the main concepts, definitions, and issues that motivate contemporary research. We also provide a discussion of the most pressing contemporary problems that demand further research if progress is to be made in the identification of genes and genetic variations that predispose to complex neuropsychiatric diseases.

  20. Assessing Attitudes towards Statistics among Medical Students: Psychometric Properties of the Serbian Version of the Survey of Attitudes Towards Statistics (SATS)

    PubMed Central

    Stanisavljevic, Dejana; Trajkovic, Goran; Marinkovic, Jelena; Bukumiric, Zoran; Cirkovic, Andja; Milic, Natasa

    2014-01-01

    Background Medical statistics has become important and relevant for future doctors, enabling them to practice evidence-based medicine. Recent studies report that students’ attitudes towards statistics play an important role in their statistics achievements. The aim of the study was to test the psychometric properties of the Serbian version of the Survey of Attitudes Towards Statistics (SATS) in order to acquire a valid instrument to measure attitudes inside the Serbian educational context. Methods The validation study was performed on a cohort of 417 medical students who were enrolled in an obligatory introductory statistics course. The SATS adaptation was based on an internationally accepted methodology for translation and cultural adaptation. Psychometric properties of the Serbian version of the SATS were analyzed through the examination of factorial structure and internal consistency. Results Most medical students held positive attitudes towards statistics. The average total SATS score was above neutral (4.3±0.8), and varied from 1.9 to 6.2. Confirmatory factor analysis validated the six-factor structure of the questionnaire (Affect, Cognitive Competence, Value, Difficulty, Interest and Effort). Values for fit indices TLI (0.940) and CFI (0.961) were above the cut-off of ≥0.90. The RMSEA value of 0.064 (0.051–0.078) was below the suggested value of ≤0.08. Cronbach’s alpha of the entire scale was 0.90, indicating scale reliability. In a multivariate regression model, self-rating of ability in mathematics and current grade point average were significantly associated with the total SATS score after adjusting for age and gender. Conclusion The present study provided evidence for the appropriate metric properties of the Serbian version of SATS. Confirmatory factor analysis validated the six-factor structure of the scale. The SATS might be a reliable and valid instrument for identifying medical students’ attitudes towards statistics in the Serbian educational context. PMID:25405489

  1. Assessing attitudes towards statistics among medical students: psychometric properties of the Serbian version of the Survey of Attitudes Towards Statistics (SATS).

    PubMed

    Stanisavljevic, Dejana; Trajkovic, Goran; Marinkovic, Jelena; Bukumiric, Zoran; Cirkovic, Andja; Milic, Natasa

    2014-01-01

    Medical statistics has become important and relevant for future doctors, enabling them to practice evidence-based medicine. Recent studies report that students' attitudes towards statistics play an important role in their statistics achievements. The aim of the study was to test the psychometric properties of the Serbian version of the Survey of Attitudes Towards Statistics (SATS) in order to acquire a valid instrument to measure attitudes inside the Serbian educational context. The validation study was performed on a cohort of 417 medical students who were enrolled in an obligatory introductory statistics course. The SATS adaptation was based on an internationally accepted methodology for translation and cultural adaptation. Psychometric properties of the Serbian version of the SATS were analyzed through the examination of factorial structure and internal consistency. Most medical students held positive attitudes towards statistics. The average total SATS score was above neutral (4.3±0.8), and varied from 1.9 to 6.2. Confirmatory factor analysis validated the six-factor structure of the questionnaire (Affect, Cognitive Competence, Value, Difficulty, Interest and Effort). Values for fit indices TLI (0.940) and CFI (0.961) were above the cut-off of ≥0.90. The RMSEA value of 0.064 (0.051-0.078) was below the suggested value of ≤0.08. Cronbach's alpha of the entire scale was 0.90, indicating scale reliability. In a multivariate regression model, self-rating of ability in mathematics and current grade point average were significantly associated with the total SATS score after adjusting for age and gender. The present study provided evidence for the appropriate metric properties of the Serbian version of SATS. Confirmatory factor analysis validated the six-factor structure of the scale. The SATS might be a reliable and valid instrument for identifying medical students' attitudes towards statistics in the Serbian educational context.
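
    The reported reliability coefficient is straightforward to compute. A Python sketch of Cronbach's alpha on toy item responses (six correlated items standing in for a SATS subscale; the data are synthetic):

        # Cronbach's alpha on synthetic item responses.
        import numpy as np

        def cronbach_alpha(items):           # items: respondents x items
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_vars / total_var)

        rng = np.random.default_rng(3)
        latent = rng.normal(size=(100, 1))                 # one underlying attitude
        responses = latent + rng.normal(0, 0.7, (100, 6))  # six noisy items
        print(round(cronbach_alpha(responses), 2))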

  2. Relating Climate Change Risks to Water Supply Planning Assumptions: Recent Applications by the U.S. Bureau of Reclamation (Invited)

    NASA Astrophysics Data System (ADS)

    Brekke, L. D.

    2009-12-01

    Presentation highlights recent methods carried out by Reclamation to incorporate climate change and variability information into water supply assumptions for longer-term planning. Presentation also highlights limitations of these methods, and possible method adjustments that might be made to address these limitations. Reclamation was established more than one hundred years ago with a mission centered on the construction of irrigation and hydropower projects in the Western United States. Reclamation’s mission has evolved since its creation to include other activities, including municipal and industrial water supply projects, ecosystem restoration, and the protection and management of water supplies. Reclamation continues to explore ways to better address mission objectives, often considering proposals to develop new infrastructure and/or modify long-term criteria for operations. Such studies typically feature operations analysis to disclose benefits and effects of a given proposal, which are sensitive to assumptions made about future water supplies, water demands, and operating constraints. Development of these assumptions requires consideration of more fundamental future drivers such as land use, demographics, and climate. On the matter of establishing planning assumptions for water supplies under climate change, Reclamation has applied several methods. This presentation highlights two activities: the first focuses on potential changes in hydroclimate frequencies, and the second on potential changes in hydroclimate period-statistics. The first activity took place in the Colorado River Basin where there was interest in the interarrival possibilities of drought and surplus events of varying severity relevant to proposals on new criteria for handling lower basin shortages. The second activity occurred in California’s Central Valley where stakeholders were interested in how projected climate change possibilities translated into changes in hydrologic and water supply statistics relevant to a long-term federal Endangered Species Act consultation. Projected climate change possibilities were characterized by surveying a large ensemble of climate projections for change in period climate-statistics and then selecting a small set of projections featuring a bracketing set of period-changes relative to those from the complete ensemble. Although both methods served the needs of their respective planning activities, each has limited applicability for other planning activities. First, each method addresses only one climate change aspect and not the other. Some planning activities may need to consider potential changes in both period-statistics and frequencies. Second, neither method addresses CMIP3 projected changes in climate variability. The first method bases frequency possibilities on historical information while the second method only surveys CMIP3 projections for change in period-mean and then superimposes those changes on historical variability. Third, artifacts of CMIP3 design lead to interpretation challenges when implementing the second method (e.g., inconsistent projection initialization, model-dependent expressions of multi-decadal variability). Presentation summarizes these issues and also potential method adjustments to address them when defining planning assumptions for water supplies.

  3. Perception of randomness: On the time of streaks.

    PubMed

    Sun, Yanlong; Wang, Hongbin

    2010-12-01

    People tend to think that streaks in random sequential events are rare and remarkable. When they actually encounter streaks, they tend to consider the underlying process as non-random. The present paper examines the time of pattern occurrences in sequences of Bernoulli trials, and shows that among all patterns of the same length, a streak is the most delayed pattern for its first occurrence. It is argued that when time is of the essence, how often a pattern is to occur (mean time, or frequency) and when a pattern is to first occur (waiting time) are different questions and bear different psychological relevance. The waiting time statistics may provide a quantitative measure of the psychological distance when people are expecting a probabilistic event, and such a measure is consistent with both the representativeness and availability heuristics in people's perception of randomness. We discuss some of the recent empirical findings and suggest that people's judgment and generation of random sequences may be guided by their actual experiences of the waiting time statistics. Published by Elsevier Inc.
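
    The waiting-time asymmetry is easy to verify by simulation. A short Python sketch comparing the mean time to the first occurrence of a streak (HHH) with a non-streak pattern of the same length (HTH) in fair coin tosses; theory gives 14 versus 10 tosses.

        # Mean waiting time to the first occurrence of a pattern in coin tosses.
        import random

        def first_occurrence(pattern, rng):
            seq = ""
            while True:
                seq += rng.choice("HT")
                if seq.endswith(pattern):
                    return len(seq)

        rng = random.Random(0)
        for pat in ("HHH", "HTH"):
            waits = [first_occurrence(pat, rng) for _ in range(20_000)]
            print(pat, sum(waits) / len(waits))   # theory: HHH = 14, HTH = 10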

  4. Translating statistical species-habitat models to interactive decision support tools

    USGS Publications Warehouse

    Wszola, Lyndsie S.; Simonsen, Victoria L.; Stuber, Erica F.; Gillespie, Caitlyn R.; Messinger, Lindsey N.; Decker, Karie L.; Lusk, Jeffrey J.; Jorgensen, Christopher F.; Bishop, Andrew A.; Fontaine, Joseph J.

    2017-01-01

    Understanding species-habitat relationships is vital to successful conservation, but the tools used to communicate species-habitat relationships are often poorly suited to the information needs of conservation practitioners. Here we present a novel method for translating a statistical species-habitat model, a regression analysis relating ring-necked pheasant abundance to landcover, into an interactive online tool. The Pheasant Habitat Simulator combines the analytical power of the R programming environment with the user-friendly Shiny web interface to create an online platform in which wildlife professionals can explore the effects of variation in local landcover on relative pheasant habitat suitability within spatial scales relevant to individual wildlife managers. Our tool allows users to virtually manipulate the landcover composition of a simulated space to explore how changes in landcover may affect pheasant relative habitat suitability, and guides users through the economic tradeoffs of landscape changes. We offer suggestions for development of similar interactive applications and demonstrate their potential as innovative science delivery tools for diverse professional and public audiences.

  5. Extracting semantic representations from word co-occurrence statistics: stop-lists, stemming, and SVD.

    PubMed

    Bullinaria, John A; Levy, Joseph P

    2012-09-01

    In a previous article, we presented a systematic computational study of the extraction of semantic representations from the word-word co-occurrence statistics of large text corpora. The conclusion was that semantic vectors of pointwise mutual information values from very small co-occurrence windows, together with a cosine distance measure, consistently resulted in the best representations across a range of psychologically relevant semantic tasks. This article extends that study by investigating the use of three further factors--namely, the application of stop-lists, word stemming, and dimensionality reduction using singular value decomposition (SVD)--that have been used to provide improved performance elsewhere. It also introduces an additional semantic task and explores the advantages of using a much larger corpus. This leads to the discovery and analysis of improved SVD-based methods for generating semantic representations (that provide new state-of-the-art performance on a standard TOEFL task) and the identification and discussion of problems and misleading results that can arise without a full systematic study.
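
    A condensed sketch of the studied pipeline on a toy co-occurrence matrix: positive pointwise mutual information vectors, optional SVD reduction, and cosine comparison (the counts are invented).

        # PPMI + SVD semantic vectors from a toy co-occurrence matrix.
        import numpy as np

        counts = np.array([[10., 2., 0.],
                           [ 2., 8., 1.],
                           [ 0., 1., 6.]])
        p_ij = counts / counts.sum()
        p_i = p_ij.sum(axis=1, keepdims=True)
        p_j = p_ij.sum(axis=0, keepdims=True)
        with np.errstate(divide="ignore"):
            pmi = np.log(p_ij / (p_i * p_j))
        ppmi = np.where(np.isfinite(pmi) & (pmi > 0), pmi, 0.0)

        u, s, vt = np.linalg.svd(ppmi)     # dimensionality reduction via SVD
        vecs = u[:, :2] * s[:2]            # keep the top two dimensions

        def cosine(a, b):
            return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

        print(cosine(vecs[0], vecs[1]))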

  6. Stochastic Spatial Models in Ecology: A Statistical Physics Approach

    NASA Astrophysics Data System (ADS)

    Pigolotti, Simone; Cencini, Massimo; Molina, Daniel; Muñoz, Miguel A.

    2018-07-01

    Ecosystems display a complex spatial organization. Ecologists have long tried to characterize them by looking at how different measures of biodiversity change across spatial scales. Ecological neutral theory has provided simple predictions accounting for general empirical patterns in communities of competing species. However, while neutral theory in well-mixed ecosystems is mathematically well understood, spatial models still present several open problems, limiting the quantitative understanding of spatial biodiversity. In this review, we discuss the state of the art in spatial neutral theory. We emphasize the connection between spatial ecological models and the physics of non-equilibrium phase transitions and how concepts developed in statistical physics translate in population dynamics, and vice versa. We focus on non-trivial scaling laws arising at the critical dimension D = 2 of spatial neutral models, and their relevance for biological populations inhabiting two-dimensional environments. We conclude by discussing models incorporating non-neutral effects in the form of spatial and temporal disorder, and analyze how their predictions deviate from those of purely neutral theories.

  7. Linear regression analysis: part 14 of a series on evaluation of scientific publications.

    PubMed

    Schneider, Astrid; Hommel, Gerhard; Blettner, Maria

    2010-11-01

    Regression analysis is an important statistical method for the analysis of medical data. It enables the identification and characterization of relationships among multiple factors. It also enables the identification of prognostically relevant risk factors and the calculation of risk scores for individual prognostication. This article is based on selected textbooks of statistics, a selective review of the literature, and our own experience. After a brief introduction of the uni- and multivariable regression models, illustrative examples are given to explain what the important considerations are before a regression analysis is performed, and how the results should be interpreted. The reader should then be able to judge whether the method has been used correctly and interpret the results appropriately. The performance and interpretation of linear regression analysis are subject to a variety of pitfalls, which are discussed here in detail. The reader is made aware of common errors of interpretation through practical examples. Both the opportunities for applying linear regression analysis and its limitations are presented.

  8. [The Revision and 5th Edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5): Consequences for the Diagnostic Work with Children and Adolescents].

    PubMed

    Zulauf Logoz, Marina

    2014-01-01

    The present paper describes and discusses the major revisions in DSM-5 for children and adolescents. A major modification is that the separate chapter for disorders first diagnosed in childhood and adolescence was abandoned in favour of the integration of these clinical pictures into the relevant disorder-specific chapters. Several new diagnoses and diagnostic groups were introduced: "disruptive mood dysregulation disorder" is a new diagnosis; the different diagnoses for autism were brought together into one, and a new diagnostic group for obsessive-compulsive disorders has been established. The developmental approach of DSM-5 and the integration of dimensional assessment tools are to be welcomed. Practice will show whether the critics who fear possible increases in prevalence or those who approve of the changes will end up being right.

  9. Data on xylem sap proteins from Mn- and Fe-deficient tomato plants obtained using shotgun proteomics.

    PubMed

    Ceballos-Laita, Laura; Gutierrez-Carbonell, Elain; Takahashi, Daisuke; Abadía, Anunciación; Uemura, Matsuo; Abadía, Javier; López-Millán, Ana Flor

    2018-04-01

    This article contains consolidated proteomic data obtained from xylem sap collected from tomato plants grown under Fe- and Mn-sufficient control conditions, as well as under Fe-deficient and Mn-deficient conditions. Data presented here cover proteins identified and quantified by shotgun proteomics and Progenesis LC-MS analyses: proteins identified with at least two peptides and showing changes that are statistically significant (ANOVA; p ≤ 0.05) and above a biologically relevant selected threshold (fold ≥ 2) between treatments are listed. The comparison between Fe-deficient, Mn-deficient and control xylem sap samples using a multivariate statistical data analysis (Principal Component Analysis, PCA) is also included. Data included in this article are discussed in depth in the research article entitled "Effects of Fe and Mn deficiencies on the protein profiles of tomato (Solanum lycopersicum) xylem sap as revealed by shotgun analyses" [1]. This dataset is made available to support the cited study, as well as to extend analyses at a later stage.
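
    A sketch of the two-criterion selection described above, with hypothetical column names and toy values: proteins are retained only if the ANOVA p-value is at most 0.05 and the absolute fold change is at least 2.

        # Hypothetical two-criterion protein filter (toy data, invented columns).
        import pandas as pd

        df = pd.DataFrame({
            "protein":     ["P1",  "P2",  "P3"],
            "anova_p":     [0.01,  0.20,  0.03],
            "fold_change": [2.5,   3.0,   1.2],
        })
        hits = df[(df["anova_p"] <= 0.05) & (df["fold_change"].abs() >= 2)]
        print(hits)   # only P1 passes both criteria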

  10. Translating statistical species-habitat models to interactive decision support tools.

    PubMed

    Wszola, Lyndsie S; Simonsen, Victoria L; Stuber, Erica F; Gillespie, Caitlyn R; Messinger, Lindsey N; Decker, Karie L; Lusk, Jeffrey J; Jorgensen, Christopher F; Bishop, Andrew A; Fontaine, Joseph J

    2017-01-01

    Understanding species-habitat relationships is vital to successful conservation, but the tools used to communicate species-habitat relationships are often poorly suited to the information needs of conservation practitioners. Here we present a novel method for translating a statistical species-habitat model, a regression analysis relating ring-necked pheasant abundance to landcover, into an interactive online tool. The Pheasant Habitat Simulator combines the analytical power of the R programming environment with the user-friendly Shiny web interface to create an online platform in which wildlife professionals can explore the effects of variation in local landcover on relative pheasant habitat suitability within spatial scales relevant to individual wildlife managers. Our tool allows users to virtually manipulate the landcover composition of a simulated space to explore how changes in landcover may affect pheasant relative habitat suitability, and guides users through the economic tradeoffs of landscape changes. We offer suggestions for development of similar interactive applications and demonstrate their potential as innovative science delivery tools for diverse professional and public audiences.
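
    A hypothetical Python sketch of the translation step (model form, covariates, and coefficients are invented, and the actual tool is an R/Shiny application): a fitted abundance model is wrapped in a function that an interactive interface can call each time a user edits the simulated landcover.

        # Hypothetical abundance model behind an interactive habitat tool.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(7)
        grass = rng.uniform(0, 1, 300)      # invented landcover fractions
        crop = rng.uniform(0, 1, 300)
        counts = rng.poisson(np.exp(0.5 + 1.2 * grass - 0.8 * crop))

        X = sm.add_constant(np.column_stack([grass, crop]))
        model = sm.GLM(counts, X, family=sm.families.Poisson()).fit()

        def predicted_abundance(grass_frac, crop_frac):
            """What a slider-driven interface would call on each user edit."""
            return float(model.predict([[1.0, grass_frac, crop_frac]])[0])

        print(predicted_abundance(0.6, 0.2))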

  11. Translating statistical species-habitat models to interactive decision support tools

    PubMed Central

    Simonsen, Victoria L.; Stuber, Erica F.; Gillespie, Caitlyn R.; Messinger, Lindsey N.; Decker, Karie L.; Lusk, Jeffrey J.; Jorgensen, Christopher F.; Bishop, Andrew A.; Fontaine, Joseph J.

    2017-01-01

    Understanding species-habitat relationships is vital to successful conservation, but the tools used to communicate species-habitat relationships are often poorly suited to the information needs of conservation practitioners. Here we present a novel method for translating a statistical species-habitat model, a regression analysis relating ring-necked pheasant abundance to landcover, into an interactive online tool. The Pheasant Habitat Simulator combines the analytical power of the R programming environment with the user-friendly Shiny web interface to create an online platform in which wildlife professionals can explore the effects of variation in local landcover on relative pheasant habitat suitability within spatial scales relevant to individual wildlife managers. Our tool allows users to virtually manipulate the landcover composition of a simulated space to explore how changes in landcover may affect pheasant relative habitat suitability, and guides users through the economic tradeoffs of landscape changes. We offer suggestions for development of similar interactive applications and demonstrate their potential as innovative science delivery tools for diverse professional and public audiences. PMID:29236707

  12. Stochastic Spatial Models in Ecology: A Statistical Physics Approach

    NASA Astrophysics Data System (ADS)

    Pigolotti, Simone; Cencini, Massimo; Molina, Daniel; Muñoz, Miguel A.

    2017-11-01

    Ecosystems display a complex spatial organization. Ecologists have long tried to characterize them by looking at how different measures of biodiversity change across spatial scales. Ecological neutral theory has provided simple predictions accounting for general empirical patterns in communities of competing species. However, while neutral theory in well-mixed ecosystems is mathematically well understood, spatial models still present several open problems, limiting the quantitative understanding of spatial biodiversity. In this review, we discuss the state of the art in spatial neutral theory. We emphasize the connection between spatial ecological models and the physics of non-equilibrium phase transitions and how concepts developed in statistical physics translate in population dynamics, and vice versa. We focus on non-trivial scaling laws arising at the critical dimension D = 2 of spatial neutral models, and their relevance for biological populations inhabiting two-dimensional environments. We conclude by discussing models incorporating non-neutral effects in the form of spatial and temporal disorder, and analyze how their predictions deviate from those of purely neutral theories.

  13. minet: A R/Bioconductor package for inferring large transcriptional networks using mutual information.

    PubMed

    Meyer, Patrick E; Lafitte, Frédéric; Bontempi, Gianluca

    2008-10-29

    This paper presents the R/Bioconductor package minet (version 1.1.6), which provides a set of functions to infer mutual information networks from a dataset. Once fed with a microarray dataset, the package returns a network where nodes denote genes, edges model statistical dependencies between genes, and the weight of an edge quantifies the statistical evidence of a specific (e.g. transcriptional) gene-to-gene interaction. Four different entropy estimators are made available in the package minet (empirical, Miller-Madow, Schurmann-Grassberger and shrink) as well as four different inference methods, namely relevance networks, ARACNE, CLR and MRNET. Also, the package integrates accuracy assessment tools, like F-scores, PR-curves and ROC-curves, in order to compare the inferred network with a reference one. The package minet provides a series of tools for inferring transcriptional networks from microarray data. It is freely available from the Comprehensive R Archive Network (CRAN) as well as from the Bioconductor website.
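
    The core computation behind such packages can be sketched without the minet API itself: a pairwise mutual information matrix from discretized expression profiles (toy data with equal-frequency binning; plain Python/scikit-learn, not the package's estimators).

        # Pairwise mutual information matrix from discretized expression data.
        import numpy as np
        from sklearn.metrics import mutual_info_score

        rng = np.random.default_rng(4)
        expr = rng.normal(size=(200, 5))      # samples x genes (toy data)
        expr[:, 1] += expr[:, 0]              # make genes 0 and 1 dependent

        qs = np.quantile(expr, [0.25, 0.5, 0.75], axis=0)
        disc = np.array([np.digitize(expr[:, g], qs[:, g]) for g in range(5)]).T

        mim = np.zeros((5, 5))
        for i in range(5):
            for j in range(5):
                mim[i, j] = mutual_info_score(disc[:, i], disc[:, j])
        print(mim.round(2))   # the large off-diagonal entry links genes 0 and 1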

  14. A Bayesian statistical analysis of mouse dermal tumor promotion assay data for evaluating cigarette smoke condensate.

    PubMed

    Kathman, Steven J; Potts, Ryan J; Ayres, Paul H; Harp, Paul R; Wilson, Cody L; Garner, Charles D

    2010-10-01

    The mouse dermal assay has long been used to assess the dermal tumorigenicity of cigarette smoke condensate (CSC). This mouse skin model has been developed for use in carcinogenicity testing utilizing the SENCAR mouse as the standard strain. Though the model has limitations, it remains the most relevant method available to study the dermal tumor promoting potential of mainstream cigarette smoke. In the typical SENCAR mouse CSC bioassay, CSC is applied for 29 weeks following the application of a tumor initiator such as 7,12-dimethylbenz[a]anthracene (DMBA). Several endpoints are considered for analysis, including the percentage of animals with at least one mass, latency, and the number of masses per animal. In this paper, a relatively straightforward analytic model and procedure are presented for analyzing the time course of the incidence of masses. The procedure considered here takes advantage of Bayesian statistical techniques, which provide powerful methods for model fitting and simulation. Two datasets are analyzed to illustrate how the model fits the data, how well the model may perform in predicting data from such trials, and how the model may be used as a decision tool when comparing the dermal tumorigenicity of cigarette smoke condensate from multiple cigarette types. The analysis presented here was developed as a statistical decision tool for differentiating between two or more prototype products based on dermal tumorigenicity. Copyright (c) 2010 Elsevier Inc. All rights reserved.
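
    A minimal sketch of the Bayesian flavor of such a decision tool (a conjugate Beta-Binomial model on a single endpoint, with invented counts; the paper's actual model addresses the full time course of mass incidence).

        # Beta-Binomial comparison of two products on one endpoint (toy counts).
        import numpy as np

        rng = np.random.default_rng(5)
        a_masses, a_n = 12, 40    # animals with >= 1 mass / group size (invented)
        b_masses, b_n = 21, 40

        post_a = rng.beta(1 + a_masses, 1 + a_n - a_masses, 100_000)  # Beta(1,1) prior
        post_b = rng.beta(1 + b_masses, 1 + b_n - b_masses, 100_000)
        print(f"P(p_A < p_B | data) = {(post_a < post_b).mean():.3f}")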

  15. Transportation Energy Data Book: Edition 28

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Stacy Cagle; Diegel, Susan W; Boundy, Robert Gary

    2009-06-01

    The Transportation Energy Data Book: Edition 28 is a statistical compendium prepared and published by Oak Ridge National Laboratory (ORNL) under contract with the U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy, Vehicle Technologies Program and the Hydrogen, Fuel Cells, and Infrastructure Technologies Program. Designed for use as a desk-top reference, the data book represents an assembly and display of statistics and information that characterize transportation activity, and presents data on other factors that influence transportation energy use. The purpose of this document is to present relevant statistical data in the form of tables and graphs. The latest edition of the Data Book is available to a larger audience via the Internet (cta.ornl.gov/data). This edition of the Data Book has 12 chapters which focus on various aspects of the transportation industry. Chapter 1 focuses on petroleum; Chapter 2 energy; Chapter 3 highway vehicles; Chapter 4 light vehicles; Chapter 5 heavy vehicles; Chapter 6 alternative fuel vehicles; Chapter 7 fleet vehicles; Chapter 8 household vehicles; Chapter 9 nonhighway modes; Chapter 10 transportation and the economy; Chapter 11 greenhouse gas emissions; and Chapter 12 criteria pollutant emissions. The sources used represent the latest available data. There are also three appendices which include detailed source information for some tables, measures of conversion, and the definition of Census divisions and regions. A glossary of terms and a title index are also included for the reader's convenience.

  16. Transportation Energy Data Book: Edition 27

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Stacy Cagle; Diegel, Susan W; Boundy, Robert Gary

    2008-06-01

    The Transportation Energy Data Book: Edition 27 is a statistical compendium prepared and published by Oak Ridge National Laboratory (ORNL) under contract with the Office of Planning, Budget Formulation, and Analysis, under the Energy Efficiency and Renewable Energy (EERE) program in the Department of Energy (DOE). Designed for use as a desk-top reference, the data book represents an assembly and display of statistics and information that characterize transportation activity, and presents data on other factors that influence transportation energy use. The purpose of this document is to present relevant statistical data in the form of tables and graphs. The latest editions of the Data Book are available to a larger audience via the Internet (cta.ornl.gov/data). This edition of the Data Book has 12 chapters which focus on various aspects of the transportation industry. Chapter 1 focuses on petroleum; Chapter 2 energy; Chapter 3 highway vehicles; Chapter 4 light vehicles; Chapter 5 heavy vehicles; Chapter 6 alternative fuel vehicles; Chapter 7 fleet vehicles; Chapter 8 household vehicles; Chapter 9 nonhighway modes; Chapter 10 transportation and the economy; Chapter 11 greenhouse gas emissions; and Chapter 12 criteria pollutant emissions. The sources used represent the latest available data. There are also three appendices which include detailed source information for some tables, measures of conversion, and the definition of Census divisions and regions. A glossary of terms and a title index are also included for the reader's convenience.

  17. BIOREL: the benchmark resource to estimate the relevance of the gene networks.

    PubMed

    Antonov, Alexey V; Mewes, Hans W

    2006-02-06

    The progress of high-throughput methodologies in functional genomics has led to the development of statistical procedures to infer gene networks from various types of high-throughput data. However, due to the lack of common standards, the biological significance of the results of the different studies is hard to compare. To overcome this problem we propose a benchmark procedure and have developed a web resource (BIOREL), which is useful for estimating the biological relevance of any genetic network by integrating different sources of biological information. The associations of each gene from the network are classified as biologically relevant or not. The proportion of genes in the network classified as "relevant" is used as the overall network relevance score. Employing synthetic data we demonstrated that such a score ranks networks fairly with respect to their relevance level. Using BIOREL as the benchmark resource we compared the quality of experimental and theoretically predicted protein interaction data.

  18. Simulation and analysis of scalable non-Gaussian statistically anisotropic random functions

    NASA Astrophysics Data System (ADS)

    Riva, Monica; Panzeri, Marco; Guadagnini, Alberto; Neuman, Shlomo P.

    2015-12-01

    Many earth and environmental (as well as other) variables, Y, and their spatial or temporal increments, ΔY, exhibit non-Gaussian statistical scaling. Previously we were able to capture some key aspects of such scaling by treating Y or ΔY as standard sub-Gaussian random functions. We were however unable to reconcile two seemingly contradictory observations, namely that whereas sample frequency distributions of Y (or its logarithm) exhibit relatively mild non-Gaussian peaks and tails, those of ΔY display peaks that grow sharper and tails that become heavier with decreasing separation distance or lag. Recently we overcame this difficulty by developing a new generalized sub-Gaussian model which captures both behaviors in a unified and consistent manner, exploring it on synthetically generated random functions in one dimension (Riva et al., 2015). Here we extend our generalized sub-Gaussian model to multiple dimensions, present an algorithm to generate corresponding random realizations of statistically isotropic or anisotropic sub-Gaussian functions and illustrate it in two dimensions. We demonstrate the accuracy of our algorithm by comparing ensemble statistics of Y and ΔY (such as, mean, variance, variogram and probability density function) with those of Monte Carlo generated realizations. We end by exploring the feasibility of estimating all relevant parameters of our model by analyzing jointly spatial moments of Y and ΔY obtained from a single realization of Y.

  19. The Use of Cronbach's Alpha When Developing and Reporting Research Instruments in Science Education

    NASA Astrophysics Data System (ADS)

    Taber, Keith S.

    2017-06-01

    Cronbach's alpha is a statistic commonly quoted by authors to demonstrate that tests and scales that have been constructed or adopted for research projects are fit for purpose. Cronbach's alpha is regularly adopted in studies in science education: it was referred to in 69 different papers published in 4 leading science education journals in a single year (2015)—usually as a measure of reliability. This article explores how this statistic is used in reporting science education research and what it represents. Authors often cite alpha values with little commentary to explain why they feel this statistic is relevant and seldom interpret the result for readers beyond citing an arbitrary threshold for an acceptable value. Those authors who do offer readers qualitative descriptors interpreting alpha values adopt a diverse and seemingly arbitrary terminology. More seriously, illustrative examples from the science education literature demonstrate that alpha may be acceptable even when there are recognised problems with the scales concerned. Alpha is also sometimes inappropriately used to claim an instrument is unidimensional. It is argued that a high value of alpha offers limited evidence of the reliability of a research instrument, and that indeed a very high value may actually be undesirable when developing a test of scientific knowledge or understanding. Guidance is offered to authors reporting, and readers evaluating, studies that present Cronbach's alpha statistic as evidence of instrument quality.
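
    For reference, Cronbach's alpha for a k-item scale is alpha = k/(k-1) x (1 - (sum of item variances)/(variance of the total score)). A minimal self-contained helper (our own illustration, not taken from the article):

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """items: (n_respondents, k_items) array of item scores.
        alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)        # per-item variances
        total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed score
        return k / (k - 1) * (1 - item_vars.sum() / total_var)
    ```

    As the article stresses, a high value from such a computation is weak evidence of reliability on its own and says nothing about unidimensionality.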

  20. Information Entropy Production of Maximum Entropy Markov Chains from Spike Trains

    NASA Astrophysics Data System (ADS)

    Cofré, Rodrigo; Maldonado, Cesar

    2018-01-01

    We consider the maximum entropy Markov chain inference approach to characterize the collective statistics of neuronal spike trains, focusing on the statistical properties of the inferred model. We review large deviations techniques useful in this context to describe properties of accuracy and convergence in terms of sampling size. We use these results to study the statistical fluctuation of correlations, distinguishability and irreversibility of maximum entropy Markov chains. We illustrate these applications using simple examples where the large deviation rate function is explicitly obtained for maximum entropy models of relevance in this field.
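
    For orientation, the information entropy production rate of a stationary Markov chain with transition matrix P and stationary distribution pi is conventionally written in the standard Schnakenberg form below; this is textbook material added for the reader, not a formula quoted from the paper.

    ```latex
    e_p \;=\; \frac{1}{2}\sum_{i,j}\left(\pi_i P_{ij}-\pi_j P_{ji}\right)
    \ln\frac{\pi_i P_{ij}}{\pi_j P_{ji}} \;\ge\; 0
    ```

    The rate vanishes exactly when detailed balance \pi_i P_{ij} = \pi_j P_{ji} holds, so a positive estimate quantifies the irreversibility discussed in the abstract.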

  1. Some new mathematical methods for variational objective analysis

    NASA Technical Reports Server (NTRS)

    Wahba, Grace; Johnson, Donald R.

    1994-01-01

    Numerous results were obtained relevant to remote sensing, variational objective analysis, and data assimilation. A list of publications relevant in whole or in part is attached. The principal investigator gave many invited lectures, disseminating the results to the meteorological community as well as the statistical community. A list of invited lectures at meetings is attached, as well as a list of departmental colloquia at various universities and institutes.

  2. Statistics for clinical nursing practice: an introduction.

    PubMed

    Rickard, Claire M

    2008-11-01

    Difficulty in understanding statistics is one of the most frequently reported barriers to nurses applying research results in their practice. Yet the amount of nursing research published each year continues to grow, as does the expectation that nurses will undertake practice based on this evidence. Critical care nurses do not need to be statisticians, but they do need to develop a working knowledge of statistics so they can be informed consumers of research and so practice can evolve and improve. For those undertaking a research project, statistical literacy is required to interact with other researchers and statisticians, so as to best design and undertake the project. This article is the first in a series that guides critical care nurses through statistical terms and concepts relevant to their practice.

  3. Techniques for recognizing identity of several response functions from the data of visual inspection

    NASA Astrophysics Data System (ADS)

    Nechval, Nicholas A.

    1996-08-01

    The purpose of this paper is to present some efficient techniques for recognizing from the observed data whether several response functions are identical to each other. For example, in an industrial setting the problem may be to determine whether the production coefficients established in a small-scale pilot study apply to each of several large-scale production facilities. The techniques proposed here combine sensor information from automated visual inspection of manufactured products, which is carried out by means of pixel-by-pixel comparison of the sensed image of the product to be inspected with some reference pattern (or image). Let a1, ..., am be p-dimensional parameters associated with m response models of the same type. This study is concerned with the simultaneous comparison of a1, ..., am. A generalized maximum likelihood ratio (GMLR) test is derived for testing equality of these parameters, where each of the parameters represents a corresponding vector of regression coefficients. The GMLR test reduces to an equivalent test based on a statistic that has an F distribution. The main advantage of the test lies in its relative simplicity and the ease with which it can be applied. Another interesting test for the same problem is an application of Fisher's method of combining independent test statistics, which can be considered as a parallel procedure to the GMLR test. The combination of independent test statistics does not appear to have been used very much in applied statistics. There does, however, seem to be potential data analytic value in techniques for combining distributional assessments in relation to statistically independent samples which are of joint experimental relevance. In addition, a new iterated test for the problem defined above is presented. A rejection of the null hypothesis by this test provides some reason why all the parameters are not equal. A numerical example is discussed in the context of the proposed procedures for hypothesis testing.
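
    The abstract states that the GMLR test reduces to an F statistic. The classical Chow-type construction below is one standard way to build such an F test for the equality of m regression coefficient vectors, offered as an illustrative sketch under the usual assumptions (Gaussian errors, common variance), not as the authors' exact derivation.

    ```python
    import numpy as np
    from scipy import stats

    def equal_coefficients_F(groups):
        """Chow-type F test that m regressions share one coefficient vector.
        groups: list of (X, y) pairs; each X is (n_g, p), y is (n_g,).
        Returns (F, p_value)."""
        def ssr(X, y):
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            r = y - X @ beta
            return float(r @ r)

        m = len(groups)
        p = groups[0][0].shape[1]
        n = sum(X.shape[0] for X, _ in groups)
        ssr_unres = sum(ssr(X, y) for X, y in groups)   # separate fits
        X_all = np.vstack([X for X, _ in groups])       # pooled (restricted) fit
        y_all = np.concatenate([y for _, y in groups])
        ssr_res = ssr(X_all, y_all)
        df1, df2 = (m - 1) * p, n - m * p
        F = ((ssr_res - ssr_unres) / df1) / (ssr_unres / df2)
        return F, stats.f.sf(F, df1, df2)
    ```

    Under the null hypothesis a1 = ... = am, the statistic follows an F distribution with (m-1)p and n-mp degrees of freedom.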

  4. Australasian Resuscitation In Sepsis Evaluation trial statistical analysis plan.

    PubMed

    Delaney, Anthony; Peake, Sandra L; Bellomo, Rinaldo; Cameron, Peter; Holdgate, Anna; Howe, Belinda; Higgins, Alisa; Presneill, Jeffrey; Webb, Steve

    2013-10-01

    The Australasian Resuscitation In Sepsis Evaluation (ARISE) study is an international, multicentre, randomised, controlled trial designed to evaluate the effectiveness of early goal-directed therapy compared with standard care for patients presenting to the ED with severe sepsis. In keeping with current practice, and taking into consideration aspects of trial design and reporting specific to non-pharmacologic interventions, this document outlines the principles and methods for analysing and reporting the trial results. The document is prepared prior to completion of recruitment into the ARISE study, without knowledge of the results of the interim analysis conducted by the data safety and monitoring committee and prior to completion of the two related international studies. The statistical analysis plan was designed by the ARISE chief investigators, and reviewed and approved by the ARISE steering committee. The data collected by the research team, as specified in the study protocol and detailed in the study case report form, were reviewed. Information related to baseline characteristics, characteristics of delivery of the trial interventions, details of resuscitation and other related therapies, and other relevant data are described with appropriate comparisons between groups. The primary, secondary and tertiary outcomes for the study are defined, with description of the planned statistical analyses. A statistical analysis plan was developed, along with a trial profile, mock-up tables and figures. A plan for presenting baseline characteristics, microbiological and antibiotic therapy, details of the interventions, processes of care and concomitant therapies, along with adverse events are described. The primary, secondary and tertiary outcomes are described along with identification of subgroups to be analysed. A statistical analysis plan for the ARISE study has been developed, and is available in the public domain, prior to the completion of recruitment into the study. This will minimise analytic bias and conforms to current best practice in conducting clinical trials. © 2013 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.

  5. Invited Commentary: The Need for Cognitive Science in Methodology.

    PubMed

    Greenland, Sander

    2017-09-15

    There is no complete solution for the problem of abuse of statistics, but methodological training needs to cover cognitive biases and other psychosocial factors affecting inferences. The present paper discusses 3 common cognitive distortions: 1) dichotomania, the compulsion to perceive quantities as dichotomous even when dichotomization is unnecessary and misleading, as in inferences based on whether a P value is "statistically significant"; 2) nullism, the tendency to privilege the hypothesis of no difference or no effect when there is no scientific basis for doing so, as when testing only the null hypothesis; and 3) statistical reification, treating hypothetical data distributions and statistical models as if they reflect known physical laws rather than speculative assumptions for thought experiments. As commonly misused, null-hypothesis significance testing combines these cognitive problems to produce highly distorted interpretation and reporting of study results. Interval estimation has so far proven to be an inadequate solution because it involves dichotomization, an avenue for nullism. Sensitivity and bias analyses have been proposed to address reproducibility problems (Am J Epidemiol. 2017;186(6):646-647); these methods can indeed address reification, but they can also introduce new distortions via misleading specifications for bias parameters. P values can be reframed to lessen distortions by presenting them without reference to a cutoff, providing them for relevant alternatives to the null, and recognizing their dependence on all assumptions used in their computation; they nonetheless require rescaling for measuring evidence. I conclude that methodological development and training should go beyond coverage of mechanistic biases (e.g., confounding, selection bias, measurement error) to cover distortions of conclusions produced by statistical methods and psychosocial forces. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  6. Adaptation in Coding by Large Populations of Neurons in the Retina

    NASA Astrophysics Data System (ADS)

    Ioffe, Mark L.

    A comprehensive theory of neural computation requires an understanding of the statistical properties of the neural population code. The focus of this work is the experimental study and theoretical analysis of the statistical properties of neural activity in the tiger salamander retina. This is an accessible yet complex system, for which we control the visual input and record from a substantial portion (greater than half) of the ganglion cell population generating the spiking output. Our experiments probe adaptation of the retina to visual statistics: a central feature of sensory systems, which have to adjust their limited dynamic range to a far larger space of possible inputs. In Chapter 1 we place our work in context with a brief overview of the relevant background. In Chapter 2 we describe the experimental methodology of recording from 100+ ganglion cells in the tiger salamander retina. In Chapter 3 we first present the measurements of adaptation of individual cells to changes in stimulation statistics and then investigate whether pairwise correlations in fluctuations of ganglion cell activity change across different stimulation conditions. We then transition to a study of the population-level probability distribution of the retinal response captured with maximum-entropy models. Convergence of the model inference is presented in Chapter 4. In Chapter 5 we first test the empirical presence of a phase transition in such models fitting the retinal response to different experimental conditions, and then proceed to develop other characterizations which are sensitive to complexity in the interaction matrix. This includes an analysis of the dynamics of sampling at finite temperature, which demonstrates a range of subtle attractor-like properties in the energy landscape. These are largely conserved when ambient illumination is varied 1000-fold, a result not necessarily apparent from the measured low-order statistics of the distribution. Our results form a consistent picture which is discussed at the end of Chapter 5. We conclude with a few future directions related to this thesis.

  7. An analysis of the positional distribution of DNA motifs in promoter regions and its biological relevance.

    PubMed

    Casimiro, Ana C; Vinga, Susana; Freitas, Ana T; Oliveira, Arlindo L

    2008-02-07

    Motif finding algorithms have steadily improved in their ability to detect patterns in biological sequences with computationally efficient methods. However, the posterior classification of the output still suffers from some limitations, which makes it difficult to assess the biological significance of the motifs found. Previous work has highlighted the existence of positional bias of motifs in the DNA sequences, which might indicate not only that the pattern is important, but also provide hints of the positions where these patterns occur preferentially. We propose to integrate position uniformity tests and over-representation tests to improve the accuracy of the classification of motifs. Using artificial data, we have compared three different statistical tests (Chi-Square, Kolmogorov-Smirnov, and a Chi-Square bootstrap) to assess whether a given motif occurs uniformly in the promoter region of a gene. Using the test that performed best on this dataset, we proceeded to study the positional distribution of several well-known cis-regulatory elements in the promoter sequences of different organisms (S. cerevisiae, H. sapiens, D. melanogaster, E. coli, and several dicotyledonous plants). The results show that position conservation is relevant for the transcriptional machinery. We conclude that many biologically relevant motifs appear heterogeneously distributed in the promoter region of genes, and therefore, that non-uniformity is a good indicator of biological relevance and can be used to complement the over-representation tests commonly used. In this article we present the results obtained for the S. cerevisiae data sets.
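
    As a concrete illustration of the position uniformity testing compared in the article, a Kolmogorov-Smirnov test of motif start positions against the uniform distribution can be run as follows; the positions and promoter length are invented for the example.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical motif start positions within promoters of length L.
    L = 1000
    positions = np.array([112, 130, 95, 141, 988, 120, 107, 133, 99, 125])

    # KS test of the null hypothesis that positions are uniform on [0, L].
    stat, p_value = stats.kstest(positions / L, "uniform")
    print(f"KS statistic = {stat:.3f}, p = {p_value:.4f}")
    # A small p-value suggests positional bias, which the article treats as a
    # complementary indicator of biological relevance.
    ```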

  8. Large eddy simulation of a wing-body junction flow

    NASA Astrophysics Data System (ADS)

    Ryu, Sungmin; Emory, Michael; Campos, Alejandro; Duraisamy, Karthik; Iaccarino, Gianluca

    2014-11-01

    We present numerical simulations of the wing-body junction flow experimentally investigated by Devenport & Simpson (1990). Wall-junction flows are common in engineering applications but relevant flow physics close to the corner region is not well understood. Moreover, performance of turbulence models for the body-junction case is not well characterized. Motivated by the insufficient investigations, we have numerically investigated the case with Reynolds-averaged Naiver-Stokes equation (RANS) and Large Eddy Simulation (LES) approaches. The Vreman model applied for the LES and SST k- ω model for the RANS simulation are validated focusing on the ability to predict turbulence statistics near the junction region. Moreover, a sensitivity study of the form of the Vreman model will also be presented. This work is funded under NASA Cooperative Agreement NNX11AI41A (Technical Monitor Dr. Stephen Woodruff)

  9. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Descriptive and analytical techniques for NASA trend analysis applications are presented in this standard. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. This document should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend analysis is neither a precise term nor a circumscribed methodology: it generally connotes quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this document. The basic ideas needed for qualitative and quantitative assessment of trends along with relevant examples are presented.
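
    A minimal sketch of the model classes the standard calls out (linear, quadratic, and exponential fits to time-series data); the series itself is invented for illustration.

    ```python
    import numpy as np

    # Illustrative time series: t in consecutive reporting periods (made-up data).
    t = np.arange(12, dtype=float)
    y = np.array([3.1, 3.4, 3.3, 3.9, 4.2, 4.8, 5.1, 5.9, 6.4, 7.3, 8.1, 9.2])

    linear = np.polyfit(t, y, 1)             # y ~ a*t + b
    quadratic = np.polyfit(t, y, 2)          # y ~ a*t^2 + b*t + c
    exp_fit = np.polyfit(t, np.log(y), 1)    # log y ~ k*t + log A, i.e. y ~ A*exp(k*t)

    print("linear slope      :", linear[0])
    print("quadratic coeffs  :", quadratic)
    print("exponential rate k:", exp_fit[0])
    ```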

  10. Why Can't We Resolve Recruitment?

    NASA Astrophysics Data System (ADS)

    Ferreira, S. A.; Payne, M. R.; Hátún, H.; MacKenzie, B. R.; Butenschön, M.; Visser, A. W.

    2016-02-01

    During the last century, Johan Hjort's work has lead to signicant advances in explaining anomalous year-classes within sheries science. However, distinguishing between the competing mechanisms of year-class regulation (e.g., food conditions, predation, transport) has proved challenging. We use blue whiting (Micromesistius poutassou) in the North-east Atlantic Ocean as a case study, which, during the late 1990s and early 2000s, generated year-classes up to nearly an order of magnitude higher than those seen before or after. There presently exists no models that can quantify past variations in recruitment for this stock. Using modern stock-statistical and observational tools, we catalog a range of environmentally-driven hypotheses relevant for recruitment of blue whiting, including physical and biogeographic conditions, phenology, parental effects and predation. We have run the analyses to test some hypotheses and results will be presented at the session.

  11. Framing Electronic Medical Records as Polylingual Documents in Query Expansion

    PubMed Central

    Huang, Edward W; Wang, Sheng; Lee, Doris Jung-Lin; Zhang, Runshun; Liu, Baoyan; Zhou, Xuezhong; Zhai, ChengXiang

    2017-01-01

    We present a study of electronic medical record (EMR) retrieval that emulates situations in which a doctor treats a new patient. Given a query consisting of a new patient’s symptoms, the retrieval system returns the set of most relevant records of previously treated patients. However, due to semantic, functional, and treatment synonyms in medical terminology, queries are often incomplete and thus require enhancement. In this paper, we present a topic model that frames symptoms and treatments as separate languages. Our experimental results show that this method improves retrieval performance over several baselines with statistical significance. These baselines include methods used in prior studies as well as state-of-the-art embedding techniques. Finally, we show that our proposed topic model discovers all three types of synonyms to improve medical record retrieval. PMID:29854161

  12. Publications in anesthesia journals: quality and clinical relevance.

    PubMed

    Lauritsen, Jakob; Moller, Ann M

    2004-11-01

    Clinicians performing evidence-based anesthesia rely on anesthesia journals for clinically relevant information. The objective of this study was to analyze the proportion of clinically relevant articles in five high impact anesthesia journals. We evaluated all articles published in Anesthesiology, Anesthesia & Analgesia, British Journal of Anesthesia, Anesthesia, and Acta Anaesthesiologica Scandinavica from January to June, 2000. Articles were assessed and classified according to type, outcome, and design; 1379 articles consisting of 5468 pages were evaluated and categorized. The most common types of article were animal and laboratory research (31.2%) and randomized clinical trial (20.4%). A clinically relevant article was defined as an article that used a statistically valid method and had a clinically relevant end-point. Altogether 18.6% of the pages had as their subject matter clinically relevant trials. We compared the Journal Impact Factor (a measure of the number of citations per article in a journal) and the proportion of clinically relevant pages and found that they were inversely proportional to each other.

  13. Fear and loathing: undergraduate nursing students' experiences of a mandatory course in applied statistics.

    PubMed

    Hagen, Brad; Awosoga, Oluwagbohunmi A; Kellett, Peter; Damgaard, Marie

    2013-04-23

    This article describes the results of a qualitative research study evaluating nursing students' experiences of a mandatory course in applied statistics, and the perceived effectiveness of teaching methods implemented during the course. Fifteen nursing students in the third year of a four-year baccalaureate program in nursing participated in focus groups before and after taking the mandatory course in statistics. The interviews were transcribed and analyzed using content analysis to reveal four major themes: (i) "one of those courses you throw out?," (ii) "numbers and terrifying equations," (iii) "first aid for statistics casualties," and (iv) "re-thinking curriculum." Overall, the data revealed that although nursing students initially enter statistics courses with considerable skepticism, fear, and anxiety, there are a number of concrete actions statistics instructors can take to reduce student fear and increase the perceived relevance of courses in statistics.

  14. Uncovering robust patterns of microRNA co-expression across cancers using Bayesian Relevance Networks

    PubMed Central

    2017-01-01

    Co-expression networks have long been used as a tool for investigating the molecular circuitry governing biological systems. However, most algorithms for constructing co-expression networks were developed in the microarray era, before high-throughput sequencing—with its unique statistical properties—became the norm for expression measurement. Here we develop Bayesian Relevance Networks, an algorithm that uses Bayesian reasoning about expression levels to account for the differing levels of uncertainty in expression measurements between highly- and lowly-expressed entities, and between samples with different sequencing depths. It combines data from groups of samples (e.g., replicates) to estimate group expression levels and confidence ranges. It then computes uncertainty-moderated estimates of cross-group correlations between entities, and uses permutation testing to assess their statistical significance. Using large scale miRNA data from The Cancer Genome Atlas, we show that our Bayesian update of the classical Relevance Networks algorithm provides improved reproducibility in co-expression estimates and lower false discovery rates in the resulting co-expression networks. Software is available at www.perkinslab.ca. PMID:28817636

  15. Uncovering robust patterns of microRNA co-expression across cancers using Bayesian Relevance Networks.

    PubMed

    Ramachandran, Parameswaran; Sánchez-Taltavull, Daniel; Perkins, Theodore J

    2017-01-01

    Co-expression networks have long been used as a tool for investigating the molecular circuitry governing biological systems. However, most algorithms for constructing co-expression networks were developed in the microarray era, before high-throughput sequencing, with its unique statistical properties, became the norm for expression measurement. Here we develop Bayesian Relevance Networks, an algorithm that uses Bayesian reasoning about expression levels to account for the differing levels of uncertainty in expression measurements between highly- and lowly-expressed entities, and between samples with different sequencing depths. It combines data from groups of samples (e.g., replicates) to estimate group expression levels and confidence ranges. It then computes uncertainty-moderated estimates of cross-group correlations between entities, and uses permutation testing to assess their statistical significance. Using large scale miRNA data from The Cancer Genome Atlas, we show that our Bayesian update of the classical Relevance Networks algorithm provides improved reproducibility in co-expression estimates and lower false discovery rates in the resulting co-expression networks. Software is available at www.perkinslab.ca.
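
    A minimal sketch of the permutation-testing step described above, for a single candidate edge; it deliberately omits the Bayesian uncertainty moderation that is the paper's main contribution, and all names are our own.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def permutation_corr_pvalue(x, y, n_perm=10_000):
        """Two-sided permutation p-value for the correlation between two
        expression profiles, in the spirit of relevance-network edge testing."""
        observed = np.corrcoef(x, y)[0, 1]
        count = 0
        for _ in range(n_perm):
            if abs(np.corrcoef(x, rng.permutation(y))[0, 1]) >= abs(observed):
                count += 1
        # Add-one smoothing keeps the estimate away from an impossible p = 0.
        return observed, (count + 1) / (n_perm + 1)
    ```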

  16. Quantifying Trace Amounts of Aggregates in Biopharmaceuticals Using Analytical Ultracentrifugation Sedimentation Velocity: Bayesian Analyses and F Statistics.

    PubMed

    Wafer, Lucas; Kloczewiak, Marek; Luo, Yin

    2016-07-01

    Analytical ultracentrifugation-sedimentation velocity (AUC-SV) is often used to quantify high molar mass species (HMMS) present in biopharmaceuticals. Although these species are often present in trace quantities, they have received significant attention due to their potential immunogenicity. Commonly, AUC-SV data is analyzed as a diffusion-corrected sedimentation coefficient distribution, or c(s), using SEDFIT to numerically solve Lamm-type equations. SEDFIT also utilizes maximum entropy or Tikhonov-Phillips regularization to further allow the user to determine relevant sample information, including the number of species present, their sedimentation coefficients, and their relative abundance. However, this methodology has several, often unstated, limitations, which may impact the final analysis of protein therapeutics. These include regularization-specific effects, artificial "ripple peaks," and spurious shifts in the sedimentation coefficients. In this investigation, we experimentally verified that an explicit Bayesian approach, as implemented in SEDFIT, can largely correct for these effects. Clear guidelines on how to implement this technique and interpret the resulting data, especially for samples containing micro-heterogeneity (e.g., differential glycosylation), are also provided. In addition, we demonstrated how the Bayesian approach can be combined with F statistics to draw more accurate conclusions and rigorously exclude artifactual peaks. Numerous examples with an antibody and an antibody-drug conjugate were used to illustrate the strengths and drawbacks of each technique.

  17. Microbiological contamination of cubicle curtains in an out-patient podiatry clinic

    PubMed Central

    2010-01-01

    Background: Exposure to potential pathogens on contaminated healthcare garments and curtains can occur through direct or indirect contact. This study aimed to identify the microorganisms present on podiatry clinic curtains and measure the contamination before and after a standard hospital laundry process. Method: Baseline swabs were taken to determine colony counts present on cubicle curtains before laundering. Curtains were swabbed again immediately after laundering and at one and three weeks post laundering. Total colony counts were calculated and compared to baseline, with identification of microorganisms. Results: Total colony counts increased very slightly, by 3%, immediately after laundry, which was not statistically significant, and declined significantly (p = 0.0002), by 56%, one week post laundry. Three weeks post laundry, colony counts had increased by 16%; although clinically relevant, this was not statistically significant. The two most frequent microorganisms present throughout were Coagulase Negative Staphylococcus and Micrococcus species. Laundering was not completely effective, as both species demonstrated no significant change following laundry. Conclusion: This work suggests current laundry procedures may not be 100% effective in killing all microorganisms found on curtains, although a delayed decrease in total colony counts was evident. Cubicle curtains may act as a reservoir for microorganisms, creating potential for cross contamination. This highlights the need for additional cleaning methods to decrease the risk of cross infection and the importance of maintaining good hand hygiene. PMID:21087486

  18. Addressing criticisms of existing predictive bias research: cognitive ability test scores still overpredict African Americans' job performance.

    PubMed

    Berry, Christopher M; Zhao, Peng

    2015-01-01

    Predictive bias studies have generally suggested that cognitive ability test scores overpredict job performance of African Americans, meaning these tests are not predictively biased against African Americans. However, at least 2 issues call into question existing over-/underprediction evidence: (a) a bias identified by Aguinis, Culpepper, and Pierce (2010) in the intercept test typically used to assess over-/underprediction and (b) a focus on the level of observed validity instead of operational validity. The present study developed and utilized a method of assessing over-/underprediction that draws on the math of subgroup regression intercept differences, does not rely on the biased intercept test, allows for analysis at the level of operational validity, and can use meta-analytic estimates as input values. Therefore, existing meta-analytic estimates of key parameters, corrected for relevant statistical artifacts, were used to determine whether African American job performance remains overpredicted at the level of operational validity. African American job performance was typically overpredicted by cognitive ability tests across levels of job complexity and across conditions wherein African American and White regression slopes did and did not differ. Because the present study does not rely on the biased intercept test and because appropriate statistical artifact corrections were carried out, the present study's results are not affected by the 2 issues mentioned above. The present study represents strong evidence that cognitive ability tests generally overpredict job performance of African Americans. (c) 2015 APA, all rights reserved.

  19. An analytical poroelastic model for ultrasound elastography imaging of tumors

    NASA Astrophysics Data System (ADS)

    Tauhidul Islam, Md; Chaudhry, Anuj; Unnikrishnan, Ginu; Reddy, J. N.; Righetti, Raffaella

    2018-01-01

    The mechanical behavior of biological tissues has been studied using a number of mechanical models. Due to the relatively high fluid content and mobility, many biological tissues have been modeled as poroelastic materials. Diseases such as cancers are known to alter the poroelastic response of a tissue. Tissue poroelastic properties such as compressibility, interstitial permeability and fluid pressure also play a key role for the assessment of cancer treatments and for improved therapies. At the present time, however, a limited number of poroelastic models for soft tissues are retrievable in the literature, and the ones available are not directly applicable to tumors as they typically refer to uniform tissues. In this paper, we report an analytical poroelastic model for a non-uniform tissue under stress relaxation. Displacement, strain and fluid pressure fields in a cylindrical poroelastic sample containing a cylindrical inclusion during stress relaxation are computed. Finite element simulations are then used to validate the proposed theoretical model. Statistical analysis demonstrates that the proposed analytical model matches the finite element results with less than 0.5% error. The availability of the analytical model and solutions presented in this paper may be useful to estimate diagnostically relevant poroelastic parameters such as interstitial permeability and fluid pressure, and, in general, for a better interpretation of clinically-relevant ultrasound elastography results.

  20. Optical bedside monitoring of cerebral perfusion: technological and methodological advances applied in a study on acute ischemic stroke

    NASA Astrophysics Data System (ADS)

    Steinkellner, Oliver; Gruber, Clemens; Wabnitz, Heidrun; Jelzow, Alexander; Steinbrink, Jens; Fiebach, Jochen B.; MacDonald, Rainer; Obrig, Hellmuth

    2010-11-01

    We present results of a clinical study on bedside perfusion monitoring of the human brain by optical bolus tracking. We measure the kinetics of the contrast agent indocyanine green using time-domain near-IR spectroscopy (tdNIRS) in 10 patients suffering from acute unilateral ischemic stroke. In all patients, a delay of the bolus over the affected when compared to the unaffected hemisphere is found (mean: 1.5 s, range: 0.2 s to 5.2 s). A portable time-domain near-IR reflectometer is optimized and approved for clinical studies. Data analysis based on statistical moments of time-of-flight distributions of diffusely reflected photons enables high sensitivity to intracerebral changes in bolus kinetics. Since the second centralized moment, variance, is preferentially sensitive to deep absorption changes, it provides a suitable representation of the cerebral signals relevant for perfusion monitoring in stroke. We show that variance-based bolus tracking is also less susceptible to motion artifacts, which often occur in severely affected patients. We present data that clearly manifest the applicability of the tdNIRS approach to assess cerebral perfusion in acute stroke patients at the bedside. This may be of high relevance to its introduction as a monitoring tool on stroke units.

  1. Computational and Statistical Models: A Comparison for Policy Modeling of Childhood Obesity

    NASA Astrophysics Data System (ADS)

    Mabry, Patricia L.; Hammond, Ross; Ip, Edward Hak-Sing; Huang, Terry T.-K.

    As systems science methodologies have begun to emerge as a set of innovative approaches to address complex problems in behavioral, social science, and public health research, some apparent conflicts with traditional statistical methodologies for public health have arisen. Computational modeling is an approach set in context that integrates diverse sources of data to test the plausibility of working hypotheses and to elicit novel ones. Statistical models are reductionist approaches geared towards testing the null hypothesis. While these two approaches may seem contrary to each other, we propose that they are in fact complementary and can be used jointly to advance solutions to complex problems. Outputs from statistical models can be fed into computational models, and outputs from computational models can lead to further empirical data collection and statistical models. Together, this presents an iterative process that refines the models and contributes to a greater understanding of the problem and its potential solutions. The purpose of this panel is to foster communication and understanding between statistical and computational modelers. Our goal is to shed light on the differences between the approaches and convey what kinds of research inquiries each one is best for addressing and how they can serve complementary (and synergistic) roles in the research process, to mutual benefit. For each approach the panel will cover the relevant "assumptions" and how the differences in what is assumed can foster misunderstandings. The interpretations of the results from each approach will be compared and contrasted and the limitations for each approach will be delineated. We will use illustrative examples from CompMod, the Comparative Modeling Network for Childhood Obesity Policy. The panel will also incorporate interactive discussions with the audience on the issues raised here.

  2. Nonstationarity RC Workshop Report: Nonstationary Weather Patterns and Extreme Events Informing Design and Planning for Long-Lived Infrastructure

    DTIC Science & Technology

    2017-11-01

    magnitude, intensity, and seasonality of climate. For infrastructure projects, relevant design life often exceeds 30 years, a period of time of ... uncertainty about future statistical properties of climate at time and spatial scales required for planning and design purposes. Information ... about future statistical properties of climate at time and spatial scales required for planning and design, and for assessing future operational ...

  3. Occlusal status and prevalence of occlusal malocclusion traits among 9-year-old schoolchildren.

    PubMed

    Lux, Christopher J; Dücker, Britta; Pritsch, Maria; Komposch, Gerda; Niekusch, Uwe

    2009-06-01

    The aim of this study was to provide detailed information concerning clinically relevant occlusal traits and the prevalence of occlusal anomalies in an orthodontically relevant period of dental development. Four hundred and ninety-four German schoolchildren (237 males and 257 females), median age 9 years, were orthodontically examined. Overjet and overbite were measured to the nearest 0.5 mm, and sagittal molar relationships were registered clinically to the nearest quarter unit. In addition, crossbites, scissor bites, and midline displacements were evaluated. Descriptive statistics were complemented by tests of gender differences and of differences between groups with Class I and Class II anomalies (Mann-Whitney U-test), as well as by a statistical evaluation of differences between the three dental stages (Kruskal-Wallis test). Overjet exhibited an extreme range between -2 and 12 mm (median values 3-3.5 mm). An increased overjet was more prevalent than a reduced or reverse overjet, and a severely increased overjet greater than 6 mm was a common finding affecting around 5-10 per cent of the children. Similarly, overbite showed considerable variations of between -1 and 9 mm (medians 3-3.5 mm) and males exhibited a significantly larger overbite than females. In Class II malocclusion subjects, overbite was significantly enlarged (on average between 0.5 and 1 mm) when compared with those with a Class I malocclusion. Traumatic contact of the gingiva affected every 14th child. A Class II molar relationship of three-quarter units or more was a frequent finding affecting more than one child in five. In addition, at 9 years of age, 3 per cent of the children exhibited a Class III molar relationship of at least a half unit. The wide range of orthodontically relevant occlusal traits found in the present study underlines the need for orthodontic screening at 9 years of age (or earlier).

  4. Employability and career experiences of international graduates of MSc Public Health: a mixed methods study.

    PubMed

    Buunaaisie, C; Manyara, A M; Annett, H; Bird, E L; Bray, I; Ige, J; Jones, M; Orme, J; Pilkington, P; Evans, D

    2018-05-08

    This article aims to describe the public health career experiences of international graduates of a Master of Science in Public Health (MSc PH) programme and to contribute to developing the evidence base on international public health workforce capacity development. A sequential mixed methods study was conducted between January 2017 and April 2017. Ninety-seven international graduates of one UK university's MSc PH programme were invited to take part in an online survey, followed by semistructured interviews for respondents who consented to be interviewed. We computed descriptive statistics for the quantitative data obtained, and qualitative data were thematically analysed. The response rate was 48.5%. Most respondents (63%) were employed by various agencies within 1 year after graduation. Others (15%) were at different stages of doctor of philosophy studies. Respondents reported enhanced roles after graduation in areas such as public health policy analysis (74%); planning, implementation and evaluation of public health interventions (74%); leadership roles (72%); and research (70%). The common perceived skills that were relevant to the respondents' present jobs were critical analysis (87%), multidisciplinary thinking (86%), demonstrating public health leadership skills (84%) and research (77%). Almost all respondents (90%) were confident in conducting research. Respondents recommended the provision of longer public health placement opportunities, elective courses on project management and advanced statistics, and 'internationalisation' of the programme's curriculum. The study has revealed the relevance of higher education in public health in developing the career prospects and skills of graduates. International graduates of this MSc PH programme were satisfied with the relevance and impact of the skills they acquired during their studies. The outcomes of this study can be used for curriculum reform. Employers' perspectives of the capabilities of these graduates, however, need further consideration. Copyright © 2018 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.

  5. Information and material flows in complex networks

    NASA Astrophysics Data System (ADS)

    Helbing, Dirk; Armbruster, Dieter; Mikhailov, Alexander S.; Lefeber, Erjen

    2006-04-01

    In this special issue, an overview of the Thematic Institute (TI) on Information and Material Flows in Complex Systems is given. The TI was carried out within EXYSTENCE, the first EU Network of Excellence in the area of complex systems. Its motivation, research approach and subjects are presented here. Among the various methods used are many-particle and statistical physics, nonlinear dynamics, as well as complex systems, network and control theory. The contributions are relevant for complex systems as diverse as vehicle and data traffic in networks, logistics, production, and material flows in biological systems. The key disciplines involved are socio-, econo-, traffic- and bio-physics, and a new research area that could be called “biologistics”.

  6. Comparing Distributions of Environmental Outcomes for Regulatory Environmental Justice Analysis

    PubMed Central

    Maguire, Kelly; Sheriff, Glenn

    2011-01-01

    Economists have long been interested in measuring distributional impacts of policy interventions. As environmental justice (EJ) emerged as an ethical issue in the 1970s, the academic literature has provided statistical analyses of the incidence and causes of various environmental outcomes as they relate to race, income, and other demographic variables. In the context of regulatory impacts, however, there is a lack of consensus regarding what information is relevant for EJ analysis, and how best to present it. This paper helps frame the discussion by suggesting a set of questions fundamental to regulatory EJ analysis, reviewing past approaches to quantifying distributional equity, and discussing the potential for adapting existing tools to the regulatory context. PMID:21655146

  7. Semantic Annotation of Complex Text Structures in Problem Reports

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Throop, David R.; Fleming, Land D.

    2011-01-01

    Text analysis is important for effective information retrieval from databases where the critical information is embedded in text fields. Aerospace safety depends on effective retrieval of relevant and related problem reports for the purpose of trend analysis. The complex text syntax in problem descriptions has limited statistical text mining of problem reports. The presentation describes an intelligent tagging approach that applies syntactic and then semantic analysis to overcome this problem. The tags identify types of problems and equipment that are embedded in the text descriptions. The power of these tags is illustrated in a faceted searching and browsing interface for problem report trending that combines automatically generated tags with database code fields and temporal information.

  8. Efficacy and mode of action of an immunomodulator herbal preparation containing Echinacea, wild indigo, and white cedar.

    PubMed

    Wüstenberg, P; Henneicke-von Zepelin, H H; Köhler, G; Stammwitz, U

    1999-01-01

    Using the example of an allopathic herbal combined preparation containing Echinacea root, wild indigo root, and white cedar leaf tips (Echinaceae radix + Baptisiae tinctoriae radix + Thujae occidentalis herba = Esberitox N), the efficacy and mode of action of a phytoimmunomodulator, or immune system enhancer, is described. Efficacy of the immunomodulator has been demonstrated in studies of acute viral respiratory tract infections and infections requiring antibiotic therapy. In a recent study compliant with GCP, the therapeutic superiority of the herbal immunomodulator over placebo was confirmed as statistically significant and clinically relevant. The present overview describes a model of the antigen-independent mode of action of phytoimmunomodulation ("immunobalancing").

  9. Evaluation of excitation functions of proton and deuteron induced reactions on enriched tellurium isotopes with special relevance to the production of iodine-124.

    PubMed

    Aslam, M N; Sudár, S; Hussain, M; Malik, A A; Shah, H A; Qaim, S M

    2010-09-01

    Cross-section data for the production of medically important radionuclide (124)I via five proton and deuteron induced reactions on enriched tellurium isotopes were evaluated. The nuclear model codes, STAPRE, EMPIRE and TALYS, were used for consistency checks of the experimental data. Recommended excitation functions were derived using a well-defined statistical procedure. Therefrom integral yields were calculated. The various production routes of (124)I were compared. Presently the (124)Te(p,n)(124)I reaction is the method of choice; however, the (125)Te(p,2n)(124)I reaction also appears to have great potential.

  10. Learning probability distributions from smooth observables and the maximum entropy principle: some remarks

    NASA Astrophysics Data System (ADS)

    Obuchi, Tomoyuki; Monasson, Rémi

    2015-09-01

    The maximum entropy principle (MEP) is a very useful working hypothesis in a wide variety of inference problems, ranging from biological to engineering tasks. To better understand the reasons for the success of the MEP, we propose a statistical-mechanical formulation to treat the space of probability distributions constrained by the measures of (experimental) observables. In this paper we first review the results of a detailed analysis of the simplest case of randomly chosen observables. In addition, we investigate by numerical and analytical means the case of smooth observables, which is of practical relevance. Our preliminary results are presented and discussed with respect to the efficiency of the MEP.
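
    For orientation, the MEP construction referred to above maximizes the Shannon entropy subject to matching the measured averages of the observables O_a; the result is the familiar Gibbs form (standard textbook material, not quoted from the paper).

    ```latex
    % Maximize entropy subject to matching the measured averages:
    \max_{p}\; S[p] = -\sum_{x} p(x)\,\ln p(x)
    \quad\text{s.t.}\quad
    \sum_{x} p(x)\,O_a(x) = \langle O_a\rangle_{\mathrm{exp}},\qquad
    \sum_{x} p(x) = 1.
    % The solution is the Gibbs form, with multipliers fixed by the constraints:
    p(x) = \frac{1}{Z(\boldsymbol{\lambda})}\,
    \exp\!\Big(\sum_{a}\lambda_a\,O_a(x)\Big),
    \qquad
    Z(\boldsymbol{\lambda}) = \sum_{x}\exp\!\Big(\sum_{a}\lambda_a\,O_a(x)\Big).
    ```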

  11. Information Measures of Degree Distributions with an Application to Labeled Graphs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joslyn, Cliff A.; Purvine, Emilie AH

    2016-01-11

    The problem of describing the distribution of labels over a set of objects is relevant to many domains. For example, cyber security, social media, and protein interactions all care about the manner in which labels are distributed among different objects. In this paper we present three interacting statistical measures on label distributions, inspired by entropy and information theory. Labeled graphs are discussed as a specific case of labels distributed over a set of edges. We describe a use case in cyber security using a labeled directed multi-graph of IPFLOW. Finally we show how these measures respond when labels are updated in certain ways.
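
    The abstract does not spell out the three measures, but the natural starting point for any of them is the Shannon entropy of the label distribution; a minimal sketch with an invented edge-label example follows.

    ```python
    import math
    from collections import Counter

    def label_entropy(labels):
        """Shannon entropy (bits) of the distribution of labels over objects."""
        counts = Counter(labels)
        total = sum(counts.values())
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    # Example: labels on the edges of a small flow multigraph (made-up values).
    edge_labels = ["ssh", "http", "http", "dns", "http", "ssh"]
    print(label_entropy(edge_labels))  # low entropy => labels are concentrated
    ```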

  12. Intracultural diversity in a model of social dynamics

    NASA Astrophysics Data System (ADS)

    Parravano, A.; Rivera-Ramirez, H.; Cosenza, M. G.

    2007-06-01

    We study the consequences of introducing individual nonconformity in social interactions, based on Axelrod's model for the dissemination of culture. A constraint on the number of situations in which interaction may take place is introduced in order to lift the unavoidable homogeneity present in the final configurations arising in Axelrod's related models. The inclusion of this constraint leads to the occurrence of complex patterns of intracultural diversity whose statistical properties and spatial distribution are characterized by means of the concepts of cultural affinity and cultural cline. It is found that the relevant quantity that determines the properties of intracultural diversity is given by the fraction of cultural features that characterizes the cultural nonconformity of individuals.
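
    For concreteness, a single interaction of the underlying Axelrod dynamics can be sketched as follows. This is the standard rule (interact with a random neighbor with probability equal to the cultural overlap, then copy one differing feature); it does not include the interaction constraint the paper introduces.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def axelrod_step(grid):
        """One interaction of the standard Axelrod culture model on an LxL
        lattice with periodic boundaries; grid is (L, L, F) integer traits."""
        L = grid.shape[0]
        i, j = rng.integers(L, size=2)                        # random agent
        di, dj = rng.permutation([(0, 1), (0, -1), (1, 0), (-1, 0)])[0]
        ni, nj = (i + di) % L, (j + dj) % L                   # random neighbor
        a, b = grid[i, j], grid[ni, nj]
        shared = a == b
        overlap = shared.mean()                               # cultural similarity
        if 0 < overlap < 1 and rng.random() < overlap:
            k = rng.choice(np.flatnonzero(~shared))           # a differing feature
            a[k] = b[k]                                       # adopt neighbor's trait

    # Example: 20x20 lattice, F = 5 features, q = 10 traits per feature.
    grid = rng.integers(10, size=(20, 20, 5))
    for _ in range(100_000):
        axelrod_step(grid)
    ```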

  13. Nonlocal polarization interferometer for entanglement detection

    DOE PAGES

    Williams, Brian P.; Humble, Travis S.; Grice, Warren P.

    2014-10-30

    We report a nonlocal interferometer capable of detecting entanglement and identifying Bell states statistically. This is possible due to the interferometer's unique correlation dependence on the antidiagonal elements of the density matrix, which have distinct bounds for separable states and unique values for the four Bell states. The interferometer consists of two spatially separated balanced Mach-Zehnder or Sagnac interferometers that share a polarization-entangled source. Correlations between these interferometers exhibit nonlocal interference, while single-photon interference is suppressed. This interferometer also allows for a unique version of the Clauser-Horne-Shimony-Holt Bell test where the local reality is the photon polarization. In conclusion, we present the relevant theory and experimental results.

  14. Direct measurement of electron transfer distance decay constants of single redox proteins by electrochemical tunneling spectroscopy.

    PubMed

    Artés, Juan M; Díez-Pérez, Ismael; Sanz, Fausto; Gorostiza, Pau

    2011-03-22

    We present a method to measure directly and at the single-molecule level the distance decay constant that characterizes the rate of electron transfer (ET) in redox proteins. Using an electrochemical tunneling microscope under bipotentiostatic control, we obtained current-distance spectroscopic recordings of individual redox proteins confined within a nanometric tunneling gap at a well-defined molecular orientation. The tunneling current decays exponentially, and the corresponding decay constant (β) strongly supports a two-step tunneling ET mechanism. Statistical analysis of decay constant measurements reveals differences between the reduced and oxidized states that may be relevant to the control of ET rates in enzymes and biological electron transport chains.
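
    The decay constant in such measurements follows from the exponential dependence I(z) = I0 exp(-beta z), so a log-linear fit recovers beta as minus the slope. The numbers below are invented for illustration and are not the paper's data.

    ```python
    import numpy as np

    # Hypothetical current-distance recording: gap distance z (nm), current I (pA).
    z = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])
    current = np.array([950.0, 610.0, 395.0, 260.0, 165.0, 105.0])

    # For I(z) = I0 * exp(-beta * z), ln I is linear in z with slope -beta.
    slope, intercept = np.polyfit(z, np.log(current), 1)
    beta = -slope
    print(f"beta ~ {beta:.2f} per nm")
    ```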

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennink, Ryan S.; Ferragut, Erik M.; Humble, Travis S.

    Modeling and simulation are essential for predicting and verifying the behavior of fabricated quantum circuits, but existing simulation methods are either impractically costly or require an unrealistic simplification of error processes. In this paper, we present a method of simulating noisy Clifford circuits that is both accurate and practical in experimentally relevant regimes. In particular, the cost is weakly exponential in the size and the degree of non-Cliffordness of the circuit. Our approach is based on the construction of exact representations of quantum channels as quasiprobability distributions over stabilizer operations, which are then sampled, simulated, and weighted to yield unbiased statistical estimates of circuit outputs and other observables. As a demonstration of these techniques, we simulate a Steane [[7,1,3]] code.

  16. Implementation of the common phrase index method on the phrase query for information retrieval

    NASA Astrophysics Data System (ADS)

    Fatmawati, Triyah; Zaman, Badrus; Werdiningsih, Indah

    2017-08-01

    With the development of technology, finding information in news text has become easy, because news is distributed not only in print media, such as newspapers, but also in electronic media that can be accessed using search engines. When searching for relevant documents, a phrase is often used as a query. The number of words that make up the phrase query, and their positions, clearly affect the relevance of the documents produced, and hence the accuracy of the information obtained. Based on this problem, the purpose of this research was to analyze the implementation of the common phrase index method for information retrieval. The research was conducted on English news text and implemented in a prototype to determine the relevance level of the documents produced. The system was built with stages of pre-processing, indexing, term weighting, and cosine similarity calculation; it then displays the retrieved documents ranked by cosine similarity. System testing was conducted using 100 documents and 20 queries, and the results were used for a two-step evaluation. First, the relevant documents were determined using the kappa statistic. Second, the system success rate was measured using precision, recall, and F-measure. The kappa statistic was 0.71, so the relevance judgments were suitable for system evaluation. The evaluation yielded a precision of 0.37, a recall of 0.50, and an F-measure of 0.43. From these results it can be said that the system's success rate in producing relevant documents is low.
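
    As a quick arithmetic check (a sketch, not the authors' code), the reported F-measure follows directly from the reported precision and recall, and all three metrics are simple to compute from retrieved/relevant document sets:

        def precision_recall_f(retrieved, relevant):
            # retrieved, relevant: sets of document ids
            tp = len(retrieved & relevant)
            precision = tp / len(retrieved)
            recall = tp / len(relevant)
            f_measure = 2 * precision * recall / (precision + recall)
            return precision, recall, f_measure

        # Consistency check against the reported figures:
        # F = 2 * 0.37 * 0.50 / (0.37 + 0.50) = 0.425..., i.e. 0.43 as reported.
        print(precision_recall_f({1, 2, 3, 4}, {3, 4, 5, 6}))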

  17. Building a database for statistical characterization of ELMs on DIII-D

    NASA Astrophysics Data System (ADS)

    Fritch, B. J.; Marinoni, A.; Bortolon, A.

    2017-10-01

    Edge localized modes (ELMs) are bursty instabilities which occur in the edge region of H-mode plasmas and have the potential to damage in-vessel components of future fusion machines by exposing the divertor region to large energy and particle fluxes during each ELM event. While most ELM studies focus on average quantities (e.g. energy loss per ELM), this work investigates the statistical distributions of ELM characteristics, as a function of plasma parameters. A semi-automatic algorithm is being used to create a database documenting trigger times of the tens of thousands of ELMs for DIII-D discharges in scenarios relevant to ITER, thus allowing statistically significant analysis. Probability distributions of inter-ELM periods and energy losses will be determined and related to relevant plasma parameters such as density, stored energy, and current in order to constrain models and improve estimates of the expected inter-ELM periods and sizes, both of which must be controlled in future reactors. Work supported in part by US DoE under the Science Undergraduate Laboratory Internships (SULI) program, DE-FC02-04ER54698 and DE-FG02- 94ER54235.

  18. Infants are superior in implicit crossmodal learning and use other learning mechanisms than adults

    PubMed Central

    von Frieling, Marco; Röder, Brigitte

    2017-01-01

    During development, internal models of the sensory world must be acquired and then continuously adapted. We used event-related potentials (ERP) to test the hypothesis that infants extract crossmodal statistics implicitly while adults learn them when task relevant. Participants were passively exposed to frequent standard audio-visual combinations (A1V1, A2V2, p=0.35 each), rare recombinations of these standard stimuli (A1V2, A2V1, p=0.10 each), and a rare audio-visual deviant with infrequent auditory and visual elements (A3V3, p=0.10). While both six-month-old infants and adults differentiated between rare deviants and standards involving early neural processing stages, only infants were sensitive to crossmodal statistics, as indicated by a late ERP difference between standard and recombined stimuli. A second experiment revealed that adults differentiated recombined and standard combinations when crossmodal combinations were task relevant. These results demonstrate a heightened sensitivity for crossmodal statistics in infants and a change in learning mode from infancy to adulthood. PMID:28949291

  19. Establishing Statistical Equivalence of Data from Different Sampling Approaches for Assessment of Bacterial Phenotypic Antimicrobial Resistance

    PubMed Central

    2018-01-01

    ABSTRACT To assess phenotypic bacterial antimicrobial resistance (AMR) in different strata (e.g., host populations, environmental areas, manure, or sewage effluents) for epidemiological purposes, isolates of target bacteria can be obtained from a stratum using various sample types. Also, different sample processing methods can be applied. The MIC of each target antimicrobial drug for each isolate is measured. Statistical equivalence testing of the MIC data for the isolates allows evaluation of whether different sample types or sample processing methods yield equivalent estimates of the bacterial antimicrobial susceptibility in the stratum. We demonstrate this approach on the antimicrobial susceptibility estimates for (i) nontyphoidal Salmonella spp. from ground or trimmed meat versus cecal content samples of cattle in processing plants in 2013-2014 and (ii) nontyphoidal Salmonella spp. from urine, fecal, and blood human samples in 2015 (U.S. National Antimicrobial Resistance Monitoring System data). We found that the sample types for cattle yielded nonequivalent susceptibility estimates for several antimicrobial drug classes and thus may gauge distinct subpopulations of salmonellae. The quinolone and fluoroquinolone susceptibility estimates for nontyphoidal salmonellae from human blood are nonequivalent to those from urine or feces, conjecturally due to the fluoroquinolone (ciprofloxacin) use to treat infections caused by nontyphoidal salmonellae. We also demonstrate statistical equivalence testing for comparing sample processing methods for fecal samples (culturing one versus multiple aliquots per sample) to assess AMR in fecal Escherichia coli. These methods yield equivalent results, except for tetracyclines. Importantly, statistical equivalence testing provides the MIC difference at which the data from two sample types or sample processing methods differ statistically. Data users (e.g., microbiologists and epidemiologists) may then interpret practical relevance of the difference. IMPORTANCE Bacterial antimicrobial resistance (AMR) needs to be assessed in different populations or strata for the purposes of surveillance and determination of the efficacy of interventions to halt AMR dissemination. To assess phenotypic antimicrobial susceptibility, isolates of target bacteria can be obtained from a stratum using different sample types or employing different sample processing methods in the laboratory. The MIC of each target antimicrobial drug for each of the isolates is measured, yielding the MIC distribution across the isolates from each sample type or sample processing method. We describe statistical equivalence testing for the MIC data for evaluating whether two sample types or sample processing methods yield equivalent estimates of the bacterial phenotypic antimicrobial susceptibility in the stratum. This includes estimating the MIC difference at which the data from the two approaches differ statistically. Data users (e.g., microbiologists, epidemiologists, and public health professionals) can then interpret whether that present difference is practically relevant. PMID:29475868

  20. Establishing Statistical Equivalence of Data from Different Sampling Approaches for Assessment of Bacterial Phenotypic Antimicrobial Resistance.

    PubMed

    Shakeri, Heman; Volkova, Victoriya; Wen, Xuesong; Deters, Andrea; Cull, Charley; Drouillard, James; Müller, Christian; Moradijamei, Behnaz; Jaberi-Douraki, Majid

    2018-05-01

    To assess phenotypic bacterial antimicrobial resistance (AMR) in different strata (e.g., host populations, environmental areas, manure, or sewage effluents) for epidemiological purposes, isolates of target bacteria can be obtained from a stratum using various sample types. Also, different sample processing methods can be applied. The MIC of each target antimicrobial drug for each isolate is measured. Statistical equivalence testing of the MIC data for the isolates allows evaluation of whether different sample types or sample processing methods yield equivalent estimates of the bacterial antimicrobial susceptibility in the stratum. We demonstrate this approach on the antimicrobial susceptibility estimates for (i) nontyphoidal Salmonella spp. from ground or trimmed meat versus cecal content samples of cattle in processing plants in 2013-2014 and (ii) nontyphoidal Salmonella spp. from urine, fecal, and blood human samples in 2015 (U.S. National Antimicrobial Resistance Monitoring System data). We found that the sample types for cattle yielded nonequivalent susceptibility estimates for several antimicrobial drug classes and thus may gauge distinct subpopulations of salmonellae. The quinolone and fluoroquinolone susceptibility estimates for nontyphoidal salmonellae from human blood are nonequivalent to those from urine or feces, conjecturally due to the fluoroquinolone (ciprofloxacin) use to treat infections caused by nontyphoidal salmonellae. We also demonstrate statistical equivalence testing for comparing sample processing methods for fecal samples (culturing one versus multiple aliquots per sample) to assess AMR in fecal Escherichia coli. These methods yield equivalent results, except for tetracyclines. Importantly, statistical equivalence testing provides the MIC difference at which the data from two sample types or sample processing methods differ statistically. Data users (e.g., microbiologists and epidemiologists) may then interpret practical relevance of the difference. IMPORTANCE Bacterial antimicrobial resistance (AMR) needs to be assessed in different populations or strata for the purposes of surveillance and determination of the efficacy of interventions to halt AMR dissemination. To assess phenotypic antimicrobial susceptibility, isolates of target bacteria can be obtained from a stratum using different sample types or employing different sample processing methods in the laboratory. The MIC of each target antimicrobial drug for each of the isolates is measured, yielding the MIC distribution across the isolates from each sample type or sample processing method. We describe statistical equivalence testing for the MIC data for evaluating whether two sample types or sample processing methods yield equivalent estimates of the bacterial phenotypic antimicrobial susceptibility in the stratum. This includes estimating the MIC difference at which the data from the two approaches differ statistically. Data users (e.g., microbiologists, epidemiologists, and public health professionals) can then interpret whether that present difference is practically relevant. Copyright © 2018 Shakeri et al.
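
    The record describes equivalence testing of MIC distributions without implementation detail. A hedged sketch of one standard equivalence procedure, two one-sided t-tests (TOST) on log2(MIC) values with an assumed margin of one doubling dilution (all data and the margin are hypothetical, not the authors' method):

        import numpy as np
        from scipy import stats

        def tost_equivalence(x, y, margin, alpha=0.05):
            # Two one-sided t-tests: declare equivalence when the mean
            # difference is shown to lie within (-margin, +margin).
            diff = np.mean(x) - np.mean(y)
            se = np.sqrt(np.var(x, ddof=1) / len(x) + np.var(y, ddof=1) / len(y))
            df = len(x) + len(y) - 2  # pooled df; a Welch df would be more careful
            p_lower = stats.t.sf((diff + margin) / se, df)   # H0: diff <= -margin
            p_upper = stats.t.cdf((diff - margin) / se, df)  # H0: diff >= +margin
            return max(p_lower, p_upper) < alpha

        # Hypothetical log2(MIC) values from two sample types.
        rng = np.random.default_rng(0)
        a = rng.normal(3.0, 0.8, 60)
        b = rng.normal(3.1, 0.8, 60)
        print(tost_equivalence(a, b, margin=1.0))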

  1. Record statistics of financial time series and geometric random walks

    NASA Astrophysics Data System (ADS)

    Sabir, Behlool; Santhanam, M. S.

    2014-09-01

    The study of record statistics of correlated series in physics, such as random walks, is gaining momentum, and several analytical results have been obtained in the past few years. In this work, we study the record statistics of correlated empirical data for which random walk models have relevance. We obtain results for the record statistics of select stock market data and the geometric random walk, primarily through simulations. We show that the distribution of the age of records is a power law with the exponent α lying in the range 1.5≤α≤1.8. Further, the longest record ages follow the Fréchet distribution of extreme value theory. The record statistics of the geometric random walk series are in good agreement with those obtained from empirical stock data.
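
    A sketch of the simulation ingredient named here, extracting record ages from a geometric random walk; the parameters are illustrative only, not those of the paper:

        import numpy as np

        def record_ages(series):
            # Durations between successive upper records of a series.
            record_times = [0]
            current_max = series[0]
            for t in range(1, len(series)):
                if series[t] > current_max:
                    current_max = series[t]
                    record_times.append(t)
            return np.diff(record_times)

        # Geometric random walk: S_t = S_0 * exp(cumulative Gaussian log-returns).
        rng = np.random.default_rng(1)
        prices = 100 * np.exp(np.cumsum(rng.normal(0.0, 0.01, 100_000)))
        ages = record_ages(prices)
        print(ages.max(), np.median(ages))  # heavy tail: a few very long record ages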

  2. 46 CFR 201.132 - Conduct of the hearing.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ..., such as an official report, decision, opinion, or published scientific or economic statistical data... relevant part thereof. (h) Oral argument at hearings. A request for oral argument at the close of testimony...

  3. 46 CFR 201.132 - Conduct of the hearing.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., such as an official report, decision, opinion, or published scientific or economic statistical data... relevant part thereof. (h) Oral argument at hearings. A request for oral argument at the close of testimony...

  4. 46 CFR 201.132 - Conduct of the hearing.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., such as an official report, decision, opinion, or published scientific or economic statistical data... relevant part thereof. (h) Oral argument at hearings. A request for oral argument at the close of testimony...

  5. 46 CFR 201.132 - Conduct of the hearing.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ..., such as an official report, decision, opinion, or published scientific or economic statistical data... relevant part thereof. (h) Oral argument at hearings. A request for oral argument at the close of testimony...

  6. 46 CFR 201.132 - Conduct of the hearing.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ..., such as an official report, decision, opinion, or published scientific or economic statistical data... relevant part thereof. (h) Oral argument at hearings. A request for oral argument at the close of testimony...

  7. Cancer Related-Knowledge - Small Area Estimates

    Cancer.gov

    These model-based estimates are produced using statistical models that combine data from the Health Information National Trends Survey, and auxiliary variables obtained from relevant sources and borrow strength from other areas with similar characteristics.

  8. Assessment of NDE Reliability Data

    NASA Technical Reports Server (NTRS)

    Yee, B. G. W.; Chang, F. H.; Couchman, J. C.; Lemon, G. H.; Packman, P. F.

    1976-01-01

    Twenty sets of relevant Nondestructive Evaluation (NDE) reliability data have been identified, collected, compiled, and categorized. A criterion for the selection of data for statistical analysis considerations has been formulated. A model to grade the quality and validity of the data sets has been developed. Data input formats, which record the pertinent parameters of the defect/specimen and inspection procedures, have been formulated for each NDE method. A comprehensive computer program has been written to calculate the probability of flaw detection at several confidence levels by the binomial distribution. This program also selects the desired data sets for pooling and tests the statistical pooling criteria before calculating the composite detection reliability. Probability of detection curves at 95 and 50 percent confidence levels have been plotted for individual sets of relevant data as well as for several sets of merged data with common sets of NDE parameters.
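
    The record names the binomial model and the 95 and 50 percent confidence levels. One standard construction consistent with that description (assumed here, not necessarily the program's exact calculation) is the Clopper–Pearson one-sided lower bound on the probability of detection:

        from scipy.stats import beta

        def pod_lower_bound(detections, trials, confidence=0.95):
            # One-sided lower confidence bound on probability of detection,
            # from the binomial model (Clopper-Pearson construction).
            if detections == 0:
                return 0.0
            return beta.ppf(1 - confidence, detections, trials - detections + 1)

        print(pod_lower_bound(28, 29, confidence=0.95))
        print(pod_lower_bound(28, 29, confidence=0.50))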

  9. Influence of environmental statistics on inhibition of saccadic return

    PubMed Central

    Farrell, Simon; Ludwig, Casimir J. H.; Ellis, Lucy A.; Gilchrist, Iain D.

    2009-01-01

    Initiating an eye movement is slowed if the saccade is directed to a location that has been fixated in the recent past. We show that this inhibitory effect is modulated by the temporal statistics of the environment: If a return location is likely to become behaviorally relevant, inhibition of return is absent. By fitting an accumulator model of saccadic decision-making, we show that the inhibitory effect and the sensitivity to local statistics can be dissociated in their effects on the rate of accumulation of evidence, and the threshold controlling the amount of evidence needed to generate a saccade. PMID:20080778

  10. Contribution of Apollo lunar photography to the establishment of selenodetic control

    NASA Technical Reports Server (NTRS)

    Dermanis, A.

    1975-01-01

    Among the various types of available data relevant to the establishment of geometric control on the moon, the only one covering significant portions of the lunar surface (20%) with sufficient information content is lunar photography, taken in the proximity of the moon from lunar orbiters. The idea of free geodetic networks is introduced as a tool for the statistical comparison of the geometric aspects of the various data used. Methods were developed for the updating of the statistics of observations and the a priori parameter estimates to obtain statistically consistent solutions by means of the optimum relative weighting concept.

  11. Asymptotic coincidence of the statistics for degenerate and non-degenerate correlated real Wishart ensembles

    NASA Astrophysics Data System (ADS)

    Wirtz, Tim; Kieburg, Mario; Guhr, Thomas

    2017-06-01

    The correlated Wishart model provides the standard benchmark when analyzing time series of any kind. Unfortunately, the real case, which is the most relevant one in applications, poses serious challenges for analytical calculations. Often these challenges are due to square root singularities which cannot be handled using common random matrix techniques. We present a new way to tackle this issue. Using supersymmetry, we carry out an analytical study which we support by numerical simulations. For large but finite matrix dimensions, we show that statistical properties of the fully correlated real Wishart model generically approach those of a correlated real Wishart model with doubled matrix dimensions and doubly degenerate empirical eigenvalues. This holds for the local and global spectral statistics. With Monte Carlo simulations we show that this is even approximately true for small matrix dimensions. We explicitly investigate the k-point correlation function as well as the distribution of the largest eigenvalue for which we find a surprisingly compact formula in the doubly degenerate case. Moreover we show that on the local scale the k-point correlation function exhibits the sine and the Airy kernel in the bulk and at the soft edges, respectively. We also address the positions and the fluctuations of the possible outliers in the data.

  12. Nearfield Summary and Statistical Analysis of the Second AIAA Sonic Boom Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Park, Michael A.; Nemec, Marian

    2017-01-01

    A summary is provided for the Second AIAA Sonic Boom Workshop held 8-9 January 2017 in conjunction with AIAA SciTech 2017. The workshop used three required models of increasing complexity: an axisymmetric body, a wing body, and a complete configuration with flow-through nacelle. An optional complete configuration with propulsion boundary conditions is also provided. These models are designed with similar nearfield signatures to isolate geometry and shock/expansion interaction effects. Eleven international participant groups submitted nearfield signatures with forces, pitching moment, and iterative convergence norms. Statistics and grid convergence of these nearfield signatures are presented. These submissions are propagated to the ground, and noise levels are computed. This allows the grid convergence and the statistical distribution of a noise level to be computed. While progress is documented since the first workshop, improvements to the analysis methods for a possible subsequent workshop are provided. The complete configuration with flow-through nacelle showed the most dramatic improvement between the two workshops. The current workshop cases are more relevant to vehicles with lower loudness and have the potential for lower annoyance than the first workshop cases. The models for this workshop, with quieter ground noise levels than the first workshop, exposed weaknesses in analysis, particularly in convective discretization.

  13. Future Needs and Recommendations in the Development of ...

    EPA Pesticide Factsheets

    A species sensitivity distribution (SSD) is a probability model of the variation of species sensitivities to a stressor, in particular chemical exposure. The SSD approach has been used as a decision support tool in environmental protection and management since the 1980s, and the ecotoxicological, statistical and regulatory basis and applications continue to evolve. This article summarizes the findings of a 2014 workshop held by ECETOC (the European Center for Toxicology and Ecotoxicology of Chemicals) and the UK Environment Agency in Amsterdam, the Netherlands on the ecological relevance, statistical basis, and regulatory applications of SSDs. An array of research recommendations categorized under the topical areas of Use of SSDs, Ecological Considerations, Guideline Considerations, Method Development and Validation, Toxicity Data, Mechanistic Understanding and Uncertainty were identified and prioritized. A rationale for the most critical research needs identified in the workshop is provided. The workshop reviewed the technical basis and historical development and application of SSDs, described approaches to estimating generic and scenario specific SSD-based thresholds, evaluated utility and application of SSDs as diagnostic tools, and presented new statistical approaches to formulate SSDs. Collectively, these address many of the research needs to expand and improve their application. The highest priority work, from a pragmatic regulatory point of view, is t

  14. Identifying Galactic Cosmic Ray Origins With Super-TIGER

    NASA Technical Reports Server (NTRS)

    deNolfo, Georgia; Binns, W. R.; Israel, M. H.; Christian, E. R.; Mitchell, J. W.; Hams, T.; Link, J. T.; Sasaki, M.; Labrador, A. W.; Mewaldt, R. A.

    2009-01-01

    Super-TIGER (Super Trans-Iron Galactic Element Recorder) is a new long-duration balloon-borne instrument designed to test and clarify an emerging model of cosmic-ray origins and models for atomic processes by which nuclei are selected for acceleration. A sensitive test of the origin of cosmic rays is the measurement of ultra-heavy elemental abundances (Z ≥ 30). Super-TIGER is a large-area (5 sq m) instrument designed to measure the elements in the interval 30 ≤ Z ≤ 42 with individual-element resolution and high statistical precision, and make exploratory measurements through Z = 60. It will also measure with high statistical accuracy the energy spectra of the more abundant elements in the interval 14 ≤ Z ≤ 30 at energies 0.8 ≤ E ≤ 10 GeV/nucleon. These spectra will give a sensitive test of the hypothesis that microquasars or other sources could superpose spectral features on the otherwise smooth energy spectra previously measured with less statistical accuracy. Super-TIGER builds on the heritage of the smaller TIGER, which produced the first well-resolved measurements of elemental abundances of gallium (Z = 31), germanium (Z = 32), and selenium (Z = 34). We present the Super-TIGER design, schedule, and progress to date, and discuss the relevance of UH measurements to cosmic-ray origins.

  15. Statistical analysis and data mining of digital reconstructions of dendritic morphologies.

    PubMed

    Polavaram, Sridevi; Gillette, Todd A; Parekh, Ruchi; Ascoli, Giorgio A

    2014-01-01

    Neuronal morphology is diverse among animal species, developmental stages, brain regions, and cell types. The geometry of individual neurons also varies substantially even within the same cell class. Moreover, specific histological, imaging, and reconstruction methodologies can differentially affect morphometric measures. The quantitative characterization of neuronal arbors is necessary for in-depth understanding of the structure-function relationship in nervous systems. The large collection of community-contributed digitally reconstructed neurons available at NeuroMorpho.Org constitutes a "big data" research opportunity for neuroscience discovery beyond the approaches typically pursued in single laboratories. To illustrate this potential and the related challenges, we present a database-wide statistical analysis of dendritic arbors enabling the quantification of major morphological similarities and differences across broadly adopted metadata categories. Furthermore, we adopt a complementary unsupervised approach based on clustering and dimensionality reduction to identify the main morphological parameters leading to the most statistically informative structural classification. We find that specific combinations of measures related to branching density, overall size, tortuosity, bifurcation angles, arbor flatness, and topological asymmetry can capture anatomically and functionally relevant features of dendritic trees. The reported results only represent a small fraction of the relationships available for data exploration and hypothesis testing enabled by sharing of digital morphological reconstructions.
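
    The unsupervised approach is described only at the level of "clustering and dimensionality reduction"; as an illustrative sketch of that generic pipeline (not the authors' actual analysis), with synthetic stand-in morphometrics:

        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(2)
        X = rng.normal(size=(500, 6))  # stand-in for per-neuron morphometrics
                                       # (branching density, size, tortuosity, ...)
        Z = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
        clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(Z)
        print(np.bincount(clusters))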

  16. Application of maximum entropy to statistical inference for inversion of data from a single track segment.

    PubMed

    Stotts, Steven A; Koch, Robert A

    2017-08-01

    In this paper an approach is presented to estimate the constraint required to apply maximum entropy (ME) for statistical inference with underwater acoustic data from a single track segment. Previous algorithms for estimating the ME constraint require multiple source track segments to determine the constraint. The approach is relevant for addressing model mismatch effects, i.e., inaccuracies in parameter values determined from inversions because the propagation model does not account for all acoustic processes that contribute to the measured data. One effect of model mismatch is that the lowest cost inversion solution may be well outside a relatively well-known parameter value's uncertainty interval (prior), e.g., source speed from track reconstruction or towed source levels. The approach requires, for some particular parameter value, the ME constraint to produce an inferred uncertainty interval that encompasses the prior. Motivating this approach is the hypothesis that the proposed constraint determination procedure would produce a posterior probability density that accounts for the effect of model mismatch on inferred values of other inversion parameters for which the priors might be quite broad. Applications to both measured and simulated data are presented for model mismatch that produces minimum cost solutions either inside or outside some priors.

  17. Internet gaming disorder in early adolescence: Associations with parental and adolescent mental health.

    PubMed

    Wartberg, L; Kriston, L; Kramer, M; Schwedler, A; Lincoln, T M; Kammerl, R

    2017-06-01

    Internet gaming disorder (IGD) has been included in the Diagnostic and Statistical Manual of Mental Disorders (DSM-5). Currently, associations between IGD in early adolescence and mental health are largely unexplained. In the present study, the relation of IGD with adolescent and parental mental health was investigated for the first time. We surveyed 1095 family dyads (an adolescent aged 12-14 years and a related parent) with a standardized questionnaire for IGD as well as for adolescent and parental mental health. We conducted linear (dimensional approach) and logistic (categorical approach) regression analyses. Both with dimensional and categorical approaches, we observed statistically significant associations between IGD and male gender, a higher degree of adolescent antisocial behavior, anger control problems, emotional distress, self-esteem problems, hyperactivity/inattention and parental anxiety (linear regression model: corrected R² = 0.41, logistic regression model: Nagelkerke's R² = 0.41). IGD appears to be associated with internalizing and externalizing problems in adolescents. Moreover, the findings of the present study provide first evidence that not only adolescent but also parental mental health is relevant to IGD in early adolescence. Adolescent and parental mental health should be considered in prevention and intervention programs for IGD in adolescence. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
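
    Nagelkerke's R², reported above for the logistic model, can be computed from the fitted and null log-likelihoods; a short sketch (the numbers below are invented, not the study's):

        import numpy as np

        def nagelkerke_r2(ll_model, ll_null, n):
            # Nagelkerke's R^2 rescales the Cox-Snell R^2 to a maximum of 1.
            cox_snell = 1.0 - np.exp(2.0 * (ll_null - ll_model) / n)
            return cox_snell / (1.0 - np.exp(2.0 * ll_null / n))

        # Hypothetical log-likelihoods for a logistic model of IGD caseness:
        print(nagelkerke_r2(ll_model=-480.0, ll_null=-700.0, n=1095))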

  18. Characterising the disintegration properties of tablets in opaque media using texture analysis.

    PubMed

    Scheuerle, Rebekah L; Gerrard, Stephen E; Kendall, Richard A; Tuleu, Catherine; Slater, Nigel K H; Mahbubani, Krishnaa T

    2015-01-01

    Tablet disintegration characterisation is used in pharmaceutical research, development, and quality control. Standard methods used to characterise tablet disintegration are often dependent on visual observation to measure disintegration times. This presents a challenge for disintegration studies of tablets in opaque, physiologically relevant media that could be useful for tablet formulation optimisation. This study has explored an application of texture analysis disintegration testing, a non-visual, quantitative means of determining tablet disintegration end point, by analysing the disintegration behaviour of two tablet formulations in opaque media. In this study, the disintegration behaviour of one tablet formulation manufactured in-house, and of Sybedia Flashtab placebo tablets, in water, bovine milk, and human milk was characterised. A novel method is presented to characterise the disintegration process and to quantify the disintegration end points of the tablets in various media using load data generated by a texture analyser probe. The disintegration times in the different media were found to be statistically different (P<0.0001) from one another for both tablet formulations using one-way ANOVA. Using the Tukey post-hoc test, the Sybedia Flashtab placebo tablets were found not to have statistically significantly different disintegration times in human versus bovine milk (adjusted P value 0.1685). Copyright © 2015 Elsevier B.V. All rights reserved.
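
    The analysis pipeline named in the record (one-way ANOVA followed by a Tukey post-hoc test) looks like the following in Python; the disintegration times below are invented placeholders, not the study's data:

        import numpy as np
        from scipy.stats import f_oneway
        from statsmodels.stats.multicomp import pairwise_tukeyhsd

        rng = np.random.default_rng(3)
        water = rng.normal(30, 3, 12)   # hypothetical disintegration times (s)
        bovine = rng.normal(45, 4, 12)
        human = rng.normal(47, 4, 12)

        print(f_oneway(water, bovine, human))  # one-way ANOVA across the three media

        times = np.concatenate([water, bovine, human])
        media = ["water"] * 12 + ["bovine milk"] * 12 + ["human milk"] * 12
        print(pairwise_tukeyhsd(times, media))  # adjusted p per pairwise contrast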

  19. The ALICE Electronic Logbook

    NASA Astrophysics Data System (ADS)

    Altini, V.; Carena, F.; Carena, W.; Chapeland, S.; Chibante Barroso, V.; Costa, F.; Divià, R.; Fuchs, U.; Makhlyueva, I.; Roukoutakis, F.; Schossmaier, K.; Soòs, C.; Vande Vyvre, P.; Von Haller, B.; ALICE Collaboration

    2010-04-01

    All major experiments need tools that provide a way to keep a record of the events and activities, both during commissioning and operations. In ALICE (A Large Ion Collider Experiment) at CERN, this task is performed by the Alice Electronic Logbook (eLogbook), a custom-made application developed and maintained by the Data-Acquisition group (DAQ). Started as a statistics repository, the eLogbook has evolved to become not only a fully functional electronic logbook, but also a massive information repository used to store the conditions and statistics of the several online systems. It's currently used by more than 600 users in 30 different countries and it plays an important role in the daily ALICE collaboration activities. This paper will describe the LAMP (Linux, Apache, MySQL and PHP) based architecture of the eLogbook, the database schema and the relevance of the information stored in the eLogbook to the different ALICE actors, not only for near real time procedures but also for long term data-mining and analysis. It will also present the web interface, including the different used technologies, the implemented security measures and the current main features. Finally it will present the roadmap for the future, including a migration to the web 2.0 paradigm, the handling of the database ever-increasing data volume and the deployment of data-mining tools.

  20. Cyber Risk Management for Critical Infrastructure: A Risk Analysis Model and Three Case Studies.

    PubMed

    Paté-Cornell, M-Elisabeth; Kuypers, Marshall; Smith, Matthew; Keller, Philip

    2018-02-01

    Managing cyber security in an organization involves allocating the protection budget across a spectrum of possible options. This requires assessing the benefits and the costs of these options. The risk analyses presented here are statistical when relevant data are available, and system-based for high-consequence events that have not happened yet. This article presents, first, a general probabilistic risk analysis framework for cyber security in an organization to be specified. It then describes three examples of forward-looking analyses motivated by recent cyber attacks. The first one is the statistical analysis of an actual database, extended at the upper end of the loss distribution by a Bayesian analysis of possible, high-consequence attack scenarios that may happen in the future. The second is a systems analysis of cyber risks for a smart, connected electric grid, showing that there is an optimal level of connectivity. The third is an analysis of sequential decisions to upgrade the software of an existing cyber security system or to adopt a new one to stay ahead of adversaries trying to find their way in. The results are distributions of losses to cyber attacks, with and without some considered countermeasures in support of risk management decisions based both on past data and anticipated incidents. © 2017 Society for Risk Analysis.
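
    The article's models are not specified beyond "statistical when relevant data are available"; one generic frequency/severity Monte Carlo of annual cyber losses, offered purely as a hedged sketch with invented parameters (not the authors' model):

        import numpy as np

        rng = np.random.default_rng(4)
        n_years = 10_000  # simulated years

        # Frequency/severity model: Poisson incident counts, lognormal losses.
        counts = rng.poisson(lam=12, size=n_years)
        annual_loss = np.array([rng.lognormal(9.0, 2.0, size=c).sum() for c in counts])

        # Tail quantiles of the aggregate loss distribution.
        print(np.quantile(annual_loss, [0.50, 0.95, 0.99]))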

  1. Atmospheric Visibility Monitoring for planetary optical communications

    NASA Technical Reports Server (NTRS)

    Cowles, Kelly

    1991-01-01

    The Atmospheric Visibility Monitoring project endeavors to improve current atmospheric models and generate visibility statistics relevant to prospective earth-satellite optical communications systems. Three autonomous observatories are being used to measure atmospheric conditions on the basis of observed starlight; these data will yield clear-sky and transmission statistics for three sites with high clear-sky probabilities. Ground-based data will be compared with satellite imagery to determine the correlation between satellite data and ground-based observations.

  2. Stakeholder opinion of functional communication activities following traumatic brain injury.

    PubMed

    Larkins, B M; Worrall, L E; Hickson, L M

    2004-07-01

    To establish a process whereby assessment of functional communication reflects the authentic communication of the target population. The major functional communication assessments available from the USA may not be as relevant to those who reside elsewhere, nor assessments developed primarily for persons who have had a stroke as relevant for traumatic brain injury rehabilitation. The investigation used the Nominal Group Technique to elicit free opinion and support individuals who have compromised communication ability. A survey mailed out sampled a larger number of stakeholders to test out differences among groups. Five stakeholder groups generated items and the survey determined relative 'importance'. The stakeholder groups in both studies comprised individuals with traumatic brain injury and their families, health professionals, third-party payers, employers, and Maori, the indigenous population of New Zealand. There was no statistically significant difference found between groups for 19 of the 31 items. Only half of the items explicitly appear on a well-known USA functional communication assessment. The present study has implications for whether functional communication assessments are valid across cultures and the type of impairment.

  3. Implementation of a standardized out-of-hospital management method for Parkinson dysphagia.

    PubMed

    Wei, Hongying; Sun, Dongxiu; Liu, Meiping

    2017-12-01

    Our objective is to explore the effectiveness and feasibility of establishing a swallowing management clinic to implement out-of-hospital management for Parkinson disease (PD) patients with dysphagia. Two-hundred seventeen (217) voluntary PD patients with dysphagia in a PD outpatient clinic were divided into a control group of 100 people and an experimental group of 117 people. The control group was given dysphagia rehabilitation guidance. The experimental group received the standardized out-of-hospital management method, comprising overall management plus information and education materials. Rehabilitation efficiency and the incidence rates of dysphagia and its relevant complications in both groups were compared after a 6-month intervention. Rehabilitation efficiency and the incidence rate of dysphagia, including relevant complications, in patients treated with the standardized out-of-hospital management were compared with those seen in the control group; the differences were statistically significant (p<0.01). Establishing a swallowing management protocol for the outpatient setting can effectively help the recovery of swallowing function, reduce the incidence rate of dysphagia complications, and improve the quality of life in patients with PD.

  4. Stabilized FE simulation of prototype thermal-hydraulics problems with integrated adjoint-based capabilities

    NASA Astrophysics Data System (ADS)

    Shadid, J. N.; Smith, T. M.; Cyr, E. C.; Wildey, T. M.; Pawlowski, R. P.

    2016-09-01

    A critical aspect of applying modern computational solution methods to complex multiphysics systems of relevance to nuclear reactor modeling, is the assessment of the predictive capability of specific proposed mathematical models. In this respect the understanding of numerical error, the sensitivity of the solution to parameters associated with input data, boundary condition uncertainty, and mathematical models is critical. Additionally, the ability to evaluate and or approximate the model efficiently, to allow development of a reasonable level of statistical diagnostics of the mathematical model and the physical system, is of central importance. In this study we report on initial efforts to apply integrated adjoint-based computational analysis and automatic differentiation tools to begin to address these issues. The study is carried out in the context of a Reynolds averaged Navier-Stokes approximation to turbulent fluid flow and heat transfer using a particular spatial discretization based on implicit fully-coupled stabilized FE methods. Initial results are presented that show the promise of these computational techniques in the context of nuclear reactor relevant prototype thermal-hydraulics problems.

  5. Comparison of Selected Weather Translation Products

    NASA Technical Reports Server (NTRS)

    Kulkarni, Deepak

    2017-01-01

    Weather is a primary contributor to air traffic delays within the National Airspace System (NAS). At present, individual decision makers use weather information and assess its operational impact when creating effective air traffic management solutions. As a result, the estimation of the impact of forecast weather and the quality of the ATM response rely on the skill and experience level of the decision maker. FAA Weather-ATM working groups have developed a Weather-ATM integration framework that consists of weather collection, weather translation, ATM impact conversion, and ATM decision support. Some weather translation measures have been developed for hypothetical operations such as decentralized free flight, whereas others are meant to be relevant to current operations. This paper presents a comparative study of two weather translation products relevant to current operations and finds that the products are strongly correlated with each other. Given inaccuracies in weather prediction, these differences would not be expected to be significant in a statistical study of a large number of decisions made with a look-ahead time of two hours or more.

  6. Th17-related genes and celiac disease susceptibility.

    PubMed

    Medrano, Luz María; García-Magariños, Manuel; Dema, Bárbara; Espino, Laura; Maluenda, Carlos; Polanco, Isabel; Figueredo, M Ángeles; Fernández-Arquero, Miguel; Núñez, Concepción

    2012-01-01

    Th17 cells are known to be involved in several autoimmune or inflammatory diseases. In celiac disease (CD), recent studies suggest an implication of those cells in disease pathogenesis. We aimed at studying the role of genes relevant for the Th17 immune response in CD susceptibility. A total of 101 single nucleotide polymorphisms (SNPs), mainly selected to cover most of the variability present in 16 Th17-related genes (IL23R, RORC, IL6R, IL17A, IL17F, CCR6, IL6, JAK2, TNFSF15, IL23A, IL22, STAT3, TBX21, SOCS3, IL12RB1 and IL17RA), were genotyped in 735 CD patients and 549 ethnically matched healthy controls. Case-control comparisons for each SNP and for the haplotypes resulting from the SNPs studied in each gene were performed using chi-square tests. Gene-gene interactions were also evaluated following different methodological approaches. No significant results emerged after performing the appropriate statistical corrections. Our results seem to discard a relevant role of Th17 cells on CD risk.
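
    The case-control comparisons use chi-square tests per SNP. A minimal sketch with an invented genotype table, including a Bonferroni-style adjustment as one simple form of "the appropriate statistical corrections" (the correction actually used is not stated in the record):

        from scipy.stats import chi2_contingency

        # Hypothetical 2x3 genotype counts: rows = cases/controls, cols = AA/Aa/aa.
        table = [[220, 360, 155],
                 [160, 270, 119]]
        chi2, p, dof, expected = chi2_contingency(table)
        print(chi2, p, dof)
        print(min(p * 101, 1.0))  # Bonferroni adjustment across the 101 SNPs tested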

  7. Forensic Science Research and Development at the National Institute of Justice: Opportunities in Applied Physics

    NASA Astrophysics Data System (ADS)

    Dutton, Gregory

    Forensic science is a collection of applied disciplines that draws from all branches of science. A key question in forensic analysis is: to what degree do a piece of evidence and a known reference sample share characteristics? Quantification of similarity, estimation of uncertainty, and determination of relevant population statistics are of current concern. A 2016 PCAST report questioned the foundational validity and the validity in practice of several forensic disciplines, including latent fingerprints, firearms comparisons and DNA mixture interpretation. One recommendation was the advancement of objective, automated comparison methods based on image analysis and machine learning. These concerns parallel the National Institute of Justice's ongoing R&D investments in applied chemistry, biology and physics. NIJ maintains a funding program spanning fundamental research with potential for forensic application to the validation of novel instruments and methods. Since 2009, NIJ has funded over $179M in external research to support the advancement of accuracy, validity and efficiency in the forensic sciences. An overview of NIJ's programs will be presented, with examples of relevant projects from fluid dynamics, 3D imaging, acoustics, and materials science.

  8. Stabilized FE simulation of prototype thermal-hydraulics problems with integrated adjoint-based capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shadid, J.N., E-mail: jnshadi@sandia.gov; Department of Mathematics and Statistics, University of New Mexico; Smith, T.M.

    A critical aspect of applying modern computational solution methods to complex multiphysics systems of relevance to nuclear reactor modeling, is the assessment of the predictive capability of specific proposed mathematical models. In this respect the understanding of numerical error, the sensitivity of the solution to parameters associated with input data, boundary condition uncertainty, and mathematical models is critical. Additionally, the ability to evaluate and or approximate the model efficiently, to allow development of a reasonable level of statistical diagnostics of the mathematical model and the physical system, is of central importance. In this study we report on initial efforts to apply integrated adjoint-based computational analysis and automatic differentiation tools to begin to address these issues. The study is carried out in the context of a Reynolds averaged Navier–Stokes approximation to turbulent fluid flow and heat transfer using a particular spatial discretization based on implicit fully-coupled stabilized FE methods. Initial results are presented that show the promise of these computational techniques in the context of nuclear reactor relevant prototype thermal-hydraulics problems.

  9. Stabilized FE simulation of prototype thermal-hydraulics problems with integrated adjoint-based capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shadid, J. N.; Smith, T. M.; Cyr, E. C.

    A critical aspect of applying modern computational solution methods to complex multiphysics systems of relevance to nuclear reactor modeling, is the assessment of the predictive capability of specific proposed mathematical models. The understanding of numerical error, the sensitivity of the solution to parameters associated with input data, boundary condition uncertainty, and mathematical models is critical. Additionally, the ability to evaluate and or approximate the model efficiently, to allow development of a reasonable level of statistical diagnostics of the mathematical model and the physical system, is of central importance. In our study we report on initial efforts to apply integrated adjoint-based computational analysis and automatic differentiation tools to begin to address these issues. The study is carried out in the context of a Reynolds averaged Navier–Stokes approximation to turbulent fluid flow and heat transfer using a particular spatial discretization based on implicit fully-coupled stabilized FE methods. We present the initial results that show the promise of these computational techniques in the context of nuclear reactor relevant prototype thermal-hydraulics problems.

  10. Stabilized FE simulation of prototype thermal-hydraulics problems with integrated adjoint-based capabilities

    DOE PAGES

    Shadid, J. N.; Smith, T. M.; Cyr, E. C.; ...

    2016-05-20

    A critical aspect of applying modern computational solution methods to complex multiphysics systems of relevance to nuclear reactor modeling, is the assessment of the predictive capability of specific proposed mathematical models. The understanding of numerical error, the sensitivity of the solution to parameters associated with input data, boundary condition uncertainty, and mathematical models is critical. Additionally, the ability to evaluate and or approximate the model efficiently, to allow development of a reasonable level of statistical diagnostics of the mathematical model and the physical system, is of central importance. In our study we report on initial efforts to apply integrated adjoint-based computational analysis and automatic differentiation tools to begin to address these issues. The study is carried out in the context of a Reynolds averaged Navier–Stokes approximation to turbulent fluid flow and heat transfer using a particular spatial discretization based on implicit fully-coupled stabilized FE methods. We present the initial results that show the promise of these computational techniques in the context of nuclear reactor relevant prototype thermal-hydraulics problems.

  11. GIS, geostatistics, metadata banking, and tree-based models for data analysis and mapping in environmental monitoring and epidemiology.

    PubMed

    Schröder, Winfried

    2006-05-01

    By the example of environmental monitoring, some applications of geographic information systems (GIS), geostatistics, metadata banking, and Classification and Regression Trees (CART) are presented. These tools are recommended for mapping statistically estimated hot spots of vectors and pathogens. GIS were introduced as tools for spatially modelling the real world. The modelling can be done by mapping objects according to the spatial information content of data. Additionally, this can be supported by geostatistical and multivariate statistical modelling. This is demonstrated by the example of modelling marine habitats of benthic communities and of terrestrial ecoregions. Such ecoregionalisations may be used to predict phenomena based on the statistical relation between measurements of an interesting phenomenon such as, e.g., the incidence of medically relevant species and correlated characteristics of the ecoregions. The combination of meteorological data and data on plant phenology can enhance the spatial resolution of the information on climate change. To this end, meteorological and phenological data have to be correlated. To enable this, both data sets which are from disparate monitoring networks have to be spatially connected by means of geostatistical estimation. This is demonstrated by the example of transformation of site-specific data on plant phenology into surface data. The analysis allows for spatial comparison of the phenology during the two periods 1961-1990 and 1991-2002 covering whole Germany. The changes in both plant phenology and air temperature were proved to be statistically significant. Thus, they can be combined by GIS overlay technique to enhance the spatial resolution of the information on the climate change and use them for the prediction of vector incidences at the regional scale. The localisation of such risk hot spots can be done by geometrically merging surface data on promoting factors. This is demonstrated by the example of the transfer of heavy metals through soils. The predicted hot spots of heavy metal transfer can be validated empirically by measurement data which can be inquired by a metadata base linked with a geographic information system. A corresponding strategy for the detection of vector hot spots in medical epidemiology is recommended. Data on incidences and habitats of the Anophelinae in the marsh regions of Lower Saxony (Germany) were used to calculate a habitat model by CART, which together with climate data and data on ecoregions can be further used for the prediction of habitats of medically relevant vector species. In the future, this approach should be supported by an internet-based information system consisting of three components: metadata questionnaire, metadata base, and GIS to link metadata, surface data, and measurement data on incidences and habitats of medically relevant species and related data on climate, phenology, and ecoregional characteristic conditions.

  12. Mathematical aspects of assessing extreme events for the safety of nuclear plants

    NASA Astrophysics Data System (ADS)

    Potempski, Slawomir; Borysiewicz, Mieczyslaw

    2015-04-01

    This paper reviews the mathematical methodologies applied for assessing the low frequencies of rare natural events, such as earthquakes, tsunamis, hurricanes or tornadoes, floods (in particular flash floods and storm surges), lightning, and solar flares, from the perspective of the safety assessment of nuclear plants. The statistical methods are usually based on extreme value theory, which deals with the analysis of extreme deviations from the median (or the mean). Various mathematical tools can be useful in this respect: the extreme value theorem of Fisher-Tippett-Gnedenko, which leads to the possible choices of generalized extreme value distributions; the Pickands-Balkema-de Haan theorem for tail fitting; and methods related to large deviation theory. The paper presents the most important statistical distributions for rare-event analysis. This concerns, for example, the analysis of data on annual extreme values (maxima, the "Annual Maxima Series", or minima), of peak values exceeding given thresholds during periods of interest ("Peak Over Threshold"), and the estimation of the size of exceedance. Although there is a lack of statistical data directly containing rare events, in some cases it is still possible to extract useful information from existing larger data sets. As an example, one can consider data sets available from web sites for floods, earthquakes, or natural hazards generally. Some aspects of such data sets will also be presented, taking into account their usefulness for the practical assessment of risk to nuclear power plants from extreme weather conditions.
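
    As a concrete illustration of the two workflows named here (Annual Maxima with a GEV fit, Peak Over Threshold with a GPD fit), a minimal SciPy sketch on synthetic data; the record and its parameters are invented for demonstration:

        import numpy as np
        from scipy.stats import genextreme, genpareto

        rng = np.random.default_rng(5)
        daily = rng.gumbel(loc=10.0, scale=2.0, size=365 * 50)  # synthetic 50-year record

        # Annual Maxima Series -> GEV fit (Fisher-Tippett-Gnedenko theorem).
        annual_max = daily.reshape(50, 365).max(axis=1)
        c, loc, scale = genextreme.fit(annual_max)  # note: SciPy's shape c = -xi

        # 100-year return level: value exceeded with probability 1/100 per year.
        print(genextreme.ppf(1 - 1 / 100, c, loc, scale))

        # Peak Over Threshold -> GPD fit to exceedances (Pickands-Balkema-de Haan).
        threshold = np.quantile(daily, 0.99)
        exceedances = daily[daily > threshold] - threshold
        print(genpareto.fit(exceedances, floc=0.0))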

  13. Statistical Modeling of Extreme Values and Evidence of Presence of Dragon King (DK) in Solar Wind

    NASA Astrophysics Data System (ADS)

    Gomes, T.; Ramos, F.; Rempel, E. L.; Silva, S.; C-L Chian, A.

    2017-12-01

    The solar wind constitutes a nonlinear dynamical system, presenting intermittent turbulence, multifractality and chaotic dynamics. One characteristic shared by many such complex systems is the presence of extreme events, which play an important role in several geophysical phenomena; their statistical characterization is a problem of great practical relevance. This work investigates the presence of extreme events in time series of the modulus of the interplanetary magnetic field measured by the Cluster spacecraft on February 2, 2002. One of the main results is that the solar wind near the Earth's bow shock can be modeled by the Generalized Pareto (GP) and Generalized Extreme Value (GEV) distributions. Both models present a statistically significant positive shape parameter, which implies a heavy tail in the probability distribution functions and an unbounded growth in return values as return periods become long. There is evidence that current sheets are mainly responsible for positive values of the shape parameter. It is also shown that magnetic reconnection at the interface between two interplanetary magnetic flux ropes in the solar wind can be considered a Dragon King (DK), a class of extreme events whose formation mechanisms are fundamentally different from those of other extreme events. As long as magnetic reconnection can be classified as a Dragon King, there is the possibility of its identification and even its prediction. Dragon Kings had previously been identified in time series of financial crashes, nuclear power generation accidents, the stock market, and so on. It is believed that they are associated with the occurrence of extreme events in dynamical systems at phase transitions, bifurcations, crises, or tipping points.

  14. Cervical Musculoskeletal Impairments and Temporomandibular Disorders

    PubMed Central

    Magee, David

    2012-01-01

    ABSTRACT Objectives The significance of the cervical muscles in the development and perpetuation of Temporomandibular Disorders has not been elucidated. Thus this project was designed to investigate the association between cervical musculoskeletal impairments and Temporomandibular Disorders. Material and Methods A sample of 154 subjects participated in this study. All subjects underwent a series of physical tests and electromyographic assessment (i.e. head and neck posture, maximal cervical muscle strength, cervical flexor and extensor muscle endurance, and cervical flexor muscle performance) to determine cervical musculoskeletal impairments. Results A strong relationship between neck disability and jaw disability was found (r = 0.82). Craniocervical posture was statistically different between patients with myogenous Temporomandibular Disorders (TMD) and healthy subjects. However, the difference was too small (3.3°) to be considered clinically relevant. Maximal cervical flexor muscle strength was not statistically or clinically different between patients with TMD and healthy subjects. No statistically significant differences were found in electromyographic activity of the sternocleidomastoid or the anterior scalene muscles in patients with TMD when compared to healthy subjects while executing the craniocervical flexion test (P = 0.07). However, clinically important effect sizes (0.42-0.82) were found. Subjects with TMD presented with reduced cervical flexor as well as extensor muscle endurance while performing the flexor and extensor muscle endurance tests when compared to healthy individuals. Conclusions Subjects with Temporomandibular Disorders presented with impairments of the cervical flexor and extensor muscles. These results could help guide clinicians in the assessment and prescription of more effective interventions for individuals with Temporomandibular Disorders. PMID:24422022

  15. Probabilistic risk analysis of building contamination.

    PubMed

    Bolster, D T; Tartakovsky, D M

    2008-10-01

    We present a general framework for probabilistic risk assessment (PRA) of building contamination. PRA provides a powerful tool for the rigorous quantification of risk in contamination of building spaces. A typical PRA starts by identifying relevant components of a system (e.g. ventilation system components, potential sources of contaminants, remediation methods) and proceeds by using available information and statistical inference to estimate the probabilities of their failure. These probabilities are then combined by means of fault-tree analyses to yield probabilistic estimates of the risk of system failure (e.g. building contamination). A sensitivity study of PRAs can identify features and potential problems that need to be addressed with the most urgency. Often PRAs are amenable to approximations, which can significantly simplify the approach. All these features of PRA are presented in this paper via a simple illustrative example, which can be built upon in further studies. The tool presented here can be used to design and maintain adequate ventilation systems to minimize exposure of occupants to contaminants.
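    As a toy illustration of the fault-tree step described above, the sketch below combines hypothetical, independent component-failure probabilities through AND/OR gates; none of the numbers or event names come from the paper.

        # Hypothetical fault tree for building contamination (Python).
        def p_or(*ps):
            """Probability that at least one independent input event occurs."""
            out = 1.0
            for p in ps:
                out *= 1.0 - p
            return 1.0 - out

        def p_and(*ps):
            """Probability that all independent input events occur."""
            out = 1.0
            for p in ps:
                out *= p
            return out

        p_source = 0.02       # a contaminant source is present
        p_filter = 0.05       # the filtration stage fails
        p_dilution = 0.10     # ventilation dilution fails
        p_remediation = 0.01  # remediation fails to act in time

        # Contamination requires a source AND failure of at least one barrier chain.
        p_top = p_and(p_source, p_or(p_and(p_filter, p_dilution), p_remediation))
        print(f"P(building contamination) = {p_top:.4f}")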

  16. Identification of relevant single-nucleotide polymorphisms in Pneumocystis jirovecii: relationship with clinical data.

    PubMed

    Esteves, F; Gaspar, J; Marques, T; Leite, R; Antunes, F; Mansinho, K; Matos, O

    2010-07-01

    Pneumocystis jirovecii is a poorly understood pathogen that causes opportunistic pneumonia (Pneumocystis pneumonia (PcP)) in patients with AIDS. The present study was aimed at correlating genetic differences in P. jirovecii isolates and clinical patient data. A description of genetic diversity in P. jirovecii isolates from human immunodeficiency virus-positive patients, based on the identification of multiple single-nucleotide polymorphisms (SNPs) at five distinct loci encoding mitochondrial large-subunit rRNA (mtLSU rRNA), cytochrome b (CYB), superoxide dismutase (SOD), dihydrofolate reductase (DHFR), and dihydropteroate synthase (DHPS), was achieved using PCR with DNA sequencing and restriction fragment length polymorphism analysis. The statistical analysis revealed several interesting correlations among the four most relevant SNPs (mt85, SOD110, SOD215, and DHFR312) and specific clinical parameters: mt85C was associated with undiagnosed or atypical PcP episodes and favourable follow-up; SOD215C was associated with favourable follow-up; and DHFR312T was associated with PcP cases presenting moderate to high parasite burdens. The genotypes mt85C/SOD215C and SOD110T/SOD215C were found to be associated with less virulent P. jirovecii infections, whereas the genotype SOD110T/SOD215T was found to be related to more virulent PcP episodes. The present work demonstrated that potential P. jirovecii haplotypes may be related to the clinical data and outcome of PcP.

  17. Behavioral relevance of gamma-band activity for short-term memory-based auditory decision-making.

    PubMed

    Kaiser, Jochen; Heidegger, Tonio; Lutzenberger, Werner

    2008-06-01

    Oscillatory activity in the gamma-band range has been established as a correlate of cognitive processes, including perception, attention and memory. Only a few studies, however, have provided evidence for an association between gamma-band activity (GBA) and measures of behavioral performance. Here we focused on the comparison between sample and test stimuli S1 and S2 during an auditory spatial short-term memory task. Applying statistical probability mapping to magnetoencephalographic recordings from 28 human subjects, we identified GBA components distinguishing nonidentical from identical S1-S2 pairs. This activity was found at frequencies between 65 and 90 Hz and was localized over posterior cortical regions contralateral to the hemifield in which the stimuli were presented. The 10 best task performers showed higher amplitudes of this GBA component than the 10 worst performers. This group difference was most pronounced between about 150 and 300 ms after stimulus onset. Apparently the decision about whether test stimuli matched the stored representation of previously presented sample sounds relied partly on the oscillatory activation of networks representing differences between both stimuli. This result could be replicated by reanalyzing the combined data from two previous studies assessing short-term memory for sound duration and sound lateralization, respectively. Similarly to our main study, GBA amplitudes to nonmatching vs. matching S1-S2 pairs were higher in good performers than poor performers. The present findings demonstrate the behavioral relevance of GBA.

  18. Fragrance allergy in patients with hand eczema - a clinical study.

    PubMed

    Heydorn, Siri; Johansen, Jeanne Duus; Andersen, Klaus E; Bruze, Magnus; Svedman, Cecilia; White, Ian R; Basketter, David A; Menné, Torkil

    2003-06-01

    Fragrance allergy and hand eczema are both common among dermatological patients. Fragrance mix (FM) and its constituents have a recognized relevance to exposure to fine fragrances and cosmetic products. Based on extensive chemical analysis and database searches, a new selection of fragrances was established, including 14 known fragrance allergens present in products to which hand exposure would occur. A non-irritating patch-test concentration for some fragrances was established in 212 consecutive patients. 658 consecutive patients presenting with hand eczema were patch tested with the European standard series and the developed selection of fragrances. 67 (10.2%) of the 658 patients had a positive reaction to 1 or more of the fragrance chemicals present in the new selection. The most common reactions to fragrances not included in the FM were to citral, Lyral (hydroxyisohexyl-3-cyclohexene carboxaldehyde) and oxidized l-limonene. A concomitant reaction to the FM identified potential fragrance allergy in less than half of these patients. Exposure assessment and a statistically significant association between a positive patch test to our selected fragrances and patients' history support the relevance of this selection of fragrances. Those with a positive reaction to our selected fragrances were significantly more likely to have 1 or more positive patch tests in the standard series. This observation is the basis for the hypothesis concerning cross-reactivity and the effect of simultaneous exposure. The study found that fragrance allergy could be a common problem in patients with hand eczema.

  19. Imaging flow cytometry for phytoplankton analysis.

    PubMed

    Dashkova, Veronika; Malashenkov, Dmitry; Poulton, Nicole; Vorobjev, Ivan; Barteneva, Natasha S

    2017-01-01

    This review highlights the concepts and instrumentation of imaging flow cytometry technology, and in particular its use for phytoplankton analysis. Imaging flow cytometry, a hybrid technology combining the speed and statistical capabilities of flow cytometry with the imaging features of microscopy, is rapidly advancing as a cell imaging platform that overcomes many of the limitations of current techniques and has contributed significantly to the advancement of phytoplankton analysis in recent years. This review presents the various instrumentation relevant to the field and currently used for assessing the composition and abundance of complex phytoplankton communities, determining size structure, estimating biovolume, detecting harmful algal bloom species, and evaluating viability and metabolic activity, among other applications. We also present our data on viability and metabolic assessment of Aphanizomenon sp. cyanobacteria using the ImageStream X Mark II imaging cytometer. Herein, we highlight the immense potential of imaging flow cytometry for microalgal research, but also discuss limitations and future developments. Copyright © 2016 Elsevier Inc. All rights reserved.

  20. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend analysis is neither a precise term nor a circumscribed methodology, but rather connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data are time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.
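    A minimal sketch of the Standard's core fits (linear, quadratic, and exponential models of time-series data) is given below in Python; the series itself is fabricated for illustration.

        import numpy as np

        t = np.arange(24, dtype=float)  # e.g., months of program data
        rng = np.random.default_rng(1)
        y = 5.0 * np.exp(0.08 * t) + rng.normal(0.0, 1.0, t.size)

        lin = np.polyfit(t, y, 1)    # linear trend (slope, intercept)
        quad = np.polyfit(t, y, 2)   # quadratic trend
        a, b = np.polyfit(t, np.log(np.clip(y, 1e-9, None)), 1)  # log-linear fit
        print("linear:", lin)
        print("quadratic:", quad)
        print(f"exponential: y ~ {np.exp(b):.2f} * exp({a:.3f} * t)")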

  1. Isotope effect on blob-statistics in gyrofluid simulations of scrape-off layer turbulence

    NASA Astrophysics Data System (ADS)

    Meyer, O. H. H.; Kendl, A.

    2017-12-01

    In this contribution we apply a recently established stochastic model for scrape-off layer fluctuations to long time series obtained from gyrofluid simulations of fusion edge plasma turbulence. Characteristic parameters are estimated for different fusion-relevant isotopic compositions (protium, deuterium, tritium and singly charged helium) by means of conditional averaging. It is shown that large amplitude fluctuations associated with radially propagating filaments in the scrape-off layer feature double-exponential waveforms. We find increased pulse duration and longer waiting times between peaks for heavier ions, while the amplitudes are similar. The associated radial blob velocity is shown to be reduced for heavier ions. A parabolic relation between the skewness and kurtosis of density fluctuations appears to be present. Improved particle confinement for heavier plasmas, in terms of a reduced mean density close to the outermost radial boundary, is presented together with the corresponding blob characteristics.
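    The skewness-kurtosis relation mentioned above can be probed with a few lines of Python; the synthetic shot-noise signal below is only a stand-in for the gyrofluid time series.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        n = 100_000
        signal = np.zeros(n)
        # Superpose exponential pulses at random times (crude shot-noise model).
        for peak in rng.integers(0, n, 400):
            tail = min(200, n - peak)
            signal[peak:peak + tail] += rng.exponential(1.0) * np.exp(-np.arange(tail) / 20.0)

        s = stats.skew(signal)
        k = stats.kurtosis(signal, fisher=False)  # equals 3 for a Gaussian
        print(f"skewness = {s:.2f}, kurtosis = {k:.2f}")
        # A parabolic relation K = a*S**2 + b is expected for such processes.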

  2. A Methodology for Anatomic Ultrasound Image Diagnostic Quality Assessment.

    PubMed

    Hemmsen, Martin Christian; Lange, Theis; Brandt, Andreas Hjelm; Nielsen, Michael Bachmann; Jensen, Jorgen Arendt

    2017-01-01

    This paper discusses methods for the assessment of ultrasound image quality based on our experiences with evaluating new methods for anatomic imaging. It presents a methodology to ensure a fair assessment between competing imaging methods using clinically relevant evaluations. The methodology is valuable in the continuing process of method optimization and guided development of new imaging methods. It includes a three-phase study plan covering initial prototype development through clinical assessment. Recommendations for the clinical assessment protocol, software, and statistical analysis are presented. Earlier uses of the methodology have shown that it ensures validity of the assessment, as it separates the influences of developer, investigator, and assessor once a research protocol has been established. This separation reduces confounding influences from the developer on the result, to properly reveal the clinical value. This paper exemplifies the methodology using recent studies of synthetic aperture sequential beamforming tissue harmonic imaging.

  3. A synthesis of research on color, typography and graphics as they relate to readability

    NASA Astrophysics Data System (ADS)

    Lamoreaux, M. E.

    1985-09-01

    A foundation for future research on the use of color, typography, and graphics to improve readability is provided. Articles from the broad fields of education and psychology, as well as from the fields of journalism and printing, have been reviewed for research relating color, typography, and graphics to reading ease, speed, or comprehension. The most relevant articles reviewed are presented in an annotated bibliography; the remaining articles are also presented in bibliographic format. This literature review indicates that recognition and recall of printed material may be improved through the use of headings, underlining, color, and, especially, illustrations. Current research suggests that individuals can remember pictures far longer than past research indicated. However, researchers are divided on the usefulness of illustrations for improving reading comprehension. On the other hand, reading comprehension can be improved through the use of statistical graphs and tables if the reader is properly trained in the use of these devices.

  4. Potential Application of a Graphical Processing Unit to Parallel Computations in the NUBEAM Code

    NASA Astrophysics Data System (ADS)

    Payne, J.; McCune, D.; Prater, R.

    2010-11-01

    NUBEAM is a comprehensive computational Monte Carlo based model for neutral beam injection (NBI) in tokamaks. NUBEAM computes NBI-relevant profiles in tokamak plasmas by tracking the deposition and the slowing of fast ions. At the core of NUBEAM are vector calculations used to track fast ions. These calculations have recently been parallelized to run on MPI clusters. However, cost and interlink bandwidth limit the ability to fully parallelize NUBEAM on an MPI cluster. Recent implementation of double precision capabilities for Graphical Processing Units (GPUs) presents a cost effective and high performance alternative or complement to MPI computation. Commercially available graphics cards can achieve up to 672 GFLOPS double precision and can handle hundreds of thousands of threads. The ability to execute at least one thread per particle simultaneously could significantly reduce the execution time and the statistical noise of NUBEAM. Progress on implementation on a GPU will be presented.

  5. Dynamic P-Technique for Modeling Patterns of Data: Applications to Pediatric Psychology Research

    PubMed Central

    Aylward, Brandon S.; Rausch, Joseph R.

    2011-01-01

    Objective Dynamic p-technique (DPT) is a potentially useful statistical method for examining relationships among dynamic constructs in a single individual or small group of individuals over time. The purpose of this article is to offer a nontechnical introduction to DPT. Method An overview of DPT analysis, with an emphasis on potential applications to pediatric psychology research, is provided. To illustrate how DPT might be applied, an example using simulated data is presented for daily pain and negative mood ratings. Results The simulated example demonstrates the application of DPT to a relevant pediatric psychology research area. In addition, the potential application of DPT to the longitudinal study of adherence is presented. Conclusion Although it has not been utilized frequently within pediatric psychology, DPT could be particularly well-suited for research in this field because of its ability to powerfully model repeated observations from very small samples. PMID:21486938

  6. Dynamic p-technique for modeling patterns of data: applications to pediatric psychology research.

    PubMed

    Nelson, Timothy D; Aylward, Brandon S; Rausch, Joseph R

    2011-10-01

    Dynamic p-technique (DPT) is a potentially useful statistical method for examining relationships among dynamic constructs in a single individual or small group of individuals over time. The purpose of this article is to offer a nontechnical introduction to DPT. An overview of DPT analysis, with an emphasis on potential applications to pediatric psychology research, is provided. To illustrate how DPT might be applied, an example using simulated data is presented for daily pain and negative mood ratings. The simulated example demonstrates the application of DPT to a relevant pediatric psychology research area. In addition, the potential application of DPT to the longitudinal study of adherence is presented. Although it has not been utilized frequently within pediatric psychology, DPT could be particularly well-suited for research in this field because of its ability to powerfully model repeated observations from very small samples.
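    The heart of a DPT-style analysis, lagging a single subject's multivariate series against itself and inspecting cross-lag correlations, can be sketched as follows; the simulated pain and mood ratings are invented for illustration and are not the article's data.

        import numpy as np

        rng = np.random.default_rng(3)
        days = 120
        pain = rng.normal(0.0, 1.0, days)
        # Mood depends on the previous day's pain plus noise.
        mood = 0.5 * np.roll(pain, 1) + rng.normal(0.0, 1.0, days)

        same_day = np.corrcoef(pain[:-1], mood[:-1])[0, 1]
        next_day = np.corrcoef(pain[:-1], mood[1:])[0, 1]  # pain_t vs mood_(t+1)
        print(f"r(pain_t, mood_t)   = {same_day:.2f}")
        print(f"r(pain_t, mood_t+1) = {next_day:.2f}")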

  7. Improve ay101 teaching by single modifications versus unchanged controls: statistically-supported examples

    NASA Astrophysics Data System (ADS)

    Byrd, Gene G.; Byrd, Dana

    2017-06-01

    This paper on improving Ay101 courses has two main purposes: to present (1) some very effective single changes and (2) a method for improving teaching by making just single changes that are evaluated statistically against a control-group class. We show how a simple statistical comparison can be done even with Excel in Windows; of course, more sophisticated and powerful methods could be used if available. One of several examples discussed on our poster is our modification of an online introductory astronomy lab course evaluated by the multiple-choice final exam. We composed questions related to the learning objectives of the course modules (LOQs). Students could “talk to themselves” by discursively answering these for extra credit prior to the final. Results were compared to an otherwise identical previous unmodified class. Modified classes showed statistically significantly better final exam average scores (78% vs. 66%). This modification helped those students who most needed help: students in the lower third of the class preferentially answered the LOQs, improving their scores and the class average on the exam. These results also show the effectiveness of relevant extra credit work. Other examples will be discussed as specific instances of evaluating improvement by making one change and then testing it against a control. Essentially, this is an evolutionary approach in which single favorable “mutations” are retained and the unfavorable ones removed. The temptation to make more than one change each time must be resisted!
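    The single-change-versus-control comparison described above amounts to a two-sample test; here is a minimal Python version with invented individual scores (the 78% and 66% class averages above are the paper's; the score lists are not).

        from scipy import stats

        control = [66, 70, 58, 72, 61, 64, 69, 60, 67, 63]   # unmodified class
        modified = [78, 74, 82, 71, 80, 77, 75, 83, 72, 79]  # class with LOQs

        t, p = stats.ttest_ind(modified, control, equal_var=False)  # Welch's t-test
        print(f"t = {t:.2f}, p = {p:.4f}")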

  8. Fine-scale landscape genetics of the American badger (Taxidea taxus): disentangling landscape effects and sampling artifacts in a poorly understood species

    PubMed Central

    Kierepka, E M; Latch, E K

    2016-01-01

    Landscape genetics is a powerful tool for conservation because it identifies landscape features that are important for maintaining genetic connectivity between populations within heterogeneous landscapes. However, using landscape genetics in poorly understood species presents a number of challenges, namely, limited life history information for the focal population and spatially biased sampling. Both obstacles can reduce statistical power, particularly in individual-based studies. In this study, we genotyped 233 American badgers in Wisconsin at 12 microsatellite loci to identify alternative statistical approaches that can be applied to poorly understood species in an individual-based framework. Badgers are protected in Wisconsin owing to an overall lack of life history information, so our study utilized partial redundancy analysis (RDA) and spatially lagged regressions to quantify how three landscape factors (Wisconsin River, Ecoregions and land cover) impacted gene flow. We also performed simulations to quantify errors created by spatially biased sampling. Statistical analyses first found that geographic distance was an important influence on gene flow, mainly driven by fine-scale positive spatial autocorrelation. After controlling for geographic distance, both RDA and regressions found that the Wisconsin River and Agriculture were correlated with genetic differentiation. However, only Agriculture had an acceptable type I error rate (3–5%) to be considered biologically relevant. Collectively, this study highlights the benefits of combining robust statistics and error assessment via simulations, and it provides a method for hypothesis testing in individual-based landscape genetics. PMID:26243136

  9. Evaluation of reconstruction techniques in regional cerebral blood flow SPECT using trade-off plots: a Monte Carlo study.

    PubMed

    Olsson, Anna; Arlig, Asa; Carlsson, Gudrun Alm; Gustafsson, Agnetha

    2007-09-01

    The image quality of single photon emission computed tomography (SPECT) depends on the reconstruction algorithm used. The purpose of the present study was to evaluate parameters in ordered subset expectation maximization (OSEM) and to compare systematically with filtered back-projection (FBP) for reconstruction of regional cerebral blood flow (rCBF) SPECT, incorporating attenuation and scatter correction. The evaluation was based on the trade-off between contrast recovery and statistical noise using different sizes of subsets, numbers of iterations and filter parameters. Monte Carlo simulated SPECT studies of a digital human brain phantom were used. The contrast recovery was calculated as measured contrast divided by true contrast. Statistical noise in the reconstructed images was calculated as the coefficient of variation in pixel values. A constant contrast level was reached above 195 equivalent maximum likelihood expectation maximization iterations. The choice of subset size was not crucial as long as there were ≥2 projections per subset. The OSEM reconstruction was found to give 5-14% higher contrast recovery than FBP for all clinically relevant noise levels in rCBF SPECT. The Butterworth filter, power 6, achieved the highest stable contrast recovery level at all clinically relevant noise levels. The cut-off frequency should be chosen according to the noise level accepted in the image. Trade-off plots are shown to be a practical way of deciding the number of iterations and subset size for the OSEM reconstruction and can be used for other examination types in nuclear medicine.
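    One point of such a trade-off plot can be computed as below; the reconstruction array, masks, and true contrast are hypothetical stand-ins for the phantom quantities used in the study.

        import numpy as np

        def tradeoff_point(recon, roi_mask, bg_mask, true_contrast):
            """Return (contrast recovery, noise) for one reconstruction setting."""
            roi_mean = recon[roi_mask].mean()
            bg_mean = recon[bg_mask].mean()
            contrast_recovery = (roi_mean / bg_mean) / true_contrast
            noise = recon[bg_mask].std() / bg_mean  # coefficient of variation
            return contrast_recovery, noise

        rng = np.random.default_rng(4)
        recon = rng.normal(100.0, 10.0, (64, 64))  # fake reconstructed slice
        roi = np.zeros((64, 64), dtype=bool)
        roi[28:36, 28:36] = True                   # fake hot region
        recon[roi] += 50.0
        print(tradeoff_point(recon, roi, ~roi, true_contrast=1.5))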

  10. Targeting targeted agents: open issues for clinical trial design.

    PubMed

    Bria, Emilio; Di Maio, Massimo; Carlini, Paolo; Cuppone, Federica; Giannarelli, Diana; Cognetti, Francesco; Milella, Michele

    2009-05-22

    Molecularly targeted agents for the treatment of solid tumors have entered the market in the last 5 years, with a great impact upon both the scientific community and society. Many randomized phase III trials conducted in recent years with new targeted agents have produced disappointingly negative results, although previous data from preclinical research and phase II trials were often promising. Some other trials have actually met their primary endpoint, demonstrating a statistically significant result favouring the experimental treatment. Unfortunately, with a few relevant exceptions, this advantage is often small, if not negligible, in absolute terms. The difference between statistical significance and clinical relevance should always be considered when translating clinical trials' results into practice. The reason why this 'revolution' did not significantly impact cancer treatment or displace chemotherapy from the patient's bedside lies in part in the complicated and, in many cases, unknown mechanisms of action of such drugs; indeed, the traditional way clinical investigators tested the efficacy of 'older' chemotherapeutics has become out of date from the methodological perspective. As these drugs should theoretically be tailored to featured biomarkers expressed by the patients, clinical trial design should follow new rules based upon stronger hypotheses than those developed so far. Indeed, the early phases of basic and clinical drug development are crucial to the process of correctly identifying the target (when present). Targeted trial designs can result in easier studies with fewer, better-selected patients and stronger proofs of response, so as not to waste time and resources.

  11. Quantifying the statistical importance of utilizing regression over classic energy intensity calculations for tracking efficiency improvements in industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nimbalkar, Sachin U.; Wenning, Thomas J.; Guo, Wei

    In the United States, manufacturing facilities accounted for about 32% of total domestic energy consumption in 2014. Robust energy tracking methodologies are critical to understanding energy performance in manufacturing facilities. Due to its simplicity and intuitiveness, the classic energy intensity method (i.e., the ratio of total energy use to total production) is the most widely adopted. However, the classic energy intensity method does not take into account the variation of other relevant parameters (e.g., product type, feedstock type, weather). Furthermore, the energy intensity method assumes that a facility's base energy consumption (energy use at zero production) is zero, which rarely holds true. Therefore, it is commonly recommended to utilize regression models rather than the energy intensity approach for tracking improvements at the facility level. Unfortunately, many energy managers have difficulty understanding why regression models are statistically better than the classic energy intensity method. While anecdotes and qualitative information may convince some, many have major reservations about the accuracy of regression models and whether it is worth the time and effort to gather data and build quality regression models. This paper explains why regression models are theoretically and quantitatively more accurate for tracking energy performance improvements. Based on the analysis of data from 114 manufacturing plants over 12 years, it presents quantitative results on the importance of utilizing regression models over the energy intensity methodology. The paper also documents scenarios where regression models offer no significant advantage over the energy intensity method.
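    The contrast between the two approaches fits in a few lines; the production and energy figures below are invented, and the point is that the regression recovers a nonzero base load that the intensity ratio assumes away.

        import numpy as np

        production = np.array([80.0, 95.0, 110.0, 120.0, 135.0, 150.0])  # units/month
        rng = np.random.default_rng(5)
        energy = 40.0 + 2.0 * production + rng.normal(0.0, 3.0, production.size)

        intensity = energy.sum() / production.sum()           # classic metric
        slope, intercept = np.polyfit(production, energy, 1)  # regression model
        print(f"energy intensity = {intensity:.2f} (implicitly assumes zero base load)")
        print(f"regression: energy = {intercept:.1f} + {slope:.2f} * production")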

  12. Statistical Regularities Attract Attention when Task-Relevant.

    PubMed

    Alamia, Andrea; Zénon, Alexandre

    2016-01-01

    Visual attention seems essential for learning the statistical regularities in our environment, a process known as statistical learning. However, how attention is allocated when exploring a novel visual scene whose statistical structure is unknown remains unclear. In order to address this question, we investigated visual attention allocation during a task in which we manipulated the conditional probability of occurrence of colored stimuli, unbeknown to the subjects. Participants were instructed to detect a target colored dot among two dots moving along separate circular paths. We evaluated implicit statistical learning, i.e., the effect of color predictability on reaction times (RTs), and recorded eye position concurrently. Attention allocation was indexed by comparing the Mahalanobis distance between the position, velocity and acceleration of the eyes and the two colored dots. We found that learning the conditional probabilities occurred very early during the course of the experiment as shown by the fact that, starting already from the first block, predictable stimuli were detected with shorter RT than unpredictable ones. In terms of attentional allocation, we found that the predictive stimulus attracted gaze only when it was informative about the occurrence of the target but not when it predicted the occurrence of a task-irrelevant stimulus. This suggests that attention allocation was influenced by regularities only when they were instrumental in performing the task. Moreover, we found that the attentional bias towards task-relevant predictive stimuli occurred at a very early stage of learning, concomitantly with the first effects of learning on RT. In conclusion, these results show that statistical regularities capture visual attention only after a few occurrences, provided these regularities are instrumental to perform the task.
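    The gaze-to-stimulus index used above can be sketched as a mean Mahalanobis distance between gaze kinematics and a dot's kinematics; all arrays below are simulated placeholders, not the eye-tracking data.

        import numpy as np
        from scipy.spatial.distance import mahalanobis

        rng = np.random.default_rng(6)
        gaze = rng.normal(0.0, 1.0, (500, 6))  # x/y position, velocity, acceleration
        dot = gaze + rng.normal(0.0, 0.3, gaze.shape)  # a dot the gaze tracks closely

        vi = np.linalg.inv(np.cov((gaze - dot).T))  # inverse covariance of residuals
        d = np.mean([mahalanobis(g, s, vi) for g, s in zip(gaze, dot)])
        print(f"mean gaze-to-dot Mahalanobis distance = {d:.2f}")  # smaller = more attended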

  13. Neural Systems with Numerically Matched Input-Output Statistic: Isotonic Bivariate Statistical Modeling

    PubMed Central

    Fiori, Simone

    2007-01-01

    Bivariate statistical modeling from incomplete data is a useful statistical tool that allows one to discover the model underlying two data sets when the data in the two sets do not correspond in size or in ordering. Such a situation may occur when the sizes of the two data sets do not match (i.e., there are “holes” in the data) or when the data sets have been acquired independently. Statistical modeling is also useful when the amount of available data is sufficient to reveal the relevant statistical features of the phenomenon underlying the data. We propose to tackle the problem of statistical modeling via a neural (nonlinear) system that is able to match its input-output statistic to the statistic of the available data sets. A key point of the implementation proposed here is that it is based on look-up-table (LUT) neural systems, which guarantee a computationally advantageous way of implementing neural systems. A number of numerical experiments, performed on both synthetic and real-world data sets, illustrate the features of the proposed modeling procedure. PMID:18566641
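    A bare-bones version of statistic matching with a look-up table is shown below: quantiles of the input data index quantiles of the target data, giving a monotone map whose output distribution matches the target even when the two sets differ in size. The data sets are synthetic, and this is only a sketch of the general idea, not the paper's neural implementation.

        import numpy as np

        rng = np.random.default_rng(7)
        x = rng.normal(0.0, 1.0, 800)   # input data set
        y = rng.exponential(2.0, 500)   # target data set (different size)

        q = np.linspace(0.0, 1.0, 101)
        lut_in = np.quantile(x, q)      # input quantile grid
        lut_out = np.quantile(y, q)     # target quantile grid
        mapped = np.interp(x, lut_in, lut_out)  # evaluate the LUT by interpolation
        print(np.mean(mapped), np.mean(y))      # output statistics now match target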

  14. Determination of statistics for any rotation of axes of a bivariate normal elliptical distribution. [of wind vector components

    NASA Technical Reports Server (NTRS)

    Falls, L. W.; Crutcher, H. L.

    1976-01-01

    Transformation of statistics from a dimensional set to another dimensional set involves linear functions of the original set of statistics. Similarly, linear functions will transform statistics within a dimensional set such that the new statistics are relevant to a new set of coordinate axes. A restricted case of the latter is the rotation of axes in a coordinate system involving any two correlated random variables. A special case is the transformation for horizontal wind distributions. Wind statistics are usually provided in terms of wind speed and direction (measured clockwise from north) or in east-west and north-south components. A direct application of this technique allows the determination of appropriate wind statistics parallel and normal to any preselected flight path of a space vehicle. Among the constraints for launching space vehicles are critical values selected from the distribution of the expected winds parallel to and normal to the flight path. These procedures are applied to space vehicle launches at Cape Kennedy, Florida.
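    The linear transformation in question is just a rotation applied to the mean vector and covariance matrix; the sketch below uses invented east/north wind statistics rather than Cape Kennedy values.

        import numpy as np

        theta = np.deg2rad(30.0)  # rotation from east/north axes to the flight path
        R = np.array([[np.cos(theta), np.sin(theta)],
                      [-np.sin(theta), np.cos(theta)]])

        mean_en = np.array([3.0, -1.5])  # mean east/north wind components (m/s)
        cov_en = np.array([[4.0, 1.2],
                           [1.2, 2.5]])  # covariance of the components

        mean_path = R @ mean_en          # means transform linearly
        cov_path = R @ cov_en @ R.T      # covariance transforms as R C R^T
        print("parallel/normal means:", mean_path)
        print("parallel/normal covariance:\n", cov_path)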

  15. Perceptual statistical learning over one week in child speech production.

    PubMed

    Richtsmeier, Peter T; Goffman, Lisa

    2017-07-01

    What cognitive mechanisms account for the trajectory of speech sound development, in particular, gradually increasing accuracy during childhood? An intriguing potential contributor is statistical learning, a type of learning that has been studied frequently in infant perception but less often in child speech production. To assess the relevance of statistical learning to developing speech accuracy, we carried out a statistical learning experiment with four- and five-year-olds in which statistical learning was examined over one week. Children were familiarized with and tested on word-medial consonant sequences in novel words. There was only modest evidence for statistical learning, primarily in the first few productions of the first session. This initial learning effect nevertheless aligns with previous statistical learning research. Furthermore, the overall learning effect was similar to an estimate of weekly accuracy growth based on normative studies. The results implicate other important factors in speech sound development, particularly learning via production. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. PhyloExplorer: a web server to validate, explore and query phylogenetic trees

    PubMed Central

    Ranwez, Vincent; Clairon, Nicolas; Delsuc, Frédéric; Pourali, Saeed; Auberval, Nicolas; Diser, Sorel; Berry, Vincent

    2009-01-01

    Background Many important problems in evolutionary biology require molecular phylogenies to be reconstructed. Phylogenetic trees must then be manipulated for subsequent inclusion in publications or analyses such as supertree inference and tree comparisons. However, no tool is currently available to facilitate the management of tree collections providing, for instance: standardisation of taxon names among trees with respect to a reference taxonomy; selection of relevant subsets of trees or sub-trees according to a taxonomic query; or simply computation of descriptive statistics on the collection. Moreover, although several databases of phylogenetic trees exist, there is currently no easy way to find trees that are both relevant and complementary to a given collection of trees. Results We propose a tool to facilitate assessment and management of phylogenetic tree collections. Given an input collection of rooted trees, PhyloExplorer provides facilities for obtaining statistics describing the collection, correcting invalid taxon names, extracting taxonomically relevant parts of the collection using a dedicated query language, and identifying related trees in the TreeBASE database. Conclusion PhyloExplorer is a simple and interactive website implemented through underlying Python libraries and MySQL databases. It is available online, and the source code can be downloaded. PMID:19450253

  17. Decompression models: review, relevance and validation capabilities.

    PubMed

    Hugon, J

    2014-01-01

    For more than a century, several types of mathematical models have been proposed to describe tissue desaturation mechanisms in order to limit decompression sickness. These models are statistically assessed by DCS cases, and, over time, have gradually included bubble formation biophysics. This paper proposes to review this evolution and discuss its limitations. This review is organized around the comparison of decompression model biophysical criteria and theoretical foundations. Then, the DCS-predictive capability was analyzed to assess whether it could be improved by combining different approaches. Most of the operational decompression models have a neo-Haldanian form. Nevertheless, bubble modeling has been gaining popularity, and the circulating bubble amount has become a major output. By merging both views, it seems possible to build a relevant global decompression model that intends to simulate bubble production while predicting DCS risks for all types of exposures and decompression profiles. A statistical approach combining both DCS and bubble detection databases has to be developed to calibrate a global decompression model. Doppler ultrasound and DCS data are essential: i. to make correlation and validation phases reliable; ii. to adjust biophysical criteria to fit at best the observed bubble kinetics; and iii. to build a relevant risk function.

  18. Burns education for non-burn specialist clinicians in Western Australia.

    PubMed

    McWilliams, Tania; Hendricks, Joyce; Twigg, Di; Wood, Fiona

    2015-03-01

    Burn patients often receive their initial care from non-burn specialist clinicians, with increasingly collaborative burn models of care. The provision of relevant and accessible education for these clinicians is therefore vital for optimal patient care. A two-phase design was used. A state-wide survey of multidisciplinary non-burn specialist clinicians throughout Western Australia identified learning needs related to paediatric burn care. A targeted education programme was developed and delivered live via videoconference. Pre-post-test analysis evaluated changes in knowledge as a result of attendance at each education session. Non-burn specialist clinicians identified numerous areas of burn care relevant to their practice. Statistically significant differences between perceived relevance of care and confidence in care provision were reported for aspects of acute burn care. Following attendance at the education sessions, statistically significant increases in knowledge were noted for most areas of acute burn care. Identification of learning needs facilitated the development of a targeted education programme for non-burn specialist clinicians. Increased non-burn specialist clinician knowledge following attendance at most education sessions supports the use of videoconferencing as an acceptable and effective method of delivering burns education in Western Australia. Copyright © 2014 Elsevier Ltd and ISBI. All rights reserved.

  19. Three Approaches to Understanding and Classifying Mental Disorder: ICD-11, DSM-5, and the National Institute of Mental Health's Research Domain Criteria (RDoC).

    PubMed

    Clark, Lee Anna; Cuthbert, Bruce; Lewis-Fernández, Roberto; Narrow, William E; Reed, Geoffrey M

    2017-09-01

    The diagnosis of mental disorder initially appears relatively straightforward: Patients present with symptoms or visible signs of illness; health professionals make diagnoses based primarily on these symptoms and signs; and they prescribe medication, psychotherapy, or both, accordingly. However, despite a dramatic expansion of knowledge about mental disorders during the past half century, understanding of their components and processes remains rudimentary. We provide histories and descriptions of three systems with different purposes relevant to understanding and classifying mental disorder. Two major diagnostic manuals-the International Classification of Diseases and the Diagnostic and Statistical Manual of Mental Disorders-provide classification systems relevant to public health, clinical diagnosis, service provision, and specific research applications, the former internationally and the latter primarily for the United States. In contrast, the National Institute of Mental Health's Research Domain Criteria provides a framework that emphasizes integration of basic behavioral and neuroscience research to deepen the understanding of mental disorder. We identify four key issues that present challenges to understanding and classifying mental disorder: etiology, including the multiple causality of mental disorder; whether the relevant phenomena are discrete categories or dimensions; thresholds, which set the boundaries between disorder and nondisorder; and comorbidity, the fact that individuals with mental illness often meet diagnostic requirements for multiple conditions. We discuss how the three systems' approaches to these key issues correspond or diverge as a result of their different histories, purposes, and constituencies. Although the systems have varying degrees of overlap and distinguishing features, they share the goal of reducing the burden of suffering due to mental disorder.

  20. Selection, calibration, and validation of models of tumor growth.

    PubMed

    Lima, E A B F; Oden, J T; Hormuth, D A; Yankeelov, T E; Almeida, R C

    2016-11-01

    This paper presents general approaches for addressing some of the most important issues in predictive computational oncology concerned with developing classes of predictive models of tumor growth: first, the process of developing mathematical models of vascular tumors evolving in the complex, heterogeneous macroenvironment of living tissue; second, the selection of the most plausible models among these classes, given relevant observational data; third, the statistical calibration and validation of models in these classes; and finally, the prediction of key Quantities of Interest (QOIs) relevant to patient survival and the effect of various therapies. The most challenging aspect of this endeavor is that all of these issues often involve confounding uncertainties: in observational data, in model parameters, in model selection, and in the features targeted in the prediction. Our approach can be referred to as "model agnostic" in that no single model is advocated; rather, a general approach that explores powerful mixture-theory representations of tissue behavior while accounting for a range of relevant biological factors is presented, which leads to many potentially predictive models. Representative classes are then identified which provide a starting point for the implementation of the Occam Plausibility Algorithm (OPAL), which enables the modeler to select the most plausible models (for given data) and to determine if the model is a valid tool for predicting tumor growth and morphology (in vivo). All of these approaches account for uncertainties in the model, the observational data, the model parameters, and the target QOI. We demonstrate these processes by comparing a list of models for tumor growth, including reaction-diffusion models, phase-field models, and models with and without mechanical deformation effects, for glioma growth measured in murine experiments. Examples are provided that exhibit quite acceptable predictions of tumor growth in laboratory animals while demonstrating successful implementations of OPAL.

  1. Renal incidental findings on computed tomography

    PubMed Central

    Meyer, Hans Jonas; Pfeil, Alina; Schramm, Dominik; Bach, Andreas Gunter; Surov, Alexey

    2017-01-01

    Abstract Renal incidental findings (IFs) are common. However, previous reports investigating renal IFs were limited by patient selection. The purpose of this study was to estimate the prevalence and distribution of all renal IFs on computed tomography (CT) in a large patient collective. All patients who underwent CT investigations of the abdominal region at our institution between January 2006 and February 2014 were included in this study. Inclusion criteria were as follows: no previous history of renal disease and good image quality. Patients with known kidney disorders were excluded from the study. Overall, 7365 patients meeting the inclusion criteria were identified. There were 2924 (39.7%) women and 4441 (60.3%) men with a mean age of 59.8 ± 16.7 years. All CTs were retrospectively analyzed in consensus by 2 radiologists. Collected data were evaluated by means of descriptive statistics. Overall, 2756 patients (37.42% of all included patients) showed 3425 different renal IFs (1.24 findings per patient). Of all renal IFs, 123 (3.6%) were clinically relevant, 259 (7.6%) were categorized as possibly clinically relevant, and 3043 (88.8%) were not clinically relevant. Various renal IFs can be detected on CT. The present study provides their real-world prevalence and proportions in daily clinical routine. The kidneys should be evaluated thoroughly because incidental renal findings occur frequently. PMID:28658098

  2. A combined pre-clinical meta-analysis and randomized confirmatory trial approach to improve data validity for therapeutic target validation.

    PubMed

    Kleikers, Pamela W M; Hooijmans, Carlijn; Göb, Eva; Langhauser, Friederike; Rewell, Sarah S J; Radermacher, Kim; Ritskes-Hoitinga, Merel; Howells, David W; Kleinschnitz, Christoph; Schmidt, Harald H H W

    2015-08-27

    Biomedical research suffers from dramatically poor translational success. For example, in ischemic stroke, a condition with a high medical need, over a thousand experimental drug targets were unsuccessful. Here, we adopt methods from clinical research for a late-stage pre-clinical meta-analysis (MA) and randomized confirmatory trial (pRCT) approach. A profound body of literature suggests NOX2 to be a major therapeutic target in stroke. Systematic review and MA of all available NOX2(-/y) studies revealed a positive publication bias and a lack of statistical power to detect a relevant reduction in infarct size. A fully powered multi-center pRCT rejects NOX2 as a target to improve neurofunctional outcomes or achieve a translationally relevant infarct size reduction. Thus, stringent statistical thresholds, the reporting of negative data, and an MA-pRCT approach can ensure biomedical data validity and overcome risks of bias.
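    The power issue at the center of the meta-analysis can be illustrated with a standard sample-size computation; the effect size and targets below are assumptions for illustration, not numbers from the study.

        from statsmodels.stats.power import TTestIndPower

        # Animals per group needed to detect a standardized infarct-size
        # reduction of d = 0.5 with 80% power at alpha = 0.05 (two-sided).
        n_per_group = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05, power=0.8)
        print(f"required n per group: {n_per_group:.0f}")  # about 64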

  3. Data and methods for studying commercial motor vehicle driver fatigue, highway safety and long-term driver health.

    PubMed

    Stern, Hal S; Blower, Daniel; Cohen, Michael L; Czeisler, Charles A; Dinges, David F; Greenhouse, Joel B; Guo, Feng; Hanowski, Richard J; Hartenbaum, Natalie P; Krueger, Gerald P; Mallis, Melissa M; Pain, Richard F; Rizzo, Matthew; Sinha, Esha; Small, Dylan S; Stuart, Elizabeth A; Wegman, David H

    2018-03-09

    This article summarizes the recommendations on data and methodology issues for studying commercial motor vehicle driver fatigue from a National Academies of Sciences, Engineering, and Medicine study. A framework is provided that identifies the various factors affecting driver fatigue and relates driver fatigue to crash risk and long-term driver health. The relevant factors include characteristics of the driver, vehicle, carrier and environment. Limitations of existing data are considered and potential sources of additional data described. Statistical methods that can be used to improve understanding of the relevant relationships from observational data are also described. The recommendations for enhanced data collection and the use of modern statistical methods for causal inference have the potential to enhance our understanding of the relationship of fatigue to highway safety and to long-term driver health. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  4. To What Extent Is Mathematical Ability Predictive of Performance in a Methodology and Statistics Course? Can an Action Research Approach Be Used to Understand the Relevance of Mathematical Ability in Psychology Undergraduates?

    ERIC Educational Resources Information Center

    Bourne, Victoria J.

    2014-01-01

    Research methods and statistical analysis is typically the least liked and most anxiety provoking aspect of a psychology undergraduate degree, in large part due to the mathematical component of the content. In this first cycle of a piece of action research, students' mathematical ability is examined in relation to their performance across…

  5. Nonstationary envelope process and first excursion probability

    NASA Technical Reports Server (NTRS)

    Yang, J.

    1972-01-01

    A definition of the envelope of nonstationary random processes is proposed. The establishment of the envelope definition makes it possible to simulate the nonstationary random envelope directly. Envelope statistics, such as the density function, joint density function, moment function, and level crossing rate, which are relevant to analyses of catastrophic failure, fatigue, and crack propagation in structures, are derived. Applications of the envelope statistics to the prediction of structural reliability under random loadings are discussed in detail.
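    For orientation, the conventional envelope of a signal (as opposed to the nonstationary definition proposed in the paper) can be obtained with a Hilbert transform:

        import numpy as np
        from scipy.signal import hilbert

        t = np.linspace(0.0, 1.0, 2000)
        # An amplitude-modulated carrier; the modulation is the "true" envelope.
        x = (1.0 + 0.6 * np.sin(2 * np.pi * 3 * t)) * np.cos(2 * np.pi * 80 * t)
        envelope = np.abs(hilbert(x))
        print(f"max envelope = {envelope.max():.2f}")  # close to the peak modulation, 1.6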

  6. Image correlation and sampling study

    NASA Technical Reports Server (NTRS)

    Popp, D. J.; Mccormack, D. S.; Sedwick, J. L.

    1972-01-01

    The development of analytical approaches for solving image correlation and image sampling of multispectral data is discussed. Relevant multispectral image statistics which are applicable to image correlation and sampling are identified. The general image statistics include intensity mean, variance, amplitude histogram, power spectral density function, and autocorrelation function. The translation problem associated with digital image registration and the analytical means for comparing commonly used correlation techniques are considered. General expressions for determining the reconstruction error for specific image sampling strategies are developed.

  7. Classical subjective expected utility.

    PubMed

    Cerreia-Vioglio, Simone; Maccheroni, Fabio; Marinacci, Massimo; Montrucchio, Luigi

    2013-04-23

    We consider decision makers who know that payoff-relevant observations are generated by a process that belongs to a given class M, as postulated in Wald [Wald A (1950) Statistical Decision Functions (Wiley, New York)]. We incorporate this Waldean piece of objective information within an otherwise subjective setting à la Savage [Savage LJ (1954) The Foundations of Statistics (Wiley, New York)] and show that this leads to a two-stage subjective expected utility model that accounts for both state and model uncertainty.

  8. The Emergence of Contextual Social Psychology.

    PubMed

    Pettigrew, Thomas F

    2018-07-01

    Social psychology experiences recurring so-called "crises." This article maintains that these episodes actually mark advances in the discipline; these "crises" have enhanced relevance and led to greater methodological and statistical sophistication. New statistical tools have allowed social psychologists to begin to achieve a major goal: placing psychological phenomena in their larger social contexts. This growing trend is illustrated with numerous recent studies; they demonstrate how cultures and social norms moderate basic psychological processes. Contextual social psychology is finally emerging.

  9. A multi-analyte serum test for the detection of non-small cell lung cancer

    PubMed Central

    Farlow, E C; Vercillo, M S; Coon, J S; Basu, S; Kim, A W; Faber, L P; Warren, W H; Bonomi, P; Liptay, M J; Borgia, J A

    2010-01-01

    Background: In this study, we appraised a wide assortment of biomarkers previously shown to have diagnostic or prognostic value for non-small cell lung cancer (NSCLC), with the intent of establishing a multi-analyte serum test capable of identifying patients with lung cancer. Methods: Circulating levels of 47 biomarkers were evaluated against patient cohorts consisting of 90 NSCLC patients and 43 non-cancer controls using commercial immunoassays. Multivariate statistical methods were used on all biomarkers achieving statistical relevance to define an optimised panel of diagnostic biomarkers for NSCLC. The resulting biomarkers were fashioned into a classification algorithm and validated against serum from a second patient cohort. Results: A total of 14 analytes achieved statistical relevance upon evaluation. Multivariate statistical methods then identified a panel of six biomarkers (tumour necrosis factor-α, CYFRA 21-1, interleukin-1ra, matrix metalloproteinase-2, monocyte chemotactic protein-1 and sE-selectin) as being the most efficacious for diagnosing early stage NSCLC. When tested against a second patient cohort, the panel successfully classified 75 of 88 patients. Conclusions: Here, we report the development of a serum algorithm with high specificity for classifying patients with NSCLC against cohorts of various 'high-risk' individuals. A high rate of false positives was observed within the cohort in which patients had non-neoplastic lung nodules, possibly as a consequence of the inflammatory nature of these conditions. PMID:20859284

  10. Users' manual for the Hydroecological Integrity Assessment Process software (including the New Jersey Assessment Tools)

    USGS Publications Warehouse

    Henriksen, James A.; Heasley, John; Kennen, Jonathan G.; Nieswand, Steven

    2006-01-01

    Applying the Hydroecological Integrity Assessment Process involves several steps: (1) a hydrologic classification of relatively unmodified streams in a geographic area using long-term gage records and 171 ecologically relevant indices; (2) the identification of statistically significant, nonredundant, hydroecologically relevant indices associated with the five major flow components for each stream class; and (3) the development of a stream-classification tool and a hydrologic assessment tool. Four computer software tools have been developed.

  11. Atmospheric propagation issues relevant to optical communications

    NASA Technical Reports Server (NTRS)

    Churnside, James H.; Shaik, Kamran

    1989-01-01

    Atmospheric propagation issues relevant to space-to-ground optical communications for near-earth applications are studied. Propagation effects, current optical communication activities, potential applications, and communication techniques are surveyed. It is concluded that a direct-detection space-to-ground link using redundant receiver sites and temporal encoding is likely to be employed to transmit earth-sensing satellite data to the ground some time in the future. Low-level, long-term studies of link availability, fading statistics, and turbulence climatology are recommended to support this type of application.

  12. Codifference as a practical tool to measure interdependence

    NASA Astrophysics Data System (ADS)

    Wyłomańska, Agnieszka; Chechkin, Aleksei; Gajda, Janusz; Sokolov, Igor M.

    2015-03-01

    Correlation and spectral analysis represent the standard tools to study interdependence in statistical data. However, for stochastic processes with heavy-tailed distributions such that the variance diverges, these tools are inadequate. Heavy-tailed processes are ubiquitous in nature and finance. Here we discuss codifference as a convenient measure of statistical interdependence, and we aim to give a short introductory review of its properties. Taking different known stochastic processes as generic examples, we present explicit formulas for their codifferences. We show that for Gaussian processes codifference is equivalent to covariance. For processes with finite variance these two measures behave similarly with time. For processes with infinite variance the covariance does not exist, but the codifference remains relevant. We demonstrate the practical importance of the codifference by extracting this function from simulated as well as real data taken from the turbulent plasma of a fusion device and from a financial market. We conclude that the codifference serves as a convenient practical tool to study interdependence for stochastic processes with both infinite and finite variance.
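    An empirical codifference (at unit argument) is easy to compute from data via the characteristic function; the sketch below uses simulated Gaussian data, for which the codifference reduces to the covariance, as stated above.

        import numpy as np

        def codifference(x, y):
            """CD(X, Y) = ln E[e^{i(X-Y)}] - ln E[e^{iX}] - ln E[e^{-iY}]."""
            cf = lambda z: np.mean(np.exp(1j * z))
            return np.real(np.log(cf(x - y)) - np.log(cf(x)) - np.log(cf(-y)))

        rng = np.random.default_rng(8)
        x = rng.normal(0.0, 1.0, 100_000)
        y = 0.6 * x + 0.8 * rng.normal(0.0, 1.0, 100_000)
        print(f"codifference = {codifference(x, y):.3f}")  # near cov(x, y) = 0.6
        print(f"covariance   = {np.cov(x, y)[0, 1]:.3f}")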

  13. Radio and infrared properties of young stars

    NASA Technical Reports Server (NTRS)

    Panagia, Nino

    1987-01-01

    Observing young stars, or more appropriately, pre-main-sequence (PMS) stars, in the infrared and at radio frequencies has the advantage over optical observation that the heavy extinction associated with a star forming region is only a minor problem, so that the whole region can be studied thoroughly. Therefore, it means being able to: (1) search for stars and do statistical studies on the rate of star formation; (2) determine their luminosity and, hence, study luminosity functions and initial mass functions down to low masses; and (3) study their spectra and, thus, determine the prevailing conditions at and near the surface of a newly born star and its relations with the surrounding environment. The third point is of principal interest. The report limits itself to a consideration of the observations concerning the processes of outflows from, and accretion onto, PMS stars and the theory necessary to interpret them. Section 2 discusses the radiative processes relevant in stellar outflows. The main observational results are presented in Section 3. A discussion of the statistical properties of stellar winds from PMS stars is given in Section 4.

  14. A “Hybrid” Bacteriology Course: The Professor’s Design and Expectations; The Students’ Performance and Assessment

    PubMed Central

    KRAWIEC, STEVEN; SALTER, DIANE; KAY, EDWIN J.

    2005-01-01

    A basic bacteriology course was offered in two successive academic years, first in a conventional format and subsequently as a “hybrid” course. The latter combined (i) online presentation of content, (ii) an emphasis on online resources, (iii) thrice-weekly, face-to-face conversations to advance understanding, and (iv) frequent student postings on an electronic discussion board. We compared the two courses through statistical analysis of student performance on the final examinations and in the course overall, and through student assessment of teaching. The data indicated that there was no statistical difference in performance on the final examinations or in the course overall. Responses on an instrument of evaluation revealed that students less strongly affirmed the following measures in the hybrid course: (i) The amount of work was appropriate for the credit received, (ii) Interactions between students and instructor were positive, (iii) I learned a great deal in this course, and (iv) I would recommend this course to other students. We recommend clear direction about active learning tasks and relevant feedback to enhance learning in a hybrid course. PMID:23653558

  15. COP21 climate negotiators' responses to climate model forecasts

    NASA Astrophysics Data System (ADS)

    Bosetti, Valentina; Weber, Elke; Berger, Loïc; Budescu, David V.; Liu, Ning; Tavoni, Massimo

    2017-02-01

    Policymakers involved in climate change negotiations are key users of climate science. It is therefore vital to understand how to communicate scientific information most effectively to this group. We tested how a unique sample of policymakers and negotiators at the Paris COP21 conference update their beliefs on year 2100 global mean temperature increases in response to a statistical summary of climate models' forecasts. We randomized the way information was provided across participants using three different formats similar to those used in Intergovernmental Panel on Climate Change reports. In spite of having received all available relevant scientific information, policymakers adopted such information very conservatively, assigning it less weight than their own prior beliefs. However, providing individual model estimates in addition to the statistical range was more effective in mitigating such inertia. The experiment was repeated with a population of European MBA students who, despite starting from similar priors, reported conditional probabilities closer to the provided models' forecasts than policymakers. There was also no effect of presentation format in the MBA sample. These results highlight the importance of testing visualization tools directly on the population of interest.

  16. UniEnt: uniform entropy model for the dynamics of a neuronal population

    NASA Astrophysics Data System (ADS)

    Hernandez Lahme, Damian; Nemenman, Ilya

    Sensory information and motor responses are encoded in the brain in the collective spiking activity of a large number of neurons. Understanding the neural code requires inferring statistical properties of such collective dynamics from multicellular neurophysiological recordings. Questions of whether synchronous activity or silence of multiple neurons carries information about the stimuli or the motor responses are especially interesting. Unfortunately, detection of such high order statistical interactions from data is especially challenging due to the exponentially large dimensionality of the state space of neural collectives. Here we present UniEnt, a method for the inference of strengths of multivariate neural interaction patterns. The method is based on a Bayesian prior that makes no assumptions (uniform a priori expectations) about the value of the entropy of the observed multivariate neural activity, in contrast to popular approaches that maximize this entropy. We then study previously published multielectrode recording data from the salamander retina, exposing the relevance of higher order neural interaction patterns for information encoding in this system. This work was supported in part by Grants JSMF/220020321 and NSF/IOS/1208126.

  17. Length bias correction in gene ontology enrichment analysis using logistic regression.

    PubMed

    Mi, Gu; Di, Yanming; Emerson, Sarah; Cumbie, Jason S; Chang, Jeff H

    2012-01-01

    When assessing differential gene expression from RNA sequencing data, commonly used statistical tests tend to have greater power to detect differential expression of genes encoding longer transcripts. This phenomenon, called "length bias", will influence subsequent analyses such as Gene Ontology enrichment analysis. In the presence of length bias, Gene Ontology categories that include longer genes are more likely to be identified as enriched. These categories, however, are not necessarily biologically more relevant. We show that one can effectively adjust for length bias in Gene Ontology analysis by including transcript length as a covariate in a logistic regression model. The logistic regression model makes the statistical issue underlying length bias more transparent: transcript length becomes a confounding factor when it correlates with both the Gene Ontology membership and the significance of the differential expression test. The inclusion of the transcript length as a covariate allows one to investigate the direct correlation between the Gene Ontology membership and the significance of testing differential expression, conditional on the transcript length. We present both real and simulated data examples to show that the logistic regression approach is simple, effective, and flexible.
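
    A minimal sketch of the idea, with hypothetical simulated data rather than the authors' code: genes whose Gene Ontology membership and differential expression (DE) status are both driven by transcript length show a spurious marginal association, which disappears once log length is included as a covariate in the logistic regression.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 5_000

log_len = rng.normal(8.0, 1.0, n)                        # log transcript length
# both GO membership and DE significance depend on length (the confounding path)
in_go = rng.binomial(1, 1 / (1 + np.exp(-(log_len - 8))))
de_sig = rng.binomial(1, 1 / (1 + np.exp(-0.5 * (log_len - 8))))

# naive test: GO membership vs DE status only (picks up the length confounding)
naive = sm.Logit(in_go, sm.add_constant(de_sig)).fit(disp=0)
# adjusted test: include log length as a covariate
adj = sm.Logit(in_go, sm.add_constant(np.column_stack([de_sig, log_len]))).fit(disp=0)

print("naive DE coefficient:   ", round(naive.params[1], 3))
print("adjusted DE coefficient:", round(adj.params[1], 3))   # ~0 after adjustment
```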

  18. Statistical issues in reporting quality data: small samples and casemix variation.

    PubMed

    Zaslavsky, A M

    2001-12-01

    To present two key statistical issues that arise in analysis and reporting of quality data. Casemix variation is relevant to quality reporting when the units being measured have differing distributions of patient characteristics that also affect the quality outcome. When this is the case, adjustment using stratification or regression may be appropriate. Such adjustments may be controversial when the patient characteristic does not have an obvious relationship to the outcome. Stratified reporting poses problems for sample size and reporting format, but may be useful when casemix effects vary across units. Although there are no absolute standards of reliability, high reliabilities (interunit F ≥ 10 or reliability ≥ 0.9) are desirable for distinguishing above- and below-average units. When small or unequal sample sizes complicate reporting, precision may be improved using indirect estimation techniques that incorporate auxiliary information, and 'shrinkage' estimation can help to summarize the strength of evidence about units with small samples. With broader understanding of casemix adjustment and methods for analyzing small samples, quality data can be analysed and reported more accurately.
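
    The 'shrinkage' idea can be sketched in a few lines. Assuming a known within-unit standard deviation and a simple method-of-moments estimate of the between-unit variance (an illustrative setup, not the paper's own estimator), each unit's mean is pulled toward the grand mean in proportion to its unreliability:

```python
import numpy as np

rng = np.random.default_rng(2)

# simulated quality scores for 30 units with small, unequal sample sizes
true_means = rng.normal(0.75, 0.05, 30)
sizes = rng.integers(5, 60, 30)
sigma = 0.20                                   # within-unit standard deviation
obs_means = np.array([rng.normal(m, sigma / np.sqrt(k))
                      for m, k in zip(true_means, sizes)])

# method-of-moments estimate of the between-unit variance tau^2
tau2 = max(obs_means.var(ddof=1) - np.mean(sigma**2 / sizes), 1e-6)
grand = obs_means.mean()

# reliability weight per unit; small samples are pulled toward the grand mean
w = tau2 / (tau2 + sigma**2 / sizes)
shrunk = w * obs_means + (1 - w) * grand

for n_i, raw, s in list(zip(sizes, obs_means, shrunk))[:5]:
    print(f"n={n_i:2d}  raw={raw:.3f}  shrunk={s:.3f}")
```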

  19. Cross section of α-induced reactions on iridium isotopes obtained from thick target yield measurement for the astrophysical γ process

    NASA Astrophysics Data System (ADS)

    Szücs, T.; Kiss, G. G.; Gyürky, Gy.; Halász, Z.; Fülöp, Zs.; Rauscher, T.

    2018-01-01

    The stellar reaction rates of radiative α-capture reactions on heavy isotopes are of crucial importance for γ process network calculations. These rates are usually derived from statistical model calculations, which need to be validated, but the experimental database is very scarce. This paper presents the results of α-induced reaction cross section measurements on iridium isotopes, carried out for the first time close to the astrophysically relevant energy region. Thick target yields of the 191Ir(α,γ)195Au, 191Ir(α,n)194Au, 193Ir(α,n)196mAu, and 193Ir(α,n)196Au reactions have been measured with the activation technique between Eα = 13.4 MeV and 17 MeV. For the first time, the thick target yield was determined with X-ray counting, which led to unprecedented sensitivity. From the measured thick target yields, reaction cross sections are derived and compared with statistical model calculations. The recently suggested energy-dependent modification of the α + nucleus optical potential gives a good description of the experimental data.

  20. An incremental knowledge assimilation system (IKAS) for mine detection

    NASA Astrophysics Data System (ADS)

    Porway, Jake; Raju, Chaitanya; Varadarajan, Karthik Mahesh; Nguyen, Hieu; Yadegar, Joseph

    2010-04-01

    In this paper we present an adaptive incremental learning system for underwater mine detection and classification that utilizes statistical models of seabed texture and an adaptive nearest-neighbor classifier to identify varied underwater targets in many different environments. The first stage of processing uses our Background Adaptive ANomaly detector (BAAN), which identifies statistically likely target regions using Gabor filter responses over the image. Using this information, BAAN classifies the background type and updates its detection using background-specific parameters. To perform classification, a Fully Adaptive Nearest Neighbor (FAAN) determines the best label for each detection. FAAN uses an extremely fast version of Nearest Neighbor to find the most likely label for the target. The classifier perpetually assimilates new and relevant information into its existing knowledge database in an incremental fashion, allowing improved classification accuracy and capturing concept drift in the target classes. Experiments show that the system achieves >90% classification accuracy on underwater mine detection tasks performed on synthesized datasets provided by the Office of Naval Research. We have also demonstrated that the system can incrementally improve its detection accuracy by constantly learning from new samples.
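
    The overall pipeline, texture features from Gabor filter responses feeding a nearest-neighbor classifier, can be sketched with off-the-shelf tools. The toy patches, filter-bank settings, and two-class setup below are illustrative assumptions (using scikit-image and scikit-learn), not the BAAN/FAAN implementation:

```python
import numpy as np
from skimage.filters import gabor
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(3)

def gabor_features(img, freqs=(0.1, 0.25), thetas=(0.0, np.pi / 4, np.pi / 2)):
    # mean Gabor response magnitude over a small filter bank -> texture descriptor
    feats = []
    for f in freqs:
        for th in thetas:
            real, imag = gabor(img, frequency=f, theta=th)
            feats.append(np.hypot(real, imag).mean())
    return np.array(feats)

def patch(striped):
    # toy 32x32 "seabed" patch: noise, optionally with a periodic texture
    img = rng.normal(0.0, 1.0, (32, 32))
    if striped:
        img += 2.0 * np.sin(0.6 * np.arange(32))
    return img

X = np.array([gabor_features(patch(i % 2 == 1)) for i in range(60)])
y = np.arange(60) % 2
clf = KNeighborsClassifier(n_neighbors=3).fit(X[:40], y[:40])
print("held-out accuracy:", clf.score(X[40:], y[40:]))
```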

  1. VALUE - A Framework to Validate Downscaling Approaches for Climate Change Studies

    NASA Astrophysics Data System (ADS)

    Maraun, Douglas; Widmann, Martin; Gutiérrez, José M.; Kotlarski, Sven; Chandler, Richard E.; Hertig, Elke; Wibig, Joanna; Huth, Radan; Wilcke, Renate A. I.

    2015-04-01

    VALUE is an open European network to validate and compare downscaling methods for climate change research. VALUE aims to foster collaboration and knowledge exchange between climatologists, impact modellers, statisticians, and stakeholders to establish an interdisciplinary downscaling community. A key deliverable of VALUE is the development of a systematic validation framework to enable the assessment and comparison of both dynamical and statistical downscaling methods. Here, we present the key ingredients of this framework. VALUE's main approach to validation is user-focused: starting from a specific user problem, a validation tree guides the selection of relevant validation indices and performance measures. Several experiments have been designed to isolate specific points in the downscaling procedure where problems may occur: what is the isolated downscaling skill? How do statistical and dynamical methods compare? How do methods perform at different spatial scales? Do methods fail in representing regional climate change? What is the overall representation of regional climate, including errors inherited from global climate models? The framework will be the basis for a comprehensive community-open downscaling intercomparison study, but is intended also to provide general guidance for other validation studies.

  2. VALUE: A framework to validate downscaling approaches for climate change studies

    NASA Astrophysics Data System (ADS)

    Maraun, Douglas; Widmann, Martin; Gutiérrez, José M.; Kotlarski, Sven; Chandler, Richard E.; Hertig, Elke; Wibig, Joanna; Huth, Radan; Wilcke, Renate A. I.

    2015-01-01

    VALUE is an open European network to validate and compare downscaling methods for climate change research. VALUE aims to foster collaboration and knowledge exchange between climatologists, impact modellers, statisticians, and stakeholders to establish an interdisciplinary downscaling community. A key deliverable of VALUE is the development of a systematic validation framework to enable the assessment and comparison of both dynamical and statistical downscaling methods. In this paper, we present the key ingredients of this framework. VALUE's main approach to validation is user-focused: starting from a specific user problem, a validation tree guides the selection of relevant validation indices and performance measures. Several experiments have been designed to isolate specific points in the downscaling procedure where problems may occur: what is the isolated downscaling skill? How do statistical and dynamical methods compare? How do methods perform at different spatial scales? Do methods fail in representing regional climate change? What is the overall representation of regional climate, including errors inherited from global climate models? The framework will be the basis for a comprehensive community-open downscaling intercomparison study, but is intended also to provide general guidance for other validation studies.

  3. Linking Arctic plant biodiversity measurements with landscape heterogeneity

    NASA Astrophysics Data System (ADS)

    Gerber, F.; Schaepman-Strub, G.; Furrer, R.

    2016-12-01

    Climate warming in the Arctic region triggers changes in the vegetation productivity and species composition of the tundra. To investigate these changes and their feedback to climate, we consider species richness and abundance data of the International Tundra EXperiment (ITEX). As this information is very sparse in time and space, we aim to upscale available records to climatically relevant scales with a remote sensing based characterization of the study sites. More precisely, we relate species richness and evenness derived from the ITEX data to summary statistics describing the landscape heterogeneity, which are derived from an elevation model (ASTER GDEM) and spectral satellite observations (LANDSAT 5 and 7). Preliminary results from the statistical analysis using generalized linear mixed models show that none of the remote sensing based landscape characterizations significantly explains species richness. Possible reasons include a mismatch of spatial scales, an inappropriate characterization of the test sites by the satellite measurements, incomparable plot measurements from the different test sites, and/or too few plot measurements. We look forward to presenting our results and receiving your input.

  4. Set-free Markov state model building

    NASA Astrophysics Data System (ADS)

    Weber, Marcus; Fackeldey, Konstantin; Schütte, Christof

    2017-03-01

    Molecular dynamics (MD) simulations face challenging problems since the time scales of interest often are much longer than those accessible to simulation; and even if sufficiently long simulations are possible, the complex nature of the resulting simulation data makes interpretation difficult. Markov State Models (MSMs) help to overcome these problems by making experimentally relevant time scales accessible via coarse grained representations that also allow for convenient interpretation. However, standard set-based MSMs exhibit some caveats limiting their approximation quality and statistical significance. One of the main caveats results from the fact that typical MD trajectories repeatedly re-cross the boundary between the sets used to build the MSM, which causes statistical bias in estimating the transition probabilities between these sets. In this article, we present a set-free approach to MSM building that utilizes smooth overlapping ansatz functions instead of sets, together with an adaptive refinement approach. This kind of meshless discretization helps to overcome the recrossing problem and yields an adaptive refinement procedure that allows us to improve the quality of the model while exploring state space and inserting new ansatz functions into the MSM.
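
    A rough sketch of the meshless idea under simplifying assumptions (a 1D double-well potential, fixed Gaussian ansatz functions, no adaptive refinement): soft membership weights replace crisp set indicators when counting transitions, which softens the effect of boundary recrossings.

```python
import numpy as np

rng = np.random.default_rng(4)

# 1D double-well trajectory via overdamped Langevin dynamics, V(x) = (x^2 - 1)^2
def simulate(n, dt=1e-3):
    x = np.empty(n)
    x[0] = -1.0
    for t in range(1, n):
        force = -4 * x[t - 1] * (x[t - 1] ** 2 - 1)
        x[t] = x[t - 1] + force * dt + np.sqrt(2 * dt) * rng.standard_normal()
    return x

x = simulate(200_000)

# smooth overlapping ansatz functions (normalized Gaussians) instead of crisp sets
centers = np.linspace(-1.5, 1.5, 7)
chi = np.exp(-0.5 * ((x[:, None] - centers) / 0.4) ** 2)
chi /= chi.sum(axis=1, keepdims=True)

lag = 500
C = chi[:-lag].T @ chi[lag:]             # soft transition counts
T = C / C.sum(axis=1, keepdims=True)     # row-stochastic transition matrix
evals = np.sort(np.linalg.eigvals(T).real)[::-1]
print("slowest implied timescale (steps):", -lag / np.log(evals[1]))
```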

  5. Inferring the Mode of Selection from the Transient Response to Demographic Perturbations

    NASA Astrophysics Data System (ADS)

    Balick, Daniel; Do, Ron; Reich, David; Sunyaev, Shamil

    2014-03-01

    Despite substantial recent progress in theoretical population genetics, most models work under the assumption of a constant population size. Deviations from fixed population sizes are ubiquitous in natural populations, many of which experience population bottlenecks and re-expansions. The non-equilibrium dynamics introduced by a large perturbation in population size are generally viewed as a confounding factor. In the present work, we take advantage of the transient response to a population bottleneck to infer features of the mode of selection and the distribution of selective effects. We develop an analytic framework and a corresponding statistical test that qualitatively differentiates between alleles under additive selection and those under recessive or more general epistatic selection. This statistic can be used to bound the joint distribution of selection and dominance effects in any diploid sexual organism. We apply this technique to human population genetic data and severely restrict the space of allowed selection coefficients in humans. Additionally, one can test a set of functionally or medically relevant alleles for the primary mode of selection, or determine local regional variation in dominance coefficients along the genome.
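
    For intuition about the transient dynamics being exploited (this is not the authors' statistical test), a simple Wright-Fisher simulation can compare how additive and recessive deleterious alleles respond to a bottleneck and re-expansion; parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)

def wright_fisher(n_gens, sizes, s, h, p0=0.05, reps=2000):
    """Mean frequency of a deleterious allele under selection s and
    dominance h; h=0.5 is additive, h=0.0 is fully recessive."""
    p = np.full(reps, p0)
    means = []
    for g in range(n_gens):
        N = sizes[g]
        q = 1 - p
        w_AA, w_AB, w_BB = 1.0, 1.0 - h * s, 1.0 - s     # B is deleterious
        wbar = q**2 * w_AA + 2 * p * q * w_AB + p**2 * w_BB
        p_sel = (p**2 * w_BB + p * q * w_AB) / wbar       # post-selection freq
        p = rng.binomial(2 * N, p_sel, reps) / (2 * N)    # drift
        means.append(p.mean())
    return np.array(means)

# bottleneck between generations 50 and 60, then re-expansion
sizes = np.array([10_000] * 50 + [200] * 10 + [10_000] * 140)
add = wright_fisher(200, sizes, s=0.01, h=0.5)
rec = wright_fisher(200, sizes, s=0.01, h=0.0)
print("post-bottleneck mean freq (additive, recessive):",
      round(add[70], 4), round(rec[70], 4))
```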

  6. The empirical basis of substance use disorders diagnosis: research recommendations for the Diagnostic and Statistical Manual of Mental Disorders, fifth edition (DSM-V).

    PubMed

    Schuckit, Marc A; Saunders, John B

    2006-09-01

    This paper presents the recommendations, developed from a 3-year consultation process, for a program of research to underpin the development of diagnostic concepts and criteria in the Substance Use Disorders section of the Diagnostic and Statistical Manual of Mental Disorders (DSM) and potentially the relevant section of the next revision of the International Classification of Diseases (ICD). A preliminary list of research topics was developed at the DSM-V Launch Conference in 2004. This led to the presentation of articles on these topics at a specific Substance Use Disorders Conference in February 2005, at the end of which a preliminary list of research questions was developed. This was further refined through an iterative process involving conference participants over the following year. Research questions have been placed into four categories: (1) questions that could be addressed immediately through secondary analyses of existing data sets; (2) items likely to require position papers to propose criteria or more focused questions with a view to subsequent analyses of existing data sets; (3) issues that could be proposed for literature reviews, but with a lower probability that these might progress to a data analytic phase; and (4) suggestions or comments that might not require immediate action, but that could be considered by the DSM-V and ICD 11 revision committees as part of their deliberations. A broadly based research agenda for the development of diagnostic concepts and criteria for substance use disorders is presented.

  7. Development of Supersonic Combustion Experiments for CFD Modeling

    NASA Technical Reports Server (NTRS)

    Baurle, Robert; Bivolaru, Daniel; Tedder, Sarah; Danehy, Paul M.; Cutler, Andrew D.; Magnotti, Gaetano

    2007-01-01

    This paper describes the development of an experiment to acquire data for developing and validating computational fluid dynamics (CFD) models for turbulence in supersonic combusting flows. The intent is that the flow field be simple yet relevant to flows within hypersonic air-breathing engine combustors undergoing testing in vitiated-air ground-testing facilities. Specifically, it describes the development of laboratory-scale hardware to produce a supersonic combusting coaxial jet, and discusses design calculations, operability, and the types of flames observed. These flames are studied using the combined dual-pump coherent anti-Stokes Raman spectroscopy (CARS) and interferometric Rayleigh scattering (IRS) technique. This technique simultaneously and instantaneously measures temperature, composition, and velocity in the flow, from which many of the important turbulence statistics can be found. Some preliminary CARS data are presented.

  8. Preferential sampling and Bayesian geostatistics: Statistical modeling and examples.

    PubMed

    Cecconi, Lorenzo; Grisotto, Laura; Catelan, Dolores; Lagazio, Corrado; Berrocal, Veronica; Biggeri, Annibale

    2016-08-01

    Preferential sampling refers to any situation in which the spatial process and the sampling locations are not stochastically independent. In this paper, we present two examples of geostatistical analysis in which the usual assumption of stochastic independence between the point process and the measurement process is violated. To account for preferential sampling, we specify a flexible and general Bayesian geostatistical model that includes a shared spatial random component. We apply the proposed model to two different case studies that allow us to highlight three different modeling and inferential aspects of geostatistical modeling under preferential sampling: (1) continuous or finite spatial sampling frame; (2) underlying causal model and relevant covariates; and (3) inferential goals related to mean prediction surface or prediction uncertainty. © The Author(s) 2016.
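
    The basic phenomenon is easy to reproduce by simulation. The following hypothetical sketch (not the authors' shared-component Bayesian model) samples locations preferentially where a smooth spatial process is high, so the naive mean of the measurements overestimates the true spatial mean:

```python
import numpy as np

rng = np.random.default_rng(6)

# a smooth 1-D "spatial" process on a fine grid (squared-exponential covariance)
grid = np.linspace(0, 1, 400)
K = np.exp(-0.5 * ((grid[:, None] - grid[None, :]) / 0.1) ** 2)
S = np.linalg.cholesky(K + 1e-6 * np.eye(400)) @ rng.standard_normal(400)

# preferential design: sampling intensity proportional to exp(beta * S)
beta = 2.0
prob = np.exp(beta * S)
prob /= prob.sum()
idx = rng.choice(400, size=50, replace=False, p=prob)
y = S[idx] + 0.1 * rng.standard_normal(50)    # noisy measurements

print("true spatial mean:      ", round(S.mean(), 3))
print("naive estimate (biased):", round(y.mean(), 3))
```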

  9. Psychological Pathways Linking Social Support to Health Outcomes: A Visit with the “Ghosts” of Research Past, Present, and Future

    PubMed Central

    Uchino, Bert N.; Bowen, Kimberly; Carlisle, McKenzie; Birmingham, Wendy

    2012-01-01

    Contemporary models postulate the importance of psychological mechanisms linking perceived and received social support to physical health outcomes. In this review, we examine studies that directly tested the potential psychological mechanisms responsible for links between social support and health-relevant physiological processes (1980s to 2010). Inconsistent with existing theoretical models, no evidence was found that psychological mechanisms such as depression, perceived stress, and other affective processes are directly responsible for links between support and health. We discuss the importance of considering statistical/design issues, emerging conceptual perspectives, and limitations of our existing models for future research aimed at elucidating the psychological mechanisms responsible for links between social support and physical health outcomes. PMID:22326104

  10. Unbiased simulation of near-Clifford quantum circuits

    DOE PAGES

    Bennink, Ryan S.; Ferragut, Erik M.; Humble, Travis S.; ...

    2017-06-28

    Modeling and simulation are essential for predicting and verifying the behavior of fabricated quantum circuits, but existing simulation methods are either impractically costly or require an unrealistic simplification of error processes. In this paper, we present a method of simulating noisy Clifford circuits that is both accurate and practical in experimentally relevant regimes. In particular, the cost is weakly exponential in the size and the degree of non-Cliffordness of the circuit. Our approach is based on the construction of exact representations of quantum channels as quasiprobability distributions over stabilizer operations, which are then sampled, simulated, and weighted to yield unbiased statistical estimates of circuit outputs and other observables. As a demonstration of these techniques, we simulate a Steane [[7,1,3]] code circuit.
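
    The statistical core of the method, unbiased estimation from a quasiprobability decomposition with signed weights, can be illustrated generically. The toy decomposition below is an assumption for demonstration; it is not the stabilizer-channel construction of the paper:

```python
import numpy as np

rng = np.random.default_rng(7)

# toy quasiprobability decomposition: target value = sum_k q_k * e_k,
# where q has negative entries (sampling cost grows with the 1-norm of q)
q = np.array([0.9, 0.4, -0.3])     # quasiprobability weights (sum to 1)
e = np.array([0.2, -0.5, 0.7])     # outcome of each (stabilizer-like) operation
exact = q @ e

norm = np.abs(q).sum()             # "negativity" = sampling overhead
p = np.abs(q) / norm               # genuine probabilities to sample from
k = rng.choice(len(q), size=200_000, p=p)
estimates = np.sign(q[k]) * norm * e[k]   # sign-weighted, hence unbiased

print("exact:", exact,
      " MC estimate:", round(estimates.mean(), 4),
      " std err:", round(estimates.std() / np.sqrt(len(estimates)), 4))
```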

  11. A novel approach for introducing cloud spatial structure into cloud radiative transfer parameterizations

    NASA Astrophysics Data System (ADS)

    Huang, Dong; Liu, Yangang

    2014-12-01

    Subgrid-scale variability is one of the main reasons why parameterizations are needed in large-scale models. Although some parameterizations started to address the issue of subgrid variability by introducing a subgrid probability distribution function for relevant quantities, the spatial structure has been typically ignored and thus the subgrid-scale interactions cannot be accounted for physically. Here we present a new statistical-physics-like approach whereby the spatial autocorrelation function can be used to physically capture the net effects of subgrid cloud interaction with radiation. The new approach is able to faithfully reproduce the Monte Carlo 3D simulation results with several orders less computational cost, allowing for more realistic representation of cloud radiation interactions in large-scale models.

  12. AERONET derived (BC) aerosol absorption

    NASA Astrophysics Data System (ADS)

    Kinne, S.

    2015-12-01

    AERONET is a ground-based sun-/sky-photometer network with good annual statistics at more than 400 sites worldwide. Inversion methods applied to these data define all relevant column aerosol optical properties and reveal even microphysical detail. The extracted data include estimates for aerosol size distributions and for aerosol refractive indices at four different solar wavelengths. Here, the imaginary parts of the refractive indices define the aerosol column absorption. For regional and global averages and radiative impact assessment with off-line radiative transfer, these local data have been extended with distribution patterns offered by AeroCom modeling experiments. Annual and seasonal absorption distributions for total aerosol and estimates for component contributions (such as BC) are presented, and the associated direct forcing impacts are quantified.

  13. Test of ATLAS RPCs Front-End electronics

    NASA Astrophysics Data System (ADS)

    Aielli, G.; Camarri, P.; Cardarelli, R.; Di Ciaccio, A.; Di Stante, L.; Liberti, B.; Paoloni, A.; Pastori, E.; Santonico, R.

    2003-08-01

    The Front-End Electronics performing the ATLAS RPC readout is a full-custom 8-channel GaAs circuit, which integrates both the analog and digital signal processing in a single die. The die is bonded on the Front-End board, which is completely enclosed within the detector Faraday cage. About 50 000 FE boards are foreseen for the experiment. The complete functionality of the FE boards will be certified before detector assembly. We describe here the systematic test devoted to checking the dynamic functionality of each single channel and the selection criteria applied. The test measures and registers all relevant electronics parameters to build up a complete database for the experiment. Statistical results from more than 1100 channels are presented.

  14. What are the most important variables for Poaceae airborne pollen forecasting?

    PubMed

    Navares, Ricardo; Aznarte, José Luis

    2017-02-01

    In this paper, the problem of predicting future concentrations of airborne pollen is solved through a computational intelligence data-driven approach. The proposed method is able to identify the most important variables among those considered by other authors (mainly recent pollen concentrations and weather parameters), without any prior assumptions about the phenological relevance of the variables. Furthermore, an inferential procedure based on non-parametric hypothesis testing is presented to provide statistical evidence of the results, which are coherent to the literature and outperform previous proposals in terms of accuracy. The study is built upon Poaceae airborne pollen concentrations recorded in seven different locations across the Spanish province of Madrid. Copyright © 2016 Elsevier B.V. All rights reserved.
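
    As a generic illustration of data-driven variable ranking (the paper uses its own computational-intelligence approach and non-parametric hypothesis tests), permutation importance on synthetic pollen-like data might look as follows; all variable names and effect sizes are invented:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(8)
n = 1500

# toy predictors standing in for recent pollen levels and weather variables
pollen_lag1 = rng.gamma(2, 10, n)
temperature = rng.normal(15, 5, n)
rain = rng.exponential(1, n)
noise_var = rng.normal(size=n)                 # irrelevant by construction
y = 0.7 * pollen_lag1 + 2 * temperature - 5 * rain + rng.normal(0, 5, n)

X = np.column_stack([pollen_lag1, temperature, rain, noise_var])
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
imp = permutation_importance(model, X, y, n_repeats=20, random_state=0)
for name, m in zip(["pollen_lag1", "temperature", "rain", "noise"],
                   imp.importances_mean):
    print(f"{name:12s} {m:.3f}")
```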

  15. Visual feature extraction from voxel-weighted averaging of stimulus images in 2 fMRI studies.

    PubMed

    Hart, Corey B; Rose, William J

    2013-11-01

    Multiple studies have provided evidence for distributed object representation in the brain, with several recent experiments leveraging basis function estimates for partial image reconstruction from fMRI data. Using a novel combination of statistical decomposition, generalized linear models, and stimulus averaging on previously examined image sets, together with Bayesian regression of fMRI activity recorded during presentation of these data sets, we identify a subset of relevant voxels that appear to code for covarying object features. Using a technique we term "voxel-weighted averaging," we isolate the image filters that these voxels appear to implement. The results, though cursory, appear to have significant implications for hierarchical and deep-learning-type approaches to the understanding of neural coding and representation.

  16. [What kind of information do German health information pamphlets provide on mammography screening?].

    PubMed

    Kurzenhäuser, Stephanie

    2003-02-01

    To make an informed decision on participation in mammography screening, women have to be educated about all the risks and benefits of the procedure in a manner that is detailed and understandable. But an analysis of 27 German health pamphlets on mammography screening shows that many relevant pieces of information about the benefits, the risks, and especially the meaning of screening results are only insufficiently communicated. Many statements were presented narratively rather than as precise statistics. Depending on content, 17 to 62% of the quantifiable statements were actually given as numerical data. To provide comprehensive information and to avoid misunderstandings, it is necessary to supplement the currently available health pamphlets and make the information on mammography screening more precise.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wurtz, R.; Kaplan, A.

    Pulse shape discrimination (PSD) is a variety of statistical classifier. Fully realized statistical classifiers rely on a comprehensive set of tools for design, building, and implementation, and PSD advances rely on improvements to the implemented algorithm, which can draw on conventional statistical-classifier and machine learning methods. This paper provides the reader with a glossary of classifier-building elements and their functions in a fully designed and operational classifier framework, which can be used to discover opportunities for improving PSD classifier projects. This paper recommends reporting the PSD classifier's receiver operating characteristic (ROC) curve and its behavior at a gamma rejection rate (GRR) relevant for realistic applications.
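
    A minimal sketch of the recommended reporting, assuming synthetic PSD scores in which neutrons score higher than gammas (an illustrative assumption): compute the ROC curve and read off the neutron acceptance at a chosen gamma rejection rate.

```python
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(9)

# toy PSD scores: neutrons (label 1) score higher than gammas (label 0)
gamma_scores = rng.normal(0.0, 1.0, 100_000)
neutron_scores = rng.normal(1.5, 1.0, 1_000)
scores = np.concatenate([gamma_scores, neutron_scores])
labels = np.concatenate([np.zeros(100_000), np.ones(1_000)])

fpr, tpr, thr = roc_curve(labels, scores)

# neutron acceptance at a gamma rejection rate (GRR) relevant in practice
grr = 0.999                                    # reject 99.9% of gammas
i = np.searchsorted(fpr, 1 - grr)
print(f"at GRR={grr}: neutron acceptance = {tpr[i]:.3f}, threshold = {thr[i]:.2f}")
```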

  18. Temporal and spatial scaling impacts on extreme precipitation

    NASA Astrophysics Data System (ADS)

    Eggert, B.; Berg, P.; Haerter, J. O.; Jacob, D.; Moseley, C.

    2015-01-01

    Both in the current climate and in the light of climate change, understanding of the causes and risk of precipitation extremes is essential for protection of human life and adequate design of infrastructure. Precipitation extreme events depend qualitatively on the temporal and spatial scales at which they are measured, in part due to the distinct types of rain formation processes that dominate extremes at different scales. To capture these differences, we first filter large datasets of high-resolution radar measurements over Germany (5 min temporally and 1 km spatially) using synoptic cloud observations, to distinguish convective and stratiform rain events. In a second step, for each precipitation type, the observed data are aggregated over a sequence of time intervals and spatial areas. The resulting matrix allows a detailed investigation of the resolutions at which convective or stratiform events are expected to contribute most to the extremes. We analyze where the statistics of the two types differ and discuss at which resolutions transitions occur between dominance of either of the two precipitation types. We characterize the scales at which the convective or stratiform events will dominate the statistics. For both types, we further develop a mapping between pairs of spatially and temporally aggregated statistics. The resulting curve is relevant when deciding on data resolutions where statistical information in space and time is balanced. Our study may hence also serve as a practical guide for modelers, and for planning the space-time layout of measurement campaigns. We also describe a mapping between different pairs of resolutions, possibly relevant when working with mismatched model and observational resolutions, such as in statistical bias correction.

  19. Regional projection of climate impact indices over the Mediterranean region

    NASA Astrophysics Data System (ADS)

    Casanueva, Ana; Frías, M. Dolores; Herrera, Sixto; Bedia, Joaquín; San Martín, Daniel; Gutiérrez, José Manuel; Zaninovic, Ksenija

    2014-05-01

    Climate Impact Indices (CIIs) are increasingly used in different socioeconomic sectors to transfer information about climate change impacts and risks to stakeholders. CIIs are typically based on different weather variables, such as temperature, wind speed, precipitation, or humidity, and comprise, in a single index, the relevant meteorological information for the particular impact sector (in this study, wildfires and tourism). This dependence on several climate variables poses important limitations on the application of statistical downscaling techniques, since physical consistency among variables is required in most cases to obtain reliable local projections. The present study assesses the suitability of the "direct" downscaling approach, in which the downscaling method is applied directly to the CII. In particular, for illustrative purposes, we consider two popular indices used in the wildfire and tourism sectors, the Fire Weather Index (FWI) and the Physiological Equivalent Temperature (PET), respectively. As an example, two case studies are analysed over two representative Mediterranean regions of interest for the EU CLIM-RUN project: continental Spain for the FWI and Croatia for the PET. Results obtained with this "direct" downscaling approach are similar to those found when the statistical downscaling is applied to the individual meteorological drivers prior to the index calculation ("component" downscaling); thus, a wider range of statistical downscaling methods could be used. As an illustration, future changes in both indices are projected by applying two direct statistical downscaling methods, analogs and linear regression, to the ECHAM5 model. Larger differences were found between the two direct statistical downscaling approaches than between the direct and the component approaches with a single downscaling method. While these examples focus on particular indices and Mediterranean regions of interest for CLIM-RUN stakeholders, the same study could be extended to other indices and regions.

  20. Identifying biologically relevant differences between metagenomic communities.

    PubMed

    Parks, Donovan H; Beiko, Robert G

    2010-03-15

    Metagenomics is the study of genetic material recovered directly from environmental samples. Taxonomic and functional differences between metagenomic samples can highlight the influence of ecological factors on patterns of microbial life in a wide range of habitats. Statistical hypothesis tests can help us distinguish ecological influences from sampling artifacts, but knowledge of only the P-value from a statistical hypothesis test is insufficient to make inferences about biological relevance. Current reporting practices for pairwise comparative metagenomics are inadequate, and better tools are needed for comparative metagenomic analysis. We have developed a new software package, STAMP, for comparative metagenomics that supports best practices in analysis and reporting. Examination of a pair of iron mine metagenomes demonstrates that deeper biological insights can be gained using statistical techniques available in our software. An analysis of the functional potential of 'Candidatus Accumulibacter phosphatis' in two enhanced biological phosphorus removal metagenomes identified several subsystems that differ between the A. phosphatis strains in these related communities, including phosphate metabolism, secretion, and metal transport. Python source code and binaries are freely available from our website at http://kiwi.cs.dal.ca/Software/STAMP. Contact: beiko@cs.dal.ca. Supplementary data are available at Bioinformatics online.
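
    One statistic such a comparison can report alongside the P-value is the difference between proportions with a confidence interval. A minimal sketch with hypothetical counts, using a simple Wald interval (STAMP itself supports richer options):

```python
import numpy as np
from scipy.stats import norm

def diff_of_proportions(a1, n1, a2, n2, alpha=0.05):
    """Effect size (difference between proportions) with a Wald CI."""
    p1, p2 = a1 / n1, a2 / n2
    d = p1 - p2
    se = np.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    z = norm.ppf(1 - alpha / 2)
    return d, (d - z * se, d + z * se)

# sequences assigned to a subsystem in two metagenomes (hypothetical counts)
d, ci = diff_of_proportions(150, 40_000, 80, 35_000)
print(f"difference = {d:.4%}, 95% CI = ({ci[0]:.4%}, {ci[1]:.4%})")
```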

  1. Infrared sauna in patients with rheumatoid arthritis and ankylosing spondylitis. A pilot study showing good tolerance, short-term improvement of pain and stiffness, and a trend towards long-term beneficial effects.

    PubMed

    Oosterveld, Fredrikus G J; Rasker, Johannes J; Floors, Mark; Landkroon, Robert; van Rennes, Bob; Zwijnenberg, Jan; van de Laar, Mart A F J; Koel, Gerard J

    2009-01-01

    To study the effects of infrared (IR) sauna, a form of total-body hyperthermia, in patients with rheumatoid arthritis (RA) and ankylosing spondylitis (AS), patients were treated for a 4-week period with a series of eight IR treatments. Seventeen RA patients and 17 AS patients were studied. IR was well tolerated; no adverse effects and no exacerbation of disease were reported. Pain and stiffness decreased clinically, and improvements were statistically significant (p < 0.05 and p < 0.001 in RA and AS patients, respectively) during an IR session. Fatigue also decreased. Both RA and AS patients felt comfortable on average during and especially after treatment. In the RA and AS patients, pain, stiffness, and fatigue also showed clinical improvements during the 4-week treatment period, but these did not reach statistical significance. No relevant changes in disease activity scores were found, indicating no exacerbation of disease activity. In conclusion, infrared treatment has statistically significant short-term beneficial effects and clinically relevant period effects during treatment in RA and AS patients, without enhancing disease activity. IR has good tolerability and no adverse effects.

  2. Statistical Models of Fracture Relevant to Nuclear-Grade Graphite: Review and Recommendations

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Bratton, Robert L.

    2011-01-01

    The nuclear-grade (low-impurity) graphite needed for the fuel element and moderator material for next-generation (Gen IV) reactors displays large scatter in strength and a nonlinear stress-strain response from damage accumulation. This response can be characterized as quasi-brittle. In this expanded review, relevant statistical failure models for various brittle and quasi-brittle material systems are discussed with regard to strength distribution, size effect, multiaxial strength, and damage accumulation. This includes descriptions of the Weibull, Batdorf, and Burchell models as well as models that describe the strength response of composite materials, which involves distributed damage. Results from lattice simulations are included for a physics-based description of material breakdown. Consideration is given to the predicted transition between brittle and quasi-brittle damage behavior versus the density of damage (level of disorder) within the material system. The literature indicates that weakest-link-based failure modeling approaches appear to be reasonably robust in that they can be applied to materials that display distributed damage, provided that the level of disorder in the material is not too large. The Weibull distribution is argued to be the most appropriate statistical distribution to model the stochastic-strength response of graphite.
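
    The weakest-link (Weibull) description lends itself to a quick numerical illustration of the size effect. A minimal sketch with an assumed Weibull modulus and characteristic strength: sampling strengths from P_f = 1 - exp(-(V/V0)(sigma/sigma0)^m) shows that mean strength falls with specimen volume, roughly as V^(-1/m).

```python
import numpy as np

rng = np.random.default_rng(10)

def weibull_strengths(m, sigma0, v_ratio, n=100_000):
    """Sample strengths from a weakest-link (Weibull) model:
    P_f = 1 - exp(-(V/V0) * (sigma/sigma0)^m), inverted via inverse CDF."""
    u = rng.uniform(size=n)
    return sigma0 * (-np.log(1 - u) / v_ratio) ** (1 / m)

m, sigma0 = 10.0, 30.0    # assumed Weibull modulus and characteristic strength (MPa)
for v in (1, 8, 64):      # larger specimens are weaker on average
    s = weibull_strengths(m, sigma0, v)
    print(f"V/V0={v:3d}: mean strength = {s.mean():.1f} MPa")
```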

  3. Confirmation of ovarian homogeneity in post-vitellogenic cultured white sturgeon, Acipenser transmontanus.

    PubMed

    Talbott, Mariah J; Servid, Sarah A; Cavinato, Anna G; Van Eenennaam, Joel P; Doroshov, Serge I; Struffenegger, Peter; Webb, Molly A H

    2014-02-01

    Assessing stage of oocyte maturity in female sturgeon by calculating oocyte polarization index (PI) is a necessary tool for both conservation propagation managers and caviar producers to know when to hormonally induce spawning. We tested the assumption that sampling ovarian follicles from one section of one ovary is sufficient for calculating an oocyte PI representative of oocyte maturity for an individual animal. Short-wavelength near-infrared spectroscopy (SW-NIR) scans were performed on three positions per ovary for five fish prior to caviar harvest. Samples of ovarian follicles were subsequently taken from the exact location of the SW-NIR scans for calculation of oocyte PI and follicle diameter. Oocyte PI was statistically different though not biologically relevant within an ovary and between ovaries in four of five fish. Follicle diameter was statistically different but not biologically relevant within an ovary in three of five fish. There were no differences in follicle diameter between ovaries. No statistical differences were observed between SW-NIR spectra collected at different locations within an ovary or between ovaries. These results emphasize the importance of utilizing both oocyte PI measurement and progesterone-induced oocyte maturation assays while deciding when to hormonally induce spawning in sturgeon females.

  4. Pitfalls in the statistical examination and interpretation of the correspondence between physician and patient satisfaction ratings and their relevance for shared decision making research

    PubMed Central

    2011-01-01

    Background The correspondence of satisfaction ratings between physicians and patients can be assessed on different dimensions. One may examine whether they differ between the two groups or focus on measures of association or agreement. The aim of our study was to evaluate methodological difficulties in calculating the correspondence between patient and physician satisfaction ratings and to show the relevance for shared decision making research. Methods We utilised a structured tool for cardiovascular prevention (arriba™) in a pragmatic cluster-randomised controlled trial. Correspondence between patient and physician satisfaction ratings after individual primary care consultations was assessed using the Patient Participation Scale (PPS). We used the Wilcoxon signed-rank test, the marginal homogeneity test, Kendall's tau-b, weighted kappa, percentage of agreement, and the Bland-Altman method to measure differences, associations, and agreement between physicians and patients. Results Statistical measures signalled large differences between patient and physician satisfaction ratings, with more favourable ratings provided by patients and a low correspondence regardless of group allocation. Closer examination of the raw data revealed a high ceiling effect of satisfaction ratings and only slight disagreement regarding the distributions of differences between physicians' and patients' ratings. Conclusions Traditional statistical measures of association and agreement are not able to capture a clinically relevant appreciation of the physician-patient relationship by both parties in skewed satisfaction ratings. Only the Bland-Altman method for assessing agreement, augmented by bar charts of differences, was able to indicate this. Trial registration ISRCTN: ISRCTN71348772 PMID:21592337
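
    A Bland-Altman analysis of the kind highlighted above is straightforward to produce. A minimal sketch with simulated ceiling-limited ratings (hypothetical numbers, not the trial data):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(11)

# paired satisfaction ratings with a ceiling effect (scale 1-5);
# patients assumed to rate somewhat higher than physicians
patient = np.clip(rng.normal(4.6, 0.5, 120), 1, 5)
physician = np.clip(patient - np.abs(rng.normal(0.3, 0.3, 120)), 1, 5)

mean = (patient + physician) / 2
diff = patient - physician
bias, sd = diff.mean(), diff.std(ddof=1)

plt.scatter(mean, diff, alpha=0.5)
plt.axhline(bias, color="k", label=f"bias = {bias:.2f}")
for lim in (bias - 1.96 * sd, bias + 1.96 * sd):
    plt.axhline(lim, color="k", linestyle="--")   # 95% limits of agreement
plt.xlabel("mean of paired ratings")
plt.ylabel("patient - physician")
plt.legend()
plt.show()
```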

  5. Combining Shapley value and statistics to the analysis of gene expression data in children exposed to air pollution

    PubMed Central

    Moretti, Stefano; van Leeuwen, Danitsja; Gmuender, Hans; Bonassi, Stefano; van Delft, Joost; Kleinjans, Jos; Patrone, Fioravante; Merlo, Domenico Franco

    2008-01-01

    Background In gene expression analysis, statistical tests for differential gene expression provide lists of candidate genes having, individually, a sufficiently low p-value. However, the interpretation of each single p-value within complex systems involving several interacting genes is problematic. In parallel, over the last sixty years, game theory has been applied to political and social problems to assess the power of interacting agents in forcing a decision and, more recently, to represent the relevance of genes in response to certain conditions. Results In this paper we introduce a bootstrap procedure to test the null hypothesis that each gene has the same relevance between two conditions, where relevance is represented by the Shapley value of a particular coalitional game defined on a microarray data-set. This method, called Comparative Analysis of Shapley value (CASh for short), is applied to data concerning gene expression in children differentially exposed to air pollution. The results provided by CASh are compared with the results from a parametric statistical test for differential gene expression. Both lists of genes provided by CASh and the t-test are informative enough to discriminate exposed subjects on the basis of their gene expression profiles. While many genes are selected in common by CASh and the parametric test, it turns out that the biological interpretation of the differences between these two selections is more interesting, suggesting a different interpretation of the main biological pathways in gene expression regulation for exposed individuals. A simulation study suggests that CASh offers more power than the t-test for the detection of differential gene expression variability. Conclusion CASh is successfully applied to gene expression analysis of a data-set where the joint expression behavior of genes may be critical to characterize the expression response to air pollution. We demonstrate a synergistic effect between coalitional games and statistics that resulted in a selection of genes with a potential impact in the regulation of complex pathways. PMID:18764936
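
    The Shapley value underlying CASh can be computed exactly for small player sets by averaging marginal contributions over all orderings. A minimal sketch with a hypothetical three-gene coalitional game (the bootstrap comparison of CASh is not reproduced here):

```python
from itertools import permutations

def shapley(players, value):
    """Exact Shapley values by averaging marginal contributions over all
    orderings (feasible only for small player sets)."""
    phi = {p: 0.0 for p in players}
    orderings = list(permutations(players))
    for order in orderings:
        coalition = frozenset()
        for p in order:
            phi[p] += value(coalition | {p}) - value(coalition)
            coalition = coalition | {p}
    return {p: v / len(orderings) for p, v in phi.items()}

# hypothetical game: a coalition of genes "scores" only if it contains g1 and g2
def v(S):
    return 1.0 if {"g1", "g2"} <= S else 0.0

print(shapley(["g1", "g2", "g3"], v))   # g1 and g2 split the value, g3 gets 0
```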

  6. A Compositional Relevance Model for Adaptive Information Retrieval

    NASA Technical Reports Server (NTRS)

    Mathe, Nathalie; Chen, James; Lu, Henry, Jr. (Technical Monitor)

    1994-01-01

    There is a growing need for rapid and effective access to information in large electronic documentation systems. Access can be facilitated if information relevant in the current problem solving context can be automatically supplied to the user. This includes information relevant to particular user profiles, tasks being performed, and problems being solved. However, most of this knowledge on contextual relevance is not found within the contents of documents, and current hypermedia tools do not provide any easy mechanism to let users add this knowledge to their documents. We propose a compositional relevance network to automatically acquire the context in which previous information was found relevant. The model records information on the relevance of references based on user feedback for specific queries and contexts. It also generalizes such information to derive relevant references for similar queries and contexts. This model lets users filter information by context of relevance, build personalized views of documents over time, and share their views with other users. It also applies to any type of multimedia information. Compared to other approaches, it is less costly and requires neither a priori statistical computation nor an extended training period. It is currently being implemented in the Computer Integrated Documentation system, which enables integration of various technical documents in a hypertext framework.

  7. The rank-heat plot is a novel way to present the results from a network meta-analysis including multiple outcomes.

    PubMed

    Veroniki, Areti Angeliki; Straus, Sharon E; Fyraridis, Alexandros; Tricco, Andrea C

    2016-08-01

    To present a novel and simple graphical approach to improve the presentation of the treatment ranking in a network meta-analysis (NMA) including multiple outcomes. NMA simultaneously compares many relevant interventions for a clinical condition from a network of trials, and allows ranking of the effectiveness and/or safety of each intervention. There are numerous ways to present the NMA results, which can challenge their interpretation by research users. The rank-heat plot is a novel graph that can be used to quickly recognize which interventions are most likely the best or worst interventions with respect to their effectiveness and/or safety for a single or multiple outcome(s) and may increase interpretability. Using empirical NMAs, we show that the need for a concise and informative presentation of results is imperative, particularly as the number of competing treatments and outcomes in an NMA increases. The rank-heat plot is an efficient way to present the results of ranking statistics, particularly when a large amount of data is available, and it is targeted to users from various backgrounds. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. Hormesis as a biological hypothesis.

    PubMed Central

    Calabrese, E J; Baldwin, L A

    1998-01-01

    A comprehensive effort was undertaken to identify articles demonstrating chemical hormesis. Nearly 4000 potentially relevant articles were retrieved from preliminary computer database searches by using various key word descriptors and extensive cross-referencing. A priori evaluation criteria were established, including study design features (e.g., number of doses, dose range), statistical analysis, and reproducibility of results. Evidence of chemical hormesis was judged to have occurred in approximately 350 of the 4000 studies evaluated. Chemical hormesis was observed in a wide range of taxonomic groups and involved agents representing highly diverse chemical classes, many of potential environmental relevance. Numerous biological end points were assessed; growth responses were the most prevalent, followed by metabolic effects, longevity, reproductive responses, and survival. Hormetic responses were generally observed to be of limited magnitude. The average low-dose maximum stimulation was approximately 50% greater than controls. The hormetic dose-response range was generally limited to about one order of magnitude, with the upper end of the hormetic curve approaching the estimated no observable effect level for the particular end point. Based on the evaluation criteria, high to moderate evidence of hormesis was observed in studies comprising more than six doses, with more than three doses in the hormetic zone. The present analysis suggests that chemical hormesis is a reproducible and relatively common biological phenomenon. A quantitative scheme is presented for future application to the database. PMID:9539030

  9. Suicide and occupation: further supportive evidence for their relevance.

    PubMed

    Nishimura, Mariko; Terao, Takeshi; Soeda, Shuji; Nakamura, Jun; Iwata, Noboru; Sakamoto, Kaoru

    2004-01-01

    In recent years, the relationship between occupation and suicide has been extensively investigated, but few definite conclusions regarding the nature of the relationship have been established. In the present study, this relationship was investigated by examining Japanese governmental statistics. First, correlations of suicide rate with industry categories were examined individually for primary industry (farmers, fishermen, and forest workers), secondary industry (construction workers, manufacturing workers, and miners), and tertiary industry (indoor workers) for all of the 47 prefectures of Japan. Second, in the industries that showed a significant correlation with suicide rate, the relationship was adjusted for possibly confounding factors. As a result, suicide rate was positively correlated with primary industry percentage, but not with secondary or tertiary industry percentages. Multiple regression analysis showed that suicide rate had a marginally significant positive association with primary industry percentage, while it was significantly and negatively associated with annual total sunshine. Limitations are that individual suicide rates according to occupational type were not available and direct correlations with the above variables could not be investigated. The present findings suggest that occupational factors associated with primary industry may be relevant to suicide and that, additionally, annual total sunshine may affect suicide independently. Since workers in primary industry are more likely to be exposed to sunshine than other workers, they may be more affected by a decrease in annual total sunshine.

  10. Risk factors for health care-associated infections: From better knowledge to better prevention.

    PubMed

    Ferreira, Etelvina; Pina, Elaine; Sousa-Uva, Mafalda; Sousa-Uva, António

    2017-10-01

    Health care-associated infections (HCAIs) are preventable with adoption of recognized preventive measures. The first step is to identify patients at higher risk of HCAI. This study aimed to identify patient risk factors (RFs), present on admission or acquired during the inpatient stay, that could be associated with a higher risk of acquiring HCAI. A case-control study was conducted in adult patients admitted during 2011 who were hospitalized for >48 hours. Cases were patients with HCAIs. Controls were selected in a ratio of 3:1, matched to cases by admission date. The likelihood of increased HCAI was determined through binary logistic regression. RFs identified as most relevant for HCAI were being a man (odds ratio [OR], 2.4; 95% confidence interval [CI], 1.2-4.7), being aged >50 years (OR, 2.9; 95% CI, 1.3-6.9), and having a central venous line inserted during the hospital stay (OR, 12.4; 95% CI, 5.0-30.5). RFs that showed statistical significance on admission were the patient's intrinsic factors, whereas RFs acquired during hospitalization were extrinsic RFs. When a set of RFs was present, the presence of a central venous line proved to be the most relevant one. Copyright © 2017 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.

  11. Statistical physics approach to earthquake occurrence and forecasting

    NASA Astrophysics Data System (ADS)

    de Arcangelis, Lucilla; Godano, Cataldo; Grasso, Jean Robert; Lippiello, Eugenio

    2016-04-01

    There is striking evidence that the dynamics of the Earth crust is controlled by a wide variety of mutually dependent mechanisms acting at different spatial and temporal scales. The interplay of these mechanisms produces instabilities in the stress field, leading to abrupt energy releases, i.e., earthquakes. As a consequence, the evolution towards instability before a single event is very difficult to monitor. On the other hand, collective behavior in stress transfer and relaxation within the Earth crust leads to emergent properties described by stable phenomenological laws for a population of many earthquakes in size, time and space domains. This observation has stimulated a statistical mechanics approach to earthquake occurrence, applying ideas and methods such as scaling laws, universality, fractal dimension, and the renormalization group to characterize the physics of earthquakes. In this review we first present a description of the phenomenological laws of earthquake occurrence which provide the frame of reference for a variety of statistical mechanical models, ranging from the spring-block to more complex fault models. Next, we discuss the problem of seismic forecasting in the general framework of stochastic processes, where seismic occurrence can be described as a branching process implementing space-time-energy correlations between earthquakes. In this context we show how correlations originate from dynamical scaling relations between time and energy, able to account for universality and provide a unifying description for the phenomenological power laws. Then we discuss how branching models can be implemented to forecast the temporal evolution of the earthquake occurrence probability and allow one to discriminate among different physical mechanisms responsible for earthquake triggering. In particular, the forecasting problem will be presented in a rigorous mathematical framework, discussing the relevance of the processes acting at different temporal scales for different levels of prediction. In this review we also briefly discuss how the statistical mechanics approach can be applied to non-tectonic earthquakes and to other natural stochastic processes, such as volcanic eruptions and solar flares.
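
    The branching-process view of seismicity can be made concrete with a minimal ETAS-like simulation; the parameter values below are illustrative assumptions, not fitted to any catalog:

```python
import numpy as np

rng = np.random.default_rng(12)

def hawkes_branching(t_max, mu=0.5, n_avg=0.8, c=0.01, p=1.2):
    """Minimal ETAS-like branching sketch: background events arrive as a
    Poisson process with rate mu; each event triggers on average n_avg
    direct aftershocks, with Omori-law delays ~ (1 + t/c)^(-p)."""
    queue = list(rng.uniform(0.0, t_max, rng.poisson(mu * t_max)))
    events = []
    while queue:
        t = queue.pop()
        if t > t_max:
            continue
        events.append(t)
        for _ in range(rng.poisson(n_avg)):
            u = rng.uniform()
            dt = c * ((1 - u) ** (-1.0 / (p - 1.0)) - 1.0)  # inverse-CDF sample
            queue.append(t + dt)
    return np.sort(np.array(events))

times = hawkes_branching(1000.0)
# subcritical branching: total rate should approach mu / (1 - n_avg) = 2.5
print(f"{len(times)} events, empirical rate {len(times) / 1000.0:.2f}")
```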

  12. Detecting clinically relevant new information in clinical notes across specialties and settings.

    PubMed

    Zhang, Rui; Pakhomov, Serguei V S; Arsoniadis, Elliot G; Lee, Janet T; Wang, Yan; Melton, Genevieve B

    2017-07-05

    Automated methods for identifying clinically relevant new versus redundant information in electronic health record (EHR) clinical notes are useful for clinicians and researchers involved in patient care and clinical research, respectively. We evaluated methods to automatically identify clinically relevant new information in clinical notes, and compared the quantity of redundant information across specialties and clinical settings. Statistical language models augmented with semantic similarity measures were evaluated as a means to detect and quantify clinically relevant new and redundant information over longitudinal clinical notes for a given patient. A corpus of 591 progress notes over 40 inpatient admissions was annotated for new information longitudinally by physicians to generate a reference standard. Note redundancy between various specialties was evaluated on 71,021 outpatient notes and 64,695 inpatient notes from 500 solid organ transplant patients (April 2015 through August 2015). Our best method achieved performance of 0.87 recall, 0.62 precision, and 0.72 F-measure. The addition of semantic similarity metrics improved recall compared to baseline but otherwise resulted in similar performance. While outpatient and inpatient notes had relatively similar levels of high redundancy (61% and 68%, respectively), redundancy differed by author specialty, with mean redundancy of 75%, 66%, 57%, and 55% observed in pediatric, internal medicine, psychiatry, and surgical notes, respectively. Automated techniques with statistical language models for detecting redundant versus clinically relevant new information in clinical notes do not improve with the addition of semantic similarity measures. While levels of redundancy seem relatively similar in the inpatient and ambulatory settings in Fairview Health Services, clinical note redundancy appears to vary significantly with medical specialty.
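
    A crude stand-in for language-model-based redundancy scoring (not the authors' method, which uses statistical language models with semantic similarity) is the fraction of a new note's word bigrams already seen in a patient's earlier notes; the example notes below are invented:

```python
from collections import Counter

def bigrams(text):
    toks = text.lower().split()
    return Counter(zip(toks, toks[1:]))

def redundancy(new_note, prior_notes):
    """Fraction of the new note's word bigrams already present in prior notes."""
    seen = Counter()
    for note in prior_notes:
        seen.update(bigrams(note))
    new = bigrams(new_note)
    overlap = sum(c for bg, c in new.items() if bg in seen)
    return overlap / max(sum(new.values()), 1)

prior = ["patient stable on current medication no acute distress",
         "continue current medication follow up in two weeks"]
new = "patient stable on current medication new rash noted on left arm"
print(f"redundancy = {redundancy(new, prior):.2f}")
```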

  13. Outcomes associated with community-based research projects in teaching undergraduate public health.

    PubMed

    Bouhaimed, Manal; Thalib, Lukman; Doi, Suhail A R

    2008-01-01

    Community-based research projects have been widely used in teaching public health in many institutions. Nevertheless, there is a paucity of information on the learning outcomes of such a teaching strategy. We therefore attempted to evaluate our experience with such a project-based teaching process. The objective of this study was to evaluate the factors related to the quality, impact and relevance of a 6-week student project for teaching public health in the Faculty of Medicine at Kuwait University. Interactive sessions familiarized students with research methods. Concurrently, they designed and completed a participatory project with a Community Medicine mentor. Questionnaires were used to assess the quality, impact and relevance of each project, and these were correlated with multiple demographic, statistical and research design factors. We evaluated a total of 104 projects completed during the period of September 2001 to June 2006. Three dimensions of outcome were assessed: quality, impact and relevance. The average (mean ± SE; maximum of 5) scores across all projects were 2.6 ± 0.05 (range 1.7-3.7) for quality, 2.8 ± 0.06 (range 1.7-4.3) for impact and 3.3 ± 0.08 (range 1.3-5) for relevance. The analysis of the relationship between various factors and the scores on each dimension revealed several factors associated with improved quality, impact or relevance to public health practice. We conclude that the use of more objective measurement instruments with better a priori conceptualization, along with appropriate use of statistics and a more developed study design, was likely to result in more meaningful research outcomes. We also found that having a biostatistics or epidemiology mentor improved the research outcome.

  14. Statistical Relational Learning to Predict Primary Myocardial Infarction from Electronic Health Records

    PubMed Central

    Weiss, Jeremy C; Page, David; Peissig, Peggy L; Natarajan, Sriraam; McCarty, Catherine

    2013-01-01

    Electronic health records (EHRs) are an emerging relational domain with large potential to improve clinical outcomes. We apply two statistical relational learning (SRL) algorithms to the task of predicting primary myocardial infarction. We show that one SRL algorithm, relational functional gradient boosting, outperforms propositional learners, particularly in the medically relevant high-recall region. We observe that both SRL algorithms predict outcomes better than their propositional analogs and suggest how our methods can augment current epidemiological practices. PMID:25360347
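
    Relational functional gradient boosting is not available in standard Python libraries, but the propositional analog and the evaluation emphasis of this record can be sketched as below: a gradient-boosted classifier on flattened (propositionalized) features, scored by its precision in the high-recall region. The synthetic data, class imbalance, and recall cutoff of 0.9 are assumptions for illustration, not from the paper.

    ```python
    from sklearn.datasets import make_classification
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.metrics import precision_recall_curve
    from sklearn.model_selection import train_test_split

    # Synthetic stand-in for propositionalized EHR features (labs, codes, meds),
    # with a rare positive class to mimic outcome prediction.
    X, y = make_classification(n_samples=2000, n_features=30,
                               weights=[0.9], random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

    clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
    scores = clf.predict_proba(X_te)[:, 1]

    # Evaluate precision in the high-recall region (recall >= 0.9),
    # the clinically important regime for screening-style tasks.
    precision, recall, _ = precision_recall_curve(y_te, scores)
    print("best precision at recall >= 0.9:",
          precision[recall >= 0.9].max())
    ```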

  15. Compounding approach for univariate time series with nonstationary variances

    NASA Astrophysics Data System (ADS)

    Schäfer, Rudi; Barkhofen, Sonja; Guhr, Thomas; Stöckmann, Hans-Jürgen; Kuhl, Ulrich

    2015-12-01

    A defining feature of nonstationary systems is the time dependence of their statistical parameters. Measured time series may exhibit Gaussian statistics on short time horizons, due to the central limit theorem. The sample statistics for long time horizons, however, average over the time-dependent variances. To model the long-term statistical behavior, we compound the local distribution with the distribution of its parameters. Here, we consider two concrete, but diverse, examples of such nonstationary systems: the turbulent air flow of a fan and a time series of foreign exchange rates. Our main focus is to empirically determine the appropriate parameter distribution for the compounding approach. To this end, we extract the relevant time scales by decomposing the time signals into windows and determine the distribution function of the local variances obtained in this way.
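
    The windowing step of the compounding approach is straightforward to sketch. Below is a minimal Python illustration, assuming a synthetic series that is locally Gaussian with a slowly wandering variance; the window length and variance-process parameters are placeholders, and the paper's further step of fitting a parameter distribution to these local variances is omitted.

    ```python
    import numpy as np

    def local_variances(x, window=100):
        """Decompose a series into non-overlapping windows and return
        the variance estimated within each window."""
        n = (len(x) // window) * window
        return x[:n].reshape(-1, window).var(axis=1)

    # Synthetic nonstationary series: locally Gaussian, but the variance
    # drifts slowly, so long-horizon statistics are a compound (mixture)
    # over the distribution of local variances.
    rng = np.random.default_rng(1)
    sigma = np.exp(0.5 * np.cumsum(rng.normal(0, 0.02, 100_000)))
    x = rng.normal(0, 1, 100_000) * sigma

    v = local_variances(x, window=200)
    print("mean local variance:", v.mean(), "spread:", v.std())
    ```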

  16. Compounding approach for univariate time series with nonstationary variances.

    PubMed

    Schäfer, Rudi; Barkhofen, Sonja; Guhr, Thomas; Stöckmann, Hans-Jürgen; Kuhl, Ulrich

    2015-12-01

    A defining feature of nonstationary systems is the time dependence of their statistical parameters. Measured time series may exhibit Gaussian statistics on short time horizons, due to the central limit theorem. The sample statistics for long time horizons, however, average over the time-dependent variances. To model the long-term statistical behavior, we compound the local distribution with the distribution of its parameters. Here, we consider two concrete, but diverse, examples of such nonstationary systems: the turbulent air flow of a fan and a time series of foreign exchange rates. Our main focus is to empirically determine the appropriate parameter distribution for the compounding approach. To this end, we extract the relevant time scales by decomposing the time signals into windows and determine the distribution function of the local variances obtained in this way.

  17. Compressed ultrasound video image-quality evaluation using a Likert scale and Kappa statistical analysis

    NASA Astrophysics Data System (ADS)

    Stewart, Brent K.; Carter, Stephen J.; Langer, Steven G.; Andrew, Rex K.

    1998-06-01

    Experiments using NASA's Advanced Communications Technology Satellite were conducted to provide an estimate of the compressed video quality required for preservation of clinically relevant features for the detection of trauma. Bandwidth rates of 128, 256 and 384 kbps were used. A five-point Likert scale (1 = no useful information, 5 = good diagnostic quality) was used for a subjective preference questionnaire to evaluate the quality of the compressed ultrasound imagery at the three compression rates for several anatomical regions of interest. At 384 kbps the Likert scores (mean ± SD) were: abdomen (4.45 ± 0.71), carotid artery (4.70 ± 0.36), kidney (5.0 ± 0.0), liver (4.67 ± 0.58) and thyroid (4.03 ± 0.74). Due to the volatile nature of the H.320 compressed digital video stream, no statistically significant results can be derived through this methodology. As the MPEG standard shares many of the same intraframe and motion-vector compression algorithms as H.261 (such as that used in the previous ACTS/AMT experiments), we are using the MPEG compressed video sequences to best gauge the minimum bandwidths necessary for preservation of clinically relevant features for the detection of trauma. We have been using an MPEG codec board to collect losslessly compressed video clips from high-quality S-VHS tapes and through direct digitization of S-video. Given the large number of videoclips and questions to be presented to the radiologists, and for ease of application, we have developed a web browser interface for this video visual perception study. Because of the large number of observations required to reach statistical significance in most ROC studies, Kappa statistical analysis is used to analyze the degree of agreement between observers and between viewing assessments. If the degree of agreement among readers is high, then there is a possibility that the ratings (i.e., the average Likert score at each bandwidth) do in fact reflect the dimension they are purported to reflect (video quality versus bandwidth). It is then possible to make an intelligent choice of bandwidth for streaming compressed video and compressed videoclips.
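
    The Kappa agreement analysis described here is easy to reproduce in outline. The sketch below uses scikit-learn's Cohen's kappa on hypothetical Likert ratings (1-5) from two readers scoring the same videoclips at one bandwidth; the ratings are invented for illustration and are not the study's data.

    ```python
    from sklearn.metrics import cohen_kappa_score

    # Hypothetical Likert ratings (1-5) from two readers on the same
    # set of compressed videoclips at a single bandwidth.
    reader_a = [5, 4, 4, 5, 3, 4, 5, 2, 4, 5]
    reader_b = [5, 4, 3, 5, 3, 4, 4, 2, 4, 5]

    # Quadratic-weighted kappa credits near-misses on an ordinal scale,
    # which suits Likert data better than unweighted kappa.
    print(cohen_kappa_score(reader_a, reader_b, weights="quadratic"))
    ```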

  18. Diagnosis of Mood Disorders.

    ERIC Educational Resources Information Center

    Seligman, Linda; Moore, Bonita Marcus

    1995-01-01

    Provides an overview of mood disorders according to Diagnostic and Statistical Manual (fourth edition) criteria and other relevant information. Differential diagnosis is facilitated through discussion of differences and similarities among mental disorders, age and gender-related patterns of mood disorders, and useful diagnostic tools. (Author)

  19. Investigation of α -induced reactions on Sb isotopes relevant to the astrophysical γ process

    NASA Astrophysics Data System (ADS)

    Korkulu, Z.; Özkan, N.; Kiss, G. G.; Szücs, T.; Gyürky, Gy.; Fülöp, Zs.; Güray, R. T.; Halász, Z.; Rauscher, T.; Somorjai, E.; Török, Zs.; Yalçın, C.

    2018-04-01

    Background: The reaction rates used in γ-process nucleosynthesis network calculations are mostly derived from theoretical, statistical model cross sections. Experimental data are scarce for charged-particle reactions at astrophysical, low energies. Where experimental (α,γ) data exist, they are often strongly overestimated by Hauser-Feshbach statistical model calculations. Further experimental α-capture cross sections in the intermediate and heavy mass region are necessary to test theoretical models and to gain understanding of heavy element nucleosynthesis in the astrophysical γ process. Purpose: The aim of the present work is to measure the 121Sb(α,γ)125I, 121Sb(α,n)124I, and 123Sb(α,n)126I reaction cross sections. These measurements are important tests of astrophysical reaction rate predictions and extend the experimental database required for an improved understanding of p-isotope production. Method: The α-induced reactions on natural and enriched antimony targets were investigated using the activation technique. The (α,γ) cross sections of 121Sb were measured and are reported for the first time. To determine the cross sections of the 121Sb(α,γ)125I, 121Sb(α,n)124I, and 123Sb(α,n)126I reactions, the yields of γ rays following the β decay of the reaction products were measured. For the measurement of the lowest cross sections, the characteristic x rays were counted with a low-energy photon spectrometer detector. Results: The cross sections of the 121Sb(α,γ)125I, 121Sb(α,n)124I, and 123Sb(α,n)126I reactions were measured with high precision in an energy range between 9.74 and 15.48 MeV, close to the astrophysically relevant energy window. The results are compared with the predictions of statistical model calculations. The (α,n) data show that the α widths are predicted well for these reactions. The (α,γ) results are overestimated by the calculations, but this is due to the neutron and γ widths applied. Conclusions: Relevant for the astrophysical reaction rate is the α width used in the calculations. While for other reactions the α widths seem to have been overestimated and their energy dependence was not described well in the measured energy range, this is not the case for the reactions studied here. The result is consistent with the proposal that additional reaction channels, such as Coulomb excitation, may have led to the discrepancies found in other reactions.
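
    The "astrophysically relevant energy window" mentioned here can be estimated with the standard narrow Gamow-peak approximation, E0 = 0.122 (Z1² Z2² μ T9²)^(1/3) MeV with width Δ = 0.237 (Z1² Z2² μ T9⁵)^(1/6) MeV. The short Python sketch below evaluates it for 121Sb + α at plausible γ-process temperatures; the formulas are textbook approximations, not taken from this paper, and the temperature values are assumptions.

    ```python
    # Approximate Gamow window for 121Sb + alpha using the standard
    # narrow-peak formulas (E0 in MeV, T9 = temperature in GK).
    Z1, Z2 = 51, 2                      # Sb and alpha charge numbers
    mu = 4.0 * 121.0 / (4.0 + 121.0)    # reduced mass number

    for T9 in (2.0, 2.5, 3.0):          # assumed gamma-process temperatures
        E0 = 0.122 * (Z1**2 * Z2**2 * mu * T9**2) ** (1.0 / 3.0)
        delta = 0.237 * (Z1**2 * Z2**2 * mu * T9**5) ** (1.0 / 6.0)
        print(f"T9={T9}: E0 = {E0:.1f} MeV, width = {delta:.1f} MeV")
    ```

    At T9 = 3 this gives E0 near 8.7 MeV, consistent with the statement that the measured range starting at 9.74 MeV lies close to the relevant window.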

  20. An Update on the NASA Planetary Science Division Research and Analysis Program

    NASA Astrophysics Data System (ADS)

    Bernstein, Max; Richey, Christina; Rall, Jonathan

    2015-11-01

    Introduction: NASA's Planetary Science Division (PSD) solicits its research and analysis (R&A) programs each year in Research Opportunities in Space and Earth Sciences (ROSES). Beginning with the 2014 ROSES solicitation, PSD changed the structure of the program elements under which the majority of planetary science R&A is done. Major changes included the creation of five core research program elements aligned with PSD's strategic science questions, the introduction of several new R&A opportunities, new submission requirements, and a new timeline for proposal submission. ROSES and NSPIRES: ROSES contains the research announcements for all of SMD. Submission of ROSES proposals is done electronically via NSPIRES: http://nspires.nasaprs.com. We will present further details on the proposal submission process to help guide younger scientists. Statistical trends, including the average award size within the PSD programs, selection rates, and lessons learned, will be presented. Information on new programs will also be presented, if available. Review Process and Volunteering: The SARA website (http://sara.nasa.gov) contains information on all ROSES solicitations. There is an email address (SARA@nasa.gov) for inquiries and an area for volunteer reviewers to sign up. The peer review process is based on Scientific/Technical Merit, Relevance, and Level of Effort, and will be detailed within this presentation. ROSES 2015 submission changes: All PSD programs will continue to use a two-step proposal submission process. A Step-1 proposal is required and must be submitted electronically by the Step-1 due date. The Step-1 proposal should include a description of the science goals and objectives to be addressed by the proposal, a brief description of the methodology to be used to address those goals and objectives, and the relevance of the proposed research to the call to which it is submitted.
